Tuesday, November 09, 2021

UNICORN TECH

Compact Fusion Power Plant Concept Uses State-of-the-Art Physics To Improve Energy Production

Compact Advanced Tokamak

The Compact Advanced Tokamak (CAT) is a potentially economical solution for fusion energy production that takes advantage of advances in simulation and technology. Credit: Image courtesy of General Atomics. Tokamak graphic modified from F. Najmabadi et al., The ARIES-AT advanced tokamak, advanced technology fusion power plant, Fusion Engineering and Design, 80, 3-23 (2006).

Fusion power plants use magnetic fields to hold a ball of current-carrying gas (called a plasma), creating a miniature sun that generates energy through nuclear fusion. The Compact Advanced Tokamak (CAT) concept uses state-of-the-art physics models to potentially improve fusion energy production. The models show that by carefully shaping the plasma and the distribution of current within it, fusion plant operators can suppress the turbulent eddies that cause heat loss. Suppressing these eddies would let operators reach higher pressures and more fusion power at lower plasma current, helping to achieve a state where the plasma sustains itself and drives most of its own current.

In this approach to tokamak reactors, the improved performance at reduced plasma current reduces stress and heat loads. This alleviates some of the engineering and materials challenges facing fusion plant designers. Higher pressure also increases an effect where the motion of particles in the plasma naturally generates the current required. This greatly reduces the need for expensive current drive systems that sap a fusion plant’s potential electric power output. It also enables a stationary “always-on” configuration. This approach leads to plants that suffer less stress during operation than typical pulsed approaches to fusion power, enabling smaller, less expensive power plants.
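A rough way to see why higher pressure helps (a standard scaling from the general tokamak literature, not a figure taken from the CAT study): the fraction of the plasma current that the plasma drives for itself, the so-called bootstrap fraction, grows roughly in proportion to the poloidal beta, a normalized measure of plasma pressure,

\[ f_{\mathrm{bs}} \;\sim\; C_{\mathrm{bs}}\,\sqrt{\epsilon}\,\beta_p , \]

where \( \epsilon \) is the inverse aspect ratio of the torus and \( C_{\mathrm{bs}} \) is a profile-dependent coefficient of order one-half. Raising the pressure at fixed current raises \( \beta_p \), which is why a high-pressure, lower-current plasma can supply most of its own current.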

Over the past year, the Department of Energy’s (DOE) Fusion Energy Sciences Advisory Committee and the National Academies of Sciences, Engineering, and Medicine have released roadmaps calling for the aggressive development of fusion energy in the United States. Researchers believe that achieving that goal requires the development of more efficient and economical approaches to creating fusion energy than currently exist. To create the CAT concept, researchers developed novel reactor simulations that leverage the latest physics understanding of plasmas to improve performance. They combined state-of-the-art theory validated at the DIII-D National Fusion Facility with leading-edge computing on the Cori supercomputer at the National Energy Research Scientific Computing Center. These simulations identified a path to a higher-performance, largely self-sustaining configuration that holds energy more efficiently than typical pulsed configurations, allowing it to be built at reduced scale and cost.

Reference: “The advanced tokamak path to a compact net electric fusion pilot plant” by R.J. Buttery, J.M. Park, J.T. McClenaghan, D. Weisberg, J. Canik, J. Ferron, A. Garofalo, C.T. Holcomb, J. Leuer, P.B. Snyder and the AToM Project Team, 19 March 2021, Nuclear Fusion.
DOI: 10.1088/1741-4326/abe4af

This work was supported by the Department of Energy Office of Science, Office of Fusion Energy Sciences, based on the DIII-D National Fusion Facility, a DOE Office of Science user facility, and the AToM Scientific Discovery through Advanced Computing project.

Integrating hot cores and cool edges in fusion reactors

Overview schematic of a tokamak highlighting exhaust into the divertor region. The zoomed-in region shows the narrow geometry of the divertor and the camera view from the experiments. Images from the experiments show that heat fluxes decreased during powder injection. Credit: F. Effenberg, Princeton Plasma Physics Laboratory; T. Wilks, Massachusetts Institute of Technology

Future fusion reactors have a conundrum: maintain a plasma core that is hotter than the surface of the sun without melting the walls that contain the plasma. Fusion scientists refer to this challenge as "core-edge integration." Researchers working at the DIII-D National Fusion Facility at General Atomics have recently tackled this problem in two ways: the first aims to make the fusion core even hotter, while the second focuses on cooling the material that reaches the wall. Protecting the plasma facing components could make them last longer, making future fusion power plants more cost-effective. 

Just like the more familiar internal combustion engine, vessels used in fusion research must exhaust heat and particles during operation. Like a car's exhaust pipe, this exit path is designed to handle heat and material loads, but only within certain limits. One key strategy for reducing the heat coming from the plasma core is to inject impurities—particles heavier than the mostly hydrogen plasma—into the exhaust region. These impurities help remove excess heat in the plasma before it hits the wall, helping the plasma-facing materials last longer. These same impurities, however, can travel back into regions where fusion reactions are occurring, reducing overall performance of the reactor.

Past impurity injection experiments have relied on gaseous impurities, but a research team from the U.S. Department of Energy's Princeton Plasma Physics Laboratory experimented with the injection of a powder consisting of boron, boron nitride, and lithium (Figure 1). The use of powder rather than gas offers several advantages. It allows a larger range of potential impurities, which can also be made purer and less likely to chemically react with the plasma. Experiments using powder injection on DIII-D are aimed at cooling the boundary of the plasma while maintaining the heat in the core of the plasma. Measurements showed only a marginal decrease in fusion performance during powder injection.

This graph of plasma performance against divertor temperature shows the progress in core-edge integration in Super H-mode plasmas. Black data points are standard Super H-modes with no external impurities introduced, while pink points inject nitrogen to cool the divertor and overlap the target core-edge integration region, shaded green in the upper left. Credit: F. Effenberg, Princeton Plasma Physics Laboratory; T. Wilks, Massachusetts Institute of Technology

The experiments developed a balanced approach that achieved significant edge cooling with only modest effects on core performance. Incorporating powder injection or the use of the Super H-mode into future reactor designs may allow them to maintain high levels of fusion performance while increasing the lifetime of divertor surfaces that exhaust waste heat and particles. Both sets of experimental results, coupled with theoretical simulations, suggest that these approaches would be compatible with larger devices like ITER, the international tokamak under construction in France, and would facilitate core-edge integration in future fusion power plants.


More information: GI02.00001. Mitigation of plasma-materials interactions with low-Z powders in DIII-D H-mode discharges

Provided by American Physical Society 

Unveiling the steady progress toward fusion energy gain

(Left) Record fusion triple products achieved by different fusion concepts over time illustrate progress towards fusion energy gain. (Right) Achieved values of Lawson parameter and temperature plotted against curves of scientific gain. Credit: Wurzel and Hsu

The march towards fusion energy gain, required for commercial fusion energy, is not always visible. Progress occurs in fits and starts through experiments in national laboratories, universities, and more recently at private companies. Sam Wurzel, a Technology-to-Market Advisor at the Advanced Research Projects Agency-Energy (ARPA-E), details and highlights this progress over the last 60 years by extracting and cataloging the performance of over 70 fusion experiments in this time span. The work illustrates the history and development of different approaches including magnetic-fusion devices such as tokamaks, stellarators and other "alternate concepts," laser-driven devices such as inertial confinement fusion (ICF), and hybrid approaches including liner-imploded and z-pinch concepts.

A minimum condition for developing fusion research into a viable energy source for society is the achievement of large energy gain—that is, much more energy released due to fusion than the energy put into the system in the first place. In 1955, a British engineer named J.D. Lawson identified the requirements for achieving high levels of energy gain: high temperatures, and a high product of density and energy confinement (or burn) time. Multiplying all three parameters into a single value called the "fusion triple product" gives a metric that allows for the comparison of different fusion concepts along the axis of energy gain. By extracting data from dozens of fusion journal articles and reports over the last six decades, Wurzel shows that progress was rapid from the 1960s to the 1990s.
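For reference, the resulting condition is often written as a threshold on the triple product itself; a commonly quoted figure for deuterium-tritium fuel (drawn from the broader fusion literature rather than from Wurzel's catalog) is roughly

\[ n\,T\,\tau_E \;\gtrsim\; 3\times 10^{21}\ \mathrm{keV\,s\,m^{-3}} , \]

where \( n \) is the fuel density, \( T \) the temperature and \( \tau_E \) the energy confinement time, with the threshold lowest near temperatures of 10-20 keV.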

After the 1990s, however, the fusion triple product did not grow as steadily, but it has jumped significantly in recent years in laser-based ICF at the National Ignition Facility (NIF), which has achieved the largest fusion triple product values to date. Newer, lower-cost concepts pursued by private fusion companies, such as the sheared-flow-stabilized Z-pinch, the field-reversed configuration (FRC) and other novel configurations, are showing progress and promise, surpassing the performance of early tokamaks.

The key figures from Wurzel's invited tutorial talk, based on the manuscript by Wurzel and Hsu, provide a comprehensive framework, inclusive of all thermonuclear fusion concepts, for tracking and understanding the physics progress of fusion toward energy breakeven and gain (Figure 1).

More information: PT02.00001. Progress Toward Fusion Energy Breakeven and Gain as Measured Against the Lawson Criterion

Provided by American Physical Society 

Researchers at the brink of fusion ignition at National Ignition Facility

The fusion yield (megajoules) from 2011 to present. Credit: LLNL

After decades of inertial confinement fusion research, a record yield of more than 1.3 megajoules (MJ) from fusion reactions was achieved in the laboratory for the first time during an experiment at Lawrence Livermore National Laboratory's (LLNL) National Ignition Facility (NIF) on Aug. 8, 2021. These results mark an 8-fold improvement over experiments conducted in spring 2021 and a 25-fold increase over NIF's 2018 record yield (Figure 1).

NIF precisely guides, amplifies, reflects, and focuses 192 powerful laser beams into a target about the size of a pencil eraser in a few billionths of a second. NIF generates temperatures in the target of more than 180 million degrees Fahrenheit and pressures of more than 100 billion Earth atmospheres. Those extreme temperatures and pressures cause the hydrogen isotopes in the target to fuse and release energy in a controlled thermonuclear reaction.
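For readers more comfortable with metric units, that temperature works out to roughly 100 million degrees Celsius:

\[ T_{\mathrm{C}} \;=\; \left(1.8\times 10^{8} - 32\right)\times \tfrac{5}{9} \;\approx\; 1.0\times 10^{8}\ {}^{\circ}\mathrm{C} . \]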

LLNL physicist Debbie Callahan will discuss this achievement during a plenary session at the 63rd Annual Meeting of the APS Division of Plasma Physics. While there has been significant media coverage of this achievement, this talk will represent the first opportunity to address these results and the path forward in a scientific conference setting.

Achieving these large yields has been a long-standing goal for inertial confinement fusion research and puts researchers at the threshold of fusion ignition, an important goal of NIF, the world's largest and most energetic laser.

The fusion research community uses many technical definitions for ignition, but the National Academy of Sciences adopted the definition of "gain greater than unity" in a 1997 review of NIF, meaning fusion yield greater than laser energy delivered. This experiment produced a yield of roughly two-thirds of the laser energy that was delivered, tantalizingly close to that goal.
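As a quick check of that "two-thirds" figure (using the widely reported value of about 1.9 MJ of laser energy delivered to the target on this shot, a number not stated above):

\[ G \;=\; \frac{E_{\mathrm{fusion}}}{E_{\mathrm{laser}}} \;\approx\; \frac{1.3\ \mathrm{MJ}}{1.9\ \mathrm{MJ}} \;\approx\; 0.7 , \]

just short of the gain-greater-than-unity definition of ignition.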

The experiment built on several advances developed over the last several years by the NIF team, including new diagnostics; target fabrication improvements in the capsule shell, fill tube and hohlraum (a gold cylinder that holds the target capsule); improved laser precision; and design changes to increase the energy coupled to the implosion and the compression of the implosion.

These advances led to a new experimental regime, with new avenues for research and the opportunity to benchmark modeling used to understand the proximity to ignition.

More information: Abstract: AR01.00001. Achieving a Burning Plasma on the National Ignition Facility (NIF) Laser

Provided by American Physical Society 

 

New Insights Into Heat Pathways Advance Understanding of Fusion Plasma

Heat Pathways Fusion Plasma

Physicist Suying Jin with computer-generated images showing the properties of heat pulse propagation in plasma. Credit: Headshot courtesy of Suying Jin / Collage courtesy of Kiran Sudarsanan

A high-tech fusion facility is like a thermos — both keep their contents as hot as possible. Fusion facilities confine electrically charged gas known as plasma at temperatures 10 times hotter than the sun, and keeping it hot is crucial to stoking the fusion reactions that scientists seek to harness to create a clean, plentiful source of energy for producing electricity.

Now, researchers at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) have made simple changes to equations that model the movement of heat in plasma. The changes improve insights that could help engineers avoid the conditions that could lead to heat loss in future fusion facilities.

Fusion, the power that drives the sun and stars, combines light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei — that generates massive amounts of energy. Scientists are seeking to replicate fusion on Earth for a virtually inexhaustible supply of power to generate electricity.

“The whole magnetic confinement fusion approach basically boils down to holding a plasma together with magnetic fields and then getting it as hot as possible by keeping heat confined,” said Suying Jin, a graduate student in the Princeton Program for Plasma Physics and lead author of a paper reporting the results in Physical Review E. “To accomplish this goal, we have to fundamentally understand how heat moves through the system.”

Scientists had been using an analysis technique that assumed that the heat flowing among electrons was substantially unaffected by the heat flowing among the much larger ions, Jin said. But she and colleagues found that the two pathways for heat actually interact in ways that can profoundly affect how measurements are interpreted. By allowing for that interaction, scientists can measure the temperatures of electrons and ions more accurately. They also can infer information about one pathway from information about the other.
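A minimal sketch of what such coupling looks like (a generic two-fluid form given here for illustration, not the specific equations of the Physical Review E paper): the electron and ion temperatures, \( T_e \) and \( T_i \), each diffuse with their own transport coefficient while a collisional exchange term ties them together,

\[ \frac{\partial T_e}{\partial t} = \chi_e \nabla^2 T_e - \nu_{ei}\,(T_e - T_i), \qquad \frac{\partial T_i}{\partial t} = \chi_i \nabla^2 T_i + \nu_{ei}\,(T_e - T_i). \]

Even a small exchange rate \( \nu_{ei} \) means that a heat pulse launched into one species produces a measurable response in the other, which is the effect the new analysis exploits.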

“What’s exciting about this is that it doesn’t require different equipment,” Jin said. “You can do the same experiments and then use this new model to extract much more information from the same data.”

Jin became interested in heat flow during earlier research into magnetic islands, plasma blobs formed from swirling magnetic fields. Modeling these blobs depends on accurate measurements of heat flow. “Then we noticed gaps in how other people had measured heat flow in the past,” Jin said. “They had calculated the movement of heat assuming that it moved only through one channel. They didn’t account for interactions between these two channels that affect how the heat moves through the plasma system. That omission led both to incorrect interpretations of the data for one species and missed opportunities to get further insights into the heat flow through both species.”

Jin’s new model provides fresh insights that weren’t available before. “It’s generally easier to measure electron heat transport than it is to measure ion heat transport,” said PPPL physicist Allan Reiman, a paper co-author. “These findings can give us an important piece of the puzzle in an easier way than expected.”

“It is remarkable that even minimal coupling between electrons and ions can profoundly change how heat propagates in plasma,” said Nat Fisch, Professor of Astrophysical Sciences at Princeton University and a co-author of the paper. “This sensitivity can now be exploited to inform our measurements.”

The new model will be used in future research. “We are looking at proposing another experiment in the near future, and this model will give us some extra knobs to turn to understand the results,” Reiman said. “With Jin’s model, our inferences will be more accurate. We now know how to extract the additional information we need.”

Reference: “Coupled heat pulse propagation in two-fluid plasmas” by S. Jin, A. H. Reiman and N. J. Fisch, 4 May 2021, Physical Review E.
DOI: 10.1103/PhysRevE.103.053201


'MAYBE' TECH
Sucking carbon out of the air promises to reverse our emissions. Will it work?

CO2 removal is proving costly and difficult to scale up.


The Orca facility, in Iceland, is intended to pull 4,000 tons of carbon dioxide out of the air each year. Halldor Kolbeins/Getty

Jackson Ryan
Nov. 8, 2021 

On March 19, the flat-topped Icelandic volcano known as Fagradalsfjall provided one of the most Instagrammable eruptions of all time. People flocked to the site to watch lava slither down the volcano's side for the first time in 800 years. The last time the volcano erupted, in the 13th century, carbon dioxide levels in the atmosphere were well below 300 parts per million (ppm).

But around 100 years ago, humanity saw carbon dioxide levels breach that mark. They've continued to rise exponentially throughout the 20th century, and six years ago they smashed through the 400ppm mark.

The cause is, unequivocally, human activity. Burning fossil fuels sends carbon dioxide into the atmosphere where it lingers for centuries (maybe even millennia). The molecule is good at trapping heat and forms something of a blanket over the Earth.

The most recent report, published by the Intergovernmental Panel on Climate Change in August, revealed that our excess carbon dioxide emissions have resulted in a 1.1 degree Celsius increase in temperatures since preindustrial times. Scientists say temperatures will continue to rise as carbon dioxide levels increase, resulting in more extreme weather events, more heat, more drought and a catastrophic decline in biodiversity.



The Fagradalsfjall volcano in Iceland began erupting in March. Saul Loeb/Getty

If humans have pumped all that excess carbon dioxide into the air, why don't we just try to vacuum it back up? Literally. Why not suck carbon dioxide out of the air? This concept, known as "direct air capture," or DAC, has been discussed for decades, but testing and deploying machines to perform the task has proven challenging, mostly because they are costly and inefficient.

As world leaders, activists and academics meet in Glasgow for COP26, the UN's premier climate summit, CNET Science has been examining some of the technological advances being developed to help tackle the climate crisis. While technology might help us adapt or mitigate the effects of climate change, alone it's not a solution to the problem.

With direct air capture facilities coming online and looking to expand, can we expect them to be a viable tool to reverse carbon emissions? Or are we getting sucked into the spin?

Ghost in the machine


Imagine 1 million particles of "air." The vast majority of these particles are nitrogen and, to a lesser extent, oxygen. Only about 412 particles are carbon dioxide, the heat-trapping greenhouse gas.

This is a simplistic view of air, but it helps describe the complex task a DAC machine has to carry out -- taking in millions and millions of particles of air and sifting through them to grab carbon dioxide.

To do so, DAC facilities use a series of huge fans to suck in ambient air and push it through a filter laced with chemicals that carbon dioxide reacts with and sticks to. Think of it as a specialized kind of flypaper. The CO2 gets trapped, while the other components of air pass right through.

The Climeworks system sucks in ambient air and traps CO2. The captured CO2 is mixed with water and pumped deep into the Earth where it can stay for millennia. Climeworks

Heat, pressure or other chemicals can unstick the concentrated carbon dioxide. You could squirrel away this stock of CO2 underground, mix it with water and inject it into the Earth where it mineralizes and turns to stone. Voila! You've just removed CO2 from the atmosphere.

But since carbon dioxide makes up such a tiny fraction of the air we breathe, DAC facilities need to take a whole lot in. This requires energy. The heating of the filter to free the concentrated CO2 also requires energy. If that energy is provided by fossil fuels, well… you can see the conundrum.
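To put "a whole lot" into rough numbers, here is a back-of-envelope sketch in Python (illustrative only; it assumes every CO2 molecule passing through the filter is captured, which no real plant achieves):

```python
# Rough estimate of how much air a direct-air-capture plant must process
# per tonne of CO2 captured, assuming perfect capture (illustrative only).
CO2_PPM = 412      # CO2 concentration by volume, parts per million
M_CO2 = 44.01      # molar mass of CO2, g/mol
M_AIR = 28.97      # mean molar mass of dry air, g/mol

# Convert the volume (molar) fraction to a mass fraction of CO2 in air.
co2_mass_fraction = CO2_PPM * 1e-6 * (M_CO2 / M_AIR)   # ~0.00063

# Tonnes of air that must pass through the fans per tonne of CO2 captured.
air_per_tonne_co2 = 1.0 / co2_mass_fraction
print(f"~{air_per_tonne_co2:,.0f} tonnes of air per tonne of CO2")  # ~1,600
```

On those assumptions, capturing Orca's 4,000 tons of CO2 a year means pushing on the order of six million tons of air through the fans annually, which is where much of the energy bill comes from.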

There are 19 direct air capture facilities in operation around the world, according to the International Energy Agency. Fifteen of these are operated by Swiss company Climeworks, and its most recent DAC facility highlights both the promise of vacuuming up CO2 and the remaining hurdles to large-scale builds.

870 cars

The crown jewel in Climeworks' operations is Orca, which lies just an hour away from Fagradalsfjall, at the Hellisheidi geothermal plant. It's the world's largest DAC facility. Orca opened two months ago, on Sept. 9, and has been sucking up CO2 ever since.

The world's largest such plant to date, Orca captures around 4,000 tons of carbon dioxide each year -- a very small amount, equivalent to taking just 870 cars off the road.

Climeworks has the backing of influential brands, like Shopify and Microsoft, which have purchased future removal of CO2 as the facility expands. The Climeworks team plans to scale up the Orca plant by 2024 and is shooting for a global rollout in 2027 which, it says, would see a hundredfold increase in removal.

Iceland is an ideal location for Orca because it sits on top of a fault line between two tectonic plates, bringing heat and magma closer to the surface. Fagradalsfjall's eruption is a beautiful consequence of this location, and it also means there's an abundance of geothermal energy -- a renewable energy source that uses heat generated within the earth. DAC uses energy supplied by the Hellisheidi Geothermal Power Plant.

The volcanic landscape also means there's basalt rock deep below the surface. Once Orca removes CO2 from the air, a nearby facility, run by Icelandic company Carbfix, injects it into this basalt layer. Within two years, the CO2 turns to stone and can be locked away for millennia.

It's important to note here that DAC is different from other carbon capture technologies, often referred to under the umbrella of "carbon capture, utilization and storage," or CCUS. These technologies have been developed and touted by fossil fuel industries as a way to try to capture carbon dioxide during burning of oil and gas -- that's a whole other issue covered very well by the Australian Broadcasting Corporation. Sometimes, DAC has been tarred with the same brush.

Nevertheless, large-scale facilities are being planned in the US and Scotland. A rival DAC company, Carbon Engineering, is hoping to eventually suck 1 million metric tons of CO2 out of the air each year at its facility in the US Permian Basin, the equivalent of taking around 200,000 cars off the road.

That's a big number, but as of 2020 global emissions were still reaching around 31.6 billion metric tons per year, and it's not clear they've peaked.

The spin and the cost


When Orca came online in September, climate scientist Peter Kalmus tweeted that he was rooting for it, but "only a fool would bet the planet on it."

Direct air capture is a promising technology, but there are a few major hurdles. The first is that the technology is prohibitively expensive. Taking one ton of carbon dioxide out of the atmosphere can cost between $100 and $600 -- or as much as $1,000. Carbon pricing, which forces fossil fuel polluters to pay for their emissions, is much cheaper.

That means that, at present, it's more cost-effective to emit carbon dioxide than to pull it out of the air. "It's really not good bang for your buck," says Alia Armstead, a climate researcher at the Australia Institute, a Canberra-based think tank.

To make DAC truly carbon negative, you also need to find good renewable sources of energy to power your removal plants. That's easy in highly volcanic Iceland, but access to renewables isn't as easy across the world. Some critics suggest the money invested in DAC would be better going directly to renewable projects -- preventing carbon dioxide from entering the atmosphere in the first place.

Mark Jacobson, a civil and environmental engineer at Stanford University, told CNET in February that lobbyists for carbon capture technologies, including DAC removal, are "basically lying to the public about the benefits." He's also said government support for the tech is a subsidy that would keep the fossil fuel industry in business.

But support from governments and private organizations is growing. Two weeks ago, the US Department of Energy announced $14.5 million to scale up DAC facilities, stating in a release that DAC is "critical to combatting the current climate crisis and achieving net-zero emissions by 2050." On Nov. 5, it announced a "Carbon Negative Earthshot" to accelerate research and innovation and to drive down removal costs. And earlier this year, Elon Musk's XPrize foundation fronted $100 million to help develop carbon capture technologies, with a winner expected to be announced in April 2025.

While there is growing support for carbon removal technologies, scientists widely recognize that DAC cannot replace the need to decarbonize. In addition, technological progress has been slow. Even with increased support, it remains too slow to keep us from overshooting the climate targets set out in the Paris Agreement.

"We need to invest in technologies that do work, and that can reduce emissions today, not in five or 10 years," says Armstead.
'MAYBE' TECH

AUSTRALIA
Can Moomba live up to the hype? Santos’ $220m carbon capture storage project

The first project registered for the government’s CCS carbon credits program is also claimed to be one of the world’s biggest


A model of Santos’ Moomba carbon capture storage project was on display in the Australian government’s pavilion at Cop26. 
Photograph: Santos Ltd/PR IMAGE


Graham Readfearn
@readfearn
Mon 8 Nov 2021 16.30 GMT


About 800km north of Adelaide in the middle of the South Australia desert, an oil and gas company is working on a high-profile project to capture and store carbon dioxide.

It’s claimed to be one of the world’s biggest carbon capture projects and lauded by the federal government, but what is actually going on at Moomba?

Can this $220m project led by Santos really live up to the hype and will it store enough CO2 to make a difference?

What’s the hype?

The Santos chief executive, Kevin Gallagher, travelled to Glasgow for the start of the international climate talks to stand alongside Australia’s emissions reduction minister, Angus Taylor, and confirm the company’s investment in the project.

Gallagher said: “This carbon reduction project in the South Australian outback will be one of the biggest and lowest cost in the world and will safely and permanently store 1.7m tonnes of carbon dioxide per year in the same reservoirs that held oil and gas in place for tens of millions of years.”

The project is the first carbon capture and storage scheme to be registered as eligible to generate carbon credits under the Morrison government’s emissions reduction fund (ERF).

A model of the project was on display at the government’s pavilion during the first days of the Cop26 climate talks.

How will it work?

According to an environmental report submitted to the South Australian government in March, Santos will look to capture CO2 currently being vented to the atmosphere from its existing gas plant at Moomba.

That gas will be compressed and transported through new or existing pipelines “to suitable locations where it will be injected into target geological formations deep underground”.



Last year, the company told an industry roundtable it had identified six sites up to 60km away from Moomba that could be used for CO2 storage. Santos has said in the future it could store CO2 from other sources and third parties.

Will the project actually cut emissions?

Santos claims Moomba will be one of the world’s largest CCS projects and will store 1.7m tonnes of CO2-equivalent a year with the potential to scale up to 20m tonnes a year.

For context, 1.7m tonnes of CO2 is about 0.35% of Australia’s current annual emissions of 494.2m tonnes.

Operators of Australia’s biggest CCS scheme, the Gorgon Carbon Dioxide Injection project off the north-west coast of WA, say it can store 4m tonnes of CO2 a year, but the project has missed its targets.

According to Santos’ latest climate change report, the company emitted 5m tonnes of CO2-equivalent in the financial year 2019-20 when all its direct and indirect energy use was added up.

Santos has a target to cut these emissions by 26% by 2030 and to reach net zero by 2040, but this aim excludes emissions from burning the fossil fuels the company sells to customers.

According to Santos, these emissions – known as scope 3 – added another 24.3m tonnes of CO2e to the atmosphere in 2019-20.

But the climate impact of those scope 3 emissions will continue to grow because the company has a target to double its production of fossil fuels between 2018 and 2025.

Santos has a 66.7% share in the Moomba project, with the rest owned by Beach Energy. Both companies have said the project will help them cut their own emissions.

Capturing, processing and transporting the CO2 will also use extra energy. The Guardian asked Santos how this would affect the overall emissions reductions from injecting the CO2, but the company did not respond before publishing.

Is this a new plan?

Santos has been talking about capturing and storing CO2 from its gas processing plant at Moomba since at least 2006.




A 2007 information sheet from Santos describes how a demonstration phase “could commence as early as 2010” to capture CO2 from gas operations at Moomba and pump it into partly depleted fossil fuel reservoirs.

This, the document said, would “re-pressurise the reservoirs and sweep residual oil left behind by traditional recovery methods (this is called enhanced oil recovery)”.

The Guardian asked Santos if pumping CO2 into its reservoirs would push out extra oil or gas and, if it did, what the company would do with those fossil fuels. Santos did not respond before publishing.

Just days before the 2007 federal election, the Coalition led by John Howard promised it would give Santos $10m to fast-track a "Moomba Carbon Storage Concept" if it was elected. Howard lost the election to Kevin Rudd.

In 2009, Santos put the project “on hold” but by 2018 it was carrying out engineering studies, modelling and looking for potential sites for CO2 injection.

In June this year, the project was given a $15m grant from a $50m federal government program to support CCS.

But Santos had said the project’s future hinged on CCS being accredited by the government’s emissions reduction fund, meaning they could generate carbon credits that could be sold to the government or private industry.

In early 2021, the government installed several fossil fuel industry leaders and supporters to the panel that would decide if CCS should be added to the ERF.

In October, the government made CCS eligible for carbon credits and the Moomba project has now become the first project to be registered.

'MAYBE' TECH
US outlines vision for cheap, large-scale carbon capture technology
By Nick Lavars
November 07, 2021

Capturing carbon dioxide from the air is currently expensive, but a new venture from the US Department of Energy aims to change that

Preventing global warming from reaching dangerous levels will require dramatic steps to reduce the amount of carbon dioxide pouring into the atmosphere, but many see removing and safely storing what is already there as another critical part of the solution. A new initiative from the US Department of Energy (DOE) makes these technologies a key pillar in its plan to tackle climate change, outlining ambitions to significantly drive down the cost and foster innovations that allow for storage on a mass scale by mid-century.

Though it has been around for decades, the idea of extracting carbon dioxide from the air has gathered momentum in the past few years, largely on the back of startups making some key inroads in the space. A notable example is Swiss startup Climeworks, whose Direct Air Capture (DAC) systems collect CO2 from the ambient air and turn it into solid minerals that can be stored underground.

This technology was brought to life in the world's first negative emission power plant in 2017, and earlier this year the company opened the world's largest DAC plant in Iceland. Called Orca, the plant has the capacity to harvest around 4,000 tons of CO2 from the air each year. For context, humans pump out more than 30 billion tons, or 30 gigatons, into the air on a yearly basis.

So while the technology has been successfully demonstrated on a small scale, the world would need many Orcas to properly put a dent in the problem. And then comes the cost. Climeworks was able to capture a ton of CO2 for around US$600 when it first opened its negative emissions plant in 2017 but, by using a modular design for its DAC systems, it expects this cost to come down toward $100 per ton as it scales up operations.

And this price point is a commonly held target among carbon capture startups. Canadian company Carbon Engineering is currently working on a large-scale DAC plant capable of capturing up to one million tons of CO2 each year, and hopes to do so for a cost of $94 to $232 per ton. Australian startup Southern Green Gas plans to tap into the country's abundant sunlight to fuel solar-powered DAC plants that collect tons of carbon for as little as $72 apiece.

The US Department of Energy has been making moves to help things along, last year investing $22 million into research efforts that advance carbon capture technology, followed by a further $24 million this year. Its latest venture in this space is called the “Carbon Negative Shot” and is described as the US government's first major effort in carbon dioxide removal, and key to its plans of achieving net zero emissions by 2050.

The initiative centers on the ambition of deploying carbon capture technologies on a gigaton scale by 2050. According to the DOE, a gigaton of CO2 is around the amount generated by 250 million vehicles, or the US's entire light-duty fleet, each year. Getting the technology to this point would require some significant developments both in the technology itself and the capacity to deploy it on a massive, unprecedented scale.

The DOE hopes that through its Carbon Negative Shot program, it can give this burgeoning industry the shot in the arm it needs. Part of its efforts to accelerate innovation in the area will involve working with individual communities that might benefit from or be willing to take part in carbon capture programs. Ultimately, the venture aims to reduce the cost of carbon removal and storage to less than $100 per ton.
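Putting the DOE's own numbers together gives a sense of the scale (a back-of-envelope sketch using only figures quoted in these articles):

```python
# Scale check using figures quoted above (illustrative arithmetic only).
gigaton = 1e9           # tonnes of CO2 per year the DOE targets by 2050
vehicles = 250e6        # vehicles the DOE equates with one gigaton per year
orca_capacity = 4_000   # tonnes per year captured by the Orca plant
target_cost = 100       # DOE cost target, dollars per tonne removed

print(gigaton / vehicles)            # ~4 tonnes of CO2 per vehicle per year
print(gigaton / orca_capacity)       # ~250,000 Orca-sized plants per gigaton
print(gigaton * target_cost / 1e9)   # ~$100 billion per year at the target cost
```

Even at the $100-per-ton goal, removing a gigaton a year would be a roughly $100 billion annual undertaking.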

“By slashing the costs and accelerating the deployment of carbon dioxide removal – a crucial clean energy technology – we can take massive amounts of carbon pollution directly from the air and combat the climate crisis,” says Secretary of Energy Jennifer M. Granholm. “With our Carbon Negative Shot, we can help remove the greenhouse gases already warming our planet and affecting our health – positioning America as a net-zero leader and creating good-paying jobs for a transitioning clean energy workforce. The combination of the Carbon Negative Shot with our massive investments in hydrogen, battery storage, renewables and decarbonized fossil energy, can make net-zero emissions a reality here and abroad.”

Source: US Department of Energy

 

'We're Here To Call For Climate Justice,' Say Glasgow Protesters

"We're drops in the ocean, but eventually, those drops add up to something big."


Climate protesters gather for the Global Day of Action for Climate Justice march on November 6, 2021 in Glasgow, Scotland. Jeff J Mitchell / Getty Images

Mark Hertsgaard

This story is part of Covering Climate Now, a global journalism collaboration strengthening coverage of the climate story.

GLASGOW, SCOTLAND — "I'd feel ridiculous if I weren't here," said Tom Birch, a teacher from Edinburgh, as he carried a sign reading "Soon Humanity Will Be Net-Zero." Birch was among the many tens of thousands of marchers who filled the streets of Glasgow, host of the United Nations COP26 climate conference, on Saturday as part of a Global Day for Climate Justice. "You get lifted up seeing all these people who care, who aren't just sitting at home in our silos complaining," he added. "This is the moment to make our voices heard. It's our last chance."

"Pledges are not action," read the back of Birch's sign, summarizing many activists' critique of the net-zero emissions pledges that governments and corporations have made at COP26. Eva Wewgorski, a librarian from Edinburgh who created the sign, said that "World leaders are acting like these pledges will solve the problem. But there've been countless pledges over the decades that haven't been kept, so why should we believe them now?"

Coming at the midway point of the two-week COP 26 conference, the Global Day for Climate Justice also featured demonstrations in London, Paris, South Korea, Indonesia, and the Philippines. The Guardian reported that there were more than 300 protests worldwide, with 100 in the United Kingdom alone.

Although the Glasgow march included representatives of Indigenous peoples from South America and youth activists such as Vanessa Nakate of Uganda, most of the crowd were locals, judging from the paucity of umbrellas despite bursts of heavy rain and gusty winds. "We're used to the rain," a local soccer coach and shopkeeper who gave only his first name, Niall, said with a grin.

Wearing uniforms of sparkling gold lamé, a dozen musicians with a local brass band called "Brass, Aye?" got marchers dancing with pulsing renditions of "When the Saints Go Marching In" and other New Orleans standards. "We're here to call for climate justice and bring a bit of joy and vibrancy to this march," said Scott, a blonde trombone player who directed the group.

A second group of street artists dressed head to toe in blood red, with faces painted white and set in grim expressions, stayed completely silent as they marched through Nelson Mandela Place in the heart of the city on their way to the march's terminus on the Glasgow Green.

"Inside that conference of polluters, the climate criminals are hiding behind barbed wire and fences and lines of police," Asad Rehman of the COP26 Coalition of activist groups, told the crowd on Glasgow Green. "We're not going to accept their suicide pact."

Police officers in yellow vests were especially noticeable around an office building of Scottish Power, the electric utility, at the intersection of St. Vincent and North streets. Positioned 10 paces apart behind metal barriers that confined the marchers to the middle of the street, the officers stood with hands folded across their waists, watchful but not aggressive. As the crowd passed by at 2pm, a rainbow briefly illuminated the northern sky, leading a mother pushing a toddler in a stroller to remark, "That's a nice omen, isn't it?"

"The right to protest is a cornerstone of democracy; it's a direct way to speak to your leaders without having to wait for an election," said Danielle, 19, a Glasgow resident marching with a contingent from Tear Fund, a Christian NGO working to alleviate poverty in the Global South through advancing social justice rather than conventional foreign aid. "Movements develop over time," she added. "Generations of people have been doing this kind of witnessing for years, and world leaders are starting to listen because of that. Eventually, you reach a watershed moment, and that's what's happening now."

Carrying signs quoting a verse from the Old Testament book of Micah ("Act justly, love mercy, walk humbly"), Danielle and three other young women with Tear Fund set off from a muddy hillside in Kelvingrove Park where long strips of yellow cloth spelled out, "Amazonia For Life: Protect 80% by 2025."

"Countries in the Global South are paying more to service their debt than the $100 billion a year in climate reparations they're supposed to get at COP," said Mollie Somerville, a grandmother marching with a group whose yellow vests read, "Cancel the Debt for Climate Justice." "That debt is owed mostly to private banks that are making huge profits and don't need that money," she said. "No, I haven't heard COP leaders talk about this issue yet, but there's still time, so I'm hoping that this march will put pressure on them."

"Protests like this, it gives people a feeling of solidarity, knowing that we're not alone," added Somerville, who said she was "extremely worried" about the future awaiting her three grandchildren and a fourth expected next week. "I think government and business leaders see that these protests are getting bigger and the time for action is now. Standing here in the cold and the rain, that's why we do this. At times, it feels like we're drops in the ocean. But eventually, those drops add up to something big."

Mark Hertsgaard is the co-founder and executive director of Covering Climate Now and the environment correspondent for The Nation.

 

Only Two Percent of the Great Barrier Reef Has Escaped Coral Bleaching

Nov. 05, 2021 

Coral bleaching in the Great Barrier Reef. Brett Monroe Garner / Getty Images

The Great Barrier Reef is in trouble.

It has experienced five mass bleaching events since 1998, three of them in the last five years. Now, new research published in Current Biology Thursday finds that only two percent of the reef has escaped bleaching entirely during that time.

"This is one of the most confronting results of my career," study lead author Terry Hughes, from the ARC Centre of Excellence for Coral Reef Studies at James Cook University, Tweeted.

Coral bleaching occurs when higher than average water temperatures force coral to expel the algae that provide them with nutrients and color, Phys.org explained. As the climate crisis persists, marine heat waves are leading to more frequent, intense and widespread bleaching events.

"Five bouts of mass bleaching since 1998 have turned the Great Barrier Reef into a checkerboard of reefs with very different recent histories, ranging from two percent of reefs that have escaped bleaching altogether, to 80 percent that have now bleached severely at least once since 2016," Hughes told Phys.org.

The researchers found that corals did have a better chance of adapting to higher temperatures if they survived a previous bleaching event, CNN explained. However, the time between bleaching events has decreased, giving corals less time to recover. Recent bleaching events occurred in 2016, 2017 and 2020, according to Phys.org.

Another paper also published in Current Biology Thursday shows another way in which these bleaching events have made the reef more vulnerable over time. The last three major bleaching events may have reduced the reefs' larvae supply by 26, 50 and 71 percent, in chronological order. This has made it harder for the reef to recover, but there are potential coral "refugia" in 13 percent of the reef, which have avoided temperature spikes. These refugia could spread larvae to 58 percent of the total reef, the study found.

Both studies come as the COP26 climate conference is taking place in Glasgow, and the authors warned that protecting the reef means sticking to the most ambitious Paris agreement goal of limiting warming to 1.5 degrees above pre-industrial levels.

"We don't yet know how long those thermal refuges will exist. They won't last forever," University of Queensland Professor Peter Mumby, a co-author on the second study, told The Guardian. "There are still healthy reefs producing larvae, but if we see these patterns [of warming] continue you would expect it would affect rates of coral recovery. One of the biggest concerns is even if we can stick to the 1.5C target, we still have committed warming."

The most recent analyses show that the promises world leaders have made so far at COP26 could put the world on track for 1.9 or 1.8 degrees Celsius of warming. But Hughes told The Guardian that likely won't be enough to save the Great Barrier Reef.

"So far 1.1C [of warming] has been sufficient to trigger five mass bleaching events," he said. "The higher the temperature goes, the more difficult it will be. I am confident [corals reefs] could handle another 0.3C of warming, but they will struggle at 1.9C and there's a lot of optimism in that 1.9C figure."

Big Finance commits $100tn-plus to tackle net-zero challenge

Carney-led GFANZ under pressure to dispel greenwashing concerns

A giant Earth looms over a meeting hall at the United Nations' COP26 climate change conference in Glasgow, Scotland, on Nov. 2. The financial sector has been playing a growing role in the global push to decarbonize. © Reuters

KYOHEI SUGA, 
Nikkei staff writer
November 8, 2021

TOKYO -- Banks, insurers, asset managers and others worldwide have committed more than $100 trillion in private capital to achieving net-zero greenhouse gas emissions over the next three decades as the financial sector takes a leading role in advancing technological and societal shifts.

More than 450 companies and other organizations now aim for net-zero emissions across their portfolios by 2050, according to the Glasgow Financial Alliance for Net Zero. GFANZ was launched in April by former Bank of England Gov. Mark Carney, the United Nations special envoy for climate action and finance.

The framework so far has served mainly to demonstrate a commitment to net zero, an executive at an asset manager said. But with members now responsible for more than $130 trillion in global financial assets -- almost double the total at launch -- they face growing pressure to turn talk into action.

Overall, GFANZ members hold roughly 40% of global financial assets. Members include some of the world's biggest financial companies, such as Bank of America, BNP Paribas, Citi, HSBC, Morgan Stanley and UBS. Eighteen Japanese institutions have also signed up, including the three megabanks -- Mitsubishi UFJ Financial Group, Sumitomo Mitsui Financial Group and Mizuho Financial Group -- as well as Nippon Life Insurance and Nomura Asset Management.

Moving forward, members will work toward such goals as reducing emissions by around 50% over 10 years, updating environmental targets every five years and disclosing progress annually. This is expected to put greater pressure on businesses that receive financing or investment from these institutions to disclose their own emissions and reduction plans.

About 30 asset managers that have been with GFANZ from the beginning had already set new interim targets through 2030 as of mid-October. Japan's Asset Management One said it will dedicate more than half of its portfolio to companies that either have or plan to achieve net-zero emissions. It will consider divesting from companies that refuse climate-related engagement.

The International Energy Agency has estimated that the world needs around $70 trillion in investment by 2040 to meet climate targets set by the Paris Agreement. GFANZ assets could provide a major boost.

But whether members of the coalition can truly achieve net-zero emissions across their portfolios by 2050 remains unclear.

Unlike asset managers, which can reallocate money in their portfolios based on what environmentally minded asset owners want, banks will need to engage in careful dialogue to persuade borrowers to accelerate decarbonization efforts.

"We can't take actions that are too disruptive, like just pulling financing from companies," an executive at a Japanese megabank said. Banks account for $66 trillion, or about half, of GFANZ assets.

A quicker transition to net zero will curb climate-related risks. But if financial institutions turn up the pressure too fast, more businesses could give up on decarbonization altogether. "It is extremely difficult to find the right balance," MUFG President and CEO Hironori Kamezawa said.

GFANZ has also faced criticism from the nonprofit sector. In an open letter this October, 90-plus groups raised concerns that "banks and other financial institutions are using 'Net Zero' promises largely as greenwash."

"In fact, many financial institutions, since becoming members of GFANZ, have issued new financing to companies expanding fossil fuel infrastructure," the letter said.

"Financial institutions are playing a greater role in decarbonization," said Takahide Kiuchi, executive economist at the Nomura Research Institute. "These institutions will need to take measures to increase transparency if distrust persists."

 

2.5-Billion-Year-Old Rocks Reveal Volcanic Eruptions Spurred First “Whiffs” of Oxygen in Earth’s Atmosphere

Mount McRae Shale in Western Australia

Roger Buick in 2004 at the Mount McRae Shale in Western Australia. Rocks drilled near here show “whiffs” of oxygen occurred before the Great Oxidation Event, 2.4 billion years ago. New analyses show a slightly earlier spike in the element mercury emitted by volcanoes, which could have boosted populations of single-celled organisms to produce a temporary “whiff” of oxygen. Credit: Roger Buick/University of Washington

A new analysis of 2.5-billion-year-old rocks from Australia finds that volcanic eruptions may have stimulated population surges of marine microorganisms, creating the first puffs of oxygen into the atmosphere. This would change existing stories of Earth’s early atmosphere, which assumed that most changes in the early atmosphere were controlled by geologic or chemical processes.

Though focused on Earth’s early history, the research also has implications for extraterrestrial life and even climate change. The study led by the University of Washington, the University of Michigan and other institutions was published recently in the Proceedings of the National Academy of Sciences.

“What has started to become obvious in the past few decades is there actually are quite a number of connections between the solid, nonliving Earth and the evolution of life,” said first author Jana Meixnerová, a UW doctoral student in Earth and space sciences. “But what are the specific connections that facilitated the evolution of life on Earth as we know it?”

In its earliest days, Earth had no oxygen in its atmosphere and few, if any, oxygen-breathing lifeforms. Earth’s atmosphere became permanently oxygen-rich about 2.4 billion years ago, likely after an explosion of lifeforms that photosynthesize, transforming carbon dioxide and water into oxygen.

But in 2007, co-author Ariel Anbar at Arizona State University analyzed rocks from the Mount McRae Shale in Western Australia, reporting a short-term whiff of oxygen about 50 to 100 million years before it became a permanent fixture in the atmosphere. More recent research has confirmed other, earlier short-term oxygen spikes, but hasn’t explained their rise and fall.

Mount McRae Shale Rock Cores

These are drill-cores of rocks from the Mount McRae Shale in Western Australia. Previous analysis showed a “whiff” of atmospheric oxygen preceding the Great Oxidation Event, 2.4 billion years ago. New analyses show a slightly earlier spike in minerals produced by volcanoes, which may have fertilized early communities of microbes to produce the oxygen. Credit: Roger Buick/University of Washington

In the new study, researchers at the University of Michigan, led by co-corresponding author Joel Blum, analyzed the same ancient rocks for the abundance and isotopic composition (determined by the number of neutrons) of the element mercury, which is emitted by volcanic eruptions. Large volcanic eruptions blast mercury gas into the upper atmosphere, where today it circulates for a year or two before raining out onto Earth’s surface. The new analysis shows a spike in mercury a few million years before the temporary rise in oxygen.

“Sure enough, in the rock below the transient spike in oxygen we found evidence of mercury, both in its abundance and isotopes, that would most reasonably be explained by volcanic eruptions into the atmosphere,” said co-author Roger Buick, a UW professor of Earth and Space Sciences.

Where there were volcanic emissions, the authors reason, there must have been lava and volcanic ash fields. And those nutrient-rich rocks would have weathered in the wind and rain, releasing phosphorus into rivers that could fertilize nearby coastal areas, allowing oxygen-producing cyanobacteria and other single-celled lifeforms to flourish.

“There are other nutrients that modulate biological activity on short timescales, but phosphorus is the one that is most important on long timescales,” Meixnerová said.

Today, phosphorus is plentiful in biological material and in agricultural fertilizer. But in very ancient times, weathering of volcanic rocks would have been the main source for this scarce resource.

“During weathering under the Archaean atmosphere, the fresh basaltic rock would have slowly dissolved, releasing the essential macro-nutrient phosphorus into the rivers. That would have fed microbes that were living in the shallow coastal zones and triggered increased biological productivity that would have created, as a byproduct, an oxygen spike,” Meixnerová said.

The precise location of those volcanoes and lava fields is unknown, but large lava fields of about the right age exist in modern-day India, Canada and elsewhere, Buick said.

“Our study suggests that for these transient whiffs of oxygen, the immediate trigger was an increase in oxygen production, rather than a decrease in oxygen consumption by rocks or other nonliving processes,” Buick said. “It’s important because the presence of oxygen in the atmosphere is fundamental – it’s the biggest driver for the evolution of large, complex life.”

Ultimately, researchers say the study suggests how a planet’s geology might affect any life evolving on its surface, an understanding that aids in identifying habitable exoplanets, or planets outside our solar system, in the search for life in the universe.

Reference: “Mercury abundance and isotopic composition indicate subaerial volcanism prior to the end-Archean “whiff” of oxygen” by Jana Meixnerová, Joel D. Blum, Marcus W. Johnson, Eva E. Stüeken, Michael A. Kipp, Ariel D. Anbar and Roger Buick, 17 August 2021, Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2107511118

Other authors of the paper are co-corresponding author Eva Stüeken, a former UW astrobiology graduate student now at the University of St. Andrews in Scotland; Michael Kipp, a former UW graduate student now at the California Institute of Technology; and Marcus Johnson at the University of Michigan. The study was funded by NASA, the NASA-funded UW Virtual Planetary Laboratory team and the MacArthur Professorship to Blum at the University of Michigan.