Sunday, October 03, 2021

“Mega Comet” Heading Our Way Is Probably The Largest Ever Seen




ARTIST'S IMPRESSION OF C/2014 UN271 (BERNARDINELLI-BERNSTEIN), 
THE LARGEST AND MOST PRISTINE COMET WE HAVE EVER SEEN. 
IMAGE CREDIT: NOIRLAB/NSF/AURA/J. DA SILVA (SPACEENGINE) CC-BY-4.0


By Stephen Luntz, 30 SEP 2021


In June this year, two astronomers discovered probably the largest comet ever seen, an object so big there was initial debate over whether it might really be a dwarf planet on a comet-like orbit. This "mega comet" is on an inward-bound trajectory from the outer Solar System.

Now, its discoverers and many co-authors have reported the results of three months spent learning more about this exceptional object. A paper accepted by the Astrophysical Journal Letters (preprint on ArXiv.org) reveals plenty we didn't know about this world when its existence hit the news.

Anyone learning about C/2014 UN271 (Bernardinelli-Bernstein) for the first time needn’t worry, though. Even at its closest approach in 2031, UN271 will be more distant than Saturn, at around 11 astronomical units away (1 AU is the distance from Earth to the Sun), frustrating astronomers who would love a closer look at something this unusual.

UN271's orbit has been traced, and its last approach to the Sun was around 3.5 million years ago. On that occasion it only got to within about 18 astronomical units of the Sun, over one and a half times the distance it will reach this time and around the distance to Uranus. It's likely to soon be the closest to the Sun it has ever been, making it the most pristine comet we have ever seen, a true throwback to the Solar System's origins.

Early numbers for UN271's size were inevitably imprecise. However, the authors have now settled on an estimate of 150 kilometers (about 93 miles) across. That gives it a volume thousands of times that of a typical comet, and makes it at least 10 times bigger than even a giant like Hale-Bopp. For comparison, comet 67P, which Rosetta studied, is only 4.3 kilometers (2.6 miles) across, and Arrokoth, the furthest world humanity has ever explored, is 35 kilometers (22 miles) long and 20 kilometers (12 miles) wide.
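Because volume scales with the cube of diameter, those size figures imply enormous volume ratios. Taking the quoted diameters at face value (cometary nuclei are irregular, so this is only an order-of-magnitude check, and the 10-kilometer "typical" nucleus is an assumed figure, not one from the paper):

\[
\left(\frac{150\ \mathrm{km}}{4.3\ \mathrm{km}}\right)^{3} \approx 4\times 10^{4},
\qquad
\left(\frac{150\ \mathrm{km}}{10\ \mathrm{km}}\right)^{3} \approx 3\times 10^{3},
\]

which is consistent with the "thousands of times the volume of a typical comet" estimate.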

 

There were hopes of getting more certainty on the size when UN271 passed in front of a star as seen from Eastern Australia, but cloud covered the entire region from which the event might have been seen. However big, though, the comet won't be visible to the naked eye when it makes its close approach.

One of the first things astronomers wanted to know about UN271 was whether it was already showing cometary activity, that is, having material turn to gas and form a coma. UN271 had been photographed by both TESS, NASA's planet-hunter, and the Dark Energy Survey (DES) in 2018 without anyone noticing its significance, so the authors checked the earlier images to see if they could find tell-tale signs of fuzziness.

They found a discrepancy in the measurements, with TESS reporting an object almost twice as bright. The reason turned out to be that DES was looking at only a small area around UN271, while TESS was aggregating light over a bigger region, including a large but faint coma, indicating material had been escaping for a long time. The coma's composition cannot yet be determined, but carbon dioxide carrying dust grains with it as it escapes is thought most likely.

UN271's tail is fainter still, requiring the combination of many images to detect it at all.

“It is usually a losing proposition to speculate on the future behavior of comets,” the paper acknowledges, but nevertheless projects that at its brightest UN271 should be around magnitude 9 – visible to amateurs with small telescopes under dark skies.
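For context, the magnitude scale is logarithmic, with every five magnitudes corresponding to a factor of 100 in brightness. Assuming a conventional naked-eye limit of about magnitude 6 under dark skies (an assumption for illustration, not a figure from the paper), a magnitude-9 object is fainter than that limit by a factor of

\[
100^{(9-6)/5} \approx 16,
\]

which is why even at its best UN271 will need a small telescope rather than the unaided eye.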

Some astronomers are keen to get a mission going to UN271, calculating the best time for a flyby is 2033, which would require a launch by 2028.
NOTHING TO TALK ABOUT
Enbridge says still willing to talk on Line 5, despite Michigan's frustration

Fri., October 1, 2021



WASHINGTON — The Canadian architect of the controversial Line 5 cross-border pipeline expansion project said Friday it remains committed to a negotiated solution to its impasse with the state of Michigan, even though the government has effectively walked away from the table.

Both sides are obliged by court order to engage in a good-faith effort to resolve the dispute, and Enbridge Inc. "remains ready to do just that," the Calgary-based pipeline giant said in a statement.

"Our goal from the beginning has been to work co-operatively to reconcile interests, resolve disputes and move forward in the best interest of people throughout the region," the company said.

"We believe in the process and participated in mediation in good faith. We are committed to continuing to seek resolution, whether through mediation or by asserting our rights in the courts if necessary."

Line 5 ferries upwards of 540,000 barrels per day of crude oil and natural gas liquids across the Canada-U.S. border and the Great Lakes by way of a twin line that runs along the lake bed beneath the ecologically sensitive Straits of Mackinac, which connect Lake Michigan and Lake Huron.

Proponents call it a vital and indispensable source of energy — particularly propane — for several midwestern states, including Michigan, Ohio and Pennsylvania, and a key source of feedstock for critical refineries on the northern side of the border, including those that supply jet fuel to some of Canada's busiest airports.

Critics, however, among them Michigan Gov. Gretchen Whitmer, want the line shut down, arguing it's only a matter of time before an anchor strike or technical failure triggers a catastrophic environmental disaster in one of the area's most important watersheds.

That's why last November, Whitmer abruptly revoked the easement that had allowed the pipeline to operate since 1953, giving the company until May to voluntarily cease operations and triggering a court case that has only dragged on since then.

Enbridge has insisted from the outset that it has no plans to voluntarily shut down the pipeline.


"We understand the stakes in this matter are important not only for Enbridge and the state, but for many others on both sides of the U.S.-Canada border who have a strong interest in its outcome," the company said.

"Meanwhile, we will continue to safely and responsibly deliver the energy the region relies upon from the Line 5 system."

A court-sanctioned voluntary mediation process, which began in April, has failed to yield any agreement and appears to have fallen apart, although the official status of those talks is difficult to divine.

Following the last meeting Sept. 9, Michigan's emissaries "unambiguously communicated to the mediator that any further continuation of the mediation process would be unproductive for them, and they have no 'desire to continue with the mediation process,'" court documents show.

U.S. District Court Judge Janet Neff in Michigan, however, appears reluctant to call a halt to the process.

"Voluntary facilitative mediation necessarily requires voluntary participation by both parties," Neff said in a decision last week that dismissed as moot one of the state's motions aimed at short-circuiting the talks.

The process, Neff wrote, "is at least at a standstill, although the parties remain under a continuing obligation to engage in good faith to resolve this case."

Where that leaves matters is unclear. The attorney general's office in Michigan refused to comment Friday, referring media inquiries back to the court documents.

Enbridge has also pointed to a possible "diplomatic solution" under a 1977 U.S.-Canada treaty covering cross-border pipelines, which the Canadian government has argued applies in this case and obliges the court to step aside in favour of a negotiated bilateral settlement.

Environmental groups, meanwhile, have been unequivocal in their opposition to the pipeline and a potential replacement project.

Cathy Collentine, associate director of the Sierra Club's "Beyond Dirty Fuels" campaign, said the U.S. Army Corps of Engineers is in the process of an environmental impact assessment on the Line 5 project. It would then be up to the White House to decide whether to take action based on the findings, she said.


If President Joe Biden's administration is serious about confronting climate change, the most contentious cross-border pipeline projects of the last 15 years — Keystone XL, Line 5 and also Line 3, another Enbridge upgrade, this one in Minnesota — are the ones they should be blocking, Collentine said.

Such projects, with their capacity to increase fossil fuel production and consumption, are already affecting communities on the front lines of climate change, she said.

"Those are the exact projects that we have long said we cannot continue to build, we cannot continue to approve," Collentine said.

"It's a moment where the Biden administration, through these analyses, we believe should and hopefully will see that that is also true and not allow these projects to move forward or to continue operating."

This report by The Canadian Press was first published Oct. 1, 2021.

James McCarten, The Canadian Press

Researchers suggest a way to achieve net-zero emission plastics

Credit: Pixabay/CC0 Public Domain

A team of researchers with members affiliated with institutions in Germany, Switzerland and the U.S. has created a model that they claim could be used to achieve net-zero-emission plastics by 2050. In their paper published in the journal Science, the group outlines their model and requirements for implementation.

A host of studies has shown that the production and disposal of plastic has become a significant environmental problem: as it breaks down into microplastics, it makes its way into virtually every water source on the planet, causing harm to organisms. The production of plastic is also a significant contributor to climate change due to the gasses emitted during manufacture. In this new effort, the researchers analyzed the data produced by over 400 research efforts aimed at solving the plastics problem and developed a model that they say could lead to a net-zero-emission-plastic world by 2050.

The model implements a cycle built around combining the recycling of plastics with chemical reduction of the carbon dioxide they emit when burned, or of carbon dioxide collected from biomass. They suggest a recycling rate as low as 70% would be sufficient to reach net-zero emissions, which would result in energy savings of 34 to 53%. They also suggest that the operational costs involved would be on a par with other carbon-capture processes. They further suggest that the cost savings associated with implementing their model globally would amount to approximately $288 billion annually. They point out that production of plastics now accounts for approximately 6 percent of global oil consumption and note that current forecasts suggest that the number could grow to 20% over the next 30 years if things continue as they are now. They conclude that the technology exists to solve the plastics problem—all that is needed is the will to do so.
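A toy balance illustrates the qualitative logic of such a circular-carbon scheme; this is a sketch of the general idea only, not the authors' published model. If a fraction r of plastic is recycled, and the carbon dioxide released when the remaining fraction is burned is captured and converted back into feedstock with efficiency c, then the net release per unit of plastic scales as

\[
E_{\mathrm{net}} \propto (1-r)\,(1-c),
\]

so net emissions approach zero either through very high recycling or through near-complete capture and reuse of the combustion CO2, supplemented in the paper's framing by carbon drawn from biomass.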

More information: Raoul Meys et al, Achieving net-zero greenhouse gas emission plastics by a circular carbon economy, Science (2021). DOI: 10.1126/science.abg9853
Provided by Science X Network
BLOWN OVER 
Germany: Wind turbine collapses hours before official launch


The Associated Press
Thursday, September 30, 2021 

Remains of the tower of a wind turbine stand in the forest in Haltern, Germany, Thursday, Sept. 30, 2021. The wind turbine, which is almost 240 metres high, has collapsed. (Guido Bludau/dpa via AP)

BERLIN -- Officials in Germany are investigating why a huge wind turbine collapsed just hours before it was due to be officially inaugurated.

The turbine, whose rotor blades reach a height of 239 metres, toppled over late Wednesday in a forest near the western town of Haltern.

German news agency dpa reported Thursday that police were not currently suspecting sabotage.

The wind turbine was scheduled to be officially launched Thursday, though it was connected to the power grid six months ago.

Germany is trying to ramp up its use of renewable energy such as wind and solar as part of a transition away from fossil fuels and nuclear power.

COSMOLOGY

WHAT HAPPENED BEFORE THE BIG BANG? 

A NASA ASTROPHYSICIST ANSWERS 

Thanks to time-traveling telescopes, we can see more about the Big Bang

VIDEO with Michelle Thaller

DESCRIPTION

One of the biggest misconceptions in science is that the Big Bang came out of nothing – according to astrophysicist Michelle Thaller, this is not correct. 

13.8 billion years ago, right before the Big Bang, our universe existed within one tiny, compressed atom. But what we know now is that this one atom was not our entire universe. 

According to Thaller, there were trillions of atoms, all with their own universe inside. Today, we can only know of our observable universe, but there is far more out there than what meets the eye.

Dr. Michelle Thaller is an astronomer who studies binary stars and the life cycles of stars. She is Assistant Director of Science Communication at NASA. 



Ask Ethan: Will dark energy cause the Big Bang to disappear?

If we were born trillions of years in the future, could we even figure out our cosmic history?

The farther away we look, the closer in time we’re seeing towards the Big Bang. In the far future, there will be an enormous distance separating even the closest galaxies from the local group, but with enough motivation and a little luck, even a far-future civilization, in a universe dominated by dark energy, could still uncover the Big Bang origin of the universe. (Credit: Robin Dienel/Carnegie Institution for Science)

KEY TAKEAWAYS


Dark energy is causing the universe's expansion to accelerate, driving galaxies and light farther away from us.


In the far future, no signals beyond our Local Group will remain visible, eliminating the evidence we used to discover the Big Bang.


But a series of very clever measurements, if we're savvy enough to make them, could still reveal our cosmic history to us.


Ethan Siegel

13.8 billion years ago, the universe as we know it ⁠— full of matter and radiation, expanding and cooling and gravitating ⁠— came into existence with the onset of the hot Big Bang. Today, we can see and measure the signals that travel to us from enormous cosmic distances, enabling us to successfully reconstruct the universe’s history and how we came to be. But as time passes, a novel form of energy in our universe — dark energy — increasingly dominates the expansion of space. As dark energy takes over, it accelerates the universe’s expansion, which gradually removes the key information needed to draw the conclusions we’ve reached today.

It’s enough to make one wonder: If we were born in the far future instead of today, would we be able to learn about the Big Bang at all? That’s what Patreon supporter Aaron Weiss wanted to know, asking:

“[A]t some point in the future, all objects not gravitationally bound to us will recede away. [T]he only points of light in the night sky will be objects in our Local Group. At that point in time, will there be any evidence of the universe’s expansion that might suggest to future astronomers that there are/were stars and galaxies beyond what would be visible to them? Would they have lines of sight that lead to nothing but the CMB?”

Does our ability to answer fundamental questions about the universe hinge upon when and where we happen to exist in cosmic history? Let’s look to the far future to find out.

The cosmic microwave background appears very different to observers at different redshifts, because they’re seeing it as it was earlier in time. In the far future, this radiation will shift into the radio and its density will drop rapidly, but it will never disappear entirely. (Credit: NASA/BlueEarth; ESO/S. Brunier; NASA/WMAP)

Today, there are four major pieces of evidence that we typically consider as the cornerstones of the hot Big Bang. The whole reason we consider the Big Bang as the unchallenged scientific consensus is because it’s the only framework, consistent with the laws of physics (like Einstein’s General Relativity), that explains the following four observations:

the expanding universe, discovered through the redshift-distance relation for galaxies (sketched just after this list)
the abundance of the light elements, as measured through various gas clouds, nebulae, and stellar populations across the universe
the leftover glow from the Big Bang, which is today’s cosmic microwave background, as directly detected via microwave and radio observatories
the growth of large-scale structure in the universe, as revealed by galaxy evolution and their clumping and clustering patterns seen across cosmic time
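As a rough illustration of that first cornerstone, the redshift-distance relation for relatively nearby galaxies is captured by Hubble's law (the standard relation, stated here for reference rather than quoted from the article):

\[
v = H_{0}\, d ,
\]

where v is a galaxy's recession speed inferred from its redshift, d is its distance, and H0 is the present-day expansion rate, roughly 70 km/s/Mpc.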


It’s important to remember that cosmology, like all branches of the astronomical sciences, is fundamentally driven by observations. Whatever our theories predict, we can only compare them to observations in the universe. The way we discovered each of these phenomena in our universe has its own remarkable story, but it’s a story that won’t be around permanently for us to always observe.

The growth of the cosmic web and the large-scale structure in the Universe, shown here with the expansion itself scaled out, results in the Universe becoming more clustered and clumpier as time goes on. Initially small density fluctuations will grow to form a cosmic web with great voids separating them. However, once the nearest galaxies recede to too-great distances, we will have extraordinary difficulty in reconstructing the evolutionary history of our cosmos. (Credit: Volker Springel)

The reason is straightforward: the conclusions that we draw are informed by the light that we can observe. When we look out at the universe with our best modern tools, we see lots of objects within our own galaxy — the Milky Way — as well as many objects whose light originates from far beyond our own cosmic backyard. Although this is something we take for granted, perhaps we shouldn’t. After all, the conditions in our universe today won’t be the same as those in the distant future.

Our home galaxy currently extends a little over 100,000 light-years in diameter, and it contains roughly ~400 billion stars, as well as copious amounts of gas, dust, and dark matter, with a wide variety of stellar populations: old and young, red and blue, low-mass and high-mass, and containing both small and large fractions of heavy elements. Beyond that, we have perhaps 60 other galaxies within the Local Group (within about ~3 million light-years), and somewhere around 2 trillion galaxies littered throughout the visible universe. By looking at objects farther away in space, we’re actually measuring them over cosmic time, which enables us to reconstruct the history of the universe.

Fewer galaxies are seen nearby and at great distances than at intermediate ones, but that’s due to a combination of galaxy mergers, evolution, and our inability to see the ultra-distant, ultra-faint galaxies themselves. Many different effects are at play when it comes to understanding how the light from the distant universe gets redshifted. 
(Credit: NASA / ESA)

The problem, however, is that the universe isn’t merely expanding, but that the expansion is accelerating due to the existence and properties of dark energy. We understand that the universe is a struggle — a race, of sorts — between two main players:

the initial expansion rate that the universe was “born” with at the onset of the hot Big Bang

the sum total of all the various forms of matter and energy within the universe


The initial expansion compels the fabric of space to expand, stretching all unbound objects farther and farther away from one another. Based on the total energy density of the universe, gravitation works to counteract that expansion. As a result, you can imagine three possible fates for the universe:
expansion wins, and there isn’t enough gravitation in all the existing “stuff” to counteract the initial large expansion, and everything expands forever
gravitation wins, and the universe expands to a maximum size and then recollapses
a situation between the two, where the expansion rate asymptotes to zero, but never reverses itself

That was what we expected. But it turns out that the universe is doing a fourth, and rather unexpected, thing.

The different possible fates of the universe, with our actual, accelerating fate shown at the right. After enough time goes by, the acceleration will leave every bound galactic or supergalactic structure completely isolated in the universe, as all the other structures accelerate irrevocably away. We can only look to the past to infer dark energy’s presence and properties, which require at least one constant. But its implications are larger for the future. (Credit: NASA & ESA)

For the first few billion years of our cosmic history, it appeared as though we were right on the border between eternal expansion and an eventual recontraction. If you were to observe distant galaxies over time, each would have continued to recede from us. However, their inferred recession speed — as determined from their measured redshifts — appeared to slow down over time. That’s just what you’d expect for a matter-rich universe that was expanding.

But about six billion years ago, those same galaxies suddenly started to recede from us more quickly. In fact, the inferred recession speed of every object that isn’t already gravitationally bound to us — i.e., that’s outside of our Local Group — has been increasing over time, a finding that’s been confirmed by a wide suite of independent observations.

The culprit? There must be a new form of energy permeating the universe that’s inherent to the fabric of space, which doesn’t dilute but rather maintains a constant energy density as time goes on. This dark energy has come to dominate the energy budget of the universe, and will take over entirely in the far future. As the universe continues to expand, matter and radiation get less dense, but dark energy’s density remains constant.
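In the standard picture, the way each component's energy density scales with the expansion factor a makes this explicit (standard relations, stated here for reference):

\[
\rho_{\mathrm{matter}} \propto a^{-3}, \qquad
\rho_{\mathrm{radiation}} \propto a^{-4}, \qquad
\rho_{\Lambda} = \text{constant},
\]

so as a grows, matter dilutes with the volume, radiation dilutes and redshifts, and dark energy, treated here as a cosmological constant (the simplest case consistent with the article), inevitably comes to dominate.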
While matter (both normal and dark) and radiation become less dense as the Universe expands owing to its increasing volume, dark energy is a form of energy inherent to space itself. As new space gets created in the expanding universe, the dark energy density remains constant. In the far future, dark energy will be the only component of the universe important for determining our cosmic fate. 
(Credit: E. Siegel/Beyond the Galaxy)

This will have many effects, but one of the more fascinating things that will occur is that our Local Group will remain gravitationally bound together. Meanwhile, all of the other galaxies, galaxy groups, galaxy clusters, and any larger structures will all accelerate away from us. If we had come into existence at a later date after the Big Bang — 100 billion or even a few trillion years after the Big Bang, as opposed to 13.8 billion years — most of the evidence we presently use to infer the Big Bang would, by then, be completely removed from our view of the universe.

Our first hint of the expanding universe came from measuring the distance to, and the redshifts of, the nearest galaxies beyond our own. Today, those galaxies are only a few million, to a few tens of millions, of light-years away from us. They’re bright and luminous, easily revealed with the smallest telescopes or even a pair of binoculars. But in the far future, the galaxies of the Local Group will all merge together, and even the closest galaxies beyond our Local Group will have receded away to tremendously large distances and incredible faintness. Once enough time passes, even today’s most powerful telescopes would reveal not a single galaxy beyond our own, even if they were to observe the abyss of empty space for weeks on end.
Looking back through cosmic time in the Hubble Ultra Deep Field, ALMA traced the presence of carbon-monoxide gas. This enabled astronomers to create a three-dimensional image of the star-forming potential of the cosmos, with gas-rich galaxies shown in orange. In the far future, larger, longer-wavelength observatories will be required to reveal even the closest galaxies. 
(Credit: R. Decarli (MPIA); ALMA (ESO/NAOJ/NRAO))

This accelerated expansion, brought on by the dominance of dark energy, would also steal from us critical information about the other cornerstones of the Big Bang.
Without any other galaxies or clusters/groups of galaxies to observe beyond our own, there’s no way to measure the large-scale structure of the universe, and infer how matter clumped, clustered, and evolved throughout it.
Without populations of gas and dust outside of our own galaxy, particularly with different abundances of heavy elements, there’s no way to reconstruct the early, initial abundance of the lightest elements before the formation of stars.
After a tremendous amount of time, there will be no cosmic microwave background anymore, as that leftover radiation from the Big Bang will become so sparse and low-energy, stretched and rarified by the expansion of the universe, that it will no longer be detectable.

On the surface, it appears that with all four of today’s cornerstones gone, we’d be completely unable to learn about our true cosmic history and the early, hot, dense stage that gave rise to the universe as we know it. Instead, we’d see that whatever our Local Group becomes — likely an evolved, gas-free, and potentially elliptical galaxy — it would appear that we were all alone in an otherwise empty universe.

The galaxy shown at the center of the image here, MCG+01-02-015, is a barred spiral galaxy located inside a great cosmic void. It is so isolated that if humanity were located in this galaxy instead of our own and developed astronomy at the same rate, we wouldn’t have detected the first galaxy beyond our own until we reached technology levels only achieved in the 1960s. In the far future, every inhabitant in the universe will have an even more difficult time reconstructing our cosmic history. 
(Credit: ESA/Hubble & NASA, N. Gorin (STScI), Acknowledgement: Judy Schmidt)

But that doesn’t mean we’ll have no signals at all that could lead us to conclusions concerning our cosmic origins. Many clues would still remain, both theoretically and observationally. With a clever enough species investigating them, they might be able to draw correct inferences about the hot Big Bang, which could then be borne out through the process of scientific investigation.

Here’s how a species from the far future could figure it all out.

Theoretically, once we discovered the present law of gravity — Einstein’s general relativity — we could apply it to the entire universe, arriving at the same early solutions that we discovered here on Earth during the 1910s and the 1920s, including the solution for an isotropic and homogeneous universe. We would discover that a static universe that was filled with “stuff” was unstable, and therefore must be expanding or contracting. Mathematically, we would work out the consequences of an expanding universe as a toy model. But on the surface, the universe would appear to be exhibiting a steady-state solution. However, observational clues would still exist.
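Concretely, the key solution here is the Friedmann equation for a homogeneous, isotropic universe (standard form, given for reference rather than taken from the article):

\[
\left(\frac{\dot a}{a}\right)^{2} = \frac{8\pi G}{3}\,\rho - \frac{k c^{2}}{a^{2}},
\]

where a(t) is the scale factor describing the size of the universe, ρ is the total energy density, and k encodes spatial curvature. A static solution, with the scale factor unchanging and filled with ordinary matter, is unstable to small perturbations, which is the instability referred to above.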

The cluster Terzan 5 has many older, lower-mass stars present within (faint, and in red), but also hotter, younger, higher-mass stars, some of which will generate iron and even heavier elements. It contains a mix of Population I and Population II stars, indicating that this cluster underwent multiple episodes of star formation. The different properties of different generations can lead us to draw conclusions about the initial abundances of the light elements.
 (Credit: NASA/ESA/Hubble/F. Ferraro)

First off, stellar populations within our own galaxy would still come in tremendous varieties. The longest-lived stars in the universe can persist for many trillions of years. New episodes of star formation, although they’d become somewhat rare, should still occur, as long as our Local Group’s gas doesn’t become totally depleted. Through the science of stellar astronomy, this means we’d still be able to determine not only the age of various stars, but their metallicities: the abundances of the heavy elements with which they were born. Just as we do today, we’d be able to extrapolate back to “before the first stars formed, how abundant were the various elements,” and we would find the same abundances of helium-3, helium-4, and deuterium that the science of Big Bang nucleosynthesis yields today.
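For reference, the primordial abundances that standard Big Bang nucleosynthesis yields (textbook values quoted for orientation, not figures from this article) are approximately:

\[
Y_{p}\!\left({}^{4}\mathrm{He}\right) \approx 0.25 \ \text{by mass}, \qquad
\mathrm{D/H} \approx 2.5\times 10^{-5}, \qquad
{}^{3}\mathrm{He}/\mathrm{H} \approx 1\times 10^{-5},
\]

with only a trace of lithium-7 at a few parts in ten billion and essentially nothing heavier; recovering numbers like these from ancient, metal-poor stars is what would point a far-future civilization back to a hot, dense beginning.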

We could then look for three specific signals:
The severely redshifted leftover glow from the Big Bang, with just a few extremely long-wavelength radio-frequency photons arriving from all over the sky. A large, ultra-cool radio observatory in space could find it, but we’d have to know how to build it.
An even more severe and obscure signal would arise from very early times: the 21-cm spin-flip transition of hydrogen. When you form a hydrogen atom from protons and electrons, 50% of the atoms have aligned spins and 50% have anti-aligned spins. Over timescales of around ~10 million years, the aligned atoms will “flip” their spins, emitting radiation of a very specific wavelength that gets redshifted. If we knew the wavelength and sensitivity ranges in which we needed to look, we could detect this background (the wavelength scaling is sketched just after this list).
The ultra-distant, ultra-faint galaxies that lie at the edge of the universe but never fully disappear from our view. This would require building a telescope large enough and in the proper wavelength band. We’d just have to know enough to justify building something so resource-intensive to look to such great distances, despite not having any direct evidence of such objects nearby.
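As a rough guide to the second of those signals, the spin-flip line is emitted at a rest wavelength of about 21 cm (a frequency of 1420 MHz) and is stretched by the expansion of the universe (a standard relation quoted here for reference):

\[
\lambda_{\mathrm{obs}} = 21\ \mathrm{cm} \times (1+z),
\]

so emission from early cosmic times arrives at wavelengths of metres or more, deep in the low-frequency radio band; exactly how deep depends on how much expansion has occurred since the hydrogen emitted it.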

This artist’s rendering shows a night view of the Extremely Large Telescope in operation on Cerro Armazones in northern Chile. The telescope is shown using lasers to create artificial stars high in the atmosphere. A larger, longer-wavelength observatory, most probably in space, will be required to reveal even the nearest galaxies in the far future. 
(Credit: ESO/L. Calçada.)

It’s an incredibly tall order to imagine the universe as it will be in the far future, when all of the evidence that led us to our present conclusions is no longer accessible to us. Instead, we have to think about what will be present and observable — both obviously and only if you figure out how to search for it — and then imagine a path towards discovery. Even though the task will be more difficult hundreds of billions, or even trillions, of years from now, a civilization smart and savvy enough would be able to create their own “four cornerstones” of cosmology that led them to the Big Bang.

The strongest clues would come from the same theoretical considerations we applied back in the early days of Einstein’s general relativity and the observational science of stellar astronomy, in particular an extrapolation to the primordial abundances of the light elements. From those pieces of evidence, we could figure out how to predict the existence and properties of the leftover glow from the Big Bang, the spin-flip transition of neutral hydrogen, and eventually the ultra-distant, ultra-faint galaxies that can still be observed. It won’t be an easy task. But if uncovering the nature of reality is at all important to a far-future civilization, it can be done. Whether they succeed, however, is entirely up to how much they’re willing to invest.

Send in your Ask Ethan questions to startswithabang at gmail dot com!



'MAYBE' TECH

  

POWERING THE FUTURE
The race is on to replicate the power of the sun with fusion energy

PUBLISHED FRI, OCT 1 2021
Katie Brigham
@KATIE_BRIGHAM

The idea of fusion power has intrigued scientists for nearly 100 years, when they first discovered the process that powers the sun and stars. A fusion reaction, which occurs when atoms fuse together, could generate four times more energy than today’s fission reactors, and about four million times more than burning coal, without producing any greenhouse gases. 
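Those ratios refer to energy released per unit mass of fuel. A back-of-the-envelope comparison using standard textbook figures (not numbers from the article): a single deuterium-tritium fusion reaction releases about 17.6 MeV from five nucleons, a uranium-235 fission about 200 MeV from roughly 236 nucleons, and burning carbon only a few electron-volts per atom, so

\[
\frac{17.6\ \mathrm{MeV}/5}{200\ \mathrm{MeV}/236} \approx 4,
\qquad
\frac{\sim 3.5\ \mathrm{MeV\ per\ nucleon\ (fusion)}}{\sim 0.1\text{--}1\ \mathrm{eV\ per\ nucleon\ (combustion)}} \sim 10^{6}\text{--}10^{7},
\]

in line with the factors quoted above.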

Additionally, the process does not generate long-term radioactive waste, fusion reactors cannot melt down, and fusion fuel (hydrogen) is readily available. Once scientists finally manage to create a sustained fusion reaction, it could be a huge game-changer for the energy industry.

ITER, a $22 billion international megaproject in the south of France, is the best-funded fusion endeavor, paid for by the governments of its seven member nations. It hopes to be the first to demonstrate the viability of fusion by generating more energy than it consumes. But venture capitalists and billionaire investors are also pouring money into fusion start-ups, with hopes to commercialize fusion power within the next decade.

Can This $22 Billion Megaproject Make Nuclear Fusion Power A Reality?

Fusion is the process that powers the sun and the stars, and scientists are getting a lot closer to replicating it here on Earth. ITER, the $22 billion international fusion megaproject in the south of France, is the best-funded endeavor, paid for by the governments of its member nations. But VCs and private investors are also pouring money into fusion start-ups, with hopes to commercialize fusion power within the next decade. With a number of breakthroughs already this year, the race is on to prove that fusion power is not only possible, but integral to a clean energy future.


Scientists Uncover an Additional Threat to Antarctica’s Floating Ice Shelves

Ice Melange in Antarctica

Ice melange, a combination of ice shelf fragments, windblown snow and frozen seawater, can act as a glue to fuse large rifts in floating ice in Antarctica. Researchers at UCI and NASA JPL found that a thinning of the substance over time can cause rifts to open, leading to the calving of large icebergs. Credit: Beck / NASA Operation IceBridge

Thinning of rift-healing slush is identified as a major cause of iceberg calving events.

Glaciologists at the University of California, Irvine and NASA’s Jet Propulsion Laboratory have examined the dynamics underlying the calving of the Delaware-sized iceberg A68 from Antarctica’s Larsen C ice shelf in July 2017, finding the likely cause to be a thinning of ice melange, a slushy concoction of windblown snow, iceberg debris and frozen seawater that normally works to heal rifts.

In a paper published on September 27, 2021, in Proceedings of the National Academy of Sciences, the researchers report that their modeling studies showed melange thinning to be a major driver of ice shelf collapse. The circulation of ocean water beneath ice shelves and radiative warming from above, they say, gradually deteriorate ice melange over the course of decades.

As ice shelves are thought to buttress and prevent land-borne glaciers from more rapidly flowing into the ocean, this new knowledge about rift dynamics illuminates a previously underappreciated link between climate change and ice shelf stability.

“The thinning of the ice melange that glues together large segments of floating ice shelves is another way climate change can cause rapid retreat of Antarctica’s ice shelves,” said co-author Eric Rignot, UCI professor of Earth system science. “With this in mind, we may need to rethink our estimates about the timing and extent of sea level rise from polar ice loss – i.e., it could come sooner and with a bigger bang than expected.”

Using NASA’s Ice-sheet and Sea-level System Model, observations from the agency’s Operation IceBridge mission, and data from NASA and European satellites, the researchers assessed hundreds of rifts in the Larsen C ice shelf to determine which ones were most vulnerable to breaking. They selected 11 top-to-bottom cracks for in-depth study, modeling to see which of three scenarios rendered them most likely to break: if the ice shelf thinned because of melting, if the ice melange grew thinner, or if both the ice shelf and the melange thinned.

“A lot of people thought intuitively, ‘If you thin the ice shelf, you’re going to make it much more fragile, and it’s going to break,’” said lead author Eric Larour, NASA JPL research scientist and group supervisor.

Instead, the model showed that a thinning ice shelf without any changes to the melange worked to heal the rifts, with average annual widening rates dropping from 79 to 22 meters (259 to 72 feet). Thinning both the ice shelf and the melange also slowed rift widening but to a lesser extent. But when modeling only melange thinning, the scientists found a widening of rifts from an average annual rate of 76 to 112 meters (249 to 367 feet).

The difference, Larour explained, reflects the different natures of the substances.

“The melange is thinner than ice to begin with,” he said. “When the melange is only 10 or 15 meters thick, it’s akin to water, and the ice shelf rifts are released and start to crack.”

Even in winter, warmer ocean water can reach the melange from below because rifts extend through the entire depth of an ice shelf.

“The prevailing theory behind the increase in large iceberg calving events in the Antarctic Peninsula has been hydrofracturing, in which melt pools on the surface allow water to seep down through cracks in the ice shelf, which expand when the water freezes again,” said Rignot, who is also a NASA JPL senior research scientist. “But that theory fails to explain how iceberg A68 could break from the Larsen C ice shelf in the dead of the Antarctic winter when no melt pools were present.”

He said that he and others in the cryosphere studies community have witnessed ice shelf collapse on the Antarctic Peninsula, stemming from a retreat that began decades ago.

“We have finally begun to seek an explanation as to why these ice shelves started retreating and coming into these configurations that became unstable decades before hydrofracturing could act on them,” Rignot said. “While the thinning ice melange is not the only process that could explain it, it’s sufficient to account for the deterioration that we’ve observed.”

Reference: “Physical processes controlling the rifting of Larsen C Ice Shelf, Antarctica, prior to the calving of iceberg A68” by E. Larour, E. Rignot, M. Poinelli and B. Scheuchl, 27 September 2021, Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2105080118

Joining Rignot and Larour on this NASA-funded project were Bernd Scheuchl, UCI associate project scientist in Earth system science, and Mattia Poinelli, a Ph.D. candidate in geoscience and remote sensing at Delft University of Technology in the Netherlands.

How the global extinction crisis is linked to climate change

Oct 2, 2021
Global News

New research suggests our children face a dark future of climate change disasters. They’ll also live in a world with far fewer types of plants and animals. The U.S. Fish and Wildlife Service has declared 23 species extinct, as new research forecasts a dark future of climate change disasters and dwindling biodiversity. As Jackson Proskow reports, scientists say conservation efforts must shift in the face of the climate crisis.

Inside a Carbon Negative Power Station

Oct 1, 2021

Motherboard

The Next 50 Years: In Iceland, at the edge of the Arctic Circle, scientists are creating one of the first carbon-negative power stations, using geothermal technology and a revolutionary carbon capture process. They're now setting their sights on exporting this ground-breaking technology to the world, by creating CO2 hubs, where lands rich in basalt can become natural storage for the world's CO2. *This series is supported by Delta. VICE News retains full editorial authority.



Microsoft climate head says planting trees won’t be enough to remove CO2 from the air

PUBLISHED THU, SEP 30 2021
Catherine Clifford, @CATCLIFFORD


KEY POINTS

“It is cheaper and easier to establish trees and enrich soils than to deploy nascent technologies that capture carbon and store it geologically,” says Microsoft in an article published in the scientific journal Nature on Wednesday.

Currently, technical carbon capture technologies are “scarce, expensive and resource intensive,” the Nature article says.

Better carbon tracking and accounting tools are key to improving incentives in the market.





Charred trunks are seen on a tract of Amazon jungle, that was recently burned by loggers and farmers, in Porto Velho, Brazil August 23, 2019.
Ueslei Marcelino | Reuters

The current race to address carbon emissions is driving companies to support planting trees and other nature-based solutions. That’s because other forms of carbon capture technology are too expensive to be attractive right now.

That has to change, argues Microsoft in an article published in the scientific journal Nature on Wednesday.

“It is cheaper and easier to establish trees and enrich soils than to deploy nascent technologies that capture carbon and store it geologically,” the Nature article says.

In Jan. 2020, Microsoft announced that it aims to be carbon negative by 2030. That means that as a company it will remove more carbon dioxide from the atmosphere than it emits.

And by 2050, Microsoft aims to have removed all of the emissions it released since its founding in 1975.

Those are bold, audacious goals.

One year after making that pledge, Microsoft announced it purchased the removal of 1.3 million metric tons of carbon from 26 projects around the world. Some of those projects include initiatives to regenerate soil across US farms; expand forests in Peru, Nicaragua and the United States; and pay the Zürich, Switzerland-headquartered company, Climeworks, to use its technology and operate a machine in Iceland which “vacuums” carbon dioxide from the air and puts it into the ground where it mineralizes and turns to stone.

The Nature article summarizes what Microsoft learned, and its call to action for the market and policy makers. It is co-written by Microsoft chief environmental officer Lucas Joppa, other members of Microsoft’s sustainability team and experts from Columbia University and the Environmental Defense Fund. It also draws on lessons from data the payment processing company Stripe has made public.

Trees are great, but not enough

When Microsoft was seeking proposals to remove carbon from the atmosphere, most were for nature-based storage projects which would sequester carbon for less than 100 years.

The pitches that Stripe got for biosphere-based carbon storage projects — which means storage of carbon in plants and soils — cost $16 per ton of carbon dioxide.

Geosphere-based carbon storage project pitches — using technology to remove carbon dioxide and then storing it in rocks and minerals — cost anywhere from $20 to $10,000 per ton, and averaged $141, the article said.

Those prices were “similar” to what Microsoft received, the Nature article says.
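An illustrative comparison using the round figures above, and assuming flat per-tonne pricing (a simplification, since real projects are priced individually): removing Microsoft's 1.3 million tonnes entirely at the biosphere price versus the average geosphere price would cost roughly

\[
1.3\times 10^{6}\ \mathrm{t} \times \$16/\mathrm{t} \approx \$21\ \mathrm{million}
\qquad \text{versus} \qquad
1.3\times 10^{6}\ \mathrm{t} \times \$141/\mathrm{t} \approx \$183\ \mathrm{million},
\]

a gap that helps explain why buyers gravitate toward the cheaper, shorter-lived offsets discussed below.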


The trouble with the cheaper methods is they’re not as reliable. For instance, trees can be cut down, burned, or destroyed by pests. Natural solutions are also constrained by competing uses of land for vital purposes like agriculture and housing.

To be clear, Microsoft does support planting trees as a way to reduce carbon in the atmosphere. Microsoft joined the U.S. chapter of 1t.org, an effort to plant 1 trillion trees that’s supported by Salesforce CEO Marc Benioff, when it launched in August 2020, a spokesperson for Microsoft told CNBC.

But even a spokesperson for that effort admits planting trees will not be sufficient.

“Key to the main thrust of the [Nature] piece is that it describes nature-based solutions as one part of the toolkit of carbon reduction strategies — we agree with that aspect entirely,” Michael Becker, director of communications for 1t.org, told CNBC.

“There’s been a lot of negative media regarding tree initiatives lately, namely the strawman argument that ‘planting trees can solve the crisis’ — something we firmly disagree with,” Becker said. “Trees are one of our best nature-based solutions but can’t solve the climate crisis on their own.”

Also, Becker emphasized that 1t.org is focused on tree growth, not just planting.

“Tree planting initiatives are a key part of reforestation commitments and essential in areas that are too degraded to recover on their own,” Becker told CNBC. “We use ‘grow’ to emphasize that simply putting a seedling in the ground is not sufficient – that tree needs to be supported to maturity to reach its full potential. It also needs to be the right species, planted in the right conditions, based in sound science.”

Technology, tracking, and standards must improve



In the Nature article, Microsoft and the other authors argue the market for carbon removal is small and needs to get bigger and better.

Microsoft received 189 proposals to remove 154 megatons of carbon dioxide, but only 55 of those megatons were available immediately, and only two megatons met what Microsoft considers high-quality carbon dioxide removal.

That market will grow, but more investment is needed to develop carbon capture technologies, and companies need more incentives to use the best carbon capture strategies.

“Nature-based removal and storage, and technology-enabled removal and geosphere-based storage are not equivalent commodities and should not be valued as such,” the article says. “Today’s pricing on a per-tonne basis encourages companies to buy the lowest-quality carbon offsets.”

To improve those incentives, we’ll need much more robust and standardized carbon accounting standards, the Nature article says.

Today, there aren’t even reliable tools for tracking carbon at scale. That makes it hard for companies to fully monitor the risks associated with various strategies.

There are groups helping to develop better tracking, including the international non-profit Science-based Targets Initiative, the Oxford offsetting principles from researchers at the University of Oxford, and the business initiative Transform to Net Zero, all mentioned in the article. Also, the Nature article recognizes the work of the non-profit organization Greenhouse Gas Protocol, which has guidelines for assessing emissions a company has internally and from energy purchased off-site.

But estimating emissions from an entire supply chain is still very hard — and vital.

“Three-quarters of Microsoft’s emissions come from these, including building materials, business travel, product life cycles and the electricity that customers consume when using Microsoft’s products,” the article says.