Thursday, April 01, 2021

Astronomers discover an elusive 'Goldilocks' black hole

Ryan Morrison For Mailonline 3/30/2021

A 'Goldilocks' black hole that is about 55,000 times the mass of the sun has been discovered by astronomers, who say it is 'not too big, and not too small'.

The stellar phenomenon, discovered by University of Melbourne astronomers, was found about three billion light years away thanks to a technique involving the detection of the light from a gamma-ray burst bending on its way to the Earth.

Astronomers say the size of the 'intermediate black hole' is between a small 'supernova' black hole and a supermassive black hole at the heart of a galaxy.

It could be an 'ancient relic' dating to the early universe before the first stars and galaxies formed, suggests co-author Professor Eric Thrane from Monash University.

These 'intermediate black holes' may have been the seeds that over time led to the supermassive black holes that live at the heart of every known galaxy today.

While this one is three billion light years away, researchers estimate that there are some 46,000 intermediate-mass black holes in the vicinity of the Milky Way galaxy.

© Provided by Daily Mail The new black hole was found through the detection of a gravitationally lensed gamma-ray burst

The discovery, through gravitational lensing, of this new long predicted type of black hole fills in a 'missing link' in our understanding of the universe, the team explained.

It has been dubbed the 'Goldilocks' black hole as it sits right in the middle of all known black hole types, not too big and not too small.

A typical black hole, created from the explosion of a massive star at the end of its life, will be up to 10 times the mass of the sun.

In contrast, a supermassive black hole that sits at the centre of a galaxy, including the recently photographed one in M87, can be billions of times the mass of the sun.

The new type of 'Goldilocks' black hole is about 55,000 times the mass of our own star, filling in a gap that has left astronomers baffled for years.

Lead author and University of Melbourne PhD student, James Paynter, said the latest discovery sheds new light on how supermassive black holes form.

"While we know that these supermassive black holes lurk in the cores of most, if not all galaxies, we don't understand how these behemoths are able to grow so large within the age of the Universe," he said.

 
© Provided by Daily Mail Astronomers say the size of the 'intermediate black hole' is between a small 'supernova' black hole and a supermassive black hole at the heart of a galaxy

The new black hole was found through the detection of a gravitationally lensed gamma-ray burst, a half-second flash of high-energy light.

This light was emitted by a pair of merging stars, and was observed to have a tell-tale 'echo', caused by the intervening intermediate-mass black hole.

The black hole bends the path of the light from the gamma-ray burst on its way to Earth, so that astronomers see the same flash twice.

Powerful software developed to detect black holes from gravitational waves was adapted to establish that the two flashes are images of the same object.
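
To make the mechanism concrete, the sketch below evaluates the standard point-mass-lens time-delay formula that underlies this kind of echo search; the lens mass, source offset and lens redshift plugged in are illustrative assumptions, not values from the paper.

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def point_lens_delay(mass_msun: float, y: float, z_lens: float) -> float:
    """Time delay (s) between the two images of a point-mass gravitational lens.

    mass_msun : lens mass in solar masses
    y         : source offset in units of the Einstein radius (assumed)
    z_lens    : lens redshift (the delay scales with the redshifted mass; assumed)
    """
    m_z = mass_msun * M_SUN * (1.0 + z_lens)   # redshifted lens mass
    root = math.sqrt(y * y + 4.0)
    geometry = 0.5 * y * root + math.log((root + y) / (root - y))
    return (4.0 * G * m_z / C**3) * geometry

# Illustrative numbers only: a 55,000-solar-mass lens, source offset y = 1,
# lens redshift z = 0.3 (an assumption, not a figure from the study).
print(point_lens_delay(55_000, 1.0, 0.3))   # ~2.9 s: an 'echo' of order seconds
```

The delay scales linearly with the lens mass, which is why the spacing between the two flashes lets the researchers weigh the black hole.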

© Provided by Daily Mail The Event Horizon Telescope (EHT) collaboration, which produced the first ever image of a black hole, released in 2019, has today unveiled a new view of the massive object at the centre of the Messier 87 (M87) galaxy: how it looks in polarised light



INTERMEDIATE MASS BLACK HOLES: THE MISSING LINK IN UNIVERSE EVOLUTION


Intermediate mass black holes are the 'missing link' in universe evolution.

They sit between those created from an exploding star and supermassive black holes at the heart of a galaxy.

One recently detected using gravitational lensing was 55,000 times more massive than the sun.

They are thought to have been the 'seeds' that led to the creation of supermassive black holes.

Researchers estimate that there are some 46,000 intermediate-mass black holes in the vicinity of the Milky Way.

They are thought to sit at the heart of globular clusters, collections of stars within a galaxy bound by gravity.

"This newly discovered black hole could be an ancient relic - a primordial black hole - created in the early Universe before the first stars and galaxies formed," said Thrane.

"These early black holes may be the seeds of the supermassive black holes that live in the hearts of galaxies today."

Paper co-author and gravitational lensing pioneer Professor Rachel Webster, from the University of Melbourne, said the findings have the potential to help scientists make even greater strides in understanding the evolution of the universe.

"Using this new black hole candidate, we can estimate the total number of these objects in the universe,' Webster explained.

'We predicted that this might be possible 30 years ago, and it is exciting to have discovered a strong example.'

The researchers estimate that some 46,000 intermediate mass black holes are in the vicinity of our Milky Way galaxy.

These groupings of intermediate-mass black holes have long been thought to sit within the cores of globular clusters.

A globular cluster is a spherical collection of stars that are tightly bound by gravity, found in disc and spiral galaxies.

There are 150 known to exist in the Milky Way with many more likely still to be found.

Galaxy M87, far larger than the Milky Way, is thought to have as many as 13,000 globular clusters.

The details of the discovery have been published in the journal Nature Astronomy.


A 'starter kit' for supermassive black holes?

Scientists have reported the discovery of a rare, medium-sized black hole that may help answer one of the more tantalising questions in astronomy: how do their supermassive counterparts come into being?

© Valentina BRESCHI Illustration showing the different parts of a black hole

There are two well-known sizes of black hole -- at one end, so-called stellar-class ones, which are typically three to ten times the mass of our Sun -- and at the other, supermassive ones, found at the centre of most galaxies, including the Milky Way, which are millions to billions of times heavier.

© Handout Astronomers have yet to figure out the origin story of matter-eating monsters called supermassive black holes

The newly detected 'goldilocks' black hole -- about 55,000 solar masses -- could be a missing link between these two extremes, scientists suggested Monday in the journal Nature Astronomy.

Up to now, only a handful of intermediate-mass black holes -- between 100 and 100,000 solar masses -- have been detected, and none have been squarely in the middle of that range.

A black hole is a celestial object that compresses a huge mass into an extremely small space. Its gravitational pull is so strong that nothing can escape it, not even light.

Stellar-class black holes form when a dying star collapses, but astronomers have yet to figure out the origin story of the larger, matter-eating monsters.

"How do we get so many supermassive black holes in the Universe?" asked co-author Rachel Webster, a professor at the University of Melbourne.

Senior author Eric Thrane, a professor at Monash University, said the newly discovered black hole "could be an ancient relic, a primordial black hole created before the first stars and galaxies formed."

"These early black holes may be the seeds of the supermassive black holes that live in the hearts of galaxies today."

- Born that way? -


The new specimen was observed indirectly thanks to a slight deviation in light from a stellar explosion in the early Universe, some eight billion light years distant.

Using a technique pioneered by Webster, astronomers analysed thousands of these gamma-ray bursts -- caused either by the violent collapse of a star or the merger of two stars -- looking for signs of gravitational lensing.

This occurs when an object -- in this case, the intermediate black hole -- acts as a lens and fleetingly bends the path of the light as it travels toward Earth, such that astronomers see the same flash twice.

While Thrane, Webster and lead author James Paynter, a PhD candidate, were able to measure the mass of their intermediate black hole with precision, they could only speculate on how it was formed.

"Broadly, there are three possibilities," Webster told AFP.

It could have been forged from the merger between two lesser black holes, as was true for another, much smaller intermediate black hole discovered in May 2019.

Alternatively, it might have been born as a stellar-class black hole and slowly accumulated mass as it sucked matter into its maw.

"But this is a slow process," said Webster. "It is hard to grow supermassive black holes from a solar mass seed over the age of the Universe."

A more likely scenario is that the newly discovered black hole "was born that way," she said. "This could provide the answer."

The authors think that there are about 40,000 intermediate black holes in our own galaxy alone.

Gravitational waves -- the ripples in spacetime that also allow for the detection of black holes -- were first measured in September 2015, earning the lead scientists a physics Nobel two years later.

Albert Einstein anticipated gravitational waves in his general theory of relativity, which theorised that they spread through the Universe at the speed of light.

Dutch couples mark 20th anniversary of world's first same-sex marriages

AMSTERDAM (Reuters) - Twenty years ago, Dutch couple Gert Kasteel and Dolf Pasker made history when they tied the knot in the world's first legally-recognised same-sex wedding in the Netherlands.

© Reuters/PIROSCHKA VAN DE WOUW Dutch couple Gert Kasteel and Dolf Pasker look back on the day they tied the knot in the world's first legally-recognised same-sex wedding

They were among four gay couples - three male and one female - to be married shortly after midnight by the mayor of Amsterdam on April 1, 2001.

On Thursday, they celebrated their 20th anniversaries in small groups or at home due to COVID-19 social distancing rules that prevented large gatherings.

"It's nicer to say to other people 'he's my husband, he's my man'," said Dolf, sitting next to Gert as they flipped through an album of photos and newspaper clippings of the wedding, which made headlines worldwide. "It has helped me to accept myself."

All four gay marriages have passed the test of time. One of the men, Frank Wittebrood, died of a heart attack in 2011 at 55.

Those who participated looked back with pride at having made legal history.

"People told me that the Netherlands would be the first and the last country (to pass same-sex marriages), the rest of the world won't follow you," said Henk Krol, a lawmaker who supported the bill when it passed the Dutch parliament in 2000.

"Almost 30 countries in the world followed the Dutch example," he said.

Most European Union countries, Britain, the United States, Australia, Mexico and South Africa are among 29 nations to have legalised same-sex marriage since 2001.

"I'm very proud that it's possible," said Gert, who before he could complete his sentence had Dolf jump in and finish it: "that we could play a little part of it. We made history."

(Reporting by Esther Verkaik; Writing by Anthony Deutsch; Editing by Garet




Trans Mountain pipeline expansion will lead to $11.9B in losses for Canada, study says

Bethany Lindsay CBC
3/31/2021
© Jason Franson/The Canadian Press The estimated construction cost for the Trans Mountain pipeline expansion has ballooned from $5.4 billion to $12.6 billion.

A new study from researchers in B.C. estimates that Canada will lose $11.9 billion because of the Trans Mountain pipeline expansion project, but some industry experts question that conclusion.

The paper from a team at Simon Fraser University's School of Resource and Environmental Management released on Wednesday argues there is no likely scenario in which the project would lead to a net benefit.

"The $11.9 billion loss to Canada is primarily due to a more than doubling of the Trans Mountain construction costs from the original $5.4 billion to $12.6 billion, combined with new climate policies just confirmed by the Supreme Court that will reduce the demand for oil," lead author and SFU professor Thomas Gunton said in a press release.

The Canadian government bought the Trans Mountain pipeline from energy giant Kinder Morgan in 2018 for $4.5 billion. The SFU study points out that Ottawa has not provided the public with an evaluation of the costs and benefits that led to that decision.

The researchers suggest the government would be better off shelving the project entirely and using the funds to invest in alternative energy projects.

"Private sector companies such as BP and Shell are responding to declining demand by shifting investments from oil to green energy and the federal government should follow their lead," Gunton said.

The government and an industry expert dispute those arguments. The government said the pipeline will eventually be sold to private investors, while an industry expert noted that demand for oil is nearly back to pre-pandemic levels.

The expansion project involves twinning the existing 1,150-kilometre pipeline between Strathcona County, Alta., and Burnaby, B.C. It will add 980 kilometres of new pipeline and increase capacity from 300,000 barrels a day to 890,000 barrels a day.

The project is expected to be finished by December 2022, and is currently about 20 per cent complete, according to Trans Mountain.

'Responsible investment'

In a written statement, the federal government said it's confident that the Trans Mountain project is a "responsible investment" for Canadians, and it will invest anything earned from it into clean energy projects.

"The government does not intend to be the long-term owner of Trans Mountain Corporation. It intends to launch a divestment process after the expansion project is further de-risked and after engagement with Indigenous groups has concluded," said the Office of the Deputy Prime Minister and Minister of Finance in an email.

Richard Masson, an executive fellow at the University of Calgary School of Public Policy and former CEO of the Alberta Petroleum Marketing Commission, questioned the study's conclusion.

"I think some of the assumptions that they made would be strongly challenged by industry," Masson said.

For example, he said, Canadian oil "production hasn't grown as much as we would like it to because we've been constrained on having a lack of pipeline capacity."

The existing Trans Mountain pipeline is operating at maximum capacity.

The latest modelling by the Canada Energy Regulator shows the capacity needed in the future is uncertain — depending in part on what happens with climate policies.

The regulator's Energy Futures Report shows a need for Keystone XL, the Trans Mountain expansion, and Enbridge's Line 3 pipeline under its reference scenario, which assumes "a lack of future domestic and global climate policy action."

However, under what the regulator calls its evolving scenario, Canada brings in new greenhouse gas reducing measures to meet its stated climate targets. Canadian oil and gas production declines, and there could be enough export capacity with either Enbridge's Line 3 or the Trans Mountain expansion.

The energy regulator doesn't make a case for or against any pipeline in its scenarios, but notes "the evolving scenario does project that, in some years, crude oil available for export is significantly lower than total pipeline capacity."

Although demand for oil slowed down in 2020, Masson said, it is close to reaching pre-pandemic levels and some people in the industry expect that demand to grow, despite countries trying to simultaneously meet the goals set in the Paris climate agreement.

Benefit-cost analysis

The SFU team said they decided to perform the benefit-cost analysis of the project because of the sheer magnitude of Canada's investment and significant changes that have happened since it was approved.

Their study attempts to estimate how changing economic and political conditions could affect the profitability of the project.

Those conditions include an increase in construction costs, the cancellation of the Keystone XL pipeline to the U.S., stronger federal policies to address climate change and a weaker market for oil.

None of those factors were taken into account when the National Energy Board recommended the approval of the expansion project in 2016 and then again in 2019.

The study suggests that the pipeline expansion will create excess capacity that isn't needed and will increase the risk of environmental damage.

The ballooning price tag for construction means that the tolls charged to shippers for using the pipelines, which were agreed upon in 2017, will not come close to covering the capital costs. The tolls were set to cover $7.4 billion in costs — or about 59 per cent of the current estimate for construction.

Bitcoin uses as much energy as Sweden and is on course to use even more. Experts say that's a 'major problem' for its future.

insider@insider.com (Billy Bambrough) 
3/31/2021
The Kizelovskaya State District Power Plant at the Gubakhinsky Coke and Chemical Works. Russian businessman Alexei Kolesnik bought the Kizelovskaya State District Power Plant to create a data center and a bitcoin mining farm in 2018. Maxim Kimerling/TASS via Getty Images

As bitcoin soars, the matrix of computers that run its software now uses as much energy as Sweden.

Rising value incentivizes bitcoin "miners" to compete with each other to discover new tokens.

"I don't think the bitcoin industry is doing itself any favors by refusing even to accept that bitcoin's energy use is a problem.
"

As bitcoin surges to unprecedented value, the sprawling matrix of computers around the world that run its software is now consuming as much energy a year as Sweden, the latest calculations suggest.

The higher the price, the more electricity this network uses. Iran was recently rocked by power outages that were partly blamed on bitcoin. Bill Gates recently warned bitcoin was "not a great climate thing." U.S. Treasury Secretary Janet Yellen has called its energy use "staggering."


Conceived to defy central banks in the fallout of the financial crisis, bitcoin started life so counter-cultural that no one really knows who created it.


It has soared from around $10,000 through most of last year to around $58,000 now, thanks to investors who fear traditional currencies are set to lose value, an influx of traders who speculate on the future price, and Elon Musk who tweets that you can use it to buy Teslas. It is now seriously talked of as a potential new global reserve currency.

But as Wall Street banks roll out bitcoin services, they may find its environmental costs hard to square with increasingly climate-conscious shareholders and customers. Experts tell Insider that bitcoin faces a Catch-22.

Like the cryptocurrency itself, bitcoin's community is decentralized, defiant, and nebulous. No one can simply tell it to heed growing calls to reduce its carbon footprint.

Alex de Vries, a Dutch economist who created the Bitcoin Energy Consumption Index, estimates the electricity used has doubled since 2017 to between 78 terawatt hours (TWh) and 101 TWh a year. More than half of bitcoin miners are in China, where most use coal-fired power.

Bitcoin has no physical form. "Mining" refers to the network of computers finding new tokens by solving complex calculations.


Hardware at the SberBit cryptocurrency mining equipment facility in Moscow, Russia, in 2017. Vyacheslav Prokofyev/TASS via Getty Images

New tokens these calculations uncover are a reward to the miners for using their computing power and electricity to secure the network against hacks and record transactions on bitcoin's decentralized ledger, known as the blockchain.

As the price climbs, those running the vast networks of computers dedicated to solving these calculations can sell the tokens they earn and direct more computing power toward the network, creating a cyclical effect as they compete with other miners to find new bitcoin first.
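
For the technically curious, the toy loop below captures the essence of those "complex calculations": hunting for a nonce whose hash clears a difficulty target. Real mining double-hashes 80-byte block headers on specialized chips against a far harder network-wide target, so treat this as a sketch of the idea only.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Toy proof-of-work: find a nonce whose SHA-256 hash starts with
    `difficulty` zero hex digits. Real bitcoin mining compares a double
    SHA-256 of the block header against a target; the core idea is the same."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each extra leading zero multiplies the expected work by 16, which is why
# rising prices pull ever more hardware and electricity into the race.
print(mine("example block", 4))
```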


De Vries thinks bitcoin's energy use will continue to climb as the price rises and miners buy more hardware.

He forecasts the network could soon consume a staggering 200 TWh — as much energy as all data centers globally, and roughly equivalent to the consumption of London.

He said potential investors may be put off by bitcoin's eye-watering energy use, adding, "I think this will be a major problem for bitcoin."

But for bitcoin advocates, the fact that it allows people to make transactions semi-anonymously and without third-party approval outweighs the environmental costs.

Nic Carter, a bitcoin investor and partner at crypto-focused venture capital firm Castle Island Ventures, told Insider its energy use was "not a new debate."

"The costs of the dollar system are harder to comprehend but they are extremely real," he added
.
The more bitcoin surges, the worse the problem becomes
 Alain Pitton/NurPhoto via Getty Images

He said that if investors conscious of environmental impact refused to buy bitcoin "because it consumes energy - like every other utility on the planet - they are just doing themselves and their investors a disservice."

Twitter chief executive Jack Dorsey said late last year that cryptocurrencies "will eventually be powered completely by clean power, eliminating its carbon footprint and driving adoption of renewables globally ... Published estimates indicate bitcoin already consumes a significant amount of clean energy."

Bitcoin miners, incentivized to use renewable energy by government subsidies, have sought sustainable ways of fueling their computers.

One German company set up a mining facility under the fjords of Norway, using hydroelectricity to power its machines and the freezing water to cool them.

But estimates of how much bitcoin's overall energy use is green vary hugely.

In a 2019 study, cryptocurrency asset management firm CoinShares' analysis concluded the bitcoin network gets up to 74% of its electricity from renewables. But in a survey by Cambridge University's Judge Business School the same year, only 39% of miners said that their power came from renewables.

Renewables aren't the only way to use less energy.

Ethereum, the second largest cryptocurrency after bitcoin, has a total value of $200 billion compared with bitcoin's $1 trillion market capitalization, and it too has soared over the last year. Its energy demands spiked to 30 TWh per year, up from 7 TWh 12 months ago, according to de Vries' calculations.

It has sought to cut its energy use by moving to a "proof-of-stake" algorithm, where, instead of miners creating new tokens as a reward for securing the blockchain, "stakers" hold existing tokens and can commit - or stake - them to the network, generating new tokens and helping to validate transactions.
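
As a loose illustration of the difference: under proof-of-stake, the right to propose the next block is allotted in proportion to tokens committed rather than to hashing power, so there is no energy-hungry hash race. The validator names and stake amounts below are invented.

```python
import random

# Hypothetical validators and the number of tokens each has staked.
stakes = {"alice": 3_200, "bob": 800, "carol": 12_000}

def pick_validator(stakes: dict[str, int]) -> str:
    """Choose the next block proposer with probability proportional to stake."""
    names = list(stakes)
    weights = list(stakes.values())
    return random.choices(names, weights=weights, k=1)[0]

# Selection is a weighted draw rather than a computation contest: carol,
# with 75% of the stake, proposes roughly three blocks in four.
print(pick_validator(stakes))
```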

But De Vries said fixing bitcoin's energy dilemma this way would be impossible without fundamental changes to bitcoin.

Its miners and developers could vote for such a change but fundamental alterations to bitcoin's core software are broadly unpopular.

One proposal to make bitcoin better suited to small payments in 2017 caused such a schism that a group of miners decided to "fork" the blockchain and create a rival cryptocurrency called bitcoin cash.

Frances Coppola, an author on banking, finance, and economics, told Insider bitcoin needed to evolve if it is to solve the problem.

"I don't think the bitcoin industry is doing itself any favors by refusing even to accept that bitcoin's energy use is a problem, let alone do anything about it," she added.

De Vries added that, if traders, miners and advocates cannot address its environmental impact, government action "seems like an inevitable outcome."

Bitcoin may be set up to be distanced from authorities but, as hundreds of thousands dream of following its early investors into the ranks of the world's richest, it is attracting the type of attention governments cannot ignore.

India has proposed fining anyone who trades or owns bitcoin. When U.S. Treasury Secretary Yellen condemned its energy use, she also warned investors it was "extremely inefficient" and "often for illicit finance."

But De Vries added that, whatever happened, bitcoin would likely survive in some form.

He said it would "continue to exist as long as some people think it has value ... It may just not be the same market value as it has today."

Read the original article on Business Insider.

NASA measures direct evidence humans are causing climate change

Jeff Berardelli 

It may come as a surprise, given the extensive body of evidence connecting humans to climate change, that directly-observed proof of the human impact on the climate had still eluded science. That is, until now.

© Getty Emissions Spew From Coal Fired Power Plant

In a first-of-its-kind study, NASA has calculated the individual driving forces of recent climate change through direct satellite observations. And consistent with what climate models have shown for decades, greenhouse gases and suspended pollution particles in the atmosphere, called aerosols, from the burning of fossil fuels are responsible for the lion's share of modern warming.

In other words, NASA has proven what is driving climate change through direct observations — a gold standard in scientific research.

"I think most people would be surprised that we hadn't yet closed this little gap in our long list of evidence supporting anthropogenic [human-caused] climate change," says Brian Soden, co-author of the study and professor of Atmospheric Sciences at the University of Miami's Rosenstiel School of Marine and Atmospheric Science.

By now it's common knowledge that the rapid warming of the past century is not natural. Rather, it is a result of the build up of heat-trapping greenhouse gases like carbon dioxide (CO2) and methane, much of it from the burning of fossil fuels.

The science behind why the Earth is warming

When sunlight enters the atmosphere, some of it is reflected back to space without heating the Earth. The rest is absorbed by the Earth's surface and atmosphere and re-radiated as heat. Some of this heat escapes back into space, but the rest is trapped by specific molecules like CO2, methane and water vapor. Simply put, the more greenhouse gases the atmosphere has, the more heat is trapped and the more the temperature goes up.
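
That budget logic can be captured in the classic zero-dimensional textbook model sketched below; it is a standard illustration, not part of the NASA study, and the greenhouse parameter is simply tuned by assumption to land near today's average temperature.

```python
# Zero-dimensional energy-balance sketch: incoming solar energy must equal
# outgoing thermal radiation, part of which the atmosphere traps.
SOLAR = 1361.0      # solar constant, W/m^2
ALBEDO = 0.30       # fraction of sunlight reflected straight back to space
SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W m^-2 K^-4

def surface_temp(greenhouse_fraction: float) -> float:
    """Equilibrium surface temperature (K) when the atmosphere re-emits
    `greenhouse_fraction` of the outgoing longwave radiation back downward."""
    absorbed = SOLAR * (1 - ALBEDO) / 4          # averaged over the sphere
    return (absorbed / (SIGMA * (1 - greenhouse_fraction / 2))) ** 0.25

print(surface_temp(0.0))   # ~255 K: no greenhouse effect, well below freezing
print(surface_temp(0.77))  # ~288 K: a trapping fraction tuned to match today
```
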
© Provided by CBS News This NASA animation is a simplified illustration of Earth's planetary energy balance: The energy budget is balanced between incoming (yellow) and outgoing radiation (red). Natural and human-caused processes affect the amount of energy received as well as the amount emitted back to space. / Credit: NASA

Since the mid 1800s, CO2 in the atmosphere has increased from 280 parts per million to 415 parts per million — a 50% increase — and it is now the highest it has been in at least 3 million years. Carbon dioxide in the atmosphere is increasing at a pace 100 times faster than it naturally should.

At the same time, suspended pollution particles, called aerosols, cool the atmosphere by blocking sunlight. This unintentional side effect of the Industrial Revolution has proven useful in masking some greenhouse warming.

While these particles were effective at helping counteract some of the global warming in the mid to late 20th century, their impact is diminishing, because since the 1980s pollution has been gradually clearing up. While this is great news for health, it is unmasking additional warming in the system.

Together, the change in heat absorbed in our atmosphere because of changes in greenhouse gases and aerosols is called "radiative forcing." These changes in radiative forcing throw off Earth's energy balance. That's because, in order for Earth's average temperatures to remain steady, the "energy-in" from the sun must be equalized by the "energy-out" from Earth into space.

When those numbers are equal the Earth maintains balance. But when greenhouse gases build up, the energy going out is less than the energy entering the Earth system, which heats up our oceans and atmosphere, creating an imbalance in the Earth's energy budget.

What NASA has done in this study is to calculate, or quantify, the individual forcings measured from specialized satellite observations to determine how much each component warms or cools the atmosphere. To no one's surprise, what they have found is that the radiative forcings, which computer models have indicated for decades were warming the Earth, match the changes measured in observations.

New insight from NASA


Gavin Schmidt, the director of NASA Goddard Institute for Space Studies, says science has long had an overwhelming amount of indirect evidence of the factors warming the Earth. The predicted energy imbalance illustrated by decades' worth of computer models has become apparent for all of humanity to see, from disappearing glaciers to more extreme weather disasters to warming oceans.

"We have long had good evidence that the predicted energy imbalance was real because of the increases in ocean heat content. That is very powerful confirmation that the models were predicting warming for the right reasons," Schmidt explains. He says scientists have also had direct evidence that changes in greenhouse gases have been affecting the transfer and absorption of heat in the atmosphere, but only in localized settings, not a comprehensive evaluation.

Soden adds that science does have solid observational evidence that CO2 has increased over the last century due to the burning of fossil fuels, and that laboratory measurements confirm that CO2 absorbs heat, which theoretically should cause the planet to warm at roughly the rate observed over the last century. However, Soden says that observing the trapping of heat from space is actually quite challenging. This new research solves that challenge.

"This is the first calculation of the total radiative forcing of Earth using global observations, accounting for the effects of aerosols and greenhouse gases," said Ryan Kramer, first author on the paper and a researcher at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "It's direct evidence that human activities are causing changes to Earth's energy budget."

Specifically, this study has been able to calculate solid numbers for the changes in heat trapped in the Earth system from the individual contributors that influence heat transfer, like radiation, clouds and water vapor, for the period 2003-2019. The researchers did that by analyzing satellite observations and applying what they call "radiative kernels" to disentangle the various components controlling the transfer, absorption and emission of heat inside the Earth system and what is sent back out into space. Up to this point, satellite observations of Earth's radiation budget had only measured the sum total of radiation changes, not the individual components.
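
The kernel bookkeeping itself is simple in outline: each component's flux change is approximately a precomputed sensitivity (the kernel) multiplied by the observed change in that component, summed over components. The sketch below uses invented placeholder numbers purely to show that arithmetic; none of the values come from the study.

```python
# Radiative-kernel decomposition sketch: flux change ≈ kernel × observed change.
# All numbers are invented placeholders, not results from the NASA paper.
kernels = {            # sensitivity of top-of-atmosphere flux, W/m^2 per unit
    "water_vapor": 1.2,
    "clouds": -0.5,
    "surface_temp": -3.3,
}
observed_changes = {   # observed change in each component over the period
    "water_vapor": 0.3,
    "clouds": -0.1,
    "surface_temp": 0.2,
}

# Multiply kernel by change, component by component, then sum the pieces.
contributions = {k: kernels[k] * observed_changes[k] for k in kernels}
total = sum(contributions.values())
print(contributions)
print(f"net radiative effect: {total:+.2f} W/m^2")
```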

Then there are also feedbacks in the climate system which account for a smaller but still important amount of warming. One example of this is the fact that as the atmosphere warms it can hold more water vapor, and that means it can trap more heat, further allowing for more water vapor to build up. This is a positive feedback which perpetuates warming.

The result: From 2003 through 2018, radiative forcing increased by 0.5 watts per square meter (W/m2), which accounts for the planetary imbalance, the excess heat trapped in the Earth system. The researchers conclude that this increase has indeed been due to a combination of mainly rising concentrations of greenhouse gases and, to a lesser degree, recent reductions in aerosol emissions.

For reference, Schmidt says the excess 0.5 W/m2 added to the Earth system from 2003-2018 is roughly equivalent to one Christmas tree lightbulb for every 5-foot-square area on Earth. That may not sound like very much, but that much energy would be expected to warm the planet by more than half a degree Fahrenheit in only 16 years. To put it another way, the 0.5 W/m2 of excess heat absorbed by the Earth system is roughly 10 times the total energy used by humans in a year, meaning everything from cooking stoves to nuclear power.
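
Those comparisons can be sanity-checked with ballpark figures; the surface-area and global energy-use numbers below are common approximations, not values from the study.

```python
# Back-of-envelope check on the 0.5 W/m^2 imbalance, using ballpark values.
FORCING = 0.5                   # W/m^2, the study's reported increase
EARTH_AREA = 5.1e14             # m^2, Earth's total surface area (approx.)
SECONDS_PER_YEAR = 3.15e7
HUMAN_ENERGY_PER_YEAR = 5.8e20  # J, rough global primary energy use

excess_power = FORCING * EARTH_AREA              # ~2.6e14 W planet-wide
excess_energy = excess_power * SECONDS_PER_YEAR  # ~8e21 J per year

# Same order of magnitude as the article's "roughly 10 times" figure.
print(f"~{excess_energy / HUMAN_ENERGY_PER_YEAR:.0f}x annual human energy use")

# One 5 ft x 5 ft patch (~2.3 m^2) traps ~1.2 W: about one small bulb.
print(f"{FORCING * (5 * 0.3048) ** 2:.1f} W per 5-foot square")
```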

"In reality, the observational results came in just as predicted by the theory," says Soden. "There is no surprise in the results, but rather it's really more of 'dotting the i's and crossing the t's' on anthropogenic [human-caused] climate change. It closes that last link between rising CO2 levels and planetary warming."

But this study does more than just provide concrete evidence of the link between humans and recent climate change. It also illustrates just how far science has come in uncovering the secrets which govern the workings of our physical universe.

Melting ice sheets caused sea levels to rise up to 18 metres




DURHAM UNIVERSITY

Research News

It is well known that climate-induced sea level rise is a major threat. New research has found that previous ice loss events could have caused sea-level rise at rates of around 3.6 metres per century, offering vital clues as to what lies ahead should climate change continue unabated.

A team of scientists, led by researchers from Durham University, used geological records of past sea levels to shed light on the ice sheets responsible for a rapid pulse of sea-level rise in Earth's recent past.

Geological records tell us that, at the end of the last ice age around 14,600 years ago, sea levels rose at ten times the current rate due to Meltwater Pulse 1A (MWP-1A), a 500-year, ~18-metre sea-level rise event.
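
The quoted rates are easy to verify; the present-day rate used below is the commonly cited satellite-era figure of about 3.6 mm per year, an outside number rather than one from this paper.

```python
# Sanity check on the Meltwater Pulse 1A numbers quoted above.
rise_m = 18.0           # total sea-level rise during MWP-1A, metres
duration_yr = 500.0     # duration of the event, years
modern_mm_per_yr = 3.6  # commonly cited present-day rate (satellite era)

mwp1a_mm_per_yr = rise_m * 1000 / duration_yr
print(f"MWP-1A rate: {mwp1a_mm_per_yr:.0f} mm/yr "
      f"({mwp1a_mm_per_yr / modern_mm_per_yr:.0f}x today's rate), "
      f"or {rise_m / (duration_yr / 100):.1f} m per century")
```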

Until now, the scientific community has not been able to agree about which ice sheet was responsible for this rapid rise, with the massive Antarctic Ice Sheet being a likely suspect, but some evidence pointing towards ice sheets in the Northern Hemisphere.

The new study uses detailed geological sea-level data and state-of-the-art modelling techniques to reveal the sources of MWP-1A. Interestingly, most of the meltwater appears to have originated from the former North American and Eurasian ice sheets, with minimal contribution from Antarctica, reconciling formerly disparate views.

In addition to flooding vast areas of low-lying land, this unparalleled discharge of freshwater into the ocean - comparable to melting an ice sheet twice the size of Greenland in only 500 years - will have disrupted ocean circulation, with knock-on effects for global climate. Knowing the source of the meltwater will improve the accuracy of climate models that are used to replicate the past and predict changes in the future.

The results are important for our understanding of ice-ocean-climate interactions which play a significant role in shaping terrestrial weather patterns. The findings are particularly timely with the Greenland ice sheet rapidly melting, contributing to a rise in sea levels and changes to global ocean circulation.

Of the findings, lead author Yucheng Lin, in the Department of Geography at Durham University notes: "Despite being identified over 30 years ago, it has been surprisingly challenging to determine which ice sheet was the major contributor to this dramatic rise in sea levels.

"Previously, scientists tried to work out the source of the sea-level rise based on sea-level data from the tropics, but the majority of those studies disagreed with geological records of ice sheet change.

"Our study includes novel information from lakes around the coast of Scotland that were isolated from the ocean due to land uplift following the retreat of the British Ice Sheet, allowing us to confidently identify the meltwater sources."

Co-author Dr. Pippa Whitehouse, in the Department of Geography at Durham University, said: "The technique we have used allows us to really dig into the error bars on the data and explore which ice-melt scenarios were most likely.

"We found that most of the rapid sea-level rise was due to ice sheet melt across North America and Scandinavia, with a surprisingly small contribution from Antarctica.

"The next big question is to work out what triggered the ice melt, and what impact the massive influx of meltwater had on ocean currents in the North Atlantic. This is very much on our minds today - any disruption to the Gulf Stream, for example due to melting of the Greenland Ice Sheet, will have significant consequences for the UK climate."

Rising sea levels due to a warming climate pose a great risk to society; improving our understanding of why and how fast such change could happen will help us plan for the impacts.

###

Yucheng Lin is funded by a Durham University - China Scholarship Council joint scholarship.

The Scotland data was collected and analysed by Durham University researchers, funded by the Natural Environment Research Council.


Source information

A reconciled solution of Meltwater Pulse 1A sources using sea-level fingerprinting:
https://doi.org/10.1038/s41467-021-21990-y

Picture: An isolation lake in north-west Scotland. Sediment analysed from the bottom of this low-lying lake tells us that it was once connected to the ocean. Credit: Professor Ian Shennan, Department of Geography, Durham University.


GREEN CAPITALI$M
Orsted to link a huge offshore wind farm to 'renewable' hydrogen production

Anmar Frangoul CNBC


The last few years have seen a number of businesses take an interest in projects connected to "renewable" hydrogen.

Described by the International Energy Agency as a "versatile energy carrier," hydrogen can be produced in several ways.

© Provided by CNBC This photograph shows turbines at the Borkum Riffgrund 2 wind farm, which Orsted owns 50% of.

Danish energy company Orsted wants to construct a large-scale offshore wind farm in the North Sea and link it to so-called "renewable" hydrogen production on the European mainland, with the project garnering support from several major industrial firms.

Under the proposals, which were outlined on Wednesday, Orsted would develop a 2 gigawatt (GW) offshore wind facility and 1 GW of electrolyzer capacity, with the company claiming its plans would result in "one of the world's largest renewable hydrogen plants to be linked to industrial demand."

The SeaH2Land development — which is supported by companies including ArcelorMittal, Yara and Dow — would also include 45 kilometers of hydrogen pipelines between Belgium and the Netherlands.


The electrolyzer part of the project — to be built in two 500 megawatt phases — would use electricity from the wind farm to produce hydrogen.


Among other things, partners involved in the development need to undertake a full feasibility study of SeaH2Land, while Orsted has yet to take a final investment decision. If all goes smoothly and the project gets the green light, however, both portions of the electrolyzer could be up and running by 2030.

"As the world looks to decarbonise, it's paramount that we act now to secure the long-term competitiveness of European industry in a green economy," Martin Neubert, Orsted's chief commercial officer, said in a statement.

Described by the International Energy Agency as a "versatile energy carrier," hydrogen has a diverse range of applications and can be produced in a number of ways.

One method includes using electrolysis, with an electric current splitting water into oxygen and hydrogen. If the electricity used in the process comes from a renewable source such as wind or solar then some describe it as "green" or "renewable" hydrogen.
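
For a sense of scale, a rough yield estimate for a 1 GW electrolyzer is sketched below; the 50 kWh-per-kilogram electricity demand and 50% capacity factor are generic assumptions, not figures from Orsted or its partners.

```python
# Rough hydrogen yield from a 1 GW electrolyzer, using typical assumptions.
POWER_MW = 1_000        # electrolyzer capacity (two 500 MW phases)
KWH_PER_KG = 50.0       # typical electricity demand per kg of H2 (assumed)
CAPACITY_FACTOR = 0.5   # fraction of the year at full load (assumed)
HOURS_PER_YEAR = 8_760

kwh_per_year = POWER_MW * 1_000 * HOURS_PER_YEAR * CAPACITY_FACTOR
tonnes_per_year = kwh_per_year / KWH_PER_KG / 1_000

print(f"~{tonnes_per_year:,.0f} tonnes of hydrogen per year")  # ~87,600 t
```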

VERSUS THE MORE COMMON BLUE HYDROGEN MADE FROM NATURAL GAS

The last few years have seen a number of businesses take an interest in projects connected to renewable hydrogen, while major economies such as the European Union have laid out plans to install at least 40 GW of renewable hydrogen electrolyzers by 2030.

In March, a major green hydrogen facility in Germany started operations. The "WindH2" project, as it's known, involves German steel giant Salzgitter, E.ON subsidiary Avacon and Linde, a firm specializing in engineering and industrial gases.

Elsewhere, a subsidiary of multinational building materials firm HeidelbergCement has worked with researchers from Swansea University to install and operate a green hydrogen demonstration unit at a site in the U.K.

The interest in hydrogen is not restricted to Europe. In a speech last November, Indian Prime Minister Narendra Modi said his country was proposing to launch what he described as "a comprehensive National Hydrogen Energy Mission."

Presenting the country's budget earlier this year, Nirmala Sitharaman, India's finance minister, referenced Modi's announcement, adding: "It is now proposed to launch a Hydrogen Energy Mission in 2021-22 for generating hydrogen from green power sources."

India is the planet's third biggest emitter of greenhouse gases; its attempt to embrace hydrogen and other renewable technologies — it's targeting 450 GW of renewable capacity by 2030 — would, if fully realized, represent a significant shift for the country.

DEN' DURAKA

ДЕНЬ ДУРАКА (April Fools' Day)

Where we live can affect male reproductive health, finds new study

UNIVERSITY OF NOTTINGHAM

Research News

New research, led by scientists at the University of Nottingham, suggests that the environment in which men live may affect their reproductive health.

The research, published in Scientific Reports, looked at the effects of geographical location on polluting chemicals found in dog testes, some of which are known to affect reproductive health. The unique research focused on dogs because, as a popular pet, they share the same environment as people and are effectively exposed to the same household chemicals as their owners.

The team also looked for signs of abnormalities in the testes. The findings showed that both the chemicals present and the extent of abnormalities in the testes differed depending on where the dogs had been living.

The researchers analysed dog testes, removed for routine clinical reasons, to see what polluting chemicals were present in the tissue. Samples were taken from across the UK, in the East and West Midlands and the South East, as well as from Denmark and Finland.

Dr Rebecca Sumner, from the School of Veterinary Medicine and Science at the University, and lead author of the study, said: "For the first time, we have shown that the profile of chemical pollutants found in dog testes depends on where they are from. We have also shown that the same cohorts of dog testes also show geographic differences in testicular pathology and evidence of an imbalance in cells that are important for sperm production."

Dr Richard Lea, lead of the team, said: "Although this study suggests that there are fewer pathologies in dog testes from Finland compared to other locations, relating this to the chemicals detected is difficult, particularly as many other pollutants may also be present.

"We believe, that this study is of pivotal importance since our strategy to use the dog as a sentinel species for the human has allowed us to focus directly on the testis, where detected chemicals are likely to influence male reproductive function." Professor Gary England, Dean of School of Veterinary Medicine & Science, said "This work is significant since collectively, these findings indicate that environmental exposures are determined by location and this may underpin regional differences in male reproductive health."

Genome sequencing shows coronavirus variation drives pandemic surges

Fusing classical epidemiology and genomics is a tool for future pandemics

UNIVERSITY OF CALIFORNIA - DAVIS

Research News

Genome sequencing of thousands of SARS-CoV-2 samples shows that surges of COVID-19 cases are driven by the appearance of new coronavirus variants, according to new research from the School of Veterinary Medicine at the University of California, Davis published April 1 in Scientific Reports.

"As variants emerge, you're going to get new outbreaks," said Bart Weimer, professor of population health and reproduction at the UC Davis School of Veterinary Medicine. The merger of classical epidemiology with genomics provides a tool public health authorities could use to predict the course of pandemics, whether of coronavirus, influenza or some new pathogen.

Although it has just 15 genes, SARS-CoV-2 is constantly mutating. Most of these changes make very little difference but sometimes the virus becomes more or less transmissible.

Weimer and graduate student DJ Darwin R. Bandoy initially analyzed the genomes of 150 SARS-CoV-2 strains, mostly from outbreaks in Asia prior to March 1, 2020, as well as epidemiology and transmission information for those outbreaks. They classified outbreaks by stage: index (no outbreak), takeoff, exponential growth and decline. The ease of transmission of a virus is set by the value R, or reproductive number, where R is the average number of new infections caused by each infected person.

They combined all this information into a metric called GENI, for pathogen genome identity. Comparing GENI scores with the phase of an epidemic showed that an increase in genetic variation immediately preceded exponential growth in cases, for example in South Korea in late February. In Singapore, however, bursts of variation were associated with smaller outbreaks that public health authorities were able to quickly bring under control.
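
The paper defines its GENI score precisely; as a loose stand-in for the general idea, tracking how far sampled genomes have drifted from a reference, the toy calculation below simply counts point differences in short aligned sequences. The sequences and collection windows are invented.

```python
# Toy illustration of genome-variation tracking (NOT the paper's GENI formula):
# count point differences between each sampled genome and a reference.
REFERENCE = "ATGGTTCACGA"

def snp_distance(seq: str, ref: str = REFERENCE) -> int:
    """Number of positions where an aligned sequence differs from the reference."""
    return sum(a != b for a, b in zip(seq, ref))

# Hypothetical samples from two collection windows.
window_1 = ["ATGGTTCACGA", "ATGGTTCACGA", "ATGGTACACGA"]
window_2 = ["ATGCTACACGA", "ATGCTACACTA", "ATGCTACACGA"]

for label, window in [("week 1", window_1), ("week 2", window_2)]:
    mean = sum(snp_distance(s) for s in window) / len(window)
    print(f"{label}: mean variation {mean:.2f}")
# A jump in mean variation from one window to the next is the kind of
# signal the study found immediately preceding exponential case growth.
```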

20,000 virus samples

Weimer and Bandoy then looked at 20,000 sequences of SARS-CoV-2 viruses collected from February to April 2020 in the United Kingdom and compared them with data on cases.

They found that the GENI variation score rose steadily with the number of cases. When the British government imposed a national lockdown in late March, the number of new cases stabilized but the GENI score continued to rise. This shows that measures such as banning gatherings, mask mandates and social distancing are effective in controlling spread of disease in the face of rapid virus evolution.

It could also help explain "superspreader" events when large numbers of people get infected in a single incident where precautions are relaxed.

Weimer said he hopes that public health authorities will take up the approach of measuring virus variation and linking it to the local transmission rate, R.

"In this way you can get a very early warning of when a new outbreak is coming," he said. "Here's a recipe for how to go about it."

###

Parts of the work previously appeared online as a preprint. Bandoy is sponsored by the Philippine California Advanced Research Institute.

Beetle outbreak impacts vary across Colorado forests

COLORADO STATE UNIVERSITY

Research News

IMAGE: A spruce beetle-impacted forest in southwestern Colorado. CREDIT: Sarah Hart/Colorado State University

It's no secret. Colorado's forests have had a tough time in recent years. While natural disturbances such as insect outbreaks and wildfires occurred historically and maintained forest health over time, multiple, simultaneous insect disturbances in the greater region over the past two decades have led to rapid changes in the state's forests.

A bird's eye view can reveal much about these changes. Annual aerial surveys conducted by the Colorado State Forest Service and USDA Forest Service have provided yearly snapshots for the state. New collaborative research led by Colorado State University and the University of Wisconsin-Madison now supplements this understanding with even greater spatial detail.

The study, "Effects of Bark Beetle Outbreaks on Forest Landscape Pattern in the Southern Rocky Mountains, U.S.A.," analyzed Landsat satellite imagery between 1997-2019 to quantify how outbreaks of three different insect species have impacted forests across high-elevation forests in Colorado, southern Wyoming, and northern New Mexico. The research team found that while these collective beetle outbreaks impacted around 40 percent of the area studied, the effects of these outbreak varied due to differences in forest structures and species composition across the region.

"In contrast to research that has examined the heterogeneous effects of wildfire on trees, there hasn't been much work on the landscape-level variation in bark beetle effects on forests, particularly across broad areas," said Sarah Hart, co-author and assistant professor in the Forest and Rangeland Stewardship department. "Heterogeneity plays an important role in how these forests will look in the future, where surviving trees will regenerate the forest, and what potential there is for future outbreaks."

Their results indicate that most forest stands affected by insects still have mature trees that can provide seeds and growing conditions for the next generation of trees. Areas with tree mortality greater than 90 percent were relatively small and isolated. Unlike severe wildfires, which can kill all trees in their path, bark beetle outbreaks typically leave surviving trees, facilitating forest recovery in upcoming decades.

High-resolution, field-level accuracy

Widespread outbreaks of three important bark beetle species have occurred in Colorado's forests since the turn of the century: mountain pine beetle, spruce beetle, and the western balsam beetle (that affects various fir tree species). These bark beetles primarily target large trees with reduced defenses due to lower precipitation amounts and higher temperature trends since the turn of the century.

This research team combined satellite imagery capable of identifying small groups of dead trees with a decade of extensive field data from nearly 250 plots to develop presence and severity maps for tree mortality caused by bark beetle attacks. Having this data combination gave the research team detailed information about how many trees have died in particular places, and helped to identify what may still be causing the death of individual trees.

"These maps give us unique insight into the effects of recent insect outbreaks because they span a large area but also show a lot of detail, and we are confident that they are showing us how many trees are dying because technicians counted trees on the ground," Kyle Rodman, lead author and post-doctoral researcher at the University of Wisconsin-Madison said.

The maps the team produced indicate that areas most impacted by bark beetles are concentrated in northern and southwestern Colorado due to higher concentrations of old lodgepole pine and spruce forests which were then infested by mountain pine beetle and spruce beetle, respectively. Western balsam beetle impacts were also widespread across the region, but these beetles tended to kill fewer trees in any single location.

"Satellite data is a crucial bridge that allows us to take detailed information from individual places and extend this localized knowledge to large areas," Rodman said. "In using these maps, we can see how the forest has changed over the past 20 years during each of these outbreaks."

Fortunately, much of the 25,000 square kilometer study area showed low to moderate levels of tree mortality, with high tree mortality being contained in small and isolated patches averaging only about nine city blocks in overall size.

"People tend to notice what has changed, rather than what has stayed the same," Rodman said. "These forests have changed a lot, but I am hopeful. It will just take a little while for them to recover, but many of these beetle-killed forests are likely to recover within a few decades."