Sunday, August 30, 2020

Dramatic Tale of Environmental Change Told by Ancient Fossil Trees on Peru’s Central Andean Plateau

Fossil Tree With Team on Plateau
Caption: Left to right: Florentino Tunquipa who discovered and excavated the fossil tree on his land, Rodolfo Salas Gismondi, Carlos Jaramillo, Julia Tejada, Federico Moreno, Camila Martínez. San Miguel Community, Espinar Municipality, Cusco Department, Peru. 2014. Credit: Rodolfo Salas Gismondi
On an expedition to the Central Andean Plateau, researchers from the Smithsonian Tropical Research Institute (STRI) and colleagues were astounded to find a huge fossil tree buried in the cold, grassy plain. The plant fossil record from this high-altitude site in southern Peru contains dramatic reminders that the environment in the Andes mountains changed drastically during the past 10 million years, but not in the ways that climate models of the past suggest. Findings from the expedition are presented in the journal Science Advances.
Fossil Tree Peruvian Central Plateau
Paleontologist Edwin Cadena poses next to a giant, fossilized (permineralized) tree on the Peruvian Central Plateau. Credit: Carlos Jaramillo, Smithsonian Tropical Research Institute
“This tree and the hundreds of fossil wood, leaf and pollen samples we collected on the expedition reveal that when these plants were alive the ecosystem was more humid, even more humid than climate models of the past predicted,” said Camila Martínez, a fellow at STRI who recently finished her doctorate at Cornell University. “There is probably no comparable modern ecosystem, because temperatures were higher when these fossils were deposited 10 million years ago.”
The anatomy of the petrified (permineralized) wood the researchers found is very much like wood anatomy in low-elevation tropical forests today. Indeed, the altitude then was probably only 2,000 meters above sea level.
But that ecosystem did not last for long. Today, the arid, intermountain plateau lies at 4,000 meters above sea level.
Five-million-year-old fossils from the same sites confirmed that the Puna ecosystem that now dominates the Andes’ high mountain plateaus had been born: the younger pollen samples were mostly from grasses and herbs, rather than from trees. Leaf material was from ferns, herbs, and shrubs, indicating that the plateau had already risen to its current altitude.
“The fossil record in the region tells us two things: both the altitude and the vegetation changed dramatically over a relatively short period of time, supporting a hypothesis that suggests the tectonic uplift of this region occurred in rapid pulses,” said Carlos Jaramillo, STRI staff scientist, and project leader.
Fossil Tree Team
Team visiting Crisoldo Cárdenas, known in the region for his fossil findings. San Miguel Community, Espinar Municipality, Cusco Department, Peru. 2014. From left to right: Federico Moreno, Nandini Kar, María Isabel Vélez, Crisoldo Cárdenas, Julia Tejada, Carlos Jaramillo, Gustavo Ballén, Drew Lubiniecki. Below: Edwin Cadena, Rodolfo Salas Gismondi. Credit: Rodolfo Salas Gismondi
“Andean uplift played an important role in shaping the climate of South America, but the relationship between the rise of the Andes, local climates and vegetation is still not well understood,” Martínez said. “By the end of this century, changes in temperature and atmospheric carbon dioxide concentrations will again approximate the conditions 10 million years ago. Understanding the discrepancies between climate models and data based on the fossil record helps us to elucidate the driving forces controlling the current climate of the Altiplano, and, ultimately, the climate across the South American continent.”
###
Reference: “Neogene precipitation, vegetation, and elevation history of the Central Andean Plateau” by C. Martínez, C. Jaramillo, A. Correa-Metrío, W. Crepet, J. E. Moreno, A. Aliaga, F. Moreno, M. Ibañez-Mejia and M. B. Bush, 28 August 2020, Science Advances.
DOI: 10.1126/sciadv.aaz4724
Author affiliations include: STRI; Cornell University; CNRS, EPHE, IRD, Montpellier; Universidad Nacional Autónoma de México; Museo de Historia Natural, Lima, Peru; University of Rochester, Rochester, New York; and the Florida Institute of Technology.

KAPITALISMUS IS KRISIS

Covid-19 has hit advanced economies and Latin America particularly hard


Chart: Total Covid-19 deaths per 100k (horizontal axis) versus projected decline in economic growth in 2020, % (vertical axis), with bubbles sized by population (legend: 1 billion, 2 billion). Countries and regions shown include Italy, Spain, France, the UK, the US, Brazil, Germany, China, Latin America and the Caribbean, Advanced Economies, Emerging and Developing Europe, Middle East and Central Asia, Sub-Saharan Africa, and Emerging and Developing Asia.

SOURCE: JOHNS HOPKINS UNIVERSITY, IMF WORLD ECONOMIC OUTLOOK

Exclusive: British Library’s chief librarian claims 'racism is the creation of white people'

Liz Jolly supports changes to displays and collections in the wake of Black Lives Matter protests to purge 'perceived racism' at the library

By Craig Simpson, 29 August 2020


The building was designed by architect and former naval officer Sir Colin Wilson, and evokes the shape and character of an ocean-going craft CREDIT: Anadolu Agency

The British Library’s chief librarian has claimed “racism is a creation of white people” and backed calls for major cultural change at the institution, the Telegraph can reveal.

Liz Jolly manages the vast collection of literary treasures held by the institution, and is supporting changes to displays and collections in the wake of Black Lives Matter protests. Reforms are being proposed by a “Decolonising Working Group”, which claims the British Library’s London building is a symbol of imperialism because it resembles a battleship.

Staff being supported to decolonise the UK’s national library have also suggested that the traditional puppet Mr Punch reflects “colonial violence”.

The Telegraph has exclusively obtained documents revealing these claims, along with a letter endorsed by hundreds of employees which declared a racial “state of emergency” at the institution.



In response to this emergency, an internal report called for the removal of statues of the library’s founding fathers, replacing “Eurocentric” maps, and reviewing collections of western classical music which staff branded part of the “outdated notion” of Western Civilization.

The institution recently faced calls for defunding from MPs after the Telegraph revealed employees had urged colleagues to donate to Black Lives Matter and back the work of Labour MP Diane Abbott.

The Telegraph can now reveal that Chief Librarian Ms Jolly has urged white staff to support the institution's plans to purge the library of perceived racism.

In a video clip obtained by the Telegraph she tells colleagues: “I think, as I have said before, that we need to make sure some white colleagues are involved, because racism is a creation of white people.”

Ms Jolly receives between £120,000 and £125,000 per year for her work as Chief Librarian, a role she has held since 2018.

She has given her support to a broad “Anti-Racism Project” proposed in the wake of Black Lives Matter protests, telling colleagues “particularly pressing” work was ongoing to review “artworks in the St Pancras building”.

Ms Jolly assured staff that the project is “about developing and delivering major cultural change”, and part of this would be ensuring the repository of literary treasures will “reflect the diversity of Britain today”.

There are calls for the bust of the founder Sir Hans Sloane to be removed due to his connections with the slave trade CREDIT: Chris J Ratcliffe/Getty Images Europe

This followed a letter declaring a “state of emergency” at the library, signed by 200 employees, which demanded that BAME staff review any job cuts that might affect employees with “protected characteristics” to ensure continued diversity at the library, called for better treatment of these workers, and asked that the BAME Network have a say in who is employed as Head of Collections.

To tackle the legacy of “colonial violence” at the library, the letter also demands a statue of founder Sir Hans Sloane be removed.

This recommended removal is echoed in a report by the Decolonising Working Group, which claims that the “physical space” of the British Library contains “manifestations of the institution’s racism” by glorifying the British Empire.

Staff claim in the report that: “This glorification is hard to miss in the structure of the building itself, designed as it is in the form of a battleship, by far the greatest symbol of British imperialism.”

The building was designed by architect and former naval officer Sir Colin Wilson, who added maritime references such as portholes to the building.

In the report aimed at reforming the British Library “as a space”, visual aspects are critiqued, including a portrait of Mr Punch which hangs in the institution.

Portrait of Mr Punch. Oil painting CREDIT: Alamy

The character, popular at seaside shows, is described as: “A theatrical figure from the heyday of Victorian imperialism who ‘entertained’ through an abuse of women and children that mirrored colonial violence.”

Staff have also claimed the Library has links to the “ongoing settler-colonisation of Palestine”, and have proclaimed that “every vacant wall” of the institution’s property should be used to advertise the cause of anti-racism.

Other recommended actions include reducing the number of European maps on display, described as “tools of power”, and reviewing the presence of busts of Beethoven and Mendelssohn, whose revered status is deemed part and parcel of “western civilizational supremacy”. The music collections should also be diversified, the report suggests.

The Decolonising Working Group is part of the BAME Network at the library, which advised colleagues to read Marxist authors and support Black Lives Matter, deeming “colour-blindness” a form of “covert white supremacy”.



Marble bust of Sir Joseph Banks (1743-1820) Botanist, Trustee and benefactor of the British Museum CREDIT: PjrTravel / Alamy Stock Photo

In their compiled report circulated among staff they demand the bust of another library founder, botanist Sir Joseph Banks, be removed and the statue of King George III be reviewed.

Their work has been acknowledged by the ongoing Anti-Racism Project at the institution, which is being backed by Chief Executive Roly Keating and Chief Librarian Ms Jolly, who has enjoined white staff to advance the project to address what she has called a “creation of white people”.

“This says to me that there is something fundamentally wrong with the leadership of the British Library,” said MP Ben Bradley.

“If the Chief Librarian is so unhappy with British history perhaps they should not be in that job.

“The very suggestion that racism only applies in one direction, by white people towards BAME people, is categorically false, inflammatory and divisive.”

The British Library houses almost 200 million objects, many of which have major importance to British history, culture, and literature. These include two copies of the Magna Carta.

The institution and Ms Jolly have been contacted for comment.

Coral develops 'osteoporosis' because of acidic oceans caused by climate change, study reveals

The ocean becomes more acidic as a result of more carbon dioxide building up

By Helena Horton

28 August 2020 • 2:54pm
Coral becomes deformed as a result of acidified oceans, a study has found 
CREDIT: TANE SINCLAIR-TAYLOR/AFP/Getty

Coral reefs develop 'osteoporosis' as a result of the ocean becoming more acidic due to carbon dioxide emissions, scientists have found.

As carbon dioxide builds up in concentration in the ocean, it becomes more acidic, and this is impeding the growth of coral, a new study in Geophysical Research Letters has revealed.

Scientists from the Woods Hole Oceanographic Institution (WHOI) found significant reduction in the density of coral skeleton along much of the Great Barrier Reef, the world's largest coral reef system, and also on two reefs in the South China Sea, which they attribute largely to the increasing acidity of the waters surrounding these reefs since 1950.

"This is the first unambiguous detection and attribution of ocean acidification's impact on coral growth," said ead author and WHOI scientist Weifu Guo.

"Our study presents strong evidence that 20th century ocean acidification, exacerbated by reef biogeochemical processes, had measurable effects on the growth of a keystone reef-building coral species across the Great Barrier Reef and in the South China Sea. These effects will likely accelerate as ocean acidification progresses over the next several decades."


The ocean has experienced a 0.1 unit decline in pH since the pre-industrial era, and scientists are just now discovering what effect that has on marine life.
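To make that figure concrete: pH is a logarithmic scale, so even a 0.1 unit drop corresponds to a sizeable rise in hydrogen ion concentration. The snippet below is only an illustrative sketch of that arithmetic; the starting pH value is a typical ballpark, not a number from the study.

```python
# Illustrative sketch: what a 0.1 unit pH decline means for acidity.
# pH = -log10([H+]), so a 0.1 unit drop multiplies [H+] by 10**0.1.
pre_industrial_ph = 8.2          # ballpark open-ocean value, for illustration only
current_ph = pre_industrial_ph - 0.1

h_ion_ratio = 10 ** (pre_industrial_ph - current_ph)
print(f"[H+] increase factor: {h_ion_ratio:.2f}")                  # ~1.26
print(f"i.e. roughly {100 * (h_ion_ratio - 1):.0f}% more acidic")  # ~26%
```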

Ocean acidification has led to a 20 percent decrease in the concentration of carbonate ions in seawater. Calcium carbonate is what corals use to create their skeletons, so this decline means that they are at risk.

Because of the reductions in these concentrations, the density of their skeletons reduces, silently whittling away at the coral's strength, much like osteoporosis weakens bones in humans.

"The corals aren't able to tell us what they're feeling, but we can see it in their skeletons," said Anne Cohen, a WHOI scientist and co-author of the study. "The problem is that corals really need the strength they get from their density, because that's what keeps reefs from breaking apart. The compounding effects of temperature, local stressors, and now ocean acidification will be devastating for many reefs."
#GIFTECONOMY

The bright side of indebtedness

USE VALUE VS EXCHANGE VALUE

Guglielmo Faldetta 
Society and Business Review

ISSN: 1746-5680

Publication date: 3 December 2018

Abstract
Purpose
The purpose of this paper is to show that indebtedness can have a positive meaning for people who are embedded in social relationships in organizations if it is meant in the light of the notion of gratitude, gift-giving and generalized reciprocity.

Design/methodology/approach
The study reviews the literature on the common notion of indebtedness and integrates it with the literature on gratitude, gift-giving and generalized reciprocity.

Findings
The study reveals that, through the notions of gratitude, gift-giving and generalized reciprocity, people may conceive their indebtedness as gratitude for having received something, so triggering giving behaviors that do not necessarily aim to repay the debt, but to develop and feed their social relationships.

Originality/value
In the past, indebtedness has been conceived as a negative feeling. This study reveals that it may also have a bright side when it is applied to people in flesh and blood, as they are immersed in good-quality social relationships.

Citation
Faldetta, G. (2018), "The bright side of indebtedness", Society and Business Review, Vol. 15 No. 1, pp. 26-34. https://doi.org/10.1108/SBR-11-2018-0132

Publisher: Emerald Publishing Limited
Copyright © 2018, Emerald Publishing Limited

Putting the COVID-19 Death Toll Into Perspective – Here’s the Pandemic’s Impact on US Lifespan

UC Berkeley demographers calculate pandemic’s impact on US lifespan based on projected mortality rates.
With over 170,000 COVID-19 deaths to date, and 1,000 more each day, America’s life expectancy may appear to be plummeting. But in estimating the magnitude of the pandemic, UC Berkeley demographers have found that COVID-19 is likely to shorten the average U.S. lifespan in 2020 by only about a year.
Seeking to put current COVID-19 mortality rates into historic, demographic and economic perspective, UC Berkeley demographers Ronald Lee and Joshua Goldstein calculated the consequences of U.S. lives lost to COVID-19 in 2020 using two scenarios. One was based on a projection of 1 million deaths for the year, the other on the more likely projection of 250,000 deaths.
Their findings, published online last week in the Proceedings of the National Academy of Sciences journal, conclude that 1 million deaths in 2020 would cut three years off the average U.S. life expectancy, while 250,000 deaths would reduce lifespans by about a year.
That said, the researchers pointed out that without the societal efforts made to lessen the impact of COVID-19, projections suggest there could have been 2 million deaths by the end of 2020, which would have reduced the average U.S. lifespan by five years.
Their estimated drop in life expectancy is modest, in part, because 250,000 deaths is not a large increase on top of the 3 million non-COVID-19 deaths expected for 2020, and because older people, who typically have fewer remaining years of life than others do, account for most COVID-19 fatalities, the study notes.
Still, while COVID-19 mortality rates remain lower than those of the 1918 Spanish flu pandemic, the coronavirus epidemic could be just as devastating as the longer-lasting HIV and opioid epidemics if mitigation efforts fail, the researchers said.
“The death toll of COVID-19 is a terrible thing, both for those who lose their lives and for their family, friends, colleagues and all whom their lives touched. Those are real people, not abstract statistics,” said Lee, a UC Berkeley professor emeritus of demography and associate director of the campus’s Center for the Economics and Demography of Aging.
“But the population perspective helps put this tragedy in a broader context. As we work to contain this epidemic, it is important to know that we have been through such mortality crises before,” he added.
Goldstein’s and Lee’s measures are based on factors that include a current U.S. population of 330 million, age-specific death rates and the economic valuation of saved lives.
Among their other findings:
  • One million COVID-19 deaths in the U.S. in 2020 would be the equivalent of U.S. mortality levels in 1995, adding three years to each American’s biological age, albeit temporarily.
  • The age gap (old versus young) for people dying from COVID-19 is marginally wider than during pre-pandemic times, while the male-female gap is slightly narrower. The researchers found similar death-by-age patterns across several countries.
  • The economic cost of lives lost to COVID-19 in the U.S. is in the trillions of dollars. According to standard government measures, the demographers estimated that the loss of 1 million lives in 2020 would amount to between $10.2 and $17.5 trillion, while the amount for 250,000 deaths would range from $1.5 to $2.5 trillion (see the arithmetic sketch below).
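A rough way to sanity-check those totals is to divide them by the corresponding death counts. The per-life dollar values below are implied by the figures quoted above, not stated by the researchers, and the snippet is only an illustrative sketch:

```python
# Implied economic value per life lost, derived from the totals quoted above.
scenarios = {
    "1 million deaths": (1_000_000, 10.2e12, 17.5e12),
    "250,000 deaths": (250_000, 1.5e12, 2.5e12),
}

for name, (deaths, low_total, high_total) in scenarios.items():
    low_per_life = low_total / deaths
    high_per_life = high_total / deaths
    print(f"{name}: ${low_per_life / 1e6:.1f}M to ${high_per_life / 1e6:.1f}M per life")
```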
Reference: “Demographic perspectives on the mortality of COVID-19 and other epidemics” by Joshua R. Goldstein and Ronald D. Lee, 20 August 2020, Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2006392117

Solving a Paradox: New Younger Age Estimate for Earth’s Inner Core


A computer simulation of the Earth’s magnetic field, which is generated by heat transfer in the Earth’s core. Credit: NASA/ Gary A.Glatzmaier
By creating conditions akin to the center of the Earth inside a laboratory chamber, researchers have improved the estimate of the age of our planet’s solid inner core, putting it at 1 billion to 1.3 billion years old.
The results place the core at the younger end of an age spectrum that usually runs from about 1.3 billion to 4.5 billion years, but they also make it a good bit older than a recent estimate of only 565 million years.
What’s more, the experiments and accompanying theories help pin down the magnitude of how the core conducts heat, and the energy sources that power the planet’s geodynamo — the mechanism that sustains the Earth’s magnetic field, which keeps compasses pointing north and helps protect life from harmful cosmic rays.
“People are really curious and excited about knowing about the origin of the geodynamo, the strength of the magnetic field, because they all contribute to a planet’s habitability,” said Jung-Fu Lin, a professor at The University of Texas at Austin’s Jackson School of Geosciences who led the research.
The results were published on August 13, 2020 in the journal Physical Review Letters.
The Earth’s core is made mostly of iron, with the inner core being solid and the outer core being liquid. The effectiveness of the iron in transferring heat through conduction — known as thermal conductivity — is key to determining a number of other attributes about the core, including when the inner core formed.
Over the years, estimates for core age and conductivity have gone from very old and relatively low, to very young and relatively high. But these younger estimates have also created a paradox, where the core would have had to reach unrealistically high temperatures to maintain the geodynamo for billions of years before the formation of the inner core.
The new research solves that paradox by finding a solution that keeps the temperature of the core within realistic parameters. Finding that solution depended on directly measuring the conductivity of iron under corelike conditions — where pressure is greater than 1 million atmospheres and temperatures can rival those found on the surface of the sun.
The researchers achieved these conditions by squeezing laser-heated samples of iron between two diamond anvils. It wasn’t an easy feat. It took two years to get suitable results.
“We encountered many problems and failed several times, which made us frustrated, and we almost gave up,” said article co-author Youjun Zhang, an associate professor at Sichuan University in China. “With the constructive comments and encouragement by professor Jung-Fu Lin, we finally worked it out after several test runs.”
The newly measured conductivity is 30% to 50% less than the conductivity of the young core estimate, and it suggests that the geodynamo was maintained by two different energy sources and mechanisms: thermal convection and compositional convection. At first the geodynamo was maintained by thermal convection alone. Now, each mechanism plays about an equally important role.
Lin said that with this improved information on conductivity and heat transfer over time, the researchers could make a more precise estimate of the age of the inner core.
“Once you actually know how much of that heat flux from the outer core to the lower mantle, you can actually think about when did the Earth cool sufficiently to the point that the inner core starts to crystalize,” he said.
This revised age of the inner core could correlate with a spike in the strength of the Earth’s magnetic field as recorded by the arrangement of magnetic materials in rocks that were formed around this time. Together, the evidence suggests that the formation of the inner core was an essential part of creating today’s robust magnetic fields.
Reference: “Reconciliation of Experiments and Theory on Transport Properties of Iron and the Geodynamo” by Youjun Zhang, Mingqiang Hou, Guangtao Liu, Chengwei Zhang, Vitali B. Prakapenka, Eran Greenberg, Yingwei Fei, R. E. Cohen and Jung-Fu Lin, 13 August 2020, Physical Review Letters.
DOI: 10.1103/PhysRevLett.125.078501
The National Science Foundation and the National Natural Science Foundation of China supported the research.
The research team also included Mingqiang Hou, Guangtao Liu and Chengwei Zhang of the Center for High Pressure Science and Technology Advanced Research in Shanghai; Vitali Prakapenka and Eran Greenberg of the University of Chicago; and Yingwei Fei and R.E. Cohen of the Carnegie Institution for Science.




This Is The Most Exciting Crisis in Cosmology

MICHELLE STARR
29 AUGUST 2020


For as long as there has been a Universe, space has been expanding. It winked into existence roughly 13.8 billion years ago, and has been puffing up ever since, like a giant cosmic balloon.

The current rate of this expansion is called the Hubble constant, or H0, and it's one of the fundamental measurements of the Universe.

If you know the Hubble constant, you can calculate the age of the Universe. You can calculate the size of the Universe. You can more accurately calculate the influence of the mysterious dark energy that drives the expansion of the Universe. And, fun fact, H0 is one of the values required to calculate intergalactic distances.

However, there's a huge problem. We have several highly precise methods for determining the Hubble constant... and these methods keep returning different results for an unknown reason.

It could be a problem with the calibration of our measurement techniques - the standard candles and standard rulers we use to measure cosmic distances (more on those in a moment). It could be some unknown property of dark energy.

Or perhaps our understanding of fundamental physics is incomplete. To resolve this might well require a breakthrough of the kind that earns Nobel Prizes.

So, where do we begin?

The basics

The Hubble constant is typically expressed with a seemingly unusual combination of distance and time units - kilometres per second per megaparsec, or (km/s)/Mpc; a megaparsec is around 3.3 million light-years.

That combination is needed because the Universe's expansion means recession speed scales with distance: stuff that's farther away from us appears to be receding faster. Hypothetically, if we found that a galaxy 1 megaparsec away was receding at a rate of 10 km/s, and a galaxy at 10 megaparsecs appeared to be receding at 100 km/s, we could describe that relation as 10 km/s per megaparsec.


In other words, determining the proportional relation between how fast galaxies are moving away from us (km/s) and how far they are (Mpc) is what gives us the value of H0.
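To make those units concrete, here is a minimal sketch of the linear relation v = H0 × d and of the rough "Hubble time", 1/H0, that a given value implies. The numbers are illustrative only; a proper age estimate also depends on how the expansion rate has changed over time.

```python
H0 = 70.0                    # Hubble constant in km/s per Mpc (illustrative value)
KM_PER_MPC = 3.086e19        # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16   # seconds in one billion years

# Hubble's law: recession velocity grows linearly with distance.
for distance_mpc in (1, 10, 100):
    velocity_km_s = H0 * distance_mpc
    print(f"galaxy at {distance_mpc:>3} Mpc recedes at ~{velocity_km_s:.0f} km/s")

# Naive age estimate: the 'Hubble time' 1/H0, ignoring acceleration and deceleration.
hubble_time_gyr = (KM_PER_MPC / H0) / SECONDS_PER_GYR
print(f"1/H0 is roughly {hubble_time_gyr:.1f} billion years")   # ~14 for H0 = 70
```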

If only there was an easy way to measure all this.

Cosmologists have devised a number of ways to arrive at the Hubble constant, but there are two main methods. They involve either standard rulers, or standard candles.

Standard rulers and their signals

Standard rulers are based on signals from a time in the early Universe called the Epoch of Recombination. After the Big Bang, the Universe was so hot and dense, atoms couldn't form. Instead, there existed only a hot, opaque plasma fog; after about 380,000 years of cooling and expansion, that plasma finally started recombining into atoms.

We rely on two signals from this period. The first is the cosmic microwave background (CMB) - the light that escaped the plasma fog as matter recombined, and space became transparent. This first light - faint as it is by now - still fills the Universe uniformly in all directions.

Fluctuations in the temperature of the CMB represent expansions and contractions in the early Universe, to be incorporated into calculations that let us infer our Universe's expansion history.






The second signal is called the baryon acoustic oscillation, and it's the result of spherical acoustic density waves that propagated through the plasma fog of the early Universe, coming to a standstill at the Epoch of Recombination.

The distance this acoustic wave could have travelled during this timeframe is approximately 150 megaparsecs; this is detectable in density variations throughout the history of the Universe, providing a 'ruler' with which to measure distances.
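As a simple illustration of how a standard ruler turns into a distance, the small-angle relation distance ≈ ruler length / angular size can be applied to the ~150 megaparsec scale above. The angular size used here is hypothetical, and real BAO analyses fold in the full expansion history rather than this flat-space shortcut.

```python
import math

BAO_RULER_MPC = 150.0        # approximate BAO scale quoted above, in megaparsecs

# Hypothetical observed angular size of the BAO feature on the sky.
angular_size_deg = 4.0
angular_size_rad = math.radians(angular_size_deg)

# Small-angle approximation: distance ~ ruler / angle (illustration only).
distance_mpc = BAO_RULER_MPC / angular_size_rad
print(f"inferred distance of roughly {distance_mpc:.0f} Mpc")
```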


Standard candles in the sky

Standard candles, on the other hand, are distance measurements based on objects in the local Universe. These can't just be any old stars or galaxies - they need to be objects of known intrinsic brightness, such as Type Ia supernovae, Cepheid variable stars, or stars at the tip of the red giant branch.

"When you're looking at the stars in the sky, you can measure their positions left and right really precisely, you can point at them really precisely, but you can't tell how far away they are," astrophysicist Tamara Davis, from the University of Queensland in Australia, told ScienceAlert.

"It's really difficult to tell the difference between something that's really bright and far away, or something that's faint and close. So, the way people measure it is to find something that's standard in some way. A standard candle is something of known brightness."

Both standard rulers and standard candles are as precise as we can get them, which is to say - very. And they both return different results when used to calculate the Hubble constant.


According to standard rulers, that is, the early Universe, H0 is around 67 kilometres per second per megaparsec. For the standard candles - the local Universe - it's around 74 kilometres per second per megaparsec.

Neither of these results has an error margin that comes even close to closing the gap between them.
The history of the gap

Astronomers Alexander Friedmann and Georges Lemaître first noticed that the Universe was expanding all the way back in the 1920s. By 1929, Edwin Hubble calculated the rate of expansion based on standard candles called Cepheid variable stars, which periodically vary in brightness; since the timing of that variability is linked to these stars' intrinsic brightness, they make for an excellent distance measurement tool.

But the distance calibrations weren't quite right, which carried over into the cosmic distance measurements. Thus, the early calculations returned an H0 of around 500 kilometres per second per megaparsec.

"There was an immediate problem discovered with that because geologists, who were studying Earth, knew that Earth was something like 4 billion years old," Davis said.

"If you calculated the rate of expansion as 500 km/s, you can calculate how long it would have taken to get to the current size of the Universe, and that would have been about 2 billion years. That meant Earth was older than the Universe - which is not possible - and so people went bah! this 'expansion of the Universe' thing is all a farce."

That's where the Hubble constant remained until around the 1950s, when German astronomer Walter Baade discovered that there are two types of Cepheid variable stars, allowing for a refined calculation of the Hubble constant. It was brought down to around 100 (km/s)/Mpc.

(John Huchra/Harvard-Smithsonian Center for Astrophysics)

From there, you know how it goes - you can see the progression on the graph above. As our technology, techniques, and understanding grew ever more refined, so too did the Hubble constant calculations, along with our confidence in them.

"We used to have error bars of plus or minus 50," Davis said. "Now we have error bars of plus or minus 1 or 2. Because the measurements have become so good, these techniques are now sufficiently different that it's hard to explain by measurement errors."
What's the big deal?

Today, the difference between the two values, known as the Hubble tension, may not seem like a large number - just 9.4 percent.

But cosmologists are yet to figure out wherein lies the cause of this discrepancy. The most obvious problem would be one of calibration, but its source remains elusive.

Several different teams, for instance, have calculated H0 from the CMB based on measurements obtained by the Planck space observatory. It's possible the problem could lie with our interpretation of the data; but a 2019 CMB survey by a different instrument, the Atacama Cosmology Telescope, agreed with the Planck data.

In addition, H0 calculations from the baryon acoustic oscillation measured by an entirely different instrument, the Sloan Digital Sky Survey, returned the same result.

Perhaps our standard candles are leading us astray, too. These objects are grouped into stages, forming the 'cosmic distance ladder'. First up, parallax - how nearby stars seem to change position against more distant stars - is used to validate the distances to the two types of variable stars.


The next step out from variable stars is extragalactic Type Ia supernovae. It's like climbing a ladder farther and farther out into the cosmos, and "even a tiny error in one of the steps can propagate into a larger error later," Davis pointed out.

Other attempts to tackle the problem involve thinking about the very space that surrounds us in a different way.

The Hubble bubble hypothesis, for example, is based on the idea that the Milky Way is located in a relatively low-density 'bubble' in the Universe, surrounded by higher density material. The gravitational effect of this higher density material would pull on the space inside the bubble, making it so that the local space appears to expand at a faster rate than the early Universe.

Even if all of the above were indeed contributing to the problem, however, they would hardly add up to that 9.4 percent discrepancy.

"People have been quite inventive in coming up with possible ways that the methods could go wrong. And so far, no one has convincingly argued any one particular error could explain the differences that we see," cosmologist Matthew Colless, from the Australian National University, told ScienceAlert.

"It's possible that a whole bunch of different small errors all lined up the same way; but these sources of error are not related to each other. It would be very surprising and extremely unlucky if it just so happened that every different sort of error we made, all piled up in one direction and took us one way."
Maybe the blame lies with physics?

In pretty much all other respects, our cosmological models work remarkably well. Thus, if you try to alter one of the basic components of the Hubble constant, something else tends to break.

"You can change the standard ruler," Colless said, "but then you break some other observation that's been made - the amount of matter in the Universe, the mass of neutrinos - things like that, well measured and explained by the current model, but broken by the changes you need to make to 'fix' the standard ruler."

Which leads to - what the heck are we missing? Is it a problem with... fundamental physics?

"I am pretty soundly thinking that it's likely to be an error," Davis noted. "But it is genuinely difficult to explain where that error could have come from in the current measurements. So I'm almost 50-50. It's an intriguing discrepancy. And it's really interesting to try and figure out why."

If our options are "humans stuffed something up" and "actually, physics is wrong", the blame typically tends to fall on the former.

Actually, that's an understatement. "New physics" is an exceedingly rare answer. But the Hubble tension is a slippery problem, defying every attempt at a solution cosmologists can come up with.

Which makes it an incredibly exciting one.

Most of these specks are galaxies. (NASA, ESA, S. Beckwith (STScI) and the HUDF Team)

It's possible there's something general relativity hasn't accounted for. That would be wild: Einstein's theory has survived test after cosmic test. But we can't discount the possibility.

Naturally, there are other possibilities as well, such as the huge unknown of dark energy. We don't know what dark energy is, but it seems to be a fundamental force, responsible for the negative pressure that's accelerating the expansion of our Universe. Maybe.

"Our only vague idea is that it is Einstein's cosmological constant, the energy of the vacuum," said Colless. "But we don't really know exactly how that works, because we don't have a convincing way for predicting what the value of the cosmological constant should be."

Alternatively, it could be some hole in our understanding of gravity, although "new physics that affects a theory of fundamental and general relativity is extremely rare," Colless pointed out.

"If there was new physics, and if it turned out to require a modification to general relativity, that would definitely be Nobel Prize-level breakthrough physics."
The only way forward

Whether it's a calibration error, a huge mistake in our current understanding of physics, or something else altogether, there is only one way forward if we're going to fix the Hubble constant - doing more science.

Firstly, cosmologists can work with the current data we already have on standard candles and standard rulers, refining them further and reducing the error bars even more. To supplement this, we can also obtain new data.

Colless, for instance, is working on a project in Australia using the cutting-edge TAIPAN instrument newly installed at Siding Spring Observatory. That team will be surveying millions of galaxies in the local Universe to measure the baryon acoustic oscillation as close to us as possible, to account for any measurement problems produced by the distance.

"We're going to measure 2 million very nearby galaxies - over the whole Southern Hemisphere and a little bit of the Northern Hemisphere - as nearby as we possibly can, look for this signal of baryon acoustic oscillation, and measure that scale with 1 percent precision at very low redshift."

This is the same volume of space that the distance ladders cover. So, if the TAIPAN results in that same volume return an H0 of 67 kilometres per second per megaparsec, the problem might lie with our standard candles.

On the other hand, if the results are closer to 74 kilometres per second per megaparsec, this would suggest the standard candles are more robust.

Emerging research fields are also an option; not standard candles or standard rulers, but standard sirens, based on gravitational wave astronomy - observations of the ripples in spacetime propagated by massive collisions between black holes and neutron stars.

Animation of two neutron stars colliding. (Caltech/YouTube)

"They're similar to the supernovae in that we know how bright they are intrinsically," Davis said.

"Basically, it's like a standard candle. It's sometimes called a standard siren, because the frequency of the gravitational waves tells you how bright it is. Because we know - from general relativity - the relationship between the frequency and the brightness, we don't have to do any calibration. We just have a number, which makes it much, much cleaner than some of these other methods."

It's still hard to measure the Hubble constant with gravitational waves. But initial calculations are promising. In 2017, a neutron star collision allowed astronomers to narrow it down to around 70 (km/s)/Mpc, with error bars large enough on either side to cover both 67 and 74, and then some.

That, Davis said, was stunning.

"We've measured thousands of supernovae now," she said. "We've measured millions of galaxies to measure the baryon acoustic oscillation, we've surveyed the entire sky to measure the cosmic microwave background.

"And this single object, this one measurement of a gravitational wave, got an error bar that was about 10 percent, which took decades of work on the other probes."

Gravitational wave astronomy is still in its infancy - it's only a matter of time before we detect enough neutron star collisions to sufficiently refine those results. With luck, that will help ferret out the cause of the Hubble tension.

Either way, it's going to make history. New physics would, of course, be amazing - but an error in the distance ladder would rock astronomy. It could mean that there's something we don't understand about Type Ia supernovae, or how stars evolve.

Whichever way it shakes out, solving the Hubble tension will have effects that ripple out across astronomical science.

"That's why cosmologists are so excited about this. Because cosmological theory works so well, we're so excited when we find something that it failed to predict. Because when things break, that's when you learn," Colless said.

"Science is all about trial and error - and it's in the error that you learn something new."