Sunday, August 30, 2020

KAPITALISMUS IS KRISIS

Covid-19 has hit advanced economies and Latin America particularly hard


[Chart: total Covid-19 deaths per 100k (x-axis, 0–60) versus projected decline in economic growth in 2020, % (y-axis, 0–10); bubbles sized by population (1–2 billion). Countries and regions shown: Italy, Spain, France, UK, US, Brazil, Germany, China, Latin America and the Caribbean, Advanced Economies, Emerging and Developing Europe, Middle East and Central Asia, Sub-Saharan Africa, Emerging and Developing Asia.]

SOURCE: JOHNS HOPKINS UNIVERSITY, IMF WORLD ECONOMIC OUTLOOK

Exclusive: British Library’s chief librarian claims 'racism is the creation of white people'

Liz Jolly supports changes to displays and collections in the wake of Black Lives Matter protests to purge 'perceived racism' at the library

By Craig Simpson, 29 August 2020


The building was designed by architect and former naval officer Sir Colin Wilson, and evokes the shape and character of an ocean-going craft CREDIT: Anadolu Agency

The British Library’s chief librarian has claimed “racism is a creation of white people” and backed calls for major cultural change at the institution, the Telegraph can reveal.

Liz Jolly manages the vast collection of literary treasures held by the institution, and is supporting changes to displays and collections in the wake of Black Lives Matter protests. Reforms are being proposed by a “Decolonising Working Group”, which claims the British Library’s London building is a symbol of imperialism because it resembles a battleship.

Staff being supported to decolonise the UK’s national library have also suggested that the traditional puppet Mr Punch reflects “colonial violence”.

The Telegraph has exclusively obtained documents revealing these claims, along with a letter, endorsed by hundreds of employees, which declared a racial “state of emergency” at the institution.



In response to this emergency, an internal report called for the removal of statues of the library’s founding fathers, replacing “Eurocentric” maps, and reviewing collections of western classical music which staff branded part of the “outdated notion” of Western Civilization.

The institution recently faced calls for defunding from MPs after the Telegraph revealed employees had urged colleagues to donate to Black Lives Matter and back the work of Labour MP Diane Abbott.

The Telegraph can now reveal that Chief Librarian Ms Jolly has urged white staff to support the institution's plans to purge the library of perceived racism.

In a video clip obtained by the Telegraph she tells colleagues: “I think, as I have said before, that we need to make sure some white colleagues are involved, because racism is a creation of white people.”

Ms Jolly receives between £120,000 and £125,000 per year for her work as Chief Librarian, a role she has held since 2018.

She has given her support to a broad “Anti-Racism Project” proposed in the wake of Black Lives Matter protests, telling colleagues “particularly pressing” work was ongoing to review “artworks in the St Pancras building”.

Ms Jolly assured staff that the project is “about developing and delivering major cultural change”, and part of this would be ensuring the repository of literary treasures will “reflect the diversity of Britain today”.

There are calls for the bust of the founder Sir Hans Sloane to be removed due to his connections with the slave trade CREDIT: Chris J Ratcliffe/Getty Images Europe

This followed a letter, signed by 200 employees, declaring a “state of emergency” at the library. It demanded that BAME staff review any job cuts which might affect employees with “protected characteristics” to ensure continued diversity at the library, that these workers receive better treatment, and that the BAME Network have a say in who is employed as Head of Collections.

To tackle the legacy of “colonial violence” at the library, the letter also demands a statue of founder Sir Hans Sloane be removed.

This recommended removal is echoed in a report by the Decolonising Working Group, which claims that the “physical space” of the British Library contains “manifestations of the institution’s racism” by glorifying the British Empire.

Staff claim in the report that: “This glorification is hard to miss in the structure of the building itself, designed as it is in the form of a battleship, by far the greatest symbol of British imperialism.”

The building was designed by architect and former naval officer Sir Colin Wilson, who added maritime references such as portholes to the building.

In the report aimed at reforming the British Library “as a space”, visual aspects, including a portrait of Mr Punch which hangs in the institution, are critiqued.

Portrait of Mr Punch. Oil painting CREDIT: Alamy

The character, popular at seaside shows, is described as: “A theatrical figure from the heyday of Victorian imperialism who ‘entertained’ through an abuse of women and children that mirrored colonial violence.”

Staff have also claimed the Library has links to the “ongoing settler-colonisation of Palestine”, and have proclaimed that “every vacant wall” of the institution’s property should be used to advertise the cause of anti-racism.

Other recommended actions include reducing the number of European maps on display, described as “tools of power”, and reviewing the presence of busts of Beethoven and Mendelssohn, whose revered status is said to be part and parcel of “western civilizational supremacy”. The music collections should also be diversified, the report suggests.

The Decolonising Working Group is part of the BAME Network at the library, which advised colleagues to read Marxist authors and support Black Lives Matter, deeming “colour-blindness” a form of “covert white supremacy”.



Marble bust of Sir Joseph Banks (1743-1820) Botanist, Trustee and benefactor of the British Museum CREDIT: PjrTravel / Alamy Stock Photo

In their report, circulated among staff, they demand that the bust of another library founder, botanist Sir Joseph Banks, be removed and that the statue of King George III be reviewed.

Their work has been acknowledged by the ongoing Anti-Racism Project at the institution, which is backed by Chief Executive Roly Keating and Chief Librarian Ms Jolly, who has enjoined white staff to advance the project to address what she called the “creation of white people”.

“This says to me that there is something fundamentally wrong with the leadership of the British Library,” said MP Ben Bradley.

“If the Chief Librarian is so unhappy with British history perhaps they should not be in that job.

“The very suggestion that racism only applies in one direction, by white people towards BAME people, is categorically false, inflammatory and divisive.”

The British Library houses almost 200 million objects, many of which have major importance to British history, culture, and literature. These include two copies of the Magna Carta.

The institution and Ms Jolly have been contacted for comment.

Coral develops 'osteoporosis' because of acidic oceans caused by climate change, study reveals

The ocean becomes more acidic as a result of more carbon dioxide building up

By Helena Horton

28 August 2020 • 2:54pm
Coral becomes deformed as a result of acidified oceans, a study has found 
CREDIT: TANE SINCLAIR-TAYLOR/AFP/Getty

Coral reefs develop 'osteoporosis' as a result of the ocean becoming more acidic due to carbon dioxide emissions, scientists have found.

As carbon dioxide builds up in concentration in the ocean, it becomes more acidic, and this is impeding the growth of coral, a new study in Geophysical Research Letters has revealed.

Scientists from the Woods Hole Oceanographic Institution (WHOI) found significant reduction in the density of coral skeleton along much of the Great Barrier Reef, the world's largest coral reef system, and also on two reefs in the South China Sea, which they attribute largely to the increasing acidity of the waters surrounding these reefs since 1950.

"This is the first unambiguous detection and attribution of ocean acidification's impact on coral growth," said ead author and WHOI scientist Weifu Guo.

"Our study presents strong evidence that 20th century ocean acidification, exacerbated by reef biogeochemical processes, had measurable effects on the growth of a keystone reef-building coral species across the Great Barrier Reef and in the South China Sea. These effects will likely accelerate as ocean acidification progresses over the next several decades."


The ocean has experienced a 0.1 unit decline in pH since the pre-industrial era, and scientists are just now discovering what effect that has on marine life.
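Because the pH scale is logarithmic, even a 0.1-unit decline is substantial. A quick arithmetic sketch (the calculation below is illustrative, not taken from the study):

```python
# pH is the negative base-10 logarithm of hydrogen-ion concentration,
# so a fixed pH drop multiplies [H+] by a fixed factor.
def h_ion_increase(delta_ph):
    """Fractional increase in [H+] for a given pH decline."""
    return 10 ** delta_ph - 1

increase = h_ion_increase(0.1)
print(f"{increase:.1%}")  # ~25.9% more hydrogen ions
```

In other words, the 0.1-unit drop since pre-industrial times corresponds to roughly a quarter more hydrogen ions in seawater.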

Ocean acidification has led to a 20 percent decrease in the concentration of carbonate ions in seawater. Calcium carbonate is what corals use to build their skeletons, so this decline puts them at risk.

Because of the reduction in these concentrations, the density of coral skeletons decreases, silently whittling away at the coral's strength, much like osteoporosis weakens bones in humans.

"The corals aren't able to tell us what they're feeling, but we can see it in their skeletons," said Anne Cohen, a WHOI scientist and co-author of the study. "The problem is that corals really need the strength they get from their density, because that's what keeps reefs from breaking apart. The compounding effects of temperature, local stressors, and now ocean acidification will be devastating for many reefs."
#GIFTECONOMY

The bright side of indebtedness

USE VALUE VS EXCHANGE VALUE

Guglielmo Faldetta 
Society and Business Review

ISSN: 1746-5680

Publication date: 3 December 2018

Abstract
Purpose
The purpose of this paper is to show that indebtedness can have a positive meaning for people who are embedded in social relationships in organizations if it is meant in the light of the notion of gratitude, gift-giving and generalized reciprocity.

Design/methodology/approach
The study reviews the literature on the common notion of indebtedness and integrates it with the literature on gratitude, gift-giving and generalized reciprocity.

Findings
The study reveals that, through the notions of gratitude, gift-giving and generalized reciprocity, people may conceive of their indebtedness as gratitude for having received something, triggering giving behaviors that do not necessarily aim to repay the debt, but to develop and feed their social relationships.

Originality/value
In the past, indebtedness has been conceived as a negative feeling. This study reveals that it may also have a bright side when it is applied to people of flesh and blood, immersed in good-quality social relationships.

Citation
Faldetta, G. (2018), "The bright side of indebtedness", Society and Business Review, Vol. 15 No. 1, pp. 26-34. https://doi.org/10.1108/SBR-11-2018-0132

Publisher: Emerald Publishing Limited
Copyright © 2018, Emerald Publishing Limited

Putting the COVID-19 Death Toll Into Perspective – Here’s the Pandemic’s Impact on US Lifespan

UC Berkeley demographers calculate pandemic’s impact on US lifespan based on projected mortality rates.
With over 170,000 COVID-19 deaths to date, and 1,000 more each day, America’s life expectancy may appear to be plummeting. But in estimating the magnitude of the pandemic, UC Berkeley demographers have found that COVID-19 is likely to shorten the average U.S. lifespan in 2020 by only about a year.
Seeking to put current COVID-19 mortality rates into historic, demographic and economic perspective, UC Berkeley demographers Ronald Lee and Joshua Goldstein calculated the consequences of U.S. lives lost to COVID-19 in 2020 using two scenarios. One was based on a projection of 1 million deaths for the year, the other on the more likely projection of 250,000 deaths.
Their findings, published online last week in the Proceedings of the National Academy of Sciences journal, conclude that 1 million deaths in 2020 would cut three years off the average U.S. life expectancy, while 250,000 deaths would reduce lifespans by about a year.
That said, without the societal efforts that have occurred to lessen the impact of COVID-19, there could have been 2 million deaths projected by the end of 2020, a reduction of the average U.S. lifespan by five years, the researchers pointed out.
Their estimated drop in life expectancy is modest, in part, because 250,000 deaths is not a large increase on top of the 3 million non-COVID-19 deaths expected for 2020, and because older people, who typically have fewer remaining years of life, account for most COVID-19 fatalities, the study notes.
Still, while COVID-19 mortality rates remain lower than those of the 1918 Spanish flu pandemic, the coronavirus epidemic could be just as devastating as the longer-lasting HIV and opioid epidemics if mitigation efforts fail, the researchers said.
“The death toll of COVID-19 is a terrible thing, both for those who lose their lives and for their family, friends, colleagues and all whom their lives touched. Those are real people, not abstract statistics,” said Lee, a UC Berkeley professor emeritus of demography and associate director of the campus’s Center for the Economics and Demography of Aging.
“But the population perspective helps put this tragedy in a broader context. As we work to contain this epidemic, it is important to know that we have been through such mortality crises before,” he added.
Goldstein and Lee’s measures are based on factors that include a current U.S. population of 330 million, age-specific death rates and the economic valuation of saved lives.
Among their other findings:
  • One million COVID-19 deaths in the U.S. in 2020 would be the equivalent of U.S. mortality levels in 1995, adding three years to each American’s biological age, albeit temporarily.
  • The age gap (old versus young) for people dying from COVID-19 is marginally wider than during pre-pandemic times, while the male-female gap is slightly narrower. The researchers found similar death-by-age patterns across several countries.
  • The economic cost of lives lost to COVID-19 in the U.S. is in the trillions of dollars. According to standard government measures, the demographers estimated that the loss of 1 million lives in 2020 would amount to between $10.2 and $17.5 trillion, while the amount for 250,000 deaths would range from $1.5 to $2.5 trillion.
Reference: “Demographic perspectives on the mortality of COVID-19 and other epidemics” by Joshua R. Goldstein and Ronald D. Lee, 20 August 2020, Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2006392117

Solving a Paradox: New Younger Age Estimate for Earth’s Inner Core


A computer simulation of the Earth’s magnetic field, which is generated by heat transfer in the Earth’s core. Credit: NASA/Gary A. Glatzmaier
By creating conditions akin to the center of the Earth inside a laboratory chamber, researchers have improved the estimate of the age of our planet’s solid inner core, putting it at 1 billion to 1.3 billion years old.
The results place the core at the younger end of an age spectrum that usually runs from about 1.3 billion to 4.5 billion years, but they also make it a good bit older than a recent estimate of only 565 million years.
What’s more, the experiments and accompanying theories help pin down the magnitude of how the core conducts heat, and the energy sources that power the planet’s geodynamo — the mechanism that sustains the Earth’s magnetic field, which keeps compasses pointing north and helps protect life from harmful cosmic rays.
“People are really curious and excited about knowing about the origin of the geodynamo, the strength of the magnetic field, because they all contribute to a planet’s habitability,” said Jung-Fu Lin, a professor at The University of Texas at Austin’s Jackson School of Geosciences who led the research.
The results were published on August 13, 2020 in the journal Physical Review Letters.
The Earth’s core is made mostly of iron, with the inner core being solid and the outer core being liquid. The effectiveness of the iron in transferring heat through conduction — known as thermal conductivity — is key to determining a number of other attributes about the core, including when the inner core formed.
Over the years, estimates for core age and conductivity have gone from very old and relatively low, to very young and relatively high. But these younger estimates have also created a paradox, where the core would have had to reach unrealistically high temperatures to maintain the geodynamo for billions of years before the formation of the inner core.
The new research solves that paradox by finding a solution that keeps the temperature of the core within realistic parameters. Finding that solution depended on directly measuring the conductivity of iron under corelike conditions — where pressure is greater than 1 million atmospheres and temperatures can rival those found on the surface of the sun.
The researchers achieved these conditions by squeezing laser-heated samples of iron between two diamond anvils. It wasn’t an easy feat. It took two years to get suitable results.
“We encountered many problems and failed several times, which made us frustrated, and we almost gave up,” said article co-author Youjun Zhang, an associate professor at Sichuan University in China. “With the constructive comments and encouragement by professor Jung-Fu Lin, we finally worked it out after several test runs.”
The newly measured conductivity is 30% to 50% less than the conductivity of the young core estimate, and it suggests that the geodynamo was maintained by two different energy sources and mechanisms: thermal convection and compositional convection. At first the geodynamo was maintained by thermal convection alone. Now, each mechanism plays about an equally important role.
Lin said that with this improved information on conductivity and heat transfer over time, the researchers could make a more precise estimate of the age of the inner core.
“Once you actually know how much of that heat flux from the outer core to the lower mantle, you can actually think about when did the Earth cool sufficiently to the point that the inner core starts to crystalize,” he said.
This revised age of the inner core could correlate with a spike in the strength of the Earth’s magnetic field as recorded by the arrangement of magnetic materials in rocks that were formed around this time. Together, the evidence suggests that the formation of the inner core was an essential part of creating today’s robust magnetic fields.
Reference: “Reconciliation of Experiments and Theory on Transport Properties of Iron and the Geodynamo” by Youjun Zhang, Mingqiang Hou, Guangtao Liu, Chengwei Zhang, Vitali B. Prakapenka, Eran Greenberg, Yingwei Fei, R. E. Cohen and Jung-Fu Lin, 13 August 2020, Physical Review Letters.
DOI: 10.1103/PhysRevLett.125.078501
The National Science Foundation and the National Natural Science Foundation of China supported the research.
The research team also included Mingqiang Hou, Guangtao Liu and Chengwei Zhang of the Center for High Pressure Science and Technology Advanced Research in Shanghai; Vitali Prakapenka and Eran Greenberg of the University of Chicago; and Yingwei Fei and R.E. Cohen of the Carnegie Institution for Science.




This Is The Most Exciting Crisis in Cosmology

MICHELLE STARR
29 AUGUST 2020


For as long as there has been a Universe, space has been expanding. It winked into existence roughly 13.8 billion years ago, and has been puffing up ever since, like a giant cosmic balloon.

The current rate of this expansion is called the Hubble constant, or H0, and it's one of the fundamental measurements of the Universe.

If you know the Hubble constant, you can calculate the age of the Universe. You can calculate the size of the Universe. You can more accurately calculate the influence of the mysterious dark energy that drives the expansion of the Universe. And, fun fact, H0 is one of the values required to calculate intergalactic distances.

However, there's a huge problem. We have several highly precise methods for determining the Hubble constant... and these methods keep returning different results for an unknown reason.

It could be a problem with the calibration of our measurement techniques - the standard candles and standard rulers we use to measure cosmic distances (more on those in a moment). It could be some unknown property of dark energy.

Or perhaps our understanding of fundamental physics is incomplete. To resolve this might well require a breakthrough of the kind that earns Nobel Prizes.

So, where do we begin?

The basics

The Hubble constant is typically expressed with a seemingly unusual combination of distance and time units - kilometres per second per megaparsec, or (km/s)/Mpc; a megaparsec is around 3.3 million light-years.

That combination is needed because the Universe is expanding everywhere at once, so stuff that's farther away from us appears to recede faster. Hypothetically, if we found that a galaxy 1 megaparsec away was receding at a rate of 10 km/s, and a galaxy at 10 megaparsecs appeared to be receding at 100 km/s, we could describe that relation as 10 km/s per megaparsec.


In other words, determining the proportional relation between how fast galaxies are moving away from us (km/s) and how far they are (Mpc) is what gives us the value of H0.
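That proportionality can be sketched numerically: given a handful of (distance, velocity) pairs, a least-squares slope through the origin recovers H0. The galaxy measurements below are invented purely for illustration:

```python
# Hypothetical galaxy measurements: distance (Mpc), recession velocity (km/s).
galaxies = [(10, 700), (25, 1800), (50, 3400), (100, 7100)]

# Least-squares slope of a line forced through the origin:
# H0 = sum(d * v) / sum(d * d).
num = sum(d * v for d, v in galaxies)
den = sum(d * d for d, v in galaxies)
h0 = num / den
print(f"H0 ~ {h0:.1f} (km/s)/Mpc")  # ~70.5 for this made-up data
```

Real analyses are far more involved, but the core idea is the same: the Hubble constant is the slope of velocity against distance.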

If only there was an easy way to measure all this.

Cosmologists have devised a number of ways to arrive at the Hubble constant, but there are two main methods. They involve either standard rulers, or standard candles.

Standard rulers and their signals

Standard rulers are based on signals from a time in the early Universe called the Epoch of Recombination. After the Big Bang, the Universe was so hot and dense, atoms couldn't form. Instead, there existed only a hot, opaque plasma fog; after about 380,000 years of cooling and expansion, that plasma finally started recombining into atoms.

We rely on two signals from this period. The first is the cosmic microwave background (CMB) - the light that escaped the plasma fog as matter recombined, and space became transparent. This first light - faint as it is by now - still fills the Universe uniformly in all directions.

Fluctuations in the temperature of the CMB represent expansions and contractions in the early Universe, to be incorporated into calculations that let us infer our Universe's expansion history.






The second signal is called the baryon acoustic oscillation, and it's the result of spherical acoustic density waves that propagated through the plasma fog of the early Universe, coming to a standstill at the Epoch of Recombination.

The distance this acoustic wave could have travelled in that time is approximately 150 megaparsecs; this scale is detectable in density variations throughout the history of the Universe, providing a 'ruler' with which to measure distances.


Standard candles in the sky

Standard candles, on the other hand, are distance measurements based on objects in the local Universe. These can't just be any old stars or galaxies - they need to be objects of known intrinsic brightness, such as Type Ia supernovae, Cepheid variable stars, or stars at the tip of the red giant branch.

"When you're looking at the stars in the sky, you can measure their positions left and right really precisely, you can point at them really precisely, but you can't tell how far away they are," astrophysicist Tamara Davis, from the University of Queensland in Australia, told ScienceAlert.

"It's really difficult to tell the difference between something that's really bright and far away, or something that's faint and close. So, the way people measure it is to find something that's standard in some way. A standard candle is something of known brightness."

Both standard rulers and standard candles are as precise as we can get them, which is to say - very. And they both return different results when used to calculate the Hubble constant.


According to standard rulers, that is, the early Universe, H0 is around 67 kilometres per second per megaparsec. For the standard candles - the local Universe - it's around 74 kilometres per second per megaparsec.

Neither of these results has an error margin that comes even close to closing the gap between them.
The history of the gap

Astronomers Alexander Friedmann and Georges Lemaître first noticed that the Universe was expanding all the way back in the 1920s. By 1929, Edwin Hubble calculated the rate of expansion based on standard candles called Cepheid variable stars, which periodically vary in brightness; since the timing of that variability is linked to these stars' intrinsic brightness, they make for an excellent distance measurement tool.

But the distance calibrations weren't quite right, which carried over into the cosmic distance measurements. Thus, the early calculations returned an H0 of around 500 kilometres per second per megaparsec.

"There was an immediate problem discovered with that because geologists, who were studying Earth, knew that Earth was something like 4 billion years old," Davis said.

"If you calculated the rate of expansion as 500 km/s, you can calculate how long it would have taken to get to the current size of the Universe, and that would have been about 2 billion years. That meant Earth was older than the Universe - which is not possible - and so people went bah! this 'expansion of the Universe' thing is all a farce."

That's where the Hubble constant remained until around the 1950s, when German astronomer Walter Baade discovered that there are two types of Cepheid variable stars, allowing for a refined calculation of the Hubble constant. It was brought down to around 100 (km/s)/Mpc.

(John Huchra/Harvard-Smithsonian Center for Astrophysics)

From there, you know how it goes - you can see the progression on the graph above. As our technology, techniques, and understanding grew ever more refined, so too did the Hubble constant calculations, along with our confidence in them.

"We used to have error bars of plus or minus 50," Davis said. "Now we have error bars of plus or minus 1 or 2. Because the measurements have become so good, these techniques are now sufficiently different that it's hard to explain by measurement errors."
What's the big deal?

Today, the difference between the two values, known as the Hubble tension, may not seem like a large number - just 9.4 percent.
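The quoted figure is just the relative gap between the two measurements, a quick arithmetic check:

```python
early = 67.0   # (km/s)/Mpc, from standard rulers (early Universe)
local = 74.0   # (km/s)/Mpc, from standard candles (local Universe)

# Gap expressed as a fraction of the local-Universe value.
tension = (local - early) / local
print(f"{tension:.1%}")  # prints 9.5% - roughly the quoted figure
```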

But cosmologists have yet to figure out the cause of this discrepancy. The most obvious culprit would be calibration, but its source remains elusive.

Several different teams, for instance, have calculated H0 from the CMB based on measurements obtained by the Planck space observatory. It's possible the problem could lie with our interpretation of the data; but a 2019 CMB survey by a different instrument, the Atacama Cosmology Telescope, agreed with the Planck data.

In addition, H0 calculations from the baryon acoustic oscillation measured by an entirely different instrument, the Sloan Digital Sky Survey, returned the same result.

Perhaps our standard candles are leading us astray, too. These objects are grouped into stages, forming the 'cosmic distance ladder'. First up, parallax - how nearby stars seem to change position against more distant stars - is used to validate the distances to the two types of variable stars.

(design und mehr)

The next step out from variable stars is extragalactic Type Ia supernovae. It's like climbing a ladder farther and farther out into the cosmos, and "even a tiny error in one of the steps can propagate into a larger error later," Davis pointed out.

Other attempts to tackle the problem involve thinking about the very space that surrounds us in a different way.

The Hubble bubble hypothesis, for example, is based on the idea that the Milky Way is located in a relatively low-density 'bubble' in the Universe, surrounded by higher density material. The gravitational effect of this higher density material would pull on the space inside the bubble, making it so that the local space appears to expand at a faster rate than the early Universe.

Even if all of the above were indeed contributing to the problem, however, it would hardly add up to that 9.4 percent discrepancy.

"People have been quite inventive in coming up with possible ways that the methods could go wrong. And so far, no one has convincingly argued any one particular error could explain the differences that we see," cosmologist Matthew Colless, from the Australian National University, told ScienceAlert.

"It's possible that a whole bunch of different small errors all lined up the same way; but these sources of error are not related to each other. It would be very surprising and extremely unlucky if it just so happened that every different sort of error we made, all piled up in one direction and took us one way."
Maybe the blame lies with physics?

In pretty much all other respects, our cosmological models work remarkably well. Thus, if you try to alter one of the basic components of the Hubble constant, something else tends to break.

"You can change the standard ruler," Colless said, "but then you break some other observation that's been made - the amount of matter in the Universe, the mass of neutrinos - things like that, well measured and explained by the current model, but broken by the changes you need to make to 'fix' the standard ruler."

Which leads to - what the heck are we missing? Is it a problem with... fundamental physics?

"I am pretty soundly thinking that it's likely to be an error," Davis noted. "But it is genuinely difficult to explain where that error could have come from in the current measurements. So I'm almost 50-50. It's an intriguing discrepancy. And it's really interesting to try and figure out why."

If our options are "humans stuffed something up" and "actually, physics is wrong", the blame typically tends to fall on the former.

Actually, that's an understatement. "New physics" is an exceedingly rare answer. But the Hubble tension is a slippery problem, defying every attempt at a solution cosmologists can come up with.

Which makes it an incredibly exciting one.

Most of these specks are galaxies. (NASA, ESA, S. Beckwith (STScI) and the HUDF Team)

It's possible there's something general relativity hasn't accounted for. That would be wild: Einstein's theory has survived test after cosmic test. But we can't discount the possibility.

Naturally, there are other possibilities as well, such as the huge unknown of dark energy. We don't know what dark energy is, but it seems to be a fundamental force, responsible for the negative pressure that's accelerating the expansion of our Universe. Maybe.

"Our only vague idea is that it is Einstein's cosmological constant, the energy of the vacuum," said Colless. "But we don't really know exactly how that works, because we don't have a convincing way for predicting what the value of the cosmological constant should be."

Alternatively, it could be some hole in our understanding of gravity, although "new physics that affects a theory as fundamental as general relativity is extremely rare," Colless pointed out.

"If there was new physics, and if it turned out to require a modification to general relativity, that would definitely be Nobel Prize-level breakthrough physics."
The only way forward

Whether it's a calibration error, a huge mistake in our current understanding of physics, or something else altogether, there is only one way forward if we're going to fix the Hubble constant - doing more science.

Firstly, cosmologists can work with the current data we already have on standard candles and standard rulers, refining them further and reducing the error bars even more. To supplement this, we can also obtain new data.

Colless, for instance, is working on a project in Australia using the cutting-edge TAIPAN instrument newly installed at Siding Spring Observatory. That team will be surveying millions of galaxies in the local Universe to measure the baryon acoustic oscillation as close to us as possible, to account for any measurement problems produced by the distance.

"We're going to measure 2 million very nearby galaxies - over the whole Southern Hemisphere and a little bit of the Northern Hemisphere - as nearby as we possibly can, look for this signal of baryon acoustic oscillation, and measure that scale with 1 percent precision at very low redshift."

This is the same volume of space that the distance ladders cover. So, if the TAIPAN results in that same volume return an H0 of 67 kilometres per second per megaparsec, the problem might lie with our standard candles.

On the other hand, if the results are closer to 74 kilometres per second per megaparsec, this would suggest the standard candles are more robust.
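To see what's at stake in those two numbers, here is a minimal sketch (plain Python, no external libraries) of what each value of H0 implies for the naive expansion age of the Universe, 1/H0. The figures 67 and 74 are the competing values quoted above; the age calculation is illustrative only, since the real one folds in matter and dark energy:

```python
# Naive "Hubble time" (1/H0): a rough proxy for the age of the Universe.
# Illustrative only - a proper age calculation includes matter and dark energy.

KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
SEC_PER_GYR = 3.156e16      # seconds in one billion years

def hubble_time_gyr(h0_km_s_mpc):
    """Convert H0 in (km/s)/Mpc to the Hubble time 1/H0 in Gyr."""
    h0_per_sec = h0_km_s_mpc / KM_PER_MPC   # H0 in units of 1/s
    return 1.0 / h0_per_sec / SEC_PER_GYR

for h0 in (67, 74):
    print(f"H0 = {h0} (km/s)/Mpc  ->  1/H0 = {hubble_time_gyr(h0):.1f} Gyr")
```

The standard-ruler value (67) gives a Hubble time of about 14.6 billion years, while the standard-candle value (74) gives about 13.2 billion - a billion-year disagreement about cosmic history arising from a 10 percent difference in a single number.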

Emerging research fields are also an option; not standard candles or standard rulers, but standard sirens, based on gravitational wave astronomy - the ripples in spacetime produced by massive collisions between black holes and neutron stars.

Animation of two neutron stars colliding. (Caltech/YouTube)

"They're similar to the supernovae in that we know how bright they are intrinsically," Davis said.

"Basically, it's like a standard candle. It's sometimes called a standard siren, because the frequency of the gravitational waves tells you how bright it is. Because we know - from general relativity - the relationship between the frequency and the brightness, we don't have to do any calibration. We just have a number, which makes it much, much cleaner than some of these other methods."

It's still hard to measure the Hubble constant with gravitational waves, but initial calculations are promising. In 2017, a neutron star collision allowed astronomers to narrow it down to around 70 (km/s)/Mpc, with error bars large enough on either side to cover both 67 and 74, and then some.

That, Davis said, was stunning.

"We've measured thousands of supernovae now," she said. "We've measured millions of galaxies to measure the baryon acoustic oscillation, we've surveyed the entire sky to measure the cosmic microwave background.

"And this single object, this one measurement of a gravitational wave, got an error bar that was about 10 percent, which took decades of work on the other probes."
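The arithmetic behind that single-object estimate is refreshingly simple: the gravitational waveform gives the distance directly, the host galaxy gives the recession velocity, and H0 is their ratio. As a hedged sketch, the numbers below are illustrative values close to those reported for the 2017 event (GW170817), not the published analysis:

```python
# Standard-siren estimate of H0: recession velocity divided by distance.
# Inputs are illustrative values roughly matching the 2017 neutron star merger.
v_recession_km_s = 3017.0   # Hubble-flow velocity of the host galaxy (km/s)
distance_mpc = 43.8         # luminosity distance from the gravitational waveform (Mpc)

h0 = v_recession_km_s / distance_mpc
print(f"H0 ~ {h0:.1f} (km/s)/Mpc")   # lands near 70, between the rival values
```

No rungs on a ladder, no calibration against other objects - which is why a single detection could compete with decades of supernova and galaxy surveys.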

Gravitational wave astronomy is still in its infancy - it's only a matter of time before we detect enough neutron star collisions to sufficiently refine those results. With luck, that will help ferret out the cause of the Hubble tension.

Either way, it's going to make history. New physics would, of course, be amazing - but an error in the distance ladder would rock astronomy. It could mean that there's something we don't understand about Type Ia supernovae, or how stars evolve.

Whichever way it shakes out, solving the Hubble tension will have effects that ripple out across astronomical science.

"That's why cosmologists are so excited about this. Because cosmological theory works so well, we're so excited when we find something that it failed to predict. Because when things break, that's when you learn," Colless said.

"Science is all about trial and error - and it's in the error that you learn something new."




Jaw-Dropping Observations of Black Hole Devouring a Star Reveal Rapid Disk Formation


By UNIVERSITY OF CALIFORNIA SANTA CRUZ AUGUST 28, 2020


Computer simulation of the disruption of a star by a black hole shows the formation of an eccentric accretion disk of stellar material spiraling into the black hole. This image from a video of the simulation shows the early stage in the formation of the disk. Credit: Jamie Law-Smith and Enrico Ramirez-Ruiz

First clear confirmation of accretion disk formation in a tidal disruption event without x-ray emissions supports theoretical predictions.

When a star passes too close to a supermassive black hole, tidal forces tear it apart, producing a bright flare of radiation as material from the star falls into the black hole. Astronomers study the light from these “tidal disruption events” (TDEs) for clues to the feeding behavior of the supermassive black holes lurking at the centers of galaxies.




New TDE observations led by astronomers at UC Santa Cruz now provide clear evidence that debris from the star forms a rotating disk, called an accretion disk, around the black hole. Theorists have been debating whether an accretion disk can form efficiently during a tidal disruption event, and the new findings, accepted for publication in the Astrophysical Journal, should help resolve that question, said first author Tiara Hung, a postdoctoral researcher at UC Santa Cruz.

“In classical theory, the TDE flare is powered by an accretion disk, producing x-rays from the inner region where hot gas spirals into the black hole,” Hung said. “But for most TDEs, we don’t see x-rays—they mostly shine in the ultraviolet and optical wavelengths—so it was suggested that, instead of a disk, we’re seeing emissions from the collision of stellar debris streams.”



Coauthors Enrico Ramirez-Ruiz, professor of astronomy and astrophysics at UCSC, and Jane Dai at the University of Hong Kong developed a theoretical model, published in 2018, that can explain why x-rays are usually not observed in TDEs despite the formation of an accretion disk. The new observations provide strong support for this model.

“This is the first solid confirmation that accretion disks form in these events, even when we don’t see x-rays,” Ramirez-Ruiz said. “The region close to the black hole is obscured by an optically thick wind, so we don’t see the x-ray emissions, but we do see optical light from an extended elliptical disk.”




Telltale evidence

The telltale evidence for an accretion disk comes from spectroscopic observations. Coauthor Ryan Foley, assistant professor of astronomy and astrophysics at UCSC, and his team began monitoring the TDE (named AT 2018hyz) after it was first detected in November 2018 by the All Sky Automated Survey for SuperNovae (ASAS-SN). Foley noticed an unusual spectrum while observing the TDE with the 3-meter Shane Telescope at UC’s Lick Observatory on the night of January 1, 2019.

“My jaw dropped, and I immediately knew this was going to be interesting,” he said. “What stood out was the hydrogen line—the emission from hydrogen gas—which had a double-peaked profile that was unlike any other TDE we’d seen.”


A model of ultraviolet and optical emission from the tidal disruption event AT 2018hyz is shown in this schematic diagram. As an accretion disk forms quickly after the TDE, it generates x-ray emission (black arrows) at small radii, which is only visible through the vertical funnel. In other directions, x-rays are reprocessed by the photosphere or wind, powering the ultraviolet and optical emissions. Hydrogen emission is produced at two distinct sites outside of the photosphere: a large elliptical disk (color-coded by velocity to show rotation) joined by the fallback material, and a broad emission line region (BLR) that is likely created by a radiation-driven wind (purple shaded area). Credit: Tiara Hung

Foley explained that the double peak in the spectrum results from the Doppler effect, which shifts the frequency of light emitted by a moving object. In an accretion disk spiraling around a black hole and viewed at an angle, some of the material will be moving toward the observer, so the light it emits will be shifted to a higher frequency, and some of the material will be moving away from the observer, its light shifted to a lower frequency.

“It’s the same effect that causes the sound of a car on a race track to shift from a high pitch as the car comes toward you to a lower pitch when it passes and starts moving away from you,” Foley said. “If you’re sitting in the bleachers, the cars on one turn are all moving toward you and the cars on the other turn are moving away from you. In an accretion disk, the gas is moving around the black hole in a similar way, and that’s what gives the two peaks in the spectrum.”
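The double-peaked line Foley describes can be sketched numerically. Assuming, purely for illustration, disk gas with a line-of-sight speed of 10,000 km/s (a round number, not the measured value for AT 2018hyz), the hydrogen-alpha line at 656.3 nm splits into two peaks:

```python
# First-order Doppler shift of the hydrogen-alpha line from a rotating disk.
# The 10,000 km/s line-of-sight speed is an illustrative assumption.

C_KM_S = 299_792.458        # speed of light (km/s)
H_ALPHA_NM = 656.3          # rest wavelength of hydrogen-alpha (nm)
v_los = 10_000.0            # assumed line-of-sight orbital speed (km/s)

def doppler_shift_nm(rest_nm, v_km_s):
    """Observed wavelength for gas moving at v (positive = receding)."""
    return rest_nm * (1 + v_km_s / C_KM_S)

blue_peak = doppler_shift_nm(H_ALPHA_NM, -v_los)  # gas moving toward us
red_peak = doppler_shift_nm(H_ALPHA_NM, +v_los)   # gas moving away from us

print(f"blue peak: {blue_peak:.1f} nm, red peak: {red_peak:.1f} nm")
```

The approaching side of the disk piles up light around 634 nm and the receding side around 678 nm - two peaks straddling the rest wavelength, exactly the signature that stood out in the spectrum.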

The team continued to gather data over the next few months, observing the TDE with several telescopes as it evolved over time. Hung led a detailed analysis of the data, which indicates that disk formation took place relatively quickly, in a matter of weeks after the disruption of the star. The findings suggest that disk formation may be common among optically detected TDEs despite the rarity of double-peaked emission, which depends on factors such as the inclination of the disk relative to observers.

“I think we got lucky with this one,” Ramirez-Ruiz said. “Our simulations show that what we observe is very sensitive to the inclination. There is a preferred orientation to see these double-peak features, and a different orientation to see x-ray emissions.”

He noted that Hung’s analysis of multi-wavelength follow-up observations, including photometric and spectroscopic data, provides unprecedented insights into these unusual events. “When we have spectra, we can learn a lot about the kinematics of the gas and get a much clearer understanding of the accretion process and what is powering the emissions,” Ramirez-Ruiz said.

Reference: “Prompt Accretion Disk Formation in an X-Ray Faint Tidal Disruption Event” by Tiara Hung, Ryan J. Foley, Enrico Ramirez-Ruiz, Jane L. Dai, Katie Auchettl, Charles D. Kilpatrick, Brenna Mockler, Jonathan S. Brown, David A. Coulter, Georgios Dimitriadis, Thomas W.-S. Holoien, Jamie A.P. Law-Smith, Anthony L. Piro, Armin Rest, César Rojas-Bravo and Matthew R. Siebert, Accepted, Astrophysical Journal.
arXiv: 2003.09427


In addition to Hung, Foley, Ramirez-Ruiz, and other members of the UCSC team, the coauthors of the paper also include scientists at the Niels Bohr Institute in Copenhagen (where Ramirez-Ruiz holds a Niels Bohr Professorship); University of Hong Kong; University of Melbourne, Australia; Carnegie Institution for Science; and Space Telescope Science Institute.

Observations were obtained at Lick Observatory, the W. M. Keck Observatory, the Southern Astrophysical Research (SOAR) telescope, and the Swope Telescope at Las Campanas Observatory in Chile. This work was supported in part by the National Science Foundation, the Gordon and Betty Moore Foundation, the David and Lucile Packard Foundation, and the Heising-Simons Foundation.

Saturday, August 29, 2020



Global rally against COVID-19 safety measures comes to Parliament Hill

Online misinformation about pandemic safety measures thriving, experts say

#COVIDIOTS


Raisa Patel, David Thurton · CBC News · Posted: Aug 29, 2020
A demonstrator holds her sign during a protest against measures to stop transmission of COVID-19 on Parliament Hill in Ottawa, Sat., Aug. 29, 2020. (Adrian Wyld/The Canadian Press)
Flouting public health guidance to wear masks in public areas, avoid large gatherings and practice physical distancing, hundreds of demonstrators gathered on Parliament Hill Saturday to protest pandemic safety measures they say infringe on their personal rights.

The group behind the rally is part of a wider movement of people across the globe venting their frustrations over science-backed measures designed to slow the spread of COVID-19. The majority of protesters did not wear masks and congregated closely together on Parliament Hill's front lawn.

The demonstration was one of several rallies that took place around the world on Saturday. In Berlin, 38,000 people protested public safety restrictions at an event that was disbanded by police after demonstrators failed to keep their distance and wear masks as instructed.

Elsewhere in Europe, about 200 anti-mask activists held a similar rally in Paris, while thousands of demonstrators gathered in London's Trafalgar Square calling the coronavirus a hoax and demanding an end to restrictions.

Thousands rally in downtown Montreal to protest Quebec's mandatory mask rules

AUDIO: Anti-mask group speaks out against Sudbury's health directives

The group behind the event in Canada's capital calls itself a "leaderless" movement that doesn't take political sides, and says it is mostly concerned with being forced to comply with the measures.

"We are not anti-mask at all, we are pro-liberty," said Kelly Anne Wolfe, the executive director of The Line Canada, a group that organized the protest. "If you want to wear a mask or a tutu, we have nothing to say about that. That is your right. You do not have a right to put one on my face or the face of my children. It's as simple as that."

The Parliamentary Protective Service, which is responsible for security on Parliament Hill, has not issued an estimate for the size of Saturday's crowd.
Masks protect individuals — and others

Public health authorities have overwhelmingly recommended or mandated the use of masks to protect against transmission of and exposure to the virus — particularly in cases where a person is asymptomatic or might not know if they have been infected.

The Public Health Agency of Canada recommends wearing a homemade or non-medical mask when it is not possible to maintain physical distancing, while many jurisdictions across the country have also made mask-wearing mandatory in indoor public areas.

A balloon floats above protesters on Parliament Hill during a demonstration against safety measures intended to slow the spread of COVID-19. (Adrian Wyld/The Canadian Press)

"It's been shown to be a very effective public health measure," said Craig Jenne, an infectious disease researcher at the University of Calgary. "We know, for example, that masks can reduce the transmission and spread of droplets by more than six-fold."

Jenne said that when it comes to considering personal freedom, context is important.

"Wearing a mask is not a large ask and [it] is not an arduous process to literally protect the lives of people in your community."

While some individuals are not able to wear masks because of pre-existing health conditions, those who can are helping to stop the spread, he added.

"If we can do our part to keep viral numbers down, we can protect them as well. So it really is just an overall community effort."
Beliefs shared widely online

The group responsible for Saturday's rally mobilized on Facebook, where organizers said they expected 200,000 people to show up.

Only a small fraction of that number materialized, but experts say these beliefs have found new life online.

"I have noticed an increased level of misinformation circulating on social media since the beginning of the pandemic," said Aengus Bridgman, a Montreal-based researcher with the Media Ecosystem Observatory.

Demonstrators in downtown Ottawa were joined by supporters from outside the city, including a convoy of protesters travelling from Quebec. (Adrian Wyld/The Canadian Press)

While social media provides a platform for citizens to exercise fundamental freedoms and participate in the democratic process, Bridgman said that must be weighed against sharing harmful information.

"There are concerns when we have social media platforms that are, through negligence, allowing the spread of misinformation that...leads to increased infection rates, increased deaths, and increases the duration of the pandemic."

COVID-19 conspiracy theories creating a 'public health crisis' in Canada, experts say

Anti-masking groups draw from anti-vaccination playbook to spread misinformation

Earlier this week, Canada's Chief Public Health Officer, Dr. Theresa Tam, acknowledged that online platforms were also contributing to mistrust surrounding a COVID-19 vaccine.

"I am also concerned about the first pandemic in the age of the internet and social media. This is an area of significant work because we have an overload of information through which many Canadians can't sort out what is credible and what is not," she said.

"I look towards different partners, different government departments coming together to look at how we better address some of the myths and disinformation that is in that space. That is, I think, fundamentally a massive challenge."


With files from Reuters
