
Tuesday, April 11, 2023

New findings that map the universe’s cosmic growth support Einstein’s theory of gravity

Research by the Atacama Cosmology Telescope collaboration has culminated in a significant breakthrough in understanding the evolution of the universe.

Reports and Proceedings

PRINCETON UNIVERSITY

ACT Lensing Map 

IMAGE: Researchers used the Atacama Cosmology Telescope to create this new map of the dark matter. The orange regions show where there is more mass; purple where there is less or none. The typical features are hundreds of millions of light-years across. The whitish band shows where contaminating light from dust in our Milky Way galaxy, measured by the Planck satellite, obscures a deeper view. The new map uses light from the cosmic microwave background (CMB) essentially as a backlight to silhouette all the matter between us and the Big Bang. “It’s a bit like silhouetting, but instead of just having black in the silhouette, you have texture and lumps of dark matter, as if the light were streaming through a fabric curtain that had lots of knots and bumps in it,” said Suzanne Staggs, director of ACT and Princeton’s Henry DeWolf Smyth Professor of Physics. “The famous blue and yellow CMB image is a snapshot of what the universe was like in a single epoch, about 13 billion years ago, and now this is giving us the information about all the epochs since.”

CREDIT: ACT COLLABORATION

For millennia, humans have been fascinated by the mysteries of the cosmos.

Unlike ancient philosophers imagining the universe’s origins, modern cosmologists use quantitative tools to gain insights into the universe’s evolution and structure. Modern cosmology dates back to the early 20th century, with the development of Albert Einstein’s theory of general relativity.

Now, researchers from the Atacama Cosmology Telescope (ACT) collaboration have created a groundbreaking new image that reveals the most detailed map of dark matter distributed across a quarter of the entire sky, extending deep into the cosmos. What’s more, it confirms Einstein’s theory of how massive structures grow and bend light, over the entire 14-billion-year life span of the universe. 

“We have mapped the invisible dark matter across the sky to the largest distances, and clearly see features of this invisible world that are hundreds of millions of light-years across,” says Blake Sherwin, professor of cosmology at the University of Cambridge, where he leads a group of ACT researchers. “It looks just as our theories predict.”

Despite making up 85% of the universe and influencing its evolution, dark matter has been hard to detect because it doesn’t interact with light or other forms of electromagnetic radiation. As far as we know dark matter only interacts with gravity. 

To track it down, the more than 160 collaborators who built and gathered data from the National Science Foundation’s Atacama Cosmology Telescope in the high Chilean Andes observe light emitted in the aftermath of the Big Bang, when the universe was only 380,000 years old. Cosmologists often refer to this diffuse light that fills our entire universe as the “baby picture of the universe,” but formally it is known as the cosmic microwave background radiation (CMB).

The team tracks how the gravitational pull of large, heavy structures including dark matter warps the CMB on its 14-billion-year journey to us, like how a magnifying glass bends light as it passes through its lens.
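The magnifying-glass picture in the paragraph above can be sketched in a few lines of toy code (an illustration only, not the ACT pipeline): a deflection field sourced by a mass lump remaps where we see features of a background "sky".

```python
import numpy as np

# Toy 1-D illustration (not the ACT pipeline): intervening mass deflects
# light, so the observed "sky" is the background read off at slightly
# shifted positions, the magnifying-glass effect described in the text.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 512)

# A smooth random signal standing in for CMB temperature fluctuations.
background = np.convolve(rng.standard_normal(512), np.ones(16) / 16.0, mode="same")

# Deflection angles peak near a mass concentration at x = 0.5.
deflection = 0.01 * np.exp(-((x - 0.5) ** 2) / 0.005)

# "Lensed" sky: sample the background at the deflected positions.
lensed = np.interp(x + deflection, x, background)

# Far from the mass lump the deflection is negligible, so the sky there
# is essentially unchanged.
print(np.allclose(lensed[:50], background[:50], atol=1e-3))  # True
```

Real analyses run this logic in reverse: from the statistics of the observed (lensed) CMB, they infer the intervening mass map.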

“We’ve made a new mass map using distortions of light left over from the Big Bang,” says Mathew Madhavacheril, assistant professor in the Department of Physics and Astronomy at the University of Pennsylvania. “Remarkably, it provides measurements that show that both the ‘lumpiness’ of the universe, and the rate at which it is growing after 14 billion years of evolution, are just what you’d expect from our standard model of cosmology based on Einstein's theory of gravity.” 

Sherwin adds, “Our results also provide new insights into an ongoing debate some have called ‘The Crisis in Cosmology,’” explaining that this crisis stems from recent measurements that use a different background light, one emitted from stars in galaxies rather than the CMB. These have produced results suggesting the dark matter was not lumpy enough under the standard model of cosmology, leading to concerns that the model may be broken. However, the team’s latest results from ACT precisely established that the vast lumps seen in this image are exactly the right size.

“When I first saw them, our measurements were in such good agreement with the underlying theory that it took me a moment to process the results,” says Cambridge Ph.D. student Frank Qu, part of the research team. “It will be interesting to see how this possible discrepancy between different measurements will be resolved.”

“The CMB lensing data rivals more conventional surveys of the visible light from galaxies in their ability to trace the sum of what is out there,” says Suzanne Staggs, director of ACT and Henry DeWolf Smyth Professor of Physics at Princeton University. “Together, the CMB lensing and the best optical surveys are clarifying the evolution of all the mass in the universe.” 

“When we proposed this experiment in 2003, we had no idea the full extent of information that could be extracted from our telescope,” says Mark Devlin, the Reese Flower Professor of Astronomy at the University of Pennsylvania and the deputy director of ACT. “We owe this to the cleverness of the theorists, the many people who built new instruments to make our telescope more sensitive, and the new analysis techniques our team came up with.”

ACT, which operated for 15 years, was decommissioned in September 2022. Nevertheless, more papers presenting results from the final set of observations are expected to be submitted soon, and the Simons Observatory will conduct future observations at the same site, with a new telescope slated to begin operations in 2024. This new instrument will be capable of mapping the sky almost 10 times faster than ACT.

The Atacama Cosmology Telescope in Northern Chile, supported by the National Science Foundation, operated from 2007 to 2022. The project is led by Princeton University and the University of Pennsylvania (Director Suzanne Staggs at Princeton, Deputy Director Mark Devlin at Penn) with 160 collaborators at 47 institutions.

CREDIT

Mark Devlin, Deputy Director of the Atacama Cosmology Telescope and the Reese Flower Professor of Astronomy at the University of Pennsylvania

Research by the Atacama Cosmology Telescope collaboration has culminated in a groundbreaking new map of dark matter distributed across a quarter of the entire sky, reaching deep into the cosmos. Findings provide further support to Einstein’s theory of general relativity, which has been the foundation of the standard model of cosmology for more than a century, and offer new methods to demystify dark matter.

CREDIT

Lucy Reading-Ikkanda, Simons Foundation

Learn more at https://act.princeton.edu/. This research will be presented at "Future Science with CMB x LSS," a conference running from April 10-14 at Yukawa Institute for Theoretical Physics, Kyoto University. This work was supported by the U.S. National Science Foundation (AST-0408698, AST-0965625 and AST-1440226 for the ACT project, as well as awards PHY-0355328, PHY-0855887 and PHY-1214379), Princeton University, the University of Pennsylvania, and a Canada Foundation for Innovation award. Team members at the University of Cambridge were supported by the European Research Council.


Tuesday, October 24, 2023

Biggest ever supercomputer simulation to investigate universe’s evolution

Nina Massey, PA Science Correspondent
Mon, 23 October 2023 


Astronomers have carried out the biggest ever computer simulations, from the Big Bang to the present day, to investigate how the universe evolved.

The project, dubbed Flamingo, calculated the evolution of all components of the universe – ordinary matter, dark matter and dark energy – according to the laws of physics.

As the simulations progress, virtual galaxies and galaxy clusters emerge in detail.

Facilities such as the Euclid Space Telescope recently launched by the European Space Agency (ESA) and Nasa’s James Webb Space Telescope collect data on galaxies, quasars and stars.

Researchers hope the simulations will allow them to compare the virtual universe with observations of the real thing being captured by new high-powered telescopes.

This could help scientists understand if the standard model of cosmology – used to explain the evolution of the universe – is a good description of reality.

Flamingo research collaborator Professor Carlos Frenk, Ogden Professor of Fundamental Physics, at the Institute for Computational Cosmology, Durham University, said: “Cosmology is at a crossroads.

“We have amazing new data from powerful telescopes some of which do not, at first sight, conform to our theoretical expectations.

“Either the standard model of cosmology is flawed or there are subtle biases in the observational data.

“Our super precise simulations of the universe should be able to tell us the answer.”

Past simulations, which have been compared to observations of the universe, have focused on cold dark matter – believed to be a key component of the structure of the cosmos.

However, astronomers now say that the effects of ordinary matter, which makes up only 16% of all matter in the universe, and of neutrinos, tiny particles that rarely interact with normal matter, also need to be taken into account when trying to understand the universe’s evolution.

Principal investigator Professor Joop Schaye, of Leiden University, said: “Although the dark matter dominates gravity, the contribution of ordinary matter can no longer be neglected since that contribution could be similar to the deviations between the models and the observations.”

Researchers ran simulations at a powerful supercomputer in Durham over the past two years.

The simulations took more than 50 million processor hours on the Cosmology Machine (COSMA 8) supercomputer, hosted by the Institute for Computational Cosmology, Durham University, on behalf of the UK’s DiRAC High-Performance Computing facility.

In order to make the simulations possible, the researchers developed a new code, called SWIFT, which distributes the computational work over thousands of central processing units (CPUs), sometimes as many as 65,000.
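The idea of spreading the work over many CPUs can be sketched with Python's standard library (a toy stand-in for illustration; SWIFT itself is a far more sophisticated task-based simulation code): split a large set of "particles" into chunks and hand each chunk to a separate worker process.

```python
from multiprocessing import Pool
import math

def chunk_work(positions):
    # Stand-in for an expensive per-particle physics calculation.
    return sum(math.sin(p) ** 2 for p in positions)

def total_work(positions, workers=4):
    # Split the particle list into roughly equal chunks, one per task.
    size = max(1, len(positions) // workers)
    chunks = [positions[i:i + size] for i in range(0, len(positions), size)]
    # Each worker process handles its chunks independently; the partial
    # results are then combined, just as partial forces/energies would be.
    with Pool(workers) as pool:
        return sum(pool.map(chunk_work, chunks))

if __name__ == "__main__":
    particles = [0.001 * i for i in range(10_000)]
    parallel = total_work(particles)
    serial = chunk_work(particles)
    print(abs(parallel - serial) < 1e-6)  # both decompositions agree
```

The hard part in a real cosmological code is that gravity couples every particle to every other, so the domain decomposition and inter-process communication are far more intricate than this independent-chunks sketch.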

Flamingo is a project of the Virgo Consortium for cosmological supercomputer simulations. The acronym stands for full-hydro large-scale structure simulations with all-sky mapping for the interpretation of next generation observations.

Funding for the project came from the European Research Council, the UK’s Science and Technology Facilities Council, the Netherlands Organisation for Scientific Research and the Swiss National Science Foundation.

The research is published in the journal Monthly Notices of the Royal Astronomical Society.

Sunday, August 22, 2021

 

“Before the Big Bang” –Vestiges of a Prior Universe? (Weekend Feature)

 

Big Bang

 

“Eliminating the singularity or Big Bang, which has its origins in the late 1920s when US astronomer Edwin Hubble discovered that almost all galaxies are moving away from each other at ever-faster velocities, brings back the bouncing Universe on to the theoretical stage of cosmology. The absence of a singularity at the start of spacetime opens up the possibility that vestiges of a previous contraction phase may have withstood the phase change (between contraction to expansion) and may still be with us in the ongoing expansion of the Universe,” said Brazilian physicist Juliano Cesar Silva Neves.

A Universe Prior to the Big Bang

Although the Big Bang theory has been the best-known and most widely accepted explanation for the beginning and evolution of the Universe for five decades, it is hardly a consensus among scientists. Physicists are now entertaining the possibility of vestiges of a Universe that preceded the Big Bang.

The Big Bounce

“The idea of a ‘big bounce’ has existed in some form or another for decades,” writes Jonas Mureika, theoretical physicist and physics chair at Loyola Marymount University and the Kavli Institute for Theoretical Physics, UC Santa Barbara, in an extraordinary email to The Daily Galaxy. “The basic idea is to avoid the singularity at t=0 (time of the Big Bang) and r=0 (where all matter and energy were compressed into an infinitesimally small volume), since our mathematical description of spacetime (and thus our favorite theories) break down at that point. This is similar to problems in black hole physics, which itself has similar fixes to remove the singularity.”

“The Quantum Fix”

“The crux of the problem,” observes Mureika, “is that our description of this physics is classical, i.e. a prediction of General Relativity, and that’s why singularities arise. The theories just don’t work in that limit. It is most likely the case, however, that the physics governing the realm of classical singularities — extremely small and extremely high energy — is quantum in nature. So, the rules change in some fascinating ways and introducing us to new physics allows this to make sense.” 

“When classical physics breaks down, we look to replace the broken parts with a quantum fix. If the singularity is at r=0, then one of the ways we can avoid this is to not let the physics act at r=0. That is, we impose a minimal length (usually the Planck length, but not always) below which the universe can’t be ‘probed’. That removes the infinites that plague the singularity and allows our theories to behave well. In a ‘big bounce’ scenario, the quantum pressures at this minimum length basically stop the implosion of the universe and allow it to re-expand. Again, similar ideas exist with black holes, called Planck stars.”

 Roger Penrose’s Cyclic Conformal Cosmology (CCC) 

“Another approach is to change our notion of the structure of spacetime itself,” explains Mureika, “and how it behaves in the small and large limits. This is embodied in Nobel Laureate Roger Penrose’s 2012 Cyclic Conformal Cosmology (CCC) framework, in which the very small limit of the universe (singularity) is identical to the very large limit (‘infinite’ accelerated expansion). This is done by a conformal transformation on the spacetime metric (the thing that defines straight lines and shortest distances), which is a fancy way of saying we stretch and bend spacetime while preserving certain geometric features (e.g. angles). We now know the universe is indeed going through a phase of accelerated expansion, so this adds weight to Penrose’s idea and kills previous ones (i.e. the universe doesn’t contract, so it can’t ‘bounce’ in the ways previously thought). This allows for the universe to be ‘reborn’ as it expands indefinitely, so there is a repeating cycle of big bangs. Penrose calls these cycles ‘aeons’.”

“CMB” Fossils

“Of course,” Mureika concludes, “a theory is only as good as its experimental verification, so the challenge is to detect tell-tale fingerprints of these models. The observational go-to for early universe cosmology is the Cosmic Microwave Background (CMB), which represents an imprint of the earliest time we can see. It’s believed that the CMB will contain information from earlier times, including the big bang (if it happened). These will manifest themselves as e.g. geometric signatures, patterns in temperature fluctuations, over/underdensities of clustering, etc. Detecting any such signature would be a monumental discovery, and will help quantum gravity and cosmology research shape their future paths.”

Brian Keating’s Deep Dive

“In contrast to inflation,” observes Brian Keating, Chancellor’s Distinguished Professor of Physics at UC San Diego, author of Losing the Nobel Prize, and host of the INTO THE IMPOSSIBLE Podcast, in an email to The Daily Galaxy, “there are several other possible mechanisms to the singularity featured in most versions of the Big Bang theory. Two of the most prominent alternatives to the singular Big Bang are the Bouncing model of Anna Ijjas and Paul Steinhardt and the Conformal Cyclic Cosmology (CCC) of Sir Roger Penrose. Both of these share the feature that they do not produce so-called ‘primordial B-mode polarization’ patterns, the result of relic gravitational waves produced in most models of cosmic Inflation, which also features a concomitant spacetime singularity. In that sense, both the Bouncing and CCC models are falsifiable, e.g. if current or future B-mode polarization experiments like BICEP Array or the Simons Observatory were to detect and confirm primordial B-modes, these alternatives would be disproven. Many cosmologists find the falsifiability of these models, in contrast to the inflationary Multiverse, strongly appealing.”

“Hawking Points”

“The CCC model also predicts the presence of so-called ‘Hawking Points,’” explains Keating, “regions of concentrated energy caused by coalescing black holes from preceding ‘aeons’ which would, according to Penrose and collaborators, be evidence supporting the cyclic model. Evidence for Hawking points from the ESA’s Planck satellite has already been claimed, but those claims are also disputed by members of the Planck team. Upcoming experiments like BICEP Array and the Simons Observatory will be able to rule out or confirm evidence for Hawking Points, which would be tantamount to evidence for the CCC model.”

 

Mirroring the “Bouncing Universe” model, Neves, in an article published in General Relativity and Gravitation, proposes to eliminate the need for cosmological spacetime singularity and argues that the current expansion phase was preceded by contraction.

Neves is part of a group of researchers who dare to imagine a different origin. In a study published in the journal General Relativity and Gravitation, Neves suggests the elimination of a key aspect of the standard cosmological model: the need for a spacetime singularity known as the Big Bang.

Challenging the Idea that Time had a Beginning

In raising this possibility, Neves challenges the idea that time had a beginning and reintroduces the possibility that the current expansion was preceded by contraction. “I believe the Big Bang never happened,” said the physicist, who works as a researcher at the University of Campinas’s Mathematics, Statistics & Scientific Computation Institute (IMECC-UNICAMP) in Sao Paulo State, Brazil.

For Neves, the fast spacetime expansion stage does not exclude the possibility of a prior contraction phase. Moreover, the switch from contraction to expansion may not have destroyed all traces of the preceding phase.

Introducing the “Scale Factor”

The article, which reflects the work developed under the Thematic Project “Physics and geometry of spacetime”, considers the solutions to the general relativity equations that describe the geometry of the cosmos and then proposes the introduction of a “scale factor” that makes the rate at which the Universe is expanding depend not only on time but also on cosmological scale.

“In order to measure the rate at which the Universe is expanding in standard cosmology, the model in which there’s a Big Bang, a mathematical function is used that depends only on cosmological time,” said Neves, who developed the idea with Alberto Vazques Saa, a professor at IMECC-UNICAMP and the supervisor of Neves’ postdoctoral project, funded by the Sao Paulo Research Foundation (FAPESP).
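The time-only function Neves refers to can be sketched with the textbook matter-dominated solution, a(t) ∝ t^(2/3) (an illustrative choice for this sketch, not Neves’ modified scale factor): the expansion rate is then the Hubble parameter H(t) = (da/dt)/a(t) = 2/(3t).

```python
# Minimal sketch of a standard, time-only scale factor: the textbook
# matter-dominated solution a(t) proportional to t**(2/3). This is an
# illustrative assumption, not Neves' scale-dependent modification.
def scale_factor(t):
    return t ** (2.0 / 3.0)

def hubble_rate(t, dt=1e-6):
    # Central-difference derivative of a(t), divided by a(t).
    da = (scale_factor(t + dt) - scale_factor(t - dt)) / (2.0 * dt)
    return da / scale_factor(t)

t = 10.0
print(abs(hubble_rate(t) - 2.0 / (3.0 * t)) < 1e-6)  # True: matches analytic H(t) = 2/(3t)
```

Neves and Saa’s proposal, by contrast, makes the expansion rate depend on cosmological scale as well as on time, which is what lets them remove the initial singularity.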

With the scale factor, the Big Bang itself, or cosmological singularity, ceases to be a necessary condition for the cosmos to begin universal expansion. A concept from mathematics that expresses indefiniteness, singularity was used by cosmologists to characterize the “primordial cosmological singularity” that happened 13.8 billion years ago, when all the matter and energy from the Universe were compressed into an initial state of infinite density and temperature, where the traditional laws of physics no longer apply.

From the 1940s onward, scientists guided by Einstein’s theory of general relativity constructed a detailed model of the evolution of the Universe since the Big Bang. The model allows three possible outcomes: infinite expansion of the Universe at ever-higher velocities; permanent stagnation of the expansion; or a reversal of the process, a contraction driven by the gravitational attraction exerted by the mass of the Universe, known as the Big Crunch.

Neves explains that “bouncing cosmology” is rooted in the hypothesis that the Big Crunch would give way to an eternal succession of universes, with the extreme conditions of density and temperature triggering a new inversion of the process, giving way to expansion in another bounce.

Black-Hole Fossils

Black holes are the starting point of Neves’ investigations about the “Bouncing Universe”. “Who knows, there may be remains of black holes in the ongoing expansion that date from the prior contraction phase and passed intact through the bottleneck of the bounce,” he said.

Consisting of the imploded core remaining after a giant star explodes, black holes are a kind of cosmic object whose core contracted to form a singularity, a point with infinite density and the strongest gravitational attraction known to exist. Nothing escapes from it, not even light.

According to Neves, a black hole is not defined by singularity, but rather by an event horizon, a membrane that indicates the point of no return from which nothing escapes the inexorable destiny of being swallowed up and destroyed by the singularity.

The Illustris simulation, which visualizes the universe under the standard Big Bang model, is the most ambitious computer simulation yet performed. The calculation tracks the expansion of the universe, the gravitational pull of matter onto itself, the motion of cosmic gas, and the formation of stars and black holes.

 

 

“Outside the event horizon of a regular black hole, there are no major changes, but inside it, the changes are deep-seated. There’s a different spacetime that avoids the formation of a singularity.”

The scale factor formulated by Neves and Saa was inspired by US physicist James Bardeen. In 1968, Bardeen used a mathematical trick to modify the solution to the general relativity equations that describe black holes.

The trick consisted of thinking of the mass of a black hole not as a constant, as had previously been the case, but as a function that depends on the distance to the center of the black hole. With this change, a different black hole, termed a regular black hole, emerged from the solution to the equations. “Regular black holes are permitted, since they don’t violate general relativity. The concept isn’t new and has frequently been revisited in recent decades,” said Neves.

Since the insertion of a mathematical trick into the general relativity equations could prevent the formation of singularities in regular black holes, Neves considered creating a similar artifice to eliminate the singularity in a regular bounce.

In science, as philosopher Karl Popper argued, a theory is worthless if it cannot be verified, however beautiful and inspiring it may be. How do you test the hypothesis of a Big Bang that did not start with a singularity?

“By looking for traces of the events in a contraction phase that may have remained in the ongoing expansion phase. What traces? The candidates include remnants of black holes from a previous phase of universal contraction that may have survived the bounce,” Neves said.

The Daily Galaxy, Maxwell Moe, astrophysicist, NASA Einstein Fellow, University of Arizona, via Brian Keating, Jonas Mureika and the Sao Paulo Research Foundation (FAPESP)

Image credit: Shutterstock License

Saturday, March 16, 2024

BEEN SAYING THIS FOR YEARS

New research suggests that our universe has no dark matter


The current theoretical model for the composition of the universe is that it’s made of ‘normal matter,’ ‘dark energy’ and ‘dark matter.’ A new uOttawa study challenges this




UNIVERSITY OF OTTAWA

New research suggests that our universe has no dark matter 

IMAGE: 

“THE STUDY'S FINDINGS CONFIRM THAT THE UNIVERSE DOES NOT REQUIRE DARK MATTER TO EXIST”

RAJENDRA GUPTA

— PHYSICS PROFESSOR AT THE FACULTY OF SCIENCE, UOTTAWA


CREDIT: UNIVERSITY OF OTTAWA





A University of Ottawa study published today challenges the current model of the universe by showing that, in fact, it has no room for dark matter.

In cosmology, the term “dark matter” describes all that appears not to interact with light or the electromagnetic field, or that can only be explained through gravitational force. We can’t see it, nor do we know what it’s made of, but it helps us understand how galaxies, planets and stars behave.

Rajendra Gupta, a physics professor at the Faculty of Science, used a combination of the covarying coupling constants (CCC) and “tired light” (TL) theories (the CCC+TL model) to reach this conclusion. This model combines two ideas — about how the forces of nature decrease over cosmic time and about light losing energy when it travels a long distance. It’s been tested and has been shown to match up with several observations, such as about how galaxies are spread out and how light from the early universe has evolved.
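The second of those two ideas can be caricatured in a few lines (an assumption made purely for illustration; Gupta’s actual CCC+TL equations are more involved): if a photon loses a fixed fraction of its energy per unit distance, its energy decays exponentially, so redshift grows with distance.

```python
import math

# Caricature of a classic "tired light" relation (illustrative assumption,
# not Gupta's CCC+TL model): exponential energy loss along the path gives
# z = exp(distance / L) - 1 for some characteristic decay length L.
def tired_light_redshift(distance, decay_length):
    return math.exp(distance / decay_length) - 1.0

# Nearby sources are barely redshifted; distant ones strongly so.
print(tired_light_redshift(0.01, 1.0) < tired_light_redshift(2.0, 1.0))  # True
```

In the standard model the same distance-redshift trend is attributed to the expansion of space rather than to energy loss in transit, which is why distinguishing the two requires the kinds of observational tests the article describes.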

This discovery challenges the prevailing understanding of the universe, which holds that roughly 27% of it is composed of dark matter and less than 5% of ordinary matter, with the remainder being dark energy.

Challenging the need for dark matter in the universe

“The study's findings confirm that our previous work (‘JWST early Universe observations and ΛCDM cosmology’) about the age of the universe being 26.7 billion years has allowed us to discover that the universe does not require dark matter to exist,” explains Gupta. “In standard cosmology, the accelerated expansion of the universe is said to be caused by dark energy but is in fact due to the weakening forces of nature as it expands, not due to dark energy.”

“Redshift” refers to light being shifted toward the red part of the spectrum. The researcher analyzed recent data from the literature on the distribution of galaxies at low redshifts and on the angular size of the sound horizon at high redshift.
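The definition invoked above fits in one line: redshift z compares the observed wavelength of light with the wavelength at which it was emitted.

```python
# Redshift definition: z = (observed wavelength - emitted wavelength)
#                          / emitted wavelength
def redshift(lambda_observed, lambda_emitted):
    return (lambda_observed - lambda_emitted) / lambda_emitted

# Hydrogen-alpha light emitted at 656.3 nm but observed at 1312.6 nm
# (wavelength doubled) has redshift z = 1.
print(redshift(1312.6, 656.3))  # 1.0
```

Low-redshift galaxies are relatively nearby and recent; the high-redshift sound horizon probes the universe at much earlier times, which is why combining the two regimes is a meaningful consistency test.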

“There are several papers that question the existence of dark matter, but mine is the first one, to my knowledge, that eliminates its cosmological existence while being consistent with key cosmological observations that we have had time to confirm,” says Gupta.

By challenging the need for dark matter in the universe and providing evidence for a new cosmological model, this study opens up new avenues for exploring the fundamental properties of the universe.

The study, “Testing CCC+TL Cosmology with Observed Baryon Acoustic Oscillation Features,” was published in the peer-reviewed Astrophysical Journal.

 

Tuesday, November 08, 2022

First glimpse of what gravity looks like on cosmological scales

An international team of scientists has reconstructed gravity to find a more robust way of understanding the cosmos

Peer-Reviewed Publication

UNIVERSITY OF PORTSMOUTH

Scientists from around the world have reconstructed the laws of gravity, to help get a more precise picture of the Universe and its constitution.

The standard model of cosmology is based on General Relativity, which describes gravity as the curving or warping of space and time. While the Einstein equations have been proven to work very well in our solar system, they had not been observationally confirmed to work over the entire Universe. 

An international team of cosmologists, including scientists from the University of Portsmouth in England, has now been able to test Einstein's theory of gravity in the outer-reaches of space. 

They did this by examining new observational data from space and ground-based telescopes that measure the expansion of the Universe, as well as the shapes and the distribution of distant galaxies. 

The study, published in Nature Astronomy, explored whether modifying General Relativity could help resolve some of the open problems faced by the standard model of cosmology.  

Professor Kazuya Koyama, from the Institute of Cosmology and Gravitation at the University of Portsmouth, said: “We know the expansion of the universe is accelerating, but for Einstein’s theory to work we need this mysterious cosmological constant.

“Different measurements of the rate of cosmic expansion give us different answers, also known as the Hubble tension. To try and combat this, we altered the relationship between matter and spacetime, and studied how well we can constrain deviations from the prediction of General Relativity. The results were promising, but we’re still a long way off a solution.”

Possible modifications to the equations of General Relativity are encoded in three phenomenological functions describing the expansion of the Universe, the effect of gravity on light, and its effect on matter. Using a statistical method known as Bayesian inference, the team reconstructed the three functions simultaneously for the first time.
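The full reconstruction is far more involved than a short example can show, but the core idea of Bayesian inference, combining data with a prior to obtain a posterior constraint on a parameter, can be sketched in toy form. Here a single deviation parameter mu (mu = 1 recovering General Relativity) is constrained from mock measurements on a grid; the data, noise level, and grid are all invented for illustration:

```python
import numpy as np

# Toy Bayesian constraint on a single deviation parameter mu,
# where mu = 1 recovers General Relativity. All numbers are mock data.
rng = np.random.default_rng(0)
mu_true, sigma_obs = 1.0, 0.05
data = mu_true + sigma_obs * rng.standard_normal(20)   # 20 mock measurements

mu_grid = np.linspace(0.7, 1.3, 601)                   # flat prior on a grid
log_like = np.array([-0.5 * np.sum((data - mu) ** 2) / sigma_obs**2
                     for mu in mu_grid])
weights = np.exp(log_like - log_like.max())
weights /= weights.sum()                               # discrete posterior

mean = np.sum(mu_grid * weights)
std = np.sqrt(np.sum((mu_grid - mean) ** 2 * weights))
print(f"mu = {mean:.3f} +/- {std:.3f}  (GR predicts mu = 1)")
```

In the real analysis the unknowns are three whole functions rather than one number, which is why richer data are needed to pin them down simultaneously.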

“Partial reconstructions of these functions have been done in the last 5 to 10 years, but we didn't have enough data to accurately reconstruct all three at the same time”, added Professor Koyama.

“What we found was that current observations are getting good enough to get a limit on deviations from General Relativity. But at the same time, we find it's very difficult to solve this problem we have in the standard model even by extending our theory of gravity.

“One exciting prospect is that in a few years’ time we’ll have a lot more data from new probes. This means that we will be able to continue improving the limits on modifications to General Relativity using these statistical methods.”

Upcoming missions will deliver a highly accurate 3D map of the clustered matter in the Universe, which cosmologists call large-scale structure. These will offer unprecedented insight into gravity at large distances. 

Professor Levon Pogosian, from Simon Fraser University in Canada, said: “As the era of precision cosmology is unfolding, we are on the brink of learning about gravity on cosmological scales with high precision. Current data already draws an interesting picture, which, if confirmed with higher constraining power, could pave the way to resolving some of the open challenges in cosmology.”

Monday, August 23, 2021

BOOKS ET AL. | HISTORY OF PHYSICS

Before the Big Bang became scientific dogma


Simon Mitton
Flashes of Creation: George Gamow, Fred Hoyle, and the Great Big Bang Debate. Paul Halpern. Basic Books, 2021. 304 pp.

Science 20 Aug 2021:
Vol. 373, Issue 6557, pp. 861
DOI: 10.1126/science.abj9479




The serendipitous detection of the cosmic microwave background radiation in 1964 changed cosmology forever, settling a long-running debate about the origin of the Universe. The radio hiss hinted that the Universe had arisen from an instantaneous fiery beginning, a theory championed by cosmologist George Gamow, who sought to account for the origin of the chemical elements. His rival, Fred Hoyle, who developed the alternative steady-state theory, which posited an infinite Universe, had long insisted that the chemical elements formed continuously in the cores of massive stars. Both cosmic models were falsifiable by solving a simple puzzle: Has the Universe evolved? The feeble whisper detected in 1964 was an undeniable “Yes!”

In Flashes of Creation, Paul Halpern presents a scintillating account of the intellectual travails of Gamow and Hoyle, two animated, curious, provocative, and controversial figures in 20th-century physics. In this joint biography, the reader is introduced to the two physicists' theories and their efforts to explain the origin of elements.

Gamow, we learn, first encountered cosmology in the early 1920s while studying at the University of Leningrad under Alexander Friedmann, the Russian mathematician who pioneered the idea that the Universe is expanding. In Göttingen and Copenhagen, while a doctoral student in physics, he mingled with pioneers who were working on the new quantum theory. These interactions enabled his breakthrough in 1928, when he showed how an alpha particle could escape from an atomic nucleus by quantum tunneling.

Gamow's subsequent realization that quantum tunneling is reversible spurred two colleagues, Robert Atkinson and Fritz Houtermans, to demonstrate that sufficiently energetic protons could penetrate the atomic nuclei often enough to account for the source of stellar energy. Physicist Hans Bethe made the next advance, finding that proton-proton collisions in the cores of stars like the Sun fuse hydrogen to helium. For more-massive stars, he suggested a cycle of nuclear reactions in which carbon, nitrogen, and oxygen catalyze hydrogen to helium. This scheme left open the question that Gamow and Hoyle would confront head on: How did the elements from carbon to uranium come into existence?
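The energy budget behind Bethe's mechanism comes from the mass defect of fusion: four hydrogen atoms are slightly heavier than the helium-4 atom they become, and the difference, via E = mc², powers the star. A back-of-envelope check using standard atomic masses (not figures from the review):

```python
# Mass defect of hydrogen fusion (4 H -> He-4), the stellar energy source
# Bethe identified. Masses in atomic mass units; back-of-envelope only.
m_H = 1.007825   # mass of a hydrogen atom (u)
m_He = 4.002602  # mass of a helium-4 atom (u)

defect = 4 * m_H - m_He          # mass lost in the reaction
fraction = defect / (4 * m_H)    # fraction of fuel mass converted to energy
print(f"mass converted to energy: {fraction:.4%}")  # ~0.71% of the fuel mass
```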

Hoyle entered the Cavendish Laboratory at the University of Cambridge in 1936 as a doctoral student supervised by Rudolf Peierls. As academics, including Peierls, later fled the Cavendish Laboratory to professorships elsewhere, Hoyle remained at Cambridge until the war years, working alone on extending Enrico Fermi's theory of beta decay. By peacetime, he had developed the steady-state theory and witheringly dismissed Gamow's cosmology as a mere “big bang.”


Fred Hoyle (left) and George Gamow disagreed about the origins of the Universe.
PHOTOS (LEFT TO RIGHT): A. BARRINGTON BROWN/SCIENCE SOURCE; GRANGER

Hoyle could perceive no merit in Gamow's notion that the elements were created in a flash by the eruption of a primeval atom—it violated the conservation laws of physics. His ageless steady-state approach envisaged that new matter trickled continuously into the empty space left by the expansion of the Universe. The buildup of chemical elements then arose as a consequence of the evolution of massive stars, he postulated. When the hydrogen fuel in a star's core became exhausted, it would implode gravitationally, thereby sparking the physical conditions conducive to the rapid assembly of heavier elements.

The Gamowian school had considered the role of neutrinos in core collapse, but the model Hoyle advanced in 1946 as a powerful rebuttal was vastly more efficient at building heavy elements. By 1957, Hoyle's team had completed its brilliant synthesis of element building via neutron-capture reactions. However, steady-state theory came under relentless attack as report after report by observational astronomers cemented Big Bang cosmology.

In 1964, Hoyle reluctantly conceded that “a small residue of Gamow's idea”—the synthesis of light elements in the Big Bang—had merit. Within months, news broke of the discovery of the cosmic microwave background. Hoyle never accepted this as evidence that “the entire cosmos had a start date.” By contrast, Gamow opportunistically seized the moment, claiming primary credit for a neglected prediction of the background temperature made in 1948 by his associates Ralph Alpher and Robert Herman.

In the book's closing pages, Halpern sensitively handles with commendable candor the tragic endgames of these two giants. Gamow's alcoholism, we learn, destroyed him and much of his reputation. And while Hoyle commanded great respect after resigning from Cambridge in 1972, his little tweaks to steady-state cosmology failed to find a following.

Gamow and Hoyle were friendly rivals who seldom interacted in person. Halpern nonetheless renders their contributions and clashes vividly in this expertly crafted biography of two contentious cosmologists who thrived on ingenious invention.

Tuesday, October 26, 2021

Black hole thermodynamics: a history from Penrose to Hawking

New research explores the historical context of Penrose’s theory of black hole energy extraction, and how it inspired collaborations across political boundaries: ultimately leading to Stephen Hawking’s celebrated theory of black hole radiation.

Peer-Reviewed Publication

SPRINGER

In 1969, the English physicist Roger Penrose discovered a property which would later allow for a long-awaited link between thermodynamics and the far stranger mechanics of black holes. Through new analysis published in EPJ H, Carla Rodrigues Almeida, based at the University of São Paulo, Brazil, sheds new light on Penrose’s motivations and methods, and explores their historical influence on the groundbreaking discovery of Hawking radiation.

Prior to the 1950s, many physicists were reluctant to accept the idea that black holes are physical objects consistent with the well-established laws of thermodynamics. This picture transformed entirely over the next two decades, and in 1969 Penrose showed for the first time how energy can be extracted from a rotating black hole. His theory hinged on a newly conceived region named the ‘ergosphere.’

Although it lies just outside the boundary of a black hole, spacetime within the ergosphere rotates along with the body, like the gas in a planet’s atmosphere. Penrose proposed that a piece of matter entering this region may split into two parts: one falls into the black hole, while the other escapes, carrying away more energy than the original particle brought in.
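The standard quantitative statement of this idea (a textbook result, not derived in the article itself) is the irreducible mass: a Kerr black hole of mass M and dimensionless spin a* = J/M² has M_irr² = M²(1 + √(1 − a*²))/2, and only the rotational energy above M_irr can be extracted by a Penrose-type process. A small sketch:

```python
import math

def extractable_fraction(a_star):
    """Fraction of a Kerr black hole's mass-energy that is rotational and
    in principle extractable (Penrose), from the irreducible mass:
    M_irr^2 = M^2 (1 + sqrt(1 - a*^2)) / 2, with spin a* = J/M^2 in [0, 1]."""
    m_irr_over_m = math.sqrt((1 + math.sqrt(1 - a_star**2)) / 2)
    return 1 - m_irr_over_m

print(f"a* = 0.5 : {extractable_fraction(0.5):.1%} extractable")
print(f"a* = 1.0 : {extractable_fraction(1.0):.1%} extractable")  # ~29%
```

For a maximally spinning hole (a* = 1), up to 1 − 1/√2 ≈ 29% of the mass-energy is in principle extractable; for a non-rotating hole, none is.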

Over the next few years, the Soviet physicist Yakov Zel’dovich explored Penrose’s discovery through the lens of quantum mechanics. Although his work was held back by political circumstances, Zel’dovich established friendly collaborations with Western physicists. Ultimately, the theories that emerged through these relationships led to Stephen Hawking’s discovery of novel quantum effects that cause black holes to radiate mass. Finally, the physics community was convinced that black holes can indeed obey the laws of thermodynamics.

In her study, Almeida investigates Penrose’s proposal within this historical context. By revisiting original papers, analysing technical details, and exploring the relationships between Western and Soviet physicists, she aims to uncover the history behind them. The article traces the chain of reasoning that led from Penrose’s proposal to an analogy between thermodynamics and black hole physics, and ultimately to the formulation of Hawking radiation.

 

Reference

C. R. Almeida, “The thermodynamics of black holes: from Penrose process to Hawking radiation.” EPJ H 46, 20 (2021). https://doi.org/10.1140/epjh/s13129-021-00022-9