Sunday, August 22, 2021

CRIMINAL CAPITALI$M 

U.S. authorities freeze assets, charge British Columbians in 'long-running fraudulent scheme'

Major Canadian player in Panama Papers scandal, former B.C. lawyer Fred Sharp, is charged by SEC, FBI

Fred Sharp, a former West Vancouver lawyer, is facing charges from the U.S. Justice Department and the Securities and Exchange Commission as the alleged 'mastermind' of a pump-and-dump investment scheme. (Sharp Art Pictures/YouTube)

Five years after he was named as a major Canadian player in the "Panama Papers" offshore finance scandal, a former West Vancouver lawyer is now facing prosecution in the United States and has had his financial assets frozen by the U.S. Securities and Exchange Commission.

Fred Sharp, who nicknamed himself "Bond" after the film spy, according to court filings, is one of six British Columbians charged by the regulator with "violating anti-fraud provisions" of the U.S. Securities Act.

Sharp and two of his fellow B.C. defendants are also facing conspiracy and securities fraud charges by the U.S. Justice Department.

"Sharp masterminded ... long-running fraudulent schemes that collectively generated hundreds of millions of dollars from unlawful stock sales" between 2011 and 2018, the SEC said on Aug. 12.

Also facing SEC charges are six of Sharp's alleged "associates": Zhiying Gasarch of Richmond, B.C.; Surrey resident Courtney Kelln; North Vancouver's Mike Veldhuis; Jackson Friesen of Delta; Paul Sexton of Anmore, B.C.; and Avtar Singh Dhillon, a Canadian living in California.

The U.S. stock market regulator alleges the co-accused "frequently collaborated with Sharp to dump huge stock positions while hiding their control positions."

WATCH | Panama Papers reveal middlemen between Canada and offshore secrets:


The SEC's move comes after the U.S. Justice Department criminally charged Sharp with securities fraud and conspiracy on Aug. 5 — charges also faced by fellow British Columbians Veldhuis and Kelln.

None of the criminal or securities charges have been proven in court.

"Sharp and his co-conspirators are accused of executing a sophisticated, global con that allegedly bilked unsuspecting investors out of tens of millions of dollars," said Joseph Bonavolonta, the FBI Boston's special agent in charge, who helped lead the criminal investigation. "We will do everything we can to hold accountable those who steal from American investors."


In 2016, CBC News was one of only two news outlets in Canada with complete access to all of the financial documents leaked from the Panama-based law firm Mossack Fonseca, part of a massive international investigative journalism collaboration known as the "Panama Papers."

Sharp's company, Corporate House, was known as the go-to investment firm for wealthy Canadians who wanted to keep assets private and use offshore tax havens to minimize their tax burden, sources in the wealth management industry told CBC News.

He created more than 1,200 corporate entities linked to Canada, making him the most significant Canadian player in the Panama Papers trove. According to documents obtained by CBC News at the time, the Sharp-associated companies were instructed not to send any account statements or invoices by mail, email or fax, but to destroy them.

"Printed invoices or statement of accts [sic] should be destroyed," the instructions stated.

At the time of the Panama Papers release, Sharp told CBC News in an email, "Tax planning is a global reality that results from international competition and inefficient governmental regulation ... and is legal."

WATCH | Release of Panama Papers documents:


A "pump-and-dump" investment scheme is one in which companies' stock prices are artificially inflated, allowing some shareholders to sell their stock at artificially high prices to other investors, according to the FBI.

Sharp "used the code name 'Bond' (styling himself after the fictional character James Bond)," the Securities and Exchange Commission said in court documents, and "was the mastermind and leader" of a company whose clients included "individuals seeking fraudulently to sell stock in the markets to retail investors — and with various offshore trading platforms."

Another James Bond connection appeared in Sharp's company accounting system, which the SEC said he named after fictional senior spy "Q" in the Bond franchise. The role of "Q," according to the court filings, was "to conceal and obscure the actual ownership and control of the stock they were surreptitiously selling."

Sharp is a former Vancouver lawyer who was suspended in 1995 by the Law Society of British Columbia, which ruled he "knowingly took instructions from a person who was ... disqualified from acting as an officer or director of a public company because of a criminal record for fraud," and for delivering a client's nearly $500,000 cheque "knowing that [the client] had no funds in its bank account available."

In one Sharp correspondence quoted in the SEC's court documents, his company's services were "not limited to trading" but also included payments, loans and "keeping clients out of jail."

If convicted of securities fraud, Sharp could face up to 20 years in prison, plus five years for the conspiracy charges, according to the U.S. Justice Department.

High-efficiency masks up to six times better at filtering aerosols than cloth, surgical masks: Canadian study


Tom Yun
CTVNews.ca writer
Published Sunday, August 22, 2021 

TORONTO -- A recent study from engineering researchers in Ontario has found that high-efficiency masks are up to six times better at filtering aerosols compared to more commonly used cloth and surgical masks.

Researchers at the University of Waterloo looked into how effective different types of masks are at filtering out aerosol particles, which are solid or liquid particles approximately 0.001 millimetres in diameter and suspended in the air. They published their findings in the journal Physics of Fluids on July 21.

The team put masks over a CPR mannequin that could simulate a person's breathing and exhale aerosol droplets, which were made using olive oil, and measured the amount of aerosols that built up in a large, unventilated room. The measurements were taken from two metres away, the Public Health Agency of Canada (PHAC)'s recommended distance for physical distancing.


The researchers found that R95 masks were able to filter out 60 per cent of exhaled aerosols, and KN95 masks could filter out 46 per cent.

On the other hand, cloth masks and surgical masks only filtered out 10 per cent and 12 per cent of exhaled aerosols, respectively.
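
Taken at face value, those percentages are where the "up to six times" figure in the headline comes from. The short Python sketch below simply recomputes the ratios from the numbers quoted above; nothing beyond those figures is assumed.

```python
# Ratio check of the filtration figures quoted above (fractions of exhaled aerosols removed).
filtration = {"R95": 0.60, "KN95": 0.46, "surgical": 0.12, "cloth": 0.10}

best = filtration["R95"]
for mask in ("surgical", "cloth"):
    print(f"R95 vs {mask}: {best / filtration[mask]:.1f}x more aerosol filtered")
# R95 vs surgical: 5.0x; R95 vs cloth: 6.0x -- the "up to six times" in the headline.
```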

“There is no question it is beneficial to wear any face covering, both for protection in close proximity and at a distance in a room,” said lead author Serhiy Yarusevych in a press release​. “However, there is a very serious difference in the effectiveness of different masks when it comes to controlling aerosols.”

Researchers say cloth and surgical masks are prone to air leakage at the top of the mask, where the mask meets the bridge of the nose.

In the early days of the pandemic, health officials in Canada and around the world said that the SARS-CoV-2 virus was thought to primarily spread through droplets created when people cough, sneeze or talk. But in November, PHAC updated its advice to acknowledge aerosol transmission of the virus.

A growing number of scientists and doctors also believe that aerosols are the most dominant mode of transmission of COVID-19. A paper published in The Lancet in April pointed to long-range COVID-19 spread in quarantine hotels as well as air samples in hospitals that contained viable SARS-CoV-2.

Yarusevych's team also examined ventilation and found that even modest ventilation can significantly reduce the amount of aerosols.

When the researchers conducted the experiment with an unmasked mannequin and a room that had a ventilation rate of 1.7 room volumes per hour, the ventilation managed to eliminate 69 per cent of aerosols. At 3.2 room volumes per hour, 84 per cent of aerosols were gone.
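
The study's own ventilation analysis is not reproduced here, but the familiar well-mixed-room approximation gives a sense of why even modest air exchange helps: once emission stops, aerosol concentration decays roughly exponentially with the air-change rate. The sketch below is that textbook approximation under an assumed one-hour window, not the researchers' model, so its numbers will not match the measured 69 and 84 per cent exactly.

```python
import math

def fraction_removed(ach: float, hours: float = 1.0) -> float:
    """Fraction of aerosol removed by ventilation alone in an idealized,
    perfectly mixed room: C(t) = C0 * exp(-ACH * t). This is a textbook
    approximation, not the model used in the Waterloo study."""
    return 1.0 - math.exp(-ach * hours)

for ach in (1.7, 3.2):  # room volumes per hour, as quoted above
    print(f"{ach} air changes/hour over 1 h removes ~{fraction_removed(ach):.0%} of aerosols")
# ~82% and ~96% under this idealization, versus the 69% and 84% the study measured.
```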

Yarusevych says his data shows that wherever possible, high-efficiency masks should be paired with proper ventilation in indoor settings such as schools and workplaces.

“A lot of this may seem like common sense,” he said. “There is a reason, for instance, that medical practitioners wear N95 masks – they work much better. The novelty here is that we have provided solid numbers and rigorous analysis to support that assumption.”​



A man holds a 3M N95 respirator before an announcement at a facility in Brockville, Ont., Friday, Aug. 21, 2020. THE CANADIAN PRESS/Adrian Wyld
#FOREVERCHEMICALS
Are we at risk from wearing clothing with detectable amounts of PFASs or phthalates?

PFASs might be present in clothing, but the presence of a chemical cannot be equated to the presence of risk!


Joe Schwarcz PhD |
MCGILL UNIVERSITY
 20 Aug 2021
Health


Much ink has recently been spilled about our environment, and potentially our bodies, being contaminated by some of the estimated 60,000 chemicals being industrially produced today. That ink itself contains the likes of perfluoroalkyl substances (PFAS) and phthalates, chemicals of concern because of their hormone disruptive properties, and in the case of PFAS, also because of their environmental persistence. That has earned the latter the nickname “forever chemicals.” Both these classes of substances are found in numerous consumer products. Phthalates are added to some plastics, including ones that are spun into fibres to make them soft and pliable, whereas PFASs have both water and oil repellant properties. It therefore comes as no surprise that these chemicals can be detected when some fabrics are subjected to chemical analysis, mostly at the parts per million level or less.

The first take-away is that what the results of such an analysis really demonstrate is the astounding capability of modern instrumentation to detect vanishingly small amounts of substances. The second point is that the presence of a chemical cannot be equated to the presence of risk! Evaluation of risk is a very complex affair and often comes down to making an educated guess given that the relevant experiment cannot be ethically, logistically or economically performed.

For example, when it comes to perfluoroalkyl substances (PFAS), the definitive experiment would involve exposing a group of subjects to varying amounts of these substances and comparing the findings to those of a control group with no exposure. The experimental and control groups would have to be followed for decades, since we are talking about chronic rather than acute effects. To add to the complexity, it should be noted that there are thousands of different PFASs in use, and they can have totally different toxicological profiles. So which one, or which mixture, would be tested? Clearly such an interventional trial cannot be carried out. We are therefore left with human epidemiological studies and laboratory experiments using animals or cells in test tubes. But people are not rats and the body is not some giant test tube, so the relevance of laboratory data to humans is hard to determine.

As far as epidemiological data go, these are clearly relevant, but generally come from large-scale exposure. For example, detrimental effects of PFAS have been noted in people living around DuPont’s Parkersburg, West Virginia, plant, where significantly elevated amounts of PFAS have been found in the water due to improper disposal of manufacturing chemicals. However, this may not mean much for the general population elsewhere with exposure to trace amounts. As we well know, only the dose makes the poison. The same arguments can be made for phthalates, bisphenol A, PCBs, dioxins, furans or the hundreds of other chemicals with potential toxicity that may be encountered in the environment.

All of this is to say that we cannot come to a conclusion about the risk, if any, posed by the tiny amounts of PFASs, phthalates or inorganics detected in fabrics. My guess is that absorption of any of these into the bloodstream would be inconsequential. That, though, may not be the case for cosmetics that are formulated with PFASs such as lipstick, foundations, or mascara. But the biggest concern about exposure to PFAS and phthalates is through food and water. How do these chemicals end up there? Leaching out from discarded items, for one. While wearing fabrics with PFAS may not be an issue, when large numbers of these are discarded, some PFASs end up in the water supply and from there in food. Since PFAS are also used in food packaging, migration into the contents is possible. Then there is the possibility of inadvertent release during manufacture of these chemicals.

The bottom line is that while PFAS in fabrics may pose no risk to the wearer, banning their inclusion in such items means fewer of these chemicals will be produced, and population exposure will be reduced. However, making any recommendation to the public about favouring specific clothing items based on the trivial amounts of PFAS or phthalates found cannot be justified scientifically.

READ

“Different From All Currently Known Life?” –Darwin’s Extraterrestrials (Weekend Feature)

extraterrestrial life

 

“By now it has become a common futurist prediction and science fiction plot device that intelligent and sentient life forms can be created which are not biochemical in nature and are thus fundamentally different from all currently known life,” distinguished Princeton astrophysicist Edwin Turner wrote in an email to The Daily Galaxy.  “Whether or not this would actually be possible,” he explains, “depends on the nature and origin of consciousness, a topic about which we have little more than entertaining whistling-in-the-dark guesses at this point and no clear path toward obtaining any better understanding of this deep mystery.”

Aliens Shaped by Natural Selection

In a landmark 2017 study published in the International Journal of Astrobiology, scientists from the University of Oxford showed that aliens are potentially shaped by the same processes and mechanisms that shaped humans, such as natural selection, and, like us, evolve to become fitter and stronger over time.

Only One Known Sample in the Universe

“A fundamental task for astrobiologists is thinking about what extraterrestrial life might be like,” said Sam Levin, a researcher in Oxford’s Department of Zoology. “But making predictions about aliens is hard,” he noted. “We only have one example of life – life on Earth – to extrapolate from. Past approaches in the field of astrobiology have been largely mechanistic, taking what we see on Earth, and what we know about chemistry, geology, and physics to make predictions about aliens.”

Complexity’s Arrow

By predicting that aliens have undergone major transitions – which is how complexity has arisen in species on Earth – we can say that there is a level of predictability to evolution that would cause them to look like us.

“In our paper,” said Levin, “we offer an alternative approach, which is to use evolutionary theory to make predictions that are independent of Earth’s details. This is a useful approach, because theoretical predictions will apply to aliens that are silicon based, do not have DNA, and breathe nitrogen, for example.”

 Alien Natural Selection

Using this idea of alien natural selection as a framework, the team addressed extra-terrestrial evolution, and how complexity will arise in space.

Species complexity has increased on the Earth as a result of a handful of events, known as major transitions. These transitions occur when a group of separate organisms evolve into a higher-level organism – when cells become multi-cellular organisms, for example. Both theory and empirical data suggest that extreme conditions are required for major transitions to occur.

The paper also makes specific predictions about the biological make-up of complex aliens, and offers a degree of insight as to what they might look like.

Mirror Sapiens?

“We still can’t say whether aliens will walk on two legs or have big green eyes,” said Levin. “But we believe evolutionary theory offers a unique additional tool for trying to understand what aliens will be like, and we have shown some examples of the kinds of strong predictions we can make with it.”


“Like humans, we predict that they are made up of a hierarchy of entities, which all cooperate to produce an alien. At each level of the organism there will be mechanisms in place to eliminate conflict, maintain cooperation, and keep the organism functioning. We can even offer some examples of what these mechanisms will be.

“There are potentially hundreds of thousands of habitable planets in our galaxy alone. We can’t say whether or not we’re alone on Earth, but we have taken a small step forward in answering, if we’re not alone, what our neighbors are like.”

 Rare Evolutionary Transitions

In subsequent 2019 research, scientists from the University of Oxford created a statistical model showing that the chances of intelligent life existing elsewhere in the Universe are slim. “It’s still unknown,” observes mathematical biologist Michael Bonsall, “how abundant extraterrestrial life is, or whether such life might be intelligent. On Earth, numerous evolutionary transitions were needed for complex intelligent life to emerge, and this occurring relatively late in Earth’s lifetime is thought to be evidence for a handful of rare evolutionary transitions.”

“In addition to the evolutionary transition,” writes Bonsall, “the emergence of intelligent life also requires a set of cosmological factors to be in place. These include whether the planet is in the right place for water to be present, whether life emerges from the water and whether the planet itself is habitable. Most crucial in this is the lifetime of the star the planet is orbiting; if this lifetime is short in comparison to the necessary evolutionary transitions for life, then intelligent observers might never get a chance to emerge (often referred to as the Great Filter).”

The Daily Galaxy, Avi Shporer, Research Scientist, MIT Kavli Institute for Astrophysics and Space Research via Oxford University and Edwin Turner. Avi was formerly a NASA Sagan Fellow at the Jet Propulsion Laboratory (JPL).

 

“Before the Big Bang” –Vestiges of a Prior Universe? (Weekend Feature)

 

Big Bang

 

“Eliminating the singularity or Big Bang – which has its origins in the late 1920s when US astronomer Edwin Hubble discovered that almost all galaxies are moving away from each other at ever-faster velocities – brings back the bouncing Universe on to the theoretical stage of cosmology. The absence of a singularity at the start of spacetime opens up the possibility that vestiges of a previous contraction phase may have withstood the phase change (between contraction to expansion) and may still be with us in the ongoing expansion of the Universe,” said Brazilian physicist Juliano Cesar Silva Neves.

A Universe Prior to the Big Bang

Although for five decades, the Big Bang theory has been the best known and most accepted explanation for the beginning and evolution of the Universe, it is hardly a consensus among scientists. Physicists are now assuming the possibility of vestiges of a Universe previous to the Big Bang. 

The Big Bounce

“The idea of a “big bounce” has existed in some form or another for decades,” writes Jonas Mureika, theoretical physicist and physics chair at Loyola Marymount University and the Kavli Institute for Theoretical Physics, UC Santa Barbara, in an extraordinary email to The Daily Galaxy. “The basic idea is to avoid the singularity at t=0 (time of the Big Bang) and r=0 (where all matter and energy were compressed into an infinitesimally small volume), since our mathematical description of spacetime (and thus our favorite theories) break down at that point. This is similar to problems in black hole physics, which itself has similar fixes to remove the singularity.”

“The Quantum Fix”

“The crux of the problem,” observes Mureika, “is that our description of this physics is classical, i.e. a prediction of General Relativity, and that’s why singularities arise. The theories just don’t work in that limit. It is most likely the case, however, that the physics governing the realm of classical singularities — extremely small and extremely high energy — is quantum in nature. So, the rules change in some fascinating ways and introducing us to new physics allows this to make sense.” 

“When classical physics breaks down, we look to replace the broken parts with a quantum fix. If the singularity is at r=0, then one of the ways we can avoid this is to not let the physics act at r=0. That is, we impose a minimal length (usually the Planck length, but not always) below which the universe can’t be ‘probed’. That removes the infinites that plague the singularity and allows our theories to behave well. In a ‘big bounce’ scenario, the quantum pressures at this minimum length basically stop the implosion of the universe and allow it to re-expand. Again, similar ideas exist with black holes, called Planck stars.”
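
To make the bounce idea concrete, here is a generic toy form of the scale factor often used to illustrate non-singular cosmologies. It is an illustrative sketch only, not Mureika's, Penrose's or Neves's specific model, and a_min and τ are free parameters of the illustration:

```latex
% Toy bounce: the scale factor never reaches zero, so there is no t = 0 singularity.
a(t) = a_{\min}\sqrt{1 + \left(\tfrac{t}{\tau}\right)^{2}},
\qquad a(0) = a_{\min} > 0,\quad \dot{a}(0) = 0,\quad \ddot{a}(0) = \tfrac{a_{\min}}{\tau^{2}} > 0 .
```

In this picture, contraction (t < 0) passes smoothly through a minimum size a_min and turns into expansion (t > 0), which is the "quantum pressure stops the implosion" scenario described above.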

 Roger Penrose’s Cyclic Conformal Cosmology (CCC) 

“Another approach is to change our notion of the structure of spacetime itself,” explains Mureika, “and how it behaves in the small and large limits. This is embodied in Nobel Laureate Roger Penrose’s 2012 Cyclic Conformal Cosmology (CCC) framework, in which the very small limit of the universe (singularity) is identical to the very large limit (‘infinite’ accelerated expansion). This is done by a conformal transformation on the spacetime metric (the thing that defines straight lines and shortest distances), which is a fancy way of saying we stretch and bend spacetime while preserving certain geometric features (e.g. angles). We now know the universe is indeed going through a phase of accelerated expansion, so this adds weight to Penrose’s idea and kills previous ones (i.e. the universe doesn’t contract, so it can’t ‘bounce’ in the ways previously thought). This allows for the universe to be ‘reborn’ as it expands indefinitely, so there is a repeating cycle of big bangs. Penrose calls these cycles ‘aeons’.”

“CMB” Fossils

“Of course,” Mureika concludes, “a theory is only as good as its experimental verification, so the challenge is to detect tell-tale fingerprints of these models. The observational go-to for early universe cosmology is the Cosmic Microwave Background (CMB), which represents an imprint of the earliest time we can see. It’s believed that the CMB will contain information from earlier times, including the big bang (if it happened). These will manifest themselves as e.g. geometric signatures, patterns in temperature fluctuations, over/underdensities of clustering, etc. Detecting any such signature would be a monumental discovery, and will help quantum gravity and cosmology research shape their future paths.”

Brian Keating’s Deep Dive

“In contrast to inflation,” observes Brian Keating, Chancellor’s Distinguished Professor of Physics at UC San Diego, author of Losing the Nobel Prize, and host of the INTO THE IMPOSSIBLE Podcast in an email to The Daily Galaxy,  “there are several other possible mechanisms to the singularity featured in most versions of the Big Bang theory. Two of the most prominent alternatives to the singular Big Bang are the Bouncing model of Anna Ijjas and Paul Steinhardt and the Conformal Cyclic Cosmology (CCC) of Sir Roger Penrose. Both of these share the feature that they do not produce so-called ‘primordial B-mode polarization’ patterns, the result of relic gravitational waves produced in most models of cosmic Inflation, which also features a concomitant spacetime singularity. In that sense, both the Bouncing and CCC models are falsifiable, e.g. if  current or  future B-mode polarization experiments like BICEP Array or the Simons Observatory were to detect and confirm primordial B-modes, these alternatives would be disproven. Many cosmologists find the falsifiability of these models, in contrast to the inflationary Multiverse, strongly appealing.”

“Hawking Points”

“The CCC model also predicts the presence of so-called ‘Hawking Points’,” explains Keating, “regions of concentrated energy caused by the coalescing black holes from preceding ‘aeons’ which would, according to Penrose and collaborators, be evidence supporting the cyclic model. Evidence for Hawking points from the ESA’s Planck satellite has been claimed already. But those claims are also disputed by members of the Planck team. Upcoming experiments like BICEP Array and the Simons Observatory will be able to rule out or confirm evidence for Hawking Points, which would be tantamount to evidence for the CCC model.”

 

Mirroring the “Bouncing Universe” model, Neves, in an article published in General Relativity and Gravitation, proposes to eliminate the need for a cosmological spacetime singularity and argues that the current expansion phase was preceded by contraction.

Neves is part of a group of researchers who dare to imagine a different origin. In a study published in the journal General Relativity and Gravitation, Neves suggests the elimination of a key aspect of the standard cosmological model: the need for a spacetime singularity known as the Big Bang.

Challenging the Idea that Time had a Beginning

In raising this possibility, Neves challenges the idea that time had a beginning and reintroduces the possibility that the current expansion was preceded by contraction. “I believe the Big Bang never happened,” said the physicist, who works as a researcher at the University of Campinas’s Mathematics, Statistics & Scientific Computation Institute (IMECC-UNICAMP) in Sao Paulo State, Brazil.

For Neves, the fast spacetime expansion stage does not exclude the possibility of a prior contraction phase. Moreover, the switch from contraction to expansion may not have destroyed all traces of the preceding phase.

Introducing the “Scale Factor”

The article, which reflects the work developed under the Thematic Project “Physics and geometry of spacetime”, considers the solutions to the general relativity equations that describe the geometry of the cosmos and then proposes the introduction of a “scale factor” that makes the rate at which the Universe is expanding depend not only on time but also on cosmological scale.

“In order to measure the rate at which the Universe is expanding with the standard cosmology, the model in which there’s a Big Bang, a mathematical function is used that depends only on cosmological time,” said Neves, who elaborated the idea with Alberto Vazques Saa, a Professor at IMECC-UNICAMP and also the supervisor for Neves’ postdoctoral project, funded by the Sao Paulo Research Foundation – FAPESP.

With the scale factor, the Big Bang itself, or cosmological singularity, ceases to be a necessary condition for the cosmos to begin universal expansion. A concept from mathematics that expresses indefiniteness, singularity was used by cosmologists to characterize the “primordial cosmological singularity” that happened 13.8 billion years ago, when all the matter and energy from the Universe were compressed into an initial state of infinite density and temperature, where the traditional laws of physics no longer apply.

From the 1940s onward, scientists guided by Einstein’s theory of general relativity constructed a detailed model of the evolution of the Universe since the Big Bang. Such a model could lead to three possible outcomes: the infinite expansion of the Universe at ever-higher velocities; the permanent stagnation of the Universe’s expansion; or an inverted process of retraction caused by the gravitational attraction exerted by the mass of the Universe, known as the Big Crunch.

Neves conceptualizes “bouncing cosmology” as rooted in the hypothesis that the Big Crunch would give way to an eternal succession of universes, with extreme conditions of density and temperature instigating a new inversion in the process and giving way to expansion in another bounce.

Black-Hole Fossils

Black holes are the starting point of Neves’ investigations about the “Bouncing Universe”. “Who knows, there may be remains of black holes in the ongoing expansion that date from the prior contraction phase and passed intact through the bottleneck of the bounce,” he said.

Consisting of the imploded core remaining after a giant star explodes, black holes are a kind of cosmic object whose core contracted to form a singularity, a point with infinite density and the strongest gravitational attraction known to exist. Nothing escapes from it, not even light.

According to Neves, a black hole is not defined by singularity, but rather by an event horizon, a membrane that indicates the point of no return from which nothing escapes the inexorable destiny of being swallowed up and destroyed by the singularity.

The Illustris simulation, which visualizes the universe under the standard Big Bang model, is the most ambitious computer simulation yet performed. The calculation tracks the expansion of the universe, the gravitational pull of matter onto itself, the motion of cosmic gas, as well as the formation of stars and black holes.

 

 

“Outside the event horizon of a regular black hole, there are no major changes, but inside it, the changes are deep-seated. There’s a different spacetime that avoids the formation of a singularity,” said Neves.

The scale factor formulated by Neves and Saa was inspired by US physicist James Bardeen. In 1968, Bardeen used a mathematical trick to modify the solution to the general relativity equations that describe black holes.

The trick consisted of thinking of the mass of a black hole not as a constant, as had previously been the case, but as a function that depends on the distance to the center of the black hole. With this change, a different black hole, termed a regular black hole, emerged from the solution to the equations. “Regular black holes are permitted, since they don’t violate general relativity. The concept isn’t new and has frequently been revisited in recent decades,” said Neves.
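
For reference, the mass function usually attributed to Bardeen's 1968 construction takes the form below, quoted here from the standard literature rather than from Neves's paper; swapping the constant mass M of the Schwarzschild solution for m(r) is the "mathematical trick" just described:

```latex
% Bardeen regular black hole (geometric units); g is a length-scale parameter,
% often interpreted as a magnetic charge.
f(r) = 1 - \frac{2\,m(r)}{r}, \qquad
m(r) = \frac{M\,r^{3}}{\left(r^{2} + g^{2}\right)^{3/2}} .
```

Far from the hole m(r) approaches M, recovering the ordinary Schwarzschild exterior, while near the center m(r) falls off as r³, which keeps the curvature finite at r = 0 instead of producing a singularity.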

Since the insertion of a mathematical trick into the general relativity equations could prevent the formation of singularities in regular black holes, Neves considered creating a similar artifice to eliminate the singularity in a regular bounce.

In science, says philosopher Karl Popper, a theory is worthless if it cannot be verified, however beautiful and inspiring it may be. How do you test the hypothesis of a Big Bang that did not start with a singularity?

“By looking for traces of the events in a contraction phase that may have remained in the ongoing expansion phase. What traces? The candidates include remnants of black holes from a previous phase of universal contraction that may have survived the bounce,” Neves said.

The Daily Galaxy, Maxwell Moe, astrophysicist, NASA Einstein Fellow, University of Arizona via Brian Keating, Jonas Mureika and the Sao Paulo Research Foundation (FAPESP)

Image credit: Shutterstock License

A HUNDRED & SEVENTY YEARS AFTER THE FACT

Evolution is now accepted by a majority of Americans

evolution
Credit: CC0 Public Domain

The level of public acceptance of evolution in the United States is now solidly above the halfway mark, according to a new study based on a series of national public opinion surveys conducted over the last 35 years.

"From 1985 to 2010, there was a statistical dead heat between acceptance and rejection of ," said lead researcher Jon D. Miller of the Institute for Social Research at the University of Michigan. "But acceptance then surged, becoming the majority position in 2016."

Examining data over 35 years, the study consistently identified aspects of education—civic science literacy, taking college courses in science and having a college degree—as the strongest factors leading to the acceptance of evolution.

"Almost twice as many Americans held a college degree in 2018 as in 1988," said co-author Mark Ackerman, a researcher at Michigan Engineering, the U-M School of Information and Michigan Medicine. "It's hard to earn a college degree without acquiring at least a little respect for the success of ."

The researchers analyzed a collection of biennial surveys from the National Science Board, several national surveys funded by units of the National Science Foundation, and a series focused on adult civic literacy funded by NASA. Beginning in 1985, these national samples of U.S. adults were asked to agree or disagree with this statement: "Human beings, as we know them today, developed from earlier species of animals."

The series of surveys showed that Americans were evenly divided on the question of evolution from 1985 to 2007. According to a 2005 study of the acceptance of evolution in 34 developed nations, led by Miller, only Turkey, at 27%, scored lower than the United States. But over the last decade, until 2019, the percentage of American adults who agreed with this statement increased from 40% to 54%.

The current study consistently identified religious fundamentalism as the strongest factor leading to the rejection of evolution. While their numbers declined slightly in the last decade, approximately 30% of Americans continue to be religious fundamentalists as defined in the study. But even those who scored highest on the scale of religious fundamentalism shifted toward acceptance of evolution, rising from 8% in 1988 to 32% in 2019.

Miller predicted that religious fundamentalism would continue to impede the public acceptance of evolution.

"Such beliefs are not only tenacious but also, increasingly, politicized," he said, citing a widening gap between Republican and Democratic acceptance of evolution. 

As of 2019, 34% of conservative Republicans accepted evolution compared to 83% of liberal Democrats.

The study is published in the journal Public Understanding of Science.

More information: Jon D. Miller et al, Public acceptance of evolution in the United States, 1985–2020, Public Understanding of Science (2021). DOI: 10.1177/09636625211035919
Journal information: Public Understanding of Science 
Provided by University of Michigan 

 

Scientists Detect Tens of Thousands of Different Molecules in Beer – 80% Not Yet Described in Chemical Databases

Beer Glass Bubbles

Study used modern high resolution analytics to reveal enormous metabolic complexity of beer.

The tradition of beer brewing dates back to at least 7000 BCE and maybe even to the invention of agriculture, considering that most cereals can spontaneously ferment if exposed to airborne yeasts. The code of the Babylonian king Hammurabi (ruled 1792 to 1750 BCE), whose laws 108 through 111 regulate beer sales, shows that people have been anxious to safeguard the quality of beer through legislation for millennia. For example, the Bavarian ‘Reinheitsgebot’ (‘Purity Law’) of 1516, often considered the world’s oldest still functional – with modifications – food regulation, allows only barley, water, and hops as ingredients for brewing beer (with confiscation of the barrels as penalty for transgression).

Now, in a recent study in Frontiers in Chemistry, the science of beer is taken to a new level. Scientists from Germany use state-of-the-art analytical methods to reveal the metabolic complexity – tens of thousands of different molecules – of commercial beers from around the world.

Enormous chemical complexity

“Beer is an example of enormous chemical complexity. And thanks to recent improvements in analytical chemistry, comparable in power to the ongoing revolution in the technology of video displays with ever-increasing resolution, we can reveal this complexity in unprecedented detail. Today it’s easy to trace tiny variations in chemistry throughout the food production process, to safeguard quality or to detect hidden adulterations,” said corresponding author Prof Philippe Schmitt-Kopplin, head of the Comprehensive Foodomics Platform at the Technical University of Munich and of the Analytical BioGeoChemistry research unit at the Helmholtz Center in Munich.

Schmitt-Kopplin and colleagues used two powerful methods – direct infusion Fourier transform ion cyclotron resonance mass spectrometry (DI-FTICR MS) and ultra-performance liquid chromatography quadrupole time-of-flight mass spectrometry (UPLC-ToF-MS) – to reveal the full range of metabolites in 467 types of beer brewed in the US, Latin America, Europe, Africa and East Asia. These included lagers, craft and abbey beers, top-fermented beers, and gueuzes brewed from barley as the only source of starch for fermentation, or barley plus either wheat, rice, and corn (maize).

The methods have complementary strengths. DI-FTICR-MS directly revealed the chemical diversity across all beers and predicted chemical formulas for the metabolite ions in them. The authors then used UPLC-ToF-MS on a subset of 100 beers to resolve the possible isomers behind those formulas. UPLC-ToF-MS uses chromatography to first separate ions with identical masses and then fragments them into daughter ions, making it possible to predict the exact molecular structure.

The authors then mapped these metabolites onto a ‘chemical space’ in which each is linked to one or more others through a single reaction, for example the addition of a methoxy-, hydroxyl-, sulfate-, or sugar-group to the molecular backbone, or the turning of an unsaturated bond into a saturated bond. This yielded a reconstructed metabolite network leading to the final product, consisting of nearly a hundred steps and starting from molecules in the original cereals that are synthesized from the amino acid tryptophan. Derived from these are secondary metabolites unique to each cereal.
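
The network-building step can be illustrated with a short sketch: pairs of detected masses whose difference matches a known single-step modification (hydroxylation, methoxylation, sulfation, hexose addition, saturation of a double bond) are linked as candidate reaction partners. The example masses and tolerance below are hypothetical illustration values, and this is a generic mass-difference approach rather than the authors' exact pipeline.

```python
# Sketch of mass-difference networking: link metabolites whose exact-mass
# difference matches a single chemical modification. Example masses are hypothetical.
from itertools import combinations

# Monoisotopic mass shifts for common single-step modifications (Da).
MODIFICATIONS = {
    "hydroxylation (+O)": 15.9949,
    "methoxylation (+CH2O)": 30.0106,
    "sulfation (+SO3)": 79.9568,
    "hexose addition (+C6H10O5)": 162.0528,
    "double-bond saturation (+H2)": 2.0157,
}
TOLERANCE = 0.002  # Da; illustrative mass accuracy, not the study's figure

# Hypothetical neutral monoisotopic masses of detected metabolites.
detected = {"M1": 204.0899, "M2": 220.0848, "M3": 366.1427, "M4": 206.1056}

for (n1, m1), (n2, m2) in combinations(detected.items(), 2):
    delta = abs(m1 - m2)
    for mod, shift in MODIFICATIONS.items():
        if abs(delta - shift) <= TOLERANCE:
            print(f"{n1} <-> {n2}: consistent with {mod}")
```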

Powerful method for quality control

“Our mass spectrometry method, which takes only 10 minutes per sample, should be very powerful for quality control in food industry and set the basis of novel molecular markers and non-targeted metabolite profiles needed in foodstuff inspection,” said Schmitt-Kopplin.

The authors found approximately 7700 ions with unique masses and formulas, including lipids, peptides, nucleotides, phenolics, organic acids, phosphates, and carbohydrates, of which around 80% aren’t yet described in chemical databases. Because each formula may in some cases cover up to 25 different molecular structures, this translates into tens of thousands of unique metabolites.

“Here we reveal an enormous chemical diversity across beers, with tens of thousands of unique molecules. We show that this diversity originates in the variety of raw materials, processing, and fermentation. The molecular complexity is then amplified by the so-called ‘Maillard reaction’ between amino acids and sugars which also gives bread, meat steaks, and toasted marshmallow their ‘roasty’ flavor. This complex reaction network is an exciting focus of our research, given its importance for food quality, flavor, and also the development of novel bioactive molecules of interest for health,” concluded first author Stefan Pieczonka, a PhD student at the Technical University of Munich.

Reference: “On the Trail of the German Purity Law: Distinguishing the Metabolic Signatures of Wheat, Corn and Rice in Beer” by Stefan A. Pieczonka, Sophia Paravicini, Michael Rychlik and Philippe Schmitt-Kopplin, 20 July 2021, Frontiers in Chemistry.
DOI: 10.3389/fchem.2021.715372

 

Historical Timeline Is Inaccurate: Advanced Radiocarbon Dating Reveals Machu Picchu Is Older Than Expected

Machu Picchu Peru

Machu Picchu, Peru.

Machu Picchu, the famous 15th-century Inca site in southern Peru, is up to several decades older than previously thought, according to a new study led by Yale archaeologist Richard Burger.

Burger and researchers from several U.S. institutions used accelerator mass spectrometry (AMS) — an advanced form of radiocarbon dating — to date human remains recovered during the early 20th century at the monumental complex and onetime country estate of Inca Emperor Pachacuti located on the eastern face of the Andes Mountains.

Their findings, published in the journal Antiquity, reveal that Machu Picchu was in use from about A.D. 1420 to A.D. 1530 — ending around the time of the Spanish conquest — making the site at least 20 years older than the accepted historical record suggests and raising questions about our understanding of Inca chronology.

Historical sources dating from the Spanish invasion of the Inca Empire indicate that Pachacuti seized power in A.D. 1438 and subsequently conquered the lower Urubamba Valley where Machu Picchu is located. Based on those records, scholars have estimated that the site was built after A.D. 1440, and perhaps as late as A.D. 1450, depending on how long it took Pachacuti to subdue the region and construct the stone palace.

The AMS testing indicates that the historical timeline is inaccurate.

Machu Picchu

“Until now, estimates of Machu Picchu’s antiquity and the length of its occupation were based on contradictory historical accounts written by Spaniards in the period following the Spanish conquest,” said Burger, the Charles J. MacCurdy Professor of Anthropology in Yale’s Faculty of Arts and Sciences. “This is the first study based on scientific evidence to provide an estimate for the founding of Machu Picchu and the length of its occupation, giving us a clearer picture of the site’s origins and history.”

The finding suggests that Pachacuti, whose reign set the Inca on the path to becoming pre-Columbian America’s largest and most powerful empire, gained power and began his conquests decades earlier than textual sources indicate. As such, it has implications for people’s wider understanding of Inca history, Burger said.

Machu Picchu Yale

Credit: Photo courtesy Yale University

“The results suggest that the discussion of the development of the Inca empire based primarily on colonial records needs revision,” he said. “Modern radiocarbon methods provide a better foundation than the historical records for understanding Inca chronology.”

The AMS technique can date bones and teeth that contain even small amounts of organic material, expanding the pool of remains suitable for scientific analysis. For this study, the researchers used it to analyze human samples from 26 individuals that were recovered from four cemeteries at Machu Picchu in 1912 during excavations led by Yale professor Hiram Bingham III, who had “rediscovered” the site the previous year.
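
For context on what the AMS measurement yields, the standard uncalibrated radiocarbon age relation is shown below; this is the general textbook formula rather than anything specific to the Yale study, and turning such ages into the calendar dates quoted above additionally requires a calibration curve such as IntCal:

```latex
% Conventional (uncalibrated) radiocarbon age, using the Libby mean life of 8033 yr:
t = -8033\,\ln\!\left(\frac{A_{\text{sample}}}{A_{\text{modern}}}\right)\ \text{years BP},
```

where A_sample / A_modern is the measured carbon-14 content of the sample relative to the modern standard, which is the ratio AMS counts directly and what allows very small bone and tooth samples to be dated.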

The bones and teeth used in the analysis likely belonged to retainers, or attendants, who were assigned to the royal estate, the study states. The remains show little evidence of involvement in heavy physical labor, such as construction, meaning that they likely were from the period when the site functioned as a country palace, not when it was being built, the researchers said.

On November 30, 2010, Yale University and the Peruvian government reached an accord for the return to Peru of the archaeological materials Bingham excavated at Machu Picchu. On February 11, 2011, Yale signed an agreement with the Universidad Nacional de San Antonio Abad del Cusco establishing the International Center for the Study of Machu Picchu and Inca Culture, which is dedicated to the display, conservation, and study of the archaeological collections from Bingham’s 1912 excavations. All human remains and other archaeological materials from Machu Picchu have subsequently been returned to Cusco, the former capital city of the Inca Empire, where they are conserved at the Museo Machu Picchu.

Reference: “New AMS dates for Machu Picchu: results and implications” by Richard L. Burger, Lucy C. Salazar, Jason Nesbitt, Eden Washburn and Lars Fehren-Schmitz, 4 August 2021, Antiquity.
DOI: 10.15184/aqy.2021.99
