Monday, August 21, 2023

 

First ever remains of a dicraeosaurid sauropod unearthed in India

Cervical vertebrae (CV6/8) of Tharosaurus indicus. (a) RWR-241-A, anterior cotyle in anterior view. (b–k) RWR-241-B, partial vertebra, photographs and line drawings in (b,c) right lateral view, red line indicates U-shaped ridge demarcating anterior and posterior halves of lateral pneumatic fossa; (d,e) left lateral view; (f,g) ventral view, red line indicates posteriorly bifurcated midline keel and arrow indicates accessory ridge; (h,i) posterior view, arrows and red arrowheads indicate deep bifurcation of neural arch and triangular facets below cotyle, respectively; (j,k) dorsal view, arrowhead indicates passage enclosed by bifid neural arch, and ligament scars and striations marked in red and purple, respectively. Broken areas and artifacts in gray and pink, respectively. Abbreviations: c, centrum; cpof, centropostzygapophyseal fossa; cpol, centropostzygapophyseal lamina; lf, lateral fossa; lvf, lateroventral flange; mk, midline keel; na, neural arch; nc, neural canal; pvf, posteroventral fossa; tpol, intrapostzygapophyseal lamina. Scale bars represent 50 mm.
Credit: Scientific Reports (2023). DOI: 10.1038/s41598-023-39759-2

A team of paleontologists from the Indian Institute of Technology and the Geological Survey of India has unearthed the first ever remains of a dicraeosaurid sauropod in India. In their paper published in the journal Scientific Reports, the group describes the fossil, its condition and where it fits in with other dinosaurs of the Middle Jurassic.

The fossil (a partial dorsal vertebra) was dug up at a site in the Thar Desert near the city of Jaisalmer, in the state of Rajasthan. Prior research has shown that during the Mesozoic Era, the area was a shoreline along the Tethys Ocean. The newly found fossil has been dated to approximately 167 million years ago and identified as a member of the dicraeosaurids, a group of long-necked dinosaurs that fed on vegetation. It is the first member of the group ever found in India—and the oldest in the world.

The team has named their new find Tharosaurus indicus. They note that dicraeosaurids such as T. indicus are part of a larger group called diplodocoids, which had long bodies and necks and spikes on the backs of their necks. T. indicus, the researchers note, has some slight differences from others in its group, such as a long depression on the side of its neck bones and neural spines that are believed to indicate uniquely oriented spikes. It also had a vertebra near its tail bone whose front surface was heart-shaped.

The research team suggests their find is likely just the first of many to come, and that together such fossils hint that the area where the fossil was found played an important role in the emergence of neosauropods—another group of long-necked, vegetation-eating dinosaurs.

They note that other fossils found in the area also suggest the region played an evolutionary role in the development of many vertebrate groups. They conclude by noting that work such as theirs is still limited in India due to an inadequate supply of resources—much more needs to be done to find out just how rich the country might be in fossils.

More information: Sunil Bajpai et al, Fossils of the oldest diplodocoid dinosaur suggest India was a major centre for neosauropod radiation, Scientific Reports (2023). DOI: 10.1038/s41598-023-39759-2

Journal information: Scientific Reports 

© 2023 Science X Network


 

Researchers find 20,000-year-old refugium for orcas in the northern Pacific

Orcas in North Pacific. Credit: SDU/Olga Filatova

The northern Pacific near Japan and Russia is home to several distinct groups of orcas. They have no contact with each other, do not seek the same food, do not speak the same dialect, and do not mate with each other. How can this be when they live so close to each other and belong to the same species?

Whale biologist Olga Filatova of the University of Southern Denmark is interested in finding out how orcas colonized the northern Pacific, and during her time at university in Moscow she conducted several expeditions to the area. Today, she is based at the University of Southern Denmark's Marine Biological Research Center.

Now some of her latest results have been published in Marine Mammal Science. In the paper, she and colleagues explore the complex interaction between orca culture and the post-glacial history of the species' colonization of the North Pacific, showing that the orca pods currently living near the Nemuro Strait in northern Japan are descendants of orcas that settled there during the last ice age, around 20,000 years ago. The location was chosen as a refugium by distant ancestors, and their descendants have lived there ever since.

"Orcas are conservative and tradition-bound creatures who do not move or change their traditions unless there is a very good reason for it. We see that in this population," says Filatova.

This is the second time she has found an orca refugium from the Ice Age. The first one is near the Aleutian Islands, some 2500 km away. The pods there are just as conservative and tradition-bound as their Japanese conspecifics, and are also descendants of Ice Age ancestors who found refuge in ice-free waters.

"When the ice began to retreat again, and orcas and other whales could swim to new ice-free areas, some of them did not follow. They stayed in their [refugia], and they are still living there," says Filatova.

The studies are based on genetic analyses (the researchers took skin biopsies of the animals) and analyses of sounds made by the animals (recorded with underwater microphones).

"Orcas in the Nemuro Strait had unusually high genetic diversity, which is typical for glacial [refugia], and their vocal repertoire is very different from the dialects of orcas living to the north off the coast of Kamchatka. Kamchatkan orcas are most likely the descendants of the few pods that migrated west from the central Aleutian refugium; that's why they are so different," says Filatova.

Orcas' vocalizations are highly diverse, and no two pods make the same sounds. These sounds can therefore be used to identify individuals' affiliations to families and pods. Orcas are not genetically programmed to produce particular sounds the way a cat is: a cat that grows up among other animals and has never heard another cat will still meow when opening its mouth. In contrast, orcas learn to communicate from their mother or other older family members. Each pod has its own dialect, not spoken by others.

"When we combine this with , we get a strong idea of how different orca communities relate to each other," says Filatova.

So far, two Ice Age refugia have been discovered, providing us with insight into how orcas may handle current and future climate changes: They will likely move northward as the ice melts, and this colonization may happen in small, individual families or pods rather than in large waves.

The discovery of the two Ice Age refugia not only contributes to knowledge about how orcas survived during the Ice Age, but it also paints a picture of orcas as very different animals that may not fit neatly into one species.

"Many believe that orcas should be divided into several species. I agree—at least into subspecies because they are so different that it doesn't make sense to talk about one species when discussing their place in the  or when allocating quotas to fishermen," says Filatova.

Some orcas eat fish, some only herring, some only mackerel, some only a specific type of salmon. Others only eat marine mammals such as seals, porpoises, and dolphins. Some take a little of everything, and still others live so far out in the open sea that we fundamentally know very little about them.

Whether a pod eats fish—and which fish—has a significant impact on the fishing that takes place in their habitat. When a country calculates fishing quotas, it must take into account how many fish are naturally hunted by predators, and since an orca can consume 50–100 kg of fish in a day, it greatly affects the quota calculation.
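As a rough illustration of why this matters for quota setting, here is a back-of-the-envelope sketch in Python. The pod size and the use of the midpoint of the 50–100 kg range are assumptions made for illustration, not figures from the study:

# Back-of-the-envelope estimate of how much fish one pod removes in a year.
# Pod size and the daily-intake midpoint are illustrative assumptions; the
# article only states that a single orca eats roughly 50-100 kg of fish per day.
pod_size = 15                      # assumed number of orcas in the pod
daily_intake_kg = (50 + 100) / 2   # midpoint of the quoted 50-100 kg per day
annual_tonnes = pod_size * daily_intake_kg * 365 / 1000
print(f"A pod of {pod_size} removes roughly {annual_tonnes:.0f} tonnes of fish per year")
# -> roughly 411 tonnes per year, the kind of figure a quota calculation has to subtract.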

If pods eat marine mammals and do not touch fish, this matters if they are to be captured and sold to marine parks, where it is difficult to feed them marine mammals. While marine parks' popularity is declining worldwide, there is still a large market for orcas in Chinese marine parks.

Since there is only one scientifically recognized species of orcas, researchers have resorted to a different form of classification to distinguish between different types of orcas and categorize them into so-called ecotypes. In the northern Pacific, three ecotypes have been defined so far, and in the southern hemisphere, four or five have been described.

There are probably more—perhaps up to 20 different ecotypes, according to Filatova.

"We need to know the different ecotypes. Orcas are at the top of the food chain, and it affects the entire ecosystem around them what they eat and where they do it," she says.

In the Danish waters Skagerrak and Kattegat, close to the SDU Marine Biological Research Center, orcas are occasionally seen. Yet no one knows whether they eat fish or marine mammals—and therefore how they affect the food chain and fishing.

"I look forward to learning more about them. Maybe they turn out to belong to a new ecotype," says Filatova.

More information: Olga A. Filatova et al, Genetic and cultural evidence suggests a refugium for killer whales off Japan during the Last Glacial Maximum, Marine Mammal Science (2023). DOI: 10.1111/mms.13046

 

REBURN: A new tool to model wildfires in the Pacific Northwest and beyond

This NASA MODIS image shows the Tripod Complex Fire in north-central Washington on Aug. 7, 2006. Credit: NASA/MODIS Rapid Response Team/Goddard Space Flight Center

In 2006, the Tripod Complex Fire burned more than 175,000 acres in north-central Washington. The fire, which was within the Okanogan-Wenatchee National Forest, was more than three times the size of Seattle. Yet while considered severe at the time, even larger wildfires in 2014, 2015 and 2021 have since dwarfed Tripod.

Past research shows that large and severe wildfires like these were much rarer in the western U.S. and Canada prior to the late 20th century.

"Fire exclusion policies for much of the 20th century yielded many dense forests with largely uniform composition," said Susan Prichard, a research scientist with the UW School of Environmental and Forest Sciences. "By the turn of this century, we had mature and densely treed, multi-layered forests with high fuel content—and as a result, large, destructive wildfires can ignite and spread more easily. There's simply more to burn across large landscapes."

Prichard, along with colleagues from the U.S. Forest Service's Pacific Northwest Research Station—Paul Hessburg, Nicholas Povak and Brion Salter—and consulting fire ecologist Robert Gray, has created a simulation tool that will allow managers and policymakers to imagine and realize a different future: one where large, severe wildfires like Tripod are once again rare events, even under climate change.

The tool, known as REBURN, can simulate large forest landscapes and wildfire dynamics over decades or centuries under different wildfire management strategies. The model can simulate the consequences of extinguishing all wildfires regardless of size, which was done for much of the 20th century, or of allowing certain fires to return to uninhabited areas. REBURN can also simulate conditions where more benign forest landscape dynamics have fully recovered in an area.

In a pair of papers published in the journal Fire Ecology, the team applied REBURN to the region in north-central Washington where the 2006 Tripod Complex Fire burned. Simulations showed that setting prescribed burns and allowing smaller wildfires to burn can yield more varied and resilient forests over time.

Such forests are made up of forest condition "patches" of different sizes and shapes, and all at different stages of recovery from their most recent fire. Patches that recently burned acted as "fences" to the flow of fire for at least the next 5 to 15 years, preventing wildfires from spreading widely. REBURN simulations showed that a forest landscape comprised of 35 to 50% "fence" areas had far fewer large-scale and damaging wildfires.

"Landscapes had tipped to more 'benign' burning conditions," said Hessburg.

REBURN simulations showed that, when fence areas were less abundant across a region, larger and more severe wildfires tended to dominate how the landscape developed over time.

"The model allows us to simulate what can happen when different management scenarios are applied before the fact, including how small or medium-sized fires in uninhabited areas can reshape forest vulnerability to fires," said Prichard. "We found that having a more complex forest environment—in terms of tree age, composition, density, fuel content—makes it harder for large fires to spread and become severe."

The simulated area (in color) for the REBURN model. The red outline indicates the area affected by the 2006 Tripod Complex Fire. Credit: Brion Salter/U.S. Forest Service Pacific Northwest Research Station

"We also found that non-forest areas comprised of grasslands, shrublands, wet and dry meadows, and sparsely treed woodlands were key ingredients of wildfire-resilient forests," said Hessburg.

"REBURN showed us that our policy of extinguishing all wildfires created forests like those that exist today, with large, severe wildfires growing more prevalent. In addition to destroying homes and blanketing cities and towns with smoke, conflagrations like these displace wildlife, destroy habitats, and can burn large areas severely, sometimes making it difficult for forests to return."

Short intervals between forest reburns can be especially harmful for long-term recovery by destroying young trees that have not yet produced cones, they added.

From 1940 to 2005 in Washington's North Cascades, fire crews extinguished more than 300 fires in their early stages in the Tripod area—most triggered by lightning strikes. By the 1980s and 1990s, forests in the region had become high-density tinderboxes, loaded with older, dying trees and lots of dead wood and other fuel on the ground.

Research has shown that before large-scale European colonization of the area, smaller wildfires shaped forests in north-central Washington and elsewhere in the Pacific Northwest. The Methow people and other tribes in the region actively set fires through cultural burning practices. Aerial photos show that, as recently as the 1930s, forests in north-central Washington had a "patchwork quilt" structure that kept large wildfires from forming easily.

"Forests with more complex structure—including densely and lightly treed areas like meadows and grasslands, shrublands, and spare woodlands—also create a wider variety of habitats for wildlife," Hessburg said. "Recently burned areas can develop into wet or dry meadows that can host deer or moose. Other, younger tree-dense areas can host lynx and snowshoe hares."

REBURN can be adapted to other regions in the western U.S. and Canada. Prichard, Hessburg and their colleagues are currently adapting it to simulate forest development in the vast forests of southern British Columbia and northern California, including regions recently hit by wildfires and those culturally burned by Indigenous people.

But knowing when—or even whether—to allow a small fire to burn in an uninhabited region is no easy task, since fire managers must protect people, their homes and livelihoods. The team hopes ongoing research will help refine the model and the insight it can provide to modified forest management strategies.

"This is a new type of tool that couples forest and non-forest development models over time, fuel fall-down after fires, and a  growth model," said Hessburg.

"We hope that it will help people who make major decisions about our forests understand the long-term consequences of different practices and policies when it comes to wildfires," said Prichard. "We hope it will make these conversations easier to have by grounding our predictions in sound  science."

More information: Susan J. Prichard et al, The REBURN model: simulating system-level forest succession and wildfire dynamics, Fire Ecology (2023). DOI: 10.1186/s42408-023-00190-7

Nicholas A. Povak et al, System-level feedbacks of active fire regimes in large landscapes, Fire Ecology (2023). DOI: 10.1186/s42408-023-00197-0


Provided by University of Washington

North Atlantic volcanic activity was a major driver of climate change 56 million years ago, study finds

Continent and plate reconstruction of the North Atlantic Igneous Province area 56 million years ago, highlighting areas of volcanism and the spread of lava flows on the seafloor. Credit: Jones et al. 2023

The Paleocene–Eocene Thermal Maximum (PETM) was a period of global warming that occurred ~56 million years ago and lasted approximately 200,000 years, during which Earth's global surface temperatures rose by ~5°C.

Hypotheses for the cause of this hyperthermal (short-lived warming) event have included destabilization of methane hydrates (ice-like solids of methane and water) due to orbital forcing (changes in incoming solar radiation due to variation in the tilt of the Earth's axis and orbit) and uplift of the land causing weathering of marine rocks.

However, new research in Climate of the Past suggests that volcanic activity within the North Atlantic contributed significant amounts of greenhouse gases to the atmosphere (the region was volcanically active 63–54 million years ago, with peak volcanism 56–54 million years ago). The increased carbon emissions fall in line with a prominent spike in lighter carbon (12C) recorded in the shells of foraminifera, fossil microorganisms living in the oceans at the time. This added carbon dioxide enhances the greenhouse effect by trapping and absorbing heat radiating from the Earth's surface, causing a positive feedback loop of ever-increasing temperatures.

This volcanism spans the vast North Atlantic Igneous Province (NAIP), located between Greenland, north of the United Kingdom and west of Norway; the total volume of magma emplaced is thought to have been up to 1,000,000 km³, equating to a carbon reservoir of 35,000 gigatons.

To determine the contribution of the NAIP to PETM climate change, Dr. Morgan Jones from the University of Oslo and colleagues turned to the sediment record preserved on the island of Fur, Denmark, where a complete section running from before the PETM through to after the event is present, having been uplifted from the seafloor over millennia.

Here, hundreds of ash layers (>1 cm thick) derived from the NAIP can be found, which the scientists analyzed for particular elements to determine volcanic activity, changes in hydrological regimes and weathering. Such measurements are termed proxies and provide an indication of past environmental conditions when direct measurements aren't available, unlike today when we can use instruments to measure emissions in real time.

Photograph of Stolleklint Beach on the island of Fur, Denmark, with the black lines indicating ash layers and the pre-, peak and post-PETM phases marked. Credit: Jones et al. 2023 / Stokke et al. 2021

Volcanic proxies include mercury and osmium, which are released during eruptions and deposited with organic matter. Their progressive enrichment through the succession indicates elevated NAIP activity leading up to the PETM, followed by a fairly prompt decline during the recovery phase after the event. This activity would have comprised basaltic eruptions and thermogenic degassing (removal of dissolved gases from liquids) due to contact with magma intrusions.

In the latter case, high levels of methane contributed significantly to global warming, as methane is a powerful greenhouse gas, 28 times more potent than carbon dioxide at trapping heat over a 100-year period. Dr. Jones suggests a distinct change in the activity of the NAIP from effusive (outpouring of lava onto the ground) to explosive (including ash clouds and volcanic bombs, for example) over this period.
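The "28 times more potent" figure is a 100-year global warming potential (GWP), and the arithmetic behind a CO2-equivalent comparison is simple. A minimal sketch in Python, where the one-tonne emission is an arbitrary example rather than a value from the study:

# Convert a methane release into its 100-year CO2-equivalent using the global
# warming potential quoted in the article; the emission mass is an arbitrary example.
GWP_100_CH4 = 28
methane_tonnes = 1.0
co2_equivalent_tonnes = methane_tonnes * GWP_100_CH4
print(f"{methane_tonnes} t of CH4 ~ {co2_equivalent_tonnes} t of CO2 over a 100-year horizon")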

Paleoclimate proxies include carbon, lithium and osmium, the latter two being tracers of silicate weathering. Lithium and osmium abundances increase during the peak and then post-PETM, highlighting enhanced silicate weathering and erosion resulting from a more intense hydrological cycle due to global warming. However, lithium measurements do not correspond fully to the palaeotemperature of the time, with Dr. Jones and colleagues suggesting that uplift of the NAIP would have contributed to providing more exposed rock for weathering and erosion to take place.

Post-PETM weathering of the silica-rich basaltic lava flows used carbon dioxide from the atmosphere to form carbonate and bicarbonate compounds that sequestered this greenhouse gas into rock, helping to draw down carbon dioxide and therefore aid recovery from the climatic event. In addition, an enhanced hydrological cycle transported ash to the sea for burial, which would have helped create a negative feedback whereby more carbon was removed from the atmosphere and hydrosphere; thus, the greenhouse forcing reduced and global temperatures declined.

It is worth noting that not all of the volcanic record is preserved here, as only the most explosive eruptions would have sent ash far enough to reach Denmark from the North Atlantic and be preserved and discovered by scientists millions of years later. While there is still much more work to be conducted on climate change events over geological timescales, they are important to study as they offer a window into future global warming, helping us understand how both natural and anthropogenic carbon dioxide will impact our world.


More information: Morgan T. Jones et al, Tracing North Atlantic volcanism and seaway connectivity across the Paleocene–Eocene Thermal Maximum (PETM), Climate of the Past (2023). DOI: 10.5194/cp-19-1623-2023.


Journal information: Climate of the Past

© 2023 Science X Network




 

Physicists use a 350-year-old theorem to reveal new properties of light waves

Want to know how light works? Try asking a mechanic
Physicists at Stevens Institute of Technology use a 350-year-old theorem that explains the workings of pendulums and planets to reveal new properties of light waves. Credit: Stevens Institute of Technology

Since the 17th century, when Isaac Newton and Christiaan Huygens first debated the nature of light, scientists have been puzzling over whether light is best viewed as a wave or a particle—or perhaps, at the quantum level, even both at once. Now, researchers at Stevens Institute of Technology have revealed a new connection between the two perspectives, using a 350-year-old mechanical theorem—ordinarily used to describe the movement of large, physical objects like pendulums and planets—to explain some of the most complex behaviors of light waves.

The work, led by Xiaofeng Qian, assistant professor of physics at Stevens and reported in the August 17 online issue of Physical Review Research, also proves for the first time that a light wave's degree of non-quantum entanglement exists in a direct and complementary relationship with its degree of polarization. As one rises, the other falls, enabling the level of entanglement to be inferred directly from the level of polarization, and vice versa. This means that hard-to-measure properties such as amplitudes, phases and correlations—perhaps even those of quantum wave systems—can be deduced from something a lot easier to measure: light intensity.
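For readers who want the quantitative flavor: in earlier work on so-called classical entanglement by Qian, Eberly and colleagues, this trade-off is often written, for an idealized pure two-mode field, as

    P^2 + C^2 = 1,

where P is the degree of polarization and C is a concurrence-like measure of the entanglement between the field's degrees of freedom. Whether the new paper derives exactly this form or a more general version of it is best checked against the paper itself; the relation is quoted here only to show what "as one rises, the other falls" looks like as an equation.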

"We've known for over a century that light sometimes behaves like a wave, and sometimes like a particle, but reconciling those two frameworks has proven extremely difficult," said Qian "Our work doesn't solve that problem—but it does show that there are profound connections between wave and particle concepts not just at the , but at the level of classical light-waves and point-mass systems."

Qian's team used a mechanical theorem, originally developed by Huygens in a 1673 book on pendulums, that explains how the energy required to rotate an object varies depending on the object's mass and the axis around which it turns. "This is a well-established mechanical theorem that explains the workings of physical systems like clocks or pendulums," Qian explained. "But we were able to show that it can offer new insights into how light works, too."

This 350-year-old theorem describes relationships between masses and their rotational momentum, so how could it be applied to light, which has no mass to measure? Qian's team interpreted the intensity of a light wave as the equivalent of a physical object's mass, then mapped those measurements onto a coordinate system that could be interpreted using Huygens' mechanical theorem. "Essentially, we found a way to translate an optical system so we could visualize it as a mechanical system, then describe it using well-established physical equations," explained Qian.
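The description matches what is usually called the parallel-axis, or Huygens-Steiner, theorem, which appears in Huygens' 1673 Horologium Oscillatorium. In its standard mechanical form it states that the moment of inertia I about an axis a distance d from a body's center of mass is

    I = I_cm + m d^2,

where m is the body's mass and I_cm its moment of inertia about the parallel axis through the center of mass. In the mapping sketched above, total light intensity plays the role of m; exactly how the optical counterparts of I_cm and d are defined is specific to the paper and is not spelled out in this article.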

Once the team visualized a light wave as part of a mechanical system, new connections between the wave's properties immediately became apparent—including the fact that entanglement and polarization stood in a clear relationship with one another.

"This was something that hadn't been shown before, but that becomes very clear once you map light's properties onto a mechanical system," said Qian. "What was once abstract becomes concrete: using mechanical equations, you can literally measure the distance between 'center of mass' and other mechanical points to show how different properties of light relate to one another."

Clarifying these relationships could have important practical implications, allowing subtle and hard-to-measure properties of optical systems—or even quantum systems—to be deduced from simpler and more robust measurements of light intensity, Qian explained. More speculatively, the team's findings suggest the possibility of using mechanical systems to simulate and better understand the strange and complex behaviors of quantum wave systems.

"That still lies ahead of us, but with this first study we've shown clearly that by applying mechanical concepts, it's possible to understand optical systems in an entirely new way," Qian said. "Ultimately, this research is helping to simplify the way we understand the world, by allowing us to recognize the intrinsic underlying connections between apparently unrelated physical laws."

More information: Xiao-Feng Qian et al, Bridging coherence optics and classical mechanics: A generic light polarization-entanglement complementary relation, Physical Review Research (2023). DOI: 10.1103/PhysRevResearch.5.033110

 THE MAN WITH ONE RED SHOE

Visualizing the mysterious dance: Quantum entanglement of photons captured in real-time

Biphoton state holographic reconstruction. Image reconstruction. a, Coincidence image of interference between a reference SPDC state and a state obtained by a pump beam with the shape of a yin and yang symbol (shown in the inset). The inset scale is the same as in the main plot. b, Reconstructed amplitude and phase structure of the image imprinted on the unknown pump. Credit: Nature Photonics (2023). DOI: 10.1038/s41566-023-01272-3

Researchers at the University of Ottawa, in collaboration with Danilo Zia and Fabio Sciarrino from the Sapienza University of Rome, recently demonstrated a novel technique that allows the visualization of the wave function of two entangled photons, the elementary particles that constitute light, in real-time.

Using the analogy of a pair of shoes, the concept of entanglement can be likened to selecting a shoe at random. The moment you identify one shoe, the nature of the other (whether it is the left or right shoe) is instantly discerned, regardless of its location in the universe. However, the intriguing factor is the inherent uncertainty associated with the identification process until the exact moment of observation.

The wave function, a central tenet in quantum mechanics, provides a comprehensive understanding of a particle's quantum state. For instance, in the shoe example, the "wave function" of the shoe could carry information such as left or right, the size, the color, and so on.

More precisely, the wave function enables quantum scientists to predict the probable outcomes of various measurements on a quantum entity, e.g. position, velocity, etc.

This predictive capability is invaluable, especially in the rapidly progressing field of quantum technology, where knowing the quantum state that is generated by or input into a quantum computer allows researchers to test the computer itself. Moreover, quantum states used in quantum computing are extremely complex, involving many entities that may exhibit strong non-local correlations (entanglement).

Knowing the wave function of such a quantum system is a challenging task—it is also known as quantum state tomography, or quantum tomography for short. With the standard approaches (based on so-called projective operations), a full tomography requires a large number of measurements, and that number increases rapidly with the system's complexity (dimensionality).
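A standard parameter count, not a figure taken from this paper, shows why the scaling is so punishing: the density matrix of a single d-level system is fixed by d^2 - 1 real numbers, so a biphoton state in which each photon can occupy d spatial modes lives in a joint space of dimension d^2 and needs on the order of

    (d^2)^2 - 1 = d^4 - 1

real parameters. At d = 100 modes per photon, that is already about 10^8 quantities to pin down, measurement by measurement.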

Previous experiments conducted with this approach by the research group showed that characterizing or measuring the high-dimensional quantum state of two entangled photons can take hours or even days. Moreover, the result's quality is highly sensitive to noise and depends on the complexity of the experimental setup.

The projective measurement approach to quantum tomography can be thought of as looking at the shadows of a high-dimensional object projected onto different walls from independent directions. All a researcher can see is the shadows, and from them they can infer the shape (state) of the full object. For instance, in a CT scan (computed tomography scan), a 3D object can be reconstructed from a set of 2D images.

In classical optics, however, there is another way to reconstruct a 3D object. This is called digital holography, and it is based on recording an interference pattern, called an interferogram, obtained by interfering the light scattered by the object with a reference light.
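The classical recipe is compact enough to write down (standard textbook optics, not anything specific to this paper): if E_obj is the field scattered by the object and E_ref the reference field, the camera records the intensity

    I = |E_obj + E_ref|^2 = |E_obj|^2 + |E_ref|^2 + 2|E_obj||E_ref| cos(phi_obj - phi_ref),

and it is the cosine cross-term that carries the object's phase phi_obj, which a plain intensity measurement would throw away. The biphoton experiment described below plays the analogous game with coincidence images in place of ordinary intensity.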

The team, led by Ebrahim Karimi, Canada Research Chair in Structured Quantum Waves, co-director of the uOttawa Nexus for Quantum Technologies (NexQT) research institute and associate professor in the Faculty of Science, extended this concept to the case of two photons.

Reconstructing a biphoton state requires superimposing it with a presumably well-known quantum state, and then analyzing the spatial distribution of the positions where two photons arrive simultaneously. Imaging the simultaneous arrival of two photons is known as a coincidence image. These photons may come from the reference source or the unknown source. Quantum mechanics states that the source of the photons cannot be identified.

This results in an interference pattern that can be used to reconstruct the unknown wave function. This experiment was made possible by an advanced camera that records events with nanosecond resolution on each pixel.

Dr. Alessio D'Errico, a postdoctoral fellow at the University of Ottawa and one of the co-authors of the paper, highlighted the immense advantages of this innovative approach, "This method is exponentially faster than previous techniques, requiring only minutes or seconds instead of days. Importantly, the detection time is not influenced by the system's complexity—a solution to the long-standing scalability challenge in projective tomography."

The impact of this research goes beyond just the academic community. It has the potential to accelerate quantum technology advancements, such as improving quantum state characterization, quantum communication, and developing new quantum imaging techniques.

The study "Interferometric imaging of amplitude and phase of spatial biphoton states" was published in Nature Photonics.

More information: Danilo Zia et al, Interferometric imaging of amplitude and phase of spatial biphoton states, Nature Photonics (2023). DOI: 10.1038/s41566-023-01272-3

 

City living may make male song sparrows more doting 'super' fathers

A song sparrow singing in Delaware, USA. Credit: Wikimedia Commons, CC BY 2.0

New behavioral traits are often the first response of animals to changing environmental conditions. As cities increasingly become habitats for wildlife, researchers have studied behavioral changes in birds, examining how urbanization affects the parental care behavior of male song sparrows. The team found that in cities, where male song sparrows are known to be more aggressive than in rural surroundings, males visited their nests more often than their rural counterparts did. The study is published in Frontiers in Ecology and Evolution.

When animals settle in new environments, or when their habitats are rapidly changed by human influence, their behaviors change. One such behavioral change, observed in several bird species that have settled in cities, is increased aggression, born out of the need to defend territories.

"Male songbirds in temperate zones are thought to reduce parental care when they are more aggressive. Yet in this study, we show that urban male song sparrows provided more care for their young," said Dr. Samuel Lane, currently a postdoctoral research fellow at North Dakota State University and lead author of the study completed in the Sewall Lab at Virginia Tech. "Against our expectations, we found that they visited nests more frequently and were more successful parents than rural males."

Super songbirds

Many songbird species have readily adapted to cities, yet in these new environments they face challenges not found in their native habitats. One way that animals can cope with those changes is by balancing behaviors to manage energy and time resources better.

If urban male songbirds spend more time securing their territory, they presumably have less time to invest in the care of their offspring. The researchers therefore expected that more aggressive urban male sparrows would sacrifice parental care for territorial aggression, which in turn was expected to have a negative impact on the survival of their young. To test this hypothesis, they studied six sites in southwest Virginia characterized by recent urban sprawl over four breeding seasons.

Lane and colleagues observed that urban males visited their nests significantly more often than their rural fellows. They also began feeding nestlings earlier in the day. "It turns out urban males are super males—able to defend their territories and care for their young," Lane said.

Born and raised in the city

The researchers also found that hatching and fledging success was significantly higher in urban habitats—despite certain challenges city birds faced. For example, brood parasitism—a behavioral pattern in which certain bird species, such as the brown-headed cowbird, lay their eggs in the nests of other birds—was higher in cities. This phenomenon can negatively impact the development and survival of the host's own offspring. On the upside, nest predation rates were significantly lower in the city, contributing to overall higher nesting success.

"It is often assumed that urban areas are more challenging for wild animals," Lane said. "Our study adds to growing evidence that certain species of songbirds even benefit from living in urban environments when there is sufficient green space for them to find food and nest locations." The scientists hope ongoing research in this field will contribute to designing urban environments that support wildlife better.

These results, however, should not be generalized to all locations, or other species and animals. The researchers pointed out that studying sites of more intense urbanization or species that cope worse with urbanization might have produced different outcomes.

More information: Indirect effects of urbanization: Consequences of increased aggression in an urban male songbird for mates and offspring, Frontiers in Ecology and Evolution (2023). DOI: 10.3389/fevo.2023.1234562


Nuclear War Could End the World, but What if It’s All in Our Heads?

Sarah Scoles
Updated Mon, August 21, 2023

- 1952 FILE PHOTO - The mushroom cloud of the first test of a hydrogen bomb, "Ivy Mike", as photographed on Enewetak, an atoll in the Pacific Ocean, in 1952, by a member of the United States Air Force's Lookout Mountain 1352d Photographic Squadron. The top secret film studio, then located in Hollywood, California, produced thousands of classified films for the Department of Defence and the Atomic Energy Commission beginning in 1947. A 50th anniversary tribute to these "Atomic Cinematographers" and their work is planned for October 22 in Hollywood.

Nuclear war has returned to the realm of dinner table conversation, weighing on the minds of the public more than it has in a generation.

It’s not just “Oppenheimer’s” big haul at the box office: Since Russia’s invasion of Ukraine, the country’s officials have made nuclear threats. Russia has also suspended its participation in a nuclear arms control treaty with the United States. North Korea has launched demonstrative missiles. The United States, which is modernizing its nuclear weapons, shot down a surveillance balloon from China, which is building up its atomic arsenal.

“The threat of nuclear use today, I believe, is as high as it has ever been in the nuclear age,” said Joan Rohlfing, president and chief operating officer of the Nuclear Threat Initiative, an influential nonprofit group in Washington, D.C.

In this environment, a conventional crisis runs a significant risk of turning nuclear. It only requires a world leader to decide to launch a nuclear attack. And that decision making process must be better understood.

Historically, scholarship on nuclear decision making grew out of economic theory, where analysts have often irrationally assumed that a “rational actor” is making decisions.

“We all know that humans make mistakes,” Rohlfing said. “We don’t always have good judgment. We behave differently under stress. And there are so many examples of human failures over the course of history. Why do we think it’s going to be any different with nuclear?”

But growing scientific understanding of the human brain hasn’t necessarily translated into adjustments in nuclear launch protocols.

Now there’s a push to change that. The organization led by Rohlfing, for instance, is working on a project to apply insights from cognitive science and neuroscience to nuclear strategy and protocols — so leaders won’t bumble into atomic Armageddon.

But finding truly innovative, scientifically backed ideas to prevent an accidental or unnecessary nuclear attack is easier said than done. So is the task of presenting the work with adequate nuance.

Experts also need to persuade policymakers to apply research-based insights to real-world nuclear practice.

“The boundaries of that discourse are extraordinarily well protected,” said Anne I. Harrington, a nuclear scholar at Cardiff University in Wales, referring to internal pushback she says government insiders have faced when challenging the nuclear status quo. “So anyone who thinks that they’re going to make changes from the outside alone — I think that won’t happen.”

— — —

The world’s nuclear powers have different protocols for making the grave decision to use nuclear weapons. In the United States, absent an unlikely change to the balance of power among the branches of government, the decision rests with just one person.

“The most devastating weapons in the U.S. military arsenal can be ordered into use by only the president,” said Reja Younis of the Center for Strategic and International Studies in Washington, D.C., who is also a doctoral candidate in international relations at the Johns Hopkins School of Advanced International Studies.

In a crisis involving nuclear arms, Younis said, the president would probably meet with the secretary of defense, military leaders and other aides. Together, they would evaluate intelligence and discuss strategy, and the advisers would present the president with possible actions.

“Which could range from ‘let’s do nothing and see what happens’ to ‘let’s full-scale nuclear attack,’” said Alex Wellerstein, a professor at the Stevens Institute of Technology in New Jersey and head of a research project called “The President and the Bomb.”

In the end, though, only the president makes the call — and they can forgo guidance from advisers. A president could just press the proverbial button.

“These are the president’s weapons,” Rohlfing said.

— — —

Before his electoral victory in 2016, experts and political opponents began raising concerns about investing in Donald Trump the power to order a nuclear attack. That debate continued in Congress through his presidential term. By the time he left office, then-House Speaker Nancy Pelosi had openly asked the chairman of the Joint Chiefs of Staff to limit his ability to launch nuclear weapons.

It was in this milieu that Deborah G. Rosenblum, the executive vice president of the Nuclear Threat Initiative, invited Moran Cerf, a neuroscientist who is currently a professor at the Columbia Business School, to give a lecture to the organization in 2018. He titled it “Your Brain on Catastrophic Risk.” (Today, Rosenblum serves in the Biden administration as assistant secretary of defense for nuclear, chemical and biological defense programs — an office that briefs the president on nuclear matters.)

In a black T-shirt and jeans, Cerf briefed a room of experts and researchers on what brain science had to say about existentially troubling topics like nuclear war. The visit preceded a collaboration involving Cerf and a nonprofit called PopTech, whose conference Cerf hosts.

The groups, with a grant from the Carnegie Corporation of New York, are working to provide the government with science-based suggestions to improve nuclear launch protocols. Changing those policies is not impossible, but it would require the right political scenario.

“You would need to have some sort of consensus that’s going to come from not just outside groups, but also policy and military insiders,” Harrington said. She added, “You probably also need the right president, honestly.”

The project includes a more public-facing arm: Cerf has been interviewing influential security experts like Leon Panetta, former secretary of defense and director of the CIA, and Michael S. Rogers, former director of the National Security Agency. Excerpts from these interviews will be cut into a documentary series, “Mutually Assured Destruction.”

With this project, Cerf and colleagues may have a conduit to share their findings and proposals with prominent government officials past and present. And he is optimistic about the difference those findings might make.

“I always think things will be better,” he said. “I always think that, with a nice smile, you can get the hardest opposition to listen to you.”

— — —

Cerf has the rapid cadence of a TED Talk speaker. Born in France and raised in Israel, he went to college for physics, got a master's in philosophy, joined a lab that studied consciousness at Caltech and then completed a Ph.D. there in neuroscience.

Along the way, he did mandatory military service in Israel, worked as a white-hat hacker, consulted on films and TV and won a Moth GrandSlam storytelling competition.

Cerf said his primary critique of the system for starting a nuclear war is that despite advances in our understanding of the fickle brain, the status quo assumes largely rational actors. In reality, he says, the fate of millions rests on individual psychology.

One of Cerf’s suggestions is to scan presidents’ brains and gain an understanding of the neuro-particulars of presidential decision making. Maybe one commander in chief functions better in the morning, another in the evening; one is better hungry, the other better sated.

Other ideas for improving the protocols that Cerf has spoken about publicly generally can be traced back to existing research on decision making or nuclear issues.

Cerf says one important factor is speaking order during the big meeting. If, for instance, the president begins with an opinion, others — necessarily lower in the chain of command — are less likely to contradict it.

The idea that the hierarchical order of speaking affects the outcome of a discussion is not new. “That’s a classic experiment done in the ’50s,” said David J. Weiss, a professor emeritus at California State University, Los Angeles, referring to studies conducted by the psychologist Solomon Asch.

Cerf has also proposed decreasing the time pressure of a nuclear decision. The perception of a strict ticking clock to respond to a nuclear attack originated before the United States developed a more robust nuclear arsenal that might survive a first strike.

“We know that compressed time is bad for most decisions and most people,” Cerf said — an idea that goes back to at least the 1980s. Ideally, he says, if the United States received information indicating a launch, the president could assess it and make a decision outside the immediate heat of the moment.

The group’s main recommendation, though, mirrors proposals by other advocates: Require another person (or people) to say yes to a nuclear strike. Wellerstein, who did not contribute to the group’s research, says that such a person needs the explicit power to say no.

“Our belief is that the system we have, which relies on a single decision-maker, who may or may not be equipped to make this decision, is a fragile and very risky system,” Rohlfing said.

While Cerf and colleagues have other papers in the works, the research from the project that he has produced doesn’t address nuclear weapons head-on. In one paper, participants made riskier decisions when they pretended to be merchants seeking deals on unidentified fruits of unknown value.

Cerf says that research is relevant to scenarios of high risk and low probability — like starting nuclear war — which often have numerous sources of uncertainty. A nuclear decision-maker might be unsure of whether a missile is really in the air, how high a nuke’s yield is, why the missile was launched or whether more missiles will follow.

Another of Cerf’s studies focused on climate change. It found that when people were asked to stake money on climate outcomes, they would bet that global warming was happening, and they were more concerned about its impact, more supportive of action and more knowledgeable about relevant issues — even if they began as skeptics. “You basically change your own brain without anyone telling you anything,” Cerf said.

He thinks the results could be applied to nuclear scenarios because you could use bets to make people care about nuclear risk and support changes to policy. The findings could also be used to evaluate the thinking and prediction of aides who advise the president.

Some scholars of decision science don’t agree on such extrapolations.

“To go from there to giving advice on the fate of the world — I don’t think so,” said Baruch Fischhoff, a psychologist who studies decision making at Carnegie Mellon University.

Paul Slovic, a professor of psychology at the University of Oregon and president of the nonprofit Decision Research, said no psychological inquiry can stop at the experiment.

“You have to go back and forth between the laboratory studies, which are very constrained and limited, and looking out the window,” he said.

Experts say it’s also important to avoid selling too good a story about behavioral science to policymakers and elected officials.

“It’s just really easy to sell them stuff if you have enough bravado,” Fischhoff said.

— — —

Any brain, even a commander in chief’s, has a difficult time with the large-scale empathy required to understand what launching a nuclear weapon means. “We can’t really perceive what it means to kill 30 million people,” Cerf said.

There is a long-standing psychological term for this: psychic numbing, coined by Robert Jay Lifton. Just because humans are intelligent enough to master destructive weapons “does not mean that we’re smart enough to manage them after they’re created,” said Slovic, whose research has extended the concept of psychic numbing.

Compounding this effect is the difficulty of paying appropriate attention to all important information. And that compounds with the tendency to make a decision based on one or a few prominent variables. “If we’re faced with choices that pose a conflict between security and saving distant foreign lives to which we’re numb because they’re just numbers, we go with security,” Slovic said.

Slovic has also researched factors that tend to make people — including presidents — more likely to favor a nuclear launch. In one experiment, for instance, he found that the more punitive domestic policies a person supported — like the death penalty — the more likely the person was to approve of using the bomb.

Other researchers, like Janice Stein, a political scientist at the University of Toronto, have looked into scenarios where military officers show a reluctance to pass information up the chain of command that may trigger a nuclear launch.

That actually happened in 1983, when Col. Stanislav Petrov’s command center near Moscow received data suggesting the United States had launched intercontinental ballistic missiles. Petrov thought it could be a false alarm and decided not to send the warning to his superiors. He was right. Because the colonel feared a nuclear war fought under false pretenses more than he feared not retaliating, a third world war did not begin.

— — —

In the past, Wellerstein says, nuclear launch plans have adapted to changing circumstances, philosophies and technologies. And presidents have changed the protocols because of fears that emerged in their historical moments: that the military would launch a nuke on its own, that the country would experience a nuclear Pearl Harbor or that an accident would occur.

Perhaps today’s fear is that individual psychology governs a world-altering choice. Given that, working to understand how brains might work in a nuclear crisis — and how they could work better — is worthwhile.

What comes after the science — how to change policy — is complicated, but not impossible. Nuclear protocols may have a sense of permanence, but they’re written in word processors, not stone.

“The current system that we have didn’t fall out of the sky fully formed,” Wellerstein said.

c.2023 The New York Times Company

DR STRANGELOVE; NO FIGHTING IN THE WAR ROOM

 

FRANKIE GOES TO HOLLYWOOD, TWO TRIBES