Monday, September 27, 2021

 

Emerging market for Tennessee hardwoods could take root


UTIA and TDA evaluate strengthening hardwood exports to Vietnam

Grant and Award Announcement

UNIVERSITY OF TENNESSEE INSTITUTE OF AGRICULTURE

IMAGE: The University of Tennessee Institute of Agriculture is teaming up with the Tennessee Department of Agriculture on a grant designed to increase the state's hardwood exports to Vietnam.

CREDIT: Image courtesy UTIA.

KNOXVILLE, Tenn. — Although a long-standing trade war and subsequent pandemic have dealt a crippling blow to the nation’s timber industry, a new opportunity is raising hopes in Tennessee. The University of Tennessee Institute of Agriculture is teaming up with the Tennessee Department of Agriculture on a grant designed to increase the state’s hardwood exports to Vietnam.

Tennessee is one of the top three hardwood lumber-producing states in the U.S. Exports have accounted for roughly 60 percent of the state's hardwood production, most of it destined for one market: China. The trade war with China, however, cost Tennessee suppliers more than $100 million in global sales, prompting this latest effort to diversify the state's market for forest products.

“I have always been inclined to research that could impact decision-makers and industry participants,” said Andrew Muhammad, UTIA professor and Blasingame Chair of Excellence in Agricultural Policy. “This project is an excellent research opportunity that will inform how Tennessee can increase wood product exports to Vietnam.” Muhammad will conduct a detailed market assessment and provide trade insights and guidance as it pertains to this emerging market.

Vietnam is emerging as a major exporter of finished wood products such as furniture and flooring. The number of finished wood product enterprises increased from 2,500 in 2008 to 4,500 in 2017 with exports valued at nearly $11 billion. This growth has increased demand for forest products from the U.S. and other exporting countries.

From a supply standpoint, hardwood lumber is one of the largest export commodities in Tennessee. Approximately 52 percent of the state is covered in forest, and 89 percent of that forestland is hardwood. Prior to the pandemic, forestry accounted for 3.5 percent of the state's economy and generated $24.3 billion in output. In 2017, exports of Tennessee forest products and furniture and related products outside the U.S. totaled $343.7 million.

Vietnam is already an important market for Tennessee forest products, ranking as the state's second-largest foreign market in 2019 with $25 million in sales. This grant effort would build upon and strengthen that relationship and provide an opportunity for the state to capture increased market share in Vietnam.

Funded by USDA’s Foreign Agricultural Service, the grant will also enable on-site trade missions to provide opportunities to better understand the needs, logistics and requirements necessary to anchor the supply chain.

Through its land-grant mission of research, teaching and extension, the University of Tennessee Institute of Agriculture touches lives and provides Real. Life. Solutions. utia.tennessee.edu.


 

Children and youth at low risk of severe acute COVID-19 during first part of pandemic: Canadian study


Peer-Reviewed Publication

CANADIAN MEDICAL ASSOCIATION JOURNAL

Children and youth may be at low risk of severe acute COVID-19, according to a study conducted during the first half of the pandemic and published in CMAJ (Canadian Medical Association Journal): https://www.cmaj.ca/lookup/doi/10.1503/cmaj.210053.

Researchers with the Canadian Paediatric Surveillance Program (CPSP) looked at hospitalizations of children with SARS-CoV-2 infection and factors for severe disease among children and youth admitted to hospital. The study included data on 264 children and youth with SARS-CoV-2 infection hospitalized across Canada between Mar. 25 and Dec. 31, 2020, and involved 2800 pediatricians. The data were collected before the Delta variant became dominant in Canada.

Of all the children and youth with SARS-CoV-2 infection admitted to hospital, 43% were admitted for other reasons — such as medical concerns unrelated to COVID-19 or for infection control purposes — and the infections were picked up incidentally.

"Our study shows that the clinical presentation and severity of disease caused by SARS-CoV-2 infection were different in children than in adults in the first part of the COVID-19 pandemic in Canada," writes Dr. Shaun Morris, co–senior author, infectious diseases physician at The Hospital for Sick Children (SickKids) and associate professor, Department of Paediatrics at the University of Toronto, with coauthors.

The authors initially expected that children and youth could be at higher risk for severe disease given what is typically seen with viral respiratory infections in the pediatric population.

The most common symptoms among the 150 children and youth admitted primarily for COVID-19 were fever (70%), vomiting (35%) and cough (34%). Half of the children and youth admitted to hospital with COVID-19 were described as having severe disease; 21% were admitted to the intensive care unit, and 13% needed cardiac or respiratory support beyond low-flow oxygen.

"Although children have recently been shown to have the highest seroprevalence of SARS-CoV-2 antibodies from infection among all age groups in Canada (3.3%), the relatively small number of pediatric hospital admissions highlights that children have less severe infection than adults even though they may be infected more frequently," writes Dr. Fatima Kakkar, co–senior author, infectious diseases physician at Centre Hospitalier Universitaire Sainte-Justine, and associate professor, Department of Paediatrics at Université de Montréal, Montréal, Quebec.

Children and youth with severe disease were more likely to have an underlying health condition such as obesity, and neurological and respiratory conditions (other than asthma). About half of those with severe disease had at least one comorbidity.

Infants and teenagers had higher rates of hospitalization than school-aged children. The authors suggest physicians may have been extra cautious in admitting infants, while teenagers may be at increased risk of infection and more likely to exhibit severe disease.

Deaths in children from COVID-19 were very rare, consistent with previously published studies.

"Overall, the results of this study serve to inform parents and policy-makers that severe acute disease in kids was rare during the study period," says Dr. Morris. "It is important to note that these study results reflect the burden of disease before the Delta variant."

While the Delta variant is known to be more infectious, it's not yet known whether it causes more severe disease in children or youth.

The authors advocate for continued monitoring in case of potential changes in COVID-19 epidemiology and better understanding of disease severity in healthy children and youth, as well as those with underlying health conditions.

In a related commentary https://www.cmaj.ca/lookup/doi/10.1503/cmaj.211513, Drs. Stephen Freedman and James Kellner, professors at the Alberta Children's Hospital Research Institute in the University of Calgary's Cumming School of Medicine, write, "Although, as shown by the authors of the related study, the consequences of acute COVID-19 in children were limited in the early phases of the pandemic, the direct and indirect impacts of SARS-CoV-2 infections in children must be considered when determining public health policies. These deliberations must integrate the short- and long-term impacts that public policy may have on the physical, mental and social well-being of children. While the light is visible at the end of the tunnel, children in Canada must continue to be protected as they may be the last ones to get there."

"Characteristics of children admitted to hospital with acute SARS-CoV-2 infection in Canada in 2020" is published September 27, 2021.

###

Stanford-led research reveals potential of an overlooked climate change solution


Reports and Proceedings

STANFORD UNIVERSITY

Earlier this month, President Biden urged other countries to join the U.S. and European Union in a commitment to slashing methane emissions. Two new Stanford-led studies could help pave the way by laying out a blueprint for coordinating research on methane removal technologies, and modeling how the approach could have an outsized effect on reducing future peak temperatures.

The analyses, published Sept. 27 in Philosophical Transactions of the Royal Society A, reveal that removing about three years' worth of human-caused emissions of the potent greenhouse gas would reduce global surface temperatures by approximately 0.21 degrees Celsius while reducing ozone levels enough to prevent roughly 50,000 premature deaths annually. The findings open the door to direct comparisons with carbon dioxide removal – an approach that has received significantly more research and investment – and could help shape national and international climate policy in the future.

“The time is ripe to invest in methane removal technologies,” said Rob Jackson, lead author on the new research agenda paper and senior author on the modeling study. Jackson is the Michelle and Kevin Douglas Provostial Professor of Energy and Environment in Stanford’s School of Earth, Energy & Environmental Sciences.

The case for methane removal

The relative concentration of methane has grown more than twice as fast as that of carbon dioxide since the beginning of the Industrial Revolution. Removing methane from the atmosphere could reduce temperatures even faster than carbon dioxide removal alone because methane is 81 times more potent in terms of warming the climate over the first 20 years after its release, and about 27 times more potent over a century. Methane removal also improves air quality by decreasing the concentration of tropospheric ozone, exposure to which causes an estimated one million premature deaths annually worldwide due to respiratory illnesses.

Unlike carbon dioxide, the bulk of methane emissions are human-driven. Primary culprits include agricultural sources such as livestock, which emit methane in their breath and manure, and rice fields, which emit methane when flooded. Waste disposal and fossil fuel extraction also contribute substantial emissions. Natural sources of methane, including soil microbes in wetlands, account for the remaining 40 percent of global methane emissions. They further complicate the picture because some of them, such as thawing permafrost, are projected to increase as the planet warms.

While development of methane removal technologies will not be easy, the potential financial rewards are big. If market prices for carbon offsets rise to $100 or more per ton this century, as predicted by most relevant assessment models, each ton of methane removed from the atmosphere could then be worth more than $2,700.
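The quoted $2,700 figure follows from pricing methane by its 100-year warming potency relative to carbon dioxide (about 27, as noted earlier in the article). A minimal sketch of the arithmetic, assuming offsets are priced per ton of CO2-equivalent:

```python
# Value of removing one ton of methane when offsets are priced in
# CO2-equivalents. Both inputs come from the article; the per-ton
# offset price is the article's illustrative $100 scenario.
carbon_price_per_ton_co2 = 100   # USD per ton of CO2
gwp_100_methane = 27             # tons of CO2-equivalent per ton of CH4 over a century

value_per_ton_methane = carbon_price_per_ton_co2 * gwp_100_methane
print(value_per_ton_methane)     # 2700 -- "more than $2,700" per ton removed
```

Using the 20-year potency of 81 instead would value each ton at $8,100, which is why the choice of time horizon matters so much in methane accounting.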

Envisioning methane removal’s impacts

The modeling study uses a new model developed by the United Kingdom’s national weather service (known as the UK Met Office) to examine methane removal’s potential impacts while accounting for its shorter lifetime than carbon dioxide – a key factor because some of the methane removed would have disappeared anyway. The researchers created a set of scenarios by varying either the amount removed or the timing of removal to generalize their results over a wide range of realistic future emissions pathways.

Under a high emissions scenario, the analysis showed that a 40 percent reduction in global methane emissions by 2050 would lead to a temperature reduction of approximately 0.4 degrees Celsius by 2050. Under a low emissions scenario where temperature peaks during the 21st century, methane removal of the same magnitude could reduce the peak temperature by up to 1 degree Celsius.

“This new model allows us to better understand how methane removal alters warming on the global scale and air quality on the human scale,” said modeling study lead author and research agenda coauthor Sam Abernethy, a PhD student in applied physics who works in Jackson’s lab.

From research to development

The path to achieving these climate and air quality improvements remains unclear. To bring it into focus, the research agenda paper compares and contrasts aspects of carbon dioxide and methane removal, describes a range of technologies for methane removal and outlines a framework for coordinating and accelerating its scale-up. The framework would help facilitate more accurate analysis of methane removal factors ranging from location-specific simulations to potential interactions with other climate change mitigation approaches.

Methane is challenging to capture from air because its concentration is so low, but burgeoning technologies – such as a class of crystalline materials called zeolites capable of soaking up the gas – hold the promise of a solution, according to the researchers. They argue for increased research into these technologies’ cost, efficiency, scaling and energy requirements, potential social barriers to deployment, co-benefits and possible negative by-products.

“Carbon dioxide removal has received billions of dollars of investments, with dozens of companies formed,” said Jackson. “We need similar commitments for methane removal.”


Jackson is also a senior fellow at the Stanford Woods Institute for the Environment and the Precourt Institute for Energy and chairman of the Global Carbon Project. Coauthors of the research agenda paper include Josep Canadell of the Global Carbon Project; Matteo Cargnello, an assistant professor of chemical engineering at Stanford; Steven Davis and Chaopeng Hong of the University of California at Irvine; Sarah Féron, a postdoctoral fellow in Earth system science at Stanford at the time of the research; Sabine Fuss of Humboldt Universität in Germany; Alexander Heyer and Hannah Rhoda, PhD students in chemistry at Stanford; Edward Solomon, the Monroe E. Spaght Professor of Humanities and Sciences at Stanford and professor of photon science at SLAC National Accelerator Laboratory; Maxwell Pisciotta and Jennifer Wilcox of the University of Pennsylvania; H. Damon Matthews of Concordia University in Montreal; Renaud de Richter of Ecole Nationale Supérieure de Chimie de Montpellier in France; and Kirsten Zickfeld of Simon Fraser University in Canada. Coauthors of both papers include Fiona O'Connor and Chris Jones of the Met Office Hadley Centre.

Both papers were funded by the Stanford Woods Institute for the Environment's Environmental Venture Projects program, the Gordon and Betty Moore Foundation, the Natural Sciences and Engineering Research Council of Canada and the Joint UK BEIS/Defra Met Office Hadley Centre Climate Programme. The paper led by Sam Abernethy was also funded by the Stanford Data Science Scholars Program and the European Union's Horizon 2020 Crescendo Project.

-30-

Scoping Out the Gulf of Mexico’s Secret Submerged Forest

Understanding what happened to this ancient forest can help us know what is coming as the sea rises 
again.


A stand of bald cypress trees lies buried in the Gulf of Mexico off the Alabama coast. 
The trees contain information about what the environment was like in the Pleistocene. 
Photo by Tim Graham/Alamy Stock Photo

September 27, 2021 

On a ship floating 13 kilometers off the Alabama coast, a group of researchers led by Kristine DeLong, a Louisiana State University paleoclimatologist, coaxed a slim pipe from the bottom of the Gulf of Mexico. It contained a column of sand, peat, ancient gunk, and, they hoped, information about the bald cypress forest submerged 18 meters below the surface. They peered at the bottom of the core. Splinters of wood poked through—a sure sign, says DeLong, that they were above the forest.

This past July, DeLong and her team had returned to Alabama’s sunken forest—a site few have visited, and the location of which is a guarded secret so that the submerged wood is not salvaged or destroyed. The expedition was their third to the site after having last visited in 2016. The forest offers a rare glimpse into the late Pleistocene, an epoch known for its megafauna, kilometer-thick ice sheets, and extreme climatic change. Since the forest was first revealed after a hurricane in 2004, it has faced similar storms, as well as the slower threats of erosion and decomposition, and DeLong and her colleagues were eager to see what had changed since their last visit.

The trees on the bottom of the Gulf of Mexico are bald cypresses. Known for its feathery foliage and knobby knees, the bald cypress is a familiar tree in the southeastern United States. It's long-lived and grows consistently, so it is useful for tree ring analyses.

The trees submerged in the Gulf are thought to have lived and died between 42,000 and 74,000 years ago. This was a time of upheaval: as the ice sheet that covered North America grew and retreated, the sea level fell and swiftly rose, then fell again. These changes were rapid, with the sea level sometimes rising or falling by tens of meters in as little as 1,000 years. Rising seas carried sediment that buried the trees, but exactly how remains uncertain.

The team’s working hypothesis is that the trees experienced a mass die-off before sediment flooded the swamp where they grew. Analyses of previously collected cores and wood samples show the trees grew straggly rings—a sign of stress—around the same time. Then sediment-laden water, either from the rising sea or from glacial meltwater coursing down the Mississippi River, entombed the site.

The site’s preservation resulted from a perfect storm of conditions spanning 70,000 years. Most shorelines dating to the end of the last ice age have been eroded. That this forest was preserved intact implies “a rate of sea level rise that’s high enough that it allows for lots of sediment to bury that area really quickly,” says Emily Elliott, a University of Alabama coastal geologist not involved with the research.

The forest has not yielded its secrets easily. In fact, scientists did not know it existed at all until Hurricane Ivan tore overhead in 2004. The passing storm shoved off the heavy blanket of sand that had preserved the trees for so long. After the storm, a fisherman was surprised to find an abundance of fish. He asked some diver friends to investigate, and they reported tree stumps scattered across the seafloor. The fish had colonized the submerged forest like a coral reef. It would be nearly 10 years before DeLong and her colleagues first visited the site and obtained samples of this ancient forest.

On their expedition this past summer, DeLong’s group re-mapped the seafloor by sailing back and forth while scanning with sonar. “It’s a long day of mowing the yard on the ocean,” DeLong says. The team worried that Hurricane Sally, which had ripped through the Gulf in September 2020, had damaged or reburied the trees. Their mapping showed the storm had indeed displaced sand, thinning it in some places.

Using their map to identify where the sand was shallowest, and where they were most likely to be able to access Pleistocene sediment, the scientists began taking cores.

The new data these cores reveal will help the scientists tackle outstanding questions, such as what the Gulf Coast’s climate and ecology were like in this distant past, how the region responded to climate change, and how the site was buried. Understanding the forest’s burial offers insights into future climate change in the Gulf. Speaking of DeLong’s latest analysis, Elliott says, “one of the things that is amazing to me is how timely it is.” The site represents one of the best looks at coastal transitions during rapid sea level rise, she says. That can inform climate models predicting different future scenarios.

With climate change, “we’re very worried about Antarctica and Greenland losing their ice,” DeLong says. “What happened during the ice ages is a great example of how quickly an ice sheet can melt.”

The forest’s history can also help the researchers develop a model to locate other buried sites. More patches of ancient forests may be hidden in the Gulf of Mexico, waiting to be discovered. Identifying such sites is crucial for coastal managers, like the Bureau of Ocean Energy Management, which signs off on oil, gas, and wind activity and funds the team’s research.

While the researchers analyze their newly extracted cores, the drowned forest, no longer protected by sediment, will continue to erode. Anemones will bloom from the tree trunks; future hurricanes will brew in the Gulf. The forest is endangered by its own fame. Presumably interested in its novelty, one furniture company has already submitted requests to salvage the ancient wood. DeLong says onlookers tailed their scientific vessel when they undocked, hoping to find the hidden forest. To throw off curious boaters, they took a winding course at sea. A bill proposed in 2020 would establish the site as a national marine sanctuary, protecting it—and potential discoveries—for years to come.
Russia, U.S. plan to make more movies in space


Russia plans to launch an actress and film director to the International Space Station aboard a Soyuz rocket like this Oct. 5 to make a full-length feature film in space. File Photo by Bill Ingalls

Sept. 27 (UPI) -- Russia and the United States are ready to cross new frontiers for filming movies in space as a way to promote growing commercialization of orbital spaceflight and beyond.

Russia's space agency, Roscosmos, plans to launch a Russian actress, film director and cosmonaut to the International Space Station early next month to produce the first full-length feature film shot in space, with a working title of The Challenge.

Russian film director Klim Shipenko and actress Yulia Peresild are to spend 12 days in orbit, during which 10 days will be devoted to shooting the film.

Russia's TASS news agency describes the plot as a thriller about a doctor (Peresild) traveling suddenly to the space station to save a dying cosmonaut.

Peresild and Shipenko also trained quickly for their mission, reflecting the urgency in the script, TASS noted.

Trained cosmonaut Anton Shkaplerov will pilot the mission, the first in decades to see three Russian nationals flying together.

"The space station is like a big house that consists of over 15 modules. At least seven people always stay there and when we arrive, there will be three more," Shkaplerov told TASS.

Roscosmos announced the mission just after former NASA Administrator Jim Bridenstine tweeted in May 2020 that actor Tom Cruise would fly to the space station for a movie. But no date has been announced for a Cruise mission.

"More and more movies and videos will be shot in space as the price of launches falls due to competition from firms like SpaceX and Blue Origin," James Neihouse, a long-time IMAX movie cinematographer who has trained astronauts to shoot film in orbit, told UPI.

"The question is, if you've got a good story, do you really need to go to space for filming?" Neihouse said. "We have so many good films filmed with CGI [computer-generated imagery], and by using airplane flights to simulate zero gravity, that flying actors to space for up to $60 million per seat may not be necessary."

In the meantime, NASA has begun intense planning to show off planned Artemis moon missions by using numerous high-definition cameras.

While there's no firm launch date for Artemis missions, Russia is close to launching its movie endeavor.

The mission is scheduled to lift off on the Russian Soyuz MS-19 spacecraft Oct. 5 at 4:55 a.m. EDT from the Baikonur Cosmodrome in Kazakhstan.

Russia previously tried to mount a similar space movie mission in 1998, but didn't raise the money required, said Jeffrey Manber, CEO of Houston-based space firm Nanoracks, who formerly worked for Russian space company Energia.

"I am sure that having NASA announce a movie project probably made it easier for them in Russia to raise the money," Manber said in an interview. "We may not be in a Cold War anymore, but Russia still wants to notch another first in space. Competition does that."

While Russia's mission has some private backing by a Russian movie studio, Yellow Black and White, and the nation's Channel One Russia TV station, it doesn't represent true commercialization of space like the United States has seen recently, Manber said.

"True commercialization of space would mean diffusion of power at Roscosmos and decentralization -- meaning many smaller organizations would have authority and funding," Manber said. "That just not happening or encouraged in Russia today."

Roscosmos also plans to air a reality TV series about training Peresild and Shipenko for a spaceflight, said Kaylin Land, course lecturer on Russian Studies at Montreal-based McGill University.

That would be similar to an ongoing Netflix series about the training and experience of the all-civilian SpaceX Inspiration4 crew, who circled the Earth for nearly three days ending Sept. 18.

It's likely both the United States and Russia will see more civilians in space, said Land, who has an interest in Russian film.

"Roscosmos has stated that this is meant to be an educational project and that they are learning how to prepare non-professional cosmonauts in a short amount of time, presumably for future flights," Land said.

"What makes the space movie cosmonauts different is that they are being trained quickly and presumably are getting paid to perform in space."
NASA plans to launch climate change-tracking Landsat 9 satellite



NASA's Landsat 9 satellite is encapsulated inside a nosecone for launch from California. Photo courtesy of NASA

Sept. 27 (UPI) -- NASA plans to launch one of its most high-tech Earth observation satellites to date Monday from California to help track climate events that range from California wildfires to deforestation of the Amazon.

A United Launch Alliance Atlas V rocket is scheduled to carry the 5,900-pound spacecraft into orbit from Vandenberg Space Force Base at 2:11 p.m. EDT.

Landsat 9 is the ninth in a series that NASA began to launch in 1972, marking a five-decade partnership between the space agency and the U.S. Geological Survey.

It is intended to provide a continuous record of climate change, urban area growth, glacial melt, cropland health and other phenomena.

"Landsat is our longest-lived remote sensing program," Jeff Masek, project scientist at NASA's Maryland-based Goddard Space Flight Center, said during a press conference Friday.

"Since 1972, it has amassed over 9 million multispectral images of Earth's land in coastal regions."

The new Landsat satellite will join sister satellite Landsat 8 in orbit "to continue collecting images from across the planet to monitor essential resources including crops, irrigation water and forests," Masek said.

Landsat 9, which cost some $900 million, will complete surveys of important coastal areas every eight days at a height of 438 miles -- far above the International Space Station.

The near-polar orbit will be synchronized with the sun's daylight to ensure well-illuminated imagery, Masek said.

NASA delayed the launch by more than a week due to regional shortages of super-cooled liquid propellant fuels prompted by use of liquid oxygen to treat COVID-19 patients at hospitals.

The satellite carries two powerful imaging instruments to measure visible, reflected light and surface temperatures. Using those data points, NASA can establish a record of plant health, surface composition and the loss or addition of water or rainfall.

Northrop Grumman, which designed and built the spacecraft, was responsible for integrating the two instruments.

Without Landsat, the world wouldn't know about the large-scale deforestation of the Amazon in Brazil due to agricultural expansion there, said Inbal Becker-Reshef, director of NASA's Harvest food security and agriculture program.

"Landsat's long-term record has enabled us to track the expansion of such cropland in Brazil by a factor of 2.6 times from 1985 to 2018," Becker-Reshef said.

Humanitarian organizations use Landsat data to help make key decisions ... to manage impending crop shortfalls, she said.

The growing climate change crisis makes Landsat more important than ever, said Sabrina Chapman, a system engineering manager with Northrop Grumman.

"My favorite thing about Landsat is ... monitoring the changes over time due to climate change. That's very important thing in our world right now," Chapman said.

According to a Landsat 9 fact sheet, NASA is responsible for the instruments and the spacecraft, mission integration, launch and on-orbit checkout. USGS is responsible for the flight system, flight operations, and data processing and distribution.

Landsat 9, which has an expected lifespan of at least 10 years, is to move into the orbit of Landsat 7, which will be decommissioned.

Airbus to help build Mexican Moon-mining automata

Commercial partner's ad-funded expedition plans the ultimate pop-up in 2022
Mon 27 Sep 2021 

Airbus and the Mexican Space Agency (MSA) have agreed to collaborate on tech to extract resources on the Moon.

The aerospace company will work with MSA and a startup called Dereum Labs to develop "an end-to-end process from regolith identification and capture to extraction of resources".

The three plan to create a "ground demonstration concept" for whatever hardware they determine is needed to "extract oxygen and metal, or to mine water".

Dereum is an – ahem – interesting outfit: the Mexican company plans to launch a pair of rovers to the Moon in 2022, and to fund the expedition with advertising.

But the company's web site and social feeds offer no details on who will launch the rovers. And the company Twitter feed recently featured CAD designs for the rovers, without suggesting the devices' final configuration has been decided or that they've been built. Indeed, images of the rovers posted by Airbus depict an entirely different design.


Airbus' moon resource extraction concept.

Airbus is much closer to reaching Luna, as its Ariane launchers have proven reliable and it has a strong record building satellites, cargo craft that have reached the International Space Station, and components of the ISS itself. The company is also involved in an ESA project to orbit comms satellites around the Moon.

But no delivery date for the concepts the three entities plan to create has been mentioned.

Airbus and Mexico are not alone in seeking tech to extract resources on the Moon. The Artemis Accords governing NASA's forthcoming lunar missions include a legal framework to make moon mining permissible, and Japan has signed up to the pact. China has already started surveying the Moon for resources it hopes to use in situ when its planned lunar base welcomes its first crew. India's Chandrayaan-2 moon mission also had resource surveys as one objective. ®
Should scientists run the country?

Covid has put academics like Chris Whitty and Patrick Vallance at the heart of policymaking, but electing better politicians could be the answer


Illustration by Elia Barbieri


Philip Ball
Mon 27 Sep 2021 

How many lives would have been saved in the pandemic if the UK government had truly “followed the science”? The question is unanswerable but hardly academic. We cannot accurately quantify how many lives were lost by the politically driven delays to lockdown in the first and second waves, but the number is not small.

So would we have done better simply to put scientists in charge of pandemic policy? Might we hand over climate change policy to them, too? In fact, would their evidence-based methods make them better leaders all round?

How much say scientists should have in running society has been debated since the dawn of science itself. Francis Bacon’s utopian Bensalem in his 1626 book New Atlantis is a techno-theocracy run by a caste of scientist-priests who manipulate nature for the benefit of their citizens. Enthusiasm for technocracies governed by scientists and rooted in rationalism flourished between the world wars, when HG Wells advocated their benefits in The Shape of Things to Come.


But while post-second world war issues such as nuclear power, telecommunications and environmental degradation heightened the demand for expert technical advice to inform policies, the UK government’s first official scientific adviser, Solly Zuckerman, appointed in 1964 by Harold Wilson, stressed the limits of his role. “Advisory bodies can only advise,” he said. “In our system of government, the power of decision must rest with the minister concerned or with the government as a whole. If scientists want more than this then they’d better become politicians.”

That remains the common view today: scientists advise, ministers decide. “The implicit contract,” says the Conservative peer David Willetts, a former minister of state for universities and science, “is that the scientists get to have their voice heard, and in return they accept that ministers will ultimately decide on what should be done.” He considers the view (often credited to Churchill) that “scientists should be on tap but not on top” to be “the right model in a democracy”.


The power of decision must rest with the minister. If scientists want more than this then they’d better become politicians
Solly Zuckerman

But the equation was never that simple. For one thing, in a democracy people have a right to know on what basis decisions are being made: scientific advice can’t happen behind closed doors. After the shambolic BSE crisis of the 1990s, when the minister of agriculture, John Gummer, asserted without scientific justification that British beef was safe to eat (and tried to enlist his reluctant daughter to prove it), a public inquiry concluded that it is vital that science advice to government be transparent and open, and that scientific advisers be able to communicate directly with the public so that people could assess whether what ministers claimed was true. That right was vigorously asserted by Sir David King when he advised the Blair government on the foot-and-mouth epidemic and on nuclear power.

It was a perceived initial lack of transparency in the Scientific Advisory Group for Emergencies (Sage) at the start of the Covid crisis that led King to establish Independent Sage as an alternative, public-facing source of expert advice. The pandemic has also highlighted the tightrope that chief medical and scientific advisers such as Chris Whitty and Patrick Vallance must walk. As civil servants, they are duty bound to support the government, and their careful chaperoning by ministers at press briefings led to questions about their independence. When government policy began to diverge markedly from scientific advice during the second wave, the tension was palpable. If a chief medical officer believes that a government policy poses a public health hazard or is downplaying dangers, then where should their allegiance lie?

There is now a strong case for reconsidering the constraints placed on scientific advisers: the top/tap dichotomy fails to acknowledge their broader responsibilities, especially in the face of irresponsible or incompetent governance. And while the idea that they refrain from explicit policy recommendations (which include value judgments) makes sense in normal times, Jonathan Birch of the London School of Economics has proposed that a mode of “normatively heavy” advice that does include such recommendations – perhaps unconditionally (“Do this”) – is warranted in crisis situations. “Different norms apply to scientific advisers in extremis,” he argues.

What’s more, the “on tap” model assumes a view of scientific objectivity that has long since been exploded by experts on the social roles of science. “The idea that scientists can speak truth to power in a value-free manner has emerged as a myth,” wrote the social scientist Sheila Jasanoff in 1990.

For one thing, scientists who join the mechanism of government but imagine they can operate untrammelled by political influences are fooling themselves. The landscape of options considered and modelled by Sage was set not by scientific considerations but by political diktat. As Sage member John Edmunds has said: “The politicians came up with [the] strategy and our job was to make it work” (the strategy here being the fateful “controlled herd immunity” scheme). And modellers predicting the consequences of the full relaxation of restrictions in July did not compare against the baseline scenario of keeping remaining restrictions in place, because they were not asked to do so. Whitty and Vallance must, meanwhile, have recognised that Dominic Cummings’s violation of lockdown rules had implications for public trust and compliance; their silence on the matter was not “staying out of politics”, but itself a political decision.

In its obligation to embrace fallibility and uncertainty, science is antithetical to the current mode of politics in which admissions of doubt and error are regarded as weakness. Yet it is precisely because of those attributes that science is vulnerable to exploitation for political agendas. Studying US policies on cancer risks, Jasanoff concluded that the adversarial style of regulatory decision-making polarises scientific opinion and prevents the resolution of disputes. “Far from promoting consensus, knowledge fed into such a process risks being fractured along existing lines of discord,” she wrote three decades ago. Don’t we know it now.

That consideration exposes, too, the fundamental problem with any notion of “rule by science” – we have to ask: “Which science?” Where there is lack of scientific consensus, science risks becoming a tool not for informing but for justifying policies. One of the most striking aspects of the denialist movements around Covid-19, vaccines and climate change is how “sceptics” position themselves as the true rationalists, parading cherrypicked data in support of fringe views. And they can always find “experts” with superficially plausible qualifications (including Nobel prizes) to support them, just as Johnson could convene a panel of lockdown-sceptic scientists to justify his procrastination last autumn.

Democracy cannot dominate every domain – that would destroy expertise – and expertise cannot dominate every domain – that would destroy democracy
Harry Collins, Robert Evans

But even good-faith experts will disagree, not least because different disciplinary expertise creates different perspectives. The problem is rendered worse by the persistent hierarchy of the sciences that privileges the “hard” disciplines – virology over social sciences, say. Technocrats prefer “hard” fixes: witness how in China, leaders such as Hu Jintao, trained as engineers, sought solutions to social problems such as water resource management in gargantuan techno-projects. Some say our pandemic response was too much led by “hard” epidemiological modelling and lacked adequate input from public health experts.

So the choice of “expert” matters hugely. Cummings’s enthusiasm for more science-based policymaking sounded all very well until you recognised his tendency to capriciously anoint handpicked “geniuses” (sometimes mavericks). His reliance on the mathematician Tim Gowers to see why the herd immunity policy in early 2020 was “catastrophically wrong” was arbitrary and opaque to scrutiny. Gowers happens to be very smart (and was right), but plenty of experts in public health and epidemiology were already screaming into the void about that mistaken plan.

In the end, we rightly elect politicians to make decisions and judgments, and not simply to enact what experts or data seem to dictate. As the sociologists Harry Collins and Robert Evans have put it: “Democracy cannot dominate every domain – that would destroy expertise – and expertise cannot dominate every domain – that would destroy democracy.” As a scientist, I don’t want to see scientists on top or on tap. Mature leaders, irrespective of their training, who respect science for what it is – a social system for arriving at reliable but contingent knowledge, based on data, embracing error and uncertainty and diversity of opinion – will not struggle to put it to good use. All we need to do is elect them.
Editorial: In light of censure, U of T must take action to support free speech — particularly speech about Palestine

Amid heartbreaking violence, The Varsity stands with the Palestinian community and urges U of T to support the free speech that raises awareness


By The Varsity Editorial Board

A pro-Palestine protest at Nathan Phillips Square in Toronto.
 SAMANTHA HAMILTON/THE VARSITY

Resignations, cancellations, severed connections and partnerships.

These are the impacts of the Canadian Association of University Teachers’ (CAUT) censure of U of T, following the university’s failed efforts to dissipate the months-long hiring scandal at the Faculty of Law’s International Human Rights Program (IHRP).

The controversy began last September over allegations that Dr. Valentina Azarova was denied the position of director of the IHRP after a university donor suggested that she would be an unwelcome choice, due to writings of hers that were critical of Israel’s policies toward Palestine. U of T has denied these allegations since they first arose.

The scandal bubbled beneath the surface for many months as U of T’s efforts to move on were moderately effective. Though individuals most concerned by it — such as law faculty professors and former IHRP directors — remained outspoken, the scandal had not fully caught the attention of the community at large.

That is, until the CAUT announced a rare censure of the university, and its members listened. Suddenly, countless events that would have improved the quality of learning and community experience at the university were cancelled. Entire groups, such as Amnesty International and Citizen Lab, have cut ties with U of T over the censure.

Support for the censure appears to grow by the day, as entire departments of the university — most recently, the School of the Environment — are expressing their support.

So what happens now?

The university has already attempted to put the matter at rest by commissioning an “independent” and “transparent” review, which has been criticized for being neither.

The Varsity has published many articles on the controversy as it has unfolded over the past year, and we’ve reported on the criticisms levied at the university by faculty and by the law community almost every step of the way. At any of these points, the university could have stopped its doomed attempts to sweep the scandal under the rug and instead made substantive changes. For example, when met with criticism that the review process would be ineffective and lack transparency, U of T could have created a review body composed of multiple diverse individuals, rather than tasking one former Supreme Court judge with the investigation.

As a result of the review, U of T has committed to creating guidelines around external attempts to interfere in hiring processes, and to review suggestions that academic freedom protections be implemented for certain managerial positions, such as law clinic directors.

Still, little attention seems to be paid to the many U of T community members, especially faculty, who wholeheartedly support the censure and have been calling for action from the university for weeks. It is clear that this will not end without substantive efforts made by the university.

The most prominent solution circulating — and the solution that the CAUT cites as a requirement to end the censure — is that U of T should re-offer Azarova the position. This idea has been offered for months, yet U of T has not acted upon it. U of T responded to this suggestion by saying that Azarova is welcome to apply for the position again following a review of the program as a whole.

So far, the university has not taken any productive action following the censure — it has not even taken responsibility for the inadvertent effects of its inaction on the campus community. When the CAUT censure was announced, the university’s response was that it disagreed with the decision and that the CAUT had no jurisdiction over the case. Instead of taking the CAUT’s censure as a sign that something was wrong, the university merely attempted to pretend that everything was ‘business as usual.’

Following the censure, a U of T spokesperson wrote to The Varsity: “We remain committed to academic freedom for academics, including academic administrators, and to search processes that are confidential and insulated from external pressures whatever their source.”

Despite its statements, we echo the calls for U of T to end the scandal by finally taking responsibility and accountability for any wrongdoing, and by expressing an interest in true transparency so that the university may begin to repair its reputation.

Supporters of the censure have also been trying to shift the narrative to refocus around Palestinian rights, rather than just academic freedom, as the escalation of the censure has also coincided with escalating violence between Israel and Palestine.

The central question of the scandal is really whether U of T will unequivocally support individuals’ right to speak freely on Israel and Palestine. Ensuring this right to free speech is also a moral imperative right now, as Palestinians have been disproportionately killed by Israel over the past two weeks.

Violence erupted between the two sides after an Israeli court decided to forcibly remove Palestinian families from the East Jerusalem neighbourhood of Sheikh Jarrah — a decision that the United Nations called a potential war crime — in addition to Israeli raids on al-Aqsa mosque that injured many Palestinians.

While there have been casualties on both sides, the effect on Palestinians has been far greater. In addition, there is an extreme imbalance of power between Israel and Palestine that makes it impossible to accept that “both sides” are suffering equally. Israel has an incredibly well-funded military, and the Palestinian people lack the same rights as Israeli individuals.

These developments also did not appear out of nowhere. Israel has been occupying Palestine and displacing Palestinians for decades, which has been widely criticized by the international community.

During the most recent fighting, Israel also targeted the offices of the Associated Press and Al-Jazeera in Palestine, claiming it was an attack on Hamas, rather than the media. As journalists, these developments are extremely concerning.

The Varsity stands in solidarity with Palestine and Palestinian community members at U of T. Over the next few weeks — and throughout this volume of The Varsity — we hope to increase our reporting on what the violence in Palestine means for our community members, as well as to make room for community members to write their own stories and experiences.

This is in line with an open letter that The Varsity has signed, along with hundreds of other prominent journalists and news organizations in Canada, demanding better and more nuanced coverage of the Israel-Palestine conflict from Canadian newsrooms.

In addition, The Varsity has donated $200 to Save the Children, a non-governmental organization working in Palestine, as 58 children in Gaza and two in Israel have been killed. We do so out of a sincere belief that anyone — regardless of their politics, background or religion — can sympathize with the tragedy of violence that takes the lives of innocents.

Yet, despite the stance expressed here, The Varsity remains committed to a Comment section that is open to all well-meaning U of T community members. Students may write for the section from any position that is fair, well-reasoned, and based on evidence.

In light of recent events, at minimum, the university needs to affirm its commitment to protecting speech about Palestine, as that is the fundamental question of the IHRP hiring scandal: will U of T protect speech that is critical of Israel, even when pushed back upon?

The Varsity calls on U of T to finally listen to what its community has been saying, to take action and accountability, and to affirm a commitment to free speech for all community members — but particularly for the Palestinian people who have felt silenced for so long.

The Varsity’s editorial board is elected by the masthead at the beginning of each semester. For more information about the editorial policy, email editorial@thevarsity.ca.
Opinion: Treating science like politics is a recipe for disaster

Scientific community should invest in better public communication

By Kuorosh Rezaei
SCIENCE
SEPTEMBER 26, 2021
THE VARSITY
The University of Toronto’s Student Newspaper Since 1880
The politicization of science can be dangerous.
 COURTESY OF NATIONAL CANCER INSTITUTE/UNSPLASH

Science with a dab of politics is a deadly cocktail. If you ever wondered what it tastes like, look no further than the response to COVID-19. Looking back at the pandemic response in the past 18 months in Canada and the US, I wonder how things would be different if science hadn’t been politicized and people had listened to scientists. Could we have prevented the pandemic if we had listened to the scientists who have been warning us for decades about the emergence of pandemic-causing pathogens?

The politicization of science happens when scientific data is intentionally ignored, suppressed, misinterpreted, or cherry-picked because of political considerations and agendas, leading to a situation where support for science is divided along political lines.

When science gets politicized in a situation like the current pandemic, it causes unnecessary harm because it impairs our ability to take the best course of action. However, there is an even bigger social consequence to the constant politicization of science. In the long run, it erodes public trust in science and scientific institutions. Loss of public trust will be detrimental to our society because we need science now more than ever to tackle the challenges of the current pandemic and the imminent climate crisis.

Politicization of climate science

The partisan divide on climate science between Republicans and Democrats in the US is one of the most prominent examples of the politicization of science. Despite an abundance of scientific data regarding human-made global warming, Republican politicians have repeatedly cast doubt on the existence of scientific consensus, using talking points such as “the science of climate change is not settled,” as Business Insider reports.

Among the common tactics politicians use to cast doubt on climate science is cherry-picking data and misrepresenting it to argue that climate models are unreliable. In fact, this is what the Republican representative Lamar Smith did in 2016. Ironically, he was the chairman of the House Committee on Science, Space, and Technology at the time.

The politicization of science is not just an American phenomenon. Stephen Harper’s term as prime minister saw multiple accounts of the government suppressing scientific data and restricting environmental scientists from talking to the press and journalists. For instance, according to the CBC, scientists from Environment Canada published a paper in 2011 concluding that a rise of two degrees Celsius in global temperature would be unavoidable by the year 2100. However, Environment Canada’s media office never granted interviews with the researchers. There were other similar cases of government interference in science.

Politicizing COVID-19

Playing politics with science is always a bad idea. It is especially dangerous in times of global disasters, when we need science more than ever to inform policies. One reason the US became an epicenter for the virus, with hundreds of thousands of deaths, was that, in the early days of the pandemic, the government largely ignored and downplayed scientists’ warnings about the virus. This decision was motivated more by politics than by science.

Politicians tend to ignore scientific data in Canada too. For example, in February 2021, just after the end of the second wave of COVID-19 in Ontario, the Ontario government started reopening businesses and public spaces. This decision was taken despite strong warnings by public health experts that this policy would lead to a third wave. As expected, cases began to skyrocket and, by early April, an emergency shutdown was issued. It is likely that many lives could have been saved if the Ontario government had prioritized science over public opinion and politics.

Treating science like politics creates mistrust

Looking at the US is the best way to understand how constant politicization of science can erode public trust. During the pandemic, many scientific aspects of public health wrongfully became the subject of political debate. This included debate about the wearing of masks, the origin of SARS-CoV-2, and proper treatments for COVID-19.

Because of the constant politicization of science, for many Americans, the line between science and politics disappeared along with public trust in science. Signs of this mistrust in science and scientific institutes can be seen in the high rates of vaccine hesitancy. Although the US Food and Drug Administration (FDA) has fully approved the Pfizer-BioNTech COVID-19 vaccine, millions of Americans have not gotten vaccinated because they do not trust the FDA and are concerned about the safety of the vaccine.

Scientific literacy and effective science communication could make society more immune to the negative consequences of mistrust in science. A scientifically literate public is more likely to look at scientific data and make decisions based on evidence instead of adopting beliefs and behaviours based on politics.

Science communication needs to change. Simply providing numbers and facts is not sufficient. There is ample data showing the effects of human-caused global warming; however, the topic is still a subject of debate, and many individuals remain hesitant about the urgency of the climate crisis. Similarly, many people still believe not only that COVID-19 is not a serious pandemic, but that it is a hoax altogether. There should be radical change in the way that science is communicated to the public, because the current model is not working.