It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Sunday, September 24, 2023
Companies in Germany are adapting to the consequences of extreme weather events
by Jutta Gröschl, Institut für Mittelstandsforschung (IfM) Bonn
German executives are most concerned about the impact of heat and heavy rain on their companies' operations, but they are also aware of the consequences of climate change for their value chains. This is revealed by a new IfM survey of more than 1,300 executives, released during the Federal Ministry for the Environment's current Climate Adaptation Week.
In the last five years, many entrepreneurs in Germany have experienced the consequences of climate change: extreme weather events have already impacted their business activities. Companies in agriculture, energy and water supply, and the hospitality industry were particularly affected. According to a survey by IfM Bonn, managers of small and medium-sized enterprises (SMEs) and large companies alike perceive the effects of heat waves and heavy rainfall as particularly threatening.
How intensively a company engages with climate risks depends largely on the personal views of its owners, managers, and employees. Media coverage, as well as information and awareness-raising activities by policymakers or industry associations, plays a much smaller role.
Companies see risks in their value chains
Regardless of whether they are affected directly, companies in Germany may also feel climate impacts in the regions of their suppliers or customers. Although companies do not consider the risk of a supply-chain disruption to be low, only a minority are dealing intensively with the possible effects of climate risks on their suppliers or customers.
Concrete risk reduction measures are mainly implemented in energy supply and customer diversification. The results suggest that small and medium-sized enterprises have significantly less scope for action than large companies, particularly regarding supply chain risks, due to their lower purchasing volumes and resources.
Researchers at São Paulo State University (UNESP) in Brazil have developed a strategy for removing glyphosate, one of the world's most frequently used herbicides, from water. Inspired by the concept of the circular economy, the technique is based on sugarcane bagasse, a waste material produced by sugar and ethanol plants.
"Isolated and chemically functionalized sugarcane bagasse fibers can be used as adsorbent material. Glyphosate adheres to its surface and is removed as a water contaminant by filtration, decantation or centrifugation," Maria Vitória Guimarães Leal, told Agência FAPESP.
She is the first author of an article on the research published in the journal Pure and Applied Chemistry. Adsorption is a process whereby molecules dispersed in a liquid or gaseous medium adhere to a solid insoluble surface, which is typically porous.
Owing to its low cost and high potential to raise crop yields, glyphosate is widely used to control the growth of unwanted plants, such as weeds, invasive species and agricultural pests, but scientific studies have shown that it can be a human health hazard and in particular may pose a cancer risk.
Application of glyphosate-containing products is restricted or banned in Austria, Bulgaria, Colombia, Costa Rica, Denmark, El Salvador, Germany and Greece, among other countries. In Brazil, however, annual use of such products averages 173,150.75 metric tons. Some of this is carried by rain into rivers, wells and other aquatic environments.
Scientists at UNESP's School of Sciences and Technology (FCT) in Presidente Prudente found a way to remove glyphosate products from water in research led by postdoctoral fellow Guilherme Dognani and Aldo Eloizo Job, a professor at FCT-UNESP.
How it works
Dognani explained the procedure. "The bagasse is shredded and the cellulose isolated by separating it from the hemicellulose and lignin. The cellulose fibers are then functionalized by adding quaternary ammonium groups to their surface so that the material is positively charged. The resulting cationic cellulose microfibers bind easily to glyphosate," he said.
Leal added that adsorption depends on certain conditions, notably pH, which was the focus of the study. "When pH is varied, both the adsorbent material and the glyphosate display different molecular configurations. The most efficient level for interaction between them, inducing the most adsorption and hence optimal removal, is pH 14," she said.
To evaluate adsorption capacity, the researchers prepared fractions of a glyphosate solution with pH 2, 6, 10 and 14, measured using a pH meter. They then added to each fraction identical amounts of functionalized cellulose microfiber.
The flasks containing the glyphosate-contaminated solution plus cellulose were agitated for 24 hours. In accordance with the procedure described in the literature, they were then heated in a water bath until the reaction occurred, cooled to room temperature and analyzed by visible light spectrophotometry. Removal efficiency was calculated from the initial and final glyphosate concentrations in each sample, and adsorption capacity was calculated as a function of pH.
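For readers who want to see the arithmetic, here is a minimal Python sketch of the two quantities described above, using the conventional definitions: removal efficiency as the fraction of the initial concentration removed, and adsorption capacity as the mass adsorbed per gram of adsorbent. All concentration, volume, and mass values below are hypothetical, not the study's data.

```python
# Minimal sketch: removal efficiency and adsorption capacity vs. pH.
# All numbers are hypothetical, for illustration only; the study's
# actual values come from spectrophotometric measurements.

c0 = 50.0          # initial glyphosate concentration, mg/L (assumed)
volume_L = 0.05    # solution volume, L (assumed)
adsorbent_g = 0.1  # mass of cellulose microfibers, g (assumed)

# Hypothetical final concentrations (mg/L) after 24 h at each pH
final_conc = {2: 42.0, 6: 35.0, 10: 20.0, 14: 8.0}

for ph, cf in final_conc.items():
    removal_pct = (c0 - cf) / c0 * 100       # % of glyphosate removed
    q = (c0 - cf) * volume_L / adsorbent_g   # adsorption capacity, mg/g
    print(f"pH {ph:>2}: removal = {removal_pct:5.1f} %, q = {q:5.2f} mg/g")
```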
More information: Maria Vitória Guimarães Leal et al, pH dependence of glyphosate adsorption from aqueous solution using a cationic cellulose microfibers (cCMF) biosorbent, Pure and Applied Chemistry (2023). DOI: 10.1515/pac-2022-1205
In a new study from the NIH's Environmental influences on Child Health Outcomes (ECHO) Program, higher prenatal phthalate levels were associated with a slightly increased risk of postpartum depression.
Postpartum depression affects up to 20% of new mothers, making it the most common pregnancy complication to occur after delivery.
ECHO Cohort researchers wanted to examine how chemicals such as phenols, phthalates, and parabens—commonly found in plastics and personal care products—might play a role in postpartum depression symptoms, alongside other factors like genetics and stress. Exposure to these chemicals can affect hormone levels, potentially influencing the development of postpartum depression.
This collaborative research, which is believed to be the largest study to date examining the effects of environmental chemicals on depression, is published in JAMA Psychiatry.
The presence of these man-made chemicals in people is common due to frequent exposure through diet, absorption through the skin, and inhalation. Researchers noted that all study participants had parabens in their urine samples, and nearly all had phthalates.
"Finding new ways to prevent postpartum depression is crucial because most of the known risk factors, like genetics and stressful life events, can't be altered," said Melanie Jacobson, Ph.D., MPH of New York University's Grossman School of Medicine. "Therefore, focusing on prenatal exposure to these types of chemicals represents a novel interventional target."
Researchers measured the concentrations of these chemicals in urine samples of 2,174 pregnant individuals at five ECHO Cohort Study Sites. Those same individuals also completed depression assessments between two weeks and 12 months after delivery to check for postpartum depression symptoms.
Screening instruments indicated the presence or absence of postnatal depressive symptoms. Researchers then harmonized the data to the Patient-Reported Outcomes Measurement Information System (PROMIS) Depression scale.
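As a rough illustration of what mapping onto the PROMIS metric involves: PROMIS scores are T-scores, with a mean of 50 and a standard deviation of 10 in a reference population. The sketch below shows only that generic rescaling step; ECHO's actual harmonization used validated instrument crosswalks, and the reference statistics here are made up.

```python
# Illustrative sketch only: rescaling a raw screening score onto a
# T-score metric (mean 50, SD 10), the scale PROMIS uses. The actual
# ECHO harmonization relied on validated crosswalks between instruments.

def to_t_score(raw_score: float, ref_mean: float, ref_sd: float) -> float:
    """Map a raw instrument score to a T-score via z-standardization.

    ref_mean / ref_sd are the instrument's reference-population
    statistics (hypothetical values below, not ECHO's crosswalk).
    """
    z = (raw_score - ref_mean) / ref_sd
    return 50.0 + 10.0 * z

# Hypothetical example: raw score 14 on an instrument whose reference
# population has mean 8 and SD 4 -> T-score 65 (1.5 SD above the mean)
print(to_t_score(14.0, ref_mean=8.0, ref_sd=4.0))
```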
The study found that higher levels of phthalates, in particular those found in personal care items and plastic consumer products, were associated with an increased risk of postpartum depression. Those who met the criteria for postpartum depression were more likely to be Hispanic, to be from the ECHO study site in Puerto Rico, to have attained less education, and to have substantially higher prenatal depression scores.
More information: Melanie H. Jacobson et al, Prenatal Exposure to Nonpersistent Environmental Chemicals and Postpartum Depression, JAMA Psychiatry (2023). DOI: 10.1001/jamapsychiatry.2023.3542
What chemicals are we exposed to on a daily basis? That is the central question of "non-targeted analysis" or NTA, an emerging field of analytical science that aims to identify all chemicals around us. A daunting task, because how can you be sure to detect everything if you don't know exactly what you're looking for?
In a paper published in Environmental Science & Technology, researchers at the Universities of Amsterdam (UvA, the Netherlands) and Queensland (UQ, Australia) have assessed this problem. In a meta-analysis of NTA results published over the past six years, they estimate that less than 2% of all chemicals have been identified.
According to Viktoriia Turkina, who performed the research as a Ph.D. student with Dr. Saer Samanipour at the UvA's Van 't Hoff Institute for Molecular Sciences, this limitation underscores the urgent need for a more proactive approach to chemical monitoring and management. "We need to incorporate more data-driven strategies into our studies to be able to effectively protect human and environmental health," she says.
Samanipour explains that current monitoring of chemicals is rather limited, since it is expensive, time-consuming, and requires specialized experts. "As an example, in the Netherlands we have one of the most sophisticated monitoring programs for chemicals known to be of concern to human health. Yet we target less than 1,000 chemicals. There are far more chemicals out there that we don't know about."
A vast chemical space
To deal with those chemicals, the concept of non-targeted analysis was introduced some 15 to 20 years ago to look at possible exposure in an unbiased manner. The idea is to take a sample from the environment (air, water, soil, sewer sludge) or the human body (hair, blood, etc.) and analyze it using well-established techniques such as chromatography coupled with high resolution mass spectrometry.
The challenge then is to trace the obtained signal back to the structures of chemicals that may be present in the sample. These will include already known chemicals, but also chemicals whose presence in the environment is as yet unknown.
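As a toy illustration of that annotation step, the sketch below matches a measured accurate mass against a small compound list within a parts-per-million tolerance, one of the basic operations in NTA workflows. The compound list and tolerance are illustrative assumptions, and real pipelines work with ionized m/z values, isotope patterns and fragmentation spectra rather than neutral masses alone.

```python
# A minimal sketch of the annotation step in non-targeted analysis:
# matching a measured accurate mass against a compound database within
# a ppm tolerance. The short candidate list below is illustrative only.

CANDIDATES = {
    # name: monoisotopic neutral mass (Da)
    "caffeine":  194.08038,
    "atrazine":  215.09377,
    "ibuprofen": 206.13068,
}

def match_mass(measured_mass: float, tol_ppm: float = 5.0) -> list[str]:
    """Return candidate compounds whose mass lies within tol_ppm."""
    hits = []
    for name, mass in CANDIDATES.items():
        ppm_error = abs(measured_mass - mass) / mass * 1e6
        if ppm_error <= tol_ppm:
            hits.append(name)
    return hits

# A feature with neutral mass 194.0805 Da (hypothetical measurement)
# falls within ~0.6 ppm of caffeine and is returned as a candidate.
print(match_mass(194.0805))   # -> ['caffeine']
```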
In theory, this "chemical space" includes as many as 10⁶⁰ compounds, an incomprehensible number that far exceeds the number of stars in the universe. By comparison, the number of organic and inorganic substances published in the scientific literature and public databases is estimated at around 180 million.
To make their research more manageable, Turkina, Samanipour and co-workers focused on a subset of 60,000 well-described compounds from the NORMAN database. Turkina says, "This served as the reference to establish what is covered in NTA studies, and more importantly, to develop an idea about what is being overlooked."
The vast 'exposome' of chemicals that humans are exposed to on a daily basis is a sign of our times, according to Samanipour.
"These days we are soaking in a giant ocean of chemicals. The chemical industry is part of that, but also nature is running all a whole bunch of reactions that result in exposure. And we expose ourselves to chemicals by the stuff we use—think for instance of the problem of microplastics. To solve all this we have to be able to go beyond pointing fingers. With our research, we hope to contribute to finding a solution together. Because we all are in the same boat."
Much room for improvement
The meta-analysis, which included 57 NTA papers, revealed that only around 2% of the estimated chemical space was covered. This could indicate that actual exposure to chemicals is quite low; however, it could also point to shortcomings in the applied analyses. According to Turkina and Samanipour, the latter is indeed the case. They focused on NTA studies applying liquid chromatography coupled with high resolution mass spectrometry (LC-HRMS), one of the most comprehensive methods for the analysis of complex environmental and biological samples.
It turned out that there was much room for improvement. For instance, in sample preparation they observed a bias towards specific compounds rather than capturing a more diverse set of chemicals. They also observed poor selection and inconsistent reporting of LC-HRMS parameters and data acquisition methods.
"In general," Samanipour says, "the chemical analysis community is to a great extent driven by the available technology that vendors have developed for specific analysis purposes. Thus the instrumental set-up and data processing methods are rather limited when it comes to non-targeted analysis."
To Samanipour, the NTA approach is definitely worth pursuing. "But we need to develop it further and push it forward. Together with vendors we can develop new powerful and more versatile analytical technologies, as well as effective data analysis protocols."
He also advocates a data-driven approach where the theoretical chemical space is "back calculated" towards a subset of chemicals that are highly likely to be present in our environment. "Basically we have to better understand what the true chemical space of exposure is. And once those boundaries are defined, it becomes a lot easier to reassess that 2% figure we determined."
More information: Tobias Hulleman et al, Critical Assessment of the Chemical Space Covered by LC–HRMS Non-Targeted Analysis, Environmental Science & Technology (2023). DOI: 10.1021/acs.est.3c03606
A new research paper published in Science Advances reveals how changes in the size of the Yangtze River watershed may have led to the carving of deep canyons.
In this study, UNC-Chapel Hill professor Eric Kirby and his co-authors explore the impact of drainage basin expansion on the growth of the Yangtze River.
"This study presents a new model for when and how the Yangtze River was born," said Kirby, "The Yangtze is one of the world's great rivers, rising on the Tibetan Plateau at altitudes over 17,000 feet and descending through some of the deepest canyons on the planet toward the East China Sea. The timing of when these canyons were carved is commonly attributed to the uplift of the Tibetan Plateau, but whether the rivers existed in their present-day configuration has been uncertain."
Landscape evolution simulations demonstrated that expansion of the Yangtze watershed since the Late Miocene could be responsible for 1 to 2 kilometers of fluvial incision. In addition, increased erosive power associated with capture and basin integration drove accelerated incision during the Late Miocene.
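The paper's landscape-evolution simulations are far more sophisticated, but the underlying mechanism can be illustrated with the widely used stream-power incision law, E = K·A^m·S^n, in which incision rate E grows with upstream drainage area A. The toy comparison below uses assumed parameter values, not the study's, simply to show the direction of the effect of basin capture.

```python
# Toy illustration of the stream-power incision law, E = K * A**m * S**n,
# which underlies many landscape-evolution models: incision rate E grows
# with upstream drainage area A, so capturing new basin area speeds up
# canyon cutting. All parameter values are assumed, not the study's.

K = 1e-7          # erodibility coefficient (assumed; units depend on m, n)
m, n = 0.5, 1.0   # common exponent choices in the literature
S = 0.02          # channel slope (dimensionless, assumed)

for area_km2 in (1e4, 1e5):        # pre- vs. post-capture drainage area
    A = area_km2 * 1e6             # convert km^2 to m^2
    E = K * A**m * S**n            # incision rate in m/yr
    print(f"A = {area_km2:.0e} km^2 -> E = {E * 1000:.2f} mm/yr")
```

With these assumed values, a tenfold expansion of drainage area roughly triples the incision rate, which is the qualitative behavior the study invokes for the post-capture Yangtze.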
"One key observation is that ancient lakes were present on the top of the plateau. Our work shows that the effect of capturing these lake basins can explain much of the timing of river incision and the depth of those canyons. Essentially, the Yangtze used to be a shorter, smaller river," Kirby adds.
The study also implies that eastern Tibet was elevated prior to approximately 15 million years ago, and that the timing of fluvial incision may be out of phase with the growth of plateau topography.
More information: Alexander Rohrmann et al, Accelerated Miocene incision along the Yangtze River driven by headward drainage basin expansion, Science Advances (2023). DOI: 10.1126/sciadv.adh1636
A team of behavioral scientists using big data and a simulation-based model to analyze social media "tweets" around the 2020 presidential election found that the spread of voter fraud conspiracy theories on Twitter (now called X) was boosted by a negativity bias. Led by Mason Youngblood, Ph.D., a post-doctoral fellow in the Institute for Advanced Computational Science at Stony Brook University, the findings are published in Humanities and Social Sciences Communications.
The researchers simulated the behavior of around 350,000 real Twitter users. They found that the sharing patterns of some 4 million tweets about voter fraud are consistent with people being much more likely to retweet social posts that contain stronger negative emotion.
The data for their study came from the VoterFraud2020 dataset, collected between October 23 and December 16, 2020. This dataset includes 7.6 million tweets and 25.6 million retweets that were collected in real time using X's streaming Application Programming Interface (API), under established guidelines for ethical social media data use.
"Conspiracy theories about large-scale voter fraud spread widely and rapidly on Twitter during the 2020 U.S. presidential election, but it is unclear what processes are responsible for their amplification," says Youngblood.
Given that, the team ran simulations of individual users tweeting and retweeting one another under different levels and forms of cognitive bias and compared the output to real patterns of retweet behavior among proponents of voter fraud conspiracy theories during and around the election.
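A minimal sketch of that kind of simulation (not the authors' actual model) is shown below: each tweet carries a negativity score, and the probability that a user retweets it rises with that score through a bias parameter. All parameter names and values are assumptions for illustration.

```python
import random

# Minimal sketch (not the authors' model) of content-biased sharing:
# each tweet has a negativity score in [0, 1], and the probability that
# a user retweets it increases with that score via a bias parameter
# beta. beta = 0 corresponds to unbiased (random) sharing.

random.seed(42)

def simulate_shares(negativities, n_users=1000, beta=3.0, base_rate=0.05):
    """Return simulated retweet counts per tweet under a negativity bias.

    P(retweet) = base_rate * (1 + beta * negativity), capped at 1.
    All parameter values are illustrative assumptions.
    """
    counts = []
    for neg in negativities:
        p = min(1.0, base_rate * (1.0 + beta * neg))
        counts.append(sum(random.random() < p for _ in range(n_users)))
    return counts

tweets = [0.1, 0.5, 0.9]   # low, medium, high negativity (hypothetical)
for neg, c in zip(tweets, simulate_shares(tweets)):
    print(f"negativity {neg:.1f}: {c} retweets out of 1000 users")
```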
"Our results suggest that the spread of voter fraud messages on Twitter was driven by a bias for tweets with more negative emotion, and this has important implications for current debates on how to counter the spread of conspiracy theories and misinformation on social media," Youngblood adds.
Through their simulations and numerical analysis, Youngblood and colleagues found that their results are consistent with previous research by others suggesting that emotionally negative content has an advantage on social media across a variety of domains, including news coverage and political discourse.
The model also showed that even though negative tweets were more likely to be retweeted, quote tweets tended to be more moderate than the original ones, as people tended not to amplify negativity when commenting on something.
Youngblood says that because the team's simulation-based model recreates the patterns in the actual data quite well, it may potentially be useful for simulating interventions against misinformation in the future. For example, the model could be easily modified to reflect the ways that social media companies or policy makers might try to curb the spread of information, such as reducing the rate at which tweets hit people's timelines.
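Continuing the toy sketch above, that kind of intervention could be represented by scaling down the exposure (base) rate and observing how expected retweet counts fall. The throttle values below are illustrative, not a claim about any platform's actual levers.

```python
# Continuation of the toy negativity-bias sketch above: model an
# intervention that reduces how often tweets reach timelines by
# scaling the base exposure rate. All numbers are illustrative.

def expected_retweets(neg, n_users=1000, beta=3.0, base_rate=0.05):
    """Expected retweet count under the toy negativity-bias model."""
    p = min(1.0, base_rate * (1.0 + beta * neg))
    return n_users * p

for throttle in (1.0, 0.5, 0.25):      # 1.0 = no intervention
    high = expected_retweets(0.9, base_rate=0.05 * throttle)
    low = expected_retweets(0.1, base_rate=0.05 * throttle)
    print(f"exposure x{throttle:4.2f}: "
          f"high-neg {high:5.1f} vs low-neg {low:5.1f} expected retweets")
```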
More information: Mason Youngblood et al, Negativity bias in the spread of voter fraud conspiracy theory tweets during the 2020 US election, Humanities and Social Sciences Communications (2023). DOI: 10.1057/s41599-023-02106-x
Surgeons perform second pig heart transplant, trying to save a dying man
by Lauran Neergaard
Surgeons have transplanted a pig's heart into a dying man in a bid to prolong his life—only the second patient to ever undergo such an experimental feat. Two days later, the man was cracking jokes and able to sit in a chair, Maryland doctors said Friday.
The 58-year-old Navy veteran was facing near-certain death from heart failure but other health problems meant he wasn't eligible for a traditional heart transplant, according to doctors at University of Maryland Medicine.
"Nobody knows from this point forward. At least now I have hope and I have a chance," Lawrence Faucette, from Frederick, Maryland, said in a video recorded by the hospital before Wednesday's operation. "I will fight tooth and nail for every breath I can take."
While the next few weeks will be critical, doctors were thrilled at Faucette's early response to the pig organ.
"You know, I just keep shaking my head—how am I talking to someone who has a pig heart?" Dr. Bartley Griffith, who performed the transplant, told The Associated Press. He said doctors are feeling "a great privilege but, you know, a lot of pressure."
The same Maryland team last year performed the world's first transplant of a genetically modified pig heart into another dying man, David Bennett, who survived just two months.
There's a huge shortage of human organs donated for transplant. Last year, there were just over 4,100 heart transplants in the U.S., a record number, but the supply is so tight that only patients with the best chance of long-term survival are offered one.
Attempts at animal-to-human organ transplants have failed for decades, as people's immune systems immediately destroyed the foreign tissue. Now scientists are trying again using pigs genetically modified to make their organs more humanlike.
Recently, scientists at other hospitals have tested pig kidneys and hearts in donated human bodies, hoping to learn enough to begin formal studies of what are called xenotransplants.
To make this new attempt in a living patient outside of a rigorous trial, the Maryland researchers required special permission from the Food and Drug Administration, under a process reserved for certain emergency cases with no other options.
It took over 300 pages of documents filed with the FDA, but the Maryland researchers made their case that they'd learned enough from their first attempt last year—even though that patient died for reasons that aren't fully understood—that it made sense to try again.
And Faucette, who retired as a lab technician at the National Institutes of Health, had to agree that he understood the procedure's risks.
In a statement his wife, Ann Faucette, said, "We have no expectations other than hoping for more time together. That could be as simple as sitting on the front porch and having coffee together."
What's different this time: Only after last year's transplant did scientists discover signs of a pig virus lurking inside the heart—and they now have better tests to look for hidden viruses. They also made some medication changes.
Possibly more important, while Faucette has end-stage heart failure and was out of other options, he wasn't as near death as the prior patient.
By Friday, his new heart was functioning well without any supportive machinery, the hospital said.
"It's just an amazing feeling to see this pig heart work in a human," said Dr. Muhammad Mohiuddin, the Maryland team's xenotransplantation expert. But, he cautioned, "we don't want to predict anything. We will take every day as a victory and move forward."
This kind of single-patient "compassionate use" can provide some information about how the pig organ works, but not nearly as much as more formal testing, said Karen Maschke, a research scholar at the Hastings Center who is helping develop ethics and policy recommendations for xenotransplant clinical trials. That the FDA allowed this second case "suggests that the agency is not ready to permit a pig heart clinical trial to start," Maschke added.
The pig heart, provided by Blacksburg, Virginia-based Revivicor, has 10 genetic modifications—knocking out some pig genes and adding some human ones to make it more acceptable to the human immune system.
Galaxies from the early universe are more like our own Milky Way than previously thought, flipping the entire narrative of how scientists think about structure formation in the universe, according to new research published today.
Using the James Webb Space Telescope (JWST), an international team of researchers including those at The University of Manchester and University of Victoria in Canada discovered that galaxies like our own Milky Way dominate throughout the universe and are surprisingly common.
These galaxies go far back in the universe's history, with many of them having formed 10 billion years ago or more.
The Milky Way is a typical disk galaxy, which has a shape similar to a pancake or compact disk, rotating about its center and often containing spiral arms. These galaxies are thought to be the most common in the nearby universe and might be the types of galaxies where life can develop given the nature of their formation history.
However, astronomers previously thought that these types of galaxies were too fragile to exist in the early universe, when galaxy mergers were more common and would have destroyed their seemingly delicate shapes.
The new discovery, published today in The Astrophysical Journal, finds that these disk galaxies are 10 times more common than what astronomers believed based on previous observations with the Hubble Space Telescope.
Christopher Conselice, professor of extragalactic astronomy at The University of Manchester, said, "Using the Hubble Space Telescope we thought that disk galaxies were almost non-existent until the universe was about 6 billion years old; these new JWST results push the time these Milky Way–like galaxies formed to almost the beginning of the universe."
The research completely overturns scientists' existing understanding of how the universe evolves, and the scientists say new ideas need to be considered.
Lead author, Leonardo Ferreira from the University of Victoria, said, "For over 30 years it was thought that these disk galaxies were rare in the early universe due to the common violent encounters that galaxies undergo. The fact that JWST finds so many is another sign of the power of this instrument and that the structures of galaxies form earlier in the universe, much earlier in fact, than anyone had anticipated."
It was once thought that disk galaxies such as the Milky Way were relatively rare through cosmic history, and that they only formed after the universe was already middle aged.
Previously, astronomers using the Hubble Space Telescope believed that galaxies had mostly irregular and peculiar structures that resemble mergers. However, the superior abilities of JWST now allow us to see the true structure of these galaxies for the first time.
The researchers say that this is yet another sign that "structure" in the universe forms much quicker than anyone had anticipated.
Professor Conselice continues, "These JWST results show that disk galaxies like our own Milky Way are the most common type of galaxy in the universe. This implies that most stars exist and form within these galaxies, which changes our complete understanding of how galaxy formation occurs. These results also raise important questions about dark matter in the early universe, which we know very little about."
"Based on our results astronomers must rethink our understanding of the formation of the first galaxies and how galaxy evolution occurred over the past 10 billion years."
More information: Leonardo Ferreira et al, The JWST Hubble Sequence: The Rest-frame Optical Evolution of Galaxy Structure at 1.5 < z < 6.5, The Astrophysical Journal (2023). DOI: 10.3847/1538-4357/acec76