Friday, February 26, 2021

 

A tangled food web

Trophic levels are an 'insufficient' measure of sustainability for today's aquaculture policy

UNIVERSITY OF CALIFORNIA - SANTA BARBARA

Research News

Born in food web ecology, the concept of trophic levels -- the hierarchy of who eats whom in the natural world -- is an elegant way to understand how biomass and energy move through a natural system. It's only natural that the idea found its way into the realm of aquaculture, where marine and freshwater farmers try to maximize their product with efficient inputs.

"It's often used as a measure of how sustainable it is to harvest or consume that species," said Rich Cottrell(link is external), a postdoctoral researcher at UC Santa Barbara's National Center for Ecological Analysis & Synthesis (NCEAS). As plants (level 1) become food to plant eaters (level 2), who in turn are consumed by carnivores (level 3) and so on, the amount of energy required to support the same weight of organisms increases, he explained. As a result, species at levels 4 or 5, such as tuna, require far more energy per pound than would species in the lower trophic levels. It's the same reason vegetarian diets are often considered to be more sustainable than meat-eating ones.

"In the same manner, trophic level measures are now being recommended in policy settings for use as an indicator of the sustainability of fish farming, or aquaculture," Cottrell said. The lower the trophic level, the more sustainable the species is considered to be, and so policy often calls for more farming of low-trophic species.

However, argue Cottrell and fellow aquaculture experts in a paper published in the journal Reviews in Aquaculture, at this time of increasing complexity and sophistication in the aquaculture sector, the same simplicity that makes trophic levels attractive as a measure of sustainability also makes them insufficient.

The causes for that have largely to do with how today's farmed fish are fed.

"Most of the fish and invertebrates that we farm for food are produced using human-made feeds," Cottrell explained. "But these feeds are constantly changing, and so the meaning of farmed trophic levels is changing through time." For instance, he pointed out, salmon are considered to be at a higher trophic level because their naturally carnivorous diets would require large amounts of fishmeal and oil, but advances in feed and manufacturing have reduced the proportion of fish-based ingredients to 10-15% in modern salmon diets. Meanwhile, herbivorous species such as carp and tilapia have been found to respond favorably to small amounts of fishmeal in their feed.

"In reality, they're now farmed at similar trophic levels," Cottrell said. "The line between 'low' and 'high' trophic levels will continue to blur with innovation."

The trophic level concept misses still another important aspect of aquaculture sustainability in the realm of feed and resource efficiency, or how efficiently the farmed animals convert what they are fed into edible food.

"This is not well explained by trophic level," Cottrell said, adding that despite their high trophic placement, many carnivorous farmed fish could be more feed-efficient than their naturally carnivorous counterparts. And because aquaculture is increasingly turning to agriculture to provide replacements for fishmeal and oil, the promise of sustainability might be an empty one.

"Replacing fish-based ingredients with crops has led to a dramatic reduction in the trophic level of fed aquaculture species, but we know very little about how sustainable it is to increase pressure on global agricultural systems," he said.

As the global aquaculture sector strives to meet the growing demand for farmed seafood, the researchers say it's time to rethink the use of trophic levels as a rule for and measure of sustainability. Recommendations that stipulate low-trophic-level aquaculture may not succeed in promoting greater sustainability, Cottrell said. Boosting the supply of mussels, for instance, may not fulfill increasing demand for shrimp or salmon.

"It behooves us to find a way to ensure that for high-demand products, we produce these in the most environmentally efficient and socially responsible way possible," he said. "Trophic levels will not get us there."

Fortunately, there are efforts at more nuanced sustainability assessments, such as voluntary certifications through the Aquaculture Stewardship Council or Best Aquaculture Practices, which examine the impacts of aquaculture at the farm level and through supply chains.

"Greater support for these programs and incentives for producers from various regions and production systems to join them would be a far more robust way to strengthen the sustainability of the aquaculture sector going forward," Cottrell said.

###

Research for this paper was also conducted by Halley E. Froehlich, Caitlin D. Kuempel, Benjamin S. Halpern and David R. Williams at UCSB; Marc Metian at the International Atomic Energy Agency, Monaco; Julia Blanchard and Kirsty L. Nash at the University of Tasmania; Nis Sand Jacobsen at the Technical University of Denmark; Peter B. McIntyre at Cornell University; Lex Bouwman at Utrecht University; Jessica Gephart at American University; Daniel D. Moran at the Norwegian University of Science and Technology; and Max Troell at the Royal Swedish Academy of Sciences.


What might sheep and driverless cars have in common? Following the herd

Researchers show how social component of moral decision-making can influence programming of autonomous vehicles and other technologies

UNIVERSITY OF SOUTHERN CALIFORNIA

Research News

Psychologists have long found that people behave differently when they learn of their peers' actions. A new study by computer scientists found that when individuals in an experiment about autonomous vehicles were informed that their peers were more likely to sacrifice their own safety -- programming their vehicle to hit a wall rather than pedestrians at risk -- the percentage of individuals willing to make the same sacrifice increased by approximately two-thirds.

As computer scientists train machines to act as people's agents in all sorts of situations, the study's authors argue that the social component of decision-making is often overlooked. This could be of great consequence: the authors show that the trolley problem -- long the default scenario for moral psychologists -- is problematic, because it fails to capture the complexity of how humans actually make decisions.

Jonathan Gratch, one of the paper's authors, the principal investigator for this project, and a computer scientist at the USC Institute for Creative Technologies, says existing models assume that in high-stakes life-and-death decisions people think differently than they actually do. There are no moral absolutes in human decision-making, he indicates; rather, "it is more nuanced," says Gratch.

The researchers conducted four separate simulation experiments to understand how people might process and act on the moral dilemmas they would face as the operator of a driverless car. The first three experiments focused on how people behave when faced with risk to themselves and others in a scenario in which the vehicle would have to be programmed either to hit a wall or to hit five pedestrians. The authors show that participants used the severity of injury to themselves and the risk to others as guideposts for decision-making: the higher the risk to pedestrians, the more likely people were to sacrifice their own health. In addition, the risk to pedestrians did not have to be as high as the risk to the operator before participants chose to sacrifice their own well-being.

In the fourth experiment, the researchers added a social dimension, telling participants what peers had opted to do in the same situation. In one simulation, the knowledge that peers chose to risk their own health changed the participants' responses: the share willing to risk their health rose from 30 percent to 50 percent. But this can go both ways, cautions Gratch. "Technically there are two forces at work. When people realize their peers don't care, this pulls people down to selfishness. When they realize they care, this pulls them up."
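Those two figures square with the "approximately two-thirds" increase reported above, since the jump is measured relative to the baseline:

```python
# Relative increase implied by the reported shift from 30% to 50%
# of participants willing to risk their own health.
before, after = 0.30, 0.50
print(f"{(after - before) / before:.0%}")  # ~67%, i.e. about two-thirds
```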

The research has implications for autonomous vehicles, including drones and boats, as well as for robots that are programmed by humans. The authors suggest that manufacturers should be aware of how humans actually make decisions in life-or-death situations. They also argue that the public is best served by transparency about how machines are programmed, and by controls that let human drivers change settings before such life-and-death situations arise; legislators, too, should be aware of how vehicles might be programmed. Lastly, given the human susceptibility to conform to social norms, the authors believe that public campaigns publicizing how peers have programmed their autonomous vehicles for self-sacrifice might influence future owners to change their own vehicle settings to better protect others from injury.

###

The authors of this study are Celso M. de Melo of the US Army Research Laboratory, Stacy Marsella of Northeastern University, and Jonathan Gratch of the USC Institute for Creative Technologies.

Study uncovers flaws in process for maintaining state voter rolls

YALE UNIVERSITY

Research News

States regularly use administrative records, such as motor-vehicle data, to determine whether registered voters have moved and should be pruned from the rolls. A Yale-led study of this process in Wisconsin shows that a significant percentage of registered voters are incorrectly identified as having changed addresses, potentially endangering their right to vote.

The study, published in the journal Science Advances, found that at least 4% of people listed as suspected "movers" cast ballots in 2018 elections using addresses that were wrongly flagged as out of date. Minority voters were twice as likely as white voters to cast their ballot with their original address of registration after the state marked them as having moved, the study showed.

The findings suggest that states should more clearly communicate the processes they use to update voter-registration files and that a more robust effort is required to confirm whether individuals have moved before they are removed from the voter rolls, said Yale political scientist Gregory A. Huber, the study's lead author.

"The process of maintaining states' voter-registration files cries out for greater transparency," said Huber, the Forst Family Professor of Political Science in the Faculty of Arts & Sciences. "Our work shows that significant numbers of people are at risk of being disenfranchised, particularly those from minority groups.

"Unfortunately, we don't know enough about the process used to prune voter rolls nationwide to understand why mistakes occur and how to prevent them."

Regularly updating voter rolls prevents registration files from becoming bloated with individuals who have died, moved away, or are otherwise no longer eligible to vote. When these rolls swell with ineligible voters, it raises concerns about potential fraud (although there is little evidence it causes unlawful voting, Huber says) and creates headaches for political campaigns, which rely on accurate registration records to reach potential voters.

Americans are not obligated to inform local election officials when they move to a new address, but federal law mandates that states identify changes in residence among registered voters. To better accomplish this task, 30 states, including Wisconsin, and the District of Columbia have formed the Electronic Registration Information Center (ERIC), a non-profit organization that assists them in improving the accuracy of their voter rolls.

ERIC uses various administrative records, including motor vehicle data, change of address information from the U.S. Postal Service, and the Social Security Administration's master death file, to flag registrations that are potentially out of date. It provides states a "movers list" of people who likely have changed residences. The states contact listed individuals, often by sending them postcards they can use to confirm their address. If people do not return the postcards, their registration can be inactivated, starting the process for removal.
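As a rough illustration of that list-maintenance workflow, the sketch below flags suspected movers and marks non-responders for inactivation. All names, fields, and data are hypothetical; this is not ERIC's actual system or data format:

```python
# Hypothetical sketch of voter-roll list maintenance: flag registrations whose
# administrative records show a different address, then mark non-responders
# for inactivation. Illustrative only; not ERIC's real pipeline.
from dataclasses import dataclass

@dataclass
class Registration:
    voter_id: str
    registered_address: str
    latest_admin_address: str     # e.g., from motor-vehicle or USPS records
    postcard_returned: bool = False

def flag_suspected_movers(rolls: list[Registration]) -> list[Registration]:
    """Flag voters whose administrative records disagree with their registration."""
    return [r for r in rolls if r.latest_admin_address != r.registered_address]

def inactivate_nonresponders(movers: list[Registration]) -> list[str]:
    """Suspected movers who never returned the confirmation postcard."""
    return [r.voter_id for r in movers if not r.postcard_returned]

rolls = [
    Registration("A1", "12 Elm St", "12 Elm St"),
    Registration("B2", "9 Oak Ave", "4 Pine Rd"),          # record mismatch
    Registration("C3", "3 Lake Dr", "77 Hill Ct", True),   # mismatch, but confirmed
]
print(inactivate_nonresponders(flag_suspected_movers(rolls)))  # ['B2']
```

The study's point is that the mismatch test itself is error-prone: a wrong administrative address plus an unreturned postcard is enough to start removal for a voter who never moved.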

Federal privacy protections and ERIC's agreements with member states prohibit the organization from disclosing who is marked as having moved and on what basis they were flagged as such, making it difficult to examine its process. However, after submitting a Wisconsin Freedom of Information Act request, Huber and his co-authors obtained special "movers poll books" from the state which list all people who were marked as suspected movers and who did not respond to the postcard notification. Individuals in the books who showed up to vote in 2018 signed their names in these books, providing evidence that they voted at addresses that had been flagged as invalid.

The researchers collected movers poll books from a representative sample of election wards and matched their contents against voting records for 2018 local, state, and federal elections. They found that at least 9,000 people -- about 4% of those listed in the poll books -- voted in 2018 using the address of registration that ERIC had marked as invalid. Minority voters were twice as likely to be incorrectly identified as having moved.

The study likely undercounts the number of registered voters incorrectly listed as having moved, the researchers said, explaining that a significant number of people who did not respond to the postcard might have nonetheless renewed their voting registration before the poll books were published. In addition, the study examined low-turnout elections, making it likely that many people wrongly listed in the poll books weren't covered in the analysis because they didn't vote, Huber said.

The researchers are not suggesting that ERIC intentionally targeted minorities.

"There's no malice here," Huber said. "ERIC wants to help states, but relying on administrative records inevitably produces mistakes for any number of reasons. This makes the process used to validate having moved, such as mailed postcards, even more important. Without more information, we can't be certain why the process disparately affects minorities."

A potential reason for the disparity is that minorities are more likely than whites to live in apartment buildings and large households, which may increase the risk of errors in administrative records, the researchers suggest. In addition, residents of apartment buildings may be less likely to confirm their address using the postcard, since mail service can be spottier in multi-unit buildings than in single-family homes.

Huber credits Wisconsin for taking steps to protect people's voting rights.

"The poll books are a great way to identify mistakes and prevent people from being disenfranchised," he said. "The state also has same day voter registration, which is another safety valve that doesn't exist in many states. We suggest that states expend more effort on contacting people at risk of losing their registration."

###

The study's co-authors are Marc Meredith of the University of Pennsylvania, Yale Law School graduate Michael Morse, and Katie Steele of the University of Pennsylvania.

Imaging space debris in high resolution

SOCIETY FOR INDUSTRIAL AND APPLIED MATHEMATICS

Research News

IMAGE: From left to right: space debris modeled as a cluster of six reflective objects, an image developed of the debris without accounting for the objects' rotation, and an image developed... (Credit: Matan Leibovich, George Papanicolaou, and Chrysoula Tsogka)

Litter is not only a problem on Earth. According to NASA, there are currently millions of pieces of space junk in the range of altitudes from 200 to 2,000 kilometers above the Earth's surface, which is known as low Earth orbit (LEO). Most of the junk consists of objects created by humans, like pieces of old spacecraft or defunct satellites. This space debris can reach speeds of up to 18,000 miles per hour, posing a major danger to the 2,612 satellites that currently operate in LEO. Without effective tools for tracking space debris, parts of LEO may even become too hazardous for satellites.

In a paper publishing today in the SIAM Journal on Imaging Sciences, Matan Leibovich (New York University), George Papanicolaou (Stanford University), and Chrysoula Tsogka (University of California, Merced) introduce a new method for taking high-resolution images of fast-moving and rotating objects in space, such as satellites or debris in LEO. They created an imaging process that first utilizes a novel algorithm to estimate the speed and angle at which an object in space is rotating, then applies those estimates to develop a high-resolution picture of the target.

Leibovich, Papanicolaou, and Tsogka used a theoretical model of a space imaging system to construct and test their imaging process. The model depicts a piece of fast-moving debris as a cluster of very small, highly reflective objects that represent the strongly reflective edges of an item in orbit, such as the solar panels on a satellite. The cluster of reflectors all move together with the same speed and direction and rotate about a common center. In the model, multiple sources of radiation on the Earth's surface--such as the ground control stations of global navigation satellite systems--emit pulses that are reflected by target pieces of space debris. A distributed set of receivers then detects and records the signals that bounce off the targets.

The model focuses on sources that produce radiation in the X-band, or from frequencies of 8 to 12 gigahertz. "It is well known that resolution can be improved by using higher frequencies, such as the X-band," Tsogka said. "Higher frequencies, however, also result in distortions to the image due to ambient fluctuations from atmospheric effects." Signals are distorted by turbulent air as they travel from the target to receivers, which can make the imaging of objects in LEO quite challenging. The first step of the authors' imaging process was thus to correlate the data taken at different receivers, which can help reduce the effects of these distortions.

The diameter of the area encompassed by the receivers is called the physical aperture of the imaging system -- in the model, this is about 200 kilometers. Under normal imaging conditions, the physical aperture's size determines the resolution of the resulting image; a larger aperture begets a sharper picture. However, the quick movement of the imaging target relative to the receivers can create an inverse synthetic aperture, in which the signals that were detected at multiple receivers as the target moved throughout their field of view are synthesized coherently. This configuration can effectively improve the resolution, as if the imaging system had a wider aperture than the physical one.
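The aperture-resolution trade-off can be made concrete with the standard diffraction-limited estimate (a textbook scaling, not a formula quoted from the paper):

\[
\rho \;\approx\; \frac{\lambda L}{a},
\]

where \(\rho\) is the cross-range resolution, \(\lambda\) the wavelength, \(L\) the range to the target, and \(a\) the aperture. Taking X-band values of \(\lambda \approx 3\) cm (10 GHz), an assumed range of \(L \approx 500\) km, and the model's 200-kilometer physical aperture gives \(\rho\) on the order of 10 centimeters; a larger synthetic aperture drives it lower still.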

Objects in LEO can spin on timescales that range from a full rotation every few seconds to every few hundred seconds, which complicates the imaging process. It is thus important to know--or at least be able to estimate--some details about the rotation before developing the image. The authors therefore needed to estimate the parameters related to the object's rotation before synthesizing the data from different receivers. Though simply checking all of the possible parameters to see which ones yield the sharpest image is technically feasible, doing so would require a lot of computational power. Instead of employing this brute force approach, the authors developed a new algorithm that can analyze the imaging data to estimate the object's rotation speed and the direction of its axis.
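The flavor of the estimation problem can be seen in a toy version. The sketch below recovers a rotation rate by exhaustively sweeping candidates -- the brute-force strategy the authors avoid; the phase model is an invented stand-in for the paper's physics:

```python
# Brute-force rotation-rate estimation on a toy phase history: sweep candidate
# rates and keep the one whose motion-compensated signal is most coherent.
# Illustrative stand-in only; the paper's algorithm avoids this exhaustive search.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)       # observation times (s)
omega_true = 0.7                   # true rotation rate (rad/s)
signal = np.exp(1j * 2.0 * np.sin(omega_true * t))   # rotating-reflector phase history
signal += 0.3 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size))

def coherence(omega: float) -> float:
    """Matched-filter magnitude after compensating an assumed rotation rate."""
    model = np.exp(1j * 2.0 * np.sin(omega * t))
    return abs(np.vdot(model, signal)) / t.size

candidates = np.linspace(0.1, 2.0, 400)
estimate = candidates[np.argmax([coherence(w) for w in candidates])]
print(f"estimated rotation rate: {estimate:.3f} rad/s")  # ~0.7
```

In the real problem each candidate requires forming and scoring a full image, which is why a direct estimation algorithm is so much cheaper than the sweep.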

After accounting for the rotation, the next step in the authors' imaging process was to analyze the data to develop a picture of the space debris that would hopefully be as accurate and well-resolved as possible. One method that researchers often employ for this type of imaging of fast-moving objects is the single-point migration of cross correlations. Though atmospheric fluctuations do not usually significantly impair this technique, it does not have a very high resolution. A different, commonly used imaging approach called Kirchhoff migration can achieve a high resolution, as it benefits from the inverse synthetic aperture configuration; however, the trade-off is that it is degraded by atmospheric fluctuations. With the goal of creating an imaging scheme that is not too heavily affected by atmospheric fluctuations but still maintains a high resolution, the authors proposed a third approach: an algorithm whose result they call a rank-1 image. "The introduction of the rank-1 image and its resolution analysis for fast-moving and rotating objects is the most novel part of this study," Leibovich said.
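For orientation, Kirchhoff-type migration of correlation data generically takes a delay-and-sum form; the expression below is the textbook template rather than the paper's exact estimator:

\[
I_{\mathrm{KM}}(\mathbf{x}) \;=\; \sum_{r \neq r'} C_{r,r'}\!\big(\tau_r(\mathbf{x}) - \tau_{r'}(\mathbf{x})\big),
\]

where \(C_{r,r'}\) is the cross correlation of the signals recorded at receivers \(r\) and \(r'\), and \(\tau_r(\mathbf{x})\) is the travel time from a candidate image point \(\mathbf{x}\) to receiver \(r\). The image peaks where the hypothesized delays align the correlations coherently, which is also why errors in the assumed travel times, such as those introduced by atmospheric fluctuations, degrade it.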

To compare the performance of the three imaging schemes, the authors gave simulated data of a rotating object in LEO to each one and compared the images that they produced. Excitingly, the rank-1 image was much more accurate and well-resolved than the result of single-point migration. It also had similar qualities to the output of the Kirchhoff migration technique. But this result was not entirely surprising, given the problem's configuration. "It is important to note that the rank-1 image benefits from the rotation of the object," Papanicolaou said. Though a rotating object generates more complex data, one can actually incorporate this additional information into the image processing technique to improve its resolution. Rotation at certain angles can also increase the size of the synthetic aperture, which significantly improves the resolution for the Kirchhoff migration and rank-1 images.

Further simulations revealed that the rank-1 image is not easily muddled by errors in the new algorithm for the estimation of rotation parameters. It is also more robust to atmospheric effects than the Kirchhoff migration image. If receivers capture data for a full rotation of the object, the rank-1 image can even achieve optimal imaging resolution. Due to its good performance, this new imaging method could improve the accuracy of imaging LEO satellites and space debris. "Overall, this study shed light on a new method for imaging fast-moving and rotating objects in space," Tsogka said. "This is of great importance for ensuring the safety of the LEO band, which is the backbone of global remote sensing."

###

Source article: Leibovich, M., Papanicolaou, G., & Tsogka, C. (2021). Correlation Based Imaging for Rotating Satellites. SIAM J. Imag. Sci., 14(1), 271-303.

Social dilemma follows 2018 eruption of Kīlauea volcano

UNIVERSITY OF HAWAII AT MANOA

Research News

IMAGE: Fissure 8 erupts in lava hazard zone 1, back-lighting a front gate, a mailbox and utility lines. May 5, 2018. (Credit: Bruce Houghton)

The unprecedented cost of the 2018 Kīlauea eruption in Hawai'i reflects the intersection of distinct physical and social phenomena: infrequent, highly destructive eruptions, and atypically high population growth, according to a new study published in Nature Communications and led by University of Hawai'i at Mānoa researchers.

It has long been recognized that areas in Puna, Hawai'i, are at high risk from lava flows. This kept land values lower in Puna -- which lies within the three highest-risk lava hazard zones, 1, 2 and 3 -- and those low values actively promoted rapid population growth.

"Low prices on beautiful land and a scarcity of recent eruptions led to unavoidable consequences--more people and more development," said Bruce Houghton, the lead author of the study and Gordan Macdonald Professor of Volcanology in the UH Mānoa School of Ocean and Earth Science and Technology (SOEST). "Ultimately this drastically increased the value of what was at risk in 2018, relative to earlier eruptions of Ki?lauea."

Kīlauea is one of the most active volcanoes on Earth and has one of the earliest and most comprehensive volcanic monitoring systems. Its recent history has been dominated by activity at the summit caldera and along one of its two lines of vents, the Eastern Rift Zone. Between 1967 and 2018, volcanic activity was dominated by eruptions from the upper part of the Eastern Rift Zone. In contrast, no damaging eruptions occurred after 1961 from vents within the lower portion of the Eastern Rift Zone, in the more heavily populated Puna district.

The UH team assessed trends in population growth in Pāhoa-Kalapana, Hilo and Puna using census data, and compared median cost of land and household income in these areas.

Valuable lessons regarding the complex interplay of science, policy, and public behavior emerged from the 2018 disaster.

"Steep population growth occurred during the absence of any locally sourced eruptions between 1961 and 2018, and set the scene for the unprecedented levels of infra-structural damage during the 2018 Lower Eastern Rift Zone eruption," said Wendy Cockshell, co-author on the paper and technical assistant at the National Disaster Preparedness Training Center (NDPTC) at UH Mānoa.

If population growth resumes in lava hazard zones 1 and 2, risk will again concentrate in the most dangerous areas of this exceptionally active volcano, translating into high damage costs in future eruptions.

"Our funded research supports the principle of the initiatives by local and federal government to provide buy-out funding to land owners affected by the 2018 eruption to able them to relocate outside of these hazardous areas," said Houghton.


CAPTION: Within lava hazard zone 1 a new cone is formed and a river of lava flows through properties down into lava hazard zone 2.

CAPTION: Houses may survive but can be surrounded by lava without access to electrical or water supplies. Roads now lead to nowhere. (Note Fissure 8 fountain in the background.)

CREDIT: Bruce Houghton




New sustainable building simulation method points to the future of design

CORNELL UNIVERSITY

Research News

ITHACA, N.Y. - A team from Cornell University's Environmental Systems Lab, led by recent graduate Allison Bernett, has put forth a new framework for injecting as much information as possible into the pre-design and early design phases of a project, potentially saving architects and design teams time and money down the road.

"(Our framework) allows designers to understand the full environmental impact of their building," said Bernett, corresponding author of "Sustainability Evaluation for Early Design (SEED) Framework for Energy Use, Embodied Carbon, Cost, and Daylighting Assessment" which published Jan. 10 in the Journal of Building Performance Simulation.

Principal investigators are Timur Dogan, assistant professor of architecture in the College of Architecture, Art and Planning; and Katharina Kral, a licensed architect and lecturer in the Department of Architecture.

"How we look at this is, there's the cost of change in the design process, and then the opportunity of impact," Dogan said. "In the very beginning, changing something doesn't cost anything, but if you're a month into the project, changing something is really expensive, because now you have to rehire consultants and redesign things.

"And then the other thing is the potential of impact," he said. "In the very beginning, just with a simple nudge in the right direction, you can change a project from being an energy hog to something that's very sustainable, and integrates well into the environment."

In 2018, according to the International Energy Agency, the construction sector accounted for 39% of energy and process-related greenhouse gas emissions. That included 11% originating from the manufacturing of building materials and products.

The Sustainability Evaluation for Early Design (SEED) Framework is a decision-making tool that can dynamically and concurrently simulate several variables: building energy performance; embodied carbon (carbon emissions generated by construction and materials); construction cost; and daylighting (the use of natural light to illuminate indoor spaces).

The framework will allow architects and design teams to rapidly trial and rank tens of thousands of design iterations, using as few as four inputs.
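The shape of such a tool is easy to sketch: enumerate variants from a handful of inputs, score each variant on the four metrics, and rank. In the sketch below every variable name, surrogate model, and weight is an invented placeholder, not SEED's actual engines:

```python
# Toy early-design sweep in the spirit of SEED: enumerate design variants,
# score each on energy, embodied carbon, cost, and daylight, then rank.
# All models and weights below are invented placeholders.
from itertools import product

window_to_wall = [0.2, 0.4, 0.6]
orientations   = [0, 90, 180, 270]          # glazing azimuth, degrees from north
structures     = ["timber", "steel", "concrete"]
glazing        = ["double", "triple"]

def score(wwr, orient, struct, glaze):
    energy   = 100 + 80 * wwr - (10 if glaze == "triple" else 0)   # kWh/m2-yr (toy)
    carbon   = {"timber": 150, "steel": 320, "concrete": 400}[struct]
    cost     = 1000 + 500 * wwr + (120 if glaze == "triple" else 0)
    daylight = 40 + 90 * wwr + (10 if orient == 180 else 0)        # south-facing bonus
    return energy + 0.5 * carbon + 0.1 * cost - 0.8 * daylight     # lower is better

variants = list(product(window_to_wall, orientations, structures, glazing))
ranked = sorted(variants, key=lambda v: score(*v))
print(f"{len(variants)} variants evaluated; best: {ranked[0]}")
```

With real simulation engines behind score(), the same loop scales to the tens of thousands of iterations the framework targets.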

Using publicly available data and a suite of available design simulation programs - including Rhino/Grasshopper (a CAD program); ClimateStudio, developed by Dogan, for daylight simulation and building energy modeling; and engineering software Karamba3D - Bernett and the team tested SEED in a case study of a hypothetical mid-sized office building modeled in Boston, Washington, D.C., and Phoenix.

The SEED Framework generated thousands of design options based on variables specific to the three cities in the case study, offering designers the flexibility of many options early in the process, before changing course would get too expensive.

"The idea is, you run this analysis," Dogan said, "and you get a few options that already make a lot of sense, and some options that you can completely forget about. ... [It] always comes down to this lack of information in the decision-making process.

"In that sense, the construction industry is super inefficient," he said. "There's too many players who don't know the full picture and then make decisions that are not always rational. This framework that Allison worked on is geared to help bring the information to the table. Every stakeholder in the design process can then form their own opinion about design goal priorities."

SEED's greatest asset, Bernett said, is amassing a tranche of data on multiple factors in one place, and involving architects early in the design and pre-design phases.

"It takes a lot of time to gather all that data, and we have that prepackaged. So there's definitely a hunger for that," said Bernett, who presented the SEED Framework in September 2019 at the International Building Performance Simulation Conference, in Rome.

"Right now, we rely heavily on energy modelers and consultants to do this work," she said. "And if we can involve architects more readily and more early on, I think that we're going to see a lot of improvement and cost-effectiveness to these early design decisions."

In addition to the publicly available design simulations, the team used AutoFrame, a new procedure developed by Kral for automatically computing structural systems. AutoFrame helps improve the precision of embodied carbon assessments and daylight simulations.

The Cornell Atkinson Center for Sustainability's Small Grants Program provided pivotal support for this work, Bernett said.

"That funding really gave it the push it needed," she said. "It allowed me to present a first iteration [of SEED] at the conference in Rome, and then to really flesh out the research more after that."

###

 

Scientists use Doppler to peer inside cells

Process leads to better, faster diagnoses and treatments of infection

PURDUE UNIVERSITY

Research News

IMAGE: David Nolte works with the Doppler apparatus to peer inside living cells, giving him insight into intracellular activity, metabolism, and pathogenicity. (Credit: Purdue University photo/Rebecca McElhoe)

WEST LAFAYETTE, Ind. -- Doppler radar improves lives by peeking inside air masses to predict the weather. A Purdue University team is using similar technology to look inside living cells, introducing a method to detect pathogens and treat infections in ways that scientists never have before.

In a new study, the team used Doppler to sneak a peek inside cells and track their metabolic activity in real time, without having to wait for cultures to grow. Using this ability, the researchers can test microbes found in food, water, and other environments to see if they are pathogens, or help them identify the right medicine to treat antibiotic-resistant bacteria.

David Nolte, Purdue's Edward M. Purcell Distinguished Professor of Physics and Astronomy; John Turek, professor of basic medical sciences; Eduardo Ximenes, research scientist in the Department of Agricultural and Biological Engineering; and Michael Ladisch, Distinguished Professor of Agricultural and Biological Engineering, adapted this technique from their previous study on cancer cells in a paper released this month in Communications Biology.

Using funding from the National Science Foundation as well as Purdue's Discovery Park Big Idea Challenge, the team worked with immortalized cell lines -- cells that will live forever unless you kill them. They exposed the cells to different known pathogens, in this case salmonella and E. coli. They then used the Doppler effect to spy out how the cells reacted. These living cells are called "sentinels," and observing their reactions is called a biodynamic assay.

"First we did biodynamic imaging applied to cancer, and now we're applying it to other kinds cells," Nolte said. "This research is unique. No one else is doing anything like it. That's why it's so intriguing."

This strategy is broadly applicable when scientists have isolated an unknown microbe and want to know if it is pathogenic -- harmful to living tissues -- or not. Such cells may show up in the food supply, water sources or even recently melted glaciers.

"This directly measures whether a cell is pathogenic," Ladisch said. "If the cells are not pathogenic, the Doppler signal doesn't change. If they are, the Doppler signal changes quite significantly. Then you can use other methods to identify what the pathogen is. This is a quick way to tell friend from foe."

Being able to quickly discern whether a cell is harmful is incredibly helpful in situations where people encounter a living unknown microorganism, allowing scientists to know what precautions to take. Once a microbe is known to be harmful, they can begin established protocols to determine its specific identity and find an effective antibiotic against it.

Another benefit is the ability to quickly and directly diagnose which bacteria respond to which antibiotics. Antibiotic resistance can be a devastating problem in hospitals and other environments where individuals with already compromised bodies and immune systems may be exposed to and infected by increasingly high amounts of antibiotic resistant bacteria. Sometimes this results in a potentially fatal condition called bacterial sepsis, or septicemia. This is different from the viral sepsis that has been discussed in connection with COVID-19, though the scientists say their next steps will include investigating viral sepsis.

Treating sepsis is challenging. Giving the patient broad-spectrum antibiotics, which sounds like a good idea, might not help and could make the situation worse for the next patient. Letting bacteria come into close contact with antibiotics that do not kill them only makes them more resistant to that antibiotic and more difficult to fight next time.

Culturing the patient's tissues and homing in on the correct antibiotic to use can take time the patient does not have, usually eight to 10 hours. This new biodynamic process allows scientists to put the patient's bacterial samples in an array of tiny petri dishes containing the tissue sentinels and treat each sample with a different antibiotic. Using Doppler, they can quickly notice which bacterial samples have dramatic metabolic changes. The samples that do are the ones that have reacted to the antibiotic -- the bacteria are dying, being defeated and beaten back by antibiotics.
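The decision rule at the end of that assay is simple, as the sketch below shows; the readout values and threshold are invented for illustration:

```python
# Hypothetical Doppler readouts: relative change in the sentinels' metabolic
# signal for each antibiotic-treated well. A dramatic change means the
# bacteria are dying. Values and threshold are invented for illustration.
readouts = {
    "ampicillin":    0.05,   # little change: bacteria unaffected
    "ciprofloxacin": 0.62,   # strong change: bacteria dying
    "doxycycline":   0.48,
}
THRESHOLD = 0.30  # invented cutoff for a "dramatic" metabolic change

effective = [drug for drug, change in readouts.items() if change >= THRESHOLD]
print("candidate antibiotics:", effective)  # ['ciprofloxacin', 'doxycycline']
```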


CAPTION: The team isolated living immortalized cells in multi-well plates to study them with Doppler. (Credit: Purdue University photo/Rebecca McElhoe)

"When we treat with antibiotics, the bacteria don't have to multiply much before they start to affect the tissue sentinels," Nolte explained. "There are still too few bacteria to see or to measure directly, but they start to affect how the tissues behaves, which we can detect with Doppler."

In less than half the time a traditional culture and diagnosis takes, doctors could tell which antibiotic to administer, bolstering the patient's chances for recovery. The researchers worked closely with the Purdue Research Foundation Office of Technology Commercialization to patent and license their technologies. They plan to further explore whether this method would work for tissue samples exposed to nonliving pathogenic cells or dried spores, and to test for and treat viral sepsis.


CAPTION: The Doppler apparatus allows scientists to observe living cells in real time. (Credit: Purdue University photo/Rebecca McElhoe)




ABSTRACT

Doppler imaging detects bacterial infection of living tissue

Honggu Choi, Zhe Li, Zhen Hua, Jessica Zuponcic, Eduardo Ximenes, John J. Turek, Michael R. Ladisch and David D. Nolte

Living 3D in vitro tissue cultures, grown from immortalized cell lines, act as living sentinels as pathogenic bacteria invade the tissue. The infection is reported through changes in the intracellular dynamics of the sentinel cells caused by the disruption of normal cellular function by the infecting bacteria. Here, the Doppler imaging of infected sentinels shows the dynamic characteristics of infections. Invasive Salmonella enterica serovar Enteritidis and Listeria monocytogenes penetrate through multicellular tumor spheroids, while non-invasive strains of Escherichia coli and Listeria innocua remain isolated outside the cells, generating different Doppler signatures. Phase distributions caused by intracellular transport display Lévy statistics, introducing a Lévy-alpha spectroscopy of bacterial invasion. Antibiotic treatment of infected spheroids, monitored through time-dependent Doppler shifts, can distinguish drug-resistant relative to non-resistant strains. This use of intracellular Doppler spectroscopy of living tissue sentinels opens a new class of microbial assay with potential importance for studying the emergence of antibiotic resistance.

Farmers in developing countries can protect both profits and endangered species

RICE UNIVERSITY

Research News

HOUSTON - (Feb. 25, 2021) - Low-income livestock farmers in developing countries are often faced with a difficult dilemma: protect their animals from endangered predators, or spare the threatened species at the expense of their livestock and livelihood.

A new paper by Rice University economist Ted Loch-Temzelides examines such circumstances faced by farmers in Pakistan. "Conservation, risk aversion, and livestock insurance: The case of the snow leopard" outlines a plan under which farmers can protect themselves from crippling financial losses while preserving and possibly benefiting from the lives of endangered predators.

"These livestock owners often have very low incomes," Loch-Temzelides said. "The loss of even one animal can be financially devastating. They're faced with the difficult task of weighing conservation efforts against economic losses due to attacks on their herds. And this situation isn't limited to snow leopards -- it applies anywhere large predators live near livestock."

Loch-Temzelides proposes establishing community livestock insurance contracts for farmers in developing countries who don't have access to the types of policies available in more developed nations. Under these contracts, farmers would agree to share the cost of lost animals with other farmers in their community. For example: If one farmer in a community of 10 lost an animal valued at $100, each community member would lose the equivalent of about $10.
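The pooling arithmetic is exactly as simple as the example suggests:

```python
# Community insurance: the value of one lost animal is shared evenly
# across the pool, so no single farmer absorbs a devastating loss.
def per_member_cost(animal_value: float, community_size: int) -> float:
    return animal_value / community_size

print(per_member_cost(100, 10))  # 10.0 -- each of 10 farmers bears $10
```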

By aiding conservation efforts, he added, farmers may stand to reap additional benefits.

"Tourists around the world are willing to pay to see endangered species such as snow leopards in their natural habitats," Loch-Temzelides said. "And revenue from ecotourism can benefit communities and their residents significantly."

While Loch-Temzelides' study focuses on Pakistan, he hopes community livestock insurance can be useful around the world.

###

The study will appear in an upcoming edition of the journal Conservation Letters and is available online at https://conbio.onlinelibrary.wiley.com/doi/10.1111/conl.12793.



 

Improving water quality could help conserve insectivorous birds -- study

Scarcity of insect prey in disturbed lakes and streams drives decline of birds

FRONTIERS

Research News

A new study shows that a widespread decline in abundance of emergent insects - whose immature stages develop in lakes and streams while the adults live on land - can help to explain the alarming decline in abundance and diversity of aerial insectivorous birds (i.e. preying on flying insects) across the USA. In turn, the decline in emergent insects appears to be driven by human disturbance and pollution of water bodies, especially in streams. This study, published in Frontiers in Ecology and Evolution, is one of the first to find evidence for a causal link between the decline of insectivorous birds, the decline of emergent aquatic insects, and poor water quality.

Human activities, such as urbanization and agriculture, have adverse effects on aquatic ecosystems. In the US, 46% of streams are in poor condition, while 57% of lakes suffer from strong human disturbance. The immature stages of aquatic insects, especially stoneflies, mayflies and caddisflies, are known to be highly sensitive to pollution, which is why they have often been used as biomonitors for water quality. But the authors of the present study predicted a priori that emergent insects - whose adult flying stages are important sources of food for birds, spiders, bats and reptiles - should likewise be powerful biomonitors for the health of terrestrial ecosystems. This prediction is borne out by the new results.

"The massive decline in bird fauna across the USA requires that we adopt new paradigms for conservation. Currently, most management and conservation agencies and plans are separated into aquatic and terrestrial divisions. However, aquatic and terrestrial ecosystems are inextricably linked through a suite of ecological connections," says author Dr Maeika Sullivan, associate professor in the School of Environment and Natural Resources and Director of the Schiermeier Olentangy River Wetland Research Park at Ohio State University.

Sullivan and colleagues analyzed data from multiple open-access surveys monitoring water quality, aquatic invertebrates and 21 species of aerial insectivorous birds from the contiguous United States. "The task of putting together these big data sets, collected by different US agencies with different goals and objectives, revealed several new questions and challenges which will require interdisciplinary thinking to resolve," says corresponding author Dr David Manning, assistant professor in the Department of Biology, University of Nebraska at Omaha.

First, the authors show that water quality is a good predictor for local relative abundance of emergent insects. Then they show for the first time that water quality and the associated abundance of emergent insects explains a moderate but significant proportion of the variation in local abundance of aerial insectivorous birds in the US, including both upland and riparian (i.e. foraging on river banks) species.
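A minimal sketch of that chain of associations, fitted on synthetic data with ordinary least squares (the study's actual models are more sophisticated):

```python
# Toy two-step association: water quality -> emergent insects -> birds.
# Synthetic data and simple OLS slopes, purely to illustrate the chain.
import numpy as np

rng = np.random.default_rng(1)
n = 200
water_quality = rng.uniform(0, 1, n)                    # 0 = poor, 1 = good
insects = 2.0 * water_quality + rng.normal(0, 0.3, n)   # emergent-insect index
birds   = 1.5 * insects       + rng.normal(0, 0.5, n)   # insectivorous-bird index

def slope(x, y):
    """Ordinary least-squares slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

print(f"insects ~ water quality: slope {slope(water_quality, insects):.2f}")
print(f"birds ~ insects:         slope {slope(insects, birds):.2f}")
```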

Not all bird species were equally negatively impacted by declines in the abundance of emergent insects, suggesting that factors such as the birds' microhabitat and foraging strategy may also play a role. The western wood pewee (Contopus sordidulus, an upland bird species), the olive-sided flycatcher (C. cooperi, which facultatively lives in riparian zones), and the Acadian flycatcher (Empidonax virescens, which almost exclusively occurs near water) depended most strongly on the local abundance of overall emergent insects. The eastern phoebe (Sayornis phoebe), violet-green swallow (Tachycineta thalassina), tree swallow (Tachycineta bicolor), eastern wood-pewee (C. virens), barn swallow (Hirundo rustica), and chimney swift (Chaetura pelagica) were specifically sensitive to the relative abundance of stoneflies, mayflies and caddisflies.

The authors emphasize the need for interdisciplinary research to develop new conservation and biomonitoring strategies focused on the effects of water quality on endangered birds and other terrestrial wildlife.

"We need a better understanding of the common mechanisms that could drive declines in both aquatic insects and many bird species. We would like to explore some of these shared mechanisms in future research, but at a much larger scale than previously. Tackling these questions will require collaboration among freshwater ecologists, ornithologists, landscape ecologists, entomologists, data scientists, and others," says Manning.

###

OU study highlights need for improving methane emission database

UNIVERSITY OF OKLAHOMA

Research News

A University of Oklahoma-led study published in 2020 revealed that both the area and plant growth of paddy rice are significantly related to the spatial-temporal dynamics of atmospheric methane concentration in monsoon Asia, where 87% of the world's paddy rice fields are situated. Now, the same international research team has released a follow-up discussion paper in the journal Nature Communications. In this paper, the team identifies the limits and insufficiency of the major greenhouse gas emission database (EDGAR, the Emissions Database for Global Atmospheric Research) in estimating paddy rice methane emissions.

"Methane emission from paddy rice fields contribute to the rising of atmospheric methane concentration (XCH4), one of the greenhouse gases for global warming and climate change," said Xiangming Xiao, a professor in the Department of Microbiology and Plant Biology, College of Arts and Sciences. "In this paper, our team highlighted the needs and pathways to improve this dataset, which could lead to substantial improvement in understanding and modeling methane emission, atmospheric transport and chemistry over monsoon Asia and the globe."

OU researchers developed annual paddy rice maps at 500-meter spatial resolution and quantified the spatial-temporal changes in rice paddy area in monsoon Asia during 2000-2015. Xiao said these annual maps are the first of their kind and could be used to further improve simulations of models that estimate methane emission from paddy rice fields.
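As a back-of-the-envelope for how such maps translate into area estimates (the 500-meter resolution is from the study; the pixel count is invented):

```python
# Area from a categorical map: each 500 m x 500 m pixel covers 0.25 km^2,
# so paddy-rice area is simply the rice-pixel count times the pixel area.
PIXEL_AREA_KM2 = 0.5 * 0.5

def paddy_area_km2(rice_pixel_count: int) -> float:
    return rice_pixel_count * PIXEL_AREA_KM2

print(paddy_area_km2(400_000))  # hypothetical 400,000 rice pixels -> 100,000.0 km^2
```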