Friday, March 12, 2021

Breaking the warp barrier for faster-than-light travel

Astrophysicist at Göttingen University discovers new theoretical hyper-fast soliton solutions

UNIVERSITY OF GÖTTINGEN

Research News

IMAGE: Artistic impression of different spacecraft designs considering theoretical shapes of different kinds of "warp bubbles".

CREDIT: E LENTZ

If travel to distant stars within an individual's lifetime is going to be possible, a means of faster-than-light propulsion will have to be found. To date, even recent research about superluminal (faster-than-light) transport based on Einstein's theory of general relativity would require vast amounts of hypothetical particles and states of matter that have "exotic" physical properties such as negative energy density. This type of matter either cannot currently be found or cannot be manufactured in viable quantities. In contrast, new research carried out at the University of Göttingen gets around this problem by constructing a new class of hyper-fast 'solitons' using sources with only positive energies that can enable travel at any speed. This reignites debate about the possibility of faster-than-light travel based on conventional physics. The research is published in the journal Classical and Quantum Gravity.

The author of the paper, Dr Erik Lentz, analysed existing research and discovered gaps in previous 'warp drive' studies. Lentz noticed that there existed yet-to-be-explored configurations of space-time curvature organized into 'solitons' that have the potential to solve the puzzle while being physically viable. A soliton - in this context also informally referred to as a 'warp bubble' - is a compact wave that maintains its shape and moves at constant velocity. Lentz derived the Einstein equations for unexplored soliton configurations (where the space-time metric's shift vector components obey a hyperbolic relation), finding that the altered space-time geometries could be formed in a way that worked even with conventional energy sources. In essence, the new method uses the very structure of space and time arranged in a soliton to provide a solution to faster-than-light travel, which - unlike other research - would only need sources with positive energy densities. No "exotic" negative energy densities needed.

If sufficient energy could be generated, the equations used in this research would allow space travel to Proxima Centauri, our nearest star, and back to Earth in years instead of decades or millennia. That means an individual could travel there and back within their lifetime. In comparison, current rocket technology would take more than 50,000 years for a one-way journey. In addition, the solitons (warp bubbles) were configured to contain a region with minimal tidal forces such that the passing of time inside the soliton matches the time outside: an ideal environment for a spacecraft. This means there would not be the complications of the so-called "twin paradox" whereby one twin travelling near the speed of light would age much more slowly than the other twin who stayed on Earth: in fact, according to these equations, both twins would be the same age when reunited.
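As a sanity check on that timescale, the one-way travel time follows from the distance to Proxima Centauri (about 4.24 light-years) and an assumed coasting speed. The 17 km/s figure below is a Voyager-1-like assumption chosen for illustration, not a number from the paper:

```python
# Rough check of the "more than 50,000 years" claim for a chemical rocket.
LIGHT_YEAR_M = 9.461e15               # metres per light-year
SECONDS_PER_YEAR = 3.156e7

distance_m = 4.24 * LIGHT_YEAR_M      # Earth to Proxima Centauri
speed_m_s = 17e3                      # assumed Voyager-1-like coasting speed

years = distance_m / speed_m_s / SECONDS_PER_YEAR
print(f"one-way trip: ~{years:,.0f} years")
```

At that assumed speed the trip takes roughly 75,000 years, consistent with the "more than 50,000 years" quoted above.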

"This work has moved the problem of faster-than-light travel one step away from theoretical research in fundamental physics and closer to engineering. The next step is to figure out how to bring down the astronomical amount of energy needed to within the range of today's technologies, such as a large modern nuclear fission power plant. Then we can talk about building the first prototypes," says Lentz.

Currently, the amount of energy required for this new type of space propulsion drive is still immense. Lentz explains, "The energy required for this drive travelling at light speed encompassing a spacecraft of 100 meters in radius is on the order of hundreds of times the mass of the planet Jupiter. The energy savings would need to be drastic, approximately 30 orders of magnitude, to be in range of modern nuclear fission reactors." He goes on to say: "Fortunately, several energy-saving mechanisms have been proposed in earlier research that can potentially lower the energy required by nearly 60 orders of magnitude." Lentz is currently in the early stages of determining whether these methods can be modified, or whether new mechanisms are needed, to bring the energy required down to what is currently possible.
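The quoted figures can be checked with a back-of-envelope calculation. Assuming mass-energy equivalence E = mc², taking "hundreds of Jupiter masses" as roughly 300, and comparing against a 1 GW fission plant running for one year (all assumptions for illustration, not values from the paper):

```python
import math

# Illustrative order-of-magnitude check of the quoted "30 orders of magnitude".
M_JUPITER = 1.898e27          # kg, mass of Jupiter
C = 2.998e8                   # m/s, speed of light

warp_energy = 300 * M_JUPITER * C**2     # J, "hundreds" of Jupiter masses as energy
fission_year = 1e9 * 3.156e7             # J, a 1 GW plant running for one year

gap = math.log10(warp_energy / fission_year)
print(f"energy gap: ~{gap:.0f} orders of magnitude")   # ~30
```

The result lands at roughly 30 orders of magnitude, matching Lentz's estimate.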

###

Original publication: Erik W Lentz, Breaking the Warp Barrier: Hyper-Fast Solitons in Einstein-Maxwell-Plasma Theory, Classical and Quantum Gravity, March 2021. DOI: 10.1088/1361-6382/abe692


CAPTION

Image to show how long it would take different types of spacecraft to travel from our solar system to Proxima Centauri (the nearest known star). Currently, the only option would be to use a chemical rocket meaning a journey time of over 50,000 years.

CREDIT

E Lentz

Bacterial film separates water from oil

NORTH CAROLINA STATE UNIVERSITY

Research News

Researchers have demonstrated that a slimy, yet tough, type of biofilm that certain bacteria make for protection and to help them move around can also be used to separate water and oil. The material may be useful for applications such as cleaning contaminated waters.

In the journal Langmuir, North Carolina State University researchers reported the findings of an experiment in which they used a material produced by the bacterium Gluconacetobacter hansenii as a filter to separate water from an oil-water mixture.

"It's really remarkable to think that these little bugs can make this stuff that is so perfect in many ways," said Lucian Lucia, the study's corresponding author and an associate professor of forest biomaterials and chemistry at NC State.

The biofilm the bacteria make and release into their environment is made of cellulose, which is the same material that gives plants a sturdy structure in their cell walls. However, when bacteria make cellulose, it has a tightly packed, crystalline structure, researchers said.

"It's one of the purest, if not the purest, forms of cellulose out there," Lucia said. "It's very well structured. It's very water loving, and it's got a very high crystallinity, so it packs very beautifully. Once you strip out the bacteria, you have this amazingly tough material that has a real robustness, or toughness."

The bacteria make the film to protect themselves, the researchers said.

"If you leave something like an unwashed dish out, it can turn all slimy and gross - that's a biofilm," said study co-author Wendy Krause, associate professor of textile engineering, chemistry and science at NC State. "Different bacteria make different biofilms. The bacterial film that we're studying is made of cellulose. The bacteria are making it because they live on it and in it. They're making their home."

In the experiment, researchers used the bacteria as factories of cellulose nano-fibers. They then removed the bacteria and their non-cellulose residue. Finally, the researchers used the cellulose membrane to see if it could separate water from a solution containing both oil and water.

They found the material was effective at removing water, and it was sturdy.

"The oil doesn't want to go through the membrane; it has a repulsive effect to it," Lucia said. "It's super fat-hating."

"If the oil and water were highly mixed, it doesn't matter," Krause added. "You could put an immersion blender into the solution, and the membrane will still separate the water and oil."

Researchers see a variety of potential applications for the material in situations where you need to recover water from an oily mixture - whether it be to clean water contaminated with a textile dye or for environmental remediation. In future work, the researchers want to explore how they can tailor the membrane by chemically modifying it for certain applications.

The study, "Bacterial Superoleophobic Fibrous Matrices: A Naturally Occurring Liquid-Infused System for Oil-Water Separation," was published online in the journal Langmuir on Feb. 19.

###

Note to editors: The abstract follows.

"Bacterial Superoleophobic Fibrous Matrices: A Naturally Occurring Liquid-Infused System for Oil-Water Separation."

Published online in Langmuir on Feb. 19, 2021.

Authors: Zahra Ashrafi, Zimu Hu, Lucian Lucia and Wendy Krause.

DOI: 10.1021/acs.langmuir.0c02717

Abstract: Nanocellulose fibers bioengineered by bacteria are a high performance three-dimensional cross-linked network which can confine a dispersed liquid medium such as water. The strong chemical and physical interactions of dispersed water molecules with the entangled cellulosic network allow these materials to be ideal substrates for effective liquid separation. This type of phenomenon can be characterized as green with no equivalent precedent; its performance and sustainability relative to other cellulose-based or synthetic membranes are shown herein to be superior. In this work, we demonstrated that the renewable bacterial nanocellulosic membrane can be used as a stable liquid-infused system for the development of soft surfaces with superwettability and special adhesion properties and thus address intractable issues normally encountered by solid surfaces.


Deforestation's effects on malaria rates vary by time and distance

Study shows that deforestation in Southeast Asia increases malaria infections before leading to later reductions, although these effects can vary by the location of forest loss

ELIFE

Research News

Deforestation may cause an initial increase in malaria infections across Southeast Asia before leading to later decreases, a study published today in eLife suggests.

The results may help malaria control programs in the region develop better strategies for eliminating malaria infections and educating residents on how to protect themselves from infection.

Mosquitoes spread the malaria parasite to humans, causing infections that can be severe and sometimes deadly. In the area along the Mekong River in Southeast Asia, many residents hunt or harvest wood in the surrounding forests, which can increase their risk of infection. Yet recent outbreaks of malaria in the region have also been linked to deforestation.

"As countries in the region focus their malaria control and elimination efforts on reducing forest-related transmission, understanding the impact of deforestation on malaria rates is essential," says first author Francois Rerolle, Graduate Student Researcher at the University of California San Francisco (UCSF), US, who works within the UCSF Malaria Elimination Initiative.

To better understand the effects of deforestation on malaria transmission, Rerolle and colleagues examined both forest cover data and village-level malaria incidence data from 2013-2016 in two regions within the Greater Mekong Sub-region.

They found that in the first two years following deforestation activities, malaria infections increased in villages in the area, but then decreased in later years. This trend was mostly driven by infections with the malaria parasite Plasmodium falciparum. Deforestation in the immediate 1-10-kilometer radius surrounding villages did not affect malaria rates, but deforestation in a wider 30-kilometer radius around the villages did. The authors say this is likely due to the effect that wider deforestation can have on human behaviour. "We suspect that people making longer and deeper trips into the forest results in increased exposure to mosquitoes, putting forest-goers at risk," Rerolle explains.
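The buffer comparison described above can be illustrated with a toy computation. The village coordinates, patch locations, and hectare values below are invented for demonstration and are not the study's data:

```python
# Illustrative sketch (not the study's actual pipeline): compare forest loss
# inside a 10 km versus a 30 km buffer around a village.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    R = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2)**2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2)**2
    return 2 * R * math.asin(math.sqrt(a))

village = (20.0, 100.0)  # hypothetical lat/lon in the Greater Mekong region
# Hypothetical deforestation patches: (lat, lon, hectares of forest lost)
patches = [(20.05, 100.02, 12.0), (20.20, 100.15, 40.0), (19.80, 99.90, 25.0)]

def loss_within(radius_km):
    """Total forest loss (ha) from patches within radius_km of the village."""
    return sum(ha for lat, lon, ha in patches
               if haversine_km(village[0], village[1], lat, lon) <= radius_km)

print(loss_within(10), loss_within(30))   # 12.0 77.0
```

In this made-up example, most of the loss sits in the wider 30 km buffer; the study's finding is that it is exactly this wider-buffer loss that tracked malaria rates.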

Previous studies in the Amazon region of South America found increased malaria infections in the first 6-8 years after deforestation, after which malaria rates fall. The difference in timing may be due to regional differences. The studies in the Amazon looked at deforestation driven by non-indigenous people moving deeper into the forest, while communities in the current study have long lived at the forest edges and rely on subsistence agriculture.

"Our work provides a more complete picture of the nuanced effects of deforestation on malaria infections," says senior author Adam Bennett, Program Lead at the UCSF Malaria Elimination Initiative. "It may encourage more in-depth studies on the environmental and behavioural drivers of malaria to help inform strategies for disease elimination."

###

Media contact

Emily Packer, Media Relations Manager
eLife
e.packer@elifesciences.org
+44 (0)1223 855373

About eLife

eLife is a non-profit organisation created by funders and led by researchers. Our mission is to accelerate discovery by operating a platform for research communication that encourages and recognises the most responsible behaviours. We aim to publish work of the highest standards and importance in all areas of biology and medicine, including Epidemiology and Global Health, while exploring creative new ways to improve how research is assessed and published. eLife receives financial support and strategic guidance from the Howard Hughes Medical Institute, the Knut and Alice Wallenberg Foundation, the Max Planck Society and Wellcome. Learn more at https://eLifesciences.org/about.

To read the latest Epidemiology and Global Health research published in eLife, visit https://eLifesciences.org/subjects/epidemiology-global-health.

About the UCSF Malaria Elimination Initiative

The Malaria Elimination Initiative works in partnership with malaria endemic countries and regions to advance evidence-based malaria policy and practice. Learn more at http://www.shrinkingthemalariamap.org.

Both old and young fish sustain fisheries

ARC CENTRE OF EXCELLENCE FOR CORAL REEF STUDIES

Research News

IMAGE: There are four species of coral grouper found commonly on the Great Barrier Reef. The bar-cheek coral grouper (Plectropomus maculatus) is distinguished by elongated dot patterns.

CREDIT: PHIL WOODHEAD, WET IMAGE UNDERWATER PHOTOGRAPHY.

Scientists have used modern genetic techniques to prove age-old assumptions about what sizes of fish to leave in the sea to preserve the future of local fisheries.

"We've known for decades that bigger fish produce exponentially more eggs," said the lead author of the new study, Charles Lavin, a research fellow at James Cook University (JCU) and Nord University in Norway.

"However, we also found that while these big fish contributed significantly to keeping the population going, they are also rare."

Co-author Dr Hugo Harrison from the ARC Centre of Excellence for Coral Reef Studies at JCU said as fish grow older, they become more fertile and their chances of having babies increase.

"This is an age-old assumption of fisheries management--and with the help of modern genetics, we can show that this assumption is correct."

"But the smaller fish are just as important to keeping populations going. They may have fewer babies, but they also are more abundant."

The study used genetic parentage analysis to identify which adult coral groupers (Plectropomus maculatus) contribute to replenishing fished populations in the Great Barrier Reef Marine Park (GBRMP).

The authors found that large coral groupers are important because they are more likely to replenish the fish stocks removed from the fishery. However, smaller fish are still making a meaningful contribution.
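The trade-off between individual fecundity and abundance can be sketched with assumed numbers (none of the figures below come from the study):

```python
# Toy illustration: total egg output per size class equals abundance times
# per-female fecundity. Fecundity rises steeply with size; abundance falls.
size_classes = {          # length (cm) -> (number of females, eggs per female)
    35: (1000, 1.0e5),
    50: (300, 4.0e5),
    65: (60, 1.2e6),
    80: (10, 3.0e6),
}

for length, (n, eggs) in size_classes.items():
    total = n * eggs
    print(f"{length} cm: {n:>5} females x {eggs:.1e} eggs = {total:.2e}")
```

With these assumed numbers, the rare 80 cm fish are individually thirty times more fecund than the 35 cm fish, yet the abundant small size class still produces more eggs in aggregate, which is the study's central point.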

"We show that minimum size-limits on catches are effective at protecting the reproductively mature coral grouper," Mr Lavin said. "This ensures all fish have the opportunity to reproduce at least once prior to being caught."

The authors said all fisheries must ensure there are enough fish reproducing to replace the portion of the population that are caught.

"We're fortunate in the GBRMP to have measures in place that protect both the small and larger fish," Dr Harrison said.

"These ensure our fisheries remain sustainable and can bounce back quickly after a disturbance."

In the GBRMP, catches of coral grouper are limited by size and catch limits, as well as seasonal closures to ensure the fishery is productive and sustainable.

"It's encouraging that these measures are effective," Mr Lavin said.

"But it's important that we also protect the bigger, rarer fish inside no-take marine reserves because they are super-productive," he said.

"For the fisher, this means there will always be fish to catch."

###

PAPER

Lavin C, Jones G, Williamson D, Harrison H. (2021). 'Minimum size limits and the reproductive value of numerous, young, mature female fish'. Proceedings of the Royal Society B. DOI: 10.1098/rspb.2020.2714

Researchers modify air quality models to reflect polluted reality in Latin America

NORTH CAROLINA STATE UNIVERSITY

Research News

IMAGE: Computational models of air quality have long been used to shed light on pollution control efforts in the United States and Europe, but the tools have not found widespread adoption...

CREDIT: JAMES EAST

Computational models of air quality have long been used to shed light on pollution control efforts in the United States and Europe, but the tools have not found widespread adoption in Latin America. New work from North Carolina State University and Universidad de La Salle demonstrates how these models can be adapted to offer practical insights into air quality challenges in the Americas outside the U.S.

Computational air quality models can be used in multiple ways. For example, they can be used to determine which sources are responsible for what fraction of air pollution. They can also help authorities predict how air pollution might change if different pollution control methods are adopted.
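A toy example of both uses, with entirely hypothetical PM2.5 contributions (real air quality models solve atmospheric chemistry and transport; this sketch only shows the bookkeeping):

```python
# Hypothetical PM2.5 source contributions in micrograms per cubic metre.
sources_ug_m3 = {
    "unpaved road dust": 8.0,
    "heavy-duty vehicles": 6.0,
    "industry": 3.0,
    "other": 3.0,
}

# Use 1: source apportionment - which sources account for what fraction.
total = sum(sources_ug_m3.values())
for name, c in sources_ug_m3.items():
    print(f"{name}: {100 * c / total:.0f}% of {total:.0f} ug/m^3")

# Use 2: scenario prediction - e.g. paving roads halves road-dust emissions.
scenario = dict(sources_ug_m3, **{"unpaved road dust": 4.0})
print(f"paved-roads scenario: {sum(scenario.values()):.0f} ug/m^3")
```

The two largest hypothetical sources here mirror the study's actual finding for Bogotá (unpaved road dust and heavy-duty vehicles), but the numbers are invented for illustration.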

"Historically, it's been very challenging to apply these modeling tools in Latin America, so it has rarely been done," says Fernando Garcia Menendez, corresponding author of a paper on the work and an assistant professor of environmental engineering at NC State. "This is important because the region has many areas that are dealing with significant air pollution, and these modeling tools can help governments identify the most cost-effective ways of achieving air quality improvements."

One challenge to using computational air quality models in Latin America is that the relevant modeling frameworks were developed largely in the context of the U.S. and Europe. That means that some of the assumptions that modelers took for granted when developing the tools don't always apply in Latin American cities. Furthermore, computational resources and trained environmental modelers are still scarce in the region.

For example, there is often substantially less air emissions data available. In addition, there are some contributors to air pollution that are common across Latin American metro areas but that differ from what we see in the U.S. - more unpaved roads, an older cargo fleet, a large number of motorcycles, informal economies, and so on.

With that in mind, Garcia Menendez developed a research project with collaborators at the Universidad de La Salle, in Bogotá, Colombia. Specifically, the research team fine-tuned a modeling framework to reflect the air pollution dynamics in Bogotá and investigate the city's air quality problems. The collaborators at Universidad de La Salle also collected air pollution data that allowed the team to assess the accuracy of its modeling results.

"Our paper outlines the techniques we've used to perform computational modeling of air quality issues in a large Latin American city," says James East, first author of the paper and a Ph.D. student at NC State. "This not only demonstrates that it can be done, but provides an approach that others can use to provide insights into air pollution in other parts of the region that are experiencing similar issues."

While the paper focuses on an air quality model for fine particulate matter (PM2.5), the researchers say that the model could be used to look at other air pollutants. Exposure to PM2.5 is associated with a wide variety of health problems, including heart and lung disease.

In their proof-of-concept demonstration, the researchers found that the largest local sources of PM2.5 in Bogotá were dust from unpaved roads and emissions from heavy-duty vehicles. However, when the model was used to project future air quality, the study also found that while paving roads would decrease air pollution in some parts of the city, different emission sources would still lead to increased air pollution in other parts of the city - unless other emission control measures were also implemented.

In short, the model offered practical insights into possible solutions for a complex metropolitan area of 10 million people.

"These findings are of interest to environmental authorities, from the local to the national level, who are pursuing ways to effectively address air pollution in Bogotá and other Colombian cities," says Jorge Pachon, a co-author of the paper and an associate professor at the Universidad de La Salle.

###

The paper, "Air quality modeling to inform pollution mitigation strategies in a Latin American megacity," is published in Science of The Total Environment. The paper was co-authored by Juan Sebastian Montealegre of the Universidad de La Salle. The work was partially funded by Ecopetrol, Colombia's national petroleum company.


Recyclable bioplastic membrane to clear oil spills from water

UNIVERSITY OF GRONINGEN

Research News

IMAGE: The new vitrimer membrane is made by pressing and sintering of polymers from the natural monomer malic acid. This membrane can be recycled by ball milling and subsequent pressing and...

CREDIT: CHONGNAN YE, UNIVERSITY OF GRONINGEN

Polymer scientists from the University of Groningen and NHL Stenden University of Applied Sciences, both in the Netherlands, have developed a polymer membrane from biobased malic acid. It is a superamphiphilic vitrimer epoxy resin membrane that can be used to separate water and oil. This membrane is fully recyclable. When the pores are blocked by foulants, it can be depolymerized, cleaned and subsequently pressed into a new membrane. A paper describing the creation of this membrane was published in the journal Advanced Materials on 7 March 2021.

How do you clean up an oil spill in water? This is quite a challenge. Superamphiphilic membranes, which 'love' both oil and water, are a promising solution but not yet a very practical one. These membranes are often not robust enough for use outside the laboratory environment, and the membrane pores can clog up as a result of fouling by algae and sand. Chongnan Ye and Katja Loos from the University of Groningen and Vincent Voet and Rudy Folkersma from NHL Stenden used a relatively new type of polymer to create a membrane that is both strong and easy to recycle.

Dynamic network

In recent years, the researchers from both institutes have joined forces to investigate vitrimer plastics, polymer materials that have the mechanical properties and chemical resistance of a thermoset plastic. However, vitrimer plastics can also behave like a thermoplastic, since they can be depolymerized and reused. This means that a vitrimer plastic has all the qualities to make a good membrane for oil spill remediation. 'Furthermore, it was made from malic acid, a natural monomer,' adds Loos.

'The polymers in the vitrimer are crosslinked in a reversible manner,' explains Voet. 'They form a dynamic network, which enables recycling of the membrane.' The vitrimer is produced through base-catalysed ring-opening polymerization between pristine and epoxy-modified biobased malic acid. The polymers are ground into a powder by ball milling and turned into a porous membrane through the process of sintering.

Pores

Both water and oil will spread out on the resulting superamphiphilic membrane. In an oil spill, much more water is present than oil, which means that the membrane is covered by water that can then pass through the pores. Voet: 'The water film on the membrane's surface keeps the oil out of the pores so that it is separated from the water.'

The membrane is firm enough to filter oil from the water. When sand and algae clog up the pores, the membrane can be depolymerized and recreated from the building blocks after removal of the pollutants. 'We have tested this on a laboratory scale of a few square centimetres,' says Loos. 'And we are confident that our methods are scalable, both for the polymer synthesis and for the production and recycling of the membrane.' The scientists are hoping that an industrial partner will take up further development.

Applications

Creating this new membrane for oil spill remediation shows the power of cooperation between a research university and an applied university. 'A while ago, we decided that the polymer groups at the two institutes should become one, by sharing students, staff and facilities. We recently started the first hybrid research group in the Netherlands,' explains Loos. This makes it easier to find applications for newly designed materials. Voet: 'Polymer chemists strive to link molecular structures to material properties and applications. Our hybrid research team has the experience to do just that.'

###

Reference: Chongnan Ye, Vincent S. D. Voet, Rudy Folkersma and Katja Loos: Robust Superamphiphilic Membrane with a Closed-loop Life Cycle. Advanced Materials, 7 March 2021.


The neoliberal city needs to change, argues Concordia professor Meghan Joy

A new policy agenda calls for progressive measures to restrict widening inequality

CONCORDIA UNIVERSITY

Research News

IMAGE: Meghan Joy: "The neoliberal urban model has had time to prove whether it works for all the people in a city. It is clear that it does not."

CREDIT: CONCORDIA UNIVERSITY

What would a truly progressive city look like? A city that pays more than lip service to issues that directly affect low-income residents, seniors, marginalized communities and others whom neoliberal policies have seemingly left behind?

Meghan Joy, an assistant professor of political science, argues that urban studies scholars, and particularly urban political scientists, should re-assess the concept of the progressive city. The once-widely embraced notion fell out of favour over the past several decades as local politicians embraced neoliberal policies that she says prioritized wealth generation over liveability and accessibility for all city residents. In a new paper recently published in Urban Affairs Review, Joy and co-author Ronald K. Vogel of Ryerson University lay out a policy agenda for urban policy thinkers who believe it may be time to shift the thinking around how cities are run and for whose benefit.

"The neoliberal urban model has had time to prove whether it works for all the people in a city," Joy says. "It is clear that it does not, especially for vulnerable people or those living on low incomes."

Policy problems and progressive solutions

Joy believes many cities are at a point of crisis, especially in four key areas: housing, employment, transportation and climate change. The authors do not address issues of policing in this paper, though they do acknowledge that rethinking current approaches to crime and enforcement is essential to a progressive city policy agenda.

In their agenda, Joy and Vogel identify each area's major problem and offer progressive solutions.

They write that affordable housing often depends on incentives offered to developers who in turn wind up building more homes for residents of moderate rather than low income. Joy and Vogel point to creative solutions employed by cities as disparate as Vienna and Hong Kong: governments either own large amounts of housing stock directly or back non-market-driven developers to ensure low-income earners are not pushed out of their city.

Deindustrialization has had a major effect on the nature of employment in many cities, as well as their overall finances. Neoliberal policies have led to a surge in the service economy, but wage inequality and job precarity have led to increasing poverty and a squeezing of the middle class. The authors believe city governments should spend more on hiring employees to provide public services and rely more on in-house talent rather than contracting out to for-profit consultants.

Neoliberalism's effects on transportation include chronic underfunding and underservicing of areas that need public transit the most. The issue is closely tied to housing, with homes conveniently located near transit access points often priced well beyond the means of low-income earners. The authors call for the implementation of social equity transit planning to better serve disadvantaged communities, including subsidies and expanded access.

Finally, climate change has increased the probability and severity of natural disasters such as floods, hurricanes, winter storms and tornadoes, which often require mass evacuations or emergency assistance. Many low-income residents lack the ability to get out of harm's way even with advance warning and often lack relocation options. Joy and Vogel want to see options other than those relying on market forces and entrepreneurialism and push for policies that reduce the overall carbon footprint.

Grassroots movements growing

Joy does believe that grassroots resistance to neoliberalism is growing in cities and there is evidence of movements increasingly pushing on the levers of municipal power.

"We are certainly seeing a groundswell of movement building, especially with Black Lives Matter and the call to defund the police, and around COVID-19 and housing," she says.

"There is more visibility around the question of who we are thinking about when we make urban policy and who benefits in the city. We need to think about how to translate this movement building into an urban policy agenda."

###

Milk prebiotics are the cat's meow, Illinois research shows

UNIVERSITY OF ILLINOIS COLLEGE OF AGRICULTURAL, CONSUMER AND ENVIRONMENTAL SCIENCES

Research News

IMAGE: Research from the University of Illinois identifies key milk oligosaccharides in dog and cat milk, and shows a molecular mimic of these compounds in pet foods makes them highly palatable...

CREDIT: UNIVERSITY OF ILLINOIS

URBANA, Ill. - If you haven't been the parent or caregiver of an infant in recent years, you'd be forgiven for missing the human milk oligosaccharide trend in infant formulas. These complex carbohydrate supplements mimic human breast milk and act like prebiotics, boosting beneficial microbes in babies' guts.

Milk oligosaccharides aren't just for humans, though; all mammals make them. And new University of Illinois research suggests milk oligosaccharides may be beneficial for cats and dogs when added to pet diets.

But before testing the compounds, scientists had to find them.

"When we first looked into this, there had only been one study on milk oligosaccharides in dogs, and none in domestic cats. The closest were really small studies on a single lion and a single clouded leopard," says Kelly Swanson, the Kraft Heinz Company Endowed Professor in Human Nutrition in the Department of Animal Sciences and the Division of Nutritional Sciences at Illinois.

"Our study was the first robust characterization of dog and cat milk oligosaccharides," he adds. "Our data not only provide a better understanding of how milk meets the nutritional needs of newborn kittens and puppies, but also how it helps promote gut immunity and establish a healthy gut microbial community early in life." That research appears in the journal PLoS ONE.

The foundational study identified three predominant oligosaccharide structures in canine milk: 3'-sialyllactose, 6'-sialyllactose, and 2'-fucosyllactose, the last being the same compound showing up in many infant formulas today. Together, these three structures made up more than 90% of the total oligosaccharides in canine milk.

Feline milk was much more complex and balanced, with approximately 15 structures making up 90% of total oligosaccharides. Of these, difucosyllactose-N-hexaose b, 3'-sialyllactose, and lacto-N-neohexaose represented more than 10% each.

"Even though domestic dogs and cats both evolved as carnivores, they are metabolically distinct in many ways. Although pet cats still exist as true carnivores, pet dogs are omnivorous in nature," Swanson says. "These new milk oligosaccharide data highlight another interesting difference between the species, justifying further research to reveal their role in the nutritional and health status of newborn puppies and kittens."

Even before Swanson and his colleagues identified the oligosaccharides in cat and dog milk, the pet food industry was beginning to recognize the potential benefits of these compounds as supplements in pet foods. In 2019, Swiss biotech company Gnubiotics Sciences announced an animal milk oligosaccharide-like product known as GNU100, but it hadn't been tested in animals. Swanson's team took that on.

In two separate studies, both published in the Journal of Animal Science, Swanson and his colleagues determined the safety, palatability, and digestibility of GNU100 in dogs and cats.

First, in vitro laboratory tests with cellular colonies showed no toxic effects or tendencies to cause cell mutation. There was no reason to expect toxicity, but the result satisfies one of the basic FDA requirements for inclusion of any new ingredient in pet foods.

Next, the researchers mixed GNU100 at 1% with a fat source and coated commercial dry diets for cats or dogs. As a control, fat-coated diets without GNU100 were also offered. When animals got to choose between the control and 1% bowls, they went crazy for the GNU100.

"In the cats, it was a huge preference. They ate nearly 18 times more food with GNU100 than the control food. We had just been hoping they wouldn't reject it. You know, cats can be pretty finicky," Swanson says. "When we got the data back it was like, wow, they really love that stuff! And the dogs did, too."

Swanson explains GNU100 is composed of a complex mixture of oligosaccharides and peptides, small protein-like compounds that may make the food more appetizing to cats and dogs.

Finally, the researchers included GNU100 in experimental diets at 0%, 0.5%, 1%, and 1.5% and fed them to healthy adult dogs and cats for six months. During that time, they measured stool quality, blood metabolites, and nutrient digestibility, and evaluated changes in gut metabolites and the gut microbial community.

Overall, cats and dogs did well with GNU100, with no adverse health effects. And the researchers saw shifts in the gut microbiome toward more beneficial species and their metabolite profiles.

Aside from the palatability test, changes associated with GNU100 were as expected, showing intriguing trends in gut microbiota and gut metabolites that Gnubiotics plans to explore in future studies. Swanson thinks they would have seen bigger benefits in a more targeted study focusing on newborn cats and dogs, geriatrics, or pets with compromised immune systems.

"Theoretically, these products should stabilize and feed good bacteria in the gut as well as limit the growth of potentially undesirable bacteria. So if an animal is undergoing treatment for something with antibiotics or is in a high stress situation, having that product in the diet might keep the gut from destabilizing," Swanson says. "Another target group for these products might be young animals as a way to maintain beneficial bacteria in the gut as they wean off their mothers. We'd need to do more testing to see if the product holds up in those target groups, but at least we know now that it is safe and well tolerated."

###

The three articles referenced here can be found online at:

https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0243323 (Identifying milk oligosaccharides in dogs and cats)

https://academic.oup.com/jas/article-abstract/99/1/skaa399/6035126?redirectedFrom=fulltext (GNU100 tested in cats)

https://academic.oup.com/jas/article-abstract/99/1/skab014/6102879?redirectedFrom=fulltext (GNU100 tested in dogs)

The Department of Animal Sciences is in the College of Agricultural, Consumer and Environmental Sciences at the University of Illinois.

New lung cancer screening recommendation expands access but may not address inequities

UNC LINEBERGER COMPREHENSIVE CANCER CENTER

Research News

IMAGE

IMAGE: "THE REVISED U.S. PREVENTIVE SERVICES TASK FORCE'S RECOMMENDATIONS ARE SOUND AND BASED ON WELL-CONCEIVED EVIDENCE AND MODELING STUDIES, BUT THEY ALONE ARE NOT ENOUGH, AS WE HAVE SEEN LIMITED UPTAKE..."

CREDIT: UNC LINEBERGER COMPREHENSIVE CANCER CENTER

CHAPEL HILL, NC -- Calling the U.S. Preventive Services Task Force's newly released recommendation statement to expand eligibility for annual lung cancer screening with low-dose computed tomography a step forward, UNC Lineberger Comprehensive Cancer Center researchers say future changes should address equity and implementation issues.

In an editorial published in JAMA, Louise M. Henderson, PhD, professor of radiology at UNC School of Medicine, M. Patricia Rivera, MD, professor of medicine at UNC School of Medicine, and Ethan Basch, MD, MSc, the Richard M. Goldberg Distinguished Professor in Medical Oncology and chief of oncology at the UNC School of Medicine, outlined their concerns and offered potential approaches to make the screening recommendation more inclusive of populations that have been historically underserved.

"The revised U.S. Preventive Services Task Force's recommendations are sound and based on well-conceived evidence and modeling studies, but they alone are not enough, as we have seen limited uptake of the prior recommendations," Basch said. "Implementation will require broader efforts by payers, health systems and professional societies, and, in the future, a more tailored, individual risk prediction approach may be preferable."

The task force has made two significant changes to the screening recommendation it issued in 2013: annual screening will now begin at age 50, instead of 55, and the smoking-intensity threshold has been reduced from a 30 pack-year to a 20 pack-year history. These more inclusive criteria could more than double the number of adults eligible for lung cancer screening, from 6.4 million to 14.5 million, according to some estimates.
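To make the criteria change concrete, here is a minimal sketch of the revised eligibility rule (the function names and structure are illustrative, not from the USPSTF; a pack-year is one pack of cigarettes per day for one year, and the recommendation also limits screening to ages 50 to 80 and to people who quit within the past 15 years):

```python
def pack_years(packs_per_day: float, years_smoked: float) -> float:
    """One pack-year = one pack (20 cigarettes) per day for one year."""
    return packs_per_day * years_smoked

def eligible_2021(age: int, packs_per_day: float, years_smoked: float,
                  years_since_quit: float = 0.0) -> bool:
    """Sketch of the 2021 USPSTF criteria: age 50-80, at least a
    20 pack-year history, and currently smoking or quit within 15 years."""
    return (50 <= age <= 80
            and pack_years(packs_per_day, years_smoked) >= 20
            and years_since_quit <= 15)

# A 52-year-old with a 25 pack-year history who quit 5 years ago is
# now eligible; under the 2013 criteria (age 55+, 30 pack-years) they
# would not have been.
print(eligible_2021(52, packs_per_day=1.0, years_smoked=25,
                    years_since_quit=5))
```

The example illustrates why the pool of eligible adults grows: people in their early fifties with moderate smoking histories, previously excluded on both age and intensity grounds, now qualify.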

Henderson, Rivera and Basch are encouraged that lung cancer screening will be available to more people, but they point out that expanding access alone won't reduce racial inequities, especially as measured by lung cancer deaths prevented and life-years gained.

It may be possible to counter this shortcoming, they said, by adding risk-prediction models that identify high-benefit individuals who do not meet USPSTF criteria. This could reduce or eliminate some, though not all, racial disparities, according to one study. Also, future research should explore risks such as family history of lung cancer and genetic susceptibility to develop risk assessment strategies that may identify individuals who never smoked and still have a high risk for lung cancer but currently are not eligible to be screened.

Financial barriers are also an issue. Expanding screening access to include people as young as 50 may lead to greater inequities for those who are enrolled in Medicaid, the state-based public health insurance program.

"Medicaid is not required to cover the USPSTF recommended screenings and even when screening is covered, Medicaid programs may use different eligibility criteria," Henderson said. She adds this is problematic because people who receive Medicaid are twice as likely to be current smokers as those with private insurance (26.3% compared to 11.1%), and they are disproportionately affected by lung cancer. "This is a significant issue, particularly in the nine states where Medicaid does not cover lung cancer screening."

Putting the screening recommendation into practice will be a substantial challenge, Rivera said. Primary care providers are critical to implementing the screening process because they initiate the conversation with their patients about the potential benefits and risks of lung cancer screening and make the screening referral. However, Rivera said many already have an overburdened workload, and it may be unrealistic to expect them to be able to spend the necessary time to have these complex conversations.

"A significant barrier to implementation of lung cancer screening is provider time. Many primary care providers do not have adequate time to have a shared decision-making conversation and to conduct a risk assessment," Rivera said. "Although a lung cancer screening risk model that incorporates co-morbidities and clinical risk variables may be the best tool for selecting high risk individuals who are most likely to benefit from screening, such a model requires input of additional clinical information, thereby increasing the time a provider will spend; the use of such a model in clinical practice has not been established."

Despite these limitations and challenges, the new recommendation can expand access to lung cancer screening, the researchers said in the editorial. "Beyond implementation challenges, the future of screening strategies lies in individualized risk assessment including genetic risk. The 2021 USPSTF recommendation statement represents a leap forward in evidence and offers promise to prevent more cancer deaths and address screening disparities. But the greatest work lies ahead to ensure this promise is actualized."


Disclosures

Henderson reported receiving grants from the National Cancer Institute. Rivera reported receiving grants from the National Cancer Institute for research in lung cancer screening, serving on the advisory panel for Biodesix and bioAffinity, and serving as a research consultant to Johnson & Johnson, outside the submitted work. Basch reported receiving fees from Astra Zeneca, CareVive Systems, Navigating Cancer, and Sivan Healthcare for serving as a scientific advisor/consultant, outside the submitted work.