Tuesday, October 20, 2020

 

When honey flows faster than water

Physicists surprised to find that in specially coated tubes, the more viscous a liquid is, the faster it flows

AALTO UNIVERSITY

Research News

It's widely known that thick, viscous liquids - like honey - flow more slowly than low-viscosity liquids, like water. Researchers were surprised to find this behaviour flipped on its head when the liquids flow through chemically coated capillaries. In fact, through these specially coated tubes, liquids a thousand times more viscous flow ten times faster.

The speed at which different fluids flow through pipes matters for a wide range of applications: from industrial processes such as oil refineries to biological systems like the human heart. Traditionally, if you need to make a fluid flow faster through a pipe, you increase the pressure on it. This technique, however, has its limits; there is only so much pressure you can put into a pipe before you run the risk of bursting it. This is especially true for thin and narrow pipes, like the ones used in microfluidics for producing medicine and other complex chemicals, so researchers are investigating whether they can increase the speed at which liquids flow through narrow tubes without having to increase the pressure.

In a paper published on 16 October in the journal Science Advances, researchers found that by coating the inside of the pipes with compounds that repel liquids, they could make viscous liquids flow faster than those with low viscosity.

'A superhydrophobic surface consists of tiny bumps that trap air within the coating, so that a liquid droplet that rests on the surface sits as if on a cushion of air,' explains Professor Robin Ras, whose research team at Aalto University's Department of Applied Physics has made a range of interesting discoveries in the area of extremely water repellent coatings, including recent papers in Science and Nature.

Superhydrophobic coatings themselves don't speed up the flow of the more viscous liquids. If you place a drop of honey and a drop of water on a superhydrophobic coated surface and then tilt the surface so gravity makes the droplets move, the low-viscosity water will flow down faster.

But when a droplet is confined to one of the very narrow tubes used in microfluidics, things change drastically. In this system, the superhydrophobic coating on the walls of the tube creates a small air gap between the inside wall of the tube and the outside of the droplet. 'What we found was that when a droplet is confined to a sealed superhydrophobic capillary, the air gap around the droplet is larger for more viscous liquids. This larger air gap is what allowed for the viscous fluids to move through the tube faster than the less viscous ones when flowing due to gravity,' says Dr Maja Vuckovac, the first author of the paper.

The size of the effect is quite substantial. Droplets of glycerol a thousand times more viscous than water flow through the tube more than ten times faster than water droplets. The researchers filmed the droplets as they moved through the tube, tracking not only how fast the liquid moved through the tube, but also how the liquid flowed inside the droplet. For viscous liquids, the liquid inside the droplet hardly moved around at all, whereas a fast mixing motion was detected in the lower viscosity droplets.

'The crucial discovery is that the less-viscous liquids also managed to penetrate a bit into the air cushion surrounding the droplets, rendering a thinner air gap around these. This means that the air beneath a low-viscosity droplet in the tube couldn't move out of the way as fast as for a more viscous droplet with a thicker air gap. With less air managing to squeeze past the low-viscosity droplets, these were forced to move down the tube with a slower speed than their more viscous counterparts,' explains Dr Matilda Backholm, one of the researchers on the project.
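A rough way to see why the thickness of the air gap matters so much is a back-of-the-envelope lubrication estimate: the air that must squeeze past a falling droplet flows through a thin film whose conductance grows with the cube of the film thickness. The sketch below is only that crude estimate, with made-up placeholder numbers; it is not the fluid dynamics model developed in the paper.

```python
# Back-of-the-envelope lubrication estimate of why a thicker air gap lets a
# droplet fall faster in a sealed superhydrophobic capillary. Illustrative
# sketch only, under crude assumptions (plug-like droplet, thin annular air
# film, quasi-steady balance of the droplet's weight against the pressure
# needed to push air through the film); NOT the model from the paper, and
# all numbers are placeholders.
import math

def droplet_speed(gap_m, tube_diameter_m, liquid_density, droplet_length_m,
                  air_viscosity=1.8e-5):
    """Estimate droplet speed v such that the air displaced by the droplet
    (v * tube cross-section) equals the lubrication flow through the gap:
        Q = (pi * D) * h**3 * dp / (12 * mu_air * L),  with dp ~ rho * g * L.
    """
    g = 9.81
    dp = liquid_density * g * droplet_length_m            # driving pressure
    film_flow = (math.pi * tube_diameter_m) * gap_m**3 * dp / (
        12.0 * air_viscosity * droplet_length_m)
    return film_flow / (math.pi * tube_diameter_m**2 / 4.0)

# Two hypothetical droplets in a 1 mm capillary: a water-like droplet with a
# thin 2-micron air gap, and a glycerol-like droplet with a thicker 5-micron gap.
print(droplet_speed(gap_m=2e-6, tube_diameter_m=1e-3,
                    liquid_density=1000, droplet_length_m=5e-3))
print(droplet_speed(gap_m=5e-6, tube_diameter_m=1e-3,
                    liquid_density=1260, droplet_length_m=5e-3))
# Because of the h**3 dependence, the thicker-gap droplet comes out roughly
# 20 times faster here, even though its own viscosity is far higher.
```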

The team developed a fluid dynamics model that can be used to predict how droplets would move in tubes coated with different superhydrophobic coatings. They hope that further work on these systems could have significant applications for microfluidics, a chemical engineering technique used to precisely control small quantities of liquid and to manufacture complex chemicals like medicines. Because the model predicts how the coatings modify fluid flow, it could help engineers developing new microfluidic systems.

###

Read More:

Viscosity-Enhanced Droplet Motion in Sealed Superhydrophobic Capillaries
Maja Vuckovac, Matilda Backholm, Jaakko V. I. Timonen, Robin H. A. Ras, Science Advances, DOI: https://dx.doi.org/10.1126/sciadv.aba5197

Natural nanodiamonds in oceanic rocks

Low pressure and temperature geological processes

UNIVERSITY OF BARCELONA

Research News

IMAGE

IMAGE: THE FLUID INCLUSIONS INSIDE THE OLIVINE CONTAIN NANODIAMONDS, IN ADDITION TO SERPENTINE, MAGNETITE, METALLIC SILICON AND PURE METHANE.

CREDIT: UNIVERSITY OF BARCELONA

Natural diamonds can form through low-pressure, low-temperature geological processes on Earth, as stated in an article published in the journal Geochemical Perspectives Letters. The newly identified mechanism, far removed from the classic view of diamond formation under ultra-high pressure, is confirmed in the study, which involved experts from the Mineral Resources Research Group of the Faculty of Earth Sciences of the University of Barcelona (UB).

Other participants in the study include experts from the Institute of Nanoscience and Nanotechnology of the UB (IN2UB), the University of Granada (UGR), the Andalusian Institute of Earth Sciences (IACT), the Institute of Ceramics and Glass (CSIC), and the National Autonomous University of Mexico (UNAM). The study was carried out within the framework of the doctoral thesis of researcher Núria Pujol-Solà (UB), first author of the article, under the supervision of researchers Joaquín A. Proenza (UB) and Antonio García-Casco (UGR).

Diamond: the hardest of all minerals

A symbol of luxury and wealth, diamond (from the Greek ἀδάμας, "invincible") is the most valuable gem and the hardest mineral (10 on the Mohs scale). It is formed of chemically pure carbon and, according to the traditional hypothesis, crystallizes in the cubic system under ultra-high-pressure conditions at great depths in the Earth's mantle.

The study confirms for the first time the formation of natural diamond under low pressure in oceanic rocks of the Moa-Baracoa Ophiolitic Massif in Cuba. This large geological structure lies on the north-eastern side of the island and is formed by ophiolites, rocks representative of the oceanic lithosphere.

These oceanic rocks were emplaced on the continental margin of North America during the collision of the Caribbean oceanic island arc, between 70 and 40 million years ago. "During their formation on the abyssal seafloor in the Cretaceous period - about 120 million years ago - these oceanic rocks underwent mineral alteration due to seawater infiltration, a process that produced small fluid inclusions inside the olivine, the most common mineral in this kind of rock", note Joaquín A. Proenza, member of the Department of Mineralogy, Petrology and Applied Geology at the UB and principal investigator of the project behind the article, and Antonio García-Casco, from the Department of Mineralogy and Petrology of the UGR.

"These fluid inclusions contain nanodiamonds -of about 200 and 300 nanometres-, apart from serpentine, magnetite, metallic silicon and pure methane. All these materials have formed under low pressure (<200 MPa) and temperature (<350 ºC), during the olivine alteration that contains fluid inclusions", add the researchers.

"Therefore, this is the first description of ophiolitic diamond formed under low pressure and temperature, whose formation under natural processes does not bear any doubts", they highlight.

Diamonds formed under low pressure and temperature

It is worth bearing in mind that in 2019 the team published a first description of ophiolitic diamond formation under low-pressure conditions (Geology), a study carried out as part of the doctoral thesis of UB researcher Júlia Farré de Pablo, supervised by Joaquín A. Proenza and UGR professor José María González Jiménez. That study was widely debated among the members of the international scientific community.

In the article published in Geochemical Perspectives Letters, a journal of the European Association of Geochemistry, the experts detected the nanodiamonds in small fluid inclusions beneath the surface of the samples. The finding was made using confocal Raman mapping and focused ion beam (FIB) preparation combined with transmission electron microscopy (FIB-TEM). In this way the researchers could confirm the presence of diamond within the samples and, therefore, the formation of natural diamond under low pressure in exhumed oceanic rocks. The Scientific and Technological Centres of the UB (CCiTUB), among other national research-support infrastructures, took part in the study.

The study thus calls into question the validity of some geodynamic models that, based on the presence of ophiolitic diamonds, imply mantle circulation and large-scale lithosphere recycling. For instance, ophiolitic diamond was thought to reflect the passage of ophiolitic rocks through the Earth's deep mantle, down to the transition zone (210-660 km deep), before their emplacement as normal ophiolites formed under low pressure (~10 km deep).

According to the experts, the low oxidation state of this geological system would explain the formation of nanodiamonds instead of graphite, which would otherwise be expected under the physical and chemical conditions in which the fluid inclusions formed.

The study was supported by the former Spanish Ministry for Economy and Competitiveness (MINECO), the Ramón y Cajal Programme and the EU European Regional Development Fund (ERDF).


Monday, October 19, 2020

Safe sex or risky romance? Young adults make the rational choice

ASSOCIATION FOR PSYCHOLOGICAL SCIENCE

Research News

A study published in the journal Psychological Science found that young adults--contrary to how they are sometimes portrayed in the media--tend to make highly rational decisions when it comes to selecting potential romantic partners.

This is not to say that young adults make risk-free choices, but they appear to consider both the risks and benefits of their sexual behavior in a highly consistent and thoughtful manner.

"There is a tendency to view sexual decision making in young adults as a highly variable and somewhat random process, more influenced by hormones or impulsivity than rational processes," said Laura Hatz, a doctoral candidate at the University of Missouri and lead author of the study. "Our study suggests, however, that young adults are highly consistent in their choices, balancing potential partners' level of attractiveness against the potential risk for sexually transmitted infection."

The research involved presenting 257 participants with hypothetical "sexual gambles" in which a photo of a potential partner's face was shown alongside an associated, though purely hypothetical, risk of contracting a sexually transmitted infection. Nearly all participants in the study made consistently rational choices, as defined by established models of psychological behavior. Prior research has shown that, in general, individuals tend to use what are known as heuristic decision strategies--cognitive shortcuts that may ignore some information--to make choices in life.

Hatz and her colleagues found that even individuals who could be identified as classic heuristic decision makers for monetary-based choices became rational decision makers when similar choices were framed as sexual choices.

###

See related content in the APS Research Topic on Love and Marriage.

Reference:

Hatz, L. E., Park, S., McCarty, K. N., McCarthy, D. M., & Davis-Stober, C. P. (2020). Young adults make rational sexual decisions. Psychological Science, 31(8), 944-956.
https://doi.org/10.1177/0956797620925036

 

Are climate scientists being too cautious when linking extreme weather to climate change?

UNIVERSITY OF WASHINGTON

Research News

IMAGE

IMAGE: THE PUBLIC EXPECTS TO RECEIVE ADVANCE WARNING OF HAZARDOUS WEATHER, SUCH AS TORNADOES AND WINTER STORMS. THIS PHOTO SHOWS A TORNADO IN PROSPECT VALLEY, COLORADO, ON JUNE 19, 2018.

CREDIT: ERIC MEOLA

In this year of extreme weather events -- from devastating West Coast wildfires to tropical Atlantic storms that have exhausted the alphabet -- scientists and members of the public are asking when these extreme events can be scientifically linked to climate change.

Dale Durran, a professor of atmospheric sciences at the University of Washington, argues that climate scientists need to approach this question in a way similar to how weather forecasters issue warnings for hazardous weather.

In a new paper, published in the October issue of the Bulletin of the American Meteorological Society, he draws on the weather forecasting community's experience in predicting extreme weather events such as tornadoes, flash floods, high winds and winter storms. If forecasters send out a mistaken alert too often, people will start to ignore them. If they don't alert for severe events, people will get hurt. How can the atmospheric sciences community find the right balance?

Most current approaches to attributing extreme weather events to global warming -- such as the conditions leading to the ongoing Western wildfires -- focus, he says, on the likelihood of raising a false alarm. Scientists do this by using statistics to estimate the increase in the probability of that event that is attributable to climate change. Those statistical measures are closely related to the "false alarm ratio," an important metric used to assess the quality of hazardous weather warnings.

But there is a second key metric used to assess the performance of weather forecasters, he argues: the probability that the forecast will correctly warn of events that actually occur, known as the "probability of detection." The ideal probability of detection score is 100%, while the ideal false alarm ratio would be zero.

Probability of detection has mostly been ignored when it comes to linking extreme events to climate change, he says. Yet both weather forecasting and climate-change attribution face a tradeoff between the two metrics: calculations in the paper show that raising warning thresholds to reduce false alarms produces a much greater drop in the probability of detection.
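As a concrete illustration of the two verification metrics, the short sketch below uses the standard textbook definitions with made-up warning counts; the numbers are not from Durran's paper.

```python
# Standard forecast-verification metrics: probability of detection (POD) and
# false alarm ratio (FAR). The counts are illustrative placeholders only.

def probability_of_detection(hits, misses):
    """POD = hits / (hits + misses); ideal value is 1.0 (100%)."""
    return hits / (hits + misses)

def false_alarm_ratio(hits, false_alarms):
    """FAR = false alarms / (hits + false alarms); ideal value is 0.0."""
    return false_alarms / (hits + false_alarms)

# A cautious warning threshold: almost no false alarms, but many missed events.
print(probability_of_detection(hits=20, misses=30))   # 0.40
print(false_alarm_ratio(hits=20, false_alarms=2))     # ~0.09

# A more aggressive threshold: more false alarms, far fewer misses.
print(probability_of_detection(hits=45, misses=5))    # 0.90
print(false_alarm_ratio(hits=45, false_alarms=15))    # 0.25
```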

Drawing on a hypothetical example of a tornado forecaster whose false alarm ratio is zero but whose probability of detection is low, he writes that such an "overly cautious tornado forecasting strategy might be argued by some to be smart politics in the context of attributing extreme events to global warming, but it is inconsistent with the way meteorologists warn for a wide range of hazardous weather, and arguably with the way society expects to be warned about threats to property and human life."

Why does this matter? The paper concludes by noting: "If a forecaster fails to warn for a tornado there may be serious consequences and loss of life, but missing the forecast does not make next year's tornadoes more severe. On the other hand, every failure to alert the public about those extreme events actually influenced by global warming facilitates the illusion that mankind has time to delay the actions required to address the source of that warming. Because the residence time of CO2 in the atmosphere is many hundreds to thousands of years, the cumulative consequences of such errors can have a very long lifetime."

###

For more information, contact Durran at drdee@uw.edu.

 

Remember that fake news you read? It may help you remember even more

ASSOCIATION FOR PSYCHOLOGICAL SCIENCE

Research News

People who receive reminders of past misinformation may form new factual memories with greater fidelity, according to an article published in the journal Psychological Science.

Past research highlights one insidious side of fake news: The more you encounter the same misinformation--for instance, that world governments are covering up the existence of flying saucers--the more familiar and potentially believable that false information becomes.

New research, however, has found that reminders of past misinformation can help protect against remembering misinformation as true while improving recollection of real-world events and information.

"Reminding people of previous encounters with fake news can improve memory and beliefs for facts that correct misinformation," said Christopher Wahlheim, a lead author on the paper and assistant professor of psychology at the University of North Carolina, Greensboro. "This suggests that pointing out conflicting information could improve the comprehension of truth in some situations."

Wahlheim and colleagues conducted two experiments examining whether reminders of misinformation could improve memory for and beliefs in corrections. Study participants were shown corrections of news and information they may have encountered in the past. Reminders of past misinformation appeared before some corrections but not others. Study results showed that misinformation reminders increased the participants' recall of facts and belief accuracy. The researchers interpreted the results to indicate that misinformation reminders raise awareness of discrepancies and promote memory updating. These results may be pertinent to individuals who confront misinformation frequently.

"It suggests that there may be benefits to learning how someone was being misleading. This knowledge may inform strategies that people use to counteract high exposure to misinformation spread for political gain," Wahlheim said.

###

See related content in the APS Research Topic on Memory.

Reference:

Wahlheim, C. N., Alexander T. R., & Peske, C. D. (2020). Reminders of everyday misinformation statements can enhance memory for and beliefs in corrections of those statements in the short term. Psychological Science, 31(10), 1325-1339.
https://doi.org/10.1177/0956797620952797




 

NASA supercomputing study breaks ground for tree mapping, carbon research

NASA/GODDARD SPACE FLIGHT CENTER

Research News

Scientists from NASA's Goddard Space Flight Center in Greenbelt, Maryland, and international collaborators demonstrated a new method for mapping the location and size of trees growing outside of forests, discovering billions of trees in arid and semi-arid regions and laying the groundwork for more accurate global measurement of carbon storage on land.

Using powerful supercomputers and machine learning algorithms, the team mapped the crown diameter - the width of a tree when viewed from above - of more than 1.8 billion trees across an area of more than 500,000 square miles, or 1,300,000 square kilometers. The team mapped how tree crown diameter, coverage, and density varied depending on rainfall and land use.

Mapping non-forest trees at this level of detail would take months or years with traditional analysis methods, the team said, compared to a few weeks for this study. The use of very high-resolution imagery and powerful artificial intelligence represents a technology breakthrough for mapping and measuring these trees. This study is intended to be the first in a series of papers whose goal is not only to map non-forest trees across a wide area, but also to calculate how much carbon they store - vital information for understanding the Earth's carbon cycle and how it is changing over time.

Measuring carbon in trees

Carbon is one of the primary building blocks for all life on Earth, and this element circulates among the land, atmosphere, and oceans via the carbon cycle. Some natural processes and human activities release carbon into the atmosphere, while other processes draw it out of the atmosphere and store it on land or in the ocean. Trees and other green vegetation are carbon "sinks," meaning they use carbon for growth and store it out of the atmosphere in their trunks, branches, leaves and roots. Human activities, like burning trees and fossil fuels or clearing forested land, release carbon into the atmosphere as carbon dioxide, and rising concentrations of atmospheric carbon dioxide are a main cause of climate change.

Conservation experts working to mitigate climate change and other environmental threats have targeted deforestation for years, but these efforts do not always include trees that grow outside forests, said Compton Tucker, senior biospheric scientist in the Earth Sciences Division at NASA Goddard. Not only could these trees be significant carbon sinks, but they also contribute to the ecosystems and economies of nearby human, animal and plant populations. However, many current methods for studying trees' carbon content only include forests, not trees that grow individually or in small clusters.

Tucker and his NASA colleagues, together with an international team, used commercial satellite images from DigitalGlobe, which were high-resolution enough to spot individual trees and measure their crown size. The images came from the commercial QuickBird-2, GeoEye-1, WorldView-2, and WorldView-3 satellites. The team focused on the dryland regions - areas that receive less precipitation than what evaporates from plants each year - including the arid south side of the Sahara Desert, which stretches through the semi-arid Sahel Zone and into the humid sub-tropics of West Africa. By studying a variety of landscapes from few trees to nearly forested conditions, the team trained their computing algorithms to recognize trees across diverse terrain types, from deserts in the north to tree savannas in the south.

Download related video in HD formats: https://svs.gsfc.nasa.gov/4865

Learning on the job

The team ran a powerful computing algorithm called a fully convolutional neural network ("deep learning") on the University of Illinois' Blue Waters, one of the world's fastest supercomputers. The team trained the model by manually marking nearly 90,000 individual trees across a variety of terrain, then allowing it to "learn" which shapes and shadows indicated the presence of trees.
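To make the approach concrete, here is a minimal fully convolutional segmentation network written in PyTorch. It illustrates the general technique of per-pixel tree/background prediction only; the architecture, tile size, and training data shown are illustrative assumptions, not the model or pipeline the team actually ran on Blue Waters.

```python
# Minimal, illustrative fully convolutional network for binary tree-crown
# segmentation of satellite image tiles. A sketch of the general approach,
# not the study's actual model; the data below are random placeholders.
import torch
import torch.nn as nn

class TinyTreeFCN(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: extract features while shrinking spatial resolution.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                   # H/2 x W/2
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                   # H/4 x W/4
        )
        # Decoder: upsample back to full resolution, one logit per pixel.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 8, kernel_size=2, stride=2), nn.ReLU(),
            nn.Conv2d(8, 1, kernel_size=1),    # tree vs. background
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# One illustrative training step on a fake batch of 256x256 RGB tiles with
# hand-drawn binary crown masks (1 = tree-crown pixel).
model = TinyTreeFCN()
images = torch.rand(4, 3, 256, 256)
masks = torch.randint(0, 2, (4, 1, 256, 256)).float()
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

logits = model(images)
loss = loss_fn(logits, masks)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.3f}")
```

In practice, hand-labelled crowns like those described below serve as the training masks, and the trained network is then applied tile by tile across the full satellite mosaic.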

The process of coding the training data took more than a year, said Martin Brandt, an assistant professor of geography at the University of Copenhagen and the study's lead author. Brandt marked all 89,899 trees by himself and helped supervise training and running the model. Ankit Kariryaa of the University of Bremen led the development of the deep learning computer processing.

"In one kilometer of terrain, say it's a desert, many times there are no trees, but the program wants to find a tree," Brandt said. "It will find a stone, and think it's a tree. Further south, it will find houses that look like trees. It sounds easy, you'd think - there's a tree, why shouldn't the model know it's a tree? But the challenges come with this level of detail. The more detail there is, the more challenges come."

Establishing an accurate count of trees in this area provides vital information for researchers, policymakers and conservationists. Additionally, measuring how tree size and density vary by rainfall - with wetter and more populated regions supporting more and larger trees - provides important data for on-the-ground conservation efforts.

"There are important ecological processes, not only inside, but outside forests too," said Jesse Meyer, a programmer at NASA Goddard who led the processing on Blue Waters. "For preservation, restoration, climate change, and other purposes, data like these are very important to establish a baseline. In a year or two or ten, the study could be repeated with new data and compared to data from today, to see if efforts to revitalize and reduce deforestation are effective or not. It has quite practical implications."

After gauging the program's accuracy by comparing it to both manually coded data and field data from the region, the team ran the program across the full study area. The neural network identified more than 1.8 billion trees - surprising numbers for a region often assumed to support little vegetation, said Meyer and Tucker.

"Future papers in the series will build on the foundation of counting trees, extend the areas studied, and look ways to calculate their carbon content," said Tucker. NASA missions like the Global Ecosystem Dynamics Investigation mission, or GEDI, and ICESat-2, or the Ice, Cloud, and Land Elevation Satellite-2, are already collecting data that will be used to measure the height and biomass of forests. In the future, combining these data sources with the power of artificial intelligence could open up new research possibilities.

"Our objective is to see how much carbon is in isolated trees in the vast arid and semi-arid portions of the world," Tucker said. "Then we need to understand the mechanism which drives carbon storage in arid and semi-arid areas. Perhaps this information can be utilized to store more carbon in vegetation by taking more carbon dioxide out of the atmosphere."

"From a carbon cycle perspective, these dry areas are not well mapped, in terms of what density of trees and carbon is there," Brandt said. "It's a white area on maps. These dry areas are basically masked out. This is because normal satellites just don't see the trees - they see a forest, but if the tree is isolated, they can't see it. Now we're on the way to filling these white spots on the maps. And that's quite exciting."

###

USask scientists develop model to identify best lentils for climate change impacts

UNIVERSITY OF SASKATCHEWAN

Research News

IMAGE

IMAGE: USASK PLANT SCIENTIST KIRSTIN BETT.

CREDIT: DEBRA MARSHALL PHOTOGRAPHY

With demand for lentils growing globally and climate change driving temperatures higher, a University of Saskatchewan-led international research team has developed a model for predicting which varieties of the pulse crop are most likely to thrive in new production environments.

An inexpensive plant-based source of protein that can be cooked quickly, lentil is a globally important crop for combatting food and nutritional insecurity.

But increased production to meet this global demand will have to come from either boosting yields in traditional growing areas or shifting production to new locations, said USask plant scientist Kirstin Bett.

"By understanding how different lentil lines will interact with the new environment, we can perhaps get a leg up in developing varieties likely to do well in new growing locations," said Bett.

Working with universities and organizations around the globe, the team planted 324 lentil varieties in nine lentil production hotspots, including two in Saskatchewan and one in the U.S., as well as sites in South Asia (Nepal, Bangladesh, and India) and the Mediterranean (Morocco, Spain, and Italy).

The findings, published in the journal Plants, People, Planet, will help producers and breeders identify existing varieties or develop new lines likely to flourish in new growing environments--valuable intelligence in the quest to feed the world's growing appetite for inexpensive plant-based protein.

The new mathematical model is based on a key predictor of crop yield--days to flowering (DTF)--which is determined by two factors: day length (hours of sunshine, or "photoperiod") and the mean temperature of the growing environment. Using detailed information about each variety's interaction with temperature and photoperiod, the simple model can be used to predict the number of days it takes each variety to flower in a specific environment.
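As an illustration of how such a model can be applied, the sketch below uses a classic photothermal formulation in which the rate of progress to flowering, 1/DTF, increases linearly with mean temperature and photoperiod. The functional form is a common one for grain legumes, but the exact equation and the coefficients here are illustrative assumptions, not the genotype-specific values fitted in the study.

```python
# Illustrative photothermal days-to-flowering (DTF) sketch: 1/DTF is modelled
# as a linear function of mean temperature (T, degrees C) and photoperiod
# (P, hours). Coefficients are made-up placeholders, not fitted values.

def days_to_flowering(mean_temp_c, photoperiod_h, a, b, c):
    """Predict DTF from the rate equation 1/DTF = a + b*T + c*P."""
    rate = a + b * mean_temp_c + c * photoperiod_h
    if rate <= 0:
        raise ValueError("environment too cool or short-day for this line to flower")
    return 1.0 / rate

# Hypothetical constants for one lentil line, and two contrasting environments.
a, b, c = -0.018, 0.0010, 0.0016
print(days_to_flowering(18.0, 14.0, a, b, c))  # temperate, long-day site
print(days_to_flowering(24.0, 11.0, a, b, c))  # warmer, short-day site
```

Lines whose fitted coefficients keep DTF within the local growing season would be flagged as candidates for a new production environment.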

"With this model, we can predict which lines they (producers) should be looking at that will do well in new regions, how they should work, and whether they'll work," Bett said.

For example, lentil producers in Nepal--which is already experiencing higher mean temperatures as a result of climate change--can use the model to identify which lines will produce high yields if they're grown at higher altitudes.

Closer to home in Western Canada, the model could be used to predict which varieties should do well in what are currently considered to be marginal production areas.

The project also involved USask plant researchers Sandesh Neupane, Derek Wright, Crystal Chan, and Bert Vandenberg.

The next step is putting the new model to work in lentil breeding programs to identify the genes that are controlling lentil lines' interactions with temperature and day length, said Bett.

Once breeders determine the genes involved, they can develop molecular markers that will enable breeders to pre-screen seeds. That way they'll know how crosses between different lentil varieties are likely to perform in different production locations.

###

This research project was part of the Application of Genomics to Innovation in the Lentil Economy (AGILE) project funded by Genome Canada and managed by Genome Prairie. Matching financial support was provided by partners that include the Saskatchewan Pulse Growers, Western Grains Research Foundation, and USask.