Thursday, August 26, 2021

Report: Evidence of extensive corrosion in collapsed condo

“It doesn’t look like normal concrete to me. What’s going on?”

MIAMI (AP) — Video released by a team of federal investigators shows more evidence of extensive corrosion and overcrowded concrete reinforcement in a Miami-area condominium that collapsed in June, killing 98 people.


The National Institute of Standards and Technology also announced Wednesday it will conduct a five-pronged investigation into the Champlain Towers South collapse, which will be led by Judith Mitrani-Reiser. She is a Cuban-born engineer who grew up in Miami.

“We are going into this with an open mind and will examine all hypotheses that might explain what caused this collapse," Mitrani-Reiser said. “Having a team with experience across a variety of disciplines, including structural and geotechnical engineering, materials, evidence collection, modeling and more, will ensure a thorough investigation.”

The video shows densely packed steel reinforcement in various sections of the building, along with extensive corrosion where one column met the building’s foundation.

“The corrosion on the bottom of that column is astronomical,” Dawn Lehman, a professor of structural engineering at the University of Washington, told the Miami Herald. She said that amount of corrosion should have been obvious and documented as part of the 40-year inspection that was ongoing when the building in Surfside, Florida, collapsed June 24.


“If there’s that amount of corrosion, this should have been fixed,” she said.

The images show beams, walls and columns that appear to be overcrowded with steel reinforcement, which suggests potential weaknesses, she explained.

“There is no reason there should be that kind of bar congestion,” Lehman said.


The risk posed by “congested” vertical rebar in columns would have been even worse in spots where the rebar overlapped, known as “lap splice” regions, Abieyuwa Aghayere, a Drexel University engineering researcher who also reviewed the video, told the newspaper.

Already congested with rebar, the columns would have been “even further congested” at the splice regions, Aghayere told the Herald.

He said he was struck by how “powdery” and white the concrete in columns appeared in the newly released video. Stone-like aggregates used to strengthen concrete during construction typically remain visible but they were not in the images from the collapse site.

“The white color just stuns me,” Aghayere told the newspaper. He added that instead of seeing aggregate material mixed into the concrete, “it’s just homogenous,” which is likely an indication of saltwater damage.

He said it is impossible to tell from just the images whether the concrete used in original construction was weaker than the designs called for, or whether the apparent weakness was due to damage over time.

“It doesn’t look like normal concrete to me. What’s going on?” Aghayere said.


The Associated Press

 

A laboratory's "irritating" byproduct now supplies 2D materials research

One scientist’s trash is another’s treasure: an “irritating byproduct” of Ames Laboratory’s superconductor research is actually a highly sought-after and difficult-to-obtain insulating material, hBN, for which scientists studying graphene and other 2D materials clamor. Credit: Ames Laboratory

While making materials samples to pursue their own research goals, scientists at the U.S. Department of Energy's Ames Laboratory discovered that an unwanted byproduct of their experiments was an extremely high-quality and difficult-to-obtain substance sought after by scientists researching layered materials.

With a project name like "Complex States, Emergent Phenomena & Superconductivity in Intermetallic and Metal-like Compounds," one immediately senses that the team of scientists behind that title might have a knack for the weird, unusual, and never-before-seen. The Ames Lab group investigates metallic and semi-metallic compounds whose magnetic, superconducting, or other properties probe the fundamental mysteries of how correlated or emergent states can form, and which might also be useful in future technologies such as energy generation, transmission, and storage, as well as quantum computing. But before they can investigate the weird, unusual, and never-before-seen, they have to make it, so the team has decades of expertise in creating those materials as well.

One focus of the group's research is MgB2, a superconducting material so difficult to produce that the tool used to make it is called "The Beast." Lurking in the basement of Ames Laboratory's Spedding Hall, the Beast is a finicky, powerful giant of a high-pressure furnace that super-heats and super-squashes tiny amounts of hard-to-combine ingredients into new compounds.

Mingyu Xu, a graduate student, has been studying MgB2 and is interested in growing pure and doped single-crystal samples of the material in order to better understand its superconducting properties. He and Raquel Ribeiro, an Associate Professor in the Department of Physics and Astronomy at Iowa State University, work together to load specially designed ceramic cubes (basically volcanic ash) with the tiny crucibles (cups) that will be heated to temperatures as high as 2,000 °C and subjected to pressures 30,000 times higher than normal atmospheric pressure.

"We have to be very precise, very particular about how we prepare and load the samples," said Ribeiro. "If we make a mistake, if any piece is out of alignment, it breaks," she said, explaining the unforgiving process of making the material. So demanding is the Beast, Xu and Ribeiro keep track of their mistakes, as a way of documenting how to improve their preparation methods and success rate.

When they succeed, Xu's focus is on the MgB2, literally: his images are focused sharply on the glimmer of pale gold-colored crystals that confirm the Beast has prepared a successful, high-quality sample. But the images also capture something else: the sought-after MgB2 is surrounded by a nest of translucent, flaky material that is a byproduct of the process, called hexagonal boron nitride, or hBN.

"For years, we tried to minimize this irritating second phase," said Paul Canfield, Ames Lab physicist, Distinguished Professor and the Robert Allen Wright Professor of Physics and Astronomy at Iowa State University, and principal investigator on the project. "Our quest was for MgB2. We didn't want that other stuff, and tried to optimize our process to reduce the amount of it."

In his collaborations with physicists at other institutions, however, he learned that his "irritating second phase" was actually a highly sought-after and difficult-to-obtain insulating material for scientists researching graphene, the 2D layered semimetal first isolated in 2004 and studied for its fascinating and unusual electron transport properties.

“Imagine you discovered the sandwich, and it's easy to get the ham, but bread is hard to come by. White bread is boring by itself, but if you have it you can make your sandwich,” said Canfield. “hBN is the bread to make that sandwich.” Layers of insulating hBN separate graphene and other novel electronic and magnetic materials to make new heterostructures: new-physics sandwiches with properties that may ultimately replace transistors.

Suddenly, Canfield's group found themselves hBN bakers to other scientists' 2D sandwich-making efforts, a scientific side hustle, if you will. Xu and Ribeiro have adjusted how they operate the Beast to optimize the production of both MgB2 and its desirable byproduct. By doing so, Ames has become one of the few sources of hBN worldwide, playing a key role in multiple high-profile publications. Ames Laboratory has supplied hBN to MIT, Rutgers, Princeton, Cornell, Stanford, Lawrence Livermore, and other institutions. The resulting research has been featured in Nature Communications, Nature Physics, and Science.

"It's really gratifying that we have a Department of Energy National Laboratory not only pursuing its own research goals, but also playing an essential role in enabling key physics discoveries at other laboratories and at other institutions," said Canfield.

More information: Nikhil Tilak et al, Flat band carrier confinement in magic-angle twisted bilayer graphene, Nature Communications (2021). DOI: 10.1038/s41467-021-24480-3
Dahlia R. Klein et al, Enhancement of interlayer exchange in an ultrathin two-dimensional magnet, Nature Physics (2019). DOI: 10.1038/s41567-019-0651-0
D. R. Klein et al, Probing magnetism in 2D van der Waals crystalline insulators via electron tunneling, Science (2018). DOI: 10.1126/science.aar3617
Provided by Ames Laboratory 

 

HIV prevention treatment shows gaps among key populations


Kaiser Permanente findings suggest that vulnerable groups would benefit from more targeted outreach.

Peer-Reviewed Publication

KAISER PERMANENTE


OAKLAND, Calif. — A large, detailed look at patients taking HIV-prevention drug therapy finds strong adherence soon after patients get the prescription, but less consistent use thereafter, particularly among groups considered high priority for receiving the medication.

The study, published August 26 in JAMA Network Open, examined data from 13,906 members of Kaiser Permanente in Northern California referred for pre-exposure prophylaxis, or PrEP, therapy between 2012 and 2019. The study found certain groups were more likely to stop taking PrEP: young people, Black and Latino individuals, women, and people with substance use disorders.

The findings suggest targeted strategies are needed to support use of this highly effective, protective medication in groups at increased risk for HIV acquisition, said lead author Carlo Hojilla, RN, PhD, a research fellow with the Kaiser Permanente Northern California Division of Research.

“The findings have important implications that suggest access to health care is a great way to get people in the door, but we need more effective strategies for making sure people who have an ongoing need for PrEP stay on the medication,” Hojilla said. “These are groups we want to reach, and we need innovative approaches to keep them engaged in PrEP care.”

The study found that a high rate of patients referred for HIV prevention care received a prescription for PrEP (88%), and most of them filled their initial prescriptions (98%). There are two main brands of PrEP therapy (Truvada and Descovy), which are generally taken daily to prevent HIV infection. “These findings were encouraging,” Hojilla said. “Kaiser Permanente has managed to do really well increasing uptake of PrEP therapy.”

However, there was significant inconsistency in use during the study period, with about half of users discontinuing PrEP medication at least once; 60% of those who discontinued later filled a prescription again. This analysis did not look at why users stopped or restarted the drugs, the authors said, and some users may have discontinued PrEP because of a decrease in risk for HIV acquisition. Prior studies have also documented medical mistrust, stigma, homophobia, and transphobia as barriers to PrEP uptake and persistence in some communities.
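As a rough back-of-the-envelope illustration of the care continuum these percentages describe, the sketch below applies each reported rate to the preceding group. This sequential application is an assumption for illustration only; the published study reports the exact patient counts at each step.

```python
# Hypothetical reconstruction of the PrEP care continuum from the
# percentages reported in the article. Each rate is applied to the
# preceding group, which is an illustrative assumption.

referred = 13_906                 # members referred for PrEP, 2012-2019
prescribed = referred * 0.88      # 88% received a prescription
filled = prescribed * 0.98        # 98% filled the initial prescription
discontinued = filled * 0.50      # about half discontinued at least once
restarted = discontinued * 0.60   # 60% of those later refilled

for label, n in [("prescribed", prescribed),
                 ("filled initial prescription", filled),
                 ("discontinued at least once", discontinued),
                 ("later restarted", restarted)]:
    print(f"{label}: ~{round(n):,}")
```

Under these assumptions, roughly twelve thousand members started PrEP, and a few thousand of those who stopped eventually restarted, which is consistent with the article's point that attrition, not initial uptake, is the main gap.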

“We know cost can be a big concern for some people, and this study included a time period before PrEP-related services were provided at no cost as a result of changes in recommendations by the U.S. Preventive Services Task Force,” Hojilla said.

The study was also carried out before the introduction of a new option for dosing known as 2-1-1, or on-demand, which allows the user to take PrEP only around the time of a potential exposure to HIV, with a similar level of effectiveness as daily dosing. It’s possible that some of the discontinuation reflected in the study was from patients who opted to not take the drug daily because they had only occasional risk exposure, even prior to 2-1-1 dosing being formally recommended, said senior author Jonathan Volk, MD, an infectious disease specialist with The Permanente Medical Group.

Among those who remained on PrEP therapy, there were no new HIV infections, the study found. “This shows how incredibly well PrEP works when taken,” Dr. Volk said. “But there are important opportunities for us to maximize the population level impact of this vital therapy. To do this, we need to avoid attrition along the care continuum, especially by assisting patients to stay on PrEP throughout periods of risk for HIV acquisition.”

The same research group at Kaiser Permanente reported in 2015 that PrEP effectively stops HIV infections among early users. They also reported in 2020 that on-demand dosing was effective in preventing HIV among 279 male patients.

This study was funded by a Kaiser Permanente Community Benefit research grant and the National Institutes of Health.

Co-authors included Michael Silverberg, PhD, MPH, Jacek Skarbinski, MD, Derek Satre, PhD, and Leo B. Hurley, MPH, with the Division of Research; and Julia Marcus, PhD, MPH, of Harvard Medical School.

About Kaiser Permanente

Kaiser Permanente is committed to helping shape the future of health care. We are recognized as one of America’s leading health care providers and not-for-profit health plans. Founded in 1945, Kaiser Permanente has a mission to provide high-quality, affordable health care services and to improve the health of our members and the communities we serve. We currently serve approximately 12.5 million members in 8 states and the District of Columbia. Care for members and patients is focused on their total health and guided by their personal Permanente Medical Group physicians, specialists, and team of caregivers. Our expert and caring medical teams are empowered and supported by industry-leading technology advances and tools for health promotion, disease prevention, state-of-the-art care delivery, and world-class chronic disease management. Kaiser Permanente is dedicated to care innovations, clinical research, health education, and the support of community health.

 

 

 

Why companies invest in local social media influencers


News from the Journal of Marketing

Peer-Reviewed Publication

AMERICAN MARKETING ASSOCIATION

Researchers from Emory University published a new paper in the Journal of Marketing that investigates whether geographical distance still matters when word of mouth is disseminated online.

The study, forthcoming in the Journal of Marketing, is titled “Is Distance Really Dead in the Online World? The Moderating Role of Geographical Distance on the Effectiveness of Electronic Word-of-Mouth” and is authored by Vilma Todri, Panagiotis (Panos) Adamopoulos, and Michelle Andrews.

Companies seek local influencers to pitch products. Even though most influencers amass geographically dispersed followings via social media platforms, companies are willing to funnel billions of sponsorship dollars to multiple influencers located in different geographic areas, effectively creating sponsorships that span cities, countries, and, in some cases, even the globe. The desire to work with local influencers has spawned advertising agencies that specialize in connecting companies with influencers and may soon redefine the influencer economy.
 
This trend has merit, according to this new study. The researchers show a positive link between online influence and the geographic proximity of an influencer’s followers: the nearer a follower is to someone who posts an online recommendation, the more likely they are to follow that recommendation.
 
To investigate whether geographical distance still matters when word of mouth is disseminated online, the research team examined thousands of actual purchases made on Twitter. As Todri explains, “We found that the likelihood a person who saw a Tweet mentioning that someone they follow bought a product would subsequently also buy the product increases the closer he/she resides to the purchaser. Not only were followers more likely to heed an influencer’s recommendation the closer they physically reside to the influencer, they also did so more quickly.”
 
The role of geographic proximity in the effectiveness of online influence occurs across several known retailers and for different types of products, including video game consoles, electronics and sports equipment, gift cards, jewelry, and handbags.
 
“This role of geographic proximity may be due to an invisible connection between people that is rooted in the commonality of place. This invisible link can lead people to identify more closely with someone who is located nearby, even if they do not personally know that person,” says Adamopoulos. The result is that people are more likely to follow someone’s online recommendation when they live closer. These online recommendations can take any form, from a movie review, to a restaurant rating, to a product pitch.
 
What makes these findings surprising is that experts predicted the opposite when the internet first became widely adopted. Experts declared the death of distance. In theory, this makes sense: people do not need to meet in person to share their opinions, reviews, and purchases when they can do so electronically. What the experts who envisioned the end of geography may have overlooked, however, is how people decide whose online opinion to trust. This is where cues that indicate a person’s identity, such as where he/she lives in the real world, come into play. We may be more likely to trust the online opinion of someone who lives in the same city as us than of someone who lives farther away, simply because we have location in common. Known as social identity theory, this process explains how individuals form perceptions of belonging to and relating to a community. Who we identify with can affect the degree to which we are influenced, even when this influence occurs online.
 
These findings imply that technology and electronic communications do not completely overcome the forces that govern influence in the real world. Geographical proximity still matters, even in the digital space. Information and cues about an individual’s identity online, such as where he/she lives, may affect his/her influence on others through the extent to which others feel they can relate to him/her. 
 
These findings on how spatial proximity may still be a tie that binds even in an online world affirm what some companies have long suspected. Local influencers may have a leg up in the influence game and are worth their weight in location. For these reasons, says Andrews, “Companies may want to work with influencers who have more proximal connections to increase the persuasiveness of their online advertising, product recommendation, and referral programs. Government officials and not-for-profit organizations may similarly want to partner with local ambassadors to more effectively raise awareness of—and change attitudes and behaviors towards—important social issues.”

Full article and author contact information available at: https://doi.org/10.1177/00222429211034414

About the Journal of Marketing 

The Journal of Marketing develops and disseminates knowledge about real-world marketing questions useful to scholars, educators, managers, policy makers, consumers, and other societal stakeholders around the world. Published by the American Marketing Association since its founding in 1936, JM has played a significant role in shaping the content and boundaries of the marketing discipline. Christine Moorman (T. Austin Finch, Sr. Professor of Business Administration at the Fuqua School of Business, Duke University) serves as the current Editor in Chief.
https://www.ama.org/jm

About the American Marketing Association (AMA) 

As the largest chapter-based marketing association in the world, the AMA is trusted by marketing and sales professionals to help them discover what is coming next in the industry. The AMA has a community of local chapters in more than 70 cities and 350 college campuses throughout North America. The AMA is home to award-winning content, PCM® professional certification, premiere academic journals, and industry-leading training events and conferences.
https://www.ama.org

 

Science-backed tips for maximizing play time with kids

Penn State researcher discusses using lessons from cognitive development research to create playful learning spaces for children

Peer-Reviewed Publication

PENN STATE

UNIVERSITY PARK, Pa. — Years of cognitive development research have uncovered a wealth of knowledge about what children need to grow, thrive and learn — but it’s also important to translate those insights into real-world applications.

A new paper led by Penn State graduate Brenna Hassinger-Das, now an assistant professor of psychology at Pace University-NYC, and Jennifer Zosh, professor of human development and family studies at Penn State Brandywine, explores the science-backed core pillars of learning and examples of play spaces that incorporate them.

“We know kids in Western countries spend only 20% of their time at school, so where are they for the other 80%?” Hassinger-Das said. “They’re at home, or in the grocery store, or the park. How can we transform those everyday spaces to encourage playful learning and promote conversations between caregivers and children? We wanted to maximize these spaces and turn them into places that are fun but also support the kind of learning you do in school.”

According to the researchers, previous studies have found six core characteristics or “pillars” of learning that can maximize any experience a child has, whether it’s playing in the park, going to the grocery store, or using a smartphone app. These pillars suggest that experiences should be active rather than passive, engaging, meaningful, socially interactive, iterative rather than repetitive, and joyful.

“We wanted to leverage insights from research to help support children, families and communities,” Zosh said. “For this specific initiative, we wanted to see how we could take those pillars and apply them to communities in playful ways that support children's learning.”

The paper — recently published in the journal Trends in Cognitive Sciences — looked at several examples of spaces that incorporated the pillars into their designs.

One example was Urban Thinkscape, an installation that turned a bus stop into a playful learning space that fosters play and conversations. One of the features incorporated was an element called Stories, which includes several icons placed on the ground with different pictures that can be used to tell stories. According to the researchers, as children go from icon to icon and create a story, they build narrative skills, which are a key part of literacy.

Even though the paper focused on public spaces that incorporated these elements, the researchers said anyone can use the pillars to enhance children’s play spaces and experiences, including parents at home.

Pillar One: Active

Stay “active” as you play and interact with your child, for example, by incorporating literacy and STEM elements into your speech and interactions.

Zosh said this could mean counting the apples out loud as you put them in your basket at the grocery store or asking your child what letter each block starts with as you build a tower. She also said asking lots of questions — such as “What would happen if we mixed these blue and yellow paints together?” or “What might happen if we stack the red square block on top of the yellow triangle block?” — can be helpful, as well.

Pillar Two: Engaged

“Try to limit distractions as much as possible, including background television and your own smartphone use,” Hassinger-Das said. “These types of distractions are sometimes unavoidable, but they do have the potential to take away from these high-quality times with your child. Focusing and staying engaged during play can help you make the most of these interactions.”

Pillar Three: Meaningful

Try building on topics the child is already interested in during play. If they like dinosaurs, you could suggest a make-believe scenario where you dig for dinosaur fossils at the playground. Or, you can integrate information about dinosaurs like counting how many bones they have and what they ate.

“If you are reading a book set in a different state, get out a globe or a map app and explore where the state is and how the weather there is different from where you live,” Zosh said. “Helping children build connections helps them weave together a rich world of understanding.”

Pillar Four: Socially interactive

The researchers advised letting your child lead in play time while you offer support along the way. For example, let your child decide what to build with blocks while asking questions like, “What would happen if you placed that block in a different direction?” or “How many more blocks do you think it would take to build a tower as tall as you?”

Pillar Five: Iterative

Children are naturally scientific thinkers — they like to experiment, see what happens, and try again and again until something works. The researchers advised giving your children opportunities to guess what will happen, conduct “experiments,” make up new words to favorite songs, and make mistakes. Every mistake leads to learning.

Pillar Six: Joyful

Finally, making playtime joyful can be done in many ways, including incorporating elements of surprise.

“Playing with shadows and asking which one is bigger or how you can make your shadow grow or shrink is one way to foster surprise and joy,” Hassinger-Das said. “Similarly, think about what helps your child connect with whatever brings them joy, from construction with a cardboard box to playing vet with their stuffed animals.”

Andres S. Bustamante, University of California, Irvine; Roberta Michnick Golinkoff, University of Delaware; and Kathy Hirsh-Pasek, Temple University, also participated in this work.

The William Penn Foundation and the National Science Foundation helped support this research.


Material efficiency holds great potential for climate neutrality

by Albert Ludwigs University of Freiburg
Fig. 1: Total global cumulative (top) and annual 2050 (bottom) greenhouse gas (GHG) emission reductions from the technical potential of ten industrial and demand-side material efficiency (ME) strategies, shown for three socioeconomic scenarios (low energy demand (LED) and the shared socioeconomic pathways SSP1 and SSP2), two climate policy scenarios (No Pol. and 2 °C Pol.), and each ME strategy, for the passenger-vehicle and residential-building sectors combined. Values are in megatons (Mt) or gigatons (Gt) of CO2-eq. Credit: DOI: 10.1038/s41467-021-25300-4

More efficient use of materials in residential buildings and cars could save enormous amounts of greenhouse gas emissions by 2050: 20 to 52 gigatons of CO2 equivalents for residential buildings and 13 to 26 gigatons of CO2 equivalents for cars, which amounts to up to two-thirds of current consumption. This is the conclusion reached by a research team led by Dr. Stefan Pauliuk, Assistant Professor of Sustainable Energy and Material Flow Management at the University of Freiburg.


For this purpose, the scientists analyzed ten global strategies for material efficiency (ME), such as the reuse of scrap from manufacturing, and calculated their joint maximum potential if they were consistently implemented by 2040 and accompanied by a strict climate policy. For residential buildings, wood construction and reducing the living space per capita show the greatest savings potential. For passenger vehicles, it's ride-sharing and car-sharing. "This shows that material efficiency can be a key to becoming largely carbon neutral," says Pauliuk. "The potentials are enormous and should be used more." The research team recently presented its findings in the journal Nature Communications.

What-if scenarios for various degrees of material efficiency

With a team of international researchers, including Prof. Dr. Edgar Hertwich of the Norwegian University of Science and Technology, Pauliuk looked at the life cycles of materials for housing and car construction and calculated how many greenhouse gas emissions could be saved by 2050 through a broad and ambitious introduction of material efficiency (ME) measures in combination with a strict climate policy. The calculation considers ten ME strategies. These include supply-side measures, such as reusing scrap from manufacturing; demand-side measures, such as reusing products; and more efficient use of products through car sharing and shared housing.

Alongside this, the scientists determined the future changes in material flows and energy consumption due to higher material yields, lighter design, material substitution, longer service life, higher service efficiency, reuse and recycling. In doing so, the computational model captures the production, demand, use and recycling of six climate-relevant materials: aluminum, cement, copper, plastics, steel and wood. "The analysis generates a range of what-if scenarios for various degrees of material efficiency in the vehicle and building sectors and associated key material cycles against different socioeconomic and climate policy backgrounds," Pauliuk explains.

Material efficiency must be given higher priority in climate policy

According to the researchers, their study shows that renewable energy alone is not enough to achieve deep emission reductions in the residential sector, but that additional efficiency measures are needed. The same applies to passenger cars, where electrification and conversion to low-carbon electricity must go hand in hand, they say.

Pauliuk concludes that "if material efficiency strategies are given a similarly high priority as energy efficiency measures, achieving the Paris goal of limiting global warming to well below 2 degrees Celsius will be easier. Measures to improve material efficiency can be easily implemented, so they should be given higher priority in climate policy."


More information: Stefan Pauliuk et al, Global scenarios of resource and emission savings from material efficiency in residential buildings and cars, Nature Communications (2021). DOI: 10.1038/s41467-021-25300-4

 

Frontier Development Lab transforms space and earth science for NASA with Google cloud artificial intelligence and machine learning technology


Frontier Development Lab’s goal is to apply synergies between physics, simulation and machine learning to problems important to space science and humanity

Business Announcement

SETI INSTITUTE

Image: Frontier Development Lab 2021 challenges. Credit: SETI Institute

August 26, 2021, Mountain View, Calif. — Frontier Development Lab (FDL), in partnership with the SETI Institute, NASA, and private sector partners including Google Cloud, is transforming space and Earth science through the application of industry-leading artificial intelligence (AI) and machine learning (ML) tools.

FDL tackles knowledge gaps in space science by pairing ML experts with researchers in physics, astronomy, astrobiology, planetary science, space medicine, and Earth science. These researchers have utilized Google Cloud compute resources and expertise, specifically AI/ML technology, since 2018 to address research challenges in areas like astronaut health, lunar exploration, exoplanets, heliophysics, climate change, and disaster response.

With access to compute resources provided by Google Cloud, FDL has been able to accelerate its typical ML pipeline by more than 700 times over the last five years, facilitating new discoveries and an improved understanding of our planet, solar system and the universe. Throughout this period, Google Cloud’s Office of the CTO (OCTO) has provided ongoing strategic guidance to FDL researchers on how to optimize AI/ML workloads and how to use compute resources most efficiently.

With Google Cloud’s investment, recent FDL achievements include: 

  • Allowing lunar scientists to see into the permanently shadowed craters of the lunar poles to help NASA develop traverse plans for robotic and human exploration. 
  • Helping clinicians understand the causal factors in a target cancer in a way that respects patient privacy. 
  • Enabling researchers to search NASA's petascale archives of data without human labels.
  • Helping astrophysicists survey the starspots on distant stars to determine their suitability for life.
  • Updating U.S. hydrology maps every day to help the U.S. Geological Survey understand the U.S. water supply and develop better predictive models for droughts and floods.

 

"Unfettered on-demand access to massive super-compute resources has transformed the FDL program, enabling researchers to address highly complex challenges across a wide range of science domains, advancing new knowledge, new discoveries and improved understandings in previously unimaginable timeframes,” said Bill Diamond, president and CEO, SETI Institute. “This program, and the extraordinary results it achieves, would not be possible without the resources generously provided by Google Cloud.”

“When I first met Bill Diamond and James Parr in 2017, they asked me a simple question: ‘What could happen if we marry the best of Silicon Valley and the minds of NASA?’” said Scott Penberthy, director of Applied AI at Google Cloud. “That was an irresistible challenge. We at Google Cloud simply shared some of our AI tricks and tools, one engineer to another, and they ran with it. I’m delighted to see what we’ve been able to accomplish together - and I am inspired for what we can achieve in the future. The possibilities are endless.”

FDL leverages AI technologies to push the frontiers of science research and develop new tools to help solve some of humanity's biggest challenges. FDL teams are composed of doctoral and post-doctoral researchers who use AI/ML to tackle ground-breaking challenges. Cloud-based supercomputing resources mean that FDL teams achieve results in eight-week research sprints that would not be possible even in year-long programs with conventional compute capabilities.

“High-performance computing is normally constrained due to the large amount of time, limited availability and cost of running AI experiments,” said James Parr, director of FDL. “You’re always in a queue. Having a common platform to integrate unstructured data and train neural networks in the cloud allows our FDL researchers from different backgrounds to work together on hugely complex problems with enormous data requirements - no matter where they are located.”

Better integrating science and ML is the founding rationale and future north star of FDL’s partnership with Google Cloud. ML is particularly powerful for space science when paired with a physical understanding of a problem space. The gap between what we know so far and what we collect as data is an exciting frontier for discovery and something AI/ML and cloud technology are poised to transform.

###

You can learn more about FDL’s 2021 program here.

The FDL 2021 showcase presentations can be watched as follows:

  1. FDL USA in full - https://youtu.be/viAm8gAlb3g
  2. Astronaut Health - https://youtu.be/lhe7gPjo7HQ
  3. Solar Wind - https://youtu.be/5gwMC4Rw5zA
  4. Solar Drag - https://youtu.be/gEL89pH4PWU
  5. Automated reporting (NLP) - https://youtu.be/ieA1isaXMnk
  6. Upscaling Lunar resources - https://youtu.be/op740EPXQ30
  7. GLM Lightning - https://youtu.be/CnUx2eOz6zc
  8. Digital Twin Earth - Coasts - https://youtu.be/7wjtcJxRjKk

In addition to Google Cloud, FDL is supported by partners including Lockheed Martin, Intel, Luxembourg Space Agency, MIT Portugal, Lawrence Berkeley National Lab, USGS, Microsoft, NVIDIA, Mayo Clinic, Planet and IBM.

About the SETI Institute

Founded in 1984, the SETI Institute is a non-profit, multidisciplinary research and education organization whose mission is to lead humanity's quest to understand the origins and prevalence of life and intelligence in the universe and share that knowledge with the world. Our research encompasses the physical and biological sciences and leverages expertise in data analytics, machine learning and advanced signal detection technologies. The SETI Institute is a distinguished research partner for industry, academia and government agencies, including NASA and NSF.

Contact Information:

Rebecca McDonald
Director of Communications
SETI Institute
rmcdonald@SETI.org

 

 

Plants evolved ability to actively control water-loss earlier than previously thought


Peer-Reviewed Publication

UNIVERSITY OF BIRMINGHAM

New research has shed light on when plants first evolved the ability to respond to changing humidity in the air around them, showing it was probably a feature of a common ancestor of both flowering plants and ferns.

Key to the regulation mechanism are tiny holes, or pores, on the surface of leaves, called stomata. These enable the plant to regulate the uptake of CO2 gas as fuel for photosynthesis, and the loss of water vapour – a constant balancing act that requires the pores to open and close according to changing conditions. This ability is important to agriculture because it helps crops to use less water to grow.

Plants first evolved stomata soon after they moved from water to land, some 450 million years ago, but scientists are still uncertain about the evolutionary pathway they took and the point at which plants became able to choose whether to open or close the pores in response to their environment.

In the most recently evolved plants – flowering plants – stomata closure in response to drought is actively triggered by a number of internal signals, including a hormone called abscisic acid (ABA), but scientists have been struggling to understand if this mechanism is also present in older groups of plants. In a new study, published in Current Biology, researchers at the University of Birmingham have found evidence that the fern species Ceratopteris richardii actively closes its stomata using similar signals.

This semi-aquatic tropical fern has recently become the first model for exploring genetic control of development in the fern family, and is now helping scientists to unpick the long evolutionary history between the earliest land-living plants (mosses, liverworts and hornworts) and the modern flowering plants that dominate today’s ecosystems.

The team used RNA sequencing technology to identify the genetic mechanisms behind different stomatal responses and was able to demonstrate that the fern’s ability to close stomata in response to low humidity or to ABA involves copies of genes already known to control stomata in flowering plants.

The results suggest that ferns and flowering plants share similar stomatal closure mechanisms. This indicates that these mechanisms were present – at least in some form – in the stomata of the last common ancestor of both groups.

Dr Andrew Plackett, of the University of Birmingham’s School of Biosciences and Birmingham Institute of Forest Research, led the research in collaboration with groups at the University of Bristol and the University of Oxford. He said: “We know that plants have possessed stomata for most of their evolutionary history, but the point in evolution where plants became able to actively open and close them has been controversial.

“We’ve been able to show the same active closure mechanisms found in flowering plants are also present in ferns, a much older group of plants. Being able to better understand how these mechanisms have changed during plant evolution gives us useful tools to learn more about how they work. This will be important for helping our crops to adapt to future environmental changes.”

Alistair Hetherington from the University of Bristol said: “This new work confirms that the earliest plants were able to actively control the water they lost through the microscopic valve-like structures on the surfaces of leaves known as stomata. This is important because it shows that the intracellular machinery allowing stomata to open and close was present in the earliest land plants. The research also shows that whether stomata respond actively or passively is dictated by the environment in which the plants lived."

###

 

Fewer El Niño and La Niña events in a warmer world

Highest-resolution global warming simulation conducted to date reveals possible end of El Niño/La Niña temperature cycle

Peer-Reviewed Publication

INSTITUTE FOR BASIC SCIENCE

Fig. 1 IMAGE: Surface ocean temperatures simulated at unprecedented resolution using a coupled atmosphere-ocean model. The extensive wavy cold structure in the equatorial Pacific corresponds to a tropical instability wave. Simulations were conducted on the IBS/ICCP supercomputer Aleph. Credit: Institute for Basic Science

The cycling between warm El Niño and cold La Niña conditions in the eastern Pacific (commonly referred to as the El Niño-Southern Oscillation, ENSO) has persisted without major interruptions for at least the last 11,000 years. This may change in the future according to a new study published in the journal Nature Climate Change by a team of scientists from the IBS Center for Climate Physics (ICCP) at Pusan National University in South Korea, the Max Planck Institute of Meteorology, Hamburg, Germany, and the University of Hawaiʻi at Mānoa, USA.

The team conducted a series of global climate model simulations with an unprecedented spatial resolution of 10 km in the ocean and 25 km in the atmosphere. Boosted by the power of one of South Korea’s fastest supercomputers (Aleph), the new ultra-high-resolution climate model simulations can now realistically simulate tropical cyclones in the atmosphere and tropical instability waves in the equatorial Pacific Ocean (see Fig. 1), which both play fundamental roles in the generation and termination of El Niño and La Niña events. “Our supercomputer ran non-stop for over one year to complete a series of century-long simulations covering present-day climate and two different global warming levels. The model generated 2 quadrillion bytes of data, enough to fill up about 2,000 hard disks”, says Dr. Sun-Seon Lee, who conducted the experiments.
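The quoted data volume can be sanity-checked with a quick back-of-the-envelope calculation (assuming, purely for illustration, a 1-terabyte capacity per disk, which is consistent with the figures given):

```python
# Back-of-the-envelope check: 2 quadrillion bytes is 2 petabytes,
# which matches "about 2,000 hard disks" at an assumed 1 TB per disk.
total_bytes = 2 * 10**15   # 2 quadrillion bytes = 2 PB
disk_bytes = 10**12        # assumed capacity of one disk: 1 TB
disks_needed = total_bytes / disk_bytes
print(disks_needed)        # -> 2000.0
```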

Analyzing this enormous dataset, the team focused on a long-standing problem: how ENSO will change in response to increasing greenhouse gas concentrations. “Two generations of climate scientists have looked at this issue using climate models of varying complexity. Some models simulated weaker, and others larger, eastern Pacific temperature swings in a future warmer climate. The jury was still out,” says Prof. Axel Timmermann, co-corresponding author and Director of the ICCP. He adds: “What is common to these models is that their simulated temperatures in the equatorial Pacific, west of Galapagos, were always too cold compared to the observations. This prevented them from properly representing the delicate balance between positive and negative feedback processes that are important in the ENSO cycle.”

By capturing small-scale climatic processes at the highest computationally possible resolution, the ICCP team was able to alleviate these ocean temperature biases, leading to substantial improvements in the representations of ENSO and its response to Global Warming. “The result from our computer simulations is clear: Increasing CO2 concentrations will weaken the intensity of the ENSO temperature cycle,” says Dr. Christian Wengel, first author of the study and former postdoctoral researcher at the ICCP, now at the Max Planck Institute of Meteorology in Hamburg in Germany.

By tracing the movement of heat in the coupled atmosphere/ocean system the scientists identified the main culprit of the collapse of the ENSO system: Future El Niño events will lose heat to the atmosphere more quickly due to the evaporation of water vapor, which has the tendency to cool the ocean. In addition, the reduced future temperature difference between the eastern and western tropical Pacific will also inhibit the development of temperature extremes during the ENSO cycle. However, these two factors are partly offset by a projected future weakening of tropical instability waves (Fig. 1). Normally these oceanic waves, which can encompass up to 30% of the earth’s entire circumference, develop during La Niña conditions. They replace colder equatorial waters with warmer off-equatorial water, thereby accelerating the demise of a La Niña event. The new computer simulations, which resolve the detailed structure of these waves, demonstrate that the associated negative feedback for ENSO will weaken in the future.
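The qualitative effect of this shifting feedback balance can be illustrated with a toy damped oscillator in the spirit of conceptual ENSO models. This is a sketch only, not the authors' high-resolution coupled model, and the parameter values are purely illustrative:

```python
import math

# Toy linear damped oscillator (illustrative only -- NOT the authors' model).
# T stands in for an eastern-Pacific temperature anomaly, h for a
# thermocline-depth anomaly. The net growth rate r lumps together the
# positive and negative feedbacks described above; a more negative r
# (stronger net damping, as projected for a warmer climate) shrinks
# the oscillation amplitude.
def peak_amplitude(r, period_years=4.0, years=50.0, dt=0.01):
    omega = 2 * math.pi / period_years   # ~4-year ENSO-like cycle
    T, h = 1.0, 0.0                      # initial anomaly (arbitrary units)
    history = []
    for _ in range(int(years / dt)):
        dT = r * T + omega * h           # forward-Euler step
        dh = r * h - omega * T
        T, h = T + dT * dt, h + dh * dt
        history.append(abs(T))
    return max(history[-int(period_years / dt):])  # peak over the last cycle

present = peak_amplitude(r=-0.05)  # weak net damping: sizeable swings persist
warmer = peak_amplitude(r=-0.30)   # stronger net damping: swings die away
print(f"present-day: {present:.3f}, warmer climate: {warmer:.6f}")
```

Running the sketch shows the oscillation surviving under weak net damping but collapsing under strong net damping, mirroring the "tipping over to the negative side" described above.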

“There is a tug-of-war between positive and negative feedbacks in the ENSO system, which tips over to the negative side in a warmer climate. This means future El Niño and La Niña events cannot develop their full amplitude anymore,” comments ICCP alumnus Prof. Malte Stuecker, co-author of the study and now assistant professor at the Department of Oceanography and the International Pacific Research Center at the University of Hawaiʻi at Mānoa.

Even though the year-to-year fluctuations in eastern equatorial Pacific temperatures are likely to weaken with human-induced warming according to this new study, the corresponding changes in El Niño and La Niña-related rainfall extremes will continue to increase due to an intensified hydrological cycle in a warmer climate, as shown in recent studies by scientists from the ICCP and their international collaborators.

“Our research documents that unabated warming is likely to silence the world’s most powerful natural climate swing, which has been operating for thousands of years. We don’t yet know the ecological consequences of this potential no-analog situation,” says Axel Timmermann, “but we are eager to find out.”

###
