Monday, June 24, 2024

Unlocking the tea leaf's secret: decoding the amino acid blueprint of the tea plant



Nanjing Agricultural University The Academy of Science

Image: A working model for N assimilation, AA synthesis, transport, and degradation/recycling in tea plants. Credit: Horticulture Research




A recent study has unlocked the secrets of amino acid metabolism in tea plants, revealing the pivotal role of nitrogen assimilation in root tissues and the long-distance transport of key amino acids to leaves. This discovery offers a pathway to enhance tea's flavor and health benefits, providing a foundation for improving tea cultivation practices and potentially elevating the quality of tea beverages.

Amino acids are vital for plant growth and significantly influence tea flavor and health benefits. Tea plants, particularly Camellia sinensis, exhibit unique amino acid profiles that contribute to their distinctive taste and nutritional value. Yet despite the known importance of amino acids such as theanine and glutamine (Gln), the detailed dynamics of their synthesis, transport, and degradation in tea plants remain unclear, leaving a need for in-depth research into the complex metabolic pathways and spatial distribution of amino acids within the plant.

Researchers from Hunan Agricultural University, in a study (DOI: 10.1093/hr/uhae060) published on February 28, 2024, in Horticulture Research, dissected the spatial dynamics of amino acid biosynthesis, transport, and turnover in tea plants. The study provides a detailed analysis of the metabolic pathways and gene expressions that govern these processes. By understanding these mechanisms, the researchers aim to improve tea cultivation and enhance the quality of tea beverages.

The study revealed that nitrogen assimilation primarily occurs in the roots, where Gln, theanine, and arginine (Arg) are actively synthesized. These amino acids are then transported through the plant’s vascular system. Transcriptome analyses identified that genes involved in Arg synthesis are highly expressed in roots, while genes responsible for Arg transport and degradation are expressed in stems and young leaves. This indicates a sophisticated system of amino acid management within the plant. One key finding is the role of the CsGSIa gene, which is crucial for amino acid synthesis, transport, and recycling. Overexpression and knockdown experiments of CsGSIa in transgenic tea plants demonstrated its significant impact on Gln and theanine levels. The study also highlighted that Arg, Gln, glutamate (Glu), and theanine are the major amino acids transported through the xylem sap, facilitating long-distance nitrogen transport from roots to leaves.

Dr. Jian Zhao, the lead researcher, stated, "Our findings offer a detailed map of amino acid metabolism in tea plants, which is crucial for both basic science and applied agricultural practices. Understanding these metabolic pathways opens up new possibilities for breeding tea varieties with enhanced flavors and health benefits."

The study’s findings have significant implications for the tea industry. By elucidating the pathways of amino acid metabolism, this research paves the way for developing tea plants with higher levels of beneficial amino acids, enhancing both flavor and nutritional value. These insights can be applied in breeding programs and cultivation practices to produce superior tea varieties. Additionally, understanding these metabolic processes can help in developing strategies to improve nitrogen utilization efficiency, contributing to more sustainable and productive tea farming.

###

References

DOI: 10.1093/hr/uhae060

Original Source URL: https://doi.org/10.1093/hr/uhae060

Funding information

This work was supported by the Natural Science Foundation of China (U23A20214) and the Hunan 'Three Top' Innovative Talents Project (2022RC1142).

About Horticulture Research

Horticulture Research is an open access journal of Nanjing Agricultural University, ranked number one in the Horticulture category of the Journal Citation Reports™ (Clarivate, 2022). The journal is committed to publishing original research articles, reviews, perspectives, comments, correspondence articles, and letters to the editor related to all major horticultural plants and disciplines, including biotechnology, breeding, cellular and molecular biology, evolution, genetics, inter-species interactions, physiology, and the origination and domestication of crops.

A.I.

UVA and the Toyota Research Institute aim to give your car the power to reason




University of Virginia School of Engineering and Applied Science

Image: Yen-Ling Kuo, an assistant professor of computer science, is building a driving simulator, similar to this one in UVA Engineering’s Link Lab, to collect data on driving behavior. She’ll use the data to enable a robot’s AI to associate the meaning of words with what it sees by watching how humans interact with the environment or through its own interactions with the environment. Credit: Graeme Jenvey/University of Virginia School of Engineering and Applied Science





Self-driving cars are coming, but will you really be OK sitting passively while a 2,000-pound autonomous robot motors you and your family around town?

Would you feel more secure if, while autonomous technology is perfected over the next few years, your semi-autonomous car could explain to you what it’s doing — for example, why it suddenly braked when you didn’t? 

Better yet, what if it could help your teenager not only learn to drive, but to drive more safely? 

Yen-Ling Kuo, the Anita Jones Faculty Fellow and assistant professor of computer science at the University of Virginia School of Engineering and Applied Science, is training machines to use human language and reasoning to be capable of doing all of that and more. The work is funded by a two-year Young Faculty Researcher grant from the Toyota Research Institute.

“This project is about how artificial intelligence can understand the meaning of drivers’ actions through language modeling and use this understanding to augment our human capabilities,” Kuo said.

“By themselves, robots aren’t perfect, and neither are we. We don’t necessarily want machines to take over for us, but we can work with them for better outcomes.”

Eliminating the Need to Program Every Scenario

To reach that level of cooperation, you need machine learning models that imbue robots with generalizable reasoning skills.

That’s “as opposed to collecting large datasets to train for every scenario, which will be expensive, if not impossible,” Kuo said.

Kuo is collaborating with a team at the Toyota Research Institute to build language representations of driving behavior that enable a robot to associate the meaning of words with what it sees by watching how humans interact with the environment or by its own interactions with the environment.

Let’s say you’re an inexperienced driver, or maybe you grew up in Miami and moved to Boston. A car that helps you drive on icy roads would be handy, right?

This new intelligence will be especially important for handling out-of-the-ordinary circumstances, such as helping inexperienced drivers adjust to road conditions or guiding them through challenging situations.

“We would like to apply the learned representations in shared autonomy. For example, the AI can describe a high-level intention of turning right without skidding and give guidance to slow to a certain speed while turning right,” Kuo said. “If the driver doesn’t slow enough, the AI will adjust the speed further, or if the driver’s turn is too sharp, the AI will correct for it.”
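
To make that intention-to-guidance loop concrete, here is a minimal Python sketch of one kind of shared control: the driver’s input passes through unchanged unless it violates limits derived from the AI’s high-level intention. It is purely illustrative; the class, function, and thresholds are invented for this example and are not Kuo’s or the Toyota Research Institute’s implementation.

```python
# Illustrative sketch of shared autonomy for a "turn right without skidding"
# intention. All names and thresholds are invented for illustration; this is
# not Kuo's or the Toyota Research Institute's system.
from dataclasses import dataclass

@dataclass
class DriverCommand:
    speed_mps: float      # speed the driver is currently commanding
    steering_deg: float   # steering angle the driver is applying

def share_control(cmd: DriverCommand,
                  target_speed_mps: float = 5.0,
                  max_steering_deg: float = 20.0) -> DriverCommand:
    """Blend the driver's command with the AI's high-level intention.

    The intention ("slow to a certain speed while turning right") is
    expressed as limits; the driver's input is only overridden when it
    exceeds them.
    """
    speed = min(cmd.speed_mps, target_speed_mps)        # slow the car if needed
    steering = min(cmd.steering_deg, max_steering_deg)  # soften an over-sharp turn
    return DriverCommand(speed, steering)

# Example: the driver is going too fast and turning too sharply.
print(share_control(DriverCommand(speed_mps=9.0, steering_deg=30.0)))
# DriverCommand(speed_mps=5.0, steering_deg=20.0)
```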

Kuo will develop the language representations from a variety of data sources, including from a driving simulator she is building for her lab this summer.

Her work is being noticed. Kuo recently gave an invited talk on related research at the Association for the Advancement of Artificial Intelligence’s New Faculty Highlights 2024 program. She also has a forthcoming paper, “Learning Representations for Robust Human-Robot Interaction,” slated for publication in AI Magazine.

Advancing Human-Centered AI

Kuo’s proposal closely aligns with the Toyota Research Institute’s goals for advancing human-centered AI, interactive driving and robotics. 

“Once language-based representations are learned, their semantics can be used to share autonomy between humans and vehicles or robots, promoting usability and teaming,” said Kuo’s co-investigator, Guy Rosman, who manages the institute’s Human Aware Interaction and Learning team.

“This harnesses the power of language-based reasoning into driver-vehicle interactions that better generalize our notion of common sense, well beyond existing approaches,” Rosman said.

That means if you ever do hand the proverbial keys over to your car, the trust enabled by Kuo’s research should help you steer clear of any worries.


Berkeley Lab researchers advance AI-driven plant root analysis


Enhancing biomass assessment and plant root growth monitoring in hydroponic systems




DOE/Lawrence Berkeley National Laboratory

Image: Developed by Berkeley Lab researchers, RhizoNet is a new computational tool that harnesses the power of AI to transform how we study plant roots, offering new insights into root behavior under various environmental conditions. It works in conjunction with EcoFAB, a novel hydroponic device that facilitates in-situ plant imaging by offering a detailed view of plant root systems. Credit: Thor Swift, Lawrence Berkeley National Laboratory




In a world striving for sustainability, understanding the hidden half of a living plant – the roots – is crucial. Roots are not just an anchor; they are a dynamic interface between the plant and soil, critical for water uptake, nutrient absorption, and, ultimately, the survival of the plant. In an effort to boost agricultural yields and develop crops resilient to climate change, scientists from Lawrence Berkeley National Laboratory’s (Berkeley Lab’s) Applied Mathematics and Computational Research (AMCR) and Environmental Genomics and Systems Biology (EGSB) Divisions have made a significant leap. Their latest innovation, RhizoNet, harnesses the power of artificial intelligence (AI) to transform how we study plant roots, offering new insights into root behavior under various environmental conditions.

This pioneering tool, detailed in a study published on June 5 in Scientific Reports, revolutionizes root image analysis by automating the process with exceptional accuracy. Traditional methods, which are labor-intensive and prone to errors, fall short when faced with the complex and tangled nature of root systems. RhizoNet steps in with a state-of-the-art deep learning approach, enabling researchers to track root growth and biomass with precision. Built on a convolutional neural network backbone, this new computational tool semantically segments plant roots for comprehensive biomass and growth assessment, changing the way laboratories can analyze plant roots and propelling efforts toward self-driving labs.

As Berkeley Lab’s Daniela Ushizima, lead investigator of the AI-driven software, explained, “The capability of RhizoNet to standardize root segmentation and phenotyping represents a substantial advancement in the systematic and accelerated analysis of thousands of images. This innovation is instrumental in our ongoing efforts to enhance the precision in capturing root growth dynamics under diverse plant conditions.” 

Getting to the Roots

Root analysis has traditionally relied on flatbed scanners and manual segmentation methods, which are not only time-consuming but also susceptible to errors, particularly in extensive multi-plant studies. Root image segmentation also presents significant challenges due to natural phenomena like bubbles, droplets, reflections, and shadows. The intricate nature of root structures and the presence of noisy backgrounds further complicate the automated analysis process. These complications are particularly acute at smaller spatial scales, where fine structures are sometimes only as wide as a pixel, making manual annotation extremely challenging even for expert human annotators.

EGSB recently introduced the latest version (2.0) of EcoFAB, a novel hydroponic device that facilitates in-situ plant imaging by offering a detailed view of plant root systems. EcoFAB – developed via a collaboration between EGSB, the DOE Joint Genome Institute (JGI), and the Climate & Ecosystem Sciences division at Berkeley Lab – is part of an automated experimental system designed to perform fabricated ecosystem experiments that enhance data reproducibility. RhizoNet, which processes color scans of plants grown in EcoFAB that are subjected to specific nutritional treatments, addresses the scientific challenges of plant root analysis. It employs a sophisticated Residual U-Net architecture (an architecture used in semantic segmentation that improves upon the original U-Net by adding residual connections between input and output blocks within the same level, i.e. resolution, in both the encoder and decoder pathways) to deliver root segmentation specifically adapted for EcoFAB conditions, significantly enhancing prediction accuracy. The system also integrates a convexification procedure that serves to encapsulate identified roots from time series and helps quickly delineate the primary root components from complex backgrounds. This integration is key for accurately monitoring root biomass and growth over time, especially in plants grown under varied nutritional treatments in EcoFABs.
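
For readers unfamiliar with the architecture, the following is a minimal PyTorch sketch of a Residual U-Net in the spirit of the description above: residual blocks arranged in a U-shaped encoder/decoder, with skip connections concatenating features at matching resolutions. It is a sketch under stated assumptions, not the published RhizoNet code; the depth, channel counts, and input size are placeholders.

```python
# Minimal Residual U-Net sketch for binary root segmentation (illustrative;
# not the published RhizoNet implementation).
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    """Two 3x3 convolutions with a residual (identity or 1x1) connection."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch),
        )
        # 1x1 convolution matches channel counts so the addition is valid.
        self.skip = nn.Conv2d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.body(x) + self.skip(x))

class ResUNet(nn.Module):
    """Two-level encoder/decoder with skip concatenation between levels."""
    def __init__(self, in_ch=3, n_classes=1):
        super().__init__()
        self.enc1 = ResBlock(in_ch, 32)
        self.enc2 = ResBlock(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = ResBlock(64, 32)   # 32 upsampled + 32 skipped channels in
        self.head = nn.Conv2d(32, n_classes, 1)

    def forward(self, x):
        s1 = self.enc1(x)               # full-resolution features
        s2 = self.enc2(self.pool(s1))   # half-resolution features
        d1 = self.dec1(torch.cat([self.up(s2), s1], dim=1))
        return self.head(d1)            # per-pixel root/background logits

logits = ResUNet()(torch.randn(1, 3, 64, 64))  # one 64x64 RGB patch
print(logits.shape)                            # torch.Size([1, 1, 64, 64])
```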

To illustrate this, the new Scientific Reports paper details how the researchers used EcoFAB and RhizoNet to process root scans of Brachypodium distachyon (a small grass species) plants subjected to different nutrient deprivation conditions over approximately five weeks. These images, taken every three to seven days, provide vital data that help scientists understand how roots adapt to varying environments. The high-throughput nature of EcoBOT, the new image acquisition system for EcoFABs, offers research teams the potential for systematic experimental monitoring – as long as data is analyzed promptly. 

“We’ve made a lot of progress in reducing the manual work involved in plant cultivation experiments with the EcoBOT, and now RhizoNet is reducing the manual work involved in analyzing the data generated,” noted Peter Andeer, a research scientist in EGSB and a lead developer of EcoBOT, who collaborated with Ushizima on this work. “This increases our throughput and moves us toward the goal of self-driving labs.” Resources at the National Energy Research Scientific Computing Center (NERSC) – a U.S. Department of Energy (DOE) user facility located at Berkeley Lab – were used to train RhizoNet and perform inference, bringing this capability of computer vision to the EcoBOT, Ushizima noted.

“EcoBOT is capable of collecting images automatically, but it was unable to determine how a plant responds to different environmental changes, whether it is alive or not, or growing or not,” Ushizima explained. “By measuring the roots with RhizoNet, we capture detailed data on root biomass and growth not solely to determine plant vitality but to provide comprehensive, quantitative insights that are not readily observable through conventional means. After training, the model can be reused across multiple experiments (unseen plants).”

“In order to analyze the complex plant images from the EcoBOT, we created a new convolutional neural network for semantic segmentation," added Zineb Sordo, a computer systems engineer in AMCR working as a data scientist on the project. "Our goal was to design an optimized pipeline that uses prior information about the time series to improve the model's accuracy beyond manual annotations done on a single frame. RhizoNet handles noisy images, detecting plant roots from images so biomass and growth can be calculated.”

One Patch at a Time

During model tuning, the findings indicated that using smaller image patches significantly enhances the model's performance. In these patches, each neuron in the early layers of the artificial neural network has a smaller receptive field. This allows the model to capture fine details more effectively, enriching the latent space with diverse feature vectors. This approach not only improves the model's ability to generalize to unseen EcoFAB images but also increases its robustness, enabling it to focus on thin objects and capture intricate patterns despite various visual artifacts.

Smaller patches also help prevent class imbalance by excluding sparsely labeled patches – those with less than 20% of annotated pixels, predominantly background. The team’s results show high accuracy, precision, recall, and Intersection over Union (IoU) for smaller patch sizes, demonstrating the model's improved ability to distinguish roots from other objects or artifacts.
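
A short sketch makes the patch-filtering idea concrete: tile an annotated scan into small patches and keep only those with at least 20% labeled pixels. The patch size, array names, and the exact reading of "annotated pixels" below are assumptions for illustration, not the paper's implementation.

```python
# Illustrative patch extraction with sparse-label filtering (assumed details).
import numpy as np

def keep_patches(image, mask, patch=64, min_labeled=0.20):
    """Yield (image_patch, mask_patch) pairs dense enough to train on."""
    h, w = mask.shape
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            m = mask[y:y + patch, x:x + patch]
            # Skip mostly-background patches to limit class imbalance.
            if m.mean() >= min_labeled:
                yield image[y:y + patch, x:x + patch], m

# Toy example: a 256x256 scan with a thin vertical "root" of labeled pixels.
rng = np.random.default_rng(0)
img = rng.random((256, 256))
msk = np.zeros((256, 256))
msk[:, 100:120] = 1.0
kept = list(keep_patches(img, msk))
print(len(kept), "patches kept out of", (256 // 64) ** 2)  # 4 out of 16
```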

To validate the performance of root predictions, the paper compares predicted root biomass to actual measurements. Linear regression analysis revealed a significant correlation, underscoring the precision of automated segmentation over manual annotations, which often struggle to distinguish thin root pixels from similar-looking noise. This comparison highlights the challenge human annotators face and showcases the advanced capabilities of the RhizoNet models, particularly when trained on smaller patch sizes.
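
As a sketch of that validation step, the snippet below regresses predicted biomass against measured biomass and reports the fit; the numbers are synthetic stand-ins, not the paper's data.

```python
# Hedged illustration of validating predicted root biomass against ground
# truth with a linear regression. Values are synthetic.
import numpy as np
from scipy import stats

measured = np.array([1.2, 2.4, 3.1, 4.0, 5.2, 6.1])    # e.g. grams (synthetic)
predicted = np.array([1.1, 2.6, 2.9, 4.2, 5.0, 6.3])   # model output (synthetic)

res = stats.linregress(measured, predicted)
print(f"slope={res.slope:.2f}, intercept={res.intercept:.2f}, "
      f"r={res.rvalue:.3f}, p={res.pvalue:.2e}")
# A slope near 1 and a high r with a small p indicate the automated
# segmentation tracks the physical biomass measurements.
```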

This study demonstrates the practical applications of RhizoNet in current research settings, the authors noted, and lays the groundwork for future innovations in sustainable energy solutions as well as carbon-sequestration technology using plants and microbes. The research team is optimistic about the implications of their findings. 

“Our next steps involve refining RhizoNet’s capabilities to further improve the detection and branching patterns of plant roots,” said Ushizima. "We also see potential in adapting and applying these deep-learning algorithms for roots in soil as well as new materials science investigations. We're exploring iterative training protocols, hyperparameter optimization, and leveraging multiple GPUs. These computational tools are designed to assist science teams in analyzing diverse experiments captured as images, and have applicability in multiple areas.” 

Further research work in plant root growth dynamics is described in a pioneering book on autonomous experimentation edited by Ushizima and Berkeley Lab colleague Marcus Noack that was released in 2023. Other team members from Berkeley Lab include Peter Andeer, Trent Northen, Camille Catoulos, and James Sethian. This multidisciplinary group of scientists is part of Twin Ecosystems, a DOE Office of Science Genomic Science Program project that integrates computer vision software and autonomous experimental design software developed at Berkeley Lab (gpCAM) with an automated experimental system (EcoFAB and EcoBOT) to perform fabricated ecosystem experiments and enhance data reproducibility. The work of analyzing plant roots under different kinds of nutrition and environmental conditions is also part of the DOE’s Carbon Negative Earthshot initiative (see sidebar).

 

Among cancer survivors, LGBTQ+ individuals report higher prevalence of chronic health conditions, disabilities, other limitations



Transgender or gender non-conforming cancer survivors had higher odds of most conditions compared to cisgender cancer survivors



American Association for Cancer Research





Bottom Line: Cancer survivors who identify as lesbian, gay, bisexual, transgender, queer, or anything other than straight and cisgender (LGBTQ+) experience more chronic health conditions, disabilities, and other physical and cognitive limitations than non-LGBTQ+ cancer survivors; however, the prevalence of most conditions was highest among transgender or gender non-conforming (TGNC) individuals.

Journal in Which the Study was Published: Cancer Epidemiology, Biomarkers & Prevention, a journal of the American Association for Cancer Research (AACR)

Author: Austin R. Waters, MSPH, a doctoral candidate in health policy and management at the UNC Gillings School of Global Public Health in Chapel Hill, North Carolina and a predoctoral fellow at the Lineberger Comprehensive Cancer Center, University of North Carolina at Chapel Hill

Background: Prior research shows cancer survivors are more likely to have chronic diseases such as diabetes, kidney disease, liver disease, COPD, and heart disease compared to those who have never been diagnosed with cancer. Meanwhile, LGBTQ+ individuals, who represent about 7.1% of the U.S. population, have been found to face health disparities due to stigma and other social determinants of health. But few national samples that differentiate between cisgender and transgender identities have been used to study disparities among LGBTQ+ cancer survivors for chronic health conditions, according to Waters.

“Thinking about how LGBTQ+ cancer survivors’ health compares to non-LGBTQ+ cancer survivors’ is an important question because it begins to disentangle the driving forces behind inequities,” Waters said. “Notably, our analysis revealed that even when controlling for factors such as smoking status and income—factors known to be associated with poor health—LGBTQ+ cancer survivors continued to have higher odds of most chronic health conditions and other limitations.”

How the Study was Conducted: Waters and colleagues used data from the Behavioral Risk Factor Surveillance System (BRFSS), a phone survey system managed by the Centers for Disease Control and Prevention, collected in 2020, 2021, or 2022 from 23 states that administered questionnaires about sexual orientation and gender identity as well as cancer survivorship. Of 40,990 cancer survivors, 1,715 were LGBTQ+, including 638 lesbian or gay individuals, 551 bisexual individuals, and 458 who identified as another non-heterosexual sexual orientation, such as queer, pansexual, or asexual. Of the 114 TGNC cancer survivors, 38 identified as transgender men, 43 as transgender women, and 33 as gender non-conforming. Overall, the LGBTQ+ cancer survivors were significantly more racially and ethnically diverse, had a lower household income, and were younger both at the time of the survey and at diagnosis of their cancer.

Participants were asked to reply “yes” or “no” to whether they were “ever told” they had chronic health conditions such as angina or heart disease, asthma, COPD, depressive disorder, kidney disease, stroke, or diabetes. They were also asked about disabilities and physical limitations, such as hearing disability, vision disability, difficulty walking, difficulty dressing, or difficulty running errands, and about cognitive limitations, such as serious difficulty concentrating, remembering, or making decisions due to any physical, mental, or emotional condition. Waters and colleagues compared results between LGBTQ+ and non-LGBTQ+ cancer survivors. They also broke the results down by examining TGNC and cisgender lesbian, gay, and bisexual (LGB) cancer survivors in comparison to non-LGBTQ+ cancer survivors, controlling for factors including age, race and ethnicity, smoking status, and education and household income.

Results: When adjusted for age, race and ethnicity, smoking status, and education and household income, LGBTQ+ cancer survivors overall had higher odds of reporting asthma, depressive disorder, kidney disease, stroke, diabetes, vision disabilities, cognitive limitations, difficulty walking, difficulty dressing, and difficulty running errands compared to non-LGBTQ+ cancer survivors. The odds for TGNC cancer survivors, however, were substantially higher for most outcomes compared to non-TGNC survivors, with adjusted odds ratios ranging from 2.34 to 6.03; the lone exception was depressive disorder. When adjusted for age, TGNC survivors also had a higher prevalence of most health conditions compared to LGB survivors, with the exceptions of depressive disorder and cognitive limitations.
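
For readers curious about the methodology, adjusted odds ratios like these are conventionally estimated by fitting a logistic regression of each outcome on group membership plus covariates and exponentiating the coefficients. The sketch below uses entirely synthetic data and a reduced covariate set, and it omits BRFSS survey weights; it is not the authors' analysis code.

```python
# Illustrative adjusted-odds-ratio estimation with synthetic data (not the
# authors' analysis; BRFSS survey weights are omitted for brevity).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "lgbtq": rng.integers(0, 2, n),     # 1 = LGBTQ+ cancer survivor
    "age": rng.normal(60, 10, n),
    "smoker": rng.integers(0, 2, n),
})
# Synthetic outcome with a built-in group effect, for demonstration only.
lin = -3 + 0.8 * df.lgbtq + 0.03 * (df.age - 60) + 0.5 * df.smoker
df["asthma"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

fit = smf.logit("asthma ~ lgbtq + age + smoker", data=df).fit(disp=0)
print(np.exp(fit.params).round(2))  # exponentiated coefficients = adjusted ORs
```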

Author’s Comments: “Transgender and gender non-conforming individuals are some of the most marginalized people in the LGBTQ+ community and are known to experience more barriers to healthcare, more discrimination, more exclusion, and more violence than LGB individuals,” Waters said. “Our study highlights the challenges TGNC cancer survivors face and the need for TGNC individuals, as well as all other LGBTQ+ cancer survivors, to be prioritized in care across the continuum.”

Waters said future studies will begin to explore how some of these outcomes, such as depression and cognitive limitations, interplay with financial well-being and the ability to work after cancer to identify ways to better support LGBTQ+ survivors throughout the care process.

“While interventions like LGBTQ+-specific prehabilitation or LGBTQ+ patient navigators may minimize some inequities, ultimately societal and policy changes such as non-discrimination laws, affordable housing, and affordable health care are needed to completely address such disparities,” he said.

Study Limitations: Limitations of this study include a smaller sample of LGBTQ+ individuals in BRFSS compared to national samples, which could indicate that participants were not comfortable disclosing information about sexual orientation or gender identity or that LGBTQ+ individuals were less likely to respond to BRFSS. Additionally, cancer survivorship and sexual and gender identity survey modules are optional for states, which means the experiences of cancer survivors in states that did not elect to include this information are not reflected. The cross-sectional design of the study could have also resulted in a cohort of healthier cancer survivors with less severe disease or treatment. Further, potential recall errors are possible due to the self-reported status of cancer and chronic conditions. The study also lacks information about cancer treatments and pack-years for smokers, which may have further explained the findings.

Funding & Disclosures: Funding for the study was provided by the Cancer Care Quality Training Program at the Lineberger Comprehensive Cancer Center and the National Cancer Institute. Waters declares no conflicts of interest.