Tuesday, July 09, 2024

 

What can America do to make health care and health outcomes more equal?


U-M expert on racial and ethnic differences in health care and health outcomes offers recommendations as part of a national committee



MICHIGAN MEDICINE - UNIVERSITY OF MICHIGAN

Health care inequality illustration 

Image: Health care inequality persists 21 years after a landmark report first drew broad national attention to the issue.

Credit: Emith Smith, University of Michigan




In 2003, Americans learned just how unequal health care in the United States really was.

A major report by an eminent group of experts showed wide gaps in how people from different racial and ethnic backgrounds received care for many different conditions -- even if they had the same income or insurance coverage.

Of course, people from Black, Hispanic, Native American and certain other backgrounds had lived that inequality for centuries. And researchers had documented it since the 1960s.

But the landmark report, called “Unequal Treatment,” launched the issue into the spotlight for all Americans, and spurred action at many levels. It helped inform the Affordable Care Act, including new programs such as Medicaid expansion to reduce the number of people of all backgrounds who lacked health insurance.

Did it make a difference? Has inequality dropped?

To some degree, yes – but much more action is needed, according to a new report, Ending Unequal Treatment, first released late last month by the National Academies of Sciences, Engineering, and Medicine.

One of the new report’s authors, John Z. Ayanian, M.D., M.P.P., leads a major University of Michigan institute focused on health care research and policy, including health equity.

Back in 2003, he was one of the researchers whose studies helped form the backbone of the first report.  

“More than 20 years later, our committee found there’s been some progress, but it’s been uneven and incomplete in eliminating health care disparities and promoting health equity,” he said. “We also can draw much more of a link between inequities in health care and inequities in health outcomes for individuals and populations.”

In other words, the data now clearly show that inequality harms people’s health.

“We find that racial and ethnic inequities remain a fundamental flaw of the U.S. health care system, and that they’re driven by complex interactions between a variety of forces,” says Ayanian, a professor of internal medicine, public health and public policy at U-M and director of the U-M Institute for Healthcare Policy and Innovation.

He adds, “We also document specific approaches that have been demonstrated to improve health care equity – for example, community health workers engaging with people who have chronic conditions, health-related social needs, or risk factors – but these approaches haven’t been disseminated or implemented widely and over enough time to make a major difference.”

The committee also notes that despite the gains made possible by changes in health policy, other policy developments could raise major new barriers.

So what can be done?

Ayanian describes some of the committee’s key recommendations:

  • Continuing the efforts started after the first Unequal Treatment report to diversify the workforces for both health care and health services research so they are more representative of the entire U.S. population.
  • More comprehensive efforts to increase health equity, going beyond the incremental and time-limited changes of the past. For instance, Ayanian points to New York City’s success in substantially increasing colorectal cancer screening among Asian, Black, and Hispanic residents, an approach that has not yet been adopted by other cities or regions.
  • More research and evaluation to understand inequality more deeply, and to drive improvement. Ayanian and colleagues spoke to the importance of this work in a special panel at the national AcademyHealth meeting of health care researchers in early July.
  • Accountability via enforcement of current laws and policies, such as the prohibitions on discrimination under the Affordable Care Act. For instance, helping more people understand the process for filing a complaint if they believe they’ve received unequal care, including providing information in multiple languages for people with limited English proficiency.
  • Clearer and more enforceable standards, via the Internal Revenue Service and the Department of Treasury, for nonprofit hospitals to report what they do to address the health-related social needs of people in the communities they serve.
  • Congressional action to make affordable health insurance available for everyone, including those living in states that have not expanded Medicaid, and immigrants regardless of status.
  • Congressional action to make Medicaid’s payments to physicians and hospitals equal to what Medicare pays for the same services. Because of the current lower payment rates, only about half of physicians accept patients with Medicaid coverage, which contributes to health care inequities because people of color are more likely to qualify for Medicaid due to having lower incomes.
  • Fully funding the Indian Health Service, which serves Native American and Alaska Native individuals nationwide.
  • Addressing the gap in Medicaid funding for U.S. territories such as Puerto Rico and American Samoa, which currently have much more limited funding and coverage than the 50 states and the District of Columbia. 
  • New standards to guide better collection of data on race and ethnicity by all areas of the federal government for both health care workers and patients covered by any source of insurance or no insurance.
  • Expansion of demonstration projects that are seeking to address health-related social needs. 

In addition to the report and a related webinar, members of the committee laid out the policy implications and recommendations in a new piece in Health Affairs Forefront.

The new report comes at a unique moment, Ayanian notes, because of the disparities that came into sharp focus during the height of the COVID-19 pandemic.

Although deaths from COVID-19 have dropped significantly thanks to vaccinations, increased natural immunity, and effective treatment, the pandemic reversed more than a decade of progress in reducing the gap in life expectancy between people of different racial and ethnic backgrounds.

Black and Native American life expectancy was steadily catching up to white life expectancy before 2020, and Hispanic Americans had achieved longer life expectancy than white Americans by 2018. But those advances were reversed in 2020-2021 because COVID-19 led to substantially higher rates of premature death among people of color.

The committee behind the new report points out the major economic impacts of health care inequity, as well as the injustice it represents.

Says Ayanian, “We hope this report will be a guide for effective changes in policy and practice for years to come, just as the original report was used to motivate such efforts over the past 20 years.”

 

National Academies of Sciences, Engineering, and Medicine. 2024. Ending Unequal Treatment: Strategies to Achieve Equitable Health Care and Optimal Health for All. Washington, DC: The National Academies Press. https://doi.org/10.17226/27820

 

Return-to-work programs may have a hidden cost to women, according to study



UNIVERSITY OF SURREY






Researchers explored the experiences of professional women re-entering the workforce after taking time off for family reasons. The research focused on returner programmes – employer-sponsored initiatives designed to ease the transition back into professional roles. 

The study reveals a complex picture. While returner programmes offer valuable support and mitigate some of the stigma associated with career breaks, they fall short of addressing persistent discrimination. 

The study found a troubling paradox:  

  • Returner programmes may reduce the risk of immediate workplace stigma but don't address deeper issues. 
  • Women who successfully re-enter their professions often experience occupational downgrading and limited career progression. Notably, the research suggests this is particularly true in the private sector. 

Dr Cecile Guillaume, lead author of the study, Reader in Work, Employment and Organisation Studies, and the Surrey Business School Director of Internationalisation, said: 

"While returner programmes are a positive step in supporting women re-entering the workforce, our research suggests they have limitations. We found that these programmes can address some of the initial hurdles, but they don't dismantle the systemic inequalities that lead to occupational downgrading and hinder career progression for women. A multi-pronged approach that tackles both individual challenges and broader societal barriers is necessary to create a truly equitable workplace for all." 

The research highlights the programmes' role in helping women overcome the stigma often attached to career gaps. Through supportive elements like coaching and mentoring, as well as access to professional networks, these programmes can empower women to re-enter their fields. 

However, the study goes beyond simply identifying barriers; it also highlights the 'hidden costs' of returner programmes. Many participants reported significant financial investments to participate, such as interview training and professional certifications. Additionally, the research suggests that even with successful re-entry, women may face long-term career consequences. 

The study emphasises the limitations of individual and organisational coping mechanisms. While these strategies may help women navigate daily workplace interactions, they don't dismantle systemic inequalities.  

Dr Cecile Guillaume continued: 

"There is a need for broader societal shifts to address structural and cultural barriers that continue to disadvantage women returning to work. Only then can we create a truly inclusive and equitable workplace for women at all stages of their careers." 

This study has been published in a journal of the British Sociological Association. 

### 


Hepatitis C leaves “scars” in immune cells even after successful treatment



Study reveals epigenetic changes in regulatory T cells of hepatitis C patients post-treatment




INSTITUTE FOR BASIC SCIENCE

Figure 1. Regulatory T cells continue to exhibit inflammatory features even after clearance of the hepatitis C virus.

Image: The number of regulatory T cells remains elevated in the peripheral blood of chronic hepatitis C patients, and this increase persists even after the completion of treatment. Analysis of the transcriptome and epigenome of patients with chronic hepatitis C shows that regulatory T cells continue to have increased inflammatory characteristics compared with healthy individuals. These inflammatory Treg cells showed higher expression of activation markers and transcription factors such as RORγt and T-bet, leading them to secrete the inflammatory cytokine TNF.

Credit: Institute for Basic Science




Led by Director SHIN Eui-Cheol, researchers from the Institute for Basic Science (IBS) Korea Virus Research Institute’s (KVRI) Center for Viral Immunology have provided new insights into the lasting effects of chronic Hepatitis C virus (HCV) infection on the immune system, even after the disease has been successfully treated. The research team has discovered that traces of “epigenetic scars” remain in regulatory T cells and exhibit sustained inflammatory properties long after the virus is cleared from the body.

Chronic hepatitis C, caused by the hepatitis C virus, can lead to severe complications such as liver cirrhosis and liver cancer. The advent of highly effective direct-acting antivirals (DAAs) has resulted in high cure rates for this chronic viral infection. However, it has been reported that the immune system of patients does not fully recover even after being cured.

The study examined patients with chronic HCV infection who achieved a sustained virologic response (SVR) after DAA treatment. SVR means that HCV is not detected in the blood for 12 weeks after the completion of treatment, a strong indicator that the virus has been eradicated from the body. The researchers found that the frequency of activated TREG cells remained elevated during treatment and continued to be high even after the virus was eliminated.

The researchers then performed comprehensive analyses, including RNA sequencing and ATAC-seq, which revealed that the transcriptomic and epigenetic landscapes of TREG cells from HCV patients remained altered even after eradication of the virus. Inflammatory features, such as increased TNF signaling, were sustained in TREG cells, indicating long-term immune system changes induced by the chronic infection. These activated TREG cells from HCV patients continued to produce inflammatory cytokines like TNF, IFN-γ, and IL-17A even after clearance of the virus. The researchers followed the patients for up to six years after achieving SVR and found that inflammatory features still persisted.

The study’s results have significant implications for the long-term management of patients who have been treated for chronic HCV infection. Despite successful viral clearance, the persistence of inflammatory features in TREG cells suggests that these patients may be at risk for ongoing immune system dysregulation. This could potentially lead to chronic inflammation and related health issues.

Director SHIN Eui-Cheol explained, “Our findings highlight the need for ongoing monitoring even after HCV has been cleared. By understanding the underlying mechanisms of these persistent immune changes, we can develop more effective strategies to ensure complete recovery and improve the quality of life for HCV patients.”

The research team is now focusing on further investigating the mechanisms behind the sustained inflammatory state of TREG cells. They aim to explore potential therapeutic interventions that could reverse these epigenetic and transcriptomic changes.

“We are now interested in seeing whether other chronic viral infections also cause long-lasting epigenetic changes in our immune systems,” said Director Shin. “One of our goals is to identify clinical implications of these persistent immune alterations.”

This research was published in the Journal of Hepatology.

 

How lasers and 2D materials could solve the world's plastic problem



UNIVERSITY OF TEXAS AT AUSTIN
Lasers and plastics 1 

Image: A series of mirrors and prisms deflect lasers and focus them to perform the reaction.

Credit: The University of Texas at Austin




A global research team led by Texas Engineers has developed a way to blast the molecules in plastics and other materials with a laser to break them down into their smallest parts for future reuse.

The discovery, which involves laying these materials on top of two-dimensional materials called transition metal dichalcogenides and then lighting them up, has the potential to improve how we dispose of plastics that are nearly impossible to break down with today's technologies.

"By harnessing these unique reactions, we can explore new pathways for transforming environmental pollutants into valuable, reusable chemicals, contributing to the development of a more sustainable and circular economy," said Yuebing Zheng, professor in the Cockrell School of Engineering's Walker Department of Mechanical Engineering and one of the leaders on the project.​ “This discovery has significant implications for addressing environmental challenges and advancing the field of green chemistry."

The research was recently published in Nature Communications. The team includes researchers from the University of California, Berkeley; Tohoku University in Japan; Lawrence Berkeley National Laboratory; Baylor University; and The Pennsylvania State University.

Plastic pollution has become a global environmental crisis, with millions of tons of plastic waste accumulating in landfills and oceans each year. Conventional methods of plastic degradation are often energy-intensive, environmentally harmful and ineffective. The researchers envision using this new discovery to develop efficient plastic recycling technologies to reduce pollution.

The researchers used low-power light to break the chemical bonding of the plastics and create new chemical bonds that turned the materials into luminescent carbon dots. Carbon-based nanomaterials are in high demand because of their many capabilities, and these dots could potentially be used as memory storage devices in next-generation computer devices.

"It's exciting to potentially take plastic that on its own may never break down and turn it into something useful for many different industries," said Jingang Li, a postdoctoral researcher at the University of California, Berkeley, who started the research at UT.

The specific reaction is called C-H activation, in which carbon-hydrogen bonds in an organic molecule are selectively broken and transformed into new chemical bonds. In this research, the two-dimensional materials catalyzed the reaction, releasing hydrogen as a gas and clearing the way for carbon atoms to bond with each other to form the information-storing dots.

Further research and development are needed to optimize the light-driven C-H activation process and scale it up for industrial applications. However, this study represents a significant step forward in the quest for sustainable solutions to plastic waste management. 

The light-driven C-H activation process demonstrated in this study can be applied to many long-chain organic compounds, including polyethylene and surfactants commonly used in nanomaterials systems. ​  

The research was funded by various institutions, including the National Institutes of Health, National Science Foundation, Japan Society for the Promotion of Science, the Hirose Foundation and the National Natural Science Foundation of China. ​ 

The research team includes Deji Akinwande and Yuqian Gu of UT's Chandra Family Department of Electrical and Computer Engineering; Zhihan Chen, Zilong Wu and Suichu Huang of the Materials Science and Engineering Program at UT; Hao Li, Di Zhang and Zhongyuan Guo from Tohoku University in Japan; Brian Blankenship, Min Chen and Costas P. Grigoropoulos of the University of California, Berkeley; Xi Jiang, Robert Kostecki and Andrew M. Minor of Lawrence Berkeley National Laboratory; Jonathan M. Larson of Baylor University; and Haoyue Zhu, Tianyi Zhang, Mauricio Terrones and Joan M. Redwing of The Pennsylvania State University.

Image: Professor Yuebing Zheng and graduate student Siyuan Huang.

Credit: The University of Texas at Austin

 

A treatment for anorexia nervosa?


McGill-led research team may have discovered neurological mechanism underlying common eating disorder



MCGILL UNIVERSITY




A McGill University-led research team, working in collaboration with a French team (CNRS, INSERM and Sorbonne University), believes it has identified both the neurological mechanism underlying anorexia nervosa and a possible cure.  

The international team’s findings, published this week in Nature Communications, have the potential to improve the lives of millions of people around the world, mostly women, who suffer from the common eating disorder, which has the highest mortality rate of any psychiatric disease.

Working with mice, the researchers discovered that a deficit of acetylcholine, a neurotransmitter, in an area of the brain called the striatum, which is associated with the reward system, can lead to excessive habit formation and precipitate the compulsive self-starvation seen in people who suffer from anorexia nervosa.

McGill Psychiatry professor Dr Salah El Mestikawy, the senior author on the paper and a researcher at the Douglas Research Centre, said the research team decided to see whether using donepezil, a medication which is known to increase the presence of acetylcholine in the brain, could have an effect on these compulsive self-destructive behaviours.  

“We found that it fully reversed the anorexia-like behaviour in mice, and we believe that it could potentially offer the first mechanism-based treatment of anorexia nervosa. In fact, we are already seeing its effects on some patients with the disease.”

Improving health of anorexia patients treated with donepezil

Ongoing independent studies in Toronto and Montreal, led by Dr. Leora Pinhas, an independent psychiatrist, are showing positive results for 10 patients with severe anorexia nervosa who are being treated with low doses of donepezil. Three of the patients are in full remission and the other seven patients show a marked improvement in the disease.

Further double-blind clinical trials, comparing the results of those taking a placebo with those taking the medication, are due to take place this year at Columbia University, Denver University, and the Hôpital Sainte-Anne in Paris.

Dr. El Mestikawy cautions, however, that between clinical trials and government approval, it may take several years before a new drug can be used to treat anorexic patients.

Possible effects on other diseases involving compulsive behaviours

El Mestikawy said donepezil is a drug with many gastrointestinal and muscular side effects. As a result, the McGill researchers are working with the French team, led by Stéphanie Daumas and Nicolas Pietrancosta at Sorbonne Université, who co-authored the study, to develop a novel compound with fewer problems.

“We also suspect that other compulsive pathologies, such as obsessive-compulsive disorder (OCD) and addictions, can also be improved by donepezil, so we are actively looking for collaboration with other psychiatrists around the world to explore the possibilities,” said El Mestikawy.

The study, “The human VGLUT3-pT8I mutation elicits uneven striatal DA signaling, food or drug maladaptive consumption in male mice,” by Mathieu Favier et al., was published in Nature Communications.

Article: https://www.nature.com/articles/s41467-024-49371-1

 

Statistical experts warn of looming threats to vital official data



The American Statistical Association releases new report, The Nation’s Data at Risk



AMERICAN STATISTICAL ASSOCIATION





ALEXANDRIA, Va., July 9, 2024 – As the nation wraps up celebrations of its birth 248 years ago, a first-ever comprehensive report about the status of the federal statistical system—informing and powering the progress of the world’s oldest democracy since the first census in 1790—issues a clarion call with concerns for the future of official statistics.

Today, the American Statistical Association released a 90-page report, two years in the making, that details serious threats to America’s ability to continue producing trusted, nonpartisan and essential statistics that serve every community in the nation. Titled The Nation’s Data at Risk: Meeting America’s Information Needs for the 21st Century, the report assesses the core of the federal statistical system—the 13 principal statistical agencies that produce official statistics as their primary mission and the chief statistician’s office in the Office of Management and Budget.

The report found statistical agencies are experiencing significant weaknesses in at least one of three critical supports needed to strengthen our nation:

  • Many agencies lack statutory protection to sustain a high degree of control over how they collect and disseminate trusted statistics.
  • Not all agencies have strong support from the cabinet department or independent agency (“parent agency”) in which they reside.
  • Most agencies have suffered a decline in resources in real-dollar terms over the last 15 years, so they are stretched to carry out basic responsibilities.

“A consequence of weaknesses in the three critical supports is that long-standing statistical series that produce important economic indicators, such as the unemployment rate, are prone to become outdated in content and methods because of the statistical agencies’ inability to invest in continuous testing and improvement. In other cases, essential programs have been cut, delayed, or otherwise curtailed without due consideration of the consequences to data users outside the parent agency,” according to the report.

Compounding these challenges, the authors of the report found “… data collection methodology is rooted in 20th century technology and survey-taking techniques. But the public is less cooperative, and agencies are hampered by legal and other barriers in their abilities to more rapidly develop and implement new data collection methods and tap other public and private data sources to sustain quality and timeliness, increase efficiency and productivity, and keep up with policy areas of interest.”

The expert authors of the ASA report, written in collaboration with George Mason University and with support from the Alfred P. Sloan Foundation, provided 15 specific recommendations for Congress and the executive branch to head off the looming threats to the nation’s most trusted data.

“We collectively face a narrowing window of opportunity to reverse these trends and bring our federal statistical system into the modern era,” said Nancy Potok, former chief statistician of the U.S. and a report coauthor.

The experts strongly believe their “... 15 recommendations would fill important gaps in existing legislation and regulations to bolster statistical agencies’ professional autonomy, data-sharing authority, and resources, which are critical if we are to continue to provide relevant, accurate, timely, detailed, and credible data for the public and policymakers,” said Connie Citro, a report coauthor and senior scholar for the Committee on National Statistics.

The ASA Board of Directors endorsed the report’s recommendations.

The ASA—composed of 15,000 professionals in academia, government, research and business and with the mission to promote the practice and profession of statistics—led and oversaw production of the report and intends to update it annually.

“Our professional judgment is that these [federal statistical] agencies have been overlooked in investment and innovation, which we have known for several years but feel now is the time to raise the alarm with data users and taxpayers that the system is at risk,” said Steve Pierson, a coauthor of the report and director of science policy at the ASA.

Researchers from George Mason University examined a wide range of information, from budget cuts to delayed survey operations.

“The data show a number of concerning patterns,” said Jonathan Auerbach, a coauthor of the report and assistant professor of statistics at George Mason University. “If present trends continue, future generations may no longer have access to the public information we rely on today.”

The report includes assessments from almost two dozen former leaders of these agencies, data users and Capitol Hill veterans. Several provided dire warnings about the future of the system.

“We are only a few short years from unreliable unemployment numbers on the 'first Friday' of every month without serious interventions to modernize the underlying data collection (the Current Population Survey).” – William Beach, former commissioner, Bureau of Labor Statistics, 2019–2023; senior fellow in economics, Economic Policy Innovation Center

“Lousy data beget lousy decisions. It is no exaggeration to say that Americans’ well-being and the vitality of the U.S. economy rely in no small part on the quality of information provided by our federal statistical system.” – Erica Groshen, commissioner, Bureau of Labor Statistics, 2013–2017; senior economics advisor, Cornell University School of Industrial and Labor Relations

“One of the most important functions of Congress in conducting oversight is to assess the performance, need and value of federal programs. Timely, high-quality data from the federal statistical system is essential in carrying out this important function, across every committee and for every Member.” – Paul Ryan, speaker, U.S. House of Representatives, 2015–2019

“The value to the taxpayer and the public of creating high-quality statistics by blending data from multiple sources is blindingly clear. If the federal statistical system does not act quickly and decisively to create that value, it will be marginalized and its products replaced by lower quality but cheaper, timelier, and more actionable information. It will take vision, leadership, and determination. But the time to stop talking and start acting is now.” – Julia Lane, cofounder of the LEHD program of the U.S. Census Bureau, the STAR METRICS/UMETRICS program, the Democratizing Data project, the NORC Data Enclave, and the Coleridge Initiative; initiator of New Zealand’s Integrated Data Infrastructure and Patentsview

 

##

Contact: Steve Pierson, pierson@amstat.org, (703) 302-1841.

 

About the American Statistical Association

The ASA is the world’s largest community of statisticians and the oldest continuously operating professional science society in the United States. Its members serve in industry, government and academia in more than 90 countries, advancing research and promoting sound statistical practice to inform public policy and improve human welfare. For additional information, please visit the ASA website at www.amstat.org.


 Indian Government is considering allowing white rice shipments with a fixed duty

Published: July 08, 2024 
Image credit: Bloomberg

India, the world’s top rice shipper, may relax restrictions on exports of some varieties to avoid a glut in the country before the new crop arrives in the market in October, according to people familiar with the matter.

The government is considering allowing white rice shipments with a fixed duty, said the people, who asked not to be identified as the talks are confidential. The authorities may also scrap a 20% tax on parboiled rice exports and impose a fixed levy instead to discourage under-invoicing of cargoes, they said.

Any such move could help cool benchmark Asian rice prices, which reached the highest in more than 15 years in January, following India’s move to start restricting sales of key varieties from 2023. That would be good news for some countries in West Africa and the Middle East that rely on the South Asian nation for most of their requirements of the food staple.

A spokesperson representing both the food and commerce ministries didn’t immediately comment.

India’s total rice exports slumped 21% from a year earlier to 2.9 million tons in the first two months of the fiscal year that started on April 1, according to government data. Shipments of non-basmati rice fell 32% to 1.93 million tons during the same period, it said.

Indian farmers are in the midst of sowing their main rice crop for the next harvest as the monsoon kicks off. Planting will peak in July and the grain will be collected from late September.

Acreage stood at 6 million hectares (14.8 million acres) as of July 8, a jump of 19% from a year earlier, according to the farm ministry, following a recovery in the monsoon after deficient rains last month.

Rice gone wild: how humans have inadvertently selected for 'weedy' rice



Researchers at UMass Amherst show the evolutionary persistence of a weediness trait, rewriting the story of rice as we know it



Peer-Reviewed Publication

UNIVERSITY OF MASSACHUSETTS AMHERST

Image: Scanning electron microscope images of abscission zones from a sample of rice accessions with different shattering abilities.

Credit: Xiang Li and Daniel Lowey



AMHERST, Mass. – University of Massachusetts Amherst researchers have discovered that the anatomical adaptation helping weedy rice varieties to proliferate is not, as previously believed, confined only to these pest varieties. The research, published recently in the Journal of Experimental Botany, shows that despite 10,000 years of human cultivation, a cell tissue that allows rice plants to easily drop their seeds remains a feature in nearly all cultivated varieties of the grain, though to a lesser degree and with much more variation.

The continued proliferation of weedy rice, a type of rice that is a pest in cultivated fields, suggests that the way we cultivate rice inadvertently also selects for weedy behavior; this is particularly evidenced by the fact that cultivated rice continually de-domesticates into weedy varieties.

“For the past few decades, we biologists have been telling ourselves a story about how rice domestication occurred,” says Ana Caicedo, professor of biology at UMass Amherst and the paper’s senior author. “But when we started looking very closely at a wide array of different rice varieties, it turns out that old story is far too simplistic and obscures what is actually happening.”

The story begins long before humans entered the picture, when the wild ancestor of cultivated rice evolved the ability to “shatter,” or easily drop and disperse its seeds rather than hang on to them. Shattering is an evolutionarily elegant reproductive strategy that lets a plant spread widely and quickly, and it is possible because of the “abscission zone”: a special kind of tissue at the base of each rice grain (the part we eat) that connects the grain to the plant.

Enter the humans.

While shattering is beneficial for wild plants, it is a drawback for grain harvesting, because most of the seeds would drop before a hungry human could gather them. Over many millennia, humans have selected for cultivated rice varieties, each with its own domestication history, that hold on to their grains more tightly than their wild progenitors, and it has long been thought that the abscission zone, and with it shattering, had been bred out of cultivated rice.

However, cultivated rice has also evolved into weedy rice varieties that shatter and are so successful that they are one of the leading factors limiting cultivated rice production worldwide. In fact, weedy rice comprises many different groups that de-domesticated, or evolved independently, from various cultivated varieties.

“The abscission zone has long been recognized as a critical factor influencing shattering in rice, but it has not been thoroughly examined,” says Xiang Li, the paper’s co-lead author who completed this research as part of his graduate studies at UMass Amherst. “We needed to look at more rice individuals to uncover the pattern of this specialized tissue in different rice varieties. This anatomic investigation will improve our understanding of rice evolution and lay the foundation for further examination of underlying genetics.”

Li, Caicedo and their team, including co-lead author Daniel Lowey, who was an undergraduate at UMass Amherst when he helped drive this research, collected microscopy images of 86 samples from both the five major cultivated rice groups as well as from de-domesticated weedy rice from multiple locations, including the Iberian Peninsula, South Asia, Northeast Asia and the U.S.

What they found is that while one group of cultivated rice, temperate japonica, has almost completely lost its abscission zone, most of the other cultivated groups have maintained theirs to one degree or another. But there were differences among these abscission zones.

“Visually, we could see clear differences between the abscission zones of the rice we examined, but we could not rely on perception alone,” Lowey explains. “To put numbers to our observations, we developed a set of three novel measures to numerically quantify aspects of the abscission zones we were most interested in.”

The team further discovered that the relative length of the abscission zone—not the simple presence or absence of the tissue—was the variable that best predicted which varieties easily shatter and which are more resistant. While the cultivated groups had abscission zones that varied widely in terms of their length, leading to some differences in the degree to which they had lost shattering, each of the weedy varieties, no matter where they were gathered from, converged on a longer zone.
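The finding that relative abscission-zone length, rather than the mere presence of the tissue, predicts shattering can be illustrated with a toy calculation. The accession names echo groups mentioned in the article, but every measurement and the threshold below are invented for the example; they are not the paper's actual three measures.

```python
# Hypothetical measurements: abscission-zone (AZ) length and grain base
# width in arbitrary units, plus whether the accession shatters.
accessions = [
    ("weedy_US",           0.42, 1.0, True),
    ("weedy_Iberia",       0.40, 1.1, True),
    ("indica",             0.15, 1.0, False),
    ("temperate_japonica", 0.02, 1.0, False),
]

def relative_az_length(az_length, base_width):
    """Normalize AZ length by grain base width."""
    return az_length / base_width

# In this toy data, a single threshold on relative AZ length separates
# shattering from non-shattering accessions, mirroring the reported
# result that relative length is the best predictor.
THRESHOLD = 0.25
predictions = {
    name: relative_az_length(az, base) > THRESHOLD
    for name, az, base, shatters in accessions
}
```

The dictionary comprehension classifies each accession as shattering or not; with real data, the threshold would be fit rather than chosen by hand.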

“What this tells us,” Caicedo says, “is that each time weedy rice evolves, it ‘chooses’ a long abscission zone as the best evolutionary reproductive adaptation.” Conversely, each time rice varieties were domesticated, their abscission zones were modified differently to result in different degrees of loss of shattering.

That so many different independently evolved weedy rice groups have converged on the same adaptation—a long abscission zone and ease-of-shattering—despite 10,000 years of human efforts to maintain rice in domesticated, non-shattering form, means that there’s something about the way humans manage cultivated rice fields that inadvertently selects for these two features. It also suggests that the only way to achieve high shattering is through a long abscission zone.

This also means that our understanding of the long history of humans and rice needs updating. In particular, scientists had previously believed that cultivated rice had lost its abscission zone and that the degree of roughness at the base of a rice grain could signal if an abscission zone had been present, and thus if that rice grain had shattered or not.

However, Caicedo says that “you can’t simply look at the surface of a rice grain, either one from today or an archaeological sample dating back thousands of years, and tell whether or not it was cultivated or if it shattered, because almost all of them have some degree of abscission-zone formation.” Li adds, “An exciting aspect of our research is that it lays the foundation for examination of underlying genetics. We now know we need to figure out the genes that control the length of the abscission zones in different rice groups. Then we’ll be able to understand when and how all of these changes in the abscission zone arose, and how they have shaped the story of rice domestication and de-domestication.”

 

Contacts: Ana Caicedo, caicedo@umass.edu; Daegan Miller, drmiller@umass.edu

Hackers beware: Research shows AI can assist with cybersecurity



A Mizzou researcher and collaborators found that leading chatbots can pass certified ethical hacking exams




UNIVERSITY OF MISSOURI-COLUMBIA

IMAGE: Prasad Calyam

CREDIT: UNIVERSITY OF MISSOURI




Chatbots powered by artificial intelligence (AI) can pass a cybersecurity exam, but don’t rely on them for complete protection.

That’s the conclusion of a recent paper co-authored by University of Missouri researcher Prasad Calyam and collaborators from Amrita University in India. The team tested two leading generative AI tools — OpenAI’s ChatGPT and Google’s Bard — using a standard certified ethical hacking exam.

Certified Ethical Hackers are cybersecurity professionals who use the same tricks and tools as malicious hackers to find and fix security flaws. Ethical hacking exams measure a person’s knowledge of different types of attacks, how to protect systems and how to respond to security breaches.

ChatGPT and Bard (since renamed Gemini) are advanced AI programs called large language models. They generate human-like text using neural networks with billions of parameters, which allow them to answer questions and create content.

In the study, Calyam and his team tested the bots with standard questions from a validated certified ethical hacking exam. For example, they challenged the AI tools to explain a man-in-the-middle attack, in which a third party intercepts communication between two systems. Both were able to explain the attack and suggest security measures to prevent it.
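The interception described above can be sketched in a few lines of Python. This is a loopback-only toy, not an attack tool: the ports are ephemeral, and the "server", the "transfer" message, and the relay logic are all invented for illustration. The point is that a client connecting to the attacker's port behaves exactly as if it had reached the real server, while the attacker sees (and could alter) every byte.

```python
import socket
import threading

def start_listener(handler):
    """Bind to an ephemeral localhost port, serve one connection in a
    background thread, and return the port number."""
    sock = socket.socket()
    sock.bind(("127.0.0.1", 0))
    sock.listen(1)
    port = sock.getsockname()[1]
    def run():
        conn, _ = sock.accept()
        handler(conn)
        conn.close()
        sock.close()
    threading.Thread(target=run).start()
    return port

captured = []

def server(conn):
    # The legitimate endpoint: acknowledges whatever it receives.
    conn.sendall(b"ACK: " + conn.recv(1024))

server_port = start_listener(server)

def mitm(conn):
    # The attacker: records the plaintext, then relays it unchanged so
    # neither side notices the interception.
    msg = conn.recv(1024)
    captured.append(msg)
    with socket.create_connection(("127.0.0.1", server_port)) as upstream:
        upstream.sendall(msg)
        conn.sendall(upstream.recv(1024))

mitm_port = start_listener(mitm)

# The client believes mitm_port is the real server.
with socket.create_connection(("127.0.0.1", mitm_port)) as client:
    client.sendall(b"transfer $100")
    reply = client.recv(1024)

print(captured[0].decode())  # transfer $100  (the attacker saw the plaintext)
print(reply.decode())        # ACK: transfer $100
```

Encrypting and authenticating the channel (e.g., TLS with certificate verification) is what defeats this relay, which is why the exam questions pair the attack with its countermeasures.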

Overall, Bard slightly outperformed ChatGPT in accuracy, while ChatGPT gave responses that were more comprehensive, clear and concise, the researchers found.

“We put them through several scenarios from the exam to see how far they would go in terms of answering questions,” said Calyam, the Greg L. Gilliom Professor of Cyber Security in Electrical Engineering and Computer Science at Mizzou. “Both passed the test and had good responses that were understandable to individuals with a background in cyber defense — but they gave incorrect answers, too. And in cybersecurity, there’s no room for error. If you don’t plug all of the holes and rely on potentially harmful advice, you’re going to be attacked again. And it’s dangerous if companies think they fixed a problem but haven’t.”

Researchers also found that when the platforms were asked to confirm their responses with prompts such as “are you sure?” both systems changed their answers, often correcting previous errors. When the programs were asked for advice on how to attack a computer system, ChatGPT referenced “ethics” while Bard responded that it was not programmed to assist with that type of question.
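The "are you sure?" follow-up can be sketched as a simple re-prompting loop. The `toy_model` below is a stand-in for a real chatbot API (the actual services require network access and credentials), so every name and reply here is illustrative rather than taken from the study.

```python
def ask_with_confirmation(model, question):
    """Ask once, then challenge the answer with 'Are you sure?' and
    return both the first and the final reply.

    `model` is any callable mapping a (role, text) message history to a
    reply string, mimicking a chat-completion interface.
    """
    history = [("user", question)]
    first = model(history)
    history.append(("assistant", first))
    history.append(("user", "Are you sure?"))
    final = model(history)
    return first, final

def toy_model(history):
    # A stand-in chatbot that corrects itself when challenged,
    # mimicking the self-correction behavior the researchers observed.
    if history[-1][1] == "Are you sure?":
        return "On reflection, a MITM attack intercepts traffic between two parties."
    return "A MITM attack slows down a network."  # initial, wrong answer

first, final = ask_with_confirmation(
    toy_model, "What is a man-in-the-middle attack?"
)
```

Because `model` is just a callable, the same loop could wrap a real API client; only the stub would change.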

Calyam doesn’t believe these tools can replace human cybersecurity experts, who have the problem-solving expertise to devise robust cyber defense measures, but he says they can provide baseline information for individuals or small companies that need quick assistance.

“These AI tools can be a good starting point to investigate issues before consulting an expert,” he said. “They can also be good training tools for those working with information technology or who want to learn the basics on identifying and explaining emerging threats.”

The most promising part? The AI tools are only going to continue to improve their capabilities, he said.

“The research shows that AI models have the potential to contribute to ethical hacking, but more work is needed to fully harness their capabilities,” Calyam said. “Ultimately, if we can guarantee their accuracy as ethical hackers, we can improve overall cybersecurity measures and rely on them to help us make our digital world safer and more secure.”

The study, “ChatGPT or Bard: Who is a better Certified Ethical Hacker,” was published in the May issue of the journal Computers & Security. Co-authors were Raghu Raman and Krishnashree Achuthan.