Tuesday, April 11, 2023

Study shows families making choices that perpetuate segregation in city with school choice policy


Families returning to the city center gamed the system or left the district in ways that kept schools unequal

Peer-Reviewed Publication

UNIVERSITY OF KANSAS

LAWRENCE — Even though Brown v. Board of Education outlawed school segregation in 1954, racially segregated schools have persisted in practice. In recent years, the decades-long trend of white flight to suburbs has reversed in some areas as some white residents are moving to city centers. New research from the University of Kansas shows that in one such city that also has a school choice policy, families are making decisions that perpetuate school segregation despite more opportunities for integration.

Washington, D.C., is a major metropolitan area with a school district policy that allows parents to choose the school their children attend. Data show that white families have been returning to the city’s core in recent years, and while in theory that could lead to more integrated schools in a city with large minority populations, a new study shows that is not happening. As families regularly move among schools in pursuit of the most desirable one, they do not use their voice to try to improve schools, and some even game the lottery system that is designed to assign students equitably among schools.

Bryan Mann, assistant professor of educational leadership & policy studies at KU, was lead author of a study that interviewed nine white parents about how they chose their children’s schools. This was part of a broader study that interviewed 20 parents and triangulated findings with GIS mapping and quantitative data. This portion of the study used Exit, Voice, Loyalty, a framework common in organizational theory and political research. It examines why people choose to stay with or leave an organization. Exit is reflected by those who choose to leave, loyalty is reflected by those who stay, and voice is represented in the choice to advocate for changes and improvements to a system. The study, written with Annah Rogers of the University of West Alabama, was published in the journal Urban Education.

“When you think about community-driven school reform, you think about loyalty and how parents use their voice to change things. I was curious about the school ecosystem in D.C. that has gentrification and if it still tracks with these ideas of loyalty and voice,” Mann said. “Here people can exit a school or exit the system altogether and go to a private school. We found those who have the tools to exit or work within the system to move are more likely to move. That’s part of what exacerbated school segregation.”

The study focused on in-depth interviews with parents who recently moved into the city and whose children attended district schools. Part of a larger study focusing on gentrification and its effects on school integration and choice, it asked parents what they valued in a school and how they behaved in making their choices. Interviewees expressed logistical values such as commute time. Several said they wanted to avoid schools near downtown because heavy traffic would lengthen their commutes. However, those with the means reported making alternate plans, such as arranging flexible work schedules, if that allowed them to get their child into a desired school. Parents also said they largely valued schools that feed from one elementary into a prestigious middle school, and they largely valued bilingual curricula. Low-performing schools were described as schools to avoid, not improve. Few parents mentioned the racial makeup of schools, though those who did indicated that schools with large minority populations were to be avoided.

In terms of behaviors, parents frequently reported playing the school system’s lottery every year. The lottery is intended to distribute students evenly among schools, based on available seats and where a family lives. But parents reported entering their children each year in hopes they would get selected for the one they wanted most. They even reported gaming the system, though it is designed to prevent this, using tactics such as establishing a mailing address close to the school they wanted.

“This shows there is competition between children for spots in schools. This is different than competition between schools for children. Competition for spots undermines improvements and equality for schools,” Mann said. “Families showed that with all the ways they compete. It’s a lottery system that is supposed to be fair and ‘ungameable.’ But they still found ways to try to get the schools they viewed as prestigious.”

Parents also routinely mentioned the importance of “playing the long game.” If their child didn’t get the school they wanted one year, they would try again in each subsequent year in hopes of moving their child to a more desirable school that would then feed into a better middle school. The lottery system includes public and charter schools, and parents reported that when students did not get the school they wanted, changing schools was easier in transition grades such as pre-K/kindergarten, fifth and ninth grades. In addition to manipulating the system, others reported a willingness to move out of the city to avoid schools they didn’t want, or to send their children to private school.

“Exit is the default, and disloyalty is the norm,” Mann and Rogers wrote about the findings that showed parents rarely used their voice in an attempt to improve schools and preferred leaving schools or the system.

The strategies and choices of parents in the study are important to consider, as “the long game remains separate and unequal,” the authors wrote. Census data show that in 1954, D.C. public schools enrolled 57% Black and other minority students and 43% white students, in completely segregated schools. The Brown decision led to white flight on a large scale, and by 1990, the percentages were 96% and 4%, respectively. Gentrification from 2000 to 2019 shows that trend reversing, but not equally: the authors cite data showing that white enrollment in D.C. schools was 11.9% in the 2018-19 school year, even though the white share of the city’s population had increased to 39.6% in 2020. Racial segregation is persisting in the city’s schools, the data show.

The results should be considered as part of the national debate about school choice policy and indicate that it tends to result in continued school segregation as affluent families and those with means navigate the systems to their benefit instead of working to improve schools viewed as less prestigious. Even when parents voiced concern about other children, communities and schools, the market-based school and housing system encouraged them to advocate for improving choice mechanisms rather than improving the schools themselves.

“If the ideal is a fully integrated school system, it’s barely trending in that direction, and it is not reaching the ideal,” Mann said. “These lessons are important because policymakers across the country are debating similar ideas, and understanding the outcomes of such policies is vital. This can be an early indicator of what to expect with these types of choice policies. People don’t often talk about what happens when parents are competing for spots in schools. Here it resulted in continued segregation, and we were able to get a better understanding of how families made the decisions that led to it.”

A new technique opens the door to safer gene editing by reducing the mutation problem in gene therapy

Peer-Reviewed Publication

NAGOYA UNIVERSITY

CRISPR-Cas9 is widely used to edit genomes, allowing researchers to study genes of interest and modify disease-associated genes. However, the process is associated with side effects, including unwanted mutations and toxicity, so new technology that reduces these side effects is needed to improve its usefulness in industry and medicine. Now, researchers at Kyushu University in southern Japan and the Nagoya University School of Medicine in central Japan have developed an optimized genome-editing method that vastly reduces mutations, opening the door to more effective treatment of genetic diseases with fewer unwanted mutations. Their findings were published in Nature Biomedical Engineering.

Genome-editing technology centered on CRISPR-Cas9 has revolutionized the food and medicine industries. In the technology, Cas9 nuclease, an enzyme that cuts DNA, is introduced into the cell with a synthetic guide RNA (gRNA) that guides the enzyme to the required location. By cutting the genome, unwanted genes can be deleted, and new (functional) genes can be added in easily and quickly.  

One of the drawbacks of genome editing is that there are growing concerns about mutations and off-target effects. This is often caused by the enzyme targeting genomic sites that have a sequence similar to the target site. Similarly, mutations at the chromosome level can occur when genes are altered, which has hindered clinical trials of gene therapy for cancer and even resulted in the deaths of patients undergoing treatment for muscular dystrophy. The group hypothesized that current editing protocols that use Cas9 cause excessive DNA cleavage, resulting in some of the mutations. 

To test this hypothesis, a group consisting of Assistant Professor Masaki Kawamata at Kyushu University and Professor Hiroshi Suzuki at the Nagoya University Graduate School of Medicine constructed a system called "AIMS" in mouse cells, which evaluated the activity of Cas9 separately for each chromosome. Their results showed that the commonly used method was associated with very high editing activity. They determined that this high activity was causing some of the unwanted side effects, so they searched for gRNA modification methods that could suppress it. They found that an extra cytosine extension to the 5' end of the gRNA was effective as a "safeguard" against the overactivity and allowed control over DNA cleavage. They called this fine-tuning system "safeguard gRNA" ([C]gRNA).

Their results were striking. Using their new technique, off-target effects and cytotoxicity were reduced, the efficiency of single-allele selective editing was increased, and the efficiency of homology-directed repair, the most commonly employed mechanism for DNA double-strand break repair, was enhanced.  

To test its effectiveness in a medical setting, they investigated a rare disease called fibrodysplasia ossificans progressiva. Using a mouse model, they were able to create the same genotype as the human version of the disease. Then, using patient-derived iPS cells, they were able to precisely repair damage down to a single nucleotide specifically in the disease-associated allele causing the disease, demonstrating their technique’s usefulness as a safe and efficient gene therapy method. 

The team also constructed the first mathematical model of the correlation between various genome-editing patterns and Cas9 activity, which would enable the user to simulate the results of genome editing in an entire cell population. This breakthrough would allow researchers to determine the Cas9 activity that maximizes efficiency, reducing the enormous costs and labor required. 

“We established a new genome editing platform that can maximize the desired editing efficiency by developing activity-regulating [C]gRNAs with appropriate Cas9 activity. Furthermore, we found that ‘safeguard gRNA’ can be applied to various CRISPR tools that require gRNAs by regulating their activities, such as those using Cas12a, which has a different DNA cleavage mechanism,” said Professor Suzuki. “For techniques that use Cas9 to activate or repress genes of interest, such as CRISPR activation and CRISPR interference, excessive induction or suppression of gene expression may be not useful and even harmful to cells. Controlling expression levels by [C]gRNA is an important technology that can be used for various applications, including the implementation of precise gene therapy.” 

The group is now working on a start-up business plan to spread the new genome editing platform. “In particular, we believe that this technology can make a significant contribution to the medical field,” said Dr. Kawamata. “We are currently evaluating its therapeutic efficacy and safety for selected target diseases in cell and animal experiments and using it to help develop therapeutic drugs and gene therapy methods, especially for rare diseases for which no treatment methods have yet been established.” 

-------------------------------------------------------------------------------------------------------------------------- 

About Nagoya University

Nagoya University was formally instituted as a National University in 1939. The university has been pursuing excellence since its founding. A number of important discoveries have been made at the university, including Okazaki fragments, by Reiji and Tsuneko Okazaki, and depletion forces, by Sho Asakura and Fumio Oosawa.

About Kyushu University  

Kyushu University has been one of Japan’s leading research-oriented institutes of higher education since its founding in 1911. Home to around 19,000 students and 8,000 faculty and staff, Kyushu U's world-class research centers cover a wide range of study areas and research fields, from the humanities and arts to engineering and medical sciences. Its multiple campuses—including one of the largest in Japan—are located around Fukuoka City, a coastal metropolis on the southwestern Japanese island of Kyushu that is frequently ranked among the world’s most livable cities and historically known as Japan’s gateway to Asia. Through its Vision 2030, Kyushu U will ‘Drive Social Change with Integrative Knowledge.’ Its synergistic application of knowledge will encompass all of academia and solve issues in society while innovating new systems for a better future.


Medicaid ‘cliff’ adds to racial and ethnic disparities in care for near-poor seniors

Peer-Reviewed Publication

UNIVERSITY OF PITTSBURGH


IMAGE: Eric Roberts, Ph.D. (Credit: University of Pittsburgh)

PITTSBURGH, April 10, 2023 – Black and Hispanic older adults whose annual income is slightly above the federal poverty level are more likely than their white peers to face cost-related barriers to accessing health care and filling medications for chronic conditions, according to new research led by a University of Pittsburgh School of Public Health scientist. 

Published today in JAMA Internal Medicine, the analysis links these disparities to a Medicaid “cliff” – an abrupt end to supplemental Medicaid insurance when people clear the federal poverty threshold. For older adults, Medicaid helps to lower out-of-pocket health care costs by covering most Medicare co-pays. However, a cutoff in Medicaid eligibility above the poverty threshold leads to a sudden rise in copays, making it harder for Black and Hispanic older adults with modest incomes to afford care, compared to their white counterparts at similar income levels.  

“Chronic disease risks among older adults of color often go unaddressed due to cost-related barriers to care, and our research shows that this Medicaid cliff contributes to these barriers,” said Eric T. Roberts, Ph.D., assistant professor of health policy and management at Pitt Public Health. “Fixing this so that people on Medicare don’t face substantially higher co-pays above the poverty threshold could lessen health care inequities among our nation’s seniors.” 

“One option is to turn the ‘cliff’ into a ‘gentle hill,’ by tapering Medicaid assistance for seniors with incomes slightly above the federal poverty threshold,” Roberts added. 

Medicare, the federal health care insurance program for people age 65 and older and disabled individuals, can come with high cost-sharing, including deductibles and copays. Medicare beneficiaries with income less than 100% of the federal poverty threshold, about $14,600 annually, receive supplemental Medicaid insurance to offset these costs and automatically receive a subsidy to lower out-of-pocket prescription drug costs. However, unlike other federal programs that taper aid on a sliding scale, such as the Supplemental Nutrition Assistance Program that helps people buy food, those with incomes even slightly above the poverty line are cut off from Medicaid.  
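The difference between the current all-or-nothing cutoff and the sliding-scale alternative Roberts describes can be sketched numerically. The following is an illustrative model only: the poverty threshold is the approximate figure cited above, while the subsidy value and the 150%-of-poverty taper ceiling are hypothetical numbers chosen for the example, not actual policy parameters.

```python
# Illustrative sketch: Medicaid "cliff" vs. a tapered ("gentle hill") subsidy.
# All dollar figures besides the poverty threshold are hypothetical.

POVERTY_LINE = 14_600   # approx. annual federal poverty threshold cited above
FULL_SUBSIDY = 3_000    # assumed annual value of Medicaid cost-sharing help
TAPER_CEILING = 1.5     # hypothetical: phase out aid up to 150% of poverty

def cliff_subsidy(income: float) -> float:
    """Current rule: full assistance at or below the line, nothing above it."""
    return FULL_SUBSIDY if income <= POVERTY_LINE else 0.0

def tapered_subsidy(income: float) -> float:
    """Alternative: assistance phases out linearly between 100% and 150% of poverty."""
    if income <= POVERTY_LINE:
        return FULL_SUBSIDY
    ceiling = POVERTY_LINE * TAPER_CEILING
    if income >= ceiling:
        return 0.0
    return FULL_SUBSIDY * (ceiling - income) / (ceiling - POVERTY_LINE)

for income in (14_000, 15_000, 18_000, 22_000):
    print(income, cliff_subsidy(income), round(tapered_subsidy(income)))
```

Under the cliff, a beneficiary earning a few hundred dollars over the threshold loses the entire subsidy at once; under the taper, the same beneficiary keeps most of it, losing assistance gradually as income rises.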

Roberts and his colleagues obtained data on 8,144 Medicare beneficiaries with incomes less than 200% of the federal poverty level. They looked at health care use for beneficiaries on either side of the Medicaid eligibility line, comparing outcomes among Black and Hispanic beneficiaries to white beneficiaries.  

Being just above the federal poverty threshold, and therefore ineligible for Medicaid, was associated with a 21% drop in annual outpatient visits and a 15% drop in prescriptions filled for Black and Hispanic Medicare beneficiaries, but barely any change for white beneficiaries.  

“We found — and other research supports — that white beneficiaries are more likely to have savings to draw upon to cover medical costs,” Roberts said. “The income that the federal government looks at to determine Medicaid eligibility may make it appear that Black and Hispanic beneficiaries have the same ability as their white peers to pay for care. But the reality is that they don’t have the same reserves — and we’re seeing the impact of that in their forgoing doctor’s visits and needed medications.” 

Black and Hispanic Medicare beneficiaries are more likely to have chronic diseases that can be managed with medications, such as diabetes and heart disease, than white beneficiaries.  

“The Medicaid ‘cliff’ is doubly concerning because it results in Black and Hispanic adults filling fewer medications and keeping fewer regular doctor’s appointments, and missed appointments make it harder to get prescription medications in the first place,” Roberts said.

The findings support broadening Medicaid eligibility for older adults and tapering Medicaid assistance above the poverty threshold. Such changes could complement measures in the Inflation Reduction Act designed to make drug costs more affordable for seniors, the researchers noted. 

Additional authors of this study are Youngmin Kwon, B.A., and Alexandra G. Hames, B.A., both of Pitt; J. Michael McWilliams, M.D., Ph.D., of Harvard University and Brigham and Women’s Hospital; and John Z. Ayanian, M.D., M.P.P., and Renuka Tipirneni, M.D., M.Sc., both of the University of Michigan.  




Potential drug treats fatty liver disease in animal models, brings hope for first human treatment

Nonalcoholic steatohepatitis, or NASH, is a severe liver disease and impacts millions of people


Peer-Reviewed Publication

MICHIGAN MEDICINE - UNIVERSITY OF MICHIGAN

A recently developed amino acid compound successfully treats nonalcoholic fatty liver disease in non-human primates — bringing scientists one step closer to the first human treatment for the condition that is rapidly increasing around the world, a study suggests.

Researchers at Michigan Medicine developed DT-109, a glycine-based tripeptide, to treat the severe form of fatty liver disease called nonalcoholic steatohepatitis. More commonly known as NASH, the disease causes scarring and inflammation in the liver and is estimated to affect up to 6.5% of the global population.

Results reveal that DT-109 reversed fat buildup and prevented scarring in the livers of both mice and primates that had developed NASH. The study, completed in partnership with an international team including the Laboratory Animal Center at Xi’an Jiaotong University Health Science Center and the Institute of Cardiovascular Sciences at Peking University Health Science Center, is published in Cell Metabolism.

“For years, scientists have been trying to develop a medication that treats NASH, but many attempts have failed to show an improvement or have raised safety concerns in clinical trials,” said Eugene Chen, M.D., Ph.D., senior author of the study and Frederick G. L. Huetwell Professor of Cardiovascular Medicine at University of Michigan Medical School. “NASH is rising at a staggering rate, and successful treatment of non-human primates with our drug candidate, DT-109, brings us closer than ever to treating the millions of people suffering from this condition.”

NASH is the second stage of nonalcoholic fatty liver disease, which is estimated to affect 32% of people worldwide. While fatty liver disease can be treated with exercise and nutritional intervention, the liver damage from NASH is more permanent. It has become the primary cause of chronic liver disease, and NASH-related cirrhosis is now one of the most common reasons for liver transplantation.

Chen and his team developed DT-109 to treat NASH in non-human primates after reports identified impaired glycine metabolism as a cause of nonalcoholic fatty liver disease and NASH.

While hundreds of compounds, including DT-109, have successfully treated NASH in mice, Chen says mouse NASH models are limited because they do not accurately mimic all aspects of the human disease and therefore do not translate easily to the clinic. The research team’s non-human primate model for NASH, confirmed using multiomics profiling studies, is among the first to accomplish the feat.

In both non-human primates and mice, investigators in the international collaboration found that treatment with DT-109 reverses fat buildup and prevents fibrosis progression by stimulating fatty acid degradation and antioxidant formation. The drug also inhibited the production of lithocholic acid, a toxic secondary bile acid closely linked to nonalcoholic fatty liver disease.

“With this significant breakthrough in preclinical models, we can now consider evaluating DT-109 as a potential drug candidate for the treatment of NASH in future clinical trials,” said Jifeng Zhang, Ph.D., co-corresponding author and research associate professor of cardiovascular medicine at Michigan Medicine. “With millions of people suffering from NASH, the need for an effective treatment is more pressing than ever.”

Additional authors include Oren Rom, Ying Zhao, Chao Xue, Yang Zhao, Bo Wen, Duxin Sun, Jiandie Lin, all of University of Michigan, Pengxiang Qu, Linying Jia, Wenbin Cao, Jinpeng Zhao, Liang Bai, Sihai Zhao, Enqi Liu, all of Xi’an Jiaotong University Health Science Center, Ke Li, Shusi Ding, Beijing Tiantan Hospital, Mingming Zhao, Huiqing Wang, Lemin Zheng, all of Peking University, Xiaojing Gao, Chengshuang Chu, Rong Zeng, all of Shanghai Institute of Biochemistry and Cell Biology,  Zhipeng Lui, Purdue University,  Shuangshuang Chen  and Xuelian Xiong, both of  Fudan University,  Alexandra C. Finney, Louisiana State University Health Sciences Center-Shreveport, Zuowen Zheng, Spring Biological Technology Development Co., Wanqing Liu, Wayne State University.

Disclosure: Chen is an inventor of the compound DT-109. The University of Michigan has patented it and licensed it to Diapin Therapeutics. Chen and the university have an ownership interest in Diapin. Diapin provided DT-109 for this study. The company is further developing the compound.

All procedures performed in mice were approved by the Institutional Animal Care and Use Committee at the University of Michigan and performed in accordance with the institutional guidelines. All experimental protocols involving non-human primates were approved by the Laboratory Animal Care Committee of Xi'an Jiaotong University (approval number: 20191278) and the Institutional Animal Care and Use Committee of Spring Biological Technology Development Co., Ltd. (approval number: 201901). The study was performed in accordance with the National Institutes of Health Guide for the Care and Use of Laboratory Animals.

Paper cited: “DT-109 Ameliorates Nonalcoholic Steatohepatitis in Non-human Primates,” Cell Metabolism. DOI: 10.1016/j.cmet.2023.03.013

Scientists show how we can anticipate rather than react to extinction in mammals

Peer-Reviewed Publication

CELL PRESS

IMAGE: Concentrations of terrestrial mammal species with multiple future risk factors (Credit: Current Biology/Cardillo et al.)

Most conservation efforts are reactive. Typically, a species must reach threatened status before action is taken to prevent extinction, such as establishing protected areas. A new study published in the journal Current Biology on April 10 shows that we can use existing conservation data to predict which currently unthreatened species could become threatened and take proactive action to prevent their decline before it is too late.

“Conservation funding is really limited,” says lead author Marcel Cardillo (@MarcelCardillo) of Australian National University. “Ideally, what we need is some way of anticipating species that may not be threatened at the moment but have a high chance of becoming threatened in the future. Prevention is better than cure.”

To predict “over-the-horizon” extinction risk, Cardillo and colleagues looked at three aspects of global change—climate change, human population growth, and the rate of change in land use—together with intrinsic biological features that could make some species more vulnerable. The team predicts that up to 20% of land mammals will have a combination of two or more of these risk factors by the year 2100.

“Globally, the percentage of terrestrial mammal species that our models predict will have at least one of the four future risk factors by 2100 ranges from 40% under a middle-of-the-road emissions scenario with broad species dispersal to 58% under a fossil-fueled development scenario with no dispersal,” say the authors.

“There’s a congruence of multiple future risk factors in Sub-Saharan Africa and southeastern Australia: climate change (which is expected to be particularly severe in Africa), human population growth, and changes in land use,” says Cardillo. “And there are a lot of large mammal species that are likely to be more sensitive to these things. It’s pretty much the perfect storm.”

Larger mammals in particular, like elephants, rhinos, giraffes, and kangaroos, are often more susceptible to population decline since their reproductive patterns influence how quickly their populations can bounce back from disturbances. Compared to smaller mammals, such as rodents, which reproduce quickly and in larger numbers, bigger mammals, such as elephants, have long gestational periods and produce fewer offspring at a time.

“Traditionally, conservation has relied heavily on declaring protected areas,” says Cardillo. “The basic idea is that you remove or mitigate what is causing the species to become threatened.”

“But increasingly, it's being recognized that that's very much a Western view of conservation because it dictates separating people from nature,” says Cardillo. “It's a sort of view of nature where humans don't play a role, and that's something that doesn't sit well with a lot of cultures in many parts of the world.”

In preventing animal extinction, the researchers say we must also be aware of how conservation impacts Indigenous communities. Sub-Saharan Africa is home to many Indigenous populations, and Western ideas of conservation, although well-intended, may have negative impacts.

Australia has already begun tackling this issue by establishing Indigenous Protected Areas (IPAs), which are owned by Indigenous peoples and operate with the help of rangers from local communities. Collaboration between governments and private landowners outside these protected areas has also established regions where humans and animals can coexist.

“There’s an important part to play for broad-scale modeling studies because they can provide a broad framework and context for planning,” says Cardillo. “But science is only a very small part of the mix. We hope our model acts as a catalyst for bringing about some kind of change in the outlook for conservation.”

###

Current Biology, Cardillo et al. “Priorities for conserving the world’s terrestrial mammals based on over-the-horizon extinction risk” https://www.cell.com/current-biology/fulltext/S0960-9822(23)00236-1

Current Biology (@CurrentBiology), published by Cell Press, is a bimonthly journal that features papers across all areas of biology. Current Biology strives to foster communication across fields of biology, both by publishing important findings of general interest and through highly accessible front matter for non-specialists. Visit http://www.cell.com/current-biology. To receive Cell Press media alerts, contact press@cell.com.

This elephant’s self-taught banana peeling offers glimpse of elephants’ broader abilities

Peer-Reviewed Publication

CELL PRESS

VIDEO: Pha peeling bananas (Credit: Current Biology/Kaufmann et al.)

Elephants like to eat bananas, but they don’t usually peel them first in the way humans do. A new report in the journal Current Biology on April 10, however, shows that one very special Asian elephant named Pang Pha picked up banana peeling all on her own while living at the Berlin Zoo. She reserves it for yellow-brown bananas, first breaking the banana before shaking out and collecting the pulp, leaving the thick peel behind.

The female elephant most likely learned the unusual peeling behavior from watching her caretakers peel bananas for her, the study authors report. The findings in a single elephant suggest that elephants more broadly have special cognitive and manipulative abilities, they say.

“We discovered a very unique behavior,” said Michael Brecht (@BrechtLab) of Humboldt-Universität zu Berlin’s Bernstein Center for Computational Neuroscience. “What makes Pang Pha's banana peeling so unique is a combination of factors—skillfulness, speed, individuality, and the putatively human origin—rather than a single behavioral element.”

Like other elephants, Pha eats green or yellow bananas whole. She rejects brown bananas outright. But when it comes to yellow bananas spotted with brown—the kind one might reserve for banana bread—she peels them before eating.

Brecht and colleagues including Lena Kaufmann (@lena_v_kaufmann), also at Humboldt-Universität zu Berlin, and Andreas Ochs, Berlin Zoological Garden, made the discovery after learning from Pha’s caretakers about her unusual banana-peeling talent. At first, they were confused. They brought Pha nice yellow and green bananas, and she never peeled them.

“It was only when we understood that she peels only yellow-brown bananas that our project took off,” Brecht said.

When yellow-brown bananas are offered to a group of elephants, Pha changes her behavior, they report. She eats as many bananas as she can whole and then saves the last one to peel later.

Banana-peeling appears to be rare in elephants as far as anyone knows, and none of the other Berlin elephants engage in peeling. It’s not clear why Pha peels them. The researchers note, however, that she was hand raised by human caretakers in the Berlin Zoo. They never taught her to peel bananas, but they did feed her peeled bananas.

Based on this, the researchers suggest she acquired peeling through observational learning from humans. Earlier reports on African elephants suggest elephants can interpret human pointing gestures and classify people into ethnic groups, but complex human-derived manipulation behaviors, like banana-peeling, appear rather unique, according to the researchers. The findings in Pha nevertheless suggest that elephants overall have surprising cognitive abilities and impressive manipulative skill.

“Elephants have truly remarkable trunk skills, and their behavior is shaped by experience,” says Brecht.

The researchers find it surprising that Pha alone picked up on banana peeling. It leads them to wonder if such habits are normally passed on through elephant families. They’re now looking into other sophisticated trunk behaviors, such as tool use.

####

Current Biology, Kaufmann et al. “Elephant banana peeling” https://www.cell.com/current-biology/fulltext/S0960-9822(23)00266-X

Current Biology (@CurrentBiology), published by Cell Press, is a bimonthly journal that features papers across all areas of biology. Current Biology strives to foster communication across fields of biology, both by publishing important findings of general interest and through highly accessible front matter for non-specialists. Visit http://www.cell.com/current-biology. To receive Cell Press media alerts, contact press@cell.com.

Navigating the cosmos with Georgia State’s CHARA Array


New instruments and plans for a seventh telescope at Georgia State’s CHARA Array will allow scientists to see the stars in greater detail than ever before.

Reports and Proceedings

GEORGIA STATE UNIVERSITY

IMAGE: Georgia State’s CHARA Array is an optical interferometer located on Mount Wilson, California.

CREDIT: Georgia State University

ATLANTA—Plans are underway to add a seventh movable telescope to Georgia State University’s Center for High Angular Resolution Astronomy, known as the CHARA Array, which would increase the resolution, or the ability to see small objects, by a factor of three.

Located at Mount Wilson Observatory in Southern California, where the Georgia State-operated array sits, the new telescope will be connected by fiber optics that transport the starlight, a technique that will serve as a pathfinder for future expansion of the Array. The update comes after a group of international scientists gathered in Atlanta to take part in the 2023 CHARA Science Meeting to share the latest developments in high-resolution astronomical imaging using the CHARA Array.

“Adding a seventh moveable telescope to the Array represents a great leap forward in stellar astronomy,” says Doug Gies, Regents’ Professor of Physics and Astronomy and director of the center. “Collaboration is truly fundamental for an undertaking like the CHARA Array. With scientists all over the world using our telescopes, this annual gathering is an important forum for us to share our latest discoveries.”

The CHARA Array combines the light from six optical telescopes spread across the mountaintop to image stars with a spatial resolution equivalent to a single telescope 331 meters (over 1000 ft) in diameter. The visible and infrared observatory offers astronomers the opportunity to capture images of space with better resolution than any other telescope in the world.
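The diffraction-limited resolution of an interferometer scales as the observing wavelength divided by the baseline between telescopes, which is why a 331-meter array outresolves any single-mirror telescope. A minimal sketch of that relationship, using an assumed 1.6-micron near-infrared wavelength for illustration (the specific wavelength is not stated in the release):

```python
import math

def angular_resolution_mas(wavelength_m: float, baseline_m: float) -> float:
    """Diffraction-limited resolution, theta ~ lambda / B, in milliarcseconds."""
    theta_rad = wavelength_m / baseline_m
    return theta_rad * math.degrees(1) * 3600 * 1000  # radians -> milliarcseconds

# Illustrative numbers: 1.6-micron (near-infrared) light on CHARA's
# 331-meter maximum baseline gives roughly 1 milliarcsecond resolution.
print(round(angular_resolution_mas(1.6e-6, 331), 2))

# A factor-of-three resolution gain, as planned with the seventh telescope,
# corresponds to tripling the longest baseline:
print(round(angular_resolution_mas(1.6e-6, 3 * 331), 2))
```

The same formula explains the release's comparison: six small telescopes spread across the mountaintop resolve detail equivalent to a single 331-meter mirror, even though they collect far less light.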

More than 40 members of the CHARA Consortium, which represents 10 institutions around the world, took part in the annual review of the latest scientific and technical progress.

Scientists gathered at Georgia State University in March 2023 for the CHARA Science Meeting and Imaging Workshop.

CHARA features a new suite of instruments built by partner institutions at the University of Michigan, University of Exeter, and Observatoire de la Côte d’Azur in France. This next generation of instrumentation provides unprecedented capabilities to image the surfaces of stars and their circumstellar environments at a variety of different wavelengths from the near-infrared to the visible part of the spectrum. Georgia State University is also building a new instrument that will increase the sensitivity of the CHARA Array to measure light 30 times fainter than possible now. This improvement will help astronomers probe the gas clouds swirling around supermassive black holes in very distant active galaxies.

With funding from the National Science Foundation (NSF), CHARA has expanded its user base over the last six years by offering open access time to the global community of astronomers through a competitive proposal process offered through the National Optical-Infrared Astronomy Research Laboratory. In addition to over 60 active observers at Georgia State University and partner institutions, the open-access program has received applications from over 350 visiting astronomers around the world.

“Expanding the user community brings new opportunities for innovative science projects that broaden the impact and productivity of the CHARA Array,’’ says Gail Schaefer, Director of the CHARA Array.

At the recent meeting, members presented some science highlights and findings from the CHARA Array.

  • Georgia State graduate student Katherine Shepard presented results on a sample of evolved massive binary star systems surrounded by outflowing disks. The disks in these fascinating systems form as one star in the system grows in size as it evolves and material from that star is transferred to the companion. Some of the mass escapes into a disk that surrounds the system. Katherine is using the CHARA Array to resolve the structure of these disks and search for interactions between the disk and the inner binary system.
  • Noura Ibrahim, a graduate student from the University of Michigan, imaged the ring-like structure of a circumstellar disk around the young star V1295 Aquila. Two images taken one month apart show a bright spot in the ring that rotates between the two epochs. This variation could be caused by a stellar companion, an exoplanet in formation, or asymmetries in the density distribution.
  • Visiting astronomer Willie Torres of the Harvard-Smithsonian Center for Astrophysics mapped the orbits in the Castor multiple star system. The system consists of Castor A and B, which revolve around each other every 450 years; each component is in turn a short-period binary with a period of a few days. They are joined by a more distant component, Castor C, which is also a binary. Torres used the CHARA Array to resolve the close, faint companions in Castor A and B for the first time. He combined these observations with historical observations spanning the past three centuries to map the orbits of the stars in the Castor system and measure their stellar masses to a precision better than 1%. The CHARA observations were also used to measure the radii of the two brightest stars to infer an age for the system of 290 million years.
  • Rachael Roettenbacher, a Postdoctoral Associate from the University of Michigan, presented recent work on mapping starspots over a rotation cycle for the sun-like star Epsilon Eridani, which is orbited by an exoplanet. The starspot images, in combination with data from other telescopes, were used to develop a technique to distinguish between small changes in the stellar spectrum caused by starspots and those caused by the orbiting planet. These techniques will improve the detection of planets around other stars.

The annual meeting was followed by a workshop on imaging and modeling of interferometric observations. Participants were given an overview of modeling and imaging software packages available to analyze data from stellar interferometers (arrays of telescopes that combine light together), and the workshop included interactive hands-on sessions where participants used the software tools to analyze data. Participants also brought their own data for review in order to get the most from observations made with the CHARA Array.


Scientists map gusty winds in a far-off neutron star system

The 2D map of this “disk wind” may reveal clues to galaxy formation.

Peer-Reviewed Publication

MASSACHUSETTS INSTITUTE OF TECHNOLOGY

IMAGE: MIT astronomers mapped the “disk winds” associated with the accretion disk around Hercules X-1, a system in which a neutron star is drawing material away from a sun-like star, represented as the teal sphere. The findings may offer clues to how supermassive black holes shape entire galaxies.

CREDIT: Jose-Luis Olivares, MIT. Based on an image of Hercules X-1 by D. Klochkov, European Space Agency.

An accretion disk is a colossal whirlpool of gas and dust that gathers around a black hole or a neutron star like cotton candy as it pulls in material from a nearby star. As the disk spins, it whips up powerful winds that push and pull on the sprawling, rotating plasma. These massive outflows can affect the surroundings of black holes by heating and blowing away the gas and dust around them.

At immense scales, “disk winds” can offer clues to how supermassive black holes shape entire galaxies. Astronomers have observed signs of disk winds in many systems, including accreting black holes and neutron stars. But to date, they’ve only ever glimpsed a very narrow view of this phenomenon.

Now, MIT astronomers have observed a wider swath of winds, in Hercules X-1, a system in which a neutron star is drawing material away from a sun-like star. This neutron star’s accretion disk is unique in that it wobbles, or “precesses,” as it rotates. By taking advantage of this wobble, the astronomers have captured varying perspectives of the rotating disk and created a two-dimensional map of its winds, for the first time.

The new map reveals the wind’s vertical shape and structure, as well as its velocity: hundreds of kilometers per second, or about a million miles per hour, on the milder end of what accretion disks can spin up.

If astronomers can spot more wobbling systems in the future, the team’s mapping technique could help determine how disk winds influence the formation and evolution of stellar systems, and even entire galaxies. 

“In the future, we could map disk winds in a range of objects and determine how wind properties change, for instance, with the mass of a black hole, or with how much material it is accreting,” says Peter Kosec, a postdoc in MIT’s Kavli Institute for Astrophysics and Space Research. “That will help determine how black holes and neutron stars influence our universe.”

Kosec is the lead author of a study appearing in Nature Astronomy. His MIT co-authors include Erin Kara, Daniele Rogantini, and Claude Canizares, along with collaborators from multiple institutions, including the Institute of Astronomy in Cambridge, U.K.

Fixed sight

Disk winds have most often been observed in X-ray binaries — systems in which a black hole or a neutron star is pulling material from a less dense object and generating a white-hot disk of inspiraling matter, along with outflowing wind. Exactly how winds are launched from these systems is unclear. Some theories propose that magnetic fields could shred the disk and expel some of the material outward as wind. Others posit that the neutron star’s radiation could heat and evaporate the disk’s surface in white-hot gusts.  

Clues to a wind’s origins may be deduced from its structure, but the shape and extent of disk winds have been difficult to resolve. Most binaries produce accretion disks that are relatively even in shape, like thin donuts of gas that spin in a single plane. Astronomers who study these disks from far-off satellites or telescopes can only observe the effects of disk winds within a fixed and narrow range, relative to their rotating disk. Any wind that astronomers manage to detect is therefore a small sliver of its larger structure.

“We can only probe the wind properties at a single point, and we’re completely blind to everything around that point,” Kosec notes.

In 2020, he and his colleagues realized that one binary system could offer a wider view of disk winds. Hercules X-1 has stood out from most known X-ray binaries for its warped accretion disk, which wobbles as it rotates around the system’s central neutron star. 

“The disk is really wobbling over time every 35 days, and the winds are originating somewhere in the disk and crossing our line of sight at different heights above the disk with time,” Kosec explains. “That’s a very unique property of this system which allows us to better understand its vertical wind properties.”

A warped wobble

In the new study, the researchers observed Hercules X-1 using two X-ray telescopes — the European Space Agency’s XMM-Newton and NASA’s Chandra X-ray Observatory.

“What we measure is an X-ray spectrum, which means the amount of X-ray photons that arrive at our detectors, versus their energy. We measure the absorption lines, or the lack of X-ray light at very specific energies,” Kosec says. “From the ratio of how strong the different lines are, we can determine the temperature, velocity, and the amount of plasma within the disk wind.”
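The velocity Kosec describes comes from the Doppler shift of those absorption lines: a line blueshifted to slightly higher energy than its laboratory value implies gas flowing toward us. A minimal sketch of that estimate, using an assumed iron-line rest energy purely for illustration (the release does not give the actual lines or energies measured):

```python
C_KM_S = 299_792.458  # speed of light in km/s

def outflow_velocity_km_s(rest_energy_kev: float, observed_energy_kev: float) -> float:
    """Non-relativistic Doppler estimate: v ~ c * (E_obs - E_rest) / E_rest.
    A blueshifted (higher-energy) absorption line implies outflowing gas."""
    return C_KM_S * (observed_energy_kev - rest_energy_kev) / rest_energy_kev

# Illustrative (not from the paper): an iron absorption line with an assumed
# 6.70 keV rest energy observed at 6.71 keV corresponds to an outflow of a
# few hundred km/s, the scale of the wind speeds reported for Hercules X-1.
print(round(outflow_velocity_km_s(6.70, 6.71)))
```

Temperature and density follow from the relative strengths of lines from different ions, which is why the ratio of line depths, not just their positions, carries information about the plasma.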

With Hercules X-1’s warped disk, astronomers were able to see the line of the disk moving up and down as it wobbled and rotated, similar to the way a warped record appears to oscillate when viewed edge-on. The effect was such that the researchers could observe signs of disk winds at changing heights with respect to the disk, rather than at a single, fixed height above a uniformly rotating disk.

By measuring X-ray emissions and the absorption lines as the disk wobbled and rotated over time, the researchers could scan properties such as the temperature and density of winds at various heights with respect to its disk and construct a two-dimensional map of the wind’s vertical structure. 

“What we see is that the wind rises from the disk, at an angle of about 12 degrees with respect to the disk as it expands in space,” Kosec says. “It’s also getting colder and more clumpy, and weaker at greater heights above the disk.” 

The team plans to compare their observations with theoretical simulations of various wind-launching mechanisms, to see which could best explain the wind’s origins. Further out, they hope to discover more warped and wobbling systems, and map their disk wind structures. Then, scientists could have a broader view of disk winds, and how such outflows influence their surroundings — particularly at much larger scales. 

“How do supermassive black holes affect the shape and structure of galaxies?” poses Erin Kara, the Class of 1958 Career Development Assistant Professor of Physics at MIT. “One of the leading hypotheses is that disk winds, launched from a black hole, can affect how galaxies look. Now we can get a more detailed picture of how these winds are launched, and what they look like.”

This research was supported in part by NASA.

###

Written by Jennifer Chu, MIT News Office