Monday, July 17, 2023

World-first clinical trial to help millions with penicillin allergies


Peer-Reviewed Publication

VANDERBILT UNIVERSITY MEDICAL CENTER

Dr. Elizabeth Phillips 

IMAGE: VANDERBILT UNIVERSITY MEDICAL CENTER PRINCIPAL INVESTIGATOR ELIZABETH PHILLIPS, MD, THE JOHN OATES PROFESSOR OF CLINICAL RESEARCH.

CREDIT: VANDERBILT UNIVERSITY MEDICAL CENTER



Penicillin allergy affects more than 25 million people in the United States (up to 1 in 10 Americans) and has been shown to lead to particularly poor health outcomes in pregnant women and surgical patients. It is also a public health threat, leading to antibiotic resistance and infections in hospitalized patients that can be life threatening.

Seventy-five percent or more of penicillin allergy labels are acquired by age 3, often due to confusion with a viral rash. The majority of these rashes were never allergic, but the labels "stick" into adulthood and carry many adverse consequences.

 

Many low-risk patients with a penicillin allergy were able to have their penicillin allergy label removed through a simple procedure known as "direct oral challenge" as part of a world-first multicenter randomized controlled trial known as the Penicillin Allergy Clinical Decision Rule (PALACE) study.

 

In the PALACE study, investigators randomized low-risk penicillin-allergic patients to one of two approaches to remove their allergy label. They either underwent the current standard of care, skin testing followed, if negative, by an oral challenge with a penicillin, or went straight to oral challenge ("direct oral challenge") without preceding skin testing.

 

“The majority of patients labeled as penicillin allergic, more than 90%, have low-risk histories, meaning they did not have a history to suggest a severe or more recent reaction to a penicillin,” said PALACE study protocol member and Vanderbilt University Medical Center principal investigator Elizabeth Phillips, MD, the John Oates Professor of Clinical Research. “We would expect more than 95% of these patients to have negative testing and be able to take penicillin in the future.”

 

The study, undertaken by a team of researchers from specialized centers in North America and Australia, enrolled 382 adults who were assessed using a specialized risk assessment tool called PEN-FAST. Participants were randomly assigned to receive either a direct oral penicillin challenge or the standard approach (penicillin skin testing followed by an oral challenge). The primary goal was to determine whether the direct oral penicillin challenge was no worse than the standard method of skin testing followed by oral challenge, which needs to be performed in an allergist's office.

 

Only one patient (0.5%) in each group experienced a positive reaction to the penicillin challenge, demonstrating that the direct oral penicillin challenge performs just as well as the standard method. Importantly, there were no significant differences in adverse events between the two groups, and no serious adverse events were reported.
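To make the noninferiority logic concrete, here is a minimal numerical sketch. It assumes an even split of the 382 participants (191 per arm), one positive challenge per arm, a hypothetical 5-percentage-point noninferiority margin, and a simple Wald confidence interval; the study's actual statistical analysis is not described here and may differ.

```python
from math import sqrt

def risk_difference_ci(events_a, n_a, events_b, n_b, margin, z=1.96):
    """Risk difference between two arms with a Wald 95% confidence interval.

    The comparison is noninferior when the upper CI bound stays below `margin`.
    """
    p_a = events_a / n_a  # direct oral challenge arm
    p_b = events_b / n_b  # skin-test-first (standard of care) arm
    diff = p_a - p_b
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    lo, hi = diff - z * se, diff + z * se
    return diff, lo, hi, hi < margin

# Assumed figures: 191 participants per arm, one reaction in each arm,
# and a hypothetical 5-percentage-point margin.
diff, lo, hi, noninferior = risk_difference_ci(1, 191, 1, 191, margin=0.05)
print(f"risk difference {diff:+.4f}, 95% CI ({lo:.4f}, {hi:.4f}), "
      f"noninferior: {noninferior}")
```

With one reaction in each arm the observed risk difference is zero and the upper confidence bound sits well inside the margin, which is the shape of a noninferiority conclusion.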

 

The findings have wide-ranging implications for patients. By accurately identifying low-risk penicillin allergy patients, health care providers can ensure appropriate antibiotic prescriptions. Patients with a documented penicillin allergy are more likely to be prescribed alternative antibiotics, known as second-line antibiotics, which are often not as effective against certain infections and may have more side effects.

 

“Patients with penicillin allergy are more likely to get second-line or broader-spectrum antibiotics that lead to risk of antibiotic resistance and serious infections such as antibiotic-associated diarrhea due to Clostridioides difficile, which can spread through hospitals and become a major public health problem,” Phillips said. “In the U.S., increasingly, we also have a major problem with other antibiotic-resistant ‘superbugs’ such as multi-resistant gram-negative infections, Candida auris and even a resurgence of syphilis, for which penicillin is the best treatment and the only treatment that should be used in pregnancy to prevent transmission to an unborn child.

 

“The evidence provided by the PALACE study will change clinical practice. Many patients in the United States do not have direct access to an allergist to provide specialized testing such as skin testing. Therefore, the ability to go straight to direct oral challenge with a penicillin in low-risk patients, which can be carried out in any observed setting, will make it easier for patients in the United States to access health care to safely and effectively remove the label of penicillin allergy,” she said.

 

The PALACE study was led by Ana-Maria Copaescu, MD, PhD, McGill University Health Centre, Montréal, and senior author Jason Trubiano, PhD, Centre for Antibiotic Allergy and Research, Department of Infectious Diseases at Austin Health. Cosby Stone, MD, assistant professor of Medicine in the Division of Allergy, Pulmonology, and Critical Care Medicine, was a co-investigator for VUMC.


The missing Americans: Unprecedented US mortality far exceeds other wealthy nations


A new study found that more than one million US deaths per year—including many young and working-age adults—could be avoided if the US had mortality rates similar to its peer nations.

Peer-Reviewed Publication

BOSTON UNIVERSITY SCHOOL OF PUBLIC HEALTH




In 2021, 1.1 million deaths would have been averted in the United States if the US had mortality rates similar to other wealthy nations, according to a new study led by a Boston University School of Public Health (BUSPH) researcher.

Published in the journal PNAS Nexus, the study refers to these excess deaths as "Missing Americans," because these deaths reflect people who would still be alive if the US mortality rates were equal to its peer countries. 

Comparing age-specific death rates in the U.S. and 21 other wealthy nations from 1933 through 2021, the authors find that current death rates in the US are much higher than other wealthy nations, and the number of excess U.S. deaths has never been larger.

“The number of Missing Americans in recent years is unprecedented in modern times,” says study lead and corresponding author Dr. Jacob Bor, associate professor of global health and epidemiology at BUSPH.

Nearly 50 percent of all Missing Americans died before age 65 in 2020 and 2021. According to Dr. Bor, the level of excess mortality among working age adults is particularly stark. “Think of people you know who have passed away before reaching age 65. Statistically, half of them would still be alive if the US had the mortality rates of our peers. The US is experiencing a crisis of early death that is unique among wealthy nations.”

The COVID-19 pandemic contributed to a sharp spike in mortality in the US—more so than in other countries—but the new findings show that the number of excess US deaths has been accelerating over the last four decades. Dr. Bor and colleagues analyzed trends in US deaths from 1933 to 2021, including the impact of COVID-19, and then compared these trends with age-specific mortality rates in Canada, Japan, Australia, and 18 European nations.

The US had lower mortality rates than peer countries during World War II and its aftermath. During the 1960s and 1970s, the US had mortality rates similar to other wealthy nations, but the number of Missing Americans began to increase year by year starting in the 1980s, reaching 622,534 annual excess US deaths by 2019. Deaths then spiked to 1,009,467 in 2020 and 1,090,103 in 2021 during the pandemic. From 1980 to 2021, there were a total of 13.1 million Missing Americans.
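The excess-death ("Missing Americans") arithmetic can be illustrated with a toy calculation: for each age band, multiply the gap between the US and peer-average mortality rates by the US population in that band, then sum. All numbers below are invented for illustration and are not the study's data:

```python
# Invented, illustrative figures only: US population and annual death rates
# by age band, alongside a hypothetical peer-country average rate.
us_population = {"0-14": 60e6, "15-44": 130e6, "45-64": 85e6, "65+": 56e6}
us_rate       = {"0-14": 3e-4, "15-44": 1.8e-3, "45-64": 7.5e-3, "65+": 4.2e-2}
peer_rate     = {"0-14": 2e-4, "15-44": 8e-4,   "45-64": 5.5e-3, "65+": 4.0e-2}

# Excess deaths: observed US deaths minus the deaths expected if the US
# experienced the peer-average rates in every age band.
excess_deaths = sum(
    us_population[age] * (us_rate[age] - peer_rate[age]) for age in us_population
)
print(f"excess deaths: {excess_deaths:,.0f}")
```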

The researchers emphasize that this mortality crisis is a multiracial phenomenon and is not specific to minoritized groups. Black and Native Americans are overrepresented in these measures, with mortality rates in early adulthood (ages 15-44) that were five and eight times higher than the average of other wealthy nations. The team also says that the history of structural racism in the US, including policies such as slavery and redlining, has contributed to racial and ethnic disparities in wealth and in access to education, housing, and healthcare that drive leading causes of worse mortality rates, particularly at young ages.

Still, two-thirds of the Missing Americans are White, a result of the larger population of White Americans, their older age distribution, and death rates that are significantly higher than other wealthy nations.

“Living in the US is a risk factor for early death that is common across many US racial and ethnic groups. Whereas most health disparities studies assess differences between US racial/ethnic groups, such an approach renders the poor health of Whites invisible and grossly underestimates the health shortfall of minoritized groups,” Dr. Bor says. “By using an international benchmark, we show that Americans of all races and ethnicities are adversely affected by the US policy environment, which places a low priority on public health and social protections, particularly for low-income people.”

Accounting for the future years of life lost when a person dies prematurely, the team estimates that in 2021, excess U.S. mortality translated to 26.4 million years of life lost relative to the mortality rates of peer nations. They connect the large excess mortality burden to the failure of US policy to adequately address major public health issues, including the opioid epidemic, gun violence, environmental pollution, economic inequality, food insecurity, and workplace safety. The COVID-19 pandemic exacerbated many of these issues, particularly among lower-income and minority groups, and now that most of the safety-net policies created during COVID-19 have expired, vulnerable groups have lost vital support.
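The years-of-life-lost figure follows the same pattern: each excess death is weighted by an estimate of the remaining life expectancy at the age of death. Again, a minimal sketch with invented numbers, not the study's data:

```python
# Invented, illustrative figures: excess deaths per age band and the
# remaining life expectancy assumed at the midpoint of each band.
excess_by_age = {"15-44": 130_000, "45-64": 170_000, "65+": 112_000}
remaining_life_years = {"15-44": 50.0, "45-64": 28.0, "65+": 12.0}

years_of_life_lost = sum(
    excess_by_age[age] * remaining_life_years[age] for age in excess_by_age
)
print(f"years of life lost: {years_of_life_lost:,.0f}")
```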

“We waste hundreds of billions each year on health insurers’ profits and paperwork, while tens of millions can’t afford medical care, healthy food, or a decent place to live,” says study senior author Dr. Steffie Woolhandler, Distinguished Professor at the School of Urban Public Health at Hunter College, City University of New York. “Americans die younger than their counterparts elsewhere because when corporate profits conflict with health, our politicians side with the corporations.”

Compared with the US, other countries also had higher vaccination uptake and mask policies during the Omicron wave, and this layered mitigation was associated with fewer COVID-19 cases.

The findings raise a number of urgent questions that will be critical to address in future research, says study coauthor Dr. Andrew Stokes, assistant professor of global health at BUSPH. For example, he says, "Which geographic areas are disproportionately responsible for the Missing Americans, and what were their causes of death? Answers to these questions may help to clarify policy solutions.” 

At the moment, Dr. Bor says he is not optimistic that mortality rates will reverse in the near future, even as COVID-related deaths continue to wane from their 2021 peak. 

“The US was already experiencing more than 600,000 Missing Americans annually before the pandemic began, and that number was increasing each year. There have been no significant policy changes since then to change this trajectory,” he says.

“While COVID-19 brought new attention to public health, the backlash unleashed during the pandemic has undermined trust in government and support for expansive policies to improve population health,” said Dr. Bor. “This could be the most harmful long-term impact of the pandemic, because expansion of public policy to support health is exactly how our peer countries have attained higher life expectancy and better health outcomes.”


About Boston University School of Public Health

Founded in 1976, Boston University School of Public Health is one of the top ten ranked schools of public health in the world. It offers master's- and doctoral-level education in public health. The faculty in six departments conduct policy-changing public health research around the world, with the mission of improving the health of populations—especially the disadvantaged, underserved, and vulnerable—locally and globally.

THE SHOW ME STATE

University of Missouri researchers driving innovative solutions to advance use of ‘plastic’ roads


Members of the Mizzou Asphalt Pavement and Innovation Lab are on the leading edge of developing sustainable plastic waste road pavement mixtures


Business Announcement

UNIVERSITY OF MISSOURI-COLUMBIA

Driving innovative solutions for advancing use of ‘plastic’ roads 

VIDEO: UNIVERSITY OF MISSOURI RESEARCHERS ARE ON THE LEADING EDGE OF DEVELOPING SUSTAINABLE PLASTIC WASTE ROAD PAVEMENT MIXTURES.

CREDIT: UNIVERSITY OF MISSOURI



Millions of roads across the United States are constructed with asphalt pavement that’s deteriorating over time. Now, researchers at the University of Missouri are using recyclables, including plastic waste, as a sustainable solution to fix America's fracturing road system.

In partnership with the Missouri Department of Transportation (MoDOT), researchers from the Mizzou Asphalt Pavement and Innovation Lab (MAPIL) recently created a real-world test road using recycled materials like scrap tires and plastic waste along a portion of Interstate 155 in the Missouri Bootheel. By increasing the sustainability of asphalt mixes, this innovative method can help reduce the number of items going into landfills or leaking into the environment, said Bill Buttlar, director of MAPIL.  

“Missouri is the Show-Me State, so we take a very pragmatic view,” Buttlar said. “The science can be thorny and difficult, but we are up to the task. We’re excited that while our approach is complicated in the lab, it’s simple to execute in the field, making it easily adaptable, scalable and cost-effective to incorporate into many types of road environments.”

The I-155 project takes the group’s previous test road, installed along a stretch of Stadium Boulevard in Columbia, Missouri, one step further. Instead of just testing four different types of recycled materials, the I-155 project will evaluate the real-world effectiveness of nine different types of recycled materials in the creation of asphalt pavement. This includes three different types of polyethylene (PE) — a material commonly found in plastic grocery bags — and ground tire rubber, which is a newer way of disposing scrap tires.

“These projects afford us an opportunity to intentionally build the next generation of roads with these materials not as a type of linear landfill, but to also help the environment while making the value of dollars spent on transportation infrastructure like this stretch farther into the future,” said Buttlar, who is also the Glen Barton Chair in Flexible Pavements.

MU is on the leading edge of this type of work in the U.S. because its team has addressed most of the translational research questions, such as durability and safety, that could prevent a general contractor or department of transportation from adopting this groundbreaking strategy.

“We don’t just live in the laboratory,” Buttlar said. “In the field of transportation material research, we need to see how all the various materials used to construct a road — the rock, the asphalt and the recycled materials — behave in the real world and gel together to build a road. Asphalt is liquified with heat, and when you put an additive in like a plastic or rubber material, you must get everything to bond together with good adhesion. But we’re only going to know if that happens successfully when we produce it on a full-scale level and then expose it to elements, such as different weather conditions and heavy traffic.”

MAPIL specializes in a dry process, which allows the researchers to easily add the recyclables directly into the mixture before it’s applied to a road surface.  

“The form, shape and size of the plastics bring different challenges in how the material flows, how it behaves and how it mixes,” said Punya Rath, an assistant research professor in the Department of Civil and Environmental Engineering who works at MAPIL. “So, we did extensive small-scale testing for almost an entire year before we moved to a larger scale out in the field with contractors.”

One advantage of this process is that the researchers can test the mixtures in the field using a mobile research lab, which they developed and used for both the Stadium Boulevard and I-155 projects.

“It helps the Missouri Department of Transportation (MoDOT) immensely to have a mobile research lab on-site in the field that has the ability to rapidly test samples and provide results within 24-48 hours to better inform the process,” Rath said.

Citing environmental concerns, Buttlar said the team makes sure everything they do is within the current limits as established by the Environmental Protection Agency (EPA).

“We are designing the material to be able to hold or capture the environmental by-products at the highest percentage for the longest amount of time. It’s not going to be a 100% containment,” Buttlar said. “Everything built in a natural environment will degrade over time, so that’s why EPA has standards for everything, and we make sure we are safely within those standards.”

In partnership with MoDOT, University of Missouri researchers recently created a real-world test road using recycled materials along a portion of Interstate 155 in the Missouri Bootheel.

CREDIT: Graphic by Samantha Novak/University of Missouri

An asphalt pavement test mixture sample designed by the Mizzou Asphalt Pavement and Innovation Lab is ready for further lab testing to determine its strength and durability.


Lab staff in the Mizzou Asphalt Pavement and Innovation Lab show the plastic waste particles that are being added to the pavement mixture. The lab is located inside the University of Missouri College of Engineering.

A pavement mixture using recycled plastic is applied to a section of Stadium Boulevard in Columbia near U.S. Highway 63 in 2021.


Steam rises from the newly poured plastic waste pavement mixture as an asphalt roller compacts the mixture into a smooth driving surface along Stadium Boulevard in Columbia in 2021.

CREDIT: University of Missouri

Editor’s Note:

Both the Stadium Boulevard and I-155 projects were developed in collaboration with the Missouri Department of Transportation (MoDOT) as part of their ongoing partnership with the University of Missouri System’s Missouri Center for Transportation Innovation (MCTI). While both projects only involved repairing an existing road surface, Buttlar said there have been some preliminary talks between his team and MoDOT on how to incorporate more sustainable asphalt pavement mixtures into the building of a third lane on Interstate 70 across Missouri.


What causes mudslides and floods after wildfires? Hint: It’s not what scientists thought


Peer-Reviewed Publication

UNIVERSITY OF SOUTHERN CALIFORNIA



In 2020, one of the largest wildfires in Los Angeles County raged across the San Gabriel Mountains, scorching more than 115,000 acres, damaging or destroying over 150 structures and raining ash and smoke down on pandemic-weary Angelenos.

But even after exhausted firefighters had finally snuffed out the flames, the Bobcat Fire — like other so-called “mega-fires” that have become more common due to climate change — carried the potential to wreak more havoc in its wake. As rainstorms deluge burnt areas, flooding, mudslides and debris flow can compound the fire’s damage.

Understanding how water accumulates and monitoring the movement of runoff and streamflow in burn areas helps authorities predict when and where these post-wildfire events might occur so they can provide affected residents with early warning of flash flooding and debris movement.

A slippery slope

Common knowledge has long held that loss of vegetation during a fire leaves soil vulnerable to erosion because the plant roots that hold the soil in place wither and die. Scientists, however, have favored a different explanation: as leaves burn, their waxy coating forms an organic, oily substance that creates a water-repellant layer at or near the soil’s surface. Scientists believed this layer prevented the ground from absorbing water, resulting in rapid runoff akin to a Slip ‘N Slide that carries mud and debris.

New research published in Nature Communications has called that scientific theory into question.

A watershed finding

Images of the 2020 Bobcat Fire area indicate soil burn severity and the locations the researchers studied. (Image Source: a- USDA Forest Service; b- A.J. West; d- Pléiades ©CNES, Distribution AIRBUSDS, sourced via SkyWatch Space Applications Inc., and USGS 3D Elevation Program.)

Scientists at the USC Dornsife College of Letters, Arts and Sciences, in collaboration with researchers from the University of Michigan, the U.S. Geological Survey and Rutgers University, monitored two wet seasons following the Bobcat Fire, from December 2020 to March 2022. The team concluded that water was, in fact, being absorbed by the burnt ground that contained this waxy coating.

Specifically, the team studied three watersheds — areas of land that drain rainfall and snowmelt into streams and rivers — in Southern California’s San Gabriel Mountains. Two of the watersheds burned during the 2020 Bobcat Fire and the other was mainly untouched.

The researchers found that post-wildfire, a significant portion of the water flow in all three watersheds came from water that had been absorbed in the ground.

Joshua West, professor of Earth sciences who led the study at USC Dornsife, said it was no surprise that the flow of water and debris in the burned area’s stream was four to 10 times greater than the flow in the unburnt area’s stream. What he didn’t expect was that stormwater had permeated the ground in both of the burnt watersheds.

This finding contradicted scientists’ previous beliefs that little water would be absorbed in the burnt watershed due to the presence of waxy soils.

In the unburnt watershed, however, the researchers found that trees absorbed the water as anticipated, preventing it from reaching streams.

West and PhD candidate Abra Atwood surmised that the increased water in rivers originated from the burnt areas because, in keeping with the popular notion, burned trees and vegetation could not retain water in their roots as they normally would, not because the soil was unable to absorb water.

The research team’s finding that the water-repellant layer does not prevent water from being absorbed into the soil strengthened their hypothesis that the water in streams comes from both rainfall and groundwater, leading to increased flooding in burnt areas versus unburnt.

Water build-up poses enduring threat

Identifying areas that are at high-risk for debris flow and mudslides and accurately predicting the amount of debris flow following rainfall in burn areas depends on understanding how water infiltrates the soil in different areas and how it contributes to the flow of streams.

Also, the dynamics of water flow and how water accumulates below the surface can significantly impact how quickly landscapes recover after a wildfire. This recovery affects the stability of hill slopes and helps buffer forests against severe drought.

On the flip side, water accumulation can contribute to landslides for up to four years after a fire as pressure builds up in the soil.

“The underground water accumulation suggests that the potential for landslides extends far beyond the two years following the fire, posing an enduring problem,” West said. “The abundance of water stored in areas affected by the Bobcat Fire, for example, could serve as a harbinger of future flooding concerns in the years to come.”

West is confident that the study’s findings hold important information that can be used by the USGS to improve burn area monitoring and predict flooding and mudslides after a wildfire.

About the study

Funding was provided by USC Dornsife’s Department of Earth Sciences and the National Science Foundation.

 

New NIST measurements aim to advance and validate portable MRI technology


A new wave of smaller, less expensive, and portable MRI systems promises to expand the delivery of health care and the capabilities of medicine


Peer-Reviewed Publication

NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY (NIST)

Low Field MRI 

IMAGE: KALINA JORDANOVA, STEPHEN OGIER AND KATY KEENAN ARE AMONG THE NIST RESEARCHERS WHO HAVE BEEN WORKING ON SEVERAL PROJECTS THAT AIM TO ADVANCE MRI TECHNOLOGY THAT USES LOWER-STRENGTH MAGNETIC FIELDS AND VALIDATE ITS APPROACHES FOR CAPTURING IMAGES WITH WEAKER MAGNETIC FIELDS.

CREDIT: R. JACOBSON/NIST



Magnetic resonance imaging (MRI) machines can clearly view non-bony parts of the body — soft tissue such as the brain, muscles and ligaments — as well as detect tumors, making it possible to diagnose many diseases and other conditions. However, the powerful magnets in conventional MRI machines make them expensive and bulky, confining them mainly to hospitals and other large facilities. 

As an alternative solution, companies are developing new portable versions that have lower-strength magnetic fields. These new models can potentially expand the ways in which MRI is used. For instance, low-field MRI systems could be deployed in ambulances and other mobile settings. They also could cost much less, promising to make MRI more widely available, including in underserved communities and developing nations. 

But for low-field MRI scanners to reach their full potential, more research is needed to understand the relationship between low-field images and the underlying tissue properties they represent. Researchers at the National Institute of Standards and Technology (NIST) have been working on several fronts to advance low-field MRI technology and validate methods for creating images with weaker magnetic fields.

“Magnetic resonance images of tissue differ depending on magnetic strength,” said NIST electrical engineer Kalina Jordanova. “With low-field MRI systems, the contrast of the images is different, so we need to know how human tissue looks at these lower field strengths.” 

Toward these ends, researchers measured the properties of brain tissue at low magnetic field strength. Their results were published in the journal Magnetic Resonance Materials in Physics, Biology and Medicine.

The researchers used a commercially available portable MRI machine to image brain tissue in five male and five female volunteers. The images were created using a magnetic field strength of 64 millitesla, which is at least 20 times lower than the magnetic field in conventional MRI scanners.  

They collected images of the entire brain and obtained data on its gray matter (which has a high concentration of nerve cells), white matter (deeper tissues of the brain that house nerve fibers), and cerebrospinal fluid (clear fluid surrounding the brain and spinal cord). 

These three brain constituents respond to the low magnetic field in different ways and produce distinctive signals that reflect their unique properties, enabling the MRI system to produce images that contain quantitative information about each constituent. “Knowing the quantitative properties of tissue allows us to develop new image collection strategies for this MRI system,” said NIST biomedical engineer Katy Keenan. 

In separate work, NIST researchers are exploring several candidate materials that can significantly boost image quality in low-field MRI scans.

MRI contrast agents — magnetic materials that are injected into patients and enhance image contrast — make it easier for radiologists to identify anatomical features or evidence of disease and are routinely used in MRI at conventional magnetic field strengths. However, researchers are just starting to understand how contrast agents might be used with the new low-field MRI scanners. At the lower field strengths of these scanners, contrast agents may act differently than at higher field strengths, creating opportunities to use new types of magnetic materials for image enhancement. 

NIST scientists and their colleagues compared the sensitivity of several magnetic contrast agents in low magnetic fields. The researchers found that iron oxide nanoparticles outperformed traditional contrast agents, which are made of the element gadolinium — a rare-earth metal. At low magnetic field strength, the nanoparticles provided good contrast using a concentration of only about one-ninth that of the gadolinium particles.

Iron oxide nanoparticles also offer the advantage that they are broken down by the human body instead of potentially accumulating in tissue, noted NIST researcher Samuel Oberdick. By comparison, a small amount of gadolinium may accumulate in tissue and could confound the interpretation of future MRI scans if it is not taken into account.

NIST researchers collaborated with the University of Florence in Italy and Hyperfine Inc. in Guilford, Connecticut, and reported their findings in the journal Scientific Reports.

NIST researcher Samuel Oberdick and his colleagues explored contrast agents for MRI machines, testing iron oxide nanoparticles at lower-strength magnetic fields. The nanoparticles inside the liquid solution (pictured here) are magnetic and are pulled toward the magnet through a combination of magnetic interactions and surface tension.

CREDIT: R. Wilson/NIST

Rice study: Men vastly outnumber women in studying legislative politics


Peer-Reviewed Publication

RICE UNIVERSITY




It’s no secret that men outnumber women in the halls of Congress and in other political arenas, but new research from Rice University, the University of Wisconsin-Madison and the University of Illinois Urbana-Champaign also found that significantly more men than women study the legislative process in the U.S. and abroad.

This has troubling implications for the inner workings of the discipline and the overall study of topics that impact women’s political involvement, according to Leslie Schwindt-Bayer, the Thomas Cooke and Mary Elizabeth Edwards Chair in Government and Democracy and one of the authors of “Women in Legislative Studies: Improving Gender Equality.” The article appears in an upcoming issue of Political Science & Politics.

For their study, Schwindt-Bayer and fellow authors Eleanor Neff Powell at the University of Wisconsin–Madison and Gisela Sin from the University of Illinois Urbana-Champaign surveyed 361 women studying political science with a focus in legislative politics to understand why gender representation in the field is so imbalanced and what can be done about it.

Schwindt-Bayer said that while many women study legislatures, they do not always identify as scholars of legislative studies.

“We argue that one big reason for this low representation may be that women are less likely to call themselves legislative scholars and do not perceive the field as inclusive,” she said. “When asked what problems they perceive with the section of scholars studying legislative politics, the most common critique was that the field was male-dominated. Nearly 40 percent raised this concern; another 21 percent commented on the lack of ethnic and racial diversity.”

Other problems cited included an overemphasis on the U.S. Congress (21 percent) and the subfield being too cliquey/exclusive (28 percent).

Being underrepresented means that networking opportunities and scholarly resources for women are more scarce, in addition to there being fewer female mentors and letter writers for promotion and tenure. The representation imbalance also reinforces gender bias when it comes to authoring research, graduate training, and publication and citation rates for women in the field.

“It’s especially troublesome when you consider that female authors are significantly more likely than male authors to cite studies by women,” Schwindt-Bayer said. “This lack of representation puts women at a real disadvantage in this field.”

The researchers also looked at articles in Legislative Studies Quarterly (LSQ), a prominent political science journal focused on this political science subfield, between 2016 and August 2018. In the issues published during this time period, 59 percent of the 73 articles were authored by men, 18 percent by women and 23 percent by mixed-gender teams. These percentages are representative of women in the subfield but are small nonetheless.

“It was plain as day — women are just not well-represented in this branch of the American Political Science Association (APSA),” Schwindt-Bayer said.

While the authors have not examined how representation in the discipline affects women in legislative politics outside of academia, Schwindt-Bayer said it could mean less attention to issues that are important to women involved in the legislative process.

So what can be done? Schwindt-Bayer and her fellow authors said a more expansive definition of legislative studies, embracing the diversity of women in the field, peer mentoring and space for women studying legislatures to engage in professional development and share research would be a good place to start.

“This could help encourage more women to join this field,” she said.

The paper is online at https://www.cambridge.org/core/journals/ps-political-science-and-politics/article/women-in-legislative-studies-improving-gender-equality/40661FE6B0FB4E3B8F341DBC6359C2FE.