Tuesday, February 27, 2024

 

Study finds pesticide use linked to Parkinson’s in Rocky Mountain, Great Plains region

Reports and Proceedings

AMERICAN ACADEMY OF NEUROLOGY




MINNEAPOLIS – Pesticides and herbicides used in farming have been linked to Parkinson’s disease in the Rocky Mountain and Great Plains region of the country, according to a preliminary study released today, February 27, 2024, that will be presented at the American Academy of Neurology’s 76th Annual Meeting taking place April 13–18, 2024, in person in Denver and online.

“We used geographic methods to examine the rates of Parkinson’s disease across the United States and compared those rates to regional levels of pesticide and herbicide use,” said study author Brittany Krzyzanowski, PhD, of Barrow Neurological Institute in Phoenix, Arizona. “Our methods enabled us to identify parts of the nation where there was a relationship between most pesticides and Parkinson’s disease and subsequently pinpoint where the relationship was strongest so we could explore specific pesticides in that region. In the Rocky Mountain and Great Plains region, we identified 14 pesticides associated with Parkinson’s disease.”

Krzyzanowski said the region included parts of Colorado, Idaho, Kansas, Montana, Nebraska, Nevada, New Mexico, North Dakota, Oklahoma, South Dakota, Texas, Utah and Wyoming.

The study involved review of records from the 21.5 million people enrolled in Medicare in 2009 to determine the rate of Parkinson’s disease for various regions across the country. The researchers then looked for a possible relationship between these rates of Parkinson’s and the use of 65 pesticides.

They found that the herbicides simazine and atrazine and the insecticide lindane had the strongest relationship with Parkinson’s disease. When researchers divided counties into 10 groups based on exposure to pesticides, people living in the counties with the highest application of the herbicide simazine were 36% more likely to have Parkinson’s disease than people living in the counties with the lowest amount of exposure.

In the counties with the highest exposure to simazine, 411 new Parkinson’s disease cases developed for every 100,000 people, compared to 380 cases in the counties with the lowest exposure.

For the herbicide atrazine, those exposed to the highest amount were 31% more likely to have Parkinson’s disease than those with the lowest exposure. For the insecticide lindane, those with the most exposure were 25% more likely to have the disease.

In the counties with the highest exposure to atrazine, 475 new Parkinson’s disease cases developed for every 100,000 people, compared to 398 cases in the counties with the lowest exposure. In the counties with the highest exposure to lindane, 386 new Parkinson’s disease cases developed for every 100,000 people, compared to 349 cases in the counties with the lowest exposure.
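To illustrate the kind of county-level comparison described above, here is a minimal sketch (not the study’s code) of how counties might be grouped into exposure deciles and their Parkinson’s rates compared. The column names ("pd_cases", "population", and the exposure column) and any numbers are hypothetical.

```python
# Illustrative sketch only: a simplified county-level decile comparison.
# Column names are hypothetical, not taken from the study.
import pandas as pd

def decile_rate_comparison(df: pd.DataFrame, exposure_col: str) -> pd.Series:
    """Assign counties to exposure deciles and return new PD cases per 100,000."""
    df = df.copy()
    # Rank counties into 10 equal-sized exposure groups (1 = lowest, 10 = highest).
    df["decile"] = pd.qcut(df[exposure_col], 10, labels=range(1, 11))
    grouped = df.groupby("decile", observed=True)
    return 100_000 * grouped["pd_cases"].sum() / grouped["population"].sum()

# Example usage with a hypothetical DataFrame of counties:
# rates = decile_rate_comparison(counties, "simazine_kg_applied")
# print(rates[10] / rates[1])  # crude rate ratio, highest vs. lowest exposure decile
```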

The results remained the same when researchers adjusted for other factors that could affect the risk of Parkinson’s disease, such as air pollution exposure.

“It’s concerning that previous studies have identified other pesticides and herbicides as potential risk factors for Parkinson’s, and there are hundreds of pesticides that have not yet been studied for any relationship to the disease,” Krzyzanowski said. “Much more research is needed to determine these relationships and hopefully to inspire others to take steps to lower the risk of disease by reducing the levels of these pesticides.”

A limitation of the study was that it relied on the use of county-level estimates since person-level information on pesticide exposure was not available for the study population.

The study was supported by the Michael J. Fox Foundation for Parkinson’s Research.

Learn more about Parkinson’s disease at BrainandLife.org, home of the American Academy of Neurology’s free patient and caregiver magazine focused on the intersection of neurologic disease and brain health. Follow Brain & Life® on Facebook, X and Instagram.

When posting to social media channels about this research, we encourage you to use the American Academy of Neurology’s Annual Meeting hashtag #AANAM.

The American Academy of Neurology is the world’s largest association of neurologists and neuroscience professionals, with over 40,000 members. The AAN is dedicated to promoting the highest quality patient-centered neurologic care. A neurologist is a doctor with specialized training in diagnosing, treating and managing disorders of the brain and nervous system such as Alzheimer’s disease, stroke, migraine, multiple sclerosis, concussion, Parkinson’s disease and epilepsy.

For more information about the American Academy of Neurology, visit AAN.com or find us on Facebook, X, Instagram, LinkedIn and YouTube.

 

Walleye struggle with changes to timing of spring thaw


Peer-Reviewed Publication

UNIVERSITY OF WISCONSIN-MADISON




Walleye are one of the most sought-after species in freshwater sportfishing, a delicacy on Midwestern menus and a critically important part of the culture of many Indigenous communities. They are also struggling to survive in the warming waters of the Midwestern United States and Canada.

According to a new study published Feb. 26 in the journal Limnology and Oceanography Letters, part of the problem is that walleye are creatures of habit, and the seasons — especially winter — are changing so fast that this iconic species of freshwater fish can’t keep up.

The timing of walleye spawning — when the fish mate and lay their eggs — has historically been tied to the thawing of frozen lakes each spring, says the study’s lead author, Martha Barta, a research technician at the University of Wisconsin–Madison. Now, due to our changing climate, walleye have been “unable to keep up with increasingly early and more variable ice-off dates,” Barta says.

Within a few days of ice-off, when a lake’s frozen lid has melted away, walleye begin laying eggs and fertilizing them. In a normal year, that timing sets baby fish up for success once they hatch. But, Barta says, “climate change is interrupting the historical pairing of ice-off and walleye spawning, and that threatens the persistence of walleye populations across the Upper Midwest.”

Barta — who began working on the study as an undergraduate student at UW–Madison’s Center for Limnology — and her colleagues used data from walleye surveys conducted by various state natural resource departments and the Great Lakes Indian Fish & Wildlife Commission, as well as spring harvest counts from Ojibwe tribal nations, to track the fate of walleye populations in 194 lakes across Minnesota, Wisconsin and Michigan. The data revealed “mismatches” between ice-off and spawning on almost every single lake. While walleye spawning dates have shifted slightly earlier in spring, ice-off dates on those lakes have been shifting roughly three times faster.
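As a concrete illustration of the mismatch described above, here is a minimal sketch (not the authors’ analysis) comparing the long-term trend in ice-off dates with the trend in spawning dates for a single hypothetical lake; all data below are invented.

```python
# Minimal sketch: compare trends in ice-off vs. walleye spawning dates for one
# hypothetical lake using ordinary least squares. Data are invented.
import numpy as np

def trend_days_per_year(years: np.ndarray, day_of_year: np.ndarray) -> float:
    """Slope of a least-squares line: change in date (days) per year."""
    slope, _intercept = np.polyfit(years, day_of_year, 1)
    return slope

rng = np.random.default_rng(0)
years = np.arange(2001, 2021)
# Hypothetical day-of-year series: ice-off advancing faster than spawning.
ice_off = 110 - 0.6 * (years - 2001) + rng.normal(0, 3, years.size)
spawning = 115 - 0.2 * (years - 2001) + rng.normal(0, 2, years.size)

print("ice-off trend (days/yr):", trend_days_per_year(years, ice_off))
print("spawning trend (days/yr):", trend_days_per_year(years, spawning))
# A steeper (more negative) ice-off slope than spawning slope indicates a growing mismatch.
```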

Suddenly, the timing is all wrong for walleye, explains Zach Feiner, a fisheries scientist with both the UW–Madison Center for Limnology and the Wisconsin Department of Natural Resources.

“In an average ice-off year, you have this nice progression of events,” Feiner says. “The ice goes off, you get light and warmer water that creates a bloom of small plant life called phytoplankton. And then tiny animals called zooplankton emerge and eat the phytoplankton, and usually, the walleye spawning is timed for them to hatch when zooplankton are around in high abundance and can serve as fish food for the baby walleye.”

But lately, the timing of yearly thaws has gotten “weird,” says Feiner. Lakes are, on average, thawing earlier, but the number of winters where lakes thaw late is also increasing. The shifts throw off the timing of phytoplankton blooms, zooplankton emergence and walleye hatch, breaking their linked progression as winter transitions to spring.

“When the fish hatch, there aren’t enough zooplankton around, and walleye don’t have enough food to survive,” Feiner says.

On a year-to-year basis, that isn’t necessarily a problem, as adult walleye can always spawn again the next year, when conditions may be more favorable and more of their offspring can survive and increase the population. But, Feiner says, the heightened variability of spring thaws is “increasing the frequency of bust years, and we’re not seeing many or any boom years for a lot of walleye populations.”

While this is obviously bad news for walleye and the people who depend on them, the study underscores the need to identify and protect lakes that can offer refuge in bad years.

“There is a need now to find places where, through management of things we can control — like land use, fish harvest and invasive species — we can buffer or boost their resiliency to be able to handle stuff we can’t control, like climate change,” Feiner says.

 If fisheries managers can identify lakes where walleye populations are doing relatively well, they can try to keep conditions optimal so that the fish can take advantage during the increasingly rare years when ice-off and their spring spawn do line up.

Then there is also the question of what our “weird” winters mean for other fish species.

“Most of our big-time sportfish species in the Midwest, like walleye, perch, pike, bass, bluegill and muskies, spawn in springtime,” Feiner says. Other species like lake trout and whitefish spawn in the fall, and their eggs overwinter under the ice.

Feiner hopes to expand the research to see if a pattern extends to other fish prized by people — or if some of them are resilient to less-predictable ice-off timing.

###

— Adam Hinterthuer, hinterthuer@wisc.edu

 

New UC Berkeley-led study reveals widening gap in racial inequality in higher education


Study shows that disparities in the share of Black and Latino students admitted to America’s elite colleges and universities have endured and even widened over the last 40 years


Peer-Reviewed Publication

UNIVERSITY OF CALIFORNIA - BERKELEY





A UC Berkeley-led study revealed that disparities in the share of Black and Latino students admitted to America’s elite colleges and universities have endured and even widened over the last 40 years. 

The study, "Shifting Tides: The Evolution of Racial Inequality in Higher Education from the 1980s through the 2010s,"(link is external) was published this month in Sage Journals and conducted by a team of researchers from UC Berkeley’s Social Sciences D-Lab, SUNY Polytechnic Institute, University of Arizona and Portland State University. It used four national survey datasets to examine college attendance trends across different racial groups over several decades. 

The results show a concerning pattern: despite more students from all races going to college, Black and Latino students are increasingly less likely to attend top-tier, four-year colleges compared to their White peers. This gap remains significant, even when family income and parents’ education are factored in.

"The recent rapid expansion of higher education might raise an expectation of a decrease in racial inequality, but our findings show this isn't the case,” said Byeongdon Oh, a UC Berkeley Social Sciences D-Lab researcher, who led the study. “The privileged continue to benefit the most from higher education opportunities."

This study is unique in its comprehensive approach, analyzing data from the High School and Beyond survey, the National Education Longitudinal Study of 1988, the Education Longitudinal Study of 2002 and the High School Longitudinal Study of 2009. These datasets represent a wide spectrum of high school graduates over a span of four decades, providing a robust platform for analysis.

The stark findings show that while participation in two-year and non-selective four-year colleges increased for Black and Hispanic students, their presence in elite four-year colleges (those ranked by Barron’s selectivity index) continued to lag.

The study reveals that adjustments for family socioeconomic status do little to alter these trends. This suggests that the underlying issue of racial inequality in college attendance goes beyond socioeconomic measures, such as family income and parent’s education, and is intrinsically linked to race itself. It points to a systemic issue within the fabric of American education and society.

"In terms of the underlying issue of racial inequality in college attendance, there are subtle mechanisms and experiences of racism and discrimination that Black and Latino students encounter,” said Ned Tilbrook, a University of Arizona researcher who is co-author of the study. "One of the real contributions of this research is being able to look at the evolution of racial inequality in college attendance over four decades, which shows how the gap in admissions to the most selective institutions is getting wider. This is especially striking considering that overall  college attendance has increased."

The implications of these findings are significant, particularly in light of the 2023 Supreme Court ruling that overturned affirmative action. "While our study doesn't directly measure the impact of the Supreme Court decision, it does suggest that the trend of widening racial disparities might continue or even worsen," said Oh, who did postdoctoral work at PSU and is also an assistant professor of sociology at SUNY Polytechnic Institute.

The researchers argue for a holistic approach to admissions, advocating for the consideration of an applicant's background and circumstances and a focus on rectifying educational disparities at the K–12 level. 

"The Education Counsel in Washington, D.C. calls for universities to recognize the recent SCOTUS decision as an opportunity – an opportunity to integrate DEI into all university mission statements and to centralize holistic admissions practices, including identity-neutral questions that ask applicants to describe how their experiences, knowledge and expertise align with the university's DEI mission," said Dara Shifrer, associate professor of sociology at Portland State University and co-author of the study. 

The researchers also emphasize the need for proactive policy interventions to address this deep-rooted inequality. "Considering a student's background in admissions is crucial," said Tilbrook, who is also a Ph.D. candidate in sociology at PSU.

The study highlights the vital role of college admissions processes in shaping future socioeconomic disparities. With higher education being a prerequisite for economic success for many, the underrepresentation of racial minorities in selective institutions can perpetuate and widen the racial gap in socioeconomic outcomes, the researchers said.

The long-term societal impacts of these trends are a major concern. "If these educational disparities continue, they will likely lead to further racial inequalities in society," Oh said. "It's about the reproduction of structural racism across generations."

This study underscores the critical need for continued research and policy intervention to bridge the growing racial divide in higher education.

"Merely eradicating race from admissions criteria isn't a solution,” Oh said. “We need to understand and tackle the historical and systemic roots of these inequalities.”

 

How artificial intelligence could improve speed and accuracy of response to infectious disease outbreaks in hospitals, and even prevent them

Reports and Proceedings

EUROPEAN SOCIETY OF CLINICAL MICROBIOLOGY AND INFECTIOUS DISEASES





Please mention the European Congress of Clinical Microbiology and Infectious Diseases (ECCMID 2024, Barcelona, 27-30 April) if using this material

A new research review to be given at a pre-congress day for this year’s European Congress of Clinical Microbiology and Infectious Diseases (ECCMID 2024) will highlight the potential artificial intelligence (AI) has to improve the speed and accuracy of investigations into infectious disease outbreaks in hospitals, and potentially provide real time information to stop or prevent them. The talk will be by Dr Jonas Marschall, Division of Infectious Diseases, Washington University School of Medicine, St. Louis, MO, USA.

Dr Marschall uses the example of an outbreak of vancomycin-resistant Enterococcus faecium (VRE) that began in late 2017 at Bern University Hospital, Switzerland, and went on until July 2020, receiving substantial media attention and becoming Switzerland’s largest-ever multidrug-resistant organism outbreak.

The investigations into the outbreak revealed that most VRE-affected patients were colonised rather than infected (a “silent outbreak”); that isolation of VRE patients was costly (requiring isolation rooms, personal protective equipment and bed closures); and that screening for VRE required a substantial logistical and financial effort (screenings were proximity-based, captured entire floors, or were even hospital-wide).

Dr Marschall and colleagues then re-analysed the medical records generated during the first two years of the outbreak period (1/2018-12/2019), identified (and mostly confirmed) risk factors for VRE colonisation using various statistical methods, and then moved to a framework based on network graph theory and graph neural networks (a type of AI).

Briefly, network graphs inspect the connections between discrete “nodes” in a network (which could be patients, rooms or devices) and determine, for example, which nodes are most connected or which nodes have the shortest path to another node (and thus play a larger role in the outbreak).
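As a toy illustration of this idea (not the study’s actual pipeline), the open-source networkx library can be used to build a small contact network of patients, rooms and devices and rank nodes by how connected or central they are. The node names below are invented.

```python
# Toy illustration: rank nodes in a small hospital contact network by connectivity
# and betweenness, as a stand-in for the graph analysis described above.
import networkx as nx

G = nx.Graph()
# Each edge represents a recorded contact, e.g. a patient visiting a room or using a device.
G.add_edges_from([
    ("patient_A", "exam_room_ZZ"),
    ("patient_B", "exam_room_ZZ"),
    ("patient_B", "ecg_device_1"),
    ("patient_C", "ecg_device_1"),
    ("patient_C", "ward_7"),
    ("patient_D", "ward_7"),
])

# Which nodes have the most direct connections?
most_connected = sorted(G.degree, key=lambda kv: kv[1], reverse=True)

# Which nodes lie on many shortest paths between others (potential transmission hubs)?
betweenness = nx.betweenness_centrality(G)

print("most connected:", most_connected[:3])
print("highest betweenness:", max(betweenness, key=betweenness.get))
```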

Dr Marschall explains: “The more traditional methods for analysing outbreaks might yield signals that tend to confirm previously known and often generic risk factors, without adding the detail needed to make specific interventions”. 

Extrapolating from their preliminary work in the field, Dr Marschall thinks AI-based analyses could ultimately provide answers to key questions in an outbreak. For example, their study showed that the “electrocardiography service” and the “examination room ZZ” were at the centre of many interactions and thus could likely have served as places of transmission; in consequence, that room, device or person could be the target for disinfection or other interventions.

However, Dr Marschall explains that in this study not all employee interactions were captured, because not every profession logs its time with a patient as thoroughly as, for example, nursing personnel do, which leaves gaps in our understanding of the many interactions happening in an acute care hospital. To extract the maximum amount of information, all interactions between patients, employees, visitors, rooms and devices would have to be captured. Also, Dr Marschall and colleagues based their work on the local data infrastructure in Bern; while the general approach they describe can be used in any setting, a given hospital would need to ensure its data is in an analysable format and labelled so as to facilitate interpretation.

To make AI input into infectious disease outbreaks a success, Dr Marschall explains that hospital teams must pivot from research to operations, meaning implementing and refining AI tools as an outbreak happens in real time (which is not easy to predict). He explains: “This approach could even help with individual patients infected with a multidrug-resistant organism, or small clusters of such patients, because it could identify surrounding patients/employees and rooms/devices that would need to be addressed, either by screening or by targeted disinfection. The beauty of AI in outbreak management (and where its greatest power lies) is to make real time or near real time operational decisions easier, quicker and more precise.”

In his talk, Dr Marschall highlights that novel approaches to medical data (such as network graphs and temporal graph neural networks) can provide the framework to elevate outbreak investigations to the next stage. They can identify specific "hot spots" of an outbreak, where transmissions are likely to happen, and give us the tools to target these hot spots in order to fight an outbreak. He concludes: “If this help comes to infection prevention experts in near real time, it will dramatically improve their ability to respond to an outbreak.”

 

Researchers closer to understanding hydrogen's great challenge


Solving embrittlement is a multi-billion-dollar question


Peer-Reviewed Publication

UNIVERSITY OF SYDNEY





Why hydrogen causes steels to become brittle and crack is the great conundrum of engineers and researchers looking to develop large-scale transport and storage solutions for the hydrogen age – an era which Australia hopes to lead by 2030.

They may now be one step closer to understanding how hydrogen affects steels, thanks to new University of Sydney research. The researchers found adding the chemical element molybdenum to steel reinforced with metal carbides markedly enhances its ability to trap hydrogen.

Published in Nature Communications, the finding was demonstrated by a team led by Pro Vice-Chancellor (Research - Enterprise and Engagement) Professor Julie Cairney and Dr Yi-Sheng (Eason) Chen, which also included Dr Ranming Liu and PhD candidate Pang-Yu Liu.

They used an advanced microscopy technique pioneered at the University of Sydney, known as cryogenic atom probe tomography, allowing for direct observation of hydrogen distribution in materials.

“We hope this study will get us closer to revealing exactly why hydrogen embrittlement occurs in steel, paving the way for large-scale solutions to hydrogen transport and storage,” said Professor Cairney, who is based at the Australian Centre for Microscopy and Microanalysis, where the research was undertaken.

Hydrogen embrittlement is a process whereby hydrogen causes high strength materials like steel to become brittle and crack. The researchers say it is one of the biggest obstacles to the transition to a hydrogen economy as it prevents hydrogen from being effectively stored and transported at high pressures. This makes understanding and solving embrittlement a multi-billion-dollar question for the renewables market.

"The future of a large-scale hydrogen economy largely comes down to this issue. Hydrogen is notoriously insidious; as the smallest atom and molecule, it seeps into materials, then cracks and breaks them. To be able to effectively produce, transport, store and use hydrogen on a large-scale, this is not ideal,” said Dr Chen.

Deloitte estimates the clean hydrogen market could reach US$1.4 trillion by 2050.

How the process worked

Molybdenum was added to the steel, combined with other elements to form an extremely hard ceramic known as ‘carbide’. Carbides are often added to steels to increase their durability and strength.

Using their advanced microscopy technique, the researchers saw the trapped hydrogen atoms were at the core of the carbide sites, suggesting the addition of molybdenum helps trap hydrogen. This was compared with a benchmark titanium carbide steel which did not show the same hydrogen trapping mechanism.

“The addition of molybdenum helped boost the presence of carbon vacancies – a defect in carbides that can effectively capture hydrogen,” said Dr Chen.

The added molybdenum represented only 0.2 percent of the total steel, which the researchers say makes it a cost-effective strategy for reducing embrittlement. The researchers believe niobium and vanadium may also have a similar effect on steels.

DECLARATION

The research was funded by the Australian Research Council’s Linkage Projects (LP180100431 and LP210300999), Early Career Industry Fellowship (IE230100160), Future Fellowship (FT180100232) and LIEF grant (LE190100048), as well as the 2019 University of Sydney (Postdoctoral) Fellowship and the Taiwan-University of Sydney Scholarship.

 

Poorly controlled asthma generates the same quantity of greenhouse gas as 124,000 homes each year in the UK


Improving care of asthma patients could help NHS meet its net zero target, say researchers

Peer-Reviewed Publication

BMJ




Patients whose asthma is poorly controlled generate eight times the excess greenhouse gas emissions of those whose condition is well controlled—equivalent to the emissions produced by 124,000 homes each year in the UK—indicates the first study of its kind, published online in the journal Thorax.

Improving the care of asthma patients could achieve substantial carbon emissions savings, and help the NHS meet its net zero target, say the researchers.

Healthcare is a major contributor to greenhouse gas emissions and in 2020 the NHS set an ambitious target of reducing its carbon footprint by 80% over the next 15 years, with the aim of reaching net zero by 2045, note the researchers.

Asthma is poorly controlled in around half of those with the condition in the UK and Europe, increasing the risk of hospital admission and severe illness as well as healthcare costs. 

To gauge the environmental footprint of asthma care in the UK, the researchers retrospectively analysed the anonymised health records of 236,506 people with asthma whose data had been submitted to the Clinical Practice Research Datalink between 2008 and 2019.

Greenhouse gas (GHG) emissions, measured as carbon dioxide equivalent (CO2e), were estimated for asthma-related medication use, healthcare resource utilisation and severe exacerbations during follow-up of patients with asthma.

Well controlled asthma was categorised as no episodes of severe worsening symptoms and fewer than 3 prescriptions of short-acting beta-agonist (SABA) reliever inhalers in a year.

Poorly controlled asthma was categorised as 3 or more SABA canister prescriptions or 1 or more episodes of severe worsening symptoms in a year.

A severe exacerbation of asthma was defined as worsening symptoms requiring a short course of oral corticosteroids, an emergency department visit, or hospitalisation.

Excess GHG emissions due to suboptimal asthma control included 3 or more SABA canisters per year, severe exacerbations, and any GP visits within 10 days of hospitalisation or an emergency department visit.
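For clarity, the control definitions above can be written as a simple rule. The sketch below is illustrative only, and the field names are hypothetical rather than taken from the study.

```python
# Minimal sketch, not the study's code: encode the asthma-control definitions above.
# Field names (saba_canisters_per_year, severe_exacerbations_per_year) are hypothetical.
def asthma_control(saba_canisters_per_year: int, severe_exacerbations_per_year: int) -> str:
    """Classify a patient-year as well or poorly controlled per the stated definitions."""
    if saba_canisters_per_year >= 3 or severe_exacerbations_per_year >= 1:
        return "poorly controlled"
    return "well controlled"

assert asthma_control(2, 0) == "well controlled"
assert asthma_control(3, 0) == "poorly controlled"
assert asthma_control(1, 1) == "poorly controlled"
```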

The researchers calculated that the overall carbon footprint attributed to asthma care, when scaled to the entire UK asthma population, added up to 750,540 tonnes CO2e/year.

Asthma was poorly controlled in just under half (47%; 111,844) of the patients. And poorly controlled asthma contributed to excess greenhouse gas emissions of 303,874 tonnes CO2e/year—equivalent to emissions from more than 124,000 homes in the UK, they estimate. The excess GHG emissions were 8-fold higher on average for a person with poorly controlled asthma than in the well controlled asthma patients.

Ninety percent of the excess GHG emissions came from inappropriate SABA use, with the remainder mostly due to healthcare resource utilisation, such as the GP and hospital visits required to treat severe worsening symptoms.

When GHG emissions related to all aspects of asthma care, including routine prescribing and management, were taken into account, a person with poorly controlled asthma generated on average 3-fold higher greenhouse gas emissions than a person with well controlled asthma.

The researchers acknowledge various limitations to their findings, including that the study results were largely descriptive in nature. And factors other than the level of asthma symptom control, such as prescribing patterns, may also have contributed to high SABA use.

But they nevertheless write: “Our study indicates that poorly controlled asthma contributes to a large proportion of asthma-care related greenhouse gas emissions with inappropriate SABA use emerging as the single largest contributor.”

The Global Initiative for Asthma no longer recommends SABA used alone as the preferred reliever for acute asthma symptoms, they add.

The authors conclude that efforts to improve asthma treatment practices including curtailing inappropriate SABA use and implementing evidence-based treatment recommendations, could result in substantial carbon savings.


 

The West is best to spot UFOs


Most sightings of Unidentified Anomalous Phenomena occur in the American West where proximity to public lands, dark skies and military installations afford more opportunities to see strange objects in the air.

Peer-Reviewed Publication

UNIVERSITY OF UTAH

Image: UAP hotspots. Hotspot analysis of reported sightings from 2001 to 2020. Credit: Medina, Brewer & Kirkpatrick. Sci Rep (2023)



“This [Tic Tac-shaped object] had just traveled 60 miles in…less than a minute, was far superior in performance to my brand-new F/A-18F and did not operate with any of the known aerodynamic principles that we expect for objects that fly in our atmosphere.” 

In July of 2023, retired commander in the U.S. Navy David Fravor testified to the House Oversight Committee about a mysterious, Tic Tac-shaped object that he and three others observed over the Pacific Ocean in 2004. The congressional hearings riveted the world by bringing Unidentified Anomalous Phenomena (UAP) out of the “alien truther” realm and into the mainstream. 

As sensor technology has advanced and personal aircraft use has skyrocketed, strange events in the sky have become harder to explain. The U.S. Department of Defense increasingly treats UAP, formerly known as Unidentified Flying Objects (UFOs), as a serious threat to national security. 

A new study led by University of Utah geographers attempts to understand if local environmental factors increase or decrease the number of sighting reports. The authors used data from the National UFO Research Center, and included approximately 98,000 total sighting reports over a 20-year period, from 2001 to 2020. For each county in the contiguous U.S., the researchers analyzed two conditions: Sky view potential, which refers to the area’s light pollution, cloud cover and tree canopy cover; and the potential for objects to be present in the sky, meaning the proximity to airports and military installations.

The majority of sightings were in western parts of the U.S. due to the region’s physical geography—lots of wide-open spaces and dark skies. UAP-reporting hotspots had credible relationships with air traffic and military activity, suggesting that people are spotting real objects, but not recognizing what they are.

“The idea is that if you have a chance to see something, then it's more likely that you're going to see unexplained phenomena in the sky,” said Richard Medina, associate professor of geography at the University of Utah and lead author of the study. “There’s more technology in the sky than ever before so the question is: What are people actually seeing? It’s a tough question to answer, and it is an important one because any uncertainty can be a potential threat to national security.”

Understanding the environmental context of these sightings will make it easier to find explanations for their occurrence and help identify truly anomalous objects that are a legitimate threat.

The paper was published on Dec. 14, 2023, in the journal Scientific Reports.

Hot spots and cold spots

The authors looked at the number of sightings per 10,000 people per county and identified significant clusters of low numbers of reports (cold spots) and high numbers of reports (hot spots). Far more sightings were reported in the West and in the far Northeast, along with some isolated areas. The cold spots were in the central plains and the Southeast. All results except for cloud cover supported the general hypothesis that people will see things when there’s an opportunity.
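The study used formal spatial hot spot statistics; as a simplified, non-spatial illustration of the underlying rate calculation, the sketch below computes sightings per 10,000 residents by county and flags the highest- and lowest-rate counties. The column names are hypothetical.

```python
# Simplified sketch: per-capita sighting rates by county and a crude high/low flag.
# This is only an illustration of the rate calculation, not the study's spatial
# cluster analysis. Column names ("sightings", "population") are hypothetical.
import pandas as pd

def sighting_rates(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    df["rate_per_10k"] = 10_000 * df["sightings"] / df["population"]
    # Flag the top and bottom 10% of counties by rate as candidate hot/cold spots.
    hi, lo = df["rate_per_10k"].quantile([0.9, 0.1])
    df["flag"] = "neither"
    df.loc[df["rate_per_10k"] >= hi, "flag"] = "hot"
    df.loc[df["rate_per_10k"] <= lo, "flag"] = "cold"
    return df
```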

“The West has a historical relationship to UAP—Area 51 in Nevada, Roswell in New Mexico and here in Utah we have Skinwalker Ranch in the Uinta Basin and military activity in the U.S. Army Dugway Proving Ground,” said Medina. “Plus, there’s a robust outdoor community that recreates in public lands year-round. People are out and looking skyward.”

Traditional academia has mostly avoided UAP research because of the stigma of flying saucers and space invaders. Yet people around the world continue to spot unexplainable objects in the sky. What little research exists tends to rely on firsthand accounts or look for cultural and psychological explanations, which limits the ability to analyze patterns over a large area.

Additionally, legitimate data sources and questionable accounts have limited rigorous study. The authors note that the National UFO Research Center’s data is a public, self-reporting system with no real way to verify hoaxes. However, the authors assert that if the data were entirely invalid due to some psychological and sociological cause, then there would be no spatial pattern. But there is.

“There are many factors that can contribute to the report of anomalous objects,” said Simon Brewer, associate professor of geography at the U and co-author of the study. “By examining the spatial distribution of reports and how they relate to the local environment, we hope to provide some geographical context that may help resolve or understand reports by both the public and in military settings.”

Roswell, X-Files and Starlink

In July of 2022 the U.S. Deputy Secretary of Defense, in coordination with the Director of National Intelligence, directed the establishment of the All-domain Anomaly Resolution Office (AARO) as the single authoritative UAP office to lead and synchronize a whole government approach to the issue. Earlier UAP tracking efforts include Project BLUE BOOK, a U.S. Air Force-led project that investigated UFO sightings between 1947 and 1969. Project BLUE BOOK’s most famous report is the Roswell, New Mexico, incident alleging that a flying saucer crashed in the desert town on July 8, 1947, and its alien occupants were recovered by government officials. Many Roswell residents witnessed the unexplainable event, which may have led to the flurry of flying-saucer-sightings that swept the nation. Silence from government officials led to wild speculation of otherworldly visitors and subsequent coverups. Later, the U.S. Air Force disclosed that the incident was caused by a classified, multi-balloon project to detect Soviet nuclear tests.

Many UAP sightings have a natural explanation—the planet Venus is a regular culprit, for example. The last few years have seen a boost in UAP reports, likely related to the exponential growth in spacecraft launches and orbiters, such as the Starlink satellite-train blazing across the night sky and the ubiquity of personal drones. The challenge is to parse which reports signal a real threat. 

The authors are exploring whether there are temporal considerations for fluctuations in sightings based on sociocultural triggers. For example, were there more reports after the congressional hearings in July of 2023, or after a SpaceX launch? They’re also investigating whether sociocultural factors influence UAP sightings—is there a spike in reports after a show like “X-Files” becomes popular? Are some cultures more likely to see UAPs because of their beliefs?

“The U.S. government—the military, intelligence and civil agencies—needs to understand what is in the operating domains to ensure the safety and security of the nation and its people,” said Sean Kirkpatrick, first director of the AARO, adjunct assistant professor of physics at the University of Georgia and co-author of the study. “Unknowns are unacceptable in this age of ubiquitous sensors and data availability. The scientific community has a responsibility to investigate and educate.”

Image: Timeline of National UFO Research Center reported sightings from 2001 to 2020. Credit: Medina, Brewer & Kirkpatrick. Sci Rep (2023)


The study, “An environmental analysis of public UAP sightings and sky view potential,” can be found here: Medina, R.M., Brewer, S.C. & Kirkpatrick, S.M. An environmental analysis of public UAP sightings and sky view potential. Sci Rep 13, 22213 (2023). https://doi.org/10.1038/s41598-023-49527-x