Saturday, January 04, 2025

 

Can patient groups remain independent with drug company funding?


Action is needed to safeguard the independence of patient organizations and charities – new study finds



University of Bath

Patient organisations and charities are at risk of aligning their interests with their corporate funders even when this doesn't benefit their members, according to a new study led by the University of Bath in the UK. To combat this, the researchers are calling for action to safeguard the independence of patient groups.

The new research – led by Dr Piotr Ozieranski from the Department of Social and Policy Sciences at Bath – focuses on pharmaceutical industry payments to patient organisations in Poland, comparing the funding model followed by this central European country to those found in Western Europe and North America.

Patient organisations in the US and across Europe are increasingly dependent on funding from drug companies. In Europe, this happens against a backdrop of reduced government funding of patient groups and dwindling funding for the health sector more broadly.

Explaining why this global trend matters, Dr Ozieranski – who studies the transparency of interactions between drug companies, patient organisations and the health service – said: “Patient organisations have established themselves as key stakeholders in the health policy domain, playing an important role in raising awareness of diseases and supporting patients with specific diseases, and there’s an assumption that they are independent and represent their members’ interests alone”.

“Pharmaceutical companies are eager to fund their work, but this comes with strings attached – these same companies manufacture drugs designed for the very people who have turned to the patient organisation for independent support.”

Hidden conflict of interest

Dr Ozieranski said this clear conflict of interest between drug companies and patient organisations is not always apparent at first glance.

He said: “Companies play the long game – they don’t want anything obvious in return for their funding, but over time, they build closer ties with the patient organisations. As their relationship deepens, the two bodies run shared projects and conferences, and may provide joint testimony to scientific advisory bodies.

“In other words, drug companies shape and mould patient organisations over time. This can be concerning as patient organisations should represent the interests of patients, their carers and families, which are not necessarily the same as those of product manufacturers.”

The new study, conducted by the University of Bath together with Lund University, Sweden and Kozminski University, Poland, and published this week in the International Journal of Social Determinants of Health and Health Services, adds further nuance to these trends. It does so by providing an in-depth examination of financial ties between pharmaceutical companies and patient organisations in Poland, the largest country in Central Europe.

How to make things better

Dr Ozieranski and his collaborators, Dr Marta Makowska (from Kozminski University) and Dr Shai Mulinari (from Lund University), propose three core strategies to mitigate the risks of undue influence being exerted on patient groups by drug-company funders – not only in Poland but also in other countries, including the UK:

  • Create a central shared pool of funding supported by all companies, so patient organisations avoid becoming overly reliant on one, or a small number of, donor corporations to continue their work. This fund would be managed by an independent body responsible for evaluating and approving project proposals from patient organisations, ensuring greater separation and impartiality.
  • Follow the example set by Poland, where taxpayers are invited to allocate 1.5% of their income tax to a specific patient organisation, to fund its activities. To qualify, organisations must register as 'public benefit organisations', a status that mandates transparent reporting of the funds they receive as well as other activities undertaken.
  • Replace drug company funding with other funding sources whenever possible. Some patient organisations may find it impossible to cut their ties with drug companies entirely. However, every step taken in that direction is valuable, including finding other sources of support or gradually withdrawing from industry funding, which can start with non-essential, smaller-scale activities.

All funding collected in a single database

There is concern that governments around the world pay little attention to trends in patient group funding, allowing companies to set and execute the rules for disclosing such payments. Essentially, companies are free to ‘mark their own homework’.

To bolster the independence of patient groups, Dr Makowska calls for each country to establish a simple, state-run, state-mandated database of all payments made to patient organisations, doctors and hospitals.

She said: “At the moment, there’s no such place, and as you might expect, companies are not always truthful about the funding they disclose.”

Dr Mulinari added: “Transparency of drug company funding of patient organisations needs improvement.

"When conducting this study, we had to find and download over 200 disclosure reports published separately by different drug companies. We then integrated them into a single payments database and painstakingly cleaned the data, which involved ensuring that each patient organisation appears under a single name, and not several names used by different companies. This should be the work done by drug companies and not researchers, let alone patients or members of the public.

“We have seen this pattern over and over again in many European countries. It is unreasonable that this amount of investigative work should be needed to achieve even a modest degree of transparency of ties between key actors in today’s healthcare.”

ENDS.

Background

The Poland case study

Poland has one of the biggest and most rapidly growing pharmaceutical markets in Europe, boosted since the country’s EU accession in 2004. Increased pharmaceutical investment has gone hand in hand with expanding collaborations between patient organisations and the industry, as documented in the new study.

  • Between 2012 and 2020, funding provided by 33 drug companies to 273 patient organisations more than tripled, from €775,225 to €2,419,271, amounting in total to €13,729,644.
  • The study also found that the funding was heavily concentrated, with the top 10 recipients amassing almost half of the total.
  • Many patient organisations in Poland form exclusive or nearly exclusive financial ties with one, or a few, companies. This clearly shows how strong these relationships are and suggests increased risks of undue influence by these companies. Notably, more than half of the organisations were tied to just one company, and nearly a fifth worked exclusively with two. For example, one of the big patient organisations featured in the new study was the largest recipient of funding from drug companies in the three years covered by the study period, with all of its funding coming from a single big drug company – Pfizer. This patient organisation leads vaccination campaigns while the drug company is a vaccines manufacturer, lending weight to fears of conflicting interests.

Evidence of influence

A stark example of drug companies benefiting from donating funds to patient-advocacy groups was seen in the US between 1996 and 2008 when the National Alliance on Mental Illness (NAMI) received US$45 million from pharmaceutical industry giants, with significant contributions from Eli Lilly, Pfizer and Bristol-Myers Squibb.

In a 2006 lawsuit, NAMI was accused of lobbying aggressively for the wider use of new antipsychotics, such as Zyprexa – a drug produced by Eli Lilly – despite evidence suggesting serious side effects, including weight gain, diabetes and metabolic issues. At the time, cheaper drugs were available to treat psychosis; although these also carried risks, the newer drugs did not demonstrate clear superiority in terms of efficacy.

Zyprexa was the leading antipsychotic in the world in 2000, capturing nearly 40% of the global antipsychotic market.

A US Senate investigation in 2009 highlighted that pharmaceutical funding significantly influenced NAMI’s operations.

Background ENDS.

 

Johns Hopkins APL modeling tool affirms critical role of testing in pandemic response



Johns Hopkins University Applied Physics Laboratory



The COVID-19 pandemic highlighted how crucial testing is for disease preparedness and response, and new research from the Johns Hopkins Applied Physics Laboratory (APL) and a team of collaborators underscores that principle.

Published in the Jan. 2 edition of The Lancet Public Health, the research included simulation and analysis suggesting that public-private partnerships to develop, produce and distribute COVID-19 diagnostic tests saved an estimated 1.4 million lives and prevented about 7 million hospitalizations in the United States during the pandemic.

APL, based in Laurel, Maryland, teamed with the Administration for Strategic Preparedness and Response (ASPR), the U.S. Centers for Disease Control and Prevention, and consultants from MITRE Corporation on the study.

“The analysis found that the early development, manufacturing and distribution of tests significantly reduced severe COVID-19 outcomes,” said Gary Lin, a computational epidemiologist at APL and a study co-author. “Through modeling and simulation, we’ve shown how national coordination can effectively leverage resources and capabilities.”

APL researchers developed a digital twin prototype — a virtual simulation environment — to model the testing and diagnostic supply chain. The tool was used to simulate baseline scenarios and assess the effects of potential pandemic interventions.

“The digital twin helps us quantitatively understand the impact and consequences of disruptions and changing infection levels on test availability,” said Elizabeth Currier, the APL digital twin project manager. “It can also evaluate the impact of policies and investments and be used in planning and evaluating supply needs, aiding in response and ensuring a secure supply chain for future medical crises.”

The prototype model integrated diverse data sources, including manufacturing, retail and government stockpile information as well as wastewater and inpatient data, which enabled the team to assess complex scenarios. It simulated infectious disease case forecasts to capture demand for tests, along with test production and supply and distribution logistics.
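APL has not released the model itself, but the basic bookkeeping a testing supply-chain digital twin performs can be illustrated with a short sketch: case counts drive demand for tests, production and a stockpile meet what they can, and any shortfall is recorded. The function and all parameter values below are illustrative assumptions, not APL's model.

```python
def simulate_test_supply(weekly_cases, tests_per_case=4.0,
                         weekly_capacity=30e6, initial_stockpile=20e6):
    """Toy supply-chain loop: weekly test shortfall given case-driven demand."""
    stockpile = initial_stockpile
    unmet_per_week = []
    for cases in weekly_cases:
        demand = cases * tests_per_case          # tests requested this week
        available = weekly_capacity + stockpile  # new production plus reserve
        shipped = min(demand, available)
        stockpile = available - shipped          # unshipped tests carry over
        unmet_per_week.append(demand - shipped)  # shortfall, if any
    return unmet_per_week

# A demand surge that briefly outstrips production plus the stockpile
weekly_cases = [5e6, 10e6, 20e6, 15e6, 8e6]
print(simulate_test_supply(weekly_cases))
```

A real digital twin layers forecasting, multiple test types and distribution logistics on top of this kind of balance, but the interventions it evaluates (more capacity, larger stockpiles, faster distribution) act on the same quantities.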

Between January 2020 and December 2022, government efforts produced more than 6.7 billion COVID-19 tests in the United States. These included laboratory tests, point-of-care tests and over-the-counter tests, with more than 2.7 billion tests performed in U.S. laboratories, in health care facilities or at home.

“The findings underscore the importance of robust and rapid test development, production and distribution to address future public health threats,” Currier said. “The insights gained from integrating data go beyond responding to COVID-19: They prepare us for future pandemics with a scalable framework to allocate resources effectively.”

APL’s digital twin modeling has since expanded to monitor nationwide testing for COVID-19, influenza, respiratory syncytial virus (RSV) and other public health threats under an all-hazards approach.

 

Shedding light on the impacts of solar farms on deserts through the emerging field of “energy meteorology”




Institute of Atmospheric Physics, Chinese Academy of Sciences
Image: A solar farm in the desert. Credit: Coimbra Research Group, 2018



Rooftop solar panels are popular in many parts of the world, but power is generated much more cheaply and efficiently in utility-scale solar farms. Often, though, these large-scale solar farms are deployed in desert habitats, which contain native flora and fauna that are highly sensitive to changes in temperature and humidity.

In a study recently published in Advances in Atmospheric Sciences, Professor Carlos Coimbra of the University of California San Diego outlines in detail the thermal balances between solar farm panels and the surrounding environment, which can then be used to examine the thermal effects of solar plants on desert habitats and vice versa.

The work falls under the umbrella of the emerging field of “energy meteorology”, which in its broadest sense covers any effect that weather has on power generation, transmission and distribution systems. In Professor Coimbra’s study, energy meteorology is restricted to solar power generation, but its usual scope is expanded to include not only the effects of weather on solar power plants but also the reverse, i.e., the effect of solar power plants on the local environment.

The ability to calculate in detail the thermal balances of solar panels, which are characterized according to their main material components, allows relationships to be derived between difficult-to-measure flow-dependent variables such as the mean convective heat transfer coefficients and radiative fluxes to and from the panels. These relationships can then be exploited through measurements or model estimates to develop a more complete and consistent picture of solar farm thermal effects on the local environment.
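As a rough illustration of the kind of steady-state balance involved, the sketch below solves for the panel temperature at which absorbed sunlight equals convective losses, net thermal radiation and electrical output. The single-node formulation and every property value are illustrative assumptions, not Professor Coimbra's formulation.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def panel_temperature(G=900.0, T_air=308.0, T_sky=290.0,
                      absorptivity=0.9, emissivity=0.85,
                      h_conv=15.0, eta_elec=0.18):
    """Steady-state panel temperature (K) from a simple energy balance:
    absorbed solar = convection + net thermal radiation + electrical output."""
    def residual(T_p):
        absorbed = absorptivity * G
        convective = h_conv * (T_p - T_air)
        radiative = emissivity * SIGMA * (T_p**4 - T_sky**4)
        electrical = eta_elec * G
        return absorbed - convective - radiative - electrical

    lo, hi = T_air - 50.0, T_air + 150.0  # bracket the root in kelvin
    for _ in range(60):                   # bisection
        mid = 0.5 * (lo + hi)
        if residual(lo) * residual(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

print(f"Panel temperature of roughly {panel_temperature() - 273.15:.0f} degrees C")
```

In the study, balances of this type are developed in detail for panels characterized by their main material components, which is what allows hard-to-measure convective coefficients and radiative fluxes to be related to observable quantities.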

In addition, the study puts forward a method that can classify regional microclimates in terms of the effective optical depth of the cloudy atmosphere. Such a classification can provide resourcing information that complements the monthly, daily, or hourly averaged values of cloudiness or clearness indices for shortwave radiation used in the design, siting and management of solar power plants.
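The study's optical-depth classification itself is not reproduced here, but the clearness index it is intended to complement is straightforward to compute: measured global horizontal irradiance is divided by its extraterrestrial counterpart and binned. In the sketch below, the eccentricity correction is a standard Spencer-type approximation and the classification thresholds are assumed values, not those of the paper.

```python
import math

SOLAR_CONSTANT = 1361.0  # W/m^2, mean extraterrestrial irradiance

def clearness_index(ghi, solar_zenith_deg, day_of_year):
    """Hourly clearness index k_t = measured GHI / extraterrestrial GHI."""
    b = 2.0 * math.pi * (day_of_year - 1) / 365.0
    # Truncated Spencer-type Earth-Sun distance (eccentricity) correction
    ecc = 1.00011 + 0.034221 * math.cos(b) + 0.00128 * math.sin(b)
    cos_z = math.cos(math.radians(solar_zenith_deg))
    if cos_z <= 0.0:
        return float("nan")  # sun below the horizon
    return ghi / (SOLAR_CONSTANT * ecc * cos_z)

def classify(kt):
    """Assumed sky-condition bins, for illustration only."""
    if kt < 0.3:
        return "overcast"
    if kt < 0.6:
        return "partly cloudy"
    return "clear"

kt = clearness_index(ghi=650.0, solar_zenith_deg=35.0, day_of_year=172)
print(f"k_t = {kt:.2f} -> {classify(kt)}")
```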

“It behooves us in the solar energy research community to answer concerns and criticisms that the solar power industry encounters with the best possible science. It could very well be that the net thermal impact of large-scale power plants is minimal, or even benign, but the conflicting results reported in the research literature point toward the need to study the problem from the standpoint of fundamental thermal balances,” explains Professor Coimbra.

In this respect, the work reported in this paper stands as an attempt to motivate solar engineers and energy meteorologists alike to push forward with studying and assessing the environmental impacts of large-scale solar farms. Essentially, the analysis is intended to serve as a research primer for those interested in exploring new research opportunities in energy meteorology applied to solar farms.

The study is included in a recently published special collection on solar energy meteorology.

 

Co-management of protected areas by NGOs and African countries helps reduce deforestation



INRAE - National Research Institute for Agriculture, Food and Environment
Image: Virunga National Park. Credit: INRAE - S. Desbureaux



Sub-Saharan Africa is home to 13% of the world’s biodiversity and approximately 20% of its forests. Protected areas play an essential role in protecting biodiversity and ecosystems. Since the first protected area, Virunga National Park, was established in 1925 in the Democratic Republic of the Congo, several thousand parks have been created. But a structural lack of funding, limited management capacity, and weak institutions and governance complicate these areas’ mission to protect wildlife and habitats effectively.

To alleviate these difficulties, innovative management models have been set up over the last 20 years: states and NGOs co-manage parks through public-private partnerships. These collaborative management partnerships (CMPs) can even delegate the full management of thousands of square kilometres of territory in one or several states to national or international NGOs. Another specificity of CMPs is their duration: the collaborations are set up for several decades (around 25-30 years, even 40 years in certain cases), whereas NGOs usually support projects for 2 to 5 years. CMPs facilitate substantial, long-term funding (P. Lindsey et al., 2021, Biological Conservation), making it possible, for example, to recruit and train staff and park rangers, and to build infrastructure that helps local populations reduce their dependence on park resources and improve their living conditions (power stations around the Garamba and Virunga Parks, roads, tourism infrastructure, etc.).

The researchers examined this change in approach to see whether these investments have made protected areas more effective. Their study identified 127 partnerships across 16 countries in Sub-Saharan Africa in 2023, involving 48 NGOs, 21 of which are national and 27 international. These areas cover almost 1 million square kilometres, nearly twice the size of France. The researchers assessed the impact of CMPs by comparing the rate of tree cover loss before and after they were established. Their results show that CMPs have reduced deforestation by an average of 55% in protected areas. They are particularly effective in protected areas under a high level of anthropogenic pressure, where the reduction in deforestation reaches 66%.
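The study's impact estimates rest on careful before-and-after comparisons; purely as a toy illustration of the headline quantity, an average percentage reduction in annual tree-cover loss across parks can be computed as follows, with hypothetical numbers rather than the study's data.

```python
def average_reduction(loss_before, loss_after):
    """Mean percent reduction in annual tree-cover loss across parks."""
    reductions = [(b - a) / b for b, a in zip(loss_before, loss_after) if b > 0]
    return 100.0 * sum(reductions) / len(reductions)

# Three hypothetical parks (annual % of tree cover lost before/after a CMP)
before = [0.80, 1.20, 0.50]
after = [0.35, 0.60, 0.20]
print(f"Average reduction: {average_reduction(before, after):.0f}%")
```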

These results illustrate that long-term CMPs between governments and non-governmental organizations can be part of the solution to improving biodiversity protection. However, the duration of CMPs requires states to implement monitoring mechanisms to assess their impact. In addition, future research will need to determine whether the improvement in environmental conditions benefits the populations living near these parks.

 

From CO2 to acetaldehyde: Towards greener industrial chemistry



Ecole Polytechnique Fédérale de Lausanne
Image: Copper-cluster catalysts on activated carbon. Credit: Cedric Koolen (EPFL)



Acetaldehyde is a vital chemical used in making everything from perfumes to plastics. Today, its production largely relies on ethylene, a petrochemical. But increasing environmental concerns are pushing the chemical industry to reduce its reliance on fossil fuels, so scientists have been searching for greener ways to produce acetaldehyde.

Currently, acetaldehyde is produced through the so-called “Wacker process”, a chemical synthesis method that uses ethylene from oil and natural gas together with other chemicals such as strong acids, e.g. hydrochloric acid. The Wacker process not only has a large carbon footprint but is also resource-heavy and unsustainable in the long run.

A promising solution to this problem is the electrochemical reduction of carbon dioxide (CO2) into useful products. As CO2 is a waste product that contributes to global warming, this approach tackles two environmental issues at once: it reduces CO2 emissions and creates valuable chemicals.

An innovative catalyst for greater efficiency

Copper-based catalysts have shown potential for this transformation, but so far, they’ve struggled with low selectivity — which means that they produce a mixture of products rather than the desired acetaldehyde.

Now, scientists from a public-private consortium, led by Cedric David Koolen in the group of Andreas Züttel at EPFL, Jack K. Pedersen at the University of Copenhagen, and Wen Luo at Shanghai University, have developed a novel copper-based catalyst that can selectively convert CO2 into acetaldehyde with an impressive efficiency of 92%.

The breakthrough, published in Nature Synthesis, provides a greener and more sustainable way to produce acetaldehyde, and could replace the Wacker process. Moreover, the catalyst is scalable and cost-effective, opening the door for industrial applications.

“The Wacker process effectively hasn’t changed in the past 60 years. It is still based on the same basic chemistry. The time was ripe for a green breakthrough,” says Koolen.

“Fascinating chemistry”

The researchers began by synthesizing tiny clusters of copper particles, each about 1.6 nanometers in size, using a method called spark ablation. This technique, which involves vaporizing copper electrodes in an inert gas environment, allowed the scientists to precisely control particle size. The copper clusters were then immobilized on carbon supports to create a stable and reusable catalyst.

In the lab, the team tested the catalyst’s performance by running it through a series of electrochemical reactions with CO2 in a controlled environment. Using a synchrotron — a large-scale facility that generates a very bright light source — the team confirmed, via a technique called X-ray absorption spectroscopy, that the copper clusters were actively converting CO2 to acetaldehyde.

The results were remarkable. The copper clusters achieved 92% selectivity for acetaldehyde at a relatively low voltage, which is essential for energy efficiency. In a 30-hour stress test, the catalyst demonstrated high stability, maintaining its performance across multiple cycles. The researchers also found that the copper particles retained their metallic nature throughout the reaction, which contributes to the catalyst’s longevity.
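In CO2 electrolysis, selectivity of this kind is typically reported as Faradaic efficiency: the fraction of the electrical charge passed that ends up in the target product. The sketch below shows that bookkeeping for acetaldehyde, a ten-electron product; the charge and product amounts are made-up numbers, not data from the paper.

```python
FARADAY = 96485.0  # coulombs per mole of electrons

def faradaic_efficiency(product_mol, electrons_per_molecule, total_charge_C):
    """Fraction of the charge passed that went into the target product."""
    return electrons_per_molecule * product_mol * FARADAY / total_charge_C

# CO2 -> acetaldehyde (CH3CHO) is a 10-electron reduction:
#   2 CO2 + 10 H+ + 10 e- -> CH3CHO + 3 H2O
charge_passed = 50.0           # coulombs passed during the run (illustrative)
acetaldehyde_formed = 4.77e-5  # mol of product detected (illustrative)
fe = faradaic_efficiency(acetaldehyde_formed, 10, charge_passed)
print(f"Faradaic efficiency: {fe:.0%}")
```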

“What was really surprising to us was that the copper remained metallic, even after removal of the potential and exposure to air,” says co-lead author Wen Luo. “Copper usually oxidizes like crazy, especially copper that small. But in our case, an oxide shell formed around the cluster, protecting the core from further oxidation. This explains the recyclability of the material. Fascinating chemistry.”

The keys to success

Why did the new catalyst work so well? Computational simulations showed that the copper clusters feature a specific configuration of atoms that promotes CO2 molecules to bond and transform in a way that favors the production of acetaldehyde over other possible products, like ethanol or methane.

“The great thing about our process is that it can be applied to any other catalyst system,” says co-lead author Jack K. Pedersen. “With our computational framework, we can quickly screen clusters for promising characteristics. Whether it’s for CO2 reduction or water electrolysis, with spark ablation we can produce the new material with ease and directly test it in the lab. This is so much faster than your typical test-learn-repeat cycle.”

The new copper catalyst is a significant step toward greener industrial chemistry. If scaled up, it could replace the Wacker process, reducing the need for petrochemicals and cutting down on CO2 emissions. Since acetaldehyde is a building block for many other chemicals, this research has the potential to transform multiple industries, from pharmaceuticals to agriculture.

Other contributors

  • Empa Materials Science & Technology
  • University of Copenhagen
  • VSPARTICLE
  • Paul Scherrer Institute, Switzerland
  • EPFL Environmental Engineering Institute
  • Technical University of Delft
  • Shanghai University

Reference

Koolen, C. D., Pedersen, J. K., Zijlstra, B., Winzely, M., Zhang, J., Pfeiffer, T. V., Vrijburg, W., Li, M., Agarwal, A., Akbari, Z., Kuddusi, Y., Herranz, J., Safonova, O. V., Schmidt-Ott, A., Luo, W., Züttel, A. Scalable synthesis of Cu cluster catalysts via spark ablation for the highly selective electrochemical conversion of CO2 to acetaldehyde. Nature Synthesis 03 January 2025. DOI: 10.1038/s44160-024-00705-3

Image: Cedric David Koolen showing an example electrochemical cell used in the Laboratory of Materials for Renewable Energy. Credit: EPFL

Image: Cedric David Koolen, co-lead author of the study, posing with the spark ablation device used to generate the clusters in the Laboratory of Materials for Renewable Energy. Credit: EPFL

 

Growing divide: Agricultural climate policies affect food prices differently in poor and wealthy countries



Potsdam Institute for Climate Impact Research (PIK)





“In high-income countries like the U.S. or Germany, farmers receive less than a quarter of food spending, compared to over 70 percent in Sub-Saharan Africa, where farming costs make up a larger portion of food prices,” says David Meng-Chuen Chen, PIK scientist and lead author of the study published in Nature Food. “This gap underscores how differently food systems function across regions.” The researchers project that as economies develop and food systems industrialise, farmers will increasingly receive a smaller share of consumer spending, a measure known as the ‘farm share’ of the food dollar.

“In wealthy countries, we increasingly buy processed products like bread, cheese or candy where raw ingredients make up just a small fraction of the cost,” adds Benjamin Bodirsky, PIK scientist and author of the study. “The majority of the price goes to processing, retail, marketing and transport. This also means that consumers are largely shielded from fluctuations in farm prices caused by climate policies such as taxes on pollution or restrictions on land expansion, but it also underscores how little farmers actually earn.”

Examining the full food value chain to uncover climate policy impacts

To arrive at these conclusions, the team of scientists combined statistical and process-based modelling to assess food price components across 136 countries and 11 food groups. They studied prices of food both consumed at home and away from home. "Most models stop at farm costs, but we went all the way to the grocery store and even the restaurant or canteen,” says Chen. By analysing the entire food value chain, the researchers also provide new insights into how greenhouse gas mitigation policies impact consumers: “Climate policies aimed at reducing emissions in agriculture often raise concerns about rising food prices, particularly for consumers. Our analysis shows that long supply chains of modern food systems buffer consumer prices from drastic increases, especially in wealthier countries,” explains Chen.

Climate policies impact consumers differently in wealthy and poor countries

“Even under very ambitious climate policies with strong greenhouse gas pricing on farming activities, the impact on consumer prices by the year 2050 would be far smaller in wealthier countries,” Bodirsky says. Consumer food prices in richer countries would be 1.25 times higher with climate policies, even if producer prices are 2.73 times higher by 2050. In contrast, lower-income countries would see consumer food prices rise by a factor of 2.45 under ambitious climate policies by 2050, while producer prices would rise by a factor of 3.3. Although consumer price rises are less pronounced than producer price rises even in lower-income countries, they would still make it harder for people there to afford sufficient and healthy food.
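The buffering follows from simple arithmetic: if only the farm-share portion of the consumer price moves with producer prices, the consumer-price multiplier stays well below the producer-price multiplier. The stylized calculation below uses assumed farm shares to illustrate the mechanism; it is not the study's model, whose multipliers come from full value-chain simulations across 136 countries.

```python
def consumer_price_multiplier(farm_share, producer_multiplier):
    """Stylized pass-through: only the farm-share part of the consumer price
    scales with producer prices; processing, retail, marketing and transport
    costs are assumed unchanged."""
    return farm_share * producer_multiplier + (1.0 - farm_share)

# Illustrative farm shares in the spirit of the article, not exact study values
print(consumer_price_multiplier(0.20, 2.73))  # wealthy country: ~1.35x
print(consumer_price_multiplier(0.70, 3.30))  # lower-income country: ~2.61x
```

The study's own multipliers (1.25 and 2.45) come from a richer model, but the qualitative pattern is the same: the smaller the farm share, the more consumers are shielded from producer-price increases.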

Despite food price inflation, poor consumers do not necessarily need to suffer from climate mitigation policies. A previous PIK study (Soergel et al. 2021) showed that if revenues from carbon pricing were used to support low-income households, these households would be better off overall despite food price inflation, because the transfers would raise their incomes.

“Climate policies might be challenging for consumers, farmers, and food producers in the short term, but they are essential for safeguarding agriculture and food systems in the long run,” says Hermann Lotze-Campen, Head of Research Department “Climate Resilience” at PIK and author of the study. “Without ambitious climate policies and emission reductions, much larger impacts of unabated climate change, such as crop harvest failures and supply chain disruptions, are likely to drive food prices even higher. Climate policies should be designed to include mechanisms that help producers and consumers to transition smoothly, such as fair carbon pricing, financial support for vulnerable regions and population groups, and investments in sustainable farming practices.”


Article
David Meng-Chuen Chen, Benjamin Bodirsky, Xiaoxi Wang, Jiaqi Xuan, Jan Philipp Dietrich, Alexander Popp, Hermann Lotze-Campen (2025): Future food prices will become less sensitive to agricultural market prices and mitigation costs. Nature Food. DOI: 10.1038/s43016-024-01099-3