Wednesday, December 24, 2025

 

How changing your diet could help save the world




University of British Columbia





For many of us, the holiday season can mean delightful overeating, followed by recriminatory New Year’s resolutions.

But eating enough and no more should be on the menu for all of us, according to a recent UBC study. It found that 44 per cent of us would need to change our diets for the world to warm no more than 2 C.

Dr. Juan Diego Martinez, who led the research as a doctoral student at UBC’s Institute for Resources, Environment and Sustainability, discusses the study’s findings and the simple dietary changes we can all make.

What did you find?

Half of us globally and at least 90 per cent of Canadians need to change our diets to prevent severe planetary warming. And that number is conservative, because we used 2012 data. Since then, emissions and the world’s population have both increased. Looking ahead to 2050, we found that 90 per cent of us will need to be eating differently.

We looked at data from 112 countries, accounting for 99 per cent of food-related greenhouse gas emissions globally, and divided each country’s population into 10 income groups. For each person, we estimated food-related emissions from consumption, global food production and supply chains, and compared them to a per-person budget: that person’s share of the total food emissions the world can afford if we want to stay below 2 C of warming.
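To make the budget arithmetic concrete, here is a minimal sketch in Python of the kind of comparison described above. Every number is an invented placeholder, not a figure from the study, which works from per-group data for 112 countries, each split into 10 income groups.

```python
# Minimal sketch of the per-person food-emissions budget comparison.
# All numbers below are invented placeholders, not figures from the study.

GLOBAL_FOOD_BUDGET_T = 5.0e9   # hypothetical 2 C-compatible food budget (t CO2e/yr)
WORLD_POPULATION = 7.1e9       # roughly the 2012 world population

# Fair per-person share of the global food-emissions budget (t CO2e/person/yr)
per_person_budget = GLOBAL_FOOD_BUDGET_T / WORLD_POPULATION

# Hypothetical population groups: (number of people, per-person emissions t CO2e/yr)
groups = [
    (1.0e9, 2.4),   # highest emitters
    (2.5e9, 1.1),   # middle emitters
    (3.6e9, 0.4),   # lowest emitters
]

# Anyone whose food emissions exceed the fair share would need to change diet
over_budget = sum(pop for pop, emissions in groups if emissions > per_person_budget)
print(f"Per-person budget: {per_person_budget:.2f} t CO2e/yr")
print(f"Share of population over budget: {over_budget / WORLD_POPULATION:.0%}")
```

With these made-up inputs, roughly half the population lands over budget; the study reports that kind of headline share from its real data.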

Why focus on dietary changes rather than, say, flying less?

The world’s food systems are responsible for more than one-third of all human greenhouse gas emissions.

We found that the 15 per cent of people who emitted the most account for 30 per cent of total food emissions, equaling the contribution of the entire bottom 50 per cent. This select group consists of the wealthiest people in high-emission countries, including the Central African Republic, Brazil and Australia.

Even though this group emits a lot, a far greater number of people have diets that put them above that per-person budget. This is why half of the global population, not just the richest, needs to change diets. In Canada, all 10 income groups are above the budget.

Debates around flying less, driving electric and buying fewer luxury goods are valid: We need to reduce emissions any way we can. However, food emissions are not just a problem for the richest—we all need to eat, so we can all make a change. For people who are both flying frequently and eating lots of beef, it’s not an either/or: Try to reduce both.

What changes can we make to our diets?

Eat only what you need. Repurpose what you don’t. Less wasted food means fewer emissions, less cooking and more easy, tasty leftovers.

Eliminate or reduce your beef consumption—43 per cent of food-related emissions from the average Canadian come from beef alone. We could have had our beef and eaten it too if we’d followed the agreements laid out in the Kyoto Protocol, but we’re now at a point where food emissions also need to fall to avoid the worst of climate change.

I grew up in Latin America where eating a lot of beef is part of the culture, so I get how much of an ask this is. But we just can’t deny the data anymore.

Vote with your fork. This is a first step to demand change from your political leaders. The more we talk about our own dietary changes and what matters to us, the more politicians will begin to care about policies that bring positive changes to our food systems.

School meals could unlock major gains for human and planetary health




University College London




Healthy, sustainable school meals could cut undernourishment, reduce diet-related deaths and significantly lower environmental impacts, according to a new modelling study led by a UCL (University College London) researcher.

The study is part of a new collection of papers published in Lancet Planetary Health by members of the Research Consortium for School Health and Nutrition – the independent research initiative of the School Meals Coalition. The papers find that well-designed school meal programmes could be a strategic investment in a healthier, more sustainable future.

Drawing together modelling, case studies and evidence from multiple disciplines, the six-paper collection demonstrates how planet-friendly school meal programmes can simultaneously improve child nutrition, reduce the prevalence of long-term diet-related illness, lessen climate and environmental pressures, and stimulate more resilient, agrobiodiverse food systems.

School meals: a strategic investment in human and planetary health

Global food systems are responsible for a third of human-induced greenhouse gas emissions while also contributing to rising malnutrition and diet-related diseases. At the same time, national school meal programmes feed 466 million children every day, representing 70% of the global public food system – a scale that provides governments unparalleled leverage.

A global modelling study, led by Professor Marco Springmann, modelling lead for the Research Consortium based at UCL’s Institute for Global Health, finds that providing a healthy, sustainable meal to every child by 2030 could:

  • Reduce global undernourishment by 24%, with particularly strong impacts in food-insecure regions. This translates to 120 million fewer people in the world not getting enough vitamins, minerals, and energy from food
  • Prevent over 1 million deaths every year from diet-related illnesses such as diabetes and coronary heart disease, assuming today’s schoolchildren retain, at least in part, a preference for healthy foods into adulthood
  • Halve food-related environmental impacts, including emissions and land use, when meals follow healthy, sustainable dietary patterns, for instance by increasing the proportion of vegetables and reducing meat and dairy products
  • Generate major health and climate savings, significantly offsetting investment needs

Currently, only one in five children in the world receives a school meal.

Professor Springmann said: “Our modelling shows that healthy and sustainable school meals can generate substantial health and environmental gains in every region of the world. Importantly, the climate and health savings that result from healthier diets and lower emissions can help offset the costs of expanding school meal programmes. The evidence is clear: investing in school meals is both effective and economically sound.”

A framework for transforming food systems

To support governments to transition to planet-friendly school meal programmes, the collection sets out a conceptual framework for how school meals can drive systemic food systems transformation at scale, structured around four essential pillars:

  • Healthy, diverse, culturally relevant school menus
  • Clean, modern cooking methods
  • Reduced food loss and waste
  • Holistic food education that connects children, families and communities

Together, these pillars offer governments a pathway to improve child health and food literacy, strengthen agrobiodiversity, stimulate ecological local production and build climate-resilient food systems. Crucially, the framework emphasises that these pillars must be embedded in public procurement rules, nutrition standards and policy reforms to unlock their full potential and shift demand towards healthier and more sustainable food systems.

Dr Silvia Pastorino, Diets & Planetary Health Lead for the Research Consortium and curator of the collection based at the London School of Hygiene & Tropical Medicine (LSHTM), said: “This framework highlights how school meals are not just a nutrition programme – they are a powerful lever for transforming food systems. When meals are healthy, sustainable and linked to food education, they improve children’s wellbeing today and foster long-term sustainable habits, while helping countries protect biodiversity, reduce emissions and build resilient communities. Few interventions deliver such wide-ranging, long-lasting benefits.”

The framework builds on insights first published in the Research Consortium’s 2023 White Paper, School Meals and Food Systems, which brought together 164 authors from 87 organisations worldwide, also coordinated by Dr Pastorino.

Food, learning, energy, and biodiversity

To further explore each of the four pillars laid out in the framework, the wider Lancet Planetary Health collection includes:

  • A viewpoint from FAO (Food and Agriculture Organization of the United Nations) on integrating food education into learning to build lifelong sustainable habits
  • A personal view from a Loughborough University team on the critical role of clean, reliable energy in delivering safe, planet-friendly meals
  • A scoping review from Alliance Bioversity-CIAT on the importance of agrobiodiversity in providing nutritious, climate-resilient school menus
  • A personal view from an Imperial College London team on promoting regenerative agriculture, agrobiodiversity, and food security through school feeding

From evidence to action: supporting governments to implement planet-friendly policies

In partnership with international organisations and government partners, the Research Consortium is now developing a Planet-Friendly School Meals Toolkit to help countries assess the costs, environmental impacts and health benefits of shifting to sustainable school meal models. The toolkit is being co-created with partners in Kenya and Rwanda, and the first results are expected in spring 2026.

New study uncovers the lingering, evolving threat of "thirdhand smoke" in our homes

AH COME ON, YOU KNEW THIS WAS COMING


Institute of Atmospheric Physics, Chinese Academy of Sciences
Image (the thirdhand smoke exposure experiment): A wool sweater exposed to thirdhand smoke is suspended within a custom-built Teflon film chamber for controlled ozone oxidation experiments. Credit: Liu Yang





While the dangers of secondhand smoke are widely recognized, a new study reveals a more persistent and stealthier hazard hiding in our indoor spaces: thirdhand smoke. This residue, which lingers on surfaces long after a cigarette is put out, is not just a stale smell but a dynamic source of continuous air pollution, according to research recently published in Building and Environment by a team from the Chinese Academy of Sciences.

Thirdhand smoke refers to the toxic mix of chemicals that cling to walls, furniture, carpets, curtains, and other surfaces after tobacco smoke clears. Unlike secondhand smoke, which you breathe in directly, this residue can continuously re-emit pollutants back into the air and undergo chemical changes over time, creating a long-term exposure risk.

A research team led by Professor Sun Yele from the Institute of Atmospheric Physics has, for the first time, mapped the real-time evolution of thirdhand smoke in indoor environments. Using advanced air monitoring technology, the team tracked both tiny particles and invisible gases released from contaminated surfaces.

The findings reveal a stark difference between secondhand and thirdhand smoke. While secondhand smoke particles fade quickly, thirdhand smoke maintains a stable, low-level presence in the air for extended periods. More importantly, the chemical "fingerprint" of the particles changes, indicating that the residue becomes more nitrogen-rich and potentially more harmful over time.

The study also identifies how our homes trap this pollution. Porous materials like wool carpets or fabric upholstery act as deep "reservoirs," soaking up tobacco chemicals. These reservoirs then slowly release pollutants, making them resistant to simple airing out and prolonging contamination for hours or even days.

The research outlines a three-stage release pattern: an initial burst of gases, followed by an hours-long phase of steady release of nicotine-related compounds, and finally a long, slow "tail" of other chemicals.
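One simple way to picture such a tri-phasic pattern, offered purely as an illustration rather than as the model used in the study, is a sum of three exponential decays with very different time constants:

```python
import math

# Toy tri-phasic release model: three overlapping exponential decays.
# Amplitudes and time constants are invented for illustration; they are
# not fitted values from the study.

def emission_rate(t_hours,
                  burst=100.0, tau_burst=0.2,    # fast initial burst of gases
                  steady=10.0, tau_steady=6.0,   # hours-long nicotine-related phase
                  tail=1.0, tau_tail=72.0):      # long, slow tail of other chemicals
    """Total re-emission rate (arbitrary units) t hours after smoking stops."""
    return (burst * math.exp(-t_hours / tau_burst)
            + steady * math.exp(-t_hours / tau_steady)
            + tail * math.exp(-t_hours / tau_tail))

for t in (0.1, 1, 6, 24, 72):
    print(f"t = {t:5.1f} h   rate = {emission_rate(t):8.2f}")
```

Early on the burst term dominates; after a few hours the nicotine-related term takes over; after a day only the slow tail remains, which is why simple airing out does not finish the job.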

"The key takeaway is that thirdhand smoke is not a static stain; it's an active and ongoing source of pollution in a room," said Professor Yele Sun, the study's corresponding author. "The smoking session might end, but the release of hazardous compounds continues, exposing people to low levels of toxins long after the fact. This turns our homes into environments of chronic, low-dose exposure."

This work provides critical scientific evidence for public health guidelines, highlighting the need to consider thirdhand smoke in tobacco control policies and indoor air quality standards. It offers new chemical markers to detect this lingering contamination.

A new theory of biological computation might explain consciousness



Estonian Research Council
Image (biological computation is complex): In conventional computing, we can draw a clean line between software and hardware. In brains, there is no such separation of different scales. In the brain, everything influences everything else, from ion channels to electric fields to circuits to whole-brain dynamics. Credit: Borjan Milinkovic






Right now, the debate about consciousness often feels frozen between two entrenched positions. On one side sits computational functionalism, which treats cognition as something you can fully explain in terms of abstract information processing: get the right functional organization (regardless of the material it runs on) and you get consciousness. On the other side is biological naturalism, which insists that consciousness is inseparable from the distinctive properties of living brains and bodies: biology isn’t just a vehicle for cognition, it is part of what cognition is. Each camp captures something important, but the stalemate suggests that something is missing from the picture.

In our new paper, we argue for a third path: biological computationalism. The idea is deliberately provocative but, we think, clarifying. Our core claim is that the traditional computational paradigm is broken or at least badly mismatched to how real brains operate. For decades, it has been tempting to assume that brains “compute” in roughly the same way conventional computers do: as if cognition were essentially software, running atop neural hardware. But brains do not resemble von Neumann machines, and treating them as though they do forces us into awkward metaphors and brittle explanations. If we want a serious theory of how brains compute and what it would take to build minds in other substrates, we need to widen what we mean by “computation” in the first place.

Biological computation, as we describe it, has three defining properties.

First, it is hybrid: it combines discrete events with continuous dynamics. Neurons spike, synapses release neurotransmitters, and networks exhibit event-like transitions, yet all of this is embedded in evolving fields of voltage, chemical gradients, ionic diffusion, and time-varying conductances. The brain is not purely digital, and it is not merely an analog machine either. It is a layered system where continuous processes shape discrete happenings, and discrete happenings reshape continuous landscapes, in a constant feedback loop.
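A standard toy model makes this hybrid character concrete: the leaky integrate-and-fire neuron, in which a continuously evolving membrane voltage is punctuated by discrete spike events that reset the continuous state. The sketch below uses generic textbook parameters and is our illustration, not a model taken from the paper:

```python
# Leaky integrate-and-fire neuron: a textbook example of hybrid computation.
# The membrane voltage evolves continuously; when it crosses a threshold, a
# discrete spike event fires and resets the continuous state. All parameters
# are generic textbook values, not taken from the paper.

dt = 0.1              # integration step (ms)
tau_m = 10.0          # membrane time constant (ms)
v_rest = -70.0        # resting potential (mV)
v_thresh = -54.0      # spike threshold (mV)
v_reset = -80.0       # post-spike reset potential (mV)
input_drive = 18.0    # constant input (mV of effective drive)

v = v_rest
spike_times = []
for step in range(int(200 / dt)):   # simulate 200 ms
    # Continuous dynamics: leaky integration of the input
    v += dt / tau_m * (v_rest - v + input_drive)
    # Discrete event: a threshold crossing emits a spike and resets v
    if v >= v_thresh:
        spike_times.append(round(step * dt, 1))
        v = v_reset

print(f"{len(spike_times)} spikes at (ms): {spike_times}")
```

The loop shows the feedback described above: the continuous trajectory determines when discrete events fire, and each event reshapes the continuous trajectory in turn.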

Second, it is scale-inseparable. In conventional computing, we can draw a clean line between software and hardware, or between a “functional level” and an “implementation level.” In brains, that separation is not clean at all. There is no tidy boundary where we can say: here is the algorithm, and over there is the physical stuff that happens to realize it. The causal story runs through multiple scales at once, from ion channels to dendrites to circuits to whole-brain dynamics, and the levels do not behave like modular layers in a stack. Changing the “implementation” changes the “computation,” because in biological systems, those are deeply entangled.

Third, biological computation is metabolically grounded. The brain is an energy-limited organ, and its organization reflects that constraint everywhere. Importantly, this is not just an engineering footnote; it shapes what the brain can represent, how it learns, which dynamics are stable, and how information flows are orchestrated. In this view, tight coupling across levels is not accidental complexity. It is an energy optimization strategy: a way to produce robust, adaptive intelligence under severe metabolic limits.

These three properties lead to a conclusion that can feel uncomfortable if we are used to thinking in classical computational terms: computation in the brain is not abstract symbol manipulation. It is not simply a matter of shuffling representations according to formal rules, with the physical medium relegated to “mere implementation.” Instead, in biological computation, the algorithm is the substrate. The physical organization does not just support the computation; it constitutes it. Brains don’t merely run a program. They are a particular kind of physical process that performs computation by unfolding in time.

This also highlights a key limitation in how we often talk about contemporary AI. Current systems, for all their power, largely simulate functions. They approximate mappings from inputs to outputs, often with impressive generalization, but the computation is still fundamentally a digital procedure executed on hardware designed for a very different computational style. Brains, by contrast, instantiate computation in physical time. Continuous fields, ion flows, dendritic integration, local oscillatory coupling, and emergent electromagnetic interactions are not just biological “details” we might safely ignore while extracting an abstract algorithm. In our view, these are the computational primitives of the system. They are the mechanism by which the brain achieves real-time integration, resilience, and adaptive control.

This does not mean we think consciousness is magically exclusive to carbon-based life. We are not making a “biology or nothing” argument. What we are claiming is more specific: if consciousness (or mind-like cognition) depends on this kind of computation, then it may require biological-style computational organization, even if it is implemented in new substrates. In other words, the crucial question is not whether the substrate is literally biological, but whether the system instantiates the right class of hybrid, scale-inseparable, metabolically (or more generally energetically) grounded computation.

That shift changes the target for anyone interested in synthetic minds. If the brain’s computation is inseparable from the way it is physically realized, then scaling digital AI alone may not be sufficient. Not because digital systems can’t become more capable, but because capability is only part of the story. The deeper challenge is that we might be optimizing the wrong thing: improving algorithms while leaving the underlying computational ontology untouched. Biological computationalism suggests that to engineer genuinely mind-like systems, we may need to build new kinds of physical systems: machines whose computing is not layered neatly into software on hardware, but distributed across levels, dynamically coupled, and grounded in the constraints of real-time physics and energy.

So, if we want something like synthetic consciousness, the problem may not be, “What algorithm should we run?” The problem may be, “What kind of physical system must exist for that algorithm to be inseparable from its own dynamics?” What are the necessary features—hybrid event–field interactions, multi-scale coupling without clean interfaces, energetic constraints that shape inference and learning—such that computation is not an abstract description laid on top, but an intrinsic property of the system itself?

That is the shift biological computationalism demands: moving from a search for the right program to a search for the right kind of computing matter.