Saturday, March 28, 2026

SPACE/COSMOS

 

Space creature or humble potato? What NASA astronaut Don Pettit’s latest photo actually shows

NASA astronaut Don Pettit recently shared a picture of a purple potato he had grown in space on the International Space Station.
Copyright NASA

By Indrabati Lahiri
Published on 

As technologies for growing food in space gain popularity, NASA astronaut Don Pettit is shedding more light on his space farming hobby.

NASA astronaut Don Pettit recently shared a picture of an odd, purple, egg-shaped object with “tentacles” from the International Space Station (ISS) on the social media platform X.

While you could be forgiven for thinking that this was some kind of alien creature, it is in fact only a potato that Pettit grew in space.

Although the colour is not commonly seen, potatoes can be purple, mainly due to high levels of anthocyanins.

In a post on X, Pettit shared more about his space food-growing hobby.

“Spudnik-1, an orbiting potato on the International Space Station. I flew potatoes on Expedition 72 for my space garden, an activity I did in my off-duty time. This is an early purple potato, complete with a spot of hook Velcro to anchor it in my improvised grow light terrarium,” Pettit said.

“Potatoes are one of the most efficient plants based on edible nutrition to total plant mass (including roots). Recognised by Andy Weir in his book/movie "The Martian," potatoes will have a place in future exploration of space. So I thought it good to get started now," he added.

Pettit has captured several photos from his four space journeys since the first one in 2002, with a total of 590 days spent in orbit.

Some of his photos include comets such as C/2023 A3 and C/2024 G3, along with aurora displays seen from space, such as the October 2024 display.

The rise of food-growing in space

Developing technology to grow food in space, especially on Mars and the Moon, has become a significant focus for a number of space agencies in the last few years.

These include NASA, the European Space Agency (ESA), the German Aerospace Centre (DLR) and Japan Aerospace Exploration Agency (JAXA), among others.

This is mainly to enable long-duration missions to Mars and the Moon, as well as permanent settlements, where sending all the food required from Earth would be practically impossible.

NASA has already successfully grown lettuce and other leafy greens, as well as peppers, through its Veggie and Advanced Plant Habitat (APH) space agriculture programmes.

On the other hand, the ESA focuses more on bioregenerative systems, which involve growing food from microorganisms and stem cells, along with lab-grown food.

The DLR is also focusing on automated greenhouse techniques, which it uses to further its space farming efforts as well as its studies in Antarctica.

Some key technologies include hydroponics, which involves growing plants in nutrient-rich water instead of soil, along with bioreactors that use yeast or bacterial fermentation to produce protein.


Scientists solve decades-long mystery about why Saturn appears to change its spin



Northumbria University

Image: context_saturn_asymmetric_temperatures.png — shows the asymmetric temperature structure revealed in the paper, as it was observed from JWST. These are offset from where the currents flow into and out of the planet, but ultimately, the winds generated by this temperature offset are what drive those currents.

Credit: NASA/ESA/CSA, Tom Stallard (Northumbria University), Melina Thévenot, Macarena Garcia Marin (STScI/ESA).





Researchers at Northumbria University have used the most powerful space telescope ever built to answer one of the longest-standing puzzles in planetary science – why does Saturn appear to spin at a different speed depending on how you measure it?

The findings, published in the Journal of Geophysical Research: Space Physics, reveal for the first time the complex patterns of heat and electrically charged particles in Saturn's aurora, and show that the entire system is driven by a self-sustaining feedback loop powered by the planet's own northern lights.

Saturn has puzzled scientists for many years. Measurements taken by NASA's Cassini spacecraft in 2004 suggested the planet's rotation rate was slowly changing over time – yet this should not have been possible, as a planet cannot simply speed up or slow down its spin.

In 2021, a study led by Tom Stallard, Professor of Planetary Astronomy at Northumbria University, showed that the mystery did not actually involve Saturn's rotation at all. Instead, the apparent changes were being driven by winds in the planet's upper atmosphere, which were producing electrical currents that created the misleading auroral signal.

However, the findings raised a further question for the research team – if atmospheric winds were responsible for the effect, what was causing those winds?

New research by Professor Stallard and colleagues across the UK and US has now provided the first direct evidence of the answer.

Using the James Webb Space Telescope (JWST), the team observed Saturn's northern auroral region – the equivalent of Earth's northern lights – continuously for a full Saturnian day, capturing detailed measurements that were simply not possible with any previous instrument.

By analysing the infrared glow from a molecule called trihydrogen cation, which forms in Saturn's upper atmosphere and acts as a natural thermometer, the researchers were able to produce the first high-resolution maps of both temperature and particle density across Saturn's auroral region.
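The “natural thermometer” works through a standard Boltzmann line-ratio calculation: the relative intensity of two emission lines from the same molecule depends on temperature, so their ratio encodes it. The sketch below uses invented line energies, not the actual H₃⁺ transitions analysed in the study; it only illustrates the principle.

```python
import math

# Boltzmann line-ratio thermometry: the intensity ratio of two emission
# lines with different upper-state energies depends on temperature, so
# measuring the ratio lets you solve for T. All line parameters here are
# illustrative placeholders, not the H3+ lines used in the paper.

K_B = 1.380649e-23       # Boltzmann constant, J/K
HC = 1.98644586e-25      # h*c, J*m

def line_ratio(T, E1_cm, E2_cm, w1=1.0, w2=1.0):
    """Intensity ratio I1/I2 for upper-state energies E1_cm, E2_cm
    (in cm^-1) and combined degeneracy/strength weights w1, w2."""
    dE = (E1_cm - E2_cm) * 100.0 * HC   # energy difference in joules
    return (w1 / w2) * math.exp(-dE / (K_B * T))

def temperature_from_ratio(r, E1_cm, E2_cm, w1=1.0, w2=1.0):
    """Invert the ratio for temperature (requires E1_cm != E2_cm)."""
    dE = (E1_cm - E2_cm) * 100.0 * HC
    return -dE / (K_B * math.log(r * w2 / w1))

# Hypothetical pair of lines 1000 cm^-1 apart in upper-state energy:
r = line_ratio(400.0, 3000.0, 2000.0)
print(round(temperature_from_ratio(r, 3000.0, 2000.0)))  # recovers 400 K
```

The catch, as the article notes, is measurement error: small uncertainties in the line intensities translate into large uncertainties in the recovered temperature, which is why the tenfold improvement in precision mattered.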

The level of detail was extraordinary. Previous measurements had errors of around 50 degrees Celsius, roughly on a par with the differences the scientists were trying to detect, and were produced by combining broad regions of the hot polar aurora. The new JWST data was ten times more accurate than previous measurements, allowing the team to map fine details of heating and cooling across Saturn's auroral region for the very first time.

What the team found was that these temperature and density patterns match remarkably well with predictions made by computer models more than a decade ago, but only if the source of heat is placed exactly where the main auroral emissions enter the atmosphere.

This means Saturn's aurora is not just a visual display – it is actively heating the atmosphere in a specific location. That localised heating drives winds, which in turn generate the electrical currents responsible for the aurora. The aurora then heats the atmosphere again, sustaining the whole cycle.

Lead researcher Professor Tom Stallard said: “What we are seeing is essentially a planetary heat pump. Saturn's aurora heats its atmosphere, the atmosphere drives winds, the winds produce currents that power the aurora, and so it goes on. The system feeds itself.

“For decades, we knew something strange was happening with Saturn's apparent rotation rate, but we could not explain it. We then showed it was being driven by atmospheric winds, but we still did not know why those winds existed. These new observations, made possible by JWST, finally give us the evidence we needed to close that loop.”

The findings also have broader implications. The research suggests that what happens in Saturn's atmosphere directly influences conditions in its surrounding magnetosphere – the vast region of space shaped by the planet's magnetic field – which in turn feeds energy back into the system. This two-way relationship between atmosphere and magnetosphere may help explain why the effect is so stable and long-lasting.

Professor Stallard added: “This result changes how we think about planetary atmospheres more generally. If a planet's atmospheric conditions can drive currents out into the surrounding space environment, then understanding what is happening in the stratospheres of other worlds may reveal interactions we have not yet even imagined.”

The James Webb Space Telescope is the world's premier space science observatory. Webb is solving mysteries in our solar system, looking beyond to distant worlds around other stars, and probing the mysterious structures and origins of our universe and our place in it. Webb is an international program led by NASA with its partners, ESA (European Space Agency) and CSA (Canadian Space Agency).

The study was carried out by researchers from Northumbria University, alongside collaborators from Boston University, the University of Leicester, Aberystwyth University, the University of Reading, Imperial College London, Lancaster University, and Johns Hopkins University Applied Physics Laboratory. The research was supported by the Science and Technology Facilities Council (STFC).

Visit the Northumbria University Research Portal to find out more about Professor Tom Stallard’s work.

The paper, “JWST/NIRSpec reveals the atmospheric driver of Saturn's variable magnetospheric rotation rate”, was published in the Journal of Geophysical Research: Space Physics on 3 March 2026 (DOI 10.1029/2025GL118553).

Ends
 

Media descriptions:

Here, we show the asymmetric temperatures, density and intensity of the auroral ionosphere revealed in this recent research, as it was seen from JWST.

We have combined spectral imagery taken on the same day, 29 November 2024, by the JWST NIRSpec and NIRCam instruments.

The NIRSpec data were taken under programme GO-5308 (PI: Moore; co-PIs: Stallard, Melin) and were processed into these final data products by T. Stallard.

The three-colour NIRCam image of Saturn was taken under programme DD-9219 (PI: Garcia Marin) and was processed into the final three-colour image by Melina Thévenot (https://bsky.app/profile/melina-iras07572.bsky.social).

Image/movie credit: NASA/ESA/CSA, Tom Stallard (Northumbria University), Melina Thévenot, Macarena Garcia Marin (STScI/ESA).

  • context_saturns_temperatures_movie.mov shows the asymmetric temperature structure revealed in the paper, as it was observed from JWST. These are offset from where the currents flow into and out of the planet, but ultimately, the winds generated by this temperature offset are what drive those currents
  • context_saturns_h3p_density_movie.mov shows the asymmetric density structure, revealing where the auroral current was preferentially flowing into (as darker) and out of (as brighter) the planet. These are offset from the temperature peaks, but ultimately drive that temperature asymmetry
  • context_saturns_h3p_emission_movie.mov shows the auroral brightening, as has previously been observed from both Earth and in orbit around Saturn

The following images are three frames taken from the same movies at the same time, showing how these three asymmetric features are related:

  • context_saturn_asymmetric_densities.png
  • context_saturn_asymmetric_temperatures.png
  • context_saturn_asymmetric_intensity.png

Data movies:

Here, we show the asymmetric temperatures, density and intensity of the auroral ionosphere as viewed from above the auroral region, rotating to highlight how interconnected these different parameters are:

  • data_parameters.mov shows these three parameters (top row) and the difference from the median values at each latitude (bottom row) - here, red is higher and blue lower, revealing that not only the brighter regions but also the weaker regions follow very similar patterns, driven by and driving the planetary-period currents flowing into and out of the planet.

 

Taming the acid clouds: a new blueprint for breathing and fueling on Venus



Higher Education Press

Image: Overall detection block diagrams (a) and (b); instrument structure diagram (c).

Credit: Higher Education Press





Integrated system for filtering, enriching, and detecting trace gases paves the way for high-precision isotopic measurements and resource extraction from the planet's corrosive atmosphere

Venus, often regarded as Earth’s sister planet due to its comparable size and bulk composition, presents an extreme environment and a distinctive atmospheric chemistry that make it both a valuable natural laboratory for planetary science and a source of unprecedented challenges and opportunities for in situ resource utilization. In a pioneering study published in Planet (Volume 2, Issue 1), a team led by Researcher Nailiang Cao from the Anhui Institute of Optics and Fine Mechanics, Chinese Academy of Sciences, together with Professors Xiaoping Zhang and Yi Xu from the State Key Laboratory of Lunar and Planetary Sciences at Macau University of Science and Technology, proposed an integrated detection strategy combining gas filtration, enrichment, and spectroscopic analysis, offering a promising technical framework for high-precision atmospheric characterization and future resource utilization in Venus exploration missions.

At present, our understanding of the Venusian atmosphere relies primarily on decades of remote-sensing observations and a limited number of in situ measurements. It is well established that the atmosphere is dominated by carbon dioxide and contains trace amounts of water vapor and sulfur dioxide, while the presence of phosphine and ammonia remains highly debated; decisive evidence is still lacking to resolve key scientific questions concerning Venusian geological activity, the planet’s history of water loss, and even the possible existence of biosignatures. At the same time, the near-surface environment of Venus, characterized by pressures of about 90 bar, temperatures exceeding 460°C, and planet-encircling sulfuric acid clouds, poses an extraordinary challenge to any exploratory instrument. Even in currently planned missions such as NASA’s DAVINCI, the onboard tunable laser spectrometer and associated atmospheric instruments are designed to measure key gases and trace compounds during descent, yet high-sensitivity detection of multiple critical trace species and their isotopic signatures remains intrinsically constrained by measurement sensitivity and observational coverage. Although conventional remote-sensing techniques offer the advantage of large-scale global atmospheric coverage, their spectral resolution is often insufficient for high-precision retrieval of isotopic ratios involving C, H, O, N, and S.

To address these challenges, the research team proposed an integrated detection system that combines gas filtration, enrichment, and spectroscopic analysis. The first requirement is to mitigate the effects of the highly corrosive constituents in the Venusian atmosphere, whose cloud layer is dominated by sulfuric-acid aerosols and haze particles. To this end, the team designed a three-stage gradient filtration module incorporating two porous ceramic layers followed by a microporous polytetrafluoroethylene membrane. Working in concert, these multistage filters are intended to remove sulfuric-acid aerosols and solid particulates with diameters as small as 0.1 μm at efficiencies exceeding 99.99%. The module also integrates a thermal self-cleaning unit capable of continuously evaporating residual droplets and periodically removing sulfide deposits through high-temperature bakeout, thereby helping to ensure stable instrument performance during long-duration missions.

The filtered gas is then delivered to an enrichment module, a key component for the high-sensitivity detection of trace gases. Because species such as PH₃, NH₃, and H₂S are present at extremely low concentrations in the Venusian atmosphere, direct detection is often constrained by poor signal-to-noise ratio. The module therefore adopts a two-stage molecular-sieve adsorption scheme: first, a CO₂-selective sieve removes the dominant background gas to achieve preliminary enrichment of the target species; next, a high-selectivity sorbent captures and further concentrates the trace gases. This process effectively increases both target-gas abundance and spectroscopic signal strength, thereby facilitating subsequent high-precision analysis.

Finally, the spectroscopic detection module functions as the “intelligent eye” of the system. By integrating two laser spectroscopic techniques, it provides coordinated coverage from orbital remote sensing to in situ exploration. In remote-sensing mode, the system uses laser heterodyne spectroscopy, in which the target signal is mixed with solar radiation to produce a radio-frequency beat signal; subsequent narrowband filtering and Fourier transformation enable ultra-high-resolution spectral detection. The received signal is then demodulated through lock-in amplification to retrieve the absorption features of trace gases in the Venusian atmosphere, while also supporting target-region selection for lander or probe entry. For in situ exploration at 40–70 km altitude, the system employs off-axis integrated cavity output spectroscopy (OA-ICOS). The pretreated gas sample enters a high-reflectivity optical cavity, where multiple reflections produce a kilometer-scale effective path length and markedly strengthen the absorption signal. By scanning the characteristic absorption lines of isotopes such as H, N, and S, the system can retrieve gas abundances and isotopic ratios, including D/H, ¹⁵N/¹⁴N, and ³⁴S/³²S. Spectral simulations indicate that an operating pressure of about 20 mbar effectively suppresses pressure-broadening interference, achieving an optimal balance between detection sensitivity and fitting accuracy.
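A rough back-of-envelope calculation shows why a high-reflectivity cavity yields a kilometre-scale path. In integrated cavity output spectroscopy, light bounces an average of roughly 1/(1 − R) times before leaking out of mirrors with reflectivity R, so the effective absorption path is approximately L/(1 − R). The cavity length and reflectivities below are illustrative assumptions, not figures from the paper.

```python
# Effective absorption path in an integrated-cavity spectrometer:
# L_eff ~= L / (1 - R), where L is the physical cavity length and R
# the mirror reflectivity. Values below are illustrative only.

def effective_path_m(cavity_length_m, reflectivity):
    return cavity_length_m / (1.0 - reflectivity)

L = 0.5  # assumed half-metre cavity
for R in (0.999, 0.9999, 0.99995):
    print(f"R={R}: L_eff = {effective_path_m(L, R) / 1000:.1f} km")
# Even a 0.5 m cavity with R = 0.9999 gives ~5 km of effective path,
# which is why weak trace-gas absorption lines become measurable.
```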

The importance of this work resides not only in its highly integrated technological framework, but also in its effective coupling of scientific exploration with in situ resource utilization. Because the Venusian atmosphere is dominated by CO₂ and also contains sulfur-bearing species and trace water, it represents a potentially valuable resource system. Extracted water could be electrolyzed to yield oxygen and hydrogen for life support and fuel production; CO₂ could be converted electrochemically into CO and O₂ for power generation or propellant synthesis; and sulfur species such as SO₂ and H₂S could serve as chemically energetic components of a redox system. In this sense, the key gases targeted by the proposed system are simultaneously scientific tracers and potential resources for future long-duration Venus exploration.

The proposed modular design, featuring active and passive thermal control using phase-change materials to withstand the planet’s extreme temperatures, ensures compatibility with various mission architectures, including orbiters, descent probes, and potentially long-duration aerial platforms. This integrated framework promises to deliver multi-scale observations and cross-validated datasets, significantly improving the reliability of our atmospheric models. As the authors note, the rigorous laboratory validation of this system, focusing on heat-resistant materials, ultra-stable laser sources, and enhanced cavity technologies, will not only pave the way for a return to Venus but also establish a robust blueprint for resource-based exploration of other challenging worlds like Mars, Europa, and Titan, fundamentally changing how we approach the sustainable exploration of the solar system.


Shining light on lunar darkness: the network that could end the Moon’s power cut





Higher Education Press

Image: (a) A terrain-aware multi-site high-efficiency laser power beaming network on the lunar surface. (b) Distribution of received power for lunar mobile explorers before and after terrain-aware optimisation.

Credit: Higher Education Press





Harbin Institute of Technology researchers propose a new terrain-aware framework for jointly optimising coverage, connectivity, and cost, enabling the first system-level design of laser power-beaming networks for extreme exploration tasks in the Moon’s permanently shadowed regions

The Moon’s polar regions present one of the most alluring yet forbidding frontiers in human space exploration. Within the deep craters of the lunar south pole lie permanently shadowed regions (PSRs)—areas that have not seen sunlight for billions of years and which harbour valuable water ice deposits that could support future lunar bases. However, these same regions exist in perpetual darkness, with temperatures plunging below -230°C, making them inaccessible to traditional solar-powered equipment. While space agencies and commercial entities have proposed solutions ranging from fission reactors to orbital power stations, a fundamental question has remained unanswered: how can we design a practical, cost-effective energy delivery system that reliably powers exploration activities in these sun-forbidden zones?

A study published in Planet (Volume 2, Issue 1) by Professor Lifang Li and Pengzhen Guo’s team at the Harbin Institute of Technology offers a systematic research approach to this challenge. Their paper, titled “Optimal laser power beaming network for powering Lunar permanently shadowed regions: a coverage–connectivity–cost trade-off,” introduces a sophisticated terrain-aware network optimisation framework that advances laser power beaming from traditional single-link analysis to multi-station, system-level optimisation, offering a new perspective for future lunar energy infrastructure deployment. The work arrives at a critical juncture when multiple spacefaring nations are racing to establish a sustainable presence on the Moon, with NASA’s Artemis programme, China’s international lunar research station, and various commercial ventures all targeting the south pole for permanent outposts.

The fundamental challenge of lunar polar exploration lies in its paradoxical energy geography. The crater rims receive nearly continuous sunlight, making them ideal locations for solar energy harvesting and power deployment, yet the scientifically valuable crater floors—where water ice accumulates—remain in permanent darkness. Previous technical efforts have largely been limited to terrain-constrained point-to-point transmission links. Researchers have demonstrated laser power transmission over terrestrial distances, developed efficient photovoltaic converters for laser light, and proposed orbital power relay constellations. What has been lacking is a systems-level understanding of how multiple power transmission nodes can work together as a coordinated network under the triple constraints of improving effective target-area coverage, enhancing regional connectivity, and controlling infrastructure costs.

The team has tackled this optimisation problem head-on, developing a mathematical framework that treats lunar power delivery as a network design challenge rather than a simple point-to-point transmission problem. Their approach begins with realistic geography, using high-resolution topographic data from NASA’s Lunar Orbiter Laser Altimeter (LOLA) and focusing on the region near Shackleton crater. The model incorporates terrain obstruction, local illumination conditions, beam diffraction divergence, pointing errors, and lunar dust attenuation, thereby establishing a comprehensive framework for lunar laser transmission and network deployment. It is important to note that the power supply nodes in this study are not simply fixed “laser stations”; instead, the system adopts a split architecture in which fixed support platforms are responsible for power acquisition and supply, while the laser emission units can be adjusted and repositioned locally to achieve more favourable transmission conditions. Based on this framework, the team simulated how multiple emission units could transmit laser energy to receivers mounted on rovers, hoppers, or in-situ resource utilisation equipment operating in permanently shadowed areas.

The core innovation of the study lies in the first simultaneous optimisation of three key performance dimensions. Coverage ensures that more scientifically valuable PSRs can receive energy support when needed, whether for short rover traverses or long-term operation of fixed equipment. Connectivity is not simply about adding more isolated power-supply points, but about reducing fragmentation of the powered areas and creating a more continuous spatial structure, thereby lowering the risk that a mobile explorer will unintentionally leave the powered region during cross-regional movement and supporting sustained exploration tasks. Cost constraints recognise that every transmission unit, every square metre of receiver array, and every tonne of equipment delivered to the lunar surface carries a substantial price tag. By treating these three factors as interdependent variables rather than separate considerations, the team derived a terrain-aware optimised laser power-beaming network configuration that balances infrastructure scale and operational capability.
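The trade-off can be sketched, very loosely, as a budgeted maximum-coverage problem: choose stations so that as many shadowed cells as possible are powered without exceeding a cost budget. The toy below is only an illustration of that framing; the site names, coverage sets, and costs are invented, and the paper's terrain-aware framework additionally models connectivity, beam physics, and repositionable emission units.

```python
from itertools import combinations

def best_deployment(sites, budget):
    """Exhaustively pick the subset of candidate sites that covers the
    most cells within the cost budget (fine for a handful of sites;
    real instances need heuristics, as this problem is NP-hard)."""
    best_cover, best_pick = set(), ()
    names = list(sites)
    for k in range(len(names) + 1):
        for pick in combinations(names, k):
            cost = sum(sites[s]["cost"] for s in pick)
            if cost > budget:
                continue
            cover = set().union(*(sites[s]["covers"] for s in pick)) if pick else set()
            if len(cover) > len(best_cover):
                best_cover, best_pick = cover, pick
    return best_pick, best_cover

# Invented candidate rim sites; "covers" lists the shadowed cells each
# site's beam can reach given (hypothetical) terrain obstruction.
sites = {
    "rim_A": {"cost": 3, "covers": {1, 2, 3, 4}},
    "rim_B": {"cost": 2, "covers": {4, 5, 6}},
    "rim_C": {"cost": 2, "covers": {6, 7}},
    "rim_D": {"cost": 4, "covers": {1, 5, 7, 8}},
}
pick, cover = best_deployment(sites, budget=5)
print(pick, sorted(cover))
```

Connectivity changes the picture: two deployments with equal coverage can differ greatly in how fragmented the powered area is, which is why the paper treats it as a third, independent objective.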

The study’s findings offer practical decision support for lunar base planning. The research shows that terrain-aware optimised deployment can significantly improve power coverage and regional connectivity in the south polar PSRs: the effective coverage ratio increases from 10.76% to 27.55%, while regional connectivity rises from 39.93% to 98.92%. Compared with the baseline scheme, which selects sites solely on the basis of local high-illumination conditions, the optimised configuration significantly improves overall network performance while keeping infrastructure requirements under control. More importantly, the team not only optimised the station selection, but also refined the local positioning of the laser emission units, enabling previously fragmented powered areas to be connected more effectively and providing more reliable sustained energy support for mobile exploration tasks on the lunar surface.

From a technical standpoint, the research advances laser power beaming beyond the laboratory demonstrations that have characterised the field to date. Recent experiments have shown that high-efficiency semiconductor lasers can maintain stable operation across the temperature extremes expected in lunar environments, while photovoltaic receivers have demonstrated conversion efficiencies that make laser power transmission economically viable. The HIT team’s contribution synthesises these technological building blocks into an architectural framework that provides lunar base mission planners with guidance on how emission units can be deployed, how different nodes can work together, and how overall system performance can be balanced across coverage, connectivity, and cost under complex lunar terrain conditions.

The broader significance of this work extends beyond the lunar context. As space exploration moves toward permanent human presence beyond Earth, the ability to deliver power wirelessly across challenging terrain will become increasingly essential. The same optimisation principles that the team has applied to lunar craters may also be transferable to Martian canyons, asteroid mining operations, or even terrestrial applications where conventional power infrastructure is impractical. The study establishes a methodological foundation for thinking about space power networks as integrated systems rather than isolated links—a perspective that will prove invaluable as humanity’s reach into the solar system expands.

The timing of this publication aligns with a surge of interest in lunar power solutions from multiple sectors. NASA has recently accelerated its Fission Surface Power programme, while commercial entities are proposing orbital power satellite networks and tower-based laser transmission systems. Each approach has its advocates, but all share a common need for the kind of systems-level thinking that the HIT team has now provided. By establishing rigorous optimisation criteria, this research enables apples-to-apples comparisons between different power delivery architectures and provides objective guidance for the difficult investment decisions that lie ahead.

Perhaps most encouragingly, the study demonstrates that laser power beaming networks exhibit clear engineering potential, while the relevant enabling technologies continue to mature. The required laser efficiencies have been demonstrated in laboratory settings; pointing and tracking systems have achieved the necessary precision for Earth-orbital applications; and photovoltaic receivers have been tested under simulated lunar conditions. What has been missing until now is the confidence that these components can be assembled into a system that reliably meets mission requirements at acceptable cost. The team has provided that confidence through rigorous analysis and optimisation.

As spacefaring nations prepare for the next decade of lunar exploration, the question is no longer whether we can deliver power to the Moon’s darkest places, but how to do so most effectively. This study by the Harbin Institute of Technology provides a systematic design approach, advancing laser power beaming from a single-link concept to a networked solution for mission planning. For the rovers, drilling systems, and life-support systems that may one day operate in the eternal twilight of lunar craters, reliable power supply will be an essential foundation for the continued advance of deep-space exploration.

A million new SpaceX satellites will destroy the night sky — for everyone on Earth
Hanno Rein
March 25, 2026 
THE CONVERSATION


Hurricane Milton as seen from the International Space Station (Screen cap via NASA)

More than 10,000 Starlink satellites currently orbit the Earth. We see them crawling across dark skies, no matter how remote our location, and streaking through images from research telescopes.

SpaceX recently announced that it wants to launch one million more of these satellites as orbital data centres for AI computing power.

A few years ago, we wrote a paper predicting what the night sky would look like with 65,000 satellites from four planned megaconstellations: SpaceX’s Starlink, Amazon’s Kuiper (now Leo), the U.K.’s OneWeb and China’s Guowang. We calibrated our models to observations of real Starlink satellites and came up with a startling prediction: One in 15 visible points in the night sky would be a satellite, not a star.


A million satellites would be so much worse.


The human eye can see fewer than 4,500 stars in an unpolluted night sky. If we permit SpaceX to launch these satellites, we will see more satellites than stars — for large portions of the night and the year, throughout the world. This will severely damage the night sky for everyone on Earth.

SpaceX’s proposal also completely fails to account for atmospheric pollution, collision risk or how to develop the technology needed to disperse waste heat from orbital data centres.
Predicting the night sky

SpaceX has filed its million-satellite proposal to the United States Federal Communications Commission (FCC) and has only provided bare-bones information about these new satellites so far.

We do know that the proposed constellation will have satellites in much higher orbits, making them visible for longer periods of the night.

We decided to build an updated simulation, drawing on the website of astrophysicist Jonathan McDowell. Our simulation includes a set of orbits consistent with the limited information in SpaceX’s filing.

We used the observed brightness of Starlink satellites as a reference, scaling the brightness model by considering size jumps between Starlink V1, V2 and predictions for V3, and assuming even higher complexity and power requirements.

There are many factors we don’t know anything about, so there is some uncertainty in the brightness we predict.
Predictions for satellite brightness and positions comparing SpaceX’s proposed one-million-satellite AI data centres with a previously approved 42,000-satellite megaconstellation. (Lawler et al. 2022), CC BY-NC-ND

In the figure above, each grey circle shows a simulation of the full night sky, as seen from latitude 50 degrees north at midnight on the summer solstice.

The left circle shows the night sky with SpaceX’s orbital data centres (SXODC), and the right shows the night sky with 42,000 Starlink satellites for comparison.

The coloured points show the positions and brightness of satellites in the sky, with blue the faintest and yellow the brightest. Below each all-sky simulation we list the number of sunlit satellites in the sky (Ntot) and the number of naked-eye visible satellites (Nvis), with tens of thousands predicted for SXODC.

Each of our simulations shows there will be more visible satellites than stars for large portions of the night and the year.

It is hard to overstate this: Should a million new satellites be launched, in the orbits and with the sizes proposed, the stars we are able to see at night would be completely overwhelmed by artificial satellites — throughout the world.

This does not even account for additional large satellite system proposals filed to the International Telecommunication Union (ITU) in recent years by numerous national governments.

A satellite crematorium

SpaceX’s proposal is that these new satellites will operate as orbital data centres.

Data centres on the ground are drawing increasing criticism for the huge amounts of water and electricity they use. In an impressive feat of greenwashing, SpaceX suggests that launching data centres into orbit is better for the environment. This is only true if you ignore all the consequences of satellite launch, orbital operations and re-entry.

We can already measure atmospheric pollution from “re-entries,” when satellites fall back to Earth. We know that multiple satellites are falling every day, and that if they do not fully burn up on re-entry, debris reaches the ground with a risk of injury and death.

Increasing densities of satellites also drive up collision risks in orbit. And using the atmosphere as a satellite crematorium is changing the atmosphere in ways we don’t yet understand.

Practically, it is not at all clear whether the proposed orbital data centres are feasible any time soon. To operate data centres in orbit, they would need to disperse huge amounts of waste heat. Despite the greenwashing, this is very hard to do in space: the satellites would have to withstand intense solar radiation while shedding their own heat purely by radiating it away.

SpaceX should know this well: one of the first brightness mitigations it tested for Starlink was “darksat,” a Starlink satellite effectively just painted black. The satellite overheated and its electronics fried.

A slap in the face for astronomers


SpaceX has done a lot of engineering work to make its Starlink satellites fainter. They are still too bright for research astronomy, but thanks to new coatings, their brightness has not increased dramatically even as SpaceX has launched larger and larger satellites.

SpaceX’s proposal for one million AI data centre satellites with enormous power requirements does not include any discussion of the co-ordination agreement for dark and quiet skies required by the FCC.

It feels like a slap in the face after many astronomers have spent years working with SpaceX on ways to mitigate their Starlink megaconstellation and save the night sky.
Orbital space is a finite resource

The SpaceX filing does not include exact orbits, the size or shape of the satellites, or the casualty risk from de-orbiting (other than a vague promise that it won’t exceed 0.01 per cent per satellite). Nor does it include any information on how the company plans to develop the technology that does not currently exist but is needed to make this plan work.

Despite how shockingly little information SpaceX provided, the FCC accepted SpaceX’s filing and opened the comment period within four days. Astronomers and dark sky advocates worldwide scrambled to write and submit comments in the short four weeks that the comment period was open.

The scientific process is slow and careful and it often takes months or years to publish a peer-reviewed result. Companies like SpaceX have stated repeatedly that their method is to “move fast and break things.” They are now close to breaking the atmosphere, the night sky and anything on the ground or in space that their satellites and rockets fall on or crash into.

Earth’s orbital space is a finite resource. There is an evolving set of international guidelines for operating in outer space, grounded in a set of high-level international rules. Yet, those rules and guidelines are inadequate.

One corporation based in one country should not be allowed to ruin orbit, the night sky, and the atmosphere for everyone else in the world.
Why is the US going back round the Moon with Artemis II? A space policy expert explains

Gemma Ware,
 The Conversation
March 27, 2026 


Intuitive Machines' Athena lander in circular orbit around the Moon. (AFP)

Final preparations are underway for NASA’s Artemis II mission, the first crewed mission around the Moon for more than 50 years. Four astronauts, three men and one woman, will spend 10 days aboard the Orion spacecraft, going further into space than any other humans as they orbit the Moon and return to Earth.

Issues caused by a fuel leak while testing the Space Launch System rocket used for the mission meant launch windows in February and March were missed. Now NASA is targeting early April for launch.

The mission is the next step of the Artemis programme, which aims to land astronauts back on the Moon by 2028. China has its own programme targeting a crewed mission to the lunar surface by 2030.



In this episode of The Conversation Weekly podcast, we speak to Scott Pace, director of the Space Policy Institute at George Washington University, about why NASA is sending people back round the Moon. Pace worked on space policy in the George W. Bush administration, followed by a stint at NASA before his appointment as executive secretary of the National Space Council during the first Trump administration, where he worked on the launch of the Artemis programme.

No human has set foot on the Moon since Gene Cernan climbed back aboard Apollo 17 in 1972. Pace says that once the Americans had beaten the Russians to the Moon “the geopolitical reason for continuing those missions really wasn’t there”.

Today, Pace believes the “geopolitical purpose for being on the Moon is to be there a lot”. He compares the Moon to Antarctica, arguing that the US and its allies have influence over Antarctica in part because they put 3,000 people on the ice every summer. “Rules are made by people who show up,” he says. It matters to him if China beats the US back to the Moon, “if China drives all the standards and the operating norms”.

For Pace, this means it is important to increase the flight rate to the lunar surface by building the capacity to send more than one crewed mission a year. He thinks Artemis’s partnerships with commercial space companies will be crucial to achieving this.

“What we’re seeing now with Artemis is NASA and industry learning how to fly to the Moon, and then making a decision about what will be a sustainable future for doing this,” says Pace. “That is a current debate that will shape what happens after Artemis II.”

 

‘Sometimes an adult should shut up and go away’: scientists reveal the qualities that kids need in play



For better play, give children space to choose and accessible games, scientists say



Frontiers





If children need good play to have a good childhood, then we need to know what good play looks like. But studies of play often start from an adult perspective, leaving out children’s own views. To overcome this, scientists surveyed schoolchildren about play and used statistical analysis to identify the themes that came up most often. While some components of ‘good play’ seem to depend on individual preferences, others could be universal. 

“Perhaps we have made the first steps to describing the magic and intangible thing we call play in a very new way,” said Dr Andreas Lieberoth of Aarhus University, lead author of the article in Frontiers in Psychology. “This can help educators and carers to nurture different sorts of play, even if they’re not ‘done right’, ‘educational’, or even ‘nice’ according to adult standards.” 

“Adults should stop explaining to children how they ought to play,” said co-author Dr Hanne Hede Jørgensen of VIA University College, “and put faith in children’s ability to work things out. We don’t make space for either good or bad play — and we must make space for both, because good play to one child might be bad to another.” 

In their own words 

The scientists started by interviewing 104 children about play. Using these interviews, they identified recurring elements that described what made play ‘bad’ or ‘good’, and developed a list of 83 statements from the interviews that represented these recurring elements. They then asked 504 other children to recall either a good or a bad play experience and rate it by saying whether they agreed or disagreed with the different statements.  

Using principal component analysis, the team then identified two sets of important elements in play: seven critical factors that were generally applicable to as many play experiences as possible, and 22 elements that captured a wider variety of experiences. Because a scale based on the 22 elements would be too long for practical use in research, they used the first seven factors to form the ‘play qualities inventory’. These were social inclusion, imagination, transgression, accessibility, wild and exciting play, having something to do, and something the scientists named ‘play feeling’. This last factor explained more variation in good or bad play experiences than any other.  

“If you have ever felt it, you know what it means,” said Lieberoth. “You know it when you see it, like love, evil, or fun. In the words of the kids, it’s an experience that feels ‘just totally perfect’, where maybe you ‘just laugh’ or ‘get a smile on your mouth’. When the feeling is not there, play is ‘annoying’, ‘boring’, or maybe you ‘think the rules should be different’.” 

Fun and games 

High levels of accessibility and play feeling were usually present in good play, but the other five themes could appear in good play or bad play. Importantly, good play experiences weren’t always ones that supervising adults might consider nice.  

“In many cases, good play will have no transgression,” said Lieberoth. “But in some cases, what really makes play fun and special is the ability to go nuts, tease each other, and generally flout the norms of society — or the playground.” 

The scientists also learned that disharmony makes play bad. Losing social alignment with other children turned good play into bad play.  

“Some of the factors we discovered showed the anti-play kryptonite many of us can recognize from childhood or painfully awkward team-building exercises,” said Lieberoth. “The absence of alignment is highest on my personal list. I have seen many well-intentioned adults try to interject a hapless kid into someone else’s game, basically ruining the shared alignment. Sometimes an adult is needed to scaffold, initiate, inspire, and support, but sometimes they should shut up and go away.” 

But the scientists point out that different kids like different things. Good play for one kid could be bad play for another, especially across different cultures. Providing larger-scale play opportunities where children can choose to try different games or activities could maximize inclusion. 

“The last thing we want is for people to use this work to make up rules for ‘correct play’,” said Lieberoth. “There is no such thing. I’m convinced that the same protocol would yield different stories, different memories, and different agreements in a different time and place. But within the dataset the findings appear quite robust across many kids, so it could be that some features are indeed universal. I would be very excited to see the scales used in different settings.”

Population-based lung cancer screening can reduce mortality in people who have never smoked, study shows in China




European Society for Medical Oncology





COPENHAGEN, Denmark, 27 March 2026 – New evidence from a Chinese cohort presented today at the European Lung Cancer Congress (ELCC) 2026 shows that one-time low-dose computed tomography (LDCT) screening can significantly reduce lung cancer mortality in a non-risk-based population, including individuals with no smoking history. (1) The findings support reconsideration of current eligibility criteria, which still rely heavily on tobacco exposure.

In the prospective non-randomised controlled study, conducted between 2017 and 2021 within the Chinese LungCare Project, nearly 12,000 adults aged 40–74 in Guangzhou underwent LDCT screening and were compared with a geographically matched control cohort receiving standard risk-based care. After seven years of follow-up, LDCT screening was associated with a 55% reduction in lung cancer–specific mortality (HR 0.45; 95% CI 0.32–0.65; P<0.001). The mortality benefit was observed in both sexes and was particularly pronounced among women, with a 72% risk reduction (HR 0.28; 95% CI 0.13–0.60; P<0.001) compared with men (HR 0.55; 95% CI 0.36–0.83; P=0.004). Among patients diagnosed with lung cancer over the course of the study period, screen-detected cases showed significantly better overall survival than the non-screened group (HR 0.13; 95% CI 0.09–0.19; P<0.001). In the screening group, 81.5% of cancers were diagnosed at stage I, compared with 25.1% in the non-screening group; by contrast, advanced-stage disease accounted for about 70% of cases in the non-screened group. 

These results challenge how countries currently decide who is eligible for lung cancer screening. Today, most screening programmes mainly target long-time or heavy smokers. However, this approach overlooks a rapidly growing group: people who develop lung cancer despite never having smoked. In many parts of the world—especially in Asia but increasingly also elsewhere (2)—non-smokers make up a substantial proportion of new lung cancer cases (3), often linked to factors such as fine particulate matter (PM2.5) in air pollution or genetic susceptibility.

“Current screening guidelines were built around smoking history, and in doing so they leave behind a large and growing group of people who develop lung cancer despite never having smoked. In Asia, this is not a marginal issue: never-smoking women represent a substantial share of all lung cancer cases, driven by factors like air pollution and genetic risk rather than tobacco. The LUNG-CARE Project shows that when we screen beyond conventional risk criteria, we catch disease earlier (over 80% of screen-detected cancers were Stage I), and that translates directly into lives saved. A 72% mortality reduction in women is not a signal to note; it is a signal to act on,” commented Prof. Marina Garassino, University of Chicago, who was not involved in the study.  

Implementing mass LDCT screening comes with challenges. Although broader screening can be cost effective in some settings, the costs of imaging and follow-up tests after positive results may be difficult for certain health systems to sustain. LDCT also has a relatively high false-positive rate—about 8% (4)—which can lead to unnecessary invasive procedures, added costs, and patient anxiety. As a result, adoption into national programmes has been slow, and where screening is available, participation remains uneven due to barriers such as fear of diagnosis and low perceived personal risk. (5) 

“This is a game-changer for Asian populations, but we should resist the temptation to over-generalise. Lung cancer in Asia follows a different epidemiological playbook (never-smokers, women, environmental exposures), and guidelines built on Western smoking-based data simply do not serve these populations. On the other hand, Western guidelines cannot simply copy-paste these results. What this study does demand, urgently, is updated criteria that recognise Asian ancestry as an independent risk factor for screening eligibility,” Garassino concluded. 

-END- 


 

Photographer Lennart Nilsson’s archive to open for research and public access




University of Gothenburg

Image: The book A Child Is Born, first edition of the American version, 1966.

Credit: Delacorte Press/Seymour Lawrence, New York, New York.





The University of Gothenburg has received funding of 1.3 million SEK from the Hasselblad Foundation to take over the extensive archive of photographer Lennart Nilsson. The archive is one of Sweden’s most significant photographic legacies and contains a life’s work that transformed how we understand the human body, science and the role of photography in society.

The initiative means that Lennart Nilsson’s archive will be preserved long-term at the Gothenburg University Library and will be made accessible for research and the public. At the same time, the archive will be connected to the internationally recognised research environment in photography developed through collaboration between HDK-Valand at the University of Gothenburg and the Hasselblad Foundation.

Reached a global audience 
Lennart Nilsson is one of Sweden’s most influential photographers, not least through his unique and long-standing collaboration with medical research.

“Preserving his archive is not only about safeguarding an extraordinary life’s work. It is also about enabling a deeper understanding of a creative practice that spans several decades and helped bring scientific knowledge to a broad public,” says Niclas Östlind, Professor of Photography at HDK-Valand.

Lennart Nilsson’s most well-known work is the book A Child Is Born (1965), which depicts foetal development from conception to birth. The book has been translated into more than twenty languages and is one of the most widely distributed photography books in the world. The images have also sparked important discussions about research ethics, perspectives on gender and reproduction, and the role of photography in scientific knowledge production.

“His work, at the intersection of photography and medical research, has had a major impact on how images shape knowledge and societal understanding. Lennart Nilsson’s legacy is very close to our hearts, and we welcome that the archive will now be preserved and made accessible for research and the public,” says Kalle Sanner, Executive Director of the Hasselblad Foundation. 

A vital part of our collective memory 
The archive, previously managed by Lennart Nilsson’s stepdaughter Anne Fjellström, contains a large number of negatives and slides, as well as books, magazines and an extensive personal archive of correspondence, notes and documents. The material spans from the 1940s until Nilsson’s death in 2017, offering insight into both his working process and the period in which he was active.

The initiative also highlights the broader issue of how photographic heritage should be preserved.

“Photography plays a crucial role in our collective memory. Universities, more than many other societal actors, can contribute to advanced knowledge and deeper understanding in this area,” says Niclas Östlind.

In order to make the material searchable and accessible, it needs to be catalogued and adapted to the university library’s system – work that will now begin. The aim is for the archive to be accessible to researchers, students and the public by 2029.

Once the archive has been incorporated into the university’s collections, the Gothenburg University Library will be responsible for its long-term management.

“It is truly fantastic that, together with HDK-Valand and the Hasselblad Foundation, we have been able to bring this to fruition. In doing so, we are taking on a national responsibility for the photographic cultural heritage, closely connected to the education and research conducted at the university,” says Morgan Palmqvist, Library Director. 

Image: Lennart Nilsson, Stockholm, 2016.

Credit: Photo: Nicho Södling