Wednesday, March 02, 2022

MiHoYo's spending that Genshin cash on an experimental fusion reactor

By Rich Stanton 
PC GAMER

'Tech otakus save the world' is the company motto, after all.

(Image credit: miHoYo)

Developer miHoYo has been around since 2012, but 2020's Genshin Impact was its first global success: the game remains enormously popular, and in its first year it made over $2 billion from the mobile version alone. The company has quite a charming motto—Tech otakus save the world—and, with this big ol' bunch of money burning a hole in its metaphorical pocket, has decided to have a go at living up to those words.

miHoYo recently led a funding round alongside NIO Capital, a Chinese investment firm, and a total of $63 million will be invested in a company called Energy Singularity (as per Beijing's PanDaily). The funds will be used for R&D of a "small tokamak experimental device based on high temperature superconducting material, and advanced magnet systems that can be used for the next generation of high-performance fusion devices."

What's a tokamak to you? A tokamak is a power-plant design concept for nuclear fusion, wherein plasma is confined using magnetic fields in a donut shape: a torus. It is considered, by people who know about these things, to be the most realistic and achievable nuclear fusion design.


The long and short of it is that nuclear fusion (where two atomic nuclei combine to create a heavier nucleus, releasing energy) should in theory have huge advantages over nuclear fission (where a nucleus is split). However, despite being theorised about and researched since the 1940s, no fusion reactor has ever produced more energy than it consumes. If it can be done, this technology could change everything about global energy supplies and become a major tool in fighting climate change (fusion even produces less waste than fission).

So, perhaps in 100 years they'll be writing textbooks about how thirsty weebs inadvertently saved the planet by buying bunny costumes.

miHoYo's not just into nuclear fusion: last year it funded a lab studying brain-computer interface technologies, and how they could possibly be used to treat depression.

This is the first major investment in Energy Singularity, which was founded in 2021 by experts in various fields, and is dedicated to creating commercialised fusion technology. After this funding it will focus on developing what it calls an "experimental advanced superconducting Tokamak (EAST)."

Yes, that name does sound a bit like a boss fight. Genshin Impact, meanwhile, continues to receive regular updates, with 2.5 arriving just under a month ago and driving fans bonkers with its new characters.

Rich Stanton
Rich is a games journalist with 15 years' experience, beginning his career on Edge magazine before working for a wide range of outlets, including Ars Technica, Eurogamer, GamesRadar+, Gamespot, the Guardian, IGN, the New Statesman, Polygon, and Vice. He was the editor of Kotaku UK, the UK arm of Kotaku, for three years before joining PC Gamer. He is the author of A Brief History of Video Games, a full history of the medium, which the Midwest Book Review described as "[a] must-read for serious minded game historians and curious video game connoisseurs alike."

Swiss Plasma Center and DeepMind Use AI To Control Plasmas for Nuclear Fusion

Plasma Inside TCV Tokamak

Plasma inside the TCV tokamak. Credit: Curdin Wüthrich/SPC/EPFL

Scientists at EPFL’s Swiss Plasma Center and DeepMind have jointly developed a new method for controlling plasma configurations for use in nuclear fusion research.

EPFL’s Swiss Plasma Center (SPC) has decades of experience in plasma physics and plasma control methods. DeepMind is a scientific discovery company, acquired by Google in 2014, that’s committed to ‘solving intelligence to advance science and humanity’. Together, they have developed a new magnetic control method for plasmas based on deep reinforcement learning, and applied it to a real-world plasma for the first time in the SPC’s tokamak research facility, TCV. Their study has just been published in Nature.

Tokamaks are donut-shaped devices for conducting research on nuclear fusion, and the SPC is one of the few research centers in the world that has one in operation. These devices use a powerful magnetic field to confine plasma at extremely high temperatures – hundreds of millions of degrees Celsius, even hotter than the sun’s core – so that nuclear fusion can occur between hydrogen atoms. The energy released from fusion is being studied for use in generating electricity. What makes the SPC’s tokamak unique is that it allows for a variety of plasma configurations, hence its name: variable-configuration tokamak (TCV). That means scientists can use it to investigate new approaches for confining and controlling plasmas. A plasma’s configuration relates to its shape and position in the device.

Deep Reinforcement Learning Steers Fusion Plasma

The controller trained with deep reinforcement learning steers the plasma through multiple phases of an experiment. On the left is an inside view of the tokamak during the experiment; on the right, the reconstructed plasma shape and the target points the researchers wanted to hit. Credit: DeepMind & SPC/EPFL

Controlling a substance as hot as the Sun

Tokamaks form and maintain plasmas through a series of magnetic coils whose settings, especially voltage, must be controlled carefully. Otherwise, the plasma could collide with the vessel walls and deteriorate. To prevent this from happening, researchers at the SPC first test their control-system configurations on a simulator before using them in the TCV tokamak. “Our simulator is based on more than 20 years of research and is updated continuously,” says Federico Felici, an SPC scientist and co-author of the study. “But even so, lengthy calculations are still needed to determine the right value for each variable in the control system. That’s where our joint research project with DeepMind comes in.”

TCV Vacuum Vessel 3D Model

3D model of the TCV vacuum vessel containing the plasma, surrounded by various magnetic coils to keep the plasma in place and to affect its shape. Credit: DeepMind & SPC/EPFL

DeepMind’s experts developed an AI algorithm that can create and maintain specific plasma configurations and trained it on the SPC’s simulator. The algorithm first tried many different control strategies in simulation, gathering experience and analyzing the plasma configurations that resulted from each setting. It was then called on to work the other way: to produce a specific requested plasma configuration by identifying the right settings. After being trained, the AI-based system was able to create and maintain a wide range of plasma shapes and advanced configurations, including one where two separate plasmas are maintained simultaneously in the vessel. Finally, the research team tested their new system directly on the tokamak to see how it would perform under real-world conditions.
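The workflow described above (try control strategies in simulation, keep what works, then deploy) can be sketched with a toy stand-in. The one-dimensional "plasma position" plant and the simple evolutionary search below are illustrative inventions, not DeepMind's actual deep reinforcement learning system:

```python
import random

def step(pos, vel, voltage, dt=0.01):
    """Toy 1-D 'plasma position' plant: the coil voltage accelerates the
    plasma and a drag term damps it. A stand-in for the real simulator."""
    vel += (voltage - 0.5 * vel) * dt
    pos += vel * dt
    return pos, vel

def rollout(gain, target=1.0, steps=500):
    """One simulated episode with a proportional controller; returns the
    negative accumulated tracking error (higher is better)."""
    pos = vel = cost = 0.0
    for _ in range(steps):
        voltage = gain * (target - pos)   # control action from current state
        pos, vel = step(pos, vel, voltage)
        cost += abs(target - pos)
    return -cost

def train(iters=30, samples=20, seed=0):
    """Improve the controller purely in simulation: sample candidate gains,
    keep the best performers, and tighten the search around them."""
    rng = random.Random(seed)
    mean, spread = 1.0, 2.0
    for _ in range(iters):
        cands = [abs(rng.gauss(mean, spread)) for _ in range(samples)]
        cands.sort(key=rollout, reverse=True)
        elite = cands[:5]                 # best-performing gains this round
        mean = sum(elite) / len(elite)
        spread = max(0.1, spread * 0.9)
    return mean

best_gain = train()
print(round(best_gain, 2))
```

Only after this simulated search does the controller touch the "real" plant, mirroring the train-in-simulation-first approach the SPC and DeepMind used.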

Range of Different Plasma Shapes

Range of different plasma shapes generated with the reinforcement learning controller. Credit: DeepMind & SPC/EPFL

The SPC’s collaboration with DeepMind dates back to 2018 when Felici first met DeepMind scientists at a hackathon at the company’s London headquarters. There he explained his research group’s tokamak magnetic-control problem. “DeepMind was immediately interested in the prospect of testing their AI technology in a field such as nuclear fusion, and especially on a real-world system like a tokamak,” says Felici. Martin Riedmiller, control team lead at DeepMind and co-author of the study, adds that “our team’s mission is to research a new generation of AI systems – closed-loop controllers – that can learn in complex dynamic environments completely from scratch. Controlling a fusion plasma in the real world offers fantastic, albeit extremely challenging and complex, opportunities.”

A win-win collaboration

After speaking with Felici, DeepMind offered to work with the SPC to develop an AI-based control system for its tokamak. “We agreed to the idea right away, because we saw the huge potential for innovation,” says Ambrogio Fasoli, the director of the SPC and a co-author of the study. “All the DeepMind scientists we worked with were highly enthusiastic and knew a lot about implementing AI in control systems.” For his part, Felici was impressed with the amazing things DeepMind can do in a short time when it focuses its efforts on a given project.

The collaboration with the SPC pushes us to improve our reinforcement learning algorithms.
— Brendan Tracey, senior research engineer, DeepMind

DeepMind also got a lot out of the joint research project, illustrating the benefits to both parties of taking a multidisciplinary approach. Brendan Tracey, a senior research engineer at DeepMind and co-author of the study, says: “The collaboration with the SPC pushes us to improve our reinforcement learning algorithms, and as a result can accelerate research on fusing plasmas.”

This project should pave the way for EPFL to seek out other joint R&D opportunities with outside organizations. “We’re always open to innovative win-win collaborations where we can share ideas and explore new perspectives, thereby speeding the pace of technological development,” says Fasoli.

Reference: “Magnetic control of tokamak plasmas through deep reinforcement learning” by Jonas Degrave, Federico Felici, Jonas Buchli, Michael Neunert, Brendan Tracey, Francesco Carpanese, Timo Ewalds, Roland Hafner, Abbas Abdolmaleki, Diego de las Casas, Craig Donner, Leslie Fritz, Cristian Galperti, Andrea Huber, James Keeling, Maria Tsimpoukelli, Jackie Kay, Antoine Merle, Jean-Marc Moret, Seb Noury, Federico Pesamosca, David Pfau, Olivier Sauter, Cristian Sommariva, Stefano Coda, Basil Duval, Ambrogio Fasoli, Pushmeet Kohli, Koray Kavukcuoglu, Demis Hassabis and Martin Riedmiller, 16 February 2022, Nature.
DOI: 10.1038/s41586-021-04301-9

NASA’S NEW SHORTCUT TO FUSION POWER

Lattice confinement fusion eliminates massive magnets and powerful lasers


BAYARBADRAKH BARAMSAI, THERESA BENYO, LAWRENCE FORSLEY, BRUCE STEINETZ
27 FEB 2022

EDMON DE HARO

PHYSICISTS FIRST SUSPECTED more than a century ago that the fusing of hydrogen into helium powers the sun. It took researchers many years to unravel the secrets by which lighter elements are smashed together into heavier ones inside stars, releasing energy in the process. And scientists and engineers have continued to study the sun’s fusion process in hopes of one day using nuclear fusion to generate heat or electricity. But the prospect of meeting our energy needs this way remains elusive.

The extraction of energy from nuclear fission, by contrast, happened relatively quickly. Fission in uranium was discovered in 1938, in Germany, and it was only four years until the first nuclear “pile” was constructed in Chicago, in 1942.

There are currently about 440 fission reactors operating worldwide, which together can generate about 400 gigawatts of power with zero carbon emissions. Yet these fission plants, for all their value, have considerable downsides. The enriched uranium fuel they use must be kept secure. Devastating accidents, like the one at Fukushima in Japan, can leave areas uninhabitable. Fission waste by-products need to be disposed of safely, and they remain radioactive for thousands of years. Consequently, governments, universities, and companies have long looked to fusion to remedy these ills.

Among those interested parties is NASA. The space agency has significant energy needs for deep-space travel, including probes and crewed missions to the moon and Mars. For more than 60 years, photovoltaic cells, fuel cells, or radioisotope thermoelectric generators (RTGs) have provided power to spacecraft. RTGs, which rely on the heat produced when nonfissile plutonium-238 decays, have demonstrated excellent longevity—both Voyager probes use such generators and remain operational nearly 45 years after their launch, for example. But these generators convert heat to electricity at roughly 7.5 percent efficiency. And modern spacecraft need more power than an RTG of reasonable size can provide.
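For a sense of scale, the quoted 7.5 percent conversion efficiency and plutonium-238's 87.7-year half-life are enough to sketch an RTG's declining output. The 2,400 W starting thermal power below is an illustrative, roughly Voyager-class assumption, not a figure from the article:

```python
HALF_LIFE_PU238 = 87.7   # years, half-life of plutonium-238
EFFICIENCY = 0.075       # heat-to-electricity conversion quoted above

def rtg_electric_power(p_thermal_start_w, years_elapsed):
    """Electrical output of an RTG after its Pu-238 heat source has decayed."""
    p_thermal = p_thermal_start_w * 0.5 ** (years_elapsed / HALF_LIFE_PU238)
    return p_thermal * EFFICIENCY

# Illustrative assumption: ~2,400 W of thermal power at launch
print(round(rtg_electric_power(2400, 0)), "W at launch")
print(round(rtg_electric_power(2400, 45)), "W after 45 years")
```

The output drops by roughly 30 percent over 45 years, which is why a long mission must be designed around the end-of-life power level, not the launch figure.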

One promising alternative is lattice confinement fusion (LCF), a type of fusion in which the nuclear fuel is bound in a metal lattice. The confinement encourages positively charged nuclei to fuse because the high electron density of the conductive metal reduces the likelihood that two nuclei will repel each other as they get closer together.

 
The deuterated erbium (chemical symbol ErD3) is placed into thumb-size vials, as shown in this set of samples from a 20 June 2018 experiment. Here, the vials are arrayed pre-experiment, with wipes on top of the metal to keep the metal in position during the experiment. The metal has begun to crack and break apart, indicating it is fully saturated. 
NASA

The vials are placed upside down to align the metal with the gamma ray beam. Gamma rays have turned the clear glass amber.
NASA

We and other scientists and engineers at NASA Glenn Research Center, in Cleveland, are investigating whether this approach could one day provide enough power to operate small robotic probes on the surface of Mars, for example. LCF would eliminate the need for fissile materials such as enriched uranium, which can be costly to obtain and difficult to handle safely. LCF promises to be less expensive, smaller, and safer than other strategies for harnessing nuclear fusion. And as the technology matures, it could also find uses here on Earth, such as for small power plants for individual buildings, which would reduce fossil-fuel dependency and increase grid resiliency.

Physicists have long thought that fusion should be able to provide clean nuclear power. After all, the sun generates power this way. But the sun has a tremendous size advantage. At nearly 1.4 million kilometers in diameter, with a plasma core 150 times as dense as liquid water and heated to 15 million °C, the sun uses heat and gravity to force particles together and keep its fusion furnace stoked.

On Earth, we lack the ability to produce energy this way. A fusion reactor needs to reach a critical level of fuel-particle density, confinement time, and plasma temperature (called the Lawson criterion, after its creator John Lawson) to achieve a net-positive energy output. And so far, nobody has done that.
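The Lawson criterion is usually expressed as a threshold on the "triple product" of those three quantities. The threshold and the operating point below are illustrative round numbers for deuterium-tritium fuel, not measured values from any machine:

```python
def triple_product(density_m3, temp_keV, confinement_s):
    """Fusion triple product n*T*tau, the figure of merit in the Lawson criterion."""
    return density_m3 * temp_keV * confinement_s

# Commonly quoted ignition threshold for deuterium-tritium fuel (round number):
LAWSON_DT = 3e21   # keV * s / m^3

# Hypothetical operating point: density (m^-3), temperature (keV), confinement (s)
n, T, tau = 1.5e20, 15.0, 2.0
print(triple_product(n, T, tau) >= LAWSON_DT)   # True: above the threshold
```

The point of the criterion is the trade-off: a reactor can run at lower density if it confines the plasma longer, and vice versa, so long as the product clears the bar.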

Lighting the Fusion Fire

In lattice confinement fusion (LCF), a beam of gamma rays is directed at a sample of erbium [shown here] or titanium saturated with deuterons. Occasionally, gamma rays of sufficient energy will break apart a deuteron in the metal lattice into its constituent proton and neutron.

The neutron collides with another deuteron in the lattice, imparting some of its own momentum to the deuteron. The electron-screened deuteron is now energetic enough to overcome the Coulomb barrier, which would typically repel it from another deuteron.

Deuteron-Deuteron Fusion

When the energetic deuteron fuses with another deuteron in the lattice, it can produce a helium-3 nucleus (helion) and give off useful energy. A leftover neutron could provide the push for another energetic deuteron elsewhere.

Alternatively, the fusing of the two deuterons could result in a hydrogen-3 nucleus (triton) and a leftover proton. This reaction also produces useful energy.

Stripping and OP Reaction

Another possible reaction in lattice confinement fusion would happen if an erbium atom instead rips apart the energetic deuteron and absorbs the proton. The extra proton changes the erbium atom to thulium and releases energy.

If the erbium atom absorbs the neutron, it becomes a new isotope of erbium. This is an Oppenheimer-Phillips (OP) stripping reaction. The proton from the broken-apart deuteron heats the lattice.
Fusion reactors commonly utilize two different hydrogen isotopes: deuterium (one proton and one neutron) and tritium (one proton and two neutrons). These are fused into helium nuclei (two protons and two neutrons)—also called alpha particles—with an unbound neutron left over.

Existing fusion reactors rely on the resulting alpha particles—and the energy released in the process of their creation—to further heat the plasma. The plasma will then drive more nuclear reactions with the end goal of providing a net power gain. But there are limits. Even in the hottest plasmas that reactors can create, alpha particles will mostly skip past additional deuterium nuclei without transferring much energy. For a fusion reactor to be successful, it needs to create as many direct hits between alpha particles and deuterium nuclei as possible.

In the 1950s, scientists created various magnetic-confinement fusion devices, the most well known of which were Andrei Sakharov’s tokamak and Lyman Spitzer’s stellarator. Setting aside differences in design particulars, each attempts the near-impossible: Heat a gas enough for it to become a plasma and magnetically squeeze it enough to ignite fusion—all without letting the plasma escape.

Inertial-confinement fusion devices followed in the 1970s. They used lasers and ion beams either to compress the surface of a target in a direct-drive implosion or to energize an interior target container in an indirect-drive implosion. Unlike magnetically confined reactions, which can last for seconds or even minutes (and perhaps one day, indefinitely), inertial-confinement fusion reactions last less than a microsecond before the target disassembles, thus ending the reaction.

Both types of devices can create fusion, but so far they are incapable of generating enough energy to offset what’s needed to initiate and maintain the nuclear reactions. In other words, more energy goes in than comes out. Hybrid approaches, collectively called magneto-inertial fusion, face the same issues.

Who’s Who in the Fusion Zoo

Proton: Positively charged protons (along with neutrons) make up atomic nuclei. One component of lattice confinement fusion (LCF) may occur when a proton is absorbed by an erbium atom in a deuteron stripping reaction.

Neutron: Neutrally charged neutrons (along with protons) make up atomic nuclei. In fusion reactions, they impart energy to other particles such as deuterons. They can also be absorbed in Oppenheimer-Phillips reactions.

Erbium & Titanium: Erbium and titanium are the metals of choice for LCF. Relatively colossal compared with the other particles involved, they hold the deuterons and screen them from one another.

Deuterium: Deuterium is hydrogen with one proton and one neutron in its nucleus (hydrogen with just the proton is protium). Deuterium’s nucleus, called a deuteron, is crucial to LCF.

Deuteron: The nucleus of a deuterium atom. Deuterons are vital to LCF—the actual fusion instances occur when an energetic deuteron smashes into another in the lattice. They can also be broken apart in stripping reactions.

Hydrogen-3 (Tritium): One possible resulting particle from deuteron-deuteron fusion, alongside a leftover proton. Tritium has one proton and two neutrons in its nucleus, which is also called a triton.

Helium-3: One possible resulting particle from deuteron-deuteron fusion, alongside a leftover neutron. Helium-3 has two protons and one neutron in its nucleus, which is also called a helion.

Alpha particle: The core of a normal helium atom (two protons and two neutrons). Alpha particles are a commonplace result of typical fusion reactors, which often smash deuterium and tritium particles together. They can also emerge from LCF reactions.

Gamma ray: Extremely energetic photons that are used to kick off the fusion reactions in a metal lattice by breaking apart deuterons.

Current fusion reactors also require copious amounts of tritium as one part of their fuel mixture. The most reliable source of tritium is a fission reactor, which somewhat defeats the purpose of using fusion.

The fundamental problem of these techniques is that the atomic nuclei in the reactor need to be energetic enough—meaning hot enough—to overcome the Coulomb barrier, the natural tendency for the positively charged nuclei to repel one another. Because of the Coulomb barrier, fusing atomic nuclei have a very small fusion cross section, meaning the probability that two particles will fuse is low. You can increase the cross section by raising the plasma temperature to 100 million °C, but that requires increasingly heroic efforts to confine the plasma. As it stands, after billions of dollars of investment and decades of research, these approaches, which we’ll call “hot fusion,” still have a long way to go.
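The scale of the Coulomb barrier is easy to estimate as the electrostatic energy of two elementary charges at nuclear distances. The 3-femtometre separation below is a rough assumption for two deuterons brought into contact:

```python
# e^2 / (4*pi*eps0) in convenient units: ~1.44 eV*nm (equivalently MeV*fm)
COULOMB_CONST_EV_NM = 1.44

def coulomb_barrier_keV(z1, z2, separation_fm):
    """Electrostatic potential energy of two nuclei with charges z1*e and z2*e, in keV."""
    separation_nm = separation_fm * 1e-6   # 1 fm = 1e-6 nm
    return z1 * z2 * COULOMB_CONST_EV_NM / separation_nm / 1e3

# Two deuterons (Z = 1 each) brought to roughly 3 fm, about nuclear contact:
print(round(coulomb_barrier_keV(1, 1, 3.0)), "keV")
```

A barrier of a few hundred kiloelectron volts, set against typical thermal energies of a few kiloelectron volts even in a 100 million °C plasma, is why fusion relies on the rare high-energy tail of the distribution plus quantum tunneling, and why the cross section stays so small.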

The barriers to hot fusion here on Earth are indeed tremendous. As you can imagine, they’d be even more overwhelming on a spacecraft, which can’t carry a tokamak or stellarator onboard. Fission reactors are being considered as an alternative—NASA successfully tested the Kilopower fission reactor at the Nevada National Security Site in 2018 using a uranium-235 core about the size of a paper towel roll. The Kilopower reactor could produce up to 10 kilowatts of electric power. The downside is that it required highly enriched uranium, which would have brought additional launch safety and security concerns. This fuel also costs a lot.

But fusion could still work, even if the conventional hot-fusion approaches are nonstarters. LCF technology could be compact enough, light enough, and simple enough to serve for spacecraft.

How does LCF work? Remember that we earlier mentioned deuterium, the isotope of hydrogen with one proton and one neutron in its nucleus. Deuterated metals—erbium and titanium, in our experiments—have been “saturated” with either deuterium or deuterium atoms stripped of their electrons (deuterons). This is possible because the metal naturally exists in a regularly spaced lattice structure, which creates equally regular slots in between the metal atoms for deuterons to nest in.

In a tokamak or a stellarator, the hot plasma is limited to a density of 10¹⁴ deuterons per cubic centimeter. Inertial-confinement fusion devices can momentarily reach densities of 10²⁶ deuterons per cubic centimeter. It turns out that metals like erbium can indefinitely hold deuterons at a density of nearly 10²³ per cubic centimeter—far higher than the density that can be attained in a magnetic-confinement device, and only three orders of magnitude below that attained in an inertial-confinement device. Crucially, these metals can hold that many ions at room temperature.
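Those density figures can be compared directly; this snippet just restates the numbers quoted above:

```python
import math

# Deuteron densities quoted above, per cubic centimeter:
TOKAMAK = 1e14     # magnetically confined plasma
LATTICE = 1e23     # deuterons held in an erbium lattice
INERTIAL = 1e26    # momentary peak in inertial confinement

ratio_over_tokamak = LATTICE / TOKAMAK              # how much denser the lattice is
orders_below_icf = math.log10(INERTIAL / LATTICE)   # gap to inertial confinement

print(f"{ratio_over_tokamak:.0e}x a tokamak plasma, "
      f"{orders_below_icf:.0f} orders of magnitude below inertial confinement")
```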

The deuteron-saturated metal forms a plasma with neutral charge. The metal lattice confines and electron-screens the deuterons, keeping each of them from “seeing” adjacent deuterons (which are all positively charged). This screening increases the chances of more direct hits, which further promotes the fusion reaction. Without the electron screening, two deuterons would be much more likely to repel each other.

Using a metal lattice that has screened a dense, cold plasma of deuterons, we can jump-start the fusion process using what is called a Dynamitron electron-beam accelerator. The electron beam hits a tantalum target and produces gamma rays, which then irradiate thumb-size vials containing titanium deuteride or erbium deuteride.

When a gamma ray of sufficient energy—about 2.2 megaelectron volts (MeV)—strikes one of the deuterons in the metal lattice, the deuteron breaks apart into its constituent proton and neutron. The released neutron may collide with another deuteron, accelerating it much as a pool cue accelerates a ball when striking it. This second, energetic deuteron then goes through one of two processes: screened fusion or a stripping reaction.
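How much energy that pool-cue collision hands over follows from elastic-collision kinematics: for a head-on hit, the transferred fraction is the textbook maximum 4·m1·m2/(m1 + m2)²:

```python
def transfer_fraction(m_projectile, m_target):
    """Fraction of kinetic energy handed over in a head-on elastic collision:
    4*m1*m2 / (m1 + m2)^2, the textbook maximum."""
    return 4 * m_projectile * m_target / (m_projectile + m_target) ** 2

# Neutron (mass number 1) striking a deuteron (mass number 2):
frac = transfer_fraction(1.0, 2.0)
print(round(frac, 3))   # 8/9 of the neutron's energy, i.e. 0.889
```

The close mass match between a neutron and a deuteron is what makes neutron recoil such an efficient way to energize deuterons in the lattice; a neutron hitting a heavy erbium nucleus instead would transfer only a few percent of its energy.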

In screened fusion, which we have observed in our experiments, the energetic deuteron fuses with another deuteron in the lattice. The fusion reaction will result in either a helium-3 nucleus and a leftover neutron or a hydrogen-3 nucleus and a leftover proton. These fusion products may fuse with other deuterons, creating an alpha particle, or with another helium-3 or hydrogen-3 nucleus. Each of these nuclear reactions releases energy, helping to drive more instances of fusion.

In a stripping reaction, an atom like the titanium or erbium in our experiments strips the proton or neutron from the deuteron and captures that proton or neutron. Erbium, titanium, and other heavier atoms preferentially absorb the neutron because the proton is repulsed by the positively charged nucleus (called an Oppenheimer-Phillips reaction). It is theoretically possible, although we haven’t observed it, that the electron screening might allow the proton to be captured, transforming erbium into thulium or titanium into vanadium. Both kinds of stripping reactions would produce useful energy.


To be sure that we were actually producing fusion in our vials of erbium deuteride and titanium deuteride, we used neutron spectroscopy. This technique detects the neutrons that result from fusion reactions. When deuteron-deuteron fusion produces a helium-3 nucleus and a neutron, that neutron has an energy of 2.45 MeV. So when we detected 2.45 MeV neutrons, we knew fusion had occurred. That’s when we published our initial results in Physical Review C.
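The 2.45 MeV figure itself falls out of momentum conservation: the reaction's energy release (Q ≈ 3.27 MeV for the neutron branch, a textbook value) is shared between the two products in inverse proportion to their masses, so the light neutron takes the helion-mass share:

```python
Q_DD_NEUTRON_BRANCH = 3.27       # MeV released by D + D -> He-3 + n (textbook value)
M_NEUTRON, M_HELION = 1.0, 3.0   # mass numbers are accurate enough here

# Momentum conservation splits Q in inverse proportion to mass,
# so the light neutron carries the helion's share of the total:
E_neutron = Q_DD_NEUTRON_BRANCH * M_HELION / (M_NEUTRON + M_HELION)
print(f"{E_neutron:.2f} MeV")    # the 2.45 MeV fusion signature
```

That fixed, well-separated energy is what makes the 2.45 MeV line such a clean fingerprint for deuteron-deuteron fusion in neutron spectroscopy.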

Electron screening makes it seem as though the deuterons are fusing at a temperature of 11 million °C. In reality, the metal lattice remains much cooler than that, although it heats up somewhat from room temperature as the deuterons fuse.


 
Rich Martin [left], a research engineer, and coauthor Bruce Steinetz, principal investigator for the LCF project’s precursor experiment, examine samples after a run. NASA

Overall, in LCF, most of the heating occurs in regions just tens of micrometers across. This is far more efficient than in magnetic- or inertial-confinement fusion reactors, which heat up the entire fuel amount to very high temperatures. LCF isn’t cold fusion—it still requires energetic deuterons and can use neutrons to heat them. However, LCF also removes many of the technological and engineering barriers that have prevented other fusion schemes from being successful.

Although the neutron recoil technique we’ve been using is the most efficient means to transfer energy to cold deuterons, producing neutrons from a Dynamitron is energy intensive. There are other, lower-energy methods of producing neutrons, including using an isotopic neutron source, like americium-beryllium or californium-252, to initiate the reactions. We also need to make the reaction self-sustaining, which may be possible using neutron reflectors to bounce neutrons back into the lattice—carbon and beryllium are examples of common neutron reflectors. Another option is to couple a fusion neutron source with fission fuel to take advantage of the best of both worlds. Regardless, the process needs further development to increase the efficiency of these lattice-confined nuclear reactions.

We’ve also triggered nuclear reactions by pumping deuterium gas through a thin wall of a palladium-silver alloy tubing, and by electrolytically loading palladium with deuterium. In the latter experiment, we’ve detected fast neutrons. The electrolytic setup is now using the same neutron-spectroscopy detection method we mentioned above to measure the energy of those neutrons. The energy measurements we get will inform us about the kinds of nuclear reaction that produce them.

We’re not alone in these endeavors. Researchers at Lawrence Berkeley National Laboratory, in California, with funding from Google Research, achieved favorable results with a similar electron-screened fusion setup. Researchers at the U.S. Naval Surface Warfare Center, Indian Head Division, in Maryland have likewise gotten promising initial results using an electrochemical approach to LCF. There are also upcoming conferences: the American Nuclear Society’s Nuclear and Emerging Technologies for Space conference in Cleveland in May and the International Conference on Cold Fusion 24, focused on solid-state energy, in Mountain View, Calif., in July.

Any practical application of LCF will require efficient, self-sustaining reactions. Our work represents just the first step toward realizing that goal. If the reaction rates can be significantly boosted, LCF may open an entirely new door for generating clean nuclear energy, both for space missions and for the many people who could use it here on Earth.



Bayarbadrakh Baramsai
is a systems engineer at NASA Glenn Research Center contributing to the lattice confinement fusion project.

Theresa Benyo
is a physicist and the principal investigator for the lattice confinement fusion project at NASA Glenn Research Center.

Lawrence Forsley
is the deputy principal investigator for NASA’s lattice confinement fusion project, based at NASA Glenn Research Center.

Bruce Steinetz
is a senior technologist at NASA Glenn Research Center involved in the lattice confinement fusion project.




New method for rapid, efficient hydrogen generation from water

BY AMIT MALEWAR
FEBRUARY 28, 2022
EMERGING TECH
Hydrogen gas generated from the reaction of water with an aluminum-gallium composite. Credit: Amberchan et al., Applied Nano Materials 2022

Aluminum is a highly reactive metal that can strip oxygen from water molecules to generate hydrogen gas. Now, researchers at UC Santa Cruz have developed a cost-effective and efficient way to use aluminum’s reactivity to generate clean hydrogen fuel.

In a new study, a team of researchers shows that an easily produced composite of gallium and aluminum creates aluminum nanoparticles that react rapidly with water at room temperature to yield large amounts of hydrogen. According to the researchers, the reaction yields 90% of the hydrogen that could theoretically be produced from all the aluminum in the composite, and the gallium was easily recovered for reuse afterward.
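That 90% figure can be turned into a rough volume estimate using the standard stoichiometry of the aluminum-water reaction (2 Al + 6 H2O -> 2 Al(OH)3 + 3 H2); the constants below are textbook values plus the reported yield:

```python
MOLAR_MASS_AL = 26.98   # g/mol
H2_PER_AL = 1.5         # stoichiometry: 2 Al + 6 H2O -> 2 Al(OH)3 + 3 H2
MOLAR_VOLUME = 22.4     # L/mol for an ideal gas at STP

def hydrogen_litres(grams_aluminum, yield_fraction=0.90):
    """Hydrogen volume from reacting aluminum with water, at the reported 90% yield."""
    moles_h2 = grams_aluminum / MOLAR_MASS_AL * H2_PER_AL * yield_fraction
    return moles_h2 * MOLAR_VOLUME

print(round(hydrogen_litres(1.0), 2), "L of H2 per gram of aluminum")
```

Roughly a litre of hydrogen per gram of aluminum helps explain the "bubbles like crazy" description in the quote below.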

“We don’t need any energy input, and it bubbles hydrogen like crazy. I’ve never seen anything like it,” said UCSC Chemistry Professor Scott Oliver.

The reaction of aluminum and gallium with water works because gallium removes the passive aluminum oxide coating, allowing direct contact of aluminum with water.

Using scanning electron microscopy and x-ray diffraction, the researchers showed the formation of aluminum nanoparticles in a 3:1 gallium-aluminum composite, which they found to be the optimal ratio for hydrogen production. In this gallium-rich composite, the gallium serves both to dissolve the aluminum oxide coating and to separate the aluminum into nanoparticles.

“The gallium separates the nanoparticles and keeps them from aggregating into larger particles,” said Bakthan Singaram, corresponding author of a paper on the new findings. “People have struggled to make aluminum nanoparticles, and here we are producing them under normal atmospheric pressure and room temperature conditions.”

The researchers say the composite for their method can be made with readily available sources of aluminum, including used foil or cans. The composite can be easily stored for long periods by covering it with cyclohexane to protect it from moisture.

While gallium is not abundant and is relatively expensive, it can be recovered and reused multiple times without losing effectiveness. However, it remains to be seen if this process can be scaled up to be practical for commercial hydrogen production.

New screening system may point the way to clean, renewable hydrogen power


A new, highly sensitive system for detecting the production of hydrogen gas may play an important role in the quest to develop hydrogen as an environmentally friendly and economical alternative to fossil fuels, according to Penn State scientists.

"We have built a new system for detecting hydrogen evolution that is the most sensitive in the world," said Venkatraman Gopalan, professor of materials science and engineering and physics at Penn State. "This tackles a problem that had not been addressed but that is important going forward for materials discovery."

The tool can be used to screen promising photocatalysts, materials that, when placed in water and exposed to sunlight, facilitate reactions that split water molecules into hydrogen and oxygen gases, the scientists said. The process, called water splitting, offers a clean and renewable source of hydrogen, but it is inefficient, and finding the right photocatalysts to boost its efficiency has been challenging.

In a study, the team found they could test smaller amounts of material than previously possible and detect very small amounts of hydrogen gas produced, or hydrogen evolution, in the range of tens of nanomoles per hour per tens of milligrams of material. They recently published their findings in the Review of Scientific Instruments.
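To put that sensitivity on a comparable per-gram footing, a quick unit conversion helps; the 10 nmol/h and 10 mg figures below are illustrative placeholders for "tens of", not exact numbers from the paper:

```python
# Normalizing a detection floor of "tens of nanomoles per hour per tens of
# milligrams of material" to a specific rate per gram of photocatalyst.
# The 10/10 inputs are illustrative, not the paper's exact figures.

rate_nmol_per_h = 10.0  # nmol H2 evolved per hour (illustrative)
sample_mass_mg = 10.0   # mg of photocatalyst tested (illustrative)

specific_rate = rate_nmol_per_h / (sample_mass_mg / 1000.0)  # nmol/h per gram
print(f"{specific_rate:.0f} nmol/h/g = {specific_rate / 1000:.1f} umol/h/g")
```

In other words, the system can resolve evolution rates on the order of a micromole of hydrogen per hour per gram of material, which is far below what a typical benchtop gas chromatograph can distinguish from noise.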

"If you ranked low in both the categories of hydrogen evolution rate and the mass of the photocatalyst needed, it means it's a really sensitive system for discovering new photocatalytic materials," said Huaiyu "Hugo" Wang, a graduate student in the Department of Materials Science and Engineering who led the study and built the system. "And it turns out that our work ranked the best in both categories."

Developing photocatalysts is an area of intense research. At Penn State, scientists led by Ismaila Dabo, associate professor of materials science and engineering, recently used a supercomputer to narrow a list of more than 70,000 different compounds down to six promising candidates. Another team led by Raymond Schaak, DuPont Professor of Materials Chemistry, synthesized the materials in their laboratory, but creating even small amounts is expensive and time consuming.

"Typical photocatalysts use rare and precious metals such as platinum, which are immensely expensive," said Julian Fanghanel, a graduate student in materials science and engineering who is co-advised by Dabo and Schaak. "For this project, we are making dozens of samples of materials, so making them in large quantities is impractical, time-consuming and costly."

Gopalan said the new system will allow scientists to test smaller amounts of these materials and focus efforts on the most promising candidates. But when it came time to test samples, the researchers found commercial equipment was not sensitive enough, so Gopalan and Wang built their own.

"They developed from the ground up a uniquely sensitive gas chromatography setup for the reproducible detection of hydrogen, which was instrumental to the validation of our computational predictions," Dabo said. "This newly developed capability was a key enabler to confirm the discovery of new photocatalysts for the solar production of hydrogen."

Unlike the commercial units, the new design can test photocatalysts in their bare state, the scientists said. To be effective, photocatalysts require co-catalysts and other techniques that further improve their efficiency. The gold standard, for example, is titanium dioxide with platinum particles added as a co-catalyst. Photocatalysts without these add-ons are considered bare.

"When we are looking at new materials, we don't know what the correct co-catalysts will be," Wang said. "The simple answer is—detecting the bare form is the quickest way to help guide the direction of this materials discovery process."

Two of the photocatalyst materials tested as part of the study performed better than titanium dioxide did in its bare state, the scientists said. The findings suggest that further study of those materials could yield promising photocatalysts.

"If you have a bare compound that behaved much better than titanium dioxide, then we know this is a potential material to optimize," Wang said. "If we find the right co-catalysts for those materials, we can improve them by orders of magnitude, and these materials could eventually be useful in water splitting."

The scientists said the system is affordable and easy to build from commercially available components. It features a low leakage rate and a small reaction chamber volume, which together allow three orders of magnitude higher detection sensitivity for hydrogen evolution than a conventional gas chromatography system.

"It's not a brand new technology, it's just superior engineering," Gopalan said. "The value of this is that it's a simple, cost-effective system that anyone can build. And if they do, their research for discovering new photocatalysts is going to go much faster."

Also contributing from Penn State was Rebecca Katz, graduate student in the Eberly College of Science.

More information: Huaiyu(Hugo) Wang et al, Ultrasensitive electrode-free and co-catalyst-free detection of nanomoles per hour hydrogen evolution for the discovery of new photocatalysts, Review of Scientific Instruments (2022). DOI: 10.1063/5.0077650

Journal information: Review of Scientific Instruments 

Provided by Pennsylvania State University 

Paul Martin Talks H2 Science Coalition & More Problems With Hydrogen




The H2 Science Coalition is only months old, formed in November 2021 as an independent advocacy and PR group for hydrogen sanity. It’s intended to counter the bags of money going into hydrogen PR that is contrary to elementary physics and economics.

By Michael Barnard
Published 13 hours ago

Paul Martin, Senior Technical Fellow at Zeton and one of five founding members of the H2 Science Coalition, returns to CleanTech Talk. Previously, Paul and I talked about hydrogen end-use cases, leveraging Michael Liebreich's useful hydrogen ladder.

The H2 Science Coalition coalesced on LinkedIn. The group was violently agreeing about hydrogen, among other things. An organization devoted to helping governments get access to good advice and helping groups get access to media suggested they formalize their efforts.

It’s focused on the UK and Europe initially because the hydrogen #hopium epidemic has its epicenter there. The founders all share three attributes: they are independent with no vested interests, they have specific expertise, and they are willing to point at unclothed emperors.

Paul’s expertise is hydrogen production, the alternatives to hydrogen production, and fuels production. Bernard van Dijk is a recently retired airplane-performance lecturer at Amsterdam University of Applied Sciences; he hadn’t given hydrogen much thought until he looked at it and concluded it was nonsense that anyone would think of using hydrogen for aviation. David Cebon is a professor of mechanical engineering at the University of Cambridge, England, with strong expertise in transportation. Tom Baxter is a visiting professor at the University of Strathclyde and an ex-BP engineer, with broad and deep knowledge of what is done and has been done in the fossil fuel industry, including the use of hydrogen as a replacement for natural gas. Jochen Bard is director of the Energy Process Technology Division at Fraunhofer IEE, bringing an energy markets and economics perspective. Many at Fraunhofer are tied up with hydrogen, such as the gray goo hydrogen paste, but it’s a very big place and allows a wide variety of voices to express themselves.

The coalition shares the opinion that replacing natural gas with hydrogen is fundamentally suspect, as is blending hydrogen with natural gas.


Their intent is to branch out to include North America and the world over time, but the current focus is on finding new members with specific expertise. Gaps include shipping, storage, geological storage, and some end use cases.

The motivations for the hydrogen #hopium PR push are clear, but the thinking is murky. In a truly decarbonized future there is no role for natural gas as a fuel. Shell, as an example, has a lot of natural gas reserves in the ground and wants its investors and financiers to think that they still have value. The approach is to pretend that it will be turned into hydrogen and used, with carbon capture and sequestration.

Personally, Martin would never have hydrogen in his home, even though he still has legacy natural gas service. A big part of that is that burning any gas in our nitrogen-rich atmosphere creates nitrogen oxides (NOx): NO and NO2 are toxic and contribute to juvenile asthma, while N2O, nitrous oxide, has 265 times the global warming potential of CO2 and sticks around for at least 100 years. It’s a nasty molecule we can’t afford to make much more of if we are serious about fighting climate change.
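For scale, that 265x figure (the GWP-100 value cited in the article) converts directly into CO2-equivalents; a trivial illustration:

```python
# Converting N2O emissions to CO2-equivalents using the GWP-100 value of 265
# cited in the article (IPCC AR5). Purely illustrative arithmetic.

GWP100_N2O = 265  # kg CO2e per kg N2O over a 100-year horizon

def n2o_to_co2e(kg_n2o):
    """CO2-equivalent mass for a given mass of N2O."""
    return kg_n2o * GWP100_N2O

print(f"1 kg of N2O ~= {n2o_to_co2e(1.0):.0f} kg CO2e")
```

So even small NOx leakage from widespread in-home combustion adds a disproportionate climate burden per kilogram emitted.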

Town gas often comes up in discussions of hydrogen, as it had a high hydrogen percentage. It was made from coal, and it also contained a lot of carbon monoxide, hence the historical tradition of putting heads in ovens. It was called town gas for a reason: it was made in each town. Every town with a system has a giant mess from the coal tar that was dumped. The dumps are often the downtowns, and we’ve spent a lot of money digging up the deposits and burying them elsewhere. The compounds that are nasty and toxic in coal tars are much the same as those in burning tobacco.

The argument that the natural gas system can handle hydrogen because it once carried hydrogen-rich town gas doesn’t wash: transmitting hydrogen over long distances is problematic, and 20% by volume, the typical reference number, is only about 6% of the energy of the displaced natural gas.
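That volume-versus-energy gap can be sanity-checked with approximate lower heating values. The constants below are round textbook numbers, so the result lands near 7% rather than the quoted 6%; the exact figure depends on gas composition and heating-value convention:

```python
# Energy share of hydrogen in a hydrogen/natural-gas blend, by volume.
# Approximate lower heating values per normal cubic meter; treat these
# constants as assumptions, since real gas compositions vary.

LHV_H2 = 10.8  # MJ/Nm3, hydrogen
LHV_NG = 35.8  # MJ/Nm3, natural gas (mostly methane)

def hydrogen_energy_share(vol_fraction_h2):
    """Fraction of the blend's delivered energy carried by the hydrogen."""
    e_h2 = vol_fraction_h2 * LHV_H2
    e_ng = (1 - vol_fraction_h2) * LHV_NG
    return e_h2 / (e_h2 + e_ng)

print(f"20% H2 by volume -> {hydrogen_energy_share(0.20):.1%} of blend energy")
```

The asymmetry comes from hydrogen carrying roughly a third of natural gas's energy per unit volume, so a 20% volumetric blend displaces far less fossil gas than the headline number suggests.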

EU’s target for 2030 is 30% of hydrogen in the natural gas transmission system. Martin published a piece in CleanTechnica that details the challenges with hydrogen in existing natural gas transmission and distribution systems.

20% is a limit set by metallurgy. The ordinary soft steel used in the distribution network is fine, as long as it’s not at 500°C, but soft steel isn’t used in long-distance transmission pipelines. Natural gas has very low corrosivity, and when building the transmission pipelines, engineers used the best material for natural gas. But hydrogen’s characteristics are very different and challenging for natural gas transmission lines, especially around embrittlement.

There are a couple of types of embrittlement. One type involves plating in which electrochemical processes end up making hydrogen atoms at one of the electrodes, and they can permeate between the grains of the metal. When plated structural fasteners are used, they can become brittle. Instead of stretching, they snap.

In pipes, it’s a different phenomenon. In harder steel pipes, the hydrogen diminishes the fatigue life of the material. Pipes bend and flex over time, mostly due to daily pressure changes in the pipe, but also due to frost heaves, flooding exposure, and the like. When something gets bent again and again, the material can change its structure and end up breaking. Pipes are designed for a certain number of pressure changes.

Allowable blend limits vary: for some pipelines it’s 1% hydrogen, for the EU it’s 20%, and for the US it’s 4%. If hydrogen is present when flexing occurs, it can sneak in and fill dislocations. As an analogy, when rolling dough, adding flour reduces the dough’s ability to adhere to the surface. If two metal crystals are in contact with one another, they share forces, but if that space is filled with hydrogen, their ability to stick together is reduced.

Hydrogen is like sand at a beach, bathing suits, and human crevices. It gets in there and causes irritation.

Pipes come in short lengths that are welded together. Spiral welded pipes also exist. Where welding has been used, the heat affected zone is affected differently and to a greater extent by hydrogen. At weld spots, there is a greater potential for cracks forming and propagating.

Transmission of hydrogen through long distance hard steel natural gas pipelines radically degrades their lifetimes. They aren’t long-lasting assets that can be reused, but shorter lived assets that we will be breaking.

And so, the first half of our conversation ended, with analogies of baking and beach time, heart-warming and irritating. But the second half of the conversation exists as well, for those who are members of CleanTechnica Pro. For those interested in hearing more from hydrogen expert Paul Martin, sign up.

In the second half, we talk about the problem of hydrogen diffusing through polyethylene pipes, hydrogen’s 2-5x higher boil-off rate compared to natural gas for shipping, trucking, or aviation, further esoteric exothermic problems with chilling hydrogen, hydrogen’s roughly 20 K liquefaction point, which requires 3x the energy of liquefying natural gas, the 5x global warming potential of hydrogen, the substantially enhanced likelihood of explosions in homes and buildings, the extra energy needed to move hydrogen, the lack of odorants that work with fuel cells, the prevalence of airborne wind energy types that moved into urban air mobility, the waste of public money such as the request from Suncor and ATCO for a hydrogen CCS facility for refinery use, the failure to focus public funds on actual climate solutions, and more.

 

Three critical factors in the end-Permian mass extinction

Fossils of organisms that became extinct millions of years ago can reveal various aspects of how they lived. Credit: W.J.Foster

Two hundred fifty-two million years ago, volcanic eruptions led to massive climate changes. Around the globe, roughly 75 percent of all land-based organisms and roughly 90 percent of all marine organisms died out as a result. Dr. William Foster from Universität Hamburg's Center for Earth System Research and Sustainability (CEN) and colleagues have now used a new form of machine learning to unravel the causes of the mass extinction in the oceans. The "machine" part of machine learning is normally a "black box," but Dr. Foster and colleagues have managed to open that box: they have determined not only which organisms went extinct and which survived, but also why. Their findings have just been published in the journal Paleobiology.

The end of the Permian was characterized by the greatest mass extinction event in Earth's history. Two hundred fifty-two million years ago, a series of volcanic eruptions in Siberia led to a massive release of greenhouse gases. Over the course of the next several millennia, the climate ultimately warmed by 10 degrees. As a consequence, on land, roughly 75 percent of all species went extinct; in the oceans, the number was roughly 90 percent.

By analyzing how the now-extinct organisms once lived, Dr. Foster and his team were able to directly link their extinction to the following climate changes: declining oxygen levels in the water, rising water temperatures, and most likely also ocean acidification.

These changes are similar to current trends. "Needless to say, our findings on the Permian can't be applied to modern climate change one-to-one. The two climate systems are far too different," says Foster, a geoscientist. "Yet they do show which traits were critical for an organism's survival or extinction—under similar conditions. This can offer us valuable indicators for who or what will be at the greatest risk in the future."

Specifically, the team analyzed more than 25,000 records on 1,283 genera of fossil marine organisms like bivalves, snails, sponges, algae and crustaceans from the region of South China—all of which had mineral skeletons or shells. Their fossilized remains can be dated using a special method, offering insights into marine life dating back millions of years. The team also drew on an enormous database that offers additional information on various ecological aspects of how these organisms lived.

The coastline of Svalbard in Norway exposes Permian-Triassic rocks and the record of the mass extinction. Credit: W.J.Foster

For each genus, 12 of these criteria were analyzed. Did certain traits make a given organism more likely to survive under the conditions prevalent at the end of the Permian—or not? With the aid of machine learning, a method from the field of artificial intelligence, all of these factors were analyzed jointly and simultaneously. In the process, the machine essentially made certain rational decisions on its own. Once this was done, the team compared the results: what organisms were there before, during and after the mass extinction?

Their findings reveal the four factors that were most essential to whether or not organisms survived the end of the Permian: where in the water they lived, the mineralization of their shells, species diversity within their genus, and their sensitivity to acidification.

Dr. Foster says, "But with previous machine learning applications, we couldn't say how the machine made its decisions."

Using a newly implemented method from game theory, Dr. Foster has now succeeded in unraveling this aspect. "Some animals lived in deeper water. Here, the machine shows that the worsening lack of oxygen posed a risk. In contrast, those animals that lived nearer the surface had to contend with the rising water temperatures. Plus, when you have only a limited habitat, you have nowhere to go when that specific habitat becomes uninhabitable."
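The game-theoretic idea behind such explainable-ML methods is the Shapley value: each trait's credit is its average marginal contribution to the model's prediction over all orderings of the traits. Here is a minimal pure-Python sketch with an invented additive risk model and made-up trait weights; it illustrates the attribution technique, not the study's actual model or data:

```python
# Toy Shapley-value attribution: average each feature's marginal contribution
# over every ordering in which features could be "revealed" to the model.
# The risk model and trait weights below are invented for illustration only.
from itertools import permutations

def risk_model(traits):
    """Hypothetical extinction-risk score from a dict of trait values."""
    score = 0.0
    score += 0.5 * traits.get("depth", 0)      # deeper habitat: deoxygenation risk
    score += 0.3 * traits.get("calcified", 0)  # mineralized shell: acidification risk
    score -= 0.2 * traits.get("diversity", 0)  # diverse genus: more resilient
    return score

def shapley_values(model, instance):
    """Exact Shapley values by enumerating all feature orderings."""
    names = list(instance)
    phi = {n: 0.0 for n in names}
    orders = list(permutations(names))
    for order in orders:
        present = {}
        before = model(present)
        for name in order:
            present[name] = instance[name]   # reveal this feature
            after = model(present)
            phi[name] += (after - before) / len(orders)
            before = after
    return phi

print(shapley_values(risk_model, {"depth": 1, "calcified": 1, "diversity": 1}))
```

For this additive toy model the attributions simply recover each weight, but the same averaging procedure applies to arbitrarily nonlinear models, which is what lets researchers ask "why did the machine flag this organism?" rather than treating the prediction as a black box.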

As such, the results show which of the organisms' traits were determined to be potentially fatal. The team was ultimately able to confirm that the mass extinction can be directly attributed to deoxygenation, rising water temperatures and acidification, which indicates that, in a future climate crisis, these could also be the three main causes of extinction in the long term.


More information: William J. Foster et al, Machine learning identifies ecological selectivity patterns across the end-Permian mass extinction, Paleobiology (2022). DOI: 10.1017/pab.2022.1

Journal information: Paleobiology 

Provided by University of Hamburg 

Ex-chief of Hong Kong barristers says he left city after police interview


Published March 2, 2022

HONG KONG : A British human rights lawyer and former head of Hong Kong's Bar Association said he left the city on Tuesday evening, hours after China's state-backed media said he was summoned to a local police station and interviewed by national security police.

The Hong Kong-based Wen Wei Po reported that Paul Harris had been summoned to a police station in the city's Wanchai district, with Harris shown walking into the police station in a video posted on its website.

The report on Wednesday also carried a second video of Harris at Hong Kong International Airport pushing a trolley with suitcases. In the video, Harris was asked by a reporter whether he was afraid of being arrested, but he didn't comment. Reuters could not confirm when either video was filmed.

Another state-backed newspaper, Ta Kung Pao, said in a separate report on Wednesday that Harris had "fled" Hong Kong after being called in to meet police for an "alleged violation" of the national security law.

Harris confirmed to Reuters in a phone message that he had left the city, but gave no further details.

"Yes, on way to see my mother in England," he said. Harris declined to answer any further questions from Reuters.

Harris stood down in January from the Bar Association, a professional body representing Hong Kong's 1,500 barristers, without seeking re-election.

Harris' year-long term was marked by criticism from Hong Kong and Beijing officials who described him as "anti-China", and repeated attacks in state-backed media.

Harris, a senior counsel, had commented on jail terms for several democracy activists last year and the sweeping national security law that China imposed on the city in June 2020.

Two of Harris' fellow barristers said he had been planning for some time to leave Hong Kong in March, but it was not known for how long.

At the Bar's in-house election in January, Harris was replaced by fellow barrister Victor Dawes.

Some lawyers see Dawes as a pro-establishment selection after several years of strong stances by the association in the former British colony, whose robust legal system helped make it an attractive international financial centre.

Dawes said at the time his team would work to keep the Bar's independence and defend human rights if needed as prosecutions moved through courts under the national security law, which came after months of anti-government protests.

The law expands police powers of search and surveillance and punishes acts of subversion, terrorism, collusion with foreign forces and secession with up to life imprisonment.

Harris had been involved in several cases related to the law.

Dawes and the Bar Association did not respond to Reuters' requests for comment on Tuesday night and Wednesday morning.

(Reporting By Greg Torode, James Pomfret and Jessie Pang. Editing by Gerry Doyle)
KOREA
Moon calls nuclear power main source of electricity over next 60 years


President Moon Jae-in presides over a meeting on stable supply of power at the presidential office in Seoul, Friday. Yonhap

President Moon Jae-in said Friday that Korea will fully use its nuclear power plants under its nuclear phase-out scheme, calling nuclear power a main source of electricity over the next 60 years.

"Over the next 60 years, while nuclear power plants continue to operate, nuclear power plants should be fully utilized as a main base source of electricity," Moon told a meeting on stable supply of power, according to presidential spokesperson Park Kyung-mee.

The construction of four nuclear reactors ― Shin-Hanul No. 1 and No. 2, as well as Shin-Kori No. 5 and No. 6 ― has been delayed due to safety inspections and other issues.

Moon urged officials to make efforts to gradually start operations at the four nuclear reactors, because safety standards for operations have been strengthened, Park said.

Moon said the Korean government's energy transition policy is to gradually suspend the construction of new nuclear power plants and prohibit the extension of the lifespans of older nuclear power plants until 2084.

Moon called on officials to "make every effort to ensure the safety of nuclear power plants while maintaining proper operation rate," according to Park.

The nuclear phase-out scheme set by Moon's government centers on slowly breaking away from nuclear energy by refraining from building additional plants while retiring old ones.

Under the roadmap, Korea plans to decrease the number of nuclear plants in operation to 17 by 2034 from 24 at present.

The government plans to reduce nuclear energy to account for 23.9 percent of the country's total power generation by 2030 from around 30 percent last year. It targets raising the proportion of renewable sources to 20 percent from 6.6 percent over the same period. (Yonhap)

ZOMBIE APOCALYPSE

What brain-eating amoebae can tell us about the diversity of life on earth and evolutionary history

Naegleria gruberi cells use one set of tubulins to build a mitotic spindle (cyan, left), and another set of tubulins (orange, right) to transform into a flagellate cell type. Credit: Katrina Velle, Fritz-Laylin Lab, UMass Amherst

An international team of researchers, led by the University of Massachusetts Amherst, recently announced in the journal Current Biology that an amoeba called Naegleria has evolved more distinct sets of tubulins, used for specific cellular processes, than previously thought. Their insight has a host of implications, which range from developing treatments for brain-eating infections to better understanding how life on earth evolved such enormous diversity.

Much of life on earth relies on a series of polymers called microtubules, composed of the protein tubulin, to complete a wide range of tasks inside their cells. These microtubules are like the 2x4s of the cell and are used in everything from helping the cell to move, to transporting food and waste within the cell, to giving the cell structural support.

Microtubules also help in mitosis, which is when a single cell divides into two by first duplicating its chromosomes and then pulling each set to opposite sides of the cell before dividing itself in two. One of the key moments in mitosis is when a spindle, made up of microtubules, grabs hold of the chromosomes and helps separate them into two identical sets.

This is where Naegleria comes in. Biologists had previously known that Naegleria uses a specific kind of tubulin during mitosis. But the new study, led by Katrina Velle, a postdoc in biology at UMass Amherst and the paper's lead author, shows that Naegleria employs three additional distinct tubulins. One pair of tubulins is used only during mitosis, while another, the flagellate tubulin, specializes in cellular movement. The authors of the study then compared the tubulins, and the structures they build, to each other and to those of more commonly studied species.

The cell surface of a Naegleria gruberi amoeba visualized by scanning electron microscopy. Credit: Katrina Velle, Fritz-Laylin Lab, UMass Amherst, taken at the Marine Biological Laboratory Central Microscopy Center

The implications of this work are exciting and range from the practical to the theoretical. For instance, the team studied a species of Naegleria, Naegleria gruberi, which is closely related to Naegleria fowleri—an amoeba that can eat your brain. "If we can understand the basic biology of Naegleria," says Velle, "we can learn how to kill it by devising drugs that target the amoeba's unique tubulins."

But Naegleria also helps us to understand the basic rules that govern life on earth. "All organisms have to replicate themselves," says Lillian Fritz-Laylin, professor of biology at UMass Amherst and a senior author of the paper. "We know how the replication process works for some cells, but there's a huge set that we don't understand. Naegleria lets us test the rules scientists have come up with to see if they hold here."

To conduct their research, the team relied in part on the state-of-the-art microscopy equipment at UMass Amherst's Institute for Applied Life Sciences (IALS), which combines deep and interdisciplinary expertise from 29 departments on the UMass Amherst campus to translate fundamental research into innovations that benefit human health and well-being. The team grew the Naegleria cells, stained them with different chemicals so that the tubulins would glow, and then took extremely high resolution, 3D photographs, which allowed them to measure, count and analyze the different microtubule structures.

"I've spent most of my career studying the mitotic spindles of more common cells, like mammalian cells," says Patricia Wadsworth, professor of biology at UMass Amherst and one of the paper's senior authors. "The tools of modern biology allow us to explore more diverse cells, like Naegleria, which is in some ways similar, but also very different."

"People often think of technology driving science," says Fritz-Laylin. "But in this case, the questions we are trying to answer are so fundamental to how life on earth operates, and of such interest to so many scientific specialties, that we needed to assemble an international team of various experts. In this case, collaboration, teamwork and effective communication drove the science."

More information: Katrina B. Velle et al, Naegleria's mitotic spindles are built from unique tubulins and highlight core spindle features, Current Biology (2022). DOI: 10.1016/j.cub.2022.01.034

Journal information: Current Biology 

Provided by University of Massachusetts Amherst