Tuesday, September 07, 2021

Why is the color blue so rare in nature?


By Mindy Weisberger

Feeling blue? That color isn't as common as you may think.
In poison dart frogs, bright blue colors broadcast a warning to predators that the animal is toxic. 
(Image credit: Lillian King/Getty Images)

When you look up at the blue sky overhead or gaze across the seemingly endless expanse of a blue ocean, you might think that the color blue is common in nature.

But among all the hues found in rocks, plants and flowers, or in the fur, feathers, scales and skin of animals, blue is surprisingly scarce.

So why is blue so rare? The answer stems from the chemistry and physics of how colors are produced — and how we see them.

Related: Why is the sky blue?


We're able to see color because each of our eyes contains between 6 million and 7 million light-sensitive cells called cones. There are three different types of cones in the eye of a person with normal color vision, and each cone type is most sensitive to a particular wavelength of light: red, green or blue. Information from millions of cones reaches our brains as electrical signals that communicate the types of light reflected by what we see, which the brain then interprets as different shades of color.

When we look at a colorful object, such as a sparkling sapphire or a vibrant hydrangea bloom, "the object is absorbing some of the white light that falls onto it; because it's absorbing some of the light, the rest of the light that's reflected has a color," science writer Kai Kupferschmidt, author of "Blue: In Search of Nature's Rarest Color" (The Experiment, 2021), told Live Science.

"When you see a blue flower — for instance, a cornflower — you see the cornflower as blue because it absorbs the red part of the spectrum," Kupferschmidt said. Or to put it another way, the flower appears blue because that color is the part of the spectrum that the blossom rejected, Kupferschmidt wrote in his book, which explores the science and nature of this popular hue



In the book "Blue," writer Kai Kupferschmidt explores the science behind this elusive color. (Image credit: Courtesy of The Experiment)

In the visible spectrum, red has the longest wavelengths, meaning its light carries the least energy of any visible color. For a flower to appear blue, "it needs to be able to produce a molecule that can absorb very small amounts of energy," in order to absorb the red part of the spectrum, Kupferschmidt said.
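
To put rough numbers on that (standard photon arithmetic, not from the article): a photon's energy is inversely proportional to its wavelength, so red light at the long end of the visible spectrum carries noticeably less energy than blue light at the short end.

```latex
E = \frac{hc}{\lambda}
\qquad\Rightarrow\qquad
E_{\text{red}} \approx \frac{1240\ \text{eV nm}}{700\ \text{nm}} \approx 1.8\ \text{eV},
\quad
E_{\text{blue}} \approx \frac{1240\ \text{eV nm}}{450\ \text{nm}} \approx 2.8\ \text{eV}
```

A blue-looking pigment therefore needs electronic transitions tuned to that smaller, roughly 1.8 eV gap of red photons, which is the delicate tuning Kupferschmidt describes.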

Generating such molecules — which are large and complex — is difficult for plants to do, which is why blue flowers are produced by fewer than 10% of the world's nearly 300,000 flowering plant species. One possible driver for the evolution of blue flowers is that blue is highly visible to pollinators such as bees, and producing blue blossoms may benefit plants in ecosystems where competition for pollinators is high, Adrian Dyer, an associate professor and vision scientist at the Royal Melbourne Institute of Technology in Melbourne, Australia, told the Australian Broadcasting Corporation in 2016.

As for minerals, their crystal structures interact with ions (charged atoms or molecules) to determine which parts of the spectrum are absorbed and which are reflected. The mineral lapis lazuli, which is mined primarily in Afghanistan and produces the rare blue pigment ultramarine, contains trisulfide ions — three sulfur atoms bound together inside a crystal lattice — that can release or bind a single electron, and the electron's jump between those energy states absorbs light at the red end of the spectrum.

"That energy difference is what makes the blue," Kupferschmidt said.


Azurite is a copper carbonate hydroxide mineral known for its deep-blue color. (Image credit: Serge Briez/capmediations/Getty Images)


Blue animals' colors don't come from chemical pigments. Rather, they rely on physics to create a blue appearance. Blue-winged butterflies in the Morpho genus have intricate, layered nanostructures on their wing scales that manipulate incoming light so that some colors cancel each other out and only blue is reflected; a similar effect occurs in structures found in the feathers of blue jays (Cyanocitta cristata), the scales of blue tangs (Paracanthurus hepatus) and the flashing rings of venomous blue-ringed octopuses (Hapalochlaena maculosa).
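
As a simplified illustration of that interference effect (an idealized multilayer model, not a description of any particular species): reflections from a stack of thin layers reinforce one another only when the extra round-trip path through each layer equals a whole number of wavelengths.

```latex
2\,n\,d\,\cos\theta = m\lambda, \qquad m = 1, 2, 3, \ldots
```

Here n is the layer's refractive index, d its thickness, and θ the angle of travel inside it. With layer spacings on the order of 100 to 200 nanometers, only wavelengths near blue satisfy the condition, so blue reflects strongly while other colors interfere destructively.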

Blue shades in mammals are even rarer than in birds, fish, reptiles and insects. Some whales and dolphins have bluish skin; primates such as golden snub-nosed monkeys (Rhinopithecus roxellana) have blue-skinned faces; and mandrills (Mandrillus sphinx) have blue faces and blue rear ends. But fur — a trait shared by most terrestrial mammals — is never naturally bright blue, at least not in visible light. (Researchers recently found that platypus fur glows in vivid shades of blue and green when exposed to ultraviolet (UV) rays, Live Science previously reported.)


The highly venomous blue-ringed octopus. (Image credit: Belive/Getty Images)

"But it takes a lot of work to make this blue, and so the other question becomes: What are the evolutionary reasons to make blue? What's the incentive?" Kupferschmidt said. "The fascinating thing when you dive into these animal worlds is always, who's the recipient of this message and can they see the blue?"

For example, while humans have three light-sensing receptor types in our eyes, birds have a fourth receptor type for sensing UV light. Feathers that appear blue to human eyes "actually reflect even more UV light than blue light," Kupferschmidt explained. By that reasoning, the birds that we call blue tits (Cyanistes caeruleus) "would probably call themselves 'UV tits,' because that's what they would mostly see," he said.


Related: How do dogs see the world?


Because of blue's scarcity in nature, the word for blue was a relative latecomer to languages around the world, appearing after the words for black, white, red and yellow, according to Kupferschmidt.

"One theory for this is that you really only need to name a color once you can dye things — once you can divorce the color from its object. Otherwise, you don't really need the name for the color," he explained. "Dyeing things blue or finding a blue pigment happened really late in most cultures, and you can see that in the linguistics."


Birds' brilliant blue plumage, such as that of Spix's macaws (Cyanopsitta spixii), gets its color not from pigments but from structures in feathers that scatter light. (Image credit: Wera Rodsawang/Getty Images)

The earliest use of blue dye dates to about 6,000 years ago in Peru, and the ancient Egyptians combined silica, calcium oxide and copper oxide to create a long-lasting blue pigment known as irtyu for decorating statues, researchers reported Jan. 15 in the journal Frontiers in Plant Science. Ultramarine, a vivid blue pigment ground from lapis lazuli, was as precious as gold in medieval Europe, and was reserved primarily for illustrating illuminated manuscripts.

RELATED MYSTERIES

How do we see in color?

Why do babies' eyes start out blue, then change color?

Why is the ocean blue?

Blue's rarity meant that people viewed it as a high-status color for thousands of years. Blue has long been associated with the Hindu deity Krishna and with the Christian Virgin Mary, and artists who were famously inspired by blue in nature include Michelangelo, Gauguin, Picasso and Van Gogh, according to the Frontiers in Plant Science study.

"The relative scarcity of blue available in natural pigments likely fueled our fascination," the scientists wrote.

Blue also colors our expressions, appearing in dozens of English idioms: You can work a blue-collar job, swear a blue streak, sink into a blue funk or talk until you're blue in the face, to name just a few. And blue can sometimes mean contradictory things depending on the idiom: "'Blue sky ahead' means a bright future, but 'feeling blue' is being sad," Kupferschmidt said.

Blue’s scarcity in nature may have helped shape our perception of the color and things that appear blue. "With blue, it's like a whole canvas that you can still paint on," Kupferschmidt said. "Maybe because it is rare in nature and maybe because we associate it with things that we can't really touch, like the sky and the sea, it's something that is very open to different associations."

Originally published on Live Science.
Three Ways to Prevent Our Next Massive Power Failure

Hurricane Ida exposed the grid’s weaknesses. It didn’t have to be this way.


EMILY PONTECORVO

Utility poles lean over a street following Hurricane Ida on August 31, 2021 in Houma, Louisiana.
Scott Olson/Getty


This story was originally published by Grist and is reproduced here as part of the Climate Desk collaboration.

Hurricane Ida, one of the strongest storms on record to hit the US, intensified so rapidly before hitting New Orleans that city officials did not have enough time to issue a mandatory evacuation order. Limited exit routes from the city meant that people would have been stuck in traffic on the highway when the storm came. Those who stayed in the city and surrounding area were hit by 150-mile-per-hour winds and heavy rains that knocked out power to almost 1 million homes and businesses in Louisiana and Mississippi on Monday. By Thursday afternoon, in the midst of a blazing heat wave, Entergy, the utility that serves most of the region, reported that only 18 percent of its system had been restored.

At the same time that stronger, wetter storms like Ida are exposing the dangerous weaknesses of the US electricity grid, the clearest pathways to stop the effects of climate change from getting worse all involve people becoming more and more reliant on it—for example, by trading gas-powered cars for electric ones, or using renewable electricity to heat homes. As demand for electricity grows, experts say that the way utilities and policymakers address grid resilience, which is largely reactive rather than preventative, has to change.


“The reality is, our infrastructure is built for the climate of the past, and we keep rebuilding it by incremental improvements,” said Roshi Nateghi, an assistant professor of industrial engineering at Purdue University. “And that’s just not gonna cut it.”

Resilience is a slippery word. There’s no universally agreed-upon way to define or measure it. Experts say it’s unrealistic to expect a grid that never has outages, but there are at least three different kinds of solutions that Nateghi and others point to that could help our electricity system withstand stronger storms and, in the inevitable case of an outage, ensure that communities get the minimal service needed to remain safe.

The first begins with what we’ll call the old, incremental way of thinking—a focus on the physical infrastructure that makes up the grid. The scale of the damage wrought by Ida was severe. Entergy reported that within its transmission system—the high-voltage poles and wires that deliver electricity from power plants to the distribution lines that serve customers’ neighborhoods—more than 200 lines and 200 substations had been put out of service by the storm. In its distribution system, about 10,000 poles, 13,000 wires, and 2,000 transformers were damaged or destroyed.

There’s a lot utilities can do to minimize this kind of damage during storms. They can design systems to withstand stronger winds by using stronger wires supported by poles spaced more closely together. They can replace wooden poles with concrete and steel, and be diligent about trimming nearby trees. But Nateghi said these kinds of fixes are piecemeal and may be more expensive in the long term than an often-debated solution with high upfront costs—burying power lines underground. “It’s always argued to be really expensive,” said Nateghi, adding that when you look at the full costs of these disasters, many of which run into the billions, burying lines might not seem so expensive. Buried lines are protected from wind and can be insulated from flooding. The downside is that they are harder to access for repairs.


Logan Burke, the executive director of New Orleans-based nonprofit the Alliance for Affordable Energy, said there have been conversations about burying lines in New Orleans for decades. Part of the problem is that the cost of burying lines would likely get passed on to customers through their electric bills, and that the city, and Louisiana at large, has extreme levels of poverty and high energy burdens. Half of the low-income households in New Orleans spend more than 10 percent of their income on energy, according to a 2016 report, and a quarter spend more than 19 percent, compared to a national average energy burden of 3.5 percent.

“The hesitance to burying lines is, how do we do this in a way that people can afford?” said Burke. Unless the federal infrastructure bill, or a reconciliation bill, provides additional dollars for that kind of project, she said, it’s simply not an option for Louisiana.

The bipartisan infrastructure bill that the Senate passed in early August contained $65 billion for the power grid, with $10 billion to $12 billion specifically for building new transmission lines. The Biden administration also announced last month that it is making nearly $5 billion available through the Federal Emergency Management Agency for projects that improve community resilience to extreme weather.

The second possible solution, which is cheaper than burying lines and something that utilities can take advantage of today, is using predictive computer modeling to identify where the biggest weaknesses in their systems are in order to make those incremental improvements more strategically. Nateghi and other academic researchers have published methods that use meteorological models of climate impacts and translate them into potential infrastructure damage to predict which areas are most likely to lose power. As part of her doctoral research, Nateghi worked with utilities in the southeast to incorporate such models into their planning and said they were able to cut costs and fare better in future storms. Farzad Ferdowsi, an assistant professor of electrical and computer engineering at the University of Louisiana who has worked with Entergy, agreed that one of the things the company could do to improve resilience is more comprehensive modeling.
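
The article doesn't describe those models' internals; as a hedged sketch of the general idea, the snippet below maps forecast wind speeds onto pole-failure probabilities using a logistic fragility curve. Every name and number in it (the failure_probability function, the midpoint and steepness parameters, the segment list) is invented for illustration.

```python
import math

def failure_probability(wind_mph: float, midpoint: float = 110.0, steepness: float = 0.08) -> float:
    """Toy logistic fragility curve: the chance a distribution pole fails
    at a given sustained wind speed. Parameters are illustrative only."""
    return 1.0 / (1.0 + math.exp(-steepness * (wind_mph - midpoint)))

# Hypothetical forecast wind speeds (mph) for three grid segments.
forecast = {"Segment A": 150.0, "Segment B": 95.0, "Segment C": 70.0}

# Rank segments by predicted failure risk to target hardening investments.
for name, wind in sorted(forecast.items(), key=lambda kv: -failure_probability(kv[1])):
    print(f"{name}: p(failure) = {failure_probability(wind):.2f}")
```

Real utility models layer far more onto this skeleton, such as vegetation, soil conditions and asset age, but the principle is the same: turn a weather forecast into a ranked list of likely failure points before the storm arrives.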


But regardless, the grid will sometimes fail in one way or another. That’s why Burke thinks it’s more important to shift the conversation around resilience away from utilities to people. “We think it’s so important to be thinking about how to help people stay safe in their homes or where they’re sheltering, and that includes things like distributed solar and storage,” she said. New Orleans has a lot of rooftop solar, but most of it isn’t paired with batteries, which would allow it to provide power when the larger grid goes down. Burke imagines homes and community-based organizations like libraries, churches, and schools that have solar and storage systems that could be connected to form “neighborhood reliability corridors.” They would be able to operate as microgrids, independently from Entergy’s system, and allow communities to access cooling and other basic electricity needs in the aftermath of storms.

Entergy has forcefully fought proposals to allow for more locally produced and controlled electricity in New Orleans, instead convincing the city council to allow it to build a new gas-fired power plant in the city on the grounds of improved resilience during storms. That plant didn’t keep the power on during Ida because of damage to transmission and distribution lines. The company was able to start it up on Wednesday morning and provide power to a small part of New Orleans East, but most of the city is still blacked out, and Entergy has not yet provided estimates for when power will be restored.

New Orleans power outages as of the morning of Thursday, September 2


Entergy

“We expect to complete assessing all damage today, and then we can begin providing estimated restoration times for customers,” said Deanna Rodriguez, Entergy New Orleans’ president and CEO, during a press conference on Thursday morning.

Entergy can earn a rate of return on big capital investments like power plants, while locally produced solar would eat away at its profits. As in other cities, there have been calls for a public power utility that wouldn’t be subject to profit-motivated decision-making; Burke said she has heard many in the last month. But she’s not very optimistic about a future for public power in New Orleans.

“New Orleans only has one Fortune 500 company, and it is Entergy,” she said. “They wield political power, they fund a lot of nonprofits. The kind of power that they have is fairly unmatched in the state. And so a movement to municipalize has a big heavy barrier up against it.”
Lawmakers: Ida damage shows need for infrastructure upgrades

By MATTHEW DALY and HOPE YEN

United States Geological Survey workers push a boat as they look for residents on a street flooded as a result of the remnants of Hurricane Ida in Somerville, N.J., Thursday, Sept. 2, 2021. (AP Photo/Eduardo Munoz Alvarez)

WASHINGTON (AP) — Shaken by haunting images of surging rivers, flooded roads and subways and other damage caused by the remnants of Hurricane Ida, lawmakers from both parties are vowing to upgrade the nation’s aging infrastructure network.

As the deadly storm moved from the Gulf Coast through the Northeast, members of Congress said the deluge offered irrefutable evidence that power lines, roads, bridges and other infrastructure are deteriorating even as storms and other extreme weather are strengthening. At least 50 people from Virginia to Connecticut died as storm water from Ida’s remnants cascaded into people’s homes and engulfed automobiles, overwhelming urban drainage systems unable to handle so much rain in such a short time.

At least 16 deaths were blamed on the storm in Louisiana, Mississippi and Alabama.

“Global warming is upon us,” said Senate Majority Leader Chuck Schumer, D-N.Y. “When you get two record rainfalls in a week (in New York City), it’s not just coincidence. When you get all the changes that we have seen in weather, that’s not a coincidence. ... It’s going to get worse and worse and worse, unless we do something about it.”

Schumer and other lawmakers said the catastrophe is the latest example of why the nation needs the nearly trillion-dollar infrastructure bill passed by the Senate last month. He and other Democrats also are calling for passage of President Joe Biden’s $3.5 trillion, partisan rebuilding plan aimed at helping families and combating climate change.

“It’s so imperative to pass the two bills,” Schumer said.

Democrats hope to pass both bills by the end of this month, but action on the bipartisan bill may be difficult until the larger package is ready. Progressives have said they won’t support a bipartisan bill without strong companion legislation to advance their priorities.

Biden made a pitch Friday for the bipartisan bill, saying it “is going to change things on our streets across the country.” He cited the bill’s “historic investment” in roads, rail and bridges, as well as clean energy, clean water and universal broadband.

“It’s about resilience,” Biden said. “Make our roads and highways safer. Make us more resilient to the kinds of devastating impacts from extreme weather we’re seeing in so many parts of the country.”

The plan includes $110 billion to build and repair roads and bridges and $66 billion to upgrade railroads. It also includes about $60 billion to upgrade the electric grid and build thousands of miles of transmission lines to expand use of renewable energy and nearly $47 billion to adapt and rebuild roads, ports and bridges to help withstand damage from stronger storms as well as wildfires and drought.

“If we’re going to make our country more resilient to natural disasters, whatever they are, we have to start preparing now,” said Sen. Bill Cassidy, R-La.

Cassidy, a lead negotiator on the bipartisan bill, has touted the infrastructure legislation as a boon to hurricane-prone states such as his. “I’m sure hoping that Republicans look around my state, see this damage and say, ‘If there’s money for resiliency, money to harden the grid, money to help sewer and water, then maybe this is something we should be for,’” he told CNN.

Ultimately, repair and replacement of roads, bridges and other infrastructure damaged by Hurricane Ida and other natural disasters are likely to be funded by Congress as emergency relief money. But the bipartisan bill will be valuable in providing major investments in “future-proofing” infrastructure against climate change and extreme weather such as Ida, according to Jeff Davis, a senior fellow at the Eno Center for Transportation, a Washington think tank.

The bill would be the first to devote money for “climate resilience,” including $17 billion for the Army Corps of Engineers to address backlogs in federal flood control projects.

The National Oceanic and Atmospheric Administration would receive $492 million to map inland and coastal flooding, including “next-gen” modeling and forecasts. Another $492 million would go toward improving the resilience of coastal communities to flooding by restoring natural ecosystems.

The legislation also provides $3.5 billion for the Federal Emergency Management Agency to help communities reduce the risk of flood damage and $8.7 billion for the Transportation Department to award grants to states to improve resiliency in ports and other coastal infrastructure.

“We have to start planning for what the future might hold and do modeling that’s going to help us predict what these future risks are going to be,” FEMA Administrator Deanne Criswell told “Fox News Sunday,” calling the increasing frequency of severe storms that intensify more rapidly “a new normal” because of climate change. “These threats aren’t going to go away, and we need to start to reduce those impacts.”

The U.S. had 22 climate and weather disasters in 2020 with losses exceeding $1 billion each, with eight such disasters this year as of July 9, according to NOAA. Ida and its remnants will likely cost in the tens of billions, analysts say.

Rep. Garret Graves, R-La., said a bipartisan infrastructure bill is needed, but the bill approved by the Senate could harm oil-producing states such as Louisiana by freezing out benefits for states that encourage fossil-fuel production.

The Democratic-only bill is even worse, Graves said, calling it “a final nail in the coffin” to the offshore oil industry, which is already struggling because of the pandemic and the hurricane.

“All this does is benefit Iran,” Graves said. “It benefits Russia. It benefits China.”

In an interview, Graves said he would prefer to use emergency spending to help Louisiana and other states hit by the hurricane. That way, money “is tailored to the disaster” and based on need, not ideology, he said.

But Ed Potosnak, executive director of the New Jersey League of Conservation Voters, said emergency spending, and even the bipartisan infrastructure bill, is not sufficient to address the threat of climate change.

“We have not taken the bold measures we need to protect our families and our way of life and our communities that we cherish,” he said.

Potosnak, whose central New Jersey neighborhood was flooded by Ida, said storms are increasing in intensity and frequency, with at least seven “100-year storms” in the past few decades.

“I hope this storm is a reminder to all our elected officials: This is what climate change looks like,″ Potosnak said. “Congress needs to act to match the challenge we face.”
The quiet search for dark matter deep underground

In which the author travels to South Dakota to visit a gold mine—housing LUX.


MATTHEW FRANCIS - 9/6/2021, 6:39 AM

A mile below ground, a sign hangs over the door to the LUX dark matter experiment telling visitors how far to Wall Drug—in both dimensions.

Matthew R. Francis


Update, Sept. 6, 2021: It's Labor Day Weekend in the US, and even though most of us are continuing to call home "the office," Ars staff is taking a long weekend to rest and relax. And given we can't travel like we could during Labor Day Weekends past, we thought we'd revisit one of our favorite trips from the archives. This story on our adventure to the Large Underground Xenon (LUX) dark matter experiment in South Dakota originally ran in July 2014, and it appears unchanged below.

One of the quietest, darkest places in the cosmos isn’t out in the depths of space. It’s at the center of a tank of cold liquid xenon in a gold mine deep under the Black Hills of South Dakota. It needs to be that quiet: any stray particles could confuse the detectors lining the outside of the tank. Those detectors are looking for faint, rare signals, ones that could reveal the presence of dark matter.

The whole assembly—the container of liquid and gaseous xenon, the water tank enveloping that, and all the detectors—is called the Large Underground Xenon (LUX) dark matter experiment. So far, LUX hasn’t found anything, but its operational life is just beginning: the detector was installed and began operations only last year.

Though still relatively young, LUX has already set many standards for hunting for dark matter particles. When I visited, the facility was gearing up for the next data collection run, one that will involve 300 days of constant operation. The size and sensitivity of the experiment, its designers’ dedication to understanding any noise sources, and the relative simplicity of the detector lead many to hope that if there’s any dark matter to be found, LUX—or its successor—will find it.

(I’ll use “detector” to describe LUX as a whole in addition to the individual photon detectors that are the business-end of the experiment. I hope the context will make it clear which is which.)

That last "if" is a big one, of course. Dark matter is remarkable for its invisibility—it neither absorbs nor emits light of any wavelength. We know about it through its gravitational action, the way it shaped galaxies, organized the largest objects in the Universe, and affected the spectrum of light from the early days of the cosmos. Based on the structure of galaxies, astronomers suspect it is made of particles. But how massive those particles are, how many types might exist, and how they interact are still matters of some conjecture.


The LUX experiment is inside a metal tank containing 71,600 gallons of pure water, so the xenon detector isn't visible. The head at lower left is LUX researcher Rick Gaitskell.
Matthew R. Francis

If, for example, dark matter doesn’t interact with ordinary matter on any scale our experimental cleverness can currently reach, LUX will turn up nothing, no matter how sophisticated the experiment is.



Rick Gaitskell, one of the principal researchers on LUX, is a professor at Brown University in the United States. (He was born and educated in England, a heritage that came through in his sartorial choice of green flannel three-piece suit with bright red socks.) He told Ars, “I’ve been looking for dark matter for 27 years. We’ve had a number of results over the years where we’ve either ended up seeing nothing or one or two cases we’ve seen something that looked actually rather exciting, in that it showed initially many of the features you might associate with a dark matter interaction.” However, none of those have been confirmed by later experiments—and most now have mundane explanations.

This past frustration is part of what drives Gaitskell. He quit another experiment when he concluded it was likely never to succeed: the detection rate, based on calculations, would likely be less than one dark matter particle per century for each kilogram of detector material. For most detector materials, that implies a prohibitively high cost. If you want to find dark matter on the scale of a year or two, you’d need a detector with masses of several metric tons. But mass is where LUX and its siblings—such as the XENON100 experiment at Gran Sasso in Italy—excel. Using xenon as the detection medium has many advantages, including the ability to build large.
Unicorns, Wall Drug, and dark matter


The lab housing LUX has many small paper unicorns sitting on top of server racks and other equipment.
Matthew R. Francis

To understand how LUX hunts dark matter, I decided to visit the lab and see for myself. It's not exactly in a location that makes you think of physics.



I arrived in South Dakota during the dying throes of winter; at the end of March, the Black Hills were still snow-covered, but the warmer air carried tantalizing hints of spring. Thankfully, the flights fell between late-season snowfalls, though it snowed enough to cover my rental car during my day underground at LUX. The Black Hills are mountains in miniature, steep rocky peaks topping out lower than the foothills of the grander front range of the Rockies or the Tetons. Nevertheless, they are mountains with all that entails: sudden snowstorms, gates that can close off the roads in the case of severe weather, and “falling rocks” signs everywhere.

LUX is part of the Sanford Underground Research Facility (SURF), housed at the now-defunct Homestake gold mine in the town of Lead. (The irony of a gold mine in Lead vanished quickly. The name is pronounced “leed," not like the metal.) The whole region cashes in on its Wild West past, especially the adjacent town of Deadwood. Every hotel and gas station has slot machines, and innumerable billboards advertise big payouts at the casinos. South Dakota in general has unrestrictive laws about billboards, as anyone crossing the state by car knows. There are signs for Wall Drug, Mount Rushmore, and various other tourist destinations that nearly dwarf the attractions themselves. Continuing this trend, the lab has its own Wall Drug sign, showing Wall Drug to be 97 miles away horizontally, plus 2,037 feet vertically.


By advertising standards, Sanford and LUX are relatively low key. The lab generally isn’t open to the public. To get to it, I had to drive through a residential neighborhood up steep (and at that time of year) snow-packed streets. Because access to the underground portion of the lab requires running the mining elevators—known as “cages”—I had to report at 6am for safety training. Even with jet-lag more or less in my favor, that was still rather early to be functional, though I had it better than the BBC film crew that flew in from the UK.

The safety briefing consisted largely of a video letting us know what we should do if we needed to evacuate the lab. We signed a waiver. We even had to affirm that we would make “intelligent choices,” a seemingly futile thing to ask a bunch of journalists with no mining experience among them. Finally, before descending, we had to don rubber steel-toed boots, coveralls, safety goggles, hard hats with lanterns, and a special emergency breathing apparatus clipped to a heavy utility belt. We made for a sexy-looking crew by the end of the process.

The cages themselves are large enough for about 15 people, with tracks in the floor for the mining carts. They have no built-in lights, so during the 10-minute descent, many of the crew turned on their hard-hat lanterns. (This is not a trip for those with claustrophobia.) Even the motors raising and lowering the cages are vintage: they were built in 1939, and the cable spools are cast iron. The gauges showing the position of the elevators are huge disks with pointers, another delightfully analog touch.

Gaitskell brought an adapted airplane altimeter into our cage to show in real time how far we were descending. Since the top of the mine is roughly a mile above sea-level and the lab is about a mile below ground, we nearly reached the elevation of my home city of Richmond, Virginia. My ears certainly registered the change in air pressure.

However rustic the approach, the lab itself is a typical, sleek modern facility: all shiny pipes, metal stairways, and tile floors. The typical accoutrements of office life abound. There are computers, white boards, water coolers, and (that most necessary piece of lab equipment) espresso machines all underground. University College London PhD student Sally Shaw told me, “You kind of forget you’re underground down there.” Additionally, the researchers have adorned the lab with personal touches. A warning sign admonishes visitors not to feed the scientists, and when I looked around, I spotted a few paper unicorns sitting on various shelves. Shaw said the unicorns probably started as a late-night boredom project, but they grew into an inside joke. After all, hunting for dark matter is like looking for unicorns.

Be vewwy vewwy quiet, we're hunting WIMPs


The LUX detector.
Matt Kapust/Sanford Underground Research Facility

LUX is designed to hunt for weakly interacting massive particles, or WIMPs (one of the more unfortunate acronyms in cosmology). As the name suggests, WIMPs should interact with ordinary matter, albeit infrequently and weakly—at what physicists call relatively low energy scales. From astronomical surveys, researchers also predict they should have a relatively large mass and move slowly compared to the speed of light. (We’ll discuss some alternative ideas in the third installment of this series.)

Beyond those basic criteria, however, LUX is designed to be as particle-agnostic as possible. Any particle that fits the basic description should register in the detector.

That could be critical, as "WIMP" is a fairly generic term that encompasses a number of hypothetical particle types predicted by theory. For example, the category includes particles predicted in some versions of supersymmetry; the “lightest supersymmetric partner” is sometimes listed as the most likely candidate for WIMP dark matter. But this isn't the only option that's been suggested.

The general properties of WIMPs don’t come from a particular theory, but from a general idea of how they should behave. WIMP models involve particles ranging from about 40 to 1,000 times the mass of a proton, though lower-mass versions are also possible—an important consideration. No matter what, though, experiments are looking for particles moving relatively slowly compared to the speed of light, meaning they will deposit very little energy into the detector.

While LUX consists of 368 kilograms of xenon, not all of the mass is used for dark matter hunting. In fact, most of the volume of the experiment consists of shielding. Each layer blocks more particles that could interfere with WIMP searches. Jim Dobson, a postdoctoral researcher at the University of Edinburgh and University College London, explained the process. “We go underground to get away from the majority of the cosmic rays. Then you build a huge water tank to mitigate the remaining cosmics that get through and any kind of radioactive materials in the cavern.” The biggest annoyances are gamma rays and neutrons, which are rare that far below ground but problematic. (Neutrinos, which like WIMPs are electrically neutral, are too high energy to be confused with dark matter.)

Finally, the xenon is part liquid, part gas. The gas portion is the final barrier. “We only look for dark matter in the inner region of the detector, where the xenon itself acts as … the most effective final shield towards radiation. When you get right in the center, you have almost zero background in that region that could fake a WIMP experiment.” This is a key difference with some other experiments, which showed something that looked like a WIMP, but it was actually the result of a more mundane process in the detector itself. LUX is designed to sidestep those problems.

Most WIMPs entering LUX won’t hit anything, but given enough time, we might get lucky. Xenon is an inert gas, meaning that it doesn’t form chemical bonds easily and stays very nonreactive. If a WIMP strikes the nucleus of a xenon atom, it transfers some energy. This results in a number of events—the rebound of the nucleus, the loss of one electron, and the production of some photons. Fortunately, we're very good at detecting photons and stray electrons.
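
For readers who want the kinematics, this is standard elastic-scattering physics rather than anything specific to LUX: the largest energy a WIMP of mass m_χ and speed v can give a xenon nucleus of mass m_N in a head-on collision is

```latex
E_R^{\max} = \frac{2\,\mu^2 v^2}{m_N},
\qquad
\mu = \frac{m_\chi\, m_N}{m_\chi + m_N}
```

Plugging in representative values (m_χ ≈ 100 GeV/c², m_N ≈ 122 GeV/c² for xenon, v ≈ 10⁻³ c) gives a maximum recoil of only a few tens of keV, which is why the resulting flashes of light and stray electrons are so faint.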

However, as Dobson said, “The challenge of that is reading out the signal from the middle of the detector. How do you see a few photons of light right in the middle, how do you see a few electrons being created by that interaction? This is why you have to purify the xenon constantly." And that’s part of graduate student Sally Shaw’s job: monitoring the purification process. While she laughingly downplays the importance of her efforts—“get the PhD student to do the monkey work”—it’s obviously something that needs doing. When all goes well, “you could see the concentration of the nitrogen dropping each time we took a sample, see [the xenon] getting purer.”


Pure xenon is essential because it is remarkably transparent to both the light and electrons ejected from a possible WIMP collision. Anything else, particularly nitrogen (which can be outgassed from the tank walls), clouds the chamber, and the tiny signal could disappear—a major tragedy in experimental terms. Pure xenon also will not absorb stray electrons; there’s no place for the atom to hold them. Applying a very small electric field across the tank gently encourages those electrons to move upward until they can be registered at the detector at LUX’s top surface.

The challenge in LUX, then, is to pick up that single electron and figure out where it came from inside the tank of liquid xenon. That in turn allows researchers to measure how much energy the WIMP transferred to the xenon atom and work backward to figure out how much mass it has. The mass, of course, is one of the important parameters characterizing any particle, setting the scale for how much energy was required to create it in the Universe’s first instants of existence.
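
The article doesn't spell out the readout, but in two-phase xenon detectors of this type the depth of an event is typically inferred from timing: the prompt flash of photons arrives almost instantly, while the freed electrons take time to drift up through the liquid. A minimal sketch, with representative rather than LUX-specific numbers:

```latex
z \approx v_{\text{drift}}\,\big(t_{\text{electrons}} - t_{\text{flash}}\big),
\qquad v_{\text{drift}} \sim 1\text{--}2\ \text{mm}/\mu\text{s}
```

At those drift speeds, a delay of a hundred microseconds corresponds to an event roughly 15 centimeters below the liquid surface, while the pattern of light across the top detectors fixes its horizontal position.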
So underground, even hipsters don't know about it

Beyond the detector, LUX is a challenging project with many unusual aspects. While engineering is always an aspect of big physics experiments, any facility deep underground will have more than its share of complications. The Sanford lab occupies about 12 miles of the original tunnels from the Homestake gold mine. While that’s a tiny fraction of the total mine, maintaining a livable working environment requires keeping the air circulating and breathable, pumping water out (anything that deep underground will flood if left alone), as well as providing electricity and communications.

When you consider workers are present in the mine for long shifts, something has to be done with their bodily wastes. Unlike more ordinary trash, it’s not something you can carry back to the surface in a receptacle. (In the brief periods I was in the lab, I didn’t need to use the Port-a-potties or combustion toilets, though perhaps I should have taken care of business in the name of research—you, dear readers, deserve no less.)

Rick Gaitskell told Ars, “When you start as a physicist, you sort of assume you’ll spend your time on very esoteric calculations. The reality is I know more about pumping water, mining operations, how to treat water—which is fascinating—when you’re pulling over half a million gallons a day out of the mine, just the business of treating [it] is fascinating technology.” For comparison, that’s roughly the same as pumping and chemically treating enough water to fill an Olympic-sized swimming pool every day.

Safety is an overwhelming concern as well. The first morning of my visit, we were delayed in going below ground for several hours after a fiber-optic cable used for monitoring carbon monoxide levels was severed. While nobody I spoke to thought that meant conditions were dangerous in the mine, every piece of protocol is designed to make sure people are safe.

While the modern incarnation of the lab is very recent—LUX wasn’t installed underground until winter of 2013, and the other experiments are still under construction—Homestake has a storied history as a physics facility. Ray Davis constructed a neutrino detector in the mine back in the late 1960s; the results he obtained led him and collaborator John Bahcall to recognize that two-thirds of the expected neutrinos from the Sun were absent. That discovery led eventually to our modern understanding of neutrino oscillation and a Nobel Prize for Davis in 2002. (Bahcall was unfairly left out of the award.) LUX is housed in the same artificial cave blasted out of the rock for Davis’ experiment; today that part of the mine is known as the Davis Campus in his honor.

Gaitskell said that Homestake ceased mining operations in 2002 when gold prices tanked and looked like they would remain low indefinitely, but it took several more years for the Sanford lab to negotiate access. During that time, water rose to fill much of the mine, drowning the tunnels until the Davis Campus was about 400 feet below the surface. The pumps needed to be restarted and the area cleaned before any physics experiments could become more than a dream.


Beyond its extreme location, another difference between LUX and other particle physics experiments is the way responsibility is doled out. Both Dobson and Shaw commented on how quickly junior researchers were allowed to take responsibility for important aspects of the experiment. Shaw in particular is a first-year PhD student. In most labs, she wouldn’t be participating in on-site research until later in her program, but at LUX that’s more standard. She told Ars, “It was a bit of a shock that I could come out so quickly—only a few months in, and they said straight away, ‘oh here’s some responsibility’—woo! I think it’s good, it was a real shock that the people I had heard on telephone conferences I thought were postdocs or quite old turned out to just be PhD students because they’re really pulling their weight.”

Dobson concurred. “By the time they’re at the end of their PhDs, [the students] really own whole subsystems. I was really surprised when I joined; it’s really cool how much responsibility they’re given. Along with the [senior researchers], the mature PhD students are the ones who know the most.” That’s in contrast to many of the big particle experiments such as the detectors at CERN, where hundreds of researchers may be involved, and there’s little room for junior students to contribute anything other than basic grunt-work.
First results and the future

In its preliminary run in 2013, LUX took data for about 90 days. Already during that relatively short time of operation, the detector pushed past the limits of what other experiments had done, both in the energy range it searched and in its control of potential false signals. Again, the extremely quiet center of the xenon tank is LUX’s advantage. Any signal the researchers might see would be hard to mistake for anything other than a WIMP. The fact that they didn’t see anything during those 90 days isn’t a mark of failure—it's a sign everything is working as expected.

But of course, the LUX team wants to see dark matter; that’s why the upcoming 300-day run is crucial. LUX will need to run reliably 24 hours a day for the better part of a year, which will leave researchers with a lot of clean data to work with. If any WIMP signals are present, 300 days should be sufficiently long to tell us something about what kind of particle is producing them. If nothing is there, on the other hand… it’s going to rule out some possibilities. Null experiments can be as valuable to science as those that discover something new.

And LUX isn’t the end of the road. Gaitskell told Ars, “Even at the time we were putting LUX together, we knew there would be a successor experiment. The reason is that if you see anything in LUX, you want to make detailed measurements of it. If you don’t see anything, you want to increase your sensitivity.” If LUX does detect dark matter during its 300-day run, the upgraded experiment—known as LZ, for LUX-ZEPLIN—will be even better able to characterize it. (ZEPLIN was a similar xenon-based dark matter detector in northwest England; LZ will combine resources from both existing projects.) The goal of LZ will be to help us understand how the particles interact, how much mass they have, what kind of spin they possess… all the characteristics of the particles to help fit them into what we know about other matter.

If all goes as planned, LZ will also occupy the Davis Campus where LUX currently resides. The detector will consist of at least five metric tons of xenon, more than a 10-fold increase in xenon mass over LUX, with a corresponding increase in sensitivity. As Jim Dobson said, with LZ “you’re getting close to seeing coherent neutrino scattering from atmospheric neutrinos.” That’s nearly the practical limit of anything you could do with a dark matter experiment. If the researchers don’t see anything with LUX and LZ, there’s probably nothing to be seen in that energy range. But Dobson is sanguine. “If you discover something and the signal is big enough, you can start actually doing a lot more physics and exploring what that signal means. Exciting times!”

Further reading: Physical Review Letters, 2014.



FDA’s new test results on PFAS in food tell an incomplete story
on September 6, 2021
CONTRIBUTED OPINION


Imagine using a radar gun to detect speeding in cars, but then manipulating the radar so that it only detects speeding in cars going over 100 mph. This means any reading below 100 mph would be considered undetectable and any data results would conceal any problems by showing that minimal or no speeding has occurred.

Based on FDA test results on per- and polyfluorinated alkyl substances (PFAS) in food released last week, the FDA appears to have employed a similar approach when it conducted its survey. The testing methodology used by the agency applied limits of detection and quantitation that likely underrepresented the presence of PFAS in food. Had lower detection and quantitation limits been applied, the FDA likely could have found PFAS contamination in the tested food to be much more pervasive, as the FDA’s own earlier-released results and testing at commercial laboratories suggest.
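
To make that concrete, here is a minimal simulation of how a high detection limit censors results. All numbers (the lognormal parameters and the two candidate limits of detection) are invented for illustration and bear no relation to the FDA's actual data:

```python
import random

random.seed(0)

# Hypothetical "true" PFAS concentrations in 1,000 food samples,
# in parts per trillion (lognormal parameters invented for illustration).
true_ppt = [random.lognormvariate(2.0, 1.0) for _ in range(1000)]

# Compare what two limits of detection (LODs) would report.
for lod in (5.0, 50.0):
    detected = sum(1 for c in true_ppt if c >= lod)
    print(f"LOD = {lod:5.1f} ppt -> {detected / len(true_ppt):.0%} of samples show detectable PFAS")
```

The underlying contamination is identical in both runs; only the reporting threshold changes, yet the higher limit turns a mostly contaminated sample set into one that looks mostly clean.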

Presenting these test results using this less sensitive methodology unnecessarily minimizes the risk that consumers are facing. PFAS are an ever-expanding group of thousands of man-made toxic chemicals that are widely used to make fluoropolymer coating that allows products to be more resistant to heat, stains, grease and water. Consumers can be exposed to PFAS in myriad ways — food, food packaging, water, clothing, cosmetics, cooking surfaces, etc.

Several PFAS characteristics make them especially dangerous to humans.

First, they are extremely persistent, resistant to breaking down naturally in the environment, and remain in people’s bodies for years. This is why they are often described as “forever chemicals.”

Second, they are highly mobile, spreading quickly and remain prevalent throughout our environment. Finally, they can be toxic at very low doses and have been linked to a variety of severe health effects, including an increased risk of cancer, thyroid disease, and birth defects.

The high detection level the Food and Drug Administration used for this survey is only part of the reason to be skeptical of its results. The FDA’s current PFAS testing method is capable of detecting and quantifying only 16 of the more than 600 PFAS that are currently in use. The agency itself has acknowledged that its sampling is very limited.

Returning to the radar gun analogy — in addition to the radar only being able to detect cars going over 100 mph, it’s further limited in that it can only measure 16 of more than 600 cars that pass by the radar; this means hundreds of cars going over 100 mph still would go undetected because the radar gun is only equipped to detect 16 cars.

Yet another issue of concern about the FDA’s PFAS testing methodology is that none of the PFAS tested were ones approved for food packaging. This was a missed opportunity, as the FDA could have provided updated data on how much PFAS in food packaging leaches into food products. The Environmental Working Group (EWG) found nearly half of fast food wrappers collected in 2014 and 2015 had high fluorine counts, a reliable indicator of PFAS use in food packaging products.

Granted, testing for PFAS in food can be difficult. There are many PFAS chemicals for which there are no analytical standards and many complex food matrices that can hinder unbiased detection and quantification of PFAS chemicals.

Also, serious efforts to develop test methods for investigating the occurrence of, and potential exposure to, PFAS chemicals from food began only recently, so there currently are no robust methods that could be used to test and report on an adequate number of food-relevant PFAS chemicals.

However, in releasing these misleading test results, the FDA is conveying a false sense of security about PFAS contamination in food that has the potential to be harmful, especially in the long-term. If the FDA would like to do something meaningful, they should move quickly to ban PFAS in all food packaging.

Brian Ronholm is the Director of Food Policy for Consumer Reports. He leads CR’s advocacy efforts to advance a safe and healthy food system. He previously served as Deputy Under Secretary for Food Safety at the U.S. Department of Agriculture (USDA) and, prior to that, served in the office of Rep. Rosa DeLauro of Connecticut.
WTF IS WRONG WITH YOU PEOPLE
Shelby Rogers expects '9 million death threats' on social media after US Open loss

Ryan Young· Writer
Mon., September 6, 2021, 


Shelby Rogers barely had time to look at her phone after her loss at the US Open on Monday afternoon, but she knew what was on it.

Or if they weren't there yet, they were coming.

Rogers — who fell in straight sets to Great Britain’s Emma Raducanu in the fourth round of the Grand Slam — said she expects roughly “9 million death threats” to come her way on social media from those who are mad at her for losing.

"Obviously we appreciate the spotlight in those moments, but then you have today and I'm going to have nine million death threats and whatnot," she said after her loss, via ESPN. "It's very much polarizing, one extreme to the other very quickly."

While that number may be exaggerated, the impact even one death threat has is real.

"You could probably go through my profile right now — I'm probably a 'fat pig' and words that I can't say right now," Rogers said, via ESPN. "But, it is what it is. You try not to take it to heart, and it's the unfortunate side of any sport and what we do."

Sloane Stephens speaks out about abuse


Rogers isn’t the only one speaking out about abuse on social media at the US Open.

Fellow American star and former US Open champion Sloane Stephens said she received more than 2,000 abusive comments and messages on Instagram after she fell to Angelique Kerber in the third round at the US Open.

Stephens shared examples on her Instagram story, some of which included racial slurs and threats of sexual violence.

“This type of hate is so exhausting and never-ending,” she wrote, in part. “This isn’t talked about enough, but it really freaking sucks.

“I’m happy to have people in my corner who support me. I’m choosing positive vibes over negative ones. I choose to show you guys happiness on here, but it’s not always smiles and roses.”

Rogers was very much on the same page as Stephens on Monday. Though she tries to ignore the comments, she said, the abuse “does get to your head sometimes.

“Social media can’t control what I’m doing and the way my training is going to move forward, but I wish it didn’t exist,” she said, via ESPN. “It’s really tough.”


Shelby Rogers said she expects to receive death threats and abuse on social media after her loss at the US Open. (Sarah Stier/Getty Images)

Death threats after tennis losses now the norm, players say

Steve Keating
Reuters Staff

Monday, September 6, 2021 

NEW YORK -- As if a fourth-round 6-2 6-1 thrashing at the U.S. Open was not hard enough to deal with, American Shelby Rogers said that she would most likely have to contend with death threats on social media following Monday's loss to Emma Raducanu.

While death threats over a tennis match may seem shocking, several players at this year's U.S. Open have said such things are now part of the sport, with much of the abuse coming from gamblers hooked on online betting.

"I'm going to have nine million death threats and whatnot," said Rogers with a shrug. "At this point in my career, I'd say I'm used to it.

"I kind of wish social media didn't exist.

"You could probably go through my profile right now, I'm probably a fat pig and words that I can't say right now."

American Sloane Stephens said she received a torrent of angry messages on social media, including racist and sexist abuse, following her third round loss to Angelique Kerber.

Stephens, who is Black, said she got over 2,000 messages of abuse after Friday's 5-7 6-2 6-3 defeat, including threats of physical harm.

"It's so hard to read messages like these, but I'll post a few so you guys can see what it's like after a loss," she wrote in an Instagram story.

What followed was a series of racist and sexist messages aimed at the 28-year-old.

"This type of hate is so exhausting and never ending," she said.

The WTA Tour said in an email to Reuters that the number of players targeted for online abuse is on the rise and it is working with social media companies to find ways of curtailing the harassment.

The Tour is also working with Theseus, a risk assessment and management company that specializes in evaluation, protection and support for athletes who encounter online harassment.

"Theseus and the WTA work with the social media platforms to shut down accounts when warranted, and if applicable, local authorities are notified," said the WTA.

"Working with Theseus allows the WTA and the players to take the most appropriate action, while enabling WTA players to safely keep their social media accounts and use them to communicate and share exciting highlights, stories and news to their fans."

Rogers acknowledged that social media interaction is encouraged as a way to market the sport and connect with fans.

"It's part of marketing now, we have contracts, we have to post certain things," said Rogers. "It is what it is.

"You try not to take it to heart, and it's the unfortunate side of any sport.

"You know, just focus on the important things, not comments from people in their mom's basement.

"It's really unfortunate and some of it does get to your head sometimes."

(Reporting by Steve Keating in New York. Editing by Pritha Sarkar)

'MAYBE' TECH
New Efficiency Record Set for Solar Hydrogen Production

Researchers achieved a solar-to-hydrogen conversion efficiency of over 20 percent.


By Loukia Papadopoulos
Sep 05, 2021

onurdongel/iStock


Solar hydrogen production is a clean energy system that holds great potential to bolster sustainability efforts across the globe. Unfortunately, it's also largely unfeasible due to the high costs associated with its production and operation.

Now, researchers from the Australian National University and the University of New South Wales have set a new world record in efficiency for the production of renewable hydrogen from solar energy using low-cost materials, reported RenewEconomy. The team of scientists achieved a solar-to-hydrogen conversion efficiency of greater than 20 percent.

The researchers focused on combining tandem solar cells with low-cost catalyst materials to split water into hydrogen and oxygen through electrolysis.
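
The article doesn't show the calculation, but solar-to-hydrogen (STH) efficiency is conventionally computed from the cell's operating current density, the 1.23 V thermodynamic water-splitting potential, and the incident solar power (η_F below is the Faradaic efficiency, the fraction of current that actually makes hydrogen):

```latex
\eta_{\text{STH}} = \frac{j_{\text{op}} \times 1.23\ \text{V} \times \eta_F}{P_{\text{in}}}
```

Under standard 1-sun illumination (P_in = 100 mW/cm²) and assuming η_F = 1, an STH efficiency above 20 percent implies an operating current density above roughly 16 mA/cm².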


Research co-author Dr. Siva Karuturi of Australian National University told RenewEconomy that his team's new approach of combining solar cells with hydrogen electrolyzers into a single unit could produce significant improvements in production efficiency and reductions in cost.

“In a centralized electrolyzer which usually runs on grid electricity, membrane and electrodes are stacked in multiple numbers – often hundreds of them – to achieve the desired production capacity which results in a complex system,” Karuturi said.

Karuturi added that in direct photovoltaic (PV)-electrolysis, a single unit of electrodes and membrane can be combined directly with PV cells into a simplified solar hydrogen module. This eliminates the need for separate power infrastructure and electrolyzers, resulting in higher power conversion efficiency and lower costs.

The team estimates that their new design could lower the cost of renewable hydrogen production to $2.30 per kilogram, in line with targets set by the United States Department of Energy.
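
For perspective, hydrogen stores roughly 33.3 kilowatt-hours per kilogram (lower heating value), so that price works out to about seven cents per kilowatt-hour of chemical energy. A quick sketch of the conversion:

```python
# Back-of-the-envelope: what $2.30/kg hydrogen means per unit of energy.
# Hydrogen's lower heating value (LHV) is ~33.3 kWh per kilogram.

H2_LHV_KWH_PER_KG = 33.3
price_per_kg = 2.30  # USD, the production cost level the team projects

price_per_kwh = price_per_kg / H2_LHV_KWH_PER_KG
print(f"${price_per_kwh:.3f}/kWh of chemical energy")  # -> ~$0.069/kWh
```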

This is welcome news as the world strives to achieve net zero emissions by 2050, a target that is the catalyst behind a number of green hydrogen projects. In 2017, another energy innovation used a floating solar rig to produce hydrogen fuel using seawater. However, that technology continues to be quite costly.

Caterpillar to Offer Gensets Operating on 100% Hydrogen



Caterpillar will begin offering Cat generator sets capable of operating on 100% hydrogen, including fully renewable green hydrogen, later this year, the company announced. Hydrogen-powered generator sets will be available on a designed-to-order basis. The company also plans to launch commercially available power generation solutions from 400 kW to 4.5 MW that can operate on natural gas blended with as much as 25% hydrogen.
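
One wrinkle worth noting: if that 25% blend is measured by volume, as pipeline gas blends usually are, hydrogen supplies a much smaller share of the fuel energy, because a cubic meter of hydrogen carries far less energy than a cubic meter of methane. A quick sketch, assuming a volume basis (the announcement does not specify) and standard heating values:

```python
# Energy share of hydrogen in a 25% (by volume) H2/methane blend.
# Heating values are well-known constants; the volume basis is an
# assumption, not stated in the announcement.

LHV_H2_MJ_M3 = 10.8   # MJ per normal cubic meter of hydrogen
LHV_CH4_MJ_M3 = 35.8  # MJ per normal cubic meter of methane

h2_vol = 0.25
energy_share = (h2_vol * LHV_H2_MJ_M3) / (
    h2_vol * LHV_H2_MJ_M3 + (1 - h2_vol) * LHV_CH4_MJ_M3
)
print(f"{energy_share:.1%}")  # -> ~9.1% of fuel energy from hydrogen
```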


Caterpillar says these hydrogen-fueled power technologies have improved performance and “minimal impacts on maintenance costs and schedules, availability and operations.”


Organizations are increasingly seeking to maximize the economic benefits of reducing their carbon intensities. Caterpillar Large Electric Power general manager Bart Myers says with its new offerings, the company hopes to “demonstrate the viability of power solutions that can utilize many types of hydrogen” in order to “shorten the path to commercial availability.”


Caterpillar says its plans for operating on 100% hydrogen include developing a range of commercially available products and upgrades for existing Cat gas generators. The development and launch of these solutions are intended to address growing customer demand as the hydrogen supply infrastructure matures.


Hydrogen is enjoying a resurgence among energy ecosystem and clean transportation stakeholders, Guidehouse Insights said earlier this year. Almost two decades ago, hydrogen was seen as the eventual primary solution for ending reliance on fossil fuels; however, the transition did not occur then because of the high cost of generating hydrogen from renewable energy sources. Today, a variety of governments and private-sector vendors are experimenting with new applications for green and blue hydrogen, per the research.


Hydrogen production is one of the 10 key technologies seen as critical to decarbonization goals during the next five years, according to a recent DNV Technology Progress Report.


Making the case for hydrogen in a zero-carbon economy

As the United States races to achieve its goal of zero-carbon electricity generation by 2035, energy providers are swiftly ramping up renewable resources such as solar and wind. But because these technologies churn out electrons only when the sun shines and the wind blows, they need backup from other energy sources, especially during seasons of high electric demand. Currently, plants burning fossil fuels, primarily natural gas, fill in the gaps.

"As we move to more and more renewable penetration, this intermittency will make a greater impact on the ," says Emre Gençer, a research scientist at the MIT Energy Initiative (MITEI). That's because grid operators will increasingly resort to fossil-fuel-based "peaker"  that compensate for the intermittency of the variable renewable  (VRE) sources of sun and wind. "If we're to achieve zero-carbon electricity, we must replace all greenhouse gas-emitting sources," Gençer says.

Low- and zero-carbon alternatives to greenhouse-gas-emitting peaker plants are in development, such as arrays of lithium-ion batteries and hydrogen-fired power generation. But each of these evolving technologies comes with its own set of advantages and constraints, and it has proven difficult to frame the debate about these options in a way that's useful for policymakers, investors, and utilities engaged in the clean energy transition.

Now, Gençer and Drake D. Hernandez SM '21 have come up with a model that makes it possible to pin down the pros and cons of these peaker-plant alternatives with greater precision. Their hybrid technological and economic model, based on a detailed inventory of California's power system, was published online last month in Applied Energy. While their work focuses on the most cost-effective solutions for replacing peaker power plants, it also contains insights intended to contribute to the larger conversation about transforming energy systems.

"Our study's essential takeaway is that hydrogen-fired power generation can be the more economical option when compared to lithium-ion batteries—even today, when the costs of hydrogen production, transmission, and storage are very high," says Hernandez, who worked on the study while a graduate research assistant for MITEI. Adds Gençer, "If there is a place for hydrogen in the cases we analyzed, that suggests there is a promising role for hydrogen to play in the energy transition."

Adding up the costs

California serves as a stellar paradigm for a swiftly shifting power system. The state draws more than 20 percent of its electricity from solar and approximately 7 percent from wind, with more VRE coming online rapidly. This means its peaker plants already play a pivotal role, coming online each evening when the sun goes down or when events such as heat waves drive up electricity use for days at a time.

"We looked at all the peaker plants in California," recounts Gençer. "We wanted to know the cost of electricity if we replaced them with hydrogen-fired turbines or with lithium-ion batteries." The researchers used a core metric called the levelized cost of electricity (LCOE) as a way of comparing the costs of different technologies to each other. LCOE measures the average total cost of building and operating a particular energy-generating asset per unit of total electricity generated over the hypothetical lifetime of that asset.

Selecting 2019 as their base study year, the team looked at the costs of running natural gas-fired peaker plants, which they defined as plants operating 15 percent of the year in response to gaps in intermittent renewable electricity. In addition, they determined the amount of carbon dioxide released by these plants and the expense of abating these emissions. Much of this information was publicly available.
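
That 15 percent duty cycle determines how much electricity such a plant actually delivers each year, which is the denominator in any LCOE comparison. A minimal sketch, assuming an illustrative 100-megawatt plant (the capacity is not from the study):

```python
# Annual output of a peaker plant that runs 15% of the year, the usage
# profile the study assumes. The 100 MW capacity is an illustrative figure.

HOURS_PER_YEAR = 8760
capacity_mw = 100.0
capacity_factor = 0.15  # operating 15% of the year

annual_mwh = capacity_mw * capacity_factor * HOURS_PER_YEAR
print(f"{annual_mwh:,.0f} MWh per year")  # -> 131,400 MWh per year
```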

Coming up with prices for replacing peaker plants with massive arrays of lithium-ion batteries was also relatively straightforward: "There are no technical limitations to lithium-ion, so you can build as many as you want; but they are super expensive in terms of their footprint for energy storage and the mining required to manufacture them," says Gençer.

But then came the hard part: nailing down the costs of hydrogen-fired electricity generation. "The most difficult thing is finding cost assumptions for new technologies," says Hernandez. "You can't do this through a literature review, so we had many conversations with equipment manufacturers and plant operators."

The team considered two different forms of hydrogen fuel to replace natural gas, one produced through electrolyzer facilities that convert water and electricity into hydrogen, and another that reforms natural gas, yielding hydrogen and carbon waste that can be captured to reduce emissions. They also ran the numbers on retrofitting natural gas plants to burn hydrogen as opposed to building entirely new facilities. Their model includes identification of likely locations throughout the state and expenses involved in constructing these facilities.
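
The second of those pathways is presumably conventional steam reforming of methane followed by the water-gas shift; the net reaction, whose carbon dioxide byproduct would be captured, is:

```latex
% Net reaction for steam methane reforming plus water-gas shift;
% the CO2 on the right is the carbon byproduct to be captured.
\[
\mathrm{CH_4} + 2\,\mathrm{H_2O} \;\rightarrow\; \mathrm{CO_2} + 4\,\mathrm{H_2}
\]
```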

The researchers spent months compiling a giant dataset before setting out on the task of analysis. The results from their modeling were clear: "Hydrogen can be a more cost-effective alternative to lithium-ion batteries for peaking operations on a power grid," says Hernandez. In addition, notes Gençer, "While certain technologies worked better in particular locations, we found that on average, reforming hydrogen rather than electrolytic hydrogen turned out to be the cheapest option for replacing peaker plants."

A tool for energy investors

When he began this project, Gençer admits he "wasn't hopeful" about hydrogen replacing natural gas in peaker plants. "It was kind of shocking to see in our different scenarios that there was a place for hydrogen." That's because the overall price tag for converting a fossil-fuel based plant to one based on hydrogen is very high, and such conversions likely won't take place until more sectors of the economy embrace hydrogen, whether as a fuel for transportation or for varied manufacturing and industrial purposes.

A nascent hydrogen production infrastructure does exist, mainly in the production of ammonia for fertilizer. But enormous investments will be necessary to expand this framework to meet grid-scale needs, driven by purposeful incentives. "With any of the climate solutions proposed today, we will need a carbon tax or carbon pricing; otherwise nobody will switch to new technologies," says Gençer.

The researchers believe studies like theirs could help key energy stakeholders make better-informed decisions. To that end, they have integrated their analysis into SESAME, a life cycle and techno-economic assessment tool for a range of energy systems that was developed by MIT researchers. Users can leverage this sophisticated modeling environment to compare costs of energy storage and emissions from different technologies, for instance, or to determine whether it is cost-efficient to replace a natural gas-powered plant with one powered by hydrogen.

"As utilities, industry, and investors look to decarbonize and achieve zero-emissions targets, they have to weigh the costs of investing in low-carbon technologies today against the potential impacts of climate change moving forward," says Hernandez, who is currently a senior associate in the energy practice at Charles River Associates. Hydrogen, he believes, will become increasingly cost-competitive as its production costs decline and markets expand.

A study group member of MITEI's soon-to-be-published Future of Storage study, Gençer knows that hydrogen alone will not usher in a zero-carbon future. But, he says, "Our research shows we need to seriously consider hydrogen in the energy transition, start thinking about key areas where hydrogen should be used, and start making the massive investments necessary."

More information: Drake D. Hernandez et al., "Techno-economic analysis of balancing California's power system on a seasonal basis: Hydrogen vs. lithium-ion batteries," Applied Energy (2021). DOI: 10.1016/j.apenergy.2021.117314