
OPINION
Three Myths About Renewable Energy and the Grid, Debunked


Renewable energy skeptics argue that because of their variability, wind and solar cannot be the foundation of a dependable electricity grid. But the expansion of renewables and new methods of energy management and storage can lead to a grid that is reliable and clean.


BY AMORY B. LOVINS AND M. V. RAMANA
DECEMBER 9, 2021
Yale Environment 360
Published at the Yale School of the Environment

Wind turbines and solar panels in Bavaria, Germany.
 FRANK BIENEWALD / LIGHTROCKET VIA GETTY IMAGES


As wind and solar power have become dramatically cheaper, and their share of electricity generation grows, skeptics of these technologies are propagating several myths about renewable energy and the electrical grid. The myths boil down to this: Relying on renewable sources of energy will make the electricity supply undependable.

Last summer, some commentators argued that blackouts in California were due to the “intermittency” of renewable energy sources, when in fact the chief causes were a combination of an extreme heat wave probably induced by climate change, faulty planning, and the lack of flexible generation sources and sufficient electricity storage. During a brutal Texas cold snap last winter, Gov. Greg Abbott wrongly blamed wind and solar power for the state’s massive grid failure, which was vastly larger than California’s. In fact, renewables outperformed the grid operator’s forecast during 90 percent of the blackout, and in the rest, fell short by at most one-fifteenth as much as gas plants. Instead, other causes — such as inadequately weatherized power plants and natural gas shutting down because of frozen equipment — led to most of the state’s electricity shortages.

In Europe, the usual target is Germany, in part because of its Energiewende (energy transformation) policies shifting from fossil fuels and nuclear energy to efficient use and renewables. The newly elected German government plans to accelerate the former and complete the latter, but some critics have warned that Germany is running “up against the limits of renewables.”

In reality, it is entirely possible to sustain a reliable electricity system based on renewable energy sources plus a combination of other means, including improved methods of energy management and storage. A clearer understanding of how to dependably manage electricity supply is vital because climate threats require a rapid shift to renewable sources like solar and wind power. This transition has been accelerated by plummeting costs — Bloomberg New Energy Finance estimates that solar and wind are the cheapest sources for 91 percent of the world’s electricity — but is being held back by misinformation and myths.
Myth No. 1: A grid that increasingly relies on renewable energy is an unreliable grid.

Going by the cliché, “In God we trust; all others bring data,” it’s worth looking at the statistics on grid reliability in countries with high levels of renewables. The indicator most often used to describe grid reliability is the average power outage duration experienced by each customer in a year, a metric known by the tongue-twisting name of “System Average Interruption Duration Index” (SAIDI). Based on this metric, Germany — where renewables supply nearly half of the country’s electricity — boasts one of the most reliable grids in Europe and the world. In 2020, SAIDI was just 0.25 hours in Germany. Only Liechtenstein (0.08 hours) and Finland and Switzerland (0.2 hours) did better in Europe, where 2020 electricity generation was 38 percent renewable (ahead of the world’s 29 percent). Countries like France (0.35 hours) and Sweden (0.61 hours) — both far more reliant on nuclear power — did worse, for various reasons.
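For readers unfamiliar with the metric, SAIDI is, in essence, total customer-hours of interruption divided by the number of customers served (utilities report it under IEEE 1366 conventions, which add refinements such as excluding “major event days”). A minimal sketch, using entirely hypothetical outage data:

```python
# Illustrative SAIDI calculation (simplified; hypothetical numbers).
# Each outage record: (customers affected, outage duration in hours).
outages = [
    (120_000, 1.5),   # storm-related feeder outage
    (8_000, 4.0),     # substation equipment failure
    (500, 12.0),      # rural line repair
]
customers_served = 2_000_000

customer_interruption_hours = sum(n * h for n, h in outages)
saidi_hours = customer_interruption_hours / customers_served
print(f"SAIDI: {saidi_hours:.3f} hours per customer per year")
# SAIDI: 0.109 hours per customer per year
```

With these made-up numbers the index comes to about 0.11 hours, in the same league as the best-performing European grids cited here.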

The United States, where renewable energy and nuclear power each provide roughly 20 percent of electricity, had five times Germany’s outage rate — 1.28 hours in 2020. Since 2006, Germany’s renewable share of electricity generation has nearly quadrupled, while its power outage rate was nearly halved. Similarly, the Texas grid became more stable as its wind capacity sextupled from 2007 to 2020. Today, Texas generates more wind power — about a fifth of its total electricity — than any other state in the U.S.
Myth No. 2: Countries like Germany must continue to rely on fossil fuels to stabilize the grid and back up variable wind and solar power.

Again, the official data say otherwise. Between 2010 — the year before the Fukushima nuclear accident in Japan — and 2020, Germany’s generation from fossil fuels declined by 130.9 terawatt-hours and nuclear generation by 76.3 terawatt-hours. These were more than offset by increased generation from renewables (149.5 terawatt-hours) and energy savings that decreased consumption by 38 terawatt-hours in 2019, before the pandemic cut economic activity as well. By 2020, Germany’s greenhouse gas emissions had declined to 42.3 percent below 1990 levels, beating the target of 40 percent set in 2007. Carbon dioxide emissions from the power sector alone declined from 315 million tons in 2010 to 185 million tons in 2020.

So as the percentage of electricity generated by renewables in Germany steadily grew, its grid reliability improved, and its coal burning and greenhouse gas emissions substantially decreased.

In Japan, following the multiple reactor meltdowns at Fukushima, more than 40 nuclear reactors closed permanently or indefinitely without materially raising fossil-fueled generation or greenhouse gas emissions; electricity savings and renewable energy offset virtually the whole loss, despite policies that suppressed renewables.
Myth No. 3: Because solar and wind energy can be generated only when the sun is shining or the wind is blowing, they cannot be the basis of a grid that has to provide electricity 24/7, year-round.

While variable output is a challenge, it is neither new nor especially hard to manage. No kind of power plant runs 24/7, 365 days a year, and operating a grid has always involved managing variability in demand. Even without solar and wind power, all electricity supply varies — and wind and solar tend to produce dependably at different times and seasons, making simultaneous shortfalls less likely.

Seasonal variations in water availability and, increasingly, drought reduce electricity output from hydroelectric dams. Nuclear plants must be shut down for refueling or maintenance, and big fossil and nuclear plants are typically out of action roughly 7 percent to 12 percent of the time, some much more. A coal plant’s fuel supply might be interrupted by the derailment of a train or failure of a bridge. A nuclear plant or fleet might unexpectedly have to be shut down for safety reasons, as was Japan’s biggest plant from 2007 to 2009. Every French nuclear plant was, on average, shut down for 96.2 days in 2019 due to “planned” or “forced unavailability.” That rose to 115.5 days in 2020, when French nuclear plants generated less than 65 percent of the electricity they theoretically could have produced. Comparing expected with actual performance, one might even say that nuclear power was France’s most intermittent 2020 source of electricity.

Climate- and weather-related factors have caused multiple nuclear plant interruptions, which have become seven times more frequent in the past decade. Even normally steady nuclear output can fail abruptly and lastingly, as in Japan after the Fukushima disaster, or in the northeastern U.S. after the 2003 regional blackout, which triggered abrupt shutdowns that caused nine reactors to produce almost no power for several days and take nearly two weeks to return to full output.


The Bungala Solar Farm in South Australia, where the grid has run almost exclusively on renewables for days on end. 
LINCOLN FOWLER / ALAMY STOCK PHOTO

Thus all sources of power will be unavailable at some time or other. Managing a grid means dealing with that reality, just as much as with fluctuating demand. The influx of larger amounts of renewable energy does not change that reality, even if the ways grid operators deal with variability and uncertainty are changing. Modern grid operators emphasize diversity and flexibility rather than nominally steady but less flexible “baseload” generation sources. Diversified renewable portfolios don’t fail as massively, lastingly, or unpredictably as big thermal power stations.

The purpose of an electric grid is not just to transmit and distribute electricity as demand fluctuates, but also to back up non-functional plants with working plants: that is, to manage the intermittency of traditional fossil and nuclear plants. In the same way, but more easily and often at lower cost, the grid can rapidly back up wind and solar photovoltaics’ predictable variations with other renewables, of other kinds or in other places or both. This has become easier with today’s far more accurate forecasting of weather and wind speeds, allowing better prediction of the output of variable renewables. Local or onsite renewables are even more resilient because they largely or wholly bypass the grid, where nearly all power failures begin. And modern power electronics have reliably run the billion-watt South Australian grid on just sun and wind for days on end, with no coal, no hydro, no nuclear, and at most the 4.4-percent natural-gas generation currently required by the grid regulator.

Most discussions of renewables focus on batteries and other electric storage technologies to mitigate variability. This is not surprising because batteries are rapidly becoming cheaper and widely deployed. At the same time, new storage technologies with diverse attributes continue to emerge; the U.S. Department of Energy Global Energy Storage Database lists 30 kinds already deployed or under construction. Meanwhile, many other, less expensive carbon-free ways exist to deal with variable renewables besides giant batteries.



The first and foremost is energy efficiency, which reduces demand, especially during periods of peak use. Buildings that are more efficient need less heating or cooling and change their temperature more slowly, so they can coast longer on their own thermal capacity and thus sustain comfort with less energy, especially during peak-load periods.

A second option is demand flexibility or demand response, wherein utilities compensate electricity customers that lower their use when asked — often automatically and imperceptibly — helping balance supply and demand. One recent study found that the U.S. has 200 gigawatts of cost-effective load flexibility potential that could be realized by 2030 if effective demand response is actively pursued. Indeed, the biggest lesson from recent shortages in California might be the greater appreciation of the need for demand response. Following the challenges of the past two summers, the California Public Utilities Commission has instituted the Emergency Load Reduction Program to build on earlier demand response efforts.

Some evidence suggests an even larger potential: An hourly simulation of the 2050 Texas grid found that eight types of demand response could eliminate the steep ramp of early-evening power demand as solar output wanes and household loads spike. For example, currently available ice-storage technology freezes water using lower-cost electricity and cooler air, usually at night, and then uses the ice to cool buildings during hot days. This reduces electricity demand from air conditioning, and saves money, partly because storage capacity for heating or cooling is far cheaper than storing electricity to deliver them. Likewise, without changing driving patterns, many electric vehicles can be intelligently charged when electricity is more abundant, affordable, and renewable.



The top graph shows daily solar power output (yellow line) and demand from various household uses. The bottom graph shows how to align demand with supply, running devices in the middle of the day when solar output is highest. ROCKY MOUNTAIN INSTITUTE

A third option for stabilizing the grid as renewable energy generation increases is diversity, both of geography and of technology — onshore wind, offshore wind, solar panels, solar thermal power, geothermal, hydropower, burning municipal or industrial or agricultural wastes. The idea is simple: If one of these sources, at one location, is not generating electricity at a given time, odds are that some others will be.

Finally, some forms of storage, such as electric vehicle batteries, are already economical today. Simulations show that ice-storage air conditioning in buildings, plus smart two-way grid charging of electric cars, which are parked 96 percent of the time, could enable Texas in 2050 to use 100 percent renewable electricity without needing giant batteries.

To pick a much tougher case, the “dark doldrums” of European winters are often claimed to need many months of battery storage for an all-renewable electrical grid. Yet top German and Belgian grid operators find Europe would need only one to two weeks of renewably derived backup fuel, providing just 6 percent of winter output — not a huge challenge.

The bottom line is simple. Electrical grids can deal with much larger fractions of renewable energy at zero or modest cost, and this has been known for quite a while. Some European countries with little or no hydropower already get about half to three-fourths of their electricity from renewables with grid reliability better than in the U.S. It is time to get past the myths.

Amory B. Lovins is an adjunct professor of civil and environmental engineering at Stanford University, and co-founder and chairman emeritus of Rocky Mountain Institute. 

M. V. Ramana is the Simons Chair in Disarmament, Global and Human Security and director of the Liu Institute for Global Issues at the School of Public Policy and Global Affairs at the University of British Columbia in Vancouver, Canada. 










As climate 'net-zero' plans grow, so do concerns from scientists

Scientists and monitoring groups are growing increasingly alarmed at the slew of vague net-zero pledges that appear to privilege offsets and future technological breakthroughs over short-term emissions cuts.

Faced with the prospect that climate change will drive ever deadlier heat waves, rising seas and crop failures that will menace the global food system, countries, corporations and cities appear to have come up with a plan: net zero.

The concept is simple: starting now, to ensure that by a certain date — usually 2050 — they absorb as much carbon dioxide as they emit, thereby achieving carbon neutrality.

But scientists and monitoring groups are growing increasingly alarmed at the slew of vague net-zero pledges that appear to privilege offsets and future technological breakthroughs over short-term emissions cuts.

"They're not fit for purpose, any of them," Myles Allen, director of Oxford Net Zero at the University of Oxford, said of today's net-zero plans.

"You can't offset continued fossil fuel use by planting trees for very long. Nobody has even acknowledged that in their net-zero plans, even the really ambitious countries," he told AFP.

Last month's COP26 climate summit in Glasgow saw major emitter India commit for the first time to work towards net-zero emissions, joining the likes of China, the United States and the European Union.

According to Net Zero Tracker (NZT), 90 percent of global GDP is now covered by some sort of net-zero plan. But it said that the vast majority remain ill-defined.

Take offsets. These are when countries or companies deploy measures—such as tree planting or direct CO2 capture—to compensate for the emissions they produce. NZT found that 91 percent of country targets, and 48 percent of public company targets, failed to even specify whether offsets feature in their net-zero plans.

Which emissions?

What's more, it found that less than a third (32 percent) of corporate net-zero targets cover what are known as "scope 3 emissions"—those from a company's product, which normally account for the vast majority of carbon pollution from a given business.

Alberto Carrillo Pineda, co-founder of Science Based Targets initiative, which helps companies align their net-zero plans with what science says is needed to avoid catastrophic heating, said most decarbonisation pledges "don't make sense" without including scope 3 emissions.

"From a climate point of view it matters, the companies are driving emissions not only through their operations but also through what they buy and sell," he told AFP.

"And that constitutes their business model. A company wouldn't exist without their product and so their product needs considering from an emissions point of view."

The UN climate body, the UNFCCC, analysed the latest national emissions-cutting plans during COP26.

It found that they would see emissions increase 13.7 percent by 2030, when they must fall by roughly half to keep the Paris Agreement warming limit of 1.5C within reach.

Of the 74 countries that have published detailed net-zero plans, the UNFCCC found that their emissions would fall 70-79 percent by 2050—a significant drop, but still not net zero.

Stuart Parkinson, executive director of Scientists for Global Responsibility (SGR), said governments had started to use net-zero pledges as a way of delaying the immediate action the atmosphere needs.

"From our perspective, that's thoroughly irresponsible," he said.

"It is kicking the problem into the long grass and relying on speculative efforts in technology when we know that we can change behaviour right here and now and reduce emissions."

Last month UN Secretary General Antonio Guterres said an independent group would be established to monitor companies' net-zero progress.

'Rude awakening'

Many countries and businesses plan to deploy mass reforestation as part of net-zero plans. Experts say this is problematic for two reasons.

The first is simple science: Earth's plants and soil already absorb enormous amounts of manmade CO2 and there are signs that carbon sinks such as tropical forests are reaching saturation point.

"The concern is that the biosphere is turning from a sink to a source by warming itself," said Allen.

"So relying on the biosphere to store fossil carbon is really daft when we may well need all the nature-based solutions we can find just to keep the carbon content of the biosphere stable."

Teresa Anderson, senior policy director at ActionAid International, said relying on land-based offsets was "setting Earth up for a rude awakening".

But the concept is also problematic from the perspective of human rights and fairness.

"When it comes to the competition for land to plant trees and bioenergy, that's going to impact low-income communities, the ones that have done the least to cause the problem," Anderson told AFP.

And because humans have already burned through most of the carbon budget—that is, how much total carbon pollution we can produce before 1.5C is breached—there simply isn't time to delay.

This year the UN's Intergovernmental Panel on Climate Change found that since 1850, humans had emitted around 2,400 billion tonnes of CO2 equivalent. That leaves just 460 billion tonnes before 1.5C is breached — around 11 years at current emissions rates.
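The "around 11 years" figure is straightforward division — with the caveat that the annual-emissions rate used below (roughly 42 billion tonnes of CO2-equivalent per year) is an assumed round number for illustration, not a figure from the article:

```python
# Back-of-envelope check of the remaining-carbon-budget arithmetic.
# The 460 Gt budget is from the article; the annual rate is an
# assumed round number (~42 Gt CO2-equivalent per year).
remaining_budget_gt = 460
annual_emissions_gt = 42
years_left = remaining_budget_gt / annual_emissions_gt
print(f"~{years_left:.0f} years at current emissions")  # ~11 years
```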

Pineda said that while hundreds of companies have made net-zero pledges, "very few" have concrete long-term plans to decarbonise.

"We need to be very sceptical of any target that doesn't have clear milestones in terms of how the company is going to halve emissions by 2030," he said.

"Any net-zero target without a 2030 milestone is just unbelievable, basically."


© 2021 AFP

New US energy standards would reverse Trump’s war on lightbulbs


Trump’s energy department blocked a rule intended to phase out less efficient bulbs. Now Biden plans to move forward, slowly


Trump once said that LED bulbs were ‘very dangerous with all the gases’.
 Photograph: john_99/Getty Images/iStockphoto


Oliver Milman
@olliemilman
Thu 9 Dec 2021

The Biden administration has moved to reverse the depredations endured by one of the more unusual targets of Donald Trump’s culture wars during his time as US president: the humble lightbulb.

The US Department of Energy has put forward a new standard for the energy efficiency of lightbulbs that would essentially banish the era of older, incandescent technology in favor of LED lighting.

The absence of lightbulb regulations helps worsen the climate crisis and wastes households’ money, according to the American Council for an Energy-Efficient Economy. The group has found that the use of wasteful incandescent lights is costing Americans nearly $300m a month in needless electricity bills and releasing 800,000 tons of carbon dioxide over the lifetimes of the inefficient bulbs sold each month.

Trump, however, was not enamored by LED lights while he was in the White House. The twice-impeached president said that he looks “better under an incandescent light than these crazy lights that are beaming down”, complaining in 2019 that he “always looks orange” under LED lights. He claimed that if LEDs break they become “a hazardous waste site” and are “very dangerous with all of the gases”.

Trump, who also used his position at the zenith of global power to voice outrage over toilets that he said required flushing “10 times, 15 times, as opposed to once”, vowed to bring back the older lightbulbs, with the White House declaring that the real estate developer and former Apprentice host was allowing Americans to “go ahead and decorate your house with whatever lights you want”.

Trump’s Department of Energy followed this lead, blocking a rule that would have led to the phase-out of incandescent lightbulbs. The move, the administration said, would save consumers money and remove unnecessary government interference in the market.

The Department of Energy has consistently found, however, that installing LED bulbs in light fixtures can save people hundreds of dollars over the lifetime of the bulbs. While LEDs have traditionally been more expensive than incandescents, which are based upon a technology devised by Thomas Edison, their cost has dropped by nearly 90% over the past decade.

Incandescent lightbulbs still make up about a third of all bulb sales in the US, and the Department of Energy said the new standard, which will go through a period of public comment, will reduce greenhouse gas emissions by 222m metric tons over the next 30 years and save consumers nearly $3bn in annual net costs.

The new standard will be implemented in staggered fashion after lobbying from manufacturers, however, which could lead to several years of further incandescent sales. “This progress is welcome news for consumers and for the planet, but the administration is not acting here with the urgency needed to address the climate crisis,” said Steven Nadel, executive director of the American Council for an Energy-Efficient Economy.

Andrew deLaski, executive director of the Appliance Standards Awareness Project, added: “It’s time to get this done. The manufacturers have already received a couple extra years beyond Congress’s deadline to sell bulbs that have a short lifespan and waste a lot of energy. Now they’re pushing for more. The department needs to remember that any extra time it takes and compliance flexibility it gives come at the expense of consumers and the climate.”

America's 'Maximum Pressure' Policy On Venezuela Has Failed

  • U.S. sanctions have a long history of failing to initiate regime change

  • Sanctions from both the Obama and Trump Administrations have failed to have a material impact on Venezuela’s regime

  • The dire situation in Venezuela has given Russia, China, and Iran an opportunity to expand their influence in the crisis-stricken country

Two decades of malfeasance and corruption, coupled with sharply weaker crude oil prices and ever stricter U.S. sanctions, have precipitated in Venezuela the worst modern economic collapse outside of war. The crisis-riven Latin American country, once considered the wealthiest and most stable democracy in its region, is on the verge of collapse. While U.S. policymakers continue to believe that tough sanctions will bring the autocratic Maduro government to its knees and trigger regime change, there are increasing signs that they are failing and have in fact strengthened his position. Events since 2015, when the Obama administration ratcheted up sanctions and described Venezuela as an “unusual and extraordinary threat to the national security and foreign policy of the United States,” bear this out. Former President Trump’s policy of maximum pressure, which saw his administration enact the harshest sanctions ever imposed on Venezuela and cut the country off from international energy markets, has also failed. Even the near-collapse of Venezuela’s economy under harsh U.S. sanctions and growing lawlessness has done little, if anything, to erode Maduro’s grip on power.

The latest event highlighting the strength of Maduro’s grip on power is the ruling United Socialist Party of Venezuela’s recent sweeping victory in regional elections. The coalition led by the party, which is controlled by Maduro, won 20 of the 23 governorships available and the mayoralty of Caracas. This comes after Maduro secured control of Venezuela’s National Assembly, winning 256 of the body’s 277 seats in the December 2020 elections. That success essentially destroyed the legitimacy of Juan Guaido, the interim president recognized by Washington, because he lost not only his leadership of the lawmaking body but also his seat. As a result, the European Union ceased recognizing Guaido as Venezuela’s legitimate interim president, instead bestowing on him the title of privileged interlocutor.

U.S. sanctions have a long history of failing to initiate regime change unless they are accompanied by other forms of overt pressure, including military action. They were unsuccessful in removing Saddam Hussein from power, which was accomplished only through direct military action; they have failed to curb the activities of a fundamentalist Shia Iran; and they have not caused the communist regime in Cuba to collapse. Indeed, sanctions tend to fortify authoritarian governments by adding to the scarcity of goods and services, handing authorities greater control over their provision and distribution while providing a handy scapegoat for the hardships they create. Harsh U.S. sanctions precipitated Venezuela’s economic meltdown, which bolstered the Maduro regime’s power by making it the key provider of essential goods and services.

The targets of sanctions typically find ways of dulling their impact or avoiding them altogether by securing alternate sources of capital, markets, and crucial resources. Maduro has been extremely successful in this regard, obtaining considerable support from countries opposed or antagonistic to the U.S., notably Russia, China, Iran, and Cuba. Moscow and Beijing are both lenders of last resort for a near-bankrupt Caracas, providing oil-backed loans and even investing in Venezuela’s rapidly corroding hydrocarbon sector. In late March 2020, Russian energy company Rosneft announced it had transferred its Venezuelan energy assets to a series of Russian government-controlled entities to shield its operations from U.S. sanctions. That gave the Kremlin ownership of interests in a series of joint ventures with Venezuela’s national oil company PDVSA, including oilfields and infrastructure. Beijing, along with Moscow, has loaned billions to the financially crippled petrostate. It is estimated that Caracas owes Beijing as much as $50 billion or more, with another $17 billion payable to Moscow. While that debt is proving a crippling burden for a nearly bankrupt Venezuela, it has done little to weaken Maduro’s grip on power.

Events in Venezuela have given Beijing the opportunity to significantly expand its influence there. State-controlled China National Petroleum Corp. is ramping up its presence in the petrostate, sending engineers and technicians as it discusses with PDVSA how to boost oil production at their joint projects. Beijing is also a key facilitator for shipping Venezuelan crude oil: according to Reuters, Hong Kong-registered logistics firm China Concord Petroleum Co. chartered tankers that in April and May 2021 transported a fifth of Venezuela’s crude oil exports for those months. Beijing is focused on more than obtaining repayment of outstanding loans. Caracas’ desperation for capital is being fully exploited by a Beijing determined to build influence and presence in Latin America as a direct challenge to Washington. A resource-hungry China has also secured extremely favorable terms for a series of loans backed not only by petroleum but by other commodities, including iron ore, leaving an increasingly desperate Venezuela with backbreaking debt. This strategy is an increasingly important plank in China’s economic rivalry with the U.S. as it seeks to secure vital raw materials and gain greater geopolitical influence.

Iran and its proxy Hezbollah are becoming increasingly important supporters of the Maduro regime. Teheran is providing considerable assistance to PDVSA, including materials, technicians, and engineers to rebuild its severely dilapidated refineries. According to Reuters, in 2020 Iran sent more than 20 flights carrying parts and technicians to Venezuela to restart the 310,000-barrel-per-day Cardon Refinery, part of the 971,000-barrel-per-day Paraguana refinery complex. In February 2021, further airlifts of materials, including catalysts bound for Paraguana, were identified. Iran is also sending shipments of urgently needed condensate to Venezuela. This is critical to PDVSA’s operations because condensate is mixed with the extra-heavy crude produced in the Orinoco Belt so that it can be transported, processed, and exported.

The growing influence of Iran, which is also subject to strict U.S. sanctions, has allowed Hezbollah to establish a sizeable foothold in Venezuela, where it engages in a range of illicit activities including cocaine and arms smuggling as well as money laundering. Hezbollah has also established training camps in Venezuela, posing a serious terrorist threat to the U.S., its citizens, and its allies in Latin America. The militant Shia group was responsible for the 1992 car bombing of the Israeli Embassy in Buenos Aires and the 1994 bombing of a Jewish community center in that city. The two attacks claimed 116 lives, were the worst terrorist attacks in Argentina’s history, and highlight the threat Hezbollah poses in Latin America. Hezbollah conducts all of these activities with the approval of the Maduro regime. The U.S.-designated terrorist organization’s presence in Venezuela is fueling further regional instability through its relationships with local illegal armed groups such as the ELN and FARC dissidents.

The autocratic Maduro regime’s ability to overcome or avoid strict U.S. sanctions has allowed crude oil output to rise steadily since June 2020, reaching 590,000 barrels per day in October 2021. That, coupled with Maduro’s reluctant reforms, an economy that has bottomed out, and steady unofficial dollarization that is lowering inflation, has international financial institutions predicting that Venezuela’s economy will grow during 2021. Investment bank Credit Suisse forecasts that Venezuela’s gross domestic product will expand by 5.5% in 2021 after contracting every year since 2014, while other economists estimate growth of anywhere between 5% and 10% this year. If that occurs, it will further fortify Maduro’s position, making it even more difficult for his regime to be toppled by current U.S. policy.

Washington’s unwavering faith in strict economic and other sanctions has failed to trigger regime change in Venezuela. In fact, it is becoming increasingly clear that over time the utility of sanctions diminishes to the point where they no longer have any material impact on the target regime. That point has been reached with Washington’s approach to Venezuela. Not only have those measures failed, but they are creating a range of undesirable side effects that harm U.S. interests in Latin America. One of the most worrying is that they create an opportunity for Russia, China, and Iran to bolster their footprint and influence in the region, challenging Washington’s traditional hegemony. This is particularly concerning because Venezuela’s vast natural resources, notably its colossal petroleum reserves, are gradually falling under foreign control. Washington’s policy is also allowing illegal armed groups and designated terrorist organizations to prosper in a lawless Venezuela that is nearly a failed state. That not only further inflames regional instability and creates greater opportunities for criminal groups to engage in harmful illicit activities such as cocaine smuggling, but also heightens the risk of terrorist attacks. Aside from the considerable humanitarian suffering occurring in Venezuela, it is for these reasons that Washington needs to reappraise its approach toward the authoritarian regime and the crisis-riven country.

By Matthew Smith for Oilprice.com

 

New research makes waves tackling the future of tsunami monitoring and modeling

Fralin Life Sciences Institute's Tina Dura (right) conducts research with colleagues 
Richard Briggs (United States Geological Survey) and Simon Engelhart (Durham University)
 on an island off the coast of Alaska. Photo courtesy of Rich Koehler for Virginia Tech. 
Credit: Virginia Tech

The coastal zone is home to over a billion people. Rising sea levels are already impacting coastal residents and aggravating existing coastal hazards, such as flooding during high tides and storm surges.

However, new research by assistant professor Tina Dura and professor Robert Weiss in the College of Science's Department of Geosciences indicates that future sea-level rise will also increase the heights of future tsunamis.

"In 50 to 70 years, sea level is going to be significantly higher around the world," said Dura, who is also an affiliate of the Center for Coastal Studies, an arm of the Fralin Life Sciences Institute. "If a tsunami strikes in that time frame, the impacts that you're estimating for today are going to be greater. I think that coastal geologists and modelers alike need to consider sea-level rise in future models and hazards assessments."

Their findings were published in Nature Communications.

Around the so-called Ring of Fire, tectonic plates are colliding with the massive Pacific plate, resulting in seismic and volcanic activity. Because the Ring of Fire encircles the Pacific Ocean, large earthquakes on its boundaries produce regional tsunamis and also distant-source tsunamis that propagate across the Pacific Ocean and affect coastlines thousands of miles away.

Off the coast of Alaska, colliding tectonic plates create a 2,500-mile-long fault known as the Alaska-Aleutian subduction zone. Research shows that the subduction zone can produce distant-source tsunamis that strike the west coast of the United States, and in particular, Southern California.

In 2013, the United States Geological Survey initiated a Science Application for Risk Reduction project focused on a distant-source tsunami originating along the Alaska-Aleutian subduction zone and its impacts in California.

The project found that a magnitude 9.1 earthquake could produce a distant-source tsunami with an amplitude of 3.2 feet at the ports of Los Angeles and Long Beach, larger than any historical distant-source tsunami at the ports, causing losses of up to $4.2 billion.

a) Map of Alaska showing the sections of the Alaska-Aleutian subduction zone, earthquake boundaries, and approximate historical earthquake extents. b) Light gray shaded area shows the U.S. Geological Survey Science Application for Risk Reduction scenario magnitude 9.1 Semidi section earthquake. c) Map of the ports of Los Angeles and Long Beach showing the location of gauges that measure water levels at the ports and maximum nearshore tsunami heights. d) Plot showing modeled earthquake magnitudes in the year 2000 with no tidal variability included (blue histogram), with tidal variability (green histogram), and the combined tsunami heights and tidal variability (red histogram).
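The combination shown in panel d), where tidal variability is layered on top of a modeled tsunami amplitude, can be sketched as a simple Monte Carlo loop. This is an illustrative toy, not the study's actual model; the tidal range and amplitude values below are assumptions, not figures from the paper.

```python
import random

# Hypothetical inputs (NOT the study's values): a fixed modeled tsunami
# amplitude and an assumed tide range relative to mean sea level.
TSUNAMI_AMPLITUDE_FT = 3.2
TIDAL_RANGE_FT = (-2.0, 2.5)

random.seed(42)  # reproducible sampling

def sampled_water_levels(n=10_000):
    """Sample tide stages uniformly and add the tsunami amplitude,
    yielding a distribution of possible total water levels (the kind
    of combined histogram panel d) describes)."""
    levels = []
    for _ in range(n):
        tide = random.uniform(*TIDAL_RANGE_FT)
        levels.append(TSUNAMI_AMPLITUDE_FT + tide)
    return levels

levels = sampled_water_levels()
print(f"range of combined levels: {min(levels):.2f} to {max(levels):.2f} ft")
```

A histogram of `levels` would reproduce the shape of a combined tsunami-plus-tide distribution: the same wave arriving at high tide poses a meaningfully larger hazard than at low tide.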

However, due to rising sea levels, this tsunami scenario at the ports of Los Angeles and Long Beach will not be accurate in the long run.

Observations show that the world's temperatures are rising and sea levels are following suit. It's not a question of whether sea level will continue to rise but by how much.

Dura and Weiss, along with colleagues from Rowan University, Rutgers University, Durham University, Nanyang Technological University, and the United States Geological Survey, joined forces to combine distant-source tsunami modeling with future sea-level rise projections to see how rising sea levels will influence tsunami heights in Southern California.

The group projected sea-level rise for the ports of Los Angeles and Long Beach based on scenarios that factor in both low and high estimates of greenhouse gas emissions and climate change mitigation strategies.

One scenario included mitigation strategies to reduce greenhouse gas emissions that resulted in minimal temperature and sea-level rise. Another scenario reflects a future with no mitigation efforts and high emissions, leading to a faster rise in temperatures and higher sea levels.

The group found that today, a magnitude 9.1 earthquake can produce tsunami heights that exceed 3.2 feet at the ports. However, by 2100, under high-emissions sea-level rise projections, a much smaller magnitude 8 earthquake will be able to produce a tsunami that exceeds 3.2 feet.

In other words, higher sea levels will make the ports more vulnerable to tsunamis produced by less powerful earthquakes. The results are especially concerning given the higher frequency of magnitude 8 earthquakes.

"A 9.1 is very, very rare," said Dura. "So today, the chances of having a tsunami exceeding 3.2 feet at the ports is pretty small because a very rare, very large earthquake would be required. But in 2100, a magnitude 8, which happens around the Pacific Rim quite often, will be able to exceed the same tsunami heights due to higher sea levels."
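Dura's point reduces to simple arithmetic: sea-level rise adds directly to a tsunami's height relative to today's shoreline, so a smaller wave can clear the same threshold. A minimal sketch, using assumed amplitude and sea-level-rise numbers (the study's actual projections differ):

```python
# Threshold of concern at the ports, from the study.
THRESHOLD_FT = 3.2

def total_height(tsunami_amplitude_ft, sea_level_rise_ft):
    """Tsunami height measured against today's datum once sea level has risen."""
    return tsunami_amplitude_ft + sea_level_rise_ft

# Today: only a very large (M9.1-class) tsunami reaches the threshold.
today = total_height(3.2, 0.0)

# By 2100 under high emissions: a hypothetical smaller wave (~2.2 ft, an
# assumed value) plus an assumed ~1.2 ft of sea-level rise exceeds it.
future = total_height(2.2, 1.2)

print(today >= THRESHOLD_FT, future > THRESHOLD_FT)  # True True
```

The same 3.2-foot line is crossed in both cases, but the second requires a far more common earthquake, which is the crux of the study's hazard finding.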

"This work really illustrates the potential for future tsunamis to become far more destructive as sea levels rise, especially if we fail to reduce future greenhouse gas emissions," said co-author Andra Garner, who is an assistant professor studying sea-level rise at Rowan University. "The good news is that the work also illustrates our ability to minimize future hazards, if we act to limit future warming and the amount by which future sea levels increase."

But knowing about these potentially devastating tsunamis entails not just looking ahead, but looking back as well.

The United States Geological Survey Science Application for Risk Reduction project only considered an earthquake that occurred within the Semidi section of the Alaska-Aleutian subduction zone. But since that initial work, Dura and colleagues have published research that suggests other sections of the subduction zone should be considered as well.

The Semidi section and the adjacent Kodiak section of the subduction zone have produced historical earthquakes. In 1938, a magnitude 8.3 earthquake struck the Semidi section. In 1964, a magnitude 9.2—the largest recorded earthquake to occur on the Alaska-Aleutian subduction zone—struck the Kodiak section and other sections to the east.

Because the earthquakes of 1938 and 1964 did not overlap, seismic hazard maps labeled the area between them as a "persistent earthquake boundary." In other words, the risk of the region's greatest, multi-section earthquakes was thought to be quite low.

"Although the 1964 earthquake rupture did not cross into the rupture area of the 1938 earthquake, it is unclear if this has been the case for earthquakes hundreds to thousands of years in the past. Should this be considered a persistent boundary between earthquakes, or can there be very large, multi-section earthquakes in this region? We wanted to find out," said Dura.

To learn more about the seismic history of the Alaska-Aleutian subduction zone, Dura and colleagues used 5-centimeter, cookie-cutter-like cylinders to collect core samples from wetlands that are peppered across the proposed earthquake boundary.

The group then analyzed the soil layers contained in the cores to identify instances of land-level change and tsunami inundation from past earthquakes. Through radiocarbon, cesium, and lead dating, the group was able to build a timeline of past large earthquakes in the region.

Their research showed that multiple earthquakes had spanned the proposed boundary; that is, earthquakes that ruptured both the Semidi and Kodiak sections of the subduction zone had occurred multiple times in the past.

"Our geologic data shows that earthquakes can span the Semidi and Kodiak sections," said Dura. "For this reason, we incorporated both single and multi-section earthquakes into our distant-source tsunami modeling for the ports. By including multi-section earthquakes in our modeling, we believe the range of tsunami heights we estimate for the ports is a step forward in our understanding of impacts of future tsunamis there."

The group's data will be included in hazard maps for southern Alaska to help improve future modeling scenarios for the Alaska-Aleutian subduction zone.

"Collaborations like ours that aim to integrate coastal geology, earthquake modeling, and future projections of sea level are crucial in developing a complete picture of future tsunami impacts at ports," said Weiss, director of the Center for Coastal Studies. "Increasing interdisciplinary research capacity, meaning the integration of scientific fields with each other that follow different governing paradigms, will be the key to understanding the impacts that the changing Earth has on our well-being and prosperity. Building interdisciplinary research teams is difficult, and Virginia Tech's Center for Coastal Studies fulfills a pivotal role bringing such teams together. Fulfilling this team-building role not only enables studies such as ours, but also helps Virginia Tech remain true to its motto, Ut Prosim (That I May Serve)."

In future projects, Dura, Weiss, and colleagues plan to incorporate distant-source tsunamis originating from other subduction zones around the Ring of Fire into their modeling of tsunami impacts on other coasts as well as the economic consequences of coastal inundation.

"With our new study, we provide an important framework for incorporating sea-level rise into distant-source tsunami modeling, and we're excited to continue building on these initial results," Dura said.

More information: Nature Communications (2021). DOI: 10.1038/s41467-021-27445-8


Provided by Virginia Tech 

$450M flood bill costliest in B.C.'s history, still climbing

The nearly half-billion dollars in insurance claims makes November's flooding the costliest natural disaster in B.C.'s history. But with the government yet to release its damage estimates, it comes nowhere near the true financial toll.
Princeton, B.C., residents float and wade through floodwaters, Nov. 15, 2021. 

The flooding that left several B.C. communities underwater last month is now estimated to have caused $450 million in insurable damage, making it the costliest weather event in the province’s history, says the Insurance Bureau of Canada (IBC). 

The record bill comes nowhere close to capturing the full damage to people’s homes and businesses. The B.C. government has yet to tally the full cost of both disaster assistance relief and repairs to damaged infrastructure. 

Part of the reason the government’s bill is expected to stretch into the hundreds of millions, if not billions, of dollars is that only roughly half of British Columbians are covered by flood insurance. In high-risk areas, like Abbotsford’s Sumas Prairie or parts of Merritt, flood insurance simply wasn’t available. 

“This is a fraction of the total cost,” says Aaron Sutherland, IBC’s vice-president for Canada’s Pacific region. 

“If you lived in these areas that have been impacted by these floods, insurance for your home would have been… either very expensive or the insurers just simply don't offer it because they can't make it affordable.”

Sutherland says IBC does not yet have a grasp on what proportion of homes and businesses impacted by November’s flooding had insurance, but so far roughly 6,000 homes have filed claims.

That comes nowhere near the $4 billion (adjusted for inflation) claimed in the wake of the 2016 Fort McMurray wildfire, to date Canada’s costliest insurance payout from a natural disaster. But because fire is always covered in insurance policies while flood damage requires homeowners to opt in, the full cost of B.C.’s flooding could be much higher than the claims suggest.

Sutherland says 90 per cent of B.C. homeowners with insurance pay no more than $300 per year. It’s the remaining 10 per cent, who live in places expected to flood every 10 or 20 years, that face massive insurance premiums. And as climate change makes the risk of flood more likely, those costs are only expected to climb. 
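The "expected to flood every 10 or 20 years" framing translates into striking cumulative odds over the life of a mortgage, which is why those premiums climb so steeply. A back-of-the-envelope sketch, assuming flood years are independent (the return periods and time horizon here are illustrative, not IBC figures):

```python
def chance_of_flood(return_period_years: float, horizon_years: int) -> float:
    """P(at least one flood within horizon_years), given an annual
    exceedance probability of 1 / return_period_years and independence
    between years."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

# A 25-year mortgage on a home in a "1-in-20-year" flood zone:
p = chance_of_flood(20, 25)
print(f"{p:.0%}")  # roughly a 72% chance of at least one flood
```

Even a seemingly modest 5 percent annual risk compounds into near-even odds within 15 years, which helps explain why insurers price these properties so aggressively or decline to cover them at all.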

Every year, the insurance industry pays out roughly $40 billion across Canada, of which $6 billion is paid out in B.C. That’s still nowhere near a breaking point, says Sutherland. 

The growing sense of urgency rests with home and business owners who can’t foot the rising cost of premiums. To that end, IBC is part of a federal, provincial and territorial task force looking to create a low-cost residential flood insurance program for those at highest risk. 

At the same time, Sutherland says more needs to be done to protect communities from the effects of climate change in the first place — whether from wildfire or floodwaters.

“Insurance is just one piece of what is a much bigger challenge we’re facing,” he says. “The impact it’s having on people living and working in these areas, you can’t quantify that with a dollar value.”