Friday, August 28, 2020

California reveals that the transition to renewable energy isn’t so simple

By Alex Trembath & Zeke Hausfather | News | August 27th 2020
#1393 of 1394 articles from the Special Report: Race Against Climate Change

Image by Michael Schwarzenberger from Pixabay. Pixabay License

This story was originally published by Slate and appears here as part of the Climate Desk collaboration

The recent “heat storm” in California has pushed grid operators to impose rolling blackouts for the first time since 2001. A combination of heavy air conditioning usage, the unplanned unavailability of some power plants, limited options for importing power from neighboring states, and insufficient solar and wind generation have led to an imbalance of electricity generation and consumption. As Stephen Berberich, president of the California Independent System Operator, or CAISO, which oversees operation of the state’s electric grid, told Sammy Roth of the Los Angeles Times, “We thought there would be adequate power to supply the demand. … We were wrong.”

The blackouts are designed to affect customers for no more than an hour or two. Some may find that little more than mildly inconvenient. Others more prone to heat stress or in possession of medicines that require refrigeration might be more worried, with good reason. The aggregate reaction could prove substantial, especially if we face more rounds of shortages in the coming months and years. The last time this happened to us Californians, we altered long-held plans to partially deregulate electricity systems, recalled Gov. Gray Davis, and pushed Enron, one of the energy services providers found guilty of gaming electricity markets, into bankruptcy.


This time, what we’re seeing should give renewable energy advocates pause.

For years, renewable energy enthusiasts have insisted that most of the problems of the electric grid were caused by outdated and inflexible coal, natural gas, and nuclear energy technologies. A system built on solar panels and wind turbines, smart meters, electricity storage, and payments for flexible demand would lower costs and improve reliability for everyone. Some academic studies showed that renewables could easily supply 80 percent or more of an electric grid’s demands.

But with non-hydroelectric renewable technologies, mostly solar and wind, generating about 30 percent of California’s electricity today, we are witnessing the types of obstacles and problems that these new technologies introduce.

Solar power, of course, is well matched with high demand for air conditioning in the middle of the day. That advantage disappears when the sun goes down but the temperature doesn’t. Distributed and centralized electricity storage technologies can shift some power generation toward later in the day. But lithium-ion battery systems are usually designed for only a few hours of storage—enough to accommodate much of the daily swings in solar generation, but not the type of extreme surges in demand we’ve seen this week. Storage technologies that last a few days would help, but truly high-renewables systems will likely require seasonal storage technologies that can shift consumption from the hot months to the cool months. Beyond geographically constrained, pumped hydroelectric storage systems, those technologies are largely nonexistent.

Advocates emphasize that solar and wind complement each other; the wind usually blows while the sun doesn’t shine, and vice versa. And even if a renewable resource is less available in one part of the country, long-distance power lines can transmit electricity from other regions. These are features of a wind-and-solar-heavy system—until they’re bugs. In California this week, the heat pushed power demand to near record highs, solar generation plummeted in the evening, the winds slowed faster than expected, and the same thing happened in Nevada, Arizona, and other states we usually import electricity from. The result was predictable to anyone who hasn’t been heralding a seamless transition to renewable energy technologies.


Part of the problem here is how much solar generation drops as the sun goes down and how much flexible generation needs to fire up to compensate. The amount of solar power has increased substantially in California over the last decade. The required summer “ramp-up” of firm generation (generators like natural gas plants that can be turned on and kept on) between the middle of the day and the evening—when the sun sets and many people turn on lights and appliances—has more than doubled. California shifts from using a large amount of solar during the day to a large amount of natural gas and imports during the evening, and the size of this shift is growing. This puts stress on the rest of the electricity grid and increases the risk that disruptions in either in-state power generation or imports could lead to shortfalls. About two-thirds of the generation ramp-up comes from natural gas plants in California, while the remaining third comes from imports from neighboring states.
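The scale of this daily shift is easiest to see as a net-load ("duck curve") calculation: net load is total demand minus solar and wind output, and the evening ramp is how much that figure climbs as the sun sets. The sketch below uses invented round numbers purely to illustrate the arithmetic, not actual CAISO data:

```python
# Illustrative net-load ("duck curve") calculation.
# Net load = demand that firm generators and imports must cover
#          = total demand minus solar and wind output.
# All figures are invented round numbers in megawatts, NOT CAISO data.

hours = [12, 15, 18, 19, 20, 21]                      # hour of day
demand = [40000, 44000, 46000, 45000, 43000, 41000]   # total demand (MW)
solar = [12000, 10000, 4000, 1500, 200, 0]            # falls off at sunset
wind = [1500, 1500, 2000, 2000, 2500, 2500]           # roughly flat (MW)

net_load = [d - s - w for d, s, w in zip(demand, solar, wind)]
for h, n in zip(hours, net_load):
    print(f"{h:02d}:00  net load {n:,} MW")

# The evening "ramp" that firm generation and imports must supply:
ramp = max(net_load) - net_load[0]
print(f"Midday-to-evening ramp: {ramp:,} MW")
```

Even though total demand barely changes between midday and evening in this toy example, the net load firm plants must cover jumps sharply, which is the stress the article describes.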

In this case, blackouts resulted from a combination of less available imports—neighboring states also had extreme demand associated with the heat wave—and a number of natural gas plants unexpectedly going offline. More gas plants or battery storage can be added in the short term to hedge against future energy use spikes, but having resources that sit idle for the vast majority of the time increases the overall cost of generating electricity.

California’s challenges are exacerbated by the declining availability of reliable, always-on nuclear capacity in the state. In 2012, Southern California Edison closed the San Onofre Nuclear Generating Station, or SONGS, taking 2,200 megawatts off the grid in one fell swoop.* The power once generated by SONGS was largely replaced by natural gas, causing a jump in carbon emissions and a squeeze in electric capacity reserve margins. A similar fate now awaits the Diablo Canyon Power Plant, the last nuclear plant in the state. In 2016, Pacific Gas & Electric announced that it would not seek a license extension for Diablo Canyon.

Sometimes the features of a wind-and-solar system turn out to be bugs.


Replacing California’s nuclear power plants reveals the difference between modeling a renewable energy future and building one. The Natural Resources Defense Council, which pushed to close Diablo Canyon, argued at the time that the plant’s electricity would “be replaced with gains in energy efficiency, renewable power, and pollution-free energy technologies.” But an initial California Public Utilities Commission replacement plan included substantial natural gas capacity. (The NRDC called the plan “deeply flawed.”) In 2018, the state Legislature passed a law directing the California Public Utilities Commission to ensure that no greenhouse-emissions increase results after Diablo Canyon shuts down in 2025. How this will all unfold in practice remains to be seen.

Solar, wind, and battery storage technologies have gotten significantly cheaper over the last decade. They are modular and have proven much easier to deploy and build than, say, new nuclear power plants, which have become expensive megaprojects (and are actually illegal to build in California until federal lawmakers create a long-term solution for storing spent nuclear fuel). And of course renewables do not emit air pollution or carbon dioxide. The reasons to support the deployment of solar and wind are myriad and valid.

And the challenges we describe above, which states like California and countries like Germany are increasingly grappling with, are at least partially surmountable. Future advances in long-duration electric storage capacity could allow grid operators to shift supply to meet demand. “Demand response” arrangements, or payments to customers who temporarily reduce their consumption, could help smooth imbalances. Continent-spanning transmission would help move electrons from where they’re more reliably generated to where they’re more reliably consumed.

But California’s experience also underscores a growing consensus among energy scholars: that variable renewable energy technologies are unlikely to meet the grid’s power demand by themselves. They will play an important role, but more firm generating sources, like next-generation nuclear reactors, natural gas plants with carbon capture technologies, enhanced geothermal, and others that can balance out variable renewables, will be required.

To be clear: Rising energy costs and the ongoing reliability crisis cannot be blamed entirely on California’s growing solar and wind sectors. But this month’s challenges surface the complexities and difficulties of energy transitions, and the imperative of maintaining a flexible and diverse supply of energy technologies. If this month’s blackouts continue, there is a risk California’s ratepayers will come to associate them with the state’s clean energy transition.

*Correction, Aug. 25, 2020: This article originally misstated that San Diego Gas & Electric closed the San Onofre Nuclear Generating Station. San Diego Gas & Electric is a minority owner of SONGS; the majority owner is Southern California Edison.
Distributed Energy Helped Fight California’s Grid Outages, But It Could Do Much More

Battery and demand response providers say policy and market reforms could turn emergency help into stable, consistent grid relief.
JEFF ST. JOHN AUGUST 28, 2020


California's growing behind-the-meter energy resource mix could play a bigger role in solving grid emergencies, advocates say.

During last week’s heat-wave-driven grid emergency in California, grid operator CAISO and the state’s utilities sent out desperate calls to demand response providers, behind-the-meter battery aggregators, electric-vehicle charging providers, microgrid operators and backup generator owners, seeking whatever help they could provide.

The call was answered. According to CAISO and the California Public Utilities Commission, consumer conservation and demand-side resources were critical in avoiding more rolling blackouts like those CAISO ordered the evenings of Friday, Aug. 14 and Saturday, Aug. 15.

“Conservation efforts are the only way we have gotten through these peaks,” CAISO President Steve Berberich said last week.

While the grid operator lacks visibility into how much load reduction came from those voluntary energy-saving decisions, it can measure the roughly 1,300 megawatts of residential and commercial-industrial demand response it secured during the crisis.


It also tallied hundreds of megawatts from out-of-market responses, according to a letter from CAISO and the CPUC laying out plans for a “postmortem” analysis of the emergency for Gov. Gavin Newsom and state leaders. Silicon Valley data centers provided about 100 megawatts by switching to backup power systems. Military microgrids, ships and backup generators delivered about 23.5 megawatts, and microgrids funded by state research grants provided another 1.2 megawatts per day.

Microgrids, solar-plus-storage systems, EV chargers, dispatchable energy loads and other distributed energy resources are going to play an increasingly important role in shaping California’s electricity supply-demand balance as it seeks to reach its goal of 100 percent carbon-free energy by 2045.

Beyond these backup systems, behind-the-meter batteries from providers including Stem, Sunrun, Enel X and Tesla shifted their typical operating schedules to maximize effectiveness through 9 p.m. to match the state’s “net peak,” or total grid demand minus wind and solar. Electric-vehicle charging operators like Tesla and Enel X also encouraged customers to avoid charging during peak hours.


Surya Panditi, CEO of Enel X North America, noted that the company’s demand response, behind-the-meter battery and EV charger assets were able to deliver about 150 megawatts per day during the crisis. “If...the right kind of economic construct [is in place], everyone benefits.”
Barriers to behind-the-meter participation

But this month’s crisis has also revealed a gap between the grid-balancing potential of demand-side resources and the policies needed to bring them to bear, industry participants say.

“We delivered about 50 megawatts of relief to the grid on Friday evening — but we could have deployed about 50 megawatts more if all the policies were in place,” said Ted Ko, Stem’s vice president of policy and regulatory affairs. “What’s ironic is that the [CPUC] and CAISO have been putting up all these barriers to us participating, and then they called us and said, ‘Hey, can you do this for us for free?’”

A 2019 paper from the California Solar & Storage Association lays out a long list of such barriers. One key issue is the set of rules meant to prevent “double-counting,” that is, resources gaming or distorting the market by getting paid twice by different programs for the same service.

This prohibition has been applied in illogical ways, Ko contends. For example, until recently, batteries receiving payments from the state's Self-Generation Incentive Program were barred from most demand-side resource programs — even though SGIP’s upfront installation incentives are distinct from grid-services payments.

The CPUC has made changes to allow SGIP systems to participate in certain distribution grid programs. But it hasn’t yet set standards for how “multiuse applications,” such as behind-the-meter resources earning incentives from other state programs, can participate in California’s key program for securing capacity to meet grid demand peaks, the Resource Adequacy (RA) program.

CAISO has criticized the CPUC for failing to secure enough RA to forestall last week’s grid emergencies, and reforming the program is high on both agencies’ agendas. While some SGIP-backed solar-storage aggregations in California qualify as RA, other utility projects seeking behind-the-meter capacity have barred participants from receiving SGIP or other installation incentives, the 2019 report states.
Slow progress on reforms to demand response


Thomas Folker, CEO of Leap, a company that helps renewables aggregators access markets, agreed that changes are needed to unlock more demand-side capacity in California. Last week, Leap provided between 50 and 60 megawatts of load reduction daily from the mix of Nest smart thermostats, EV chargers, water pumps and commercial-industrial load controls it bids into CAISO’s energy markets for aggregators.

Many of those resources are organized under the Demand Response Auction Mechanism (DRAM) pilot program, which allows aggregators to earn money in two ways: by entering into RA contracts with utilities and by bidding into CAISO’s day-ahead and real-time energy markets.

But the CPUC has not expanded DRAM's budget to allow broader participation by aggregators over the past few years while it debates the program's next steps, limiting DRAM's growth. Meanwhile, traditional utility-run demand response initiatives such as the Capacity Bidding Program pay less than DRAM does, with payments disconnected from real-time energy market prices, he said.

CAISO allows distributed energy resources (DERs) to participate directly in its markets, but the Distributed Energy Resource Provider program hasn’t enlisted any aggregators yet. DER providers say that’s because CAISO sets deal-breaking requirements, including the need to be available at all times, which prevents alternative revenue-generating opportunities.

Even DRAM has its limitations, Folker said. For example, RA rules require resources to commit to at least four hours of continuous delivery. That’s meant to ensure that CAISO can access a reliable, consistent peak reduction over time. But it also limits how “surgically” Leap can bid load-reduction potential into CAISO’s market during its net peak, when the sun sets and the buffer between supply and demand is thinnest.

Narrowing that minimum bid requirement to two hours, for example, could double batteries’ load-reduction potential, by halving the amount of time they’re required to draw upon their limited storage capacity. Capacity from homeowners turning off air conditioners or EV drivers delaying charging could also be boosted by shortening how long they’re asked to do it, he said. Since CAISO's rolling blackouts were forced during a two-hour window, this kind of focused increase in capacity could have made a significant difference.
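The arithmetic behind that proposal is straightforward: under a minimum-duration rule, a battery's biddable power is its usable energy divided by the required delivery window, so halving the window doubles the power. A minimal sketch, with an invented 200 MWh fleet (not a figure from the article):

```python
# A battery fleet's biddable power under a minimum-delivery-duration rule:
#   power (MW) = usable energy (MWh) / required continuous hours.
# The 200 MWh fleet size is invented for illustration.

fleet_energy_mwh = 200.0

def biddable_power_mw(energy_mwh: float, required_hours: float) -> float:
    """Max power the fleet can sustain for the full commitment window."""
    return energy_mwh / required_hours

four_hour = biddable_power_mw(fleet_energy_mwh, 4)  # current RA rule
two_hour = biddable_power_mw(fleet_energy_mwh, 2)   # proposed shorter window

print(f"4-hour commitment: {four_hour:.0f} MW")  # 50 MW
print(f"2-hour commitment: {two_hour:.0f} MW")   # 100 MW
```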
Letting behind-the-meter assets charge up the grid

Behind-the-meter resources also can’t send energy back to the grid since demand response programs are built for load reduction, not energy export. Changing those rules could allow them to play a much more aggressive role, Ko said.

Stem sought emergency authorization from utilities to export from its batteries last week, but the company "couldn’t make that happen fast enough,” he said. “We don’t have the constructs in place to dispatch and get paid for DERs to help out in these emergency situations.”

Allowing behind-the-meter energy export would require reforms to CAISO’s Proxy Demand Resource and Reliability Demand Response Resource programs that organize much of the state’s utility demand response activity, he explained. CAISO, the CPUC and the California Energy Commission are planning a joint workshop to discuss the potential to provide RA credit to “hybrid” solar-storage behind-the-meter resources, a proceeding over which the energy-export issue is likely to loom large, he said.

There’s a lot of potential value being left on the table, Ko said. According to research from Stem, solar installer Sunrun and software startup Station A, the state’s latent solar-storage capacity could provide about 9 gigawatts of grid capacity under ideal conditions.

California’s nearly 20-year-old RA construct faces major changes to adapt to a future of increasing levels of intermittent clean energy. Key questions include how much to rely on out-of-state imports and how to enlist energy storage to replace a declining natural-gas power plant fleet.

But how to tap California's fast-growing behind-the-meter capacity will also be an important part of the solution, said Ben Kellison, director of grid research at Wood Mackenzie. One big question will be determining “how much of the emergency capacity was called through market mechanisms versus how much just showed up out of a sense of societal obligation or goodwill” — and whether policies need to change in order to pull some of the latter category into the former.
AUSTRALIA
'Orwellian': Coalition accused of planning to open green bank to fossil fuel investments

Angus Taylor introduces bill to give Clean Energy Finance Corporation $1bn for ‘grid reliability fund’

Angus Taylor flanked by Scott Morrison at a Snowy 2.0 press conference. The CEFC’s definition of low-emissions technology would be expanded to include gas under a bill the minister has brought before parliament. Photograph: Mick Tsikas/AAP

Adam Morton Environment editor
THE GUARDIAN
Fri 28 Aug 2020

The Morrison government has been accused of planning to open up the taxpayer-owned green bank to fossil fuel investments through a change that explicitly defines gas-fired power as a “low-emissions technology”.

The energy and emissions reduction minister, Angus Taylor, introduced legislation to parliament on Thursday that would give the Clean Energy Finance Corporation an additional $1bn funding for a “grid reliability fund”.

Taylor said the new fund would be used to encourage private investment in infrastructure needed to balance electricity generation from solar and windfarms – battery and pumped hydro storage, “dispatchable” power plants that can be ramped up as needed, transmission projects and technologies that stabilise the grid.




It would also allow the CEFC to administer a long-promised underwriting program for new electricity generation announced before last year’s federal election. The government has released a shortlist of 12 projects being considered for support through the controversial program – six renewable energy, five gas and one coal.

Under Taylor’s proposed legislative changes, the CEFC’s definition of low-emissions technology would be expanded to include any projects that “support the achievement of low emission energy in Australia”. An explanatory memorandum says this could include “certain types of gas-fired electricity generation” but coal would not qualify.

The bill would exempt the grid reliability fund from a CEFC requirement that at least 50% of funding must go to renewable energy. The fund would also be freed of the rule that each individual CEFC investment has to deliver a return for taxpayers, though the fund would still need to deliver a return overall.

Taylor told parliament: “To ensure we can continue to be a world leader, we must back more grid reliability investments, such as flexible gas generators or significant pumped hydro projects.”

The Australian Energy Market Operator last month found additional gas-fired power was an option, but not essential, for an electricity grid increasingly based on renewable energy, and gas prices would need to stay at lower levels than expected if it was to compete with pumped hydro, batteries and other alternatives.




Labor’s climate and energy spokesman, Mark Butler, said the opposition was yet to form a view on the proposal but criticised the government for previously trying to abolish the CEFC.

“Labor established the CEFC, we have consistently protected the integrity of the CEFC as a renewable energy financing body, and we will continue to do so,” he said.

The Greens leader, Adam Bandt, said the proposal to define gas as low emissions was “Orwellian”.

“This is nothing more than a shameless attempt to bankroll gas corporations with public money meant for renewables,” he said. “The Greens are calling on Labor to join us in the fight against a gas-loving government that is hurtling us towards climate catastrophe.”

The CEFC was created in 2012 under a deal between Labor, the Greens and independents with a mandate to invest in renewable energy, low-emissions technology and energy-efficiency projects that would deliver a return.

Introducing the legislation to create it in 2012, the then climate change minister, Greg Combet, said gas may “technically be eligible for funding as a low-emissions technology” but was not expected to win backing because it already had a track record of attracting finance.

Gas is often described as having about half the emissions of coal, but studies have suggested the true figure could be higher due to leakage of methane, a particularly potent greenhouse gas. In addition to being burned to create electricity, gas is used as a feedstock in some manufacturing industries.
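The effect of leakage can be roughed out with a back-of-the-envelope calculation. The parameter values below are ballpark assumptions of our own (a 55 percent efficient combined-cycle plant, methane's roughly 86x warming potential over 20 years), not figures from the studies mentioned; against coal's roughly 1 kg of CO2 per kilowatt-hour, even a few percent leakage erodes much of gas's advantage:

```python
# Rough CO2-equivalent footprint of gas-fired electricity, including
# upstream methane leaks. All parameter values are ballpark assumptions.

GWP20_CH4 = 86          # 20-year global warming potential of methane (approx.)
KWH_PER_KG_CH4 = 13.9   # lower heating value of methane, kWh thermal per kg
PLANT_EFFICIENCY = 0.55 # modern combined-cycle gas turbine (approx.)
CO2_PER_KG_CH4 = 2.75   # kg of CO2 released when 1 kg of methane is burned

def gas_co2e_per_kwh(leak_rate: float) -> float:
    """kg CO2e per kWh of electricity, counting leaked methane at GWP20."""
    ch4_burned = 1 / (KWH_PER_KG_CH4 * PLANT_EFFICIENCY)   # kg CH4 per kWh
    ch4_leaked = ch4_burned * leak_rate / (1 - leak_rate)  # upstream losses
    return ch4_burned * CO2_PER_KG_CH4 + ch4_leaked * GWP20_CH4

for leak in (0.0, 0.025, 0.05):
    print(f"{leak:4.1%} leakage: {gas_co2e_per_kwh(leak):.2f} kg CO2e/kWh")
```

Under these assumptions, combustion alone comes to roughly 0.36 kg CO2 per kWh, but a 5 percent leak rate pushes the 20-year-equivalent figure close to coal's.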

The prime minister, Scott Morrison, this week told parliament that expanding gas supply was critical to Australia’s economic recovery from the coronavirus pandemic. His handpicked National Covid-19 Coordinating Commission has recommended the government underwrite new gas infrastructure.



Richie Merzian, the climate and energy program director at the Australia Institute, said the proposed amendments would transform the CEFC “from an explicitly profit-making investor in renewable energy projects for Australia to a potentially loss-making underwriter of fossil fuel projects”.

He said investment in renewable energy was falling dramatically and the government should be doubling down on its mandate to support it. “How the government will justify its amendments that ensure CEFC is no longer obligated to fund at least 50% of its projects from renewable energy is beyond me,” he said.

Jonathan Moylan, a Greenpeace campaigner, accused the government of trying to give gas an unfair competitive advantage over batteries and renewable energy, despite the latter being cheaper. “Instead of making money from fossil fuels, the government is seriously considering handing them public money and not expecting anything in return,” he said.

Facebook says Apple censored an update that mentioned the 30% App Store commission for in-app purchases
Apple CEO Tim Cook (left) and Facebook CEO Mark Zuckerberg. AP; Francois Mori/AP

Facebook told Reuters that Apple has blocked an update to its iOS app containing a message about the 30% commission Apple takes on in-app purchases.
Facebook said Apple had blocked the update on the grounds it contained "irrelevant" information.
Facebook launched a new tool allowing users to buy tickets to virtual events in-app, and earlier this month asked Apple to waive its usual 30% commission. Apple declined.

Facebook's fight with Apple over the App Store's 30% commission on in-app purchases has reached new heights, after the social media giant accused the iPhone maker of censorship.

Facebook told Reuters on Thursday that Apple had blocked it from issuing an iOS app update that contained a message about the fee Apple charges.

The message was supposed to accompany a new Facebook tool that allows users to buy tickets for online events inside the app. Facebook said the update, which contained a "transparency notice" telling users that 30% of their payment goes to Apple, was rejected by Apple because it contained "irrelevant" information for users.

"Now more than ever, we should have the option to help people understand where money they intend for small businesses actually goes. Unfortunately Apple rejected our transparency notice around their 30% tax but we are still working to make that information available inside the app experience," Facebook said in a statement to Business Insider.
The message on Facebook's app would have looked like this. Facebook/Business Insider.

Apple declined to comment to Reuters, and was not immediately available to comment when contacted by Business Insider.



On August 14, Facebook said it had asked Apple to waive its usual 30% fee for the new tool so that all the money could be passed on to small business owners hosting paid events, but Apple declined.

Facebook isn't the only big company to lock horns with Apple over its in-app purchases levy. The tech giant is in a legal battle with "Fortnite" creator Epic Games after it removed "Fortnite" from the App Store for circumventing its payment system.

The 30% commission is also currently the subject of an antitrust investigation in the EU after music-streaming giant Spotify filed a complaint last year, claiming the fee allows Apple to artificially inflate prices while launching its own competing apps.

Tesla employee offered US$1-million bribe helps FBI arrest Russian hacker

Hackers' scheme to hold Tesla's private data for ransom exposed by the employee they contacted

by DAVID BOOTH | 14 HOURS AGO




The dashboard interior of the semi-autonomous 2018 Tesla Model 3, photographed in April 2018 at the Consumer Reports Auto Test Track in Colchester, Conn.


There are, it seems, people better than you and me. To wit—

According to ClearanceJobs, a recruiting agency out of Urbandale, Iowa, that specializes in employees with American federal government security clearances, a Tesla employee turned down an offer of US$1 million to help some Russian hackers infiltrate the company’s computer system.

It all started in mid-July, when Egor Igorevich Kriuchkov, a 27-year-old Russian citizen in the United States on a travel visa, contacted a Russian-speaking foreign national – the employee’s name has not been divulged – working in Sparks, Nevada. Along the way, the employee was wined, dined and then provided with – shades of CSI – a burner phone for secret communications.


The plan, according to the U.S. Department of Justice, was to get the employee “to introduce malware – i.e., malicious software programs designed to damage or do other unwanted actions on a computer system – into the company’s computer network” that would allow Kriuchkov and his co-conspirators access to the company’s system, letting them download data and threaten to make it public. In other words, a typical ransomware scam.

Instead, the employee went directly to Tesla, which subsequently alerted the FBI. Wearing a wire – are we sure this isn’t going to be made into a movie? – the employee recorded their interactions with Kriuchkov, who was arrested on August 22 as he attempted to flee the country via the Los Angeles airport. He was charged, according to the American DOJ, with “one count of conspiracy to intentionally cause damage to a protected computer.”


What’s possibly most interesting about this subterfuge is that despite all the hoopla about the susceptibility of automobile companies to over-the-air hacking, this is yet another exploit that (would have) required hands-on intervention to succeed. Like the famous case of Jeep hacking by Charlie Miller and Chris Valasek, Kriuchkov needed a physical connection to the main computer before he could start spoofing the network.

I’m not enough of a computer geek to know whether that is comforting at all, but I can say it frightens the h-e-double-hockey-sticks out of me to know the safety of our networks may depend on people being righteous enough to turn down a million-dollar bribe.

Effectiveness of cloth masks depends on type of covering


by American Institute of Physics


 
Schlieren images of coughs with varying degrees of face covering. Credit: Padmanabha Prasanna Simha, Indian Space Research Organisation

Months into the COVID-19 pandemic, wearing a mask while out in public has become the recommended practice. However, many still question its effectiveness.


To allay these doubts, Padmanabha Prasanna Simha, from the Indian Space Research Organisation, and Prasanna Simha Mohan Rao, from the Sri Jayadeva Institute of Cardiovascular Sciences and Research, experimentally visualized the flow fields of coughs under various common mouth covering scenarios. They present their findings in the journal Physics of Fluids.

"If a person can reduce the extent of how much they contaminate the environment by mitigating the spread, it's a far better situation for other healthy individuals who may enter places that have such contaminated areas," Simha said.

Density and temperature are intricately related, and coughs tend to be warmer than their surrounding area. Tapping into this connection, Simha and Rao utilized a technique called schlieren imaging, which visualizes changes in density, to capture pictures of voluntary coughs from five test subjects. By tracking the motion of a cough over successive images, the team estimated velocity and spread of the expelled droplets.

Unsurprisingly, they found N95 masks to be the most effective at reducing the horizontal spread of a cough. The N95 masks reduced a cough's initial velocity by up to a factor of 10 and limited its spread to between 0.1 and 0.25 meters.

An uncovered cough, in contrast, can travel up to 3 meters, but even a simple disposable mask can bring this all the way down to 0.5 meters.
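Turning the distances reported here into reduction factors makes the comparison concrete. The distances in the sketch below are those quoted above; the "factor" is simply the uncovered spread divided by the masked spread:

```python
# Reduction in horizontal cough spread, using distances quoted in the
# article (meters). The reduction factor is uncovered / masked distance.
spread_m = {
    "uncovered": 3.0,
    "disposable mask": 0.5,
    "N95 (worst case)": 0.25,
    "N95 (best case)": 0.1,
}

baseline = spread_m["uncovered"]
for covering, dist in spread_m.items():
    factor = baseline / dist
    print(f"{covering:18s} {dist:4.2f} m  ({factor:.0f}x reduction)")
```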

"Even if a mask does not filter out all the particles, if we can prevent clouds of such particles from traveling very far, it's better than not doing anything," said Simha. "In situations where sophisticated masks are not available, any mask is better than no mask at all for the general public in slowing the spread of infection."

Some of the other comparisons, however, were striking.

For example, coughing into an elbow is typically considered a good alternative in a pinch, but the pair's findings contradict this. Unless covered by a sleeve, a bare arm cannot form the seal against the nose and mouth needed to obstruct airflow. A cough is then able to leak through any openings and propagate in many directions.

Simha and Rao hope their findings will put to rest the argument that regular cloth masks are ineffective, but they emphasize that masks must continue to be used in conjunction with social distancing.

"Adequate distancing is something that must not be ignored, since masks are not foolproof," Simha said.


More information: "Universal trends in human cough airflows at large distances," Physics of Fluids (2020). DOI: 10.1063/5.0021666
Journal information: Physics of Fluids


Provided by American Institute of Physics
A new chess film: Critical Thinking
by André Schulz

8/27/2020 –

A new chess film will be released soon: Critical Thinking. Directed by John Leguizamo, the film tells the story of the Miami Jackson High School chess team, which, motivated by a dedicated teacher and despite incredible adversity, became the first inner-city team to win the US National Chess Championship.


Based on a true story

John Leguizamo was born July 22, 1964, in Bogotá, Colombia, to Luz and Alberto Leguizamo. He was four when his family emigrated to the United States. He was raised in Queens, New York, attended New York University and studied under legendary acting coach Lee Strasberg for only one day before Strasberg passed away. The extroverted Leguizamo started working the comedy club circuit in New York and first appeared in front of the cameras in an episode of Miami Vice (1984).

In 1998, he made his Broadway debut in John Leguizamo: Freak, a “demi-semi-quasi-pseudo-autobiographical” one-man show, which was filmed for HBO by Spike Lee.
Excerpt from “Freak”


Fast-talking and feisty-looking John Leguizamo has continued to impress film audiences with his versatility: he can play sensitive and naive young men, such as Johnny in Hangin' with the Homeboys (1991); cold-blooded killers like Benny Blanco in Carlito's Way (1993); a heroic Army Green Beret, stopping aerial terrorists in Executive Decision (1996); and drag queen Chi-Chi Rodriguez in To Wong Foo Thanks for Everything, Julie Newmar (1995). Arguably, not since ill-fated actor and comedian Freddie Prinze starred in the smash TV series Chico and the Man (1974) has a youthful Latino personality had such a powerful impact on critics and fans alike.

John Leguizamo makes his directorial feature film debut with a story that hits close to his heart. Set in 1998, Critical Thinking tells the true story of Cuban-American teacher Mario Martinez and his national championship winning chess team at Miami Jackson High School.

The school receives little backing, and the team has to manage without adequate financial support. Nevertheless, the team enters the municipal school championships hoping to qualify for the US National School Chess Championship. The students come mostly from socially disadvantaged families, but they are passionately motivated by dedicated teacher Mario Martinez, played by John Leguizamo himself. On the way to success, the team has to break with numerous conventions.



Mario Martinez, played by John Leguizamo

The film is based on a true story and takes place in 1998. The people on whom the story is based were involved in the production as consultants.


André Schulz started working for ChessBase in 1991 and is an editor of ChessBase News.




Chess: Garry Kasparov and Magnus Carlsen to meet for first time in 16 years



Kasparov, 57, and Carlsen, 30, will compete at random chess in the 10-player St Louis Champions Showdown, their first encounter since 2004

3686: Oscar Panno v Garry Kasparov, Buenos Aires 1992. All-time legend Kasparov defeated Argentina’s top player. Can you visualise Black’s winning sequence, several moves deep but virtually all forced?


Leonard Barden
Fri 28 Aug 2020
Garry Kasparov will make a rare cameo appearance when the legend, now aged 57, takes on the reigning champion, Magnus Carlsen, in the 10-player Champions Showdown invitation organised by St Louis from 11-13 September.

This will be an historic clash, even though it is only online random chess. The two world champions, widely considered the best players of all time, have faced each other in only one previous official event. That was at rapid and blitz chess in 2004 in Reykjavik, when Carlsen was aged 13 and Kasparov 41, a year before his retirement.




Carlsen, understandably nervous, was crushed in the first and second games but was pressing in the drawn third.

There is more history between the two icons of modern chess. In summer 2009 the Russian briefly became the Norwegian’s coach, but the clash between Carlsen’s laid-back attitude and Kasparov’s intensity meant they soon parted ways. The informal blitz games they played then were ultra-competitive. As Carlsen put it: “Neither of us likes losing, him especially.”

Since 2017 the 1985-2000 world champion has taken part only in games using FischerRandom, also known as Chess960 or Chess9LX, where a computer makes a random choice of the back row starting array.

Besides Kasparov and Carlsen, the field includes America’s world No 2, Fabiano Caruana, who defeated Kasparov 5-1 at blitz in 2019; the US champion, Hikaru Nakamura, whose match with Carlsen last week drew record audiences; and the prodigy Alireza Firouzja, 17, whom Kasparov has never met.

China, the 2018 Olympiad double gold medallists and tournament favourites for the current 163-nation online version, were knocked out 7-6 by Ukraine on Thursday after a match that will leave their team selectors with some serious questions to answer in Beijing.

The knockout format was a double-round, six-board match in which a 6-6 tie would be broken by a single Armageddon game. China were lucky to score 3-3 in the first round, where Natalia Zhukova drew by perpetual check in a winning position against the women's world champion, Ju Wenjun.

For the second round China dropped both Ju and the world No 1, Hou Yifan; their replacements scored only half a point, the match went to 6-6, and Ukraine won the Armageddon game. Earlier, Hungary v Germany also went to Armageddon, and the German player lost on time with the Hungarian having just three seconds left. In Armageddon, White has five minutes on the clock to Black’s four, but a draw on the board counts as a win for Black in the scores.
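The tiebreak rule described above (a drawn Armageddon game counts as a win for Black) can be encoded directly. A minimal sketch with invented function names and result encodings, not anything from the event's own systems:

```python
# Armageddon tiebreak: White gets 5 minutes to Black's 4, but a draw
# on the board counts as a match win for Black.
WHITE_MINUTES, BLACK_MINUTES = 5, 4

def armageddon_winner(board_result):
    """board_result: 'white', 'black', or 'draw' (over-the-board outcome).
    A draw scores as a win for Black."""
    return "white" if board_result == "white" else "black"

def knockout_winner(team_a_points, team_b_points, armageddon_result=None):
    """Double-round, six-board match: 12 game points in total.
    A 6-6 tie goes to a single Armageddon game (team A as White here,
    purely as an assumption for the sketch)."""
    if team_a_points > team_b_points:
        return "A"
    if team_b_points > team_a_points:
        return "B"
    return "A" if armageddon_winner(armageddon_result) == "white" else "B"

print(knockout_winner(6, 6, "draw"))  # Black's team wins a drawn Armageddon
```

This is why Armageddon guarantees a decisive match result: every over-the-board outcome, including a draw, maps to exactly one winner.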

Results: Hungary 7-6 Germany. Ukraine 7-6 China, Armenia 8-4 Greece, Poland 7.5-4.5 Bulgaria. Friday’s quarter-finals: Russia v Hungary, United States v Ukraine, India v Armenia, Azerbaijan v Poland.

England were at full strength and scored a shock win over the strong Armenians, but finished only sixth in their group as they lost 5-1 to Russia, Bulgaria and Croatia while struggling to beat weaker teams. The women’s boards did well, though three of the four juniors had minus scores overall while the normally reliable Michael Adams and Luke McShane dropped points in critical matches.




Some of the best Olympiad moments have been where strong grandmasters won elegantly against lesser lights. One such creative performance came from Alexey Shirov, author of the classic Fire on Board, who qualified for a world title match with Kasparov in 1998 but missed out due to lack of financial support. Shirov’s attacking play is a pleasure to watch.

Wolfgang Uhlmann, who died on Monday aged 85, was the former East Germany’s best player and a world title candidate in 1971. His opening repertoire was strikingly narrow for a top grandmaster. As Black, he almost always played the French (1 e4 e6 2 d4 d5) and the King’s Indian (with Nf6, g6 and Bg7).

A vintage Uhlmann performance came in 1958 when the biennial Olympiad was held in Munich, then in West Germany, and Uhlmann led his young GDR team to a 3.5-0.5 victory over the host nation. I was present in the playing hall that day and remember the stunned silence from the large patriotic audience. True to himself, Uhlmann won against Wolfgang Unzicker with a classically styled King’s Indian.

3686: 1...Re1+ 2 Kf2 Qh4+ 3 g3 Qxh3 4 Rxe1 Qh2+ 5 Kf1 Qh1+ 6 Qg1 Rxe1+ and wins.
Will automation eliminate data science positions?
Michael Li (@tianhuil) / 1:00 pm MDT • August 27, 2020

Image Credits: Westend61 / Getty Images
Michael Li, Contributor

Tianhui Michael Li is founder of The Data Incubator, an eight-week fellowship to help PhDs and postdocs transition from academia into industry. Previously, he headed monetization data science at Foursquare and has worked at Google, Andreessen Horowitz, J.P. Morgan and D.E. Shaw.
More posts by this contributor

What’s different about hiring data scientists in 2020?
Five building blocks of a data-driven culture

“Will automation eliminate data science positions?”

This is a question I’m asked at almost every conference I attend, and it usually comes from someone from one of two groups with a vested interest in the answer: The first is current or aspiring practitioners who are wondering about their future employment prospects. The second consists of executives and managers who are just starting on their data science journey.

They have often just heard that Target can determine whether a customer is pregnant from her shopping patterns and are hoping for similarly powerful tools for their data. And they have heard the latest automated-AI vendor pitch that promises to deliver what Target did (and more!) without data scientists. We argue that automation and better data science tooling will not eliminate or even reduce data science positions (including for use cases like the Target story); they will create more of them.

Here’s why.

Understanding the business problem is the biggest challenge

The most important question in data science is not which machine learning algorithm to choose or even how to clean your data. It is the set of questions you need to ask before a single line of code is written: What data do you choose, and what questions do you choose to ask of that data?

What is missing (or wishfully assumed) from the popular imagination is the ingenuity, creativity and business understanding that goes into those tasks. Why do we care if our customers are pregnant? Target’s data scientists had built upon substantial earlier work to understand why this was a lucrative customer demographic primed to switch retailers. Which datasets are available and how can we pose scientifically testable questions of those datasets?

Target’s data science team happened to have baby registry data tied to purchasing history and knew how to tie that to customer spending. How do we measure success? Formulating nontechnical requirements into technical questions that can be answered with data is amongst the most challenging data science tasks — and probably the hardest to do well. Without experienced humans to formulate these questions, we would not be able to even start on the journey of data science.
Making your assumptions

After formulating a data science question, data scientists need to outline their assumptions. This often manifests itself in the form of data munging, data cleaning and feature engineering. Real-world data are notoriously dirty and many assumptions have to be made to bridge the gap between the data we have and the business or policy questions we are seeking to address. These assumptions are also highly dependent on real-world knowledge and business context.

In the Target example, data scientists had to make assumptions about proxy variables for pregnancy, realistic time frame of their analyses and appropriate control groups for accurate comparison. They almost certainly had to make realistic assumptions that allowed them to throw out extraneous data and correctly normalize features. All of this work depends critically on human judgment. Removing the human from the loop can be dangerous as we have seen with the recent spate of bias-in-machine-learning incidents. It is perhaps no coincidence that many of them revolve around deep learning algorithms that make some of the strongest claims to do away with feature engineering.
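The human judgment described above typically shows up in code as hand-written proxy features. A minimal, purely illustrative sketch; the product names, proxy logic, and function name are invented for this example and are not Target's actual model:

```python
# Illustrative proxy-variable feature engineering: score customers by
# how many "proxy" items appear in their purchase history. Products
# and the scoring rule are invented, encoding a human analyst's
# assumption about which purchases signal the demographic of interest.
PROXY_ITEMS = {"prenatal vitamins", "unscented lotion", "cotton balls"}

def proxy_score(purchases):
    """Fraction of proxy items present in a customer's purchase history."""
    return len(PROXY_ITEMS & set(purchases)) / len(PROXY_ITEMS)

history = ["unscented lotion", "prenatal vitamins", "bread"]
print(round(proxy_score(history), 2))  # 0.67 for this sample history
```

Note that everything interesting here, which items count as proxies and how they combine into a score, is a human assumption baked into the code, which is precisely the part an automated tool cannot supply.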

So while parts of core machine learning are automated (in fact, we even teach some of the ways to automate those workflows), the data munging, data cleaning and feature engineering (which comprise 90% of the real work in data science) cannot be safely automated away.
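The automatable core that the paragraph above contrasts with human judgment is essentially a mechanical search over models and hyperparameters. A minimal sketch of that idea in plain Python; the toy score() function is an invented stand-in for training and evaluating a real model:

```python
from itertools import product

# The mechanical, automatable part of ML: exhaustive hyperparameter
# search. score() stands in for "train a model and evaluate it";
# its peak at learning_rate=0.1, depth=3 is contrived for the demo.
def score(learning_rate, depth):
    return -(learning_rate - 0.1) ** 2 - (depth - 3) ** 2

grid = {"learning_rate": [0.01, 0.1, 1.0], "depth": [1, 3, 5]}
best = max(
    (dict(zip(grid, values)) for values in product(*grid.values())),
    key=lambda params: score(**params),
)
print(best)  # {'learning_rate': 0.1, 'depth': 3}
```

A loop like this is easy to automate precisely because it involves no judgment; deciding what "score" should measure, and on which data, is the part that still requires a human.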
A historical analogy

There is a clear precedent in history to suggest data science will not be automated away. There is another field where highly trained humans are crafting code to make computers perform amazing feats. These humans are paid a significant premium over others who are not trained in this field and (perhaps not surprisingly) there are education programs specializing in training this skill. The resulting economic pressure to automate this field is equally, if not more, intense. This field is software engineering.

Indeed, as software engineering has become easier, the demand for programmers has only grown. This paradox is not new: automation increases productivity, driving down prices and ultimately driving up demand. We've seen it again and again in fields ranging from software engineering to financial analysis to accounting. Data science is no exception, and automation will likely drive up demand for this skill set, not down.

Repeating radio signal from space fires back up on predicted schedule

By Michael Irving
August 25, 2020

A strange repeating radio signal from space has now been found to have a cycle
Kristi Mickaliger

Back in June, astronomers discovered a hidden pattern within seemingly random radio signals from space. Based on years of data, it was predicted that the next bursts of activity should flare up around August – and now those signals have come through, right on schedule. The discovery could help us unravel the mystery of these Fast Radio Bursts (FRBs).

FRBs are pulses of radio signals that throw off incredible amounts of energy within milliseconds, often as one-off events. But a select few have been found to repeat at random – or at least, we thought it was random.

Earlier this year, astronomers discovered a periodic pattern hidden within the rumblings of a repeating radio source called FRB 121102. By analyzing 32 bursts from this repeater over four years, a team found that FRB 121102 follows a 157-day cycle: it flares up into a 90-day period of activity before falling silent for 67 days.
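The cycle described above (90 active days followed by 67 quiet ones, 157 in total) reduces to simple modular arithmetic. A sketch assuming days are counted from the start of an active phase; that choice of epoch is an assumption for illustration, not taken from the study:

```python
# FRB 121102's reported cycle: 157 days total, with the source active
# for the first 90 days of each cycle (epoch assumed to be the start
# of an active window).
CYCLE_DAYS = 157
ACTIVE_DAYS = 90

def is_active(days_since_epoch):
    """True if the source is in its active phase on the given day."""
    return days_since_epoch % CYCLE_DAYS < ACTIVE_DAYS

print(is_active(45))   # mid active window -> True
print(is_active(120))  # 30 days into the quiet phase -> False
```

Updating the constants to the revised values reported later in the article (156.1-day period, 99 active days) changes the numbers but not the logic.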

In May when this previous study was published, FRB 121102 was in the quiet phase of this cycle, and the researchers predicted that, based on their data, its next active phase should be between June 2 and August 28, 2020.


And lo and behold, the FRB has sprung back to life, right on schedule. Using the Five-hundred-meter Aperture Spherical radio Telescope (FAST) in China, astronomers observed the location of the signals regularly between March and August. No detections were made until August 17, when FRB 121102 gave off at least 12 bursts in the space of two hours.

This new finding does lend weight to the idea of periodicity in FRB 121102’s emissions, but the exact details of its cycle are still hazy. By combining data from several studies, the team suggests that its cycle actually runs for 156.1 days, with the source active for 99 of those days. Following this new schedule, the team predicts that the object should "turn off" again around August 31 to September 9.

If it continues to fire off bursts after that time, the researchers say this could suggest that the period has either evolved, or wasn’t “real” in the first place.


Other scientists have come to different conclusions about FRB 121102’s periodicity. A team led by Marilyn Cruces at the Max Planck Institute found that the cycle lasts around 161 days, and the source should be active between July 9 and October 14, 2020. After that, it should remain quiet until December 17, when it enters a new active phase that lasts until March 24, 2021.

Time – and future observations – will tell which, if any, of these predictions come true. Either way, FRB 121102 will be watched closely over the next few months, in the hopes that it could give up new clues to what might be behind these strange signals.

The Cruces team's work is available as a preprint on arXiv.

Source: The Astronomer’s Telegram via Science Alert