Friday, February 17, 2023

Reducing your electric bill with a predictive control heating system

The Norwegian University of Science and Technology (NTNU) has tested a system to predict the heating needs on its Gløshaugen campus, showing that surplus heat can be used even more efficiently by predicting consumption.

Peer-Reviewed Publication

NORWEGIAN UNIVERSITY OF SCIENCE AND TECHNOLOGY

Predictive modeling helps cut district heating inefficiencies 

Image: Professor Natasa Nord from the Norwegian University of Science and Technology (NTNU) is doing research on increasing the efficiency of control systems for heating, including the type used on the university's Gløshaugen campus.

Credit: Geir Mogen/NTNU

A good deal of research shows that surplus heat, such as the heat from data centres, lends itself well to use as a heat source. But we have only limited knowledge about how the control systems that manage this heat can save us even more energy and money.

Researchers Juan Hou, Haoran Li and Natasa Nord from the Norwegian University of Science and Technology (NTNU) wanted to do something about that. They have presented their results in a recent research article.

The researchers used NTNU’s Gløshaugen campus as a test case for a Model Predictive Control (MPC) scheme.

NTNU is basking in the heat of mathematical calculations

First, a little information about how the buildings that researchers and students at Gløshaugen spend time in are heated.

On some days of the year, the surplus heat from the supercomputers at Gløshaugen is enough to heat the entire campus. Cold water is piped to the supercomputers to cool them down, and the heated water is then distributed to wherever it is needed elsewhere on campus.

In winter, the district heating network also supplies heat, but on the warmest days the supercomputers can handle all the campus heating needs.

The water-borne heat from the district heating network, the surplus heat from the computers, and other heating equipment using heat pumps all need to be controlled by a common system. This control system is what the researchers believe can become even more effective.

Introducing a “fortune-telling system” to save even more

“Making predictions is actually completely different from divination. Whereas a fortune teller rubs a ball and receives her prediction out of thin air, MPC is research based,” says Natasa Nord, a professor at NTNU's Department of Energy and Process Engineering.

The Model Predictive Control (MPC) scheme is a control model filled with a lot of data about the specific heating system and includes everything from previous local weather reports to building temperatures throughout the year to the various components that provide surplus heat. When all the data is fed into the system, it becomes a data-based and accurate fortune-telling ball.

Nord explains that traditional control systems are rule-based: they apply the same template to very different buildings and components. The MPC technique, in contrast, is tailor-made and knows the local conditions inside and out. The researchers took data for each month in 2017-2018 as their starting point and tested MPC via simulations. They then compared the simulation results with actual electricity use and electric bills.
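To make the idea concrete, here is a minimal sketch in Python of one step of a receding-horizon predictive controller for a heated building. It assumes a deliberately simple one-node thermal model with invented parameters, prices and weather forecast; it is not the model the NTNU researchers used, only an illustration of how MPC trades forecast energy prices off against comfort constraints.

```python
# Minimal MPC sketch for a heated building (illustrative parameters only).
import numpy as np
from scipy.optimize import linprog

H = 24                          # prediction horizon: 24 hourly steps
a, b, c = 0.9, 0.5, 0.1         # assumed dynamics: T[k+1] = a*T[k] + b*u[k] + c*T_out[k]
T0 = 20.0                       # current indoor temperature (deg C)
T_min, T_max = 19.0, 23.0       # comfort band (deg C)
u_max = 5.0                     # maximum heat input per step (arbitrary units)
price = np.array([0.3] * 7 + [0.8] * 10 + [0.5] * 7)   # invented hourly energy prices
T_out = np.full(H, -5.0)                                # invented outdoor forecast (deg C)

# Predicted indoor temperature is affine in the heat inputs: T[k] = free[k] + G[k, :] @ u
free = np.empty(H)
G = np.zeros((H, H))
prev = T0
for k in range(H):
    free[k] = a * prev + c * T_out[k]   # zero-heating (free) response
    prev = free[k]
    for j in range(k + 1):
        G[k, j] = b * a ** (k - j)      # influence of heat input u[j] on T[k]

# Comfort constraints T_min <= T[k] <= T_max, written as A_ub @ u <= b_ub
A_ub = np.vstack([G, -G])
b_ub = np.concatenate([T_max - free, free - T_min])

# Minimise the predicted energy cost over the horizon
res = linprog(c=price, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, u_max)] * H)
if res.success:
    print("Heat input to apply this hour:", round(res.x[0], 2))
```

In a real deployment only the first control move is applied; the horizon is then shifted forward one step and the optimisation is repeated with fresh weather and price forecasts.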



Small savings percentage can add up when prices are high

The testing revealed that it was possible to save 1.8 per cent in energy costs per month with MPC. Nord points out that this figure was based on the 2021 price model.

“We can assume that the effect of the power savings would be even greater today, although I wouldn’t speculate on how much. Today, cost schemes use different price models for night- and daytime energy use. In any case, our test shows that with MPC we can maximize financial savings for heat consumers like the Gløshaugen campus. The results also show that MPC was more stable and robust than the traditional control system, which is a positive factor for system safety,” says Nord.
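As a rough illustration of what a saving of that size means in practice, the toy calculation below applies the reported 1.8 per cent to an invented monthly heating cost; the article does not give the campus’s actual bill.

```python
# Back-of-the-envelope illustration of the reported 1.8% monthly saving.
# The monthly cost below is invented; the article gives no actual figure.
monthly_heating_cost_nok = 500_000                 # hypothetical cost in NOK
mpc_saving_nok = 0.018 * monthly_heating_cost_nok
print(f"Estimated MPC saving: {mpc_saving_nok:,.0f} NOK per month")
```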

Can also be used for systems with other types of excess heat

Not many places in Norway use surplus heat from supercomputers for heating.

“MPC could just as well have been used with other sources of surplus heat, such as cooling in the food industry,” says Nord.

Surplus heat comes not only from heat-based industry but also from cooling processes. A current project at the Leangen ice rink in Trondheim is attempting to utilize the surplus heat from its cooling facility, and many food industries also require refrigeration.

Nord explains that one challenge in introducing MPC lies in getting the different systems to talk to each other.

“Acquiring good data that we can build into the programming can be challenging. Some components have protocols that we struggle to communicate with. They require some work to adapt the programming. But we’re continuously working on implementation, and this is something that researchers at NTNU can help other players with,” she says.

The research in this article was supported through the Research Council of Norway’s FRIPRO/FRINATEK programme.

Reference: Hou, Juan, Li, Haoran & Nord, Natasa (2022). Model predictive control for a data centre waste heat-based heat prosumer in Norway. E3S Web of Conferences, 362. DOI: 10.1051/e3sconf/202236206007.

Garfield Weston Foundation funding adds new habitats to Space4Nature wildlife protection project 

Grant and Award Announcement

UNIVERSITY OF SURREY

Nature in Surrey’s wetlands, urban areas and urban fringes will benefit from a grant of £200,000 from the philanthropic Garfield Weston Foundation.  

This donation will allow scientists at the University of Surrey and ecologists at the Surrey Wildlife Trust, Buglife and Painshill Park Trust to expand their trailblazing Space4Nature work, which combines satellite earth observation, artificial intelligence (AI) and citizen science to boost biodiversity in Surrey.

The new funding follows the £1.25m Dream Fund 2022 award, thanks to players of People’s Postcode Lottery. This means a wider range of habitats can be studied, adding Surrey’s towns and wetlands to the grassland, heaths, chalk downland and woodland already being researched through Space4Nature. 

Professor Richard Murphy, Director of the Centre for Environment and Sustainability, University of Surrey, said: 

“It’s fantastic we’re getting this grant to enable more types of habitats and more areas in Surrey to benefit. By combining information from local people exploring nature with AI and satellite images, we are developing a new way to protect and enhance nature, boosting biodiversity. 

“Garfield Weston’s support will mean we can look at wetlands, which are particularly sensitive to climate change, and urban and peri-urban areas, which are on the edges of Surrey’s many towns and villages. More funding means we can help more nature recover.” 

Andrew Jamieson, Space4Nature’s Project Manager at Surrey Wildlife Trust, said: 

“Citizen science will be core to the success of Space4Nature and its mission to connect habitats across Surrey, while connecting more people with nature. This new funding will directly benefit our wildlife and wild places, helping our communities benefit from a healthy natural environment – and making our landscapes more resilient against the effects of a changing climate.”   

Philippa Charles OBE, Director at the Garfield Weston Foundation, said: 

“Our Trustees were delighted to support the Space4Nature project, and the partnership between the University of Surrey, Surrey Wildlife Trust, Buglife and Painshill Park Trust.” 

This latest funding adds to the University of Surrey's global philanthropic The Future Says Surrey campaign, which aims to raise £60m between 2022 and 2025 to help Surrey continue making positive impacts on society and confirm it as one of the UK’s leading research and educational institutions.   

 

### 

 

Note to editors  

  

  • Richard Murphy is available for interview. 

  • For more information, contact the University of Surrey’s press office via mediarelations@surrey.ac.uk  

One-time tax items aren’t earnings misconduct

Nonrecurring income taxes reflect mostly economic causes, not management manipulation.

Peer-Reviewed Publication

UNIVERSITY OF TEXAS AT AUSTIN

When investors try to forecast a company’s future earnings by analyzing its current financial statements, a perplexing problem is how to interpret nonrecurring income taxes.

Those one-time tax items increase or reduce quarterly earnings. Should they be treated as unusual events, with no lasting impact? Or do they offer clues to possible corporate manipulation, like shifting tax expenses from one quarter to another to artificially boost earnings?

In new research from Texas McCombs, Dean and Accounting Professor Lillian Mills offers answers to such questions. She finds that most nonrecurring taxes stem from legitimate business causes rather than efforts to cook the books.

“Past research suggests that nonrecurring income taxes are driven by managerial opportunism,” Mills says. “But we found that they’re not associated with efforts to meet or beat analysts’ earnings forecasts. In general, they’re driven by economics, not by opportunism.”

With Dain Donelson of the University of Iowa and Colin Koutney of George Mason University, Mills analyzed 68,139 quarterly earnings reports from 2007 to 2017. Their results offer insights on what nonrecurring taxes mean — and don’t mean.

They’re not uncommon. Such taxes occurred in 6% of all company-quarters and in all industries. For larger businesses with more complex finances, that figure rose to 10%.

They have economic causes. The most frequent triggers of nonrecurring taxes are:

· Tax-related events, such as audit resolutions and changes in tax laws.

· Accounting changes, like repatriating profits or revaluing assets.

· Corporate restructurings, such as mergers.

They don’t affect future earnings. Nonrecurring income taxes had little predictive power for future profits.

They don’t affect future taxes. Nonrecurring taxes did not predict a company’s future tax rates or its likelihood of having to restate taxes it had previously filed.
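For readers who want to see the shape of such a test, here is a hypothetical earnings-persistence regression in Python. The data are synthetic and the specification is simplified; it is not the authors’ model, only an illustration of how one checks whether a nonrecurring tax item adds predictive power for future earnings.

```python
# Illustrative sketch (not the authors' code or data): does a nonrecurring
# tax item help predict next quarter's earnings beyond core earnings?
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000                                              # hypothetical firm-quarters
core_earnings = rng.normal(0.05, 0.02, n)             # recurring earnings, scaled by assets
nonrec_tax = rng.normal(0.0, 0.01, n) * (rng.random(n) < 0.06)   # ~6% incidence

# If nonrecurring taxes are transitory, they should add little beyond core
# earnings when forecasting future earnings (coefficient near zero).
future_earnings = 0.8 * core_earnings + rng.normal(0.0, 0.01, n)

X = sm.add_constant(np.column_stack([core_earnings, nonrec_tax]))
fit = sm.OLS(future_earnings, X).fit()
print(fit.params)    # slope on nonrec_tax is close to zero in this synthetic setup
print(fit.pvalues)
```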

“By better understanding nonrecurring income taxes, investors and analysts can make better earnings projections,” says Mills. “Likewise, researchers need to understand these transitory items when they’re analyzing a company’s tax picture or seeking signs of earnings manipulation.”

“Nonrecurring Income Taxes” is published in the Review of Accounting Studies.

Story by Steve Brooks

Receptor location matters for psychedelic drug effects

UNIVERSITY OF CALIFORNIA - DAVIS

Receptor Location Matters for Psychedelic Drug Effects 

Image: A cortical neuron (white) expressing serotonin 2A (5-HT2A) receptors (multicolor). New work shows that engaging 5-HT2A receptors inside cells, but not on the cell surface, encourages cell growth and formation of new connections.

Credit: David Olson/UC Davis

Location, location, location is the key for psychedelic drugs that could treat mental illness by rapidly rebuilding connections between nerve cells. In a paper published Feb. 17 in Science, researchers at the University of California, Davis show that engaging serotonin 2A receptors inside neurons promotes growth of new connections but engaging the same receptor on the surface of nerve cells does not. 

The findings will help guide efforts to discover new drugs for depression, PTSD and other disorders, said senior author David E. Olson, associate professor of chemistry, biochemistry and molecular medicine and director of the Institute for Psychedelics and Neurotherapeutics at UC Davis. 

Drugs such as LSD, MDMA and psilocybin show great promise for treating a wide range of mental disorders that are characterized by a loss of neural connections. In laboratory studies, a single dose of these drugs can cause rapid growth of new dendrites – branches – from nerve cells, and formation of new spines on those dendrites. 

Olson calls this group of drugs “psychoplastogens” because of their ability to regrow and remodel connections in the brain. 

Earlier work from Olson’s and other labs showed that psychedelic drugs work by engaging the serotonin 2A receptor (5-HT2AR). But other drugs that engage the same receptor, including serotonin, do not have the same growth effects. 

Maxemiliano Vargas, a graduate student in Olson’s lab, together with Olson and colleagues, experimented with chemically tweaking drugs and using transporters to make it easier or harder for compounds to slip across cell membranes. Serotonin itself is polar, meaning it dissolves well in water but does not easily cross the lipid membranes that surround cells. The psychedelics, on the other hand, are much less polar and can easily enter the interior of a cell. 

They found that the growth-promoting ability of compounds was correlated with the ability to cross cell membranes. 
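The sketch below shows the kind of correlation analysis that statement implies. The permeability values and growth scores are invented placeholders, not the study’s measurements.

```python
# Hypothetical sketch: rank correlation between a compound's ability to
# cross the cell membrane and its plasticity-promoting effect.
# All values are invented placeholders, not data from the study.
from scipy.stats import spearmanr

membrane_permeability = [0.1, 1.8, 2.5, 3.0, 4.2]    # arbitrary units, compounds A-E
dendritic_growth_score = [0.0, 1.4, 1.9, 2.6, 2.2]   # arbitrary units, compounds A-E

rho, p = spearmanr(membrane_permeability, dendritic_growth_score)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```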

Drug receptors are usually thought of as being on the cell membrane, facing out. But the researchers found that in nerve cells, serotonin 2A receptors were concentrated inside cells, mostly around a structure called the Golgi body, with some receptors on the cell surface. Other types of signaling receptors in the same class were on the surface. 

The results show that there is a location bias in how these drugs work, Olson said. Engaging the serotonin 2A receptor when it is inside a cell produces a different effect from triggering it when it is on the outside. 

“It gives us deeper mechanistic insight into how the receptor promotes plasticity, and allows us to design better drugs,” Olson said. 

Additional authors on the paper include: from UC Davis, Lee Dunlap, Chunyang Dong, Samuel Carter, Robert Tombari, Lin Tian, John Gray, Shekib Jami, Seona Patel, Lindsay Cameron and Hannah Saeger; Joseph Hennessey and John McCorvy from the Medical College of Wisconsin, Milwaukee. The work was supported by grants from the National Institutes of Health and the Camille and Henry Dreyfus Foundation, and by a sponsored research agreement with Delix Therapeutics. 

Physicists solve durability issue in next-generation solar cells

The UToledo discovery clears the path for the commercialization of new photovoltaic technology to replace silicon and lower the cost of solar electricity

Peer-Reviewed Publication

UNIVERSITY OF TOLEDO

Dr. Yanfa Yan, UToledo 

Image: Dr. Yanfa Yan, UToledo Distinguished University Professor of physics and a member of the UToledo Wright Center for Photovoltaics Innovation and Commercialization.

Credit: Daniel Miller, The University of Toledo

Physicists in the U.S. have cleared a major hurdle standing in the way of the commercialization of solar cells created with halide perovskites as a lower-cost, higher-efficiency replacement for silicon when generating electricity from the sun.

Published in the journal Science, the clean energy research led by The University of Toledo in collaboration with the University of Washington, University of Toronto, Northwestern University and Swiss Federal Laboratories for Materials Science and Technology solved the problem with the durability of perovskite solar cells, taking the technology one step closer to powering solar panels in the consumer market.

“Perovskite solar cells offer a route to lowering the cost of solar electricity given their high power conversion efficiencies and low manufacturing cost,” said Dr. Yanfa Yan, UToledo Distinguished University Professor of physics and a member of the UToledo Wright Center for Photovoltaics Innovation and Commercialization. “However, we needed to strengthen the emerging solar cell technology’s endurance during outdoor operation.”

The technology needs to survive for decades outdoors in all kinds of weather and temperatures without corroding or breaking down.

“This challenge is no longer a roadblock to deploying the potential of perovskite solar cells,” Yan said. “Our breakthrough work improved device stability and presents ways of achieving success after a decade of research and development.”

The team discovered the ingredient that enhances adhesion and mechanical toughness.

Researchers experimentally demonstrated that perovskite solar cells treated with 1,3-bis(diphenylphosphino)propane (DPPP), a diphosphine Lewis base molecule, retained a high power conversion efficiency and exhibited superior durability after continuous operation under simulated sun illumination for more than 3,500 hours, or more than 145 days.

They used what is called one sun illumination, which is equivalent to outdoor sunlight.

“Phosphine-containing Lewis base molecules with two electron-donating atoms have a strong binding with the perovskite surface,” Yan said. “We saw the robust beneficial effects on perovskite film quality and device performance when we treated the perovskite solar cells with DPPP.”

“DPPP is also a commercialized product with low cost and easy accessibility, which make it suitable for the commercialization of perovskite solar cells,” said Dr. Zhaoning Song, a research assistant professor in Yan’s lab at UToledo and one of the authors on the new paper.

Researchers say the next step to move the technology forward is to employ their findings to make perovskite panels stable.

Dr. Chongwen Li, the first author of the study and a UToledo alumnus, worked with Yan as a graduate student. Li earned his Ph.D. in physics from UToledo in 2020. He is a postdoctoral researcher at the University of Toronto.

“Continuing to exploit the potentiality in the stability of perovskite solar cells is a crucial priority for the ongoing decarbonization of the world’s economy,” Li said. “After the successful demonstration of DPPP on improving the stability of perovskite solar cells, we are further applying it to large area perovskite solar panels and moving the prototype device forward to commercialization.”

UToledo has been a trailblazer in solar energy research and development for more than 30 years.

It has been a decade since Yan’s team at UToledo identified the ideal properties of perovskites, compound materials with a special crystal structure formed through chemistry, and started to focus their efforts on bringing together two different solar cells to increase the total electrical power generated by using two different parts of the sun’s spectrum.

In November, a team of scientists from UToledo, the University of Toronto and Northwestern University collaborated to create an all-perovskite tandem solar cell with record-setting voltage. The research was published in the journal Nature.

“Northwest Ohio is a global leader in solar technology, and The University of Toledo has been at the forefront of breakthrough after breakthrough. Investment and collaboration between the Department of Energy and world-class research professionals at our universities continues to pay dividends, as working men and women benefit from a growing solar industry,” said Congresswoman Marcy Kaptur (OH-09), ranking member of the House Appropriations Subcommittee on Energy and Water Development. “As a leading member on Energy and Water Development, I’ll continue to champion smart investment to power American energy independence.”

The Wright Center for Photovoltaics Innovation and Commercialization was created at UToledo in 2007 to support solar energy research and manufacturing with $18.6 million in support from the Ohio Department of Development, along with matching contributions of $30 million from federal agencies, universities and industrial partners.

UToledo perovskite solar cells ready for the durability test

U.S. unprepared for dangers posed by zoonotic diseases, new analysis concludes

Editorial in "Science" calls for integrating multiple agencies to better combat threats

Peer-Reviewed Publication

NEW YORK UNIVERSITY

The United States, the largest importer of wildlife in the world, is not prepared for future spread of animal-borne, or zoonotic, diseases due to gaps among governmental agencies designed to combat these threats, concludes a new analysis by researchers at Harvard Law School and New York University. The authors call for a “One Health” approach, integrating multiple agencies in order to better govern human-animal interactions.

The editorial, “Blind spots in biodefense,” which appears in the journal Science, is authored by Ann Linder, a research fellow at Harvard Law School’s Brooks McCormick Jr. Animal Law & Policy Program, and Dale Jamieson, a professor at New York University’s Center for Environmental and Animal Protection in the Department of Environmental Studies.

Linder and Jamieson note that the Biden administration’s recent release of its National Biodefense Strategy (NBS-22), the first update since the COVID-19 pandemic began, frames threats as largely external to the United States. 

“NBS-22 focuses primarily on bioterrorism and laboratory accidents, neglecting threats posed by routine practices of animal use and production inside the United States,” they write. 

This oversight is significant, Linder and Jamieson observe, given the United States’ past and present when it comes to human-animal interface:

  • More zoonotic diseases originated in the United States than in any other country during the second half of the 20th century. 
  • In 2022, the U.S. processed more than 10 billion livestock, the largest number ever recorded and an increase of 204 million over 2021.
  • The ongoing H5N1 avian influenza outbreak has left 58 million animals dead in backyard chicken coops and industrial farms in the U.S.
  • Since 2011, the U.S. has recorded more swine-origin influenza infections than any other country. Most occurred at state and county fairs, which attract 150 million visitors each year and where an estimated 18% of swine have tested positive. 

Moreover, they add, the current patchwork of siloed agencies and authorities is marked by a lack of coordination, leaving significant gaps and areas of underregulation. In fact, of the many agencies that govern food animal production, the U.S. Department of Agriculture is the most important, but it has no authority to regulate on-farm animal production.

The authors call for rebuilding from the ground up the U.S. regulatory system in order to combat zoonotic disease risk.

“What is needed is not simply for agencies to do their jobs better or to paper over the gaps, but a fundamental restructuring of the way that human–animal interfaces are governed,” Linder and Jamieson urge. “A One Health approach, which NBS-22 claims as its guiding principle, would take the health of other living things not merely as the occasional means or obstacles to human health, but as continuous with it. The first step in implementing such an approach would be to create a high-level process for integrating the broken mosaic of multiple agencies, with their unclear and sometimes competing mandates, into an effective, comprehensive regime.”

The editorial is based on research from the Live Animal Markets Project, which is examining global policy responses to animal markets and their role in zoonotic disease transmission. The project includes 15 individual country case studies involving local collaborators, partner institutions, and members of the core research team. The project aims to provide a comprehensive assessment that will aid policymakers, contribute to public education about zoonotic risks, and support the human health and animal protection communities. The project is led by researchers from Harvard Law School’s Brooks McCormick Jr. Animal Law & Policy Program, and New York University’s Center for Environmental and Animal Protection, and involves researchers and institutions from around the world. Kristen Stilt, Arthur Caplan, Chris Green, Bonnie Nadzam, and Valerie Wilson McCarthy contributed to this editorial.

Earthquake scientists have a new tool in the race to find the next big one

Peer-Reviewed Publication

UNIVERSITY OF TEXAS AT AUSTIN

San Andreas Seismograph 

Image: A seismograph records seismic activity near the San Andreas Fault. New research from The University of Texas at Austin Jackson School of Geosciences could aid in predicting the world’s most powerful earthquakes.

Credit: ray_explores/Flickr, https://flickr.com/photos/raybouk/8201310617/

An everyday quirk of physics could be an important missing piece in scientists' efforts to predict the world’s most powerful earthquakes. 

In a study published in the journal Science, researchers at The University of Texas at Austin discovered that a frictional phenomenon could be key to understanding when and how violently faults move. That’s because the phenomenon, which explains why it takes more effort to shove a heavy box from a standstill than it does to keep it moving, governs how quickly the fault surfaces bond together, or heal, after an earthquake. A fault that is slow to heal is more likely to move harmlessly, while one that heals quickly is more likely to stick until it breaks in a large, damaging earthquake.

The discovery could be key to understanding when, and how violently, faults move. That alone won’t allow scientists to predict when the next big one will strike — the forces behind large earthquakes are too complex — but it does give researchers a valuable new way to investigate the causes and potential for a large, damaging earthquake to happen, the authors said.

“The same physics and logic should apply to all different kinds of faults around the world,” said the study’s co-lead author Demian Saffer, director of the University of Texas Institute for Geophysics at the Jackson School of Geosciences. “With the right samples and field observations we can now start to make testable predictions about how big and how often large seismic slip events might occur on other major faults, like Cascadia in the Pacific Northwest.”

To make the discovery, researchers devised a test that combined rocks from a well-studied fault off the coast of New Zealand with a computer model. The test successfully calculated that a harmless kind of “slow motion” earthquake would happen every few years because the clay-rich rocks within the fault are very slow to heal. 

The rock samples the researchers tested were drilled from about half a mile under the seafloor in a fault in New Zealand. They squeezed the fault zone rocks in a hydraulic press and found that they were very slow to heal and slipped easily. When they plugged the rock data into a computer model of the fault, the result was a small, slow-motion tremor every two years, a near exact match with observations from the New Zealand fault.
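The frictional healing at the heart of the study is commonly described with rate-and-state friction laws, in which the strength of stationary contacts grows roughly with the logarithm of hold time. The short sketch below illustrates that behaviour with invented parameter values; it is a toy version of the “heavy box” effect described above, not the study’s fault model.

```python
# Minimal sketch of frictional healing in a slide-hold-slide experiment:
# friction recovers roughly logarithmically with hold time. Parameter
# values are illustrative only, not from the study.
import numpy as np

b = 0.01          # assumed healing parameter (dimensionless)
t_c = 1.0         # assumed cutoff time (seconds)
hold_times = np.array([1, 10, 100, 1_000, 10_000, 100_000])   # seconds

# A fault made of material that heals quickly re-locks between events and can
# fail in a large earthquake; a slow-healing, clay-rich fault tends to creep.
delta_mu = b * np.log(1.0 + hold_times / t_c)
for t, d in zip(hold_times, delta_mu):
    print(f"hold {t:>7d} s -> friction gain {d:.4f}")
```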

The researchers think the clay-rich rocks, which are common at many large faults, could be regulating earthquakes by allowing plates to slip quietly past each other, which limits the buildup of stress. The discovery could be used to determine whether a fault is prone to slipping in large, damaging earthquakes, said study co-lead Srisharan Shreedharan, affiliate researcher at the University of Texas Institute for Geophysics and assistant professor at Utah State University.

“This doesn't get us any closer to actually predicting earthquakes, but it does tell us whether a fault is likely to slip silently with no earthquakes, or have large ground-shaking earthquakes,” he said.

At Cascadia, there is little evidence of shallow, slow-motion tremors. That’s one of the reasons the Pacific Northwest Seismic Network wants to place sensors across key areas of the fault. The new study gives them the framework to do so, said network Director Harold Tobin.

“We want to zero in on the processes in the shallow part of the fault because that’s what governs the size of the tsunami,” said Tobin, who was not part of the study. “Fault healing doesn’t explain everything, but it does give us a window into the working of subduction zone faults that we didn’t have before.”

The research was funded by the University of Texas Institute for Geophysics, the International Ocean Discovery Program, and New Zealand’s GNS Science. The New Zealand rock samples were gathered during a 2018 scientific ocean drilling mission co-led by Saffer and Laura Wallace, a research scientist at University of Texas Institute for Geophysics and GNS Science in New Zealand. Coauthors included Wallace and Charles Williams, also at GNS Science, who collaborated on the study’s computer modeling.

Demian Saffer, director of the University of Texas Institute for Geophysics (UTIG) and Laura Wallace, a UTIG research scientist, examine rock samples drilled from about half a mile under the seafloor in a fault in New Zealand during a 2018 scientific ocean drilling mission that they co-led. Lab tests revealed that clay-rich rocks are regulating earthquakes there by allowing the fault to slip harmlessly.

Credit: Tim Fulton, IODP JRSO.