It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Friday, February 17, 2023
Food insecurity may increase cognitive decline in older adults
UNIVERSITY PARK, Pa. — Older adults living with food insecurity are more likely to experience malnutrition, depression and physical limitations that affect how they live. The Supplemental Nutrition Assistance Program (SNAP) is the largest federally funded nutrition-assistance program in the United States, and research has shown that SNAP has reduced hunger and food insecurity in the general population.
Little evidence is available, however, on how SNAP may impact brain aging in older adults. To bridge this knowledge gap, Muzi Na, assistant professor of nutritional sciences at Penn State, led a team of researchers who investigated the relationship between food insecurity, SNAP and cognitive decline. They found that food sufficiency and participation in SNAP may help protect against accelerated cognitive decline in older adults.
In a new article published in The Journal of Nutrition, the researchers analyzed a representative sample of 4,578 older adults in the United States using data from the National Health and Aging Trends Study, 2012-20. Participants reported their experiences with food insecurity and were classified as food sufficient or food insufficient. SNAP status was categorized as SNAP participant, SNAP-eligible nonparticipant or SNAP-ineligible nonparticipant. The researchers found that food insecure adults experienced more rapid cognitive decline than their food secure peers.
The researchers identified distinct trajectories of cognitive decline by food insufficiency status and by SNAP status. Rates of cognitive decline were similar for SNAP participants and SNAP-ineligible nonparticipants, and both groups declined more slowly than SNAP-eligible nonparticipants. The faster cognitive decline observed in the food insecure group was equivalent to being 3.8 years older, while the faster decline observed in the SNAP-eligible nonparticipant group was equivalent to being 4.5 years older.
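The release does not spell out how a difference in decline rates is converted into "years of brain aging," but a common approach is to divide the extra annual decline associated with a risk factor by the average decline associated with one additional year of age. The sketch below illustrates that arithmetic with hypothetical numbers chosen only to reproduce the 3.8-year figure; the rates are not taken from the study.

```python
# Illustration only: converting an excess decline rate into "equivalent years of aging".
# Both rates below are hypothetical and are NOT taken from the study's data.

decline_per_year_of_age = 0.10      # cognitive-score points lost per additional year of age
extra_decline_food_insecure = 0.38  # additional points lost per year with food insecurity

years_equivalent = extra_decline_food_insecure / decline_per_year_of_age
print(f"Excess decline equivalent to being {years_equivalent:.1f} years older")
# prints 3.8, the same kind of comparison reported for the food insecure group
```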
“For an aging population, roughly four years of brain aging can be very significant,” Na explained. “These results really point to the importance of food security for people as they age and the value that SNAP can have in improving people’s cognitive health as they age. We need to make sure that people have access to — and encourage them to use — the SNAP program as they age.”
Future studies are warranted to investigate the impact of addressing food insecurity and promoting SNAP participation on cognitive health in older adults, said Na.
Nan Dou of Penn State, Monique Brown of University of South Carolina, Lenis Chen-Edinboro of University of North Carolina Wilmington, Loretta Anderson of University of Maryland School of Medicine, and Alexandra Wennberg of Karolinska Institutet in Stockholm all contributed to this research.
This research was supported by funding from the Broadhurst Career Development Professorship for the Study of Health Promotion and Disease Prevention and the National Institute of Mental Health.
This press release is based on a version created by The Journal of Nutrition.
UNIVERSITY OF TORONTO, ROTMAN SCHOOL OF MANAGEMENT
Toronto - To professor Alexander Dyck, corporate fraud is like an iceberg: only a small portion is visible, while much more lurks below the surface.
How much more, he wondered, and at what cost to investors?
Prof. Dyck and his team found that under typical surveillance, about three percent of U.S. companies are caught doing something funny with their books in any given year. They determined that number by looking at financial misrepresentations exposed by auditors, enforcement releases by the U.S. Securities and Exchange Commission (SEC), financial restatements, and SEC prosecutions of insider trading, all between 1997 and 2005.
However, the sudden and unexpected collapse of auditing firm Arthur Andersen, beginning in 2001 as a result of its involvement in the Enron accounting scandal, gave Prof. Dyck, of the University of Toronto’s Rotman School of Management, and other researchers the chance to see how much fraud is detected during a period of heightened scrutiny. The collapse put 20 percent of all U.S. publicly traded companies, the slice that had been working with Andersen and was forced to find new auditors, under a higher-powered microscope because of their association with the disgraced accounting firm. It represented “a huge opportunity” that rarely comes along, said Prof. Dyck.
Those companies showed no greater propensity for fraud than other companies during the 1998 to 2000 period. But that changed once the spotlight was turned on. Over the window the researchers examined, from Nov. 30, 2001, when Andersen client Enron began filing for bankruptcy, until the end of 2003, the new auditors, as well as regulators, investors and the news media, were all looking much more closely at the ex-Andersen companies.
“What we found was that there was three times as much detected fraud in the companies that were subjected to this special treatment, as a former Andersen firm, compared to those that weren’t,” said Prof. Dyck, who holds the Manulife Financial Chair in Financial Services and is the Director of the Capital Markets Institute at the Rotman School.
The researchers used the finding to infer that the true share of companies involved in fraud is at least 10 percent. That squares with previous research that has pegged the true incidence of corporate fraud at between 10 and 18 percent. While the researchers were looking at U.S. companies, Prof. Dyck speculated that the ratio of undetected to detected fraud is not significantly different in Canada.
Given those numbers, the researchers estimated that fraud destroys about 1.6 percent of a company’s equity value, mostly due to diminished reputation among those in the know, representing about $830 billion in current U.S. dollars.
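The figures in the two preceding paragraphs fit together as a simple back-of-envelope calculation. The sketch below walks through that arithmetic using only the numbers quoted in this release; it illustrates the reasoning rather than the authors' actual estimation procedure.

```python
# Back-of-envelope sketch of the "iceberg" inference described above. The inputs
# are the figures quoted in the release; the arithmetic is illustrative only.

detected_rate = 0.03       # share of U.S. companies caught misrepresenting in a typical year
scrutiny_multiplier = 3.0  # ex-Andersen clients showed ~3x as much detected fraud under scrutiny

# If tripling the scrutiny roughly triples detections, normal surveillance is
# catching only about one-third of the fraud that is actually occurring.
detection_probability = 1.0 / scrutiny_multiplier
implied_true_rate = detected_rate / detection_probability
print(f"Implied share of companies committing fraud: {implied_true_rate:.0%}")
# prints 9%; because even heightened scrutiny misses some fraud, the study
# treats roughly 10 percent as a lower bound.

# Aggregate cost, using the headline figures quoted above.
equity_share_destroyed = 0.016  # ~1.6% of a company's equity value
total_cost_usd = 830e9          # ~$830 billion in current U.S. dollars
print(f"About {equity_share_destroyed:.1%} of equity value, "
      f"roughly ${total_cost_usd / 1e9:.0f} billion in aggregate")
```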
The figures also help to quantify the value of regulatory intervention, such as the Sarbanes-Oxley Act, or SOX, introduced in 2002 in response to Enron and other financial scandals. It’s not hard to tally the compliance costs of SOX. What their study shows is that the legislation would satisfy a cost-benefit analysis even if it reduced corporate fraud by only 10 percent of its current level.
The results should capture the attention of anyone with responsibility for corporate oversight and research, Prof. Dyck says: “I spend a lot of time running a program for directors of public corporations and I tout this evidence when I say, ‘Do I think you guys should be spending time worrying about these things? Yes. The problem is bigger than you might think.’”
The research was co-authored with Adair Morse of the University of California at Berkeley and Luigi Zingales of the University of Chicago. It appears in the Review of Accounting Studies.
Prof. Dyck will present his research during an event hosted by the Capital Markets Institute on February 23, which will also include a discussion with representatives from academia, the plaintiff’s bar, regulators, and accountants. Further details are online.
Bringing together high-impact faculty research and thought leadership on one searchable platform, the new Rotman Insights Hub offers articles, podcasts, opinions, books and videos representing the latest in management thinking and providing insights into the key issues facing business and society. Visit www.rotman.utoronto.ca/insightshub.
The Rotman School of Management is part of the University of Toronto, a global centre of research and teaching excellence at the heart of Canada’s commercial capital. Rotman is a catalyst for transformative learning, insights and public engagement, bringing together diverse views and initiatives around a defining purpose: to create value for business and society. For more information, visit www.rotman.utoronto.ca
HOUSTON – (Feb. 16, 2023) – Rice University scientists have figured out a way to engineer wood to trap carbon dioxide through a potentially scalable, energy-efficient process that also makes the material stronger for use in construction.
Structural materials like steel or cement come at a high cost, both in dollars and in carbon dioxide emissions; building construction and use account for an estimated 40% of emissions. Developing sustainable alternatives to existing materials could help mitigate climate change and reduce carbon dioxide emissions.
“Wood is a sustainable, renewable structural material that we already use extensively,” said Muhammad Rahman. “Our engineered wood did exhibit greater strength than normal, untreated wood.”
To achieve the feat, the wood is first put through a process known as delignification, which clears the lignin out of the network of cellulose fibers that gives wood its strength.
“Wood is made up of three essential components: cellulose, hemicellulose and lignin,” Rahman said. “Lignin is what gives wood its color, so when you take lignin out, the wood becomes colorless. Removing the lignin is a fairly simple process that involves a two-step chemical treatment using environmentally benign substances. After removing the lignin, we use bleach or hydrogen peroxide to remove the hemicellulose.”
Next, the delignified wood is soaked in a solution containing microparticles of a metal-organic framework, or MOF, known as Calgary framework 20 (CALF-20). MOFs are high-surface-area sorbent materials used for their ability to adsorb carbon dioxide molecules into their pores. “The MOF particles easily fit into the cellulose channels and get attached to them through favorable surface interactions,” said Soumyabrata Roy, a Rice research scientist and lead author on the study.
MOFs are among several nascent carbon capture technologies developed to address anthropogenic climate change. “Right now, there is no biodegradable, sustainable substrate for deploying carbon dioxide-sorbent materials,” Rahman said. “Our MOF-enhanced wood is an adaptable support platform for deploying sorbent in different carbon dioxide applications.”
“Many of the existing MOFs are not very stable in varying environmental conditions,” Roy said. “Some are very susceptible to moisture, and you don’t want that in a structural material.”
CALF-20, however, developed by University of Calgary Professor George Shimizu and collaborators, stands out in terms of both performance level and versatility under a variety of environmental conditions, Roy said.
“The manufacturing of structural materials such as metals or cement represents a significant source of industrial carbon emissions,” Rahman said. “Our process is simpler and ‘greener’ in terms of both substances used and processing byproducts.
“The next step would be to determine sequestration processes as well as a detailed economic analysis to understand the scalability and commercial viability of this material,” he added.
Rahman is an assistant research professor in materials science and nanoengineering in Rice’s George R. Brown School of Engineering. Roy is a research scientist in materials science and nanoengineering at Rice.
Shell Technologies (R66830) and the UES-Air Force Research Laboratory (G10000097) supported the research.
Natural wood (left) versus delignified wood. Removing lignin from wood makes it colorless.
Authors: Soumyabrata Roy, Firuz Alam Philip, Eliezer Fernando Oliveira, Gurwinder Singh, Stalin Joseph, Ram Manohar Yadav, Aparna Adumbumkulath, Sakib Hassan, Ali Khater, Xiaowei Wu, Praveen Bollini, Ajayan Vinu, George Shimizu, Pulickel Ajayan, Golam Kibria and Muhammad M. Rahman
Image downloads:
https://news-network.rice.edu/news/files/2023/02/230202_WOOD_1_LG.jpg CAPTION: Wood pieces at different stages of modification, from natural (far right) to delignified (second from right) to dried, bleached and delignified (second from left) and MOF-infused functional wood (first on the left).
https://news-network.rice.edu/news/files/2023/02/230202_Rahman_Roy_LG.jpg CAPTION: Soumyabrata Roy (left) poses beside Muhammad Rahman while holding a piece of the carbon dioxide-trapping wood developed by their team. Roy is a research scientist and Rahman is an assistant research professor in materials science and nanoengineering at Rice’s George R. Brown School of Engineering.
Located on a 300-acre forested campus in Houston, Rice University is consistently ranked among the nation’s top 20 universities by U.S. News & World Report. Rice has highly respected schools of Architecture, Business, Continuing Studies, Engineering, Humanities, Music, Natural Sciences and Social Sciences and is home to the Baker Institute for Public Policy. With 4,552 undergraduates and 3,998 graduate students, Rice’s undergraduate student-to-faculty ratio is just under 6-to-1. Its residential college system builds close-knit communities and lifelong friendships, just one reason why Rice is ranked No. 1 for lots of race/class interaction and No. 1 for quality of life by the Princeton Review. Rice is also rated as a best value among private universities by Kiplinger’s Personal Finance.
Reducing your electric bill with a predictive control heating system
The Norwegian University of Science and Technology (NTNU) has tested a system to predict the heating needs on its Gløshaugen campus, showing that surplus heat can be used even more efficiently by predicting consumption.
A good deal of research shows that surplus heat, such as the heat from data centres, lends itself well as a heat source. But we have only limited knowledge about how the control systems that manage this heat can save us even more energy and money.
Researchers Juan Hou, Haoran Li and Natasa Nord from the Norwegian University of Science and Technology (NTNU) wanted to do something about that. They have presented their results in a recent research article.
The researchers used NTNU’s Gløshaugen campus as a test case for a Model Predictive Control (MPC) scheme.
NTNU is basking in the heat of mathematical calculations
First, a little information about how the buildings that researchers and students at Gløshaugen spend time in are heated.
On some days of the year, the surplus heat from the supercomputers at Gløshaugen is enough to heat the entire campus. Cold water is carried to the supercomputers to cool them down, and the heated water is then piped to wherever it is needed elsewhere on campus.
In winter, the district heating network also supplies heat, but on the warmest days the supercomputers can handle all the campus heating needs.
The water-borne heat from the district heating network, the surplus heat from the computers, and other heating equipment using heat pumps all need to be controlled by a common system. This control system is what the researchers believe can become even more effective.
Introducing a “fortune-telling system” to save even more
“Making predictions is actually completely different from divination. Whereas a fortune teller rubs a ball and receives her prediction out of thin air, MPC is research based,” says Natasa Nord, a professor at NTNU's Department of Energy and Process Engineering.
The Model Predictive Control (MPC) scheme is a control model filled with a lot of data about the specific heating system and includes everything from previous local weather reports to building temperatures throughout the year to the various components that provide surplus heat. When all the data is fed into the system, it becomes a data-based and accurate fortune-telling ball.
Nord explains that traditional control systems are rule-based and built to work across very different buildings: they follow a generic template for all the diverse components. The MPC technique, in contrast, is tailor-made and knows local conditions inside and out. The researchers took data for each month in 2017-2018 as their starting point and tested MPC via simulations. They then compared the simulation results with actual electricity use and electric bills.
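The release does not describe the model itself, but the pattern behind MPC can be shown concretely: at each time step the controller optimizes the heating schedule over a forecast horizon, applies only the first step, then re-optimizes as new price and weather forecasts arrive. The following is a minimal sketch of that receding-horizon loop for a single building zone; every parameter, price and forecast is hypothetical and the thermal model is deliberately simplified, so read it as an illustration of the technique rather than the NTNU implementation.

```python
# Minimal receding-horizon MPC sketch for heating. All numbers are hypothetical;
# this shows the control pattern, not NTNU's campus model.

import numpy as np
from scipy.optimize import minimize

HORIZON = 12        # hours the controller looks ahead
THERMAL_MASS = 5.0  # kWh needed to raise the zone 1 degree C (hypothetical)
LOSS_COEFF = 0.3    # kW lost per degree of indoor-outdoor difference (hypothetical)
T_MIN = 20.0        # comfort requirement, degrees C
P_MAX = 40.0        # maximum heat input, kW


def simulate(t_indoor, heat_kw, t_out_forecast):
    """Roll a one-zone thermal model forward, one hour per step."""
    temps = []
    for q, t_out in zip(heat_kw, t_out_forecast):
        t_indoor = t_indoor + (q - LOSS_COEFF * (t_indoor - t_out)) / THERMAL_MASS
        temps.append(t_indoor)
    return np.array(temps)


def plan(t_indoor, price_forecast, t_out_forecast):
    """Find the cheapest heating schedule that keeps the zone warm enough."""
    def cost(heat_kw):
        energy_cost = np.sum(price_forecast * heat_kw)            # pay for every kWh
        temps = simulate(t_indoor, heat_kw, t_out_forecast)
        discomfort = np.sum(np.maximum(T_MIN - temps, 0.0) ** 2)  # soft comfort constraint
        return energy_cost + 100.0 * discomfort

    x0 = np.full(HORIZON, P_MAX / 2)
    res = minimize(cost, x0, bounds=[(0.0, P_MAX)] * HORIZON, method="L-BFGS-B")
    return res.x


# Receding-horizon loop over one hypothetical day.
rng = np.random.default_rng(0)
t_indoor = 21.0
for hour in range(24):
    # Stand-ins for the price and weather forecasts an MPC system would ingest.
    price_forecast = 0.5 + 0.4 * np.sin(np.linspace(hour, hour + HORIZON, HORIZON) / 4)
    t_out_forecast = -2.0 + 0.3 * rng.standard_normal(HORIZON)
    schedule = plan(t_indoor, price_forecast, t_out_forecast)
    q_now = schedule[0]  # apply only the first step, then re-plan next hour
    t_indoor = simulate(t_indoor, [q_now], [t_out_forecast[0]])[0]
    print(f"hour {hour:2d}: heat {q_now:5.1f} kW, indoor {t_indoor:.1f} C")
```

In a real deployment, the toy thermal model would be replaced by one identified from building data, and the stand-in forecasts by actual weather and spot-price feeds.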
Small savings percentage can add up when prices are high
The testing revealed that it was possible to save 1.8 per cent in energy costs per month with MPC. Nord points out that this figure was based on the 2021 price model.
“We can assume that the effect of the power savings would be even greater today, although I wouldn’t speculate on how much. Today, cost schemes use different price models for night- and daytime energy use. In any case, our test shows that with MPC we can maximize financial savings for heat consumers like the Gløshaugen campus. The results also show that MPC was more stable and robust than the traditional control system, which is a positive factor for system safety,” says Nord.
Can also be used for systems with other types of excess heat
Not many places in Norway use surplus heat from supercomputers for heating.
“MPC could just as well have been used with other sources of surplus heat, such as cooling in the food industry,” says Nord.
In addition to heat-producing industry, cooling also generates surplus heat. A current project at the Leangen ice rink in Trondheim is attempting to utilize the surplus heat from its cooling facility. Many food industries also require refrigeration.
Nord explains that one challenge in introducing MPC lies in getting the different systems to talk to each other.
“Acquiring good data that we can build into the programming can be challenging. Some components have protocols that we struggle to communicate with. They require some work to adapt the programming. But we’re continuously working on implementation, and this is something that researchers at NTNU can help other players with,” she says.
The research in this article was supported through the Research Council of Norway’s FRIPRO/FRINATEK programme.
Nature in Surrey’s wetlands, urban areas and urban fringes will benefit from a grant of £200,000 from the philanthropic Garfield Weston Foundation.
This donation will allow scientists at the University of Surrey and ecologists at the Surrey Wildlife Trust, Buglife and Painshill Park Trust to expand their trailblazing Space4Nature work, which combines satellite earth observation, artificial intelligence (AI) and citizen science to boost biodiversity in Surrey.
The new funding follows the £1.25m Dream Fund 2022 award, thanks to players of People’s Postcode Lottery. This means a wider range of habitats can be studied, adding Surrey’s towns and wetlands to the grassland, heaths, chalk downland and woodland already being researched through Space4Nature.
Professor Richard Murphy, Director of the Centre for Environment and Sustainability, University of Surrey, said:
“It’s fantastic we’re getting this grant to enable more types of habitats and more areas in Surrey to benefit. By combining information from local people exploring nature with AI and satellite images, we are developing a new way to protect and enhance nature, boosting biodiversity.
“Garfield Weston’s support will mean we can look at wetlands, which are particularly sensitive to climate change, and urban and peri-urban areas, which are on the edges of Surrey’s many towns and villages. More funding means we can help more nature recover.”
Andrew Jamieson, Space4Nature’s Project Manager at Surrey Wildlife Trust, said:
“Citizen science will be core to the success of Space4Nature and its mission to connect habitats across Surrey, while connecting more people with nature. This new funding will directly benefit our wildlife and wild places, helping our communities benefit from a healthy natural environment – and making our landscapes more resilient against the effects of a changing climate.”
Philippa Charles OBE, Director of the Garfield Weston Foundation, said:
“Our Trustees were delighted to support the Space4Nature project, and the partnership between the University of Surrey, Surrey Wildlife Trust, Buglife and Painshill Park Trust.”
This latest funding adds to The Future Says Surrey, the University of Surrey's global philanthropic campaign, which aims to raise £60m between 2022 and 2025 to help Surrey continue making a positive impact on society and confirm its place as one of the UK’s leading research and educational institutions.
When investors try to forecast a company’s future earnings by analyzing its current financial statements, a perplexing problem is how to interpret nonrecurring income taxes.
Those one-time tax items increase or reduce quarterly earnings. Should they be treated as unusual events, with no lasting impact? Or do they offer clues to possible corporate manipulation, like shifting tax expenses from one quarter to another to artificially boost earnings?
In new research from Texas McCombs, Dean and Accounting Professor Lillian Mills offers answers to such questions. She finds that most nonrecurring taxes stem from legitimate business causes rather than efforts to cook the books.
“Past research suggests that nonrecurring income taxes are driven by managerial opportunism,” Mills says. “But we found that they’re not associated with efforts to meet or beat analysts’ earnings forecasts. In general, they’re driven by economics, not by opportunism.”
With Dain Donelson of the University of Iowa and Colin Koutney of George Mason University, Mills analyzed 68,139 quarterly earnings reports from 2007 to 2017. Their results offer insights on what nonrecurring taxes mean — and don’t mean.
They’re not uncommon. Such taxes occurred in 6% of all company-quarters and in all industries. For larger businesses with more complex finances, that figure rose to 10%.
They have economic causes. The most frequent triggers of nonrecurring taxes are:
· Tax-related events, such as audit resolutions and changes in tax laws.
· Accounting changes, like repatriating profits or revaluing assets.
· Corporate restructurings, such as mergers.
They don’t affect future earnings. Nonrecurring income taxes had little predictive power for future profits.
They don’t affect future taxes. Nonrecurring taxes did not predict a company’s future tax rates or its likelihood of having to restate taxes it had previously filed.
“By better understanding nonrecurring income taxes, investors and analysts can make better earnings projections,” says Mills. “Likewise, researchers need to understand these transitory items when they’re analyzing a company’s tax picture or seeking signs of earnings manipulation.”
Receptor location matters for psychedelic drug effects
UNIVERSITY OF CALIFORNIA - DAVIS
Location, location, location is the key for psychedelic drugs that could treat mental illness by rapidly rebuilding connections between nerve cells. In a paper published Feb. 17 in Science, researchers at the University of California, Davis show that engaging serotonin 2A receptors inside neurons promotes growth of new connections but engaging the same receptor on the surface of nerve cells does not.
The findings will help guide efforts to discover new drugs for depression, PTSD and other disorders, said senior author David E. Olson, associate professor of chemistry, biochemistry and molecular medicine and director of the Institute for Psychedelics and Neurotherapeutics at UC Davis.
Drugs such as LSD, MDMA and psilocybin show great promise for treating a wide range of mental disorders that are characterized by a loss of neural connections. In laboratory studies, a single dose of these drugs can cause rapid growth of new dendrites – branches – from nerve cells, and formation of new spines on those dendrites.
Olson calls this group of drugs “psychoplastogens” because of their ability to regrow and remodel connections in the brain.
Earlier work from Olson’s and other labs showed that psychedelic drugs work by engaging the serotonin 2A receptor (5-HT2AR). But other drugs that engage the same receptor, including serotonin, do not have the same growth effects.
Maxemiliano Vargas, a graduate student in Olson’s lab, together with Olson and colleagues, experimented with chemically tweaking drugs and using transporters to make it easier or harder for compounds to slip across cell membranes. Serotonin itself is polar, meaning it dissolves well in water but does not easily cross the lipid membranes that surround cells. The psychedelics, on the other hand, are much less polar and can easily enter the interior of a cell.
They found that the growth-promoting ability of compounds was correlated with the ability to cross cell membranes.
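One rough way to make the polarity point concrete is to compare calculated logP values, an octanol/water partition estimate that is often used as a crude proxy for how readily a small molecule crosses lipid membranes. The sketch below does this with RDKit for serotonin and two classic psychedelics; the compound choices and the use of cLogP here are illustrative assumptions, not the study's methodology.

```python
# Rough illustration of the polarity difference described above. Calculated logP
# (cLogP) is only a crude proxy for membrane permeability; this is not how the
# study measured anything.

from rdkit import Chem
from rdkit.Chem import Crippen

compounds = {
    "serotonin": "NCCc1c[nH]c2ccc(O)cc12",     # polar, crosses membranes poorly
    "DMT": "CN(C)CCc1c[nH]c2ccccc12",          # less polar psychedelic
    "psilocin": "CN(C)CCc1c[nH]c2cccc(O)c12",  # active metabolite of psilocybin
}

for name, smiles in compounds.items():
    mol = Chem.MolFromSmiles(smiles)
    print(f"{name:10s} cLogP = {Crippen.MolLogP(mol):5.2f}")
# Serotonin's low cLogP reflects its polarity; the higher values for the
# psychedelics are consistent with compounds that can reach receptors inside cells.
```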
Drug receptors are usually thought of as being on the cell membrane, facing out. But the researchers found that in nerve cells, serotonin 2A receptors were concentrated inside cells, mostly around a structure called the Golgi body, with some receptors on the cell surface. Other types of signaling receptors in the same class were on the surface.
The results show that there is a location bias in how these drugs work, Olson said. Engaging the serotonin 2A receptor when it is inside a cell produces a different effect from triggering it when it is on the outside.
“It gives us deeper mechanistic insight into how the receptor promotes plasticity, and allows us to design better drugs,” Olson said.
Additional authors on the paper include: from UC Davis, Lee Dunlap, Chunyang Dong, Samuel Carter, Robert Tombari, Lin Tian, John Gray, Shekib Jami, Seona Patel, Lindsay Cameron and Hannah Saeger; Joseph Hennessey and John McCorvy from the Medical College of Wisconsin, Milwaukee. The work was supported by grants from the National Institutes of Health and the Camille and Henry Dreyfus Foundation, and by a sponsored research agreement with Delix Therapeutics.
Psychedelics promote neuroplasticity through activation of intracellular 5-HT2A receptors
ARTICLE PUBLICATION DATE
17-Feb-2023
COI STATEMENT
D.E.O. is a co-founder of Delix Therapeutics, Inc., serves as the chief innovation officer and head of the scientific advisory board, and has sponsored research agreements with Delix Therapeutics. Delix Therapeutics has licensed technology from UC Davis. The sponsors of this research were not involved in the conceptualization, design, decision to publish, or preparation of the manuscript.