Wednesday, January 15, 2025

 

Long before the L.A. fires, America’s housing crisis displaced millions



A USC study reveals how decades of policy missteps fueled today’s housing crisis, pushing the dream of homeownership out of reach for millions, including the largest generation in U.S. history: millennials.




University of Southern California

Dowell Myers, Ph.D., is a specialist in urban growth and societal change, with expertise as a planner and urban demographer. Myers is a professor at the USC Price School of Public Policy and director of the USC Population Dynamics Research Group.

Credit: USC Price School of Public Policy




A new USC study reveals that the challenges that led to a national shortage of affordable housing and soaring home prices were set in motion long ago — and could have been foreseen.

The researchers behind the study say that the problem will only worsen as more natural disasters — such as the devastating Los Angeles area wildfires and large hurricanes — flatten entire communities. Los Angeles County officials estimate that more than 10,000 homes and businesses have been lost so far to the fires that erupted across the region last week.

“A tightly constrained housing supply reduces resilience to absorb losses from unexpected disasters — fires, earthquakes, hurricanes and more. In Los Angeles, this lack of flexibility could rapidly intensify gentrification as relocations strain the existing housing stock,” said Dowell Myers, professor of policy, planning and demography at the USC Price School of Public Policy and the study’s corresponding author.

A tightly constrained housing supply makes it difficult, if not impossible, to overcome unexpected losses of homes to disasters like wildfires in California. (Photo/iStock)

The study, published in the Russell Sage Foundation Journal of the Social Sciences, traces the origins of the crisis back to the early 2000s. A perfect storm of policy missteps, demographic shifts and economic forces has severely constrained housing supply, disproportionately affecting millennials and people of color.

The nation now faces a shortage of 4.5 million or more homes.

A crisis decades in the making

The researchers argue that the roots of the crisis began in the early 2000s when easy access to credit for young adults fueled a historic housing bubble. When the bubble burst in 2008 and led to the Great Recession, policymakers overcorrected by tightening mortgage lending standards and limiting funds for new construction.

“These measures came at the worst possible time — just as millennials, the largest generation in 30 years, entered the housing market,” said Myers, who directs the Population Dynamics Research Group at USC Price. “Young home seekers were welcomed with the lowest construction in more than 60 years.”

Using data from the U.S. Census and the American Community Survey, the researchers built a housing-demographic model that tracks how people move through housing-related life stages, such as renting and buying, over time.
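The paper does not include code, but the basic logic of such a cohort-based projection can be sketched as follows. Everything in this snippet (the age range, cohort size and life-cycle rate curves) is an illustrative assumption, not the study's actual data or estimates.

```python
# Illustrative sketch (not the study's actual model): a cohort-based
# housing-demand projection. A birth cohort is carried forward year by
# year; at each age, assumed shares of its members head a renter or an
# owner household, and those shares times cohort size give the housing
# units the cohort needs. All rates and sizes below are placeholders.

def renter_rate(age: int) -> float:
    """Assumed share heading a renter household (peaks in the late 20s)."""
    return max(0.0, 0.55 - 0.02 * max(0, age - 28))

def owner_rate(age: int) -> float:
    """Assumed share heading an owner household (rises through the 30s)."""
    return min(0.65, 0.05 + 0.03 * max(0, age - 22))

def project_cohort_demand(cohort_size: int, start_age: int, end_age: int):
    """Yield (age, renter_households, owner_households) for each age."""
    for age in range(start_age, end_age + 1):
        yield age, cohort_size * renter_rate(age), cohort_size * owner_rate(age)

# Hypothetical cohort of 4.5 million people followed from age 20 to 40.
for age, renters, owners in project_cohort_demand(4_500_000, 20, 40):
    print(f"age {age}: {renters:,.0f} renter households, {owners:,.0f} owner households")
```

Comparing projected demand of this kind against permits, completions and vacancies is what allows such a model to flag shortfalls before they surface as price spikes.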

Analyzing the period from 1990 to 2021, they found that laws passed in 2010 to prevent another housing bubble unintentionally created a mismatch in the market. Stricter credit rules and limited housing supply couldn’t keep up with the millennial generation’s demand for housing. This led to shortages in rental housing first, followed by a shortage of homeownership opportunities. Many millennials delayed buying homes until after 2015, leading to a sudden rush to catch up.

Myers and his co-authors said the factors that led to the housing crisis included:

  • Severe underestimations of millennial demand: Policymakers and industry leaders miscalculated the millennial generation’s future needs for housing. This oversight only became evident after 2016, when pent-up demand surged.
  • Overlooked “age waves”: Policymakers failed to consider how shifts in age demographics — especially the increasing number of millennials reaching home-buying age — affected housing demand.
  • Forgotten lagging effects: Housing and population dynamics are shaped by past events — a fundamental concept that the authors argue has been overlooked in policy discussions. Current homeownership rates reflect decisions made in the past, and today’s population trends are influenced by the characteristics of previous generations.
  • Flawed demand measurements: Current methods for measuring housing demand fail to account for potential households that cannot form due to housing shortages. This contributed to underestimates of future housing demand and misinformed policy responses (a rough sketch of this idea follows the list).
  • Misunderstood homeownership trends: The decline in homeownership rates after the 2008 recession was misinterpreted as a permanent shift in preferences, rather than a delayed response to the housing market crash.
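
To illustrate the measurement problem flagged in the "flawed demand measurements" point above: one rough way to count households that fail to form is to apply headship rates from an earlier, looser market to today's population and compare with what is actually observed. All figures below are hypothetical placeholders, not the study's estimates.

```python
# Hypothetical illustration of latent housing demand: households that
# "should" exist at benchmark headship rates but do not, because high
# costs keep people doubled up. All inputs are made-up placeholders.

population_by_age = {"25-34": 45_000_000, "35-44": 42_000_000}

# Share of each age group heading their own household in an earlier,
# looser market (assumed benchmark) versus today (assumed current).
benchmark_headship = {"25-34": 0.45, "35-44": 0.52}
current_headship = {"25-34": 0.41, "35-44": 0.50}

missing_households = sum(
    population_by_age[g] * (benchmark_headship[g] - current_headship[g])
    for g in population_by_age
)
print(f"Estimated latent (unformed) households: {missing_households:,.0f}")
# 45,000,000 * 0.04 + 42,000,000 * 0.02 = 2,640,000 in this toy example
```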

America’s housing crisis: A failure of foresight

Myers credits the work of the late Richard A. Easterlin, professor emeritus of economics at the USC Dornsife College of Letters, Arts and Sciences and a pioneer in linking the size of generations to their economic consequences. Easterlin’s research explored how changes in fertility rates 20 years earlier predict the number of young adults entering the workforce and consumer market — insights that should have informed housing policy and prepared the market for millennial demand.

Between 1976 and 1990, birth rates rose by 32%, setting the stage for a surge of 20-somethings moving into cities by 2010, the study found. The 2008 recession delayed their entry into stable jobs and housing, but when millennials began seeking homes after recovery from the Great Recession, supply was scarce.

Homebuilders and local officials likely assumed millennials would stay in apartments with roommates for much longer, but as they approached age 30, this “peak millennial” effect passed and their demand for homeownership surged, Myers explained.

“Easterlin’s timeless lessons were disregarded to our peril and underlie the extreme housing crisis,” Myers said. “We should have seen the millennials coming and made suitable housing plans in advance. How could so fundamental a lesson be disregarded?”

Racial disparities in homeownership recovery

As the housing crisis has deepened, racial and ethnic disparities in homeownership have also widened, with recovery varying significantly across groups. In their analysis, the researchers examined homeownership trends from 2000 to 2021, focusing on how far each group fell short of expected homeownership in 2016 and how much it had recovered by 2021.

In 2016, homeownership among whites was 9% lower than expected. While the gap narrowed over time, by 2021, there were still 4.6% fewer homeowners than anticipated.

Black Americans faced a much larger disparity. In 2016, Black homeownership was 23.2% lower than expected. By 2021, the gap had only slightly improved, with homeownership still 16.3% lower than expected, pointing to a slower recovery compared with other groups.

Among Hispanics, homeownership was 12.3% lower than expected in 2016, but the group saw a substantial recovery by 2021. By the end of the study, Hispanic homeownership exceeded expected levels, with a small surplus of 0.4%.

“Our findings point to the need for targeted interventions to address systemic barriers to homeownership and ensure more equitable access to housing for all,” Myers said.

Proactive housing policies in a changing climate

To address the housing crisis, the researchers offer two key solutions: first, shortening the lag time between rising demand and supply by anticipating housing needs over the next five years, and second, standardizing the practice of tracking both population growth and housing availability together, rather than in isolation.

“Housing policy needs to be better at planning for the needs of different age groups and their life stages to avoid mismatches between supply and demand,” Myers said.

“Without proactive policy, we risk not only falling short of meeting demand but also being unprepared with a resilient housing supply that can accommodate victims of climate-driven disasters, like wildfires, or other emergencies that create new demand in an instant.”

Additional revelations are available in the full report.


About the study: Co-authors of the study include Hyojung Lee of Seoul National University and JungHo Park of Kyung Hee University, South Korea.

 

Just as Gouda: Improving the quality of cheese alternatives


Studying how plant-based proteins interact with fat matrices can lead to better texture and nutrition for vegan cheeses



American Institute of Physics

Researcher Alejandro Marangoni demonstrates the stretchy properties of his team’s plant-based cheese analog.

Credit: Alejandro Marangoni




WASHINGTON, Jan. 14, 2025 – Plant-based dairy products are a great alternative for people who avoid animal products, but manufacturers have a hard time replicating the creamy, cheesy qualities that make dairy so indulgent.

Scientists from the University of Guelph in Ontario and Canadian Light Source Inc. in Saskatchewan are working to produce plant-based cheese with all the characteristics of real cheese, but with better health benefits.

In Physics of Fluids, by AIP Publishing, researchers studied multiple types of plant-based proteins and how they interact with alternative cheese scaffolds.

“If you wanted to strictly only eat plant-based products, you would end up eating a lot of beans and tofu, which can be a little bit boring after a while,” said author Alejandro Marangoni. “Now, consumers expect essentially the same animal product but with plant-based ingredients, which is very difficult.”

To create a cheesy product with the same texture as the real thing, scientists need to study a variety of physical attributes, such as melting, stretching, and oil release upon grilling and heating.

“The behavior of milk proteins and meat proteins is reasonably well understood, but knowledge about the functionality of plant proteins is lacking,” Marangoni said. “There is also a huge variety of different plant proteins, each one very different from one another.”

Previous research concluded that a blend of 25% coconut oil, 75% sunflower oil, and pea protein created a desirable cheese texture. In this research, Marangoni and his team studied isolates from three proteins (lentil protein, faba bean protein, and a particular type of pea protein) and observed how the isolates interacted with the oil and with the starch matrix of the cheese alternative.

They found that increasing the coconut oil content increased the hardness of the cheeses, but cheese made with pea protein and 25% coconut oil had the firmest texture due to its unique protein-fat interactions. This formulation matched or surpassed the melt, oil loss, and stretch of cheese analogs made with 100% coconut oil.

Using a blend of sunflower and coconut oil decreased the saturated fat content of the cheese, creating a healthy and sustainable alternative to dairy cheeses and other plant-based cheeses.

“Ultimately we want to improve the nutrition, increase the protein content, and lower the saturated fat content of cheese alternatives,” Marangoni said. “But keeping all the functionality in there, which includes the melt and the stretch of the ‘cheese,’ is very difficult.”

###

The article “Impact of protein sources on the functionality of plant-based cheeses formulated with saturated and unsaturated fat” is authored by Cameryn Sanders, Jarvis Stobbs, Stacie Dobson, and Alejandro G. Marangoni. It will appear in Physics of Fluids on Jan. 14, 2025 (DOI: 10.1063/5.0238556). After that date, it can be accessed at https://doi.org/10.1063/5.0238556.

ABOUT THE JOURNAL

Physics of Fluids is devoted to the publication of original theoretical, computational, and experimental contributions to the dynamics of gases, liquids, and complex fluids. See https://pubs.aip.org/aip/pof.

###

 

Global study pinpoints genes for depression across ethnicities



University of Edinburgh





New genetic risk factors for depression have been identified across all major global populations for the first time, allowing scientists to predict risk of depression regardless of ethnicity.

The world’s largest and most diverse genetic study ever into major depression has revealed nearly 300 previously unknown genetic links to the condition, experts say.

One hundred of the newly discovered genetic variations – small differences in the DNA sequence that makes up a gene – were identified due to the inclusion of people of African, East Asian, Hispanic and South Asian descent, the study found.

Previous research into the genetics of depression has focused primarily on white populations of European descent. Therapies developed using genetic approaches may therefore not be effective for people of other ethnicities, widening existing health inequalities.

Each single genetic variant has a very small effect on the overall risk of developing depression. If a person has multiple variants, these small effects can add up, increasing their risk.

The research team were able to more accurately predict an individual’s risk of depression by taking into account the newly identified variants.
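
That additive logic is the basis of a polygenic risk score. The toy calculation below shows the arithmetic only; the variant IDs, effect sizes and genotypes are invented for illustration and do not come from the study.

```python
# Toy polygenic risk score: each variant contributes its effect size
# multiplied by how many risk alleles (0, 1 or 2) a person carries.
# Variant IDs, effect sizes and genotypes below are invented examples.

effect_sizes = {           # per-allele effect on depression risk (log-odds scale)
    "rs0000001": 0.02,
    "rs0000002": 0.015,
    "rs0000003": -0.01,    # some alleles can be protective
    "rs0000004": 0.03,
}

person_genotype = {        # number of risk alleles carried at each variant
    "rs0000001": 2,
    "rs0000002": 1,
    "rs0000003": 0,
    "rs0000004": 1,
}

prs = sum(effect_sizes[v] * person_genotype[v] for v in effect_sizes)
print(f"Polygenic risk score: {prs:.3f}")   # -> 0.085
```

Real scores of this kind sum over hundreds or thousands of variants and are standardised against a reference population before being used to rank relative risk.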

The international team of scientists, led by the University of Edinburgh and King’s College London, looked at anonymised genetic data from more than five million people in 29 countries worldwide. One in four individuals included in the study were from non-European ancestries.

Researchers identified a total of 700 variations in the genetic code of individuals linked to the development of depression, almost half of which had never been associated with the condition before, implicating 308 specific genes.

The identified genetic variants were linked to neurons - a type of brain cell - across multiple brain regions, including areas which control emotion.

The findings offer new insight into depression’s impact on the brain and present possible new targets for treatment, experts say.

The research team highlight the existing drugs pregabalin and modafinil – used to treat chronic pain and the sleeping condition narcolepsy, respectively – which could potentially be repurposed for the treatment of depression, based on the study findings.

However, the team caution that further studies and clinical trials are needed to explore the potential of the drugs in patients with depression.

The study, funded by NIH, Wellcome and the National Institute for Health and Care Research Maudsley Biomedical Research Centre, is published in the journal Cell: https://www.cell.com/cell/fulltext/S0092-8674(24)01415-6 [URL will become active after the embargo lifts].

The research team from the Psychiatric Genomics Consortium involved scientists from all continents, including studies from South Africa, Brazil, Mexico, the USA, Australia, Taiwan and China.

Professor Andrew McIntosh, study co-lead, from the University of Edinburgh’s Centre for Clinical Brain Sciences, said: “There are huge gaps in our understanding of clinical depression that limit opportunities to improve outcomes for those affected. Larger and more globally representative studies are vital to provide the insights needed to develop new and better therapies, and prevent illness in those at higher risk of developing the condition.”

Professor Cathryn Lewis, study co-lead, from the Institute of Psychiatry, Psychology & Neuroscience at King’s College London, said: “Depression is a highly prevalent disorder and we still have a lot to learn about its biological underpinnings. Our study identifies hundreds of additional genetic variants that play a role in depression. These findings show depression is highly polygenic and open up downstream pathways to translate these findings into better care for people with depression.”

 

Study shows connection between childhood maltreatment and disease in later life



University of Birmingham


University of Birmingham venture Dexter has demonstrated the power of its software platform in a study showing that people whose childhoods featured abuse, neglect or domestic abuse carry a significantly increased risk of developing rheumatoid arthritis or psoriasis in later life.

The starting point for the recently published study was a database of over 16 million electronic health records, from which the Dexter software defined a cohort with two arms: one exposed to childhood maltreatment and one that was not.

The software then checked the records over a 26-year period for medical codes of immune-mediated inflammatory disorders (IMIDs) and found that childhood maltreatment was associated with increased incidence rates of rheumatoid arthritis and psoriasis.

The increase in risk was significant for both diseases, and the association was more pronounced in women than in men.

The study was led by Dr Liam Snook from the University of Birmingham. Its senior author, Professor Joht Singh Chandan, Clinical Professor of Public Health and Health Inequalities Lead at Birmingham Health Partners, commented: "As far as we’re aware, this is the first matched cohort longitudinal study to investigate the relationship between childhood maltreatment and immune-mediated inflammatory disorders (IMIDs) in later life.”

“Previous studies were meta-analyses, which synthesised data from several independent studies, and these showed an increased risk for IMIDs in people whose childhood featured abuse.

“But to determine the incidence (the number of new cases over time), you need to follow a cohort over time in a longitudinal study – and the only longitudinal study we could find limited its identification of IMIDs to people who had been hospitalised, so it did not show the onset of disease, which is usually recorded in primary care.”

Longitudinal studies, which collect information from the same group of people repeatedly over time, allow for analysis of change over time, and the influence of earlier life circumstances on health later in life.   

The uniqueness of the study lies in the Dexter software, which can rapidly inspect huge databases of longitudinal data containing billions of datapoints, delivering reproducible, accurate and transparent results.

Professor Chandan, who is also Head of Customer Success at Dexter software, points out two impacts from studies that use this type of ‘big data’.  He said: “For scientists, the study identifies an area that warrants further investigation into biological pathways that may link early life stressors to these two immune diseases.  For public health professionals, it paints a better picture of the patient.”  

“This approach can be used to identify inequalities in healthcare, alerting healthcare providers and deliverers to patient groups that need specific attention, and ultimately helping allocate healthcare resources to those with the greatest need. In the future, we plan to use this approach to look at further longitudinal datasets in crime, social care, and mental health care.”

About the study

Starting from 16 million records, the software identified 13.2 million records eligible for the study, based on requirements for age and data quality.  

From these, it identified 256,130 records coded for exposure before 18 years of age to abuse (physical, sexual, or emotional), neglect, or domestic abuse, recorded either by the GP or through codes relating to social services involvement. The control group consisted of 712,478 records, which were matched on age, sex, indices of deprivation, and GP surgery.

The records from these two cohorts were checked for diagnostic codes for immune-mediated inflammatory disorders (rheumatoid arthritis, psoriasis, coeliac disease, multiple sclerosis, inflammatory bowel disease, and systemic lupus erythematosus) over a 26-year period running from 1995 to 2021.  

The findings showed that the childhood maltreatment cohort had a significantly increased risk for both conditions (a sketch of how such rates are computed follows the list):
•    Rheumatoid Arthritis: 11.19/100,000 patient years vs 7.77/100,000 patient years (p=0.003)
•    Psoriasis: 152.79/100,000 patient years vs 128.74/100,000 patient years (p<0.001). 
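
For readers unfamiliar with the units, a rate per 100,000 patient-years divides new diagnoses by the total follow-up time the cohort contributes, then rescales. The sketch below shows that calculation with invented counts; it does not use the study's underlying case numbers.

```python
# Illustrative incidence-rate calculation (figures are invented, not the
# study's data): new cases divided by total follow-up time, scaled to
# 100,000 patient-years, for an exposed and an unexposed cohort.

def incidence_per_100k(new_cases: int, patient_years: float) -> float:
    return new_cases / patient_years * 100_000

# Hypothetical follow-up and case counts
exposed = incidence_per_100k(new_cases=250, patient_years=2_200_000)
unexposed = incidence_per_100k(new_cases=480, patient_years=6_100_000)

print(f"Exposed cohort:   {exposed:.2f} per 100,000 patient-years")
print(f"Unexposed cohort: {unexposed:.2f} per 100,000 patient-years")
print(f"Rate ratio:       {exposed / unexposed:.2f}")
```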

SCHRÖDINGER'S CAT

This metaphorical cat is both dead and alive – and it will help quantum engineers detect computing errors




University of New South Wales
This metaphorical cat has seven lives.

Credit: UNSW Sydney





UNSW engineers have demonstrated a well-known quantum thought experiment in the real world. Their findings deliver a new and more robust way to perform quantum computations – and they have important implications for error correction, one of the biggest obstacles standing between engineers and a working quantum computer.

Quantum mechanics has puzzled scientists and philosophers for more than a century. One of the most famous quantum thought experiments is that of “Schrödinger’s cat” – a cat whose life or death depends on the decay of a radioactive atom.

According to quantum mechanics, unless the atom is directly observed, it must be considered to be in a superposition – that is, being in multiple states at the same time – of decayed and not decayed. This leads to the troubling conclusion that the cat is in a superposition of dead and alive.

“No one has ever seen an actual cat in a state of being both dead and alive at the same time, but people use the Schrödinger’s cat metaphor to describe a superposition of quantum states that differ by a large amount,” says UNSW Professor Andrea Morello, leader of the team that conducted the research, published recently in the journal Nature Physics.

Atomic cat

For this research paper, Prof. Morello’s team used an atom of antimony, which is much more complex than standard ‘qubits’, or quantum building blocks.

“In our work, the ‘cat’ is an atom of antimony,” says Xi Yu, lead author of the paper.

“Antimony is a heavy atom, which possesses a large nuclear spin, meaning a large magnetic dipole. The spin of antimony can take eight different directions, instead of just two. This may not seem much, but in fact it completely changes the behaviour of the system. A superposition of the antimony spin pointing in opposite directions is not just a superposition of ‘up’ and ‘down’, because there are multiple quantum states separating the two branches of the superposition.”

This has profound consequences for scientists working on building a quantum computer using the nuclear spin of an atom as the basic building block.

“Normally, people use a quantum bit, or ‘qubit’ – an object described by only two quantum states – as the basic unit of quantum information,” says co-author Benjamin Wilhelm.

“If the qubit is a spin, we can call ‘spin down’ the ‘0’ state, and ‘spin up’ the ‘1’ state. But if the direction of the spin suddenly changes, we have immediately a logical error: 0 turns to 1 or vice versa, in just one go. This is why quantum information is so fragile.”

But in the antimony atom that has eight different spin directions, if the ‘0’ is encoded as a ‘dead cat’, and the ‘1’ as an ‘alive cat’, a single error is not enough to scramble the quantum code.

“As the proverb goes, a cat has nine lives. One little scratch is not enough to kill it. Our metaphorical ‘cat’ has seven lives: it would take seven consecutive errors to turn the ‘0’ into a ‘1’! This is the sense in which the superposition of antimony spin states in opposite directions is ‘macroscopic’ – because it’s happening on a larger scale, and realises a Schrödinger cat,” explains Yu.
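
A rough way to see the "seven lives" counting argument: if logical '0' and '1' are encoded in the two extreme projections of an eight-level (spin-7/2) nucleus, a single spin flip only connects neighbouring levels, so seven flips are needed to cross from one code word to the other. The sketch below uses a generic spin raising operator to make that counting explicit; it is an illustration only, not the team's actual control or error model.

```python
import numpy as np

# Spin-7/2 nucleus: 8 levels, m = -7/2 ... +7/2.
# Encode logical '0' in the lowest level and logical '1' in the highest.
# A single error (one quantum of spin flip) only connects neighbouring
# levels, so it takes 7 consecutive flips to turn '0' into '1'.
# This is an illustrative counting argument, not the experiment's model.

s = 7 / 2
m_values = np.arange(-s, s + 1)          # the 8 spin projections
dim = len(m_values)

# Raising operator S+ in the |m> basis: <m+1|S+|m> = sqrt(s(s+1) - m(m+1))
S_plus = np.zeros((dim, dim))
for i, m in enumerate(m_values[:-1]):
    S_plus[i + 1, i] = np.sqrt(s * (s + 1) - m * (m + 1))

state = np.zeros(dim)
state[0] = 1.0                           # logical '0' = m = -7/2

flips = 0
while state[-1] == 0.0:                  # until amplitude reaches m = +7/2
    state = S_plus @ state               # one spin-flip "error"
    state /= np.linalg.norm(state)       # keep the state normalised
    flips += 1

print(f"Single-spin-flip errors needed to turn '0' into '1': {flips}")  # -> 7
```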

Scalable technology

The antimony cat is embedded inside a silicon quantum chip, similar to the ones we have in our computers and mobile phones, but adapted to give access to the quantum state of a single atom. The chip was fabricated by UNSW’s Dr Danielle Holmes, while the atom of antimony was inserted in the chip by colleagues at the University of Melbourne.

“By hosting the atomic ‘Schrödinger cat’ inside a silicon chip, we gain an exquisite control over its quantum state – or, if you wish, over its life and death,” says Dr Holmes.

“Moreover, hosting the ‘cat’ in silicon means that, in the long term, this technology can be scaled up using similar methods as those we already adopt to build the computer chips we have today.”

The significance of this breakthrough is that it opens the door to a new way to perform quantum computations. The information is still encoded in binary code, ‘0’ or ‘1’, but there is more ‘room for error’ between the logical codes.

“A single, or even a few errors, do not immediately scramble the information,” Prof. Morello says.

“If an error occurs, we detect it straight away, and we can correct it before further errors accumulate. To continue the ‘Schrödinger cat’ metaphor, it’s as if we saw our cat coming home with a big scratch on his face. He’s far from dead, but we know that he got into a fight; we can go and find who caused the fight, before it happens again and our cat gets further injuries.”

The demonstration of quantum error detection and correction – a ‘Holy Grail’ in quantum computing – is the next milestone that the team will address.

The work was the result of a vast international collaboration. Several authors from UNSW Sydney, plus colleagues at the University of Melbourne, fabricated and operated the quantum devices. Theory collaborators in the USA, at Sandia National Laboratories and NASA Ames, and in Canada, at the University of Calgary, provided valuable ideas on how to create the cat and how to assess its complicated quantum state.

“This work is a wonderful example of open-borders collaboration between world-leading teams with complementary expertise,” says Prof. Morello.

ENDS


Left to right: UNSW researchers Benjamin Wilhelm, Xi Yu, Prof Andrea Morello, Dr Danielle Holmes

Credit: UNSW Sydney