Friday, June 26, 2020

UM researcher helps reveal changes in water of Canadian Arctic

THE UNIVERSITY OF MONTANA
IMAGE: Crew members deploy equipment onto the ice from a Canadian icebreaker, CCGS Louis S. St. Laurent, in the Arctic Ocean. Credit: Gary Morgan, Canadian Coast Guard
MISSOULA - Melting of Arctic ice due to climate change has exposed more sea surface to an atmosphere with higher concentrations of carbon dioxide. Scientists have long suspected this trend would raise CO2 in Arctic Ocean water.
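The expectation rests on basic gas-exchange chemistry. By Henry's law (a textbook relation, not a formula from the study), the carbon dioxide dissolved in surface seawater equilibrates toward the partial pressure of CO2 in the overlying air:

$$[\mathrm{CO_2(aq)}] = K_0 \cdot p\mathrm{CO_2},$$

where $K_0$ is the temperature-dependent solubility constant. Because $K_0$ is larger in cold water, newly ice-free Arctic surface water exposed to a higher-CO2 atmosphere takes up proportionally more of the gas.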
Now University of Montana researcher Michael DeGrandpre and his patented sensors have helped an international team determine that, indeed, CO2 levels are rising in water across wide swaths of the Arctic Ocean's Canada Basin. However, some areas have exhibited slower increases, suggesting other processes - such as biological uptake of CO2 - have counteracted expected increases.
The work was published this month in the journal Nature Climate Change. The study is online at https://www.nature.com/articles/s41558-020-0784-2.
DeGrandpre is a UM chemistry professor, and in 2015 he and the company he founded, Sunburst Sensors, won two coveted XPRIZE awards for developing inexpensive, durable sensors to better understand ocean acidification. Sunburst Sensor technology also was used in this recent study for a CO2 measurement system placed on board a Canadian icebreaker, the CCGS Louis S. St. Laurent.
IMAGE: University of Montana chemistry Professor Michael DeGrandpre poses with a research buoy deployed through the ice in the Canada Basin of the Arctic Ocean.
DeGrandpre said ocean measurements are taken while the icebreaker is underway, sometimes crashing through ice one to two meters thick. DeGrandpre and UM research associate Cory Beatty have participated in these research cruises since 2012 with support from the National Science Foundation Office of Polar Programs.
"Because of the inaccessibility of the Arctic and the typically harsh work conditions, we really need a world-class icebreaker to access these areas," DeGrandpre said. "It also has given us a high-quality, consistent dataset, which really helped with this latest study. Most Arctic CO2 datasets are from infrequent cruises that do not visit the same locations year to year."
He said the new study combines sporadic data dating back to 1994 with the more-frequent data they have collected since 2012. DeGrandpre said their consistent dataset will only improve, as NSF recently awarded them an $890,000 grant to continue the icebreaker project through 2023.
###

Sunnier but riskier

Conservation efforts to open up rattlesnake habitat bring in much-needed sunlight but could attract more threatening predators
PENN STATE


IMAGE: New results suggest that conservation efforts to open up overgrown snake habitat do provide more opportunities for pregnant timber rattlesnakes to reach temperatures necessary for embryos to develop, as intended... Credit: Christopher Camacho

UNIVERSITY PARK, Pa. -- Conservation efforts that open up the canopy of overgrown habitat for threatened timber rattlesnakes--whose venom is used in anticoagulants and other medical treatments--are beneficial to snakes but could come at a cost, according to a new study by researchers at Penn State and the University of Scranton. The researchers confirmed that breeding areas with more open canopies do provide more opportunities for these snakes to reach required body temperatures, but also have riskier predators like hawks and bobcats. The study, which appears in the June issue of the Journal of Herpetology, has important implications for how forest managers might open up snake habitat in the future.
Timber rattlesnakes are a species of conservation concern in Pennsylvania and are considered threatened or endangered in many of the northern states within their range. Like other ectothermic animals, snakes do not produce their own body heat and must move to warmer or cooler areas to regulate their temperature. Timber rattlesnakes typically use sunny, rocky forest clearings to breed; however, many of these "gestation sites" are becoming overgrown with vegetation, blocking much-needed sunlight.
"Pregnant timber rattlesnakes typically maintain a temperature 6 to 8 degrees Celsius higher than normal so that their embryos can develop," said Christopher Howey, assistant professor of biology at the University of Scranton and former postdoctoral researcher at Penn State. "If a gestation site doesn't provide enough opportunities for snakes to reach that temperature, a snake might abort its litter, or babies might be born too small or later in the season, which reduces their chances of obtaining an essential first meal before hibernation. We wanted to understand if existing conservation efforts to open up the canopy in gestation sites actually do provide more thermal opportunities for snakes, as intended, and if these efforts impact predation risk."
The research team first quantified thermal opportunities for rattlesnakes in known gestation sites that had open or closed canopies. They logged temperatures within thermal models--essentially copper tubes painted to match the reflectivity and heat absorbance of a snake--placed in areas where the researchers had seen snakes basking.
"As expected, we found that gestation sites with more open canopies did indeed provide more opportunities for snakes to reach optimal temperatures," said Tracy Langkilde, professor and head of biology at Penn State. "This confirms that conservation efforts to open up the canopy do what they are intended to do. But we also found that this might come at a cost, in the form of more threatening predators."
The research team also placed foam models painted like rattlesnakes at gestation sites and monitored for predators using trail game cameras--remote cameras that are triggered by movement. While there was a similar overall number of predators at sites with open canopies and closed canopies, the more threatening species--red-tailed hawks, fishers, and bobcats--only appeared at open sites.
"Our results suggest that there are tradeoffs to any management strategy and that by opening up a gestation site, we may inadvertently put more predation risk on a species," said Julian Avery, assistant research professor of wildlife ecology and conservation at Penn State. "Our models were slightly less visible to potential predators than actual snakes, so our estimates of predation risk are probably conservative, and the tradeoff may be more pronounced than what we observed."

Less threatening predators--raccoons and black bears--appeared at sites with both open and closed canopies.
"As far as we know, this is the first time that a black bear has been observed preying on a rattlesnake, or at least a model," said Howey. "Until now, we always thought that black bears avoided rattlesnakes, but we observed one bear attack two models and bite into a third."
The team suggests that forest managers should balance canopy cover and predation risk in future conservation efforts, for example by selectively removing the trees that block direct sunlight while leaving the broader canopy largely intact.
Improving conservation efforts at rattlesnake gestation sites is particularly important because, as far as the researchers know, snakes return to the same sites year after year to breed. If a gestation site decreases in quality, snakes might leave it to search for a new area, but it is unclear how successful such searches are, and moving to new sites could increase contact with humans.
The researchers are currently radio-tracking actual snakes and directly manipulating the canopy cover to better understand how snakes behave in response to predators at sites with open vs. closed canopies.
"Timber rattlesnakes are an important part of the ecosystem, and where you have more rattlesnakes, you tend to have lower occurrences of Lyme disease because the snakes are eating things like chipmunks and mice which are the main vectors for the disease," said Howey. "Rattlesnake venom is also used in anticoagulants, in blood pressure medicine, and to treat breast cancer. Our research will help us refine how we conserve these important animals."
###
In addition to Howey, Langkilde, and Avery, the research team includes Mark Herr, an undergraduate student at Penn State at the time of the research and currently a graduate student at the University of Kansas. This work was supported by the Society for the Study of Amphibians and Reptiles, the Pennsylvania Department of Conservation and Natural Resources, and the National Science Foundation.

America's political future will be shaped by aging, journal indicates

THE GERONTOLOGICAL SOCIETY OF AMERICA
The latest issue of the journal Public Policy & Aging Report (PP&AR) from The Gerontological Society of America shows how aging is reshaping politics today in unprecedented ways, and how it will continue to do so for years to come.
Titled "Building Momentum for a New Future in Politics and Aging," the journal highlights existing studies as well as recommended areas for further research.
"Here, we see how equal amounts of policy progress and stagnation, as well as changing cultural views on aging, are fueling social, economic, and political changes in ways both expected and not," wrote PP&AR Associate Editor Michael Lepore, PhD, in his introduction. "Some of these societal changes raise concerns about limitations in current and future generations' abilities to live well into old age, at younger ages with disabilities, and as caregivers, whereas other shifts, like evolving cultural views on aging, nurture a more intergenerationally just society."
Across the seven articles that follow, the journal offers insight into major trends in the politics of aging; how generational political divides are influencing aging-related policies; the impact of aging on the economy; political impediments to aging in place; the importance of support for family caregivers; the longevity and health of U.S. presidential candidates; and how to build momentum through the frames we use to describe aging.
"Recognizing and grappling with the relevance of aging to politics is an essential step to ensuring that we are not weaving the last thread of the American social fabric but, rather, beginning a new national era that embraces aging and fully supports our interpersonal and transgenerational interdependences," Lepore wrote. "By achieving these goals, living well into and throughout old age -- despite physically or cognitively disabling conditions -- will be increasingly possible for all."
###
Public Policy & Aging Report is a publication of the National Academy on an Aging Society, the policy institute of The Gerontological Society of America (GSA). As the nation's oldest and largest interdisciplinary organization devoted to research, education, and practice in the field of aging, GSA's principal mission -- and that of its 5,500+ members -- is to advance the study of aging and disseminate information among scientists, decision makers, and the general public.

New automotive radar spots hazards around corners

PRINCETON UNIVERSITY, ENGINEERING SCHOOL
Using radar commonly deployed to track speeders and fastballs, researchers have developed an automated system that will allow cars to peer around corners and spot oncoming traffic and pedestrians.
The system, easily integrated into today's vehicles, uses Doppler radar to bounce radio waves off surfaces such as buildings and parked automobiles. The radar signal hits the surface at an angle, so its reflection rebounds off it like a cue ball banking off the rail of a pool table. The signal goes on to strike objects hidden around the corner. Some of the radar signal bounces back to detectors mounted on the car, allowing the system to see objects around the corner and tell whether they are moving or stationary.
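The underlying Doppler physics is simple: a reflector moving with radial speed v shifts the returned radar frequency by f_d = 2 v f_c / c. Here is a minimal sketch of that relation -- an illustration, not the team's code; the 77 GHz carrier and the example shift are assumptions typical of automotive radar:

```python
# Two-way Doppler relation for radar: f_d = 2 * v * f_c / c.
# Illustrative values only -- not from the Princeton system.
C = 3.0e8         # speed of light, m/s
F_CARRIER = 77e9  # typical automotive radar carrier frequency, Hz (assumed)

def radial_velocity(doppler_shift_hz: float) -> float:
    """Radial speed of a reflector inferred from its Doppler shift."""
    return doppler_shift_hz * C / (2 * F_CARRIER)

# A pedestrian walking toward the sensor might produce a ~770 Hz shift:
print(f"{radial_velocity(770.0):.2f} m/s")  # -> 1.50 m/s
```

A stationary object produces no shift, which is how such a system separates moving hazards from static clutter.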
"This will enable cars to see occluded objects that today's lidar and camera sensors cannot record, for example, allowing a self-driving vehicle to see around a dangerous intersection" said Felix Heide, an assistant professor of computer science at Princeton University and one of researchers. "The radar sensors are also relatively low-cost, especially compared to lidar sensors, and scale to mass production."
In a paper presented June 16 at the Conference on Computer Vision and Pattern Recognition (CVPR), the researchers described how the system is able to distinguish objects including cars, bicyclists and pedestrians, and to gauge their direction and oncoming speed.
"The proposed approach allows for collision warning for pedestrians and cyclists in real-world autonomous driving scenarios -- before seeing them with exist direct line-of-sight sensors," the authors write.
In recent years, engineers have developed a variety of sensor systems that allow cars to detect other objects on the road. Many rely on lidar or on cameras using visible or near-infrared light, and such collision-prevention sensors are now common on modern cars. But optical sensing is difficult to use for spotting items outside the car's line of sight. In earlier research, Heide's team used light to see objects hidden around corners, but those efforts are not yet practical for cars because they require high-powered lasers and are restricted to short ranges.
In conducting that earlier research, Heide and his colleagues wondered whether it would be possible to detect hazards outside the car's line of sight using imaging radar instead of visible light. The signal loss at smooth surfaces is much lower for radar, and radar is a proven technology for tracking objects. The challenge is that radar's spatial resolution -- needed to picture objects around corners such as cars and bikes -- is relatively low. The researchers believed, however, that they could create algorithms to interpret the radar data and allow the sensors to function.
"The algorithms that we developed are highly efficient and fit on current generation automotive hardware systems," Heide said. "So, you might see this technology already in the next generation of vehicles."
To allow the system to distinguish objects, Heide's team processed part of the radar signal that standard radars consider background noise rather than usable information. The team applied artificial intelligence techniques to refine the processing and read the images. Fangyin Wei, a graduate student in computer science and one of the paper's lead authors, said the computer running the system had to learn to recognize cyclists and pedestrians from a very sparse amount of data.
"First we have to detect if something is there. If there is something there, is it important? Is it a cyclist or a pedestrian?" she said. "Then we have to locate it."
Wei said the system currently detects pedestrians and cyclists because the engineers felt those were the most challenging objects because of their small size and varied shape and motion. She said the system could be adjusted to detect vehicles as well.
Heide said the researchers plan to follow up the research in a number of directions, involving applications of both the radar itself and refinements in signal processing. He said the system has the potential to radically improve automotive safety, and because it relies on existing radar sensor technology, readying it for deployment in the next generation of automobiles should be possible.
"It would certainly go through the very rigorous automotive development cycles," he said. "In terms of integration and bringing it to market, it requires a lot of engineering. But the technology is there, so there is the potential for seeing this very soon in vehicles."
###
Besides Heide and Wei, the paper's authors include: Jürgen Dickmann, Florian Krause, Werner Ritter, and Nicolas Schiener of Mercedes-Benz AG; Buu Phan and Fahim Mannan of Algolux; Klaus Dietmayer of Ulm University; and Bernard Sick of the University of Kassel. Support for the research was provided in part by the European Union's H2020 ECSEL program.

Consumers can distinguish between bitter tastes in beer -- but it doesn't alter liking

PENN STATE
Although most beer consumers can distinguish between different bitter tastes in beer, this does not appear to influence which beer they like. It seems they just like beer, regardless of the source of the bitterness.
That is the conclusion of Penn State sensory researchers who conducted multiple studies with more than 150 self-identified beer drinkers to see if they could differentiate bitterants in beer. But the question of whether humans can discriminate between types of bitterness remains controversial, according to researcher John Hayes, associate professor of food science.
"Given that countless craft breweries around the country have been very successful in selling a near-endless variety of India pale ales -- better known as IPAs -- we wanted to move past testing bitter chemicals in water to see if consumers could differentiate different bitters in a real food such as beer," he said.
To determine beer drinkers' ability to distinguish between bitter chemicals, study participants in blind taste tests were given commercially available nonalcoholic beer spiked with hop extract Isolone, quinine -- the ingredient that makes tonic water bitter -- and sucrose octaacetate, a food additive so bitter it has been used as a nail-biting and thumb-sucking deterrent.
Participants, about half men and half women, most in their 30s, took part in three experiments. In the first, researchers asked subjects to rate the amount of bitterness and other beer flavor attributes in samples using an intensity scale, to ensure the beer samples were equally bitter.
In the next experiment, beer consumers rated how samples differed from a reference on a seven-point scale. Then, to understand how each sample differed from others, participants checked attributes on a list of 13 descriptors to describe the samples.
In the final experiment, beer consumers tasted the beer samples, rated how much they liked each sample and provided a forced-choice ranking from best-liked to worst-liked.
According to Hayes, who is director of Penn State's Sensory Evaluation Center in the College of Agricultural Sciences, most participants were able to discern differences in bitterness -- even though the samples had been matched for bitterness intensity.
"But our results also show that, despite being able to differentiate between the different bitter chemicals, they were not able to verbally describe these differences, even when provided a list of attributes," he said. "Further, we found no consistent effect on liking or preference. The source of bitterness did not influence which beers they liked."
In the sampled beers, researchers attempted to match the flavor profile of a pale ale style beer, in which high bitterness is not only accepted but desired by consumers, noted lead researcher Molly Higgins, who will receive her doctoral degree in food science this August. Higgins explained that she recruited regular beer consumers because they are more likely to be aware of the various flavor profiles of beer and respond positively to the bitter qualities of samples during testing.
"What we found was unsurprising in hindsight -- beer consumers simply like beer," she said. "So, it seems that for consumers who drink IPAs, a beer just needs to have a bitter profile. For them, it's about bitterness in general, not the specific bitter quality -- if it's there, they will like it."
Higgins suggests that this finding may help in quality assurance at breweries. "Beer consumers may be more forgiving than previously believed when it comes to small variations across batches," she said.
Higgins noted that some breweries use highly trained expert tasters to evaluate each batch. If these experts detect any off notes or flaws in the final product, they may throw out an entire batch. "When breweries can establish an acceptable range for sensory attributes for their final products, they can make better decisions about how much variation is tolerable," she said.
However, there are many segments of beer consumers, Higgins added, and within the craft beer market there is a unique subgroup of consumers who are devoted to their IPAs. Those beer drinkers, she explained, doubtlessly pick up on more of the finer bitter notes created by novel blends of hops. Those consumers patronize craft breweries and are willing to try many different beers.
The bitter beer tasting study, recently published in Nutrients, was part of a larger research project conducted by Higgins at Penn State for her dissertation. Because of its sensory complexity and wide acceptance by many consumers, she contends, beer is a good model food to explore the capacity of people to perceive bitter taste.
Higgins said when people ask her why she would do this kind of a study, she points out that it's not about beer.
"The overall goal of my dissertation research was to learn more about bitterness and bitterness perception, and to better understand how individuals learn to like bitter products," she said. "We hope that understanding bitterness can guide further research that helps people incorporate healthy bitter foods into their diet. The overall goal is to look at more complex bitter foods, such as kale and broccoli, and figure out ways to increase their consumption and liking."
###
The Agriculture and Food Research Initiative administered by the U.S. Department of Agriculture's National Institute of Food and Agriculture supported this work.

Voter ID laws discriminate against racial and ethnic minorities, new study reveals

Minority voter turnout declines in states where voter ID laws are enacted
UNIVERSITY OF CALIFORNIA - SAN DIEGO
IMAGE: Contrary to previous studies on voter ID laws, the researchers used actual voter turnout data rather than surveys gauging attitudes toward voting. Credit: SDI Productions
Voter ID laws are becoming more common and more strict, and the stakes for American democracy are high and growing higher by the year. New research from the University of California San Diego provides evidence that voter ID laws disproportionately reduce voter turnout in more racially diverse areas. As a result, the voices of racial minorities become more muted and the relative influence of white America grows.
In a study published in the journal Politics, Groups, and Identities, researchers focused on turnout changes across the 2012 and 2016 presidential elections in states that had recently passed strict photo voter ID laws -- Alabama, Mississippi, Virginia and Wisconsin -- and compared those changes to other states with similar racial compositions that had not passed such laws. They found that the turnout gap between white counties and racially diverse counties grew more within states enacting new strict photo ID laws.
Such results lead to "an already significant racial skew in American democracy growing even more pronounced," according to the authors. Unlike previous studies on voter ID laws, the researchers used actual voter turnout data rather than surveys gauging attitudes toward voting.
"By using official turnout data, we eliminate concerns over inflated or biased turnout patterns from self-reported surveys," said co-author Zoltan Hajnal, a professor of political science at the UC San Diego School of Global Policy and Strategy. "This analysis provides more precise evidence that strict voter ID laws appear to discriminate."
The researchers define a strict voter identification law as any electoral law that requires voters to present identification before their ballot will be officially counted. Currently, 36 states have voter ID laws that at a minimum request identification, and 11 of these states have strict laws requiring it--all passed since 2000. Since the study was completed, four additional states--Kentucky, North Carolina, Arkansas and North Dakota--have passed strict voter ID laws; Kentucky's legislation passed in April.
In swing states, a decline among non-white voters can have major electoral impacts
To determine whether the implementation of strict photo ID laws discriminates, the authors looked to see if turnout in racially diverse counties declined relative to turnout in predominantly white counties more in states enacting strict voter ID laws than in states not enacting such laws.
To gauge this, the researchers used data from official county-level aggregate vote totals for all 3,142 counties in the U.S. They added census data on the racial and ethnic breakdown of the voting age population by county.
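In sketch form, that comparison amounts to a difference-in-differences regression interacting county diversity with strict-ID adoption. The following is a minimal illustration, not the authors' code; the data file and column names are hypothetical:

```python
# A minimal sketch of the county-level comparison described above.
# Hypothetical file and column names -- not the authors' code or data.
import pandas as pd
import statsmodels.formula.api as smf

counties = pd.read_csv("county_turnout.csv")  # hypothetical dataset

# Turnout change between the 2012 and 2016 presidential elections.
counties["turnout_change"] = counties["turnout_2016"] - counties["turnout_2012"]

# Interaction term: did turnout fall more in racially diverse counties
# within states that adopted strict photo ID laws?
model = smf.ols("turnout_change ~ nonwhite_share * strict_id_state",
                data=counties).fit()
print(model.summary())  # a negative interaction indicates a growing gap
```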
The findings revealed that when these laws are enacted, turnout in racially diverse counties declines more than in less diverse areas and more sharply than it does in other states.
"As the share of counties' non-whites increases, so does the negative impact of strict ID laws,"
Hajnal and co-authors write. "For example, voter turnout in counties with a 75 percent non-white population declines 1.5 points more in states that just adopted strict ID laws than in states that didn't implement a strict law."
The authors added, "Given that the margin of victory in Wisconsin in the 2016 Presidential election was only 0.77 percentage points, this is a meaningful effect."
Implications for the courts, which have served as the primary battleground over these laws
Proponents of voter ID laws argue they are necessary to reduce voter fraud and instill greater legitimacy in the democratic process. Critics counter that racial and ethnic minorities are less likely than whites to have ready access to valid identification.
The authors point out that no two voter ID laws are identical, and laws in different states may be targeting different groups. For example, North Dakota's strict law requires an ID with a residential street address, which may disproportionately target and impact Native Americans who live on reservations without an official street address.
By contrast, Texas's initial law allowed residents to use a concealed carry gun license but not a state issued student ID--a pattern that critics felt favored whites and disproportionately impacted Black and Hispanic residents.
The researchers noted that in many ways, the courts have served as the primary battleground over these laws. Almost every strict ID law has been challenged in the courts.
In past proceedings, the courts' rulings have appeared to rest more than anything else on the balance between the burden that these laws pose on racial and ethnic minorities and the state's interest in the integrity of the electoral process. And that balance often seems to rest on the weight of the empirical evidence about the burden these laws pose to minorities.
"If courts are indeed trying to gauge the burden these laws impose on minorities and others, then this new data should help the courts with their deliberations," the authors write.
In conclusion, they note "this research is an effort to expose inequalities and discrimination in the political system that will hopefully lead to awareness of those inequalities and even more so, to changes in the laws that could reduce them."
###

Helping consumers in a crisis

'Quantitative easing' program let households spend more during the last recession; could it work again?
MASSACHUSETTS INSTITUTE OF TECHNOLOGY
A new study shows that the central bank tool known as quantitative easing helped consumers substantially during the last big economic downturn -- a finding with clear relevance for today's pandemic-hit economy.
More specifically, the study finds that one particular form of quantitative easing -- in which the U.S. Federal Reserve purchased massive amounts of mortgage-backed securities -- drove down mortgage interest rates, allowed consumers to refinance their house loans and spend more on everyday items, and in turn bolstered the economy.
"Quantitative easing has a really big effect, but it does matter who it targets," says Christopher Palmer, an MIT economist and co-author of a recently published paper detailing the results of the study.
All told, the study finds, the Fed's so-called QE1 phase from late November 2008 through March 2010, a part of the larger quantitative easing program, generated about $600 billion in mortgage refinancing at lower interest rates, bringing about $76 billion worth of additional spending back into the broader economy.
However, as the study also demonstrates, the people benefitting from QE1 were a relatively circumscribed group of mortgage holders: borrowers whose loans were backed by the government-sponsored enterprises (GSEs) Fannie Mae and Freddie Mac. So while observers may talk about quantitative easing as a "helicopter drop" of money, scattered across the public, the Fed's previous interventions were relatively targeted. Recognizing that fact could shape policy decisions in the future.
"It's not like the Fed drops money from a helicopter and then it lands randomly and uniformly and equally across the population, and people pick up those dollars and spend money and are off to the races," Palmer says. "The Fed intervenes in specific ways, and specific people benefit."
The paper, "How Quantitative Easing Works: Evidence on the Refinancing Channel," is published in the latest issue of the Review of Economic Studies. The authors are Marco Di Maggio, an associate professor at Harvard Business School; Amir Kermani, an associate professor at the Haas School of Business at the University of California at Berkeley; and Palmer, the Albert and Jeanne Clear Career Development Assistant Professor at the MIT Sloan School of Management.
Mortgage relief for some
The introduction of quantitative easing during the Great Recession was a notable expansion of the tools used by central banks. Rather than limiting its holdings to treasury securities, the U.S. Federal Reserve's purchase of mortgage-backed securities -- bonds backed by home loans -- gave it more scope to boost the economy, by lowering interest rates in another area of the bond market.
The first round of quantitative easing, QE1, which began in November 2008, included $1.25 trillion in mortgage purchases. The second round, QE2, which started in November 2010, focused exclusively on treasury securities. The third round, QE3, was initiated in September 2012 and was a combination of mortgage and treasury security purchases.
To conduct the study, the researchers drew heavily on a database from Equifax, the giant consumer credit reporting agency, which includes detailed individual-level information about mortgages. That includes the size of individual loans, their interest rates, and other liabilities. The database covered about 65 percent of the mortgage market.
"It basically allowed us to trace the flow of Fed mortgage purchases down to individual households -- we could see who was refinancing when the Fed was intervening to make interest rates lower," Palmer says.
The study found that refinancing activity increased by about 170 percent during QE1, with interest rates dropping from about 6.5 percent to 5 percent. However, the Fed's purchasing activity was highly focused on "conforming" mortgages -- those fitting the guidelines of the GSEs, which often mandate having loans cover no more than 80 percent of a home's value.
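To see what a drop of that size means for a single borrower, consider a back-of-the-envelope sketch using the standard fixed-rate amortization formula. This is an illustration only, not a calculation from the paper; the $250,000 balance is an assumption:

```python
# Back-of-the-envelope illustration of the QE1-era rate drop described
# above. The loan balance is an assumed figure, not one from the paper.
def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
    """Standard fixed-rate mortgage amortization formula."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

loan = 250_000
before = monthly_payment(loan, 0.065)  # pre-QE1 rate, ~6.5 percent
after = monthly_payment(loan, 0.050)   # post-QE1 rate, ~5 percent
print(f"at 6.5%: ${before:,.0f}/mo; at 5.0%: ${after:,.0f}/mo; "
      f"savings: ${before - after:,.0f}/mo")  # roughly $240 a month
```

Savings on that order, multiplied across millions of refinanced conforming loans, illustrate the household-level channel behind the $76 billion spending figure the study reports.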
With the Fed not aiming its resources at nonconforming mortgages, much less refinancing occurred from people with those kinds of home loans.
"We saw a really big difference in who seemed like they were getting credit during quantitative easing," Palmer says.
That means QE1 bypassed many people who needed it the most. Consumers with nonconforming mortgages, on aggregate, were in worse financial straits than people who could put more equity into their homes initially.
Checking the data geographically, the researchers also found that much less refinancing occurred in the "sand states" where a huge number of subprime, nonconforming mortgages were issued -- especially Florida, Arizona, Nevada, and the Inland Empire region of California.
"People who are outside the conforming mortgage system are often those who need help the most, whether that's because their loan size is too big, or their equity is too small, or their credit score is too low," Palmer says. "They often needed the stimulus most and yet couldn't get it because credit was too tight."
Take it easy
Given both the success and targeted nature of QE1, Palmer suggests that future interventions could be broadened.
"One of our takeaways is that if the Fannie and Freddie requirements can be temporarily loosened, then Federal reserve QE purchases can do a lot more good, because they can reach more borrowers," Palmer says.
More broadly, surveying the economic landscape as the Covid-19 pandemic continues, Palmer says we should continue to examine how central banks can provide relief, and to whom. With interest rates very low, the U.S. Federal Reserve cannot offer much broad relief by adjusting rates. More help may come from efforts like the Main Street Lending Program facilitated by the CARES Act, which runs through September.
"When credit markets get locked up, there's less opportunity for your local restaurant or auto-body garage or toy store to take advantage of the fact that the interest rates are lower," Palmer says. Instead, targeted programs are "really an attempt to focus the monetary stimulus directly from the Fed to the people who need it."
To be sure, consumers gaining credit relief may not be as willing to spend right now as they were in 2008 or 2010. But given the economic struggles of 2020, freeing up any additional spending would be productive, Palmer says.
"If people can refinance right now, they're probably not going on shopping sprees," he says. "But there is still a lot of consumption happening that is very valuable."
###
Written by Peter Dizikes, MIT News Office
Paper: "How Quantitative Easing Works: Evidence on the Refinancing Channel"

Most massive quasar known in early universe discovered on Maunakea

The second-most distant quasar ever discovered now has a Hawaiian name
W. M. KECK OBSERVATORY
IMAGE: An artist's impression of the formation of quasar Pōniuā`ena, starting with a seed black hole, 100 million years after the Big Bang. Credit: International Gemini Observatory/NOIRLab/NSF/AURA/P. Marenfeld
Maunakea, Hawai'i - Astronomers have discovered the second-most distant quasar ever found using three Maunakea Observatories in Hawai'i: W. M. Keck Observatory, the international Gemini Observatory, a Program of NSF's NOIRLab, and the University of Hawai'i-owned United Kingdom Infrared Telescope (UKIRT). It is the first quasar to receive an indigenous Hawaiian name, Pōniuā`ena, which means "unseen spinning source of creation, surrounded with brilliance" in the Hawaiian language.
Pōniuā`ena is only the second quasar yet detected at a cosmological redshift greater than 7.5, and it hosts a black hole twice as massive as that of the other quasar known from the same era. The existence of these massive black holes at such early times challenges current theories of how supermassive black holes formed and grew in the young universe.
The research has been accepted in the Astrophysical Journal Letters and is available in preprint format on arXiv.org.
Quasars, powered by supermassive black holes, are the most energetic objects in the universe, and since their discovery astronomers have been keen to determine when they first appeared in our cosmic history. By systematically searching for these rare objects in wide-area sky surveys, astronomers discovered the most distant quasar (named J1342+0928) in 2018 and now the second-most distant, Pōniuā`ena (or J1007+2115, at redshift 7.515). The light seen from Pōniuā`ena traveled through space for over 13 billion years after leaving the quasar just 700 million years after the Big Bang.
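To illustrate what that redshift means for observers (a standard relation, not a calculation from the paper), the quasar's light is stretched by a factor of $1 + z$:

$$\lambda_{\mathrm{obs}} = (1 + z)\,\lambda_{\mathrm{rest}} = 8.515 \times 121.6\ \mathrm{nm} \approx 1.04\ \mu\mathrm{m},$$

so Pōniuā`ena's Lyman-alpha emission, emitted in the far ultraviolet, arrives at Earth in the near-infrared -- which is why near-infrared spectrographs were needed to confirm the discovery.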
Spectroscopic observations from Keck Observatory and Gemini Observatory show the supermassive black hole powering Pōniuā`ena is 1.5 billion times more massive than our Sun.
"Pōniuā`ena is the most distant object known in the universe hosting a black hole exceeding one billion solar masses," said Jinyi Yang, a postdoctoral research associate at the Steward Observatory of the University of Arizona and lead author of the study.
For a black hole of this size to form this early in the universe, it would need to start as a 10,000 solar mass "seed" black hole about 100 million years after the Big Bang, rather than growing from a much smaller black hole formed by the collapse of a single star.
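The arithmetic behind that statement follows from a standard Eddington-limited growth estimate (a textbook calculation, not one taken from the paper). A black hole accreting at its maximum sustainable rate grows exponentially,

$$M(t) = M_{\mathrm{seed}}\, e^{t/t_{\mathrm{Sal}}}, \qquad t_{\mathrm{Sal}} \approx 45\ \mathrm{Myr},$$

where $t_{\mathrm{Sal}}$ is the Salpeter e-folding time for a 10 percent radiative efficiency. Growing from a 10,000 solar mass seed to 1.5 billion solar masses requires $\ln(1.5 \times 10^5) \approx 12$ e-foldings, or roughly 540 million years of uninterrupted accretion at that limit -- nearly the entire interval between 100 million and 700 million years after the Big Bang.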
"How can the universe produce such a massive black hole so early in its history?" said Xiaohui Fan, Regents' professor and associate department head of the Department of Astronomy at the University of Arizona. "This discovery presents the biggest challenge yet for the theory of black hole formation and growth in the early universe."
Current theory holds the birth of stars and galaxies as we know them started during the Epoch of Reionization, beginning about 400 million years after the Big Bang. The growth of the first giant black holes is thought to have occurred during that same era in the universe's history.
The discovery of quasars like Pōniuā`ena, deep into the reionization epoch, is a big step towards understanding this process of reionization and the formation of early supermassive black holes and massive galaxies. Pōniuā`ena has placed new and important constraints on the evolution of the matter between galaxies (intergalactic medium) in the reionization epoch.
"Pōniuā`ena acts like a cosmic lighthouse. As its light travels the long journey towards Earth, its spectrum is altered by diffuse gas in the intergalactic medium which allowed us to pinpoint when the Epoch of Reionization occurred," said co-author Joseph Hennawi, a professor in the Department of Physics at the University of California, Santa Barbara.
METHODOLOGY
Yang's team first detected Pōniuā`ena as a possible quasar after combing through large area surveys such as the UKIRT Hemisphere Survey and data from the University of Hawai'i Institute for Astronomy's Pan-STARRS1 telescope on the Island of Maui.
In 2019, the researchers observed the object using Gemini Observatory's GNIRS instrument as well as Keck Observatory's Near Infrared Echellette Spectrograph (NIRES) to confirm the existence of Pōniuā`ena.
"The preliminary data from Gemini suggested this was likely to be an important discovery. Our team had observing time scheduled at Keck just a few weeks later, perfectly timed to observe the new quasar using Keck's NIRES spectrograph in order to confirm its extremely high redshift and measure the mass of its black hole," said co-author Aaron Barth, a professor in the Department of Physics and Astronomy at the University of California, Irvine.
In honor of its discovery from atop Maunakea, 30 Hawaiian immersion school teachers named the quasar Pōniuā`ena through the 'Imiloa Astronomy Center of Hawai'i's A Hua He Inoa program led by renowned Hawaiian language expert Dr. Larry Kimura.
"We recognize there are different ways of knowing the universe," said John O'Meara, chief scientist at Keck Observatory. "Pōniuā`ena is a wonderful example of interconnectedness between science and culture, with shared appreciation for how different knowledge systems enrich each other."
"I am extremely grateful to be a part of this educational experience - it is a rare learning opportunity," said Kau'i Kaina, a high school Hawaiian immersion teacher from Kahuku, O'ahu who was involved in the naming workshop. "Today it is relevant to apply these cultural values in order to further the well-being of the Hawaiian language beyond ordinary contexts such as in school, but also to ensure the language lives throughout the universe."
ABOUT NIRES
The Near Infrared Echellette Spectrograph (NIRES) is a prism cross-dispersed near-infrared spectrograph built at the California Institute of Technology by a team led by Chief Instrument Scientist Keith Matthews and Prof. Tom Soifer. Commissioned in 2018, NIRES covers a large wavelength range at moderate spectral resolution for use on the Keck II telescope and observes extremely faint red objects found with the Spitzer and WISE infrared space telescopes, as well as brown dwarfs, high-redshift galaxies, and quasars. Support for this technology was generously provided by the Mt. Cuba Astronomical Foundation.
ABOUT W. M. KECK OBSERVATORY
The W. M. Keck Observatory telescopes are among the most scientifically productive on Earth. The two 10-meter optical/infrared telescopes on the summit of Maunakea on the Island of Hawai'i feature a suite of advanced instruments including imagers, multi-object spectrographs, high-resolution spectrographs, integral-field spectrometers, and world-leading laser guide star adaptive optics systems.
Some of the data presented herein were obtained at Keck Observatory, which is a private 501(c)(3) non-profit organization operated as a scientific partnership among the California Institute of Technology, the University of California, and the National Aeronautics and Space Administration. The Observatory was made possible by the generous financial support of the W. M. Keck Foundation.
The authors wish to recognize and acknowledge the very significant cultural role and reverence that the summit of Maunakea has always had within the Native Hawaiian community. We are most fortunate to have the opportunity to conduct observations from this mountain.
For more information, visit http://www.keckobservatory.org.
###