Tuesday, March 16, 2021

COUNT ME IN #ABOLISHMONARCHY

Over half of Canadians say monarchy is obsolete after Harry and Meghan's interview: Poll



OTTAWA — A new poll suggests just over half of Canadians believe the British monarchy is a relic that Canada should abandon, following Prince Harry and Meghan's explosive interview with Oprah Winfrey.



Fifty-three per cent of respondents to an online survey by Leger and the Association for Canadian Studies say the British monarchy no longer has its place in 21st-century Canada, while one-third say they would rather preserve this part of our heritage.

Leger executive vice-president Christian Bourque said the interview — and how Canadians are responding to its revelations — should be considered a blow for the monarchy and those who believe in the importance of the role it plays in Canada.

"I think this probably would not have been true a few weeks ago," Bourque said.

The poll also found that 59 per cent of respondents sympathize more with Harry and Meghan, while 26 per cent sympathize more with the Royal Family.

The online poll of 1,512 adult Canadians was carried out March 12 to 14. It cannot be assigned a margin of error because internet-based surveys are not considered random samples.

Fifty-two per cent say the recent events involving the couple, which included the Duchess of Sussex divulging that she had been driven to thoughts of suicide and that a member of the Royal Family had asked how dark her son Archie's skin might be, speak to a fundamental problem with the institution.

Forty-three per cent of respondents say the recent events show the Royal Family holds racist views, which Bourque said is damaging to its reputation.


The negative view of the monarchy was higher in Quebec, where 71 per cent of respondents said it is out of date, which Bourque said is not surprising.

"Even if you exclude the Quebec numbers, you still get about half of Canadians who say basically do we really need the Royal Family in Canada," he said.

An earlier poll of 2,122 adult Canadians carried out from Feb. 5 to 7 had 46 per cent of respondents saying the monarchy is outdated and that Canada should get rid of it, so the numbers are slightly higher after the interview with Harry and Meghan.

In the more recent poll, Canadians appear divided on what could replace the monarchy.


Thirty-six per cent of respondents said they would prefer the prime minister be the head of state, with no other representative such as the governor general. Sixteen per cent said they would like Canada to be a republic with an elected president and 20 per cent said they would like to keep the existing arrangement.

This report by The Canadian Press was first published March 16, 2021.

This story was produced with the financial assistance of the Facebook and Canadian Press News Fellowship.


MONOPOLY CAPITALISM
Rogers-Shaw merger will 'only help big telecoms profit more on the backs of Canadians,' NDP says

Anja Karadeglija 
3/15/2021

Photo: NDP Leader Jagmeet Singh is critical of the proposed Rogers-Shaw merger. (National Post)


Allowing a $26-billion merger between Canada’s two largest cable companies will “only help big telecom companies profit more on the backs of Canadians,” NDP leader Jagmeet Singh said after Rogers Communications and Shaw Communications announced their deal on Monday.

Approval of the merger would see Rogers, one of Canada’s big three wireless providers, acquire Shaw’s Freedom Mobile, whose presence in recent years as the upstart fourth competitor in British Columbia, Alberta and Ontario has driven competition in those provinces.

“Big telecom companies are gouging Canadians and continuing to make massive profits in a time where most families are struggling to get by. A merger between two of Canada’s biggest providers will just make it worse,” Singh said in a statement.

Allowing the acquisition would be a reversal of federal governments’ pro-competition wireless policies dating back to Stephen Harper’s Conservative government, which promoted a fourth wireless competitor to the market dominance of Bell, Telus and Rogers. In the last federal election, the Liberals’ campaign promises included lowering wireless prices.

Pierre Poilievre, the Conservative jobs and industry critic, said in a statement the proposed merger has a “number of considerations that must be studied. Conservatives will be reviewing this deal to ensure it helps create competition, affordability for Canadians, and jobs.”

The merger has three regulatory approvals to go through before it can be finalized — at the CRTC, at Innovation Canada and through the Competition Bureau.






Innovation Minister François-Philippe Champagne said in his own statement that greater affordability and competition would be central “in analyzing the implications of today’s news,” but that the government wouldn’t “presuppose” the outcome of those regulatory approval processes.

In the 2019 election, the Liberals promised to lower wireless prices by 25 per cent. A year ago, the government announced that promise would apply only to “mid-range” plans of 2 to 6 GB per month, and the price drop would have to happen within two years. Access-to-information documents later revealed the government expected the prices of those plans to fall organically by that date, without government intervention, as Canadians moved to higher-value plans not covered by the price drop promise.

The government said at the time they chose not to include higher-bucket data plans because prices for those plans had already fallen.

Freedom Mobile has previously been credited for driving down prices in the overall market. For instance, in 2017, it began offering 10 GB plans for $50, leading the big three to lower their prices and match that offer.

In its press release announcing the deal, Rogers pledged not to raise the prices of current Freedom Mobile customers for three years. It also promised to launch a new $1-billion fund to improve connectivity in rural, remote and Indigenous communities, expand its low-cost internet offerings for seniors and low-income individuals, and invest billions in its 5G network buildout. The company declined a request for an interview Monday.

After the COVID-19 pandemic began, the Liberal government began prioritizing rollouts of rural broadband connectivity, with former innovation minister Navdeep Bains telling a Parliamentary committee in November there needs to be a “balancing act” between affordability and investment. Telecom companies, including Shaw and Rogers, have been arguing that low wholesale rates for internet service would harm their investments in rural networks.

John Lawford, executive director of the consumer advocacy group Public Interest Advocacy Centre, said the federal government is now “at the mercy of the telecom providers” when it comes to rolling out those rural networks, the need for which has intensified as work, school, and even health care has moved online.


The only ones who can deliver it quickly are the biggest telecoms, Lawford said, a situation that gives them leverage over the government and makes this a good time to try for a merger.

Ben Klass, a PhD candidate at Carleton University whose research focuses on the Canadian telecom industry, noted that Freedom Mobile's acquisition by one of the big wireless providers would have a bigger impact than past buyouts of smaller competitors such as Public Mobile and Mobilicity. Freedom Mobile currently has 1.9 million customers.

“This isn’t going to be a small thing that people don’t understand or notice,” he said in an interview. “Shaw’s like the pillar in English speaking Canada of the government competition policy in mobile.”

Without Freedom, “there won’t be any downward pressure” on prices from the big three at all, Lawford argued.


Of the three regulatory reviews, the Competition Bureau process, which looks at whether a deal could result in a substantial lessening or prevention of competition, is likely to be the most significant. The Competition Bureau could impose conditions on the deal, such as the sale of some wireless assets, though the market doesn’t currently have an obvious buyer.

Klass said how likely the government is to oppose Rogers’ acquisition could depend on how the issue plays with potential voters.

“I think that that depends on their perception of the public’s reaction to it,” he said. “If we are going into an election and this becomes something that captures the public’s attention, it might be something that could play a part.”

Ultimately, the government may not have final say over the decision, because the Competition Bureau is an independent agency. Its decisions can be appealed to the courts, but there is no way for the government itself to change or overturn them, Lawford said.

 

Whispers from the dark side: What can gravitational waves reveal about dark matter?

Scientists of the PRISMA+ Cluster of Excellence analyze data from the gravitational wave observatory NANOGrav

JOHANNES GUTENBERG UNIVERSITAET MAINZ

Research News

The NANOGrav Collaboration recently captured the first signs of very low-frequency gravitational waves. Prof. Pedro Schwaller and Wolfram Ratzinger analyzed the data and, in particular, considered the possibility of whether this may point towards new physics beyond the Standard Model. In an article published in the journal SciPost Physics, they report that the signal is consistent with both a phase transition in the early universe and the presence of a field of extremely light axion-like particles (ALPs). The latter are considered as promising candidates for dark matter.

Gravitational waves open a window into the early universe. While the ubiquitous cosmic microwave background yields no clues about the first 300,000 years of our universe, gravitational waves can provide glimpses of what happened during the Big Bang. "It's exactly this very early universe that is so exciting for particle physicists," explains Pedro Schwaller, Professor of Theoretical Physics at the PRISMA+ Cluster of Excellence at Johannes Gutenberg University Mainz (JGU). "This is the time when the elementary particles like quarks and gluons are present, and then combine to form the building blocks of atomic nuclei."

The special thing about the gravitational waves that the NANOGrav Collaboration has detected for the first time is their very low frequency of 10⁻⁸ hertz, which equates to approximately one oscillation per year. Because of their correspondingly large wavelength, any detector would have to be comparably large. Since such a detector cannot be built on Earth, the astronomers at NANOGrav use distant pulsars and their light signals as a huge detector.
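As an illustrative back-of-envelope check (the calculation below is ours, not from the NANOGrav release), the wavelength of a 10⁻⁸ Hz wave travelling at the speed of light shows why no Earth-based detector could ever span it:

```python
# Rough wavelength estimate for a very low-frequency gravitational wave.
# Gravitational waves propagate at the speed of light, so wavelength = c / f.
C = 299_792_458.0          # speed of light, m/s
LIGHT_YEAR_M = 9.4607e15   # metres in one light-year

def gw_wavelength_m(freq_hz: float) -> float:
    """Wavelength (in metres) of a wave travelling at the speed of light."""
    return C / freq_hz

wl = gw_wavelength_m(1e-8)
print(f"wavelength ~ {wl:.2e} m ~ {wl / LIGHT_YEAR_M:.1f} light-years")
# A wavelength of roughly 3e16 m (a few light-years) dwarfs any terrestrial
# instrument, which is why NANOGrav uses a galaxy-scale array of pulsars.
```

The pulsar timing array effectively turns the distances between Earth and dozens of pulsars into the arms of a detector large enough to register such waves.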

Wolfram Ratzinger outlines the motivation behind their work: "Even though so far the data only provides us with a first hint of the existence of low-frequency gravitational waves, it is still very exciting for us to work with them. This is because such waves could be produced by various processes that occurred in the early universe. We can now use the data we already have to decide which of these come into consideration and which do not fit the data at all."

As a result, the Mainz-based scientists decided to take a particularly close look at two scenarios that could have caused the observed gravitational waves: phase transitions in the early universe and a dark matter field of extremely light axion-like particles (ALPs). Phase transitions such as these occur as the temperature of the primordial soup falls after the Big Bang and result in massive turbulence; however, like dark matter, they are not covered by the Standard Model.

Based on the data available, Pedro Schwaller and Wolfram Ratzinger interpret the results of their analysis with relative caution: "Perhaps slightly more probable is the early phase transition scenario." On the other hand, the two physicists believe that the fact that they are able to work out certain possibilities based only on limited data proves the potential of their approach. "Our work is a first, but important development - it gives us a lot of confidence that with more precise data we can draw reliable conclusions about the message gravitational waves are sending us from the early universe."

"Furthermore," Pedro Schwaller concludes, "we can already begin to pin down certain characteristics of the scenarios and put constraints on them, in our case the strength of the phase transition and the mass of the axions."


 

Lab studies of emotion and well-being may be missing real-world anxiety

Test participants' natural level of anxiety may cloud findings of laboratory psychology studies

DUKE UNIVERSITY

Research News

DURHAM, N.C. - For decades, psychologists' study of emotional health and well-being has involved contrived laboratory experiments and self-report questionnaires to understand the emotional experiences and strategies study participants use to manage stress.

But those hundreds of studies may have taken for granted a pretty big complicating factor, argues a new study from Duke University and Dartmouth College.

The study, which appears March 12 in PLOS One, says the background level of anxiety a person normally experiences may interfere with how they behave in the lab setting.

"The paper is not saying all of this work is wrong," emphasized first author Daisy Burr, a graduate student in psychology and neuroscience at Duke. "It's just saying, 'Hey, there's this really interesting unknown here that we should all be examining.' "

Most of the research on emotional regulation has focused on two strategies: reappraisal and suppression. People who are naturally more anxious tend toward suppressing these feelings or pretending them away, "but that's kind of a surface-level technique that's not going to have any long-term impact," Burr said.

The reappraisal tactic has people face the stressor and try to change what it means to them - overcoming their fears - and that tends to be a little more long-lasting, she said.

Indeed, prior research finds that people who employ reappraisal more often are less anxious and depressed, Burr said.

Psychologists care about emotion regulation because it helps keep us all sane and on track.

"Emotion regulation is a buffer against the really negative effects that stress can have on your life," Burr said. "Stress is always going to be there, but it doesn't have to ruin your life."

Burr wondered how anxiety influences the way people naturally tend to regulate their emotions.

She and two colleagues taught Dartmouth undergraduates how to suppress or reappraise an emotional stimulus, and then put them through a standardized emotional regulation training protocol that has been used in hundreds of studies. For each set of stimuli, the participants were instructed to actively suppress or reappraise their response or simply to look at it without receiving any instruction.

As participants went through this set of stimuli, the researchers measured three physiological responses: skin conductance (a measure of stress used in polygraph tests) and the activity of two specific facial muscles, each tracked by its own sensor.

The three measures were then combined to create a "signature" for each test participant that captured when they were suppressing, reappraising or naturally engaging without instruction.

Researchers then compared response signatures for all 52 participants and found that in the uninstructed condition, where participants weren't told how to cope, people who were naturally more anxious were more likely to suppress. Those who were less anxious tended toward reappraisal.

While that all fits with what the research would predict, they also found that anxiety, not self-reported regulation strategies, predicted how participants were regulating.

"There's a disconnect between how people are self-reporting their emotion regulation and how they're regulating in the lab," Burr said. "A person's anxiety may be this more fundamental process or disposition that kind of overrides how you regulate, at least in unrealistic environments, such as in the lab," Burr said.

And if that's true, Burr said future research should explore this disconnect to better understand the right way to rely on self-report measures and how to realistically study emotion -- inside and outside the lab.

"This is really a puzzle," Burr said. "It could be that people are not self-reporting their true regulation styles, or it could be that how people are regulating in the lab isn't mapping on to how they're regulating in the real world."

Part of the answer to this problem entails finding study methods that get out of the lab, which Burr and her Duke Ph.D. adviser Gregory Samanez-Larkin have already done.

They used text messaging during different times of the day to reach study participants where they are and assess their emotions in that moment. As a bonus, they can use these tools to study people outside the traditional demographics of lab studies: undergraduate students who were enticed to the lab by coffee cards or extra credits in Psych 101.

Either way, more research is going to be needed. The paper has been shared as a pre-print since January and Burr said the feedback from her peers has been positive so far.

###

The study was supported by the Dartmouth College Department of Education. This story appears online at https://today.duke.edu/2021/03/lab-studies-emotion-and-well-being-may-be-missing-real-world-anxiety.

CITATION: "Anxiety, Not Regulation Tendency, Predicts How Individuals Regulate In The Laboratory: An Exploratory Comparison Of Self-Report And Psychophysiology," Daisy A. Burr, Rachel G. Pizzie & David J.M. Kraemer. PLOS One, March 12, 2021, DOI: 10.1371/journal.pone.024724

Voltage from the parquet

Piezoelectric wood

SWISS FEDERAL LABORATORIES FOR MATERIALS SCIENCE AND TECHNOLOGY (EMPA)

Research News

IMAGE: Scanning electron microscopy (SEM) images of balsa wood (left) and delignified wood illustrate the structural changes.

CREDIT: ACS Nano / Empa

Ingo Burgert and his team at Empa and ETH Zurich have proven it time and again: wood is so much more than "just" a building material. Their research aims at extending the existing characteristics of wood in such a way that it is suitable for completely new ranges of application. For instance, they have already developed high-strength, water-repellent and magnetizable wood. Now, together with the Empa research group of Francis Schwarze and Javier Ribera, the team has developed a simple, environmentally friendly process for generating electricity from a type of wood sponge, as they reported last week in the journal Science Advances.

Voltage through deformation

If you want to generate electricity from wood, the so-called piezoelectric effect comes into play. Piezoelectricity means that an electric voltage is created by the elastic deformation of solids. This phenomenon is mainly exploited in metrology, which uses sensors that generate a charge signal, say, when a mechanical load is applied. However, such sensors often rely on materials that are unsuitable for biomedical applications, such as lead zirconate titanate (PZT), which cannot be used on human skin because of the lead it contains. This also makes the ecological disposal of PZT and similar materials rather tricky. Being able to use the natural piezoelectric effect of wood thus offers a number of advantages. Taken further, the effect could also be used for sustainable energy production. But first of all, wood must be given the appropriate properties: without special treatment it is not flexible enough, so when subjected to mechanical stress, only a very low electrical voltage is generated in the deformation process.

From block to sponge

Jianguo Sun, a PhD student in Burgert's team, used a chemical process that is the basis for various "refinements" of wood the team has undertaken in recent years: delignification. Wood cell walls consist of three basic materials: lignin, hemicelluloses and cellulose. "Lignin is what a tree needs primarily in order to grow to great heights. This would not be possible without lignin as a stabilizing substance that connects the cells and prevents the rigid cellulose fibrils from buckling," explains Burgert. In order to transform wood into a material that can easily be deformed, lignin must at least partially be "extracted". This is achieved by placing wood in a mixture of hydrogen peroxide and acetic acid. The lignin is dissolved in this acid bath, leaving a framework of cellulose layers. "We take advantage of the hierarchical structure of wood without first dissolving it, as is the case in paper production, for example, and then having to reconnect the fibers", says Burgert. The resulting white wood sponge consists of superimposed thin layers of cellulose that can easily be squeezed together and then expand back into their original form - wood has become elastic.


CAPTION: This is how a piezoelectric nanogenerator works: after the rigid wooden structure has been dissolved, a flexible cellulose network remains. When this is squeezed, charges are separated, generating an electric voltage.

CREDIT: ACS Nano / Empa

Electricity from wooden floors

Burgert's team subjected a test cube with a side length of about 1.5 cm to roughly 600 load cycles. The material showed remarkable stability. At each compression, the researchers measured a voltage of around 0.63 V, enough for an application as a sensor. In further experiments, the team tried to scale up their wooden nanogenerators. For example, they were able to show that 30 such wooden blocks, when loaded in parallel with the body weight of an adult, can light up a simple LCD display. It would therefore be conceivable to develop a wooden floor that is capable of converting the energy of people walking on it into electricity. The researchers also tested its suitability as a pressure sensor on human skin and showed that it could be used in biomedical applications.




Application in preparation

The work described in the Empa-ETH team's latest publication, however, goes one step further: the aim was to modify the process so that it no longer requires the use of aggressive chemicals. The researchers found a suitable candidate to carry out the delignification as a biological process in nature: the fungus Ganoderma applanatum, which causes white rot in wood. "The fungus breaks down lignin and hemicellulose in the wood particularly gently," says Empa researcher Javier Ribera, explaining the environmentally friendly process. What's more, the process can be easily controlled in the lab.

There are still a few steps to be taken before the "piezo" wood can be used as a sensor or as an electricity-generating wooden floor. But the advantages of such a simple and at the same time renewable and biodegradable piezoelectric system are obvious, and are now being investigated by Burgert and his colleagues in a follow-up project. And in order to adapt the technology for industrial applications, the researchers are already in talks with potential cooperation partners.

###

Race influences flood risk behaviors

UNIVERSITY OF CONNECTICUT

Research News

If you live in a flood-prone area, would you -- or could you -- take measures to mitigate flood risks? What about others in your community? According to the World Resources Institute, we are running out of time to ask this question, because global flood risk is increasing and loss projections for rivers alone put the cost at over 500 billion dollars by midcentury.

Research published today in Proceedings of the National Academy of Sciences by scientists from the University of Connecticut, the University of Maryland's National Socio-Environmental Synthesis Center (SESYNC), the University of Massachusetts Amherst, and the London School of Economics suggests that in the United States, race and stream flow are important indicators among the myriad factors considered in flood risk behavior.

This research started with the complexities of studying human behavior in mind, says UConn Assistant Professor James Knighton, when the researchers decided to look at a model designed to approximate how populations respond to flooding events.

"I was talking with one of my colleagues, Kelly Hondula, and we were looking critically at these models of human behavior and thinking they're too simple and there is no way they could apply to US cities. How could they be useful? Originally we were writing this paper to show that these models may only apply to community flooding dynamics in very carefully controlled case studies."

"We started with data from over 200 cities that had long records of streamflow from USGS," explains Kelly Hondula, formerly of SESYNC and currently a postdoc at Battelle's National Ecological Observatory Network. "Although the model only worked for a subset of those, we were surprised that there were so many cities where the model could explain patterns in the data quite well given how complex the interactions are between flooding, risk mitigation, and structural inequalities."

The researchers tailored the models to 50 metropolitan areas across the US at risk for river flooding events, says Knighton. "These are models where you feed in a time series of flooding hazards that happen every year. The model uses streamflow information to predict how much damage occurs in a flood, how much a given flood changes a community's behavior, and how lasting are these changes."

They further calibrated the models to include trends in behaviors related to flood insurance policies and claims.

Knighton says, "After running the model, we were totally shocked because it worked fairly well. We shifted our focus to see what this model is saying about these communities."

The models show two diverging patterns among US metro areas, says Knighton. In one group, which the researchers deemed 'risk enduring', the number of people obtaining insurance policies goes up as soon as a flood occurs, then quickly declines as the years progress. The other group, deemed 'risk averse', responds differently: a flood brings a modest increase in insurance policies, but levels do not drop as rapidly as in the risk enduring group, and each subsequent flood brings a slight further increase, with levels staying fairly consistent.

Professor Christian Guzman of the University of Massachusetts Amherst says the researchers were mindful in how they described the diverging patterns, "Some models have termed a certain group as 'risk-taking' (versus risk-averse), but we decided to reorient this to indicate risk-enduring populations, meaning these groups are probably enduring the risk of flooding despite risk-preference or any agency associated with these preferences. This was something we found to be important when discussing flood risks and history in the U.S."

To learn more about these communities, the researchers, including Rebecca Elliot at the London School of Economics, analyzed census data for the cities to see if measurements of social vulnerability could describe the divergent patterns, which Knighton said were also immediately apparent. Socioeconomic demographics were the main split between the patterns, with the proportion of white residents higher in risk averse communities.

"Our thinking was that poverty and race may indicate socioeconomic barriers to participate in existing federal programs such as the National Flood Insurance Program (NFIP). There is probably also some aspect of this where people are rational actors in the economic systems where if they perceive a big risk they will take some sort of action to mitigate the risk and if they don't perceive a risk they won't. The National Flood Insurance Program is built around this belief," says Knighton.

The researchers started from the opposite end of the spectrum, assuming there are no social barriers to people acting as rational actors, says Knighton, "We asked, is there some component of this that is actually tied to the risks people are facing? That is why we looked at stream flow and how predictable it is for people who live near it."

This predictability can be measured with a variable called 'flashiness'. A stream that is likely to stay at roughly the same level day-to-day is less flashy, whereas flashy streams have drastic and less predictable level fluctuations.

"Overall, we saw two things," says Knighton. "Race and flashiness together were very strong predictors of insurance dynamics."

Risk averse communities also tend to have much more local investment in their flood control infrastructure, such as more and bigger dams, says Knighton. There is more federal attention on larger, richer and whiter cities.

Knighton explains that barriers such as affordability and bureaucratic red tape may make access to insurance difficult and inequitable. Accounting for all these factors led to this research being the most comprehensive model-based investigation of flood risk behaviors in the US to date. There are programs in place that are designed to make federal flood programs more accessible, however inequities remain and the researchers hope that this research will raise awareness.

"It seems like there is a very rational component to the insurance market where people have some basic understanding of hydrology and can assess their local risks and they're taking action, but there are also barriers. The barriers of race, and likely poverty, are limiting certain communities or individuals from participating."

The Federal Emergency Management Agency, FEMA, has been directed by Congress to investigate an affordability framework, which would be a way for the National Flood Insurance Program to apportion flood insurance more equitably. However, action needs to be taken quickly.

"Part of our hope is that there is strong support for a comprehensive affordability framework," says Knighton. "Some cities just live with risk. Flooding will likely get worse in the near future. Maybe people are able to live with the flooding we have now but we can't know for certain if everyone can live with the flooding that is coming."

###

This work was supported by the National Socio-Environmental Synthesis Center (SESYNC) under funding received from the National Science Foundation DBI-1639145.

Global river flow contingent upon climate change

MICHIGAN STATE UNIVERSITY

Research News

IMAGE: This study shows that the change in stream flow annually or during droughts was primarily caused by climate change during the past 30 years.

CREDIT: Michigan State University

More than ever before, the water available in rivers is at the mercy of climate change, international researchers collaborating with Michigan State University on a worldwide study have revealed. The finding could profoundly affect future water and food security around the world.

Yadu Pokhrel, associate professor of civil and environmental engineering in the MSU College of Engineering and a co-author of the study, said climate is the key driver in the current changes to global river flow.

"It's a noteworthy finding because as climate change impacts extreme flows, it could be worsening flooding or increasing water scarcity during dry seasons," Pokhrel explained.

Details of the new study, "Globally observed trends in mean and extreme river flow attributed to climate change," have been published in the AAAS journal, Science.

Pokhrel said flows in rivers and streams can vary year by year, but they don't change direction over long time scales without serious influence, whether from a changing climate or human actions.

"Previous research has shown that river flows have been changing over time globally but the causes were not known. This study shows that the change in stream flow annually or during droughts was primarily caused by climate change during the past 30 years," Pokhrel said.

"This suggests that we are on course to lose more and more water in rivers as climate change continues, which could seriously undermine our ability to maintain water supplies for drinking, industries, power generation, and food production."

The study used measurements of streamflow at 7,250 locations globally and computer simulations from nine global hydrological models to examine how manmade climate change affected river flows across the globe from 1971 to 2010. It found that climate change affected long-term average river flow as well as the flow during dry seasons.
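The attribution approach described here, comparing observed streamflow trends against trends from model simulations run with and without climate forcing, can be illustrated with a minimal sketch. All data below are synthetic and the `linear_trend` helper is an illustrative stand-in, not the study's actual method or data:

```python
import numpy as np

def linear_trend(years, flow):
    """Least-squares slope of annual flow versus year (flow units per year)."""
    slope, _ = np.polyfit(years, flow, 1)
    return slope

rng = np.random.default_rng(0)
years = np.arange(1971, 2011)  # a 40-year record, matching the study period

# Synthetic "observed" gauge record: a declining flow plus interannual noise.
observed = 100.0 - 0.3 * (years - 1971) + rng.normal(0, 2.0, years.size)

# Synthetic ensembles standing in for hydrological model runs:
# with climate forcing (declining trend) and without (no trend).
forced = [100.0 - 0.3 * (years - 1971) + rng.normal(0, 2.0, years.size)
          for _ in range(9)]
unforced = [100.0 + rng.normal(0, 2.0, years.size) for _ in range(9)]

obs_trend = linear_trend(years, observed)
forced_trends = [linear_trend(years, f) for f in forced]
unforced_trends = [linear_trend(years, u) for u in unforced]

# The observed trend matches the forced ensemble, not the unforced one,
# which is the signature that points to climate as the driver.
print(f"observed trend:         {obs_trend:+.2f} per year")
print(f"forced ensemble mean:   {np.mean(forced_trends):+.2f} per year")
print(f"unforced ensemble mean: {np.mean(unforced_trends):+.2f} per year")
```

In the real study the "observed" series comes from thousands of gauges and the ensembles from nine global hydrological models; the sketch only shows the shape of the comparison.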

"It highlights that it is critical to incorporate climate change impacts into water resource planning and management," Pokhrel added.

Lukas Gudmundsson at ETH Zurich, Switzerland, led the research.

"This study was possible thanks to the great collaboration between researchers and institutions from 12 countries," Gudmundsson said.

Study: Men of color avoid public places out of fear of involvement with criminal justice agents

CRIME AND JUSTICE RESEARCH ALLIANCE


The U.S. criminal legal system has expanded at a rapid pace, even as crime rates have declined since the 1990s. As a result, individuals' interactions with and surveillance by law enforcement are now commonplace. But citizens experience different interactions, with people of color who live in impoverished urban communities having the most frequent encounters. A new study interviewed young Philadelphia men to determine their perceptions of and reactions to this phenomenon. Nearly all of the men of color said they stayed at home and avoided public spaces out of concern over the potential interactions with others that might draw police attention.

The study, by a researcher at Temple University, appears in Criminology, a publication of the American Society of Criminology.

"This kind of avoidance of being out in the community is a racialized adaptation to the expansion of the criminal legal apparatus and the unpredictable nature of men's interactions with its agents and enforcers," explains Jamie J. Fader, associate professor of criminal justice at Temple University, who conducted the study. "It reproduces the effects of incarceration by turning the men's homes into virtual prisons, and it effectively erases young men of color from the public sphere, with considerable costs for the men and their neighborhoods."

Fader and her team conducted in-depth interviews with 45 men ages 25 to 34 who live in Frankford, Pennsylvania, an impoverished, racially diverse Philadelphia neighborhood with high rates of crime and violence. The area has also been heavily affected by mass incarceration. The study included 23 men who identified as Black, 5 who identified as Latino, and 17 who identified as White. Twenty-six of the men had at least a high school diploma and 19 had less education.

In her study, carried out from 2014 to 2019, Fader sought to learn how the men perceived risk and how they organized their daily routines in response to it. She also explored whether there are racial differences in perceptions and adaptation to risk.

Interviews lasted several hours and covered a range of topics, including daily routines, perceptions of the neighborhood, contact with police, family and social ties, and identity questions centering on stigma, masculinity, and adulthood. Fader and her team examined official criminal records to secure a history of each man's contact with the criminal justice system prior to the interview and new contact in the time after the interview. They also drew information about the Frankford community from five years of regularly observing community meetings, events, and venues.

The Frankford men, particularly the men of color, described a preoccupation with trouble, which they often called "drama," that stemmed from the unpredictable nature of their status vis-à-vis the criminal legal system, as well as the regular threat posed by street or police violence. Even men who had no prior criminal record or involvement with the criminal justice system expressed concern about venturing out beyond their homes, viewing public spaces as unpredictable and posing an unacceptable level of risk.

This practice of avoiding venturing into their neighborhood often translated into a lack of engagement in their community, whether in civic or service organizations or in getting to know neighbors, the study concluded. Fader and her team distinguished those who felt they had no choice but to stay home (identifying them as avoidant) from men who said they had chosen to limit their social contacts because they wanted to avoid social ties (identifying them as isolated). The study also found a consistent absence of men under 40 at community meetings and events in Frankford. The same pattern emerged in observations of barber shops, boxing gyms, and bars.

"Our study features men who came of age during the super-predator scare," says Fader, referring to the since-debunked idea that some impulsive young men carried out violent crimes without remorse. "The justice system, which is ostensibly designed to reduce risk of crime, has actually heightened men's concerns about risk. In addition to fears of victimization by street-level gun violence, these men also worry about the potential for arrest, court involvement and a criminal record, incarceration, and death at the hands of police."

The study has a number of implications, Fader suggests. The vigilance these men need to avoid trouble creates an environment of unrelenting stress, which affects mental and physical health and can reduce life expectancy. The men's avoidance of others limits their ability to develop and maintain social ties, and prevents the accumulation of social capital. It also forces the men to manage the risks they fear alone, without support from social service agencies. And it limits job opportunities, travel, friendships, and community involvement.

Finally, Fader notes, mistrusting social institutions and avoiding public space reduces community engagement, political participation, organizing, protesting, and capacity building, activities that build stronger social infrastructures and are engines of social and personal change.

Fader pointed to several limitations of the study: Because the study focused on one community and interviewed only men, its findings cannot be generalized to other communities or speak to women's experiences. Also, because the study relied on the men's narratives of their activities rather than on direct observation, it is possible that the men characterized themselves as more self-sufficient and independent than they actually were. Finally, about half the White men in the study were in recovery from substance abuse, which may have affected both their assessments of risk and the economic and social support they received from the recovery community.

###

The study was supported by Temple University.

Study: Seattle's minimum wage increase did not change crime or employment rates

CRIME AND JUSTICE RESEARCH ALLIANCE


Between 2015 and 2017, Seattle, Washington, became the first U.S. city to increase its hourly minimum wage to $15, more than double the federal minimum wage and 60 percent higher than Seattle's previous minimum wage. A new study examined the impact of this change on public safety. The study was motivated by the idea that since crime is sometimes the result of material deprivation, changes in the minimum wage might have implications for criminal activity: Boosting the minimum wage could raise workers' salaries (which could be associated with reduced crime). But if higher minimum wages spur employers to substitute capital for labor, this could increase unemployment (which could be associated with increased crime). The study found little evidence that Seattle's aggregate rate of violent or property crimes changed relative to other U.S. cities. It also found no meaningful adverse effects on low-wage workers' rates of employment.

The study, by researchers at the University of Pennsylvania, appears in Criminology & Public Policy, a publication of the American Society of Criminology.

"Our study suggests that Seattle increased its minimum wage without compromising public safety, at least in the short run," explains David Mitre-Becerril, a doctoral student in criminology at the University of Pennsylvania, who led the study.

Past research on the effects of minimum wage legislation on crime has been mixed.

In this study, researchers compared crime trends in Seattle to those in 118 U.S. cities across 18 states that did not increase their minimum wage but whose pre-2015 crime trends were comparable to Seattle's. Seattle's minimum wage began to be implemented in 2015 and rose to $15/hour by 2017. For the period 2010 to 2017, researchers collected crime data from the FBI's Uniform Crime Reports and the Seattle Police Department. They also collected sociodemographic and employment/earnings data from the American Community Survey conducted by the U.S. Census Bureau. Sociodemographic data included information on race, age, schooling, and poverty level.
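Comparing a city's change over time against the same change in comparison cities is, in essence, a difference-in-differences design. A minimal sketch of that arithmetic, with entirely hypothetical crime rates (not figures from the study):

```python
def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """Difference-in-differences: the treated city's change minus the
    comparison cities' change over the same period."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical violent-crime rates per 100,000 residents.
seattle_pre, seattle_post = 600.0, 590.0     # before / after the wage increase
controls_pre, controls_post = 580.0, 572.0   # average of comparison cities

effect = did_estimate(seattle_pre, seattle_post, controls_pre, controls_post)
print(f"estimated effect on crime rate: {effect:+.1f} per 100,000")  # -2.0
```

The logic is that the comparison cities' change absorbs shared trends (national crime decline, economic cycles), so whatever difference remains is a candidate effect of the policy, subject to the design's usual parallel-trends assumption.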

The study found that Seattle's rate of violent crime (murder, robbery, aggravated assault) did not diverge significantly from the other cities' rates. With respect to property crimes (burglary, larceny, motor vehicle theft), the findings were more nuanced. Although researchers found evidence that Seattle's rate of burglaries may have increased 15 to 30 percent during the study period, driven by offenses on commercial premises, they did not detect an aggregate change in property crimes.

The increase in Seattle's minimum wage appears to have had little impact on arrest rates among any major demographic group, including young men, who tend to drive an outsized share of offending and whose employment has been considered the most sensitive to changes in the minimum wage. This, the authors note, may be because Seattle's law did not end up reducing employment for low-skilled workers.

This could be because Seattle has more college-educated workers and fewer low-wage earners than other cities, primarily because of its booming tech industry, which would mean that fewer workers were affected by a shift in the wage structure than in other cities.

The study's authors note that they focused on one city, thus limiting the generalizability of their findings to other cities. To address this matter, the researchers supplemented their work with analyses of four other U.S. cities that have increased their minimum wages substantially, though less than in Seattle, during the same period. In those cities--Chicago and three cities in California: San Francisco, San Jose, and Sunnyvale--the laws spurred few consistent increases in criminal offenses.

The authors also note that Washington State legalized recreational marijuana in December 2013 and allowed retail sales of recreational marijuana beginning in July 2014, which may have affected crime rates, though there is little evidence of a change in public safety immediately following the law.

"In the aftermath of Seattle's decision to raise the minimum wage, several other U.S. cities and states plan to follow suit," notes Aaron Chalfin, assistant professor of criminology at the University of Pennsylvania, who coauthored the study. "Therefore, the question of how high minimum wages can be pushed without compromising public safety is timely.

"However, the COVID-19 pandemic is likely to affect regional economic dynamics: Although a $15 minimum wage appears not to have disrupted Seattle's labor market during a period of broad economic growth, it could become a structural barrier to employment during the coming months, which may have implications for public safety."


Telehealth growth during pandemic occurred mostly in more affluent and metropolitan areas

Findings raise concerns that the pandemic may be worsening disparities

RAND CORPORATION


Increases in the use of telehealth during the coronavirus pandemic among people with private insurance have occurred mostly among those who are more affluent and those who live in metropolitan areas, according to a new RAND Corporation study.

Researchers say the findings raise concerns that the pandemic may be worsening existing disparities in overall health care utilization.

Overall, the study found a 20-fold increase in the rate of telemedicine utilization following the onset of the pandemic in March 2020. At the same time, the rate of office-based medical encounters declined by nearly 50% and was not fully offset by the increase in telemedicine.

The findings are published online by the American Journal of Preventive Medicine.

"This study expands our understanding about the growing use of telehealth as the pandemic progresses," said Jonathan Cantor, lead author of the study and a policy researcher at RAND, a nonprofit research organization. "Given our findings, policymakers should consider increasing efforts to reach populations that are deferring in-office care and not replacing it with telehealth visits."

The study analyzed insurance claims from more than 6 million people who had employer-based health insurance in 2019 and 2020. The group represented about 200 employers across all 50 U.S. states.

Researchers examined the number and types of medical care received by enrollees from January 2019 through July 2020.

The increase in telemedicine was greatest among patients in counties with low poverty levels (about 48 visits per 10,000 people versus 15 per 10,000 people in high-poverty areas) and among patients in metropolitan areas (about 50 visits per 10,000 people versus about 31 visits per 10,000 people in rural areas).

Adults were more likely to have a telehealth visit as compared to children aged 12 and younger (about 65 visits per 10,000 adults as compared to about 50 visits per 10,000 children).
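The rates reported above are simple per-capita normalizations. A minimal sketch of the arithmetic, with counts invented purely to reproduce the reported figures (the study's raw visit counts are not given here):

```python
def rate_per_10k(visits, enrollees):
    """Telehealth visits per 10,000 enrollees."""
    return visits / enrollees * 10_000

# Illustrative counts chosen only to match the reported rates.
print(rate_per_10k(65_000, 10_000_000))  # 65.0 visits per 10,000 adults
print(rate_per_10k(50_000, 10_000_000))  # 50.0 visits per 10,000 children
```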

"More intensive training for parents and pediatricians about telehealth, as well as efforts to address barriers to children's access to telemedicine, may be necessary," Cantor said.

###

Support for the study was provided by the National Institute on Aging and the Robert Wood Johnson Foundation.

Other authors of the study are Ryan K. McBain and Christopher Whaley of RAND, Megan F. Pera of Castlight Health, and Dena M. Bravata of Castlight Health and the Center for Primary Care and Outcomes Research at Stanford University.

RAND Health Care promotes healthier societies by improving health care systems in the United States and other countries.