Friday, August 13, 2021

 

CDC’s “honor system” mask guidance threatened vulnerable communities


Commentary argues that the agency’s policies may have increased risk of transmission of COVID-19 in the Black community.

Peer-Reviewed Publication

MASSACHUSETTS GENERAL HOSPITAL

BOSTON – The Centers for Disease Control and Prevention (CDC) may have increased the threat of COVID-19 infection in the Black community and among some other racial and ethnic minority groups when the agency updated its mask guidance last May, according to a new commentary in the Journal of General Internal Medicine by two colleagues at Massachusetts General Hospital (MGH). At the time, the CDC announced that fully vaccinated people could go without masks in most circumstances, a policy recommendation that was intended to help restore a sense of normalcy, yet ignored certain realities that left some groups vulnerable, the authors argue.

A critical problem with the CDC’s May mask guidance was that there is no way to tell whether a person is fully vaccinated against COVID-19, says Simar Bajaj, a research fellow at MGH. That meant the CDC was relying on the honor system to ensure that unvaccinated people continued to wear masks. However, earlier research indicates that people often lie about personal health information to avoid being judged, notes Bajaj. “Without a way to verify vaccination status, everyone is going to unmask,” he says, dooming the CDC’s mask guidance to failure.

Blacks in the United States were the group most likely to suffer the consequences of that failure, for a number of reasons, say Bajaj and his coauthor, Fatima Cody Stanford, MD, MPH, MPA, MBA, who is director of equity in MGH’s endocrinology division. For example, at around the time the CDC announced the mask guidance in mid-May, just 28 percent of Black Americans had received at least one dose of the COVID-19 vaccine, compared to 42 percent of white Americans and 52 percent of Asian Americans. “That’s a huge gap,” says Bajaj, who says it’s partly due to vaccine hesitancy that “is a product of everyday racism that Blacks face when navigating the health system.”

What’s more, Black Americans are far more likely than white Americans to be employed in essential jobs, meaning they can’t lower their risk for infection by working remotely. Black Americans have more medical comorbidities, which made those who became infected more likely to become seriously ill. Throughout the COVID-19 pandemic, Blacks have had disproportionately high rates of infection with the virus, hospitalization and death, point out Bajaj and Stanford.

All of these factors made the CDC guidance a threat to the Black community. “When you issue this one-size-fits-all guidance, it should come as no surprise that it could deleteriously impact communities that have been hurt the most already,” says Bajaj. A more nuanced approach, he suggests, might have used “quantitative benchmarks” to determine when it’s safe for a community to unmask. One approach might be to set goals for what proportion of a community must be fully vaccinated before unmasking is recommended in a state, possibly with higher thresholds in certain racial or ethnic communities. Digital health passes that could verify vaccination status are another tool that the CDC could explore.

Their Journal of General Internal Medicine “Viewpoint” article had already been accepted for publication when, in late July, the CDC changed its mask guidance once again, citing new scientific data about the rapidly spreading Delta variant of the COVID-19 virus—in particular, the fact that vaccinated people can transmit the variant. The revised advice encouraged vaccinated people in counties where transmission of the virus is “substantial” or “high” to wear a mask while in indoor public places. To Bajaj, that’s a step in the right direction, since the new guidance is using its own quantitative benchmark—degree of transmission in a county—to determine whether masks are necessary for vaccinated people.

“But it’s too little, too late. Once you have swung the pendulum in the direction of abandoning all caution, it’s hard to walk it back,” says Bajaj, adding that public health authorities should focus on increasing vaccination rates overall, including in communities of color, and promoting greater health equity: the idea of removing barriers to good health for all. “If we continue the current state of affairs, it will be a tale of two pandemics—one in the highly vaccinated, majority-white, suburban areas, and another ripping through and devastating minority communities,” says Bajaj. “I think that’s an unacceptable proposition.”

Bajaj is also a scholar in the department of the History of Science at Harvard College. Stanford is a physician-scientist in the departments of Medicine and Pediatrics and is an Equity Director at MGH. She is also director of Diversity for the Nutrition Obesity Research Center at Harvard Medical School.

This work was supported by the National Institutes of Health and the Massachusetts General Hospital Executive Committee on Research.  

About the Massachusetts General Hospital
Massachusetts General Hospital, founded in 1811, is the original and largest teaching hospital of Harvard Medical School. The Mass General Research Institute conducts the largest hospital-based research program in the nation, with annual research operations of more than $1 billion, and comprises more than 9,500 researchers working across more than 30 institutes, centers and departments. In August 2021, Mass General was named #5 in the U.S. News & World Report list of “America’s Best Hospitals.”

 

 

 

Competitive success: New research finds performance feedback at clinics increases flu vaccination rates

Peer-Reviewed Publication

INSTITUTE FOR OPERATIONS RESEARCH AND THE MANAGEMENT SCIENCES

Key takeaways from the study in the INFORMS journal Management Science:

  • Clinics that received performance rankings saw a 12% increase in flu shots and outperformed all other clinics.

  • Clinics that received performance rankings tried to avoid falling into last place; clinics near last place outperformed other similar clinics by 23 percentage points.

  • Even a 1% increase in U.S. adult flu vaccination rates can translate to almost $400 million in societal benefits.

 

CATONSVILLE, MD, August 12, 2021 – Every year the flu threatens the health of millions of people. Experts continue to recommend annual flu vaccination as the best line of defense, but despite these recommendations, flu vaccination rates haven’t broken 50% in more than a decade. New research in the INFORMS journal Management Science seeks to overcome this. The study finds that performance feedback at healthcare clinics can significantly increase vaccination rates. This has important public policy implications. Citing other research, the authors highlight that even just a 1% increase in U.S. adult flu vaccination rates could translate to some $400 million in societal benefits.

The study, “Focusing Provider Attention: An Empirical Examination of Incentives and Feedback in Flu Vaccinations,” was conducted by Bradley Staats and Robert Niewoehner III, both of the University of North Carolina at Chapel Hill, in partnership with VaxCare, a technology company that partners with clinics to coordinate vaccination logistics. The study looked at 145 clinics in nine different states and tested whether financial incentives or performance feedback might improve vaccination rates.

“We find clinics that got performance rankings grew their flu vaccinations more than all other clinics. Specifically, our experiment led to a 12% increase in flu shots for these clinics,” said Staats, a professor of operations and Sarah Graham Kenan Scholar, faculty director of the Center for the Business of Health and associate dean of MBA programs in the Kenan-Flagler Business School at UNC-Chapel Hill. “We also find that the clinics that received rankings don’t want to come in last – that is, they do whatever they can to avoid the bottom rankings. Because of this, in trying to move up, the clinics near last place end up outperforming their corresponding control clinics by 23 percentage points – a significant margin!”

This research stands to have a large practical impact. If even a portion of the increased vaccinations goes to at-risk groups, serious health consequences could be averted.

“Even further – if most of an increase in flu shots went to seniors, the CDC estimates that this could prevent thousands of hospitalizations,” said Niewoehner, a doctoral candidate in the Kenan-Flagler Business School at UNC-Chapel Hill.

“Our study shows that behavioral interventions like our experiment can improve performance outcomes, even when targeting seemingly immutable trends, like flu vaccination rates. Going forward, we believe our findings hold great promise for improving public health and company operations in general.”




About INFORMS and Management Science

Management Science is a premier peer-reviewed scholarly journal focused on research using quantitative approaches to study all aspects of management in companies and organizations. It is published by INFORMS, the leading international association for operations research and analytics professionals. More information is available at www.informs.org or @informs. 

 

Applying data-driven multimodal speed management strategies for safe, efficient transportation


Reports and Proceedings

PORTLAND STATE UNIVERSITY

How can we use a variety of data-driven speed management strategies to make transportation safer and more efficient for all modes–whether you’re driving, walking or taking transit?

The project was led by Yao-Jan Wu, director of the Smart Transportation Lab at the University of Arizona. Co-investigators were Xianfeng Terry Yang of the University of Utah, who researches traffic operations and modeling along with connected automated vehicles, and Sirisha Kothuri of Portland State University, whose research has focused on improving signal timing to better serve pedestrians. Join them on Sept 15, 2021 for a free webinar to learn more.

"We want to improve mobility for all users, be it pedestrians, vehicle drivers or transit riders, and there are different strategies to do this. How do we harness data to drive us to these strategies?" Kothuri said.

Funded by the National Institute for Transportation and Communities (NITC), this multi-university collaboration addressed the question from three angles:

  • Wu and his students in Arizona looked at the impact of speed management strategies on conventional roadways
  • Yang and his team examined the effects of speed management strategies on connected corridors, coordinating with transit signal priority (TSP) systems. 
  • Kothuri and her PSU team came up with an approach to estimate pedestrian delay at signalized intersections.

The aim of their combined research efforts was to investigate the possibility of developing and implementing more innovative speed management strategies that are effective for multimodal transportation and can be applied in both conventional roadways and "connected" roadways - i.e. equipped with vehicle-to-infrastructure or infrastructure-to-infrastructure communication capabilities.

IMPACT OF SPEED FEEDBACK SIGNS ON TRAFFIC FLOW AND SAFETY

Working with Pima County, Arizona, Wu and the Arizona research team evaluated the mobility and safety impacts of speed feedback signs on conventional roadways. Ina Road, a major signalized arterial in Tucson, was selected as the study corridor. This corridor was chosen because of the existing speed feedback signs along the corridor between signalized intersections, and due to the presence of advanced traffic data collection systems. Traffic data were collected for four weeks (May 28-June 25, 2018), and the existing signs were disabled for two weeks (June 11th-June 25th) during the data collection.

Using MioVision’s TrafficLink platform and high-resolution data, the researchers measured:

  • Percentage arrival on red: The percentage of vehicles that arrived at the intersection when the signal was red. 
  • Split failure: The occurrence of leftover demand (when at least one vehicle in the queue was not able to go, but had to wait for the next green cycle) for a specific approach at an intersection.
  • Intersection delay: Total amount of time that all vehicles spend in the intersection queue while waiting to pass the intersection.
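The arrival-on-red measure, for instance, can be computed directly from controller phase logs and detector events. Below is a minimal sketch under an invented event format; real platforms such as MioVision's TrafficLink expose much richer high-resolution logs.

```python
# Sketch: computing "percent arrival on red" for one intersection
# approach. Phase intervals and arrival times below are made up for
# illustration; a real controller log would supply both streams.

# Signal phase intervals for the approach: (start_s, end_s, state)
phases = [
    (0, 30, "green"),
    (30, 35, "yellow"),
    (35, 90, "red"),
    (90, 120, "green"),
]

# Detector-recorded vehicle arrival times (seconds)
arrivals = [5, 12, 40, 47, 60, 85, 95, 110]

def state_at(t):
    """Return the signal state in effect when a vehicle arrives at time t."""
    for start, end, state in phases:
        if start <= t < end:
            return state
    raise ValueError(f"time {t} outside known phases")

on_red = sum(1 for t in arrivals if state_at(t) == "red")
pct_arrival_on_red = 100.0 * on_red / len(arrivals)
print(f"{pct_arrival_on_red:.1f}% of vehicles arrived on red")
```

Split failure and intersection delay can be derived from the same event streams by additionally tracking queue clearance within each green interval.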

For a given time of day before and after disabling the speed feedback signs, only a little variation in traffic flow was observed. Similar traffic flow peaks for all the segments suggest that arterial mobility and traffic flow were not affected by disabling the signs. But what about the signs' effect on safety?

Data from the Pima Association of Governments show that all four severe crashes recorded on the study corridor occurred before the speed feedback signs were installed in 2015. Moreover, using speed as a performance indicator, the researchers found a reduction in drivers' speeds along each link of the corridor, between intersections (see page 21 of the final report for a table of speed results on weekdays and weekends). The reduction in link speed was significant during the times the feedback signs were enabled, suggesting a reduced likelihood of severe crashes.

RETIMING SIGNALS FOR TRANSIT SIGNAL PRIORITY

Yang and the Utah team explored the impact of a speed management strategy on a connected corridor in Salt Lake City, Utah: how does signal retiming impact a transit signal priority (TSP) system?

Although TSP is a promising way to reduce bus delays at intersections, improve transit operational reliability, and consequently increase transit ridership with improved service, the effectiveness of TSP is subject to factors such as bus schedules, signal timing plans, and passenger flows. Considering this, the Utah Department of Transportation (UDOT) adopted a speed management strategy – signal coordination and retiming – to improve the effectiveness of TSP. UDOT implemented signal coordination along a stretch of Redwood Road, a connected corridor with dedicated short-range communication, and applied several signal timing plans with the aim of maximizing the benefits of TSP.

In general, researchers found that the share of TSP requests served rose from 33.13% before signal retiming to 35.29% after. This means that more buses requesting signal priority had their requests met once the signals were retimed. In other words, the speed management strategy was also helping to improve bus reliability.

In addition, bus travel time and bus running time were reduced after signal retiming. All of these findings indicate that the speed management strategy implemented along this connected vehicle corridor results in an improvement of TSP and bus service.

ESTIMATING PEDESTRIAN DELAY

Technology capable of recording pedestrian delay exists, such as signal controllers that log high-resolution data, but it is costly, so not every intersection is equipped with it. Agencies are upgrading their infrastructure when possible, but there are still many intersections where there is no means of knowing how long a delay a pedestrian may experience.

The PSU team, led by Kothuri, developed an alternative method that uses signal controller data to estimate pedestrian delay and other multimodal signal performance measures.

Traditionally, signal timing is calibrated to prioritize vehicle movement, and long delays for pedestrians can increase noncompliance, causing unnecessary risk. So the team's goal was to find a way to use data to estimate what the pedestrian delay would be, at intersections that are not equipped with the latest infrastructure. Researchers took data from Ina Road, the same study corridor in Pima County, Arizona used to evaluate speed management strategies, and used finite mixture modeling to model pedestrian delay. Results showed that their method was able to successfully model the delay fluctuations with less than 10% mean absolute error. This method can be applied to intersections with similar characteristics as the test intersections. So cities and agencies that do not have signal controllers to capture pedestrian delay can use this finite mixture modeling method to figure out where they need to apply strategies to reduce that delay. 
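The general idea behind a finite mixture model of pedestrian delay can be sketched as follows; the report's exact specification may differ, and the data here are simulated. Delay at a signal tends to be bimodal: pedestrians arriving during a walk phase see near-zero delay, while the rest wait a large fraction of the cycle. A simple EM fit of a two-component Gaussian mixture separates the two regimes.

```python
# Toy sketch: fitting a two-component Gaussian mixture to simulated
# pedestrian delays with a hand-rolled EM loop. The report's actual
# model and data are richer; this shows only the core technique.
import math
import random

random.seed(0)

# Simulated observed delays (seconds): 30% arrive on walk, 70% wait.
delays = ([abs(random.gauss(2, 2)) for _ in range(300)] +
          [random.gauss(45, 12) for _ in range(700)])

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# EM for a two-component 1-D Gaussian mixture.
w, mu, sigma = [0.5, 0.5], [10.0, 30.0], [10.0, 10.0]
for _ in range(50):
    # E-step: responsibility of each component for each observation
    resp = []
    for x in delays:
        p = [w[k] * normal_pdf(x, mu[k], sigma[k]) for k in range(2)]
        s = sum(p)
        resp.append([pk / s for pk in p])
    # M-step: re-estimate weights, means, standard deviations
    for k in range(2):
        nk = sum(r[k] for r in resp)
        w[k] = nk / len(delays)
        mu[k] = sum(r[k] * x for r, x in zip(resp, delays)) / nk
        var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, delays)) / nk
        sigma[k] = max(math.sqrt(var), 1e-3)

lo, hi = sorted(mu)
print(f"component means: {lo:.1f}s and {hi:.1f}s")
```

Once fitted at instrumented test intersections, such a model can be transferred to similar uninstrumented intersections to estimate where pedestrian delay is likely to be high.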

The application of the proposed method could be beneficial to transportation agencies in three capacities:

  1. providing a more reliable, robust, and accurate approach for estimating pedestrian delay at signalized intersections where sensors are not available;
  2. offering a tool for analyzing the risk of pedestrians violating the signal;
  3. calibrating a network-wide model for estimating pedestrian delay at all intersections without the need for additional resources.

OUTCOMES OF THE RESEARCH

An effective strategy for improving mobility needs to consider both motorized and non-motorized modes of transportation. The three main achievements of this project were:

  1. Evaluating the impact of speed management strategies along conventional arterials using smart sensor data;
  2. Understanding the role of conventional speed management strategies in supporting connected arterials;
  3. Examining the possibility of using controller event-based data to estimate multimodal signal performance measures.

Improved multimodal speed management strategies foster a safer community that will, in turn, encourage more people to walk and bike. This project addressed data-driven multimodal speed management strategies for traditional corridors using traffic sensors, and for future evaluation of connected vehicle-based strategies. The project also strengthened relationships between the three universities and their local partners, including Pima County and the Utah DOT.

"This project highlighted the strong collaboration we have among the universities. Trying to find innovative solutions during the process tied our universities together, bringing local resources together as well," Wu said.

ABOUT THE PROJECT

Data-Driven Mobility Strategies for Multimodal Transportation

Yao-Jan Wu, University of Arizona; Xianfeng Yang, University of Utah; Sirisha Kothuri, Portland State University

This research was funded by the National Institute for Transportation and Communities, with additional support from Pima County Public Works Administration, Portland State University, University of Arizona, University of Utah, and Utah Department of Transportation.

RELATED RESEARCH

To learn more about this and other NITC research, sign up for our monthly research newsletter.

The National Institute for Transportation and Communities (NITC) is one of seven U.S. Department of Transportation national university transportation centers. NITC is a program of the Transportation Research and Education Center (TREC) at Portland State University. This PSU-led research partnership also includes the Oregon Institute of Technology, University of Arizona, University of Oregon, University of Texas at Arlington and University of Utah. We pursue our theme — improving mobility of people and goods to build strong communities — through research, education and technology transfer.

 

Is your mobile provider tracking your location? This new technology could stop it.

For the first time, researchers have found a way to protect phone users’ identity and location, without compromising connectivity.

Peer-Reviewed Publication

UNIVERSITY OF SOUTHERN CALIFORNIA

Right now, there is a good chance your phone is tracking your location—even with GPS services turned off. That’s because, to receive service, our phones reveal personal identifiers to cell towers owned by major network operators. This has led to vast and largely unregulated data-harvesting industries based around selling users’ location data to third parties without consent.

For the first time, researchers at the University of Southern California (USC) Viterbi School of Engineering and Princeton University have found a way to stop this privacy breach using existing cellular networks. The new system, presented at USENIX Security conference on Aug. 11, protects users’ mobile privacy while providing normal mobile connectivity.

The new architecture, called “Pretty Good Phone Privacy” or PGPP, decouples phone connectivity from authentication and billing by anonymizing personal identifiers sent to cell towers. The software-based solution, described by the researchers as an “architecture change,” does not alter cellular network hardware.

“We’ve unwittingly accepted that our phones are tracking devices in disguise, but until now we’ve had no other option—using mobile devices meant accepting this tracking,” said study co-author Barath Raghavan, an assistant professor in computer science at USC. “We figured out how to decouple authentication from connectivity and ensure privacy while maintaining seamless connectivity, and it is all done in software.”

Decoupling authentication and phone connectivity

Currently, for your phone to work, the network has to know your location and identify you as a paying customer. As such, both your identity and your location data are tracked by the network at all times. Data brokers and major operators have taken advantage of this system to profit off revealing sensitive user data; to date, in the United States, there are no federal laws restricting the use of location data.

“Today, whenever your phone is receiving or sending data, radio signals go from your phone to the cell tower, then into the network,” said Raghavan. “The networks can scoop up all that data and sell it to companies or information-for-hire middlemen. Even if you stop apps tracking your location, the phone still talks to the tower, which means the carrier knows where you are. Until now, it seemed like a fundamental thing we could never get around.”

But Raghavan, with study co-author Paul Schmitt, who recently joined USC’s Information Sciences Institute from Princeton University, found a way: they decoupled what’s known as authentication—who you are—from your phone connectivity. The key finding: there is no reason why your personal identifier has to grant you network connectivity.

Their new system works by breaking the direct line of communication between the user’s cellphone and the cell tower. Instead of sending a personally identifiable signal to the cell tower, it sends an anonymous “token.” It does this by using a mobile virtual network operator, such as Cricket or Boost, as a proxy or intermediary.

“The key is – if you want to be anonymous, how do they know you’re a paying customer?” said Raghavan. “In the protocol we developed, the user pays the bills, and gets a cryptographically signed token from the provider, which is anonymous. Now the identity in a specific location is separated from the fact that there is a phone at that location.”
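A classic way to realize this kind of anonymous, provider-signed token is a blind signature: the provider signs the token without ever seeing it, so presenting it later proves "paying customer" without revealing which customer. The sketch below uses an RSA blind signature with deliberately tiny, insecure parameters for illustration; PGPP's actual protocol details are specified in the paper.

```python
# Toy RSA blind-signature sketch (parameters far too small for real
# use). Real deployments use 2048+ bit keys and padded/hashed tokens.

# Provider's RSA keypair
p, q = 1009, 1013
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

# --- User side: blind the secret token before sending it ---
token = 123456 % n       # the user's secret token value
r = 54321                # random blinding factor, gcd(r, n) == 1
blinded = (token * pow(r, e, n)) % n

# --- Provider side: sign the blinded value (never sees the token) ---
blinded_sig = pow(blinded, d, n)

# --- User side: unblind to recover a valid signature on the token ---
sig = (blinded_sig * pow(r, -1, n)) % n

# Anyone can verify the token later without linking it to the signing
assert pow(sig, e, n) == token
print("token verified anonymously:", pow(sig, e, n) == token)
```

Because the signing step and the verification step are cryptographically unlinkable, the provider can confirm a subscriber paid without learning which tower, and hence which location, a given token was redeemed at.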

Restoring control

The duo, who have launched a startup called Invisv, prototyped and tested everything with real phones in the lab. Crucially, their approach adds almost zero latency and doesn’t introduce new bottlenecks, avoiding performance and scalability problems of other anonymity networks. The service could handle tens of millions of users on a single server and would be deployed seamlessly to customers through the network operator.

Because the system only stops a phone from identifying its user to the cell tower, other location-based services—such as searching for the nearest gas station, or contact tracing—still work as usual. The researchers hope the technology will be adopted by major networks as a default, particularly with mounting legal pressure to adopt new privacy measures.

“For the first time in human history, almost every single human being on the planet can be tracked in real-time,” said Raghavan. “Until now, we had to just silently accept this loss of control over our own data—we believe this new measure will help to restore some of that control.”

 

The push to cashless transit fare leaves some riders behind


Reports and Proceedings

PORTLAND STATE UNIVERSITY

Our multi-year study on automated transit fare collection offers a key finding that won't surprise you: Despite the convenience, the rush toward cashless fare systems has created barriers for lower-income riders seeking to use transit. Results from focus groups, surveys, and a review of current transit agency practices suggest that continuing to accept cash is a crucial way to keep transit accessible. However, dealing with cash has drawbacks: it’s time intensive and expensive. Using a detailed cost-benefit model, we explored the costs for agencies of maintaining some cash options and found that some simple approaches can be quite effective. The best bang for the buck? Cash collection on board buses.

Launched in 2019, the research project "Applying an Equity Lens to Automated Payment Solutions for Public Transportation" was supported by a Pooled Fund grant program from the National Institute for Transportation and Communities (NITC) and conducted at three universities: Portland State University (PSU), the University of Oregon (UO), and the University of Tennessee, Knoxville (UTK). The other funding partners were City of Eugene, OR, City of Gresham, OR, Lane Transit District, Clevor Consulting Group, and RTD (Regional Transportation District) Denver.

Aaron Golub of PSU served as the principal investigator, with co-investigators Anne Brown of UO, Candace Brakewood of UTK and John MacArthur of PSU.

WHY STUDY CASHLESS TRANSIT FARE?

Automated payment technologies can smooth operations and improve data collection, but the added convenience for the agency and some riders comes at a price: those systems require riders to have access to private internet, smartphones, and banking/credit services, access that is decidedly not universal. What happens to the riders who are left behind?

Fare payment systems have a long history of significant equity challenges, both in fare amounts and in how and where fare can be purchased and stored. The final report “Applying an Equity Lens to Automated Payment Solutions for Public Transportation” (PDF) is a detailed exploration of how transit riders pay for their fare, based on 2,303 intercept surveys and three focus groups with transit riders in Colorado and Oregon. Researchers looked at the use of cash, and focused on riders who may be excluded if cash options are removed as new fare payment systems are implemented in the coming years. The research team also interviewed employees at ten transit agencies to find out how their fare payment has been modernized in the past five years, how those agencies have evaluated the equity implications of these changes, and what programs they have deployed to mitigate the equity impacts.

Andrew Martin, Development Planner at Lane Transit District, served on the project's technical advisory committee. "Around the same time as this study, we were in the middle of purchasing and implementing our first electronic fare collection system. We had already decided to take a more customer-centric approach: instead of going completely cash-free, we determined that we were going to take on the costs of making sure our service remained accessible to all riders. It was good to see, in the research, a lot of the things that we were intuitively feeling turned out to be true. The cost-benefit analysis shows that the cost isn't as great as you think; by doing the equity mitigations, you might end up with higher ridership and offset the revenue loss," Martin said.

SOME KEY FINDINGS

Researchers found that a significant number (around 30%) of transit riders still rely heavily on paying cash on-board buses. Older and lower-income respondents had less access to smartphones and internet. 

Of those who do own smartphones, many are concerned about reaching data limits, and some depend solely on public Wi-Fi for internet connectivity. 

A small but significant number of riders (around 7%) have no access to formal banking services.

The researchers worked with transit organizations in three case cities: Eugene, OR (population 247,421); Denver, CO (population 2,374,203); and Portland, OR (population 1,849,898). The cost-benefit model can be used by any size agency to implement new fare payment technology.

"One thing that would be really helpful to a lot of agencies is the cost modeling [the researchers] did. It estimates the general cost to put new technologies out there, like ticket vending machines. A lot of smaller agencies may not have things like that, and they're really useful for customers. So even aside from the equity focus, there is a lot of good info on costs of implementing a system," said Martin.

COST BENEFIT MODEL

Researchers constructed a quantitative cost-benefit model that combines first-year capital investments with 10 years of maintenance, operations and capital replacement into a single total cost estimate. This approach reflects the full lifecycle cost of the fare payment system, capturing both the initial investment and the recurring annual costs.

They then used the model to explore and compare four scenarios along with an additional base (no-cash) case. Scenarios are based on the feedback received from transit agencies and a review of best practices nationally: 

  • Base - (No cash accepted anywhere) 
  • Scenario 1 - No cash anywhere, adds retail network 
  • Scenario 2 - Cash on board, not at TVMs, no retail 
  • Scenario 3 - Cash only at TVMs, no retail 
  • Scenario 4 - Cash accepted everywhere
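The structure of such a scenario comparison can be sketched as below. All figures are invented for illustration; the report's model inputs and results differ. Total cost is first-year capital plus 10 years of recurring costs, and each cash option wins back some of the boardings lost under the fully cashless base case.

```python
# Illustrative lifecycle cost comparison across cash-acceptance
# scenarios. Every number here is made up; only the structure
# (capital + 10 years of O&M, cost per recovered boarding) mirrors
# the approach described in the report.

YEARS = 10

scenarios = {
    # name: (capital $, annual O&M $, excluded boardings recovered / yr)
    "base: no cash":      (0,       0,       0),
    "1: retail network":  (50_000,  20_000,  60_000),
    "2: cash on board":   (150_000, 60_000,  90_000),
    "3: cash at TVMs":    (400_000, 80_000,  40_000),
    "4: cash everywhere": (550_000, 140_000, 100_000),
}

totals = {}
for name, (capital, annual, boardings) in scenarios.items():
    totals[name] = capital + YEARS * annual
    per_boarding = (totals[name] / (YEARS * boardings)) if boardings else float("inf")
    print(f"{name:20s} ${totals[name]:>9,} total, ${per_boarding:.2f} per recovered boarding")
```

Dividing lifecycle cost by recovered boardings is what lets the model rank mitigations per excluded rider accommodated, the comparison the researchers describe below.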

"The heart of this cost-benefit model is, how many riders cannot ride under the different scenarios? We were able to study more than 2,000 riders, and, in the fully no-cash base case, we knew that about 8% of riders could not ride, based on our surveys. Their answers to how they would ride with different configurations of ticket vending machines and cash on board informed this model," Golub said.

SELECTING MITIGATION STRATEGIES

Any of scenarios 1-4 can mitigate some of the equity implications of going cashless, relative to the no-cash baseline. Which scenario is best for a particular setting depends greatly on how many riders are potentially excluded by a cashless fare system, and on which options those riders would most likely use, given the opportunity to pay with cash. Based on the results of the cost-benefit analysis for each of the three case cities, researchers developed some general principles that agencies should keep in mind when choosing strategies to help keep transit accessible.

"When you're looking at 10 different systems and you've got to justify to the board, the general manager, the community, why you're spending money a certain way – it's really helpful to have research like this that shows that the costs are not some huge amount. When equity is cheap to obtain, it's really easy to justify doing that," Martin said.

Larger agencies spend less to collect fare. This impacts the cost-benefit calculation of adding additional capabilities. Small agencies, the researchers suggest, should seriously consider going fare-free. The Eugene case study (the smallest agency) shows that, across the board, fare collection consumes a large part of fare revenues - in the full cash scenario, about 40% of revenue is spent on collecting fare. 

Retail is a low-cost option: Accepting cash payments at retail locations is by far the lowest cost option to add cash capabilities in terms of total cost, net costs, and in terms of cost to accommodate potentially excluded riders. It is also the most commonly used mitigation, according to interviews with agencies. However, the retail network still poses significant geographical barriers for many riders, and does not offer the kind of coverage and access that cash collection on-board would offer.

Simple cash collection on buses could be an important bridge: According to the ridership survey data, in addition to being a low-cost option for agencies, this mitigation also added significant ridership. Accepting cash at ticket vending machines was found to be much more expensive than accepting cash on board.

When larger numbers of riders are excluded, equity mitigations are cheaper: The larger the number of riders excluded, the bigger the impact equity mitigations have and the cheaper they are per additional rider and per additional fare collected. The Portland-Gresham case study showed relatively few riders were excluded when cash was eliminated, compared with the other case cities. As a result, adding retail cash collection there cost $0.27 per new boarding. In Denver and Eugene, where larger populations of riders were potentially excluded by cashless fare, adding retail capabilities cost only 14 cents and 1.9 cents per boarding, respectively.
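The arithmetic behind that principle can be sketched in a few lines. This is only an illustration with hypothetical figures — the study's actual per-city numbers come from its own survey and cost data, and the function below is not part of the researchers' model:

```python
# Sketch of the cost-per-new-boarding calculation described above.
# All figures here are hypothetical; the study derives its own from
# rider surveys and agency cost data.

def cost_per_new_boarding(annual_mitigation_cost, excluded_riders, trips_per_rider):
    """Cost of an equity mitigation per boarding it restores.

    The more riders a cashless system would exclude, the more boardings
    a cash mitigation wins back, and the cheaper it is per boarding.
    """
    restored_boardings = excluded_riders * trips_per_rider
    return annual_mitigation_cost / restored_boardings

# Few excluded riders -> high cost per restored boarding
print(cost_per_new_boarding(50_000, 500, 400))    # 0.25 ($0.25/boarding)

# Many excluded riders, same mitigation cost -> much cheaper per boarding
print(cost_per_new_boarding(50_000, 5_000, 400))  # 0.025 (2.5 cents/boarding)
```

The same fixed mitigation cost spread over ten times as many restored boardings is ten times cheaper per boarding, which is why the mitigations looked most cost-effective in Denver and Eugene.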

"Within the 10-year transition, some of the worst effects could be avoided by using some of these mitigations," Golub told NextCity in a May 25, 2021 article, "What Happens When Cash Fares Are Eliminated?"

This research was funded by the National Institute for Transportation and Communities; the City of Eugene, OR, City of Gresham, OR, Lane Transit District, Clevor Consulting Group, and RTD (Regional Transportation District) Denver.

ABOUT THE PROJECT

Applying an Equity Lens to Automated Payment Solutions for Public Transportation

Aaron Golub and John MacArthur, Portland State University; Anne Brown, University of Oregon; Candace Brakewood, University of Tennessee, Knoxville

RELATED RESEARCH

To learn more about this and other NITC research, sign up for our monthly research newsletter.

The National Institute for Transportation and Communities (NITC) is one of seven U.S. Department of Transportation national university transportation centers. NITC is a program of the Transportation Research and Education Center (TREC) at Portland State University. This PSU-led research partnership also includes the Oregon Institute of Technology, University of Arizona, University of Oregon, University of Texas at Arlington and University of Utah. We pursue our theme — improving mobility of people and goods to build strong communities — through research, education and technology transfer.

 

Scrap the nap: Study shows short naps don’t relieve sleep deprivation


Peer-Reviewed Publication

MICHIGAN STATE UNIVERSITY

No to naps

IMAGE: Researchers found that short naps of 30 or 60 minutes had no measurable effect in mitigating sleep deprivation.

CREDIT: Creative Commons via Pxhere

EAST LANSING, Mich. – A nap during the day won’t restore a sleepless night, says the latest study from Michigan State University’s Sleep and Learning Lab.

“We are interested in understanding cognitive deficits associated with sleep deprivation. In this study, we wanted to know if a short nap during the deprivation period would mitigate these deficits,” said Kimberly Fenn, associate professor at MSU, study author and director of MSU’s Sleep and Learning Lab. “We found that short naps of 30 or 60 minutes did not show any measurable effects.”

The study was published in the journal Sleep and is among the first to measure the effectiveness of shorter naps — which are often all people have time to fit into their busy schedules.

“While short naps didn’t show measurable effects on relieving the effects of sleep deprivation, we found that the amount of slow-wave sleep that participants obtained during the nap was related to reduced impairments associated with sleep deprivation,” Fenn said.

Slow-wave sleep, or SWS, is the deepest and most restorative stage of sleep. It is marked by high amplitude, low frequency brain waves and is the sleep stage when your body is most relaxed; your muscles are at ease, and your heart rate and respiration are at their slowest.

“SWS is the most important stage of sleep,” Fenn said. “When someone goes without sleep for a period of time, even just during the day, they build up a need for sleep; in particular, they build up a need for SWS. When individuals go to sleep each night, they will soon enter into SWS and spend a substantial amount of time in this stage.”

Fenn’s research team – including MSU colleague Erik Altmann, professor of psychology, and Michelle Stepan, a recent MSU alumna currently working at the University of Pittsburgh – recruited 275 college-aged participants for the study.

The participants completed cognitive tasks when arriving at MSU’s Sleep and Learning Lab in the evening and were then randomly assigned to one of three groups: the first was sent home to sleep; the second stayed at the lab overnight and had the opportunity to take either a 30- or a 60-minute nap; and the third stayed in the lab overnight without napping at all (the sleep deprivation condition).

The next morning, participants reconvened in the lab to repeat the cognitive tasks, which measured attention and placekeeping, or the ability to complete a series of steps in a specific order without skipping or repeating them — even after being interrupted.

“The group that stayed overnight and took short naps still suffered from the effects of sleep deprivation and made significantly more errors on the tasks than their counterparts who went home and obtained a full night of sleep,” Fenn said. “However, every 10-minute increase in SWS reduced errors after interruptions by about 4%.”

These numbers may seem small, but when considering the types of errors that are likely to occur in sleep-deprived workers — such as surgeons, police officers or truck drivers — a 4% decrease in errors could potentially save lives, Fenn said.
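The reported relationship — roughly 4% fewer post-interruption errors per 10 minutes of SWS — can be turned into a back-of-the-envelope estimate. The baseline error count and the assumption that the reduction compounds per 10-minute increment are illustrative, not claims from the study:

```python
# Rough sketch of the reported SWS effect: about a 4% reduction in
# post-interruption errors per 10 minutes of slow-wave sleep obtained.
# The baseline of 100 errors and the compounding assumption are
# illustrative; the study reports the relationship only approximately.

def expected_errors(baseline_errors, sws_minutes, reduction_per_10min=0.04):
    """Expected post-interruption errors after a nap containing SWS.

    Assumes the ~4% reduction compounds once per 10-minute increment
    of slow-wave sleep.
    """
    increments = sws_minutes / 10
    return baseline_errors * (1 - reduction_per_10min) ** increments

print(expected_errors(100, 0))          # no SWS: 100.0 errors
print(round(expected_errors(100, 30)))  # 30 min of SWS: 88 errors
```

Even under these simple assumptions the nap only trims errors, which matches the finding that napping participants still performed worse than those who slept a full night.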

“Individuals who obtained more SWS tended to show reduced errors on both tasks. However, they still showed worse performance than the participants who slept,” she said.

Fenn hopes the findings underscore the importance of prioritizing sleep and make clear that naps, even when they include SWS, cannot replace a full night of sleep.

(Note for media: Please include the following link to the study in all online media coverage: https://academic.oup.com/sleep/advance-article-abstract/doi/10.1093/sleep/zsab152/6307588)

###

Michigan State University has been working to advance the common good in uncommon ways for more than 165 years. One of the top research universities in the world, MSU focuses its vast resources on creating solutions to some of the world’s most pressing challenges, while providing life-changing opportunities to a diverse and inclusive academic community through more than 200 programs of study in 17 degree-granting colleges.

 

For MSU news on the Web, go to MSUToday. Follow MSU News on Twitter at twitter.com/MSUnews

Closer encounters: How UFOs became a target for US intelligence again

    
Tourists waiting outside the gift shop near Area 51, a US Air Force facility said to host an alien spacecraft that has become a popular destination for alien enthusiasts.
 © John Locher, AP

Text by: Sébastian SEIBT

A hotly anticipated US government report on unidentified aerial phenomena (UAP) was released on Friday. After decades of mysterious reported sightings, the Pentagon had too little data to accurately determine the nature of these flying objects. But the fact that the report exists at all is remarkable — the latest chapter in a story dating back to 1947.

While Doctor Who, The X-Files and ET fuel our imaginations with the possibility of alien life on Earth, they remain works of fiction. But the nine-page report on UFOs, released by the US Office of the Director of National Intelligence and the Pentagon last Friday, is very much real.

The document discloses 144 observations of what the government calls “unidentified aerial phenomena” (UAP), dating all the way back to 2004, most of which were reported by US Navy personnel. The mysterious flying objects, described in the report as a “potential threat” to US national security, were said to “probably lack a single explanation”.

Although the nature of these UFOs is still unknown, the document’s publication is a climactic moment in the decades spent analysing sightings. The US military has been tracking, deflecting and debunking observations of UFOs and “flying saucers” since the 1950s.

The truth is still out there


Bizarrely enough, efforts to disclose intelligence on UFOs wouldn’t have been possible without Donald Trump and the Covid-19 pandemic. In December 2020, just before leaving the White House, Trump signed a $2.3 trillion (€1.9tn) coronavirus relief bill into law. The bill included a clause ordering the US Department of Defense to tell Congress everything they knew about UFOs. And they had six months to do so.

The news ignited excitement among alien enthusiasts, eagerly waiting for the report to be released. But to their dismay, there was little evidence to confirm or deny any intergalactic visitations. Possible explanations for UAP included birds, drones, atmospheric phenomena like ice crystals, innovative developments by US government entities and uncharted technologies developed by adversaries such as Russia or China. However, the report clearly stated a “lack [of] sufficient information” in its dataset “to attribute incidents to specific explanations”.


Only one of the reported sightings was fully accounted for. Classified as airborne clutter, it was identified “with high confidence” as a large, deflating balloon. “The others remain unexplained,” the report said.

There is still hope for enthusiasts, though. In the next three months, the Pentagon will develop a new strategy for collecting and tracking information on potential sightings.

“It is critical that the United States maintain operations security and safety at DoD ranges,” Deputy Defense Secretary Kathleen Hicks wrote in a memo released on Friday. “To this end, it is equally critical that all US military aircrews or government personnel report whenever aircraft or other devices interfere with military training. This includes the observation and reporting of UAPs.”

UFOs: a bipartisan fascination


But a lack of conclusive evidence is precisely what could fuel more theories of otherworldly visitations, to which the US government is clearly not immune. The surprising decision to publish the report was described by the BBC as “a cultural shift that saw the US military and US political leadership go from extra-terrestrial-sceptic to ET-curious”.

In fact, UFOs have become a rare unifying topic across political lines. “After this last year, it’s kind of nice to see something that’s bipartisan,” Robert Powell, an executive board member of the Scientific Coalition for UAP Studies, told the Washington Post.

On the left, former presidential candidate Hillary Clinton, Bill Clinton’s chief of staff John Podesta and former Democratic Senate leader Harry M. Reid all expressed an interest in researching unidentified aerial phenomena. Former President Barack Obama also chimed in, acknowledging in May 2021 that “there’s footage and records of objects in the skies that we don’t know exactly what they are”.

In the search for greater transparency, Marco Rubio, the top Republican on the Senate intelligence committee, said in May that despite the “stigma on Capitol Hill” resulting in “giggles” from his colleagues, the issue was worth investigating. He was joined by Fox News host Tucker Carlson, who referred to UFOs as a “very big issue”.

From the margins to the centre-stage


Long relegated to fringe, unscientific and even conspiratorial theorizing reserved for “ufologists”, the topic is being taken seriously by the US government for the first time since the 1950s. After a 1947 newspaper reported that the pilot of a small plane had seen several “saucer-like” objects flying near Mount Rainier in Washington state, ordinary citizens followed suit, and sightings of “flying saucers” exploded in number.

In the summer of that same year, theories circulated that an alien spacecraft had crashed near a military site in Roswell, New Mexico. US military officials visited the crash site and said they found remnants of a crashed weather balloon, but their findings weren’t enough to quell what had already become a national sensation.

Because of the Cold War, the US military wasn’t as worried about coming face to face with an alien as it was about discovering a Soviet machine more technically advanced than its own. So in 1953, a CIA advisory panel of experts was formed to officially investigate unidentified flying objects, saying they posed a potential threat to national security.

In a lengthy investigation on the relationship between UFOs and the Pentagon published by the New Yorker magazine in April 2021, journalist Gideon Lewis-Kraus wrote: “The Cold War made it crucial that the US government be perceived to have full control over its airspace.”

What ensued was a vast media campaign in the 1960s, orchestrated by the US government to try to dissuade the population from tracking down “flying saucers”. For a period of about 40 years, the US government was entirely silent on the issue. This never stopped ufologists from making pilgrimages along the “Extraterrestrial Highway” in the Nevada desert to Area 51, a US Air Force facility said to host an alien spacecraft that has become a popular tourist destination.

Then, in 2017, the New York Times published an article about the Pentagon’s mysterious UFO programme and the subject came back into the limelight. The article explained that sightings had frequently come from the US military itself, shattering the myth that the government considered civilian reports to be nonsense spewed by conspiracy theorists.

Since then, US media outlets have been able to access several videos and audio recordings of US Air Force pilots who have come face to face with unidentifiable objects flying at supersonic speeds, their movements hard to replicate.

Playing politics with Blink-182

Months before the report on UAP was ordered by Trump, the US Department of Defense formally released three Navy videos showing unidentified aerial phenomena blurrily moving against dark backdrops. The Pentagon’s intention was to “clear up any misconceptions”, but the release unintentionally stoked ufologists’ hopes that extraterrestrial life exists.

The videos were first released between December 2017 and March 2018 by To The Stars Academy of Arts & Sciences, a private company co-founded by former Blink-182 member Tom DeLonge.

As to the question of why UFOs are coming back into the spotlight now, many speculate it could be tied to tense US-Russia relations. Just like in the beginning of the Cold War, the US once again feels its military and technological superiority is under threat.

David Clarke, an associate professor at Sheffield Hallam University in the UK, told the Washington Post: “My sort of view of this is UFOs are a product of the Cold War.” He said he found it “interesting that UFOs should once again come into public prominence” as they have today.

This article was adapted from the original in French.
German sports giant Adidas to sell Reebok

Adidas had owned Reebok for 15 years but was unable to revive the ailing brand. The German sporting goods behemoth saw its stock surge with the announcement of the sale.




High-profile stars like Victoria Beckham, Cardi B and Ariana Grande tried to help revive the brand

German sportswear firm Adidas said Thursday it will sell the brand Reebok to a US company, in a deal worth €2.1 billion ($2.5 billion).

Authentic Brands Group (ABG) will be Reebok's new owner, after Adidas failed to lift the fortunes of a brand it acquired in 2006.


The large US firm owns a number of brands including fashion retailers JCPenney, Forever21 and Brooks Brothers, as well as the magazine Sports Illustrated.


"Reebok has been a valued part of Adidas, and we are grateful for the contributions the brand and the team behind it have made to our company," Adidas CEO Kasper Rorsted said in a press release.

"With this change in ownership, we believe the Reebok brand will be well-positioned for long-term success," Rorsted said.

A struggling brand


ABG chief Jamie Salter said it was an "honor" to carry Reebok's legacy forward. "We look forward to working closely with the Reebok team to build on the brand's success," Salter said.

Boston-based Reebok was acquired by Adidas 15 years ago for €3.1 billion ($3.8 billion), as the German sports giant sought to take on US rival Nike.

Despite high-profile collaborations with the likes of Victoria Beckham, Cardi B and Ariana Grande in recent years, the brand struggled.

While Adidas said the sale of Reebok would have "no impact" on its financial outlook in the coming year, the company saw its stock price rise 1.6% after announcing the sale.

jcg/rs (Reuters, dpa)