Sunday, February 18, 2024

 

Meow or rooaaar - exotic cats' ability to recognize familiar caregivers' voices


Peer-Reviewed Publication

PEERJ

Interview with Professor Jennifer Vonk 

See the video: HTTPS://WWW.YOUTUBE.COM/WATCH?V=9DLR4OJ7OTG (Credit: PeerJ)




In a recent PeerJ Life & Environment study, Professor Jennifer Vonk from Oakland University presents compelling evidence that exotic cats possess the remarkable ability to discriminate between familiar and unfamiliar human voices.

The article - "Catcalls: Exotic Cats Discriminate the Voices of Familiar Caregivers" - delves into the often-overlooked realm of voice recognition among Felidae species. While much attention has been paid to domestication and early experiences in understanding animals' ability to differentiate between familiar and unfamiliar humans, few studies have explored this phenomenon in the feline family.

Using carefully designed experiments, Professor Vonk’s graduate student, Taylor Crews, and her team investigated whether non-domesticated Felidae species could recognize familiar human voices. The results yielded several key findings.

The researchers found consistent evidence of voice recognition across pilot and main studies involving 25 cats from various species, including lions, tigers, and cheetahs. Cats responded more quickly, intensely, and for longer durations to familiar voices compared to unfamiliar ones, regardless of the use of their names or rearing history.

The findings suggest that close human contact, rather than domestication, is associated with the ability of cats to discriminate between human voices. They also challenge the notion that less social species lack socio-cognitive abilities comparable to those of more gregarious species. Professor Vonk explains, “Non-group-living animals can exhibit social cognitive abilities such as heterospecific vocal recognition, so we should not neglect the study of social cognition in less highly social species.”

The implications of this study are profound, particularly considering the widespread housing of cats of all species in human care. Understanding their capacity to differentiate familiar from unfamiliar human voices could have significant implications for their welfare and interactions with caregivers.

This study adds to the growing body of work challenging stereotypes about cats as aloof creatures. It underscores the importance of recognizing the cognitive abilities of all species, even those traditionally considered less social.

For media inquiries or to request an interview with Professor Jennifer Vonk, please contact: vonk@oakland.edu

Study shows background checks don’t always check out


Bad data, human error can lead to unfair reports, or miss significant wrongdoing


 NEWS RELEASE 

UNIVERSITY OF MARYLAND




Employers making hiring decisions, landlords considering possible tenants and schools approving field trip chaperones all widely use commercial background checks. But a new multi-institutional study co-authored by a University of Maryland researcher shows that background checks themselves can’t be trusted.

Assistant Professor Robert Stewart of the Department of Criminology and Criminal Justice and Associate Professor Sarah Lageson of Rutgers University suspected that the loosely regulated entities that businesses and landlords rely on to run background checks produce faulty reports, and their research bore out this hunch. The results were published last week in Criminology.

“There’s a common, taken-for-granted assumption that background checks are an accurate reflection of a person’s criminal record, but our findings show that’s not necessarily the case,” Stewart said. “My co-author and I found that there are lots of inaccuracies and mistakes in background checks caused, in part, by imperfect data aggregation techniques that rely on names and birth dates rather than unique identifiers like fingerprints.”

The erroneous results of a background check can “go both ways,” Stewart said: They can miss convictions that a potential employer would want to know about, or they can falsely assign a conviction to an innocent person through transposed numbers in a birth date, incorrect spelling of a name or simply the existence of common aliases.

Stewart and Lageson’s study examined official state rap sheets for 101 study participants in New Jersey; these rap sheets list all arrests, criminal charges, and case dispositions recorded in the state, linked to the record subject’s name and fingerprints. The researchers then ordered background checks on the same participants from a regulated service provider, the same type of company an employer, landlord, or school system might use, and also looked up background checks on them from unregulated data providers, such as popular “people search” websites.

“We find that both types of background checks have numerous ‘false positive’ results, reporting charges that our study participants did not have, as well as ‘false negatives,’ not reporting charges that our study participants did have,” Stewart said.

More than half of study participants had at least one false-positive error on their regulated and unregulated background checks. About 90% of participants had at least one false-negative error.

Stewart and Lageson identified a number of problems with private-sector criminal records: mismatched data that create false negatives, missing case dispositions that create incomplete and misleading criminal records, and incorrect data that create false positives.

For both the commercial and public-use background check services, the driving force behind these errors is likely the way algorithms are used to match records.

“These companies and platforms are linking records together based on names, aliases and birth dates rather than fingerprints, which is what the police use to match people to records,” Stewart said. “So these companies end up lumping people together who are not the same person.”
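To make this failure mode concrete, here is a minimal, hypothetical Python sketch of name-and-birth-date matching of the kind described above. It is not any provider's actual code, and all names, dates, and charges in it are invented; it simply shows how a similar alias plus a matching birth date can pull a stranger's charges onto a person's report, while matching on a unique identifier would not.

```python
# Illustrative sketch only (not any background check provider's real code).
# It shows how aggregating records on normalized names and birth dates,
# rather than on a unique identifier such as a fingerprint-backed ID,
# can attach another person's charges to a report. All data are invented.

import re
from dataclasses import dataclass

@dataclass
class CourtRecord:
    name: str        # name or alias as it appears in the source record
    dob: str         # "YYYY-MM-DD"
    charge: str
    person_id: str   # unique identifier; often missing from bulk data

def normalize(name: str) -> str:
    # Strip punctuation, spaces, and case: "J. Smith" and "J Smith" collide.
    return re.sub(r"[^a-z]", "", name.lower())

def name_dob_match(record: CourtRecord, name: str, dob: str) -> bool:
    return normalize(record.name) == normalize(name) and record.dob == dob

records = [
    CourtRecord("J. Smith", "1980-02-01", "drug possession", person_id="A"),
    # A different person who once used a similar alias and shares a birth date:
    CourtRecord("J Smith",  "1980-02-01", "attempted murder", person_id="B"),
]

subject_name, subject_dob, subject_id = "J. Smith", "1980-02-01", "A"

by_name_dob  = [r.charge for r in records if name_dob_match(r, subject_name, subject_dob)]
by_unique_id = [r.charge for r in records if r.person_id == subject_id]

print(by_name_dob)   # ['drug possession', 'attempted murder']  <- false positive
print(by_unique_id)  # ['drug possession']
```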

Through interviews with study participants, Stewart and Lageson explored the consequences of the errors, including limited access to employment and housing, as well as the difficulty of correcting them.

For example, one participant who had a pair of drug convictions decades ago had been mistakenly linked to much more serious crimes, including attempted murder.

“The problem was, he had at one point used an alias, and another man with a very extensive record had used a similar alias, and all his charges were linked to our participant,” Stewart said. “As a result, this other man’s record followed our participant for decades and helped to explain why he always had trouble securing a decent job.”

The researchers interviewed participants who described how errors in their background checks limited their access to education.

“We’re talking about a violation of the basic principles of fairness in our society and in the legal system,” Lageson said. “Unfortunately, people have little legal recourse when facing these issues. It’s clear this is an area ripe for policy reform.”

While commercial background check providers are ostensibly regulated by the Fair Credit Reporting Act and other guidelines, Stewart and Lageson’s research demonstrates that considerable errors persist.

Stewart said that public awareness of the potentially erroneous and incomplete results of background checks will be key to addressing this systemic social problem.

“Other countries are handling background checks in different ways, ways that may take more time, but there are better models out there,” Stewart said. “It may be better for background checks to be done through the state, or the FBI, or through other ways that use biometric data. It’s important for people to realize that there’s a lot at stake.”

Scientists try out stone age tools to understand how they were used


Used stone edges might help illuminate timber use by early humans

Peer-Reviewed Publication

TOKYO METROPOLITAN UNIVERSITY

Traces left by tree-felling experiments on replica stone age tools. 

Characteristic macroscopic (top) and microscopic (bottom) traces might be used to determine how stone edges were used. (Credit: Tokyo Metropolitan University)




Tokyo, Japan – Researchers from Tokyo Metropolitan University crafted replica stone age tools and used them for a range of tasks to see how different activities create traces on the edge. They found that a combination of macroscopic and microscopic traces can tell us how stone edges were used. Their criteria help separate tools used for wood-felling from other activities. Dated stone edges may be used to identify when timber use began for early humans.

 

For prehistoric humans, improvements in woodworking technology were revolutionary. While Paleolithic (early stone age) artifacts point to the use of wood for simple tools such as spears or throwing sticks, later Mesolithic and Neolithic artifacts reflect far more sophisticated uses, for building houses, canoes, bows, and wells. This is underscored by the availability of tools that let people craft more and more complex wooden instruments, particularly polished stone axes.

But while developed wood-processing technology is generally believed to be associated with a way of life seen in the Neolithic age of the Holocene, starting approximately 10,000 years ago, ground stone axes have been found from periods significantly before this. In sites around Australia and Japan, many ground edge artifacts from Marine Isotope Stage 3 (about 60,000 - 30,000 years ago) have been recovered. The important question becomes what they were used for at such an early stage.

To address this, a team led by Assistant Professor Akira Iwase from Tokyo Metropolitan University has tried to establish what kinds of traces might be left on ground stone edges when they are used for different activities. They took a hands-on approach and crafted replicas of tools that might have been used in the Early Upper Paleolithic age (about 38,000 - 30,000 years ago). Edges were crafted and polished using knapping and grinding techniques that would have been available at the time. Though hafts have not been recovered from Japanese sites of the period, the team adopted methods from Irian Jaya to attach handles to the stone edges, creating adzes, axes, and chisels.

The team then got to work using the tools for 15 different activities, including tree-felling, hide-processing, and butchering, as well as “non-use” events like carrying them around and trampling them. Edges were then examined in depth for both macroscopic and microscopic traces of their history. They found that impacts like those required for cutting down trees leave distinctive, macroscopic fractures on the stone; micron-scale friction between the stone edge and the wood also forms microscopic traces that can be used for diagnosis. While fractures alone cannot tell us what a tool was used for, the team found that combining macroscopic and microscopic evidence supports reliable conclusions about whether an edge was used to fell trees.

The team believes that if similar traces are found on the edges of real artifacts from Marine Isotope Stage 3 sites, it would mean that humans had honed woodworking technology significantly earlier than is currently believed. This would change our understanding of stone tool use by Ice Age humans and of how such technology spread into different environments.

This work was supported by JSPS KAKENHI Grant Numbers 15K16874, 18H03596, 20K13235, 23H04840, and 23H00009.

Viruses that can help ‘dial up’ carbon capture in the sea


Researchers begin applying lessons learned from the ocean to soils

Reports and Proceedings

OHIO STATE UNIVERSITY




DENVER – Armed with a catalog of hundreds of thousands of DNA and RNA virus species in the world’s oceans, scientists are now zeroing in on the viruses most likely to combat climate change by helping trap carbon dioxide in seawater or, using similar techniques, different viruses that may prevent methane’s escape from thawing Arctic soil.

By combining genomic sequencing data with artificial intelligence analysis, researchers have identified ocean-based viruses and assessed their genomes to find that they “steal” genes from other microbes or cells that process carbon in the sea. Mapping microbial metabolism genes, including those for underwater carbon metabolism, revealed 340 known metabolic pathways throughout the global oceans. Of these, 128 were also found in the genomes of ocean viruses.

“I was shocked that the number was that high,” said Matthew Sullivan, professor of microbiology and director of the Center of Microbiome Science at The Ohio State University.

Having mined this massive trove of data via advances in computation, the team has now revealed which viruses play a role in carbon metabolism and is using this information in newly developed community metabolic models to predict what engineering the ocean microbiome with viruses for better carbon capture would look like.

“The modeling is about how viruses may dial up or dial down microbial activity in the system,” Sullivan said. “Community metabolic modeling is telling me the dream data point: which viruses are targeting the most important metabolic pathways, and that matters because it means they’re good levers to pull on.”

Sullivan presented the research today (Feb. 17, 2024) at the annual meeting of the American Association for the Advancement of Science in Denver.

Sullivan was the virus coordinator for the Tara Oceans Consortium, a three-year global study of the impact of climate change on the world’s oceans and the source of 35,000 water samples containing the microbial bounty. His lab focuses on phages, viruses that infect bacteria, and their potential to be scaled up in an engineering framework to manipulate marine microbes into converting carbon into the heaviest organic form that will sink to the ocean floor.

“Oceans soak up carbon, and that buffers us against climate change. CO2 is absorbed as a gas, and its conversion into organic carbon is dictated by microbes,” Sullivan said. “What we’re seeing now is that viruses target the most important reactions in these microbial community metabolisms. This means we can start investigating which viruses could be used to convert carbon toward the kind we want.

“In other words, can we strengthen this massive ocean buffer to be a carbon sink to buy time against climate change, as opposed to that carbon being released back into the atmosphere to accelerate it?”

In 2016, the Tara team determined that carbon sinking in the ocean was related to the presence of viruses. It is thought that viruses help sink carbon when virus-infected carbon-processing cells cluster into larger, sticky aggregates that drop to the ocean floor. The researchers developed AI-based analytics to identify from thousands of viruses which few are “VIP” viruses to culture in the lab and work with as model systems for ocean geoengineering.

This new community metabolic modeling, developed by collaborator Professor Damien Eveillard of the Tara Oceans Consortium, helps the team understand what the unintended consequences of such an approach might be. Sullivan’s lab is applying these oceanic lessons to engineering microbiomes with viruses in human settings: aiding recovery from spinal cord injury, improving outcomes for infants born to mothers with HIV, combating infection in burn wounds, and more.

“The conversation we’re having is, ‘How much of this is transferable?’” said Sullivan, also a professor of civil, environmental and geodetic engineering. “The overall goal is engineering microbiomes toward what we think is something useful.”

He also reported on early efforts to use phages as geoengineering tools in an entirely different ecosystem: the permafrost in northern Sweden, where microbes both change the climate and respond to climate change as the frozen soil thaws. Virginia Rich, associate professor of microbiology at Ohio State, is co-director of the National Science Foundation-funded EMERGE Biology Integration Institute based at Ohio State that organizes the microbiome science at the Sweden field site. Rich also co-led previous research that identified a lineage of single-cell organisms in the thawing permafrost soil as a significant producer of methane, a potent greenhouse gas.

Rich co-organized the AAAS session with Ruth Varner of the University of New Hampshire, who co-directs the EMERGE Institute, which is focusing on better understanding how microbiomes respond to permafrost thaw and the resulting climate interactions.

Sullivan’s talk was titled “From ecosystems biology to managing microbiomes with viruses,” and was presented at the session titled “Microbiome-Targeted Ecosystem Management: Small Players, Big Roles.”

The oceans work is supported by the National Science Foundation, the Gordon and Betty Moore Foundation and Tara Oceans, and, in addition to the NSF, the soils work has been funded by the Department of Energy and the Grantham Foundation.


 

CRISPR-copies: New tool accelerates and optimizes genome editing


CABBI researchers publicly share a new tool to revolutionize CRISPR gene editing


UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN INSTITUTE FOR SUSTAINABILITY, ENERGY, AND ENVIRONMENT

CRISPR COPIES Graphic 

CRISPR-COPIES has applications in synthetic biology toolkit characterization, gene therapy, and metabolic engineering. (Credit: Aashutosh Boob et al.)




CRISPR/Cas systems have undergone tremendous advancement in the past decade. These precise genome editing tools have applications ranging from transgenic crop development to gene therapy and beyond. And with their recent development of CRISPR-COPIES, researchers at the Center for Advanced Bioenergy and Bioproducts Innovation (CABBI) are further improving CRISPR’s versatility and ease of use.

“CRISPR-COPIES is a tool that can quickly identify appropriate chromosomal integration sites for genetic engineering in any organism,” said Huimin Zhao, CABBI Conversion Theme Leader and Steven L. Miller Chair of Chemical and Biomolecular Engineering (ChBE) at the University of Illinois. “It will accelerate our work in the metabolic engineering of non-model yeasts for cost-effective production of chemicals and biofuels.”

Gene editing has revolutionized scientists’ capabilities in understanding and manipulating genetic information. This form of genetic engineering allows researchers to introduce new traits into an organism, such as resistance to pests or the ability to produce a valuable biochemical.

With CRISPR/Cas systems, researchers can make precise, targeted genetic edits. However, locating optimal integration sites in the genome for these edits has been a critical and largely unsolved problem. Historically, when researchers needed to determine where to target their edits, they would typically manually screen for potential integration sites, then test the site by integrating a reporter gene to assess its cellular fitness and gene expression levels. It’s a time- and resource-intensive process.

To address this challenge, the CABBI team developed CRISPR-COPIES, a COmputational Pipeline for the Identification of CRISPR/Cas-facilitated intEgration Sites. This tool can identify genome-wide neutral integration sites for most bacterial and fungal genomes within two to three minutes.

“Finding the integration site in the genome manually is like searching for a needle in a haystack,” said Aashutosh Boob, a ChBE Ph.D. student at the University of Illinois and primary author of the study. “However, with CRISPR-COPIES, we transform the haystack into a searchable space, empowering researchers to efficiently locate all the needles that align with their specific criteria.”
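As a rough illustration of what turning the genome into a “searchable space” involves, the toy Python sketch below scans a randomly generated sequence for SpCas9-style target sites (a 20-nucleotide protospacer followed by an NGG PAM) that lie at least a fixed distance from annotated genes. This is only a conceptual sketch under assumed, simplified criteria; the actual CRISPR-COPIES pipeline applies its own, much richer criteria (including considerations of cellular fitness and gene expression) and is available through its publication and web interface.

```python
# Toy sketch of a neutral-site search, NOT the CRISPR-COPIES pipeline.
# The sequence, gene coordinates, and distance threshold are all made up;
# the point is only to show the kind of genome-wide scan such a tool automates.

import random
import re

random.seed(0)
genome = "".join(random.choice("ACGT") for _ in range(2000))  # stand-in for a real genome
genes = [(100, 400), (900, 1300)]   # hypothetical annotated gene spans (start, end)
MIN_DIST = 150                      # hypothetical "safe" distance from any gene

def far_from_genes(pos: int) -> bool:
    return all(pos < start - MIN_DIST or pos > end + MIN_DIST for start, end in genes)

# SpCas9-style target: 20-nt protospacer immediately followed by an NGG PAM.
pattern = re.compile(r"(?=([ACGT]{20}[ACGT]GG))")

candidate_sites = [m.start() for m in pattern.finditer(genome) if far_from_genes(m.start())]

print(f"{len(candidate_sites)} candidate sites; first few: {candidate_sites[:5]}")
```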

In their paper published in Nucleic Acids Research, the researchers demonstrated the versatility and scalability of CRISPR-COPIES by characterizing integration sites in three diverse species: Cupriavidus necator, Saccharomyces cerevisiae, and HEK 293T cells. They used integration sites found by CRISPR-COPIES to engineer cells with increased production of 5-aminolevulinic acid, a valuable biochemical that has applications in agriculture and the food industry.

In addition, the team has created a user-friendly web interface for CRISPR-COPIES, making the tool accessible to researchers without significant bioinformatics expertise.

A primary objective of CABBI is the engineering of non-model yeasts to produce chemicals and fuels from plant biomass. Economically producing biofuels and bioproducts from low-cost feedstocks at a large scale is a challenge, however, due to the lack of genetic tools and the cumbersome nature of traditional genome-editing methods. By enabling researchers to swiftly pinpoint genomic loci for targeted gene integration, CRISPR-COPIES provides a streamlined pipeline that facilitates the identification of stable integration sites across the genome. It also eliminates the manual labor involved in designing components for CRISPR/Cas-mediated DNA integration.

For crop engineering, the tool can be used to increase biomass yields, pest resistance, and/or environmental resilience. For converting biomass to valuable chemicals — for instance, by using the yeast S. cerevisiae — CRISPR-COPIES can be used to engineer cells with significantly greater yields.

This versatile software is designed to simplify and accelerate the strain construction process, saving researchers both time and resources. Researchers around the world in both academia and industry can benefit from its utility in strain engineering for biochemical production and transgenic crop development.

Co-authors on this study include ChBE Ph.D. student Zhixin Zhu, ChBE visiting student Pattarawan Intasian, and Bioengineering Ph.D. student Guanhua Xun; Carl R. Woese Institute for Genomic Biology (IGB) Software Developers Manan Jain and Vassily Petrov; IGB Biofoundry Manager Stephan Lane; and CABBI postdoc Shih-I Tan.

— Article by CABBI Communications Specialist April Wendling

 

Determining who gets blamed when cars hit pedestrians


Study finds urban environment plays major role in fault-finding

Peer-Reviewed Publication

OHIO STATE UNIVERSITY




COLUMBUS, Ohio – A new study examines the circumstances behind who is found at fault when cars hit pedestrians in an urban area.

 

Results showed that the environment where the crash took place – especially the types of roads and the amount of access to marked crosswalks – played a key role in whether the pedestrian or the driver was blamed for the collision.

 

In the study, done in Columbus, pedestrians were more likely to be blamed when they were crossing roads with a high volume of cars traveling at faster speeds, and where crosswalks were few and far between.

 

In areas of the city – such as downtown – where there were more marked intersections with pedestrian crossings, drivers were more likely to be found at fault.

 

“Our findings suggest that while the tendency is to blame individuals, the built environment where the crash occurs plays an important role,” said Jonathan Stiles, who led the study as a postdoctoral researcher in geography at The Ohio State University.

 

This was evident in areas of the city where there was an average of a quarter-mile between crosswalks with signals.

 

“In neighborhoods where there is less pedestrian infrastructure, pedestrians tend to be blamed more for crashes,” said co-author Harvey Miller, professor of geography at Ohio State.

 

“There needs to be more attention to road design and the built environment that contributes to crashes.”

 

The study was published recently in the Journal of Transport and Land Use.

 

Miller and Stiles, who is now a visiting assistant professor at Columbia University, analyzed five years of data on pedestrian crashes in Franklin County, Ohio, the home of Columbus.

 

There were 2,757 pedestrian crashes in the county between 2015 and 2019. In slightly more than half the crashes (54%), the driver was found at fault, and in 36%, the pedestrian was blamed. No fault was found in the remaining crashes.

 

As expected, the pedestrian being in the middle of a block or in a travel lane (outside of a crosswalk) increased the likelihood of them being found at fault.

 

Aspects of the built environment had a strong impact on who was found at fault.  Pedestrians were more likely to be found at fault on arterial roads – high-capacity thoroughfares that have heavier traffic and higher speed limits than local roads.

 

The presence of bus stops in the area also increased the odds of the pedestrian being found at fault, probably because most bus stops drop off pedestrians on busy arterial roads with fewer crosswalks, the researchers said, meaning many people may be crossing between crosswalks.

 

To get a better handle on how the built environment affected who was found at fault, Miller and Stiles did case studies of various neighborhoods in Columbus.

 

They found that downtown Columbus – which has pedestrian-friendly street design and infrastructure, with automatic pedestrian traffic controls at most intersections – was the area where drivers were most likely to be found at fault for crashes. Here, drivers were found at fault in 3 out of 4 crashes, and pedestrians were faulted only 17% of the time.

 

The situation was much different in the Hilltop and South Linden, both of which are lower-income neighborhoods in the city. Unlike in downtown, crosswalks with signals were much less common in these neighborhoods, making it more difficult for pedestrians to cross busy streets safely.

 

In the Hilltop, pedestrians and drivers were equally likely to be found at fault (46% of the crashes). The researchers described Broad Street, a five-lane arterial road, as “a daunting road to cross with or without a signal in Hilltop, and distances to signalized crossing areas can be several blocks.”

 

On one part of Sullivant Avenue in the Hilltop, there was an average of 418 meters between signalized intersections – one-fourth of a mile.

 

The situation for pedestrians was even more difficult in South Linden, a predominantly Black neighborhood northeast of downtown, where pedestrians were found at fault in 55% of crashes.

 

Here, the average distance between pedestrian crossings with signals was 429 meters in some locations.

 

Some busy areas were well above this average, including one that lacked a crossing for more than 640 meters – about 0.4 miles.  That stretch included a mix of residences, retail, day care, religious facilities, a library and bus stops.

 

“This makes it very difficult for pedestrians who may be carrying bags from a store to find a crosswalk to get across the street,” Stiles said.

 

“It makes it more understandable why pedestrians may try to cross a street between crosswalks.”

 

It is also understandable why police officers mark pedestrians at fault on official reports when people get hit while trying to cross in the middle of blocks, said Miller, who is director of the Center for Urban and Regional Analysis at Ohio State.

 

“Police officers are concerned about the safety of people, so they’re trying to do the right thing when they find pedestrians at fault in these situations,” Miller said.

 

“But what we’re seeing in this research is that the built environment is a key factor. People make errors in judgment, but no one deserves to die or get injured for such errors. And they would be less likely to make these choices if there were more pedestrian infrastructure,” he said.

 

One recommendation from the researchers is to redesign the crash forms completed by police to include information on the built environment around the crash site, such as distance to the nearest pedestrian crossing, to give more context on why pedestrians make certain choices.

 

The built environment for pedestrians isn’t just a problem in Columbus, according to the researchers. Many cities have similar issues. And the situation in Columbus is improving because of Vision Zero Columbus, a government effort focused on reducing crash-related fatalities and injuries in the city.

 

But this study shows the importance of a Safe System Approach to designing roadways to minimize the effects of human errors and allow pedestrians, as well as cars, to move safely through the city.

 

“We don’t have to design the streets the way we do. We can make fundamental design choices that could prioritize safety over the speed of traffic,” Miller said.

 

Successful employer-driven disability initiatives benefit individuals and companies


Research in the Journal of Vocational Rehabilitation pinpoints the elements impactful initiatives have in common and the positive effects on disabled and non-disabled employees and corporate climate

Peer-Reviewed Publication

IOS PRESS




Amsterdam, February 13, 2024 – Successful employer-driven disability initiatives share certain characteristics, even when the companies and programs differ in other ways. According to new research published in the Journal of Vocational Rehabilitation, companies whose leadership teams are committed to inclusion for people with disabilities, and are guided by complementary values deeply embedded in their organizations, experienced improved performance, positive employee perceptions, and a cohesive, unified culture. Outcomes varied with how faithfully and prominently companies implemented their disability-inclusive actions and practices.

Disability is commonplace in society but not in the US labor market. According to the Centers for Disease Control and Prevention and the US Department of Labor, the workplace participation rate of people with disabilities in 2023 was nearly half that of people without disabilities.

“Employer-driven disability initiatives can indeed have a positive impact on both company performance and organizational climate. Our study identified the motivations and best practices that led to successful outcomes: Improved business and financial results, better morale, and unity within the company,” explained lead investigator Brian N. Phillips, PhD, CRC, Department of Special Education and Rehabilitation Counseling, Utah State University. “These best practices should serve as a beacon to other organizations to actively recognize disability as a valued part of company diversity.”

The findings suggest that employers can gain a great deal from seeing people with disabilities as a valuable and largely untapped part of the workforce. The study combined findings from case studies conducted across seven companies of varying size and industry. Recognizing the limited amount of research on the topic, the investigators sought a better understanding of employer-driven disability initiatives, what motivated them, and what impact they had on each company’s productivity and overall success. The greatest success seemed to come to companies that took the most decisive action in implementing their initiatives.

“It is powerful to appreciate how much inclusion in the workplace can be a competitive advantage. At the same time, it’s important to point out that companies adopting a disability initiative on more of a trial basis or with more skepticism generated less favorable results,” noted co-investigator Timothy N. Tansey, PhD, University of Wisconsin-Madison. “Many employers prioritize diversity but underinvest in providing support for people with disabilities.”

Co-investigator Paul Wehman, PhD, Virginia Commonwealth University and Editor-in-Chief of the Journal of Vocational Rehabilitation, noted, “The literature suggested we would find a relationship between employer disability initiatives and company performance, and that commitment from leadership would be influential. However, we were surprised to learn that the disability initiatives had such a positive impact on non-disabled co-workers. Many employees with whom we spoke reported feeling a new sense of pride in the company and a greater closeness to their co-workers.”

Dr. Phillips commented that companies in the study all shared a value of being employee-centered, acknowledging that employees didn’t exist solely to help the company, but that the company also existed to help and support employees through flexibility, training, upward mobility, and compensation.