Tuesday, May 14, 2024

 

New transit station in Japan significantly reduced cumulative health expenditures


Time series data and causal impact algorithm reveal the effectiveness of a new transit station over a four-year period



OSAKA METROPOLITAN UNIVERSITY

Graphical abstract: Health Expenditure Impact of Opening a New Public Transport Station: A Natural Experiment of JR-Sojiji Station in Japan 

Image: The new public transport station was significantly associated with a decrease in average health expenditures per capita of approximately JPY 99,257 over four years.

Credit: Haruka Kato, Osaka Metropolitan University (CC BY 4.0, https://creativecommons.org/licenses/by/4.0/)




Osaka's declining population reflects an aging society, which in turn is driving up health expenditures. Dr. Haruka Kato, a junior associate professor at Osaka Metropolitan University, teamed up with the Future Co-creation Laboratory at Japan System Techniques Co., Ltd. to conduct a natural experiment on how a new train station might affect healthcare expenditures.

JR-Sojiji Station opened in March 2018 in a suburban city on the West Japan Railway line connecting Osaka and Kyoto. The researchers used a causal impact algorithm to analyze the medical expenditure data gathered from the time series medical dataset REZULT provided by Japan System Techniques.

Their results indicate that opening this mass transit station was significantly associated with a decrease in average healthcare expenditures per capita of approximately 99,257.31 Japanese yen (USD 929.99) over four years, with US dollar figures based on March 2018 exchange rates. The 95% confidence interval for the four-year decrease ranged from JPY 62,119.02 (USD 582.02) to JPY 136,194.37 (USD 1,276.06). These findings are consistent with previous studies suggesting that increased access to transit might increase physical activity among transit users, and they provide evidence for the effectiveness of opening a mass transit station from the viewpoint of health expenditures.
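
The "causal impact" approach referenced here typically fits a model to the pre-opening expenditure series (often with a comparison series as a covariate), forecasts the counterfactual after the intervention, and reports the gap between forecast and observation as the estimated effect. The sketch below illustrates that workflow with the open-source pycausalimpact package and simulated monthly data; the column names, periods, and figures are hypothetical and this is not the study's actual code or dataset.

```python
# Illustrative sketch only: a CausalImpact-style analysis of per-capita expenditure
# around a March 2018 intervention. All data here are simulated, not the REZULT dataset.
import numpy as np
import pandas as pd
from causalimpact import CausalImpact  # pip install pycausalimpact

rng = np.random.default_rng(0)
dates = pd.date_range("2014-04-01", "2022-03-01", freq="MS")
control = 100 + np.cumsum(rng.normal(0, 1, len(dates)))   # comparison-area expenditure
treated = 1.2 * control + rng.normal(0, 2, len(dates))    # station-area expenditure
treated[dates >= "2018-03-01"] -= 8                       # simulated post-opening drop

data = pd.DataFrame({"y": treated, "x": control}, index=dates)
pre_period = ["2014-04-01", "2018-02-01"]    # before the station opened
post_period = ["2018-03-01", "2022-03-01"]   # four years after opening

ci = CausalImpact(data, pre_period, post_period)
print(ci.summary())   # average and cumulative effect with a 95% interval
ci.plot()             # observed series vs. counterfactual prediction
```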

“From the perspective of evidence-based policymaking, there is a need to assess the social impact of urban designs,” said Dr. Kato. “Our findings are an important achievement because they enable us to assess this impact from the perspective of health care expenditures, as in the case of JR-Sojiji Station.”

The findings were published in the Journal of Transport & Health.

###

About OMU
Established in Osaka as one of the largest public universities in Japan, Osaka Metropolitan University is committed to shaping the future of society through the “Convergence of Knowledge” and the promotion of world-class research. For more research news, visit https://www.omu.ac.jp/en/ and follow us on social media: X, Facebook, Instagram, LinkedIn.

 

High genetic diversity discovered in South African leopards



UNIVERSITY OF ADELAIDE
Image: Declan Morris with a leopard.

Credit: University of Adelaide




Researchers say the discovery of very high genetic diversity in leopards in the Highveld region of South Africa underscores the need for conservation efforts to protect leopards in the country.

Declan Morris, a PhD candidate with the University of Adelaide’s School of Animal and Veterinary Sciences, led the research project, which discovered that the two maternal lineages of leopards found in Africa overlap in the Highveld, leading to the high genetic diversity.

One lineage can be found across most of the African continent, while the other is confined mostly to the Western Cape, Eastern Cape, KwaZulu-Natal and Mpumalanga regions of South Africa.

“We compiled the most comprehensive mitochondrial DNA (mtDNA) data set to date to explore trends in leopard genetics on a continental scale,” says Morris.

“The results of our analysis, using a combination of mtDNA, microsatellites, and comparisons with results of other published studies, are what enabled us to determine that the leopard population in the Highveld of Mpumalanga had the highest levels of genetic diversity in the country.”

Genetic diversity is important for a species’ long-term survival.

“High genetic diversity increases the ability for a species to adapt to a changing environment around it; therefore, it can make species more resilient to events such as climate change or the introduction of new diseases,” says Morris.

“The discovery that the leopards in the Highveld have the highest recorded levels of genetic diversity in South Africa is significant as it places a high conservation priority for the population in the region.”

It is likely the two lineages of leopards diverged between 960,000–440,000 years ago due to the aridification of the Limpopo basin between 1,000,000–600,000 years ago. Both leopard lineages are now commingling in Mpumalanga Province, where Morris’ PhD work was conducted.

“We had originally hypothesised that the Highveld leopards would be isolated as they exist in a highly fragmented region, but this discovery shows us that it’s not as isolated as we thought,” Morris says.

“Gene flow is occurring with Lowveld areas and Kruger National Park. We found an unexpected level of connectivity, even across landscapes highly modified by humans.”

Morris, whose research team included the University of Adelaide’s Dr Todd McWhorter and Associate Professor Wayne Boardman, and collaborators from the University of Pretoria and the University of Venda, hopes this discovery will place a higher importance on the conservation of leopard populations in South Africa.

“This information will hopefully help change attitudes towards the management of leopards and be used to inform management decisions – such as choosing translocation instead of issuing destruction permits for problem-causing animals,” he says.

“One of the biggest measures that could protect leopards in the Highveld is community engagement. Building better, stronger relationships between the community, government, researchers, and conservation organisations allows for efficient, targeted management programs to be designed.”

This discovery was published in the journal PeerJ and builds upon another recent leopard study published by the research team.

 

Eco-friendly and affordable battery for low-income countries



LINKÖPING UNIVERSITY
Image: The new zinc-lignin battery is stable, as it can be used over 8000 cycles while maintaining about 80% of its performance. The battery developed by the researchers is small, but the technology is scalable.

Credit: Thor Balkhed





Researchers at Linköping University, Sweden, have developed a battery made from zinc and lignin that can be used over 8000 times, with a vision of providing a cheap and sustainable battery solution for countries where access to electricity is limited. The study has been published in the journal Energy & Environmental Materials.

“Solar panels have become relatively inexpensive, and many people in low-income countries have adopted them. However, near the equator, the sun sets at around 6 PM, leaving households and businesses without electricity. The hope is that this battery technology, even with lower performance than the expensive Li-ion batteries, will eventually offer a solution for these situations,” says Reverant Crispin, professor of organic electronics at Linköping University.

His research group at the Laboratory of Organic Electronics, together with researchers at Karlstad University and Chalmers, has developed a battery that is based on zinc and lignin, two cost-effective and environmentally friendly materials. In terms of energy density, it is comparable to lead-acid batteries but without the lead, which is toxic.

The battery is stable, as it can be used over 8000 cycles while maintaining about 80% of its performance. Additionally, the battery retains its charge for approximately one week, significantly longer than other similar zinc-based batteries that discharge in just a few hours.

Although zinc-based batteries are already on the market, primarily as non-rechargeable batteries, rechargeable versions are predicted to complement, and in some cases replace, lithium-ion batteries in the long run.

“While lithium-ion batteries are useful when handled correctly, they can be explosive, challenging to recycle, and problematic in terms of environmental and human rights issues when specific elements like cobalt are extracted. Therefore, our sustainable battery offers a promising alternative where energy density is not critical,” says Ziyauddin Khan, a researcher at the Laboratory of Organic Electronics at LiU.

The issue with zinc batteries has primarily been poor durability due to zinc reacting with the water in the battery’s electrolyte solution. This reaction leads to the generation of hydrogen gas and dendritic growth of the zinc, rendering the battery essentially unusable.

To stabilise the zinc, a substance called a potassium polyacrylate-based water-in-polymer salt electrolyte (WiPSE) is used. What the researchers at Linköping have now demonstrated is that when WiPSE is used in a battery containing zinc and lignin, stability is very high.

“Both zinc and lignin are super cheap, and the battery is easily recyclable. And if you calculate the cost per usage cycle, it becomes an extremely cheap battery compared to lithium-ion batteries,” says Ziyauddin Khan.

Currently, the batteries developed in the lab are small. However, the researchers believe that they can create large batteries, roughly the size of a car battery, thanks to the abundance of both lignin and zinc at low cost. However, mass production would require the involvement of a company.

Reverant Crispin asserts that Sweden’s position as an innovative country enables it to assist other nations in adopting more sustainable alternatives.

“We can view it as our duty to help low-income countries avoid making the same mistakes we did. When they build their infrastructure, they need to start with green technology right away. If unsustainable technology is introduced, it will be used by billions of people, leading to a climate catastrophe,” says Reverant Crispin.

 

Concussion, CTE experts warn term used to describe head impacts – “subconcussion” – is misleading and dangerous



Researchers from Spaulding Rehabilitation, Boston University, Mayo Clinic, and the Concussion Legacy Foundation agree: retire the term “subconcussion” in favor of more accurate alternatives



MASS GENERAL BRIGHAM






BOSTON (May 14, 2024) – A new editorial published this May in the British Journal of Sports Medicine by experts from Spaulding Rehabilitation, Boston University, Mayo Clinic, and the Concussion Legacy Foundation argues that the term “subconcussion” is a dangerous misnomer that should be retired. The authors are appealing to the medical community and media to replace the term with more specific alternatives so the public can better understand the risks of brain injuries and advance effective efforts to prevent chronic traumatic encephalopathy (CTE).

“The public has been led to believe through media coverage and movies that concussions alone cause CTE,” said senior author Dan Daneshvar, MD, PhD, chief of Brain Injury Rehabilitation at Spaulding Rehabilitation, a member of the Mass General Brigham healthcare system, and assistant professor, Harvard Medical School. “But the research is clear: concussions do not predict CTE status, and the hits that cause concussions are often not the hardest ones, making ‘subconcussive’ misleading when describing impacts.”

The authors believe part of the confusion results from the fact that head impacts that don’t cause a concussion are referred to as “subconcussive impacts,” implying they are less severe than concussions. Scientists often say that CTE is caused by “small, repetitive impacts,” which leaves out the effect of any “large repetitive impacts.”

Ross Zafonte, DO, president of Spaulding Rehabilitation and chair of the Harvard Medical School Department of Physical Medicine and Rehabilitation, served as a co-author.

Previous studies report a high incidence of large repetitive impacts during football. Published helmet sensor studies show that around 10 percent of head impacts experienced by football players are harder than the average concussion. That means that if a football player gets one concussion during a season with 1,000 head impacts, around 100 hits were harder than that one concussion. One study showed that for every concussion a college football player experiences, they experience 340 head impacts of greater force.
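
The arithmetic behind that claim uses only the figures quoted above; a one-line check:

```python
# Back-of-the-envelope check using only the figures quoted in the paragraph above.
impacts_per_season = 1000            # head impacts in one season
harder_than_avg_concussion = 0.10    # ~10% of impacts exceed the average concussion force
print(impacts_per_season * harder_than_avg_concussion)  # -> 100.0 hits harder than that one concussion
```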

The authors of the editorial recommend replacing “subconcussive” with “nonconcussive” to better describe head impacts that don’t result in a concussion.

“We’ve always known CTE is caused by head impacts, but until we did this analysis, I didn’t realize I absorbed hundreds of extreme head impacts for every concussion when I played football,” said Chris Nowinski, PhD, lead author, co-founder and CEO of the Concussion Legacy Foundation, and former Harvard football player. “Using the term subconcussive naturally led me to imagine smaller hits, but now I suspect these frequent larger hits are playing a more significant role in causing CTE than we previously believed.”

The editorial also highlights how the term subconcussive has not only confused the discussion around head impacts, but also around traumatic brain injuries. Studies consistently show that athletes exposed to hundreds of repetitive head impacts, in the absence of a concussion, still have changes to brain function, blood biomarkers of brain injury, and structural changes on imaging that look similar to changes in athletes with diagnosed concussions. The concept of subconcussive injury has been shoehorned into the conversation to explain this “missing link.”

The authors suggest we stop using “subconcussive injury,” noting the missing link is better described as subclinical traumatic brain injury (TBI). Subclinical TBI happens when there are changes in brain function, biomarkers, or imaging without TBI signs or symptoms.

“The human brain has more than 80 billion neurons, and we can be confident an athlete cannot feel it when only one is injured,” said neurosurgeon Robert Cantu, MD, clinical professor of neurology, Boston University School of Medicine, and diagnostics and therapeutics leader, Boston University ARDC-CTE Center. “Athletes, military veterans, and members of the community frequently suffer subclinical traumatic brain injuries, and we suggest retiring subconcussion, a poorly defined term, when referring to brain injuries.”

By changing this nomenclature, the authors hope to clarify why concussions do not predict who has CTE, whereas the number and strength of repeated head impacts does. They implore the medical community and media to properly name the impacts and injuries that can’t be seen, which can advance the conversation to accelerate CTE prevention efforts, such as the CTE Prevention Protocol.

About Spaulding Rehabilitation

A member of the Mass General Brigham Health System, Spaulding Rehabilitation includes Spaulding Rehabilitation Hospital, with a main campus in Charlestown that is ranked 2nd in the nation for rehabilitation by U.S. News & World Report, along with Spaulding Rehabilitation Hospital Cape Cod, Spaulding Rehabilitation Hospital Cambridge, Spaulding Rehabilitation Nursing and Therapy Center Brighton, and over 25 outpatient sites throughout Eastern Massachusetts. An acclaimed teaching hospital of Harvard Medical School and home to the Department of Physical Medicine and Rehabilitation, Spaulding is recognized as a top residency program in the U.S. in the Doximity Residency Navigator. Spaulding was also recognized by the 2023 Disability Equality Index as a “Best Places to Work for Disability Inclusion.” For more information, visit www.spauldingrehab.org.

 

WAIT, WHAT?!

Study reveals patients with brain injuries who died after withdrawal of life support may have recovered


Findings support a more cautious approach to making early decisions on withdrawal of life support following traumatic brain injuries



MASS GENERAL BRIGHAM




BOSTON - (May 13, 2024) Severe traumatic brain injury (TBI) is a major cause of hospitalizations and deaths around the world, affecting more than five million people each year. Predicting outcomes following a brain injury can be challenging, yet families are asked to make decisions about continuing or withdrawing life-sustaining treatment within days of injury.

In a new study, Mass General Brigham investigators analyzed potential clinical outcomes for TBI patients enrolled in the Transforming Research and Clinical Knowledge in TBI (TRACK-TBI) study for whom life support was withdrawn. The investigators found that some patients for whom life support was withdrawn may have survived and recovered some level of independence a few months after injury. These findings suggest that delaying decisions on withdrawing life support might be beneficial for some patients.

Families are often asked to make decisions to withdraw life support measures, such as mechanical breathing, within 72 hours of a brain injury. Information relayed by physicians suggesting a poor neurologic prognosis is the most common reason families opt for withdrawing life support measures. However, there are currently no medical guidelines or precise algorithms that determine which patients with severe TBI are likely to recover.  

Using data collected over a 7.5-year period on 1,392 TBI patients in intensive care units at 18 United States trauma centers, the researchers created a mathematical model to calculate the likelihood of withdrawal of life-sustaining treatment, based on properties like demographics, socioeconomic factors and injury characteristics. Then, they paired individuals for whom life-sustaining treatment was not withdrawn (WLST-) to individuals with similar model scores, but for whom life-sustaining treatment was withdrawn (WLST+).
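
The matching step described above resembles propensity-score matching: model the probability of withdrawal from baseline characteristics, then pair each WLST+ patient with a WLST- patient who had a similar score. The sketch below illustrates that idea with simulated data, invented covariate names, and a simplistic one-to-one nearest-neighbour match; it is not the TRACK-TBI model or code.

```python
# Hypothetical sketch of score-based matching between WLST+ and WLST- patients.
# Simulated data and simplified matching; not the study's actual analysis.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1392
df = pd.DataFrame({
    "age": rng.normal(45, 18, n),
    "gcs_motor": rng.integers(1, 7, n),   # injury-severity proxy (hypothetical)
    "insured": rng.integers(0, 2, n),     # socioeconomic proxy (hypothetical)
})
logit = 0.04 * (df.age - 45) - 0.5 * (df.gcs_motor - 3) - 0.3 * df.insured
df["wlst"] = rng.random(n) < 1 / (1 + np.exp(-logit))

# Step 1: model the likelihood of withdrawal from baseline characteristics.
X = df[["age", "gcs_motor", "insured"]]
model = LogisticRegression().fit(X, df["wlst"])
df["score"] = model.predict_proba(X)[:, 1]

# Step 2: pair each WLST+ patient with the closest-scoring unused WLST- patient.
treated = df[df.wlst].sort_values("score")
controls = df[~df.wlst].sort_values("score")
pairs, used = [], set()
for i, row in treated.iterrows():
    candidates = controls[~controls.index.isin(used)]
    j = (candidates.score - row.score).abs().idxmin()
    pairs.append((i, j))
    used.add(j)

print(f"matched {len(pairs)} WLST+/WLST- pairs")
# Follow-up outcomes of the WLST- member of each pair are then used to estimate
# what might have happened to the matched WLST+ patient had support not been withdrawn.
```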

Based on follow-up of their WLST- paired counterparts, the estimated six-month outcomes for a substantial proportion of the WLST+ group were either death or recovery of at least some independence in daily activities. Among survivors, more than 40 percent of the WLST- group recovered at least some independence. In addition, the research team found that remaining in a vegetative state was an unlikely outcome by six months after injury. Importantly, none of the patients who died in this study were pronounced brain dead, and thus the results are not applicable to brain death.

According to the authors, the findings suggest there is a cyclical, self-fulfilling prophecy taking place: Clinicians assume patients will do poorly based on outcomes data. This assumption results in withdrawal of life support, which in turn increases poor outcomes rates and leads to even more decisions to withdraw life support.

The authors suggest that further studies involving larger sample sizes that allow for more precise matching of WLST+ and WLST- cohorts are needed to understand variable recovery trajectories for patients who sustain traumatic brain injuries.

“Our findings support a more cautious approach to making early decisions on withdrawal of life support,” said corresponding author Yelena Bodien, PhD, of the Department of Neurology’s Center for Neurotechnology and Neurorecovery at Massachusetts General Hospital and of the Spaulding-Harvard Traumatic Brain Injury Model Systems. “Traumatic brain injury is a chronic condition that requires long term follow-ups to understand patient outcomes. Delaying decisions regarding life support may be warranted to better identify patients whose condition may improve.”

Read more in the study, published May 13, in the Journal of Neurotrauma.

 

###

About Mass General Brigham

Mass General Brigham is an integrated academic health care system, uniting great minds to solve the hardest problems in medicine for our communities and the world. Mass General Brigham connects a full continuum of care across a system of academic medical centers, community and specialty hospitals, a health insurance plan, physician networks, community health centers, home care, and long-term care services. Mass General Brigham is a nonprofit organization committed to patient care, research, teaching, and service to the community. In addition, Mass General Brigham is one of the nation’s leading biomedical research organizations with several Harvard Medical School teaching hospitals. For more information, please visit massgeneralbrigham.org.

 

UK survey finds “disgust factor” needs to be overcome if eating insects is to become truly mainstream



EUROPEAN ASSOCIATION FOR THE STUDY OF OBESITY





UK survey examines consumer attitudes towards and willingness to consume insect-based foods.

Only 13% of respondents said they would be willing to regularly consume insects; younger respondents were less willing to give insects a try, as were those with higher sensitivity to food disgust.

*Please mention the European Congress on Obesity (ECO 2024, Venice, 12-15 May) if using this material*

New research being presented at this year’s European Congress on Obesity (ECO) in Venice, Italy (12-15 May), finds that insect-based foods remain unappealing in the UK and that more needs to be done to change attitudes towards, and willingness to consume, insects as a potential avenue for more sustainable food production that could reduce the carbon footprint of UK consumers.

Food production accounts for up to a quarter of all human greenhouse gas emissions. Livestock is a huge contributor to these emissions and researchers and policymakers are trying to develop and promote more sustainable ways to produce protein. One option gaining attention is farming and eating insects, such as crickets, flies, and worms, due to their potential nutritional and environmental advantages over other protein sources.

“Insects are a potentially rich source of protein and micro-nutrients and could help provide a solution to the double burden of obesity and undernutrition”, says lead author Dr Lauren McGale from Edge Hill University, UK. “Some insect proteins, such as ground crickets or freeze-dried mealworms, are cheaper and easier to farm, often lower in fat and have a lower environmental impact than traditional livestock.”

Despite these benefits, people in Western countries rarely eat insects, and many people are disgusted at the thought of insect-based food. Nevertheless, people are happy to eat lobster or crayfish despite their insect-like appearance, so it is possible attitudes could change.

To identify factors which may affect willingness to consume insects and to establish existing experience with insect-based food in the UK, researchers conducted an online survey of 603 UK adults (average age 34 years; 76% female) between 2019 and 2020, recruited using the Prolific recruitment platform—a large database of people from across the UK who have agreed to take part in research.

In the survey, participants were asked about their demographics (e.g., age, gender, ethnicity, and education level) and socioeconomic status as well as their level of concern about the environment.

Respondents were also asked to complete a Food Disgust Scale to measure how disgusting they find certain food-related situations, in order to determine their individual food disgust sensitivity. For example, participants were asked to rate their disgust at less commonly eaten parts of animals (such as organs, jaws, etc.), or their disgust response to food which had gone mouldy or had fallen on the floor.

They were also asked questions about anticipated taste/sensory perceptions, for example, how sweet, savoury, crunchy or slimy they anticipated insects to be in general, and their willingness to consume insects regularly.

The survey reveals that perceptions about insects’ taste or sensory properties were not generally favourable, with participants tending to rate them lower on visual or smell appeal, and anticipating lower levels of enjoyment, liking, or sweetness, and higher levels of savouriness, saltiness, and bitterness [1].

Overall, only 13% of respondents said they would be willing to regularly consume insects, compared to 47% who said they would not be willing, and 40% who responded maybe or that they were unsure.

Younger respondents were less open to consuming insects regularly, with each year younger being associated with a 2% increase in responding ‘no’ when asked if they would be willing to consume insects regularly.

Furthermore, as expected, levels of general food disgust predicted openness to consuming insects, with each point increase on the Food Disgust Scale predicting a 4% increase in saying ‘no’ to consuming insects.
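
Associations of this kind are typically estimated with a regression of the yes/no response on age, disgust score, and other covariates, with per-year and per-point changes in the probability of answering ‘no’ read off as marginal effects. The following is a hypothetical sketch of that approach with simulated data and invented coefficients; it is not the survey dataset or the authors' analysis.

```python
# Hypothetical illustration of how per-unit associations like those above are estimated.
# Simulated data; the variable names and effect sizes are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 603
df = pd.DataFrame({
    "age": rng.normal(34, 11, n),
    "disgust": rng.normal(3.0, 0.8, n),   # Food Disgust Scale score (simulated)
})
# Simulate 'no' responses that become more likely with higher disgust and lower age.
p_no = 1 / (1 + np.exp(-(-1.0 - 0.03 * (df.age - 34) + 0.9 * (df.disgust - 3.0))))
df["said_no"] = (rng.random(n) < p_no).astype(int)

model = smf.logit("said_no ~ age + disgust", data=df).fit(disp=0)
print(model.params)                    # log-odds change per year of age / per point of disgust
print(model.get_margeff().summary())   # average marginal effects: change in P(no) per unit
```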

Interestingly, disgust ratings were significantly higher for powdered insects than for whole insects. However, respondents' willingness to consume insects was also significantly higher for powdered insects than for whole insects despite higher levels of disgust.

“The disgust factor associated with eating whole insects could be overcome by incorporating insect flours into processed foods. This has been done successfully with rice products fortified with cricket or locust flours in other parts of the world,” says co-author Dr Maxine Sharps from De Montfort University, UK. “But if insects are to be a mainstream part of the Western diet, the disgust factor is one of the most important challenges to be overcome. After all, there may eventually be no choice, given climate change and projected global population growth.”

 

 

Nature's 3D printer: bristle worms form bristles piece by piece


Better understanding of this natural formation process offers potential for technical developments




UNIVERSITY OF VIENNA

Image: Larva of the marine annelid Platynereis dumerilii, scanning electron micrograph (scale bar: 100 µm).

Credit: Luis Zelaya-Lainez, Vienna University of Technology




A new interdisciplinary study led by molecular biologist Florian Raible from the Max Perutz Labs at the University of Vienna provides exciting insights into the bristles of the marine annelid worm Platynereis dumerilii. Specialized cells, so-called chaetoblasts, control the formation of the bristles. Their mode of operation is astonishingly similar to that of a technical 3D printer. The project is a collaboration with researchers from the University of Helsinki, Vienna University of Technology and Masaryk University in Brno. The study was recently published in the renowned journal Nature Communications. 

Chitin is the primary building material both for the exoskeleton of insects and for the bristles of bristle worms such as the marine annelid worm Platynereis dumerilii. However, the bristle worms have a somewhat softer chitin – the so-called beta chitin – which is particularly interesting for biomedical applications. The bristles allow the worms to move around in the water. How exactly the chitin is formed into distinct bristles has so far remained enigmatic. The new study now provides exciting insight into this special biogenesis. Florian Raible explains: "The process begins with the tip of the bristle, followed by the middle section and finally the base of the bristles. The finished parts are pushed further and further out of the body. In this development process, the important functional units are created one after the other, piece by piece, which is similar to 3D printing." 

A better understanding of processes such as these also holds potential for the development of future medical products or for the production of naturally degradable materials. Beta-chitin from the dorsal shell of squid, for example, is currently used as a raw material for the production of particularly well-tolerated wound dressings. "Perhaps in the future it will also be possible to use annelid cells to produce this material," says Raible. 
  
The biological background is as follows: so-called chaetoblasts play a central role in this process. Chaetoblasts are specialized cells with long surface structures, so-called microvilli. These microvilli harbor a specific enzyme that the researchers could show to be responsible for the formation of chitin, the material from which the bristles are ultimately made. The researchers' results show a dynamic cell surface characterized by geometrically arranged microvilli.

The individual microvilli have a similar function to the nozzles of a 3D printer. Florian Raible explains: "Our analysis suggests that the chitin is produced by the individual microvilli of the chaetoblast cell. The precise change in the number and shape of these microvilli over time is therefore the key to shaping the geometric structures of the individual bristles, such as individual teeth on the bristle tip, which are precise down to the sub-micrometer range." The bristles usually develop within just two days and can have different shapes; depending on the worm's stage of development, they are shorter or longer, more pointed or flatter.

In addition to the local collaboration with the Vienna University of Technology and imaging specialists from Masaryk University in Brno, the cooperation with the Jokitalo laboratory at the University of Helsinki proved to be a great benefit for the researchers at the University of Vienna. Using their expertise in serial block-face scanning electron microscopy (SBF-SEM), the researchers investigated the arrangement of microvilli in the bristle formation process and proposed a 3D model of bristle formation. First author Kyojiro Ikeda from the University of Vienna explains: "Standard electron tomography is very labor-intensive, as the cutting of the samples and their examination in the electron microscope must be done manually. With this approach, however, we can reliably automate the analysis of thousands of layers."

The Raible group is currently working on improving the resolution of the observation in order to reveal even more details about bristle biogenesis.

 

Cats purrfectly demonstrate what it takes to trust robots




UNIVERSITY OF NOTTINGHAM

Image: Cat with the robot arm in the Cat Royale installation.

Credit: Blast Theory - Stephen Daly




Would you trust a robot to look after your cat? New research suggests that it takes more than a carefully designed robot to care for your cat: the environment in which the robot operates is also vital, as is human interaction.

Cat Royale is a unique collaboration between computer scientists from the University of Nottingham and artists at Blast Theory, who worked together to create a multispecies world centred around a bespoke enclosure in which three cats and a robot arm coexist for six hours a day during a twelve-day installation as part of an artist-led project. The installation was launched in 2023 at the World Science Festival in Brisbane, Australia, and has been touring since; it has just won a Webby Award for its creative experience.

The research paper, “Designing Multispecies Worlds for Robots, Cats, and Humans,” has just been presented at the annual CHI Conference on Human Factors in Computing Systems (CHI ’24), where it won best paper. It outlines how designing the technology and its interactions is not sufficient; it is equally important to consider the design of the ‘world’ in which the technology operates. The research also highlights the necessity of human involvement in areas such as breakdown recovery, animal welfare, and humans’ role as audience.

Cat Royale centred around a robot arm offering activities to make the cats happier; these included dragging a ‘mouse’ toy along the floor, raising a feather ‘bird’ into the air, and even offering them treats to eat. The team then trained an AI to learn what games the cats liked best so that it could personalise their experiences.
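
The article does not describe the learning method, but personalising a small menu of activities from observed responses is naturally framed as a multi-armed bandit problem. Below is a hypothetical epsilon-greedy sketch with an invented enjoyment signal and game list; it is an illustration of the general idea, not the actual Cat Royale system.

```python
# Hypothetical epsilon-greedy bandit for choosing which game to offer a cat.
# The reward signal, game list, and preferences are invented for illustration only.
import random

GAMES = ["drag mouse toy", "raise feather bird", "offer treat"]
EPSILON = 0.1  # fraction of the time a random game is tried (exploration)

counts = {g: 0 for g in GAMES}
values = {g: 0.0 for g in GAMES}  # running mean of observed enjoyment per game

def choose_game() -> str:
    if random.random() < EPSILON or all(c == 0 for c in counts.values()):
        return random.choice(GAMES)            # explore
    return max(values, key=values.get)         # exploit the best game so far

def record_enjoyment(game: str, reward: float) -> None:
    counts[game] += 1
    values[game] += (reward - values[game]) / counts[game]  # incremental mean

# Simulated sessions: pretend this cat secretly prefers the feather bird.
true_pref = {"drag mouse toy": 0.4, "raise feather bird": 0.8, "offer treat": 0.6}
for _ in range(200):
    g = choose_game()
    record_enjoyment(g, float(random.random() < true_pref[g]))
print(max(values, key=values.get))  # usually prints 'raise feather bird'
```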

“At first glance, the project is about designing a robot to enrich the lives of a family of cats by playing with them,” commented Professor Steve Benford from the University of Nottingham, who led the research. “Under the surface, however, it explores the question of what it takes to trust a robot to look after our loved ones and potentially ourselves.”

Working with Blast Theory to develop and then study Cat Royale, the research team gained important insights into the design of robots and their interactions with cats. They had to design the robot to pick up toys and deploy them in ways that excited the cats, while it learned which games each cat liked. They also designed the entire world in which the cats and the robot lived, providing safe spaces for the cats to observe the robot and from which to sneak up on it, and decorating it so that the robot had the best chance of spotting the approaching cats.

The implication is that designing robots involves interior design as well as engineering and AI. If you want to introduce robots into your home to look after your loved ones, then you will likely need to redesign your home.

Research workshops for Cat Royale were held at the University of Nottingham’s unique Cobotmaker Space, where stakeholders were brought together to think about the design of the robot and the welfare of the cats. Eike Schneiders, Transitional Assistant Professor in the Mixed Reality Lab at the University of Nottingham, worked on the design. He said: “As we learned through Cat Royale, creating a multispecies system—where cats, robots, and humans are all accounted for—takes more than just designing the robot. We had to ensure animal wellbeing at all times, while simultaneously ensuring that the interactive installation engaged the (human) audiences around the world. This involved consideration of many elements, including the design of the enclosure, the robot and its underlying systems, the various roles of the humans-in-the-loop, and, of course, the selection of the cats.”