Sunday, May 09, 2021

Penn study reveals how opioid supply shortages shape emergency department prescribing behaviors

Injectable opioid shortage in 2018 led to less opioid use overall, yet no significant change in opioid dosing, Penn Medicine research finds

UNIVERSITY OF PENNSYLVANIA SCHOOL OF MEDICINE

Research News

PHILADELPHIA-- When evaluating the opioid crisis, research has shown that external factors - such as the volume of pre-filled syringes, or a default number of opioid tablets that can easily be ordered at discharge - can shift prescribing and push emergency department (ED) physicians toward administering or prescribing greater quantities of opioids. A new study published in the Journal of Medical Toxicology shows that external factors, such as a supply shortage, can also reduce opioid prescribing.

Researchers led by the Perelman School of Medicine at the University of Pennsylvania evaluated pharmacy data from electronic medical records (EMR) collected before, during, and after a period of parenteral opioid shortage at two large urban academic emergency departments - the Hospital of the University of Pennsylvania and University Hospital in Newark, New Jersey. In this case, the shortage was of parenteral morphine and hydromorphone, the result of supply chain disruptions caused by Hurricane Maria, which struck Puerto Rico in 2017.

Researchers found that the percentage of patients who received an opioid among all ED visits during the 2018 shortage fell significantly from 11.5% pre-shortage to 8.5% during, and did not return to baseline once the shortage abated. Further, the total number of oral or IV opioid doses administered during the shortage also decreased and remained lower than pre-shortage levels once supply chains were restored. However, the study also found that while fewer opioid doses were administered to fewer patients, there was no change in net Morphine Milligram Equivalent (MME) per patient receiving opioids.
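The net-MME comparison above can be sketched as a simple calculation. The conversion factors and doses below are illustrative approximations only (the study's actual conversion table is not given here) and are not clinical guidance:

```python
# Illustrative sketch of computing morphine milligram equivalents (MME)
# per patient. The conversion factors below are rough approximations in the
# spirit of published equianalgesic tables, NOT clinical guidance, and were
# not taken from the study itself.

# MME per mg for a few opioids (oral morphine = 1.0 by definition)
MME_FACTORS = {
    "morphine_oral": 1.0,
    "morphine_iv": 3.0,        # parenteral morphine ~3x oral potency (assumed)
    "hydromorphone_iv": 20.0,  # assumed for illustration
    "oxycodone_oral": 1.5,
}

def total_mme(doses):
    """Sum MME over a list of (drug, dose_mg) administrations."""
    return sum(MME_FACTORS[drug] * mg for drug, mg in doses)

patient_doses = [("morphine_iv", 4), ("oxycodone_oral", 10)]
print(total_mme(patient_doses))  # 4*3.0 + 10*1.5 = 27.0
```

With per-patient totals like this, "fewer patients received opioids, but MME per treated patient was unchanged" is simply a statement about the count of patients with nonzero totals versus the average of those totals.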

"Although the percentage of patients who received non-opioid analgesics did not rise during the shortage, it was significantly higher in the post-shortage period," said lead investigator Amanda Deutsch, MD, an Emergency Medicine resident at Penn. "This suggests that a subset of patients were transitioned to non-opioids, and this prescribing practice was a sustained change after the resolution of the shortage."

Because the shortage specifically affected parenteral morphine and hydromorphone, clinicians substituted parenteral fentanyl, and the total MME of opioids administered remained the same. The use of oral morphine also appeared to increase during the shortage period. These findings suggest that the shortage may have prompted a shift from IV to oral opioid formulations.

The researchers see the shift in prescribing patterns during the shortage as an opportunity for more research into how external factors can influence ED prescribing practices. "Changing clinician prescribing behavior is challenging," said author Jeanmarie Perrone, MD, a professor of Emergency Medicine at Penn. "This study shows encouraging data to support that there are environmental modifications that can nudge providers toward more judicious opioid use."

###

Penn Medicine is one of the world's leading academic medical centers, dedicated to the related missions of medical education, biomedical research, and excellence in patient care. Penn Medicine consists of the Raymond and Ruth Perelman School of Medicine at the University of Pennsylvania (founded in 1765 as the nation's first medical school) and the University of Pennsylvania Health System, which together form an $8.9 billion enterprise.

The Perelman School of Medicine has been ranked among the top medical schools in the United States for more than 20 years, according to U.S. News & World Report's survey of research-oriented medical schools. The School is consistently among the nation's top recipients of funding from the National Institutes of Health, with $496 million awarded in the 2020 fiscal year.

The University of Pennsylvania Health System's patient care facilities include: the Hospital of the University of Pennsylvania and Penn Presbyterian Medical Center, which are recognized as one of the nation's top "Honor Roll" hospitals by U.S. News & World Report; Chester County Hospital; Lancaster General Health; Penn Medicine Princeton Health; and Pennsylvania Hospital, the nation's first hospital, founded in 1751. Additional facilities and enterprises include Good Shepherd Penn Partners, Penn Medicine at Home, Lancaster Behavioral Health Hospital, and Princeton House Behavioral Health, among others.

Penn Medicine is powered by a talented and dedicated workforce of more than 44,000 people. The organization also has alliances with top community health systems across both Southeastern Pennsylvania and Southern New Jersey, creating more options for patients no matter where they live.

Penn Medicine is committed to improving lives and health through a variety of community-based programs and activities. In fiscal year 2020, Penn Medicine provided more than $563 million to benefit our community.

 

Supernovae twins open up new possibilities for precision cosmology

Findings will enhance dark energy experiments at major telescopes

DOE/LAWRENCE BERKELEY NATIONAL LABORATORY

Cosmologists have found a way to double the accuracy of measuring distances to supernova explosions - one of their tried-and-true tools for studying the mysterious dark energy that is making the universe expand faster and faster. The results from the Nearby Supernova Factory (SNfactory) collaboration, led by Greg Aldering of the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab), will enable scientists to study dark energy with greatly improved precision and accuracy, and provide a powerful crosscheck of the technique across vast distances and time. The findings will also be central to major upcoming cosmology experiments that will use new ground and space telescopes to test alternative explanations of dark energy.

Two papers published in The Astrophysical Journal report these findings, with Kyle Boone as lead author. Currently a postdoctoral fellow at the University of Washington, Boone is a former graduate student of Nobel Laureate Saul Perlmutter, the Berkeley Lab senior scientist and UC Berkeley professor who led one of the teams that originally discovered dark energy. Perlmutter was also a co-author on both studies.

Supernovae were used in 1998 to make the startling discovery that the expansion of the universe is speeding up, rather than slowing down as had been expected. This acceleration - attributed to the dark energy that makes up two-thirds of all the energy in the universe - has since been confirmed by a variety of independent techniques as well as with more detailed studies of supernovae.

The discovery of dark energy relied on using a particular class of supernovae, Type Ia. These supernovae always explode with nearly the same intrinsic maximum brightness. Because the observed maximum brightness of the supernova is used to infer its distance, the small remaining variations in the intrinsic maximum brightness limited the precision with which dark energy could be tested. Despite 20 years of improvements by many groups, supernovae studies of dark energy have until now remained limited by these variations.

Quadrupling the number of supernovae

The new results announced by the SNfactory come from a multi-year study devoted entirely to increasing the precision of cosmological measurements made with supernovae. Measurement of dark energy requires comparisons of the maximum brightnesses of distant supernovae billions of light-years away with those of nearby supernovae "only" 300 million light-years away. The team studied hundreds of such nearby supernovae in exquisite detail. Each supernova was measured a number of times, at intervals of a few days. Each measurement examined the spectrum of the supernova, recording its intensity across the wavelength range of visible light. An instrument custom-made for this investigation, the SuperNova Integral Field Spectrometer, installed at the University of Hawaii 2.2-meter telescope at Maunakea, was used to measure the spectra.

"We've long had this idea that if the physics of the explosion of two supernovae were the same, their maximum brightnesses would be the same. Using the Nearby Supernova Factory spectra as a kind of CAT scan through the supernova explosion, we could test this idea," said Perlmutter.

Indeed, several years ago, physicist Hannah Fakhouri, then a graduate student working with Perlmutter, made a discovery key to today's results. Looking at a multitude of spectra taken by the SNfactory, she found that in quite a number of instances, the spectra from two different supernovae looked very nearly identical. Among the 50 or so supernovae, some were virtually identical twins. When the wiggly spectra of a pair of twins were superimposed, to the eye there was just a single track. The current analysis builds on this observation to model the behavior of supernovae in the period near the time of their maximum brightness.

The new work nearly quadruples the number of supernovae used in the analysis. This made the sample large enough to apply machine-learning techniques to identify these twins, leading to the discovery that Type Ia supernova spectra vary in only three ways. The intrinsic brightnesses of the supernovae also depend primarily on these three observed differences, making it possible to measure supernova distances to the remarkable accuracy of about 3%.
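The idea of explaining brightness variation with a few spectral components can be illustrated with synthetic data. The sketch below uses PCA as a simple stand-in for the paper's actual method, and every number in it is invented for illustration:

```python
import numpy as np

# Toy sketch of the idea behind supernova standardization: model each
# spectrum as a mean spectrum plus a few components, then use the component
# coefficients to predict (and remove) brightness variation. PCA is a simple
# stand-in here for the paper's actual machine-learning method.

rng = np.random.default_rng(0)
n_sne, n_wavelengths = 200, 50

# Simulate spectra driven by 3 latent factors plus small noise
latent = rng.normal(size=(n_sne, 3))
components = rng.normal(size=(3, n_wavelengths))
spectra = latent @ components + 0.01 * rng.normal(size=(n_sne, n_wavelengths))

# In this toy, brightness offsets depend on the same 3 factors
true_coeffs = np.array([0.10, -0.05, 0.02])
brightness = latent @ true_coeffs + 0.01 * rng.normal(size=n_sne)

# PCA via SVD of the mean-subtracted spectra
centered = spectra - spectra.mean(axis=0)
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt[:3].T  # 3 coefficients per supernova

# Linear fit: predict brightness from the 3 spectral coefficients
fit, *_ = np.linalg.lstsq(scores, brightness, rcond=None)
residual = brightness - scores @ fit
print(residual.std() < brightness.std())  # standardization shrinks the scatter
```

The residual scatter after the fit plays the role of the remaining distance uncertainty; in the real analysis it corresponds to the roughly 3% distance accuracy quoted above.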

Just as important, this new method does not suffer from the biases that have beset previous methods, seen when comparing supernovae found in different types of galaxies. Since nearby galaxies are somewhat different from distant ones, there was a serious concern that such dependence would produce false readings in the dark energy measurement. Now this concern can be greatly reduced by measuring distant supernovae with this new technique.

In describing this work, Boone noted, "Conventional measurement of supernova distances uses light curves - images taken in several colors as a supernova brightens and fades. Instead, we used a spectrum of each supernova. These are so much more detailed, and with machine-learning techniques it then became possible to discern the complex behavior that was key to measuring more accurate distances."

The results from Boone's papers will benefit two upcoming major experiments. The first experiment will be at the 8.4-meter Rubin Observatory, under construction in Chile, with its Legacy Survey of Space and Time, a joint project of the Department of Energy and the National Science Foundation. The second is NASA's forthcoming Nancy Grace Roman Space Telescope. These telescopes will measure thousands of supernovae to further improve the measurement of dark energy. They will be able to compare their results with measurements made using complementary techniques.

Aldering, also a co-author on the papers, observed that "not only is this distance measurement technique more accurate, it only requires a single spectrum, taken when a supernova is brightest and thus easiest to observe - a game changer!" Having a variety of techniques is particularly valuable in this field where preconceptions have turned out to be wrong and the need for independent verification is high.

###

The SNfactory collaboration includes Berkeley Lab, the Laboratory for Nuclear Physics and High Energy at Sorbonne University, the Center for Astronomical Research of Lyon, the Institute of Physics of the 2 Infinities at the University Claude Bernard, Yale University, Germany's Humboldt University, the Max Planck Institute for Astrophysics, China's Tsinghua University, the Center for Particle Physics of Marseille, and Clermont Auvergne University.

This work was supported by the Department of Energy's Office of Science, NASA's Astrophysics Division, the Gordon and Betty Moore Foundation, the French National Institute of Nuclear and Particle Physics and the National Institute for Earth Sciences and Astronomy of the French National Centre for Scientific Research, the German Research Foundation and German Aerospace Center, the European Research Council, Tsinghua University, and the National Natural Science Foundation of China.

Additional background

In 1998, two competing groups studying supernovae, the Supernova Cosmology Project and the High-z Supernova Search team, both announced they had found evidence that, contrary to expectations, the expansion of the universe was not slowing but becoming faster and faster. Dark energy is the term used to describe the cause of the acceleration. The 2011 Nobel Prize was awarded to leaders of the two teams: Saul Perlmutter of Berkeley Lab and UC Berkeley, leader of the Supernova Cosmology Project, and to Brian Schmidt of the Australian National University and Adam Riess of Johns Hopkins University, from the High-z team.

Additional techniques for measuring dark energy include the DOE-supported Dark Energy Spectroscopic Instrument, led by Berkeley Lab, which will use spectroscopy on 30 million galaxies in a technique called baryon acoustic oscillation. The Rubin Observatory will also use another called weak gravitational lensing.

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory (https://www.lbl.gov/) and its scientists have been recognized with 14 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab's facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

-By Bob Cahn

 

Researchers develop artificial intelligence that can detect sarcasm in social media

UNIVERSITY OF CENTRAL FLORIDA

IMAGE: Dr. Garibay is investigating ways to make artificial intelligence smarter when it comes to detecting and appropriately responding to human emotions. (Credit: University of Central Florida)

Computer science researchers at the University of Central Florida have developed a sarcasm detector.

Social media has become a dominant form of communication for individuals, and for companies looking to market and sell their products and services. Properly understanding and responding to customer feedback on Twitter, Facebook and other social media platforms is critical for success, but it is incredibly labor intensive.

That's where sentiment analysis comes in. The term refers to the automated process of identifying the emotion - positive, negative or neutral - associated with text. While much of artificial intelligence focuses on logical data analysis, sentiment analysis is concerned with correctly identifying emotional communication. A UCF team developed a technique that accurately detects sarcasm in social media text.

The team's findings were recently published in the journal Entropy.

In effect, the team taught the computer model to find patterns that often indicate sarcasm, and combined that with teaching the program to correctly pick out cue words in sequences that are more likely to indicate sarcasm. They trained the model by feeding it large data sets and then checked its accuracy.

"The presence of sarcasm in text is the main hindrance in the performance of sentiment analysis," says Assistant Professor of engineering Ivan Garibay '00MS '04PhD. "Sarcasm isn't always easy to identify in conversation, so you can imagine it's pretty challenging for a computer program to do it and do it well. We developed an interpretable deep learning model using multi-head self-attention and gated recurrent units. The multi-head self-attention module aids in identifying crucial sarcastic cue-words from the input, and the recurrent units learn long-range dependencies between these cue-words to better classify the input text."
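The attention component Garibay describes can be illustrated with a toy sketch. The code below implements generic multi-head self-attention over word embeddings in plain NumPy; it is not the authors' model (the real system pairs attention with gated recurrent units and is trained on large labeled datasets), and the tokens, dimensions, and random projections are invented for illustration:

```python
import numpy as np

# Minimal numpy sketch of multi-head self-attention over word embeddings.
# The attention weights can be inspected to see which words a head attends
# to, which is the "cue-word" interpretability idea described in the text.

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, n_heads, rng):
    """X: (seq_len, d_model). Returns concatenated output and per-head weights."""
    seq_len, d_model = X.shape
    d_head = d_model // n_heads
    outputs, weights = [], []
    for _ in range(n_heads):
        # Random projections stand in for learned weight matrices
        Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        A = softmax(Q @ K.T / np.sqrt(d_head))  # (seq_len, seq_len) attention
        outputs.append(A @ V)
        weights.append(A)
    return np.concatenate(outputs, axis=-1), weights

rng = np.random.default_rng(42)
tokens = ["oh", "great", "another", "monday"]   # hypothetical sarcastic input
X = rng.normal(size=(len(tokens), 8))           # toy 8-dim word embeddings
out, attn = multi_head_self_attention(X, n_heads=2, rng=rng)
print(out.shape)             # (4, 8)
print(attn[0].sum(axis=-1))  # each attention row sums to 1
```

In a trained model, rows of the attention matrices would concentrate on sarcasm cue words, and the recurrent units would then model dependencies between those cues across the sentence.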

The team, which includes computer science doctoral student Ramya Akula, began working on this problem under a DARPA grant that supports the organization's Computational Simulation of Online Social Behavior program.

"Sarcasm has been a major hurdle to increasing the accuracy of sentiment analysis, especially on social media, since sarcasm relies heavily on vocal tones, facial expressions and gestures that cannot be represented in text," says Brian Kettler, a program manager in DARPA's Information Innovation Office (I2O). "Recognizing sarcasm in textual online communication is no easy task as none of these cues are readily available."

This is one of the challenges Garibay's Complex Adaptive Systems Lab (CASL) is studying. CASL is an interdisciplinary research group dedicated to the study of complex phenomena such as the global economy, the global information environment, innovation ecosystems, sustainability, and social and cultural dynamics and evolution. CASL scientists study these problems using data science, network science, complexity science, cognitive science, machine learning, deep learning, social sciences, and team cognition, among other approaches.

"In face-to-face conversation, sarcasm can be identified effortlessly using facial expressions, gestures, and tone of the speaker," Akula says. "Detecting sarcasm in textual communication is not a trivial task as none of these cues are readily available. Especially with the explosion of internet usage, sarcasm detection in online communications from social networking platforms is much more challenging."

###

Garibay is an assistant professor in Industrial Engineering and Management Systems. He has several degrees including a Ph.D. in computer science from UCF. Garibay is the director of UCF's Artificial Intelligence and Big Data Initiative of CASL and of the master's program in data analytics. His research areas include complex systems, agent-based models, information and misinformation dynamics on social media, artificial intelligence and machine learning. He has more than 75 peer-reviewed papers and more than $9.5 million in funding from various national agencies.

Akula is a doctoral scholar and graduate research assistant at CASL. She has a master's degree in computer science from Technical University of Kaiserslautern in Germany and a bachelor's degree in computer science engineering from Jawaharlal Nehru Technological University, India.

 

Having a ball: New English Premier League soccer ball more stable, drags more

Researchers at the University of Tsukuba compare a new English Premier League soccer ball with previous versions in wind-tunnel experiments, and find increased drag and stability, which may lead to a better understanding of aerodynamics in sports

UNIVERSITY OF TSUKUBA

IMAGE: University of Tsukuba researchers tested a new Nike soccer ball used in the English Premier League with a wind tunnel. They found that its aerodynamic properties make it more stable... (Credit: University of Tsukuba)

Tsukuba, Japan - Scientists from the Faculty of Health and Sports Sciences at the University of Tsukuba used aerodynamics experiments to empirically test the flight properties of a new four-panel soccer ball adopted by the English Premier League this year. Based on projectile and wind-tunnel data, they computed the drag and side forces and found that the new ball was marginally more stable than previous versions but may not fly as far. This work may help improve the design of future sports equipment.

Sports players know that millions of dollars in salary and potential endorsement deals can be at stake during each match. Soccer players often complain about the aerodynamic properties of the ball because a random flutter in flight can turn a harmless shot into a goal. Old-school soccer balls have 32 panels, with a mix of hexagons and pentagons. More recently, top soccer leagues have experimented with 6-panel versions with strips similar to a volleyball. For the new season, the English Premier League has introduced the Flight 2020 Soccer Ball by Nike, which is advertised as having molded grooves that provide consistent flight.

Now, researchers at the University of Tsukuba have tested these claims with wind tunnel experiments. They measured the drag coefficient for the ball, along with two previous models, as a function of the Reynolds number. The Reynolds number, an important parameter in fluid dynamics, controls the transition from smooth to turbulent flow. According to author Professor Takeshi Asai, "at low Reynolds numbers, smooth flow occurs, because viscosity can damp out turbulence. At high Reynolds numbers, chaotic air vortices can lead to unstable and unpredictable flight patterns."
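As a rough illustration of the quantity Asai describes, the Reynolds number for flow past a ball is Re = vL/ν. The speed, diameter, and viscosity values below are assumptions for illustration, not figures from the study:

```python
# Rough Reynolds-number estimate for a soccer ball in flight. The values
# below are illustrative assumptions, not data from the study.

def reynolds_number(speed, diameter, kinematic_viscosity):
    """Re = v * L / nu for flow past an object of characteristic length L."""
    return speed * diameter / kinematic_viscosity

v = 25.0     # ball speed in m/s (a hard shot; assumed)
d = 0.22     # approximate regulation ball diameter in m
nu = 1.5e-5  # kinematic viscosity of air in m^2/s (near 20 C)

print(f"Re = {reynolds_number(v, d, nu):.2e}")  # ~3.7e5
```

An estimate in the hundreds of thousands places a driven ball in the regime where the smooth-to-turbulent transition, and hence the drag behavior measured in the wind tunnel, matters most.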

The team found increased drag at high Reynolds numbers for the new ball. This led to reduced flight range but may have also reduced lateral forces that can destabilize the trajectory. This was especially true in the "asymmetric" orientation of the ball, when one of the grooves was facing forward. "The smaller fluctuations in the side and lift forces of the Flight 2020 indicate that it is less likely to experience irregular changes in trajectory, thereby possibly leading to greater stability during flight," says Professor Asai.

The team attributed this tradeoff, greater stability at the expense of range, in part to increased surface roughness. This finding may be useful for designing other sports equipment to increase the importance of skill and reduce the impact of luck.

###

The work is published in Scientific Reports as "Aerodynamics of the newly approved football for the English Premier League 2020-21 season" (DOI: 10.1038/s41598-021-89162-y).

 

Head to toe: study reveals brain activity behind missed penalty kicks

Study finds that activation of task-relevant brain regions, including those involved in movement, underlies successful kicks, while overthinking may be a factor in missed kicks

FRONTIERS

Are penalty shots a soccer player's dream or nightmare? What should be an easy shot can become a mammoth task when the hopes and fears of an entire nation rest on a player's shoulders, leading them to choke under pressure. Understanding the brain activity behind choking is the driving force behind a new study in open-access journal Frontiers in Computer Science. The study is the first to measure brain activity during penalty shots in a soccer pitch environment. It finds that people who choked activated areas of the brain involved in long-term thinking, suggesting that they were overthinking the consequences of missing the shot.

Penalty shootouts hold a special fear for many soccer fans. For instance, the English team has suffered some infamous misses over the years and their record at the penalty spot is fairly bleak. Remember Beckham's slip at the penalty spot against Portugal at Euro 2004? He certainly does. Penalties can go either way, which makes them exciting to watch, but they raise some questions about performance under pressure.

"How can it be that football players with a near perfect control over the ball (they can very precisely kick a ball over more than 50 meters) fail to score a penalty kick from only 11 meters?" asks coauthor Max Slutter, an MSc student at the faculty of electrical engineering, mathematics and computer science at the University of Twente, the Netherlands. "Obviously, huge psychological pressure plays a role, but why does this pressure cause a missed penalty? We tried to answer this by measuring the brain activity of football players during the physical execution of a penalty kick."

Slutter and colleagues recruited 22 volunteers to kick penalties and measured their brain activity using a technique called functional near-infrared spectroscopy (fNIRS). fNIRS involves wearing a headset, and can measure brain activity on the move. The study is the first to investigate the neuroscience behind choking under realistic conditions outside a laboratory.

The volunteers attempted to score penalties under different pressure conditions: with an open goal, against a friendly goalkeeper, and in a high-pressure situation where the goalkeeper attempted to distract them and there was a prize at stake.

"We found that players who were able to perform under pressure activated task-relevant areas of the brain," explained Dr Nattapong Thammasan, of the University of Twente. "For example, increased activation of the motor cortex was related to performing under pressure. This seems logical, as movement is one of the most important elements when taking a penalty."

For players who tended to experience more anxiety and miss penalties, another area of the brain was more active - the pre-frontal cortex. This brain region is involved in long-term thinking, suggesting that such players were thinking about the consequences of missing the shot, which impaired their performance.

Strikingly, the researchers believe that the fNIRS technology could assist players to perform better under pressure by letting them know how their brains are behaving. They hypothesize that players could train themselves to activate beneficial brain regions in high-pressure situations.

Excitingly, the technique could also be beneficial for other professions where performance under high pressure is important, such as brain surgery. It may be too late for Beckham, but next-generation brain surgeons could be on the way.

###

How viruses and bacteria can reach drinking water wells

Scientists in Berlin and Milan study transport of microorganisms in bank filtration

POLITECNICO DI MILANO

Induced bank filtration is a key and well-established approach to supplying drinking water to populated areas located along rivers or lakes that have limited access to groundwater resources. It is employed in several countries worldwide, with notable examples in Europe, the United States, and parts of Africa. Contamination of surface waters poses a serious threat to attaining drinking water standards. In this context, human pathogenic microorganisms such as some viruses and bacteria, originating from the discharge of wastewater treatment plants, form a major contaminant group.

A detailed study at an induced bank filtration site along the Rhine river in Germany has now linked the transport of bacteria to seasonal dynamics. Key results show that floods should be considered particular threats, because they can reduce the purification capacity of bank filtration and thus increase concentrations of bacteria in groundwater. Changes in the properties of riverbed sediments over the course of a year can markedly influence the purification capacity of bank filtration, and these dynamics may need to be considered in risk assessment practices.

Toolbox for Waterworks Operators

A project led by Prof. Dr. Irina Engelhardt, who teaches hydrogeology at the TU Berlin, aims to enhance our understanding of bank filtration processes and to improve the predictive capabilities of models describing the transport of microorganisms during bank filtration. Model-based tools are being developed to enable waterworks operators to assess more accurately the purification potential of their bank filtration plants with regard to microorganisms. Partners collaborating on the research include Prof. Dr. Alberto Guadagnini and Prof. Dr. Monica Riva from the Dipartimento di Ingegneria Civile e Ambientale of the Politecnico di Milano.

First results of the cooperation were recently published in the article "Uncertainty analysis and identification of key parameters controlling bacteria transport within a riverbank filtration scenario" in the international journal Water Resources Research.

Uses in Germany and in Italy

More than 100 bank filtration plants are spread across Europe, mostly located in Northern and Central European countries (particularly in Germany, The Netherlands, France, and Finland). In Germany, drinking water production by induced bank filtration is regionally significant. For example, about half of the drinking water in Berlin is obtained by this method. In contrast, induced bank filtration is not widespread in Mediterranean countries. In Italy, an example can be found in Lucca (Tuscany) along the Serchio river. But pressure on water resources in the Mediterranean is increasing due to climate change. Induced bank filtration, as a resource efficient drinking water production method, could become a tool to combat increasing water scarcity in the region.

Uncertainty in numerical model predictions

For drinking water production by induced bank filtration, groundwater extraction wells are placed near rivers or lakes. These wells mainly yield so-called bank filtrate: water that flows from the surface water body to the well and is naturally purified during its passage through the subsurface. Human pathogens are subject to a complex system of processes during their migration through the aquifer, and their concentrations tend to decrease over time because the organisms are partially retained on sediment grains and can also naturally lose their infectivity. A critical operational question for waterworks operators is: how strong is the purification effect of bank filtration, or, in other words, how much can nature reduce the concentration of microorganisms? The research has highlighted the main factors influencing these effects, including, for example, groundwater flow velocity, temperature, and the mineralogical composition of the subsurface. Furthermore, it shows that transport behavior depends on the microorganism species considered. As a result, predictions about the migration of microorganisms in groundwater are subject to a high degree of uncertainty. The research offers a powerful and flexible tool to quantify, and possibly control, this uncertainty and thereby increase confidence in the purification effect of bank filtration.
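In the simplest steady-state picture, the attenuation described above can be sketched as first-order exponential decay along the flow path. The rates, distances, and concentrations below are assumptions for illustration, not values from the study:

```python
import math

# Toy sketch of the purification effect of bank filtration: microorganism
# concentration decays roughly exponentially with travel time through the
# aquifer, with a single first-order rate lumping together attachment to
# sediment grains and loss of infectivity. All numbers are assumed.

def concentration(c0, rate_per_day, distance_m, velocity_m_per_day):
    """C(x) = C0 * exp(-lambda * x / v) after travel distance x at pore velocity v."""
    travel_time_days = distance_m / velocity_m_per_day
    return c0 * math.exp(-rate_per_day * travel_time_days)

c0 = 1e4          # source concentration, organisms per liter (assumed)
lam = 0.5         # lumped removal rate, 1/day (assumed)
x, v = 50.0, 1.0  # 50 m travel distance at 1 m/day pore velocity (assumed)

print(concentration(c0, lam, x, v))  # strong attenuation over 50 days of travel
```

The uncertainty the researchers quantify enters through exactly these parameters: faster flow (e.g. during floods) shortens the travel time, and temperature and sediment mineralogy change the effective removal rate, so small parameter shifts move the predicted concentration by orders of magnitude.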

The research project is funded by the Deutsche Bundesstiftung Umwelt (German Federal Environmental Foundation). The University of Vienna as well as the Stadtwerke Düsseldorf and the company VisDat from Dresden are involved in the research together with the TU Berlin and the Politecnico di Milano.

###

 

Learning on the fly

Computational model demonstrates similarity in how humans and insects learn about their surroundings

UNIVERSITY OF SUSSEX

IMAGE: Schematic of the VS model. Units are colour-coded according to cell types. (Credit: University of Sussex)

Even the humble fruit fly craves a dose of the happy hormone, according to a new study from the University of Sussex which shows how they may use dopamine to learn in a similar manner to humans.

Informatics experts at the University of Sussex have developed a new computational model that demonstrates a long-sought link between insect and mammalian learning, as detailed in a new paper published today in Nature Communications.

Incorporating anatomical and functional data from recent experiments, Dr James Bennett and colleagues modelled how the anatomy and physiology of the fruit fly's brain can support learning according to the reward prediction error (RPE) hypothesis.

The computational model indicates how dopamine neurons in an area of a fruit fly's brain, known as the mushroom body, can produce similar signals to dopamine neurons in mammals, and how these dopamine signals can reliably instruct learning.

The academics believe that establishing whether flies also use prediction errors to learn could lead to more humane animal research, allowing researchers to replace mammals with simpler insect species in future studies into the mechanisms of learning.

By opening up new opportunities to study the neural mechanisms of learning, the researchers hope the model could also deepen understanding of mental health issues such as depression and addiction, whose study is underpinned by the RPE hypothesis.

Dr Bennett, research fellow in the University of Sussex's School of Engineering and Informatics, said: "Using our computational model, we were able to show that data from insect experiments did not necessarily conflict with predictions from the RPE hypothesis, as had been thought previously.

"Establishing a bridge between insect and mammal studies on learning may open up the possibility to exploit the powerful genetic tools available for performing experiments in insects, and the smaller scale of their brains, to make sense of brain function and disease in mammals, including humans."

Understanding of how mammals learn has come a long way thanks to the RPE hypothesis, which suggests that associative memories are strengthened or weakened in proportion to how inaccurate their predictions are.

The hypothesis has had considerable success explaining experimental data about learning in mammals, and has been extensively applied to decision-making and mental health illnesses such as addiction and depression. But scientists have encountered difficulties when applying the hypothesis to learning in insects due to conflicting results from different experiments.
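The error-driven update at the heart of the RPE hypothesis can be sketched in a few lines of Python. The rule below is in the spirit of the classic Rescorla-Wagner formulation of prediction-error learning; the function name, learning rate, and reward values are illustrative choices, not figures from the paper.

```python
# Minimal sketch of reward-prediction-error (RPE) learning: an association
# strength v is updated in proportion to the error between the reward
# received and the reward predicted.

def rpe_update(v, reward, learning_rate=0.1):
    """Nudge a predicted value v toward the observed reward."""
    prediction_error = reward - v      # the dopamine-like RPE signal
    return v + learning_rate * prediction_error

# A cue repeatedly paired with a reward of 1.0: the prediction grows
# and the prediction error shrinks, until learning plateaus.
v = 0.0
for trial in range(50):
    v = rpe_update(v, reward=1.0)
print(round(v, 3))  # → 0.995, close to the true reward of 1.0
```

Once the prediction matches the reward, the error, and hence further learning, vanishes; this is the sense in which memories are learned "in proportion to how inaccurate they are".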

The University of Sussex research team created a computational model to show how the major features of mushroom body anatomy and physiology can implement learning according to the RPE hypothesis.

The model simulates a simplification of the mushroom body, including different neuron types and the connections between them, and shows how the activity of those neurons promotes learning and influences the decisions a fly makes when certain choices are rewarded.

To further understanding of learning in fly brains, the research team used their model to make five novel predictions about the influence different neurons in the mushroom body have on learning and decision-making, in the hope that these will guide future experimental work.

Dr Bennett said: "While other models of the mushroom body have been created, to the best of our knowledge no other model until now has included connections between dopamine neurons and another set of neurons that predict and drive behaviour towards rewards. For example, when the reward is the sugar content of food, these connections would allow the predicted sugar availability to be compared with the actual sugar ingested, allowing more accurate predictions and appropriate sugar-seeking behaviours to be learned.

"The model can explain a large array of behaviours exhibited by fruit flies when the activity of particular neurons in their brains is either silenced or activated artificially in experiments. We also propose connections between dopamine neurons and other neurons in the mushroom body, which have not yet been reported in experiments, but would help to explain even more experimental data."

Thomas Nowotny, Professor of Informatics at the University of Sussex, said: "The model brings together learning theory and experimental knowledge in a way that allows us to think systematically how fly brains actually work. The results show how learning in simple flies might be more similar to how we learn than previously thought."

###

 

What consumers mean when they say your products are authentic

News from the Journal of Marketing

AMERICAN MARKETING ASSOCIATION

Research News

Researchers from the University of Southern California, Bocconi University, and Vrije Universiteit Amsterdam published a new paper in the Journal of Marketing that explains the six types of judgments consumers make when determining a product's authenticity and how marketers can use this insight to deliver more authentic offerings.

The study, forthcoming in the Journal of Marketing, is titled "The Concept of Authenticity: What it Means to Consumers" and is authored by Joseph Nunes, Andrea Ordanini, and Gaia Giambastiani.

Consumers crave authenticity. Yet marketing itself is typically considered inherently inauthentic. Hence, firms must learn to understand, manage, and excel at rendering authenticity. The critical question is: how? Marketers who wish to deliver authentic consumption experiences would benefit from guidance regarding ways to enhance consumers' assessments of the authenticity of their offerings. The starting point is knowing what consumers mean when they talk about authenticity, a nebulous concept.

Nunes says "When consumers talk about authentic consumption experiences, they really are referencing six types of judgments they make involving: accuracy, connectedness, integrity, legitimacy, originality, and proficiency." Accuracy refers to the seller being transparent and reliable in what is conveyed to consumers. Connectedness describes consumers' feelings of engagement and at times a sense of transformation. Integrity means the source is seen as intrinsically motivated, while acting autonomously and consistently. Legitimacy refers to conformity in terms of adhering to norms, standards, rules, or traditions. Originality refers to a product or service standing out from the mainstream. Finally, proficiency refers to the display of skills, craftsmanship, and/or expertise in the offering.

Knowing that judgments of accuracy, connectedness, integrity, legitimacy, originality, and proficiency are what comprise assessments of authenticity, managers can more efficiently and effectively deduce actionable strategies in terms of positioning. Ordanini continues, "From this research, practitioners can also tell which of these six judgments to emphasize and when in their customer marketing and communications. For example, companies selling hedonic products should see relatively large returns perception-wise from emphasizing proficiency because it matters more for hedonic products than for utilitarian products." The mattress company Tuft & Needle (what is more hedonic than sleep?) illustrates this by focusing on their belief in "quality craftsmanship without the gimmicks" on their website.

###

Full article and author contact information available at: https://doi.org/10.1177/0022242921997081

About the Journal of Marketing

The Journal of Marketing develops and disseminates knowledge about real-world marketing questions useful to scholars, educators, managers, policy makers, consumers, and other societal stakeholders around the world. Published by the American Marketing Association since its founding in 1936, JM has played a significant role in shaping the content and boundaries of the marketing discipline. Christine Moorman (T. Austin Finch, Sr. Professor of Business Administration at the Fuqua School of Business, Duke University) serves as the current Editor in Chief. https://www.ama.org/jm

About the American Marketing Association (AMA)

As the largest chapter-based marketing association in the world, the AMA is trusted by marketing and sales professionals to help them discover what is coming next in the industry. The AMA has a community of local chapters in more than 70 cities and 350 college campuses throughout North America. The AMA is home to award-winning content, PCM® professional certification, premier academic journals, and industry-leading training events and conferences. https://www.ama.org

 

Discovery of a new genetic cause of hearing loss illuminates how inner ear works

Penn Medicine study shows link between mutations of GAS2 gene and ability to amplify incoming sound, providing research target for future gene therapy

UNIVERSITY OF PENNSYLVANIA SCHOOL OF MEDICINE

Research News

PHILADELPHIA-- A gene called GAS2 plays a key role in normal hearing, and its absence causes severe hearing loss, according to a study led by researchers in the Perelman School of Medicine at the University of Pennsylvania.

The researchers, whose findings are published online today in Developmental Cell, discovered that the protein encoded by GAS2 is crucial for maintaining the structural stiffness of support cells in the inner ear that normally help amplify incoming sound waves. They showed that inner ear support cells lacking functional GAS2 lose their amplifier abilities, causing severe hearing impairment in mice. The researchers also identified people who have GAS2 mutations and severe hearing loss.

"Anatomists 150 years ago took pains to draw these support cells with the details of their unique internal structures, but it's only now, with this discovery about GAS2, that we understand the importance of those structures for normal hearing," said study senior author Douglas J. Epstein, PhD, professor of genetics at Penn Medicine.

Two to three of every 1,000 children in the United States are born with hearing loss in one or both ears. About half of these cases are genetic. Although hearing aids and cochlear implants often can help, these devices seldom restore hearing to normal.

One of the main focuses of the Epstein laboratory at Penn Medicine is the study of genes that control the development and function of the inner ear--genes that are often implicated in congenital hearing loss. The inner ear contains a complex, snail-shaped structure, the cochlea, that amplifies the vibrations from sound waves, transduces them into nerve signals, and sends those signals toward the auditory cortex of the brain.

Unraveling the role of Gas2 in hearing

A few years ago, Epstein's team discovered that Gas2, the mouse version of human GAS2, is switched on in embryos by another gene known to be critical for inner ear development. To determine Gas2's role in that development, the team developed a line of mice in which the gene had been knocked out of the genome and called them Gas2-knockout mice.

Alex Rohacek, PhD, a former graduate student in the Epstein lab, was puzzled to observe that the Gas2-knockout mice had inner ears with cells and structures that seemed quite normal. However, the animals, when tested, turned out to be severely hearing-impaired, with deficits of up to 50 decibels at high sound frequencies--equivalent to a loss of 99.999 percent of the normal acoustic energy.
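The decibel figure above can be checked directly from the definition of the decibel scale: a deficit of D dB corresponds to detecting only 10^(-D/10) of the original acoustic power. This quick sketch (not part of the study; the function name is our own) confirms that 50 dB works out to a 99.999 percent loss of energy.

```python
# Convert a hearing deficit in decibels to the fraction of acoustic
# power still detected, using the standard power-ratio definition:
# deficit_dB = 10 * log10(P_original / P_detected).

def db_loss_to_energy_fraction(deficit_db):
    """Fraction of acoustic power remaining after a loss of `deficit_db` dB."""
    return 10 ** (-deficit_db / 10)

remaining = db_loss_to_energy_fraction(50)
lost_percent = 100 * (1 - remaining)
# remaining is 1e-05, so lost_percent is 99.999, matching the article.
```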

Tingfang Chen, PhD, a postdoctoral fellow and co-first author on the study, determined that Gas2 is normally active within inner-ear support cells called pillar cells and Deiters' cells. In these cells, the protein encoded by the gene binds to flexible, tube-like structures called microtubules in a way that bundles and stabilizes them, effectively stiffening the cells.

With help from the collaborating team of Benjamin L. Prosser, PhD, assistant professor of Physiology at Penn Medicine and an expert on microtubules, the researchers discovered that when pillar cells and Deiters' cells lack Gas2, their microtubule bundles tend to come apart, dramatically reducing the stiffness of the cells.

That turns out to have dire implications for hearing. Within the inner ear, pillar cells and Deiters' cells help form the basic structure of the cochlea and serve as physical supports for cells called outer hair cells. The outer hair cells move in response to incoming acoustical vibrations--essentially to provide a crucial amplification of that sound energy. The experiments revealed that the pillar and Deiters' cells' loss of stiffness, due to the absence of Gas2, severely degrades the sound-amplifying properties of the outer hair cells they support.

"We observed that some of the Deiters' cells in the Gas2-knockout mice even buckled under the tension of the rapid movements of the outer hair cells," Epstein said.

The experiments included sophisticated imaging of propagating sound waves in the inner ears of live Gas2-knockout and normal mice, conducted by collaborator John Oghalai, MD, chair and professor of otolaryngology-head and neck surgery at the Keck School of Medicine of USC, and his team.

GAS2 also causes human hearing loss

Curiously, the researchers could find no reports of GAS2-associated congenital hearing loss in the medical literature. Even when they canvassed colleagues around the world who run hearing-loss clinics, they came up empty-handed.

Then one day, Hannie Kremer, PhD, professor and chair of molecular otogenetics at Radboud University Medical Center in the Netherlands, emailed Epstein. She and her team had been studying a Somalian family in which four of the siblings had severe hearing loss from early life. The affected family members had no mutations in known hearing-loss genes--but each carried two mutant copies of GAS2.

The study therefore establishes GAS2 as a very probable new hearing loss gene in humans--the first one known to affect the mechanical properties of inner ear support cells.

The prevalence of hearing loss in people due to GAS2 mutations remains to be determined, but Epstein noted that this type of congenital hearing loss is nevertheless an attractive target for a future gene therapy.

"In many genetic hearing loss conditions, the affected cells are permanently damaged or die, but in this one, the affected cells are intact and conceivably could be restored to normal or near-normal by restoring GAS2 function," he said.

He added that such a gene therapy might be useful not only in more obvious cases of hearing loss in early childhood, but also in cases--perhaps more numerous--in which inherited mutations lead to a slower development of hearing loss in adulthood.

###

Funding was provided by the National Institutes of Health (R01 DC006254, R01 DC014450, R01 DC013774, R01 DC017741, R01 HL133080), the Boucai Innovation Fund in Auditory Genomics, the National Science Foundation (15-48571), and the Heinsius Houbolt Foundation.

Penn Medicine is one of the world's leading academic medical centers, dedicated to the related missions of medical education, biomedical research, and excellence in patient care. Penn Medicine consists of the Raymond and Ruth Perelman School of Medicine at the University of Pennsylvania (founded in 1765 as the nation's first medical school) and the University of Pennsylvania Health System, which together form an $8.9 billion enterprise.

The Perelman School of Medicine has been ranked among the top medical schools in the United States for more than 20 years, according to U.S. News & World Report's survey of research-oriented medical schools. The School is consistently among the nation's top recipients of funding from the National Institutes of Health, with $496 million awarded in the 2020 fiscal year.


 

Hologram experts can now create real-life images that move in the air

Using lasers to create the displays of science fiction, inspired by Star Wars and Star Trek

BRIGHAM YOUNG UNIVERSITY

Research News

IMAGE: A TINY STARSHIP ENTERPRISE FIRES ON A TINY KLINGON BATTLE CRUISER WITH REAL ANIMATED IMAGES CREATED IN THIN AIR.

CREDIT: BYU

They may be tiny weapons, but Brigham Young University's holography research group has figured out how to create lightsabers -- green for Yoda and red for Darth Vader, naturally -- with actual luminous beams rising from them.

Inspired by the displays of science fiction, the researchers have also engineered battles between equally small versions of the Starship Enterprise and a Klingon Battle Cruiser that incorporate photon torpedoes launching and striking the enemy vessel that you can see with the naked eye.

"What you're seeing in the scenes we create is real; there is nothing computer generated about them," said lead researcher Dan Smalley, a professor of electrical engineering at BYU. "This is not like the movies, where the lightsabers or the photon torpedoes never really existed in physical space. These are real, and if you look at them from any angle, you will see them existing in that space."

It's the latest work from Smalley and his team of researchers, who garnered national and international attention three years ago when they figured out how to draw screenless, free-floating objects in space. Called optical trap displays, they're created by trapping a single particle in the air with a laser beam and then moving that particle around, leaving behind a laser-illuminated path that floats in midair, like "a 3D printer for light."

The research group's new project, funded by a National Science Foundation CAREER grant, goes to the next level and produces simple animations in thin air. The development paves the way for an immersive experience where people can interact with holographic-like virtual objects that co-exist in their immediate space.

"Most 3D displays require you to look at a screen, but our technology allows us to create images floating in space -- and they're physical; not some mirage," Smalley said. "This technology can make it possible to create vibrant animated content that orbits around or crawls on or explodes out of everyday physical objects."

To demonstrate that principle, the team created virtual stick figures that walk in thin air. They demonstrated the interaction between these virtual images and humans by having a student place a finger in the middle of the volumetric display and then filming a stick figure walking along and jumping off that finger.

Smalley and Rogers detail these and other recent breakthroughs in a new paper published in Scientific Reports this month. The work overcomes a limiting factor of optical trap displays: until now, the technology has lacked the ability to show virtual images. Smalley and Rogers show that it is possible to simulate virtual images by employing a time-varying perspective projection backdrop.

"We can play some fancy tricks with motion parallax and we can make the display look a lot bigger than it physically is," Rogers said. "This methodology would allow us to create the illusion of a much deeper display, theoretically up to an infinite depth."

###