Friday, January 19, 2024

 

Independent dispute resolution of No Surprises Act financially unviable for radiology


The No Surprises Act’s (NSA) independent dispute resolution (IDR) process would be financially unfeasible for a large portion of out-of-network (OON) claims for hospital-based specialties—more so for radiologists than other specialists.


Peer-Reviewed Publication

AMERICAN ROENTGEN RAY SOCIETY

VIDEO: Financial Viability of the No Surprises Act Independent Dispute Resolution Process

Eric W. Christensen, PhD discusses his research that assesses the fraction of out-of-network claims for which radiologists and other hospital-based specialists can expect to at least break even when challenging payer-determined payments through the No Surprises Act independent dispute resolution process, as a measure of the process's financial viability.

CREDIT: ARRS/AJR




Leesburg, VA, January 17, 2024—According to the American Journal of Roentgenology (AJR), the No Surprises Act’s (NSA) independent dispute resolution (IDR) process would be financially unfeasible for a large portion of out-of-network (OON) claims for hospital-based specialties—more so for radiologists than other specialists.

“Although the NSA enacted important patient protections, IDR fees limit clinicians’ opportunities to dispute payer-determined payments and potentially undermine their bargaining power in contract negotiations,” wrote AJR first author Eric W. Christensen, PhD, of the Neiman Health Policy Institute in Reston, VA. “Therefore, IDR rulemaking may negatively impact patient access to in-network care.”

Christensen et al.’s accepted manuscript extracted claims from Optum’s de-identified Clinformatics Data Mart Database for hospital-based specialties, covering services performed the same day as in-network emergency visits or inpatient stays (January 2017–December 2021). After identifying OON claims, the researchers batched them according to simulated IDR rules. The maximum potential payment recovered through the IDR process was estimated as the difference between charges and the allowed amount. The team then determined the percentage of claims for which the maximum potential payment, and one-quarter of this amount (a more realistic payment recovery estimate), would exceed IDR fees, using $150 and $450 fee thresholds to approximate the range of final 2024 IDR fees. These values represented the percentage of OON claims that would be financially viable candidates for IDR submission.

Ultimately, the percentage of radiologists’ OON claims for which the maximum potential recovered payment exceeded fee thresholds of $150 and $450 (i.e., financial breakeven points for entering the NSA IDR process) was 55.0% and 32.1%, respectively; at payment of one-quarter of the maximum amount, these percentages were 26.9% and 10.6%, respectively.
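The study's viability criterion can be sketched in a few lines of Python. The claim amounts below are hypothetical illustrations, not the study's Optum data: a disputed claim is a financially viable IDR candidate when the expected recovered payment (charges minus the payer-allowed amount, optionally scaled to a more realistic fraction) exceeds the IDR fee.

```python
def viable_share(claims, fee, recovery_fraction=1.0):
    """Fraction of OON claims whose expected recovery exceeds `fee`.

    claims: list of (billed_charge, allowed_amount) tuples in dollars.
    recovery_fraction: share of the maximum potential recovery actually
    expected (the study also tested one-quarter of the maximum).
    """
    viable = sum(
        1 for charge, allowed in claims
        if (charge - allowed) * recovery_fraction > fee
    )
    return viable / len(claims)

# Hypothetical OON radiology claims: (billed charge, allowed amount).
claims = [(600, 200), (350, 300), (1500, 400), (250, 150), (900, 100)]

print(viable_share(claims, fee=150))                          # -> 0.6
print(viable_share(claims, fee=450))                          # -> 0.4
print(viable_share(claims, fee=150, recovery_fraction=0.25))  # -> 0.4
```

Raising the fee threshold from $150 to $450, or assuming only a quarter of the maximum is recovered, shrinks the viable share, which mirrors the drop the study reports for radiology claims.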



North America’s first radiological society, the American Roentgen Ray Society (ARRS) remains dedicated to the advancement of medicine through the profession of medical imaging and its allied sciences. An international forum for progress in radiology since the discovery of the x-ray, ARRS maintains its mission of improving health through a community committed to advancing knowledge and skills with the world’s longest continuously published radiology journal—American Journal of Roentgenology—the ARRS Annual Meeting, InPractice magazine, topical symposia, myriad multimedia educational materials, as well as awarding scholarships via The Roentgen Fund®.


Fewer than 1% of schools in England have full policies on second languages, language learning and English


A survey of 998 schools found just six with whole-school language policies covering issues such as how to support the 20% of students who speak English as an additional language (EAL) and expectations concerning English.

Peer-Reviewed Publication

UNIVERSITY OF CAMBRIDGE





A tiny fraction of schools in England – about three in every 500 – have whole-school policies which address foreign languages, English usage, and integrating students who speak English as an additional language (EAL), new research indicates.

The study of almost 1,000 secondary schools, by researchers at the University of Cambridge, questions many schools’ claims to being ‘inclusive’ spaces that value the linguistic diversity of their communities. It also suggests that language learning, and an appreciation of different languages, are being deprioritised, conflicting with Government ambitions for 90% of students to study a language to GCSE by 2025.

Schools in England are increasingly multicultural. Around 20% of students qualify as EAL, meaning they typically speak a different language at home. The study examined how schools navigate various issues this raises relating to languages: including students’ use and acquisition of English; the status of modern languages in the curriculum; and schools’ attitudes towards home and community languages.

The researchers reviewed every publicly available policy document from the websites of 998 secondary schools (about 20% of all those in England). Only six published dedicated school-wide policies on languages. More schools had official policies on asbestos management, or whether dogs were allowed on school grounds.

While most schools did have some specific language-related policies, these were often isolated, unclear and inconsistent; particularly on key issues such as supporting EAL learners, or the status of community languages like Polish and Urdu. Over a third (37%) failed to mention English, other languages, or even the term ‘language’ in any public policy document.

Writing in the British Educational Research Journal, the authors suggest that the absence of joined-up language strategies could pose particular challenges for England’s 1.7 million EAL students. Only 6% of state schools, and 15% of schools overall, had explicit EAL policies.

They also encountered anecdotal evidence that some EAL students’ experiences are disjointed as a result. One student, who had recently moved to the UK from Poland, described being told off for speaking Polish in one lesson when asking a friend for help, before receiving learning resources translated into Polish in her next class.

Karen Forbes, Associate Professor in Second Language Education at the University of Cambridge, said: “It’s surprising that so few schools seem to have systematic policies concerning language education and language use.”

“A major reason is probably that school leaders just don’t have time to devise a unified approach on linguistic diversity. But language is fundamental to how students learn, conceptualise ideas, and process information. The absence of clear policies is very worrying, especially for EAL students.”

Although one in 10 state schools in the sample had EAL student populations exceeding 40% of the total, most only mentioned languages in passing, and usually within broader policies. In 33 schools, EAL was explicitly categorised as a “special educational need”, contradicting the Government’s own code of practice on special educational needs and disabilities.

Most policies on English reinforced national guidance requiring students to learn ‘standard English’ across the curriculum. Surprisingly, however, some extended this beyond the classroom. One school expected parents to “explain when their child should use Standard English appropriately at home”, which, the study notes, non-English-speaking parents would find challenging. Seven schools banned other languages completely, including at breaktimes, with one going so far as to state that “students should be reprimanded for speaking other languages”.

These policies often appear to clash with schools’ public claims that they celebrate cultural and linguistic diversity. Forbes said that they could also hinder learning, especially in subjects that involve specialised language and terminology, which EAL students might grasp more quickly if they were allowed to translate them.

“Prohibiting students from even using their home language to talk to their friends at breaktime can seriously undermine a student’s sense of belonging and whether they feel valued,” she added.

The study also raises concerns about how far language learning and multilingualism are encouraged within the curriculum.

While modern foreign languages are mandatory to Key Stage 3 (age 14), the Government encourages their study to GCSE through the English Baccalaureate. About 65% of independent schools in the sample regarded language study to GCSE as compulsory, but just a quarter of state schools “required” or “encouraged” this.

Even at Key Stage 3, however, over 10% of schools had policies for “disapplying” students from supposedly compulsory language lessons, often to prioritise English and Maths. This disproportionately affected EAL students, who tend to have strong multilingual skills.

Dr Nicola Morea, a co-author of the study, now based at the University of Reading, said: “Lots of research indicates that learning other languages supports English literacy skills. The argument that students should do less French or Spanish because they need to improve their English is potentially counterproductive.”

The authors argue that language policy is best handled by schools – rather than nationally – because this allows them to tailor their approach to the needs and linguistic context of local communities. They hope the study will become the initial part of a larger project, leading to the development of a toolkit through which schools and school leaders can develop joined-up language policies.

“In some schools, even individual teachers seem to be unaware of how their colleagues are handling issues like expectations concerning English, or EAL student support,” Forbes said. “Much of that could be resolved by developing shared principles and practices. That will need to come from schools themselves, so that it is meaningful in their own settings.”

The findings are published in the British Educational Research Journal.

ARACHNOLOGY

Rain can spoil a wolf spider’s day, too


Wolf spiders change their mating strategy after it rains


Peer-Reviewed Publication

UNIVERSITY OF CINCINNATI

IMAGE: Wolf spider. University of Cincinnati biologists found that wolf spiders can't detect predators or communicate with potential mates as easily after it rains. They must rely more on visual communication than chemical or vibratory cues.

CREDIT: Joseph Fuqua II





If you hate the rain, you have something in common with wolf spiders.

Researchers at the University of Cincinnati found that wolf spiders can’t signal others or perceive danger from predators as easily on rain-soaked leaves compared to dry ones. Even communicating with would-be mates is harder after it rains.

The study was published in the Journal of Insect Behavior.

Biologists in UC’s College of Arts and Sciences studied Schizocosa ocreata, a wolf spider found across much of the United States. Males respond to chemical signals from females by using a combination of visual and vibratory signals for courtship.

Spiders don’t have ears, but sense sound in vibrations using specialized sense organs in their eight legs. The vibrations from sound carry much better in dry leaves.

Wolf spiders are especially attuned to the calls of predatory birds like blue jays that feast on spiders, particularly during nesting season when their offspring eat virtually nothing else, lead author and UC Professor George Uetz said.

“Birds preferentially feed spiders to their offspring because spiders have 50 times the amount of taurine than insects,” Uetz said. “And taurine is critical for the development of the hippocampus of nestlings’ brains. The hippocampus is associated with spatial memory, which is really important for birds.”

The spring nesting season coincides with the spiders’ mating season. Male wolf spiders are especially vulnerable to birds when they try to woo a mate with a flashy display, which involves drumming their exaggeratedly furry front legs on the leaves in front of them.

The UC study found that male spiders become motionless and “freeze” when a blue jay calls nearby. And they stay frozen longer on dry leaves than on wet ones, possibly because the dry leaves provide a better early warning than wet ones that dampen vibrations.

Uetz said the striped black and brown spiders have remarkable camouflage when they remain motionless.

“If they remain still, they just disappear,” he said.

“Like lots of predators, blue jays are visually oriented,” Uetz said. “They’re very visually acute. For them, movement really is the key.”

With climate change increasing the frequency and duration of spring rains, wolf spiders will face increasing ecological challenges, Uetz said.

“Spring is coming a little earlier. What we’re seeing is there’s a shift in the life cycle of the spiders,” Uetz said. “If global warming shifts the annual life cycle of the spiders, that will push them out of synchrony with the nesting season of the birds, which is more closely tied to daylight cues.”

Wet leaves didn’t discourage male spiders from courting females. Male spiders increased visual signaling to compensate for the inability to use vibrations. But spiders that could use dry leaves to communicate to females had more mating success. 

UC graduate Rachel Gilbert, a study co-author, said spiders can be important bellwethers of environmental change.

“Spiders are very important to local ecosystems, both as a predator of insects and as a major food source to larger animals like birds,” she said.

The study was funded by grants from the National Science Foundation.

Wolf spiders sense sound by detecting vibrations in their legs. They often freeze at the jarring calls of blue jays and other birds. Biologists at the University of Cincinnati found that wolf spiders have more difficulty detecting predators and communicating with each other on leaves left wet by rain.

CREDIT: Joseph Fuqua II

HORTICULTURE 

Keys to aging hidden in the leaves


Discovery brings nearly dead plants back to life


Peer-Reviewed Publication

UNIVERSITY OF CALIFORNIA - RIVERSIDE

IMAGE: UC Riverside researchers Heeseung Choi and Katie Dehesh holding young, green and old, yellow Arabidopsis plants in the laboratory.

CREDIT: Katie Dehesh/UCR





Scientists have known about a particular organelle in plant cells for over a century. However, UC Riverside scientists have only now discovered that organelle’s key role in aging.

The researchers initially set out to understand more generally which parts of plant cells control plant responses to stress from things like infections, too much salt, or too little light. Serendipitously, they found this organelle, and a protein responsible for maintaining the organelle, control whether plants survive being left too often in the dark.  

Because they had not expected this discovery, which is described in a Nature Plants journal article, the research team was thrilled.

“For us, this finding is a big deal. For the first time, we have defined the profound importance of an organelle in the cell that was not previously implicated in the process of aging,” said Katie Dehesh, distinguished professor of molecular biochemistry at UCR and co-author of the new article.

Sometimes described as appearing like a stack of deflated balloons or some dropped lasagna, the organelle called the Golgi body is composed of a series of cup-shaped membrane-covered sacs. It sorts various molecules in the cell and ensures they get to the right places.

“Golgi are like the post office of the cell. They package and send out proteins and lipids to where they’re needed,” said Heeseung Choi, a researcher in UCR’s Botany and Plant Sciences Department and co-author of the new study. “A damaged Golgi can create confusion and trouble in the cell’s activities, affecting how the cell works and stays healthy.”

If the Golgi is the post office, then the COG protein is the postal worker. This protein controls and coordinates the movement of small sac “envelopes” that transport other molecules around the cell.

Additionally, COG helps Golgi bodies attach sugars to other proteins or lipids before they are sent elsewhere in the cell. This sugar modification, called glycosylation, is crucial for many biological processes, including immune response.

To learn more about how COG affects plant cells, the research team modified some plants so that they could not produce it. Under normal growing conditions, the modified plants grew just fine, and were indistinguishable from unmodified plants.

However, depriving plants of light means they cannot make sugar from sunlight to fuel growth. When exposed to excessive darkness, the leaves of the mutant, COG-free plants began to yellow, wrinkle, and thin, signs that the plants were dying.

“In the dark, the COG mutants showed signs of aging that typically appear in wild, unmodified plants around day nine. But in the mutants, these signs manifested in just three days,” Choi said.

Reversing the mutation and returning the COG protein back into the plants rapidly brought them back to life. “It’s like nothing happened to them once we reversed the mutation,” Dehesh said. “These responses highlight the critical importance of the COG protein and normal Golgi function in stress management,” Choi added.

Part of the excitement surrounding this discovery is that humans, plants, and all eukaryotic organisms have Golgi bodies in their cells. Now, plants can serve as a platform to explore the intricacies of the Golgi's role in human aging. For this reason, the research team is planning further studies of the molecular mechanisms behind the results from this study.

“Not only does our research advance our knowledge about how plants age, but it could also provide crucial clues about aging in humans,” Dehesh said. “When the COG protein complex doesn't work properly, it might make our cells age faster, just like what we saw in plants when they lacked light. This breakthrough could have far-reaching implications for the study of aging and age-related diseases.”

Revolutionizing grapevine phenotyping: harnessing LiDAR for enhanced growth assessment and genetic insights


Peer-Reviewed Publication

PLANT PHENOMICS

IMAGE: Fig. 1. Overview of the experimental and technical design of the study.

CREDIT: Plant Phenomics




In response to the pressing need to reduce pesticide usage and adapt grapevine varieties to climate change, there's an unprecedented effort to phenotype new genotypes using high-throughput methods. Teams globally are developing advanced systems, employing technologies like multispectral cameras and LiDAR, to assess growth traits, photosynthetic capability, and other architectural parameters. However, traditional methods remain time-consuming and less efficient for large-scale studies. The current research gap lies in effectively employing LiDAR technology to explore genetic factors affecting grapevine vigor for sustainable viticulture.

In November 2023, Plant Phenomics published a research article entitled “LiDAR Is Effective in Characterizing Vine Growth and Detecting Associated Genetic Loci”.

The study assessed growth traits in 209 grapevine genotypes using methods such as fresh pruning wood weight, exposed leaf area from digital images, leaf chlorophyll concentration, and LiDAR-derived volumes. It found six genomic regions associated with trait variations, validating LiDAR as an effective tool for characterizing grapevine growth.

LiDAR-derived canopy volumes showed strong correlations with traditional methods, and pruning wood volume from LiDAR positively correlated with actual pruning weight. Although some relationships varied between seasons, LiDAR provided more consistent measurements overall. All traits except exposed leaf area (ELA) in certain years met normality criteria, and the parents displayed significant differences for most traits. LiDAR-derived traits exhibited high, stable heritability, outperforming traditional methods, and led to effective genetic models explaining substantial phenotypic variance.

The study generated high-density genetic maps and identified quantitative trait loci (QTLs) associated with growth traits, finding QTLs that were stable across seasons and validating the genetic determinism of grapevine vigor using LiDAR. It also noted that LiDAR-derived volumes at véraison and in winter were more reliable and heritable than traditional methods, with powerful QTL detection confirming their efficacy.
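Two of the analyses described above, correlating a LiDAR-derived trait with its traditional counterpart and estimating a trait's heritability from replicated genotype measurements, can be sketched as follows. The numbers are synthetic placeholders, not the study's data, and the heritability formula is a simple entry-mean broad-sense estimate, not necessarily the authors' exact model:

```python
import statistics

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def broad_sense_h2(measurements_by_genotype):
    """Entry-mean broad-sense heritability H^2 = Vg / (Vg + Ve/r).

    measurements_by_genotype: dict mapping genotype -> list of replicate
    values. Vg is estimated as the variance of genotype means minus the
    within-genotype (error) variance divided by the mean replicate count.
    """
    groups = list(measurements_by_genotype.values())
    means = [statistics.fmean(v) for v in groups]
    ve = statistics.fmean([statistics.variance(v) for v in groups])
    r = statistics.fmean([len(v) for v in groups])
    vg = max(statistics.variance(means) - ve / r, 0.0)
    return vg / (vg + ve / r)

# Hypothetical per-vine values: LiDAR canopy volume vs. pruning weight.
lidar_volume = [14.2, 9.8, 17.5, 7.6, 12.1]   # e.g. dm^3, hypothetical
pruning_weight = [0.9, 0.7, 1.3, 0.5, 1.0]    # kg, hypothetical
print(round(pearson(lidar_volume, pruning_weight), 2))  # -> 0.95
```

A trait with large differences between genotype means relative to replicate-to-replicate noise yields H^2 near 1, the pattern the study reports for LiDAR-derived volumes.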

In summary, this research underscores the potential of LiDAR technology for high-throughput phenotyping and genetic studies of grapevine growth, providing a more efficient alternative to conventional methods. It opens avenues for understanding environmental effects, management techniques, and training systems on grapevine growth, moving towards more detailed genetic insights into grapevine vigor and architecture.

###

References

Authors

Elsa Chedid1, Komlan Avia1, Vincent Dumas1, Lionel Ley2, Nicolas Reibel2, Gisèle Butterlin1, Maxime Soma3, Raul Lopez-Lozano4, Frédéric Baret4, Didier Merdinoglu1, and Éric Duchêne1*

Affiliations

1INRAE, University of Strasbourg, UMR SVQV, 28, rue de Herrlisheim, 68000 Colmar, France.

2INRAE, UEAV, 28, rue de Herrlisheim, 68000 Colmar, France.

3INRAE, Aix-Marseille Université, UMR RECOVER, 3275 Route de Cézanne, 13182 Aix-en-Provence, France.

4INRAE, Avignon Université, UMR EMMAH, UMT CAPTE, 228, route de l’aérodrome, 84914 Avignon, France.

About Éric Duchêne

He is currently the Deputy Director of the Joint INRAE-UNISTRA Research Unit on Vine Health and Wine Quality (SVQV). His research revolves around two main lines: on the one hand, plant-(vector)-pathogen interactions, to reduce the use of pesticides and the impact of diseases; on the other, maintaining the productivity of vineyards and the quality of wines in the context of vine decline and climate change.

 

Can topography facilitate the refinement of landscape design methods?


Peer-Reviewed Publication

HIGHER EDUCATION PRESS

IMAGE: Panoramic aerial photography of the Qilin transportation hub in suburban Nanjing, where three-dimensional technology recorded the constructed terrain and surface features in two dimensions. The distortion in the image makes the landscape unfamiliar, letting viewers unexpectedly and strongly perceive the fragmented landscape at the urban fringe and its major cause, transportation infrastructure.

CREDIT: Xiaoan Wu





In the field of landscape architecture, topography aims to study the complex and continuously changing relationship between humans and the land through continuously updated and iterative tools and media. It maintains a balance between abstract concepts and concrete perceptions, which can both drive the development of science and technology in the field and remain open to artistic expression. Thus, topographical design may be an effective way to help refine landscape design methods.

 

The work entitled “Can Topography Facilitate the Refinement of Landscape Design Methods?” was published in the journal Landscape Architecture Frontiers (December 7, 2023).