It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Monday, July 17, 2023
Survival of children with acute lymphoblastic leukemia further increased
The five-year survival of all children with acute lymphoblastic leukemia (ALL) has continued to increase to 94%. This is evident from a study of 800 Dutch children. Within the study, modified treatment protocols for four subgroups were examined. The modifications were found to have positive effects on survival and quality of life. For example, the risk of disease recurrence became as much as three times smaller for children with an aggressive form of leukemia. Says Prof. Dr. Rob Pieters: ‘The five-year survival rate for children with acute lymphoblastic leukemia has increased dramatically since the 1960s, from zero to 94%, but the last steps are the most difficult.’
Acute lymphoblastic leukemia is the most common form of childhood cancer in the Netherlands. Every year, about 110 children are diagnosed with this form of cancer. The prognosis is good for many children, but not yet for every child. To improve survival rates and quality of life for all children with leukemia, the treatment protocol has been continually adapted over the years on the basis of new scientific insights.
The study results of the ALL-11 treatment protocol, led by researchers at the Princess Máxima Center, were published today in the Journal of Clinical Oncology.
Three times lower risk of recurrence
Between April 2012 and July 2020, more than 800 children in the Netherlands were treated according to this protocol. The study looked at the effect of modified treatment in specific groups of children with leukemia, including those with a so-called Ikaros abnormality. Prof. Dr. Rob Pieters, pediatric oncologist and medical director of the Princess Máxima Center, led the clinical study. He says: ‘There is broad interest worldwide in this research, because it was still unknown how to improve therapy for children with Ikaros leukemia.’
Children with an Ikaros abnormality in the DNA of their leukemia cells are more likely to have their disease return after treatment. In this study, these children received an additional year of ‘maintenance phase’ chemotherapy on top of the first two years of treatment. This modification led to a three times lower risk of the cancer returning: it only happened in 9% of them, compared to 26% of children in the previous treatment protocol.
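For readers who want to check the ‘three times’ claim, here is a minimal sketch of the arithmetic, using only the two percentages quoted above:

```python
# Relapse risk under the previous protocol vs. the modified ALL-11 protocol,
# using the two percentages quoted in the press release.
previous_risk = 0.26   # 26% of children relapsed under the earlier protocol
modified_risk = 0.09   # 9% relapsed with the extra year of maintenance chemo

print(f"Risk ratio: {previous_risk / modified_risk:.1f}x")  # ~2.9x, i.e. about three times lower
```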
Less severe treatment proves safe
In the ALL-11 protocol, doctors and researchers also looked at the effect of less intensive treatment for three other groups of children. These included children with a DNA abnormality in their leukemia cells that is associated with a very high cure rate, and children with Down syndrome who suffer a lot of side effects from therapy. These children were given lower amounts of anthracyclines, a particular type of chemotherapy that increases the risk of heart damage and infections. The modification turned out to be a good choice: the children had the same or even better survival rate while their quality of life improved due to a lower risk of infections and less risk of heart damage.
More cure with fewer side effects
Prof. Dr. Rob Pieters: ‘The five-year survival rate for children with acute lymphoblastic leukemia has increased tremendously since the 1960s, from zero to 94%, but the last steps are the hardest. We are now one step closer to curing all children with ALL. We have also been able to largely remove a drug that carries a risk of heart damage from the treatment of children with less aggressive disease. So the latest results for children with leukemia fit exactly with our mission: more cure, with fewer side effects.’
We know less about the rainforest canopy, where most of the world’s species live, than we do about the surface of Mars or the bottom of the ocean. However, that is about to change thanks to GEDI, a NASA space laser that has provided a detailed picture of the structure of the world’s rainforests for the first time ever.
“Tropical forests are mainly unstratified, especially in Amazonia and regions with lower fertility or higher temperatures,” reads the title of the recently published paper in Environmental Research Ecology that details the laser’s findings. The paper was authored by researchers from the U.S., the U.K. and Singapore. Christopher Doughty, professor in NAU’s School of Informatics, Computing, and Cyber Systems and first author on the study, believes this research is crucial, and long overdue, in finding out more about tropical ecosystems.
“Most of the world’s species live in tropical forests and most of those make use of the canopy, and yet, we know so little,” Doughty said. “Rainforest structure matters because it controls how animals access resources and escape predators, and these findings will help us understand tropical forest animals’ susceptibility to climate change.”
Research into forest canopies has come a long way. Early western visitors described tropical forests as horror vacui (nature abhorring a vacuum) since vegetation was “anxious to fill every available space with stems and leaves.” Later, as scientists began to study tropical forests, they categorized the lush flora into forest layers—a thick upper crown and a thick mid-layer with a thin layer in between. However, this was only observed in a few well-studied locations. The structure across most tropical forests was still unknown.
Then came GEDI, the Global Ecosystem Dynamics Investigation.
“A key difference between GEDI and many other satellites is its measurement of three-dimensional canopy structure,” said Hao Tang, professor in the Department of Geography at the National University of Singapore (NUS) and co-author on the paper. Tang, who is also a principal investigator at the NUS Centre for Nature-based Climate Solutions, added, “Conventional satellites, while providing valuable data on land cover and canopy greenness, often lack the detailed vertical information offered by GEDI. This vertical information is crucial for understanding ecosystem dynamics, carbon storage and biodiversity that cannot be easily seen from typical satellite images.”
Launched in late 2018, NASA’s GEDI shoots an invisible laser from the International Space Station into Earth’s forests thousands of times a day. Depending on the amount of energy returned to the instrument, it can provide a detailed 3D map that shows where the leaves and branches are in a forest and how they change over time. This will help researchers understand the amounts of biomass and carbon forests store and how much they lose when disturbed—vital information for understanding Earth’s carbon cycle and how it is changing.
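The measurement behind those maps is essentially a timing exercise: laser light reflecting off the canopy top returns to the instrument earlier than light reflecting off the ground below it. A simplified sketch of that time-to-height conversion (the numbers are illustrative, not GEDI’s actual data format):

```python
# Simplified sketch: converting lidar return times to canopy heights.
# Half the round-trip time difference between a canopy return and the
# ground return, multiplied by the speed of light, gives height above ground.
C = 299_792_458.0  # speed of light, m/s

def return_height_m(t_return_ns: float, t_ground_ns: float) -> float:
    """Height above ground for a return arriving t_return_ns, given the ground return time."""
    round_trip_s = (t_ground_ns - t_return_ns) * 1e-9
    return C * round_trip_s / 2.0

# A return arriving ~200 ns before the ground return came from ~30 m up:
print(round(return_height_m(t_return_ns=0.0, t_ground_ns=200.0), 1))  # 30.0
```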
Doughty, Tang and the other authors of the paper analyzed GEDI data across all tropical forests and found that the structure was simpler and more exposed to sunlight than previously thought. Data also revealed that most tropical forests (80 percent of the Amazon and 70 percent of Southeast Asia and the Congo Basin) have a peak in the number of leaves at 15 meters instead of at the canopy top, debunking the fullest-at-the-top theory of early researchers. While forests vary, a key finding that seemed to remain constant in every scenario was that deviation from more ideal conditions (like lower fertility or higher temperatures) leads to shorter, less stratified forests with lower biomass.
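The ‘15 meters’ figure refers to where a vertical profile of leaf density peaks. A minimal sketch of that peak-finding step on a hypothetical profile (GEDI’s actual plant-area-density product involves far more processing):

```python
import numpy as np

# Hypothetical vertical leaf-density profile, sampled every 5 m.
# Values are made up to illustrate the peak-finding step only.
heights_m = np.arange(0, 45, 5)  # 0, 5, 10, ..., 40 m
leaf_density = np.array([0.10, 0.22, 0.35, 0.48, 0.41, 0.30, 0.18, 0.09, 0.03])

peak_height = heights_m[np.argmax(leaf_density)]
print(f"Leaf density peaks at {peak_height} m, not at the canopy top")  # 15 m here
```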
“It was really surprising to see the dominance of this structure type because it differs from what we had learned in the classic textbooks on the topic,” Doughty said. “These findings will not only help us understand how the millions of species that live in a rainforest canopy might acclimate to changing temperatures, but also how much carbon these forests hold and how good they are at fighting climate change.”
Prof. Chilai Chen’s team developed China’s first deep-sea mass spectrometer and successfully tested it in the deep sea
The study is led by Prof. Chilai Chen (Hefei Institutes of Physical Science, Chinese Academy of Sciences).
The detection of dissolved gases in the deep sea is of great significance for exploring the origin and early evolution of life, understanding the interactions between the Earth's spheres, studying the geological profile of the Earth, searching for underwater oil, gas, and mineral resources, and researching global climate change. Changes in the concentrations of dissolved oxygen and nitrogen can directly or indirectly reflect the activity patterns of organisms such as plankton, bacteria, and viruses in the deep sea. In situ detection of these changes is valuable for studying biodiversity, especially in cold seep and hydrothermal areas, which hold special significance for the study of the origin of life. Typical greenhouse gases such as methane (CH4) and carbon dioxide (CO2) are important drivers of global warming. In the deep sea, they usually diffuse into the surrounding area in the form of bubbles or fluids. In situ detection of these gases is important for studying biogeochemical cycles, marine environments, and global climate change. Additionally, methane (CH4) is an important indicator in the search for natural gas resources on the seabed.
The traditional method for detecting dissolved gases in the deep sea is offline detection, which combines pressurized sampling with laboratory analysis. This method offers high detection accuracy and can detect multiple substances simultaneously. However, its limited temporal and spatial resolution makes it difficult to continuously analyze the distribution of substances in time and space. An effective way to address this issue is online detection technology. Currently, the main underwater online detection technologies for dissolved gases include electrochemical sensors based on gas-sensitive materials, various spectrometers based on optical measurement methods, and underwater mass spectrometers based on mass analyzers. Among these, electrochemical sensors have the advantages of small size, low power consumption, low cost, and the potential for large-scale deployment, making them highly promising for deep-sea detection. However, their widespread use still requires shorter response times, a wider concentration detection range, and better anti-interference capability. Spectrometers based on optical measurement methods are compact and fast, and have developed rapidly in recent years. They have been successfully applied to the online detection of dissolved methane, carbon dioxide, and their isotopes in the ocean, providing a technical means for detecting dissolved gases in the deep sea. However, this technology still needs to expand the range of detectable substances, shorten detection times, and extend continuous working time.
Mass spectrometry is an analytical technique that works under vacuum conditions and separates ions by their mass-to-charge ratios. Thanks to its fast response, low detection limits, high specificity, simultaneous detection of multiple substances, strong anti-interference ability, and ability to provide a wealth of elemental, structural, and isotopic information about chemical substances, it has been widely used in fields such as food safety, biomedicine, environmental protection, and ecological health. Applying mass spectrometry to the in-situ online detection of dissolved gases in the deep sea is an important advance for marine chemistry research.
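As a generic illustration of separation by mass-to-charge ratio, the textbook time-of-flight relation below shows lighter ions arriving before heavier ones. This is an assumed example for intuition only, not the analyzer design of the instrument reported in this study:

```python
import math

# Generic time-of-flight illustration of mass-to-charge separation:
# after acceleration through potential V, flight time over a drift
# length L scales with sqrt(m/z). Not the instrument described above.
E_CHARGE = 1.602176634e-19   # elementary charge, C
AMU = 1.66053906660e-27      # atomic mass unit, kg

def flight_time_us(mass_amu: float, charge: int, volts: float, length_m: float) -> float:
    """Flight time in microseconds for an ion of given mass and charge."""
    mass_kg = mass_amu * AMU
    speed = math.sqrt(2 * charge * E_CHARGE * volts / mass_kg)
    return length_m / speed * 1e6

# CH4+ (16 amu) vs CO2+ (44 amu) over a 1 m drift tube at 1 kV:
print(round(flight_time_us(16, 1, 1000.0, 1.0), 1))  # ~9.1 us, lighter ion arrives first
print(round(flight_time_us(44, 1, 1000.0, 1.0), 1))  # ~15.1 us
```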
Since its inception, deep-sea mass spectrometry has undergone more than 30 years of development. The world's first deep-sea mass spectrometer, a gas chromatography-mass spectrometer, was developed by the Hamburg University of Technology in Germany in 1998, primarily for detecting chemical pollutants at sea. However, due to its complex detection process, long detection time, large volume, and difficulty in meeting the requirements of underwater in-situ analysis, the development of this technology has been relatively limited.
In 1999 and 2001, the research team at the University of South Florida unveiled, in succession, a first-generation deep-sea mass spectrometer based on a quadrupole mass analyzer and a second-generation underwater mass spectrometer (UMS) based on an ion trap mass analyzer. In the following five years, the team carried out instrument optimization and shallow-water online detection work. They conducted experiments in Bayboro Harbor, plotted three-dimensional concentration distribution maps of toluene and dimethyl sulfide, and obtained time-concentration curves of dissolved gases such as benzene, carbon dioxide, and argon in Lake Maggiore. In 2007, the team moved to RSI Company to continue deep-sea mass spectrometry research, successively obtaining depth-concentration curves of carbon dioxide, methane, ethanol, and propane in deep-sea cold seeps, hydrothermal areas, and other regions. They also conducted research on marine oil spills and total dissolved inorganic carbon in the seabed.
The MIT underwater mass spectrometry research team publicly released their UMS based on a cycloidal mass analyzer in 2002. This device has a power consumption of 20 W, a weight of 25 kg, a detection time of less than 5 s, a mass detection range of 1–200 amu, and a maximum operating depth of 25 m. In principle, this UMS uses a double-focusing method based on electric and magnetic fields and a vacuum system based on an ion pump, which greatly reduces the volume and weight of the equipment. Over the next five years, the team carried out underwater experiments and online detection work in shallow-water areas. They obtained methane concentration distribution maps in the 0–25 m depth range in Lake Superior and dissolved oxygen, nitrogen, and argon concentration distribution maps in the 0–5 m depth range in Boston Harbor. In 2007, the Woods Hole Oceanographic Institution developed a new underwater mass spectrometer based on the work of the MIT team. This device weighs 13 kg, operates at depths of over 5000 m, covers a mass range of 1–200 amu with a resolution of better than 0.1 amu, and responds in less than 5 s. With this instrument, they obtained concentration distribution maps of dissolved oxygen (O2), methane (CH4), and carbon dioxide (CO2) in the 0–350 m depth range, and subsequently conducted related research on deep-sea oil exploration, pipeline leaks, and atmospheric circulation.
Apart from the two main research branches mentioned above, other institutions such as the University of Hawaii, Harvard University, and the Alfred Wegener Institute for Polar and Marine Research in Germany have also conducted research on underwater mass spectrometry. In 2005, engineers at the University of Hawaii's School of Ocean and Earth Science and Technology publicly released their first underwater mass spectrometer, and subsequently carried out more systematic instrument optimization and deep-sea in-situ detection work. The Alfred Wegener Institute released its first underwater mass spectrometer in 2008 and conducted deep-sea in-situ detection work. Harvard University released its first underwater mass spectrometer in 2010 and subsequently conducted research on the origin and early evolution of life, interactions between the Earth's spheres, and global environmental change.
The work of the above research teams has made important contributions to characterizing deep-sea cold seeps and hydrothermal regions, exploring the origin and early evolution of life, studying the geological profile of the Earth, searching for seabed resources, evaluating oil, gas, and mineral resource reserves, analyzing the harm of pollution to the marine ecosystem, and studying global environmental change. Overall, research on UMS is still at an exploratory stage, and many challenges remain. Current work focuses on improving detection accuracy and reducing volume and power consumption to achieve long-term, stable, and accurate detection in the deep sea.
UMS is still in its early stages in China, and no public reports were previously available. Building on previous research on conventional mass spectrometry and atmospheric-pressure ion mobility spectrometry, this study developed an underwater mass spectrometer. Breakthroughs were made in high-pressure sample injection, in maintaining vacuum under high-pressure, low-temperature conditions, and in long-term unattended autonomous operation, realizing a miniaturized, high-precision underwater mass spectrometer capable of online detection of small molecules and volatile organic compounds in seawater.
See the article:
Development and application of an underwater mass spectrometer for in situ detection of deep-sea dissolved gases
The building sector is a significant contributor to global energy consumption, accounting for approximately 33% of the world's final energy usage. Recently, data mining technologies have shown powerful capabilities for revealing energy waste and providing energy-saving tips to building owners. These technologies can save approximately 15%–30% of the energy consumed in buildings. However, their practical application has been limited by their labor-intensive nature, resulting in a scarcity of real-world use cases.
In a study published in the KeAi journal Energy and Built Environment, a collaborative team of researchers from China and the Netherlands has successfully developed a solution based on GPT-4. This innovative solution automates the analysis of building operational data, thereby providing comprehensive support for building energy management.
"The study's first author, Chaobo Zhang, a postdoctoral researcher in smart buildings at the Department of the Built Environment, Eindhoven University of Technology, highlights the necessity for tailored data mining solutions in building energy management due to the highly diverse nature of building energy systems.
"While GPT-4 stands as one of the most advanced large language models currently available, demonstrating remarkable human-level performance in various real-world scenarios such as coding, writing, and image generation, its ability to analyze building operational data using data mining tools at a comparable human-level performance remains uncertain. Exploring the potential of leveraging GPT-4 to replace humans in data mining-based building energy management tasks holds significant value and warrants further investigation.” Zhang explains.
The team successfully showcased GPT-4's capability to generate code that forecasts building energy loads, even when provided with limited user information. Furthermore, GPT-4 exhibits the ability to identify device faults and detect abnormal patterns in system operations by analyzing building operational data. When applied in real-world buildings, the code generated by GPT-4 demonstrates a high level of accuracy in energy load prediction.
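The release does not reproduce the generated code itself; as a hedged illustration, here is a sketch of the kind of load-forecasting code such a prompt might yield, fitting next-hour load from the previous 24 hours on synthetic data (all numbers are hypothetical, not from the study):

```python
import numpy as np

# Sketch of data-driven building load forecasting: predict the next hour's
# load from the previous 24 hourly loads via ordinary least squares.
# The data below are synthetic; a real workflow would use metered loads.
rng = np.random.default_rng(0)
hours = np.arange(24 * 60)  # 60 days of hourly data
load_kw = 100 + 40 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)

LAGS = 24
X = np.stack([load_kw[i:i + LAGS] for i in range(load_kw.size - LAGS)])
y = load_kw[LAGS:]

split = 24 * 50  # train on the first 50 days, test on the last 10
coef, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)
mae = np.mean(np.abs(X[split:] @ coef - y[split:]))
print(f"Held-out mean absolute error: {mae:.2f} kW")
```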
“Additionally, GPT-4 offers reliable and precise explanations for fault diagnosis and anomaly detection outcomes. By automating coding and data analysis tasks, GPT-4 effectively liberates humans from tedious work, resulting in a more accessible and cost-effective approach to data-guided building energy management,” adds Zhang.
This study represents a breakthrough in the domain of building energy management. “Until now, automated data mining solutions for building energy management have been rare. Our study indicates that GPT-4 is a promising way to enable computers to implement customized data mining solutions for building energy management with limited assistance from humans,” says Yang Zhao, a professor at Zhejiang University and senior author of the study. “We hope more scientists will explore the potential of GPT-4 in this domain, so that building energy management becomes smarter and more efficient in the future.”
###
Contact the author: Yang Zhao, Institute of Refrigeration and Cryogenics, Zhejiang University, Hangzhou, China, youngzhao@zju.edu.cn
The publisher KeAi was established by Elsevier and China Science Publishing & Media Ltd to unfold quality research globally. In 2013, our focus shifted to open access publishing. We now proudly publish more than 100 world-class, open access, English language journals, spanning all scientific disciplines. Many of these are titles we publish in partnership with prestigious societies and academic institutions, such as the National Natural Science Foundation of China (NSFC).
Scent dogs may represent a cheaper, faster and more effective way to detect COVID-19, and could be a key tool in future pandemics, a new review of recent research suggests. The review, published in De Gruyter’s Journal of Osteopathic Medicine, found that scent dogs are as effective, or even more effective, than conventional COVID-19 tests such as RT-PCR.
Dogs possess up to 300 million olfactory cells, compared to just 5 or 6 million in humans, and use one-third of their brains to process scent information, compared with just 5% for humans. Dogs trained to recognize specific volatile organic compounds created in the body during disease have successfully identified patients with certain cancers, Parkinson’s and diabetes.
Prof. Tommy Dickey of the University of California, Santa Barbara and Heather Junqueira of BioScent Detection Dogs reviewed 29 studies where dogs were used to detect COVID-19. The studies were performed using over 31,000 samples by over 400 scientists from more than 30 countries using 19 different dog breeds. In some studies, the scent dogs sniffed people directly, sometimes in public places as a health screening. In others, the dogs sniffed patient samples such as sweat, saliva or urine samples.
In the majority of studies, the scent dogs demonstrated similar or better sensitivity and specificity than the current gold-standard RT-PCR tests or antigen tests. In one study, four of the dogs could detect the equivalent of less than 2.6 × 10⁻¹² copies of viral RNA per milliliter. This is equivalent to detecting one drop of any odorous substance dissolved in ten and a half Olympic-sized swimming pools and is three orders of magnitude better than modern scientific instruments.
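Two quick checks make these numbers tangible: sensitivity and specificity are simple ratios of correct calls, and the swimming-pool analogy follows from dividing one drop by the volume of 10.5 pools. A sketch with hypothetical counts (the drop and pool volumes are rough assumptions):

```python
# Sensitivity and specificity from hypothetical screening counts
# (illustrative only, not figures from the reviewed studies).
tp, fn = 95, 5     # infected samples: flagged correctly vs. missed
tn, fp = 190, 10   # uninfected samples: cleared correctly vs. falsely flagged
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")

# Sanity check of the dilution analogy: one drop in 10.5 Olympic pools.
drop_ml = 0.05                # a drop is roughly 0.05 mL (assumption)
pool_ml = 2_500_000 * 1000    # an Olympic pool holds ~2.5 million litres
print(f"dilution ~ {drop_ml / (10.5 * pool_ml):.1e}")  # ~1.9e-12, matching the order quoted above
```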
The dogs could detect COVID-19 in symptomatic, pre-symptomatic and asymptomatic patients, along with new COVID variants and even long COVID. A major benefit of using the dogs was their speed – they could provide a result in seconds to minutes, and did not require expensive lab equipment or create mountains of plastic waste, unlike conventional diagnostic approaches.
“Although many people have heard about the exceptional abilities of dogs to help humans, their value to the medical field has been considered fascinating, but not ready for real-world medical use,” said Prof. Dickey. “Having conducted this review, we believe that scent dogs deserve their place as a serious diagnostic methodology that could be particularly useful during pandemics, potentially as part of rapid health screenings in public spaces. We are confident that scent dogs will be useful in detecting a wide variety of diseases in the future.”
Prof. Dickey and Heather Junqueira added that they feel that the impressive international COVID scent dog research described in their paper, perhaps for the first time, demonstrates that medical scent dogs are ready for mainstream medical applications.
The Biodiversity Knowledge Hub (BKH) is a one-stop portal that allows users to access FAIR and interlinked biodiversity data and services in a few clicks. BKH was designed to support a new, emerging community of users over time and across the entire biodiversity research cycle, providing its services to anybody, anywhere, at any time.
“The Knowledge Hub is the main product of our BiCIKL consortium, and we are delighted with the result! BKH can easily be seen as the beginning of a major shift in the way we search interlinked biodiversity information,”
says Prof. Lyubomir Penev, BiCIKL’s Project coordinator and Founder of Pensoft Publishers.
“Biodiversity researchers, research infrastructures and publishers interested in fields ranging from taxonomy to ecology and bioinformatics can now freely use BKH as a compass to navigate the oceans of biodiversity data. BKH will do the linkages.”
“We have invested our best energies and resources in the development of BKH and the Fair Data Place (FDP), which is the beating heart of the portal,”
says Christos Arvanitidis, CEO of LifeWatch ERIC. “BKH has been designed to support a new emerging community of users across the entire biodiversity research cycle. Its purpose goes beyond the BiCIKL project itself: we are thrilled to say that BKH is meant to stay, aiming to reshape the way biodiversity knowledge is accessed and used.”
“The BKH outlines how users can navigate and access the linked data, tools and services of the infrastructures cooperating in BiCIKL,”
said Joe Miller, Executive Secretary of GBIF—the Global Biodiversity Information Facility.
“By revealing how they harvest, liberate and reuse data, these increasingly integrated sources enable researchers in the natural sciences to move more seamlessly between specimens and material samples, genomic and metagenomic data, scientific literature, and taxonomic names and units.”
The vast scale of global trade presents a constant threat of introducing new plant diseases, which challenges the United States' system of biosecurity. Plant health professionals often must respond quickly to a newly introduced or emerging plant disease outbreak, even before a well-validated diagnostic test is available. Additionally, thousands of existing plant pathogens have routinely been diagnosed with assays that were not fully or consistently validated, which can lead to inaccurate diagnoses, delays in proper disease management, and significant consequences for growers and the public.
Growing awareness of this gap in coordination and resources for plant disease diagnostic assay development and validation inspired Kitty Cardwell, Director of the Institute for Biosecurity and Microbial Forensics at Oklahoma State University, and colleagues, in collaboration with The American Phytopathological Society (APS), to publish the focus issue “Diagnostic Assay Development and Validation: The Science of Getting It Right” in the journal PhytoFrontiers. This visionary and widely collaborative focus issue contains more than twenty-five open access articles addressing the need to harmonize plant health diagnostics within the U.S. agricultural biosecurity system.
The focus issue contains ten perspective articles and sixteen research articles. Many of the perspective articles are based on discussions among a large group of experts over several years, supported by a USDA National Institute of Food and Agriculture seed grant (NIFA 2020). Other perspective articles discuss the outcomes of VALITEST, a similar project funded by the European Union (EU) on diagnostic assay validation. One article describes the need and vision for a Diagnostic Assay Validation Network (DAVN), whose goal is to make standard diagnostic method development and validation data and tools accessible and shareable across institutions. Cardwell says that “outcomes of the DAVN will include more validated diagnostic assays, faster assay development time, and better coordination and communications across the continuum of plant health professionals and the industries they serve, as well as networked people, technology, and resources to protect agriculture, the environment, and trade.”
Six research articles discuss developing and validating high-throughput screening (HTS) methods to detect and identify common plant pathogen taxa. Three of these papers demonstrate the development, validation, and use of e-probe diagnostic nucleic acid analysis (EDNA). Ten more research articles highlight the validation of polymerase chain reaction (PCR) and recombinase polymerase amplification (RPA) plant disease diagnostic assays for common plant pathogen taxa.
“Diagnostic Assay Development and Validation: The Science of Getting It Right” is the first focus issue from PhytoFrontiers, a relatively new journal published by APS. Editor-in-Chief Steve Klosterman remarks, “The publication of this focus issue is an amazing accomplishment for PhytoFrontiers, due to the number of articles and their overall quality. Clearly, there is a thirst for this topic.”
This focus issue aims to raise awareness about, and improve the discourse on, the importance of assay validation for diagnostic accuracy. Cardwell and the other focus issue guest editors (Carrie Harmon, Poonam Sharma, and James Stack) state that this focus issue should enhance the quality of diagnostic assays and increase the confidence in their use for the protection of U.S. agricultural, horticultural, and natural landscapes. As the United States strives to maintain important relationships with its global trading partners, a transparent, robust assay validation system with networked resources and experts will assure trade partners and increase confidence in U.S. production systems. The science of “getting it right” presented in this focus issue can facilitate the protection of plant health in the U.S. and abroad.
About PhytoFrontiers™
Established in 2020 by The American Phytopathological Society, PhytoFrontiers is an interdisciplinary open-access journal publishing high-quality research covering basic to applied aspects of plant health. PhytoFrontiers also provides space for plant pathologists to publish negative results or results perceived as having no impact.