It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Tuesday, May 25, 2021
Researchers identify a gene that causes canine hereditary deafness in puppies
Finnish researchers have been the first to determine the cause of nonsyndromic, early-onset hereditary hearing loss in Rottweilers. The defect was identified in a gene relevant to the sense of hearing. The study may also advance the understanding of the mechanisms of hearing loss in humans.
Hearing loss is the most common sensory impairment and a complex problem in humans, with varying causes, severity and age of onset. Deafness and hearing loss are also fairly common in dogs, but the gene variants underlying the hereditary forms of the disorder are so far poorly known.
Researchers from the University of Helsinki and the Folkhalsan Research Center focused on a rare type of hearing loss observed in Rottweilers. It begins early in puppyhood and progresses to deafness at the age of a few months. A similar type of hearing loss was also seen in a small number of mixed-breed dogs, the majority of which had Rottweiler ancestry.
"We identified the variant in the LOXHD1 gene, which plays a key role in the function of the cilia of the cochlear sensory cells. While the exact mechanism of deafness is not known, variants of the same gene cause hereditary hearing loss in humans and mice as well," says Docent Marjo Hytonen from the University of Helsinki and the Folkhalsan Research Center.
Hearing impairment caused by the LOXHD1 gene defect is a recessively inherited trait, which means that to develop the disorder, the dog must have two copies of the defective gene, one from the sire and one from the dam.
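The inheritance pattern described above follows standard Mendelian genetics, which can be illustrated with a short sketch (the genotype labels below are generic illustrations, not notation from the study):

```python
from itertools import product

def offspring_risk(sire, dam):
    """Probability that a puppy inherits two defective copies ('a'),
    given parent genotypes as two-letter strings: 'AA' clear,
    'Aa' carrier, 'aa' affected."""
    combos = list(product(sire, dam))
    affected = sum(1 for pair in combos if pair == ('a', 'a'))
    return affected / len(combos)

# Carrier x carrier: on average 1 in 4 puppies is affected.
print(offspring_risk('Aa', 'Aa'))  # 0.25
# Carrier x clear: no affected puppies (though half are carriers).
print(offspring_risk('Aa', 'AA'))  # 0.0
```

This is why testing breeding dogs for the variant is effective: avoiding carrier-to-carrier matings guarantees no affected puppies.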
"Through our collaboration partner, we had the chance to investigate the prevalence and breed specificity of the gene variant in a unique global dataset of some 800,000 dogs. No surveys of similar scope have previously been published," says Professor Hannes Lohi from the University of Helsinki and the Folkhalsan Research Center.
The screening identified additional dogs that had inherited the gene defect and were also found to be deaf.
"This enhances the significance of our finding. Thanks to our gene discovery, dogs used for breeding can now be tested for the defect. This makes it possible to avoid combinations that could result in puppies who will lose their hearing."
The recent study is part of a research programme led by Professor Lohi that investigates the genetic background of hereditary diseases. Several ongoing projects aim, among other goals, to determine genetic causes of hearing loss.
According to Marjo Hytonen, the preliminary results are promising.
"We have observed that both previously unknown hereditary congenital hearing loss and adult-onset hearing loss occur in several dog breeds. In addition to dogs, the preliminary findings open new avenues for investigating human hereditary hearing defects."
NIST, collaborators develop new method to better study microscopic plastics in the ocean
Multistep technique uses small invertebrate to detect, count and characterize nanoplastics
NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY (NIST)
If you've been to your local beach, you may have noticed the wind tossing around litter such as an empty potato chip bag or a plastic straw. These plastics often make their way into the ocean, affecting not only marine life and the environment but also threatening food safety and human health.
Eventually, many of these plastics break down into microscopic sizes, making it hard for scientists to quantify and measure them. Researchers call these incredibly small fragments "nanoplastics" and "microplastics" because they are not visible to the naked eye. Now, in a multiorganizational effort led by the National Institute of Standards and Technology (NIST) and the European Commission's Joint Research Centre (JRC), researchers are turning to a lower part of the food chain to solve this problem.
The researchers have developed a novel method that uses a filter-feeding marine species to collect these tiny plastics from ocean water. The team published its findings as a proof-of-principle study in the scientific journal Microplastics and Nanoplastics.
Plastics consist of synthetic materials known as polymers that are usually made from petroleum and other fossil fuels. Each year more than 300 million tons of plastics are produced, and 8 million tons end up in the ocean. The most common kinds of plastics found in marine environments are polyethylene and polypropylene. Low-density polyethylene is commonly used in plastic grocery bags or six-pack rings for soda cans. Polypropylene is commonly used in reusable food containers or bottle caps.
"Sunlight and other chemical and mechanical processes cause these plastic objects to become smaller and smaller," said NIST researcher Vince Hackley. "With time they change their shape and maybe even their chemistry."
While there isn't an official definition for these smaller nanoplastics, researchers generally describe them as being artificial products that the environment breaks down into microscopic pieces. They're typically the size of one millionth of a meter (one micrometer, or a micron) or smaller.
These tiny plastics pose many potential hazards to the environment and food chain. "As plastic materials degrade and become smaller, they are consumed by fish or other marine organisms like mollusks. Through that path they end up in the food system, and then in us. That's the big concern," said Hackley.
For help in measuring nanoplastics, researchers turned to a group of marine species known as tunicates, which process large volumes of water through their bodies to get food and oxygen -- and, unintentionally, nanoplastics. What makes tunicates so useful to this project is that they can ingest nanoplastics without affecting the plastics' shapes or size.
For their study, researchers chose a tunicate species known as C. robusta because "they have a good retention efficiency for micro- and nanoparticles," said European Commission researcher Andrea Valsesia. The researchers obtained live specimens of the species as part of a collaboration with the Institute of Biochemistry and Cell Biology and the Stazione Zoologica Anton Dohrn research institute, both in Naples, Italy.
The tunicates were exposed to different concentrations of polystyrene, a versatile plastic, in the form of nanosize particles. The tunicates were then harvested and put through a chemical digestion process, which separated the nanoplastics from the organisms. However, at this stage some residual organic compounds digested by the tunicates were still mixed in with the nanoplastics, potentially interfering with the purification and analysis of the plastics.
So, researchers used an additional isolation technique called asymmetrical-flow field flow fractionation (AF4) to separate the nanoplastics from the unwanted material. The separated or "fractionated" nanoplastics could then be collected for further analysis. "That is one of the biggest issues in this field: the ability to find these nanoplastics and isolate and separate them from the environment they exist in," said Valsesia.
The nanoplastic samples were then placed on a specially engineered chip, designed so that the nanoplastics formed clusters, making it easier to detect and count them in the sample. Lastly, the researchers used Raman spectroscopy, a noninvasive laser-based technique, to characterize and identify the chemical structure of the nanoplastics.
The special chips provide advantages over previous methods. "Normally, using Raman spectroscopy for identifying nanoplastics is challenging, but with the engineered chips researchers can overcome this limitation, which is an important step for potential standardization of this method," said Valsesia. "The method also enables detection of the nanoplastics in the tunicate with high sensitivity because it concentrates the nanoparticles into specific locations on the chip."
The researchers hope this method can lay the foundation for future work. "Almost everything we're doing is at the frontier. There are no widely adopted methods or measurements," said Hackley. "This study on its own is not the end point. It's a model for how to do things going forward."
Among other possibilities, this approach might pave the way for using tunicates to serve as biological indicators of an ecosystem's health. "Scientists might be able to analyze tunicates in a particular spot to look at nanoplastic pollution in that area," said Jérémie Parot, who worked on this study while at NIST and is now at SINTEF Industry, a research institute in Norway.
The NIST and JRC researchers continue to work together through a collaboration agreement and hope it will provide additional foundations for this field, such as a reference material for nanoplastics. For now, the group's multistep methodology provides a model for other scientists and laboratories to build on. "The most important part of this collaboration was the opportunity to exchange ideas for how we can do things going forward together," said Hackley.
CAPTION
Microscopic images of the tunicate species C. robusta exposed to polystyrene particles, a type of nanoplastic. The left image shows the tunicate exposed to 100-nanometer polystyrene particles. The right image shows the polystyrene particles in the gonads (reproductive glands) of the tunicate.
CREDIT
A. Valsesia et al. via Creative Commons (https://creativecommons.org/licenses/by/4.0), adapted by N. Hanacek/NIST
CHINA INFRASTRUCTURE NEWER THAN AMERICA'S
Railway infrastructure susceptible to greater damages from climate change
Keeping warming to 1.5 °C instead of 3 °C would save Chinese railways $2.06 billion per year
INSTITUTE OF ATMOSPHERIC PHYSICS, CHINESE ACADEMY OF SCIENCES
Just half a degree Celsius less warming would reduce economic losses to Chinese railway infrastructure by approximately $0.63 billion per year, according to a new paper published by a collaborative research team based at Beijing Normal University and the Institute of Atmospheric Physics, Chinese Academy of Sciences, China.
The study, which recently appeared in Transportation Research Part D, found that the rainfall-induced disaster risk to railway infrastructure increased along with the number of extreme rainfall days over the decades 1981-2016. Limiting global warming to the Paris Agreement target of 1.5°C instead of 2.0°C would significantly reduce the disaster susceptibility of Chinese railway infrastructure to extreme precipitation, according to Liu Kai, the first author of the paper.
Liu is an associate professor at the Academy of Disaster Reduction and Emergency Management, Beijing Normal University.
"Flood disasters can inundate railway tracks and cause failures of the subgrade and track structure. Based on our statistics, a total of 975 rainfall-induced railway disasters were reported from 1981 to 2016. Rainfall-induced debris flows made the largest contribution, about 42%, followed by rainfall-induced floods at about 26%, rainfall-induced landslides at about 18%, and rainfall-induced compound hazards at about 14%," Liu said.
The team used a random forest (RF) machine-learning model to calculate the disaster susceptibility and quantify the relationship between susceptibility and precipitation change.
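The study's data and features are not reproduced in this release, but the general approach of scoring susceptibility with a random forest can be sketched roughly as follows (the feature names and synthetic data below are illustrative assumptions, not the paper's actual inputs):

```python
# Hypothetical sketch of random-forest susceptibility scoring.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in features per railway segment:
# [annual extreme-rainfall days, terrain slope, drainage density]
X = rng.random((500, 3))
# Toy label: segments with many extreme-rain days on steep slopes count as high-susceptibility.
y = ((X[:, 0] + X[:, 1]) > 1.1).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Susceptibility can then be read off as the predicted probability of the high-risk class.
susceptibility = model.predict_proba(X)[:, 1]
print(susceptibility.min() >= 0.0 and susceptibility.max() <= 1.0)  # True
```

In this framing, re-running the prediction under projected rainfall inputs is what lets the authors relate susceptibility to precipitation change.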
"We found a remarkable increase in the disaster susceptibility of railway lines along the Yangtze River valley, which is the economic center of China with the largest population density," said Liu. "The disaster susceptibility increased by 30% during the period 1999-2016 relative to 1981-1998."
Liu and her team, in collaboration with Prof. Tianjun Zhou of the Institute of Atmospheric Physics, Chinese Academy of Sciences, combined CMIP5, an archive of comprehensive climate models, with socio-economic projections to investigate future climate changes and their accompanying impacts. The researchers specifically examined extreme precipitation changes under the RCP4.5 and RCP8.5 scenarios [which represent radiative forcing values in the year 2100 of +4.5 and +8.5 W/m2, respectively, relative to pre-industrial values] over three time periods: near term (2020-2039), mid-term (2040-2059), and long term (2080-2099).
The scientists found that 32.0% and 45.0% of the land area will be exposed to an increase of more than 0.5 in annual average extreme rainfall days by 2050 and 2090, respectively, under RCP8.5. The proportion of railway infrastructure with high disaster susceptibility is projected to increase from the baseline-period (1981-1998) level of 1.1% to 4.5% by 2050, and up to 12% by 2090, under RCP8.5.
"We extended the projection to the changes in the proportions of railway lines at high risk for the specific warming levels of 1.5°C, 2°C, and 3°C, and measured the benefits of mitigation by calculating the avoided impact. The avoided impact, in terms of railway exposure to high disaster susceptibility, would be 90% and 391% if warming were limited to 1.5°C, compared with the impact at 2°C and 3°C warming under RCP8.5, respectively," said Prof. Zhou, a co-author of the study. Under RCP8.5, with a global average temperature increase of 1.5°C, the direct damage and repair cost could rise to an annual $1.47 billion. With 2°C warming, the loss grows to $2.10 billion.
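The dollar figures quoted in this release are mutually consistent, as a quick back-of-envelope check shows:

```python
# Figures quoted in the release, in $ billion per year.
loss_1p5 = 1.47         # annual loss at 1.5 °C warming (RCP8.5)
loss_2p0 = 2.10         # annual loss at 2 °C warming
saving_1p5_vs_3 = 2.06  # stated annual saving of 1.5 °C vs 3 °C

# Half a degree less warming (2 °C -> 1.5 °C) avoids:
print(round(loss_2p0 - loss_1p5, 2))        # 0.63, matching the $0.63 billion/year figure
# Implied annual loss at 3 °C warming:
print(round(loss_1p5 + saving_1p5_vs_3, 2))  # 3.53
```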
"This study quantifies the influence of climate change, with its associated rainfall changes, on railway infrastructure in China. China's railway network is still expanding rapidly: its mileage will reach about 200,000 km by 2035, compared to about 140,000 km in 2020. The design of newly planned high-speed railway lines should incorporate climate change effects, and measures to reduce the disaster susceptibility of the world's most densely populated railway network should be planned to limit the adverse impact," Liu said.
###
Water treatment: Removing hormones with sunlight
KIT researchers developed a new method to remove micropollutants using a photocatalytic membrane and visible light
Micropollutants such as steroid hormones contaminate drinking water worldwide and pose a significant threat to human health and the environment even in the smallest quantities. Until now, easily scalable water treatment technologies that remove them efficiently and sustainably have been lacking. Scientists at the Karlsruhe Institute of Technology (KIT) have developed a new chemical process for removing hormones. It takes advantage of the mechanisms of photocatalysis and transforms the pollutants into potentially safe oxidation products. The team reports on this in the scientific journal Applied Catalysis B: Environmental.
Organic pollutants such as pharmaceuticals, pesticides, and hormones - even at nanoscale concentrations - contaminate drinking water in a way that poses significant risks to humans, animals, and the environment. In particular, the steroid hormones estrone, estradiol, progesterone, and testosterone can cause biological damage in humans and wildlife. The European Union has therefore set strict minimum quality standards for safe and clean drinking water, which must also be taken into account in the development of new technologies for water treatment. "The challenge for science is to develop more sensitive methods to target the hormone molecules," says Professor Andrea Iris Schäfer, Head of the Institute for Advanced Membrane Technology (IAMT) at KIT. The main problem is that steroid hormones are very hard to detect in water. "There is one hormone molecule for every quintillion water molecules. This is an extremely low concentration," explains the expert.
Detecting - and Removing - Micropollutants
With conventional water treatment technologies, wastewater treatment plants can neither find nor remove micropollutants. Researchers at the IAMT and the KIT Institute of Microstructure Technology (IMT) are therefore working on new methods to not only detect and measure micropollutants, but also remove them. A new photocatalytic process proves promising. The scientists coated a commercially available large-pore polymer membrane with Pd(II)-porphyrin, a palladium-containing, light-sensitive molecule that can absorb visible radiation. Irradiation with simulated sunlight initiates a chemical process that produces so-called singlet oxygen, a highly reactive oxygen species. The singlet oxygen specifically "attacks" the hormone molecules and converts them into potentially safe oxidation products. "It is crucial that we coat the surface of each pore with the photosensitizer molecule, increasing the surface area of attack," explains Roman Lyubimenko, a scientist at IAMT and IMT.
Significant Reduction of the Estradiol Concentration
The chemical decomposition of steroid hormones and the filtration of other micropollutants can be realized in a single module. With this process, filtering of 60 to 600 liters of water per square meter of membrane is possible in one hour. The scientists were able to reduce the concentration of estradiol, the most biologically active steroid hormone, by 98 percent from 100 to 2 nanograms per liter. "This means that we are already very close to the EU target value of one nanogram per liter," emphasizes Schäfer. The next goal of the research team is to further optimize the photocatalytic process and transfer it to a larger scale. Open issues are to find out how much light intensity and how much porphyrin will be needed and whether the costly palladium from the platinum group of metals can be replaced by other metals. (sur)
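The 98 percent figure follows directly from the before-and-after concentrations, a calculation worth making explicit:

```python
def removal_efficiency(c_in, c_out):
    """Percentage of a micropollutant removed, from inlet and outlet
    concentrations in the same units (here nanograms per liter)."""
    return 100 * (1 - c_out / c_in)

# Estradiol reduced from 100 ng/L to 2 ng/L:
print(removal_efficiency(100, 2))  # 98.0
# Reaching the EU target of 1 ng/L from the same feed would require:
print(removal_efficiency(100, 1))  # 99.0
```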
CAPTION
Insertion of the photocatalytic membrane into the membrane reactor. (Photo: Markus Breig, KIT)
CREDIT
Markus Breig, KIT
Original publication:
Roman Lyubimenko, Oscar Ivan Gutierrez Cardenas, Andrey Turshatov, Bryce Sydney Richards, & Andrea Iris Schäfer. Photodegradation of steroid-hormone micropollutants in a flow-through membrane reactor coated with Pd(II)-porphyrin. Applied Catalysis B: Environmental, 2021. DOI: 10.1016/j.apcatb.2021.120097
Being "The Research University in the Helmholtz Association", KIT creates and imparts knowledge for the society and the environment. It is the objective to make significant contributions to the global challenges in the fields of energy, mobility, and information. For this, about 9,600 employees cooperate in a broad range of disciplines in natural sciences, engineering sciences, economics, and the humanities and social sciences. KIT prepares its 23,300 students for responsible tasks in society, industry, and science by offering research-based study programs. Innovation efforts at KIT build a bridge between important scientific findings and their application for the benefit of society, economic prosperity, and the preservation of our natural basis of life. KIT is one of the German universities of excellence.
New mechanism to control tomato ripening discovered
It opens the door to producing tomatoes of a higher commercial and nutritional quality
An international research group involving the Institute of Molecular and Cellular Biology of Plants (IBMCP), a joint centre of the Universitat Politècnica de València (UPV) and the Spanish National Research Council (CSIC), has discovered that CHLORAD, a genetic mechanism involved in the ageing of plant leaves, also plays a decisive role in the tomato ripening process. Thus, tomatoes with an activated CHLORAD system turn red more quickly and accumulate more lycopene, a compound beneficial to health. The results, which have been published in the latest issue of the journal Nature Plants, will lead to better quality tomatoes.
The ripening of most fleshy fruits gives them attractive colours and smells, which is a trick of the plant to spread its seeds more widely and colonise new territories. In tomatoes, ripening changes their colour from green to orange and red. The green is due to the presence of chlorophyll (the photosynthesis pigment) in the chloroplasts of the immature fruits. When they ripen, the chloroplasts (the organs in charge of photosynthesis) lose that chlorophyll and produce large quantities of other pigments, called carotenoids.
Tomato carotenoids are orange (due to beta-carotene) and red (due to lycopene), which causes the fruit to change colour when ripe. In addition, these carotenoids form aromas that contribute to the characteristic smell of ripe tomatoes. For all this to happen, the chloroplasts need to be transformed into a new type of carotenoid storage compartment, called a chromoplast.
Until recently, it was not known how the tomato plant controls the transformation of chloroplasts into chromoplasts. Now, a research group from the University of Oxford (UK) in collaboration with the Valencian Institute of Molecular and Cell Biology of Plants (IBMCP) has unravelled part of this mystery, in an article published in the journal Nature Plants.
The key to this work comes from Arabidopsis, a plant used as a study model that does not develop chromoplasts naturally, but does transform its chloroplasts during a process, known as leaf senescence, in which the leaves age, lose their chlorophyll and stop photosynthesising. During this process, a molecular mechanism called CHLORAD removes complexes in the outer layer of chloroplasts that import proteins needed for photosynthesis.
Tomatoes that turn red sooner
Researchers have found that the CHLORAD system also works during tomato ripening. When activated, it prevents the import of photosynthetic proteins, but promotes the incorporation of other proteins necessary for the production and storage of carotenoids during the transformation of chloroplasts into chromoplasts. Thus, fruits with an activated CHLORAD system turn red sooner and accumulate more of the health-promoting carotenoid lycopene, while fruits with a deficient CHLORAD system take longer to ripen.
"In addition to better understanding how chloroplasts are transformed into chromoplasts, we now know that this process not only regulates fruit pigmentation, but also affects many other aspects linked to ripening that influence the firmness or the aroma of tomatoes," says Manuel Rodríguez Concepción, a CSIC researcher at the IBMCP who is participating in this study. The challenge now is to understand the connections between these mechanisms in order to produce tomatoes of a higher commercial and nutritional quality without sacrificing their characteristic colour, aroma and flavour.
With a kitchen freezer and plant cellulose, an aerogel for therapeutic use is developed
KTH, ROYAL INSTITUTE OF TECHNOLOGY
VIDEO: Adding a bit of acid to the acetone dissolves the calcium carbonate particles in the aerogel and releases CO2, generating the bubbles that make the material more porous.
A new low-cost and sustainable technique would boost the possibilities for hospitals and clinics to deliver therapeutics with aerogels, a foam-like material now found in such high-tech applications as insulation for spacesuits and breathable plasters.
With the help of an ordinary kitchen freezer, this newest form of aerogel was made from all natural ingredients, including plant cellulose and algae, says Jowan Rostami, a researcher in fibre technology at KTH Royal Institute of Technology in Stockholm.
Rostami says that the aerogel's low density and favorable surface area make it ideal for a wide range of uses, including timed release of medication and wound dressing.
The advance was reported in the scientific journal, Materials Today, by researchers from the Department of Fibre and Polymer Technology at KTH, the Department of Engineering Mechanics at KTH, the Wallenberg Wood Science Centre at KTH, and the Division of Solid Mechanics at Lund University.
The aerogel's density could be pushed to levels as low as 2 kg per cubic meter, which her research team believes is among the lowest recorded densities for similar materials.
"To give you an idea of how light that is--the density of air is 1.23 kg per cubic meter."
To demonstrate that the material can be used for controlled delivery of therapeutics, the researchers attached proteins to the aerogel through a water-based self-assembly process.
"The aerogel is designed for biointeractivity, so it can for example be used to treat wounds or other medical problems," Rostami says.
With an air volume of up to nearly 99.9 percent, aerogels are super-lightweight yet durable (the KTH aerogel is nearly 99 percent air). They've been used in a wide range of products since the mid-20th century, from skin care to paint, and numerous materials for building construction.
Technical advances have enabled aerogels from plant cells--or cellulose nanofibrils--which have generated interest for environmental applications such as water purification and home insulation. The basic process for nanocellulose-based aerogels involves dispersing nanofibrils in water, and then drying out the mixture.
But the steps along the way are energy-intensive and time-consuming, in part because they require freeze drying or critical-point drying with CO2 gas.
"We use a sustainable approach instead," Rostami says. "It's simple yet sophisticated."
The fibrils are mixed in water with alginate--a naturally occurring polymer in seaweed--and then calcium carbonate is added. In the freezer, the water turns to ice and compresses these components together, rendering a frozen hydrogel.
The frozen hydrogel is removed from the freezer and placed in acetone, which removes the water and evaporates quickly. With a bit of acid added, the acetone also dissolves the calcium carbonate particles and releases CO2, generating the bubbles that make the material more porous.
The dissolution of calcium carbonate enables yet another benefit: it releases calcium ions, which crosslink the alginate and cellulose nanofibrils, giving the aerogel wet stability and the ability to recover its shape after being suffused with liquid.
Rostami says this quality further adds to the aerogel's usefulness in more applications, "without using costly, time and energy-consuming processes, toxic chemicals or complicated chemistry."
CAPTION
Nearly as light as air, these all-natural cellulose aerogels can be made sustainably, cheaply and with all natural materials. They're biointeractive too, so they can be used for therapeutics.
CREDIT
Andrew Marais
Digital Twin technology a 'powerful tool' but requires significant investment, say experts
Healthcare and aerospace experts at King's College London, The Alan Turing Institute, the University of Cambridge, and the Oden Institute for Computational Engineering and Sciences at UT Austin in Texas have said advances in digital twin technology make it a powerful tool for facilitating predictive and precision medicine and enhancing decision-making for aerospace systems. Their opinion piece was published today in Nature Computational Science.
When applied to healthcare, the digital twin, a virtual version of a real-life object that can be used to predict how that object will perform, could predict how a patient's disease will develop and how the patient is likely to respond to different therapies.
It is also of huge benefit in aerospace, where, for example, the technology will be needed to monitor and control thousands of drones, ensuring that they are maintained, have efficient and safe flight plans and can automatically adapt to changes in conditions, such as weather, without the need for human interaction.
However, current digital twins are largely the result of bespoke technical solutions that are difficult to scale.
The authors say that these use cases place new demands on the speed, robustness, validation, verification and uncertainty quantification in digital twin creation workflows.
Achieving digital twins at scale will require a drastic reduction in technical barriers to their adoption.
Lead author Professor Steven Niederer from the School of Biomedical Engineering & Imaging Sciences, King's College London said in medicine, the digital twin will allow testing of a large number of therapies on a patient to identify the best option for that individual with their unique disease.
"There is, however, a need to invest in the underlying theory for how to make models, how to run these models at speed and how to combine multiple models together to ensure that they run as expected," he said.
"We also need to further develop the mathematics of how we create digital twins from patient data, how we measure uncertainty in patient data, and how to account for uncertainty in the model in predictions. These are all things which need further investment."
"We are making more measurements in patients in the hospital and from remote monitoring and we need to develop methods for rapidly and robustly combining this patient information into a digital twin to provide a single representation of the patient."
"Another challenge is how do we get better at predicting how the heart will operate under extreme conditions. We often want to predict when the heart will fail, however, we only have information that is obtained from them under normal operating procedures."
Similarly from an aerospace perspective, Dr Karen Willcox, Director, Oden Institute for Computational Engineering and Sciences at The University of Texas at Austin, said in order to be useful, the digital twin must be predictive and quantify uncertainty.
"Digital twins must be able to analyze 'what if' scenarios and issue predictions about the future, in order to guide decision making to manage a physical asset. That means the digital twin cannot be built on data alone, it needs to include both data and predictive models," she said.
Though open challenges remain, the value of digital twins is clear. And they don't have to be perfect to be valuable. "Even with existing limitations, digital twins are providing valuable decision support in many different application areas," said Willcox. "Ultimately we would like to see the technology used in every engineering system. At that point we can start thinking not just about how a digital twin might change the way we operate the system, but also how we design it in the first place."
Mark Girolami, Programme Director for Data-Centric Engineering at The Alan Turing Institute, said: "The promise of data-driven coupling of mathematical models with the physical reality they represent, the so-called digital twin, is going to be transformational in how we interact with and control the physical world. In healthcare, for instance, the increasing power of computers and algorithms is enabling technologies to build a patient-specific digital twin, catering to our diversity as human beings and improving individual health outcomes. However, these promised advances are going to be hard won, requiring further concerted and sustained foundational research and development to fully realise the promise of the digital twin."
###
Scientists create building materials that effectively protect against radiation
Bricks made of the new materials are cheaper and more environmentally friendly than lead-based analogues
Scientists at the Ural Federal University (UrFU, Russia) have created clay bricks that can attenuate ionizing radiation to a level that is safe for the human body. The scientists add industrial waste with radiation-shielding properties to the brick composition. The article describing the technology was published in the journal Applied Radiation and Isotopes.
"Bricks are a relatively cheap and convenient material with which we can quickly erect protective rooms, structures and walls around objects with radiation," says the scientific head of the project, Oleg Tashlykov, associate professor at the Department of Nuclear Power Plants and Renewable Energy Sources at UrFU. "The bricks are doped with heavy metals, wastes from metallurgical enterprises. These substances have pronounced radiation-protective properties. Thus, we solve two problems at once. First, by adding crushed absorbers of ionizing radiation to the matrix, in this case clay, we obtain building materials with the desired protective properties. Second, in this way we find a use for industrial waste."
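The shielding effect described here follows the standard exponential attenuation law for gamma radiation, I/I0 = exp(-mu*x). A minimal sketch, with attenuation coefficients that are purely illustrative placeholders rather than measured values from the study:

```python
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Beer-Lambert attenuation: fraction of gamma flux transmitted
    through a shield with linear attenuation coefficient mu (1/cm)."""
    return math.exp(-mu_per_cm * thickness_cm)

# Hypothetical coefficients for a single gamma energy:
mu_plain_brick = 0.15  # 1/cm, ordinary clay brick (assumed)
mu_doped_brick = 0.45  # 1/cm, heavy-metal-doped brick (assumed)

for label, mu in [("plain", mu_plain_brick), ("doped", mu_doped_brick)]:
    frac = transmitted_fraction(mu, 25)
    print(f"25 cm {label} brick wall transmits {frac:.2e} of incident flux")
```

The exponential dependence is why even a modest increase in the attenuation coefficient, achieved by the heavy-metal additives, cuts transmitted dose dramatically at wall thickness.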
CAPTION
Bricks are a relatively cheap and convenient material with which protective rooms, structures, and walls can be quickly erected around objects with radiation
CREDIT
UrFU / Anastasia Farafontova
The scientists' ultimate goal is to develop a wide range of materials based not only on clay but also on cement mortars, concrete, and artificial polymers, with different chemical compositions and concentrations of absorbing substances. In other words, materials with specified protective properties that meet specific conditions (isotopic composition of radioactive contamination, types of radiation, etc.) at nuclear power plants, in radioactive waste storage facilities, and in medical institutions where diagnostics and treatment are carried out using X-ray equipment and irradiating devices.
"Tungsten is widely known to be the most reliable protection against gamma or X-ray radiation, but it is very expensive," says research coauthor, research engineer of the Department of Nuclear Power Plants and Renewable Energy Sources at UrFU Karem Makhmud. "Lead is cheaper but toxic. And, besides, it is plastic and in an upright position can slide under its own weight, forming holes in the radiation protection system and reducing its stability. Our materials are optimal in terms of radiation protection efficiency and ease of manufacture, strength, durability, cost. The latter factor is important, since today the contribution of biological protection to the cost of nuclear power facilities reaches 20-30%."
The scientists use high-precision computational codes to design the bricks. For their experimental research, they use the reactor plant of the Institute of Reactor Materials of the State Corporation Rosatom (Russia), as well as the production technologies of the Sealing Materials Plant (Russia). The products of this joint work are of great interest to domestic and foreign enterprises in the nuclear industry. There are plans to further study the mechanical and radiation-protective parameters of various natural substances, including those common in Rosatom's partner countries (Turkey, Egypt, Bangladesh), where nuclear power plants are being built with the participation of Russian specialists.
Indigenous peoples and local communities, key to achieving biodiversity goals
An international study led by the ICTA-UAB states that recognizing indigenous peoples' and local communities' rights and agency is critical to addressing the current biodiversity crisis
Policies established by the post-2020 Global Biodiversity Framework of the Convention on Biological Diversity (CBD) could be ineffective if the rights and agency of indigenous peoples and local communities are not recognized and fully incorporated into biodiversity management. This is supported by an international study led by the Institute of Environmental Science and Technology of the Universitat Autònoma de Barcelona (ICTA-UAB) and recently published in the journal Ambio.
The Convention on Biological Diversity is now working to formulate the goals that will frame global biodiversity policy in the years to come. This will be done through an ambitious international plan commonly known as the post-2020 Global Biodiversity Framework. The objective of this framework is to promote a profound social transformation that halts biodiversity loss at the global level. Unfortunately, the framework, as currently written, still has a long way to go towards fully recognizing Indigenous Peoples' rights and agency, the scientists argue.
"Indigenous Peoples' and local communities' understandings of nature align perfectly well with the Convention on Biological Diversity's vision of Living in Harmony with Nature", says ICREA Research Professor at the ICTA-UAB Victoria Reyes-García, leader of the study. "It seems paradoxical that global discussions on the collective future of the planet do not heed the voices of Indigenous Peoples' and local communities, one of the groups of actors that has contributed the most to safeguarding the planet's biodiversity".
The study, signed by 21 scientists from all over the world, presents a set of arguments why foregrounding Indigenous Peoples' and local communities' rights and agency is essential to the success of future biodiversity policy. Based on an in-depth review of literature, the study highlights that Indigenous Peoples and local communities hold critical knowledge for setting realistic, legitimate and effective biodiversity targets.
"The Global Biodiversity Framework should recognize and address the views and perspectives of Indigenous Peoples and local communities", states Dr. Álvaro Fernández-Llamazares, co-author of the study and researcher at the University of Helsinki. "There is crystal-clear evidence that their knowledge systems, practices and values have so much to offer in addressing the current biodiversity crisis".
The authors argue that Indigenous Peoples' and local communities' participation in biodiversity policy contributes to recognizing and upholding human rights, and they call on the Convention on Biological Diversity to fully recognize Indigenous Peoples and local communities not only as stakeholders, but also as holders of rights, agency and knowledge.