Thursday, June 03, 2021

 

CMU Team develops machine learning platform that mines nature for new drugs

CARNEGIE MELLON UNIVERSITY

Research News

Researchers from Carnegie Mellon University's Computational Biology Department in the School of Computer Science have developed a new process that could reinvigorate the search for natural product drugs to treat cancers, viral infections and other ailments.

The machine learning algorithms developed by the Metabolomics and Metagenomics Lab match the signals of a microbe's metabolites with its genomic signals and identify which likely correspond to a natural product. Knowing that, researchers are better equipped to isolate the natural product to begin developing it for a possible drug.

"Natural products are still one of the most successful paths for drug discovery," said Bahar Behsaz, a project scientist in the lab and lead author of a paper about the process. "And we think we're able to take it further with an algorithm like ours. Our computational model is orders of magnitude faster and more sensitive."

In a single study, the team was able to scan the metabolomics and genomic data for about 200 strains of microbes. The algorithm not only identified the hundreds of natural product drugs the researchers expected to find, but it also discovered four novel natural products that appear promising for future drug development. The team's work was published recently in Nature Communications.

The paper, "Integrating Genomics and Metabolomics for Scalable Non-Ribosomal Peptide Discovery," outlines the team's development of NRPminer, an artificial intelligence tool to aid in discovering non-ribosomal peptides (NRPs). NRPs are an important type of natural product and are used to make many antibiotics, anticancer drugs and other clinically used medications. They are, however, difficult to detect and even more difficult to identify as potentially useful.

"What is unique about our approach is that our technology is very sensitive. It can detect molecules with nanograms of abundance," said Hosein Mohimani, an assistant professor and head of the lab. "We can discover things that are hidden under the grass."

Most of the antibiotic and antifungal medications in wide use, along with many antitumor drugs, have come from natural products.

Penicillin is among the most used and well-known drugs derived from natural products. It was discovered in part by luck, as were many of the drugs made from natural products. But replicating that luck in the laboratory, and at scale, is difficult. Trying to uncover natural products is also time and labor intensive, often taking years and millions of dollars. Major pharmaceutical companies have mostly abandoned the search for new natural products over the past few decades.

By applying machine learning algorithms to the study of genomics, however, researchers have created new opportunities to identify and isolate natural products that could be beneficial.

"Our hope is that we can push this forward and discover other natural drug candidates and then develop those into a phase that would be attractive to pharmaceutical companies," Mohimani said. "Bahar Behsaz and I are expanding our discovery methods to different classes of natural products at a scale suitable for commercialization."

The team is already investigating the four new natural products discovered during their study. The products are being analyzed by a team led by Helga Bode, head of the Institute for Molecular Bioscience at Goethe University in Germany, and two have been found to have potential antimalarial properties.

###

This study was conducted in collaboration with researchers from the University of California San Diego; Saint Petersburg University; the Max-Planck Institute; Goethe University; the University of Wisconsin, Madison; and the Jackson Laboratory.

Oldest human traces from the southern Tibetan Plateau in a new light

UNIVERSITY OF INNSBRUCK

Research News

Stone tools have been made by humans and their ancestors for millions of years. For archaeologists these rocky remnants - lithic artefacts and flakes - are of key importance. Because of their high preservation potential, they are among the most common finds in archaeological excavations. Worldwide, however, numerical dating of these lithic artefacts, especially when they occur as surface finds, remains a major challenge. Usually, stone tools cannot be dated directly, but only when they are embedded in sediment layers together with, for example, organic material whose age can be constrained via the radiocarbon technique. If such datable organic remains are missing, or if stone artefacts lack a stratified sedimentary context and instead occur as scattered surface artefacts, numerical dating becomes very difficult or simply impossible.

"The earth's surface is highly dynamic and erosion and redeposition of material, especially over long timescales, is common. A precise age determination of lithic artefacts that occur as surface finds has therefore hardly been possible so far. Many aspects of ancient human behaviour have only been preserved as surface finds and hence cannot be dated precisely with currently available dating methods. By further developing the Optically Stimulated Luminescence (OSL) dating technique, we can now, for the first time, carry out precise and direct age measurements on lithic artefacts. In our current study we used stone artefacts from an archaeological surface site in south-central Tibet", explains Michael Meyer, head of the Luminescence Laboratory at the Department of Geology at the University of Innsbruck and one of the main authors of the study now published in the renowned journal Science Advances.

OSL dating is based on the measurement of light stored in natural minerals and is one of the most important absolute dating tools in archaeology and the earth sciences. "This dating method uses natural light signals that accumulate over time in natural dosimeters, such as quartz and feldspar grains that are important constituents of sediments, as well as rocks and lithic artefacts. These minerals can be imagined as miniaturized clocks. Each grain is a tiny clock that can be 'read out' under controlled laboratory conditions. The light signal allows us to infer the age of the archaeological sediment layer or artefact. The more light, the older the sample," says the geologist. "In this study, we have now taken a new approach and focused not on sand-sized sediment grains, but - for the first time - on the stone artefacts themselves."
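The "more light, the older the sample" rule Meyer describes is conventionally written as a simple ratio (the standard OSL age equation, quoted here for orientation rather than taken from the paper):

```latex
\[
  \text{Age} \;=\; \frac{D_e}{\dot{D}},
\]
```

where \(D_e\) is the equivalent dose recorded by the luminescence signal (in grays) and \(\dot{D}\) is the environmental dose rate (grays per thousand years).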

Quarrying activities more than 5,000 years ago

Due to their extreme environmental and climatic conditions, the dry highlands of Tibet are considered to be among the last regions on earth to be occupied by humans. When exactly the peopling of this remote and rather extreme environment occurred has been the subject of intense scientific debate over the last decade. In 2017, Michael Meyer dated the famous human foot and hand prints of Chusang, in the central part of the Tibetan plateau, to an age between 8,000 and 12,000 years.

In the current study, Meyer and his team analysed archaeological finds from southern Tibet in the Innsbruck OSL Laboratory. The excavation site of Su-re is located immediately north of the Mount Everest-Cho Oyu massif in the so-called Tingri graben, at an elevation of 4,450 metres. Surface artefacts are particularly common in Tibet. To date them, the researchers used the so-called "Rock Surface Burial Dating" technique and applied it to lithic surface artefacts. This method determines the point in time when the stone artefact was discarded by humans and at least partly covered by earth.

"With our luminescence method, we can look inside the stone and create a continuous age-depth profile. The inside of a rock has never been exposed to sunlight, so we have a saturated luminescence signal there and an infinitely high age. However, if the rock surface is exposed to daylight for a long enough time, the signal in the top millimeters or centimeters of the rock will be erased. This happens during knapping, when the stone tool is produced, and also during the subsequent use of the artefact by humans. When the artefact is then discarded and at least partially buried in sediment and shielded from light, the luminescence signal in this artefact surface recharges. By measuring this depth-dependent luminescence signal in the rock surfaces, we can calculate the age of the artefact discard, taking into account the dynamics of local earth surface processes. Such an approach allows us to date stone artefacts directly, even if they occur as surface finds," Meyer explains.

The analyses of the surface artefacts from southern Tibet revealed an age between 5,200 and 5,500 years. "We assume that the artefact finds at Su-re are related to quarrying activities at this site." Very old sites have been discovered in the central part of the plateau; for the southern sector of the Tibetan Plateau, however, Su-re is currently the oldest securely dated site.
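The age-depth logic Meyer describes can be sketched with a toy forward model. This is only an illustration of the principle under invented parameter values, not the study's actual measurement or fitting procedure:

```python
# Toy forward model behind rock-surface burial dating (an illustration of the
# idea, not the study's analysis code; every number here is an assumption).
import math

MU = 0.8                 # light attenuation with depth into the rock (1/mm), assumed
SURFACE_EXPOSURE = 50.0  # dimensionless bleaching "dose" at the surface, assumed large
D0 = 500.0               # characteristic saturation dose of the OSL signal (Gy), assumed
DOSE_RATE = 5.0          # environmental dose rate while buried (Gy per thousand years), assumed

def trapped_fraction(depth_mm, burial_ka):
    """Fraction of the luminescence signal present at a given depth after the
    artefact surface was bleached by daylight (knapping/use) and then buried."""
    # Daylight resets the signal near the surface; light decays with depth.
    after_bleach = math.exp(-SURFACE_EXPOSURE * math.exp(-MU * depth_mm))
    # While buried and shielded from light, the signal regrows towards saturation.
    return 1.0 - (1.0 - after_bleach) * math.exp(-DOSE_RATE * burial_ka / D0)

# Simulated age-depth profile for an artefact discarded ~5.3 thousand years ago.
profile = [(d / 2.0, trapped_fraction(d / 2.0, burial_ka=5.3)) for d in range(21)]

# At the very surface the old signal was fully erased, so whatever is measured
# there is pure regrowth and directly encodes the burial age:
n_surface = profile[0][1]
burial_age_ka = -D0 / DOSE_RATE * math.log(1.0 - n_surface)
print(f"surface signal {n_surface:.3f} of saturation -> burial age ~ {burial_age_ka:.1f} ka")
print(f"signal at 10 mm depth: {profile[-1][1]:.3f} (still close to saturation)")
```

In practice the luminescence-versus-depth profile is measured slice by slice from the artefact and fitted, with the local dose rate determined independently.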

For Michael Meyer, the analysis of these Tibetan artefacts is just the beginning: "This OSL-based method opens up new vistas in archaeological dating and holds great potential also for sites on other continents that preserve lithic artefacts in a favorable setting," concludes the geologist.

###

Publication: L.A. Gliganic, M.C. Meyer, J.-H. May, M.S. Aldenderfer, P. Tropper: Direct dating of lithic surface artifacts using luminescence. Sci. Adv. 7, eabb3424 (2021). DOI: 10.1126/sciadv.abb3424

Links: OSL Laboratory at the Department of Geology, University of Innsbruck, Austria: https://quaternary.uibk.ac.at/Research/Current-Research/Luminescence-geochronology.asp

Jack Tseng loves bone-crunching animals -- hyenas are his favorite -- so when paleontologist Joseph Peterson discovered fossilized dinosaur bones that had teeth marks from a juvenile Tyrannosaurus rex, Tseng decided to try to replicate the bite marks and measure how hard those kids could actually chomp down.

Last year, he and Peterson made a metal replica of a scimitar-shaped tooth of a 13-year-old juvie T. rex, mounted it on a mechanical testing frame commonly used in engineering and materials science, and tried to crack a cow legbone with it.

Based on 17 successful attempts to match the depth and shape of the bite marks on the fossils -- he had to toss out some trials because the fresh bone slid around too much -- he determined that a juvenile could have exerted up to 5,641 newtons of force, somewhere between the jaw forces exerted by a hyena and a crocodile.

Compare that to the bite force of an adult T. rex -- about 35,000 newtons -- or to the puny biting power of humans: 300 newtons.

Previous bite force estimates for juvenile T. rexes -- based on reconstruction of the jaw muscles or from mathematically scaling down the bite force of adult T. rexes -- were considerably less, about 4,000 newtons.
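Put side by side, the figures quoted above look like this (a throwaway comparison of the article's own numbers, nothing more):

```python
# The bite-force figures quoted above, side by side (all in newtons).
forces_n = {
    "human": 300,
    "juvenile T. rex, earlier model-based estimates": 4_000,
    "juvenile T. rex, this study (upper value)": 5_641,
    "adult T. rex": 35_000,
}
for label, force in sorted(forces_n.items(), key=lambda item: item[1]):
    print(f"{label:<48s}{force:>7,} N  (~{force / forces_n['human']:.0f}x human)")
```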

Why does it matter? Bite force measurements can help paleontologists understand the ecosystem in which dinosaurs -- or any extinct animal -- lived, which predators were powerful enough to eat which prey, and what other predators they competed with.

"If you are up to almost 6,000 newtons of bite force, that places them in a slightly different weight class," said Tseng, UC Berkeley assistant professor of integrative biology. "By really refining our estimates of juvenile bite force, we can more succinctly place them in a part of the food web and think about how they may have played the role of a different kind of predator from their larger, adult parents."

The study reveals that juvenile T. rexes, while not yet able to crush bones like their 30- or 40-year-old parents, were developing their biting techniques and strengthening their jaw muscles to be able to do so once their adult teeth came in.

"This actually gives us a little bit of a metric to help us gauge how quickly the bite force is changing from juvenile to adulthood, and something to compare with how the body is changing during that same period of time," said Peterson, a professor at the University of Wisconsin in Oshkosh and a paleopathologist -- a specialist on the injuries and deformities visible in fossil skeletons. "Are they already crushing bone? No, but they are puncturing it. It allows us to get a better idea of how they are feeding, what they are eating. It is just adding more to that full picture of how animals like tyrannosaurs lived and grew and the roles that they played in that ecosystem."

Tseng, Peterson and graduate student Shannon Brink of East Carolina University in Greenville, North Carolina, will publish their findings this week in the journal PeerJ.


CAPTION

An artist's depiction of a young Tyrannosaurus rex, about 13 years old, chewing on the tail of an Edmontosaurus, a plant-eating, duckbill dinosaur of the late Cretaceous Period. The teeth punctures left in the bone, which the youngster probably scavenged, allowed scientists to estimate the bite force that juvenile tyrannosaurs could exert.

CREDIT

Sketch by Brian Engh, http://dontmesswithdinosaurs.com/

Teeth marks galore, but who was the biter?

Experiments using metal casts of dinosaur teeth to match observed bite marks are rare, not because bite marks on dinosaur fossils are rare, but because the identity of the biter is seldom clear.

Two dinosaur fossils that Peterson excavated years earlier from the Hell Creek Formation of eastern Montana, however, proved ideal for such an experiment. One, the skull of a juvenile T. rex, had a healed bite mark on its face. "What, other than another T. rex, would be able to chomp another T. rex and puncture its skull?" he reasoned. Tyrannosaurs, like crocodiles today, played rough, and the wound was likely from a fight over food or territory.

In addition, the puncture holes in the skull, which had healed, were the size and shape of juvenile T. rex teeth, and the spacing fit a juvenile's tooth gap. Juvenile T. rexes have teeth that are oval in cross section: more knife-like, presumably to cut and tear flesh. Adult T. rexes have teeth with round cross sections: more like posts, to crush bone. Both juveniles and adults could replace lost or broken teeth from spares buried in the jaw that emerged once the socket was empty.

Because skull bone is harder than other bone, Peterson said, matching these holes with punctures made by the metal tooth in a cow bone provided an upper limit to the bite force.

The other fossil was a tail vertebra from a plant-eating, duckbilled dinosaur, an Edmontosaurus. It had two puncture marks from teeth that matched those of a juvenile T. rex. Peterson said that T. rex was the only predator around at that time -- the late Cretaceous Period, more than 66 million years ago -- that could have bitten that hard on the tailbone of a duckbill. The juvenile likely punctured the bone when chomping down on a meaty part of the tail of the already dead animal.

Because vertebrae are softer, experimentally creating similar punctures in a cow bone gave the researchers a lower limit on bite force.
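The bracketing logic can be captured in a few lines; the trial forces below are hypothetical placeholders, not the study's measurements:

```python
# Sketch of the bounding logic described above: skull punctures were replicated
# in the denser mid-shaft of the cow bone (skull bone is harder, so those trials
# cap the force from above), while vertebra punctures were replicated near the
# softer joint ends (setting the floor). Forces are hypothetical placeholders.

def bite_force_bracket(skull_match_forces_n, vertebra_match_forces_n):
    """Bracket the juvenile's bite force between the soft-bone and hard-bone
    replication trials (all forces in newtons)."""
    lower = min(vertebra_match_forces_n)  # it took at least this much to puncture
    upper = max(skull_match_forces_n)     # and no more than this was required
    return lower, upper

low, high = bite_force_bracket(
    skull_match_forces_n=[5200.0, 5641.0, 5500.0],     # hypothetical trial forces
    vertebra_match_forces_n=[2400.0, 2650.0, 2800.0],  # hypothetical trial forces
)
print(f"estimated juvenile bite force: {low:.0f}-{high:.0f} N")
```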

Tseng employed a testing technique that was used in 2010 by researchers who measured the bite force of a much older and smaller dinosaur from the early Cretaceous: a Deinonychus, made famous under a different name -- Velociraptor -- in the 1993 movie Jurassic Park. Its bite force was between 4,000 and 8,000 newtons.

Tseng, then at the University at Buffalo in New York, and Peterson made a replica of a juvenile T. rex tooth from the middle of the jaw using a dental-grade cobalt chromium alloy, which is much harder than dinosaur tooth enamel, Tseng said.


CAPTION

Jack Tseng of UC Berkeley measuring punctures produced in a cow bone by a metal cast of a tyrannosaur tooth.

CREDIT

UC Berkeley photo by Juan Liu

They then mounted the metal tooth in a mechanical testing frame and pushed it slowly, at a millimeter per second, into a fresh-frozen and thawed humerus of a cow. Bones are easier to fracture at low speed than with a rapid chomp. Because the middle of the humerus has a thicker cortex than the bone near the joint ends, the middle was used to replicate the facial punctures. The ends were used to simulate the vertebra punctures.

"What we did, an actualistic study, is to say, 'Let's actually stab the thing with a tooth and see what it does,'" Peterson said. "What we are finding is that our estimates are slightly different than other models, but they are within a close enough range -- we are on the same page."

Tseng emphasized that there is no one number describing the bite force of any animal: it depends on how the creature bites and adjusts the prey in its mouth for the best leverage.

"They probably were not just chomping down. If you look at modern predators, even reptilian predators, sometimes there is adjustment. Maybe they are finding the most mechanically advantageous place, or the strongest tooth to make their bite," said Tseng, who is a 2004 graduate of UC Berkeley's Department of Integrative Biology and an assistant curator in the University of California Museum of Paleontology. "Presumably, there is some tuning involved before they make that bite, so they can literally take the best bite forward to make that kill or to damage whatever they are trying to get into."

Nevertheless, the measurements are a start in charting the increase in tyrannosaurs' bite force as they mature, similar to how paleontologists have charted T. rex size and weight with age.

"Just as you can do a growth curve for such an organism, you can also do a strength curve for their bite force -- what was their bite force at 12 or 13 years old, what was it at 30, 35 or 40 years old. And what does that potentially mean about the role that those animals played in that ecosystem at the time?" Peterson said. "What's cool about finding bite marks in bone from a juvenile tyrannosaur is that it is tells us that at 13 years old, they weren't capable of crushing bone yet, but they were already trying, they were puncturing bone, pretty deep. They are probably building up their strength as they get older."

Tseng, whose primary interest is mammals, is eager to resume studies interrupted by the pandemic to measure the bite force of various living and extinct animals in order to infer the ecosystem niches of predators no longer alive. For those creatures, fossils are all that paleontologists have, in order to "interpret behavior and breathe some life into these extinct animals," said Peterson.

"I use a biomechanical lens when I look at everything, living or extinct," Tseng added. "Ecologists today studying food webs and ecosystems don't rely much on bones; they have physical animals and plants. It is really the paleontologists who are interested in this approach, because the majority of what we have to study are bones and bite marks."

###

TECHNOMAGE MAGICK

Entangled quantum memories for a quantum repeater: A step closer to the Quantum Internet

ICFO-THE INSTITUTE OF PHOTONIC SCIENCES

Research News

IMAGE

IMAGE: SCHEMATIC ILLUSTRATION OF THE EXPERIMENTAL SETUP AND THE LOCATION OF THE LABS IN THE ICFO BUILDING.

CREDIT: ©ICFO

* ICFO researchers report in Nature on having achieved, for the first time, entanglement of two multimode quantum memories located in different labs separated by 10 meters, and heralded by a photon at the telecommunication wavelength.

* The scientists implemented a technique that allowed them to reach a record in the entanglement rate in a system that could be integrated into the fibre communication network, paving the way to operation over long distances.

* The results are considered a landmark for quantum communications and a major step forward in the development of quantum repeaters for the future quantum internet.

During the 1990s, engineers made major advances in telecommunications, extending the network far beyond cities and metropolitan areas. To achieve this scalability, they used repeaters, which boost attenuated signals and allow them to travel much longer distances while preserving properties such as intensity and fidelity. Now, with the addition of satellites, it is completely normal to stand on a mountain in Europe and talk with your loved ones on the other side of the world.

On the road towards building the future quantum internet, quantum memories play a similar role. Together with sources of qubits, they are the building blocks of this novel internet, acting as the core of quantum repeaters and using superposition and entanglement as the key ingredients of the system. But to operate such a system at the quantum level, entanglement between quantum memories has to be created over long distances and maintained as efficiently as possible.

All together in one

In a recently published study in Nature, ICFO scientists Dario Lago, Samuele Grandi, Alessandro Seri and Jelena Rakonjac, led by ICREA Prof. at ICFO Hugues de Riedmatten, have achieved scalable, telecom-heralded matter-matter entanglement between two remote, multimode, solid-state quantum memories. In simpler words, they were able to store, for up to 25 microseconds, a single photon in two quantum memories placed 10 meters apart.

The researchers knew that the photon was stored in one of the two memories, but not in which one: a counter-intuitive feature of nature that allows the photon to be in a quantum superposition state across the two quantum memories at the same time, even though they are, amazingly, 10 meters apart. The team also knew that the entanglement was heralded by the detection of a photon at telecom wavelength, and that it was stored in the quantum memories in a multiplexed fashion, "a feature akin to allowing several messages to be sent at the same time in a classical channel". These two key features have been achieved together for the first time and are the stepping stone for extending this scheme to much longer distances.
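Written in the textbook form for this kind of heralded, single-excitation entanglement (a standard expression, not quoted from the paper), the state shared by the two memories A and B is:

```latex
\[
  |\Psi\rangle_{AB} \;=\; \frac{1}{\sqrt{2}}\Big( |1\rangle_A\,|0\rangle_B \;+\; e^{i\phi}\, |0\rangle_A\,|1\rangle_B \Big),
\]
```

where \(|1\rangle\) denotes one stored excitation, \(|0\rangle\) none, and \(\phi\) is a phase set by the optical paths.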

As Dario Lago, a PhD student at ICFO and first author of the study, points out, "So far, several of the milestones achieved in this experiment had already been reached by other groups, like entangling quantum memories or storing photons in quantum memories with very high efficiency and at high rates. But the unique thing about this experiment is that our techniques achieved very high rates and can be extended to longer distances."


CAPTION

Close up image of a rare-earth doped crystal used as a quantum memory

CREDIT

©ICFO

Setting up the experiment

Achieving this landmark took considerable effort and time. Over the course of several months, the team set up the experiment, in which rare-earth-doped crystals served as the quantum memories at the heart of their test.

They then took two sources generating correlated pairs of single photons. In each pair, one photon, called the idler, is at a wavelength of 1436 nm (a telecom wavelength), and the other, called the signal, is at a wavelength of 606 nm. The signal photons were sent to a quantum memory, made up of millions of atoms randomly placed inside a crystal, and stored there via a protocol called the atomic frequency comb. Meanwhile, the idler photons, also called heralding or messenger photons, were sent through an optical fiber to a device called a beam splitter, where the information about their origin and path was completely erased. Samuele Grandi, postdoctoral researcher and co-first author of the study, comments, "We erased any sort of feature that would tell us where the idler photons were coming from, be it source 1 or 2, and we did this because we did not want to know anything about the signal photon and which quantum memory it was being stored in". By erasing these features, the signal photon could have been stored in either of the quantum memories, which means that entanglement was created between them.

Every time the scientists saw on the monitor the click of an idler photon arriving at the detector, they could confirm and verify that there was, in fact, entanglement. This entanglement consisted of a signal photon in a superposition state between the two quantum memories, where it was stored as an excitation shared by tens of millions of atoms for up to 25 microseconds.

As Grandi and Lago put it, "The curious thing about the experiment is that it is not possible to know if the photon was stored in the quantum memory in Lab 1 or in Lab 2, which was more than 10 meters away. Although this was the principal feature of our experiment, and one that we kind of expected, the results in the lab were still counter-intuitive, and even more peculiar and mind-blowing to us is that we were capable of controlling it!"

The importance of heralded photons

Most of the previous studies that have experimented with entanglement and quantum memories used heralding photons to know whether or not the entanglement between quantum memories had been successful. A heralding photon is like a messenger dove: upon its arrival, the scientists know that the entanglement between the quantum memories has been established. When this happens, the entanglement attempts stop and the entanglement is stored in the memories before being analyzed.

In this experiment, the scientists used a heralding photon in the telecom frequency, confirming that the entanglement being produced could be established with a photon that is compatible with existing telecom networks, an important feat since it allows entanglement to be created over long distances and, even more so, enables these quantum technologies to be easily integrated into the existing classical network infrastructures.

Multiplexing is key

Multiplexing is the capability of a system to send several messages at the same time through a single transmission channel. In classical telecommunications, it is a routine tool for transmitting data over the internet. In quantum repeaters, the technique is slightly more complex. With standard quantum memories, one has to wait for the message heralding the entanglement to come back to the memories before one can try again to create entanglement. But with the atomic frequency comb protocol, which allows this multiplexing approach, the researchers were able to store the entangled photons at many different times in the quantum memory, without having to wait for a successful heralding event before generating the next entangled pair. This capability, called "temporal multiplexing", is a key feature that lets the system make far better use of its operational time, leading to an increase in the final entanglement rate.
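A back-of-the-envelope sketch (with invented numbers; none of these values come from the paper) shows why storing many temporal modes pays off when every attempt must wait for its herald to travel back:

```python
# Why temporal multiplexing raises the entanglement rate in a heralded scheme.
# All parameter values below are illustrative assumptions.
C_FIBER = 2.0e8   # approximate speed of light in optical fibre (m/s)

def entanglement_rate(distance_m, p_herald, n_modes):
    """Successful heralding events per second when up to n_modes attempts can be
    stored per heralding round trip (temporal multiplexing)."""
    round_trip = 2 * distance_m / C_FIBER          # wait time for the herald
    p_success = 1 - (1 - p_herald) ** n_modes      # at least one mode heralds
    return p_success / round_trip

dist = 25_000.0   # 25 km link, assumed
p = 1e-4          # heralding probability per attempt, assumed

for n in (1, 10, 100):
    print(f"{n:>3d} temporal modes -> ~{entanglement_rate(dist, p, n):.2f} heralded events/s")
```

For small success probabilities the gain is roughly proportional to the number of stored temporal modes.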

Future Steps

As ICREA Prof. at ICFO Hugues de Riedmatten remarks, "This idea was conceived more than 10 years ago and I am thrilled to see that it has now succeeded in the lab. The next steps are to bring the experiment outside of the lab, to try to link different nodes together and distribute entanglement over much larger distances, beyond what we currently have. In fact, we are in the midst of achieving the first quantum link of 35 km, which will run between Barcelona and ICFO, in Castelldefels".

It is clear that the quantum network will bring many applications in the near future. This landmark result confirms that we are on the right path towards developing these disruptive technologies and beginning to deploy them in what will be a new way of communicating: the Quantum Internet.


CAPTION

The authors of the work in their lab at ICFO. From left to right: Samuele Grandi, Dario Lago, Jelena Rakonjac, Alessandro Seri and Hugues de Riedmatten.

CREDIT

©ICFO


Reference: Telecom-heralded entanglement between multimode solid-state quantum memories, Dario Lago-Rivera, Samuele Grandi, Jelena V. Rakonjac, Alessandro Seri, and Hugues de Riedmatten, Nature, 2021, https://www.nature.com/articles/s41586-021-03481-8

Links of Interest

Link to the Video Abstract - Youtube video with English, Spanish, Catalan subtitles: https://youtu.be/yEuWyta9O6Y

Link to Audio-visual Material - Images, Photos, Infographs https://drive.google.com/drive/folders/1YUQFrxPZzIUFNmvvUoiMOPQ9NT4i_c31?usp=sharing

Link to the research group led by ICREA Prof. Hugues de Riedmatten https://www.icfo.eu/lang/research/groups/groups-details?group_id=32

Funding Entities

This study has received partial funding from the Quantum Flagship research project Quantum Internet Alliance (QIA), by the Gordon and Betty Moore foundation, as well as the Fundació Cellex, Fundació Mir-Puig, Generalitat de Catalunya, and the Spanish government, among other entities.

About ICFO

ICFO is a CERCA research centre member of the Barcelona Institute of Science and Technology, founded in 2002 by the Government of Catalonia and the Universitat Politècnica de Catalunya · Barcelona Tech, both of which are members of ICFO's board of trustees along with the Cellex and Mir-Puig Foundations, philanthropic entities that have played a critical role in the advancement of the institute. Located in the Mediterranean Technology Park in the metropolitan area of Barcelona, the institute currently hosts 450 people, organized in 26 research teams that use 80 state-of-the-art research laboratories. Research lines encompass diverse areas in which photonics plays a decisive role, with an emphasis on basic and applied themes relevant to medicine and biology, advanced imaging techniques, information technologies, a range of environmental sensors, tunable and ultra-fast lasers, quantum science and technologies, photovoltaics and the properties and applications of nano and quantum materials such as graphene, among others. In addition to three consecutive accreditations of the Severo Ochoa national program for top research excellence, ICFOnians have been awarded 15 elite ICREA Professorships as well as 40 European Research Council grants. ICFO is very proactive in fostering entrepreneurial activities, spin-off creation, and creating collaborations and links between industry and ICFO researchers. To date, ICFO has helped create 10 start-up companies.

Atmospheric metal layers appear with surprising regularity

UNIVERSITY OF COLORADO AT BOULDER

Research News

IMAGE

IMAGE: THE WORD "LIDAR" WAS CREATED USING A FLASHLIGHT.

CREDIT: ZHIBIN YU/CIRES, CU BOULDER, HARBIN INSTITUTE OF TECHNOLOGY.

Twice a day, at dusk and just before dawn, a faint layer of sodium and other metals begins sinking down through the atmosphere about 90 miles above the city of Boulder, Colorado. The movement was captured by one of the world's most sensitive "lidar" instruments and reported today in the AGU journal Geophysical Research Letters.

The metals in those layers come originally from rocky material blasting into Earth's atmosphere from space, and the regularly appearing layers promise to help researchers better understand how Earth's atmosphere interacts with space, and potentially even how those interactions help support life.

"This is an important discovery because we have never seen these dusk/dawn features before, and because these metal layers affect many things. The metals can fall into the ocean and act as fertilizer for ecosystems, the ionized metals can affect GPS radio signals," said Xinzhao Chu, CIRES Fellow, CU Boulder professor of Aerospace Engineering Sciences, and lead author of the new assessment.

It is the first time that the metal layers--which are not harmful to people--have been seen so regularly at these extreme heights in the atmosphere. Such high-altitude metal layers were discovered by Chu's group just 10 years ago above McMurdo, Antarctica, but there they occur more sporadically. Above Boulder, they're consistent, daily, and synched with winds that occur high in the atmosphere.

"Consistent daily patterns seen in our Boulder observations tell us that there are unknown processes at play, a golden opportunity for atmospheric scientists," said Jackson Jandreau who worked alongside Chu and Yingfei Chen in this study. Chen and Jandreau are both PhD students in Chu's group.

The discovery also gives researchers a window into a crucial part of the atmosphere that is challenging to observe. It's a complicated region where interactions between the sun, earth and our planet's magnetic field can end up creating the environmental conditions in which surface life can thrive, protected from the harsh space environment.

"There are metals in the atmospheres of other planetary bodies, such as Mars, and researchers look for Earth-like features on exoplanets as indicators for hospitable environments," Chu said. "Can these metal layers be one of these features?"

###

Her team used a powerful atmospheric lidar to detect and measure very small quantities of particles in the high atmosphere. Lidar is similar to radar, but lidar uses photons from a laser, instead of radio waves, to investigate the composition and structure of distant objects. Chu's group developed the highly sensitive instrument with funding from the National Science Foundation.
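For orientation, the range-finding principle behind lidar reduces to a one-line calculation; the timing figure below is an illustrative assumption, not a parameter of the Boulder instrument:

```python
# Minimal sketch of how a pulsed lidar turns photon arrival times into
# altitudes: half the round-trip light travel time gives the range.
C = 299_792_458.0  # speed of light (m/s)

def altitude_km(return_time_s):
    """Altitude of the scattering layer for a photon detected return_time_s
    after the laser pulse is fired (vertically pointing lidar assumed)."""
    return C * return_time_s / 2.0 / 1000.0

# A photon scattered by metals near the ~90-mile heights mentioned above
# (roughly 150 km) comes back about a millisecond after the pulse leaves.
print(f"{altitude_km(1.0e-3):.0f} km")  # ~150 km
```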

21ST CENTURY ALCHEMY

RUDN University chemists created cheap catalysts for ethanol conversion

IMAGE

IMAGE: RUDN UNIVERSITY CHEMISTS PROPOSED A NEW WAY TO SYNTHESIZE CATALYSTS FOR THE CONVERSION OF ETHYL ALCOHOL. THE OBTAINED MATERIALS ARE PROMISING CATALYSTS FOR THE SELECTIVE CONVERSION OF ETHANOL, WHICH IS...

CREDIT: RUDN UNIVERSITY

RUDN University chemists proposed a new way to synthesize catalysts for the conversion of ethyl alcohol. The obtained materials are promising catalysts for the selective conversion of ethanol, which is an important stage in the development of an alternative technology for obtaining valuable chemical synthesis products based on plant raw materials. The results of the study are published in Catalysis Today.

Ethanol fuel is ethyl alcohol produced from plant material by fermenting industrial or agricultural waste biomass. It is used as a more environmentally friendly fuel than gasoline, but that is not its only use: ethanol can be converted into acetaldehyde, diethyl ether and other chemicals that are in demand in industry. Highly efficient catalysts are required to drive such chemical reactions, but existing catalysts contain precious metals and are therefore too expensive for wide use. RUDN University chemists proposed new catalysts based on aluminium and zirconium oxides, modified with copper.

"The best-known catalysts for ethanol conversion are based on oxides promoted by noble metals. However, they are quite expensive. A more affordable option is catalysts with copper as the active phase, but so far, the best option has not been found among them. Improvements are required to use these catalysts to ensure both high conversion and selectivity of the reaction -- that is, to leave as little ethanol as possible unprocessed and at the same time to obtain the necessary substances, and not by-products", Anna Zhukova, associated professor, PhD, from the Department of Physical and Colloidal Chemistry of RUDN University

RUDN chemists combined two approaches to improve the efficiency of catalysts for acetaldehyde synthesis. First, they combined oxides of several metals -- aluminium, cerium and zirconium -- in nanocomposites. The researchers synthesized five types of powders with different oxide ratios. Five samples were prepared at a relatively low temperature of 180°C, and another five were calcined at 950°C. This made it possible to form different structures in the materials: the calcined samples had a larger pore diameter and pore volume.

The second idea was to add copper. All the powders were soaked in an aqueous solution of copper nitrate, dried at room temperature, and exposed to a flow of hydrogen at 400°C. After that, the finished catalysts were tested in the ethanol vapor dehydrogenation reaction. The chemists placed them in a thin layer on a porous filter and then fed alcohol vapors in a flow of helium. The reaction was carried out at temperatures from 240°C to 360°C.

"All obtained systems demonstrated ? high alcohol conversion and selectivity to acetaldehyde. The copper containing catalysts with 5% aluminium oxide produced significant amounts of acetaldehyde with selectivity above 80 % at 3600C. We found that the mixed composition of the oxides creates conditions for the formation of active centres on the surface of the catalyst from copper ions with different charges. The best option is to use a mixture of oxides with a small content of aluminium in the synthesis of the catalyst and calcinate them at 950°C", Anna Zhukova from RUDN University

STAR TREK TECH

Laser physics: Two-stage particle-beam booster

LUDWIG-MAXIMILIANS-UNIVERSITÄT MÜNCHEN

Research News

In a collaborative international effort, laser physicists at LMU have built the first hybrid plasma accelerator.

Particle accelerators have made crucial contributions to some of the most spectacular scientific discoveries of modern times and have greatly augmented our knowledge of the structure of matter. Now a team of laser physicists led by Prof. Stefan Karsch at the Ludwig-Maximilians-Universität (LMU) in Munich and the Max Planck Institute for Quantum Optics, in cooperation with scientists based at the Helmholtz Centre in Dresden-Rossendorf (HZDR), the Laboratoire d'Optique Appliquée in Paris (LOA), Strathclyde University in Glasgow and the DESY Electron Synchrotron in Hamburg, has achieved a significant breakthrough in accelerator miniaturization. They have built the first compact two-stage plasma-based accelerator, in which an electron beam produced in a laser-driven plasma wave is used, in turn, to accelerate a second beam of electrons.

Particle accelerators have become an indispensable tool for studies of the structure of matter at sub-atomic scales, and they have important applications in biology and medicine. Most of these systems use powerful radio-frequency waves to bring particles up to the desired energy. One drawback of this approach, which has been the standard methodology in the field for decades, lies in the risk of electrical breakdown when very high levels of radio-frequency power are coupled into the accelerator. This risk effectively limits the attainable field strengths and is one of the reasons why such accelerator systems are typically many kilometers long. Physicists have therefore been exploring ways of reducing their size by exploiting the fact that a plasma can sustain much higher acceleration fields. In this case, the electric field generated by a powerful laser or a particle beam is used to strip electrons from the atoms in a gas and to create a wake similar to the one produced by a speedboat on water. Electrons surfing on that wake can be accelerated to nearly the speed of light within a distance of only a few millimeters.
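How strong can such wakefields be? A standard rule of thumb (the cold, non-relativistic wave-breaking limit, a textbook estimate rather than a figure from this study) ties the maximum field to the plasma's electron density \(n_e\):

```latex
\[
  E_0 \;=\; \frac{m_e\, c\, \omega_p}{e} \;\approx\; 96\,\sqrt{n_e\,[\mathrm{cm^{-3}}]}\ \mathrm{V/m},
\]
```

so a plasma with \(n_e \sim 10^{18}\,\mathrm{cm^{-3}}\) supports fields on the order of \(100\ \mathrm{GV/m}\), thousands of times the gradients of conventional radio-frequency machines.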

Studies of plasma-based acceleration with the aid of lasers, i.e. Laser Wakefield Acceleration (LWFA), are now in progress in many research institutions around the world. In contrast, work with accelerators driven by particle beams -- a field known as Plasma Wakefield Acceleration (PWFA) -- has so far been possible only at large-scale accelerator facilities (e.g. CERN, DESY and SLAC), although it offers a number of advantages over LWFA. For example, particle beams do not heat the plasma as much as laser beams, and they allow a longer accelerating distance to be used. This in turn promises to improve the quality of the beam and increase its energy, parameters that are very important for the technique's potential range of applications.

In their experiments, the authors of the new study were able, for the first time, to build and successfully test a practical and compact particle-driven plasma accelerator. The essential breakthrough lies in the fact that the PWFA stage, which accelerates the final electron beam, is driven by a particle beam from an LWFA. The latter is itself highly compact, so the hybrid plasma accelerator is only a few centimeters long. Moreover, simulations indicate that the acceleration fields are more than three orders of magnitude higher than those attainable in conventional accelerators. Another promising result of the study is that the data obtained at LMU are confirmed by complementary tests performed with the DRACO laser at the HZDR.

Dr. Andreas Döpp, a member of the Munich group led by Prof. Stefan Karsch, points out that "only a few years ago, the practical realization of such a combination would have been unthinkable. The hybrid accelerator was made possible by subsequent developments in the design of laser-based accelerators, which have led to tremendous improvements in the stability of the beam and in other vital parameters." Much of this progress has been made at LMU, following the installation in the Centre for Advanced Laser Applications (CALA) of the ATLAS laser, which is one of the most powerful of its kind in Germany.

The successful demonstration of the hybrid plasma accelerator represents the latest advance. "We had already shown that our compact plasma accelerator behaves very similarly to its far larger conventional cousins. So we are confident that we will be able to generate extremely bright electron beams with this set-up in the near future," says Stefan Karsch.

Before the technology can be applied on a wider scale, a number of outstanding challenges must be overcome, but the team is already considering a variety of contexts in which such instruments would be highly advantageous. "For instance, research groups that have not had easy access to a particle accelerator could utilize the technique and develop it further. Secondly, our hybrid accelerator could serve as the basis for what is called a free-electron laser (FEL)," says Dr. Arie Irman, who coordinated the experiments at the HZDR.

FELs are highly prized radiation sources that can be used for extremely precise characterization of nanomaterials, biomolecules and geological samples. Competition for access to these sources, such as the European XFEL in Hamburg, has been correspondingly intense. If such large-scale X-ray lasers could be complemented by the new plasma-based technology in the future, more compact sources could potentially be made available to a broader user base, thereby boosting research with brilliant X-rays as a whole.

###