It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way. (K. Marx, letter to F. Engels on the Indian Mutiny)
Tuesday, December 05, 2023
Unlocking the secrets of the brain’s dopaminergic system
Breakthrough organoid model replicates essential neural network
A new organoid model of the dopaminergic system sheds light on its intricate functionality and potential implications for Parkinson’s disease. The model, developed by the group of Jürgen Knoblich at the Institute of Molecular Biotechnology (IMBA) of the Austrian Academy of Sciences, replicates the dopaminergic system’s structure, connectivity, and functionality. The study, published on December 5 in Nature Methods, also uncovers the enduring effects of chronic cocaine exposure on the dopaminergic circuit, even after withdrawal.
A completed run, the early morning hit of caffeine, the smell of cookies in the oven - these rewarding moments are all due to a hit of the neurotransmitter dopamine, released by neurons in a neural network in our brain, called the “dopaminergic reward pathway”. Apart from mediating the feeling of “reward”, dopaminergic neurons also play a crucial role in fine motor control, which is lost in diseases such as Parkinson’s disease. Despite dopamine’s importance, key features of the system are not yet understood, and no cure for Parkinson’s disease exists. In their new study, the group of Jürgen Knoblich at IMBA developed an organoid model of the dopaminergic system, which not only recapitulates the system’s morphology and nerve projections, but also its functionality.
A model of Parkinson’s disease
Tremor and a loss of fine motor control are characteristic symptoms of Parkinson’s disease. They are caused by the death of neurons that release the neurotransmitter dopamine, so-called dopaminergic neurons. Although this loss of dopaminergic neurons is central to the development of Parkinson’s disease, the mechanisms by which it happens – and how the dopaminergic system might be protected or even repaired – are not yet understood.
Animal models have provided some insight into Parkinson’s disease; however, as rodents do not naturally develop the condition, animal studies have proved unsatisfactory in recapitulating its hallmark features. In addition, the human brain contains many more dopaminergic neurons, which also wire up differently within the human brain, sending projections to the striatum and the cortex. “We sought to develop an in vitro model that recapitulates these human features in so-called brain organoids”, explains Daniel Reumann, previously a PhD student in the lab of Jürgen Knoblich at IMBA and first author of the paper. “Brain organoids are human stem-cell-derived three-dimensional structures, which can be used to understand both human brain development and function”, he adds.
The team first developed organoid models of the so-called ventral midbrain, striatum and cortex – the regions linked by neurons in the dopaminergic system – and then developed a method for fusing these organoids together. As happens in the human brain, the dopaminergic neurons of the midbrain organoid send out projections to the striatum and the cortex organoids. “Somewhat surprisingly, we observed a high level of dopaminergic innervation, as well as synapses forming between dopaminergic neurons and neurons in striatum and cortex”, Reumann recalls.
To assess whether these neurons and synapses are functional, the team collaborated with Cedric Bardy’s group at SAHMRI and Flinders University, Australia, to investigate whether neurons in this system would form functional neural networks. And indeed, when the researchers stimulated the midbrain organoid, which contains the dopaminergic neurons, neurons in the striatum and cortex responded to the stimulation. “We successfully modelled the dopaminergic circuit in vitro, as the cells not only wire correctly, but also function together”, Reumann sums up.
The organoid model of the dopaminergic system could be used to improve cell therapies for Parkinson’s disease. In early clinical studies, researchers have injected precursors of dopaminergic neurons into the striatum to try to make up for the lost natural innervation. However, these studies have had mixed success. In collaboration with the lab of Malin Parmar at Lund University, Sweden, the team demonstrated that dopaminergic progenitor cells injected into the dopaminergic organoid model mature into neurons and extend neuronal projections within the organoid. “Our organoid system could serve as a platform to test conditions for cell therapies, allowing us to observe how precursor cells behave in a three-dimensional human environment”, explains Jürgen Knoblich, the study’s corresponding author. “This allows researchers to study how progenitors can be differentiated more efficiently, and it provides a platform for studying how to recruit dopaminergic axons to target regions, all in a high-throughput manner.”
Insights into the reward system
Dopaminergic neurons also fire whenever we feel rewarded, thus forming the basis of the “reward pathway” in our brains. But what happens when dopaminergic signaling is perturbed, as in addiction? To investigate this question, the researchers made use of a well-known dopamine reuptake inhibitor: cocaine. When the organoids were chronically exposed to cocaine over 80 days, the dopaminergic circuit changed functionally, morphologically and transcriptionally. These changes persisted even when cocaine exposure was stopped 25 days before the end of the experiment, simulating withdrawal. “Almost a month after stopping cocaine exposure, the effects on the dopaminergic circuit were still visible, which means that we can now investigate the long-term effects of dopaminergic overstimulation in a human-specific in vitro system”, Reumann summarizes.
About IMBA
IMBA - Institute of Molecular Biotechnology - is one of the leading biomedical research institutes in Europe. IMBA is located at the Vienna BioCenter, a vibrant cluster of research institutes, universities and biotech companies in Austria. IMBA is an institute of the Austrian Academy of Sciences, the leading national sponsor of non-university academic research. Research topics pursued at IMBA include organoid and developmental biology, neuroscience, RNA biology and chromosome biology. For further information, please visit imba.oeaw.ac.at or follow us on social media.
Research in the lab of Jürgen Knoblich for this publication was funded by the ERC under the European Union’s Horizon 2020 program, the Austrian Federal Ministry of Education, Science, and Research, the Austrian Academy of Sciences, the City of Vienna, the Austrian Science Fund and the Austrian Lotteries.
In vitro modeling of the human dopaminergic system using spatially arranged ventral midbrain–striatum–cortex assembloids.
ARTICLE PUBLICATION DATE
5-Dec-2023
COI STATEMENT
J.A.K. is an inventor on a patent describing cerebral organoid technology (European Patent Application number: EP2743345A1) and is a co-founder and member of the scientific advisory board of a:head bio AG. J.A.K. and D.R. are inventors on a patent application describing brain organoid fusion technology (patent application number EP22177191.8). M.M.S., K.I.R., D.R. and J.A.K. are inventors on a patent describing organoid technology (patent submission ID: GB2206768.0). M.P. is the owner of Parmar Cells, which holds related IP (U.S. patent 15/093,927, 570 PCT/EP17181588), performs paid consultancy for Novo Nordisk AS and serves on the SAB of Arbor Biotechnologies. C.B. is an inventor on a patent about a cell culture medium for neuronal cell culture (BrainPhys®) (international publication number: WO2014/172580A1). The remaining authors declare no competing interests.
IT'S QUANTUM REALITY
Diamonds and rust help unveil ‘impossible’ quasi-particles
Researchers have discovered magnetic monopoles – isolated magnetic charges – in a material closely related to rust, a result that could be used to power greener and faster computing technologies.
Researchers led by the University of Cambridge used a technique known as diamond quantum sensing to observe swirling textures and faint magnetic signals on the surface of hematite, a type of iron oxide.
The researchers observed that magnetic monopoles in hematite emerge through the collective behaviour of many spins (the angular momentum of a particle). These monopoles glide across the swirling textures on the surface of the hematite, like tiny hockey pucks of magnetic charge. This is the first time that naturally occurring emergent monopoles have been observed experimentally.
The research has also shown the direct connection between the previously hidden swirling textures and the magnetic charges of materials like hematite, as if there is a secret code linking them together. The results, which could be useful in enabling next-generation logic and memory applications, are reported in the journal Nature Materials.
According to the equations of James Clerk Maxwell, a giant of Cambridge physics, magnetic objects, whether a fridge magnet or the Earth itself, must always exist as a pair of magnetic poles that cannot be isolated.
“The magnets we use every day have two poles: north and south,” said Professor Mete Atatüre, who led the research. “In the 19th century, it was hypothesised that monopoles could exist. But in one of his foundational equations for the study of electromagnetism, James Clerk Maxwell disagreed.”
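For reference, the equation in question is Gauss’s law for magnetism – a textbook statement rather than a result of this study. It forbids isolated magnetic charge, and a hypothetical monopole density ρ_m would enter it as a source term:

```latex
\nabla \cdot \mathbf{B} = 0
\qquad\longrightarrow\qquad
\nabla \cdot \mathbf{B} = \mu_0 \rho_m \quad \text{(if monopoles existed)}
```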
Atatüre is Head of Cambridge’s Cavendish Laboratory, a position once held by Maxwell himself. “If monopoles did exist, and we were able to isolate them, it would be like finding a missing puzzle piece that was assumed to be lost,” he said.
About 15 years ago, scientists suggested how monopoles could exist in a magnetic material. This theoretical result relied on the extreme separation of north and south poles in an exotic material called spin ice, so that locally each pole appeared isolated.
However, there is an alternative strategy for finding monopoles, involving the concept of emergence: the idea that the combination of many physical entities can give rise to properties that are either more than, or different from, the sum of their parts.
Working with colleagues from the University of Oxford and the National University of Singapore, the Cambridge researchers used emergence to uncover monopoles spread over two-dimensional space, gliding across the swirling textures on the surface of a magnetic material.
The swirling topological textures are found in two main types of materials: ferromagnets and antiferromagnets. Of the two, antiferromagnets are more stable than ferromagnets, but they are more difficult to study, as they don’t have a strong magnetic signature.
To study the behaviour of antiferromagnets, Atatüre and his colleagues use an imaging technique known as diamond quantum magnetometry. This technique uses a single spin – the inherent angular momentum of an electron – in a diamond needle to precisely measure the magnetic field on the surface of a material, without affecting its behaviour.
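In standard NV-magnetometry terms (background knowledge, not a detail taken from the paper), the spin’s resonance frequencies shift linearly with the magnetic field along the sensing axis, so measuring a frequency shift amounts to measuring the field:

```latex
\Delta f \;=\; \frac{g_e \mu_B}{h}\, B_{\parallel} \;\approx\; 28~\mathrm{GHz\,T^{-1}} \times B_{\parallel}
```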
For the current study, the researchers used the technique to look at hematite, an antiferromagnetic iron oxide material. To their surprise, they found hidden patterns of magnetic charges within hematite, including monopoles, dipoles and quadrupoles.
“Monopoles had been predicted theoretically, but this is the first time we’ve actually seen a two-dimensional monopole in a naturally occurring magnet,” said co-author Professor Paolo Radaelli, from the University of Oxford.
“These monopoles are a collective state of many spins that twirl around a singularity rather than a single fixed particle, so they emerge through many-body interactions. The result is a tiny, localised stable particle with diverging magnetic field coming out of it,” said co-first author Dr Hariom Jani, from the University of Oxford.
“We’ve shown how diamond quantum magnetometry could be used to unravel the mysterious behaviour of magnetism in two-dimensional quantum materials, which could open up new fields of study in this area,” said co-first author Dr Anthony Tan, from the Cavendish Laboratory. “The challenge has always been direct imaging of these textures in antiferromagnets due to their weaker magnetic pull, but now we’re able to do so, with a nice combination of diamonds and rust.”
The study not only highlights the potential of diamond quantum magnetometry but also underscores its capacity to uncover and investigate hidden magnetic phenomena in quantum materials. If controlled, these swirling textures dressed in magnetic charges could power super-fast and energy-efficient computer memory logic.
The research was supported in part by the Royal Society, the Sir Henry Royce Institute, the European Union, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).
Revealing Emergent Magnetic Charge in an Antiferromagnet with Diamond Quantum Magnetometry
ARTICLE PUBLICATION DATE
5-Dec-2023
New theory unites Einstein’s gravity with quantum mechanics
A radical theory that consistently unifies gravity and quantum mechanics while preserving Einstein’s classical concept of spacetime is announced today in two papers published simultaneously by UCL physicists
The image depicts an experiment in which heavy particles (illustrated as the moon) cause an interference pattern (a quantum effect) while also bending spacetime. The hanging pendulums depict the measurement of spacetime. The actual experiment is typically performed using carbon-60, one of the largest known molecules. The UCL calculation indicates that the experiment should also be performed using higher-density atoms such as gold. The other two images represent the two experiments proposed by the UCL group, both of which constrain any theory in which spacetime is treated classically: one is the weighing of a mass, the other an interference experiment.
A radical theory that consistently unifies gravity and quantum mechanics while preserving Einstein’s classical concept of spacetime is announced today in two papers published simultaneously by UCL (University College London) physicists.
Modern physics is founded upon two pillars: quantum theory on the one hand, which governs the smallest particles in the universe, and Einstein’s theory of general relativity on the other, which explains gravity through the bending of spacetime. But these two theories are in contradiction with each other and a reconciliation has remained elusive for over a century.
The prevailing assumption has been that Einstein’s theory of gravity must be modified, or “quantised”, in order to fit within quantum theory. This is the approach taken by the two leading candidates for a quantum theory of gravity: string theory and loop quantum gravity.
But a new theory, developed by Professor Jonathan Oppenheim (UCL Physics & Astronomy) and laid out in a new paper in Physical Review X (PRX), challenges that consensus and takes an alternative approach by suggesting that spacetime may be classical – that is, not governed by quantum theory at all.
Instead of modifying spacetime, the theory - dubbed a “postquantum theory of classical gravity” - modifies quantum theory and predicts an intrinsic breakdown in predictability that is mediated by spacetime itself. This results in random and violent fluctuations in spacetime that are larger than envisaged under quantum theory, rendering the apparent weight of objects unpredictable if measured precisely enough.
A second paper, published simultaneously in Nature Communications and led by Professor Oppenheim’s former PhD students, looks at some of the consequences of the theory and proposes an experiment to test it: measuring a mass very precisely to see whether its weight appears to fluctuate over time.
For example, the International Bureau of Weights and Measures in France routinely weighs a 1 kg mass that used to be the 1 kg standard. If the fluctuations in measurements of this 1 kg mass are smaller than required for mathematical consistency, the theory can be ruled out.
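As a toy illustration of that statistical logic (all numbers below are hypothetical placeholders; the analysis in the Nature Communications paper is far more involved), one can ask whether repeated weighings scatter more than instrument noise alone would allow:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters -- purely illustrative, not the paper's values.
instrument_sigma = 2e-9        # balance read-out noise per weighing, in kg
n_weighings = 10_000           # repeated weighings of the same 1 kg mass

# Simulated stand-in for real data: a perfectly constant mass seen
# through instrument noise alone.
measurements = 1.0 + rng.normal(0.0, instrument_sigma, n_weighings)

# A genuine weight fluctuation would appear as variance in excess of
# the known instrument noise.
excess_var = measurements.var(ddof=1) - instrument_sigma**2
bound = 1e-20                  # placeholder consistency bound, kg^2

print(f"excess variance: {excess_var:.2e} kg^2 (hypothetical bound: {bound:.0e})")
print("below the bound: classical spacetime ruled out" if excess_var < bound
      else "above the bound: the theory survives this test")
```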
The outcome of the experiment, or other emerging evidence that would confirm the quantum versus classical nature of spacetime, is the subject of a 5000:1 odds bet between Professor Oppenheim on one side and Professor Carlo Rovelli and Dr Geoff Penington – leading proponents of loop quantum gravity and string theory respectively – on the other.
For the past five years, the UCL research group has been stress-testing the theory, and exploring its consequences.
Professor Oppenheim said: "Quantum theory and Einstein's theory of general relativity are mathematically incompatible with each other, so it's important to understand how this contradiction is resolved. Should spacetime be quantised, or should we modify quantum theory, or is it something else entirely? Now that we have a consistent fundamental theory in which spacetime does not get quantised, it’s anybody’s guess.”
Co-author Zach Weller-Davies, who as a PhD student at UCL helped develop the experimental proposal and made key contributions to the theory itself, said: "This discovery challenges our understanding of the fundamental nature of gravity but also offers avenues to probe its potential quantum nature.
“We have shown that if spacetime doesn’t have a quantum nature, then there must be random fluctuations in the curvature of spacetime which have a particular signature that can be verified experimentally.
“In both quantum gravity and classical gravity, spacetime must be undergoing violent and random fluctuations all around us, but on a scale which we haven’t yet been able to detect. But if spacetime is classical, the fluctuations have to be larger than a certain scale, and this scale can be determined by another experiment where we test how long we can put a heavy atom in superposition* of being in two different locations."
Co-authors Dr Carlo Sparaciari and Dr Barbara Šoda, whose analytical and numerical calculations helped guide the project, expressed hope that these experiments could determine whether the pursuit of a quantum theory of gravity is the right approach.
Dr Šoda (formerly UCL Physics & Astronomy, now at the Perimeter Institute of Theoretical Physics, Canada) said: “Because gravity is made manifest through the bending of space and time, we can think of the question in terms of whether the rate at which time flows has a quantum nature, or classical nature.
“And testing this is almost as simple as testing whether the weight of a mass is constant, or appears to fluctuate in a particular way.”
Dr Sparaciari (UCL Physics & Astronomy) said: “While the experimental concept is simple, the weighing of the object needs to be carried out with extreme precision.
“But what I find exciting is that starting from very general assumptions, we can prove a clear relationship between two measurable quantities – the scale of the spacetime fluctuations, and how long objects like atoms or apples can be put in quantum superposition of two different locations. We can then determine these two quantities experimentally.”
Weller-Davies added: “A delicate interplay must exist if quantum particles such as atoms are able to bend classical spacetime. There must be a fundamental trade-off between the wave nature of atoms, and how large the random fluctuations in spacetime need to be.”
The proposal to test whether spacetime is classical by looking for random fluctuations in mass is complementary to another experimental proposal which aims to verify the quantum nature of spacetime by looking for something called “gravitationally mediated entanglement.”
Professor Sougato Bose (UCL Physics & Astronomy), who was not involved with the announcement today, but was among those to first propose the entanglement experiment, said: “Experiments to test the nature of spacetime will take a large-scale effort, but they're of huge importance from the perspective of understanding the fundamental laws of nature. I believe these experiments are within reach – these things are difficult to predict, but perhaps we'll know the answer within the next 20 years.”
The postquantum theory has implications beyond gravity. The infamous and problematic “measurement postulate” of quantum theory is not needed, since quantum superpositions necessarily localise through their interaction with classical spacetime.
The theory was motivated by Professor Oppenheim’s attempt to resolve the black hole information problem. According to standard quantum theory, an object going into a black hole should be radiated back out in some way as information cannot be destroyed, but this violates general relativity, which says you can never know about objects that cross the black hole’s event horizon. The new theory allows for information to be destroyed, due to a fundamental breakdown in predictability.
* Background information
Quantum mechanics background: All the matter in the universe obeys the laws of quantum theory, but we only really observe quantum behaviour at the scale of atoms and molecules. Quantum theory tells us that particles obey Heisenberg’s uncertainty principle: we can never know both their position and their velocity precisely at the same time. In fact, they don’t even have a definite position or velocity until we measure them. Particles like electrons can behave more like waves and act almost as if they can be in many places at once (more precisely, physicists describe particles as being in a “superposition” of different locations).
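In standard notation (a textbook statement, not specific to these papers), the uncertainty principle bounds the product of the spreads in position x and momentum p:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```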
Quantum theory governs everything from the semiconductors that are ubiquitous in computer chips, to lasers, to superconductivity and radioactive decay. In contrast, we say that a system behaves classically if it has definite underlying properties. A cat appears to behave classically – it is either dead or alive, not both, nor in a superposition of being dead and alive. Why do cats behave classically while small particles behave quantumly? We don’t know, but the postquantum theory doesn’t require the measurement postulate, because the classicality of spacetime infects quantum systems and causes them to localise.
Gravity background: Newton’s theory of gravity gave way to Einstein’s theory of general relativity (GR), which holds that gravity is not a force in the usual sense. Instead, heavy objects such as the Sun bend the fabric of spacetime in such a way that the Earth revolves around them. Spacetime is a mathematical object consisting of the three dimensions of space, plus time considered as a fourth dimension. General relativity predicted the formation of black holes and the Big Bang. It holds that time flows at different rates at different points in space, and the GPS in your smartphone needs to account for this in order to determine your location properly.
Historical context: The framework presented by Oppenheim in PRX, and in a companion paper with Sparaciari, Šoda and Weller-Davies, derives the most general consistent form of dynamics in which a quantum system interacts with a classical system. It then applies this framework to the case of general relativity coupled to quantum field theory. It builds on earlier work by a community of physicists. An experiment to test the quantum nature of gravity via gravitationally mediated entanglement was proposed by Bose et al. and by C. Marletto and V. Vedral. Two examples of consistent classical-quantum dynamics were discovered in the 1990s by Ph. Blanchard and A. Jadczyk and by Lajos Diosi, and again by David Poulin around 2017. From a different perspective, models of Newtonian gravity coupled to quantum systems via a “measurement-and-feedback” approach were presented by D. Kafri, J. Taylor and G. Milburn in 2014, and by Diosi and Antoine Tilloy in 2016. The idea that gravity might be somehow related to the collapse of the wavefunction dates back to F. Karolyhazy (1966), L. Diosi (1987) and R. Penrose (1996). That classical-quantum couplings might explain localisation of the wavefunction has been suggested by others, including M. Hall and M. Reginatto, Diosi and Tilloy, and David Poulin. The idea that spacetime might be classical dates back to I. Sato (1950) and C. Møller (1962), but no consistent theory was found until now.
Precision mass measurement - artistic STEAMPUNK concept
The weighing of a mass – an experiment proposed by the UCL group that constrains any theory in which spacetime is treated classically.
CREDIT
Isaac Young
JOURNAL
Physical Review X
SUBJECT OF RESEARCH
Not applicable
ARTICLE TITLE
A postquantum theory of classical gravity?
ARTICLE PUBLICATION DATE
4-Dec-2023
Goethe University receives its first quantum computer
Device to be used to pioneer quantum computing, under the direction of computer scientist Prof. Thomas Lippert
FRANKFURT. With the upcoming installation of its first quantum computer, Goethe University will join the list of leading German universities in the field of applied quantum computing: Based on the technology of nitrogen vacancies in a synthetic diamond, Frankfurt's first quantum computer, named "Baby Diamond", will start as a pilot system with five qubits. Ulm-based start-up XeedQ is scheduled to deliver the device in the first quarter of 2024, with initial pilot users expected to come from Goethe University Frankfurt and the National High Performance Computing (NHR) Alliance.
Quantum computing is a future technology that is currently on everyone's lips, promising to tackle tasks in the fields of computer simulation and AI that were previously too large or even unsolvable using digital methods. "With our new pilot quantum computer, we are taking an important step into this revolutionary field, which will soon be followed by others," says Goethe University President Prof. Enrico Schleiff. "Baby Diamond will give us a first glimpse into a future in which great computational challenges, the likes of which we cannot even imagine today, will become possible."
Ulrich Schielein, Goethe University Vice President and Chief Information Officer (CIO), adds: "It is likely that, in a few years’ time, we will be able to address completely new types of problems not only from the worlds of finance, logistics in rail, air and road transport, medicine and biology, weather and climate research, but also in the fields of basic sciences, like physics and chemistry, or the training of basic models of artificial intelligence. We are looking forward to working together with researchers, companies and institutions here in the Rhine-Main region."
The quantum computer uses a small synthetic diamond of the kind commonly found in industrial applications, in which nitrogen atoms are embedded. Each nitrogen atom induces a defect that can be used as a central qubit, and the spins of nearby atoms can be controlled as further qubits around this defect, making practical quantum computing possible.
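To give a sense of what a five-qubit register represents computationally, here is a minimal state-vector sketch in plain NumPy (illustrative only – this is not XeedQ's programming interface, and the gates are generic): five qubits span 2^5 = 32 complex amplitudes, and a Hadamard followed by four CNOTs already prepares a fully entangled GHZ state.

```python
import numpy as np

N = 5  # qubits; the state vector has 2**5 = 32 complex amplitudes

def apply_single(gate, target, state):
    """Apply a 1-qubit gate to qubit `target` (0 = leftmost) via Kronecker products."""
    full = np.array([[1.0]])
    for q in range(N):
        full = np.kron(full, gate if q == target else np.eye(2))
    return full @ state

def apply_cnot(control, target, state):
    """CNOT as a basis permutation: flip the `target` bit wherever the `control` bit is 1."""
    new = np.empty_like(state)
    for i in range(len(state)):
        if (i >> (N - 1 - control)) & 1:
            new[i] = state[i ^ (1 << (N - 1 - target))]
        else:
            new[i] = state[i]
    return new

state = np.zeros(2**N, dtype=complex)
state[0] = 1.0                                # start in |00000>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
state = apply_single(H, 0, state)             # superpose qubit 0
for t in range(1, N):                         # entangle the remaining qubits
    state = apply_cnot(0, t, state)

# GHZ state: only |00000> and |11111> carry amplitude 1/sqrt(2)
for idx in np.flatnonzero(np.abs(state) > 1e-12):
    print(f"|{idx:0{N}b}>: amplitude {state[idx]:.3f}")
```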
"Our entry-level system is based on the idea of a compact quantum computer that can be used at room temperature, doesn’t require any special cryogenic cooling, can be set up in a small laboratory and is particularly energy-efficient," says Prof. Thomas Lippert, head of the modular supercomputing and quantum computing working group, established at Goethe University’s Faculty of Computer Science and Mathematics in summer 2020. "As a university, by installing the quantum computer, we are consciously taking a stand against the current monopolization of large companies that hide their systems behind paywalls. It being a compact system, we can already train students today in a hands-on manner directly on the device – exactly what is needed to become fit for the future."
The quantum computer is part of the so-called “Frankfurt Roadmap”, which sets out to procure up to 16 high-quality qubits by 2025 and gradually increase this number in the future. The pilot system, operated in collaboration with the NHR Alliance, will help establish an infrastructure at Goethe University that closely links quantum computing with high-performance computing. In this context, Goethe University was able to secure Forschungszentrum Jülich – a global pioneer in modular hybrid quantum-HPC computing – as a scientific partner, together with its JUNIQ quantum computing infrastructure.
The system is being developed by XeedQ, which is based both in Leipzig and at the German Aerospace Center’s innovation hub in Ulm. XeedQ is funded by the latter’s Quantum Computing Initiative, with a view towards developing a scalable quantum computing technology.
Quantum computing is often referred to as the second quantum revolution. Goethe University's quantum computer will be located on the historic Bockenheim Campus, where Stern and Gerlach's famous experiment, carried out more than 100 years ago, laid the foundation for today’s quantum computing and served as an important part of the first quantum revolution. With its Baby Diamond, Goethe University is paving the way to bring new quantum revolutions back to Frankfurt.
Quantum physics: Superconducting Nanowires Detect Single Protein Ions
Detection efficiency 1,000 times higher than conventional ion detectors due to high sensitivity
View of the SuperMaMa laboratory at the University of Vienna. The hanging gold-plated insert is the radiation shield behind which the superconducting nanowire detectors are installed.
An international research team led by quantum physicist Markus Arndt (University of Vienna) has achieved a breakthrough in the detection of protein ions: Due to their high energy sensitivity, superconducting nanowire detectors achieve almost 100% quantum efficiency and exceed the detection efficiency of conventional ion detectors at low energies by a factor of up to 1,000. In contrast to conventional detectors, they can also distinguish macromolecules by their impact energy. This allows for more sensitive detection of proteins and provides additional information in mass spectrometry. The results of this study were recently published in the journal Science Advances.
The detection, identification, and analysis of macromolecules is of interest in many areas of the life sciences, including protein research, diagnostics, and analytics. Mass spectrometry is often used as a detection system – a method that typically separates charged particles (ions) according to their mass-to-charge ratio and measures the intensity of the signals generated by a detector. This provides information about the relative abundance of the different types of ions and therefore the composition of the sample. However, conventional detectors have only been able to achieve high detection efficiency and spatial resolution for particles with high impact energy – a limitation that has now been overcome by an international team of researchers using superconducting nanowire detectors.
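As background on why impact energy matters (standard ion-optics relations, not specific to this study): an ion of charge state z accelerated through a potential difference U arrives with kinetic energy

```latex
E_{\mathrm{kin}} = zeU, \qquad v = \sqrt{\frac{2\,zeU}{m}}
```

so at a fixed acceleration voltage, heavy singly charged ions arrive slowly – exactly the regime in which conventional secondary-electron detectors lose efficiency.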
Joined forces for low energy particles
In the current study, a European consortium coordinated by the University of Vienna, with partners in Delft (Single Quantum), Lausanne (EPFL), Almere (MSVision) and Basel (University of Basel), demonstrates for the first time the use of superconducting nanowires as excellent detectors for protein beams in so-called quadrupole mass spectrometry. Ions from the sample to be analyzed are fed into a quadrupole mass spectrometer, where they are filtered. "If we now use superconducting nanowires instead of conventional detectors, we can even identify particles that hit the detector with low kinetic energy," explains project leader Markus Arndt from the Quantum Nanophysics Group at the Faculty of Physics at the University of Vienna. This is made possible by a special material property (superconductivity) of the nanowire detectors.
Getting there with superconductivity
The key to this detection method is that the nanowires enter a superconducting state at very low temperatures, in which they lose their electrical resistance and carry current without loss. Excitation of the superconducting nanowires by incoming ions causes a return to the normal conducting state (a quantum transition). The change in the electrical properties of the nanowires during this transition is interpreted as a detection signal. "With the nanowire detectors we use," says first author Marcel Strauß, "we exploit the quantum transition from the superconducting to the normal conducting state and can thus outperform conventional ion detectors by up to three orders of magnitude." Indeed, nanowire detectors have a remarkable quantum yield at exceptionally low impact energies – and redefine the possibilities of conventional detectors: "In addition, a mass spectrometer adapted with such a quantum sensor can not only distinguish molecules according to their mass-to-charge ratio, but also classify them according to their kinetic energy. This improves the detection and offers the possibility of better spatial resolution," says Marcel Strauß. Nanowire detectors could find new applications in mass spectrometry, molecular spectroscopy, molecular deflectometry, or quantum interferometry of molecules, where high efficiency and good resolution are required, especially at low impact energy.
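A cartoon of the counting logic (a minimal sketch with invented parameters; the actual cryogenic readout chain is more sophisticated): each ion impact briefly drives the biased wire into the normal state, producing a voltage pulse, so counting threshold crossings counts ions.

```python
import numpy as np

rng = np.random.default_rng(7)

# All numbers below are invented for illustration.
fs = 1.0e6                              # sampling rate, Hz
t = np.arange(100_000) / fs             # 0.1 s readout trace
trace = rng.normal(0.0, 0.01, t.size)   # baseline voltage noise, V

# Each simulated ion impact drives the wire briefly normal,
# producing a fast voltage pulse with a ~2 microsecond decay.
impact_times = rng.uniform(0.0, t[-1], 40)
for t0 in impact_times:
    dt = np.clip(t - t0, 0.0, None)     # time since impact, clipped at 0
    trace += 0.5 * np.exp(-dt / 2e-6) * (t >= t0)

# Count detection events as rising edges through a threshold.
threshold = 0.25
above = trace > threshold
pulses = np.count_nonzero(above[1:] & ~above[:-1])
print(f"counted {pulses} pulses from 40 simulated ion impacts")
```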
Team & Funding
Single Quantum is leading the research on superconducting nanowire detectors, the experts from EPFL-Lausanne provide the ultracold electronics, MSVISION is a specialist in mass spectrometry, and the experts from the University of Basel are responsible for chemical synthesis and protein functionalization. The University of Vienna brings together all the components with its expertise in quantum optics, molecular beams and superconductivity.
The work was funded by the European Commission as part of the SuperMaMa project (860713), which is dedicated to research into superconducting detectors for mass spectrometry and molecular analysis. Funding from the Gordon & Betty Moore Foundation (10771) contributed to the analysis of the modified proteins.
More about Figure 1: View of the SuperMaMa laboratory at the University of Vienna. In the foreground: the adapted tandem mass spectrometer. On the optical table in front: the ultra-high vacuum chamber with 3.7 Kelvin cryocooler. The hanging gold-plated insert is the radiation shield behind which the superconducting nanowire detectors are installed. When closed, the proteins are focused onto the detector via ring electrodes through the few millimetre hole in the gold-plated shielding. In the background: Pulsed high-power laser for the photocleavage of labeled proteins with visible and ultraviolet light.
Counting single proteins with a superconducting nanowire. The background and nanowire are altered in Photoshop with the Generative Fill AI. (Human Insulin PDB: 3I40)
CREDIT
CC BY-ND 4.0 Quantum Nanophysics, University of Vienna
Neutron stars have fascinated and puzzled scientists since their signature was first detected in 1967. Known for their periodic flashes of light and rapid rotation, neutron stars are among the densest objects in the universe, with a mass comparable to that of the Sun but compressed into a sphere only about 20 kilometers in diameter. These stellar objects exhibit a peculiar behavior known as a “glitch”, in which the star suddenly speeds up its spin. This phenomenon suggests that neutron stars might be partly superfluid. In a superfluid, rotation is characterized by numerous tiny vortices, each carrying a fraction of the angular momentum. A glitch occurs when these vortices escape from the star's inner crust to its solid outer crust, thereby increasing the star's rotational speed.
The key to this study lies in the concept of a “supersolid” – a state that exhibits both crystalline and superfluid properties – which is predicted to be a necessary ingredient of neutron star glitches. Quantized vortices nest within the supersolid until they collectively escape and are consequently absorbed by the outer crust of the star, accelerating its rotation. Recently, the supersolid phase has been realized in experiments with ultracold dipolar atoms, providing a unique opportunity to simulate the conditions within a neutron star.
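For orientation (standard superfluid physics rather than a result of this paper), the circulation around each such vortex is quantized in units of Planck's constant over the particle mass, which is why the rotation can only change through discrete vortex motion:

```latex
\oint \mathbf{v}\cdot d\boldsymbol{\ell} \;=\; n\,\frac{h}{m}, \qquad n \in \mathbb{Z}
```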
The recent study by researchers at the University of Innsbruck and the Austrian Academy of Sciences as well as the Laboratori Nazionali del Gran Sasso and the Gran Sasso Science Institute in Italy demonstrates that glitches can occur in ultracold supersolids, serving as versatile analogues for the inside of neutron stars. This groundbreaking approach allows for a detailed exploration of the glitch mechanism, including its dependence on the quality of the supersolid. “Our research establishes a strong link between quantum mechanics and astrophysics and provides a new perspective on the inner nature of neutron stars”, says first author Elena Poli. Glitches provide valuable insights into the internal structure and dynamics of neutron stars. By studying these events, scientists can learn more about the properties of matter under extreme conditions.
“This research shows a new approach to gain insights into the behavior of neutron stars and opens new avenues for the quantum simulation of stellar objects from low-energy Earth laboratories”, emphasizes Francesca Ferlaino.
The study has been published in Physical Review Letters and was financially supported by the Austrian Science Fund FWF and the European Research Council ERC, among others.
Publication: Glitches in rotating supersolids. Elena Poli, Thomas Bland, Samuel J. M. White, Manfred J. Mark, Francesca Ferlaino, Silvia Trabucco and Massimo Mannarelli. Phys. Rev. Lett. 131, 223401 DOI: 10.1103/PhysRevLett.131.223401 [arXiv: 2306.09698]
How Israel's AI use is resulting in indiscriminate civilian deaths in Gaza
Israel's growing reliance on AI in conflict with Hamas has resulted in a drastic surge in civilian casualties, sparking concerns about ethical implications and the opacity of AI-driven target selection.
Aftermath of an Israeli strike on a house after a temporary truce between Hamas and Israel expired in Rafah, southern Gaza
The staggering number of children and women killed in besieged Gaza has put the spotlight on Israel's increasing use of artificial intelligence (AI), which does not differentiate between combatants and civilians in picking targets.
As Israel resumes its offensive in devastated Gaza after a seven-day ceasefire, the number of casualties has risen sharply – with close to 16,000 people killed since Israel launched what is being described as a "collective punishment" of Gaza residents following the October 7 Hamas attacks.
In its bid to target Hamas hideouts, the IDF's strikes have been far from surgical, and reports suggest that little scrutiny goes into how targets are selected in Gaza. The integration of AI into lethal operations has played a pivotal role in recent Israel-Hamas wars.
In May 2021, officials said Israel had fought its 'first AI war' during an 11-day bombardment of Gaza, using machine learning and advanced computing.
And months ahead of the Hamas attacks in Israel on October 7, the IDF revealed its integration of AI into lethal operations.
According to Bloomberg, as of July 15, the IDF had begun using AI to select targets for air strikes and to organise logistics during wartime.
Israeli officials also disclosed the implementation of an AI system for target selection in aerial bombardments, alongside another model named 'Fire Factory'.
The Fire Factory system utilises data on military-approved targets to calculate munition loads, prioritise and allocate thousands of targets to aircraft and drones, and propose a schedule for subsequent raids.
And in the latest Israel-Hamas war, the use of an AI platform called "The Gospel" is said to have been a notable aspect of Israel's operations in Gaza.
Current and former members of Israel's intelligence community, Palestinian testimonies, data from Gaza, and official statements suggest that authorisation for bombing non-military targets, a relaxation of constraints regarding expected civilian casualties, and the use of AI to generate an unprecedented number of potential targets have all contributed to one of the deadliest military campaigns against Palestinians since 1948.
Hence, in Israel's "Operation Iron Swords," there has been a significant increase in the bombing of non-military targets, including private residences, public buildings, infrastructure, and high-rise blocks, categorised by the army as "power targets".
'Expected casualties'
Several sources, speaking to +972 Magazine and Local Call, confirmed that the Israeli army possesses files on the majority of potential targets in Gaza, including residences.
These files specify the number of civilians likely to be killed in an attack on a particular target. The army's intelligence units calculate and know the expected civilian casualties before executing an attack.
Specifying a case, one source said that the military command knowingly approved an operation to assassinate a single top Hamas military commander – resulting in the deaths of hundreds of Palestinian civilians.
Another source said that the decision-making extends to civilian casualties, emphasising that nothing happens by accident.
The investigation by +972 Magazine suggests that the widespread use of the "Habsora" ("The Gospel") system, built primarily on AI, also contributed to the high number of targets and the extensive harm to civilian life in Gaza.
'The Gospel' and its role in Gaza
Described as a "mass assassination factory," this AI system can generate targets almost automatically at a rate exceeding previous capabilities.
It also enables the army to conduct extensive strikes on residential homes, even those of junior Hamas operatives.
Palestinian testimonies suggest that since October 7, the army has also attacked private residences without known or apparent Hamas members. These strikes, as confirmed by sources, knowingly result in the death of entire families.
A senior intelligence officer reportedly emphasised the goal of "killing as many Hamas operatives as possible" after October 7, relaxing criteria around civilian harm. This has led to shelling based on broad cellular pinpointing – resulting in civilian casualties – to save time rather than investing in more accurate targeting.
The outcome of these policies is a staggering loss of human life in Gaza since October 7. Over 300 families have lost ten or more members in Israeli bombings in the past two months — a figure 15 times higher than in Israel's previous deadliest war on Gaza in 2014.
"All of this is happening contrary to the protocol used by the IDF in the past," one source quoted by +972 Magazine.
"There is a feeling that senior officials in the army are aware of their failure on October 7 and are busy with the question of how to provide the Israeli public with an image [of victory] that will salvage their reputation."
What forms of data are fed into the Gospel is not yet known. Still, it typically involves analysing extensive information from various channels, including drone footage, intercepted communications, surveillance data, and details obtained from monitoring individual and group movements and behaviour patterns.
An official involved in previous Gaza operations' targeting decisions said that the IDF hadn't previously bombed the homes of junior Hamas members. However, they believe this approach has changed in the current conflict, with suspected Hamas operatives' houses targeted irrespective of rank.
"Hamas members who don't really mean anything live in homes across Gaza. So they mark the home and bomb the house and kill everyone there," the official told +972/Local Call.
Type of targets
According to sources speaking to +972 and Local Call, Israeli aircraft have targeted Gaza in roughly four categories. The first includes "tactical targets," such as standard military objectives like armed militant cells, weapon warehouses, rocket launchers, anti-tank missile launchers, launch pits, mortar bombs, military headquarters, and observation posts.
The second category is "underground targets," mainly tunnels dug by Hamas under Gaza's neighbourhoods, including beneath civilian homes. Aerial strikes on these targets can lead to the collapse of homes above or near the tunnels.
The third category, "power targets," involves high-rises, residential towers, and public buildings like universities, banks, and government offices in city centres.
Three intelligence sources suggest that targeting these structures aims to exert "civil pressure" on Hamas by deliberately attacking Palestinian society.
The final category comprises "family homes" or "operatives' homes." The stated purpose is to destroy a private residence in order to assassinate a single resident suspected of being an operative of Hamas or the Islamic Jihad group.
"Hamas is everywhere in Gaza; there is no building that does not have something of Hamas in it, so if you want to find a way to turn a high-rise into a target, you will be able to do so," said one former intelligence official.
"They will never just hit a high-rise that does not have something we can define as a military target," said another intelligence source, who carried out previous strikes against power targets.
"There will always be a floor in the high-rise [associated with Hamas]. But for the most part, when it comes to power targets, it is clear that the target doesn't have a military value that justifies an attack that would bring down the entire empty building in the middle of a city with the help of six planes and bombs weighing several tons."
Destruction from Israeli aerial bombardment is seen in Gaza.
OTHERS
However, Palestinian testimonies in the current war indicate that some families killed did not include any operatives from these organisations.
As of November 10, the IDF Spokesperson reported that Israel had attacked a total of 15,000 targets in Gaza during the first 35 days of the current conflict.
This figure is notably higher than in the four previous major operations in the besieged enclave: Guardian of the Walls in 2021 (1,500 targets in 11 days), Protective Edge in 2014 (between 5,266 and 6,231 targets in 51 days), Pillar of Defense in 2012 (about 1,500 targets in eight days), and Cast Lead in 2008 (3,400 targets in 22 days).
As Israeli commanders receive lists of targets generated by AI tools like the Gospel, the opacity of the method raises concerns: increasing dependence on AI may turn humans into mere components in a mechanised process, jeopardising their ability to assess the impact on civilian lives effectively.