It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Friday, January 17, 2025
Australopithecines at South African cave site were not eating substantial amounts of meat
Summary author: Becky Ham
American Association for the Advancement of Science (AAAS)
Seven Australopithecus specimens uncovered at the Sterkfontein fossil site in South Africa were herbivorous hominins who did not eat substantial amounts of meat, according to a new study by Tina Lüdecke and colleagues. Lüdecke et al. analyzed organic nitrogen and carbonate carbon isotopes extracted from tooth enamel in the fossil specimens to determine the hominin diets. Some researchers have hypothesized that the incorporation of animal-based foods into early hominin diets led to increased brain size, smaller gut size and increased stature – all key events in human evolution. Bones bearing cut and scrape marks, along with some stone tools from the same time period (around 3.7 million years ago), offer hints that australopithecines were eating some meat, but direct evidence for an animal-based diet has been lacking. The researchers analyzed enamel nitrogen isotope measurements from 43 fossil animals, including the australopithecines, and from modern African mammals to characterize these isotopes in known carnivores and herbivores. They found a clear separation in the enamel isotopes between the two groups, with the Australopithecus enamel closely matching that of the herbivore group. It’s possible, the researchers note, that the australopithecines were eating energy-rich foods with low nitrogen isotope ratios, like legumes or possibly termites. But it’s unlikely that they were eating enough meat to drive the changes in brain size and other characteristics that are hallmarks of human evolution, Lüdecke et al. conclude.
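As a rough illustration of the comparison the team describes (and not their actual statistical workflow), the sketch below uses made-up enamel δ15N values to show how specimens can be assigned to the isotopically distinct herbivore or carnivore group; all numbers and specimen labels are hypothetical.

```python
# Illustrative sketch only: hypothetical enamel delta-15N values (per mil),
# not data from Luedecke et al. Real analyses involve diagenesis checks and
# formal statistics; this just shows the basic herbivore/carnivore comparison.
from statistics import mean

reference = {
    "herbivores": [4.1, 5.0, 4.6, 5.3],    # made-up values for known plant-eaters
    "carnivores": [9.8, 10.5, 11.2, 9.4],  # made-up values for known meat-eaters
}

australopith_specimens = {"StW-A": 4.8, "StW-B": 5.5, "StW-C": 4.2}  # hypothetical

group_means = {name: mean(vals) for name, vals in reference.items()}

for specimen, d15n in australopith_specimens.items():
    closest = min(group_means, key=lambda g: abs(group_means[g] - d15n))
    print(f"{specimen}: d15N = {d15n:.1f} per mil -> closer to {closest} "
          f"(herbivore mean {group_means['herbivores']:.1f}, "
          f"carnivore mean {group_means['carnivores']:.1f})")
```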
A segment of Science’s weekly podcast with Tina Lüdecke, related to this research, is available on the Science.org podcast landing page.
Australopithecus at Sterkfontein did not consume substantial mammalian meat
Article Publication Date
17-Jan-2025
Fine-tuned brain-computer interface makes prosthetic limbs feel more real
Two new papers document progress in neuroprosthetic technology that lets people feel the shape and movement of objects moving over the "skin" of a bionic hand
A study participant controls a bionic hand through a brain-computer interface that allows him to feel pressure changes as a steering wheel moves in the hand.
You can probably complete an amazing number of tasks with your hands without looking at them. But if you put on gloves that muffle your sense of touch, many of those simple tasks become frustrating. Take away proprioception — your ability to sense your body’s relative position and movement — and you might even end up breaking an object or injuring yourself.
“Most people don’t realize how often they rely on touch instead of vision — typing, walking, picking up a flimsy cup of water,” said Charles Greenspon, PhD, a neuroscientist at the University of Chicago. “If you can’t feel, you have to constantly watch your hand while doing anything, and you still risk spilling, crushing or dropping objects.”
Greenspon and his research collaborators recently published papers in Nature Biomedical Engineering and Science documenting major progress on a technology designed to address precisely this problem: direct, carefully timed electrical stimulation of the brain that can recreate tactile feedback to give nuanced “feeling” to prosthetic hands.
The science of restoring sensation
These new studies build on years of collaboration among scientists and engineers at UChicago, the University of Pittsburgh, Northwestern University, Case Western Reserve University and Blackrock Neurotech. Together they are designing, building, implementing and refining brain-computer interfaces (BCIs) and robotic prosthetic arms aimed at restoring both motor control and sensation in people who have lost significant limb function.
On the UChicago side, the research was led by neuroscientist Sliman Bensmaia, PhD, until his unexpected passing in 2023.
The researchers’ approach to prosthetic sensation involves placing tiny electrode arrays in the parts of the brain responsible for moving and feeling the hand. On one side, a participant can move a robotic arm by simply thinking about movement, and on the other side, sensors on that robotic limb can trigger pulses of electrical activity called intracortical microstimulation (ICMS) in the part of the brain dedicated to touch.
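In outline, the sensory side of such a system is a closed loop: readings from the prosthesis’s sensors are translated into stimulation commands for electrodes whose evoked sensations project to the touched part of the hand. The sketch below is a deliberately simplified, hypothetical version of that mapping; the sensor range, electrode identities and linear amplitude scaling are assumptions for illustration, not the teams’ actual software.

```python
# Hypothetical sketch of the sensory half of a bionic-hand loop:
# map fingertip pressure from a prosthesis sensor to ICMS pulse amplitude
# on electrodes whose evoked sensations project to that fingertip.
# All names, ranges and the linear mapping are illustrative assumptions.

ELECTRODE_MAP = {          # electrode id -> hand region its stimulation "feels like"
    17: "thumb_tip",
    23: "index_tip",
    41: "index_tip",       # overlapping projected fields are common
}

MAX_PRESSURE_KPA = 50.0                 # assumed sensor range
MIN_AMP_UA, MAX_AMP_UA = 10.0, 80.0     # assumed stimulation amplitude range (microamps)

def amplitude_for_pressure(pressure_kpa: float) -> float:
    """Linearly map sensed pressure to a stimulation amplitude."""
    frac = max(0.0, min(pressure_kpa / MAX_PRESSURE_KPA, 1.0))
    return MIN_AMP_UA + frac * (MAX_AMP_UA - MIN_AMP_UA)

def stimulation_commands(region: str, pressure_kpa: float):
    """Return (electrode, amplitude) pairs for every electrode covering a region."""
    if pressure_kpa <= 0:
        return []
    amp = amplitude_for_pressure(pressure_kpa)
    return [(e, amp) for e, r in ELECTRODE_MAP.items() if r == region]

# Example: the index fingertip sensor reports 20 kPa of contact pressure
print(stimulation_commands("index_tip", 20.0))
# -> [(23, 38.0), (41, 38.0)]
```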
For about a decade, Greenspon explained, this stimulation of the touch center could only provide a simple sense of contact in different places on the hand.
“We could evoke the feeling that you were touching something, but it was mostly just an on/off signal, and often it was pretty weak and difficult to tell where on the hand contact occurred,” he said.
The newly published results mark important milestones in moving past these limitations.
Advancing understanding of artificial touch
In the first study, published in Nature Biomedical Engineering, Greenspon and his colleagues focused on ensuring that electrically evoked touch sensations are stable, accurately localized and strong enough to be useful for everyday tasks.
By delivering short pulses to individual electrodes in participants’ touch centers and having them report where and how strongly they felt each sensation, the researchers created detailed “maps” of brain areas that corresponded to specific parts of the hand. The testing revealed that when two closely spaced electrodes are stimulated together, participants feel a stronger, clearer touch, which can improve their ability to locate and gauge pressure on the correct part of the hand.
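A hypothetical sketch of how such a projected-field map might be used in software to pick closely spaced electrode pairs for co-stimulation follows; the array positions, distance threshold and participant reports are invented for illustration and are not taken from the study.

```python
# Hypothetical sketch: summarize a projected-field "map" and find nearby
# electrode pairs that report the same hand region, as candidates for
# co-stimulation. Geometry and reports are invented for illustration.
from itertools import combinations
from math import dist

# electrode id -> (x, y) position on the array, in units of electrode pitch (assumed)
ARRAY_POS = {17: (0, 0), 23: (1, 0), 41: (1, 1), 52: (4, 3)}

# hand region each electrode's sensation projects to, from participant reports (assumed)
REPORTED_REGION = {17: "thumb_tip", 23: "index_tip", 41: "index_tip", 52: "palm"}

def co_stim_candidates(max_pitch: float = 1.5):
    """Pairs of nearby electrodes whose sensations project to the same region."""
    pairs = []
    for a, b in combinations(ARRAY_POS, 2):
        if (REPORTED_REGION[a] == REPORTED_REGION[b]
                and dist(ARRAY_POS[a], ARRAY_POS[b]) <= max_pitch):
            pairs.append((a, b))
    return pairs

print(co_stim_candidates())   # -> [(23, 41)]
```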
The researchers also conducted exhaustive tests to confirm that the same electrode consistently creates a sensation corresponding to a specific location.
“If I stimulate an electrode on day one and a participant feels it on their thumb, we can test that same electrode on day 100, day 1,000, even many years later, and they still feel it in roughly the same spot,” said Greenspon, who was the lead author on this paper.
From a practical standpoint, any clinical device would need to be stable enough for a patient to rely on it in everyday life. An electrode that continually shifts its “touch location” or produces inconsistent sensations would be frustrating and require frequent recalibration. By contrast, the long-term consistency this study revealed could allow prosthetic users to develop confidence in their motor control and sense of touch, much as they would in their natural limbs.
Adding feelings of movement and shapes
The complementary Science paper went a step further to make artificial touch even more immersive and intuitive. The project was led by first author Giacomo Valle, PhD, a former postdoctoral fellow at UChicago who is now continuing his bionics research at Chalmers University of Technology in Sweden.
“Two electrodes next to each other in the brain don’t create sensations that ‘tile’ the hand in neat little patches with one-to-one correspondence; instead, the sensory locations overlap,” explained Greenspon, who shared senior authorship of this paper with Bensmaia.
The researchers decided to test whether they could use this overlapping nature to create sensations that could let users feel the boundaries of an object or the motion of something sliding along their skin. After identifying pairs or clusters of electrodes whose “touch zones” overlapped, the scientists activated them in carefully orchestrated patterns to generate sensations that progressed across the sensory map.
Participants described feeling a gentle gliding touch passing smoothly over their fingers, despite the stimulus being delivered in small, discrete steps. The scientists attribute this result to the brain’s remarkable ability to stitch together sensory inputs and interpret them as coherent, moving experiences by “filling in” gaps in perception.
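A minimal, hypothetical sketch of that idea is shown below: pulse trains are sequenced across electrodes whose projected fields lie along the desired path, with each train outlasting the onset of the next so that successive sensations overlap in time. The timing values and electrode order are invented, not the calibrated patterns used in the study.

```python
# Hypothetical sketch of sequencing stimulation across overlapping projected
# fields to suggest motion across the skin. Timing and electrode order are
# invented for illustration.

PATH = [23, 41, 38, 29]   # electrodes ordered along the intended path (assumed)

STEP_MS = 120             # onset-to-onset interval between neighbouring electrodes
TRAIN_MS = 200            # each pulse train outlasts the step, so fields overlap in time

def motion_schedule(path=PATH, step_ms=STEP_MS, train_ms=TRAIN_MS):
    """Return (electrode, onset_ms, offset_ms) triples for one sweep along the path."""
    return [(e, i * step_ms, i * step_ms + train_ms) for i, e in enumerate(path)]

for electrode, onset, offset in motion_schedule():
    print(f"electrode {electrode}: stimulate from {onset} ms to {offset} ms")
```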
The approach of sequentially activating electrodes also significantly improved participants’ ability to distinguish complex tactile shapes and respond to changes in the objects they touched. They could sometimes identify letters of the alphabet electrically “traced” on their fingertips, and they could use a bionic arm to steady a steering wheel when it began to slip through the hand.
These advancements help move bionic feedback closer to the precise, complex, adaptive abilities of natural touch, paving the way for prosthetics that enable confident handling of everyday objects and responses to shifting stimuli.
The future of neuroprosthetics
The researchers hope that as electrode designs and surgical methods continue to improve, the coverage across the hand will become even finer, enabling more lifelike feedback.
“We hope to integrate the results of these two studies into our robotics systems, where we have already shown that even simple stimulation strategies can improve people’s abilities to control robotic arms with their brains,” said co-author Robert Gaunt, PhD, associate professor of physical medicine and rehabilitation and lead of the stimulation work at the University of Pittsburgh.
Greenspon emphasized that the motivation behind this work is to enhance independence and quality of life for people living with limb loss or paralysis.
“We all care about the people in our lives who get injured and lose the use of a limb — this research is for them,” he said. “This is how we restore touch to people. It’s the forefront of restorative neurotechnology, and we’re working to expand the approach to other regions of the brain.”
The approach also holds promise for people with other types of sensory loss. In fact, the group has also collaborated with surgeons and obstetricians at UChicago on the Bionic Breast Project, which aims to produce an implantable device that can restore the sense of touch after mastectomy.
Although many challenges remain, these latest studies offer evidence that the path to restoring touch is becoming clearer. With each new set of findings, researchers come closer to a future in which a prosthetic body part is not just a functional tool, but a way to experience the world.
“Evoking stable and precise tactile sensations via multi-electrode intracortical microstimulation of the somatosensory cortex” was published in Nature Biomedical Engineering in December 2024. Authors include Charles M. Greenspon, Giacomo Valle, Natalya D. Shelchkova, Thierri Callier, Ev I. Berger-Wolf, Elizaveta V. Okorokova, Efe Dogruoz, Anton R. Sobinov, Patrick M. Jordan, Emily E. Fitzgerald, Dillan Prasad, Ashley Van Driesche, Qinpu He, David Satzer, Peter C. Warnke, John E. Downey, Nicholas G. Hatsopoulos and Sliman J. Bensmaia from the University of Chicago; Taylor G. Hobbs, Ceci Verbaarschot, Jeffrey M. Weiss, Fang Liu, Jorge Gonzalez-Martinez, Michael L. Boninger, Jennifer L. Collinger and Robert A. Gaunt from the University of Pittsburgh; Brianna C. Hutchison, Robert F. Kirsch, Jonathan P. Miller, Abidemi B. Ajiboye, Emily L. Graczyk, from Case Western Reserve University; Lee E. Miller from Northwestern University; and Ray C. Lee from Schwab Rehabilitation Hospital.
“Tactile edges and motion via patterned microstimulation of the human somatosensory cortex” was published in Science in January 2025. Authors include Giacomo Valle, now at Chalmers University of Technology in Sweden; Ali H. Alamri, John E. Downey, Patrick M. Jordan, Anton R. Sobinov, Linnea J. Endsley, Dillan Prasad, Peter C. Warnke, Nicholas G. Hatsopoulos, Charles M. Greenspon and Sliman J. Bensmaia from the University of Chicago; Robin Lienkämper, Michael L. Boninger, Jennifer L. Collinger and Robert A. Gaunt from the University of Pittsburgh; and Lee E. Miller from Northwestern University.
An illustration showing a paralyzed individual with a spinal cord injury, implanted with intracortical electrodes in the brain. This brain-computer interface (BCI) allows the individual to control a bionic limb that is not attached to the body, directly with thoughts, to reach and grasp a coffee mug. Due to embedded sensors, the bionic hand senses the grasped object as if it were being grasped with the human hand, communicating the touch sensations to the user’s brain via advanced neurostimulation.
Credit: Chalmers University of Technology | Boid | David Ljungberg
For the first time, a complex sense of touch for individuals living with spinal cord injuries is a step closer to reality. A new study published in Science paves the way for complex touch sensation through brain stimulation while using an extracorporeal bionic limb that is attached to a chair or wheelchair.
The researchers, who are all part of the US-based Cortical Bionics Research Group, have discovered a unique method for encoding natural touch sensations of the hand via specific microstimulation patterns delivered through electrodes implanted in the brain. This allows individuals with spinal cord injuries not only to control a bionic arm with their brain, but also to feel tactile edges, shapes, curvatures and movements that until now have not been possible.
“In this work, for the first time the research went beyond anything that has been done before in the field of brain-computer interfaces (BCI) – we conveyed tactile sensations related to orientation, curvature, motion and 3D shapes for a participant using a brain-controlled bionic limb. We are in another level of artificial touch now. We think this richness is crucial for achieving the level of dexterity, manipulation, and a highly dimensional tactile experience typical of the human hand,” says Giacomo Valle, lead author of the study and Assistant Professor at Chalmers University of Technology, in Sweden.
The importance of the sense of touch
A sense of touch builds richness and independence into our everyday lives. For individuals living with a spinal cord injury, the electrical signals travelling from the hand to the brain that should produce tactile sensations are blocked by the injury, and that sense of touch is lost. A bionic limb controlled by the user’s brain signals can bring back some functionality and independence to someone with a paralysed hand, but without the sense of touch it is very difficult to lift, hold and manipulate objects. Previously, a bionic hand would not be perceived by the user as part of the body, since it would not provide sensory feedback the way a biological hand does. This study aimed to improve the usability of an extracorporeal bionic limb, which would be mounted on a wheelchair or similar equipment close to the user.
Implantable technology for controlling bionic limbs with the brain
For the study, two BCI participants were fitted with chronic brain implants in the sensory and motor regions of the brain that represent the arm and hand. Over the course of several years, the researchers recorded and decoded the patterns of electrical activity in the brain related to motor intention of the arm and hand. This was possible because the electrical activity was still present in the brain; the paralysis simply blocked it from reaching the hand. Decoding these brain signals allows the participants to directly control a bionic arm and hand with the brain to interact with the environment.
Complex touch typed into the brain
The participants were able to accomplish a series of complex tasks that require rich tactile sensations. To do this, the researchers “typed” specific stimulation patterns directly into the participants’ brains via the implants.
“We found a way to type these ‘tactile messages’ via microstimulation using the tiny electrodes in the brain and we found a unique way to encode complex sensations. This allowed for more vivid sensory feedback and experience while using a bionic hand,” says Valle.
The participants could feel the edge of an object, as well as the direction of motion along the fingertips.
Using the brain-computer interface, the researchers could decode the intention of motion from the participant’s brain in order to control a bionic arm. Because the bionic arm carries sensors, when an object comes into contact with them, stimulation is sent to the brain and the participant feels the sensation as if it were in their own hand. This means the participants could potentially complete complex tasks with a bionic arm with more accuracy than was previously possible, like picking up an object and moving it from one location to another.
The future of complex touch for neural prosthetics
This research is just the first step towards patients with spinal cord injuries being able to feel this level of complex touch. To capture all the features of complex touch that the researchers can now encode and convey to the user, more sophisticated sensors and robotic technology are needed (for example, prosthetic skin). The implantable technology used to stimulate the brain would also require further development to increase the repertoire of sensations.
More about the research:
The study “Tactile edges and motion via patterned microstimulation of the human somatosensory cortex”, published in Science, was led by Giacomo Valle, who is now Assistant Professor at the Department of Electrical Engineering at Chalmers University of Technology in Sweden and who was working in the Bensmaia Lab at the University of Chicago, USA, at the time of the study.
More about Cortical Bionics Research Group:
This area of research is already of major business interest in the USA, where multiple research institutions, as well as private companies, are starting to commercialise implantable neurotechnology. Part of the system used in this study is now being developed by an American neurotech company. There is less activity in Europe, where a different regulatory landscape could affect the translation of emerging neurotechnologies. This study brings Chalmers into the Cortical Bionics Research Group and aims to establish a European hub for this area of neurotech research.
The Cortical Bionics Research Group is made up of three North American universities – the University of Pittsburgh, the University of Chicago and Northwestern University. Its mission is to build next-generation intracortical brain-computer interfaces that enable dexterous control of bionic hands by people with paralysis or amputation.
For more information, please contact:
Giacomo Valle, Assistant Professor at the Department of Electrical Engineering, Chalmers University of Technology, Sweden, +46 70 83 88 515 valleg@chalmers.se
Charles Greenspon, Research Assistant Professor at the Department of Organismal Biology & Anatomy, University of Chicago, USA, +1 312 88 94 029 cmgreenspon@uchicago.edu
The contact persons speak English. They are available for live and pre-recorded interviews. At Chalmers, we have podcast studios and broadcast filming equipment on site and can assist with requests for television, radio or podcast interviews. The BCI participants in the United States are available for interview through Giacomo Valle.
More about the scientific article:
The article, Tactile edges and motion via patterned microstimulation of the human somatosensory cortex, was published in Science.
The researchers involved in the study are Giacomo Valle, Ali H. Alamri, John E. Downey, Robin Lienkämper, Patrick M. Jordan, Anton R. Sobinov, Linnea J. Endsley, Dillan Prasad, Michael L. Boninger, Jennifer L. Collinger, Peter C. Warnke, Nicholas G. Hatsopoulos, Lee E. Miller, Robert A. Gaunt, Charles M. Greenspon, Sliman J. Bensmaia.
At the time of the study, the researchers were active at University of Chicago, USA; Chalmers University of Technology, Sweden; University of Pittsburgh, USA; and Northwestern University, USA.
The research was funded by the National Institute of Neurological Disorders and Stroke of the National Institutes of Health under Award Number UH3NS107714 to R.A.G., by R35 NS122333 to S.J.B. and by the University of Chicago (Chicago Postdoctoral Fellowship) to G.V.
- A recent Nature news article written about the study can be found here.
- The company developing part of the system used in the study, the researchers’ industrial partner, can be found here.
- The BCI participants’ association involved in the work can be found here.
- The BCI network of worldwide experts in which the researchers collaborate can be found here.
- Patent submitted on the presented neurotech: “GV is an inventor on a pending international (PCT) patent application submitted by The University of Chicago and The University of Pittsburgh that covers multi-channel microstimulation of the somatosensory cortex (no. 23-0518-WO).”
Tactile edges and motion via patterned microstimulation of the human somatosensory cortex
Article Publication Date
17-Jan-2025
COI Statement
N.G.H. and R.A.G. serve as consultants for Blackrock Neurotech, Inc. R.A.G. is also on the scientific advisory board of Neurowired, LLC. M.L.B., J.L.C., and R.A.G. have received research funding from Blackrock Neurotech, Inc., though that funding did not support the work presented here. P.W. served as a consultant for Medtronic. A.R.S. served as a consultant for Google DeepMind. S.J.B., C.M.G., G.V., R.A.G., and R.L. are inventors on a pending international (PCT) patent application submitted by The University of Chicago and The University of Pittsburgh that covers multichannel microstimulation of the somatosensory cortex (no. 23-0518-WO).
FORWARD TO THE PAST
New chainmail-like material could be the future of armor
First 2D mechanically interlocked polymer exhibits exceptional flexibility and strength
This illustration shows how X-shaped monomers are interlinked to create the first 2D mechanically interlocked polymer. Similar to chainmail, the material exhibits exceptional strength.
Credit: Mark Seniw, Center for Regenerative Nanomedicine, Northwestern University
EVANSTON, Ill. --- In a remarkable feat of chemistry, a Northwestern University-led research team has developed the first two-dimensional (2D) mechanically interlocked material.
Resembling the interlocking links in chainmail, the nanoscale material exhibits exceptional flexibility and strength. With further work, it holds promise for high-performance, lightweight body armor and other applications that demand lightweight, flexible and tough materials.
Publishing on Friday (Jan. 17) in the journal Science, the study marks several firsts for the field. Not only is it the first 2D mechanically interlocked polymer, but the novel material also contains 100 trillion mechanical bonds per square centimeter — the highest density of mechanical bonds ever achieved. The researchers produced this material using a new, highly efficient and scalable polymerization process.
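A back-of-the-envelope check (not from the paper) of what that figure implies: a square centimeter contains 10^14 square nanometers, so 100 trillion bonds works out to roughly one mechanical bond per square nanometer, i.e. molecular-scale spacing.

```python
# Back-of-envelope check (not from the paper): what spacing does
# 100 trillion mechanical bonds per square centimeter imply?
bonds_per_cm2 = 1e14
nm2_per_cm2 = (1e7) ** 2          # 1 cm = 1e7 nm, so 1 cm^2 = 1e14 nm^2
area_per_bond_nm2 = nm2_per_cm2 / bonds_per_cm2
print(area_per_bond_nm2 ** 0.5)   # ~1.0 nm between bonds: molecular-scale spacing
```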
“We made a completely new polymer structure,” said Northwestern’s William Dichtel, the study’s corresponding author. “It’s similar to chainmail in that it cannot easily rip because each of the mechanical bonds has a bit of freedom to slide around. If you pull it, it can dissipate the applied force in multiple directions. And if you want to rip it apart, you would have to break it in many, many different places. We are continuing to explore its properties and will probably be studying it for years.”
For years, researchers have attempted to develop mechanically interlocked molecules with polymers but found it nearly impossible to coax polymers into forming mechanical bonds.
To overcome this challenge, Dichtel’s team took a whole new approach. They started with X-shaped monomers — which are the building blocks of polymers — and arranged them into a specific, highly ordered crystalline structure. Then, they reacted these crystals with another molecule to create bonds between the molecules within the crystal.
“I give a lot of credit to Madison because she came up with this concept for forming the mechanically interlocked polymer,” Dichtel said. “It was a high-risk, high-reward idea where we had to question our assumptions about what types of reactions are possible in molecular crystals.”
The resulting crystals comprise layers and layers of 2D interlocked polymer sheets. Within the polymer sheets, the ends of the X-shaped monomers are bonded to the ends of other X-shaped monomers. Then, more monomers are threaded through the gaps in between. Despite its rigid structure, the polymer is surprisingly flexible. Dichtel’s team also found that dissolving the polymer in solution caused the layers of interlocked monomers to peel off each other.
“After the polymer is formed, there’s not a whole lot holding the structure together,” Dichtel said. “So, when we put it in solvent, the crystal dissolves, but each 2D layer holds together. We can manipulate those individual sheets.”
To examine the structure at the nanoscale, collaborators at Cornell University, led by Professor David Muller, used cutting-edge electron microscopy techniques. The images revealed the polymer’s high degree of crystallinity, confirmed its interlocked structure and indicated its high flexibility.
Dichtel’s team also found the new material can be produced in large quantities. Previous polymers containing mechanical bonds typically have been prepared in very small quantities, using methods that are unlikely to be scalable. Dichtel’s team, on the other hand, made half a kilogram of their new material and expects even larger amounts to be possible as its most promising applications emerge.
Adding strength to tough polymers
Inspired by the material’s inherent strength, Dichtel’s collaborators at Duke University, led by Professor Matthew Becker, added it to Ultem. In the same family as Kevlar, Ultem is an incredibly strong material that can withstand extreme temperatures as well as acidic and caustic chemicals. The researchers developed a composite material of 97.5% Ultem fiber and just 2.5% of the 2D polymer. That small percentage dramatically increased Ultem’s overall strength and toughness.
Dichtel envisions his group’s new polymer might have a future as a specialty material for light-weight body armor and ballistic fabrics.
“We have a lot more analysis to do, but we can tell that it improves the strength of these composite materials,” Dichtel said. “Almost every property we have measured has been exceptional in some way.”
Steeped in Northwestern history
The authors dedicated the paper to the memory of former Northwestern chemist Sir Fraser Stoddart, who introduced the concept of mechanical bonds in the 1980s. Ultimately, he elaborated these bonds into molecular machines that switch, rotate, contract and expand in controllable ways. Stoddart, who passed away last month, received the 2016 Nobel Prize in Chemistry for this work.
“Molecules don’t just thread themselves through each other on their own, so Fraser developed ingenious ways to template interlocked structures,” said Dichtel, who was a postdoctoral researcher in Stoddart’s lab at UCLA. “But even these methods have stopped short of being practical enough to use in big molecules like polymers. In our present work, the molecules are held firmly in place in a crystal, which templates the formation of a mechanical bond around each one.
“So, these mechanical bonds have deep tradition at Northwestern, and we are excited to explore their possibilities in ways that have not yet been possible.”
The study, “Mechanically interlocked two-dimensional polymers,” was primarily supported by the Defense Advanced Research Projects Agency (contract number HR00112320041) and Northwestern’s International Institute for Nanotechnology (Ryan Fellows Program).