Friday, August 18, 2023

Long-standing mystery of modified human skulls in ancient Japan solved by new analysis

Story by Mindy Weisberger, CNN


Modified skulls found in an ancient burial site in Japan were deliberately reshaped in both men and women as an expression of collective identity. The practice may also have helped the group to forge trade connections, a new study suggests.

In all the skulls, the normally rounded part at the back of the head was flattened and squared off, possibly through binding or being pressed against a flat surface during infancy.

The skulls belonged to individuals from the Hirota people, who lived on the southern Japanese island of Tanegashima from the third century to the seventh century. They were buried at a large grave site that was first excavated in the 1950s and then later in the early 2000s, researchers reported Wednesday in the journal PLOS One. The site held hundreds of skeletons, and 90% of the burials were richly adorned with glass beads and pendants, bracelets and beads made of shell, suggesting that all the burials held people of similar status.

The study team determined that Hirota people practiced cranial deformation “regardless of social rank or sex,” said coauthors Dr. Noriko Seguchi, an associate professor with the Faculty of Social and Cultural Studies at Kyushu University in Japan, and James Frances Loftus III, a postdoctoral research fellow at the university.

Shells adorning Hirota remains were abundant but were not local to the region. The artifacts originated thousands of kilometers south of the burial site, suggesting that Hirota people traded for them. It’s possible, the researchers suggested, that the Hirota cranial reshaping played a part in their trading success.

“We hypothesize that the Hirota people deformed their crania to preserve group identity and potentially facilitate long-distance trade of shellfish, as supported by archaeological evidence,” Seguchi and Loftus told CNN in an email.

An ancient and widespread practice

Intentional skull reshaping is found in burials across Asia, Europe and Mesoamerica. Modified skulls from medieval sites in Germany and Croatia show signs of deformation, and an elongated skull from China that’s 12,000 years old suggests that such modifications date to Neolithic times.

The practice was also widespread among the Maya in Central America and what is now Mexico, prior to the arrival of Europeans. Cranial deformation in Mayan people is particularly well documented, with carved figurines showing children wearing devices to shape their skulls, “such as a board on the forehead,” Seguchi and Loftus said. “This could have been a practice done by mothers or caretakers of infants as young as a couple of weeks or months old.”

Other researchers had previously speculated about the flattened Hirota skulls in Japan, but it was unknown whether the Hirota modifications were deliberate or merely a byproduct of other cultural practices.

For the new study, the scientists 3D-scanned and digitally modeled 19 adult Hirota skulls to conduct more detailed morphological analysis. The team compared them to adult skulls from two other ancient Japanese groups — Jomon and Doigahama — which had similar cranial deformations.



Ancient human remains are shown in a burial found at the Hirota ruins. Shell accessories found near the skeleton hint at regional culture and trade at the time. - The Kyushu University Museum

“We examined abnormalities in cranial sutures that might have formed during growth and development by utilizing a novel 3D visualization method of cranial surfaces,” Seguchi and Loftus explained. This new technique revealed an unprecedented level of detail in subtle bone shapes.

Intentionally modified skulls

In all three groups, male and female skulls were deformed. But in the Hirota skulls, the occipital region — a cranial bone at the back — was notably flatter than in Jomon and Doigahama skulls. Certain indentations were also identified in Hirota skulls but were absent in those of Jomon and Doigahama individuals. These depressions may have been left by binding practices that deformed the infant skulls, the scientists said.

The dramatic flatness of Hirota crania, the skull depressions and overall changes in bone structure corresponded with examples of extreme skull modification in cultures from the Americas. This told the researchers that Hirota people also deliberately modified their cranial shapes.


At the Hirota site, each marker indicates where burials were found, along with notes on their sex and approximate age group. - The Kyushu University Museum

The authors are hopeful that future excavations may uncover settlements that offer further evidence of the Hirota people’s daily lives, which will shed light on their motivations for cranial deformation.

“Through these findings, we believe we have begun the process of unravelling the still mysterious nature of the Hirota people, their culture, and potential trade practices,” Seguchi and Loftus said. “We hope that this study will open the eyes of researchers of this and other periods of Japanese prehistory to lines of thinking which allow us to view Japan through an international lens, seeing cultural practice as a fluid and changing phenomenon.”

Unveiling Japan's ancient practice of cranial modification: The case of the Hirota people in Tanegashima


An international team of researchers reports that the Hirota people, who lived on the southern Japanese island of Tanegashima from the late Yayoi period to the Kofun period (3rd to 7th century CE), practiced cranial modification.

Peer-Reviewed Publication

KYUSHU UNIVERSITY

Skeletal remains from the Hirota ruins 

IMAGE: A PHOTOGRAPH OF THE ANCIENT HUMAN REMAINS FOUND AT THE HIROTA RUINS. A NOTABLE CHARACTERISTIC OF THE REMAINS IS THE WEARING OF MANY SHELL ACCESSORIES, INDICATIVE OF THE CULTURE AND TRADE OF THE REGION AT THE TIME.

CREDIT: THE KYUSHU UNIVERSITY MUSEUM


Fukuoka, Japan—A team of biological anthropologists and archaeologists from Kyushu University and the University of Montana has broken new ground in our understanding of the practice of intentional cranial modification, a practice found in numerous ancient civilizations around the world.

Publishing in PLOS ONE, the team reports that the Hirota people, who lived on the southern Japanese island of Tanegashima around the 3rd century to 7th century CE, also partook in the practice. Moreover, the study found no significant differences in cranial modification between sexes, indicating that both males and females practiced intentional cranial modification.

Cranial modification is a form of body alteration where the head of a person is pressed or bound, usually at an early age, to permanently deform the skull. The practice predates written history, and researchers theorize that it was performed to signify group affiliation or demonstrate social status.

"One location in Japan that has long been associated with cranial deformation is the Hirota site on the Japanese island of Tanegashima, in Kagoshima Prefecture. This is a large-scale burial site of the Hirota people who lived there during the end of the Yayoi Period, around the 3rd century CE, to the Kofun Period, between the 5th and 7th century CE," explains Noriko Seguchi of Kyushu University's Faculty of Social and Cultural Studies who led the study. "This site was excavated from 1957 to 1959 and again from 2005 to 2006. From the initial excavation, we found remains with cranial deformations characterized by a short head and a flattened back of the skull, specifically the occipital bone and posterior parts of the parietal bones."

However, while the site provided an ideal opportunity to study the phenomenon, it had remained unclear whether these cranial modifications had been intentional, or were simply the unintended result of other habits.

To conduct the study, the research group employed a hybrid approach, utilizing 2D images to analyze the shape of the skulls' outline, as well as 3D scans of their surface. The group also compared cranial data from other archeological sites in Japan, such as the Doigahama Yayoi people in Western Yamaguchi, and the Kyushu Island Jomon people, who were the hunter-gatherer predecessors of the Yayoi people. Along with visually assessing skull morphology, the team gathered all this data and statistically analyzed the contours and shapes of the skulls.
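A standard first step in this kind of outline comparison is Procrustes superimposition, which removes position, scale, and rotation so that only shape differences remain. The sketch below is a minimal, generic illustration of that step using invented 2D "landmark" coordinates; it is not the study's actual data or pipeline:

```python
import numpy as np
from scipy.spatial import procrustes

# Invented 2D landmarks along a skull's midline profile -- purely
# illustrative, not real Hirota measurements.
t = np.linspace(0, np.pi, 20)
rounded = np.column_stack([np.cos(t), np.sin(t)])   # rounded profile
flattened = rounded.copy()
flattened[:, 1] = np.minimum(flattened[:, 1], 0.7)  # clip to mimic flattening

# Procrustes superimposition removes position, scale, and rotation,
# so the residual "disparity" reflects shape difference alone.
_, _, d_same = procrustes(rounded, rounded)      # identical shapes -> ~0
_, _, disparity = procrustes(rounded, flattened)
print(f"identical: {d_same:.2e}, rounded vs flattened: {disparity:.4f}")
```

A disparity near zero means two outlines share the same shape regardless of size or orientation; the artificially flattened outline produces a clearly nonzero disparity.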

"Our results revealed distinct cranial morphology and significant statistical variability between the Hirota individuals and the Kyushu Island Jomon and Doigahama Yayoi samples," continues Seguchi. "The presence of a flattened back of the skull characterized by changes in the occipital bone, along with depressions in the parts of the skull that connect the bones together, specifically the sagittal and lambdoidal sutures, strongly suggested intentional cranial modification."

The motivations behind this practice remain unclear, but the researchers hypothesize that the Hirota people deformed their crania to preserve group identity and potentially facilitate long-distance trade of shellfish, as supported by archaeological evidence found at the site.

"Our findings significantly contribute to our understanding of the practice of intentional cranial modification in ancient societies," concludes Seguchi. "We hope that further investigations in the region will offer additional insights into the social and cultural significance of this practice in East Asia and the world."

3D images of the skulls excavated from the Doigahama site (left) and the Hirota site (right) 

About Kyushu University 
Kyushu University is one of Japan's leading research-oriented institutes of higher education since its founding in 1911. Home to around 19,000 students and 8,000 faculty and staff, Kyushu U's world-class research centers cover a wide range of study areas and research fields, from the humanities and arts to engineering and medical sciences. Its multiple campuses—including one of the largest in Japan—are located around Fukuoka City, a coastal metropolis on the southwestern Japanese island of Kyushu that is frequently ranked among the world's most livable cities and historically known as Japan's gateway to Asia. Through its Vision 2030, Kyushu U will 'Drive Social Change with Integrative Knowledge.' Its synergistic application of knowledge will encompass all of academia and solve issues in society while innovating new systems for a better future.


 

Brown-led research provides unprecedented look at what influences sea ice motion in the Arctic


A new in-depth analysis of sea ice motion in the fastest-warming part of the globe shows how Arctic Ocean sea ice responds to different ocean currents and reveals that the seafloor plays a crucial role.


Peer-Reviewed Publication

BROWN UNIVERSITY

Sea Ice 

IMAGE: LOCAL TIDAL CURRENTS STRONGLY AFFECT THE MOVEMENT OF SEA ICE IN THE ARCTIC OCEAN AND THE MAKEUP OF THE SEAFLOOR CAUSES SOME OF THE MOST ABRUPT CHANGES.

CREDIT: DANIEL WATKINS.




PROVIDENCE, R.I. [Brown University] — A new study led by researchers at Brown offers fresh insights into the forces above and beneath the ocean surface that influence how sea ice moves and disperses in the Arctic Ocean, which is warming at over twice the rate of the global average.

The in-depth analysis reveals how local tidal currents strongly affect the movement of the ice along its journey and provides an unprecedented look at how the makeup of the seafloor is causing some of the most abrupt changes.

Data from the study can be applied to improve complex computer simulations used for forecasting Arctic sea ice conditions, and in the long-term, the results may help clarify how climate change is altering the Arctic and inform future climate predictions.

“The ice is clearly feeling the influence of the bottom of the ocean,” said Daniel Watkins, a postdoctoral researcher at Brown and lead author of the new study published in Geophysical Research Letters. “The landscape at the ocean floor, like canyons and continental shelves, affects tides and other ocean currents. And as it drifts, the sea ice passes over many different undersea features. We see sharp changes in the dynamics of the sea ice as soon as it gets to those undersea features.”

Using data from the largest-ever drifting sea-ice buoy array, along with 20 years of satellite images, the researchers examined sea ice motion as it drifted from the Arctic Ocean through a deep-water passage called the Fram Strait and eventually into the Greenland Sea. The analysis revealed the seafloor’s impact on some of the most abrupt changes affecting the sea ice, like dramatic gains in speed or motions that force the ice to pack in close together or even break apart.

“What we see with this dataset is a transition from the central Arctic, where the ice is mostly moving as a whole and following wind patterns, to areas where we’re seeing much stronger impacts of ocean currents,” Watkins said.

The Arctic is the fastest warming part of the globe and it has long been understood that sea ice in the region plays an important role in the planet’s climate. For instance, the ice acts like a reflective surface deflecting how much sunlight is absorbed by the Earth. As it disappears, more sunlight is absorbed, leading to a warmer planet. Many scientists also expect that as Arctic ice vanishes, weather across the Northern Hemisphere will be impacted, producing periods of bitter cold, punishing heat waves and disastrous floods.

With the study, the researchers wanted to delve deeper into the changes happening in this critically important part of the Earth. Much of the data for the study was gathered during the largest polar expedition in history — the Multidisciplinary drifting Observatory for the Study of Arctic Climate.

Comprehensive research reveals sudden increases in ice speed

During the expedition, teams of researchers took turns spending a year drifting with the sea ice aboard a massive German icebreaker in the Arctic Ocean. Watkins was there for two weeks in October 2019 to help install a network of autonomous sensors around the base camp. While there, Watkins coordinated helicopter flights to remote patches of sea ice, worked with analysts to find suitable sites for instruments and buoys and deployed them on the ice.

Throughout the year-long expedition, a total of 214 buoys were deployed, including 51 during Watkins’ tenure on the expedition. The study is based on GPS data transmitted from a set of 108 of the buoys that drifted from the central Arctic through the Fram Strait and into the Greenland Sea.
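The basic quantity such buoys yield is drift speed, derived from successive GPS fixes. The sketch below is a hypothetical illustration (invented coordinates, not MOSAiC data) of turning hourly positions into speeds using the haversine great-circle distance:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two GPS fixes."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical hourly fixes from one buoy drifting south -- invented values.
fixes = [(0, 79.00, -2.00), (1, 78.98, -2.01), (2, 78.95, -2.03)]  # (h, lat, lon)

speeds = []
for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
    speeds.append(haversine_km(la0, lo0, la1, lo1) / (t1 - t0))  # km/h
print([f"{s:.2f} km/h" for s in speeds])
```

Sudden jumps in speed computed this way, when the wind has not changed, are exactly the kind of signal the researchers attribute to ocean currents and seafloor features.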

The major focus was on what are known as marginal ice zones in the Greenland Sea and Fram Strait, the transition zones between the open, ice-free ocean and the pack ice of the central Arctic.

As part of their analysis, the group also analyzed satellite measurements taken from 2003 to 2020 to put the data the buoys gathered over the year adrift into historical context. The satellite data helped confirm sharp changes in ice velocity and ice motion that could only be explained by the seafloor’s influence on the sea ice.

For instance, looking at the data from an area northeast of Svalbard, Norway, the researchers noticed the speed of the ice suddenly increased even though the wind hadn’t changed. That meant the ice was getting pushed by the ocean currents, so the team delved deeper to find where this happens and how. They found that the sea ice speeds up where the Transpolar Drift Stream, one of the Arctic Ocean’s major currents, ends and the fast-moving East Greenland Current, which forms due to a combination of the Earth’s rotation and the edge of the continental shelf on the seafloor, begins. The analysis shows how the sea ice responds to different ocean currents and that the sea floor plays a role.

“In the beginning of this journey, there was almost no difference in the drift speed across the whole set of buoys,” Watkins said. “Then there’s essentially one day where the wind died down and the ice ran into that boundary current and it just took off. It was like a one-day-to-the-next change in what was pushing the ice.”

As next steps, the researchers plan to work with model developers to help implement the data from the study into forecasts of how the ice will move and where it will end up. They also plan to further develop an ice floe tracking tool to track the motion of individual pieces of ice. The tool would help researchers see details of ice motion that are invisible to standard approaches.

“We’re hoping to understand the changing ice physics in a warming Arctic and use it to help make our models of those physics better,” Watkins said.

Along with Watkins, other researchers involved with the study included Monica Martinez Wilhelmus, an assistant professor of engineering and a senior author on the study, as well as Angela C. Bliss from NASA’s Goddard Space Flight Center and Jennifer K. Hutchings from Oregon State University. Funders for the work included NASA, the Department of Energy, the Office of Naval Research and the National Science Foundation.

 

Illinois Tech engineer spearheads research leading to groundbreaking green propane production method


Mohammad Asadi partners with SHV Energy to distribute electrolyzer device that can convert carbon dioxide into propane in a way that is economically viable and scalable


Peer-Reviewed Publication

ILLINOIS INSTITUTE OF TECHNOLOGY

Electrolyzer 

IMAGE: ILLUSTRATION OF ELECTROLYZER, WHICH USES A NOVEL CATALYST TO CONVERT CARBON DIOXIDE INTO PROPANE.

CREDIT: ILLINOIS INSTITUTE OF TECHNOLOGY




CHICAGO—August 18, 2023—A paper recently published in Nature Energy based on pioneering research done at Illinois Institute of Technology reveals a promising breakthrough in green energy: an electrolyzer device capable of converting carbon dioxide into propane in a manner that is both scalable and economically viable.

As the United States races toward its target of net-zero greenhouse gas emissions by 2050, innovative methods to reduce the significant carbon dioxide emissions from electric power and industrial sectors are critical. Mohammad Asadi, assistant professor of chemical engineering at Illinois Tech, spearheaded this groundbreaking research. 

“Making renewable chemical manufacturing is really important,” says Asadi. “It’s the best way to close the carbon cycle without losing the chemicals we currently use daily.”

What sets Asadi’s electrolyzer apart is its unique catalytic system. It uses inexpensive, readily available materials to produce tri-carbon molecules—fundamental building blocks for fuels like propane, which is used for purposes ranging from home heating to aviation.
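For context (standard electrochemistry, not a detail taken from the paper), the overall cathode half-reaction for converting carbon dioxide to propane is a 20-electron reduction:

```latex
3\,\mathrm{CO_2} + 20\,\mathrm{H^+} + 20\,e^- \longrightarrow \mathrm{C_3H_8} + 6\,\mathrm{H_2O}
```

Each carbon atom goes from the +4 oxidation state in CO2 to an average of -8/3 in propane, which is why so many electrons per product molecule are required and why catalyst selectivity toward tri-carbon products is the hard part.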

To ensure a deep understanding of the catalyst’s operations, the team employed a combination of experimental and computational methods. This rigorous approach illuminated the crucial elements influencing the catalyst’s reaction activity, selectivity, and stability.

A distinctive feature of this technology, lending to its commercial viability, is the implementation of a flow electrolyzer. This design permits continuous propane production, sidestepping the pitfalls of the more conventional batch processing methods.

“Designing and engineering this laboratory-scale flow electrolyzer prototype has demonstrated Illinois Tech’s commitment to creating innovative technologies. Optimizing and scaling up this prototype will be an important step toward producing a sustainable, economically viable, and energy-efficient carbon capture and utilization process,” says Advanced Research Projects Agency-Energy Program Director Jack Lewnard.

This innovation is not Asadi’s first venture into sustainable energy. He previously adapted a version of this catalyst to produce ethanol by harnessing carbon dioxide from industrial waste gas. Recognizing the potential of the green propane technology, Asadi has collaborated with global propane distributor SHV Energy to further scale and disseminate the system. 

“This is an exciting development which opens up a new e-fuel pathway to on-purpose propane production for the benefit of global users of this essential fuel,” says Keith Simons, head of research and development for sustainable fuels at SHV Energy.

Illinois Tech Duchossois Leadership Professor and Professor of Physics Carlo Segre, University of Pennsylvania Professor of Materials Science and Engineering Andrew Rappe, and University of Illinois Chicago Professor Reza Shahbazian-Yassar contributed to this work. Mohammadreza Esmaeilirad (Ph.D. CHE ’22) was a lead author on the paper. 

Disclaimer: “Research reported in this publication was supported by the National Science Foundation under Award Number 2135173, the Advanced Research Projects Agency-Energy under Award Number DE-AR0001581, and SHV Energy. This content is solely the responsibility of the authors and does not necessarily represent the official views of the National Science Foundation, the Advanced Research Projects Agency-Energy, or SHV Energy.”

Mohammad Asadi, “Imidazolium Functionalized Transition Metal Phosphide Catalysts for Electrochemical Carbon Dioxide Conversion to Ethanol,” National Science Foundation; Award Number 2135173

Mohammad Asadi, “Direct Conversion of Flue Gas to Value-Added Chemicals Using a Carbon-Neutral Process,” Advanced Research Projects Agency-Energy; Award Number DE-AR0001581

Researchers unveil a new, economical approach for producing green hydrogen

Graphical abstract. Credit: Joule (2023). DOI: 10.1016/j.joule.2023.07.016

Researchers at the University of Colorado Boulder have developed a new and efficient way to produce green hydrogen or green syngas, a precursor to liquid fuels. The findings could open the door for more sustainable energy use in industries like transportation, steelmaking and ammonia production.

The new study, published Aug. 16 in the journal Joule, focuses on the production of hydrogen or syngas, a mixture of hydrogen and carbon monoxide that can be converted into fuels like gasoline, diesel and kerosene. The CU Boulder team lays the groundwork for what could be the first commercially viable method for producing this fuel entirely using solar energy. That might help engineers to generate syngas in a more sustainable way.

The group was led by Al Weimer, professor in the Department of Chemical and Biological Engineering.

"The way I like to think about it is some day when you go to the pump you'll have, for example, unleaded, super unleaded and ethanol options, and then an additional option being solar fuel, where the fuel is derived from sunlight, water and carbon dioxide," said Kent Warren, one of two lead authors of the new study and a research associate in Chemical and Biological Engineering. "Our hope is that it will be cost-competitive to the fuels sourced from the ground."

Traditionally, engineers produce hydrogen gas through electrolysis, or using electricity to split molecules of water into hydrogen and oxygen gas. The team's "thermochemical" approach, in contrast, uses heat generated by solar rays to complete those same reactions. The methods can also split molecules of carbon dioxide pulled from the atmosphere to produce carbon monoxide.

Scientists had previously shown that such an approach to making hydrogen and carbon monoxide was possible, but might not be efficient enough to produce syngas in a commercially viable manner.

In the new study, the researchers demonstrated that they can conduct these reactions at elevated pressures, in part by employing iron-aluminate materials, which are relatively inexpensive and abundant in the Earth. Those higher pressures allowed the team to more than double its production of hydrogen.

More information: Justin T. Tran et al, Pressure-enhanced performance of metal oxides for thermochemical water and carbon dioxide splitting, Joule (2023). DOI: 10.1016/j.joule.2023.07.016

Journal information: Joule 



 

Q&A: Climate change is increasing wildfire risks for forests—what can we do about it?


Recent wildfires in Hawaii and in many parts of Canada underscore the importance of wildfire prevention and management. Research from North Carolina State is helping us better understand—and possibly mitigate—increased risks for forests associated with climate change, including from wildfires.

While wildfires are not necessarily new, their frequency, size, duration and intensity are "pretty readily" linked with a changing climate, according to Robert Scheller, associate dean of North Carolina State University's College of Natural Resources and professor of forestry and environmental resources.

"You can see the frequency of wildfire events, and the area burned, is closely linked to temperature and precipitation," Scheller said. "Fuels don't burn very well when it's really wet. They burn fantastically when it's warm and dry. Those are the basics. If there is more drought, and higher temperatures in the future, there will be more fire."

Scheller uses computer modeling to understand risks for forests from wildfires, drought and insects in the future under climate change. He has been involved in studies on everything from estimating tree mortality by wildfire in the Southern Appalachian mountains to projecting the benefit of efforts to reduce fuel loads in forests of California.

His work not only offers a glimpse at what wildfire risks might look like under climate change, but also offers insight into the costs and benefits of solutions for mitigating those risks. The Abstract spoke to Scheller about his work and findings from a few recent studies.

The Abstract: How do you use computer modeling to study wildfires?

Robert Scheller: We have a lot of data, so we know a lot about landscapes today. Next, we really want to understand landscapes in the future. What we want to know is: Should we be managing landscapes differently? What could we be doing to influence the future? To do that, we use a computer model, based on data and complex algorithms, called LANDIS-II to explore different futures, and understand what we could be doing differently, and what the real risks are to landscapes.

It's just the computer doing math, but it does produce a myriad of maps that show potential futures. Like SimCity, it's spatial, but we're simulating events that happen in forests. Another way to think about it is like a digital twin of a landscape.

TA: What makes your approach unique?

Scheller: One thing most models don't do is capture human ignitions of wildfire versus lightning and active suppression of wildfires. They're only focused on natural causes of wildfire. I'm more interested in what humans are doing on the landscape. We need to pay more attention to where and when humans are lighting fires. Human ignitions have different spatial and temporal patterns. I wrote a wildfire sub-model for LANDIS-II that pays as much attention to humans as it does to "natural" causes of fire.
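To see why separating ignition sources matters, consider a toy contrast: human ignitions cluster near roads year-round, while lightning ignitions peak in summer. The sketch below is purely illustrative; every number is invented, and LANDIS-II's actual ignition sub-model is far richer:

```python
import math

# Toy ignition-rate model: expected ignitions per 100 km^2 per month.
# All rates below are invented for illustration only.
def ignition_rates(dist_to_road_km, month):
    human = 0.8 * math.exp(-dist_to_road_km / 2.0)    # decays away from roads
    lightning = 0.30 if month in (6, 7, 8) else 0.05  # peaks with summer storms
    return human, lightning

print(ignition_rates(0.5, 10))   # roadside in October: humans dominate
print(ignition_rates(20.0, 7))   # deep wilderness in July: lightning dominates
```

A model that only simulates lightning would miss the roadside, shoulder-season fires entirely, which is the spatial and temporal pattern the quote describes.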

TA: In one of your studies, you looked at delayed impacts of wildfire on forests in the Southern Appalachians. What did you find?

Scheller: After a wildfire, you can look at satellite data to see what the forest looked like just before the fire, and then after. Doing that, you can quantify how much tree mortality occurred. We found in the Southern Appalachians that trees are dying from fires up to three years after the fire happened.

When you look at it across a whole landscape, there is as much as 40% more mortality from the fire if you account for this later mortality happening two or three years after the fire. That means that fire is having a bigger effect than we assumed because we're often looking at the effect too soon after the fire.

TA: What did your study of California's Sierra Nevada mountains look at, and what did you find?

Scheller: We looked at the impact of fuel treatments in the Lake Tahoe area of California. Fuel treatments can be very effective at reducing the intensity of wildfire. It's getting rid of all the dead leaves and logs on the ground, and the little trees that can help the flames get up to the top of the large trees. Thinning also reduces forest density.

In our study, we found that after about the middle of this century, trying to manage for climate change may not be effective. Wildfire gets all the glory, but climate change is also causing insects to kill enormous numbers of trees. The insects are killing way more trees than the wildfires.

The challenge with fuel treatments is they are expensive to get on the ground. You can't treat the whole forest like one would like. The other problem is these fuel treatments don't help with insects. They are way up high in the canopy. They fly from tree to tree. You can do all these fuel treatments, and with climate change you're still going to lose a lot of trees because of insects. There are different ways to manage to reduce insect mortality, but people are focused on fire.

TA: What do you take away from this?

Scheller: The bad news is forests are going to change because of climate change. Without a doubt. The good news is, there are options for managing them, but we'll need to manage really aggressively to achieve our objectives—much more aggressively than people have assumed in the past. Back to the good news: There's a lot of money for managing forests for climate change in California.

 

Bots are better at CAPTCHA than humans, researchers find

Solving times for various types of CAPTCHAs. Boxes show the middle 50% of participants, and whiskers show the filtered range. Black vertical lines show the median. Credit: arXiv (2023). DOI: 10.48550/arxiv.2307.12108

It may be impossible to calculate the time spent and the financial and emotional costs of combating the bad guys lurking around the web, but one thing is for sure: Users and website operators both are getting fed up with all the obstacles they must deal with daily.

One of the leading nuisances for users are CAPTCHAs.

Developed 20 years ago to thwart hackers seeking to steal content, insert malicious posts, make fraudulent transactions or slow website traffic to a crawl, the acronym for this omnipresent line of defense clearly spells out its mission: Completely Automated Public Turing test to tell Computers and Humans Apart.

It's a laudable objective, but there are a few problems. One, they're not foolproof. Two, they are time-consuming. Three, they can be a tremendous nuisance. Is that zoomed-in mirror considered a bus if it's attached to a bus? Is a cropped photo of the bottom of a traffic light pole still a traffic light? Is an elevated crosswalk a bridge?

As one anonymous pundit once suggested, "We thought it was our ability to love that made us human. But it turns out it is our ability to select each image that contains a truck."

And now there's a fourth reason to pull our hair out over the nuisance of CAPTCHAs. Bots, it seems, are better at solving them than humans are.

That's according to researchers at the University of California, Irvine, who tested 1,400 subjects on several types of CAPTCHAs. They found that bots are not only better at solving various forms of CAPTCHAs such as image recognition, puzzle sliders and distorted text, they're faster, too.

They suggest it's been a game of cat-and-mouse.

"CAPTCHAs have evolved in terms of sophistication and diversity," said Andrew Searles, an author of a paper titled "An Empirical Study and Evaluation of Modern CAPTCHAs," "becoming increasingly difficult to solve for both bots [machines] and humans." The study is published on the arXiv preprint server.

"But advances in computer vision and machine learning have dramatically increased the ability of bots to recognize distorted text [with more than] 99% accuracy … and bots often outsource solving to CAPTCHA farms—sweatshop-like operations where humans are paid to solve CAPTCHAs," he said.

In their study, researchers found bots cracked distorted-text CAPTCHAs correctly just under 100% of the time. Humans achieved between 50% and 84% accuracy. And humans required up to 15 seconds to solve the challenges; the bots dispatched the problems in less than a second.
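The reported gap can be put in rough numbers. The sketch below uses only the figures quoted in this article; the "speedup" is simple arithmetic, not a result taken from the paper itself.

```python
# Figures quoted in the article (distorted-text CAPTCHAs).
human_accuracy = (0.50, 0.84)  # reported human accuracy range
bot_accuracy = 0.996           # "just under 100%"
human_time_s = 15.0            # humans took up to 15 seconds
bot_time_s = 1.0               # bots finished in under a second

# Bots are at least this many times faster on the slowest human solves.
speedup = human_time_s / bot_time_s
print(f"bots are at least {speedup:.0f}x faster")

# Even against the best human solvers, bots come out ahead on accuracy.
print(f"bots beat the best human accuracy by {bot_accuracy - human_accuracy[1]:.0%}")
```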

"We do know for sure that [the tests] are very much unloved," said Gene Tsudik, another author of the paper. "We didn't have to do a study to come to that conclusion. But people don't know whether that effort, that colossal global effort that is invested into solving CAPTCHAs every day, every year, every month, whether that effort is actually worthwhile."

Bots excelled at the other varieties of CAPTCHAs as well. They struggled most with image-based tests, but even there they were slightly better than humans in both accuracy and speed.

For Searles, the conclusion is clear.

"There's no easy way using these little image challenges or whatever to distinguish between a human and a bot any more," he said. Instead, he recommended capitalizing on AI advances to design "intelligent algorithms" that can better distinguish bot activity from human input.
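As an illustration of what such a behavior-based "intelligent algorithm" might look like, here is a minimal, hypothetical sketch: it flags input whose inter-event timing is suspiciously regular, a classic tell of naive automation. The function name, the single feature, and the threshold are illustrative assumptions, not anything from the study; a real detector would combine many behavioral signals.

```python
import statistics

def looks_automated(event_intervals_ms, min_jitter_ms=5.0):
    """Flag input whose inter-event timing is suspiciously regular.

    Humans type and move a mouse with noisy timing; a naive bot tends
    to fire events at near-constant intervals.
    """
    if len(event_intervals_ms) < 3:
        return False  # not enough signal to decide either way
    jitter = statistics.stdev(event_intervals_ms)
    return jitter < min_jitter_ms

# A metronome-like bot trace vs. a noisier human-like trace.
print(looks_automated([50, 50, 51, 50, 50]))      # True: near-constant timing
print(looks_automated([120, 340, 95, 610, 210]))  # False: irregular, human-like
```

The design point is that the user does nothing extra: the check runs on behavior the page already observes, which is exactly the shift away from explicit challenges that Searles describes.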

More information: Andrew Searles et al, An Empirical Study & Evaluation of Modern CAPTCHAs, arXiv (2023). DOI: 10.48550/arxiv.2307.12108




© 2023 Science X Network


Microbial influences on fermented beans
Harvested soybeans, ready for fermenting. Credit: Wikimedia Commons

Fermented foods are deeply seated in global food culture. Many recipes are highly guarded, passed through generations and shrouded in mystery. The microbial communities that make up fermented foods are often diverse, but typically only a few species contribute significantly to the final product.

Additionally, fermented foods and drinks (termed ferments in this article) with similar flavor profiles are often fermented by similar microbes, or at least microbes that carry out similar metabolic processes. By exploring the variety of fermentations from essentially the same starting material we can observe the impact microbes have on the final product.

Fermentation's role in the field of microbiology is hard to overstate. Significant advances in early microbiology stemmed from trying to understand why some ferments go wrong. For example, in the mid-1800s, Louis Pasteur noticed that wine was souring because it was colonized by an organism smaller than yeast, which turned out to be lactic acid bacteria.

Metabolically, fermentation simply describes processes where organic molecules act as donors and acceptors of electrons. The microbial community that drives fermentation can either be the microbes that are natively present on the food (termed spontaneous fermentation) or come from part of a previous ferment, known as a starter, mother or backslop.

Starter cultures typically help expedite the first stages of fermentation. This is because the starter will seed the ferment with a larger population of organisms than are natively present on the food starting material. Starter cultures can also ensure more fermentation consistency and quality by maintaining similar organisms. Larger scale production tends to rely on starter cultures, while many traditional fermented foods utilize the native communities of microorganisms.

People maintain starters for long periods to make sure they can continue making the fermented foods they enjoy. For example, according to popular history, Alaskan miners kept their sourdough starters warm by carrying them on their person.

The microbiological processes underlying food fermentation are highly dynamic, with many ferments characterized by multiple stages as the microbes change their environments with the addition of each metabolic waste product. Furthermore, different ferments are characterized by different microbes that each contribute to the unique flavor profiles.

How fermentation shapes food

Fermentation influences five primary aspects of food:

  1. Preservation.
  2. Digestibility.
  3. Toxin reduction.
  4. Probiotics.
  5. Taste.

Preservation

Fermentation is often associated with the idea of food preservation. This is because the fermenting microbes influence their environment in ways that exclude many of the organisms that cause rancidity or make people sick. For example, in acidic ferments, lactic acid bacteria produce acids that exclude dangerous microbes, such as Clostridium.

Digestibility and Toxin Reduction

Fermentation can facilitate healthy food consumption by improving digestibility and reducing toxins. This is especially true for soybeans, as well as many other plant-based foods.

Plant matter is often composed of molecules that reduce the bioavailability of nutrients, inhibit digestive enzymes or even inhibit overall cell function. During fermentation, microbial enzymes can break down many of these toxins, including cyanogenic glycosides, which can inhibit cellular respiration. Additionally, some toxins are removed during preparation steps, such as removing the outer hull of seed pods or through cooking.

Probiotics

The health benefits of fermented foods are thought to extend beyond making foods easier to digest and less toxic. In particular, scientists are very interested in the influence of the microbial constituents of the ferment as potential probiotics.

There is mixed evidence regarding the effects of fermented foods on gut health. Many of these food-associated microbes are considered transient and have limited ability to colonize the gut, meaning that they simply pass through the digestive tract. Yet one study indicates that some situations of dysbiosis can be mitigated with the help of lactic acid bacteria from a kefir-like fermented milk product.

The authors explored the ability of microorganisms (a consortium of Bifidobacteria, Lactococcus, Lactobacillus and Streptococcus) from the probiotic drink to colonize rat guts with different initial microbiomes. They observed that individuals that did not retain Lactococcus lactis had more resilient microbiomes, characterized by a higher relative abundance of Lachnospiraceae. This suggests that microbiomes in flux are more likely to be impacted by probiotics in fermented foods.

Taste

And of course, taste! Secondary metabolites produced during fermentation, such as lactic acid, esters and free amino acids, can significantly alter the flavor profile of the food. For example, in sourdough cultures, the balance of lactic acid bacteria and yeasts determines just how sour the bread gets: a higher proportion of lactic acid bacteria results in more sour bread.

Further, by adjusting the environmental conditions one can influence what metabolic pathways are favored, thereby encouraging production of different acids or esters, which also contribute to the flavor profile of the food.

Back to the basics: Alkaline fermentation

Typically, we think of fermentation as being an anaerobic process, but that is not always the case in "fermented" foods. In Western cuisine, fermented foods are typically acidic; however, fermented foods can also be produced through alkaline processes or have phases at different pHs. Acidic fermentation, fermentation that results in a decreased pH, is a highly studied process. However, less literature exists on alkaline fermentation, where amino acids are metabolized to ammonia, which raises the pH of the culture.

Foods that are alkaline fermented are typically more protein rich and commonly include legumes (such as soybeans) and seeds. For example, natto, dawadawa and kinema are all alkaline fermented foods made using legumes. Most alkaline ferments use native microbial communities, in which Bacillus species tend to be highly abundant after the cooking process enriches for them.

Natto, a fermented soybean product characterized by stringy polymers that bind the beans together, is influenced by microbial processes in multiple ways. The production of natto from soybeans occurs when Bacillus subtilis subspecies natto secretes proteases that break down the proteins of the soaked and cooked soybeans. Then, a secondary fermentation driven by glutamate dehydrogenases and ureases leads to natto's iconic ammonia smell.

Strings of natto are primarily made up of poly-gamma-glutamate (γ-PGA), an extracellular polymeric substance (EPS) of Bacillus subtilis (natto). The EPS derived from the production of natto is also of interest as a polymer for manufactured products, including biodegradable fibers and heavy-metal absorbers, among other uses.

Another Bacillus-driven alkaline ferment is the seed-based condiment from Western Africa, which goes by the names dawadawa, ogiri, soumbala, iru and netetu, among others. To produce dawadawa, locust bean seeds (or other large seeds) are boiled and dehulled, then fermented for a few days as flavor develops. Bacillus subtilis is often a primary organism involved in this fermentation; other involved organisms include members of the genera Leuconostoc, Staphylococcus and Micrococcus.

There is limited information about the role of fungi in this fermentation process. It is thought that the cooking process encourages the dominance of Bacilli because they can form heat-resistant spores, and inoculating the seeds with Bacillus subtilis spores improves the reproducibility of the fermentation process.

Kinema is produced in a similar manner as dawadawa, but with soybeans and in a different region—the Eastern Himalayas. Like natto, kinema fermentation forms stringy polymers, but to a slightly lesser extent than natto. The fermentation is also carried out by native microorganisms, primarily Bacillus species and other Firmicutes.

In terms of fungi, the main species are Wallemia canadensis and Pichia sporocuriosa, but their functional role is not well defined. As with other alkaline ferments, amino acids are released by the microbes providing human consumers a ready source of nutrients, as well as an umami or meat-like flavor.

Mold-driven fermentation of soybeans

Like alkaline ferments, many other foods with a higher protein content, such as beans and seeds, are fermented with molds. Mold-driven ferments are generally shaped by the action of just a few molds: Rhizopus, Mucor and Aspergillus.

Compared to many bacterial ferments, mold-based ferments are often fermented at somewhat higher temperatures. Propagation methods are also somewhat different than for bacterial cultures because it is often useful to wait for parts of the culture to sporulate in order to transfer to the next ferment.

Tempeh and koji are two soybean ferments mediated by molds. Tempeh production involves soaking, cooking and drying the beans, then inoculating with a starter culture of Rhizopus. Then, this mixture is formed into a shape (typically blocks) and covered with an air-permeable wrapper.

This process must be watched and mixed fairly carefully, as the interior of the fermenting tempeh can become too warm through the metabolism of the molds. As the mold grows, it forms a mycelium that binds the tempeh block together. This fermentation process usually takes between 1.5 and 3 days, depending on temperature and starting inoculum.

Rhizopus improves the soybean nutrient quality by breaking down proteins, making iron more bioavailable and removing toxic compounds. Koji, on the other hand, is typically made using Aspergillus species, but it is made from a variety of starting materials including rice and barley, in addition to soybeans. Koji is the starting culture for other products including miso, soy sauce and saké.

Interestingly some foods, like amazaké, use koji as a source of enzymes, rather than relying on growth of the mold itself. Amazaké relies on high temperatures to encourage enzyme activity to release sugars from rice to make a pudding or drink.

Fermentation broadens the range of foods we are able to eat and extends the time we have to eat them before they spoil. Different fermentation processes and organisms generate wildly different flavors, as highlighted here by the processes used to make natto, dawadawa and kinema, as well as tempeh and koji.
