‘Digging’ into early medieval Europe with big data
Peer-Reviewed Publication

During the middle of the sixth century CE, a dramatic transformation began in how the people of western Europe buried their dead. The transition from ‘furnished’ inhumation (burials with grave goods such as jewellery, dress accessories, tools and personal items) to ‘unfurnished’ inhumation (burials without grave goods) was widespread, and by the early eighth century unfurnished inhumation was by far the favoured method of burial.
This relatively swift change – spreading across almost the entirety of western Europe in c. 150 years – points to the interconnectedness of early medieval Europe, but is it as simple as that?
Dr Emma Brownlee, researcher at the McDonald Institute for Archaeological Research and Fellow of Girton College, University of Cambridge, has picked apart the nuances and regional variabilities in burial rites using a ‘Big Data’ study of over 26,000 graves. The study is published in the journal Internet Archaeology.
Dr Brownlee said: “It’s a really complex picture. Burial change across this period has long been viewed as a simple trajectory from a variable, furnished burial rite, to a much more standardised shrouded burial in a churchyard. But, there’s a lot more variation than that, both in the way grave good use changed over time, and the way in which it varied to begin with. It isn’t possible to come up with a simple narrative to explain why funerary rites look a certain way in different parts of Europe, because although there are broad tendencies in certain directions, there is also a huge amount of variation within regions.”
Highlighting an important detail of this study, Emma Brownlee continued: “You can see clear connections between Kent and northern France, not so much in how they’re using grave goods, but in the fact that they’re the only regions where grave good use doesn’t decline throughout the seventh century. In many ways, grave good use in Kent resembled other areas of England, with objects such as brooches, beads and knives often placed in graves, while objects such as vessels were more rarely used.”
“But, while grave good use gradually declined in other parts of England, Kentish graves continued to be richly furnished until the end of the seventh century, when those rich cemeteries were abandoned. This is exactly what we see in northern France; despite a quite different funerary culture - with much more focus on vessels in graves - rich furnishing continues to be common long after it began declining in other areas. This suggests that those two regions are much more culturally interconnected with each other than they are with other surrounding areas, while the rest of England has a greater affinity with places like southern Germany.”
Dr Brownlee further concluded: “Essentially, there is no way of demonstrating that a certain combination of grave goods indicates one regional tradition over another. Instead, we see a funerary rite that is influenced by the choices others in a community are making, influenced by the choices made in the surrounding communities, influenced by the identity of the deceased. This is ultimately something that is highly personal, and based on decisions made in the emotionally charged circumstances surrounding a death. Other aspects of a funeral, such as the choice of inhumation or cremation, or the use of a coffin, stone settings, or a plain, earth-cut grave, were most likely influenced by similar decisions.”
“While regional trends, and trends based on identity do exist, this should not be used to obscure the variability within those trends. Every burial was unique and was probably quite personal to the families.”
###
Grave Goods in Early Medieval Europe: regional variability and decline by Emma Brownlee is published in Internet Archaeology.
JOURNAL
Internet Archaeology
DOI
10.11141/ia.56.11
METHOD OF RESEARCH
Meta-analysis
SUBJECT OF RESEARCH
People
ARTICLE TITLE
Grave Goods in Early Medieval Europe: regional variability and decline
ARTICLE PUBLICATION DATE
29-Jul-2021
Paper: Some birds steal hair from living mammals
Peer-Reviewed Publication
UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN
A new paper in the journal Ecology documents this phenomenon, which the authors call “kleptotrichy,” from the Greek roots for “theft” and “hair.” The authors found only a few descriptions of the behavior in the scientific literature but came up with dozens more examples in online videos posted by birders and other bird enthusiasts. In almost all the video-recorded cases, the thief is a titmouse plucking hair from a cat, dog, human, raccoon or, in one case, porcupine.
Many species of titmice, chickadees and tits – all members of the family Paridae – are known to use hair or fur to line their nests, said Mark Hauber, a professor of evolution, ecology and behavior at the University of Illinois Urbana-Champaign who led the write-up with postdoctoral researcher Henry Pollock. The hair’s role in the nest is still debated, although it is more commonly used by birds nesting in temperate climates, so maintaining warmth in the nest is thought to be one advantage.
The impetus for the study came from a chance sighting. Study co-author U. of I. natural resources and environmental sciences professor Jeffrey Brawn first observed the behavior with Pollock while on a spring bird count in central Illinois.
Scientists once assumed that birds with hair in their nests had collected it from the carcasses of dead mammals or found hair that had been shed into the environment, Brawn said.
“But the titmouse I saw was plucking hair from a live animal,” he said. “This was from a live raccoon with claws and teeth. And the raccoon didn’t seem to mind because it didn’t even wake up.”
A search of published research turned up just nine studies that documented 11 instances of the behavior, but a YouTube search yielded many more cases, Hauber said. Sometimes the target animals were sleeping, and sometimes they were awake.
“We know, of course, that birds use a variety of materials to line their nests,” Hauber said. “But why are these birds risking their lives to approach these mammals?”
The behavior suggests that the benefit of lining its nest with hair outweighs the danger to the bird, he said.
It may be that the birds simply need the hair to insulate their nests, but the presence of mammal hair – and the associated odor of the mammal – could also deter nest predators like snakes or other birds, the researchers said.
“There’s a local species called the great crested flycatcher, which, like the titmouse, is a cavity nester, that actually puts shed snakeskins into its nest, possibly to deter predators,” Brawn said.
“There are finches in Africa that put mammalian fecal material on top of their enclosed nests, presumably to confuse and thus keep predators away,” Hauber said.
The hair also may repel nest and nestling parasites, which are a common threat to chick survival, especially in cavity nests like those of titmice, he said.
Regardless of the purpose of the behavior, the new paper is the first to document so many examples of hair-plucking by birds in a single report. In addition to citing nine papers about the phenomenon, it also links to dozens of online videos. Collectively, the videos show titmice – and in one case, a black-capped chickadee – plucking hair from 47 humans, 45 dogs, three cats, three raccoons and a porcupine.
“Unexpected interactions such as these remind us that animals exhibit all types of interesting and often overlooked behaviors and highlight the importance of careful natural history observations to shed light on the intricacies of ecological communities,” Pollock said.
Editor’s notes:
To reach Henry Pollock, email henry.s.pollock@gmail.com.
To reach Mark Hauber, email mhauber@illinois.edu.
To reach Jeffrey Brawn, email jbrawn@illinois.edu.
The paper “What the pluck? Theft of mammal hair by birds is an overlooked but common behavior with fitness implications” is available online and from the U. of I. News Bureau.
JOURNAL
Ecology
DOI
10.1002/ecy.3501
METHOD OF RESEARCH
Literature review
SUBJECT OF RESEARCH
Animals
ARTICLE TITLE
What the pluck? Theft of mammal hair by birds is an overlooked but common behavior with fitness implications
ARTICLE PUBLICATION DATE
27-Jul-2021
Golden age of genre films leads Australian industry revival
Despite being often dismissed by critics, genre movies are leading a box-office boom for Australian films
Book Announcement
QUEENSLAND UNIVERSITY OF TECHNOLOGY
The Australian film industry is in the middle of a renaissance, with locally produced popular genre movies riding a wave of success, from action, comedy and sci-fi films to crime, thrillers, and our unique horror productions, says a QUT researcher.
Associate Professor Mark Ryan, a lecturer in film, screen and animation, and a chief investigator with QUT’s Digital Media Research Centre, argues the study of film genre has been long overlooked in Australian film studies and deserves more attention.
He adds that despite some exceptions, critics in the past have largely relegated local genre films to B-movie status and celebrated Australian movies that contributed to a sense of national identity and history; films like Sunday Too Far Away, Breaker Morant, and Gallipoli.
“Since the mid-1970s, genre films made largely for a popular, commercial audience have held a tenuous position in industry practices, film culture, policy frameworks and public funding initiatives but we are now in a golden age of genre cinema,” said Professor Ryan who, along with Associate Professor Kelly McWilliam from the University of Southern Queensland, has co-edited a new book, Australian Genre Film (Routledge).
“The 1980s was a productive period for genre films but the 1990s Australian output was dominated by independent art films, dramas and comedies. National film policy prioritised the development of an industry contributing to the screening and preservation of Australian culture, identity, and history. Genre films were not seen to fit that brief.
“Between the mid-2000s and 2020, however, there has been a surge in action, adventure, horror and science fiction films. Our filmmakers are turning out more movies across a broad spectrum of popular movie types, from low budget to blockbuster productions, including some that blend more than one genre, such as Mystery Road (thriller and crime) or Wolf Creek (horror and road movie).”
According to Professor Ryan, the current boom in genre filmmaking began after the federal government created Screen Australia, which introduced critical core production and post-production finance incentives, and it shows no sign of slowing down.
“A key role of Screen Australia was to support ‘commercially focused’ films with the potential to appeal to audiences and it has certainly succeeded in this. Between 2007 and 2017, its Producer Offset contributed over AU$900 million in rebates to producers in support of 291 features,” he said.
“Eight of the top 20 highest-earning Australian feature films of all time were produced in the last decade, and each of these films has a dominant primary genre or hybrid generic identity. They include Mao’s Last Dancer (biopic), The Sapphires (musical), The Great Gatsby (drama/romance), The Water Diviner (war), The Dressmaker (revenge/thriller) and Mad Max: Fury Road (action/road).
“Previously, commercial movie genres associated with Hollywood cinema such as action, horror, thrillers, and the musical were viewed by film critics as American genres to the extent that it became ‘unacceptable’ to make Australian films with Hollywood generic conventions.
“The rise of the genre film has also been fuelled by an explosion in guerrilla filmmaking thanks to greater availability of low-cost high-definition cameras and digital editing equipment alongside the proliferation of video on demand platforms, as well as increased production of transnational cinema.”
Professor Ryan said while the study of movie genres was a well-established subfield of film studies, the focus has overwhelmingly been on Hollywood. His book turns the lens to Australia and analyses 13 significant Australian genres: action, biopics, comedy, crime, horror, musical, road movie, romance, science fiction, teen, thriller, war, and the Western.
His own field of expertise is horror which he covers in the chapter A Monstrous Landscape Filled with Killer Animals and Madmen: Tropes of Contemporary Australian Horror Movies.
“Unlike other major movie genres such as fantasy, action, and science fiction, which have struggled to gain a foothold in the Australian feature film industry until quite recently, the horror genre has been an enduring genre for independent filmmakers over the last 50 years,” he said.
“In the last 20 years, purely in terms of the volume of films produced, the horror genre has become a far more significant form of production for the Australian film industry even if it remains largely a cult genre.
“Since the mid-2000s, horror has become both a staple and, at times, a commercially viable genre for low-budget filmmakers targeting international markets. In this time, the Australian film industry has produced several high-profile horror movies that circulated widely in international cinema markets, such as Bait, Saw, Daybreakers and Wolf Creek.
“However, the genre is often still dismissed by reviewers as unworthy of serious critical attention unless a film can be considered within a more reputable critical context; like The Babadook which was internationally acclaimed and seen as art-horror when it came out in 2014,” he said.
Australian Genre Film is published by Routledge.
New research infuses equity principles into the algorithm development process
NYU TANDON SCHOOL OF ENGINEERING
BROOKLYN, New York, Thursday, July 29, 2021 – In the U.S., the place where one was born, one’s social and economic background, the neighborhoods in which one spends one’s formative years, and where one grows old are factors that account for a quarter to 60% of deaths in any given year, partly because these forces play a significant role in occurrence and outcomes for heart disease, cancer, unintentional injuries, chronic lower respiratory diseases, and cerebrovascular diseases — the five leading causes of death.
While data on such “macro” factors is critical to tracking and predicting health outcomes for individuals and communities, analysts who apply machine-learning tools to health outcomes tend to rely on “micro” data constrained to purely clinical settings and driven by healthcare data and processes inside the hospital, leaving factors that could shed light on healthcare disparities in the dark.
Researchers at the NYU Tandon School of Engineering and NYU School of Global Public Health (NYU GPH), in a new perspective, “Machine learning and algorithmic fairness in public and population health,” in Nature Machine Intelligence, aim to activate the machine learning community to account for “macro” factors and their impact on health. Thinking outside the clinical “box” and beyond the strict limits of individual factors, Rumi Chunara, associate professor of computer science and engineering at NYU Tandon and of biostatistics at the NYU GPH, found a new approach to incorporating the larger web of relevant data for predictive modeling for individual and community health outcomes.
“Research of what causes and reduces equity shows that to avoid creating more disparities it is essential to consider upstream factors as well,” explained Chunara. She noted, on the one hand, the large body of work on AI and machine learning implementation in healthcare in areas like image analysis, radiography, and pathology, and on the other the strong awareness and advocacy focused on such areas as structural racism, police brutality, and healthcare disparities that came to light around the COVID-19 pandemic.
“Our goal is to take that work and the explosion of data-rich machine learning in healthcare, and create a holistic view beyond the clinical setting, incorporating data about communities and the environment.”
Chunara, along with her doctoral students Vishwali Mhasawade and Yuan Zhao, at NYU Tandon and NYU GPH, respectively, leveraged the Social Ecological Model, a framework for understanding how the health, habits and behavior of an individual are affected by factors such as public policies at the national and international level and availability of health resources within a community and neighborhood. The team shows how principles of this model can be used in algorithm development to show how algorithms can be designed and used more equitably.
The researchers organized existing work into a taxonomy of the types of tasks for which machine learning and AI are used that span prediction, interventions, identifying effects and allocations, to show examples of how a multi-level perspective can be leveraged. In the piece, the authors also show how the same framework is applicable to considerations of data privacy, governance, and best practices to move the healthcare burden from individuals, toward improving equity.
As an example of such approaches, members of the same team recently presented at the AAAI/ACM Conference on Artificial Intelligence, Ethics and Society a new approach called “causal multi-level fairness,” which draws on this larger web of relevant data to assess the fairness of algorithms. This work builds on the field of “algorithmic fairness,” which, to date, has been limited by its exclusive focus on individual-level attributes such as gender and race.
In this work Mhasawade and Chunara formalized a novel approach to understanding fairness relationships using tools from causal inference, synthesizing a means by which an investigator could assess and account for effects of sensitive macro attributes and not merely individual factors. They developed the algorithm for their approach and provided the settings under which it is applicable. They also illustrated their method on data showing how predictions based merely on data points associated with labels like race, income and gender are of limited value if sensitive attributes are not accounted for, or are accounted for without proper context.
“As in healthcare, algorithmic fairness tends to be focused on labels — men and women, Black versus white, etc. — without considering multiple layers of influence from a causal perspective to decide what is fair and unfair in predictions,” said Chunara. “Our work presents a framework for thinking not only about equity in algorithms but also what types of data we use in them.”
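The label-focused approach Chunara describes can be sketched as a simple demographic-parity check, the kind of single-attribute metric the authors argue is insufficient without a causal, multi-level view. The function, data, and group labels below are our own illustration, not from the paper.

```python
# A minimal demographic-parity gap: the difference in positive-prediction
# rate between groups defined by a single sensitive label. This is the
# "label-based" style of fairness check the authors critique as incomplete.
def demographic_parity_gap(predictions, groups):
    """Return max minus min positive-prediction rate across groups."""
    rates = {}
    for g in set(groups):
        idx = [i for i, grp in enumerate(groups) if grp == g]
        rates[g] = sum(predictions[i] for i in idx) / len(idx)
    vals = sorted(rates.values())
    return vals[-1] - vals[0]

# Made-up example: group "a" receives positive predictions 75% of the time,
# group "b" only 25% of the time.
preds  = [1, 0, 1, 1, 0, 0, 1, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
print(demographic_parity_gap(preds, groups))  # → 0.5
```

A multi-level, causal analysis would instead ask *why* the rates differ, tracing upstream community- and policy-level influences rather than comparing labels alone.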
The study, “Machine learning and algorithmic fairness in public and population health,” is available at https://www.nature.com/articles/s42256-021-00373-4
About the New York University Tandon School of Engineering
The NYU Tandon School of Engineering dates to 1854, the founding date for both the New York University School of Civil Engineering and Architecture and the Brooklyn Collegiate and Polytechnic Institute. A January 2014 merger created a comprehensive school of education and research in engineering and applied sciences as part of a global university, with close connections to engineering programs at NYU Abu Dhabi and NYU Shanghai. NYU Tandon is rooted in a vibrant tradition of entrepreneurship, intellectual curiosity, and innovative solutions to humanity’s most pressing global challenges. Research at Tandon focuses on vital intersections between communications/IT, cybersecurity, and data science/AI/robotics systems and tools and critical areas of society that they influence, including emerging media, health, sustainability, and urban living. We believe diversity is integral to excellence, and are creating a vibrant, inclusive, and equitable environment for all of our students, faculty and staff. For more information, visit engineering.nyu.edu.
###
JOURNAL
Nature Machine Intelligence
DOI
10.1038/s42256-021-00373-4
METHOD OF RESEARCH
Literature review
SUBJECT OF RESEARCH
Not applicable
ARTICLE TITLE
Machine learning and algorithmic fairness in public and population health
ARTICLE PUBLICATION DATE
29-Jul-2021
Lehigh University Computer science and engineering researcher Roberto Palmieri wins NSF CAREER award to take RDMA technology to the next level
LEHIGH UNIVERSITY
IMAGE: NSF CAREER AWARD WINNER ROBERTO PALMIERI IS AN ASSISTANT PROFESSOR OF COMPUTER SCIENCE AND ENGINEERING IN LEHIGH UNIVERSITY'S P.C. ROSSIN COLLEGE OF ENGINEERING AND APPLIED SCIENCE.
CREDIT: LEHIGH UNIVERSITY
We are not a species that likes to wait. Especially when it comes to our online demands—we want instant responses to our queries, and immediate confirmation of our posts.
Meeting such expectations requires distributed computing systems capable of meeting demand while still preserving the integrity of the data they are providing. Distributed systems enable resource sharing in the form of hardware, software, or data, and comprise multiple machines connected through a network. The internet is the largest, best known example; others include social networks, online gaming, and e-commerce.
Such systems must perform innumerable complex interactions—fast—for potentially millions of users, without ruining the data. In other words, a travel site has to accommodate simultaneous requests for a flight, hotel, and rental car from millions of travelers, without screwing any of those requests up. If a site is at all glitchy or slow, users will go somewhere else.
Improving that speed is at the heart of Roberto Palmieri’s research. Palmieri, an assistant professor of computer science and engineering in Lehigh University’s P.C. Rossin College of Engineering and Applied Science, recently won support from the National Science Foundation’s Faculty Early Career Development (CAREER) program for his proposal to optimize the technology known as Remote Direct Memory Access (RDMA) to better serve the massive number of internet-user requests.
The prestigious NSF CAREER award is given annually to junior faculty members across the U.S. who exemplify the role of teacher-scholars through outstanding research, excellent education, and the integration of education and research. Each award provides stable support at the level of approximately $500,000 for a five-year period.
“The general idea is that we have a lot of data within a given system, and this data doesn’t fit on a single machine,” says Palmieri. “It’s distributed on multiple machines, and there are operations working on this shared data. Ten, twenty years ago, a certain level of performance was good enough, but now there are so many services available on the internet, the infrastructure has to keep up with this increased workload. We want to make the operations performed by those machines go as fast as possible.”
RDMA is a fairly recent technology that changed the way computers communicate. At a basic level, traditional communication involved one machine sending a request to another for a particular service. The second machine had to devote resources to processing and responding to the message, and that all took time. RDMA disrupted that pattern.
“So now, if a machine wants something from another machine, it will not ask for it,” he says. “It will just take it by interacting directly with that machine’s RDMA card. Which means that, instead of spending resources handling the message, the machine can focus on its specific business application. With RDMA, we’re talking about tens of nanoseconds for two machines to interact. Whereas before, we were talking about tens or hundreds of milliseconds. If you’re posting something on social media, and one interaction takes hundreds of milliseconds, and you need 10 interactions, the user is now waiting nearly a second, and starting to think, Why am I waiting so long?”
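The latency figures in the quote imply the back-of-envelope arithmetic below. The numbers are the article's rough orders of magnitude, not measurements.

```python
# Cumulative latency for one user action that needs 10 machine-to-machine
# interactions: traditional messaging (~hundreds of ms each) versus
# one-sided RDMA (~tens of ns each). Figures are illustrative only.
INTERACTIONS = 10                  # e.g. interactions behind one social-media post

traditional_ms = 100               # ~hundreds of milliseconds per message exchange
rdma_ns = 50                       # ~tens of nanoseconds per RDMA access

total_traditional_s = INTERACTIONS * traditional_ms / 1_000
total_rdma_s = INTERACTIONS * rdma_ns / 1_000_000_000

print(f"traditional: {total_traditional_s:.1f} s")     # traditional: 1.0 s
print(f"RDMA: {total_rdma_s * 1e6:.1f} microseconds")  # RDMA: 0.5 microseconds
```

The gap spans roughly six orders of magnitude, which is why a user notices the first case and never the second.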
And when it comes to businesses competing for users, timing is everything.
Palmieri equates the difference between pre-RDMA days and now to using snail mail versus email. If you had to mail a letter and then wait for a response, you might not ask certain questions in that letter.
“If I have to decide whether I should put salt on my pasta, I’ll send you an email because I know that in a minute, you can answer.”
RDMA is a superfast delivery system. But it’s one that Palmieri intends to make even faster. In part, by going back to a long-held theory.
Before the arrival of RDMA, researchers had theorized that one way to speed up communication between machines would be to migrate required data from the computer that has it, to the one that wants it. That way, the next time a machine needed something, it didn’t have to ask for it. With the data stored locally, it could perform operations quicker. But at the time, says Palmieri, such migration couldn’t be done efficiently. Once RDMA was developed, retrieving data became so fast and cheap (in terms of performance cost) that migration no longer seemed necessary.
“People said, ‘I’m just going to go and get memory whenever I need it.’ What I’m saying is, ‘No, let’s go back to what we knew was optimal before, which was migrating memory to a local node,’” he says. “Let us redesign that software component, called the directory, that allows memory to move and traces where it is in the system. If we can move this memory efficiently, then basically every machine can interact with memory that is local. Subsequent requests for operations will then not even trigger a remote operation; it will all be done locally, which is shown to have the best performance. It’s at least one order of magnitude faster than even an RDMA operation.”
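The directory idea Palmieri describes can be sketched as a toy bookkeeping structure: track which node currently owns each memory block, and migrate ownership to the requester so that repeat accesses become local. The class and method names below are our own illustration, not Palmieri's actual design.

```python
# Toy "directory" sketch: a map from memory block to its current owner node.
# On each access, ownership migrates to the requesting node, so subsequent
# accesses by that node hit local memory instead of triggering remote I/O.
class Directory:
    def __init__(self):
        self.owner = {}  # block_id -> node_id currently holding the block

    def access(self, block_id, node_id):
        prev = self.owner.get(block_id)
        self.owner[block_id] = node_id  # migrate ownership to the requester
        return "local" if prev == node_id else "remote (migrated)"

d = Directory()
d.access("block-1", "node-A")         # first touch: remote, block moves to node-A
print(d.access("block-1", "node-A"))  # → local
print(d.access("block-1", "node-B"))  # → remote (migrated), block moves to node-B
```

The real engineering challenge, per the proposal, is making that migration step cheap enough that the locality payoff is not eaten by the cost of moving the data.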
To do this, Palmieri and his team plan to redesign algorithms and protocols to fully exploit the capabilities of RDMA. Everything they produce will eventually become open-source, so others can build on it. A portion of Palmieri’s proposal is also directed at sparking more interest among students in computer systems.
“Getting students excited about something that’s intangible is hard,” he says. “To work on systems, students need to learn a lot of advanced concepts. How to work with the hardware and the operating system. You have to understand algorithms and protocols. So even though the ability to build infrastructure and software systems is in high demand, I attribute the lack of enthusiasm for the field to these barriers. You need so much knowledge before you can even start to get excited.”
To stoke interest, he’ll produce software that will allow students to see the potential in accessing hundreds of machines with just a few lines of code and truly appreciate nanosecond speed.
For Palmieri and his team, the potential to realize an outcome that was once theoretical is beyond exciting. And getting to this point, he says, would have been impossible without the ingenuity of his own students.
“This is a collective work. And it’s very unexplored. We had these continuous brainstorming sessions where we were trying to figure out something that no one else has ever done, and they were crazy good,” he says. “I get to do the talks and the interviews, but the students are at the core of the actual work.”
###
About Roberto Palmieri
Roberto Palmieri is an assistant professor in the Department of Computer Science and Engineering at Lehigh University, where he co-leads the Scalable Systems Software (SSS) Research Group. He joined the faculty of the P.C. Rossin College of Engineering and Applied Science in 2017, and was previously a research assistant professor at Virginia Tech. He earned his PhD, MS, and BS in computer engineering from Sapienza University of Rome.
Palmieri’s research interests focus on different aspects of concurrency, synchronization, data structures, distributed computing, heterogeneous systems, key-value stores, and distributed systems, spanning from theory to practice. He is passionate about designing and implementing synchronization protocols optimized for a wide range of deployments, from multicore architectures to cluster-scale and geo-distributed infrastructures.
Related Links:
Rossin College Faculty Profile: Roberto Palmieri
NSF Faculty Early Career Development (CAREER) program
NSF Award Abstract (2045976): "Distributed Protocols and Primitives Optimized for RDMA Read/Write Operations in Data Management"
Lehigh University: Roberto Palmieri
Lehigh University: Scalable Systems Software (SSS) Research Group
Lehigh University: Institute for Data, Intelligent Systems, and Computation
Phytoplankton Dicrateria rotunda synthesizes hydrocarbons equivalent to petroleum
Peer-Reviewed Publication
TOYOHASHI UNIVERSITY OF TECHNOLOGY (TUT)
Key points
We discovered that Dicrateria rotunda (D. rotunda), a species of Haptophyte, can synthesize hydrocarbons equivalent to petroleum (saturated hydrocarbons with a carbon number ranging from 10 to 38). No organism capable of petroleum production has been previously reported.
All eleven Dicrateria strains examined in this study, including ARC1, which was obtained during a science cruise in the Arctic Ocean, were found to be capable of synthesizing a series of saturated hydrocarbons, indicating that this synthetic capability is common to the entire genus Dicrateria.
The saturated hydrocarbon content of the D. rotunda ARC1 increased under dark and nitrogen-deficient conditions. Understanding the physiological function and synthesis pathways of these saturated hydrocarbons may contribute to the development of biofuels in the future.
Overview
Director-General Naomi Harada and colleagues from the Research Institute for Global Change at the Japan Agency for Marine-Earth Science and Technology, in collaboration with Assistant Professor Yuu Hirose from Toyohashi University of Technology and Specially Appointed Professor Kazuyoshi Murata from the National Institute for Physiological Sciences, discovered that the phytoplankton Dicrateria rotunda (D. rotunda) can synthesize a series of saturated hydrocarbons with a carbon number ranging from 10 to 38*.
A phytoplankton community was collected from seawater of the Chukchi Sea during a science cruise of the research vessel “Mirai” in the Arctic Ocean in 2013, from which we isolated and cultured the Arctic strain of D. rotunda, ARC1. ARC1 contained a series of saturated hydrocarbons with a carbon number ranging from 10 to 38, which are categorized as petrol (carbon number 10 to 15), diesel oils (carbon number 16 to 20), and fuel oils (carbon number 21 or higher). Moreover, we examined ten additional strains of Dicrateria stored in culture collections, all of which were found to be similarly capable of hydrocarbon synthesis, indicating that this was common to the entire Dicrateria genus. This study is the first to report on an organism with the capability to synthesize hydrocarbons equivalent to petroleum.
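The carbon-number categories described above can be expressed as a small helper function. The function name and the handling of values outside the reported C10-C38 range are our own illustration.

```python
# Categorize a saturated hydrocarbon by carbon number, following the
# ranges given in the press release: petrol C10-C15, diesel oil C16-C20,
# fuel oil C21 and above; D. rotunda's reported range is C10-C38.
def classify_alkane(carbon_number: int) -> str:
    if not 10 <= carbon_number <= 38:
        return "outside reported range"
    if carbon_number <= 15:
        return "petrol"
    if carbon_number <= 20:
        return "diesel oil"
    return "fuel oil"

print(classify_alkane(12))  # → petrol
print(classify_alkane(18))  # → diesel oil
print(classify_alkane(30))  # → fuel oil
```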
The capability of the ARC1 strain to synthesize saturated hydrocarbons was shown to increase depending on the environmental conditions, and the findings of this study are expected to contribute to the development of biofuels in the future.
This study was supported by JSPS Grants-in-Aid for Scientific Research JP22221003 and JP15H05712.
Reference:
Naomi Harada, Yuu Hirose, Song Chihong, Hirofumi Kurita, Miyako Sato, Jonaotaro Onodera, Kazuyoshi Murata, Fumihiro Itoh (2021) “A novel characteristic of a phytoplankton as a potential source of straight-chain alkanes” Sci. Rep. 11, 14190, 10.1038/s41598-021-93204-w.
Supplemental information:
* Saturated hydrocarbons: Organic compounds composed of carbon and hydrogen. Saturated hydrocarbon with the smallest mass number is methane (CH4), which has a carbon number of one.
JOURNAL
Scientific Reports
DOI
10.1038/s41598-021-93204-w
CAPTION
a) Photographs of the Arctic strain ARC1 of D. rotunda captured by bright field microscopy (upper left), fluorescence microscopy (lower left), and electron microscopy (right). b) A 3D structure of ARC1 cell reconstructed from multiple electron microscope images.
CREDIT
COPYRIGHT (C) TOYOHASHI UNIVERSITY OF TECHNOLOGY. ALL RIGHTS RESERVED.
CAPTION
a) Gas chromatogram of hydrocarbons extracted from the Arctic strain ARC1 of D. rotunda. b) Amount of C10-C38 saturated hydrocarbons in 11 strains of D. rotunda. c) Amount of C10-C38 saturated hydrocarbons in the ARC1 strain cultured under different conditions. *Error bars (standard deviation)
Solar-powered microbes to feed the world?
International research team shows that protein from microbes uses a fraction of the resources of conventional farming
Peer-Reviewed Publication
UNIVERSITY OF GÖTTINGEN
Microbes have played a key role in our food and drink – from cheese to beer – for millennia, but their impact on our nutrition may soon become even more important. The world faces growing food challenges as the human population continues to increase alongside its demand for resource-intensive animal products. If these needs are to be met solely by conventional agriculture, the environmental cost will be huge. An international research team led by a Göttingen University alumnus has now shown that using solar panels to power the production of microbial protein – which is rich not just in protein but also in other nutrients – is more sustainable, efficient and environmentally friendly than growing conventional crops. The method requires only solar energy, land, nutrients, and carbon dioxide from the air. The research was published in PNAS.
Using computer simulations drawing directly from laboratory results, the researchers modelled large-scale microbial food production facilities, which use solar energy, air, water, and nutrients to grow microbes. The protein-rich biomass is harvested and processed, and the resulting powder can be used as feed for animals, or as food for people. The study carried out an analysis of the energy requirements for each step, from the very start to the end product, taking into account: electricity generation (from solar panels), electrochemical production of energy-rich substrate for the microbes, microbe cultivation, harvesting, and processing the protein-rich biomass. Several types of microbes and growth strategies were compared in order to identify the most efficient.
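The end-to-end analysis described above amounts to chaining the efficiency of each conversion step, since losses compound multiplicatively from sunlight to final product. The sketch below illustrates that structure only; the step efficiencies are illustrative placeholders, not values from the paper:

```python
# Hypothetical step efficiencies for the solar-to-protein chain
# (placeholders for illustration; the study's actual figures differ).
steps = {
    "solar_panel": 0.20,        # sunlight -> electricity
    "electrochemistry": 0.50,   # electricity -> energy-rich substrate
    "microbial_growth": 0.40,   # substrate -> protein-rich biomass
    "harvest_processing": 0.90, # biomass -> protein powder
}

# Losses compound: overall efficiency is the product of all steps.
overall = 1.0
for step, efficiency in steps.items():
    overall *= efficiency

print(f"overall sunlight-to-protein efficiency: {overall:.3f}")
```

Framing the pipeline this way makes clear why the weakest step dominates: doubling any single efficiency doubles the overall yield, which is how the researchers could compare microbe types and growth strategies to identify the most efficient configuration.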
The study found that for each kilo of protein produced, solar-powered microbes require only 10% of the land area compared to even the most efficient plant crop – soybean. The study calculated that even in northern climates with less sunshine, the yields of solar-powered microbial foods could far outproduce staple crops, while minimizing water and fertilizer use. Importantly, this production could also be located in regions not suitable for agriculture, such as deserts.
In previous research, the protein from these types of microbes has shown beneficial effects when fed to livestock and is already produced at large scale in the EU. “We expect that microbial protein will also be beneficial as a supplement to our diets, since it provides a high-quality protein source composed of all essential amino acids, as well as vitamins and minerals,” explains first author Dorian Leger, who carried out the work in the MPI of Molecular Plant Physiology while studying at the University of Göttingen, along with colleagues from Italy and Israel. “This technology has the potential to support food production while preventing damage to the environment. Current farming methods contribute to polluted ecosystems and depleted water reserves worldwide.”
At the moment, 30-40% of the Earth’s land is used for farming, yet one in ten people is undernourished. Leger says: “Integrating the cultivation of nutrient-rich microbes with renewable energy systems, such as solar panels, has the potential to produce more food with fewer resources. This could free up vast amounts of agricultural land and prevent further destruction of natural ecosystems, making a valuable contribution to conservation and sustainability while promoting food availability globally.”
Contact:
Dorian Leger
Email: dorian.dleger@gmail.com
Tel: +49 1520 571 9379
DOI
10.1073/pnas.2015025118
METHOD OF RESEARCH
Computational simulation/modeling
SUBJECT OF RESEARCH
Cells
ARTICLE TITLE
Photovoltaic-driven microbial protein production can use land and sunlight more efficiently than conventional crops
ARTICLE PUBLICATION DATE
28-Jun-2021