Wednesday, May 06, 2020


Police stop fewer black drivers at night when a 'veil of darkness' obscures their race

The Stanford-led study also found that when drivers were pulled over, officers searched the cars of blacks and Hispanics more often than whites
STANFORD SCHOOL OF ENGINEERING

The finding that officers stop black drivers less often after dark, when a 'veil of darkness' obscures their race, is one of several examples of systematic bias that emerged from a five-year study that analyzed 95 million traffic stop records, filed by officers with 21 state patrol agencies and 35 municipal police forces from 2011 to 2018.
The Stanford-led study also found that when drivers were pulled over, officers searched the cars of blacks and Hispanics more often than whites. The researchers also examined a subset of data from Washington and Colorado, two states that legalized marijuana, and found that while this change resulted in fewer searches overall, and thus fewer searches of blacks and Hispanics, minorities were still more likely than whites to have their cars searched after a pull-over.
"Our results indicate that police stops and search decisions suffer from persistent racial bias, and point to the value of policy interventions to mitigate these disparities," the researchers write in the May 4th issue of Nature Human Behaviour.
The paper culminates a five-year collaboration between Stanford's Cheryl Phillips, a journalism lecturer whose graduate students obtained the raw data through public records requests, and Sharad Goel, a professor of management science and engineering whose computer science team organized and analyzed the data.
Goel and his collaborators, who included Ravi Shroff, a professor of applied statistics at New York University, spent years combing through the data, eliminating records that were incomplete or from the wrong time periods, to create the 95 million-record database that was the basis for their analysis. "There is no way to overstate the difficulty of that task," Goel said.
Creating that database enabled the team to find the statistical evidence that a "veil of darkness" partially immunized blacks against traffic stops. That term and idea have been around since 2006, when they were used in a study comparing the race of 8,000 drivers in Oakland, California, who were stopped at any time of day or night over a six-month period. But the findings from that study were inconclusive because the sample was too small to prove a link between the darkness of the sky and the race of the stopped drivers.
The Stanford team decided to repeat the analysis using the much larger dataset that they had gathered. First, they narrowed the range of variables they had to analyze by choosing a specific time of day - around 7 p.m. - when the probable causes for a stop were more or less constant. Next, they took advantage of the fact that, in the months before and after daylight saving time each year, the sky gets a little darker or lighter, day by day. Because they had such a massive database, the researchers were able to find 113,000 traffic stops, from all of the locations in their database, that occurred on those days, before or after clocks sprang forward or fell back, when the sky was growing darker or lighter at around 7 p.m. local time.
This dataset provided a statistically valid sample with two important variables - the race of the driver being stopped, and the darkness of the sky at around 7 p.m. The analysis left no doubt that the darker it got, the less likely it became that a black driver would be stopped. The reverse was true when the sky was lighter.
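For readers who want to try this kind of test on similar data, a minimal sketch in Python might look like the following. The file name, column names and model specification are hypothetical illustrations of the veil-of-darkness idea, not the authors' actual analysis code.

    # Minimal veil-of-darkness sketch (illustrative; hypothetical columns).
    # Assumes one row per stop, with zero-padded "HH:MM" stop times, a
    # precomputed is_dark flag (from local sunset tables), driver_race,
    # and a location identifier.
    import pandas as pd
    import statsmodels.formula.api as smf

    stops = pd.read_csv("stops.csv")  # hypothetical file

    # Keep stops near 7 p.m., where darkness varies around daylight saving
    # transitions while clock time, and thus traffic patterns, stay fixed.
    window = stops[stops["stop_time"].between("18:30", "19:30")]
    window = window[window["driver_race"].isin(["black", "white"])].copy()
    window["is_black"] = (window["driver_race"] == "black").astype(int)

    # If darkness hides race, the odds that a stopped driver is black
    # should fall when it is dark at the same clock time.
    model = smf.logit("is_black ~ is_dark + C(location)", data=window).fit()
    print(model.params["is_dark"])  # negative if darkness lowers the odds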
More than any single finding, the collaboration's most lasting impact may come from the Stanford Open Policing Project, which the researchers created to make their data available to investigative and data-savvy reporters, and through which they hold workshops that teach reporters how to use the data for local stories.
For example, the researchers helped reporters at InvestigateWest, a Seattle-based non-profit news organization, understand patterns in the data for stories showing bias in police searches of Native Americans. That reporting prompted the Washington State Patrol to review its practices and boost officer training. Similarly, the researchers helped reporters at the Los Angeles Times analyze data showing that police searched minority drivers far more often than whites. The result was a story that formed part of a larger investigative series that prompted changes in Los Angeles Police Department practices.
"All told we've trained about 200 journalists, which is one of the unique things about this project," Phillips said.
Goel and Phillips plan to continue collaborating through a project called Big Local News that will explore how data science can shed light on public issues, such as civil asset forfeitures - instances in which law enforcement is authorized to seize and sell property associated with a crime. Gathering and analyzing records of when and where such seizures occur, to whom, and how the property is disposed of will help show how this practice is being used. Big Local News is also working on collaborative efforts to standardize information from police disciplinary cases.
"These projects demonstrate the power of combining data science with journalism to tell important stories," Goel said.
###
Study finds stronger links between automation and inequality

Job-replacing tech has directly driven the income gap since the late 1980s, economists report.



MASSACHUSETTS INSTITUTE OF TECHNOLOGY


This is part 3 of a three-part series examining the effects of robots and automation on employment, based on new research from economist and Institute Professor Daron Acemoglu.

CAMBRIDGE, Mass. -- Modern technology affects different workers in different ways. In some white-collar jobs -- designer, engineer -- people become more productive with sophisticated software at their side. In other cases, forms of automation, from robots to phone-answering systems, have simply replaced factory workers, receptionists, and many other kinds of employees.

Now a new study co-authored by an MIT economist suggests automation has a bigger impact on the labor market and income inequality than previous research would indicate -- and identifies the year 1987 as a key inflection point in this process, the moment when jobs lost to automation stopped being replaced by an equal number of similar workplace opportunities.

"Automation is critical for understanding inequality dynamics," says MIT economist Daron Acemoglu, co-author of a newly published paper detailing the findings.

Within industries adopting automation, the study shows, the average "displacement" (or job loss) from 1947-1987 was 17 percent of jobs, while the average "reinstatement" (new opportunities) was 19 percent. But from 1987-2016, displacement was 16 percent, while reinstatement was just 10 percent. In short, those factory positions or phone-answering jobs are not coming back.
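Restated as simple arithmetic (our restatement, using only the percentages above), the shift is easy to see:

    # Net change in job opportunities within automating industries,
    # restated from the paper's headline percentages.
    periods = {"1947-1987": (17, 19), "1987-2016": (16, 10)}
    for period, (displaced, reinstated) in periods.items():
        net = reinstated - displaced
        print(f"{period}: {displaced}% displaced, {reinstated}% reinstated, net {net:+d}%")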

"A lot of the new job opportunities that technology brought from the 1960s to the 1980s benefitted low-skill workers," Acemoglu adds. "But from the 1980s, and especially in the 1990s and 2000s, there's a double whammy for low-skill workers: They're hurt by displacement, and the new tasks that are coming, are coming slower and benefitting high-skill workers."


The new paper, "Unpacking Skill Bias: Automation and New Tasks," will appear in the May issue of the American Economic Association: Papers and Proceedings. The authors are Acemoglu, who is an Institute Professor at MIT, and Pascual Restrepo PhD '16, an assistant professor of economics at Boston University.

Low-skill workers: Moving backward
The new paper is one of several studies Acemoglu and Restrepo have conducted recently examining the effects of robots and automation in the workplace. In a just-published paper, they concluded that across the U.S. from 1993 to 2007, each new robot replaced 3.3 jobs.

In still another new paper, Acemoglu and Restrepo examined French industry from 2010 to 2015. They found that firms that quickly adopted robots became more productive and hired more workers, while their competitors fell behind and shed workers -- with jobs again being reduced overall.

In the current study, Acemoglu and Restrepo construct a model of technology's effects on the labor market, and test the model's strength using empirical data from 44 relevant industries. (The study uses U.S. Census statistics on employment and wages, as well as economic data from the Bureau of Economic Analysis and the Bureau of Labor Statistics, among other sources.)

The result is an alternative to the standard economic modeling in the field, which has emphasized the idea of "skill-biased" technological change -- meaning that technology tends to benefit select high-skilled workers more than low-skill workers, helping the wages of high-skilled workers more, while the value of other workers stagnates. Think again of highly trained engineers who use new software to finish more projects more quickly: They become more productive and valuable, while workers lacking synergy with new technology are comparatively less valued.

However, Acemoglu and Restrepo think even this scenario, with the prosperity gap it implies, is still too benign. Where automation occurs, lower-skill workers are not just failing to make gains; they are actively pushed backward financially. Moreover, Acemoglu and Restrepo note, the standard model of skill-biased change does not fully account for this dynamic; it estimates that productivity gains and real (inflation-adjusted) wages of workers should be higher than they actually are.

More specifically, the standard model implies an estimate of about 2 percent annual growth in productivity since 1963, whereas annual productivity gains have been about 1.2 percent; it also estimates wage growth for low-skill workers of about 1 percent per year, whereas real wages for low-skill workers have actually dropped since the 1970s.
"Productivity growth has been lackluster, and real wages have fallen," Acemoglu says. "Automation accounts for both of those." Moreover, he adds, "Demand for skills has gone down almost exclusely in industries that have seen a lot of automation."

Why "so-so technologies" are so, so bad


Indeed, Acemoglu says, automation is a special case within the larger set of technological changes in the workplace. As he puts it, automation "is different than garden-variety skill-biased technological change," because it can replace jobs without adding much productivity to the economy.

Think of a self-checkout system in your supermarket or pharmacy: it reduces labor costs without making the task more efficient; the work is simply shifted from paid employees to you. These kinds of systems are what Acemoglu and Restrepo have termed "so-so technologies," because of the minimal value they offer.

"So-so technologies are not really doing a fantastic job, nobody's enthusiastic about going one-by-one through their items at checkout, and nobody likes it when the airline they're calling puts them through automated menus," Acemoglu says. "So-so technologies are cost-saving devices for firms that just reduce their costs a little bit but don't increase productivity by much. They create the usual displacement effect but don't benefit other workers that much, and firms have no reason to hire more workers or pay other workers more."

To be sure, not all automation resembles self-checkout systems, which were not around in 1987. Automation at that time consisted more of printed office records being converted into databases, or machinery being added to sectors like textiles and furniture-making. Robots became more common in heavy industrial manufacturing in the 1990s. Automation is a suite of technologies, continuing today with software and AI, that are inherently worker-displacing.
"Displacement is really the center of our theory," Acemoglu says. "And it has grimmer implications, because wage inequality is associated with disruptive changes for workers. It's a much more Luddite explanation."

After all, the Luddites -- British textile mill workers who destroyed machinery in the 1810s -- may be synonymous with technophobia, but their actions were motivated by economic concerns; they knew machines were replacing their jobs. That same displacement continues today, although, Acemoglu contends, the net negative consequences of technology for jobs are not inevitable. We could, perhaps, find more ways to produce job-enhancing technologies rather than job-replacing innovations.

"It's not all doom and gloom," says Acemoglu. "There is nothing that says technology is all bad for workers. It is the choice we make about the direction to develop technology that is critical."




New research finds racial bias in rideshare platforms

Minority riders are more than twice as likely as Caucasians to have rides canceled
INSTITUTE FOR OPERATIONS RESEARCH AND THE MANAGEMENT SCIENCES


Key takeaways from the new study in the INFORMS journal Management Science:


Under-represented minorities are more than twice as likely as Caucasians to have a ride canceled.

The racial bias is lessened during peak demand times.

Rides are more likely to be canceled for people who show support for the LGBT community, with no change during peak demand times.


CATONSVILLE, MD, May 6, 2020 - New research to be published in the INFORMS journal Management Science has found popular rideshare platforms exhibit racial and other biases that penalize under-represented minorities and others seeking to use their services.

The study, "When Transparency Fails: Bias and Financial Incentives in Ridesharing Platforms," was conducted by Jorge Mejia of Indiana University and Chris Parker of American University. In addition to finding racial biases persist, similar phenomena were also documented against people who show support for the LGBT community.

The researchers analyzed data from a major rideshare platform in Washington, D.C., from early October to mid-November 2018. The experiment manipulated rider names and profile pictures to observe drivers' patterns in accepting and canceling rides. To signal support for LGBT rights, a rainbow filter was applied to the profile picture. In addition, the times of ride requests were varied to determine how peak and non-peak price periods affect bias.

"We found under-represented minorities are more than twice as likely to have a ride canceled than Caucasians, that's about 3% versus 8%," said Mejia, an assistant professor in the Kelley School of Business at Indiana. "Along with racial bias, LGBT biases are persistent, while there is no evidence of gender bias."

Peak timing was found to have a moderating effect, with lower cancelation rates for minority riders, but the timing doesn't appear to change the bias against riders who signal support for the LGBT community.

"Data-driven solutions may exist wherein rider characteristics are captured when a driver cancels, and the platform penalizes the driver for the biased behavior. One possible way to punish drivers is to move them down the priority list when they exhibit biased cancelation behavior, so they have fewer ride requests. Alternatively, less-punitive measures may provide 'badges' for drivers that exhibit especially low cancelation rates for minority riders," concluded Mejia.


About INFORMS and Management Science

Management Science is a premier peer-reviewed scholarly journal focused on research using quantitative approaches to study all aspects of management in companies and organizations. It is published by INFORMS, the leading international association for operations research and analytics professionals. More information is available at http://www.informs.org or @informs.

Study reveals most critically ill patients with COVID-19 survive with standard treatment

MASSACHUSETTS GENERAL HOSPITAL



During the COVID-19 pandemic, hospitals around the world have shared anecdotal experiences to help inform the care of affected patients, but such anecdotes do not always reveal the best treatment strategies, and they can even lead to harm. To provide more reliable information, a team led by C. Corey Hardin, MD, PhD, an Assistant Professor of Medicine at Massachusetts General Hospital (MGH) and Harvard Medical School, carefully examined the records of 66 critically ill patients with COVID-19 who experienced respiratory failure and were put on ventilators, making note of their responses to the care they received.
The investigators found that the most severe cases of COVID-19 result in a syndrome called Acute Respiratory Distress Syndrome (ARDS), a life-threatening lung condition that can be caused by a wide range of pathogens. "The good news is we have been studying ARDS for over 50 years and we have a number of effective evidenced-based therapies with which to treat it," said Dr. Hardin. "We applied these treatments--such as prone ventilation where patients are turned onto their stomachs--to patients in our study and they responded to them as we would expect patients with ARDS to respond."
Importantly, the death rate among critically ill patients with COVID-19 treated this way--16.7%--was not nearly as high as has been reported by other hospitals. Also, over a median follow-up of 34 days, 75.8% of patients who were on ventilators were discharged from the intensive care unit. "Based on this, we recommend that clinicians provide evidence-based ARDS treatments to patients with respiratory failure due to COVID-19 and await standardized clinical trials before contemplating novel therapies," said co-lead author Jehan Alladina, MD, an Instructor in Medicine at Mass General.
###
Paper cited: Ziehr DR, Alladina J, Petri CR, et al. Respiratory Pathophysiology of Mechanically Ventilated Patients with COVID-19: A Cohort Study [published online ahead of print, 2020 Apr 29]. Am J Respir Crit Care Med. 2020;10.1164/rccm.202004-1163LE. doi:10.1164/rccm.202004-1163LE
About Massachusetts General Hospital
Massachusetts General Hospital, founded in 1811, is the original and largest teaching hospital of Harvard Medical School. The Mass General Research Institute conducts the largest hospital-based research program in the nation, with annual research operations of more than $1 billion, and comprises more than 9,500 researchers working across more than 30 institutes, centers and departments. In August 2019, Mass General was named #2 in the U.S. News & World Report list of "America's Best Hospitals."

During tough times, ancient 'tourists' sought solace in Florida oyster feasts

FLORIDA MUSEUM OF NATURAL HISTORY
IMAGE: Out-of-towners flocked to ceremonial sites on Florida's Gulf Coast for hundreds of years to socialize and feast. Crystal River was home to one of the most prominent sites. (Credit: Thomas J. Pluckhahn)

GAINESVILLE, Fla. --- More than a thousand years ago, people from across the Southeast regularly traveled to a small island on Florida's Gulf Coast to bond over oysters, likely as a means of coping with climate change and social upheaval.
Archaeologists' analysis of present-day Roberts Island, about 50 miles north of Tampa Bay, showed that ancient people continued their centuries-long tradition of meeting to socialize and feast, even after an unknown crisis around A.D. 650 triggered the abandonment of most other such ceremonial sites in the region. For the next 400 years, out-of-towners made trips to the island, where shell mounds and a stepped pyramid were maintained by a small group of locals. But unlike the lavish spreads of the past, the menu primarily consisted of oysters, possibly a reflection of lower sea levels and cool, dry conditions.
People's persistence in gathering at Roberts Island, despite regional hardship, underscores their commitment to community, said study lead author C. Trevor Duke, a researcher in the Florida Museum of Natural History's Ceramic Technology Lab.
"What I found most compelling was the fact that people were so interested in keeping their ties to that landscape in the midst of all this potential climate change and abandonment," said Duke, a Ph.D. candidate in the University of Florida department of anthropology. "They still put forth the effort to harvest all these oysters and keep these social relationships active. These gatherings probably occurred when different groups of people were getting together and trying to figure out the future."
Duke and his collaborators compared animal remains from shell mounds and middens - essentially kitchen trash heaps - at Roberts Island and Crystal River, home to an older, more prominent ceremonial site. Their findings showed Crystal River residents "pulled out all the stops" for ritual feasts, regaling visitors with deer, alligator, sharks and dozens of other dishes, while at Roberts Island, feasts consisted of "oysters and very little else," Duke said.
The Roberts Island ceremonial site, which was vacated around A.D. 1050, was one of the last outposts in what was once a flourishing network of religious sites across the Eastern U.S. These sites were characterized by burial grounds with distinctly decorated ceramics known as Swift Creek and Weeden Island pottery. What differentiated Roberts Island and Crystal River from other sites was their continuous occupation by a small group of residents who prepared for the influx of hundreds of visitors - not unlike Florida's tourist towns today.
"These were very cosmopolitan communities," Duke said. "I'm from Broward County, but I also spent time in the Panhandle, so I'm used to being part of a small residential community that deals with a massive population boom for a month or two months a year. That has been a Florida phenomenon for at least two thousand years."
Archaeologists estimate small-scale ceremonies began at Crystal River around A.D. 50, growing substantially after a residential community settled the site around A.D. 200. Excavations have uncovered minerals and artifacts from the Midwest, including copper breastplates from the Great Lakes. Similarly, conch shells from the Gulf Coast have been found at Midwestern archaeological sites.
"There was this long-distance reciprocal exchange network going on across much of the Eastern U.S. that Crystal River was very much a part of," Duke said.
Religious ceremonies at Crystal River included ritual burials and marriage alliances, Duke said, solidifying social ties between different groups of people. But the community was not immune to the environmental and social crises that swept the region, and the site was abandoned around A.D. 650. A smaller ceremonial site was soon established less than a mile downstream on Roberts Island, likely by a remnant of the Crystal River population.
Duke and his collaborators collected samples from mounds and middens at the two ceremonial sites, identifying the species present and calculating the weight of the meat they would have contained. They found that feasts at hard-strapped Roberts Island featured far fewer species. Meat from oysters and other bivalves accounted for 75% of the weight of Roberts Island samples and roughly 25% of the weight from Crystal River. Meat from deer and other mammals made up 45% of the weight in Crystal River samples and less than 3% from Roberts Island.
Duke said evidence suggests that Roberts Island residents also had to travel farther to harvest food. As sea levels fell, oyster beds may have shifted seaward, possibly explaining why the Crystal River population relocated to the island, which was small and had few resources.
"Previous research suggests that environmental change completely rearranged the distribution of reefs and the ecosystem," Duke said. "They had to go far out to harvest these things to keep their ritual program active."
No one knows what caused the widespread abandonment of most of the region's ceremonial sites in A.D. 650, Duke said. But the production of Weeden Island pottery, likely associated with religious activities, ramped up as bustling sites became ghost towns.
"That's kind of counterintuitive," he said. "This religious movement comes on really strong right as this abandonment is happening. It almost seems like people were trying to do something, create some kind of intervention to stop whatever was happening."
###
Thomas Pluckhahn of the University of South Florida and J. Matthew Compton of Georgia Southern University also co-authored the study.

Infectious disease modeling study casts doubt on impact of Justinianic plague

Work shows value of new examinations of old narratives of this pandemic
UNIVERSITY OF MARYLAND
Researchers Lauren White, PhD, and Lee Mordechai, PhD, of the University of Maryland's National Socio-Environmental Synthesis Center (SESYNC), examined the impacts of the Justinianic Plague with mathematical modeling. Using modern plague research as their basis, the two developed novel mathematical models to re-examine primary sources from the time of the Justinianic Plague outbreak. From the modeling, they found that it was unlikely that any transmission route of the plague would have had both the mortality rate and duration described in the primary sources. Their findings appear in a paper titled "Modeling the Justinianic Plague: Comparing hypothesized transmission routes" in PLOS ONE.
"This is the first time, to our knowledge, that a robust mathematical modeling approach has been used to investigate the Justinianic Plague," said lead author Lauren White, PhD, a quantitative disease ecologist and postdoctoral fellow at SESYNC. "Given that there is very little quantitative information in the primary sources for the Justinianic Plague, this was an exciting opportunity to think creatively about how we could combine present-day knowledge of plague's etiology with descriptions from the historical texts."
White and Mordechai focused their efforts on the city of Constantinople, capital of the Roman Empire, which had a comparatively well-described outbreak in 542 CE. Some primary sources claim plague killed up to 300,000 people in the city, which had a population of some 500,000 people at the time. Other sources suggest the plague killed half the empire's population. Until recently, many scholars accepted this image of mass death. By comparing bubonic, pneumonic, and combined transmission routes, the authors showed that no single transmission route precisely mimicked the outbreak dynamics described in these primary sources.
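As a rough illustration of the approach (a toy model of ours, far simpler than the authors' route-specific models), one can run a basic compartment model against the claimed figures and ask whether the simulated deaths and duration are even in the right range. Every parameter below is an assumption.

    # Toy outbreak model for a city of ~500,000, for comparison against
    # mortality and duration figures claimed in the primary sources.
    N = 500_000
    beta, gamma, cfr = 0.35, 1 / 10, 0.66  # transmission, recovery, fatality (assumed)
    S, I, D = N - 10.0, 10.0, 0.0

    duration = None
    for day in range(1, 1001):
        new_inf = beta * S * I / N      # new infections today
        new_out = gamma * I             # cases resolving today
        S -= new_inf
        I += new_inf - new_out
        D += cfr * new_out              # fraction of resolved cases that die
        if I < 1:                       # outbreak burns out
            duration = day
            break

    print(f"Simulated deaths: {D:,.0f} of {N:,} over ~{duration} days")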
Existing literature often assumes that the Justinianic Plague affected all areas of the Mediterranean in the same way. The new findings from this paper suggest that given the variation in ecological and social patterns across the region (e.g., climate, population density), it is unlikely that a plague outbreak would have impacted all corners of the diverse empire equally.
"Our results strongly suggest that the effects of the Justinianic Plague varied considerably between different urban areas in late antiquity," said co-author Lee Mordechai, an environmental historian and a postdoctoral fellow at SESYNC when he wrote the paper. He is now a senior lecturer at the Hebrew University of Jerusalem, and co-lead of Princeton's Climate Change and History Research Initiative (CCHRI). He said, "This paper is part of a series of publications in recent years that casts doubt on the traditional interpretation of plague using new methodologies. It's an exciting time to do this kind of interdisciplinary research!"
Using an approach called global sensitivity analysis, White and Mordechai were able to explore the importance of any given model parameter in dictating simulated disease outcomes. They found that several understudied parameters are also very important in determining model results. White explained, "One example was the transmission rate from fleas to humans. Although the analysis described this as an important parameter, there hasn't been enough research to validate a plausible range for that parameter."
These high importance variables with minimal information also point to future directions for empirical data collection. "Working with mathematical models of disease was an insightful process for me as a historian," reflected Mordechai. "It allowed us to examine traditional historical arguments with a powerful new lens."
Together with other recent work from Mordechai, this study is another call to examine the primary sources and narratives surrounding the Justinianic Plague more critically.
###
White, L.A. & Mordechai, L. (2020). Modeling the Justinianic Plague: Comparing hypothesized transmission routes. PLOS ONE. doi: 10.1371/journal.pone.0231256
About SESYNC: The University of Maryland's National Socio-Environmental Synthesis Center (SESYNC) in Annapolis brings together the science of the natural world with the science of human behavior and decision making to find solutions to complex environmental problems. SESYNC is funded by an award to the University of Maryland from the National Science Foundation. For more information on SESYNC and its activities, please visit http://www.sesync.org.

Demographic expansion of several Amazonian archaeological cultures by computer simulation

The study uses simulation techniques and shows that some cultural expansions from Amazonia during the late Holocene may have arisen from similar demographic processes to the Neolithic in Eurasia
UNIVERSITAT POMPEU FABRA - BARCELONA
IMAGE: Computer simulation of the expansions of several archaeological cultures in South America. (Credit: UPF)
Expansions by groups of humans were common during prehistoric times, after the adoption of agriculture. Among other factors, this is because the population growth of farmers was greater than that of hunter-gatherers. One example occurred during the Neolithic period, when farming was introduced to Europe by migrations from the Middle East.
However, in South America, it was not clear whether the same would have occurred as it was argued that no cultural group had expanded across such long distances as in Europe or Asia. In addition, it was believed that the type of agriculture practised by pre-Columbian peoples in the Amazon would not allow them to expand at the same rate.
Research conducted by three members of the Culture and Socio-Ecological Dynamics Research Group (CaSEs) at the UPF Department of Humanities shows that expansions by some archaeological cultures in South America can be simulated by computer through population growth and migration in the same way as the Neolithic in Europe. This is the case of the so-called Saladoid-Barrancoid culture, which spread from the Orinoco to various parts of Amazonia, even reaching the Caribbean.
The article, published on 27 April in the journal PLOS ONE, involved Jonas Gregorio de Souza, a Marie Curie researcher, as first author, together with Jonas Alcaina Mateos, a predoctoral researcher, and Marco Madella, UPF-ICREA research professor and director of the CaSEs Research Group.
"The use of computer simulations to test human migrations in prehistoric times is an approach that has proved productive in other continents, but had not been applied to the area of the tropics of South America. We have shown that some cultural expansions that have taken place from Amazonia may be the result of similar demographic processes to the Neolithic in Eurasia", says Jonas Gregorio de Souza.
A computational model to simulate the expansions of four archaeological cultures
The article uses a computational approach to simulate human expansions in prehistory. "We use parameters derived from the ethnography of farmers in the Amazon to simulate the rate of population growth, the fission of villages, how far and how often they moved", the authors state. Based on these parameters, they created a computer model to simulate expansions from different points and dates and compare the results with archaeological data.
The researchers used radiocarbon dates from different archaeological cultures over a large area of territory in the last 5,000 years, which were compared with the prediction of the model, to assess whether their rate of territorial expansion could be explained as being a demographic phenomenon (rather than another type, such as cultural diffusion).
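A stripped-down sketch of such a simulation (ours, purely to show the mechanism; every parameter is hypothetical rather than drawn from the study's ethnographic estimates) might look like this:

    # Demic-diffusion toy model: villages grow, fission at a population
    # cap, and daughter villages settle a fixed hop away in a random
    # direction; the expansion front can then be compared to dated sites.
    import math, random

    GROWTH, CAP, HOP_KM, YEARS = 0.02, 300, 30, 500
    random.seed(1)

    villages = [(0.0, 0.0, 50.0)]  # (x km, y km, population)

    for year in range(YEARS):
        next_gen = []
        for x, y, pop in villages:
            pop *= 1 + GROWTH                      # annual growth
            if pop > CAP:                          # village fissions
                ang = random.uniform(0, 2 * math.pi)
                next_gen.append((x + HOP_KM * math.cos(ang),
                                 y + HOP_KM * math.sin(ang), pop / 2))
                pop /= 2
            next_gen.append((x, y, pop))
        villages = next_gen

    front = max(math.hypot(x, y) for x, y, _ in villages)
    print(f"{len(villages)} villages after {YEARS} years; "
          f"farthest {front:.0f} km out (~{front / YEARS:.2f} km/yr)")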
The four archaeological cultures or traditions analysed were the Saladoid-Barrancoid, the Arauquinoid, the Tupiguarani, and the (closely related) Una, Itararé and Aratu traditions. In most regions where they settled, these cultures introduced the cultivation of domesticated plants, marked the transition towards more permanent settlements, and spread an economic model called "polyculture agroforestry".
However, the authors warn that some expansions could not be predicted by the simulations, suggesting that they were caused by other factors: "Although some archaeological expansions can be predicted by the simulations as demographic processes, others are not easily explained in the same way. This is possibly due to different processes that drive their dispersal, such as cultural diffusion, or because the archaeological data are inconclusive or sparse", they conclude.
###
MY HOME TOWN

Arctic Edmontosaurus lives again -- a new look at the 'caribou of the Cretaceous'

PEROT MUSEUM OF NATURE AND SCIENCE
IMAGE: A study published today in PLOS ONE by an international team from the Perot Museum of Nature and Science in Dallas and Hokkaido University in Japan further explores the proliferation of the most commonly occurring duck-billed dinosaur of the ancient Arctic. (Credit: Masato Hattori)
DALLAS, TEXAS (May 6, 2020) - A new study by an international team from the Perot Museum of Nature and Science in Dallas and Hokkaido University and Okayama University of Science in Japan further explores the proliferation of the most commonly occurring duck-billed dinosaur of the ancient Arctic, identifying it as a member of the genus Edmontosaurus. The findings also reinforce that the hadrosaurs - known as the "caribou of the Cretaceous" - had a huge geographical distribution of approximately 60 degrees of latitude, spanning the North American West from Alaska to Colorado.
The scientific paper describing the find - titled "Re-examination of the cranial osteology of the Arctic Alaskan hadrosaurine with implications for its taxonomic status" - has been posted in PLOS ONE, an international, peer-reviewed, open-access online publication featuring reports on primary research from all scientific disciplines. The authors of the report are Ryuji Takasaki of Okayama University of Science in Japan; Anthony R. Fiorillo, Ph.D. and Ronald S. Tykoski, Ph.D. of the Perot Museum of Nature and Science in Dallas, Texas; and Yoshitsugu Kobayashi, Ph.D. of Hokkaido University Museum in Japan. To read the entire manuscript and view renderings, go to https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0232410.
"Recent studies have identified new species of hadrosaurs in Alaska, but our research shows that these Arctic hadrosaurs actually belong to the genus Edmontosaurus, an abundant and previously recognized genus of duck-billed dinosaur known from Alberta south to Colorado," said Takasaki.
The report states that anatomical comparisons and phylogenetic analyses clearly demonstrate that attributing the Alaskan hadrosaurines to a unique genus, Ugrunaaluk, is inappropriate; that name is now considered a junior synonym of Edmontosaurus, a hadrosaurine genus previously known from lower-latitude North America, roughly between northern Colorado (40°N) and southern Alberta (53°N).
The fossils used for this study were found primarily in the Liscomb Bonebed, Prince Creek Formation of the North Slope of Alaska, the location of the first dinosaur fossils discovered in the Arctic.
The team's research also shows that the plant-eating hadrosaurs were taking over parts of North America during the Cretaceous, suggesting that Edmontosaurus was likely an ecological generalist.
"In other words, Edmontosaurus was a highly successful dinosaur that could adapt to a wide variety of environmental conditions," said Fiorillo. "It's not unrealistic to compare them to generalized animals today - such as mountain sheep, wolves and cougars in terms of their range and numbers - that also roam greater geographic distributions."
Members of this team also found ties to Kamuysaurus japonicus, a new genus and species they discovered near Hokkaido, Japan, and named in 2019.
"Combined with the newly named Kamuysaurus of Japan, Alaska Edmontosaurus shows that this group of hadrosaurs, the Edmontosaurini, were widely distributed in the northern circum-Pacific region, meaning that they were incredibly successful dinosaurs," said Kobayashi. "It's fascinating to think they likely used the ancestral Bering Land Bridge between Asia and North America for migration in a manner similar to mammoths, woolly rhinoceroses and early humans."
Edmontosaurus belongs to the clade Edmontosaurini, as does Kamuysaurus, a recently described hadrosaurine dinosaur from Japan, suggesting that Edmontosaurini was widely distributed along the northern circum-Pacific region. North America and Asia were connected by Beringia during the Late Cretaceous, and some dinosaurs are believed to have traveled between the continents this way. Edmontosaurini is one of the dinosaur groups that may have ventured along this pathway and adapted to the Arctic environment. Those that stayed in North America evolved into Edmontosaurus, and those that stayed in Asia and moved on to Japan are believed to have evolved into Kamuysaurus.
"This study is a wonderful example of why paleontologists need to be more aware of how individual growth and life stage of fossils matter when we try to interpret the anatomical features preserved in them. If you don't, you run the risk of erroneously erecting a new 'genus' or species based on juvenile traits that will change or vanish as the individual creature grows up - and winds up being an adult of an already-known 'genus' or species!," said Tykoski. "Our study shows that was probably the case with these juvenile duck-billed dinosaurs from the ancient Arctic of Alaska."
###
About the Perot Museum of Nature and Science. The top cultural attraction in Dallas/Fort Worth and a Michelin Green Guide three-star destination, the Perot Museum of Nature and Science is a nonprofit educational organization located in the heart of Dallas, Texas. With a mission to inspire minds through nature and science, the Perot Museum delivers exciting, engaging and innovative visitor and outreach experiences through its education, exhibition, and research and collections programming for children, students, teachers, families and life-long learners. The 180,000-square-foot facility in Victory Park opened in December 2012 and is now recognized as the symbolic gateway to the Dallas Arts District. Future scientists, mathematicians and engineers will find inspiration and enlightenment through 11 permanent exhibit halls on five floors of public space; a children's museum; a state-of-the art traveling exhibition hall; and The Hoglund Foundation Theater. Designed by 2005 Pritzker Architecture Prize Laureate Thom Mayne and his firm Morphosis Architects, the museum has been lauded for its artistry and sustainability. To learn more, please visit perotmuseum.org.
About Hokkaido University. Hokkaido University is home to some 4 million specimens and documents that have been gathered, preserved and studied since the Sapporo Agricultural College began more than 130 years ago. Amongst these are more than 10,000 precious "type specimens" that form the basis for the discovery and certification of new species. Opened in the spring of 1999, the Hokkaido University Museum conveys the diverse range of research carried out at Hokkaido University while also using various original materials and visual media to introduce the university's cutting-edge research.
About Okayama University of Science. Okayama University of Science comprises a variety of scientific departments in which more than 5,000 students are enrolled every year. The university opened a new course for studying paleontology in 2014 and renovated its dinosaur museum this April. Dinosaur research projects at the Okayama University of Science have been designated one of the selective projects of the Japanese Ministry of Education, Culture, Sports, Science and Technology, and the university is now known as one of the key institutions for dinosaur research in Japan.
Fly ash geopolymer concrete: Significantly enhanced resistance to extreme alkali attack
UNIVERSITY OF JOHANNESBURG

IMAGE: Geopolymer concrete blocks, heat-cured at 200 degrees Celsius and then immersed in an extreme alkali medium for 14 days at 80 degrees Celsius (a and b), resist the attack. (Credit: Dr Abdolhossein Naghizadeh, University of Johannesburg)
Fly ash generated by coal-fired power stations is an environmental headache, creating groundwater and air pollution from vast landfills and ash dams. Some of the waste product can be repurposed into geopolymer concrete, such as pre-cast heat-cured elements for structures.
However, a critical durability problem has been low resistance to extreme alkali attack. Researchers at the University of Johannesburg have found that high temperature heat-treatment (HTHT) can reduce this harmful mechanism in fly ash geopolymer concrete by half.
"In a previous study, we found that fly ash geopolymer concrete can be vulnerable under extreme alkaline conditions. The recommendation from the study, was that this material should not be employed in structures that are exposed to highly alkaline mediums, such as some chemical storage facilities.
"The findings of our new study show that the alkali resistance of geopolymer concrete can be significantly improved by exposing it to an evaluated temperature, optimally 200 degrees Celsius," says Dr Abdolhossein Naghizadeh.
The study forms part of Naghizadeh's doctoral research at the Department of Civil Engineering Science at the University of Johannesburg.
Extreme alkali medium
In the research published in Case Studies in Construction Materials, blocks of fly ash geopolymer mortars were variously heat-cured at 100, 200, 400 or 600 degrees Celsius for 6 hours. These were then immersed in water, a medium alkali medium or an extreme alkali medium; and stored at 80 degrees Celsius for 14 days or 28 days, depending on the performance measurement.
(The prolonged heat-curing for 28 days was conducted to compare the results with those obtained by the other studies, which employed the same curing regime. This long-term curing is suitable for research purposes, but not recommended for actual construction. The medium alkali medium was a 1M NaOH solution. The extreme alkali medium was a 3M NaOH solution.)
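For reference, a quick calculation (our arithmetic, not part of the study) of what those molarities mean in practice:

    # Grams of NaOH dissolved per litre for the two test solutions.
    NAOH_G_PER_MOL = 40.0  # Na 23 + O 16 + H 1
    for molarity, label in [(1, "medium alkali"), (3, "extreme alkali")]:
        print(f"{label} ({molarity}M): {molarity * NAOH_G_PER_MOL:.0f} g NaOH per litre")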
"The hardened blocks heat-cured at 200 degrees, and then immersed in the extreme alkali medium (the "200/3M" blocks), maintained about 50% residual strength at 22.6 MPa upon alkali attack. The blocks heat-cured at the other temperatures maintained much lower residual strengths at 10.3 - 14.6 MPa," says Naghizadeh.
"The 200/3M blocks immersed in extreme alkali medium displayed only limited fine cracking indicating low expansion, compared to the others which displayed severe cracking. Leaching of silicone and aluminium was lowest for the 200/3M blocks.
"X-ray diffraction showed that crystalline minerals, albite and sillimanite, formed in the binder phase of 200/3M blocks. Scanning electron microscope images of the 200/3M binders show the presence of a gel-like substance, characteristic of alkali attack. The heat-curing significantly reduced in the intensity of the attack but could not prevent it," he says.
"The High Temperature Heat Treatment (HTHT) at 200 degrees created this effect by inhibiting the dissolution of unreacted fly ash particles within the hardened geopolymer concrete matrix. However, the HTHT also reduced the compressive strength for these blocks by 26.7%."
Best used as precast
Fly ash geopolymer binders exhibit remarkable durability properties. Among these are high resistance to alkali-silica reaction; superior acid resistance; and high resistance to fire, low carbonation and limited sulfate attack, says Naghizadeh.
Fly ash geopolymer cement is suitable mostly for precast concrete manufactured at a factory or workshop. The reason is that strength development in geopolymer cement mixtures is generally slow under ambient temperatures.
This makes heat-curing necessary or essential for early strength gain. The practical methods established for heat-curing pre-cast Ordinary Portland cement (OPC) can be adapted for this.
This makes fly ash geopolymers suitable for precast concrete elements such as beams or girders for buildings and bridges, railway sleepers, wall panels, hollow core slabs, and concrete pipes.
For regular fly ash geopolymer concrete, a 24-hour period of heating at 60-80 degrees Celsius would be enough to achieve sufficient strength. This curing regime (temperature and duration) is common in the cement industry and is also used for some Portland cement concretes.
Although the use of geopolymer cement is growing every year, its application is still very small compared to OPC. Geopolymer has been employed as the binder in residential structures, bridges, and runways mostly in European countries, China, Australia, and the USA.
A new generation cement
Since the middle of the 19th century, OPC has been used extensively to produce concrete. Its durability performance is well understood and its long-term behaviour can be predicted.
However, a new generation of cement is emerging as a suitable alternative to OPC in certain applications. These geopolymer cements (or geopolymer binders) have a nature and microstructure totally different from OPC.
A starting material used for geopolymer binder needs to be rich in alumina and silicate contents. On this criterion, multiple industrial waste or by-products qualify - such as rice husk ash, palm oil fuel ash and coal power plant fly ash.
However, fly ash has two advantages for use as a geopolymer cement, says Naghizadeh.
Firstly, fly ash is available in millions of tons globally, also in developing countries. Repurposing fly ash as construction material can potentially reduce some of its environmental impacts. Currently, it is disposed of in vast ash dams and landfills close to coal-fuelled power plants, which generate air and ground-water pollution.
The second advantage for fly ash as starting material for geopolymer cement is its chemical composition. Typically, fly ash is rich enough in reactive silicon and aluminium oxides, which results in a better geopolymerization.
This in turn yields a binder with superior mechanical, physical and durability properties compared to the geopolymer concretes made using other waste products containing alumino-silicates.
More complex mix design
When designing a building, the engineer needs to ensure that the concrete used in the structure will have the expected strength for the service life. However, the physical and mechanical properties of concrete and other construction materials can change over time. Such changes can influence the material performance over the service life span of the construction.
Generally, an OPC concrete mixture includes cement, water and aggregate. The civil engineer develops an OPC mix design using specific proportions of these three ingredients for the intended structure.
"For fly ash-based geopolymer concrete activated by sodium silicate and sodium hydroxide, mix design is more complex than for OPC," says Naghizadeh.
"More parameters are involved: the amounts of fly ash, sodium silicate, sodium hydroxide, water, and aggregate; as well as the concentration of sodium hydroxide; the proportion and quality of glass within the alkali."
Fly ash from ash dams
In South Africa, research about using fly ash as a geopolymer cement is limited, says Prof Stephen Ekolu. Ekolu is a co-author of the study and former Head of the School of Civil Engineering and the Built Environment at University of Johannesburg.
"The existing research about fly ash geopolymer concrete uses fly ash supplied directly from power stations. Further research is needed about using fly ash from landfills and ash dams, technically referred to as "bottom ash" to produce geopolymer cement.
"The biggest research questions are issues of material quality, mix design, and developing the technology to allow curing at ambient conditions rather than the current practice of curing at elevated temperatures. Once these three scientific issues have been resolved, fly-ash and indeed most other forms of geopolymer cements can be better placed as OPC replacements worldwide," says Ekolu.
Not a concrete extender
Currently, a small amount of fly ash is used as a common cement extender. In South Africa that amount is 10% of the 36 million tons produced annually. It is mixed with clinker to produce Pozzolanic Portland Cement (PPC).
Though fly ash is used as a common OPC extender, fly ash-based geopolymer concrete (FA-GC) is not combined with OPC-based concrete.
The reason is that the hydration process of OPC is completely different from the geopolymerization reaction of FA-GC. Also, OPC-based concrete and geopolymer concrete each requires a different curing condition.
Different production than OPC
The major phases in OPC production are the calcination and grinding processes.
Unlike OPC, geopolymer production does not require these phases. Fly ash-based geopolymer binders consist of two components: The fly ash and an alkali activator. Usually, fly ash is used as produced in the power station, with no need for further treatment.
Alkali activator solutions such as sodium silicate and sodium hydroxide are also extensively produced in the industry. These are used for multiple purposes, such as detergent and textile production.
"Greener" concrete
"The long-term durability of geopolymer cement under different environmental conditions needs further research. Also, the construction industry globally lacks technical knowledge of the production of geopolymers. To employ geopolymer binders, engineers, technicians and construction workers need training to design and produce geopolymer concrete mix designs with the required properties," says Naghizadeh.
"There is no doubt that production of Portland cement needs to be limited in future, due to its huge environmental impacts. This includes about 5-8% global anthropogenic carbon-dioxide emissions into the atmosphere, which contributes to climate change," adds Ekolu.
Several studies, including those from the University of Johannesburg, have shown that fly ash geopolymer can exhibit superior, or similar properties to Portland cement. This makes it a suitable alternative to replace Portland cement in certain applications.
Moreover, the availability of fly ash worldwide, especially in developing countries, provides an opportunity to produce more economic concrete "greener" than Ordinary Portland Cement, from the viewpoint of potential repurposing of a problematic waste product.
The series of photographs shows expansion of fly ash geopolymer concrete blocks heat-cured and then immersed in an extreme alkali medium at 80 degrees Celsius for 14 days. The blocks heat-cured at 200 degrees Celsius display only limited fine cracking indicating low expansion, compared to the others. Fly ash generated by coal power generation can be repurposed into geopolymer concrete. However, a critical durability problem has been low resistance to alkali attack. Researchers at the University of Johannesburg have found that high temperature heat-treatment at 200 degrees Celsius can halve this harmful mechanism in fly ash geopolymer concretes.

Written by Ms Therese van Wyk and Dr Abdolhossein Naghizadeh.
INTERVIEWS: For email interviews or questions, contact Dr Naghizadeh at anaghizadeh@uj.ac.za.
For interviews via mobile phone / Skype / Zoom with Dr Naghizadeh, contact Ms Therese van Wyk at Theresevw@uj.ac.za or +27 71 139 8407 (mobile) in Johannesburg, UTC + 2.
Dr Naghizadeh's doctoral research was partly funded by the Centre of Applied Research and Innovation in the Built Environment (CARINBE) at the University of Johannesburg.
How we might recharge an electric car as it drives

Engineers have demonstrated a practical way to use magnetism to transmit electricity wirelessly to recharge electric cars, robots or even drones
STANFORD SCHOOL OF ENGINEERING

Stanford engineers have taken a big step toward making it practical for electric cars to recharge as they speed along futuristic highways built to "refuel" vehicles wirelessly.

Although wireless charging pads already exist for smartphones, they only work if the phone is sitting still. For cars, that would be just as inconvenient as the current practice of plugging them in for an hour or two at charging stations.

Three years ago, Stanford electrical engineer Shanhui Fan and Sid Assawaworrarit, a graduate student in his lab, built the first system that could wirelessly recharge objects in motion. However, the technology was too inefficient to be useful outside the lab.

Now, in Nature Electronics, the two engineers demonstrate a technology that could one day be scaled up to power a car moving down the road. In the nearer term, the system could soon make it practical to wirelessly recharge robots as they move around in warehouses and on factory floors -- eliminating downtime and enabling robots to work almost around the clock.

"This is a significant step toward a practical and efficient system for wirelessly re-charging automobiles and robots, even when they are moving at high speeds," Fan said. "We would have to scale up the power to recharge a moving car, but I don't think that's a serious roadblock. For re-charging robots, we're already within the range of practical usefulness."

Wireless chargers transmit electricity by creating a magnetic field that oscillates at a frequency that sets up a resonant response in coils on the receiving device. The problem is that the resonant frequency changes if the distance between the source and receiver changes by even a small amount.

In their first breakthrough three years ago, the researchers developed a wireless charger that could transmit electricity even as the distance to the receiver changes. They did this by incorporating an amplifier and feedback resistor that allowed the system to automatically adjust its operating frequency as the distance between the charger and the moving object changed.
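The need for that adjustment can be sketched with the textbook behavior of two identical magnetically coupled resonators: their shared resonance splits into modes near f0/sqrt(1 + k) and f0/sqrt(1 - k), and the coupling k changes as the coils move apart. The component values below are arbitrary, and this is an illustration of the physics, not the circuit from the paper.

    # Mode splitting of two identical coupled LC resonators. A source
    # locked to a fixed frequency detunes as k changes with distance; a
    # self-adjusting system tracks the moving mode instead.
    import math

    L, C = 24e-6, 1e-9  # inductance (H) and capacitance (F), assumed
    f0 = 1 / (2 * math.pi * math.sqrt(L * C))

    for k in [0.05, 0.10, 0.20]:  # coupling falls as separation grows
        f_low, f_high = f0 / math.sqrt(1 + k), f0 / math.sqrt(1 - k)
        print(f"k={k:.2f}: modes at {f_low/1e6:.3f} and {f_high/1e6:.3f} MHz "
              f"(uncoupled f0 = {f0/1e6:.3f} MHz)")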

But that initial system wasn't efficient enough to be practical. The amplifier used so much electricity internally to produce the required amplification effect that the system transmitted only 10% of the power flowing through it.

In their new paper, the researchers show how to boost the system's wireless-transmission efficiency to 92%. The key, Assawaworrarit explained, was to replace the original amplifier with a far more efficient "switch mode" amplifier. Such amplifiers aren't new, but they are finicky and will only produce high-efficiency amplification under very precise conditions. It took years of tinkering, and additional theoretical work, to design a circuit configuration that worked.

The new lab prototype can wirelessly transmit 10 watts of electricity over a distance of two or three feet. Fan says there aren't any fundamental obstacles to scaling up a system to transmit the tens or hundreds of kilowatts that a car would need. He says the system is more than fast enough to re-supply a speeding automobile. The wireless transmission takes only a few milliseconds -- a tiny fraction of the time it would take a car moving at 70 miles an hour to cross a four-foot charging zone. The only limiting factor, Fan said, will be how fast the car's batteries can absorb all the power.
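Some quick numbers behind that claim (our back-of-envelope arithmetic; the power level is an assumption, since the prototype delivers 10 watts):

    # Dwell time over a short charging zone, and the energy a car would
    # pick up at an assumed transfer level.
    MPH_TO_MPS, FT_TO_M = 0.44704, 0.3048

    speed = 70 * MPH_TO_MPS        # ~31.3 m/s
    zone = 4 * FT_TO_M             # four-foot charging zone, in metres
    dwell = zone / speed           # seconds spent over the zone

    power_kw = 100                 # assumed scaled-up transfer level
    energy_wh = power_kw * 1000 * dwell / 3600
    print(f"Dwell time: {dwell * 1000:.0f} ms")
    print(f"Energy per zone at {power_kw} kW: {energy_wh:.1f} Wh")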

The wireless chargers shouldn't pose a health risk, said Assawaworrarit, because even ones that are powerful enough for cars would produce magnetic fields that are well within established safety guidelines. Indeed, the magnetic fields can transmit electricity through people without them feeling a thing.

Though it could be many years before wireless chargers become embedded in highways, the opportunities for robots and even aerial drones are more immediate. It's much less costly to embed chargers in floors or on rooftops than on long stretches of highway. Imagine a drone, says Fan, that could fly all day by swooping down occasionally and hovering around a roof for quick charges.

Who knows? Maybe drones really could be practical for delivering pizza.

###

This work is supported by a Vannevar Bush Faculty Fellowship from the U. S. Department of Defense.