Monday, January 27, 2020

Research suggests benefits of conservation efforts may not yet be fully visible

The time it takes for species to respond to conservation measures, known as an 'ecological time lag', could be partly masking any real progress that is being made, experts have warned.
Global conservation targets to reverse declines in biodiversity and halt species extinctions are not being met, despite decades of conservation action.
Last year, a UN report on global biodiversity warned one million species are at risk of extinction within decades, putting the world's natural life-support systems in jeopardy.
The report also revealed we were on track to miss almost all the 2020 nature targets that had been agreed a decade earlier under the global Convention on Biological Diversity.
But work published today in the journal Nature Ecology & Evolution offers new hope that, in some cases, conservation may not necessarily be failing; it may simply be too early to see the progress that is being made.
Led by Forest Research together with the University of Stirling, Natural England, and Newcastle University, the study authors highlight the need for 'smarter' biodiversity targets that account for ecological time-lags. Such targets would help us better distinguish between cases where conservation interventions are on track to achieve success but need more time for the benefits to be realised, and cases where current conservation actions are simply insufficient or inappropriate.
Lead researcher Dr. Kevin Watts of Forest Research said:
"We don't have time to wait and see which conservation measures are working and which ones will fail. But the picture is complicated and we fear that some conservation actions that will ultimately be successful may be negatively reviewed, reduced or even abandoned simply due to the unappreciated delay between actions and species' response.
"We hope the inclusion of time-lags within biodiversity targets, including the use of well-informed interim indicators or milestones, will greatly improve the way that we evaluate progress towards conservation success.
"Previous conservation efforts have greatly reduced the rate of decline for many species and protected many from extinction and we must learn from past successes and remain optimistic: conservation can and does work, but at the same time, we mustn't be complacent. This work also emphasises the need to acknowledge and account for the fact that biodiversity may still be responding negatively to previous habitat loss and degradation."
'Rebalancing the system'
Ecological time-lags relate to the rebalancing of a system following a change, such as the loss of habitat or the creation of new habitat.
Dr. Watts added: "The system is analogous to a financial economy: we are paying back the extinction 'debt' from past destruction of habitats and now waiting for the 'credit' to accrue from conservation actions. What we're trying to avoid now is going bankrupt by intervening too late and allowing the ecosystem to fail."
Using theoretical modelling, along with data from a 'woodland bird' biodiversity indicator, the research team explored how species with different characteristics (e.g. habitat generalists vs specialists) might respond over time to landscape change caused by conservation actions.
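The paper's modelling is not reproduced here, but the core idea of an ecological time lag is easy to sketch: if specialist species colonise new habitat only after a long delay, a short assessment window sees almost none of the eventual benefit. The toy simulation below is an illustrative sketch with invented timescales, not the authors' model.

```python
# Toy illustration of an ecological time lag (not the study's model):
# species occupancy approaches a new equilibrium after habitat creation,
# but specialists respond far more slowly than generalists.
import math

def benefit_realised(year, lag_years, delay):
    """Fraction of the eventual conservation benefit visible 'year' years
    after habitat creation, given a colonisation delay and an exponential
    approach to equilibrium with timescale 'lag_years' (both invented)."""
    t = max(0.0, year - delay)
    return 1.0 - math.exp(-t / lag_years)

for year in (5, 10, 25, 50):
    generalist = benefit_realised(year, lag_years=3, delay=1)
    specialist = benefit_realised(year, lag_years=20, delay=10)
    print(f"year {year:>2}: generalist {generalist:4.0%}, specialist {specialist:4.0%}")
```

Under these made-up numbers, a ten-year review would capture nearly all of the generalist response but essentially none of the specialist response, which is exactly the kind of premature verdict of 'failure' the authors warn against.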
The authors suggest the use of milestones that mark the path towards conservation targets. For instance, the ultimate success of habitat restoration policies could be assessed firstly against the amount of habitat created, followed by the arrival of generalist species. Then, later colonisation by specialists would indicate increased habitat quality. If a milestone is missed at any point, the cause should be investigated and additional conservation interventions considered.
Philip McGowan, Professor of Conservation Science and Policy at Newcastle University and Chair of the IUCN Species Survival Commission Post-2020 Biodiversity Targets Task Force, said we need to "hold our nerve."
"Ultimately, ten years is too short a time for most species to recover.
"There are many cases where there is strong evidence to suggest the conservation actions that have been put in place are appropriate and robust—we just need to give nature more time.
"Of course, time isn't something we have. We are moving faster and faster towards a point where the critical support systems in nature are going to fail.
Almost 200 of the world's governments and the EU, signed the Convention on Biological Diversity, a 10-year plan to protect some of the world's most threatened species which was launched in 2010.
"A new plan is being negotiated during 2020 and it is critical that negotiators understand the time that it takes to reverse species declines and the steps necessary to achieve recovery of species on the scale that we need," says Professor McGowan.
"But there is hope."
Simon Duffield, Natural England, adds:
"We know that natural systems takes time to respond to change, whether it be positive, such as habitat creation, or negative such as  loss, degradation or increasingly climate change. How these time lags are incorporated into  targets has always been a challenge. We hope that this framework takes us some way towards being able to do so."
The research was conducted as part of the Woodland Creation and Ecological Networks (WrEN) project.
Professor Kirsty Park, co-lead for the WrEN project, said:
"This research is timely as there is an opportunity to incorporate time-lags into the construction of the Convention on Biological Diversity Post-2020 Global Biodiversity Framework. We need to consider realistic timescales to observe changes in the status of , and also take into account the sequence of policies and actions that will be necessary to deliver those changes"

More information: Ecological time lags and the journey towards conservation success, Nature Ecology & Evolution (2020). DOI: 10.1038/s41559-019-1087-8, https://nature.com/articles/s41559-019-1087-8

Cutting road transport pollution could help plants grow

Cutting emissions of particular gases could improve conditions for plants, allowing them to grow faster and capture more carbon, new research suggests.
A cocktail of gases—including nitrogen oxides, carbon monoxide, volatile organic compounds and methane—combines in the atmosphere to form ozone.
Ozone at the Earth's surface limits photosynthesis, reducing plants' ability to grow.
University of Exeter researchers say cutting emissions of ozone-forming gases offers a "unique opportunity" to create a "natural climate solution".
A 50% cut in emissions of these gases from the seven largest human-made sources—including road transportation and the energy sector—would help plants contribute to "negative carbon emissions", the study says.
"Ecosystems on land currently slow global warming by storing about 30% of our  every year," said Professor Nadine Unger, of the University of Exeter.
"This  is being undermined by ozone pollution.
"Our findings suggest the largest losses of plant productivity are in the eastern United States, Europe and eastern China, which all have high levels of surface ozone pollution.
"The impact on plant growth in these areas is estimated to be 5-20% annually."
Ozone is not emitted directly but forms in the atmosphere during complex chemical reactions involving carbon monoxide, methane, non-methane volatile organic compounds and nitrogen oxides.
The seven areas of human activity that emit the largest amounts of these gases are agriculture, residential, energy, industry, road transportation, waste/landfill and shipping.
The study says a target of cutting these specific emissions by 50% is "large but plausible", citing examples of cuts already made in some industries.
"Deep cuts in air pollutant emissions from road transportation and the energy sector are the most effective mitigation measures for ozone-induced loss of plant productivity in eastern China, the eastern United States, Europe and globally," said Professor Unger.
"Our results suggest mitigation of ozone vegetation damage is a unique opportunity to contribute to negative carbon emissions, offering a natural climate solution that links fossil fuel emission abatement, air quality and climate.
"However, achieving these benefits requires ambitious mitigation efforts in multiple sectors."
The paper, published in the journal Nature Climate Change, is entitled: "Mitigation of ozone damage to the world's land ecosystems by source sector."

More information: Mitigation of ozone damage to the world's land ecosystems by source sector, Nature Climate Change (2020). DOI: 10.1038/s41558-019-0678-3, https://nature.com/articles/s41558-019-0678-3
GOOD NEWS

Most young people do not vape, and even fewer vape regularly

VAPING SCARE IS AN FDA FALSE FLAG

While youth vaping rates have increased in recent years, most middle and high school students don't vape or smoke and very few vape or smoke daily, finds a study led by researchers at NYU School of Global Public Health.
The study, published this month in the journal Nicotine & Tobacco Research, finds that over 80 percent of youth do not use any tobacco and over 86 percent don't vape—and among the minority who do vape, most are not regular users. In addition, the study reveals that most youth who vape are also current or former smokers.
"Our findings underscore the importance of examining the full context of how youth are using vaping and tobacco products," said Allison Glasser, an assistant research scientist at NYU School of Global Public Health and the study's lead author. "The key to protecting youth in the United States is determining the patterns of frequency of use and co-use of vaping and tobacco products, which will give public health decision makers the best possible information to protect the public's health."
While the FDA and CDC's National Youth Tobacco Survey has shown a concerning increase in youth vaping in recent years, little is known about the frequency with which youth use e-cigarettes—if it's an occasional occurrence or a daily habit—as well as whether they also use more harmful smoked tobacco products like cigarettes and inexpensive cigars or cigarillos.
In this study, the researchers analyzed the 2018 National Youth Tobacco Survey, in which more than 20,000 middle and high school students were asked about their use of various tobacco and vaping products in the past 30 days. The analysis was conducted on the 2018 survey, the latest available full data set; the 2019 National Youth Tobacco Survey, which showed that youth vaping continued to grow from 2018 to 2019, has not yet been made available for public analysis.
A critical finding across all surveys from 2013 to 2019 is that smoking actually decreased much more rapidly to a record low during the very same years vaping increased. From 2015 to 2018, daily cigarette smoking among youth declined from 1.2 percent to 0.9 percent, while regular vaping (20 or more out of the past 30 days) increased from 1.7 percent to 3.6 percent.
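Expressed as relative changes (simple arithmetic on the figures above, not additional survey results), daily smoking fell by a quarter over that period while regular vaping more than doubled:

```python
# Relative changes implied by the 2015-2018 figures quoted above.
smoking_2015, smoking_2018 = 1.2, 0.9   # % of youth smoking cigarettes daily
vaping_2015, vaping_2018 = 1.7, 3.6     # % vaping on 20+ of the past 30 days

print(f"daily smoking: {(smoking_2018 - smoking_2015) / smoking_2015:+.0%}")   # -25%
print(f"regular vaping: {(vaping_2018 - vaping_2015) / vaping_2015:+.0%}")     # +112%
```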
"The faster drop in smoking suggests vaping is helping displace youth use of much more deadly smoking—a net harm reduction benefit to the population as a whole," said David Abrams, a professor of social and behavioral sciences at NYU School of Global Public Health and a study coauthor.
The researchers also found that while youth vaping increased from 2017 to 2018, the increase was driven by infrequent e-cigarette use rather than regular use: in 2018, while 13.8 percent of students had vaped in the past 30 days, more than half of them vaped on five days or fewer.
Critically, the majority of youth vapers also use or have used more deadly smoked tobacco products (60 to 88.9 percent, depending on the frequency of vaping). While there has been fear that e-cigarettes are introducing nicotine to many youth who otherwise would not have smoked, the data show otherwise—only a small proportion of tobacco-naïve youth report vaping.
"Examining tobacco and e-cigarette use patterns in youth is informative about the risk of continued use in adulthood. While in a perfect world young people would not be smoking or vaping, if the vast majority of youth who vape are already current or former smokers, vaping could offer them a safer alternative than cancer-causing cigarettes," said Ray Niaura, a professor of social and behavioral sciences at NYU School of Global Public Health and a study coauthor.
"This study provides us with a better understanding of youth vaping patterns, which is critical for creating effective public health policies around nicotine and tobacco. Reacting too quickly to reports of  vaping without considering the full context could do more harm than good," added Abrams. "We need to avoid prohibitionist regulations like banning e-cigarettes—while leaving much more deadly cigarettes and cigars in corner stores—and instead should consider strong enforcement of age 21 sales restrictions. Prohibition creates a black market for vaping products or inadvertently pushes individuals back to smoking ."

More information: Allison M Glasser et al, Youth Vaping and Tobacco Use in Context in the United States: Results from the 2018 National Youth Tobacco Survey, Nicotine & Tobacco Research (2020). DOI: 10.1093/ntr/ntaa010
Provided by New York University 




Climate costs lowest if warming is limited to 2 degrees Celsius

HURRAH WE CAN CONTINUE CAPITALISM UNABATED
Using computer simulations of a model by U.S. Nobel Laureate William Nordhaus, researchers have weighed climate damage from increasing weather extremes, decreasing labor productivity and other factors against the costs of cutting greenhouse gas emissions by phasing out coal and oil. Interestingly, the most economically cost-efficient level of global warming turns out to be 2 degrees Celsius, the level to which more than 190 nations agreed in the Paris Climate Agreement. So far, however, CO2 reductions promised by nations worldwide are insufficient to reach this goal.
"To secure  for all people in these times of global warming, we need to balance the costs of  change damage and those of . Now, our team has found what we should aim for," says Anders Levermann from the Potsdam Institute for Climate Impact Research (PIK) and Columbia University's LDEO, New York, head of the team conducting the study. "We did a lot of thorough testing with our computers. And we have been amazed to find that limiting the global temperature increase to 2 degrees Celsius, as agreed in the science-based but highly political process leading to the 2015 Paris Agreement, indeed emerges as economically optimal."
Striving for economic growth
Climate policies such as the replacement of coal-fired power plants by windmills and solar energy or the introduction of CO2 pricing entail economic costs. The same is true for climate damage. Cutting greenhouse gas emissions clearly reduces the damage, but so far, observed temperature-induced losses in economic production have not really been accounted for in computations of economically optimal policy pathways. The researchers have now done that. They fed up-to-date research on economic damages driven by climate change effects into one of the most renowned computer simulation systems, the Dynamic Integrated Climate-Economy model developed by the Nobel Laureate in Economics, William Nordhaus, and used in the past for U.S. policy planning. The computer simulation is trained to strive for economic growth.
"It is remarkable how robustly reasonable the temperature limit of more or less 2 degrees Celsius is, standing out in almost all the cost-curves we've produced," says Sven Willner, also from PIK and an author of the study. The researchers tested a number of uncertainties in their study. For instance, they accounted for people's preference for consumption today instead of consumption tomorrow versus the notion that tomorrow's generations should not have less consumption means. The result, that limiting  to 2 degrees Celsius is the most cost-efficient, was also true for the full range of possible climate sensitivities; hence, the amount of warming that results from a doubling of CO2 in the atmosphere.
"The world is running out of excuses for doing nothing"
"Since we have already increased the temperature of the planet by more than one degree, 2 degrees Celsius requires fast and fundamental global action," says Levermann. "Our analysis is based on the observed relation between temperature and economic growth, but there could be other effects that we cannot anticipate yet." Changes in the response of societies to climate stress—especially a violent flare-up of smoldering conflicts or the crossing of tipping points for critical elements in the Earth system—could shift the cost-benefit analysis toward even more urgent action.
"The world is running out of excuses to justify sitting back and doing nothing—all those who have been saying that climate stabilization would be nice but is too costly can see now that it is really unmitigated global warming that is too expensive," Levermann concludes. "Business as usual is clearly not a viable economic option anymore. We either decarbonize our economies or we let global warming fire up costs for businesses and societies worldwide."
The study is published in Nature Communications.

More information: Nicole Glanemann, Sven N. Willner, Anders Levermann (2020): Paris Climate Agreement passes the cost-benefit test. Nature Communications. DOI: 10.1038/s41467-019-13961-1

Oceanographers predict increase in phytoplankton by 2100

A neural network-driven Earth system model has led University of California, Irvine oceanographers to a surprising conclusion: phytoplankton populations will grow in low-latitude waters by the end of the 21st century.
The unexpected simulation outcome runs counter to the longstanding belief by many in the environmental science community that future global climate change will make tropical oceans inhospitable to phytoplankton, which are the base of the aquatic food web. The UCI researchers provide the evidence for their findings in a paper published today in Nature Geoscience.
Senior author Adam Martiny, UCI professor in oceanography, explained that the prevalent thinking on phytoplankton biomass is based on an increasingly stratified ocean. Warming seas inhibit mixing between the heavier cold layer in the deep and lighter warm water closer to the surface. With less circulation between the levels, fewer nutrients reach the higher strata where they can be accessed by hungry plankton.
"All the  have this mechanism built into them, and it has led to these well-established predictions that phytoplankton productivity, biomass and export into the deep ocean will all decline with climate change," he said. "Earth system models are largely based upon laboratory studies of phytoplankton, but of course laboratory studies of plankton are not the real ocean."
According to Martiny, scientists traditionally account for plankton by measuring the amount of chlorophyll in the water. There is considerably less of the green stuff in low-latitude regions that are very hot compared to cooler regions further away from the equator.
"The problem is that chlorophyll is not everything that's in a cell, and actually in low latitudes, many plankton are characterized by having a very small amount of it; there's so much sunlight, plankton only need a few chlorophyll molecules to get enough energy to grow," he noted. "In reality, we have had so far very little data to actually demonstrate whether or not there is more or less biomass in regions undergoing stratification. As a result, the empirical basis for less biomass in warmer regions is not that strong."
These doubts led Martiny and his UCI colleagues to conduct their own phytoplankton census. Analyzing samples from more than 10,000 locations around the world, the team created a global synthesis of the key phytoplankton groups that grow in warm regions.
The vast majority of these species are very tiny cells known as picophytoplankton. Ten times smaller in diameter than the strains of plankton one would find off the California coast—and 1,000 times less voluminous—picophytoplankton are nonetheless great in number, making up 80 to 90 percent of plankton biomass in most warm regions.
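The jump from "10 times smaller in diameter" to "1,000 times less voluminous" is simple geometry, since volume scales with the cube of diameter:

```python
# Volume scales with the cube of diameter, so a cell 10x smaller across
# is 10**3 = 1000x smaller in volume.
ratio_diameter = 10
ratio_volume = ratio_diameter ** 3
print(f"{ratio_diameter}x smaller in diameter -> {ratio_volume}x less volume")
```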
The group built global maps and compared the quantity of biomass along the gradient of temperature, a key parameter, according to Martiny. Conducting a machine learning analysis to determine the difference now versus the year 2100, they found a big surprise: "In many regions there would be an increase of 10 to 20 percent of plankton biomass, rather than a decline," Martiny said.
"Machine learning is not biased by the human mind," he said. "We just give the model tons and tons of data, but they can help us challenge existing paradigms."
One of the theories the team explored to explain the growth, with help from co-author Francois Primeau, UCI professor of Earth system science, had to do with what happens to phytoplankton at the end of their life cycle.
"When plankton die—especially these small species—they sit around for a while longer, and maybe at high temperature other plankton can more easily degrade them and recycle the nutrients back to build new biomass," Martiny said.
Such ecosystem features are not easily taken into account by traditional, mechanistic Earth system models, according to Martiny, but they were part of the geographically diverse dataset the team used to train its machine learning-derived quantitative niche model.
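The article does not describe the model's internals. As a rough sketch of what a data-driven niche model looks like in outline, one could fit biomass observations against temperature and then query the fit under warmer conditions; the synthetic data, the positive warm-water trend and the choice of regressor below are assumptions for illustration only, not the UCI team's method.

```python
# Hedged sketch of a data-driven niche model: fit observed biomass against
# sea-surface temperature, then evaluate the fit under warming.
# Synthetic data and model choice are illustrative, not the study's.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
sst = rng.uniform(10, 30, 500)                      # deg C, synthetic samples
biomass = 5 + 0.3 * sst + rng.normal(0, 0.5, 500)   # invented warm-water trend

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(sst.reshape(-1, 1), biomass)

today, future = 27.0, 29.0                          # hypothetical tropical warming
change = model.predict([[future]])[0] / model.predict([[today]])[0] - 1
print(f"illustrative biomass change under +2 deg C: {change:+.1%}")
```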
Martiny said that this study, a follow-up to research published last summer, is further evidence of the diversity and resilience of phytoplankton.
"We could obviously let climate change get out of hand and go into completely uncharted territory, and then all bets are off," he said. "But at least for a while, I think the adaptive capabilities in these diverse  communities will help them maintain high biomass despite these environmental changes."
Joining Martiny and Primeau were fellow authors Pedro Flombaum, former UCI postdoctoral researcher and later visiting scholar in Earth system science (currently a professor at the University of Buenos Aires, Argentina), and Weilei Wang, UCI postdoctoral scholar in Earth system science. The study received support from the National Science Foundation's Ten Big Ideas program and the U.S. Department of Energy Office of Biological and Environmental Research.

More information: Global picophytoplankton niche partitioning predicts overall positive response to ocean warming, Nature Geoscience (2020). DOI: 10.1038/s41561-019-0524-2, https://nature.com/articles/s41561-019-0524-2

Patterns of thinning of Antarctica's biggest glacier are opposite to previously observed

The lead author undertaking satellite validation fieldwork on the Filchner Ronne Ice Shelf, West Antarctica with the Alfred Wegener Institute, Germany. Credit: Jonathan Bamber, University of Bristol
Using the latest satellite technology from the European Space Agency (ESA), scientists from the University of Bristol have been tracking patterns of mass loss from Pine Island—Antarctica's largest glacier.
They found that the pattern of thinning is evolving in complex ways both in space and time with thinning rates now highest along the slow-flow margins of the glacier, while rates in the fast-flowing central trunk have decreased by about a factor of five since 2007. This is the opposite of what was observed prior to 2010.
Pine Island has contributed more to sea level rise over the past four decades than any other glacier in Antarctica, and as a consequence has become one of its most intensively and extensively investigated ice stream systems.
However, different model projections of future mass loss give conflicting results; some suggest mass loss could dramatically increase over the next few decades, resulting in a rapidly growing contribution to sea level, while others indicate a more moderate response.
Identifying which is the more likely behaviour is important for understanding future sea level rise and how this vulnerable part of Antarctica is going to evolve over the coming decades.
The results of the new study, published in the journal Nature Geoscience, suggest that rapid migration of the grounding line, the place where the grounded ice first meets the ocean, is unlikely over that timescale, without a major change in ocean forcing. Instead, the results support model simulations that imply that the glacier will continue to lose mass but not at much greater rates than present.
Animated gif of thinning of Pine Island Glacier. Credit: University of Bristol
Lead author Professor Jonathan Bamber from the University of Bristol's School of Geographical Sciences, said: "This could seem like a 'good news story' but it's important to remember that we still expect this glacier to continue to lose mass in the future and for that trend to increase over time, just not quite as fast as some model simulations suggested.
"It's really important to understand why the models are producing different behaviour in the future and to get a better handle on how the glacier will evolve with the benefit of these new observations.
"In our study, we didn't make projections but with the aid of these new data we can improve model projections for this part of Antarctica."
Lab turns trash into valuable graphene in a flash

by Mike Williams, Rice University

Carbon black powder turns into graphene in a burst of light and heat through a technique developed at Rice University. Flash graphene turns any carbon source into the valuable 2D material in 10 milliseconds. Credit: Jeff Fitlow/Rice University

That banana peel, turned into graphene, can help facilitate a massive reduction of the environmental impact of concrete and other building materials. While you're at it, toss in those plastic empties.

A new process introduced by the Rice University lab of chemist James Tour can turn bulk quantities of just about any carbon source into valuable graphene flakes. The process is quick and cheap; Tour said the "flash graphene" technique can convert a ton of coal, food waste or plastic into graphene for a fraction of the cost used by other bulk graphene-producing methods.
"This is a big deal," Tour said. "The world throws out 30% to 40% of all food, because it goes bad, and plastic waste is of worldwide concern. We've already proven that any solid carbon-based matter, including mixed plastic waste and rubber tires, can be turned into graphene."

As reported in Nature, flash graphene is made in 10 milliseconds by heating carbon-containing materials to 3,000 Kelvin (about 5,000 degrees Fahrenheit). The source material can be nearly anything with carbon content. Food waste, plastic waste, petroleum coke, coal, wood clippings and biochar are prime candidates, Tour said. "With the present commercial price of graphene being $67,000 to $200,000 per ton, the prospects for this process look superb," he said.

Tour said a concentration of as little as 0.1% of flash graphene in the cement used to bind concrete could lessen its massive environmental impact by a third. Production of cement reportedly emits as much as 8% of human-made carbon dioxide every year.
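Read together, those two figures give a rough upper bound worth spelling out; this is a back-of-envelope reading of the numbers quoted in this article, not a calculation from the paper.

```python
# Back-of-envelope reading of the figures quoted above (illustrative only).
cement_share_of_co2 = 0.08       # cement: up to ~8% of human-made CO2 per year
reduction_from_graphene = 1 / 3  # claimed cut in concrete's environmental impact
potential_saving = cement_share_of_co2 * reduction_from_graphene
print(f"potential saving: roughly {potential_saving:.1%} of human-made CO2")  # ~2.7%
```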

In a flash, carbon black turns into graphene through a technique developed by Rice University scientists. The scalable process promises to quickly turn carbon from any source into bulk graphene. From left: undergraduate intern Christina Crassas, chemist James Tour and graduate students Weiyin Chen and Duy Luong. Credit: Jeff Fitlow/Rice University

"By strengthening concrete with graphene, we could use less concrete for building, and it would cost less to manufacture and less to transport," he said. "Essentially, we're trapping greenhouse gases like carbon dioxide and methane that waste food would have emitted in landfills. We are converting those carbons into graphene and adding that graphene to concrete, thereby lowering the amount of carbon dioxide generated in concrete manufacture. It's a win-win environmental scenario using graphene."

"Turning trash to treasure is key to the circular economy," said co-corresponding author Rouzbeh Shahsavari, an adjunct assistant professor of civil and environmental engineering and of materials science and nanoengineering at Rice and president of C-Crete Technologies. "Here, graphene acts both as a 2-D template and a reinforcing agent that controls cement hydration and subsequent strength development."

In the past, Tour said, "graphene has been too expensive to use in these applications. The flash process will greatly lessen the price while it helps us better manage waste."

"With our method, that carbon becomes fixed," he said. "It will not enter the air again."

The process aligns nicely with Rice's recently announced Carbon Hub initiative to create a zero-emissions future that repurposes hydrocarbons from oil and gas to generate hydrogen gas and solid carbon with zero emission of carbon dioxide. The flash graphene process can convert that solid carbon into graphene for concrete, asphalt, buildings, cars, clothing and more, Tour said.

Flash Joule heating for bulk graphene, developed in the Tour lab by Rice graduate student and lead author Duy Luong, improves upon techniques like exfoliation from graphite and chemical vapor deposition on a metal foil that require much more effort and cost to produce just a little graphene.

Even better, the process produces "turbostratic" graphene, with misaligned layers that are easy to separate. "A-B stacked graphene from other processes, like exfoliation of graphite, is very hard to pull apart," Tour said. "The layers adhere strongly together.

"But turbostratic graphene is much easier to work with because the adhesion between layers is much lower. They just come apart in solution or upon blending in composites.

"That's important, because now we can get each of these single-atomic layers to interact with a host composite," he said.

The lab noted that used coffee grounds transformed into pristine single-layer sheets of graphene.

Bulk composites of graphene with plastic, metals, plywood, concrete and other building materials would be a major market for flash graphene, according to the researchers, who are already testing graphene-enhanced concrete and plastic.

The flash process happens in a custom-designed reactor that heats material quickly and emits all noncarbon elements as gas. "When this process is industrialized, elements like oxygen and nitrogen that exit the flash reactor can all be trapped as small molecules because they have value," Tour said.

He said the flash process produces very little excess heat, channeling almost all of its energy into the target. "You can put your finger right on the container a few seconds afterwards," Tour said. "And keep in mind this is almost three times hotter than the chemical vapor deposition furnaces we formerly used to make graphene, but in the flash process the heat is concentrated in the carbon material and none in a surrounding reactor.

"All the excess energy comes out as light, in a very bright flash, and because there aren't any solvents, it's a super clean process," he said.

Luong did not expect to find graphene when he fired up the first small-scale device to find new phases of material, beginning with a sample of carbon black. "This started when I took a look at a Science paper talking about flash Joule heating to make phase-changing nanoparticles of metals," he said. But Luong quickly realized the process produced nothing but high-quality graphene.
Rice University scientists are turning waste into turbostratic graphene via a process they say can be scaled up to produce industrial-scale quantities. Credit: Rouzbeh Shahsavari/C-Crete Group

Atom-level simulations by Rice researcher and co-author Ksenia Bets confirmed that temperature is key to the material's rapid formation. "We essentially speed up the slow geological process by which carbon evolves into its ground state, graphite," she said. "Greatly accelerated by a heat spike, it is also stopped at the right instant, at the graphene stage.

"It is amazing how state-of-the-art computer simulations, notoriously slow for observing such kinetics, reveal the details of high temperature-modulated atomic movements and transformation," Bets said.

Tour hopes to produce a kilogram (2.2 pounds) a day of flash graphene within two years, starting with a project recently funded by the Department of Energy to convert U.S.-sourced coal. "This could provide an outlet for coal in large scale by converting it inexpensively into a much-higher-value building material," he said.

More information: Luong et al. Gram-scale bottom-up flash graphene synthesis, Nature (2020). DOI: 10.1038/s41586-020-1938-0


Study says that we trust our workplace robots



The only constant is change. Presumptions harden as truth but then there is occasion to throw presumptions off the table and start again. That's the deal with information technology using AI for business and with robots unleashed in the workplace. The presumptions are that such tech is potentially harmful and that if those robots rebel against you, you're toast.
A new study has a sunnier view: people place more trust in workplace robots than those presumptions suggest. "The majority (65 percent) of workers are optimistic, excited and grateful about having robot co-workers and nearly a quarter report having a loving and gratifying relationship with AI at work," said the press release provided by Oracle.
The information comes from an annual study that Oracle runs with the research company Future Workplace. The title of the report is "Artificial Intelligence Is Winning More Hearts and Minds in the Workplace." That is quite the positive headline; is it an aggressive spin or a realistic reflection that people are so amenable to workplace AI?
Responses were taken from 8,370 employees, managers and HR leaders across 10 countries. (And why not at least ask? Whether the responses are positive or negative, AI has changed the workplace and influences how human resources teams and managers behave in order to keep organizations on track.)
Judith Humphrey, the founder of a Toronto-based communications firm, in Fast Company had a look at the study. She thought it presented "a strong case that AI is already winning the hearts and minds of employees."
Consider technologies that remove the grunt work so that managers can turn to more creative pursuits; the technologies that teach workers how to maximize communications via imaginative digital platforms; technologies that add weight to their career portfolios as they seek promotions or new jobs.
Humphrey noted study results: "New technologies, according to respondents, will help them master new skills (36%), gain more free time (36%), and expand their current role so that it's more strategic (28%)."
With that said, people outside Oracle still may not easily accept the very thought of an employee at any company trusting a machine more than a human manager to "do the right thing" or make the right assessment.

A closer look at the survey questions, though, indicates the response was significant; the outcome had its own logic.
What, specifically, were the activities that respondents felt could be done better by robots than by their managers? These were (1) providing unbiased information, (2) maintaining work schedules, (3) solving problems and (4) managing a budget.
Increased adoption of AI at work is having an impact on the way employees interact with their managers. The traditional role of HR teams and the manager is shifting.


Oracle's press summary of the findings noted that "64 percent of people would trust a robot more than their manager and half have turned to a robot instead of their manager for advice."
As for Oracle's headline, "Artificial Intelligence Is Winning More Hearts and Minds in the Workplace," it is not an inaccurate spin but more a snippet from a larger thought. AI is winning more hearts and minds in the realm of what AI is good at doing, leaving room and time for managers to do what they do best: coach, motivate, inspire and build teams.
Jeanne Meister, founding partner, Future Workplace: "As workers and managers leverage the power of artificial intelligence in the workplace, they are moving from fear to enthusiasm as they see the possibility of being freed of many of their routine tasks and having more time to solve critical business problems for the enterprise."


Humphrey in Fast Company did not miss the part in the study where respondents pinned down what, on the flip side, their managers did better than robots: "understanding my feelings," "coaching me," "creating or promoting a work culture" and "evaluating team performance."
A prophetic enough article appeared back in 2016 in Harvard Business Review, where the authors argued that artificial intelligence will soon be able to do the administrative tasks that consume much of managers' time faster, better and at a lower cost.
The authors reflected on study findings at the time. The attitude was encouraging; managers could see the difference between intelligently leveraging AI for administrative tasks and data-driven solutions, as opposed to fighting AI as a threat leading to their removal, leadership skills and all.
"Writing earnings reports is one thing, but developing messages that can engage a workforce and provide a sense of purpose is human through and through," the authors wrote. "Tracking schedules and resources may soon fall within the jurisdiction of machines, but drafting strategy remains unmistakably human. Simply put, our recommendation is to adopt AI in order to automate administration and to augment but not replace human judgment."
In the Oracle study, meanwhile, workers in India (89 percent) and China (88 percent) were the most trusting of robots over their managers. Singapore followed at 83 percent; Brazil, 78 percent; Japan, 76 percent; UAE, 74 percent; Australia/New Zealand, 58 percent; U.S., 57 percent; UK, 54 percent; and France, 56 percent.

Burden of health care costs greatest among low-income Americans

by RAND Corporation

Higher income American households pay the most to finance the nation's health care system, but the burden of payments as a share of income is greatest among households with the lowest incomes, according to a new RAND Corporation study.

Households in the bottom fifth of income groups pay an average of 33.9% of their income toward health care, while families in the highest income group pay 16% of their income toward health care.

The analysis finds that households in the middle three income tiers pay between 19.8% and 23.2% of their income toward health care. The analysis considered all payments made by households to support health care, including taxes and employer contributions.
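The burden measure itself is conceptually simple: add up every channel through which a household pays into the system, including the less visible ones, and divide by household income. The household below is hypothetical and its figures are invented; the RAND study aggregates real survey and expenditure data.

```python
# Illustrative burden calculation for a hypothetical household
# (all numbers invented; not taken from the RAND data).
payments = {
    "employee premium contribution": 2500,
    "employer-paid premium":         6000,  # counted as a cost borne by the household
    "out-of-pocket spending":        1200,
    "taxes that fund health care":   4300,
}
income = 60000
burden = sum(payments.values()) / income
print(f"health care payments take {burden:.1%} of this household's income")  # ~23%
```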


The study is published online by the journal Health Services Research.

"Our findings suggest that health care payments in the U.S. are even more regressive than suggested by earlier research," said Katherine G. Carman, lead author of the study and a senior economist at RAND, a nonprofit research organization. "As national discussions continue about health reform and health equity, it's important to understand how the current health care system distributes costs and payments."


In 2015, health care spending accounted for nearly 18 percent of the U.S.'s gross domestic product, a measure of the total value of goods produced and services provided by the nation. Ultimately, all health care costs are paid by households, either in obvious ways, such as insurance premiums and out-of-pocket costs, or in less visible ways, such as employer-paid premiums and taxes.

RAND researchers analyzed a variety of sources of information to examine the burden that different families face to pay for health care, as well as the relationship between who pays for care and who receives care.

Researchers combined data from multiple sources collected in 2015, including the Survey of Income and Program Participation, the Medical Expenditure Panel Survey, the Kaiser Family Foundation/Health Research Education Trust Employer Health Benefits Survey, the American Community Survey and the National Health Expenditure Accounts.

Previous research has examined the distribution of health care financing, but the new RAND study considers payments made to finance health care, the dollar value of benefits received, and the impact on different groups by age, source of insurance and size of income.

The RAND study also is the first to consider the burden of health costs among people who are in nursing homes and other institutions, a calculation that led to higher estimates of health spending. The burden is particularly large on low-income people who need long-term care because in order to qualify for public benefits they must first spend most of their savings.

"We think this is a particularly important addition because those in nursing homes are among the most vulnerable in terms not only of their health, but also of the large financial burden that they face," Carman said.

While out-of-pocket spending, including insurance premiums, is the most obvious payment most people make for health care, the RAND study found it accounted for just 9.1% of health care costs. The vast bulk of health care costs are paid through health insurance premiums and taxes.

The study found that payments to finance health care averaged $9,393 per person, or 18.7% of average household income.
Examining benefits by type of insurance, researchers found that Americans with Medicare receive the greatest dollar value of health care, a result of older people generally using more health care services.

Those with Medicaid have the largest dollar value of health care received as a percent of income, which corresponds to the lower income and generally poorer health among the group. People with employer-sponsored insurance received the lowest dollar value of health care.

Unsurprisingly, those with lower income are much more likely to benefit from redistribution of health care payments made by others toward health care services.

The study found that households in the three lowest income groups receive more health care services than they pay for through all forms of payments. In the fourth income group, payments and the dollar value of care received are similar.

Households in the highest of the five income groups are paying much more into the system than they receive in health care services.

"Understanding how different groups contribute to and and benefit from health care spending is difficult for researchers, policymakers and the general public," Carman said. "This work provides better insight into how the American health care system redistributes contributions and spending across different parts of society."




Provided by RAND Corporation

The Blue Acceleration: Recent colossal rise in human pressure on ocean quantified

Global trends in use of the marine environment. Usage reached an inflection point around the turn of the new millennium. Credit: One Earth
Human pressure on the world's oceans accelerated sharply at the start of the 21st century and shows no sign of slowing, according to a comprehensive new analysis on the state of the ocean.
Scientists have dubbed the dramatic rise the "Blue Acceleration." The researchers from the Stockholm Resilience Centre, Stockholm University, synthesized 50 years of data from shipping, drilling, deep-sea mining, aquaculture, bioprospecting and much more. The results were published in the journal One Earth on 24 January.
The scientists say the largest ocean industry is the oil and gas sector, responsible for about one third of the value of the ocean economy. Sand and gravel are the ocean's most mined minerals, meeting demand from the construction industry. As freshwater becomes an increasingly scarce commodity, around 16,000 desalination plants have sprung up around the world in the last 50 years, with a steep rise since 2000, according to the analysis.
Lead author Jean-Baptiste Jouffray from the Stockholm Resilience Centre said, "Claiming ocean resources and space is not new to humanity, but the extent, intensity, and diversity of today's aspirations are unprecedented."
The industrialization of the ocean took off at the end of the last century, driven by a combination of technological progress and declining land-based resources.
"This Blue Acceleration is really a race for ocean resources and space, posing risks and opportunities for global sustainability."
The study highlights some positive human impacts. For example, the area protected from some exploitation has increased exponentially, with a surge since 2000 that shows no signs of slowing. And offshore wind farm technology has reached maturity in this period, allowing the world to reduce its reliance on fossil fuels.
The authors conclude by calling for increased attention to who is driving the Blue Acceleration, what is financing it, and who is benefiting from it. The United Nations is embarking on a "decade of the ocean" in 2021. The scientists say this is an opportunity to assess social-ecological impacts and manage ocean resources for long-term sustainability.
They highlight a high degree of consolidation in the seafood industry, oil and gas exploitation, and bioprospecting, with just a small handful of multinational companies dominating each sector. The team suggests that banks and other investors could adopt more stringent sustainability criteria for ocean investments.

More information: Jean-Baptiste Jouffray et al. The Blue Acceleration: The Trajectory of Human Expansion into the Ocean, One Earth (2020). DOI: 10.1016/j.oneear.2019.12.016