Monday, January 27, 2020


Climate costs lowest if warming is limited to 2 degrees Celsius

HURRAH WE CAN CONTINUE CAPITALISM UNABATED
Credit: CC0 Public Domain
Using computer simulations of a model by U.S. Nobel Laureate William Nordhaus, researchers have weighed climate damage from increasing weather extremes, decreasing labor productivity and other factors against the costs of cutting greenhouse gas emissions by phasing out coal and oil. Interestingly, the most economically cost-efficient level of global warming turns out to be 2 degrees Celsius, the level to which more than 190 nations agreed in the Paris Climate Agreement. So far, however, the CO2 reductions promised by nations worldwide are insufficient to reach this goal.
"To secure  for all people in these times of global warming, we need to balance the costs of  change damage and those of . Now, our team has found what we should aim for," says Anders Levermann from the Potsdam Institute for Climate Impact Research (PIK) and Columbia University's LDEO, New York, head of the team conducting the study. "We did a lot of thorough testing with our computers. And we have been amazed to find that limiting the global temperature increase to 2 degrees Celsius, as agreed in the science-based but highly political process leading to the 2015 Paris Agreement, indeed emerges as economically optimal."
Striving for economic growth
Climate policies such as the replacement of coal-fired power plants by windmills and solar energy or the introduction of CO2 pricing entail costs. The same is true for climate damage. Cutting greenhouse gas emissions clearly reduces the damage, but so far, observed temperature-induced losses in economic production have not really been accounted for in computations of economically optimal policy pathways. The researchers have now done that. They fed up-to-date research on economic damages driven by climate change effects into one of the most renowned computer simulation systems, the Dynamic Integrated Climate-Economy (DICE) model developed by the Nobel Laureate in Economics William Nordhaus and used in the past for U.S. policy planning. The computer simulation is trained to strive for economic growth.
"It is remarkable how robustly reasonable the temperature limit of more or less 2 degrees Celsius is, standing out in almost all the cost-curves we've produced," says Sven Willner, also from PIK and an author of the study. The researchers tested a number of uncertainties in their study. For instance, they accounted for people's preference for consumption today instead of consumption tomorrow versus the notion that tomorrow's generations should not have less consumption means. The result, that limiting  to 2 degrees Celsius is the most cost-efficient, was also true for the full range of possible climate sensitivities; hence, the amount of warming that results from a doubling of CO2 in the atmosphere.
"The world is running out of excuses for doing nothing"
"Since we have already increased the temperature of the planet by more than one degree, 2 degrees Celsius requires fast and fundamental global action," says Levermann. "Our analysis is based on the observed relation between temperature and economic growth, but there could be other effects that we cannot anticipate yet." Changes in the response of societies to climate stress—especially a violent flare-up of smoldering conflicts or the crossing of tipping points for critical elements in the Earth system—could shift the cost-benefit analysis toward even more urgent action.
"The world is running out of excuses to justify sitting back and doing nothing—all those who have been saying that climate stabilization would be nice but is too costly can see now that it is really unmitigated global warming that is too expensive," Levermann concludes. "Business as usual is clearly not a viable economic option anymore. We either decarbonize our economies or we let global warming fire up costs for businesses and societies worldwide."
The study is published in Nature Communications.
How fast the planet warms will be crucial for livability

More information: Nicole Glanemann, Sven N. Willner, Anders Levermann (2020): Paris Climate Agreement passes the cost-benefit test. Nature Communications. DOI: 10.1038/s41467-019-13961-1

Oceanographers predict increase in phytoplankton by 2100

Credit: CC0 Public Domain
A neural network-driven Earth system model has led University of California, Irvine oceanographers to a surprising conclusion: phytoplankton populations will grow in low-latitude waters by the end of the 21st century.
The unexpected simulation outcome runs counter to the longstanding belief by many in the environmental science community that future global climate change will make tropical oceans inhospitable to phytoplankton, which form the base of the aquatic food web. The UCI researchers provide the evidence for their findings in a paper published today in Nature Geoscience.
Senior author Adam Martiny, UCI professor in oceanography, explained that the prevalent thinking on phytoplankton biomass is based on an increasingly stratified ocean. Warming seas inhibit mixing between the heavier cold layer in the deep and lighter warm water closer to the surface. With less circulation between the levels, fewer nutrients reach the higher strata where they can be accessed by hungry plankton.
"All the  have this mechanism built into them, and it has led to these well-established predictions that phytoplankton productivity, biomass and export into the deep ocean will all decline with climate change," he said. "Earth system models are largely based upon laboratory studies of phytoplankton, but of course laboratory studies of plankton are not the real ocean."
According to Martiny, scientists traditionally account for plankton by measuring the amount of chlorophyll in the water. There is considerably less of the green stuff in low-latitude regions that are very hot compared to cooler regions further away from the equator.
"The problem is that chlorophyll is not everything that's in a cell, and actually in low latitudes, many plankton are characterized by having a very small amount of it; there's so much sunlight, plankton only need a few chlorophyll molecules to get enough energy to grow," he noted. "In reality, we have had so far very little data to actually demonstrate whether or not there is more or less biomass in regions undergoing stratification. As a result, the empirical basis for less biomass in warmer regions is not that strong."
These doubts led Martiny and his UCI colleagues to conduct their own phytoplankton census. Analyzing samples from more than 10,000 locations around the world, the team created a global synthesis of the key phytoplankton groups that grow in warm regions.
The vast majority of these species are very tiny cells known as picophytoplankton. Ten times smaller in diameter than the strains of plankton one would find off the California coast—and 1,000 times less voluminous—picophytoplankton are nonetheless great in number, making up 80 to 90 percent of plankton biomass in most warm regions.
The group built global maps and compared the quantity of biomass along the gradient of temperature, a key parameter, according to Martiny. Conducting a machine learning analysis to determine the difference now versus the year 2100, they found a big surprise: "In many regions there would be an increase of 10 to 20 percent of plankton biomass, rather than a decline," Martiny said.
"Machine learning is not biased by the human mind," he said. "We just give the model tons and tons of data, but they can help us challenge existing paradigms."
One of the theories the team explored to explain the growth, with help from co-author Francois Primeau, UCI professor of Earth system science, had to do with what happens to phytoplankton at the end of their life cycle.
"When plankton die—especially these small species—they sit around for a while longer, and maybe at high temperature other plankton can more easily degrade them and recycle the nutrients back to build new biomass," Martiny said.
Such ecosystem features are not easily taken into account by traditional, mechanistic Earth system models, according to Martiny, but they were part of the geographically diverse dataset the team used to train its machine learning-derived quantitative niche model.
Martiny said that this study, a follow-up to research published last summer, provides further evidence of the diversity and resilience of phytoplankton.
"We could obviously let climate change get out of hand and go into completely uncharted territory, and then all bets are off," he said. "But at least for a while, I think the adaptive capabilities in these diverse  communities will help them maintain high biomass despite these environmental changes."
Joining Martiny and Primeau were fellow authors Pedro Flombaum, former UCI postdoctoral researcher and later visiting scholar in Earth system science (currently a professor at the University of Buenos Aires, Argentina), and Weilei Wang, UCI postdoctoral scholar in Earth system science. The study received support from the National Science Foundation's Ten Big Ideas program and the U.S. Department of Energy Office of Biological and Environmental Research.
Plankton are more resilient to nutrient stress than previously thought

More information: Global picophytoplankton niche partitioning predicts overall positive response to ocean warming, Nature Geoscience (2020). DOI: 10.1038/s41561-019-0524-2, https://nature.com/articles/s41561-019-0524-2

Patterns of thinning of Antarctica's biggest glacier are opposite to previously observed

The lead author undertaking satellite validation fieldwork on the Filchner Ronne Ice Shelf, West Antarctica with the Alfred Wegener Institute, Germany. Credit: Jonathan Bamber, University of Bristol
Using the latest satellite technology from the European Space Agency (ESA), scientists from the University of Bristol have been tracking patterns of mass loss from Pine Island—Antarctica's largest glacier.
They found that the pattern of thinning is evolving in complex ways both in space and time with thinning rates now highest along the slow-flow margins of the glacier, while rates in the fast-flowing central trunk have decreased by about a factor of five since 2007. This is the opposite of what was observed prior to 2010.
Pine Island has contributed more to sea level rise over the past four decades than any other glacier in Antarctica, and as a consequence has become one of its most intensively and extensively investigated ice stream systems.
However, different model projections of future mass loss give conflicting results; some suggest that mass loss could dramatically increase over the next few decades, resulting in a rapidly growing contribution to sea level, while others indicate a more moderate response.
Identifying which is the more likely behaviour is important for understanding future sea level rise and how this vulnerable part of Antarctica is going to evolve over the coming decades.
The results of the new study, published in the journal Nature Geoscience, suggest that rapid migration of the grounding line, the place where the grounded ice first meets the ocean, is unlikely over that timescale, without a major change in ocean forcing. Instead, the results support model simulations that imply that the glacier will continue to lose mass but not at much greater rates than present.
Animated gif of thinning of Pine Island Glacier. Credit: University of Bristol
Lead author Professor Jonathan Bamber from the University of Bristol's School of Geographical Sciences, said: "This could seem like a 'good news story' but it's important to remember that we still expect this glacier to continue to lose mass in the future and for that trend to increase over time, just not quite as fast as some model simulations suggested.
"It's really important to understand why the models are producing different behaviour in the future and to get a better handle on how the glacier will evolve with the benefit of these new observations.
"In our study, we didn't make projections but with the aid of these new data we can improve model projections for this part of Antarctica."
Lab turns trash into valuable graphene in a flash

by Mike Williams, Rice University

Carbon black powder turns into graphene in a burst of light and heat through a technique developed at Rice University. Flash graphene turns any carbon source into the valuable 2D material in 10 milliseconds. Credit: Jeff Fitlow/Rice University

That banana peel, turned into graphene, can help drive a massive reduction in the environmental impact of concrete and other building materials. While you're at it, toss in those plastic empties.

A new process introduced by the Rice University lab of chemist James Tour can turn bulk quantities of just about any carbon source into valuable graphene flakes. The process is quick and cheap; Tour said the "flash graphene" technique can convert a ton of coal, food waste or plastic into graphene for a fraction of the cost of other bulk graphene-producing methods.
"This is a big deal," Tour said. "The world throws out 30% to 40% of all food, because it goes bad, and plastic waste is of worldwide concern. We've already proven that any solid carbon-based matter, including mixed plastic waste and rubber tires, can be turned into graphene."

As reported in Nature, flash graphene is made in 10 milliseconds by heating carbon-containing materials to 3,000 Kelvin (about 5,000 degrees Fahrenheit). The source material can be nearly anything with carbon content. Food waste, plastic waste, petroleum coke, coal, wood clippings and biochar are prime candidates, Tour said. "With the present commercial price of graphene being $67,000 to $200,000 per ton, the prospects for this process look superb," he said.
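A back-of-envelope calculation gives a feel for the energy scales involved. The sample mass and specific heat below are assumptions for illustration; the paper's actual electrical parameters may differ.

```python
# Rough energy and power estimate for flash Joule heating (illustrative).
mass_g = 1.0            # assumed sample mass in grams
c_p = 0.7               # approx. specific heat of graphite, J/(g*K);
                        # it varies strongly with temperature, so this is
                        # a room-temperature-ish placeholder
delta_T = 3000 - 300    # heat from ambient to ~3,000 K
flash_s = 0.010         # flash duration: 10 milliseconds

energy_J = mass_g * c_p * delta_T     # q = m * c * dT
power_W = energy_J / flash_s
print(f"Energy: ~{energy_J:.0f} J, average power: ~{power_W / 1000:.0f} kW")
```

On these assumptions a gram-scale flash needs only a couple of kilojoules, but delivering that in 10 milliseconds implies average power in the hundreds of kilowatts, which is why a rapid electrical discharge rather than a furnace is the natural tool.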

Tour said a concentration of as little as 0.1% of flash graphene in the cement used to bind concrete could lessen its massive environmental impact by a third. Production of cement reportedly emits as much as 8% of human-made carbon dioxide every year.
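The scale of that claim is easy to check with simple arithmetic on the article's own figures (these are the numbers quoted above, not values taken from the paper):

```python
# Rough global impact of graphene-reinforced concrete, using the figures
# quoted in the article.
cement_share_of_co2 = 0.08    # cement: up to ~8% of human-made CO2 per year
claimed_reduction = 1 / 3     # ~one third less impact with 0.1% flash graphene
global_savings = cement_share_of_co2 * claimed_reduction
print(f"Potential saving: ~{global_savings:.1%} of annual human-made CO2")
```

That works out to roughly 2.7 percent of annual human-made CO2, which is why such a small additive concentration attracts so much attention.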

In a flash, carbon black turns into graphene through a technique developed by Rice University scientists. The scalable process promises to quickly turn carbon from any source into bulk graphene. From left: undergraduate intern Christina Crassas, chemist James Tour and graduate students Weiyin Chen and Duy Luong. Credit: Jeff Fitlow/Rice University

"By strengthening concrete with graphene, we could use less concrete for building, and it would cost less to manufacture and less to transport," he said. "Essentially, we're trapping greenhouse gases like carbon dioxide and methane that waste food would have emitted in landfills. We are converting those carbons into graphene and adding that graphene to concrete, thereby lowering the amount of carbon dioxide generated in concrete manufacture. It's a win-win environmental scenario using graphene."

"Turning trash to treasure is key to the circular economy," said co-corresponding author Rouzbeh Shahsavari, an adjunct assistant professor of civil and environmental engineering and of materials science and nanoengineering at Rice and president of C-Crete Technologies. "Here, graphene acts both as a 2-D template and a reinforcing agent that controls cement hydration and subsequent strength development."

In the past, Tour said, "graphene has been too expensive to use in these applications. The flash process will greatly lessen the price while it helps us better manage waste."

"With our method, that carbon becomes fixed," he said. "It will not enter the air again."

The process aligns nicely with Rice's recently announced Carbon Hub initiative to create a zero-emissions future that repurposes hydrocarbons from oil and gas to generate hydrogen gas and solid carbon with zero emission of carbon dioxide. The flash graphene process can convert that solid carbon into graphene for concrete, asphalt, buildings, cars, clothing and more, Tour said.

Flash Joule heating for bulk graphene, developed in the Tour lab by Rice graduate student and lead author Duy Luong, improves upon techniques like exfoliation from graphite and chemical vapor deposition on a metal foil that require much more effort and cost to produce just a little graphene.

Even better, the process produces "turbostratic" graphene, with misaligned layers that are easy to separate. "A-B stacked graphene from other processes, like exfoliation of graphite, is very hard to pull apart," Tour said. "The layers adhere strongly together.

"But turbostratic graphene is much easier to work with because the adhesion between layers is much lower. They just come apart in solution or upon blending in composites.

"That's important, because now we can get each of these single-atomic layers to interact with a host composite," he said.

The lab noted that used coffee grounds were transformed into pristine single-layer sheets of graphene.

Bulk composites of graphene with plastic, metals, plywood, concrete and other building materials would be a major market for flash graphene, according to the researchers, who are already testing graphene-enhanced concrete and plastic.

The flash process happens in a custom-designed reactor that heats material quickly and emits all noncarbon elements as gas. "When this process is industrialized, elements like oxygen and nitrogen that exit the flash reactor can all be trapped as small molecules because they have value," Tour said.

He said the flash process produces very little excess heat, channeling almost all of its energy into the target. "You can put your finger right on the container a few seconds afterwards," Tour said. "And keep in mind this is almost three times hotter than the chemical vapor deposition furnaces we formerly used to make graphene, but in the flash process the heat is concentrated in the carbon material and none in a surrounding reactor.

"All the excess energy comes out as light, in a very bright flash, and because there aren't any solvents, it's a super clean process," he said.

Luong did not expect to find graphene when he fired up the first small-scale device to find new phases of material, beginning with a sample of carbon black. "This started when I took a look at a Science paper talking about flash Joule heating to make phase-changing nanoparticles of metals," he said. But Luong quickly realized the process produced nothing but high-quality graphene.
Rice University scientists are turning waste into turbostratic graphene via a process they say can be scaled up to produce industrial-scale quantities. Credit: Rouzbeh Shahsavari/C-Crete Group

Atom-level simulations by Rice researcher and co-author Ksenia Bets confirmed that temperature is key to the material's rapid formation. "We essentially speed up the slow geological process by which carbon evolves into its ground state, graphite," she said. "Greatly accelerated by a heat spike, it is also stopped at the right instant, at the graphene stage.

"It is amazing how state-of-the-art computer simulations, notoriously slow for observing such kinetics, reveal the details of high temperature-modulated atomic movements and transformation," Bets said.

Tour hopes to produce a kilogram (2.2 pounds) a day of flash graphene within two years, starting with a project recently funded by the Department of Energy to convert U.S.-sourced coal. "This could provide an outlet for coal in large scale by converting it inexpensively into a much-higher-value building material," he said.

More information: Luong et al. Gram-scale bottom-up flash graphene synthesis. Nature (2020). DOI: 10.1038/s41586-020-1938-0


Study says that we trust our workplace robots



Credit: CC0 Public Domain
The only constant is change. Presumptions harden into truth, but then there is occasion to throw them off the table and start again. That's the deal with information technology using AI for business and with robots unleashed in the workplace. The presumptions are that such tech is potentially harmful and that if those robots rebel against you, you're toast.
A new study has a sunnier view. People have more trust in robots. "The majority (65 percent) of workers are optimistic, excited and grateful about having robot co-workers and nearly a quarter report having a loving and gratifying relationship with AI at work," said the press release provided by Oracle.
The information comes from an annual study that Oracle runs with the research company Future Workplace. The title of the report is "Artificial Intelligence Is Winning More Hearts and Minds in the Workplace." That is quite the positive headline; is it an aggressive spin or a realistic reflection that people are so amenable to workplace AI?
Responses were taken from 8,370 employees, managers and HR leaders across 10 countries. (And why not at least ask: whether the responses were positive or negative, AI has changed the workplace and influences how human resources staff and managers behave in order to keep organizations on track.)
Judith Humphrey, the founder of a Toronto-based communications firm, in Fast Company had a look at the study. She thought it presented "a strong case that AI is already winning the hearts and minds of employees."
Consider technologies that remove the grunt work so that managers can turn to more creative pursuits; the technologies that teach workers how to maximize communications via imaginative digital platforms; technologies that add weight to their career portfolios as they seek promotions or new jobs.
Humphrey noted study results: "New technologies, according to respondents, will help them master new skills (36%), gain more free time (36%), and expand their current role so that it's more strategic (28%)."
With that said, people outside Oracle still may not easily accept the very thought of an employee at any company trusting a machine more than a human manager to "do the right thing" or make the right assessment.

A closer look at the survey questions, though, indicates the response was significant; the outcome had its own logic.
What, specifically, were the activities that respondents felt could be done better by robots than by their managers? These were (1) providing unbiased information, (2) maintaining work schedules, (3) problem solving and (4) managing a budget.
Increased adoption of AI at work is having an impact on the way employees interact with their managers. The traditional role of HR teams and the manager is shifting.


Oracle's press summary of the findings noted that "64 percent of people would trust a robot more than their manager and half have turned to a robot instead of their manager for advice."
As for Oracle's upbeat headline, "Artificial Intelligence Is Winning More Hearts and Minds in the Workplace," it is not an inaccurate spin but more a snippet from a larger thought. AI is winning more hearts and minds in the realm of what AI is good at doing, leaving room and time for managers to do what they do best: coach, motivate, inspire and build teams.
Jeanne Meister, founding partner of Future Workplace, said: "As workers and managers leverage the power of artificial intelligence in the workplace, they are moving from fear to enthusiasm as they see the possibility of being freed of many of their routine tasks and having more time to solve critical business problems for the enterprise."


Humphrey in Fast Company did not miss the part of the study where respondents noted what, on the flip side, their managers did better than robots: "understanding my feelings," "coaching me," "creating or promoting a work culture" and "evaluating team performance."
A prophetic enough article appeared back in 2016 in Harvard Business Review, where the authors argued that artificial intelligence will soon be able to do the administrative tasks that consume much of managers' time faster, better, and at a lower cost.
The authors reflected on study findings at the time. The attitude was encouraging; managers could see the difference between intelligently leveraging AI for administrative work and data-driven solutions as opposed to fighting AI as a threat that could lead to their removal, leadership skills and all.
"Writing earnings reports is one thing, but developing messages that can engage a workforce and provide a sense of purpose is human through and through," the authors wrote. "Tracking schedules and resources may soon fall within the jurisdiction of machines, but drafting strategy remains unmistakably human. Simply put, our recommendation is to adopt AI in order to automate administration and to augment but not replace human judgment."
In the Oracle study, meanwhile, workers in India (89 percent) and China (88 percent) were the most trusting of robots over their managers. Singapore followed at 83 percent; Brazil, 78 percent; Japan, 76 percent; UAE, 74 percent; Australia/New Zealand, 58 percent; U.S., 57 percent; France, 56 percent; and the UK, 54 percent.

Burden of health care costs greatest among low-income Americans

by RAND Corporation

Higher income American households pay the most to finance the nation's health care system, but the burden of payments as a share of income is greatest among households with the lowest incomes, according to a new RAND Corporation study.

Households in the bottom fifth of income groups pay an average of 33.9% of their income toward health care, while families in the highest income group pay 16% of their income toward health care.

The analysis finds that households in the middle three income tiers pay between 19.8% and 23.2% of their income toward health care. The analysis considered all payments made by households to support health care, including taxes and employer contributions.
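A simple calculation makes the regressivity concrete: high-income households pay more dollars, but low-income households give up a far larger share of what they earn. The incomes below are hypothetical round numbers; the percentage shares are the ones reported by RAND.

```python
# Illustration of the regressive burden using the study's reported shares.
households = {
    "bottom fifth": (15_000, 0.339),   # (assumed income, reported share)
    "top fifth": (250_000, 0.160),
}
for group, (income, share) in households.items():
    print(f"{group}: pays ${income * share:,.0f} "
          f"({share:.1%} of a ${income:,} income)")
```

On these assumed incomes, the top-fifth household pays roughly eight times as many dollars, yet the bottom-fifth household surrenders more than twice the share of its income.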


The study is published online by the journal Health Services Research.

"Our findings suggest that health care payments in the U.S. are even more regressive than suggested by earlier research," said Katherine G. Carman, lead author of the study and a senior economist at RAND, a nonprofit research organization. "As national discussions continue about health reform and health equity, it's important to understand how the current health care system distributes costs and payments."


In 2015, health care spending accounted for nearly 18 percent of the U.S. gross domestic product, a measure of the total value of goods produced and services provided by the nation. Ultimately, all health care costs are paid by households, either in obvious ways such as insurance premiums and out-of-pocket costs, or in less visible ways such as employer-paid premiums and taxes.

RAND researchers analyzed a variety of sources of information to examine the burden that different families face to pay for health care, as well as the relationship between who pays for care and who receives care.

Researchers combined data from multiple sources collected in 2015, including the Survey of Income and Program Participation, the Medical Expenditure Panel Survey, the Kaiser Family Foundation/Health Research Education Trust Employer Health Benefits Survey, the American Community Survey and the National Health Expenditure Accounts.

Previous research has examined the distribution of health care financing, but the new RAND study considers payments made to finance health care, the dollar value of benefits received, and the impact on different groups by age, source of insurance and size of income.

The RAND study also is the first to consider the burden of health costs among people who are in nursing homes and other institutions, a calculation that led to higher estimates of health spending. The burden is particularly large on low-income people who need long-term care because in order to qualify for public benefits they must first spend most of their savings.

"We think this is a particularly important addition because those in nursing homes are among the most vulnerable in terms not only of their health, but also of the large financial burden that they face," Carman said.

While out-of-pocket spending, including insurance premiums, is the most obvious payment most people make for health care, the RAND study found it accounted for just 9.1% of health care costs. The vast bulk of health care costs are paid through health insurance premiums and taxes.

The study found that payments to finance health care averaged $9,393 per person, or 18.7% of average household income.

Examining benefits by type of insurance, researchers found that Americans with Medicare receive the greatest dollar value of health care, a result of older people generally using more health care services.

Those with Medicaid have the largest dollar value of health care received as a percent of income, which corresponds to the lower income and generally poorer health among the group. People with employer-sponsored insurance received the lowest dollar value of health care.

Unsurprisingly, those with lower income are much more likely to benefit from redistribution of health care payments made by others toward health care services.

The study found that households in the three lowest income groups receive more health care services than they pay for through all forms of payments. In the fourth income group, payments and the dollar value of care received are similar.

Households in the highest of the five income groups are paying much more into the system than they receive in health care services.

"Understanding how different groups contribute to and and benefit from health care spending is difficult for researchers, policymakers and the general public," Carman said. "This work provides better insight into how the American health care system redistributes contributions and spending across different parts of society."





The Blue Acceleration: Recent colossal rise in human pressure on ocean quantified

Global trends in use of the marine environment. Usage reached an inflection point around the turn of the new millennium. Credit: One Earth
Human pressure on the world's oceans accelerated sharply at the start of the 21st century and shows no sign of slowing, according to a comprehensive new analysis on the state of the ocean.
Scientists have dubbed the dramatic rise the "Blue Acceleration." The researchers, from the Stockholm Resilience Centre at Stockholm University, synthesized 50 years of data from shipping, drilling, deep-sea mining, aquaculture, bioprospecting and much more. The results are published in the journal One Earth on 24 January.
The scientists say the largest ocean industry is the oil and gas sector, responsible for about one third of the value of the ocean economy. Sand and gravel are the ocean's most mined minerals, meeting demand from the construction industry. As freshwater becomes an increasingly scarce commodity, around 16,000 desalination plants have sprung up around the world in the last 50 years, with a steep rise since 2000, according to the analysis.
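The figure caption's claim of an inflection point around 2000 is the kind of feature that can be located automatically. Below is a minimal sketch, on synthetic data standing in for one of the study's 50-year indicators, that scans candidate breakpoints for the best two-segment linear fit:

```python
# Locate an acceleration point in a time series by scanning breakpoints
# for the best-fitting continuous two-slope ("hinge") model. Synthetic data.
import numpy as np

years = np.arange(1970, 2020)
rng = np.random.default_rng(1)
usage = (years - 1970) + 4.0 * np.clip(years - 2000, 0, None)
usage = usage + rng.normal(0, 3, years.size)   # noisy hypothetical indicator

def hinge_fit_error(breakpoint):
    """Residual error of a two-slope fit with a kink at the breakpoint."""
    X = np.column_stack([
        np.ones(years.size),
        years - years[0],
        np.clip(years - breakpoint, 0, None),  # extra slope after the kink
    ])
    _, residual, *_ = np.linalg.lstsq(X, usage, rcond=None)
    return residual[0] if residual.size else np.inf

candidates = years[5:-5]                       # keep a few points per segment
best = min(candidates, key=hinge_fit_error)
print(f"Estimated acceleration point: {best}")  # -> close to 2000
```

A continuous hinge model is used here rather than two independent lines so the fitted trend has no artificial jump at the breakpoint; applied across many indicators at once, this sort of analysis is one way to date a shared acceleration rather than eyeballing the curves.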
Lead author Jean-Baptiste Jouffray from the Stockholm Resilience Centre said, "Claiming marine resources and space is not new to humanity, but the extent, intensity, and diversity of today's aspirations are unprecedented."
The industrialization of the ocean took off at the end of the last century, driven by a combination of technological progress and declining land-based resources.
"This Blue Acceleration is really a race for ocean resources and space, posing risks and opportunities for global sustainability."
The study highlights some positive human impacts. For example, the area protected from some exploitation has increased exponentially, with a surge since 2000 that shows no signs of slowing. And offshore wind farm technology has reached maturity in this period, allowing the world to reduce reliance on fossil fuels.
The authors conclude by calling for increased attention to who is driving the Blue Acceleration, what is financing it, and who is benefiting from it. The United Nations is embarking on a "decade of the ocean" in 2021. The scientists say this is an opportunity to assess social-ecological impacts and manage ocean resources for long-term sustainability.
They highlight that there is a high degree of consolidation in the seafood industry, oil and gas exploitation, and bioprospecting, with just a small handful of multinational companies dominating each sector. The team suggests that banks and other investors could adopt more stringent sustainability criteria for ocean investments.
New criteria for bank loans and stock exchange listings could protect ocean resources

More information: Jean-Baptiste Jouffray et al. The Blue Acceleration: The Trajectory of Human Expansion into the Ocean. One Earth. DOI: 10.1016/j.oneear.2019.12.016





On the occasion of Karl Marx's 200th birthday this year, numerous conferences, edited volumes and special issues have celebrated his work by focusing on its main achievements – a radical critique of capitalist society and an alternative vocabulary for thinking about the social, economic and political tendencies and struggles of our age. Albeit often illuminating, this has also produced a certain amount of déjà vu. Providing an occasion to disrupt patterns of repetition and musealization, Krisis (http://krisis.eu/) proposes a different way to pay tribute to Marx's revolutionary theorizing. We have invited authors from around the globe to craft short entries for an alternative ABC under the title "Marx from the Margins: A Collective Project, from A to Z" – taking up, and giving a twist to, Kevin Anderson's influential Marx at the Margins (2010).

The chief motivation of this collaborative endeavour is to probe the power – including the generative failures – of Marx's thinking by starting from marginal concepts in his work, or from social realities or theoretical challenges often considered to be marginal from a Marxist perspective. Rather than reproduce historically and theoretically inadequate differentiations between an ascribed or prescribed cultural, economic, geographic, intellectual, political, social, or spatial centre and its margins, the margins we have identified and inspected are epistemic vantage points that open up new theoretical and political vistas, while keeping Marx's thought from becoming either an all-purpose intellectual token employed with little risk from left or right, or a set of formulaic certitudes that force-feed dead dogma to ever-shrinking political circles.

We have welcomed short and succinct contributions that discuss how a wide variety of concepts – from acid communism and big data via extractivism and the Haitian Revolution to whiteness and the Zapatistas – can offer an unexpected key to the significance of Marx's thought today. The resulting ABC, far from a comprehensive compendium, is an open-ended and genuinely collective project that resonates between, and amplifies through, different voices speaking from different perspectives in different styles; we envisage it as a beginning rather than as an end. In this spirit, we invite readers to submit new entries to Krisis, where they will be subject to our usual editorial review process and added on a regular basis, thus making this issue of Krisis its first truly interactive one.

The project is also an attempt to redeem, in part, the task that the name of this journal has set for its multiple generations of editors from the very beginning: a crisis/Krise/Krisis is always a moment in which certainties are suspended, things are at stake, and times are experienced as critical. A crisis, to which critique is internally linked, compels a critique that cannot consist simply of ready-made solutions pulled out of the lectern but demands, in the words of Marx's "credo of our journal" in his letter to Ruge, "the self-clarification (critical philosophy) of the struggles and wishes of the age".

From the Mass-Worker to Cognitive Labour: Historical and Theoretical Considerations


https://www.academia.edu/15647826/From_the_Mass-Worker_to_Cognitive_Labour_Historical_and_Theoretical_Considerations



Interface: a journal for and about social movements, 2019