It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Tuesday, July 18, 2023
Field site shows evidence of humans in Oregon 18,000 years ago
Oregon archaeologists have found evidence suggesting humans occupied the Rimrock Draw rock shelter outside of what is now the Eastern Oregon town of Riley more than 18,000 years ago.
The University of Oregon's Museum of Natural and Cultural History Archaeological Field School, led by UO archaeologist Patrick O'Grady, has been excavating at the Rimrock Draw rock shelter since 2011 under an official partnership agreement with the Bureau of Land Management. Discoveries at the site have included stone tools and extinct mammal tooth fragments from the Pleistocene Epoch. The pieces of tooth enamel have been identified as bison and camel.
In 2012, O'Grady's team found camel teeth fragments under a layer of volcanic ash from an eruption of Mount St. Helens that was dated at more than 15,000 years ago. The team also uncovered two finely crafted orange agate scrapers, one in 2012 with preserved bison blood residue and another in 2015, buried deeper in the ash. Natural layering of the rock shelter sediments suggests the scrapers are older than both the volcanic ash and camel teeth.
Radiocarbon dating analysis on the tooth enamel, first in 2018 and then again in 2023, by Thomas W. Stafford Jr. of Stafford Research and John Southon of University of California, Irvine, yielded exciting results: a date of 18,250 years before present.
That date, in association with stone tools, suggests that the Rimrock Draw rock shelter is one of the oldest sites of human occupation in North America.
"This early date aligns well with the oral histories of the tribal nations in the region, many of whom have stories about witnessing geological events like the Missoula floods, a series of events that changed everything for the tribes between 18,000 and 15,000 years ago," said David Lewis, who received his doctorate in anthropology from the UO and is currently a professor of anthropology at Oregon State University. "As well, tribes have oral histories of encountering giant animals, monsters on the land, and Rimrock Draw rock shelter's evidence suggests that we did interact with the megafauna, and they may have become characters in our histories of the time before memory."
Additional testing of other camel and bison tooth fragments is currently underway, and archaeobotanists are studying plant remains from cooking fires as well.
"The identification of 15,000-year-old volcanic ash was a shock, then Tom's 18,000-year-old dates on the enamel, with stone tools and flakes below, were even more startling," O'Grady said.
Currently, another archaeological site on BLM-managed public lands in western Idaho, known as Cooper's Ferry, is thought to be the oldest known site of human habitation in western North America. Evidence there suggests human occupation dating back more than 16,000 years.
"This is a very exciting development for the archaeological community," said Heather Ulrich, BLM Oregon/Washington archaeology lead. "Previous excavations on BLM public lands in Oregon have provided archaeological evidence of human occupation dating back 14,000 years. Thanks to the partnership with Dr. O'Grady and the university, these new dates push our archaeological knowledge of human occupation in North America even farther, perhaps the oldest yet."
This summer, O'Grady plans to complete the final archaeology field school at Rimrock Draw. The team will be working on several units where more ice age animal remains and artifacts are providing supporting evidence for the 2012 discoveries.
A revised strategy to reduce global shipping emissions has emerged from two weeks of intense talks in London. It marks a significant increase in the industry's climate ambition.
The revised strategy has been criticized for not being ambitious enough. However, the forecast growth in global trade and the world's shipping fleet means the reductions required of individual ships are much greater than the overall greenhouse gas emission targets.
The new targets for international shipping are:
reductions of 20% (from a 2008 baseline), striving for 30%, by 2030
reductions of 70%, striving for 80%, by 2040
net-zero emissions "by or around, i.e. close to" 2050.
We calculate the strategy will require cuts in emissions per ship of up to 60% by 2030 and as much as 91% by 2040. This means the days of fossil-fueled ships are numbered.
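The arithmetic behind those per-ship figures can be sketched as follows. If total fleet emissions must fall even as transport activity grows, each ship must cut its emissions per unit of work by more than the headline target. The growth multipliers below are illustrative assumptions for the sketch, not figures from the IMO strategy.

```python
def per_ship_cut(total_cut: float, activity_growth: float) -> float:
    """Required emission cut per unit of transport work.

    If total fleet emissions must fall by `total_cut` (as a fraction of
    the 2008 baseline) while transport activity grows to `activity_growth`
    times its 2008 level, emissions per unit of work must fall by
    1 - (1 - total_cut) / activity_growth.
    """
    return 1 - (1 - total_cut) / activity_growth

# Illustrative activity-growth assumptions: transport work ~1.75x the
# 2008 level by 2030 and ~2.2x by 2040.
print(round(per_ship_cut(0.30, 1.75), 2))  # a 30% fleet cut -> ~0.60 (60%) per ship
print(round(per_ship_cut(0.80, 2.2), 2))   # an 80% fleet cut -> ~0.91 (91%) per ship
```

Under these assumed growth factors, the "striving" targets of 30% by 2030 and 80% by 2040 translate into per-ship cuts of roughly 60% and 91%, matching the figures above.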
Edging closer to limiting warming to 1.5℃
If global shipping were a country, its emissions would rank among the top 10 in the world. The industry should do its fair share in keeping global warming below 1.5℃.
The revised strategy was negotiated at the London headquarters of the International Maritime Organization (IMO), the United Nations agency that regulates shipping. Backed by the Science-Based Targets initiative, several Pacific Island states, New Zealand, the US, the UK and Canada had proposed emission cuts of at least 37% by 2030, 96% by 2040 and to absolute zero by 2050. (An initial strategy adopted in 2018 aimed to reduce shipping emissions by at least 50% by 2050.)
The revised strategy's targets are not as high as those called for by the science and the most ambitious governments. However, they are still very stringent at a ship level.
Before the revised strategy, IMO policy focused on improving the energy efficiency and carbon intensity of new and existing ships. These tools failed to rein in shipping emissions.
Climate Action Tracker's most recent analysis concluded the "highly insufficient" initial strategy put shipping on a pathway consistent with 3-4℃ of warming. To estimate how the new targets compare, they can be superimposed on that assessment's trajectories, assuming the strategy's yet-to-be-adopted measures prove effective.
This shows the revised strategy still does not align global shipping with the emission-reduction pathway needed to avoid more than 1.5℃ of warming. But it does mark the beginning of the end for fossil fuels.
What are the strategy's key elements?
The revised strategy calls for "net-zero" GHG emissions "by or around, i.e. close to 2050". The term "net" leaves an unfortunate loophole for future use of emission offsets. It's big enough for the giant container ship Ever Given to steam through. This ambiguity has been left for future negotiations to resolve.
Importantly, though, IMO member states agreed to set targets for emissions on a "well-to-wake" basis, covering emissions from both fuel production and combustion. Including "upstream" emissions ensures shipping decarbonization does not shift emissions ashore. Being required to achieve these reductions will fundamentally and rapidly change the sector's technology and energy supply chains.
Decarbonization will drive up shipping costs. Developing countries fear the impacts will be much greater for them than for developed countries.
Small island developing states and least developed countries bear almost no historical responsibility for the climate crisis. They have called for a "just and equitable transition". So too have countries with large numbers of maritime workers, like the Philippines, as well as the International Transport Workers' Federation representing these workers.
To deliver a policy that both reduces emissions and supports a just and equitable transition, the revised strategy includes a commitment to finalize a "basket of candidate measures", both technical and economic.
The technical measure is a fuel standard that ratchets down the permissible emission intensity of fuels over time. This proved uncontroversial.
The "candidate" economic measures to price emissions did not all get broad support. For example, a mandatory universal emissions levy—proposed by the Marshall Islands and the Solomon Islands—was strongly opposed by countries like China, Brazil and Argentina for fear it might harm their exports.
Many "small island developing states" and "least developed countries" backed a levy. They see it as the most environmentally effective companion to a fuel standard. A price on emissions will speed up the transition, while revenues from the levy can be used to support a just and equitable transition.
As a result of these political differences, more work needs to be done to resolve the specifics of the emissions pricing mechanism.
The message to industry is crystal clear: the commercial competitiveness of fossil-fuel-driven ships, and demand for them, will dwindle rapidly with almost a full phasing out by the 2040s. During this rapid transition, shipping firms will have to very carefully manage the liabilities and risks of existing investments and formulate ways to maximize opportunities and market share.
Pressure from Pacific Island states and increased public scrutiny forced IMO member states to commit to higher levels of ambition than many had wanted to accept. Continued pressure will be needed, though, to ensure the measures adopted deliver on the ambition of the IMO strategy.
Before member states adopt any of these measures, the UN Conference on Trade and Development (UNCTAD) will model the expected impacts on states. Some countries may then fight hard to block or reduce the effect of measures that have "disproportionate negative impacts".
The final "basket of measures" won't be adopted until 2025 when their details are finalized. They will become legally binding when the strategy comes into force in 2027.
In sum, the revised strategy is a modest win, but the battle is far from over.
In the past 70 years, thousands of pesticides have been applied worldwide to increase agricultural production. The intensification of agriculture has led to higher yields, but what are the side effects? Systematic monitoring programs on pesticide residues in ecosystems are only established for water bodies and crops, and only scattered information is available on residues in soils, sediment, air, indoor dust, and the human body.
A session by the SPRINT project at the Science Summit at the UN General Assembly in New York on 27 September 2023 will offer significant new knowledge on the urgent question of reducing pesticide risk as a basis for the transition to sustainable agriculture.
SPRINT is the first European-wide effort, extended to Argentina as a significant feed exporter, to monitor actual pesticide application patterns and the resulting distribution of pesticide residues across all central cropping systems. At the UN Science Summit, the initial results of a monitoring campaign will be presented from the growing season 2021, based on the analysis of about 600 samples across soil, plants, surface water, sediment, air and indoor dust and urine and fecal samples from approximately 700 people.
SPRINT focuses on more than 200 active substances of synthetic pesticides and their metabolites from products that are either approved, are substitute candidates or are now banned in the EU. Pesticide residue distribution in different environments is linked to the hazardous properties of the pesticide residues for the ecosystem and human health. Laboratory tests also identify the effects on the ecosystem and human health.
Key findings from the monitoring campaign
Overall
Pesticide residue mixtures are omnipresent in all ecosystems and humans; the presence of mixtures is the new normal.
In more than 90% of the human urine and fecal samples, residues are detected, with a maximum of 20 different types.
Benchmarks for pesticide mixtures in humans, soil, sediment, air and indoor dust are not yet included in the regulations.
In the environment
38% of the European surface water bodies exceed the benchmarks for drinking water quality.
96% of the soils sampled under conventional farming contain pesticide mixtures with a maximum of 20 residues.
79% of soils under organic farming contain mixtures as well.
Most pesticide residues detected in soils and water are classified as hazardous for earthworms or fish.
More than 90 different pesticides and pesticide metabolites were detected in the air in two European agricultural areas.
On crops and farms
More than 40% of the crops contain mixtures with a maximum of 20 residues per sample. The maximum residue level (MRL) for single residues is mostly not exceeded, but there is no MRL for mixtures.
Ecosystems connected to organic farms contained fewer residues, but often held mixtures of mostly banned pesticides left over from past applications.
The highest accumulation of pesticide residues is detected in indoor dust of farmhouses, with up to 121 different residues per sample in conventional farmhouses and more than 80 in organic farmhouses.
More than 30% of the residues detected in indoor dust are classified as carcinogenic, and more than 60% potentially affect reproductive development.
In advance of the Science Summit, Violette Geissen, the project leader, said, "Our research has shown just how widespread these pesticide residues are in our homes, bodies and ecosystems. Based on our results, we propose a new risk indicator system to account for the frequency and levels of pesticide use that can be included in future regulations. With the EU's strong commitment to regulation and human safety, we are confident our new data will help inform decision-makers as they work towards the EU's policy goals of a 50% reduction in the use and risk of pesticides."
This summer has seen a violent outbreak of forest fires across Canada and other parts of North America. According to the Canadian Interagency Forest Fire Centre (CIFFC), there were 911 active fires across the country on July 13, nearly 600 of which were characterized as "out-of-control."
More than half of these active fires are taking place in the province of British Columbia, driven by a combination of unusual heat, dry lightning, and drought. The situation is becoming increasingly common thanks to rising global temperatures, diminished rainfall, changing weather patterns, and other related effects of climate change.
Monitoring forest fires and other meteorological phenomena is an important task for which Earth Observation missions like NASA's Aqua satellite were created. On July 12, with six weeks left in the Canadian fire season, Aqua captured images of some of the largest fires over British Columbia using its Moderate Resolution Imaging Spectroradiometer (MODIS) instrument.
The image above shows some of the biggest "hot spots" in the province, which produced dense plumes of smoke blowing eastward through the Rocky Mountains and into Alberta and the Northwest Territories.
The Aqua and Terra satellites work in tandem to acquire data twice daily at mid-latitudes using their MODIS instruments. These four daily observations inform operational fire management while advancing global monitoring of wildfires and their effects on the local environment, the planet's atmosphere, and climate.
In addition, these fires produce massive plumes of smoke that reach the upper atmosphere, where high-level winds spread the smoke and particulate matter to neighboring areas. Several of these fires have produced smoke-infused storm clouds in recent weeks, known as pyrocumulonimbus (pyroCb).
"We observed 14 pyroCbs with satellites on July 9-10 alone—several in British Columbia and some in Quebec," said David Peterson, an atmospheric scientist with the U.S. Naval Research Laboratory. "This surge in activity has made 2023 the most active year for pyroCbs both in Canada and worldwide since we began tracking their numbers closely about a decade ago."
As of July 13, Peterson and colleagues had observed 90 pyroCbs in Canada and 104 worldwide, setting new records for both Canada and the world; the previous records, set in 2021, were 50 and 100, respectively. The surge prompted Environment Canada to issue air quality warnings for communities throughout British Columbia, Alberta, and the Northwest Territories.
In addition, more than 1.2 million hectares were damaged by fire in British Columbia as of July 12, 2023, roughly 30 times the average by this point in the year.
A major outbreak in June struck the eastern province of Quebec and led to air quality issues in neighboring Ontario and parts of Minnesota, Wisconsin, Illinois, and Michigan. British Columbia's burned area represents about 13% of the 9.4 million hectares that had burned nationwide as of July 12. That national total is also roughly a third more than the 7.1 million hectares that burned in 1997, the second-highest year recorded by the CIFFC. Monitoring and tracking these firestorms is vital to mitigation and adaptation efforts as average global temperatures continue to rise.
The transition to an environmentally sustainable economy has begun, but it will be a generation (about twenty years) before we will see moderation of global warming. The cost of reducing climate pollution faster would be an economic and political disaster. Slowing or shutting down economic growth would destabilize politics, and, given the rapidly advancing technology of arms and destruction, that instability would be more destructive than the impact of climate change.
The pace of the transition to a renewable resource-based economy depends on the development and adoption of new energy and waste processing technologies. It is difficult to predict how fast that process will be, but the sunk costs in existing technologies will only be discarded when new technologies are clearly better, cheaper, and more reliable than what they replace. The idea that climate pollution can be eliminated by political edict overestimates political power and underestimates economic power.
It is not simply powerful economic interests that influence public policy, but the sense of economic well-being perceived and experienced by the mass public. The maintenance of that sense of well-being is a critical foundation of political stability. The transition to a renewable resource-based economy must be careful to reinforce and not undermine that sense of well-being.
The fact that this transition will take time does not diminish our sense of urgency about its necessity, but it is important that we deal with the world that we live in rather than the one we might wish for.
This summer of fire, floods, and intense heat makes it clear that decarbonization must accelerate. But our work on protecting our communities from the impact of extreme weather events is even more urgent. While there continues to be discussion of "managed retreat" from low-lying areas vulnerable to flooding, the widespread geography of extreme weather indicates that you can run but you can't hide from climate impacts.
Perhaps we should move from the Jersey shore to the Green Mountains of Vermont: Oops, that won't work, Vermont just got hit with two months' worth of rain in a few days. Perhaps we could move to small towns in the Midwest: Nope, they've got those tornadoes along with overflowing rivers as well. Maybe we should move out west near the forests: Oh yeah, drought and electrical sparks are causing massive fires. Forget retreat, managed or under conditions of panic, we need to build a stronger and more resilient built environment.
And the need is urgent. Drainage systems, dams, buildings, transportation, and energy systems must be built to withstand the rains, wind, heat, and cold of climate-accelerated extreme weather events. The impact of extreme weather is massive and growing. According to Christopher Flavelle of the New York Times:
"Weather-related disasters pushed more than 3.3 million American adults out of their homes in 2022, census data shows. Of those, at least 1.2 million people were out of their homes for a month or longer; more than half a million of them never returned, fueling a growing diaspora of domestic climate refugees."
Flavelle reports that we are not helpless to prevent damage from extreme weather:
"Technologies exist to protect homes against severe weather—but those innovations have been slow to seep into mainstream homebuilding, leaving most Americans increasingly exposed to climate shocks, experts say… Houses made from steel and concrete can be more resilient to heat, wildfire and storms. Even traditional wood-framed homes can be constructed in ways that greatly reduce the odds of severe damage from hurricanes or flooding. But the costs of added resiliency can be about 10 percent higher than conventional construction."
A climate-resilient built environment is going to cost money. Lots of money. And that means we will need to pay for those upgrades out of money that we'd rather use elsewhere but must be induced to spend on these protective measures. Government must lead with upgrades to public infrastructure such as roads, bridges, ports, dams, train tracks, and airports.
Taxes on income, sales, transportation, and tolls will need to rise to pay for these improvements. The anti-government forces controlling the U.S. House of Representatives are unlikely to be helpful here, and so we will need mayors and governors around the country to take the lead.
Here in New York, the state and city government are both taking action. Hilary Howard of the New York Times reported that the state has budgeted over $1.1 billion for flood control but wonders if it is enough. According to Howard's report:
"Catastrophic rainfall caused overwhelming floods in parts of the Hudson Valley and elsewhere in the country this week, leading New York officials like Gov. Kathy Hochul to warn of extreme weather that would be 'our new normal.' New York City's chief climate officer, Rohit T. Aggarwala, gave an even more dire warning, saying that 'the weather is changing faster than our infrastructure can keep up.' Thousands of projects are in the works across the state to combat the effects of climate change, including rethinking flood-resistant housing, updating weather models and racing to manage overflow rain. But many will take decades to complete, and there are concerns over whether it will be enough."
But clearly, New York state and city are working hard to reduce climate risk. In concert with government, the private sector also has a critical part to play in reducing the costs of climate risk. Insurance companies can play a central role by requiring building owners to meet higher standards of resistance to floods, fire, and wind before they can purchase insurance. Owners who meet these higher standards should be given a discount on the insurance they buy. Insurance companies have a clear self-interest in reducing the cost of climate risk and should play an active part in convincing property owners to invest in resilience measures. There is ample precedent for this. When buildings first began to electrify at the end of the 19th century, fire insurance companies supported William Henry Merrill's Underwriters Lab, which tested and certified the safety of equipment used in buildings ranging from refrigerators to boilers. Insurance companies required the use of these approved devices before they would agree to sell building insurance.
Today, some insurance companies are abandoning high-risk states like Florida and California. These insurers are unwilling to take the risk involved in insuring homes in climate-threatened places. Just as government had to assume the risk of flood insurance, we may eventually see that in home insurance as well. Homeowners holding mortgages are typically required to carry insurance, and government and insurance companies will need to work together to require more storm-resistant construction and retrofits. Otherwise, the cost of insurance will become prohibitive.
Government may well need to play the role of reinsurer in climate-threatened areas because private reinsurers may be unwilling to take on the financial burden of insuring insurers. You will find no climate deniers in the insurance and reinsurance industry. As Sean Baldwin and David Coon observed in Risk Management Magazine last summer:
"It is no secret that climate change is having a serious impact on the insurance market, affecting industry participants from the level of primary insurance all the way up to insurance-linked securities (ILS) and retrocessional reinsurance. With severe catastrophe events increasing in both frequency and severity as global temperatures and sea levels rise, industry participants have faced rapidly increasing exposure to severe catastrophe losses."
"For instance, one Moody's study found that more than 70% of global insured wildfire losses between 1980 and 2018 occurred between 2016 and 2018 alone. This dramatic increase in catastrophe risks poses challenges for every level of the insurance industry. For buyers of primary insurance, the changing nature of catastrophe risks results in increased premium rates from insurers who face not only increased insured losses, but also increased loss adjustment expenses tied to more frequent coverage disputes…. These changing risks, increased volatility and increased reinsurance demand will result in increased costs for reinsurance buyers."
As insurers and reinsurers find the risks growing, they are charging more to insure property, or simply refusing to do business. Lenders will not lend capital to businesses in uninsurable locations or will charge so much interest that borrowing is not practical. All of this could lead to a damaged business environment in climate-threatened regions.
The alternative to this risk-induced death spiral is to develop a built environment that is more capable of resisting catastrophic destruction. We've seen this in earthquake zones where stricter building codes enable structures to survive quakes with only minor damage. Buildings need to be constructed to survive forest fires, floods, and high winds. Towns and cities need to make sure they can handle the drainage needs of five inches of rain in a couple of hours. We can build more resilient structures if we are willing to pay the price.
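To give a sense of what "handling the drainage needs of five inches of rain in a couple of hours" means in practice, the rough sizing arithmetic can be sketched per hectare of catchment. The figures are illustrative back-of-the-envelope numbers, not an engineering design.

```python
# Back-of-the-envelope drainage arithmetic for the storm described above:
# five inches of rain falling over two hours, per hectare of catchment.
INCH_TO_M = 0.0254

rain_m = 5 * INCH_TO_M               # 0.127 m of rainfall depth
volume_m3_per_ha = rain_m * 10_000   # 1 ha = 10,000 m^2 -> 1,270 m^3 of water
flow_L_per_s = volume_m3_per_ha * 1000 / (2 * 3600)  # drained as fast as it falls

print(round(volume_m3_per_ha))  # 1270 m^3 per hectare
print(round(flow_L_per_s))      # ~176 L/s per hectare of sustained drainage capacity
```

Over a thousand hectares and two hours, that is well over a million cubic meters of water, which is why such storms overwhelm systems designed for historical rainfall patterns.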
We have entered an era of extreme weather, and we need to invest the massive resources required to survive weather impacts. One danger of this extreme focus on climate adaptation is that it might deemphasize the importance of reducing climate pollution. That would be a mistake because the planet could still get warmer, causing even greater damage.
I suspect that the political dynamic caused by these weather disasters will lead to greater understanding of the need to decarbonize our economy. Most people understand the growing intensity of floods, fire, droughts, and heat are impacts of climate change and are starting to see that we need to attack both causes and impacts.
The technologies that we rely on have made the world more comfortable, exciting, stimulating, and dangerous. We need to take these dangers seriously, and that requires reinforcing our homes as well as investment in the collective body we call government. We need policies, rules, and resources to ensure that the costs of our technological world do not exceed the benefits. Protection costs money.
In America we have the wealth to protect ourselves; what we lack is the political will to act. Conspiracy theories and fact-free social media rantings continue to clog our political discourse and make it difficult to address real dangers. Sadly, it takes floods, heat, and fire to convince people about the reality of our polluted planet. That growing awareness now needs to stimulate action.
NASA's first new wind tunnel in 40 years will turn science fiction to fact
by Katrina Dix
Flying cars. Space tourism. Safe reentry for astronauts coming back from Mars.
These technologies are still science fiction, but some won't be for much longer, according to Charles "Mike" Fremaux, NASA Langley Research Center's chief engineer for intelligent flight systems.
To test these concepts, particularly in regard to public and military safety, NASA Langley is building its first new wind tunnel in over 40 years. The NASA Flight Dynamics Research Facility, a project Fremaux has been pursuing for 25 years, will replace two smaller wind tunnels that are around 80 years old. The center's most recent and largest, the National Transonic Facility, was built in 1980.
"These facilities are really kind of tailor-made for doing a lot of that work," he said at a presentation at the Virginia Air & Space Science Center in Hampton on Tuesday. The talk was part of NASA Langley's Sigma Series community lectures.
"That's not our traditional wheelhouse. We haven't tested anything with a propeller on it in decades."
That's because many new craft will depend on electric vertical takeoff and landing, or "eVTOL," technology. With likely dozens or even hundreds of private vehicles in the airways, research is needed to understand how vehicles will react in real-world conditions.
Fremaux expects some of these technologies will likely be mainstream by 2040 or sooner.
The $43.2 million federal government contract to design and build the 25,000-square-foot facility went to BL Harbert International, a construction company based in Birmingham, Alabama. It is expected to open in early 2025.
The wind tunnel will be 130 feet tall, Fremaux said, comparing its capabilities to those of the tunnels it will replace: the 12-foot Low-Speed Spin Tunnel built in 1939 and the 20-foot Vertical Spin Tunnel built in 1940.
One project he worked on using the center's other wind tunnels—there are currently around 16 operating, Fremaux said—was the Stardust Mission in 2006, the first spacecraft to bring back material from outside the moon's orbit.
Without parachute technology developed at NASA Langley, it wouldn't have been possible to recover samples from that mission, Fremaux said.
Now, along with testing the next generation of commercial, earthbound aviation, the Flight Dynamics Research Facility will provide experimental support for entry, descent and landing of missions returning from the moon and Mars, as well as exploration of Venus and Titan, a moon of Saturn.
The research will support human space exploration, contributing to the possibility of safe landing and reentry on a human mission to Mars.
The research will be similar to some NASA Langley has performed for nearly 100 years as public and private air traffic went from hard-to-imagine to hard-to-imagine-life-without, Fremaux said.
The tunnel will provide safeguards not just to the public, but for the technicians who work there.
"How are the models going to be launched and retrieved?" audience member Ronald Hermansderfer, 89, asked, referring to small, free-flying scale models.
"The plan is to do that just like we do now; a very skilled technician is going to launch the models by hand. That's not a joke," Fremaux said to murmurs from the audience. "That's true, and we have one right here, now retired, who did it for many years. So I know that was a loaded question."
Fremaux recognized Hermansderfer, who worked at the center as a technician from 1983 to 2002. Hermansderfer's job was dangerous; if someone opened the wrong door elsewhere in the facility, affecting the pressure differential, a technician could be sucked into the wind tunnel while launching a model.
The new system will have a pressure equalization system, Fremaux said.
After the talk, Hermansderfer said that, as a kid, he used to set paper airplanes on fire and throw them out of windows. He did something remarkably similar in the testing tunnels at Langley. But it never really occurred to him he was in danger.
His son—also named Ronald Hermansderfer and also retired from NASA Langley—laughed.
In the world of fast fashion, where trends are born as quickly as they are discarded, global trade regulations have struggled to keep pace with relentless cycles of production and consumption.
In attempting to meet the demands of this fast-paced sector, global trade has historically failed to address the troubling reality hiding behind the industry's glamorous façade: a supply chain tainted by human rights abuses and forced labor.
But experts say that's changing.
In 2022, the U.S. Department of Homeland Security began enforcing the standards for manufacturing and trade under the Uyghur Forced Labor Prevention Act to crack down on Asian goods that U.S. officials suspect are the product of forced labor by imprisoned ethnic minorities. Those include the Uyghurs, whose maltreatment has been extensively documented.
The United States has banned a large number of garment imports from Vietnam, a major exporter of textiles. Companies there were found to be sourcing materials, including cotton, from manufacturers in China that the U.S. government believes violated trade and labor standards.
"We have a calling to use trade as a force for good, advocating for fairness and creating real opportunity for all of our people," said U.S. Trade Representative Katherine Tai at a "Dean's Dialogue" with USC Marshall Dean Geoffrey Garrett at Town and Gown on the University Park Campus in May.
"There are substantial challenges in the relationship," said Tai, referring to the U.S. and China. "The People's Republic of China's growth and development over the last few decades have been phenomenal, but the impacts, and especially the negative impacts on other economies—including ours—are having consequences that we cannot ignore."
Under scrutiny, brands confront the human cost of fast fashion
"The fast fashion industry openly admits to creating disposable products with no apologies. But it's crucial to include labor in the definition of sustainability because people are an integral part of our environment," said Annalisa Enrile, a teaching professor of social work at the USC Suzanne Dworak-Peck School of Social Work.
Shein, the world's largest online fashion retailer, has faced repeated backlash for suspected labor and human rights abuses in its supply chains, particularly the use of materials produced by imprisoned Uyghurs in China's Xinjiang region. U.S. lawmakers have called for investigations and potential tariffs, further complicating Shein's rumored plans to go public. The brand has since relocated its headquarters from Nanjing, China, to Singapore and invested in massive PR campaigns—including extravagant influencer factory tours in China—to clean up its image.
Enrile—an expert in global justice, human trafficking and exploitative labor—sheds light on the plight of migrants and child laborers, who constitute a significant portion of the global labor force. The International Labor Organization estimates that there are over 170 million migrant workers worldwide, nearly half of whom are women.
"Anywhere you have large-scale migration, you will also have a higher rate of labor exploitation. These countries often have export processing zones where factories are built with different laws and standards to attract businesses. It's a massive scope, with thousands of people migrating for labor every day, leading to entire villages in developing countries without women," Enrile said.
Gender-based violence and harassment in the fashion industry are well-documented, primarily driven by male factory owners and supervisors who enforce unreasonable production targets imposed by fashion brands. Some employers resort to coercive measures, including pressuring female workers to vow not to become pregnant, denying maternity leave and terminating pregnant employees.
Meanwhile, conservative estimates from the International Labor Organization show that South Asia, a major hub for the world's garment exports, is home to approximately 16.7 million children aged 5-17 engaged in child labor, with 10.3 million falling within the 5-14 age range. Children between the ages of 5 and 11 represent around one-fifth of all child laborers in South Asia.
However, Enrile pointed out that it is difficult to quantify and understand the full extent of child labor globally due to variations in its definition across countries. For example, in Vietnam, the age of employment is 15. In Bangladesh the legal age is technically 14, but children as young as 12 are allowed to engage in loosely defined "light work."
Empowered consumers drive fashion toward social responsibility
In today's ever-evolving landscape, the rise of conscious consumerism has become a game-changer. With a heightened awareness of the profound environmental and social consequences of their decisions, consumers are exerting immense pressure on fashion brands to revolutionize their supply and labor practices. The demand is clear: align with sustainability, embrace social responsibility and embody unwavering ethical values.
"The accessibility of information has brought to light the ethical concerns around product sourcing. We're seeing a preference for not just fair trade, but direct trade and more conscious and socially aware consumerism," said Elizabeth Currid-Halkett, the James Irvine chair in urban and regional planning and professor of public policy at the USC Price School of Public Policy.
Consumers value the origin of products, showing a preference for goods hailing from regions known for their quality, Currid-Halkett explained. An example of this is the resurgence in American manufacturing, especially the "Made in the U.S." movement prominent in high-end denim and loungewear such as T-shirts and sweatshirts.
"It holds great importance for the American economy as it reflects a sense of social consciousness around our purchasing decisions. When a product is made in the U.S., made in L.A. or made in Brooklyn, part of its appeal lies in knowing where it comes from and who is making it," she said.
The trend is gaining traction among a much larger consumer base, she added. While high-income consumers initially had the means to pay for these types of goods, ethically sourced clothing is becoming increasingly more accessible amid growing sensitivity toward climate change and social justice.
"The industry's pursuit of the best version of a product extends beyond luxury; it encompasses a balance between quality and affordability. This shift in consumer mindset is reshaping our nation's industries as they seek out the people, countries and places that produce the best versions of various products," Currid-Halkett said.
The future of fashion—and supply chain transparency—is digital
Companies have already begun leveraging the power of emerging technologies, such as generative AI and mixed reality, to bolster their marketing strategies. Through robust e-commerce platforms, they provide personalized recommendations driven by AI algorithms. Virtual reality and augmented reality technologies enable virtual try-ons and immersive shopping experiences.
But this expanding digital toolkit also presents opportunities for enhancing transparency and accountability across even the most complex supply chains.
Nick Vyas, an associate professor at the USC Marshall School of Business and an expert in global supply chain management, sees great promise in artificial intelligence to revolutionize business processes within the global fashion industry.
With AI-driven systems, retailers and manufacturers can track inventory using RFID tags and IoT sensors, ensuring better visibility throughout their clothing supply chains. AI-powered decision-making can help companies select suppliers based on their performance, certifications, and historical and real-time data, supporting ethical sourcing practices.
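To make the supplier-selection idea concrete, here is a minimal sketch in Python. All supplier names, scores, weights, and thresholds are hypothetical, invented for illustration; this is one simple form the AI-assisted ranking described above could take, not a description of any company's actual system.

```python
# Hypothetical suppliers: labor-audit score (0-1), certification flag,
# and on-time delivery rate. Real systems would pull these from audits,
# certificate registries, and live IoT/RFID data.
SUPPLIERS = [
    {"name": "Supplier A", "audit": 0.92, "certified": True,  "on_time": 0.97},
    {"name": "Supplier B", "audit": 0.55, "certified": False, "on_time": 0.99},
    {"name": "Supplier C", "audit": 0.88, "certified": True,  "on_time": 0.85},
]

def score(s, w_audit=0.5, w_cert=0.3, w_on_time=0.2):
    """Weighted composite score; the weights here are illustrative only."""
    return w_audit * s["audit"] + w_cert * float(s["certified"]) + w_on_time * s["on_time"]

def ethical_shortlist(suppliers, min_audit=0.8):
    """Drop suppliers below a labor-audit threshold, then rank the rest."""
    eligible = [s for s in suppliers if s["audit"] >= min_audit]
    return sorted(eligible, key=score, reverse=True)

for s in ethical_shortlist(SUPPLIERS):
    print(s["name"], round(score(s), 3))
```

Note the design choice: the labor-audit threshold is a hard filter, not just a weight, so a supplier with excellent delivery performance (like the hypothetical Supplier B) cannot buy its way past an ethics floor.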
"Beyond the quest for lucrative expansion in new and existing foreign markets, the industry must redefine its success parameters. We must transition from a single bottom-line approach to a triple bottom line that includes profit, people, and the planet," Vyas said.
"The prevailing economic headwinds should accelerate our investment in emerging technologies that enhance efficiency and promote sustainability. As we navigate these tumultuous times, the key to resilience lies not in resisting change but in embracing it," he said.
"Change requires a collective commitment to transparency, both in our own actions and in the systems that govern us," Enrile added. "Just as we have made strides towards environmental sustainability, we must apply the same dedication to human rights."
It could almost be the start of a joke: What do bats, priests and rabbis have in common? The answer: Their social networks can be traced using a new computer science tool called network analysis.
Prof. Michal Bar-Asher Siegal, a Talmud scholar from Ben-Gurion University of the Negev, and Prof. Yossi Yovel, a zoologist from Tel Aviv University, are not an obvious pairing. However, Prof. Yovel is an expert in network analysis, which he uses to study bats, and together the two had the radical idea of applying his methodologies to Judeo-Christian relations in the literature of the early first centuries CE.
Their proof of concept was published in Humanities and Social Sciences Communications.
Using passages from the Babylonian Talmud and Christian texts from the first to sixth centuries, which Prof. Bar-Asher Siegal had previously analyzed, they performed the first network analysis of textual parallels of Christian writings and Jewish sources.
"Our study shows that network analysis of textual parallels using the tools of computer science yields wondrous results. We believe it will really open up the study of relations between the two religions in the beginning of the common era. It has already yielded new insights when applied to our small sample of texts. Who knows what exciting discoveries await when we analyze larger amounts of text," says Prof. Bar-Asher Siegal.
"This is a good example of how interdisciplinarity and the use of tools from one scientific field can enrich another," adds Prof. Yovel.
While network analysis is not new in the field of digital humanities, it is new to the study of rabbinic and Christian literary interactions. This is the first time an automated analysis of such textual parallels has been accomplished, according to the researchers.
One of the fascinating benefits of this novel approach is the visualization of the relationships between Jewish and Christian writings to represent sets of temporal-spatial-contextual relationships, which evolved over hundreds of years, in single snapshots.
"The visualizations allow us both to get a clearer picture of the interactions and look at them in new ways, which prompted new insights," explains Prof. Bar-Asher Siegal.
The evolution of relations between Judaism and Christianity has often been characterized as a "parting of the ways" where, after a specific point, the two religions began to develop independently. While recent scholarly research has challenged that idea, these new visualizations really bring to light how the two religions developed in parallel and in dialog, not separately.
"The application of network analysis makes it possible to identify the most influential texts—that is, the key 'nodes'—testifying to the importance of certain traditions for both religious communities. What did the Jews know? The New Testament or later sources? And which parts of the New Testament? This leads to interesting scholarly questions: why these texts and not others? How did they know and how did they react to this knowledge?" explains Prof. Bar-Asher Siegal.
"The networks we created reveal that rabbinic sources mock and argue against early Christian sources, but less so when it comes to later ones. Moreover, network analysis suggests a correlation between text, time, and geography. Namely, Jewish sources are familiar with early, eastern Christian sources, BUT they show wider geographical familiarity with both eastern and western Christian sources in later periods," according to Prof. Bar-Asher Siegal.
More information: Michal Bar-Asher Siegal et al, Network analysis reveals insights about the interconnections of Judaism and Christianity in the first centuries CE, Humanities and Social Sciences Communications (2023). DOI: 10.1057/s41599-023-01678-y