Sunday, July 23, 2023

'You've got to move fast': Science learns to quickly link extreme weather and climate

The Canadian Press
Sat, July 22, 2023 



As firefighters and other first responders battle an unprecedented summer of fires, floods, tornadoes and heat waves around the country, a group of Canadian scientists is asking why they're happening in the first place.

"May and June were record hot months in Canada and we've got the record wildfire season as well," said Nathan Gillett of Environment and Climate Change Canada. "Yes, it has been busy."

Gillett heads the Rapid Extreme Event Attribution Project, a new federal program that uses the growing field of attribution science to promptly establish to what extent — if any — a specific flood in British Columbia or wildfire in Quebec is due to climate change.

"The idea is to be able to make rapid extreme event attribution days or weeks after the extreme events occur," he said.

Twenty years ago, if you'd asked a scientist if climate change was linked to days of torrential rain or months of desiccating drought, you'd probably get an answer along the lines of "We can't say for sure but this event is consistent with the modelling."

But in 2003, a paper was published suggesting science could do better. Myles Allen of Oxford University borrowed a concept from epidemiology.

"You can say that smoking increases your risk of lung cancer by a certain amount," Gillett said. "In the same way, you can say human-induced climate change increased the risk of a certain event by a certain amount."

Since then, hundreds of attribution papers have been peer-reviewed and published. Besides Canada, governments including those of the United Kingdom, Australia, the Netherlands, South Korea, Japan and the United States are using attribution science.

Attribution science works by comparing climate models. One set of models will use data drawn from actual records while another, otherwise identical, set will be constructed with the influence of greenhouse gases removed.

Simulations will be run using those two sets and the difference in the results reveals the impact of climate change. It allows scientists to say to what extent the presence of greenhouse gases increased the likelihood of the event in question.

"It's probabilistic," Gillett said.
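The probability-ratio logic Gillett describes can be sketched in a few lines. This is a toy illustration with invented numbers, not the federal project's actual method: two synthetic ensembles stand in for the with- and without-greenhouse-gas simulations, and the ratio of their exceedance probabilities is the attribution statement.

```python
import random

def probability_ratio(factual, counterfactual, threshold):
    """Fraction of simulated seasons exceeding `threshold` in each
    ensemble; the ratio of those fractions is the attribution result."""
    p1 = sum(t >= threshold for t in factual) / len(factual)
    p0 = sum(t >= threshold for t in counterfactual) / len(counterfactual)
    return p1 / p0 if p0 > 0 else float("inf")

random.seed(42)
# Toy ensembles of peak summer temperature (deg C). All values are
# invented: the "factual" world runs ~1.2 deg C warmer than the
# greenhouse-gas-free "counterfactual" world.
counterfactual = [random.gauss(30.0, 1.5) for _ in range(20000)]
factual = [random.gauss(31.2, 1.5) for _ in range(20000)]

ratio = probability_ratio(factual, counterfactual, threshold=33.0)
print(f"a 33 deg C heat extreme was made {ratio:.1f}x more likely")
```

In a real study the two ensembles come from climate models driven by observed versus pre-industrial forcings, but the final statement has exactly this "made N times more likely" form.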

The process is now established enough, with peer-reviewed protocols and standards, that the calculations can be done quickly.

"Once you've got the method in place and it's validated, you really just have to get the observations from that event and you can provide a result," said Gillett.

Some events are easier to study than others. Gillett said his group hopes to be able to come to conclusions on heat waves in about a week, but wildfires, which involve more variables, will take longer.

Speed matters, said Clair Barnes, a researcher with the World Weather Attribution group in the U.K., which since 2015 has studied the role of climate change in more than 50 events around the world — including finding that the heat wave preceding the fire that levelled Lytton, B.C., was made 150 times more likely by climate change.

"Our aim is to look at high-impact events that are in the news," she said. "There was an appetite in the public and the media for more information about what's really happening now."

Promptly assessing the role of climate change after extreme events brings actual insight and information to the discussion, Barnes said.

"If you spend three years thinking about it, the media has already decided it was climate change or it wasn't climate change and has moved on. If you want to be involved in that discussion and bring some science to that discussion, you've got to move quickly."

But attribution science has more uses than just shaping public debate. Governments are using it to inform their adaptation strategies. Financial institutions are using it to assess risk. It's come up in hundreds of court cases around the world attempting to attribute climate liability.

It does have its limitations.

Attribution science can only work where there's enough historical weather data to build an accurate climate model. That leaves out much of the global south, where some of the worst human impacts are occurring. As well, extremely local events are often beyond its resolving power.

"You do have to be careful to communicate the uncertainties," said Gillett. "We shouldn't be overconfident."

There's certainly no shortage of work. Barnes said her group has had to establish a strict protocol that weighs the magnitude of the event, the amount of damage it inflicts and its effect on human lives to weed out which events merit study.

"There are so many events that we just don't have the time to look at them all."

But World Weather Attribution has found the time to consider Canada's wildfires. It's a complex case, so results aren't expected for another month or so.

By then, chances are there will be a new extreme event to consider. When Barnes joined World Weather Attribution, she assumed winter and summer — the seasons of peak temperature lows and highs — would be the busiest. Not so.

"We've had temperature records set for the last few months and it's not even the peak of boreal summer," she said. "It's just been non-stop."

This report by The Canadian Press was first published July 22, 2023.

Bob Weber, The Canadian Press

Memorial for fallen B.C. wildfire fighter held in Revelstoke

The Canadian Press
Sat, July 22, 2023 



REVELSTOKE, B.C. — Family, friends and fellow firefighters paid tribute today to the 19-year-old woman killed while battling wildfires in British Columbia earlier this month.

Devyn Gale died on July 13 after being struck by a falling tree while fighting a wildfire near Revelstoke, B.C.

Gale’s brother and sister, Nolan and Kayln, who are also firefighters, gave emotional speeches about their sister at a public memorial in Revelstoke, calling her compassionate, wise and nurturing.

Casey Robinson of the B.C. Wildfire Service, who interviewed and trained Gale, said he was impressed by her "smarts, her energy and her ability to work hard."

He says Gale was an "excellent firefighter" and encouraged all those in the same field to continue her legacy of "being welcoming, conscientious and open hearted to anyone who joins" their crews.

The service followed a memorial procession that included Gale's BC Wildfire Service colleagues, a Colour Party, Honour Guard and representatives from various first-responder agencies. Community members lined city streets in Revelstoke to watch the march.

Gale is one of three Canadian firefighters who have died battling the hundreds of blazes that are burning across the country.

Adam Yeadon, 25, died last Saturday while fighting a wildfire near his home in Fort Liard, N.W.T.

A 41-year-old helicopter pilot from Whitecourt, Alta., died after his aircraft crashed Wednesday during firefighting operations in that province's northwest.

This report by The Canadian Press was first published July 22, 2023.

The Canadian Press
Trinity Nuclear Test’s Fallout Reached 46 States, Canada and Mexico, Study Finds

Lesley M.M. Blume
Fri, July 21, 2023 

An undated photo provided by the National Archives and Records Administration of contaminated film scans that were sent from Rochester, N.Y. to Lt. Gen. Leslie Groves, the leader of the Manhattan Project, an early indicator that the fallout from the Trinity nuclear test was spreading nationwide. (National Archives and Records Administration via The New York Times)

In July 1945, as J. Robert Oppenheimer and the other researchers of the Manhattan Project prepared to test their brand-new atomic bomb in a New Mexico desert, they knew relatively little about how that mega-weapon would behave.

On July 16, when the plutonium-implosion device was set off atop a 100-foot metal tower in a test code-named “Trinity,” the resultant blast was much stronger than anticipated. The irradiated mushroom cloud also went many times higher into the atmosphere than expected: some 50,000 to 70,000 feet. Where it would ultimately go was anyone’s guess.

A new study, released Thursday before submission to a scientific journal for peer review, shows that the cloud and its fallout went farther than anyone in the Manhattan Project had imagined in 1945. Using state-of-the-art modeling software and recently uncovered historical weather data, the study’s authors say that radioactive fallout from the Trinity test reached 46 states, Canada and Mexico within 10 days of detonation.

“It’s a huge finding and, at the same time, it shouldn’t surprise anyone,” said the study’s lead author, Sébastien Philippe, a researcher and scientist at Princeton University’s Program on Science and Global Security.

The study also reanalyzed fallout from all 93 aboveground U.S. atomic tests in Nevada and created a map depicting composite deposition of radioactive material across the contiguous U.S. (The team also hopes to study U.S. tests over the Pacific Ocean in the future.)

How much of Trinity’s fallout still remains at original deposition sites across the country is difficult to calculate, said Susan Alzner, an author of the study and the co-founder of shift7, an organization that coordinated the study’s research. The study documents deposition as it originally hit the ground in 1945.

“It’s a frozen-in-time image,” she said.

The findings could be cited by advocates aiming to increase the number of people eligible for compensation by the federal government for potential exposure to radiation from atmospheric nuclear explosions.

The drift of the Trinity cloud was monitored by Manhattan Project physicists and doctors, but they underestimated its reach.

“They were aware that there were radioactive hazards, but they were thinking about acute risk in the areas around the immediate detonation site,” Alex Wellerstein, a nuclear historian at the Stevens Institute of Technology in New Jersey, said. They had little understanding, he said, about how the radioactive materials could embed in ecosystems, near and far. “They were not really thinking about effects of low doses on large populations, which is exactly what the fallout problem is.”

At the time, Dr. Stafford L. Warren, a Manhattan Project physician specializing in nuclear medicine, reported to Lt. Gen. Leslie Groves, leader of the Manhattan Project, that the Trinity cloud “remained towering over the northeast corner of the site for several hours.” Soon, he added, “various levels were seen to move in different directions.” Warren assured Groves that an assessment of the fallout’s reach could be undertaken later on horseback.

In the decades that followed, a lack of crucial data bedeviled assessments and attempted studies of the Trinity test’s fallout. The U.S. had no national monitoring stations in place in 1945 to track the fallout, Philippe said. Plus, essential historical weather and atmospheric data was available only from 1948 onward. Remodeling fallout from tests in Nevada — starting in 1951 — was easier, but Trinity remained frustratingly difficult to reanalyze.

“The data sets for the Nevada tests and the available data that we could possibly find for Trinity were not comparable,” Alzner said. “You couldn’t put them on the same map. We decided to keep pushing.”

Determined to fill in the gaps, the team started the study about 18 months ago. Philippe has extensive background in modeling fallout and was an author of a similar project in 2021 that documented the effects from French nuclear tests.

A breakthrough came in March, when Alzner and Megan Smith, another co-founder of shift7 and a former U.S. chief technology officer in the Obama administration, contacted the National Oceanic and Atmospheric Administration. There, Gilbert P. Compo, a senior research scientist at the University of Colorado and the NOAA Physical Sciences Laboratory, told the team the European Centre for Medium-Range Weather Forecasts had only a week earlier released historical data that charted weather patterns extending 30,000 feet or higher above Earth’s surface.

“For the first time, we had the most accurate hourly reconstruction of the weather back to 1940, around the world,” said Compo, who became a co-author on the study. “Every single event that puts something in the air, no matter what it is, can now be tracked, by the hour.”
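The hourly tracking Compo describes boils down to integrating a parcel's position through a time series of wind vectors. The sketch below is a deliberately simplified forward-Euler toy with invented wind values (real fallout reanalysis uses full 3-D atmospheric transport and deposition models), but it shows the basic bookkeeping:

```python
def advect(start, hourly_winds, dt=1.0):
    """Integrate a parcel position (km) forward through a sequence of
    hourly (u, v) wind vectors in km/h. Returns the full path."""
    x, y = start
    path = [(x, y)]
    for u, v in hourly_winds:
        x += u * dt
        y += v * dt
        path.append((x, y))
    return path

# Hypothetical winds: six hours of eastward flow, then six hours
# veering northeast. Values are invented for illustration.
winds = [(20.0, 0.0)] * 6 + [(15.0, 10.0)] * 6
path = advect((0.0, 0.0), winds)
x, y = path[-1]
print(f"after {len(winds)} h: {x:.0f} km east, {y:.0f} km north")
```

The new ECMWF reconstruction matters because it supplies exactly this kind of hourly wind field back to 1940, at many altitudes, for any release point.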

Using the new data and software built by NOAA, Philippe then reanalyzed Trinity’s fallout. And while the study’s authors acknowledge limitations and uncertainties within their calculations, they maintain that “our estimates likely remain conservatively low.”

“It’s a very comprehensive, well-executed study,” said M.V. Ramana, professor and Simons chair in disarmament, global and human security at the University of British Columbia, who was not involved in the study. Ramana was unsurprised by the study’s findings about Trinity. “I expected that the old estimates were understating what was actually deposited,” he said.

The results show that New Mexico was heavily affected by Trinity’s fallout. Computations by Philippe and his colleagues show the cloud’s trajectory primarily spreading up over northeast New Mexico and a part of the cloud circling to the south and west of ground zero over the next few days. The researchers wrote that there are “locations in New Mexico where radionuclide deposition reached levels on par with Nevada.”

Trinity’s fallout, Philippe says, accounts for 87% of total deposition found across New Mexico, which also received deposition from Nevada’s aboveground tests. The study also found that Socorro County — where the Trinity test took place — has the fifth-highest deposition of any county in the United States.

Trinity test “downwinders” — a term describing people who have lived near nuclear test sites and may have been exposed to deadly radioactive fallout — have never been eligible for compensation under the 1990 Radiation Exposure Compensation Act (RECA). It has provided over $2.5 billion in payments to nuclear workers in much of the Western U.S. and to downwinders who were located near the Nevada test site and may have developed cancer or other diseases as a result of radiation exposure.

“Despite the Trinity test taking place in New Mexico, many New Mexicans were left out of the original RECA legislation and nobody has ever been able to explain why,” said Sen. Ben Ray Luján, D-N.M. He has helped lead efforts in Congress to expand and extend the legislation, due to sunset in 2024.

Census data from 1940 shows that as many as 500,000 people were living within a 150-mile radius of the test site. Some families lived as close as 12 miles away, according to the Tularosa Basin Downwinders Consortium. Yet no civilians were warned about the test ahead of time, and they weren’t evacuated before or after the test.

“This new information about the Trinity bomb is monumental and a long time coming,” Tina Cordova, a co-founder of the consortium, said. “We’ve been waiting for an affirmation of the histories told by generations of people from Tularosa who witnessed the Trinity bomb and talked about how the ash fell from the sky for days afterward.”

The study also documents significant deposition in Nevada, Utah, Wyoming, Colorado, Arizona and Idaho, as well as dozens of federally-recognized tribal lands, potentially strengthening the case for people seeking expanded compensation in those areas.

Although Wellerstein approaches such reanalyses of historical fallout with a certain amount of uncertainty, partly because of the age of the data, he said such studies are valuable for keeping nuclear history and its legacy in the public discourse.

“The extent to which America nuked itself is not completely appreciated still, to this day, by most Americans, especially younger Americans,” he said.

c.2023 The New York Times Company

SPACE NEWS

Supernova Alert: We Will Soon See an Exploding Star in the Night Sky

 


SpaceVerse
Jul 21, 2023  #interstellar #astrophysics #universe

Get ready to witness a breathtaking celestial event as we unveil an extraordinary revelation in the night sky! Brace yourself for the imminent explosion of a massive star, a phenomenon known as a supernova. In this captivating video, we bring you the ultimate cosmic experience as we unravel the secrets and beauty behind this awe-inspiring celestial occurrence.

Prepare to marvel at the sheer magnitude of the imminent supernova in the night sky. Our knowledgeable astronomy experts have researched and scrutinized the celestial patterns, allowing them to predict this mesmerizing display of cosmic fireworks. As the universe constantly evolves, we are fortunate to be part of this rare moment that will forever etch itself into the tapestry of our memories.

Join us as we venture into the depths of outer space, where grandeur meets mystery. This awe-inspiring video not only promises a visual feast for stargazers and astronomers alike but also delves into the fascinating science and symbolism behind supernova eruptions. Gain a deeper understanding of the cataclysmic forces at play, as we explore the birth, life, and death of these colossal stars.

Immerse yourself in the ethereal beauty of the celestial canopy as it unveils the grand spectacle of a supernova explosion. Witness how the dying star, under immense gravitational pressure, culminates in a glorious burst of light and energy, illuminating vast stretches of the universe. Prepare to be captivated as we unveil the intricate dance of particles and the cosmic aftermath that follows such an extraordinary event.

Stay tuned for this groundbreaking revelation that promises to leave stargazers awestruck. Don't miss out on this rare opportunity to witness a supernova in all its magnificence. Subscribe to our channel and hit the notification bell to receive updates as we navigate the celestial wonders and deliver unparalleled cosmic adventures straight to your screen.




In new space race, scientists propose geoarchaeology can aid in preserving space heritage


Peer-Reviewed Publication

UNIVERSITY OF KANSAS




LAWRENCE, KANSAS — As a new space race heats up, two researchers from the Kansas Geological Survey at the University of Kansas and their colleagues have proposed a new scientific subfield: planetary geoarchaeology, the study of how cultural and natural processes on Earth’s moon, on Mars and across the solar system may be altering, preserving or destroying the material record of space exploration.

“Until recently, we might consider the material left behind during the space race of the mid-20th century as relatively safe,” said Justin Holcomb, postdoctoral researcher at the Kansas Geological Survey, based at the University of Kansas, and lead author on a new paper introducing the concept of planetary geoarchaeology in the journal Geoarchaeology. “However, the material record that currently exists on the moon is rapidly becoming at risk of being destroyed if proper attention isn’t paid during the new space era.”

Since the advent of space exploration, humans have launched more than 6,700 satellites and spacecraft from countries around the globe, according to the Union of Concerned Scientists. The United States alone accounts for more than 4,500 civil, commercial, governmental and military satellites.

“We’re trying to draw attention to the preservation, study and documentation of space heritage because I do think there’s a risk to this heritage on the moon,” Holcomb said. “The United States is trying to get boots on the moon again, and China is as well. We’ve already had at least four countries accidentally crash into the moon recently. There are a lot of accidental crashes and not a lot of protections right now.”

Holcomb began considering the idea of planetary geoarchaeology during the COVID-19 lockdown. Applying geoarchaeological tools and methods to the movement of people into space and the solar system is a natural extension of the study of human migration on Earth, the focus of the ODYSSEY Archaeological Research Program housed at KGS and directed by Holcomb’s co-author, Rolfe Mandel, KGS senior scientist and University Distinguished Professor in the Department of Anthropology.

“Human migration out of Africa may have occurred as early as 150,000 years ago, and space travel represents the latest stage of that journey,” Mandel said. “Although the ODYSSEY program is focused on documenting the earliest evidence of people in the Americas, the next frontier for similar research will be in space.”

How planetary geoarchaeologists will determine whether an item is worth preserving is an open question.

“We feel that all material currently existing on extraterrestrial surfaces is space heritage and worthy of protection,” Holcomb said. “However, some sites, such as the very first footprints on the moon at Tranquility Base or the first lander on Mars, Viking 1, represent the material footprint of a long history of migration.”

Beyond those “firsts,” sifting through the hundreds of thousands of bits of material currently in orbit or strewn across the surfaces of the moon and Mars — what many call “trash” but Holcomb and his colleagues regard as heritage — will require case-by-case decision making.

“We have to make those decisions all the time with archaeological sites today,” Holcomb said. “The moon has such a limited record now that it’s totally possible to protect all of it. Certainly, we need to protect space heritage related to the Apollo missions, but other countries, too, deserve to have their records protected.”

With resources for protecting space heritage limited, Holcomb and his colleagues advocate for developing systems to track materials left in space.

“We should begin tracking our material record as it continues to expand, both to preserve the earliest record but also to keep a check on our impact on extraterrestrial environments,” he said. “It’s our job as anthropologists and archaeologists to bring issues of heritage to the forefront.”

Beyond the moon, Holcomb wants to see planetary geoarchaeology extend to issues related to exploration and migration to Mars. He points to NASA’s Spirit rover as an example. The rover became stuck in Martian sand in 2009 and now risks being completely covered by encroaching sand dunes.

“As planetary geoarchaeologists, we can predict when the rover will be buried, talk about what will happen when it’s buried and make sure it’s well documented before it’s lost,” he said. “Planetary scientists are rightfully interested in successful missions, but they seldom think about the material left behind. That’s the way we can work with them.”

Holcomb believes geoarchaeologists should be included in future NASA missions to ensure the protection and safety of space heritage. Meanwhile, geoarchaeologists on Earth can lay the foundation for that work, including advocating for laws to protect and preserve space heritage, studying the effects extraterrestrial ecosystems have on items space missions leave behind and conducting international discussions regarding space heritage preservation and protection issues.

As for being part of a space mission himself?

“I’ll leave that to other geoarchaeologists,” Holcomb said. “There’s plenty to do down here, but I do hope to see an archaeologist in space before it’s all over.”

‘It almost doubled our workload’: AI is supposed to make jobs easier. These workers disagree

By Catherine Thorbecke, CNN
Updated Sat July 22, 2023

Maria Korneeva/Moment RF/Getty Images
CNN —

A new crop of artificial intelligence tools carries the promise of streamlining tasks, improving efficiency and boosting productivity in the workplace. But that hasn’t been Neil Clarke’s experience so far.

Clarke, an editor and publisher, said he recently had to temporarily shutter the online submission form for his science fiction and fantasy magazine, Clarkesworld, after his team was inundated with a deluge of “consistently bad” AI-generated submissions.

“They’re some of the worst stories we’ve seen, actually,” Clarke said of the hundreds of pieces of AI-produced content he and his team of humans now must manually parse through. “But it’s more of the problem of volume, not quality. The quantity is burying us.”

“It almost doubled our workload,” he added, describing the latest AI tools as “a thorn in our side for the last few months.” Clarke said that he anticipates his team is going to have to close submissions again. “It’s going to reach a point where we can’t handle it.”

Since ChatGPT launched late last year, many of the tech world’s most prominent figures have waxed poetic about how AI has the potential to boost productivity, help us all work less and create new and better jobs in the future. “In the next few years, the main impact of AI on work will be to help people do their jobs more efficiently,” Microsoft co-founder Bill Gates said in a blog post recently.

But as is often the case with tech, the long-term impact isn’t always clear or the same across industries and markets. Moreover, the road to a techno-utopia is often bumpy and plagued with unintended consequences, whether it’s lawyers fined for submitting fake court citations from ChatGPT or a small publication buried under an avalanche of computer-generated submissions.

Big Tech companies are now rushing to jump on the AI bandwagon, pledging significant investments into new AI-powered tools that promise to streamline work. These tools can help people quickly draft emails, make presentations and summarize large datasets or texts.

In a recent study, researchers at the Massachusetts Institute of Technology found that access to ChatGPT increased productivity for workers who were assigned tasks like writing cover letters, “delicate” emails and cost-benefit analyses. “I think what our study shows is that this kind of technology has important applications in white collar work. It’s a useful technology. But it’s still too early to tell if it will be good or bad, or how exactly it’s going to cause society to adjust,” Shakked Noy, a PhD student in MIT’s Department of Economics, who co-authored the paper, said in a statement.


Neil Clarke, editor of Clarkesworld Magazine. Credit: Lisa R. Clarke

Mathias Cormann, the secretary-general of the Organization for Economic Co-operation and Development, recently said the intergovernmental organization has found that AI can improve some aspects of job quality, but there are tradeoffs.

“Workers do report, though, that the intensity of their work has increased after the adoption of AI in their workplaces,” Cormann said in public remarks, pointing to the findings of a report released by the organization. The report also found that for non-AI specialists and non-managers, the use of AI had only a “minimal impact on wages so far” – meaning that for the average employee, the work is scaling up, but the pay isn’t.

Some workers feel like ‘guinea pigs’

Ivana Saula, the research director for the International Association of Machinists and Aerospace Workers, said that workers in her union have said they feel like “guinea pigs” as employers rush to roll out AI-powered tools on the job.

And it hasn’t always gone smoothly, Saula said. The implementation of these new tech tools has often led to more “residual tasks that a human still needs to do.” This can include picking up additional logistics tasks that a machine simply can’t do, Saula said, adding more time and pressure to a daily work flow.

The union represents a broad range of workers, including in air transportation, health care, public service, manufacturing and the nuclear industry, Saula said.

“It’s never just clean cut, where the machine can entirely replace the human,” Saula told CNN. “It can replace certain aspects of what a worker does, but there’s some tasks that are outstanding that get placed on whoever remains.”

Workers are also “saying that my workload is heavier” after the implementation of new AI tools, Saula said, and “the intensity at which I work is much faster because now it’s being set by the machine.” She added that the feedback they are getting from workers shows how important it is to “actually involve workers in the process of implementation.”

“Because there’s knowledge on the ground, on the frontlines, that employers need to be aware of,” she said. “And oftentimes, I think there’s disconnects between frontline workers and what happens on shop floors, and upper management, and not to mention CEOs.”

Perhaps nowhere are the pros and cons of AI for businesses as apparent as in the media industry. These tools offer the promise of accelerating if not automating copywriting, advertising and certain editorial work, but there have already been some notable blunders.

News outlet CNET had to issue “substantial” corrections earlier this year after experimenting with using an AI tool to write stories. And what was supposed to be a simple AI-written story on Star Wars published by Gizmodo earlier this month similarly required a correction and resulted in employee turmoil. But both outlets have signaled they will still move forward with using the technology to assist in newsrooms.

Others like Clarke, the publisher, have tried to combat the fallout from the rise of AI by relying on more AI. Clarke said he and his team turned to AI-powered detectors of AI-generated work to deal with the deluge of submissions but found these tools weren’t helpful because of how unreliably they flag “false positives and false negatives,” especially for writers whose second language is English.
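Base rates help explain why even a seemingly decent detector can bury an editor in false flags. The numbers below are assumptions chosen for illustration, not measurements of any real detector:

```python
def flag_precision(prevalence, tpr, fpr):
    """Of the submissions a detector flags as AI-written, what fraction
    actually are? Bayes' rule applied to assumed detector rates."""
    flagged_ai = prevalence * tpr          # true positives
    flagged_human = (1.0 - prevalence) * fpr  # false positives
    return flagged_ai / (flagged_ai + flagged_human)

# Assumed numbers: 10% of submissions are AI-written; the detector
# catches 90% of those but also flags 15% of human-written work.
p = flag_precision(prevalence=0.10, tpr=0.90, fpr=0.15)
print(f"{p:.0%} of flagged submissions are actually AI-written")
```

Under those assumptions, most flagged pieces are human-written, so every flag still needs a manual read, which is why the tools saved Clarke's team little time.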

“You listen to these AI experts, they go on about how these things are going to do amazing breakthroughs in different fields,” Clarke said. “But those aren’t the fields they’re currently working in.”

 

iEarth: An interdisciplinary framework in the era of big data and AI for sustainable development


Peer-Reviewed Publication

SCIENCE CHINA PRESS

Image: Conceptualized framework of intelligent Earth (iEarth). Credit: ©Science China Press



The United Nations Sustainable Development Goals (SDGs) hold the key to humanity's future existence and growth. In a bid to optimize the implementation of these SDGs, Professor Peng Gong's team from the University of Hong Kong and Professor Huadong Guo's team from the Chinese Academy of Sciences have collaboratively introduced an innovative "iEarth" framework. This interdisciplinary framework is powered by Big Earth Data science and seeks to amalgamate various interdisciplinary methodologies and expertise. It aims to quantify the processes of Earth systems and human civilization, uncover the intricate interplay between natural ecosystems and human society, foster cross-disciplinary ideologies and solutions, and furnish explicit evidence and valuable scientific knowledge for sustainable development.

The inception of the iEarth concept springs from intelligent Mapping (iMap), and its further development is influenced by a spectrum of disciplinary and interdisciplinary studies. The team distinguishes four primary themes within the iEarth framework: iEarth data, iEarth science, iEarth analytics, and iEarth decision.

iEarth data comprises all data related to Earth systems, encapsulating natural systems and human societies. iEarth science delves into a multidisciplinary exploration of the natural system, human society, and their mutual interaction and feedback, focusing on the diverse traits of objects when interconnected. iEarth analytics presents a methodology inclusive of detection, prediction, assessment, and optimization for achieving SDGs by leveraging the "iEarth+" model, which is dedicated to transcending disciplinary boundaries and actively connecting Earth observations with other disciplines. iEarth decision supports the implementation of SDGs by monitoring progress, pinpointing drivers, simulating pathways, and performing cost-benefit evaluations. The holistic iEarth framework thus consolidates multi-source data, interdisciplinary knowledge, and advanced technology to establish a comprehensive data-science-analytics-decision support system for fostering sustainable environmental, social, and economic prosperity.

The 'intelligence' in the iEarth framework is characterized by its potential for active learning and knowledge synthesis through Big Earth Data models powered by Artificial Intelligence (AI). Consequently, the iEarth framework can also be seen as an AI model grounded in Big Earth Data. According to the team, the successful implementation of the iEarth framework necessitates significant investment in both hard and soft infrastructure.

With an aim to reinforce the vision and boost the capability of iEarth for sustainable development, the team has outlined key research directions, practical implications, and educational curricula. The ultimate objective is to shape and build an interdisciplinary and synergistic framework for research, practice, and education that helps in preserving our living planet.

See the article:

iEarth: an interdisciplinary framework in the era of big data and AI for sustainable development

https://doi.org/10.1093/nsr/nwad178

What if AI models like GPT-4 don't automatically improve over time?

Alistair Barr
Wed, July 19, 2023 


GPT-4 users have complained that the OpenAI model is getting 'dumber.'

AI researchers studied the model to find out if this was true.

Their findings, published on Tuesday, challenge the assumption that AI models automatically improve.

One of the bedrock assumptions of the current artificial intelligence boom is that AI models "learn" and improve over time. What if that doesn't actually happen?

This is what users of OpenAI's GPT-4, the world's most-powerful AI model, have been experiencing lately. They have gone on Twitter and OpenAI's developer forum to complain about a host of performance issues.

After I reported on this, OpenAI responded that it hasn't "made GPT-4 dumber."

AI researchers decided to settle this debate once and for all by conducting a study. The results were published on Tuesday, and I can't wait any longer to tell you the conclusion: I was right.

"We find that the performance and behavior of both GPT-3.5 and GPT-4 vary significantly across these two releases and that their performance on some tasks have gotten substantially worse over time," the authors of the study wrote.
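The study's core method is simple to sketch: run the same fixed task set against two snapshots of a model and compare accuracy. Below is a minimal, hypothetical illustration of that idea; the two "models" are stand-in Python callables, not the actual GPT-4 API, and the prime-identification task is a toy version of one task type the researchers used.

```python
# Sketch of the study's methodology: score two snapshots of a model on the
# same fixed benchmark and measure the accuracy change between them.
# The "models" here are hypothetical stand-in callables, not real LLM calls.

def accuracy(model, tasks):
    """Fraction of (prompt, expected) pairs the model answers correctly."""
    correct = sum(1 for prompt, expected in tasks if model(prompt) == expected)
    return correct / len(tasks)

def drift(model_march, model_june, tasks):
    """Accuracy change between two snapshots on the same benchmark."""
    return accuracy(model_june, tasks) - accuracy(model_march, tasks)

# Toy benchmark: is each number prime?
PRIMES = {2, 3, 5, 7, 11, 13}
tasks = [(str(n), n in PRIMES) for n in range(2, 15)]

march = lambda prompt: int(prompt) in PRIMES  # answers every task correctly
june = lambda prompt: False                   # degraded: always says "not prime"

assert accuracy(march, tasks) == 1.0
assert drift(march, june, tasks) < 0  # the June snapshot scores worse
```

A negative drift on a fixed benchmark is exactly the kind of result the quote above describes: same tasks, later snapshot, lower score.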

These are serious AI researchers. The most prominent is Matei Zaharia, CTO of Databricks, one of the top AI data companies, most recently valued at $38 billion.

You can read the rest of their findings here. What I'm most interested in is the new questions that these findings raise. Here's the most fascinating one.

"It is also an interesting question whether an LLM service like GPT4 is consistently getting 'better' over time," Zaharia and his research colleagues wrote in their paper.

Another common term for AI is machine learning. The magic of this technology is that it can ingest new data and use that to get better over time, without human software engineers manually updating code. Again, this is the core idea that is driving today's AI frenzy and accompanying stock market surges.

If GPT-4 is getting worse, not better, this premise begins to feel shaky.

The Microsoft factor

Microsoft has invested heavily in OpenAI, the creator of GPT-4. Microsoft is also baking this technology into its software, and charging users a lot for the new capabilities.

On Tuesday, the same day Zaharia & Co. published their paper, Microsoft unveiled pricing for Microsoft 365 Copilot, new AI-powered versions of popular cloud software such as Office 365. The features cost an extra $30 per user a month, on top of what users are already paying.

Microsoft's market value jumped more than $150 billion after this announcement, showing that Wall Street is betting on AI, and the impact the technology will have on the company's products.

This recent GPT-4 research paper provides a healthy dose of skepticism to the assumptions that are driving these wild swings in value.

AI researcher Gary Marcus read Zaharia's study and highlighted how unstable LLMs are: so unstable that relying on them for high-end business products might not be a good idea.

"Who in their right mind would rely on a system that could be 97.6% correct on a task in March and 2.4% correct on same task in June?," he tweeted, citing one of the findings in the research paper. "Important results. Anyone planning to rely on LLMs, take note."

"Prediction: this instability will be LLMs' undoing," Marcus added. "They will never be as commercially successful as the VC community is imagining, and some architectural innovation that allows for greater stability will largely displace LLMs within the next 10 years."

Spokespeople from OpenAI and Microsoft didn't respond to a request for comment on Wednesday.