Saturday, October 11, 2025

AI steers education innovation across US states


By Dr. Tim Sandle
SCIENCE EDITOR
DIGITAL JOURNAL
October 9, 2025


AI tools could change the traditional rules of the classroom. — © AFP

The rise of artificial intelligence and other innovations in education is providing more equitable access to learning and preparing students for more advanced futures. Against this backdrop, the non-profit organization SmileHub has released new reports on innovation and the use of AI in U.S. education, titled States Leading the Way in Educational Innovation and Best Charities for Education in 2025.

As the preamble indicates: “Through hybrid or virtual learning over platforms like Zoom, schools are able to hold classes without disruption when people are sick and provide education access to more students. Additionally, rapid growth in artificial intelligence (AI) use globally has schools racing to prepare students for a bolder, AI-driven future.”

To highlight the states innovating the most in their education systems and the ones that have more work to do, SmileHub compared each of the 50 states based on 14 key metrics. The data set ranges from test-optional universities per capita to the number of education charities per capita to the adoption of K-12 computer science policies.

This revealed the most innovative states to be:

1. California

2. Massachusetts

3. New York

4. Pennsylvania

5. Illinois

6. Florida

7. Texas

8. Ohio

9. Washington

10. Indiana

In contrast, the least innovative states were revealed to be:

41. Delaware

42. New Mexico

43. Montana

44. Hawaii

45. Mississippi

46. West Virginia

47. Nevada

48. Alaska

49. North Dakota

50. South Dakota

Within the main rankings there are some interesting variances. California, for example, has the most creative workspaces per capita – 18.4 times more than Mississippi, which has the fewest creative workspaces.

Indiana has the most Blue Ribbon schools per capita – 10.9 times more than Nevada, which has the fewest. Returning to California, the state has the most education charities per capita – 8.4 times more than New Hampshire, which has the fewest.
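The mechanics behind a ranking like this are straightforward to reproduce in outline. Below is a minimal sketch, in Python, of how per-capita metrics can be normalized and combined into a composite state score; the state names, metric values, and equal weighting are illustrative assumptions, not SmileHub's actual data or methodology.

```python
# Minimal sketch of a per-capita, multi-metric composite ranking.
# State names, metric values, and equal weighting are illustrative only;
# they are not SmileHub's actual data or methodology.

def per_capita(count, population, per=100_000):
    """Rate per `per` residents, so large and small states are comparable."""
    return count / population * per

def min_max_normalize(values):
    """Scale a list of numbers to the 0-100 range."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [50.0 for _ in values]
    return [(v - lo) / (hi - lo) * 100 for v in values]

# Hypothetical raw inputs: state -> (population, education charities, creative workspaces)
raw = {
    "State A": (39_000_000, 5_200, 1_900),
    "State B": (2_900_000, 310, 95),
    "State C": (1_400_000, 120, 40),
}

# Convert counts to per-capita rates, then normalize each metric across states.
states = list(raw)
metrics = []
for idx in (1, 2):  # columns: charities, creative workspaces
    rates = [per_capita(raw[s][idx], raw[s][0]) for s in states]
    metrics.append(min_max_normalize(rates))

# Equal-weight composite score and ranking.
scores = {s: sum(m[i] for m in metrics) / len(metrics) for i, s in enumerate(states)}
for rank, (state, score) in enumerate(sorted(scores.items(), key=lambda kv: -kv[1]), 1):
    print(f"{rank}. {state}: {score:.1f}")
```

The ratios quoted above (for example, 18.4 times more creative workspaces per capita in California than in Mississippi) come from the same per-capita step, before any normalization or weighting is applied.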

Shaping the future of learning and work: University of Waterloo and Google collaborate



By Dr. Tim Sandle
SCIENCE EDITOR
DIGITAL JOURNAL
October 2, 2025


Google image. — © AFP SEBASTIEN BOZON

Education can be a powerful equalizer, opening doors and creating new opportunities in people’s lives. One area that is creating considerable interest from the current generation of university students is artificial intelligence. There are many directions to take AI research; one such area is to understand its impact on learning and work.

With these twin areas in mind, Google and the University of Waterloo have jointly announced a new research collaboration that examines how artificial intelligence will shape education and career preparedness.

The partnership includes a $1 million contribution from Google. The bulk of the funding will go towards a new Google Chair in the Future of Work and Learning. Additional funding will be directed towards hands-on learning labs to enable participants to co-create AI-powered tools and prepare students for the evolving workplace.

Google Chair in the Future of Work and Learning

The first Chair is set to be Edith Law, a professor of Computer Science. Law has made pioneering contributions to fostering human-AI collaboration in the pursuit of enhanced creativity.

Law will be working closely with students and researchers to co-create AI-facilitated learning technologies and to answer some of the fundamental questions facing educational institutions today:

How can we best prepare students for jobs that don’t exist yet?
How do we evolve the learning experience to meet learners where they are?
How do we make sure learners are ready for the workforce in a rapidly evolving world?

Learning by doing: The futures lab workshop

The laboratory support will fund the Futures Lab, a unique, hands-on learning lab where interdisciplinary student teams will come together, multiple times per year, with University of Waterloo faculty and Google mentors to build new, AI-powered learning prototypes using tools such as Gemini and AI Studio.
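For a sense of what such a prototype might look like in practice, here is a minimal sketch that calls the Gemini API through the google-generativeai Python SDK to generate a short practice quiz. The model name, prompt, environment variable, and quiz format are illustrative assumptions, not details from the Waterloo/Google announcement.

```python
# Minimal sketch of an AI-powered learning prototype using the Gemini API.
# Requires the google-generativeai package and an API key from AI Studio.
# The model name, prompt, and quiz format are illustrative assumptions.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash")  # hypothetical model choice

def generate_practice_quiz(topic: str, num_questions: int = 3) -> str:
    """Ask the model for a short practice quiz with an answer key."""
    prompt = (
        f"Write {num_questions} short practice questions on '{topic}' "
        "for a first-year university student, followed by an answer key."
    )
    response = model.generate_content(prompt)
    return response.text

if __name__ == "__main__":
    print(generate_practice_quiz("introductory linear algebra"))
```

A Futures Lab team would presumably wrap something like this in a richer interface and curriculum logic; the point of the sketch is only that the underlying Gemini call is a few lines of code.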

Google view

Speaking about the partnership, Mira Lane, VP of Society and Technology at Google, explains the motivation for the initiative: “In an era of rapid technological advancement, ensuring education can continue to fulfil that promise for everyone is critical.”

In terms of the support Google will be providing, Lane continues: “This collaboration brings together our expertise in AI with University of Waterloo’s visionary educational approach. To kick things off, we will be providing a $1 million CAD contribution to establish a new Google Chair in the Future of Work and Learning to explore new paradigms of learning and teaching. This partnership marks our shared commitment to redefine education and empower the next generation to thrive in an AI-driven world.”

Other projects

Google has an established history of working with the University of Waterloo, including Kids on Campus, a programme that brings Grade 4 classes to the university for a day of STEM activities. Google has also provided funding for the university’s Women in Computer Science (WiCS) programme and, most recently, collaborated with the university’s Professor Jimmy Lin and the Waterloo Data and Artificial Intelligence Institute to host a K-12 AI Day for Educators.

Op-Ed: AI vs education — ‘Outsourcing’ education to AI can’t work at all, but there is real hope


By Paul Wallis
EDITOR AT LARGE
DIGITAL JOURNAL
September 30, 2025


A view of Harvard University campus in Cambridge, Massachusetts in April 2025 - Copyright AFP Joseph Prezioso

The understandable howls of outrage about AI vs education have been pretty much continuous. The shameless misrepresentation of generative AI as a learning tool has truly taken root in education.

The problem is that AI “creates knowledge” on demand. This outcome is in direct conflict with the nature of learning.

A lot of fundamental learning is about students learning how to learn, backed up by learning to understand the information and its applications. Outsourcing that work to an AI leaves little depth on the student side.

That’s hardly good enough. It’s exactly what educators worldwide dread, delivering little value as actual education. Nobody’s learning much beyond how to use the AI, in theory. How much actual knowledge and skill is being learned? Education has quite enough problems without this vacuous outcome.

Somebody called Assistant Professor Kimberley Hardcastle from Northumbria University in Newcastle was kind enough to produce this very useful article on the subject of AI in education.

You need to read this article to get the core issues clear.

Her area of special interest is epistemology, the study of knowledge. This approach nails the big picture issues with AI all too well. It’s scary.

This very real problem is NOT about “the kids are doing it all with AI”. It’s far more complex, about the nature and quality of educational knowledge, and it’s pretty grim.

The generation and pseudo-creation of knowledge on demand for students is now in the hands of first-generation AI. It’s a technological toddler itself. Big Tech is being its usual immature, infantile self in terms of the quality of info, with some exceptions.

Hardcastle patiently points out the contrast between “original” and “assisted” thinking. This is critical, and it’s essential to the learning process on any level.

If you’re a teacher, how the hell are you supposed to know:

Whether the student is doing the necessary thinking and actual research or just letting the AI do it

Whether they understand the information about the subject, or your questions at all

Whether the student gets the point that critical thinking is required

Try this little workout for yourself:

Pick any subject at random.

Search the subject and check the AI response.

At what point do you become totally dependent on the AI information?

Almost immediately, perhaps?

Can you question what the AI has created? No. That’s the problem.

Now put yourself in the position of any schoolkid or undergraduate.

Are you “educated” yet?

You’re not, and you can’t be.

Unless you put in a lot of effort to fix the knowledge gaps, you lack anything resembling a personal knowledge base.

How “productive” does this sound so far? Does anyone detect the subtle suggestion that any amount of dud, useless information can be generated?

What a surprise.

The good news, eventually.

I want to point out that this situation is far from insoluble. It can solve itself.

Maybe it’s the use of words like epistemology that appeals to my hyper-antisocial polysyllabic soul. Maybe a good, worthwhile subject feels good to write about.

Here’s the good news about AI and education for the future.

AI is a learning tool. Learn how to use it.

Kids will have to use AI anyway.

It makes sense to create accredited academic AI for education purposes.

Accreditation is a major plus for products and puts some skin in the game for AI companies.

LLMs can be tweaked to any degree for academic purposes.

You can work with these LLMs and AI to create high standards of testing and assignments with built-in oversight on standards of learning and curricula.

You can test these LLMs using existing assessment methods (a minimal sketch follows this list).

You can make academic LLMs and AI subject to contract terms.

You can stop worrying about the quality of information and the massive, inexcusable holes in knowledge bases when you use AI properly.
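As a concrete illustration of the point about testing these LLMs with existing methods, here is a minimal sketch of an evaluation harness that marks a model’s answers against a fixed answer key, the way a teacher might mark a quiz. The questions, the exact-match scoring rule, and the ask_model stub are hypothetical placeholders, not any particular product’s API.

```python
# Minimal sketch of testing an academic LLM against a fixed answer key.
# The questions, expected answers, and ask_model stub are hypothetical;
# in practice ask_model would call whatever accredited model is under test.

def ask_model(question: str) -> str:
    """Placeholder for a call to the LLM being evaluated."""
    raise NotImplementedError("Wire this up to the model under test.")

# A tiny benchmark: question -> acceptable answer (substring match for simplicity).
ANSWER_KEY = {
    "What is the derivative of x**2?": "2x",
    "In which year did World War II end?": "1945",
    "What is the chemical symbol for sodium?": "Na",
}

def score(answer_key: dict[str, str]) -> float:
    """Return the fraction of questions the model answers correctly."""
    correct = 0
    for question, expected in answer_key.items():
        reply = ask_model(question).strip().lower()
        if expected.lower() in reply:
            correct += 1
    return correct / len(answer_key)

# Example usage (after implementing ask_model):
#   print(f"Accuracy: {score(ANSWER_KEY):.0%}")
```

In a real accreditation setting, the substring check would give way to rubric-based or human-in-the-loop marking, but the principle is the same: fixed curricula, fixed answer keys, repeatable scores.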

___________________________________________________________

Disclaimer
The opinions expressed in this Op-Ed are those of the author. They do not purport to reflect the opinions or views of the Digital Journal or its members.
