Thursday, July 06, 2023

Code Language Limitations: The Achilles’ Heel Of Autonomous Vehicles

  • Autonomous vehicles, with their programming limited to their creators' understanding and coding abilities, fail to account for unpredictable elements of human driving and environmental factors, causing disruptions on the road.

  • Because code is a limited and simplified form of language, it cannot fully capture the complex subtleties of the human driving experience, including intuition.

  • These vehicles pose a significant threat to other drivers and pedestrians, as there's no way for humans to communicate their intentions effectively to a machine.

Self-driving vehicles are stopping in traffic for no apparent reason and blocking emergency vehicles, reports the Los Angeles Times. The writer alludes to a famous high-tech shibboleth: "Move fast and break things." But in this case the things that are being broken are the health and lives of California residents who are having to endure the growing presence of so-called autonomous vehicles on the state's streets and highways.

I have repeatedly warned that autonomous vehicles could be operated truly safely only on closed courses, where the possible moves of all other vehicles would be known in advance and therefore predictable. Humans and the environments in which they drive will never be that predictable.

Beneath the bravado of the self-driving booster club is a completely obvious truth: Autonomous vehicles can only do what they are programmed to do, and that programming is limited to what their creators can put into words or, more precisely, that subset of language we call code.
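
To see the structural problem in miniature, consider the following toy sketch. It is purely illustrative, written for this argument, and bears no relation to any manufacturer's actual software; the scenario names and actions are hypothetical. The point is that a policy built from an enumerated list of situations can only attach actions to the situations its authors thought to write down, and the catch-all for everything else is typically to halt, which is precisely the behavior the Los Angeles Times describes.

    # Toy illustration only: a rule-based driving policy limited to the
    # scenarios its authors enumerated. All names here are hypothetical.

    KNOWN_SCENARIOS = {
        "clear_road": "proceed",
        "red_light": "stop_at_line",
        "pedestrian_in_crosswalk": "yield",
        "double_parked_car": "change_lane",
    }

    def decide(scene: str) -> str:
        """Choose an action. The code can only choose among the words its
        creators gave it; the fallback for anything unforeseen is to halt."""
        if scene in KNOWN_SCENARIOS:
            return KNOWN_SCENARIOS[scene]
        return "stop_in_place"  # and block whatever is behind, fire trucks included

    # A firefighter waving cars around a hose is not in the table, so the
    # vehicle's only available answer is to freeze.
    print(decide("firefighter_waving_cars_around_hose"))  # -> stop_in_place

Real systems are vastly more elaborate than this, but the elaboration only enlarges the list of written-down situations; it does not remove the need for one.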

As I keep repeating, language and the computer code derived from it are inherently limited in their ability to describe the reality that we all experience. Even a task as seemingly simple as driving a vehicle is fraught with subtleties and surprises. Our total experience as humans does not narrow substantially when we get into a car or truck. We are still receiving signals from the total environment in which we find ourselves, and that includes intuitions, hunches and a stream of thoughts projected onto our inner awareness.

Language of any kind cannot hope to capture that total experience. If it could, it would be experience itself rather than a description of it. Description always, always involves reducing complex perceptions into a sample of our experience.

At relatively low speeds on city streets, mistakes made by autonomous cars may not be that destructive. Of course, there is no guarantee, because low-speed accidents can still result in death and injury. But since California seems determined to put autonomous trucks on its highways, we can look forward to some spectacular accidents when higher mass teams up with higher speeds and the nonhuman algorithms driving these trucks. These algorithms may fail to take into account crucial subtleties and changes in conditions and then misfire.

It's not just other drivers who are at risk. As I've previously asked, "[H]ow can you make your intentions known to a robot? How could a pedestrian communicate with a robot car in the way that approaches the simplicity of a nod or a wave to acknowledge the courteous offer from a driver to let the pedestrian cross the street?"
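
The same reduction shows up on the machine's side of that exchange. As another purely illustrative sketch, with hypothetical labels, scores and threshold, a pedestrian-intent module has to squeeze a nod or a wave into one of a few predefined categories with a confidence number attached; a gesture that does not clear the confidence bar simply comes back as unknown, and the conversation a human driver would finish with a glance never happens.

    # Toy illustration only: a pedestrian-intent classifier with made-up
    # labels, scores and threshold, used here to make a point about language.

    def interpret_gesture(scores: dict, threshold: float = 0.8) -> str:
        """Return the highest-scoring intent label if the model is confident
        enough; otherwise report 'unknown'."""
        label, score = max(scores.items(), key=lambda item: item[1])
        return label if score >= threshold else "unknown"

    # A polite wave meaning "no, you go ahead" looks much like a wave meaning
    # "thanks, I'm crossing." Neither reading clears the bar.
    ambiguous_wave = {"will_cross": 0.55, "will_wait": 0.45}
    print(interpret_gesture(ambiguous_wave))  # -> "unknown"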

No matter how hard we try to accurately describe the situations a driver might encounter, "words are a surprisingly imprecise way to convey meaning, freighted as they are with nuance, cultural context, history and so many other interlocking dependencies for their meaning." And, of course, much of our experience is beyond words. How many times have you said the following about a strange, new experience: "I have no words to describe it. You just have to experience it."

Our tech barons seem to lack the literary education needed to understand these limits and so press ahead with ill-advised schemes to fill the roadways with autonomous vehicles.

Let me conclude with the final words of a previous piece that are even more relevant today than when I first wrote them:

If we reduce all of our efforts at addressing our problems to language a machine can understand, we will get machine solutions. What we need, however, are solutions that come from our deep connections to this planet as beings of this planet, connections that no machine will ever fathom.

That is the bigger issue.

By Kurt Cobb via Resource Insights
