
Alexa, tell me a joke

We focus on what Artificial Intelligence can do and forget to ask what it is.

“Alexa, tell me a story.” Today we can ask our “voice assistants” almost any question and hear a reply. Alexa can tell jokes and bedtime stories and even play games. Artificial intelligence (AI) applied to areas like this appears almost magical.

As a computer scientist I delight in the possibilities of AI but I don’t see it as magic. It’s math encoded in a computer program. Typically, AI programs begin with a “training” phase, which associates known inputs with outputs through weighted connections tuned to minimize error. AI uses statistical logic to classify inputs but has no actual comprehension of the meaning of those inputs. In reality, AI systems are no more capable of comprehension than a spreadsheet. However, the remarkable strides in AI have begun to blur the lines between humans and machines.
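
To make that concrete, here is a minimal sketch of such a training loop (an illustrative toy of my own, not any particular product’s code): a single artificial neuron whose weights are nudged toward known answers. The data and the pass/fail task are hypothetical.

```python
# Hypothetical example: "training" a single artificial neuron to classify
# points. The program is only arithmetic: weights are nudged to reduce
# error on known examples. Nothing here comprehends anything.

# Known inputs (hours studied, hours slept) paired with known outputs (pass=1, fail=0)
examples = [((2.0, 9.0), 1), ((1.0, 5.0), 0), ((3.0, 6.0), 1), ((0.5, 4.0), 0)]

w1, w2, bias = 0.0, 0.0, 0.0   # the "weighted connections," initially untuned
rate = 0.1                     # how far to nudge the weights on each mistake

for _ in range(100):           # the "training" phase: sweep over the examples
    for (x1, x2), target in examples:
        prediction = 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0
        error = target - prediction     # statistical error, not understanding
        w1 += rate * error * x1         # tune weights to minimize error
        w2 += rate * error * x2
        bias += rate * error

print(w1, w2, bias)  # the entire "knowledge" of the system: three numbers
```

Everything the program has “learned” is captured in those three numbers; the rest is bookkeeping.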

We are often captivated by what things can do, rather than asking what things are. I think the question “what is AI?” is an essential one. This is what philosophers refer to as the ontological question – the question of being. Once we have established who we are and what machines are, we can start asking subsequent questions about ethics.

A common tendency is to attribute human characteristics to our machines – to anthropomorphize them – thereby elevating the status of machines and, in doing so, reducing the distinctiveness of human beings. This way of thinking is encouraged by the language we use. We say that AI programs “learn” and that computers “think.” The blurring of machines and people is further complicated by designing voice assistants like Alexa and Siri to sound just like human beings. Even the phrase “artificial intelligence” can exacerbate this confusion, since the word intelligence is typically associated with being human. Perhaps a better term for the field of AI is “data science,” a term that emphasizes the reliance on data and the science of extracting patterns from that data.

Machines doing math

The danger with seeing machines as people is that we will be inclined to use them in places that ought to be reserved for humans. The early AI pioneer Joseph Weizenbaum created a simple pattern-matching program called Eliza that emulated a Rogerian psychotherapist, and some suggested it could be used for automated psychotherapy. Weizenbaum was troubled by these suggestions and asserted that therapy requires empathy, something a machine can never have. He concluded that “there are limits to what computers ought to be put to do.”
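
To see how little machinery Eliza needed, here is a toy sketch in its spirit (the patterns below are hypothetical, not Weizenbaum’s originals): the program reflects a speaker’s own words back using regular-expression pattern matching, with no grasp of what any of it means.

```python
import re

# Toy sketch of Eliza-style pattern matching (illustrative only; these
# rules are made up for this example). Each rule pairs a pattern with a
# canned reply template that reuses the speaker's own words.
rules = [
    (r"I feel (.*)", "Why do you feel {0}?"),
    (r"I am (.*)", "How long have you been {0}?"),
    (r".*\bmother\b.*", "Tell me more about your family."),
]

def reply(statement: str) -> str:
    for pattern, template in rules:
        match = re.match(pattern, statement, re.IGNORECASE)
        if match:
            return template.format(*match.groups())
    return "Please, go on."  # default non-committal prompt

print(reply("I feel anxious about machines"))
# -> Why do you feel anxious about machines?
```

The illusion of a listener emerges entirely from string manipulation, which is precisely why Weizenbaum found the therapeutic suggestions so troubling.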

If we understand AI as a machine doing math, we should conclude that there are many things that AI “ought not to be put to do.” AI can perform statistical classifications, but it has no comprehension, wisdom or empathy – things that are not reducible to mathematics or algorithms (although there are many who would argue otherwise).

In fact, it is the power of statistical logic that enables AI to be fruitfully applied in many areas, such as classifying images, performing web searches, or sorting parts on an assembly line. However, this statistical logic is not appropriate when applied to areas like human companionship, therapy or decisions requiring wisdom.

When I queried Siri, a voice assistant on my computer, “What are you?” it replied, “I’m not a person . . . I’m software here to help.” While AI is entirely unsuitable for answering philosophical questions, this automated answer made me smile. An understanding of what AI really is will be essential as we discern how to responsibly use this emerging technology.

  • Derek C. Schuurman is a Canadian currently living in Grand Rapids, Michigan where he is professor of computer science at Calvin University. Prior to arriving at Calvin he taught for many years at Redeemer University College and was a visiting professor at Dordt University. He currently holds the William Spoelhof Teacher-Scholar-in-Residence chair at Calvin. Besides his technical interests he is interested in faith and technology issues. He is the author of Shaping a Digital World: Faith, Culture and Computer Technology (IVP, 2013).
