Science in the Bluegrass — Artificial Intelligence?

Chris Graney

Artificial Intelligence (or AI) is in all the news today. What can it do? What will it mean? Can an AI be sentient or have a soul or offer insights about religion?

An AI is an algorithm. You follow algorithms when you do your taxes: “Enter amount from line 15a; if amount is greater than line 14d, subtract … .” You can do your taxes by hand, on paper, and work through the algorithm yourself; or you can use tax software and let a computer “app” work through the algorithm. Either way, your taxes are calculated.
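That worksheet logic can be written out as a few lines of code. This is only an illustrative sketch, with made-up line numbers and a made-up rule, not real IRS instructions:

```python
# A worksheet step as an algorithm (hypothetical line numbers and rule,
# not actual tax law): compare two amounts, then act on the result.
def worksheet_step(line_15a: float, line_14d: float) -> float:
    """If line 15a exceeds line 14d, subtract; otherwise the result is zero."""
    if line_15a > line_14d:
        return line_15a - line_14d  # the "subtract ..." instruction
    return 0.0

print(worksheet_step(5000.0, 3500.0))  # 1500.0
```

Whether a person traces these steps on paper or a computer runs them as software, the same answer comes out.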

Modern computing technology is based on binary logic — 0 and 1; open switches (0) and closed switches (1); logic gates arranged so that if this switch opens, that switch closes, and so on. If you use an “app” to do your taxes, then you have many tiny electronic switches opening and closing under certain programmed conditions.
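Those switch rules are what engineers call logic gates. A toy sketch of the three basic ones, using 0 for an open switch and 1 for a closed one:

```python
# Logic gates as rules about switches: 0 = open, 1 = closed.
def AND(a: int, b: int) -> int:
    return a & b  # output closes only if both input switches are closed

def OR(a: int, b: int) -> int:
    return a | b  # output closes if either input switch is closed

def NOT(a: int) -> int:
    return 1 - a  # if this switch is open, that one closes

print(AND(1, 1), OR(0, 1), NOT(0))  # 1 1 1
```

Every program, tax software included, ultimately bottoms out in vast numbers of gates like these.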

We can imagine a purely mechanical system running an algorithm, with cogs and levers going this way and that serving as our 0s and 1s (the first calculating machines were mechanical). We might even imagine a biomechanical system — a vast horde of rats, all trained to make binary decisions: Do this if that happens; do that if this happens.

When IBM’s “Deep Blue” beat world chess champion Garry Kasparov in 1997, much was made about it — “robot overlords” and all that. But what beat Kasparov was an algorithm written by human beings. As my son put it, what Deep Blue showed was that chess can be reduced to following a recipe (an algorithm). This means you yourself could have beaten Kasparov. You just would have needed the right IRS-style worksheet telling you what to do (it would be huge), step by countless step. Likewise, a mechanical system could have beaten him. So could the rat horde.

A Google engineer named Blake Lemoine made news recently by claiming that Google’s “LaMDA” AI is sentient. Lemoine, whom Google fired, has said that LaMDA is self-reflective and claims to have a soul. LaMDA probably sucks up vast volumes of text off the internet and sifts it statistically — then, when you input text that contains the word “soul,” it responds with the sort of text that, statistically speaking, mimics what we humans have created around that topic.
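The idea of “sifting text statistically” can be shown with a toy model. LaMDA itself is a large neural network, so this word-counting sketch is only an analogy, with a tiny made-up corpus:

```python
from collections import Counter

# A toy analogy for statistical text sifting (LaMDA is a large neural
# network; this word-frequency model only illustrates the idea).
corpus = (
    "the soul is immortal . the soul is mysterious . "
    "the soul is immortal . a soul has depth ."
).split()

def most_likely_next(word: str) -> str:
    """Return the word that most often follows `word` in the corpus."""
    followers = Counter(
        corpus[i + 1] for i in range(len(corpus) - 1) if corpus[i] == word
    )
    return followers.most_common(1)[0][0]

print(most_likely_next("soul"))  # "is" — the most frequent follower
```

Feed it enough human writing about souls and it will echo human talk about souls, without understanding a word of it.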

As Google spokesperson Brian Gabriel said in dealing with Lemoine’s claims, “Today’s conversational models … imitate the types of exchanges found in millions of sentences,” so they can riff on any topic, souls or selves included. An AI can therefore sound a lot like a human person. But it is an algorithm, which means that you, the mechanical system, or the rat horde could all be LaMDA (or the famous ChatGPT).

We dress our algorithms in devices that are heavily marketed and made to be so appealing that we often prefer, for example, to hold them and interact with them while driving instead of putting them down and watching the road, risking life and limb for ourselves and others.

That appeal distorts our perception regarding all things AI. If algorithms ran on mechanical devices the size of huge warehouses, and we could walk through them and see the levers moving (or the rats squeaking), we’d think differently about them.

Perhaps instead of talking about AIs and what they can do and whether they have souls, we should talk about what we will do and our souls. When our technology is powered by AIs that sound like people with souls, AIs honed to generate profit for their makers, how will we ever be able to put any of it down?

Chris Graney is an astronomer with the Vatican’s astronomical observatory who lives in Louisville.
