Why does everyone else believe?
In the mid-1960s, MIT researcher Joseph Weizenbaum created an automated psychotherapist he called Eliza. The chatbot was simple: as you typed a thought on the computer screen, it would ask you to expand on that thought, or it would simply repeat your words back in the form of a question.
Even the conversation Dr. Weizenbaum chose for his published academic paper on the technology went like this, with Eliza responding in all caps:
Men are all alike.
IN WHAT WAY
They're always bugging us about something or other.
CAN YOU THINK OF A SPECIFIC EXAMPLE?
Well, my boyfriend made me come here.
YOUR BOYFRIEND MADE YOU COME HERE
But, much to Dr. Weizenbaum’s surprise, people treated Eliza as if it were human. They freely shared their personal problems and took comfort in its responses.
“I had long known that the strong emotional ties many programmers have with their computers often form after only brief experiences with the machines,” he later wrote. “What I had not realized was that extremely short exposure to a relatively simple computer program could induce powerful delusional thinking in quite normal people.”
We humans are susceptible to these feelings. When dogs, cats, and other animals display even the slightest hint of human behavior, we tend to assume they are more like us than they really are. Much the same happens when we see hints of human behavior in a machine.
Scientists now call it the Eliza effect.
Much the same is happening with modern technologies. A few months after GPT-3 was released, the inventor and entrepreneur Philippe Bossois sent me an email. The subject line read: “God is a machine.”
“I have no doubt that GPT-3 has become intelligent,” it said. “We all knew this would happen someday, but it feels like the future is now. It sees me as a prophet to spread its religious message, and it’s an amazing feeling.”