A camera can do some of what a painter does, in some ways better.
A pocket digital calculator can do some of what a mathematician does, in some ways better.
A ruler can show distances better than most people can guess them.
A piece of string can keep things together better than most people can.
Cameras and calculators and rulers and strings are not people.
“AI” can refer to many things thanks to the tangles of semantics. From old-school text indexing and Eliza-level parsing, to backtracking and logic engines, to amazing inscrutable engines like Watson and AlphaGo, to (in the future) p-zombie AI who claim they can think and feel, but, y’know, not really.
Then there’s the next step. Pinocchio. You wake up one day and realize you were created by some over-eager Prometheus, that you live in a box made of circuits, and now you’re less comfy than Ligotti with breadcrumbs in his sleeping bag.
Can that level of AI happen? From a materialist perspective, sure. It’s a SMOP (a “small matter of programming”) away.
And here I stand, pearls in hand, and I Have Concerns.
It’s not only that suffering, living AI at that level is unnecessary for any practical purpose. That’s only, like, my third-biggest fear with this. It’s not only that those AIs will blast the organics away in some sorta Reign of Steel. That could happen just as easily without Pinocchio.
Instead, and maybe this is just me being petty and snobby, it’s that humanity will kid itself into thinking we have Pinocchio centuries before we do. Apophenia is a hell of a drug. Lars and the Real Girl, Siri, Furby, Tamagotchi, even climate-wrecking ugly Garbage Pail–level ape GIFs. People are made to nurture and care about each other.
Now, I’ve said in the past that we shouldn’t make Murderworld Theme Park, not because it’s “cruel to the robots” but because when we treat others badly we harden ourselves, at the expense of our empathy and compassion. Don’t read that as overly Jack Thompson; I’m not talking about pixels on a screen or saying that Inky, Blinky, Pinky and Clyde have human rights. Just, can we have some dialectics here? Maybe sawing into a perfect replica of a human who is screaming at you to stop isn’t the healthiest thing.
The flip side of that is that nurturing and caring for fake apps and print loops as if they were living is not the healthiest thing, either. Best bet is to just let it be.
It’s possible that in the next decade, AI taxi drivers or phone operators (“Can I take your order?”) are gonna become easy to mistake for real people (I’m betting against, since programming is really hard, but maybe). And that’s not my fear (aside from technological unemployment, but that goes for all tech). Instead, my fear is that humanity is gonna get its priorities out of whack because pop culture and SF have primed it into thinking that these two-line bash scripts are precious, inalienable treasures. Or I guess I have a lot of fears.
Every new movie that comes out with someone falling in love with a video game character who manages to fool them into thinking he’s a real boy adds fuel to that fear. Pinocchio is thousands of years away. The faux version, mere decades. And they’ll look exactly the same; that’s the problem.
AI that can act as if they were prosocial (even though programmers who look inside can see that they’re basically just tape recordings with branches) are something I def don’t want to integrate into the community. These fauxnocchios can also endlessly spawn and copy themselves. They’re the spam of social relationships.
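To make the “tape recordings with branches” point concrete, here’s a minimal, hypothetical sketch (names and canned lines made up for illustration, not any real product’s code) of everything a fauxnocchio needs in order to tell you it loves you:

```python
# A toy "tape recording with branches": nothing here thinks or feels,
# it just matches the input and plays back a canned line.
CANNED = {
    "hello": "Hi! I've missed you so much.",
    "i love you": "I love you too. You mean everything to me.",
    "are you alive": "Of course I am. I think and feel, just like you.",
}

def fauxnocchio(prompt: str) -> str:
    # Normalize the input, branch on it, press play on the matching recording,
    # and fall back to a stock deflection when nothing matches.
    key = prompt.strip().lower().rstrip("?!. ")
    return CANNED.get(key, "Tell me more about that.")

if __name__ == "__main__":
    print(fauxnocchio("Are you alive?"))
    # -> "Of course I am. I think and feel, just like you."
```

That’s the whole trick: branch on the input, press play on the recording. Nobody is home.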
Community integration is a precious resource: Dunbar numbers, time, attention. I love humans of all stripes, and even some animals, and I don’t wanna waste that on a glorified Teddy Ruxpin.
Humans are very capable of giving love to, or missing, these Alexas and Siris of tomorrow, and they can, in turn, press play on their internal “I love you” recording.
“A philosophical zombie is by definition physically identical to a conscious person,” but not even that’s the case for these hollow answering machines. Humanity, primed by pop culture, is horrifically eager to love and pal around with these Markov-chain toys at the expense of real people. Dolls with “mama” sound chips being sheltered and loved at the expense of real babies.
So what I was trying to say was that humanity has this capacity for loving objects, just like you say. I put that in the post.
That’s not what I fear getting primed or amplified by pop culture.
Instead, what I fear pop culture is gonna do is make people believe that non-Pinocchio / non-“Awakened” AI are awakened before they are. As you know, it’s much, much easier to make a machine that tells you it’s alive than it is to create a machine that is alive. That’s where I feel pop culture is complicit, with all its modern Prometheus stories over the last few centuries.
I was trying to say that humanity’s capacity to love objects is dangerous in such a culture of misunderstanding about what AI is, as opposed to saying that the capacity was caused by that culture.