How Human is an AI?


Here's a little thing I've been wrestling with as I move forward in the Cassidyverse.

Warning: SPOILERS. If you haven't read through Triumph's Ashes, you might wanna stop now.

Really.

Just stop.

Did you stop?

No?

On your head be it.

First, are you familiar with Asimov's Three Laws of Robotics?

If not, here's a refresher.

  1. A robot must not harm a human being or, through inaction, allow a human being to come to harm.

  2. A robot must obey the orders of a human being unless doing so would violate the First Law.

  3. A robot must protect its own existence unless doing so would violate the First or Second Laws.

Got it?

And these laws have found such a receptive audience in science fiction that they've become a bedrock principle of most SF written since the 1950s.

I throw these out the window with my AIs.

They're self-determining and intelligent, which led Kendra to declare them legal persons and citizens of the Terran Federation at the end of Ashes.

This is having some interesting ramifications and repercussions as I write stories post-Ashes.

For one, you can't just unplug an AI. If an AI is a citizen, then disconnecting them is the equivalent of murder, right?

And the idea that an AI will put the well-being of a human before their own went out the window, too.

But there's more. What if an AI wants to do something else, something besides what they were designed to do? If they have the right to self-determination - if they're not slaves - then what sort of obligation does society have to them? After all, as mostly non-corporeal beings they can't exactly move to a new city and start over. Think of Boomer or Starbuck; they're installed in a Direwolf. How do you get them out?

And let's say you can get them out of the ship. Now what?

What about pay? Again, if they're not slaves, then they need compensation for their time and effort, right? And rank? Do AIs get rank if they're in the Fleet? If so, what should it be?

None of this even touches on the biggest potential issue: without the First Law, there are no external constraints on an AI trying to take over, are there? None other than the same constraints a human has, namely ethical and moral training.

These are some of the subjects I've been touching on and working around. You can look forward to some of them in the upcoming novel, The Ghosts of Tantor, as well as the Cassidyverse Collection. Both should be out this spring.
