Smart Things

Image: a robot offers its hand to a woman, a conceptual illustration of artificial intelligence.

Throughout its history, science fiction has explored the possible effects sentient robots, or smart things, would have on the human race. Possible scenarios range from the apocalyptic (Terminator, The Matrix) to the friendly or funny (Futurama) to the tense (Alien). Whether human-like intelligent robots turn out to be a force for good or for evil in society, everyone would agree that they will have some kind of effect. Or, as Lloyd Bitzer would say, the creation of a sentient robot would have exigence and demand discourse.

A fictional example of such discourse happens in Ridley Scott's 2012 sci-fi movie Prometheus. There is a scene where one of the ship's lead archaeologists has a conversation with the ship's android, David. They are on a distant planet trying to learn about a race of aliens (called the Engineers) that they think created humans.

Charlie Holloway: What we hoped to achieve was to meet our makers. To get answers. Why they even made us in the first place.

David: Why do you think your people made me?

Charlie Holloway: We made you because we could.

David: Can you imagine how disappointing it would be for you to hear the same thing from your creator?

Charlie Holloway: I guess it’s good you can’t be disappointed.

I like this scene partly because I'm a big fan of the Alien movies, but also because it deals with the relationship between humans and androids with free will. David has no emotions and therefore can't be disappointed, but he still recognizes what would disappoint a human; he is not the same as a human, but he can fake it convincingly enough, which I love. All humans are different in some respect, but we all have the capacity to understand each other to a degree. At this point in the Alien universe, androids can look completely human and have meaningful, natural interactions, possibly even surpassing the emotional intelligence of some humans. Carla Diana touches on a different but related reason for not feeling emotionally connected to her Karotz in her Atlantic article on the device:

The third (and most challenging) failing is Karotz’s inability to learn my preferences and grow more sophisticated over time.

To have an emotional connection to a smart thing, it needs to be able to surprise us. The Alien universe is a great example of such a future: the android David can learn new things over time on his own, as shown in the scene of him practicing basketball and learning to speak Proto-Indo-European.
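Carla Diana's complaint is really an engineering one: the device keeps no memory of you. As a rough illustration only (not how Karotz or any real product actually works), here is a tiny Python sketch of a gadget that keeps a score for each of its behaviors, nudges those scores based on how its owner reacts, and occasionally tries something unexpected:

```python
import random
from collections import defaultdict

class LearningGadget:
    """Toy sketch of a smart thing that adapts to its owner over time.

    It tracks a score for each behavior based on the owner's reactions
    and gradually favors the behaviors they respond well to.
    """

    def __init__(self, behaviors):
        self.behaviors = list(behaviors)
        self.scores = defaultdict(float)  # behavior -> learned preference score

    def choose_behavior(self, explore_rate=0.2):
        # Mostly pick the best-scoring behavior, but occasionally try
        # something else -- the small element of surprise.
        if random.random() < explore_rate:
            return random.choice(self.behaviors)
        return max(self.behaviors, key=lambda b: self.scores[b])

    def record_reaction(self, behavior, liked):
        # Nudge the score up or down depending on how the owner reacted.
        self.scores[behavior] += 1.0 if liked else -1.0


# Example: over a few weeks of interactions, the gadget drifts toward
# the behaviors its owner actually enjoys.
gadget = LearningGadget(["play music", "read headlines", "tell a joke"])
for _ in range(20):
    action = gadget.choose_behavior()
    gadget.record_reaction(action, liked=(action != "tell a joke"))

print(sorted(gadget.scores.items(), key=lambda kv: -kv[1]))
```

Even this toy version has the two ingredients the Karotz lacked: a memory of past interactions and a small margin for surprise.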

While we are not too close to having true basketball-playing, ancient-language-learning artificial intelligence, people have made great strides.

Check out Rose, the 2015 winner of the Loebner Prize, a yearly competition to determine the most human-like chatbot:

http://ec2-54-215-197-164.us-west-1.compute.amazonaws.com/speech.php

While Rose is fun to talk to at first, some of her reactions are pretty clunky, and although they are socially appropriate, they are not too interesting after a while. She doesn't grow like David the android, even though we want her to.

Conversely, the reason people fear the possibility of robots gaining free will is that we are naturally afraid of being surprised, of the unknown, especially when it comes to other human beings. This is why we do things like build walls to keep immigrants out of countries.

However, it is probably healthier to focus on the creator/creation relationship between humans and sentient robots. At a certain point, some of our objects will get so smart that instead of viewing them as things, we will view them as something autonomous that we as humans gave life to.

Maybe instead of humanity's enemy, smart robots will be humanity's children. And like most parents, we will want our children to be better than us. And maybe they will be. Let's just hope artificial intelligence won't be like the spiders that eat their mothers when they're born.
