I came across this article over the weekend mainly because the headline grabbed my attention. It's hard not to click on an article claiming that a robot has passed a self-awareness test.
I will let you read the article to understand how the test was administered. The interesting part is the video embedded in it. First off, I was amazed by the robot itself. I had never seen a robot start from a seated position and look more or less human as it gets to its feet. The engineering that went into building the robot alone was impressive to me. Beyond that, the point of the video is an experiment that claims to show the robot is self-aware. The experiment is pretty cool: the robot was able to discern that it was different from the others, which is, to a very limited extent, a mild form of self-awareness.
When I first read about this I was skeptical, but at the same time I thought it was really cool. What I mean is that I do not think this qualifies the robot as "self-aware." Still, it is impressive that it could figure out which robots had been given the dumbing pill simply through its own ability to speak. I do not know where that stands on the journey toward a truly self-aware artificial intelligence, but I have to think we are getting closer to figuring it out rather than farther away.
Still, I think this idea needs to be taken with a massive grain of salt. I honestly do not know whether we could ever create a thinking machine in the truest sense. For every expert who says it can be done, there is another who says it can't. Then there is an equally large number who argue that even if we could do it, we probably shouldn't, because such machines would surely destroy us. I am not sold on any of these scenarios. I think it could be possible, but the real question is whether we will ever figure it out. And if we do, and that is a huge "if," I am very skeptical that this AI will want to destroy us. But that is a post that has already been written, with an additional follow-up here.