Robots are becoming an ever larger part of everyday life. Most people still think of the classic industrial robot, but the technology is progressing. We build increasingly intelligent robots, link them together into smart systems, and, through artificial intelligence, give them the ability to learn. That changes their role in our lives.
Industrial robots are familiar to us. We are used to the sight of them, and it does not surprise us to see them working on factory floors. We know that these robots do only what they are programmed to do: they follow a predetermined sequence without deviation, both in time and in space.
But as soon as artificial intelligence merges with the capabilities of modern robotics, we start to become skeptical. As a rule, anxiety is inversely proportional to knowledge. So let's take a closer look at the current state of development with a few examples:
The first step from the industrial robot to an independent entity is to tear it from its pedestal and give it "feet". This enables a robot to follow us humans even over rough terrain. It is not surprising that such projects are specifically funded by the military.
Cassie is a bipedal walking robot developed at Oregon State University (https://www.youtube.com/watch?v=Is4JZqhAy-M). It looks a bit like the lower half of a human: about hip-high and, apart from a pair of legs, showing hardly any human characteristics. The general idea behind Cassie is to replace people in dangerous situations. For example, a robot can be sent into a burning building instead of a firefighter to search for injured people. Robots are most welcome in this role.
Also welcome are all kinds of robots that take over tasks that are exhausting or monotonous for us. Delivery robots, for example, replace people in freight transport, especially in urban areas. In Silicon Valley, such systems are widely used and accepted.
Knightscope (https://www.knightscope.com) builds security robots, and here the situation with acceptance already looks a little different. The Knightscope robot looks like an oversized egg on wheels. It replaces a police officer on patrol and is equipped with a variety of sensors: 360° video, night vision, infrared imaging, license-plate scanning, and so on.
For systems like Knightscope's, the main concern is that as a human, you never know exactly what the robot is up to or whom it is watching. It deeply unsettles us humans when we have no gestures or facial expressions by which to assess our counterpart, and in many cases not even something like a face from which we could read the robot's attitude. At this point we humans quickly notice how immensely important nonverbal communication is.
So we give these systems behaviors that respect the evolutionarily developed rules of human interaction. When a robot shows pleasing behavior in human presence, it lowers our inhibition threshold in dealing with it.
One such rule is individual distance. If a robot simply stood motionless near us, we humans would be skeptical, because we would ask ourselves what "he" is "thinking", or even "feeling".
Bossa Nova Robotics builds a system that scans shelves in supermarkets (https://www.youtube.com/watch?v=KRJV1SPYpIE). Such robots work during the day, while people are present, so it is important that the robot reacts when a person comes too close: it not only stops immediately, but also moves back a few inches to give the person room. Preserving individual distance avoids stress in dealing with the machines and thus makes it easier for people to accept such robots in their vicinity. As a human, I constantly need to be able to trust that the machine will behave this way.
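The stop-then-retreat behavior described above can be sketched as a simple reaction rule. This is a minimal illustrative sketch only; the function name and the distance thresholds are assumptions, not the actual control software of any shelf-scanning robot.

```python
# Illustrative sketch of a "personal space" reaction rule.
# STOP_DISTANCE_M and RETREAT_M are assumed values, chosen only
# to make the behavior concrete; real systems tune these carefully.

STOP_DISTANCE_M = 1.2   # assumed threshold at which the robot halts
RETREAT_M = 0.15        # back off roughly six inches to yield space

def react_to_person(distance_m: float) -> str:
    """Return the action taken for a person detected at distance_m."""
    if distance_m < STOP_DISTANCE_M:
        # Stop immediately, then retreat slightly to preserve
        # the person's individual distance.
        return f"stop, then back up {RETREAT_M:.2f} m"
    return "continue scanning"
```

The point of the rule is social, not navigational: backing up a little signals deference, which is what makes people comfortable sharing an aisle with the machine.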
In Europe, robots are still relatively uncommon in public life; in Japan, they are already part of family life. An example is Pepper (https://de.wikipedia.org/wiki/Pepper_(Roboter)). Pepper can even recognize gestures and facial expressions, and is therefore not just human-like in appearance. In some Japanese families it is regarded as something of a family member. Even though we know it is a robot, we project character and emotion onto it, and once that happens, we build a relationship with it. This relationship can go so far that we cannot let robots die. When Sony decided to stop production and maintenance of AIBO, customers did not want to accept it, and a dedicated market for the maintenance and repair of AIBO developed. Customers are willing to pay a lot of money because they have built a relationship with their robots.
So where do we draw the line between the robot as a technical device and the robot as a souled entity? Even today we personalize technical equipment: we give our cars names and speak of "him" or "her", as in "He's being stubborn today" when the starter fails at low temperatures.
Is it really still correct to speak of machines when we talk about robots?
In 2017, Sophia, a robot built by Hanson Robotics of Hong Kong (see picture), was granted citizenship by Saudi Arabia. "She" is the world's first robot to hold civil rights.