Affectiva, a company that grew out of MIT’s Media Lab, develops Affdex, another emotion measurement technology that competes with FACET Vision. The Affdex SDK lets any connected device, even a smartphone, read emotions and expressions. In robots, the data for such systems comes from 3D cameras mounted on the machines and from sensor arrays on their bodies. One much-hyped application of this technology is robotic pets that help you through times of mental difficulty. In short, a robot can be your friend in need.
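The Affdex SDK itself is proprietary, but the basic shape of such a pipeline, grab a camera frame, find a face, classify its expression, can be sketched in a few lines of Python. Everything below that is not standard OpenCV is a placeholder: the `classify_emotion` stub merely stands in for the kind of trained models an SDK like Affdex ships.

```python
import cv2

# Hypothetical stand-in for a trained emotion model; a real classifier
# would score facial action units rather than return a constant label.
def classify_emotion(face_pixels):
    return "neutral"  # placeholder

# OpenCV's bundled Haar cascade handles the face-finding step.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default camera; a phone camera works the same way
for _ in range(100):       # sample a short burst of frames for the sketch
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        print(classify_emotion(gray[y:y + h, x:x + w]))
cap.release()
```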
Step three: replace their animals
Robotics is not always about humanoid form factors. Several recent robot concepts and designs take their inspiration from fish, birds and other animals. Jayakrishnan of Asimov Robotics finds these nature-inspired models very intriguing. Such robotic pets are also known as companion robots; they might be an emotional companion to some, or a utility companion to others.
PARO is one such high-tech interactive robot, developed by AIST, Japan’s National Institute of Advanced Industrial Science and Technology. It brings the benefits of animal therapy to people in environments where live animals would pose treatment or logistical difficulties. Think of it as a Tamagotchi (the hand-held digital pet created in Japan) with a body. Robotic pets use interactive and adaptive computing to turn the input from their sensors into a simulation of some biological characteristics of the animals they are pretending to be.
For example, PARO has five kinds of sensors: tactile, light, auditory, temperature and posture. It processes the data from these sensors to perceive people and its environment. With the light sensor, it can recognise light and its intensity; its tactile sensors let it feel being stroked or struck, and its posture sensor tells it when it is being held. With its auditory sensor, it can also recognise the direction of a voice and words such as its name, greetings and praise.
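AIST has not published PARO’s control code, so the following sketch only illustrates the idea of mapping sensor readings to behaviours; the sensor names, thresholds and responses are invented for the example.

```python
# Illustrative sensor-to-behaviour mapping; the readings, thresholds and
# responses here are made up for the sketch, not taken from PARO's firmware.
def choose_behaviour(sensors):
    if sensors["tactile_force"] > 0.8:          # a hard hit
        return "cry and turn away"
    if 0.0 < sensors["tactile_force"] <= 0.3:   # gentle stroking
        return "blink and nuzzle"
    if sensors["posture"] == "held":
        return "relax and close eyes"
    if sensors["heard_word"] in {"Paro", "hello", "good"}:
        return "turn toward the voice"
    if sensors["light"] < 0.1:                  # dark room
        return "sleep"
    return "idle"

print(choose_behaviour({"tactile_force": 0.2, "posture": "lap",
                        "heard_word": None, "light": 0.6}))
```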
Researchers at National Sun Yat-sen University have developed a framework that employs a neural network-based approach to construct behaviour primitives and arbitrators for robots. This architecture could allow them to build a robot that learns or evolves its behaviour based on how humans interact with it, or on the data it gets from its sensors. They believe it will be far easier to implement than traditional solutions.
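The paper’s exact network is not reproduced here, but the core idea, a small network that arbitrates between behaviour primitives based on sensor input, can be sketched as follows. The primitive names, sensor count and weights are placeholders; in the framework described, the weights would be learned from interaction rather than fixed.

```python
import math
import random

PRIMITIVES = ["approach", "avoid", "greet", "rest"]
N_SENSORS = 5

# Random placeholder weights; in the framework described, these would be
# learned from human interaction rather than fixed at start-up.
random.seed(0)
weights = [[random.uniform(-1, 1) for _ in range(N_SENSORS)]
           for _ in PRIMITIVES]

def arbitrate(sensor_vector):
    """Score each behaviour primitive and return the winner."""
    scores = [sum(w * s for w, s in zip(row, sensor_vector))
              for row in weights]
    # Softmax squashing keeps the scores comparable across primitives.
    exps = [math.exp(v) for v in scores]
    probs = [e / sum(exps) for e in exps]
    return PRIMITIVES[probs.index(max(probs))]

print(arbitrate([0.9, 0.1, 0.0, 0.4, 0.2]))
```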
Mriganka Biswas, a PhD student, says in an article on ScienceDaily, “Cognitive biases make humans what they are, fashioning characteristics and personality, complete with errors and imperfections. Therefore introducing cognitive biases in a robot’s characteristics makes the robot imperfect by nature, but also more human-like. Based on human interactions and relationships, we will introduce characteristics and personalities to the robot. If we can explain how human-to-human long-term relationships begin and develop, it would be easier to plan the human-robot relationship.”
Step four: take over their vehicles
OEMs have taken autonomous cars seriously, and Land Rover has taken its first step towards self-driving, cloud-connected, augmented reality-equipped vehicles. Discovery Vision is Land Rover’s all-new SUV concept, expected to hit the roads in a few years. BMW and Audi are coming out with their own versions, although none of them has reached the level of the Transformers yet.
NVIDIA’s Tegra K1 mobile processor is currently the processor of choice for Audi, as the German automaker dips its toes into autonomous vehicles. The K1 is powered by a 192-core graphics processing unit (GPU), which sits at the heart of Audi’s entire automotive infotainment system. This gives an idea of the level of computing an automotive robot needs to function properly.
Of course, you need some very sophisticated software to ensure that your car doesn’t end up driving off a bridge with you inside. Bosch is developing an Automatic Park Assist technology, due out next year, that allows a car to be parked remotely using just a smartphone app; another of its technologies, Traffic Jam Assistant, steps in when the vehicle is moving at low speeds. Google, meanwhile, is already trying to make fully automated driving a reality. Bosch’s autonomous technology gathers data using an array of sensors, including radar and video cameras, as well as a roof-mounted laser scanner (LIDAR) that generates a detailed 3D map of the environment.
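Bosch has not published how its stack builds that map, but a common textbook approach to the step described, turning laser-scanner returns into a picture of the surroundings, is an occupancy grid. The grid dimensions and the sample returns below are made up for illustration.

```python
import math

GRID = 20   # 20 x 20 cells
CELL = 0.5  # each cell covers 0.5 m; both values chosen for the sketch

def occupancy_grid(scan):
    """Mark grid cells hit by (angle_rad, range_m) LIDAR returns.

    The vehicle sits at the grid centre; returns beyond the grid are dropped.
    """
    grid = [[0] * GRID for _ in range(GRID)]
    for angle, dist in scan:
        x = int(GRID / 2 + dist * math.cos(angle) / CELL)
        y = int(GRID / 2 + dist * math.sin(angle) / CELL)
        if 0 <= x < GRID and 0 <= y < GRID:
            grid[y][x] = 1  # cell is occupied
    return grid

# Three fabricated returns: an obstacle ahead and two off to the sides.
for row in occupancy_grid([(0.0, 3.0), (0.5, 4.2), (-0.7, 2.5)]):
    print("".join("#" if c else "." for c in row))
```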