
“SixthSense is all about making technology more human”


Q. How has the landscape evolved since you unveiled the device in 2009?
A. Since 2009, the industry has taken an interest in two technologies: augmented reality and gesture interaction. Microsoft is working on gesture-based gaming with Kinect, and many other gesture-based input devices are being introduced. Some of the big corporations working on these two technologies are MIT Media Lab sponsors, and we are helping them in this work.

Other advancements are happening at the hardware level. You must have noticed that when I made the SixthSense presentation in 2009, the hardware used was the size of a helmet. Now I am using a device that can fit into a matchbox. This rarely happens. While computing devices keep getting smaller, in my fifteen years in this industry I have never seen any device shrink by a factor of two to three, as the SixthSense device has in the last two years.

Q. Do you see this trend being replicated in other devices like mobile phones?
A. Mobile phones have now reached the limit of how small they can go. This is not a technical problem; the reason is that we are bound by the screen sizes of these devices. The components inside a mobile phone can be made smaller and smaller, and technically a phone could be made the size of a coin, but there would be a usability limitation. The size of a mobile phone can only be reduced to the size of the interaction you wish to give the user via the device's screen.


The advantage of using projection technology is that the projector can be reduced to the size of a button and any surface can be used as the output medium. You can project on a table, a wall, a newspaper or your palm, so it eliminates the limitation of the screen. That's an interesting phenomenon which will impact the way we interact with the digital world.

Q. You have made the SixthSense technology Open Source. What inspired you to share the technology with the community?
A. A technology that gets locked down by corporate policies and the confines of intellectual property can soon be forgotten.

I come from India, from a part of the world where, until a few years ago, technological advances were always associated with the western world, with advances aimed at making life in the western world better and better. But life in the western world is already good, and we need to break this model.

It is the other two-thirds of the world that needs these technological advances, so that the lives of people in these countries become better. While I could have made more money by selling the technology to a big company, I will get more blessings by sharing it out in the open for the benefit of the masses.

Q. Will the SixthSense technology replace mobile phones and laptops some day?
A. SixthSense is not an alternative to these devices; it only adds an option to the existing computing world. We are going to become human again by making computing more human. That is, access to the digital world will no longer remain confined to the rectangular screens of devices. People will be able to interact with and access the digital world while continuing to be in the real world, and not necessarily via conventional devices.

Q. Which of your other research projects are special to you?
A. Ghost in the Machine, a project I worked on when I was at IIT-Mumbai, is special to me. It is likely to have long-term implications: it explored how machines can be made more creative. Big organisations like IBM and DARPA are working on future technologies like artificial intelligence with the aim of making machines more intelligent. They are working on making machines interesting collaborators for humans, capable of better serving humanity and the earth.

In 2008, I explored the future of augmented reality and used it in my projects as an input medium to access information from the digital world. Now, as part of my current project, TeleTouch, I am exploring the inverse of augmented reality. So far it has remained an input medium for us, where information is accessed via devices. In this project I am trying to use augmented reality as an output medium to touch and control things that are far away in the real world. I am exploring how I can touch something far off, for instance, my door which is 20 metres away from me.

The technology will enable users to point their smartphone's camera at their surroundings and control everything they see on the screen by touching it. Users can interact with their appliances from afar and perform tasks like opening a door or switching a light on or off, just by touching the objects on the phone's screen.
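To make the idea concrete, here is a minimal, purely illustrative sketch of how a tap on a phone's camera view could be mapped to a recognised appliance and turned into a network command. This is not Mistry's actual TeleTouch implementation; the device names, the `DetectedDevice` structure, the `find_tapped_device` helper and the JSON-over-HTTP `control_url` endpoint are all assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple
import json
import urllib.request


@dataclass
class DetectedDevice:
    """An appliance recognised in the camera frame, with its on-screen bounding box."""
    name: str                              # e.g. "front_door" (hypothetical label)
    control_url: str                       # hypothetical network endpoint of the appliance
    box: Tuple[int, int, int, int]         # (x_min, y_min, x_max, y_max) in screen pixels


def find_tapped_device(tap_x: int, tap_y: int,
                       devices: List[DetectedDevice]) -> Optional[DetectedDevice]:
    """Return the device whose bounding box contains the tap, if any."""
    for device in devices:
        x_min, y_min, x_max, y_max = device.box
        if x_min <= tap_x <= x_max and y_min <= tap_y <= y_max:
            return device
    return None


def send_command(device: DetectedDevice, action: str) -> None:
    """Send an action to the appliance, assuming a simple JSON-over-HTTP control API."""
    payload = json.dumps({"action": action}).encode("utf-8")
    request = urllib.request.Request(device.control_url, data=payload,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        print(f"{device.name}: HTTP {response.status}")


if __name__ == "__main__":
    # In a real system these detections would come from object recognition running
    # on the live camera feed; here they are hard-coded for illustration.
    devices = [
        DetectedDevice("front_door", "http://192.168.1.20/command", (100, 300, 400, 900)),
        DetectedDevice("ceiling_light", "http://192.168.1.21/command", (500, 50, 700, 200)),
    ]
    tapped = find_tapped_device(tap_x=600, tap_y=120, devices=devices)
    if tapped is not None:
        send_command(tapped, action="toggle")
```

In a real deployment the bounding boxes would be updated continuously from the camera preview and the command channel would be whatever protocol the appliances actually speak; the sketch is only meant to show the core idea that the phone screen acts as a touchable window onto distant objects.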

Q. Last but not least, tell us about your association with EFY.
A. I grew up reading EFY in India. It was such a pleasure and surprise when you approached me to be featured in it. I am sure my dad will feel great; he used to teach me and make all sorts of stuff from EFY guides.

