
“Limitations are paving the way for ultrasound-based sensors”


Anup Tapadia, founder, TouchMagix

Q. What are the fundamentals of gesture recognition?
A. From the physical point of view, there is an input to the system and the resulting output from the system. The input can be a person being tracked completely, a person's hand, a person's fingers, or a person's emotions being tracked to capture what the user is trying to do. The output can be in the form of a display, sound or an interface like actuators. Nowadays, there are dispensers or smell creators which add another dimension to a human gesture interface.

Q. What are the major components that help enable gesture control and motion sensing?
A. The first type of sensor is inside the device and allows the motion of the device itself to be used as a gesture. These are primarily accelerometers and related inertial sensors, in 9-axis or 12-axis combinations, etc. The second type is optical or non-optical gesture-tracking systems, like the front camera of a mobile phone or something like what Microsoft uses in the Kinect.
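
As a rough illustration of the first category, the Python sketch below flags a 'shake' gesture from a window of 3-axis accelerometer samples. The threshold, window length and sample values are illustrative assumptions, not taken from any particular device.

import math

def detect_shake(samples, threshold_g=2.5, min_peaks=3):
    """Flag a 'shake' gesture when the acceleration magnitude exceeds
    a threshold several times within the sample window.

    samples: list of (ax, ay, az) tuples in units of g, as read from
    the device's accelerometer driver (illustrative input format).
    """
    peaks = 0
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold_g:
            peaks += 1
    return peaks >= min_peaks

# A quiet window followed by three strong spikes registers as a shake.
window = [(0.0, 0.0, 1.0)] * 10 + [(2.8, 0.1, 1.0), (0.0, 3.1, 1.0), (2.6, 0.2, 1.0)]
print(detect_shake(window))  # True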


Q. What are the other techniques for gesture tracking?
A. There are ultrasound wave sensors and radio-frequency space-gesture sensors. There is also experimental work that uses changes in the Wi-Fi signal to recognise which gesture a user is performing.
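
To give a sense of the Wi-Fi idea, here is a deliberately simplified Python sketch: a gesturing person perturbs the received signal, so its variance rises above that of a static room. Actual research systems work on fine-grained channel state information and Doppler shifts rather than plain signal strength; the threshold and readings below are made up for illustration.

import statistics

def motion_from_rssi(rssi_window, variance_threshold=4.0):
    """Crude motion flag: a gesture between transmitter and receiver
    makes the received signal strength fluctuate more than usual.

    rssi_window: recent received-signal-strength readings in dBm.
    """
    return statistics.pvariance(rssi_window) > variance_threshold

still = [-52, -52, -53, -52, -52, -53]
waving = [-52, -58, -49, -61, -50, -57]
print(motion_from_rssi(still))   # False
print(motion_from_rssi(waving))  # True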

Q. What is referential gesture interaction?
A. Consider using a reference on the screen in order to interact with the screen. For example, if I move my hand, a cursor moves on the screen, and when I move my body, a virtual person mimics my gesture. That is a referential gesture tracking system.
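
A referential setup essentially maps the tracked body part onto an on-screen proxy. The Python sketch below maps a hand position detected in a camera frame to a cursor position on a display; the frame and screen sizes, and the horizontal mirroring, are illustrative choices rather than anything specific to TouchMagix.

def hand_to_cursor(hand_x, hand_y, frame_w, frame_h, screen_w, screen_h):
    """Map a tracked hand position (pixels in the camera frame) to a
    cursor position on the screen. The x axis is mirrored so that moving
    the hand to the right moves the cursor to the right."""
    norm_x = 1.0 - (hand_x / frame_w)
    norm_y = hand_y / frame_h
    return int(norm_x * screen_w), int(norm_y * screen_h)

# A hand at (160, 120) in a 640x480 frame drives a 1920x1080 display.
print(hand_to_cursor(160, 120, 640, 480, 1920, 1080))  # (1440, 270)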

Q. What is immersive gesture interaction?
A. In this type of gesture recognition technology there is no cursor or reference. Imagine a projection on the floor with a football in it. You just have to go and kick that football. There is no reference on the screen; instead, your foot itself acts as the reference to the football in the projection. So the system responds to your gesture even when it is performed right next to the screen.
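
In other words, immersive interaction reduces to comparing the tracked body part and the virtual object in the same coordinate frame. A minimal Python sketch, assuming the tracker and the projection have already been calibrated to common floor coordinates in metres (the positions and ball radius are illustrative):

import math

def foot_hits_ball(foot_xy, ball_xy, ball_radius_m=0.15):
    """Immersive interaction has no cursor: the tracked foot position is
    compared directly against the ball drawn on the floor projection."""
    dx = foot_xy[0] - ball_xy[0]
    dy = foot_xy[1] - ball_xy[1]
    return math.hypot(dx, dy) <= ball_radius_m

ball = (1.0, 2.0)
print(foot_hits_ball((1.05, 2.10), ball))  # True  -> trigger the "kick"
print(foot_hits_ball((0.20, 0.40), ball))  # False -> ball stays put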

Q. What new sensors have been introduced in recent years?
A. Time-of-flight sensors and structured-light-based sensors are the two most popular types. 3D sensing has become possible only due to these. They use reflected light to compute depth information. There is also a new range of sensors coming up where people are using ultrasound, which can allow low-power gesture sensing for very simple gesture recognition applications.
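
For time-of-flight sensing, depth follows directly from how long the emitted light takes to return, or, for continuous-wave sensors, from the phase shift of the reflected modulation. A small Python sketch of both relations, with made-up example numbers:

import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def depth_from_round_trip(t_seconds):
    """Direct time-of-flight: the pulse travels to the object and back,
    so depth is half the round-trip distance."""
    return SPEED_OF_LIGHT * t_seconds / 2.0

def depth_from_phase(phase_rad, modulation_hz):
    """Continuous-wave time-of-flight: depth is recovered from the phase
    shift of the reflected modulated light (unambiguous only up to
    c / (2 * modulation frequency))."""
    return SPEED_OF_LIGHT * phase_rad / (4.0 * math.pi * modulation_hz)

print(depth_from_round_trip(10e-9))         # ~1.5 m for a 10 ns round trip
print(depth_from_phase(math.pi / 2, 20e6))  # ~1.87 m at 20 MHz modulation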

Q. What are the limitations to the use of different types of new sensors?
A. The structured-light-based 3D sensors cannot operate under strong infrared light conditions, because they rely on infrared illumination that needs to be visible to the capturing sensors. That is why these sensors cannot be used in an outdoor environment or on a mobile kind of platform.

Q. How can we overcome the limitations to make better sensors?
A. In the MotionMagix sensor we have a photodiode which does ambient light detection, correction and elimination. This helps us ensure proper control of the camera under different lighting conditions. Also, we have a coating on the surface of these sensors to prevent dust-related issues. TouchMagix sensors are engineered for industrial use and are not much affected by humidity or temperature.
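
The ambient-light correction can be pictured as a feedback loop: the photodiode reading is compared with a target level and the camera exposure is nudged accordingly. The Python sketch below shows one proportional step of such a loop; the gain, target and limits are illustrative assumptions, not TouchMagix's actual control law.

def adjust_exposure(current_exposure_ms, ambient_reading, target_reading,
                    gain=0.01, min_ms=0.1, max_ms=30.0):
    """One step of a simple proportional feedback loop: more ambient light
    than the target shortens the camera exposure, less lengthens it.
    The result is clamped to the camera's valid exposure range."""
    error = target_reading - ambient_reading
    new_exposure = current_exposure_ms + gain * error
    return max(min_ms, min(max_ms, new_exposure))

# Bright ambient light (photodiode reads 900 against a target of 500)
# pulls a 10 ms exposure down to 6 ms.
print(adjust_exposure(10.0, ambient_reading=900, target_reading=500))  # 6.0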

Q. What kind of challenges would an engineer face while designing such a component?
A. The key challenge is definitely to know how a sensor would work in different environments. The MotionMagix sensor cannot be used in a nightclub, as its adaptation time is not built for flashing lights. Adding a photodiode and an optical illumination correction, or feedback loop, gives us better control over the ambient illumination and lighting.

Q. Apart from gesture and motion sensing what other technologies do you use?
A. Other than the features enabled through sensors, nearly 70-80 per cent of the features, like projection scheduling, projection correction and remote management, are enabled through software.

Q. What resources can help young engineers learn about these new technologies and components?
A. There is an online global research community called NUI Group. Then there is the OpenNI framework (now discontinued), which people can use to learn the different algorithms used with 3D cameras for gesture tracking.

