Gesture recognition technology is here to stay and will make life smarter and easier. With the wearable device market booming over the past few years, gesture control of these devices is also becoming increasingly popular. But deploying a reliable and efficient gesture recognition system in any project can be costly and time consuming. What if you had a ready-made board that handled this function for you? All you would have to do is connect the device to your project and define the gestures and controls using a smartphone app, and voilà, your product can be controlled with your hands!
This is now possible, thanks to a crowdfunding campaign I came across recently. A group of five young engineers and entrepreneurs from Boston have come up with such a device that makes gesture recognition easy for you. Gesto is the first open-source board for wearable gesture control that combines muscle signals with motion patterns. A MassChallenge Boston finalist, this small form factor board can be worn on various parts of the user’s body (wrist, forearm, arm, torso, or leg). It combines information from its spatial sensors with the user’s muscle activity, and analyses them in real time to recognise gestures and motion patterns. It eliminates the need for cameras and tedious calibration, letting the body itself serve as a controller. The board is recognised by other devices in the same way a keyboard or mouse is recognised, so the user can send commands directly to the device and control it. Let us take a look at the features of this board.
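Because the board presents itself to a host the way a keyboard does, an application does not need any special driver: gestures simply arrive as ordinary key events that the app maps to actions. The sketch below illustrates that idea; the key codes and action names are invented for illustration and are not part of the Gesto specification.

```python
# Hypothetical mapping from key events emitted by a gesture board to
# application actions. Since the board enumerates as a standard keyboard,
# the application only has to translate incoming key codes.
GESTURE_KEYS = {
    "F13": "play_pause",   # e.g. a pinch gesture (assumed binding)
    "F14": "next_track",   # e.g. a wave gesture (assumed binding)
}

def handle_key(key):
    """Translate a key event from the board into an app action, or None."""
    return GESTURE_KEYS.get(key)
```

In practice the bindings would be defined in the companion smartphone app; the dictionary here just stands in for that configuration.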
Gesto boards don’t have a ground electrode; they use a virtual ground. This eliminates the need for extra cables and electrodes to measure muscle activity in any part of the body. The biosignals received from the user’s muscles are analysed in real time. All the tools a developer needs to perform muscle analysis – software filters, machine learning algorithms, feature extraction, data compression, integration etc. – are available for free in different languages such as C, MATLAB, Python and Java. These features let a user get raw data from the board and use the patterns for other analysis too.
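To give a flavour of what feature extraction on raw muscle data looks like, here is a minimal Python sketch computing two amplitude features that are standard in EMG analysis: root mean square (RMS) and mean absolute value (MAV). This is an illustrative sketch, not Gesto's actual toolkit; the window size and function names are assumptions.

```python
import math

def rms(window):
    """Root mean square -- a common amplitude feature for an EMG window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def mav(window):
    """Mean absolute value -- another standard EMG amplitude feature."""
    return sum(abs(x) for x in window) / len(window)

def extract_features(signal, window_size=4):
    """Slide a non-overlapping window over raw samples; emit (rms, mav) pairs."""
    feats = []
    for i in range(0, len(signal) - window_size + 1, window_size):
        w = signal[i:i + window_size]
        feats.append((rms(w), mav(w)))
    return feats
```

Feature vectors like these are what a classifier (or the board's own algorithms) would consume, rather than the raw samples themselves.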
The gesture recognition algorithm used in this project is known as DualBurst because it combines muscle signals from the body with motion signals from the accelerometer for pattern analysis. This feature lets you recognise and define different types of hand movements:
- Singular gestures: Pinch, wave, hand fold, finger press and similar gestures that depend on muscle contractions.
- Air drawing: Letters, numbers, figures etc. that are performed with wrist or arm movement.
- Directional gestures: Gestures in three-dimensional space, such as raising and lowering the hand, rotating the fist etc.
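The fusion idea behind combining the two signal streams can be sketched with a toy rule: muscle activity with little motion suggests a singular gesture, while muscle activity plus motion suggests air drawing or a directional gesture. This is an illustrative sketch only; DualBurst itself is proprietary, and the thresholds and labels below are assumptions.

```python
import math

def accel_magnitude(sample):
    """Magnitude of a 3-axis accelerometer reading (ax, ay, az)."""
    return math.sqrt(sum(a * a for a in sample))

def classify(emg_rms, accel_sample, emg_thresh=0.5, motion_thresh=1.2):
    """Toy fusion rule combining muscle amplitude with motion:
    active muscle + little motion -> 'singular' (e.g. pinch, finger press);
    active muscle + motion        -> 'dynamic' (air drawing / directional);
    otherwise                     -> 'idle'."""
    moving = accel_magnitude(accel_sample) > motion_thresh
    active = emg_rms > emg_thresh
    if active and not moving:
        return "singular"
    if active and moving:
        return "dynamic"
    return "idle"
```

A real system would replace these fixed thresholds with the machine learning models mentioned earlier, but the structure (motion channel plus muscle channel feeding one decision) is the same.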