At the first-ever TI Educators' Conference in Bangalore, we saw projects from hundreds of eager minds, all backed by TI's support programs. In this article, we take a look at one interesting project that caught our eye: a touch panel that can be added onto any display to make it touch-enabled.

The project team consists of three engineers from the National Institute of Technology, Raipur: Vasuki Soni, Rounak Singh Narde and Mordhwaj Patel. The project is titled “An Interactive Multi Touch Panel using Infrared Sensors”. It is a touch panel that provides a touch-operated interface to a computer or any computer-operated device. Fitted over the screen of a computer or laptop, it converts an ordinary monitor into a touch-sensitive one, letting the user control the mouse cursor directly from the panel. In essence, it can turn a non-touch device into a touch-operated one.


Growing an Idea
Mr Soni says, “This technology presents an idea dedicated to the field of education. In our day-to-day life, we see that education calls for more user interaction for better understanding. In villages and in many schools and colleges, the major need is to make the education process more interactive. But due to a severe lack of resources, it has not yet been possible to provide a solution that is affordable at the village or grass-root level. We sincerely believe that touch technologies have a lot of scope for innovation here.”

I probe further about how their idea differs from the other touch interfaces out there. Mr Narde replies, “Our idea is not meant for the consumer market; it isn’t glitzy or loaded with non-essential features aimed at the rich and the gadget-loving community, but is meant for villages. You can think of it like a development board, consisting of only the essentials. The panel may have a lower resolution than existing technologies like resistive and capacitive touch, but, as I mentioned, the project is intended for the village education program. The whole touch-screen assembly we have worked on is very cheap, and it suits interactive computer interfaces for student education, such as drawing, painting and other simple operations that do not require high resolution from a tangible user interface.”

Working
The final touch interface is built up in several stages.

1. Sensor panel for touch
The sensor panel houses the major part of the sensing circuitry. The sensors are arranged in lines that form two opposite sides of a rectangle, matching the shape of a laptop or computer screen. They are built as modules, each containing eight IR sensors and one IR LED. The LEDs are not actuated all at once but one by one.

2. Sensor data acquisition
The touch-detection system works on the principle that a finger placed on the touch panel obstructs the path of the IR rays emitted from the periphery of the panel, so the receivers lying in its shadow change their outputs. The IR LEDs are driven with a 38 kHz modulated signal so that the receivers can pick them out of the ambient noise. Only a single IR LED is activated at a time, using a combination of D flip-flops and tristate buffers that provides accurately clocked, synchronised switching of the LEDs.
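As a rough illustration of this one-LED-at-a-time scan, here is a minimal C sketch; the pin macros, delay routine and timings are placeholders assumed for this article, not the team's firmware.

/* Placeholder pin macros: in the real hardware these drive the D flip-flop
 * chain and gate the 38 kHz carrier onto the selected LED driver. */
#define LED_DATA_HIGH()
#define LED_DATA_LOW()
#define LED_CLK_HIGH()
#define LED_CLK_LOW()
#define IR_CARRIER_ON()
#define IR_CARRIER_OFF()
static void delay_us(unsigned us) { (void)us; /* busy-wait in real firmware */ }

#define NUM_LEDS 12

/* Walk a single '1' through the flip-flop chain so that exactly one LED
 * driver (behind its tristate buffer) is enabled at any moment. */
void scan_leds_once(void)
{
    for (int led = 0; led < NUM_LEDS; led++) {
        if (led == 0)
            LED_DATA_HIGH();   /* inject the lone '1' on the first clock */
        else
            LED_DATA_LOW();
        LED_CLK_HIGH();
        delay_us(1);
        LED_CLK_LOW();

        IR_CARRIER_ON();       /* pulse this LED with the 38 kHz carrier */
        delay_us(600);         /* give the modulated receivers time to respond */
        /* ...read the 96 sensor bits for this LED here (see the next step)... */
        IR_CARRIER_OFF();
    }
}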

When a touch event takes place on the panel and obstructs the path of an IR LED, some sensors do not receive the IR rays emitted from that LED. The result is an image of the kind shown in Figure 2.

Mr Patel explains further, “Once the data is retrieved comes our next concern. We now have 96 sensors and 12 IR LEDs, but the Stellaris ARM Cortex-M3 microcontroller does not have enough I/O pins to read all the sensors directly. Hence we are using SN74165 shift registers to expand the I/O. The data is collected by these 8-bit parallel-in, serial-out shift registers, and the serial output of each sensor module is chained into one bit stream that finally reaches the microcontroller over the Synchronous Serial Interface.”
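A minimal sketch of this acquisition path is shown below, assuming StellarisWare's SSIDataPut()/SSIDataGet() calls and a hypothetical GPIO latch line on the SH/LD pin of the SN74165 chain; the team's actual firmware will differ in detail.

#include <stdint.h>
#include "inc/hw_memmap.h"   /* SSI0_BASE */
#include "driverlib/ssi.h"   /* SSIDataPut(), SSIDataGet() */

#define NUM_SENSOR_BYTES 12  /* 12 modules x 8 sensors = 96 bits */

/* Placeholder latch macros: in the real firmware they toggle the GPIO
 * line wired to the SH/LD pin of the SN74165 chain. */
#define SHIFT_LOAD_LOW()
#define SHIFT_LOAD_HIGH()

void read_sensor_frame(uint8_t frame[NUM_SENSOR_BYTES])
{
    uint32_t rx;

    /* Latch the 96 parallel sensor outputs into the shift registers */
    SHIFT_LOAD_LOW();
    SHIFT_LOAD_HIGH();

    /* Clock the latched bits in: writing dummy bytes makes the SSI master
     * generate the serial clock for the cascaded '165s. */
    for (int i = 0; i < NUM_SENSOR_BYTES; i++) {
        SSIDataPut(SSI0_BASE, 0x00);
        SSIDataGet(SSI0_BASE, &rx);      /* blocking read of one shifted-in byte */
        frame[i] = (uint8_t)(rx & 0xFF);
    }
}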

3. USB-based serial communication with the microcontroller
Now the data is ready for further processing. The data bytes received from each module are sent to the computer over USB. Soni adds, “We are using the bulk transfer scheme because the data, which is in the form of an image, has to be sent over USB as a long burst, which is exactly what bulk transfers are meant for.”
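On the computer side, receiving such a frame could look roughly like the libusb sketch below; the VID/PID, endpoint address and frame size are placeholders, and the project's actual host software may use a different USB stack.

#include <stdio.h>
#include <libusb-1.0/libusb.h>

#define FRAME_BYTES 12    /* one 96-bit sensor frame */
#define EP_IN       0x81  /* assumed bulk-IN endpoint */

int main(void)
{
    libusb_context *ctx = NULL;
    libusb_init(&ctx);

    /* Placeholder vendor/product IDs for illustration only */
    libusb_device_handle *dev =
        libusb_open_device_with_vid_pid(ctx, 0x1234, 0x5678);
    if (!dev) { fprintf(stderr, "device not found\n"); return 1; }
    libusb_claim_interface(dev, 0);

    unsigned char frame[FRAME_BYTES];
    int got = 0;

    /* Bulk transfer suits this long burst of frame data */
    if (libusb_bulk_transfer(dev, EP_IN, frame, sizeof frame, &got, 1000) == 0)
        printf("received %d bytes of sensor data\n", got);

    libusb_release_interface(dev, 0);
    libusb_close(dev);
    libusb_exit(ctx);
    return 0;
}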

4. Creating the image from the sensor matrix data in C
The data retrieved in the previous step is now used to create an image, in C, that gives a pictorial representation of where on the 2D plane of the touch panel the touch was detected.

Rounak elaborates, “This image is actually made by drawing scan-lines, i.e. lines between a receiver and a transmitter, whenever the communication between them is successful. These scan-lines are drawn for every possible Rx-Tx pair in the sensor matrix. Figure 1 shows a sample image made during our implementation using dummy data, in which every possible Rx-Tx pair communicates successfully because there is no touch or obstruction.”
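A minimal C sketch of this scan-line drawing is given below; the panel geometry tables and image dimensions are assumptions made for illustration, not the team's code.

#include <stdint.h>
#include <stdlib.h>

#define IMG_W  320
#define IMG_H  240
#define NUM_TX 12   /* IR LEDs    */
#define NUM_RX 96   /* IR sensors */

typedef struct { int x, y; } point;

/* Assumed edge coordinates of the LEDs and sensors on the panel;
 * in practice these come from the physical layout. */
static const point tx_pos[NUM_TX];
static const point rx_pos[NUM_RX];

/* Simple parametric line rasteriser: one step per pixel of the longer axis */
static void draw_line(uint8_t img[IMG_H][IMG_W], point a, point b)
{
    int steps = abs(b.x - a.x) > abs(b.y - a.y) ? abs(b.x - a.x) : abs(b.y - a.y);
    for (int i = 0; i <= steps; i++) {
        int x = a.x + (b.x - a.x) * i / (steps ? steps : 1);
        int y = a.y + (b.y - a.y) * i / (steps ? steps : 1);
        img[y][x] = 255;
    }
}

/* frame[rx] is non-zero when receiver rx picked up LED tx's modulated light,
 * so a scan-line is drawn only for pairs that communicated successfully. */
void build_scanline_image(uint8_t img[IMG_H][IMG_W],
                          int tx, const uint8_t frame[NUM_RX])
{
    for (int rx = 0; rx < NUM_RX; rx++)
        if (frame[rx])
            draw_line(img, tx_pos[tx], rx_pos[rx]);
}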

(Image: single sensor array)

5. Creating an AVI (Audio Video Interleave) video file from the scan-line image frames
The next step makes an AVI video from the frames obtained in the previous step. When a touch is sensed on the panel, one image of the kind shown in Figure 2 is generated for each LED. The final frame is obtained by overlapping all the images corresponding to each lit LED, and it is this composite that contains the touch point.
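A minimal sketch of the overlap step, assuming the frames are greyscale buffers like the one in the previous sketch: the per-LED images are combined with a pixel-wise maximum, so only the area that received no scan-line in any frame stays dark, and that dark patch is the blob containing the touch point. The team's actual compositing and AVI writing may differ in detail.

#include <stdint.h>

#define IMG_W  320
#define IMG_H  240
#define NUM_TX 12

void overlap_frames(uint8_t out[IMG_H][IMG_W],
                    const uint8_t frames[NUM_TX][IMG_H][IMG_W])
{
    for (int y = 0; y < IMG_H; y++) {
        for (int x = 0; x < IMG_W; x++) {
            uint8_t v = 0;
            for (int f = 0; f < NUM_TX; f++)
                if (frames[f][y][x] > v)  /* a pixel lit by any LED's scan-lines ends up bright */
                    v = frames[f][y][x];
            out[y][x] = v;                /* only the region shadowed in every frame stays dark */
        }
    }
}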

(Image: overall project prototype)

6. Creating a video interface with Community Core Vision (CCV)
Mordhwaj explains, “Up to this stage, we have a video that shows where the touch was actually detected (i.e. the blob) with respect to the sensor matrix frame. This is very useful, because the video is now ready to be fed to the image-processing software CCV (Community Core Vision). CCV takes the video stream built from the blob-detection frames of the touch panel and outputs tracking data that can drive mouse movements. With the video interfaced to the software, we now need a driver that synchronises the blob position figured out by CCV with the mouse movements, which leads us to the last step.”

7. TUIO Mouse Driver Implementation
TUIO is an open framework that defines a common protocol for tangible user interfaces. It allows the transmission of meaningful information extracted from tangible interfaces, including touch events and object states.

Vasuki chimes in, “This protocol encodes the control data from a tracker application (e.g. one based on computer vision) and sends it to any client application that decodes this information. This combination of TUIO trackers, protocol and client implementations allows the rapid development of tangible multi-touch interfaces. We are finally at a stage where we can run our touch user interface with mouse movements well synchronised with finger movements and gestures.”
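For reference, the cursor data carried by TUIO's “2Dcur” profile looks roughly like the following; the field names follow the public TUIO 1.1 specification, while the struct itself is only an illustrative container, not code from the project.

/* Each OSC bundle from the tracker carries, roughly:
 *   /tuio/2Dcur alive  <session ids of the touches currently on the panel>
 *   /tuio/2Dcur set    s x y X Y m     (one message per active cursor)
 *   /tuio/2Dcur fseq   <frame sequence number>
 */
typedef struct {
    int   session_id;  /* s: unique id for the lifetime of this touch  */
    float x, y;        /* position, normalised to 0..1 over the panel  */
    float vx, vy;      /* X, Y: velocity components                    */
    float accel;       /* m: motion acceleration                       */
} tuio_cursor;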

Challenges faced
I cannot help but quiz them about the challenges of bringing such a complex idea to fruition. Rounak says, “It was actually really hard to design the hardware, because that aspect, along with the complicated signal and supply management, was the most critical issue in controlling the sensitivity of the touch panel. We also ran into propagation delay, since the signal from the sensors has to pass through both hardware and software. But we managed to shrink the code to speed up the operation of the whole interface, and that helped a lot in completing the project.”

What lies ahead
With such a work ethic and evident innovativeness on display, the talk moves on to the features they would like to add to their design. Patel adds, “We are still working on increasing its resolution and compactness. We are also planning to add more multi-touch features to the panel, like scrolling and zooming. We have been brainstorming ways to make the design more compact so that it fits easily over a computer screen. To decrease the thickness of the touch panel, we are still searching for smaller IR sensors, which will also let us reduce the size of the PCB for the sensor array.”

