Integrating electronics onto the human body is nothing new in this age of wearables. Neil Harbisson has taken this to the next level with the eyeborg—an antenna implanted into his skull that lets him ‘hear’ colours. He is now the world’s first government-recognised cyborg. Neil Harbisson speaks with Dilin Anand in an interview with EFY
Q. Why do you need the eyeborg?
A. I was born with a condition that makes me see the world in black and white, literally. It is called achromatopsia, and it means that I can only see in greyscale. People often mistake this for ordinary colour blindness. As a kid I was teased a lot; kids would give me a red pen saying it was blue and I would write essays in the wrong colour. During my teenage years, I only wore black and white clothes. This was until I got my first eyeborg, which allowed me to sense colours.
Q. What created the spark that led you to integrate an electronic device onto your body?
A. It started when I heard a lecture by a cybernetics expert, Adam Montandon, while I was studying music composition at Dartington College of Arts. The idea of using digital inputs from an electronic device to augment senses excited me, especially because it meant that I could sense colour. He helped me create the first model of the eyeborg, which was more like a headphone. Eventually, it evolved into a cyborg-like extension of my sensory system; essentially, a prosthesis that would deliver input signals to my existing sense of sound.
Q. Was using sound a calculated decision?
A. We used sound because Adam felt that it would give me a better approximation of the variations of colour, since I am a musician. Moreover, the natural occurrence of synaesthesia—a neurological phenomenon in which stimulation of one sensory or cognitive pathway leads to automatic, involuntary experiences in a second sensory or cognitive pathway—suggested that visual and auditory senses could in some cases overlap. But the challenge was in figuring out how to convert colours into sounds.
Q. How did the team manage to convert light into sound?
A. Since both light and sound are waves, a physical model of transposing light into sound was used. This allowed us to create something that would create an experience similar to how we sense colours, in a continuous spectrum. Although light waves have frequencies far too high to hear, we were able to mathematically transpose them down until they sat within the audible range.
We then implemented this in software that runs on a wearable device outfitted with a camera. Red has the lowest frequency in the visible spectrum, and it is also the lowest note that I hear.
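The transposition described above can be sketched in a few lines. This is an illustrative model, not the eyeborg's actual software: it simply divides a light frequency by a fixed number of octaves (the octave count and the ~430 THz figure for deep red are assumptions for the example) so that the whole visible band lands in the audible range while pitch ordering is preserved.

```python
# Illustrative sketch: transpose light frequencies (hundreds of THz)
# down by a fixed number of octaves into the audible band.
# The visible spectrum (~430-750 THz) spans less than one octave,
# so a fixed shift keeps red as the lowest resulting note.

OCTAVES = 40  # assumed shift; chosen so ~430 THz lands near 391 Hz


def light_to_sound_hz(light_hz: float, octaves: int = OCTAVES) -> float:
    """Halve the frequency `octaves` times (i.e. divide by 2**octaves)."""
    return light_hz / (2.0 ** octaves)


red_hz = light_to_sound_hz(430e12)     # deep red, ~430 THz
violet_hz = light_to_sound_hz(750e12)  # violet, ~750 THz
```

With this fixed shift, red maps to roughly 391 Hz and violet to roughly 682 Hz, both comfortably within human hearing, and red remains the lowest tone, matching the description in the interview.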
Q. How challenging was it to get the eyeborg to work?
A. A challenge that came up with the use of a digital camera was that of saturation. It tended to over-saturate or under-saturate what it saw, depending on the environment.
We were able to solve this by tuning the system to detect 360 different hues and disregard brighter or darker versions. Saturation is used to adjust the volume of the sound.
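The hue-and-saturation scheme described here can be sketched with a standard RGB-to-HSV conversion. This is only a minimal illustration of the idea (the function name and the choice of Python's `colorsys` are assumptions, not the eyeborg's code): brightness is discarded, hue is quantised into 360 bins, and saturation drives loudness.

```python
import colorsys


def pixel_to_tone(r: int, g: int, b: int) -> tuple[int, float]:
    """Map an RGB pixel (0-255 channels) to (hue_bin, volume).

    Hue is quantised into 360 bins, as in the interview; the HSV
    value channel (brightness) is ignored, and saturation is used
    as the volume of the resulting tone.
    """
    h, s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_bin = int(h * 360) % 360  # one of 360 hues; brightness discarded
    volume = s                    # more saturated colour -> louder tone
    return hue_bin, volume
```

For example, pure red gives hue bin 0 at full volume, while any grey pixel has zero saturation and so produces silence, which matches the idea of disregarding brighter or darker versions of the same hue.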
Q. How is the implantable experience working out?
A. Initial designs left me with cables coming out of my head into a computer carried in my backpack. With modern technology, the eyeborg now uses a chip placed at the back of my skull. The converted sound is transferred to me through the device pressed against my head using bone conduction technology. The device also lets me hear music or receive phone calls directly in my head via Bluetooth.
Q. Why did you use bone conduction?
A. Bone conduction technology allows me to sense colours through a different channel; I am able to hear people speak and hear the colour of their clothes at the same time.
I have also had it osseointegrated, which places the device inside the bone so that the sound resonates much better.
Q. Which technologies can help enhance this device?
A. As of now, I have to stand near a power socket while the device recharges. I am working on ways to power it from my blood circulation instead.
Q. What other cyborg projects are you working on?
A. Fingerborg, a prosthetic finger designed for a student who lost a finger, features a camera inside. We are working on making the camera deliver feedback directly to his finger.
Then there are Speedborgs, which are internal radars that allow the user to perceive the exact speed of movements in front of him or her.