
How Can the Human Eye Be Interfaced with Machines?

Dr. Abhishek Mandal, Ph.D.

Senior Business Adviser, Vision Science Academy, London, U.K.

 

Vision Science Academy Exclusive

Recent technological advances and the daily use of modern devices have changed our lifestyle altogether. We live in an increasingly automated society in which more and more everyday tasks are being handed over to machines. Yet even amid such widespread innovation, people with disabilities have often been left behind in benefiting from these fruits of science. Efforts in this area have now picked up pace, and many new AI-linked modalities are expected to prove beneficial for individuals with disabilities.

Vision-based human-computer interaction

People with physical disabilities have long faced tremendous difficulty in using modern machines such as computers, often with no alternative but to be left out. With the help of new and improved AI systems, such individuals can now use their eye movements to control a computer interface. Eye-tracking technology frees the user from conventional input methods such as the keyboard, mouse, or touchpad: movements of the eyes and intentional blinks are registered as commands, letting the user navigate the interface much as anyone else would (Kyung-Nam & Ramakrishna, 1999). The algorithm estimates blink duration and the sequence of blinks in real time to control the machine, as sketched below. The technology is not yet as seamless as conventional input methods, but it is bound to improve with time.
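
The following Python sketch illustrates one way such blink decoding could work. It is a minimal sketch, not the method of the cited paper: it assumes an upstream tracker that reports, per video frame, whether the eye is closed, and the timing thresholds and command names ("select", "click") are illustrative assumptions.

    from dataclasses import dataclass
    from typing import Optional

    SHORT_BLINK_MAX_S = 0.3   # assumed cutoff: at or below this is a quick blink
    LONG_BLINK_MIN_S = 0.5    # assumed cutoff: a deliberate "select" blink
    DOUBLE_BLINK_GAP_S = 0.4  # assumed max gap between two quick blinks ("click")

    @dataclass
    class BlinkDecoder:
        closed_since: Optional[float] = None    # time the eye last closed
        last_blink_end: Optional[float] = None  # end time of the last quick blink

        def update(self, eye_closed: bool, t: float) -> Optional[str]:
            """Feed one frame; return a command name when one is recognised."""
            if eye_closed:
                if self.closed_since is None:
                    self.closed_since = t      # blink begins
                return None
            if self.closed_since is None:
                return None                    # eye was already open
            duration = t - self.closed_since   # blink just ended
            self.closed_since = None
            if duration >= LONG_BLINK_MIN_S:
                self.last_blink_end = None
                return "select"                # one long, deliberate blink
            if duration <= SHORT_BLINK_MAX_S:
                if (self.last_blink_end is not None
                        and t - self.last_blink_end <= DOUBLE_BLINK_GAP_S):
                    self.last_blink_end = None
                    return "click"             # two quick blinks in a row
                self.last_blink_end = t
            return None                        # ambiguous duration: ignore

    decoder = BlinkDecoder()
    # Simulated timestamps: two quick ~0.1 s blinks, 0.2 s apart -> "click"
    for t, closed in [(0.0, False), (0.1, True), (0.2, False),
                      (0.4, True), (0.5, False)]:
        command = decoder.update(closed, t)
        if command:
            print(f"{t:.1f}s: {command}")

In practice, the eye-closed flag would itself come from the tracker, for example from an eyelid-openness measure crossing a calibrated threshold.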

How does the system work?

Vision-based computer interaction relies on image-processing technology and a dedicated camera to detect the commands being given. Both wearable and non-wearable sensors can be used for this purpose; for convenience, non-wearable and integrated systems are likely to become the primary input choice and to see the widest adoption (Zander, Gaertner, Kothe, & Vilimek, 2010).
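
To make the camera side of such a non-wearable system concrete, here is a hedged Python/OpenCV sketch that estimates the pupil centre in a close-up eye image by thresholding for the darkest region and taking the centroid of the largest blob. OpenCV 4 is assumed, the threshold value of 40 is an arbitrary illustration, and real trackers add infrared illumination, corneal-reflection modelling, and per-user calibration.

    import cv2  # OpenCV 4 assumed; install with `pip install opencv-python`

    def pupil_center(eye_bgr):
        """Return (x, y) of the estimated pupil centre, or None."""
        gray = cv2.cvtColor(eye_bgr, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (7, 7), 0)
        # The pupil is typically the darkest region; 40 is an assumed threshold.
        _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        blob = max(contours, key=cv2.contourArea)  # largest dark blob
        m = cv2.moments(blob)
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

    cap = cv2.VideoCapture(0)       # default webcam
    ok, frame = cap.read()
    if ok:
        print(pupil_center(frame))  # whole frame stands in for an eye crop here
    cap.release()

Mapping the estimated pupil centre to an on-screen cursor position would then require a calibration step against known screen targets.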

In addition, electrooculography (EOG) based interfaces exist, which use electrodes placed on the skin around the eyes to register commands. The standing electrical potential between the cornea and the retina shifts relative to the electrodes as the eyes move; the sensors detect these changes in potential and allow the user to operate the interface (Barea, Boquete, Mazo, & López, 2002).
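
A toy Python sketch of how such an interface might decode a horizontal EOG channel into left/right gaze commands follows. The ±40 µV threshold, the crude baseline estimate, and the synthetic trace are illustrative assumptions, not the calibrated pipeline of Barea et al. (2002).

    def decode_eog(samples_uv, threshold_uv=40.0):
        """Map a horizontal EOG trace (microvolts) to gaze commands."""
        baseline = sum(samples_uv[:10]) / 10.0   # crude drift/offset estimate
        commands = []
        for v in samples_uv:
            delta = v - baseline
            if delta > threshold_uv:
                commands.append("right")         # large positive deflection
            elif delta < -threshold_uv:
                commands.append("left")          # large negative deflection
            else:
                commands.append("center")
        return commands

    # Synthetic trace: centre gaze, a rightward saccade, back to centre.
    trace = [2, 1, 3, 2, 1, 2, 3, 2, 1, 2, 55, 60, 58, 3, 2]
    print(decode_eog(trace))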

Uses of the technology

With the help of deep learning algorithms, changes in the eye’s morphology are already being detected, enabling earlier diagnosis of diseases such as retinopathy and age-related macular degeneration (Jacko et al., 2000). The continuous data collected through this technology can greatly enrich clinical datasets and support further breakthroughs in ophthalmological care.
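
As a loose illustration of what such a classifier can look like, the minimal PyTorch sketch below maps a stand-in fundus image to a disease-probability score. The architecture, input size, and untrained weights are illustrative assumptions only, not a model from the cited study.

    import torch
    import torch.nn as nn

    class TinyFundusNet(nn.Module):
        """Toy convolutional network for binary retinal-disease screening."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.head = nn.Sequential(
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1),
            )

        def forward(self, x):
            # Output in (0, 1): interpreted as probability of disease.
            return torch.sigmoid(self.head(self.features(x)))

    model = TinyFundusNet()
    dummy = torch.randn(1, 3, 224, 224)  # stands in for a preprocessed image
    print(model(dummy).item())           # untrained, so the score is arbitrary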

The potential for integrating this technology into daily life is vast. The smart homes of the future may well incorporate similar technology to further streamline routine tasks. Trials and experiments have shown eye-based AI to be highly promising, and it may be only a matter of time before these innovations transform our lives.

 

References:

  1. Barea, R., Boquete, L., Mazo, M., & López, E. (2002). System for assisted mobility using eye movements based on electrooculography. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 10(4), 209-218.
  2. Jacko, J. A., Barreto, A. B., Chu, J. Y. M., Scott, I. U., Rosa, R. H., & Pappas, C. C. (2000). Macular degeneration and visual search: What we can learn from eye movement analysis. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 44(29), 116-119. doi:10.1177/154193120004402931
  3. Kyung-Nam, K., & Ramakrishna, R. S. (1999, October). Vision-based eye-gaze tracking for human computer interface. Paper presented at the 1999 IEEE International Conference on Systems, Man, and Cybernetics (SMC’99).
  4. Zander, T. O., Gaertner, M., Kothe, C., & Vilimek, R. (2010). Combining eye gaze input with a brain-computer interface for touchless human-computer interaction. International Journal of Human–Computer Interaction, 27(1), 38-51. doi:10.1080/10447318.2011.535752
