Robot Control by ERPs of Brain Waves

This paper presents a technique for robot control using event-related potentials (ERPs) of brain waves. With the proposed technique, people with severe physical disabilities can freely explore the outside world. A specific ERP component, N2P3, was identified and used to control the movement of a robot and the view of its camera through the designed brain-computer interface (BCI). Users only needed to watch the stimulus of the attended button on the BCI; the evoked potential elicited by the target button, N2P3, had the greatest amplitude among all control buttons. An experimental scenario was constructed in which the robot had to walk to a specific position, move the camera view to read the mission instructions, and then complete the task. Twelve volunteers participated in the experiment, and the results showed that the accuracy of BCI control reached 80% and the average execution time for completing the mission was 353 seconds. This research makes four main contributions: (1) identifying an efficient ERP component, N2P3, for BCI control; (2) embedding the robot's viewpoint image into the user interface for robot control; (3) designing an experimental scenario and conducting the experiment; and (4) evaluating the performance of the proposed system to assess its practicability.
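The selection rule described above, identifying the attended button as the one whose evoked response shows the largest N2P3 amplitude, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the data layout (a mapping from button names to arrays of EEG epochs) and the use of peak-to-peak amplitude as the N2P3 measure (N200 trough to P300 peak) are assumptions for the sake of the example.

```python
import numpy as np

def select_target_button(epochs):
    """Pick the attended button by comparing N2P3 amplitudes.

    epochs: dict mapping button name -> array of shape
        (n_trials, n_samples), EEG epochs time-locked to each
        button's stimulus (hypothetical data layout).
    Returns the button whose trial-averaged ERP has the largest
    peak-to-peak N2P3 amplitude (N200 trough to P300 peak).
    """
    scores = {}
    for button, trials in epochs.items():
        # Average across trials to suppress background EEG noise
        erp = np.asarray(trials, dtype=float).mean(axis=0)
        # N2P3 amplitude: distance from the negative trough to the positive peak
        scores[button] = erp.max() - erp.min()
    return max(scores, key=scores.get)
```

In practice the trough and peak would be searched within the expected N200 and P300 latency windows rather than over the whole epoch, but the comparison-by-amplitude logic is the same.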




