Development of a Basic Robot System for Medical and Nursing Care for Patients with Glaucoma

No medical treatment that completely cures glaucoma has yet been developed, so ophthalmologists manage patients mainly to delay disease progression. Most patients with glaucoma are elderly. In an elderly person's home, equipment that can provide medical treatment and care can relieve the family of caregiving duties. To help elderly people with glaucoma live on their own as much as possible, we developed a support robot with five functions: care of the elderly, ophthalmological examination, assistance with trips in the neighborhood, medical treatment, and data referral to a hospital. A medical and nursing care robot should approach from within the patient's remaining visual field and at a speed suited to their eyesight, because a robot that approaches from a direction the patient cannot see is dangerous. We experimentally developed a robot that brings a white cane to elderly people with glaucoma. The base of the robot is a Megarover 1.1 carriage equipped with two infrared sensors; the robot follows a white line on the floor using these sensors and carries a special arm that requires no electricity. The arm can scoop up a block attached to the white cane. We also developed a direction detector composed of a charge-coupled device camera (SVR41ResucueHD; Sun Mechatronics), goggles (MG-277MLF; Midori Anzen Co., Ltd.), and biconvex lenses with a focal length of 25 mm (Edmund Co.). Several young volunteers were photographed wearing the direction detector on their faces. Image processing was performed with Scilab 6.1.0 and the Image Processing and Computer Vision Toolbox 4.1.2. To measure the line of sight, we calculated the center of gravity of the iris using five processing steps: reduction, trimming, binarization or grayscale conversion, edge extraction, and the Hough transform. Comparing binarization with grayscale conversion, binarization gave the better result. For edge extraction, we compared five methods: Sobel, Prewitt, Laplacian of Gaussian, fast Fourier transform, and Canny; the Canny method was optimal. Finally, we applied the Hough transform to search for the principal coordinates along the iris's edge and found that it could locate the center point of the iris.
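
The iris-center pipeline described above (reduction, trimming, binarization, Canny edge extraction, and a circular Hough transform) can be sketched as follows. The study implemented it in Scilab 6.1.0 with the Image Processing and Computer Vision Toolbox; the sketch below uses Python with OpenCV only to illustrate the same sequence of operations under stated assumptions, and the file name, crop window, threshold value, and Hough parameters are illustrative guesses rather than the values used in the study.

# Minimal sketch of the iris-center pipeline: reduction, trimming,
# binarization, Canny edge extraction, and a circular Hough transform.
# The study used Scilab 6.1.0 + IPCV; OpenCV is used here only as an analogue.
# File name, crop window, threshold, and Hough parameters are assumptions.
import cv2
import numpy as np

img = cv2.imread("eye_frame.png")  # frame from the goggle-mounted CCD camera (assumed path)
if img is None:
    raise SystemExit("eye_frame.png not found")

# 1. Reduction: shrink the frame to cut processing time.
small = cv2.resize(img, None, fx=0.5, fy=0.5, interpolation=cv2.INTER_AREA)

# 2. Trimming: crop a region of interest around the eye (assumed window).
roi = small[40:200, 60:260]

# 3. Binarization: convert to grayscale, then threshold so the dark iris stands out.
gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 70, 255, cv2.THRESH_BINARY_INV)

# 4. Edge extraction: the Canny detector gave the cleanest iris contour in the
#    comparison above (computed here for inspection; OpenCV's circular Hough
#    transform applies its own Canny step internally).
edges = cv2.Canny(binary, 50, 150)

# 5. Circular Hough transform: vote for circle centers along the iris edge.
circles = cv2.HoughCircles(
    gray, cv2.HOUGH_GRADIENT,
    dp=1,          # accumulator at full image resolution
    minDist=100,   # expect one dominant circle (the iris)
    param1=150,    # internal Canny upper threshold
    param2=20,     # accumulator threshold for accepting a center
    minRadius=15,  # assumed plausible iris radius range in pixels
    maxRadius=60,
)

if circles is not None:
    cx, cy, r = np.round(circles[0, 0]).astype(int)
    print(f"Iris center (px): ({cx}, {cy}), radius: {r}")
else:
    print("No iris circle detected; adjust the threshold or Hough parameters.")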


Authors:


