Low-Cost Mechatronic Design of an Omnidirectional Mobile Robot

This paper presents the mechatronic design of a four-wheel omnidirectional mobile robot intended for indoor logistics applications. The low-level control is built on two open-source hardware platforms (a Raspberry Pi 3 Model B+ and an Arduino Mega 2560) that manage four industrial motors, four ultrasonic sensors, four optical encoders, a vision system with two cameras, and a Hokuyo URG-04LX-UG01 laser scanner. The system is powered by a lithium battery supplying 24 V DC with a capacity of 20 Ah. The Robot Operating System (ROS) runs on the Raspberry Pi, and its performance is evaluated with the selected sensors and hardware. Based on a series of tests, the mechatronic system is assessed and safe power-distribution schemes for all the electronic devices are proposed. From these performance results, recommendations are given for using the Raspberry Pi and the Arduino in terms of power, communication, and the distribution of control among the different devices. Following these recommendations, the sensors are distributed between the two controllers (Arduino and Raspberry Pi). In addition, the camera drivers have been implemented in Linux, and a Python program has been written to access the cameras. These cameras will be used to run a deep learning algorithm that recognizes people and objects, increasing the level of intelligence of the robot in combination with the maps obtained from the laser scanner.
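As a minimal illustration of the camera access described above, the following Python sketch grabs one frame from each of the two cameras. It assumes the cameras are exposed by the Linux drivers as video devices reachable through OpenCV's VideoCapture with indices 0 and 1; the actual indices and backend depend on the robot's configuration.

    # Minimal sketch (assumption: both cameras appear as /dev/video0 and
    # /dev/video1 and are accessible through OpenCV's VideoCapture).
    import cv2

    def open_cameras(indices=(0, 1)):
        """Open the robot's two cameras and return the capture handles."""
        captures = []
        for idx in indices:
            cap = cv2.VideoCapture(idx)
            if not cap.isOpened():
                raise RuntimeError(f"Camera {idx} could not be opened")
            captures.append(cap)
        return captures

    def grab_frames(captures):
        """Read one frame from each camera; returns a list of BGR images."""
        frames = []
        for cap in captures:
            ok, frame = cap.read()
            if not ok:
                raise RuntimeError("Frame capture failed")
            frames.append(frame)
        return frames

    if __name__ == "__main__":
        cams = open_cameras()
        left, right = grab_frames(cams)
        print("Captured frames:", left.shape, right.shape)
        for cap in cams:
            cap.release()

Frames obtained this way can later be passed to the object-recognition network mentioned in the abstract.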
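Similarly, a short sketch of how the laser scanner data could be consumed on the ROS side of the Raspberry Pi is given below. The node and topic names (a sensor_msgs/LaserScan stream on /scan) are assumptions based on a typical Hokuyo driver setup, not on the exact configuration reported here.

    # Minimal ROS node sketch (assumption: the Hokuyo driver publishes
    # sensor_msgs/LaserScan messages on the /scan topic).
    import rospy
    from sensor_msgs.msg import LaserScan

    def scan_callback(msg):
        # Report the closest valid range measurement of each scan.
        valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
        if valid:
            rospy.loginfo("Closest obstacle: %.2f m", min(valid))

    if __name__ == "__main__":
        rospy.init_node("scan_monitor")
        rospy.Subscriber("/scan", LaserScan, scan_callback)
        rospy.spin()
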


Authors:


