Autonomous Tour Guide Robot using Embedded Systems

DOI : 10.17577/IJERTCONV9IS12012


Lakshya J1, Nikitha M2, Ashwini3, Meghana G R4,

Students, GSSSIETW, Mysuru.

Bharathi R5,

Faculty, Dept. of ECE, GSSSIETW, Mysuru.

Summary – This article addresses the problem of autonomous robot navigation in indoor environments. There is a fundamental need for practical, robust, and truly accurate solutions to meet the demands of automated indoor applications, and researchers are currently studying the problem from several perspectives. The approach presented in this document relies on QR (Quick Response) codes to provide location references for mobile robots. The mobile robot is equipped with a smartphone configured to detect and scan QR codes that are deliberately placed in the robot's working environment. The mobile robot can perform autonomous navigation along a guided route using continuous recognition of the QR codes. The location information stored in each QR code is relayed to visitors using the text-to-speech engine of the Android device. Ultrasonic range sensors, which can detect objects and measure distances with high accuracy, are used for obstacle avoidance and wall-following operation. The sonar range data collected by the ultrasonic range sensors is processed by a microcontroller that autonomously controls the tour guide robot. A dedicated control algorithm is used by the tour guide robot to achieve more precise control of the robot's movement. Using Bluetooth technology, the data stored in a QR code is sent wirelessly from the smartphone to the tour guide robot. The experimental setup of the tour guide robot, along with the effective execution of an efficient navigation strategy, is presented.

Keywords – Tour guide robot, Robot navigation, QR code, Ultrasonic range sensor, Bluetooth technology.

The proposed tour guide robot combines a smartphone application with a mobile robot platform. The smartphone and three ultrasonic range sensors, one at the front and one on each of the left and right sides, are mounted on the tour guide robot. The robot is programmed to follow the wall using the side ultrasonic range sensors, maintaining a distance of about 30 cm, and to avoid collisions with obstacles such as doorways by detecting them and estimating their distance using the front ultrasonic range sensor. The smartphone application performs the QR code recognition. QR codes are deliberately attached to the wall along a given route. It has been observed that the recognition rate of the QR code reader falls sharply as the speed of the tour guide robot increases.
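The wall-following behavior described above can be expressed as a simple proportional controller around the 30 cm setpoint. The sketch below is illustrative only; the gain, clamp, and stop-distance values are assumptions (the paper does not publish its control code):

```cpp
#include <algorithm>
#include <cassert>

// Illustrative proportional wall-following controller (hypothetical values).
// target_cm: desired distance from the wall (30 cm in this design).
// side_cm:   reading from a side-facing ultrasonic range sensor.
// Returns a steering correction clamped to [-max_turn, +max_turn];
// positive steers toward the wall, negative steers away from it.
double wallFollowCorrection(double side_cm, double target_cm = 30.0,
                            double kp = 2.0, double max_turn = 50.0) {
    double error = side_cm - target_cm;  // > 0 means too far from the wall
    double turn = kp * error;            // proportional term only
    return std::max(-max_turn, std::min(max_turn, turn));
}

// Front-sensor obstacle check: stop if an obstacle (e.g. a doorway edge)
// is closer than stop_cm. The 20 cm threshold is an assumed value.
bool mustStop(double front_cm, double stop_cm = 20.0) {
    return front_cm < stop_cm;
}
```

On the robot, the correction would be mapped to a left/right wheel-speed differential through the motor shield; here it is kept as a plain number so the logic can be tested on a host machine.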

To demonstrate the feasibility and practicality of our work, this paper presents the design and implementation of the proposed tour guide robot, experimental results of the navigation route, and some insights into the practical challenges encountered.

The rest of this article is organized as follows: Section II presents a brief overview of related work; Section III presents the implementation details of our system; Section IV analyzes the experimental results, followed by the conclusion in Section V.

  1. INTRODUCTION

Today, robots are no longer a thing of science-fiction movies. From cleaning robots to clinical robots, they are ubiquitous and are becoming an essential part of people's lives. As robots take on an ever-increasing number of roles in people's day-to-day lives, their impact on society continues to grow.

Robotic vacuum cleaners and security and surveillance applications are demonstrations of productive indoor-robot applications. Another example of the significant real-world uses of indoor assistance robots is the use of mobile robots as tour guides at exhibitions or shows.

The work presented in this document is based on the development of a flexible indoor autonomous robot that can be used as a tour guide for campus visits, for example during school open days.

The goal is to provide visitors with an automated, usable, and enriched visit experience. During the development of the tour guide robot, the team had to address several mobile-robotics challenges, including navigation, sensor integration, and steering.

  2. RELATED WORK

    [1]. Latif, Abdul, et al. "Implementation of Line Follower Robot based Microcontroller ATMega32A." Journal of Robotics and Control (JRC) 1.3 (2020): 70-74.

This document shows the implementation of a very simple robot program, a line-following robot, in which the robot simply moves along a line. The study uses experimental methods, conducting sequential research procedures: analysis, mechanical design, design of electronic components, and production and testing of the control programs. The ATmega32A microcontroller-based line follower robot was tested, and the results show that it can follow a black line on a white floor and display its status on an LCD screen. However, line recognition remains sensitive to speed: at 90-150 rpm the line follower can follow the path, while above 150 rpm it cannot.

    [2]. Zhang, Bin, et al. "Development of An Autonomous Guide Robot Based on Active Interactions with Users." 2020 IEEE/SICE International Symposium on System Integration (SII). IEEE, 2020.

In this article, active interaction means that the user can make the robot wait for them, explain the surrounding content, and so on. The robot automatically adjusts its speed and path to adapt to the user's actions so that it can provide attentive service. The robot is governed by a general control model based on an integrated potential field, by which it can adapt to different user motions without altering the moving rules case by case. The effectiveness of the proposed method is demonstrated by both simulation and experiments in which people are guided by the robot in indoor environments.

    [3]. Chen, Yuhao, et al. "Effects of Autonomous Mobile Robots on Human Mental Workload and System Productivity in Smart Warehouses: A Preliminary Study." Proceedings of the Human Factors and Ergonomics Society Annual Meeting. Vol. 64. No. 1. Sage CA: Los Angeles, CA: SAGE Publications, 2020.

Safety is a key issue for human-machine collaboration. A human-in-the-loop experiment was conducted to evaluate the performance of a customized robot control scheme in a smart-warehouse environment, where human participants performed order-picking and assembly tasks and mobile robots performed simulated pallet-movement tasks. Three modes of human-robot interaction were tested (no robot involved, a mobile robot with an empty load, and a mobile robot with a full load). Participants' response times, subjective workload, and task performance (completion time and errors) were recorded and compared between the conditions. Preliminary results indicated that a slight decline in human productivity (time to completion increased from 191.4 to 204.1 seconds, p = 0.041) was well offset by relatively large gains from the robots. Furthermore, introducing robots into the shared workplace had a negative impact on people's mental workload (pupil diameter increased from 4.0 to 4.2 mm, p = 0.012).

    [4]. Wang, Shengye, and Henrik I. Christensen. "Tritonbot: First lessons learned from deployment of a long-term autonomy tour guide robot." 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). IEEE, 2018.

To explore human-robot communication techniques and long-term interactions, the authors built TritonBot, a long-term autonomous robot that acts as a building receptionist and test site. It recognizes people's faces, talks to them, and guides people to the labs and suites of an office building. The article presents the design of TritonBot and the lessons learned from the first month of deployment, covering both technical aspects and aspects of human-robot interaction. TritonBot and its BoxBot variant ran for 108.7 hours, actively interacted with people for 22.1 hours, greeted 2,950 people, made 150 trips, and traveled 1. miles. The authors share TritonBot's components under an open license to help the community replicate the TritonBot platform and inspire long-term research into autonomy and human-robot interaction.

  3. IMPLEMENTATION DETAILS

In the development of the microcontroller-controlled guide robot, both hardware and software design techniques are needed. This section presents the hardware design of the main components of the guide robot. Fig. 1 shows the proposed tour guide robot platform that serves as the test-bed for the experiments, along with the deployment of the hardware components.

Fig. 1. The proposed Tour Guide Robot platform. The robot is composed of a microcontroller, a motor shield, 3 ultrasonic range sensors, and a smartphone.

        1. Hardware Design of the Tour Guide Robot

The ATmega328-based Arduino UNO R3 microcontroller is employed as the brain of the system. The microcontroller controls a Bluetooth module and multiple sensors. The body of the tour guide robot is built on the DFRobot two-wheel-drive (2WD) platform, which is suitable as a cheap platform for research projects. A motor shield allows the microcontroller to control the motors; the shield is based on the DFRobot L293D [10] shown in Fig. 2(A). In our design, we use it to regulate the driving speed and direction of the mobile robot. As shown in Fig. 2(B), the Bluetooth module HC-06 [11] is used for communication between the microcontroller and a smartphone. As shown in Fig. 2(C), the ultrasonic range sensors are used to implement the wall-following and obstacle-avoidance behaviors. Ref. [12] gives details about the ultrasonic range sensor units used in this design; the sensor can detect objects from 2 cm to 400 cm. A smartphone is used to run a mobile application (app) called MY QR BOT.

          Fig. 2. From left to right, pictures of (A) DF Robot L293 DC motor driver shield, (B) HC-06 Bluetooth module, and (C) HC-SR04 Ultrasonic sensor
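The HC-SR04 reports distance as the width of an echo pulse. The conversion below is the standard timing formula for this sensor family (sound travels at roughly 343 m/s at room temperature); the helper names and the range check against the 2-400 cm rating are our own, for illustration:

```cpp
// Converts an HC-SR04 echo pulse width (microseconds) to distance in cm.
// Sound travels ~0.0343 cm/us; the echo covers the round trip to the
// object and back, so the one-way distance is half the total.
double echoToCm(double echo_us) {
    return (echo_us * 0.0343) / 2.0;
}

// Returns true when a reading falls inside the sensor's rated 2-400 cm span;
// readings outside it should be discarded as noise or missed echoes.
bool inRange(double cm) {
    return cm >= 2.0 && cm <= 400.0;
}
```

On the Arduino itself the pulse width would come from `pulseIn()` on the sensor's echo pin; the arithmetic is identical.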

        2. Software Used

    Arduino Ide

The Arduino IDE is a cross-platform application (for Windows, macOS, Linux) used to write programs in a dialect of C/C++ and upload them to Arduino boards and, with the help of third-party cores, to other vendors' development boards. Fig. 3 shows the Arduino IDE platform.

    Fig.3 Arduino IDE

    MIT App Inventor

MIT App Inventor, originally developed by Google and now maintained by MIT, is a web-based integrated development environment intended to teach computational thinking through the development of mobile applications. Applications are created by dragging and dropping components into a design view and using a visual blocks language to program application behavior. Fig. 4 shows MIT App Inventor.

    Fig.4 MIT app inventor

The robot must stop at the precise locations where the QR code tags are placed. Stopping the tour guide robot is accomplished with the help of the Android device mounted on top of it, which detects and scans the QR sticker. After scanning the QR code, the decoded text is converted to speech. The detection, scanning, and text-to-speech conversion are performed by the application, which was developed using MIT App Inventor. After completing the above procedure, the guide robot moves on to the next location on the campus. Fig. 5 shows the front end of the app developed.

    Fig.5 Front end of the app developed
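On the robot side, the microcontroller only has to react to short commands arriving over the HC-06 serial link while the app handles QR scanning and text-to-speech. The paper does not publish its wire protocol, so the single-character command codes below are purely hypothetical, to show the shape of such a dispatcher:

```cpp
// Hypothetical single-character command protocol between the app and the
// robot over the HC-06 serial link (these codes are assumptions for
// illustration; the paper does not document its actual protocol).
// 'S' -> stop at a QR tag while the app plays text-to-speech,
// 'G' -> resume moving, 'L'/'R' -> turn left/right.
enum class RobotCommand { Stop, Go, TurnLeft, TurnRight, Unknown };

RobotCommand parseCommand(char c) {
    switch (c) {
        case 'S': return RobotCommand::Stop;
        case 'G': return RobotCommand::Go;
        case 'L': return RobotCommand::TurnLeft;
        case 'R': return RobotCommand::TurnRight;
        default:  return RobotCommand::Unknown;  // ignore noise bytes
    }
}
```

In an Arduino sketch the character would be read with `Serial.read()` in `loop()` and the returned command mapped to motor-shield calls; the dispatcher itself is plain C++ and can be tested on a host machine.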

  4. CONCLUSION

We have developed and tested an autonomous guide robot that uses the recognition of QR codes as a navigation aid. This document has described the design and implementation of our tour guide robot. The guide robot is equipped with 3 ultrasonic range sensors and a smartphone running a purpose-built app to detect QR codes. The experimental environment contains several QR codes. The range information collected by the ultrasonic range sensors controls the movement of the guide robot, and Bluetooth technology is used to deliver the information encoded in the QR code to the tour guide robot.

The proposed guide robot was able to experimentally demonstrate its effective performance in guiding visitors to destination places. Furthermore, our implementation is low cost and built from easily accessible off-the-shelf electronic modules.

In future work, we would like to study methods for more robust turning control of the robot. The rotation sensors in a smartphone will be a useful source to consider. Another aspect of the guide robot we would like to continue working on is the interface, including the development of multimedia content to let the guide robot play videos introducing the lab on the smartphone.

  5. REFERENCES

  1. Latif, Abdul, et al. "Implementation of Line Follower Robot based Microcontroller ATMega32A." Journal of Robotics and Control (JRC) 1.3 (2020): 70-74.

  2. Zhang, Bin, et al. "Development of An Autonomous Guide Robot Based on Active Interactions with Users." 2020 IEEE/SICE International Symposium on System Integration (SII). IEEE, 2020.

  3. Chen, Yuhao, et al. "Effects of Autonomous Mobile Robots on Human Mental Workload and System Productivity in Smart Warehouses: A Preliminary Study." Proceedings of the Human Factors and Ergonomics Society Annual Meeting. Vol. 64. No. 1. Sage CA: Los Angeles, CA: SAGE Publications, 2020.

  4. S. Wang and H. I. Christensen, "TritonBot: First Lessons Learned from Deployment of a Long-Term Autonomy Tour Guide Robot," 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2018.

  5. Del Duchetto, Francesco, Paul Baxter, and Marc Hanheide. "Lindsey the tour guide robot-usage patterns in a museum long-term deployment." 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). IEEE, 2019.

  6. Singh, Rajesh, et al. "PATH LEARNING ALGORITHM FOR FULLY INDIGENOUS MULTIMEDIA AND GUIDE ROBOT." International Journal of Mechatronics and Applied Mechanics 6 (2019): 7-16.

  7. Wang, Shengye, and Henrik I. Christensen. "Tritonbot: First lessons learned from deployment of a long-term autonomy tour guide robot." 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). IEEE, 2018.

  8. Kokare, Snehal, Rajveer Shastri, and Shrikrishna Kolhar. "Line Follower with Obstacle Information System Using ZigBee." 2018 Fourth International Conference on Computing Communication Control and Automation (ICCUBEA). IEEE, 2018.

  9. Rainer, J. Javier, Salvador Cobos-Guzman, and Ramón Galán. "Decision making algorithm for an autonomous guide-robot using fuzzy logic." Journal of Ambient Intelligence and Humanized Computing 9.4 (2018): 1177-1189.

  10. DFRobot L298 DC motor driver shield, http://elesson.tc.edu.tw/md221/pluginfile.php/4241/mod_page/cont ent/ 10/L298_Motor_Shield_Manual.pdf.

11. HC-06 Manual, http://www.exp-tech.de/service/datasheet/HC-Serial-Bluetooth-Products.pdf

  12. HC-SR 04 Ultrasonic Sensor, http://letsmakerobots.com/node/30209

  13. Zxing https://code.google.com/p/zxing/

  14. OpenCV Android SDK http://docs.opencv.org/inde.html
