Personal Assistant Robot for Paraplegic People with Hand Gesture

DOI : 10.17577/IJERTV9IS030277


Dr. Nagalakshmi Venugopal

Professor

Department of Computer Science and Engineering, Dr. N.G.P. Institute of Technology

Coimbatore, Tamilnadu, India

Shrinithi E

Student

Department of Computer Science and Engineering, Dr. N.G.P. Institute of Technology

Coimbatore, Tamilnadu, India

Ragul N

Student

Department of Computer Science and Engineering, Dr. N.G.P. Institute of Technology

Coimbatore, Tamilnadu, India

Sri Balu Adityan B

Student

Department of Computer Science and Engineering, Dr. N.G.P. Institute of Technology

Coimbatore, Tamilnadu, India

Abstract:- In this fast-growing modern world, many new technologies are being developed for the physically challenged, especially for those who are immobile or paraplegic. Yet these users still face difficulty with several daily tasks, one of which is lifting or picking up objects on their own. Recent technological growth can be used to reduce the human effort involved in such tasks, and robotic vehicles with pick and place mechanisms are far more efficient than conventional approaches. The proposed system is a robotic vehicle controlled entirely by hand tilting motions, without a single button press, which lets the user control the vehicle motion and the pick and place arm simultaneously. Integrating IoT with Artificial Intelligence further allows such robotic vehicles to act as personal assistants. This paper describes the resulting system, including a robotic arm with a pick and place mechanism and an inbuilt personal assistant.

Keywords: Paraplegic, pick and place, personal assistant, robotic vehicles.

  1. INTRODUCTION

    In this world of advanced technology, many systems have been created to enable the internetworking of physical devices embedded with electronics, software, sensors, and actuators so that they can exchange data. IoT can help paraplegic people in several ways. At its most basic level, it can help disabled people carry out everyday tasks in a simple manner without involving another person, removing the need for a helper throughout the day for tasks such as picking and placing objects. This technology already exists in heavy-tool manufacturing and in automobile assembly lines, and it can be introduced into the medical field for efficient and maneuverable task handling.

    With the help of the robotic vehicle, a person who is unable to walk can easily operate it using simple hand gestures and carry out certain simple tasks. The vehicle carries a robotic arm, comparable to a human hand, that performs pick and place operations.

    There is no need for a remote control, since the person's hand movements are enough. The user also does not need to move, because the system can be operated from a stable location and hand movements alone are adequate. Conventional button-based remote-controlled systems exist, but their main mode of control is a remote or a joystick, which is uncomfortable for frequent use and causes strain after continuous operation. The proposed vehicle is controlled by tilting motions and does not need a single button press.

    The existing systems comprise voice-controlled wheelchairs that enable a person to move around using voice recognition, but they assist only with general motion. The proposed system minimizes effort, since it requires only hand movement and the person can remain in a stable location.

    Robotic vehicles are preferred since they are faster and more accurate than humans. They also never tire, so they can be used at any time of day and under any circumstances. They can help the population of countries such as India, where such robotic vehicles are currently not widely adopted because of their high cost. This paper focuses on providing an alternative to manual pick and place operations for a person with impaired legs, as well as providing them with a virtual assistant.

    Fig 1. Chassis

    The assistant provides the users with periodic reminders to take their medicines regularly. It can also remind them of hospital appointments in advance so that the necessary arrangements can be made. It minimizes the need for a helper throughout the day, and people of any age group can use it, since it requires only a minimal level of learning and understanding.

    Fig 2. Hand gesture control module

  2. EXISTING SYSTEM

    Motion control gestures can be monitored with a radar that records hand gestures, processes the motion data, and performs the required activity. The radar illuminates the entire hand and measures a superposition of reflections from multiple dynamic scattering centers. Several learning techniques are applied whose input comprises the motion patterns of the user's fingers. However, radars are large, consume high power, and demand significant computation to generate and process the radar signals.

    Remote-operated robots exist that are equipped with a wide-angle camera giving visual feedback through Virtual Reality (VR). For inbuilt personal assistants, AI is used to interact with patients by performing two main tasks: assisting and reminding. Hand gesture recognition can also be performed with the help of a camera.

    Tele-interaction robots have been widely used in healthcare; for example, they monitor diabetes patients by reading a glucose sensor and give feedback in vocal form. All of these systems essentially connect three domains, namely Robotics, the Internet of Things, and Artificial Intelligence, over a network infrastructure.

  3. RELATED WORK

    Robots are usually controlled through remotes with joysticks and buttons, which are uncomfortable and unsuitable for continuous use. Hence a motion-controlled approach is used to tackle this issue. The user is given a pair of gloves with which to control the robot. The left glove controls the motion of the vehicle, which can move forward, backward, left, or right according to the hand gestures of the wearer. The right glove controls the pick and place operation, so the robotic arm mimics the wearer's hand movements. Thus one glove controls the robotic vehicle and the other controls the robotic arm.

    An ATmega-based microcontroller circuit in the transmitter section transmits the motion commands produced by the accelerometer sensor to the receiver unit over radio frequency. The receiver unit is equipped with RF receivers that receive the transmissions from both transmitter units simultaneously. An 8051-family microcontroller converts the received transmissions into motion commands, which operate the vehicle and the pick and place arm simultaneously, achieving complete robot movement without any button press.
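    As a rough, host-runnable illustration of this split (written in plain C, with hypothetical one-byte command codes rather than real register-level RF handling), the sketch below shows how the receiver side might map each received byte to a motion command for the vehicle or the arm:

    #include <stdio.h>

    /* Hypothetical one-byte command codes; the real encoding depends on the
       RF encoder/decoder pair and is not specified in the paper. */
    enum command {
        CMD_STOP     = 0x00,
        CMD_FORWARD  = 0x01,
        CMD_BACKWARD = 0x02,
        CMD_LEFT     = 0x03,
        CMD_RIGHT    = 0x04,
        CMD_ARM_GRIP = 0x05,
        CMD_ARM_DROP = 0x06
    };

    /* Receiver-side dispatch: one received byte in, one action out. */
    static const char *dispatch(unsigned char code)
    {
        switch (code) {
        case CMD_FORWARD:  return "drive forward";
        case CMD_BACKWARD: return "drive backward";
        case CMD_LEFT:     return "turn left";
        case CMD_RIGHT:    return "turn right";
        case CMD_ARM_GRIP: return "close claw and lift";
        case CMD_ARM_DROP: return "lower arm and release";
        default:           return "stop";
        }
    }

    int main(void)
    {
        /* Simulated stream of bytes received over RF from both gloves. */
        unsigned char rx[] = { CMD_FORWARD, CMD_LEFT, CMD_ARM_GRIP, CMD_STOP };
        for (unsigned i = 0; i < sizeof rx; i++)
            printf("0x%02X -> %s\n", (unsigned)rx[i], dispatch(rx[i]));
        return 0;
    }

    On the actual hardware, the same dispatch would set the motor driver and arm control pins instead of printing.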

    This motion-controlled robot can handle objects placed on the vehicle's resting surface. Obstacles in the robotic vehicle's path can be avoided, since the directions are given by the user. The robotic arm has two claws for holding and lifting an object. The virtual assistant resembles a patient-caregiver relationship: all the inputs given by the user regarding their medical needs are stored and recalled as reminders at the right time.

    For this purpose, a Raspberry Pi Zero W with an inbuilt Wi-Fi module is used. A microphone fitted in the glove lets the user communicate with the robot, and the output is played through a speaker built into the robotic vehicle. Thus the user does not need to be close to the vehicle during this process.

    1. SUMMARY

      From the above literature survey, existing pick and place operations are performed only by guiding the robot's movement directly. The proposed system overcomes this by letting the user operate the vehicle from a fixed location while it carries out the desired task. An AI-based personal assistant is integrated with this module to remind the user about pills and checkups and to interact with them.

  4. PROPOSED WORK

    The transmitter and receiver units are built from several components, including a microcontroller, an accelerometer, and DC motors. The personal assistant obtains its input through a microphone and interacts with the user through speakers mounted on the robotic vehicle.

      1. Microcontroller AT89S52

        The AT89S52 is a low-power, high-performance CMOS 8-bit microcontroller with 8K bytes of in-system programmable Flash memory. It belongs to the 8052 family, an extension of the 8051 microcontroller. It controls the robotic arm and must therefore be compatible with the arm's drive circuitry.

      2. DC Motor

        A DC motor is an electric motor that runs on direct current. Its internal configuration harnesses magnetic interaction to produce rotational motion: electrical energy is converted into mechanical energy as torque applied to the motor shaft. DC motors drive the wheels of the robotic vehicle.

      3. Accelerometer

        Accelerometers sense static and dynamic acceleration, and one of their main applications is tilt sensing. Here the accelerometer senses the tilt of the user's hand, which controls the motion of the vehicle; available devices cover measurement ranges from about ±1 g to ±250 g. Accelerometers are preferred since they are less susceptible to noise.
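        A minimal sketch of tilt-to-command mapping, assuming signed X/Y readings in g that are near zero when the glove is level and an illustrative dead zone of ±0.3 g (neither value comes from the paper):

        #include <stdio.h>

        /* Hypothetical dead zone in g; small tilts inside it are ignored so the
           vehicle does not creep when the hand is roughly level. */
        #define DEAD_ZONE 0.3f

        /* Map a pair of tilt readings (in g) to a direction string. */
        static const char *tilt_to_command(float x, float y)
        {
            if (y >  DEAD_ZONE) return "FORWARD";
            if (y < -DEAD_ZONE) return "BACKWARD";
            if (x >  DEAD_ZONE) return "RIGHT";
            if (x < -DEAD_ZONE) return "LEFT";
            return "STOP";
        }

        int main(void)
        {
            float samples[][2] = { {0.0f, 0.8f}, {-0.6f, 0.1f}, {0.1f, -0.05f} };
            for (unsigned i = 0; i < 3; i++)
                printf("x=%+.2f y=%+.2f -> %s\n",
                       samples[i][0], samples[i][1],
                       tilt_to_command(samples[i][0], samples[i][1]));
            return 0;
        }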

      4. Ultrasonic Sensor

        Ultrasonic sensors are mainly used to detect the position of an object, so obstacles in the path of the robotic vehicle can be avoided. The sensor detects objects in a range of roughly 3 centimeters to 3 meters, is well suited to indoor applications, and has high noise tolerance.
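        With a typical HC-SR04-style sensor (an assumption; the paper does not name the exact part), the distance is derived from the width of the echo pulse as distance = (pulse time × speed of sound) / 2. A small sketch of that arithmetic:

        #include <stdio.h>

        #define SPEED_OF_SOUND_CM_PER_US 0.0343f  /* ~343 m/s at room temperature */

        /* Convert an echo pulse width in microseconds to distance in centimeters.
           The pulse covers the round trip, so the result is halved. */
        static float echo_to_cm(unsigned long echo_us)
        {
            return (echo_us * SPEED_OF_SOUND_CM_PER_US) / 2.0f;
        }

        int main(void)
        {
            unsigned long pulses_us[] = { 175, 1750, 17500 };  /* ~3 cm, 30 cm, 3 m */
            for (unsigned i = 0; i < 3; i++)
                printf("echo %lu us -> %.1f cm\n", pulses_us[i], echo_to_cm(pulses_us[i]));
            return 0;
        }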

      5. Motor Driver L293D

        Motor drivers act as current amplifiers: they take a low-current control signal and provide a higher-current drive signal. In the common mode of operation, two DC motors can be driven simultaneously in forward and reverse. An input logic of 00 or 11 stops the corresponding motor, while 01 and 10 rotate it clockwise and anticlockwise respectively.
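        The following sketch restates that truth table in code; the helper function and the pin names IN1/IN2 are illustrative, not taken from the paper:

        #include <stdio.h>

        /* Desired state of one DC motor driven by one half of the L293D. */
        enum motor_state { MOTOR_STOP, MOTOR_CW, MOTOR_CCW };

        /* Return the IN1/IN2 logic levels for the requested state:
           00 or 11 -> stop, 01 -> clockwise, 10 -> anticlockwise. */
        static void l293d_inputs(enum motor_state s, int *in1, int *in2)
        {
            switch (s) {
            case MOTOR_CW:  *in1 = 0; *in2 = 1; break;
            case MOTOR_CCW: *in1 = 1; *in2 = 0; break;
            default:        *in1 = 0; *in2 = 0; break;  /* both low stops the motor */
            }
        }

        int main(void)
        {
            const char *names[] = { "stop", "clockwise", "anticlockwise" };
            for (int s = MOTOR_STOP; s <= MOTOR_CCW; s++) {
                int in1, in2;
                l293d_inputs((enum motor_state)s, &in1, &in2);
                printf("%-13s -> IN1=%d IN2=%d\n", names[s], in1, in2);
            }
            return 0;
        }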

    1. Modules

      • Navigation of robotic vehicle

      • Pick and place operation

      • Personal Assistant

    2. Modules Description

    1. Navigation of robotic vehicle

      This module contains the motion control sensors and the motors that respond to the sensed gestures to navigate the robot from one place to another. It has components at both the transmitter and receiver ends. At the transmitter end, an ATmega microcontroller and accelerometer sensors capture the user's hand movement and transmit it to the receiver end.

      All of these components are fitted onto a PCB fixed to the left glove. At the receiver end, the transmitted data is received and the vehicle is navigated from its current location. The receiver-side operations take place on the robotic vehicle, which drives its hardware components according to the signal received from the hand gesture gloves. It consists of wheels and motors that move the robot forward, backward, left, and right.

      Fig 3. Receiver Module containing motor drivers

    2. Pick and place operation

      This module contains the motion control sensors that feed gesture data to the pick and place mechanism. On the transmitting side of the robotic arm, an 8051-family microcontroller captures the gesture data and transmits it to the receiving end. The receiving side of the robotic arm module receives the sensor data and processes it, and the 8051 microcontroller runs the arm motors accordingly. The robotic arm has a clamp that can hold and lift an object.
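      As an illustration only (the actual gesture set, step size, and clamp angles are not given in the paper), the right-glove gestures could be turned into clamp commands along these lines:

      #include <stdio.h>

      /* Hypothetical arm gestures decoded from the right glove. */
      enum arm_gesture { GESTURE_NEUTRAL, GESTURE_TILT_DOWN, GESTURE_TILT_UP };

      /* Translate a gesture into a clamp opening angle (degrees). The limits
         are illustrative; real values depend on the clamp geometry. */
      static int clamp_angle(enum arm_gesture g, int current)
      {
          const int step = 10, open_max = 90, closed_min = 0;
          if (g == GESTURE_TILT_DOWN && current > closed_min)
              return current - step;   /* tilt down: close clamp to grip */
          if (g == GESTURE_TILT_UP && current < open_max)
              return current + step;   /* tilt up: open clamp to release */
          return current;              /* neutral: hold position */
      }

      int main(void)
      {
          int angle = 90;  /* start fully open */
          enum arm_gesture seq[] = { GESTURE_TILT_DOWN, GESTURE_TILT_DOWN,
                                     GESTURE_NEUTRAL, GESTURE_TILT_UP };
          for (unsigned i = 0; i < 4; i++) {
              angle = clamp_angle(seq[i], angle);
              printf("step %u -> clamp at %d deg\n", i + 1, angle);
          }
          return 0;
      }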

      The RF encoder and decoder ICs are paired with each other; encoders and decoders with the same number of address and data bits are chosen. The decoder receives the serial address and data transmitted by its corresponding encoder over an RF carrier and, after processing the data, latches it onto its output pins.
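      A common pairing for such low-rate links is an HT12E/HT12D-style encoder/decoder with an 8-bit address and a 4-bit data word; this is an assumption, since the exact ICs are not named in the paper. The sketch below imitates the address check a paired decoder performs before latching data to its output pins:

      #include <stdio.h>

      #define LOCAL_ADDRESS 0xA5  /* 8-bit address set identically on both ICs */

      /* One received frame: address byte plus 4-bit data nibble. */
      struct rf_frame {
          unsigned char address;
          unsigned char data;   /* only the low 4 bits are used */
      };

      /* Accept the frame only if the address matches, mirroring how a paired
         decoder latches data to its output pins. Returns 1 on acceptance. */
      static int decode(const struct rf_frame *f, unsigned char *out)
      {
          if (f->address != LOCAL_ADDRESS)
              return 0;             /* frame from another transmitter: ignored */
          *out = f->data & 0x0F;
          return 1;
      }

      int main(void)
      {
          struct rf_frame frames[] = { {0xA5, 0x3}, {0x5A, 0x7}, {0xA5, 0xC} };
          for (unsigned i = 0; i < 3; i++) {
              unsigned char data;
              if (decode(&frames[i], &data))
                  printf("frame %u accepted, data = 0x%X\n", i, (unsigned)data);
              else
                  printf("frame %u rejected (address mismatch)\n", i);
          }
          return 0;
      }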

    3. Personal Assistant

    It consists of a virtual assistant powered by a microcontroller. An inbuilt reminder module reminds the user to take medicines on time and of checkup appointments with the physician. The virtual assistant is a chat-bot that interacts with the user by responding to their inputs: input is given by voice through a microphone, and the corresponding output is played through a speaker fitted in the robotic vehicle.

    All the sensors and the microcontroller are successfully interfaced, and robotic vehicle movement is achieved. Whenever an obstacle appears in the vehicle's path, the ultrasonic sensors detect it and the vehicle is automatically diverted in the opposite direction. When the vehicle nears the object to be picked, the person uses the right-hand glove to operate the robotic arm. The data is transmitted from the gloves to the robotic vehicle as RF data and received by means of an RF receiver.
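    A minimal sketch of the divert decision, assuming a single front-facing ultrasonic reading and an illustrative 20 cm safety threshold (neither is specified in the paper):

    #include <stdio.h>

    #define OBSTACLE_THRESHOLD_CM 20.0f  /* illustrative safety distance */

    /* Decide whether the received drive command may be executed or must be
       overridden because the ultrasonic sensor reports an obstacle ahead. */
    static const char *safe_command(const char *requested, float distance_cm)
    {
        if (distance_cm < OBSTACLE_THRESHOLD_CM)
            return "reverse and turn";   /* divert away from the obstacle */
        return requested;
    }

    int main(void)
    {
        printf("%s\n", safe_command("forward", 120.0f)); /* path clear */
        printf("%s\n", safe_command("forward", 12.0f));  /* obstacle ahead */
        return 0;
    }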

    Reminders for taking pills and visiting the doctor for checkups are issued through the reminder module, and advance notice can also be given according to the user's wishes.
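    A host-runnable sketch of the reminder check, assuming a fixed daily medicine schedule (the table below is hypothetical); on the actual system this logic would run on the Raspberry Pi Zero W and route the message to a text-to-speech engine and the vehicle's speaker:

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical daily schedule: hour and minute of each dose. */
    struct dose { int hour; int minute; const char *medicine; };

    static const struct dose schedule[] = {
        {  8, 0,  "blood pressure tablet" },
        { 13, 30, "vitamin supplement"    },
        { 21, 0,  "pain relief tablet"    },
    };

    /* Check the wall clock against the schedule and announce any dose that
       falls in the current minute; the real module would speak this text. */
    static void check_reminders(void)
    {
        time_t now = time(NULL);
        struct tm *t = localtime(&now);
        for (unsigned i = 0; i < sizeof schedule / sizeof schedule[0]; i++)
            if (t->tm_hour == schedule[i].hour && t->tm_min == schedule[i].minute)
                printf("Reminder: time to take your %s.\n", schedule[i].medicine);
    }

    int main(void)
    {
        check_reminders();  /* in practice this would run in a periodic loop */
        return 0;
    }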

    Apart from these, voice-based interactions can also be made with the robot.

  5. CONCLUSION

The applications of robotics combined with Artificial Intelligence can help not only in large-scale operations but can also act as new levers for innovation in the medical field. This paper aims to minimize the effort required of a person seated in a wheelchair, so that basic operations such as picking and placing an object can be done independently, without having to move the wheelchair. The implementation of this project can serve as a starting point for similar inventions and will be especially valuable in countries where people prefer low-cost robotic vehicles.

Fig 5. Result

REFERENCES

  1. Hand Gesture Controlled Robot, International Journal of Recent Technology and Engineering (IJRTE), ISSN: 2277-3878, Volume-8, Issue-1S4, June 2019.

  2. Soli: Ubiquitous Gesture Sensing with Millimeter Wave Radar, Google ATAP, ACM SIGGRAPH 2016, July 24-28, 2016, Anaheim, CA.

  3. Remotely Teleoperating a Humanoid Robot to Perform Fine Motor Tasks with Virtual Reality, WM2018 Conference, March 18-27, 2018, Phoenix, Arizona, USA.

  4. A Robot to Provide Support in Stigmatizing Patient-Caregiver Relationships, research supported by the National Science Foundation under Grant #IIS 1317214, in collaboration with Linda Tickle-Degnen and Matthias Scheutz at Tufts University, 2018.

  5. Hand Gesture Recognition System, Computer Engineering Department, The Islamic University of Gaza, Gaza Strip, Palestine, 2011.

  6. Integrating Robots into the Internet of Things, International Journal of Circuits, Systems and Signal Processing, 2012.
