Thought based Control of Robotic Arm Via Wireless

DOI : 10.17577/IJERTCONV3IS16165


S. Venkatesh Gowtham (1) is currently pursuing a master's degree in Biomedical Instrumentation Engineering at Karunya University, Coimbatore.

P. Kingston Stanley (2) is currently working as an Assistant Professor at Karunya University, Coimbatore.

A. Sanjeevi Gandhi (3) is currently working as an Assistant Professor at Karunya University, Coimbatore.

Abstract— Artificial limbs and exoskeletons have been widely used in a variety of applications, from military to medicine. The Defence Advanced Research Projects Agency (DARPA) primarily focuses on developing exoskeletons to aid ground soldiers in both physical performance and survivability. Robotic arms have been used to assist people who have lost the ability to perform everyday activities, such as walking, because of grievous medical injuries. The main objective of this project is to control a robotic arm with the power of one's brain. The brainwave control system uses brain waves to initiate, stop and change the position of the robotic arm. The main component of the control system is the Emotiv EPOC sensor, which detects brain functions such as thoughts. With advancements in technology and healthcare, the growing number of senior citizens and elderly people who find it difficult to walk stand to benefit. Hence there is a need for a robotic arm that is user friendly and involves fewer complexities. The arm can also be used by the physically challenged who depend on others for movements that are difficult for them, and rehabilitation centres at hospitals can make use of it. In this project the signal is acquired using the Emotiv EPOC, the acquired signals are processed in the LabVIEW environment, and the various thought-based signals are obtained, based on which the arm is activated.

Keywords— BCI, EEG, Emotiv EPOC, Robotic arm

    1. INTRODUCTION

Electroencephalography (EEG) is a well known term in the Brain-Computer Interface (BCI) research community. Research shows that brain-computer interfaces (BCIs) have enabled interesting advances for medically disabled patients, allowing movement of prosthetic limbs and devices. In the medical research area, BCIs have been implemented to allow people with disabilities to control an artificial arm [3]. In BCI, the intention of moving something is generally known as a cognitive thought. BCI makes it possible for severely paralyzed people to use the things around them. The technology has also been used to detect driver fatigue and drowsiness, and other research has examined mind-controlled cars. The efficiency of cognitive control is still a big challenge for controlling devices. Brain-computer interfacing has opened a new era in technology: BCIs allow the user to interact with a system through mental actions alone, unlike traditional control methods such as physical manipulation or verbal commands.

There are basically two techniques used to monitor the user's brain activity: invasive (cortically implanted electrodes) and non-invasive (EEG-type) techniques. Invasive techniques usually provide more precise and accurate measurements; neural activity is extracted from the cerebral cortex and used to control a prosthetic limb. Specifically, in the invasive technique the subject must undergo an operation in which electrodes or chemical substances are implanted in the brain. Our society is still not ready to accept this kind of system; even with the external Emotiv headset, we faced a number of objections from patients and physically challenged groups, although it involves no cortical implantation of electrodes. The Emotiv EPOC uses a non-invasive brainwave monitoring system in which EEG records the electrical activity from the scalp, typically over an interval of 20-40 minutes or less. The non-invasive technique has the advantage of sparing the subject the difficulties of an operation, since neural activity can be measured through simple wearable items [3]. EEG monitors the voltage fluctuations resulting from ionic current flows within the neurons of the brain, which occur about 1.5 s before the movement takes place. Diagnostic applications mostly focus on the neural oscillations contained in the EEG signals. After the stream of data is recorded, it is usually processed by a detailed algorithm to decode the subject's intention. Simple event-related potentials (ERPs) make this algorithm powerful and allow it to generalize across users.

The ERP component that emerges in the process of decision making is called the P300 (P3) wave. It is an event-related potential recorded over the parietal-central area of the skull that appears about 300 ms after the stimulus and is larger for target stimuli. In short, the process combines low-probability target items with high-probability non-target items, detected by EEG and electromyography (EMG). BCIs have been implemented for patients with diseases of the central nervous system, for whom BCI has often been the only medium for interacting with the world. With the vast advancement of sensor technology and improvements in algorithms, BCIs have shown their potential not only in the clinical context but also for the general public.
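As a rough illustration of the P300 idea only (not the authors' implementation; the epoch shapes, time window and synthetic data below are assumptions), a target/non-target comparison can be sketched as the mean amplitude of stimulus-locked epochs around 300 ms:

```python
import numpy as np

FS = 128  # Hz, assumed to match the Emotiv EPOC output rate

def p300_score(epochs, fs=FS, window=(0.25, 0.45)):
    """Average stimulus-locked epochs (n_trials x n_samples) and return the
    mean amplitude in the 250-450 ms window where a P300 is expected."""
    erp = epochs.mean(axis=0)
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    return erp[lo:hi].mean()

# Synthetic one-channel epochs: targets carry a small positive bump near 300 ms.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
bump = 8 * np.exp(-((t - 0.3) ** 2) / 0.002)
target_epochs = rng.normal(0, 5, (30, FS)) + bump
nontarget_epochs = rng.normal(0, 5, (30, FS))

print(p300_score(target_epochs) > p300_score(nontarget_epochs))  # typically True
```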

    2. ELECTROENCEPHALOGRAM

The electroencephalogram (EEG) records the brain's electrical activity and reflects brain function. To gather the electrical signal, electrodes are placed on the scalp. The signal is on the scale of microvolts and can be recorded and analysed. EEG signals exhibit different rhythms characterised by frequency bands and voltage ranges [4]. Table 1 lists the most common rhythms, including their frequency range, amplitude, prominent scalp area and the associated human condition.

• FRONTAL LOBE: planning, language, expression and speech; contains the motor cortex, which is involved in conscious thought and controls voluntary movement of body parts.

• PARIETAL LOBE: touch and taste; contains the somatosensory cortex, which receives and processes sensory signals from the body.

• OCCIPITAL LOBE: visual area; contains the visual cortex, which receives and processes signals from the retinas of the eyes.

• TEMPORAL LOBE: language reception.

      Figure.1 Locations of Brodmann areas in the brain

TABLE.1
EEG rhythms with their common characteristics

Rhythm | Frequency range | Amplitude  | Prominence area     | Human status
Delta  | 0.5-3 Hz        | 20-400 µV  | Variable            | Asleep
Theta  | 4-7 Hz          | 5-100 µV   | Frontal-temporal    | Awake, stress
Alpha  | 8-13 Hz         | 5-100 µV   | Occipital-parietal  | Awake, relaxed, eyes closed
Beta   | 14-30 Hz        | 2-10 µV    | Precentral-frontal  | Awake, no movement
Gamma  | >30 Hz          | 2-10 µV    | Precentral-frontal  | Awake
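Purely as an illustrative helper (not part of the paper), the band boundaries of Table 1 can be expressed as a small lookup that maps a dominant frequency to a rhythm name:

```python
def eeg_rhythm(freq_hz: float) -> str | None:
    """Map a dominant frequency in Hz to the rhythm names of Table 1."""
    if freq_hz < 0.5:
        return None            # below the delta band
    if freq_hz <= 3:
        return "delta"         # asleep
    if freq_hz <= 7:
        return "theta"         # awake, stress
    if freq_hz <= 13:
        return "alpha"         # awake, relaxed, eyes closed
    if freq_hz <= 30:
        return "beta"          # awake, no movement
    return "gamma"             # awake

print(eeg_rhythm(10))  # "alpha"
```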

    3. MOTIVATION

For patients who cannot move any body parts or even speak, BCIs have been the only means of moving things and even converting their thoughts into written form. Disabled or paralyzed patients can thus overcome their inabilities to some extent, as BCI creates a path for them to communicate with other humans. This is why patients are now adopting BCIs despite their shortcomings. With the vast advancement of sensor technology and improvements in algorithms, BCIs have shown their potential not only in the clinical context but also for the general public. Paralyzed or partially paralyzed people are very dependent on an assistant, and BCI offers a way to reduce that dependency. Moreover, Emotiv provides an inexpensive consumer headset that is within the reach of ordinary people. The main motivation of this paper is to measure the user's efficiency in controlling devices with the cognitive thoughts extracted by this device. We prepared a robot, let users control it with their thoughts, and finally calculated the efficiency. This measurement led us to significant conclusions about the user effectiveness of the device.

    4. METHODOLOGY

Many of the benefits of the Emotiv EPOC EEG lie in its usability. To extract electroencephalographic data, the EEG must be prepared briefly before each use. On a conventional EEG, a saline paste or gel must be applied to each of the many electrodes on the device to maintain proper contact between the electrode and the scalp [7]. The EPOC is simpler in that each felt electrode needs only to be wetted with a commercially available saline solution. After the EEG is prepared, the test subject puts it on and checks the calibration through the feedback from the included graphical user interface. All communication between the device and the computer occurs through a standard USB dongle, transmitting on a proprietary wireless band at 2.4 GHz [8].

Figure.2 Simple block diagram

In LabVIEW, "Emotiv Create Task" creates a reference to an Emotiv Task object using the DLL library. "Emotiv Start Task" resets a remote connection with the Control Panel. "Emotiv Read" waits for and returns an array indicating which cognitive expressions are detected by the Control Panel. "Emotiv Stop Task" closes the reference to the Emotiv Task object and reports any errors. The robotic arm is controlled wirelessly using the signals from the NI myDAQ.
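The create/start/read/stop flow above is specific to the authors' LabVIEW VIs; as a text-form illustration of the same loop only, a hypothetical Python wrapper might look like the following (none of these names are real Emotiv SDK or LabVIEW calls):

```python
import time

class EmotivTask:
    """Hypothetical stand-in mirroring the LabVIEW VI sequence described above."""

    def create(self):
        print("task reference created")              # "Emotiv Create Task"
        return self

    def start(self):
        print("connection to Control Panel reset")   # "Emotiv Start Task"
        return self

    def read(self):
        # "Emotiv Read": would return the detected cognitive actions,
        # e.g. ["PUSH"] or [] when the user is in a neutral state.
        return []

    def stop(self):
        print("task reference released")             # "Emotiv Stop Task"

task = EmotivTask().create().start()
try:
    for _ in range(5):                               # poll the Control Panel
        if "PUSH" in task.read():
            print("forward command sent to the robotic arm via the NI myDAQ")
        time.sleep(0.1)
finally:
    task.stop()
```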

    5. IMPLEMENTATION

      5.1 Emotiv Configuration

The Emotiv EPOC is a 14-channel neuroheadset. It has CMS and DRL references, which are used to achieve optimal positioning for accurate spatial resolution. The channel names are AF3, AF4, F3, F4, F7, F8, FC5, FC6, P3 (CMS), P4 (DRL), P7, P8, T7, T8, O1 and O2. Of these channels, AF3, AF4, F3, F4, F7 and F8 take frontal EEG data from the part of the brain involved in design, organizing, problem solving, selective attention and personality [8]. FC5 and FC6 take front-central EEG from the part of the brain engaged in preparing a response [6], [3].

Figure.3 Emotiv EPOC neuroheadset

O1 and O2 take EEG from the region at the back of the brain, which is mainly responsible for processing visual information. For the parietal area, which controls sensation, P7 and P8 are used. There are two temporal lobes, one on each side of the brain at about the level of the ears; these lobes allow a person to differentiate one smell from another and one sound from another, and they also help in sorting new data. T7 and T8 take data from the temporal sites. Figure 4 shows the placement of the sensors around the scalp, and Table 2 gives the Emotiv headset configuration. Sequential sampling with a single analog-to-digital converter (ADC) is used internally in the Emotiv device. Sequential sampling is a non-probability sampling technique in which a single item or a group of data is taken in a given time interval and analysed, then another group is taken if needed, and so on. An ADC is a device that converts a continuous voltage to a digital number proportional to its amplitude; the conversion involves quantization of the input. A fixed sampling rate of 128 Hz is used. Internally each channel is oversampled at 2048 Hz, and this extra bandwidth is used to remove very high-frequency mains harmonics; if these harmonics were not removed they would mix with the brain waves. The signal is then filtered and reduced to 128 Hz for wireless transmission. The main reason other systems offer greater sampling rates is to provide enough bandwidth to remove these signals. The EPOC has an upper bandwidth limit of around 43 Hz to avoid 50 Hz and 60 Hz interference, which means very fast evoked potentials cannot be captured.
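As a sketch of the oversample-then-decimate idea described above (the EPOC's actual internal sinc filter is proprietary; the filter choice and synthetic signal here are assumptions):

```python
import numpy as np
from scipy import signal

FS_INTERNAL = 2048   # Hz, internal per-channel oversampling rate
FS_OUT = 128         # Hz, rate delivered over the wireless link

def downsample_to_128(raw):
    """Anti-alias filter and decimate a 2048 Hz channel down to 128 Hz."""
    factor = FS_INTERNAL // FS_OUT                       # 16x decimation
    return signal.decimate(raw, factor, ftype="fir", zero_phase=True)

# One second of synthetic data: a 10 Hz "alpha" tone plus a 150 Hz mains harmonic.
t = np.arange(FS_INTERNAL) / FS_INTERNAL
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 150 * t)

eeg_128 = downsample_to_128(raw)     # the 150 Hz component is filtered out
print(eeg_128.shape)                 # (128,)
```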

The dynamic range of the Emotiv EPOC is 256 mVpp, and the EPOC has a built-in digital 5th-order sinc filter [3], which approximates an "ideal" low-pass filter. The coupling mode of the Emotiv EPOC is AC. A proprietary wireless protocol is used for the Emotiv headset to obtain a reliable communication link in the 2.4 GHz band. The headset contains a 3.7 V, 600 mAh rechargeable lithium battery that allows about 12 hours of continuous use [9].

      5.2. Emotiv Interfacing Software

The Emotiv headset collects sequentially sampled data and supplies the data to an application called the EPOC Control Panel. It processes the data and provides three built-in detection suites: Expressive, Affective and Cognitive. The Expressive suite detects facial movements and different states such as smile, raised brow, left wink and right wink. The Affective suite measures mental states such as concentration, meditation and excitement. The Cognitive suite first stores the user's neutral or relaxed mental state [7]; the system is then trained with the user's specific thoughts. When the headset is worn properly, the screen shows a visual map of the sensors. Green represents the best-quality contact; a sensor shown in black means there is no signal, red a very poor signal, orange a poor signal and yellow a fair signal. A graphical representation of the incoming EEG signals is shown in the Control Panel, which is used for training and recognition of thoughts [12].

      Figure.4 Sensors around the scalp

Emobot is a virtual robot in the Control Panel that copies different facial and head movements of the user. It copies expressive states such as left wink, right wink, raised brow, smile, blink and clenched teeth. The EMG component of the EEG data records the electrical activity produced by these muscle movements [9].

TABLE.2
Specification of the Emotiv EPOC sensor

Number of channels    | 14 (plus CMS/DRL references)
Channel names         | AF3, AF4, F3, F4, F7, F8, FC5, FC6, P3 (CMS), P4 (DRL), P7, P8, T7, T8, O1, O2
Sampling method       | Sequential sampling, single ADC
Sampling rate         | 128 Hz (2048 Hz internal)
Resolution            | 16 bits
Bandwidth             | 0.2-45 Hz, digital notch filters at 50 Hz and 60 Hz
Dynamic range         | 256 mVpp
Coupling mode         | AC coupled
Connectivity          | Proprietary wireless, 2.4 GHz band
Battery type          | Li-poly
Battery life          | 12 hours
Impedance measurement | Contact quality using patented system


      Figure.5 OWI-535 Robotic Arm

When EEG data are recorded, muscle movements add extra activity that is treated as noise and is usually filtered out. Brain-wave frequencies are grouped into bands: delta (0.1-3.5 Hz), theta (4-7.5 Hz), alpha (8-13 Hz) and gamma (above 30 Hz). Different brain activities create different frequencies; for example, motor imagery, which relates to our motor control, produces activity in the alpha band. The system infers the different intentions of the user by examining these frequencies.
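As an illustrative sketch only (not the authors' processing chain), the power in these bands can be estimated for a single channel with a Welch periodogram; the band edges follow the text above:

```python
import numpy as np
from scipy.signal import welch

FS = 128  # Hz, Emotiv EPOC output rate
BANDS = {"delta": (0.1, 3.5), "theta": (4, 7.5), "alpha": (8, 13), "gamma": (30, 45)}

def band_powers(eeg, fs=FS):
    """Return the summed PSD in each band for a one-channel EEG segment."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)       # 2 s windows
    return {name: psd[(freqs >= lo) & (freqs <= hi)].sum()
            for name, (lo, hi) in BANDS.items()}

# Synthetic example: 10 s of a 10 Hz "alpha" oscillation (e.g. motor imagery) plus noise.
rng = np.random.default_rng(1)
t = np.arange(10 * FS) / FS
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

powers = band_powers(eeg)
print(max(powers, key=powers.get))   # expected: "alpha"
```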

The Cognitive suite recognizes several movements of a floating virtual cube in the Control Panel based on the user's thoughts. Thirteen intentional thoughts can be recognized, including UP, DOWN, RIGHT, LEFT, ROTATE RIGHT, ROTATE LEFT, LIFT, PUSH, PULL, DROP and INVISIBLE. The user needs to train the system for each thought before using it. The user first provides neutral data by relaxing completely during an eight-second training period [9]; then, for any specific thought, the user trains for a further eight seconds. The system stores the data recorded during the training period using ERP and matches it against the user's activity to detect specific thoughts. A user can set only four thoughts at a time, which can be mapped to keystrokes, mouse controls or even audio files. In our application we used PUSH, DROP, RIGHT and LEFT for the forward, backward, right and left operations [9], [10].

5.3 Robotic Arm

The arm has four rotational joints, called the base, shoulder, elbow and wrist [10]. The base rotates the arm around the vertical z-axis, while the other three rotate it around the x-axis. The positive x-axis points out of the page and is also referred to as the "right". No joint rotates around the y-axis, which limits the arm's movements but also makes the kinematics calculations much easier [1].

Each joint has a rotation limit, in the backwards and forwards directions for the wrist, elbow and shoulder, and to the left and right for the base, which becomes important when rotations are implemented using angle values. The arm's gripper opens and closes via rotating gearwheels, but its prongs are connected to the wheels in such a way that they stay roughly parallel to each other as they move [13], [14].

          Figure.6 Schematic of the Robot Arm

One problem is that the gearwheels tend to slip under the weight of the arm, or if the arm is driven into a hard obstacle or beyond a joint's rotation limit [1]. Gravity also means that the arm's rotational velocity is not constant; for example, rotating the shoulder joint downwards takes less time than rotating it upwards by the same amount [14].

Another factor is the battery supply: as the batteries fade, so does the arm's speed. The arm has no feedback mechanism, either for sensing when the gripper has closed around something or for judging the joints' true positions.
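Because the arm gives no position feedback, a controller has to track commanded joint angles itself and clamp them to each joint's rotation limit. A minimal sketch of that bookkeeping follows; the limit values are illustrative assumptions, not OWI-535 datasheet figures:

```python
from dataclasses import dataclass

# Illustrative limits in degrees; the real per-joint limits of the OWI-535 differ.
LIMITS = {"base": (-135, 135), "shoulder": (-60, 90), "elbow": (-75, 75), "wrist": (-55, 55)}

@dataclass
class ArmState:
    """Commanded joint angles, tracked in software since the arm has no encoders."""
    base: float = 0.0
    shoulder: float = 0.0
    elbow: float = 0.0
    wrist: float = 0.0

    def rotate(self, joint: str, delta_deg: float) -> float:
        lo, hi = LIMITS[joint]
        new = min(max(getattr(self, joint) + delta_deg, lo), hi)  # clamp to the limit
        setattr(self, joint, new)
        return new

arm = ArmState()
print(arm.rotate("shoulder", 120))   # 90.0 -- clamped, preventing gear slip at the stop
```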

5.4 Robot Moves from Brain Commands

First, the PUSH, PULL, DROP and LIFT commands are associated with specific thoughts; for example, PUSH can be set for the upward command, PULL for downward, DROP for drop and LIFT for pick. Whenever the user thinks about pushing, the corresponding LabVIEW Boolean indicator lights and the robotic arm moves in the upward direction. The command is then passed to the transmitter, and the receiver at the robot end interprets the signal and drives the arm upwards. The same happens for the other three outputs. The Cognitive suite only lets the user train four thoughts at a time.
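A minimal sketch of this thought-to-command mapping is shown below; the send_rf() helper is hypothetical and stands in for the transmitter link:

```python
# Mapping from trained Cognitive-suite thoughts to arm commands, mirroring the
# PUSH/PULL/LIFT/DROP assignment described above.
THOUGHT_TO_COMMAND = {
    "PUSH": "UP",      # think "push" -> arm moves upward
    "PULL": "DOWN",    # think "pull" -> arm moves downward
    "LIFT": "PICK",    # think "lift" -> gripper picks
    "DROP": "DROP",    # think "drop" -> gripper drops
}

def send_rf(command: str) -> None:
    """Hypothetical stand-in for the RF transmitter; here it only logs."""
    print(f"RF -> {command}")

def on_thought_detected(thought: str) -> None:
    command = THOUGHT_TO_COMMAND.get(thought)
    if command is not None:          # ignore neutral or untrained states
        send_rf(command)

on_thought_detected("PUSH")          # RF -> UP
```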

5.5 Wireless Transmission

RF stands for radio frequency. The RF module consists of two parts: a transmitter (Tx) and a receiver (Rx) [4]. It is available at different operating frequencies with different operating ranges. An encoder circuit and a decoder circuit are used along with the transmitter and receiver respectively in order to transmit and receive the message/signal [1].

This module handles the communication between the control platform and the robotic arm via RF signals, carrying the commands derived from the user. One such RF module is required in this project.
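As an illustration only of how the four commands could be framed for the Tx/Rx pair (the port name, baud rate, byte codes and use of pyserial are assumptions, not the paper's design):

```python
import serial  # pyserial, assuming the encoder is driven through a serial port

COMMAND_CODES = {"UP": 0x01, "DOWN": 0x02, "LEFT": 0x03, "RIGHT": 0x04}

def transmit(command: str, port: str = "/dev/ttyUSB0", baud: int = 9600) -> None:
    """Send a one-byte command code to the RF transmitter's encoder circuit."""
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(bytes([COMMAND_CODES[command]]))

# transmit("UP")   # uncomment with a transmitter attached to /dev/ttyUSB0
```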

6. RESULTS

We experimented on ten users with different physical attributes to evaluate the efficiency and user friendliness of the Emotiv EPOC device. The challenging part was training the disabled users, who later could smoothly control the robot. Each user provided their age, height, weight, physical ability and gender on a datasheet before the experiment began. The author then instructed each user on training the neutral state and the specific thoughts. The users were asked to move the robot in four directions: forward, backward, left and right. For moving the robot forward, backward, left and right we used the PUSH, DROP, LEFT and RIGHT thoughts respectively.

Figure.7 Command Signal for Push

When the subject thinks of a motion such as push, the animation box moves in the forward direction, as shown in Figure.7, and the prototype robot imitates the same command. Simultaneously the output is plotted on a graph in the LabVIEW software.

Figure.8 Command Signal for Drop

When the subject thinks of a motion such as drop, the animation box moves in the upward direction, as shown in Figure.8, and the prototype robot imitates the same command. Simultaneously the output is plotted on a graph in the LabVIEW software.

Figure.9 Analysis of four different thoughts used to control the robotic arm

In Figure.9 the X axis represents time and the Y axis represents the amplitude of the signals; the four different thoughts (PUSH, PULL, DROP and LIFT) control the robotic arm at particular times.

FUTURE SCOPE

Future work will focus on combining the Affective and Cognitive suites to control the robotic arm, implemented with the Emotiv EPOC sensor through LabVIEW.

    7. CONCLUSION

The electroencephalogram (EEG) signals are acquired from surface electrodes placed on the human scalp. The acquired signals are fed to a laptop and processed in LabVIEW, where they are filtered to remove ambient noise. The filtered signal is then separated to identify motor-imagery functions, based on which the prototype robot is activated.

ACKNOWLEDGMENTS

I gratefully acknowledge the Almighty GOD who gave us the strength and health to successfully complete this venture. The authors wish to thank Mr. P. KINGSTON STANLEY, Assistant Professor, for his exhilarating supervision, timely suggestions and guidance during all phases of this work. I must also acknowledge Mr. DEVAPRAKASHAM and Mr. JOSEPH CHINNADURAI for their help and insightful suggestions.

I would like to convey my gratitude to my parents, whose prayers and blessings were always with me. Last but not least, I would like to thank my friends and all others who directly or indirectly helped me in the successful completion of this work.

REFERENCES

[1] Luzheng Bi, Xin-An Fan, and Yili Liu, "EEG-Based Brain-Controlled Mobile Robots: A Survey", IEEE Transactions on Human-Machine Systems, Vol. 43, No. 2, March 2013, pp. 161-176.

[2] Vaibhav Gandhi, Girijesh Prasad, Damien Coyle, Laxmidhar Behera, and Thomas Martin McGinnity, "EEG-Based Mobile Robot Control Through an Adaptive Brain-Robot Interface", IEEE Transactions on Systems, Man, and Cybernetics: Systems, Vol. 44, No. 9, September 2014, pp. 1278-1285.

[3] Yongwook Chae, Jaeseung Jeong, and Sungho Jo, "Toward Brain-Actuated Humanoid Robots: Asynchronous Direct Control Using an EEG-Based BCI", IEEE Transactions on Robotics, Vol. 28, No. 5, October 2012, pp. 1131-1144.

[4] Brice Rebsamen, Chee Leong Teo, Qiang Zeng, and Marcelo H. Ang Jr., "Controlling a Wheelchair Indoors Using Thought", IEEE Intelligent Systems, Vol. 100, No. 2, April 2007, pp. 18-24.

[5] Kazuo Tanaka, Kazuyuki Matsunaga, and Hua O. Wang, "Electroencephalogram-Based Control of an Electric Wheelchair", IEEE Transactions on Robotics, Vol. 21, No. 4, August 2005, pp. 762-766.

[6] Gerwin Schalk, Dennis J. McFarland, Thilo Hinterberger, Niels Birbaumer, and Jonathan R. Wolpaw, "BCI2000: A General-Purpose Brain-Computer Interface System", IEEE Transactions on Biomedical Engineering, Vol. 51, No. 6, 2004, pp. 149-152.

[7] Pritom Chowdhury, S. S. Kibria Shakim, Md. Risul Karim, and Md. Khalilur Rhaman, "Cognitive Efficiency in Robot Control by Emotiv EPOC", 3rd International Conference on Informatics, Electronics & Vision, 2014, 978-4799-5180.

[8] Ericka Janet Rechy-Ramirez, "A Flexible Bio-Signal Based HMI for Hands-Free Control of an Electric Powered Wheelchair", International Journal of Artificial Life Research, 2014, pp. 59-76.

[9] Karolina Holewa and Agata Nawrocka, "Emotiv EPOC Neuroheadset in Brain-Computer Interface", 15th International Carpathian Control Conference (ICCC), 2014.

[10] Aleksandra Kawala-Janik, Michal Podpora, Mariusz Pelc, Pawel Piatek, and Jerzy Baranowski, "Implementation of an Inexpensive EEG Headset for the Pattern Recognition Purpose", 7th IEEE International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications, 2013, pp. 12-14.

[11] Rajesh Kannan Megalingam, Athul Asokan Thulasi, Rithun Raj Krishna, Manoj Katta Venkata, Ajithesh Gupta B V, and Tatikonda Uday Dutt, "Thought Controlled Wheelchair Using EEG Acquisition Device", 3rd International Conference on Advancements in Electronics and Power Engineering (ICAEPE'2013), 2013, pp. 207-211.

[12] Fatma Ben Taher, Nader Ben Amor, and Mohamed Jallouli, "EEG Control of an Electric Wheelchair for Disabled Persons", IEEE International Conference on Individual and Collective Behaviours in Robotics, 2012, pp. 27-32.

[13] Heng-Tze Cheng, Zheng Sun, and Pei Zhang, "Real-Time Imitative Robotic Arm Control for Home Robot Applications", Carnegie Mellon University, IEEE, 2011.

[14] Christian Hernández, Raciel Poot, Lizzie Narváez, Erika Llanes, and Victor Chi, "Design and Implementation of a System for Wireless Control of a Robot", International Journal of Computer Science, Vol. 7, No. 5, 2010, pp. 191-197.
