Brainwave Controlled Automated Wheelchair


Madhura B.N
Department of Electronics and Communication Engg.
PES College of Engineering, Mandya, India

Nisha U.N
Department of Electronics and Communication Engg.
PES College of Engineering, Mandya, India

N Chandana
Department of Electronics and Communication Engg.
PES College of Engineering, Mandya, India

Navyashree M
Department of Electronics and Communication Engg.
PES College of Engineering, Mandya, India

Dr. Punith Kumar M B
Associate Professor
Department of Electronics and Communication Engg.
PES College of Engineering, Mandya, India

Abstract: Many people across the globe are born with physical disabilities. Some of them cannot move without assistance, and finding someone who can assist them at all times is very difficult. Wheelchairs based on several technologies, such as the joystick, the electromyography arm, and the voice-controlled wheelchair, have been developed in the past to address this problem, but completely paralyzed patients find these technologies difficult to use. In this paper, to overcome the limitations of the existing technologies, we use electroencephalogram (EEG) signals to operate the wheelchair. This technology is called the brain computer interface (BCI), in which the human brain interacts with a computer to perform a particular task. Here eye blinking is used to control the robot, since a blink generates a significant pulse in the EEG signal. The first objective of this project is to provide mobility for completely paralyzed patients. The second objective is to use the EEG signal to control the wheelchair. The robot is therefore programmed to take the EEG signals generated by eye blinks as commands within a short period of time. In addition, ultrasonic sensors are used to detect obstacles and stop the robot, thereby ensuring the safety of the user.

Keywords: Arduino, Electroencephalogram, Brain Computer Interface.


    INTRODUCTION

    All over the globe, disabled and paralyzed people face many difficulties. According to reports from the World Health Organization (WHO), about 15% of the world's population lives with some form of disability, and 2-4% experience significant difficulties in functioning. These statistics show that the number of people who have lost their mobility is significant. In the past, disabled people relied on helpers to control their wheelchair and to perform their daily activities, but in recent years it has become difficult to find such helpers. Many technologies have emerged to help these people, including the joystick-controlled wheelchair, the electromyography arm, and the voice-controlled wheelchair. However, these wheelchairs require body movement or spoken instructions from the patient: paralyzed patients cannot perform the required muscular activities, and people who cannot speak cannot use a voice-controlled wheelchair. The brain computer interface overcomes this problem: patients can use their brain signals to interact with a computer and thereby control their wheelchair on their own, without anyone's assistance.

    BCI methods are categorized into two types: invasive and non-invasive. In the invasive method, electrodes are implanted directly into the brain to measure brain signals, whereas in the non-invasive method the electrodes are placed on the scalp, through which the brain signals are collected. In this paper we use EEG, because EEG is a non-invasive method, and EEG signals are divided into different classes based on their frequency range and the activities performed by the brain.

    EEG not only records the brain signal but also measures the voltage fluctuations caused by the currents flowing within the neurons of the brain. Different thoughts and different muscular movements produce different brain signals. Hence the brain signal produced by an eye blink is collected by the sensor and sent to the computer, which generates the command on the basis of which the wheelchair performs the forward, backward, left, and right movements.
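    As a minimal sketch of this idea, the computer can translate the number of blinks detected within a short window into one of the four movement commands. The specific mapping below (one blink for forward, and so on) and the window length are our illustrative assumptions, not the exact scheme used in the prototype:

```python
# Illustrative sketch: map the number of eye blinks detected within a
# short time window to a wheelchair movement command. The mapping and
# the window length are assumptions chosen for illustration.

COMMANDS = {1: "FORWARD", 2: "BACKWARD", 3: "LEFT", 4: "RIGHT"}

def blinks_to_command(blink_times, window_s=2.0):
    """Count blinks inside one command window and map them.

    blink_times -- timestamps (seconds) of detected blinks
    window_s    -- length of the command window in seconds
    Returns the command string, or None if the count is not mapped.
    """
    if not blink_times:
        return None
    start = blink_times[0]
    count = sum(1 for t in blink_times if t - start <= window_s)
    return COMMANDS.get(count)

# Example: three blinks within two seconds select the LEFT command.
print(blinks_to_command([0.0, 0.6, 1.3]))  # LEFT
```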


    PROBLEM STATEMENT

    Paralysis is the loss of one or more muscle functions in a part of the body. A paralyzed patient has no sensation in the affected area, which causes mobility problems. Causes of paralysis include stroke, poliomyelitis, amyotrophic lateral sclerosis (ALS), and damage to the nervous system.

    Paralyzed patients and fully disabled people need someone's help to control a wheelchair. Technologies such as the voice-controlled wheelchair, the power wheelchair, and the joystick-controlled wheelchair are used by paralyzed patients, but they are not convenient, and they cannot be used by fully paralyzed patients.

    To restore mobility to fully paralyzed patients, who have lost the use of their hands and legs and are unable to move, the brainwave controlled automated wheelchair is the most suitable technology.


    LITERATURE SURVEY

    From paper [2] we learnt about the electroencephalogram (EEG), a measurement process that detects the electrical activity of the brain caused by neural synaptic excitation. EEG signals are recorded from electrodes placed on the scalp. The raw signals are of very poor quality because of unavoidable noise produced either inside or outside the scalp and because of skull thickness, so the signals are amplified.

    From paper [3] we learnt about the BCI system and its two categories, the invasive method and the non-invasive method. The invasive method involves direct implantation of electrodes into the brain to measure brain signals, while in the non-invasive method the electrodes remain outside the brain.

    From paper [4] we learnt about the classification of EEG into different frequency bands based on the power spectrum of the brain signals: delta, theta, alpha, beta, and gamma. The occurrence of these waves depends on the different activities performed by the brain.



    Fig 1: Brainwave sensor

    The brainwave sensor contains a TGAM (ThinkGear ASIC Module) chip, which processes the EEG signal and outputs it in digitized form through the built-in Bluetooth module. The Brain Sense sensor collects the raw brainwaves and measures EEG power spectra such as alpha, beta, and gamma for processing and output. The sensor also outputs the values of the proprietary eSense meters for attention, meditation, and eye blink, and it can be used to assess the quality of the EEG signal.
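    The TGAM streams its data in NeuroSky's ThinkGear packet format: two 0xAA sync bytes, a payload length, the payload, and a one-byte checksum that is the bitwise inverse of the low byte of the payload sum. A minimal sketch of extracting the eSense and blink values from one payload might look like the following; the row codes follow NeuroSky's published protocol, but the error handling is deliberately simplified:

```python
# Minimal sketch of decoding one ThinkGear payload (NeuroSky TGAM).
# Single-byte rows handled here: 0x02 poor-signal, 0x04 attention,
# 0x05 meditation, 0x16 blink strength. Extended rows (code >= 0x80)
# carry their own length byte and are skipped in this sketch.

ROW_NAMES = {0x02: "poor_signal", 0x04: "attention",
             0x05: "meditation", 0x16: "blink"}

def checksum_ok(payload, checksum):
    # Checksum is the inverted low byte of the payload byte sum.
    return ((~sum(payload)) & 0xFF) == checksum

def parse_payload(payload):
    """Return a dict of the recognized single-byte values."""
    values, i = {}, 0
    while i < len(payload):
        code = payload[i]
        if code >= 0x80:                 # extended row: length-prefixed
            length = payload[i + 1]
            i += 2 + length              # skip raw/EEG-power data
        else:                            # single-byte value row
            if code in ROW_NAMES:
                values[ROW_NAMES[code]] = payload[i + 1]
            i += 2
    return values

payload = bytes([0x02, 0x00, 0x04, 0x4B, 0x05, 0x32, 0x16, 0x64])
assert checksum_ok(payload, (~sum(payload)) & 0xFF)
print(parse_payload(payload))
# {'poor_signal': 0, 'attention': 75, 'meditation': 50, 'blink': 100}
```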


    In our project we used MATLAB (2013). It performs the noise filtering and also checks whether the command sent by the user is a robot-controlling command or just a natural blink. If it is a robot-controlling command, MATLAB sends it to the Arduino; otherwise the blink is ignored.
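    This discrimination step can be sketched as a simple threshold-plus-timing rule. The threshold value and the double-blink window below are illustrative assumptions on our part; the actual MATLAB logic in the project may differ:

```python
# Illustrative sketch: decide whether blink activity is a deliberate
# control command or just natural blinking. A deliberate command is
# assumed to be a strong blink (blink strength above a threshold)
# followed by a second strong blink within a short window. Both
# numbers are assumptions chosen for illustration.

STRENGTH_THRESHOLD = 60     # natural blinks are usually weaker
DOUBLE_BLINK_WINDOW = 1.0   # max seconds between the two strong blinks

def is_control_command(blinks):
    """blinks: list of (timestamp_s, strength) pairs."""
    strong = [t for t, s in blinks if s >= STRENGTH_THRESHOLD]
    return any(b - a <= DOUBLE_BLINK_WINDOW
               for a, b in zip(strong, strong[1:]))

# Two strong blinks 0.5 s apart -> treated as a command.
print(is_control_command([(0.0, 80), (0.5, 85)]))   # True
# One weak natural blink -> ignored.
print(is_control_command([(0.0, 30)]))              # False
```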


    WORKING

    The block diagram shown in Fig. 2 explains the working of our project. The brainwave controlled robot we have implemented works on the principle of capturing the EEG signals generated by eye blinks and using them to move the robot. We used a Brain Sense sensor headset (EEG reader), a computer running MATLAB, and a robot consisting of an Arduino UNO microcontroller, an ultrasonic sensor (HC-SR04), an LCD display, a Bluetooth module (HC-05), and a relay module.

    Fig 2: Block Diagram

    First, the human brain signals are captured by the brainwave sensor through the TGAM1 chip, which also filters and amplifies the signal. The EEG signal is then sent to the Level Analyzer Unit (LAU) over the Bluetooth transmission module. The LAU is a computer running MATLAB, which analyzes the received EEG signals. MATLAB checks the command produced by the eye blink: it determines whether the blink is a robot-movement command or a natural blink, and, if it is a movement command, which movement it represents. The Arduino robot consists of an ATmega controller, two DC motors, a Bluetooth HC-05 module, an ultrasonic range detector (HC-SR04), and one LCD. When the Bluetooth HC-05 module receives a controlling command from the computer, the Arduino board considers both the command received from the LAU and the data received from the ultrasonic sensor in deciding the movement of the robot.

    While the robot is moving, the ultrasonic sensor detects obstacles and stops the robot immediately, so that collisions are avoided automatically. When a command is received by the robot, the robot executes that command. The LCD displays the status of the controlling command.
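    The HC-SR04 reports range by the round-trip time of an ultrasonic pulse: distance = echo time x speed of sound / 2. A small sketch of the conversion and the stop decision follows; the 30 cm safety threshold is our illustrative assumption, not a value stated in the project:

```python
# Sketch: convert an HC-SR04 echo time to a distance and decide
# whether the robot must stop. distance = echo_time * speed / 2,
# because the pulse travels to the obstacle and back. The 30 cm
# stop threshold is an assumption chosen for illustration.

SPEED_OF_SOUND_CM_PER_US = 0.0343   # ~343 m/s at room temperature
STOP_DISTANCE_CM = 30.0

def echo_to_distance_cm(echo_time_us):
    return echo_time_us * SPEED_OF_SOUND_CM_PER_US / 2.0

def must_stop(echo_time_us):
    return echo_to_distance_cm(echo_time_us) <= STOP_DISTANCE_CM

# A 1000 us echo is about 17 cm -> the robot stops.
print(echo_to_distance_cm(1000), must_stop(1000))
# A 5000 us echo is about 86 cm -> the path is clear.
print(echo_to_distance_cm(5000), must_stop(5000))
```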

    The flow chart shown in Fig. 3 gives an overview of how our project works.

    Fig 3: Design Flow

    Fig 4: Final Prototype

    Every project has its own limitations, and ours is no exception in some scenarios. The user cannot move close to an obstacle, because the ultrasonic sensor in the robot stops the robot whenever it approaches one. This limitation can, however, be relaxed by not using the autonomous obstacle-avoidance function while the robot is turning left or right. With this modified method, disabled users can get as close to an obstacle as they wish.
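    The modified behaviour described above can be sketched as a gate that applies the obstacle stop only to the forward command, leaving turns unaffected. This is a sketch under our own assumptions, not the project's exact firmware logic:

```python
# Sketch of the modified obstacle policy: only the FORWARD command is
# blocked when an obstacle is closer than the safety distance; LEFT
# and RIGHT turns always pass through, so the user can approach an
# obstacle by turning. Command names and the 30 cm threshold are
# illustrative assumptions.

STOP_DISTANCE_CM = 30.0

def resolve_command(command, obstacle_distance_cm):
    """Return the command to execute, or 'STOP' if it is unsafe."""
    if command == "FORWARD" and obstacle_distance_cm <= STOP_DISTANCE_CM:
        return "STOP"          # autonomous obstacle avoidance
    return command             # turns (and backward) are unaffected

print(resolve_command("FORWARD", 20.0))  # STOP
print(resolve_command("LEFT", 20.0))     # LEFT
print(resolve_command("FORWARD", 90.0))  # FORWARD
```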

    Fig. 5 below shows the forward movement of the robot and the corresponding waveform.

    Fig 5: Waveform for forward command


    CONCLUSION

    The main objective of our project is to restore mobility to people with physical disorders and to paralyzed patients who cannot move on their own without someone's assistance. Ongoing research and development on brain controlled robots has received great attention all over the globe, because such robots help physically disabled people move independently. The use of BCI (brain computer interface) technology gives physically challenged people a measure of comfort, and our project is one such application. We used a Brain Sense EEG sensor to capture the EEG signal from the brain, an Arduino Uno board to control the movement of the wheelchair, MATLAB to read and analyze the brainwave data and to generate and send the commands to the Arduino board, and an ultrasonic sensor for obstacle avoidance, thereby ensuring the safety of the user.

    Our project is also very cost-effective: the controlling commands are given in real time and consist only of eye blinks, which the user can produce easily.


FUTURE SCOPE

Our project can be taken further into real-life use by paralyzed patients by training them in the instructions to be given.

The system can also be extended to a wheelchair with a robotic arm, and other BCI technologies can be added. It can be reduced to a smaller unit by replacing the laptop with a single-board computer such as the Raspberry Pi.


REFERENCES

  1. Luzheng Bi, Xin-An Fan, Yili Liu, "EEG-Based Brain-Controlled Mobile Robots: A Survey," IEEE Transactions on Human-Machine Systems, vol. 43, no. 2, March 2013.

  2. Kazuo Tanaka, Kazuyuki Matsunaga, and Hua O. Wang, "Electroencephalogram-Based Control of an Electric Wheelchair," IEEE Transactions on Robotics, vol. 21, no. 4, August 2005.

  3. Priyanka D. Girase, M. P. Deshmukh, "A Review of Brain Computer Interface," International Conference on Global Trends in Engineering, Technology and Management (ICGTETM-2016).

  4. Busra Ulker, Mehmet Baris Tabakcioglu, Huseyin Cizmeci, Doruk Ayberkin, "Relations of attention and meditation level with learning in engineering education," ECAI 2017 International Conference, 9th Edition, IEEE, pp. 1-4.

  5. Puviarasi R, Ramalingam M, Chinnavan E, "Self assistive technology for disabled people: voice controlled wheelchair and home automation system," IAES International Journal of Robotics and Automation (IJRA), 2014; 3(1).

  6. Lawpradit, W. and Yooyativong, T. (2018), "The EEG brain signal representation for surfaces and shapes touching behavior with an inexpensive device," 2018 International ECTI Northern Section Conference on Electrical, Electronics, Computer and Telecommunications Engineering (ECTI-NCON), IEEE, pp. 135-140.

  7. Bi, L., Fan, X.-A., and Liu, Y. (2013), "EEG-based brain-controlled mobile robots: a survey," IEEE Transactions on Human-Machine Systems, 43(2), 161-176.

  8. Dev, A., Rahman, M. A., and Mamun, N. (2018), "Design of an EEG-based brain-controlled wheelchair for quadriplegic patients," 2018 3rd International Conference for Convergence in Technology (I2CT), IEEE, pp. 1-5.

  9. Susan K., "Mind over Matter: a $199 Headset Controls Objects via Brain Waves (NeuroSky MindSet)," IEEE Spectrum, 2010; 47(5): 23.

  10. Kanaga, E. G. M., Kumaran, R. M., Hema, M., Manohari, R. G., and Thomas, T. A. (2017), "An experimental investigation on classifiers for brain computer interface (BCI) based authentication," 2017 International Conference on Trends in Electronics and Informatics (ICEI), IEEE.

  11. Siswoyo A, Arief Z, Sulistijono IA, "Application of Artificial Neural Networks in Modeling Direction Wheelchairs Using Neurosky Mindset Mobile (EEG) Device," EMITTER International Journal of Engineering Technology, 2017; 5(1).

  12. B. Rebsamen, C. Guan, H. Zhang, C. Wang, C. Teo, M. H. Ang, Jr., and E. Burdet, "A brain controlled wheelchair to navigate in familiar environments," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 18, no. 6, pp. 590-598, Dec. 2010.
