Design and Implementation of Rover Controller and Music Player Control using Sixth Sense Technology

DOI: 10.17577/IJERTV4IS050315




Mutthuraju K S

IV Semester M.Tech, Dept. of ECE, BNM Institute of Technology, Bengaluru, India

Dr. P A Vijaya

Professor, Dept. of ECE, BNM Institute of Technology, Bengaluru, India

Abstract: In this paper, the design and development of input devices based on sixth sense technology using dynamic air gestures are presented. Sixth sense technology connects the physical world to the digital world and can be used to control a number of applications. This paper eliminates the use of a keyboard as the input device and proposes the implementation of a Rover controller and of music player control using dynamic air gestures. The entire design has been coded in MATLAB and Embedded C, and the Rover controller has been verified through the Proteus simulator.

Keywords: Sixth sense technology, MATLAB, Embedded C, Proteus simulator.

  1. INTRODUCTION

    Here, sixth sense technology is realized through dynamic air gestures. Gestures are of two types, static and dynamic; the proposed method uses dynamic air gestures recognized through binary large object (BLOB) analysis and the connected-component property. The main objective of this paper is to implement human-machine communication for digital and computing systems without involving input hardware. Image processing is the basis for the implementation of this technology. Its applications include virtual painting, a virtual keyboard, a virtual keypad for calling, a music player and a Rover controller. Here we concentrate on two major applications: the music player and the Rover controller.
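
    As an illustration of the BLOB analysis and connected-component step mentioned above, a minimal MATLAB sketch is given below. It is our own illustrative reconstruction (the Image Processing Toolbox is assumed, and the area threshold is an arbitrary choice), not code published with this paper; it finds the largest connected region in an already-segmented binary mask and returns its centroid.

% Minimal BLOB / connected-component sketch (Image Processing Toolbox assumed).
% 'mask' is a logical image in which the tracked color has already been segmented.
function centroid = largestBlobCentroid(mask)
    mask  = bwareaopen(mask, 300);                  % remove small noise blobs (threshold is an assumption)
    stats = regionprops(mask, 'Area', 'Centroid');  % properties of every connected component
    if isempty(stats)
        centroid = [];                              % no blob found in this frame
        return;
    end
    [~, idx] = max([stats.Area]);                   % keep only the largest blob (the marker)
    centroid = stats(idx).Centroid;                 % [x y] position in pixel coordinates
end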

  2. LITERATURE SURVEY

    In [1], the authors explain that sixth sense is a wearable gesture interface that augments the physical world around us with the digital world. They describe the wearable computer as a sixth sense device consisting of components such as a phone, a projector, markers, a camera and a mirror, used to control a number of applications. The authors implemented sixth sense technology as a neck-worn projector with a webcam. Before sixth sense technology there was no direct connection between the human and the digital world; sixth sense technology now bridges the gap between the physical and digital worlds. Using static and dynamic air gestures, many applications can be controlled, such as viewing a map, taking a picture, drawing, making calls, flight applications, checking mail and checking the time.

    In [2], the question of what sixth sense technology is, is answered: it is a wearable gestural interface that gives us more than what we obtain through the physical senses. The prototype of sixth sense was developed using a mirror and a camera, which makes the system user friendly. A webcam captures the colored markers tied to the user's hand and recognizes their colors to select the appropriate keys of a virtual keyboard. Information can be collected at any place using the webcam. This technology has a number of applications, such as newspaper reading, Google mapping, checking the time, and a virtual cellphone for calling and messaging. Its advantages are that the system can be carried anywhere easily and at low cost.

    In [3], the authors propose how to eliminate an input hardware device such as the mouse using sixth sense technology. Using colored markers and motion sensing, they reproduce the working of a mouse. A camera captures the motion of the colored objects, and the RGB color matrix is detected to implement hand tracking in real time. Further enhancements of this technique, such as zoom-in and zoom-out functions, are planned.

    In [4], the authors focus mainly on sixth sense technology for security purposes. To develop the sixth sense prototype they used a projector and a camera. Whenever attacks are found on social media or the internet, the system detects them. This work is useful in military applications and also helps verify the originality of products. They also added a few more applications, namely making a call, flight updates, motion capture, the map, the clock, book evidence and drawing applications.

    In [5], the authors explain the concept of sixth sense using a projector attached to a webcam or camera, which acts as the PC and connects to many devices, with the information stored on social media. The webcam instantly identifies objects around the user, while the projector displays the information on any surface, including the device itself or the hand. The software of this sixth sense system was developed using the C++ and C# programming languages, and the hardware components used are a camera, a microphone, a projector, a mirror, color markers and a mobile phone. The applications of this work include zooming features, multimedia applications, taking a picture, presenting technology information to people, e-mail information, product information and book information. An advantage of this technique is that it is open source; the disadvantage is that it requires a high-resolution camera.

    In [6], sixth sense technology is implemented for the education field. It replaces the old technique of getting information from a notice board with getting it digitally. Anybody can study the books available in the library using this technology, and the genuineness of books or products (for example, the ISI mark) can be checked. The authors implemented sixth sense using radio-frequency identification and computer vision. The advantages of this technology include a video newspaper, checking the quality of information and making a call.

    In [7], the authors explain the concept of air gestures for mobile devices. The aim is to combine touch input and in-air gestures to control smartphones and make the device more comfortable to use. This concept avoids difficult mode switching by adopting voice commands or air gestures. A software pipeline is applied so that the system works efficiently. Another method is to control the mobile using hand movements based on image processing, where pre-processing and segmentation are carried out to extract the edges of the hand. The main controlled applications are navigation, menu switching, memory alignment management, etc. The results were verified by different users under different lighting conditions.

    In [8], the air gesture method is already implemented; the authors look at how to place a proper cursor on the screen to improve the performance of sixth sense technology. The steps involved are to acquire video from the camera, convert it into images and then process each image to extract the color. Color extraction requires morphological processing to eliminate noise, color thresholding, large object detection, size detection and centroid calculation. The applications are zooming images, closing a window and minimizing a window. The advantages are speed, accuracy and good device usage.

  3. DESIGN OF ROVER CONTROLLER

    Sixth sense technology is a technique used to establish communication between the physical world and the digital world. It makes use of dynamic air gestures to control applications such as the Rover controller, music player control, a virtual mobile phone, etc. One of the major applications of sixth sense technology considered here is to control the Rover using hand movements, as shown in figure 1.

    Fig 1: Block diagram for Rover controller (input color device and camera on the PC side; Bluetooth link to the microcontroller, DC motor driver and DC motors on the Rover side)

    The above block diagram consists of the input device, camera, PC and the robot (Rover controller).

    The input device is a colored marker; a red marker is usually preferred, but either of the remaining two RGB colors (green or blue) can also be used as the input device.

    The PC is the heart of the Rover controller design and is used to process the user inputs. A real-time camera acts as the eye that connects to the digital world; it captures the input color component, and morphological processing is then applied to eliminate unwanted signal. The Rover controller (robot) is built from a PIC microcontroller, a Bluetooth module, DC motors and a DC motor driver, as shown in figure 1.
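
    A minimal MATLAB sketch of this capture-and-cleanup stage is given below. It is an illustrative reconstruction only, assuming the Image Acquisition and Image Processing Toolboxes and a 'winvideo' webcam adaptor; the adaptor name, video format and threshold are assumptions and not taken from the paper.

% Illustrative capture and red-color extraction (Image Acquisition / Image Processing Toolboxes assumed).
vid = videoinput('winvideo', 1, 'YUY2_640x480');    % adaptor and format are assumptions; adjust to the installed camera
set(vid, 'ReturnedColorSpace', 'rgb');              % return RGB frames regardless of the native format
frame = getsnapshot(vid);                           % grab one frame

red  = imsubtract(frame(:,:,1), rgb2gray(frame));   % emphasize the red marker against the background
red  = medfilt2(red, [3 3]);                        % median filter to suppress speckle noise
mask = im2bw(red, 0.18);                            % threshold to a binary mask (level is an assumption)
mask = bwareaopen(mask, 300);                       % morphological cleanup: remove small blobs

    The cleaned mask is then passed to the blob analysis and centroid step described in the design steps below.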

    A PIC microcontroller (PIC18F452) is interfaced with the DC motors and a Bluetooth module (HC-05), and it manages the entire Rover controller. Each DC motor needs a driver; here the L293D is used as the DC motor driver, and it can drive two DC motors simultaneously. The DC motors run the Rover in a specified direction: forward, backward, left or right. The Bluetooth module establishes a wireless link between the Rover controller and the PC.

    1. Design steps

      Steps to design the Rover controller are as follows:

      • Use one of the basic colors (red is preferred) as the input device.

      • The color component is a two-dimensional signal and is captured using a real-time camera. To obtain a good result, use a camera with good resolution.

      • The captured signal (image) is sent to the PC for processing. The processing includes color extraction and noise removal (morphological processing).

      • Find the centroid of the input colored device; depending upon the centroid value, communication between the human and the system is established. The input colored device has to be within a certain distance of the camera, otherwise the centroid of the input device cannot be found.

      • After the centroid value is obtained, a point is created. Using this point, the direction of the Rover controller is determined (a MATLAB sketch of this step is given after these steps).

      • A Bluetooth device attached to the PC sends the direction signals to the Rover controller, where they are received by another Bluetooth module (HC-05) interfaced to the Rover controller.

      • Based on the signal received by the Rover, the PIC microcontroller drives the DC motors through the DC motor driver to run the Rover controller in the specified direction.
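
      A minimal MATLAB sketch of the direction decision and Bluetooth transmission steps is given below. It is our own hedged reconstruction, not the authors' code: the frame is split into border regions around the image center, the single-character command codes ('F', 'B', 'L', 'R', 'S') and the COM port name are conventions we assume, and the HC-05 is assumed to be paired with the PC and exposed as a serial port.

% Illustrative direction decision from the marker centroid, sent over a Bluetooth serial link.
% Command codes and the serial port name are assumptions, not taken from the paper.
function sendRoverCommand(centroid, frameSize, port)
    w = frameSize(2);  h = frameSize(1);            % frame width and height in pixels
    x = centroid(1);   y = centroid(2);

    if y < h/4
        cmd = 'F';                                  % marker near the top of the frame -> forward
    elseif y > 3*h/4
        cmd = 'B';                                  % marker near the bottom -> backward
    elseif x < w/4
        cmd = 'L';                                  % marker near the left edge -> left
    elseif x > 3*w/4
        cmd = 'R';                                  % marker near the right edge -> right
    else
        cmd = 'S';                                  % center region -> stop
    end

    bt = serial(port, 'BaudRate', 9600);            % HC-05 modules commonly default to 9600 baud
    fopen(bt);
    fwrite(bt, uint8(cmd));                         % send the single-character direction code
    fclose(bt);
    delete(bt);
end

      In practice the serial port would be opened once and kept open across frames; it is reopened for every command here only to keep the sketch self-contained.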

    2. Flowchart for Rover controller

    The flowchart for the Rover controller is shown in figure 2.

    Fig 2: Flowchart for Rover controller (Start → initialize the camera → select the input colored device → check whether the centroid is proper; if not, repeat the selection → create the point → determine the direction → send the direction signal to the Rover controller → perform the direction → Exit)

  4. DESIGN OF MUSIC PLAYER CONTROL

    This paper adds another concept, music player control, which is controlled using dynamic air gestures. The block diagram for music player control is shown in figure 3.

    Fig 3: Block diagram for music player control (input colored device → camera → PC → music player)

    The music player control application implements sixth sense technology using the input device, a camera and the PC. Its working procedure is almost the same as that of the Rover controller, but it involves no external hardware.

    In figure 3, the input device is a colored marker; a red marker is usually preferred, but either of the remaining two RGB colors (green or blue) can also be used. The input device sends a signal to the PC or laptop, which processes the user inputs. The camera acts as the eye that connects to the digital world; it captures the input color component, and morphological processing is applied to eliminate unwanted signal. The music player buttons are then controlled using dynamic air gestures.
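
    One simple way to map a gesture to a button, sketched below in MATLAB, is to divide the camera frame into four sectors and associate each sector with one button. The four-quadrant layout is an assumption consistent with the "find sector in which the centroid is located" step of the flowchart in figure 4, not the authors' published code.

% Illustrative mapping from the marker centroid to a music player button.
% The quadrant-to-button assignment is an assumption.
function button = centroidToButton(centroid, frameSize)
    w = frameSize(2);  h = frameSize(1);            % frame width and height in pixels
    x = centroid(1);   y = centroid(2);

    if x < w/2 && y < h/2
        button = 'previous';                        % top-left quadrant
    elseif x >= w/2 && y < h/2
        button = 'next';                            % top-right quadrant
    elseif x < w/2 && y >= h/2
        button = 'pause';                           % bottom-left quadrant
    else
        button = 'play';                            % bottom-right quadrant
    end
end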

    1. Design steps for music player control

      Steps to design the music player control are as follows:

      • Use one of the basic colors (red is preferred) as the input device.

      • The color component is a two-dimensional signal and is captured using a real-time camera. To obtain a good result, use a camera with good resolution.

      • The captured signal (image) is sent to the PC for processing. The processing includes color extraction and noise removal (morphological processing).

      • Find the centroid of the input colored device; depending upon the centroid calculation, communication between the human and the system is established. The input colored device has to be within a certain distance of the camera, otherwise the centroid of the input device cannot be found.

      • After the centroid value is obtained, a point is created. Using this point, the music player buttons such as Next, Previous, Pause and Play can be controlled (a MATLAB sketch of this step follows these steps).
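
      The button actuation itself can be done by injecting hotkey presses into the active music player window. The MATLAB sketch below uses java.awt.Robot, which is available from MATLAB's Java virtual machine; the hotkey mapping shown (space for play/pause, the arrow keys for next/previous) and the centroidToButton helper from the earlier sketch are assumptions and must be adapted to the hotkeys of the actual player.

% Illustrative key injection for music player control via java.awt.Robot.
% The hotkey mapping is an assumption and must match the player being controlled.
import java.awt.Robot;
import java.awt.event.KeyEvent;

robot  = Robot();
button = centroidToButton([520 90], [480 640]);     % example centroid from the tracking stage

switch button
    case {'play', 'pause'}
        key = KeyEvent.VK_SPACE;                    % many players toggle play/pause with the space bar
    case 'next'
        key = KeyEvent.VK_RIGHT;
    case 'previous'
        key = KeyEvent.VK_LEFT;
end

robot.keyPress(key);                                % press and release the chosen key
robot.keyRelease(key);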


    2. Flowchart for Music player control

    The flowchart for music player control is as shown in figure 4.

    Fig 4: Flowchart for music player control (Start → initialize the webcam → select the input colored device → check whether the centroid is proper; if not, repeat the selection → create the point → find the sector in which the centroid is located → control the buttons → Exit)


  5. RESULTS AND ANALYSIS

    This section presents the results of the proposed work through the different simulated results obtained in MATLAB.

    1. Rover controller

      The entire design has been coded in both MATLAB and Embedded C.

      Figures 5 to 8 show snapshots of the simulated results for the different cases, namely the forward, backward, left and right directions, used to control the Rover controller.

      In figures 5 to 8, each snapshot contains a window with four images. The first shows the direction to be controlled as a black image, the second shows the red color device used as the input device, the third shows the difference image of the input device, and the fourth shows the direction in which the Rover controller is driven with the help of the PIC microcontroller.

      Fig 5: Forward direction

      Fig 6: Backward direction

      Fig 7: Left direction

      Fig 8: Right direction

    2. Music player

      The control operations for the music player are designed using MATLAB and the simulated results are shown in figures 9 to 12.

      Fig 9: Next button control of music player

      Fig 10: Pause button control

      In figures 9 to 12, each figure has two windows: the music player window and the window controlling the music player buttons using dynamic air gestures. Depending upon the hand movement, the music player buttons can be controlled without using the keyboard. The snapshots show the simulated results for the different cases, namely the Next, Pause, Play and Previous buttons of the music player.

      Fig 11: Play button control

      Fig 12: Previous button control

  6. CONCLUSION AND FUTURE SCOPE

Many obstacles were encountered and resolved through trial and error before the simulation results for the Rover controller and music player control using dynamic air gestures were obtained. From the results obtained it can be concluded that sixth sense technology helps to eliminate the use of input peripheral devices such as the keyboard while keeping continuous human-machine interaction.

Future scope: instead of developing the applications of sixth sense technology on a laptop, a new operating system for tablets could be created.


REFERENCES

    1. Gaurav Subhash Nikam, Rushikesh Prathaprao Bhoite, Nilesh Anil Jagtap, "The Sixth Sense Technology", ISSN: 2319-6319, Volume 3, Issue 1, March 2015.

    2. Amit Kumar Gupta, Mohmd. Shahid, "The Sixth Sense Technology", Proceedings of the 5th National Conference, March 10-11, 2011, New Delhi.

    3. Shany Jophin, Sheetal M. S., Priya Philip, T. M. Bharuguram, "Gesture Based Interface Using Motion and Image Comparison", IJAIT, Vol. 2, No. 3, June 2012.

    4. Aakanksha Chopra, Natasa Narang, "A Study on The Sixth Sense Technology and Its Various Security Threats", IJICT, ISSN 0974-2239, Vol. 4, Number 7 (2014), pp. 663-670.

    5. Mr. Ashish Parmeshwar, Miss Tanvi Praful Khandhedia, Prof. Umesh W. Kaware, "Sixth Sense Technology", IJREAT, Vol. 3, Issue 2, April-May 2015.

    6. Sindhuja Raghupatruni, Niharika Nasam, Keerthi Lingam, "Sixth Sense Enabled Campus - Possibilities and Challenges", IJCA, Vol. 75, No. 8, August 2013.

    7. Jie Song, Gabor Soros, Fabrizio Pece, "In-Air Gestures Around Unmodified Mobile Devices", UIST 2014, October 5-8, 2014, Copyright 2014 ACM 978-1.

    8. Uma Sahu, Ditty Vargese, Gayatri Gole, "Hand Cursor Implementation Using Image Processing & Sixth Sense Technology", IJIIT, Vol. 1, Issue 3, January 2012-2013.

    9. Piyush Kumar, Jyothi Verma, Shitala Prasad, "Hand Data Glove: A Wearable Real-Time Device for Computer Interaction", Vol. 43, June 2012.

    10. Jasleen Josan, "Sixth Sense Technology", ISSN: 2347-2812, Volume 1, Issue 1, October 2013.

    11. R. Bavithra, L. Ayeesha Begame, K. S. L. Deepika, "Mobile Projectors Using Sixth Sense Technology", ISBN: 978-93-84209-16-2, 10th May 2014.

    12. M. Stoerring, T. B. Moeslund, Y. Liu, E. Granum, "Computer Vision-Based Gesture Recognition for an Augmented Reality Interface", in 4th IASTED International Conference on Visualization, Imaging and Image Processing (2010).

    13. F. Wallhoff, M. Zobl, G. Rigoll, "Action Segmentation and Recognition", in Proceedings of the IEEE International Conference on Image Processing (2012), Vol. 77(2).

    14. E. Kaiser, A. Olwal, D. McGee, H. Benko, A. Corradini, X. Li, P. Cohen, S. Feiner, "Mutual Disambiguation of 3D Multimodal Interaction in Augmented and Virtual Reality", in Proceedings of the IEEE International Conference on Multimodal Interfaces (2008).

    15. Rudra Pratap, "Getting Started with MATLAB", Oxford University Press, New York, 2010.

    16. Oge Marques, "Practical Image and Video Processing Using MATLAB", Florida Atlantic University, a John Wiley and Sons, Inc. publication, 2011.
