Interactive Projector using Camera Tracked Laser Pointer

DOI : 10.17577/IJERTV3IS21410


Dhanashri Ghorpade1, Swati Devkar2, Viraj Londhe3

Department of Computer Engineering SKNCOE, University of Pune

Pune, India

Abstract – In this paper we introduce an interactive projector based on image processing and computer vision techniques. The interactive projector offers a new human-computer interactive large-screen display system. Its main aim is to provide a graphical user interface on the projection screen, so that the user can draw on the screen using a laser beam. The components of the system are a computer/laptop, a laser pen, a projector and a camera.

Keywords – Human-computer interaction, Camera, Projector, Laser Pen, Computer

  1. INTRODUCTION

    Projector screens are widely used in teaching environments and auditoriums, and also in conference halls for presentations. This paper provides a simple way to draw on the projected image, turning a flat wall into an interactive projector surface, so that the presenter can add more value to the presentation, improve the attendees' understanding and make the session more interactive. The main drawback of the present system is that it ties the presenter to the computer and keyboard, so the presenter cannot make the presentation more interactive. To avoid this, we have developed this system.

    The basic idea is very simple: a program running on the PC processes the image viewed by the webcam, finds the position of the laser pen's light spot, and uses this position to determine where to place the mouse cursor on the screen. If we assume that the light spot is always the brightest object in the area, the detection algorithm becomes very simple: find the location of the pixel with the highest intensity level. This idea is explained diagrammatically in Fig. 2.
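    A minimal sketch of this brightest-pixel detection, assuming OpenCV is available and that the laser spot really is the brightest object in the frame (the function name and threshold below are illustrative, not the authors' code):

    import cv2

    def find_laser_spot(frame):
        """Return the (x, y) pixel position of the brightest point in a BGR frame."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # A slight blur prevents a single noisy pixel from beating the laser spot.
        gray = cv2.GaussianBlur(gray, (5, 5), 0)
        _min_val, max_val, _min_loc, max_loc = cv2.minMaxLoc(gray)
        # Ignore frames where nothing is bright enough to be the laser.
        return max_loc if max_val > 200 else None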

  2. LITERATURE SURVEY

    1. Identification of Need

      A normal projector does not provide any interaction mode, so the presenter cannot interact with it; at most it allows the presenter to control slides using a remote control. Today's projectors do not interact with the user, i.e. they do not allow the user to draw lines or shapes or highlight specific content. Therefore our aim is to develop a projector system that interacts with the presenter and allows the presenter to draw lines and highlight content.

    2. Present System in Use

      Fig.1 Present system

    3. Flaws in Present System

      1. Lack of Interaction in Projector Presentation:

        Today's projectors are not very interactive. While using a projector, the user is only capable of controlling it through a remote control or the computer attached to it.

      2. Presenter cannot Highlight, mark on the screen:

    To point at content on the screen, the operator has to use either a laser pointer or a wooden scale; the presenter cannot highlight or mark the content itself. With an interactive projector, the presenter can point out and highlight content on a slide.

  3. SYSTEM ARCHITECTURE

    Fig.2. System Architecture

    Fig. 2 shows the system architecture diagram. The user interacts with the system through a laser beam. The system consists of a camera facing the projected screen that continuously captures images of the screen. These images are analysed and the laser beam is detected using the detection algorithm. The behaviour of the laser beam is then analysed and the corresponding input events are generated from it, and the operations performed with the laser beam are displayed on the screen.
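    The skeleton below is a sketch of this pipeline rather than the authors' implementation; it assumes OpenCV for capture and the pyautogui library for synthesizing mouse movement, and it calls the detection and transformation helpers sketched in the later sections:

    import cv2
    import pyautogui  # assumed library for generating mouse events

    def run(camera_index=0, homography=None):
        cap = cv2.VideoCapture(camera_index)     # camera facing the projected screen
        while True:
            ok, frame = cap.read()               # image capture phase
            if not ok:
                break
            spot = find_laser_spot(frame)        # image processing phase (hypothetical helper)
            if spot is not None:
                x, y = camera_to_display(spot, homography)   # transformation phase
                pyautogui.moveTo(x, y)           # interpretation phase: move the cursor
            cv2.imshow("camera", frame)          # preview window (also needed for waitKey)
            if cv2.waitKey(1) & 0xFF == 27:      # Esc key stops the loop
                break
        cap.release()
        cv2.destroyAllWindows()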

  4. PROCESSING PHASES OF THE SYSTEM

    The system consists of four processing phases [2]:

    1. Image Capture

    2. Image processing

    3. Transformation

    4. Interpretation

    1. Image Capture

      The processing starts with image capture: the camera continuously captures images of the projector screen and sends them to the computer/laptop.

    2. Image Processing

      The images captured in the previous phase are processed in the image-processing phase. Its main aim is to find the position of the laser point in the image. Laser point detection is explained in detail in Section 5.

    3. Transformation

      In the transformation phase, display image coordinates are calculated. First, the area of the display image is determined; then the display image coordinates are computed from the camera image coordinates [2]. From these display image coordinates we obtain the location of the laser beam/point on the screen.
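      One common way to realise this mapping, sketched here under the assumption of a flat projection surface, is a planar homography computed from the four corner points selected during calibration (Section 5); the function names are illustrative:

      import cv2
      import numpy as np

      def build_homography(camera_corners, display_size):
          """camera_corners: the projected screen's corners as seen by the camera,
          ordered upper left, upper right, bottom left, bottom right."""
          w, h = display_size
          src = np.float32(camera_corners)
          dst = np.float32([[0, 0], [w - 1, 0], [0, h - 1], [w - 1, h - 1]])
          return cv2.getPerspectiveTransform(src, dst)

      def camera_to_display(point, homography):
          """Map one camera-image point (x, y) to display coordinates."""
          p = np.float32([[point]])                              # shape (1, 1, 2)
          x, y = cv2.perspectiveTransform(p, homography)[0, 0]
          return int(x), int(y)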

    4. Interpretation

    The goal of the interpretation phase is the generation of mouse/keyboard actions such as left click, right click, button press and button release. Since the laser pen has no buttons, these actions cannot be performed directly. The button press action is simulated as follows:

    While using a laser pen, it is impossible to keep the beam pointed at exactly the same location. To deal with this, a small area is considered: if the laser point stays within that area for a certain amount of time, a button press action is generated. In this way the interpretation of mouse/keyboard actions is done.
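    A small sketch of this dwell rule (the radius and time thresholds below are assumptions, not values from the paper):

    import time

    DWELL_RADIUS = 15      # pixels: assumed tolerance around the starting position
    DWELL_TIME = 1.0       # seconds the point must stay put to count as a press

    def _dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    class DwellClicker:
        """Turns a stream of laser positions into button-press events."""
        def __init__(self):
            self.anchor = None     # position where the current dwell started
            self.since = None      # time when the current dwell started

        def update(self, point):
            """Feed the latest laser position; returns True when a press fires."""
            now = time.time()
            if self.anchor is None or _dist(point, self.anchor) > DWELL_RADIUS:
                self.anchor, self.since = point, now   # moved: restart the dwell
                return False
            if now - self.since >= DWELL_TIME:
                self.anchor, self.since = None, None   # fire once, then reset
                return True
            return False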

  5. LASER POINT DETECTION AND CALIBRATION

    1. Set the Region of Interest Areas

      1. To locate the laser pointer accurately on the images captured by the camera, we first select four points before starting the program. We set four regions of interest, as shown in Fig. 3: upper left, upper right, bottom left and bottom right.

      Fig. 3 Selection of Region of Interest: (a) upper left, (b) upper right, (c) bottom left, (d) bottom right
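      A possible way to pick these four reference points before the program starts, sketched here with an OpenCV mouse callback (the workflow and window name are assumptions, not the paper's procedure):

      import cv2

      def select_corners(frame, window="Select corners"):
          """Show one camera frame and collect four clicked (x, y) points in the
          order upper left, upper right, bottom left, bottom right."""
          points = []

          def on_mouse(event, x, y, flags, param):
              if event == cv2.EVENT_LBUTTONDOWN and len(points) < 4:
                  points.append((x, y))

          cv2.namedWindow(window)
          cv2.setMouseCallback(window, on_mouse)
          while len(points) < 4:
              cv2.imshow(window, frame)
              if cv2.waitKey(30) & 0xFF == 27:   # Esc aborts the selection
                  break
          cv2.destroyWindow(window)
          return points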

    2. Detection of Laser point

    Laser point detection finds the laser point's position in the captured images. To locate the laser point in an image, we use an RGB and HSV filtering technique. After passing the image through the RGB and HSV filters, the result is a black-and-white image in which the detected laser beam is white and the rest of the image is black.

    After filtering the laser point from the image, the next step is to find its pixel position. To do so, we take the area reported as the laser beam and find its centre using the centroid-of-area algorithm [3]. When the beam is detected, the coordinates (x, y) of its centre are computed using the following equation:

    \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad \bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i

    where n represents the number of pixels in the beam (in its area) and (x_i, y_i) are the coordinates of each individual pixel [3].
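    A sketch of this filtering and centroid step, assuming a red laser pointer and OpenCV's HSV colour space; the threshold values are illustrative rather than the paper's, and the centroid is computed with image moments, which is equivalent to the equation above:

    import cv2

    def detect_laser_point(frame):
        """Return the (x, y) centroid of the laser spot in a BGR frame, or None."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # Keep only very bright, saturated red pixels (red wraps around hue 0/180),
        # giving the black-and-white mask described above.
        mask = cv2.inRange(hsv, (0, 100, 200), (10, 255, 255)) | \
               cv2.inRange(hsv, (170, 100, 200), (180, 255, 255))
        m = cv2.moments(mask, binaryImage=True)
        if m["m00"] == 0:                      # no white pixels: no laser detected
            return None
        # m10/m00 and m01/m00 are the mean x and y of the white pixels,
        # i.e. the centroid-of-area equation above.
        return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])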

  6. WORK FLOW DIAGRAM

    Fig. 4 shows the general work flow of the system: start the webcam and projector and check that they are installed correctly (installing them again if not); process the image viewed by the camera; find the pixel position of the laser point; place the mouse cursor at the determined pixel position; draw on the screen; and display the image.

    Fig. 4 Work Flow Diagram

  7. FUTURE SCOPE

    In the future, the use of multiple laser pointers can be made possible. This can be achieved by modifying the current algorithm to capture the whole screen regardless of pointer detection. The accuracy of the pointer position can also be improved. Currently the system works with a 30 fps camera; in the future the speed can be increased with faster algorithms.

    CONCLUSION

    User friendliness is important while giving a presentation with the help of a projector, and the interactive projector provides a user-friendly interface. With simple image processing techniques, direct access from the screen is possible: using a simple laser pointer, we can interact with the projector screen. The drawback of limited accuracy can be reduced with a higher-quality camera.

    ACKNOWLEDGMENT

    We extend our sincere thanks and deep gratitude to our project guide, Prof. Vaishali S. Deshmukh, for her immensely valuable advice and guidance. We sincerely thank our Head of Department, Prof. Parikshit N. Mahalle, for his precious advice in the early stage of project selection. Lastly, we wish to thank our parents, families and friends for their patience and support.


    REFERENCES

    1. Sun Zhenying, Wang Yigang, Ye Lexiao: "Research on human-computer interaction with laser pen in projection display"; 11th IEEE Conference on Communication Technology Proceedings, 2008.

    2. C. Kirstein: "A System for Human-Computer Interaction with a Projection Screen Using a Camera-tracked Laser Pointer"; Technical Report 686, Informatik VII, Univ. of Dortmund, 1998.

    3. Carsten Kirstein, Heinrich Müller: "Interaction with a Projection Screen Using a Camera-Tracked Laser Pointer"; Journal of Image Processing, ISSN: 0976-8732, E-ISSN: 0976-8890, Volume 3, Issue 1, 2011, pp. 68-70.

    4. http://www.hindawi.com/isrn/machine.vision/2013/252406/

    5. http://www.cs.columbia.edu/~hgs/research/projects/laserpointer-mouse/

    6. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.58.606&rep=rep1&type=pdf

    7. http://www.iwb.org.uk/

    8. http://sellistoit103201.blogspot.in/2012/02/interactive-whiteboards-advantages-and.htm

    9. Si-Jung Kim, Moon-sang Jang, Kwang-chul Jung, Hong-sick Kim: "An interactive user interface for computer based Education - The Laser Shot System"; IEEE SMC 2004, pp. 1191-1197.

    10. http://www.eyetoplay.co.kr
