Virtual and Augmented Reality Applications in Manufacturing


Vishinika Veni, Assistant Professor, Dept. of CSE, PITS

R. Arvinthan, Dept. of CSE, PITS

T. Vengadasubramaniyam, Dept. of CSE, PITS

Abstract:- Augmented reality (AR) is a breakthrough technology that enables easily executable operations, while virtual reality (VR) is a simulation of the real world. VR is used in laboratories and on gaming platforms.

Hardware used: eyeglass displays, head-up displays (HUD), input processors, sensors, etc. Software used: Blender and other 3D modelling tools.

INTRODUCTION

Barely a century ago, manufacturing was known as a black art, where most of the tools and technologies were primarily mechanical in nature. Mechanical moving elements were initially powered by steam and later by electric power. Elaborate overhead belt systems were used to supply power to each machine, as this was more economical than having machines driven by individual power sources.

In the 1950s, numerically controlled machine tools made a huge leap, and since then manufacturing has entered a new era. Virtual reality as a simulation tool was first reported in the 1960s. Since then, many different forms have appeared, from 2D monitor-based displays to immersive 3D and sophisticated setups such as the CAVE. In just over two decades, augmented reality (AR) technology has matured and proven to be an innovative and effective tool to simulate, guide and improve manufacturing processes before they are launched, addressing some of their critical problems. Activities such as design, planning and machining can now be simulated and verified before physical execution.

HARDWARE DEVICES

Head-mounted display (HMD) devices were a popular choice when AR applications were first developed, as the eye-level display facilitates direct perception of the combined AR scene. HMD devices, however, are uncomfortable and may cause headaches and dizziness, especially after prolonged usage.

Current research in AR applications is moving towards mobility using handheld devices (HHD), either commercially available or specially designed (Hakkarainen et al., 2008; Stutzman et al., 2009; Xin et al., 2008). The advantages of using HHD are quite obvious, as a high-resolution camera, touch screen, gyroscope, etc., are already embedded in these mobile devices.

VR AND AR IN ROBOTICS

VR has been proven to be useful in medical robots for surgeries (Burdea, 1996), tele-robotics (Freund and Rossmann, 2005), welding (Liu et al., 2010), modeling of a six-DOF virtual robot arm (Chen et al., 2010), etc. In Chen et al. (2010), the authors proposed a new Human Computer Interaction (HCI) method for VR-based robot path planning and virtual assembly systems. However, the main constraint in VR-based robot programming is the need to construct the entire Virtual Environment (VE), which requires full a priori knowledge of the workpieces and the working area, and thus more computational resources.
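To give a concrete sense of the kinematic model a VE must maintain, the sketch below computes forward kinematics for a hypothetical two-link planar arm. The link lengths and joint layout are illustrative assumptions, not taken from the cited six-DOF work; a real VE would apply the same idea to all six joints.

```python
import math

def fk_2link(theta1, theta2, l1=0.5, l2=0.4):
    """Forward kinematics of an assumed planar 2-link arm.

    Maps joint angles (radians) to the end-effector position (x, y).
    theta2 is measured relative to the first link, so the second link's
    absolute orientation is theta1 + theta2.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

With both joints at zero the arm lies along the x-axis, so the end-effector sits at (l1 + l2, 0).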

AR-based robotic systems offer users graphics, text and animation by augmenting illustrative and informative elements over the real scene via a video stream. An AR cueing method was reported by Nawab et al. (2007) and Chintamani et al. (2010) to assist users in navigating the end-effector (EE) of a real robot using two joysticks.
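At its core, augmenting a virtual element over the real scene is a per-pixel compositing operation on the video frame. A minimal sketch, assuming RGB pixels as integer triples (a real system would use a vision library such as OpenCV for this):

```python
def blend_pixel(background, overlay, alpha):
    """Alpha-composite one virtual-overlay pixel onto a camera-frame pixel.

    background, overlay: (r, g, b) tuples with channels in 0-255.
    alpha in [0, 1] is the opacity of the virtual element drawn over
    the real scene; alpha = 0 leaves the camera frame untouched.
    """
    return tuple(round(alpha * o + (1.0 - alpha) * b)
                 for o, b in zip(overlay, background))
```

Applying this over the pixel region covered by a rendered cue yields the combined AR scene the user sees.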

The use of AR to address human-robot interaction and robot programming issues has been reported in several studies. Operators can program and guide the virtual model without having to interact physically with the real robot. Zaeh and Vogl (2006) introduced a laser-projection-based approach in which operators can manually edit and modify the planned paths projected over the real workpiece using an interactive stylus. Reinhart et al. (2008) adopted a similar human-robot interface in remote robot laser welding applications. Chong et al. (2009) and Ong et al. (2010) presented a methodology to plan a collision-free path by guiding a virtual robot with a probe attached to a planar marker, and developed the RPAR (Robot Programming using Augmented Reality) system. The methodology is interactive, as the human is involved in obtaining the 3D data points of the desired curve through a number of demonstrations, defining the free space relevant to the task, and planning the orientations of the end-effector along the curve.

RPAR was further developed and enhanced into the RPAR-II system (Fang et al., 2009; Fang et al., 2012). It includes a SCORBOT-ER VII manipulator, gripper, robot controller, desktop PC, desktop-based display, stereo camera, and an interaction device attached with a marker-cube. The augmented environment consists of the physical entities that exist in the robot operation space and a virtual robot model, which includes a virtual end-effector to replicate the real robot.

Volume 7, Issue 11. Published by www.ijert.org

In RPAR-II, a collision-free path can be generated through human-virtual robot interaction in a real working environment. The setup is for a robotic task of transferring an object from a start point to a goal point. With the start point and goal point known a priori, and after generating a collision-free volume (CFV) in the workspace, the user proceeds to create a series of control points within the CFV using the interaction device. Using these points as inputs, a cubic-spline interpolation is applied to generate a smooth path automatically.
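The interpolation step above can be sketched as follows. This uses a Catmull-Rom cubic segment, one common interpolating cubic that passes exactly through its control points; it is an illustration, not necessarily the exact spline formulation used in RPAR-II.

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate a Catmull-Rom cubic segment between control points p1 and p2.

    p0..p3 are scalar coordinates of four consecutive control points;
    t in [0, 1] parameterises the segment, with t=0 at p1 and t=1 at p2.
    Applying this per axis (x, y, z) to the points captured with the
    interaction device yields a smooth path through all of them.
    """
    return 0.5 * ((2.0 * p1)
                  + (-p0 + p2) * t
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t ** 2
                  + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t ** 3)

def smooth_path(points, samples_per_segment=10):
    """Sample a smooth curve through a list of scalar control points."""
    # Duplicate the endpoints so the curve spans the first and last points.
    pts = [points[0]] + list(points) + [points[-1]]
    path = []
    for i in range(1, len(pts) - 2):
        for s in range(samples_per_segment):
            t = s / samples_per_segment
            path.append(catmull_rom(pts[i - 1], pts[i], pts[i + 1], pts[i + 2], t))
    path.append(points[-1])
    return path
```

Because the curve interpolates rather than approximates, every user-placed control point lies on the generated path, which matters when the points were chosen to stay inside the collision-free volume.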

TECHNICAL ISSUES AND CHALLENGES IN AR

Tracking accuracy

AR applications in manufacturing, such as robot path planning and CNC machining simulation, require a high level of tracking accuracy. Computer-vision (CV)-based tracking alone cannot handle high-frequency motion or rapid camera movements, so a combination of computer-vision, inertial and hybrid tracking techniques will be required. Hybrid systems using laser, RFID and other types of sensing devices will need to be considered.
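One simple hybrid scheme is a complementary filter: integrate the fast but drifting gyroscope and continuously correct it with the slower, drift-free vision estimate. A minimal single-axis sketch; the blend factor alpha is an assumed tuning parameter, not a value from any cited system.

```python
def fuse_orientation(prev_angle, gyro_rate, vision_angle, dt, alpha=0.98):
    """Complementary filter: blend gyro integration with a vision fix.

    prev_angle: last fused estimate (rad); gyro_rate: angular velocity
    (rad/s) over the step dt (s); vision_angle: absolute angle from
    CV-based tracking. alpha near 1 trusts the high-rate gyro in the
    short term, while the small (1 - alpha) share of the vision
    measurement cancels the gyro's long-term drift.
    """
    gyro_angle = prev_angle + gyro_rate * dt
    return alpha * gyro_angle + (1.0 - alpha) * vision_angle
```

Run per frame, this keeps the overlay responsive during rapid camera movement while the vision correction prevents the estimate from drifting away over time.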


