- Open Access
- Authors : Thejaswini K P , Usha Rani C M , B A Sujatha Kumari
- Paper ID : IJERTV7IS070138
- Volume & Issue : Volume 07, Issue 07 (July 2018)
- Published (First Online): 05-01-2019
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Thejaswini K P
2nd Semester, Department of Electronics and Communication, SJCE College, Mysuru, India

Usha Rani C M
2nd Semester, Department of Electronics and Communication, SJCE College, Mysuru, India

B A Sujatha Kumari
Department of Electronics and Communication, SJCE College, Mysuru, India
Abstract - Augmented reality (AR) combines a real scene viewed by a user with a virtual scene generated by a computer that augments the scene with additional information. AR can be used in the automotive industry, for example, to compare real parts of a car with their associated construction data: the real parts have to be checked as to whether they correspond to the latest version of the design and whether they have been manufactured with appropriate precision. The real and the virtual part should both be visible at the same time and in the same place, so a special method of augmentation is needed for this kind of overlay. This paper presents and discusses the technologies, types and key components of augmented reality, as well as some of its applications.
Key words - Augmented Reality, Collaborative vehicle, See-through, Region of Interest, In-vehicle Infotainment, Visual guidance, Oculus Rift
INTRODUCTION
Augmented Reality (AR) is a real-time direct or indirect view of a physical, real-world environment that has been enhanced (augmented) by adding virtual computer-generated information to it. AR combines real and virtual objects, is interactive, and is registered in 3D. Paul Milgram and Fumio Kishino define a continuum that spans from the real environment to the virtual environment, with Augmented Reality and Augmented Virtuality (AV) in between: AR is closer to the real world, while AV is closer to a pure virtual environment. An AR system adds virtual computer-generated objects, audio and other sense enhancements to a real-world environment in real time. The ultimate goal of AR is to create a system in which a user cannot tell the difference between the real world and its virtual augmentation.
Augmented Reality aims at simplifying the user's life by bringing virtual information not only to the immediate surroundings, but also to any indirect view of the real-world environment, such as a live video stream. AR enhances the user's perception of, and interaction with, the real world. Whereas Virtual Reality (VR) technology completely immerses users in a synthetic world without a view of the real world, AR augments the sense of reality by superimposing virtual objects and cues upon the real world in real time. AR is not restricted to a particular display technology such as the head-mounted display (HMD), nor is it limited to the sense of sight: it can potentially apply to all senses, augmenting smell, touch and hearing as well. AR can also be used to augment or substitute a user's missing senses through sensory substitution, such as augmenting the sight of blind users or users with poor vision by means of audio signals, or augmenting hearing for deaf users by means of visual signals.
Some AR applications also require removing real objects from the environment in addition to adding virtual ones; these are more commonly called mediated or diminished reality. Removing an object from the real world amounts to covering it with virtual information that matches the background, giving the user the impression that the object is not there. Virtual objects added to the real environment show the user information that the user cannot directly detect with his own senses. AR systems have the following three characteristics:
They combine real and virtual objects in a real environment,
They are interactive in real time,
They register (align) real and virtual objects with each other.
BACKGROUND WORK OR RELATED WORK
A Real-time Augmented Reality System to See-Through Cars
One of the most hazardous driving scenarios is overtaking a slower vehicle: the front vehicle (being overtaken) can occlude an important part of the field of view of the rear vehicle's driver. This lack of visibility is the most probable cause of accidents in this context. Recent research suggests that augmented reality applied to driver assistance can significantly reduce the risk of accidents. Here, two cars are equipped with cameras and an appropriate wireless communication system. The stereo vision system mounted on the front car allows a sparse 3D map of the environment to be created, in which the rear car can be localized. Using this inter-car pose estimation, a synthetic image is generated to overcome the occlusion and to create a seamless see-through effect that preserves the structure of the scene.
Figure 1: Picture of the front car equipped with a stereo vision system
Figure 2: Rear car with a single camera fixed on the rooftop
A see-through system solves the visibility problem in overtaking situations by overcoming the occlusion caused by the front car. Specifically, the aim is to literally see through the front car from the rear car's point of view. To achieve this goal, a specific setup was developed: the front vehicle is equipped with a stereo vision system, while the rear car requires only a single camera. The information transfer between the vehicles is ensured by a wireless Vehicle-to-Vehicle (V2V) communication system. The method consists of four main steps:
A sparse, real-scale 3D map is computed from the front vehicle using a fast Stereo Visual Odometry (VO) approach,
Then, the pose of the rear car in this 3D map is estimated using a novel 2D-3D feature tracking,
Next, a dense disparity map is computed with the front car's stereo vision rig in order to generate a synthetic image from the rear car's viewpoint,
Finally, the synthetic patch is stitched onto the occluded area of the rear car's image.
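The core geometric operation behind steps 2-4 is projecting points of the shared 3D map into a camera image using the estimated pose. A minimal sketch, assuming a simple pinhole model with hypothetical intrinsics (the paper's actual visual-odometry pipeline is far more involved):

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project 3D map points (N x 3) into pixel coordinates (N x 2)
    using a pinhole camera with intrinsics K and pose (R, t)."""
    cam = points_3d @ R.T + t          # world frame -> camera frame
    uvw = cam @ K.T                    # apply camera intrinsics
    return uvw[:, :2] / uvw[:, 2:3]    # perspective divide

# Hypothetical intrinsics; rear-car pose chosen equal to the map origin.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)

pts = np.array([[0.0, 0.0, 2.0],   # point on the optical axis, 2 m ahead
                [1.0, 0.0, 2.0]])  # point 1 m to the right
print(project_points(pts, K, R, t))  # -> [[320. 240.] [570. 240.]]
```

Once map points can be projected into the rear camera like this, the synthetic patch rendered from the front car's viewpoint can be warped and stitched onto the occluded region.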
In-Vehicle Augmented Reality Traffic Information System: A New Type of Communication between Driver and Vehicle
In order to improve driving safety and minimize driver workload, the information provided should be represented in such a way that it is more easily understood and imposes less cognitive load on the driver. An Augmented Reality Head-Up Display (AR-HUD) can facilitate a new form of dialogue between the vehicle and the driver, and can enhance intelligent transportation systems by superimposing surrounding traffic information on the user's view while keeping the driver's eyes on the road. This paper investigates the potential costs and benefits of using AR cues to improve driving safety as a new form of dialogue between the vehicle and the driver. It presents a new approach for a marker-less AR traffic sign recognition system that superimposes virtual objects onto a real scene under all types of driving situations, including unfavorable weather conditions. The method uses two steps: hypothesis generation and hypothesis verification. In the first step, a Region of Interest (ROI) is extracted using a scanning window with a Haar cascade detector and an AdaBoost classifier, which reduces the computational region for the hypothesis generation step. The second step verifies each candidate, classifying it into vehicle and non-vehicle classes using edge information and a symmetry measurement. This approach improves the accuracy of the AR traffic information system to assist the driver in various driving situations, increase driving comfort and reduce traffic accidents.
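The symmetry measurement used in the verification step can be sketched as a simple left-right comparison of the candidate ROI. This is a minimal illustration under assumed scoring: the paper's actual verifier also uses edge information and learned classifiers, and this particular score function is an assumption, not the authors' formula:

```python
import numpy as np

def symmetry_score(roi):
    """Score in [0, 1]: 1.0 means the grayscale ROI is perfectly
    mirror-symmetric about its vertical axis (typical of a car's rear)."""
    roi = np.asarray(roi, dtype=float)
    half = roi.shape[1] // 2
    left = roi[:, :half]
    right_mirrored = roi[:, -half:][:, ::-1]  # mirror the right half
    return 1.0 - np.mean(np.abs(left - right_mirrored)) / 255.0

# A mirror-symmetric candidate scores 1.0; an asymmetric one scores lower.
symmetric = np.array([[10, 20, 20, 10],
                      [30, 40, 40, 30]])
asymmetric = np.array([[10, 20, 200, 90],
                       [30, 40, 250, 60]])
print(symmetry_score(symmetric))   # -> 1.0
print(symmetry_score(asymmetric))  # lower, closer to 0.5
```

In a full pipeline, candidates whose score falls below a threshold would be rejected as non-vehicle hypotheses.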
In recent years, the use of the automobile as the primary mode of transportation has been increasing, and driving has become an important part of daily life. Every accident is one too many, especially when it results in injury or even loss of life. Given that the global volume of traffic has increased to around one billion vehicles, road safety has become a key challenge for society, industry and politicians. The rapid growth of technology has enabled novel ways in which a driver's senses may be augmented, and steering a vehicle has become an increasingly demanding task.
Figure 3: Frames illustrating the insertion of virtual 3D object sign
An Augmented Reality traffic information system can facilitate a new form of dialogue between the vehicle and the driver, and can enhance intelligent transportation systems by superimposing surrounding traffic information on the user's view while keeping the driver's eyes on the road. The paper proposes an in-vehicle AR-HUD system for providing driving-safety information that superimposes virtual objects onto the real scene under all types of driving situations, including unfavorable weather conditions, using the two-step hypothesis generation and verification method. In-vehicle contextual AR has the potential to provide novel visual feedback to drivers for an enhanced driving experience and a new form of dialogue between the vehicle and the driver. With the proposed AR-HUD, drivers receive all important driving-safety information before their eyes in an easily comprehensible way.
Design Methods for Augmented Reality In-Vehicle Infotainment Systems
To bring AR into production cars, a range of challenges must still be faced in designing an AR system that meets vehicle-specific requirements.
This paper analyzes the influence of augmented reality on the design of the in-vehicle electric/electronic (E/E) architecture. What AR in the vehicle really offers is a great reduction of driving stress, by presenting the large amount of driving information in a more intuitive and less distracting way. Navigation arrows and acoustic guidance instructions will be replaced by paths augmented directly onto the road. Drivers will be warned of crossing traffic by highlights directly on the windshield instead of beeps or vibrations of the steering wheel. Street and building names as well as house numbers will be anchored to the infrastructure as if they were fixed components of the environment. To make all of this happen, one has to understand the requirements AR imposes on the vehicle's construction, packaging and design, and to analyze the influence of AR on the in-vehicle E/E architecture.
Regarding AR applications on a vehicle platform, the paper focuses on four major technical terms: localization, tracking, rendering and display. Localization determines the vehicle's pose (position and orientation) in a global coordinate system, applying global positioning techniques in combination with tracking techniques. Tracking involves both self-tracking and the tracking of external objects. The goal of rendering is to visualize virtual objects and seamlessly integrate them into the surrounding environment. The head-up display may become the major trend for AR driver assistance and navigation guidance.
Figure 4: Architecture of AR system
Architecture involves a wide range of issues between the sensors and the actuators. Sensors, including cameras, radars, GPS, inertial sensors etc., function as the input of the AR system. Actuators, including different displays in a vehicle, are the system output. Bandwidth for transmitting videos is a general issue pertaining to the entire AR system from sensors through the architecture to actuators. There are five displays controlled by four ECUs. The HUD, the Instrument Cluster (IC) display and the central display will preferably be used for AR driver assistance, whereas the rear seat LCDs will rather be employed for AR infotainment. To achieve a smooth AR experience for the driver and the passengers, the communication and synchronization among the head unit, the rear seat entertainment controller and the two ECUs for HUD and IC will have to be optimized.
Three ECUs lie between the front camera and one of the rear seat LCDs, while only one ECU is on the path from the front camera to the head-up display. Both the transmission delay and the delay inside an ECU have to be taken into consideration. Together with the time consumed by the protocol conversion between LVDS and MOST, the image delay on the rear seat LCD might be noticeable to passengers.
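The end-to-end image delay along a display path can be estimated by summing per-hop costs. A minimal sketch: the ECU counts follow the architecture described above, but all delay values are hypothetical placeholders, not measurements from the paper:

```python
# Estimate end-to-end image latency along two display paths of the
# in-vehicle AR architecture: camera -> ECU chain -> display.
LINK_DELAY_MS = 2.0       # hypothetical per-link transmission delay
ECU_DELAY_MS = 15.0       # hypothetical processing delay inside one ECU
LVDS_TO_MOST_MS = 10.0    # hypothetical protocol-conversion overhead

def path_latency_ms(num_ecus, protocol_conversion=False):
    """Sum link, ECU and optional protocol-conversion delays."""
    links = num_ecus + 1  # one link into the chain, one after each ECU
    latency = links * LINK_DELAY_MS + num_ecus * ECU_DELAY_MS
    if protocol_conversion:
        latency += LVDS_TO_MOST_MS
    return latency

# One ECU between front camera and HUD; three ECUs (plus an LVDS/MOST
# conversion) between front camera and a rear seat LCD.
print(path_latency_ms(1))                            # -> 19.0
print(path_latency_ms(3, protocol_conversion=True))  # -> 63.0
```

Even with these placeholder numbers, the model shows why the rear-seat path accumulates several times the delay of the HUD path and therefore needs the most optimization.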
Visual Assembling Guidance Using Augmented Reality
This paper describes a study of using augmented reality to support assembly line workers in carrying out their task optimally. AR makes it possible to improve visual guidance to the workers by overlaying virtual information on real-world objects, thereby enhancing the human's perception of reality. A prototype system is developed based on the Oculus Rift platform and evaluated using a simulated assembly task.
The main aims are:
Ability to combine real and virtual objects in a real environment,
Ability to register (align) real and virtual objects with each other, and
Ability to run interactively, in three dimensions, and in real time.
A head-worn implementation is selected, since such an implementation frees the user's hands. For realizing the head-worn implementation, a video-based solution is chosen: the real world and the virtual world are merged into the same view, and the user's view is completely digitalized. The hardware platform used in the study is the Oculus Rift, a virtual reality headset developed and manufactured by Oculus VR.
Figure 5: Camera mounted on Oculus Rift
Two bowed aluminum plates, each with three holes and a nut screw in each hole, hold the plastic discs onto which the cameras are glued. The plastic discs can be adjusted laterally by tightening or loosening the screws, making it easy to shift the cameras sideways; this is important in order to compensate for different users having different distances between their eyes. The software platform chosen for the project was Unity, as this platform is easy to work with and fully supported by the Oculus Rift. The virtual objects are registered to the tracking pattern and rendered as a layer above the camera plane; the external camera view and the virtual overlay are then merged and rendered to the Oculus screens.
The evaluation task is assembling a three-dimensional puzzle with nine pieces. The pieces are to be placed in a certain order and at specific positions, just as in industrial assembly.
Figure 6: Three-dimensional puzzle used in assembly task.
The user wears the modified Oculus Rift. On the Oculus's screens, the next piece to select is highlighted in green, and the place to put it is marked with the same shape and color. The piece is to be placed at the position marked with the same shape, color and number as the piece itself.
Figure 7: Screenshot from Oculuss screen
In the screenshot, the user holds piece number one, which is overlaid with green color and pointed at by a virtual arrow.
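The guidance logic behind this highlighting can be sketched as a simple ordered-step state machine. This is an illustrative assumption about the software structure; the paper does not detail its implementation:

```python
class AssemblyGuide:
    """Track progress through an ordered assembly sequence and report
    which piece to highlight next, as in the 3D-puzzle task."""

    def __init__(self, sequence):
        self.sequence = list(sequence)  # piece IDs in assembly order
        self.step = 0

    def next_piece(self):
        """Piece to highlight in green, or None when assembly is done."""
        if self.step < len(self.sequence):
            return self.sequence[self.step]
        return None

    def confirm_placed(self, piece):
        """Advance only if the correct piece was placed."""
        if piece != self.next_piece():
            return False  # wrong piece: keep highlighting the same step
        self.step += 1
        return True

guide = AssemblyGuide(range(1, 10))  # nine puzzle pieces
print(guide.next_piece())            # -> 1
print(guide.confirm_placed(3))       # -> False (wrong piece)
print(guide.confirm_placed(1))       # -> True
print(guide.next_piece())            # -> 2
```

The renderer would query `next_piece()` each frame and draw the green highlight and target marker for that piece only.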
Augmented reality combines a real scene viewed by a user with a virtual scene generated by a computer that augments the scene with additional information. An AR system adds virtual computer-generated objects, audio and other sense enhancements to a real-world environment in real time; the ultimate goal is a system in which the user cannot tell the difference between the real world and its virtual augmentation. The key features of augmented reality (AR) are: the system augments the real-world scene; the user maintains a sense of presence in the real world; a mechanism is needed to combine the virtual and real worlds; and it is hard to register real and virtual objects.
The key features of virtual reality (VR) are: a totally immersive environment in which the senses are under the control of the system; a mechanism is needed to feed the virtual world to the user; and it is hard to make the VR world interesting. AR systems combine real and virtual objects in a real environment, are interactive in real time, and register (align) real and virtual objects with each other. There are three components needed in order to make an augmented-reality system work:
Mobile computing power
How Does Augmented Reality Work?
In order to understand how augmented reality technology works, one must first understand its objective: to bring computer-generated objects into the real world, where only the user can see them. In most augmented reality applications, a user sees both synthetic and natural light. This is done by overlaying projected images on a pair of see-through goggles or glasses, which allow the images and interactive virtual objects to be layered on top of the user's view of the real world. Augmented reality devices are often self-contained, meaning that unlike the Oculus Rift or HTC Vive VR headsets, they are completely untethered and do not need a cable or desktop computer to function.
Augmented realities can be displayed on a wide variety of displays, from screens and monitors to handheld devices or glasses. Google Glass and other Head-Up Displays (HUD) put augmented reality directly in front of the user's eyes, usually in the form of glasses.
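Layering virtual imagery over a camera view ultimately comes down to alpha compositing each frame. A minimal sketch with a tiny 1x2-pixel "frame", illustrating the general principle rather than code from any of the surveyed systems:

```python
import numpy as np

def composite(real, virtual, alpha):
    """Blend a virtual overlay onto a real camera frame.
    alpha is a per-pixel mask: 1.0 = fully virtual, 0.0 = fully real."""
    real = np.asarray(real, dtype=float)
    virtual = np.asarray(virtual, dtype=float)
    alpha = np.asarray(alpha, dtype=float)[..., None]  # broadcast over RGB
    return alpha * virtual + (1.0 - alpha) * real

# A 1x2-pixel frame: the left pixel gets a green virtual marker at
# half opacity, the right pixel remains untouched camera imagery.
real = [[[100.0, 100.0, 100.0], [50.0, 50.0, 50.0]]]
virtual = [[[0.0, 255.0, 0.0], [0.0, 0.0, 0.0]]]
alpha = [[0.5, 0.0]]
print(composite(real, virtual, alpha))
# -> [[[ 50.  177.5  50. ] [ 50.  50.  50. ]]]
```

Optical see-through displays such as a HUD perform the equivalent blend physically, by adding projected light to the light already passing through the combiner glass.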
APPLICATIONS AND ADVANTAGES
There are many possibilities for using augmented reality in innovative ways. Four types of applications are most often the subject of AR research: advertising and commercial, entertainment and education, medical, and mobile applications for iPhones. Note that the navigational and informational application domain encountered in the AR systems section is treated here as part of the study of augmented reality mobile applications, as these applications most often have navigational and informational uses.
Figure 8: Applications of AR
CONCLUSION AND FUTURE SCOPE
In "A Real-time Augmented Reality System to See-Through Cars", the front car generates a synthetic unoccluded image from the rear car's viewpoint. To reduce bandwidth consumption, only the ROI affected by the occlusion is transferred to the rear car to be displayed.
"In-Vehicle Augmented Reality Traffic Information System: A New Type of Communication between Driver and Vehicle" provides driving-safety information using the proposed AR-HUD, so that drivers receive all important information before their eyes in an easily comprehensible way.
In "Visual Assembling Guidance Using Augmented Reality", an AR prototype based on the Oculus Rift platform was developed as part of the study and used for assembling a 3D puzzle. The assembly task is fully replicable, so others can use it for benchmarking.
"Design Methods for Augmented Reality In-Vehicle Infotainment Systems" shows that for an automotive AR system, issues of the current-generation E/E architecture, including bandwidth, latency and the synchronization of flexible displays, will have to be considered.
Future applications include expanding a PC screen into the real environment, where program windows and icons appear as virtual devices in real space and are operated by eye or gesture, by gazing or pointing; enhanced media applications such as pseudo-holographic virtual screens and virtual surround cinema; and the replacement of cell phones, with eye dialing and the insertion of information directly into the environment.
REFERENCES
[1] François Rameau, Hyowon Ha, Kyungdon Joo, Jinsoo Choi, Kibaek Park, In So Kweon, "A Real-time Augmented Reality System to See-Through Cars", IEEE Transactions on Visualization and Computer Graphics, Volume 22, Issue 11, Nov. 2016.
[2] Lotfi Abdi, Faten Ben Abdallah, Aref Meddeb, "In-Vehicle Augmented Reality Traffic Information System: A New Type of Communication Between Driver and Vehicle", Procedia Computer Science 73 (2015), pp. 242-249, Elsevier.
[3] Qing Rao, Christian Grunler, Markus Hammori, Samarjit Chakraborty, "Design Methods for Augmented Reality In-Vehicle Infotainment Systems", 51st ACM/EDAC/IEEE Design Automation Conference (DAC), 2014.
[4] Anna Syberfeldt, Oscar Danielsson, Magnus Holm and Lihui Wang, "Visual Assembling Guidance Using Augmented Reality", Procedia Manufacturing, Volume 1, 2015, pp. 98-109, Elsevier.
[5] R. Naja, "Wireless Vehicular Networks for Car Collision Avoidance", Springer Publishing Company, Incorporated, 2013.
[6] X.-S. Gao, X.-R. Hou, J. Tang, and H.-F. Cheng, "Complete solution classification for the perspective-three-point problem", IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 25(8):930-943, 2003.
[7] H.-I. Chen, Y.-L. Chen, W. T. Lee, F. Wang, and B. Y. Chen, "Integrating dashcam views through inter-video mapping", IEEE International Conference on Computer Vision (ICCV), 2015.
[8] H. S. Park, K. H. Kim, in "Virtual, Augmented and Mixed Reality - Applications of Virtual and Augmented Reality", Springer, 2014, pp. 435-442.