
AR Path WIET: An Interactive AR Indoor Navigation System for Students and Visitors

DOI: 10.17577/IJERTV15IS042173

Aarti Sakpal

Department of Computer Engineering, Watumull Institute of Engineering and Technology, Thane, India

Rahul Dasari

Department of Computer Engineering, Watumull Institute of Engineering and Technology, Thane, India

Kaushik Gosala

Department of Computer Engineering, Watumull Institute of Engineering and Technology, Thane, India

Adarshkumar Azadkumar Singh

Department of Computer Engineering, Watumull Institute of Engineering and Technology, Thane, India

Under the guidance of Prof. Avinash Gondal

Department of Computer Engineering, Watumull Institute of Engineering and Technology, Thane, India

Abstract: Most navigation systems today rely on two-dimensional (2D) maps, which require users to mentally interpret directions and relate them to the real world. This often leads to confusion, especially in unfamiliar locations such as college campuses or crowded urban areas. To address this issue, this paper presents an Augmented Reality (AR)-based navigation system that provides real-time visual guidance by overlaying virtual navigation elements directly onto the user's surroundings.

The proposed system combines GPS data, smartphone sensors, and AR frameworks to display directional arrows, route paths, and destination indicators on the device screen. Unlike traditional systems, this approach allows users to follow directions naturally without switching between the map and the environment. Experimental results show that the system improves navigation accuracy, reduces user confusion, and enhances overall experience.

Keywords: Augmented Reality, Navigation System, ARCore, GPS, Mobile Computing, Computer Vision

I. INTRODUCTION

Traditional navigation systems such as Google Maps and other GPS-based applications provide directions using two-dimensional (2D) maps. Although these tools are widely used, they depend heavily on the user's ability to understand and mentally convert map directions into real-world movement. This can often be confusing, especially for people who are not familiar with the area. For example, deciding which way to turn at an intersection or identifying the correct building in a crowded place can be difficult when relying only on a top-view map. Because of this, there is a need for a more intuitive and user-friendly navigation system.

With the advancement of Augmented Reality (AR), it has become possible to combine digital information with the real world in a more meaningful way. AR technology allows virtual elements such as arrows, signs, and labels to be displayed directly on the live camera view of a smartphone. In navigation, this means users can see directions placed in their actual environment instead of interpreting them from a map. This makes navigation easier, more natural, and more interactive, as users can simply follow visual cues shown on their screen.

Recent improvements in smartphone technology, including better GPS accuracy, advanced cameras, and AR development platforms like Google ARCore and Apple ARKit, have made AR-based navigation systems practical and accessible. These technologies help in accurately placing virtual objects in the real world and tracking the user's movement and orientation in real time. As a result, AR navigation applications can now run efficiently on regular smartphones without requiring any additional hardware.

The AR Navigation System proposed in this project aims to simplify the navigation process by displaying directions directly on the user's mobile screen. The user enters the starting point and destination, and the system calculates the best route using mapping services such as the Google Maps API. This route is then displayed in the real-world view using AR, where virtual arrows guide the user step by step. As the user moves, the system continuously updates their position and adjusts the navigation cues in real time.
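As a rough illustration of the per-frame arrow update described above (a minimal sketch with hypothetical function names, not the app's actual code), the AR arrow should point along the great-circle bearing from the user's current GPS fix to the next route waypoint, relative to the direction the user is facing:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

def arrow_rotation(user_heading_deg, lat1, lon1, lat2, lon2):
    """On-screen rotation for the AR arrow: bearing to the waypoint
    relative to where the user is currently facing (0 = straight ahead)."""
    return (bearing_deg(lat1, lon1, lat2, lon2) - user_heading_deg) % 360.0
```

For example, a user facing due east (heading 90) whose next waypoint lies due east gets an arrow rotation of 0, i.e. "go straight ahead".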

Unlike traditional navigation methods, where users must frequently switch their attention between the map and their surroundings, this AR-based approach reduces confusion by combining both into a single view. Users can stay focused on their environment while following clear visual directions, which improves both accuracy and safety.

    In conclusion, the main aim of this project is to make navigation simpler, more intuitive, and engaging by using AR technology. By combining GPS data, smartphone sensors, and real-time AR visualization, the system provides a modern navigation solution that improves user experience and reduces the effort required to reach a destination.


II. LITERATURE SURVEY

Fig.1. Evolution of AR-Based Navigation System

Augmented Reality (AR)-based navigation systems have significantly evolved over the years, progressing from basic Global Positioning System (GPS)-based overlays to advanced real-time navigation solutions using modern frameworks such as Google ARCore and Apple ARKit. These advancements have improved the accuracy, interactivity, and usability of navigation systems in real-world environments.

In the early stages, AR navigation systems primarily relied on integrating GPS data with camera-based overlays to provide directional guidance. Although these systems enabled basic outdoor navigation, they suffered from limitations such as low positional accuracy, poor alignment between virtual objects and the real world, and delayed updates. These challenges affected the overall reliability and user experience.

Chung et al. (2016) proposed an Android-based AR navigation system that utilized GPS, camera, and smartphone sensors to display information about nearby buildings and landmarks. The system enhanced user interaction by combining real-world and virtual data. However, the use of both marker-based and markerless approaches increased system complexity, making it less suitable for lightweight and scalable applications.

Carmigniani et al. (2011) presented a comprehensive survey of AR technologies and their applications across various domains such as navigation, education, and healthcare. Their study highlighted that AR improves user perception by overlaying contextual information onto the physical environment. At the same time, they identified key challenges such as calibration errors, latency issues, and misalignment of virtual objects, especially in outdoor scenarios.

Grubert and Grasset (2013) explored the development of AR applications on mobile platforms using technologies such as ARToolKit and OpenGL. Their work demonstrated the feasibility of real-time AR-based navigation using mobile sensors and image tracking techniques. This research contributed to the advancement of mobile AR development by improving visualization and interaction capabilities.

In recent years, the introduction of advanced AR frameworks such as ARCore and ARKit has significantly enhanced AR navigation systems. These frameworks provide essential features including motion tracking, environmental understanding, and light estimation, which enable accurate placement of virtual objects in real-world environments. As a result, modern AR navigation systems can operate without markers and provide smoother, more stable, and real-time guidance.

Furthermore, several studies have focused on domain-specific AR navigation applications, including campus navigation, tourism, and healthcare facilities. Systems such as ARCampusGuide and ARTour have demonstrated that AR-based navigation improves user experience by reducing confusion and increasing engagement. Users can follow intuitive visual cues instead of interpreting traditional maps, leading to improved navigation efficiency.

III. LIMITATIONS OF EXISTING SYSTEMS

While Augmented Reality (AR)-based navigation systems have gained significant attention in recent years, many existing solutions still face several practical limitations that affect their performance and usability in real-world scenarios. These limitations arise due to technical constraints, environmental conditions, and user interaction challenges.

Fig.2. Limitations of Existing AR Navigation System.

    1. Dependence on Internet Connectivity

      Most AR navigation systems require a stable internet connection to function properly. They depend on online services to load maps and calculate routes. However, in areas such as rural locations, tunnels, or underground spaces, internet connectivity may be weak or unavailable. In such situations, maps may fail to load, directions may not update, and the AR view may stop working. This makes the system unreliable in offline conditions.

    2. GPS Accuracy Issues

GPS is widely used to determine the user's location, but it is not always accurate. Its performance can be affected by tall buildings in urban areas, bad weather conditions, and signal blockage. These issues can result in incorrect location detection, misaligned AR arrows, and wrong navigation directions. Even small errors in location can confuse users and reduce navigation efficiency.

    3. Complex System Design

      Some AR navigation systems include many advanced features such as object detection and 3D visualization. While these features improve functionality, they also make the system more complex. This increases battery consumption, slows down performance, and makes the application harder to use for regular users.

    4. Limited Indoor Navigation

Most navigation systems rely on GPS, which does not work effectively indoors. As a result, the system cannot determine the user's location accurately inside buildings, and AR navigation becomes unreliable. To solve this issue, additional technologies such as Wi-Fi positioning or Bluetooth beacons are required, but they are not always available.

    5. User Interface Complexity

      Some AR applications display too much information on the screen, such as multiple arrows, labels, and notifications. This creates visual clutter, which can confuse users and make it difficult to understand directions. Instead of simplifying navigation, a complex interface can reduce usability and safety.

    6. Device Compatibility Issues

AR applications require smartphones with high processing power, good sensors, and quality cameras. However, not all users have access to such devices. On low-end smartphones, AR performance may be slow, with lagging visuals and unstable tracking. This limits the accessibility of AR navigation systems for many users.

    7. Latency and Calibration Issues

AR systems need to synchronize real-world movement with virtual objects in real time. Sometimes, delays (latency) occur, or sensors may not function accurately. This can lead to incorrect placement of arrows and delays in updating directions. These issues reduce system accuracy and user trust.

IV. THE PROPOSED FRAMEWORK

The proposed system is an Augmented Reality (AR)-based navigation application designed to provide real-time guidance through visual overlays. Unlike traditional navigation systems that rely on two-dimensional maps, this system integrates digital navigation elements directly into the user's real-world environment using AR technology. This approach allows users to follow directions more naturally and reduces the need to interpret maps, thereby improving navigation accuracy and user experience.

The system architecture is divided into multiple layers, each responsible for a specific function. These layers work together to ensure accurate positioning, efficient route computation, and smooth AR visualization.

Fig.3. System Architecture of AR-Based Navigation System

  1. User Interface Layer

    The User Interface (UI) layer is responsible for enabling interaction between the user and the application. It provides a simple and intuitive interface where users can enter their source and destination locations. The UI supports both map view and AR view, allowing users to switch based on their preference.

    Additionally, this layer displays important navigation information such as:

    • Directional arrows

    • Distance to destination

    • Turn-by-turn guidance

      The interface is designed to be clean and minimal, reducing user confusion and improving overall usability.

  2. Sensor Layer

The Sensor Layer collects real-time data from the smartphone's built-in sensors to determine the user's position and orientation accurately. The primary sensors used include:

    • GPS for location tracking

    • Accelerometer for detecting movement

    • Gyroscope for tracking orientation

    • Compass for directional alignment

This sensor data is continuously updated and processed, enabling the system to track user movement in real time and provide accurate navigation guidance.
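As an illustration of how these sensor streams can be combined (a minimal sketch, not the system's actual implementation), a complementary filter fuses the gyroscope's fast but drifting heading estimate with the slower but absolute compass reading:

```python
def fuse_heading(prev_heading, gyro_rate_dps, compass_deg, dt, alpha=0.98):
    """Complementary filter: trust the gyroscope over short intervals,
    the compass over the long term. All angles in degrees, dt in seconds.
    alpha is the gyroscope weight (an assumed tuning value)."""
    # Integrate the gyroscope's angular rate to predict the new heading.
    predicted = (prev_heading + gyro_rate_dps * dt) % 360.0
    # Shortest signed difference between compass and prediction
    # (handles the 359 degrees -> 1 degree wrap-around correctly).
    error = ((compass_deg - predicted + 180.0) % 360.0) - 180.0
    # Nudge the prediction toward the compass by the (1 - alpha) fraction.
    return (predicted + (1.0 - alpha) * error) % 360.0
```

Running this at each sensor update keeps the AR overlay's heading smooth (no compass jitter) while preventing the slow drift that pure gyroscope integration would accumulate.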

  3. AR Processing Layer

    The AR Processing Layer is responsible for implementing augmented reality functionalities. It uses frameworks such as Google ARCore to perform key operations including:

    • Motion tracking

    • Plane detection

    • Environmental understanding

Based on this information, the system places virtual navigation elements such as arrows and markers in real-world coordinates. These elements are properly aligned with the user's surroundings to ensure a seamless and immersive AR experience.

  4. Mapping Layer

    The Mapping Layer handles route generation and navigation logic. It integrates with mapping services such as Google Maps API to compute optimal routes between the source and destination.

    Efficient pathfinding algorithms such as:

• Dijkstra's Algorithm

    • A* Algorithm

      are used to determine the shortest and most efficient route. The system also supports dynamic route updates in case the user deviates from the planned path.
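The pathfinding step can be sketched compactly (the graph, node names, and edge costs below are hypothetical, chosen only for illustration). A* expands nodes in order of cost-so-far plus a heuristic estimate of the remaining distance; with a zero heuristic it reduces to Dijkstra's algorithm:

```python
import heapq

def a_star(graph, heuristic, start, goal):
    """A* over an adjacency dict {node: [(neighbor, cost), ...]}.
    heuristic(n) estimates remaining cost; heuristic = lambda n: 0.0
    makes this exactly Dijkstra's algorithm."""
    open_set = [(heuristic(start), 0.0, start, [start])]
    best_g = {start: 0.0}
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path, g
        if g > best_g.get(node, float("inf")):
            continue  # stale queue entry, a cheaper path was already found
        for neighbor, cost in graph.get(node, []):
            ng = g + cost
            if ng < best_g.get(neighbor, float("inf")):
                best_g[neighbor] = ng
                f = ng + heuristic(neighbor)
                heapq.heappush(open_set, (f, ng, neighbor, path + [neighbor]))
    return None, float("inf")

# Hypothetical campus walkway graph with costs in metres.
campus = {
    "Gate": [("Lobby", 2), ("Quad", 5)],
    "Lobby": [("Lab", 3)],
    "Quad": [("Lab", 1)],
}
path, cost = a_star(campus, lambda n: 0.0, "Gate", "Lab")
```

Here the route through the Lobby (2 + 3 = 5) beats the one through the Quad (5 + 1 = 6), so the planner returns `["Gate", "Lobby", "Lab"]` with cost 5.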

  5. Visualization Layer

    The Visualization Layer is responsible for presenting navigation information in AR mode. It overlays virtual elements onto the live camera feed, including:

    • Direction arrows

    • Route paths

    • Distance indicators

    • Destination markers

These visual elements are updated in real time based on sensor inputs and user movement. The primary goal of this layer is to provide clear, accurate, and easy-to-follow navigation guidance without overwhelming the user.
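For instance, the on-screen distance indicator can be derived from two GPS fixes with the haversine formula (an illustrative sketch; the helper names here are our own, not the system's):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes,
    using a mean Earth radius of 6371 km."""
    R = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2.0 * R * math.asin(math.sqrt(a))

def distance_label(metres):
    """Format the distance indicator shown in the AR overlay."""
    return f"{metres:.0f} m" if metres < 1000 else f"{metres / 1000:.1f} km"
```

Recomputing this against the active destination on every position update is what keeps the indicator counting down as the user walks.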

V. METHODOLOGY

The proposed AR-based navigation system follows a structured methodology aimed at achieving accurate real- time navigation through augmented visualization and sensor-based tracking.

  1. Requirement Analysis

    The initial phase identifies limitations in traditional navigation systems. Requirements are classified into functional requirements such as real-time AR navigation and visual guidance, and non-functional requirements including accuracy, performance, and usability.

  2. System Design

    A modular system architecture is developed to integrate multiple components efficiently. The system utilizes Android Studio for application development, ARCore for augmented reality processing, and Google Maps API for route computation and geolocation services.

  3. Data Acquisition

    Data is collected in real time using GPS sensors and inertial measurement unit (IMU) sensors, including accelerometer and gyroscope. This data is used to determine user position, orientation, and movement direction.

  4. AR Implementation

    ARCore is used to enable motion tracking, environmental understanding, and plane detection. Navigation instructions such as arrows, directional markers, and route overlays are rendered onto the camera feed for intuitive guidance.

  5. Pathfinding and Route Conversion

    Optimal path computation is performed using shortest path algorithms. The generated route is converted into AR-compatible spatial coordinates to ensure correct alignment with the real-world environment.
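One common way to perform this conversion (a sketch under the assumption of short walking routes, not necessarily the project's exact method) is an equirectangular approximation that maps each GPS waypoint to east/north offsets in metres from the AR session's origin fix:

```python
import math

def gps_to_local_enu(origin_lat, origin_lon, lat, lon):
    """Equirectangular approximation: metres east and north of the origin fix.
    Adequate over the few hundred metres a walking route typically covers."""
    R = 6371000.0  # mean Earth radius in metres
    east = math.radians(lon - origin_lon) * R * math.cos(math.radians(origin_lat))
    north = math.radians(lat - origin_lat) * R
    return east, north

def route_to_local(origin, waypoints):
    """Convert a list of (lat, lon) route points into AR-frame
    (east, north) offsets relative to the session origin."""
    return [gps_to_local_enu(origin[0], origin[1], lat, lon)
            for lat, lon in waypoints]
```

The resulting planar offsets can then be handed to the AR layer as anchor positions, so the rendered route stays registered to the ground as the user moves.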

  6. Testing and Evaluation

The system is tested in real-world scenarios to evaluate performance. Key metrics such as navigation accuracy, latency, tracking stability, and usability are measured to ensure system reliability and effectiveness.

VI. RESULTS

The proposed Augmented Reality (AR) based navigation system was implemented and evaluated in both outdoor and semi-indoor environments to assess its real-world performance and usability. The system was analyzed based on key parameters such as route understanding, navigation accuracy, user interaction, real-time guidance, and spatial awareness.

The results demonstrate that the AR-based navigation system significantly enhances route understanding compared to traditional navigation methods. Users were able to follow directions more effectively due to real-time visual overlays on the live camera feed, which reduced cognitive effort and improved clarity of navigation paths.

In terms of navigation accuracy, the system showed improved performance with reduced directional errors. The integration of GPS data, device sensors, and ARCore-based tracking improved positioning accuracy, particularly in complex environments such as intersections and multi-path roads.

User interaction was found to be more intuitive and engaging. The system provides visual elements such as directional arrows, route paths, and destination markers, which enhance user experience compared to conventional map-based navigation.

The system also provides continuous real-time guidance by dynamically updating navigation instructions based on user movement. This ensures that users receive immediate and context-aware directions during navigation.

Furthermore, spatial awareness was significantly improved, as users could visually connect virtual navigation elements with their real-world surroundings. This helped users maintain better orientation and reduced confusion in unfamiliar environments.

During real-world testing, it was observed that users reached destinations faster using the AR-based navigation system compared to traditional navigation applications. The system also reduced confusion at intersections and provided smooth AR rendering with minimal latency. Users reported improved confidence while navigating unfamiliar locations. Overall, the results confirm that the proposed AR-based navigation system provides a more accurate, interactive, and efficient navigation experience compared to traditional methods.

VII. FUTURE SCOPE

The proposed Augmented Reality (AR) navigation system has strong potential for further improvement and real-world use. Future enhancements can improve system accuracy, usability, and performance across different environments.

  1. Indoor Navigation

    The system can be extended to indoor environments such as malls, hospitals, airports, and campuses. This can be achieved using technologies like Wi-Fi fingerprinting, Bluetooth Low Energy (BLE) beacons, or Ultra-Wideband (UWB). These technologies help in accurate positioning where GPS signals are weak or unavailable.

  2. Voice-Guided Navigation

    Voice assistance can be added to provide spoken navigation instructions. This will make the system more user-friendly and allow hands-free operation, which is especially useful for pedestrians and drivers.

  3. Obstacle Detection using Computer Vision

    Computer vision techniques can be integrated to detect obstacles such as vehicles, pedestrians, or physical barriers in real time. This will improve safety by alerting users about possible hazards during navigation.

  4. Integration with Smart City Infrastructure

    The system can be connected with smart city technologies such as traffic management systems, public transport data, and road monitoring services. This will help in providing optimized routes based on real-time conditions.

  5. AR Glasses and Wearable Devices

In the future, the system can be adapted for AR glasses and wearable devices. This will provide a more immersive and hands-free navigation experience, making it more practical for everyday use.

VIII. CONCLUSION

This paper presented an Augmented Reality (AR)-based navigation system designed to enhance user experience through real-time visual guidance. By integrating AR technology with GPS and sensor data, the proposed system improves navigation accuracy while significantly reducing user cognitive load during route following.

The results demonstrate that overlaying digital directions onto the physical environment makes navigation more intuitive, interactive, and user-friendly compared to traditional map-based approaches. Overall, the system highlights the strong potential of AR technology in transforming conventional navigation methods into more efficient, immersive, and context-aware solutions suitable for both outdoor and mixed environments.

IX. REFERENCES

1. R. T. Azuma, "A Survey of Augmented Reality," Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, pp. 355–385, 1997.

2. Google, "ARCore Developer Guide," [Online]. Available: https://developers.google.com/ar

3. Apple, "ARKit Documentation," [Online]. Available: https://developer.apple.com/augmented-reality/

4. E. W. Dijkstra, "A Note on Two Problems in Connexion with Graphs," Numerische Mathematik, vol. 1, pp. 269–271, 1959.

5. P. E. Hart, N. J. Nilsson, and B. Raphael, "A Formal Basis for the Heuristic Determination of Minimum Cost Paths," IEEE Transactions on Systems Science and Cybernetics, vol. 4, no. 2, pp. 100–107, 1968.

6. T. Langlotz et al., "Sketching Up the World: In Situ Authoring for Mobile Augmented Reality Navigation," IEEE Transactions on Visualization and Computer Graphics, vol. 18, no. 4, pp. 745–754, 2012.

7. S. Zollmann et al., "Augmented Reality for Navigation: A Survey," IEEE Transactions on Visualization and Computer Graphics, vol. 25, no. 1, pp. 1–20, 2019.

8. M. Billinghurst, A. Clark, and G. Lee, "A Survey of Augmented Reality," Foundations and Trends in Human-Computer Interaction, vol. 8, no. 2–3, pp. 73–272, 2015.

9. J. Carmigniani et al., "Augmented Reality Technologies, Systems and Applications," Multimedia Tools and Applications, vol. 51, no. 1, pp. 341–377, 2011.

10. J. Grubert and R. Grasset, "Augmented Reality for Android Application Development," IEEE Computer Graphics and Applications, vol. 33, no. 4, pp. 6–9, 2013.