
AI-Enabled Drone for Precision Spraying and Field Monitoring

DOI: 10.17577/IJERTV15IS043136



Aruna R

Assistant Professor

Department of Computer Science and Engineering, Sri Krishna Institute of Technology, Chikkabanavara, Bangalore, India

Chandana M

Department of Computer Science and Engineering, Sri Krishna Institute of Technology, Chikkabanavara, Bangalore, India

Likitha R

Department of Computer Science and Engineering, Sri Krishna Institute of Technology, Chikkabanavara, Bangalore, India

Abstract – Precision agriculture is evolving with the integration of intelligent technologies aimed at improving crop productivity while minimizing resource usage. However, most advanced agricultural drones rely on complex artificial intelligence models and expensive hardware, making them inaccessible to small-scale farmers. This paper presents a cost-effective AI-assisted drone system designed for real-time crop monitoring and decision support. The proposed system utilizes an ESP32-based camera module to stream live video over Wi-Fi and applies HSV color-space-based image analysis to identify variations in crop health. Instead of relying on dataset-trained models, a rule-based approach is used to classify plant conditions such as healthy, stressed, or diseased based on color distribution. The system follows a human-in-the-loop methodology, where farmers analyze the output and manually perform precision spraying. This approach reduces computational complexity, lowers cost, and improves practical usability. Experimental observations demonstrate that the system effectively supports crop monitoring and targeted intervention, making it suitable for small-scale agricultural applications.

  1. INTRODUCTION

    Agriculture plays a critical role in sustaining global food demand, yet traditional farming methods often rely on manual inspection and uniform pesticide application. These practices result in increased labor effort, inefficient use of chemicals, and delayed detection of crop diseases. With advancements in technology, precision agriculture has emerged as a promising solution to enhance productivity and resource efficiency.

    Modern agricultural systems increasingly utilize drones and artificial intelligence for automated crop monitoring and spraying. While these systems provide high accuracy and automation, they often depend on large datasets, high computational resources, and costly hardware components. As a result, such solutions are not practical for small-scale farmers who require affordable and easy-to-use systems.

    To address this gap, the proposed work introduces an AI-assisted agricultural drone that focuses on real-time monitoring and decision support rather than full automation. The system employs an ESP32 camera module to capture and stream live field images using Wi-Fi communication. HSV color-space-based image processing is used to analyze plant conditions by detecting color variations associated with crop health. Unlike deep learning approaches, the system uses rule-based classification, reducing computational requirements and eliminating the need for large datasets.

    The design follows a human-in-the-loop approach, where the farmer interprets the system output and controls the spraying process manually. This ensures safety, reliability, and better adaptability to varying field conditions. The proposed system aims to provide a balance between technological capability and practical usability, making precision farming accessible to small-scale agricultural communities.

  2. LITERATURE SURVEY

    Recent developments in precision agriculture have led to the integration of unmanned aerial vehicles and intelligent systems for crop monitoring. Early approaches relied on manual observation and simple imaging techniques, which were time-consuming and prone to inaccuracies. To improve efficiency, researchers introduced image processing methods based on handcrafted features such as color histograms, textures, and edges. Although these techniques provided basic automation, they were highly sensitive to environmental conditions and lacked robustness.

    With the advancement of artificial intelligence, deep learning models such as convolutional neural networks have been widely adopted for crop disease detection. These models are capable of learning complex patterns from large datasets and have shown high accuracy in identifying plant diseases. However, they require extensive training data, powerful processing units, and continuous updates, which increases system cost and complexity.

    Advanced sensing technologies such as multispectral and hyperspectral imaging have also been explored for detailed crop analysis. These methods capture information across multiple wavelengths, enabling early detection of plant stress. Despite their effectiveness, the requirement of specialized sensors and high processing capability limits their application in low-cost systems.

    To overcome these limitations, recent research has focused on lightweight and efficient methods such as color-space analysis. HSV-based image processing has been identified as a reliable approach for outdoor environments, as it separates color information from illumination. Studies indicate that HSV-based detection can effectively identify visible symptoms such as discoloration and leaf damage.

    Additionally, embedded systems with wireless communication capabilities have enabled real-time monitoring solutions. The concept of combining AI-based analysis with human decision-making has gained importance, as it enhances reliability and reduces operational risk. These developments highlight the need for affordable, practical, and user-friendly systems, which is addressed in the proposed work.

  3. EXISTING APPROACHES

    Existing agricultural drone systems can be broadly classified into manual, semi-automated, and fully autonomous systems. Manual methods rely entirely on human observation, making them inefficient for large-scale farming. Semi-automated systems provide drone-based monitoring with limited image processing capabilities, assisting farmers in decision-making.

    Fully autonomous systems represent the most advanced solutions, incorporating deep learning algorithms, GPS navigation, and automated spraying mechanisms. These systems can detect diseases and perform spraying without human intervention. However, they involve high costs, require skilled operation, and are difficult to maintain in rural environments.

    Many existing solutions also depend on cloud-based processing and large datasets, which increases latency and limits usability in areas with poor connectivity. Additionally, variations in environmental conditions such as lighting and crop diversity affect detection accuracy.

    The lack of affordability, complexity of operation, and limited adaptability are major drawbacks of current systems. These challenges highlight the need for a simplified approach that balances intelligence with usability.

  4. PROPOSED SYSTEM

    The proposed system presents a cost-effective AI-assisted agricultural drone designed to support real-time crop monitoring and decision-making in precision farming. Unlike conventional systems that rely on complex deep learning models and autonomous operation, this approach emphasizes simplicity, affordability, and practical usability for small-scale farmers. The system integrates an ESP32-based camera module mounted on a drone platform, which captures live video of the crop field and transmits it over a Wi-Fi network. This eliminates the need for external infrastructure such as cloud servers or high-end processing units.

    The core functionality of the system lies in its ability to analyze crop health using HSV color-space-based image processing. Instead of relying on large datasets or computationally intensive models, the system applies rule-based logic to classify plant conditions. By analyzing variations in hue, saturation, and brightness, the system identifies visible symptoms such as discoloration, dryness, or abnormal patterns in leaves. The results are displayed in real time to the user through a simple interface.

    A key feature of the proposed system is its human-in-the-loop design. Rather than automating all operations, the system assists the farmer by providing visual insights and classification results, while leaving the final decision-making and spraying control to the user. This approach improves reliability, reduces risk, and ensures better adaptability to diverse agricultural environments. Overall, the system is designed to bridge the gap between traditional farming and high-cost automation by offering an intelligent yet accessible solution.

  5. METHODOLOGY

    The methodology of the proposed system follows a structured workflow that integrates image acquisition, processing, classification, and decision support. Initially, the ESP32 camera module captures continuous video frames of the agricultural field during drone operation. These frames are transmitted in real time to a connected user device through a Wi-Fi-based communication channel, ensuring low latency and direct accessibility.

    Once the video stream is received, individual frames are processed using HSV color-space transformation. This step converts raw pixel values into hue, saturation, and value components, which are more suitable for analyzing color variations under changing lighting conditions. The system then evaluates pixel distributions and calculates the proportion of different color regions present in the frame.
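The conversion step above can be illustrated with Python's standard colorsys module (an illustrative sketch, not the paper's client-side JavaScript implementation). The example shows why HSV suits outdoor imagery: two greens of different brightness share the same hue, so hue-based rules tolerate illumination changes that would shift all three RGB channels.

```python
import colorsys

def rgb_to_hsv_pixel(r, g, b):
    """Convert one 8-bit RGB pixel to HSV: hue in degrees, S/V in percent."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s * 100.0, v * 100.0

# Pure green and a darker green share the same hue (120 degrees); only V
# differs, which is why hue-based rules are stable under changing lighting.
print(rgb_to_hsv_pixel(0, 255, 0))  # → (120.0, 100.0, 100.0)
print(rgb_to_hsv_pixel(0, 128, 0))  # hue and saturation unchanged, V ≈ 50.2
```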

    Based on predefined threshold rules, the system classifies the crop condition into categories such as healthy or potentially diseased. For instance, a higher proportion of green pixels indicates healthy vegetation, while the presence of yellow, brown, or dark regions suggests stress or disease symptoms. These classification results are continuously updated and displayed to the farmer in real time.
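The threshold logic described above can be sketched as follows. The hue bands and the 60%/25% cutoffs here are illustrative assumptions, not values taken from the paper; an actual deployment would tune them to the crop and camera.

```python
import colorsys

def classify_frame(pixels, healthy_min=0.60, diseased_min=0.25):
    """Rule-based crop-condition classification from color proportions.

    pixels: iterable of (r, g, b) tuples with 0-255 channels.
    Hue bands and thresholds are illustrative, not from the paper:
    green foliage ~75-160 deg, yellow/brown ~20-75 deg, dark = V < 0.2.
    """
    green = yellow_brown = dark = total = 0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        hue = h * 360
        total += 1
        if v < 0.2:
            dark += 1                       # shadow / necrotic region
        elif 75 <= hue <= 160 and s > 0.25:
            green += 1                      # healthy vegetation
        elif 20 <= hue < 75 and s > 0.25:
            yellow_brown += 1               # yellowing / browning symptom
    if total == 0:
        return "unknown"
    if green / total >= healthy_min:
        return "healthy"
    if (yellow_brown + dark) / total >= diseased_min:
        return "diseased"
    return "stressed"

# Mostly green pixels with a little yellowing → classified healthy.
print(classify_frame([(0, 200, 0)] * 8 + [(180, 160, 40)] * 2))  # → healthy
```

Because the rules operate on aggregate proportions rather than learned features, the same code runs identically on every frame with no training data, which is the trade-off the paper makes against deep-learning accuracy.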

    In addition to live monitoring, the system also supports image capture functionality, allowing frames to be stored for later inspection. This is particularly useful when the system is unable to confidently classify a condition or when further analysis is required. Finally, based on the visual output and classification results, the farmer manually controls the spraying process, ensuring targeted and efficient pesticide application.

    Fig. 1. Use Case Diagram of AI-Enabled Field Monitoring Workflow

    Fig. 2. Data Flow Diagram of Crop Detection and Spraying Process

  6. IMPLEMENTATION

    The implementation of the proposed system involves the integration of both hardware and software components to achieve real-time functionality. On the hardware side, the system utilizes a drone platform equipped with essential components such as motors, electronic speed controllers, a power supply, and an ESP32 camera module. The ESP32 is configured to operate in Wi-Fi access point mode, enabling direct communication with a user device without the need for external network infrastructure.

    The camera module captures images in JPEG format and streams them using an HTTP-based MJPEG protocol. This allows continuous frame transmission with reduced data size, ensuring smooth real-time performance. On the software side, a web-based interface is developed to display the live video feed and provide interaction controls. The interface includes options for viewing real-time analysis results, capturing images, and monitoring color distribution.
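In an MJPEG stream, each multipart chunk carries one complete JPEG image, delimited by the JPEG start-of-image (FF D8) and end-of-image (FF D9) markers. The following stdlib-only Python sketch (an illustration of the framing, not the paper's code) shows how a client could split the raw byte stream into frames; a robust parser would honor the multipart boundary headers instead, since FF D9 can in principle occur inside compressed data.

```python
def extract_jpeg_frames(buffer: bytes):
    """Split a raw MJPEG byte stream into individual JPEG frames by
    scanning for the start-of-image (FF D8) and end-of-image (FF D9)
    markers. Minimal sketch of what an MJPEG client does internally."""
    frames = []
    start = buffer.find(b"\xff\xd8")
    while start != -1:
        end = buffer.find(b"\xff\xd9", start + 2)
        if end == -1:
            break  # incomplete trailing frame: wait for more data
        frames.append(buffer[start:end + 2])
        start = buffer.find(b"\xff\xd8", end + 2)
    return frames

# Two fake "frames" glued together with a multipart boundary between them.
stream = (b"--frame\r\n\xff\xd8AAAA\xff\xd9\r\n"
          b"--frame\r\n\xff\xd8BBBB\xff\xd9\r\n")
print(len(extract_jpeg_frames(stream)))  # → 2
```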

    Image processing is performed on the client side using JavaScript, where each frame is analyzed using HSV conversion algorithms. The system calculates color percentages and applies rule-based conditions to determine crop health status. This client-side processing approach reduces the computational burden on the ESP32 and enables faster response times.

    The spraying mechanism is intentionally kept manual to maintain simplicity and safety. Based on the analysis displayed on the interface, the farmer decides when and where to activate the spraying system. This combination of real-time processing, user interaction, and manual control ensures a balanced and practical implementation suitable for real-world agricultural conditions.

  7. RESULTS AND ANALYSIS

    The developed system was evaluated under practical conditions to assess its performance in real-time crop monitoring and analysis. The ESP32 camera successfully streamed live video over a Wi-Fi network, providing continuous visual feedback to the user without significant delay. The MJPEG streaming approach proved effective in maintaining a stable connection and ensuring smooth frame updates.

    The HSV-based image processing algorithm demonstrated reliable performance in identifying visible crop health variations. The system was able to distinguish between healthy and stressed plants based on color distribution, particularly in detecting yellowing and browning patterns associated with common crop issues. Compared to traditional RGB-based methods, the HSV approach showed improved robustness under varying lighting conditions, such as sunlight and shadows.

    The rule-based classification system provided consistent results with minimal computational requirements. Although the system does not achieve the precision of deep learning models, it offers sufficient accuracy for practical decision support in small-scale farming scenarios. The inclusion of image capture functionality allowed farmers to verify results and make informed decisions.

    The human-in-the-loop design played a significant role in improving overall system reliability. By allowing the farmer to interpret results and control spraying manually, the system reduced the risk of incorrect pesticide application. Additionally, the low-cost design and minimal hardware requirements make the system accessible and scalable for rural deployment.

    Fig. 3. Color-Based Crop Health Detection Results

    Fig. 4. Real-Time Crop Frame Captured by ESP32-CAM

    Fig. 5. AI-Enabled Quadcopter Drone for Precision Spraying

    Fig. 6. Autonomous Drone in Flight over Agricultural Field during Spraying Operation

    Fig. 7. Real-Time Disease Detection Results from Drone Camera Feed

    Fig. 8. Waypoint Configuration Interface for Autonomous Drone Flight

  8. CONCLUSION

The proposed system presents a cost-effective and practical approach to precision agriculture by integrating real-time video streaming, HSV-based image processing, and human-guided decision-making. By avoiding complex dataset-driven models and high-end hardware, the system remains accessible to small-scale farmers while still providing meaningful insights into crop health. The use of color-based analysis enables reliable detection of visible plant stress under varying field conditions with minimal computational requirements.

Furthermore, the human-in-the-loop design ensures that critical actions such as pesticide spraying remain under farmer control, improving safety, trust, and adaptability in real-world environments. Although the system does not offer full automation, it effectively supports informed decision-making and targeted intervention. Overall, the proposed solution demonstrates that affordable, AI-assisted tools can significantly enhance agricultural practices without increasing financial or operational burden.


Ms. Aruna R completed her Master of Technology in the field of Embedded Systems from Aditya Engineering College (JNTUK), Andhra Pradesh. Her main research areas are Embedded Systems, Web-based Instructional Technologies, Programming Concepts, and Artificial Intelligence. She completed her Bachelor of Engineering in Electronics and Communication Engineering from Shirdi Sai Engineering College, Bangalore. She is currently working as an Assistant Professor in the Department of Computer Science and Engineering at Sri Krishna Institute of Technology (SKIT), Bangalore. She has experience in teaching undergraduate and postgraduate students and mentoring academic and technical projects. She has guided students in projects related to IoT, Artificial Intelligence, and sensor-based systems. She has also served as Skill Development Coordinator and Convener of the Entrepreneurship Development Cell (EDC). She is actively involved in internship programs and student development initiatives. She has secured two KSCST-funded student projects and has published several papers in journals and conferences.