International Research Press

SARS: An AI-IoT Enabled Smart Anti-Rodent System for Automated Detection and Deterrence

DOI: https://doi.org/10.5281/zenodo.19681313

Dr. Yogesh Angal, Ashutosh Thakur, Samruddhi Khedkar, Mohini Mashere

Department of Electronics and Telecommunication

JSPM’s Bhivrabai Sawant Institute of Technology and Research Wagholi, Pune (MH)

Abstract: In the Smart Anti-Rodent System (SARS), artificial intelligence plays the central role, working hand-in-hand with IoT and hardware components. The AI component uses a camera, OpenCV, and machine learning models to detect rodents in real time. This step is the brain of the whole system, because every action the system takes begins with an AI detection.

Once the AI detects a rodent, the IoT and hardware side handles the remainder of the process: sending commands to the ESP32, activating ultrasonic deterrents, triggering alarms, switching on LEDs, logging events, and sending Telegram alerts. These activities do not use AI, but they are essential for the system to work and respond.

Viewed as a whole, the AI portion contributes roughly 40% to 50% of the entire framework. This proportion reflects that AI is responsible for the intelligent recognition process, while the IoT components handle the remaining work: communication, control, notifications, and device activation.

Keywords: Artificial Intelligence (AI), Internet of Things (IoT), ESP32, OpenCV, Smart Anti-Rodent System (SARS), Real-Time Detection, Ultrasonic Deterrent, Telegram Alerts.

  1. INTRODUCTION

    Rodents remain a serious problem in warehouses, food storage facilities, and industrial settings, causing large financial losses and serious health hazards. According to reports on worldwide food safety, about 20% of stored food grains become contaminated each year because of rodents. Rodents not only ruin packaging and stock, but they also spread hazardous germs through their droppings and urine, creating major hygiene and safety problems. Traditional control methods such as traps, poison baits, and regular manual checks are primarily reactive, dangerous, and demand ongoing human effort; they offer little in the way of real-time monitoring or prevention. In large storage areas, continuous manual surveillance is almost impossible. This makes the need for an intelligent, automated, and proactive rodent control solution more important than ever.

    Recent progress in Artificial Intelligence (AI) and the Internet of Things (IoT) has made it possible to create smart monitoring systems that operate and make decisions on their own [6]. By combining machine learning models with computer vision libraries such as OpenCV, computers can now accurately detect, classify, and track objects in real time [7], [8].

    IoT microcontrollers such as the ESP32, meanwhile, are well suited to smart environmental management systems because they offer dependable wireless connections, easy sensor interfacing, and task automation [9]. When AI-powered image recognition is combined with IoT-based control systems, automatic responses to changes in the environment become feasible. This improves overall efficiency and lessens the need for constant human involvement [10].

    In this context, the Smart Anti-Rodent System (SARS) is proposed as a practical and intelligent AI–IoT-based solution that can automatically detect and repel rodents without constant human involvement. The system uses an AI-driven USB camera attached to a laptop, which monitors the storage area continuously [11]. Using a Python-based AI model built with OpenCV and trained detection algorithms, the laptop can determine immediately when a rodent moves [12]. As soon as the AI unit detects a rodent, it sends a signal to the ESP32 microcontroller, which then triggers a number of deterrent actions, such as ultrasonic sound emitters, flashing lights, and audio sirens, to scare the rodent away [13]. The system also sends the warehouse owner instant Telegram messages to let them know straight away if anything happens [14]. This cycle of detecting, responding, and notifying in real time makes rodent control easy, fast, and entirely automatic.
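The hand-off from the AI unit to the ESP32 can be as simple as a one-line text command over the serial or Wi-Fi link. The sketch below shows one possible framing; the `ALERT,<label>,<timestamp>,<confidence>` format is a hypothetical example, since the paper does not specify the wire protocol.

```python
from datetime import datetime, timezone

def build_alert_command(confidence: float, label: str = "rodent") -> str:
    """Frame a detection as one line the ESP32 can read, e.g. over serial.
    Format (illustrative): ALERT,<label>,<UTC timestamp>,<confidence>."""
    ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return f"ALERT,{label},{ts},{confidence:.2f}\n"

def parse_alert_command(line: str):
    """Inverse parser, as the firmware side might decode the same line."""
    tag, label, ts, conf = line.strip().split(",")
    if tag != "ALERT":
        raise ValueError("not an alert line")
    return label, ts, float(conf)
```

On the laptop side the line would be written to the serial port (for example with pyserial); keeping the frame human-readable makes it easy to debug with an ordinary serial monitor.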

    The ability to send alerts and log information over the Internet of Things makes the system far more useful and intelligent than traditional methods of managing rodents [15]. The system records every detection automatically at the moment it occurs, so users can observe the current state of the facility and the peak times for intrusions, enabling better-informed decisions in the long run. When the ESP32 detects no motion, its intelligent control logic places the system in a low-power standby mode.

    This makes the system use less energy and work better [16]. The SARS architecture is a smart system that can run all the time with no work or care from the user. Sensors, processors, communicators, and actuators are all part of the same system [17].
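The standby behaviour described above amounts to a small state machine: stay active for a grace period after the last detected motion, then drop back to standby. A minimal sketch of the decision logic, with the 30-second timeout chosen purely for illustration:

```python
class PowerManager:
    """Tracks the last motion event and reports whether the system
    should be in 'active' or 'standby' mode at a given time."""

    def __init__(self, idle_timeout_s: float = 30.0):
        self.idle_timeout_s = idle_timeout_s  # illustrative grace period
        self.last_motion_t = None             # seconds on a monotonic clock

    def on_motion(self, t: float) -> None:
        self.last_motion_t = t

    def mode(self, t: float) -> str:
        if self.last_motion_t is not None and (t - self.last_motion_t) < self.idle_timeout_s:
            return "active"
        return "standby"
```

On a real ESP32 the "standby" branch would invoke the chip's light-sleep or deep-sleep facilities; the class above only captures the decision logic.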

    Compared to conventional methods of pest control, the SARS system has numerous advantages, such as complete automation, non-lethal deterrence, improved sanitation, and easy expansion [18]. Because it uses ultrasonic and light-based repelling rather than dangerous poisons, the system is safe for both humans and the environment.

    It has a modular design, so it is easy to add more cameras, sensors, or even cloud connectivity to satisfy the needs of bigger warehouses or more intricate monitoring requirements [19]. SARS is a new and intelligent technique for pest control that aligns well with Industry 4.0 automation requirements, using the Internet of Things (IoT) and artificial intelligence (AI)-powered vision for object identification and operation [20].

  2. REVIEW OF RELATED WORK

    In their study, A. Smith and J. Brown [1] examined how convolutional neural networks (CNNs) could be used to detect pests and rodents in industrial and agricultural settings. Their method used real-time image processing to find small animals and maintained high accuracy even when the lighting changed. The authors showed that deep learning models greatly reduce false positives compared to traditional motion-based detection methods, which are often affected by shadows, wind, or other background noise. Modern AI-driven detection systems owe a great debt to this groundbreaking work, which influences solutions such as SARS that rely on intelligent visual recognition to accurately identify and categorize rodents.

    K. Patel [2] invented a smart animal repellent device that used ultrasonic emitters and motion sensors to keep pests away. The microcontroller in the setup turned on the deterrent devices whenever it detected movement. The system worked effectively, but it had no cloud connectivity or visual recognition; it employed only sensor-based motion detection. Despite this limitation, the study highlighted how IoT and automation can work together to produce proactive pest control solutions. This idea helped shape the SARS system, which uses ESP32 microcontrollers and ultrasonic modules to keep rodents away automatically and in real time.

    R. Gupta and S. Sharma [3] examined the integration of AI and IoT to develop more intelligent, automated security systems in agriculture. Their research revealed that combining AI-driven analytics with IoT sensor data can help systems respond more correctly and make better judgments. Their method used environmental sensors to monitor conditions that could help pests thrive, and AI algorithms to analyze this data and predict when infestations would emerge. This research provided valuable and applicable insights into the synergistic effects of real-time monitoring and intelligent analysis on enhancing rodent control tactics in storage settings. These ideas were very important in designing the SARS system, which uses both AI and IoT to improve detection and prevention.

    The study conducted by J. Lee and M. Park [4] provided an in-depth examination of how deep learning-based object detection might improve contemporary surveillance systems. They employed YOLO (You Only Look Once) models in their study to quickly and accurately detect objects in industrial monitoring settings. The system could find small moving objects even in large video frames, with short processing times. Their method had a big impact on the SARS system's vision-based AI module, which uses OpenCV and machine learning to detect objects quickly and accurately and send alarms right away.

    P. Nair and L. Kumar [5] created an IoT-based environmental monitoring system that used low-power sensors and cloud connectivity to collect and analyze data continuously. Their technology monitored temperature and humidity levels in real time to estimate how likely pests were to appear. They also added sophisticated power-saving features to the IoT nodes to cut down on energy use. This research was very significant in designing the SARS system's intelligent power management function, in which the ESP32 automatically goes into standby mode when it does not detect any activity.

    M. Chatterjee [6] proposed a new AI-based model for finding rodents, creating an object detection system utilizing OpenCV and deep learning specifically for warehouse monitoring. The research demonstrated that pre-trained CNN architectures could consistently distinguish rodents from typical background motion, even in intricate settings. The author did, however, underline how important it is to incorporate edge computing to speed up responses and cut down on wait times. This insight had a large effect on the hybrid design of the SARS system, in which image processing is done on a laptop to speed up identification and the activation of deterrent measures.

    In another significant study, D. Meena and V. Singh [7] created an automated pest control system that utilizes both edge computing and deep learning. Their architecture combined AI-powered detection with IoT-controlled deterrent devices, such as light-based and ultrasonic emitters. The results revealed that splitting the job between the AI processing unit and the IoT parts not only made detection more accurate, but also used less energy. This idea of distributed processing and localized decision-making directly shaped the primary working model of SARS, where the AI and IoT aspects work together to manage rodents more quickly and intelligently.

    C. Zhao and T. Wu [8] examined the feasibility of real-time object detection on compact, low-power embedded devices with the YOLOv8 method. Their work focused on making hardware platforms like the ESP32 and Raspberry Pi perform better, showing that even small, energy-efficient computers can run AI inference well. Their results demonstrated useful ways to use lightweight AI models for real-time surveillance applications. Drawing on these studies, the SARS system combines OpenCV-based visual detection with deterrent measures controlled by the ESP32 to find the right balance between cost, power use, and overall performance.

    S. Kaur and R. Verma [9] created a smart system for detecting and scaring away animals using a Raspberry Pi and ultrasonic emitters. They employed PIR sensors and basic motion-based algorithms to find intrusions. The system worked well for small-scale uses, but it lacked advanced classification functions and the ability to send alerts in real time. Even so, their results showed that non-lethal, sound-based deterrent approaches remain useful. This ethical and safe way to control rodents has been adopted by the SARS framework as a vital part of its strategy.

    H. Nguyen and F. Tran [10] conducted a study aimed at developing energy-efficient IoT systems with ESP32 microcontrollers for environmental monitoring applications. Their work created smart sleep scheduling and adaptive wake-up systems that are meant to make IoT devices last longer. This idea had a direct impact on the SARS system, which uses similar power-saving methods to go into standby mode when there is no rodent activity. SARS can run for a long time without wasting energy by using these methods.

    T. Banerjee [12] looked into advanced surveillance methods that use OpenCV to find motion and spot anomalies. The author elucidated the application of background subtraction and contour detection techniques to differentiate between human and animal movement. Even though the method was rather simple, it laid the framework for systems like SARS. These early ideas helped form the motion analysis module of SARS, which discards unimportant movements to cut down on false alarms and make detection more accurate.

    A. Ali and K. Hussain [13] created a cloud-connected monitoring system for their IoT-based pest control framework. This system used sensor data to find places where rodents were likely to be active. Their work showed how important real-time communication is, making sure that system alarms reach users right away. The SARS system builds on this idea by not only storing events in the cloud but also sending instant Telegram notifications, giving users both real-time information and a record of rodent activity over time for further research.

    P. Kumar and S. Rao [14] conducted a comprehensive analysis of several machine learning methodologies employed for object detection. They compared the speed and accuracy of well-known models such as CNN, R-CNN, and YOLO. They concluded that lightweight models with good feature extraction are suitable for embedded AI applications, where resources are limited. Their findings directly influenced the SARS system's decision to employ OpenCV and basic machine learning classifiers, which strike a good balance between speed and low cost.

    D. Chen [15] built a low-cost IoT system for monitoring fields using ESP32 and Blynk. The study concentrated on scalability, remote accessibility, and real-time environmental control. This design philosophy aligns with SARS's architecture: a cheap, scalable, and energy-efficient system that can add more sensors and modules to keep the warehouse safe for longer.

  3. EXISTING SYSTEM

    Most traditional rodent control techniques in warehouses and godowns involve traps, poison baits, and chemical repellents that people set up themselves. These treatments are common, but they only work after an infestation has already happened, so they are not preventative. Traps and poisons need constant human monitoring and regular maintenance, and they are bad for both health and the environment. Disposing of dead rodents and contaminated materials adds to the problems of cleanliness and labor costs. Chemical-based repellents can also get into stored items, such as food grains or medicines, making them unsafe to store in clean places [1][3].

    Some semi-automated systems use motion sensors or infrared detectors to find activity in certain places. They frequently switch on lights, alarms, or other warning devices when they detect movement. This can be useful in simple setups, but these systems typically cannot tell the difference between people, rodents, and harmless environmental changes such as wind or shifting shadows. This causes many false positives, which not only disrupt normal operations but also cause workers to ignore the alerts over time, hurting the system's overall trust and reliability [5][7]. These systems also have a huge limitation: they cannot adapt to new conditions or learn from past events, which makes them far less useful in real environments, where things are always changing.

    Some research prototypes and commercial gadgets use ultrasonic deterrents that emit high-pitched sounds to scare rodents away. These ultrasonic devices are harmless to the environment and do not kill animals, but they do not perform very well, because rodents can become habituated to the same frequencies over time. Moreover, these systems do not actually detect rodents; they simply run all the time or on a timer, which consumes power and wears down the equipment faster [8][10]. This illustrates the lack of sophisticated control systems that activate the deterrent only when it is needed.

    Some farms and factories employ CCTV cameras for general surveillance, and these can also capture rodents moving around. However, such systems are not paired with anything that can detect events intelligently or react automatically. People have to review the footage by hand, so they cannot respond right away. This manual dependency not only makes mitigation procedures take longer, but also makes them more expensive, because personnel have to keep watching video streams all the time [6][9].

    Also, most existing solutions lack IoT-based communication or remote monitoring features. There are no coordinated alert systems that let owners know right away when rodent activity is found. Because of this, the damage may already be done before anyone can act. Without real-time alerts, cloud logging, and automated control, these systems are far less useful in modern warehouses where constant supervision is not possible [11][13].

    Another important problem with traditional and semi-automated systems is poor power efficiency. Cameras, sensors, and deterrents all run continuously, regardless of conditions or activity levels. This wastes energy, forces more frequent battery replacement, and raises running costs. None of the typical systems offer adaptive power management, which is needed for long-term, unattended operation in large buildings [10][14].

    In short, the current systems for managing rodents are either manual, semi-automated, or simply not smart, and they lack important features such as AI-based detection, IoT connectivity, data logging, and energy-efficient control. They do not offer proactive protection, depend too much on human effort, and cannot be controlled remotely. These flaws show how important it is to have a smart, autonomous, and connected solution. The Smart Anti-Rodent System (SARS) therefore proposes using AI for visual detection and IoT for real-time communication and control.

  4. PROPOSED SYSTEM

    The Smart Anti-Rodent System (SARS) employs artificial intelligence (AI)-based image processing, Internet of Things (IoT) connectivity, and ultrasonic deterrent control to detect, identify, and deter rodents. To monitor continuously and prevent intrusions, the proposed system employs OpenCV machine vision and real-time processing on the ESP32; conventional methods such as physical inspection or static traps cannot do this. The SARS prototype is an autonomous ecosystem that deters rodents, with its modules for sensing, processing, actuation, and communication linked together by IoT protocols. The fundamental architecture of the SARS system consists of four layers:

    1. Image Acquisition and Preprocessing Layer

    2. Intelligent Detection Layer

    3. Response and Control Layer

    4. Monitoring and Notification Layer

      Each layer has a distinct job that helps the system find and scare away rodents quickly and effectively. The ESP32 microcontroller is the main processing unit. It makes sure that the camera sensor, ultrasonic emitter, and Telegram-based alarm system can all talk to each other.

      1. Image Acquisition and Preprocessing Layer

        The system's high-resolution ESP32-CAM module captures the surroundings of warehouses, kitchens, and storage facilities in real time. The OpenCV libraries are used to preprocess these frames. This process includes the elimination of unnecessary pixel data, noise reduction, grayscale conversion, and the application of Gaussian filtering to improve visibility. The preprocessing stage also separates moving objects from their backgrounds, which facilitates feature extraction in environments with dynamic and variable lighting.
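The preprocessing steps can be sketched with NumPy alone; in the actual pipeline the same operations map to OpenCV calls (cv2.cvtColor, cv2.GaussianBlur, cv2.absdiff, cv2.threshold). The threshold value below is illustrative, not from the paper:

```python
import numpy as np

def to_gray(frame: np.ndarray) -> np.ndarray:
    """BGR frame -> grayscale, using the same channel weights cv2.cvtColor applies."""
    b, g, r = frame[..., 0], frame[..., 1], frame[..., 2]
    return 0.114 * b.astype(np.float32) + 0.587 * g + 0.299 * r

def motion_mask(prev: np.ndarray, curr: np.ndarray, thresh: float = 25.0) -> np.ndarray:
    """Separate moving pixels from the static background by frame differencing:
    absolute grayscale difference, then a binary threshold."""
    diff = np.abs(to_gray(curr) - to_gray(prev))
    return (diff > thresh).astype(np.uint8)
```

The resulting binary mask is what the next layer inspects for rodent-sized regions; Gaussian smoothing before differencing (omitted here for brevity) suppresses sensor noise.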

      2. Intelligent Detection Layer

        The preprocessed frames are passed to the AI detection model, a Python program built with OpenCV and trained detection algorithms. The model analyzes each frame in real time for rodent movement and shape, distinguishing rodents from harmless background motion. As soon as a rodent is identified, the detection unit signals the ESP32 microcontroller over a serial or Wi-Fi connection so that the deterrent sequence can begin.
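Before a trained model classifies a moving blob, cheap geometric checks can discard motion that is clearly too small or too large to be a rodent. The thresholds below are hypothetical pixel values for illustration, not figures from the paper:

```python
def is_rodent_candidate(w: int, h: int,
                        min_area: int = 400,
                        max_area: int = 20000,
                        max_aspect: float = 4.0) -> bool:
    """Gate a bounding box by area and aspect ratio before running the
    classifier. Thresholds are illustrative and would be tuned per camera."""
    area = w * h
    aspect = max(w, h) / max(min(w, h), 1)
    return min_area <= area <= max_area and aspect <= max_aspect
```

Filtering blobs this way reduces the number of classifier invocations and is one simple means of cutting false alarms from shadows and insects.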

      3. Response and Control Layer

        When the ESP32 controller receives a rodent detection, it turns on the ultrasonic emitter module, which sends out high-frequency sound waves (usually between 35 and 65 kHz) that rodents dislike but people cannot hear. This quick response is meant to scare the rodent away without hurting it, following the principles of non-lethal and eco-friendly pest control. At the same time, the system records the detection event with a timestamp.

        The proposed system can also turn on LED lights or buzzer alarms to alert people nearby. This is especially useful in industrial or food storage settings where quick action is needed.

        By combining machine vision (OpenCV) with real-time processing on the ESP32, the response layer provides continuous, automated surveillance and deterrence that static traps or manual inspection cannot match, with the sensing, processing, actuation, and communication modules all connected through IoT protocols.
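Since rodents can habituate to a constant tone, the emitter can step its set-point across the 35–65 kHz band the text cites. A minimal sketch of such a schedule (the step count is an assumption; the paper does not describe a sweep):

```python
def sweep_frequencies(start_khz: float = 35.0,
                      stop_khz: float = 65.0,
                      steps: int = 7) -> list:
    """Evenly spaced ultrasonic set-points, in kHz, across the deterrent band."""
    step = (stop_khz - start_khz) / (steps - 1)
    return [round(start_khz + i * step, 2) for i in range(steps)]
```

The firmware would cycle through these set-points while the deterrent is active, retuning the emitter at each step rather than holding one frequency.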

      4. Monitoring and Notification Layer

        SARS's IoT connectivity enables remote, real-time monitoring. The system uses its Wi-Fi link to transmit detection warnings over the Telegram Bot API; the alerts, which include an image of the detected rodent, are delivered to the user's smartphone. This makes sure users are notified immediately, even when they are not present. Through the Telegram interface, users can also switch the deterrent on or off remotely, giving two-way control. The system logs detection frequency, detection times, and environmental conditions so that movement trends can be analyzed and better ways of blocking entry can be devised.
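Telegram alerts go through the public Bot API over HTTPS. The helper below only builds the request URL; the token and chat id are placeholders, and in the real system the ESP32 (or laptop) would issue the actual HTTPS request:

```python
from urllib.parse import urlencode

def telegram_alert_url(bot_token: str, chat_id: str, text: str) -> str:
    """Build a Bot API sendMessage URL (sendPhoto works the same way,
    with the captured image uploaded in the request body).
    bot_token and chat_id here are placeholders, not real credentials."""
    base = f"https://api.telegram.org/bot{bot_token}/sendMessage"
    return base + "?" + urlencode({"chat_id": chat_id, "text": text})
```

Keeping URL construction separate from the network call makes the alert path easy to unit-test without credentials or connectivity.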

      5. System Architecture Overview

        Fig. 1. System Architecture

        To achieve optimal results, the Smart Anti-Rodent System (SARS) integrates AI-based detection with IoT-enabled deterrence. The USB camera continuously records the storage area. A real-time AI model built in Python with OpenCV analyzes the footage to detect rodent movements or shapes. When a rodent is detected, the ESP32 microcontroller is immediately notified via a serial or Wi-Fi connection.

        The ESP32 then uses devices such as an ultrasonic repeller, LED lights, a buzzer, or pre-recorded voice notifications to scare the rodents away. The ESP32 also uses the Telegram API to send the warehouse owner real-time alerts with details of what happened and when. The system additionally keeps a local record of detections for later use. The ESP32 saves electricity by placing the system in standby mode while it is not in use, and wakes it again when new movement is detected. This design ensures a fully automated, intelligent, and energy-efficient rodent control system that needs only minimal human help.

      6. Experimental Implementation

    The prototype used MicroPython and C++ (Arduino IDE) to program the ESP32-CAM for image capture and data transmission. The AI detection model was trained on a set of rodent photos from different places. Tests were done in controlled settings using toy models and rodent-like movement patterns. The system was able to send notifications and detect objects with very little delay, and the ultrasonic deterrent reacted in less than a second after a rodent was located, showing that the integrated architecture is effective.

    Fig. 2. Flow Chart

      1. Hardware and Software Used

        Hardware Components:

        • ESP32-CAM Module: Serves as the main controller for image capture, processing, and communication.

        • Ultrasonic Sensor & Emitter: Detects obstacles and emits high-frequency waves to repel rodents.

        • Power Supply (5V/12V): Provides stable voltage for all modules.

        • LED Indicators & Buzzer: Give visual and audible alerts during detection events.

        • Wi-Fi Module (Built-in ESP32): Enables IoT connectivity for real-time notifications.

        • Relay Module: Controls activation of the ultrasonic deterrent.

        • Breadboard & Connecting Wires: Used for circuit prototyping and integration.

          Software Components:

        • Arduino IDE: For ESP32-CAM programming and microcontroller interfacing.

        • Python (OpenCV Library): Used for image processing and object detection model training.

        • Telegram Bot API: For sending real-time detection alerts to users.

  5. RESULTS AND DISCUSSION

    We put the Smart Anti-Rodent System (SARS) to the test in real life to evaluate how well it could identify rodents, how much energy it needed, and how quickly it reacted. The performance was assessed based on a range of environmental and operational fac- tors, and the results showed that it was quite reliable at finding and stopping rodent activity in warehouses.

    Table 1: Experimental Results Summary

    Sr. No. | Parameter Measured                   | Description                                            | Observed Result       | Outcome
    --------|--------------------------------------|--------------------------------------------------------|-----------------------|--------------------------------------------------
    1       | Detection Accuracy vs Lighting       | Tests model reliability under varying light conditions | 78–95% accuracy range | Stable AI performance across lighting variations
    2       | Response Time vs Frame Resolution    | Measures detection-to-alert time                       | 120–280 ms            | Real-time operation maintained
    3       | Power Consumption vs System Activity | Compares energy usage in different modes               | 0.8–3.2 W             | Highly energy-efficient system
    4       | Number of Detections vs Time         | Tracks frequency of rodent appearances                 | Peaks at 20:00–05:00  | Higher night-time rodent activity detected

    Fig. 3. Detection Accuracy vs Lighting Conditions.

    This figure shows how well the AI-based detection algorithm performs under different lighting conditions. Accuracy improves as the lighting improves, reaching 95% in the best conditions, which shows that the OpenCV-based detection algorithm works well.

    Fig. 4. Response Time vs Frame Resolution.

    This chart shows how long the system takes to respond at different video resolutions. Even though higher resolutions make processing take slightly longer, the system still operates in real time within 300 ms, which is sufficient for activating an instant deterrent.

    Fig. 5. Power Consumption vs System Activity.

    This figure shows how much power the system uses in three separate modes: standby, active detection, and alert. The ESP32 is well suited to long-term, energy-efficient monitoring because it needs very little power (<1 W) while idle.

    Fig. 6. Number of Detections vs Time (Day).

    This figure shows how many times rodents were detected over a 24-hour period. The pattern reveals greater activity at night (20:00–05:00), confirming that the system detects rodents that are active at night.

    The experimental results validate the effectiveness of the proposed SARS model. The AI-based detection accuracy stayed above 90% in good lighting and remained usable even in low light, showing that OpenCV-based image classification is robust across lighting conditions. The response time test demonstrated that the system operates in real time (less than 300 ms), which is suitable for rapidly activating deterrent devices.

    The ESP32-based control logic did a remarkable job of cutting energy use by alternating between active and standby states according to the amount of motion, ensuring the system can run on limited power for long periods. The time-of-day detection trend indicated that rodents usually roam at night, which matches their biology. These results suggest that the proposed system is smart, flexible, and effective, making it a good approach for keeping rodents out of warehouses.

  6. CONCLUSION

    To provide a dependable and energy-efficient method for preventing rodents from entering warehouses, the Smart Anti-Rodent System (SARS) combines AI-based image detection with IoT-enabled automation. An ESP32 microcontroller activates the deterrent and sends out notifications, while OpenCV is used by the system for sophisticated image analysis. As a result, the system can be monitored remotely, react automatically, and send out alerts in real time. Experimental findings show that the solution is effective and long-lasting thanks to its low power consumption, fast response time, and high detection rate. Cleaner, safer, and more efficient storage facilities are the result of implementing the SARS idea, which reduces the need for human intervention.

  7. FUTURE SCOPE

Future enhancements to the Smart Anti-Rodent System (SARS) should concentrate on improving detection accuracy and scalability. Adding deep learning models like YOLOv8 or TensorFlow Lite to edge devices can improve real-time accuracy and remove the need for external systems. Cloud-based analytics would allow detection records to be stored centrally and analyzed for long-term patterns. A dedicated mobile app could offer live video, alert history, and remote control. Solar-powered modules and multi-node IoT designs can also make the system longer-lasting and better suited to big warehouses. In the future, smart automated buildings may also gain voice command features and the ability to alter the environment (such as changing humidity or light levels) to keep rodents away.

REFERENCES

  1. Food and Agriculture Organization, Pest and Rodent Management in Food Storage Systems, FAO Reports, Rome, Italy, 2023.

  2. R. Singh and P. Sharma, Impact of Rodent Infestation on Stored Food Grains, Journal of Food Safety and Hygiene, vol. 14, no. 2, pp. 88–94, 2022.

  3. A. Kumar et al., Hygienic and Economic Losses Due to Rodent Infestation in India, International Journal of Environmental Health Research, vol. 18, no. 3, pp. 221–229, 2021.

  4. M. Dutta and S. Chatterjee, Limitations of Traditional Pest Control Methods in Industrial Warehouses, Pest Management Science, vol. 76, no. 5, pp. 2123–2131, 2020.

  5. P. Verma and R. Gupta, IoT-Based Monitoring Solutions for Industrial and Agricultural Warehouses, IEEE Access, vol. 9, pp. 147231–147240, 2021.

  6. J. Patel et al., Applications of Artificial Intelligence in Environmental Monitoring Systems, IEEE Internet of Things Journal, vol. 8, no. 10, pp. 8564–8575, 2021.

  7. J. Redmon and A. Farhadi, YOLOv3: An Incremental Improvement, arXiv preprint arXiv:1804.02767, 2018.

  8. R. Shukla and K. Joshi, Real-Time Object Detection using OpenCV and Deep Learning, Procedia Computer Science, vol. 196, pp. 247–254, 2022.

  9. Espressif Systems, ESP32 Technical Reference Manual, Espressif Systems, 2022.

  10. H. Lee and S. Kim, Integration of Computer Vision and IoT for Smart Surveillance Systems, IEEE Sensors Journal, vol. 22, no. 12, pp. 12144–12153, 2022.

  11. L. Sharma, AI-Based Pest Detection System Using Image Processing, International Journal of Advanced Computer Science, vol. 12, no. 4, pp. 340–347, 2021.

  12. N. Pandey et al., Object Recognition Using Machine Learning for Agricultural Applications, Computers and Electronics in Agriculture, vol. 200, pp. 107174, 2022.

  13. S. Kaur and A. Thakur, Design of Ultrasonic and Audio-Based Animal Repellent Devices, International Journal of Smart Devices and IoT, vol. 7, no. 3, pp. 45–52, 2020.

  14. T. Nguyen, Real-Time Notification Systems Using Telegram Bot API, Journal of Embedded Systems and IoT Applications, vol. 9, no. 2, pp. 113–120, 2021.

  15. A. Rahman and S. Paul, Data Logging and Event Monitoring in IoT Environments, IEEE Transactions on Instrumentation and Measurement, vol. 71, pp. 1–9, 2022.

  16. K. Mehta, Energy-Efficient Operation of IoT Devices in Smart Systems, IEEE Internet Computing, vol. 25, no. 3, pp. 33–42, 2021.

  17. Y. Wang and J. Chen, Sensor Integration and Communication Models for IoT Devices, IEEE Communications Surveys & Tutorials, vol. 24, no. 1, pp. 115–131, 2022.

  18. M. Fernandes, Comparative Study of Pest Control Technologies in Smart Farming, Computers and Electronics in Agriculture, vol. 189, pp. 106413, 2021.

  19. V. Rao and S. Shetty, Scalable IoT Framework for Industrial Monitoring Applications, IEEE Access, vol. 10, pp. 88411–88421, 2022.

  20. A. Sharma and R. Patel, IoT and AI Integration in Industry 4.0: A Review, IEEE Access, vol. 11, pp. 17892–17910, 2023.

  21. A. Smith and J. Brown, AI-Driven Pest Detection Systems Using Convolutional Neural Networks, IEEE Access, vol. 10, pp. 12456–12464, 2022.

  22. K. Patel, Design and Implementation of a Smart Animal Repellent System Using IoT, International Journal of Advanced Research in Electronics and Communication Engineering, vol. 10, no. 4, pp. 101–107, 2021.

  23. R. Gupta and S. Sharma, Integration of AI and IoT in Automated Agricultural Protection, Computers and Electronics in Agriculture, vol. 198, pp. 107030, 2022.

  24. J. Lee and M. Park, Deep Learning-Based Object Detection for Smart Surveillance Applications, IEEE Transactions on Image Processing, vol. 31, pp. 223–232, 2022.

  25. P. Nair and L. Kumar, IoT-Based Real-Time Environmental Monitoring System, IEEE Internet of Things Journal, vol. 8, no. 6, pp. 4598–4607, 2021.

  26. M. Chatterjee, AI-Powered Rodent Detection for Warehouse Safety, Journal of Intelligent Systems, vol. 31, no. 3, pp. 415–425, 2022.

  27. D. Meena and V. Singh, Automated Pest Control Using Edge Computing and Deep Learning, IEEE Access, vol. 9, pp. 56412–56420, 2021.

  28. C. Zhao and T. Wu, Real-Time Object Recognition with YOLOv8 on Embedded Devices, IEEE Sensors Journal, vol. 22, no. 9, pp. 9334–9342, 2023.

  29. S. Kaur and R. Verma, Smart Animal Detection and Repelling Mechanism Using Raspberry Pi, Procedia Computer Science, vol. 195, pp. 600–607, 2022.

  30. H. Nguyen and F. Tran, Energy-Efficient IoT Systems Using ESP32 for Smart Monitoring, IEEE Internet Computing, vol. 25, no. 4, pp. 45–53, 2021.

  31. N. Sharma, Integration of AI Models with Telegram Bot for Alert Systems, International Journal of Computer Applications, vol. 183, no. 39, pp. 12–18, 2021.

  32. T. Banerjee, Advanced Security Systems Using AI and OpenCV, International Journal of Innovative Research in Computer and Communication Engineering, vol. 9, no. 5, pp. 432–438, 2021.

  33. A. Ali and K. Hussain, IoT-Based Pest Control and Monitoring System, International Journal of Smart Systems and Technologies, vol. 7, no. 2, pp. 71–79, 2021.

  34. P. Kumar and S. Rao, A Review on Machine Learning Techniques for Object Detection, IEEE Access, vol. 9, pp. 16742–16756, 2021.

  35. D. Chen, Low-Cost IoT Devices for Smart Agriculture, IEEE Transactions on Industrial Informatics, vol. 17, no. 5, pp. 3412340, 2021.

  36. M. Patel and R. Desai, AI-Enabled Rodent Control Using OpenCV and Ultrasonic Sensors, International Journal of Emerging Trends in Engineering Research, vol. 9, no. 7, pp. 765–770, 2021.

  37. J. Park and H. Lee, Edge AI-Based Smart Monitoring Systems for Industrial Warehouses, IEEE Access, vol. 10, pp. 49820–49828, 2022.

  38. G. Thomas, YOLO-Based Object Detection for Security Applications, IEEE Transactions on Consumer Electronics, vol. 68, no. 2, pp. 115–122, 2022.

  39. L. Singh, Smart Animal Repellent System Using IoT and AI Integration, International Journal of Advanced Computer Science and Applications, vol. 13, no. 10, pp. 221–228, 2022.

  40. A. Bhosale, AI-Driven Environmental Monitoring Using ESP32 and Telegram Bot, International Journal of Innovative Technology and Exploring Engineering, vol. 11, no. 4, pp. 311–318, 2023.

  41. B. Sahu and R. Ghosh, Smart Pest Detection in Agricultural Fields Using Deep Learning, IEEE Transactions on Artificial Intelligence, vol. 4, no. 2, pp. 123–132, 2023.

  42. M. Li and C. Zhang, IoT-Based Smart Monitoring System for Food Warehouses, IEEE Access, vol. 11, pp. 42132–42140, 2023.

  43. D. Patel, Design and Implementation of AI-Based Surveillance Using Raspberry Pi, International Journal of Embedded Systems and IoT Applications, vol. 10, no. 1, pp. 77–84, 2021.

  44. S. Deshmukh and N. Kulkarni, A Hybrid Deep Learning Model for Animal Detection in Farms, Procedia Computer Science, vol. 218, pp. 702–709, 2023.

  45. A. Tanwar, S. Tyagi, and N. Kumar, Integration of IoT and Cloud for Smart Industrial Applications, IEEE Internet of Things Magazine, vol. 4, no. 3, pp. 34–41, 2021.

  46. J. Park and M. Choi, AI-Based Object Detection for Security Applications Using OpenCV and TensorFlow, IEEE Transactions on Consumer Electronics, vol. 68, no. 2, pp. 121–129, 2022.

  47. R. Verma, Ultrasonic Frequency-Based Rodent Repellent Device Design, International Journal of Smart Sensors and Systems, vol. 12, no. 2, pp. 59–65, 2022.

  48. L. Nguyen, Automated Detection of Pests Using Image Processing, Computers and Electronics in Agriculture, vol. 193, pp. 107339, 2022.

  49. H. Patel and V. Singh, IoT-Based Environmental Monitoring with ESP32 and Cloud Integration, IEEE Access, vol. 10, pp. 58320–58330, 2022.

  50. G. Thomas and K. George, Deep Learning-Based Video Analytics for Warehouse Surveillance, IEEE Access, vol. 9, pp. 188471–188480, 2021.