
Live Streaming for ADL Monitoring in Smart Home Environments System UI

DOI: 10.17577/IJERTCONV13IS05014

1st S. Suganthan
Department of Computer Science and Engineering
IFET College of Engineering, Villupuram, India
sugansankar2015@gmail.com

2nd M. Navalarasu
Department of Computer Science and Engineering
IFET College of Engineering, Villupuram, India
sanjaynavalarasu2002@gmail.com

3rd Dr. Sivasankaran S., AP
Department of Computer Science and Engineering
IFET College of Engineering, Villupuram, India
sankarsvision@gmail.com

Abstract:

This paper presents a streaming data processing system for monitoring Activities of Daily Living (ADL) in smart home environments, using multiple sensors to track and analyze residents' health and activity in real time. The system integrates MEMS sensors to detect movement, temperature sensors to monitor environmental conditions, heartbeat sensors to track vital signs, and an ESP32-CAM for video-based activity recognition. By applying machine learning techniques to process the sensor data continuously, the system can detect patterns and anomalies, such as unusual behaviors or health risks, and provide immediate alerts. This low-latency, real-time monitoring solution aims to improve the safety and well-being of individuals, particularly the elderly or those with chronic conditions, by enabling timely interventions and promoting a healthier, more secure living environment. The use of edge computing ensures efficient data processing and reduces the need for cloud-based resources, making the system more responsive and energy-efficient.

Keywords: ADL monitoring, IoT, ESP32-CAM, real-time tracking, wearable device.

  1. INTRODUCTION

    In recent years, smart home environments have gained popularity as a way to enhance the safety, comfort, and well-being of individuals, especially the elderly and those with health conditions requiring continuous monitoring. One critical application within these environments is the monitoring of Activities of Daily Living (ADL), which helps caregivers and healthcare professionals track the physical and health status of residents. ADL monitoring systems provide insights into daily routines and can detect abnormalities that may indicate health issues or emergencies.

    This paper focuses on developing an efficient ADL monitoring system using a range of sensors and streaming data processing. By integrating sensors such as MEMS sensors for movement detection, temperature sensors for environmental monitoring, heartbeat sensors for vital sign tracking, and an ESP32-CAM for video-based activity recognition, the system offers a comprehensive, real-time approach to monitoring residents' activities. The use of edge computing allows data to be processed directly within the smart home environment, reducing latency and improving response times in emergencies. The proposed system aims to offer continuous, real-time health and activity monitoring without disrupting the residents' daily routines. By analyzing the data from these sensors, the system can detect unusual patterns or behaviors and provide alerts, enabling timely interventions. This solution holds great potential to improve the quality of life for individuals in smart homes, particularly the elderly or those with chronic conditions, by providing a safer and more responsive living environment.

  2. RELATED WORKS

    Recent advancements in multisensor fusion have significantly enhanced the accuracy and reliability of Activities of Daily Living (ADL) monitoring systems. Studies by Rashidi and Mihailidis (2013) highlighted the integration of motion, temperature, and physiological sensors to provide a holistic view of individual health and activity patterns. Such systems are instrumental in detecting abnormal behaviors, such as sudden falls or emergencies, by correlating diverse data sources, thereby improving early intervention capabilities [1]. The utilization of Microelectromechanical Systems (MEMS) has revolutionized movement detection in ADL monitoring. Research by Saxena et al. (2018) demonstrated that MEMS-based accelerometers and gyroscopes deliver precise motion tracking with low power consumption, making them ideal for smart home applications. These sensors have been particularly effective in detecting falls and other critical movements, especially for elderly residents, ensuring timely alerts for caregivers [2].

    Vital sign monitoring is another critical component of smart home health systems. Work by Pantelopoulos and Bourbakis (2010) emphasized the integration of heart rate sensors to monitor cardiovascular health. These sensors detect conditions such as arrhythmias or heart attacks and, when incorporated into ADL systems, provide an added layer of health surveillance. Such integration ensures early warnings for potential health risks, contributing to improved medical response [3]. Environmental monitoring has also emerged as a pivotal aspect of smart home systems. Research by Almalioglu et al. [10] focused on temperature sensors to ensure a safe and comfortable living environment. These sensors monitor room conditions to detect overheating or sudden temperature fluctuations, which are particularly crucial for vulnerable populations such as the elderly or chronically ill [4].

    Incorporating multiple sensors into ADL systems facilitates a more comprehensive monitoring approach. Studies have shown that combining physiological and environmental data enhances the detection of complex scenarios, such as identifying stress or discomfort caused by both health anomalies and external conditions. This integrated perspective allows for a more nuanced understanding of individual well-being [5]. Recent advancements in real-time data analysis and artificial intelligence have further improved the capabilities of ADL systems [10]. Machine learning algorithms enable the processing of large datasets collected by sensors, identifying trends and anomalies with greater accuracy. This technology supports predictive healthcare, offering insights into potential health issues before they escalate [6]. Collectively, these technologies represent a significant leap in smart home healthcare. By integrating multisensor fusion, MEMS, vital sign monitoring, and environmental analysis, researchers have paved the way for more robust and responsive systems. These advancements not only enhance the quality of life for individuals but also reduce the burden on healthcare providers, promoting a more sustainable and efficient healthcare ecosystem [7][9].

    PROBLEM STATEMENT

    Develop a real-time smart home monitoring system using multi-sensor data fusion and machine learning to track Activities of Daily Living (ADL), detect anomalies, and provide timely alerts that enhance the safety and well-being of residents. The solution leverages edge computing for low-latency, energy-efficient data processing.
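    The paper does not specify the anomaly-detection algorithm itself. As a minimal, hedged illustration of the streaming analysis step, the following C++ sketch keeps a running mean and variance of one sensor stream (e.g., heart rate) and flags samples that deviate strongly from the baseline; the z-score threshold, warm-up length, and sample values are illustrative assumptions, not results from this work.

#include <cmath>
#include <cstdio>

// Running mean/variance (Welford) detector standing in for the ML stage.
struct StreamingAnomalyDetector {
  double mean = 0.0, m2 = 0.0;
  long n = 0;

  // Returns true when the new sample deviates strongly from the baseline.
  bool update(double x, double zThreshold = 3.0) {
    ++n;
    double delta = x - mean;
    mean += delta / n;
    m2 += delta * (x - mean);
    if (n < 10) return false;                    // wait for a minimal baseline
    double stddev = std::sqrt(m2 / (n - 1));
    return stddev > 0 && std::fabs(x - mean) / stddev > zThreshold;
  }
};

int main() {
  StreamingAnomalyDetector heartRate;
  double samples[] = {72, 74, 71, 73, 75, 72, 74, 73, 72, 71, 70, 73, 128};
  for (double bpm : samples)
    if (heartRate.update(bpm))
      std::printf("Anomaly: %.0f bpm deviates from baseline\n", bpm);
  return 0;
}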

  3. EXISTING SYSTEM

    The existing system features an intuitive user interface designed to provide real-time monitoring and insights. The main dashboard includes a live feed section, allowing users to view video streams from various rooms in the house, such as the living room, bedroom, and kitchen. An activity summary highlights detected ADLs like walking, sitting, or abnormal activities such as falls, ensuring immediate attention to emergencies. A health stats panel integrates wearable device data, displaying live parameters like heart rate and temperature for comprehensive health monitoring. Additionally, users can receive alerts and notifications for anomalies, including environmental risks such as overheating or sudden temperature drops, ensuring proactive care. The UI includes a room selection panel for toggling between different camera feeds and a settings section to manage thresholds for activity detection and health parameters. Users can access historical reports, view activity logs, and analyze past alerts for better decision-making. The system ensures secure access with authentication and offers privacy options, such as blurring sensitive areas. A clean and accessible design with high-contrast visuals and adjustable font sizes ensures ease of use, especially for elderly users or those with visual impairments, making it an ideal solution for ADL monitoring.

    Feature | Existing System | Proposed System
    Hardware Integration | Relies on traditional sensors (e.g., basic accelerometers and temperature sensors). | Includes advanced MEMS accelerometer, pulse sensor, and ESP32-CAM for comprehensive monitoring.
    Video Monitoring | Limited or no video-based monitoring capabilities. | Real-time video capture and recognition with ESP32-CAM for enhanced accuracy and reliability.
    Precision & Calibration | Basic tracking with limited location accuracy. | Utilizes landmark information for precise location calibration and improved user tracking.
    Technology Utilization | Minimal use of IoT and outdated, power-hungry components. | Leverages IoT technology for easy deployment, low power consumption, and cost-effectiveness.
    Real-Time Processing | Delayed or less efficient data processing. | High positioning accuracy and real-time data processing for timely responses and decision-making.

    Table no. 1: Existing vs. Proposed System

  4. PROPOSED SYSTEM

    The proposed system introduces a novel portable hardware design for real-time project development, featuring an integrated MEMS accelerometer, pulse sensor, RFID, and ESP32-CAM for enhanced video-based monitoring. The system improves user tracking by utilizing landmark information for precise location calibration. Leveraging IoT technology and machine learning techniques, the device benefits from easy deployment, low power consumption, and cost-effectiveness. The addition of the ESP32-CAM provides real-time video capture and recognition, further enhancing the accuracy and reliability of the system. The combination of these features ensures high positioning accuracy and real-time data processing, making the system both practical and efficient for real-time applications.

  5. METHODOLOGY

    The following sections provide an overview of the key modules that form the foundation of the proposed system. Together, these modules create a fully functional unit. At the heart of the system is the Controller module, built on the AI-Tensilica platform. This module features an ESP32 microcontroller with built-in Wi-Fi, Bluetooth, and BLE capabilities, allowing it to manage sensor inputs and control other modules through versatile GPIOs and communication interfaces. The IoT module adds location-based functionality by using BLE to communicate with nearby devices, sending and receiving data based on proximity. For movement tracking and abnormality detection, a MEMS accelerometer and a pulse sensor are employed, continuously relaying data to the controller in real time.

    To enhance monitoring capabilities, the ESP32-CAM is integrated into the system, providing real-time video capture and enabling video-based activity recognition. This feature strengthens the overall system by adding visual data for more accurate event detection, such as unusual movements or environmental changes. An OLED display provides a visual interface, showing system status, sensor data, and alerts from the connected application. The control switch allows users to manually initiate emergency call alerts, while the relay manages high-power loads and ensures circuit protection. Lastly, the power supply stabilizes the system by converting and regulating voltage, effectively powering all components. When integrated, these components work together as a cohesive system designed for real-time monitoring and safety applications, offering enhanced tracking, sensor-based detection, and video monitoring for improved situational awareness.
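    To make the control flow concrete, the following Arduino-style sketch illustrates one possible polling loop for the controller described above. The pin assignments, thresholds, and the Serial-based alert stand-in are assumptions for illustration only; the actual firmware additionally drives BLE, the OLED display, the relay, and the ESP32-CAM.

// Minimal, self-contained sketch of the control flow: poll the sensors,
// compare against thresholds, and raise an alert.
const int PULSE_PIN = 34;        // assumed analog pin for the pulse sensor
const int LM35_PIN = 35;         // assumed analog pin for the LM35 output
const int BUTTON_PIN = 0;        // assumed control switch for manual emergency call
const float TEMP_HIGH_C = 38.0;  // illustrative temperature alert level

void raiseAlert(const char *msg) {
  // Stand-in for the BLE/Wi-Fi notification path of the real system.
  Serial.print("ALERT: ");
  Serial.println(msg);
}

void setup() {
  Serial.begin(115200);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
}

void loop() {
  int pulseRaw = analogRead(PULSE_PIN);                          // raw pulse waveform
  float tempC = analogRead(LM35_PIN) * (3300.0 / 4095.0) / 10.0; // LM35: 10 mV per degree C

  if (digitalRead(BUTTON_PIN) == LOW) raiseAlert("manual emergency call");
  if (tempC > TEMP_HIGH_C)            raiseAlert("high temperature");

  Serial.printf("pulse=%d temp=%.1fC\n", pulseRaw, tempC);
  delay(500);
}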

  6. HARDWARE EXPLANATION

  1. AI BOARD LSAI48266x

    TENSILICA'S PROCESSOR-WROOM-32 is a high-performance MCU module with Wi-Fi + BT + BLE that provides an extremely energy-efficient platform for applications ranging from the most demanding tasks, such as voice processing and MP3 decoding, down to those requiring minimal power consumption. The core of this module is TENSILICA'S PROCESSOR-D0WDQ6 chip, designed to be highly scalable and flexible. The chip is equipped with two CPU cores that can be controlled separately, and the CPU clock speed can be set within the range of 80 MHz to 240 MHz. Users also have the option of shutting down the CPUs and operating the low-power co-processor in polled mode, constantly checking peripherals for threshold events. TENSILICA'S PROCESSOR comes with a complete set of peripherals, including capacitive touch sensors, Hall sensors, SD card support, Ethernet, high-speed SPI, UART, I2S, and I2C.
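    As a hedged illustration of the low-power operation mentioned above, the sketch below puts the module into deep sleep between sensor polls using a timer wake-up; the real design could instead keep the low-power co-processor watching peripherals for threshold events. The 30-second interval is an assumption.

// Periodic deep sleep on the ESP32 (Arduino-ESP32 core).
#include "esp_sleep.h"

void setup() {
  Serial.begin(115200);
  Serial.println("Woke up, sampling sensors...");
  // ... read sensors and send any alerts here ...
  esp_sleep_enable_timer_wakeup(30ULL * 1000000ULL); // wake again in 30 s
  esp_deep_sleep_start();                            // CPU powers down here
}

void loop() {
  // Never reached: waking from deep sleep restarts the sketch from setup().
}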

  2. ESP32-CAM

    The ESP32-CAM module is an ESP32-based live camera module. It features an onboard OV2640 2 MP camera, an onboard SD memory card slot, and onboard PSRAM. It is well suited to home and office smart devices, wireless monitoring, and other IoT applications.
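    The following Arduino-ESP32 sketch illustrates grabbing a single frame with the esp_camera driver, the usual entry point for streaming or activity recognition on this module. It assumes the camera has already been initialised with the pin map of the specific ESP32-CAM board; that configuration and error handling are omitted here.

// Minimal frame capture with the esp_camera driver.
#include "esp_camera.h"

void captureOneFrame() {
  camera_fb_t *fb = esp_camera_fb_get();   // grab one frame from the OV2640
  if (!fb) {
    Serial.println("Camera capture failed");
    return;
  }
  Serial.printf("Captured %u bytes (%ux%u)\n",
                (unsigned)fb->len, (unsigned)fb->width, (unsigned)fb->height);
  // fb->buf / fb->len could now be streamed over HTTP or passed to recognition.
  esp_camera_fb_return(fb);                // hand the frame buffer back
}

void setup() {
  Serial.begin(115200);
  // esp_camera_init(&config) with the board's pin map is assumed to run here.
}

void loop() {
  captureOneFrame();
  delay(1000);
}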

  3. HEARTBEAT SENSOR

    A heartbeat is the sound created by the valves within the heart as it contracts and expands to pump blood from one part of the body to another. Heart rate is the number of beats per minute, while the pulse is the beat that a person can feel in an artery close to the skin.
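    As a simple illustration of turning the sensor's analog waveform into a heart-rate figure, the sketch below times the interval between threshold crossings and converts it to beats per minute. The input pin, threshold, and debounce window are assumptions for a typical analog pulse sensor, not values from this paper.

// Crude beats-per-minute estimate from an analog pulse sensor.
const int PULSE_PIN = 34;
const int THRESHOLD = 2200;          // ADC level separating beat/no-beat (assumed)
unsigned long lastBeatMs = 0;
bool aboveThreshold = false;

void setup() {
  Serial.begin(115200);
}

void loop() {
  int v = analogRead(PULSE_PIN);
  unsigned long now = millis();

  if (!aboveThreshold && v > THRESHOLD && now - lastBeatMs > 300) {
    if (lastBeatMs > 0) {
      int bpm = 60000 / (now - lastBeatMs);  // one beat per interval
      Serial.printf("Heart rate: %d bpm\n", bpm);
    }
    lastBeatMs = now;
    aboveThreshold = true;
  } else if (v < THRESHOLD - 200) {
    aboveThreshold = false;                  // re-arm after the pulse falls
  }
  delay(5);
}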

  4. TEMPERATURE SENSOR

    The LM35 series are precision integrated-circuit temperature sensors whose output voltage is directly proportional to the Centigrade temperature. This gives the LM35 an advantage over linear temperature sensors calibrated in Kelvin, because users do not have to subtract a large constant voltage from the output to obtain convenient Centigrade scaling. The LM35 requires no external calibration or adjustment to achieve typical accuracies of ±0.25°C at room temperature and ±0.75°C over the full −55°C to +150°C temperature range. Trimming and calibration at the wafer level keep its cost low. Its low output impedance, linear output, and precise inherent calibration make it remarkably easy to interface with readout or control circuits. It can be used with single power supplies or with both positive and negative supplies. Drawing only about 60 µA from its supply, it exhibits very low self-heating, less than 0.1°C in still air. The LM35 is rated for operation over the −55°C to +150°C temperature range, while the LM35C is specified over −40°C to +110°C (−10°C with improved accuracy). The LM35 series is available in hermetic TO-46 transistor packages, and the LM35C, LM35CA, and LM35D are also available in plastic TO-92 transistor packages.
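    Because of the 10 mV/°C scale factor described above, reading the LM35 on the controller is a one-line conversion. The pin number and the 12-bit, roughly 3.3 V ESP32 ADC configuration in this sketch are illustrative assumptions.

// Convert the LM35 output voltage to degrees Celsius on the ESP32.
const int LM35_PIN = 35;

void setup() {
  Serial.begin(115200);
}

void loop() {
  int raw = analogRead(LM35_PIN);            // 0..4095 on the ESP32 ADC
  float millivolts = raw * 3300.0 / 4095.0;  // full scale assumed ~3.3 V
  float tempC = millivolts / 10.0;           // LM35: 10 mV per degree C
  Serial.printf("Temperature: %.2f C\n", tempC);
  delay(1000);
}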

  5. 3-AXIS LINEAR ACCELEROMETER

The MPU6050 is a small, thin, low-power MEMS motion sensor that provides 3-axis acceleration measurements (alongside a 3-axis gyroscope) over a digital I2C interface. The MPU6050 module measures acceleration with a minimum full-scale range of ±2g. It can measure the static acceleration of gravity in tilt-sensing applications as well as dynamic acceleration resulting from shock or motion. The breakout board includes an onboard voltage regulator and can be powered from 5 V. A MEMS accelerometer is an electro-mechanical device that measures acceleration forces.
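The sketch below shows one way the controller could read raw acceleration from the MPU6050 over I2C and apply a crude fall heuristic based on the acceleration magnitude. The register addresses and the ±2g scale factor follow the MPU6050 datasheet; the 2.5 g alert threshold is an illustrative assumption, not the authors' detection logic.

// Read raw MPU6050 acceleration over I2C and flag large magnitude spikes.
#include <Wire.h>

const int MPU_ADDR = 0x68;        // default I2C address of the MPU6050

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);               // PWR_MGMT_1 register
  Wire.write(0);                  // clear the sleep bit to start sampling
  Wire.endTransmission(true);
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);               // ACCEL_XOUT_H: start of acceleration data
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 6, true);

  int16_t ax = (Wire.read() << 8) | Wire.read();
  int16_t ay = (Wire.read() << 8) | Wire.read();
  int16_t az = (Wire.read() << 8) | Wire.read();

  // 16384 LSB per g at the default +/-2 g full-scale range.
  float g = sqrt((float)ax * ax + (float)ay * ay + (float)az * az) / 16384.0;
  if (g > 2.5) Serial.println("ALERT: acceleration spike, possible fall");

  delay(50);
}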

  7. RESULTS AND DISCUSSION

    The proposed system demonstrated strong real-time monitoring capabilities, effectively combining MEMS accelerometer, pulse sensor, and RFID data with IoT-based precise location tracking. This integration ensured accurate movement detection and health monitoring, providing a robust framework for activity tracking and emergency response. The inclusion of the ESP32-CAM for real-time video monitoring significantly enhanced situational awareness, offering a deeper understanding of user activity and surroundings. The system's low latency facilitated prompt emergency alerts, ensuring quick response times in critical situations. Moreover, its power efficiency, achieved through the ESP32 microcontroller and BLE technology, made it suitable for long-term, practical use. User testing showed that the system was intuitive and reliable for safety applications, making it highly practical for real-world deployment. However, while the video monitoring proved valuable, limitations were observed in recognizing complex scenarios and in environments with low lighting. These findings suggest that while the system meets its primary objectives of accurate, real-time monitoring and user-friendly operation, further advances in video recognition algorithms and hardware capabilities could enhance its performance, particularly in more demanding use cases.

  8. FUTURE SCOPE

The future scope of this system includes improving video recognition capabilities using advanced machine learning algorithms to enhance accuracy in complex scenarios. Integration with cloud computing and AI can enable large-scale data analysis for predictive insights and personalized health monitoring. Expanding the system to incorporate additional environmental sensors, such as humidity sensors, can provide more comprehensive environmental and physiological monitoring. Furthermore, the development of a mobile application for user interaction and real-time notifications can increase accessibility and usability, making the system more versatile for diverse safety and healthcare applications.

  9. CONCLUSION

In conclusion, the proposed system demonstrates a significant advancement in real-time monitoring and safety applications for smart environments. By integrating the ESP32 microcontroller with built-in Wi-Fi, Bluetooth, and BLE capabilities, alongside sensors like the MEMS accelerometer, pulse sensor, and the ESP32-CAM for video capture, the system offers a comprehensive solution for tracking movement, detecting abnormalities, and providing visual monitoring. The inclusion of IoT technology further enhances location-based accuracy, making the system reliable and efficient. Overall, this portable, low-power, and cost-effective design is a promising tool for applications that require continuous monitoring and emergency alerts, offering a practical solution for real-world safety challenges.

REFERENCES

  1. Cook, D. J., & Schmitter-Edgecombe, M. (2009). Assessing the quality of activities in a smart environment. Methods of Information in Medicine, 48(5), 480-485.

  2. Wang, J., Chen, Y., Hao, S., Peng, X., & Hu, L. (2019). Deep learning for sensor-based activity recognition: A survey. Pattern Recognition Letters, 119, 3-11.

  3. Chen, L., Hoey, J., Nugent, C. D., Cook, D. J., & Yu, Z. (2012). Sensor-based activity recognition in the context of ambient assisted living: A review. Journal of Biomedical Informatics, 45(3), 448-462.

  4. Zhang, J., Wu, T., Zhang, Y., Meng, H., & Wu, J. (2018). Real-time human activity recognition in smart home environments. Information Sciences, 447, 89-102.

  5. Mohamed, A. A., Wahab, M. H., & El-Ghareeb, H. A. (2021). IoT-based health monitoring and assisted living for smart homes. Journal of Ambient Intelligence and Humanized Computing, 12(3), 3663-3684.

  6. Rashidi, P., & Mihailidis, A. (2013). A survey on ambient-assisted living tools for older adults. IEEE Journal of Biomedical and Health Informatics, 17(3), 579-590.

  7. Reiss, A., & Stricker, D. (2012). Creating and benchmarking a new dataset for wearable activity recognition. Proceedings of the 5th International Conference on Pervasive Technologies Related to Assistive Environments, 1-8.

  8. Hu, Y., Chen, Y., & Fan, X. (2020). A framework for real-time activity recognition in smart homes. Sensors, 20(18), 5403.

  9. Yang, J., Nguyen, M. N., San, P. P., Li, X., & Krishnaswamy, S. (2015). Deep convolutional neural networks on multichannel time series for human activity recognition. Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), 3995-4001.

  10. Woznowski, P., Burrows, A., Diethe, T., & Flach, P. (2017). Smart homes: IoT for real-time activity recognition in smart environments. IEEE Internet of Things Journal, 4(5), 948-959.

  11. Dang, L. M., Piran, M. J., Han, D., Min, K., & Moon, H. (2020). A survey on Internet of Things and cloud computing for healthcare. Electronics, 9(2), 294.

  12. Rashid, U., Ullah, S., & Ali, R. (2021). Smart home activity recognition: A literature survey. International Journal of Advanced Computer Science and Applications, 12(1), 342-351.

  13. Kim, T., & Song, W. (2020). Real-time activity recognition using streaming data in smart homes. IEEE Access, 8, 216857-216866.

  14. Wang, F., Zhang, X., & Zhang, S. (2021). Real-time ADL monitoring using lightweight deep learning models. Pattern Recognition Letters, 147, 21-27.

  15. Das, B., & Cook, D. J. (2015). Data mining challenges in smart environments: Activity learning and recognition. Journal of Pervasive and Mobile Computing, 25, 24-35.

  16. Miorandi, D., Sicari, S., Pellegrini, F. D., & Chlamtac, I. (2012). Internet of Things: Vision, applications, and research challenges. Ad Hoc Networks, 10(7), 1497-1516.