
A Simple Non-Invasive Wearable System for Automated Meal Alerts in Elderly Care

Eating-Based Alert Termination Using Wrist-Based Hand-to-Mouth Gesture Detection

DOI: https://doi.org/10.5281/zenodo.18085969

Fionna Ananth

4th Year, Department of Biomedical Engineering, Rohini College of Engineering and Technology, Kanyakumari, Tamil Nadu, India

Abstract: Elderly individuals with memory impairment often forget whether they have consumed their meals, leading to missed or repeated food intake. Conventional meal reminder systems depend on manual user acknowledgment, which is unreliable for memory-impaired users. This paper proposes a simple, non-invasive wearable system that provides automated meal-time alerts and terminates the alert only after eating is detected. The system uses a wrist-worn inertial measurement unit to identify repetitive hand-to-mouth gestures associated with eating. Once eating behavior is confirmed, the alert is automatically stopped without requiring any user interaction. The proposed approach ensures reliable meal confirmation, preserves user dignity, and reduces caregiver dependency. The system is designed to be low-cost, privacy-preserving, and suitable for elderly care applications.

Keywords: Wearable healthcare, Elderly care, Meal reminder system, Hand-to-mouth gesture detection, Non-invasive monitoring, Assistive technology

  1. INTRODUCTION

    The rapid growth of the elderly population has increased the prevalence of age-related memory impairments, including dementia and mild cognitive decline. One of the most common daily challenges faced by such individuals is forgetting whether they have consumed their meals. This can result in missed meals, repeated food intake, poor nutrition, and increased anxiety. Ensuring regular and confirmed meal intake is essential for maintaining the health and well-being of elderly individuals.

    Traditional solutions such as alarms and reminders are widely used to address this problem. However, these systems rely on manual user acknowledgment, which is unreliable for individuals with memory impairment. This paper presents a simple, non-invasive wearable system that provides automated meal-time alerts and stops the alert only after eating is detected, thereby ensuring reliable meal confirmation without requiring user interaction.

    1. Elderly Care and Memory Impairment

      Memory impairment in elderly individuals can occur due to conditions such as Alzheimer's disease, dementia, or natural cognitive decline associated with aging. These conditions affect short-term memory, making it difficult for individuals to remember routine activities, including eating meals. As a result, elderly individuals may unintentionally skip meals or consume multiple meals within a short period, both of which can negatively impact their health.

      Caregivers often need to provide frequent reminders or supervision to ensure proper meal intake. This increases caregiver burden and reduces the independence of elderly individuals. Assistive technologies that support autonomous daily living while ensuring that essential activities such as eating are completed are therefore highly needed in elderly care.

    2. Limitations of Existing Meal Reminder Systems

      Most existing meal reminder systems operate using time-based alerts such as alarms, mobile notifications, or smart pillboxes. These systems require the user to manually acknowledge or turn off the alert. For elderly users with memory impairment, this approach is ineffective, as the user may forget to eat after turning off the alert or may forget to turn it off entirely.

      Some approaches attempt to use physiological parameters such as heart rate or blood pressure to infer eating activity. However, these parameters are highly variable and influenced by factors such as physical activity, emotional stress, and medical conditions. Therefore, they do not provide a reliable indication of actual meal intake, especially in elderly users.

    3. Motivation and Research Objective

    The motivation for this work arises from a real-life elderly care scenario in which an individual was unable to remember whether a meal had been consumed and could not reliably interact with reminder systems. This highlights the need for an automated solution that does not depend on memory or manual input.

    The primary objective of this research is to design a simple, low-cost, non-invasive wearable system that generates automated meal-time alerts and terminates the alert only after eating is detected. By using wrist-based hand-to-mouth gesture detection, the proposed system ensures reliable meal confirmation while preserving user comfort, privacy, and dignity.

  2. PROBLEM STATEMENT

    Elderly individuals with memory impairment often face difficulty remembering whether they have consumed their meals. This issue is common in conditions such as dementia and age-related cognitive decline, where short-term memory is affected. Forgetting meal intake can lead to skipped meals, repeated eating, nutritional imbalance, and increased anxiety for both the elderly individual and caregivers.

    Most existing meal reminder systems rely on time-based alerts that require the user to manually acknowledge or turn off the alert. However, for memory-impaired elderly users, manual interaction is unreliable. The user may turn off the alert without eating, forget to turn it off, or become confused by repeated alerts. As a result, these systems fail to ensure that a meal has actually been consumed.

    Some monitoring approaches attempt to use physiological parameters such as heart rate or blood pressure to detect eating activity. These parameters are non-specific and vary significantly due to factors such as physical movement, stress, medication, and existing health conditions. Therefore, they cannot provide dependable confirmation of meal intake in elderly users.

    The core problem addressed in this research is the absence of a simple, non-invasive, and automated wearable solution that can reliably confirm meal intake without relying on user memory or manual acknowledgment. There is a need for a system that provides automated meal-time alerts and ensures that the alert stops only after eating has been detected, thereby supporting reliable nutrition management and independent living for elderly individuals.

  3. LITERATURE REVIEW

    This section reviews existing research related to meal reminder systems, eating activity detection, and wearable technologies for elderly care. The aim is to highlight the limitations of current approaches and identify the research gap addressed by the proposed system.

    1. Time-Based Meal Reminder Systems

      Time-based reminder systems are commonly used to assist elderly individuals in maintaining daily routines. These systems typically use alarms, mobile notifications, or smart pillboxes to remind users to eat at predefined times. While such systems are simple and widely available, they rely entirely on user acknowledgment to confirm task completion.

      For elderly users with memory impairment, manual acknowledgment is unreliable. Users may turn off the alert without eating, forget to respond to the alert, or become confused by repeated alarms. As a result, time-based reminder systems do not guarantee actual meal intake and often fail to provide effective assistance for memory-impaired individuals.

    2. Physiological Parameter-Based Eating Detection

      Several studies have explored the use of physiological parameters such as heart rate, blood pressure, and body temperature to infer eating activity. These parameters may exhibit changes following food intake due to metabolic or digestive processes. However, such changes are indirect and highly variable, particularly in elderly populations.

      Physiological signals are influenced by numerous factors including physical activity, emotional stress, medication, and chronic health conditions. Due to this variability, physiological parameter-based methods often produce inaccurate results and are not suitable for reliable meal detection in elderly care applications.

    3. Motion-Based Eating Detection Using Wearable Sensors

    Advances in wearable technology have enabled the use of inertial sensors such as accelerometers and gyroscopes to recognize human activities. Motion-based eating detection focuses on identifying characteristic hand-to-mouth gestures associated with food intake using wrist-worn devices.

    Previous research has demonstrated that wrist-based inertial sensing can effectively recognize eating-related gestures in a non-invasive and low-cost manner. This approach provides direct behavioral confirmation of eating activity, making it more reliable than physiological parameter-based methods. However, many existing systems focus on activity recognition for data analysis rather than real-time intervention.

    There is limited research on integrating motion-based eating detection with automated alert termination mechanisms specifically designed for elderly individuals with memory impairment. This gap motivates the proposed system.

  4. PROPOSED SYSTEM

    The proposed system is a simple, non-invasive wearable device designed to assist elderly individuals with memory impairment by providing automated meal-time alerts and terminating the alert only after eating is detected. The system integrates time-based alert generation with motion-based eating detection to ensure reliable meal confirmation without requiring any manual user interaction.

    1. System Overview

      The system operates as a closed-loop wearable solution consisting of a wrist-worn device equipped with inertial sensors, a control unit, and an alert mechanism. At predefined meal times, the wearable generates an alert to notify the user. The alert continues until eating behavior is detected through wrist motion analysis.

      The core idea of the system is to eliminate user-dependent acknowledgment and replace it with behavior-based confirmation. This makes the system suitable for elderly users who may forget to interact with conventional reminder systems.

      Fig. 1 Proposed system architecture of the wearable meal alert system

    2. Hardware Architecture

      The hardware components of the proposed system include:

      Inertial Measurement Unit (IMU): A combination of accelerometer and gyroscope sensors used to capture wrist motion associated with hand-to-mouth gestures during eating.

      Microcontroller Unit (MCU): Processes sensor data, executes gesture detection logic, and controls the alert mechanism.

      Alert Unit: A buzzer or vibration motor that provides auditory or haptic alerts to notify the user at meal times.

      Power Supply: A compact rechargeable battery that supports continuous operation of the wearable device.

      Component       | Description
      IMU Sensor      | Detects wrist motion and orientation
      Microcontroller | Processes sensor data and controls alerts
      Alert Unit      | Provides sound or vibration alerts
      Battery         | Powers the wearable device

      Table I Hardware components

    3. Software Architecture

      The software architecture is designed to operate efficiently on resource-constrained wearable hardware and consists of the following modules:

      Meal Scheduling Module: Activates alerts at predefined meal times such as breakfast, lunch, and dinner.

      Gesture Detection Module: Continuously analyzes IMU data to detect repetitive hand-to-mouth gestures indicative of eating.

      Alert Control Module: Maintains the alert until eating is confirmed and automatically terminates the alert once confirmation criteria are satisfied.

      The software does not require any manual input from the user, ensuring ease of use and reliability.
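      As an illustration only, the following Python sketch shows one way the meal scheduling module could be realized. The specific meal times, the five-minute tolerance window, and the function names are assumptions made for this sketch and are not specified by the system itself.

```python
# Minimal sketch of the meal scheduling module (illustrative assumptions only).
from datetime import datetime, time

MEAL_TIMES = [time(8, 0), time(13, 0), time(19, 0)]  # assumed breakfast, lunch, dinner
TOLERANCE_MIN = 5                                    # assumed alert window in minutes

def is_meal_time(now: datetime) -> bool:
    """Return True if the current time falls within a scheduled meal window."""
    minutes_now = now.hour * 60 + now.minute
    for meal in MEAL_TIMES:
        meal_minutes = meal.hour * 60 + meal.minute
        if abs(minutes_now - meal_minutes) <= TOLERANCE_MIN:
            return True
    return False
```

      On the actual wearable, the same check would be driven by the microcontroller's real-time clock rather than a desktop datetime object.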

    4. Design Considerations

      The system architecture is developed with the following design considerations:

      • Non-invasive and comfortable wearable design

      • Low computational complexity for real-time processing

      • Privacy preservation by avoiding cameras and microphones

      • Reliability for elderly users with memory impairment

  5. METHODOLOGY

    This section describes the methodology used to implement the proposed non-invasive wearable system for automated meal alerts with eating-based alert termination. The methodology focuses on reliable eating detection using wrist-based motion sensing and simple rule-based decision logic suitable for elderly users.

    1. Hand-to-Mouth Gesture Detection Approach

      The proposed system detects eating activity using hand-to-mouth gesture recognition captured through a wrist-worn inertial measurement unit (IMU). The IMU continuously records acceleration and angular velocity data corresponding to wrist movements.

      During eating, the wrist exhibits characteristic motion patterns such as:

      • Upward movement of the hand toward the mouth

      • Rotation of the wrist while bringing food

      • Repetitive motion occurring over a short duration

      These motion patterns are analyzed in real time once the meal-time alert is activated.

        Fig. 2 Hand-to-mouth gesture detection using wrist-based inertial sensing
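      To make the detection step concrete, the following minimal Python sketch estimates wrist pitch from accelerometer samples and counts hand-to-mouth gestures as pitch excursions with simple hysteresis. The axis convention, angle thresholds, and function names are assumptions for illustration; the deployed device may use different features or thresholds.

```python
import math

PITCH_UP_DEG = 55.0    # assumed wrist pitch when the hand nears the mouth
PITCH_DOWN_DEG = 25.0  # assumed pitch when the hand returns toward the plate

def pitch_from_accel(ax: float, ay: float, az: float) -> float:
    """Estimate wrist pitch in degrees from one 3-axis accelerometer sample,
    assuming the x-axis points along the forearm."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def count_hand_to_mouth_gestures(accel_samples) -> int:
    """Count gestures as pitch rising above PITCH_UP_DEG and later falling
    back below PITCH_DOWN_DEG (hysteresis avoids double counting)."""
    count = 0
    raised = False
    for ax, ay, az in accel_samples:
        pitch = pitch_from_accel(ax, ay, az)
        if not raised and pitch > PITCH_UP_DEG:
            raised = True          # hand lifted toward the mouth
        elif raised and pitch < PITCH_DOWN_DEG:
            raised = False         # hand lowered again: one complete gesture
            count += 1
    return count
```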

    2. Gesture Validation and Eating Confirmation

      To prevent false detections caused by non-eating activities such as scratching, adjusting clothing, or casual hand movements, the system applies simple validation criteria. Eating is confirmed only when:

      • Hand-to-mouth gestures occur repeatedly

      • Gestures are detected within a predefined time window

      • The activity persists for a minimum duration

        This rule-based validation ensures that only genuine eating behavior is detected while keeping computational complexity low.
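        A minimal sketch of this rule-based validation is given below. The gesture count, window length, and minimum duration are placeholder values chosen for illustration; the paper does not fix specific thresholds.

```python
MIN_GESTURES = 4           # assumed minimum number of repeated hand-to-mouth gestures
WINDOW_SECONDS = 120       # assumed time window in which the gestures must occur
MIN_DURATION_SECONDS = 60  # assumed minimum persistence of the activity

def eating_confirmed(gesture_timestamps) -> bool:
    """Confirm eating only if (1) enough gestures were detected, (2) some group
    of MIN_GESTURES falls inside one WINDOW_SECONDS window, and (3) the activity
    as a whole persisted for at least MIN_DURATION_SECONDS."""
    t = sorted(gesture_timestamps)
    if len(t) < MIN_GESTURES:
        return False
    repeated_in_window = any(
        t[i + MIN_GESTURES - 1] - t[i] <= WINDOW_SECONDS
        for i in range(len(t) - MIN_GESTURES + 1)
    )
    persisted = (t[-1] - t[0]) >= MIN_DURATION_SECONDS
    return repeated_in_window and persisted
```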

    3. Alert Activation and Termination Logic

      At predefined meal times, the alert unit is activated to notify the user through sound or vibration. Once activated, the alert continues uninterrupted while the system monitors wrist motion data.

      If eating behavior is not detected, the alert remains active. When valid eating gestures are confirmed based on the defined criteria, the system automatically terminates the alert. Manual alert termination is intentionally disabled to ensure that the alert stops only after eating has occurred.
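      The alert control logic can be summarized as a small state machine, sketched below in Python. The state names and transition function are illustrative; the key point carried over from the description above is that no transition is driven by a button press, so the alert cannot be dismissed manually.

```python
from enum import Enum, auto

class AlertState(Enum):
    IDLE = auto()       # waiting for the next scheduled meal time
    ALERTING = auto()   # buzzer or vibration active, monitoring wrist motion
    CONFIRMED = auto()  # eating detected, alert terminated automatically

def next_state(state: AlertState, meal_time_reached: bool, eating_detected: bool) -> AlertState:
    """Transition rules for the alert control module; there is deliberately
    no manual-off transition."""
    if state is AlertState.IDLE and meal_time_reached:
        return AlertState.ALERTING
    if state is AlertState.ALERTING and eating_detected:
        return AlertState.CONFIRMED
    if state is AlertState.CONFIRMED and not meal_time_reached:
        return AlertState.IDLE    # reset once the meal window has passed
    return state
```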

    4. Methodological Advantages

      The proposed methodology offers several advantages:

      • Fully non-invasive and comfortable for elderly users

      • Eliminates dependence on memory or manual interaction

      • Operates in real time with low power consumption

      • Suitable for continuous daily use in elderly care


    6. SYSTEM WORKFLOW

      This section explains the operational workflow of the proposed wearable system, detailing how automated meal alerts are generated and terminated based on detected eating behavior.

      1. Workflow Description

        The system operates in a sequential and continuous manner, beginning with predefined meal scheduling. When the scheduled meal time is reached, the wearable device activates an alert to notify the user. The alert can be provided through sound or vibration depending on user preference.

        Once the alert is activated, the system continuously monitors wrist motion using the inertial measurement unit. The captured motion data is analyzed in real time to identify hand- to-mouth gestures associated with eating. If eating behavior is not detected, the alert remains active, ensuring that the reminder persists until the intended action is completed.

        When repetitive and validated hand-to-mouth gestures are detected for a predefined duration, the system confirms that eating has occurred. Upon confirmation, the alert is automatically terminated without requiring any manual user interaction.

        Fig.3 Workflow

      2. Step-by-Step Workflow Operation

        The workflow of the proposed system can be summarized as follows:

        1. The wearable device initializes and waits for the scheduled meal time.

        2. At the scheduled time, the alert unit is activated.

        3. Wrist motion data is continuously captured by the IMU sensor.

        4. The system analyzes motion patterns to detect hand-to-mouth gestures.

        5. Gesture validation rules are applied to confirm eating behavior.

        6. Once eating is confirmed, the alert is automatically stopped.

        7. The system resets and waits for the next scheduled meal time.
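        The seven steps above can be traced in the following minimal Python sketch. The objects `imu`, `buzzer`, and `schedule`, and the `confirm_eating` function are hypothetical placeholders for the hardware drivers and the rule-based detector; they are named here only to show how the workflow fits together.

```python
import time

def run_meal_alert_loop(imu, buzzer, schedule, confirm_eating):
    """Hypothetical end-to-end loop mirroring workflow steps 1-7."""
    while True:
        # 1. Wait for the scheduled meal time.
        if not schedule.is_meal_time():
            time.sleep(1)
            continue
        # 2. Activate the alert unit.
        buzzer.on()
        gesture_times = []
        # 3-5. Capture wrist motion, detect gestures, and apply validation rules.
        while not confirm_eating(gesture_times):
            gesture_times.extend(imu.read_gesture_timestamps())
            time.sleep(1)
        # 6. Eating confirmed: stop the alert automatically.
        buzzer.off()
        # 7. Reset and wait for the next scheduled meal time.
        schedule.advance_to_next_meal()
```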

      3. Reliability Considerations

      The workflow is designed to prevent accidental or premature alert termination. By enforcing repetitive gesture detection and minimum duration criteria, the system ensures that alerts stop only after genuine eating activity. This improves reliability and makes the system suitable for elderly users with memory impairment.

      Feature                 | Existing Systems | Proposed System
      Alert control           | Manual           | Automatic
      Eating confirmation     | Not ensured      | Gesture-based
      User dependency         | High             | Low
      Suitability for elderly | Limited          | High

      Table II Comparison between existing reminder systems and the proposed system

    7. FUTURE SCOPE

      The proposed non-invasive wearable meal alert system addresses a critical problem in elderly care by ensuring reliable meal confirmation through behavior-based detection. Although the current system focuses on simplicity and practicality, several enhancements can be explored in future work to improve accuracy, scalability, and real-world applicability.

      1. Machine Learning-Based Gesture Recognition

        In future implementations, machine learning techniques can be integrated to improve eating gesture recognition accuracy. Supervised learning models such as decision trees, support vector machines, or lightweight neural networks can be trained using wrist motion data collected during eating activities.

        Personalized models can be developed to adapt to individual eating behaviors, considering variations in speed, hand movement, and posture. This would further reduce false detections and enhance system reliability, especially for elderly users with unique motion patterns.
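        As one possible direction, the sketch below trains a shallow decision tree on simple statistical features computed over windows of accelerometer data. The feature set, window labels, and use of scikit-learn are assumptions for illustration; they are not part of the current rule-based system.

```python
# Illustrative sketch of a lightweight learned gesture classifier.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def window_features(accel_window: np.ndarray) -> np.ndarray:
    """Per-window features: mean, standard deviation, and range for each axis."""
    return np.concatenate([
        accel_window.mean(axis=0),
        accel_window.std(axis=0),
        accel_window.max(axis=0) - accel_window.min(axis=0),
    ])

def train_eating_classifier(accel_windows, labels):
    """accel_windows: list of (N x 3) arrays; labels: 1 = eating, 0 = other."""
    X = np.stack([window_features(w) for w in accel_windows])
    clf = DecisionTreeClassifier(max_depth=4)  # shallow tree keeps inference cheap
    clf.fit(X, labels)
    return clf
```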

      2. Personalization and Adaptive Thresholds

        Currently, the system uses predefined thresholds for gesture repetition and duration. Future versions can incorporate adaptive thresholds that automatically adjust based on the user's daily behavior. This personalization would make the system more robust and suitable for long-term use across different individuals.

        Adaptive learning can also help accommodate changes in eating behavior due to aging, illness, or recovery.
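        One simple way such an adaptive threshold could be realized is sketched below: the minimum eating duration is nudged toward each user's typical confirmed meal duration using an exponential moving average. The smoothing factor and the bounds are assumptions for illustration.

```python
def update_min_duration(current_threshold: float,
                        observed_duration: float,
                        alpha: float = 0.1,    # assumed smoothing factor
                        lower: float = 30.0,   # assumed lower bound in seconds
                        upper: float = 180.0) -> float:
    """Blend the existing threshold with the latest confirmed meal duration
    and clamp the result to a safe range."""
    blended = (1.0 - alpha) * current_threshold + alpha * observed_duration
    return max(lower, min(upper, blended))
```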

      3. Caregiver Monitoring and Mobile Application Integration

        A caregiver-support mobile application can be developed to complement the wearable system. The application can display daily meal logs, alert caregivers if a meal is missed, and provide historical data for monitoring nutritional adherence.

        Such integration would be particularly useful in home care and assisted living environments, enabling caregivers to intervene only when necessary and reducing continuous supervision.

      4. Differentiation Between Eating and Similar Activities

        Future research can focus on improving the system's ability to distinguish eating from similar hand-to-mouth activities such as drinking water, brushing teeth, or phone usage. This can be achieved by analyzing additional motion features or combining gesture patterns with contextual information such as time and duration.

        Improved activity differentiation would further enhance system accuracy and trustworthiness.

      5. Clinical Trials and Real-World Deployment

        Future work should include pilot studies and clinical trials in real-world elderly care settings. Testing the system with elderly users and caregivers will provide valuable feedback on comfort, usability, and effectiveness.

        Such validation will help refine the system design, establish performance benchmarks, and support large-scale deployment in hospitals, assisted living facilities, and home care environments.

        Aspect       | Benefit
        Non-invasive | Comfortable for elderly
        Automated    | No manual interaction
        Low-cost     | Affordable deployment
        Privacy-safe | No camera or audio

        Table III Advantages of the proposed wearable system

      6. Integration with Other Assistive Healthcare Systems

      The wearable system can be extended to integrate with other assistive technologies such as medication reminder systems, hydration monitoring, and nutritional tracking platforms. This would allow the development of a comprehensive elderly care wearable that supports multiple daily activities essential for health and well-being.

    8. CONCLUSION

    This paper presented a simple, non-invasive wearable system designed to assist elderly individuals with memory impairment by providing automated meal-time alerts that stop only after eating is detected. Unlike conventional reminder systems that rely on manual user acknowledgment, the proposed approach ensures reliable meal confirmation through behavior-based detection using wrist-based hand-to-mouth gesture recognition.

    The system eliminates dependence on memory or user interaction, making it particularly suitable for elderly users who may experience cognitive decline. Its non-invasive design, low-cost hardware requirements, and privacy-preserving operation make it practical for real-world deployment in home care, assisted living facilities, and hospital environments.

    By integrating automated alert control with eating detection, the proposed system addresses a critical gap in elderly care technologies. The approach has the potential to improve nutritional adherence, enhance independent living, and reduce caregiver burden. Future enhancements such as machine learning integration, personalization, and caregiver monitoring can further strengthen the system's effectiveness and scalability.

    Overall, the proposed wearable system demonstrates a meaningful and socially impactful solution for supporting daily meal intake in elderly care and serves as a strong foundation for further research and development in assistive healthcare technologies.

    ACKNOWLEDGMENT

    The author would like to express sincere gratitude to the faculty members and mentors for their guidance and support throughout this research work. Their valuable suggestions contributed significantly to the completion of this paper.

    The author also acknowledges the support and encouragement of her family. This work was motivated by a real-life elderly care scenario, which inspired the development of this research.

    REFERENCES

    1. World Health Organization, "Ageing and health," Fact Sheet, World Health Organization, 2023.

    2. A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man, and Cybernetics, Part C, vol. 40, no. 1, pp. 1–12, 2010.

    3. J. M. Fontana, M. A. Sazonov, and J. A. Sazonov, "A wrist-worn sensor for automatic detection of eating," IEEE Sensors Journal, vol. 14, no. 6, pp. 1941–1952, 2014.

    4. S. Bi, M. Y. Chen, and J. A. Sazonov, "Automatic eating detection using inertial sensors," IEEE Journal of Biomedical and Health Informatics, vol. 21, no. 3, pp. 866–875, 2017.

    5. S. Patel, H. Park, P. Bonato, L. Chan, and M. Rodgers, "A review of wearable sensors and systems with application in rehabilitation," Journal of NeuroEngineering and Rehabilitation, vol. 9, no. 21, pp. 1–17, 2012.

    6. O. Amft and G. Tröster, "Recognition of eating and drinking activities using inertial sensors," Proceedings of the IEEE International Symposium on Wearable Computers, pp. 1–4, 2008.

    7. R. Paradiso and G. Loriga, "Wearable health monitoring systems for elderly care," IEEE Engineering in Medicine and Biology Magazine, vol. 24, no. 4, pp. 37–43, 2005.

    8. M. A. Rashid, A. M. Hasan, and M. M. Rahman, "Wearable health monitoring systems for elderly care: A review," International Journal of Biomedical Engineering and Technology, vol. 31, no. 2, pp. 123–137, 2019.