The Tree Stethoscope: An Intelligent Bioacoustic Denoising System for Early Detection of Rhinoceros Beetle Larvae

DOI: https://doi.org/10.5281/zenodo.18937684

Adarsh Abraham Johnson

Dept. of Computer Science & Engineering, Toc H Institute of Science & Technology, Kerala, India

Arjun T Aghilesh

Dept. of Computer Science & Engineering, Toc H Institute of Science & Technology, Kerala, India

Anan M Binoj

Dept. of Computer Science & Engineering, Toc H Institute of Science & Technology, Kerala, India

Chris Kuriakose

Dept. of Computer Science & Engineering, Toc H Institute of Science & Technology, Kerala, India

Ms. Chinnu Edwin A

Assistant Professor, Dept. of CSE, Toc H Institute of Science & Technology, Kerala, India

Abstract – Coconut cultivation serves as the backbone of Kerala's agricultural economy, providing a primary source of income for millions of rural households and supporting diverse downstream industries. However, this vital sector is currently under siege by an invisible predator: the larvae of the Rhinoceros beetle (Oryctes rhinoceros). Unlike external pests, these larvae burrow deep into the crown and trunk of the coconut palm, consuming the vital internal tissues and meristem. Because the damage occurs internally, the tree often appears healthy on the exterior while its core is being hollowed out. To bridge this critical detection gap, this paper proposes The Tree Stethoscope, a novel diagnostic system designed to transform pest management from a reactive struggle into a proactive science. The hardware core of the system is a high-sensitivity piezoelectric contact microphone. This sensor captures the minute acoustic signatures, the rhythmic fingerprints of larval chewing and movement within the wood. These subtle mechanical vibrations are amplified and digitized, providing a real-time window into the tree's health. The intelligence of the system is driven by a custom-trained Convolutional Neural Network (CNN) architecture, which has been optimized using TensorFlow Lite for deployment on a Raspberry Pi 4 edge computing node. This allows the device to perform sophisticated signal analysis and classification entirely offline, ensuring it remains functional in remote plantation environments with limited connectivity. During rigorous field testing, the system demonstrated exceptional performance, clearly separating the ambient noise floor of roughly 30% from the 80% signal spike produced by active larvae while completing the full diagnostic cycle in approximately 410 ms.

Index Terms – Bioacoustic Sensing, Edge AI, Raspberry Pi, CNN, Precision Agriculture, Rhinoceros Beetle.

  1. INTRODUCTION

    Agricultural economies, particularly those specialized sectors that are fundamentally reliant on the cultivation of high-value, long-term perennial crops such as the coconut palm, represent inherently high-risk zones that are perpetually vulnerable to sudden, devastating pest-related hazards. Within the specific geographical and economic landscape of Kerala, the multi-dimensional economic impact of the Rhinoceros beetle (Oryctes rhinoceros) is nothing short of catastrophic, primarily because coconut cultivation serves as the vital, irreplaceable backbone of the state's rural economic structure. The highly destructive larvae of this pest do not feed on the visible exterior of the tree; instead, they instinctively burrow deep into the protected crown and the dense internal trunk substrate. Within this concealed environment, the larvae can remain entirely hidden from human observation for several consecutive months, silently and systematically consuming the vital internal vascular tissues and the critical, soft meristem of the palm long before any external physical signs of distress begin to manifest.

    This biological behavior creates a critical, time-sensitive phenomenon known as detection lag, a period during which the tree appears deceptively healthy on the outside while its internal core is being aggressively hollowed out from within. By the time visual symptoms, which typically include the characteristic V-shaped cuts on the developing fronds or the widespread yellowing of the upper canopy, finally become apparent to the farmer, the internal structural integrity of the palm is often already compromised far beyond the point of repair. This delay in identification inevitably results in irreversible internal rot, the eventual death of the palm, and a total loss of the farmer's long-term agricultural investment.

    This systemic lag in detection inevitably forces farmers into a profoundly detrimental and fundamentally reactive management cycle. Precisely because they lack the technical capability to see or otherwise perceive the infestation during its early, manageable stages, they are often left with no viable choice but to engage in large-scale, costly crop removal or the excessive, late-stage application of potent chemical pesticides. This reactive approach is not only severely economically draining due to the high associated costs of specialized labor and the total loss of potential crop yield, but it is also profoundly ecologically damaging, as the heavy chemical usage required to treat advanced infestations negatively affects the surrounding soil health and leaches into the local water table.

    To successfully break this destructive cycle, there is an urgent need for the development of innovative, non-invasive diagnostic technologies that possess the unique ability to penetrate the formidable visual barrier of the tree trunk. By strategically shifting the technological focus from traditional, reactive visual monitoring to proactive, intelligent bioacoustic sensing, farmers can finally identify the subtle mechanical chewing sounds and vibrations produced by larvae deep within the wood. This proactive capability allows for targeted, minimal-intervention strategies that effectively preserve both the physical structural integrity of the individual tree and the overarching economic livelihood of the farmer.

    Fig. 1. Internal damage and larval growth within the coconut tree trunk.

    Current detection methods rely almost entirely on intermittent manual monitoring and visual observation. By the time canopy yellowing or frond damage is visible, the internal structure of the tree is often compromised beyond repair. This highlights an urgent need for an innovative, non-invasive diagnostic tool for early identification. We propose an intelligent system that integrates bioacoustic perception with advanced AI-driven denoising. By sensing the acoustic fingerprint of larval chewing sounds directly from the wood, the system enables proactive intervention before catastrophic damage occurs.

  2. BACKGROUND AND RELEVANCE

    The established frameworks governing traditional pest detection protocols within the coconut industry are fundamentally and structurally limited by the inherent constraints of human sensory thresholds. The primary operational constraint of the current, widespread wait-and-see approach is that it is designed to detect a biological problem only after it has physically manifested to the point of visibility. This intrinsic design limitation often leads directly to late-stage detection scenarios in which the infestation has already reached a catastrophic threshold. In such advanced stages, the internal damage to the vascular system of the palm is so severe that the only viable solution remaining for the farmer is the complete removal and destruction of the tree.

    Furthermore, the existing reliance on manual monitoring and intermittent human observation lacks the real-time insight required to gain a comprehensive understanding of a tree's internal health. Because these visual-centric systems are entirely external, they fundamentally lack the capability to identify internal larval activity that occurs deep within the trunk substrate. This combination of limited sensory data and operational rigidity creates a profoundly unmet need for an intelligent, non-invasive, and proactive diagnostic solution. The transition from passively waiting for the appearance of visible external symptoms to the active, real-time sensing of internal bioacoustic signals represents an essential step for the modernization of agricultural safety protocols. By integrating high-sensitivity piezoelectric microphones and Edge AI inference, The Tree Stethoscope addresses these established failures of traditional reactive inspection methods.

    Fig. 2. Application of bioacoustic sensors for non-invasive trunk monitoring.

    Building on this shift toward active sensing, The Tree Stethoscope leverages precision agriculture principles to provide a low-cost, smart, and non-invasive tool for immediate diagnosis. It eliminates the need for expensive laboratory analysis or specialist expertise, making it accessible to individual small-scale farmers in network-constrained rural environments.

  3. LITERATURE REVIEW

    Mankin et al. [1] investigated the acoustic characteristics of dynastid beetle stridulations and identified key spectral and temporal patterns that form the basis for insect sound recognition. Their study established fundamental acoustic parameters useful for detecting wood-boring insect activity. Building upon signal enhancement techniques, Liu et al. [2] proposed an artificial intelligence-based acoustic denoising framework to improve pest detection accuracy in noisy environments, significantly enhancing signal clarity and classification reliability.

    Albanese et al. [3] introduced an automated pest detection system using deep neural networks (DNNs) deployed on edge devices, demonstrating reduced latency and improved real-time monitoring capabilities. Similarly, Madiwal et al. [4] presented an Edge AI and IoT-based framework for real-time crop disease detection, highlighting the scalability and responsiveness of distributed intelligent systems in agriculture. Bilski et al. [5] focused on detecting wood-boring insect larvae through acoustic signal analysis, employing spectral feature extraction techniques to differentiate insect sounds from environmental noise.

    Jalinas et al. [6] explored acoustic signal applications for detecting and managing Rhynchophorus species in fruit crops and ornamental palms, emphasizing the practical field deployment of acoustic monitoring systems. Dong et al. [7] further advanced agricultural monitoring by integrating deep learning and edge computing within IoT-based environmental monitoring systems, demonstrating improved efficiency in smart farming applications.

    Arjomandi et al. [8] provided a comprehensive review of acoustic communication in bark beetles, summarizing over 150 years of research and underscoring the importance of bioacoustic understanding for intelligent pest surveillance. In the field of lightweight deep learning models, Sandler et al. [9] proposed MobileNetV2, featuring inverted residuals and linear bottlenecks, making it highly suitable for deployment on resource-constrained edge devices.

    Kim et al. [10] conducted a comparative study of deep learning models for multi-modal industrial fire detection, demonstrating the importance of selecting efficient architectures for real-time detection tasks. Gupta et al. [11] analyzed LiDAR performance degradation in aerosol-dense environments, highlighting environmental limitations that affect sensor-based detection systems. Finally, Chen et al. [12] proposed a thermal-visual fusion approach for enhanced flame localization and tracking, illustrating the benefits of multi-modal sensing in improving detection accuracy and robustness.

    Overall, the literature demonstrates substantial progress in acoustic pest detection, AI-based denoising, deep learning classification, IoT integration, and edge computing deployment. However, there remains a research gap in developing an integrated, lightweight, and real-time acoustic pest monitoring system that combines robust denoising, efficient deep learning architectures, and scalable edge-based implementation for practical agricultural applications.

  4. SYSTEM ARCHITECTURE AND METHODOLOGY

    The Tree Stethoscope is designed as an autonomous diagnostic framework capable of providing real-time situational intelligence.

    Fig. 3. High-level system architecture of the Tree Stethoscope sensing node.

    1. Module 1: Data Acquisition (The Perception Layer)

      The data acquisition unit utilizes a high-sensitivity piezoelectric contact microphone firmly attached to the tree surface. Unlike airborne microphones, which are prone to picking up environmental sounds like wind or rain, piezoelectric sensors are specifically designed to capture solid-borne vibrations.

      These sensors function by converting mechanical stress, such as the faint vibrations caused by larvae chewing through wood, into electrical signals. By effectively rejecting ambient noise and focusing only on vibrations propagating through the dense trunk substrate, the system achieves the high signal-to-noise ratio essential for accurate AI classification. A minimal acquisition sketch follows the component list below.

      • Sensing Unit: Piezoelectric disks capture the subtle mechanical chewing and movement sounds of larvae.
      • Digital Conversion: A USB Sound Card provides analog to digital conversion (ADC) for processing.
      • Compute Unit: A Raspberry Pi 4 acts as the edge node for real time signal analysis.
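      As a rough illustration of this acquisition chain, the following Python sketch records a short clip from the USB sound card on the Raspberry Pi. It is a minimal sketch only: the sounddevice library, the 44.1 kHz sample rate, the 10-second window, and the default device selection are assumptions for illustration rather than the exact field configuration.

```python
# Minimal acquisition sketch (assumed parameters): record a short clip from the
# USB sound card attached to the piezoelectric pickup. Device index, sample
# rate, and clip length are illustrative, not the deployed settings.
import sounddevice as sd
from scipy.io.wavfile import write

SAMPLE_RATE = 44100      # Hz, typical USB sound card rate (assumption)
DURATION = 10            # seconds per monitoring window (assumption)

def record_clip(filename, device=None):
    """Capture one mono clip from the piezo/USB sound card chain."""
    audio = sd.rec(int(DURATION * SAMPLE_RATE), samplerate=SAMPLE_RATE,
                   channels=1, dtype="int16", device=device)
    sd.wait()                          # block until the buffer is full
    write(filename, SAMPLE_RATE, audio)

if __name__ == "__main__":
    record_clip("trunk_clip.wav")
```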
    2. Module 2: Signal Preprocessing and Feature Engineering

      Raw data is processed using band-pass filtering to isolate the frequency range of larval activity, typically filtering out low-frequency environmental noise and high-frequency electronic interference.

      The cleaned signals are then converted into Mel-spectrograms, creating a 2D visual representation of the sound's frequency content and intensity over time. This conversion maps the linear frequency scale to the Mel scale, which more closely approximates human auditory perception and emphasizes the spectral features the CNN uses to distinguish larval chewing from background plantation noise.
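      A minimal sketch of this preprocessing stage is shown below, assuming SciPy for the band-pass filter and Librosa for the Mel-spectrogram. The cut-off frequencies and Mel parameters (n_mels, n_fft, hop_length) are illustrative placeholders, not the tuned values used in the deployed system.

```python
# Preprocessing sketch: band-pass filter the raw clip, then build a
# log-scaled Mel-spectrogram for the CNN. Cut-offs and Mel settings are
# assumptions for illustration.
import librosa
import numpy as np
from scipy.signal import butter, sosfiltfilt

LOW_HZ, HIGH_HZ = 200.0, 6000.0   # assumed band of larval activity

def bandpass(signal, sr):
    """Zero-phase Butterworth band-pass to suppress rumble and hiss."""
    sos = butter(4, [LOW_HZ, HIGH_HZ], btype="bandpass", fs=sr, output="sos")
    return sosfiltfilt(sos, signal)

def to_mel_spectrogram(path):
    """Load a clip, filter it, and return a log-scaled Mel-spectrogram."""
    y, sr = librosa.load(path, sr=None, mono=True)
    y = bandpass(y, sr)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64,
                                         n_fft=1024, hop_length=256)
    return librosa.power_to_db(mel, ref=np.max)
```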

    3. Module 3: AI Classification Module

      Fig. 4. Detailed diagnostic workflow from signal capture to AI classification.

      The core intelligence of the system is a lightweight Convolutional Neural Network (CNN) specifically architected to recognize spatial patterns and frequency textures within the generated Mel-spectrograms. By treating the acoustic signature of the Rhinoceros beetle larvae as a visual classification problem, the model can effectively distinguish between the chaotic, low-amplitude chewing sounds of an infestation and the stochastic background noise of a plantation. To transition this model from a high-resource training environment to a field-deployable tool, it is optimized using TensorFlow Lite (TFLite). This optimization involves post-training quantization and pruning, which significantly reduces the model footprint and computational requirements, enabling real-time inference locally on the Raspberry Pi 4 CPU. This Edge AI approach ensures high-speed diagnostic results with sub-second latency while maintaining total data privacy and operational independence from cloud-based infrastructure.
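      The optimization step can be illustrated with the standard TensorFlow Lite conversion workflow below. This is a sketch under assumptions: the SavedModel directory name and the representative-data generator are placeholders, and pruning (mentioned above) would normally be applied during training with the TensorFlow Model Optimization toolkit rather than at conversion time.

```python
# Conversion sketch: shrink a trained Keras CNN for the Raspberry Pi using
# TensorFlow Lite post-training quantization. The model path and the
# representative-data generator are placeholders.
import tensorflow as tf

def convert_to_tflite(saved_model_dir, rep_data):
    """Apply default post-training quantization and return the TFLite bytes."""
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = rep_data   # callable yielding [input] batches
    return converter.convert()

# Example usage (hypothetical names):
# tflite_bytes = convert_to_tflite("cnn_savedmodel", rep_data=spectrogram_batches)
# open("stethoscope.tflite", "wb").write(tflite_bytes)
```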

    4. Module 4: User Interface (UI)

      Inference results are mapped to high-visibility LED indicators, providing an immediate and intuitive diagnostic output for the user. This interface is designed for simplicity, allowing farmers to quickly determine the health status of a tree without needing to interpret complex data graphs or numerical values on a screen.

      The mapping of the inference outcomes to the hardware output is as follows (a minimal GPIO sketch appears after the list):

        • Green LED: Indicates a healthy tree.
        • Red LED: Indicates an infested tree requiring immediate attention.
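      A minimal sketch of this LED mapping, assuming the gpiozero library on Raspberry Pi OS, is shown below; the BCM pin numbers and the 0.5 decision threshold are illustrative assumptions.

```python
# UI sketch: map a binary classifier score to the two LEDs. Pin numbers and
# the decision threshold are assumptions for illustration.
from gpiozero import LED

green_led = LED(17)   # "healthy" indicator (assumed BCM pin)
red_led = LED(27)     # "infested" indicator (assumed BCM pin)

def show_result(infestation_probability, threshold=0.5):
    """Light exactly one LED according to the model's output probability."""
    if infestation_probability >= threshold:
        red_led.on()
        green_led.off()
    else:
        green_led.on()
        red_led.off()
```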
  5. IMPLEMENTATION AND TRAINING
    1. Data Collection

      A custom dataset was curated by recording high-fidelity bioacoustic signals directly from healthy and confirmed infested coconut palms across a diverse range of plantations in Kerala. This data acquisition process involved the careful placement of high-sensitivity piezoelectric sensors against the trunk surface to capture the subtle, solid-borne mechanical vibrations generated by the internal movement of the larvae and the repetitive scraping of their mandibles against the wood fibers.

      To ensure the statistical robustness and generalizability of the classification model, approximately 500 unique audio samples were curated and labeled to create a balanced dataset. This balanced design ensures that the CNN learns to distinguish between the baseline environmental noise floor (typically observed at around 30% of full-scale amplitude) and the high-amplitude spectral signatures (reaching roughly 80%) that are characteristic of an internal Rhinoceros beetle infestation.

      To further improve the robustness of the Convolutional Neural Network (CNN) and prevent overfitting, data augmentation techniques were applied to the training set. These included time-stretching, pitch shifting, and the addition of synthetic background noise. The augmentations simulate a wider variety of real-world field conditions, such as varying wind speeds or different trunk densities, ensuring that the Tree Stethoscope remains accurate and reliable regardless of the specific environment in which it is deployed.
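      The augmentation step can be sketched with Librosa and NumPy as below; the stretch rate, pitch step, and noise level shown here are illustrative assumptions rather than the exact values used during training.

```python
# Augmentation sketch: the three transforms mentioned above, implemented with
# librosa and NumPy. Stretch rate, pitch step, and noise level are assumed.
import librosa
import numpy as np

def augment(y, sr):
    """Return time-stretched, pitch-shifted, and noise-added variants of y."""
    stretched = librosa.effects.time_stretch(y, rate=0.9)        # slow down by 10%
    shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=2)   # +2 semitones
    noisy = y + 0.005 * np.random.randn(len(y)).astype(y.dtype)  # synthetic noise
    return [stretched, shifted, noisy]
```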

    2. Hardware and Software Specifications

    The system is engineered using a robust combination of open-source software frameworks and cost-effective, readily available hardware components. This modular design philosophy keeps the project scalable, economically accessible, and capable of decentralized deployment within the resource-constrained and often geographically isolated agricultural environments typical of rural plantations.

    The hardware stack is centered around the Raspberry Pi 4, a compact single-board computer that provides the necessary computational throughput for real-time, edge-based AI inference while maintaining a low power profile suitable for portable, battery-operated field use. This core compute unit is integrated with a specialized signal chain, including a MAX4466 low-noise operational amplifier and a high-sensitivity piezoelectric contact microphone, which together allow for the precise acquisition of solid-borne bioacoustic vibrations.

    On the software side, the integration of TensorFlow Lite and the Librosa library allows for sophisticated signal processing and deep learning without the need for high-cost proprietary licenses or specialized industrial servers. This synergy between affordable hardware and open-source intelligence is what makes The Tree Stethoscope a viable solution for large-scale adoption by rural farming communities.

        • Hardware: Raspberry Pi 4, Piezoelectric Mic, USB Sound Card, LiPo Power System.
        • Software: Raspberry Pi OS, TensorFlow Lite, Librosa Python library.

    Fig. 5. Vibration detection testing.

  6. EXPERIMENTAL RESULTS

    The system was validated through real-time field testing, which established a clear performance benchmark for signal discrimination. During idle states in the plantation environment, the baseline noise floor consistently remained at approximately 30%. This baseline represents the ambient environmental sounds, such as wind, distant machinery, and rustling leaves, that the piezoelectric sensor successfully rejects through its solid-borne conduction. Upon the introduction of larval activity, a dramatic detection spike to 80% was observed. This sharp increase in signal amplitude confirms that the mechanical chewing and movement of the Rhinoceros beetle larvae generate distinct acoustic energy that far exceeds environmental interference.

    Fig. 6. Vibration frequency testing.

    This 50% margin between the idle state and active infestation provides a robust operational window for the CNN classifier, ensuring high sensitivity while significantly reducing the likelihood of false positives during field deployment. In testing, the prototype successfully detected larval vibrations, indicated by the red LED, and the green LED remained lit when no vibrations were present.
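    To make the reported operational window concrete, the short sketch below expresses a clip's RMS level as a percentage of full scale, so an idle reading near 30% can be separated from an active reading near 80% by a mid-point threshold. The threshold value and the interpretation of the percentages are assumptions for illustration.

```python
# Illustration of the reported operational window: express a clip's RMS level
# as a percentage of full scale and compare it against a mid-point alert
# threshold. Values are assumptions, not the system's calibrated settings.
import numpy as np

def level_percent(y):
    """RMS amplitude of a normalized [-1, 1] signal, as a percentage."""
    return float(np.sqrt(np.mean(np.square(y))) * 100.0)

def is_suspect(y, threshold_percent=55.0):
    """Flag clips whose level sits above the assumed idle/infested mid-point."""
    return level_percent(y) > threshold_percent
```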

    TABLE I

    System Performance and Latency Metrics

    Module Hardware/Model Latency
    Data Capture Piezoelectric Mic 50 ms
    Processing Librosa (Python) 150 ms
    AI Inference TFLite (CNN) 200 ms
    Output GPIO/LED 10 ms
    Total 410 ms
  7. PROJECT PLANNING AND FEASIBILITY
    1. Task Allocation

      The task allocation for The Tree Stethoscope was strategically distributed among the four-member team to ensure the parallel development of critical hardware and software components. This collaborative approach allowed for the simultaneous refinement of the AI pipeline and the physical sensing unit, ensuring a seamless integration of all system modules.

      TABLE II

      Team Task Allocation

      Task Allocated Resource
      Dataset Curation Anan M Binoj
      Signal Preprocessing Arjun T Aghilesh
      Hardware Integration Adarsh Abraham Johnson
      CNN Development Chris Kuriakose

    2. Budget Analysis

    The prototype of The Tree Stethoscope is engineered as a cost-effective solution for small-scale and commercial farmers, with a total estimated cost of approximately INR 20,500. This pricing strategy makes the device a highly accessible and affordable alternative to specialized, professional-grade agricultural sensors that often require significant capital investment.

    TABLE III

    Detailed Project Budget

    Item Estimated Cost (INR)
    Raspberry Pi 4 16,000
    Sensors and ADC 1,300
    Power and Enclosure 2,500
    Misc. Components 700
    Total 20,500
  8. IMPACT AND CONCLUSION

The Tree Stethoscope represents a pivotal shift in modern precision agriculture, transitioning pest management from traditional, reactive visual inspections to proactive, data-driven diagnostics. By utilizing a high-sensitivity piezoelectric contact microphone integrated with a Raspberry Pi, the system captures the subtle bioacoustic fingerprints of wood-boring larvae, specifically the unpatterned, low-frequency mechanical vibrations caused by their mandibles scraping against internal wood fibers, which are otherwise undetectable to the unassisted human ear. This technology provides a rapid, non-invasive method to protect high-value perennial crops like coconut palms, allowing for early-stage intervention before the pest can cause irreversible damage to the tree's internal meristem and structural integrity.

From a technical standpoint, the system's reliance on decentralized Edge AI ensures that these diagnostics remain accessible even in the most remote and geographically isolated agricultural regions. By performing high-speed computation locally on a Raspberry Pi 4 using a custom CNN model, the system eliminates the need for reliable internet connectivity, which has historically been a major barrier to technology adoption in rural plantations. The local inference process, optimized with TensorFlow Lite, provides instantaneous and unambiguous Healthy or Infested feedback, empowering individual farmers with real-time situational intelligence. This integration of affordable, off-the-shelf hardware and open-source intelligence not only reduces operational overhead but also fosters a more resilient, self-sufficient, and technologically advanced farming community capable of safeguarding its agricultural assets against invisible, internal threats.

The implementation of such a system has a direct and profound impact on long-term farm sustainability and economic resilience. By identifying infestations during the critical detection-lag stage, the period in which larvae are active internally but no external symptoms such as canopy yellowing are visible, farmers can move away from broad, wasteful pesticide applications. This diagnostic precision allows for targeted treatments, which significantly reduces the volume of chemical pesticides introduced into the environment, thereby protecting the surrounding soil and water table from unnecessary contamination. Furthermore, early detection prevents total crop loss, ensuring that the millions of rural households reliant on coconut cultivation can maintain a steady income and avoid the catastrophic financial strain caused by late-stage tree death.

REFERENCES

  1. R. W. Mankin, J. M. Moore, and R. D. Cave, Acoustic characteristics of dynastid beetle stridulations, Florida Entomologist, vol. 92, no. 1, pp. 123–132, 2009.

  2. X. Liu, Y. Zhang, and H. Li, Acoustic denoising using artificial intelligence for wood-boring pests, Sensors, vol. 22, no. 10, 2022.
  3. A. Albanese, M. Nardello, and G. Rossi, Automated pest detection with DNN on the edge, IEEE Journal on Emerging and Selected Topics in Circuits and Systems, 2021.
  4. A. S. Madiwal, P. Kulkarni, and S. Patil, Edge AI and IoT for real-time crop disease detection, International Journal of Research and Innovation in Applied Science (IJRIAS), vol. 10, no. 7, 2025.
  5. P. Bilski, M. Kacprzak, and T. Nowakowski, Detection of wood boring insects larvae based on the acoustic signal analysis, Archives of Acoustics, vol. 42, no. 1, pp. 61–70, 2017.
  6. J. Jalinas, R. W. Mankin, and J. L. Capinera, Acoustic signal applications in detection and management of Rhynchophorus spp. in fruit crops and ornamental palms, Florida Entomologist, vol. 102, no. 3, pp. 475–479, 2019.
  7. M. Dong, Y. Chen, and Z. Li, Research on agricultural environmental monitoring Internet of Things based on edge computing and deep learning, Journal of Intelligent Systems, vol. 33, p. 20230114, 2024.
  8. E. Arjomandi, F. Lieutier, and T. D. Paine, Acoustic communication in bark beetles (Scolytinae): 150 years of research, Physiological Entomology, vol. 49, no. 4, pp. 281–300, 2024.
  9. M. Sandler, A. Howard, M. Zhu, A. Zhmoginov, and L.-C. Chen, MobileNetV2: Inverted residuals and linear bottlenecks, in Proc. IEEE Conf. Computer Vision and Pattern Recognition (CVPR), pp. 4510–4520, 2018.
  10. D. Kim, S. Park, and J. Lee, A comparative study of deep learning models for multi-modal industrial fire detection, IEEE Transactions on Industrial Informatics, vol. 20, no. 1, pp. 500–510, 2024.
  11. M. Gupta, V. Sharma, and R. Kumar, LiDAR performance degradation analysis in aerosol-dense environments for autonomous vehicles, Journal of Field Robotics, vol. 41, no. 3, pp. 450–465, 2024.
  12. J. Chen, L. Wang, and Z. Sun, Thermal-visual fusion for enhanced flame localization and tracking in robotic fire suppression, Sensors, vol. 23, no. 18, 2023.