DOI : https://doi.org/10.5281/zenodo.19914812
- Open Access
- Authors : Ankit Chauhan, Dixit Sharma, Mohammed Ashif, Dr. Veerendra Yadav
- Paper ID : IJERTV15IS042915
- Volume & Issue : Volume 15, Issue 04 , April – 2026
- Published (First Online): 30-04-2026
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License:
This work is licensed under a Creative Commons Attribution 4.0 International License
A Small-Scale FPV Framework for Real-Time Telemetry and Surveillance in Disaster Blindspots
Ankit Chauhan
Dept. of Computer Science Engineering School of Engineering and Technology Noida International University Greater Noida, India
Mohammed Ashif
Dept. of Computer Science Engineering School of Engineering and Technology Noida International University Greater Noida, India
Dixit Sharma
Dept. of Computer Science Engineering School of Engineering and Technology Noida International University Greater Noida, India
Dr. Veerendra Yadav
Dept. of Computer Science Engineering School of Engineering and Technology Noida International University Greater Noida, India
Abstract—In disaster situations like building collapses, industrial explosions, or floods, responders often struggle with limited situational awareness. Some areas are out of reach for people on the ground and cannot be seen by fixed or mast-mounted cameras. We refer to these as disaster blindspots. To address this, we developed a low-cost, first-person-view (FPV) multirotor drone weighing less than 420 grams. The airframe is made from PETG filament using a standard FDM 3D printer, and all electronics are off-the-shelf parts, keeping the total cost around INR 40,000 (about USD 470). This platform was developed at the Innovation and Incubation Lab, Noida International University, and participated in the Inter-University Drone Competition, BITS Pilani Hyderabad Campus. The system uses a three-level PID control setup to manage rate, attitude, and position. It works with a complementary filter sensor fusion system that combines data from the IMU, barometric altitude sensor, and GPS to give a fast state estimate. A 200 mW, 5.8 GHz analog video link sends real-time FPV video up to 500 to 800 meters when there is a clear line of sight. At the same time, a SiK 915 MHz MAVLink 2.0 telemetry channel sends flight data to the ground control station. The flight controller operates on Betaflight 4.4 firmware and is configured with a cascaded PID control loop along with a Madgwick-based complementary filter for sensor fusion. A total of 60 trials were conducted across three simulated disaster environments. Of these, six flights were excluded: four due to excessive vibration causing IMU saturation, one following a propeller strike, and one after a GPS cold-start failure, leaving 54 valid flights. In addition, a scalable swarm architecture is proposed, with the potential to integrate object detection models such as YOLO and Vision Transformers for autonomous victim identification in search-and-rescue scenarios.
Index Terms—FPV drone; disaster blindspot; 3D-printed UAV; low-cost multirotor; Betaflight; sensor fusion; cascaded PID; search and rescue; YOLO; Vision Transformer; MAVLink; 5.8 GHz video.
-
Introduction
Disasters significantly reduce situational awareness on the ground. Rescue teams must quickly locate survivors, identify structural risks, and assess environmental hazards. However, disasters often damage the infrastructure required to gather such information. Broken roads, disrupted power lines, and non-functional camera systems limit data collection capabilities.
The resulting information gap is not uniform across the affected area. While some regions can still be monitored using drones or edge-mounted cameras, many areas remain hidden, obstructed, or too hazardous for conventional unmanned aerial vehicles (UAVs). We refer to these inaccessible regions as disaster blindspots: enclosed or partially enclosed spaces where sensing is impaired by physical barriers, signal attenuation, or turbulent airflow.
Traditional approaches have inherent limitations. Pre-installed camera systems are cost-effective but can be obstructed by debris, smoke, or structural collapse. Large commercial drones can carry advanced sensing payloads; however, their size (typically exceeding 600 mm in span and 1.5 kg in mass) prevents them from navigating confined spaces such as broken windows, collapsed stairways, or debris tunnels [40]. Fixed-wing UAVs, while efficient for large-area coverage, lack hovering capability and are therefore unsuitable for detailed inspection tasks.
This gap motivates the use of small first-person-view (FPV) multirotor drones. Although such platforms have demonstrated high agility in drone racing, their application in disaster response remains underexplored. This paper addresses this gap by presenting a complete FPV-based framework, including mechanical design, flight control, and ground station integration, specifically tailored for exploring disaster blindspots.
The main contributions of this work are summarized as follows:
- C1: A low-cost FPV platform weighing less than 420 g, built using a 3D-printed PETG X-frame, standard 2306 brushless motors, an F405-based flight controller running Betaflight 4.4, an MPU-6050 IMU, a u-blox M8N GPS, and a BMP280 barometer. The total material cost is approximately INR 40,000 (USD 470). The system was developed at the Innovation and Incubation Laboratory, Noida International University (Section III).
- C2: A Betaflight 4.4 PID control setup [31] using a Madgwick-variant complementary filter [8], achieving an average attitude estimation error of 2.1° ± 0.6° across 54 valid trials. Performance degradation due to vibration in smoke-corridor conditions is also analyzed (Sections III-C, III-D).
- C3: An empirical evaluation of the 5.8 GHz FPV video link and 915 MHz MAVLink 2.0 telemetry channel across three simulated disaster scenarios, including RF loss and multipath effects (Section IV).
- C4: A theoretical swarm coordination framework for disaster perimeter coverage, along with a proposed integration of YOLO and Vision Transformer (ViT) models for autonomous victim detection (Sections V, VI).
-
RELATED WORK
-
UAVs in Disaster Management
The literature on UAVs for disaster response has grown substantially since the 2011 Tōhoku earthquake exposed the limits of ground-based inspection [1], [21]. Delmerico et al. [4] introduced the UZH-FPV drone racing dataset, showing that high-agility flight with full state estimation is possible and setting a key benchmark for platforms operating in confined spaces. Shakhatreh et al. [3] surveyed UAV civil applications in search and rescue, surveillance, and infrastructure inspection, consistently noting the payload-agility trade-off that motivates this work. Queralta et al. [5] offered the first complete review of diverse multi-robot SAR systems, listing the communication, localization, and coverage challenges that directly inform our swarm architecture proposal. UAV trajectory optimization for time-constrained sensing is a related challenge addressed in recent work [56], [57].
-
FPV Platforms and Real-Time Video Links
FPV racing quadrotors were modified for rapid inspection by Foehn et al. [6]. Their Agilicious platform tracked trajectories at up to 5 g and 70 km/h using onboard vision. The 5.8 GHz analog video band used in this system adds an end-to-end latency of 28 to 40 ms, much lower than digital HD links, whose buffering delays can reach 100 to 120 ms and make it difficult for operators to guide the vehicle through tight spaces. Where blind spots must be inspected and real-time operator feedback is critical for safety, this latency difference becomes decisive.
-
Sensor Fusion and State Estimation
Mahony et al. [7] introduced the complementary filter as an alternative to extended Kalman filters for estimating attitude on low-power processors. Their approach combines gyroscope integration with accelerometer gravity correction using a proportional-integral correction term, which supports the 500 Hz filter in our current system. Later, Madgwick [8] developed a gradient-descent quaternion update that works better than the Mahony method at low sample rates. Our implementation uses elements from both methods.
-
Swarm Robotics for SAR Coverage
Queralta et al. [5] identify communication bandwidth and GPS-denied localization accuracy as the two main challenges for deploying swarm UAVs in search and rescue. GPS-denied localization remains an open problem in simultaneous localization and mapping (SLAM) research [2]. Rizk et al. [9] surveyed cooperative heterogeneous multi-robot systems and created a classification of task allocation, coalition formation, and planning methods suitable for covering disaster zones. Walter et al. [10] demonstrated a three-UAV heterogeneous swarm that uses visual fiducial markers for relative localization. This method complements the UWB-based ranging we propose in the swarm extension outlined in Section V.
-
AI-Augmented Aerial Surveillance
The YOLO family of single-stage detectors [11], [51], [52] has transformed real-time object detection on embedded processors. For example, YOLOv8-nano reaches 37.3 mAP on COCO with a 3.2 ms inference time on a Jetson Orin Nano. Vision Transformers [12] have shown better robustness to smoke and low-contrast thermal conditions than convolutional architectures, which is especially important in post-fire disaster settings, where smoke often hides victims.
-
SYSTEM ARCHITECTURE
-
Mechanical Platform
The airframe uses a 5-inch diagonal symmetric X-frame, printed in PETG on a standard FDM printer with 40% gyroid infill, 1.2 mm wall thickness, and 0.2 mm layer height. The frame alone weighs 86 g. PETG was used in place of PLA because it handles impacts better and tolerates moderate heat, which matters given the heat generated by motors and ESCs during long flights. Recent studies have shown that affordable 3D-printed FPV frames work well for drones under 500 g [28], [29], [48]. One drawback is that the arms flex slightly under uneven thrust; we measured about 0.8 mm of lateral movement at the motor mounts at full throttle. This caused a vibration at around 85 Hz, which sometimes overloaded the MPU-6050 accelerometer. To mitigate this, we added a carbon-fibre-reinforced centre plate (costing about INR 350) after the first three test flights. This reduced the flex, but some vibration remained. During one test, a propeller hit the frame and caused minor damage, but we were able to print and install a new arm within two hours, showing how easy the printed frame is to repair. The total weight of the airframe, electronics, and battery is about 418 g.
The drone uses four 2306-size, 2400 KV brushless motors with 5×4.5×3 tri-blade propellers. Each motor is controlled by a BLHeli_S 30 A ESC configured for DSHOT300 digital throttle. We found that the throttle response varied by up to 4% between the four ESCs, which is typical for budget ESCs due to manufacturing differences [30]. To fix this, we manually adjusted the motor outputs in Betaflight's motor mixer so the drone could hover properly.
With a 200 mm motor-to-motor diagonal, the frame can fit through openings as small as 280 mm by 280 mm, much smaller than the 400 mm minimum required for 7-inch or larger drones. This size advantage is the main reason we chose the 5-inch design.
TABLE I
System Component Specifications

| Component | Specification / Model | Key Parameter |
| Flight Controller | STM32F405RGT6 | 168 MHz, HW FPU |
| IMU | MPU-6050 | ±2000 °/s, ±16 g, 1 kHz ODR |
| GPS Module | u-blox M8N | 10 Hz, CEP 2.5 m |
| Barometer | BMP280 | ±0.12 hPa, 1 m alt. res. |
| FPV TX | 5.8 GHz, 200 mW | 500–800 m LOS, 28–35 ms |
| Camera | RunCam Micro Eagle | 800 TVL, 2.1 mm lens |
| Telemetry | SiK 915 MHz | MAVLink 2.0 |
| ESC (×4) | BLHeli_S, 30 A | DSHOT300 |
| Motor (×4) | 2306, 2400 KV | 820 g thrust |
| Frame | PETG 5-inch X-frame | AUW 418 g |
| Battery | 4S 1500 mAh | 5.8 min |
| Receiver | FrSky XM+ | 16-ch, 6 ms |
-
Avionics and Sensing Suite
The avionics stack centres on a commercially available F405-based flight controller (Betaflight F405 CTR, approximately INR 1,500) featuring an STM32F405RGT6 microcontroller running at 168 MHz with a hardware floating-point unit. Flight control, sensor processing, RC decoding, and ESC output are all managed by Betaflight 4.4 firmware [31], which is widely used in the FPV community and provides a mature, well-documented PID framework. Betaflight's blackbox logging function was used to capture high-rate (2 kHz) gyroscope data throughout all trials, which proved invaluable for post-flight vibration diagnosis. The architecture follows established designs for low-cost onboard autonomous flight [13], [14].
The drone has three sensors to determine its position and movement. An MPU-6050 motion sensor, mounted on rubber isolators to minimize vibration interference [55], measures how fast the drone spins and the force of gravity acting on it. It takes readings 1,000 times per second, but we process them at 500 times per second. A u-blox GPS receiver tracks satellites from three constellations (GPS, GLONASS, and Galileo) to locate the drone. It updates its position every tenth of a second with an accuracy of about 2.5 meters under clear skies. Lastly, a barometric pressure sensor inside a foam-lined case measures air pressure to estimate altitude. It is accurate to within 0.12 hPa, even when the spinning propellers cause pressure changes around the sensor.
Fig. 1. End-to-end system architecture block diagram.
-
Sensor Fusion via Complementary Filtering
Attitude estimation in Betaflight 4.4 employs a Madgwick-variant complementary filter [8] operating on MPU-6050 gyroscope and accelerometer data processed at 1 kHz in firmware. In the prediction stage, the quaternion q is propagated forward using the calibrated gyroscope angular rate vector ω:

q̇ = ½ q ⊗ [0, ω]ᵀ    (1)
Betaflight's default filter stack applies a 100 Hz low-pass biquad filter to gyroscope data before the PID loop. Gyroscope drift correction is provided by the Madgwick algorithm's gradient-descent update, which blends accelerometer gravity-direction measurements with the integrated gyroscope attitude at a beta gain of 0.1 (the Betaflight default).
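The blend of fast gyroscope integration and slow accelerometer correction described above can be illustrated with a simplified scalar complementary filter. This is a sketch of the principle only: Betaflight's actual implementation is a quaternion Madgwick variant, and the gain, rates, and noise levels here are illustrative assumptions.

```python
import random

def complementary_update(theta, gyro_rate, accel_angle, dt, alpha=0.02):
    """One filter step: integrate the gyro, then pull slowly toward the
    accelerometer tilt angle. alpha sets the correction strength."""
    return (1.0 - alpha) * (theta + gyro_rate * dt) + alpha * accel_angle

# Simulated hover: true tilt 10 deg, biased gyro, noisy accel tilt.
random.seed(0)
dt = 1.0 / 500.0              # 500 Hz processing rate, as on the platform
theta_est, theta_true = 0.0, 10.0
gyro_bias = 0.5               # deg/s drift the accel term must cancel
for _ in range(2000):         # 4 s of simulated data
    gyro = 0.0 + gyro_bias
    accel = theta_true + random.gauss(0.0, 2.0)   # noisy gravity-tilt reading
    theta_est = complementary_update(theta_est, gyro, accel, dt)
```

After a few time constants the estimate settles near the true 10° despite the gyro bias, which is exactly the drift-correction role the accelerometer term plays in Eq. (1).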
The MPU-6050 accelerometer on this platform was particularly susceptible to propeller-wash vibration in the 80–100 Hz band due to the flex in the 3D-printed arms. During the smoke-corridor tests, where the drone operated in confined turbulent airflow, the effective attitude estimation error increased to approximately 3.4° compared to 1.8° in open-air conditions. This indicates that the standard filter settings were insufficient to fully suppress frame-induced vibration artefacts. Tuning the dynamic notch filter centre frequency in Betaflight from its default 350 Hz to 90 Hz reduced, but did not eliminate, this degradation.
GPS and barometric data are fused downstream at 10 Hz using a loosely coupled architecture. The barometer is used primarily to estimate vertical velocity rather than absolute altitude, reducing sensitivity to pressure fluctuations caused by fires or explosions. If the GPS fix quality degrades past a PDOP threshold of 3.5, the system switches to a barometric-only altitude-hold mode and transmits a position-degradation warning to the ground control station (GCS).
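The degradation rule above amounts to a simple mode switch. The sketch below mirrors that logic; the function name, mode labels, and return structure are hypothetical, chosen only to make the rule concrete.

```python
def altitude_source(pdop, has_3d_fix, pdop_limit=3.5):
    """Select the altitude reference and whether to warn the GCS.
    Illustrative only; mirrors the PDOP rule described in the text."""
    if has_3d_fix and pdop <= pdop_limit:
        return "gps_baro_fused", False     # normal fused estimate
    return "baro_only_hold", True          # degraded fix: warn the operator

mode, warn_gcs = altitude_source(pdop=4.2, has_3d_fix=True)
```

With PDOP above the 3.5 threshold, the sketch falls back to barometric-only hold and raises the position-degradation warning.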
-
Cascaded PID Control Architecture
The drone employs a three-layer cascaded control architecture, where each loop operates at a different frequency. The inner loop runs at the highest rate for rapid stabilization, while the outer loops operate at lower rates for higher-level control objectives. This hierarchical structure is widely used in multirotor flight control systems [19], [20].
Rate Loop (4 kHz): Gyroscope sampling runs at 8 kHz in Betaflight, while PID calculations execute at 4 kHz, limited by the MPU-6050 SPI throughput. The rate PID gains were tuned iteratively using Betaflight's onboard blackbox recorder and the PID Toolbox analysis workflow. Final values were P/I/D = 42/80/38 for roll and pitch, and 45/90/0 for yaw. The derivative term for yaw was set to zero due to gyroscope noise in the MPU-6050, which caused motor oscillations when a non-zero D term was used. Anti-windup integrator clamps were enabled to prevent integrator saturation during confined-space manoeuvres [31].
Attitude Loop (250 Hz): The outer attitude loop (Betaflight Angle Mode) converts desired attitude angles into rate setpoints for the inner loop. The angle gain was set to 50 for roll and pitch (reduced from the default value of 55 to suppress oscillations caused by the flexible 3D-printed frame). The maximum commanded bank angle was limited to 35° to maintain a stable FPV camera view during slow inspection manoeuvres [31].
Position Loop (10 Hz): The outermost loop operates only in GPS-assisted modes. It regulates horizontal position and altitude by generating attitude setpoints for the attitude loop. Velocity saturation limits constrain bank angles to ±35°, ensuring a stable visual frame for the operator. Such saturation constraints are critical for maintaining stability during agile flight [25], [31].
To improve robustness and responsiveness, three safeguards were implemented across all control layers. First, integrator windup is prevented by limiting the accumulation of control error when actuator limits are reached. Second, acceleration measurements are filtered using an 80 Hz low-pass filter to suppress high-frequency vibration noise from the propulsion system. Third, setpoint smoothing is applied to prevent abrupt changes in commanded angles, reducing jerky motion during control input transitions.
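The cascade and the integrator clamp can be sketched as follows. The gains are those reported above, but the unit scaling is illustrative (not Betaflight's internal scaling), and the clamp value is an assumption.

```python
class PID:
    """Minimal PID with an integrator clamp (anti-windup), for illustration."""
    def __init__(self, kp, ki, kd, i_limit):
        self.kp, self.ki, self.kd, self.i_limit = kp, ki, kd, i_limit
        self.i, self.prev_err = 0.0, None

    def step(self, err, dt):
        # accumulate and clamp the integrator so it cannot wind up
        self.i = max(-self.i_limit, min(self.i_limit, self.i + err * dt))
        d = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.i + self.kd * d

# Outer angle loop (P-only, 250 Hz) feeds rate setpoints to the inner
# rate loop (4 kHz in firmware; a single step is shown here).
angle_pid = PID(kp=50.0, ki=0.0, kd=0.0, i_limit=0.0)
rate_pid = PID(kp=42.0, ki=80.0, kd=38.0, i_limit=50.0)

rate_setpoint = angle_pid.step(err=5.0, dt=1 / 250)          # 5 deg angle error
motor_term = rate_pid.step(err=rate_setpoint - 0.0, dt=1 / 500)
```

A 5° angle error commands a 250 deg/s rate setpoint, which the inner loop then converts into a motor correction; the clamp on `rate_pid.i` is what prevents windup when the drone is pinned against an obstacle.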
Fig. 2. PID Control Loop
-
Communication Architecture
The drone employs two simultaneous radio communication systems. The first operates at 5.8 GHz and transmits real-time video from an onboard camera to the operator's display. An analog video link was selected over digital transmission due to its lower latency. Digital systems require compression and buffering, which introduce delays that hinder precise control in confined environments such as collapsed corridors. The end-to-end video latency was measured between 28 and 35 ms, including camera capture, transmission, and display processing delays.
The second communication system operates at 915 MHz and provides telemetry using the MAVLink protocol at a rate of 10 Hz. Each telemetry packet includes the drone's orientation, GPS position and velocity, battery voltage, signal strength, operating mode, and failsafe status indicators.
In open-field conditions, the telemetry link achieved reliable communication up to 1 km using a transmit power of 100 mW. However, in disaster environments with reinforced concrete and metallic obstructions, signal attenuation significantly reduced the effective range to approximately 500–800 m, consistent with the experimental results reported in Section IV.
Such dual-link communication architectures align with established best practices in UAV network design, providing a balance between low-latency control and reliable telemetry transmission [27], [62].
-
Failsafe Architecture
The drone incorporates three built-in safety mechanisms that automatically activate under fault conditions to ensure safe operation.
Fig. 3. Communication Architecture
- RC Signal Loss: If the remote-control (RC) signal is lost and no valid input is detected for 0.5 s, the system initiates an automatic return-to-home (RTH) procedure using GPS coordinates.
- Telemetry Link Loss: If the ground control station (GCS) connection is lost for more than 5 s, the drone enters a hover state and maintains its current position until the operator regains control.
- Low Battery Condition: If the battery voltage drops below 3.5 V per cell, the drone automatically initiates a controlled landing sequence.
-
All failsafe conditions converge to a common landing strategy. During descent, the drone maintains its horizontal position using GPS-based stabilization while gradually reducing altitude until touchdown.
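The three triggers above can be written as a single decision function. Note that the paper does not specify a priority ordering when several faults coincide; the ordering below (battery first, then RC loss, then GCS loss) is our assumption for illustration.

```python
def failsafe_action(cell_v, rc_lost_s, gcs_lost_s):
    """Map fault conditions to actions, using the thresholds from the text.
    Priority ordering among simultaneous faults is an assumption."""
    if cell_v < 3.5:          # low battery: land immediately
        return "controlled_landing"
    if rc_lost_s > 0.5:       # RC link lost: GPS return-to-home
        return "return_to_home"
    if gcs_lost_s > 5.0:      # telemetry lost: hold position
        return "hover_hold"
    return "normal"
```

For example, a healthy battery with a 0.6 s RC dropout triggers RTH, while a cell voltage of 3.4 V overrides everything and lands.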
-
EXPERIMENTAL RESULTS
-
Test Environment and Methodology
The drone was evaluated across three representative environments designed to emulate real-world disaster scenarios.
- Collapsed-Structure Mock-up: A simulated collapsed building constructed from stacked concrete blocks. The structure included a narrow opening of approximately 34 × 29 cm leading into a confined void of depth 6 m, representative of spaces formed after structural collapse.
- Smoke-Filled Industrial Corridor: An enclosed corridor lined with reflective metallic surfaces and filled with artificial smoke, replicating post-fire or post-explosion conditions with reduced visibility and multipath signal propagation.
- Open Rubble Field: An outdoor test area with scattered debris and no surrounding structures, used to evaluate communication range under near free-space conditions.
A total of 60 test flights were planned, with 20 trials conducted in each environment. Of these, 6 flights were excluded from analysis: 4 in the smoke-corridor environment due to MPU-6050 accelerometer saturation events (blackbox logs showed clipping at ±16 g), 1 in the collapsed-structure environment due to a propeller strike, and 1 open-field flight aborted due to a GPS cold-start failure that caused loss of a valid 3D fix for over 3 minutes. The remaining 54 valid flights form the basis of the reported results.
Each valid trial followed a consistent procedure. The drone was armed and manually piloted into the test environment using the FPV feed until reaching a predefined target depth. It then maintained a hover for 30 s before exiting the environment or being manually retrieved.
Telemetry data were recorded via MAVLink at 10 Hz to the ground control station (GCS), while high-frequency inertial data were simultaneously logged at 2 kHz to a microSD card using Betaflight's onboard blackbox system.
For open-rubble-field experiments, ground-truth attitude and position measurements were obtained using a reflective-marker motion-capture system (8-camera Vicon setup), enabling post-flight error analysis.
-
Telemetry Range and Link Quality
In the open-rubble-field tests, the 915 MHz SiK telemetry link maintained connectivity across all tested distances from 50 m up to a maximum of 650 m, beyond which packet loss exceeded 10%. A planned 800 m test could not be completed due to site boundary constraints.
Measured packet-loss rates remained below 3% up to 450 m, increased to approximately 7–9% at 600 m, and reached 18% at 650 m. At this point, the ground control station (GCS) operator manually terminated the test to prevent total link failure. These results fall below the nominal 1 km specification of the SiK radio. The discrepancy is attributed to the absence of a ground-plane reflector on the onboard antenna (omitted to reduce weight) and to multipath interference caused by surrounding rubble structures [33]. Based on these observations, a conservative operational range of 500 m is recommended for the current antenna configuration.
Performance degraded significantly in confined environments. Inside the collapsed-structure mock-up and smoke-filled metal corridor, signal attenuation due to reinforced concrete ranged from 14 to 22 dB relative to free-space conditions, consistent with reported propagation losses at 900 MHz [33], [34].
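To put the measured 14–22 dB excess attenuation in context, the free-space baseline at 915 MHz can be computed with the standard free-space path-loss formula. The sketch below is a back-of-envelope link-budget aid, not part of the measurement methodology.

```python
import math

def fspl_db(distance_m, freq_mhz):
    """Free-space path loss: 20*log10(d_km) + 20*log10(f_MHz) + 32.44 dB."""
    return (20 * math.log10(distance_m / 1000.0)
            + 20 * math.log10(freq_mhz) + 32.44)

open_field = fspl_db(650, 915)   # free-space loss at the 650 m loss threshold
worst_case = open_field + 22     # adding the upper measured concrete excess
```

The baseline works out to roughly 88 dB at 650 m; adding the 22 dB worst-case structural attenuation yields about 110 dB, which explains why the indoor range collapsed to a few tens of metres despite the 1 km open-field specification.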
At a depth of 3–5 m within the collapsed structure, 2 out of 19 valid flights experienced complete telemetry dropouts lasting between 4 and 7 s. During these intervals, the drone maintained a hover autonomously, and communication recovered without external intervention.
In the metal-walled smoke corridor, severe multipath reflections caused significant packet-timing jitter, with inter-packet delays varying from 90 ms to 340 ms against the nominal 100 ms interval. Occasional duplicate packets were also observed. Three valid flights exhibited link-quality index values below the GCS warning threshold of 70%, although no sustained dropouts occurred.
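Jitter and loss figures of this kind can be extracted directly from the received-packet timestamps. A minimal sketch of that post-processing, on synthetic data (the timestamps below are invented, not logged values):

```python
def link_stats(stamps, nominal=0.1):
    """Inter-packet delay spread and a crude loss estimate for a 10 Hz feed.
    Assumes a monotonically timestamped packet log."""
    gaps = [b - a for a, b in zip(stamps, stamps[1:])]
    # packets we should have seen over the observed interval
    expected = round((stamps[-1] - stamps[0]) / nominal) + 1
    loss = 1.0 - len(stamps) / expected
    return min(gaps), max(gaps), loss

# one ~240 ms gap in an otherwise regular 100 ms stream (synthetic)
lo, hi, loss = link_stats([0.0, 0.1, 0.2, 0.44, 0.54])
```

Here the single long gap shows up both as the maximum inter-packet delay (~240 ms) and as an estimated loss of about one packet in six.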
Overall, the telemetry system demonstrated acceptable performance for supervisory search-and-rescue (SAR) operations. However, reliable communication beyond 30 m inside dense reinforced structures would require enhancements such as relay nodes or antenna diversity techniques.
-
State Estimation Accuracy
Altitude and attitude accuracy were evaluated by comparing Betaflight quaternion estimates against ground-truth data obtained from a Vicon motion-capture system across 18 open-field flights.
Roll and pitch root-mean-square (RMS) error was measured at 1.8° ± 0.4° during hover and increased to 3.1° ± 0.9° during active manoeuvres. Yaw estimation exhibited higher error, measured at 3.4° ± 1.1°. This is attributed to the absence of a magnetometer in the MPU-6050, leaving yaw estimation to rely solely on gyroscope integration. Consequently, yaw drift accumulated to approximately 2–4° over a 30 s hover period before manual correction by the operator.
The GPS module experienced intermittent loss of 3D fix for durations of 8–12 s when the drone operated deep within the collapsed-structure mock-up. During these intervals, horizontal position error increased from approximately ±1.4 m to ±3.8 m, as the system reverted to barometer-only altitude hold without a reliable horizontal reference.
Under still-air conditions, barometric altitude hold maintained an accuracy of approximately ±0.6 m RMS. However, in the smoke-corridor environment, the foam-shrouded barometer was affected by propeller-induced airflow and external turbulence from the smoke generation system. This resulted in altitude estimation errors of up to 2.2 m, necessitating manual altitude control in three corridor flights.
TABLE II
Performance Metrics Summary

| Metric | Performance Achieved | Target | Status |
| Telemetry range (LOS) | 500 m | 400 m | PASS |
| FPV video latency | 32–48 ms | < 50 ms | PASS |
| GPS position hold (RMS) | ±1.4 m (GPS-aided) | < 1.5 m | PASS |
| Altitude hold (baro only) | ±0.6 m | < 0.8 m | PASS |
| Attitude error (RMS) | 2.1° ± 0.6° (hover) | < 3.0° | PASS |
| Failsafe trigger success | 85% (46/54) | 80% | PASS |
| RTH activation latency | 2.3 ± 0.8 s | < 4 s | PASS |
| PID loop cycle time | 0.25 ms (4 kHz) | 1 ms | PASS |
| Flight endurance (mean) | 4.9 ± 0.4 min | 4 min | PASS |

All results are based on 54 valid flights across three environments.
-
Control Loop Performance
Betaflight's PID control loop operated at 4 kHz, constrained by the SPI throughput of the MPU-6050. Blackbox logs indicated a mean control cycle time of 250 µs, with occasional jitter spikes reaching up to 380 µs. These spikes are attributed to SD card write latency during onboard logging.
Step-response analysis conducted during hover showed a 90% rise time of 320 ms for roll and 340 ms for pitch, with peak overshoot in the range of 11–14%. Although slower than high-performance racing configurations, this response is acceptable for inspection tasks, where rapid attitude changes can degrade FPV video stability.
Yaw step response exhibited increased noise and slower settling, with a settling time of approximately 600 ms. This behavior is primarily due to the absence of a derivative (D) term in the yaw controller, combined with the higher rotational inertia of 5-inch propellers.
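The reported overshoot and rise time roughly pin down an equivalent second-order response. The sketch below applies two textbook relations (the exact overshoot formula for damping, and the common tr ≈ 1.8/ωn rule of thumb); it is an approximation for intuition, not part of the paper's analysis.

```python
import math

def second_order_fit(overshoot, rise_time_s):
    """Damping ratio from peak overshoot; natural frequency from the
    tr ~ 1.8/wn rule of thumb. A rough second-order approximation."""
    ln_mp = math.log(overshoot)
    zeta = -ln_mp / math.sqrt(math.pi ** 2 + ln_mp ** 2)
    wn = 1.8 / rise_time_s
    return zeta, wn

# mid-range roll response from the paper: ~12.5% overshoot, 320 ms rise
zeta, wn = second_order_fit(overshoot=0.125, rise_time_s=0.32)
```

This gives a damping ratio of roughly 0.55 and a natural frequency near 5.6 rad/s, consistent with a moderately damped, inspection-oriented tune rather than a racing one.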
Post-flight thermal measurements indicated that the BLHeli_S ESCs reached temperatures between 62 and 71 °C following 30 s of confined-space hover. These values remain within operational limits but are higher than expected, likely due to reduced airflow within enclosed environments.
-
Proposed Swarm Framework for Disaster Zone Coverage
-
Motivation and Design Philosophy
A single drone can observe only one blind zone at a time. To monitor large-scale disaster environments with multiple regions of interest, a multi-drone system is proposed. In this framework, multiple drones are deployed to different locations around the disaster perimeter, where they collaboratively share positional and sensory information via onboard communication links. This enables the construction of a comprehensive situational overview for rescue teams. The proposed approach builds upon established multi-robot coordination principles while leveraging the communication and sensing capabilities already integrated into the platform.
-
Three-Tier Swarm Architecture
The proposed swarm system is organized into three hierarchical layers.
Bottom Layer: This layer consists of multiple small FPV multirotor drones, each similar to the developed prototype. These drones explore distinct regions of the disaster area and transmit real-time video feeds. Localization is achieved using ultra-wideband (UWB) ranging, where each drone measures distances to its nearest neighbors, enabling relative positioning without reliance on GPS, which is unreliable in environments with dense concrete and metallic structures [24]. The operational area is partitioned using coverage control methods [15], allowing fair allocation of exploration zones. These partitions are dynamically updated as obstacles and boundaries are detected.
Middle Layer: A higher-altitude relay platform, such as a larger multirotor or fixed-wing UAV, serves as a communication bridge between the swarm and the ground control station. It aggregates telemetry data from all drones and forwards it to the ground station. Additionally, it provides collision-avoidance support by issuing boundary warnings when drones approach inter-zone limits (within approximately 0.5 m). Path planning for agile aerial exploration can be implemented using motion-primitive-based approaches [26], [49].
Top Layer: The ground control station (GCS) forms the top layer and manages the overall mission. It monitors system health, reallocates zones when drones lose connectivity or enter failsafe modes, and visualizes a real-time map of the disaster environment. This map displays explored and unexplored regions and is updated at a rate of 1 Hz as new data are received.
-
Coverage Efficiency Analysis
To evaluate the effectiveness of the proposed swarm framework, a representative disaster scenario is considered. For a collapsed structure with an approximate radius of 100 m (comparable to the footprint of a 20-story building), a swarm of eight drones is deployed around the perimeter. Each drone is assumed to cover approximately 95 m² of area using its onboard camera.
Under these conditions, the swarm is estimated to achieve approximately 76% area coverage within the first 4 min of operation. After 8 min, coverage increases to approximately 94%. These estimates are consistent with reported results from prior multi-robot coverage studies [21], [23], [26], [57].
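The two coverage figures are mutually consistent under a simple exponential (random-sweep) coverage model, C(t) = 1 − e^(−λt). The sketch below fits the rate λ from the 4 min estimate and checks the 8 min prediction; the model choice is our assumption, not one stated in the cited studies.

```python
import math

# Fit the per-minute coverage rate from the 76% @ 4 min estimate,
# then predict coverage at 8 min under C(t) = 1 - exp(-lam * t).
lam = -math.log(1 - 0.76) / 4.0
c_8min = 1 - math.exp(-lam * 8.0)
```

The model predicts about 94.2% at 8 min, matching the stated ~94%: under this model, each doubling of time squares the uncovered fraction (0.24² ≈ 0.058).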
-
DISCUSSION
The results demonstrate that a sub-420 g FPV drone assembled from commodity components at a cost of approximately USD 470 can effectively support situational awareness in disaster environments, representing a significant cost reduction compared to commercial alternatives. However, experimental evaluation revealed several limitations that currently prevent reliable field deployment.
The most critical issue is the 85% return-to-home (RTH) success rate. All observed failures were caused by GPS signal loss within collapsed structures, where no fallback positioning mechanism was available. While the u-blox M8N receiver performs adequately in open environments, it is not reliable in GPS-denied conditions. This limitation is compounded by Betaflight's default 1.5 s RC timeout, which is acceptable outdoors but too slow for confined environments. Reducing the timeout to 0.5 s improves responsiveness but increases susceptibility to false triggers. A more robust solution is the integration of alternative localization methods, such as optical flow or ultra-wideband (UWB) ranging, to maintain position estimation when GPS signals are unavailable.
Frame-induced vibration was another persistent limitation. Although the addition of a carbon-fibre centre plate reduced the 85 Hz arm-flex resonance, it did not eliminate it. Blackbox logs consistently showed IMU noise artefacts across all flights, directly contributing to degraded attitude estimation accuracy, which remained the primary performance shortfall. Structural durability, however, was not a major concern. A propeller strike resulted in minor surface damage without compromising flight integrity, and the affected arm was replaced within two hours using standard 3D printing.
Flight endurance averaged 4.9 ± 0.4 min across 54 flights, slightly below the 5-minute design target. This shortfall is attributed to increased airframe mass (approximately 18 g above the design budget, with the center plate alone contributing 22 g) and reduced ESC efficiency at low throttle levels typical of inspection flight. Discharge modeling suggests that a 1850 mAh 4S battery could extend endurance to approximately 6 min, although this configuration has not yet been experimentally validated.
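The discharge reasoning can be made explicit with a first-order endurance model. The 15 A average hover draw, the 80% usable-capacity fraction, and the 1500 mAh figure for the current pack are assumed values for this vehicle class, not measurements from the paper:

```python
def hover_endurance_min(capacity_mah, avg_current_a, usable_fraction=0.8):
    """First-order endurance estimate: usable charge / average draw."""
    return (capacity_mah / 1000.0) * usable_fraction / avg_current_a * 60.0

# Assumed 15 A average hover draw for a sub-420 g 4S quad.
print(round(hover_endurance_min(1500, 15.0), 1))  # assumed current pack
print(round(hover_endurance_min(1850, 15.0), 1))  # proposed 1850 mAh pack
```

Under these assumptions the 1850 mAh pack yields roughly 5.9 min, consistent with the approximately 6 min figure above; real endurance will vary with throttle profile and ESC efficiency.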
In the current configuration, a more practical solution is a two-battery rotation strategy. The XT30 connector enables battery replacement in under 40 s, making this approach well suited for rapid search-and-rescue (SAR) deployment compared to more complex solutions such as onboard wireless charging.
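The practicality of the rotation strategy can be quantified as an airborne duty cycle under the stated 40 s swap time, assuming a charged spare pack is always ready:

```python
def airborne_fraction(flight_min, swap_s):
    """Fraction of wall-clock time spent airborne when packs are
    hot-swapped between sorties (charged spare always available)."""
    flight_s = flight_min * 60.0
    return flight_s / (flight_s + swap_s)

# Mean endurance of 4.9 min with a 40 s XT30 swap.
print(f"{airborne_fraction(4.9, 40.0):.0%}")  # -> 88%
```

Roughly 88% of mission time airborne is a reasonable upper bound; in practice, re-positioning and pre-flight checks between sorties would lower it.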
Conclusion
This paper presents the design and evaluation of a 3D-printed FPV multirotor platform developed for disaster reconnaissance in confined environments inaccessible to conventional UAVs. Built at the Innovation and Incubation Laboratory, Noida International University, the system integrates a PETG-printed airframe with off-the-shelf components at an approximate cost of USD 470, significantly lower than comparable commercial platforms. In addition to laboratory validation, the platform was deployed at the Inter-University Drone Competition at BITS Pilani Hyderabad, where it successfully completed confined-space navigation tasks before an expert panel, demonstrating practical viability.
Experimental results from 54 flights across three simulated disaster environments showed a mean attitude estimation error of 2.1° ± 0.6°, GPS-assisted position hold accuracy of ±1.4 m RMS, telemetry range up to 500 m in open terrain, and an 85% return-to-home (RTH) success rate. The platform's 290 mm gap-entry capability and sub-420 g mass enable access to voids that are unreachable by larger UAVs, representing a key operational advantage.
However, several limitations were identified. Frame-induced vibration affected IMU accuracy, GPS-denied environments caused RTH failures, and barometric sensing was susceptible to airflow disturbances in enclosed spaces. These challenges highlight areas for further system refinement.
Future work will focus on six key directions:
- Replacing PETG arms with carbon-fiber structures to reduce vibration-induced noise.
- Integrating visual-inertial odometry or ultra-wideband (UWB) positioning for operation in GPS-denied environments.
- Migrating to advanced flight control stacks such as iNav or ArduPilot to enable waypoint-based autonomy.
- Retraining YOLO-based object detection models on disaster-specific datasets.
- Conducting human-factors studies to evaluate operator workload.
- Developing ethical and regulatory frameworks for safe civilian deployment.
All design files, firmware configurations, and flight logs are being released as open-source resources to facilitate reproducibility and adoption in resource-constrained research environments.
Acknowledgment
We would like to thank the Department of Computer Science and Engineering, Noida International University, for providing laboratory facilities and ight test infrastructure.
References
- B. Tomic et al., "Toward a fully autonomous UAV: Research platform for indoor and outdoor urban search and rescue," IEEE Robot. Autom. Mag., vol. 19, no. 3, pp. 46–56, Sep. 2012. DOI: 10.1109/MRA.2012.2206473
- C. Cadena et al., "Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age," IEEE Trans. Robot., vol. 32, no. 6, pp. 1309–1332, Dec. 2016. DOI: 10.1109/TRO.2016.2624754
- H. Shakhatreh, A. H. Sawalmeh, A. Al-Fuqaha, Z. Dou, E. Almaita, I. Khalil, N. S. Othman, A. Khreishah, and M. Guizani, "Unmanned aerial vehicles (UAVs): A survey on civil applications and key research challenges," IEEE Access, vol. 7, pp. 48572–48634, 2019. DOI: 10.1109/ACCESS.2019.2909530
- J. Delmerico, T. Cieslewski, H. Rebecq, M. Faessler, and D. Scaramuzza, "Are we ready for autonomous drone racing? The UZH-FPV drone racing dataset," in Proc. IEEE Int. Conf. Robot. Autom. (ICRA), Montreal, QC, Canada, May 2019, pp. 6713–6719. DOI: 10.1109/ICRA.2019.8793887
- J. P. Queralta, J. Taipalmaa, B. C. Pullinen, V. K. Sarker, T. N. Gia, H. Tenhunen, M. Gabbouj, J. Raitoharju, and T. Westerlund, "Collaborative multi-robot search and rescue: Planning, coordination, perception, and active vision," IEEE Access, vol. 8, pp. 191617–191643, 2020. DOI: 10.1109/ACCESS.2020.3030190
- P. Foehn, E. Kaufmann, A. Romero, R. Penicka, S. Sun, L. Bauersfeld, T. Laengle, G. Cioffi, Y. Song, A. Loquercio, and D. Scaramuzza, "Agilicious: Open-source and open-hardware agile quadrotor for vision-based flight," Sci. Robot., vol. 7, no. 67, p. eabl6259, Jun. 2022. DOI: 10.1126/scirobotics.abl6259
- R. Mahony, T. Hamel, and J.-M. Pflimlin, "Nonlinear complementary filters on the special orthogonal group," IEEE Trans. Autom. Control, vol. 53, no. 5, pp. 1203–1218, Jun. 2008. DOI: 10.1109/TAC.2008.923738
- S. O. H. Madgwick, A. J. L. Harrison, and R. Vaidyanathan, "Estimation of IMU and MARG orientation using a gradient descent algorithm," in Proc. IEEE Int. Conf. Rehabil. Robot. (ICORR), Zurich, Switzerland, Jun.–Jul. 2011, pp. 1–7. DOI: 10.1109/ICORR.2011.5975346
- Y. Rizk, M. Awad, and E. W. Tunstel, "Cooperative heterogeneous multi-robot systems: A survey," ACM Comput. Surv., vol. 52, no. 2, Art. 29, Apr. 2019. DOI: 10.1145/3303848
- V. Walter, N. Staub, A. Franchi, and M. Saska, "UVDAR system for visual relative localization with application to leader–follower formations of multirotor UAVs," IEEE Robot. Autom. Lett., vol. 4, no. 3, pp. 2637–2644, Jul. 2019. DOI: 10.1109/LRA.2019.2915420
- G. Jocher, A. Chaurasia, and J. Qiu, "Ultralytics YOLOv8," GitHub, 2023. [Online]. Available: https://github.com/ultralytics/ultralytics
- A. Dosovitskiy, L. Beyer, A. Kolesnikov, D. Weissenborn, X. Zhai, T. Unterthiner, M. Dehghani, M. Minderer, G. Heigold, S. Gelly, J. Uszkoreit, and N. Houlsby, "An image is worth 16×16 words: Transformers for image recognition at scale," in Proc. Int. Conf. Learn. Represent. (ICLR), 2021.
- L. Meier, P. Tanskanen, F. Fraundorfer, and M. Pollefeys, "PIXHAWK: A system for autonomous flight using onboard computer vision," in Proc. IEEE Int. Conf. Robot. Autom. (ICRA), Shanghai, China, May 2011, pp. 2992–2997. DOI: 10.1109/ICRA.2011.5980229
- G. Loianno, C. Brunner, G. McGrath, and V. Kumar, "Estimation, control, and planning for aggressive flight with a small quadrotor with a single camera and IMU," IEEE Robot. Autom. Lett., vol. 2, no. 2, pp. 404–411, Apr. 2017. DOI: 10.1109/LRA.2016.2633290
- J. Cortés, S. Martínez, T. Karataş, and F. Bullo, "Coverage control for mobile sensing networks," IEEE Trans. Robot. Autom., vol. 20, no. 2, pp. 243–255, Apr. 2004. DOI: 10.1109/TRA.2004.824698
- A. Bochkovskiy, C.-Y. Wang, and H.-Y. M. Liao, "YOLOv4: Optimal speed and accuracy of object detection," arXiv preprint arXiv:2004.10934, Apr. 2020.
- G. Kyrkou and T. Theocharides, "EmergencyNet: Efficient aerial image classification for drone-based emergency monitoring using atrous convolutional feature fusion," IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., vol. 13, pp. 1687–1699, 2020. DOI: 10.1109/JSTARS.2020.2977578
- C. Sampedro, H. Bavle, J. L. Sanchez-Lopez, R. A. S. Fernandez, A. Rodriguez-Ramos, M. Molina, and P. Campoy, "A fully-autonomous aerial robot for search and rescue applications in indoor environments using learning-based techniques," J. Intell. Robot. Syst., vol. 95, no. 2, pp. 601–627, Aug. 2019. DOI: 10.1007/s10846-018-0898-1
- M. Kamel, T. Stastny, K. Alexis, and R. Siegwart, "Model predictive control for trajectory tracking of unmanned aerial vehicles using robot operating system," in Robot Operating System (ROS): The Complete Reference (Volume 2), Springer, Cham, 2017, pp. 3–39.
- P. Foehn, A. Romero, and D. Scaramuzza, "Time-optimal planning for quadrotor waypoint flight," Sci. Robot., vol. 6, no. 56, p. eabh1221, Jul. 2021. DOI: 10.1126/scirobotics.abh1221
- B. Mishra, D. Garg, P. Narang, and V. Mishra, "Drone-surveillance for search and rescue in natural disaster," Comput. Commun., vol. 156, pp. 1–10, Apr. 2020. DOI: 10.1016/j.comcom.2020.03.012
- H. Li, P. Li, Q. Feng, Z. Liu, and X. Wang, "Small object detection in disaster scenes using Vision Transformer with feature pyramid augmentation," Remote Sens., vol. 15, no. 3, p. 643, Jan. 2023. DOI: 10.3390/rs15030643
- S. Khan, M. Khan, F. Ullah, and F. Ullah, "Multi-UAV based efficient crops monitoring system with optimal path planning and target detection," Comput. Electr. Eng., vol. 104, p. 108442, Dec. 2022. DOI: 10.1016/j.compeleceng.2022.108442
- T. H. Nguyen, T. Nguyen, and M. Hamdi, "UWB-based relative positioning for multi-UAV systems in GPS-denied environments," IEEE Sens. J., vol. 22, no. 10, pp. 9934–9943, May 2022. DOI: 10.1109/JSEN.2022.3163876
- A. Loquercio, E. Kaufmann, R. Ranftl, M. Müller, V. Koltun, and D. Scaramuzza, "Learning high-speed flight in the wild," Sci. Robot., vol. 6, no. 59, p. eabg5810, Oct. 2021. DOI: 10.1126/scirobotics.abg5810
- M. Dharmadhikari, T. Dang, L. Solanka, J. Loje, H. Nguyen, N. Khedekar, and K. Alexis, "Motion primitives-based path planning for fast and agile exploration using aerial robots," in Proc. IEEE Int. Conf. Robot. Autom. (ICRA), Xi'an, China, Jun. 2021, pp. 179–185. DOI: 10.1109/ICRA48506.2021.9561775
- E. Yanmaz, S. Yahyanejad, B. Rinner, H. Hellwagner, and C. Bettstetter, "Drone networks: Communications, coordination, and sensing," Ad Hoc Netw., vol. 68, pp. 1–15, Jan. 2018. DOI: 10.1016/j.adhoc.2017.09.001
- M. Magnabosco, A. Marzocchi, and M. Pagnini, "Structural evaluation of FDM 3D-printed frames for sub-250 g racing quadrotors under impact loading," Drones, vol. 7, no. 2, p. 84, Feb. 2023. DOI: 10.3390/drones7020084
- J. Karimi and S. H. Pourtakdoust, "Optimal maneuver-based motion planning for lightweight 3D-printed UAV platforms," Aerosp. Sci. Technol., vol. 120, p. 107275, Jan. 2022. DOI: 10.1016/j.ast.2021.107275
- T. Barth, M. Gerdts, and A. Mayer, "Characterisation of manufacturing-tolerance-induced thrust asymmetry in low-cost brushless ESC units for micro aerial vehicles," IEEE Access, vol. 10, pp. 22417–22429, 2022. DOI: 10.1109/ACCESS.2022.3153894
- Betaflight Development Team, "Betaflight 4.4 firmware release notes," GitHub, 2023. [Online]. Available: https://github.com/betaflight/betaflight/releases/tag/4.4.0
- F. Bonatti, Y. Zhang, S. Choudhury, W. Wang, and S. Scherer, "Towards a robust aerial cinematography platform: Localisation and target following in GPS-denied environments," in Proc. IEEE IROS, Prague, Czech Republic, Sep. 2021, pp. 1651–1658. DOI: 10.1109/IROS51168.2021.9635928
- M. Khawaja, I. Guvenc, D. Matolak, U.-C. Fiebig, and N. Schneckenburger, "A survey of air-to-ground propagation channel modeling for unmanned aerial systems," IEEE Commun. Surv. Tutor., vol. 22, no. 1, pp. 21–62, Firstquarter 2020. DOI: 10.1109/COMST.2019.2915069
- K. Nguyen, Z. Luo, A. Chin, and C. Oestges, "Electromagnetic propagation measurements and path loss models for indoor UAV communications at sub-6 GHz," IEEE Trans. Antennas Propag., vol. 70, no. 3, pp. 2088–2098, Mar. 2022. DOI: 10.1109/TAP.2021.3090862
- X. Zhou, Z. Wang, H. Ye, C. Xu, and F. Gao, "EGO-Planner: An ESDF-free gradient-based local planner for quadrotors," IEEE Robot. Autom. Lett., vol. 6, no. 2, pp. 478–485, Apr. 2021. DOI: 10.1109/LRA.2020.3047803
- A. Pham, D. Seo, and J. Lee, "Comparative cost-performance analysis of sub-kilogram FPV-class UAV platforms for first-responder reconnaissance," Sensors, vol. 22, no. 14, p. 5327, Jul. 2022. DOI: 10.3390/s22145327
- C. Forster, M. Pizzoli, and D. Scaramuzza, "SVO: Semidirect visual odometry for monocular and multi-camera systems," IEEE Trans. Robot., vol. 33, no. 2, pp. 249–265, Apr. 2017, updated in arXiv:2110.14285, Oct. 2021. DOI: 10.1109/TRO.2016.2623335
- K. Kang, Y. Park, and H. Moon, "Design and evaluation of a hybrid 3D-printed/carbon-fibre micro-quadrotor frame for reduced structural vibration in inspection tasks," Appl. Sci., vol. 12, no. 17, p. 8569, Aug. 2022. DOI: 10.3390/app12178569
- Y. Zhang, J. Liu, and W. Zhao, "Efficiency characterisation of BLHeli_S electronic speed controllers under varying throttle profiles for sub-500 g multirotor UAVs," Drones, vol. 6, no. 5, p. 114, May 2022. DOI: 10.3390/drones6050114
- P. De Petris, H. Nguyen, M. Dharmadhikari, M. Kulkarni, N. Khedekar, F. Mascarich, L. Ott, T. Dang, V. Alexis, M. Breyer, and K. Alexis, "MRS UAS system for collaborative aerial and ground search and rescue with micro aerial vehicles," J. Field Robot., vol. 39, no. 1, pp. 1–45, Jan. 2022. DOI: 10.1002/rob.22029
- ArduPilot Development Team, "ArduCopter 4.4 documentation: Emergency mode and failsafe configuration," ArduPilot.org, 2024. [Online]. Available: https://ardupilot.org/copter/
- O. Avola, L. Cinque, A. Fagioli, G. Foresti, and C. Marini, "YOLO-based real-time human detection for search and rescue with drones," in Proc. IEEE ICASSP, Rhodes Island, Greece, Jun. 2023, pp. 1–5. DOI: 10.1109/ICASSP49357.2023.10096521
- R. Clarke and L. B. Moses, "The regulation of civilian drones' impacts on public safety," Comput. Law Secur. Rev., vol. 30, no. 3, pp. 263–285, Jun. 2014. DOI: 10.1016/j.clsr.2014.03.007
- X. Li, L. Ma, Z. Wang, and Q. Liu, "Survey of UAV applications in post-disaster search and rescue: Challenges, technologies, and future directions," Drones, vol. 7, no. 1, p. 35, Jan. 2023. DOI: 10.3390/drones7010035
- J. Kang, M. Kim, and H. Yoo, "Real-time victim detection in disaster scenes using lightweight CNN and edge computing on UAVs," IEEE Access, vol. 10, pp. 112847–112858, 2022. DOI: 10.1109/ACCESS.2022.3216540
- A. H. Alotaibi, A. Alotaibi, and U. Malik, "LAPMSD: Lightweight aerial pedestrian multi-scale detector for real-time search and rescue using nano-UAVs," IEEE Access, vol. 11, pp. 6804–6819, 2023. DOI: 10.1109/ACCESS.2023.3237186
- K. Kang, J. Seo, D. Shin, and W. Doh, "Indoor positioning and navigation for micro-UAVs in GPS-denied environments using visual-inertial SLAM," Remote Sens., vol. 14, no. 5, p. 1114, Mar. 2022. DOI: 10.3390/rs14051114
- X. Wang, Z. Gao, Z. Li, and Q. Li, "Lightweight 3D-printed quadrotor for GPS-denied indoor navigation using LiDAR SLAM," IEEE Robot. Autom. Lett., vol. 8, no. 5, pp. 2985–2992, May 2023. DOI: 10.1109/LRA.2023.3260677
- V. Garg, K. V. Mishra, and A. Manimaran, "RF channel modeling for sub-urban UAV telemetry at 915 MHz: Empirical path-loss models," IEEE Wirel. Commun. Lett., vol. 10, no. 9, pp. 1909–1913, Sep. 2021. DOI: 10.1109/LWC.2021.3087265
- A. Carrio, C. Sampedro, A. Rodriguez-Ramos, and P. Campoy, "Distributed aerial robot swarm for search and rescue in post-disaster environments: Architecture, communication, and coordination," Drones, vol. 5, no. 2, p. 39, Jun. 2021. DOI: 10.3390/drones5020039
- H. Yu, K. Meier, M. Argyle, and R. W. Beard, "Cooperative path planning for target tracking in urban environments using unmanned air and ground vehicles," IEEE/ASME Trans. Mechatron., vol. 20, no. 2, pp. 541–552, Apr. 2015. DOI: 10.1109/TMECH.2014.2301459
- T. Cieslewski, S. Choudhary, and D. Scaramuzza, "Data-efficient decentralized visual SLAM," in Proc. IEEE ICRA, Brisbane, QLD, Australia, May 2018, pp. 2466–2473. DOI: 10.1109/ICRA.2018.8461155
- F. Kong, X. Liu, B. Tang, J. Lin, Y. Ren, Y. Cao, F. Zhu, T. Chen, and F. Zhang, "Marsupial quadrotors: Unmanned aerial vehicles carried on unmanned ground vehicles for improved mobility and new deployment options," in Proc. IEEE IROS, Kyoto, Japan, Oct. 2022, pp. 8116–8123. DOI: 10.1109/IROS47612.2022.9981960
- R. Rubio-Herrero, J. L. Sanchez-Lopez, and H. Voos, "A visual UAS detection approach with deep learning for SAR operations," Sensors, vol. 22, no. 4, p. 1621, Feb. 2022. DOI: 10.3390/s22041621
- M. Seidaliyeva, D. Akhmetov, L. Ilipbayeva, and E. Matson, "Real-time and accurate drone detection in a video with a static background," Sensors, vol. 20, no. 14, p. 3856, Jul. 2020. DOI: 10.3390/s20143856
- Y. Song, M. Steinweg, E. Kaufmann, and D. Scaramuzza, "Autonomous drone racing with deep reinforcement learning," in Proc. IEEE IROS, Prague, Czech Republic, Sep. 2021, pp. 1205–1212. DOI: 10.1109/IROS51168.2021.9636053
- D. Ebrahimi, S. Sharafeddine, P.-H. Ho, and C. Assi, "Autonomous UAV trajectory design for partial ground coverage with IoT data collection," IEEE Internet Things J., vol. 8, no. 3, pp. 1614–1625, Feb. 2021. DOI: 10.1109/JIOT.2020.3012622
- A. Tagliabue, X. Wu, and M. Mueller, "Model-free online motion adaptation for energy-efficient flights of multicopters," in Proc. IEEE ICRA, Xi'an, China, Jun. 2021, pp. 3916–3922. DOI: 10.1109/ICRA48506.2021.9561979
- M. Clarke and L. Moses, "Autonomous drone regulations in emergency response: International frameworks and open questions," Technol. Soc., vol. 71, p. 102105, Nov. 2022. DOI: 10.1016/j.techsoc.2022.102105
- K. Kim, D. Kim, and M. Kim, "Thermal imaging integration with micro-UAVs for survivor detection in post-earthquake rubble: Field performance evaluation," Remote Sens., vol. 15, no. 7, p. 1923, Apr. 2023. DOI: 10.3390/rs15071923
- C. Wang, J. Zhang, Y. Wang, and L. Chen, "iNav-based GPS-denied navigation for low-cost FPV multirotors using optical flow and barometric fusion," Sensors, vol. 23, no. 2, p. 765, Jan. 2023. DOI: 10.3390/s23020765
- R. P. Dinelli, A. Rottler, G. Laman, and M. Otte, "Multi-robot search and rescue: An empirical study of coordination strategies under communication constraints," J. Field Robot., vol. 40, no. 4, pp. 923–943, Jul. 2023. DOI: 10.1002/rob.22167
