Solution to Bird Pest on Cultivated Grain Farm: A Vision Controlled Quadcopter System Approach

DOI: 10.17577/IJERTV7IS100009

Adebayo Segun1
1Department of Computer Science and Information Technology, Bowen University, Iwo, Osun State, Nigeria

Erastus O. Ogunti2, Francis K. Akingbade2
2Department of Electrical and Electronics Engineering, Federal University of Technology, Akure, Ondo State, Nigeria

Oyetade Idowu Sunday3
3Department of Computer Science, Redeemers University, Ede, Osun State, Nigeria

Abstract: Bird invasion of cultivated rice fields has been disastrous, especially in Africa. Most farmers have few options for managing these birds: traditionally, farmers and their children run up the field shouting, throwing stones and sometimes beating drums to scare the birds away. This method, and the many others employed to address the issue, are energy-consuming, expensive and sometimes not effective at all. This study presents the use of a vision-controlled quadcopter system to detect and chase these birds away from a cultivated field. The method uses robot vision to control the position of the quadcopter so that it follows the object of interest while producing scaring sounds such as bird distress calls and predator calls.

Index Terms: Computer Vision, Object Detection, Rice Crop, Quelea Birds, Quadcopter

  1. INTRODUCTION

    Automatic control has played a vital role in the advance of engineering and science. In addition to its extreme importance in space-vehicle systems, missile-guidance systems, robotic systems and the like, automatic control has become an important and integral part of modern agriculture, known as smart farming. This is a modern farm-management concept that uses digital techniques to monitor and optimize agricultural production processes. For example, rather than applying the same amount of fertilizer over an entire field, or feeding a large animal population equal amounts of feed, smart farming measures variations in conditions within a field and adapts the fertilizing or harvesting strategy accordingly. Likewise, it assesses the needs and conditions of individual animals in larger populations and optimizes feeding on a per-animal basis. Applying automation to every process of farming will increase the quantity and quality of agricultural output while using fewer inputs such as water, energy, fertilizers and pesticides. It will save costs, reduce environmental impact and produce more and better food. Thus, the aim of this study is to provide an automated solution to a problem in the food production process.

    1.1 Problem Identification

    A study by Oerke [1] shows that about 15% of global rice production is lost to animal pests (arthropods, nematodes, rodents, birds, slugs and snails). The Global Rice Science Partnership (GRiSP) identifies birds as the second most important biotic constraint on African rice production after weeds; this finding was based on farmer surveys in 20 African countries [2]. The red-billed quelea has been studied extensively, and there are many publications describing its pest status and control strategies in African agriculture [3, 4]. It has been identified as one of the most notorious pest bird species in the world, injurious to various cereal crops such as rice, millet, sorghum and wheat [5]. It occurs throughout sub-Saharan Africa, gathers in flocks of several million birds and breeds in colonies that can cover more than 100 hectares with about 30,000 nests per hectare. de Mey, Demont [6] estimate annual bird damage at an average of 13.2% of potential rice production during the wet seasons of 2003-2007, which translates into an average annual economic loss of 7.1 million. Oduntan, Shotuyo [7] estimate that 2 million red-billed quelea birds can destroy up to 50 tons of grain in a day, a value equivalent to $600,000. Despite the huge damage caused by this pest and the considerable international attention it has received in the past and is still receiving, little research on bird control is currently conducted.

    Presently, most farmers, along with their children and perhaps hired workers, run up the field shouting, throwing stones, waving, clapping hands and sometimes trying to scare the birds away with drums and catapults. This process is depicted in figure 1. Humans detect the presence of birds with their eyes; since the human brain has been trained to recognize what kind of birds they are, identification, classification, tracking and scaring are done almost effortlessly. The brain activates the body's muscles, which results in the shouting, stone throwing and other scaring activities. This process can be represented in a control block diagram, as shown in figure 2: in the human-operated system, the eyes, brain and muscles correspond to the sensor, controller and actuator respectively. For this process to be effective, it demands a great deal of human effort, time and cost.

    Fig 1. Existing Manual Pest Control.

    Figure 2. Block Diagram of Human operated Farm Status System

  2. PROPOSED SOLUTION

      1. Computer Vision

        Computer vision is a field of informatics concerned with how computers gather and interpret visual information from the surrounding environment. It involves detecting and recognizing objects of interest. Usually the image is first processed at a lower level to enhance picture quality, for example by removing noise. The picture is then processed at a higher level, for instance by detecting patterns and shapes, to recognize objects of interest [8, 9]. A subsection of computer vision is robot vision, which uses a combination of vision sensors such as cameras, computer algorithms and other hardware components to allow a machine to process visual data from the real world and accomplish complex tasks that require visual understanding, such as chasing an object away from a mapped-out area. Table 1 shows the differences among the various subsections of computer vision, including robot vision.
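        To make the two processing levels concrete, the sketch below (a minimal example assuming OpenCV is installed and a hypothetical input file frame.jpg) performs low-level denoising followed by higher-level extraction of candidate objects:

```python
import cv2

# Low-level processing: load a frame and suppress sensor noise.
frame = cv2.imread("frame.jpg")  # hypothetical input frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
denoised = cv2.GaussianBlur(gray, (5, 5), 0)

# Higher-level processing: threshold and extract candidate object contours.
_, mask = cv2.threshold(denoised, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Keep only regions large enough to be objects of interest.
objects = [c for c in contours if cv2.contourArea(c) > 100]
print(f"{len(objects)} candidate objects detected")
```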

        Figure 3. Computer Vision Family Tree [10]

        Table 1. Comparison of Various Techniques

        Technique                                Input                  Output
        Signal Processing                        Electrical signals     Electrical signals
        Image Processing                         Images                 Images
        Computer Vision                          Images                 Information/features
        Pattern Recognition/Machine Learning     Information/features   Information
        Machine Vision                           Images                 Information
        Robot Vision                             Images                 Physical Action

        Source: [10]

        The process in figure 2 can be achieved using robot vision as depicted in figure 4.

        Figure 4. Proposed Machine Controlled Farm Protecting System

        Object detection commonly refers to a method responsible for discovering and identifying the existence of objects of a certain class. By extension, it can be considered a method of image processing that identifies objects in digital images. One approach classifies objects in images according to colour, as in robotic soccer, where different teams assemble their robots and go head to head with other teams. However, this colour-coded approach has its downsides. Experiments in the international RoboCup competition have shown that lighting conditions are extremely detrimental to the outcome of the game, and even the slightest ambient light change can prove fatal to the success or failure of any team. Participants need to recalibrate their systems several times even on the same field because of the minor ambient light changes that occur with the time of day [11]. This type of detection is not suitable for most real-world applications, simply because of the constant need for recalibration and maintenance.

        Thus, a more advanced and sophisticated method is required for object detection. One such method detects objects in images using features or specific structures of the object of interest. An example of a feature detection method is Haar-like features, developed by Viola and Jones [12] on the basis of the proposal by Papageorgiou, Oren [13]. The method considers neighbouring rectangular regions at a specific location in a detection window, sums up the pixel intensities in each region and calculates the difference between these sums. This difference is then used to categorize subsections of an image. Commonly, the areas around the eyes are darker than the areas on the cheeks; one example of a Haar-like feature for face detection is therefore a set of two neighbouring rectangular areas above the eye and cheek regions. Other improved methods include the use of local binary pattern (LBP) features [14]. The LBP is an operator for image description based on the signs of differences of neighbouring pixels. It is fast to compute and invariant to monotonic grey-scale changes of the image. Despite being simple, it is very descriptive, as attested by the wide variety of tasks it has been successfully applied to [15]. The LBP histogram has proven to be a widely applicable image feature, for example in texture classification [16], face analysis [17] and video background subtraction [18], among others [19]. A possible drawback of the LBP operator is that the threshold operation in comparing the neighbouring pixels can make it sensitive to noise. Histograms of oriented gradients have also been used in object detection [20]; this is a shape descriptor that counts occurrences of gradient orientations in localized portions of an image.
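        As a sketch of how such a feature-based detector is applied in practice, the example below uses one of OpenCV's bundled pretrained Haar cascades; for the application in this paper, a cascade trained on bird images would be substituted (the face cascade here is only a stand-in):

```python
import cv2

# Load a pretrained Haar cascade shipped with OpenCV. A bird detector
# would require a cascade trained on bird images instead.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

frame = cv2.imread("frame.jpg")  # hypothetical input frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Slide the detection window over the image at multiple scales.
detections = detector.detectMultiScale(gray, scaleFactor=1.1,
                                       minNeighbors=5, minSize=(30, 30))
for (x, y, w, h) in detections:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detections.jpg", frame)
```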

        A vision sensor such as a camera is a major component in computer vision. A camera performs the perspective projection of a 3D point onto the image plane. The image plane is a matrix of light-sensitive cells; the resolution of the image is the size of this matrix, and a single cell is called a "pixel". For each pixel of coordinates (u, v), the camera measures the intensity of the light. A 3D point with homogeneous coordinates X = (X, Y, Z, 1) projects to an image point with homogeneous coordinates p = (u, v, 1), as shown in figure 4:

        p \propto K [I_3 \; 0] X   (1)

        where K is the matrix containing the intrinsic parameters of the camera:

        K = \begin{bmatrix} \alpha_u & -\alpha_u \cot\theta & u_0 \\ 0 & \alpha_v / \sin\theta & v_0 \\ 0 & 0 & 1 \end{bmatrix}   (2)

        Here u_0 and v_0 are the pixel coordinates of the principal point, \alpha_u and \alpha_v are the scaling factors along the u and v axes (in pixels per metre, each proportional to the focal length f), and \theta is the angle between these axes. For most commercial cameras it is a reasonable approximation to suppose square pixels, i.e. \alpha_u = \alpha_v and \theta = \pi/2.
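        A minimal numeric sketch of the projection in equations (1) and (2), under the square-pixel assumption and with illustrative (made-up) parameter values:

```python
import numpy as np

# Intrinsic matrix K with square pixels (alpha_u = alpha_v, theta = pi/2);
# alpha = 800 px and principal point (320, 240) are illustrative values.
alpha, u0, v0 = 800.0, 320.0, 240.0
K = np.array([[alpha, 0.0, u0],
              [0.0, alpha, v0],
              [0.0, 0.0, 1.0]])

# A 3D point in the camera frame (metres), in homogeneous coordinates.
X = np.array([0.5, -0.2, 4.0, 1.0])

# Perspective projection p ~ K [I3 | 0] X, then normalisation by depth.
P = np.hstack([K, np.zeros((3, 1))])  # 3x4 projection matrix
p = P @ X
u, v = p[0] / p[2], p[1] / p[2]
print(f"pixel coordinates: u = {u:.1f}, v = {v:.1f}")  # u = 420.0, v = 200.0
```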

        Vision-based control, also known as visual servoing (VS), is a technique that uses feedback information extracted from a vision sensor to control the motion of a robot. In recent years, unmanned aerial vehicles (UAVs), a class of robot, have become a very active field of research and have made huge progress in automated navigation, surveillance, military applications, rescue tasks and agriculture. Among the various research areas on UAVs, vision-based autonomous control has become the main interest for environments where GPS is denied. Vision-based control approaches generally use points as visual features.

        Figure 4: Pinhole camera 2D image representation

      2. Related Work

        Robots are widely used today in agricultural tasks, many of which require machine vision algorithms to operate successfully. The robots and their machine vision algorithms change form to best suit their function, from field plowing [21], seed planting [22] and weed handling [23] to growth monitoring and fruit and vegetable picking, sorting, grading and even packaging [24]. Since the focus of this study is mainly crop monitoring, with emphasis on rice, the literature on the use of unmanned aerial vehicles for precision farming is addressed in this work.

        There has been increasing interest in the development and use of unmanned aerial vehicles (UAVs) for agricultural and environmental applications. Reports indicate that the agriculture industry could potentially be the largest user of this technology [25]. The practical applications of UAVs so far have occurred in Europe and in countries like Canada, Australia and Japan, where there are fewer airspace regulations compared to the United States [26]. Use of UAVs for commercial purposes is prohibited in the United States; only hobbyists are allowed to fly small, radio-controlled airplanes for recreational purposes. UAVs are currently being applied by farmers in wide-field analysis of the behaviour of crops such as rice, maize and wheat, where they scan through the field, take images and report abnormalities [27]. In Japan, Yamaha industrial unmanned helicopters are small, commercially viable helicopter UAVs that meet requirements for crop dusting and spraying [28]. The Yamaha Aero Robot "R-50" is an industrial-use unmanned helicopter with a 20 kg effective load capacity.

        Currently, there are two broad platforms for UAVs, namely the fixed-wing and rotary-wing (copter) types. Fixed-wing UAVs have the advantage of being able to fly at high speeds for long durations with simpler aerodynamic features; some do not even require a runway or launcher for takeoff and landing. Rotary-wing UAVs have the advantage of being able to take off and land vertically and hover over a target. However, because of mechanical complexity and limited battery power, they have a short flight range. These UAVs fly up to an altitude of 400 feet and are able to follow the same path or GPS-guided routes daily, weekly or as desired. Cameras gather images in normal light, infrared or thermal bands, as still photos or video. These images are digitized, geo-referenced and mapped. Crop consultants and farmers can use this information to scout crops, detect nutrient deficiencies, assess flood or drought damage, forecast weather patterns, monitor wildlife and even locate cattle in distant pastures. Research also reveals that UAVs can be used for detecting atmospheric microbes and air pollution [29, 30], and for spot-spraying chemicals and micronutrients [31].

      3. Quadcopter Fundamentals

        A quadcopter, as shown in Fig 5, is a helicopter equipped with four motors, each with a propeller mounted on it. Opposite motor pairs rotate in the same direction, one pair counter-clockwise and the other clockwise. The conventional helicopter has a tail rotor for stability; this is absent in the quadcopter because of its configuration of counter-rotating pairs. Four basic movements govern how the quadcopter reaches a given altitude and attitude: throttle, roll, pitch and yaw.

        The throttle is achieved by concurrently increasing or decreasing all propeller speeds by the same amount and rate. This generates a cumulative vertical force, with respect to the body-fixed frame, from the four propellers, so the quadcopter is raised or lowered by a certain amount.

        The roll is produced by concurrently increasing or decreasing the left propellers' speed while decreasing or increasing the right propellers' speed at the same rate. This generates a torque about the x axis that makes the quadcopter tilt about that axis, creating a roll angle. The total vertical thrust is maintained as in hovering, so this command leads only to a roll angular acceleration.

        The pitch, the counterpart of the roll, is achieved by concurrently increasing or decreasing the speed of the rear propellers while decreasing or increasing the speed of the front propellers at the same rate. This generates a torque about the y axis that makes the quadcopter tilt about that axis, creating a pitch angle.

        The yaw command is achieved by increasing (or decreasing) the speed of one opposite propeller pair while decreasing (or increasing) that of the other pair. This produces a torque about the z axis that makes the quadcopter turn clockwise or anti-clockwise about the z axis.

        Fig 5. Typical Quadcopter system [32]

      4. Quadcopter Mathematical Model

        A survey of modelling methods was carried out by [33], which categorized these methods. Using the Newton-Euler method, which is based on Newton's second-law equations for a rigid body, a quadcopter has six degrees of freedom (6DOF): the translational motions (x, y, z) and the rotational angles roll, pitch and yaw. These motions are represented as (x, y, z, \phi, \theta, \psi). The model is further divided into an inner-loop and outer-loop configuration, as shown in Fig 6. The inner loop, comprising the attitude (roll, pitch and yaw) and the altitude (z height), can be controlled using four proportional-integral-derivative (PID) controllers. The outer loop comprises the x and y position of the quadcopter in space; two more PID controllers can be used to control them, and the outputs of these two controllers are inputs to the roll and pitch controllers. The altitude, roll, pitch and yaw controls are represented in equations (2), (3), (4) and (5) respectively.

        U_1 = K_p^z e_z + K_i^z \int e_z \, dt - K_d^z \dot{Z}_{mes}   (2)

        U_2 = K_p^\phi e_\phi + K_i^\phi \int e_\phi \, dt - K_d^\phi \dot{\phi}_{mes}   (3)

        U_3 = K_p^\theta e_\theta + K_i^\theta \int e_\theta \, dt - K_d^\theta \dot{\theta}_{mes}   (4)

        U_4 = K_p^\psi e_\psi + K_i^\psi \int e_\psi \, dt - K_d^\psi \dot{\psi}_{mes}   (5)

        where K_p^z, K_i^z and K_d^z are the altitude PI-D controller parameters, with e_z (altitude error) = Z_des (desired altitude) - Z_mes (measured altitude); K_p^\phi, K_i^\phi and K_d^\phi are the roll-angle PI-D controller parameters, with e_\phi (roll error) = \phi_des (desired roll) - \phi_mes (measured roll); K_p^\theta, K_i^\theta and K_d^\theta are the pitch PI-D controller parameters, with e_\theta (pitch error) = \theta_des (desired pitch) - \theta_mes (measured pitch); and K_p^\psi, K_i^\psi and K_d^\psi are the yaw-angle PI-D controller parameters, with e_\psi (yaw error) = \psi_des (desired yaw) - \psi_mes (measured yaw).

        Fig 6. Quadcopter control system
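        A minimal sketch of one such loop, here the altitude channel of equation (2), with illustrative gains and timestep; following the PI-D structure, the derivative term acts on the measured altitude rather than the error:

```python
class AltitudePID:
    """PI-D altitude controller: P and I act on the error e_z,
    D acts on the measured altitude (equation (2))."""

    def __init__(self, kp=1.0, ki=0.1, kd=0.5, dt=0.01):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_meas = None

    def update(self, z_des, z_mes):
        error = z_des - z_mes
        self.integral += error * self.dt
        # Derivative of the measurement, not of the error (PI-D form).
        dz = 0.0 if self.prev_meas is None else (z_mes - self.prev_meas) / self.dt
        self.prev_meas = z_mes
        return self.kp * error + self.ki * self.integral - self.kd * dz

# Example: one control step commanding a climb toward 5 m altitude.
controller = AltitudePID()
u1 = controller.update(z_des=5.0, z_mes=0.0)
```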

      5. Vision-Based Control

    Controlling the quadcopter's movements using the detected position of the object of interest yields an object-following robot. The position of the object in every frame of a video provides the route for the position of the quadcopter at any given time. The components of the object's relative position r in the quadcopter reference frame, as shown in Fig 7, can be determined using the image principles stated in [34-36]. Equation (12) gives the relationship between the desired yaw and the detected object's image position in a frame, with the components of r recovered from the pinhole model in equations (13) and (14):

    \psi_{des} = \arctan\left( \frac{r_y}{r_x} \right)   (12)

    r_x = \frac{Z (u - u_0)}{\alpha_u}   (13)

    r_y = \frac{Z (v - v_0)}{\alpha_v}   (14)

    The outer control loop outputs the desired roll and pitch angles, which are the inputs to the inner loop for the desired X and Y position. The linear accelerations of the quadcopter are represented in equations (6) and (7):

    \ddot{x} = (\cos\phi \sin\theta \cos\psi + \sin\phi \sin\psi) \frac{U_1}{m}   (6)

    \ddot{y} = (\cos\phi \sin\theta \sin\psi - \sin\phi \cos\psi) \frac{U_1}{m}   (7)

    For small roll and pitch angles, equations (6) and (7) simplify to

    \ddot{x} = (\theta \cos\psi + \phi \sin\psi) \frac{U_1}{m}   (8)

    \ddot{y} = (\theta \sin\psi - \phi \cos\psi) \frac{U_1}{m}   (9)

    Putting equations (8) and (9) in matrix form, we obtain

    \begin{bmatrix} \ddot{x} \\ \ddot{y} \end{bmatrix} = \frac{U_1}{m} \begin{bmatrix} \sin\psi & \cos\psi \\ -\cos\psi & \sin\psi \end{bmatrix} \begin{bmatrix} \phi \\ \theta \end{bmatrix}   (10)

    Thus, the desired roll and pitch are

    \begin{bmatrix} \phi_{des} \\ \theta_{des} \end{bmatrix} = \frac{m}{U_1} \begin{bmatrix} \sin\psi & -\cos\psi \\ \cos\psi & \sin\psi \end{bmatrix} \begin{bmatrix} \ddot{x}_{des} \\ \ddot{y}_{des} \end{bmatrix}   (11)

    Fig 7. Controlling Quadcopter movement using object position [37]

    Controlling the x and y position as well as the yaw angle in this way yields a quadcopter bird-chasing system. Equipping the system with other bird-deterrent signals, such as predator calls and bird distress sounds, will make the deterrent more effective.
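    The sketch below strings equations (11)-(14) together for a single frame, with assumed camera intrinsics, an assumed depth estimate and a hypothetical detection; in practice the desired accelerations would come from the outer-loop x and y PID controllers:

```python
import numpy as np

# Assumed camera intrinsics (square pixels) and assumed depth to the bird.
alpha, u0, v0 = 800.0, 320.0, 240.0
Z = 10.0                       # estimated distance to the bird (m)

# Hypothetical detected bird position in the image (pixels).
u, v = 500.0, 180.0

# Relative position components in the quadcopter frame, eqs. (13)-(14).
r_x = Z * (u - u0) / alpha
r_y = Z * (v - v0) / alpha

# Desired yaw toward the object, eq. (12) (arctan2 resolves the quadrant).
psi_des = np.arctan2(r_y, r_x)

# Desired roll and pitch from the desired accelerations, eq. (11).
m, U1 = 0.5, 6.0               # assumed mass (kg) and total thrust (N)
xdd_des, ydd_des = 0.8, -0.3   # outer-loop x/y controller outputs (m/s^2)
M_inv = np.array([[np.sin(psi_des), -np.cos(psi_des)],
                  [np.cos(psi_des),  np.sin(psi_des)]])
phi_des, theta_des = (m / U1) * M_inv @ np.array([xdd_des, ydd_des])
print(psi_des, phi_des, theta_des)
```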

  3. RESULTS AND DISCUSSION

    In this section, the proposed position control method is evaluated based on simulations in MATLAB/SIMULINK. Using the AR Drone 2.0 model [38], scenarios with different bird trajectories were tested as waypoints for the quadcopter position control.

    Figure 8. Simulink Quadcopter Model

    Table 2 shows the tracked bird locations detected in different frames of a video. Table 3 shows the proportional, integral and differential (PID) gains. Table 4 shows the steady-state oscillation amplitude, the settling time within the steady-state oscillation and the overshoot. Fig 8 shows the Euler angles during flight for the simulation. Fig 9 shows the trajectory of the bird and that of the simulated quadcopter. The results show that following was successfully achieved.

    Table 2. Waypoints for quadcopter

    Waypoint    Xd     Yd     rx     ry     \psi_d (deg)
    1            0      0      0      0        0
    2            1      5      1      5       78.7
    3            2     14      1      9       83.7
    4            3     19      1      5       78.7
    5            1     61     -2     42      -87.3
    6            0     58     -1     -3       71.6
    7           -1     55     -1     -3       71.6
    8            0     18      1    -37      -88.5
    9            1      2      1    -16      -86.4
    10          -5    -80     -6    -82       85.8

    Table 3. PID gain parameters

    Controller       Value
    Roll             -0.5
    Pitch             0.5
    Altitude          1
    Heading (yaw)     1.5

    Table 4. Controller Performance

    Metric                                          Roll     Pitch     Altitude   Yaw
    Steady State Oscillation Amplitude              0.16°    0.23°     0.99 m     0.56°
    Settling Time within Steady State Oscillation   1.8 s    2.5 s     1.9 s      3.5 s
    Overshoot                                       1.51%    38.19%    1.53%      13.07%
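    The desired-yaw column of Table 2 is consistent with equation (12) applied to the displacement between successive waypoints; the short check below reproduces it (a verification sketch, not part of the original simulation):

```python
import numpy as np

# Waypoint positions (Xd, Yd) from Table 2.
wp = [(0, 0), (1, 5), (2, 14), (3, 19), (1, 61),
      (0, 58), (-1, 55), (0, 18), (1, 2), (-5, -80)]

for (x0, y0), (x1, y1) in zip(wp, wp[1:]):
    rx, ry = x1 - x0, y1 - y0
    psi = np.degrees(np.arctan(ry / rx))  # eq. (12)
    print(f"r = ({rx:3d}, {ry:3d})  ->  desired yaw = {psi:6.1f} deg")
```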

    Fig 8. Euler Angles during flight

    Fig 9. Simulated Quadcopter trajectory compared with detected bird trajectory

  4. CONCLUSION AND RECOMMENDATION

This study presents an effective and reliable method of preventing bird invasion of rice farms using a vision-based bird-tracking and chasing quadcopter system. The system is easy to set up and can be executed on a quadcopter. Finally, simulation results were presented which show the performance of the proposed controller.

REFERENCES

[1] Oerke, E.-C., Crop losses to pests. The Journal of Agricultural Science, 2006. 144(1): p. 31-43.

[2] IRRI, A., CIAT (2010) Global Rice Science Partnership (GRiSP). CGIAR Thematic Area, 2010. 3.

[3] Dallimer, M., et al., Genetic evidence for male-biased dispersal in the red-billed quelea Quelea quelea. Molecular Ecology, 2002. 11(3): p. 529-533.

[4] Ward, P., The migration patterns of Quelea quelea in Africa. Ibis, 1971. 113(3): p. 275-297.

[5] Manikowski, S., A. N'Diaye, and B. Tréca, Manuel de protection des cultures contre les dégats d'oiseaux. 1991.

[6] de Mey, Y., M. Demont, and M. Diagne, Estimating bird damage to rice in Africa: evidence from the Senegal River Valley. Journal of Agricultural Economics, 2012. 63(1): p. 175-200.

[7] Oduntan, O., et al., Human-wildlife Conflict: A View on Red- Billed Quelea (Quelea quelea). International Journal of Molecular Evolution and Biodiversity, 2015. 5: p. 1-4.

[8] Sonka, M., V. Hlavac, and R. Boyle, Image processing, analysis, and machine vision. 2014: Cengage Learning.

[9] Klette, R., Concise computer vision. 2014: Springer.

[10] Owen-Hill, A. Robot Vision vs Computer Vision: What's the Difference? 2016 [cited 2018 8th March]; Available from: https://www.roboticstomorrow.com/content.php?post=8484.

[11] Matsumura, R. and H. Ishiguro, Development of a high- performance humanoid soccer robot. International Journal of Humanoid Robotics, 2008. 5(03): p. 353-373.

[12] Viola, P. and M. Jones. Rapid object detection using a boosted cascade of simple features. in IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR 2001. 2001. IEEE.

[13] Papageorgiou, C.P., M. Oren, and T. Poggio. A general framework for object detection. in Computer vision, 1998. sixth international conference on. 1998. IEEE.

[14] Ahonen, T., A. Hadid, and M. Pietikainen, Face description with local binary patterns: Application to face recognition. IEEE Transactions on Pattern Analysis & Machine Intelligence, 2006(12): p. 2037-2041.

[15] Sahu, H., An Analysis of Texture Classification: Local Binary Pattern. Journal of Global Research in Computer Science, 2013. 4(5): p. 17-20.

[16] Guo, Z., L. Zhang, and D. Zhang, A completed modeling of local binary pattern operator for texture classification. IEEE Transactions on Image Processing, 2010. 19(6): p. 1657-1663.

[17] Zhang, B., et al., Local derivative pattern versus local binary pattern: face recognition with high-order local pattern descriptor. IEEE transactions on image processing, 2010. 19(2): p. 533-544.

[18] Liu, L., et al., Median robust extended local binary pattern for texture classification. IEEE Transactions on Image Processing, 2016. 25(3): p. 1368-1381.

[19] Zhang, M., et al., Blind image quality assessment using the joint statistics of generalized local binary pattern. IEEE Signal Processing Letters, 2015. 22(2): p. 207-210.

[20] Carcagnì, P., et al., Facial expression recognition and histograms of oriented gradients: a comprehensive study. SpringerPlus, 2015. 4(1): p. 645.

[21] Tripicchio, P., et al. Towards smart farming and sustainable agriculture with drones. in Intelligent Environments (IE), 2015 International Conference on. 2015. IEEE.

[22] Burema, H. and A. Filin, Aerial farm robot system for crop dusting, planting, fertilizing and other field jobs. 2016, Google Patents.

[23] Hung, C. and S. Sukkarieh, Using robotic aircraft and intelligent surveillance systems for orange hawkweed detection. Plant Protection Quarterly, 2015. 30(3): p. 100.

[24] Wang, J., et al., Apple fruit recognition based on support vector machine using in harvesting robot. Nongye Jixie Xuebao= Transactions of the Chinese Society for Agricultural Machinery, 2009. 40(1): p. 148-151.

[25] Anderson, K. and K.J. Gaston, Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Frontiers in Ecology and the Environment, 2013. 11(3): p. 138-146.

[26] Stöcker, C., et al., Review of the current state of UAV regulations. Remote Sensing, 2017. 9(5): p. 459.

[27] Muchiri, N. and S. Kimathi. A review of applications and potential applications of UAV. in Proceedings of Sustainable Research and Innovation Conference. 2016.

[28] Yamaha. 2018 [cited 2018 12th June]; Available from: https://www.yamaha-motor.com.au/buying/sky/content/aerial-services.

[29] Alvear, O., et al., Using UAV-based systems to monitor air pollution in areas with poor accessibility. Journal of Advanced Transportation, 2017. 2017.

[30] Smith, B., et al. Development and validation of a microbe detecting UAV payload. in Research, Education and Development of Unmanned Aerial Systems (RED-UAS), 2015 Workshop on. 2015. IEEE.

[31] Xiongkui, H., et al., Recent development of unmanned aerial vehicle for plant protection in East Asia. International Journal of Agricultural and Biological Engineering, 2017. 10(3): p. 18-30.

[32] Syma Co, L.S.C.X.S.

[33] Musa, S., Techniques for Quadcopter Modelling & Design: A Review. Journal of Unmanned System Technology, 2018.

[34] Herrera, D., J. Kannala, and J. Heikkilä. Accurate and practical calibration of a depth and color camera pair. in International Conference on Computer analysis of images and patterns. 2011. Springer.

[35] Xing, G., et al. People-following system design for mobile robots using kinect sensor. in Control and Decision Conference (CCDC), 2013 25th Chinese. 2013. IEEE.

[36] Neto, E.N.A., et al. Real-Time head pose estimation for mobile devices. in International Conference on Intelligent Data Engineering and Automated Learning. 2012. Springer.

[37] Kendall, A.G., N.N. Salvapantula, and K.A. Stol. On-board object tracking control of a quadcopter with monocular vision. in 2014 International Conference on Unmanned Aircraft Systems (ICUAS). 2014. IEEE.

[38] Jeurgens, N., Implementing a Simulink controller in an AR. Drone 2.0. 2016.
