Orchard Spray Robot Tracks Targets in Real Time Using LiDAR and AI Control
In a breakthrough for precision agriculture, a team of engineers from China Agricultural University has developed an intelligent orchard spraying robot capable of detecting tree canopies in real time and dynamically adjusting its spray angle to match the shape and position of each target. The system, which leverages LiDAR scanning and advanced control algorithms, represents a significant leap forward in reducing pesticide waste, minimizing environmental contamination, and improving the efficiency of orchard pest management.
The research, led by Shijie Jiang, Hengtao Ma, Shenghui Yang, Chao Zhang, Daobilige Su, and Feng Kang, with corresponding author Yongjun Zheng, was published in the peer-reviewed journal Transactions of the Chinese Society of Agricultural Engineering. The study introduces a fully integrated target detection and tracking system designed specifically for ground-based orchard robots, offering a scalable solution to one of the most persistent challenges in modern agriculture: over-spraying.
Traditional air-assisted sprayers, while effective at covering large areas quickly, operate on a one-size-fits-all principle. They spray continuously, regardless of whether a tree is present or how large and dense its canopy is. This results in significant pesticide drift, off-target deposition, and excessive chemical use—problems that not only raise environmental and health concerns but also increase operational costs for farmers. In many cases, up to 50% or more of the applied chemicals miss their intended targets, ending up in soil, waterways, or non-crop areas.
The new system developed by Zheng’s team directly addresses this inefficiency. By equipping a small, crawler-based robotic platform with a 360-degree LiDAR sensor, the researchers have created a machine that “sees” the orchard as it moves through the rows. The LiDAR scans the environment, generating a dense point cloud that maps the three-dimensional structure of tree canopies in real time. From this data, the system identifies the optimal target points—specifically, the middle to lower sections of the canopy where pests often reside and where ground-based spraying is most effective.
What sets this system apart is not just its ability to detect targets, but how it responds. Once a target is identified, the robot calculates the precise elevation angle needed for its nozzles to align with the target point. This calculation takes into account the physical geometry of the spraying mechanism, the position of the LiDAR sensor, and the dynamic movement of the electric actuator that adjusts the spray boom. The result is a highly responsive system that can reposition its spray angle within tens of milliseconds, ensuring that each burst of pesticide is delivered exactly where it is needed.
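At its simplest, the elevation-angle calculation reduces to basic trigonometry between the nozzle and the target point. The sketch below illustrates the idea; the nozzle mounting height is an illustrative placeholder, not a value reported in the paper, and the actual system also accounts for the actuator and LiDAR geometry described above.

```python
import math

def elevation_angle_deg(target_height, target_distance, nozzle_height=0.8):
    """Elevation angle (degrees) that aims a nozzle at a canopy target.

    target_height: height of the target point above ground (m)
    target_distance: horizontal distance from nozzle to target (m)
    nozzle_height: nozzle mounting height (m) -- illustrative, not the
                   prototype's actual dimension.
    """
    return math.degrees(math.atan2(target_height - nozzle_height,
                                   target_distance))

# A target 3.0 m up the canopy, 2.0 m away from a nozzle mounted at 0.8 m:
angle = elevation_angle_deg(3.0, 2.0)
```

For these illustrative numbers the result falls near 48 degrees, which is consistent with the 47.8 to 51.4 degree range the field tests reported.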
The core of the system lies in its closed-loop control architecture. At the heart of the operation is a microcontroller (STM32F429) that processes sensor data and executes control commands. The LiDAR feeds point cloud data to a mini industrial computer, which performs the initial processing—filtering out noise, identifying tree trunks and gaps, and extracting key features such as the maximum and minimum polar angles of the canopy. From these values, the system computes an average target point, which serves as the reference for the spray angle.
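The feature-extraction step described above can be sketched in a few lines. This is a simplified stand-in for the paper's processing pipeline: it assumes the canopy returns for one frame have already been isolated as (polar angle, range) pairs, and reduces them to a single aim reference from the angular extremes and mean range.

```python
def target_point_from_scan(points):
    """Reduce one frame of canopy LiDAR returns to an aim reference.

    points: (polar_angle_rad, range_m) pairs belonging to one canopy.
    Per the article, the maximum and minimum polar angles bound the
    canopy; here their midpoint, paired with the mean range, serves as
    the average target point. A simplified sketch, not the exact
    reduction used in the study.
    """
    angles = [a for a, _ in points]
    ranges = [r for _, r in points]
    theta = (max(angles) + min(angles)) / 2.0  # midpoint of angular extent
    dist = sum(ranges) / len(ranges)           # mean range to the canopy
    return theta, dist
```

In the real system this reference angle is what the spray-boom controller tracks from frame to frame.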
To ensure smooth and accurate movement, the researchers implemented an incremental Proportional-Integral-Derivative (PID) control algorithm. Unlike traditional PID controllers that output absolute values, the incremental version calculates the change in control output from one time step to the next. This approach offers several advantages: it reduces overshoot, improves stability, and allows for fine-tuned adjustments even in the presence of mechanical backlash or sensor noise. The controller continuously compares the actual spray angle—measured via an encoder on the electric pushrod—with the desired target angle, adjusting the motor input in real time to minimize error.
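The incremental form described above is a standard textbook construction: the output is the change in control signal, delta_u = Kp*(e_k - e_k1) + Ki*e_k + Kd*(e_k - 2*e_k1 + e_k2). A minimal sketch, with illustrative gains rather than the tuned values from the study:

```python
class IncrementalPID:
    """Incremental PID: returns the *change* in control output per step
    rather than an absolute value."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.e_k1 = 0.0  # error one step back
        self.e_k2 = 0.0  # error two steps back

    def step(self, error):
        du = (self.kp * (error - self.e_k1)
              + self.ki * error
              + self.kd * (error - 2 * self.e_k1 + self.e_k2))
        self.e_k2, self.e_k1 = self.e_k1, error
        return du

# Drive a simulated spray angle toward a 50-degree setpoint
# (the "plant" here is a trivial integrator, for illustration only):
pid = IncrementalPID(kp=0.5, ki=0.1, kd=0.05)
angle, setpoint = 45.0, 50.0
for _ in range(200):
    angle += pid.step(setpoint - angle)
# angle converges to the 50-degree setpoint
```

Because each step emits only a delta, a transient sensor glitch perturbs the output once rather than being held in an accumulated absolute command, which is part of why the form behaves well with encoder noise and backlash.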
One of the critical innovations in the design is the way the system handles variability in tree size and spacing. Orchards are inherently irregular environments. Trees differ in height, width, and density, and planting patterns can vary significantly between regions. To accommodate this diversity, the team defined a trapezoidal detection zone in front of the robot, limiting the area in which targets are evaluated. This zone is calibrated based on typical orchard dimensions observed in field surveys across Guangxi, Beijing, and Shanxi—regions known for their fruit production. By focusing only on the middle to lower canopy (approximately 2.0 to 3.5 meters above ground), the system avoids interference from weeds, ground clutter, and upper canopy sections better suited for aerial drones.
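A trapezoidal zone combined with the height band amounts to a simple per-point filter. In the sketch below, all zone dimensions except the 2.0 to 3.5 m canopy band are illustrative placeholders, not the calibrated values from the field surveys:

```python
def in_detection_zone(x, y, z,
                      near=0.5, far=3.0,
                      near_half_width=0.5, far_half_width=1.5,
                      z_min=2.0, z_max=3.5):
    """True if a LiDAR point (robot frame: x forward, y lateral, z up,
    metres) lies inside a trapezoidal zone ahead of the robot and within
    the middle-to-lower canopy band. Zone dimensions are illustrative."""
    if not (near <= x <= far):
        return False
    if not (z_min <= z <= z_max):       # rejects ground clutter and weeds
        return False
    # The trapezoid's half-width grows linearly with forward distance:
    frac = (x - near) / (far - near)
    half_width = near_half_width + frac * (far_half_width - near_half_width)
    return abs(y) <= half_width
```

Points failing any test are discarded before target extraction, which keeps the downstream processing load small.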
The integration of this ground robot with aerial drones forms the foundation of what the researchers call a “ground-air collaborative plant protection model.” In this vision, drones handle the upper canopy, where their vantage point gives them superior coverage, while the ground robot focuses on the lower and middle sections, which are often shielded from above and difficult to reach with conventional methods. This dual-layer approach ensures complete canopy coverage while optimizing chemical use and reducing overlap.
Field tests conducted on a campus grove of Malus spectabilis trees demonstrated the system’s effectiveness. The robot traveled at a steady speed of 0.5 meters per second along a U-shaped path, scanning trees on both sides of the row. Three randomly selected trees were analyzed in detail. The results showed that the calculated target points consistently fell within the 2.0 to 3.5-meter range, aligning well with the intended spray zone. The minimum spray elevation angle recorded was 47.8 degrees, and the maximum was 51.4 degrees—values that reflect subtle but meaningful adjustments based on tree height and distance.
Perhaps most impressively, the system achieved a maximum adjustment time of just 0.06 seconds between consecutive target points. This rapid response time is crucial for maintaining accuracy at higher operating speeds. If the robot cannot adjust quickly enough, it risks spraying past the target or misaligning with the canopy. At 0.06 seconds, the system can keep pace with real-world field conditions, even when traveling at speeds up to 4 kilometers per hour.
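The claim is easy to verify with back-of-the-envelope arithmetic: at 4 km/h the robot covers roughly 1.11 m/s, so a worst-case 0.06 s adjustment costs under 7 cm of forward travel, well within a single canopy's extent.

```python
speed_kmh = 4.0
speed_ms = speed_kmh / 3.6       # ~1.11 m/s
adjust_time = 0.06               # worst-case boom adjustment time (s)
travel = speed_ms * adjust_time  # forward motion during one adjustment
# travel is about 0.067 m, i.e. under 7 cm
```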
Another key finding was the small variation in target position within individual trees. Because the canopy structure of a single tree changes gradually along the row, the researchers realized that processing every single frame of LiDAR data was unnecessary and computationally expensive. Instead, they proposed a segmentation strategy: dividing each tree into equal sections (e.g., thirds) and averaging the target points within each segment. This approach reduces the number of angle adjustments required, lowers the computational load, and extends the lifespan of the mechanical components by minimizing unnecessary movements. In practice, this means the robot can maintain high accuracy while operating more efficiently.
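The segmentation-and-averaging strategy can be sketched as follows. The function assumes per-frame target angles have already been computed for one tree, ordered along the direction of travel; three segments per tree is one of the divisions the article mentions.

```python
def segment_targets(frame_targets, n_segments=3):
    """Average per-frame target angles within equal segments of one tree.

    frame_targets: target elevation angles (degrees), one per LiDAR
    frame, ordered along the row. Returns one averaged angle per
    segment, so the boom adjusts n_segments times per tree instead of
    once per frame. A sketch of the strategy described in the study.
    """
    seg_len = len(frame_targets) / n_segments
    out = []
    for i in range(n_segments):
        chunk = frame_targets[int(i * seg_len):int((i + 1) * seg_len)]
        out.append(sum(chunk) / len(chunk))
    return out
```

Six frames collapse to three commands, so the actuator moves a third as often while the commanded angles stay close to the per-frame values.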
The implications of this technology extend beyond just pesticide reduction. By enabling precise, on-demand spraying, the system supports the broader goals of sustainable agriculture. It reduces chemical runoff into water systems, lowers the risk of pesticide resistance in pest populations, and minimizes exposure for farm workers and nearby communities. Moreover, because the robot applies only the necessary amount of chemical, farmers can achieve the same level of pest control with less input, leading to cost savings and improved profitability.
From a technological standpoint, the use of LiDAR instead of camera-based vision systems offers several advantages. LiDAR is less affected by lighting conditions, making it reliable in early morning, late evening, or under dense foliage where shadows are common. It also provides direct distance measurements, eliminating the need for complex stereo vision algorithms or depth estimation models. While LiDAR sensors have historically been expensive, recent advances in solid-state and MEMS-based designs have driven costs down, making them increasingly viable for agricultural robotics.
The choice of an electric pushrod actuator with built-in encoder feedback further enhances the system’s precision. Each rotation of the motor is translated into a known linear displacement, allowing the controller to track the exact position of the spray boom. The encoder outputs pulses that are counted and interpreted by the microcontroller, forming a closed-loop feedback system that ensures the actual angle matches the commanded angle—even in the presence of external disturbances such as wind or uneven terrain.
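The pulse-counting conversion is straightforward: each motor revolution advances the rod by the screw lead, so counted pulses map linearly to extension. The resolution figures below are illustrative, not the actual hardware's specifications.

```python
def pushrod_extension_mm(pulse_count, pulses_per_rev=1000, lead_mm=4.0):
    """Convert counted encoder pulses to pushrod extension (mm).

    pulses_per_rev: encoder pulses per motor revolution (illustrative)
    lead_mm: linear travel per revolution of the screw (illustrative)
    """
    return pulse_count / pulses_per_rev * lead_mm
```

With these example numbers, 2,500 pulses correspond to 10 mm of extension; the controller compares this measured extension against the commanded one to close the loop.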
The research also highlights the importance of mechanical design in agricultural robotics. The spray bracket is mounted on a rotating joint, allowing it to tilt upward or downward as the pushrod extends or retracts. The geometric relationship between the pushrod, the pivot point, and the nozzle array was carefully modeled to ensure smooth, predictable motion across the full range of operation. This attention to mechanical detail is often overlooked in academic robotics but is essential for real-world deployment.
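One common way to model such a linkage, and plausibly what "carefully modeled" entails here, is a triangle formed by the pushrod, the boom arm, and the fixed base link around the pivot, solved with the law of cosines. The link lengths below are illustrative, not the prototype's dimensions.

```python
import math

def boom_angle_deg(rod_length, a=0.30, b=0.25):
    """Boom tilt (degrees) from pushrod length via the law of cosines.

    The pushrod (rod_length), boom arm (b) and fixed base link (a) form
    a triangle about the pivot; extending the rod widens the angle
    opposite it. a and b are illustrative link lengths in metres.
    """
    cos_c = (a * a + b * b - rod_length * rod_length) / (2 * a * b)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_c))))
```

Because the mapping from rod length to angle is nonlinear, tabulating or inverting it is what lets the controller command a linear actuator position for any desired spray angle.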
While the current prototype is designed for research and small-scale applications, the principles it demonstrates are scalable. The same detection and control framework could be adapted for larger, tractor-mounted systems or integrated into autonomous orchard platforms. With minor modifications, the system could also be used for other tasks such as pruning, harvesting, or yield estimation—all of which require accurate perception and precise actuation.
The team acknowledges that challenges remain. LiDAR performance can degrade in heavy rain or fog, and the system currently assumes a relatively uniform row structure. In orchards with highly irregular planting patterns or significant undergrowth, additional filtering and classification algorithms may be needed. Future work will focus on integrating machine learning models to improve target classification, distinguishing between healthy foliage, fruit, and diseased areas to enable truly selective spraying.
Nevertheless, the success of this system marks a turning point in the evolution of agricultural robotics. It moves beyond simple automation—replacing human labor with machines—and into the realm of intelligent decision-making. The robot does not just follow a path; it perceives its environment, interprets the data, and acts accordingly. This level of autonomy is what defines the next generation of smart farming equipment.
As global food demand rises and environmental regulations tighten, the pressure on farmers to do more with less will only increase. Technologies like this orchard spray robot offer a path forward—a way to boost productivity while protecting natural resources. By combining advanced sensing, real-time control, and intelligent algorithms, the researchers have created a system that is not just efficient, but truly sustainable.
The work also underscores the growing role of Chinese institutions in agricultural innovation. With strong government support through programs like the National Key R&D Program and regional collaboration initiatives, researchers at China Agricultural University are pushing the boundaries of what’s possible in farm automation. Their work is part of a broader trend toward precision agriculture, where data-driven decisions replace guesswork and waste is systematically eliminated.
In the coming years, systems like this are likely to become standard equipment in high-value orchards. As sensor costs continue to fall and computing power increases, the economic case for intelligent spraying will only grow stronger. For farmers, the benefits are clear: healthier crops, lower costs, and a cleaner environment. For the planet, it represents a step toward a more sustainable food system.
The study, titled Target Detection and Tracking System for Orchard Spraying Robots, was conducted by Shijie Jiang, Hengtao Ma, Shenghui Yang, Chao Zhang, Daobilige Su, Yongjun Zheng, and Feng Kang from China Agricultural University and Beijing Forestry University, and published in Transactions of the Chinese Society of Agricultural Engineering. doi:10.11975/j.issn.1002-6819.2021.09.004