
Farm robot uses lidar system to navigate autonomously and harvest fruit

22 Apr 2025

Control algorithm shows promise with high-bed cultivation methods.

In strawberry farming, high-bed cultivation already eases some of the manual labor involved, and now a new lidar-assisted robot solution promises to further help harvest soft fruit such as strawberries, tomatoes, and similar produce.

As a first step, Osaka Metropolitan University Assistant Professor Takuya Fujinaga has developed an algorithm that allows fruit-picking robots to drive autonomously in two modes: moving to a pre-designated destination, and moving alongside raised cultivation beds.
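
The paper itself does not publish code, but the two-mode behavior can be pictured as a simple state machine that switches between waypoint driving and bed following. The Python sketch below is purely illustrative: the names (Mode, bed_in_view, choose_mode) and the switching criterion are assumptions, not details from the study.

```python
from enum import Enum, auto

class Mode(Enum):
    WAYPOINT = auto()      # drive toward a pre-designated destination
    BED_FOLLOW = auto()    # move alongside a raised cultivation bed

def bed_in_view(scan_points, max_range=2.0, min_points=30):
    """Hypothetical check: enough nearby lidar returns are treated as a bed edge."""
    near = [p for p in scan_points if (p[0] ** 2 + p[1] ** 2) ** 0.5 < max_range]
    return len(near) >= min_points

def choose_mode(mode, scan_points, at_waypoint):
    """Alternate between the two navigation methods described in the article."""
    if mode is Mode.WAYPOINT and at_waypoint and bed_in_view(scan_points):
        return Mode.BED_FOLLOW     # reached the start of a bed row
    if mode is Mode.BED_FOLLOW and not bed_in_view(scan_points):
        return Mode.WAYPOINT       # bed has ended; head for the next waypoint
    return mode
```

Here the switch condition is simply whether the lidar still returns enough nearby points from a bed edge; the actual algorithm's switching criteria are not described in the article.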

The Graduate School of Engineering researcher experimented with an agricultural robot that utilizes lidar point cloud data to map the environment.

Guided by lidar, the farming robot can move accurately while maintaining a constant distance from the cultivation bed, with its effectiveness verified in virtual and actual environments.
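
The article does not state the control law, but a common way to hold a constant offset from a straight bed with a 2D lidar is to fit a line to the returns from the bed's side and apply a proportional correction to the lateral and heading errors. The sketch below assumes that approach; the gains, target distance, and the assumption that the bed lies to the robot's left are illustrative choices, not parameters from the study.

```python
import numpy as np

def bed_following_command(bed_points, target_dist=0.5,
                          k_dist=1.0, k_head=2.0, v_forward=0.3):
    """Steer so the robot stays target_dist metres from the bed edge.

    bed_points: (N, 2) lidar points from the bed side, in the robot frame
                (x forward, y left); the bed is assumed to be on the left.
    Returns (linear velocity m/s, angular velocity rad/s).
    """
    pts = np.asarray(bed_points, dtype=float)
    a, b = np.polyfit(pts[:, 0], pts[:, 1], deg=1)   # fit bed edge as y = a*x + b
    heading_err = np.arctan(a)                       # misalignment with the bed edge
    lateral_dist = b / np.sqrt(1.0 + a * a)          # perpendicular distance to the edge
    dist_err = lateral_dist - target_dist            # > 0 means too far from the bed
    angular_vel = k_dist * dist_err + k_head * heading_err
    return v_forward, angular_vel

# Example: the bed edge is seen about 0.6 m to the left and slightly angled.
points = [(x, 0.6 + 0.05 * x) for x in np.linspace(0.2, 2.0, 20)]
print(bed_following_command(points))  # -> (0.3, ~0.2): steer left, back toward 0.5 m
```

A least-squares line fit is only a stand-in for the paper's point cloud processing, which is not detailed in the article.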

Precision movement

“If robots can move around the farm more precisely, the range of tasks that they can perform automatically will expand, not only for harvesting, but also for monitoring for disease and pruning,” commented Professor Fujinaga.

“My research shows the possibility, and once this type of agricultural robot becomes more practical to use, it will make a significant contribution to improving work efficiency and reducing labor, especially for high-bed cultivation,” he said.

Professor Fujinaga’s findings are described in Computers and Electronics in Agriculture.

Paper abstract

Further details of the Osaka group’s work are contained in the paper’s abstract:

“By alternating between the navigation methods – waypoint navigation; and cultivation bed navigation – the robot can achieve self-navigation within a farm without relying on path planning, which requires accurate localization in areas with limited environmental features. The robot uses lidar point cloud data to navigate effectively. The navigation approach was initially simulated in a virtual environment and then evaluated in a real-world strawberry farm.

“The results demonstrate the ability of the robot to maintain a specified distance of ± 0.05 m and an orientation angle of ± 5° relative to the cultivation bed. These findings confirm the feasibility of the proposed method for achieving accurate and stable navigation on a farm.

“This study also highlights the importance of simulations in agricultural robotics development. Simulated environments provide a cost-effective platform for refining robot specifications, such as sensor selection and navigation algorithms, before real-world deployment. For example, simulations have shown that reducing the maximum measurement range of the lidar can significantly impact localization accuracy and navigation stability.

“Future work will focus on creating dynamic simulation environments that replicate real-world conditions, such as uneven surfaces and varying farm layouts. Enhancing simulation fidelity will improve the reliability of evaluations and accelerate the practical implementation of agricultural robots, contributing to their broader adoption and efficiency in farming operations.”
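
The ±0.05 m and ±5° figures quoted above translate directly into a pass/fail check when replaying a simulated or logged run. The helper below is a minimal sketch, assuming a log of (distance, orientation) samples; the function name and log format are not from the paper.

```python
import math

DIST_TOL_M = 0.05    # allowed deviation from the specified bed distance (metres)
ANGLE_TOL_DEG = 5.0  # allowed deviation of the orientation angle (degrees)

def run_within_tolerance(log, target_dist_m):
    """Check each logged sample (distance_to_bed_m, orientation_rad) against
    the tolerances reported in the paper's abstract."""
    for dist_m, orient_rad in log:
        if abs(dist_m - target_dist_m) > DIST_TOL_M:
            return False
        if abs(math.degrees(orient_rad)) > ANGLE_TOL_DEG:
            return False
    return True

# Example with a made-up log: the robot holds roughly 0.5 m and stays nearly parallel.
log = [(0.52, math.radians(2.0)), (0.48, math.radians(-3.5)), (0.50, math.radians(1.0))]
print(run_within_tolerance(log, target_dist_m=0.5))  # True
```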
