

84.00 € incl. VAT, plus shipping


ISBN 978-3-8439-4417-5, Informatik series

Patrick Fleischmann
Map-aided Off-road Path Following for Autonomous Vehicles

256 pages, dissertation, Technische Universität Kaiserslautern (2020), softcover, A5

Abstract

This thesis focuses on perception and navigation for future commercial vehicles operating in off-highway environments. Detecting and following existing pathways for safe and efficient navigation is an essential capability of these vehicles. With the application area in mind, robust and low-cost sensor systems such as stereo and time-of-flight cameras were selected for this work.

The proposed approach partitions the problem into multiple cues that are finally combined to obtain a trajectory. The first input is based on extended data from the OpenStreetMap project and delivers a route that takes the robot's capabilities into account. The remaining inputs are generated by the two perception systems considered in this thesis. One component determines the pathway's position based on traversability, since paths in rural areas are often the places with the fewest obstructions.
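
As a purely illustrative sketch of the map-based routing cue (the thesis builds on extended OpenStreetMap data; the toy graph, attribute names, and width constraint below are assumptions), a route respecting a vehicle capability can be selected on an attributed path graph like this:

```python
import networkx as nx

# Toy path graph: nodes are waypoints, edges carry path attributes.
# Lengths and widths (in metres) are made up for illustration.
G = nx.Graph()
G.add_edge("gate", "barn", length=120.0, width=3.5)
G.add_edge("barn", "field", length=80.0, width=1.2)   # too narrow for the vehicle
G.add_edge("gate", "pond", length=150.0, width=2.8)
G.add_edge("pond", "field", length=90.0, width=2.5)

ROBOT_WIDTH = 1.6  # assumed vehicle width including a safety margin

# Keep only edges the vehicle can physically use, then route by length.
usable = nx.Graph()
usable.add_edges_from(
    (u, v, d) for u, v, d in G.edges(data=True) if d["width"] >= ROBOT_WIDTH
)
route = nx.shortest_path(usable, "gate", "field", weight="length")
print(route)  # ['gate', 'pond', 'field']
```

The same idea scales to a real OpenStreetMap extract, where additional tags such as surface, incline, or smoothness can be folded into the edge filter or the edge weight.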

Due to fundamental differences between the point clouds obtained from the two sensing technologies, two approaches are proposed to estimate the traversability. Another source of information is the shape of the ground, which is often a good indicator of the pathway's location. This roughness cue is especially valuable where the texture does not clearly separate the path but its surface is flat compared to the roadsides. As the final perception component of this thesis, an online-adaptive path detection system is proposed that exploits the RGB data of the camera image. The method eliminates many potential outliers by exploiting the results of the stereo point cloud segmentation while overcoming the distance limitation by incorporating the map data.
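
To make the roughness cue concrete, the following sketch (not the thesis's sensor-specific estimators; cell size and thresholds are assumed values) grids a point cloud and flags cells whose height range or height spread stays below hand-picked limits:

```python
import numpy as np

def traversability_grid(points, cell=0.25, max_step=0.15, max_rough=0.03):
    """Flag grid cells of a point cloud as traversable based on height statistics.

    points    : (N, 3) array of x, y, z coordinates in the vehicle frame
    cell      : grid resolution in metres (assumed value)
    max_step  : maximum tolerated height range per cell, e.g. small obstacles
    max_rough : maximum tolerated height standard deviation per cell (roughness)
    """
    ij = np.floor(points[:, :2] / cell).astype(int)
    ij -= ij.min(axis=0)                     # shift indices so they start at 0
    shape = tuple(ij.max(axis=0) + 1)
    idx = (ij[:, 0], ij[:, 1])
    z = points[:, 2]

    # Accumulate per-cell height statistics.
    count = np.zeros(shape)
    z_min = np.full(shape, np.inf)
    z_max = np.full(shape, -np.inf)
    z_sum = np.zeros(shape)
    z_sq = np.zeros(shape)
    np.add.at(count, idx, 1)
    np.minimum.at(z_min, idx, z)
    np.maximum.at(z_max, idx, z)
    np.add.at(z_sum, idx, z)
    np.add.at(z_sq, idx, z ** 2)

    with np.errstate(invalid="ignore", divide="ignore"):
        mean = z_sum / count
        rough = np.sqrt(np.clip(z_sq / count - mean ** 2, 0.0, None))
    step = z_max - z_min

    return (count > 0) & (step <= max_step) & (rough <= max_rough)
```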

Finally, the different cues are fused to determine a safe and suitable trajectory that guides the robot along existing pathways to a given target. All perception components, as well as the developed fusion, have been validated on a robotic vehicle based on a John Deere XUV855D Gator Utility Vehicle.
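
The abstract does not detail how the fusion is carried out; as one minimal, assumed illustration, the cues could be combined as a weighted sum of normalized cost grids over which a low-cost path towards the target is searched:

```python
import heapq
import numpy as np

def fuse_and_plan(cost_maps, weights, start, goal):
    """Fuse per-cue cost grids and search a low-cost path with Dijkstra.

    cost_maps : list of (H, W) arrays in [0, 1], one per cue
                (e.g. map prior, traversability, roughness, appearance)
    weights   : one scalar weight per cue (assumed, not taken from the thesis)
    start, goal : (row, col) grid cells; the goal must be reachable
    """
    cost = sum(w * c for w, c in zip(weights, cost_maps))
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = 0.0
    queue = [(0.0, start)]
    while queue:
        d, (r, c) = heapq.heappop(queue)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + 1.0 + cost[nr, nc]   # unit step cost plus fused cell cost
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(queue, (nd, (nr, nc)))

    # Walk back from the goal to recover the cell sequence.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

On a real vehicle, such a grid search would be followed by trajectory smoothing and tracking, which this sketch does not cover.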