This project describes the setup of a prototype that incorporates Time-of-Flight (ToF) and RGB sensors into an autonomous forklift. It also discusses the software framework used for developing machine vision algorithms. Several algorithms, including convolutional neural networks (CNNs) such as YOLOv8 and Mask R-CNN, were tested and compared in terms of accuracy, processing speed, and robustness. The experimental results highlight the advantages of combining ToF depth data with RGB color information to enhance object detection and tracking accuracy. The evaluation of these machine vision algorithms provides insights into their suitability for real-time use in autonomous vehicles. The proposed approach is shown to address the identified challenges effectively and to improve the performance of the forklift prototype. Additionally, the work presents a comparison of the machine vision algorithms (YOLOv8 and Mask R-CNN) and introduces an accurate, vision-based method for fork load detection and measurement.
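To illustrate the kind of ToF/RGB fusion described above, the following is a minimal sketch, not the project's actual pipeline. It assumes the `ultralytics` YOLOv8 package, pretrained `yolov8n.pt` weights, and a ToF depth map already registered (pixel-aligned) to the RGB frame in metres; the helper name `detect_with_depth` and the median-depth heuristic are illustrative assumptions.

```python
# Sketch: run YOLOv8 on an RGB frame and attach a ToF-derived distance per detection.
# Assumes depth_map is in metres and pixel-aligned with rgb_image (assumption).
import numpy as np
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # pretrained YOLOv8 nano weights (placeholder choice)

def detect_with_depth(rgb_image: np.ndarray, depth_map: np.ndarray):
    """Detect objects in the RGB frame and estimate each object's distance
    from the aligned ToF depth map."""
    results = model(rgb_image)[0]
    detections = []
    for box, cls, conf in zip(results.boxes.xyxy.cpu().numpy(),
                              results.boxes.cls.cpu().numpy(),
                              results.boxes.conf.cpu().numpy()):
        x1, y1, x2, y2 = box.astype(int)
        roi = depth_map[y1:y2, x1:x2]
        valid = roi[roi > 0]                      # ignore missing ToF returns
        distance = float(np.median(valid)) if valid.size else None
        detections.append({
            "class_id": int(cls),
            "confidence": float(conf),
            "bbox": (x1, y1, x2, y2),
            "distance_m": distance,
        })
    return detections
```

Taking the median depth inside each bounding box is one simple way to make the distance estimate robust to ToF dropouts and background pixels; the actual fusion strategy evaluated in the project may differ.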
