Mono-Camera ADAS

ADAS Project Overview

This Level-1 ADAS project employs a monocular camera in conjunction with an NVIDIA Jetson Nano board. Through more than 1.5 years of dedicated research, we developed and fine-tuned AI and computer-vision algorithms, achieving the targeted performance benchmarks for Forward Collision Warning (FCW), Lane Departure Warning (LDW), and High Beam Assist (HBA). Field test videos demonstrating the results are shown below.


My role in this project:

As the AI team lead for the ADAS project, I oversaw the research and development effort for more than a year. My responsibilities included rigorous testing and evaluation of various AI models, particularly for vehicle detection and lane segmentation. Notably, I led the training of a specialized Convolutional Neural Network (CNN) tailored for detecting car lights and classifying them into front, rear, and other categories, including reflected beams, which the HBA algorithm relies on.
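
As an illustration of this car-light classification step, the sketch below shows a small Keras CNN that classifies cropped light patches into four categories. The class set, input resolution, and network layout are illustrative assumptions, not the exact model used in the project.

```python
# Minimal sketch (not the production model): a small Keras CNN that classifies
# cropped light patches into front / rear / other / reflection.
# Class names, input size, and hyperparameters are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4  # assumed classes: front, rear, other, reflection

def build_light_classifier(input_shape=(64, 64, 3)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Print the architecture; training would use crops of detected light regions.
    build_light_classifier().summary()
```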

 


Furthermore, I experimented extensively with a range of processor boards and mini PCs, from the Raspberry Pi and Intel Myriad to the Jetson Nano and Jetson Orin Nano. This process culminated in object detection running on the NVIDIA DeepStream framework, which was then used for Forward Collision Warning (FCW). For Lane Departure Warning (LDW), I used a ResNet-based lane detection model pre-trained on the TuSimple dataset and optimized its inference with TensorRT. For High Beam Assist (HBA), I developed and fine-tuned CNN networks in TensorFlow and converted them to TF-Lite format for efficient, high-performance deployment on the Jetson platforms.
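
As a concrete example of the TF-Lite deployment path mentioned above, here is a minimal sketch of converting a trained Keras model to .tflite and running one inference with the TF-Lite interpreter. The file names, input resolution, and optimization settings are assumptions for illustration, not the project's actual values.

```python
# Minimal sketch: convert a trained Keras model to TF-Lite and run one inference.
import numpy as np
import tensorflow as tf

# 1) Convert a saved Keras model to TF-Lite with default optimizations.
model = tf.keras.models.load_model("hba_light_classifier.h5")  # hypothetical path
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
with open("hba_light_classifier.tflite", "wb") as f:
    f.write(tflite_model)

# 2) Run inference with the TF-Lite interpreter (same API on a Jetson or a PC).
interpreter = tf.lite.Interpreter(model_path="hba_light_classifier.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

dummy_patch = np.random.rand(1, 64, 64, 3).astype(np.float32)  # placeholder input
interpreter.set_tensor(inp["index"], dummy_patch)
interpreter.invoke()
print("class probabilities:", interpreter.get_tensor(out["index"]))
```

For the LDW model, the TensorRT path typically works differently: the network is exported (for example to ONNX) and built into an optimized engine that runs natively on the Jetson GPU.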

In addition, in my role as an embedded systems developer, I extracted CAN data at both 100 kbit/s and 500 kbit/s using a cost-effective CAN-to-Serial module from LONGAN Labs. This provided crucial signals such as light status, indicator status, vehicle speed, and steering-wheel angle. Leveraging this data, we built a user interface for monitoring and fed the signals into the decision-making of the FCW, LDW, and HBA algorithms. This real-time data integration significantly improved the accuracy and responsiveness of the safety systems.
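
To illustrate the CAN-over-serial readout, the sketch below reads frames from a serial port and decodes vehicle speed and steering-wheel angle. The port name, baud rate, frame layout, CAN IDs, and scaling factors are hypothetical placeholders; the real values depend on the module's protocol and the vehicle's CAN database.

```python
# Minimal sketch, assuming the CAN-to-Serial module streams raw frames over a
# USB serial port. Frame layout, CAN IDs, and scaling below are assumptions.
import struct
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # assumed device node on the Jetson
BAUD = 115200           # assumed serial baud rate (CAN side runs at 100/500 kbit/s)

def parse_frame(raw: bytes):
    """Unpack a hypothetical 12-byte record: 4-byte CAN ID + 8 data bytes."""
    can_id = struct.unpack(">I", raw[:4])[0]
    data = raw[4:12]
    if can_id == 0x3E9:   # hypothetical vehicle-speed message
        return "speed_kph", data[0]
    if can_id == 0x025:   # hypothetical steering-angle message
        angle = struct.unpack(">h", data[:2])[0] * 0.1
        return "steering_deg", angle
    return None, None

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    while True:
        raw = ser.read(12)
        if len(raw) < 12:
            continue  # timeout or partial frame; try again
        name, value = parse_frame(raw)
        if name:
            print(f"{name}: {value}")
```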

Field Test Videos:

The videos below show the project in more detail (please enable CC for English subtitles).