Designing LiDAR and more into Autonomous E racing

Aug 27, 2018

This post, authored by Steve Taranovich, Editor-in-Chief of Planet Analog, was originally published August 10, 2018, on the Planet Analog website. Learn more about eGaN technology and EPC GaN solutions for LiDAR.


I have a pathological interest in the promotion of electric vehicles; Formula E racing is one of the most exciting venues for techies like myself. See some of my articles on Formula E in the links at the end of this blog.

What caught my eye recently was a ROBORACE video at a Formula E race track in Rome, Italy:

I think that ROBORACE is one of the most intriguing motorsports I have ever seen for advancing the technology necessary for safe and practical electric vehicles. Its uniqueness lies in the fact that its race cars are driverless as well as electric. The sheer autonomy of the vehicles is advancing the software, sensors, and processing electronics that will lead to practical and safe human-operated and autonomous electric vehicles.

The ROBORACE vehicle designer is Daniel Simon, who has made design contributions to such movies as Tron: Legacy, Oblivion, and Captain America (these are some of my favorite movies).

Check out this YouTube video of the ROBORACE vehicle:

DEVBOT

DEVBOT is a development car that enables ROBORACE teams to develop their unique software and test it in conjunction with the vehicle’s hardware. The vehicle has a GPS unit, 5 LiDAR sensors, 18 ultrasonic sensors, and 6 Blackfly AI cameras.

LiDAR

LiDAR has some tested advantages in autonomous driving. It has been found useful for systems implementing automatic braking, object detection, collision avoidance, and more. LiDAR in cars can have a range of up to 60 m, depending on the kind of sensor used.

To be fair, LiDAR units can be heavy, bulky, and expensive, and atmospheric conditions such as rain or fog can impact the coverage and accuracy of these systems. Recent solid-state LiDARs, however, are significantly smaller and relatively inexpensive.

The LiDAR in DEVBOT appears to be a Valeo SCALA design, which is sold through ibeo.

Check out this video:

Autonomous race cars, so far, have been demonstrated under fairly controlled conditions. For example, the DEVBOT was raced against a human by comparing lap times on a closed course. There were no other cars on the track at the time. It may be counter-intuitive, but it is easier to go fast on a closed course than it is to negotiate traffic in San Francisco. Race cars are an exciting case for autonomy but are less challenging than avoiding crazy drivers in urban rush hour traffic.

Another example of an autonomous race car can be found (here). That car uses a Velodyne LiDAR unit, as confirmed in the article. See References 1, 2, and 3.

RADAR

It looks like the RADAR is an electronically scanned system from Delphi. See this MathWorks article: Delphi Develops Radar Sensor Alignment Algorithm for Automotive Active Safety System. Using multiple sensing systems such as LiDAR and RADAR together will help ensure that the vehicle can operate under many different environmental conditions.

Analog Devices has an excellent Phased Array RADAR solution; see the video below:

Why use GaN in LiDAR?

Efficient Power Conversion (EPC) eGaN FETs improve the resolution that is critical for autonomy and real-world driving conditions. There are many GaN FETs in the industry, but I am partial to EPC because of their packageless technology, their talented technical staff, and their efforts to educate the industry on GaN technology and its applications.

The reason eGaN FETs (and now ICs) are used in all the LiDAR systems for autonomous cars, and now autonomous race cars, is that they enable much higher resolution and faster imaging (both thanks to extremely short laser pulses), along with the ability to see greater distances with high accuracy (thanks to fast laser pulses at very high current).
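To put rough numbers on those claims, here is a minimal back-of-the-envelope sketch of the time-of-flight arithmetic behind a pulsed LiDAR. It uses only the speed of light; the pulse widths are illustrative assumptions, not the specifications of any particular sensor or driver.

```python
# Back-of-the-envelope numbers showing why short, high-current laser pulses matter
# for time-of-flight LiDAR. Pulse widths below are illustrative assumptions only.
C = 3.0e8  # speed of light, m/s

def round_trip_time(distance_m: float) -> float:
    """Time for a laser pulse to reach a target and return, in seconds."""
    return 2.0 * distance_m / C

def range_resolution(pulse_width_s: float) -> float:
    """Approximate range resolution set by the pulse width: delta_R ~ c * tau / 2."""
    return C * pulse_width_s / 2.0

# A target at the ~60 m range mentioned above returns in about 400 ns.
print(f"Round trip to 60 m: {round_trip_time(60.0) * 1e9:.0f} ns")

# Shortening the pulse sharpens the achievable range resolution.
for tau_ns in (10.0, 5.0, 1.0):
    print(f"{tau_ns:4.1f} ns pulse -> ~{range_resolution(tau_ns * 1e-9) * 100:.0f} cm resolution")
```

In other words, the whole round trip at automotive distances lasts only hundreds of nanoseconds, and every nanosecond shaved off the pulse width tightens the range resolution by roughly 15 cm, which is why fast, high-current pulse drivers are so valuable here.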

All the autonomous vehicle manufacturers are reportedly using Efficient Power Conversion’s eGaN power elements.

The Brain

NVIDIA DRIVE is a scalable platform for autonomous driving. This system fuses data from multiple cameras as well as from LiDAR, RADAR, and ultrasonic transducers. In this way, algorithms can have a full 360-degree view of the environment surrounding the vehicle.
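The fusion idea is easier to see with a toy example. The sketch below shows one basic ingredient: transforming each sensor's range-and-bearing detections into a common vehicle frame so software can reason about a single 360-degree picture. The sensor names, mounting poses, and detections are hypothetical, not the actual ROBORACE or NVIDIA DRIVE configuration.

```python
# Minimal sketch of one step in multi-sensor fusion: detections reported in each
# sensor's own frame are transformed into a shared vehicle frame.
# Sensor names, mounting poses, and detections below are illustrative assumptions.
import math
from dataclasses import dataclass

@dataclass
class SensorPose:
    x: float        # mounting position on the vehicle, meters (forward)
    y: float        # meters (left of center)
    yaw_deg: float  # mounting angle relative to the vehicle's forward axis

def to_vehicle_frame(pose: SensorPose, rng: float, bearing_deg: float) -> tuple:
    """Convert a (range, bearing) detection in a sensor's frame to vehicle x/y."""
    angle = math.radians(pose.yaw_deg + bearing_deg)
    return (pose.x + rng * math.cos(angle),
            pose.y + rng * math.sin(angle))

# Hypothetical mounting poses for a front LiDAR and a rear RADAR.
sensors = {
    "front_lidar": SensorPose(x=1.8, y=0.0, yaw_deg=0.0),
    "rear_radar":  SensorPose(x=-1.9, y=0.0, yaw_deg=180.0),
}

# Hypothetical detections: (sensor, range in meters, bearing in degrees).
detections = [("front_lidar", 25.0, 10.0), ("rear_radar", 40.0, -5.0)]

for name, rng, bearing in detections:
    vx, vy = to_vehicle_frame(sensors[name], rng, bearing)
    print(f"{name}: object at vehicle-frame x={vx:.1f} m, y={vy:.1f} m")
```

Once everything is expressed in one frame like this, downstream algorithms can combine overlapping LiDAR, RADAR, camera, and ultrasonic detections into one consistent picture of the car's surroundings.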

Sensor positions on ROBORACE and DevBot vehicles

Visit the imgur site to learn a bit more about sensor placement on the ROBORACE and DevBot vehicles:

The first two images are renders of the final ROBORACE car showing the positions and fields of view of the sensors. In the second image, the car is facing to the right. The ultrasonics are shown in white, the LiDAR in blue, the cameras in yellow, and the radars in red.

The final four images show the sensor setup on DevBot, where we can clearly see some of the sensor layout. First, the ultrasonic sensors: these are the little white dots in the metal brackets hanging beneath the top part of the front wing. There are:

  • 6 on the front of the vehicle
  • 6 on the rear of the vehicle
  • 3 on each side

The 5 LiDAR sensors seem to be placed:

  • 2 forward and side looking (1 on either front corner)
  • 2 rearward and side looking (1 behind each front wheel)
  • 1 rearward looking (just above the center of the rear diffuser)

There are 2 cameras on the front of the car: one AI camera for the computer to see and one for the driver and team to see (each next to a front LiDAR). There is also a camera on the roof of the car (perhaps a wide-angle one?). Based on the field-of-view image, there should be cameras positioned at each LiDAR, but we are not able to see a rear-facing camera or a camera next to the rearward- and side-looking LiDARs.

There are also 2 RADARs visible: one in the center front and one in the center rear, above the rear LiDAR sensor.

There are also 18 ultrasonic sensors around the car to sense objects very close to the vehicle.

Take a look at the ROBORACE YouTube videos below:

They mention in the above videos that the front RADAR has been hidden inside the bodywork (behind material that radar signals can pass through).

Quanergy did the same with Mercedes-Benz, hiding their LiDAR sensors, which use EPC’s eGaN technology, behind the bodywork.

Tags: GaN, Lidar