Baraja and Monash Motorsport: Integration of the Spectrum-Scan™ LiDAR

Monash Motorsport has been focused on autonomous vehicle development since 2017, constantly iterating and evolving its perception, path-planning and actuation systems. With its 2021 vehicle, M21, the team is excited to tackle its most ambitious concept yet: a fully autonomous, electric race car built on an all-new vehicle architecture. As part of this ongoing development, this blog series aims to document and share the team’s progress in implementing a brand-new LiDAR solution, Baraja’s Spectrum-Scan™ LiDAR.

Having begun our partnership with Baraja towards the end of 2020, we started work on integrating a Baraja Spectrum-Scan™ LiDAR system into our existing Autonomous Systems pipeline. The pipeline is a software stack running on our computing units, which interface with the Electronic Control Unit and other safety systems on the vehicle. It is split up into “nodes”: individual programs that communicate with each other using the Robot Operating System (ROS), a set of open-source software libraries and tools commonly used for robotic applications.
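As a rough illustration of what one of these nodes looks like in practice, below is a minimal sketch of a ROS node written in Python with the standard rospy client library. The node name and topic name are hypothetical placeholders for illustration, not our actual interfaces.

```python
# Minimal sketch of a ROS node, assuming rospy and a hypothetical topic
# layout -- not the team's actual node code.
import rospy
from sensor_msgs.msg import PointCloud2


def on_pointcloud(msg):
    # A real perception node would run cone detection here; this sketch
    # just logs how many bytes of point data arrived.
    rospy.loginfo("received %d bytes of point data", len(msg.data))


if __name__ == "__main__":
    rospy.init_node("perception_example")
    # "/lidar/points" is an assumed topic name for the LiDAR driver output.
    rospy.Subscriber("/lidar/points", PointCloud2, on_pointcloud)
    rospy.spin()
```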

The first part of the pipeline consists of the perception algorithms. These process the raw sensor measurements from our stereo cameras and the Baraja Spectrum-Scan™ LiDAR. Once objects have been identified and located, this information is passed on to our state estimation algorithms. A map of the racetrack is created, which is then sent to our path and motion planners, down through to the lower-level controllers, and finally to the actuators to physically move the vehicle.
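To make the hand-off between stages more concrete, here is a hedged sketch of how a perception node might publish detected cone positions for the state estimation node to consume. The topic name, message type and coordinate frame are assumptions chosen for illustration, not our actual interfaces.

```python
# Hypothetical hand-off from perception to state estimation: publish the
# detected cone positions as a PoseArray for a downstream SLAM node.
import rospy
from geometry_msgs.msg import Pose, PoseArray


def publish_cones(cone_xy, publisher):
    """Wrap a list of (x, y) cone positions into a PoseArray and publish it."""
    msg = PoseArray()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = "base_link"  # assumed vehicle-fixed frame
    for x, y in cone_xy:
        pose = Pose()
        pose.position.x = x
        pose.position.y = y
        msg.poses.append(pose)
    publisher.publish(msg)


if __name__ == "__main__":
    rospy.init_node("cone_publisher_example")
    pub = rospy.Publisher("/perception/cones", PoseArray, queue_size=1)
    rate = rospy.Rate(10)  # publish at 10 Hz for this sketch
    while not rospy.is_shutdown():
        publish_cones([(5.0, 1.5), (5.0, -1.5)], pub)  # dummy detections
        rate.sleep()
```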

We will take a more in-depth look at how we detect objects in the pointcloud in a future post.

To implement this in practice, the first step was to receive useful, usable data from the LiDAR. User Datagram Protocol (UDP) packets are sent from the engine unit of the Baraja Spectrum-Scan™ LiDAR over an Ethernet connection and assembled into the pointcloud, which is then used for object detection. This can be done either through a custom interface or with a ROS driver package that the team at Baraja has allowed us to use, giving us both flexibility and support in our implementation. The package worked straight out of the box, meaning we didn’t need to worry about the finer details of communicating with the system and could focus our efforts on rapidly developing our detection algorithms.
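For those curious about the custom-interface route, the sketch below shows the general shape of listening for raw UDP packets over Ethernet in Python. The port number is a hypothetical placeholder and the packet contents are left undecoded, since the actual protocol details are handled for us by the driver package.

```python
# Minimal sketch of receiving raw UDP packets over Ethernet, as a custom
# interface might; the port number and packet handling are assumptions,
# not Baraja's actual protocol.
import socket

LIDAR_PORT = 5005  # hypothetical port; check the sensor's configuration

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", LIDAR_PORT))

while True:
    packet, addr = sock.recvfrom(65535)
    # Each packet would be decoded and its points appended to the current
    # frame; here we only report its size and origin.
    print(f"{len(packet)} bytes from {addr[0]}")
```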

At the beginning of the integration phase, the team at Baraja provided us with specifications and CAD models of the Baraja Spectrum-Scan™ LiDAR unit, along with some pointcloud data pre-recorded from the system. This allowed us to perform some preliminary calculations to determine the potential range and field of view (FoV) at which our software would be able to detect small traffic cones. The increase in cone-detection range, from 7 meters to over 35 meters, was naturally a massive improvement. However, we were initially unsure of the impact that the change in FoV might have, transitioning from a 360-degree rotating system to a 120-degree Spectrum-Scan™ LiDAR system. This is of particular interest to us because of the Skidpad event, where the vehicle is required to navigate around two loops designed to test cornering ability. This can be particularly challenging for autonomous vehicles, as the vehicle’s heading is a very sensitive variable.
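As a rough idea of the kind of preliminary calculation involved, the sketch below estimates how many LiDAR returns land on a small traffic cone at various ranges. The angular resolutions, cone dimensions and detection threshold are illustrative assumptions, not Baraja specifications or our actual detection criteria.

```python
# Back-of-the-envelope estimate of how many LiDAR returns land on a small
# traffic cone at a given range. All numbers below are illustrative
# assumptions.
import math

CONE_WIDTH_M  = 0.228   # approximate Formula Student cone base width
CONE_HEIGHT_M = 0.325   # approximate Formula Student cone height
H_RES_DEG     = 0.1     # assumed horizontal angular resolution
V_RES_DEG     = 0.2     # assumed vertical angular resolution
MIN_HITS      = 10      # assumed minimum returns for a reliable detection


def expected_hits(range_m):
    """Approximate returns on the cone, treating it as a flat rectangle."""
    h_steps = CONE_WIDTH_M / (range_m * math.radians(H_RES_DEG))
    v_steps = CONE_HEIGHT_M / (range_m * math.radians(V_RES_DEG))
    return h_steps * v_steps


for r in (7, 20, 35, 50):
    hits = expected_hits(r)
    print(f"{r:>3} m: ~{hits:5.1f} returns "
          f"({'detectable' if hits >= MIN_HITS else 'marginal'})")
```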

To better understand how the change in the car’s perception abilities would affect vehicle performance, we ran a number of simulations based on the estimated detection range and FoV of the new system. First, we compared how both systems completed the Skidpad event using only LiDAR, without any other vehicle sensors.

Simulated Velodyne LiDAR performance

Simulated Baraja Spectrum-Scan™ LiDAR performance

Simulated LiDAR Trajectory Energy and Skidpad Times

From quantitative and qualitative analysis, we observed that:

  • Sensor FoV does not impact lap times significantly

  • Reduced FoV has a noticeable effect on SLAM stability, but this can be largely remedied by using additional sensors (e.g. cameras, GPS)

  • Overall, with the addition of other sensors, SLAM performance should be adequate even with the reduced FoV. However, optimal performance can be obtained by retaining a wider FoV; the sketch below gives a rough sense of how FoV limits how many cones are visible at once.
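To give a rough sense of why a narrower FoV leaves SLAM with fewer landmarks to work with, the sketch below counts how many cones of a simple ring layout fall inside a 120-degree versus a 360-degree FoV from a single vehicle pose. The cone layout and numbers are purely illustrative, not our actual Skidpad map.

```python
# Sketch of counting how many mapped cones fall inside the sensor's field
# of view for a given vehicle heading; layout and values are illustrative.
import math


def cones_in_fov(cones, x, y, heading_deg, fov_deg):
    """Return the cones whose bearing from (x, y) lies within +/- fov/2."""
    visible = []
    for cx, cy in cones:
        bearing = math.degrees(math.atan2(cy - y, cx - x)) - heading_deg
        bearing = (bearing + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
        if abs(bearing) <= fov_deg / 2.0:
            visible.append((cx, cy))
    return visible


# A crude ring of cones around the origin as a stand-in for one Skidpad loop.
ring = [(9.0 * math.cos(a), 9.0 * math.sin(a))
        for a in (math.radians(d) for d in range(0, 360, 30))]

for fov in (120.0, 360.0):
    n = len(cones_in_fov(ring, x=0.0, y=0.0, heading_deg=0.0, fov_deg=fov))
    print(f"{fov:5.1f} deg FoV: {n} of {len(ring)} cones visible")
```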

We then began to investigate the impact of detection range on performance in the Autocross event, in which the vehicle must navigate an unknown track.

Baraja Spectrum-Scan™ LiDAR FSG 2019 Simulation

Velodyne FSG 2019 Simulation

LiDAR Range Simulated Lap Times

While we don’t believe this simulation is fully representative of real-world performance, and further validation of these changes is needed, the results do indicate that we can expect significant gains in the stability of our state estimation and path planning algorithms.

The Baraja Spectrum-Scan™ LiDAR has the potential to take Monash Motorsport’s autonomous racing capabilities to the next level, and we are very excited to have Baraja on board and working with us along the way. Stay tuned for more updates as our team makes its way toward competing with our new fully autonomous, electric race car, M21.