Monash Motorsport’s Autonomous System (AS) Simulation subsection is an integral part of testing our driverless car, M19-D, even when it’s not at the track. The subsection’s goal is to construct a simulator that represents three main aspects of real-world racing: the environment, the vehicle dynamics and the perception sensors. The simulator is built in Gazebo, an open-source simulation package that is widely used in robotics. Gazebo was chosen mainly because it integrates well with the Robot Operating System (ROS) and has a large user community.
[Figure: 3D visualisation of M19-D’s simulation]
The major focus is simulating the perception sensors: a stereo camera, a LiDAR, a GPS receiver and an inertial measurement unit (IMU). By creating models of these sensors in Gazebo, we can obtain simulated data such as video feeds from the cameras and a point cloud from the LiDAR. This gives us the advantage of being able to validate our algorithms and iron out bugs before we conduct on-track testing, allowing for safer and smoother testing sessions.
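To illustrate the idea of generating simulated sensor data, the sketch below produces a noisy LiDAR-style point cloud from a set of cone positions. This is a minimal illustration, not the team's actual sensor model: the cone dimensions, Gaussian noise level and range limit are all assumed values chosen for the example.

```python
import numpy as np

def simulate_lidar_scan(cone_positions, sensor_xy=(0.0, 0.0), max_range=20.0,
                        noise_std=0.02, points_per_cone=20, seed=0):
    """Return an (N, 3) point cloud for cones visible to a simulated LiDAR.

    cone_positions: iterable of (x, y) cone centres on the ground plane [m]
    sensor_xy: position of the LiDAR [m]
    noise_std: std. dev. of additive Gaussian measurement noise [m] (assumed)
    """
    rng = np.random.default_rng(seed)
    sx, sy = sensor_xy
    points = []
    for cx, cy in cone_positions:
        if np.hypot(cx - sx, cy - sy) > max_range:
            continue  # cone is outside the sensor's assumed range
        # Approximate the cone as a 0.3 m-tall cylinder of radius 0.1 m
        # (illustrative dimensions) and sample points on its surface.
        angles = rng.uniform(0.0, 2.0 * np.pi, points_per_cone)
        heights = rng.uniform(0.0, 0.3, points_per_cone)
        xs = cx + 0.1 * np.cos(angles)
        ys = cy + 0.1 * np.sin(angles)
        pts = np.column_stack([xs, ys, heights])
        pts += rng.normal(0.0, noise_std, pts.shape)  # measurement noise
        points.append(pts)
    return np.vstack(points) if points else np.empty((0, 3))

# Two cones in range, one well beyond max_range and therefore dropped.
cloud = simulate_lidar_scan([(2.0, 1.5), (2.0, -1.5), (100.0, 0.0)])
print(cloud.shape)  # (40, 3)
```

A synthetic cloud like this can be fed into a cone-detection pipeline in place of real sensor output, which is exactly the kind of substitution that lets bugs surface in the workshop rather than at the track.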
The environment simulation focuses on the cones that mark the track, covering the four types of cones defined in the Formula Student Germany (FSG) competition handbook.
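As a sketch of how such an environment might be described in code, the snippet below enumerates the four FSG cone classes (blue and yellow mark the left and right track edges; the orange cones mark special zones such as start/finish, with exact roles and geometry specified in the handbook) and lays out a straight segment of paired boundary cones. The spacing and track width are illustrative values, not figures from the rules.

```python
from enum import Enum

class ConeType(Enum):
    """The four cone classes used in FSG driverless events."""
    BLUE = "blue"                  # left track boundary
    YELLOW = "yellow"              # right track boundary
    SMALL_ORANGE = "small_orange"  # special zones (see FSG handbook)
    BIG_ORANGE = "big_orange"      # start/finish markings (see FSG handbook)

def straight_segment(length_m, spacing_m=5.0, half_width_m=1.5):
    """Return (x, y, ConeType) tuples for a straight track segment.

    spacing_m and half_width_m are illustrative, not rulebook values.
    """
    cones = []
    n_pairs = int(length_m // spacing_m) + 1
    for i in range(n_pairs):
        x = i * spacing_m
        cones.append((x, half_width_m, ConeType.BLUE))     # left edge
        cones.append((x, -half_width_m, ConeType.YELLOW))  # right edge
    return cones

track = straight_segment(20.0)
print(len(track))  # 10 cones: 5 blue/yellow pairs
```

A list like this maps naturally onto model-spawning in the simulator: each tuple becomes one cone model placed in the Gazebo world.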
The vehicle dynamics simulation does not run directly in Gazebo. Instead, it uses a separate vehicle model that is also shared with other parts of the autonomous system, such as path planning and control. This vehicle model takes control requests from our control algorithms, simulates the response of the actuators, and mimics the behaviour of the vehicle to update its position, orientation, velocity and acceleration over time.
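The team's actual vehicle model is separate and more detailed, but a kinematic bicycle model is a common minimal choice for the kind of state update described above, so the sketch below shows one integration step of that form. The wheelbase value and the names here are assumptions for illustration only.

```python
import math
from dataclasses import dataclass

@dataclass
class VehicleState:
    x: float = 0.0    # position [m]
    y: float = 0.0    # position [m]
    yaw: float = 0.0  # heading [rad]
    v: float = 0.0    # forward speed [m/s]

def step(state, accel, steer, dt, wheelbase=1.55):
    """Advance the state one time step using a kinematic bicycle model.

    accel: commanded longitudinal acceleration [m/s^2]
    steer: front steering angle [rad]
    wheelbase: axle-to-axle distance [m] (illustrative value)
    """
    state.x += state.v * math.cos(state.yaw) * dt
    state.y += state.v * math.sin(state.yaw) * dt
    state.yaw += state.v / wheelbase * math.tan(steer) * dt
    state.v += accel * dt
    return state

# Accelerate in a straight line for 1 s at 1 m/s^2.
s = VehicleState()
for _ in range(100):
    step(s, accel=1.0, steer=0.0, dt=0.01)
print(round(s.v, 3), round(s.x, 3))  # 1.0 0.495
```

Iterating a model like this against the commanded inputs is what lets the simulator report an updated pose, velocity and acceleration back to the rest of the stack each tick.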
With the help of this simulation package, we are able to continually refine our perception, path planning and motion control algorithms in the workshop and, ultimately, maximise our on-track testing hours.
by Grace Zhang and Matthew Lane