Multi-sensor calibration, the process of estimating the geometric coordinate transforms between two or more sensors, is a key enabler of sensor-fusion technologies such as intelligent automation. Calibration correlates corresponding points in the world across the sensors to be aligned, which is achieved here with fiducial markers on target boards. This project investigated and demonstrated the use of multiple targets in a static configuration to calibrate the coordinate transform between a camera and a lidar.
This project developed custom calibration target boards with fiducials detectable in both the camera and lidar sensor data, allowing the same target to be corresponded across sensors. The target board design was tested to quantify its detection range and viable orientations across various heights, angles, and distances. For the calibration process, multiple static targets were detected, and the cross-sensor correspondences were fed into an optimization that converged to an extrinsic coordinate transform minimizing the reprojection error.
The experiments considered the number of target boards needed to cover the sensors' fields of view, along with the boards' positions and orientations relative to the sensors. We exceeded our success metric, achieving a 35% improvement in reprojection error over the current state of the art. The resulting calibration solution is modular and scales to various sensor configurations, including multiple cameras with a lidar for multi-view or surround-view sensing.