Top 11 Mainstream LiDAR and Camera Calibration Tools


Calibration of cameras and LiDAR is foundational for many tasks: the precision of the calibration sets the upper bound on the quality of any downstream fusion. Many companies in autonomous driving and robotics have invested significant resources in continuously improving it. Today, we review some of the common camera-LiDAR calibration toolkits, all worth bookmarking!

(1) Libcbdetect

Libcbdetect: automatic sub-pixel checkerboard / chessboard pattern detection

Multi-chessboard detection with a single shot: https://www.cvlibs.net/software/libcbdetect/

Implemented in MATLAB, this algorithm automatically extracts corners to sub-pixel accuracy and combines them into rectangular checkerboard patterns. It handles images from a wide range of cameras (pinhole, fisheye, and omnidirectional).


(2) Autoware Calibration Package

Autoware Camera-LiDAR Calibration

A LiDAR-camera calibration toolkit as part of the Autoware framework.

Link: https://github.com/autowarefoundation/autoware_ai_utilities/tree/master/autoware_camera_lidar_calibrator


(3) Target Calibration Based on 3D-3D Matching

LiDAR and camera setup

A ROS package for LiDAR-camera calibration based on 3D-3D point correspondences, from the paper “LiDAR-Camera Calibration using 3D-3D Point correspondences.”

Link: https://github.com/ankitdhall/lidar_camera_calibration
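The core math behind 3D-3D calibration is the classic closed-form rigid alignment (Kabsch/SVD): given matched 3D points in the LiDAR and camera frames, recover the rotation and translation that map one set onto the other. The sketch below illustrates the idea only; it is not code from the package, and the function name is my own.

```python
import numpy as np

def rigid_transform_3d(P, Q):
    """Estimate R, t minimizing sum ||R @ P_i + t - Q_i||^2 (Kabsch / SVD).

    P, Q: (N, 3) arrays of corresponding 3D points, e.g. marker corners
    measured in the LiDAR frame (P) and in the camera frame (Q).
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # 3x3 cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the SVD solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t
```

With at least three non-collinear correspondences this gives the extrinsics in closed form; real pipelines then refine over many board placements.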


(4) Shanghai AI Lab OpenCalib

Calibration tools in the list

Produced by the Shanghai Artificial Intelligence Laboratory, OpenCalib is a sensor calibration toolbox covering IMU, LiDAR, camera, and radar sensors.

Link: https://github.com/PJLab-ADG/SensorsCalibration


(5) Apollo Calibration Tools

The Apollo calibration toolkit, link: https://github.com/ApolloAuto/apollo/tree/master/modules/calibration


(6) Livox-Camera Calibration Tools

Livox coloring rendering

This solution provides a manual method for calibrating the extrinsic parameters between Livox LiDARs and cameras, verified on the Mid-40, Horizon, and Tele-15. It includes code for computing camera intrinsics, collecting calibration data, optimizing the extrinsic parameters, and LiDAR-camera fusion applications. Checkerboard corners serve as calibration targets; because Livox LiDARs use a non-repetitive scanning pattern, the accumulated point clouds are dense, which makes it easier to locate the corners accurately in the LiDAR point cloud. Camera-LiDAR calibration and fusion can therefore achieve good results.

Link: https://github.com/Livox-SDK/livox_camera_lidar_calibration

Chinese Documentation: https://github.com/Livox-SDK/livox_camera_lidar_calibration/blob/master/doc_resources/README_cn.md
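Once corners have been located both in the image and in the dense LiDAR cloud, the usual way to score (and optimize) the extrinsics is reprojection error: project the LiDAR corners through the extrinsics and intrinsics and compare against the detected pixel corners. A minimal numpy sketch under a pinhole-camera assumption; this is illustrative only, not the Livox toolkit's code, and all names are mine.

```python
import numpy as np

def reproject(points_lidar, R, t, K):
    """Project 3D LiDAR points into the image with extrinsics (R, t) and intrinsics K."""
    pc = points_lidar @ R.T + t          # LiDAR frame -> camera frame
    uv = pc @ K.T                        # pinhole projection (no lens distortion)
    return uv[:, :2] / uv[:, 2:3]        # divide by depth to get pixel coordinates

def reprojection_rmse(points_lidar, corners_px, R, t, K):
    """RMSE in pixels between projected LiDAR corners and detected image corners."""
    err = reproject(points_lidar, R, t, K) - corners_px
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))
```

An extrinsic optimizer would minimize this RMSE over (R, t) across all recorded board poses.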


(7) CalibrationTools

Map-based calibration

CalibrationTools provides calibration tools for sensor pairs such as LiDAR-LiDAR and LiDAR-camera. In addition, it offers:

1) deviation estimation tools for the sensors used in dead reckoning (IMU and odometry), for better localization performance;

2) visualization and analysis tools for Autoware control outputs;

3) calibration tools for correcting vehicle command delays.

Link: https://github.com/tier4/CalibrationTools


(8) Matlab

Visualize the LiDAR and the image data fused together.

MATLAB’s built-in toolbox supports LiDAR-camera calibration, link: https://ww2.mathworks.cn/help/lidar/ug/lidar-and-camera-calibration.html


(9) ROS Calibration Tools

LiDAR Camera Calibration Demo

ROS Camera LIDAR Calibration Package, link: https://github.com/heethesh/lidar_camera_calibration


(10) Direct Visual LiDAR Calibration

Direct LiDAR Calibration

This package provides a toolbox for LiDAR-camera calibration:

General: it handles various LiDAR and camera projection models, including spinning and non-repetitive-scanning LiDARs, as well as pinhole, fisheye, and omnidirectional cameras.

Target-less: it requires no calibration target, using environmental structures and textures instead.

Single-shot: calibration requires as little as one pair of a LiDAR point cloud and a camera image; optionally, multiple LiDAR-camera data pairs can be used to improve accuracy.

Automatic: the calibration process is automatic, with no initial guess required.

Accurate and robust: it employs a pixel-level direct LiDAR-camera registration algorithm, which is more robust and accurate than edge-based indirect LiDAR-camera registration.

Link: https://github.com/koide3/direct_visual_lidar_calibration
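Pixel-level direct registration of this kind typically scores the alignment with a mutual-information-style similarity between the camera image and a LiDAR-intensity rendering; normalized information distance (NID) is a common choice for such costs. The sketch below computes NID between two equally-sized intensity images; it is an illustration of the cost family, not this package's implementation.

```python
import numpy as np

def nid(a, b, bins=16):
    """Normalized Information Distance between two same-shaped intensity images.

    NID = (H(A,B) - I(A;B)) / H(A,B): 0 for identical images,
    approaching 1 for statistically independent ones.
    """
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = joint / joint.sum()                  # joint intensity distribution
    pa, pb = p.sum(axis=1), p.sum(axis=0)    # marginal distributions
    nz = p > 0
    H_ab = -(p[nz] * np.log(p[nz])).sum()                # joint entropy
    H_a = -(pa[pa > 0] * np.log(pa[pa > 0])).sum()
    H_b = -(pb[pb > 0] * np.log(pb[pb > 0])).sum()
    mi = H_a + H_b - H_ab                                # mutual information
    return (H_ab - mi) / H_ab if H_ab > 0 else 0.0
```

A calibrator would render the point cloud's intensities into the image plane under candidate extrinsics and minimize this distance.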


(11) 2D LiDAR-Camera Toolbox

Collect laser data to create rosbag

This is a ROS-based automatic calibration tool for the extrinsic parameters between a single-line (2D) laser and a camera. The principle: the camera estimates the plane equation of the calibration board in the camera coordinate system from QR codes on the board. Since the laser points fall on that plane, they are transformed into the camera coordinate system via the laser-to-camera extrinsic parameters; the point-to-plane distance serves as the error, which is minimized by nonlinear least squares.

Link: https://github.com/MegviiRobot/CamLaserCalibraTool
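The pipeline just described (plane from the tag, point-to-plane residual, nonlinear least squares) can be sketched as follows, assuming scipy is available. All names are mine, not the tool's; with a 2D laser, several board poses with distinct plane normals are needed for the six extrinsic degrees of freedom to be observable.

```python
import numpy as np
from scipy.optimize import least_squares

def rodrigues(w):
    """Axis-angle vector -> rotation matrix (Rodrigues' formula)."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    k = w / th
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * K @ K

def point_to_plane_residuals(x, scans, planes):
    """x = [axis-angle (3), translation (3)] for the laser->camera transform.
    scans[i]: (N_i, 3) laser points of pose i; planes[i]: (n, d) with n . p = d
    describing the board plane in the camera frame (from the QR codes)."""
    R, t = rodrigues(x[:3]), x[3:]
    res = []
    for pts, (n, d) in zip(scans, planes):
        res.append((pts @ R.T + t) @ n - d)   # signed point-to-plane distances
    return np.concatenate(res)

def calibrate(scans, planes, x0=np.zeros(6)):
    """Solve for the extrinsics by nonlinear least squares."""
    sol = least_squares(point_to_plane_residuals, x0, args=(scans, planes))
    return rodrigues(sol.x[:3]), sol.x[3:]
```

The residual is exactly the distance from each transformed laser point to the camera-estimated board plane, so a perfect calibration drives every residual to zero.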
