This page describes the calibration method for the Beta RapidSense release. The current process requires some manual work; a future RapidSense release will add a GUI and make the process much more user friendly.
Overview
The current method uses the robot to calibrate the cameras.
Calibration Requirements
The requirements for calibration are:
An ArUco tag mounted off the robot faceplate
A static ArUco tag placed in the scene where it is easily visible to the sensor
A calibration preset that places the TCP at the center of the robot-mounted ArUco tag (more details in the cal.json file described below)
A cal.json file that lists every camera serial_number and its corresponding robot target positions (cal_targets), each of which puts the robot-mounted ArUco tag in that sensor's field of view; a sketch of the file is shown below
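For reference, a cal.json might look roughly like the following sketch. Only serial_number, the calibration targets, and the stored static-marker pose are described on this page; the exact field names, nesting, and pose format here are assumptions.

```json
{
  "cameras": [
    {
      "serial_number": "123456789",
      "cal_targets": [
        [0.50, 0.10, 0.40, 0.0, 3.14, 0.0],
        [0.45, -0.05, 0.42, 0.0, 3.14, 0.1]
      ],
      "static_marker_pose": null
    }
  ]
}
```

In this sketch, static_marker_pose starts empty and would be filled in automatically after a successful calibration, as described in the Setup section.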
Setup
To generate the cal.json file and save it to /etc/rapidsense/cal.json, run the calibration_generator.py script, which prompts the user for all the necessary information. After cal.json has been generated, and with rapidsense_app and the proxy running, run the calibration_service.py script and go to localhost:9000/calibration to calibrate all the cameras. The script pulls information from cal.json and moves the robot to each cal_target sequentially, extrinsically calibrating the corresponding camera. The robot then returns to the home location, and every camera saves the pose of the static marker placed in the scene. After a successful calibration, all the values are automatically updated in cal.json.
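Put together, the end-to-end flow looks roughly like the listing below. The script names come from this page; any flags they take are not documented here.

```sh
# 1. Generate /etc/rapidsense/cal.json (interactive prompts)
python calibration_generator.py

# 2. With rapidsense_app and the proxy already running,
#    start the calibration service
python calibration_service.py

# 3. Open the calibration page in a browser:
#    http://localhost:9000/calibration
```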
Architecture
The diagram below gives a quick overview of the architecture of the calibration sequence:
The calibration ASCII command { 'topic': 'Calibrate', 'data': { 'serial_number': serial_number, 'tcp_pose': tcp_pose } } is used, where serial_number identifies the camera to be calibrated and tcp_pose is the pose of the associated robot with the tag mounted on it. These values are sent over the proxy to RapidSense, where the ArUco detection algorithm finds the marker, which gives us the aruco_to_camera transform. The tcp_pose is requested while the robot is in the calibration preset, which gives us the world_to_aruco transform.
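A minimal Python sketch of building this command is shown below. The message shape matches the text above; the example serial number and pose values are placeholders, and how the string is actually sent over the proxy is not covered here.

```python
import json

def build_calibrate_command(serial_number, tcp_pose):
    """Build the Calibrate message for one camera.

    serial_number -- the camera to be calibrated
    tcp_pose      -- pose of the robot TCP with the ArUco tag mounted on it,
                     captured while the robot is at the calibration preset
    """
    return json.dumps({
        "topic": "Calibrate",
        "data": {
            "serial_number": serial_number,
            "tcp_pose": tcp_pose,
        },
    })

# Example: this JSON string would then be sent over the proxy to RapidSense.
message = build_calibrate_command("123456789", [0.50, 0.10, 0.40, 0.0, 3.14, 0.0])
```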
Using these two transforms, we can calculate the world_to_camera transform as:
world_to_camera = world_to_aruco * aruco_to_camera
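Treating each transform as a 4x4 homogeneous matrix, the composition is a single matrix product. A minimal numpy sketch, with identity matrices standing in for the real transforms:

```python
import numpy as np

# world_to_aruco: from the robot TCP pose at the calibration preset.
# aruco_to_camera: from ArUco marker detection in the camera image.
# Identity matrices here are placeholders, not real calibration data.
world_to_aruco = np.eye(4)
aruco_to_camera = np.eye(4)

# Composing the two gives the extrinsic calibration of the camera.
world_to_camera = world_to_aruco @ aruco_to_camera
```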
After a successful calibration, we get the pose of the static marker and save it in cal.json. We send status = 0 to the appliance and also update GetRapidSenseStatus so that is_calibrated is true for the corresponding sensor.
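The shapes of those two reports might look like the following. Only status, GetRapidSenseStatus, and is_calibrated come from this page; the remaining field names are assumptions.

```python
# Success report sent to the appliance (0 = success).
calibration_result = {"status": 0, "serial_number": "123456789"}

# State reflected in subsequent GetRapidSenseStatus responses.
rapidsense_status = {"serial_number": "123456789", "is_calibrated": True}
```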
The algorithm also periodically checks the pose of the static marker for every camera. If a camera has been bumped, the observed pose of the static marker will differ from the pose stored during calibration, indicating that the calibration is off. In that case we warn the user and update GetRapidSenseStatus so that is_calibrated is false for that particular sensor.
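A sketch of how such a drift check could look, assuming 4x4 homogeneous marker poses; the function name and tolerances are illustrative, not the shipped values:

```python
import numpy as np

def marker_moved(stored_pose, current_pose, trans_tol=0.005, rot_tol=0.02):
    """Return True if the static marker's observed pose has drifted from the
    pose stored at calibration time (i.e. the camera was likely bumped).

    Poses are 4x4 homogeneous matrices; tolerances are meters and radians.
    """
    # Translation drift: distance between the two marker positions.
    d_trans = np.linalg.norm(current_pose[:3, 3] - stored_pose[:3, 3])

    # Rotation drift: angle of the relative rotation R_stored^T @ R_current.
    r_rel = stored_pose[:3, :3].T @ current_pose[:3, :3]
    angle = np.arccos(np.clip((np.trace(r_rel) - 1.0) / 2.0, -1.0, 1.0))

    return d_trans > trans_tol or angle > rot_tol
```

When a check like this returns True, the service would warn the user and flip is_calibrated to false for that sensor, as described above.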
Further details on the individual components can be found in the pages below: