...
The diagram below gives a quick overview of the architecture of the calibration sequence:
...
The calibration ASCII command { 'topic': 'Calibrate', 'data': { 'serial_number': serial_number, 'tcp_pose': tcp_pose } } is used, where serial_number identifies the camera to be calibrated and tcp_pose is the TCP pose of the associated robot with the tag mounted on it. These values are sent over the proxy to RapidSense, where the ArUco detection algorithm finds the markers, giving us the aruco_to_camera transform. The TCP pose is requested while the robot is in the calibration preset, which gives us the world_to_aruco transform.
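As a minimal sketch, the calibration command above can be assembled and serialized like this. The serial number and TCP pose values here are made up for illustration, and the exact pose layout (position plus orientation) is an assumption, not confirmed by this page:

```python
import json

# Hypothetical values for illustration only; in practice the serial number
# comes from the camera and the TCP pose from the robot controller.
serial_number = "F1234567"                      # camera to be calibrated (assumed format)
tcp_pose = [0.45, -0.12, 0.60, 0.0, 3.14, 0.0]  # robot TCP pose with the tag mounted (assumed layout)

# The calibration command as described above.
command = {
    "topic": "Calibrate",
    "data": {
        "serial_number": serial_number,
        "tcp_pose": tcp_pose,
    },
}

# Serialize for sending over the proxy to RapidSense.
payload = json.dumps(command)
print(payload)
```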
Using these two transforms, we can calculate the world_to_camera transform as:
world_to_camera = world_to_aruco * aruco_to_camera
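The composition above can be sketched with 4x4 homogeneous transforms. The rotations and translations below are placeholder values, not real calibration data:

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Illustrative transforms (identity rotations, made-up translations):
# world_to_aruco comes from the robot TCP pose in the calibration preset,
# aruco_to_camera from the ArUco marker detection.
world_to_aruco = make_transform(np.eye(3), [1.0, 0.5, 0.2])
aruco_to_camera = make_transform(np.eye(3), [0.0, 0.0, 0.8])

# Chaining the two transforms yields the camera pose in the world frame.
world_to_camera = world_to_aruco @ aruco_to_camera
print(world_to_camera[:3, 3])  # camera position in world coordinates
```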
After successful calibration, we get the pose of the static marker and save it in the file cal.json. We send status = 0 to the appliance and set is_calibrated to true in GetRapidSenseStatus for the corresponding sensor.
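A rough sketch of persisting the static marker pose after calibration. The cal.json schema shown here (field names and pose layout) is an assumption for illustration, not the actual file format:

```python
import json

# Hypothetical record: the real cal.json schema is not specified on this page.
record = {
    "serial_number": "F1234567",                          # calibrated camera (made-up value)
    "static_marker_pose": [0.1, 0.2, 0.3, 0.0, 0.0, 0.0], # assumed pose layout
    "is_calibrated": True,
}

# Save the calibration result for later drift checks.
with open("cal.json", "w") as f:
    json.dump(record, f, indent=2)
```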
The algorithm also periodically checks the pose of the static marker for every camera. If a camera has been bumped, the static marker will be detected at a different pose than the one stored during calibration, indicating that the calibration is off. We then warn the user and set is_calibrated to false in GetRapidSenseStatus for that particular sensor.
Further details of components can be found in the pages below:
...