...

  • Download an ArUco tag from the PDF linked on this page and follow the instructions for making a calibration plate: https://realtimerobotics.atlassian.net/lwiki/cpx/82arqwQVJw_ozg

  • Connect the RealSense cameras

  • Run rtr_rapidsense and rtr_calibration from a terminal; this will create a template calibration configuration file for you

  • With your RPC project open, create a preset named Calibration that has the TCP at the center of the calibration plate you made, and a target in the field of view of each camera.

    • https://realtimerobotics.atlassian.net/lwiki/cpx/zCDDnDeoFxCozg

    • You can launch RapidSense with rtr_rapidsense -f to see live feeds from the cameras, which helps map each camera's serial number to its respective calibration target.

    • Using the 'live connect' feature in the RPC jog dialog, you can easily create a target at the current robot position after jogging the robot until the ArUco tag is roughly centered in the camera's view.

    • Ensure you create a home target in the calibration preset named <robot_name>_home, as the calibration process will finish by sending a move to that target.

    • If the robot used for calibration is mated to the project origin, this will help minimize any error.

  • Create, then modify, the calibration configuration files (cal.json and markers.json) with the marker and target information (a hypothetical sketch of these files appears after this procedure).

  • Restart rtr_rapidsense

    • If you used the -f argument before, you no longer need it.

  • Load the RPC project through the control panel and verify that the robot can safely move through all edges in the calibration preset.

    • When the calibration process is run, the robot will move through the targets following the order in cal.json.

  • Navigate to <rtr_controller_ip>/rsm and click Settings. At the bottom of the Sensors tab is a ‘Run Camera Calibration’ button that begins the process.

    • Make sure you are in Operation mode before running the calibration.

  • From the sensors pane of the control panel, configure a volume to cover the area the cameras will be monitoring

    • A volume is the region in which the cameras report voxels. For initial testing you can make the volume very large just to see what the sensor data looks like, then crop it later to the application's specific needs. Start with -10m to 10m. If there is a shift in the voxel stream (perhaps the calibration plate TCP is upside down or rotated), a volume of this size will still allow the voxels to show up and help with debugging.

  • Send ScanScene with the start_streaming option to see the voxelized point cloud

    • https://realtimerobotics.atlassian.net/lwiki/cpx/6tAj1wcdiBCozg

    • Send { data: { type: start_streaming }, topic: ScanScene } through the control panel's terminal feature, or over a socket opened to <rtr_controller_ip>:9999, to start streaming voxels (see the socket sketch after this procedure).

    • Send { data: { type: clear }, topic: ScanScene } to stop the voxel stream.
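The ScanScene commands above can also be scripted. Below is a minimal sketch; the topic, message keys, and port 9999 come from the steps above, but the newline-terminated JSON framing and the placeholder IP are assumptions, so adjust them to match your controller.

```python
import json
import socket

CONTROLLER_IP = "192.168.1.100"  # placeholder: replace with your <rtr_controller_ip>

def send_scan_scene(msg_type: str) -> None:
    # Build a ScanScene message like the ones shown above and send it to
    # the controller's command socket on port 9999. Newline-terminated
    # JSON framing is an assumption; adjust if your controller differs.
    msg = {"data": {"type": msg_type}, "topic": "ScanScene"}
    with socket.create_connection((CONTROLLER_IP, 9999), timeout=5) as sock:
        sock.sendall((json.dumps(msg) + "\n").encode("utf-8"))

send_scan_scene("start_streaming")  # begin streaming voxels
# ... inspect the voxelized point cloud ...
send_scan_scene("clear")            # stop the voxel stream
```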
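For reference, here is a hypothetical sketch of cal.json and markers.json. The field names are illustrative assumptions, not the actual RapidSense schema; treat the template generated by rtr_calibration as authoritative. What is grounded in the steps above: cameras are identified by serial number, each camera observes one marker, and the calibration run visits targets in the order listed in cal.json, finishing at <robot_name>_home.

```python
import json

# HYPOTHETICAL structure: field names are assumptions for illustration only.
# The template created by rtr_calibration defines the real schema.
cal = {
    "preset": "Calibration",           # preset holding the calibration targets
    "targets": [                       # visited in this order during calibration
        "camera_842512070123_target",
        "camera_842512070456_target",
        "robot1_home",                 # <robot_name>_home finishes the run
    ],
}
markers = {
    "markers": [                       # pair each camera serial with its marker
        {"camera_serial": "842512070123", "marker_id": 0},
        {"camera_serial": "842512070456", "marker_id": 1},
    ],
}
with open("cal.json", "w") as f:
    json.dump(cal, f, indent=2)
with open("markers.json", "w") as f:
    json.dump(markers, f, indent=2)
```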

...