As of 2.9, RapidPlan Dynamic (RPD) runs the generic sensor interface as a system service by default. To run native sensors as before, you'll need the following additional steps.
Additional Steps
Stop the system service:
sudo systemctl stop rtr_rapidsense.service
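If you also want the service to stay off across reboots, you can disable it as well (optional; this guide only strictly requires stopping it):

sudo systemctl disable rtr_rapidsense.service
# verify the service is no longer running
systemctl status rtr_rapidsense.service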
Create a new config file that points to a directory for RapidPlan Dynamic calibration artifacts by running this command in a terminal:
mkdir -p ~/.config/rtr_rapidsense && echo "config_dir=/home/$USER/rtr_rapidsense" > ~/.config/rtr_rapidsense/rtr_rapidsense.conf
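To sanity-check the result, and to create the artifact directory itself in case rtr_rapidsense does not create it on first run (an assumption; it may create it automatically):

cat ~/.config/rtr_rapidsense/rtr_rapidsense.conf
# expected output: config_dir=/home/<your_user>/rtr_rapidsense
mkdir -p ~/rtr_rapidsense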
To run RPD, use the following commands in a terminal; the system will then behave exactly as it did before.
rtr_rapidsense
rtr_calibration
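As a minimal end-to-end sketch, assuming each process runs in the foreground and therefore gets its own terminal (the document does not say whether they share one, so the split is an assumption):

# terminal 1: make sure the service is stopped, then start the sensor interface
sudo systemctl stop rtr_rapidsense.service
rtr_rapidsense

# terminal 2: start the calibration process
rtr_calibration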
Getting Started with RapidPlan Dynamic
On 2.9, the GUI in the control panel can be used to edit the calibration configuration files, but that requires the RPC project to be loaded on the appliance, after which you have to build the calibration roadmap via the CreateTarget and AddConnection workflow. I find it easier to have the RPC project, the frame viewer, and VS Code open and to edit the config files manually; that way I can build the roadmap in RPC.
Download an ArUco tag from the PDF linked on this page and follow the instructions for making a calibration plate: https://realtimerobotics.atlassian.net/l/cp/82arqwQV
Connect the RealSense cameras.
Run rtr_rapidsense and rtr_calibration from a terminal. This will create a template calibration configuration file for you. Remember to stop the system service first, as noted above.
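Assuming the template is written to the config_dir set earlier (a guess, based on that directory holding RPD calibration artifacts), you can check for it with:

ls ~/rtr_rapidsense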
With your RPC project open, create a preset named Calibration, with the TCP at the center of the calibration plate you made and a target in the field of view of each camera.
You can launch rtr_rapidsense with rtr_rapidsense -f to see live feeds from the cameras, which helps map the serial number of a camera to its respective calibration target.
Using the 'live connect' feature in the RPC jog dialog, you can easily create a target at the current robot position after jogging the robot until the ArUco tag is roughly centered in the camera's view.
Ensure you create a home target in the calibration preset named <robot_name>_home (for example, a robot named ur5 would need a target named ur5_home), as the calibration process will send a move to that target to finish.
If the robot used for calibration is mated to the project origin, this will help minimize any error.
Create, then modify, the calibration configuration files (cal.json and markers.json) with the marker and target information. The id being referenced is the number printed on the tag you downloaded.
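To make the relationship concrete, here is a loose sketch of what the two files might contain. Every field name except id is a hypothetical placeholder invented for illustration; the real schema comes from the template that rtr_calibration generates, so copy the structure from that template, not from here.

markers.json (hypothetical sketch; id is the number printed on the downloaded tag):
{ "markers": [ { "id": 17, "camera_serial": "923322071234", "target": "cam1_target" } ] }

cal.json (hypothetical sketch; the robot visits the targets in this order, per the note below):
{ "targets": [ "cam1_target", "cam2_target", "ur5_home" ] }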
Restart rtr_rapidsense. If you used the -f argument before, you no longer need it.
Load the RPC project through the control panel and verify that the robot can safely move through all edges in the calibration preset.
When the calibration process is run, the robot will move through the targets in the order given in cal.json.
Navigate to <rtr_controller_ip>/rsm and click on Settings. At the bottom of the Sensors tab is a 'Run Camera Calibration' button, which begins the process.
Make sure you are in Operation mode before running the calibration.
From the sensors pane of the control panel, configure a volume to cover the area the cameras will be monitoring.
A volume is the region in which the cameras will report voxels. For initial testing you can make the volume very large just to see what the sensor data looks like, then later crop it to the application-specific needs. Start with -10 m to 10 m. If there is a shift in the voxel stream (for example, the cal plate TCP is upside down or rotated), a volume of this size will still let the voxels show up and help with debugging.
Send ScanScene with the start_streaming option to see the voxelized point cloud.
Send
{ data: { type: start_streaming }, topic: ScanScene}
through the control panel's terminal feature, or over a socket opened to <rtr_controller_ip>:9999, to start streaming voxels. Send
{ data: { type: clear }, topic: ScanScene}
to stop the voxel stream.
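A minimal sketch of the socket route, using netcat. The quoting is an assumption: the control panel form shown above is unquoted shorthand, and standard JSON quoting is used here on the guess that the listener accepts it.

# start streaming voxels over a raw TCP socket (port 9999 from above)
echo '{ "data": { "type": "start_streaming" }, "topic": "ScanScene" }' | nc <rtr_controller_ip> 9999

# stop the voxel stream
echo '{ "data": { "type": "clear" }, "topic": "ScanScene" }' | nc <rtr_controller_ip> 9999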