Info

Internal Note: Here is how we create the manual.

  • First focus on getting internal documentation refined and organized

  • Create a skeleton of what the manual would be to start working off of

  • The user manual will pull information from the internal documentation, but will also cover the user workflow, including hardware requirements, disclaimers, and "happy path" examples

Table of Contents

Introduction

Overview

What is RapidSense

  1. Value proposition
    If work cell variations are causing downtime for your robot system, RapidSense can help get it moving again. When RapidSense is paired with RapidPlan, unmodeled obstacles can be avoided and goal-directed motions computed at runtime, allowing for process variation and environmental changes to be autonomously managed by the robot.

  2. Target use case (High-level)

  3. Visual guide/demo (GIFs, videos)

System Architecture and Requirements

  1. System diagram

  2. Relationship with RapidPlan (Create & Control)

    1. incl. references to the manuals

  3. System requirements

    1. Supported vision sensors

      1. Only Intel RealSense cameras are supported for the Beta release

    2. Aruco tag definition and fabrication/ordering requirements (see the marker-generation sketch after this list)

      1. The Aruco pattern needs to be printed at 9 cm x 9 cm (for larger robot systems in the future, this size may need to increase)

      2. The pattern needs to be crisp, high-contrast black and white

      3. The material surface carrying the Aruco pattern needs to be matte and non-reflective

      4. The pattern needs to be on a reliably rigid, flat surface/material (plate) that will not flex during robot motion, but it does not need to be load bearing (3 mm thick is likely OK for an aluminum 9 cm x 9 cm tag)

      5. There should be a mounting hole in each corner of the plate so a user can mount it on the robot end effector or in the scene

      6. Two dowel pin holes should be provided between the mounting holes on at least two of the sides of the plate, so that if a user ever needs to remove the Aruco tag/plate it can be reinstalled accurately

    3. vrd definition

    4. PC spec and OS requirements
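
The exact Aruco dictionary and marker IDs used by RapidSense are not specified here, so the following is only an illustrative sketch of producing a print-ready marker image that comes out at 9 cm x 9 cm. It assumes OpenCV's cv2.aruco module (opencv-contrib-python 4.7 or newer), an arbitrary example dictionary and marker ID, and a 300 DPI print; substitute the dictionary and ID defined for your cell.

# Illustrative sketch only: the dictionary, marker ID, and print DPI below are
# assumptions, not RapidSense requirements. Requires opencv-contrib-python >= 4.7.
import cv2

TAG_SIZE_CM = 9.0   # required printed size of the Aruco pattern
PRINT_DPI = 300     # assumed printer resolution
MARKER_ID = 0       # example ID; use the ID defined for your cell

# 9 cm at 300 DPI: 9 / 2.54 * 300 ~= 1063 pixels per side
side_px = round(TAG_SIZE_CM / 2.54 * PRINT_DPI)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_5X5_100)
marker = cv2.aruco.generateImageMarker(dictionary, MARKER_ID, side_px)

# Print the saved image at exactly 100% scale (no "fit to page") so the
# pattern measures 9 cm x 9 cm on the plate.
cv2.imwrite(f"aruco_marker_{MARKER_ID}_9cm_300dpi.png", marker)
print(f"Marker image is {marker.shape[1]} x {marker.shape[0]} px")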

Setup

  1. Safety Precautions

Note
  1. WARNING: The Realtime Controller is NOT safety-certified during operation with robots.

Users are required to exercise caution and follow all safety protocols when using the Realtime Controller and its software with physical robots.

Users must comply with any and all safety standards regarding the use of robotics, automation, and the use of necessary safeguarding. Compliance typically requires performing a risk assessment of each application to determine the needed safety performance and safeguarding.

The Realtime Controller and its software are not supplied for specific applications.

The safety of any system incorporating the Realtime Controller and its software is the responsibility of the end user.

Hardware Installation

Required hardware

  1. <TODO> Describe what equipment is required for RapidSense.
    e.g., here is the list of items in RS1.4:

    1. Intel RealSense D435 (4x)

    2. USB-C to USB-B cable (4x)

    3. Calibration plate

    4. Calibration plate adapter

    5. Mounting hardware

Cameras

  1. Mount the Cameras in the Work Area

    Occlusions are the parts of the environment that are not in view of the camera due to an obstruction. The goal in placing your cameras, therefore, is to minimize occlusions. What positions will give the cameras as complete a view as possible of the robot setup at all times?

    Camera placement also depends on the robot application. For example, for a pick and place application, ask yourself, where will the robot move? Where will the occlusions be, and for which cameras? 

    You should expect this process to be somewhat trial-and-error. 

    To get the optimal performance out of the RapidSense system, keep the following guidelines in mind when mounting your cameras:

    • The optimal distance at which the Intel RealSense cameras detect obstacles is between 0.5 and 1.5 meters. Position the cameras in your workspace to maximize the likelihood that obstacles will appear within this depth window.

    • The Intel RealSense specification is +/- 2% depth accuracy within 2 meters of an obstacle (roughly +/- 4 cm at 2 m), so we recommend distributing the cameras across the workspace so that obstacles stay within this distance (see the distance-check sketch below).

    • Space the cameras apart from one another and at different angles. 

    • Mount the cameras securely and rigidly. If a camera vibrates as the robot moves, it will return noisy data, resulting in false positives for obstacles.

    • Connect the camera cables with strain relief.

    • 80/20 extrusion is an excellent way to mount cameras rigidly in the environment while keeping them adjustable.

    • Tripods will work, but be careful not to bump them or the extrinsic calibration will have to be redone. 

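As a rough planning aid only (not part of RapidSense), a quick check like the sketch below, written against the 0.5-1.5 m optimal window and 2 m accuracy guideline quoted above, can help sanity-check candidate camera positions. The camera names, positions, and region center are hypothetical values to replace with your own measurements.

# Rough planning aid only (not a RapidSense tool): flags candidate camera
# positions whose distance to the monitored region falls outside the
# 0.5-1.5 m optimal window or beyond the 2 m accuracy guideline.
import math

OPTIMAL_MIN_M, OPTIMAL_MAX_M, ACCURACY_LIMIT_M = 0.5, 1.5, 2.0

# Hypothetical camera positions and a point of interest in the work area,
# all in meters, expressed in the same frame (e.g., the robot base frame).
cameras = {
    "cam_front": (1.0, -0.8, 1.2),
    "cam_rear":  (2.2,  1.5, 1.8),
}
region_center = (0.6, 0.0, 0.4)

for name, pos in cameras.items():
    dist = math.dist(pos, region_center)
    if dist > ACCURACY_LIMIT_M:
        verdict = "beyond the 2 m accuracy guideline"
    elif OPTIMAL_MIN_M <= dist <= OPTIMAL_MAX_M:
        verdict = "within the optimal 0.5-1.5 m window"
    else:
        verdict = "outside the optimal window"
    print(f"{name}: {dist:.2f} m -> {verdict}")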

...

  1. <TODO> Describe how camera CAD with the camera's field of view (FOV) visible can be imported into RPC (CAD obtained from the sensor provider for now) with "ignore collisions" checked, to verify that the Aruco tags are visible to the camera. A picture here could help.

  2. Aruco Calibration Tag Mounting

    1. Static mounted

    2. Robot mounted

Software Installation

  • Licensing?

  • How to install

  • How to update/uninstall

System Setup

  • Calibration
    The calibration process requires a cal.json file, which contains the cameras' serial numbers, the Aruco markers' IDs and measurements, and the calibration targets. For the calibration process to work as expected, every project needs a home target named robotname_home; for example, if a Fanuc robot in the project will be used for calibration, the project needs a Fanuc_home target. The calibration targets should be connected on the Roadmap to the home pose in the RPC project in order for the calibration to succeed. An illustrative sketch of such a file follows below.

    For further information and instructions, refer to cal.json
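
The authoritative cal.json schema is documented on the cal.json page; the sketch below only illustrates the kinds of entries described above (camera serial numbers, Aruco marker IDs and measurements, and calibration targets). Every key name and value is a placeholder assumption, not the real schema.

# Hedged illustration only: every key and value below is a placeholder.
# Refer to the cal.json page for the actual schema RapidSense expects.
import json

example_cal = {
    "cameras": [
        {"serial_number": "012345678901"},  # one entry per RealSense camera
        {"serial_number": "109876543210"},
    ],
    "aruco_markers": [
        # marker ID plus its measured printed size (the 9 cm x 9 cm pattern)
        {"id": 0, "size_m": 0.09},
    ],
    "calibration_targets": [
        # every project needs a home target named <robotname>_home,
        # e.g. Fanuc_home for a Fanuc robot used for calibration
        "Fanuc_home",
    ],
}

with open("cal.json", "w") as f:
    json.dump(example_cal, f, indent=2)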

  • Configuration
    A rapidsense.json file is also required to start RapidSense. This configuration file defines the volumes in the scene. Look here for details: Configuration. An illustrative sketch follows below.
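
The actual rapidsense.json schema is documented on the Configuration page; the sketch below only conveys, with entirely placeholder keys and values, the idea of a file that names the volumes RapidSense should consider in the scene.

# Hedged illustration only: all keys, names, and coordinates are placeholders.
# See the Configuration page for the actual rapidsense.json schema.
import json

example_config = {
    "volumes": [
        {
            "name": "workspace",                # a volume in the scene
            "min_corner_m": [-1.0, -1.0, 0.0],  # axis-aligned box, in meters
            "max_corner_m": [1.0, 1.0, 1.5],
        },
    ],
}

with open("rapidsense.json", "w") as f:
    json.dump(example_config, f, indent=2)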

User Guide

Get started

(Happy path of a user workflow)

  • Start a project

  • Alignment?

  • Currently required user input to run: calibration files and Aruco transform

RapidSense HMI

  • Layout description

  • Hardware Status

    • <TODO> Add description for the different monitored states of the cameras, especially when a camera is detected to be out of calibration by comparison to the static target in the scene

    • <TODO> Add description for running calibration through the button on the HMI. May need to obtain new screenshots of the latest version

Features

  • Scene Update
    This feature is used for obstacle detection and is the main functionality of the Beta release.

  • RapidSense Filters

    How to apply the different filters through their API and what each one does

    • ScanScene (Obstacle Detection) filter

      • <TODO> Description of the function and API: Streaming, Snapshot, and Clear RS voxel data modes (a purely hypothetical sketch follows this list)

  • API
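
Because the ScanScene API is still to be documented, the sketch below is purely hypothetical and only illustrates the three modes named above (Streaming, Snapshot, and Clear RS voxel data). None of the names in it are real RapidSense API.

# Purely hypothetical sketch: ScanSceneMode and apply_scan_scene_filter are
# NOT real RapidSense API, only placeholders for the three modes named above.
from enum import Enum


class ScanSceneMode(Enum):
    STREAMING = "streaming"     # continuously update obstacles from camera data
    SNAPSHOT = "snapshot"       # capture the scene once and keep that state
    CLEAR_VOXEL_DATA = "clear"  # clear the accumulated RS voxel data


def apply_scan_scene_filter(mode: ScanSceneMode) -> None:
    """Placeholder for whatever call the real API exposes for this filter."""
    print(f"Requesting ScanScene filter mode: {mode.value}")


if __name__ == "__main__":
    apply_scan_scene_filter(ScanSceneMode.SNAPSHOT)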

Maintenance / Repair

Troubleshooting and Support

  • FAQ

  • Common Problems and Solutions

    • Error code

      • <TODO> error code table with description, common causes, and common solutions.

  • Contact Information

Legal and Compliance

Licensing Agreement

Privacy Policy

Appendices

...

Glossary

...


Overview

RapidSense is RTR’s premier perception toolkit. With it, users can extend RTR’s state-of-the-art motion planning solution into unstructured environments. The following document walks a new user through the process of setting up a RapidSense project and the hardware components in a cell.