UR CB-Series (Polyscope 3.14 or Above)

This documentation describes version 1.7.4 of the product.

■ To use the latest version, visit the Mech-Mind Download Center to download it.

■ If you're unsure about the version of the product you are using, please contact Mech-Mind Technical Support for assistance.

This section introduces the process of setting up the Standard Interface communication with a Universal Robots (UR) CB-series robot.

Plugin Installation and Setup

This section introduces the installation and setup of Mech-Mind 3D Vision Interface (URCap plugin) for UR CB-series.

Prerequisites

Verify that you meet the minimum required versions for Mech-Mind vision-series software and Polyscope.

To view the version of Polyscope, press About on the Polyscope home screen of the UR teach pendant.

setup robot cb
about cb

Install the URCap plugin

To install the URCap plugin, follow these steps:

  1. Find the URCap plugin file (with the .urcap file extension, version 1.5.0) in the Mech-Center\Robot_Interface\Robot_Plugin\UR_URCAP directory of the Mech-Mind Software Suite installation path, and copy it to a USB drive.

  2. Insert the USB drive into the UR teach pendant.

  3. On the Polyscope home screen, press Setup Robot.

    setup robot cb
  4. In the Setup Robot window, select URCaps on the left panel.

    urcaps button cb
  5. In the URCaps window, press +, and then navigate to the USB drive to locate the URCap plugin (.urcap file).

    urcaps screen cb
  6. In the Select URCap to install window, select the URCap plugin and press Open. The URCap plugin will be automatically installed.

    install urcaps cb
  7. Press Restart for the change to take effect.

    restart urcaps cb

The URCap plugin is now installed on the UR teach pendant.

After installing the URCap plugin, you also need to set the IP address of the robot (select Setup Robot > Network). Note that the robot’s IP address and the IPC’s IP address must be on the same subnet.

Use Mech-Mind 3D Vision Interface

Before use, make sure that your Mech-Vision and Mech-Viz (if used) projects are ready to run, and the Mech-Mind IPC is connected to the robot’s network.

To use Mech-Mind 3D Vision Interface, you need to complete the following setup.

  1. Click Robot and Interface Configuration on the toolbar of Mech-Vision.

  2. Select Listed robot from the Select robot drop-down menu, and then click Select robot model. Select the robot model that you use, and then click Next.

  3. Select Standard Interface for Interface Type, TCP Server and ASCII for Protocol, and 50000 (Cannot be changed) for Port Number. Then click Apply.

    configure communication 1
  4. Make sure the Interface Service is started: on the toolbar of Mech-Vision, the Interface Service switch on the far right should be toggled on (displayed in blue).

    configure communication 2

After the TCP server interface has been started in Mech-Vision, you can connect to it on the UR teach pendant.

  1. On the Polyscope home screen, press Program Robot.

  2. Select the Installation tab, and press Network Settings on the left panel. The Network Settings window of the URCap plugin is displayed.

    network settings cb
  3. Set Mech-Mind IPC IP Address and Port No. to the IP address and port number of the Mech-Mind IPC respectively. Then, press Apply.

  4. Press Connection Test.

    • When the connection to the TCP server interface is established successfully, the return status should look like this:

      connection pass
    • When the connection to the TCP server interface fails to be established, the return status should look like this:

      connection failed

    The connection test only verifies connectivity; once the connection is established, it is disconnected automatically. As a result, you will see client online and offline logs on the Console tab of the Mech-Vision Log panel.
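The reachability check performed by Connection Test can be approximated from any machine on the robot's subnet with a short Python sketch. This is a minimal example that only tests TCP reachability of the interface port, not the Standard Interface protocol itself; the host address shown in the comment is a placeholder for your IPC's actual IP.

```python
import socket

def can_connect(host: str, port: int = 50000, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port can be established."""
    try:
        # Open and immediately close a TCP connection, mirroring the
        # plugin's Connection Test behavior (connect, then disconnect).
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (replace with your IPC's actual IP address):
# print(can_connect("192.168.1.10", 50000))
```

Because the connection is closed right away, running this check also produces the client online/offline log entries mentioned above.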

Hand-Eye Calibration Using the Plugin

This section introduces the process of calibrating the camera extrinsic parameters using the URCap plugin.

Prerequisites

Before proceeding, please make sure that:

  • You have successfully connected to the TCP server interface with the URCap plugin.

  • You are familiar with the contents in Hand-Eye Calibration Guide of Mech-Vision.

Create a Calibration Program

  1. On the UR teach pendant, select Program Robot > Program, and press Empty Program.

  2. Select Structure > URCaps, and press Mech-Mind Calibrate under the URCaps tab. An example program node Calibrate is automatically created under the Robot Program program tree on the left panel.

    add calibrate node cb

    The created example program node is just a template. You need to further configure the calibration program and teach the calibration start point.

Teach the Calibration Start Point

  1. Select the Calibrate node in the program tree, press the Command tab on the right panel, and set the Type of point received from Mech-Vision parameter to “Joint Angle” or “Flange Pose” according to the actual needs.

    set calibrate node cb
  2. Press Next on the bottom bar to proceed to the MOVEJ node, set the motion type to “MoveJ”, “MoveL” or “MoveP”, and Set TCP to Use Tool Flange on the right Move panel to ensure that the waypoint will be recorded as flange pose.

    set movej cb
  3. Manually control the robot to move to the start point for the calibration.

  4. Go back to the UR teach pendant, select the start_pose node on the left panel, and press Set Waypoint on the right Waypoint panel. You will be switched to the Move tab.

    teach startpoint cb
  5. On the Move tab, confirm that the robot’s current flange pose is proper and press OK.

    confirm waypoint cb

Run the Calibration Program

  1. Select the Robot Program program tree on the left panel, and clear the Program Loops Forever checkbox on the right Program panel to ensure that the program is run just once.

    run calibrate cb
  2. On the bottom bar, lower the robot speed to a suitable value, such as 10%, for safety reasons.

  3. Press the Run button on the bottom bar to run the program.

If the calibration program runs successfully, the message “Entering the calibration process, please start the calibration in Mech-Vision” will be displayed in the Console tab of Mech-Vision Log panel.

Complete Calibration in Mech-Vision

  1. In Mech-Vision, click Camera Calibration (Standard) on the toolbar, or select Camera > Camera Calibration > Standard from the menu bar.

  2. Follow the instructions based on different camera mounting methods to complete the configuration.

To save the calibration program for future use, select File > Save As.

save program cb

After you complete the hand-eye calibration, you can create pick and place programs to instruct UR robots to execute vision-guided pick and place tasks.

Create Pick and Place Programs

The URCap plugin provides an example Pick and Place program node for you to create pick and place programs with minimal programming effort.

This Pick and Place program node provides three options: With Vision (picking points), With Vision (picking path), and With Mech-Viz.

The plugin provides a program template for each option to facilitate programming.

The following examples assume that the gripper actually used and its TCP have already been configured properly on the robot.

Create a Pick and Place with Mech-Vision (picking points) Program

To create a Pick and Place with Mech-Vision (picking points) program, follow these steps:

  1. Enable the With Vision (picking points) option.

    1. On the UR teach pendant, select Program Robot > Program > Empty Program.

    2. Select Structure > URCaps, and press Mech-Mind Pick and Place under the URCaps tab. An example program node Pick and Place is automatically created under the Robot Program program tree on the left panel.

      add pick place node cb
    3. Select the Command tab, and press the With Vision (picking points) button on the right panel.

      select pick place option cb
    4. When you see that a program template is automatically created under the Pick and Place node in the program tree, press Next on the bottom bar.

      display vision option cb
  2. At the Mech-Mind Connect node, verify that the Host IP setting is the IP address of the Mech-Mind IPC on the right Mech-Mind Connect panel.

    verify host ip vision cb
  3. Set the image-capturing pose.

    1. Manually move the robot to the position where Mech-Vision should trigger the camera to capture an image.

      • For Eye-In-Hand scenario, the robot should be placed above the workpiece.

      • For Eye-To-Hand scenario, the robot should not block the camera view.

    2. Go back to the teach pendant, press Next on the bottom bar to proceed to the MOVEJ node, set the motion type to “MoveJ”, “MoveL” or “MoveP”, set Set TCP to Use active TCP on the right Move panel, and press Next.

      set movej capture cb
    3. Press Set Waypoint on the right Waypoint panel. You will be switched to the Move tab.

      set waypoint capture cb
    4. On the Move tab, confirm that the robot’s current TCP is proper and press OK.

      confirm waypoint cb
    5. Once the image capturing pose is set, press Next.

  4. Trigger the Mech-Vision project to run.

    1. Set the parameters Type of robot pose to send, Mech-Vision Project ID, and Num of expected poses on the right Trigger Mech-Vision panel.

      trigger vision cb
      Parameter Description

      Type of robot pose to send

      The type of the robot pose to send to Mech-Vision.

      • Current Position: the robot’s current joint positions and flange pose are sent to the vision system. This value is recommended when the camera is mounted in the Eye In Hand mode. With this setting, the “Path Planning” Step in the Mech-Vision project uses the joint positions sent by the robot. If the flange pose data are all 0, the vision system ignores the flange pose data.

      • Predefined JPs: the joint positions predefined by the user are sent to the vision system. This value is recommended when the camera is mounted in the Eye To Hand mode. With this setting, the “Path Planning” Step in the Mech-Vision project uses the joint positions sent by the robot as the initial pose.

      Mech-Vision project ID

      The index of the Mech-Vision project to trigger. You can find the project ID in the Project List panel of Mech-Vision.

      Num of expected poses

      The number of vision points that you expect Mech-Vision to output.

      • If it is set to 0, all detected object poses, but no more than 20, will be output.

      • If it is set to a number from 1 to 20 and more objects than expected are detected, Mech-Vision returns only the expected number of object poses.

    2. (Optional) Press Set the Recipe ID; a Set Recipe ID node is then added under the Trigger Mech-Vision node in the program tree.

    3. Select the Set Recipe ID node in the program tree, set the project’s Recipe ID parameter on the right Set Recipe ID panel, and press Next.

      set receipe id
  5. Set how to receive Mech-Vision result.

    Select the Receive Mech-Vision Result node under the Pick and Place node in the program tree, set Result type to Basic, set variable names for Pose, Label, Total Received, and Status Code (these variables store the vision result), and then press Next.

    receive vision result cb
    Parameter Description

    Result type

    • Basic: receive vision points and labels.
    • Planned path: receive waypoints and labels sent from the “Path Planning” Step.
    • Custom: receive vision points, labels, and user-defined port data.

    Pose

    The poses of the detected parts, in XYZ format. The robot can move directly to a pose with the active TCP. By default, the received poses are saved in the array variable “pose[]”, starting at array index 1.

    Label

    The object labels of the detected parts. Each label is an integer. By default, the labels are saved in the array variable “label[]”, starting at array index 1. Labels and poses are one-to-one paired.

    Total Received

    The total number of received object poses.

    Status code

    The returned status code. See Status Codes and Troubleshooting for reference. Codes of the form 11xx indicate success; 10xx codes indicate errors. By default, the status code is saved in the variable “status_code”.

    Starting at ID

    The start index of the array variables for poses and labels. By default, the index starts with 1.

    Picking point index

    This variable is only displayed in the interface if Result type is Planned path. The index of Vision Move in the received waypoints. For example, the “Path Planning” Step sends three points in the order of Relative Move_1, Vision Move_1, and Relative Move_2, so the picking point index is 2. By default, the vision point index is saved in the variable “vision_point”.

    Custom data

    This variable is only displayed in the window if Result type is Custom. It is the user-defined data received from the Steps in Mech-Vision, namely, data of ports other than poses and labels. By default, custom data is stored in the variable “custom_data”.

  6. Set the pick task.

    A pick task consists of three motions: Pick_above, which performs a linear approach to the pick point; Pick, which picks the object; and Pick_depart, which performs a linear departure after picking.

    1. Set the parameters Dist and Coordinates for the Pick_above and Pick_depart respectively on the right Mech-Vision Pick panel, and then press Next.

      set distance coordinates cb
    2. Keep the default settings for the MoveJ node on the right Move panel, and then press Next.

    3. Keep the default settings for the pick_above node on the right Waypoint panel, and then press Next.

    4. Keep the default settings for the MoveL node on the right Move panel, and then press Next.

    5. Keep the default settings for the pick node on the right Waypoint panel, and then press Next.

    6. Add gripper control logic for picking after the pick node according to your actual conditions.

    7. Keep the default settings for the MoveJ node on the right Move panel, and then press Next.

    8. Keep the default settings for the pick_depart node on the right Waypoint panel, and then press Next.

  7. Set the place task.

    1. Press Next to proceed to the next MoveJ node.

    2. Keep the default settings for the MoveJ node on the right Move panel, and then press Next.

    3. Manually move the robot to the proper location to place the picked object.

    4. Go back to the teach pendant, and press Set Waypoint on the right Waypoint panel. You will be switched to the Move tab.

      set waypoint place cb
    5. On the Move tab, confirm that the robot’s current flange pose is proper and press OK.

      confirm waypoint cb
    6. Once the place pose is set, press Next.

    7. Add gripper control logic for placing after the place node in the program tree according to your actual conditions.

A simple pick-and-place program with Mech-Vision (picking points) is now complete. You can run it by pressing the Run button on the bottom bar.
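Under the hood, the Standard Interface exchanges comma-separated ASCII strings over TCP, and the Receive Mech-Vision Result node fills the pose[], label[], and status_code variables from such a reply. The Python sketch below illustrates this kind of parsing; the field layout used here (status code, total received, then six pose components and an integer label per object) is an illustrative assumption, not the documented wire format.

```python
def parse_vision_result(reply: str, start_index: int = 1):
    """Parse a comma-separated ASCII result into pose/label dictionaries.

    NOTE: the field layout assumed here is for illustration only; consult
    the Standard Interface reference for the actual message format.
    """
    fields = reply.split(",")
    status_code = int(fields[0])   # e.g. 11xx = success, 10xx = error
    total = int(fields[1])         # "Total Received"
    pose, label = {}, {}
    idx = 2
    for i in range(total):
        # Each object: 6 pose components (X, Y, Z + rotation), then a label.
        pose[start_index + i] = [float(v) for v in fields[idx:idx + 6]]
        label[start_index + i] = int(fields[idx + 6])
        idx += 7
    return status_code, total, pose, label
```

As in the plugin, the arrays start at index 1 by default ("Starting at ID"), and labels are paired one-to-one with poses.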

Create a Pick and Place with Mech-Vision (picking path) Program

To create a Pick and Place with Mech-Vision (picking path) program, follow these steps:

  1. Enable the With Vision (picking path) option.

    1. On the UR teach pendant, select Program Robot > Program > Empty Program.

    2. Select Structure > URCaps, and press Mech-Mind Pick and Place under the URCaps tab. An example program node Pick and Place is automatically created under the Robot Program program tree on the left panel.

      add pick place node cb
    3. Select the Command tab, and press the With Vision (picking path) button on the right panel.

      select pick place option cb
    4. When you see that a program template is automatically created under the Pick and Place node in the program tree, press Next on the bottom bar.

      select with vision path option cb
  2. Set the value of Host IP to the IP address of the Mech-Mind IPC by referring to step 2 in Pick and Place with Mech-Vision (picking points).

  3. Set the image-capturing pose by referring to step 3 in Pick and Place with Mech-Vision (picking points).

  4. See how to trigger the Mech-Vision project to run by referring to step 4 in Pick and Place with Mech-Vision (picking points).

  5. Set how to receive Mech-Vision result by referring to step 5 in Pick and Place with Mech-Vision (picking points).

    • Select the Receive Mech-Vision Result node in the program tree; the Result type should be Planned path.

    • The variable of the Picking point index indicates the index of Vision Move in the received waypoints. For example, the “Path Planning” Step sends three points in the order of Relative Move_1, Vision Move_1, and Relative Move_2, so the picking point index is 2. By default, the vision point index is saved in the variable “vision_point”.

  6. Configure the motion loop, which drives the robot to follow the path planned by the “Path Planning” Step, that is, approach the object, pick the object, and depart from the pick point (placing the object is not included). For how to set the MoveL and MoveJ nodes, refer to step 6 in Pick and Place with Mech-Vision (picking points).

    In actual applications, the motion loop may contain several pick_above MoveJ nodes, a pick MoveL node, and several pick_depart MoveJ nodes.

  7. Set the place task by referring to step 7 in Pick and Place with Mech-Vision (picking points).

A simple pick-and-place program with Mech-Vision (picking path) is now complete. You can run it by pressing the Run button on the bottom bar.
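The Dist and Coordinates parameters of the pick task effectively offset the pick pose along a chosen axis to obtain the pick_above and pick_depart waypoints. The sketch below shows how such an offset could be computed for a pose in [x, y, z, rx, ry, rz] form, where the rotation part is a UR-style rotation vector (axis-angle). The function name and frame handling are illustrative assumptions, not the plugin's internal computation.

```python
import math

def offset_pose(pose, dist, axis="z", frame="base"):
    """Offset an [x, y, z, rx, ry, rz] pose by dist along one axis.

    frame="base": offset along a base-frame axis.
    frame="tool": rotate the unit axis by the pose's rotation vector
    (Rodrigues' formula) and offset along the result.
    """
    x, y, z, rx, ry, rz = pose
    unit = {"x": (1.0, 0.0, 0.0), "y": (0.0, 1.0, 0.0), "z": (0.0, 0.0, 1.0)}[axis]
    if frame == "tool":
        theta = math.sqrt(rx * rx + ry * ry + rz * rz)
        if theta > 1e-9:
            kx, ky, kz = rx / theta, ry / theta, rz / theta
            c, s = math.cos(theta), math.sin(theta)
            vx, vy, vz = unit
            # Rodrigues: v' = v*cos(t) + (k x v)*sin(t) + k*(k.v)*(1-cos(t))
            dot = kx * vx + ky * vy + kz * vz
            cross = (ky * vz - kz * vy, kz * vx - kx * vz, kx * vy - ky * vx)
            unit = (
                vx * c + cross[0] * s + kx * dot * (1 - c),
                vy * c + cross[1] * s + ky * dot * (1 - c),
                vz * c + cross[2] * s + kz * dot * (1 - c),
            )
    return [x + dist * unit[0], y + dist * unit[1], z + dist * unit[2], rx, ry, rz]
```

For example, a pick_above point Dist = 0.05 m above the pick pose along the base Z axis would be `offset_pose(pick, 0.05, axis="z", frame="base")`.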

Create a Pick and Place with Mech-Viz Program

To create a Pick and Place with Mech-Viz program, follow these steps:

  1. Enable the With Mech-Viz option.

    1. On the UR teach pendant, select Program Robot > Program > Empty Program.

    2. Select Structure > URCaps, and press Mech-Mind Pick and Place under the URCaps tab. An example program node Pick and Place is automatically created under the Robot Program program tree on the left panel.

      add pick place node cb
    3. Select the Command tab, and press the With Mech-Viz option.

      select pick place option cb
    4. When you see that a program template is automatically created under the Pick and Place node in the program tree, press Next on the bottom bar.

      display viz option cb
  2. At the Mech-Mind Connect node, verify that the Host IP setting is the IP address of the Mech-Mind IPC on the right Mech-Mind Connect panel.

    verify host ip viz cb
  3. Set the image-capturing pose by referring to step 3 in Pick and Place with Mech-Vision (picking points).

    This point is where the Mech-Viz project is triggered.

  4. Trigger the Mech-Viz project to run.

    1. Set the Type of robot pose to send on the right Trigger Mech-Viz panel.

      trigger viz cb
      • If the robot pose type is Current Position, the current joint positions and flange pose are sent to Mech-Viz, and the simulated robot in Mech-Viz moves to the first waypoint from the robot’s current joint positions. If the robot pose type is Predefined JPs, the predefined joint positions are sent to Mech-Viz, and the simulated robot moves to the first waypoint from the position defined by those joint positions.

      • If you use the branch task in the Mech-Viz project and want the robot to select the branch out port, press Set Branch Out Port, and proceed to Step b to set the branch out port.

      • If you use a move-class task that has the index parameter, press Set Index Value, and proceed to Step c to set the index value.

    2. (Optional) Select the Set Branch Value node in the program tree, set Branch Step ID and Exit port number on the right Set Branch Value panel, and press Next.

      set branch value cb
    3. (Optional) Select the Set Index Value node in the program tree, set Move Step ID and Index value on the right Set Index Value panel, and press Next.

      set index value cb
  5. Set how to receive Mech-Viz result.

    Select the Receive Mech-Viz Result node in the program tree, set variable names for Pose, Label, Speed (%), Total Received, Status Code, and Picking point index, which are used to save the planned path by Mech-Viz, and then press Next.

    receive viz result cb
    Parameter Description

    Pose

    The planned waypoint poses, in XYZ format. The robot can move directly to a pose with the active TCP. By default, the received poses are saved in the array variable “pose[]”, starting at array index 1.

    Label

    The object labels of the detected parts. Each label is an integer; for non-vision points, the label is 0. By default, the labels are saved in the array variable “label[]”, starting at array index 1. Labels and poses are one-to-one paired.

    Speed

    The motion speed of the robot moving to the point in percentage.

    Total Received

    The total number of received waypoint poses.

    Status code

    The returned status code. See Status Codes and Troubleshooting for reference. Codes of the form 21xx indicate success; 20xx codes indicate errors. By default, the status code is saved in the variable “status_code”.

    Picking point index

    The index of Vision Move in the received waypoints. For example, Mech-Viz sends three points in the order of Relative Move_1, Vision Move_1, and Relative Move_2, so the picking point index is 2. By default, the vision point index is saved in the variable “vision_point”.

    Starting at ID

    The start index of the array variables for poses and labels. By default, the index starts with 1.

  6. Configure the motion loop, which drives the robot to follow the path planned by Mech-Viz, that is, approach the object, pick the object, and depart from the pick point (placing the object is not included). For how to set the MoveL and MoveJ nodes, refer to step 6 in Pick and Place with Mech-Vision (picking points).

    • In actual applications, the motion loop may contain several pick_above MoveJ nodes, a pick MoveL node, and several pick_depart MoveJ nodes.

    • If you change the default variable names of poses, labels, etc. in the node of Receive Mech-Viz Result, you need to change the variables used in this step.

  7. Set the place task by referring to step 7 in Pick and Place with Mech-Vision (picking points).

A simple pick-and-place program with Mech-Viz is now complete. You can run it by pressing the Run button on the bottom bar.
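As noted in the parameter descriptions, Mech-Vision replies use status codes of the form 11xx for success and 10xx for errors, while Mech-Viz replies use 21xx and 20xx. When handling the status_code variable in your own logic, a small helper can branch on this convention (the function name is illustrative; see Status Codes and Troubleshooting for the meaning of individual codes):

```python
def is_success(status_code: int) -> bool:
    """Return True for normal status codes (11xx from Mech-Vision,
    21xx from Mech-Viz); False for error codes (10xx and 20xx)."""
    return status_code in range(1100, 1200) or status_code in range(2100, 2200)
```

For example, a program could stop and report the code to the operator whenever `is_success(status_code)` is False.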
