Workpiece Locating


Before using this tutorial, you should have created a Mech-Vision solution using the General Workpiece Recognition case project in the Hand-Eye Calibration section.

In this tutorial, you will first learn the project workflow, and then deploy the project by adjusting the Step parameters to recognize the workpiece poses and output the vision result.

In this tutorial, you will need to convert the workpiece's model file in CAD format into a point cloud matching model. Since preparing the CAD model file takes a long time, it is recommended that you prepare it before starting this tutorial. You can download it by clicking here.
Video Tutorial: Workpiece Locating

Introduction to the Project Workflow

The following table describes each Step in the project workflow.

| No. | Phase | Step | Image | Description |
| --- | --- | --- | --- | --- |
| 1 | Capture images | Capture Images from Camera | project build understand step function 1 | Connect to the camera and capture images |
| 2 | Recognize workpieces | 3D Workpiece Recognition | project build understand step function 2 | Use 3D matching algorithms to calculate the workpieces’ poses (as pick points) |
| 3 | Adjust poses | Adjust Poses V2 | project build understand step function 3 | Transform the poses from the camera reference frame to the robot reference frame |
| 4 | Output the vision result | Procedure Out | project build understand step function 4 | Output the workpieces’ poses for the robot to pick |
| 5 | Send scene point cloud | Send Point Cloud to External Service | project build understand step function 5 | Send the scene point cloud to Mech-Viz for pick-and-place |

A pick point refers to a point on the workpiece at which the robot can pick it up.
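Conceptually, a pick point is just a pose: a position plus an orientation on the workpiece. Below is a minimal, illustrative sketch of such a representation (translation in millimeters plus a unit quaternion); the class and field names are assumptions for illustration, not Mech-Vision's actual data structures.

```python
from dataclasses import dataclass
import math

@dataclass
class PickPose:
    """A pick point: position in millimeters plus orientation as a unit quaternion."""
    x: float
    y: float
    z: float
    qw: float = 1.0  # identity rotation by default
    qx: float = 0.0
    qy: float = 0.0
    qz: float = 0.0

    def quaternion_norm(self) -> float:
        # A valid orientation quaternion must have norm 1.
        return math.sqrt(self.qw**2 + self.qx**2 + self.qy**2 + self.qz**2)

# A pick point 300 mm in front of the camera, with no rotation.
pose = PickPose(x=0.0, y=0.0, z=300.0)
print(pose.quaternion_norm())  # → 1.0
```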

Adjust Step Parameters

In this section, you will deploy the project by adjusting the parameters of each Step.

Capture Images from Camera

You should adjust the parameters of the Capture Images from Camera Step to connect to the camera.

  1. Select the Capture Images from Camera Step, and click Select camera on the Step parameters tab.

    project build click select camera
  2. In the pop-up window, click project build link camera before on the right of the camera serial No. to connect the camera. After the camera is connected, the project build link camera before icon will turn into the project build link camera after icon.

    project build link camera

    After the camera is connected, select the parameter group. Click the Select parameter group button and select the calibrated parameter group with ETH/EIH and date.

    project build select parameter group
  3. After the camera is connected and the parameter group is selected, the calibration parameter group, IP address, and ports of the camera will be obtained automatically. Just keep the default settings of the other parameters.

    project build other parameter

Now you have connected the software to the camera.

3D Workpiece Recognition

The 3D Workpiece Recognition Step integrates a 3D Workpiece Recognition visualized configurator, which provides point cloud preprocessing, model-based matching, and pose (pick point) calculation.

Select the 3D Workpiece Recognition Step, and click Open an Editor on the Step parameters tab.

project build open 3d workpiece recognition visual configuration tool

The 3D Workpiece Recognition visualized configurator is shown below.

project build check tool interface

Then follow the procedure to recognize workpieces.

project build 3d workpiece recognition workflow

Select Workpiece

After entering the 3D Workpiece Recognition visualized configurator, you need to make the point cloud model for the workpieces to recognize.

  1. Open the Model Editor.

On the upper part of the 3D Workpiece Recognition visualized configurator, click Select workpiece.

    project build click select workpiece

    In the pop-up Workpiece Library window, click the Model Editor button to open the Model Editor interface.

    project build click model editor
  2. Import the CAD file.

    On the left of the Matching Model and Pick Point Editor interface, click the Import CAD file button.

    project build click import cad

Import the prepared workpiece model in STL format, select the unit of measurement in which the model was created, and click OK.

    project build set size

    After the CAD file is imported, it will be displayed in the visualization area of the Matching Model and Pick Point Editor interface.

    project build show cad
  3. Use the CAD file to make the point cloud model.

In the resource list on the left of the Matching Model and Pick Point Editor interface, select the CAD file, and click the project build create model out surface point cloud icon icon on the toolbar. Then, set the sampling interval in the pop-up window to generate a point cloud of the CAD model's exterior surface.

    project build create model
    project build set down sample
  4. View the generated point cloud model.

    The point cloud model generated based on the CAD file will be displayed in the resource list.

    project build chect model

    Select the point cloud model file and you will see the point cloud model in the visualization area of the Matching Model and Pick Point Editor interface.

    project build show model
  5. Add a pose.

    Click the project build add pose icon icon on the toolbar to add a pose as a pick point to the point cloud model of the workpiece.

    project build click add pose

    The added poses are shown below.

    project build check pose
  6. Save the model and poses.

    Close the Matching Model and Pick Point Editor, and click Yes in the pop-up window.

    project build save model and pose
  7. Select this workpiece from the Workpiece Library.

    After closing the Matching Model and Pick Point Editor, select the saved point cloud model of the workpiece, and click OK.

    project build select workpiece

    Subsequently, the target workpiece to recognize is displayed in the upper-right corner of the 3D Workpiece Recognition visualized configurator.

    project build workpiece select result

Now, you have selected the workpiece. Click Next on the bottom of the 3D Workpiece Recognition visualized configurator.

project build click next step 1
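The sampling interval set in step 3 above controls how dense the generated point cloud model is: a larger interval keeps fewer points, which speeds up matching at the cost of surface detail. A common way to downsample by an interval is voxel-grid filtering, sketched below in plain Python; this is an illustrative sketch of the general technique, not Mech-Vision's actual implementation.

```python
def downsample(points, interval):
    """Voxel-grid downsampling: keep one point per cubic cell of size `interval`.

    `points` is a list of (x, y, z) tuples. A larger interval yields a
    sparser model and faster matching, at the cost of detail.
    """
    seen = {}
    for p in points:
        # Index of the voxel cell this point falls into.
        cell = tuple(int(c // interval) for c in p)
        seen.setdefault(cell, p)  # keep the first point seen in each cell
    return list(seen.values())

# 100 points spaced 0.1 mm apart along the x axis.
dense = [(x * 0.1, 0.0, 0.0) for x in range(100)]
sparse = downsample(dense, interval=1.0)
print(len(sparse))  # → 10 (one point per 1 mm cell)
```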

Preprocessing

Preprocessing is used to set an effective region for recognition to exclude the point cloud of unnecessary parts and keep only the point cloud of the workpiece, thus improving recognition efficiency.

The following figure displays the Preprocessing interface.

project build preprocess interface
  1. Set the region for recognition.

    Click the Settings button.

    project build click set 3d roi

In the visualized interface, set the region for recognition (3D ROI). Press and hold the Ctrl key, select the vertices of the 3D ROI, and drag the 3D ROI to the proper size. The following figure displays the set 3D ROI.

    project build set 3d roi
  2. Save the region for recognition.

    Click Save and apply to save the region for recognition.

    project build click save and use

Now, you have finished the preprocessing procedure. Click Next on the bottom of the 3D Workpiece Recognition visualized configurator to enter the recognition procedure.

project build click next step 2
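Under the hood, a 3D ROI is simply a box that clips the scene point cloud before matching, so that only points around the workpieces remain. A minimal sketch of axis-aligned box cropping (illustrative only, not Mech-Vision's implementation):

```python
def crop_to_roi(points, lower, upper):
    """Keep only the points inside an axis-aligned 3D ROI box.

    `lower` and `upper` are the (x, y, z) corners of the box.
    """
    return [
        p for p in points
        if all(lo <= c <= hi for c, lo, hi in zip(p, lower, upper))
    ]

# Two points near the workpiece and one far-away background point.
scene = [(0.0, 0.0, 500.0), (50.0, 20.0, 480.0), (400.0, 400.0, 900.0)]
workpiece_only = crop_to_roi(scene, lower=(-100, -100, 400), upper=(100, 100, 600))
print(len(workpiece_only))  # → 2 (the background point is excluded)
```

Excluding background points this way is what makes recognition both faster and more reliable.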

Recognize workpieces

In this procedure, you can adjust the 3D matching related parameters in a visualized manner, and output the workpieces’ poses.

The following figure displays the Recognition interface.

project build recognize workpiece interface
  1. Since this project needs to recognize a maximum of five workpieces, set the Output count upper limit to 5.

    project build set output number
View the visualized output result.

    Click the Run Step (Shift+R) button.

    project build click run step

You can view the visualized output result in the visualized area. As shown in the figure below, the poses of four workpieces are output.

    project build check recognize workpiece result
  3. Save the configuration.

    Click the Finish button on the bottom of the 3D Workpiece Recognition visualized configurator.

    project build click finish

    Click Save in the pop-up window.

    project build click save

    Now, you have recognized the workpiece and calculated its pose.
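The Output count upper limit behaves like a cap on the ranked match list: candidate matches are scored, low-confidence ones are dropped, and at most N poses are output, which is why setting the limit to 5 can still yield only four poses. A hedged sketch of this selection logic (function and field names are illustrative, not Mech-Vision's API):

```python
def select_matches(candidates, output_upper_limit=5, min_confidence=0.5):
    """Sort candidate matches by confidence and keep at most `output_upper_limit`."""
    good = [c for c in candidates if c["confidence"] >= min_confidence]
    good.sort(key=lambda c: c["confidence"], reverse=True)
    return good[:output_upper_limit]

# Five candidate matches; one falls below the confidence threshold.
candidates = [{"pose": i, "confidence": conf}
              for i, conf in enumerate([0.9, 0.4, 0.8, 0.7, 0.95])]
picked = select_matches(candidates, output_upper_limit=5)
print(len(picked), picked[0]["confidence"])  # → 4 0.95
```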

Adjust Poses V2

The pick points output by the 3D Workpiece Recognition Step are in the camera reference frame. To facilitate robot picking, you need to adjust the workpieces’ poses to transform them from the camera reference frame to the robot reference frame.

  1. Open the pose adjustment tool.

    Select the Adjust Poses V2 Step, and click the Open the editor button in the Step Parameters panel.

    project build click open pose editor

    The interface of the pose adjustment tool is shown below.

    project build pose editor interface
  2. Adjust the reference frame.

    In the upper-right corner of the pose adjustment tool, under Reference Frame Settings, check the Convert Pose to Robot option.

    project build set transform type
  3. View the reference frame transformation result.

    Click the Next button in the lower-right corner of the pose adjustment tool.

    You can see the transformed poses in the visualized area of the pose adjustment tool.

    project build transform pose
  4. Save the configuration.

    Close the pose editor, and click Save in the pop-up window.

    project build save pose editor set

Now, the reference frame of the poses has been transformed.
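The Convert Pose to Robot option applies the camera-to-robot extrinsic transform obtained from hand-eye calibration. Mathematically, each point is multiplied by a 4x4 homogeneous transform. The sketch below illustrates the math with made-up extrinsics; the matrix values are assumptions for demonstration, not your calibration result.

```python
def transform_point(T, p):
    """Apply a 4x4 homogeneous transform (row-major nested lists) to a 3D point."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return tuple(sum(T[r][c] * v[c] for c in range(4)) for r in range(3))

# Illustrative extrinsics: camera rotated 180 degrees about X relative to the
# robot base and mounted 1000 mm above it.
T_robot_camera = [
    [1.0,  0.0,  0.0,    0.0],
    [0.0, -1.0,  0.0,    0.0],
    [0.0,  0.0, -1.0, 1000.0],
    [0.0,  0.0,  0.0,    1.0],
]

p_camera = (100.0, 50.0, 600.0)           # pick point in the camera frame
p_robot = transform_point(T_robot_camera, p_camera)
print(p_robot)  # → (100.0, -50.0, 400.0), the same point in the robot frame
```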

Procedure Out

The Procedure Out Step sends the results of the current project to the backend service.

Send Point Cloud to External Service

The Send Point Cloud to External Service Step sends the point cloud to Mech-Viz, which can be used to debug the project or check its actual results.
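Sending a point cloud to an external service generally means serializing the XYZ points into a compact binary payload before transmission. The sketch below shows one plausible wire format (a point count followed by packed float32 triples); this is an illustrative format, not Mech-Viz's actual protocol.

```python
import struct

def pack_point_cloud(points):
    """Serialize a point cloud: a uint32 point count, then float32 XYZ triples.

    Little-endian byte order; an illustrative wire format only.
    """
    payload = struct.pack("<I", len(points))
    for x, y, z in points:
        payload += struct.pack("<3f", x, y, z)
    return payload

cloud = [(0.0, 0.0, 500.0), (10.0, -5.0, 498.5)]
data = pack_point_cloud(cloud)
# 4-byte count + 2 points * 12 bytes each = 28 bytes total.
print(len(data))  # → 28
```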

Up to now, you have deployed the Mech-Vision project.
