Manual Calibration in Eye in Hand Scenarios (6-Axis Robot - Multiple Random Poses of Calibration Board)

This documentation applies to a pre-release version (2.2.0). If you are unsure which version of the product you are using, contact Mech-Mind Technical Support for assistance.

This operation guide describes how to complete manual hand-eye calibration for a 6-axis robot and a 2D camera in an Eye in Hand (EIH) scenario (multiple random poses of calibration board).

Overview

The overall workflow for manual calibration of a 6-axis robot in an Eye in Hand (EIH) scenario is shown below.

(Figure: overall calibration workflow)
  • Preparation: Complete the required preparation before calibration.

  • Pre-Configuration: Select pre-configuration items before calibration, such as robot model and camera mounting method.

  • Start Calibration: Formally start calibration and obtain calibration results by completing a series of steps. This step involves robot-side operations to establish communication between the vision system and the robot.

  • Validate Calibration Results: Verify the obtained calibration results and check whether they meet requirements.

  • Apply Calibration Results: Use the new calibration parameter group in the vision project.

The following sections describe the workflow in detail.

Preparation

Before hand-eye calibration, complete the following preparation tasks:

Complete Camera Installation

Refer to the Camera Installation section to complete camera installation.

Complete Software Installation

Hand-eye calibration requires Mech-Vision and Mech-Viz. Ensure both software products are installed and updated to the latest version.

Complete Robot Communication Configuration

If the robot communicates with the vision system through the standard interface, complete standard interface communication configuration for the robot. Depending on your robot brand, refer to the corresponding robot’s standard-interface communication configuration document in Standard Interface Communication.

If the robot communicates with the vision system through master control, complete master-control communication configuration for the robot. Depending on your robot brand, refer to the corresponding robot’s master-control communication configuration document in Master Control Communication.

Prepare Required Materials

Manual calibration in EIH scenarios requires a calibration board or marker.

  • If you use a calibration board, prepare it according to the following requirements:

    • Make sure circles on the calibration board are clear, with no obvious scratches, and the board has no obvious bending or deformation.

    • In EIH scenarios, place the calibration board flat at the center of the work surface. Ensure the calibration board lies within the camera field of view and is as parallel as possible to the camera imaging plane, that is, as perpendicular as possible to the Z-axis of the camera coordinate system.

  • If using a calibration board is inconvenient on site (for example, limited space or installation constraints), you can use a marker. Prepare it according to the following requirement:

    • The marker must have clear feature points, and the spatial distribution of feature points should be as uniform as possible.
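The requirement that feature points be spatially uniform can be checked numerically before calibration. The sketch below is illustrative (the grid size and scoring are assumptions, not part of the tool): it measures how many cells of a coarse grid over the image contain at least one feature point.

```python
def coverage_score(points, width, height, grid=3):
    """Fraction of grid cells containing at least one feature point.

    A low score means the feature points are clustered, which tends to
    produce a poorly conditioned calibration. The 3x3 grid is an
    illustrative choice, not a tool requirement.
    """
    occupied = set()
    for x, y in points:
        col = min(int(x / width * grid), grid - 1)
        row = min(int(y / height * grid), grid - 1)
        occupied.add((row, col))
    return len(occupied) / (grid * grid)

# Clustered points cover few cells; spread-out points cover many.
clustered = [(10, 10), (12, 11), (15, 14)]
spread = [(10, 10), (320, 240), (600, 50), (50, 430), (610, 460)]
print(coverage_score(clustered, 640, 480))  # low
print(coverage_score(spread, 640, 480))     # higher
```

A score close to 1.0 indicates the feature points span the whole field of view rather than one corner.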

Check Calibration Board Image Quality

Calibration board image quality affects hand-eye calibration accuracy. Therefore, you need to check calibration board image quality. The calibration workflow includes this check, but you can also perform it in advance to reduce calibration time.
  1. Place the calibration board horizontally at the center of the work surface within the camera field of view.

  2. In 2D Camera Management Tool, connect the camera and adjust camera parameters to ensure overall brightness of the calibration board in the 2D image is not too dark, too bright, or uneven, and each calibration circle is clearly visible.

    If ambient light on site is complex, it is recommended to use shielding or supplementary lighting to reduce ambient light impact on 2D images.

    (Images: examples of a normal, an overexposed, and an underexposed calibration board image)
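The brightness judgment in step 2 can be approximated with a simple mean-intensity check. The sketch below uses hypothetical thresholds (60 and 200 on an 8-bit scale); tune them for your camera and lighting rather than treating them as tool defaults.

```python
def classify_exposure(gray, low=60, high=200):
    """Rough exposure check on an 8-bit grayscale image given as a list
    of pixel rows. Thresholds are illustrative, not tool defaults."""
    pixels = [p for row in gray for p in row]
    mean = sum(pixels) / len(pixels)
    if mean < low:
        return "underexposed"
    if mean > high:
        return "overexposed"
    return "normal"

dark = [[20] * 8 for _ in range(8)]
bright = [[240] * 8 for _ in range(8)]
mid = [[128] * 8 for _ in range(8)]
print(classify_exposure(dark))  # underexposed
```

A per-region variant of the same check (comparing block means) can also flag uneven illumination.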

Complete Pre-Calibration Checks

Refer to Pre-Calibration Checks and complete the following checks:

  • Confirm the robot base is firmly installed.

  • Confirm the camera bracket and camera are firmly installed.

  • Confirm robot absolute accuracy meets requirements.

  • Verify robot model parameters are accurate.

  • Confirm the camera has no distortion, or distortion calibration has been completed.

  • Confirm camera warm-up has been completed.

Pre-Configuration

  1. Open Mech-Vision. On the menu bar, choose Camera Assistant › 2D Camera Calibration › Hand-Eye Calibration. The Calibration Pre-Configuration window is displayed.

  2. After confirming pre-calibration checks are completed, click Confirm Checks, then click Next.

  3. In the Select How to Calibrate window, select Start New Calibration, then click Next.

  4. In the Select Calibration Task window, select Hand-Eye Calibration for Other Robots from the drop-down list, specify Robot Euler Angle Type as needed, select the robot coordinate system type, and then click Next.

  5. In the Select Robot Type for Calibration window, select 6-Axis Robot according to your robot type, then click Next.

  6. In the Select Camera Mounting Method window, select Eye in Hand, then click Next.

  7. In the Select How to Collect Data window, select Multiple Random Poses of Calibration Board, then click Start Calibration. The Calibration (Eye in Hand) window is displayed.

Pre-configuration is now complete. You can proceed to the formal calibration workflow.
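Step 4 asks for the robot Euler angle type because the same orientation reads differently under different conventions, and a mismatched convention silently corrupts every recorded pose. As an illustration (assuming an intrinsic Z-Y'-X'' convention in degrees; always confirm the convention against your robot brand's manual), Euler angles convert to a rotation matrix like this:

```python
import math

def zyx_euler_to_matrix(rz, ry, rx):
    """Intrinsic Z-Y'-X'' Euler angles (degrees) to a 3x3 rotation
    matrix. Many controllers use this yaw-pitch-roll convention, but
    it is an assumption here -- verify it for your robot brand."""
    a, b, c = (math.radians(v) for v in (rz, ry, rx))
    ca, sa = math.cos(a), math.sin(a)
    cb, sb = math.cos(b), math.sin(b)
    cc, sc = math.cos(c), math.sin(c)
    return [
        [ca * cb, ca * sb * sc - sa * cc, ca * sb * cc + sa * sc],
        [sa * cb, sa * sb * sc + ca * cc, sa * sb * cc - ca * sc],
        [-sb,     cb * sc,                cb * cc],
    ]

R = zyx_euler_to_matrix(90, 0, 0)  # pure 90-degree yaw about Z
```

Feeding the same three numbers through a different convention (for example X-Y-Z) yields a different matrix, which is why the tool must know the robot's Euler angle type.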

Start Calibration

Connect Camera

  1. Connect the camera.

    In the Connect Camera step, you can select a connected camera from the drop-down list.

    If no camera is available in the list, click 2D Camera Management, connect the camera in 2D Camera Management Tool, and then return here to select the corresponding camera.

  2. Confirm the camera can capture images normally.

    After connecting the camera, you can click Continuous Capture or Single Capture to view captured images in the Image View panel on the right.

    When capturing images, ensure overall brightness of the calibration board is not too dark, too bright, or uneven, and each calibration circle is clearly visible. If image quality does not meet requirements, adjust Exposure Time and Gain to improve image quality.

  3. Load distortion calibration results.

    To eliminate image distortion and ensure subsequent hand-eye calibration is computed based on more accurate image coordinates, load distortion calibration results for the corresponding camera. The tool uses these results to correct captured images before subsequent calibration operations.

    If you have already confirmed the camera has no distortion, this operation can be skipped.

After the camera is connected and image quality is confirmed, click Next on the bottom bar.
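Loading distortion results matters because lens distortion shifts pixel coordinates away from the ideal pinhole model. A minimal sketch of the idea, assuming a radial (Brown-Conrady) model with two illustrative coefficients -- real coefficients come from the camera's distortion calibration:

```python
def undistort_point(xd, yd, k1, k2, iterations=10):
    """Iteratively invert the radial distortion model for a normalized
    image point (xd, yd). k1 and k2 are illustrative radial
    coefficients; the tool's actual model may include more terms."""
    x, y = xd, yd  # initial guess: distorted == undistorted
    for _ in range(iterations):
        r2 = x * x + y * y
        scale = 1 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale
    return x, y

# With zero distortion the point is unchanged.
print(undistort_point(0.1, -0.2, 0.0, 0.0))
```

The fixed-point iteration converges quickly for the small distortion typical of machine-vision lenses; strongly distorted lenses need the full calibrated model.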

Select Calibration Method

The tool provides two calibration methods: Calibration with Calibration Board and Calibration with Feature Points.

Calibration with Calibration Board

This method is suitable for scenarios that require high calibration accuracy and have no suitable marker.

  1. In the Calibration Method drop-down list, select Calibration with Calibration Board.

  2. Based on the markings on the calibration board, select the model of the board in use from the Standard Calibration Board Model drop-down list.

Calibration with Feature Points

This method is suitable for scenarios where using a calibration board is inconvenient on site (limited space or installation constraints) and usable feature points already exist on a marker.

If edges of target workpieces are clear and not fully symmetric, the target workpiece can be used as a marker.

The tool provides two feature-point recognition modes: 2D Matching and Obtain from Project. Each mode, its applicable scenarios, and its operation steps are described below.

2D Matching

Recognizes feature points in images by template matching.

Suitable for scenarios where feature points are regular and easy to match.

  1. Select a template.

    Create a template in 2D Matching Template Editor and select it here.

  2. Set matching parameters.

    • Edge Polarity Sensitive: Controls whether the edge grayscale transition direction during matching must be consistent with the template (for example, bright-to-dark or dark-to-bright).

      Parameter tuning suggestions:

      • When contrast between workpiece and background is stable and lighting changes are small (for example, fixed light source + fixed installation), enable this parameter to reduce false matches.

      • When reflections are obvious, lighting fluctuates greatly, or brightness inversion may occur on the workpiece, disable this parameter to improve matching robustness.

      • You can test with this parameter enabled first. If matching occasionally fails even when position is correct, try disabling it and retest.

    • Minimum Matching Score: Used to determine whether a matching result is valid. Results below this threshold are filtered out.

      Parameter tuning suggestions:

      • A higher threshold can reduce false positives, but may cause missed detections (especially when image noise is high or workpiece edges are unclear).

      • A lower threshold can increase detection rate, but may introduce false positives.

      • It is recommended to tune parameters using a high-to-low strategy: start with a higher threshold to ensure reliable results, then gradually reduce it to find a balance between detection rate and stability.

Typical scenarios:

  • If workpiece shape is regular and background is clean, use a higher matching score to improve positioning stability.

  • If workpiece surface has wear, stains, or incomplete edges, appropriately lower the matching score to avoid matching failures caused by local defects.
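The Minimum Matching Score behaves like a normalized-correlation cutoff, and Edge Polarity Sensitive relates to the sign of that correlation. The sketch below is only an analogy (the tool's internal scoring may differ): normalized cross-correlation between a template and an image patch, where brightness inversion flips the score's sign.

```python
import math

def ncc_score(template, patch):
    """Normalized cross-correlation between two equally sized grayscale
    patches (lists of rows). Returns a value in [-1, 1]; 1 is a perfect
    match. Illustrative only -- not the tool's actual score."""
    t = [p for row in template for p in row]
    q = [p for row in patch for p in row]
    mt, mq = sum(t) / len(t), sum(q) / len(q)
    num = sum((a - mt) * (b - mq) for a, b in zip(t, q))
    den = math.sqrt(sum((a - mt) ** 2 for a in t)
                    * sum((b - mq) ** 2 for b in q))
    return num / den if den else 0.0

template = [[0, 255], [255, 0]]
same = [[0, 255], [255, 0]]
inverted = [[255, 0], [0, 255]]
print(ncc_score(template, same))      # 1.0
print(ncc_score(template, inverted))  # -1.0
```

A polarity-sensitive matcher would reject the inverted patch (negative score); a polarity-insensitive one would accept its absolute value, which is why disabling the parameter helps when brightness inversion occurs.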

Obtain from Project

Obtains feature points through existing processing flows in a vision project.

Suitable for scenarios that require flexible feature-point acquisition using multiple image-processing methods (such as creating points and lines).

  1. Select the vision project in the current solution.

  2. Select the step that outputs feature points in the project.

Obtain Feature Points and Poses

  1. In the Obtain Feature Points and Poses step, move the robot to a calibration point, and then click Add Image and Record Flange Pose.

    After the robot moves to different calibration points, record the pose of each point in the robot program for direct reuse during recalibration.
  2. Enter robot flange poses in the pop-up window.

    Fill in robot flange poses according to the poses displayed on the teach pendant.

    • If you use a UR robot, use Rotation Vector to represent orientation.

    • If Euler angles are used to represent orientation:

      • If you use other robot brands, select the Euler angle type corresponding to that robot brand.

      • If you use an adapted robot, the software automatically selects the correct Euler angle type. No manual setting is required.

    Create a local file (.txt or .xlsx) to save the entered robot flange poses for convenient reuse during recalibration.
  3. Repeat the above steps until the added calibration points meet data requirements, then click Next on the bottom bar.
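The tip about saving entered flange poses to a local file is easy to script. The sketch below assumes one pose per line in x, y, z, rx, ry, rz order -- the column order is an assumption; record whichever convention your teach pendant displays.

```python
import csv
import io

def save_poses(poses, fh):
    """Write flange poses (x, y, z, rx, ry, rz), one comma-separated
    line per pose. Column order is an assumed convention."""
    writer = csv.writer(fh)
    for pose in poses:
        writer.writerow(pose)

def load_poses(fh):
    """Read poses back as lists of floats, skipping blank lines."""
    return [[float(v) for v in row] for row in csv.reader(fh) if row]

# Round-trip through an in-memory buffer; use open("poses.txt", ...)
# on site to keep the file next to the solution.
buf = io.StringIO()
save_poses([[500.0, 0.0, 300.0, 180.0, 0.0, 90.0]], buf)
buf.seek(0)
print(load_poses(buf))
```

Reloading such a file during recalibration avoids re-typing every pose from the teach pendant.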

Calculate Extrinsic Parameters

  1. In the Calculate Extrinsic Parameters step, click Calculate Extrinsics in the Calculate Extrinsics and View Results area.

  2. In the calibration-success dialog, click OK, and then view calibration results in the message panel below.

  3. In the Calculate Extrinsic Parameters step, click Save on the bottom bar. When the message panel indicates successful save, calibration results are automatically saved to the calibration folder of the solution.
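In EIH scenarios the computed extrinsic parameters describe the camera pose relative to the flange, so a point the camera sees is mapped into the robot base frame through the chain base_T_flange · flange_T_camera. A minimal sketch with 4x4 homogeneous matrices -- all numeric values below are illustrative, not real calibration results:

```python
def matmul4(a, b):
    """Multiply two 4x4 homogeneous transform matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(t, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    v = [p[0], p[1], p[2], 1.0]
    return [sum(t[i][k] * v[k] for k in range(4)) for i in range(3)]

# Illustrative transforms: camera offset 100 mm along the flange Z axis,
# flange at (500, 0, 300) mm in the base frame, no rotation anywhere.
flange_T_camera = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 100], [0, 0, 0, 1]]
base_T_flange = [[1, 0, 0, 500], [0, 1, 0, 0], [0, 0, 1, 300], [0, 0, 0, 1]]

base_T_camera = matmul4(base_T_flange, flange_T_camera)
# A point 50 mm ahead of the camera lands at (500, 0, 450) in the base frame.
print(apply(base_T_camera, (0.0, 0.0, 50.0)))
```

Because base_T_flange changes with every robot move in EIH, the chain is re-evaluated with the current flange pose for each capture.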

Validate Calibration Results

After obtaining calibration results, verify their accuracy through trial picking to ensure requirements are met.

  1. Save calibration results.

    In the Calculate Extrinsic Parameters step, click Save on the bottom bar. When the message panel indicates successful save, calibration results are automatically saved to the calibration folder of the solution.

  2. Prepare trial-picking workpieces.

    • Prepare multiple workpieces of the same type as those in actual applications.

    • Place workpieces within the camera field of view, and ensure they are distributed across different regions (such as center and corners) to comprehensively validate calibration accuracy.

  3. Prepare a workpiece recognition project and a trial picking project.

    • Ensure Enable Calibration is enabled in the 2D Smart Camera step, and select the saved calibration results from the drop-down list.

    • Set picking points and gripper type according to actual application requirements.

  4. Run the workpiece recognition project and trial picking project.

    Observe whether the robot can pick workpieces accurately and check the following indicators:

    • Picking success rate: The robot should pick stably, with a success rate not lower than 95%.

    • Picking position accuracy: Check whether the robot picking position is consistent with the relative position during calibration. If a systematic offset exists (for example, always offset in one direction), calibration results may have errors.

  5. Determine whether calibration is valid based on trial-picking results.

    • Validation passed: If trial-picking results meet application requirements, calibration results are valid and the calibration parameter group can continue to be used.

    • Validation failed: If obvious position or orientation deviation exists during trial picking, calibration accuracy does not meet requirements. In this case, it is recommended to:

      • Check whether pre-calibration preparation is sufficient, such as whether camera distortion exists and whether feature-point recognition is accurate.

      • Based on deviation direction and magnitude, consider whether calibration parameters need to be adjusted (for example, increasing rotation count or expanding translation range), then recalibrate.
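The systematic-offset check in step 4 can be quantified by averaging the per-axis picking error over the trial picks. The sketch below uses a hypothetical 2 mm tolerance -- set the threshold from your application's accuracy requirement:

```python
import statistics

def systematic_offset(expected, actual, threshold=2.0):
    """Mean per-axis picking error (mm) across trial picks.

    Returns the mean error vector and whether any axis shows a bias
    above `threshold` (an illustrative tolerance)."""
    errors = [[a - e for a, e in zip(act, exp)]
              for exp, act in zip(expected, actual)]
    mean_err = [statistics.mean(axis) for axis in zip(*errors)]
    biased = any(abs(m) > threshold for m in mean_err)
    return mean_err, biased

# Hypothetical trial data: a consistent ~+3 mm offset along X.
expected = [(100.0, 50.0, 20.0), (200.0, 80.0, 20.0), (150.0, 60.0, 20.0)]
actual = [(103.1, 50.2, 20.0), (203.0, 80.1, 20.1), (152.9, 59.8, 19.9)]
mean_err, biased = systematic_offset(expected, actual)
print(mean_err, biased)  # X bias of about +3 mm -> biased
```

A large mean with small scatter indicates a calibration (systematic) error, whereas large scatter with a near-zero mean points to repeatability or recognition noise instead.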

Apply Calibration Results

After extrinsic parameter validation is completed, you can apply calibration results.

Select the 2D Smart Camera step, enable Enable Calibration in the step parameter panel, and select saved calibration results in the drop-down list. Then the calibration results can be used for subsequent vision processing and picking.

This completes the calibration workflow.
