Check Whether the Recognition Effect Has Degraded


This section guides you through checking whether the recognition effect has degraded.

In this section, you need to check the following:

  • Check whether the point cloud quality has degraded

  • Check whether the effect of deep learning inference has degraded

  • Check whether the effect of 3D matching has degraded

Check Whether the Point Cloud Quality Has Degraded

If your project uses 3D matching for recognition, perform the checks in this section.

Check method:

  1. Place the target object in the area where the picking inaccuracy issue occurs.

  2. In an eye-in-hand (EIH) setup, move the camera to the image-capturing position where the picking inaccuracy issue occurs.

  3. Open the Mech-Eye Viewer software and connect to the camera used by the project.

  4. After connecting to the camera, switch to the parameter group used by the project, and click the single capture button to capture an image once.

  5. On the Point Cloud tab in the data display area, check the quality of the target object's point cloud. If needed, you can also save the point cloud here for the offline inspection sketched below.
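
If you want to examine the capture outside Mech-Eye Viewer, you can save the point cloud (for example as a PLY file) and inspect it with a general-purpose point cloud library. The following is a minimal sketch using Open3D; the file name and ROI bounds are illustrative assumptions, not values from the product.

  import numpy as np
  import open3d as o3d

  # Load a point cloud saved from Mech-Eye Viewer (illustrative file name).
  pcd = o3d.io.read_point_cloud("current_capture.ply")
  print(f"Total points in the capture: {len(pcd.points)}")

  # Crop to the region where the picking inaccuracy occurs
  # (bounds are illustrative and assume the cloud is in meters, camera frame).
  roi = o3d.geometry.AxisAlignedBoundingBox(
      np.array([-0.15, -0.15, 0.40]),
      np.array([0.15, 0.15, 0.80]),
  )
  object_pcd = pcd.crop(roi)
  print(f"Points on and around the target object: {len(object_pcd.points)}")

  # Visual check: missing regions or heavy noise are usually obvious here.
  o3d.visualization.draw_geometries([object_pcd])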

Check criteria:

If the target object point cloud has the following problems, the point cloud quality has degraded:

  • The point cloud is incomplete, and picking-related feature points are missing.

  • The point cloud fluctuates (contains noise) and shows obvious up-and-down jitter (see the sketch after this list for a rough way to quantify both symptoms).
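
Both symptoms can be put into rough numbers by comparing the current capture with a baseline capture saved when picking still worked well. The sketch below is only an illustration: the file names, the assumption that both clouds are cropped to the same object region and expressed in meters, and the plane-fit noise metric are not part of the product workflow.

  import numpy as np
  import open3d as o3d

  # Illustrative file names: a known-good capture and the current capture,
  # both cropped to the same target-object region beforehand.
  baseline = np.asarray(o3d.io.read_point_cloud("baseline_object.ply").points)
  current = np.asarray(o3d.io.read_point_cloud("current_object.ply").points)

  # Symptom 1: incomplete point cloud -> noticeably fewer valid points than the baseline.
  completeness = len(current) / max(len(baseline), 1)
  print(f"Point count relative to baseline: {completeness:.0%}")

  # Symptom 2: noise / up-and-down jitter -> large residuals around a best-fit plane.
  # (Only meaningful for a roughly planar object surface; purely an illustrative metric.)
  def plane_residual_std(points):
      centered = points - points.mean(axis=0)
      # The right singular vector with the smallest singular value approximates the plane normal.
      _, _, vt = np.linalg.svd(centered, full_matrices=False)
      normal = vt[-1]
      return float(np.std(centered @ normal))

  print(f"Baseline surface noise: {plane_residual_std(baseline) * 1000:.2f} mm")
  print(f"Current surface noise:  {plane_residual_std(current) * 1000:.2f} mm")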

Correction method:

The common causes of poor point cloud quality and ways to deal with them are as follows.

  • Possible cause: Ambient light changes
    Measure: Adjust the parameters in the 3D Parameters, Point Cloud Processing, Depth Range, and ROI categories to improve the point cloud quality. If the ambient light is too strong, take shading measures.

  • Possible cause: Changes in incoming material orientation (the direction of the camera's projected stripes is not perpendicular to the long side of the target object)
    Measure: Restore the original incoming orientation, and keep the direction of the camera stripes perpendicular to the long side of the target object.

Check Whether the Effect of Deep Learning Inference Has Degraded

If your project uses deep learning to assist recognition, you need to check whether the effect of the deep learning inference has degraded.

Check method:

  • In the toolbar of Mech-Vision, click Production Interface to enter the production interface of the current solution.

  • View the effect of deep learning inference in the configured “Deep learning result” view.

Check criteria:

If any of the following situations occur, the effect of deep learning inference may have degraded:

  • Missed recognition: normal target objects are not recognized.

  • Misrecognition of target objects or the scene, including falsely identifying part of the scene as a target object, confusing different types of target objects with one another, and incorrectly recognizing target objects in the upper and lower layers.

  • Poor masks and low recognition confidence. If the masks are poor and the confidence of the segmented instances is low, misrecognition or missed recognition may occur, which affects picking. (A rough offline tally of confidences and mask sizes is sketched after this list.)
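
These checks are done visually in the production interface and require no code. If your project also logs or exports the inference results, a quick offline tally can make a gradual decline easier to spot over time. The sketch below assumes a purely hypothetical export format (a JSON list of instances with a label, a confidence, and a mask pixel count) and illustrative thresholds; adapt it to whatever your project actually records.

  import json
  from collections import Counter

  # Illustrative thresholds and file name; adjust to your project's own records.
  CONFIDENCE_THRESHOLD = 0.7
  MIN_MASK_PIXELS = 2000

  with open("inference_results.json") as f:
      # Assumed format: [{"label": ..., "confidence": ..., "mask_pixels": ...}, ...]
      instances = json.load(f)

  low_confidence = [i for i in instances if i["confidence"] < CONFIDENCE_THRESHOLD]
  small_masks = [i for i in instances if i["mask_pixels"] < MIN_MASK_PIXELS]

  print("Instances per label:", Counter(i["label"] for i in instances))
  print(f"Low-confidence instances: {len(low_confidence)} / {len(instances)}")
  print(f"Suspiciously small masks: {len(small_masks)} / {len(instances)}")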

Correction method:

The model needs to be iterated in the following situations:

  • Normal target objects still cannot be recognized even when the confidence threshold is set very low. In this case, iterate the model to address the missed recognition.

  • The misrecognition is unrelated to image quality or to the target object itself. In this case, iterate the model to address the misrecognition.

  • The masks are poor, the confidence of the segmented instances is low, and adjusting the confidence threshold does not resolve the issue. In this case, iterate the model.

Other possible causes of a decline in the deep learning inference effect and the corresponding measures are as follows.

  • Possible cause: Changes in ambient light leading to overexposure or underexposure in the images
    Measure: Adjust the 2D exposure parameters. If the ambient light is too strong, take shading measures.

  • Possible cause: Changes in incoming material orientation (abnormal incoming material)
    Measure: Restore the original incoming material orientation and adjust the 2D ROI to include all target objects in the 2D image.

  • Possible cause: Abnormal target objects
    Measure: Exclude abnormal target objects.

Check Whether the Effect of 3D Matching Has Degraded

If your project uses 3D matching for recognition, you need to check whether the 3D matching effect has degraded.

Check method:

  • In the toolbar of Mech-Vision, click Production Interface to enter the production interface of the current solution.

  • View the effect of 3D matching in the configured “Recognition result” view.

Check criteria:

If inaccurate matching occurs, the effect of 3D matching may have degraded.
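
Inaccurate matching is normally judged visually in the “Recognition result” view. If you want to attach a number to it, one generic approach is to transform the point cloud model with the matched pose and measure how far its points end up from the scene point cloud. The sketch below uses Open3D; the file names, the pose file, and the assumption that everything is in meters in the same reference frame are illustrative only.

  import numpy as np
  import open3d as o3d

  # Illustrative inputs: the point cloud model used for matching, the scene point cloud,
  # and one matched pose as a 4x4 homogeneous matrix (object frame -> scene frame).
  model = o3d.io.read_point_cloud("object_model.ply")
  scene = o3d.io.read_point_cloud("scene.ply")
  matched_pose = np.loadtxt("matched_pose.txt").reshape(4, 4)

  # Place the model into the scene with the matched pose, then measure the distance
  # from each model point to the nearest scene point. Large residuals suggest inaccurate matching.
  model.transform(matched_pose)
  residuals = np.asarray(model.compute_point_cloud_distance(scene))
  print(f"Mean residual:   {residuals.mean() * 1000:.2f} mm")
  print(f"95th percentile: {np.percentile(residuals, 95) * 1000:.2f} mm")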

Correction method:

Possible causes of a decline in the 3D matching effect and the corresponding measures are as follows.

  • Possible cause: Ambient light changes
    Measure: Adjust the parameters in the 3D Parameters, Point Cloud Processing, Depth Range, and ROI categories to improve the point cloud quality. If the ambient light is too strong, take shading measures.

  • Possible cause: 3D matching parameters are not set properly
    Measure: Contact Mech-Mind Technical Support for help with adjusting the parameters.

  • Possible cause: Target object changes
    Measure: Make point cloud models for the new target objects.
