Example Program 1: MM_S1_Vis_Basic

Program Introduction

Description

The robot triggers the Mech-Vision project to run, and then obtains the vision result for picking and placing the object.

File path

You can find this example program in the installation directory of Mech-Vision and Mech-Viz under Communication Component/Robot_Interface/KUKA/sample/MM_S1_Vis_Basic.

Project

Mech-Vision project

Prerequisites

This example program is provided for reference only. Before using it, modify the program according to your actual scenario.

Program Description

This part describes the MM_S1_Vis_Basic example program.

DEF MM_S1_Vis_Basic ( )
;---------------------------------------------------
; FUNCTION: trigger Mech-Vision project and get
; vision result
; Mech-Mind, 2023-12-25
;---------------------------------------------------
   ;set current tool no. to 1
   BAS(#TOOL,1)
   ;set current base no. to 0
   BAS(#BASE,0)
   ;move to robot home position
PTP HOME Vel=100 % DEFAULT
   ;initialize communication parameters (initialization is required only once)
   MM_Init_Socket("XML_Kuka_MMIND",873,871,60)
   ;move to image-capturing position
LIN camera_capture Vel=1 m/s CPDAT1 Tool[1] Base[0]
   ;trigger NO.1 Mech-Vision project
   MM_Start_Vis(1,0,2,init_jps)
   ;get vision result from NO.1 Mech-Vision project
   MM_Get_VisData(1,pos_num,status)
   ;check whether the vision result was obtained from Mech-Vision successfully
   IF status<> 1100 THEN
      ;add error handling logic here according to different error codes
      ;e.g.: status=1003 means no point cloud in ROI
      ;e.g.: status=1002 means no vision result
      halt
   ENDIF
   ;save first vision point data to local variables
   MM_Get_Pose(1,Xpick_point,label,toolid)
   ;calculate pick approach point based on pick point
   tool_offset={X 0,Y 0,Z -100,A 0,B 0,C 0}
   Xpick_app=Xpick_point:tool_offset
   ;move to intermediate waypoint of picking
PTP pick_waypoint CONT Vel=50 % PDAT1 Tool[1] Base[0]
   ;move to approach waypoint of picking
LIN pick_app Vel=1 m/s CPDAT2 Tool[1] Base[0]
   ;move to picking waypoint
LIN pick_point Vel=0.3 m/s CPDAT3 Tool[1] Base[0]
   ;add object grasping logic here, such as "$OUT[1]=TRUE"
   halt
   ;move to departure waypoint of picking
LIN pick_app Vel=1 m/s CPDAT2 Tool[1] Base[0]
   ;move to intermediate waypoint of placing
PTP drop_waypoint CONT Vel=100 % PDAT2 Tool[1] Base[0]
   ;move to approach waypoint of placing
LIN drop_app Vel=1 m/s CPDAT4 Tool[1] Base[0]
   ;move to placing waypoint
LIN drop Vel=0.3 m/s CPDAT5 Tool[1] Base[0]
   ;add object releasing logic here, such as "$OUT[1]=FALSE"
   halt
   ;move to departure waypoint of placing
LIN drop_app Vel=1 m/s CPDAT4 Tool[1] Base[0]
   ;move back to robot home position
PTP HOME Vel=100 % DEFAULT
END

The workflow corresponding to the above example program code is shown in the figure below.

(Figure: workflow of the MM_S1_Vis_Basic example program)

The table below explains the above program. You can click a command name to view its detailed description.

Feature | Code and description

Set the reference frame

   ;set current tool no. to 1
   BAS(#TOOL,1)
   ;set current base no. to 0
   BAS(#BASE,0)
  • BAS(#TOOL,1): Set the current tool reference frame to 1.

  • BAS(#BASE,0): Set the current base reference frame to 0.

The above two statements set the current tool and base reference frames.

Move to the home position

   ;move to robot home position
PTP HOME Vel=100 % DEFAULT
  • PTP: The point-to-point mode. In this mode, the robot moves its TCP along the quickest path to the target point.

  • HOME: The name of the target point (home position).

    You should teach the home position in advance. For detailed instructions, see Teach Calibration Start Point in the calibration document.

  • Vel=100 %: The velocity.

  • DEFAULT: The system-assigned name for a motion data set.

The above statement specifies to move the robot in PTP mode to the taught home position.

Initialize communication parameters

   ;initialize communication parameters (initialization is required only once)
   MM_Init_Socket("XML_Kuka_MMIND",873,871,60)

The MM_Init_Socket command establishes the TCP communication between the robot and the vision system based on the configurations in the XML_Kuka_MMIND.xml file.

For more information about the XML_Kuka_MMIND file, see MM_Init_Socket.

Move to the image-capturing position

   ;move to image-capturing position
LIN camera_capture Vel=1 m/s CPDAT1 Tool[1] Base[0]
  • LIN: The linear motion mode.

  • camera_capture: The target point name, which is the image-capturing position.

    You should teach the image-capturing position in advance. For detailed instructions, see Teach Calibration Start Point in the calibration document.
  • Vel=1 m/s: The velocity.

  • CPDAT1: The system-assigned name for a motion data set.

  • Tool[1]: The tool reference frame 1.

  • Base[0]: The base reference frame 0.

The above statement specifies to move the robot linearly to the taught image-capturing position.

Trigger the Mech-Vision project to run

   ;trigger NO.1 Mech-Vision project
   MM_Start_Vis(1,0,2,init_jps)
  • MM_Start_Vis: The command to trigger the Mech-Vision project to run.

  • 1: The Mech-Vision project ID.

  • 0: The Mech-Vision project is expected to return all vision points.

  • 2: Specify that the robot flange pose must be input to the Mech-Vision project.

  • init_jps: Custom joint positions. The joint positions in this example program are of no practical use but must be set.

The entire statement indicates that the robot triggers the vision system to run the Mech-Vision project with an ID of 1 and expects the Mech-Vision project to return all vision points.
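
If you only need a limited number of vision points, the second argument can be adjusted. The following is a minimal sketch, assuming (as the parameter description above implies) that a nonzero value in that position requests that number of vision points:

   ;trigger NO.1 Mech-Vision project and request only the first vision point
   ;(nonzero second argument = expected number of vision points; assumption)
   MM_Start_Vis(1,1,2,init_jps)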

Get the vision result

   ;get vision result from NO.1 Mech-Vision project
   MM_Get_VisData(1,pos_num,status)
  • MM_Get_VisData: The command to obtain the vision result.

  • 1: The Mech-Vision project ID.

  • pos_num: The variable that stores the number of received vision points.

  • status: The variable that stores the command execution status code.

The entire statement indicates that the robot obtains the vision result from the Mech-Vision project with an ID of 1.

The returned vision result is saved in the robot memory and cannot be used directly. To access it, you must store the vision result in the subsequent step.

   ;check whether the vision result was obtained from Mech-Vision successfully
   IF status<> 1100 THEN
      ;add error handling logic here according to different error codes
      ;e.g.: status=1003 means no point cloud in ROI
      ;e.g.: status=1002 means no vision result
      halt
   ENDIF
  • IF A THEN …​ ENDIF: When condition A is met, the program executes the code between IF and ENDIF.

  • <>: Not equal.

The above statement indicates that when the status code is 1100, the robot has successfully obtained the vision result; otherwise, an exception has occurred in the vision system and the program executes the code between IF and ENDIF. You can perform the corresponding operation based on the specific error code. In this example program, all error codes are handled in the same way: the program execution is paused with the halt command.
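
If different error codes should be handled differently instead of always pausing, the single halt inside the IF block can be replaced with a SWITCH on the status variable. The following is a minimal sketch based only on the error codes mentioned in the comments above; the handling in each branch is a placeholder that you should adapt to your scenario.

   IF status<> 1100 THEN
      SWITCH status
      CASE 1002
         ;no vision result: e.g. trigger the Mech-Vision project again
         halt
      CASE 1003
         ;no point cloud in ROI: e.g. notify the operator
         halt
      DEFAULT
         ;any other error code
         halt
      ENDSWITCH
   ENDIF

KRL SWITCH blocks do not fall through, so each CASE ends automatically at the next CASE.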

Store the vision result

   ;save first vision point data to local variables
   MM_Get_Pose(1,Xpick_point,label,toolid)
  • MM_Get_Pose: The command to store the vision result.

  • 1: The first vision point is stored.

  • Xpick_point: The variable that stores the TCP of the first vision point, which is the TCP of the picking waypoint.

  • label: The variable that stores the label corresponding to the first vision point.

  • toolid: The variable that stores the tool ID corresponding to the first vision point.

The entire statement stores the TCP, label, and tool ID of the first vision point in the specified variables.
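
MM_Get_Pose stores one vision point at a time. If the Mech-Vision project returns several vision points, they could be retrieved one by one by passing the index as the first argument and using the pos_num value obtained earlier. The following is a minimal sketch; the loop counter i is a hypothetical INT variable that would have to be declared in the declaration section of the program:

   ;retrieve every returned vision point in turn
   FOR i=1 TO pos_num
      MM_Get_Pose(i,Xpick_point,label,toolid)
      ;add the picking motions for this vision point here
   ENDFOR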

   ;calculate pick approach point based on pick point
   tool_offset={X 0,Y 0,Z -100,A 0,B 0,C 0}
   Xpick_app=Xpick_point:tool_offset
  • tool_offset={X 0,Y 0,Z -100,A 0,B 0,C 0}: 100 mm in the negative direction of the Z-axis of the tool reference frame.

  • Xpick_app=Xpick_point:tool_offset: 100 mm in the negative direction of the Z-axis relative to Xpick_point (i.e., 100 mm above the picking waypoint). Xpick_app indicates the approach waypoint of picking.

The above two statements calculate the position of the approach waypoint of picking. In the subsequent steps, the robot will reach this position.

Move to the intermediate waypoint

   ;move to intermediate waypoint of picking
PTP pick_waypoint CONT Vel=50 % PDAT1 Tool[1] Base[0]

The robot moves in PTP mode to an intermediate waypoint (pick_waypoint) between the image-capturing point and the approach waypoint of picking.

  • Adding intermediate waypoints can ensure smooth robot motion and avoid unnecessary collisions. You can add multiple intermediate waypoints according to the actual situation.

  • You need to teach the intermediate waypoint in advance. For information about how to teach the waypoint, see Teach Calibration Start Point in the calibration document.

Move to the approach waypoint of picking

   ;move to approach waypoint of picking
LIN pick_app Vel=1 m/s CPDAT2 Tool[1] Base[0]

The robot moves linearly to a point 100 mm above the picking waypoint (i.e., the approach waypoint of picking). pick_app and Xpick_app indicate the same position.

Adding approach waypoints of picking can prevent the robot from colliding with objects (such as bins) in the scene when moving. You can adjust the Z-axis offset (such as “tool_offset={X 0,Y 0,Z -100,A 0,B 0,C 0}”) based on the actual scene to ensure that the approach process is collision-free.
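
For example, to approach from 150 mm instead of 100 mm above the picking waypoint, only the Z component of the tool-frame offset needs to change before Xpick_app is calculated (the value -150 is an arbitrary illustration; choose a clearance that suits your scene):

   ;calculate pick approach point 150 mm above pick point
   tool_offset={X 0,Y 0,Z -150,A 0,B 0,C 0}
   Xpick_app=Xpick_point:tool_offset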

Move to the picking waypoint

   ;move to picking waypoint
LIN pick_point Vel=0.3 m/s CPDAT3 Tool[1] Base[0]

The robot moves linearly from the approach waypoint of picking to the picking waypoint. pick_point and Xpick_point indicate the same position.

Set DOs to perform picking

   ;add object grasping logic here, such as "$OUT[1]=TRUE"
   halt

After the robot moves to the picking waypoint, you can set a DO (such as “$OUT[1]=TRUE”) so that the robot picks the object with its tool. Set the DO based on the actual situation.

halt pauses the program execution. Once you have added a statement to set the DO, you can delete the halt statement here.
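
For example, if DO 1 closes the gripper, the placeholder could be replaced as follows (the DO number and the settling delay are scenario-specific assumptions):

   ;close the gripper and give it time to grip the object
   $OUT[1]=TRUE
   WAIT SEC 0.5

The same pattern applies at the placing waypoint below, with $OUT[1]=FALSE to release the object.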

Move to the departure waypoint of picking

   ;move to departure waypoint of picking
LIN pick_app Vel=1 m/s CPDAT2 Tool[1] Base[0]

The robot moves back to the point 100 mm above the picking waypoint, which serves as the departure waypoint of picking. pick_app and Xpick_app indicate the same position.

Adding departure waypoints of picking can prevent the robot from colliding with objects (such as bins) in the scene when moving. You can adjust the Z-axis offset (such as “tool_offset={X 0,Y 0,Z -100,A 0,B 0,C 0}”) based on the actual scene to ensure that the departure process is collision-free.

Move to the intermediate waypoint

   ;move to intermediate waypoint of placing
PTP drop_waypoint CONT Vel=100 % PDAT2 Tool[1] Base[0]

The robot moves to an intermediate waypoint (drop_waypoint) between the departure waypoint of picking and the approach waypoint of placing.

  • Adding intermediate waypoints can ensure smooth robot motion and avoid unnecessary collisions. You can add multiple intermediate waypoints according to the actual situation.

  • You need to teach the intermediate waypoint in advance. For information about how to teach the waypoint, see Teach Calibration Start Point in the calibration document.

Move the robot to the approach waypoint of placing

   ;move to approach waypoint of placing
LIN drop_app Vel=1 m/s CPDAT4 Tool[1] Base[0]

The robot moves from the intermediate waypoint to the approach waypoint of placing (drop_app).

  • Adding approach waypoints of placing can prevent the robot from colliding with objects (such as bins) in the scene when moving.

  • You need to teach the approach waypoint of placing in advance. For information about how to teach the waypoint, see Teach Calibration Start Point in the calibration document.

Move to the placing waypoint

   ;move to placing waypoint
LIN drop Vel=0.3 m/s CPDAT5 Tool[1] Base[0]

The robot moves from the approach waypoint of placing to the placing waypoint (drop).

  • The placing waypoint should be located at a safe distance from other stations, personnel, and equipment, and should not exceed the robot’s maximum reach.

  • You need to teach the placing waypoint in advance. For information about how to teach the waypoint, see Teach Calibration Start Point in the calibration document.

Set DOs to perform placing

   ;add object releasing logic here, such as "$OUT[1]=FALSE"
   halt

After the robot moves to the placing waypoint, you can set a DO (such as “$OUT[1]=FALSE”) so that the robot releases the object with its tool. Set the DO based on the actual situation.

halt pauses the program execution. Once you have added a statement to set the DO, you can delete the halt statement here.

Move the robot to the departure waypoint of placing

   ;move to departure waypoint of placing
LIN drop_app Vel=1 m/s CPDAT4 Tool[1] Base[0]

The robot moves from the placing waypoint to the departure waypoint of placing (drop_app).

  • Adding departure waypoints of placing can prevent the robot from colliding with objects (such as bins) in the scene when moving.

  • You need to teach the departure waypoint of placing in advance. For information about how to teach the waypoint, see Teach Calibration Start Point in the calibration document.

Move to the home position

   ;move back to robot home position
PTP HOME Vel=100 % DEFAULT

The robot moves from the departure waypoint of placing back to the home position.

You should teach the home position in advance. For detailed instructions, see Teach Calibration Start Point in the calibration document.
