Calibration of an Eye-in-Hand System Model for Visual Robot Manipulator Target Detection
Industrial robot grasping comprises grasp detection, trajectory planning, and control execution. It remains challenging for industrial robots to grasp as dexterously as a human arm. In traditional teach-pendant programming, control parameters are predetermined and the position and orientation of the grasping target must be fixed; such a system can perform simple tasks but offers little flexibility. A 3D vision task requires the vision system to understand the six-degree-of-freedom pose of the target object in space, that is, its three-degree-of-freedom translation and three-degree-of-freedom rotation. As industrial robots place ever greater emphasis on flexibility and intelligence, these two indicators have gradually become important standards for evaluating them. To address these issues, this research applies machine vision to the grasp detection stage, performs grasping based on visual feedback, implements an algorithm for deciding the best robot grasp posture, and plans the trajectory to accomplish the grasping task. To this end, a dynamic calibration approach for a robot manipulator using an eye-in-hand system model for visual robot target detection is proposed.
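As a concrete illustration of the eye-in-hand setup, the sketch below shows how the fixed camera-to-gripper transform can be estimated from paired robot and camera poses and then used to express a detected target pose in the robot base frame. This is a minimal sketch rather than the implementation described here: it assumes OpenCV's calibrateHandEye solver, and the helper names (calibrate_eye_in_hand, target_pose_in_base) and the pre-collected pose lists are hypothetical.

import numpy as np
import cv2


def calibrate_eye_in_hand(R_gripper2base, t_gripper2base,
                          R_target2cam, t_target2cam):
    """Solve the hand-eye equation AX = XB for the camera-to-gripper transform.

    R_gripper2base, t_gripper2base: per-station rotations/translations of the
        gripper frame expressed in the robot base frame (from forward kinematics).
    R_target2cam, t_target2cam: per-station rotations/translations of the
        calibration target expressed in the wrist-camera frame.
    """
    # OpenCV's solver; CALIB_HAND_EYE_TSAI is one of several available methods.
    R_cam2gripper, t_cam2gripper = cv2.calibrateHandEye(
        R_gripper2base, t_gripper2base,
        R_target2cam, t_target2cam,
        method=cv2.CALIB_HAND_EYE_TSAI)

    # Pack the result into a 4x4 homogeneous transform for later use.
    T_cam2gripper = np.eye(4)
    T_cam2gripper[:3, :3] = R_cam2gripper
    T_cam2gripper[:3, 3] = t_cam2gripper.ravel()
    return T_cam2gripper


def target_pose_in_base(T_gripper2base, T_cam2gripper, T_target2cam):
    """Chain 4x4 transforms to express a detected target pose in the base frame."""
    return T_gripper2base @ T_cam2gripper @ T_target2cam

Once the camera-to-gripper transform is calibrated, every target pose detected in the camera frame can be mapped into the base frame by chaining it with the current gripper pose, which is what allows the grasp posture and trajectory to be planned in robot coordinates.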