Table of Contents

[1] YUE Hengtao, ZHAO Kangkang, WU Songlin, et al. Robot hand-eye calibration in different spaces based on LSTM network[J]. Journal of Wuhan Institute of Technology, 2024, 46(05): 574-578. [doi:10.19843/j.cnki.CN42-1779/TQ.202401027]

Robot Hand-Eye Calibration in Different Spaces Based on LSTM Network

Journal of Wuhan Institute of Technology [ISSN:1674-2869/CN:42-1779/TQ]

Volume:
46
Issue:
2024, No. 05
Pages:
574-578
Section:
Mechanical, Electrical and Information Engineering
Publication Date:
2024-10-28

Article Info

Title:
Robot hand-eye calibration in different spaces based on LSTM network
Article Number:
1674-2869(2024)05-0574-05
Author(s):
YUE Hengtao1,2, ZHAO Kangkang1,2, WU Songlin1,2, FU Zhongtao*1,2, CHEN Xubing1,2
1. School of Mechanical and Electrical Engineering, Wuhan Institute of Technology, Wuhan 430205, China;
2. Hubei Research Center of Intelligent Welding Equipment and Software Engineering Technology (Wuhan Institute of Technology), Wuhan 430205, China
Keywords:
robot hand-eye calibration; LSTM network; different spaces
CLC Number:
TP241
DOI:
10.19843/j.cnki.CN42-1779/TQ.202401027
Document Code:
A
Abstract:
To address the situation where the robot operating space and the camera field of view occupy different spaces, we propose a novel robot hand-eye calibration method based on a long short-term memory (LSTM) network. First, the pixel coordinates of each circle center on the calibration board are extracted and recorded in sequence. The calibration board is then translated into the robot workspace by a conveyor belt, and the pose of the robot end-effector stylus is recorded. Next, the LSTM network is trained on these data to obtain the hand-eye mapping. Finally, 36 sets of real data are collected as a validation set to assess prediction accuracy. The results show that the trained model predicts robot base-frame coordinates with an average translation error of only 0.69 mm; for validation data distributed randomly across the entire conveyor-belt workspace, the fluctuation of the translation error stays below 1 mm, confirming the robustness and effectiveness of the method. Compared with classical planar calibration methods, the proposed method offers a larger effective workspace and higher calibration accuracy, and effectively compensates for errors caused by camera lens distortion, depth variation, and other factors.
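The core idea in the abstract, a learned mapping from circle-center pixel coordinates to robot base-frame coordinates via an LSTM, can be sketched with a single NumPy LSTM cell and a linear readout. Everything below (dimensions, random weights, the 36-sample toy sequence, the variable names) is an illustrative assumption for exposition, not the paper's actual network or training procedure, which the abstract does not specify in detail.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: gate pre-activations stacked as [input, forget, cell, output]."""
    z = W @ x + U @ h + b          # shape (4H,)
    H = h.shape[0]
    i = sigmoid(z[0:H])            # input gate
    f = sigmoid(z[H:2*H])          # forget gate
    g = np.tanh(z[2*H:3*H])        # candidate cell state
    o = sigmoid(z[3*H:4*H])        # output gate
    c_new = f * c + i * g          # updated cell state
    h_new = o * np.tanh(c_new)     # updated hidden state
    return h_new, c_new

rng = np.random.default_rng(0)
D, H = 2, 16                        # input dim (u, v pixel coords), hidden size (assumed)
W = rng.normal(0.0, 0.1, (4 * H, D))
U = rng.normal(0.0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
W_out = rng.normal(0.0, 0.1, (3, H))  # linear head -> (x, y, z) in the robot base frame

# Toy sequence standing in for the 36 recorded circle-center pixel coordinates.
seq = rng.uniform(0.0, 1.0, (36, D))
h = np.zeros(H)
c = np.zeros(H)
for x in seq:
    h, c = lstm_step(x, h, c, W, U, b)
pred_xyz = W_out @ h                # untrained prediction; training would fit W, U, b, W_out
```

In practice the weights would be fit by regressing predicted base-frame coordinates against the recorded end-effector stylus poses, and accuracy reported as the mean Euclidean translation error over the validation set (0.69 mm in the paper).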



Memo:
Received: 2024-01-31
Funding: Hubei Province Key R&D Program of Technological Innovation (2023BAB071); Wuhan East Lake High-Tech Development Zone "Open Competition" Project (2024KJB352)
First author: YUE Hengtao, master's student. Email: [email protected]
*Corresponding author: FU Zhongtao, PhD, associate professor. Email: [email protected]
Citation: YUE Hengtao, ZHAO Kangkang, WU Songlin, et al. Robot hand-eye calibration in different spaces based on LSTM network[J]. Journal of Wuhan Institute of Technology, 2024, 46(5): 574-578.
Last Update: 2024-10-26