
Student: Tang, Ching-Hao (湯景皓)
Thesis Title: A Data-driven Approach to Motion Imitation between Dissimilar Robotic Arms by Table Lookup (基於查表法的異質機器手臂動作模仿)
Advisor: King, Chung-Ta (金仲達)
Committee Members: Chang, Jen-Yuan (張禎元); Chu, Tsung-Hsien (朱宗賢); Liou, Jing-Jia (劉靖家)
Degree: Master
Department: Department of Computer Science, College of Electrical Engineering and Computer Science
Year of Publication: 2022
Academic Year of Graduation: 110
Language: English
Number of Pages: 32
Keywords (Chinese): 機器手臂, 動作模仿, 迭代, 加速, 查找表
Keywords (English): robotic arm, motion imitation, iterative, accelerate, look-up table
This thesis proposes an imitation algorithm between different robotic arms, enabling one arm to imitate the motion of another, dissimilar arm. By motion imitation we mean that (1) the end-effector of the imitator arm follows the same trajectory as the end-effector of the demonstrator arm, and (2) the imitator's pose is as similar to the demonstrator's as possible. Such a technique is important when, for example, we want one robotic arm to perform an operation that another robotic arm has already learned.
    Goal (1) can be achieved with an inverse-kinematics procedure: the points on the demonstrator's end-effector trajectory serve as targets, from which inverse kinematics computes suitable joint parameters for the imitator.
    Goal (2) is more challenging. Intuitively, the more redundant degrees of freedom a robotic arm has, the better it should be able to imitate another arm. Unfortunately, apart from conversion routines hand-crafted by experts, this has so far relied on general-purpose optimization algorithms, which typically need many iterations to find suitable joint parameters.
    In this thesis, we propose a lookup-table-based algorithm for this imitation task. For a given robotic arm, data can be collected automatically to build the lookup table, after which the arm can imitate any other arm according to a given similarity metric. This makes motion imitation between different robotic arms systematic. We evaluate and test the proposed method with two dissimilar robotic arms.
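The table construction described above can be pictured roughly as follows. This is a minimal sketch only: it assumes a toy 7-DoF serial arm with a hand-written forward-kinematics routine, uses `scipy.spatial.cKDTree` as the nearest-neighbour index, and the names `forward_kinematics`, `build_lookup_table`, the random sampling scheme, and the link geometry are illustrative choices rather than the thesis's implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

# Toy forward kinematics for illustration: a 7-DoF serial arm whose revolute
# joints alternate about the z- and y-axes, each followed by a unit-length
# link along the local x-axis.  A real arm would use its own kinematic model
# or a simulator such as Webots.
def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def forward_kinematics(q):
    """Return the base and each link-tip position, shape (8, 3); row -1 is the end-effector."""
    R, p, points = np.eye(3), np.zeros(3), [np.zeros(3)]
    for i, angle in enumerate(q):
        R = R @ (rot_z(angle) if i % 2 == 0 else rot_y(angle))
        p = p + R @ np.array([1.0, 0.0, 0.0])   # advance one unit link
        points.append(p)
    return np.array(points)

def build_lookup_table(num_samples, joint_limits, seed=0):
    """Sample the imitator's joint space once and index the resulting
    end-effector positions with a kd-tree for nearest-neighbour queries."""
    rng = np.random.default_rng(seed)
    low, high = joint_limits[:, 0], joint_limits[:, 1]
    joints = rng.uniform(low, high, size=(num_samples, len(low)))
    link_pts = np.stack([forward_kinematics(q) for q in joints])   # (N, 8, 3)
    tree = cKDTree(link_pts[:, -1, :])   # keyed on reachable end-effector positions
    return tree, joints, link_pts

# Build the table once for the imitator arm (here: 20k random joint samples).
limits = np.tile([[-np.pi, np.pi]], (7, 1))
tree, joints, link_pts = build_lookup_table(20_000, limits)
```

Because the table depends only on the imitator's own kinematics, this offline step is done once per arm and can be reused for any demonstrator.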


    This thesis presents a solution to allow a robotic arm to imitate the motion of another, dissimilar arm. By motion imitation, we mean that (1) the end-effector of the imitator arm follows the same trajectory as that of the demonstrator arm and (2) its pose is as similar to that of the demonstrator as possible. Such a technique is important, for example, if we want one robotic arm to perform operations that have been learned by another robotic arm. Requirement (1) can easily be satisfied by solving the inverse kinematics (IK) of the points on the demonstrator's motion trajectory for the imitator arm, assuming the end-effector of the imitator can reach all the points on the demonstrator's trajectory. Requirement (2) is more challenging. Intuitively, the more redundant degrees of freedom (DoFs) a robotic arm has, the better it can imitate the pose of another arm. Most previous works solve the problem by adding extra constraints to the IK formulation, which necessarily increases the complexity and execution time of the solution. In this thesis, we propose a table-lookup method to speed up the optimization iterations. The table only needs to be built once, after the robotic arm is designed, and can then be used to imitate the poses of any arm according to a given similarity metric, assuming the imitator has redundant DoFs. With the table, motion imitation of dissimilar robotic arms can be done outside of the IK formulation. The effectiveness of the proposed approach is evaluated with two dissimilar 7-DoF robotic arms.
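Continuing the sketch above, one plausible way to use such a table at imitation time is: for each point on the demonstrator's trajectory, retrieve the k stored configurations whose end-effector lands closest to that point and keep the one whose pose best matches the demonstrator. The `pose_similarity` score below (mean distance between corresponding, reach-normalised link points, assuming both arms are described by the same number of sampled link points) is only an illustrative stand-in for the thesis's metric, and `imitate_point` reuses `tree`, `joints`, and `link_pts` returned by `build_lookup_table` in the previous sketch.

```python
def pose_similarity(imitator_pts, demo_pts):
    """Toy similarity score: mean distance between corresponding link points
    after normalising each chain by its total reach (smaller = more similar)."""
    a = imitator_pts / np.sum(np.linalg.norm(np.diff(imitator_pts, axis=0), axis=1))
    b = demo_pts / np.sum(np.linalg.norm(np.diff(demo_pts, axis=0), axis=1))
    return float(np.mean(np.linalg.norm(a - b, axis=1)))

def imitate_point(target_pos, demo_pts, tree, joints, link_pts, k=50):
    """Among the k table entries whose end-effector is nearest to the
    demonstrator's target point, pick the joint configuration with the most
    similar pose; the result can then seed a conventional IK solve so the
    end-effector reaches the target exactly."""
    _, idx = tree.query(target_pos, k=k)
    best = min(idx, key=lambda i: pose_similarity(link_pts[i], demo_pts))
    return joints[best]
```

Applying `imitate_point` to every point of the demonstrator's end-effector trajectory yields a joint trajectory for the imitator; since the table is built once offline, the per-point cost is a single kd-tree query plus k similarity evaluations rather than a full iterative optimization.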

    Table of Contents:
    1 Introduction 1
    2 Related Work 5
    3 Method 7
    3.1 The Lookup Table 8
    3.2 Pose Similarity 10
    3.3 Static Pose Imitation 11
    3.4 Trajectory Generation 13
    4 Experiments 17
    4.1 Evaluation of Static Pose Imitation 17
    4.1.1 Reaching Target Position 18
    4.1.2 Imitating Pose 22
    4.2 Evaluation of Motion Imitation 24
    5 Conclusion 29
    References 31

