Humanoid robot's visual imitation of 3-D motion of a human subject using neural-network-based inverse kinematics

Chih-Lyang Hwang, Bo-Lin Chen, Huei-Ting Syu, Chao-Kuei Wang, Mansour Karkoub

Research output: Contribution to journal › Article

9 Citations (Scopus)

Abstract

A sequence of 3-D motion images of a human subject facing an imitator [i.e., a humanoid robot (HR)] is captured using a stereo-vision system installed on the HR. After acquiring enough motion sequences via image ratio and background registration, seven feature points of the human (i.e., the head, two arm tips, two leg tips, and two elbows) are estimated. Using five of these feature points (i.e., the head, two arm tips, and two leg tips), key posture frames are extracted from the recorded videos. To ensure a stable imitation motion of the HR, the human motions of the lower body (LB) and the upper body (UB) are imitated separately. All 11 stable motions for the LB are classified by motion direction, motion state, and support phase, whereas only the two arm tips are employed to imitate the motions of the UB. Owing to the disadvantages of traditional inverse kinematics (IK), a neural-network-based IK is developed to reduce the computation time and improve the accuracy of the system's time response. Finally, the combined motions of the LB and the UB, with suitable interpolation and time intervals, are applied to imitate the 3-D motion of the human. The corresponding experimental results confirm the efficacy of the proposed technique.
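The neural-network-based IK mentioned above learns the mapping from end-effector position to joint angles offline, so no iterative IK solver is needed at run time. The following is a minimal sketch of that idea for a hypothetical two-link planar arm trained with a small multilayer network; the link lengths, network size, and training setup are assumptions for illustration, not the paper's HR model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-link planar arm (link lengths are assumptions)
LINK1, LINK2 = 1.0, 1.0

def fk(theta):
    """Forward kinematics: joint angles (N, 2) -> end-effector (x, y)."""
    t1, t2 = theta[:, 0], theta[:, 1]
    x = LINK1 * np.cos(t1) + LINK2 * np.cos(t1 + t2)
    y = LINK1 * np.sin(t1) + LINK2 * np.sin(t1 + t2)
    return np.stack([x, y], axis=1)

# Training data: sample one elbow-up branch so the position -> angle
# mapping is single-valued, then compute the reachable positions.
theta = rng.uniform([0.1, 0.1], [np.pi / 2, np.pi / 2], size=(2000, 2))
pos = fk(theta)

# One-hidden-layer MLP (2 -> 32 -> 2) trained by full-batch gradient descent
W1 = rng.normal(0, 0.5, (2, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 2)); b2 = np.zeros(2)
lr = 0.05

losses = []
for step in range(2000):
    h = np.tanh(pos @ W1 + b1)          # hidden layer
    out = h @ W2 + b2                   # predicted joint angles
    err = out - theta
    losses.append(float(np.mean(err ** 2)))
    # Backprop through both layers
    dW2 = h.T @ err / len(pos); db2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)    # tanh derivative
    dW1 = pos.T @ dh / len(pos); db1 = dh.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"MSE on joint angles: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Once trained, evaluating the network is a fixed-cost matrix multiply, which is the source of the computation-time advantage over iterative IK claimed in the abstract.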

Original language: English
Article number: 6913510
Pages (from-to): 685-696
Number of pages: 12
Journal: IEEE Systems Journal
Volume: 10
Issue number: 2
DOIs
Publication status: Published - 1 Jun 2016

Keywords

  • Humanoid robot (HR)
  • Motion detection
  • Multilayer neural network (MLNN) modeling
  • Posture estimation
  • Stereo vision for 3-D localization
  • Visual imitation

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Electrical and Electronic Engineering
