EMG and IMU based real-time HCI using dynamic hand gestures for a multiple-DoF robot arm

Sungtae Shin, Reza Tafreshi, Reza Langari

Research output: Contribution to journal › Article

1 Citation (Scopus)


This study focuses on a myoelectric interface that controls a robotic manipulator via the neuromuscular electrical signals generated when humans make hand gestures. The proposed system recognizes, in real time, dynamic hand motions in which the shape, pose, and configuration of the hand change over time. Varying muscle force controls the activation/inactivation modes, and the gradient of the limb orientation gives the direction of movement of the robot arm. Classified dynamic motions are used to switch the control states of the HCI system. The performance of the myoelectric interface was measured in terms of real-time classification accuracy, path efficiency, and time-related measures, and its usability was also compared to that of a button-based jog interface. A total of sixteen human subjects participated. The average real-time classification accuracy of the myoelectric interface was over 95.6%. For the majority of subjects, the path efficiency of the myoelectric interface was similar to that of the jog interface, while the jog interface outperformed the myoelectric interface in the time-related measures. However, considering the overall advantages of the myoelectric interface, the decrease in time-related performance may be offset.
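As an illustration only (not the authors' implementation, which is not detailed in this abstract), the control logic described above can be sketched as follows: a standard EMG time-domain feature (mean absolute value) stands in for muscle force, a threshold on that feature toggles the activation/inactivation mode, and the sign of the limb-orientation gradient gives the movement direction. The function names, threshold value, and toy signals here are all assumptions for demonstration.

```python
import numpy as np

def mav(emg_window):
    """Mean absolute value of one EMG window, a common proxy for muscle force."""
    return np.mean(np.abs(emg_window))

def is_active(emg_window, threshold=0.1):
    """Activation/inactivation mode from varying muscle force (assumed threshold)."""
    return mav(emg_window) > threshold

def direction_from_orientation(orientation_angles):
    """Sign of the limb-orientation gradient gives the robot-arm movement direction."""
    grad = np.gradient(orientation_angles)
    return np.sign(grad[-1])

# Toy signals standing in for sensor data
rest = np.full(200, 0.01)                                  # low-amplitude EMG
contraction = 0.5 * np.sin(np.linspace(0, 20 * np.pi, 200))  # strong EMG burst
tilt_up = np.linspace(0.0, 0.3, 50)                        # limb pitching upward

print(is_active(rest))                     # False: below activation threshold
print(is_active(contraction))              # True: muscle force above threshold
print(direction_from_orientation(tilt_up)) # 1.0: positive movement direction
```

In the study, a gesture classifier additionally switches the control state of the interface; the sketch above covers only the activation and direction components.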

Original language: English
Pages (from-to): 861-876
Number of pages: 16
Journal: Journal of Intelligent and Fuzzy Systems
Issue number: 1
Publication status: Published - 1 Jan 2018



Keywords

  • Electromyography (EMG)
  • Gesture recognition
  • Human-computer interaction (HCI)
  • Myoelectric classification

ASJC Scopus subject areas

  • Statistics and Probability
  • Engineering(all)
  • Artificial Intelligence
