EMG and IMU based real-time HCI using dynamic hand gestures for a multiple-DoF robot arm

Sungtae Shin, Reza Tafreshi, Reza Langari

Research output: Contribution to journal › Article

Abstract

This study focuses on a myoelectric interface that controls a robotic manipulator via the neuromuscular electrical signals generated when humans make hand gestures. The proposed system recognizes, in real time, dynamic hand motions in which the shape, pose, and configuration of the hand change over time. Varying muscle force controls the activation/inactivation modes, and gradients of the limb orientation give the directions of movement of the robot arm. Classified dynamic motions are used to change the control states of the HCI system. The performance of the myoelectric interface was measured in terms of real-time classification accuracy, path efficiency, and time-related measures, and its usability was compared to that of a button-based jog interface. A total of sixteen human subjects participated. The average real-time classification accuracy of the myoelectric interface was over 95.6%. For the majority of the subjects, the path efficiency of the myoelectric interface was similar to that of the jog interface, while the jog interface outperformed the myoelectric interface on the time-related measures. However, considering the overall advantages of the myoelectric interface, the decrease in time-related performance may be offset.
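
To make the control scheme described above concrete, the following minimal Python sketch illustrates one way such an interface could be structured. It is an illustration only, not the authors' implementation: the use of RMS amplitude as a proxy for muscle force, the on/off thresholds with hysteresis, the gain, and the gesture and control-state names are all assumptions introduced for the example.

import numpy as np

# Hypothetical constants; the paper's actual values are not given in the abstract.
EMG_ON_THRESHOLD = 0.15   # normalized EMG RMS level that activates the arm
EMG_OFF_THRESHOLD = 0.05  # level below which the arm is deactivated (hysteresis)
SPEED_GAIN = 0.5          # maps orientation change to an end-effector command

def emg_rms(window):
    """Root-mean-square amplitude of one EMG window (channels x samples)."""
    return float(np.sqrt(np.mean(np.square(window))))

class MyoelectricInterface:
    """Toy state machine in the spirit of the abstract: EMG amplitude gates
    activation, classified dynamic gestures switch control states, and the
    gradient of the IMU orientation sets the commanded direction."""

    def __init__(self):
        self.active = False
        self.control_state = "translate"      # assumed state names
        self.prev_orientation = np.zeros(3)

    def update(self, emg_window, orientation_rpy, gesture_label=None):
        # 1) Varying muscle force toggles activation/inactivation.
        level = emg_rms(emg_window)
        if not self.active and level > EMG_ON_THRESHOLD:
            self.active = True
        elif self.active and level < EMG_OFF_THRESHOLD:
            self.active = False

        # 2) A classified dynamic gesture changes the control state.
        if gesture_label == "state_switch":   # hypothetical gesture label
            self.control_state = "rotate" if self.control_state == "translate" else "translate"

        # 3) The gradient of the limb orientation gives the movement direction.
        orientation = np.asarray(orientation_rpy, dtype=float)
        delta = orientation - self.prev_orientation
        self.prev_orientation = orientation
        command = SPEED_GAIN * delta if self.active else np.zeros(3)
        return self.control_state, command

Path efficiency, one of the reported performance measures, is commonly computed as the straight-line distance between the start and end points of a movement divided by the length of the path the end effector actually traveled, so that 1.0 indicates a perfectly direct path; a sketch under that assumption:

def path_efficiency(path):
    """path: (N, 3) array of end-effector positions sampled over one trial."""
    path = np.asarray(path, dtype=float)
    straight = np.linalg.norm(path[-1] - path[0])
    traveled = float(np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1)))
    return float(straight / traveled) if traveled > 0 else 0.0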

Original language: English
Pages (from-to): 861-876
Number of pages: 16
Journal: Journal of Intelligent and Fuzzy Systems
Volume: 35
Issue number: 1
DOIs: 10.3233/JIFS-171562
Publication status: Published - 1 Jan 2018

Keywords

  • electromyography (EMG)
  • gesture recognition
  • Human computer interaction (HCI)
  • myoelectric classification

ASJC Scopus subject areas

  • Statistics and Probability
  • Engineering (all)
  • Artificial Intelligence

Cite this

EMG and IMU based real-time HCI using dynamic hand gestures for a multiple-DoF robot arm. / Shin, Sungtae; Tafreshi, Reza; Langari, Reza.

In: Journal of Intelligent and Fuzzy Systems, Vol. 35, No. 1, 01.01.2018, p. 861-876.

Research output: Contribution to journal › Article

@article{b2432bd451fa42e889b99cf5b500c9b5,
title = "EMG and IMU based real-time HCI using dynamic hand gestures for a multiple-DoF robot arm",
abstract = "This study focuses on a myoelectric interface that controls a robotic manipulator via neuromuscular electrical signals generated when humans make hand gestures. The proposed system recognizes dynamic hand motions, which change shapes, poses, and configuration of a hand over time, in real-time. Varying muscle forces controls the activation/inactivation modes. Gradients of a limb orientation give directions of movements of the robot arm. Classified dynamic motions are used to change the control states of the HCI system. The performance of the myoelectric interface was measured in terms of real-time classification accuracy, path efficiency, and time-related measures. The usability of the developed myoelectric interface was also compared to a button-based jog interface. A total of sixteen human subjects were participated. The average real-time classification accuracy of the myoelectric interface was over 95.6{\%}. The path efficiency of the myoelectric interface of the majority of the subjects showed similar performance to that of the jog interface. The results of the jog interface in the time-measures outperformed the results of the myoelectric interface. However, with the consideration of the overall advantages of the myoelectric interface, the decrease in the time-related performances may be offset.",
keywords = "electromyography (EMG), gesture recognition, Human computer interaction (HCI), myoelectric classification",
author = "Sungtae Shin and Reza Tafreshi and Reza Langari",
year = "2018",
month = "1",
day = "1",
doi = "10.3233/JIFS-171562",
language = "English",
volume = "35",
pages = "861--876",
journal = "Journal of Intelligent and Fuzzy Systems",
issn = "1064-1246",
publisher = "IOS Press",
number = "1",

}
