ArmSym: a virtual human-robot interaction laboratory for assistive robotics

Samuel Bustamante, Jan Peters, Bernhard Schölkopf, Moritz Grosse-Wentrup, Vinay Jayaram

Publications: Contribution to journal · Article · Peer reviewed


Research in human–robot interaction for assistive robotics usually presents many technical challenges for experimenters, forcing researchers to split their time between solving technical problems and conducting experiments. In addition, previous work on virtual reality setups tends to focus on a single assistive robotics application. To alleviate these problems, we present ArmSym, a virtual reality laboratory with a fully simulated, developer-friendly robot arm. The system is intended as a testbed for a wide range of experiments on human control of a robotic arm in a realistic environment, from an upper limb prosthesis to a wheelchair-mounted robotic manipulator. To highlight the possibilities of this system, we perform a study comparing different prosthetic control types. Working with nonimpaired subjects, we study psychological metrics that evaluate the interaction of the user with the robot under different control conditions. Subjects report a perception of embodiment in the absence of realistic cutaneous touch, supporting previous studies on the topic. We also find interesting correlations between control and perceived ease of use. Overall, our results confirm that ArmSym can be used to gather data from immersive prosthetics experiences, opening the door to closer collaboration between device engineers and experience designers in the future.
Pages (from–to): 568-577
Journal: IEEE Transactions on Human-Machine Systems
Publication status: Published - Dec. 2021

ÖFOS 2012

  • 102026 Virtual Reality