Date created
2022-04-01
Authors/Contributors
Author (aut): Zakia, Umme
Author (aut): Menon, Carlo
Abstract
In this study, human-robot collaboration (HRC) via the force myography (FMG) bio-signal was investigated. Interactive hand force was estimated while moving a wooden rod in 3D with a Kuka robot. A baseline FMG-based deep convolutional neural network (FMG-DCNN) model could only moderately estimate the applied forces during the HRC task. Model performance could be improved with additional training data; however, collecting such data was impractical and time-consuming. Multiple-source data (32 feature spaces) previously collected over a long period during human-robot interaction (HRI) with a linear robot were available and potentially useful. Therefore, we explored a cross-domain generalization (CDG) technique that, for the first time, allowed pretraining a model to transfer knowledge between unrelated source (2D-HRI) and target (3D-HRC) domains. An FMG-based transfer learning with CDG (TL-CDG) model trained on these multiple source domains was examined for estimating applied forces from 16-channel FMG data during interactions with the Kuka robot. Two target scenarios were evaluated: case i) a collaborative task of moving the wooden rod in 3D, and case ii) grasping interactions in 1D. In both cases, a few calibration samples fine-tuned the TL-CDG model and improved recognition of out-of-domain target data (case i: R2 ≈ 60-63%; case ii: R2 ≈ 79-87%) compared to the baseline FMG-DCNN model. Hence, cross-domain generalization could be useful in platform-independent FMG-based HRI applications.
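The abstract summarizes a pretrain-then-fine-tune workflow: pretrain a force regressor on long-term multi-source HRI data, then calibrate it with a few target-domain FMG samples. The sketch below is only a rough illustration of that general pattern, not the authors' implementation; the network layers, data shapes, and training settings are hypothetical, and the data here are synthetic placeholders.

```python
# Illustrative sketch (assumed architecture and shapes, synthetic data):
# pretrain a small 1-D CNN force regressor on source-domain data, then
# fine-tune only the output head on a few target calibration samples.
import torch
import torch.nn as nn


class FMGRegressor(nn.Module):
    def __init__(self, n_outputs: int = 3):
        super().__init__()
        # Feature extractor shared between source pretraining and target fine-tuning.
        # Adaptive pooling makes it agnostic to the number of FMG channels (32 vs 16).
        self.features = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Regression head for the estimated force components.
        self.head = nn.Linear(64, n_outputs)

    def forward(self, x):  # x: (batch, 1, n_channels)
        z = self.features(x).squeeze(-1)
        return self.head(z)


def train(model, x, y, epochs=50, lr=1e-3, params=None):
    # Optimize either all parameters (pretraining) or a subset (fine-tuning).
    opt = torch.optim.Adam(params if params is not None else model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return model


# Pretraining on source-domain data (stand-in for the 32-feature HRI recordings).
x_src, y_src = torch.randn(512, 1, 32), torch.randn(512, 3)
model = train(FMGRegressor(), x_src, y_src)

# Fine-tuning on a few 16-channel target calibration samples: only the head is updated.
x_cal, y_cal = torch.randn(32, 1, 16), torch.randn(32, 3)
model = train(model, x_cal, y_cal, epochs=20, lr=1e-4, params=model.head.parameters())
```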
Peer reviewed?
Yes
Funder
Funder (spn): Canada Research Chairs Program (CRC)
Language
English