Minimally invasive surgery is a cost-effective alternative to traditional open surgery, offering patients less pain, less trauma, and shorter recovery times. Nevertheless, it demands new hand-eye coordination skills and substantial training from surgeons. Virtual reality training simulators with visual and touch-sensation feedback are being developed to improve surgeons’ proficiency. Research has shown that update rates of 1 kHz or more are desired for force/tactile feedback when interacting with mechanics-based virtual models. We propose a mathematical model of a novel 4-DOF haptic interface for such training applications, and we develop and study a novel desktop computational platform for force control of the device. This architecture can serve as a distributed system for tele-operation over the Internet and for haptic rendering of deformable objects built from medical imaging data. Through analysis of the experimental setup, we identify key parameters for the synthesis of a future tele-operation environment.
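The 1 kHz requirement above can be illustrated with a minimal sketch of a fixed-rate haptic servo loop. This is a hypothetical illustration, not the platform described in the work: `spring_force` is an assumed 1-DOF virtual-wall model, and `read_position`/`send_force` stand in for whatever device I/O the real system uses.

```python
import time

def spring_force(position, stiffness=200.0):
    """Hypothetical 1-DOF virtual-wall model: F = -k * penetration (wall at x = 0)."""
    penetration = max(0.0, position)
    return -stiffness * penetration

def run_haptic_loop(read_position, send_force, rate_hz=1000, steps=1000):
    """Fixed-rate servo loop: read device state, compute force, command the device.

    Sleeps against an absolute deadline so timing error does not accumulate,
    which matters when the loop must sustain ~1 kHz for stable force feedback.
    """
    period = 1.0 / rate_hz
    next_tick = time.perf_counter()
    for _ in range(steps):
        x = read_position()            # device position (placeholder callback)
        send_force(spring_force(x))    # commanded force (placeholder callback)
        next_tick += period
        remaining = next_tick - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
```

At 1 kHz the loop has a 1 ms budget per iteration, which is why mechanics-based deformable models are often split across a fast local force loop and a slower remote simulation, as in the distributed architecture described above.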
Copyright is held by the author.