Paper Title
Assisted Teleoperation in Changing Environments with a Mixture of Virtual Guides
Paper Authors
Paper Abstract
Haptic guidance is a powerful technique to combine the strengths of humans and autonomous systems for teleoperation. The autonomous system can provide haptic cues to enable the operator to perform precise movements; the operator can intervene in the autonomous system's plan, leveraging his/her superior cognitive capabilities. However, providing haptic cues such that the individual strengths are not impaired is challenging, because low forces provide little guidance, whereas strong forces can hinder the operator in realizing his/her plan. Based on variational inference, we learn a Gaussian mixture model (GMM) over trajectories to accomplish a given task. The learned GMM is used to construct a potential field which determines the haptic cues. The potential field smoothly changes during teleoperation based on our updated belief over the plans and their respective phases. Furthermore, new plans are learned online when the operator does not follow any of the proposed plans, or after changes in the environment. User studies confirm that our framework helps users perform teleoperation tasks more accurately than without haptic cues and, in some cases, faster. Moreover, we demonstrate the use of our framework to help a subject teleoperate a 7 DoF manipulator in a pick-and-place task.
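To illustrate the idea of deriving haptic cues from a learned GMM, the sketch below computes the gradient of the mixture's log-density, which could serve as the attractive force of a potential field of the form U(x) = -log p(x). This is a minimal illustration of that standard construction, not the paper's exact formulation; all function and variable names are hypothetical.

```python
import numpy as np

def gmm_log_density_grad(x, weights, means, covs):
    """Gradient of log p(x) for a Gaussian mixture.

    If the potential field is U(x) = -log p(x) (an assumed, common choice),
    the haptic cue force is proportional to this gradient, pulling the
    operator's hand toward high-density (demonstrated) regions.
    """
    x = np.asarray(x, dtype=float)
    resp = []   # unnormalized responsibilities: w_k * N(x; mu_k, Sigma_k)
    grads = []  # per-component gradients: d/dx log N_k(x) = -Sigma_k^{-1} (x - mu_k)
    for w, mu, cov in zip(weights, means, covs):
        diff = x - mu
        inv = np.linalg.inv(cov)
        density = w * np.exp(-0.5 * diff @ inv @ diff) / np.sqrt(
            (2 * np.pi) ** len(x) * np.linalg.det(cov))
        resp.append(density)
        grads.append(-inv @ diff)
    resp = np.array(resp)
    resp /= resp.sum()  # normalize responsibilities
    # grad log p(x) = sum_k responsibility_k * grad log N_k(x)
    return sum(r * g for r, g in zip(resp, grads))
```

Scaling this gradient by a stiffness gain would yield a force that is weak far from any plan and stronger near a demonstrated trajectory; blending the mixture weights with the belief over plans and phases, as the abstract describes, would then make the field change smoothly during teleoperation.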