Paper Title

Concurrent Crossmodal Feedback Assists Target-searching: Displaying Distance Information Through Visual, Auditory and Haptic Modalities

Authors

Feng Feng, Tony Stockman

Abstract

Humans' sense of distance depends on the integration of multisensory cues. Incoming visual luminance, auditory pitch and tactile vibration can all contribute to the ability to judge distance. This ability can be enhanced if the multimodal cues are associated in a congruent manner, a phenomenon referred to as crossmodal correspondence. In the context of multisensory interaction, whether and how such correspondences influence information processing under continuous motor engagement, particularly in target-searching activities, has rarely been investigated. This paper presents an experimental user study to address this question. We built a tabletop-based target-searching application that displayed unimodal and crossmodal distance cues concurrently in response to people's searching movements, and measured task performance through kinematic evaluation. We found that the crossmodal display and the audio display led to improved searching efficiency and accuracy. More interestingly, this improvement was confirmed by kinematic analysis, which also unveiled the underlying movement features that could account for it. We discuss how these findings could shed light on the design of assistive technology and of other multisensory interactions.
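The abstract describes distance cues that are rendered concurrently as visual luminance, auditory pitch and tactile vibration while the user's hand searches for a target. The minimal Python sketch below illustrates one possible congruent mapping from hand-to-target distance to the three cue parameters; the function name `distance_cues`, the parameter ranges and the linear mappings are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: one way a congruent crossmodal distance display could
# map hand-to-target distance onto visual, auditory and haptic cue parameters,
# so that all three cues change in the same direction as the hand approaches.

def distance_cues(distance: float, max_distance: float) -> dict:
    """Map a hand-to-target distance to congruent cue parameters.

    `distance` and `max_distance` are in the same unit (e.g. millimetres on
    the tabletop surface); the specific ranges below are assumptions.
    """
    # Normalise so that 0.0 = on target, 1.0 = farthest away.
    d = max(0.0, min(distance / max_distance, 1.0))
    proximity = 1.0 - d  # grows as the hand gets closer to the target

    return {
        # Visual: brighter as the hand nears the target (0-255 grey level).
        "luminance": int(round(255 * proximity)),
        # Auditory: pitch rises from 220 Hz to 880 Hz with proximity.
        "pitch_hz": 220.0 + (880.0 - 220.0) * proximity,
        # Haptic: vibration amplitude grows from 0 (off) to 1 (full).
        "vibration": proximity,
    }

# Example: hand is halfway to the target on a 400 mm search area.
print(distance_cues(200.0, 400.0))
# {'luminance': 128, 'pitch_hz': 550.0, 'vibration': 0.5}
```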
