Abstract
This work investigates how visual and spoken cues of virtual agents interact to affect user perception of agent trustworthiness. It is directly motivated by practical applications, such as an assistive robot companion for the elderly or homebound, or a virtual agent that can provide psychological assessment and treatment for individuals with mental health challenges. Such technologies have the capacity to assist human users in impactful ways, but without human trust in these systems, adoption and usage will remain severely limited. Our findings reveal strong correlations between both visual and auditory features and perceived trustworthiness. This underscores the importance of incorporating a comprehensive range of nonverbal cues and auditory signals into interface design.
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 6th Conference on ACM Conversational User Interfaces, CUI 2024 |
| Publisher | Association for Computing Machinery, Inc |
| ISBN (Electronic) | 9798400705113 |
| DOIs | |
| State | Published - 8 Jul 2024 |
| Externally published | Yes |
| Event | 6th Conference on ACM Conversational User Interfaces, CUI 2024 - Luxembourg City, Luxembourg<br>Duration: 8 Jul 2024 → 10 Jul 2024 |
Publication series
| Name | Proceedings of the 6th Conference on ACM Conversational User Interfaces, CUI 2024 |
|---|
Conference
| Conference | 6th Conference on ACM Conversational User Interfaces, CUI 2024 |
|---|---|
| Country/Territory | Luxembourg |
| City | Luxembourg City |
| Period | 8/07/24 → 10/07/24 |
Bibliographical note
Publisher Copyright: © 2024 Owner/Author.
Keywords
- Auditory
- Conversational Agents
- Human Perception
- Multimodal
- Trust Cues
- Visual