Publisher
ASSOC COMPUTING MACHINERY
DOI: 10.1145/3183654.3183691
Keywords
Voice; Human-Machine Interaction; trust; behavioural measures
Funding
- Marie Sklodowska-Curie Actions [CogNovo FP7-PEOPLE-2013-ITN604764]
- European Union [713567]
- ADAPT Centre for Digital Content Technology - SFI Research Centres Programme [13/RC/2016]
- European Regional Development Fund
Abstract
Societies rely on trustworthy communication in order to function, and the need for trust clearly extends to human-machine communication. Therefore, it is essential to design machines to elicit trust, so as to make interactions with them acceptable and successful. However, while there is a substantial literature on first impressions of trustworthiness based on various characteristics, including voice, not much is known about the trust development process. Are first impressions maintained over time? Or are they influenced by the experience of an agent's behaviour? We addressed these questions in three experiments using the iterated investment game, a methodology derived from game theory that allows implicit measures of trust to be collected over time. Participants played the game with various agents having different voices: in the first experiment, participants played with a computer agent that had either a Standard Southern British English accent or a Liverpool accent; in the second experiment, they played with a computer agent that had either an SSBE or a Birmingham accent; in the third experiment, they played with a robot that had either a natural or a synthetic voice. All these agents behaved either trustworthily or untrustworthily. In all three experiments, participants trusted the agent with one voice more when it was trustworthy, and the agent with the other voice more when it was untrustworthy. This suggests that participants might change their trusting behaviour based on the congruency of the agent's behaviour with the participant's first impression. Implications for human-machine interaction design are discussed.