DOI: 10.1145/3308532.3329466

The Effects of Anthropomorphism and Non-verbal Social Behaviour in Virtual Assistants

Published: 01 July 2019

Abstract

The adoption of virtual assistants is growing at a rapid pace. However, these assistants are not optimised to simulate key social aspects of human conversational environments. Humans are intellectually biased toward social activity when facing anthropomorphic agents or when presented with subtle social cues. In this paper, we test whether humans respond in the same way to assistants in guided tasks when the assistants differ in embodiment and social behaviour. In a within-subject study (N=30), we asked subjects to engage in dialogue with a smart speaker and a social robot. We observed shifts in interactive behaviour, reflected in both behavioural and subjective measures. Our findings indicate that it is not always favourable for agents to be anthropomorphised or to communicate with non-verbal cues. We found a trade-off between task performance and perceived sociability when controlling for anthropomorphism and social behaviour.



Published In

IVA '19: Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents
July 2019
282 pages
ISBN:9781450366724
DOI:10.1145/3308532

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. conversational artificial intelligence
  2. empirical studies
  3. human-computer interaction
  4. smart speakers
  5. social robots

Qualifiers

  • Research-article


Acceptance Rates

IVA '19 paper acceptance rate: 15 of 63 submissions, 24%.
Overall acceptance rate: 53 of 196 submissions, 27%.

