A Taxonomy of Social Cues for Conversational Agents

When using the taxonomy, please cite: Feine, J., Gnewuch, U., Morana, S., & Maedche, A. (2019). A Taxonomy of Social Cues for Conversational Agents. International Journal of Human-Computer Studies.

Social cue: Degree of human-likeness
Communication system: Visual
Cue category: Appearance
Cue Description
The appearance of the CA ranges from highly photorealistic to cartoon-like.
Cue example
A real photograph of a human being; a cartoon figure.
Cue impact
The degree of anthropomorphism affects the perceived credibility and likeability of the agent (Nowak 2004; Forlizzi et al. 2007) as well as trust resilience, i.e., resistance to breakdowns in trust (de Visser et al. 2016). Human visualizations increase the number of emotions users recognize in the agent (Beer et al. 2015) and also influence self-awareness, which in turn affects the amount of self-disclosure (Sah & Peng 2015). Speakers used significantly fewer words referring to physical appearance when talking to a robot-looking agent (Brahnam & De Angeli 2012) or a symbolic-looking agent (Knijnenburg & Willemsen 2016). The degree of anthropomorphism should also match the agent's role (Trovato et al. 2015; Keeling et al. 2004): robot-like and iconic images are more useful for representing an AI-based service agent, whereas human images signal a human service agent (Wuenderlich & Paluch 2017). Furthermore, anthropomorphized agents can backfire on user engagement (Kim et al. 2016), and not every appearance can be matched to every voice (Mersiol et al. 2002; Louwerse et al. 2005). Finally, not all studies find differences in social responses between human and symbolic appearances (Catrambone et al. 2002; Verhagen et al. 2014), and users differ in their preferences for partly versus fully embodied agents (McBreen & Jack 2001; Cowell & Stanney 2005).
Reference List
1. Beer, J. M., Smarr, C.-A., Fisk, A. D., & Rogers, W. A. (2015). Younger and older users’ recognition of virtual agent facial expressions. International Journal of Human-Computer Studies (75), pp. 1-20.
2. Brahnam, S., & De Angeli, A. (2012). Gender affordances of conversational agents. Interacting with Computers (24:3), pp. 139-153.
3. Catrambone, R., Stasko, J., & Xiao, J. (2002). Anthropomorphic agents as a user interface paradigm: Experimental findings and a framework for research. Proceedings of the Cognitive Science Society (24:24).
4. Cowell, A. J., & Stanney, K. M. (2005). Manipulation of non-verbal interaction style and demographic embodiment to increase anthropomorphic computer character credibility. International Journal of Human-Computer Studies (62:2), pp. 281-306.
5. Forlizzi, J., Zimmerman, J., Mancuso, V., & Kwak, S. (2007). How Interface Agents Affect Interaction Between Humans and Computers. In DPPI ’07: Proceedings of the 2007 Conference on Designing Pleasurable Products and Interfaces (pp. 209-221). New York, NY, USA: ACM.
6. Keeling, K., Beatty, S., McGoldrick, P., & Macaulay, L. (2004). Face value? Customer views of appropriate formats for embodied conversational agents (ECAs) in online retailing. Proceedings of the 37th Annual Hawaii International Conference on System Sciences.
7. Kim, S., Chen, R. P., & Zhang, K. (2016). Anthropomorphized Helpers Undermine Autonomy and Enjoyment in Computer Games. Journal of Consumer Research (43:2), pp. 282-302.
8. Knijnenburg, B. P., & Willemsen, M. C. (2016). Inferring Capabilities of Intelligent Agents from Their External Traits. ACM Trans. Interact. Intell. Syst. (6:4), pp. 28:1-28:25, from http://doi.acm.org/10.1145/2963106.
9. Louwerse, M. M., Graesser, A. C., Lu, S. L., & Mitchell, H. H. (2005). Social cues in animated conversational agents. Applied Cognitive Psychology (19:6), pp. 693-704.
10. McBreen, H. M., & Jack, M. A. (2001). Evaluating humanoid synthetic agents in e-retail applications. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans (31:5), pp. 394-405.
11. Mersiol, M., Chateau, N., & Maffiolo, V. (2002). Talking heads: Which matching between faces and synthetic voices? Proceedings of the Fourth IEEE International Conference on Multimodal Interfaces.
12. Wuenderlich, N. V., & Paluch, S. (2017). A Nice and Friendly Chat with a Bot: User Perceptions of AI-Based Service Agents. ICIS 2017 Proceedings.
13. Nowak, K. (2004). The Influence of Anthropomorphism and Agency on Social Judgment in Virtual Environments. Journal of Computer-Mediated Communication (9).
14. Sah, Y. J., & Peng, W. (2015). Effects of visual and linguistic anthropomorphic cues on social perception, self-awareness, and information disclosure in a health website. Computers in Human Behavior (45), pp. 392-401.
15. Trovato, G., Ramos, J. G., Azevedo, H., Moroni, A., Magossi, S., Ishii, H., et al. (2015). “Olá, my name is Ana”: A study on Brazilians interacting with a receptionist robot. 2015 International Conference on Advanced Robotics (ICAR).
16. Verhagen, T., van Nes, J., Feldberg, F., & van Dolen, W. (2014). Virtual Customer Service Agents: Using Social Presence and Personalization to Shape Online Service Encounters. Journal of Computer-Mediated Communication (19:3), pp. 529-545.
17. Visser, E. J. de, Monfort, S. S., McKendrick, R., Smith, M. A. B., McKnight, P. E., Krueger, F., & Parasuraman, R. (2016). Almost Human: Anthropomorphism Increases Trust Resilience in Cognitive Agents. Journal of Experimental Psychology. Applied (22:3), pp. 331-349.
18. Pak, R., Fink, N., Price, M., Bass, B., & Sturre, L. (2012). Decision support aids with anthropomorphic characteristics influence trust and performance in younger and older adults. Ergonomics (55:9), pp. 1059-1072.
19. Cassell, J. (2000). Embodied conversational interface agents. Communications of the ACM (43:4), pp. 70-78.
20. Bailenson, J. N., & Yee, N. (2005). Digital chameleons: Automatic assimilation of nonverbal gestures in immersive virtual environments. Psychological Science (16:10), pp. 814-819.