How is communicative gesture behavior in robots perceived by humans? Although gesture is crucial in social interaction, this research question is still largely unexplored in the field of social robotics. Thus, the main objective of the present work is to investigate how gestural machine behaviors can be used to design more natural communication in social robots. The chosen approach is twofold. Firstly, the technical challenges encountered when implementing a speech-gesture generation model on a robotic platform are tackled. We present a framework that enables the humanoid robot to flexibly produce synthetic speech and co-verbal hand and arm gestures at run-time, while not being limited to a predefined repertoire of motor actions. Secondly, the achieved flexibility in robot gesture is exploited in controlled experiments. To gain a deeper understanding of how communicative robot gesture might impact and shape human perception and evaluation of human-robot interaction, we conducted a between-subjects experimental study using the humanoid robot in a joint task scenario. We manipulated the non-verbal behaviors of the robot in three experimental conditions, so that it would refer to objects by utilizing either (1) unimodal (i.e., speech only) utterances, (2) congruent multimodal (i.e., semantically matching speech and gesture) utterances, or (3) incongruent multimodal (i.e., semantically non-matching speech and gesture) utterances. Our findings reveal that the robot is evaluated more positively when non-verbal behaviors such as hand and arm gestures are displayed along with speech, even if they do not semantically match the spoken utterance.
Full text
Available for:
EMUNI, FIS, FZAB, GEOZS, GIS, IJS, IMTLJ, KILJ, KISLJ, MFDPS, NLZOH, NUK, OILJ, PNG, SAZU, SBCE, SBJE, SBMB, SBNM, UKNU, UL, UM, UPUK, VKSCE, ZAGLJ
The International Gesture Workshops (GW) are interdisciplinary events for those researching gesture-based communication across the disciplines. The focus of these events is a shared interest in understanding gestures and sign language in their many facets, and using them for advancing human-machine interaction. Since 1996, International Gesture Workshops have been held roughly every second year, with fully reviewed proceedings published by Springer. The International Gesture Workshop GW 2009 was hosted by Bielefeld University's Center for Interdisciplinary Research (ZiF - Zentrum für interdisziplinäre Forschung) during February 25-27, 2009. Like its predecessors, GW 2009 aimed to provide a platform for participants to share, discuss, and criticize recent and novel research with a multidisciplinary audience. More than 70 computer scientists, linguists, psychologists, and neuroscientists, as well as dance and music scientists from 16 countries met to present and exchange their newest results under the umbrella theme "Gesture in Embodied Communication and Human-Computer Interaction." Consistent with the steady growth of research activity in this area, a large number of high-quality submissions were received, which made GW 2009 an exciting and important event for anyone interested in gesture-related technological research relevant to human-computer interaction. In line with the practice of previous gesture workshops, presenters were invited to submit their papers for a subsequent high-quality peer-reviewed publication. The present book is the outcome of this effort. Representing the research work from eight countries, it contains a selection of 28 thoroughly reviewed articles.
This volume presents important results of the Collaborative Research Center (Sonderforschungsbereich) "Situated Artificial Communicators," which was funded by grants from the German Research Foundation (Deutsche Forschungsgemeinschaft) for more than twelve years. The contributions focus on different aspects of human-human and human-machine interaction in situations which closely model everyday workplace demands. The authors are linguists, psycho- and neurolinguists, psychologists, and computer scientists at Bielefeld University. They jointly tackle questions of information processing in task-oriented communication. The role of key notions such as context, integration (of multimodal information), reference, coherence, and robustness is explored in great depth. Some remarkable findings and recurrent phenomena reveal that communication is, to a large extent, a matter of joint activity. The interdisciplinary approach integrates theory, description and experimentation with simulation and evaluation.
Alignment in Communication
Wachsmuth, Ipke; Ruiter, Jan de; Jaecks, Petra ...
2013-11-30, Volume: 6
eBook
In accordance with accumulating evidence from research, we assume a strong but flexible relation between emotional and communicative alignment in interaction. The communicative function of emotional adaptation, the processing of emotions on all linguistic levels, and the empirical evidence in studies with neurological patient groups support our approach. In this chapter, we will discuss the link, i.e., the differences and influences between emotional and communicative processes of adaptation, and expand on emotional communication in human-robot interaction. In the course of this, we propose a three-layered model of emotional alignment in order to explain how emotional alignment could be computationally modelled in a human-robot setting.
We present a collaborative approach towards a detailed understanding of the usage of pointing gestures accompanying referring expressions. This effort is undertaken in the context of human-machine interaction integrating empirical studies, theory of grammar and logics, and simulation techniques. In particular, we attempt to measure the precision of the focused area of a pointing gesture, the so-called pointing cone. The pointing cone serves as a central concept in a formal account of multi-modal integration at the linguistic speech-gesture interface as well as in a computational model of processing multi-modal deictic expressions.
We introduce the WASABI (WASABI Affect Simulation for Agents with Believable Interactivity) Affect Simulation Architecture, in which a virtual human’s cognitive reasoning capabilities are combined with simulated embodiment to achieve the simulation of primary and secondary emotions. In modeling primary emotions we follow the idea of “Core Affect” in combination with a continuous progression of bodily feeling in three-dimensional emotion space (PAD space), which is subsequently categorized into discrete emotions. In humans, primary emotions are understood as ontogenetically earlier emotions, which directly influence facial expressions. Secondary emotions, in contrast, afford the ability to reason about current events in the light of experiences and expectations. By technically representing aspects of each secondary emotion’s connotative meaning in PAD space, we not only assure their mood-congruent elicitation, but also combine them with facial expressions, which are concurrently driven by primary emotions. Results of an empirical study suggest that human players in a card game scenario judge our virtual human MAX significantly older when secondary emotions are simulated in addition to primary ones.
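The categorization step mentioned above, mapping a continuous point in PAD (pleasure-arousal-dominance) space to a discrete emotion, can be illustrated as a nearest-anchor lookup. This is a minimal sketch of the general idea, not the WASABI implementation; the anchor coordinates and emotion labels below are invented for illustration.

```python
import math

# Hypothetical anchor points for discrete emotions in PAD space,
# each coordinate (pleasure, arousal, dominance) in [-1, 1].
# These values are illustrative, not taken from the WASABI architecture.
PAD_ANCHORS = {
    "happy":   ( 0.8,  0.6,  0.4),
    "angry":   (-0.6,  0.7,  0.3),
    "sad":     (-0.6, -0.4, -0.3),
    "bored":   (-0.3, -0.6, -0.2),
    "relaxed": ( 0.6, -0.4,  0.3),
    "fearful": (-0.6,  0.6, -0.4),
}

def categorize(pad):
    """Map a continuous PAD state to the nearest discrete emotion anchor."""
    return min(PAD_ANCHORS, key=lambda name: math.dist(pad, PAD_ANCHORS[name]))

print(categorize((0.7, 0.5, 0.5)))  # → happy
```

A continuous dynamics model (the "progression of bodily feeling" in the abstract) would update the PAD point over time, with this categorization deciding which discrete emotion is currently expressed.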
Empathy in Virtual Agents and Robots
Paiva, Ana; Leite, Iolanda; Boukricha, Hana ...
ACM Transactions on Interactive Intelligent Systems, 2017, Volume: 7, Issue: 3
Journal Article, Peer reviewed
This article surveys the area of computational empathy, analysing different ways by which artificial agents can simulate and trigger empathy in their interactions with humans. Empathic agents can be seen as agents that have the capacity to put themselves in the emotional situation of a user or another agent and respond appropriately. We also survey artificial agents that, by their design and behaviour, can lead users to respond emotionally as if they were experiencing the agent’s situation. In the course of this survey, we present the research conducted to date on empathic agents in light of the principles and mechanisms of empathy found in humans. We end by discussing some of the main challenges that this exciting area will be facing in the future.