In this article we report on a new digital interactive self-report method for the measurement of human affect. The AffectButton (Broekens and Brinkman, 2009. ACII 2009: IEEE) is a button that enables users to provide affective feedback in terms of values on the three well-known affective dimensions of pleasure (valence), arousal, and dominance. The AffectButton is an interface component that functions and looks like a medium-sized button. The button presents a single iconic facial expression that changes dynamically with the coordinates of the user's pointer within the button. To give affective feedback, the user selects the most appropriate expression by clicking the button, effectively enabling 1-click affective self-report on 3 affective dimensions. Here we analyze 5 previously published studies and 3 novel large-scale studies (n=325, n=202, n=128). Our results show the reliability, validity, and usability of the button for acquiring three types of affective feedback in various domains. The tested domains are holiday preferences, real-time music annotation, emotion words, and textual situation descriptions (ANET). The types of affective feedback tested are preferences, affect attribution to the previously mentioned stimuli, and self-reported mood. All subjects tested were Dutch and aged between 15 and 56 years. We end this article with a discussion of the limitations of the AffectButton and of its relevance to areas including recommender systems, preference elicitation, social computing, online surveys, coaching and tutoring, experimental psychology and psychometrics, content annotation, and game consoles.
► A new digital interactive self-report method for affective feedback. ► Enables 1-click affective self-report on 3 affective dimensions. ► Reliable and valid affective feedback. ► Evaluation of usability, reliability and validity in a wide variety of settings.
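The interaction model described above (the pointer position inside the button drives both the displayed expression and the reported pleasure-arousal-dominance values) can be sketched in code. The mapping below is a hypothetical illustration only; the actual AffectButton mapping is defined in Broekens and Brinkman (2009) and is not specified in this abstract.

```python
def pointer_to_pad(x, y, width, height):
    """Map pointer coordinates inside a button to (pleasure, arousal, dominance).

    Hypothetical illustration, not the published AffectButton mapping:
    left-right position drives pleasure (valence), top-bottom position
    drives arousal, and distance from the neutral centre drives
    dominance, each scaled to [-1, 1].
    """
    p = 2.0 * x / width - 1.0    # left-right -> pleasure (valence)
    a = 1.0 - 2.0 * y / height   # top-bottom -> arousal (screen y grows downward)
    d = max(abs(p), abs(a))      # distance from neutral centre -> dominance
    return p, a, d
```

On every pointer-move event the interface would recompute this triple and redraw the facial expression; a click then commits the current triple as the 1-click self-report.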
Affective labeling of multimedia content has proved to be useful in recommender systems. In this paper we present a methodology for the implicit acquisition of affective labels for images. It is based on an emotion detection technique that takes as input video sequences of the users' facial expressions. It extracts Gabor low-level features from the video frames and employs a k-nearest-neighbors machine learning technique to generate affective labels in the valence-arousal-dominance space. We performed a comparative study of the performance of a content-based recommender (CBR) system for images that uses three types of metadata to model the users and the items: (i) generic metadata, (ii) explicitly acquired affective labels, and (iii) affective labels acquired implicitly with the proposed methodology. The results show that the CBR performs best when explicit labels are used. However, implicitly acquired labels yield significantly better CBR performance than generic metadata while being an unobtrusive feedback tool.
The ability to regulate emotions is a critical feature of healthy psychological functioning. It is therefore essential to understand under what conditions different emotion regulation strategies may or may not be effective. Neurobiological evidence suggests that certain contexts, including acute psychological stress and sleep deprivation, may impair emotion regulation ability. However, there is little data on the causal effects of these contexts on emotion regulation processing. In this dissertation, I present three studies that examine the neurobiological effects of stress and sleep deprivation on two emotion regulation strategies: cognitive reappraisal and affect labeling. In Paper 1, we induced acute psychosocial stress and measured its impact (relative to a control manipulation) on emotional responding during a cognitive reappraisal task while participants underwent fMRI. Findings revealed no evidence that stress modulated the effects of cognitive reappraisal on subjective or physiological measures of emotional responding. Modest effects of stress on reappraisal-related neural activation were observed in the prefrontal cortex and amygdala, but these relationships were statistically fragile. These findings were extended in Paper 2, in which we tested the effects of one night of total sleep deprivation on the same cognitive reappraisal task. Once again, results showed no evidence that the context manipulation (this time sleep deprivation) affected subjective, physiological, or neural responses to cognitive reappraisal. However, these null effects did not generalize to all types of emotion regulation. In Paper 3, we examined the effects of sleep deprivation on affect labeling, an implicit emotion regulation strategy hypothesized to rely on the cognitive control functions of the right ventrolateral prefrontal cortex.
Findings revealed up-regulated recruitment of this prefrontal region as well as increased functional connectivity with the amygdala during sleep deprivation. Increased coupling was associated with lower baseline negative affect when sleep deprived, suggesting that sleep-loss-induced increases in activation may have adaptive buffering effects on mood. Together, findings from these papers show that two context manipulations expected to impair emotion regulation ability do not appear to impact cognitive reappraisal, despite influencing the processing of a more implicit emotion regulation strategy. This work calls for a careful examination of the way emotion regulation is presently studied and whether cognitive reappraisal, or only certain aspects of it, may in fact be robust to the effects of contextual factors like stress and sleep deprivation.
Recent work has shown an increase in the accuracy of recommender systems that use affective labels. In this paper we compare three labeling methods within a recommender system for images: (i) generic labeling, (ii) explicit affective labeling, and (iii) implicit affective labeling. The results show that the recommender system performs best when explicit labels are used. However, implicitly acquired labels yield significantly better recommender performance than generic labels while being an unobtrusive feedback tool.
Affective labeling of multimedia content can be useful in recommender systems. In this paper we compare the effect of implicit and explicit affective labeling in an image recommender system. The implicit affective labeling method is based on an emotion detection technique that takes as input video sequences of the users' facial expressions. It extracts Gabor low-level features from the video frames and employs a kNN machine learning technique to generate affective labels in the valence-arousal-dominance space. We performed a comparative study of the performance of a content-based recommender (CBR) system for images that uses three types of metadata to model the users and the items: (i) generic metadata, (ii) explicitly acquired affective labels, and (iii) affective labels acquired implicitly with the proposed methodology. The results showed that the CBR performs best when explicit labels are used. However, implicitly acquired labels yield significantly better CBR performance than generic metadata while being an unobtrusive feedback tool.
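The implicit labeling pipeline described above (Gabor low-level features extracted from video frames, then kNN regression into the valence-arousal-dominance space) can be sketched as follows. The kernel parameters, the per-orientation mean-response pooling, and the training data are illustrative assumptions; the papers' actual feature extraction and training setup are not specified in these abstracts.

```python
import numpy as np
from scipy.signal import convolve2d
from sklearn.neighbors import KNeighborsRegressor

def gabor_kernel(size=9, theta=0.0, freq=0.25, sigma=2.0):
    """Build a real-valued Gabor kernel (simplified; parameters are illustrative)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def gabor_features(frame, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Mean absolute Gabor response per orientation as a low-level feature vector."""
    return np.array([
        np.abs(convolve2d(frame, gabor_kernel(theta=t), mode='same')).mean()
        for t in thetas
    ])

# Train a kNN regressor from frame features to (valence, arousal, dominance).
# The training data here is random and purely illustrative.
rng = np.random.default_rng(0)
frames = rng.random((20, 32, 32))            # 20 grayscale face-crop frames
vad = rng.uniform(-1, 1, (20, 3))            # their (V, A, D) annotations
X = np.stack([gabor_features(f) for f in frames])
knn = KNeighborsRegressor(n_neighbors=3).fit(X, vad)
pred = knn.predict(gabor_features(frames[0])[None, :])  # predicted (V, A, D) triple
```

The predicted triple is what would then be attached to the image as an implicitly acquired affective label and fed to the content-based recommender.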
Affect-based indexing and retrieval of films
Chan, Ching Hau; Jones, Gareth J. F.
International Multimedia Conference: Proceedings of the 13th annual ACM international conference on Multimedia; 06-11 Nov. 2005,
11/2005
Conference Proceeding
Open Access
Digital multimedia systems are creating many new opportunities for rapid access to content archives. In order to explore these collections using search applications, the content must be annotated with significant features. An important and often overlooked aspect of human interpretation of multimedia data is the affective dimension. Affective labels of content can be extracted automatically from within multimedia data streams. These can then be used for content-based retrieval and browsing. In this study affective features extracted from multimedia audio content are mapped onto a set of keywords with predetermined emotional interpretations. These labels are then used to demonstrate affect-based retrieval on a range of feature films.
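The final step described above, mapping extracted affective feature values onto keywords with predetermined emotional interpretations, can be illustrated with a minimal quadrant-based sketch. The keyword vocabulary and thresholds below are hypothetical and are not those used in the paper.

```python
def affect_to_keyword(valence, arousal):
    """Map a (valence, arousal) pair to one of four illustrative emotion keywords.

    Hypothetical quadrant mapping: positive/negative valence crossed with
    high/low arousal. A real system would use the paper's own predetermined
    emotional vocabulary and learned or hand-tuned decision boundaries.
    """
    if valence >= 0:
        return "joy" if arousal >= 0 else "contentment"
    return "anger" if arousal >= 0 else "sadness"
```

A retrieval front end could then index film segments by these keywords, so that a query such as "joy" returns segments whose extracted audio features fall in the corresponding affect region.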