1. INTRODUCTION

Since the introduction of the smartphone, capturing a video has become relatively easy. Videos can record the emotional experiences of the video shooter, such as interest, astonishment, and excitement, and can then be shared with friends on Social Networking Services (SNS). There are now SNS where videos are the main content through which users communicate with each other. In the past, expensive video cameras and video editing software were necessary for creating personal videos; today, we can capture and edit videos using just a smartphone. However, the video shooter's emotional state is not fully conveyed in these videos. It is difficult for people with little knowledge of professional video shooting to express their emotional experiences using video footage alone.

On the other hand, research on sports spectating and television shows has used physiological signals to measure the emotional state (excitement, stress, etc.) of athletes and actors for the purpose of sharing it with viewers. These studies have confirmed the positive effects of sharing the physiological data of an athlete or actor with viewers, and that this helps create a strong connection between them [2]. Those signals are typically represented as a numeric value or graph and shared with viewers.

In this study, we developed a mobile video camera application called AffectiView that captures users' affective responses while they are shooting a video and provides a way to share that data with other users. Users' affective responses are measured using physiological proxies studied in prior work. Specifically, we use Skin Conductance Level (SCL), which has also been employed in many other studies to measure excitement levels [10]. Our system has two modes: a video camera mode and a video viewer mode.
In the video camera mode, users can capture a video together with their physiological signals, and the captured data is then automatically shared with other users. In the video viewer mode, users can view the shared video with the physiological signals presented in three different ways, and can also leave comments about the presentation of the physiological data. Based on prior work, we organized the representation of physiological data between users into three styles: Explicit Representation, Implicit Representation, and Haptic Representation. AffectiView provides a novel mobile video experience by sharing the affective responses of video shooters with video viewers.