We propose a new interface evaluation tool that incorporates affective metrics derived from the electroencephalography (EEG) signals of the Emotiv EPOC neuro-headset. The tool captures and analyzes information in real time from multiple sources, including EEG signals, facial expressions, and affective metrics such as frustration, engagement, and excitement. We used the proposed tool to obtain detailed affective information about users interacting with a mobile multimodal (touch and speech) iPhone application, for which we investigated the effect of speech recognition errors and modality usage patterns.