In contrast to initiatives such as TRECVID [4], which mostly target video search via analysis and indexing, VBS focuses on the evaluation of video search tools as a whole, disallowing, for example, textual queries in order to highlight the truly interactive aspects of the search process. Systems submitted by researchers are evaluated in two rounds: an expert round, in which a user familiar with a system (usually one of its developers) solves the given search tasks, and a novice round, in which the same tasks are solved by a person unfamiliar with the search tool (usually a volunteer from the audience).