Recommender systems increasingly use contextual and demographic data as a basis for recommendations.
Users, however, often feel uncomfortable providing such information.
In a privacy-minded design of recommenders, users are free to decide what personal data they wish to disclose.
Yet this decision is often complex and burdensome, because the consequences of disclosing personal information are uncertain or even unknown.
Although a number of researchers have tried to analyze and facilitate such information disclosure decisions, the results of this research are fragmented and often do not hold up well across studies.
This article describes a unified approach to privacy decision research that models the cognitive processes involved in users’ “privacy calculus” in terms of system-related perceptions and experiences that mediate information disclosure.
The approach is applied in an online experiment with 493 participants using a mock-up of a context-aware recommender
system.
Analyzing the results with a structural linear model, we demonstrate that personal privacy concerns and disclosure justification messages affect users’ perception of and experience with the system, which in turn drive their information disclosure decisions. Overall, disclosure justification messages do not increase disclosure: although users perceive the messages as valuable, the messages decrease users’ trust and satisfaction.
A further result is that manipulating the order of the requests increases the disclosure of items requested early but decreases the disclosure of items requested later.
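To make the mediation structure described above concrete, the sketch below shows how such a structural model could be specified in Python with the semopy package: the experimental manipulation and personal privacy concerns predict perception and experience constructs, which in turn predict disclosure. The variable names, the synthetic data, and the specific paths are illustrative assumptions, not the actual model estimated in this article.

```python
import numpy as np
import pandas as pd
from semopy import Model

# Synthetic data standing in for the survey constructs (illustrative only).
rng = np.random.default_rng(0)
n = 500
justification = rng.integers(0, 2, n)            # 0 = no message, 1 = justification message shown
concerns = rng.normal(size=n)                     # general privacy concerns
trust = 0.3 * justification - 0.4 * concerns + rng.normal(size=n)
satisfaction = 0.5 * trust + rng.normal(size=n)
disclosure = 0.6 * trust + 0.3 * satisfaction - 0.2 * concerns + rng.normal(size=n)

data = pd.DataFrame({
    "justification": justification,
    "concerns": concerns,
    "trust": trust,
    "satisfaction": satisfaction,
    "disclosure": disclosure,
})

# Path model: perceptions/experiences (trust, satisfaction) mediate the effect
# of the manipulation and of personal privacy concerns on information disclosure.
desc = """
trust ~ justification + concerns
satisfaction ~ trust + justification
disclosure ~ trust + satisfaction + concerns
"""

model = Model(desc)
model.fit(data)
print(model.inspect())  # path coefficients, standard errors, p-values
```

In such a specification, an indirect (mediated) effect of the justification manipulation on disclosure would show up as the product of the paths through trust and satisfaction, which is the kind of relationship the abstract summarizes.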