Researchers have created a tool for measuring how much discomfort apps cause users by collecting personal data.
You would think that feeling chronically uneasy about products would spur a movement away from them. However, this is not the case for app use: even though surveys show that users feel emotional stress over the fact that apps collect personal data, we continue using them.
“It seems that people accept this uneasy feeling almost as a part of the user experience. Somehow, we have been trained to live with being uncomfortable. But you may ask how it can be defensible to treat people and their emotional states so terribly,” says Irina Shklovski, professor in the computer science department at the University of Copenhagen.
Shklovski presented a paper on the findings at the Human Factors in Computing Systems conference.
“I think most of us have experienced feeling uneasy when downloading apps, but most often you can’t really put your finger on what the problem might be. So, we decided to create a way of measuring the degree of discomfort,” Shklovski says.
The researchers broke down the problem into three parts. To be creepy, an app needs to:
- violate the boundaries of the user;
- do so unexpectedly; and
- possess ambiguity of threat.
High scores in all three categories would amount to one very creepy app.
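The three criteria can be read as dimensions of a rating scale. A minimal sketch of that idea in Python (the class name, the 0–10 scales, and the simple averaging are illustrative assumptions, not the paper's actual instrument or weighting):

```python
from dataclasses import dataclass

@dataclass
class CreepinessRating:
    """Hypothetical per-app ratings on the three dimensions, 0-10 each."""
    boundary_violation: float  # does the app violate the user's boundaries?
    unexpectedness: float      # does it do so unexpectedly?
    threat_ambiguity: float    # is the nature of the threat ambiguous?

    def score(self) -> float:
        # Illustrative aggregate: the mean of the three dimensions.
        return (self.boundary_violation
                + self.unexpectedness
                + self.threat_ambiguity) / 3

# An app rated high on all three dimensions comes out very creepy.
very_creepy = CreepinessRating(9, 8, 9)
print(round(very_creepy.score(), 2))  # 8.67
```

The real study would derive such ratings from participant questionnaires rather than assigning them directly; the sketch only shows how the three dimensions combine into a single score.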
“Notably, we are talking about emotional response here. Even in a situation where objectively everything is fine, for instance if a technical solution guarding against misuse of personal data is in place, the user may still feel discomfort,” Shklovski emphasizes.
Having a score for creepiness means the researchers can now examine how various modifications can change user experience.
In the study, they divided 751 participants into cohorts that rated their experience under different “regimes.” All regimes would feature a fictitious app called “Remember Music.” Just like several real-world apps, Remember Music can recognize a tune or song that you might hear randomly, for example as you walk down the street.
“Just like in the real world, the participants would have to agree to a license agreement, and again just like in the real world they would click accept without thinking twice,” says Shklovski.
In one regime, the app would collect your location. In another regime, it would soon start suggesting more music from the identified artists. In yet another regime, the app would post on Facebook what you are listening to. Further, some participants were granted control over what the app was doing: they could approve or deny having their music habits displayed on Facebook.
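The experimental conditions described above can be sketched as a small configuration table. This is a hypothetical reconstruction (the field and regime names are mine; the paper's exact condition matrix is not reproduced here):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Regime:
    """One experimental condition for the fictitious 'Remember Music' app."""
    collects_location: bool = False      # tracks where the song was heard
    suggests_similar_music: bool = False # recommends more from the artist
    posts_to_facebook: bool = False      # shares listening habits publicly
    user_controls_sharing: bool = False  # user can approve/deny each post

# Illustrative regimes mirroring the conditions in the article.
REGIMES = {
    "location": Regime(collects_location=True),
    "suggestions": Regime(suggests_similar_music=True),
    "facebook": Regime(posts_to_facebook=True),
    "facebook_with_control": Regime(posts_to_facebook=True,
                                    user_controls_sharing=True),
}
```

Comparing the "facebook" and "facebook_with_control" cohorts is what let the researchers test whether granting control reduced discomfort; per the article, it did not.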
“We had expected the group with control to feel more comfortable, but surprisingly they didn’t,” Shklovski says.
“Lawyers and organizations working to improve data privacy often focus on improving user control. While this may be desirable for other reasons, sadly our research shows that it will not relieve users’ emotional stress.”
As part of the experiment, participants rated themselves on digital literacy.
“We normally assume that people with a high degree of digital literacy will be more critical of apps, but again, surprisingly, the opposite is true. The more you see yourself as digitally literate, the more likely you are to keep using an invasive app,” says Shklovski.
And again, this discovery challenges industry dogma:
“Industry and public bodies will argue that this is a question of personal data hygiene. In other words, that as users become more digitally aware they will favor less intrusive apps over the more intrusive. Based on the data from our study, we can say that trying to shift responsibility to the user in this way will not work. That horse has bolted. If we want things to get better, we need developers and policy makers to change the scene,” Shklovski concludes.
Additional coauthors of the study are from Michigan State University, Indiana University Bloomington, and the University of Utah.
Source: University of Copenhagen