As concerns about privacy increase for people using mobile apps, new research suggests that trust and engagement may hinge on perceptions about how the app uses personal data and whether it seeks user input before delivering services.
Researchers add, however, that reactions may also depend on how familiar users are with technology.
In a study of a prototype app that recommends eco-friendly stores, users rated the app as more trustworthy and easier to use when they felt they had been consulted about the distance and nature of the stores they prefer, a process called overt personalization. Perceived usability dropped when personalization was covert, that is, when the app recommended stores without first asking for users' preferences.
But it's not always feasible to consult app users, the researchers say, because doing so would interrupt them and require them to make too many choices. One solution is to make sure that users have a clear understanding of how the app is using their data.
Higher perceived transparency, meaning users recognize that the app clearly conveys how and why it is collecting their data, is associated with greater product involvement and user engagement, the researchers say. Transparency can also lower privacy concerns.
“Providing details about how the app is going to do things, such as how it will use your information, how it will store the data, and how it’s going to delete that information, may reduce some of the privacy concerns and the feeling of being creeped out by personalized offerings,” says S. Shyam Sundar, professor of communications at Penn State and co-director of the Media Effects Research Laboratory.
The perception of control can lead to a series of positive user reactions, says Tsai-Wei Chen, a user experience designer at Optum, who worked with Sundar.
“If you give people a perception of control, they trust the app more, and the more they trust it, the greater their involvement in the app and the more positive attitudes. Their privacy concerns also went down and they had greater engagement with the app.”
The researchers, who presented their findings at the CHI Conference in Montreal, found a connection between a user’s technological savvy and his or her ability to perceive overt personalization and information transparency.
“People who were more familiar with using technology—power users—could tell the difference between overt and covert personalization,” says Sundar. “They better recognized the value of information transparency and felt that it made up for perceived lack of overtness in personalization.”
The researchers suggest that because users’ familiarity with technology may influence how they experience features, such as privacy controls, developers should have a clear understanding of their customers’ expertise and limitations when designing an app.
Developers should also make cues about information usage more obvious for casual tech users, they add.
“For users who have some tech expertise, it’s easier to incorporate covert personalization, but make sure the transparency cues are apparent and easy to understand,” says Chen. “For users with lower tech expertise, you need to work hard to convey overt personalization and information transparency, or find other features to increase their trust.”
For the study, the researchers recruited 302 participants to use five different versions of an app prototype, called GreenByMe, that recommended local eco-friendly stores. The five versions corresponded to the experimental conditions: covert personalization, overt personalization, high transparency, low transparency, and a control condition.
In the overt personalization condition, the app displayed selection menus asking users for their store preferences. In the high transparency condition, a screen explained how the app would use the information it collected.
The National Science Foundation supported this work.
Source: Penn State