Remember When You Said ‘I agree’?

Remember the last time you downloaded a new app or opened a new website. You saw a button saying 'I agree' or some variant of it. What did you do? Did you spend time reading the policies, or did you, like most others, simply accept everything in order to quickly access the service? Recent experiments that we have run with over 3,000 subjects across India and Kenya confirm this: informed consent is very hard to achieve.

When asked in surveys, we all say we are concerned about our privacy. Yet very little of what we consent to share is thought through and informed. We don't know what we are sharing, who we are sharing it with, or what they will do with our data. This gap between what we say we want and what we actually do is the privacy paradox. The issue has been further compounded during the pandemic and lockdown, given the greater need to be online and the surge in time spent on digital platforms. Our behaviour when making decisions about our privacy is subject to several biases that prevent us from acting in our own best interest. The onus of protecting our privacy should not lie with us. It needs to be achieved through better and stronger regulation.

Why is it so hard for people to make good decisions regarding their own privacy? 

Firstly, there is present bias: people attach a higher value to a smaller, immediate gratification, i.e. what I am getting now (e.g. access to a service), than to a larger reward in the future (e.g. better security). This leads people to readily agree to the privacy terms and conditions of apps and websites.

Secondly, there is information overload: a heavy cognitive load is placed on users when they are asked for consent, owing to the length of privacy policies and their dense legal language. In the current trying times, users' decision-making capacity is further compromised by additional stress, making it even harder to take rational choices about privacy.

Thirdly, another mechanism at play could be the hot-cold empathy gap: individuals underestimate the effect of their visceral states on their decision making. A decision to use a service with dubious privacy credentials under the 'hot' condition of immediate need during the lockdown may not be congruent with their 'cold', normal-state preferences, in which they might value privacy more.

Additionally, preferences for privacy are often malleable because choices are discrete (you share or you do not), time-sensitive (accept to move forward) and made under asymmetric information (the entity asking for data knows more than the consumer providing it).

We are currently setting dangerous norms of extreme data sharing, justified by the necessity of the public good, and should think through how we can mitigate this post-COVID. One suggestion would be to treat individuals as owners of their data, with governments and other players granted only temporary access. A second idea would be to give users opportunities to revisit the data-sharing choices they made during the pandemic and to make sharing granular rather than blanket. Further, data sharing and use practices that evolve in these times must be accompanied by sunset clauses.

Regulations should also be put in place for stricter default privacy settings by businesses. Further, businesses should be required to make their privacy policies more salient. They can explore different styles of presentation, such as a summary fact sheet that lists the key points of the policy. Another method could be third-party quality assessments, such as star ratings, to inform users of the level and quality of a service's privacy terms. Our experiments have also shown that users express more trust in organisations, demonstrated through increased sharing, when given additional time to make privacy choices, for example a cool-down period of 30 seconds or more before they have to give their consent. For such practices to become norms would require a top-down approach and effective implementation through regulation.

The focus on the use of highly personalised data during the pandemic has moved the privacy debate centre stage. Weak consumer awareness and the biases at play leave individuals incapable of making the right privacy decisions for themselves. Businesses often have opposing interests and benefit from greater data sharing. Hence, effective regulation is the only way to safeguard the privacy of our vulnerable digital users, as the next few hundred million join us online and as conducting a larger fraction of our lives online becomes the new normal.

James Vancel

Guest Author. The author is CEO, Busara Center for Behavioral Economics.
Pooja Haldea

Guest Author. The author is Senior Advisor, CSBC & Ashoka University.
