Authors Damini Mohan and Mahima Dixit investigate predatory practices in how digital consent is obtained, and advocate for protective policy and technology that empower users to make informed decisions about their personal data.
Kaushal is a smallholder farmer in India. He recently opened a savings bank account to receive government benefits. A few days later, he noticed a deduction from his account for the premium of a life insurance policy he did not recall buying. When he asked a bank official, he learned that his consent had been taken during the sign-up process to bundle the insurance policy with his savings account.
Sheila, the owner of a small grocery store in Indonesia, opted for a digital credit scheme to expand her store operations. However, a delay in credit repayment resulted in officials threatening to leak her photograph, location, and other sensitive information. Upon further investigation, Sheila discovered that the company representative who had assisted her in obtaining the credit had sought her consent during the sign-up process to share her details with third parties in specified or exceptional situations. However, the definition of these “exceptional” circumstances was never made clear to Sheila, who “consented” without having complete information.
Sheila and Kaushal’s stories echo experiences worldwide. Many people fall prey to data leaks and financial losses because, in its current form, taking consent during the collection of sensitive data is merely a formality.
In a qualitative study that analyzed users’ perceptions of privacy across four Indian states, we found that most users in urban and rural settings alike were uncertain about the service terms and conditions of digital entities. Similarly, research on collecting refugee data shows that refugees and other vulnerable populations rarely know the purpose and use of their data, even as they give individual consent to share it with entities.
Entities typically request consent from users during registration for the services they offer. The purpose is to ensure that users actively grant permission for the collection of their data and are fully aware of how the entity intends to use their personal information. While securing consent is essential to respecting users’ agency over their data, it is often reduced to an obligatory step, passively sought from users through checkboxes. This approach can pose high risks for users, primarily because it lacks a user-centric design. Consent terms are usually verbose and filled with legalese that an ordinary person would struggle to understand.
The risk is exacerbated for users who rely on assisted channels for registration. This group includes first-time users of a service and those with less formal education. In such cases, facilitators ask users upfront whether they agree to the given terms instead of briefing them on why the data is being collected. The facilitator often ticks the checkbox and accepts the terms and conditions on the end user’s behalf, leaving customers in the dark about what they have agreed to.
The consent collection process needs reform to retain end users’ agency over their personal information. Laws in about 59% of countries cover the use and management of user data by the entities that collect it, such as the General Data Protection Regulation in the EU and the Data Protection Act in Kenya. Draft legislation exists in other countries, such as the Data Protection and Privacy Bill in India. All of these laws hinge on the premise that users make informed decisions when giving consent.
Clearly stating the purpose and use of data before seeking consent can enhance the credibility of the entities seeking it. Equally important is communicating the mechanisms for revoking consent.
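To make this concrete, here is a minimal sketch of what a purpose-bound consent record with built-in revocation could look like. The field names and structure are illustrative assumptions, not the schema of any particular law or platform.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One purpose-bound consent grant; all fields are illustrative, not a real standard."""
    user_id: str
    data_fields: list[str]          # e.g. ["name", "photo", "location"]
    purpose: str                    # stated in plain language before consent is sought
    granted_at: datetime
    revoked_at: datetime | None = None

    def revoke(self) -> None:
        """Mark the grant as withdrawn; any downstream use should check is_active()."""
        self.revoked_at = datetime.now(timezone.utc)

    def is_active(self) -> bool:
        return self.revoked_at is None

# Usage: consent is recorded per stated purpose, and revocation is a first-class action.
record = ConsentRecord(
    user_id="user-123",
    data_fields=["name", "phone_number"],
    purpose="Deliver account statements by SMS",
    granted_at=datetime.now(timezone.utc),
)
record.revoke()
assert not record.is_active()
```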
Singapore has enabled an easily understandable consent management module for its Singpass app, which citizens use to access most government services and which requires the national biometric ID details to log in. This module explains the use and purpose of data and allows citizens to revoke consent and manage the information shared with entities. This mechanism has strong potential for success with a literate and tech-savvy population. Yet what about parts of the world that still grapple with the digital divide?
To address this challenge, one effective approach is to invest in context-based tools that facilitate consent decisions for everyone. This is an increasingly common practice in medical trials involving vulnerable communities. For instance, in the Gambia, researchers deployed multimedia tools, such as videos, animation, and audio, in all major spoken languages to obtain informed consent from participants. The country had a 50% literacy rate when the study was conducted in 2015. The study notes that this approach gives Gambians greater autonomy over their decisions. It also improves recall of the terms provided by entities and lowers “perceived” risk.
Some countries, like India, have attempted to address this issue through tools that can make consent management easier for a linguistically diverse population with varying levels of digital readiness. For example, MeitY’s Bhashini, a program to create models for translating Indian languages, can help public and private sector entities communicate consent terms in users’ local languages. Jugalbandi is another innovative example that uses ChatGPT and Bhashini to help answer queries in more than 25 Indian languages through voice notes on WhatsApp.
People can “chat” with Jugalbandi by sharing a voice recording of their questions, and the AI-powered chatbot sends back an audio response along with text and relevant links wherever applicable.
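As a rough illustration of how such a voice-first flow could work, the sketch below strings together speech recognition, translation, question answering, and speech synthesis. Every function in it is a hypothetical placeholder; this is not Jugalbandi’s, Bhashini’s, or WhatsApp’s actual implementation.

```python
# Illustrative sketch of a voice-first, multilingual consent assistant.
# All helper functions are hypothetical stand-ins, not any real service's API.

def transcribe(audio_bytes: bytes, language: str) -> str:
    """Placeholder for a speech-to-text service."""
    return "What happens to my photo if I miss a repayment?"

def translate(text: str, source: str, target: str) -> str:
    """Placeholder for a machine-translation service."""
    return text  # a real system would return text in the target language

def answer_consent_question(query: str) -> str:
    """Placeholder for a language model or FAQ lookup over the service's consent terms."""
    return "Your photo is shared only with the lender, and you can withdraw consent at any time."

def synthesize_speech(text: str, language: str) -> bytes:
    """Placeholder for a text-to-speech service."""
    return text.encode("utf-8")

def handle_voice_query(audio_bytes: bytes, user_language: str) -> dict:
    """Speech in, plain-language explanation out, in the user's own language."""
    question = transcribe(audio_bytes, language=user_language)
    answer = answer_consent_question(translate(question, source=user_language, target="en"))
    answer_local = translate(answer, source="en", target=user_language)
    return {
        "text": answer_local,
        "audio": synthesize_speech(answer_local, language=user_language),
    }

# Usage: a voice note comes in, and the user gets back both text and audio they can replay.
reply = handle_voice_query(b"<voice note>", user_language="hi")
print(reply["text"])
```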
The use of technologies that drive consent through easily understandable multimedia tools can help marginalized communities access more services and make informed decisions. It also ensures that a lack of formal education does not stand in the way of civic awareness. Ultimately, seeking users’ consent should not be reduced to a customary click. Instead, it should be a transparent and easily understandable process that upholds users’ choice over the information they share and empowers them to make informed decisions about their personal data.
The blog was first published on the Hertie School website on 23rd October 2023.