WhatsApp, Twitter, Facebook, Microsoft, Zoom, UBEEQO, Tim Hortons - all of these companies share a common problem: their products have had privacy issues that led to hefty fines or regulatory scrutiny.
As companies build out technical privacy programs, they rely on personal data discovery tools and work backward to find data uses and purposes. This will not suffice for DPDP compliance: you might discover where your PII is stored, but you still don’t know how it is used, shared, or collected.
Broadly, there are three sources of privacy violations: sensitive data sharing, excessive data collection, and illegal data use. Let's take a closer look at each.
A. Sensitive Data Sharing: Sensitive data sharing, or unsolicited data sharing, refers to the unauthorized or unintentional disclosure of personally identifiable information (PII) or confidential data to third parties or partner/group companies without the explicit consent of the Data Principal. This could include financial records, health data, location data, or any other information that could pose a risk to individuals' privacy and security if disclosed improperly. At times, it could also include personal data being shared during the purchase or sale of a business.
Let us understand this with a simple example. For an e-commerce site, search history is not sensitive in itself, but it becomes sensitive once machine learning algorithms can predict a user's personal preferences for targeted ads. Another cause of unsolicited sharing is that, instead of sending only the specific variables a third party or group company needs, an entire object containing personal data is sent, exposing more information than intended. Here are some examples of how this can happen:
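The "entire object" pattern described above is easy to fall into in code. The sketch below contrasts serializing a whole user record with building an allowlisted payload; the field names, record contents, and partner allowlist are purely illustrative assumptions, not taken from any real integration.

```python
import json

# Hypothetical user record; the fields shown are illustrative assumptions.
user_record = {
    "user_id": "u-1042",
    "email": "ananya@example.com",
    "phone": "+91-98XXXXXX01",
    "search_history": ["running shoes", "knee brace"],
    "shipping_city": "Pune",
}

# Over-sharing: serializing the whole object sends every field, intended or not.
oversized_payload = json.dumps(user_record)

# Minimized: an explicit allowlist of the fields the partner actually needs.
ALLOWED_FIELDS = {"user_id", "shipping_city"}

def build_partner_payload(record: dict) -> str:
    """Serialize only the allowlisted fields for the third party."""
    return json.dumps({k: v for k, v in record.items() if k in ALLOWED_FIELDS})
```

With an allowlist, adding a new field to the internal record (say, a loyalty score) does not silently leak it to the partner; the developer must consciously add it to `ALLOWED_FIELDS`.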
Have any companies actually done this, you might ask. The answer is a resounding yes. In January 2021, Meta released a new data-sharing policy for WhatsApp, mandating the transfer of user information between WhatsApp and Facebook. After users complained, the company noted that it would instead limit WhatsApp’s features for anyone who didn’t opt in.
WhatsApp, which was bought by Facebook in 2014, had until 2016 kept the data of WhatsApp users separate from that of Facebook users. But in 2016, WhatsApp and Facebook changed their privacy policy so that metadata (details like users' phone numbers) could be shared between the two apps. This allowed Facebook to do limited profiling of WhatsApp users and then show them targeted advertisements when they use Facebook and Instagram. In February 2023, WhatsApp told the Supreme Court of India that only some data of WhatsApp users is shared with Facebook, and that if users wanted absolute privacy, they should ideally use WhatsApp without having a Facebook account. How bizarre!
Consider another example: buried in the privacy policy of a leading, listed Indian bank is a consent clause that is vague and ambiguous, with no clear affirmative action required from the user. The language of the policy grants the bank the liberty to share your personal data with various entities, for purposes not limited to marketing. In simpler terms, it implies they can share your personal information with any company without explicitly seeking your approval. Not only does the bank rely on such unclear consent, it also seeks protection against the potential consequences of any wrongdoing. Surprisingly, this trend isn't exclusive to one bank. In fact, 8 out of 10 major banks in India employ similar tactics in their privacy policies.
Similar trends are observed in other industries as well. One of the FMCG giants in India had the following statement in the privacy policy of its merchant onboarding app:
The excerpt suggests that the company treats users' personal data as a business asset: when it engages in transactions involving the sale or purchase of assets, including subsidiaries or business units, personal data may be transferred as part of those assets, without the explicit consent of the users concerned. While the DPDP Act may not explicitly address the sale of personal data, unauthorized sharing of personal data is considered a data breach, and treating personal data as a transferable business asset without clear user consent could be interpreted as a violation of data protection principles.
B. Excessive Data Collection: Before the enactment of the DPDP Act, the prevailing mindset equated data with the new oil, with little consideration for whether more personal data was being gathered than necessary. Over the years, mobile apps and online platforms have routinely engaged in excessive data collection. A common violation relates to permissions in mobile applications: for example, requesting precise location when coarse location would suffice, and, even where precise location is genuinely needed, collecting it continuously rather than on demand when someone is actually using the application.
Following are some instances of excessive data collection:
Excessive data collection is widely seen in the BFSI sector. When opening a savings bank account in India, the contrast in the number of personal data items collected by different banks is alarming. While a leading and listed bank may require as many as 22 personal data items, another bank accomplishes the same process with just 6. This discrepancy extends across various banking products, with one bank needing only 10 personal data items for a personal loan application, while another demands 15. Despite detailed RBI guidelines and regulations in place on banking products, the significant variation in the collection of Personally Identifiable Information (PII) is something that needs attention.
C. Illegal Data Use: Illegal data use occurs when personal data is processed for purposes not disclosed to the Data Principal. It can happen when the data processing implemented by developers diverges from the promises made in the privacy policy, whether in products such as mobile apps or in backend systems such as micro-services and data pipelines. It can also arise when internal teams are not fully aligned on the purposes outlined in the privacy policy.
For example, a company might collect phone numbers to provide timely updates and alerts, only for its marketing team to use those numbers for unsolicited promotional campaigns, deviating from the stated purpose. Or imagine a business that collects name, age, and location details to facilitate product shipments, whose technical team then reuses this data to train an AI model for predicting product demand, breaching user trust and privacy agreements.
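The purpose mismatches described above can be sketched as a simple runtime check, where each data field carries the set of purposes the user consented to and any other access is rejected. The field names and purpose labels here are illustrative assumptions, not any company's actual implementation.

```python
# Hypothetical consent register: which purposes each field was collected for.
CONSENTED_PURPOSES = {
    "phone": {"account_alerts", "two_factor_auth"},
    "location": {"shipping"},
}

class PurposeViolation(Exception):
    """Raised when data is accessed for a purpose the user never consented to."""

def access(field: str, purpose: str) -> str:
    allowed = CONSENTED_PURPOSES.get(field, set())
    if purpose not in allowed:
        raise PurposeViolation(f"{field!r} was not collected for {purpose!r}")
    return f"{field} released for {purpose}"

access("phone", "account_alerts")   # fine: matches the stated purpose
try:
    access("phone", "marketing")    # the misuse pattern described above
except PurposeViolation as e:
    print(e)
```

Enforcing this at the data-access layer, rather than trusting each team to remember the privacy policy, is one way to keep the marketing or ML teams from silently repurposing data collected for alerts or shipping.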
Here are some examples of illegal data use:
Despite the privacy promises most companies make, instances of data misuse are unfortunately common. Consider the case of Twitter. In 2013, it began asking users to provide either a phone number or email address to improve account security. The information was used to help reset user passwords and unlock accounts the company might have blocked due to suspicious activity, as well as to enable two-factor authentication. Two-factor authentication provides an extra layer of security by sending a code to a phone number or email address to help users log into Twitter along with a username and password.
From 2014 to 2019, more than 140 million Twitter users provided their phone numbers or email addresses after the company told them this information would help secure their accounts. Twitter, however, failed to mention that it also would be used for targeted advertising. Twitter used the phone numbers and email addresses to allow advertisers to target specific ads to specific consumers by matching the information with data they already had or obtained from data brokers. Consumers who share their private information have a right to know if that information is being used to help advertisers target customers.
Just how persuasive was Twitter’s security pitch? More than 140 million users gave Twitter their email addresses or phone numbers for security purposes. Would that same number of people have given Twitter that information if they knew how else Twitter was going to use it?
A similar trend is seen in Indian companies as well. For instance, a leading insurance aggregator's policy lists several purposes for utilizing personal data, yet conveniently omits details about unsolicited phone calls and targeted advertising on websites and social media. The lack of transparency in these policies raises significant concerns about the undisclosed and potentially intrusive use of personal information.
Privacy violations remain a pervasive issue across industries in India. While technical privacy programs and personal data discovery tools are crucial, they alone may not ensure DPDP compliance. The advent of the DPDP Act is an inflection point, necessitating the adoption of data minimization practices to curtail excessive data collection and reduce the risk of regulatory fines from the Data Protection Board. It also calls for a cultural shift within organizations, promoting awareness and understanding among every individual of their responsibilities in upholding privacy rights. It is imperative for all teams, including sales, marketing, tech, and HR, to align with the DPDP Act and ensure strict adherence to privacy promises, both at the organizational level and in individual products' consent notices. Additionally, rigorous scrutiny and governance of data sharing by Data Fiduciaries with Data Processors or partners, coupled with explicit consent, are essential to complying with the DPDP Act of 2023.