<p>Recently, <a href="https://www.deccanherald.com/tag/amazon" target="_blank">Amazon</a> was ordered by the European Union to simplify its process for cancelling Prime subscriptions, as users were being made to pass through numerous hurdles, including complicated navigation menus, confusing choices, and skewed wording, to unsubscribe from Prime, Amazon’s fast-shipping club. This practice is known as a ‘dark pattern’. Following the EU order, the company will now allow users to unsubscribe from Prime in two clicks, via a prominent ‘cancel’ button.</p><p>Amazon is not alone; you must have come across many platforms that use dark patterns to make it difficult for their subscribers to leave. If you’re not familiar with the term, dark patterns are user interface designs intended to trick users into doing something they wouldn’t normally do. Usually, this is something the site designer wants them to do, such as signing up for a subscription, buying a product, or sharing their personal information.</p><p>Dark patterns are thus an underhanded, manipulative way to get people to do what a company wants on its website. For example, Apple’s Identifier for Advertisers (IDFA) tracks browsing activity that advertisers use to target ads. Apple allows users to turn it off, but here comes the tricky part: the setting is labelled “Limit Ad Tracking”, so a user must turn the setting on to turn ad tracking off, and the confusing wording leads many to do the opposite.</p><p>Dark patterns, prima facie, do not appear to be illegal. In some cases, however, they can be used to commit illegal acts. For example, if a company uses a dark pattern to trick someone into buying something they don’t want or need, that could amount to fraud.
For instance, in 2014, the US Federal Trade Commission sued Amazon, saying the company had insufficient safeguards to prevent children from making purchases, some of which ran into hundreds of dollars; Amazon was eventually ordered to refund over $70 million. When Amazon first launched its Kindle Fire OS, the default settings had parental controls turned off while in-app purchases were allowed. Kids could charge their parents up to $99.99 for a single in-app purchase, often without needing a password.</p><p>However, many practices that stop short of fraud do not come under the purview of any law. Some provisions are therefore required to protect the user’s right to choose, which is diminishing by the day. For example, LinkedIn sends unsolicited, sponsored messages from influencers to its users; Instagram shows posts unrelated to users’ preferences; and YouTube shows ads just before a video ends. Further, these features are set up so that users find it too difficult to change or disable them. Although these patterns confuse users, consume their time, and lead them to sign up for unwanted services and products, they remain unregulated and are practised openly by almost all platforms.</p><p>Recently, though, some countries have started taking action against such companies to protect the best interests of users. For example, France’s data protection regulator, the CNIL, penalised <a href="https://www.deccanherald.com/tag/google" target="_blank">Google</a> and YouTube for making it harder for users to opt out of cookies than to accept them. In response to the fines, Google has begun rolling out new cookie prompts that give users a clear “Deny All” button, allowing them to opt out easily. The same regulator had also hit Google with a €50 million fine back in 2019 for lack of transparency and valid consent in its ad personalisation.
In 2020, it struck again, fining Google €100 million for placing cookies on users’ computers without first obtaining informed consent.</p><p>India can take inspiration from laws in other countries. In 2018, California enacted the California Consumer Privacy Act (CCPA), with the aim of boosting privacy rights and consumer protections for its residents. In 2020, its provisions were expanded by the California Privacy Rights Act (CPRA), which contains several provisions against dark patterns. For example, it states that “A business’s methods for submitting requests to opt-out shall be easy for consumers to execute and shall require minimal steps to allow the consumer to opt-out. A business shall not use a method that is designed with the purpose or has the substantial effect of subverting or impairing a consumer’s choice to opt-out.” This explicitly forbids the dark pattern of making it unreasonably hard for users to opt out.</p><p>Another provision states that “A business shall not use confusing language, such as double-negatives (e.g., “Don’t Not Sell My Personal Information”), when providing consumers the choice to opt-out.” This prevents a provider from tricking users into handing over their data.</p><p>Yet another provision states that “Upon clicking the “Do Not Sell My Personal Information” link, the business shall not require the consumer to search or scroll through the text of a privacy policy or similar document or webpage to locate the mechanism for submitting a request to opt-out.” This makes it much easier for users to opt out.</p><p>The most significant of these is the definition of “consent”. It states that “consent” means any freely given, specific, informed, and unambiguous indication of the consumer’s wishes by which the consumer…signifies agreement to the processing of personal information relating to the consumer for a narrowly defined particular purpose. Acceptance of a general or broad terms of use…does not constitute consent.
Hovering over, muting, pausing, or closing a given piece of content does not constitute consent. Likewise, agreement obtained through use of dark patterns does not constitute consent.” This is significant because it completely invalidates any “consent” obtained through a dark pattern. The Colorado Privacy Act (CPA) is along the same lines.</p><p>While there are promising legislative steps in important jurisdictions such as California and Europe, India is yet to enact any law against such practices. Such legislation is important because users have nearly lost their right to choose on the platforms they access. For now, there is little an individual user can do apart from sitting back, waiting, and trying not to get frustrated whenever caught in a dark pattern.</p><p><em><span class="italic">(The writers are advocates in the Delhi High Court and Supreme Court)</span></em></p>