
Protecting Privacy: Consumer Rights in a Digitally-Driven World

Updated: Mar 22, 2021

Written by Bailey Cho


On January 12, 2021, Spotify was granted a patent for technology that would allow it to analyze consumers’ voices and suggest songs based on “emotional state, gender, age, or accent” [1].



How Do Algorithms Work?


Streaming services use machine learning — a subset of AI — to create a deeply personal user experience. Machine learning is a process in which machines are fed data and continually develop the algorithms they have been programmed with. Essentially, the more data the machines are fed, the more sophisticated the algorithms will become [2]. Complex algorithms contribute to the dominance of streaming services such as Spotify. Users don’t have to actively search for their next playlist, and increased data collection enables music recommendations to improve over time.
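To make the feedback loop concrete, here is a deliberately simplified sketch of how a recommender can improve as it ingests more listening data. This is a toy co-occurrence model written for illustration only; the class and song names are hypothetical, and real streaming services use far more sophisticated (and more data-hungry) machine learning models.

```python
from collections import defaultdict

class ToyRecommender:
    """Toy co-occurrence recommender: suggests songs that other users
    played alongside songs this user already played. Illustrates the
    data feedback loop only -- not any real streaming service's system."""

    def __init__(self):
        # co_plays[a][b] = how often songs a and b appear in the same history
        self.co_plays = defaultdict(lambda: defaultdict(int))

    def observe(self, listening_history):
        """Feed in one user's listening history; more data sharpens the scores."""
        for a in listening_history:
            for b in listening_history:
                if a != b:
                    self.co_plays[a][b] += 1

    def recommend(self, user_songs, k=3):
        """Score unheard songs by how often they co-occur with the user's songs."""
        scores = defaultdict(int)
        for song in user_songs:
            for other, count in self.co_plays[song].items():
                if other not in user_songs:
                    scores[other] += count
        return [s for s, _ in sorted(scores.items(), key=lambda x: -x[1])[:k]]

rec = ToyRecommender()
rec.observe(["song_a", "song_b", "song_c"])
rec.observe(["song_a", "song_b"])
print(rec.recommend(["song_a"]))  # → ['song_b', 'song_c']
```

Even in this toy version, the privacy tension is visible: the recommendations only get better because the system retains a growing record of what each user has listened to.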


Product discovery is convenient for the consumer, but the data used to recommend new music can be used for a multitude of purposes. Companies use consumer data to sell products, reflected in the relevant advertisements on our feeds. But data collection can also result in more serious consequences, including unfair targeting of marginalized communities or unauthorized government surveillance.


Societal Impacts of Artificial Intelligence


AI systems could analyze individuals’ behaviors to determine their access to credit or other critical services. They could also endanger people’s immigration status: even if certain applications claim to protect respondents’ anonymity, data misuse and location services could provide the government with sensitive information on undocumented immigrants, enabling agencies to locate and deport them. Furthermore, emotion-recognition technology, an extension of Spotify’s latest patent, could be used by law enforcement to criminalize certain racial groups. AI systems could incorrectly label people as “suspicious,” perpetuating wrongful incarceration and bias in our legal systems.


The private right of action (PRA) allows citizens to seek judicial relief for injuries caused by violators, including companies that abuse consumer data [3]. This is necessary because marginalized communities cannot rely strictly on the government to protect their rights. However, some experts assert that the PRA is ineffective in consumer privacy cases: it often results in class actions that mainly benefit attorneys while providing little relief to victims. There remains debate over who should be allowed to file such a case in the U.S., but marginalized communities are clearly the most vulnerable group in the digital environment. Privacy rights affect all consumers, but the wealthy and privileged have greater protections against these risks.


Consumer Privacy Legislation in Europe


In 2018, the European Union implemented the toughest privacy and security law in the world — the General Data Protection Regulation (GDPR). Though it was drafted and passed by the EU, it imposes obligations on any organisation that targets or collects data of EU residents [4]. The GDPR defines consumer rights, company obligations, and data protection principles for “data processors” (any company that collects or processes data). It can also levy harsh fines on violators. However, many companies have found loopholes in the law, according to data protection and AI expert Ernani Cerasaro of European consumer organisation BEUC.


Cerasaro states that the GDPR is a comprehensive legal framework for privacy, but that the primary issue is enforcement: “We have the rules. It’s really about making sure that they are enforced and platforms, unfortunately, will not do it completely themselves.” Different companies have different attitudes toward compliance. Many view GDPR compliance as a hassle, when in fact, says Cerasaro, “it is really about the society that we are building.”


If the GDPR were correctly enforced, it could prevent the market from being monopolized by a small number of large companies that process enormous quantities of data, creating mega-platforms. For now, companies that follow more ethical data collection practices can’t compete, and even consumers who know which companies mishandle their data remain vulnerable. They are often trapped with these platforms or applications because there is no real alternative.


(The Lack of) Consumer Privacy Legislation in the U.S.


Unlike the European Union, the United States has no single, comprehensive privacy law, and only three states — California, Nevada, and Maine — have privacy laws in effect [5]. The lack of a federal privacy law is the central issue for Americans’ digital privacy, according to Ridhi Shetty, Policy Counsel with the Center for Democracy and Technology’s (CDT) Privacy & Data Project.


Without federal legislation, consumers do not have a clear understanding of their rights, and companies face uncertainty — there is no consistent framework for the collection, processing, and sharing of data. A national standard would give consumers more control over their personal data and allow companies to focus their efforts on compliance and on protecting individuals’ rights, not on a complex set of requirements that differs from jurisdiction to jurisdiction. This would not only reduce companies’ compliance costs but also encourage them to innovate with their data in a legally compliant manner.


In any situation, requiring “consent” will not be enough. If consumers aren’t aware of what they are consenting to, that consent should not be considered meaningful. The first step toward comprehensive policy is to educate consumers on their rights and to foster an open dialogue between legislatures and companies. According to Shetty, there remains much debate about what kinds of rights people can assert on their own and what they should depend on the government to do. Legislation needs to be drafted with the nature of modern technology and the needs of individuals in mind, providing more certainty to both companies and consumers [6].


The Fine Line Between Discovery and Control


Although we legally forfeit our data by “accepting” a 100-page terms-and-conditions document, is this ethical when most consumers never read its terms? Whether or not you believe the advantages of algorithms outweigh their disadvantages, it is clear that consumer privacy rights will become more prominent, as the pandemic has shifted most of our activity online and the United States still lacks comprehensive federal privacy legislation. Algorithms may bring us convenience, but they walk a fine line between consumer discovery and consumer control.


Refer to the corresponding interview with Ridhi Shetty.

 

Sources:


[1] bbc.com/news/entertainment-arts-55839655

[2] mediaupdate.co.za/media/147264/three-ways-ai-has-changed-the-game-for-streaming-services

[3] law.com/newyorklawjournal/2020/04/07/establishing-a-private-right-of-action-in-personal-injury-cases/

[4] gdpr.eu/what-is-gdpr/

[5] varonis.com/blog/us-privacy-laws/

[6] nextgov.com/ideas/2020/09/need-unified-data-protection-us/168643/



