Legally Permissible Does Not Mean Ethical

Author: Safia Kazi, CSX-F, CIPT
Date Published: 30 October 2023

In a deeply concerning report on car manufacturers recently released by the Mozilla Foundation, all 25 major car brands reviewed received a failing privacy score.1 Helpful car features, such as backup cameras, lane-change assistants and onboard navigation, rely on sensors that may collect sensitive information and provide it to the vehicle manufacturer. This information could then be sold, unbeknownst to drivers and passengers, to data brokers.

Car manufacturers get away with collecting and selling excessive amounts of customer information because consumers are often unaware of these practices, and, in many regions, there are weak or no laws and regulations in place to protect drivers from these malicious practices. Some car manufacturers say they collect genetic information, and one claims to collect information about sexual activity and intelligence.2

Numerous websites and consumers have written about Mozilla’s report, and the backlash against auto manufacturers’ privacy practices is strong. Mozilla’s report should serve as a reminder that just because enterprises may be permitted to do something, that does not mean they should. Treating compliance as an aspiration and collecting excessive amounts of information is harmful to consumers; privacy professionals must help their enterprises’ privacy practices become more ethically focused.

Compliance Is the Floor, Not the Ceiling

Privacy laws and regulations vary around the world and may differ based on industry. Despite the excessive amount of data collected by vehicle manufacturers, many of them are compliant with applicable laws and regulations. This paradox is possible because compliance is the bare minimum; it should be a given that enterprises are working toward being compliant. Compliance does not ensure ethical behavior.

Enterprises should aim to respect and protect their consumers, and using compliance requirements as the goal is problematic for a few reasons. First, consumers living in areas with weak privacy laws (or no privacy laws at all) are at a disadvantage compared to consumers in areas with strong privacy regulations in place. All consumers deserve privacy, regardless of the regulatory landscape of their region. Second, privacy laws cannot protect consumers in every possible way: predatory enterprises will find loopholes in laws and regulations or choose to be noncompliant and pay any associated fines.

Privacy practitioners who work at enterprises that are solely focused on compliance should advocate for taking privacy measures a step further. They should look at what others in the industry are doing; enterprises that go above and beyond to protect consumers will have a competitive advantage. Any enterprise that takes steps to better protect privacy can stand apart from its competitors. For example, if one of the car manufacturers reviewed by Mozilla had received a passing score, it could have capitalized on this and received a lot of free positive publicity.

Becoming a Target

The more data an enterprise has, the more of a target it is to hackers. Cars may have access to location data, health-related information and driver habits, and automotive enterprises have been breached in the past because of this. Mercedes-Benz experienced a data breach in 2021 that affected 1.6 million consumer records, which included credit card information, Social Security numbers and driver’s license numbers.3 While that alone is concerning, consider what would happen if other data, such as health conditions and genetic information,4 were compromised.

Privacy practitioners should emphasize the importance of data minimization (i.e., data is adequate, relevant and limited to what is necessary in relation to the purposes for which it is processed). Not having sensitive information can make enterprises less of a target for attackers, and not having a lot of information about consumers can minimize the harm to data subjects in the event of a breach.
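
To make the principle concrete, the following is a minimal sketch of what purpose-based data minimization could look like in a vehicle telemetry pipeline. The field names, purposes and sample payload are hypothetical and are not drawn from any manufacturer’s actual systems.

```python
# Minimal sketch of purpose-based data minimization for vehicle telemetry.
# All field names, purposes and the sample payload are hypothetical.

# Fields the enterprise has declared necessary for each processing purpose.
ALLOWED_FIELDS_BY_PURPOSE = {
    "navigation": {"gps_lat", "gps_lon", "heading", "speed_kph"},
    "collision_detection": {"speed_kph", "brake_pressure", "airbag_status"},
}


def minimize(payload: dict, purpose: str) -> dict:
    """Return only the fields that are necessary for the stated purpose.

    Anything not on the allow list (e.g., cabin audio, biometric readings)
    is dropped before the record is stored or transmitted.
    """
    allowed = ALLOWED_FIELDS_BY_PURPOSE.get(purpose, set())
    return {key: value for key, value in payload.items() if key in allowed}


if __name__ == "__main__":
    raw_reading = {
        "gps_lat": 41.8781,
        "gps_lon": -87.6298,
        "speed_kph": 52,
        "heading": 270,
        "cabin_audio_clip": b"...",   # sensitive, never needed for navigation
        "driver_heart_rate": 78,      # sensitive, never needed for navigation
    }
    print(minimize(raw_reading, "navigation"))
    # {'gps_lat': 41.8781, 'gps_lon': -87.6298, 'speed_kph': 52, 'heading': 270}
```

Collecting only what a declared purpose requires means there is simply less sensitive data to breach, sell or misuse in the first place.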

It is also imperative that privacy professionals work closely with security professionals to ensure that data is protected. Depending on organizational structure, privacy professionals should also work with data engineers and data scientists to better understand what information the enterprise collects and why.

Consider the Harm to Consumers

Assumptions about an individual’s health conditions, political beliefs, religion and sexual orientation can be made from the data enterprises collect.

In the case of vehicles, precise location data could be used to stalk people. In 2019, a man pleaded guilty to stalking his ex-girlfriend using an app that monitored her location. All the app required was her car’s vehicle identification number.5 Failing to adequately secure location data, or selling it to third parties who then resell it to anyone willing to pay, could put consumers’ lives at risk.

Tesla’s sophisticated Autopilot feature requires many sensors and awareness of the car’s surroundings. Teslas’ cameras can capture sensitive videos, and many employees have access to these clips. Tesla has a video of a completely naked man approaching a Tesla, and another video of a Tesla hitting a child went viral at a Tesla office.6 In the case of a car camera, enterprises must ensure that customers know what the camera might capture, even if no one is driving. Former Tesla employees said customers’ cars had footage of people doing laundry and other “really intimate things.”7 Although people may be aware that their car is capturing data while they drive it, they may not consider that it is also capturing information while parked.

Privacy practitioners must ensure that consumers understand exactly what data may be collected and how it will be used. It is also important to look at the conclusions that can be drawn from the data collected. Although enterprise leaders may think they are only selling consumers’ location data, they are also selling all the assumptions associated with that location data. Understanding the conclusions that can be drawn about individuals based on their data is crucial to understanding the harms associated with collecting, selling or sharing data.
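
As an illustration, the short sketch below shows how even a simple rule-based mapping could turn a list of visited places into sensitive inferences. The place categories and rules are hypothetical and are only meant to make the risk tangible.

```python
# Minimal sketch of how raw location history can turn into sensitive inferences.
# The place categories and the mapping below are hypothetical illustrations.

SENSITIVE_INFERENCES = {
    "oncology_clinic": "possible serious health condition",
    "place_of_worship": "likely religious affiliation",
    "political_rally": "likely political beliefs",
}


def infer_attributes(visited_places: list[str]) -> set[str]:
    """Map visits to categorized places onto inferred personal traits."""
    return {
        SENSITIVE_INFERENCES[place]
        for place in visited_places
        if place in SENSITIVE_INFERENCES
    }


if __name__ == "__main__":
    trips = ["grocery_store", "oncology_clinic", "place_of_worship"]
    print(infer_attributes(trips))
    # e.g., {'possible serious health condition', 'likely religious affiliation'}
    # (set ordering may vary)
```

Selling “just location data” therefore sells everything a buyer can infer from it.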

It is also critical that all employees complete privacy awareness training. They must understand that sharing sensitive customer information with other employees, or mocking it, would feel like a serious violation to that customer, and there should be consequences for this kind of behavior.

Is Selling Data a Necessity?

The average price for a new car in the United States in April 2023 was US$48,275.8 Auto manufacturers could be financially stable without resorting to selling data, but selling data can be a lucrative business. It is estimated that car-generated data could become a US$450 billion to US$750 billion market by 2030.9

The US state of Massachusetts introduced legislation to address the privacy issues associated with cars,10 but enterprises should not look to laws and regulations for guidance on selling data. It is likely that laws and regulations will not do enough to protect people from having their data sold, because data brokers spend a considerable amount of money to ensure they can continue to buy and sell personal information. After the US House of Representatives introduced the American Data Privacy and Protection Act, data broker lobbying increased considerably, with data brokers spending US$1.73 million during the second quarter of 2022, compared to US$1.55 million in the second quarter of 2021.11

Many industries that do not need to sell data to survive are cashing in on this lucrative business. Privacy professionals need to explain to senior leadership that when consumers eventually find out that their data is being sold, it could cause reputational harm to the enterprise and damage trust with consumers, ultimately impacting the enterprise’s bottom line.

Providing Meaningless Choices

In many cases—not just with vehicles—privacy-related choices provided to consumers are meaningless: consent to excessive data collection or do not leverage the service. Given that every car brand Mozilla reviewed received a failing privacy score, consumers do not even have the option of selecting a car manufacturer that better protects privacy. With one vehicle manufacturer, passengers consent to their data being used or sold just by sitting in the car.12

Privacy professionals must advocate for meaningful privacy-related choices, not an all-or-nothing setup in which users must accept the terms and conditions as they are. Users should be able to granularly express their privacy preferences.

Privacy professionals should also advocate for settings that default to protecting privacy. Providing consumers with the ability to express their preferences is important, but they should not have to do anything to have their privacy protected; it should be automatic. One car manufacturer responded to Mozilla’s report and emphasized that customers can opt out of some data collection.13 But that is not enough; the default should be that superfluous data collection does not happen. Practicing privacy by design can help ensure that default settings are privacy preserving.
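
The sketch below illustrates what granular, opt-in-by-default consent settings could look like in code: every optional data use is a separate choice, and every choice starts switched off. The category names and the gating function are illustrative assumptions, not any vendor’s actual API.

```python
# Minimal sketch of granular, privacy-by-default consent settings.
# Category names and the gating function are illustrative, not a real vendor API.

from dataclasses import dataclass


@dataclass
class ConsentPreferences:
    """Each optional data use is a separate choice and defaults to off (opt-in)."""
    location_history: bool = False
    driving_behavior_analytics: bool = False
    in_cabin_camera_review: bool = False
    share_with_third_parties: bool = False


def collection_allowed(prefs: ConsentPreferences, category: str) -> bool:
    """Collect a data category only if the consumer has explicitly opted in."""
    return getattr(prefs, category, False)


if __name__ == "__main__":
    prefs = ConsentPreferences()  # a new customer: everything off by default
    print(collection_allowed(prefs, "location_history"))          # False
    prefs.location_history = True                                 # explicit, granular opt-in
    print(collection_allowed(prefs, "location_history"))          # True
    print(collection_allowed(prefs, "share_with_third_parties"))  # still False
```

Because the defaults already protect privacy, consumers who never touch the settings are no worse off than those who do.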


Conclusion

Just because enterprises are legally permitted to collect and use data in a certain manner does not mean they should. Building trust with consumers requires doing more for them than the bare minimum legal obligations and understanding the harm that comes with excessive data collection and selling data. Privacy professionals working at enterprises that leverage legal privacy loopholes in their data processing must advocate for consumers; if harm comes to a data subject as a result of an enterprise’s use of that person’s data, it could irreversibly damage trust between consumers and providers. In an industry that relies heavily on unethical data collection practices, any enterprise that chooses not to join its peers in excessive data collection can set itself apart from its predatory competitors and gain a competitive advantage.

Endnotes

1 Mozilla, “‘Privacy Nightmare on Wheels’: Every Car Brand Reviewed By Mozilla—Including Ford, Volkswagen and Toyota—Flunks Privacy Test,” 6 September 2023
2 Caltrider, J.; M. Rykov; Z. MacDonald; “After Researching Cars and Privacy, Here’s What Keeps Us Up at Night,” Mozilla, 6 September 2023
3 Sharma, A.; “Mercedes-Benz Data Breach Exposes SSNs, Credit Card Numbers,” Bleeping Computer, 25 June 2021
4 Op cit Caltrider
5 Yeo, A.; “Man Pleads Guilty to Stalking His Ex Using Her Car's Built-In Tracking App,” Mashable, 8 November 2019
6 Stecklow, S.; W. Cunningham; H. Jin; “Tesla Workers Shared Sensitive Images Recorded By Customer Cars,” Reuters, 6 April 2023
7 Ibid.
8 Tucker, S.; “Average New Car Price Holds Steady; Incentives Rising,” Kelley Blue Book, 10 May 2023
9 McKinsey and Company, Monetizing Car Data, USA, September 2016
10 DaSilva, S.; “Massachusetts Is Trying to Protect Data Privacy in Cars,” Jalopnik, 18 September 2023
11 Ng, A.; “Privacy Bill Triggers Lobbying Surge By Data Brokers,” Politico, 28 August 2022
12 Mozilla, “Subaru,” 15 August 2023
13 DeMattia, N.; “BMW Defends Its Data Privacy Policies, Rejects Mozilla’s Scathing Report,” The Drive, 14 September 2023

Safia Kazi, CSX-F, CIPT

Is a privacy professional practices principal at ISACA®. In this role, she focuses on the development of ISACA’s privacy-related resources, including books, white papers and review manuals. Kazi has worked at ISACA for 9 years, previously working on the ISACA® Journal and developing the award-winning ISACA Podcast. In 2021, she was a recipient of the AM&P Network’s Emerging Leader award, which recognizes innovative association publishing professionals under the age of 35.
