Special Report – Tesla workers shared sensitive images recorded by customer cars
2023.04.06 10:08
© Reuters. FILE PHOTO: A Tesla Model 3 vehicle is shown using the Autopilot Full Self Driving Beta software (FSD) while navigating a city road in Encinitas, California, U.S., February 28, 2023. REUTERS/Mike Blake/File Photo
By Steve Stecklow, Waylon Cunningham and Hyunjoo Jin
LONDON/SAN FRANCISCO (Reuters) – Tesla Inc assures its millions of electric car owners that their privacy “is and will always be enormously important to us.” The cameras it builds into vehicles to assist driving, it notes on its website, are “designed from the ground up to protect your privacy.”
But between 2019 and 2022, groups of Tesla employees privately shared, via an internal messaging system, sometimes highly invasive videos and images recorded by customers’ car cameras, according to interviews by Reuters with nine former employees.
Some of the recordings caught Tesla customers in embarrassing situations. One ex-employee described a video of a man approaching a vehicle completely naked.
Also shared: crashes and road-rage incidents. One crash video in 2021 showed a Tesla driving at high speed in a residential area hitting a child riding a bike, according to another ex-employee. The child flew in one direction, the bike in another. The video spread around a Tesla office in San Mateo, California, via private one-on-one chats, “like wildfire,” the ex-employee said.
Other images were more mundane, such as pictures of dogs and funny road signs that employees made into memes by embellishing them with amusing captions or commentary, before posting them in private group chats. While some postings were only shared between two employees, others could be seen by scores of them, according to several ex-employees.
Tesla states in its online “Customer Privacy Notice” that its “camera recordings remain anonymous and are not linked to you or your vehicle.” But seven former employees told Reuters the computer program they used at work could show the location of recordings – which potentially could reveal where a Tesla owner lived.
One ex-employee also said that some recordings appeared to have been made when cars were parked and turned off. Several years ago, Tesla would receive video recordings from its vehicles even when they were off, if owners gave consent. It has since stopped doing so.
“We could see inside people’s garages and their private properties,” said another former employee. “Let’s say that a Tesla customer had something in their garage that was distinctive, you know, people would post those kinds of things.”
Tesla didn’t respond to detailed questions sent to the company for this report.
About three years ago, some employees stumbled upon and shared a video of a unique submersible vehicle parked inside a garage, according to two people who viewed it. Nicknamed “Wet Nellie,” the white Lotus Esprit sub had been featured in the 1977 James Bond film, “The Spy Who Loved Me.”
The vehicle’s owner: Tesla Chief Executive Elon Musk, who had bought it for about $968,000 at an auction in 2013. It is not clear whether Musk was aware of the video or that it had been shared.
Musk didn’t respond to a request for comment.
To report this story, Reuters contacted more than 300 former Tesla employees who had worked at the company over the past nine years and were involved in developing its self-driving system. More than a dozen agreed to answer questions, all speaking on condition of anonymity.
Reuters wasn’t able to obtain any of the shared videos or images, which ex-employees said they hadn’t kept. The news agency also wasn’t able to determine if the practice of sharing recordings, which occurred within some parts of Tesla as recently as last year, continues today or how widespread it was. Some former employees contacted said the only sharing they observed was for legitimate work purposes, such as seeking assistance from colleagues or supervisors.
LABELING PEDESTRIANS AND STREET SIGNS
The sharing of sensitive videos illustrates one of the less-noted features of artificial intelligence systems: They often require armies of human beings to help train machines to learn automated tasks such as driving.
Since about 2016, Tesla has employed hundreds of people in Africa and later the United States to label images to help its cars learn how to recognize pedestrians, street signs, construction vehicles, garage doors and other objects encountered on the road or at customers’ houses. To accomplish that, data labelers were given access to thousands of videos or images recorded by car cameras, which they would view in order to identify and tag the objects in them.
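To make the workflow concrete, here is a minimal, hypothetical sketch of the kind of record a labeler might produce when drawing boxes around objects in a single camera frame. The field names, object categories and helper function are assumptions made purely for illustration; they do not reflect Tesla’s actual tools or data format.

    # Hypothetical example of a bounding-box annotation record produced by a
    # human labeler reviewing frames from a vehicle camera. Illustrative only;
    # not Tesla's actual schema.
    from collections import Counter
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class BoxLabel:
        frame_id: str   # identifier of the camera frame being annotated
        category: str   # e.g. "pedestrian", "stop_sign", "garage_door"
        x_min: float    # box corners in pixel coordinates
        y_min: float
        x_max: float
        y_max: float

    def label_counts(labels: List[BoxLabel]) -> Counter:
        """Tally how many boxes of each category a batch of labels contains."""
        return Counter(label.category for label in labels)

    if __name__ == "__main__":
        batch = [
            BoxLabel("frame_0001", "pedestrian", 104.0, 220.5, 161.2, 388.0),
            BoxLabel("frame_0001", "stop_sign", 530.0, 90.0, 585.5, 148.0),
            BoxLabel("frame_0002", "garage_door", 0.0, 60.0, 640.0, 310.0),
        ]
        print(label_counts(batch))
        # Counter({'pedestrian': 1, 'stop_sign': 1, 'garage_door': 1})

In practice, a system typically needs many thousands of such boxes per category before it can reliably tell, say, a fire hydrant from a pedestrian, the kind of error described later in this report.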
Tesla increasingly has been automating the process, and shut down a data-labeling hub last year in San Mateo, California. But it continues to employ hundreds of data labelers in Buffalo, New York. In February, Tesla said the staff there had grown 54% over the previous six months to 675.
Two ex-employees said they weren’t bothered by the sharing of images, saying that customers had given their consent or that people long ago had given up any reasonable expectation of keeping personal data private. Three others, however, said they were troubled by it.
“It was a breach of privacy, to be honest. And I always joked that I would never buy a Tesla after seeing how they treated some of these people,” said one former employee.
Another said: “I’m bothered by it because the people who buy the car, I don’t think they know that their privacy is, like, not respected … We could see them doing laundry and really intimate things. We could see their kids.”
One former employee saw nothing wrong with sharing images, but described a function that allowed data labelers to view the location of recordings on Google Maps as a “massive invasion of privacy.”
David Choffnes, executive director of the Cybersecurity and Privacy Institute at Northeastern University in Boston, called sharing of sensitive videos and images by Tesla employees “morally reprehensible.”
“Any normal human being would be appalled by this,” he said. He noted that circulating sensitive and personal content could be construed as a violation of Tesla’s own privacy policy — potentially resulting in intervention by the U.S. Federal Trade Commission, which enforces federal laws relating to consumers’ privacy.
A spokesperson for the FTC said it doesn’t comment on individual companies or their conduct.
To develop self-driving car technology, Tesla collects a vast trove of data from its global fleet of several million vehicles. The company requires car owners to grant permission on the cars’ touchscreens before Tesla collects their vehicles’ data. “Your Data Belongs to You,” states Tesla’s website.
In its Customer Privacy Notice, Tesla explains that if a customer agrees to share data, “your vehicle may collect the data and make it available to Tesla for analysis. This analysis helps Tesla improve its products, features, and diagnose problems quicker.” It also states that the data may include “short video clips or images,” but isn’t linked to a customer’s account or vehicle identification number, “and does not identify you personally.”
Carlo Piltz, a data privacy lawyer in Germany, told Reuters it would be difficult to find a legal justification under Europe’s data protection and privacy law for vehicle recordings to be circulated internally when it has “nothing to do with the provision of a safe or secure car or the functionality” of Tesla’s self-driving system.
In recent years, Tesla’s car-camera system has drawn controversy. In China, some government compounds and residential neighborhoods have banned Teslas because of concerns about their cameras. In response, Musk said in a virtual talk at a Chinese forum in 2021: “If Tesla used cars to spy in China or anywhere, we will get shut down.”
Elsewhere, regulators have scrutinized the Tesla system over potential privacy violations. But the privacy cases have tended to focus not on the rights of Tesla owners but on those of passers-by unaware that they might be recorded by parked Tesla vehicles.
In February, the Dutch Data Protection Authority, or DPA, said it had concluded an investigation of Tesla over possible privacy violations regarding “Sentry Mode,” a feature designed to record any suspicious activity when a car is parked and alert the owner.
“People who walked by these vehicles were filmed without knowing it. And the owners of the Teslas could go back and look at these images,” said DPA board member Katja Mur in a statement. “If a person parked one of these vehicles in front of someone’s window, they could spy inside and see everything the other person was doing. That is a serious violation of privacy.”
The watchdog determined it wasn’t Tesla, but the vehicles’ owners, who were legally responsible for their cars’ recordings. It said it decided not to fine the company after Tesla said it had made several changes to Sentry Mode, including having a vehicle’s headlights pulse to inform passers-by that they may be recorded.
A DPA spokesperson declined to comment on Reuters’ findings, but said in an email: “Personal data must be used for a specific purpose, and sensitive personal data must be protected.”
REPLACING HUMAN DRIVERS
Tesla calls its automated driving system Autopilot. Introduced in 2015, the system offered such advanced features as lane changes triggered by a tap of the turn signal and parallel parking on command. To make the system work, Tesla initially installed sonar sensors, radar and a single front-facing camera at the top of the windshield. A subsequent version, introduced in 2016, included eight cameras all around the car to collect more data and offer more capabilities.
Musk’s vision is eventually to offer a “Full Self-Driving” mode that would replace a human driver. Tesla began rolling out an experimental version of that mode in October 2020. Although it requires drivers to keep their hands on the wheel, it currently offers such features as the ability to slow a car down automatically when it approaches stop signs or traffic lights.
In February, Tesla recalled more than 362,000 U.S. vehicles to update their Full Self-Driving software after the National Highway Traffic Safety Administration said it could allow vehicles to exceed speed limits and potentially cause crashes at intersections.
As with many artificial-intelligence projects, to develop Autopilot, Tesla hired data labelers to identify objects in images and videos to teach the system how to respond when the vehicle was on the road or parked.
Tesla initially outsourced data labeling to a San Francisco-based non-profit then known as Samasource, people familiar with the matter told Reuters. The organization had an office in Nairobi, Kenya, and specialized in offering training and employment opportunities to disadvantaged women and youth.
In 2016, Samasource was providing about 400 workers there for Tesla, up from an initial 20 or so, according to a person familiar with the matter.
By 2019, however, Tesla was no longer satisfied with the work of Samasource’s data labelers. At an event called Tesla AI Day in 2021, Andrej Karpathy, then senior director of AI at Tesla, said: “Unfortunately, we found very quickly that working with a third party to get data sets for something this critical was just not going to cut it … Honestly the quality was not amazing.”
A former Tesla employee said of the Samasource labelers: “They would highlight fire hydrants as pedestrians … They would miss objects all the time. Their skill level to draw boxes was very low.”
Samasource, now called Sama, declined to comment on its work for Tesla.
Tesla decided to bring data labeling in-house. “Over time, we’ve grown to more than a 1,000-person data labeling (organization) that is full of professional labelers who are working very closely with the engineers,” Karpathy said in his August 2021 presentation.
Karpathy didn’t respond to requests for comment.
Tesla’s own data labelers initially worked in the San Francisco Bay area, including the office in San Mateo. Groups of data labelers were assigned a variety of tasks, including labeling street lane lines or emergency vehicles, ex-employees said.
At one point, Teslas on Autopilot were having difficulty backing out of garages and would get confused when encountering shadows or objects such as garden hoses. So some data labelers were asked to identify objects in videos recorded inside garages. The problem eventually was solved.
In interviews, two former employees said in their normal work duties they were sometimes asked to view images of customers in and around their homes, including inside garages.
“I sometimes wondered if these people know that we’re seeing that,” said one.
“I saw some scandalous stuff sometimes, you know, like I did see scenes of intimacy but not nudity,” said another. “And there was just definitely a lot of stuff that like, I wouldn’t want anybody to see about my life.”
As an example, this person recalled seeing “embarrassing objects,” such as “certain pieces of laundry, certain sexual wellness items … and just private scenes of life that we really were privy to because the car was charging.”
MEMES IN THE SAN MATEO OFFICE
Tesla staffed its San Mateo office with mostly young workers, in their 20s and early 30s, who brought with them a culture that prized entertaining memes and viral online content. Former staffers described a freewheeling atmosphere in chat rooms, with workers exchanging jokes about images they viewed while labeling.
According to several ex-employees, some labelers shared screenshots, sometimes marked up using Adobe Photoshop, in private group chats on Mattermost, Tesla’s internal messaging system. There they would attract responses from other workers and managers. Participants would also add their own marked-up images, jokes or emojis to keep the conversation going. Some of the emojis were custom-created to reference office inside jokes, several ex-employees said.
One former labeler described sharing images as a way to “break the monotony.” Another described how the sharing won admiration from peers.
“If you saw something cool that would get a reaction, you post it, right, and then later, on break, people would come up to you and say, ‘Oh, I saw what you posted. That was funny,’” said this former labeler. “People who got promoted to lead positions shared a lot of these funny items and gained notoriety for being funny.”
Some of the shared content resembled memes on the internet. There were dogs, interesting cars, and clips of people tripping and falling that had been recorded by Tesla cameras. There was also disturbing content, such as someone being dragged into a car seemingly against their will, said one ex-employee.
Video clips of crashes involving Teslas were also sometimes shared in private chats on Mattermost, several former employees said. Those included examples of people driving badly or collisions involving people struck while riding bikes – such as the one with the child – or a motorcycle. Some data labelers would rewind such clips and play them in slow motion.
At times, Tesla managers would crack down on inappropriate sharing of images in public Mattermost channels, saying the practice violated company policy. Still, screenshots and memes based on them continued to circulate through private chats on the platform, several ex-employees said. Workers shared them one-on-one or in small groups as recently as the middle of last year.
One of the perks of working for Tesla as a data labeler in San Mateo was the chance to win a prize – use of a company car for a day or two, according to two former employees.
But some of the lucky winners became paranoid when driving the electric cars.
“Knowing how much data those vehicles are capable of collecting definitely made folks nervous,” one ex-employee said.
(Reported by Steve Stecklow and Waylon Cunningham in London and Hyunjoo Jin in San Francisco. Edited by Peter Hirschberg.)