A live demonstration uses facial recognition technology at the Las Vegas Convention Center. (Getty Images)
Private company, ACLU both back S.F. ordinance banning use by authorities
The city of San Francisco drew a line in the sand for the nascent facial recognition industry with a new city ordinance that bans the technology from being used by law enforcement or other authorities.
The city is one of the first to block the software from being deployed, calling the technology “psychologically unhealthy” and, if allowed, “the beginnings of a surveillance state,” according to city supervisor Aaron Peskin, who sponsored the bill. Similar bans are under consideration in Massachusetts and other states.
The ban does not affect the use of the technology by private companies.
Mary Haskett is the co-founder of Blink Identity, one of the leading facial recognition companies, which is testing its software in North American venues and is a partner of Live Nation. She does not see the new law having any effect on her company.
“I’m actually a huge fan of the legislation,” Haskett said. “It prevents law enforcement and other government agencies from using the technology for mass surveillance, which is a little bit creepy.”
Haskett said that the company is focused on using its technology for VIP lanes to get fans in and out of venues quicker and that it is not involved in using facial recognition technology to capture the faces of anyone who has not signed up for the program.
“Our clients have control over their data,” she said. “They can take themselves out at any time. It’s used for one thing, and we don’t share our data with anyone. As a private citizen, I want cities and municipalities to pick up the ban and keep it going.”
Haskett can reel off many things facial recognition is an amazing tool for — finding lost people, finding terrorists, rapid entry into venues — but acknowledges that putting cameras everywhere with people being monitored becomes mass surveillance.
“You can see how that would be abused. Just look at China and how they are abusing their tech with their ‘social score,’” Haskett cautioned.
“Who you are seen associating with will make your score go up and down. You go into a ‘Black Mirror’ episode stuff pretty quickly,” she said, referring to the Netflix series full of technology-driven nightmares.
“There is a place for government to be setting standards. I think the important thing is transparency and accountability — what data do you have and what are you doing with it and can you be held accountable if you misuse it?”
Misuse is exactly what Jay Stanley, senior policy analyst for the American Civil Liberties Union’s Speech, Privacy and Technology Project, worries about.
“The officials at venues should tell people that their faces will be scanned for security purposes — preferably before they pay for a ticket,” Stanley said. “They also should tell attendees whether they are saving the photos and what they plan to do with them. If venue officials don’t think people would mind, there’s no reason they shouldn’t do so. If they do think that their customers would mind, but do it anyway, that’s pretty shady.”
Stanley is also concerned with how the technology is rolled out. “If it is rushed into deployment by lots of different operators, it’s not going to be done right,” he said.
No law now prevents a private property owner from installing facial recognition technology or dictates what can be done with the data it may collect.
“As these watchlists become institutionalized, in all likelihood, the shared data’s dissemination will have consequences and there will be unfair treatment … as these lists become magnified,” Stanley said. “We understand the importance of protecting society from dangerous people, but we don’t want to see companies turning into stalkers themselves, following our every move, just because a technology has come along that makes that possible.”
As for government interference into private companies using the technology, Haskett said she was not worried.
“In all of human history I can’t think of a single example of a technology being developed and then people decided not to use it,” Haskett said. “We aren’t very good at putting the genie back in the bottle. Useful technologies get used. We just have to develop a set of principles and guidelines for how it is appropriately used.”