San Francisco is standing up against the potential abuse of facial recognition technology by police and other agencies. In early May this year, facial recognition software was officially banned in the city.
San Francisco is now the first American city to disallow the use of a tool that police officers rely on to search for suspects ranging from small-fry criminals to terrorists and perpetrators of mass killings.
After June’s mass shooting, authorities used the same technology to identify the suspect. However, civil liberties groups expressed fear that the tool’s potential abuse could result in overly oppressive surveillance in the country.
The city supervisor supporting the bill, Aaron Peskin, said it sends a powerful message to the nation, coming from a city that technology has transformed.
“I think part of San Francisco being the real and perceived headquarters for all things tech also comes with a responsibility for its local legislators,” Mr. Peskin said in an interview with The New York Times.
He added, “We have an outsize responsibility to regulate the excesses of technology precisely because they are headquartered here.”
Critics, on the other hand, believe that instead of focusing on bans, San Francisco should come up with regulations that acknowledge the tool’s usefulness.
According to Jonathan Turley, a constitutional law expert at George Washington University, “it is ridiculous to deny the value of this technology in securing airports and border installations.”
“It is hard to deny that there is a public safety value to this technology,” he added.
It appears San Francisco will not be the only U.S. city aiming to ban facial recognition. Similar bills are under consideration in Oakland and in Somerville, Mass. A Massachusetts state legislature bill would prohibit facial recognition as well as remote biometric surveillance systems. A bill has also been introduced on Capitol Hill that would bar users of commercial face recognition technology from gathering and sharing data to identify and track customers without their consent. The bill, however, doesn’t mention the government’s use of such technology.
Matt Cagle, a lawyer with the A.C.L.U. of Northern California, summed up the many concerns about facial recognition. Cagle believes the technology “provides [the] government with unprecedented power to track people going about their daily lives. That’s incompatible with a healthy democracy.”
Cagle added that the San Francisco proposal “is really forward-looking and looks to prevent the unleashing of this dangerous technology against the public.”
The technology is currently being used across the U.S. in airports and big stadiums, and by police departments. Pop star Taylor Swift also acknowledged the tool’s usefulness for identifying stalkers, admitting it had been used at one of her shows.
The battle in San Francisco over facial recognition technology is mainly theoretical. City authorities do not deploy such tools, apart from at the airports and ports that are under federal jurisdiction, where the legislation does not apply.
According to Jennifer Friedenbach, the executive director of the Coalition on Homelessness, biometric finger scans and photos are used in some local homeless shelters. Friedenbach said the use of these technologies prevented undocumented residents from using the shelters.
The issue has been a debate between opposing sides in a city with a high rate of property crime. The bill prohibits agencies from using facial recognition technology or information gleaned from such systems. It’s part of a legislative package requiring local agencies to establish policies controlling the use of these tools. Still, some exemptions favor prosecutors when transparency requirements would interfere with investigations.
The San Francisco Police Officers Association, a union of officers, believes the ban limits officers’ efforts to investigate crimes.
“Although we understand that it’s not a 100 percent accurate technology yet, it’s still evolving,” said the president of the association, Tony Montoya. “I think it has been successful in at least providing leads to criminal investigators.”
“Basically, governments and companies have been very secretive about where it’s being used, so the public is largely in the dark about the state of play,” Mr. Cagle said of the technology’s spread across the country and why its extent is impossible to pin down precisely.
Dave Maass, a senior investigative researcher at the Electronic Frontier Foundation, provided an incomplete list of departments that have used the technology. Maass included Las Vegas, San Jose, Orlando, San Diego, Boston, New York City, Detroit, and Durham, N.C.
Maass also added to the list the Colorado Department of Public Safety, the California Department of Justice, the Pinellas County Sheriff’s Office in Florida, and the Virginia State Police.
At many airports and seaports in the country, U.S. Customs and Border Protection uses facial recognition on travelers, who stand before cameras and have their passport pictures matched against their photos. The agency says it complies with privacy laws but still receives criticism from the Electronic Privacy Information Center.
“When you have the ability to track people in physical space, in effect everybody becomes subject to the surveillance of the government,” said the group’s executive director, Marc Rotenberg, pointing out a bigger concern.
The technology has grown rapidly, turning once far-fetched ideas into reality on smartphones. Devices now feature facial recognition, which is also used to unlock phones and other gadgets.
However, experts fear that the government is shirking its responsibility to protect people’s privacy. A similar scenario already exists in China, where there is close surveillance of the Uighurs, a mostly Muslim minority, through a system powered by about 200 million cameras.
Civil liberties groups in the U.S. have warned that facial recognition used without people’s consent may threaten their freedom to attend political protests or meetings anonymously. Bradford L. Smith, the president of Microsoft, has asked Congress to oversee the use of the technology, as it is too risky for companies to police on their own.
Arguments over the issue have grown stronger since studies were published suggesting the technology is biased in recognizing faces. Since then, companies like I.B.M., Microsoft, and Amazon have improved their tools and report finding no such differences in recognizing faces.
Last year, the American Civil Liberties Union and non-profit organizations called on Amazon to stop selling such technologies to governments, as African-Americans and women could easily be wrongly identified and arrested.
In an essay, Luke Stark, a postdoctoral researcher at Microsoft Research Montreal, described facial surveillance as “the plutonium of artificial intelligence,” reasoning that it must be “recognized as anathema to the health of human society, and heavily restricted as a result.”
Alvaro Bedoya, the director of Georgetown University’s Center on Privacy and Technology, said that 30 states allow authorities to search people’s driver’s license photos. Bedoya said this is equivalent to an endless police lineup, as authorities use the photos to match the faces of suspected criminals. The difference, however, is that with such technologies an algorithm does the work instead of real people.
“This is the most pervasive and risky surveillance technology of the 21st century,” Bedoya added, pointing out the lack of regulation of the technology’s use.
Daniel Castro, who directs the Center for Data Innovation at the Information Technology and Innovation Foundation and is among those opposing the ban, said he would prefer that authorities gain access to the technology’s data only after getting a warrant from a judge, following the Supreme Court’s guidelines.
Supporters of the ban, on the other hand, said they want to halt the use of the tools and conduct further studies before harm is done.
“The government and the public don’t have a handle on what the technology is and what it will become,” said Ben Ewen-Campen, the councilor who sponsored the surveillance ban in Somerville, the Boston suburb.
Ed Davis, the former police commissioner in Boston, said it was “premature to be banning things.” Davis led the department during the Boston Marathon attack. He also said that no one in the country aims to follow China’s example.
“This technology is still developing,” Davis said of its potential, “and as it improves, this could be the answer to a lot of problems we have about securing our communities.”
Joel Engardio, vice president of Stop Crime SF, agrees that the technology isn’t yet seamless, but he said that shouldn’t mean wholly abandoning its use in the future, once the tools have been improved.
“Instead of an outright ban, why not a moratorium?” asks Engardio.
“Let’s keep the door open for when the technology improves. I’m not a fan of banning things when eventually it could actually be helpful.”