'Stop Secret Surveillance Ordinance'

San Francisco, a US technology hub, has voted to ban the purchase and use of facial recognition technology by city personnel. Is facial recognition an intrusive form of surveillance?


Facial recognition is a category of biometric software that maps an individual's facial features mathematically and stores the data as a faceprint. 

The software identifies 80 nodal points on a human face. In this context, nodal points are endpoints used to measure variables of a person's face, such as the length or width of the nose, the depth of the eye sockets and the shape of the cheekbones. The system captures measurements for these nodal points from a digital image of an individual's face and stores them as a faceprint, which then serves as a basis for comparison with data captured from faces in other images or video.
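The matching step described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual algorithm: it assumes a faceprint is simply a fixed-length numeric vector (real systems derive far higher-dimensional embeddings from measurements like the nodal points above) and compares a probe against a hypothetical enrolled gallery by Euclidean distance.

```python
import math

def euclidean_distance(a, b):
    """Distance between two faceprints (equal-length numeric vectors)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, gallery, threshold=0.6):
    """Return the name of the closest enrolled faceprint,
    or None if no candidate is within the threshold."""
    best_name, best_dist = None, float("inf")
    for name, faceprint in gallery.items():
        d = euclidean_distance(probe, faceprint)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Toy 4-dimensional faceprints; real embeddings use far more dimensions.
gallery = {
    "alice": [0.1, 0.9, 0.3, 0.5],
    "bob":   [0.8, 0.2, 0.7, 0.1],
}
probe = [0.12, 0.88, 0.31, 0.52]
print(best_match(probe, gallery))  # prints alice (closest match within threshold)
```

The threshold is the key design choice: it decides how close two faceprints must be before the system declares them the same person, which is where the accuracy debates discussed below come in.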


San Francisco officials have pre-emptively banned the use of facial recognition technology. The California metropolis's Board of Supervisors voted 8 to 1 in favour of the "Stop Secret Surveillance Ordinance", citing harm to residents' civil liberties. The ban applies to police and other municipal departments, but it does not affect the use of the technology by the federal government at the city's airport and ports, nor does it limit personal or business uses.

The action puts San Francisco at the forefront of growing discontent in the United States over facial recognition, which government agencies have used for years and which has become far more powerful with the rise of cloud computing and artificial intelligence. The ordinance also creates a process for the police department to disclose what surveillance technology it uses, such as licence plate readers and cell-site simulators that can track residents' movements over time. The requirement to go public about surveillance tools is about building trust between law enforcement and the communities it serves, said Nathan Sheard, a grassroots advocacy organizer at the Electronic Frontier Foundation.

Matt Cagle, a technology and civil liberties attorney at the ACLU of Northern California, said the legislation was a positive step towards slowing the rise of technologies that may infringe on the rights of people of colour and immigrants. "Face surveillance won't make us safer, but it will make us less free," Cagle told the Guardian after the proposal passed a committee vote last week. Brian Hofer, executive director of the privacy advocacy group Secure Justice, said that surveillance technology poses a huge legal and civil liberties risk due to its significant error rate.

There are reports of China using facial recognition technology to racially profile its citizens, sorting faces into categories of Han Chinese and Uyghur Muslim.

Microsoft has asked the federal government to regulate facial recognition technology before it becomes more widespread, and says it has declined to sell the technology to law enforcement. As it is, the technology is on track to become pervasive in airports and shopping centres, and other tech companies, such as Amazon, are selling it to police departments.


The Information Technology and Innovation Foundation, a non-profit think tank based in Washington, D.C., issued a statement chiding San Francisco for considering the facial recognition ban, saying that advanced technology makes it cheaper and faster for police to find suspects and identify missing people. Critics were "silly" to compare surveillance usage in the United States with China, given that one country has strong constitutional protections and the other does not, said Daniel Castro, the foundation's vice president. "In reality, San Francisco is more at risk of becoming Cuba than China: a ban on facial recognition will make it frozen in time with outdated technology," he said.


Our assessment is that the identification of people without their knowledge and consent stands in the way of their ability to act and move about freely. It can be noted that surveillance efforts have historically been used to intimidate and oppress certain communities and groups more than others, including those defined by a common race, ethnicity, religion, national origin, income level, sexual orientation, or political perspective.

An MIT study found that facial recognition programs were more likely to return false positives for subjects with darker skin, prompting anxiety about the hazards of computerized misidentification. Compared with other biometric techniques, face recognition may not be the most reliable or efficient. Factors such as illumination, expression, pose and noise during face capture can affect the performance of facial recognition systems. Certain haircuts and makeup patterns can also prevent the algorithm from detecting a face at all, a technique known as computer vision dazzle.
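The error rates at issue here come down to where a system sets its match threshold. The toy simulation below (random vectors standing in for faceprints, not real biometric data) illustrates the trade-off: a looser threshold comfortably accepts the genuine user, but also starts accepting impostors, i.e. producing false positives.

```python
import random

random.seed(0)

def distance(a, b):
    """Euclidean distance between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# One genuine pair and many impostor faceprints (toy 8-d vectors).
genuine = [random.random() for _ in range(8)]
probe = [v + random.uniform(-0.05, 0.05) for v in genuine]  # same person, noisy capture
impostors = [[random.random() for _ in range(8)] for _ in range(1000)]

for threshold in (0.3, 0.6, 1.0):
    false_accepts = sum(distance(probe, imp) <= threshold for imp in impostors)
    genuine_accepted = distance(probe, genuine) <= threshold
    print(f"threshold={threshold}: genuine accepted={genuine_accepted}, "
          f"false accepts={false_accepts}/1000")
```

Tightening the threshold reduces false accepts but risks rejecting legitimate matches; the MIT finding amounts to saying that, for darker-skinned subjects, the distances real systems compute are less reliable, so the same threshold yields more misidentifications.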