The Movement to Ban Government Use of Face Recognition
In the hands of police and other government agencies, face recognition technology presents an inherent threat to our privacy, free expression, information security, and social justice. Our faces are unique identifiers that can’t be left at home, or replaced like a stolen ID or compromised password. The technology facilitates covert mass surveillance of the places we frequent, people we associate with, and, purportedly, our emotional state.
Fortunately, communities across the country are fighting back. In the three years since San Francisco passed its first-of-a-kind ban on government use of facial recognition, at least 16 more municipalities, from Oakland to Boston, have followed its lead. These local bans are necessary to protect residents from harms that are inseparable from municipal use of this dangerous technology.
The most effective of the existing bans on government face surveillance have crucial elements in common. They broadly define the technology, provide effective mechanisms for any community member to take legal enforcement action should the ordinance be violated, and limit the use of any information acquired in an inadvertent breach of the prohibition.
There are, however, important nuances in how each ordinance accomplishes these goals. Here we will identify the best features of 17 local bans on government use of face recognition. We hope this will help show authors of the next round how best to protect their communities.
The interactive map below shows the 17 communities that have adopted these bans.
[Embedded Google map: https://www.google.com/maps/d/embed?mid=1SYYYrCe8rmRPrZyz5uFPSo4j4a_Dlkhb&ehbc=2E312F]
Privacy info. This embed will serve content from google.com
Here is a list of these 17 communities:
Berkeley, CA
Boston, MA
Brookline, MA
Cambridge, MA
King County, WA
Madison, WI
Minneapolis, MN
New Orleans, LA
Northampton, MA
Oakland, CA
Pittsburgh, PA
Portland, ME
Portland, OR
San Francisco, CA
Santa Cruz, CA
Somerville, MA
Springfield, MA
Definition of “face recognition”
Particular consideration must be given in any tech-related legislation to define what tools and applications are, and are not, intended to be covered. Complicating that challenge is the need to define the relevant technology broadly enough to assure that emerging capabilities are suitably captured, while not inadvertently impacting technologies and applications that should not fall within the bill's scope.
Here, many forms of government use of face recognition technology may present significant threats to essential civil liberties. They may also exacerbate bias. Today, the most widely deployed class of face recognition is often called “face matching.” This can be used for “face identification,” that is, an attempt to link photographs of unknown people to their real identities. For example, police might take a faceprint from a new image (e.g., taken by a surveillance camera) and compare it against a database of known faceprints (e.g., a government database of ID photos). It can also be used for “face verification,” for example, to determine whether a person may have access to a location or device. Other forms of face matching include “face clustering,” or automatically assembling together all the images of one person, and “face tracking,” or automatically following a person’s movements through physical space. All of these threaten digital rights.
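To make these categories concrete, here is a minimal, hypothetical Python sketch of face matching. It treats a faceprint as a numeric embedding vector (how a real system derives one from an image is out of scope), implements identification as a search against a database of known faceprints, and implements verification as a one-to-one comparison. The identify and verify functions, the 128-dimensional vectors, and the 0.8 similarity threshold are illustrative assumptions, not any vendor's actual API.

```python
import numpy as np

# A "faceprint" here is just a fixed-length embedding vector; real systems
# derive these from a face image with a trained model (not shown).
THRESHOLD = 0.8  # illustrative similarity cutoff, not a real-world value

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two faceprints, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, database: dict) -> str | None:
    """Face identification: search a database of known faceprints for the
    best match to an unknown probe image's faceprint."""
    best_name, best_score = None, THRESHOLD
    for name, enrolled in database.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_name, best_score = name, score
    return best_name  # None means "no match above threshold"

def verify(probe: np.ndarray, enrolled: np.ndarray) -> bool:
    """Face verification: one-to-one check, e.g. unlocking a device."""
    return cosine_similarity(probe, enrolled) > THRESHOLD

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    db = {"id_photo_1": rng.normal(size=128), "id_photo_2": rng.normal(size=128)}
    probe = db["id_photo_2"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
    print(identify(probe, db))               # -> "id_photo_2"
    print(verify(probe, db["id_photo_1"]))   # -> False
```

Face clustering and face tracking reuse the same comparison step, applying it across a photo collection or across successive video frames rather than against an enrolled database.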
Another application of face recognition is “face analysis,” also known as “face inference,” which proponents claim can identify demographic traits, emotional state, and more based on facial features. This invites additional bias, and suggests a return to the age of phrenology.
Bans on government use of face recognition must be drawn broadly enough to address all of these threats. Fortunately, many of the existing bans follow Boston’s example in defining face surveillance and face surveillance systems as:
“Face surveillance” shall mean an automated or semi-automated process that assists in identifying or verifying an individual, or in capturing information about an individual, based on the physical characteristics of an individual's face.
“Face surveillance system” shall mean any computer software or application that performs face surveillance.
Critically, these definitions are not limited just to face identification and face verification, but extend also to other technologies that use face characteristics to capture information about people.
Oakland, California offers another strong example:
“Face Recognition Technology” means an automated or semi-automated process that: (A) assists in identifying or verifying an individual based on an individual's face; or (B) identifies or logs characteristics of an individual's face, head, or body to infer emotion, associations, expressions, or the location of an individual.
Notably, it extends beyond face characteristics to also cover head and body characteristics. It thus captures many current uses and future-proofs the ban against some of the most concerning emerging types of biometric data collection.
Importantly, each definition effectively captures the intended technology and applications, while not inadvertently capturing less-concerning practices such as ordinary film, video, and still photography.
Don’t use it, don’t outsource it
While it is critical that cities ban their own agencies from acquiring and using face recognition technology, this alone is not enough to protect residents from harm. It is also necessary for cities to ban their agencies from acquiring or using information derived from face recognition technology. Otherwise, city employees banned from using the technology could just ask others to use the technology for them.
While police departments in large cities like New York and Detroit may have in-house face recognition systems and teams of operators, many more local police agencies around the country turn to state agencies, fusion centers, and the FBI for assistance with their face recognition inquiries. Thus, legislation that addresses the technology while not addressing the information derived from the technology may have little impact.
Lawmakers in several cities including Berkeley have taken the important additional step of making it unlawful to access or use information obtained from Face Recognition Technology, regardless of the source of that information:
it shall be a violation of this ordinance for the City Manager or any person acting on the City Manager’s behalf to obtain, retain, request, access, or use: i) any Face Recognition Technology; or ii) any information obtained from Face Recognition Technology...
Berkeley's ordinance further elaborates that even when city employees inadvertently gain access to information derived from face recognition technology, the data generally must be promptly destroyed and cannot be used. Also, any inadvertent receipt or use of this information must be logged and included in the city’s annual technology report, including what measures were taken to prevent further transmission or use. This vital transparency measure ensures that residents and legislators are made aware of these errors, and can better identify any patterns suggesting intentional circumvention of the law’s intent.
Exemptions
Exceptions can swallow any rule. Authors and supporters of bans on government use of face recognition must tread carefully when carving out allowable uses.
First, some ordinances allow face detection technologies that identify and blur faces in government records, to prepare them for disclosure under Freedom of Information Acts (FOIAs). This can help ensure, for example, transparent public access to government-held videos of police use of force, while protecting the privacy of the civilians depicted. Face detection technology does not require the creation of faceprints that distinguish one person from another, so it raises fewer privacy concerns. Unfortunately, there can be racial disparities in accuracy.
King County’s ordinance provides two necessary safeguards for government use of face detection technology. It can only be used “for the purpose of redacting a recording for release …, to protect the privacy of a subject depicted in the recording.” Also, it “can not generate or result in the retention of any facial recognition information.”
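As a rough illustration of why face detection for redaction is the lower-risk technology, the sketch below (assuming OpenCV is installed, and using placeholder file names) locates face regions in an image and blurs them in place before release. It never computes or stores a faceprint that could distinguish one person from another, in line with the safeguard that the tool "can not generate or result in the retention of any facial recognition information."

```python
import cv2

def redact_faces(input_path: str, output_path: str) -> int:
    """Blur every detected face in an image before public release."""
    # Haar-cascade face *detection* only locates face regions; it does not
    # compute a faceprint, so nothing identifying is generated or retained.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    image = cv2.imread(input_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        region = image[y:y + h, x:x + w]
        image[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)
    cv2.imwrite(output_path, image)  # only the redacted image is written out
    return len(faces)

# Example with placeholder file names:
# redact_faces("bodycam_frame.png", "bodycam_frame_redacted.png")
```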
Second, some ordinances allow local government to provide its employees with phones and similar personal devices, for use on the job, that unlock with the employee’s faceprint. Some employees use their devices to collect personal information about members of the public, and that information should be securely stored. While passwords provide stronger protection, some employees might not lock their devices at all without the convenience of face locks.
Third, some ordinances allow local government to use face locks to control access to restricted government buildings. Portland, Maine’s ordinance has two important safeguards. As to people authorized for entry, no data can be processed without their opt-in consent. As to other people, no data can be processed at all.
Fourth, a few ordinances allow police, when investigating a specific crime, to acquire and use information that another entity obtained through face recognition. EFF opposes these exemptions, which invite gamesmanship. At a minimum, police prohibited from using this tech themselves must also be prohibited from asking another agency to use it on their behalf. Boston has this rule. But unsolicited information is also a problem: San Francisco police broadly circulated a bulletin to other agencies, including the photo of an unknown suspect; one of those agencies responded by running face recognition on that photo; and San Francisco police then used the resulting information. New Orleans’ ordinance goes a step further, prohibiting use of information generated by this tech “with the knowledge of” a city official. Fortunately, 12 of 17 jurisdictions do not have this exemption at all.
Fifth, a few jurisdictions exempt compliance with the National Child Search Assistance Act. This is unnecessary: that Act simply requires agencies to report information they already have, and does not require any acquisition or use of technology or information. Fortunately, 13 of 17 jurisdictions eschew this exemption.
Enforcement
It is not enough to ban government use of face recognition. It is also necessary to enforce this ban. The best way is to empower community members to file their own enforcement lawsuits. These are called private rights of action.
The best ones broadly define who can sue. In Oakland, for example, “Any violation of this Article … constitutes an injury and any person may institute proceedings …” It is a mistake to limit enforcement to people who can show injury from being subjected to face recognition: it can be exceedingly difficult to identify such people, even after a brazen violation of the ordinance. Further, government use of face recognition harms the entire community, including through the chilling of protest in public spaces.
Private enforcement requires a full arsenal of remedies. A judge must have the power to order a city to comply with the ordinance. Also, there should be damages for a person who was subjected to face recognition. Oakland provides this. A prevailing plaintiff should be paid their reasonable attorney fees. This ensures access to the courts for everyone, and not just wealthy people who can afford to hire a lawyer. San Francisco properly allows full recovery of all reasonable fees.
Other enforcement tools are also important. First, evidence collected in violation of the ordinance should be excluded from court proceedings, as in Minneapolis. Second, employees who blow the whistle on rule-breaking should be protected, as in Berkeley. Third, employees who break the rules should be subject to workplace discipline, as in Brookline.
Other bans
When legislators and advocates write a ban on government use of face recognition, they should consider whether to also ban government use of other kinds of surveillance technologies. Many are so dangerous and invasive that government should not use them at all.
For example, EFF opposes government use of predictive policing. We are pleased that four cities have ordinances forbidding municipal use: New Orleans, Oakland, Pittsburgh, and Santa Cruz. Likewise, EFF supported Oakland’s ban on municipal use of voiceprints.
Nationwide ban
City and county-level lawmakers are not alone in understanding that government use of face surveillance technology chills free speech, threatens residents’ privacy, and amplifies historical bias. Federal lawmakers, including Senators Edward Markey, Jeff Merkley, Bernie Sanders, Elizabeth Warren, and Ron Wyden, alongside U.S. Representatives Pramila Jayapal, Ayanna Pressley, and Rashida Tlaib, have introduced the Facial Recognition and Biometric Technology Moratorium Act (S.2052/H.R.3907). If passed, it would ban federal agencies like Immigration and Customs Enforcement, the Drug Enforcement Administration, the Federal Bureau of Investigation, and Customs and Border Protection from using face recognition to surveil U.S. residents and travelers. The act would also withhold certain federal funding from local and state governments that use face recognition.
Take Action
If you don’t live in one of the 17 cities that have already adopted a local ban on government use of face recognition, there’s no place like home to begin making a change. In fact, there may already be groups in your community setting the wheels in motion. Our About Face campaign helps local organizers educate their representatives and communities, and helps every resident take that first step in calling for change. If you have an Electronic Frontier Alliance group in your area, it can also be a great resource for finding like-minded neighbors and activists to amplify your efforts. And whether or not your city has already protected you and your neighbors, you can still stand up for friends and loved ones by letting your congressional representatives know it’s time to ban federal use of face recognition, too.
from Deeplinks https://ift.tt/wafKHbU