Facial recognition is hairy territory

Charlene Han
Dec 14, 2020

To: CIO, GovTech Singapore

From: Charlene Han, Chief Advisor

Source: https://opengovasia.com/face-verification-technology-allows-singpass-holders-to-sign-up-for-digital-banking-services/

1. Our Smart Nation ambition aims to harness technology for Singapore to be an “outstanding city… where people live meaningful and fulfilled lives, enabled seamlessly by technology, offering exciting opportunities for all.” It is a vision of technology woven into our daily lives and communities, enabling us to live comfortably, sustainably, and connectedly. It is an inclusive vision for all of us, rich or poor, young or old, with the Government playing a key role in laying the foundation.

2. The oldest and simplest justification for government is as protector: protecting citizens from violence. Any Smart Nation initiative must therefore extend into providing safety and security for our citizens. In this light, I outline below my views on the Commissioner of Police’s (COP) proposal to sign an agreement with Ring, a company that makes security cameras, video doorbells, and home alarm systems, to gain access, with user permission, to the audio and video data from its security systems, in order to improve security and reduce crime.

Smart Nation = Smart Policing

3. “21st-century criminals are committing 21st-century crimes, and there’s simply no way that law enforcement can consistently catch them using 20th-century methods” — this explains why our security systems and processes must evolve with the times. Deloitte has outlined the top five policing innovations of the future, and at first glance, the COP’s proposal checks all the boxes (see Figure 1).

Figure 1: Assessing COP’s proposal with Ring against Deloitte’s top 5 policing innovations

4. Furthermore, Ring is owned by Amazon. Beyond the benefits above, partnering with a technology giant like Amazon brings additional advantages: first, as a government we can be assured that this partnership rests on credible and firm grounds, with strong backing; second, as the first Asian government to partner with Ring in this manner, we can be confident that Ring and Amazon will do all they can to provide end-to-end service, as it is in their interest to use us as a springboard into other Asian markets; third, we can leverage this agreement and relationship to tap on Amazon Web Services for complementary uses such as analytics, compute, and storage.

5. Such smart technologies and infrastructure are also not alien to us. We are the first in the world to use facial verification in our national identity scheme, SingPass, to allow users to “authenticate themselves, and prove they are genuinely present” (i.e., it is not a photograph or a deepfake). We already have a network of security cameras around public infrastructure such as highways and the lift landings of public flats. Our Lamppost as a Platform (LaaP) project fits lampposts with a network of wireless sensors to monitor changes in environmental conditions such as humidity, rainfall, noise, pollutants, footfall, and the speed of personal mobility devices[6]. Our legal framework also provides supportive ground: the Personal Data Protection Act (PDPA) recognizes both the rights of individuals to protect their personal data and the rights of organizations to collect, use, or disclose personal data for legitimate and reasonable purposes; specifically, personal data can be used without consent if its use is necessary for any investigation or proceedings. This proposal can be seen as a natural extension of our Smart Nation ambition into the private space, and the ground appears ripe for it.

Into the unknown with facial recognition

6. Despite these upsides, we need to tread very carefully in this space. First and foremost, we must be clear about what facial recognition is and what it is not. Facial verification and facial recognition are often lumped together, but the truth is that the two could not be more different (see Figure 2, and the illustrative sketch that follows). Facial recognition wades into the territory of mass surveillance and raises deeper, darker concerns about an Orwellian state, outlined below.

Figure 2: Differences between face verification and face recognition. Source: https://www.iproov.com/blog/face-recognition-vs-face-verification-whats-the-difference
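To make the distinction in Figure 2 concrete, here is a minimal sketch. It assumes, purely for illustration, that faces have already been converted into embedding vectors by some model; the function names, the cosine-similarity measure, and the threshold are hypothetical choices for this memo, not any vendor’s API.

```python
# Minimal sketch of 1:1 verification vs 1:N recognition over face embeddings.
# The embeddings, threshold, and function names are illustrative assumptions.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, claimed: np.ndarray, threshold: float = 0.8) -> bool:
    """Face VERIFICATION (1:1): does the probe match the single identity the user claims?"""
    return cosine_similarity(probe, claimed) >= threshold

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """Face RECOGNITION (1:N): search the probe against every identity in a gallery."""
    best_id, best_score = None, -1.0
    for person_id, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)
```

The point is that verification answers a narrow yes/no question about one consenting user, while recognition searches everyone in a gallery; it is the latter that pushes this proposal toward mass surveillance.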

7. The top issue is governance: how the data would be used, and who decides the purposes for which facial recognition is deployed. Baltimore police used facial recognition during the Freddie Gray protests to monitor and identify protestors, a move that civil liberties groups argued invades privacy and borders on suppression. We need to develop appropriate regulations:

a. Scope of use of such data — is law enforcement too broad a category? For instance, the Chinese government has been criticized for using facial recognition as a law enforcement tool to subject more citizens to the criminal justice system for petty crimes. Do we need further specification, e.g., matching those on a wanted list or searching for a missing person? Should there be specific exclusion clauses to prevent impinging on the freedom of speech?

b. Storage and security — where would the facial recognition data and analyses be stored, and for how long? Minimally, such sensitive data should be stored on secured government servers, and there ought to be an agreed retention limit.

c. Related protocols — before an arrest, and to guard against false positives (see para 8), additional identification to confirm the individual would have to be done via an alternative method; and only specific public officers designated by name, not by grade (e.g., Inspectors), should be able to legitimately access specific categories of data.

d. Legal requirements to access private databases — e.g., only with a warrant?

e. Enforcement against wrongful access — we do have the Public Sector (Governance) Act, which makes the mishandling and unauthorized access of data a criminal offence. This deters abuses such as public officers looking up the data of an ex-partner or a celebrity.

f. Appropriate privacy and transparency safeguards — e.g., consider alerting users that their data was accessed, likely after the fact given the time sensitivity of investigations (see also para 10).

8. These regulations would have to be iterated against the accuracy of the technology: whether the technology works, and how humans and technology interface. Timnit Gebru, a leader of Google’s ethical AI team, argued that facial recognition is “dangerous” and “should be banned at the moment” because the detection technology is “way less accurate than humans”. She adds that automation bias, or overreliance on technology, further confounds the issue: we are more likely to believe the model if it states a 99% match to Person A, even when we intuitively disagree (similar misdiagnoses in predictive analytics occurred with the Allegheny Family Screening Tool). Studies have also exposed vulnerabilities in facial recognition technology, where adjusting just a few pixels can render a face unrecognizable, or where systems cannot differentiate between twins or cope with cosmetic changes. This can lead to false positives, where innocent people risk being flagged and arrested, and false negatives, where criminals get away. This is not simply an evolution, such as equipping policemen with iPads so that they can get data on the move; it is closer to a revolution, in which the entire process of policing, its norms, and its protocols would need to be redeveloped. It would also require appropriate training of our police force to leverage technology for its advantages, so that enforcement does not become policing by algorithm, stripped of the human judgment so critical in many aspects of the job.
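To see why false positives cannot be dismissed as edge cases, consider the back-of-envelope calculation below. Every number in it is an assumption chosen for illustration, not a measured figure for any actual system.

```python
# Back-of-envelope illustration of the base-rate problem in 1:N face searches.
# All numbers below are illustrative assumptions, not measurements of any real system.
faces_scanned_per_day = 100_000      # e.g., footfall past a camera network
wanted_persons_present = 5           # genuine matches hidden in that crowd
false_match_rate = 0.001             # 0.1% chance an innocent face "matches" the watchlist
true_match_rate = 0.99               # 99% chance a wanted person is correctly flagged

false_alarms = (faces_scanned_per_day - wanted_persons_present) * false_match_rate
true_hits = wanted_persons_present * true_match_rate
precision = true_hits / (true_hits + false_alarms)

print(f"Expected false alarms per day: {false_alarms:.0f}")
print(f"Expected true hits per day:    {true_hits:.1f}")
print(f"Share of alerts that are real: {precision:.1%}")
# Roughly 100 false alarms against ~5 true hits: most alerts would point at innocent
# people, which is why para 8 insists on human confirmation before any arrest.
```

Even a system that sounds highly accurate in the lab produces mostly false alarms when the people it is looking for are rare in the crowd; the operational protocols, not just the model, determine whether anyone is wrongly arrested.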

9. The next set of issues concerns who gets the policing and who gets policed. Even if the technology improves and a case is to be made to proceed with facial recognition software, we need to distinguish between its use in public spaces and in private spaces. In public spaces such as airports and train stations, where the need for surveillance and the public benefits of security are higher, there is greater public acceptance of facial recognition. Moving facial recognition onto private grounds via this proposal raises new concerns of equality and inclusivity. Only those who can afford Ring products, i.e., the “haves”, will benefit from the policing, while the “have-nots” will be the policed. This technology-driven fault line, based on socio-economic status and likely race, deepens further once we start using the dataset for more “evidence-based policing”: burglaries are found to be committed by Type X individuals; we train the model to look out for Type X individuals; police units are dispatched to arrest Type X individuals; and the result is a self-fulfilling feedback loop in which the biases get lodged ever more deeply (illustrated in the sketch below).
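The feedback loop can be shown with a deliberately stylized simulation. The dispatch rule, starting figures, and crime rates below are all assumptions made for illustration, not a model of actual police deployment.

```python
# Deliberately stylized simulation of the feedback loop: both neighbourhoods have the
# SAME true crime rate, but only patrolled crime enters the dataset, and patrols are
# allocated toward wherever the dataset says crime is. All numbers are assumptions.
crimes_per_day = 5.0                       # identical in both neighbourhoods
recorded = {"A": 11.0, "B": 10.0}          # a small, arbitrary historical difference

def allocate_patrols(records):
    """Superlinear allocation: the 'hotter' area gets a more-than-proportional share."""
    total = sum(v ** 2 for v in records.values())
    return {hood: v ** 2 / total for hood, v in records.items()}

for day in range(365):
    patrol_share = allocate_patrols(recorded)
    for hood in recorded:
        # A crime only enters the dataset if patrols are present to detect and record it.
        recorded[hood] += crimes_per_day * patrol_share[hood]

print(allocate_patrols(recorded))
# Ends heavily skewed toward "A" even though true crime rates never differed:
# the dataset "confirms" the bias that the dispatch rule itself created.
```

Both neighbourhoods generate the same number of crimes throughout, yet the recorded data, and therefore the patrols, end up concentrated in one of them; this is the sense in which biased datasets become self-fulfilling.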

10. A related issue is whose consent is required, and whether it is sufficient. The proposal states that user permission would be obtained before access is granted to Ring’s data. Apart from further skewing the datasets (e.g., people in richer neighborhoods may not consent because they value privacy more, or because they can hire their own private security), we need to think harder about whose consent is required beyond that of the product’s owner. Some European countries ban the use of dashcams because the data is collected covertly and individuals have a right to their own privacy. It would be ideal to obtain consent on data handling from everyone affected, but the issue is one of practicality: first, how can this be done for everyone who might pass through the neighborhood or visit the homes of Ring users; and second, it is impractical to seek consent from the targets of investigations without tipping them off. Regardless, privacy advocates also contend that consent is a low threshold when dealing with sensitive biometric data, especially when there is an imbalance of power between “controllers and data subjects, such as the one observed in citizen-state relationships”.

11. Security of this valuable database is also paramount. Given that biometrics are increasingly used to access bank accounts, government services, and buildings, this database is becoming extremely valuable and will attract hacking attempts. Hackers may be drawn to it for money, or for motives ranging from revenge to curiosity to obsession; over the years, we have seen many doxxing attempts by online mobs. Beyond such external threats, we also need to be mindful of (semi-)internal threats: what safeguards are in place to prevent Ring or Amazon from selling our database to the highest bidder? Even with agreements in place, how would we know whether Ring or Amazon were holding up their side of the bargain? Another aspect of security would be ensuring that Ring’s own database is not compromised. All of this would require additional resources and capabilities.

12. Lastly, there are questions around due process and the fairness of procurement. It is unclear how the COP arrived at this agreement with Ring, whether there was any public tender, who the other vendors were, and what criteria were used to judge the bids. What would be the relationship between Ring and our Police? If this is an exclusive contract, there are serious implications regarding the consolidation of power by Ring and Amazon, aided by government. In addition, if we were indeed to procure complementary services from Amazon, we would have to consider whether we are becoming too dependent on and beholden to a private company, and what that implies for our ability to govern. Have we worked out how the data would be shared between both parties, and whether the necessary “firewalls” are in place against unwanted and illegal access, for instance if Ring should request reciprocal access to identify a particular stalker within a neighborhood, or if Ring were compromised and hackers used it as a path into our database?

13. Unfortunately, instead of providing you with answers, I have raised more questions. While some of these questions can be worked out internally within government, many require broader engagement with the public, to determine the risk appetite of our nation and to establish shared norms about the reach of technology in our private lives.

Test, Learn, Iterate

14. Notwithstanding these unresolved questions, we should not adopt an ostrich mentality. Dr Mark Cenite puts it well: “Those opting for a moratorium on facial recognition can rest assured that friends and foes around the world use it.” If we resist progress, we will fall behind.

15. The good news is that trust in the government remains high. In fact, the Edelman poll showed that it increased by 3 percentage points this year, to 70%. Singapore institutions are also seen as ethical and competent, outperforming the global average (see Figure 3). This provides a conducive backdrop to explore advances in facial recognition technology and, concurrently, to open the public conversation on the issues I have raised above; in other words, to adopt an agile approach.

Figure 3: High trust in Singapore institutions. Source: https://www.straitstimes.com/asia/trust-in-singapore-government-up-edelman-poll

16. My counterproposal is to run pilots in lower-risk settings, sequentially or concurrently, depending on the resources available.

a. Bring facial recognition closer to communities while staying within public areas. We could add the software to our LaaP deployment in a specific neighborhood.

b. We could leverage the “Vehicles on Watch” initiative, in which vehicle owners volunteer their in-car cameras as extra “eyes” in the community to deter and solve crimes. The project has been successful; we could build on this goodwill by adding facial recognition software in one of these neighborhoods as a trial.

c. Assuming Ring is indeed the best partner and procurement rules are met, we could start with its business clients instead, for example private buildings or malls.

17. These are “safer” zones in which to experiment and work out the governance kinks, consent mechanisms (e.g., everyone who enters the neighborhood, building, or mall would have to consent to their data being used for facial recognition), and transparency protocols (e.g., how to alert a user that his/her data was used). It would also enable the police to adjust their internal standard operating procedures (SOPs). Even as we debate publicly the balance between public security and personal privacy, we should engage the users within these pilots on more concrete and specific details (e.g., signage indicating that a user is entering a surveilled space, how the consent form should be presented).

Conclusion

18. In summary, I do not support the COP’s proposed agreement with Ring. My advice is for you to urge the COP to put the brakes on, as there are too many unanswered questions. The risk of this going awry is high: it would not just be another failed and very public spectacle, but would also curtail the progress we have painstakingly made towards our Smart Nation ambition (see Figure 4), much of which is premised on a high level of trust between citizens and government. The worst-case scenario is that it unravels our progress and takes us three steps back, in addition to carrying a hefty political price tag.

Figure 4: Key milestones for Smart Nation Projects. Source: https://opengovasia.com/government-service-delivery-around-moments-of-life-and-smart-urban-mobility-included-among-strategic-smart-nation-projects/

19. That said, you should adopt a supportive and open stance towards the COP’s desire to embrace technology for better policing. His intentions are to be lauded. You may wish to raise the alternative pilots outlined in para 16 for his consideration, and to offer resources in the form of our staff as experts and specialists to support the pilots; the details can be worked out subsequently. You may also wish to engage your peers in the Ministries of Law, National Development, and Finance, inviting them in early to collaborate on this project and to gain buy-in for future ones. A multi-ministry co-creation effort will be required as we shape the future of our nation.
