Amazon pitched its facial recognition to ICE, released emails show [Updated]


Documents obtained through a Freedom of Information Act request by the Project on Government Oversight (POGO) show that Amazon’s government sales unit was actively seeking to provide the Department of Homeland Security’s Immigration and Customs Enforcement (ICE) Homeland Security Investigations (HSI) division with Rekognition, the company’s controversial facial recognition system. The pitch was part of a larger discussion of Amazon Web Services offerings to ICE HSI, including artificial intelligence algorithms and predictive analytics.

Amazon has provided cloud services to DHS in the past, including to US Citizenship and Immigration Services (USCIS). USCIS uses Amazon cloud storage to hold data, including information associated with Alien Registration Numbers (A-Numbers), identifiers also used by ICE, Customs and Border Protection, and other DHS agencies to identify and track immigrants. And DHS has been pushing to do more in AWS. In a 2017 Request for Information (RFI), DHS requested “information regarding forward-thinking, modern Development, Security and Operations (DEVSECOPs), specifically in association with Big Data, Analytics, PersonCentric, Entity Resolution and Machine Learning… to build, enhance, and support systems in large cloud environments, specifically Amazon Web Services (AWS).”

In a June 15 email to a DHS recipient, Amazon Web Services’ federal sales principal for Homeland Security followed up on a conversation about Amazon’s artificial intelligence (AI) and machine learning (ML) technologies that took place at the Redwood City, California, offices of the consulting firm McKinsey & Company. The firm had an “integrated consulting management services” contract with ICE, part of which focused on “developing ICE [Enforcement and Removal Operations’] modernized vision and strategy.”

Update, 2:30 PM: An Amazon spokesperson told Ars that Amazon “participated with a number of other technology companies in technology ‘boot camps’ sponsored by McKinsey & Company, where a number of technologies were discussed, including Rekognition. As we usually do, we followed up with customers who were interested in learning more about how to use our services (Immigration and Customs Enforcement was one of those organizations where there was follow-up discussion).”

In the follow-up email, Amazon’s Homeland Security sales representative wrote, “We are ready and willing to support the vital HSI mission. I hope what we shared regarding… recent development updates surrounding our AI/ML suite was of value.” The Amazon representative then listed “action items” from the conversation, which included potentially setting up a one-day “Innovation Workshop focused on a big HSI problem.” The list also included setting up a deeper technical briefing on a number of AI and ML technologies, including:

  • Predictive analytics core capabilities and deployment scenarios.
  • Elasticsearch as a managed service, [and] path to an ATO [Authority to Operate].
  • Neptune Graph Database value, use cases, and security.
  • Rekognition Video tagging/analysis, scalability, [and] custom object libraries.

The action items mention an introduction to someone at USCIS to discuss that agency’s Elasticsearch implementation.

Ironically, the meeting at McKinsey occurred as McKinsey employees were speaking out about the moral and ethical concerns surrounding the firm’s contract with ICE. A McKinsey contract with ICE’s Enforcement and Removal Operations division ended in July.

In July, the American Civil Liberties Union of Northern California published the results of a test it had conducted using Rekognition, photos of members of Congress, and a public database of 25,000 arrest mugshots. In total, 28 members of Congress, including six members of the Congressional Black Caucus, were incorrectly matched to mugshots. The ACLU test followed a number of studies, including one by the US Government Accountability Office in March 2017, that found unacceptable rates of false positives in facial recognition systems. An MIT Media Lab study found that facial recognition systems often misidentified people with darker skin.

Amazon disputed the results of the ACLU test and sought to reassure DHS. In a follow-up email to the DHS contact, the Amazon sales lead sent a link to an Amazon blog post, stating “[Dr. Matt Wood’s] comments on the ACLU, facial recognition accuracy, and ‘Confidence Levels’ may be of interest given your ongoing efforts.”
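Wood’s post argued, among other things, that match results depend heavily on the confidence threshold supplied to Rekognition’s matching calls; Amazon said the ACLU test used the service’s default threshold of 80 percent rather than the 99 percent level Amazon recommends for law enforcement use. As a rough, hypothetical illustration only, and not a reconstruction of either party’s actual test, a minimal boto3 sketch of how such a threshold is passed to Rekognition’s CompareFaces API might look like this (the file names, region, and 99.0 threshold are assumptions for the example):

    # Hypothetical sketch (not Amazon's or the ACLU's actual test code): passing a
    # similarity threshold to Amazon Rekognition's CompareFaces API with boto3.
    # File names, region, and the 99.0 threshold are illustrative assumptions.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("probe_photo.jpg", "rb") as source, open("mugshot.jpg", "rb") as target:
        response = rekognition.compare_faces(
            SourceImage={"Bytes": source.read()},
            TargetImage={"Bytes": target.read()},
            # Only face pairs scoring at or above this similarity are returned as
            # matches; lowering it returns more candidates and more false positives.
            SimilarityThreshold=99.0,
        )

    for match in response["FaceMatches"]:
        print(f"Face match at {match['Similarity']:.1f}% similarity")
    if not response["FaceMatches"]:
        print("No matches at this threshold")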

In response to the disclosure, the ACLU has repeated its call for “a moratorium on the use of facial recognition technology for immigration enforcement and law enforcement purposes until Congress and the public debate what, if any, uses of this technology should be permitted,” and announced a Freedom of Information Act request “demanding the Department of Homeland Security disclose to the public if and how the agency is using or plans to use the technology.”

“ICE should not be using face recognition for immigration or law enforcement,” ACLU Senior Legislative Counsel Neema Singh Guliani said. “Congress has never authorized such use, and should immediately take steps to ensure that federal agencies put the brakes on the use of face recognition for immigration or law enforcement purposes.”
