# Private Intelligence Firm Proposes “Google” for Tracking Terrorists’ Faces

A top facial recognition company is partnering with a private intelligence firm that tracks terrorists to create an Internet-based tool to scan and identify terrorists’ faces for law enforcement, military, and intelligence agencies.

According to a late October email obtained by The Intercept, sent by Ben Venzke, the CEO of IntelCenter, the new tool would combine his company’s pre-existing terrorist data with facial recognition algorithms developed by Morpho, a company that specializes in biometric technology.

Morpho, which does not disclose the proprietary algorithms it uses to identify faces, has also partnered with the FBI in the past to develop its facial recognition programs. The new tool would pair that facial recognition technology with a database of faces, or the “terrorist data,” as Venzke describes it.

IntelCenter was founded by Ben Venzke, who was just 16 when he got into the terrorist tracking business, according to a 2008 profile in German magazine Der Spiegel. The company, which works for government clients, tracks online messaging, videos, social media, hostage crises, and information disseminated by terror groups and supporters.

The new tool, four years in the making, is designed so that “searching for the face of a terrorist [is] as easy as running a Google search,” Venzke wrote. Anyone from a local beat cop with a shoestring budget to a more sophisticated state or FBI investigator would have access to the inexpensive, “quick and easy” tool, he continued.

According to Venzke, who was writing to INTELST Forum, an intelligence email listserv founded in 2000, this new tool is “restricted to military, intelligence, and law enforcement agencies of qualifying countries.” He didn’t say which countries might be excluded.

In an earlier email from June discussing the same technology, Venzke said the product has “various counterterrorism use cases that previously would have been impossible” without his combination of facial recognition technology and terrorist data.

The U.S. military has a long history of using biometric technology—fingerprint scanners and cameras—to identify militants overseas. There’s an entire Army unit, the Defense Forensics and Biometrics Agency, devoted to the work. A soldier in Afghanistan developed one of the first mobile biometric devices, the Handheld Interagency Identity Detection Equipment, which resembles a chunky Nikon without the protruding lens. It alerts personnel to matches on a watch list based on iris scans, fingerprints, and camera images, and is reportedly capable of storing tens of thousands of biometric profiles.

Biometric tools developed for the military have started surfacing in state and local law enforcement precincts. The FBI’s database of faces is so bloated that half of all adults in the U.S. are kept there; African Americans are even more likely to be in the database, according to research conducted by the Georgetown Center on Privacy & Technology in October.

Privacy advocates are now raising questions about the involvement of private companies in developing the algorithms and compiling the database of faces. If IntelCenter itself is providing the data to state and local law enforcement agencies, and it includes Americans, “that seems like a due process issue, if they are enrolling somebody on a terrorist watch list,” said Clare Garvie, Law Fellow at Georgetown’s Center on Privacy & Technology, in an interview with The Intercept. “As a private company, they’ll be exempt from records laws, information about who’s in the database, how they got there.”

Garvie expressed concern that the tool might exacerbate racial profiling if it’s used by every cop on the street. “Would state and local officers be more inclined to stop someone who looks like a Middle Eastern male if they have an app on their phones that compares it to a list of terrorists?” she said. “That would raise some huge red flags for me.”

It’s unclear whether the database would be made up of publicly identified serious criminals or terrorists—or whether it would be based on a broader watch list. Venzke doesn’t offer details about the database except to potential clients in the intelligence, military, and law enforcement spheres.

But IntelCenter is already collecting faces. Users can sign up for a free trial on IntelCenter’s website, which includes access to a new “facial component.” From there, for example, users can run a “facial search” for people like ISIS member Mukhmad Turkoshvili, who is “likely based out of Syria.”

The photo provided is poorly lit, and the man’s face is partly obscured by his hat and facial hair.

Neither Venzke of IntelCenter nor Morpho, the facial recognition company, responded to requests for comment about the new product or its potential customers.

Yet the idea of combining a private database with facial recognition could prove problematic. Government watch lists, such as the no-fly list, are already heavily criticized because those databases inevitably include errors and don’t easily offer people the chance to challenge their inclusion.

The Government Accountability Office also criticized the FBI in May for failing to protect privacy: the bureau’s facial recognition system, which was using Morpho’s algorithms, was only accurate 86 percent of the time. That means that in roughly 14 out of every 100 searches, the system could miss the right person or point to the wrong one, potentially implicating someone innocent.

Facial recognition mismatching has led to at least one false allegation, in the case of Steve Talley, who was mistakenly identified as a bank robber partly thanks to grainy surveillance footage and a seemingly similar head shape and jawline.

The technology is also limited in many cases. Depending on how many faces are in the database and how the photos were taken—whether they were pulled from poorly lit video or from clearer driver’s license images—it can be difficult to correctly measure specific features, like nose length or the distance between the eyes.

“The smaller the database is, the more accurate it will be,” Garvie noted. If the technology is used on a very small segment of known serious criminals, “this could be a pretty positive application of facial recognition.”
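
To see why database size matters, here is a minimal simulation, not based on IntelCenter’s or Morpho’s actual algorithms: it stands in for faces with random 128-dimensional embedding vectors and measures how often a person who is not in the gallery nevertheless clears a fixed similarity threshold as the gallery grows. Every number in it (embedding size, threshold, gallery sizes) is invented for illustration.

```python
# Illustrative only: generic embedding-based matching, with made-up parameters.
import numpy as np

rng = np.random.default_rng(0)
DIM = 128          # hypothetical embedding size
THRESHOLD = 0.3    # hypothetical "same person" cosine-similarity cutoff
TRIALS = 500       # probe faces tested per gallery size

def unit_vectors(n: int) -> np.ndarray:
    """Random unit-length vectors standing in for face embeddings."""
    v = rng.standard_normal((n, DIM))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

for gallery_size in (100, 1_000, 10_000, 50_000):
    gallery = unit_vectors(gallery_size)
    probes = unit_vectors(TRIALS)  # none of these people are in the gallery
    # Best cosine similarity of each probe against the whole gallery.
    best = (probes @ gallery.T).max(axis=1)
    false_match_rate = (best >= THRESHOLD).mean()
    print(f"gallery={gallery_size:>7,}  spurious-hit rate: {false_match_rate:.1%}")
```

In this toy setup, the spurious-hit rate climbs from a few percent with a hundred faces to near certainty with tens of thousands, which is the intuition behind Garvie’s point about smaller databases.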

James Wayman, who worked on biometric technology at San Jose State University, told an audience at the RAND Corporation that facial recognition could be a vital tool for identifying terrorists and serious criminals—but that it can fail even with small databases of faces.

But the work is progressing. Some scientists, like Anil Jain at Michigan State, are working to improve the accuracy of facial recognition techniques, even when police only have an obscured, rotated, or distorted face to work with. His lab received funding from the FBI to improve techniques for identifying faces from grainy surveillance videos.

Jain says facial recognition software, depending on the quality of the images, can produce both false positives and false negatives. “One should regard the output of face recognition as providing an ‘investigative lead’ rather than gospel,” he wrote in an email to The Intercept.

Either way, facial recognition is here to stay. “It is not feasible for forensic examiners to manually search millions of faces in a government database to find the person of interest in a timely fashion,” Jain wrote.
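
As a rough sketch of what Jain’s “investigative lead” framing might look like in software (a generic example, not his lab’s or the FBI’s code), a search would return a ranked shortlist of candidates with similarity scores for a human examiner to verify, rather than a single automated match. The names, embedding size, and scores below are all invented.

```python
# Illustrative only: ranked candidate retrieval instead of a yes/no match.
import numpy as np

rng = np.random.default_rng(1)
DIM = 128

gallery_names = [f"candidate_{i:05d}" for i in range(10_000)]
gallery = rng.standard_normal((len(gallery_names), DIM))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

def investigative_leads(probe: np.ndarray, k: int = 5) -> list[tuple[str, float]]:
    """Return the k most similar gallery entries, ranked, with scores."""
    probe = probe / np.linalg.norm(probe)
    scores = gallery @ probe
    top = np.argsort(scores)[::-1][:k]
    return [(gallery_names[i], float(scores[i])) for i in top]

probe = rng.standard_normal(DIM)
for name, score in investigative_leads(probe):
    print(f"{name}: similarity {score:.2f}")
```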

Top photo: A laptop display is visible at the launch event of An Garda Siochana’s new facial recognition system Evo-Fit in Dublin on Feb. 6, 2015.

