The US military is building a nightmarish thermal facial recognition database

The US Army has just taken a giant step towards developing killer robots that can see and identify faces in the dark.

DEVCOM, the US Army’s corporate research division, last week published a preprint paper documenting the development of an image database for training AI to perform facial recognition using thermal imagery.

Why it matters: Robots can use night-vision optics to see effectively in the dark. Until now, however, there has been no way to train them to identify surveillance targets using thermal imaging alone. This database, comprising hundreds of thousands of regular photographs of people paired with their corresponding thermal images, is set to change that.

How it works: Like any other facial recognition system, an AI is trained to categorize images according to a set of parameters. The AI doesn’t care whether those images were captured in natural light or with thermal imaging; it just needs large amounts of data to get “better” at recognition. As far as we know, this database is the largest to contain thermal images, but with fewer than 600,000 images of only 395 subjects, it is relatively small compared with standard facial recognition databases.
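To make that concrete, here is a minimal sketch of what training a cross-spectral face matcher might look like, assuming a PyTorch-style setup. Everything in it (the tiny model, the contrastive loss, the random tensors standing in for paired visible/thermal face crops, the single-channel inputs) is a hypothetical illustration of the general approach, not the DEVCOM code:

```python
# A minimal sketch of cross-spectral face matching, assuming PyTorch.
# Model size, loss, and data are invented for illustration; visible
# images are treated as single-channel (grayscale) for simplicity.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FaceEncoder(nn.Module):
    """Tiny CNN mapping a single-channel face crop to a unit-length embedding."""
    def __init__(self, embed_dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(128, embed_dim)

    def forward(self, x):
        z = self.features(x).flatten(1)          # (batch, 128)
        return F.normalize(self.head(z), dim=1)  # unit-length embeddings

def contrastive_loss(vis_emb, thm_emb, margin=0.5):
    # Pull each subject's visible/thermal pair together; push visible
    # embeddings away from a different subject's thermal embedding.
    pos = (vis_emb - thm_emb).pow(2).sum(dim=1)
    neg = (vis_emb - thm_emb.roll(1, dims=0)).pow(2).sum(dim=1)
    return (pos + F.relu(margin - neg)).mean()

encoder = FaceEncoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

# Random tensors stand in for one batch of aligned (visible, thermal)
# face crops of 16 distinct subjects; a real loader would read paired
# images from a dataset like the one the article describes.
visible = torch.randn(16, 1, 112, 112)
thermal = torch.randn(16, 1, 112, 112)

loss = contrastive_loss(encoder(visible), encoder(thermal))
opt.zero_grad()
loss.backward()
opt.step()
print(f"batch loss: {loss.item():.4f}")
```

The point is the one made above: nothing in a loop like this is specific to thermal imagery. The model simply learns from whatever labeled pairs it is fed, which is why the paired database, not some new algorithm, is the enabling piece.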


That lack of comprehensive data means any resulting face identification just won’t be very good. Current state-of-the-art facial recognition already struggles to identify anything other than white male faces, and thermal images contain less clearly identifiable data than traditionally lit images.

These shortcomings are apparent in the DEVCOM researchers’ own conclusions:

The analysis of the results shows two challenging scenarios. First, the thermal landmark detection and thermal-to-visible face verification models were severely impacted on off-pose imagery. Second, the thermal-to-visible face verification models faced an additional challenge when a subject was wearing glasses in one image but not in the other.
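To illustrate the kind of breakdown the researchers describe, here is a hypothetical sketch of scoring a thermal-to-visible verification model separately by condition. The similarity scores, threshold, and counts are invented for the example; only the two failure modes (off-pose imagery and glasses mismatches) come from the quoted conclusion:

```python
# Invented similarity scores for (probe, gallery) pairs, tagged with the
# condition the paper calls out. 'same' marks whether the pair really is
# the same subject.
pairs = [
    (0.81, True,  "frontal"),
    (0.34, True,  "off-pose"),          # genuine pair, but pose tanks the score
    (0.22, False, "frontal"),
    (0.41, True,  "glasses-mismatch"),  # genuine pair, glasses in one image only
    (0.58, False, "off-pose"),
    (0.77, True,  "frontal"),
]
threshold = 0.5  # accept as a match at or above this similarity

by_condition = {}
for score, same, condition in pairs:
    correct = (score >= threshold) == same
    hits, total = by_condition.get(condition, (0, 0))
    by_condition[condition] = (hits + correct, total + 1)

for condition, (hits, total) in sorted(by_condition.items()):
    print(f"{condition}: {hits}/{total} verified correctly")
```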

Quick take: The real problem is that the US government has shown time and time again that it’s willing to use facial recognition software that doesn’t work very well. In theory, this could lead to better combatant control in battlefield scenarios, but in practice it’s more likely that innocent Black and brown people will die when police or military drones identify the wrong suspect in the dark.

H/t: Jack Clark, Import AI

Published on January 11, 2021 – 23:07 UTC
