The name is PimEyes and it costs US$29.99 (R$142 at today's exchange rate) per month to use. It is facial recognition software that, after you upload a photo, scans the internet for images of the same person. And, as a New York Times report found and Digital Gaze confirmed, it is grimly accurate.
In the NYT tests, several members of the journalistic team submitted photos of themselves. PimEyes found photos they had forgotten existed, or that they didn't even know had ever been taken.
One reporter was shown dancing at an art museum event more than a decade ago and, in another photo, crying after receiving a marriage proposal. She disliked that last photo, but the photographer was using it to advertise his business on Yelp.
A tech reporter was found in a crowd at the 2011 Coachella music festival. A correspondent appeared in multiple wedding photos and, blurred, in a photo taken at an airport in Greece.
The search worked even when the person in the submitted photo was wearing sunglasses, looking to the side, wearing a mask, or had a beard, in which case it also found photos without the beard (more on that at the end).
To use PimEyes, you upload a photo and tick a box declaring that the photo is of you. Although the site states repeatedly that it is for personal use only, there is no safeguard beyond that checkbox: just the honesty of an anonymous person on the internet. Nothing actually prevents a user from uploading other people's images.
The program does not scan social networks, but it does scan open websites such as news pages, forums, photographers' albums and, most worrying of all, pornographic sites.
The idea, according to the company, is to be on the right side: to help people find abusive uses of their own image and take them down. In theory, this can be used against revenge porn, an example the company itself gives, to get abusive content taken offline. But it can also be used by one person to find unwanted photos of someone else.
In fact, one of the stories reported by the NYT was that of computer engineer Cher Scarlett. In 2005, at age 19, she was broke and auditioned for the porn industry. She hated it and immediately gave up on the idea. But, through PimEyes, she discovered that images of her were still online.
Within the facial recognition tool itself, she discovered an option to remove the photos: the PROTect premium plan, which cost up to US$299.99. "It's basically extortion," says Scarlett, who paid for the most expensive plan.
According to the NYT, PimEyes said it had refunded Scarlett's money and noted that there is a free plan to remove images, though it is rather hard to find. In solidarity with Ukraine, the company has also blocked Russian users.
My Face Recognition Test
The platform offers a free trial with a simplified search. Uploading a photo of myself, I found some profile pictures on public websites and multiple versions of a photo I had uploaded to Reddit in 2020.
What had happened was that my dog ate my glasses and I had to improvise in order to see with just one lens. I ended up looking like a Dragon Ball character, and the meme, referencing the "it's over 9,000" line, was a hit.
It's not much of a surprise that this image went viral. And it looks a lot like the photo I uploaded, with the same beard and hair I still wear today.
The surprise came at the bottom of the list, where it found a "potentially explicit" photo (a false positive of someone good-looking, apparently singing shirtless; my own music career is limited to karaoke).
And most surprising of all: a real photo of me from 2009, without a beard, with a mustache drawn on in pencil for a June festival party (it's the one in the lower left corner). If you had shown today's photo of me to my 2009 self, I don't think I would have recognized myself.
Tests with other photos again brought up a mountain of copies of my Dragon Ball meme and some obvious false positives, people who don't even look like me (unlike the mystery singer).
The free trial does not show the links to where the photos were found, and it blurs the images so you cannot reverse-search them. Clicking on one takes you to a screen asking you to subscribe. It's not hard to see how, if you did find a photo problematic (I didn't), this could feel like extortion.