Imagine strolling down a busy city street and snapping a photo of a stranger, then uploading it to a search engine that almost instantaneously helps you identify the person.
This isn't a hypothetical. It's possible now, thanks to a website called PimEyes, considered one of the most powerful publicly available facial recognition tools online.
On TikTok, PimEyes has become a formidable tool for internet sleuths trying to identify strangers, with videos notching many millions of views showing how a combination of PimEyes and other search tools can, for example, figure out the name of a random cameraman at a Taylor Swift concert. TikTok's community guidelines ban content with personal information that could lead to stalking, identity theft and other crimes. But this particular video was still up on Wednesday morning.
Founded in 2017 by two computer programmers in Poland, PimEyes is an AI tool that works like a reverse image search on steroids — it scans a face in a photo and crawls dark corners of the internet to surface photos many people didn't even know existed of themselves, in the background of a restaurant or attending a concert.
While the company claims it is a service that can help people monitor their online presence, it has generated controversy for its use as a surveillance tool by stalkers, for collecting countless images of children and for adding images of dead people to its database without permission.
With no federal laws on the books in the U.S. governing facial recognition technology, copycat services are expected to proliferate in the coming years.
Consider the consequences, says journalist Kashmir Hill, of everyone deciding to use this technology at all times in public places.
"Something happens on the train, you bump into someone, or you're wearing something embarrassing, somebody could just take your photo, and find out who you are and maybe tweet about you, or call you out by name, or write nasty things about you online," said Hill, a reporter for The New York Times who recently published a book on facial recognition technology called "Your Face Belongs to Us."
PimEyes CEO: Service has many 'legitimate purposes'
A basic version of PimEyes is free for anyone to use, but for a monthly subscription fee the company offers advanced features, like alerts when a new photo of a user appears online.
TikTok users have pointed out that there is a way for people to opt out of having their photos in the PimEyes database, but tests of the search tool show that this is not a guaranteed way of removing oneself from the company's massive trove of photos.
Giorgi Gobronidze, an academic based in Georgia in eastern Europe who studies artificial intelligence, is now CEO of PimEyes, which he said has a staff of about 12 people.
In an interview with NPR, he said the abuse of the tool has been overstated, noting that the site's detection tools have intercepted just a few hundred instances of people misusing the service for things like stalking or searching for children.
When someone searches PimEyes, the name of the person pictured does not appear. Still, it does not take much internet detective work to fit the pieces together and figure out someone's identity.
Nonetheless, Gobronidze emphasizes that PimEyes, technically, does not on its own reveal someone's identity.
"We don't identify people," he said. "We identify websites that identify images similar to the search material."
PimEyes' rules stipulate that people search only for themselves, or for people who consent to a search. Still, there is nothing stopping anyone from running a search of anyone else at any time, but Gobronidze said "people are not as terrible as sometimes we like to imagine."
He continued: "PimEyes can be used for many legitimate purposes, like to protect yourself from scams, or to figure out if you or a family member has been targeted by identity thieves."
Gobronidze said PimEyes now blocks access in 27 countries, including Iran, China and Russia, over fears government authorities could use the service to target protesters and dissidents.
The technology Google dared not release
Hill, the Times journalist, said super-powerful face search engines have already been developed at Big Tech companies like Meta and Google.
Yet the potential for such tools to be weaponized is so great that some top executives — like former Google CEO Eric Schmidt — have been reluctant to release them into the world, an almost unthinkable move in the fast-paced, hyper-competitive world of Silicon Valley.
"Eric Schmidt, as far back as 2011, said this was the one technology that Google had developed and decided to hold back, that it was too dangerous in the wrong hands — if it was used by a dictator, for example," Hill said.
There are potential uses of the technology that could be beneficial: helping people who are blind identify those around them, quickly putting a name to a face you've forgotten and, as the company highlights, keeping tabs on one's own images on the web.
But the technology has the potential to compromise the privacy of citizens. For instance, government and private companies could deploy the technology to profile or surveil people in public, something that has alarmed privacy experts who study the tool.
"These benefits are being used as a pretext for government and industry simply to expand their power and profits, without any meaningful gains anyway," said Woodrow Hartzog, a Boston University School of Law professor who specializes in facial recognition technology. "And so, I simply don't see a world where humanity is better off with facial recognition than without it."
Like Apple Face ID, except on steroids
Of course, some versions of facial recognition tools are already out in the world: unlocking iPhones with Apple's Face ID, and at airports, where the Transportation Security Administration can confirm someone's identity with a face scan.
But a face search engine takes this idea to an entirely different level.
And while Big Tech companies have been holding back, smaller startups pushing the technology are gaining momentum, like PimEyes and another called Clearview AI, which provides AI-powered face search engines to law enforcement.
Clearview AI did not make anyone available for an interview.
Hartzog said Washington needs to regulate, or even outright ban, the tools before they become too widespread.
"I think that it should really tell you something about how radioactive and corrosive facial recognition is that the larger tech companies have resisted wading in, even when there's so much money to be made on it," Hartzog said.
Just like AI chatbots, facial recognition search engines can take off
Most Silicon Valley watchers predict it is just a matter of time.
Look at AI chatbots as an instructive lesson. Silicon Valley giants had developed the powerful chatbots for years in labs, but kept them a secret until a smaller startup, OpenAI, made ChatGPT available to the public.
Eventually, tech analysts say, Big Tech companies will likely have no choice but to make advanced face search engines publicly available in order to stay competitive.
Hartzog said he hopes it is a future that never comes to pass.
"If facial recognition is deployed widely, it's virtually the end of the ability to hide in plain sight, which we do all the time, and we don't really think about," he said.
A "walking bar code"
In the European Union, lawmakers are debating a ban of facial recognition technology in public spaces.
Brussels-based activist Ella Jakubowska is hoping regulators go even further and enact an outright ban of the tools.
Jakubowska is behind a campaign called Reclaim Your Face that is warning against a society where visits to the doctor, a stroll down a college campus, or even crossing a street, will expose someone's face to scanning. In some parts of the world, it is already a part of daily life.
"We've seen in Italy the use of biometric, they call them 'smart' surveillance systems, used to detect if people are loitering or trespassing," Jakubowska said.
Jakubowska said the EU's so-called AI Act will set rules for how biometric data, like someone's face, fingerprints and voice, can be collected and used.
"We reject the idea that, as human beings, we should be treated as walking bar codes so that governments can keep tabs on us, even when we haven't done anything wrong," she said.
In the U.S., meanwhile, there are laws in some parts of the country, like Illinois, that give people protection over how their face is scanned and used by private companies. A state law there imposes financial penalties against companies that scan the faces of residents without consent.
But until there is federal regulation, how and where faces are recorded by private companies is nearly unrestricted and largely determined by the multibillion-dollar tech companies developing the tools.