“With a single click, users can see every photo of the person they’re searching for”

A vacuum cleaner sucking up photos from every corner of the internet, then selling that information through a database: it’s not allowed, but it’s happening. Thanks to Bits of Freedom, action is now being taken. Policy advisor Lotte Houwing: “It gives hope for better protection.”
Internet Freedom
“At Bits of Freedom, we advocate for internet freedom. Even online, companies must follow the rules and respect users’ rights. Yet they often find ways to sidestep the law.”
“For example, we raised the alarm about a well-known AI company. Like a vacuum, this company scrapes every photo it can find on the internet and stores them in a database. Their business model? Charging law enforcement agencies for access. With the click of a button, users can pull up all photos of the person they are looking for. With just one photo, you can trace someone’s identity, which means anonymity in public spaces could disappear.”
Nothing to Hide?
“People sometimes say, ‘Well, I have nothing to hide.’ But even if you have nothing to hide, it’s important to have control over your own data. You share different information with your doctor than you would during a job interview. Some pieces of personal information are meant to stay separate.”
“There’s also a margin of error in the technology. People of color, particularly women of color, are at greater risk of being misidentified. Some AI companies show little regard for these sensitivities or for privacy rules.”
Unpaid Fines
“In 2024, we discovered that the AI company was actively offering its database on the Dutch market, and that it was being used here. We already had evidence that Dutch citizens’ photos were in the database: when I filed a data access request, I actually saw my own photos. Based on our findings, we informed the Dutch Data Protection Authority (Autoriteit Persoonsgegevens).”
“Other European countries had already fined the company, but since it’s not based in Europe, it’s been able to evade those penalties. That’s why we also advised on alternative strategies to hold the company accountable.”
Getting to Work
“It was encouraging to see the Dutch Data Protection Authority explore those options. Could they, for instance, hold Dutch customers or organizations working with certain AI companies responsible? Our investigation also spurred Dutch policymakers and regulators into action: this kind of facial recognition technology is not allowed. Now the question is how we make sure it really stops.”