On a cold, sunny October day on the outskirts of Copenhagen, Denmark, a group of men dressed in black gathers outside Brondby Stadium to shoot off a couple of rockets, raise their fists and shout about how the home team will soon beat — and beat up — the visiting archnemesis, FC Copenhagen.
Police are out in force, riot helmets at the ready. Brondby-Copenhagen matches have a history of leading to vandalism, arrests and general mayhem.
An attempted photo of the group gets a gloved hand in the face. "You need to stop," says the hand's black-clad owner, before he disappears back into the crowd. A security officer scurries over in concern.
"Don't film them," she warns NPR. "It'll end badly. They don't want to be recognized."
Ironically, this group of black-dressed men is also the primary target of Brondby's new facial recognition system.
Once the men's chant is over, the group moves toward the stadium's entrance, where the men — along with 21,000 other fans — are asked to remove masks, hats and glasses so a computer can scan their faces. The scans will be compared against a list of roughly 50 banned troublemakers and will be used to determine whether the spectators will be allowed in.
No one is stopped on this day. But since the system's launch in July, it has caught four people on the blacklist, who were then turned over to police.
The use of facial recognition is best known in China, but it is also used in countries including Israel, the U.S. and the United Arab Emirates. In Europe, biometric data is protected under the General Data Protection Regulation, arguably the world's most comprehensive privacy law, which went into full effect in mid-2018. Several institutions that had been using facial recognition technology prior to that — including a school in Sweden and a police force in Wales — have since been challenged, with differing results.
What's unique about the use of facial recognition in Brondby is that experts agree that it appears to be one of Europe's first large-scale, private systems created and vetted in the era of GDPR. (A definitive list is hard to come by since every country implements GDPR differently.)
Brondby resident Martin Lund, waiting in line for the game with his kids, is on board.
"I think it's great!" he says. "We're here to enjoy the game. We're here to have a nice time. We're here to shout at the opponent ... but we're not here to fight. If people want to fight, they shouldn't go to the game."
While the idea of facial recognition technology gives him pause, Lund says he trusts in his country to ensure that it's being used well.
"You can't do anything in Denmark without getting the proper approval," he says. "So it's not being misused, I don't think. You can't do that in Denmark."
According to the Brondby soccer club's security chief, Mickel Lauritsen, getting this system approved was a long process. It started almost five years ago, when the team kept getting fined for lax stadium security but felt hamstrung in its attempt to make improvements.
For example, stadium stewards were allowed to see only descriptions of the troublemakers they were expected to pick out of the crowd. Then they won approval to use photos. With that approval in place, the team launched its request for a facial recognition system and began nearly three years of negotiation involving the Danish Data Protection Agency, team lawyers, fan input and system developer Panasonic.
With the system in use now, Lauritsen says, he's very careful to stay within its prescribed boundaries. That means pictures of those on the watchlist are entered into the system on game day and are deleted again at the end of the day. The system is not connected to the Internet. There's a cross-check to avoid false positives.
Lauritsen says that at one point, the police asked him to enter a suspect's picture into the system to help with an investigation. He said no.
"I know if I misuse the system, I'm not allowed to do anything with it going forward, and then we'll be restricted in what we can do even further than we are now," he explains.
Lauritsen would eventually like to be able to share watchlists with other soccer clubs in Denmark. As a former police officer, he'd like it if the Brondby system could be of service to law enforcement. But for now that is not part of the agreement. "So I'm not going to misuse the system," he says, "or misuse the trust we've been given."
Still, that's not enough reassurance for everyone. Jesper Lund, who chairs the IT-Political Association of Denmark, a watchdog group, says it's a slippery slope from one facial recognition system to the next. He believes it's a tool that should be reserved for rare situations involving terrorism or serious crimes. He also notes that this particular technology can be unreliable and inaccurate.
"Using this very invasive and error-prone technology for something like making sure that persons on a banned list cannot go to a football match is really not proportionate," he says. "So in my opinion, this should never have been allowed by the DPA," Denmark's Data Protection Agency.
Even so, Lund acknowledges that his is a tough fight. Every non-hooligan whom NPR interviewed at the Brondby Stadium expressed some version of "I have nothing to hide" or "Facial recognition is inevitable."
University of Copenhagen IT law professor Henrik Udsen believes the best thing any country can do is start having the conversation. Udsen was involved in the Brondby decision as a member of the council that makes precedent-setting decisions under the Danish Data Protection Agency. He explains that GDPR includes different options for legal processing of personal data. The most common, user consent, is not practical in a stadium situation. Instead, Brondby had to convince regulators that improving stadium security was in the public interest.
Of course, not everyone agrees about what constitutes the public interest.
"The important part," says Udsen, "is that we have these discussions and we are taking both advantages and risks into account when we decide what to do. Because there are no necessary right and wrong answers here."
In the U.S., no federal laws explicitly regulate facial recognition technology — yet — so discussion has been happening primarily at the state and local levels and usually focuses on the public sector. San Francisco, for example, recently banned the use of automated facial recognition by the police and city agencies.
But that could soon change. Several bills now pending in Congress would regulate facial recognition technology on multiple fronts.