(Jeremy Vale coauthored this post.)
All eyes may be on Hong Kong right now and its use of facial recognition to identify protesters, but a far more enlightening case — for those fortunate enough to have less antagonistic relationships with local authorities — can be found in the small Swedish town of Skellefteå (“sche-left-eye-o”), located near the Arctic Circle. This sleepy mining town made international headlines when it became the recipient of Sweden’s first-ever GDPR fine for piloting the use of computer vision to track the attendance of high school students.
Why does a small fine handed out to a remote town matter? Because it demonstrates that computer vision — and facial recognition in particular — can now be implemented successfully by anyone, regardless of resources and capabilities, and that you must focus on meaningful consent to ensure successful adoption.
Pupils Trained On Pupils
This past year, Skellefteå’s Anderstorp high school partnered with a Finnish software and IT consulting firm called Tieto to pilot automation technologies for monitoring and reporting student attendance. Per Swedish law, schools must report attendance numbers daily — a time-consuming requirement that burns many valuable hours over the course of the school year. Tieto reports that the school’s teachers collectively spent an astonishing 17,280 hours on attendance reporting annually.
Tieto piloted two solutions to this problem: one based on RFID tags that would register students as present when they were within range of a receiver and one based on facial recognition technology. The latter proved to be more successful, as a large percentage of students would forget to bring their tags to class, rendering the data incomplete and inaccurate.
Don’t Cheat On Meaningful Consent
So why the SEK 200,000 (~$20,000) fine? A key element of the Swedish Data Protection Authority’s (DPA) complaint concerned consent. Yes, students and parents were asked for consent (and some indeed did refuse to participate), but the agency deemed that this wasn’t sufficient given the power dynamics between the students and the school. Thus, written consent is not sufficient — even when the person has a viable alternative — if the person is pressured or coerced into providing it. This is the difference between “consent” and “meaningful consent,” and it applies everywhere when it comes to artificial intelligence. You risk customer, employee, and regulatory blowback when the individual feels tricked, pressured, or otherwise coerced into using an AI solution.
The right way to introduce an AI solution is to offer one that materially benefits individuals and to persuade them of those benefits — while offering alternatives, including the status quo. And you should guarantee that choosing the status quo will carry no negative repercussions: those who opt out should be just as well off as before, or ideally better off, with the AI solution in place. Otherwise, you are right back to making people feel coerced.
Do Your Homework
Another key element that led to this fine appears to be the lack of a sufficient impact assessment. For sensitive projects such as this, it’s critically important to engage proactively with the relevant authorities, and it seems the DPA was never contacted beforehand. Important questions needed to be examined: How would the students who opted out feel? Even if they knew their biometric data was not being collected, were they really still comfortable? Did the students who opted in trust that the technology was accurately recording their attendance? What were the risks of collecting and storing this kind of biometric data, and how would those risks be mitigated?
So does this mean that computer vision solutions are not compatible with GDPR? Far from it. This fine isn’t based on GDPR itself but on Sweden’s interpretation of GDPR — and the country has a history of stringent data protection legislation going back decades. It is unlikely that this establishes a limiting precedent across the European Union. In fact, if there is a larger geographic takeaway from this scenario, it’s a positive one. A decade ago, this computer vision use case would have been science fiction. Today, the technology is mature enough that you can deploy it anywhere — even at the edge of the Arctic Circle.