Mobile App Developers: Stop Capturing Unnecessary Data Before Regulators Stop You
The findings presented in an article published by the German magazine Computerwoche on Feb 11, 2014, are a forceful reminder that messages about excessive data capture via mobile apps seem to have gone unheeded so far. According to the article, tests by TÜV Trust IT established that “almost one in two mobile apps suck up data unnecessarily”.
What’s “unnecessary” of course depends on your viewpoint: it may seem unnecessary to me if my mobile email app captures my location; the provider of the app, on the other hand, could be capturing the information to provide me with a better service and/or to make money from selling such data to a third party. The trouble is that I don’t know, and I don’t have a choice if I want to use the app. From a consumer perspective, this is not a satisfactory situation; I’d even go as far as calling it unacceptable. Not that my view alone matters much, but privacy advocates and regulators are increasingly taking notice. Unless app providers take voluntary measures, they may see their data capture habits curtailed by regulation to a greater degree than would otherwise be the case.
Let’s step back a moment and consider why so many mobile apps capture more data than is strictly speaking necessary for the functioning of the app:
- Conscious decision, for commercial reasons: in particular when companies are making apps available free of charge, they may (not unreasonably) feel that they have a right to recoup their investment in some way. One way to do that is monetizing the data – the more you capture, the more you can sell. That’s no doubt why many providers of paid-for apps can’t resist the temptation, either.
- Conscious decision, simply because it’s possible: with the best of intentions, developers may decide to capture information just because they can, in case it’s of use at some point later – even if nobody has thought about what kind of future use case or functionality might leverage this data. This applies even in countries with stringent data privacy laws.
- Ignorance and carelessness: the quickest and easiest way to get a mobile app up and running is often to take existing apps or components and reuse them. Developers may either not realize or simply not care that the code they’re reusing is capturing all kinds of data that’s not actually needed for their app (the sketch below illustrates how easily this can happen).
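To make that last point concrete, here is a minimal, hypothetical sketch of the kind of reusable “analytics” component that often gets bundled into Android apps. The class name, payload format, and choice of fields are illustrative assumptions rather than the behavior of any particular SDK; the point is simply how easily borrowed code ends up gathering data the host app never needs.

```java
import android.content.Context;
import android.location.Location;
import android.location.LocationManager;
import android.os.Build;
import android.provider.Settings;

// Hypothetical "drop-in" analytics helper: adopted for crash reporting, but it also
// collects a stable device identifier and location that the host app's own features
// never use. All names and the payload format are illustrative assumptions.
public final class AnalyticsHelper {

    private AnalyticsHelper() {}

    public static String buildPayload(Context ctx) {
        StringBuilder payload = new StringBuilder();

        // Stable device identifier -- not needed simply to report a crash.
        String androidId = Settings.Secure.getString(
                ctx.getContentResolver(), Settings.Secure.ANDROID_ID);
        payload.append("device_id=").append(androidId);

        // Last known location -- quietly piggybacks on a location permission the
        // host app may have declared for an entirely unrelated feature.
        try {
            LocationManager lm =
                    (LocationManager) ctx.getSystemService(Context.LOCATION_SERVICE);
            Location loc = lm.getLastKnownLocation(LocationManager.NETWORK_PROVIDER);
            if (loc != null) {
                payload.append("&lat=").append(loc.getLatitude())
                       .append("&lon=").append(loc.getLongitude());
            }
        } catch (SecurityException ignored) {
            // Host app holds no location permission: silently skip.
        }

        // OS and device details look harmless in isolation, but the aggregate
        // amounts to a device profile.
        payload.append("&os=").append(Build.VERSION.RELEASE)
               .append("&model=").append(Build.MODEL);

        return payload.toString();
    }
}
```

Nothing in the app’s own feature code hints that any of this is happening, which is exactly why it so often goes unnoticed by the developers who reuse such components.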
And we’re not just talking about device data here (OS and version, language, serial numbers, location, time stamps, accelerometer data, and so on); some apps also transmit personal information, including address book content, without the user being aware. So far, we’ve seen comparatively little user pushback. This is likely because many users are simply not aware of what’s going on, while others accept whatever is presented to them for agreement, because they want the app, and they want it now. But this may be changing: in a survey carried out by YouGov in December 2013 on behalf of the UK’s Information Commissioner to support an awareness-raising campaign, 61% of respondents were either concerned or very concerned about “the way mobile apps can use your personal information (e.g. locational data, web browsing history)”; 39% stated that they’d “decided not to download an app due to privacy concerns”.
Covert data capture can also lead to regulatory intervention, which of course attracts headlines that in turn further increase consumer awareness. One example is the US Federal Trade Commission’s action against Goldenshores Technologies, the developer of the “Brightest Flashlight Free” app, which failed to disclose that it was capturing and transmitting users’ location data and unique device identifiers to third parties (http://www.ftc.gov/news-events/press-releases/2013/12/android-flashlight-app-developer-settles-ftc-charges-it-deceived).
As the flashlight example also demonstrates, the argument is as much about transparency as it is about data capture: the action against Goldenshores centered on its failure to disclose properly how users’ information would be used, not on the fact that it was capturing particular types of data.
As my colleague Fatemeh Khatibloo argues eloquently in her blog post “Rumors Of Privacy’s Death Have Been Greatly Exaggerated”, the time has come to take a different approach to consumer privacy – one based on context, consensus and transparency.
In the context of mobile app development, good guidance is already available on what a “privacy by design” approach should look like. Two resources to highlight are the UK Information Commissioner’s “Privacy in mobile apps. Guidance for app developers” and the GSMA’s proposed set of “Privacy design guidelines for mobile applications”. It’s worth noting that the latter was published back in February 2012; in other words, mobile data capture misdemeanors cannot be blamed on a lack of guidance.
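As a rough, hedged illustration of the data minimization principle both documents promote, the sketch below (the feature, class, and method names are my own hypothetical choices) reads location only when the user actually invokes the feature that needs it, checks the permission explicitly, and degrades gracefully rather than collecting the data “just in case”.

```java
import android.Manifest;
import android.content.Context;
import android.content.pm.PackageManager;
import android.location.Location;
import android.location.LocationManager;

// Hypothetical "privacy by design" sketch: location is touched only when the user
// invokes the one feature that needs it, the permission is checked explicitly, and
// the absence of permission leads to a graceful fallback, not a hidden capture.
public final class NearbyStoresFeature {

    private NearbyStoresFeature() {}

    // Returns a coarse location for the "find nearby stores" feature, or null if
    // the app has no location permission -- in which case the caller should fall
    // back to, say, asking the user for a postcode.
    public static Location locationIfPermitted(Context ctx) {
        boolean granted = ctx.checkCallingOrSelfPermission(
                Manifest.permission.ACCESS_COARSE_LOCATION)
                == PackageManager.PERMISSION_GRANTED;
        if (!granted) {
            return null;
        }
        LocationManager lm =
                (LocationManager) ctx.getSystemService(Context.LOCATION_SERVICE);
        // A coarse, last-known position is enough for this feature; no need for a
        // fine-grained, continuously updated fix, and nothing is stored or sent
        // beyond what the feature itself displays.
        return lm.getLastKnownLocation(LocationManager.NETWORK_PROVIDER);
    }
}
```

The same idea applies to any other data type: capture it at the point of use, explain why, and stop there.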
I strongly encourage all app developers to voluntarily follow mobile app privacy guidelines such as those referenced above. True, some of the principles may appear to be in conflict with the organization’s immediate commercial goals. But I’d argue that it’s better to figure out a consensual, transparent way of addressing this type of data capture than to risk a regulatory clamp-down.