Numerous reports, including one from TechJury, underscore our increasing dependence on mobile apps for everything from tracking fitness and monitoring health to ordering food and booking flights. Statistics show that, in 2020, the average smartphone user had 40 apps installed and spent 87% of their mobile time in them. While these apps offer incredible convenience, they are also a vehicle for malicious hackers to obtain sensitive data and personal information. But before we dive into the work of hackers, it is important to understand user privacy.

While many mobile apps require a user to accept terms and conditions before launching, it's safe to say most people skip over the pages of small print and just hit the "accept" button, trusting that the app maker has users' best interests in mind. This is not necessarily the case, even with the most widely used apps. Take Facebook, for example: once its terms and conditions of use are accepted, Facebook has permission to access the user's internal phone storage, call logs, texts, contacts, camera roll, microphone, Wi-Fi connection and location. Many people respond to this by saying, "I have nothing to hide, so what's the big deal?" Here's the big deal: the more dispersed one's personal data, and the more apps with access to it, the greater the chance that data will fall into the hands of a hacker. Add to this vulnerability the number of fake mobile apps users are unwittingly downloading to their phones.

Granted, there may be little the average mobile app user can do beyond reading the terms and deciding whether to use such an app, but there is plenty a mobile app developer can do to protect consumers' privacy.

Traditionally, mobile app developers build their apps and then upload them to an app store, understanding that, once an app is "out there in the wild," it is difficult to know by whom and where it will be downloaded and installed. This opens the door to vulnerability, which is why privacy regulations have increased over the last several years. That's not to say all is well; on the contrary. In the U.S., for example, many data privacy and data security laws exist across the 50 states, with more coming quickly. Some apply only to government entities, others to private entities, and some to both. The result is a disjointed patchwork of policies that is nearly impossible for the average reader or mobile app user to understand, with no single federal standard in place.

It is alarming to be reminded just how vulnerable consumers are when they carry their smartphones and IoT devices, with their fragile onboard security and a treasure trove of financial data, personal details, relationships and health statuses, into a wilderness of copycat apps and malware. Given this awareness, user privacy, particularly Personally Identifiable Information (PII) and other sensitive data, is increasingly becoming a top consideration for ethical app developers throughout the development lifecycle. First and foremost, mobile app compliance should be part of an overall strategy and of the SDLC to ensure users' right to privacy. And it should be communicated in a way that's easily understood by the user. For example, rather than a stream of small print and paragraph after paragraph of jargon explaining the terms and conditions, users must be able to clearly identify and read, in plain language, specific and relevant information, including the following:

  • The purpose for collecting data 
  • The benefit to the consumer
  • What specific personal data is collected
  • In what form the data is collected
  • Where data is transferred to
  • How long data is retained by the app
  • How data can be deleted by the user
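One way to keep these disclosures honest is to treat them as structured data rather than free-form legal text. The sketch below is purely illustrative (the class and field names are hypothetical, not from any SDK): it models the seven items above as a record the app could render as a short, plain-language screen on first launch.

```java
// Hypothetical sketch (names are illustrative, not from any SDK): a structured
// disclosure record an app could render in plain language on first launch.
public class DataDisclosure {
    private final String purpose;        // why the data is collected
    private final String userBenefit;    // what the consumer gains
    private final String dataCollected;  // which personal data items
    private final String collectionForm; // e.g. "anonymized" or "raw"
    private final String transferredTo;  // where the data is transferred
    private final int retentionDays;     // how long the app retains it
    private final String deletionPath;   // how the user can delete it

    public DataDisclosure(String purpose, String userBenefit, String dataCollected,
                          String collectionForm, String transferredTo,
                          int retentionDays, String deletionPath) {
        this.purpose = purpose;
        this.userBenefit = userBenefit;
        this.dataCollected = dataCollected;
        this.collectionForm = collectionForm;
        this.transferredTo = transferredTo;
        this.retentionDays = retentionDays;
        this.deletionPath = deletionPath;
    }

    // One short, plain-language line per disclosure item from the list above.
    public String plainLanguageSummary() {
        return String.join("\n",
            "Why we collect data: " + purpose,
            "What you get: " + userBenefit,
            "What we collect: " + dataCollected,
            "In what form: " + collectionForm,
            "Where it goes: " + transferredTo,
            "Kept for: " + retentionDays + " days",
            "To delete it: " + deletionPath);
    }

    public static void main(String[] args) {
        DataDisclosure d = new DataDisclosure(
            "Show nearby restaurants", "Faster, more relevant search results",
            "approximate location", "anonymized", "our own servers only",
            30, "Settings > Privacy > Delete my data");
        System.out.println(d.plainLanguageSummary());
    }
}
```

Keeping the disclosure in one structured object also makes it harder for any of the seven items to silently go missing from the user-facing screen.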

A compliant app will also honestly and objectively provide the user with all mandatory information, such as proper app metadata on the commercial marketplace. Google, for instance, recently published updated guidance for Android developers to improve app quality and discovery on Google Play, so that store listing assets help users anticipate the in-app or in-game experience and drive meaningful downloads. Its pre-announced policy change for app metadata (enforcement date yet to be determined) includes the following:

  • Limiting the length of app titles to 30 characters
  • Prohibiting keywords that imply store performance or promotion in the icon, title and developer name
  • Eliminating graphic elements that may mislead users in the app icon

Other information and descriptions should be provided as well, such as an explanation of why the app needs access to the device's advertising identifier (iOS IDFA, Android AAID) and what this means for the user, even if a third party rather than the app itself performs the tracking. Users should also be given the information behind privacy-related notifications (optional, either push or in-app), permission requests (messages about what value is delivered to the user, e.g., location-tracking services) and attempts to gather analytics that track behavior or performance, as well as informational screens about the app, the developer, customer support and FAQs. Above all, data protection should be considered a shared responsibility by all parties accessing a user's data. Mobile app developers, in fact, should feel a sense of obligation to ensure privacy and security during design and production. This includes authorization, proper use of system APIs, encrypting confidential data at rest and in transit, and passing formal security testing.
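To make the data-at-rest point concrete, here is a minimal sketch using the standard `javax.crypto` API with AES-GCM. It is an illustration, not a production recipe: in a real Android app the key should come from the hardware-backed Keystore rather than being generated in memory as done here, and key management is the hard part this sketch deliberately skips.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;

// Minimal sketch of encrypting confidential data at rest with AES-GCM.
// Illustrative only: a real app should obtain the key from the platform
// keystore instead of generating it in app memory.
public class AtRestEncryption {

    // Encrypts msg with a fresh key and IV, decrypts it, and verifies the round trip.
    public static boolean roundTrips(String msg) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);                              // 256-bit AES key
        SecretKey key = kg.generateKey();

        byte[] iv = new byte[12];                  // fresh 12-byte IV per message;
        new SecureRandom().nextBytes(iv);          // never reuse an IV with the same key

        byte[] plaintext = msg.getBytes(StandardCharsets.UTF_8);

        Cipher enc = Cipher.getInstance("AES/GCM/NoPadding");
        enc.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = enc.doFinal(plaintext);

        Cipher dec = Cipher.getInstance("AES/GCM/NoPadding");
        dec.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        return Arrays.equals(plaintext, dec.doFinal(ciphertext));
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrips("user@example.com")); // prints "true"
    }
}
```

GCM also authenticates the ciphertext, so tampering with stored data is detected at decryption time rather than silently producing garbage.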

With the dynamic nature of data collected today, mobile apps should be designed not just for the present but for the future, with the ability to gracefully handle situations where a granted permission is revoked, consent is withdrawn or collected data is erased. In other words, the application should react accordingly and keep its state consistent. In doing so, we can all contribute to a much safer mobile world.
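The "react accordingly" requirement can be sketched as a small consent-aware store. This is a hypothetical illustration (the class and method names are not from any real SDK): revoking consent both blocks future collection and purges already-held data, so the app's state stays consistent with the permissions actually in effect.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch, not a real SDK API: a store that stops collecting and
// erases held data the moment the user revokes consent.
public class ConsentAwareStore {
    private boolean consentGranted = false;
    private final List<String> collected = new ArrayList<>();

    public void grantConsent() {
        consentGranted = true;
    }

    // Revocation both blocks future collection and purges past data.
    public void revokeConsent() {
        consentGranted = false;
        collected.clear();
    }

    // Data is only recorded while consent is in effect.
    public boolean record(String dataPoint) {
        if (!consentGranted) {
            return false;
        }
        collected.add(dataPoint);
        return true;
    }

    public int storedCount() {
        return collected.size();
    }

    public static void main(String[] args) {
        ConsentAwareStore store = new ConsentAwareStore();
        store.grantConsent();
        store.record("location:52.52,13.40");
        store.revokeConsent();                                   // user withdraws consent
        System.out.println(store.record("location:48.85,2.35")); // prints "false"
        System.out.println(store.storedCount());                 // prints "0"
    }
}
```

Funneling every read and write through one consent check like this is far easier to audit than scattering permission checks across the codebase.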