Why Google and Apple App Stores Aren't Effective at Protecting Users' Privacy
It's important for users to understand what personal data iOS and Android applications collect and how that data is used. Apple and Google, as market leaders, have strict and well-developed privacy requirements for app developers; nevertheless, personal data leak incidents are not becoming any rarer. In fact, it's safe to say that effective privacy control is impossible to achieve today if we leave it solely up to the corporations. Not only do they continue to ignore incidents, they also don't allow third-party developers to give users control.
What’s the problem with Apple's strict control?
From a privacy point of view, Apple’s policies are well defined, and there’s no big difference between the declared and the actual approach. Apple imposes quite stringent restrictions on what user information an app can obtain.
Is it possible to quietly break these rules?
Yes, it is, and unfortunately, there are plenty of high-profile examples. Recently it was discovered that Sensor Tower, a popular analytics platform for technology developers and investors, had been secretly gathering data about millions of people who installed popular VPN and ad-blocking apps for Android and iOS. As BlackBerry states, hundreds of apps have circumvented Apple's and Google's security measures. The sad part is that while Apple itself can hunt for such violators, outside researchers have a much harder time doing so.
In 2018, the Uber iOS app suddenly received additional rights to access users' screen recordings, which was quite an unprecedented step. Private APIs cannot be used in applications on the Apple App Store, and the entitlement that technically allowed Uber to record the device's screen was eventually revoked.
Apple's privacy guidelines are tightened year by year, but the interpretation of the rules also tends to change over time, and there is evidence of this. The AdGuard Pro case is one example: we even had to temporarily suspend its development because of a sudden change in how certain paragraphs of the App Store rules were interpreted.
Users of iOS applications themselves don't have any means of control over their data; they simply have to trust the app developer. And Apple has no desire to let third-party developers provide privacy protection tools for end-users; instead, it severely limits the functionality of such applications.
It would be more convenient for us, as an app developer, to work without restrictions. We provide a tool that Apple itself doesn't offer, yet Apple limits our functionality and doesn't always explain why. This isn't very convenient, but, unlike Google, Apple is at least willing to engage in a dialogue. Nevertheless, our applications could've done so much more if we hadn't been restricted in functionality.
The problem is also that Apple reviewers may not see what an application actually does with personal data. The number of apps (and their developers) is growing rapidly, and in recent years the corporation has had to enlarge its staff of reviewers. Unfortunately, new employees often lack the proper experience. They may not fully understand the guidelines, so they interpret the rules in their own way, each time differently. As a result, it can be difficult to agree with them on what's permitted and what's not. On the other hand, they at least explain in detail what the problem is, unlike Google, where you often have to guess what a particular requirement means.
In conclusion, we would like to say that Apple has very good privacy guidelines and tries to apply them fairly, but at the same time it tries to keep all privacy matters under its own control. And it can act quite selectively, which is precisely the problem. Still, in terms of privacy, the iOS platform remains the most secure for the user.
The Wild Wild West of Google Play
Many applications on Google Play show outright disrespect for users' personal data. Sad but true. The protection of personal data in Android apps remains surprisingly poor, despite the large number of high-profile incidents.
In 2018, we conducted research and confirmed that Android applications from the top 1000 can, without notifying the user, extract email addresses, contacts, and text messages and transfer them to third parties, with almost no protection against this. It was especially unpleasant to see some of the most popular apps (10M+ downloads), award winners and Google Play "Editors' Choice" picks among them, doing this.
We found that at least three applications developed by the Chinese company GOMO violated users' privacy and tried to siphon off as much information as possible. The GO SMS Pro app boasts over 100 million installations according to Google Play. Immediately after installation, it sends your email address to the goconfigsync.3g.cn domain directly in the request URL over plain HTTP. As a result, your email isn't just sent to their server; it's also exposed to every intermediate third party along the way.
We also found two more apps, Z Camera - Photo Editor, Beauty Selfie, Collage and S Photo Editor - Collage Maker, Photo Collage, with more than 100M installations each. Both send your email address, together with various other information, to the domain zcamera.lzt.goforandroid.com. Ironically, GOMO likes to emphasize privacy when describing its apps.
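To make the mechanism concrete, here is a minimal sketch of why putting an email address in the query string of a plain-HTTP URL exposes it. The endpoint path and parameter names below are hypothetical, invented for illustration; only the domain mirrors the behavior described above. Because the full request line travels in cleartext, every on-path observer (Wi-Fi router, ISP, proxy) can read the address verbatim.

```python
from urllib.parse import urlencode

# Hypothetical parameter names; only the domain matches the case above.
email = "user@example.com"
query = urlencode({"email": email, "ver": "1.0"})
url = f"http://goconfigsync.3g.cn/sync?{query}"

# Over plain HTTP, the request line is sent unencrypted, so the
# percent-encoded email is visible to any network intermediary:
request_line = f"GET /sync?{query} HTTP/1.1"
print(request_line)

# The address is embedded directly in the URL ('@' becomes '%40'):
assert "user%40example.com" in url
```

Note that HTTPS would encrypt the path and query string in transit, but it still wouldn't protect the user from the app deliberately transmitting the address to its own server, which is the core problem in the GOMO case.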
And nothing has really changed over the past two years: news about incidents involving Google Play applications keeps appearing. It's also impossible to be completely sure that even security apps aren't involved in unauthorized tracking.
The question is, why do such cases go unnoticed by Google?
In the meantime, the privacy situation on the Play Store can be described as the “Wild Wild West.” It would seem that Google sets the right requirements for mobile application developers, but the struggle to enforce them remains the task of “lone sheriffs.”
What does it amount to?
Google's declared privacy protection guidelines are milder than Apple's. Google doesn’t have restrictions on user identification, for instance. Plus, they have the appropriate restrictions on certain information, such as location, date, and contacts. According to the guidelines, personal data can only be requested from a user if the app actually uses it. Here's what's forbidden: “Apps that steal user authentication information (such as usernames or passwords) or imitate other apps or websites to trick users into revealing personal information or authentication information.”
Unfortunately, these restrictions are very easy to ignore, and any developer can do so; there are dozens of examples. Yet even when new high-profile cases become public, Google may not pay attention. As the saying goes, when a product is free, you are the product.
Users' data protection needs run contrary to the corporation's advertising business model. And it's not profitable for corporations to remove ads from their closed platforms, so they staunchly defend them.
If Google didn't restrict the functionality of third-party apps, the situation might not be so deplorable. Android app users’ sensitive information remains unprotected from unrestricted access by third parties. Google isn’t trying to solve this problem, nor does it allow third-party developers to do it, nor does it take responsibility for the inevitable incidents.
Gradually, Google is forcing developers to ask users to grant access to certain data, and this helps a little with privacy protection. At the same time, Google is trying to solve all the problems in one fell swoop, without the kind of "manual" moderation the App Store relies on, and that approach is highly inefficient. In our view, the ideal solution would be for Google to take control of user data and then give users that control through third-party developers.
In the meantime, Google is trying to automate all its processes of interaction with developers. As a result, the procedure remains completely opaque and inconvenient, and it constantly requires further clarification.
Why do incidents occur?
Unfortunately, Apple's and Google's policies actually have little in common with real security and privacy. Developers of privacy protection solutions are not comfortable working with either company. Apple offers some opportunities but at the same time clamps developers in the jaws of restrictions and treats different companies selectively. Google allows developers to do anything, just not on Google Play :)
Of course, many more personal data leak incidents lie ahead, but as more and more users take an interest in privacy, the problem will be addressed more actively.