The hearing in one of Apple’s legal battles against Epic has now concluded, and we await the verdict of the California court. In the meantime, Apple’s 325-page legal filing is now public (here’s the core of Apple’s defence), and it contains some extremely interesting details about how one of the world’s biggest and most important tech companies operates.
The number that initially blew me away is simply how many developers have signed up to build for iOS, a platform that launched 14 years ago. There are “27 million registered iOS developers who have agreed to abide by the Developer Agreement,” which means that roughly 0.33% of the global population are registered iOS developers.
This obviously translates into a lot of apps, and “Apple therefore uses—indeed, pioneered—robust manual review in the app review process, involving close to 500 Apple employees deployed across the globe.” Apple first runs automated review to detect whether an app violates any App Store rules or uses private APIs, and the algorithm can also make judgements about copycat or scam apps (it rejected a Fortnite clone called Fortcraft, for example).
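To make the idea concrete, here is a toy sketch of what two automated pre-screening checks might look like. It is purely illustrative: Apple does not publish its private API list, its catalogue of well-known app names, or its matching logic, so the symbol names, catalogue, and similarity measure below are all invented assumptions.

```python
import difflib

# Hypothetical private API symbols; Apple's real list is not public.
PRIVATE_API_SYMBOLS = {"_UIBackdropView", "_LSApplicationWorkspace"}

# Hypothetical catalogue of well-known app names to screen against.
KNOWN_APPS = ["Fortnite", "Minecraft"]


def private_api_hits(binary_symbols):
    """Return any symbols in the submitted binary that match the
    (hypothetical) private API list."""
    return sorted(set(binary_symbols) & PRIVATE_API_SYMBOLS)


def closest_known_app(app_name, catalogue=KNOWN_APPS):
    """Return the most similar known app name and a 0-1 similarity
    ratio, using a simple string-similarity measure as a stand-in
    for whatever Apple actually does."""
    def ratio(known):
        return difflib.SequenceMatcher(None, app_name.lower(), known.lower()).ratio()

    best = max(catalogue, key=ratio)
    return best, ratio(best)


# A name like "Fortcraft" scores closest to "Fortnite", so a system
# like this would queue it for human review rather than auto-approve.
match, score = closest_known_app("Fortcraft")
```

The point of the sketch is only that cheap static checks (symbol scans, name similarity) can triage the obvious cases, while anything borderline still needs the human judgement the filing describes.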
The human element is essential, however, because algorithms are fundamentally dumb: the filing cites the example of a calculator app asking to use the camera, something a human reviewer would instantly recognise as suspicious but that the machine review process wouldn’t necessarily pick up. Every single app is subject to human review.
There’s perhaps a perception that anything can get onto the App Store, but this review process “results in the rejection of about 40% of submitted apps. Most of these rejections prevent apps that have software glitches or bugs, or that would compromise users’ data privacy or security.” To give some idea of the scale, in 2020 Apple rejected over 150,000 app submissions for violating its privacy guidelines.
The outcome is that iOS is one of the safest platforms out there for users, a key part of Apple’s argument for its ‘walled garden’ approach. “As a result of the App Review efforts and Apple’s subsequent processes, there is a significantly smaller number of malicious iOS apps than those available on Android. In 2018, the iPhone platform accounted for just 0.85% of malware infections. By contrast, Android accounted for 47.15% and Windows/PC accounted for 35.82%.”
Apple’s review process has long been somewhat opaque. A CNBC report that spoke to company insiders describes the Executive Review Board, which meets weekly and sets policy for the developer relations department. The ERB has the final say on whether an app is allowed on the store or not: it was this body, for example, that decided to remove the Infowars app in 2018.
During the Apple vs Epic trial, the California court also heard testimony from Trystan Kosmynka, one of the heads of the App Store (thanks, 9to5Mac). Kosmynka re-emphasised the figures above and added a few more details: roughly 5 million apps are submitted every year, and in 2019 the exact figure was 4,808,685 apps, of which 36% (1,747,278) were rejected.
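Those two figures from the testimony are consistent with each other, as a quick arithmetic check shows:

```python
# Figures from Kosmynka's 2019 testimony, as reported by 9to5Mac.
submitted = 4_808_685
rejected = 1_747_278

# The rejection rate works out to roughly 36%.
rate = rejected / submitted
```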
One thing to be clear about, however, is that Apple’s system is far from perfect, and the company struggles particularly with political cases. Most infamously, it banned an app that was being used during the recent Hong Kong protests to track police, which certainly does not seem like putting customers first. Apple is far from the first tech company to do Beijing’s bidding, but that hardly makes it right.
Perhaps the most interesting aspect of this system is that, even with these problems, it is arguably the best approach anyone has come up with: after several years of trying to moderate Android’s app store with machine learning, Google has transitioned to an approach much closer to Apple’s. There are certainly aspects of the walled garden to dislike, but it clearly does what it’s designed to do.