
Let me explain why I think Apple’s decision to revoke Facebook’s developer certificate was a strategic error. Not just for Apple, but for tech in general.
First, the background. For an app to run on an iPhone, it has to be signed with an Apple certificate. Apple provides app developers with both an enterprise certificate, which carries extra privileges for internal iOS app development, and a general certificate for App Store distribution. Facebook broke Apple’s rules by using the enterprise certificate to sign a Facebook VPN app (giving it extra privileges) and distributing it to regular customers outside the App Store. The app could then collect information on which other apps were installed on users’ phones, and what data traffic they sent. This was a violation of the rules, a privacy breach, and it was done knowingly. Make no mistake: Facebook was bad bad bad.
In response, Apple revoked Facebook’s enterprise certificate, which in turn shut down all iOS development and testing inside Facebook. It’s the nuclear option. Then it turned out Google had done something similar, though not quite as bad, so (to be fair) Apple shut down Google as well. And now it turns out other companies violated the terms too. After a few days, Apple reinstated the certificates for everyone, so things are getting back to normal.
Here’s my point. As Ben Thompson noted: “I wouldn’t be surprised if, in the long run, the company comes to regret this move.” This is exactly right. I believe Apple made a strategic error they will regret, even though the move is popular right now. For example, John Gruber wrote: “Give me a break. Everybody knows that Apple’s enterprise certificate program does not in any way permit distribution of this kind of software, which wouldn’t be allowed in the App Store. But that’s exactly how Facebook was using it. There are pros and cons to Apple’s iron-clad control over native apps on iOS. This incident with Facebook is one of the pros.”
Both can be true: 1) Facebook gets away with too much and absolutely deserves severe punishment, and 2) Apple going nuclear was a mistake. No matter how deliciously satisfying. No matter if the rules said they could revoke Facebook’s certificate. Rules should be enforced with an eye toward precedent, with the assumption that bad actors will eventually latch on to those same precedents.
Here’s an analogy. Suppose back in the day when Netscape had just invented the web browser and was riding high, and Microsoft was the evil empire (think circa 1997), Microsoft had shut down all Netscape Windows development for a rules violation. It’s similar to what Apple (good company) just did to punish Facebook (bad company), but with the villains swapped: Microsoft (bad company) punishing Netscape (good company). While this couldn’t happen back then, with cloud centralization it can now. As Apple just demonstrated.
The Cold War between the Soviet Union and the US was very fraught, but the one thing both sides got right was showing restraint with nuclear weapons, both directly and in proxy wars. The Cold War bomb threat was invoked all the time. Fine. But leaders showed restraint, knowing that if one side (the good side) used them, even if justified, the other side (the bad side) would run with that precedent and use them too. Apple could have played the brinkmanship game, publicly threatening to revoke the certificate in a Cuban Missile Crisis style confrontation with Facebook, and gotten much the same result, without actually firing nukes.
It’s unclear exactly how and when this may come back to bite. But as a hypothetical, imagine a startup built on modern centralized platforms such as Amazon AWS, Microsoft’s GitHub, Google Maps, and so on. Suppose Amazon, Microsoft, and Google put in place complex developer rules about what’s not allowed, rules that are hard to fully comply with. If that startup later competes against one of those companies, the platform owner has the temptation to stop the startup dead by disallowing development on their platform on the pretext of a rules violation. Threatening may be enough. I’m not saying Amazon, Microsoft, or Google in particular will do this. Rather, in the cloud era, all centralized cloud platforms face this temptation.
Apple set a precedent. Nukes are now allowed. Even if the letter of the law says you can do it, restraint is often the wiser choice. There will be increased temptation to create platform rules which are easy to violate. Laying a trap. Just in case. Then, if the mood strikes, bombs away.

I normally find your posts very clear, informative and convincing, but I found this one hard to understand. Perhaps if I were a subscriber to Ben Thompson the citation to his post would have helped. Are you saying that overly complex rules should never be enforced, but instead only used for leverage in trying to maintain control? Does enforcing the rules even once somehow reduce future options? Or is the concern that Apple should have kept its excessive power more hidden? Or perhaps that Apple is increasing uncertainty in its marketplace by using the rules so directly? Does your cold war analogy allude to worries about retaliation and all-out warfare? I also don’t understand your future scenarios about GitHub, AWS and Google. Who is doing what, and what has been lost because of current actions?
Sorry it was not clear. My point is about enforcing rules with judicious restraint.
A software platform owner can decide not to enforce the most drastic penalty for a rule violation, even if technically they are completely within their rights to do so. And even if the bad actor morally deserves it.
If they use the most drastic penalty, that becomes a precedent. And when, in the future, other (bad) actors have the chance to enforce rules on their platforms, they will be more able to use the most drastic penalty. The legal framework rests on a bed of behavioral norms, and using restraint when enforcing rules is sometimes wise.
Amazon AWS is a software platform, GitHub is a repository platform, and Google Maps is a platform used widely by many third parties. If those platform makers choose to, they can make complex rules which people will inevitably violate, and then kick them off the platform. That will be easier to do if the norm of kicking bad actors off platforms is encouraged.
Thanks, that was very helpful. I think the fact that they worked with Facebook and Google to reinstate internal certificates the next day sets a rather mild precedent.