Why does Xcode 14 deprecate bitcode?
This year’s WWDC brought a lot of news to Apple developers, but one announcement in particular from the Xcode 14 beta release notes caught my attention:
- Starting with Xcode 14, bitcode is no longer required for watchOS and tvOS applications, and the App Store no longer accepts bitcode submissions from Xcode 14.
- Xcode no longer builds bitcode by default and generates a warning message if a project explicitly enables bitcode: “Building with bitcode is deprecated. Please update your project and/or target settings to disable bitcode.” The capability to build with bitcode will be removed in a future Xcode release. IPAs that contain bitcode will have the bitcode stripped before being submitted to the App Store. Debug symbols for past bitcode submissions remain available for download.
Bitcode is now gone!
It might seem like an unexpected removal at first, but after some thought I think it makes total sense, so let me try to bring some clarity to this deprecation.
What is bitcode?
Bitcode is essentially an LLVM Intermediate Representation (IR) of your code, sitting somewhere between source code and machine code. When you compile source code with LLVM, it is first translated into this intermediate language, called bitcode. That bitcode can then be analyzed, optimized, and translated into CPU instructions for the desired target CPU.
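To make that concrete, here is a minimal sketch: a trivial Swift function and the standard swiftc invocations that take it to bitcode, to human-readable IR, and to machine code. The file name and output paths are illustrative, and the commands assume the Xcode command line tools are installed.

```swift
// Square.swift: a deliberately tiny example.
//
// Roughly, the pipeline looks like this (commands are illustrative):
//
//   swiftc -emit-bc Square.swift -o Square.bc   // source -> LLVM bitcode
//   swiftc -emit-ir Square.swift -o Square.ll   // the same IR, but human-readable
//   swiftc -c       Square.swift -o Square.o    // source -> machine code (object file)
//
// Square.bc is neither Swift nor CPU instructions: it is LLVM's intermediate
// form, which a back end can later analyze, optimize, and translate into
// native code for a concrete target.
func square(_ x: Int) -> Int {
    x * x
}
```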
According to Apple’s documentation, bitcode “allows the App Store to compile your app optimized for the target devices and operating system versions and may recompile it later to take advantage of specific hardware, software, or compiler changes.”
But why?
Sounds amazing, right? So why is Apple deprecating it?
Apple has nearly completed its transition to hardware built exclusively on arm64-based architectures, which means it no longer needs the flexible LLVM back ends that can generate instructions for a different architecture without recompiling the source code.
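To picture what that flexibility actually bought, here is a purely illustrative command sketch, written as comments and reusing the hypothetical Square.swift from the earlier example. Compiling the same source once per target is exactly the lowering step that bitcode was supposed to let Apple perform later from the IR alone.

```swift
// Illustrative only: producing machine code for two CPUs today means compiling
// the source once per target (macOS triples shown because they work with the
// default SDK; file names are hypothetical):
//
//   swiftc -c -target arm64-apple-macos13  Square.swift -o Square-arm64.o
//   swiftc -c -target x86_64-apple-macos13 Square.swift -o Square-x86_64.o
//
// Bitcode's promise was that the App Store could perform that final lowering
// step from Square.bc alone, with no source in hand. Once every shipping
// device is arm64, there is no "other" architecture left for that to serve.
```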
Personal take
There are a lot of ifs and guesses around bitcode, and nobody knows to what extent Apple has actually been using it to do what their documentation preaches.
A safe guess is that, when bitcode was originally conceived, Apple was trying to future-proof its development ecosystem, as it wasn’t yet clear which direction they would take regarding future architectures.
That probably changed soon after, as they decided to transition every platform to arm-based architectures, rendering bitcode pointless in the near future. They just kept requiring developers to enable bitcode for the sake of not making things more confusing, quietly waiting until they reached the point where they no longer needed to support a non-arm64 architecture.
Furthermore, we can be a bit skeptical about the use Apple actually made of bitcode and ask ourselves questions like: why didn’t Apple provide a way to convert macOS apps from x86_64 to arm64 when the M1 Macs were introduced, despite that being one of the most prominent use cases for bitcode?
I assume that, despite requiring bitcode for all apps, the optimizations promised when bitcode first appeared never happened on their end: they simply delivered exactly the binary code that we had built and signed ourselves, and never any code that Apple themselves created from bitcode.
Was bitcode good for developers? No.
Was it bad? Sometimes.
Good (and pointless) riddance, bitcode: nobody (except security companies) will miss you!