The iPhone of dead San Bernardino terrorist Syed Rizwan Farook has been in the hands of the FBI for weeks now. So why hasn’t the intelligence been accessed and evaluated for the protection of Americans against possible future attacks and the prosecution of additional terrorist conspirators? Encryption? Not really. Ethics? Possibly. Public perception and future market share? Getting warmer!

This past Tuesday (Feb. 16), a federal magistrate in California ordered Apple to develop a custom version of its iOS software that disables embedded security features and to install it on Farook’s iPhone. On Wednesday, Apple CEO Tim Cook counterpunched in an open letter to Apple customers and the public, in which he described these actions as an “unprecedented step which threatens the security of our customers.”

Sen. Tom Cotton (R-Ark.) asserted yesterday that Apple is more concerned with “a dead terrorist’s privacy over the security of the American people.” The issue is as simple as it is complicated, and it presents a larger problem for Apple, which may be taking for granted the public’s naiveté about the difference between encryption and security.

The iPhone’s security is not one mechanism but layers of encryption: traffic over the LTE (Long Term Evolution) network is encrypted in transit, Apple’s messaging services are encrypted end to end, and data stored on the device itself is protected with 256-bit encryption. In fact, at this point in time, even the NSA is reportedly unable to completely crack that encryption and listen to both sides of a conversation (metadata is a different topic that I wrote about a while ago). I don’t want to get too deep into the weeds here, but the point is 256-bit encryption applied in multiple layers.
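For readers who want to see what 256-bit encryption looks like in practice, here is a minimal sketch using AES-256 in Python (via the third-party `cryptography` package). It is illustrative only and assumes nothing about Apple’s actual data-protection design, which layers device and passcode keys in ways this toy example does not reproduce.

```python
# Illustrative AES-256 example (not Apple's implementation).
# Requires the third-party "cryptography" package: pip install cryptography
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # a 256-bit key, the strength discussed above
nonce = os.urandom(12)                     # unique value per message

aesgcm = AESGCM(key)
ciphertext = aesgcm.encrypt(nonce, b"meet at noon", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"meet at noon"
```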

Basically, breaking it would mean guessing a 256-bit key, one of roughly 2^256 (on the order of 10^77) possible values, which would take far more than a lifetime for even the most sophisticated computer. That’s the encryption level ensuring the privacy of current and future editions of the iPhone. The iPhone 5S and newer extend the same protection to each device’s unique identifier (UID), a key assigned at manufacture that Apple does not have or store.
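To put that in concrete numbers, here is a rough back-of-the-envelope estimate in Python. The guess rate is an assumption, deliberately generous to the attacker.

```python
# Back-of-the-envelope: brute-forcing a 256-bit key.
key_space = 2 ** 256                # ~1.16e77 possible keys
guesses_per_second = 1e18           # assumed rate, wildly generous to the attacker
seconds_per_year = 60 * 60 * 24 * 365

years_to_exhaust = key_space / guesses_per_second / seconds_per_year
print(f"{key_space:.2e} possible keys -> ~{years_to_exhaust:.2e} years to try them all")
# Roughly 3.7e51 years, versus about 1.4e10 years since the Big Bang.
```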

What the FBI is asking for is help getting past the device’s passcode. Farook’s phone was an iPhone 5C, and that detail is vital, because Apple does have the technical capability to comply with the court order. iPhone owners know that the phone’s contents can be completely erased after too many incorrect passcode attempts.
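For illustration, here is a hypothetical sketch of that auto-erase behavior. The attempt threshold is an assumption made for the example; this is not Apple’s source code.

```python
# Hypothetical sketch of an auto-erase policy (not Apple's code).
MAX_FAILED_ATTEMPTS = 10  # assumed threshold for this example


class LockedDevice:
    def __init__(self, passcode: str) -> None:
        self._passcode = passcode
        self.failed_attempts = 0
        self.erased = False

    def try_unlock(self, guess: str) -> bool:
        """Return True on success; erase the device after too many failures."""
        if self.erased:
            return False
        if guess == self._passcode:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_FAILED_ATTEMPTS:
            self.erased = True  # encryption keys discarded; data unrecoverable
        return False
```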

Although the iPhone 5C’s deletion mechanism is less aggressive than in newer editions (and future editions will be more aggressive still), the FBI doesn’t want to take the chance. The iPhone 5C also lacks a fingerprint sensor and a Secure Enclave (the cryptographic co-processor that mixes in an unrecorded hardware key), and because of that, Apple could assist by loading a signed, modified version of iOS that disables the auto-erase feature and the escalating delays between passcode attempts, letting investigators try passcodes until one works.
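A rough sketch of why removing those protections matters: once the auto-erase and retry delays are gone and guesses can be submitted electronically, the passcode space is tiny compared with the key space above. The ~80 ms per guess figure is an assumption about the cost of the hardware-bound key derivation, used here only for scale.

```python
# Rough estimate: how quickly a passcode falls once the retry limits are gone.
# The 80 ms per guess is an assumed cost of the hardware-bound key derivation.
SECONDS_PER_GUESS = 0.08

for digits in (4, 6):
    combinations = 10 ** digits
    worst_case_minutes = combinations * SECONDS_PER_GUESS / 60
    print(f"{digits}-digit passcode: {combinations:,} codes, "
          f"~{worst_case_minutes:,.0f} minutes worst case")
# 4 digits: about 13 minutes; 6 digits: about 22 hours.
```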

Here are two thoughts that we are not hearing in the mainstream:

  1. Why don’t the FBI’s computer forensic specialists “jailbreak” the phone, since it has weaker security protections than newer models?
  2. Why is Apple fighting this court order with such vigor, given that it has complied with law enforcement requests in the past?

The FBI wants this precedent established and made public. Ever-improving encryption is challenging the ability of law enforcement to combat terrorism, as well as drug trafficking, sexual predators, and more. Equally important, the FBI doesn’t hire the best hackers. They don’t fit the suit-and-tie, cookie-cutter mold.

The U.S. and China are in a cyber war. With record iPhone sales in China and growing potential for profit there, Apple cannot afford to be viewed as placing “backdoors” into its devices that would let an adversarial government spy on Chinese buyers.

In this case, the FBI is not asking for a “backdoor” or universal key that may be applied at will. It has requested, through the proper judicial process, access to one individual device, the equivalent of the longstanding practice of obtaining records from phone companies, banks, landlords, and employers. Tim Cook is appealing to a public, much of which does not understand the subtle differences among security terms. The balance the courts must strike is to require technology companies to comply while wording the decision so narrowly that access is allowed only in the most compelling cases of national security.

The FBI wins this case.