Looking back at this 2016 article from The Washington Post, one really shouldn’t blame the paper for its optimism regarding WhatsApp’s “end-to-end encrypted” capabilities. The encryption (that is, the security) behind WhatsApp, owned by Facebook, is actually quite good at its core.
But what couldn’t have been so easily predicted by the Post is what I’ll call the “Facebook effect”: every Facebook service has a single purpose, which is to collect information about you, whether personal information or generic metadata. It wants it all. The service might be somewhat secure, but it certainly isn’t private.
“‘The idea is simple: when you send a message, the only person who can read it is the person or group chat that you send that message to. No one can see inside that message. Not cybercriminals. Not hackers. Not oppressive regimes. Not even us,’ WhatsApp co-founders Jan Koum and Brian Acton wrote in a blog post.”
In reality, WhatsApp might not officially be able to decrypt your messages or have a motive to target you personally, but I don’t trust that they wouldn’t. If hackers can compromise WhatsApp to access contacts, messages, and your smartphone camera, I’m confident that Facebook, WhatsApp, and any other major partner of theirs could do the same.
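The end-to-end principle the founders describe can be illustrated with a deliberately simplified sketch. This is a toy one-time-pad XOR cipher, not the Signal protocol that WhatsApp actually uses, and the function names are my own; the point is only that a relaying server handles ciphertext it cannot read, because the key lives solely on the two endpoints.

```python
import secrets

def otp_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy one-time pad: XOR each byte with a random key byte.

    Secure only if the key is truly random, as long as the message,
    and never reused. A stand-in for real E2E ratcheting schemes.
    """
    return bytes(k ^ p for k, p in zip(key, plaintext))

# XOR is its own inverse, so decryption is the same operation.
otp_decrypt = otp_encrypt

message = b"meet at noon"
# The shared key exists only on the sender's and recipient's devices.
key = secrets.token_bytes(len(message))

ciphertext = otp_encrypt(key, message)   # this is all the server ever relays
recovered = otp_decrypt(key, ciphertext)
assert recovered == message              # only a key holder can read it
```

Note that even in this idealized model, the server still learns *metadata*: who is talking to whom, when, and how much. Encryption hides message contents, not the social graph, which is exactly the data the article argues Facebook values most.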
Here’s the next gem from the Washington Post article.
“WhatsApp and Facebook are ‘great American companies,’ said FBI General Counsel James A. Baker Tuesday in a moderated discussion at a conference of the International Association of Privacy Professionals. ‘But this presents us with a significant problem.'”
In light of the current storm clouds hovering over Facebook for its apparent lack of ethics, it’s amusing that Mr. Baker made this statement. Given how frustrated federal investigators were at the time by cases involving encrypted communications, it could be inferred that Mr. Baker resented the idea of having any trouble monitoring and accessing private data. It’s now well-known that Facebook executives developed a pattern of showing no conscience or boundaries when it came to monitoring and collecting private data.
James Baker’s boss, James Comey, was also quoted about how encrypted apps are “hindering investigations.” Ironically, this was especially true during the Clinton scandal that he’s infamously tied to. From the article:
“His boss, FBI Director James B. Comey, has often said that the Islamic State is using encrypted apps to direct people to kill ‘innocent people’ in the United States. And it is hindering investigations of murder, child pornography, organized crime and a range of other crimes, law enforcement officials said. Still, Comey said at a congressional hearing last month: ‘It is not our job to tell the American people how to resolve that problem…Our job is simply to tell people there is a problem.'”
Mr. Comey was true to his word: he told us there was a problem with an investigation being “hindered,” and did nothing about it.
It’s easy to look back and point out where this article and the people in it were wrong. In reality, the lesson here is to never give these large tech companies—Facebook, Google, and others—too much credit for any privacy and security measures they take.
All of them present large attack surfaces for anyone with the time, money, and resources to exploit vulnerabilities. It isn’t a matter of whether they’ll be breached or hacked; it’s a matter of how long it will take. And the real threat might not be an external hacker at all, but what the tech companies themselves are doing with your information.
And as for assurances from industry leaders or government head honchos: if history is any indicator, I wouldn’t put too much trust in what they have to say either.