You could call it déjà vu, but everyone knew it was just a matter of time before it would happen again: another mass shooting, unanswered questions and, inevitably, another locked phone that could hold some answers.
But how can I say it any more clearly? The safety of our nation does not depend on giving Attorney General William Barr the keys to spy on anyone with a mobile phone. In fact, it’s the opposite.
Yet in another fatuous speech, of the kind in which he seems to specialize, Mr. Barr attacked Apple on Monday for not helping to unlock two iPhones that he claims the government needs to scrutinize, saying the company has given him “no substantive assistance” with an important criminal investigation.
Mr. Barr’s focus was a shooting last month at a naval air station in Pensacola, Fla., in which a Saudi cadet training with Americans killed three and wounded eight others. (Mr. Barr was also complimentary of Saudi Arabia in the speech; welcome to the Trump administration.)
Apple rightly pushed back against Mr. Barr’s claims, pointing out that the company had, in fact, helped the government by turning over information on the dead gunman from its servers.
What does Mr. Barr specifically want from Apple? He doesn’t say, of course. There is nothing Apple can actually do to unlock the gunman’s phones that government tech geeks cannot do, short of a systemic change that would weaken the security of all phones. Thus, his aim is clear: He wants the power to go in and out of any phone, any time.
Believe nothing Mr. Barr says. We should question how long the government has taken to make these requests of Apple (I’m told that the company was made aware of a second phone only last week). And, perhaps most important to keep in mind, the government can probably already break into our phones if it wants to.
It has before, even if Mr. Barr seemed to indicate on Monday that the government could not pull it off without help from Apple.
That was what happened when federal investigators battled with the company over iPhone encryption after the mass shooting in San Bernardino, Calif., that killed 14 people in 2015. In that case, the government was trying to get Apple to “break” the protection of the suspect’s phone by demanding it create what Apple called in a court filing a “GovtOS” (that is, a government operating system), giving it that metaphorical key to a back door to all phones. The F.B.I. director at the time, James Comey, and President Barack Obama pressed Apple to help crack into a device that one of the gunmen had owned.
The Apple chief executive, Tim Cook, refused to do it, in what was probably the most difficult decision of his career. He stuck to the principle espoused by his predecessor, the Apple founder Steve Jobs, who had long maintained that the ultimate act of patriotism is to protect the privacy of Apple customers against unnecessary search and seizure.
Over the years, Apple has increased security for consumers through encryption. In the simplest terms, phones are built to be opened only with a user’s passcode (the same technology enables fingerprint and face identification), and the company has built in no way to bypass it. The iPhone has been engineered to offer more and more user control.
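The design described above can be illustrated with a simplified sketch. This is not Apple’s actual implementation — the names and parameters here are hypothetical — but it shows the basic idea: the encryption key is derived from the user’s passcode, so there is no vendor-held master key that could unlock the data.

```python
# Illustrative sketch only — not Apple's real design. It shows why
# passcode-derived encryption leaves the vendor with nothing to hand over:
# the key exists only when the correct passcode is entered.
import hashlib

def derive_key(passcode: str, salt: bytes) -> bytes:
    # A deliberately slow key-derivation function (here PBKDF2 with many
    # iterations) makes brute-forcing short passcodes expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

salt = b"per-device-salt"  # hypothetical per-device value

right = derive_key("1234", salt)   # correct passcode -> correct key
wrong = derive_key("1235", salt)   # wrong passcode -> unrelated key

assert right != wrong  # without the passcode, the key cannot be recovered
```

Because the key is never stored anywhere, “helping” the government would mean shipping software that weakens this derivation for every device, which is the back door at issue.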
That’s why Apple would not bend in the San Bernardino case. In the end, reports say, an Israeli firm unlocked the phone for the government, and the collision between Apple and the F.B.I. was averted.
In the Pensacola case, Apple has said it has turned over all the data it has to the government. But law enforcement officials, as all law enforcement officials tend to do, are asking for more, claiming this time that they want only to look at the particular phones in the Pensacola case, and are not seeking a back door for all phones or a special operating system.
The problem? There is no breaking into a single phone without showing the government how to break into all iPhones.
While public safety is important, we have to ask whether getting into these phones is the only avenue the government has to pursue justice or find the truth. We must trust our public officials with confidential and sensitive information to allow them to do their jobs. But the need to guard users’ data — especially in an age when hackers and more autocratic governments can use it for more nefarious reasons — still trumps the public-safety concerns Mr. Barr is raising.
He said in his news conference, “This situation perfectly illustrates why it is critical that the public be able to get access to digital evidence.” Really?
Does it also perfectly illustrate that the United States government did a bad job vetting the Pensacola gunman, a radicalized Saudi cadet? Does it also perfectly illustrate that American officials missed his public postings that indicated an interest in jihadism? Does it also perfectly illustrate that the Saudi government has backed a lot of the type of extremism that might have led to this attack?
I don’t know the answers to these questions. I don’t know how to deal with phones that have become much more secure. But the issues raised here most definitely require real cooperation between tech companies and the government; we may even need legislative solutions.
But first we need a whole lot of substantive public discussion and debate about the trade-offs we face, as potentially dangerous surveillance technology only continues to advance.
Apple clearly feels strongly about this matter. The company has benefited from Trump administration policies — the tax cuts, among others — and has little apparent incentive to pick a fight with the attorney general. Mr. Cook has met with Mr. Trump many times and has been cordial, even solicitous. Those days seem to be over.
The government appears to be trying to make everyone more vulnerable to privacy violations, which is completely backward. A private company should not have to protect us from the government. The government should be protecting us — and the fact that it’s not should be the scariest thing of all.