The Wall Street Journal has opined, in a column by L. Gordon Crovitz, that "Terrorists Love Silicon Valley." Crovitz was formerly an Executive Vice President of Dow Jones, which publishes The Journal, and President of its Consumer Media Group.
The column I am referencing appeared in the July 6, 2015 edition of The Journal, and the main basis for Crovitz's claim that Silicon Valley is soft on terrorism seems to be Apple's announcement that the iPhone will now encrypt data by default. Even Apple itself will not be able to "unlock" your data, which means that Apple cannot provide any of your data to the federal government, even if the government produces a properly obtained search warrant.
Crovitz takes great exception to this new development, but I am thinking that this is, in fact, rather "good news" for the ordinary citizen. Heretofore, the United States government has demanded that software and equipment manufacturers always provide some sort of "back door" to allow the government to get access to private data, including encrypted data. As we have learned from Edward Snowden and others, the government has seldom bothered to get a search warrant when it decides to make use of this open back door.
The practical problem, of course, is that any back door left open for the convenience of your lover may let burglars into the house. In fact, that is exactly what has been happening. The government's demand that all data, including encrypted data, be accessible to it has led to massive data thefts by nefarious types who have figured out how to get in through that back door left open for the government. Maybe the data thefts are being orchestrated from China, or maybe from North Korea, or maybe from somewhere else.
It looks like Apple and other Silicon Valley high-tech companies have decided against leaving that back door open. As I say, I am not so sure that this is truly a bad thing!
It is true, clearly, that if encrypted data is now actually going to be "secure," then the government will not be able to browse through everyone's data to find potential terrorists. It does seem, based on Crovitz's article, that it should be possible to set up a system that would allow the government to get access to encrypted data through a "multiple key" system, one that would permit "reasonable searches and seizures" authorized by procedures that actually comply with the Fourth Amendment. Such procedures don't allow the government to search anyone and any place it wants to, in the hope that maybe it will find some incriminating evidence. The Fourth Amendment requires the government to prove to a judge, before searching, that there is "probable cause" to search. If the judge thinks there is, then a search warrant will issue and the search can proceed.
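To make the "multiple key" idea a little more concrete, here is a minimal sketch, in Python, of one way a data key could be split so that no single party can decrypt alone. The split_key and recombine helpers, and the notion of an owner's share plus a court-held share, are my own illustrative assumptions; they are not anything Crovitz, Apple, or the government has actually proposed.

import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; either share alone reveals nothing."""
    share_a = secrets.token_bytes(len(key))                 # one-time random pad
    share_b = bytes(k ^ a for k, a in zip(key, share_a))    # key XOR pad
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the two shares back together to recover the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

data_key = secrets.token_bytes(32)          # hypothetical 256-bit key protecting a phone's data
owner_share, court_share = split_key(data_key)
assert recombine(owner_share, court_share) == data_key       # both shares together recover the key
assert owner_share != data_key and court_share != data_key   # neither share alone is the key

The point of the sketch is only that decryption would require cooperation from more than one key holder; whether such a scheme could ever be deployed safely at scale is exactly what the security community disputes.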
The Fourth Amendment exists to protect all of us from general searches by government officials based on nothing more than the officials' own decision to search.
Figure out a technology that lets the Fourth Amendment work the way it's supposed to, and I'm all for letting the government have access to private data, including encrypted data.
WITH a search warrant.
Until then, don't bother trying the back door. I keep it locked!
http://www.wsj.com/articles/why-terrorists-love-silicon-valley-1436110797
I don't think you understand how encryption works. Waving around a search warrant doesn't decrypt data. Executing a search warrant against properly encrypted data, without a back door, takes tremendous computational power and time. FBI Director James Comey is concerned about the strong encryption in use today; he calls this the "going dark" problem.
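To put rough numbers on "tremendous computational power and time," here is a back-of-the-envelope calculation. The guess rate of one trillion keys per second is an arbitrary assumption for illustration.

keyspace = 2 ** 256                      # number of possible 256-bit keys
guesses_per_second = 10 ** 12            # assumed brute-force rate (one trillion per second)
seconds_per_year = 60 * 60 * 24 * 365

years_to_try_every_key = keyspace / (guesses_per_second * seconds_per_year)
print(f"about {years_to_try_every_key:.2e} years")    # roughly 3.7e+57 years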
So why not let the FBI install back doors? Because they're inherently unsafe. A back door is a weakness built into an encryption system. Once someone stumbles onto the weakness, which is guaranteed to happen eventually, anyone can exploit it, and the encryption is rendered useless.
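As a toy illustration of that single point of failure, and emphatically not real cryptography, suppose every user's data key were also wrapped under one global escrow key so the government could always get in. The wrap helper and the escrow arrangement below are assumptions of mine for illustration only; whoever obtains that one escrow key, lawfully or otherwise, can unwrap everyone's keys at once.

import secrets

ESCROW_KEY = secrets.token_bytes(32)     # the single "back door" key

def wrap(key: bytes, wrapping_key: bytes) -> bytes:
    """Toy key wrapping by XOR; stands in for a real key-wrapping scheme."""
    return bytes(k ^ w for k, w in zip(key, wrapping_key))

# Each user has a personal data key, plus a copy wrapped under the escrow key.
users = {name: secrets.token_bytes(32) for name in ("alice", "bob", "carol")}
escrowed = {name: wrap(key, ESCROW_KEY) for name, key in users.items()}

# An attacker who steals the escrow key recovers every user's key in one stroke.
stolen_key = ESCROW_KEY
recovered = {name: wrap(blob, stolen_key) for name, blob in escrowed.items()}
assert recovered == users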
Luckily, as Peter Swire argues, the government "already has the tools it needs to catch criminals." [1]
1. http://slate.me/1LaI3rU