Influencers: Apple should not help FBI crack San Bernardino iPhone
A strong majority of Passcode Influencers said that Apple should not comply with a US court's order to help the FBI get into the San Bernardino shooter's iPhone.
"Apple should use every resource in its tool chest to fight against a government-mandated backdoor that makes us all far less safe," said Sascha Meinrath, director of the X-Lab, a tech policy think tank. "Today's battle is not about San Bernardino, it's about the integrity of our information and communications and our fundamental right to privacy."
A magistrate judge in Riverside, Calif., on Tuesday ordered Apple to disable built-in security features on the iPhone of one of the Islamic State-inspired shooters who killed 14 people in December, which could allow the FBI to run a program that would crack the phone's password faster. Apple CEO Tim Cook vowed to fight the court order, arguing that creating new, weaker software designed to get around the security features is tantamount to building a backdoor. That is simply "too dangerous to create," he warned, because it could be used not just in this case but to break into Apple products again and again.
As the world's largest tech company goes head-to-head with the US government, the battle promises to be an important test case in the simmering encryption debate. And 60 percent of Passcode's pool of more than 130 digital security and privacy experts who took the survey sided with Apple.
Kevin Bankston, director of New America's Open Technology Institute, worries that if Apple helps the FBI in this case, it will set a dangerous legal precedent. "This case isn't about just one backdoor into just one iPhone," Mr. Bankston said. "If Apple builds this backdoor software, our government - and less savory governments around the world - will demand to use this new software every time they encounter an encrypted iPhone."
The precedent would not stop at just Apple and its iPhones, either, Bankston said. "If a court can legally compel Apple to build this backdoor, then it likely could also compel any other software provider to do the same, including compelling the secret installation of surveillance software via automatic updates to your phone or laptop," he said. "Such a broad precedent would spell digital disaster for the trustworthiness of any and every computer and mobile device.... Apple is right to fight this. A loss would not only undermine its product but its entire industry and the security of every user of digital technology."
Many Influencers who argued Apple should fight the order said they were sympathetic to the FBI's plight, and that increasingly pervasive encryption and stronger security measures do make investigators' jobs harder. They also acknowledged that this stance may be especially controversial in a sensitive case such as this one. But the court's solution, Influencers said, would put the American public's security at risk in the long term.
"Law enforcement has legitimate concerns about access to information. But this short-term fix hurts our cybersecurity long term," said Jenny Durkan, chair of the Cyber Law and Privacy Group at the law firm Quinn Emanuel. "It is a dangerous precedent to order a company to purposely breach a product's security. When that product is as important and relevant as the iPhone, the security and privacy risks are too great. The sad fact is that right now we are incapable of protecting our most vital information. Purposely creating more flaws is short-sighted."
The security of Internet-connected devices, added Christian Dawson, cofounder of the Internet Infrastructure Coalition (i2Coalition), which comprises leading industry infrastructure providers and technology firms, "is vitally important to the economy, and the millions of consumers who rely upon being able to keep their personal data, financial, medical and legal records secure."
"Requiring workarounds that weaken encryption could have broad ramifications and the potential to jeopardize the security and safety of all users," Mr. Dawson stressed. While Internet infrastructure companies work with law enforcement agencies every day to combat and deter crime, he said, the industry "cannot support a government mandate to weaken security standards."
A 40 percent minority of Passcode Influencers said Apple should help the FBI because there is a court order. While companies such as Apple have a right to design products that protect their customers, they said, that right does not extend beyond the reach of law enforcement.
"Society relies on civil support to law enforcement. Privacy does not extend to hiding evidence of a crime. This is a big question of our industry today: Is it legitimate to create items with features designed to hide and/or destroy evidence?" said one Influencer, who preferred to remain anonymous. Influencers have the option to reply on the record or anonymously to preserve the candor of their responses.
"The argument that we must build systems to prevent crime is legitimate," the Influencer continued. "But enabling other crimes while trying to prevent others is not a legitimate trade."
Another Influencer who thought Apple should comply with the order said this will be an important test of how companies behave under the rule of American law. "All of the slippery slope, 'What happens when the Chinese try to do the same,' and similar arguments are nice rhetorical lines with zero basis in our Constitutional traditions," the Influencer said.
"Of course this puts Apple in a tricky spot, in the same way, say, other industries have had to - consistent with a court order and hence a compelling public purpose - also disclose their customers' data. Or the same way certain companies don't and won't do business in places like China or Russia. Apple has a clear choice: act consistent with the rule of law of the United States, or create its own pan-international law that is somehow unhinged from our legal and Constitutional traditions."
Other Influencers say they support strong encryption and other security measures - and worry that Apple's refusal to comply will ultimately hurt the battle for consumer security. "This is the wrong issue for Apple to fall on their sword over," one Influencer said. "It's a hack that would only impact an outdated device they no longer sell, enabling them to mitigate impact over a broader backdoor. They run the risk that this stance backfires and leads to broader restrictions on encryption."
While many Influencers said Apple seems to be on the right side of the encryption issue, some cautioned that in a highly politicized election year, the government's arguments against terrorism may trump arguments for encryption.
"Apple would like this to be an issue about technology, encryption, and customer trust," an Influencer said. "In an election year, they will not be able to sustain that argument. The campaigns will make it an issue about terrorism and public safety. Apple is right on the encryption debate. But that's not the battleground on which this fight will take place."
COMMENTS:
NO
"This is a perfect test case for the government concept of forcing companies to provide backdoors into their security, because it involves a terror plot and it lacks exigency. That gives courts and policymakers ample time to fully examine this question. In my opinion, the FBI's probable cause that relevant evidence will be found on the phone is pretty thin, and the consequences are dire: there is no such thing as a one-time backdoor, nor is there such a thing as a backdoor that can be used only for good." - Nick Selby, StreetCred Software
"Vendors should and will help law enforcement when court orders are obtained for specific access. However, the FBI is asking Apple to provide a compromised version that could subvert security in any iPhone, a very different story. The courts need to rule on this first." - John Pescatore, SANS Institute
"Unfortunately, if Apple were to acquiesce to this demand, there is no limit to the sort of similar demands they are likely to receive from other courts and countries. Everybody thinks these kinds of lengths are demanded by the facts of their case." - Tom Cross, Drawbridge Networks
"18th-century laws (the All Writs Act) are out of step with 21st-century technology." - Marc Rotenberg, Electronic Privacy Information Center
"Not only do I not think Apple should comply, but I also believe those arguing for Apple to comply, such as the White House spokesman, have been misleading the public. As an example, the White House spokesman stated that Apple could do this simply and in a limited way. There is nothing simple or limited about this request. Policymakers and law enforcement officials may want something, and it may be a good debate to have, but misleading the public on the technological challenges and repercussions does not benefit anyone." - Robert Lee, Dragos Security
"This case highlights one of the most crucial open questions in the cryptowars debate: In a rule-of-law country, what tools are reasonable for law enforcement? I don't believe those tools should include mandatory backdoors or exceptional access systems for encrypted tech. But, on the facts of this case, is this an appropriately narrowly tailored request by the FBI? Or would it establish dangerous precedent allowing law enforcement agencies to compel technology companies to hack into customers' devices? What happens when those devices include IoT products - like sensors and cameras in our homes, information from our cars, or information from wearables?" - Influencer
"Tim Cook seemed to capture the issue well when he said that Apple has been 'asked for something that they don't have, and is too dangerous to create.' If the U.S. Government wants the authorities with which to mandate such an effort, passing a law to do it would be a reasonable first step." - Bob Stratton, MACH37
"I agree with the Internet Association's statement, which says that governments should not require the weakening of these necessary security standards. Efforts to weaken or undermine strong encryption harm consumers and undermine our national security." - Abigail Slater, Internet Association
"Fight the power." - Influencer
"Security and privacy are crucial to the future of trusted communications. Without trust, all networks fail. Apple and Google are correct in serving the public good by not creating backdoors or 'master keys' for anyone for any reason. The controversy being generated by this U.S. court order has allowed the debate to be taken to the people. The court order and its threat to civil liberties is of great benefit because it has made the debate more public, and millions here and abroad are learning about how and why encrypted communications are crucial as we create new ways of working in the digital age. Trust is everything." - Influencer
"Privacy must trump terrorism. The rights of U.S. citizens have already been diluted due to our first ill-informed response to 9/11 called the Patriot Act. The government showed that it could not be trusted to apply those provisions narrowly and with great discretion. Now the U.S. government is making the same case that in order to stop terrorism the American people must once again surrender certain rights in order to be safe. I applaud Apple for not caving in to that argument and hope that my peers will join me in placing the need for privacy above making law enforcement's job easier." - Jeffrey Carr, Taia Global
"There are two issues here; one smaller and one larger. The smaller one is that law enforcement is asking Apple to hack their own phone; to write code that will essentially break it. It is analogous to law enforcement asking the Ford Motor Company to spend resources in order to put a permanent flaw in their latest truck engine. I don't think that has ever been done before. The larger issue is this. Apple, and other Silicon Valley companies, decided a while ago that they did not want to hold a key to any backdoor that law enforcement might want to leverage in the future; not for obvious terrorist cases like this but for all citizens. They didn't want to be the law enforcement enabler if and when law enforcement decided to skirt our privacy rights (see the first version of the Patriot Act). The data for these terrorists is not sitting in some database somewhere. It is sitting on the terrorist's phone. In order to get it, you have to hack the phone. For this larger issue, you have to ask yourself if you want to grant law enforcement the ability to hack into anybody's phone even if they have a warrant. And before you say yes too quickly, remember that once this flaw is installed, every bad guy on the planet will discover it and leverage it. The FBI, in my opinion, is leveraging our fear of terrorism to give them a back door; not for this terrorist case but for every case in the future. Apple's decision, I think, is the smart play in the long run; for their customers for sure but for the nation." - Rick Howard, Palo Alto Networks
"Creating a master iPhone key would have disastrous consequences for data security and US business interests overseas." - Chris Finan, Manifold Security
"Apple should use every resource in its tool chest to fight against a government-mandated back-door that makes us all far less safe. Today's battle is not about San Bernardino, it's about the integrity of our information and communications and our fundamental right to privacy. Only the most myopic and technologically illiterate believe that fundamentally undermining encryption will magically make us safer -- the reality, one well understood by most top technologists, is that strong encryption (like computers themselves) is a huge boon to civil society and must remain safe and secure." - Sascha Meinrath, X-Lab
"This court order demanding that Apple custom-build malware to undermine its own product's security features, and then digitally sign that software so the iPhone will trust it as coming from Apple, doesn't just set us down a slippery slope - it drops us off a cliff. This case isn't about just one backdoor into just one iPhone. If Apple builds this backdoor software, our government - and less savory governments around the world - will demand to use this new software every time they encounter an encrypted iPhone. But this isn't just about iPhones, either: if a court can legally compel Apple to build this backdoor, then it likely could also compel any other software provider to do the same, including compelling the secret installation of surveillance software via automatic updates to your phone or laptop. Such a broad precedent would spell digital disaster for the trustworthiness of any and every computer and mobile device. The FBI has already spent the last year arguing for backdoors in front of Congress and at the White House, and now that it's come up empty it's trying to get a lower court judge to convert a vague, centuries-old catch-all statute into a powerful government hacking statute. That's not how we make policy in this country, and Apple is right to fight this - a loss would not only undermine its product but its entire industry and the security of every user of digital technology. A line must be drawn here, and we at OTI are eager to continue to fight to ensure that we can continue to trust the security and integrity of the devices we use every day." - Kevin Bankston, Open Technology Institute
YES
"Not a good question. Of course they should comply - if they can't get it reversed on appeal. Should they comply without legally challenging it? No." - Influencer
"So long as it doesn't force Apple to redesign its products, modify business processes, or taint future development. If it does, then Apple should fight to get clarity on just how far a court can mandate product features." - Jeff Moss, DEF CON Communications
"This is the wrong issue for Apple to fall on their sword over. It's a hack that would only impact an outdated device they no longer sell, enabling them to mitigate impact over a broader backdoor. They run the risk that this stance backfires and leads to broader restrictions on encryption." - Influencer
"Apple has an obligation to assist the government if it can do so." - Stewart Baker, Steptoe & Johnson
"They're willing to harvest every drop of personal information from their users and sell it to the highest bidders via ad networks, but won't let it be used in a real life/death law enforcement situation?" - Influencer
"The private sector should cooperate with lawful government investigations. No company should be forced to make design changes to the products and services it sells. But if companies have the ability to remotely update devices, then it is reasonable for law enforcement to ask them to use that capability to comply with court orders." - Influencer
"This is different than a permanent backdoor in all devices." - Influencer
"Apple would like this to be an issue about technology, encryption, and customer trust. In an election year, they will not be able to sustain that argument. The campaigns will make it an issue about terrorism and public safety. Apple is right on the encryption debate. But that's not the battleground on which this fight will take place." - Influencer
"There are ways to ensure that data is provided only from the terrorist's phone without giving out Apple's encryption secrets." - Influencer
"According to Newsweek, Apple has unlocked their phones at least 70 times since 2008. Let's say that again: APPLE HAS UNLOCKED THEIR PHONES AT LEAST 70 TIMES SINCE 2008!!! So why the big protest now? Because it's a marketing ploy. The Constitution says there shall be no 'unreasonable' searches and seizures; it does NOT say there shall be NO searches and seizures, and it does not specify that companies get to decide what is reasonable. That is the purview of the courts, and in the immediate case, a US court has decided that it is in fact reasonable for Apple to provide the assistance necessary to open ONE SPECIFIC PHONE. So: Apple has done what it's being asked to do scores of times previously; they are being presented with a legal, constitutional demand to produce information; there is a compelling legal and national security imperative at stake. So of course they should comply, and failing compliance, they should be subject to the full force of US Government sanctions and punishment." - Influencer
"Bad facts make bad law - and Apple picked just about the worst facts to make their stand and they could end up doing a disservice to both the tech and the privacy communities. Thanks a lot, Apple." - Influencer
"This is a cat and mouse game where the national security need is clear but industry needs to be ordered over great objection. The overarching issue is the global nature of markets and particular issues being addressed in local legal frameworks." - Influencer
"Society relies on civil support to law enforcement. Privacy does not extend to hiding evidence of a crime. This is a big question of our industry today: Is it legitimate to create items with features designed to hide and/or destroy evidence? In an analogy, it is one thing to manufacture a knife. We don't hold the manufacturer responsible for the actions of the holder. It is another thing to make a knife that deliberately destroys DNA evidence on its handle. Likewise, you can't hide the knife in a commercial safe and refuse to open it for the government with a warrant. The argument that we must build systems to prevent crime is legitimate. I have spent my career on this. But enabling other crimes while trying to prevent others is not a legitimate trade. Privacy advocates fear a precedent under which other countries where they do business will also ask for support with the intent of suppressing their citizens. This argument would seem to be an ethical one at first blush, even admirable. However, it is actually advocating for corporations to establish their own laws, ignoring the jurisdictions in which they do business. Will these corporations use their own internal courts to determine which laws they will respect? We don't always agree, but we must work within the law." - Influencer