This is an advantage stem that can be read with the Surveillance State Repeal Act and Secure Data Act
affirmatives (and others that we haven't worked on yet). It accesses many of the other impact areas that
we have developed in other advantage files. This file contains the internal link stem for the advantage; the
impact scenarios are found in other advantage files and the solvency cards for a particular plan are found
in that affirmative file.
1AC Options
Eric Holder, the outgoing US attorney general, has joined the FBI and other law
enforcement agencies in calling for the security of all computer
systems to be fatally weakened. This isn't a new project – the idea has been around since
the early 1990s, when the NSA classed all strong cryptography as a munition and regulated civilian use
of it to ensure that they had the keys to unlock any technological countermeasures you put around your
data.
In 1995, the Electronic Frontier Foundation won a landmark case establishing that code was a form of
protected expression under the First Amendment to the US constitution, and since then, the whole world
has enjoyed relatively unfettered access to strong crypto.
interested in these keys to the (United) Kingdom will be much larger, and they'll have more money, and
they'll be able to do more damage.
That's really the argument in a nutshell. Oh, we can talk about whether the
danger is as grave as the law enforcement people say it is, point out that
only a tiny number of criminal investigations run up against
cryptography, and when they do, these investigations always find
another way to proceed. We can talk about the fact that a ban in the
US or UK wouldn't stop the bad guys from getting perfect crypto from
one of the nations that would be able to profit (while US and UK business suffered)
by selling these useful tools to all comers. But that's missing the point:
even if every crook was using crypto with perfect operational security,
the proposal to back-door everything would still be madness.
Because your phone isn't just a tool for having the odd conversation with
your friends, nor is it merely a tool for plotting crime, though it does duty in both cases. Your
phone, and all the other computers in your life, they are your digital
nervous system. They know everything about you. They have
cameras, microphones, location sensors. You articulate your social
graph to them, telling them about all the people you know and how you know them. They are
privy to every conversation you have. They hold your logins and
passwords for your bank and your solicitor's website; they're used to chat to your therapist and the
STI clinic and your rabbi, priest or imam.
The most shocking Snowden revelation wasn't the mass spying (we already
knew about that, thanks to whistleblowers like Mark Klein, who spilled the beans in 2005). It was the
fact that the UK and US spy agencies were dumping
$250,000,000/year into sabotaging operating systems, hardware, and
standards, to ensure that they could always get inside them if they
wanted to. The reason this was so shocking was that these spies were
notionally doing this in the name of national security, but they were
dooming everyone in the nation (and in every other nation) to using
products that had been deliberately left vulnerable to attack by
anyone who independently discovered the sabotage.
There is only one way to make the citizens of the digital age secure,
and that is to give them systems designed to lock out everyone
except their owners. The police have never had the power to listen
Never mind that Congress has already said it doesn't expect communications providers to provide such
capabilities in most cases. Here's relevant language from the law:
A telecommunications carrier shall not be responsible for decrypting, or ensuring the
government's ability to decrypt, any communication encrypted by a subscriber or customer,
unless the encryption was provided by the carrier and the carrier possesses the information
necessary to decrypt the communication.
Security expert Matt Blaze put it this way: "Crypto backdoors are
dangerous even if you trust the government not to abuse them. We
simply don't know how to build them reliably."
The hackers of the worldcriminals, foreign governments, you name itwill be thrilled
if Holder, Comey and the no-privacy-for-you backup singers get their
way.
The other, even worse, disconnect is the implicit notion that there is no
measure we shouldn't take to guarantee our ability to stop and punish
crime. The Constitution, and especially the Bill of Rights, says we do take
some additional risks in order to have liberty. Why have we become so
paranoid and fearful as a society that we'd even entertain the notion
that civil liberties mean next to nothing in the face of our fear?
[Insert Impact Module(s) and Solvency Card(s)]
We are all born with certain inalienable rights. Now, the notion of free speech, core
to the entire idea of human rights, must be constantly re-examined in the face of a rapidly changing world
where the most important speech increasingly takes place on the Internet.
Free speech isn't merely about the abstract idea of saying whatever you want. It's
the freedom to speak, ask questions, and seek knowledge without
anyone, hidden or otherwise, looking over your shoulder. In the digital age,
free speech is about being able to use the Internet unmolested by the
greedy eyes of corporations with incentive to sell your personal data,
hackers wanting to steal or destroy it, and massive regimes collecting
it all.
Everything a person does online is saved as data. The messages you
send, the websites you look at, the files you save are all data that can
live on forever. The only way to protect your most personal data – your
location, credit card numbers, political thoughts, medical queries, and everything else you do on a phone,
tablet, or computer – from
Modern encryption – at its core, really good mathematics that make it possible to protect your data so that
no computer can decrypt it without your go-ahead – is legally protected by a 1995 American court decision
that declared computer code constitutionally protected free speech. But many of the world's top cops
continue to demonize encryption, saying it will inevitably enable and immunize criminals on massive
scales.
The war over encryption – most notably the so-called crypto wars of the 1990s – saw the
American government try to make strong encryption a military-grade
weapon in the eyes of the law. Opposed chiefly by the Electronic Frontier
Foundation, courts declared computer code to be free speech and said the
government's regulations were unconstitutional.
Despite the landmark legal victory, the war over encryption has
continued to this day.
John J. Escalante, chief of detectives for the Chicago Police Department, has
called encryption mostly a tool of pedophiles – a claim that's
disingenuous and misleading, if not outright dangerous. For one thing, many
city and federal police agents use encryption tools regularly, and
encryption stymied a total of nine police investigations last year. There
are plenty of ways to investigate crimes involving cryptography that
don't involve banning or curtailing it.
There's no denying that these tools have some very ugly users.
However, for the few billion of us who want to keep our digital lives
private from unwanted eavesdroppers and hackers, being forcefully
grouped in with terrorists and pedophiles is a hard insult to stomach.
Encryption works to protect youand everyone elseonline. More than
that, its the best protection you have. There are simply no other
options that can compare.
If, for some reason, you assume a hack will never happen to you, let me give
you some perspective on the current state of digital security. 2014 is known in
information technology circles as the year of the breach because it has
boasted some of the biggest hacks in history. 2013 had a nickname
too: The year of the breach. Come to think of it, 2012 was called something
eerily similar: The year of the breach.
2011? You get the idea.
This isn't merely one year of massive security breaches, it's an era of
profound digital insecurity in which sensitive personal data – the
information that can be put together to add up to a startlingly complete picture of our lives and thoughts –
holds and controls more data about the daily lives and social
interactions of half a billion people than 20th-century totalitarian
governments ever managed to collect about the people they
surveilled.
The Internet's architecture has also made it possible for businesses and
governments to fill giant data vaults with "the ore of human existence –
the appetites, interests, longings, hopes, vanities, and histories of
people surfing the Internet, often unaware that every one of their clicks
is being logged, bundled, sorted, and sold to marketers," the New York Times
journalist Jim Dwyer wrote in his new book, More Awesome Than Money. "Together, they
amount to nothing less than a full psyche scan, unobstructed by law or
social mores."
When people like Comey suggest that law enforcement should have a
"back door" or "golden key" that allows cops to easily access all
encrypted communication, they are willfully ignoring the reality
shouted to them by the vast majority of the information technology industry.
"You can't build a back door that only the good guys can walk
through," cryptographer Bruce Schneier wrote recently. "Encryption protects against
cybercriminals, industrial competitors, the Chinese secret police, and
the FBI. You're either vulnerable to eavesdropping by any of them, or
you're secure from eavesdropping from all of them."
When encryption becomes a campaign issue, that's going to go on the bumper stickers: You either
have real privacy and security for everyone or for no one.
The existing back doors in network switches, mandated under U.S. laws such as
CALEA, have become the go-to weak-spot for cyberwar and industrial
espionage, author Cory Doctorow wrote in the Guardian. It was Google's lawful interception
backdoor that let the Chinese government raid the Gmail account of dissidents. It was the lawful
interception backdoor in Greece's national telephone switches that let someone – identity still unknown –
listen in on the Greek Parliament and prime minister during a sensitive part of the 2005 Olympic bid
(someone did the same thing the next year in Italy).
If, like many Americans, you say you don't mind if the U.S. government
watches what you do online, take a step back and consider the bigger
picture.
The American government is not the only government – never mind
other organizations – watching and hacking people on the Internet.
China, Russia, Iran, Israel, the U.K., and every other nation online
decided long ago that cyberspace is a militarized country. All the
states with the necessary resources are doing vast watching and
hacking as well.
Encryption proved a crucial help to protesters during the Arab Spring. It
helps Iranian liberals push against their oppressive theocracy. From
African free speech activists to Chinese pro-democracy organizers to
American cops investigating organized crime, strong encryption saves
lives, aids law enforcement (ironic, huh?), protects careers, and helps
build a more free and verdant world. Journalists – citizen and professional
alike – depend on encryption to keep communications and sources
private from the people and groups they report on, making it
essential to an independent and free press.
The right to privacy, the right to choose what parts of yourself are exposed to the world, was described
over a century ago by the U.S. Supreme Court and held up as an issue of prime importance last year by
U.N. human rights chief Navi Pillay. It's something we all need to worry about.
system that can only sustain itself by arrogating these powers can
possibly be called just.
In the digital age, encryption is our only guarantee of privacy.
Without it, the ideal of free speech could be lost forever.
Over the course of the last year, a host of cyberattacks have been
perpetrated on a number of high-profile American companies. In
January 2014, Target announced that hackers, using malware, had
digitally impersonated one of the retail giant's contractors, stealing
vast amounts of data – including the names, mailing addresses, phone numbers or email
addresses for up to 70 million individuals and the credit card information of 40 million shoppers.
While the terrestrial fears of terrorism and Ebola have dominated headlines,
American leaders are fretting about what may be even more serious
virtual threats to the nations security.
This year, hundreds of millions of private records have been exposed in an
unprecedented number of cyberattacks on both US businesses and the
federal government.
On Monday, just as President Obama arrived in Beijing to begin a week-long summit with regional leaders,
Chinese hackers are suspected to have breached the computer networks of the US Postal Service, leaving
the personal data of more than 800,000 employees and customers compromised, The Washington Post
reports.
The data breach, which began as far back as January and lasted through mid-August, potentially exposed
500,000 postal employees' most sensitive personal information, including names, dates of birth, and Social
Security numbers, the Postal Service said in a statement Monday. The data of customers who used the
Postal Service's call center from January to August may have also been exposed.
"The FBI is working with the United States Postal Service to determine the nature and scope of this
incident," the federal law enforcement agency said in a statement Monday. Neither the FBI nor the Postal
Service, however, confirmed it was the work of Chinese hackers.
The breach did not expose customer payment or credit card information, the Postal Service said, but
hackers did gain access to its computer networks at least as far back as January. The FBI informed the
Postal Service of the hack in mid-September.
General Patrick Donahoe in a statement. The United States Postal Service is no different. Fortunately, we
have seen no evidence of malicious use of the compromised data and we are taking steps to help our
employees protect against any potential misuse of their data.
Since 2006, cyber-intruders have gained access to the private data of nearly 90 million people in federal
networks, the Associated Press reported in a major investigation published Monday.
Hackers have also accessed 255 million customer records in retail networks during this time, 212 million
customer records in financial and insurance industry servers, as well as 13 million records of those in
educational institutions, the AP reported.
When historians write about this period in U.S. history it could very well be
that one of the themes will be how the United States lost its global
technology leadership to other nations. And clearly one of the factors
they would point to is the long-standing privileging of U.S. national
security interests over U.S. industrial and commercial interests when it
comes to U.S. foreign policy.
This has occurred over the last few years as the U.S. government has
done relatively little to address the rising commercial challenge to U.S.
technology companies, all the while putting intelligence gathering first
and foremost. Indeed, policy decisions by the U.S. intelligence community
have reverberated throughout the global economy. If the U.S.
tech industry is to remain the leader in the global marketplace, then the
U.S. government will need to set a new course that balances
unlock when the Chinese government comes knocking for bad ones. A
backdoor mandate, by contrast, makes life easy for oppressive
regimes by guaranteeing that consumer devices are exploitable by
default – presenting U.S. companies with a presence in those countries with a horrific choice between
enabling repression and endangering their foreign employees.
Case Backlines
secrecy where decryption keys are deleted immediately after use, so that stealing the encryption key
used by a communications server would not compromise earlier or later communications. A related
technique, authenticated encryption, uses the same temporary key to guarantee confidentiality and to
verify that the message has not been forged or tampered with.
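The two properties described above can be sketched in code. What follows is a deliberately simplified "encrypt-then-MAC" construction using only Python's standard library; the hash-based keystream is a teaching device of our own, not a vetted cipher, so treat it as a conceptual illustration of authenticated encryption (real systems use reviewed AEAD schemes such as AES-GCM), not something to deploy.

```python
import hashlib
import hmac
import itertools
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from (key, nonce). Illustrative only."""
    out = b""
    for counter in itertools.count():
        if len(out) >= length:
            break
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
    return out[:length]

def seal(enc_key: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
    """Encrypt, then append an HMAC tag over nonce + ciphertext."""
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(enc_key, nonce, len(plaintext))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def open_sealed(enc_key: bytes, mac_key: bytes, blob: bytes) -> bytes:
    """Verify the tag first; only decrypt if the message is authentic."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("message forged or tampered with")
    return bytes(c ^ k for c, k in zip(ct, _keystream(enc_key, nonce, len(ct))))
```

The point of the tag check is exactly the guarantee the card describes: a recipient learns both that the message is confidential and that it has not been forged or altered in transit, and because each message uses a fresh nonce, compromising one message's material does not unlock the rest.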
services, which tend to use similar technologies and are more likely to have the resources to manage
Engineering Task Force in 2000 emphasized that adding a requirement for wiretapping will make affected
protocol designs considerably more complex. Experience has shown that complexity almost inevitably
jeopardizes the security of communications. More recently, a May 2013 paper from the Center for
Democracy and Technology on the risks of wiretap modifications to endpoints concludes that deployment
of an intercept capability in... communications services, systems and applications poses serious security
risks. The authors add that on balance mandating that endpoint software vendors build intercept
functionality into their products will be much more costly to personal, economic and governmental security
overall than the risks associated with not being able to wiretap all communications.
While NSA
programs such as SIGINT Enabling – much like proposals from domestic law enforcement agencies to
update the Communications Assistance for Law Enforcement Act (CALEA) to require digital wiretapping
capabilities in modern Internet-based communications services – may
Many high-visibility sites, such as Twitter, Google, Reddit, and YouTube, default to SSL/TLS encryption now.
When there were bugs in the libraries that support this type of encryption, the IT world moved heaven and
earth to patch them and eliminate the vulnerability. Security pros were sweating bullets for the hours,
days, and in some cases weeks between the hour Heartbleed was revealed and the hour they could finally
The report, written by UN Special Rapporteur David Kaye, is based on questionnaire responses submitted
by 16 countries, opinions submitted by 30 non-government stakeholders, and statements made at a
meeting of experts in Geneva in March.
The problem with all of those approaches is that they "inject a basic
vulnerability into secure systems," Kaye told the Washington Post. "It results in
Encryption matters because it protects our intellectual privacy – our ability to be protected from surveillance or interference when we
are making sense of the world by thinking, reading and speaking
privately with those we trust. More and more, the acts of reading, thinking,
and private communication are mediated by electronic technologies like
computers, tablets, ebooks and smartphones. Whenever we shop, read, speak, and
think, we now do so using devices that create records of these
activities.
When we are watched, tracked and monitored, we act differently.
There's an increasing body of evidence that internet surveillance stops
us from reading unpopular or controversial ideas. Remember that our most
cherished ideas – that people should control the government, that
heretics should not be burned at the stake and that all people are
equal – were once unpopular and controversial ideas. A free society
should not fear dangerous ideas, and does not need complete
intellectual surveillance. Existing forms of surveillance and policing
are enough.
Encryption and intellectual privacy will of course make it more difficult
for the security services to do their jobs, but so too do our other civil
liberties of free speech, democratic control of the police and military,
and requiring a warrant before the government enters our homes or
reads our mail.
We are more secure when we have hope than when we are filled with
fear and treated like potentially naughty children. Encryption
promotes this kind of political security. It promotes other kinds of
security as well; after all, a back door for the government can also be a
back door for criminal hackers.
reporter who wrote a piece in the Washington Post supporting FBI backdoors (and then later changed his
their own example in concealing correspondence, one can make an even stronger case supporting
The Edward Snowden leaks left much of the world in shock. Even the most
paranoid security freaks were astounded to learn about the scope of
the surveillance apparatus that had been built by the NSA, along with its
allies in the "Five Eyes" countries (the UK, Canada, New Zealand, and Australia).
The nontechnical world was most shocked by the revelation that the NSA was snaffling up such
unthinkable mountains of everyday communications. In some countries, the NSA is actively recording
every single cell-phone conversation, putting millions of indisputably innocent people under surveillance
without even a hint of suspicion.
But in the tech world, the real showstopper was the news that the NSA and
the UK's spy agency, the GCHQ, had been spending $250 million a year on two
programs of network and computer sabotage: BULLRUN, in the USA, and
EDGEHILL, in the UK. Under these programs, technology companies are
bribed, blackmailed, or tricked into introducing deliberate flaws into
their products, so that spies can break into them and violate their
users' privacy. The NSA even sabotaged U.S. government agencies,
such as the National Institute for Standards and Technology (NIST), a rock-ribbed expert body that
produces straightforward engineering standards to make sure that our digital infrastructure doesn't fall
Google
HackerOne
Hackers/Founders
Hewlett-Packard Company
Internet Archive
Internet Association
Internet Infrastructure Coalition (i2Coalition)
Level 3 Communications
LinkedIn
Microsoft
Misk.com
Mozilla
Open Spectrum Inc.
Rackspace
Rapid7
Reform Government Surveillance
Sonic
ServInt
Silent Circle
Slack Technologies, Inc.
Symantec
Tech Assets Inc.
TechNet
Tumblr
Twitter
Wikimedia Foundation
Yahoo
CPs
CISA CP
CISA doesn't solve cybersecurity.
Greenberg 15 – Andy Greenberg, Senior Writer at Wired covering security, privacy, information
freedom, and hacker culture, 2015 ("CISA Security Bill: An F for Security But an A+ for Spying," Wired,
March 20th, Available Online at http://www.wired.com/2015/03/cisa-security-bill-gets-f-security-spying/,
Accessed 07-06-2015)
More False Warnings Than Real Threats
For those who value security over privacy, CISA's surveillance compromises might seem acceptable. But
would waive privacy laws in the name of cybersecurity. In April, the US House of Representatives passed by
strong majorities two similar cyber threat information sharing bills. These bills grant companies immunity
for giving DHS information about network attacks, attackers, and online crimes.
Instead of removing (non-existent) barriers to sharing and undermining American privacy in the process,
Congress should consider how to make sharing worthwhile. I've been told by many entities, corporate
and academic, that they don't share with the government because the government doesn't share back.
Silicon Valley engineers have wondered aloud what value DHS has to offer in their efforts to secure their
employers' services. It's not like DHS is setting a great security example for anyone to follow. OPM's
Inspector General warned the government about security problems that, left unaddressed, led to the OPM
breach.
And there's a very serious trust issue. We recently learned that the NSA is sitting on the domestic Internet
backbone, searching for foreign cyberthreats, helping the FBI and thinking about how to get authority to
scan more widely. You can see the lifecycle now. Vulnerable company reports a threat to DHS, NSA
programs its computers to search for that threat, vulnerable company's proprietary data gets sucked in by
FBI. Any company has to think at least twice about sharing how they are vulnerable with a government
that hoards security vulnerabilities and exploits them to conduct massive surveillance.
To researchers who have spent their careers studying code, the FBI's
belief that it can shut down the development of strong cryptography is
ludicrous. Code, after all, is just math.
"This all requires an idea that people just won't innovate in areas where
the government doesn't like them to," said Cohn. "And that's really never
been the case."
Hall said, "You're basically trying to prevent people from doing certain
kinds of math."
Philip Zimmermann, the inventor of the widely used PGP encryption scheme, neatly summed up the
problem with government encryption limits when he told a 1996 Senate subcommittee hearing, "Trying
to stop this is like trying to legislate the tides and the weather."
The mathematical nature of encryption is both a reassurance that it
cannot be banned and a reminder that it cannot be massaged to fit an
agenda – even agendas ostensibly meant to save lives. Software
engineers simply cannot build backdoors that do what the FBI wants
without serious security vulnerabilities, owing to the fundamental
mathematical nature of cryptography. It is no more possible for
the FBI to design a secure backdoor than it is for the National Weather
Service to stop a hurricane in its tracks.
"No amount of presto change-o is going to change the math," Cohn
said. "Some people say time is on their side; I think that math is on our
side when it comes to crypto."
* Cohn = Cindy Cohn, Executive Director and former Legal Director and General Counsel of the Electronic
Frontier Foundation, holds a J.D. from the University of Michigan Law School; Hall = Joseph Hall, chief
technologist at the Center for Democracy & Technology
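The "code is just math" point in the card above can be made concrete: the heart of a Diffie-Hellman key exchange is a few lines of modular arithmetic, which is why banning it is like banning math. The sketch below is a toy – the 64-bit prime and the generator choice are for demonstration only (deployed systems use 2048-bit or larger groups) – but the algebra it runs is the real thing.

```python
import secrets

# Toy Diffie-Hellman key exchange: the "regulated munition" is just
# modular exponentiation. P = 2**64 - 59 is prime but far too small for
# real security; it is chosen only so the demo runs instantly.
P = 2**64 - 59
G = 5  # illustrative generator choice

def keypair():
    private = secrets.randbelow(P - 2) + 1  # random secret exponent
    public = pow(G, private, P)             # g^a mod p
    return private, public

a_priv, a_pub = keypair()  # Alice
b_priv, b_pub = keypair()  # Bob

# Each side raises the other's public value to its own secret exponent.
# Since (g^b)^a = (g^a)^b (mod p), both derive the same shared secret
# without ever transmitting it.
shared_a = pow(b_pub, a_priv, P)
shared_b = pow(a_pub, b_priv, P)
assert shared_a == shared_b
```

Nothing here depends on special hardware or licensed software – only on arithmetic any computer can do – which is the researchers' point about the futility of trying to suppress it.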
reasons and I've given it quite a bit of thought and I'm working with some companies in this area
too.
countries say great, we want to have a duplicate key too, with Beijing or in Moscow or someplace
else? The companies are not going to have a principled basis to refuse to do that. So that's going
to be a strategic problem for us.
Finally, I guess I have a couple of overarching comments. One is we do not historically
organize our society to make it maximally easy for law
enforcement, even with court orders, to get information. We
often make trade-offs and we make it more difficult. If that were not
the case then why wouldn't the government simply say all of these [takes out phone] have to be
configured so they're constantly recording everything that we say and do and then when you get
a court order it gets turned over and we wind up convicting ourselves. So I don't think socially we
do that.
And I also think that
we fear we are. In the 90s, when encryption first became a big deal, there was a
debate about a Clipper Chip that would be embedded in devices or whatever your
communications equipment was to allow court-ordered interception. Congress and the
President ultimately did not agree to that. And, from talking to people in the community afterwards, you
know what? We collected more than ever. We found ways to deal with that issue.
requiring
people to build a vulnerability may be a strategic mistake.
These are, of course, all the same answers opponents to back doors always offer
(and Chertoff has made some of them before). But Chertoff's answer is notable both
because it is so succinct and because of who he is: a long-time
prosecutor, judge, and both Criminal Division Chief at DOJ and
Secretary of Homeland Security. Through much of that career, Chertoff has
been the close colleague of FBI Director Jim Comey, the guy
pushing back doors now.
So it's a little bit of a long-winded answer. But I think on this one, strategically, we,
It's possible he's saying this now because as a contractor he's being paid to voice the opinions of the tech
Political and law enforcement leaders in the United States and the United Kingdom
have called for Internet systems to be redesigned to ensure
government access to information even encrypted information. They argue
Our
strong recommendation is that anyone proposing regulations should first
present concrete technical requirements, which industry, academics,
and the public can analyze for technical weaknesses and for hidden
costs.
lawful surveillance orders when they meet the requirements of human rights and the rule of law.
Julian Sanchez, a senior fellow at the Cato Institute, was incredulous about Comey's insistence that experts
are wrong: "How does his head not explode from cognitive dissonance when he repeats he has no tech
expertise, then insists everyone who does is wrong?" he tweeted during the hearing.
But no experts were invited to testify, a fact that several intelligence committee
members brought up, demanding a second hearing to hear from them.
Comey got little pushback from the panel, despite his lack of any
formal plan and his denial of science. Sen. Martin Heinrich, D-N.M., thanked him for his
display of humility in not presenting a solution, while Committee Chairman Richard Burr, R-N.C., said, "I
think you deserve a lot of credit for your restraint."
Comey at one point briefly considered the possibility of a world not like the one he imagined, then
concluded: "If that's the case, then I think we're stuck."
And that should be the end of it. If the brightest minds in the world
cannot come up with something the FBI wants in its wildest dreams,
someone has to back down. You can't just push ahead with something
if it's not technically feasible.
But that's not stopping FBI director James Comey, who on Wednesday made his case – for
the millionth time – that encryption will prevent his agency (and others) from finding the bad guys.
In a brief (yet still rambling) opinion piece for Lawfare, the FBI director aimed for "healthy discussion" but
failed to retain his point once he promised he was "not a maniac (or so at least [his] family says)" – which,
by the way, is exactly what a maniac would say.
Comey wrote:
"In universal strong encryption, I see something that is with us already and growing every day
that will inexorably affect my ability to do that job. It may be that, as a people, we decide the
benefits here outweigh the costs and that there is no sensible, technically feasible way to
optimize privacy and safety in this particular context, or that public safety folks will be able to do
their job well enough in the world of universal strong encryption."
At least he managed to avoid using the word "backdoor" in his piece. He redeemed himself when, in
Wednesday's testimony, Comey said that the FBI was "not seeking a backdoor." (For those not watching
C-SPAN, he then proceeded to describe a backdoor.)
At one point during his testimony, the FBI director specifically said, dodging the question by Sen. John
Cornyn (R-TX), that he doesn't want to "scare people by saying I'm certain people will die."
He, and others, are blind to the fact that undermining encryption by
installing backdoors won't prevent crime. In a tweet, security expert and researcher
Matt Blaze, one of the cryptographers who also signed the aforementioned letter, said: " 'Crypto
Instead of offering specifics, FBI officials say they want more dialog with
Congress, the security community, and the public. But that appears to be little more
than a stalling tactic.
the 90s, he would have seen that this was one of the primary reasons similar schemes were almost
response:
It's telling that his remarks echo so closely the arguments of that era. Compare them, for example, with
this comment from former FBI Director Louis Freeh in May of 1995, now nearly twenty years ago:
[W]e're in favor of strong encryption, robust encryption. The country needs it, industry needs it.
We just want to make sure we have a trap door and key under some judge's authority where we
can get there if somebody is planning a crime.
Perhaps Comey's speech is saber rattling. Maybe it's an attempt to persuade the American people that
we've undertaken significant reforms in light of the Snowden revelationsthe U.S. government has not
and that it's time for the "pendulum" to swing back. Or maybe by putting this issue in play, the FBI may
hope to draw our eyes away from, say, its attempt to water down the National Security Letter reform that
Congress is considering. It's difficult to tell.
But if the FBI gets its way and convinces Congress to change the law, or
even if it convinces companies like Apple that make our tools and hold our data to weaken the security
they offer to us, we'll all end up less secure and enjoying less privacy. Or as
the Fourth Amendment puts it: we'll be less "secure in our papers and effects."
For more on EFF's coverage of the "new" Crypto Wars, read the article focusing on the security issues that we
wrote last week in Vice. And going back even earlier, a broader update to a piece we wrote in 2010, which
itself was based on our fights in the 90s. If the FBI wants to try to resurrect this old debate, EFF will be
in strong opposition, just as we were 20 years ago. That's because, just like 20 years ago, the
jump the airgaps to get the keying material together in one place that's
probably also an airgapped facility.
Daniel Weitzner argued that there was simply no way to reconcile a
backdoor's dual requirements of security and accessibility. "If you
physically disperse keys across the country to make them easier for
law enforcement to reach, you add more venues for exploitation," he said.
"If you put one hardware security module in the FBI's heavily guarded
Washington headquarters, you prevent disparate law-enforcement
groups from quickly accessing it to launch real-time monitoring
operations."
"I'm not even sure we're good at doing that, keeping keys like that
technically secure," Green said. "I'm not sure we have any hardware that's
ever been put to that test."
* Hall = Joseph Hall, chief technologist at the Center for Democracy & Technology; Weitzner = Daniel
Weitzner, lecturer in the computer science department at the Massachusetts Institute of Technology; Green
= Matthew Green, assistant research professor at the Johns Hopkins Information Security Institute
Terrorism/Crime DA
but that's the thing: You can't build a "back door" that only the good guys
can walk through. Encryption protects against cybercriminals,
industrial competitors, the Chinese secret police and the FBI. You're
either vulnerable to eavesdropping by any of them, or you're secure
from eavesdropping from all of them.
Back-door access built for the good guys is routinely used by the bad
guys. In 2005, some unknown group surreptitiously used the lawful-intercept capabilities built into the
Greek cell phone system. The same thing happened in Italy in 2006.
In 2010, Chinese hackers subverted an intercept system Google had put into Gmail to comply with U.S.
government surveillance requests. Back doors in our cell phone system are currently being exploited by
the FBI and unknown others.
This doesn't stop the FBI and Justice Department from pumping up
the fear. Attorney General Eric Holder threatened us with kidnappers and
sexual predators.
The former head of the FBI's criminal investigative division went even
further, conjuring up kidnappers who are also sexual predators. And, of
course, terrorists.
FBI Director James Comey claimed that Apple's move "allows people to place themselves beyond the law"
and also invoked that now overworked "child kidnapper." John J. Escalante, chief of detectives for the
Chicago police department, now holds the title of most hysterical: "Apple will become the phone of choice
for the pedophile."
It's all bluster. Of the 3,576 major offenses for which warrants were
granted for communications interception in 2013, exactly one
involved kidnapping. And, more importantly, there's no evidence that
encryption hampers criminal investigations in any serious way. In
2013, encryption foiled the police nine times, up from four in 2012,
and the investigations proceeded in some other way.
This is why the FBI's scare stories tend to wither after public scrutiny.
A former FBI assistant director wrote about a kidnapped man who
would never have been found without the ability of the FBI to decrypt
an iPhone, only to retract the point hours later because it wasn't
true.
We've seen this game before. During the crypto wars of the 1990s, FBI
Director Louis Freeh and others would repeatedly use the example of
mobster John Gotti to illustrate why the ability to tap telephones was so
vital. But the Gotti evidence was collected using a room bug, not a
telephone tap. And those same scary criminal tropes were trotted out
then, too. Back then we called them the Four Horsemen of the
Infocalypse: pedophiles, kidnappers, drug dealers, and terrorists.
Nothing has changed.
Official statistics disprove the link.
Franceschi-Bicchierai 15 Lorenzo Franceschi-Bicchierai, Staff Writer covering hacking,
information security, and digital rights at VICE Motherboard, former writer at Mashable and Danger Room
the Wired blog, holds an M.S. in Journalism from Columbia University, 2015 ("Data Shows Little Evidence
for FBI's Concerns About Criminals 'Going Dark'," Motherboard, July 1st, Available Online at
http://motherboard.vice.com/read/data-shows-little-evidence-for-fbis-concerns-about-criminals-going-dark,
Accessed 07-20-2015)
previous years but only reported in 2014. Of those, the feds were able to crack the communications in
four of the five.)
The FBI did not respond to Motherboard's request for comment.
Yet, other experts warn that the Wiretap Report is only a small window into the world of government
surveillance.
First of all, the FBI has been railing against encryption not just when it's used for communications, but
especially when it's used to safeguard data on the phone or computer. The whole recent debate was
spurred by Apple's announcement that it wouldn't be able to unlock phones for the police anymore, and
that new iPhones would be encrypted by default. Wiretaps aren't used to get that kind of data, but cover
mostly communications.
Moreover, the FBI has said in the past that it doesn't apply for wiretaps when it knows it can't intercept the
targeted communications, according to Albert Gidari, a lawyer at Perkins Coie who has worked with
technology firms on surveillance matters, and Jonathan Mayer, a computer scientist and lawyer at Stanford
University.
"The report is suggestive, but hardly conclusive," Mayer told Motherboard. "Much more telling, in
my view, is that law enforcement and intelligence officials remain unable
to provide episodes where encryption frustrated an investigation."
So far, the FBI has yet to put forth a valid example where encryption
really thwarted an investigation. In fact, some of the examples cited by
Comey have been debunked in media reports.
"This crypto debate continues to be a red herring because we really are uninformed about the facts that
the FBI contends support their position," Gidari said.
The Wiretap Report contains other interesting information that sheds light on government surveillance
practices. Out of the more than 3,554 wiretaps authorized by judges, the vast majority of them (3,409, or
89 percent) were for drug-related offenses. Homicide, in turn, was the reason behind only 4 percent of
the wiretaps. And virtually all of them (96 percent) were for portable devices, such as cellphones.
Even if the Wiretap Report is just a small peek behind the scenes of
government surveillance, it shows that for now, at least when it comes
to wiretapping, the FBI isn't really going dark.
Law enforcement can still get a warrant. No link.
Cohn et al. 14 Cindy Cohn, Executive Director and former Legal Director and General Counsel
of the Electronic Frontier Foundation, holds a J.D. from the University of Michigan Law School, with Jeremy
Gillula, Staff Technologist at the Electronic Frontier Foundation, holds a Ph.D. in Computer Science from
Stanford University, and Seth Schoen, Senior Staff Technologist at the Electronic Frontier Foundation, 2014
("What Default Phone Encryption Really Means For Law Enforcement," Vice News, October 8th, Available
Online at https://news.vice.com/article/what-default-phone-encryption-really-means-for-law-enforcement,
Accessed 07-05-2015)
decade. Encryption can stop or mitigate the damage from crimes like
identity theft and fraud targeted at smartphone users.
"Going dark" makes a good sound bite. But the neat phrase ignores
the fundamental reality that law enforcement and intelligence
agencies have been "going bright." The digital era provides troves
of evidence that never existed before. IBM has estimated that
90% of the data in the world has been created in the last two to
three years, and EMC projects this volume will increase sevenfold
by 2020. This vast flood of information (emails, surveillance and traffic
cameras, transactions, mobile phone location data, and many other
sources) provides law enforcement with digital trails everywhere, all
the time. Expanded use of encryption may obscure a portion of such
data, but that still leaves exabytes of information from which to
seek evidence.
Fears that encryption will cause agencies to go dark are
wrong and explained by loss aversion and the endowment
effect.
Swire and Ahmad 11 Peter Swire, C. William O'Neill Professor of Law at the Moritz College
of Law of the Ohio State University, served as the Chief Counselor for Privacy in the Office of Management
and Budget during the Clinton Administration, holds a J.D. from Yale Law School, and Kenesa Ahmad, Legal
and Policy Associate with the Future of Privacy Forum, holds a J.D. from the Moritz College of Law of the
Ohio State University, 2011 ("'Going Dark' Versus a 'Golden Age for Surveillance'," Center for Democracy
& Technology, November 28th, Available Online at https://cdt.org/blog/%E2%80%98going-dark
%E2%80%99-versus-a-%E2%80%98golden-age-for-surveillance%E2%80%99/, Accessed 06-24-2015)
What explains the agencies' sense of loss when the use of wiretaps has
expanded, encryption has not been an important obstacle, and
agencies have gained new location, contact, and other information? One
answer comes from behavioral economics and psychology, which has drawn
academic attention to concepts such as "loss aversion" and the
"endowment effect." Loss aversion refers to the tendency to prefer
avoiding losses to acquiring gains of similar value. This concept also
helps explain the endowment effect: the theory that people place
higher value on goods they own versus comparable goods they do not
own. Applied to surveillance, the idea is that agencies feel the loss of
one technique more than they feel an equal-sized gain from other
techniques. Whether based on the language of behavioral economics
or simply on common sense, we are familiar with the human tendency
to pocket our gains (assume we deserve the good things that come
our way) but complain about the bad things, even if the good things
are more important.
A simple test can help the reader decide between the "going dark" and
"golden age of surveillance" hypotheses. Suppose the agencies had a
choice of a 1990-era package or a 2011-era package. The first package
would include the wiretap authorities as they existed pre-encryption,
but would lack the new techniques for location tracking, confederate
identification, access to multiple databases, and data mining. The
second package would match current capabilities: some encryption-related
obstacles, but increased use of wiretaps, as well as the
capabilities for location tracking, confederate tracking and data
mining. The second package is clearly superior: the new surveillance
tools assist a vast range of investigations, whereas wiretaps apply only
to a small subset of key investigations. The new tools are used far
more frequently and provide granular data to assist investigators.
Conclusion
Due to changing
technology, there are indeed specific ways that law enforcement and
national security agencies lose specific previous capabilities. These
specific losses, however, are more than offset by massive gains.
Public debates should recognize that we are truly in a golden age of
surveillance. By understanding that, we can reject calls for bad
encryption policy. More generally, we should critically assess a wide range
of proposals, and build a more secure computing and communications
infrastructure.
This post casts new light on government agency claims that we are "going dark."
technological changes over time can confer advantages on both police investigators and criminals seeking
to avoid surveillance, and the law adjusts over time to preserve a balance between the ability of citizens to
protect their privacy and the ability of law enforcement to invade it with sufficiently good reason. As I
trove of data for police even if that trove does not include backdoor
access to physical devices. The ordinary, unsophisticated criminal may be more able to
protect locally stored files than he was a decade ago, but in a thousand other ways, he can expect to be
far more minutely tracked in both his online and offline activities. An encrypted text messaging system
may be worse from the perspective of police than an unencrypted one, but is it really any worse than a
system of pay phones that allow criminals to communicate without leaving any record for police to sift
technology opens a thousand new windows to our government monitors. If we aim to preserve an
equilibrium between government power and citizen privacy, we should accept that it will occasionally
close one as well.
After considering the issue, Orin Kerr rethought his position, looking at this in terms of a technological-legal
trade-off. I think he's right.
Given everything that has made it easier for governments and others
to intrude on our private lives, we need both technological security and
legal restrictions to restore the traditional balance between
government access and our security/privacy. More companies should follow Apple's
lead and make encryption the easy-to-use default. And let's wait for some actual
evidence of harm before we acquiesce to police demands for
reduced security.
"I am a huge believer in the rule of law, but I am also a believer that no one in this country is
above the law," Comey told reporters at FBI headquarters in Washington. "What concerns me
about this is companies marketing something expressly to allow people to place themselves
above the law."
[....]
"There will come a day -- well it comes every day in this business -- when it will matter a great,
great deal to the lives of people of all kinds that we be able to with judicial authorization gain
access to a kidnapper's or a terrorist or a criminal's device. I just want to make sure we have a
good conversation in this country before that day comes. I'd hate to have people look at me and
say, 'Well how come you can't save this kid,' 'how come you can't do this thing.'"
"I get that the post-Snowden world has started an understandable pendulum swing," he said. "What I'm
worried about is, this is an indication to us as a country and as a people that, boy, maybe that pendulum
swung too far."
BS:
In 2010, the US Deputy Secretary of Defense William Lynn wrote: "Although the threat to intellectual
property is less dramatic than the threat to critical national infrastructure, it may be the most significant
their speeds to 60 kph so bank robbers can't get away so fast. But he doesn't understand the comparable
trade-offs in his proposed legislation.
The general idea coming from these camps is that terrorists use encryption to
communicate. Thus, if there are backdoors, then law enforcement can
eavesdrop on those communications. Leaving aside the massive
vulnerabilities that would be introduced on everyone else, it's clear
that the terrorists could very easily modify their communications to
evade those types of encryption or set up alternative communication
methods. We would be creating holes in the protection used for trillions
of transactions, all for naught.
Citizens of a city do not give the police the keys to their houses. We do
not register our bank account passwords with the FBI. We do not
knowingly or specifically allow law enforcement to listen and record our
phone calls and Internet communications (though that hasn't seemed to matter). We
should definitely not crack the foundation of secure Internet
communications with a backdoor that will only be exploited by
criminals or the very terrorists that we're supposedly trying to thwart.
Remember, if the government can lose an enormous cache of
extraordinarily sensitive, deeply personal information on millions of its
own employees, one can only wonder what horrors would be visited
upon us if it somehow succeeded in destroying encryption as well.
Conflation obscures issues. That's what's happening now with FBI Director
Comey's arguments regarding ISIS, Going Dark, and device encryption.
On Wednesday, Ben, quoting the director, discussed how the changes resulting from ISIS mean we ought
to reexamine the whole encryption issue. "Our job is to find needles in a nationwide haystack, needles that
are increasingly invisible to us because of end-to-end encryption," Comey said. "This is the 'going dark'
problem in high definition."
William Lynn here before, but the point he made is directly relevant, and it bears repeating. The Deputy
Secretary of Defense wrote, "the
Comey admits encryption lets people lock stuff away from criminals (and
supports innovation), and admits there are lots of good things about this. He
then introduces costs, without enumerating them. In a paragraph
purportedly explaining how the good things and costs are in
tension, he raises the ISIL threat as well as, as an afterthought, criminal
investigations all over the country.
Without providing any evidence about that tension.
As I have noted, the recent wiretap report raises real questions, at least
about the criminal investigations all over the country, which in fact
are not being thwarted. On that ledger, at least, there is no question: the
good things (AKA, benefits) are huge, especially with the million or so
iPhones that get stolen every year, and the costs are negligible, just
a few wiretaps law enforcement can't break.
I conceded we can't make the same conclusions about FISA orders or
the FBI generally because Comey's agency's record keeping is so bad (which
is consistent with all the rest of its record-keeping). It may well be that we're not able to
access ISIL communications with US recruits because of encryption,
but simply invoking the existence of ISIL using end-to-end encrypted
mobile messaging apps is not evidence (especially because so much
evidence indicates that sloppy end-user behavior makes it possible for
FBI to crack this).
Especially after the FBI's 0-for-40 record of making claims about
terrorists since 9/11.
It may be that the FBI is facing increasing problems tracking ISIL. It
may even be (though I'm skeptical) that those problems would outweigh the
value of making stealing iPhones less useful.
But even as he calls for a real debate, Comey offers not one bit of real
evidence to counter the crappy FBI reporting in the official reports to
suggest this is not more FBI fearmongering.
This argument is baseless fearmongering.
McLaughlin 15 Jenna McLaughlin, Reporter and Blogger covering surveillance and national
security for The Intercept, former national security and foreign policy reporter and editorial fellow at
Mother Jones, 2015 ("FBI and Comey Find New Bogeyman for Anti-Encryption Arguments: ISIS," The
Intercept, July 7th, Available Online at https://firstlook.org/theintercept/2015/07/07/fbi-finds-new-bogeymananti-encryption-arguments-isis/, Accessed 07-20-2015)
After months of citing hypothetical crimes as a reason to give law enforcement a magical key to unlock
who texted details of their violent plots that law enforcement agents werent privy to.
But, as The Intercept reported shortly after, those examples were largely bogus and
had nothing to do with encryption.
Now, in a preview of his appearance Wednesday before the Senate Intelligence Committee, Comey
is playing the ISIS card, saying that it is becoming impossible for the FBI to stop the groups
recruitment and planned attacks. (He uses an alternate acronym, ISIL, for the Islamic State.)
"The current ISIL threat involves ISIL operators in Syria recruiting and tasking dozens of troubled
Americans to kill people, a process that increasingly takes part through mobile messaging apps that are
end-to-end encrypted, communications that may not be intercepted, despite judicial orders under the
Fourth Amendment," Comey wrote on Monday in a blog post on the pro-surveillance website Lawfare.
They also note that law enforcement can thwart encryption in most
cases, and can supplement investigations with traditional methods
not involving surveillance.
"The FBI have been trying to argue that the internet is going dark for several years now, and Congress
has not yet bought into their propositions," Amie Stepanovich, the U.S. policy manager for digital rights at
Access, an international pro-privacy organization, wrote in an email to The Intercept. "Terrorist threats are
harder to substantiate and easier to use as justifications for additional funding," she wrote.
But FBI surveillance didn't stop Elton Simpson; the Garland Police
Department did. The local police never got the FBI's email, and if they
had, Garland's Police Chief Bates told NPR, the response would not
have been any different: "Please note that the contents of that
email would not have prevented the shooting nor would it have
changed the law enforcement response in any fashion."
The theory is that companies have every incentive for market reasons to protect consumer
privacy, but no incentives at all to figure out how to provide law enforcement access in the
context of doing so.
There's some truth to this theory. Tech companies are particularly wary of appearing to be complicit in
government surveillance programs as a couple of years of leaks have done considerable damage to their
prospects in foreign markets.
Wittes suggests the government isn't doing much to sell this broken encryption plan, despite Comey's
multiple statements on the dangers posed by encrypted communications. And he's right. If the
government truly wants a "fix," it needs to start laying the groundwork. It can't just be various intel/law
enforcement heads stating "we're not really tech guys" and suggesting tech companies put the time and
effort into solving their problems for them.
If we beginas the computer scientists dowith a posture of great skepticism as to the
plausibility of any scheme and we place the burden of persuasion on Comey, law enforcement,
and the intelligence community to demonstrate the viability of any system, the obvious course is
government-sponsored research. What we need here is not a Clipper Chip-type initiative, in which
the government would develop and produce a complete system, but a set of intellectual and
technical answers to the challenges the technologists have posed. The goal here should be an
elaborated concept paper laying out how a secure extraordinary access system would work in
sufficient detail that it can be evaluated, critiqued, and vetted; think of the bitcoin paper here as a
model. Only after a period of public vetting, discussion, and refinement would the process turn to
the question of what sorts of companies we might ask to implement such a system and by what
legal means we might ask.
Thus ends the intelligent suggestions in Wittes' thinkpiece. Everything else is exactly the sort of thing
Comey keeps hinting at, but seems unwilling to actually put in motion. It's the government-power elephant
in the room. Actually, several elephants. It's the underlying, unvocalized threat that lies just below the
surface of Comey's government-slanted PR efforts.
them.
The advantage to this approach is that it potentially lets a thousand flowers bloom. Each company
might do it differently. They would compete to provide the most security consistent with the
performance standard. They could learn from each other. And government would not be in the
position of developing and promoting specific algorithms. It wouldn't even need to know how the
task was being done.
government research exploring the feasibility of the proposed encryption bypass with one of his worst
ideas:
If you simply require the latter [law enforcement access] as a matter of law, [tech companies] will
devote resources to the question of how to do so while still providing consumer security. And
while the problems are hard, they will prove manageable once the tech giants decide to work
them hardrather than protesting their impossibility.
There's not a worse idea out there than making certain forms of
encryption illegal to use in the United States. But Wittes tries his hardest to
find equally awful ideas. Like this one, which would open tech companies to an entire new
area of liability.
Another, perhaps softer, possibility is to rely on the possibility of civil liability to incentivize
companies to focus on these issues. At the Senate Judiciary Committee hearing this past week,
the always interesting Senator Sheldon Whitehouse posed a question to Deputy Attorney General
Sally Yates about which I've been thinking as well: "A girl goes missing. A neighbor reports that
they saw her being taken into a van out in front of the house. The police are called. They come to
the home. The parents are frantic. The girl's phone is still at home." The phone, however, is
encrypted.
[W]e have an end-to-end encryption issue, in significant part, because companies are trying to
assure customers worldwide that they have their backs privacy-wise and are not simply tools of
NSA. I think those politics are likely to change. If Comey is right and we start seeing law
enforcement and intelligence agencies blind in investigating and preventing horrible crimes and
significant threats, the pressure on the companies is going to shift. And it may shift fast and hard.
Whereas the companies now feel intense pressure to assure customers that their data is safe
from NSA, the kidnapped kid with the encrypted iPhone is going to generate a very different sort
of political response. In extraordinary circumstances, extraordinary access may well seem
reasonable.
If this does happen, Wittes' assumption will likely be correct. Politicians have never been shy about
capitalizing on tragedies to nudge the government power needle. This will be no different. One wonders
why no one has come forward with a significantly compelling tragedy by this point, considering the wealth
of encryption options currently on the market. A logical person would assume this lack of compelling
anecdotal evidence would suggest encryption really hasn't posed a problem yet -- especially considering
the highly-motivated sales pitches that have been offered nonstop since Google and Apple's
before even considering whether a government-only backdoor is possible and cost-effective, it seems to me that Wittes's
analysis is flawed.
The problem lies in the limits of his analogy.
In an ungoverned territory like Somalia, bad actors can take violent
physical actions with impunity: say, seizing a cargo ship, killing the captain, and taking
hostages. If authorities were similarly helpless on America's streets (if gangs
could rob or murder pedestrians as they pleased, and police couldn't see or do a thing) that would,
indeed, be dystopian. But when communications are encrypted, the
"ungoverned territory" does not encompass actions, violent or otherwise, just
thoughts and their expression.
No harm is done within the encrypted space.
To be sure, plots planned inside that space can do terrible damage in the
real world, but so can plots hatched by gang members on public
streets whispering into one another's ears, or Tony Soprano out on his
boat, having swept it for FBI bugs.
Even at this conceptual level,
To be clear, I don't mean to assert that backdoor access to digital communications is just like equivalent
There's just one problem: Hosko was wrong. In the case he cited, the
police had not used information gleaned from a seized smartphone.
Instead, they used wiretaps and telephone calling records, methods
that would have been unaffected by Apple's new encryption feature.
The Washington Post was forced to issue a correction.
Indeed, while law enforcement groups love to complain about ways that
encryption and other technologies have made their jobs harder,
technology has also provided the police with vast new troves of
information to draw upon in their investigations. With the assistance of cell phone providers, law
enforcement can obtain detailed records of a suspect's every move. And consumers increasingly use
cloud-computing services that store emails, photographs, and other private information on servers where
they can be sought by investigators.
It's a good thing Najibullah Zazi didn't have access to a modern iPhone or Android device a few
years ago when he plotted to blow up New York City subway stations. He was caught because his
email was tapped by intelligence agencies, a practice that Silicon Valley firms recently decided
the U.S. government is no longer permitted.
Apple, Google, Facebook and others are playing with fire, or in the case of Zazi with a plot to blow
up subway stations under Grand Central and Times Square on Sept. 11, 2009. An Afghanistan
native living in the U.S., Zazi became a suspect when he used his unencrypted Yahoo email
account to double-check with his al Qaeda handler in Pakistan about the precise chemical mix to
complete his bombs. Zazi and his collaborators, identified through phone records, were arrested
shortly after he sent an email announcing the imminent attacks: The marriage is ready.
The Zazi example (he pleaded guilty to conspiracy charges and awaits sentencing) highlights the
risks that Silicon Valley firms are taking with their reputations by making it impossible for
intelligence agencies or law enforcement to gain access to these communications.
5th Amendment concerns with that latter route, it's still not "impossible." And it's not about
communications.
Crovitz's argument doesn't make sense. The FBI still has many
investigative tools.
Yakowicz 15 Will Yakowicz, Staff Writer for Inc. magazine, holds a B.A. in Journalism and English
Literature from New York University, 2015 ("What the Government Gets Wrong About Cybersecurity," Inc.,
July 7th, Available Online at http://www.inc.com/will-yakowicz/is-data-encryption-really-the-enemy-fbiclaims.html, Accessed 07-20-2015)
A Senate Judiciary Committee hearing on Wednesday is set to put a spotlight on the fine line tech
companies must walk between keeping users' data private and making it available to the government in
matters of national security.
and others sent a letter to President Obama urging him to shoot down proposals that would force them to
enfeeble the security of their products to help law enforcement access encrypted data more easily.
On the other side of the debate, FBI director James Comey has argued
against encryption, stating that if his agency cannot intercept internet communications, it will not
be able to save citizens from terror attacks. During a speech, Comey said, "encryption threatens to lead all
of us to a very dark place."
The column contends that tech companies have made consumer products too secure with end-to-end
encryption. Crovitz says when the FBI asked tech companies to find a way to balance privacy encryption
and court-ordered legal searches, the technologists said it was impossible.
"Terror attacks are increasingly planned online, outside the reach of intelligence and law enforcement,"
Crovitz writes. "Once a recruit is identified, ISIS tells him to switch to an encrypted smartphone. Legal
wiretaps are useless because the signal is indecipherable. Even when the devices are lawfully seized
through court orders, intelligence and law-enforcement agencies are unable to retrieve data from them."
noticed that none of the homes had curtains or blinds in the windows. When he asked why, his guide told
him that if you cover your windows you have something to hide.
"In North Korea, you have to keep your windows wide open so people
can look inside," Gorodyansky says. "Is that the society we want to create
here?"
So how can law enforcement track down criminals who don't leave a trail of muddy footprints?
Then Crovitz shifts to his own personal worldview -- insisting that the public actually doesn't want privacy
or protection from the snooping eyes of government. He claims that, in truth, the public really wants to be
spied on.
It looks like Silicon Valley has misread public opinion. The initial media frenzy caused by the
Edward Snowden leaks has been replaced by recognition that the National Security Agency is
among the most lawyered agencies in the government. Contrary to initial media reports, the NSA
does not listen willy-nilly to phone and email communications.
Last week, the Senate killed a bill once considered a sure thing. The bill would have created new
barriers to the NSA obtaining phone metadata to connect the dots to identify terrorists and
prevent their attacks. Phone companies, not the NSA, would have retained these records. There
would have been greater risks of leaks of individual records. An unconstitutional privacy advocate
would have been inserted into Foreign Intelligence Surveillance Court proceedings.
First off, no, the USA Freedom Act was never "a sure thing." From the very beginning, it was considered a
massive long shot. And, no, it would not have "created new barriers" -- it would merely have made it clear
that the NSA can't simply collect everyone's data in the hopes of magically sifting through the haystack
and finding connections. Also, Crovitz is flat-out wrong (again!) that this would have led to a "greater risk"
because the phone companies held the data. While this was the key talking point among those who voted
against it, it's simply incorrect. The telcos already retain that information. The bill made no changes to
what information telcos could and would retain. It only said that they shouldn't also have to ship all that
data to the NSA as well. There was no increased risk. Saying so is -- once again -- trumpeting Crovitz's
ignorance.
Furthermore, the idea that the public is miraculously comfortable with the government spying on them...
based on the government voting against curtailing government surveillance is simply ludicrous. It doesn't
even pass a basic laugh test. The Pew Research poll that tracks this issue most closely continues to show
that the vast majority of people are against NSA surveillance on American data, and the numbers who feel
that way have been growing consistently since the first of the Snowden revelations.
But let me repeat the assertion Crovitz made here, just to remind everyone of how idiotic it is: he's saying
that the public is now comfortable with surveillance because Congress voted down surveillance reform.
And he thinks this is obvious.
The lesson of the Snowden accusations is that citizens in a democracy make reasonable trade-offs
between privacy and security once they have all the facts. As people realized that the rules-bound
NSA poses little to no risk to their privacy, there was no reason to hamstring its operations.
Likewise, law-abiding people know that there is little to no risk to their privacy when
communications companies comply with U.S. court orders.
Facts, huh? It's kind of funny that he'd argue for the facts when he seems to be lacking in many of them.
And he's wrong. There is tremendous risk to privacy, as illustrated by the fact that the NSA regularly
abused its powers to spy on Americans. Furthermore, he ignores (or is ignorant of the fact) that much of
the data the NSA collects is also freely available to the CIA and FBI -- and that the FBI taps into it so often
that it doesn't even track how many times it dips into the database.
And of course, none of this even bothers to point out that the reason why Google and Apple are increasing
encryption is because it makes us all much safer from actual everyday threats -- including the very threats
that the NSA and others in law enforcement keep warning us about. Making us all safer is a good thing,
though, not to L. Gordon Crovitz, apparently.
In a much-discussed
editorial that ran Friday, The Washington Post sided with law enforcement.
Bizarrely, the Post acknowledges backdoors are a bad idea ("a back door can
and will be exploited by bad guys, too") and then proposes one in the very next
sentence: Apple and Google, the paper says, should invent a secure "golden key" that would let police
unlock a device when a court has approved a search warrant.
And yet, the argument for encryption backdoors has risen like the undead.
The paper doesn't explain why this "golden key" would be less
vulnerable to abuse than any other backdoor. Maybe it's the name, which
seems a product of the same branding workshop that led the Chinese government to name its Internet
censorship system the "golden shield." What's not to like? Everyone loves gold!
Implicit in the Post's argument is the notion that the existence of the
search warrant as a legal instrument obliges Americans to make their
data accessible: that weakening your crypto is a civic responsibility
akin to jury duty or paying taxes. "Smartphone users must accept that they cannot be
above the law if there is a valid search warrant," writes the Post.
This talking point, adapted from Comey's press conference, is an insult to anyone
savvy enough to use encryption. Both Windows and OS X already support strong full-disk
crypto, and using it is a de facto regulatory requirement for anyone handling sensitive consumer or corporate data.
There's hope that by the time the Washington Post's editorial board takes a
third crack at the encryption whip, it might say something worthwhile.
Late on Saturday, The Washington Post's editorial board published what initially read as a scathing,
anti-encryption, pro-government opinion piece that scolded Apple and Google (albeit a
somewhat incorrect assertion) for providing "end-to-end encryption" (again, an incorrect assertion) on their
devices, locking out federal authorities investigating serious crimes and terrorism.
Critically, what the Post gets out of this editorial remains widely unknown, perhaps with the exception of
riling up members of the security community. It's not as though the company is particularly invested in
either side. Aside from the inaccuracies in the board's opinion, and the fair (and accurate) accusation that the
article said "nothing" (one assumes that means nothing of "worth" or "value"), it's hypocritical to make
more than one statement on this matter while at the same time becoming the first major news outlet to
start encrypting its entire website.
The board's follow-up sub-600-word note did not offer anything new, but reaffirmed
its desire to see both tech companies and law enforcement "reconcile the competing imperatives" for
privacy and data access, respectively. (It's worth noting the board's opinion does not represent every
journalist or reporter working at the national daily, but it does reflect the institution's views on the whole.)
The Post's own decision to roll out encryption across its site seems
bizarre considering the editorial board's conflicting views on the
matter.
Such head-scratching naivety prompted one security expert to ask
anyone who covers security at the Post to "explain reality" to the
board. Because, clearly, the board isn't doing its job well if on two
separate occasions it's fluffed up reporting on a subject with zero
technical insight.
If the board, however, needs help navigating the topic, there is no doubt a long line of security
experts, academics, and researchers lining up around the block, ready to assist. At least then there's hope
the board can be third-time lucky in covering the topic.
The Washington Post editorial board found Comey's diatribes super effective! It published a post calling for some sort of law-enforcement-only, magical hole in Apple and Google's encryption.
How to resolve this? A police back door for all smartphones is undesirable: a back door can
and will be exploited by bad guys, too. However, with all their wizardry, perhaps Apple and Google
could invent a kind of secure golden key they would retain and use only when a court has
approved a search warrant. Ultimately, Congress could act and force the issue, but we'd rather
see it resolved in law enforcement collaboration with the manufacturers and in a way that
protects all three of the forces at work: technology, privacy and rule of law.
When is a "back door" not a "back door?" Well, apparently when an editorial
board spells it G-O-L-D-E-N K-E-Y. It's the same thing, but in this particular
pitch, it magically isn't, because good intentions. Or something.
Months later, the debate is still raging. But it's boiled down to two
arguments:
1. This is impossible. You can't create a "law enforcement only"
backdoor in encryption. It's simply not possible because a backdoor is
a backdoor and can be used by anyone who can locate the door
handle.
2. No, it isn't. Please see below for citations and references:
[the article includes several paragraph breaks to indicate that there are, in fact, no citations or references]
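Point 1 is worth making concrete. Below is a toy sketch (not real cryptography -- the XOR scheme, function names, and the GOLDEN_KEY value are invented purely for illustration) of why an escrowed "golden key" is just a backdoor: decryption depends only on possession of the key, so anyone who obtains a copy can open exactly the same door as the police.

```python
# Toy illustration (NOT real cryptography): under any key-escrow scheme,
# the math cannot distinguish police holding the escrowed key from a
# criminal holding a stolen copy of it.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream from the key (toy construction).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is symmetric: the same operation reverses it

GOLDEN_KEY = b"escrowed-master-key"  # hypothetical escrowed key

ciphertext = encrypt(GOLDEN_KEY, b"meet at noon")

# The "door handle" works identically for anyone who has the key:
police_read = decrypt(GOLDEN_KEY, ciphertext)   # authorized access
thief_read = decrypt(GOLDEN_KEY, ciphertext)    # stolen copy of the key
assert police_read == thief_read == b"meet at noon"
```

The sketch shows the structural point, not a real attack: nothing in the decryption step checks a warrant, so the security of every device collapses to the secrecy of one escrowed key.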
The editorial finally wraps up by calling for experts in the field to resolve this issue:
This conflict should not be left unattended. Nineteen years ago, the National Academy of Sciences
studied the encryption issue; technology has evolved rapidly since then. It would be wise to ask
the academy to undertake a new study, with special focus on technical matters, and
recommendations on how to reconcile the competing imperatives.
The WaPo editorial board is no better than James Comey. It can cite
nothing in support of its view, yet still believes it's right. And just
like Comey, the board is being wholly disingenuous in its "deferral" to
security researchers and tech companies. It, like Comey, wants to hold
two contradictory views.
1. Tech/security researchers are dumb when they say this problem
can't be solved.
2. Tech/security researchers are super-smart and can solve this
problem.
So, they (the board and Comey) want to ignore the "smart guys" when they say
this is impossible, but both are willing to listen if they like the answers
they're hearing.
Politics DA
No Link Obama
Either Obama won't spend political capital on encryption or his
new position will quickly end the debate.
Geller 15 Eric Geller, Deputy Morning Editor at The Daily Dot ("the hometown newspaper of the
Internet"), 2015 ("The rise of the new Crypto War," The Daily Dot, July 10th, Available Online at
http://www.dailydot.com/politics/encryption-crypto-war-james-comey-fbi-privacy/, Accessed 07-20-2015)
Divided government
As Comey, Rogers, and other national-security officials campaign for backdoors, one important
voice has largely stayed out of the fray: the president's.
Obama's interview with Swisher marked a rare entrance for the president
into the backdoor debate, which has pitted his law-enforcement
professionals against the civil libertarians who were encouraged by his
historic 2008 election and disappointed by his subsequent embrace of the
surveillance status quo.
If the president felt strongly enough about strong encryption, he could
swat down FBI and NSA backdoor requests any time he wanted. White
House advisers and outside experts have offered him plenty of policy
cover for doing so. The President's Review Group on Intelligence and Communications
Technologies, which Obama convened in the wake of the Snowden disclosures, explicitly discouraged
backdoors in its final report.
The review group recommended "fully supporting and not undermining efforts to create encryption
standards," ... "making clear that [the government] will not in any way subvert, undermine, weaken, or
make vulnerable generally available commercial encryption," and supporting efforts to encourage "the
greater use of encryption technology for data in transit, at rest, in the cloud, and in storage."
The report also warned of serious economic repercussions for American businesses resulting from a
growing distrust of their capacity to guarantee the privacy of their international users. It was a general
warning about the use of electronic surveillance, but it nevertheless applies to the potential fallout from a
backdoor mandate.