Police, FBI, NSA, DOJ... UK Freak Out Over Encryption
#1
Last week, we noted that it was good news to see both Apple and Google highlight plans to encrypt certain phone information by default on new versions of their mobile operating systems, making that information no longer obtainable by those companies and, by extension, governments and law enforcement showing up with warrants and court orders. Having giant tech companies competing on how well they protect your privacy? That's new... and awesome. Except, of course, if you're law enforcement. In those cases, these announcements are apparently cause for a general freakout about how we're all going to die. From the Wall Street Journal:
Quote: One Justice Department official said that if the new systems work as advertised, they will make it harder, if not impossible, to solve some cases. Another said the companies have promised customers "the equivalent of a house that can't be searched, or a car trunk that could never be opened.''

Andrew Weissmann, a former Federal Bureau of Investigation general counsel, called Apple's announcement outrageous, because even a judge's decision that there is probable cause to suspect a crime has been committed won't get Apple to help retrieve potential evidence. Apple is "announcing to criminals, 'use this,' " he said. "You could have people who are defrauded, threatened, or even at the extreme, terrorists using it.''

The level of privacy described by Apple and Google is "wonderful until it's your kid who is kidnapped and being abused, and because of the technology, we can't get to them,'' said Ronald Hosko, who left the FBI earlier this year as the head of its criminal-investigations division. "Who's going to get lost because of this, and we're not going to crack the case?"
That Hosko guy apparently gets around. Here he is freaking out in the Washington Post as well:
Quote: Ronald T. Hosko, the former head of the FBI’s criminal investigative division, called the move by Apple “problematic,” saying it will contribute to the steady decrease of law enforcement’s ability to collect key evidence — to solve crimes and prevent them. The agency long has publicly worried about the “going dark” problem, in which the rising use of encryption across a range of services has undermined government’s ability to conduct surveillance, even when it is legally authorized.

“Our ability to act on data that does exist . . . is critical to our success,” Hosko said. He suggested that it would take a major event, such as a terrorist attack, to cause the pendulum to swing back toward giving authorities access to a broad range of digital information.
Think of the children! And the children killed by terrorists! And just be afraid! Of course, this is the usual refrain any time there's more privacy added to products, or when laws are changed to better protect privacy. And it's almost always bogus. I'm reminded of all the fretting and worries by law enforcement types about how "free WiFi" and Tor would mean that criminals could get away with all sorts of stuff. Except, as we've seen, good old-fashioned police/detective work can still let them track down criminals. The information on the phone is not the only evidence, and criminals almost always leave other trails of information.

No one has any proactive obligation to make life easier for law enforcement.

Orin Kerr, who regularly writes on privacy, technology and "cybercrime" issues, announced that he was troubled by this move, though he later downgraded his concerns to "more information needed." His initial argument was that since the only thing these moves appeared to do was keep out law enforcement, he couldn't see how it was helpful:
Quote: If I understand how it works, the only time the new design matters is when the government has a search warrant, signed by a judge, based on a finding of probable cause. Under the old operating system, Apple could execute a lawful warrant and give law enforcement the data on the phone. Under the new operating system, that warrant is a nullity. It’s just a nice piece of paper with a judge’s signature. Because Apple demands a warrant to decrypt a phone when it is capable of doing so, the only time Apple’s inability to do that makes a difference is when the government has a valid warrant. The policy switch doesn’t stop hackers, trespassers, or rogue agents. It only stops lawful investigations with lawful warrants.

Apple’s design change is one it is legally authorized to make, to be clear. Apple can’t intentionally obstruct justice in a specific case, but it is generally up to Apple to design its operating system as it pleases. So it’s lawful on Apple’s part. But here’s the question to consider: How is the public interest served by a policy that only thwarts lawful search warrants?
His "downgraded" concern comes after many people pointed out that by leaving backdoors in its technology, Apple (and others) are also leaving open security vulnerabilities for others to exploit. He says he was under the impression that the backdoors required physical access to the phones in question, but if there were remote capabilities, perhaps Apple's move is more reasonable.

Perhaps the best response (which covers everything I was going to say before I spotted this) comes from Mark Draughn, who details "the dangerous thinking" by those like Kerr who are concerned about this. He covers the issue above about how any vulnerability left by Apple or Google is a vulnerability open to being exploited, but then makes a further (and more important) point: this isn't about them, it's about us and protecting our privacy:
Quote: You know what? I don’t give a damn what Apple thinks. Or their general counsel. The data stored on my phone isn’t encrypted because Apple wants it encrypted. It’s encrypted because I want it encrypted. I chose this phone, and I chose to use an operating system that encrypts my data. The reason Apple can’t decrypt my data is because I installed an operating system that doesn’t allow them to.

I’m writing this post on a couple of my computers that run versions of Microsoft Windows. Unsurprisingly, Apple can’t decrypt the data on these computers either. That this operating system software is from Microsoft rather than Apple is beside the point. The fact is that Apple can’t decrypt the data on these computers because I’ve chosen to use software that doesn’t allow them to. The same would be true if I was posting from my iPhone. That Apple wrote the software doesn’t change my decision to encrypt.
Furthermore, he notes that nothing Apple and Google are doing now on phones is any different than tons of software for desktop/laptop computers:
Quote:
I’ve been using the encryption features in Microsoft Windows for years, and Microsoft makes it very clear that if I lose the pass code for my data, not even Microsoft can recover it. I created the encryption key, which is only stored on my computer, and I created the password that protects the key, which is only stored in my brain. Anyone that needs data on my computer has to go through me. (Actually, the practical implementation of this system has a few cracks, so it’s not quite that secure, but I don’t think that affects my argument. Neither does the possibility that the NSA has secretly compromised the algorithm.)

Microsoft is not the only player in Windows encryption. Symantec offers various encryption products, and there are off-brand tools like DiskCryptor and TrueCrypt (if it ever really comes back to life). You could also switch to Linux, which has several distributions that include whole-disk encryption. You can also find software to encrypt individual documents and databases.
In short, he points out, the choice to encrypt our data is ours to make. Apple or Google offering us yet another set of tools to do that sort of encryption is them offering a service that many users value. And shouldn't that be the primary reason they're doing it, rather than catering to the desires of FUD-spewing law enforcement folks?

Originally Published: Tue, 23 Sep 2014 18:18:00 GMT
source
#2
i think it's amusing... we see stuff like this:

Oklahoma Sheriff Accused Of Keeping Extensive Database On Citizens

... yet the pigs still wonder why we want to keep them out of our fucking business.
#3
When I use a public restroom I close the door. Not because I'm doing anything illegal, immoral, or even remotely uncommon; simply because I feel more comfortable with some privacy.

If someone removed the doors, or installed cameras, or did anything to record my activities (say, because children are sometimes molested in public restrooms and "if I'm not doing anything wrong I have nothing to fear"), I would take countermeasures. Not because I'm doing anything illegal, immoral or even remotely uncommon, simply because "if I'm not doing anything wrong" it's nobody's fucking business what I'm doing and I'd feel uncomfortable being filmed.

People close doors (or encrypt their phones) because other people overstep the boundaries of acceptable behaviour.

It is not evil, it is defence against evil.
#4
(Sep 24, 2014, 03:08 am)NIK Wrote: If someone removed the doors, or installed cameras, or did anything to record my activities (say, because children are sometimes molested in public restrooms and "if I'm not doing anything wrong I have nothing to fear") I would take countermeasures.

[Image: ZG4ji8R.jpg]
#5
Yesterday, we wrote about law enforcement freaking out over the announcements from both Apple and Google that they'd start encrypting phones by default, better protecting data on those phones from anyone who wants it -- whether government/law enforcement or hackers. We noted, oddly, that former FBI guy Ronald Hosko had shown up in articles in both the Washington Post and the WSJ spewing a bunch of FUD about it. In the WSJ:
Quote: The level of privacy described by Apple and Google is "wonderful until it's your kid who is kidnapped and being abused, and because of the technology, we can't get to them,'' said Ronald Hosko, who left the FBI earlier this year as the head of its criminal-investigations division. "Who's going to get lost because of this, and we're not going to crack the case?"
In the Washington Post:
Quote: Ronald T. Hosko, the former head of the FBI’s criminal investigative division, called the move by Apple “problematic,” saying it will contribute to the steady decrease of law enforcement’s ability to collect key evidence — to solve crimes and prevent them. The agency long has publicly worried about the “going dark” problem, in which the rising use of encryption across a range of services has undermined government’s ability to conduct surveillance, even when it is legally authorized.

“Our ability to act on data that does exist . . . is critical to our success,” Hosko said. He suggested that it would take a major event, such as a terrorist attack, to cause the pendulum to swing back toward giving authorities access to a broad range of digital information.
This is just blatant fear mongering, and not even close to realistic. But the Washington Post doubled down and let Hosko write an entire (and entirely bogus) story about how he helped save a kidnapped man from murder earlier this year and "with Apple's and Google's new encryption rules, he would have died." He accurately writes about a kidnapping in North Carolina, and how law enforcement tracked down the perpetrators, including by requesting and getting "the legal authority to intercept phone calls and text messages." Of course, here's the thing: nothing in this new encryption changes that. Transmitted content is unrelated to the encryption of stored content on the phones. It's the stored content that is being encrypted. It's kind of scary that a supposed "expert" like Hosko doesn't seem to comprehend the difference.

Either way, he insists that the encryption would have prevented this (it wouldn't). His story originally said:
Quote: Last week, Apple and Android announced that their new operating systems will be encrypted by default. That means the companies won’t be able to unlock phones and iPads to reveal the photos, e-mails and recordings stored within.

It also means law enforcement officials won’t be able to look at the range of data stored on the device, even with a court-approved warrant. Had this technology been used by the conspirators in our case, our victim would be dead. The perpetrators would likely be freely plotting their next revenge attack.
After some people pointed out how very, very, very wrong this is, Hosko or the Washington Post "updated" the story, but it still makes the same basic claims:
Quote: Last week, Apple and Google announced that their new operating systems will be encrypted by default. Encrypting a phone doesn’t make it any harder to tap, or “lawfully intercept” calls. But it does limit law enforcement’s access to data, contacts, photos and email stored on the phone itself.

Had this technology been in place, we wouldn’t have been able to quickly identify which phone lines to tap. That delay would have cost us our victim his life. The perpetrators would likely be freely plotting their next revenge attack.
Except, even the update is not true. As the AP's Ted Bridis notes, the affidavit in the case shows that the FBI used phone toll records and wiretaps to figure out the case, and didn't get access to any phones "until after [the] victim [was] safe."

In other words, Hosko's story is pure FUD. The new moves by these companies would not have meant the guy died. It wouldn't have impacted the story at all.

Meanwhile, as a massive post by Julian Sanchez notes, phone encryption products have been on the market for a while, and if encryption were such a big problem we'd already know about it; so far, it's been pretty limited. In the entire US in 2013, there were nine cases where police claimed that encryption stymied their investigations. Furthermore, in the vast majority of cases where they came up against encryption, they were still able to crack it. So... the impact here is minimal.

But that apparently won't stop lies from the likes of Ronald Hosko.

Update: And... it appears that the Washington Post edited the story yet again, this time making it accurate, which also disproves the entire point of the story. Now the basic story is "we saved this guy... and mobile encryption would have done nothing to stop it, but it's a bad bad thing anyway." If Hosko couldn't get the very basics right, how could he be considered a credible voice on this issue?

Originally Published: Wed, 24 Sep 2014 14:58:12 GMT
source
#6
We already wrote about how law enforcement was freaking out over the (good) news that Apple and Google were making encryption a default on both iOS and Android. Then we had a followup where a recently retired FBI guy insisted that such encryption would have meant a kidnap victim died... until everyone pointed out that the entire premise of that story was wrong and the Washington Post had to change the entire thing. We had hoped that maybe, just maybe, the misguided whining and complaining wouldn't come from those in charge, but apparently that's not happening.

On Thursday, FBI boss James Comey displayed not only a weak understanding of privacy and encryption, but also what the phrase "above the law" means, in slamming Apple and Google for making encryption a default:
Quote: "I am a huge believer in the rule of law, but I am also a believer that no one in this country is above the law," Comey told reporters at FBI headquarters in Washington. "What concerns me about this is companies marketing something expressly to allow people to place themselves above the law."

[....]

"There will come a day -- well it comes every day in this business -- when it will matter a great, great deal to the lives of people of all kinds that we be able to with judicial authorization gain access to a kidnapper's or a terrorist or a criminal's device. I just want to make sure we have a good conversation in this country before that day comes. I'd hate to have people look at me and say, 'Well how come you can't save this kid,' 'how come you can't do this thing.'"
First of all, nothing in what either Apple or Google is doing puts anyone "above the law." It just says that those companies are better protecting the privacy of their users. There are lots of things that make law enforcement's job harder that also better protect everyone's privacy. That includes walls. If only there were no walls, it would be much easier to spot crimes being committed. And I'm sure some crimes happen behind walls that make it difficult for the FBI to track down what happened. But we don't see James Comey claiming that homebuilders are allowing people to be "above the law" by building houses with walls.
Quote: "I get that the post-Snowden world has started an understandable pendulum swing," he said. "What I'm worried about is, this is an indication to us as a country and as a people that, boy, maybe that pendulum swung too far."
Wait, what? The "pendulum" hasn't swung at all. To date, there has been no legal change in the surveillance laws post-Snowden. The pendulum is just as far over towards the extreme surveillance state as it has been since Snowden first came on the scene. This isn't the pendulum "swinging too far." It's not even the pendulum swinging. This is just Apple and Google making a tiny shift to better protect privacy.

As Christopher Soghoian points out, why isn't Comey screaming about the manufacturers of paper shredders, which similarly allow their customers to hide papers from "lawful surveillance?"

But, of course, the freaking out continues. Over in the Washington Post, there's this bit of insanity:
Quote: “Apple will become the phone of choice for the pedophile,” said John J. Escalante, chief of detectives for Chicago’s police department. “The average pedophile at this point is probably thinking, I’ve got to get an Apple phone.”
Um. No. That's just ridiculous. Frankly, if pedophiles are even thinking about encryption, it's likely they're already using one of the many encryption products on the market. And, again, this demonizing of encryption as if it's only a tool of pedophiles and criminals is just absurd. Regular everyday people use encryption every single day. You're using it if you visit this very website. And it's increasingly becoming the standard, because that's just good security.

Originally Published: Fri, 26 Sep 2014 14:55:57 GMT
source
#7
Quote:They can promise strong encryption. They just need to figure out how they can provide us plain text. - FBI General Counsel Valerie Caproni, September 27, 2010

Quote:[W]e're in favor of strong encryption, robust encryption. The country needs it, industry needs it. We just want to make sure we have a trap door and key under some judge's authority where we can get there if somebody is planning a crime. - FBI Director Louis Freeh, May 11, 1995

Here we go again.  Apple has implemented (and Google has long announced it will implement) basic encryption on mobile devices. And predictably, law enforcement has responded with howls of alarm.

We've seen this movie before.  Below is a slightly adapted blog post from one we posted in 2010, the last time the FBI was seriously hinting that it was going to try to mandate that all communications systems be easily wiretappable by mandating "back doors" into any encryption systems.  We marshaled eight "epic failures" of regulating crypto at that time, all of which are still salient today.  And in honor of the current debate, we've added a ninth: 

. . .

If the government's howls of protest at the idea that people will be using encryption sound familiar, it's because regulating and controlling consumer use of encryption was a monstrous proposal officially declared dead in 2001 after threatening Americans' privacy, free speech rights, and innovation for nearly a decade. But like a zombie, it's now rising from the grave, bringing the same disastrous flaws with it.

For those who weren't following digital civil liberties issues in 1995, or for those who have forgotten, here's a refresher list of why forcing companies to break their own privacy and security measures by installing a back door was a bad idea 15 years ago:

  1. It will create security risks. Don't take our word for it. Computer security expert Steven Bellovin has explained some of the problems. First, it's hard to secure communications properly even between two parties. Cryptography with a back door adds a third party, requiring a more complex protocol, and as Bellovin puts it: "Many previous attempts to add such features have resulted in new, easily exploited security flaws rather than better law enforcement access." It doesn't end there. Bellovin notes:

    Quote:Complexity in the protocols isn't the only problem; protocols require computer programs to implement them, and more complex code generally creates more exploitable bugs. In the most notorious incident of this type, a cell phone switch in Greece was hacked by an unknown party. The so-called 'lawful intercept' mechanisms in the switch — that is, the features designed to permit the police to wiretap calls easily — were abused by the attacker to monitor at least a hundred cell phones, up to and including the prime minister's. This attack would not have been possible if the vendor hadn't written the lawful intercept code.

    More recently, as security researcher Susan Landau explains, "an IBM researcher found that a Cisco wiretapping architecture designed to accommodate law-enforcement requirements — a system already in use by major carriers — had numerous security holes in its design. This would have made it easy to break into the communications network and surreptitiously wiretap private communications."

    The same is true for Google, which had its "compliance" technologies hacked by China.

    This isn't just a problem for you and me and millions of companies that need secure communications. What will the government itself use for secure communications? The FBI and other government agencies currently use many commercial products — the same ones they want to force to have a back door. How will the FBI stop people from un-backdooring their deployments? Or does the government plan to stop using commercial communications technologies altogether? (A toy sketch of the extra attack surface an escrow key creates appears after this list.)
     

  2. It won't stop the bad guys. Users who want strong encryption will be able to get it — from Germany, Finland, Israel, and many other places in the world where it's offered for sale and for free. In 1996, the National Research Council did a study called "Cryptography's Role in Securing the Information Society," nicknamed CRISIS. Here's what they said:

    Quote:Products using unescrowed encryption are in use today by millions of users, and such products are available from many difficult-to-censor Internet sites abroad. Users could pre-encrypt their data, using whatever means were available, before their data were accepted by an escrowed encryption device or system. Users could store their data on remote computers, accessible through the click of a mouse but otherwise unknown to anyone but the data owner; such practices could occur quite legally even with a ban on the use of unescrowed encryption. Knowledge of strong encryption techniques is available from official U.S. government publications and other sources worldwide, and experts understanding how to use such knowledge might well be in high demand from criminal elements. — CRISIS Report at 303

    None of that has changed. And of course, more encryption technology is more readily available today than it was in 1996. So unless the government wants to mandate that you are forbidden to run anything that is not U.S. government approved on your devices, they won't stop bad guys from getting access to strong encryption.
     

  3. It will harm innovation. In order to ensure that no "untappable" technology exists, we'll likely see a technology mandate and a draconian regulatory framework. The implications of this for America's leadership in innovation are dire. Could Mark Zuckerberg have built Facebook in his dorm room if he'd had to build in surveillance capabilities before launch in order to avoid government fines? Would Skype have ever happened if it had been forced to include an artificial bottleneck to allow government easy access to all of your peer-to-peer communications? This has especially serious implications for the open source community and small innovators. Some open source developers have already taken a stand against building back doors into software.
     

  4. It will harm US business. If, thanks to this proposal, US businesses cannot innovate and cannot offer truly secure products, we're just handing business over to foreign companies who don't have such limitations. Nokia, Siemens, and Ericsson would all be happy to take a heaping share of the communications technology business from US companies. And it's not just telecom carriers and VOIP providers at risk. Many game consoles that people can use to play over the Internet, such as the Xbox, allow gamers to chat with each other while they play. They'd have to be tappable, too.
     

  5. It will cost consumers. Any additional mandates on service providers will require them to spend millions of dollars making their technologies compliant with the new rules. And there's no real question about who will foot the bill: the providers will pass those costs on to their customers. (And of course, if the government were to pay for it, they would be using taxpayer dollars.)
     

  6. It will be unconstitutional. Of course, we wouldn't be EFF if we didn't point out the myriad constitutional problems. The details of how a cryptography regulation or mandate will be unconstitutional may vary, but there are serious problems with nearly every iteration of a "no encryption allowed" proposal that we've seen so far. Some likely problems:
    • The First Amendment would likely be violated by a ban on all fully encrypted speech.
    • The First Amendment would likely not allow a ban of any software that can allow untappable secrecy. Software is speech, after all, and this is one of the key ways we defeated this bad idea last time.
    • The Fourth Amendment would not allow requiring disclosure of a key to a backdoor into our houses so the government could read our "papers" in advance of a showing of probable cause, and our digital communications shouldn't be treated any differently.
    • The Fifth Amendment would be implicated by required disclosure of private papers and the forced utterance of incriminating testimony.
    • Right to privacy. Both the right to be left alone and informational privacy rights would be implicated.
     

  7. It will be a huge outlay of tax dollars. As noted below, wiretapping is still a relatively rare tool of government (at least for the FBI in domestic investigations -- the NSA is another matter, as we now all know). Yet the cost of creating a huge regulatory infrastructure staffed with government bureaucrats to enforce the mandates will be very high. So, the taxpayers would end up paying for more expensive technology, higher taxes, and lost privacy, all for the relatively rare chance that motivated criminals will act "in the clear" by not using encryption readily available from a German or Israeli company or for free online.
     

  8. The government hasn't shown that encryption is a problem. How many investigations have been thwarted or significantly harmed by encryption that could not be broken? In 2009, the government reported only one instance of encryption that they needed to break out of 2,376 court-approved wiretaps, and it ultimately didn't prevent investigators from obtaining the communications they were after. This truth was made manifest in a recent Washington Post article written by an ex-FBI agent. While he came up with a scary kidnapping story to start his screed, device encryption simply had nothing to do with the investigation.  The case involved an ordinary wiretap. In 2010, the New York Times reported that the government officials pushing for this have only come up with a few examples (and it's not clear that all of the examples actually involve encryption) and no real facts that would allow independent investigation or confirmation. More examples will undoubtedly surface in the FBI's PR campaign, but we'll be watching closely to see if underneath all the scary hype there's actually a real problem demanding this expensive, intrusive solution.
     

  9. Mobile devices are just catching up with laptops and other devices.  Disk encryption just isn't that new. Laptops and desktop computers have long had disk encryption features that the manufacturers have absolutely no way to unlock. Even for simple screen locks with a user password, the device maker or software developer doesn't automatically know your password or have a way to bypass it or unlock the screen remotely. Although many law enforcement folks don't really like disk encryption on laptops and have never really liked it, and we understand that some lobbied against it in private, we haven't typically heard them suggest in public that it was somehow improper for these vendors not to have a backdoor to their security measures. That makes us think that the difference here is really just that some law enforcement folks think that phones are just too popular and too useful to have strong security.  But strong security is something we all should have.  The idea that basic data security is just a niche product and that ordinary people don't deserve it is, frankly, insulting.  Ordinary people deserve security just as much as elite hackers, sophisticated criminals, cops and government agents, all of whom have ready access to locks for their data.

The real issue with encryption may simply be that the FBI has to use more resources when they encounter it than when they don't. Indeed, Bellovin argues: "Time has also shown that the government has almost always managed to go around encryption." (One circumvention that's worked before: keyloggers.) But if the FBI's burden is the real issue here, then the words of the CRISIS Report are even truer today than they were in 1996:

Quote:It is true that the spread of encryption technologies will add to the burden of those in government who are charged with carrying out certain law enforcement and intelligence activities. But the many benefits to society of widespread commercial and private use of cryptography outweigh the disadvantages.

The mere fact that law enforcement's job may become a bit more difficult is not a sufficient reason for undermining the privacy and security of hundreds of millions of innocent people around the world who will be helped by mobile disk encryption.  Or as Chief Justice John Roberts recently observed in another case rejecting law enforcement's broad demands for access to the information available on our mobile phones: "Privacy comes at a cost."

Reposted from the Electronic Frontier Foundation's Deeplinks Blog


Permalink
#8
So, what did cops do when there was no phone or internet, before the 21st century? My guess is real investigation and police work.

Today, when I think about cops doing their work, all I come up with is a man sitting somewhere, googling to see whether he'll find the crime.
#9
(Sep 27, 2014, 06:27 am)Lightrex Wrote: So, what did cops do when there was no phone or internet, before the 21st century?

they violated rights the good old-fashioned way: with phone books, secondhand smoke and oppressive mustaches.
#10
Well, you knew it was coming. First, law enforcement trotted out random low level "law enforcement officials" to freak out about Apple and Google's announced plans to make encryption the default on mobile phones. Then it got taken up a notch when FBI boss James Comey lashed out at the idea, bizarrely arguing that merely encrypting your data made individuals "above the law" (none of that is accurate). And, now, Comey's boss, Attorney General Eric Holder has stepped up to issue a similar warning. However, Holder has cynically chosen to do so at the Biannual Global Alliance Conference Against Child Sexual Abuse Online.

At this point, it's all too predictable that when anyone in power is getting ready to take away your rights, they'll figure out a way to claim that it's "for the children!" The statements over the past week by law enforcement, Comey and now Holder are clearly a coordinated attack -- the start of the new crypto wars (a repeat of what we went through a decade and a half ago), designed to pass some laws that effectively cripple encryption and put backdoors in place. Holder's take on this is to cynically pull on heartstrings about "protecting the children" despite this having nothing, whatsoever, to do with that.
Quote: When a child is in danger, law enforcement needs to be able to take every legally available step to quickly find and protect the child and to stop those that abuse children. It is worrisome to see companies thwarting our ability to do so.
Again, as stated last week, the same argument could be made about walls and doors and locks.
Quote: It is fully possible to permit law enforcement to do its job while still adequately protecting personal privacy.
The key issue here is "adequately," and forgive many of us for saying so, but the public no longer trusts the DOJ/NSA/FBI to handle these things appropriately. And, just as importantly, we have little faith that the backdoors that the DOJ is pushing for here aren't open to abuse by others with malicious intent. Protecting personal privacy is about protecting personal privacy -- and the way you do that is with encryption. Not backdoors.

But Holder used this opportunity to cynically pile on about criminals using encryption, rather than noting any of the important benefits towards privacy they provide:
Quote: Recent technological advances have the potential to greatly embolden online criminals, providing new methods for abusers to avoid detection. In some cases, perpetrators are using cloud storage to cheaply and easily store tens of thousands of images and videos outside of any home or business – and to access those files from anywhere in the world. Many take advantage of encryption and anonymizing technology to conceal contraband materials and disguise their locations.
The DOJ has long wanted to restart the crypto wars that it lost (very badly) last time around (even though that "loss" helped enable parts of the internet to thrive by making it more secure). For years it's been looking to do things like reopen wiretapping statutes like CALEA and mandate wiretap backdoors into all sorts of technology. Now it's cynically jumping on this bit of news about Apple and Google making it just slightly easier to protect your privacy to try to reopen those battles and shove through new laws that will emphatically decrease your privacy.

Originally Published: Wed, 01 Oct 2014 13:08:36 GMT
source

