Hactivismo

We have always been at war with crypto

The dream of the 90s is alive

As we debate "responsible encryption," here is a long scroll of pull quotes from previous incarnations of the Crypto Wars. If you're concerned about this, donate to the EFF -- they've always been there, pushing back against this insanity.

1995

"Opposing Clipper is an odd pairing of civil liberties activists and corporations. The activists worry that the government could have too much access to private exchanges. Companies have chafed at export restrictions that stop them from using the best encryption technologies in products they sell abroad. … Companies would rather include many different encryption technologies in the products they sell and don't want to be locked into government-approved hardware. They also point out that their customers overseas are unlikely to want to use the Clipper lock knowing that the U.S. government holds the keys."
-- https://www.washingtonpost.com/archive/business/1995/03/16/three-ways-t…

Entropy Story-time: From Claude Shannon to Equifax

Mix Two Colors / Pietro Jeng

There's a piece floating around that does a great, succinct job of summarizing Claude Shannon's contributions to our modern understanding of information. If you haven't read The bit bomb on Aeon, head over there. It'll make your brain happy with things like this:

"Shannon – mathematician, American, jazz fanatic, juggling enthusiast – is the founder of information theory, and the architect of our digital world. It was Shannon’s paper ‘A Mathematical Theory of Communication’ (1948) that introduced the bit, an objective measure of how much information a message contains."

The article digs deep into how easy it is to predict things - especially language - and ends up focusing on how the power of pattern detection lets us compress information:

"Shannon expanded this point by turning to a pulpy Raymond Chandler detective story […] He flipped to a random passage … then read out letter by letter to his wife, Betty. Her role was to guess each subsequent letter […] Betty’s job grew progressively easier as context accumulated […] a phrase beginning ‘a small oblong reading lamp on the’ is very likely to be followed by one of two letters: D, or Betty’s first guess, T (presumably for ‘table’). In a zero-redundancy language using our alphabet, Betty would have had only a 1-in-26 chance of guessing correctly; in our language, by contrast, her odds were closer to 1-in-2. "

Let's talk about PGP

I've been working on a new way to explain email encryption; I'd appreciate feedback on this approach. If you're looking to try email encryption out, buy me a beer (or let me buy you one) if we're in the same place, or check out the usable, in-browser work by Mailvelope.

New GPG Keys!

I am transitioning both my professional and personal GPG keys. This transition document (in full, below) and both updated keys are signed with both old and new keys for both personal and professional accounts to validate the transition.

In short:
[email protected] - new keyID 270C17F1
[email protected] - new keyID FDDB8C25

If this is all Greek to you, GPG (or PGP) is a way to encrypt your email so that only specific other people (who must also be using GPG) are able to read it. While we think of email like regular mail, with a level of privacy like something in an envelope, the reality is that it's better compared to a postcard. If you're interested in getting started, I highly recommend EFF's excellent PGP guide, and Mailvelope is a super-easy browser plugin to help get you started with more secure webmail (it works great, for example, with Gmail).
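
If you'd like to check a transition statement like this one yourself, here's a rough sketch of the usual workflow, driving GnuPG from Python. The statement filename is a placeholder; the key IDs are the ones listed above.

```python
import subprocess

# Key IDs from the transition note above; the statement filename is a
# stand-in -- use whatever file the author actually publishes.
NEW_KEY_IDS = ["270C17F1", "FDDB8C25"]
STATEMENT = "key-transition-statement.txt"  # hypothetical filename

# Fetch both new keys from a public keyserver.
subprocess.run(["gpg", "--recv-keys", *NEW_KEY_IDS], check=True)

# Verify the clearsigned statement; it should carry good signatures from
# both the old and the new keys, which is what makes the transition credible.
subprocess.run(["gpg", "--verify", STATEMENT], check=True)

# Finally, compare the full fingerprints against a copy obtained out of band
# (in person, over the phone, etc.) before signing or trusting the new keys.
subprocess.run(["gpg", "--fingerprint", *NEW_KEY_IDS], check=True)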

What Good Are Secure Communications Tools if No One Uses Them?

USABLE.tools

Cross-posted from my piece on Medium

It was the second day of digital security training, and I was losing the room. The journalists, documentarians, and media activists around the table were more intent on following their friends and colleagues via Facebook chat than on dealing with the fidgety, hard-to-install, but super-secure communications tools I was trying to promote.
They had good reason — it was winter 2014, during the tense final days of Ukraine’s EuroMaidan protests, going on just across town from our training. The urgency of communication was just too much. Overnight, most of the trainees had chosen to uninstall the app we’d spent the better part of the previous day installing on a mix of Windows XP, 7, Mac, and even Linux systems.

But then again, I had good reason to urge security. Protesters were being arrested because of insecure communications. People were worried about their own government, but also about the small number of companies controlling their telecommunications.

I thought I had understood their need — they wanted a way to have trusted, private communications that spanned from mobile to desktop, chat to voice.
But I had failed. I was pushing a collection of tools I knew to be the best in their class for security, developed transparently as open source, with constant attention not only to bugs but to the nuances of cryptography, careful and responsible implementation, and the monitoring of possible new flaws. The tools were also the only ones that combined these security features with both text and voice capabilities that could bridge desktop and mobile.

These activists required a tool that they could show to others and start using in minutes, not one that took a day of training and debugging just to install. Tools that aren’t used aren’t providing security.

A Recent History of Back Doored Encryption, in 4 links

TSA Keys, 3D-printed

This is partially a footnotes section from last week's Crypto Saves Lives post, but every week brings new stories, and this week was a doozy. So, let's recap the whole "backdoored crypto / secret golden keys can work" argument:

Claims:

(1) We can protect private information

*Cough* OPM *Cough*

Update: "Security bloggers and researchers claim to have uncovered a publicly available database exposing the personal information of 191 million voters on the Internet. The information contains voters’ names, home addresses, voter IDs, phone numbers and date of birth, as well as political affiliations and a detailed voting history since 2000."

(2) Well, we are really good at protecting super-important crypto keys that only give good guys access.

So, those luggage locks with a "golden key," now required worldwide, that only trained TSA agents can pop open? Yeah, about that... The TSA's master key set was allowed to be photographed, and while that photo was quickly taken off the internet, the damage was done. Anyone can now 3D print completely functional TSA keys.

(3) Besides, adding a backdoor won't cause problems!


Of Tor and Condoms

A garlic flavored condom

I am far from the first to compare digital security practices to safer sex practices. Heck, you can even see a rap career blooming as Jillian York and Jacob Appelbaum suggest that it's time that we "talk about P-G-P" at re:publica.

Talking about software and trust gets both very boring and very depressing quickly. Let's instead move on to the juicy sex-ed part!

A quick disclaimer: First, apologies for the at-times male and/or heteronormative point of view; I'd welcome more inclusive language, especially around the HTTPS section. Second, I am unabashedly pro-Tor, a user of the Tor network, and am even lucky enough to get to collaborate with them on occasion. The garlic condom photo comes from The Stinking Rose.

Super-duper Unsafe Surfing

Using the Internet without any protection is a very bad idea. The SANS Institute's Internet Storm Center tracks "survival time" - the time a completely unprotected computer facing the raw Internet can survive before becoming compromised by a virus - in minutes. Not days, not even hours. This is so far off the charts that, in a safer-sex metaphor, using no protection is less like engaging in risky behavior and more like injecting yourself with an STD.

Barely less unsafe surfing

Adding a constantly updated anti-virus tool and a firewall, and making sure that your operating system is up to date, is akin to being healthy. You have a basically operational immune system - congrats! You'll be fine if the person you're sleeping with has the common cold, but anything more serious than that and you're in trouble.

Using HTTPS - visiting websites which show up with a green lock icon - is also a good practice. You can even install some browser plugins like HTTPS Everywhere and CertPatrol that help you out.

HTTPS is kind of like birth control. You may successfully prevent *ahem* the unauthorized spread of your information, but you're still relying on a significant amount of trust in your partner (to have taken the pill, to withdraw), and there are things beyond your knowledge that can go wrong - the pharmacist provided fake pills, or you have a withdrawal failure (please note this is digital security advice, and not at all good safer-sex advice - a quick visit to Wikipedia is a good start for effective -- and non-effective -- birth control methods!). With SSL certificates, you are still trusting that the website has good practices to protect your information (insert the constant litany of password-reset links you've had to deal with this year here), there have been cases of stolen SSL certificates, and there are tools that help an attacker try to intercept your encrypted traffic.
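
If you're curious who, exactly, you're trusting when that green lock appears, here's a small sketch using nothing but Python's standard library (swap in any site you like for example.com) that prints who issued a site's certificate and when it expires:

```python
import socket
import ssl

host = "example.com"  # any HTTPS site you want to inspect

# Open a TLS connection using the system's trusted certificate authorities,
# then look at the certificate the site presented.
context = ssl.create_default_context()
with socket.create_connection((host, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()

subject = dict(item for rdn in cert["subject"] for item in rdn)
issuer = dict(item for rdn in cert["issuer"] for item in rdn)
print("issued to :", subject.get("commonName"))
print("issued by :", issuer.get("organizationName"), "/", issuer.get("commonName"))
print("expires   :", cert["notAfter"])
```

That "issued by" line is the certificate authority you're implicitly trusting every time the lock shows up.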

Slightly Safer Surfing

With digital security, much as with safer sex, some methods can be combined for greater effect, but layering other methods can be a horrible idea. Adding anti-virus tools, firewalls, system updates, and HTTPS on top of any other method here is universally a Good Thing.

Using a VPN is like using a condom provided by your partner for this encounter, and given to them by a source neither of you has any real trust in. Asking the manufacturer for information about exactly how it's made, or what its expiration date is, will often result in grand claims (but no hard evidence). Requests to see the factory floor and verify these claims are presumed to be jokes. The VPN-brand condom generally works, and is definitely fast and easy, but you're placing a lot of trust in a random company you found while searching the Internet - and probably the cheapest one you found, at that. On top of that, you're also still trusting your partner not to have poked any holes in the condom.

Overall, it's still much better to be using the VPN than not, and if you trust your partner (i.e. the website or service you're going to), and you trust the VPN provider for whatever reason - perhaps a widely trusted company has independently audited the VPN, or you or your workplace set it up yourselves - then for most situations you're pretty safe. Layering a VPN on top of the above tools is good, but layering VPNs on VPNs or on other networks is not dissimilar to layering condoms - it actually makes failure in very weird (and, let's face it, awkward) ways /more/ likely.

Safer Surfing

Still, though, wouldn't it be better if you could rely even less on trust, and have that trust backed up with evidence that you yourself can look at?

Using Tor is like using a condom that you not only know has gone through extensive testing - you can even visit the factory floor, look at the business's finances, and talk with the engineers and factory staff. It's /still/ not 100% safe, but it is a heck of a lot safer, and you can verify each and every claim made about what it does and does not do.

And to be clear here, if you're logging in to a website over Tor, that website now knows who you are (you're no longer anonymous to them, and possibly not to others watching you do this along the wire), and that website is storing your password and may fail to protect it at some point. That website can still turn out to be malicious and attack you, and very powerful adversaries can even specifically try to intercept traffic coming from a website and heading into the super-secret Tor network, change it, and include an attack they know works well against out-of-date versions of the browser you're using. An out-of-date Tor Browser is like an expired condom - it's best not to bet your life on it.

To really (over-)extend the analogy, the Tor-branded condom business happens to be heavily funded by a religious organization that is strongly against birth control (and indeed has an entire project that tries to undermine birth control methods, to the point of installing secret hole-punchers in condom factories). This same organization (it's large!) does have a different and vocal component that strongly supports safer sex, and not only funds giving away condoms, but also the production of them. It's not, seemingly, the most logical setup, but hey, we're talking religion, politics, and sex - logic doesn't always come into play here.

Like sex, there is no truly "safe" way to play on the Internet, and expecting abstinence from the Internet isn't realistic either. So, be careful out there, adopt safer practices, and keep your wits about you. Good luck!


Do you trust your tools?

There's a budding conversation on "trust" over in the twitterverse. I began a draft post a while back that compared Tor (the amazing privacy and anti-censorship network) and all privacy-protecting software to condoms. More on that soon, but let's actually talk about how you might come to trust a software project, using Tor as an example. Tor has been in the news recently, and I've had a ton of people ask me how safe it is to use, so I figured one click-bait headline is as good as another for having an open and honest discussion about Tor.

First, let's be transparent. Tor - not unlike the Internet itself - did in fact start out as a project of the US Naval Research Laboratory, and does continue to receive funding from the US Government to support freedom of expression around the world, with targeted efforts to enable free speech and access to uncensored information in countries where Internet connections are heavily filtered.

So, can you trust Tor? How do you know that the NSA hasn't forced Tor into building a "back door" into the Tor software, as they did with RSA Security and many other pieces of software you use daily, or as has historically happened to privacy-protecting services like Hushmail?

The answer is that you should not actually need to trust the organization behind Tor in order to be confident that the software is built to be safe. This is enabled by the fact that Tor is open source - meaning you can read every line of the code they use to build the software you install. Of course, even with open source software, you're trusting whoever is compiling it to do so on a secure system and without any extra malicious intent. The Tor Project answers this problem by using "deterministic builds", which let you check, independently, that the code posted publicly is the code you're running.

If you use Windows or Mac, both "closed source" operating systems, you are absolutely, 100% trusting that no one in the company, nor any government with significant sway over these companies, has snuck in code to allow remote spying. You have no way to inspect the code running your operating system, and every tool you use on top of it is vulnerable to being undermined by something as simple as a hack to the tiny piece of software that tells your computer how to talk with the keyboard - which could just as easily also store every password you have ever typed. You're also trusting your ISP, every website you log in to, and thousands of other intermediaries and companies - from the ones who provide SSL certificates (enabling the "green lock" of a secure website) to the manufacturer of your wifi router and cable modem - not to betray your trust by accident, under duress, or with malicious intent.

Of course, even back in the green pastures of open source, there is no "absolute" level of trust, no matter how much we'd like there to be. Rare is the user who actually checks the "signature" of the download file against the posted "signature" online to make sure the tool they're about to install is the intended one. And even rarer is the user who checks in on the deterministic build process (which is still fragile, and so hard to guarantee even then). Even at this level, you are trusting the developers and others in the open source and security community to write solid code and check it for bugs. The Tor Project does an exceptional job at this, but as Heartbleed reminds us, huge, horrible bugs can go unseen, even in the open, for a long time. You're also trusting all the systems that the developers work on not to be compromised, to be running code that is itself in more or less good condition, and to be using compilers that aren't doing funny things.
For what it's worth, this is hardly a new problem. In my unhumble opinion, I'd still rather have this more open model of shared trust in the open source world than rely on any single company, whose prime motive is to ship software features on time.
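
For the simpler of those two checks, here's roughly what it looks like in practice - a sketch with a placeholder filename and checksum, not real values - comparing the SHA-256 of a downloaded installer against the one the project publishes. (Verifying the GPG signature on that published checksum file is the other half of the job.)

```python
import hashlib

# Placeholders: substitute the file you actually downloaded and the checksum
# the project publishes alongside it (ideally fetched over a separate channel).
DOWNLOAD = "tor-browser-installer.tar.xz"
PUBLISHED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"

# Hash the download in chunks so large files don't need to fit in memory.
sha256 = hashlib.sha256()
with open(DOWNLOAD, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        sha256.update(chunk)

if sha256.hexdigest() == PUBLISHED_SHA256.lower():
    print("checksum matches what the project published")
else:
    print("MISMATCH -- do not install this file")
```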

So - can you trust Tor? I do. But saying that I "trust" Tor doesn't mean I have 100% faith that their software is bulletproof. All software has bugs, and security software in particular requires a lot of work on the part of the user to make it all work out as expected. It's time to talk about trust less as a binary and more as a pragmatic approach to decision-making based on best practices, source availability, and organizational transparency.


Of BBQ, Open Source, and Heartbleed.

Heartbleed logo

There's a point here about Heartbleed and security — I promise. Stay with me.

As I am wont to do once the weather finally begins to coöperate, I've been trying a few new things out on the grill. When I'm in this exploratory phase, I love digging through the infinitely interesting BBQ blogs of the Internet - they're full of hard-won knowledge about fire and smoke, but often lack a certain level of technical polish.

Case in point, my reference blog for this week's experiment was a well-seasoned old blog, but they'd lost every single comment from years of discussions. Why? No technical glitch, but simply because they'd chosen a private company to manage their comments - and it went out of business, leaving them not only without a commenting tool, but without those years of educational clarifications and discussions.

Ownership and control matter. This is true of your possessions, your house, your comments on a BBQ blog - and your software. I've railed against app-ification before, but I want to make a slightly deeper point here. If you bought a house, but with the condition that any repair, no matter how minor, had to be contracted to the previous owner (and only them) at a cost of their choosing - would you feel you really owned or controlled that house? Would you buy a car whose hood was locked shut, accessible only to the specific dealership where you bought it?

This is very much the situation with the vast majority of software you run on your computer. From Microsoft Word to Apple's iTunes - and, even more insidiously, OS X and Microsoft Windows themselves - it's all locked away from you. You've been forced to pay hundreds of dollars for it with the purchase of any computer, but you have no control over or real ownership of it.

Open Source

The alternative is what's called "free" or "open source" software (people get into fierce debates over the terminology here, which I'm ignoring for the time being). All software starts with instructions that are more or less understandable by humans; commands like if (this thing) then (do this other thing). Generally speaking, this "language" is then turned into something closer to the more basic instructions that computers understand. Imagine a particularly skilled dog with a great memory - by stringing together enough fetches, play-deads, stops, roll-overs, and so on, you could eventually come up with a sequence of commands that would have this dog go out, buy a beer for you at the corner store, and bring it back.

"Closed source" software only gives you the computer-understandable version, and it's surprisingly difficult to turn that back into a simple, human-understandable chunk of logic. "Open source" software, on the other hand, always provides you with the original, understandable language.

This means a lot of things - one, you can tweak it. If you don't like the beer that your dog fetched, you can find the human-speak parts of the commands where it's selected, and make sure your preference for hoppy beer is respected, and then turn it back into the commands your computer can do.
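
To make that slightly more concrete, here's a toy sketch (entirely made up, in Python for readability) of what "finding the human-speak part and tweaking it" looks like. With only the compiled, closed-source form, this one-line change would mean reverse-engineering machine code instead.

```python
# Toy example of "readable source you can tweak" -- not real software.
BEERS = {"lager": (2.0, 1), "pilsner": (2.5, 2), "ipa": (3.5, 5)}  # (price, hoppiness)

def choose_beer(available):
    # As shipped, the dog grabs the cheapest beer on the shelf.
    # Because the source is readable, you can change [0] (price) to [1]
    # (hoppiness) and min to max, re-run it, and your preference for
    # hoppy beer is respected.
    return min(available, key=lambda name: BEERS[name][0])

print(choose_beer(["lager", "ipa", "pilsner"]))  # -> "lager"
```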

This ability to change how your own tools work has many additional benefits - you can share that change, and if it's useful enough, it will be included in the next version of the "core" software that everyone uses.

And finally, Heartbleed

This openness also means anyone can look at the logic driving their tool. So when you start talking about trusting software, there's a heavy preference for software whose source code you can look at, and an even heavier preference for software where a lot of people have been looking at that same code.

So, that's what failed with Heartbleed. The team behind OpenSSL is tiny compared to its impact. Two out of every three secure servers in the world run the software that this four-person team manages. And on New Year's Eve 2011, one of their developers committed a very, very subtle piece of code that basically didn't make sure all the doors were closed behind it, and no one else at the time (or anyone who'd had a chance to take a look in the two years since) noticed.
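
For a rough sense of what "didn't make sure all the doors were closed" means: the heartbeat code echoed back as many bytes as the request claimed to contain, without checking how many it actually contained. Here's a toy model of that shape of bug - my illustration in Python, not OpenSSL's actual C code.

```python
# Toy model of a Heartbleed-shaped bug -- not OpenSSL's real code.
# "memory" stands in for the server process's memory: the received payload
# sits right next to data that was never meant to leave the machine.
memory = bytearray(b"hello" + b" | session-key=hunter2 | another-user's-password")

def heartbeat(payload_offset: int, claimed_length: int) -> bytes:
    # BUG: we trust the length the client claimed instead of the length of
    # the payload it actually sent, so the copy runs past the payload and
    # returns whatever happened to be stored next to it.
    return bytes(memory[payload_offset:payload_offset + claimed_length])

print(heartbeat(0, 5))    # honest request: echoes b"hello"
print(heartbeat(0, 50))   # malicious request: leaks the adjacent secrets
```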

So obviously the whole open source thing is broken, right? The bug is out in the open for anyone to figure out, but no one fixed it!

It's not quite so simple. Do you really think that a working piece of closed-source code gets a second glance from its development team? They're just as bound by priorities and shipping product releases as an open-source team, but their code gets locked away without even the chance for a third party to find a bug and lend a hand — and it's no more protected than the open source tools from the kind of concentrated probing and testing for flaws that found Heartbleed.

So yes, Heartbleed was bad, but it was also a reminder of how powerful the open source software world can be at finding and fixing a bug. Most of us woke up with some updates to install, and that was the end of it. What horrible, dark bugs are lurking, unfindable, in every piece of closed source software? The precise number is unknowable, but the prevalence of viruses and malware affecting deeply closed systems like Windows might be a strong hint.

No more broken hearts

Going forward, I obviously have a long wishlist of things I'd like to see - a public discussion on what trust in software really means, better tools on every platform to guarantee software packages are what they claim to be (Tor is doing amazing work here), a return to interoperable standards, especially when we're talking about security systems... But as a starting point, simply having better support structures for open code development would be nice. We have volunteers building the basic structures of the Internet - which is an absolutely amazing and good thing - but let's make sure they have the time and resources to do it.