Thursday, March 26, 2015

Message to Errata employees

Dear employees,

Starting next week, Errata Security will be following RSA Conference's lead and instituting a "Morality Dress Code" in order to deal with the problem of loose women on the premises.

Attire of an overly revealing or suggestive nature is not permitted. Examples of such attire may include but are not restricted to:

  • Tops displaying excessive cleavage;
  • Tank tops, halter tops, camisole tops or tube tops;
  • Miniskirts or minidresses;
  • Shorts;
  • Lycra (or other Second-Skin) bodysuits;
  • Objectionable or offensive costumes.
These guidelines are applicable to all staff, regardless of gender, and will be strictly enforced. Therefore, Dave's practice of showing up on casual Fridays in a miniskirt and push-up bra will no longer be tolerated. We have burkas of varying sizes on hand for those who fail to comply.

If you have any questions, please consult the Morality Officer for your department.

Regards,
Robert Graham
CEO, Errata Security

"Shalim" by Zivya - Own work. Licensed under CC BY-SA 3.0 via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:Shalim.JPG#/media/File:Shalim.JPG

PS: This is satire, of course. We don't support RSA's morality code.

Wednesday, March 25, 2015

x86 is a high-level language

Just so you know, x86 machine code is now a "high-level" language. What instructions say, and what they do, are very different things.

I mention this because of the comments on this post about OpenSSL's "constant-time" calculations, which are designed to avoid revealing secrets through variations in compute time. The major comment is that it's hard to do this perfectly in C. My response is that it's hard to do this even in x86 machine code.
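
To make that concrete, here's a minimal sketch of the kind of constant-time compare being discussed. This is my own illustration, not OpenSSL's actual code: it touches every byte no matter where the first mismatch is, so in theory its running time doesn't depend on the secret contents.

    #include <stddef.h>

    /* Sketch of a constant-time memory compare (illustrative, not
     * OpenSSL's actual code). It examines every byte regardless of
     * where the first mismatch occurs, so the loop's running time
     * does not depend on the secret contents. */
    static int ct_memcmp(const unsigned char *a, const unsigned char *b, size_t n)
    {
        unsigned char diff = 0;
        for (size_t i = 0; i < n; i++)
            diff |= a[i] ^ b[i];   /* accumulate differences, never exit early */
        return diff != 0;          /* 0 if equal, 1 if any byte differed */
    }

Even this simple loop is at the compiler's mercy -- an optimizer is free to transform it into something that exits early. That's the "hard to do perfectly in C" problem. The rest of this post argues the problem doesn't stop at the compiler.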

Consider registers, for example. Everyone knows that the 32-bit x86 was limited to 8 registers, while 64-bit expanded that to 16 registers. This isn't actually true. The latest Intel processors have 168 registers. The name of the register in x86 code is really just a variable name, similar to how variables work in high-level languages.

So many registers are needed because the processor has some 300 instructions "in flight" at any point in time, in various stages of execution. It rearranges these instructions, executing them out-of-order. Everyone knows that processors can execute things slightly out-of-order, but that understates it. Today's processors are massively out-of-order.

Consider the traditional branch pair of a CMP (compare) followed by a JMPcc (conditional jump). While this is defined as two separate instructions as far as we humans are concerned, the processor fuses them into a single internal operation.

Consider the "xor eax, eax" instruction, which is how we've traditionally cleared registers. This is never executed as an instruction; it just marks "eax" as no longer used, so that the next time an instruction needs the register, the processor allocates a fresh (zeroed) register from that pool of 168.

Consider "mov eax, ebx". Again, this doesn't do anything except rename the register as far as the processor is concerned, so that from this point on, the name eax refers to the same physical register that ebx did.

The processor has to stop and wait 5 clock cycles to read something from L1 cache, 12 cycles for L2 cache, or 30 cycles for L3 cache. But because the processor is massively out-of-order, it can continue executing later instructions that don't depend upon this memory read, including other memory reads. To the running program, the results always appear as if the processor executed everything in order, but outside the CPU, things happen in a strange order.
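
You can roughly see these latencies yourself with the timestamp counter. This is a sketch using compiler intrinsics (assuming GCC or Clang on x86), and the fences only approximately serialize the read -- which is itself a demonstration of the point:

    #include <stdint.h>
    #include <x86intrin.h>   /* __rdtsc(), _mm_lfence(); GCC/Clang on x86 */

    /* Roughly time a single memory read in cycles. The LFENCEs keep
     * the out-of-order engine from hoisting the load above the first
     * RDTSC, or letting the second RDTSC run before the load
     * completes. Even so, the numbers jitter -- the measurement
     * disturbs itself. */
    static uint64_t time_read(const volatile int *p)
    {
        _mm_lfence();
        uint64_t start = __rdtsc();
        (void)*p;                 /* the load being timed */
        _mm_lfence();             /* wait for the load to finish */
        uint64_t end = __rdtsc();
        return end - start;
    }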

This means any attempt to get smooth, predictable execution out of the processor is very difficult, and that "side-channel" attacks on x86 leaking software crypto secrets may always be with us.

One solution to these problems is CMOV, the "conditional move" instruction. It's like a normal MOV, but it either moves or does nothing depending on the condition flags. It can be used in some cases to replace branches, which makes pipelined code more efficient. Currently, it takes constant time: when moving from memory, it still waits for the data to arrive, even when it knows it's going to throw that data away. As Linus Torvalds famously pointed out, CMOV doesn't always speed up code. But that's not the point here -- it does make code execution time more predictable. At the same time, Intel can arbitrarily change the behavior on future processors, making it less predictable.
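
In C, you typically get a CMOV (or an equivalent masked sequence) by writing the selection branch-free yourself. A minimal sketch:

    #include <stdint.h>

    /* Branch-free select: returns a if cond is nonzero, else b. With
     * optimization enabled, compilers typically emit a CMOV or an
     * AND/OR mask sequence rather than a branch, so nothing leaks
     * through the branch predictor. Whether the timing stays constant
     * on future processors is, as noted above, up to Intel. */
    static uint32_t ct_select(uint32_t cond, uint32_t a, uint32_t b)
    {
        uint32_t mask = (uint32_t)0 - (uint32_t)(cond != 0); /* all-ones or zero */
        return (a & mask) | (b & ~mask);
    }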

The upshot is this: Intel's x86 is a high-level language. Coding everything up according to Agner Fog's instruction timings still won't produce the predictable, constant-time code you are looking for. There may be some solutions, like using CMOV, but it will take research.



Wednesday, March 18, 2015

Whatever it is, CISA isn't cybersecurity

In the next couple months, Congress will likely pass CISA, the Cybersecurity Information Sharing Act. This is a bad police-state thing. It will do little to prevent attacks, but do a lot to increase mass surveillance.

They did not consult us security experts when drafting this bill. If they had, we would have told them the idea doesn’t really work. Companies like IBM and Dell SecureWorks already have massive “cybersecurity information sharing” systems where they hoover up large quantities of threat information from their customers. This rarely allows them to prevent attacks as the CISA bill promises.

In other words, we’ve tried the CISA experiment, and we know it doesn’t really work.

Thursday, March 12, 2015

GitHub won because it's social media

Today Google shut down Google Code, because GitHub has taken over that market. GitHub won not because Git is a better version-control system, but because it became a social-media website like Facebook and Twitter. Geeks like me express ourselves through our code. My GitHub account contains my projects just like Blogger contains my blogs or Twitter contains my tweets.

To be sure, Git's features are important. The idea of forking a repo fundamentally changed who was in control. Previously, projects were run with tight control. Those in power either accepted or rejected changes made by others. If your changes were rejected, you could just fork the project, making it your own version, with your own changes. That's the beauty of open-source: by making their source open, the original writers lost the ability to stop you from making changes.

However, the community discouraged forking, because it split efforts: when forks became popular, some people would contribute to one fork while others contributed to the other. Popular open-source projects saw constant drama over the evil people who "hurt" them by forking.

But with Git, forking is now encouraged. Indeed, that's now the first step in contributing changes to a project. You fork it, make changes to your own version, then ask the original project to pull your changes from your fork.

This caused an explosion in social coding. Look at the average coder's GitHub account and you'll see a bunch of forked projects, plus a bunch of their original projects forked by others. For example, on my GitHub account, you'll see my Masscan project which 395 people have forked. You'll also see that I've forked and made a change to SecureDrop, a project for secure submissions by leakers to newspapers. I found a vulnerability, so I submitted a fix for it. The original project didn't accept my pull request, but instead just completely rewrote that part of the code.

Sometimes when I write blog posts, I include code. That code is on GitHub. When I hacked the Lenovo/Superfish key for example, I had to write a small password cracker for SSL certificate files. I just put it on GitHub. Others have forked it. Since it was a quick and dirty project, I put the comment "DON'T JUDGE ME" in the code. So somebody forked it and simply committed a change saying "...not judging". As I said: GitHub makes coding social.

As with blog posts, Facebook posts, or tweets, people can post comments. An example of this was a pull request to libuv (an important networking library) that simply changed a comment from using the gendered pronoun "he" to a neutral "they". This resulted in a long comment chain as people debated the change.

I sometimes write blogposts that go viral and get a million hits. I sometimes write tweets that go viral and get passed around everywhere. The same is true of GitHub. When I announced my Masscan project, it went viral, and was the "top trending project" on GitHub for a day. That they even track such a thing shows yet again how they are a social media site.

FedEx is famous for saying that what it really sells is procrastination. It's not that they can overnight something in an emergency; it's that you can wait until the last moment to send something. The same is true of the Internet. The tendency is to believe that a website is solely what it claims to be -- that GitHub won with better version control, as this Wired article claims. That's not true. GitHub won because it made the solitary task of coding extremely social. GitHub won because it enabled anti-social Asperger coders to express themselves through their code.

Tuesday, March 10, 2015

No, the CIA isn't stealing Apple's secrets

The Intercept news site by Glenn Greenwald is activism rather than journalism. Their stories don't reference experts knowledgeable about subjects, but only activists who are concerned about the subjects. This was demonstrated yet again in their piece claiming "The CIA Campaign to Steal Apple's Secrets". Yes, the Snowden documents are real, but pretty much everything else is made up.

Here's the deal. Terrorist leaders use iPhones. They are a status symbol, and status symbols are important to leaders. Moreover, since Apple's security is actually pretty good, terrorists use the phones for good reason (most Android devices suck at security, even the Blackphone). Getting software onto terrorists' phones, or their basebands, is an important goal of intelligence agencies.

When CIA drones bomb a terrorist compound, iPhones will be found among the bodies. Or, when a terrorist suspect is coming out of a dance club in Karachi, a CIA agent may punch them in the face and run away with their phone. However it happens, the CIA gets phones and wants to decrypt them.

Back in 2011, when this conference happened, the process of decrypting retrieved iPhones was time-consuming (taking months), destructive, and didn't always work. The context of the presentation wasn't that they wanted to secretly spy on everyone's phones. The context was that they wanted to decrypt the phones they were getting.

Yes, they want to get into specific iPhones. But they aren't succeeding in subverting the entire system as the Intercept story implies.

The Intercept's article quotes Chris Soghoian, a technologist from the ACLU, saying "If I were Tim Cook, I'd be furious". Soghoian doesn't know what he's talking about -- if anything, the reverse is true. If Tim Cook cares at all, it's glee over the CIA's difficulties, because Apple is winning the fight. Back with the iPhone 4, Apple made it prohibitively expensive to reverse-engineer secrets except in the direst of circumstances (like picking up phones from Bin Laden's compound). They've likely made it completely impossible with the iPhone 6. When the CIA comes for me, I doubt they will be able to lift any information from my iPhone.

The Intercept doesn't quote people who actually know what they are talking about. As I repeat over and over, for every Snowden document, there's some expert who has presented on that topic at BlackHat, DefCon, or a similar hacking/cybersec conference. There's no excuse for writing a story on these topics and quoting only activists like Soghoian rather than technical experts from those conferences. For example, a quick search for "BlackHat reverse engineering chips" leads to this presentation.

I point this out because another subject of that Intercept article was the trojaning of XCode, the Apple development tool used to compile iOS apps. A quick search would have turned up a BlackHat presentation by Errata Security's own David Maynor, in which he trojaned Microsoft's compiler, GCC, and a lesser-known compiler called LCC. There's no excuse for writing this story without reaching out to Maynor, or even to Ken Thompson, the co-creator of C and Unix who inspired compiler-trojaning.

Again with compilers, there's context that is carefully hidden by the Intercept story. The CIA isn't modifying the XCode that everyone uses; that would be impossible. They aren't trojaning the version Apple ships to developers. If you have XCode installed, no, you don't have to worry about the CIA. Nor is the CIA trying to sneak something into a popular app like Angry Birds. Instead, their goal is to target the hundred users of a hawala money-transfer app used almost exclusively by legitimate targets. The idea is a black-bag operation: break into the apartment of the teenager who wrote the app and backdoor his/her copy of XCode, so that all the app's users can be identified.

I mention this because when real journalists pick up the story, they give The Intercept credit as if The Intercept were real journalists who did their job reporting on the issue. That's improper. These journalists should either do their own reporting based on the raw documents, find independent sources to analyze the data, or just report that "activists are yet again upset over CIA/NSA activities", and leave out the manufactured message.



Monday, March 09, 2015

Some notes on DRAM (#rowhammer)

My twitter feed is full of comments about the "rowhammer" exploit. I thought I'd write some quick notes about DRAM. The TL;DR version is this: you probably don't need to worry about this, but we (the designers of security and of computer hardware/software) do.

There are several technologies for computer memory. The densest, and hence cheapest per bit, is "DRAM". It consists of a small capacitor for each bit of memory. The thing about capacitors is that they lose their charge over time and must be refreshed. In the case of DRAM, every bit of memory must be read and then re-written every 64 milliseconds, or it becomes corrupt.
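
For context, the published rowhammer technique boils down to a loop like the following sketch. The addresses here are placeholders -- finding a pair that maps to different rows of the same DRAM bank is the hard part. By reading two such rows over and over, flushed from cache each time, an attacker can disturb the row between them faster than the 64-millisecond refresh can repair it:

    #include <stdint.h>
    #include <x86intrin.h>   /* _mm_clflush(); GCC/Clang on x86 */

    /* Sketch of the rowhammer access pattern. row1 and row2 are
     * placeholder addresses assumed to hit different rows of the same
     * DRAM bank. The CLFLUSHes evict them from cache so that every
     * iteration goes all the way to DRAM, re-activating the rows. */
    static void hammer(volatile uint32_t *row1, volatile uint32_t *row2, long count)
    {
        for (long i = 0; i < count; i++) {
            (void)*row1;                      /* activate one row */
            (void)*row2;                      /* activate the other */
            _mm_clflush((const void *)row1);  /* force the next reads to DRAM */
            _mm_clflush((const void *)row2);
        }
    }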

These tiny capacitors are prone to corruption from other sources. One common source of corruption is cosmic rays. Another is small amounts of radioactive elements in the materials used to construct memory chips. So, chips must be built from materials free of radioactive contamination. The banana you eat has more radioactive material inside it than your DRAM chips.

Wednesday, March 04, 2015

Cliché: Safen Up!

RSA Conference is often a mockery of itself. Yesterday, they posted this tweet:

[Embedded tweet]

This is similar to the Simpsons episode where Germans buy the power plant. Fearing for his job, Homer (the plant's Safety Inspector) starts going around telling people to "Stop being so unsafe!".

Security is not a platitude; insecurity is not a moral weakness. It's a complex set of tradeoffs. Going around telling people to "safen up" will not improve the situation, but will instead breed resentment. Infosec people are widely disliked because of their moralizing.

The only way to be perfectly secure is to cut the cables, turn off the machines, thermite the drives, and drop the remnants in a deep ocean trench. Anything less and you are insecure. Learn to deal with insecurity instead of blaming people for their moral weaknesses.