What's a Wassenaar?
It's a town in the Netherlands where, in 1996, a group of nations agreed to an arms control pact. The agreement, the Wassenaar Arrangement, takes its name from the town; 41 countries now participate. The US, most of Europe, and Russia are members. Most of Africa, the Middle East, and China are not.
The primary goal of the arrangement is non-proliferation: stopping the spread of things like uranium-enrichment technology and chemical-weapons precursors. Another goal is to control conventional weapons, keeping them out of the hands of regimes that would use them against their own people or to invade their neighbors.
Historically in cybersec, we've complained that Wassenaar classifies crypto as a munition. Restricting the export of strong encryption makes it easier for the NSA to eavesdrop on and decrypt messages in the affected countries. This does little to stop dictators from getting their hands on strong crypto, but does a lot to prevent dissidents in those countries from encrypting their messages. Perhaps more importantly, it requires us to jump through a lot of bureaucratic hoops to export computer products, because encryption is built into virtually everything.
Why has this become important recently?
In December 2013, Wassenaar added cyberweapons to its control lists. On May 20th, the Commerce Department's Bureau of Industry and Security (BIS) proposed US rules to implement the Wassenaar additions. The BIS is currently accepting public comments on these rules.
The proposed BIS rules go beyond the simpler Wassenaar language, affecting a large number of cybersecurity products as well as cybersecurity research. The rules further restrict anything that may be used to develop a cyberweapon, which makes a wide range of innocuous products export-restricted, such as editors and compilers.
It's not that these rules will necessarily block the export of legitimate products, but that they create a huge bureaucracy that will apply the rules prejudicially and arbitrarily. It's easy to make mistakes -- and a mistake can cost a person 20 years in jail and a $1 million fine. This will create a huge chilling effect, even among those who don't intend to export anything.
What specific cyber-weapons is Wassenaar trying to restrict?
The arrangement added three categories of cyber-weapons.
The first is "intrusion malware". The canonical example is the malware FinFisher sells to governments like Bahrain's, which has been found on the laptops of Bahraini activists living in Washington D.C.
The second is "intrusion exploits". These are tools, including what are known as "0-days", that exploit a bug or vulnerability in software in order to hack into a computer, usually without human intervention.
The third is "IP surveillance" products. These are tools, like those sold by Amesys, that monitor a country's Internet backbone, spy on citizens' activities, and try to discover everyone that activists and dissidents talk to.
Wassenaar lumps both intrusion malware and intrusion exploits under the single designation "intrusion software". While the two are related, they are significantly different from each other; the BIS rules clarify this difference further.
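To make that difference concrete, here's a deliberately toy sketch in Python -- everything in it is hypothetical and harmless. The vulnerability is an unsafe eval(), the "exploit" is the crafted input that abuses it, and whatever that input executes is where an "intrusion malware" payload would live.

```python
# A toy illustration of the vulnerability/exploit/malware distinction.
# Nothing here resembles a real product; it's purely pedagogical.

def vulnerable_calculator(expression: str) -> object:
    # BUG: eval() on untrusted input lets that input run arbitrary code.
    return eval(expression)  # intentionally unsafe, for illustration only

# The "exploit" is input crafted to abuse the bug rather than do math.
# Its "payload" here is harmless: it just reads the current process ID.
crafted_input = "__import__('os').getpid()"

if __name__ == "__main__":
    print(vulnerable_calculator("2 + 2"))        # intended use: prints 4
    print(vulnerable_calculator(crafted_input))  # the bug lets input run code
```

Notice how little separates the pieces: ban the "exploit" and you've banned a string; ban the "malware" and you've banned whatever code that string happens to run. That's the line the regulations have to draw.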
Haven't I heard about 0-days/zero-days before?
The bulk of cybersecurity research is into vulnerabilities: software bugs that hackers can exploit in order to break into computers. Over the last 15 years, the relentless pursuit of these vulnerabilities has made computers dramatically safer.
When such bugs are first discovered, before anybody else knows about them, they are known as 0-days. Almost always, researchers give those 0-days to the appropriate company so that the company can fix the bug.
Sometimes, however, a researcher may instead sell the 0-day to the NSA, which can then secretly hack into computers using a bug nobody else knows about. Selling 0-days has been a big controversy in the community, especially since the Snowden affair.
It's perfectly legal for American researchers to sell 0-days to the Chinese government instead of the NSA -- which would presumably then use them to hack American computers. One goal of the Wassenaar agreement is to close this obvious loophole.
One of the controversial provisions of the export license is that companies/individuals may have to share their secret 0-days with the NSA in order to get a license.
Isn't stopping intrusion and surveillance software a good thing?
Maybe. Certainly companies like FinFisher and Amesys are evil, knowingly selling to corrupt governments that repress their people.
However, good and evil products are often indistinguishable from each other. The best way to secure your own systems is to attack them yourself.
That means things like bug bounties that encourage people to find 0-days in your software, so that you can fix them before hackers (or the NSA) exploit them. It means scanning tools that hunt for exploitable conditions on your computers, to find those bugs before hackers do. Likewise, companies run surveillance tools on their own networks (like intrusion prevention systems) to monitor activity and find hackers.
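To see how indistinguishable good and evil tools are in practice, here's a minimal port-scanner sketch in Python (the host and port range are placeholders; point it only at machines you own). Nothing in the code reveals whether the person running it is a defender auditing their own network or an attacker casing someone else's:

```python
# A minimal dual-use scanning sketch: the same TCP connect scan a defender
# runs to find exposed services is the one an attacker runs to find targets.
import socket

def scan(host: str, ports: range, timeout: float = 0.5) -> list:
    """Return the ports on `host` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means connect succeeded
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    # Audit your own machine; the very same call "attacks" anyone else's.
    print(scan("127.0.0.1", range(1, 1025)))
```

Whether that snippet is a defensive audit or an attack depends entirely on intent, which is precisely what an export rule can't see.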
Thus, while Wassenaar targets evil products, its rules inadvertently catch the bulk of defensive products as well.
Isn't stopping intrusion and surveillance software a good thing? (part 2)
Maybe. Here's the thing, though: cyberspace has no borders.
Normal arms control works because weapons are physical things that require a huge industrial base to produce. Not only the weapons themselves, but also the equipment and materials used to produce them, can be tracked. Even if the bad guys sneak the original weapons through, they will still struggle to keep smuggling the parts needed to keep those weapons working.
None of this applies to cyberspace. A single hacker working out of their mom's basement can create the next devastating 0-day. Right now, e-commerce sites block IP addresses from restricted countries, but those countries can simply have their ambassador in an unblocked country purchase the product.
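Here's a hypothetical sketch of that geoblocking logic (the country lookup is a stand-in for a real GeoIP database, and the addresses and country list are made up). The point is that the check keys entirely on the connection's source IP:

```python
# A hypothetical geoblocking check. EMBARGOED and country_of() are stand-ins;
# a real site would query a GeoIP database rather than a hardcoded table.
EMBARGOED = {"IR", "KP", "SY", "SD", "CU"}  # illustrative ISO country codes

def country_of(ip: str) -> str:
    """Stand-in for a GeoIP lookup; returns fake answers for the sketch."""
    return {"203.0.113.7": "KP"}.get(ip, "FR")

def allow_purchase(client_ip: str) -> bool:
    """Block the sale only if the source IP geolocates to an embargoed country."""
    return country_of(client_ip) not in EMBARGOED

if __name__ == "__main__":
    print(allow_purchase("203.0.113.7"))   # False: blocked by origin IP
    print(allow_purchase("198.51.100.9"))  # True: same buyer, routed elsewhere
```

Anyone who can originate the connection from an unblocked country -- an embassy, a VPN, a friend abroad -- sails straight through.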
That's not to say export controls would have no leverage. For example, these products usually require an abnormally high degree of training and technical support, which can be tracked. However, the little good export controls provide is probably outweighed by the harm -- such as preventing dissidents in the affected countries from being able to defend themselves. We know they do little good now, because we watch Bashar al-Assad brandish the latest iPhone that his wife picked up in Paris. Such restrictions may stop the little people in his country from getting things -- but they won't stop him.
Isn't there an exception for open-source?
Yes and no. Wassenaar explicitly exempts open-source code in theory. That means you can publish your code to GitHub knowing that corrupt governments will use it, without getting in trouble with the law.
However, there are situations where this doesn't apply. When security researchers discover a 0-day, they typically write a proof-of-concept exploit, then present their findings at the next conference. That means they have unpublished code on their laptops -- code they may make public later, but which is not yet technically open-source. If they travel outside the country with it, they have technically violated both the letter and the spirit of the export restrictions, and can go to jail for 20 years and be forced to pay a $1 million fine.
Thus, make sure you always commit your latest changes to GitHub before getting on a plane.
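For the diligent traveler, a tongue-in-cheek sketch of that advice (it assumes you're inside a git repository with a remote named "origin" already configured; the commit step will raise an error if there's nothing new to commit):

```python
# Publish work-in-progress before crossing a border, so the code is
# "publicly available" rather than unpublished cargo on your laptop.
import subprocess

def publish_before_boarding(message: str = "WIP: pushing before travel") -> None:
    subprocess.run(["git", "add", "-A"], check=True)
    subprocess.run(["git", "commit", "-m", message], check=True)  # errors if nothing to commit
    subprocess.run(["git", "push", "origin", "HEAD"], check=True)

if __name__ == "__main__":
    publish_before_boarding()
```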
What's the deal with security research?
One of the most vocal groups in opposition to Wassenaar is security researchers. That's because they are under attack by a wide variety of proposals in the current administration's "War on Hackers".
Proposed changes to the anti-hacking law, the CFAA, would technically make security research into 0-days illegal. Copyright law, the DMCA, is frequently exploited for the non-copyright purpose of suppressing researchers. The recent State of Emergency declaration would allow the government to unilaterally seize a security researcher's assets if it believed they helped Chinese hackers. And lastly, these proposed BIS rules would impose export restrictions on all security research.
Discovering vulnerabilities in products, especially products from prickly companies like Oracle and Microsoft, embarrasses those companies. They see the security researchers, rather than the hackers, as their primary threat, and they put a lot of pressure on government to do something about those pesky researchers.
What's the penalty for improperly exporting something?
Nobody knows, because the BIS gets to impose penalties arbitrarily. It could decide to send you a warning letter, or it could decide to send you to jail for 20 years with a $1 million fine. It's described somewhat here.
It seems good that the BIS can decide to simply warn you if you make a mistake, but the opposite is true. Such warnings go to people who play along, such as by sharing their 0-days with the NSA. Harsher punishments go to those who stand up against the system.
That's been a frequent criticism of anti-hacking laws: their punishments are unreasonably severe, and meted out in a prejudicial and arbitrary fashion. Those who annoy the powerful are the ones who get punished most.
Why this anger toward privacy groups?
Because they got precisely what they asked for.
Privacy groups have long attacked companies like FinFisher and Amesys, and have pushed for regulations to stop them -- sometimes explicitly for export restrictions. Now that those regulations are here, and their impact is obvious, the same privacy activists are complaining that the rules go too far -- and that they aren't responsible.
But cybersecurity experts have long warned of this, specifically that good and bad products are technically indistinguishable. Privacy groups have ignored these warnings. A good example is this post from Privacy International where they consider, then reject, the warning.
The feuding between privacy/rights organizations and cybersecurity researchers predates the Wassenaar debate. For example, Chris Soghoian, the Principal Technologist at the ACLU, calls 0-day sellers "merchants of death". 0-day sellers in turn call Soghoian a "fascist" for his attack on their free speech rights.
Smarter organizations like the EFF have consistently warned that drawing such technical distinctions in regulations is nearly impossible. However, they have still championed the cause that "something must be done" about FinFisher and Amesys, without taking a principled stand against government regulation -- at least not the same stand as cybersecurity researchers.
Are there other issues besides cybersecurity?
There are a couple of First Amendment issues. Code is speech, and in many ways these rules restrict code (though open-source code is untouched). Separately, the way restrictions and punishments can be arbitrarily applied gives the government leeway to punish those who speak up.
Conclusion
One thing to note is that the comments we want to make don't precisely match the questions the BIS is asking. For example, it asks, "How many additional license applications would your company be required to submit per year?" This has nothing to do with why people are up in arms over this proposal.
Update: The original version said that the DMCA was recently changed to attack researchers. The reverse is true -- it's long been used to attack researchers. Some are trying to clarify the rules to allow vulnerability research, but there is strong pushback from various quarters, like car companies.
I agree with what @CDA said on Twitter: the 15 CFR 734.3 exceptions for "publicly available technology and software" probably cover the scenario where the researcher hasn't published the code yet, but intends to. This is different from the "open source" exceptions that we're used to dealing with in the context of encryption. The 734.3 exemptions don't apply to encryption software; apparently, they do apply to "intrusion software", and they apply regardless of whether or not the thing being disseminated is open source.
The problem comes up where the researcher doesn't intend to publish the exploit -- they intend only to disclose it to the software vendor that makes the vulnerable product, and they are doing that across a border -- or where they sell the exploit to a bug bounty program. Bug bounty programs and coordinated disclosures play a big role in information security, and the government shouldn't be interfering with them.