Archive for April, 2016

Computer System Security Policy Debate (Follow-up)

As a follow-up to my recent post on the debate in the US over new encryption restrictions, I thought a short addition might be relevant, as the debate continues.

There was a recent Congressional hearing on the topic that featured mostly what you would expect: police want access to any possible source of evidence, the tech industry tries to explain that the risks of mandating such access are excessive, and grandstanding legislators are sprinkled throughout.  What I found interesting (and I use that word with some trepidation, as it is still a multi-hour video of a Congressional hearing) is that there was rather less grandstanding and less absolutism from some parties than I was expecting.

There is overwhelming consensus that these requirements [for exceptional access] are incompatible with good security engineering practice

Dr. Matthew Blaze

The challenge is that political people see everything as a political/policy issue, but this isn’t that kind of issue.  I get particularly frustrated when I read ignorant ramblings like this that dismiss the overwhelming consensus of the people who actually understand what needs to be done as emotional, hysterical obstructionism.  Contrary to what seems to be that author’s point, constructive dialogue and understanding values does nothing to change the technical risks of mandating exceptional access.  Of course the opponents of Feinstein-Burr decry it as technologically illiterate; it is technologically illiterate.

This doesn’t quite rise to the level of that time the Indiana state legislature considered legislating a new value (or, in fact, multiple values) for the mathematical constant Pi, but it is in the same legislative domain.

Future of secure systems in the US

As a rule, I avoid writing publicly on political topics, but I’m making an exception.

In case you haven’t been following it, the senior Republican and the senior Democrat on the Senate Intelligence Committee recently announced a legislative proposal misleadingly called the Compliance with Court Orders Act of 2016.  The full text of the draft can be found here.  It would effectively ban devices and software in the United States that the manufacturer cannot retrieve data from.  Here is a good analysis of the breadth of the proposal and a good analysis of the bill itself.

While complying with court orders might sound great in theory, in practice this means these devices and software will be insecure by design.  While that’s probably reasonably obvious to most normal readers here, don’t just take my word for it, take Bruce Schneier‘s.

In my opinion, policy makers (and not just in the United States) are suffering from a perception gap about security and how technically hard it is to get right.  It seems to me that they are convinced that technologists could just do security “right” while still allowing some level of extraordinary access for law enforcement if they only wanted to.  We’ve tried this before and the story never seems to end well.  This isn’t a complaint from wide-eyed radicals that such extraordinary access is morally wrong or inappropriate.  It’s hard-core technologists saying it can’t be done.

I don’t know how to get the message across.  Here’s President Obama, in my opinion, completely missing the point when he equates a desire for security with “fetishizing our phones above every other value.”  Here are some very smart people trying very hard to be reasonable about some mythical middle ground.  As Riana Pfefferkorn’s analysis, which I linked in the first paragraph, discusses, this middle ground doesn’t exist, and all the arm waving in the world by policy makers won’t create it.

Coincidentally, this same week, the White House announced a new “Commission on Enhancing National Cybersecurity“.  Cybersecurity is certainly something we could use more of; unfortunately, Congress seems to be heading off in the opposite direction, and no one from the executive branch has spoken out against the bill.

Security and privacy are important to many people.  Given the personal and financial importance of data stored in computers (traditional or mobile), users don’t want criminals to get hold of it.  Companies know this, which is why both Apple iOS and Google Android encrypt their local file systems by default now.  If a bill anything like what’s been proposed becomes law, users who care about security are going to go elsewhere.  That may end up being non-US companies’ products, or US companies may shift operations to localities more friendly to secure design.  Either way, the US tech sector loses.  A more accurate title would have been the Technology Jobs Off-Shoring Act of 2016.

EDIT: Fixed a typo.