2017 felt like the year the security blanket was yanked off the entire technology world. There were so many deep, dark revelations, each a reminder that there is no safe harbor: unless you spend ALL your time on security, you and your company are basically just waiting for a determined hacker to put a target on your back. Security by obscurity is looking very appealing to a lot of people these days.
It came from all levels: the apps, the operating systems, and now the processors themselves.

That last statement should make you shiver. There is already practically a processor in everything, and in the near future there will be multiple processors in everything, including your shirts(!). Fifteen years ago, there was a war about the best way to build processors: RISC vs. CISC. That competition produced many innovations, but the innovation that spanned both approaches was speculative execution. Speculative execution is basically a processor's way of eating its cake and having it too: when a processor reaches a fork in the road where only one path is valid, it doesn't wait to find out which one. Because it is a computer and not a human, it can take both paths simultaneously, and when it finally figures out which path was valid, it essentially ghosts/deletes the invalid one. It's like some kind of time-travel stuff, but in binary.

It turns out that this simple strategy speeds things up pretty dramatically. Time is money, and the processor no longer needs to mark time waiting for expensive, time-intensive memory accesses to resolve. Well, it turns out this entire approach is insecure. You can read about it here, but the TL;DR is that the speculation itself exposes memory that may belong to other programs, in ways that are hard to defend against. There is a related vulnerability called Meltdown that breaches the boundary between the kernel and user space, but that one is mostly an Intel flaw and should be much easier to fix (easy being relative, of course).

The entire industry is struggling to patch this vulnerability, but it's not going to be easy. Lots and lots of deployed processors are hard to reach. More troubling, the fixes will hurt performance. Remember when I said that speculative execution was about speed? Well, it turns out that when you fix an endemic flaw in a strategy that speeds processors up, you inevitably slow them down. The biggest impact is in cloud infrastructure, where people chain machines together to run things at scale. In shared environments, this increases the risk that someone, or some program, in an adjacent compartment could break into your cloud machines. If you have customers, that prospect is chilling. Even if you ignore the potential litigation, the ethics of it are horrendous.

Maybe the only way to eliminate this risk is to not put your programs in a shared cloud environment. I'm pretty sure this will occur to just about everyone, and it may slow cloud growth for the foreseeable future. Or maybe the cloud guys will figure out how to reassure people about that risk. But I dunno, it seems like no one has a handle on risk in tech anymore. Who knows what other fundamental flaw is around the corner? When you can basically do the equivalent of disproving gravity, all things are up for grabs.

Don’t mean to be a downer, but man, be very afraid.
