Opinion | Contributing Op-Ed Writer
For computer security professionals, 2018 started with a bang. A new class of security vulnerability — a set of flaws, named Meltdown and Spectre, that affect almost all major microprocessor chips, and that could enable hackers to steal information from personal computers as well as cloud computing services — was announced on Wednesday. The news prompted a rush of fixes, ruining the holiday vacations of system administrators worldwide.
For an ordinary computer user, there is not much to panic about right now. Just keep your software updated so you receive the fixes. And consider installing an ad-blocker like uBlock Origin to protect against ads that carry malware that could exploit these vulnerabilities. That is about all you can do.
However, as a citizen of a world in which digital technology is increasingly integrated into all objects — not just phones but also cars, baby monitors and so on — it is past time to panic.
We have built the digital world too rapidly. It was constructed layer upon layer, and many of the early layers were never meant to guard so many valuable things: our personal correspondence, our finances, the very infrastructure of our lives. Design shortcuts and other techniques for optimization — in particular, sacrificing security for speed or memory space — may have made sense when computers played a relatively small role in our lives. But those early layers are now emerging as enormous liabilities. The vulnerabilities announced last week have been around for decades, perhaps lurking unnoticed by anyone or perhaps long exploited.
Almost all modern microprocessors employ tricks to squeeze more performance out of a computer program. A common trick involves having the microprocessor predict what the program is about to do and start doing it before it has been asked to do it — say, fetching data from memory. In a way, modern microprocessors act like attentive butlers, pouring that second glass of wine before you knew you were going to ask for it.
But what if you weren’t going to ask for that wine? What if you were going to switch to port? No problem: The butler just dumps the mistaken glass and gets the port. Yes, some time has been wasted. But in the long run, as long as the overall amount of time gained by anticipating your needs exceeds the time lost, all is well.
Except all is not well. Imagine that you don’t want others to know the details of your wine cellar. It turns out that by watching your butler’s movements, other people can infer a lot about the cellar. Information is revealed that would not have been exposed had the butler patiently waited for each of your commands instead of anticipating them. Almost all modern microprocessors make these butler movements, with their revealing traces, and hackers can take advantage.
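The butler analogy can be made concrete with a toy model. The sketch below is hypothetical and deliberately simplified — it is not the actual cache-timing mechanism behind Meltdown and Spectre — but it shows the same principle: when the *number of steps* an operation takes depends on secret data, an observer who can measure those steps can recover the secret without ever reading it directly. Here an early-exit string comparison plays the role of the butler, and the step count is the observable trace.

```python
# Toy side-channel model: an early-exit comparison "leaks" how many
# characters of a guess were correct, via the number of steps it takes.
# (Hypothetical illustration only; real attacks measure CPU cache timing.)

SECRET = "port"  # a hypothetical secret the attacker wants to recover


def compare_with_step_count(guess, secret=SECRET):
    """Compare guess to secret, bailing out at the first mismatch.
    Returns (matched, steps) — the step count is the side channel."""
    steps = 0
    for g, s in zip(guess, secret):
        steps += 1
        if g != s:
            return False, steps
    return guess == secret, steps


def recover_secret(length, alphabet="abcdefghijklmnopqrstuvwxyz"):
    """Recover the secret one character at a time: the guess that makes
    the comparison run longest has the right character at that position."""
    known = ""
    for _ in range(length):
        best_char, best_steps = None, -1
        for c in alphabet:
            # Pad with '?' (known not to occur in the secret here) so a
            # correct character is always followed by a mismatch.
            guess = (known + c).ljust(length, "?")
            matched, steps = compare_with_step_count(guess)
            if matched:
                return guess  # full secret found
            if steps > best_steps:
                best_char, best_steps = c, steps
        known += best_char
    return known


print(recover_secret(len(SECRET)))  # → port
```

The attacker never inspects the secret itself — only how long each comparison takes. Defenses work the same way in miniature as the hardware fixes discussed below: make the observable behavior independent of the secret, at some cost in speed.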
There has been a rush to fortify our computing systems, and it may work for the moment. But at best, potential temporary fixes will entail a performance cost, since they involve rolling back strategies for optimizing performance. And since the problem is built into the hardware — billions of chips that cannot easily be replaced — fixing this class of problems may also be prohibitively expensive.
At worst, these fixes are too late. The vulnerabilities announced last week were found by three independent teams whose investigations converged on the same flaw at the same time. It is possible that less-responsible actors were also converging on this flaw and may have already succeeded in exploiting it.
Modern computing security is like a flimsy house that needs to be fundamentally rebuilt. In recent years, we have suffered small collapses here and there, and made superficial fixes in response. There has been no real accountability for the companies at fault, even when the failures were a foreseeable result of underinvestment in security or substandard practices rather than an outdated trade-off of performance for security.
There are better ways to make systems more secure. For example, we can build more isolation and separation into our systems, moving security functions to properly audited hardware and away from software (which is always more vulnerable). Things cannot be hacked if they cannot be reached. This may mean that we have to sacrifice some speed for security.
But the truth is that our computers are already quite fast. When they are slow for the end-user, it is often because of “bloatware”: badly written programs or advertising scripts that wreak havoc as they try to track your activity online. If we were to fix that problem, we would gain speed (and avoid threatening and needless surveillance of our behavior).
As things stand, we suffer through hack after hack, security failure after security failure. If commercial airplanes fell out of the sky regularly, we wouldn’t just shrug. We would invest in understanding flight dynamics, hold accountable the companies that failed to follow established safety procedures, and dissect and learn from new incidents that caught us by surprise.
And indeed, with airplanes, we did all that. There is no reason we cannot do the same for the safety and security of our digital systems.