The Fallout from WannaCry
There was a joke going around thirty years ago, not a very good joke but, like any two-edged sword, one that cut both ways: Israel was a “one disk” country, meaning that everyone copied software from their friends and nobody paid for it.
At that time there was not much worry about computer security; there were no smartphones (the BlackBerry was just emerging), and the Internet existed but was not the gargantuan edifice it is today.
Copying at that time was mostly a problem for the music industry, though as processors, storage, and memory improved it also became a worry for film producers who feared losing revenue. Still, these were early days.
Today much of the fraud in the computer business is illegally copied software. Big American companies, and probably big companies in Europe and some in Asia, are careful to use only licensed software because of the fear they might get caught pirating software from commercial vendors. But smaller companies are less inclined to worry about such things and, in some countries, stealing commercial software is quite common, even for major industries including banking.
That is why it is so interesting that Russia and China experienced a large number of ransomware attacks recently as part of the WannaCry outbreak. In Russia there are a great many users (probably including some in government agencies) who run pirated software. One of the problems with pirated software is that you cannot easily keep it up to date: in most cases, updating requires presenting a registered, authenticated copy to the software vendor. If yours is illegal, you don’t do that, or perhaps you try to figure out what the patch or update is and install it yourself. By and large this left computers in Russia heavily exposed to the ransomware attack, which angered Vladimir Putin, who, partly correctly, blamed the NSA in the United States for his troubles.
It is not just Russia, of course. There are four reasons why WannaCry became such a threat. These are:
- The underlying exploit was developed by the NSA, and WannaCry was built on top of that NSA spyware. For some time the NSA most likely used the exploit for espionage or other clandestine cyber operations; at that point, all of it was secret. But after Snowden, the NSA should have realized that many of its treasured spyware tools were exposed. It should have quietly worked with vendors such as Microsoft to build patches. Instead it delayed some seven or eight months, leaving the door open to hackers.
- Microsoft did indeed work out a patch, but did not patch older software, especially Windows XP, which still runs on a great many machines worldwide. It is quite true that Microsoft had told its customers, a few years earlier, that it would no longer support Windows XP. But for consumers this meant they would have to go out and buy an upgraded version of Windows and perhaps also upgrade all their other software, especially custom software built around XP. So they didn’t. Microsoft could, alternatively, have charged for patching XP as a way to compensate for the lost revenue of continued support, but perhaps it was not interested in that business model. In any case, XP users were left in the lurch.
- Even when patches are available, one of the continuing problems facing computer users is keeping them up to date. Unfortunately, when you buy an operating system or any other software, you assume it is built to stand up to security threats. The opposite is true. Not only is most software full of holes right off the shelf, but the problems multiply rapidly as hackers probe for hidden vulnerabilities. In software there is no real warranty: even the support a vendor gives its software is essentially voluntary. Just read the lawyer’s statement stuck in the software box, with so many disclaimers that the bottom line is you are on your own. To add to the problem, IT staffs are not always diligent in maintaining systems, since much of their time is spent fixing bugs on workstations and crashes on the server. Unlike desktop or laptop machines, which can benefit from automatic updates (if the user opts to allow them), servers generally have to be shut down for patches to be applied, disrupting workflow. Thus patching often has to happen when the rest of the organization is shut down, provided of course that it gets done at all. To all this one can add that even automated updates pose security problems, not only for government but for industry as well. Essentially you invite the software vendor into your machine in an entirely laissez-faire manner: what happens next you don’t know, but tons of information could potentially be lifted from your server. This alone makes automatic updating very dangerous.
- Finally, there is a lot of software running on machines that wasn’t exactly paid for. It could be, though it is by no means certain, that the WannaCry ransomware had its biggest impact outside the United States (where resort to pirated software seems to have diminished). It would be interesting to be able to prove this point (I cannot). But one suspects that, as in Russia and China, WannaCry exposed many not-so-legal installations even in good-sized businesses and organizations around the world.
So what are the lessons (if any)? I suggest the following:
- The NSA and other government agencies that proactively use cyber intrusions for national security reasons must put in place rules to better protect against discovery of their operations, including plans to mitigate any resulting threat, especially to critical infrastructure. Clearly, the NSA was laggard and mindless in protecting the public at large and, even worse, left the nation exposed to real danger. It can and must do better in the future.
- Because software is not designed for security -- at best security is an afterthought -- the vulnerabilities and risks pile up, and while vendors may try to fix things as they go along, in many cases they are late to the party. For critical infrastructure applications this is a risk too far. As I have been advocating, the time has come for the U.S. government to sponsor a classified operating system and suite of user software built from the ground up for security and available only to critical infrastructure operators. Doing this is well within the reach of technology, but it requires leadership. Because Washington almost always bows in the direction of the big software and hardware companies, progress has been slow at best and often nonexistent. WannaCry demonstrates that Washington has disregarded all the warning signs and merrily spent billions on security that does not work, clinging to commercial off-the-shelf software that is a security nightmare. If this does not change soon, Washington could leave our entire governmental, banking, communications, food supply, and military systems at risk of attack and instant meltdown. If you are looking for a cyber Pearl Harbor, here it is.
- International enforcement against cyber intrusions is weak and in many cases nonexistent. Probably no one will catch the WannaCry cyberthugs, not because no one wants to, but because they are protected by interests bigger than the hackers themselves. In the case of WannaCry, the betting is on North Korea as the ultimate culprit. So what can you do? One could certainly hit the North Koreans proactively with a MOACA (Mother of All Cyber Attacks), but we won’t. Washington isn’t that interested, except to complain, and our allies are worse. If the Russians had gotten mad at the North Koreans instead of at the NSA, maybe something would have happened to “discourage” them from these reckless attacks. But that did not happen either. Most analysts have felt for some time now that there is, as yet, no punishment that fits the crime (to crib indecently from Gilbert and Sullivan). Considerable work needs to be done so that we can strike back at the perpetrators and their sponsors, and do so in real time.