Two of the largest companies in the world are bickering with each other about how best to protect users. I won’t get into just how historically hypocritical this is for both Microsoft and Google, or how childish it makes them both look, but it brings up a debate that has been raging in security circles for over a hundred years, starting way back in the 1890s with the release of locksmithing information. An organization I was involved with, L0pht Heavy Industries, raised the debate again in the 1990s as security researchers started finding vulnerabilities in products.
In the beginning there was full disclosure, and there was only full disclosure, and we liked it. In the beginning the goal was to get stuff fixed; it wasn’t about glory, it wasn’t about bug bounties, it wasn’t about embarrassing your competition. No, in the beginning it was about getting bugs fixed. It was about getting the software that you used, the software you deployed to your users, fixed and safe. However, in the beginning vendors didn’t see it that way, and many of them still don’t. Vendors would ignore you, or purposely delay you. There is no money in fixing bugs that no one else is complaining about, so most vendors wouldn’t fix them, at least not until the bugs became public and all of their customers started to complain. That was the power of full disclosure.
Vendors of course hated full disclosure because they had no control over the process; in fact, there was no process at all, and so they complained, vociferously. Vendors talked about ethics and morality and how full disclosure helped the bad guys. So a guy named Rain Forest Puppy published the first Full Disclosure Policy, promising to report vulnerabilities to vendors privately first, but only so long as the vendors promised to fix things in a timely manner. If the vendor didn’t get stuff fixed, the researcher could still pull out their most effective tool, full disclosure, to get the job done.
But vendors didn’t like this one bit, and so Microsoft developed its own policy and called it Coordinated Disclosure. Coordinated Disclosure calls on researchers to work with the vendor until a fix can be released, regardless of how long that takes. Under Coordinated Disclosure there is no option for Full Disclosure at all. Of course, Coordinated Disclosure assumes that the vendor is even interested in fixing the bug in the first place.
The problem that many companies with vulnerability disclosure policies, such as Microsoft, fail to realize is that they are not the ones in control. Vendor disclosure policies are not binding on the researcher. It is the researcher’s choice whether or not to follow a company’s disclosure policy. Vendor policies work great for the vendor, giving them all the time in the world to fix a bug, but for researchers who want to get stuff fixed such policies can be a major pain to work within.
Disclosing vulnerabilities isn’t an easy thing. In the mid-nineties at L0pht Heavy Industries we quickly found that vendors had absolutely no interest in fixing bugs at all and would prefer that we just keep our mouths shut. A lifetime later, it was part of my job to help coordinate the disclosure of vulnerabilities found by our pentesters with the various vendors involved. If you’re a lone researcher with only one vulnerability, it’s not such a big deal: you send a few emails, wait a little while, and if the vendor is cooperative a fix is pushed out in a few days’ or months’ time. If you happen to have several dozen vulnerabilities that you are attempting to get fixed, all at the same time, and all by different vendors, the process is a little more involved. In fact, simply coordinating these disclosures can be a full-time job for multiple people within an organization. There is no ROI here either; the ‘simple’ process of disclosing vulnerabilities eats up revenue as your employees spend their time coordinating with vendors and trying to get stuff fixed.
In 2009 several researchers found the disclosure process so onerous that they started the “No More Free Bugs” campaign and promised not to report any more vulnerabilities for free. In response, vendors started bug bounty programs where they rightly paid researchers for their hard work. However, even that process comes at a cost for both the vendor and the researcher, so much so that there are now third-party companies that will help vendors run bug bounty programs and help researchers disclose vulnerabilities.
Of course there are still vendors who refuse to fix stuff or who wait forever to do so. According to Tipping Point’s Zero Day Initiative there are currently 212 known security vulnerabilities without fixes, several of which are over a year old. It is likely that the only way any of these ancient bugs will get fixed is by pulling out the old standby of Full Disclosure. In fact, Tipping Point has threatened to do just that, giving vendors just six months to get stuff fixed before it publishes limited details on the bugs.
This has all led us to the point where Google has a disclosure policy that basically says they’re going full disclosure in 90 days whether the bug is fixed or not. And the point where Microsoft is asking for just a few more days so they can include the fix with their regular Patch Tuesday release. Two big kids who should be setting the example are instead acting like a couple of teenagers on the playground. How does any of this get stuff fixed and protect users?
This is why you see many companies and individual researchers not disclosing anything at all, and that should not happen. And I haven’t even gotten into the issue of vendors filing lawsuits against researchers as a means to keep them quiet.
The entire process has gotten out of hand. The number one goal here should be getting stuff fixed, because getting stuff fixed helps protect the user, helps defeat the bad guys, and helps make the world a better place.
Microsoft says that full disclosure “forces customers to defend themselves,” which is the wrong way to look at it. Full disclosure allows companies to defend themselves if they so choose. The opposite is non-disclosure, which helps no one. Just because a bug hasn’t been disclosed doesn’t mean it is not there. It doesn’t magically pop into existence only when someone publishes something about it. The bug is there, waiting to be found. Maybe the bad guys have already found it. Maybe they are already using that bug against you. And yet you are blissfully unaware that the bug even exists. Full disclosure gives you knowledge that you can use to protect yourself even if a patch is not available. You can choose to turn off the affected device, or add additional protections to your environment to help mitigate the risk. Once disclosure happens the choice is yours: you can evaluate the risk this particular bug presents to your environment and make an educated decision about what steps to take based on your own risk tolerance. Most users will continue on blissfully unaware or choose to ignore the information, but that too is their choice, not Microsoft’s, and not Google’s.
Google’s goal of getting everything they find fixed within three months is laudable but unrealistic. Some bugs just take a little bit longer to verify, develop patches for, and test. It is not unreasonable to be a little flexible if you feel the vendor is working in good faith to develop a patch. To arbitrarily go full disclosure when you know the vendor has a patch just days away is immoral. It puts users at risk and makes you look like a stubborn child.
In this particular case both the vendor and the researcher are wrong. Microsoft obviously communicated the status of the fix to Google and told Google when to expect the patch. It is not unreasonable for Microsoft to ask for a few extra days, and it should not be unreasonable for Google to grant such a request. On the other hand, I am sure Google informed Microsoft that it would only wait 90 days before going full disclosure; Microsoft knew the risk of full disclosure and should have pushed harder to meet the deadline.
And so the disclosure debate that has raged for over a hundred years continues unabated. With two of the giants in our industry acting like spoiled children, we as security professionals must take the reins from our supposed leaders and set a better example. It needs to be about protecting the user. It should not be about grandstanding or whining or even making a buck; in the end it should be about getting stuff fixed.
UPDATE 2015.02.13
Google has made an update to its 90-day disclosure deadline. They have decided to make allowances for deadlines that fall on weekends and holidays and, more importantly, have granted a grace period for vendors who communicate their intent to release a patch within 14 days of the 90-day deadline. It is nice to see vendors and researchers working together. The goal here should be to protect users, not embarrass vendors. This grace period shows an understanding of the issues surrounding disclosure that impact vendors while at the same time continuing to hold them to a high standard.
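To make the updated rule concrete, here is a minimal sketch of the deadline arithmetic as described above, written in Python purely for illustration. It is not Google’s actual tooling, and the names (disclosure_deadline, next_business_day, HOLIDAYS) are my own: a 90-day clock from the report date, rolled forward past weekends and holidays, with a 14-day grace period when the vendor has communicated that a patch is on the way.

```python
from datetime import date, timedelta

# Hypothetical sketch of the policy logic as described in the update;
# not Google's actual implementation.

HOLIDAYS = set()  # populate with observed holidays as date objects

def next_business_day(d):
    """Roll a date forward past weekends and any listed holidays."""
    while d.weekday() >= 5 or d in HOLIDAYS:  # 5 = Saturday, 6 = Sunday
        d += timedelta(days=1)
    return d

def disclosure_deadline(reported_on, vendor_promised_patch=False):
    """Compute the effective disclosure date for a reported bug."""
    deadline = reported_on + timedelta(days=90)
    if vendor_promised_patch:
        # Vendor communicated intent to ship a fix within 14 days of the
        # 90-day mark, so extend by the grace period.
        deadline += timedelta(days=14)
    return next_business_day(deadline)

# Example: a bug reported on 2014-11-15 where the vendor promised a patch.
print(disclosure_deadline(date(2014, 11, 15), vendor_promised_patch=True))
```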
—
Interested in reading more?
Microsoft’s latest plea for VCD is as much propaganda as sincere – OSVDB
Microsoft blasts Google for vulnerability disclosure policy – CSO Online
A Call for Better Vulnerability Response – ErrataSec