So I was having lunch with one of the organizers of the Source Boston 2008 conference yesterday (Spicy Beef Bowl, mmmmm) and realized that this is going to be one really great conference. Not only are there big name speakers like Richard Clarke, Steven Levy and Dan Geer, there are some well respected security industry luminaries as well, like Carole Fennelly, Frank Rieger, James Atkinson and a host of others. But I think the big thing that will set this conference apart from the big ones like BlackHat or RSA (besides the fact that it is within driving distance for me) is the size. There won’t be tens of thousands of people in attendance, meaning you will probably be able to get a lot of one-on-one time with some of the smartest security minds in the country. If you’re in the Boston area at all you should probably stop by for a day or two, or even all three.
Oh, and the L0pht reunion panel is scheduled for Friday, the day after the Pub Crawl, which ought to be interesting.
I don’t have time for all of the stupidity out there, but this is just too stupid to let pass by. Easy Nova, a German company that makes a variety of computer storage accessories, recently released a hard drive case with 128-bit AES hardware data encryption and access control via an RFID chip. Which on the surface sounds really, really cool. Portable secure data, what more could you ask for? As it turns out you still need to ask for it to be secure, because according to Heise Online and c’t Magazine, despite the claims of AES hardware encryption the product actually uses simple XOR encryption to write your data! Evidently the AES is only used to encrypt the RFID signal between the drive and the key fob. AES for the RFID chip but XOR for the data? I mean, WTF! How about some truth in labeling? I suppose we should be happy they didn’t use double XOR.
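To see why XOR “encryption” with a fixed key is such a joke, here’s a minimal sketch (hypothetical, not the actual Easy Nova firmware): with a repeating-key XOR scheme, one block of known or guessable plaintext, like an all-zero disk sector, hands the attacker the entire key.

```python
# Hypothetical sketch of a repeating-key XOR "cipher" like the one
# c't describes -- NOT the actual Easy Nova firmware.

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Encrypt/decrypt by XORing data against a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = b"\x13\x37\xca\xfe"   # unknown to the attacker

# Attacker's view: a drive sector whose plaintext is predictable,
# e.g. a zero-filled sector or a standard file header.
known_plain = b"\x00" * 8
ciphertext = xor_crypt(known_plain, secret_key)

# Recover the key: ciphertext XOR plaintext = key (repeated).
recovered = bytes(c ^ p for c, p in zip(ciphertext, known_plain))[:4]
assert recovered == secret_key

# Now every other sector on the drive decrypts for free.
msg = xor_crypt(b"confidential data", secret_key)
print(xor_crypt(msg, recovered))  # b'confidential data'
```

Compare that to AES, where knowing any amount of plaintext/ciphertext pairs gets you nowhere near the key. That is the gulf between what the box promised and what it shipped.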
This is yet another example of a security product that isn’t secure. How is the consumer supposed to know? Not everyone has diagnostic labs and forensic tools at their disposal to test each and every product they buy for security. I’ve mentioned the formation of a Cyber UL before, and clearly it is sorely needed.
If a vendor finds a vulnerability in a competitor’s code, are they obligated to tell them? What exactly is ethical and/or responsible disclosure when it comes to competing vendors? Among security researchers the general consensus these days is to notify the vendor and then wait a reasonable amount of time for a patch to be developed before going public. While this scenario is for the most part agreed upon and followed, it is by no means a perfect solution. Now throw in competing vendors and it gets even stickier.
Recently the Mozilla group was notified of an exploit in their code, which they dutifully fixed. In the process they evidently realized that the same hole affected the Opera browser. Like good net citizens they notified Opera of the hole, but did not wait around for Opera to fix it before going public.
So is Opera justified in being a little miffed at Mozilla for not waiting for a fix, or should they be happy that they got notified at all? Should vendors be held to the same ethical standards as researchers when it comes to vulnerability disclosure, even if it is with a competitor’s product? And why have we had this same problem for decades without some sort of solution?
Looks like I missed this the first time around, but there is an article about a speech recently given by Peter Tippett, a VP at Verizon and a scientist at ICSA Labs, who talks about how useless most security actually is. Most of his points are ones that I have been making for years, like the uselessness of long complex passwords: all you’re doing is inconveniencing the user. Or how ineffective the continuous searching for, reacting to, and patching of new software holes really is when you consider that only a small percentage of those holes are ever exploited. Do you want the highest rate of return on your security dollar? Spend it on the weakest link: the people. Security awareness training, while hard to quantify, will provide the biggest return in terms of security. If you can train your users to think about security as part of their everyday work lives, your overall level of security will increase dramatically.
That’s right, the US Air Force is looking for a few good geeks. And evidently they are willing to relax a few of the requirements of military service to get them. According to this quote in Wired, Major General William Lord of the US Air Force’s Cyber Command said, “So if they can’t run three miles with a pack on their backs but they can shut down a SCADA system, we need to have a culture where they fit in.”
As a former Sergeant in the US Army (7th ID (Light)) I am pretty shocked at this statement. Military physical fitness standards are not that hard to achieve or maintain (especially in the Air Force). PT speaks to the very core of what it means to be a part of the military. When the Air Force needs pilots they don’t reduce or eliminate requirements; they offer cash bonuses for reenlistment. So what happens when there is a shortage of cooks? Or mechanics? Will we end up with a military that is too damn fat to get out of its own way? I don’t care what your job is, cyber warrior or not: if you’ve made the commitment to serve your country then you can make a commitment to pass a damn PT test.
Ever hear of CIPAV? It is some pretty bad-ass spyware that tracks every website, every chat, every email that you send from your computer. Maybe you know it by its more common name, Computer and Internet Protocol Address Verifier. Sounds pretty official for a piece of nefarious software. Guess what: it is the software used by the FBI. Which is OK, I guess; the FBI needs investigative and forensic tools, don’t they? But what do they do with all this data they collect? Who are they collecting it from? Do they need, or even attempt to get, a warrant when they use it? Why is it such a big secret? These are just a few of the questions about this secret program that have come up since its existence was first discovered. Now it looks like the FBI actually asked the FISC (that’s the secret court that rubber-stamps eavesdropping warrants for the FBI) if they could use the software. Looks like they have been using this stuff since 2005! No idea how it gets installed or whether AV software will catch it. I’m all for the FBI and other agencies having the tools they need to do their jobs, but there is no reason why it needs to be all secret and cloak-and-dagger. How about a little oversight?
The folks over at CanSecWest will once again be hosting their popular PWN to OWN contest at this year’s con. I wrote about last year’s contest, which was won after a spl0it was found in QuickTime that allowed the attacker to PWN the Mac laptop. This year they are also putting up an Ubuntu box and a Vista box. They haven’t mentioned what the configuration will be, what apps will be installed, etc., but it doesn’t really matter. This exercise will prove nothing other than that the CanSecWest organizers know how to be media whores (hey, even I’m writing about it). Even if one or two of the boxes get owned it will not prove that one OS is more secure than another. OS security is proven (or disproven) by the types and severity (not the number or frequency) of vulnerabilities found over the long term. So while this contest will likely get a lot of press, especially if someone is successful and owns one of the boxes, in the long run it really doesn’t mean anything.