2008-08-07

Day 5, Black Hat Briefings

Electronic Discovery: John Benson
I didn’t care for Mr. Benson's speech; it was disorganized and didn’t really go anywhere.
Here are the factoids John discussed:
1.) To lawyers, information technology processes are barely a blip on the collective legal radar. They only know of a handful of massive screw-ups, of which they have read less than a paragraph, consisting of:
a. Document metadata, such as Word's track-changes feature and data embedded in picture files.
b. The extent of their knowledge of electronic discovery is securing the backup tapes.
2.) According to John, a large percentage of lawyers assume the world is paper-based and computers are not used.
3.) The discovery process is outlined during the Rule 26(f) meeting, which occurs 99 days before the trial begins. (In short, you have a little over three months to secure the requested data.)
4.) The lawyers will ask for a “Data Map,” which means something different to everyone.
5.) Don’t worry about compliance until you get sued; electronic discovery (according to John) isn’t worth the cost of changing all of your processes.

In short, sit down and pick your legal counsel’s brain to nail down exactly what the other party really wants. If they ask you to secure the backups, it means your counsel doesn’t understand what the other party wants. It’s also possible the other party doesn’t know what it wants and is issuing a blanket request.

The rest of the information that leaked out was about the American common-law justice system and how it works.

John mentioned the Cowtown Computer Congress (apparently his brainchild). I will have to visit and see what that is all about.

DNS Spoofing Exploit:
(This guy won a Pwnie Award for pwning the media.)
Basically, the DNS server uses insufficient randomness in its transaction IDs, so nothing prevents someone from spamming every possible ID with a bogus DNS response.

Also mentioned was the fact that DNS would often allow an attacker to bypass a non-stateful firewall and perform the attack.

The trick is to cause the target DNS server to go out and seek DNS information from an external DNS server, then race the legitimate reply with forged ones.

In short, this is highly dangerous and needs to be patched ASAP. He has been working closely with vendors to help them update.
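
To see why such a small ID space is so dangerous, here is a rough back-of-the-envelope sketch in Python (my own illustration, not code from the talk). The numbers reflect the classic 16-bit DNS transaction ID and the source-port randomization that the coordinated patch added:

```python
# A minimal sketch (my own illustration, not the presenter's code) of why a
# 16-bit transaction ID alone is weak: an off-path attacker who can trigger
# a lookup simply floods forged replies until one guess matches.

TXID_SPACE = 2 ** 16   # classic DNS: only the 16-bit transaction ID to guess
PORT_SPACE = 2 ** 16   # the fix: also randomize the UDP source port

def chance_of_hit(guesses: int, space: int) -> float:
    """Probability that at least one forged reply matches the real query."""
    return 1 - (1 - 1 / space) ** guesses

print(f"10,000 forged replies vs. TXID only:   {chance_of_hit(10_000, TXID_SPACE):.1%}")
print(f"10,000 forged replies vs. TXID + port: {chance_of_hit(10_000, TXID_SPACE * PORT_SPACE):.4%}")
```

With only the transaction ID to guess, 10,000 forged packets give roughly a 14% chance per attempt; randomizing the source port multiplies the search space by another factor of ~65,000 and drops that to a fraction of a percent.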

Virtualization
The problem with mass implementations of virtualization is that the software switch has zero network-administration features and is not fault tolerant. No available virtual security package can provide this functionality without adding a huge load to your virtual infrastructure. The virtualization environment by itself can't be monitored for malware or other malicious software infections without APIs and products that won't be available for six months to two years, and even then only at great cost.

In short, virtualization is still far too young to be deployed in a secured environment; it is better suited to small networks with light loads and existing physical infrastructure.

Possible Bluetooth 2.1 Specification implementation issues:

BT 2.0 pairing uses a PIN, which is vulnerable to an offline brute-force attack. BT 2.1 has the same issue if the manufacturer does not enforce the use of a different key every time.
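
As a generic illustration of why a short, reused PIN falls to an offline attack, here is a Python sketch. Note that derive() is a hypothetical stand-in, not the real Bluetooth E21/E22 key-derivation math; the principle is the same either way:

```python
import hashlib

def derive(pin: str, nonce: bytes) -> bytes:
    """Hypothetical key derivation from the PIN and a captured nonce."""
    return hashlib.sha256(pin.encode() + nonce).digest()

# Values an eavesdropper could record from a single pairing exchange
# (both made up for this example).
nonce = bytes.fromhex("01020304")
observed = derive("4711", nonce)   # the victim's PIN, unknown to the attacker

# Offline search: a 4-digit PIN space is only 10,000 candidates, so the
# attacker never has to touch the radio again after the initial capture.
for candidate in (f"{i:04d}" for i in range(10_000)):
    if derive(candidate, nonce) == observed:
        print("PIN recovered:", candidate)
        break
```

The key point is the word "offline": once the exchange is recorded, every candidate PIN can be tested locally at full CPU speed, which is why reusing the same key across pairings is fatal.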

Debian OpenSSL Vulnerability
(Pwnie Award)
In short, the developers said: “Sorry, our bad, but we did catch it, just a little too late.”
The commented-out code supplied three of the four sources of entropy used to seed key generation, so the seed was reduced to the current process ID, roughly 32,000 possible values. Since the PID changes every time the program runs, the numbers produced still appeared to be random.
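
Here is a minimal sketch (my own illustration, not the actual Debian/OpenSSL patch) of the bug's effect: with the process ID as the only remaining entropy source, the entire keyspace can be enumerated in advance.

```python
import random

PID_MAX = 32768  # the default Linux PID range, hence the ~32,000 figure

def weak_key(pid: int) -> int:
    """Hypothetical stand-in: 128 bits of 'key material' seeded only by PID."""
    return random.Random(pid).getrandbits(128)

# An attacker can precompute every key the broken generator could ever emit.
all_possible_keys = {weak_key(pid) for pid in range(1, PID_MAX)}
print(len(all_possible_keys), "keys cover the entire keyspace")
```

Each individual key still looks random, which is why the bug survived for so long; the weakness only shows up when you realize how few distinct keys are possible.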

I think the presenters misread their data, which said that, of their sample of sites, only 140 (as of the release of the vulnerability) were actually vulnerable.

The real number is of course higher, because that count did not include workstations.

If either host in a connection has one of these bad keys, the encryption is nullified.

See also my post on this subject.

Keynote:

Older people have developed a cynical view of technology. I can agree with part of that: people who are unaware of the past are doomed to repeat it. The speaker hit on this theme and brought up excellent points that need to be addressed. A manager manages things and leads people, and the speaker warned of the danger of trying to manage people too, as well as of missing the distinction between installing a tool and installing a system. To install a tool is to introduce a tool made for a purpose. A system is created not by the tool, nor by the act of creating the tool, but by the people using the tool to perform a purpose.
Management policies only cover a limited stretch of the bell curve, so when a situation falls outside the process we are in hazardous territory. Rather than stopping and treating the new situation as new, we try to match it to a similar situation for which there is a process, and that is a bad approach. Web applications are perfect as developed, so long as all the input thrown at them is within the realm of what was originally anticipated; the problem with buggy software is that it fails to adapt to new situations in a constructive fashion.

The speaker also noted that people try to use technology to fix bad systems, when in reality the existing system is simply handed the new tool and will use it in whatever way suits its purpose.

In short, IT is the constant making of tools for the system to use in promoting its purpose. So long as a functional process can make use of a tool to increase productivity and other positive ends, technology is successful; if the system is broken, then no tool in the world will solve the problem.

The speaker's point was to stop trying to use technology as a replacement for existing systems and to work with the system itself.
