Friday, December 31, 2010

IPv6 deployment guides have been released by NIST

I really wasn't planning on posting anything until the new year, but this story in the SANS NewsBites prompted me to post this up.  NIST has posted a final version of its "Guidelines for the Secure Deployment of IPv6."

This has ramifications in a bunch of checklists that DoD auditors will be looking at.  From experience though, I know of many checks in the Application Security and Development checklist where this will have impact.  Specifically, you can look at: V-16781, V-6164, V-6165, V-19706, V-19707, V-19708, V-19709, V-16829, and mentioned in V-16822 and V-16836.

Saturday, December 25, 2010

Merry Christmas

I saw this on Roger's blog....pretty funny.

Garfield.....Merry Christmas

Thursday, December 23, 2010

This video reminds me of the legendary Dead Parrot Skit

I saw this linked on AndyITGuy's site.  Hysterical.  I've been a big Monty Python fan for a while, and this video reminded me of the Dead Parrot skit.  I hope you enjoy.

It's been a while

I looked at my last post, and was a little surprised to see that it was dated in November.  Yeesh.  It has been a very busy month and change for me.  There have been multiple trips to a particular client in order to help them ramp up security of their IP.  And, of course, the write-up of those trips.  Then, I had to write up the test results from a classified enclave....always fun when you have to write up the findings when you did not actually test the system.  It's very hard to write mitigations and rationales when you haven't seen the system and what makes it up.  And, to top it all off, I had data and a laptop dropped off on my desk from a test that occurred a couple of weeks ago.  I've only skimmed through the data, and it is incomplete at best.  From what I can gather, the test was more of a dog-and-pony show for some high-up brass.

I've updated a couple of previous posts with recent pertinent information.

There may be some interesting news concerning content on the site in the coming weeks.  Hopefully I can keep my musings to a minimum and get some actual content up.

Enjoy any holiday break you might take.

Friday, November 19, 2010

Open Source Forensics

This is more a mental note than anything else.  I wrote down the address the other day, but when I looked through my news feed, I couldn't find where I saw the original announcement.  Anyway, a resource for open source forensics has been developed.

I will probably compose a post on my "essential" tools at some point.

Thursday, October 28, 2010

Ovaldi initialize error

When auditing systems that run a Microsoft operating system, I use Ovaldi to find patch management issues.  (I understand it will run on *nix-based systems, but I've never tried it.)  I have it scripted as part of a larger script that performs other host-based scanning and configuration gathering.  Very occasionally, I'll see in my results directory that Ovaldi did not run for a particular server.  (Typically, it is servers where I find missing results.  Rarely, I see the problem on workstations.)

Today, I was looking through the OVAL documentation when I came across the following:

Also, on some Windows systems, the OVAL Interpreter may fail with the following error message when executed.

“The application failed to initialize properly (0xc0150002). Click OK to terminate the application.”

This error message occurs when the run-time components of Visual Studio, that are required to run an application developed with Visual Studio, are not installed. If you receive this error message while executing the OVAL Interpreter, please install the VC++ redistributable package that can be obtained at the following link.

The VC++ redistributable package will install the required run-time components.

As a third-party auditor, I do not add the VC++ redistributable package as I do not want to introduce potentially new vulnerabilities to the system.  And, I do not want to break anything else.

Wednesday, October 27, 2010

Uptick in webmail spam messages

Lately, I've been receiving more and more spam in my inbox, and the message usually contains a single URL.  Most likely, that URL leads to a site that is heavily poisoned or that attempts to steal personal information.  It's happened to some of my friends just recently, and they've asked for help.  (What they've done about the issue, I do not know.)

In my limited analysis, it appears that their email accounts have been hacked, and someone/thing is using the accounts to pump out spam.  I haven't been able to do a root cause analysis, so I don't know if it is the machine that they are logging on to that is infected, or if there is another vector.  The latest article I've seen on the problem is listed here:

Hacked web mail accounts used to send spam

My response (when asked by my friends) has been, first, to fully scan the computer with anti-virus software that has current definition files to ensure that there is nothing obvious on the system; and second, to change the password to the webmail account from a computer that is known to be free of malware.  From the friends that have taken this advice, I've heard good results.  But, short of fully analyzing a machine, I really don't know what's there.

Is there more to it than this?  Is there a bigger problem?  If you have any answers, leave them in the comments.

Monday, October 11, 2010

SCAP-based process

It's been a while since I posted anything.  For one, we were waiting for the fiscal year to end to see what proposals we would be awarded.  Two, after weeks of slowness, I just got back from a big audit.  It was interesting because it was as if they did not want us there.  We were holed up in a back conference room, our contacts went out of their way to ignore us, and we found lots of different machines/technologies/platforms that we were not expecting.  (At least, they didn't tell us about them before we got there.)  I know, shocking.  I don't know if it is because they don't want to pay for more work, or they are just ignorant about their network.  Granted, there was virtually no documentation, and we STILL do not have a network diagram.

While working this contract, we are working on updating our testing process.  I don't think it is a secret that DISA is getting out of the business of producing Gold Disks.  Personally, I think they want to get out of the tool development business altogether.  I foresee DISA maintaining the STIGs and requirements, but I do not see them developing tools to test those requirements.  To that end, we've been working on how we will test those controls in the future, and we're looking at SCAP-based products.  We'll see how this goes.

Friday, September 10, 2010

Old email address humor

This is probably old, but I just saw it today.  Funny stuff.

What your email address says about you


Wednesday, August 25, 2010

Can Security be harmful?

This week's SANS NewsBites (Vol. 12, Num 67) has a story about the potential role of security (or the lack thereof) in the Spanair crash that killed 154 people.  According to the post, the official cause of the crash was pilot error.  The investigation also discovered that a warning indicator did not activate.  These events would have been logged in the company's maintenance system.  It has been alleged that the maintenance system was riddled with malware.  Could this be a case where not patching a system indirectly led to deaths?

Recently, I've audited systems and applications that reside in medical treatment facilities.  One system was responsible for the delivery of radiation to patients.  The vendor stated that they are the only authority allowed to administer patches to the system, as they need to test each and every patch before it can be released into production so as not to endanger a patient.  They talked about one particular case where the pushing of patches by a medical treatment facility caused a system to administer too much radiation.  And, had it not been for the diligence of an alert technician, the results could have been fatal.

Granted, the day-to-day security decisions and risk analyses we make are not going to be that critical.  Heck, just driving to work each day we go through a risk analysis.  Sure, there's a risk that we could get in an accident, but it's not that high and we accept it.  But when it comes to mission-critical systems, or systems that are deemed of high importance, a well thought-out risk analysis could be what averts (or causes) a dire situation.

Friday, August 13, 2010

Vulnerable Web Applications for testing and practice

I'm working on a small presentation on web application testing.  To get the points across, I want to have an application where the students can actually try the attacks and see the results, as I find this more effective than PowerPoint slides.  Knowing only a handful of the more popular applications, I started searching.  Google gave me more than I could imagine, and I'm listing a bunch of them here.

This first group are actual applications to be installed:
OWASP Insecure Web App Project
Damn Vulnerable Web App
Hacme Travel
Hacme Bank
Hacme Shipping
Hacme Casino
Hacme Books
The Butterfly Project
Stanford SecuriBench
BodgeIt Store

Live sites (hosted on the internet):
SPI Dynamics
Acunetix (php)
Acunetix (asp)
NT Objectives

If I have missed a good one, please let me know.  I haven't picked one yet, I'm still evaluating.  But I'll add to the list as I hear of and try more applications.

edit: 4-19-2011 added BodgeIt Store

Tuesday, August 10, 2010

Free Monitoring Tools for Systems and Networks

The SANS Internet Storm Center has a great post today that I'm linking to in order to come back to it, as it lists some great monitoring tools (free or inexpensive) for various operating systems.

Monitoring Tools

As there are a lot of posts at the Storm Center, I'm sure I'll lose this one if I don't create a link to it.  Hopefully it will help out someone else.

Sunday, July 11, 2010

Book Review: SQL Injection Attacks and Defense by Justin Clarke

Wow!  What a great book.  But let me put this in perspective first.  As a DoD auditor, I do many audits where the enclave/system has a web server with application(s) residing somewhere on a web/application server.  Many times these are GOTS applications that need auditing under the auspices of the Application Security and Development checklist.  That checklist is rather large, and covers the whole spectrum of application development.  A good portion of the checklist is manual in nature: either looking at configurations, checking documentation, or interviewing the appropriate personnel with regards to the application.  There are a few controls for which we run an automated tool looking for some of the major vulnerabilities (command injection, XSS, SQLi, limited buffer overflows, temp files, error messages, etc.)

I bought this book so I could understand the underpinnings of SQL injection: why it exists, how to make it work, and, to some degree, how to fix it.  (I don't need to fix the findings; however, it's nice to be able to help the client mitigate open vulnerabilities.)  I have prior SQL experience (coming from the DBA/DBD world), so the concepts of SQL are quite familiar to me.  And of course, when most people think of SQLi, they think of the familiar ' or 1=1.  I wanted to know more, and I wanted a more advanced knowledge.  Sure, we run tools, but I wanted to be able to confirm our tools' output.  And, I wanted the ability to perform the analysis manually.  One other caveat worth mentioning: many of the systems I audit are CAC-enabled, which the tools we use do not handle too well.  It's easy to manually log in, authenticate, and manually exploit/search for vulnerabilities.

I admit it, I bought the book mainly for chapters 2, 4, 5, 6, and 7.  These chapters talk about testing for SQL injection, exploiting SQL injection, blind SQL injection, exploiting the operating system, and advanced topics, respectively.  And those chapters were great.  I satisfied my objective to learn how to really accomplish the task of finding and somewhat-exploiting injectable fields.  (We're not hired to pen-test, so we can't go nuts with a finding once we find it.  The fact that the vulnerability is there is enough.)  Chapter 7 discussed advanced topics, and contained a sub-section on finding second-order vulnerabilities.  The topic presented made me think of this post, and other uses for SQL injection.

However, I got so much more from the book.  Chapter 3 discussed reviewing code for potential SQL injection.  What to look for, types of data, frameworks, and static code analysis tools were all discussed.  This was extremely beneficial because, while we do not do code reviews, I can better advise the client on how to improve their code reviews, and what they should be looking out for (at least in a general sense.)

Chapter 8 is a great chapter on how to remediate findings by fixing code, both in the application and in the database.  Chapter 9 talked about platform-level defenses and remediations that can be used to harden the servers and operating systems that house the data, whether web server or database server.  Many of the points mentioned in Chapter 9 are controls found in the Application Security and Development checklist.  These are measures that, if employed, will directly satisfy requirements in the checklist.  Finally, chapter 10 includes a great reference on SQL and SQL injection.  As a former DBA/DBD, I could skip the SQL primer.  The chapter also covers some of the databases that the book did not delve into, namely PostgreSQL, DB2, Informix, and Ingres.  There are also cheat sheets for the topics discussed throughout the book.

This is certainly one of the best books I've read in a while.  I really could not find much that detracted from the book.  Go into the book knowing that the topics are geared towards MS SQL Server, Oracle, and MySQL; as they should be, since those are the three most widely used databases.  Be prepared to learn a lot.  I'm looking for vulnerable web applications that I can install and use to reinforce the concepts so that I am better prepared when I go on audits.  I highly recommend the book, whether you are an application/SQL developer, an auditor, or a pen-tester.

Monday, July 5, 2010

Thanks for the iTunes Gift Card, but, no thanks

The other day I received a note with a file attached.  My email server stripped the payload and left me with an empty zip file.  Before examining the .zip, I uploaded it to VirusTotal, but because it was empty, nothing came back.  I would have liked to have seen what it was, but I'm glad the AV is working on the mail server.  I'm sure the headers on the email were spoofed, if the address was even valid in the first place.  But, the mail came from:  "iTunes Store".  If anyone can tell me what the actual virus/worm was, I'd be curious to know.

Thursday, July 1, 2010

What Works Conference and what to do

With a little more than a week to go, it looks like I won't be able to attend this year's What Works in Forensics and Incident Response conference coming up next week.  I really wanted to attend, but it looks like the company is scaling back on conferences (no BlackHat this year, just DefCon.)

Which leads me to thoughts I've been having lately.  I've been reading and working a lot on web application auditing and testing, and I'm finding I'm pretty good at it.  Yet, I still love Incident Response and Forensics.  While I practice IR and forensics with my own company, I can't quite get it into the other company.

Tuesday, June 15, 2010

Time sensitivity of data

SANS NewsBites had this post in today's email:
--Judge Disallows Evidence Gathered From Laptop Six Months After Seizure
(June 10 & 14, 2010)
A US federal judge has ruled that evidence gathered in June 2009 from a
laptop computer seized at a US border crossing in late January 2009 may
be suppressed.  Andrew Hanson was randomly selected for secondary
baggage search in January 2009.  Hanson is a US citizen who was
returning from South Korea to the US through San Francisco.  An image
of child pornography justified seizure of his laptop; a subsequent scan
of the hard drive several weeks later turned up more evidence.  However,
the laptop's contents were not viewed again until June 2009.  The judge
allowed evidence discovered on the laptop in early February 2009 because
the search was conducted within a reasonable time frame.  The judge
determined that evidence obtained during the June search, which was
conducted without a warrant, was inadmissible; a search so long after
the fact requires a warrant.
This is an interesting concept that I had not considered in the forensics arena.  In the government space, the ACA that we work for has said that data we acquire is only valid for six months; after that, we would need to re-test a system up for accreditation.  And I agree with that concept fully.  Many times we will test a system, come back with data, analyze it, and write it up.  We'll get questions up to a year later on what we wrote.  And how valid are those questions?  Many times, the system has changed so much that the initial report is almost invalid (and sometimes it is; hopefully the system owners have made changes.)

So, to see this in the forensics space is interesting.  At what point does data become invalid?  Certainly, an image of a system should be valid as a point-in-time snapshot for a long time; almost indefinitely.

Granted, given the circumstances around the original story, I think that there is more to it.

Thought-provoking none-the-less.

Monday, May 31, 2010

MS Access SQL Injection

I have a testing trip coming up that involves a web application built using MS Access on the backend.  I've just gotten Justin Clarke's great book SQL Injection Attacks and Defense.  There doesn't seem to be a treatise on SQL injection for MS Access, but there are some good sources.  I know that Access doesn't support the ', so I'm working on other methods.  Already, I have information on the application, and based on what I've discussed with the developers, the app has to be "injectable"; it's just a matter of where.  Some of the MS Access SQL injection resources I've been going through include:
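As a rough illustration of the quote-balancing approach (my own sketch, with a hypothetical table and field name, not an example from the book): Jet/Access SQL has no comment sequences (--, #, /*) to discard the rest of a statement, so a payload has to leave the query syntactically complete on its own:

```
Original query:    SELECT * FROM tblUsers WHERE username = '[input]'
Payload:           ' OR 'a'='a
Resulting query:   SELECT * FROM tblUsers WHERE username = '' OR 'a'='a'
```

The trailing quote already present in the application's query closes the last string literal, so nothing needs to be commented out.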

Saturday, May 29, 2010

Online Penetration Testing class

I haven't decided yet how to spend my education budget.  I really want to attend the SANS What Works in Forensics and Incident Response summit, but I think it's too much for the company to pay for.  Then, the other day, I was reading Darknet, and I saw their posting on the eLearnSecurity online penetration testing class.  The price is pretty good, and I would really like to learn the network pen testing segment.  As I've been doing a lot of web application auditing lately, I wouldn't mind learning the tips and tricks of web applications, as I think it will only help my auditing skills.

If you've taken any of the classes by eLearnSecurity, I'd like to hear your feedback.

Wednesday, May 5, 2010

Hakin9 magazine now free

This post might be a bit late.  I originally posted/mused about Hakin9 back in January of 2008.  Now, I see it has become a free e-zine.  Head to their site and sign up for the newsletter.  At the time of this post, you are able to download the current issue in .pdf form.

Friday, April 23, 2010

Facebook changes to make

Facebook has made changes, yet again, to its privacy settings and data usage restrictions.  You might want to check your settings to see what's new, and what's being shared.

Here's a good link to help "reset" some of the changes Facebook made to your account.

I suspect my older post here is now outdated.  I think some of the main ideas may still be relevant; implementing them will just take more steps.

Monday, April 12, 2010

Web form incident

I received a call today for an interesting incident. Bear in mind that the customer doesn't have an incident response policy, but I think that is going to change.  It seems a staff member received an "anonymous" email that, while technically not threatening, was certainly personal, mean and inappropriate.

The staff member forwarded me the email, with all relevant headers.  Even though there was a "from" address on the email, I know that email addresses can be spoofed.  However, digging through the headers, I found an "admin"@company.subdomain email address.  Thinking the website might have been tampered with, I perused the web site's directories.  The site has only 20 or so static pages with one contact form.  Thinking contact form, I contacted the webmaster to see how the sub-domain actually worked.  After some digging, I learned that there is one script on the sub-domain that processes the contact form.  Bingo.  Looking at the contact form, I saw that an email address is not required.  So, I tested sending the form without entering an email address, and I was able to replicate the incident.

I am now working with them to fix two issues:  1) There needs to be a documented incident response policy, such that the client is protected.  2) The website needs to address how to handle submissions without an email address.

The security ball is rolling, so hopefully good things can come of the incident.

And, while we may be able to get the IP of the person that submitted the form, I'm not sure what that will buy us.

Friday, April 2, 2010

Book Review: Seven Deadliest Web Application Attacks by Mike Shema

I have been doing more and more testing of applications, and most of those applications have been web applications.  I think it is the nature of the beast.  Sure, sites have servers to manage their client workspace, and they handle mail.  But, if they have a web server and/or database server there's a good chance that they are hosting a web application.  Whether the organization "knows" it or not.

Last year I worked on the big application project where we had to test north of 80 applications in a very short time period.  My eyes were really opened during that project.  Since then, more and more of the sites we visit have web applications.  I've taken it upon myself to improve my application (and by extension, web application) testing.  While the DoD has the Application Security and Development Checklist as a guide, actually testing the controls is not the easiest.  So, I've been trying to delve deeper into application testing.

I picked some books to start going through, and some of them are listed on my reading list.  The first one I've gotten to is Seven Deadliest Web Application Attacks by Mike Shema.  One of the reasons I picked this book is because the book just came out, literally, last week.  So, the book is current.  Mike is not a stranger to security or web applications and has a couple of books on web application security.

I have to say that I really liked the book.  Mike's style is easy to read, and he's able to convey the knowledge and the points in each chapter concisely and in a manner that facilitates remembering the facts and putting them to use.  There are seven chapters that Mike focuses on:  Cross-site Scripting (XSS), Cross-site Request Forgery, Structured Query Language (SQL) Injection, Server Misconfiguration and Predictable Pages, Breaking Authentication Schemes, Logic Attacks, and Web of Distrust.

Each chapter goes into detail about the specific attack.  There are many good examples, code snippets, and current stories about relevant attacks in the real world.  Many of the attacks have analogies to reinforce the concept using non-technical language to put the attack in a different frame of mind.  The big three (XSS, CSRF, and SQLi) make up the first three chapters.  I admit, I come from more of a DBA background (from before my security days), so I didn't focus on SQLi as much.  However, I still picked up knowledge from the SQLi chapter.  We test for XSS a lot, and I was familiar with the information; but of the big three, I learned the most from the CSRF chapter.  Server Misconfiguration delves into attacks that are based on applications and servers that are not as hardened as they should be.  Breaking Authentication discusses different methods to subvert and get around authentication.  I liked the chapter on logic attacks because it employed thinking outside the box.  These attacks were not the result of fundamental coding errors; they were errors in the way the application was designed.  Finally, the last chapter (Web of Distrust) compiled a bunch of different attacks.  Malware, scareware, referer attacks, HTML, and DNS were all given sections.

Each chapter starts out by defining the specific attack; what it is composed of, how it works, and how to employ it.  The chapter also devotes a section to employing countermeasures and how to mitigate the specific attacks.  Finally, a summary puts it all together.  I really liked the small sections titled "Epic Fail" where a specific attack is given a real-world example.  Also, there are many URLs given for tools that are discussed and for points for further information.  There are a couple of tools that I'm actively researching and looking to use that I learned about from the book.

The book is not a cookbook or a how-to on performing the attacks.  While lines of code are given as examples, there are not step-by-step instructions on how to successfully exploit the attack.  And that was fine for me; that wasn't my goal.  There are other books out there; in fact, there are whole books devoted to some of the chapters.  So, if you are looking for step-by-step instructions, this book is not for you.

I rate the book very highly and would recommend it to anyone wanting a good overview and jumping off point for the seven major web application attacks.  I've brought the book into my office and hope others will get as much out of it as I have.  You will not go wrong in reading the book if you want to learn what the major attacks are, how they work, and some mitigations to lessening the threat of these attacks.

Monday, March 29, 2010

New Leatherman tools

I've always liked the Skeletool CX by Leatherman, though I've never actually gotten one. I keep a pair of wire snips, four different screwdrivers, pliers, and tweezers in my laptop bag. Having a good set of tools comes in handy all the time. However, a post on Wired's Gadget Lab shows what appear to be even smaller versions of the Skeletool. On the left is the Style CS, and on the right the Style. Heck, I just noticed that they have a Freestyle CX, which is between the Skeletool and the Style.

image from the Wired post.

Saturday, March 27, 2010

Using Foremost to recover files from a dead hard drive

A client gave me a 250 gig hard drive that wouldn't boot any more. I was hoping it was a problem with Windows, such that I could image it and move on. However, when I tried imaging the drive, it would fail after 145 gigs of imaging. I tried this a couple of times and was able to repeat the fail at the 145 gig mark. Without a physical image, I wasn't able to pull out the logical partition. However, the client was asking what word documents I could pull off the machine.

So, with an image (as complete as I could make it) I decided to carve out what I could find. I edited the foremost.conf file to uncomment the "doc" file type. Following that, I ran foremost:

foremost -o /path/to/foremost/output -c /path/to/foremost.conf /path/to/image

This bombed right away. I shouldn't say that it bombed; rather, it brought back many files, and most of them were huge, quite obviously not Word documents. Taking a look at the documentation, I decided to add the -q switch, which starts the search for files on sector boundaries. This produced more files, but all of them were unreadable...at least, I couldn't read anything meaningful from them. I took another look at the foremost.conf file and some postings on the internet and found that the ole type has automatic extraction, and I would not need the config file. My final command was:

foremost -q -t ole -o /path/to/foremost/output /path/to/image

This carved out plenty of Word files for me. I'm going to try carving jpgs in a few minutes. One spec I haven't found is for Word 2007 files (docx) or Excel files. If you have a config that can be used in a foremost.conf file for those formats, I'd appreciate it. Just leave a comment.
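For what it's worth, docx and xlsx files are ZIP containers, so a carve rule keyed on the ZIP local-file header is one possible starting point. This is my own untested sketch, not a vetted config: the size cap is a guess, a signature this generic will also carve ordinary .zip files, and docx can't be told apart from xlsx by the header alone:

```
# foremost.conf sketch: carve OOXML (Office 2007) files by their ZIP signature
# PK\x03\x04 is the ZIP local-file header; no reliable footer, so rely on max size
docx	y	10000000	\x50\x4B\x03\x04
```

If your build of foremost includes a built-in zip type, running with -t zip may catch the same files without a config entry.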

Tuesday, March 23, 2010

Security quote

I flipped the page in my planner the other day, and at the top was the following quote:
Security can only be achieved through constant change, through discarding old ideas that have outlived their usefulness and adapting others to current facts.
- William O. Douglas
The quote is appropriate given what we do. User education is critically important. And, as threats evolve, we need to evolve with them; and pass that knowledge on. We've seen that viruses are not necessarily the major threat they once were, when viruses wreaked havoc and were a nuisance. Now, there are concerted efforts by malware writers to hide their intentions while attempting to get the user to give up valuable information. As the bad guys have adapted to make use of more advanced tactics, so have the warriors. The cat and mouse game will continue for as long as technology moves forward.

Monday, March 15, 2010

Script to help find unknown or rogue sql servers

When I'm auditing a lan/enclave/data center, one of my test scripts produces a "netstat -naob" of the machine. (I do this by running "netstat -naob > [machinename]_netstat.txt"; machinename gets populated from an environment variable.) Sure, I understand it is a point-in-time snapshot of the machine, but it gives me a good idea of what is running. And it's nice to have the output in a file in order to review later.

I like to check for SQL servers because they traditionally get installed by a user as part of an application. Sure, many times it is the "lite" version of SQL Server. But that is not always the case. And, we all know how vulnerable a SQL server can be, and its need for extra care and feeding.

Typically, I'll collect all of my netstat files in one directory. Then, I run the following script:

for /R "c:\documents and settings\me\desktop\NetstatFiles" %f in (*_netstat.txt) do findstr /M "sqlservr.exe" "%f" >> sqlseeker.txt

Each of my netstat files is prefaced by the name of the machine being tested. sqlseeker.txt is a file that contains a list of all netstat files that contained a line where a SQL server was found to be listening.
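If you pull the collected files over to a *nix analysis box, the same sweep is a one-liner with grep -l, which prints only the names of files containing a match (the equivalent of findstr /M). A sketch with hypothetical sample captures:

```shell
# Hypothetical sample data: two netstat captures, one showing SQL Server listening
mkdir -p NetstatFiles
printf 'TCP 0.0.0.0:1433 LISTENING 1234\n[sqlservr.exe]\n' > NetstatFiles/SERVER01_netstat.txt
printf 'TCP 0.0.0.0:445 LISTENING 4\n[System]\n' > NetstatFiles/WKSTN02_netstat.txt

# grep -l lists only the files that contain the string, one per line
grep -l "sqlservr.exe" NetstatFiles/*_netstat.txt > sqlseeker.txt
cat sqlseeker.txt    # prints NetstatFiles/SERVER01_netstat.txt
```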

I'm sure there could be a false positive or two, but it gives me someplace else to look for rogue or unknown SQL servers.

Sunday, March 14, 2010

Corrupted registry entries

I received a laptop the other day that was horked beyond belief. I was only able to run the anti-virus once before the machine died. And try as I might, I was unable to boot the machine back up. Using a Linux live-CD, I was able to verify that the drive was physically good and in working order, with no bad blocks or sectors. However, upon booting, the BIOS screen displayed, then nothing. Blackness. If I tried safe mode, I could see a bunch of dlls loading, but the machine continually hung on one dll.

I went to a faithful standby, Trinity Rescue Kit. I ran all of the tools (that were appropriate), and still nothing. Most of the errors seemed to be "illegal access" errors, or "logic not supported."

Finally, I decided to use an old version of the registry. I followed the directions on this Microsoft page, and BINGO: I was able to reboot the machine. Fortunately, I had copied the user's Documents and Settings folder, so I was able to put the files back.

Some software will probably have to be re-installed. And the machine is not the most stable. But it is a real old XP (Media Center) laptop, and could probably stand to be updated to Windows 7.

Saturday, March 13, 2010

Peach Fuzz training

This past week I received two days of Peach Fuzz training from the author, Michael Eddington. This is a great fuzzing tool that is extremely powerful, yet extensible and flexible. Michael did a great job teaching the class; it probably helps that he's the author of the program. For those of you doing pen testing or research into application bugs, this program is for you. We used it in class to find (known) bugs in a few applications, but the possibilities are endless.

However, as a DoD auditor, I just don't see the use. I won't have time on a client site to get this up and running, as there is so much more for me to do while there. We are usually cramped for time, with many different technologies and platforms to test. And really, contractually, I don't believe we are authorized to pen-test. We run web application scanners, but we cannot exploit the vulnerabilities we find.

Awesome tool, though. And I'm glad I was given the opportunity to attend the class.

Thursday, March 4, 2010

Helping the crooks to your valuables

I was out west at my last testing engagement. We finished early, and I was able to visit with family and relax. We were taking a hike in a park when we unexpectedly reached a parking lot. The sign in the parking lot was so unbelievable I had to take a picture. I was using my Droid, so the picture didn't come out the best.

The sign said: "Notice! For our protection, secure your valuables in trunk." I think the "y" was removed; it should have read "For your protection." Either way... why don't you just leave the key?

Sunday, February 21, 2010

DoD allowing USB thumb drives again

Wired's Danger Room blog has a post announcing that the DoD is allowing USB thumb drives, effectively ending the DoD-wide ban. Ostensibly, the ban has been lifted only in certain circumstances, and not for everyone; it is also not supposed to be easy to get a drive. As an auditor, though, I've been to installations where USB drives were in rampant use.

None of the ACAs that our company does work for has sent out any kind of "official" notification. It will be interesting to see what actually comes out, and when.

Is it me, or is the TSA getting stricter on what we can carry?

After this trip, I will have been testing two of the last three weeks. (And I've already got a trip planned in March.) On the last trip, and again on this one, my laptop bag was given the extra once-over by the TSA. Last time, the TSA rep in Kansas City took my bag apart and let me know the problem was with my cable tester. I've traveled many times with the cable tester, and only now did it cause a problem. When I asked what the specific problem was, I was told that it "contained a scary image on the x-ray machine."

On this most recent trip, it was the hub that caused the commotion. I didn't get an explanation; I didn't ask. As an incident responder at heart, I like to have everything with me that I might possibly need. The lesson learned is that I'm going to have to check most of the tools that I keep in my bag. I've always packed and checked my hand tools (snips, screwdrivers, etc.), but it looks like I'll be checking more of the gear from my bag as well. (Truth be told, I don't mind shedding the pounds.)

My question is: is the TSA getting stricter, or have I just gotten lucky in the past?

Thursday, February 18, 2010

Web App Testing Environment

Since I have been testing more and more networks/enclaves/systems that have web applications as components, I've been trying to get more involved in the web application environment. I was browsing OWASP's site when I came across the OWASP Live CD for testing web applications. Later, I found out that one of my co-workers is actively working on the project. How cool. It's great that there is a group looking at streamlining web application testing and making it more efficient. The project site is worth a look.

I'm thinking of taking the SANS 542 class, and then attempting the exam. If you have any feedback on the class, I'd like to hear about it.

Monday, February 15, 2010

Web Application Testing

With the DoD, we've done much more testing of web applications in the last year. When I started with the company almost two years ago, this was not the case. Frequently, we would get on site, test the web and database servers, and move on. I can't ever remember testing the content of those servers. Generally, the reason I was given (by the senior testers) was that we couldn't run our tools and DoS the servers or clobber the data in the SQL servers.

Fast forward to last year, when we were awarded a big contract to accredit a large quantity of applications, most of them web applications. It would not be acceptable to test the hardware and software without testing the application itself. We came up with a methodology that included testing the application in a test, staging, or STIG-compliant development environment, in order to fully test it. We used the Application Security and Development STIG and the Application Security and Development Checklist as our guides to frame how we would test those applications. Since that project, I believe we have enhanced our methodology. And now, there is not a testing engagement I will attend where I won't extensively test the application if I find a web server and/or a database server.

However, I think I can do a better job. Lately, I've been perusing the OWASP web site looking for guidance on application auditing. Clearly, we're not contractually allowed to pentest. Yet there are aspects of the application and its underlying architecture that we need to evaluate. I've found the OWASP Testing Project, and a PDF of their testing guide, to be a great help in giving me specifics for testing/auditing particular controls.

I'm toying with joining the OWASP project. And I'm looking for certifications that can help me specifically in auditing applications. I know there are certifications with regard to pentesting, yet since we're not allowed to pentest, I feel those courses might go too deep.

I suspect I'll be adding more posts on the subject.

Thursday, January 28, 2010

Mental Note on Firefox forensics using Firefox 3 Extractor

I left a post the other day on Firefox forensics, linking to Harlan's great page.

However, I wanted to dig a little further. I went to the Firefox 3 Forensics site and downloaded the Firefox 3 Extractor. It took a few minutes to get it right, but when I got it running, it was awesome, and a little eye-opening.

First, I copied f3e.exe and sqlite3.dll into my Firefox profile directory. I launched f3e, but couldn't get any results. Remembering my old SQL developer days, it dawned on me that the files were locked because I had Firefox open. So, I closed Firefox and reran it. Bingo: the internet history report came out. I tried to run another report, and the program failed with an error message.

So, this time, I followed the directions and copied the Firefox sqlite files to a separate directory, and dumped f3e.exe and sqlite3.dll in there. Now, I could run any report as many times as I like.

A couple of things I like:
The program asks for a case reference (maybe the profile of the subject).
The program asks for a case name.
The program asks for the investigator.
With the internet history report option, you are asked if you want to use the favicons.

I chose the Internet History Usage report, which was D on my menu. After answering the questions, the HTML file is named "case reference" - "case investigator" - Internet Usage.html, so it is easy to find if you are running many reports.
Besides giving you the reference, name, and investigator, the report shows:
the top 20 most visited sites, with their counts, and
a table with rows showing: favicon (if used), visit date, URL, title, and whether the URL was typed.

I found it interesting going through the table that Yahoo mail uses the subject of the email as the title of the page. This could be useful if having to trace through web email.

I ran the other reports and have only skimmed the .csv files that were produced. A quick look shows a detailed cookie analysis, a forms history file, a detailed bookmarks analysis, a favorite icon analysis, and a couple of others that were blank (I might not be recording that information).

There is a mini-FAQ that lists where the various profile directories are stored.

Running the tool got me to consider the difference between "Private Browsing" and "Clearing Private Data." Normally, I clear my private data at the end of each session. But I'm thinking of moving to Private Browsing, as it appears private browsing does not write the information to the hard drive in the first place.

So far, this is a great tool that I plan to use in the future.

Wednesday, January 27, 2010

Mental Note for Harlan's Firefox forensics post

Harlan has a great post on Firefox forensics. I've been storing the link in email, but better to post it so I can more easily find it.

Thursday, January 21, 2010

Is IT security a life and death matter?

My boss and I were on a conference call today with a vendor to whom we will be sending a test team to test their system. This engagement is with a medical device company whose machines work with radiation. We were talking about patch management and how the company sets its policy for updating and patching its machines. What came out was an interesting story.

They mentioned to us that they handle ALL patching and updating for everything installed on the system. Because of the nature of the software (and the fact that it controls radiation levels being administered to a patient), they do not patch a machine until the patch has gone through rigorous testing. They told of a system administrator who saw one of the vendor's machines on his network without the latest patches. Without looking further at the machine, he remotely pushed out a whole bunch of patches to it. What the system administrator did not know was that the machine was actively administering radiation to a patient. The patches locked the machine, preventing the dosing engine from completing. Had a technician not been carefully monitoring the procedure and hit the emergency override switch... who knows what would have happened.

Friday, January 15, 2010

Quick XSS reminder

I keep forgetting this great link.

XSS from IronGeek

A great site. And a very useful XSS post.

Thursday, January 7, 2010

DISA SRR tools need CAC in order to get them

I know that DISA periodically makes part of their site unavailable while they make changes to their guidance (checklists, STIGs, etc.). So, for the past couple of days I've been waiting for the new checklists to be posted so I could prepare for a new trip. Yesterday, a bunch of the checklists were updated: MS SQL Server has been split into SQL Server 2000 and 2005, and the Oracle checklists have been split up by Oracle version. I notice that three of the Windows checklists have been updated: Windows 2000, 2003, and 2008. Curiously, there is no checklist posted for Windows XP or Vista. I suppose they are forthcoming.

However, I was highly surprised to see that the SRR scripts have been moved to a site that requires CAC authentication. And at this, I have to wonder why. In my opinion, the scripts do a great job of testing configurations against how the DoD expects items under its purview to be configured. Was it such a bad thing that everyone had access to the tools? It only makes the community safer. I'm hoping this is a temporary measure, and that all will return to normal, as I would hate to see valuable tools be available only to a select few.

Monday, January 4, 2010

IM-Me by GirlTech

One of our kids got this for their birthday this past holiday season. OK, it's a little cute, but I think there is better, and maybe cheaper, technology out there. The basic premise is that you plug the dongle in, and the child can "IM" with a friend who has the same device. Both users have to be "logged in" to the "chat service," and it connects over the internet. Of course, they want you to recruit many users, but that's another story. We got it because one of our child's friends is moving many states away, and we thought this might be a novel way to stay in touch. Installing it did not exactly please me.

A quick note: we have a couple of desktop machines spread out about the house. One, the main desktop, has multiple accounts on it, and the kids' accounts are very limited in what they can do. There's another computer that is not as limited, but does not have as much connectivity.

I found that I could only install it on the "main" computer because the hardware support for older computers is dodgy at best. To install, you have to install the device (and drivers) as admin. That wasn't too bad, as I don't let the kids' accounts install anything. But, in order for my child's account to be able to use the dongle, I had to temporarily grant admin permissions to the child's account so that the install could finish. During installation, I had the choice of installing it for the admin only, or for all accounts. I suppose I could have hacked the install, but I didn't. And now, EVERY user gets an install script failure upon logging in.

Finally, when the child sets up their account, it prompts for a username and password. My kids have security drilled into them, so the password was not really an issue. However, when you enter the password, it's in clear text. Same with the password confirmation box. I wasn't too pleased with that. I should have sniffed the transmission of the username and password to see if they were passed to the server in clear text; however, said child was pitching a fit that it wasn't installed yet.

All in all, had I known more about the architecture beforehand, I might not have purchased it. But the kids have had fun with it. I suppose the novelty will wear off quickly as they move on to more consumer-grade technologies.