Monday, January 27, 2014

IEF and Google Translate Browsing History

In working a recent forensics investigation for Sibertor, I ran into something I hadn't noticed before: Google Translate URLs. I came across these while sifting through Internet Evidence Finder (IEF) output from the hibernation file pulled from the image of my subject's primary machine. As usual, IEF carved some really cool Gmail fragments, rebuilt Facebook pages and Internet browsing history. Since the subject worked at an organization that required a great deal of translating as part of his daily job, a good chunk of the Firefox browsing history consisted of Google Translate URLs. It really was cool to see exactly what was being translated right there in the URL. As an example, and sadly not related to the investigation, here is one of my favorite conversations translated to Malay (Hello to my friends in Malaysia!):

http://translate.google.com/#en/ms/Justin%20Bieber%20threw%20up%20on%20stage

Since it was relevant to my investigation, I paid special attention to the language the text was being translated to - in this example, the "ms" stands for Malay. In my case, the target code was "it", meaning the user was translating text to Italian. What is also notable here, and a point of interest to some, is that unlike Google searches, which have been sent encrypted by default for logged-in users since October 2011 and for all users since September 2013, Google Translate requests are sent in the clear, regardless of whether the user is logged in.
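Because the language pair and the translated text live right in the URL fragment, entries like these are easy to triage in bulk. Here is a minimal sketch, assuming the classic translate.google.com/#<source>/<target>/<urlencoded text> fragment format seen in the example above (any URL not matching that pattern is simply skipped):

```python
from urllib.parse import unquote

def parse_translate_url(url):
    """Extract (source_lang, target_lang, text) from a Google Translate URL
    of the form http://translate.google.com/#en/ms/Some%20text.
    Returns None if the URL does not match that pattern."""
    if "translate.google." not in url or "#" not in url:
        return None
    fragment = url.split("#", 1)[1]
    parts = fragment.split("/", 2)      # source / target / url-encoded text
    if len(parts) != 3:
        return None
    source, target, text = parts
    return source, target, unquote(text)

# Usage: run each IEF-carved URL through the parser
example = "http://translate.google.com/#en/ms/Justin%20Bieber%20threw%20up%20on%20stage"
print(parse_translate_url(example))
# ('en', 'ms', 'Justin Bieber threw up on stage')
```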

This is good stuff - definitely something that relates to our employee investigations module in SANS FOR526: Windows Memory Forensics In-Depth. The application of memory forensics to employee investigations has yielded some serious wins for me, and it sounds like other internal forensics teams are pulling memory more frequently as well.

Thursday, January 16, 2014

Case Closed: Memory-Only Analysis

I recently presented the SANS@Night “Have No Fear, DFIR is Here” with Rob Lee, Chad Tilbury and Lenny Zeltser at SANS CDI.  The premise of the four-examiner presentation was to analyze and properly scope the involvement of the “Natasha Romanoff” Windows 7 x86 system in the Stark Research Labs (SRL) intrusion.  This scenario is known quite well by FOR508 alumni who have served as members of the SRL CIRT during the Day 6 Challenge. In FOR526: Windows Memory Forensics In-Depth, we take memory analysis a step deeper, diving into Windows memory structures in order to understand how the tools work and how to extract smoking guns from a system memory image.  My goal during my 20 minutes of presenting with the “All-Star Forensics Team” was to prove to my fellow instructors (and @Night attendees) that I didn’t need their file system forensic analysis at all.  In fact, every question pertaining to the intrusion and this system’s involvement could be answered through analysis of the memory image alone.

Our mission: determine the initial vector of infection for the Romanoff system, find all malicious code present on the system and show signs of execution.  In addition, we were asked to answer the typical questions customers ask in an incident response scenario: “What credentials were compromised?”, “How did they get in and how long have they been here?” and, of course, “What did they take?”

Most digital forensic examiners know of the Volatility Framework, an open source Python project supported by a forward-leaning team of developers.  Yet there are other tools we can bring to bear while performing memory analysis.  I included Mandiant’s Redline, Simson Garfinkel’s Bulk Extractor and page_brute in my tool arsenal as I unraveled the story of the Romanoff machine in the SRL intrusion.

Process Enumeration

We cover a six-step process for malicious process identification in our FOR526 Windows Memory Forensics In-Depth class that begins with process enumeration.  By running a couple of Volatility plugins, “pslist” and “psscan”, I identified some curious processes.
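One quick way to triage this first enumeration step is to run both plugins and diff their PID sets: anything psscan carves that pslist no longer links is worth a look (terminated or unlinked processes). The sketch below assumes Volatility 2.x is on the path, a Win7SP1x86 profile, and a hypothetical image name, romanoff.raw:

```python
import subprocess

IMAGE = "romanoff.raw"          # hypothetical image name
PROFILE = "Win7SP1x86"          # assumed profile for this Win7 x86 system

def run_plugin(plugin):
    """Run a Volatility 2.x plugin and return its output lines."""
    out = subprocess.run(
        ["vol.py", "-f", IMAGE, "--profile=" + PROFILE, plugin],
        capture_output=True, text=True, check=True).stdout
    return out.splitlines()

def pids(lines):
    """Pull (pid, name) pairs out of pslist/psscan output rows."""
    results = set()
    for line in lines:
        fields = line.split()
        # Data rows start with a hex offset, then Name, PID, PPID, ...
        if len(fields) > 3 and fields[0].startswith("0x"):
            results.add((int(fields[2]), fields[1]))
    return results

listed = pids(run_plugin("pslist"))
scanned = pids(run_plugin("psscan"))

# Processes carved by psscan but absent from the active process list
for pid, name in sorted(scanned - listed):
    print("psscan-only process: PID %-6d %s" % (pid, name))
```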
I love pslist output purely for the reason that it is chronological - great for doing fast timeline analysis of what took place on the system since the last reboot.  We can see from this pslist output that:

1.) Based on the “Start Time” for the System/SMSS.exe processes, the system last rebooted at 2012-04-04 11:47:29 UTC.
2.) With two csrss.exe processes (PIDs 412 & 2132) seen in this output, we can ascertain there were two active sessions at the time the image was created (specific to Windows 7 & later).
3.) With the start of the explorer.exe process (PID 296), we know a user logged into the system at 2012-04-04 14:45:45 UTC.
4.) With the presence of rdpclip.exe (PID 2408), we know that this logon was a Remote Desktop session via Terminal Services.
5.) With a quick hierarchical analysis (parent/child surveying, sketched in code after this list), we can identify some potentially anomalous processes, including:
            a.) svchost.exe processes (PIDs 3612 & 6404) whose PPID is 2100 - not expected, since svchost.exe is typically spawned by services.exe (PID 564).
            b.) PSEXESVC.EXE (PID 2100) - notable both because it is the parent of those late-starting svchost.exe processes and because it is typically associated with the execution of PsExec, the Sysinternals tool used for remote execution.
            c.) terminated processes: a.exe (PIDs 3264 & 5008) & spinlock.exe (PIDs 3796 & 1208).
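For the hierarchical survey in item 5, here is a minimal sketch of the idea: build a PID-to-parent map from saved pslist output and flag any svchost.exe whose parent is not services.exe. The pslist.txt filename is hypothetical, and this is a triage aid rather than a replacement for reviewing pstree output by eye:

```python
def process_map(path):
    """Map PID -> (name, ppid) from saved pslist output (e.g. pslist.txt)."""
    procs = {}
    with open(path) as fh:
        for line in fh:
            fields = line.split()
            # Data rows begin with a hex offset, then Name, PID, PPID, ...
            if len(fields) > 3 and fields[0].startswith("0x"):
                procs[int(fields[2])] = (fields[1], int(fields[3]))
    return procs

procs = process_map("pslist.txt")   # hypothetical saved plugin output

for pid, (name, ppid) in sorted(procs.items()):
    parent = procs.get(ppid, ("<unknown or exited>",))[0]
    # svchost.exe is normally spawned by services.exe; anything else deserves a look
    if name.lower() == "svchost.exe" and parent.lower() != "services.exe":
        print("Suspicious parentage: %s (PID %d) <- %s (PPID %d)"
              % (name, pid, parent, ppid))
```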

Of course, our next step in this analysis, covered in the next blog post, will be to investigate each notable process further, attempting to enumerate the process parameters, including the command line (how each process was instantiated).




We will continue the domination of the digital forensics world via memory in the next few posts.


Monday, July 2, 2012

SANS DFIR Summit 2012 - Austin, TX - June 26 & 27


I had the opportunity to speak at the 5th SANS DFIR Summit last week in Austin on "Why Not to Stay in Your Lane as a Digital Forensic Examiner".  Slides can be found here.  This was the best conference I have attended to date, especially with regard to the sense of community felt amongst attendees.  Thanks to everyone who attended for making it a great experience, and thanks to Rob Lee for inviting me to be part of an impressive group of speakers.  Not every digital forensic examiner has the opportunity to take a hiatus from casework and switch to the offensive side, like I have had, and I really appreciate being given the "airtime" to talk about what I have learned from the experience.

Notable presenters included Cindy Murphy, who delivered the first day's keynote as well as an excellent 360 (six-minute) presentation, and David Nides of KPMG, who was absolutely amazing in his 360, debuting his GUI front end to log2timeline.  Nick Harbour of CrowdStrike, my new idol, presented "Anti-Incident Response" and provided great insight into evasion tactics used to foil today's IR processes.  There were so many great presentations - I apologize for not mentioning everyone's here, but I had to head out early on Wednesday to return to work.  From what I heard, there were some amazing "end of summit" sessions that contained great technical content and were perhaps accompanied by chirping crickets!  Sorry to have missed that, and I hope next year's summit is just as fun!


Thursday, March 22, 2012

Parsing MFT Entries

I can’t share with you the specifics of the problems that make up the CFCE mentor process, but I can tell you that knowing the ins and outs of basic file system structures is key.  Doesn’t give much away, does it?  To understand all of the intricacies of NTFS, I relied heavily on Brian Carrier’s book, File System Forensic Analysis.  My first exposure to this book was when my co-worker and friend (and former cop) lent me his copy WAY back when I was studying for my EnCE (Guidance Software’s EnCase Certified Examiner) certification.  Let me describe the condition of his book: that thing was dog-eared, sticky-tabbed, highlighted and underlined, and it stunk of blood, sweat and tears.  I don’t think I had ever seen a book in such a “lovingly used” condition, and at the time, I didn’t understand it.  Yet now, after putting the finishing touches on Problem 4 of my CFCE certification, I get it.  I now own two copies (and my husband has one, too!) and I keep one in the back of my car at all times.  I can’t stress enough how many mysteries the book has revealed to me.  Yet there is one thing I ran into that was NOT in “the book”, and I wish to share it here:

If you ever need to find the MFT entry number of a deleted entry, the old-school technique was to count the distance (in bytes) from the start of the MFT (entry 0, $MFT) to the start of the entry you needed to enumerate, then divide that value by the size of an MFT entry (1,024 bytes); the result is the deleted entry's number.  What I didn't know until just recently is that there is an easier way.

For Windows XP and later, bytes 44-47 of each MFT entry also hold the MFT entry number.  (See image below)
So, what does this mean? 
1.)    Write a note in the top margin of p. 354 of your Carrier book: offsets 44-47 of the basic MFT entry data structure hold the entry record number.

2.)    You no longer need to do MATH to figure out the MFT entry number.
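Both approaches are easy to script. Here is a minimal sketch that, given a raw $MFT extract and a byte offset to an entry of interest, computes the entry number the old-school way (offset divided by 1,024) and also reads the 32-bit record number stored at bytes 44-47 of the entry header; the extract filename and offset below are hypothetical:

```python
import struct

MFT_ENTRY_SIZE = 1024            # standard MFT record size

def entry_number_by_offset(offset):
    """Old-school method: distance from the start of $MFT divided by 1,024."""
    return offset // MFT_ENTRY_SIZE

def entry_number_from_header(mft_path, offset):
    """Easier method (XP and later): read the 4-byte record number stored
    little-endian at offset 44-47 of the MFT entry header."""
    with open(mft_path, "rb") as fh:
        fh.seek(offset)
        entry = fh.read(MFT_ENTRY_SIZE)
    if entry[0:4] != b"FILE":                      # entry signature sanity check
        raise ValueError("No FILE signature at offset %d" % offset)
    return struct.unpack_from("<I", entry, 44)[0]

# Hypothetical example: the deleted entry starts 5,385,216 bytes into $MFT
offset = 5385216
print(entry_number_by_offset(offset))                       # 5259
print(entry_number_from_header("MFT_extract.bin", offset))  # should match
```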

This clearly is not easy-to-find information, which is why I was motivated to write this post. I spent DAYS searching for it and found some very confusing documentation on the subject, but no specifics!


Sunday, January 22, 2012

Mentoring in Digital Forensics

After running the gauntlet of forensic certifications, I have come upon one that is meeting my need for mentoring: having someone else look over my methodologies and give me some feedback.  I am currently working through the mentoring portion of the CFCE (Certified Forensic Computer Examiner) certification program through IACIS.  Historically, this certification was open only to law enforcement, but just a year or so ago they opened it up to those who meet other criteria (https://www.iacis.com/certification/cfce_faqs).  What makes this one different is the mentoring phase, where the candidate performs acquisition, analysis and report writing and sends the work to a mentor for critique.  The mentors in the program are volunteers, professionals who are just trying to "pass it forward" and further other examiners' knowledge.

You may ask, "Why is mentoring so valuable?  Doesn't everyone get that through OJT (on the job training)?"  I can answer that question with a simple "No."  In my past experience, some forensics teams have such a heavy workload that the mentoring/on-boarding process is quite brief.  In other instances, managers decide that peer case reviews are a waste of time.  When I asked to initiate a monthly case review at one of my old workplaces, I was told that I must have "low self-esteem" and that our cases were so routine that time spent reviewing the analysis and reporting of a case as a team would be wasted. 

Whatever the reason, the importance of mentoring, or as they call it in the educational realm, "scaffolding", cannot be overlooked.  Scaffolding is a teaching strategy that supports the novice by limiting complexity and gradually removing those limits as the learner gains more skill and confidence.  For a new forensic examiner, this type of model would involve working cases with another examiner on the team, then performing supervised acquisitions, working up to analysis and report writing.  This type of mentoring is an excellent way to ensure all examiners on a team are aware of and are performing within the organizational SOPs.  No matter how strong a team you have (or think you have), collaboration and thinking through cases as a group can strengthen individual skills and build esprit de corps.