Mobile Device Forensic Process v3.0

Cindy Murphy has updated her paper on a process for Mobile Device Evidence and Data Extraction. We at DFM are happy to help get this into the hands of digital forensic investigators globally, and whilst it has not been through our normal technical review process, we are glad to help publicise this piece of much-needed work. The paper is available for download using the link below, or subscribers to Digital Forensics Magazine can download it from the White Papers Downloads section of the DFM website.

Cindy Murphy is a Detective with the City of Madison, WI Police Department and has been a law enforcement officer since 1985. She is a certified forensic examiner (EnCE, CCFT-A, DFCP) and has been involved in computer forensics since 1999. Det. Murphy has directly participated in the examination of hundreds of hard drives, cell phones, and other digital evidence pursuant to criminal investigations including homicides, missing persons, computer intrusions, sexual assaults, child pornography, financial crimes, and various other crimes. She has testified as a computer forensics expert in state and federal court on numerous occasions, using her knowledge and skills to assist in the successful investigation and prosecution of criminal cases involving digital evidence. She is also a part-time digital forensics instructor at Madison Area Technical College and is currently working on her MSc in Forensic Computing and Cybercrime Investigation through University College Dublin, Ireland.

Mobile Device Forensic Process v3.0

Reposted from: http://digitalforensicsmagazine.com/blogs/?p=437

Mingde Forensic Science Class to Cultivate Future Conans

Under the 12-year compulsory education system, high schools are striving to develop distinctive programs. Mingde High School in New Taipei City is partnering with forensic expert Dr. Henry Lee to establish the country's first high school "Forensic Science Class," aiming to cultivate future Conans and Sherlock Holmeses, and it is attracting applicants whose Basic Competence Test scores would qualify them for the top three school choices. Dr. Lee plans to return to Taiwan in September to preside over the opening ceremony in person. Once the 12-year compulsory education program takes effect next year, the Forensic Science Class will select students through special-characteristics admissions.

The popularity of the American TV series CSI: Crime Scene Investigation has made forensic science a hot field, but domestic channels for training forensic talent remain limited. Mingde High School principal Huang Chi-feng explained that the Forensic Science Class will admit forty ninth-grade graduates, divided into a criminal forensics group and an economic forensics group, roughly corresponding to the high school science and social science tracks, and will teach foundational forensic science courses on criminal and economic crime.

For the first trial year starting this September, instructors will be recruited externally: faculty from the Central Police University, the Taiwan Police College, National Taipei University, the Ministry of Justice Investigation Bureau, the Criminal Investigation Bureau, and the Dr. Henry Lee Forensic Science Education Foundation will come to the school to teach in person.

Huang said that Europe and the United States are working hard to cultivate forensic science talent. Dr. Lee, former Commissioner Emeritus of the Connecticut State Police, has been dedicated to developing forensic science education in Connecticut, and nearly twenty US high schools have already established forensic science classes; Australia, the UK, France, and Germany are also actively training forensic talent. In Taiwan, Mingde High School is the pioneer.

In addition to the standard required high school curriculum, Forensic Science Class students take five credits of specialized electives in each of the first and second years, including accounting, practical asset-valuation analysis, computer auditing and internal controls, as well as physics, chemistry, and biology forensic science and an introduction to crime scene investigation.

No specialized courses are offered in the first semester of the third year, when students prepare for the General Scholastic Ability Test, but in the second semester, after recommendation-admission results are announced, the school will partner with universities to offer a three-credit advanced placement course whose credits can be applied toward a degree; students will also intern at the Central Police University and the Criminal Investigation Bureau.

Huang said the school will negotiate with universities that offer related departments and programs, hoping each will provide one to three "Stars Program" recommendation slots. In this first trial year, the first-stage exam-free admission has already enrolled twelve ninth graders ranked in the top five percent of their schools, and the second-stage application admission accepted seven more, including one whose Basic Competence Test score of 396 would qualify for the top three school choices; the remaining slots will go to students admitted through registration and placement.

University Forensics Programs Become a Trend

Huang noted that starting salaries for forensic science talent are quite attractive: criminal forensics personnel start at NT$55,000 a month, while third-grade investigators handling economic forensics at the Ministry of Justice Investigation Bureau start as high as NT$77,000. Valuation of real estate and movable property also calls for scientific forensic talent. Currently, besides the Central Police University and the Taiwan Police College, general universities such as National Chung Cheng University, Soochow University, and National Taipei University all offer forensics programs. In the vocational-technical system, National Taiwan University of Science and Technology (Taiwan Tech) has tested the waters with general-education courses and established the school's first forensic-science-related information and communication security center.

Former Taiwan Tech president Chen Shi-shuen, a close friend of Dr. Lee, has invited him to lecture at the university, with every session packed. Chen believes that exposing students to cross-disciplinary knowledge helps spark creativity. Soochow University's Forensic Science Program was planned while former Premier Liu Chao-shiuan served as the university's president; it combines the physics, chemistry, microbiology, and psychology departments of the College of Science with the law school to offer forensic science courses.

National Taipei University offers a "Capital Market Forensics Credit Program" whose required credits include criminology, forensic accounting and crime investigation, and seminars on economic crime. It mainly aims to cultivate economic forensics talent to prevent economic crime and to detect fraudulent accounting and money laundering.


Reposted from: http://www.merit-times.com.tw/NewsPage.aspx?Unid=312953
 

Optical Disc Rescue!!

Five or six years ago, when hard drives were still expensive, most people chose to burn their important data to DVDs for backup. Many people may have backed up hundreds or even thousands of discs, believing their data was perfectly safe. But now, when you try to load those discs back onto a hard drive, you may discover that some of them can no longer be read, something you never anticipated.

Quick overview:

  1. How optical disc rescue works
  2. Repairing a scratched disc surface
  3. Hands-on disc data rescue
There are two main reasons a disc becomes unreadable. One is that the recording layer inside the disc, which stores the data, has undergone chemical changes so the data can no longer be read. The other is that improper storage has scratched the disc's read surface, so the drive's optical pickup cannot read the data properly.
▲ When copying a disc, if the window stalls on this screen for a long time and the drive keeps spinning with no response, the disc is very likely damaged.

Try Reading with a Different Drive

Optical discs are read by firing a laser beam at the disc's reflective layer and reading back, from the reflected light, the digital data engraved on that layer, which is then converted into computer data. A scratch on the disc's mirror surface therefore does not damage the digital data itself, but it disturbs the signal during the reflection process and causes reads to fail. That is why the disc repair tools on the market are designed mainly to polish out mirror-surface scratches: they reduce the obstacles interfering with the laser and raise the odds of a successful read.
As for disc degradation, before taking any further rescue steps, try another computer and read the disc with someone else's drive. Setting the disc itself aside, a drive's optical pickup can weaken with age so that its laser output is no longer strong enough to read lower-quality discs. Switching to a new drive, or a drive in another computer, may provide normal output power and read the data you want to rescue without further effort.
▲ "Disc repair machines" like this one are sold commercially and claim to repair damaged discs automatically. In fact, you can repair a damaged disc yourself at home with everyday tools.
▲ Although recordable discs are claimed to last a very long time, within five or ten years the dye can degrade or the surface can get scratched to the point where the data is unreadable.

How Disc Rescue Software Works

When you insert a problem disc, you may still be able to see its file listing in File Explorer, but copying the files always gets stuck at some point and never recovers. In the other case, the computer cannot find the disc at all and reports it as blank.
The rescue software introduced here targets the first situation. If the computer cannot find the disc at all, rescue software cannot help; your only option is to try another drive. Disc rescue software works by repeatedly re-reading the regions of the disc where read errors occur, lowering the read speed to improve accuracy, and optionally skipping unrecoverable errors, so that as much data as possible is salvaged from the disc.
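To make the retry-and-skip idea concrete, here is a minimal Python sketch of the strategy (the 2048-byte block size, the retry count, and the example paths are illustrative assumptions; real rescue tools work at the sector level with direct control of the drive):

import os

BLOCK_SIZE = 2048      # one CD-ROM data sector, chosen for illustration
MAX_RETRIES = 5        # how many times to re-read a failing block

def rescue_copy(source, destination):
    # Copy a file block by block, re-reading bad blocks and
    # zero-filling any block that remains unreadable.
    bad = 0
    total = os.path.getsize(source)
    with open(source, "rb") as src, open(destination, "wb") as dst:
        for offset in range(0, total, BLOCK_SIZE):
            data = None
            for attempt in range(MAX_RETRIES):
                try:
                    src.seek(offset)
                    data = src.read(BLOCK_SIZE)
                    break                          # this block read cleanly
                except (IOError, OSError):
                    continue                       # re-read the same block
            if data is None:                       # still unreadable: skip it
                data = b"\x00" * min(BLOCK_SIZE, total - offset)
                bad += 1
            dst.write(data)
    return bad

# Example: bad = rescue_copy("E:/photos.zip", "C:/rescued/photos.zip")

A real tool would also lower the drive speed between retries, which is hardware control beyond what a plain file copy can express.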

CD Recovery Toolbox Free

This is the most recommended disc rescue program. It handles damage in several passes: it first lowers the read speed and repeatedly re-reads the damaged area, and if the data still cannot be read, it skips the spot and fills it with placeholder data. Once you set the source drive to rescue, it reads the disc and performs the repair automatically.
▲ CD Recovery Toolbox is simple to use, and the interface clearly guides you to the next step.

Roadkil's Unstoppable Copier

The version shown here uses a Simplified Chinese interface. The tool fills the unreadable parts of a file with placeholder data to raise the odds of saving it, and it offers more rescue options: you can choose whether to skip damaged files and pick either a fast rescue mode or a better but slower one.
▲ Roadkil's Unstoppable Copier, shown with its Simplified Chinese interface, offers more settings for the repair.

MiniTool Power Data Recovery

We used this tool earlier, in the section on repairing damaged sectors, to demonstrate rescuing data from partitions. Its feature set is genuinely broad, covering everything from file undeletion and partition recovery to disc recovery, though the free version only allows 1GB of recovered data, which is a little disappointing.
▲ The all-in-one MiniTool Power Data Recovery also offers disc rescue.

GMVB

GMVB is short for "Get My Videos Back." Besides video, it can also rescue music files. It handles one file at a time, displaying that file's data blocks at the bottom of the window and marking in red the regions that are damaged and unrecoverable, so you can see the rescue result clearly.
▲ You must click the "Open And Start" button in the lower left to begin rescuing the disc.

Repairing a Scratched Disc Surface

If the disc's mirror surface is badly scratched, it is worth repairing the surface before running rescue software; doing so raises the software's success rate.

Step 1

The main tools for repairing a disc are toothpaste, colorless shoe polish, and cotton pads. Toothpaste is a very fine abrasive, so it can effectively reduce scratches without creating new ones on the mirror surface.

Step 2

Apply a small amount of toothpaste to the scratched area, then use a cotton pad to slowly spread it flat over the scratch in small clockwise motions, polishing the scratched region in repeated clockwise circles until it is as smooth as possible.

Step 3

Polish with the cotton pad until the scratched area is as flat as you can get it. Toothpaste can only do so much, so some scratches will still be visible at the end. By the time you finish, the toothpaste should no longer be visible, leaving only a hazy matte patch on the mirror surface; if a lot of toothpaste remains, rinse it off with clean water.

Step 4

Now set the disc down, take a fresh cotton pad, and dab on some shoe polish to clean and polish the matte patch. Once again, the shoe polish must be colorless.

Step 5

Shoe polish is essentially made of two ingredients, wax and oil, so it removes the toothpaste haze very well, and the wax also buffs the disc's mirror surface back to a shine.

Step 6

Photos can hardly do justice to the before-and-after difference of treating the scratches, but the difference is truly obvious. As for the subsequent software rescue, occasional read problems still occurred, but the time needed for the rescue dropped dramatically.

Hands-on Disc Data Rescue

As emphasized above, disc rescue works best when you combine the physical and software approaches: treating the mirror surface of a scratched disc first makes this stage go more smoothly and improves the integrity of the rescued data.

Step 1

CD Recovery Toolbox Free is designed to run almost entirely automatically, and the user needs to do almost no configuration. After you launch the program, it automatically checks the computer's optical drives and the disc; you only need to click "Next" to proceed.

Step 2

As with other rescue software, you must specify a folder on another hard drive to store the rescued files. Click the button at the top and choose the destination folder.

Step 3

Next, the program lists all the folders and files on the disc in a tree view. To rescue everything, check them all; otherwise select only the files you want to save. When you have finished selecting, click the "Save" button at the bottom to start saving the files.

Step 4

The program then starts copying files to your hard drive and shows a list of the rescued files; green text next to a file means it was processed without problems and has been saved to the hard drive.

Step 5

Almost the entire process is automated. At the end, just look at the list shown in this window to see how many files were saved to your hard drive and whether any individual file had problems.

Step 6

The steps demonstrated above show a disc that recovered well. If your disc is badly damaged, then after the first rescue pass the program automatically lowers the drive's rotation speed and applies other rescue techniques to raise the odds of saving the files. Once you reach this stage, expect a very long wait before it finishes. If the file statuses in the list are all shown in red, the program will enter the advanced rescue process and spend much more time on the data under the scratches.

Reposted from: T客邦

dd2vmdk

dd2vmdk is an online tool for converting raw disk images to VMware virtual disk files.
  • Browser-based conversion - uses the output of sfdisk and ldminfo as source information
  • Supports Windows Dynamic Disks by converting back to regular partitions
  • Requires no installation of foreign executables - everything done through pasting UNIX shell script commands
You can convert a dd image to a VMware vmdk with this tool by clicking here. To test it, paste in the tool output shown in the walk-through below.
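To show what such a conversion amounts to, here is a hedged Python sketch that writes a minimal "monolithicFlat" VMDK descriptor pointing at an existing raw dd image. The geometry values and file names are illustrative assumptions; dd2vmdk itself derives its values from the sfdisk and ldminfo output you paste in:

import os

def write_vmdk_descriptor(dd_image, descriptor_path):
    # Present the raw image as a single flat extent; the sector
    # count is derived from the image file size.
    sectors = os.path.getsize(dd_image) // 512
    cylinders = sectors // (255 * 63)          # assumed LBA-style geometry
    descriptor = """# Disk DescriptorFile
version=1
CID=ffffffff
parentCID=ffffffff
createType="monolithicFlat"

# Extent description
RW %d FLAT "%s" 0

# The Disk Data Base
ddb.geometry.cylinders = "%d"
ddb.geometry.heads = "255"
ddb.geometry.sectors = "63"
ddb.adapterType = "lsilogic"
""" % (sectors, dd_image, cylinders)
    with open(descriptor_path, "w") as f:
        f.write(descriptor)

# Example: write_vmdk_descriptor("evidence.dd", "evidence.vmdk")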

Reposted from: http://www.schatzforensic.com.au/2006/p2v/

Quickpost: Disassociating the Key From a TrueCrypt System Disk

TrueCrypt allows for full disk encryption of a system disk. I use it on my Windows machines.

You probably know that the TrueCrypt password you type is not the encryption key itself. It is, simply put, used to decrypt the master key stored in the volume header.

On a system drive, the volume header is stored in the last sector of the first track of the encrypted system drive (TrueCrypt 7.0 or later). Usually, a track is 63 sectors long and a sector is 512 bytes long. So the volume header is in sector 62.
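As a quick illustration of that arithmetic, the Python sketch below computes the header's byte offset and dumps sector 62 from a raw disk device. PhysicalDrive0 is an assumption for the system disk, raw device access requires administrator rights, and the offset only matches the 63-sector-track geometry described above:

SECTOR_SIZE = 512
HEADER_SECTOR = 62                       # last sector of the first 63-sector track
offset = HEADER_SECTOR * SECTOR_SIZE     # 62 * 512 = 31744 bytes into the disk

# Unbuffered raw device access keeps reads sector-aligned.
with open(r"\\.\PhysicalDrive0", "rb", buffering=0) as disk:
    disk.seek(offset)
    header = disk.read(SECTOR_SIZE)

# An encrypted TrueCrypt header should look like 512 random bytes.
print(header.hex())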

When this header is corrupted or modified, you can no longer decrypt the disk, even with the correct password. You need to use the TrueCrypt Rescue Disk to restore the volume header. This rescue disk was created when you encrypted the disk.

I’m using Tiny Hexer on the Universal Boot CD For Windows to erase the volume header (you can’t easily modify the volume header when you have booted from the TrueCrypt system disk; using a live CD like UBCD4WIN is one possible workaround).

First I’m checking the geometry of the system drive with MBRWizard:
Take a look at the CHS (Cylinders Heads Sectors) value: S = 63 confirms that a track is 63 sectors long.

Then I open the system drive with Tiny Hexer (notice that the sector size is 512 bytes, or 0x200 bytes):


I go to sector 62, the last sector of the first track:

It contains the volume header (an encrypted volume header has no recognizable patterns, it looks like random bytes):

Then I erase the volume header by filling the sector with zeroes and writing it back to disk:

And if you absolutely want to prevent recovery of this erased sector, write to it several times with random data.
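Continuing the sketch above, zeroing the sector and then making a few random passes might look like the following Python; needless to say this destroys the header for real, so the rescue disk (and a full backup) must exist before anyone runs anything like it:

import os

SECTOR_SIZE = 512
offset = 62 * SECTOR_SIZE

# DANGER: permanently erases the TrueCrypt volume header on this disk.
with open(r"\\.\PhysicalDrive0", "r+b", buffering=0) as disk:
    disk.seek(offset)
    disk.write(b"\x00" * SECTOR_SIZE)    # first pass: zeroes
    for _ in range(3):                   # then several random passes
        disk.seek(offset)
        disk.write(os.urandom(SECTOR_SIZE))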

Booting is no longer possible, even with the correct password. The TrueCrypt bootloader will tell you the password is incorrect:

One can say that I’ve created a TrueCrypt disk that requires 2-factor authentication. To decrypt this disk, you need 2 factors: the password and the corresponding TrueCrypt Rescue Disk.

First you need to boot from the TrueCrypt Rescue Disk, and select Repair Options (F8):

And then you write the volume header back to the system disk. Note that the TrueCrypt Rescue Disk requires you to enter the password before it writes the volume header to the disk:
And now you can boot from the system disk with your password.

Use this method if you need to travel with or mail an encrypted system disk and want to be 100% sure there is no way to decrypt the drive while in transit. But don't travel with the 2 factors on you; send the TrueCrypt Rescue Disk via another channel.

Remark: MBRWizard allows you to wipe sectors, but for whatever reason, it couldn’t successfully wipe sector 62 on my test machine.

Oh yeah, don't forget to make a full backup before you attempt this technique.


Reposted from: http://blog.didierstevens.com/2012/02/09/quickpost-disassociating-the-key-from-a-truecrypt-system-disk/

PASSWARE KIT 11.7

 NEW IN PASSWARE KIT 11.7
  • Instantly decrypts MS Office 2007-2010 documents through memory analysis
  • Instantly decrypts PGP Whole Disk Encryption volumes through memory analysis
  • Recovers passwords for Apple Disk Images (DMG)
  • Improved integration with Guidance EnCase:
    • One-click password recovery from EnCase
    • Imports dictionaries/wordlists directly from EnCase

Reposted from: http://www.lostpassword.com/news/pnl63.htm

Blade™ v1.9 Released - AFF® Support, Hiberfile.sys Conversion and New Evaluation Version

Digital Detective Software - Blade Professional - Forensic Data Recovery

This release of Blade brings a number of fixes and some great new features. This is the first release of Blade to have evaluation capabilities, which allow the user to test and evaluate our software for 30 days. When Blade is installed on a workstation for the first time (and a valid USB dongle licence is not inserted), the software will function in evaluation mode.

The following list contains a summary of the new features:
  • Support for Advanced Forensic Format (AFF®)
  • Hiberfil.sys converter - supports XP, Vista, and Windows 7 (32- and 64-bit)
  • Accurate hiberfil.sys memory mapping, not just Xpress block decompression
  • Hiberfil.sys slack recovery
  • Codepage setting for enhanced multi-language support
  • SQLite database recovery
  • 30-day evaluation version of Blade Professional
  • New recovery profile parameters for more advanced and accurate data recovery
  • Support for Logicube Forensic Dossier®
  • Support for OMA DRM Content Format for Discrete Media Profile (DCF)
We have also been working on the data recovery engines to make them more efficient and much faster than before. The searching speed has been significantly increased.

Reposted from: http://blog.digital-detective.co.uk/2012/02/blade-v19-released-aff-support.html


Computer Forensics and Forensic Accounting

Reposted from: law

The author was among the earlier people in this country to work in computer forensics (digital forensics), having started reading the literature around 2002. At the time there was very little material; even in the UK and US, computer forensics books were very scarce, and the concepts were very muddled. (That was roughly the era when the Internet was just taking off, followed by the dot-com bubble.)

So that our country, with its advanced information technology, would not fall behind the global trend of computer forensics, the author worked through the books and literature of various countries and published "Computer Forensics and Enterprise Security" in 2004. The field was far too obscure then; the publisher later seemed to vanish, and the book went out of print. (Hopefully not because of the book itself.)

In any case, it became what was probably Asia's first book on computer forensics. Viewed from today, its content was rather shallow, but it was a small milestone nonetheless...

A few years later, practice had advanced considerably, and in 2006 the government established its first laboratory, setting a second milestone.

Rather than pursuing advanced computer forensics further, the author moved into the even more obscure field of digital evidence and in 2009 wrote "Illustrated Digital Evidence," presenting the erroneous opinions found in court decisions in a clear and simple layout. Naturally the field was still obscure, so sales remained dismal.

But books of this kind were never written to sell. Over the years the field seemed to sit in the freezer, utterly cold. Meanwhile, in June 2011, the author completed a doctorate in law, with "digital evidence" as the dissertation topic.

With the passage of the Personal Data Protection Act in 2010, computer forensics suddenly became popular. Perhaps it is because computer forensics can help determine the cause of a data breach and serve as proof exonerating a company, and because the damages under the Act are steep; to date every related seminar has been packed, and nearly every one includes a session on computer forensics.
Originally the author had no intention of stepping into the personal data field.
Why not?

The reason is simple: computer forensics is not related only to personal data. It is relevant to every field, because every field can involve digital evidence, and computer forensics is simply a rigorous procedure for collecting and analyzing digital evidence.
Yet after attending many lectures, the author found that many speakers, perhaps in order to market IT products, distorted the true meaning of computer forensics, making it sound as if the Personal Data Protection Act had been enacted so that the IT industry could sell products.

So the author recently announced plans to integrate the Personal Data Protection Act with computer forensics and, from a non-commercial standpoint, spread accurate knowledge to anyone who wants to listen.

In less than two months the author has given six talks heard by more than a thousand people across the country, and in three months of spare time has completed "Illustrated Personal Data Protection Act," which will be published once the enforcement rules pass in mid-2012 and should then serve as a reference for those who need it.

Recently a graduate school professor asked the author to explain the concept of computer forensics to graduate students in a "forensic accounting" course. As the figure above shows, computer forensics is indeed a small part of forensic accounting: it is a way to uncover problems and malpractice in business operations, and, as noted above, every field can involve digital evidence.

Forensic accounting also appeared for the first time in the 2011 Senior Civil Service Examination (Level Three); it seems topics revolving around computer forensics will only keep multiplying.




WhatsApp Xtract

I don't want to bore you by explaining what WhatsApp is. If you have this serious gap, you can fill it here. Forensically speaking, WhatsApp was a very cool app until last June. After that, someone decided to add the extension "crypt" to that excellent source of information, msgstore.db.

This database stores information about contacts and also entire conversations.
But if you simply open it with SQLite Browser, you can have some trouble extracting a single chat session with a desired contact or reordering the messages. My latest Python script aims to overcome these problems, so you can avoid dealing with complex SQL queries.
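For anyone who does want the raw SQL route, a minimal Python sqlite3 sketch along these lines pulls one contact's chat in chronological order. The messages table and its key_remote_jid, key_from_me, data, and timestamp columns reflect the Android schema of that era, so treat the names (and the example JID) as assumptions:

import sqlite3

db = sqlite3.connect("msgstore.db")      # the unencrypted database
cur = db.cursor()

# One chat session with a single contact, messages in order.
cur.execute(
    "SELECT timestamp, key_from_me, data "
    "FROM messages "
    "WHERE key_remote_jid = ? "
    "ORDER BY timestamp",
    ("34611111111@s.whatsapp.net",),     # hypothetical contact JID
)
for ts, from_me, text in cur.fetchall():
    print(ts, "me:" if from_me else "them:", text)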


Reposted from: http://blog.digital-forensics.it/2011/12/whatsapp-xtract.html




What WhatsApp doesn't tell you...

It is the 'top' app in the mobile world: almost immediately after the 'give me your mobile number' request comes the question 'Do you have WhatsApp?'. Clearly this application is changing the concept of free SMS messaging.

Alberto warned about the security problems of WhatsApp transmitting data in plain text and what this means in shared environments.


Today we are going to talk about the inside: the way WhatsApp stores and manages its data.
Looking inside the application's file structure, we find two files called msgstore.db and wa.db (locations vary, of course, between Android and iPhone). Both files are in SQLite format.

Once we import these files into a tool for browsing their content (e.g. SQLite Manager), here comes the first surprise: none of the information they contain is encrypted.
Contacts are stored in wa.db, and EVERY sent message is in msgstore.db.


Wait a sec, did I say EVERY?
Absolutely, every sent and received message is there. And why is "EVERY" in uppercase? Simply because although WhatsApp theoretically gives us the chance to delete conversations through its graphical interface, in reality they remain in the database ad infinitum.

And the issue is even more fun if we sent or received messages at a time when GPS was enabled, because WhatsApp also stores coordinates in msgstore.db.


In the case of Android there are even more important things stored that might be of interest to a forensic investigator - or maybe a jealous boyfriend/girlfriend. Apparently WhatsApp is configured by default with a very 'verbose' level of logging, and it stores, within the directory /files/Logs, files that look like this:

# pwd
/data/data/com.whatsapp/files/Logs
# ls
whatsapp-2011-06-06.1.log.gz whatsapp-2011-06-09.1.log.gz
whatsapp-2011-06-07.1.log.gz whatsapp.log
whatsapp-2011-06-08.1.log.gz
#

These files record every XMPP transaction made by the application at a very verbose (debug) level, including the timestamp of when each message was received or sent (among other things).

011-06-09 00:47:21.799 xmpp/reader/read/message 346XXXXXXX@s.whatsapp.net 1307XXXXXX-30 0 false false

These files are easily "parseable" to extract the list of mobile numbers that have had some kind of conversation with us. I created a small script that parses the log file and pulls out this list of numbers:

import re
import sys

# Read the WhatsApp log file given as the first command-line argument
logfile = sys.argv[1]
logdata = open(logfile, "r")
dump = logdata.readlines()

numerosin = []   # numbers that sent messages to us
numerosout = []  # numbers we sent messages to

for line in dump:

    # Incoming messages are logged as "xmpp/reader/read/message <number>@..."
    m = re.search(r'(?<=xmpp/reader/read/message )\d+', line)
    if m and m.group(0) not in numerosin:
        numerosin.append(m.group(0))

    # Outgoing receipts are logged as "xmpp/writer/write/message/receipt <number>..."
    m = re.search(r'(?<=xmpp/writer/write/message/receipt )\d+', line)
    if m and m.group(0) not in numerosout:
        numerosout.append(m.group(0))

print "Messages received from\n"
print "\n".join(numerosin)
print "\nMessages sent to\n"
print "\n".join(numerosout)

Executing the script, it will output the information as follows:

$ python whatsnumbers.py whatsapp-2011-06-08.1.log
Messages received from

34611111111
34622222222

Messages sent to

34611111111
34622222222
 
Reposted from: http://www.securitybydefault.com/2011/06/what-whatsapp-doesnt-tell-you.html

Interesting Malware in Email Attempt - URL Scanner Links

Last weekend I spent some time with extended family, helping them confirm that their online email account had been hacked and used to send malware-linking spam emails to users in their contact list.
Yesterday our family email account was on the receiving end of a message from someone who possibly fell victim to an email account hack, as our email address was among several others receiving the email. I say possibly because none of us recognized the sender's email address and it wasn't in any of our address books. Possibly our email address, along with the others, had been harvested somehow and this was a fake spamming account. The "show-as" name was definitely non-standard and used some letters related to the subject line.
It was pretty evident to me this was probably a dangerous site to go to, but being curiously-minded, I couldn’t pass up the chance to do some detective work.
The email originated from a yahoo mail account.
The Subject line was baited “ACH Transfer Canceled…” and the display name in the email address contained the letters “NACHA.”
ACH refers to the "Automated Clearing House," which handles financial transactions in the US, overseen by NACHA. To most Americans, I'm betting these acronyms mean very little, and they would be more taken with a sudden urge to grab some NACHOS instead. Maybe Europeans would be a little more anxious about emails purporting to come from ACH and NACHA. I digress.
The first thing I looked at was the message header. Lots of goodies there. We can follow the bounces from the Yahoo mail sender to our ISP's email servers, along with the times and dates of transmission.
Since this was a Yahoo mail account, it appears the header may actually contain the IP address of the location the mail account was logged into from. This is the first time I have seen this, so I need to do more research. The IP associated with this particular email is located in France.
The website IP Address Locator has lots of good tools for locating IP addresses as well as a feature that allows a copy/paste/analyze of email headers.
The content of the email was very thin: a single line with all the text run together. There is URL link markup there; however, it misses some of the characters. Hmm.
Toggling between the different modes of viewing email content in Thunderbird reveals odd results. If I look at it in original HTML mode I see a single line of text with a hyperlink in the middle.
If I view it in simple HTML, most of the text is the same but a few characters are different.
If I view it in plain text, there is nothing showing.
Hovering over the displayed hyperlink shows a URL shortener link. Hmm. Set that aside for a moment.
So I go back and look at the full header view again and find this in the message body:
Content-Type: text/html; charset=ISO-8859-5
Content-Transfer-Encoding: base64
Ah! So I copy/paste the large text block that follows into this base64 online encoder/decoder and get a binary file to download!
(More regarding content encoding methods here Content-Transfer-Encoding - MSDN, here The Content-Transfer-Encoding Header Field via freesoft.org and here Decoding Internet Attachments - A Tutorial by Michael Santovec.)
Opening that binary file in Notepad++ reveals the html code with the same actual URL embedded.
My guess is they are using base64 encoding for the content to try to get around email scanners.
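The decode step itself is easy to reproduce locally instead of pasting the block into a website. A small Python sketch (the file names are placeholders) follows:

import base64

# Paste the base64 text block from the message body into body.b64 first.
with open("body.b64", "r") as f:
    encoded = f.read()

# b64decode skips the line breaks and returns the original bytes.
with open("decoded.html", "wb") as f:
    f.write(base64.b64decode(encoded))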
OK, so let’s check out that URL.
Turns out it is using Google's own URL shortening service: Google URL Shortener. More info here: Google URL shortener - Web Search Help.
Turns out this is a pretty cool choice from both sides of the security fence. By appending ".info" to the end of a Goo.gl shortened URL, we can find out the stats, per Goo.gl URL shortener (Google Groups).
This is good from an attacker standpoint as they can easily monitor their success rate on the nibbles of this hook and any “hits” to the actual URL. Researchers can get info as well by monitoring the same info and how fast/long the “click-through” may happen.
h0j5wpnx.2up
Neat isn’t it?
Now that I’ve got the actual long URL that this points to, we can start tossing the URL at some on-line link analysis/scanner tools.
VirusTotal shows both TrendMicro and SCUMWARE.org report the long URL as a Malware/Malicious site.
Quttera reports it as serving up a suspicious javascript content via HTML page code.
Anubis: Analyzing Unknown Binaries provided a deeper review of the URL by capturing Windows system events in a virtual sandbox system. It accesses the Windows registry, mucks with some keys, creates a cookie, reads the autoexec.bat file, modifies some files, maps DLLs to memory, and appears to try to download more stuff. The report is available in HTML, XML, PDF, and TXT formats. They also offer a traffic.pcap file to download so you can examine the network traffic generated and perform any NFA you want to do. This site/tool rocks from a depth-of-information standpoint.
urlQuery gives some more report feedback when it is sandboxed. Lots of Java script stuff. Another strong URL analysis reporting site.
Trying it a few more times while changing the browser type, Java version, and Flash version gets different results, and the URL-serving code reflects different IPs each time. That long URL seems to be hosted on a dynamic IP host, letting it bounce around (serving up HTTP redirects) and serve the malware code from all over the place depending on platform, which makes it harder to track down the source.
urlQuery actually identified the network traffic as a Blackhole exploit kit v1.2 HTTP GET request. Another clue.
I tossed the pcap file I got from Anubis into NETRESEC NetworkMiner. Nothing very interesting, but my Microsoft Security Essentials alerted when NetworkMiner reassembled the HTML page, and it quarantined the file. It identified the page code as Exploit:JS/Blacole.AR (MS's way of saying "Blackhole," I suppose…).
Here are a series of links regarding these kinds of email spam threats in general as well as Blackhole info in particular as it relates with email spam campaigns, if you are curious.
I doubt this is the last our email inbox will see of these things, but the whole process has been quite fun to follow.
I’ve decided to leave out links/images of the actual email and the header-code/URL (short/long) but have passed it along to a number of security-spam websites in case it is of use.
A long time ago I had a list of URL-testing sites to feed a URL into to see if they were safe or not.  Most seem to have gone away, however the following forums had a number of new ones worth bookmarking. Hat tip to “PROROOTECT” for the legwork!
Here is a combined and cleaned-up list based on the collective work from PROROOTECT in both places, plus at least one or two I'm tossing in; I removed a few from those lists that seem dead or redirect incorrectly. PROROOTECT does make a great point that the effectiveness of these varies, so a "bad" URL in one may come back as "clean" in another. So it's best to run your URL through multiple sources.
Note, these are URL/web-page scanners. They are a bit different than on-line file-scanners/sandboxes used to analyze malware samples. Though a few seem to come pretty darn close with the depth of their reports/analysis.
Not "necessarily" listed in order of usefulness.
PROROOTECT's suggestion to use an online URL screenshotting service to capture the displayed URL safely is some good outside-the-box thinking. Kind of a "look-before-you-leap" thing if all the above items pass OK.
Fun trip if it wasn’t so serious…
--Claus V.
Update: I meant to add this to the original post but got sidetracked. A recent Digital Forensics Case Leads post mentions a super-fantastic investigation/forensic report involving anonymous emails. This is must-read material, not just in terms of the investigative methodology but also the way the report was composed and presented. Very clearly done! I'm keeping a saved copy of the report for future reference, both technically and as a report template. From the post via the link above:
University of Illinois recently released a detailed investigation report (PDF) regarding anonymous emails allegedly sent by its Chief of Staff to the University's Senates Conference. The report is an interesting read, and also serves as a potentially useful model for those looking for report samples and templates.

Reposted from: http://grandstreamdreams.blogspot.com/2012/01/interesting-malware-in-email-attempt.html

Ripping Volume Shadow Copies – Introduction

Windows XP is the operating system I mostly encounter during my digital forensic work. Over the past year I’ve been seeing more and more systems running Windows 7. 2011 brought with it my first few cases where the corporate systems I examined (at my day job) were all running Windows 7. There was even a more drastic change for the home users I assisted with cleaning malware infections because towards the end of the year all my cases involved Windows 7 systems. I foresee Windows XP slowly becoming a relic as the corporate environments I face start upgrading the clients on their networks to Windows 7. One artifact that will be encountered more frequently in Windows 7 is Volume Shadow Copies (VSCs). VSCs can be a potential gold mine but for them to be useful one must know how to access and parse the data inside them. The Ripping Volume Shadow Copies series is discussing another approach on how to examine VSCs and the data they contain.

What Are Volume Shadow Copies


VSCs are not new to Windows 7 and have actually been around since Windows Server 2003. Others in the DFIR community have published a wealth of information on what VSCs are, their forensic significance, and approaches to examine them. I'm only providing a quick explanation since Troy Larson's presentation slides provide an excellent overview of what VSCs are, as does Lee Whitfield's Into the Shadows blog post. Basically, the Volume Shadow Copy Service (VSS) can back up data on a Windows system. VSS monitors a volume for any changes to the data stored on it and creates backups containing only those changes. These backups are referred to as shadow copies. According to Microsoft, the following activities will create shadow copies on Windows 7 and Vista systems:

        -  Manually (Vista & 7)
        -  Every 24 Hours (Vista)
        -  Every 7 Days (7)
        -  Before a Windows Update (Vista & 7)
        -  Unsigned Driver Installation (Vista & 7)
        -  A program that calls the Snapshot API (Vista & 7)

Importance of VSCs


The data inside VSCs may have a significant impact on an examination for a couple of reasons. The obvious benefit is the ability to recover files that may have been deleted or encrypted on the system. This rang true for me on the few cases involving corporate systems; if it weren't for VSCs then I wouldn't have been able to recover the data of interest. The second and possibly even more significant is the ability to see how systems and/or files evolved over time. I briefly touched on this in the post Ripping Volume Shadow Copies Sneak Peek. I mentioned how parsing the configuration information helped me know what file types to search for based on the installed software. Another example was how the user account information helped me verify a user account existed on the system and narrow down the timeframe when it was deleted. A system's configuration information is just the beginning; documents, user activity, and programs launched are all great candidates to see how they changed over time.


To illustrate I’ll use a document as an example. When a document is located on a system without VSCs - for the most part - the only data that can be viewed in the document is what is currently there. Previous data inside the document might be able to be recovered from copies of the document or temporary files but won’t completely show how the document changed over time. To see how the document evolved would require trying to recover it at different points in time from system backups (if they were available). Now take that same document located on a system with VSCs. The document can be recovered from every VSC and each one can be examined to see its data. The data will only be what was inside the document when each VSC was created but it could cover a time period of weeks to months. Examining each document from the VSCs will shed light on how the document evolved. Another possibility is the potential to recover data that was in the document at some point in the past but isn't in the document that was located on the system. If system backups were available then they could provide additional information since more copies of the document could be obtained at other points in time.


Accessing VSCs


The Ripping Volume Shadow Copies approach works against mounted volumes. This means a forensic image or hard drive has to be mounted to a Windows system (Vista or 7) in order for the VSCs in the target volume to be ripped. There are different ways to see a hard drive or image’s VSCs and I highlighted some options:

        -  Mount the hard drive by installing it inside a workstation (option will alter data on the hard drive)
        -  Mount the hard drive by using an external hard drive enclosure (option will alter data on the hard drive)
        -  Mount the hard drive by using a hardware writeblocker
        -  Mount the forensic image using Harlan Carvey’s method documented here, here, and the slide deck referenced here
        -  Mount the forensic image using Guidance Software’s Encase with the PDE module (option is well documented in the QCCIS white paper Reliably recovering evidential data from Volume Shadow Copies)

Regardless of the option used to mount the hard drive or image, the Windows vssadmin command or the Shadow Explorer program can show what VSCs, if any, are available for a given mounted volume. The pictures below show the Shadow Explorer program and the vssadmin command displaying some VSCs for the mounted volume with drive letter C.

Shadow Explorer Displaying C Volume VSCs

VSSAdmin Displaying C Volume VSCs

Picking VSCs to examine is dependent on the examination goals and what data is needed to accomplish those goals. However, time will be a major consideration. Does the examination need to review an event, document, or user activity for specific times or for all available times on a computer? Answering that question will help determine whether certain VSCs covering specific times are picked or whether every available VSC should be examined. Once the VSCs are selected, they can be examined to extract the information of interest.


Another Approach to Examine VSCs


Before discussing another approach to examining VSCs, it's appropriate to reflect on the approaches practitioners are currently using. The first approach is to forensically image each VSC and then examine the data inside each image. Troy's slide deck referenced earlier has a slide showing how to image a VSC, and Richard Drinkwater's Volume Shadow Copy Forensics post from a few years ago shows imaging VSCs as well. The second popular approach doesn't use imaging: data is copied from each VSC and then examined. The QCCIS white paper referenced earlier outlines this approach using the robocopy program, as does Richard Drinkwater in his posts here and here. Both approaches are feasible for examining VSCs, but another approach is to examine the data directly inside VSCs, bypassing the need for imaging and copying. The Ripping VSCs approach examines data directly inside VSCs, and the two different methods to implement the approach are the Practitioner Method and the Developer Method.


Ripping VSCs: Practitioner Method


The Practitioner Method uses one's existing tools to parse data inside VSCs. This means someone doesn't have to learn a new tool or learn a programming language to write their own tools. All that's required is for the tool to be command line, plus the practitioner's willingness to execute the tool multiple times against the same data. The picture below shows how the Practitioner Method works.

Practitioner Method Process

Troy Larson demonstrated how a symbolic link can be used to provide access to VSCs. The mklink command can create a symbolic link to a VSC, which then provides access to the data stored in the VSC. The Practitioner Method uses the access provided by the symbolic link to execute one's tools directly against the data. The picture above illustrates a tool executing against the data inside Volume Shadow Copy 19 by traversing through a symbolic link. One could quickly determine the differences between VSCs, parse registry keys in VSCs, examine the same document at different points in time, or track a user's activity to see what files were accessed. Examining VSCs can become tedious when one has to run the same command against multiple symbolic links to VSCs; this is especially true when dealing with 10, 20, or 30 VSCs. A more efficient and faster way is to use batch scripting to automate the process. Only a basic understanding of batch scripting (knowing how a For loop works) is needed to create powerful tools to examine VSCs, as the sketch below suggests. In future posts I'll cover how simple batch scripts can be leveraged to rip data from any VSCs within seconds.
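As a taste of that automation, here is a hedged Python equivalent of such a loop: it parses the (English) output of vssadmin, links each VSC, and leaves a placeholder for whatever command-line tool you want to run against the link. It must run from an elevated prompt, and the tool invocation is an assumption, not part of the original method:

import re
import subprocess

# List the shadow copies for the mounted C: volume (elevated prompt required).
out = subprocess.check_output("vssadmin list shadows /for=C:", shell=True)
devices = re.findall(r"Shadow Copy Volume: (\S+)", out.decode("ascii", "ignore"))

for i, device in enumerate(devices):
    link = r"C:\vsc%d" % i
    # mklink is a cmd.exe builtin; note the required trailing backslash.
    subprocess.call('mklink /d %s "%s\\"' % (link, device), shell=True)
    # Placeholder: run any command-line tool against the linked VSC, e.g.
    # subprocess.call(["sometool.exe", link + r"\Windows\System32\config\SOFTWARE"])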


Ripping VSCs: Developer Method


I’ve been using the Practitioner Method for some time now against VSCs on live systems and forensic images. The method has enabled me to see data in different ways which was vital for some of my work involving Windows 7 systems. Recently I figured out a more efficient way to examine data inside VSCs. The Developer Method can examine data inside VSCs directly which bypasses the need to go through a symbolic link. The picture below shows how the Developer Method works.

Developer Method Process

The Developer Method programmatically accesses the data directly inside of VSCs. The majority of existing tools cannot do this natively, so one must modify existing tools or develop their own. I used the Perl programming language to demonstrate that the Developer Method for ripping VSCs is possible. I created simple Perl scripts to read files inside a VSC, and I modified Harlan's lslnk.pl to parse Windows shortcut files inside a VSC. Unlike the Practitioner Method, at the time of this post I have not extensively tested the Developer Method. I'm discussing the Developer Method not only for completeness in explaining the Ripping VSCs approach; my hope is that by releasing my research early it can help spur the development of DFIR tools for examining VSCs.
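The core trick is just a path. A Python sketch along these lines reads a file straight out of a VSC with no symbolic link in between; the shadow copy number and the file path are assumptions for illustration:

# Open a file directly inside a shadow copy via its device path.
vsc_file = r"\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy19\Users\Public\report.docx"
with open(vsc_file, "rb") as f:
    data = f.read()
print("%d bytes read from the shadow copy" % len(data))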


What’s Up Next?


Volume Shadow Copies have been a gold mine for me on the couple of corporate cases where they were available. The VSCs enabled me to successfully process the cases, and that experience is what pushed me towards a different approach to examining VSCs. This approach was to parse the data while it is still stored inside the VSCs. I'm not the only DFIR practitioner looking at examining VSCs in this manner. Stacey Edwards shared in her post Volume Shadow Copies and LogParser how she runs the program logparser against VSCs by traversing through a symbolic link. Rob Lee shared his work on Shadow Timelines, where he creates timelines and lists deleted files in VSCs by executing the Sleuthkit directly against VSCs. Accessing VSCs' data directly can reduce examination time while enabling a DFIR practitioner to see data temporally. Ripping Volume Shadow Copies is a six-part series, and the remaining five posts will explain the Practitioner and Developer methods in depth.

        Part 1: Ripping Volume Shadow Copies - Introduction
        Part 2: Ripping VSCs - Practitioner Method
        Part 3: Ripping VSCs - Practitioner Examples
        Part 4: Ripping VSCs - Developer Method
        Part 5: Ripping VSCs - Developer Example
        Part 6: Examining VSCs with GUI Tools



Reposted from: http://journeyintoir.blogspot.com/2012/01/ripping-volume-shadow-copies.html
