Recommendations for carving software

10 Posts
8 Users
0 Likes
1,761 Views
(@jadams951)
Posts: 37
Eminent Member
Topic starter
 

Just wondering what others use/recommend for carving images. My main use would be for images and videos.

 
Posted : 26/10/2019 2:28 am
tracedf
(@tracedf)
Posts: 169
Estimable Member
 

Check out Autopsy (which uses PhotoRec for carving); it's free.

https://www.sleuthkit.org/autopsy/
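
For anyone curious what a signature-based carver like PhotoRec actually does under the hood, here is a minimal Python sketch (illustrative only; real carvers also handle fragmented files, validate structure, and support hundreds of formats). The image path and output directory are placeholders.

```python
#!/usr/bin/env python3
"""Minimal signature-based carving sketch: extract contiguous JPEGs from a raw image.
Illustrative only -- real carvers (PhotoRec, X-Ways, ...) also handle fragmentation,
validate file structure, and support far more file types."""
import pathlib

SOI = b"\xff\xd8\xff"   # JPEG start-of-image marker (plus first byte of the APPn segment)
EOI = b"\xff\xd9"       # JPEG end-of-image marker

def carve_jpegs(image_path: str, out_dir: str, max_size: int = 20 * 1024 * 1024) -> int:
    """Scan a raw disk image for contiguous JPEGs and write them to out_dir."""
    data = pathlib.Path(image_path).read_bytes()   # fine for small test images
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)

    count = 0
    pos = data.find(SOI)
    while pos != -1:
        end = data.find(EOI, pos + len(SOI), pos + max_size)
        if end != -1:
            out.joinpath(f"carved_{count:05d}.jpg").write_bytes(data[pos:end + len(EOI)])
            count += 1
        pos = data.find(SOI, pos + len(SOI))
    return count

if __name__ == "__main__":
    # Hypothetical paths -- substitute your own test image and output folder.
    print(carve_jpegs("test_image.dd", "carved_out"), "files carved")
```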

 
Posted : 26/10/2019 4:32 am
Igor_Michailov
(@igor_michailov)
Posts: 529
Honorable Member
 

Tools Up: The Best Software and Hardware Tools for Computer Forensics

https://www.group-ib.com/blog/digital_forensics_tools

 
Posted : 26/10/2019 5:02 am
(@eugenebelk)
Posts: 16
Active Member
 

You, as well as other members interested in data carving, may find our guide useful (it covers Belkasoft Evidence Center and its capabilities): https://articles.forensicfocus.com/2019/05/03/walkthrough-carving-with-belkasoft-evidence-center/

 
Posted : 11/11/2019 3:26 pm
keydet89
(@keydet89)
Posts: 3568
Famed Member
 

Just wondering what others use/recommend for carving images. My main use would be for images and videos.

Google

https://www.nist.gov/itl/ssd/software-quality-group/computer-forensics-tool-testing-program-cftt/cftt-technical-0

https://www.linkedin.com/pulse/some-useful-forensics-tools-your-investigation-tonny-bj%C3%B8rn/

Etc.

 
Posted : 11/11/2019 4:57 pm
minime2k9
(@minime2k9)
Posts: 481
Honorable Member
 

I'd always recommend X-Ways; from everything I've tested, it has the best results.

 
Posted : 12/11/2019 6:51 am
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

Inspired by the links from keydet89 and the suggestions by tracedf and minime2k9, I decided to find out whether the data from the NIST tests confirmed that X-Ways and Photorec are the "best" tools around (for image carving, not videos).

I purged the results related to thumbnails (which IMHO have only very minor relevance compared to "full size" images) and assembled the rest in a spreadsheet, inventing 😯 my own scoring scale.

As expected, X-Ways and Photorec got the best ratings, but there are IMHO some surprises in the other results:

Tool                            Score
Ideal Tool (non-existing)       10,00
X-Ways Forensics v17.6           7,86
Photorec v7.0 WIP                6,91
R-Studio v6.2                    6,67
Recover My Files v5.2.1          5,90
FTK 4.1                          5,14
Adroit Photo Forensics v3.1d     4,88
Encase Forensics v6.18.0.59      3,64
Scalpel v2.0                    -5,03
Encase Forensics v7.09.05     -414,28
ILook v2.2.7                  -999,99  <- This is a joke (but the NIST started it)
In particular, now I start to understand why users of Encase v6 were so upset by v7.

The spreadsheet is here:
http://s000.tinyupload.com/index.php?file_id=85214280268665349961
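
For readers who want to reproduce something similar, a rough sketch of how per-test NIST counts could be folded into a single tool score might look like the following; the weights are invented for illustration and are not jaclaz's actual formula.

```python
"""Sketch of turning per-test carving counts into a single tool score.
The weights are invented for illustration and are NOT the scoring scale used above."""
from dataclasses import dataclass

@dataclass
class TestResult:
    expected: int          # images actually present in the test set
    viewable: int          # carved and fully viewable
    partial: int           # carved but only partially viewable
    not_viewable: int      # carved but not viewable (earns nothing here)
    false_positives: int   # "images" carved that were never there

def score(results: list[TestResult]) -> float:
    """Average per-test score on a roughly 0..10 scale, minus a false-positive penalty."""
    per_test = []
    for r in results:
        gained = (1.0 * r.viewable + 0.5 * r.partial) / r.expected   # recovery quality
        penalty = 2.0 * r.false_positives / r.expected               # false positives hurt badly
        per_test.append(10.0 * gained - penalty)
    return sum(per_test) / len(per_test)

# Example with made-up counts for a single hypothetical test:
print(round(score([TestResult(expected=35, viewable=24, partial=8,
                              not_viewable=3, false_positives=0)]), 2))
```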

jaclaz

 
Posted : 12/11/2019 4:12 pm
(@rich2005)
Posts: 535
Honorable Member
 

Thanks for that, jaclaz - very useful. (I'd been contemplating improving my general strategy by carving in X-Ways and bringing the results into my NUIX case via a container after a bit of filtering, rather than carving in NUIX, for a number of reasons.)

 
Posted : 12/11/2019 4:24 pm
minime2k9
(@minime2k9)
Posts: 481
Honorable Member
 

Even though it is very useful, I should point out that pretty much all of the NIST testing was done in 2014, and some of those versions are quite old. X-Ways is now on 19.8 SR9 (19.9 is still in beta), while NIST tested 17.6, which is a lot of versions' difference.

The NIST reports are probably the best validation of tools I have seen, but they're also a great example of how things move far too fast to actually perform any meaningful validation.

 
Posted : 12/11/2019 8:31 pm
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

Even though it is very useful, I should point out that pretty much all of the NIST testing was done in 2014, and some of those versions are quite old. X-Ways is now on 19.8 SR9 (19.9 is still in beta), while NIST tested 17.6, which is a lot of versions' difference.

The NIST reports are probably the best validation of tools I have seen, but they're also a great example of how things move far too fast to actually perform any meaningful validation.

Sure, and the blatant differences in performance (besides my personal scores) between the two successive versions of Encase (hopefully an isolated case) show clearly enough how you cannot "extend" validation from one release of the same tool to another.

I have no idea how the tools (or the investigators) deal, or should deal, with "false positives", but if the thingy generated around 9,000 false positives on a test dataset containing 40 images, how many would be generated if the disk under examination holds thousands of images? Millions? 😯
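
To put rough numbers on that worry, here is a back-of-the-envelope extrapolation; it assumes false positives grow proportionally with the data scanned, which may well not hold, so treat it as illustration only.

```python
"""Back-of-the-envelope false-positive extrapolation -- purely illustrative.
Assumes the false-positive count scales linearly with the data scanned, which may be wrong."""

fp_in_test = 9000        # false positives reported on the test dataset (figure from the post)
real_in_test = 40        # real images in that dataset
precision = real_in_test / (real_in_test + fp_in_test)
print(f"precision on the test set: {precision:.4f}")   # ~0.0044

# If a case drive holds N real images and false positives grow with the scanned data:
for scale in (25, 25_000):   # i.e. 1,000 or 1,000,000 real images
    print(f"{real_in_test * scale:>9,} real images -> roughly "
          f"{fp_in_test * scale:>13,} false positives to triage")
```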

Also, the (nice) tests by NIST (understandably) do not touch the topic of "validation": how do you actually validate any of those?

I mean, let's take the single test where the overall better-scoring tools (X-Ways and Photorec) were less "brilliant":
4.5 Fragmented Out of Order
X-Ways Forensics v17.6: out of 35 images, 24 carved, of which 3 viewable, 24 only partially viewable, 7 not viewable
Photorec v7.0 WIP: out of 35 images, 12 carved, of which 3 viewable, 9 only partially viewable, 0 not viewable

How can you validate the latter (and "trust" it) when it has roughly half of the performance of the former?

Then you take another tool (actually not performing very well in the "simpler" tests), and you have (in this specific test) a result that is double the best of the two above:
Encase Forensics v6.18.0.59: out of 35 images, 46 carved, of which 10 viewable, 9 only partially viewable, 6 not viewable, 21 false positives

The validation may only refer to the fact that all three tools above did not "invent" new images from random bytes.
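
One practical way to sanity-check a carving run against a known test set (a sketch of my own, not anything NIST prescribes) is to hash every carved file and compare against the reference hashes of the images planted on the test media; only fully intact carves will match, which is exactly why the viewable/partially viewable distinction matters.

```python
"""Sketch: check carved output against a known reference set by SHA-256.
Only fully intact carves will match; partial or corrupt carves will not,
which is one reason hash matching alone is a blunt validation instrument."""
import hashlib
import pathlib

def sha256(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def compare(reference_dir: str, carved_dir: str) -> None:
    ref = {sha256(p) for p in pathlib.Path(reference_dir).iterdir() if p.is_file()}
    carved = [p for p in pathlib.Path(carved_dir).iterdir() if p.is_file()]
    matched = sum(1 for p in carved if sha256(p) in ref)
    print(f"{matched}/{len(ref)} reference files recovered intact; "
          f"{len(carved) - matched} carved files did not match (partial or false positive)")

# Hypothetical directories -- substitute the planted test images and the carver's output.
compare("reference_images", "carved_out")
```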

And still, I would like to see what happens in court when one investigator (using tool A) says (test results multiplied by 100 to convey the effect):
"From the disks I examined I was able to recover 4,900 viewable images in total."
and the investigator for the other party (using tool B) says:
"From the same disks (images) I was able to recover 16,000 viewable images in total."

jaclaz

 
Posted : 13/11/2019 9:22 am