Logical evidence file size reduction


Senior Member

Re: Logical evidence file size reduction

Posted: May 28, 19 07:57

You could also use the tools built into Hadoop MapReduce.

Senior Member

Re: Logical evidence file size reduction

Posted: May 28, 19 13:34

Thank you TinyBrain,

Will look into that too.

Interestingly enough, I was just able to export an 8TB LEF onto a 4TB USB drive.

Despite error messages about insufficient free space (some forensic suites even grey out the continue button based on the reported size), it appears that at least one of the suites deduplicates adequately. According to the log files, the LEF export completed successfully and without errors. Remarkable.


Senior Member

Re: Logical evidence file size reduction

Posted: May 28, 19 15:50

I'm not sure of your exact strategy but I'd suggest breaking things up.

Create a data set of all your live files first and keep it separate (this will obviously be considerably smaller than the original drive).

You've then got to deal with all your deleted files, other non-live files such as those from volume shadow copies (if you've got lots of those), and then carved files.

I'd again suggest filtering for all your non-live files, then running some sort of validation process, such as checking that signatures are OK. Even better, apply some further filtering: if your tool supports text summaries (NUIX, for example), you might quickly filter out items that show no decodable textual information. This isn't perfect, but it's a valid strategy provided you do it in the knowledge that you may be excluding image-based or problematic documents. You could then also remove, by hash, any duplicates that already exist in the file-system set.
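The hash-based deduplication step above can be sketched roughly as follows. This is a minimal illustration, not any suite's actual implementation; the directory layout and function names are hypothetical, and it assumes the live-file set has already been exported to its own directory.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large evidence files never sit wholly in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def dedupe_against_live_set(live_dir: Path, candidate_dir: Path) -> list[Path]:
    """Return only those candidate (non-live) files whose hash is NOT already in the live set."""
    live_hashes = {sha256_of(p) for p in live_dir.rglob("*") if p.is_file()}
    return [p for p in candidate_dir.rglob("*")
            if p.is_file() and sha256_of(p) not in live_hashes]
```

Anything the function returns is genuinely new material worth keeping; everything else is already represented in the live set and can be dropped from the export.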

For the carving you could do much the same sort of thing. Although if you do have NUIX, I'd warn that it's unlikely to match much by hash, because its carving logic is nonsense: I believe it carves to the end of the sector rather than to what appears to be the end of the file, so the carving results will usually not match identical documents by hash.
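To see why carving to a sector boundary defeats hash matching, consider a toy example (the file content and 512-byte sector size are illustrative assumptions, not taken from any particular tool):

```python
import hashlib

SECTOR = 512  # common sector size assumed for illustration

original = b"%PDF-1.4 example document body"  # the file as stored in the live set
# A carver that runs to the end of the sector emits the file plus trailing slack bytes:
padding = b"\x00" * (SECTOR - len(original) % SECTOR)
carved = original + padding

live_hash = hashlib.sha256(original).hexdigest()
carved_hash = hashlib.sha256(carved).hexdigest()
# The trailing padding changes the digest, so a hash-based dedup pass
# will treat the carved copy as new data even though the content is identical.
```

The bytes are the same up to the original length, but the digests differ, which is exactly why carved results fail to match their live counterparts by hash.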
