ISO17025 Inter Lab Comparison data-sets (Final year project)

11 Posts
6 Users
0 Likes
839 Views
(@marca)
Posts: 2
New Member
Topic starter
 

Hello,

I am a computer forensics student at the University of South Wales in the final year of my degree. I spent the last year on an internship in law enforcement. The lab that I was working in was ISO 17025 accredited.

For my final year project, I am aiming to create data sets that can be used for validation in compliance with ISO 17025. These data sets could be used in an Inter Lab Comparison scheme. As I have only worked in one lab, I only know that lab's requirements and the issues it faced, so I would like input from others as to which elements you think I need to include in these data sets:

• Which devices?

• Which operating systems?

• Which file types?

• Which forensic processes?

• Which file systems?

Thank you in advance for any help and advice.
Marc

 
Posted : 23/09/2019 8:30 am
(@rich2005)
Posts: 535
Honorable Member
 

I think this would be a good question for the forensic regulator who implemented the silly standard 😉

 
Posted : 23/09/2019 9:32 am
(@athulin)
Posts: 1156
Noble Member
 

As I have only worked in one lab, I only know that lab's requirements and the issues it faced, so I would like input from others as to which elements you think I need to include in these data sets

General principle: data for the most common or most important analysis methods/tools. If those methods rest on prerequisites, also data that validates that those prerequisites are fulfilled.

Example: you may have a tool that does something useful with NTFS file data – say, it examines ADS contents. So primarily, data that contains all kinds of 'files' (I use the term in the NTFS sense, not in the everyday sense) with ADS data, including at least one file that has the maximum number of ADS that NTFS supports.
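To make the ADS part concrete, here is a minimal sketch of how such test files could be generated, assuming a Windows host and an NTFS volume (the file names, stream names, and counts are illustrative, not from the post):

```python
# Generate NTFS test files carrying Alternate Data Streams (ADS).
# Runs only on Windows against an NTFS volume.

def make_ads_file(path, n_streams):
    """Create a file with a primary stream plus n_streams named ADS."""
    with open(path, "w") as f:
        f.write("primary data stream\n")
    for i in range(n_streams):
        # The "file.txt:streamN" syntax addresses a named stream on NTFS.
        with open(f"{path}:stream{i}", "w") as s:
            s.write(f"contents of ADS number {i}\n")

make_ads_file("ads_small.txt", 3)
# NTFS's per-file ADS limit comes from MFT attribute space rather than a
# fixed count, so a 'max ADS' sample is best produced by adding streams
# until creation fails, e.g. make_ads_file("ads_max.txt", 5000).
```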

But the prerequisites for running that tool are (a) that the disk/volume does contain an NTFS file system, and (b) that the NTFS file system is not malformed. So additional data sets with 'other' file systems, or 'old' NTFS, and possibly also HPFS (from Windows NT days), as well as more or less malformed NTFS file systems. (I haven't looked into NTFS portability – it might be an idea to provide Microsoft NTFS from the Intel platform, Microsoft NTFS from ARM and other hardware platforms, Linux NTFS from similarly 'different' platforms, as well as various release versions.)

Additionally, some analyses or sub-analyses ('find all documents on this volume that have been created on it') may depend on other analyses ('find all word processors on this system, present as well as past') and closely related ones ('find tools that can convert from one word processing format to another, but aren't word processors') … and so on. ('Find all archive files' … can all archive files be identified? Can they be examined correctly, or does it drop 'unusual' data or metadata?)

Your remaining questions could quite possibly be answered by asking the lab you worked with which operating systems/file systems/… are the most common or most important for them to get right. (That list would probably be useful to post here for additional suggestions or ideas.)

The only question I'm not sure can be answered is the one about 'which forensic processes'. I don't think any are sufficiently well established. The best way is probably to suggest a few – that has a better chance of provoking suggestions or criticism, and so may lead to improvements.

I don't do this anymore, but the standard things I used to do before any specific analysis was performed included (a sketch of one of these checks follows the list):

• verification that file systems were reasonably sound (basically fsck or equivalent);

• a system on/off timeline, with particular attention to clean shutdowns vs. brutal power-offs;

• users (existing and past), and user login/logout timelines;

• external devices connected/removed, and external connections;

• identifiable software (installed as well as uninstalled);

• a scraping of system logs for signs of problems (IOCs, if you like, or other signs of trouble);

• and usually the standard 'initiate case' steps of whatever forensic tool platform I was using.
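A minimal sketch of the on/off-timeline item, assuming the System event log has been exported from the image and the python-evtx package is available (both assumptions, not part of the post); the well-known Windows event IDs 6005/6006/6008 distinguish boots, clean shutdowns, and brutal power-offs:

```python
import re

import Evtx.Evtx as evtx  # assumed dependency: pip install python-evtx

# Map well-known Windows System-log event IDs to timeline labels.
EVENTS = {
    "6005": "boot (Event Log service started)",
    "6006": "clean shutdown",
    "6008": "unexpected shutdown / power loss",
}

def onoff_timeline(evtx_path):
    """Print a crude system on/off timeline from an exported System.evtx."""
    with evtx.Evtx(evtx_path) as log:
        for record in log.records():
            xml = record.xml()
            event_id = re.search(r"<EventID[^>]*>(\d+)</EventID>", xml)
            if event_id and event_id.group(1) in EVENTS:
                # Regex scraping of the record XML is a shortcut for illustration only.
                when = re.search(r'SystemTime="([^"]+)"', xml)
                print(when.group(1) if when else "?", EVENTS[event_id.group(1)])

# onoff_timeline("System.evtx")  # path to the exported log is assumed
```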

I think it's easier to focus on tools, but … processes are certainly the ultimate goal.

I hope it's obvious that data sets are only part of validation. The validation process itself needs to be defined or … at the very least laid out smorgasbord-style, so that someone can decide what to keep and what to pass by.

 
Posted : 23/09/2019 10:41 am
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

For my final year project, I am aiming to create data sets that can be used for validation in compliance with ISO 17025. These data sets could be used in an Inter Lab Comparison scheme. As I have only worked in one lab, I only know that lab's requirements and the issues it faced.

I personally have serious difficulties in visualizing what a "data set" is.

Why don't you post an example "data set" limited to one of the experiences you had, i.e. consisting of one (I presume one out of many) actually-used approach, limited to only the requirements and issues faced in that specific lab during your internship?

jaclaz

 
Posted : 23/09/2019 12:18 pm
minime2k9
(@minime2k9)
Posts: 481
Honorable Member
 

OK, so this is a much bigger issue than can be covered in a one-year project (IMHO), and I would focus on a specific process.
In a proper scenario, all the processes would be mapped out, validated, and then covered by an over-arching procedure that links them together.

Let's take recovery of files, focusing on picture/video files, as that is a large percentage of LE usage. For the sake of this argument we will limit it to common file systems and the Windows OS only.
Have a look at the NIST testing for carving images; that will give you an idea of the depth I would aim for.
Then the requirements for file recovery should be specified, such as:

• must recover non-fragmented files from the file system;

• must recover thumbnail images stored by the given OS.

In both of the above, file system and OS would be the variables for each test.
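To make the variation idea concrete, a trivial sketch of how the test matrix multiplies out (the specific file systems, OS versions, and requirement wordings are illustrative assumptions):

```python
import itertools

# Every requirement is exercised against every file-system/OS combination.
filesystems = ["NTFS", "FAT32", "exFAT"]
operating_systems = ["Windows 7", "Windows 8.1", "Windows 10"]
requirements = [
    "recover non-fragmented files from the file system",
    "recover thumbnail images stored by the OS",
]

for fs, os_name, req in itertools.product(filesystems, operating_systems, requirements):
    print(f"Test: {req} | file system: {fs} | OS: {os_name}")
```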

Then you would need to build on this test set to create tests for thumbnail files as created by every OS between, say, Windows 7 and Windows 10 as a representative range, or base the selection on justification as to when the thumbnail file format changed, etc.

Then further images where pictures are included in (see the sketch after this list):

• different types of archives;

• disk images (VHD, VMDK, VDI, ISO);

• SQLite databases;

• shadow volumes;

• …. probably quite a few other things that I have missed here.
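As a sketch of what one such test image and its ground truth could look like (file names, sizes, and offsets are illustrative assumptions; a real set would also include fragmented and partially overwritten cases):

```python
import hashlib
import json
import os
import random

SECTOR = 512
IMAGE_SIZE = 64 * 1024 * 1024  # 64 MiB raw test image

def build_image(image_path, pictures, manifest_path):
    """Embed known pictures at recorded offsets and write a ground-truth manifest."""
    random.seed(42)  # reproducible filler, so the data set can be regenerated
    with open(image_path, "wb") as img:
        img.write(random.randbytes(IMAGE_SIZE))  # needs Python 3.9+
    manifest, offset = [], 1024 * SECTOR
    with open(image_path, "r+b") as img:
        for pic in pictures:
            with open(pic, "rb") as f:
                data = f.read()
            img.seek(offset)
            img.write(data)
            manifest.append({
                "file": os.path.basename(pic),
                "offset": offset,
                "length": len(data),
                "sha256": hashlib.sha256(data).hexdigest(),
            })
            # A gap after each file keeps them non-fragmented and sector-aligned.
            offset += ((len(data) // SECTOR) + 16) * SECTOR
    with open(manifest_path, "w") as f:
        json.dump(manifest, f, indent=2)

# build_image("carve_test.raw", ["a.jpg", "b.png"], "truth.json")  # local sample files assumed
```

A carving tool run against the image can then be scored by hashing what it recovers and comparing against the manifest.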

I reckon, as a conservative estimate, that you would need somewhere in the region of 100 different image files to properly cover picture and video file recovery.

This is why ISO 17025 is a joke: nobody does proper validation (that I have seen) for the investigation side, but UKAS rubber-stamps it and everybody pats themselves on the back as if they have done something useful with their lives.

Ideally there should be a national set of requirements for each process, and then organisations validate against the ones they require. This way, company x validating requirements 1–10 would be comparable to (though not necessarily as good as) organisation y validating requirements 1–10.

I can provide further assistance via PM if you want some guidance with some template documents.

My 2 pence/rant.

 
Posted : 23/09/2019 3:15 pm
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

This is why ISO 17025 is a joke, …

Well, no 😈, a line must be drawn somewhere: you cannot just put ISO 17025 in the category of jokes.

That is unfair to good jokes; you need to specify that it is a NOT-funny joke. 😉

jaclaz

 
Posted : 24/09/2019 10:01 am
minime2k9
(@minime2k9)
Posts: 481
Honorable Member
 

This is why ISO 17025 is a joke, …

Well, no 😈, a line must be drawn somewhere: you cannot just put ISO 17025 in the category of jokes.

That is unfair to good jokes; you need to specify that it is a NOT-funny joke. 😉

jaclaz

Perhaps we could classify it as a trick, similar to something we would expect from Loki?

 
Posted : 24/09/2019 10:32 am
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

Perhaps we could classify it as a trick, similar to something we would expect from Loki?

Loki may be evil, but he is not stupid; you should never confuse evilness with stupidity 🙄.

Compare with Carlo M. Cipolla, The Basic Laws of Human Stupidity:
https://en.wikipedia.org/wiki/Carlo_M._Cipolla
ISO 17025 (actually the people who forced its application on digital forensics) sits in the lower-left quadrant; Loki (and his tricks) would definitely be in the lower-right one.

jaclaz

 
Posted : 24/09/2019 11:53 am
minime2k9
(@minime2k9)
Posts: 481
Honorable Member
 

I think I need to print out that chart and put it on my wall at work!

 
Posted : 24/09/2019 4:28 pm
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

I think I need to print out that chart and put it on my wall at work!

Then get the "right" (original) one, including the POM line dividing Helpless and Bandits into two categories.

There is a (IMHO horrible) short and simplified version illustrated by James Donnelly, easily available online, originally published at
https://web.archive.org/web/20110320011208/http://wwwcsif.cs.ucdavis.edu/~leeey/stupidity/basic.htm

and the "original"
Carlo M. Cipolla, The Basic Laws of Human Stupidity, il Mulino, 2011, ISBN 978-88-15-23381-3.

Again, a line needs to be drawn between Bandits with overtones of Intelligence (B1) and Bandits with overtones of Stupidity (B2); Loki would most probably be in the first category.

jaclaz

 
Posted : 24/09/2019 7:11 pm