What is Validation?
Do the tools that you employ in your casework actually work? Are the results that you generate worthy of trust? How would you know?
Before you use a tool in casework, that tool should be validated.
What does that mean? Let’s take a look.
One of the problems with products that generate results by comparison against an internal database is error – false positives and false negatives. As much as we like Amped Authenticate, it does suffer from this problem. Many of its results are based upon an internal database, and that database might not be updated as often as you’d like. The new reporting feature in Authenticate makes this worse, not better.
Here’s the scenario.
You load an image into Authenticate and check the File Information, or you run the “Smart Report.” One of the results in this panel indicates a mismatch of the JPEG’s Quantization Table – “Compression signature is incompatible with the actual camera make-model.”
Question: how do you know whether this result reflects an actual mismatch? How do you know it isn’t simply a “no match” – meaning the evidence file comes from a camera whose JPEG QT information isn’t actually in the product’s database, in this case Amped Authenticate’s?
Answer: run a batch File Format Comparison against a valid sample set of random images taken by the same model of camera.
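If you want to see for yourself what the tool is comparing, a JPEG’s quantization tables can be read directly with open-source libraries. Below is a minimal sketch in Python using Pillow; the qt_hash() helper and the evidence.jpg filename are our own illustrative stand-ins, and the digest here is merely a stand-in for whatever hash Amped actually computes.

```python
# A minimal sketch, assuming Pillow (pip install Pillow) and Python 3.
# qt_hash() is a hypothetical helper, not an Amped function, and the
# MD5 digest is only a stand-in for the tool's "JPEG QT hash".
import hashlib
from PIL import Image

def qt_hash(path):
    """Digest of a JPEG's quantization tables, for file-to-file comparison."""
    with Image.open(path) as img:
        if img.format != "JPEG":
            raise ValueError(f"{path} is not a JPEG")
        tables = img.quantization  # dict: table id -> 64 coefficients
    # Serialize the tables in a stable order before hashing.
    blob = "|".join(
        ",".join(map(str, tables[k])) for k in sorted(tables)
    ).encode()
    return hashlib.md5(blob).hexdigest()

print(qt_hash("evidence.jpg"))  # placeholder filename
```

Two files produced with identical compression settings will yield identical digests; a different digest means at least one quantization table differs.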
This is the essence of tool validation – understanding the results and attempting to determine a tool’s potential and actual problems.
In this case, the validation is simple if you have the right tools. Calculate an appropriate sample size, assemble that group of sample images, and drop a copy of your reference image into the folder of samples. Then run the File Format Comparison test.
When the report is generated, scroll across to find the column for the JPEG QT hash. If your evidence image truly is a mismatch – “Compression signature is incompatible with the actual camera make-model” – then the sample images’ JPEG QT hashes should not match the reference file’s. In this case, all of the samples matched each other and matched the reference file. The result from the software was false. This result would go in the validation report for the tool.
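To make that batch logic concrete, here is a sketch of the same folder-level check, assuming the hypothetical qt_hash() helper from the sketch above is in scope. The samples/ folder and evidence.jpg filename are placeholders for your own sample set and questioned image.

```python
# Batch check: hash every sample in a folder and compare against the
# reference file. Assumes qt_hash() from the earlier sketch is defined.
from pathlib import Path

ref_hash = qt_hash("evidence.jpg")  # the questioned/reference image
print(f"Reference QT hash: {ref_hash}")

for sample in sorted(Path("samples").glob("*.jpg")):
    h = qt_hash(sample)
    verdict = "MATCH" if h == ref_hash else "MISMATCH"
    print(f"{sample.name}: {h} {verdict}")
```

If every sample matches the reference, as happened here, you have documented grounds to question the software’s “mismatch” verdict.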
Were this an actual case, you would want to document this validation process thoroughly. You wouldn’t want to leave the statement of “mismatch” untested. Yes, the software returned that result. However, we assembled a valid sample set to test it, and we found the software in error: the evidence file’s JPEG QT matched the JPEG QT of every sample file (likely because the software’s internal database was incomplete or hadn’t been updated in a while).
The pace of change in camera technology is brisk. The issue is further complicated by the fact that the camera market is somewhat regional – cameras sold in some areas aren’t sold in others. Additionally, not all manufacturers make their QTs available to developers; very few do, actually. Is it realistic, then, to expect every QT to be present in a particular software’s database? Of course not. Therefore, you’ll need to validate your tools ahead of casework.
If you would like to engage our services, we’d be happy to talk with you about our tool validation services. If you’d like us to help you in your case, we’re here for you as well. If you’d like to learn this valuable skill, check out our training options and sign up today. If you don’t see a date that works for you, suggest one. We can come to your location, or you can come to ours.