Articles Tagged with digital evidence

People now ask ChatGPT and Claude everything, including what to do after an arrest, whether the police can prove a case, and how to explain suspicious facts. If you have been accused of a crime, asking a chatbot those questions can be a serious mistake. A recent federal court opinion shows why people should be very cautious before typing case facts, strategy, timelines, or explanations into a consumer AI platform.

A recent opinion from the Southern District of New York, United States v. Heppner, addressed whether a criminal defendant’s communications with the AI platform Claude were protected by the attorney-client privilege or the work-product doctrine. On the facts before it, the court said no. The Harvard Law Review’s discussion of the decision is worth reading, and helped inspire this article.

The practical lesson is straightforward. AI is not your lawyer. A public AI platform is not the same thing as a confidential legal channel. If you are under investigation, worried about charges, or already facing prosecution, you should assume that discussing your case with AI can create risks your lawyer would rather you had avoided.

Deleted CSAM evidence in Michigan is rarely gone for good. Forensic analysts can identify files that linger in a device’s cache by their hash values, even after the user has deleted the images. A hash match against the NCMEC database can support charges without recovering the complete image file.

Deleted CSAM files are rarely eliminated from a device’s storage simply by hitting delete. One place deleted images frequently remain is in the device’s cache or temporary memory, where forensic analysts can identify and recover them even when the user believes the material is gone.

When a forensic analyst identifies a CSAM file on a device, what they have established is that a file matching a known hash value was present. A hash value is a numerical fingerprint that is, for practical purposes, unique to a specific digital file. The National Center for Missing and Exploited Children maintains a database of hash values for known CSAM images, and a match against that database can support charges even when the original image has been deleted, was never fully opened, or exists only as a fragment in the device’s cache memory.
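To illustrate the hash-matching concept in general terms, the sketch below computes a SHA-256 digest of a file’s raw bytes and checks it against a set of known hashes. This is a simplified illustration only: the hash algorithms, database format, and tooling used in actual forensic work and in the NCMEC database differ, and every name and value here is hypothetical.

```python
import hashlib


def file_hash(data: bytes) -> str:
    """Compute a SHA-256 digest of raw file bytes (one common hash algorithm)."""
    return hashlib.sha256(data).hexdigest()


# Hypothetical set of known hash values (placeholders, not real entries).
KNOWN_HASHES = {
    file_hash(b"example-known-file-bytes"),
}


def matches_known_hash(data: bytes) -> bool:
    """True if the file's fingerprint appears in the known-hash set."""
    return file_hash(data) in KNOWN_HASHES


# Identical bytes always produce the identical hash, so a match can be
# established from the fingerprint alone, without the original file name.
print(matches_known_hash(b"example-known-file-bytes"))  # True
print(matches_known_hash(b"some-other-file-bytes"))     # False
```

The key point the example captures is that the comparison is between fingerprints, not images: a match shows that bytes producing a known hash were present, even if the file itself was later deleted or survives only as a fragment.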
