
Pro Digital Forensics Analysis Toolkit

A comprehensive skill providing digital forensics investigation and analysis techniques. Built for Claude Code with best practices and real-world patterns.


Digital Forensics Analysis Toolkit

Comprehensive digital forensics investigation framework covering evidence acquisition, filesystem analysis, memory forensics, network packet analysis, and forensic reporting for incident response.

When to Use This Skill

Choose Digital Forensics Analysis when:

  • Investigating security incidents requiring evidence preservation
  • Analyzing compromised systems for indicators of compromise
  • Performing malware analysis on suspicious files
  • Conducting log analysis across multiple system sources
  • Preparing forensic reports for incident documentation

Consider alternatives when:

  • You need real-time threat detection — use SIEM tools
  • You are performing routine vulnerability scanning — use dedicated scanners
  • You need active threat response — use EDR/XDR platforms

Quick Start

```
# Activate forensics toolkit
claude skill activate pro-digital-forensics-analysis-toolkit

# Analyze a disk image
claude "Perform forensic analysis on the disk image at /evidence/case001.dd"

# Memory forensics
claude "Analyze the memory dump for indicators of compromise"
```

Example Investigation Workflow

```
# 1. Evidence acquisition - create a forensic disk image
dd if=/dev/sda of=/evidence/disk.dd bs=4096 conv=noerror,sync status=progress

# Generate hash for integrity verification
sha256sum /evidence/disk.dd > /evidence/disk.dd.sha256

# 2. Mount read-only for analysis
mkdir /mnt/evidence
mount -o ro,loop,noexec /evidence/disk.dd /mnt/evidence

# 3. Timeline generation
find /mnt/evidence -type f -printf '%T+ %p\n' | sort > timeline.txt

# 4. Extract and analyze logs
grep -E "Failed|Accepted|session opened" /mnt/evidence/var/log/auth.log

# 5. File carving for deleted files
foremost -t jpg,pdf,doc -i /evidence/disk.dd -o /evidence/carved/

# 6. Hash analysis against known malware
md5deep -r /mnt/evidence/tmp/ > file_hashes.txt
```

Core Concepts

Forensic Analysis Phases

| Phase | Description | Tools |
| --- | --- | --- |
| Identification | Determine scope and evidence sources | Case documentation |
| Preservation | Create forensic copies, establish chain of custody | dd, dc3dd, FTK Imager |
| Analysis | Examine evidence for artifacts and indicators | Autopsy, Volatility, Wireshark |
| Documentation | Record findings with timestamps and evidence links | Forensic report templates |
| Presentation | Prepare findings for stakeholders or legal proceedings | Timeline visualizations |

Evidence Types and Analysis

| Evidence Type | Artifacts | Analysis Method |
| --- | --- | --- |
| Filesystem | Files, deleted data, timestamps, permissions | Timeline analysis, file carving |
| Memory | Running processes, network connections, encryption keys | Volatility framework |
| Network | Packet captures, DNS queries, flow data | Wireshark, tcpdump, Zeek |
| Logs | System events, application logs, audit trails | Log correlation, pattern matching |
| Registry (Windows) | User activity, installed software, persistence | Registry hive analysis |
| Browser | History, cookies, cached pages, downloads | Browser artifact extraction |

```
# Volatility memory analysis commands

# Identify operating system profile
volatility -f memory.dmp imageinfo

# List running processes
volatility -f memory.dmp --profile=Win10x64 pslist

# Find hidden processes
volatility -f memory.dmp --profile=Win10x64 psscan

# Extract network connections
volatility -f memory.dmp --profile=Win10x64 netscan

# Dump suspicious process memory
volatility -f memory.dmp --profile=Win10x64 memdump -p 1234 -D /evidence/proc/

# Check for code injection
volatility -f memory.dmp --profile=Win10x64 malfind
```

Configuration

| Parameter | Description | Default |
| --- | --- | --- |
| case_id | Unique case identifier for evidence tracking | Required |
| evidence_dir | Root directory for evidence storage | /evidence |
| hash_algorithm | Hash algorithm for integrity verification | sha256 |
| timezone | Timezone for timestamp normalization | UTC |
| preserve_timestamps | Maintain original file timestamps | true |
| chain_of_custody | Enable chain of custody logging | true |
| report_format | Output format: html, pdf, markdown | markdown |
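These parameters could live in a config file; the YAML layout and filename below are assumptions for illustration, since this page does not specify a format:

```yaml
# forensics.yaml - hypothetical layout; parameter names from the table above
case_id: CASE-2024-001       # required, no default
evidence_dir: /evidence
hash_algorithm: sha256
timezone: UTC
preserve_timestamps: true
chain_of_custody: true
report_format: markdown
```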

Best Practices

  1. Never modify original evidence — Always work on forensic copies. Mount evidence images as read-only and use write blockers for physical media. A single modified byte can invalidate an entire evidence chain in legal proceedings.

  2. Document every action with timestamps — Maintain a detailed forensic log of every command executed, every file accessed, and every finding discovered. Include your timezone, tool versions, and the exact commands used for reproducibility.

  3. Establish and maintain chain of custody — Record who handled the evidence, when, and what actions were taken. Use cryptographic hashes to verify evidence integrity at every transfer point and before each analysis session.

  4. Normalize timestamps before correlation — Different systems use different timezones, clock skew, and timestamp formats. Convert all timestamps to UTC before building timelines to avoid false chronological orderings.

  5. Use multiple tools to verify findings — Cross-verify important findings with at least two different tools or techniques. Single-tool analysis can produce false positives or miss artifacts that other tools detect through different methods.

Common Issues

Evidence integrity hash doesn't match after copying. This typically indicates I/O errors during acquisition. Use dc3dd or dd with conv=noerror,sync to handle bad sectors gracefully. Always verify hashes immediately after acquisition and re-acquire if mismatched. Network transfers should use integrity-checked protocols.
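One way to shrink the window for mismatches is to hash the stream during acquisition so the image is never read twice. A sketch using tee, with the device and paths from the workflow above:

```shell
# Image the disk and compute its SHA-256 in a single pass.
# Note: sha256sum records "-" (stdin) as the filename in the hash file.
dd if=/dev/sda bs=4096 conv=noerror,sync status=progress \
  | tee /evidence/disk.dd \
  | sha256sum > /evidence/disk.dd.sha256
```

The stored hash then reflects exactly the bytes written to disk.dd, so any later divergence points to storage or transfer problems rather than the acquisition itself.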

Timestamps show impossible sequences or future dates. Attackers frequently manipulate file timestamps (timestomping) to cover their tracks. Cross-reference filesystem timestamps with log entries, journal records, and MFT entries which are harder to forge consistently. Look for MACE (Modified, Accessed, Created, Entry) timestamp inconsistencies.
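On a mounted Linux image, stat puts the timestamps to compare side by side. A sketch; the file path is illustrative:

```shell
# Print modify (mtime) and inode-change (ctime) times together.
# An mtime far older than ctime on a recently written file is a
# common timestomping tell: utimes() backdates mtime but the kernel
# updates ctime to the moment the metadata was changed.
stat -c '%n mtime=%y ctime=%z' /mnt/evidence/tmp/suspicious.bin
```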

Memory analysis tools fail to identify the correct OS profile. Memory dumps may be corrupted, partially overwritten, or from an unusual OS version. Try multiple profiles, use volatility kdbgscan to find kernel debugger data blocks, and verify the dump size matches expected physical memory. For virtual machines, use hypervisor-level memory acquisition for cleaner dumps.
