# Quick Test Quality Analyzer

Evaluate the quality of your existing tests by checking assertion density, test isolation, naming conventions, and anti-pattern detection.
## When to Use This Command
Run this command when...
- You suspect your test suite has low-quality tests that pass but do not actually verify behavior
- You want to enforce consistent test naming and structure standards across the team
- You are onboarding to a new codebase and need to assess test health quickly
Avoid this command when...
- You have no tests yet and need to write them from scratch
- You only need to check whether tests pass, not assess their structural quality
## Quick Start
```markdown
# .claude/commands/quick-test-quality-analyzer.md
---
allowed-tools: ["Bash", "Read", "Grep", "Glob"]
---
Analyze test files for quality signals: assertion density, naming patterns, test isolation, and common anti-patterns.
```
Example usage:

```
/quick-test-quality-analyzer
```

Example output:

```
Test Quality Report
===================
Files analyzed: 47
Avg assertions/test: 2.3 (good: >1.5)

Issues Found:
[WARN] 5 tests have zero assertions (empty tests)
[WARN] 12 tests share mutable state (isolation risk)
[INFO] 8 tests use non-descriptive names (test1, test2)

Score: 72/100 (Fair)
```
## Core Concepts
| Concept | Description |
|---|---|
| Assertion density | Number of expect/assert calls per test case |
| Test isolation | Whether tests share state or have ordering dependencies |
| Naming quality | Descriptive names that explain the intent of the test |
| Anti-patterns | Empty tests, commented-out assertions, sleep-based waits |
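As a rough illustration of the assertion-density concept, here is a minimal sketch in Python that scans a JS/TS test file with regular expressions. The patterns and function name are assumptions for illustration, not the analyzer's actual rules (a real implementation would parse the AST, as the diagram below suggests).

```python
import re

# Illustrative patterns only -- the real analyzer parses the AST rather
# than matching text, and recognizes more frameworks than these.
TEST_RE = re.compile(r"^\s*(?:it|test)\(", re.MULTILINE)
ASSERT_RE = re.compile(r"\b(?:expect|assert)\w*\(")

def assertion_density(source: str) -> float:
    """Return average assertions per test case; 0.0 if no tests found."""
    tests = TEST_RE.findall(source)
    asserts = ASSERT_RE.findall(source)
    return len(asserts) / len(tests) if tests else 0.0

sample = """
test('adds numbers', () => {
  expect(add(1, 2)).toBe(3);
  expect(add(0, 0)).toBe(0);
});
it('handles empty input', () => {});
"""
print(assertion_density(sample))  # 1.0 -> 2 assertions across 2 tests
```

Note how the empty `it('handles empty input')` test drags the average down: exactly the zero-assertion anti-pattern the report flags.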
```
Test Files --> Parse AST --> Extract Metrics
                                |
            +-------------------+-------------------+
            |                   |                   |
        Assertions          Isolation            Naming
            |                   |                   |
            +------------- Quality Score -----------+
```
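The final blending step can be sketched as a weighted average of per-metric pass rates. The weights below are an illustrative assumption; the command's actual scoring formula may differ.

```python
def quality_score(assertion_ok: float, isolation_ok: float, naming_ok: float) -> int:
    """Blend per-metric pass rates (each 0.0-1.0) into a 0-100 score.

    The specific weights are an assumption for illustration.
    """
    weights = {"assertions": 0.4, "isolation": 0.35, "naming": 0.25}
    blended = (assertion_ok * weights["assertions"]
               + isolation_ok * weights["isolation"]
               + naming_ok * weights["naming"])
    return round(blended * 100)

# e.g. strong assertions, some isolation risk, mediocre naming:
print(quality_score(0.9, 0.75, 0.6))  # 77
```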
## Configuration
| Option | Default | Description |
|---|---|---|
| min-assertions | 1 | Minimum assertions per test to consider passing |
| naming-pattern | should/when/it | Expected naming convention regex |
| check-isolation | true | Flag shared mutable state between tests |
| ignore-patterns | .setup. | Glob patterns for files to skip |
| output | text | Report format (text, json, html) |
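As a sketch, a project-level override might look like the fragment below. Only the option names come from the table above; the JSON format and the array value for ignore patterns are assumptions.

```json
{
  "min-assertions": 2,
  "naming-pattern": "^(should|when|it) ",
  "check-isolation": true,
  "ignore-patterns": [".setup.", "fixtures/"],
  "output": "json"
}
```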
## Best Practices
- Treat zero-assertion tests as bugs -- a test that asserts nothing verifies nothing and provides false confidence.
- Enforce descriptive names -- `it('should return 404 when user not found')` beats `test('error case')`.
- Isolate every test -- shared state causes flaky tests that fail non-deterministically.
- Avoid sleep-based waits -- use polling, callbacks, or test library wait utilities instead.
- Run periodically -- test quality degrades over time as shortcuts accumulate; schedule monthly reviews.
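To make the sleep-based-waits point concrete, here is a minimal generic polling helper in Python; the name `wait_until` and its parameters are illustrative stand-ins for whatever wait utility your test library provides.

```python
import time

def wait_until(predicate, timeout=2.0, interval=0.05):
    """Poll `predicate` until it returns True or `timeout` seconds elapse.

    A generic stand-in for test-library wait utilities.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(interval)
    return False

# Instead of time.sleep(1) and hoping the result arrived:
results = []
results.append("done")  # in a real test this would happen asynchronously
assert wait_until(lambda: "done" in results)
```

Unlike a fixed sleep, the poll returns as soon as the condition holds and fails fast with a bounded timeout when it never does.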
## Common Issues
- False positives on assertion count -- Some testing patterns use custom matchers not recognized by the analyzer. Add custom assertion function names to the config.
- Setup files flagged as issues -- Shared setup/teardown files are normal infrastructure. Add them to ignore-patterns.
- Language not supported -- The analyzer supports JS/TS, Python, and Go. For other languages, check for language-specific quality tools.