Doc Coauthoring Engine
A structured skill for collaborative document creation with AI, guiding users through context gathering, iterative refinement, and reader testing in a three-phase workflow that produces polished documents through active partnership rather than one-shot generation.
When to Use This Skill
Choose Doc Coauthoring Engine when you need to:
- Co-write a document through structured iteration with AI
- Create documents that require deep context gathering before writing
- Build documents with multiple revision cycles
- Write content where the user's domain expertise drives the substance
- Produce reader-tested documents that communicate effectively
Consider alternatives when:
- You need a simple one-shot document (just ask directly)
- You need a template to fill in (use a document template skill)
- You need writing style improvement only (use a writing clarity skill)
Quick Start
```shell
# Start a coauthoring session
claude "Let's co-write a technical design document for migrating our monolith to microservices. Guide me through the process."
```
```markdown
# Coauthoring Session: Microservices Migration Design Doc

## Phase 1: Context Gathering

I'll ask you a series of questions to understand the full picture.
Answer as briefly or thoroughly as you'd like.

### Architecture Context
1. What does the current monolith do? (main features)
2. What tech stack is it built on?
3. What's the current deployment model?
4. How many engineers work on it?

### Motivation
5. What's driving the migration? (scaling, velocity, reliability?)
6. What's the most painful part of the current architecture?
7. Have you attempted any decomposition before?

### Constraints
8. What's the timeline expectation?
9. Are there services that must stay monolithic?
10. What's the team's experience with microservices?

[User answers questions...]

## Phase 2: Draft & Refine

Based on your answers, here's the first draft:

[Document draft with sections based on gathered context]

### Refinement Prompts
- "This section about service boundaries: does this match how your team thinks about domain boundaries?"
- "I assumed REST for inter-service communication. Should I explore event-driven alternatives?"
- "The migration timeline seems aggressive. Should I add a phased approach?"

## Phase 3: Reader Test

Let's stress-test this document:
- If a new engineer reads this, can they understand why we're migrating?
- If the CTO reads the timeline, will they have concerns?
- Does the risk section address the team's actual worries?
```
Core Concepts
Three-Phase Workflow
| Phase | Goal | Typical Scope |
|---|---|---|
| Context Gathering | Understand the full picture | 10-15 questions |
| Refinement | Draft, iterate, improve | 2-3 iterations |
| Reader Testing | Validate from audience perspective | 3-5 test reads |
Context Gathering Techniques
## Question Categories

### Discovery Questions (Open-Ended)
- "What's the biggest challenge with the current system?"
- "If you could change one thing, what would it be?"

→ Reveals priorities and pain points

### Specification Questions (Specific)
- "How many requests per second does the API handle?"
- "What's the p99 latency for the checkout flow?"

→ Establishes concrete parameters

### Constraint Questions (Boundaries)
- "What can't change? (team size, budget, timeline)"
- "Are there regulatory requirements we must maintain?"

→ Identifies hard limits

### Validation Questions (Confirm Understanding)
- "So the main driver is team velocity, not scaling?"
- "Let me make sure I understand: the auth service is the first candidate because it has the clearest boundaries?"

→ Prevents misalignment before writing begins
Refinement Cycle
## Iterative Refinement Process

### Round 1: Structure Review
- "Does this outline cover everything?"
- "Is anything missing or in the wrong order?"
- "Should any section be expanded or cut?"

### Round 2: Content Review
- "Does this accurately describe your architecture?"
- "Are the tradeoffs presented fairly?"
- "Is the recommendation defensible?"

### Round 3: Tone and Clarity
- "Is this too technical for the CTO audience?"
- "Does the executive summary stand alone?"
- "Are there any ambiguous statements?"

Each round follows the pattern:
1. Present the current version
2. Ask 2-3 targeted questions
3. Incorporate feedback
4. Present updated version
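The round-scoping rule above (structure first, then content, then polish) can be enforced programmatically. A minimal sketch, assuming feedback items are tagged by kind; the function name and tag values are illustrative.

```python
# Illustrative sketch of the round-scoping rule: each refinement round has a
# fixed focus, and out-of-scope feedback is deferred rather than acted on.

ROUND_FOCUS = {1: "structure", 2: "content", 3: "polish"}

def is_feedback_in_scope(round_num: int, feedback_kind: str) -> bool:
    """Accept feedback only if it matches the current round's focus.

    `feedback_kind` is one of "structure", "content", or "polish".
    Rounds beyond 3 accept polish-level feedback only.
    """
    focus = ROUND_FOCUS.get(round_num, "polish")
    return feedback_kind == focus
```

Deferring (rather than discarding) out-of-scope feedback is what keeps cycles converging: structural requests raised in round 2 are parked, not silently folded in.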
Configuration
| Parameter | Description | Example |
|---|---|---|
| `document_type` | Type of document being coauthored | `"design_doc"` / `"rfc"` |
| `audience` | Primary reader(s) | `"engineering + leadership"` |
| `max_iterations` | Maximum refinement rounds | `3` |
| `question_style` | How questions are presented | `"batch"` / `"conversational"` |
| `output_format` | Final document format | `"markdown"` / `"gdoc"` |
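Putting the parameters from the table together, a session configuration might look like the following. The skill text does not specify a concrete file format, so the dict representation and variable name here are assumptions for illustration.

```python
# Illustrative session configuration; keys match the parameter table above.
coauthoring_config = {
    "document_type": "design_doc",          # or "rfc"
    "audience": "engineering + leadership",
    "max_iterations": 3,                    # refinement rounds before declaring done
    "question_style": "batch",              # or "conversational"
    "output_format": "markdown",            # or "gdoc"
}
```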
Best Practices
- **Gather context before writing a single word.** The biggest coauthoring failure is generating a draft before understanding the problem space. Spend the first 5-10 minutes asking questions. A well-informed first draft saves 3-4 revision cycles.
- **Ask questions in batches of 3-5, not 20 at once.** A wall of 20 questions overwhelms the user and gets superficial answers. Ask 3-5 at a time, process the answers, then ask follow-up questions informed by what you've learned.
- **Present specific refinement questions, not open-ended "feedback?"** "What do you think?" gets vague responses. "Does the service boundary between orders and payments match how your team thinks about ownership?" gets actionable feedback.
- **Read-test from multiple audience perspectives.** A design doc that works for engineers may fail for executives. After refining content, explicitly walk through the document as each target reader and identify where they'd get confused, bored, or unconvinced.
- **Declare when the document is done.** Without explicit closure, coauthoring sessions iterate forever. After 2-3 refinement rounds, declare: "This is ready for review. Here's the final version with your changes incorporated."
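The question-batching practice above reduces to splitting a question list into small consecutive chunks. A minimal sketch; the function name and default batch size are illustrative, within the 3-5 range the practice recommends.

```python
# Sketch of the "ask in batches of 3-5" practice: split a question list into
# consecutive batches so each conversational turn presents a digestible chunk.

def batch_questions(questions: list[str], batch_size: int = 4) -> list[list[str]]:
    """Split `questions` into consecutive batches of at most `batch_size`."""
    return [questions[i:i + batch_size] for i in range(0, len(questions), batch_size)]
```

For the 10-question example session above, a batch size of 4 yields three turns: two full batches and a final batch of two.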
Common Issues
**User provides too little context and expects a perfect draft.** Some users answer questions with one-word responses and expect a comprehensive document. Explain that document quality is directly proportional to context quality. Offer to rephrase questions or provide examples of the depth of answer needed.

**Refinement cycles don't converge.** Each round of feedback introduces new topics instead of refining existing content. Set a rule: Round 1 is for structural changes only, Round 2 is for content accuracy, Round 3 is for polish. No new sections after Round 1.

**The document serves the coauthor but not the reader.** It's easy to write a document that the author understands because they have context the reader doesn't. Reader testing catches this: "Would someone who wasn't in this conversation understand what 'the legacy integration issue' refers to?"