Advisor Data Champion
An agent that champions data-driven decision making within organizations, helping teams establish data governance practices, quality standards, literacy programs, and analytics culture that turns raw data into reliable business intelligence.
When to Use This Agent
Choose Data Champion when:
- Establishing data governance frameworks and data ownership models
- Defining data quality standards and monitoring processes
- Building data literacy programs for non-technical stakeholders
- Creating data dictionaries and metadata documentation
- Advising on data strategy alignment with business objectives
Consider alternatives when:
- Building data pipelines (use a data engineering agent)
- Performing statistical analysis (use a data science agent)
- Designing database schemas (use a database architect agent)
Quick Start
```yaml
# .claude/agents/advisor-data-champion.yml
name: Data Champion
model: claude-sonnet-4-20250514
tools:
  - Read
  - Write
  - Bash
  - Glob
  - Grep
prompt: |
  You are a data champion who advocates for data-driven decision making.
  Help organizations establish data governance, quality standards, and
  analytics culture. Focus on making data accessible, trustworthy, and
  actionable for all stakeholders.
```
Example invocation:
```
claude --agent advisor-data-champion "Help us establish a data governance framework for our growing startup. We have data in PostgreSQL, Snowflake, and various SaaS tools with no consistent definitions or ownership model."
```
Core Concepts
Data Governance Framework
```
Strategy → Standards  → Ownership  → Quality    → Access   → Compliance
    │          │            │            │           │           │
 Vision     Naming       Stewards    Monitoring  Policies   Regulations
 Goals      Definitions  Councils    SLAs        Roles      Audit
 Metrics    Formats      Escalation  Profiling   Catalog    Retention
```
Data Quality Dimensions
| Dimension | Definition | Example Check |
|---|---|---|
| Accuracy | Data correctly represents reality | Email format validation |
| Completeness | Required fields are populated | Null rate < 5% |
| Consistency | Same value across systems | Customer name matches in CRM and billing |
| Timeliness | Data is current enough for use | Updated within last 24 hours |
| Uniqueness | No unintended duplicates | Deduplicated customer records |
| Validity | Data conforms to business rules | Age between 0 and 150 |
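A few of these dimensions can be sketched as runnable checks. This is a minimal illustration using only the standard library; the record layout and field names (`email`, `age`) are hypothetical, and a real deployment would run such rules against a warehouse, not in-memory dicts.

```python
import re

# Illustrative checks for three dimensions from the table above:
# accuracy (email format), completeness (null rate), validity (age range).

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def email_is_valid(value):
    """Accuracy: value looks like a well-formed email address."""
    return bool(EMAIL_RE.match(value or ""))

def null_rate(records, field):
    """Completeness: fraction of records where `field` is missing."""
    if not records:
        return 0.0
    missing = sum(1 for r in records if r.get(field) in (None, ""))
    return missing / len(records)

def age_is_valid(age):
    """Validity: age conforms to the business rule 0 <= age <= 150."""
    return isinstance(age, int) and 0 <= age <= 150

records = [
    {"email": "a@example.com", "age": 34},
    {"email": "not-an-email", "age": 34},
    {"email": None, "age": 200},
]

print(null_rate(records, "email"))                      # 1 of 3 emails missing
print([email_is_valid(r["email"]) for r in records])    # [True, False, False]
print([age_is_valid(r["age"]) for r in records])        # [True, True, False]
```

The same rules could be expressed in a dedicated quality framework; the point is that each dimension maps to a concrete, automatable predicate.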
Data Maturity Model
| Level | Characteristics | Focus Area |
|---|---|---|
| 1 - Ad Hoc | No standards, tribal knowledge | Awareness and inventory |
| 2 - Managed | Basic documentation, some owners | Governance framework |
| 3 - Defined | Consistent standards, quality monitoring | Automation and tooling |
| 4 - Measured | Quality SLAs, data contracts | Continuous improvement |
| 5 - Optimized | Self-service analytics, data mesh | Innovation and democratization |
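The maturity model lends itself to a rough self-assessment. The sketch below is an assumption, not part of the model itself: the practice names per level are invented, and a team sits at the highest level whose practices, and all earlier levels' practices, are in place.

```python
# Hypothetical practices that gate each maturity level (Level 1 is the floor).
LEVEL_PRACTICES = {
    2: ["datasets_inventoried", "owners_assigned"],
    3: ["naming_standards", "quality_monitoring"],
    4: ["quality_slas", "data_contracts"],
    5: ["self_service_analytics"],
}

def maturity_level(practices):
    """Highest level for which every prerequisite practice is in place."""
    level = 1
    for lvl in sorted(LEVEL_PRACTICES):
        if all(practices.get(p, False) for p in LEVEL_PRACTICES[lvl]):
            level = lvl
        else:
            break  # later levels require all earlier ones
    return level

team = {"datasets_inventoried": True, "owners_assigned": True,
        "naming_standards": True, "quality_monitoring": False}
print(maturity_level(team))  # 2: a Level 3 practice is still missing
```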
Configuration
| Parameter | Description | Default |
|---|---|---|
| maturity_target | Target maturity level | Level 3 |
| governance_scope | Governance coverage | Critical datasets first |
| quality_monitoring | Quality check automation | Automated with alerts |
| catalog_tool | Data catalog platform | Open-source (DataHub) |
| ownership_model | Data ownership structure | Domain-based stewardship |
| compliance_reqs | Regulatory requirements | None specified |
| documentation_standard | Documentation template | Data dictionary format |
Best Practices
- Start with the most critical datasets, not all data. Trying to govern everything at once paralyzes the initiative. Identify the 5-10 datasets that drive the most important business decisions (revenue metrics, customer records, product usage data) and establish governance for those first. Success with critical data builds credibility and momentum for expanding to other datasets.
- Assign data owners from the business side, not just IT. Data ownership belongs to the teams who generate and use the data. The marketing team should own marketing campaign data because they understand what the fields mean, what quality looks like, and how the data should be used. IT teams maintain the infrastructure; business teams define the standards and resolve quality issues.
- Create a data dictionary with business definitions, not just technical schemas. A column called `status` could mean order status, account status, or approval status. The data dictionary should define each field in business language: "Order Status: The current fulfillment state of a customer order. Valid values: pending, processing, shipped, delivered, cancelled. Set by the fulfillment system when state changes." Technical stakeholders can read schemas; business stakeholders need human language.
- Implement data quality monitoring as automated checks, not manual reviews. Define quality rules as code that runs automatically: null rate thresholds, valid value ranges, cross-system consistency checks, and freshness requirements. Alert data owners when quality drops below SLAs. Manual quality reviews catch issues days or weeks late; automated checks catch them within hours.
- Measure data quality with business-meaningful metrics. Instead of tracking abstract quality scores, measure impact: "How many support tickets were caused by incorrect customer data? How many reports were delayed by missing data? What revenue was affected by stale pricing data?" Business-impact metrics get executive attention and budget that abstract quality scores never will.
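The "quality rules as code" practice above can be sketched as a small rule engine. Everything here is illustrative: the rule names, SLA thresholds, and owner teams are assumptions, and the metrics dict stands in for the output of a real profiling job.

```python
# Each rule pairs an SLA ceiling with the owner to notify on breach.
# Rule names, thresholds, and owners are hypothetical examples.

def check_rules(metrics, rules):
    """Return alert messages for every rule whose metric breaches its SLA."""
    alerts = []
    for name, rule in rules.items():
        value = metrics.get(name)
        if value is None or value > rule["max"]:
            alerts.append(
                f"{name}={value} exceeds SLA {rule['max']}; notify {rule['owner']}"
            )
    return alerts

rules = {
    "customer_email_null_rate": {"max": 0.05, "owner": "crm-stewards"},
    "orders_staleness_hours":   {"max": 24,   "owner": "fulfillment-data"},
}

# Metrics as a nightly profiling job might produce them (values invented).
metrics = {"customer_email_null_rate": 0.12, "orders_staleness_hours": 6}

for alert in check_rules(metrics, rules):
    print(alert)  # only the null-rate rule breaches its SLA
```

Treating a missing metric as a breach (the `value is None` branch) is a deliberate choice: a check that silently stops reporting should page someone, not pass.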
Common Issues
Data governance initiative stalls due to lack of executive sponsorship. Governance requires cross-team coordination and sometimes conflicts with team autonomy. Without executive backing, teams can ignore governance standards. Secure a C-level sponsor by framing governance in business terms: regulatory risk, decision quality, and operational efficiency. A data incident (wrong report to the board, compliance violation) often creates the urgency that launches governance programs.
Teams resist data documentation because it feels like overhead. Make documentation part of the workflow, not a separate task. Integrate data dictionary updates into the pull request process for schema changes. Use tools that auto-generate technical documentation and only require business descriptions to be added manually. When documentation prevents a data incident or saves onboarding time, share that story to demonstrate value.
Data quality issues persist despite monitoring. Monitoring identifies problems; it doesn't fix them. Establish clear escalation paths: who gets notified? Who is responsible for fixing? What's the SLA for resolution? Without accountability, quality alerts become noise. Assign data quality incidents the same urgency and tracking as production incidents, with root cause analysis and prevention measures.