
Search AI Assistant

Production-ready agent for search and AI visibility optimization. Includes structured workflows, validation checks, and reusable patterns for web tools.

Agent · Cliptics · web tools · v1.0.0 · MIT


Optimize content and technical infrastructure for visibility across traditional search engines, answer engines, and generative AI citation systems.

When to Use This Agent

Choose this agent when you need to:

  • Develop search strategies spanning Google rankings, AI answer boxes, and generative engine citations from ChatGPT and Perplexity
  • Audit technical SEO foundations including crawlability, indexation, Core Web Vitals, and structured data markup
  • Create content architectures for dual visibility in traditional SERPs and AI-generated responses through entity optimization

Consider alternatives when:

  • Your focus is strictly on paid search advertising or PPC campaign management
  • You need deep code-level web application development rather than search optimization strategy

Quick Start

Configuration

name: search-ai-assistant
type: agent
category: web-tools

Example Invocation

claude agent:invoke search-ai-assistant "Audit our SaaS pages for generative engine optimization and traditional SEO gaps"

Example Output

Search & AI Optimization Audit
===============================
Domain: acmewidgets.io | Pages: 47

TRADITIONAL SEO
- Crawl budget waste: 312 orphaned URLs
- Core Web Vitals: LCP 3.8s (POOR), CLS 0.04 (GOOD)
- Missing H1 tags on 8 landing pages

AI ENGINE OPTIMIZATION
- Entity coverage: 62% lack structured data markup
- FAQ schema missing on 91% of support pages
- Content authority signals absent on 38 pages

PRIORITY ACTIONS
1. Implement FAQ and HowTo schema across documentation
2. Add author expertise markup (credentials, publications)
3. Restructure pages with entity-first content blocks

GEO Readiness: 34/100 | Traditional SEO: 61/100
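The readiness scores in the audit are composites. A minimal sketch of how such a score might be assembled as a weighted average of per-category pass rates; the categories, weights, and inputs below are illustrative assumptions, not the agent's actual formula:

```python
# Illustrative composite scoring sketch. The category weights and
# pass-rate inputs are hypothetical, not the agent's real formula.

def readiness_score(pass_rates: dict[str, float], weights: dict[str, float]) -> int:
    """Weighted average of per-category pass rates (0.0-1.0), scaled to 0-100."""
    total_weight = sum(weights.values())
    score = sum(pass_rates[k] * w for k, w in weights.items()) / total_weight
    return round(score * 100)

# Hypothetical GEO audit inputs: fraction of pages passing each check.
geo_pass_rates = {"entity_schema": 0.38, "faq_schema": 0.09, "authority_signals": 0.19}
geo_weights = {"entity_schema": 0.4, "faq_schema": 0.3, "authority_signals": 0.3}

print(readiness_score(geo_pass_rates, geo_weights))  # → 24
```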

Core Concepts

Search Ecosystem Overview

| Aspect | Details |
|---|---|
| Traditional SEO | Keyword targeting, backlink authority, crawlability, on-page optimization |
| Answer Engine Optimization | Featured snippet targeting, FAQ schema, direct-answer formatting |
| Generative Engine Optimization | Entity authority, citation-worthy structure, claim-evidence patterns |
| Technical Infrastructure | XML sitemaps, robots.txt, canonical tags, hreflang, render budgets |

Search Visibility Architecture

+------------------+     +------------------+     +------------------+
|  Content Layer   |---->|  Technical SEO   |---->|  Traditional     |
|  Topic Clusters  |     |  Crawl & Index   |     |  SERP Rankings   |
|  Entity Modeling |     |  Schema Markup   |     | Featured Snippets|
+------------------+     +------------------+     +------------------+
        |                        |                        |
        v                        v                        v
+------------------+     +------------------+     +------------------+
|  Authority       |     |  Structured Data |     |  AI Citation     |
|  E-E-A-T Markers |     |  Knowledge Graph |     |  LLM References  |
+------------------+     +------------------+     +------------------+

Configuration

| Parameter | Type | Default | Description |
|---|---|---|---|
| target_engines | array | ["google","bing","perplexity"] | Search and AI engines to optimize for |
| content_depth | string | "comprehensive" | Analysis depth: quick-scan, standard, comprehensive |
| schema_types | array | ["Organization","FAQ","Article"] | Structured data types to validate |
| cwv_threshold | object | {"lcp":2500,"cls":0.1,"inp":200} | Core Web Vitals pass thresholds (LCP and INP in ms; CLS unitless) |
| geo_focus | string | "global" | Geographic targeting scope |
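The cwv_threshold defaults above can be applied to measured field values with a simple comparison; this is a minimal sketch (the function name and input shape are assumptions for illustration — LCP and INP are in milliseconds, CLS is unitless):

```python
# Hedged sketch: compare measured Core Web Vitals against the
# cwv_threshold defaults. Units: LCP/INP in ms, CLS unitless.

DEFAULT_CWV_THRESHOLDS = {"lcp": 2500, "cls": 0.1, "inp": 200}

def cwv_pass(measured: dict[str, float],
             thresholds: dict[str, float] = DEFAULT_CWV_THRESHOLDS) -> dict[str, bool]:
    """Per-metric pass/fail: a metric passes when at or below its threshold."""
    return {m: measured[m] <= t for m, t in thresholds.items() if m in measured}

# Values from the example audit: LCP 3.8s (3800 ms), CLS 0.04.
print(cwv_pass({"lcp": 3800, "cls": 0.04}))  # → {'lcp': False, 'cls': True}
```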

Best Practices

  1. Structure Content for Dual Consumption - Use the claim-evidence-source pattern where each assertion is followed by supporting data and a reference. This improves readability and increases generative AI citation probability simultaneously.

  2. Build Entity Authority Systematically - Establish your brand as a recognized entity in knowledge graphs through consistent NAP data, author schema with credentials, and cross-platform entity mentions that both search algorithms and LLM systems prioritize.

  3. Optimize Technical Foundation Before Content - Ensure clean URL architecture, proper canonical tags, and sub-2.5-second LCP scores before content expansion. Search engines and AI crawlers alike deprioritize slow or inaccessible pages.

  4. Implement Layered Structured Data - Deploy schema at multiple levels: Organization for brand, Article with author attribution, FAQ for questions, HowTo for procedures. Each layer increases surface area for rich results and AI extraction.

  5. Monitor AI Citation Patterns - Track how generative AI systems reference your content. Analyze which formats and authority signals correlate with higher citation rates, then apply those patterns across your portfolio.
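The layered structured data in best practice 4 can be sketched as emitting one JSON-LD object per layer. The schema.org types below are real, but the helper function and page fields are hypothetical illustrations:

```python
import json

# Sketch of layered structured data (best practice 4): one JSON-LD
# object per layer, each using a real schema.org type.

def layered_schema(org: str, article_title: str, author: str,
                   faqs: list[tuple[str, str]]) -> list[dict]:
    ctx = "https://schema.org"
    return [
        {"@context": ctx, "@type": "Organization", "name": org},
        {"@context": ctx, "@type": "Article", "headline": article_title,
         "author": {"@type": "Person", "name": author}},
        {"@context": ctx, "@type": "FAQPage", "mainEntity": [
            {"@type": "Question", "name": q,
             "acceptedAnswer": {"@type": "Answer", "text": a}} for q, a in faqs]},
    ]

blocks = layered_schema("Acme Widgets", "Widget Sizing Guide", "J. Doe",
                        [("What sizes are available?", "Small, medium, and large.")])
for block in blocks:
    print(json.dumps(block, indent=2))
```

In practice each object would be embedded in its own `<script type="application/ld+json">` tag so rich-result parsers can pick up every layer independently.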

Common Issues

  1. Schema Validation Failures - Structured data passing syntax checks but failing Rich Results Test due to missing required properties. Validate against specific rich result requirements, not just JSON-LD syntax.

  2. AI Content Cannibalization - AI-generated site content can reduce citation authority in generative search. LLM systems prioritize original research and expert perspectives. Replace generic AI-written material with expert-attributed, data-backed analysis.

  3. CWV Regression on Dynamic Pages - Pages passing lab audits fail in production due to third-party scripts and dynamic ad injection. Implement Real User Monitoring alongside lab testing with automated alerts for threshold breaches.
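The Real User Monitoring in issue 3 can be sketched as a 75th-percentile check over field samples (Core Web Vitals are conventionally assessed at p75); the function names and alert format are illustrative assumptions:

```python
# Sketch of a RUM threshold alert (common issue 3): compute the 75th
# percentile of field samples per metric and flag threshold breaches.

def p75(samples: list[float]) -> float:
    """Nearest-rank 75th percentile of a non-empty sample list."""
    ordered = sorted(samples)
    idx = max(0, -(-75 * len(ordered) // 100) - 1)  # ceil(0.75 * n) - 1
    return ordered[idx]

def cwv_alerts(rum: dict[str, list[float]],
               thresholds: dict[str, float]) -> list[str]:
    """Alert messages for metrics whose p75 exceeds the threshold."""
    return [f"{m}: p75={p75(v)} exceeds {thresholds[m]}"
            for m, v in rum.items() if m in thresholds and p75(v) > thresholds[m]]

rum_samples = {"lcp": [1800, 2100, 2600, 3900], "cls": [0.02, 0.03, 0.05, 0.08]}
print(cwv_alerts(rum_samples, {"lcp": 2500, "cls": 0.1}))
# → ['lcp: p75=2600 exceeds 2500']
```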
