
Specialist Legacy Modernizer

Comprehensive agent for modernizing legacy systems. Includes structured workflows, validation checks, and reusable patterns for modernization.

Author: AgentCliptics | Category: modernization | Version: 1.0.0 | License: MIT


Systematic legacy system transformation agent for incremental migration from aging codebases to modern architectures with zero production disruption.

When to Use This Agent

Choose this agent when you need to:

  • Assess a legacy codebase for technical debt, security vulnerabilities, and modernization priorities
  • Apply the strangler fig pattern to incrementally replace legacy modules with modern services
  • Extract business rules embedded in legacy code before refactoring or rewriting
  • Plan a multi-phase modernization roadmap with rollback procedures at each stage

Consider alternatives when:

  • Your modernization is primarily a cloud infrastructure migration (use Cloud Migration Pro)
  • You need to redesign a frontend application architecture (use Expert Architecture Bot)

Quick Start

Configuration

name: specialist-legacy-modernizer
type: agent
category: modernization

Example Invocation

claude agent:invoke specialist-legacy-modernizer "Assess a 15-year-old PHP monolith and create a modernization roadmap to Node.js microservices"

Example Output

Legacy Assessment Report
=========================
System: E-Commerce Platform (PHP 5.6, MySQL 5.5)
Age: 15 years | Lines of Code: 340,000 | Test Coverage: 8%

Technical Debt Score: 87/100 (Critical)
  - 23 known CVEs in dependencies
  - No dependency injection, global state throughout
  - SQL queries embedded in view templates
  - No automated tests for payment processing module

Modernization Roadmap:
  Phase 1 (Month 1-2): Safety Net
    - Write characterization tests for payment module
    - Set up CI/CD pipeline with GitHub Actions
    - Deploy application monitoring (Datadog APM)
    - Target: 45% test coverage on critical paths

  Phase 2 (Month 3-5): Strangler Fig - API Layer
    - Introduce API gateway (Kong) in front of monolith
    - Extract product catalog as Node.js microservice
    - Run parallel: PHP + Node.js serving same endpoints
    - Target: 2 modules extracted, zero downtime

  Phase 3 (Month 6-8): Core Migration
    - Extract payment processing to isolated service
    - Migrate from MySQL 5.5 to PostgreSQL 15
    - Implement event-driven communication (RabbitMQ)
    - Target: 5 modules extracted, 75% test coverage

  Rollback: Feature flags on every new service, DNS failback ready

Core Concepts

Modernization Strategy Overview

Aspect             Details
-----------------  --------------------------------------------------------
Assessment         Code quality, dependency audit, business rule extraction
Migration Pattern  Strangler fig, branch by abstraction, parallel run
Safety Net         Characterization tests, monitoring, feature flags
Refactoring        Extract service, introduce facade, replace algorithm
Database           Schema evolution, data migration, read replicas
Deployment         Canary releases, A/B testing, instant rollback
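The "business rule extraction" step in the assessment row can be illustrated with a small TypeScript sketch: a rule dug out of a legacy view template and rewritten as a named, documented pure function. The shipping rule and its thresholds are invented for the example, not part of the agent.

```typescript
// Code archaeology result: a conditional buried in a legacy template,
// promoted to a named, testable pure function. Values are illustrative.

// Extracted rule: free shipping applies to domestic orders of $50 or
// more, unless the cart contains an oversized item.
function qualifiesForFreeShipping(
  totalCents: number,
  domestic: boolean,
  hasOversizedItem: boolean
): boolean {
  return domestic && totalCents >= 5000 && !hasOversizedItem;
}
```

Once a rule is captured this way, it can be covered by tests and carried into the modern service without re-reading the legacy code.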

Strangler Fig Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”     β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  API         │────▢│  Router /    β”‚
β”‚  Gateway     β”‚     β”‚  Feature Flagβ”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜     β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚                   β”‚
        β–Ό                   β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”     β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  Legacy      β”‚     β”‚  Modern      β”‚
β”‚  Monolith    β”‚     β”‚  Service     β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜     β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
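The router / feature-flag box in the diagram can be sketched in TypeScript. This is a minimal illustration, not the agent's actual implementation: the per-module rollout map and module names are assumptions made for the example.

```typescript
// Strangler-fig routing sketch: a per-module rollout fraction decides
// whether each request goes to the legacy monolith or the extracted
// modern service. Unknown modules default to legacy.

type Backend = "legacy" | "modern";

interface FlagStore {
  // Fraction of traffic (0..1) routed to the modern service, per module.
  [module: string]: number;
}

function routeRequest(
  module: string,
  flags: FlagStore,
  random: () => number = Math.random // injectable for deterministic tests
): Backend {
  const rollout = flags[module] ?? 0;
  return random() < rollout ? "modern" : "legacy";
}

// Example: product catalog fully cut over, payments on a 10% canary.
const flags: FlagStore = { "product-catalog": 1.0, payments: 0.1 };
```

Setting a module's fraction to 0 is the instant rollback path: all traffic returns to the monolith without a redeploy.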

Configuration

Parameter            Type    Default          Description
-------------------  ------  ---------------  ------------------------------------------------------------
assessmentDepth      enum    "full"           Depth: quick-scan, standard, full (includes security audit)
migrationPattern     enum    "strangler-fig"  Pattern: strangler-fig, branch-abstraction, big-bang
testCoverageTarget   number  80               Minimum test coverage percentage before extraction
rollbackStrategy     enum    "feature-flag"   Rollback: feature-flag, dns-failback, blue-green
parallelRunDuration  number  14               Days to run legacy and modern services in parallel

Best Practices

  1. Write Characterization Tests Before Changing Anything. Before modifying a single line of legacy code, write tests that capture the system's current behavior, including its quirks and bugs. These tests become your safety net, ensuring that modernization does not introduce regressions that break existing business logic.

  2. Extract the Least Coupled Module First. Start with a module that has minimal dependencies on the rest of the monolith. Success on an isolated module builds team confidence, establishes the extraction playbook, and surfaces integration challenges before you tackle tightly coupled core modules.

  3. Run Legacy and Modern Services in Parallel. Before cutting over traffic to the new service, run both systems simultaneously and compare their outputs. This parallel-run phase catches behavioral differences that unit tests miss, especially around edge cases in data formatting and error handling.

  4. Preserve Business Rules Through Code Archaeology. Legacy systems often encode business rules that exist nowhere else: no documentation, no specification. Before removing legacy code, systematically extract and document every conditional, calculation, and validation rule. Lost business rules are expensive to rediscover.

  5. Use Feature Flags for Granular Rollback Control. Wrap every new service call behind a feature flag that can instantly redirect traffic back to the legacy system. This provides confidence to deploy aggressively, because rollback takes seconds rather than hours of redeployment.
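The first practice above, characterization testing, can be sketched in a few lines of TypeScript. The `legacyDiscount` routine and its truncation quirk are hypothetical; the point is that the test pins down what the code currently does, quirks included.

```typescript
// A characterization test asserts the system's *current* behavior, not
// its ideal behavior. Here a hypothetical legacy discount routine
// truncates instead of rounding -- the test locks in that quirk.

function legacyDiscount(totalCents: number, percent: number): number {
  // Quirk preserved from the original code: truncation, not rounding.
  return Math.floor(totalCents * (percent / 100));
}

// Characterization suite: expected values were recorded from the
// running legacy system, then frozen as regression guards.
function characterize(): void {
  const cases: Array<[number, number, number]> = [
    [1000, 10, 100],
    [999, 10, 99], // 99.9 truncates to 99 -- the quirk we must preserve
    [1, 50, 0],
  ];
  for (const [total, pct, expected] of cases) {
    if (legacyDiscount(total, pct) !== expected) {
      throw new Error(`regression: ${total} cents @ ${pct}%`);
    }
  }
}
characterize();
```

Only after these tests are green on the legacy system does it become safe to refactor or re-implement the routine.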

Common Issues

  1. Extracted service produces different results than the legacy module. Subtle differences in date handling, floating-point arithmetic, or character encoding between the old and new technology stacks cause output mismatches. Implement a comparison harness that runs both systems and logs discrepancies before cutting over production traffic.

  2. Database migration breaks foreign key constraints in the monolith. When splitting a shared database, tables that remain in the monolith may reference tables that moved to the new service's database. Introduce a synchronization layer or API-based lookups to maintain referential integrity during the transition period.

  3. Team velocity drops during the first extraction phase. The first module extraction is always the slowest, because the team is building the extraction playbook, CI/CD pipeline, and deployment infrastructure simultaneously. Set realistic expectations and treat the first extraction as an investment that accelerates all subsequent phases.
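The comparison harness from the first issue can be sketched as a generic TypeScript function. The pricing functions and the JSON-based normalization are illustrative assumptions; a real harness would normalize dates, floats, and encodings according to the system under migration.

```typescript
// Parallel-run comparison sketch: feed the same inputs to both
// implementations, normalize the outputs, and collect every mismatch
// for inspection before trusting the new service with real traffic.

interface Mismatch<T> {
  input: T;
  legacy: string;
  modern: string;
}

function compareRun<I, O>(
  inputs: I[],
  legacyFn: (i: I) => O,
  modernFn: (i: I) => O,
  normalize: (o: O) => string = (o) => JSON.stringify(o)
): Mismatch<I>[] {
  const mismatches: Mismatch<I>[] = [];
  for (const input of inputs) {
    const legacy = normalize(legacyFn(input));
    const modern = normalize(modernFn(input));
    if (legacy !== modern) mismatches.push({ input, legacy, modern });
  }
  return mismatches;
}

// Example: a floor-vs-round difference surfaces only for some inputs.
const legacyPrice = (n: number) => Math.floor(n * 1.2);
const modernPrice = (n: number) => Math.round(n * 1.2);
const diffs = compareRun([10, 9, 7], legacyPrice, modernPrice);
```

In a live parallel run, the mismatch log feeds back into the characterization tests until the discrepancy count reaches zero.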
