logging-migrator
npx machina-cli add skill a5c-ai/babysitter/logging-migrator --openclaw

Logging Migrator Skill
Migrates logging infrastructure, handling log format standardization, structured logging conversion, and aggregation setup.
Purpose
Enable logging modernization for:
- Log format standardization
- Structured logging conversion
- Log aggregation setup
- Correlation ID injection
- Retention policy migration
Capabilities
1. Log Format Standardization
- Define standard format
- Convert existing logs
- Implement across services
- Validate compliance
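As a minimal sketch of defining and applying a standard format, the following uses Python's `logging` module; the field layout and service name are illustrative assumptions, not the skill's mandated format.

```python
import logging

# Hypothetical standard: timestamp, level, service name, message.
STANDARD_FORMAT = "%(asctime)s %(levelname)s %(name)s %(message)s"

def make_standard_logger(service_name: str) -> logging.Logger:
    """Create a logger that emits the agreed standard format."""
    logger = logging.getLogger(service_name)
    handler = logging.StreamHandler()
    handler.setFormatter(logging.Formatter(STANDARD_FORMAT))
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger
```

Compliance can then be validated by formatting a sample record and checking that the required fields appear.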
2. Structured Logging Conversion
- Convert to JSON format
- Add metadata fields
- Handle custom fields
- Support multiple languages
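A conversion to structured JSON might look like the sketch below, which also carries through custom fields attached via `extra=`. The field names (`timestamp`, `level`, `service`, `message`) are assumptions for illustration, not a schema this skill prescribes.

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON object with standard metadata."""

    def __init__(self, service: str):
        super().__init__()
        self.service = service
        # Attribute names present on every default LogRecord; anything
        # beyond these was attached by the caller (e.g. via extra={...}).
        self._default_keys = set(logging.makeLogRecord({}).__dict__)

    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "service": self.service,
            "message": record.getMessage(),
        }
        # Carry custom fields into the structured output.
        for key, value in record.__dict__.items():
            if key not in self._default_keys:
                payload[key] = value
        return json.dumps(payload)
```

Attaching this formatter to a handler converts that handler's output to JSON without touching call sites, which is what makes incremental per-service migration practical.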
3. Log Aggregation Setup
- Configure centralized logging
- Set up log shipping
- Handle high volume
- Implement failover
4. Correlation ID Injection
- Implement trace IDs
- Propagate across services
- Handle async operations
- Enable distributed tracing
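One common way to implement this is a `contextvars`-based filter, sketched below; `contextvars` values propagate across `asyncio` tasks, which covers the async case. The names here are illustrative, not part of the skill's API.

```python
import contextvars
import logging
import uuid

# Holds the correlation ID for the current request context;
# "-" marks records logged outside any request.
correlation_id = contextvars.ContextVar("correlation_id", default="-")

class CorrelationFilter(logging.Filter):
    """Attach the current correlation ID to every record."""

    def filter(self, record: logging.LogRecord) -> bool:
        record.correlation_id = correlation_id.get()
        return True

def new_request_context() -> str:
    """Called at the service edge; downstream work reuses the same ID."""
    cid = uuid.uuid4().hex
    correlation_id.set(cid)
    return cid
```

At service boundaries, the ID would typically be read from or written to a propagation header (for example W3C `traceparent`) so the chain survives cross-service hops.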
5. Log Level Normalization
- Standardize log levels
- Map between frameworks
- Configure filtering
- Handle verbosity
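Mapping between frameworks can be as simple as a lookup table onto one canonical scale; the sketch below uses Python's standard levels as the target, and the source names are examples rather than an exhaustive list.

```python
import logging

# Hypothetical mapping from other frameworks' level names onto
# Python's standard levels.
LEVEL_MAP = {
    "trace": logging.DEBUG,      # e.g. SLF4J TRACE has no direct match
    "debug": logging.DEBUG,
    "info": logging.INFO,
    "notice": logging.INFO,      # syslog NOTICE folded into INFO
    "warn": logging.WARNING,
    "warning": logging.WARNING,
    "error": logging.ERROR,
    "fatal": logging.CRITICAL,
    "critical": logging.CRITICAL,
}

def normalize_level(name: str) -> int:
    """Map a foreign level name onto the standard scale; default to INFO."""
    return LEVEL_MAP.get(name.strip().lower(), logging.INFO)
```

Verbosity is then controlled centrally by filtering on the normalized level rather than per-framework names.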
6. Retention Policy Migration
- Define retention rules
- Implement rotation
- Handle archival
- Manage storage
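Rotation under a retention policy can be sketched with the standard library's timed rotation; the 14-day default here is a placeholder, and archival to cold storage would hook in separately (e.g. a job that moves rotated files before they are dropped).

```python
import logging
import logging.handlers

def make_rotating_handler(path: str, days: int = 14):
    """Rotate at midnight and keep `days` backups, per a hypothetical policy."""
    return logging.handlers.TimedRotatingFileHandler(
        path, when="midnight", backupCount=days, delay=True
    )
```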
Tool Integrations
| Tool | Purpose | Integration Method |
|---|---|---|
| ELK Stack | Log aggregation | Config |
| Datadog | Observability | API |
| Splunk | Log analysis | API |
| Loki | Log aggregation | Config |
| Fluentd | Log shipping | Config |
Output Schema
```json
{
  "migrationId": "string",
  "timestamp": "ISO8601",
  "logging": {
    "format": "string",
    "aggregation": {
      "tool": "string",
      "endpoint": "string"
    },
    "retention": {
      "days": "number",
      "archival": "boolean"
    }
  },
  "services": [
    {
      "name": "string",
      "status": "migrated|pending",
      "logFormat": "string"
    }
  ]
}
```
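A report conforming to the schema above might be assembled as follows; the helper name and all values are illustrative, not produced by the skill itself.

```python
import json
from datetime import datetime, timezone

def build_report(migration_id, services, tool, endpoint, days, archival):
    """Assemble a migration report matching the output schema.

    `services` is a list of (name, status) pairs where status is
    "migrated" or "pending".
    """
    return {
        "migrationId": migration_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "logging": {
            "format": "json",
            "aggregation": {"tool": tool, "endpoint": endpoint},
            "retention": {"days": days, "archival": archival},
        },
        "services": [
            {"name": name, "status": status, "logFormat": "json"}
            for name, status in services
        ],
    }
```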
Integration with Migration Processes
- logging-observability-migration: Primary migration tool
- cloud-migration: Cloud logging setup
Related Skills
performance-baseline-capturer: Observability metrics
Related Agents
observability-migration-agent: Full observability migration
operational-readiness-agent: Operations setup
Source
https://github.com/a5c-ai/babysitter/blob/main/plugins/babysitter/skills/babysit/process/specializations/code-migration-modernization/skills/logging-migrator/SKILL.md

Overview
Logging Migrator modernizes your logging by standardizing formats, converting logs to structured JSON, and setting up centralized aggregation. It supports correlation ID injection, retention policy migration, and cross-service traceability to enable reliable observability.
How This Skill Works
It defines a standard log format across services, converts existing logs to JSON with metadata, and configures centralized aggregation via tools like ELK, Datadog, Splunk, Loki, or Fluentd. It also enables correlation IDs and normalized log levels to support distributed tracing and consistent retention policies across the stack.
When to Use It
- Starting a log modernization initiative across a distributed system
- Migrating existing logs to a structured JSON format with metadata
- Setting up centralized log aggregation and log shipping
- Implementing correlation IDs and distributed tracing across services
- Migrating retention rules, rotation, and archival policies
Quick Start
- Step 1: Assess current logging formats and select a standard (e.g., JSON with metadata)
- Step 2: Convert existing logs to structured JSON and enable log shipping to your chosen aggregation tool
- Step 3: Enable correlation IDs, normalize log levels, and configure retention/rotation rules
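Step 2's log shipping can be sketched with the standard library's `HTTPHandler`; the host and path below are placeholders standing in for your aggregation tool's intake endpoint, not real addresses.

```python
import logging
import logging.handlers

def attach_shipper(logger, host="logs.example.internal:8080", path="/intake"):
    """Ship this logger's records to a (placeholder) aggregation endpoint."""
    shipper = logging.handlers.HTTPHandler(host, path, method="POST")
    logger.addHandler(shipper)
    return shipper
```

In practice most teams ship via an agent such as Fluentd reading from files or stdout rather than in-process HTTP, but the in-process variant keeps the sketch self-contained.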
Best Practices
- Define a single standard log format before migration and document required fields
- Run a pilot on a representative service to validate structured logs
- Map and normalize log levels across frameworks to avoid noise
- Test correlation ID propagation in asynchronous workflows
- Plan retention, rotation, and archival with storage cost and compliance in mind
Example Use Cases
- Migrate a microservices suite using JSON logs and ELK stack with cross-service correlation
- Inject trace IDs and propagate through async message queues, enabling distributed tracing
- Ship logs to Datadog via API integration for unified observability
- Standardize log format across legacy services and modern apps with rotation
- Implement retention rules and archival for cost-effective storage