
aiconfig-tools

npx machina-cli add skill launchdarkly/agent-skills/aiconfig-tools --openclaw

AI Config Tools

This skill guides you through adding capabilities to your AI agents via tools (function calling). Your job is to identify what your AI needs to do, create tool definitions, attach them to variations, and verify they work.

Prerequisites

  • LaunchDarkly API token with /*:ai-tool/* permission
  • Existing AI Config (use aiconfig-create skill first)
  • Tools endpoint: /ai-tools (NOT /ai-configs/tools)

Core Principles

  1. Start with Capabilities: Think about what your AI needs to do before creating tools
  2. Framework Matters: LangGraph/CrewAI often auto-generate schemas; OpenAI SDK needs manual schemas
  3. Create Before Attach: Tools must exist before you can attach them to variations
  4. Verify: The agent fetches tools and config to confirm attachment

API Key Detection

  1. Check environment variables: LAUNCHDARKLY_API_KEY, LAUNCHDARKLY_API_TOKEN, LD_API_KEY
  2. Check MCP config — Claude config if applicable
  3. Prompt user — Only if detection fails
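
The detection order above can be sketched as a small helper. This is a sketch only; the MCP-config check and the user prompt are omitted:

```python
import os

def detect_api_key():
    """Return the first LaunchDarkly API key found in the environment, or None.

    Checks the variables in the order the skill lists them. In the full
    workflow, a None result would fall through to the MCP config and then
    to prompting the user.
    """
    for var in ("LAUNCHDARKLY_API_KEY", "LAUNCHDARKLY_API_TOKEN", "LD_API_KEY"):
        value = os.environ.get(var)
        if value:
            return value
    return None

# Simulate a configured environment
os.environ["LD_API_KEY"] = "example-token"
print(detect_api_key())
```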

Workflow

Step 1: Identify Needed Capabilities

What should the AI be able to do?

  • Query databases, call APIs, perform calculations, send notifications
  • Check what exists in the codebase (API clients, functions)
  • Consider framework: LangGraph/LangChain auto-generate schemas; direct SDK needs manual schemas

Step 2: Create Tools

Follow API Quick Start:

  1. Create tool: POST /projects/{projectKey}/ai-tools with key, description, schema
  2. Schema format — Use OpenAI function calling format (type, function.name, function.parameters)
  3. Clear descriptions — The LLM uses the description to decide when to call
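
A minimal sketch of a tool-creation request body. The get-weather tool, its parameters, and the top-level field names are illustrative, assembled from the key/description/schema fields named above rather than the exact API contract:

```python
import json

# Hypothetical body for POST /projects/{projectKey}/ai-tools.
# The schema uses the OpenAI function calling format:
# type, function.name, function.parameters.
tool_body = {
    "key": "get-weather",
    "description": "Fetch the current weather for a city",
    "schema": {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Fetch the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    },
}

print(json.dumps(tool_body, indent=2))
```

Note that the description appears twice on purpose: the LLM reads the function-level description when deciding whether to call the tool, so keep it specific.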

Step 3: Attach to Variation

Tools cannot be attached during config creation. PATCH the variation:

PATCH /projects/{projectKey}/ai-configs/{configKey}/variations/{variationKey}

Body: {"model": {"parameters": {"tools": [{"key": "tool-name", "version": 1}]}}}

See API Quick Start for full curl example.
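
A sketch of assembling that PATCH request. Project, config, variation, and tool keys are placeholders, and nothing is actually sent:

```python
# Assemble (but do not send) the PATCH request from Step 3.
project_key = "my-project"        # placeholder
config_key = "my-ai-config"       # placeholder
variation_key = "default"         # placeholder

url = (
    f"/projects/{project_key}/ai-configs/"
    f"{config_key}/variations/{variation_key}"
)
body = {
    "model": {
        "parameters": {
            "tools": [{"key": "get-weather", "version": 1}]
        }
    }
}

print("PATCH", url)
```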

Step 4: Verify

  1. Verify tool exists:

    GET /projects/{projectKey}/ai-tools/{toolKey}
    
  2. Verify attached to variation:

    GET /projects/{projectKey}/ai-configs/{configKey}/variations/{variationKey}
    

    Check model.parameters.tools includes your tool key.

  3. Report results:

    • ✓ Tool created with valid schema
    • ✓ Tool attached to variation
    • ⚠️ Flag any issues
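
The attachment check in step 2 can be sketched against a parsed variation response. The response shape is assumed from the description above (not an exact API contract), and the get-weather key is illustrative:

```python
def tool_attached(variation, tool_key):
    """Check whether model.parameters.tools includes tool_key.

    `variation` is a parsed GET response body; missing levels are
    treated as "not attached" rather than raising.
    """
    tools = (
        variation.get("model", {})
        .get("parameters", {})
        .get("tools", [])
    )
    return any(t.get("key") == tool_key for t in tools)

# Hypothetical response from GET .../variations/{variationKey}
sample = {
    "model": {"parameters": {"tools": [{"key": "get-weather", "version": 1}]}}
}
print(tool_attached(sample, "get-weather"))   # True
print(tool_attached(sample, "missing-tool"))  # False
```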

Orchestrator Note

LangGraph, CrewAI, AutoGen often generate schemas from function definitions. You still need to create tools in LaunchDarkly and attach keys to variations so the SDK knows what's available.

Edge Cases

| Situation | Action |
| --- | --- |
| Tool already exists (409) | Use existing or create with a different key |
| Wrong endpoint | Use /ai-tools, not /ai-configs/tools |
| Schema invalid | Use OpenAI function format |

What NOT to Do

  • Don't use /ai-configs/tools — it doesn't exist
  • Don't try to attach tools during config creation
  • Don't skip clear tool descriptions (LLM needs them)

Related Skills

  • aiconfig-create — Create config before attaching tools
  • aiconfig-variations — Manage variations


Source

git clone https://github.com/launchdarkly/agent-skills

The skill file lives at skills/ai-configs/aiconfig-tools/SKILL.md in that repository.

Overview

AI Config Tools guides you through giving AI agents capabilities expressed as tools. It helps you identify required actions, create tool definitions using the OpenAI function calling format, attach tools to variations, and verify integration within your framework.

How This Skill Works

Identify the needed capabilities first. Create tools by posting to /projects/{projectKey}/ai-tools with a key, description, and OpenAI function schema. Then patch the variation to attach the tool (PATCH /projects/{projectKey}/ai-configs/{configKey}/variations/{variationKey}). Finally, verify tool existence and attachment via GET requests and confirm that model.parameters.tools includes the tool key.

When to Use It

  • When the AI needs to query databases, call APIs, perform calculations, or send notifications
  • When your framework (LangGraph/CrewAI vs OpenAI SDK) requires different schema handling
  • When you need to create tool definitions before attaching them to variations
  • When you must verify that tools exist and are attached to variations
  • When debugging tool attachment or resolving endpoint or schema issues

Quick Start

  1. Step 1: Identify Needed Capabilities
  2. Step 2: Create Tool using POST /projects/{projectKey}/ai-tools with key, description, and OpenAI function schema
  3. Step 3: Attach to Variation with PATCH /projects/{projectKey}/ai-configs/{configKey}/variations/{variationKey} and then verify with GET endpoints

Best Practices

  • Start with clearly defined capabilities before creating tools
  • Write clear, actionable tool descriptions so the LLM knows when to call them
  • Use the OpenAI function calling format for tool schemas
  • Create tools first, then attach them to variations
  • Verify existence and attachment with API calls and monitor for issues

Example Use Cases

  • Create a data-inquiry tool with key, description, and a function schema, then attach to a variation
  • Define a weather API tool and add a descriptive prompt to guide calls
  • Patch a variation to attach the new tool and confirm via variation fetch
  • Verify the tool exists with GET /projects/{projectKey}/ai-tools/{toolKey} and that the variation reflects the tool key
  • Handle edge cases: tool already exists (409) or incorrect endpoint (/ai-tools vs /ai-configs/tools) and fix accordingly

