IgnitionAI MCP Repos

Quick add (Claude Code):
claude mcp add --transport stdio ignitionai-mcp npx fitmcp
How to use
This MCP repository hosts multiple implementations of the Model Context Protocol (MCP), enabling AI models to discover and call external tools and data sources across different domains. Each server is implemented as its own MCP, typically started via npx in the monorepo, and can be configured with environment variables for credentials and endpoints.

The included servers cover Azure storage integration, Azure AI search, LinkedIn API access, a lightweight LinkedIn HTTP variant, and a fitness/nutrition MCP, plus a template for creating new servers.

To use, pick the MCP you need, provide the required credentials through environment variables, and run the server using the command outlined in the configuration. Once running, you can use the MCP inspector to debug and validate tool registrations, prompts, and responses, and you can interact with registered tools from your AI model through the MCP interface.
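As a sketch of how a client wires one of these servers up (the server name, package name, and env var shown here are illustrative — consult the repo's mcp_config example for the real values), an entry in an MCP client configuration typically looks like:

```json
{
  "mcpServers": {
    "azure-storage": {
      "command": "npx",
      "args": ["-y", "@ignitionai/azure-storage-mcp"],
      "env": {
        "AZURE_STORAGE_CONNECTION_STRING": "<your-connection-string>"
      }
    }
  }
}
```

Each entry tells the client which command to launch (npx-based here) and which environment variables to inject into that server's process.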
How to install
Prerequisites:
- Node.js (LTS) and pnpm installed on your system
- Optional: Docker and Docker Compose if you plan to run containerized services
- Access/credentials for any external services you intend to connect (e.g., Azure, LinkedIn)
1. Clone the repository:
   git clone https://github.com/ignitionai/ignitionai-mcp
   cd ignitionai-mcp
2. Install dependencies for the monorepo (pnpm is recommended):
   pnpm install
3. Build all MCP projects (optional but recommended before running):
   pnpm build
4. Start an MCP server (example with a specific server):
   pnpm --filter azure-storage start
   or, to start all servers (if a root script exists):
   pnpm start
5. Verify that the MCP inspector is available for debugging:
   pnpm inspect
6. If you prefer Docker, ensure Docker is running and use the provided docker-compose setup to boot the relevant services, for example:
   docker compose up -d
Note: Each MCP server typically requires its own environment variables (such as connection strings and API keys). Refer to the README sections under each server or the mcp_config example to determine which env vars to provide and how to format them.
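For example, a per-server env file might look like the following (the LinkedIn variable names come from the tips below; the Azure one is illustrative — confirm the exact names in each server's README):

```
# LinkedIn MCP (values from your LinkedIn Developer App)
CLIENT_ID=<your-linkedin-client-id>
CLIENT_SECRET=<your-linkedin-client-secret>
REDIRECT_URI=http://localhost:3000/callback

# Azure storage MCP (illustrative variable name)
AZURE_STORAGE_CONNECTION_STRING=<your-connection-string>
```

Keeping one file like this per server also makes it easier to isolate credentials between MCPs.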
Additional notes
- Use the mcp_config example as a starting point to wire up multiple servers in one place. Each entry shows typical command and args (npx-based) along with placeholder environment variables.
- For LinkedIn-related MCPs, ensure you’ve created a LinkedIn Developer App and supplied CLIENT_ID, CLIENT_SECRET, and REDIRECT_URI as environment variables.
- If a server is marked as in development, you may want to run it locally with verbose logging to catch integration issues early.
- The monorepo uses TypeScript with ESM modules and Rollup; use pnpm build to generate production-ready bundles before deploying.
- When running multiple MCPs, consider isolating credentials per server to avoid cross-contamination of secrets.
- The README demonstrates typical usage; adapt the mcp_config values to your deployment environment and naming conventions.
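To illustrate the LinkedIn credential wiring described above, here is a minimal TypeScript sketch that builds an OAuth 2.0 authorization URL from the CLIENT_ID and REDIRECT_URI env vars. The function name, default scope, and fallbacks are hypothetical and not part of the IgnitionAI servers; verify the endpoint and scopes against LinkedIn's own documentation:

```typescript
// Sketch only: assembles a LinkedIn OAuth 2.0 authorization URL from the
// env vars the LinkedIn MCPs expect. buildAuthUrl and the default scope
// are illustrative, not part of this repo's API.
function buildAuthUrl(
  clientId: string,
  redirectUri: string,
  scope: string = "openid profile email"
): string {
  const params = new URLSearchParams({
    response_type: "code", // standard OAuth 2.0 authorization-code flow
    client_id: clientId,
    redirect_uri: redirectUri,
    scope,
  });
  return `https://www.linkedin.com/oauth/v2/authorization?${params.toString()}`;
}

// Read the env vars, falling back to placeholders for a dry run.
const url = buildAuthUrl(
  process.env.CLIENT_ID ?? "<client-id>",
  process.env.REDIRECT_URI ?? "http://localhost:3000/callback"
);
console.log(url);
```

Opening this URL in a browser starts the consent flow; the code LinkedIn sends back to REDIRECT_URI is then exchanged (with CLIENT_SECRET) for an access token.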