ollama
Use fast-agent to connect MCP tools to a local LLM, an API, or Claude Desktop. Work in progress.
How to use
The Ollama MCP server lets you integrate Model Context Protocol (MCP) tools with local language models (LLMs), APIs, or Claude Desktop. Built on fast-agent, it gives your application a single place to manage and communicate with different AI models, which is useful for developers who want to streamline their AI interactions during development.
Once connected to the Ollama MCP server, you can interact with it by sending specific commands to leverage the available tools. Although no tools are documented yet, you can expect to execute queries that involve text processing, AI model management, or API calls. For best results, focus on concise commands that specify the task you want the model to perform, ensuring clarity in your requests to optimize the server's response.
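MCP clients are usually wired up to a server through a JSON configuration entry that tells the client which command to launch. A hypothetical Claude Desktop entry for this server might look like the following; the `"ollama"` key name is arbitrary, and the command mirrors the quick-start install below:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "angrysky56/ollama-mcp-server"]
    }
  }
}
```

On macOS this typically goes in Claude Desktop's `claude_desktop_config.json`; consult the server's README for the exact command and any required environment variables.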
How to install
To install the Ollama MCP server, ensure you have the following prerequisites:
- Node.js (version 14 or later recommended)
Option A: Quick start with npx
If you prefer a quick setup without installing globally, you can use npx:
npx -y angrysky56/ollama-mcp-server
Option B: Global install alternative
For a global installation, you can clone the repository:
git clone https://github.com/angrysky56/ollama-mcp-server.git
cd ollama-mcp-server
npm install
After installation, you can run the server with:
node server.js
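Since the server brokers requests to a local Ollama instance, it can help to confirm Ollama is reachable before starting it. The sketch below queries Ollama's standard `/api/tags` endpoint on its default port 11434; whether this particular MCP server uses the same host setting is an assumption.

```python
import json
import urllib.error
import urllib.request

def list_local_models(host: str = "http://localhost:11434"):
    """Return the names of locally installed Ollama models,
    or None if no Ollama instance answers at `host`."""
    try:
        # /api/tags is Ollama's endpoint for listing pulled models.
        with urllib.request.urlopen(f"{host}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        # Connection refused or timed out: Ollama is not running here.
        return None

if __name__ == "__main__":
    models = list_local_models()
    if models is None:
        print("Ollama is not running; start it with `ollama serve` first.")
    else:
        print(f"Ollama is up with {len(models)} model(s): {models}")
```

If this prints that Ollama is not running, start it (or pull a model with `ollama pull`) before launching the MCP server.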
Additional notes
When configuring the Ollama MCP server, ensure that your environment is set up correctly to handle any dependencies and avoid port conflicts. You might need to set environment variables to define your API keys or specify model parameters. Keep an eye on the server’s logs for any errors during startup, as they can provide insights into configuration issues.
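As an example of such environment configuration, Ollama itself honors an `OLLAMA_HOST` variable; whether this MCP server reads the same variable is an assumption, so check its README before relying on it:

```shell
# Hypothetical setup: the exact variable names depend on the server's implementation.
export OLLAMA_HOST="http://localhost:11434"   # point at a non-default Ollama instance
```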
Related MCP Servers
npcpy
The python library for research and development in NLP, multimodal LLMs, Agents, ML, Knowledge Graphs, and more.
sample-agentic-ai-demos
Collection of examples of how to use Model Context Protocol with AWS.
Archive-Agent
Find your files with natural language and ask questions.
LLaMa-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and Model Context Protocol (MCP).
mcp-crew-ai
MCP Crew AI Server is a lightweight Python-based server designed to run, manage and create CrewAI workflows.
mcpx-py
Python client library for https://mcp.run - call portable & secure tools for your AI Agents and Apps