MCP
Model Context Protocol
claude mcp add --transport stdio --env OPENAI_API_KEY="sk-... (set your OpenAI API key)" bilgisayarkavramlari-mcp-server -- python main.py
How to use
This MCP server implements a stateful gateway for interacting with language models. It keeps per-user conversation context in Redis, enabling coherent, personalized dialogues across requests, and exposes its API through a FastAPI-based REST interface with automatic OpenAPI/Swagger documentation.

On each request, the gateway enriches the user's prompt with stored context before sending it to the language model, then updates the context based on the response. Available tools include endpoints for starting sessions, sending chat messages, and retrieving or inspecting session history, all designed with production-like deployments in mind (security, rate limiting, observability).

To use it, configure your environment with a Redis instance and an OpenAI API key, run the server, and interact with the REST endpoints to begin conversations, continue chats, and fetch session history as needed.
How to install
Prerequisites:
- Python 3.8+ installed on your system
- Redis server accessible to the application
- OpenAI API key (for model access)
Install and run locally (typical approach):
- Clone the repository:
  git clone https://github.com/your-org/bilgisayarkavramlari-mcp-server.git
  cd bilgisayarkavramlari-mcp-server
- Create and activate a virtual environment (optional but recommended):
  python -m venv venv
  source venv/bin/activate    # on macOS/Linux
  .\venv\Scripts\activate     # on Windows
- Install dependencies:
  pip install -r requirements.txt
- Ensure Redis is running. If you're using Docker:
  docker run --name mcp-redis -p 6379:6379 -d redis
- Set the required environment variables (the OpenAI key, the Redis URL if not using the default, etc.). Example for a local setup:
  export OPENAI_API_KEY="sk-...your-key..."
  export REDIS_URL="redis://localhost:6379/0"
- Run the server:
  python main.py
- Verify the server starts and exposes the API (the app serves interactive Swagger UI documentation via FastAPI, typically at /docs).
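The verification step can be scripted. FastAPI apps serve their schema at /openapi.json by default, so a stdlib-only probe like the sketch below can confirm the API is live; the default port 8000 is an assumption, so check main.py for the actual bind address.

```python
import json
import urllib.error
import urllib.request

def api_is_up(base_url: str = "http://127.0.0.1:8000", timeout: float = 2.0) -> bool:
    """Return True if a FastAPI app at base_url is serving its OpenAPI schema."""
    try:
        with urllib.request.urlopen(f"{base_url}/openapi.json", timeout=timeout) as resp:
            schema = json.load(resp)
        # Every FastAPI-generated schema lists its routes under "paths".
        return "paths" in schema
    except (urllib.error.URLError, OSError, ValueError):
        return False
```

Returning False on any connection or parse error keeps the check usable in startup scripts that retry until the server is ready.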
Notes:
- If you deploy to a container or hosting service, adapt the Redis connection string and port mappings accordingly.
- For production, consider securing the API, enabling rate limiting, and configuring proper observability/logging.
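To illustrate the rate-limiting point, here is a hedged sketch of a per-client sliding-window limiter. A production deployment would more likely use a Redis-backed counter or an API gateway in front of the service, and the class name here is hypothetical.

```python
import time
from collections import defaultdict, deque
from typing import Deque, Dict, Optional

class SlidingWindowLimiter:
    """Allow at most `limit` requests per `window` seconds for each client key."""

    def __init__(self, limit: int, window: float):
        self.limit = limit
        self.window = window
        self._hits: Dict[str, Deque[float]] = defaultdict(deque)

    def allow(self, key: str, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        hits = self._hits[key]
        # Drop timestamps that have aged out of the window.
        while hits and now - hits[0] >= self.window:
            hits.popleft()
        if len(hits) < self.limit:
            hits.append(now)
            return True
        return False
```

An in-process limiter like this only protects a single worker; with multiple replicas, the counters must live in shared storage (such as the Redis instance this server already requires) to enforce a global limit.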
Additional notes
Tips and common issues:
- Ensure Redis is reachable from the MCP server process; a misconfigured Redis URL is a common cause of startup failures.
- If the OpenAI API key is missing or invalid, requests to the model will fail; verify that OPENAI_API_KEY is set correctly.
- The API layer itself is stateless across restarts; per-user state lives in Redis, so enable Redis persistence if you need long-term history.
- When running locally with containers, you may need to adjust firewall or Docker network settings.
- Check the auto-generated OpenAPI docs for the available endpoints and request/response formats.
- For Docker deployments, you can containerize this service to simplify rollout across environments.
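The first two tips can be turned into a small preflight check run before the server starts. This is a sketch assuming the environment variables named above; the helper name and the split into required vs. defaulted settings are hypothetical.

```python
import os
from typing import Dict, List, Mapping, Tuple

REQUIRED = ("OPENAI_API_KEY",)                       # must be set, no sane default
DEFAULTS = {"REDIS_URL": "redis://localhost:6379/0"}  # falls back to local Redis

def preflight(env: Mapping[str, str] = os.environ) -> Tuple[Dict[str, str], List[str]]:
    """Return (config, problems): resolved settings and any missing required vars."""
    problems = [name for name in REQUIRED if not env.get(name)]
    config = {name: env.get(name, default) for name, default in DEFAULTS.items()}
    return config, problems
```

Failing fast on an empty problems list at startup surfaces a missing API key or Redis URL immediately, instead of as a confusing error on the first chat request.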