UnrealGenAISupport
An Unreal Engine plugin for LLM/GenAI models and an MCP UE5 server. Includes APIs for OpenAI's GPT 5.1, Deepseek V3.1, Claude Sonnet 4.5, Gemini 3, Alibaba Qwen, Kimi, and Grok 4.1, with plans to add audio TTS, ElevenLabs, OpenRouter, Groq, Dashscope, and realtime APIs soon. UnrealMCP is also included, enabling automatic scene generation from AI.
```shell
claude mcp add --transport stdio prajwalshettydev-unrealgenaisupport docker run -i unrealgenaisupport:latest \
  --env GENAI_API_KEY="your-api-key" \
  --env GENAI_API_BASE="https://api.genai.example" \
  --env UE_PLUGIN_PATH="Path to Unreal Engine plugin (if applicable)" \
  --env UNREAL_ENGINE_VERSION="5.x or compatible"
```
How to use
This MCP server provides integration capabilities for the Unreal Engine Generative AI Support Plugin. It exposes tools to orchestrate and interact with GenAI models within Unreal Engine workflows, letting you spawn scene objects, control their transforms and materials, generate Unreal Blueprints and functions, add components, and run Python scripts as part of automated AI-assisted development.

The server connects to various GenAI providers, offering a unified interface for model selection, prompt construction, and result integration directly into your Unreal projects. Use it to prototype AI-assisted gameplay, level-design tasks, or evaluation scenarios where NPCs, objects, and environments are guided by GenAI decisions. Tools are oriented toward in-engine automation and evaluation pipelines, with capabilities to route model outputs into UE constructs, manage assets, and execute scripted pipelines for rapid iteration.
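At the protocol level, an MCP client invokes one of these tools with a JSON-RPC 2.0 `tools/call` request. The sketch below builds such a request; the tool name `spawn_object` and its arguments are hypothetical placeholders, not this plugin's actual tool schema — list the server's tools to see the real names:

```python
import json

def build_tool_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 request for the MCP "tools/call" method.

    The tool name and arguments are supplied by the caller; consult the
    server's tool listing for the real schema it expects.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool name and arguments, for illustration only.
request = build_tool_call(
    "spawn_object",
    {"actor_class": "StaticMeshActor", "location": [0, 0, 100]},
)
print(json.dumps(request))
```

The client sends this payload over the stdio transport configured above and receives the tool's result in the matching JSON-RPC response.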
How to install
Prerequisites:
- Docker installed on your system and running
- Access to a GenAI API (OpenAI, Deepseek, Claude, etc.) and a valid API key
- Unreal Engine plugin compatible with your project (as described in the UnrealGenAISupport repo)
Installation steps:
- Clone this MCP repository or ensure you have access to the UnrealGenAISupport MCP image in your registry.
- If using Docker, pull the latest image:
  ```shell
  docker pull unrealgenaisupport:latest
  ```
- Run the MCP server container locally:
  ```shell
  docker run -d --name unrealgenaisupport \
    -e GENAI_API_KEY="<your-api-key>" \
    -e UE_PLUGIN_PATH="/path/to/ue/plugin" \
    unrealgenaisupport:latest
  ```
- Verify the server is reachable (e.g., via its exposed port or health endpoint as documented in the repo).
- Configure your MCP client to connect to the server (server name: unrealgenaisupport) and provide any required environment variables (API keys, Unreal Engine version).
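For clients that read a JSON configuration file (e.g., Claude Desktop), the connection from the step above might be declared like this. Treat it as a sketch: the exact keys and the image invocation depend on your client and setup.

```json
{
  "mcpServers": {
    "unrealgenaisupport": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "unrealgenaisupport:latest"],
      "env": {
        "GENAI_API_KEY": "<your-api-key>",
        "UE_PLUGIN_PATH": "/path/to/ue/plugin",
        "UNREAL_ENGINE_VERSION": "5.x"
      }
    }
  }
}
```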
Notes:
- If you prefer a local development setup, ensure Docker is authorized to access your local Unreal Engine plugin directory and that the container has network access to your GenAI provider.
- Replace placeholders (API keys, paths) with your actual values before runtime.
Additional notes
Tips and common issues:
- Ensure your GenAI API key is valid and has the required permissions for the models you intend to use.
- If the container fails to access Unreal Engine plugin files, verify the UE plugin path is correctly mounted or exposed to the container.
- Some models may have high latency; consider using batching or streaming options if supported.
- Check environment variables like UE_PLUGIN_PATH, GENAI_API_BASE, and UNREAL_ENGINE_VERSION to align with your project setup.
- If you upgrade Unreal Engine versions, verify compatibility with the UnrealGenAISupport plugin and the MCP container image.
- For troubleshooting, inspect container logs and verify network access to GenAI endpoints from within the container.
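A quick way to sanity-check the environment variables mentioned above before starting the server. This is a minimal sketch using the variable names from this page; adjust the required/optional split to your own setup:

```python
import os

# Variable names taken from this page; tailor these lists to your setup.
REQUIRED = ["GENAI_API_KEY"]
OPTIONAL = ["GENAI_API_BASE", "UE_PLUGIN_PATH", "UNREAL_ENGINE_VERSION"]

def check_env(env=None):
    """Return (missing_required, unset_optional) variable names."""
    if env is None:
        env = os.environ
    missing = [name for name in REQUIRED if not env.get(name)]
    unset = [name for name in OPTIONAL if not env.get(name)]
    return missing, unset

missing, unset = check_env({"GENAI_API_KEY": "sk-test"})
print("missing:", missing)  # → missing: []
print("unset:", unset)
```

Running this inside the container (e.g., via `docker exec`) distinguishes a misconfigured environment from a network or API-permission problem.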
Related MCP Servers
ai-guide
程序员鱼皮's comprehensive AI resource collection plus a zero-to-hero Vibe Coding tutorial: a guide to choosing large models (DeepSeek / GPT / Gemini / Claude), the latest AI news, a prompt library, an AI knowledge encyclopedia (RAG / MCP / A2A), AI programming tutorials, AI tool usage (Cursor / Claude Code / OpenClaw / TRAE / Lovable / Agent Skills), AI development framework tutorials (Spring AI / LangChain), and a guide to monetizing AI products, helping you quickly master AI technology. This project is the open-source documentation version, since upgraded into 鱼皮's AI navigation website.
Everywhere
Context-aware AI assistant for your desktop. Ready to respond intelligently, seamlessly integrating multiple LLMs and MCP tools.
archestra
Secure cloud-native MCP registry, gateway & orchestrator
MCP-SuperAssistant
Brings MCP to ChatGPT, DeepSeek, Perplexity, Grok, Gemini, Google AI Studio, OpenRouter, T3 Chat and more...
py-gpt
Desktop AI Assistant powered by GPT-5, GPT-4, o1, o3, Gemini, Claude, Ollama, DeepSeek, Perplexity, Grok, Bielik, chat, vision, voice, RAG, image and video generation, agents, tools, MCP, plugins, speech synthesis and recognition, web search, memory, presets, assistants, and more. Linux, Windows, Mac.
Unity
AI-powered bridge connecting LLMs and advanced AI agents to the Unity Editor via the Model Context Protocol (MCP). Chat with AI to generate code, debug errors, and automate game development tasks directly within your project.