x-post
A Model Context Protocol (MCP) server that allows interaction with the X API, along with a client to interact with the server using Google's Gemini AI.
claude mcp add --transport stdio subhadeeproy3902-x-post-mcp bun run dev \
  --env GEMINI_API_KEY="your_gemini_api_key" \
  --env TWITTER_API_KEY="your_twitter_api_key" \
  --env TWITTER_API_SECRET="your_twitter_api_secret" \
  --env TWITTER_ACCESS_TOKEN="your_twitter_access_token" \
  --env TWITTER_ACCESS_SECRET="your_twitter_access_secret"
How to use
X-Post MCP runs a server that exposes a tool for posting content directly to X (formerly Twitter) over the Model Context Protocol. The server uses Express with Server-Sent Events (SSE) to handle MCP tool calls, delegates AI-driven post creation to Google's Gemini models, and publishes the result to X through the official API. To interact with it, start the server and use the accompanying client: the client connects to the MCP server, passes user prompts to Gemini for post generation, and calls the server's createPost tool to publish the content to X. This setup makes it easy to integrate automated content generation into AI workflows while keeping posts within X's rules and length constraints.
Typical usage flow:
- Start the MCP server (bun run dev) in the server directory.
- Launch the client (bun run index.ts) to open an interactive chat interface.
- Ask the AI to draft a post; Gemini generates the content, and the MCP tool posts it to X upon confirmation.
- Use the tool to manage post length, truncation, and formatting as needed. The server handles compression to fit within X’s character limit and can be extended with additional MCP tools if required.
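The length handling described above can be illustrated with a small helper. This is a sketch for illustration only, not the project's actual code: the `truncateForX` name and the flat 280-character limit are assumptions (X's real counting rules treat URLs and some Unicode characters specially).

```typescript
// Sketch of a post-length guard in the spirit of the server's compression step.
// The function name and the flat 280-character limit are assumptions.
const X_CHAR_LIMIT = 280;

function truncateForX(content: string, limit: number = X_CHAR_LIMIT): string {
  if (content.length <= limit) return content;
  // Reserve one character for the ellipsis and avoid cutting mid-word.
  const slice = content.slice(0, limit - 1);
  const lastSpace = slice.lastIndexOf(" ");
  const cut = lastSpace > 0 ? slice.slice(0, lastSpace) : slice;
  return cut + "…";
}

console.log(truncateForX("short post"));            // → "short post"
console.log(truncateForX("a".repeat(300)).length);  // → 280
```

A guard like this keeps tool calls from failing late at the X API with a length error; the real server may instead ask Gemini to regenerate shorter content.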
How to install
Prerequisites:
- Bun v1.2.5 or later installed on your machine
- X (Twitter) API credentials (API Key, API Secret, Access Token, Access Token Secret)
- Google Gemini API key
Installation steps:
- Clone the repository:
git clone https://github.com/subhadeeproy3902/x-post-mcp.git
cd x-post-mcp
- Install server dependencies (uses Bun):
cd server
bun install
- Install client dependencies (optional for local testing):
cd client
bun install
- Create environment configuration files:
- Server: create server/.env with your X API credentials (TWITTER_API_KEY, TWITTER_API_SECRET, TWITTER_ACCESS_TOKEN, TWITTER_ACCESS_SECRET)
- Client: create client/.env with GEMINI_API_KEY
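The two .env files might look like the following. The variable names come from the steps above; the values are placeholders to replace with your own credentials.

```
# server/.env — X API credentials (placeholder values)
TWITTER_API_KEY=your_twitter_api_key
TWITTER_API_SECRET=your_twitter_api_secret
TWITTER_ACCESS_TOKEN=your_twitter_access_token
TWITTER_ACCESS_SECRET=your_twitter_access_secret

# client/.env — Gemini credentials (placeholder value)
GEMINI_API_KEY=your_gemini_api_key
```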
- Run the server in development mode:
cd server
bun run dev
- In another terminal, start the client (optional for interactive testing):
cd client
bun run index.ts
Notes:
- Ensure your API keys are kept secret and not committed to version control.
- The server exposes an MCP tool named createPost to publish posts to X.
- You may adjust environment variables or configuration as needed for production deployments.
Additional notes
Tips and common issues:
- If you encounter authentication errors with the X API, double-check that your app permissions include Read, Write, and Direct Messages and that the tokens are correctly set.
- The Gemini API key should be kept secure; restrict it to necessary services and avoid exposing it in client-side code.
- Ensure the Bun runtime version matches the project’s prerequisites (v1.2.5+).
- When deploying, consider using a process manager and environment variable injection for secure credential management.
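One way to realize this tip is a systemd unit that injects credentials from a root-owned environment file. This is a sketch under assumptions: the unit name, install paths, and Bun location are hypothetical, and a production deployment would likely run a built entry point rather than `bun run dev`.

```
# /etc/systemd/system/x-post-mcp.service — hypothetical unit; paths are assumptions
[Unit]
Description=x-post MCP server
After=network-online.target

[Service]
WorkingDirectory=/opt/x-post-mcp/server
# Inject credentials from a root-owned file instead of hard-coding them.
EnvironmentFile=/etc/x-post-mcp/server.env
ExecStart=/usr/local/bin/bun run dev
Restart=on-failure

[Install]
WantedBy=multi-user.target
```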
- If the MCP tool cannot post, check server logs for rate limits or invalid post content; ensure messages comply with X’s policies.