
MCP server from aws-samples/sample-serverless-mcp-server

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio aws-samples-sample-serverless-mcp-server \
  --env GITHUB_PERSONAL_ACCESS_TOKEN="YOUR_GITHUB_PERSONAL_ACCESS_TOKEN" \
  -- npx -y sample-serverless-mcp-server

How to use

This MCP server implements the Streamable HTTP MCP protocol and is designed to run on AWS Lambda in a serverless architecture. It adapts the official TypeScript MCP server to support streamable HTTP and dynamic context prioritization, enabling chunked HTTP responses and scalable, event-driven MCP handling. To run locally, use the Serverless Offline workflow to simulate AWS Lambda and API Gateway. The repository exposes commands for deploying to Lambda and for running a local offline server, so you can test MCP requests end to end against a local proxy that mimics AWS infrastructure. The tooling centers on the Serverless framework and npm scripts, including offline testing and deployment routines.
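The handshake described above can be sketched from the client side. A minimal Python sketch of a Streamable HTTP MCP `initialize` request, assuming a local Serverless Offline endpoint (the URL and path `http://localhost:3000/mcp` are placeholders; match them to your serverless.yml):

```python
import json

# Hypothetical local endpoint exposed by Serverless Offline; adjust
# the port and path to whatever your serverless.yml configures.
MCP_ENDPOINT = "http://localhost:3000/mcp"

def build_initialize_request(request_id=1):
    """Return the JSON-RPC 2.0 body for an MCP initialize handshake."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "local-test-client", "version": "0.1.0"},
        },
    }

# Streamable HTTP clients POST JSON-RPC messages and must be prepared
# for either a plain JSON reply or an SSE stream, so advertise both.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}

body = json.dumps(build_initialize_request())
```

You can send `body` with any HTTP client (for example `curl -X POST -d "$BODY"` against the offline server) to exercise the Lambda handler end to end.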

How to install

Prerequisites:

  • Node.js 20+ (including npm)
  • AWS CLI configured with appropriate credentials
  • OSS-Serverless CLI (for serverless operations)
  • Git

Step-by-step installation:

  1. Clone the repository and change into the GitHub sample directory:
     git clone https://github.com/aws-samples/sample-serverless-mcp-server.git
     cd sample-serverless-mcp-server/src/github/

  2. Install dependencies: npm install

  3. Install Serverless Offline tooling (if not already installed globally): npm install -g osls

  4. Copy and customize the serverless configuration: cp serverless.example.yml serverless.yml

    Edit serverless.yml as needed, including setting GITHUB_PERSONAL_ACCESS_TOKEN under the environment variables.

  5. Test locally (offline mode simulating AWS API Gateway/Lambda): sls offline

  6. Deploy to AWS Lambda: sls deploy

Notes:

  • Ensure your GitHub Personal Access Token has the necessary scopes for repository access as required by the MCP server logic.
  • The server uses Streamable HTTP over HTTP chunked transfer encoding; ensure your local proxy or test harness supports chunked responses.
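The chunked-transfer note above matters because a Streamable HTTP server may answer a POST with a `text/event-stream` body rather than a single JSON document. A minimal Python sketch of extracting JSON-RPC messages from such a body, assuming SSE-style `data:` framing (a sketch for local testing, not the repository's own client code):

```python
import json

def parse_sse_events(raw: str):
    """Extract JSON payloads from an SSE-formatted response body.

    SSE separates events with a blank line; each event's "data:" lines
    together carry one JSON-RPC message.
    """
    messages = []
    for block in raw.split("\n\n"):
        data_lines = [
            line[len("data:"):].strip()
            for line in block.splitlines()
            if line.startswith("data:")
        ]
        if data_lines:
            messages.append(json.loads("\n".join(data_lines)))
    return messages

# Example body as a buffered client might see it after the chunks arrive:
sample = 'data: {"jsonrpc": "2.0", "id": 1, "result": {"ok": true}}\n\n'
```

If your HTTP client buffers the whole response before handing it over, you lose streaming latency but the parsing above still works; for true incremental handling, read the socket line by line instead.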

Additional notes

Environment variables and configuration:

  • GITHUB_PERSONAL_ACCESS_TOKEN: Your GitHub token used to access repository data during MCP operations. Do not commit this token to public repositories.
  • Serverless deployment requires appropriate AWS permissions for creating Lambda, API Gateway, and related resources.
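Because the token must reach the process without ever being committed, a fail-fast check at startup is a common pattern. A hedged Python sketch (the helper name `require_github_token` is illustrative, not part of the repository):

```python
import os

def require_github_token(env=os.environ):
    """Fail fast if the token the MCP server needs is missing.

    GITHUB_PERSONAL_ACCESS_TOKEN is the variable named in serverless.yml;
    export it in your shell or inject it from a secrets manager rather
    than hard-coding it in version-controlled files.
    """
    token = env.get("GITHUB_PERSONAL_ACCESS_TOKEN")
    if not token:
        raise RuntimeError(
            "GITHUB_PERSONAL_ACCESS_TOKEN is not set; the MCP server "
            "cannot access GitHub repository data without it."
        )
    return token
```

Running this check before starting the offline server turns a confusing mid-request GitHub API failure into an immediate, explicit configuration error.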

Common issues:

  • If sls offline fails to start, verify your Node.js version and ensure that the OSS-Serverless CLI is installed and on your PATH.
  • Ensure serverless.yml includes the correct environment variable mappings and that the token is accessible to the Lambda execution role when deployed.
  • For streamable HTTP, verify chunked transfer support in your testing proxy or client, as some HTTP clients may buffer responses differently.

Configuration tips:

  • You can tune dynamic context prioritization by adjusting Lambda environment variables or the MCP server’s runtime configuration if exposed by the deployment package.
  • Monitor logs in CloudWatch (Lambda) or the local console output during offline tests to trace MCP request handling and streaming behavior.
