
linkedin

This MCP server allows Claude and other AI assistants to access your LinkedIn. Scrape LinkedIn profiles, companies and jobs, and perform job searches.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio stickerdaniel-linkedin-mcp-server uvx linkedin-scraper-mcp

How to use

This MCP server exposes a LinkedIn scraping toolkit that AI assistants and clients can use to access LinkedIn data through a persistent browser-based workflow. The server runs via uvx, the tool runner that ships with uv and executes Python packages on demand. Before using the server, you authenticate once with a persistent browser profile so that login and session state survive across runs. The available tools cover fetching person and company profiles, retrieving company posts, searching for jobs or people, and obtaining detailed job information. Typical usage is to start the server with uvx linkedin-scraper-mcp and send requests to the configured MCP endpoint (default host 127.0.0.1, port 8000, path /mcp) from your client or inspector tool. The setup centers on persistent profile management and a login flow that keeps an authenticated LinkedIn session for scraping.
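As a sketch of what "sending requests to the configured MCP endpoint" looks like on the wire, the snippet below builds a standard JSON-RPC 2.0 request (the tools/list method is part of the MCP protocol) and shows the curl call you could make against the default endpoint once the server is running in HTTP mode:

```shell
# Build a JSON-RPC 2.0 request an MCP client would POST to the endpoint.
# URL uses the defaults mentioned above (host 127.0.0.1, port 8000, path /mcp).
PAYLOAD='{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
echo "$PAYLOAD"
# With the server running in HTTP mode, send it with:
# curl -s -X POST http://127.0.0.1:8000/mcp \
#   -H 'Content-Type: application/json' -d "$PAYLOAD"
```

In practice your MCP client or inspector tool handles this framing for you; the raw request is shown only to make the endpoint contract concrete.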

How to install

Prerequisites:

  • Python 3.8+ installed on your system
  • uv (Astral's Python package and project manager, which provides the uvx runner) installed (see install steps below)
  • Optional: Docker if you prefer containerized deployment

Step 1: Install uv

Step 2: Install the LinkedIn MCP server package

  • Make sure uvx is available (it ships with uv): uvx --version
  • Run the package's login step to set up authentication: uvx linkedin-scraper-mcp --login

    This guides you through a manual login flow and creates a persistent browser profile.

Step 3: Run the MCP server (examples)

  • Local stdio transport (default): uvx linkedin-scraper-mcp
  • For HTTP mode (web clients), start the server with the appropriate transport flags (see the server's documentation for the supported transport options)

Step 4: Configure a client to talk to the MCP

  • Create a client configuration file (mcp.json) like: { "mcpServers": { "linkedin": { "command": "uvx", "args": ["linkedin-scraper-mcp"] } } }
  • Start the server via the above command and point your MCP client to http://127.0.0.1:8000/mcp (default) or the configured host/port.
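As a concrete sketch of Step 4, the configuration from the bullet above can be written to disk and sanity-checked; the file name mcp.json matches the example, but the exact location your client reads it from depends on the client:

```shell
# Write the client configuration shown above to mcp.json
cat > mcp.json <<'EOF'
{
  "mcpServers": {
    "linkedin": {
      "command": "uvx",
      "args": ["linkedin-scraper-mcp"]
    }
  }
}
EOF
# Sanity-check that the file is valid JSON before pointing a client at it
python3 -m json.tool mcp.json
```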

Notes:

  • If authentication issues arise, re-run the login step: uvx linkedin-scraper-mcp --login
  • The browser profile is saved to ~/.linkedin-mcp/profile/

Additional notes

Tips and common issues:

  • Ensure uv is installed (curl -LsSf https://astral.sh/uv/install.sh | sh) and uv --version is 0.4.0 or higher.
  • The LinkedIn login flow may require solving captchas; the profile is persistent and stored under ~/.linkedin-mcp/profile/.
  • If sessions expire or you encounter authentication errors, re-run uvx linkedin-scraper-mcp --login to create a new persistent profile.
  • Timeouts can be adjusted with --timeout MS or via the TIMEOUT environment variable if supported by the underlying MCP implementation.
  • If you need to change the browser path or profile location, use --chrome-path and --user-data-dir respectively, or set CHROME_PATH and related env vars.
  • Breaking changes to LinkedIn scraping may require you to update to the latest patchright-based workflow as noted in the README; older session.json and LINKEDIN_COOKIE env vars are no longer supported.
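The flag and environment-variable tips above can be combined into a single launch command. This is a sketch only: the flag names (--timeout, --user-data-dir, --chrome-path / CHROME_PATH) are taken from the tips list and should be checked against the server's --help output; the values are illustrative:

```shell
# Sketch: launch with explicit overrides (printed rather than executed,
# so the example is side-effect free)
export CHROME_PATH="/usr/bin/chromium"   # browser binary override (illustrative path)
CMD="uvx linkedin-scraper-mcp \
  --user-data-dir $HOME/.linkedin-mcp/profile \
  --timeout 30000"
echo "$CMD"
```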
