LinkedIn MCP Server
Through this LinkedIn MCP server, AI assistants like Claude can connect to your LinkedIn. Access profiles and companies, search for jobs, or get job details.
Installation Methods
Usage Examples
- Research the background of this candidate https://www.linkedin.com/in/stickerdaniel/
- Get this company profile for partnership discussions https://www.linkedin.com/company/inframs/
- Suggest improvements for my CV to target this job posting https://www.linkedin.com/jobs/view/4252026496
- What has Anthropic been posting about recently? https://www.linkedin.com/company/anthropicresearch/

Features & Tool Status
| Tool | Description | Status |
|---|---|---|
| `get_person_profile` | Get profile info with explicit section selection (experience, education, interests, honors, languages, contact_info, posts) | Working |
| `get_company_profile` | Extract company information with explicit section selection (posts, jobs) | Working |
| `get_company_posts` | Get recent posts from a company's LinkedIn feed | Working |
| `search_jobs` | Search for jobs with keywords and location filters | Working |
| `search_people` | Search for people by keywords and location | Working |
| `get_job_details` | Get detailed information about a specific job posting | Working |
| `close_session` | Close browser session and clean up resources | Working |
Tool responses keep human-readable section text and may also include a compact references map keyed by section. Each reference includes a typed target, a relative LinkedIn path (or an absolute external URL), and a short label/context when available.
When one section fails but the overall tool call still completes, responses may also include section_errors. Each entry contains structured diagnostics for that section, including the error type/message, a compact runtime summary, trace/log locations, matching-open-issue hints when available, and the path to a generated issue-ready markdown report with the full session details.
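As an illustration only, a response with one failed section might be shaped roughly like this (all field names and values here are hypothetical; the actual schema may differ):

```json
{
  "sections": {
    "experience": "Readable experience text..."
  },
  "references": {
    "experience": [
      {
        "target": "company",
        "path": "/company/anthropicresearch/",
        "label": "Anthropic"
      }
    ]
  },
  "section_errors": {
    "posts": {
      "error_type": "TimeoutError",
      "message": "Timed out waiting for the posts feed",
      "report_path": "/path/to/issue-report.md"
    }
  }
}
```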
> [!IMPORTANT]
> Breaking change: LinkedIn recently made changes to prevent scraping. The newest version uses Patchright with persistent browser profiles instead of Playwright with session files. Old `session.json` files and `LINKEDIN_COOKIE` env vars are no longer supported. Run `--login` again to create a new profile + cookie file that can be mounted in Docker. (02/2026)
🚀 uvx Setup (Recommended - Universal)
Prerequisites: Install uv and run `uvx patchright install chromium` to set up the browser.
Installation
Step 1: Create a session (first time only)
```shell
uvx linkedin-scraper-mcp --login
```

This opens a browser for you to log in manually (5-minute timeout for 2FA, captcha, etc.). The browser profile is saved to `~/.linkedin-mcp/profile/`.
Step 2: Client Configuration:
```json
{
  "mcpServers": {
    "linkedin": {
      "command": "uvx",
      "args": ["linkedin-scraper-mcp"]
    }
  }
}
```

> [!NOTE]
> Sessions may expire over time. If you encounter authentication issues, run `uvx linkedin-scraper-mcp --login` again.
uvx Setup Help
🔧 Configuration
Transport Modes:
- Default (stdio): standard communication for local MCP servers
- Streamable HTTP: for web-based MCP clients
- If no transport is specified, the server defaults to `stdio`
- An interactive terminal without an explicit transport shows a chooser prompt
CLI Options:
- `--login` - Open browser to log in and save persistent profile
- `--no-headless` - Show browser window (useful for debugging scraping issues)
- `--log-level {DEBUG,INFO,WARNING,ERROR}` - Set logging level (default: WARNING)
- `--transport {stdio,streamable-http}` - Optional: force transport mode (default: stdio)
- `--host HOST` - HTTP server host (default: 127.0.0.1)
- `--port PORT` - HTTP server port (default: 8000)
- `--path PATH` - HTTP server path (default: /mcp)
- `--logout` - Clear stored LinkedIn browser profile
- `--timeout MS` - Browser timeout for page operations in milliseconds (default: 5000)
- `--user-data-dir PATH` - Path to persistent browser profile directory (default: ~/.linkedin-mcp/profile)
- `--chrome-path PATH` - Path to Chrome/Chromium executable (for custom browser installations)
Basic Usage Examples:
```shell
# Create a session interactively
uvx linkedin-scraper-mcp --login

# Run with debug logging
uvx linkedin-scraper-mcp --log-level DEBUG
```

HTTP Mode Example (for web-based MCP clients):

```shell
uvx linkedin-scraper-mcp --transport streamable-http --host 127.0.0.1 --port 8080 --path /mcp
```

Runtime server logs are emitted by FastMCP/Uvicorn.
Tool calls are serialized within a single server process to protect the shared
LinkedIn browser session. Concurrent client requests queue instead of running in
parallel. Use --log-level DEBUG to see scraper lock wait/acquire/release logs.
Test with MCP Inspector:
1. Install and run MCP Inspector: `bunx @modelcontextprotocol/inspector`
2. Click the pre-filled token URL to open the Inspector in your browser
3. Select "Streamable HTTP" as the transport type
4. Set the URL to `http://localhost:8080/mcp`
5. Connect
6. Test the tools
❗ Troubleshooting
Installation issues:
- Ensure you have uv installed: `curl -LsSf https://astral.sh/uv/install.sh | sh`
- Check the uv version: `uv --version` (should be 0.4.0 or higher)
Session issues:
- Browser profile is stored at `~/.linkedin-mcp/profile/`
- Make sure you have only one active LinkedIn session at a time
Login issues:
- LinkedIn may require a login confirmation in the LinkedIn mobile app for `--login`
- You might get a captcha challenge if you logged in frequently. Run `uvx linkedin-scraper-mcp --login`, which opens a browser where you can solve it manually.
Timeout issues:
- If pages fail to load or elements aren't found, try increasing the timeout: `--timeout 10000`
- Users on slow connections may need higher values (e.g., 15000-30000 ms)
- Can also be set via environment variable: `TIMEOUT=10000`
Custom Chrome path:
- If Chrome is installed in a non-standard location, use `--chrome-path /path/to/chrome`
- Can also be set via environment variable: `CHROME_PATH=/path/to/chrome`
🐳 Docker Setup
Prerequisites: Make sure you have Docker installed and running.
Authentication
Docker runs headless (no browser window), so you need to create a browser profile locally first and mount it into the container.
Step 1: Create profile on the host (one-time setup)
```shell
# Installed package usage
uvx linkedin-scraper-mcp --login

# Local development from this repo
uv run -m linkedin_mcp_server --login
```

If you are debugging or verifying code changes in this repository, prefer `uv run -m linkedin_mcp_server ...` so the running process matches your workspace files. Use `uvx` when intentionally testing the packaged distribution.
This opens a browser window where you log in manually (5-minute timeout for 2FA, captcha, etc.). The browser profile is saved to `~/.linkedin-mcp/profile/`.
After login, the host writes:
- Source profile: `~/.linkedin-mcp/profile/`
- Portable cookies: `~/.linkedin-mcp/cookies.json`
- Source session metadata: `~/.linkedin-mcp/source-state.json`
Docker foreign runtimes derive a Linux runtime profile under:
- `~/.linkedin-mcp/runtime-profiles/linux-amd64-container/profile/`
- `~/.linkedin-mcp/runtime-profiles/linux-amd64-container/storage-state.json`
- `~/.linkedin-mcp/runtime-profiles/linux-amd64-container/runtime-state.json`
By default, Docker now creates a fresh bridged Linux session on every startup using the minimal working auth cookie subset (`li_at`, `JSESSIONID`, `bcookie`, `bscookie`, `lidc`) and keeps that session alive for the server lifetime. This currently works more reliably than reusing a checkpointed derived runtime profile across restarts.
Runtime traces/logs are captured into an ephemeral run directory by default and are automatically preserved only when a scrape failure occurs. Set `LINKEDIN_TRACE_MODE=always` to keep every run or `LINKEDIN_TRACE_MODE=off` to disable trace persistence entirely.
If you want to experiment with persistent derived runtime reuse anyway, set `LINKEDIN_EXPERIMENTAL_PERSIST_DERIVED_SESSION=1`. In that mode, the first Docker run performs an internal checkpoint restart after `/feed/` succeeds, and later Docker runs try to reuse the committed Linux runtime profile directly.
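A sketch of how these variables could be passed into the container: the variable names come from the notes above, everything else mirrors the stock client config, and Docker's standard `-e` flag injects each variable into the container environment.

```json
{
  "mcpServers": {
    "linkedin": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "LINKEDIN_TRACE_MODE=always",
        "-e", "LINKEDIN_EXPERIMENTAL_PERSIST_DERIVED_SESSION=1",
        "-v", "~/.linkedin-mcp:/home/pwuser/.linkedin-mcp",
        "stickerdaniel/linkedin-mcp-server:latest"
      ]
    }
  }
}
```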
Step 2: Configure Claude Desktop with Docker
```json
{
  "mcpServers": {
    "linkedin": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-v", "~/.linkedin-mcp:/home/pwuser/.linkedin-mcp",
        "stickerdaniel/linkedin-mcp-server:latest"
      ]
    }
  }
}
```

> [!NOTE]
> Docker now fresh-bridges by default on each startup. Persistent derived runtime reuse is still available behind `LINKEDIN_EXPERIMENTAL_PERSIST_DERIVED_SESSION=1`, but it remains experimental.
> [!NOTE]
> **Why can't I run `--login` in Docker?** Docker containers don't have a display server. Create a profile on your host using the uvx setup and mount it into Docker.
Docker Setup Help
🔧 Configuration
Transport Modes:
- Default (stdio): standard communication for local MCP servers
- Streamable HTTP: for web-based MCP clients
- If no transport is specified, the server defaults to `stdio`
- An interactive terminal without an explicit transport shows a chooser prompt
CLI Options:
- `--log-level {DEBUG,INFO,WARNING,ERROR}` - Set logging level (default: WARNING)
- `--transport {stdio,streamable-http}` - Optional: force transport mode (default: stdio)
- `--host HOST` - HTTP server host (default: 127.0.0.1)
- `--port PORT` - HTTP server port (default: 8000)
- `--path PATH` - HTTP server path (default: /mcp)
- `--logout` - Clear all stored LinkedIn auth state, including source and derived runtime profiles
- `--timeout MS` - Browser timeout for page operations in milliseconds (default: 5000)
- `--user-data-dir PATH` - Path to persistent browser profile directory (default: ~/.linkedin-mcp/profile)
- `--chrome-path PATH` - Path to Chrome/Chromium executable (rarely needed in Docker)
> [!NOTE]
> `--login` and `--no-headless` are not available in Docker (no display server). Use the uvx setup to create profiles.
HTTP Mode Example (for web-based MCP clients):
```shell
docker run -it --rm \
  -v ~/.linkedin-mcp:/home/pwuser/.linkedin-mcp \
  -p 8080:8080 \
  stickerdaniel/linkedin-mcp-server:latest \
  --transport streamable-http --host 0.0.0.0 --port 8080 --path /mcp
```

Runtime server logs are emitted by FastMCP/Uvicorn.
Test with MCP Inspector:
1. Install and run MCP Inspector: `bunx @modelcontextprotocol/inspector`
2. Click the pre-filled token URL to open the Inspector in your browser
3. Select "Streamable HTTP" as the transport type
4. Set the URL to `http://localhost:8080/mcp`
5. Connect
6. Test the tools
❗ Troubleshooting
Docker issues:
- Make sure Docker is installed
- Check if Docker is running: `docker ps`
Login issues:
- Make sure you have only one active LinkedIn session at a time
- LinkedIn may require a login confirmation in the LinkedIn mobile app for `--login`
- You might get a captcha challenge if you logged in frequently. Run `uvx linkedin-scraper-mcp --login`, which opens a browser where you can solve captchas manually. See the uvx setup for prerequisites.
- If Docker auth becomes stale after you re-login on the host, restart Docker once so it can fresh-bridge from the new source session generation.
Timeout issues:
- If pages fail to load or elements aren't found, try increasing the timeout: `--timeout 10000`
- Users on slow connections may need higher values (e.g., 15000-30000 ms)
- Can also be set via environment variable: `TIMEOUT=10000`
Custom Chrome path:
- If Chrome is installed in a non-standard location, use `--chrome-path /path/to/chrome`
- Can also be set via environment variable: `CHROME_PATH=/path/to/chrome`
📦 Claude Desktop (DXT Extension)
Prerequisites: Claude Desktop and Docker installed & running
One-click installation for Claude Desktop users:
1. Download the DXT extension
2. Double-click to install into Claude Desktop
3. Create a session: `uvx linkedin-scraper-mcp --login`
> [!NOTE]
> Sessions may expire over time. If you encounter authentication issues, run `uvx linkedin-scraper-mcp --login` again.
DXT Extension Setup Help
❗ Troubleshooting
First-time setup timeout:
- Claude Desktop has a ~60 second connection timeout
- If the Docker image isn't cached, the pull may exceed this timeout
- Fix: Pre-pull the image before first use: `docker pull stickerdaniel/linkedin-mcp-server:2.3.0`
- Then restart Claude Desktop
Docker issues:
- Make sure Docker is installed
- Check if Docker is running: `docker ps`
Login issues:
- Make sure you have only one active LinkedIn session at a time
- LinkedIn may require a login confirmation in the LinkedIn mobile app for `--login`
- You might get a captcha challenge if you logged in frequently. Run `uvx linkedin-scraper-mcp --login`, which opens a browser where you can solve captchas manually. See the uvx setup for prerequisites.
Timeout issues:
- If pages fail to load or elements aren't found, try increasing the timeout: `--timeout 10000`
- Users on slow connections may need higher values (e.g., 15000-30000 ms)
- Can also be set via environment variable: `TIMEOUT=10000`
🐍 Local Setup (Develop & Contribute)
Contributions are welcome! See CONTRIBUTING.md for architecture guidelines and checklists. Please open an issue first to discuss the feature or bug fix before submitting a PR.
Prerequisites: Git and uv installed
Installation
```shell
# 1. Clone repository
git clone https://github.com/stickerdaniel/linkedin-mcp-server
cd linkedin-mcp-server

# 2. Install the uv package manager (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh

# 3. Install dependencies
uv sync
uv sync --group dev

# 4. Install Patchright browser
uv run patchright install chromium

# 5. Install pre-commit hooks
uv run pre-commit install

# 6. Create a session (first time only)
uv run -m linkedin_mcp_server --login

# 7. Start the server
uv run -m linkedin_mcp_server
```

Local Setup Help
🔧 Configuration
CLI Options:
- `--login` - Open browser to log in and save persistent profile
- `--no-headless` - Show browser window (useful for debugging scraping issues)
- `--log-level {DEBUG,INFO,WARNING,ERROR}` - Set logging level (default: WARNING)
- `--transport {stdio,streamable-http}` - Optional: force transport mode (default: stdio)
- `--host HOST` - HTTP server host (default: 127.0.0.1)
- `--port PORT` - HTTP server port (default: 8000)
- `--path PATH` - HTTP server path (default: /mcp)
- `--logout` - Clear stored LinkedIn browser profile
- `--timeout MS` - Browser timeout for page operations in milliseconds (default: 5000)
- `--status` - Check if the current session is valid and exit
- `--user-data-dir PATH` - Path to persistent browser profile directory (default: ~/.linkedin-mcp/profile)
- `--slow-mo MS` - Delay between browser actions in milliseconds (default: 0, useful for debugging)
- `--user-agent STRING` - Custom browser user agent
- `--viewport WxH` - Browser viewport size (default: 1280x720)
- `--chrome-path PATH` - Path to Chrome/Chromium executable (for custom browser installations)
- `--help` - Show help
Note: Most CLI options have environment variable equivalents. See `.env.example` for details.
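For instance, a minimal `.env` might look like the sketch below. Only `TIMEOUT` and `CHROME_PATH` are confirmed elsewhere in this README; check `.env.example` for the authoritative list of supported variables.

```
TIMEOUT=10000
CHROME_PATH=/path/to/chrome
```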
HTTP Mode Example (for web-based MCP clients):
```shell
uv run -m linkedin_mcp_server --transport streamable-http --host 127.0.0.1 --port 8000 --path /mcp
```

Claude Desktop:
```json
{
  "mcpServers": {
    "linkedin": {
      "command": "uv",
      "args": ["--directory", "/path/to/linkedin-mcp-server", "run", "-m", "linkedin_mcp_server"]
    }
  }
}
```

stdio is used by default for this config.
❗ Troubleshooting
Login issues:
- Make sure you have only one active LinkedIn session at a time
- LinkedIn may require a login confirmation in the LinkedIn mobile app for `--login`
- You might get a captcha challenge if you logged in frequently. The `--login` command opens a browser where you can solve it manually.
Scraping issues:
- Use `--no-headless` to see browser actions and debug scraping problems
- Add `--log-level DEBUG` for more detailed logging
Session issues:
- Browser profile is stored at `~/.linkedin-mcp/profile/`
- Use `--logout` to clear the profile and start fresh
Python/Patchright issues:
- Check Python version: `python --version` (should be 3.12+)
- Reinstall Patchright: `uv run patchright install chromium`
- Reinstall dependencies: `uv sync --reinstall`
Timeout issues:
- If pages fail to load or elements aren't found, try increasing the timeout: `--timeout 10000`
- Users on slow connections may need higher values (e.g., 15000-30000 ms)
- Can also be set via environment variable: `TIMEOUT=10000`
Custom Chrome path:
- If Chrome is installed in a non-standard location, use `--chrome-path /path/to/chrome`
- Can also be set via environment variable: `CHROME_PATH=/path/to/chrome`
Acknowledgements
Built with FastMCP and Patchright.
⚠️ Use in accordance with LinkedIn's Terms of Service. Web scraping may violate LinkedIn's terms. This tool is for personal use only.
License
This project is licensed under the Apache 2.0 license.