Claude-LMStudio Bridge (MCP Server)

By infinitimeless (GitHub)
Tags: MCP, AI, Local LLM

💡 Description

An MCP server that connects Claude to local LLMs running in LM Studio, letting Claude list the available models, generate text, and run chat completions against the local server.

📝 JSON Entries

macOS / Linux:

{
  "mcpServers": {
    "lmstudio-bridge": {
      "command": "/bin/bash",
      "args": [
        "/path/to/claude-lmstudio-bridge/run_server.sh"
      ],
      "env": {}
    }
  }
}

Windows:

{
  "mcpServers": {
    "lmstudio-bridge": {
      "command": "cmd.exe",
      "args": [
        "/c",
        "C:\\path\\to\\claude-lmstudio-bridge\\run_server.bat"
      ],
      "env": {}
    }
  }
}
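
These entries go in Claude Desktop's claude_desktop_config.json, with the placeholder paths replaced by the location of a local checkout of the repository. Before restarting Claude Desktop it can be worth confirming that LM Studio's local server is reachable at all; the stand-alone check below assumes LM Studio's default OpenAI-compatible endpoint on port 1234, which may differ in your setup.

# Stand-alone pre-flight check (not part of the bridge): verify that the
# LM Studio local server answers before pointing Claude at it.
# Assumption: LM Studio's default base URL http://localhost:1234/v1.
import requests

BASE_URL = "http://localhost:1234/v1"

def lmstudio_is_up(base_url: str = BASE_URL) -> bool:
    """Return True if LM Studio's OpenAI-compatible /models endpoint responds."""
    try:
        resp = requests.get(f"{base_url}/models", timeout=5)
        resp.raise_for_status()
        models = [m["id"] for m in resp.json().get("data", [])]
        print(f"LM Studio reachable; {len(models)} model(s) available: {models}")
        return True
    except requests.RequestException as exc:
        print(f"LM Studio server not reachable: {exc}")
        return False

if __name__ == "__main__":
    lmstudio_is_up()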

🛠️ Tools

Python, LM Studio

⚡ Features

  • List all models available in LM Studio
  • Generate text using a local LLM
  • Chat completion support
  • Status check tool for LM Studio connection
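
For orientation, the sketch below shows how tools of this kind can be exposed over MCP using the official Python SDK (FastMCP) together with LM Studio's OpenAI-compatible REST API. The tool names, server name, and base URL are illustrative assumptions; the bridge's actual implementation in the repository may be structured differently.

# Illustrative sketch only, not the bridge's actual source code.
# Assumptions: the `mcp` Python SDK is installed and LM Studio serves its
# OpenAI-compatible API at the default http://localhost:1234/v1.
import requests
from mcp.server.fastmcp import FastMCP

LMSTUDIO_URL = "http://localhost:1234/v1"
mcp = FastMCP("lmstudio-bridge-sketch")  # hypothetical server name

@mcp.tool()
def list_models() -> str:
    """List the models currently available in LM Studio."""
    resp = requests.get(f"{LMSTUDIO_URL}/models", timeout=10)
    resp.raise_for_status()
    return ", ".join(m["id"] for m in resp.json().get("data", []))

@mcp.tool()
def chat_completion(prompt: str, model: str, temperature: float = 0.7) -> str:
    """Send a single-turn chat completion to the local LLM and return its reply."""
    payload = {
        "model": model,  # identifier as reported by list_models()
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    resp = requests.post(f"{LMSTUDIO_URL}/chat/completions", json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    mcp.run()  # stdio transport, matching the JSON entries above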

💬 Example Queries

  • Can you check if my LM Studio server is running?
  • List the available models in my local LM Studio
  • Generate a short poem about spring using my local LLM
  • Ask my local LLM: 'What are the main features of transformers in machine learning?'
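
Under the hood, a query like the last one leads Claude to send a JSON-RPC tools/call request to the bridge over stdio. Its shape is roughly as follows; the tool and argument names mirror the hypothetical sketch above rather than the real bridge's tool schema, and the model identifier is only an example.

# Approximate shape of the MCP tools/call message Claude issues for the last
# example query. Tool/argument names follow the sketch above (assumptions),
# not necessarily the bridge's real tool schema.
tool_call = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "chat_completion",
        "arguments": {
            "prompt": "What are the main features of transformers in machine learning?",
            "model": "llama-3.2-3b-instruct",  # example identifier only
        },
    },
}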