Ollama MCP Server


Overview

The Ollama MCP Server lets AI agents use locally running language models through Ollama. It provides tools for listing available models, generating text, creating embeddings, and pulling new models, making it well suited to privacy-sensitive applications that require on-premises AI inference.
Tools & Capabilities

generate — Generate text using a local model

list_models — List available local models

embed — Generate embeddings from text

pull_model — Pull a new model from the Ollama registry
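
Each of these tools is invoked through the standard MCP `tools/call` request. Below is a minimal sketch of the JSON-RPC 2.0 framing an MCP client sends; the argument names for `generate` (`model`, `prompt`) are plausible assumptions, not taken from this server's published schema.

```typescript
// Build an MCP tools/call request (JSON-RPC 2.0 framing, as defined by
// the Model Context Protocol). The argument shape for "generate" is an
// assumption for illustration, not this server's documented schema.
function buildToolCall(name: string, args: Record<string, unknown>) {
  return {
    jsonrpc: "2.0" as const,
    id: 1,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Example: asking the (hypothetical) "llama3" model for a completion.
const req = buildToolCall("generate", { model: "llama3", prompt: "Hello" });
```

In practice an MCP client library (such as the official TypeScript SDK) builds and sends this request for you; the sketch only shows what crosses the wire.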

Installation

```bash
npx -y mcp-server-ollama
```
Example Usage

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "mcp-server-ollama"],
      "env": {
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```
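
The `OLLAMA_URL` variable points the server at a running Ollama instance; `http://localhost:11434` is Ollama's default listen address. A sketch of the kind of fallback resolution the server presumably performs (the fallback behavior is an assumption, not confirmed from this server's source):

```typescript
// Resolve the Ollama endpoint from the environment, falling back to
// Ollama's default listen address. The fallback itself is an assumed
// behavior of mcp-server-ollama, shown here for illustration.
function resolveOllamaUrl(env: Record<string, string | undefined>): string {
  return env.OLLAMA_URL ?? "http://localhost:11434";
}
```

Set `OLLAMA_URL` explicitly, as in the config above, when Ollama runs on a non-default host or port.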

Quick Info

Author: community
Language: TypeScript
Status: Stable
Stars: 240
Last Updated: Feb 12, 2026

Need a Custom MCP Server?

Our team builds custom MCP servers tailored to your workflow. From proprietary data sources to internal tools, we have you covered.

Contact Us