Use a local Ollama model with MCPHost and Reltio MCP Server
Learn how to install MCPHost, configure it to connect to the Reltio MCP Server, and run it with a local Ollama model on Windows, macOS, and Linux.
This topic explains how to set up and run MCPHost with a local Ollama model connected to the Reltio MCP Server. You'll install MCPHost using Go, create a configuration file with your MCP Server details, pull and serve an Ollama model locally, and then run MCPHost so it can interact with the MCP Server through that model. The steps in this procedure are specific to Windows, macOS, and Linux, and they use only commands and parameters provided in the supported setup process.
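The configuration file mentioned above can be sketched as follows. This is a hypothetical example only: the file name (mcp-config.json), the mcpServers layout, the server URL placeholder, and the Authorization header are assumptions for illustration — substitute the exact schema, endpoint, and header names from your Reltio MCP Server documentation and your own server token.

```json
{
  "mcpServers": {
    "reltio": {
      "url": "https://<your-reltio-environment>/mcp",
      "headers": {
        "Authorization": "Bearer <your-reltio-mcp-server-token>"
      }
    }
  }
}
```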
Prerequisites
- Go installed (v1.23+): https://go.dev/dl
- Ollama installed: https://ollama.com
- MCPHost CLI, installed using Go
- A valid Reltio MCP server token
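With these prerequisites in place, the end-to-end setup can be sketched as the commands below. The MCPHost module path is taken from the public mark3labs/mcphost repository, and llama3.2 and the config file name are illustrative assumptions — verify the install path, flags, and model name against the official MCPHost and Ollama documentation before running.

```shell
# Install the MCPHost CLI into the Go bin directory (typically $HOME/go/bin);
# module path assumed from the public mark3labs/mcphost repository
go install github.com/mark3labs/mcphost@latest

# Pull a local model and start the Ollama server
# (llama3.2 is an example; pick a tool-capable model)
ollama pull llama3.2
ollama serve

# Run MCPHost against the local model, using the configuration file
# that points at the Reltio MCP Server
mcphost --model ollama:llama3.2 --config ./mcp-config.json
```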
Troubleshooting for the Ollama model setup with MCPHost and Reltio MCP Server
- command not found: mcphost / 'mcphost' is not recognized
- macOS/Linux: The installed binary is placed in the Go bin directory, typically $HOME/go/bin. If this directory is not in your shell's PATH, the system will not find the mcphost command. To fix, add this line to your shell profile file (~/.bash_profile, ~/.zshrc, or ~/.bashrc) so the change persists across sessions:
export PATH=$PATH:$HOME/go/bin
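To apply the change to the current session without restarting the shell, you can run the same export directly and confirm that the directory is now on PATH:

```shell
# Add Go's bin directory to PATH for the current shell session
export PATH="$PATH:$HOME/go/bin"

# Confirm the directory now appears as a PATH entry
echo "$PATH" | tr ':' '\n' | grep -x "$HOME/go/bin"
```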
- Windows: The installed binary is placed in C:\Users\<your-username>\go\bin. If this folder is not in your PATH, Windows will not find mcphost. To fix for the current PowerShell session:
$env:Path = "$env:Path;$env:USERPROFILE\go\bin"
To make this permanent, add the folder to your User PATH.
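One way to persist the change from PowerShell, as a sketch, uses the .NET [Environment]::SetEnvironmentVariable method; editing the User PATH through the Windows Environment Variables dialog works equally well.

```powershell
# Read the current User PATH, append Go's bin folder, and write it back.
# New terminals pick up the change; the current session keeps its own $env:Path.
$userPath = [Environment]::GetEnvironmentVariable('Path', 'User')
[Environment]::SetEnvironmentVariable('Path', "$userPath;$env:USERPROFILE\go\bin", 'User')
```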
- Slowness/unresponsiveness
- Select a smaller model and repeat the process. Search the Ollama model library for available models.
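Switching to a smaller model is done with the same pull command; the tag below (llama3.2:3b) is an example from the public Ollama library and may change, so check the library for current options.

```shell
# Pull a smaller model to reduce memory use and response latency
# (llama3.2:3b is an example tag; confirm it in the Ollama library)
ollama pull llama3.2:3b

# Then point MCPHost at the smaller model when you rerun it
```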
- Agent error
error during Chat request: registry.ollama.ai/library/<model> does not support tools
Select a model that supports tools and repeat the process. Search Ollama for models that are compatible with tool use.
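Recovering from this error means pulling a model whose Ollama library page lists tool support and rerunning MCPHost against it. In the sketch below, qwen2.5 is one example of a tool-capable model and the --model/--config flags follow MCPHost's documented usage — verify both against the current MCPHost and Ollama documentation.

```shell
# Pull a model that supports tool calling (qwen2.5 is one example)
ollama pull qwen2.5

# Rerun MCPHost with the tool-capable model and your existing config
mcphost --model ollama:qwen2.5 --config ./mcp-config.json
```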