Ollama
✓ Official · ⭐ Featured · by ollama
About
Run large language models locally with Ollama. Pull models like Llama 3, Phi-3, and Gemma, execute prompts, and manage the model library from AI assistants.
Categories
Frequently Asked Questions
What is the Ollama MCP server?
The Ollama MCP server lets AI assistants run large language models locally through Ollama: pull models like Llama 3, Phi-3, and Gemma, execute prompts, and manage the local model library.
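Under the hood, Ollama serves a local HTTP API (by default on `http://localhost:11434`), and an MCP server's tools for pulling models and executing prompts map onto endpoints such as `/api/pull` and `/api/generate`. A minimal sketch of building those request payloads, assuming the field names from Ollama's documented REST API (the helper functions themselves are illustrative, not part of the MCP server):

```python
import json

# Ollama's default local API endpoint (assumption: default install, no custom port).
OLLAMA_URL = "http://localhost:11434"

def pull_request(model: str) -> dict:
    """Build the JSON body for POST /api/pull, which downloads a model."""
    return {"model": model}

def generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for POST /api/generate, which runs a prompt.

    stream=False asks for a single JSON response instead of a token stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}

# Example payloads for pulling Llama 3 and then prompting it:
print(json.dumps(pull_request("llama3")))
print(json.dumps(generate_request("llama3", "Why is the sky blue?")))
```

Sending these bodies with any HTTP client to a running Ollama instance is all an assistant-facing tool needs to do; listing installed models is a plain GET to `/api/tags`.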
How do I install Ollama?
Visit the GitHub repository for installation instructions.
What AI clients work with Ollama?
Related Guides
Quick Info
- Install Type: binary
- Author: ollama
- Categories: 1
- Integrations: 5
Related Servers
🧠✓
Memory
Knowledge graph-based persistent memory system. Store and retrieve contextual information.
🤖✓
Sequential Thinking
Dynamic and reflective problem-solving through thought sequences.
🔍
Exa
Search Engine made for AIs. Neural search with understanding of content meaning.
🗄️
Milvus
Search, query, and interact with data in your Milvus vector database.
🗄️
Chroma
Embeddings, vector search, document storage, and full-text search with the open-source AI application database.