Raycast Ollama

Use Ollama for local LLM inference from Raycast. This application is not directly affiliated with Ollama.ai.

Requirements

Ollama installed and running on your Mac. At least one model needs to be installed through the Ollama CLI tools or with the 'Manage Models' command. You can find all available models here.
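For example, a model can be pulled from the terminal with the Ollama CLI before first use (the model name below is just an example; any model from the Ollama library works):

```shell
# Pull a model from the Ollama library (model name is an example)
ollama pull llama3

# List the models installed locally to confirm the pull succeeded
ollama list
```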

How to Use

Command: Manage Models

View, add, and remove models that are installed locally or on a configured remote Ollama Server. To manage and utilize models from the remote server, use the Add Server action.
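Before adding a remote server, it can help to confirm the Ollama HTTP API is reachable. A quick check from the terminal, assuming the default Ollama port (replace `localhost` with the remote host as needed):

```shell
# Query the Ollama REST API for its installed models
# (11434 is Ollama's default port; adjust host/port for a remote server)
curl http://localhost:11434/api/tags
```

If the server is reachable, this returns a JSON list of installed models.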

Command: Chat With Ollama

Chat with your preferred model from Raycast, with the following features:

  • CMD+M, Change Model: switch models at any time, for example to use a different one for vision or embedding.
  • CMD+S, Selection: Add text from the selection or clipboard to the prompt.
  • CMD+B, Browser Selection Tab: Add content from the selected tab to the prompt. The Raycast Browser Extension is required.
  • CMD+I, Image From Clipboard: Add a JPEG or PNG image to the prompt. A model with vision capabilities is required.
  • CMD+F, File: Add content from files. This feature is still experimental.

From the extension preferences you can choose how many messages to use as memory. By default it uses the last 20 messages.

Command: Create Custom Commands

All preconfigured commands are crafted for general use. This command lets you create a custom command for your specific needs.