node-red-contrib-ollama

Add AI functionality to your flows! This module includes a set of nodes that enable easy communication with Ollama, enriching your projects with intelligent solutions.

A Node-RED module that wraps the ollama.js library, offering its functionalities as configurable nodes for easy integration into flows.

Requirements

This module requires Node.js version 18 or higher.

To use it, you need to have Node-RED installed on your system. For more information on how to install Node-RED, refer to the official Node-RED documentation.

You also need either:

  • Ollama running on the same or a different system (local installation). For detailed instructions on how to install Ollama, please refer to the official Ollama site.
  • An Ollama Cloud account with an API key. Get your API key from Ollama Cloud Settings.

Installation

To install the module, you can use the Node-RED editor or the Node-RED command-line tool.

Node-RED Editor

  1. Open the Node-RED editor.
  2. Click on the menu button in the top-right corner.
  3. Select "Manage palette".
  4. Go to the "Install" tab.
  5. Search for "node-red-contrib-ollama".
  6. Click on the "Install" button.

Node-RED Command-Line Tool

  1. Open a terminal.
  2. Run the following command in your Node-RED user directory (typically ~/.node-red):
npm install node-red-contrib-ollama

If you installed the module from the command line, restart Node-RED to load the new nodes. Modules installed through the palette manager are loaded without a restart.

Configuration

Server Configuration

Before using any Ollama nodes, you need to configure the Ollama server connection:

  1. Add any Ollama node to your flow.
  2. Open the node configuration.
  3. Create a new Ollama server configuration.

Local Ollama Server

For a local Ollama installation:

  • Host: localhost (or the IP address of your Ollama server)
  • Port: 11434 (default Ollama port)

Ollama Cloud

For Ollama Cloud:

  1. Check the "Use Ollama Cloud" checkbox.
  2. Enter your API Key (get it from Ollama Cloud Settings).
  3. The host will automatically be set to ollama.com and the port to 443.
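The nodes handle the connection for you, but the two configurations above can be sketched as a small helper that builds a client configuration in the style of ollama.js. The option names (`host`, `headers`) follow the ollama.js `Ollama` constructor; the Bearer-token header for Ollama Cloud and the exact mapping used internally by this module are assumptions, not taken from its source.

```javascript
// Hedged sketch: how the editor settings above might translate into an
// ollama.js-style client configuration. `useCloud`, `apiKey`, `host`, and
// `port` mirror the fields described in this section; the Authorization
// header format for Ollama Cloud is an assumption.
function buildServerConfig({ useCloud = false, apiKey = '', host = 'localhost', port = 11434 } = {}) {
    if (useCloud) {
        // Cloud: host fixed to ollama.com on port 443, authenticated by API key
        return {
            host: 'https://ollama.com:443',
            headers: { Authorization: `Bearer ${apiKey}` }
        };
    }
    // Local: plain HTTP against the configured host and port
    return { host: `http://${host}:${port}` };
}
```

With no arguments this yields the local defaults described above (`http://localhost:11434`); with `useCloud: true` it yields the cloud endpoint plus the key.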

Usage

The module provides a set of nodes that can be used to interact with the ollama.js library. The nodes are:

  • Chat: Generate the next message in a chat with a provided model.
  • Copy: Copy a model, creating a duplicate of an existing model under a new name.
  • Create: Create a model from a Modelfile.
  • Delete: Delete a model and its data.
  • Embed: Generate embeddings from a model.
  • Generate: Generate a response for a given prompt with a provided model.
  • List: List models that are available.
  • Pull: Download a model from the ollama library.
  • Push: Upload a model to a model library. Requires registering for ollama.ai and adding a public key first.
  • Show: Show information about a model including details, modelfile, template, parameters, license, and system prompt.
  • Ps: List models that are currently loaded into memory.
  • Abort: Abort all currently running streamed generations.
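As an illustration of what a node such as Chat consumes, the sketch below composes a chat request in the message format used by ollama.js (`model` plus a `messages` array of `{ role, content }` objects). The model name `llama3` and the assumption that the Chat node accepts this structure via `msg.payload` are illustrative; consult each node's help text for the exact fields it expects.

```javascript
// Hedged sketch: building a chat request in the ollama.js message shape.
// Whether the Chat node reads exactly this structure from msg.payload is
// an assumption, not confirmed by this module's documentation.
function buildChatRequest(model, userText, history = []) {
    return {
        model,
        messages: [
            ...history,                             // prior turns, if any
            { role: 'user', content: userText }     // the new user message
        ]
    };
}

// Example: a single-turn request, e.g. assigned to msg.payload in a Function node
const request = buildChatRequest('llama3', 'Why is the sky blue?');
```

Passing the previous `messages` array back in as `history` is how multi-turn conversations are accumulated in this format.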

Each node has its own set of configuration options that can be used to customize its behavior. For more information on how to use each node, refer to the help text provided in the Node-RED editor.

License

This project is licensed under the MIT License - see the LICENSE file for details.
