# Ollama
To use Ollama embeddings, import `OllamaEmbedding` from `@llamaindex/ollama`.
Note that you need to pull the embedding model before using it. The example below uses the `nomic-embed-text` model, so pull it first:

```bash
ollama pull nomic-embed-text
```
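Ollama must be running locally for the examples below to work. You can verify that the model was pulled successfully with:

```bash
ollama list
```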
## Installation

Use the package manager of your choice:
```bash
npm i llamaindex @llamaindex/ollama
pnpm add llamaindex @llamaindex/ollama
yarn add llamaindex @llamaindex/ollama
bun add llamaindex @llamaindex/ollama
```
Then set the embedding model in `Settings` and build an index over your documents (the `essay.txt` path below is a placeholder for your own data):

```ts
import fs from "node:fs/promises";

import { OllamaEmbedding } from "@llamaindex/ollama";
import { Document, Settings, VectorStoreIndex } from "llamaindex";

// Use the local Ollama model for all embedding calls.
Settings.embedModel = new OllamaEmbedding({ model: "nomic-embed-text" });

// Load the text to index; the path here is just an example.
const essay = await fs.readFile("./essay.txt", "utf-8");

const document = new Document({ text: essay, id_: "essay" });
const index = await VectorStoreIndex.fromDocuments([document]);

const queryEngine = index.asQueryEngine();
const query = "What is the meaning of life?";
const results = await queryEngine.query({ query });

console.log(results.toString());
```
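If you only need raw vectors rather than a full index, you can call the embedding model directly. This is a minimal sketch assuming the `getTextEmbedding` method that LlamaIndex embedding models share:

```ts
import { OllamaEmbedding } from "@llamaindex/ollama";

const embedModel = new OllamaEmbedding({ model: "nomic-embed-text" });

// Embed a single string; returns a numeric vector.
const vector = await embedModel.getTextEmbedding("The quick brown fox");
console.log(vector.length);
```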
## API Reference

- OllamaEmbedding