
Commit 0f73764

Docs update
1 parent 9cdbf31 commit 0f73764

3 files changed: +34 -40 lines changed

README.md

Lines changed: 5 additions & 8 deletions
@@ -118,15 +118,15 @@ Follow these steps to quickly set up and run the ChatGPT Retrieval Plugin:
 export QDRANT_GRPC_PORT=<your_qdrant_grpc_port>
 export QDRANT_API_KEY=<your_qdrant_api_key>
 export QDRANT_COLLECTION=<your_qdrant_collection>
-
+
 # AnalyticDB
 export PG_HOST=<your_analyticdb_host>
 export PG_PORT=<your_analyticdb_port>
 export PG_USER=<your_analyticdb_username>
 export PG_PASSWORD=<your_analyticdb_password>
 export PG_DATABASE=<your_analyticdb_database>
 export PG_COLLECTION=<your_analyticdb_collection>
-
+

 # Redis
 export REDIS_HOST=<your_redis_host>
@@ -277,7 +277,7 @@ poetry install
 The API requires the following environment variables to work:

 | Name | Required | Description |
-| ---------------- | -------- |----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| ---------------- | -------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
 | `DATASTORE` | Yes | This specifies the vector database provider you want to use to store and query embeddings. You can choose from `chroma`, `pinecone`, `weaviate`, `zilliz`, `milvus`, `qdrant`, `redis`, `azuresearch`, `supabase`, `postgres`, `analyticdb`. |
 | `BEARER_TOKEN` | Yes | This is a secret token that you need to authenticate your requests to the API. You can generate one using any tool or method you prefer, such as [jwt.io](https://jwt.io/). |
 | `OPENAI_API_KEY` | Yes | This is your OpenAI API key that you need to generate embeddings using the `text-embedding-ada-002` model. You can get an API key by creating an account on [OpenAI](https://openai.com/). |
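For context on the table above, a minimal sketch of setting the three required variables follows. It is an editorial illustration, not part of the commit: the values are placeholders, and the README only requires that the bearer token be generated with any tool you prefer (such as jwt.io); the `openssl` line is just one such option.

```bash
# Choose a vector database provider from the list in the DATASTORE row
export DATASTORE=chroma

# Any secret string works as the bearer token; openssl shown here as one option
export BEARER_TOKEN=$(openssl rand -hex 32)

# OpenAI API key used to generate embeddings with text-embedding-ada-002
export OPENAI_API_KEY=<your_openai_api_key>
```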
@@ -348,11 +348,8 @@ For detailed setup instructions, refer to [`/docs/providers/llama/setup.md`](/do
 [Postgres](https://www.postgresql.org) offers an easy and efficient way to store vectors via [pgvector](https://github.com/pgvector/pgvector) extension. To use pgvector, you will need to set up a PostgreSQL database with the pgvector extension enabled. For example, you can [use docker](https://www.docker.com/blog/how-to-use-the-postgres-docker-official-image/) to run locally. For a hosted/managed solution, you can use any of the cloud vendors which support [pgvector](https://github.com/pgvector/pgvector#hosted-postgres). For detailed setup instructions, refer to [`/docs/providers/postgres/setup.md`](/docs/providers/postgres/setup.md).

 #### AnalyticDB
-[AnalyticDB](https://www.alibabacloud.com/help/en/analyticdb-for-postgresql/latest/product-introduction-overview) is a distributed cloud-native vector database designed for storing documents and vector embeddings.
-As a high-performance vector database, it is fully compatible with PostgreSQL syntax, making it easy to use. Managed by Alibaba Cloud, AnalyticDB is a cloud-native database with a powerful vector compute engine.
-Its out-of-the-box experience enables processing of billions of data vectors and offers a wide range of features, including indexing algorithms, structured and unstructured data capabilities, real-time updates, distance metrics, scalar filtering, and time travel searches.
-Additionally, it provides full OLAP database functionality and an SLA commitment for production use.
-For detailed setup instructions, refer to [`/docs/providers/analyticdb/setup.md`](/docs/providers/analyticdb/setup.md).
+
+[AnalyticDB](https://www.alibabacloud.com/help/en/analyticdb-for-postgresql/latest/product-introduction-overview) is a distributed cloud-native vector database designed for storing documents and vector embeddings. It is fully compatible with PostgreSQL syntax and managed by Alibaba Cloud. AnalyticDB offers a powerful vector compute engine, processing billions of data vectors and providing features such as indexing algorithms, structured and unstructured data capabilities, real-time updates, distance metrics, scalar filtering, and time travel searches. For detailed setup instructions, refer to [`/docs/providers/analyticdb/setup.md`](/docs/providers/analyticdb/setup.md).

 ### Running the API locally

docs/deployment/removing-unused-dependencies.md

Lines changed: 12 additions & 11 deletions
@@ -4,16 +4,17 @@ Before deploying your app, you might want to remove unused dependencies from you
 
 Here are the packages you can remove for each vector database provider:
 
-- **Pinecone:** Remove `weaviate-client`, `pymilvus`, `qdrant-client`, `redis`, `chromadb`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, and `psycopg2`+`pgvector`.
-- **Weaviate:** Remove `pinecone-client`, `pymilvus`, `qdrant-client`, `redis`, `chromadb`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, and `psycopg2`+`pgvector`.
-- **Zilliz:** Remove `pinecone-client`, `weaviate-client`, `qdrant-client`, `redis`, `chromadb`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, and `psycopg2`+`pgvector`.
-- **Milvus:** Remove `pinecone-client`, `weaviate-client`, `qdrant-client`, `redis`, `chromadb`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, and `psycopg2`+`pgvector`.
-- **Qdrant:** Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `redis`, `chromadb`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, and `psycopg2`+`pgvector`.
-- **Redis:** Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `chromadb`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, and `psycopg2`+`pgvector`.
-- **LlamaIndex:** Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `chromadb`, `redis`, `azure-identity` and `azure-search-documents`, `supabase`, and `psycopg2`+`pgvector`.
-- **Chroma:**: Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `llama-index`, `redis`, `azure-identity` and `azure-search-documents`, `supabase`, and `psycopg2`+`pgvector`.
-- **Azure Cognitive Search**: Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `llama-index`, `redis` and `chromadb`, `supabase`, and `psycopg2`+`pgvector`.
-- **Supabase:** Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `redis`, `llama-index`, `azure-identity` and `azure-search-documents`, and `psycopg2`+`pgvector`.
-- **Postgres:** Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `redis`, `llama-index`, `azure-identity` and `azure-search-documents`, and `supabase`.
+- **Pinecone:** Remove `weaviate-client`, `pymilvus`, `qdrant-client`, `redis`, `chromadb`, `llama-index`, `azure-identity`, `azure-search-documents`, `supabase`, `psycopg2`+`pgvector`, and `psycopg2cffi`.
+- **Weaviate:** Remove `pinecone-client`, `pymilvus`, `qdrant-client`, `redis`, `chromadb`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, `psycopg2`+`pgvector`, `psycopg2cffi`.
+- **Zilliz:** Remove `pinecone-client`, `weaviate-client`, `qdrant-client`, `redis`, `chromadb`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, `psycopg2`+`pgvector`, and `psycopg2cffi`.
+- **Milvus:** Remove `pinecone-client`, `weaviate-client`, `qdrant-client`, `redis`, `chromadb`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, `psycopg2`+`pgvector`, and `psycopg2cffi`.
+- **Qdrant:** Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `redis`, `chromadb`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, `psycopg2`+`pgvector`, and `psycopg2cffi`.
+- **Redis:** Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `chromadb`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, `psycopg2`+`pgvector`, and `psycopg2cffi`.
+- **LlamaIndex:** Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `chromadb`, `redis`, `azure-identity` and `azure-search-documents`, `supabase`, `psycopg2`+`pgvector`, and `psycopg2cffi`.
+- **Chroma:**: Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `llama-index`, `redis`, `azure-identity` and `azure-search-documents`, `supabase`, `psycopg2`+`pgvector`, and `psycopg2cffi`.
+- **Azure Cognitive Search**: Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `llama-index`, `redis` and `chromadb`, `supabase`, `psycopg2`+`pgvector`, and `psycopg2cffi`.
+- **Supabase:** Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `redis`, `llama-index`, `azure-identity` and `azure-search-documents`, `psycopg2`+`pgvector`, and `psycopg2cffi`.
+- **Postgres:** Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `redis`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, and `psycopg2cffi`.
+- **AnalyticDB:** Remove `pinecone-client`, `weaviate-client`, `pymilvus`, `qdrant-client`, `redis`, `llama-index`, `azure-identity` and `azure-search-documents`, `supabase`, and `psycopg2`+`pgvector`.
 
 After removing the unnecessary packages from the `pyproject.toml` file, you don't need to run `poetry lock` and `poetry install` manually. The provided Dockerfile takes care of installing the required dependencies using the `requirements.txt` file generated by the `poetry export` command.
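As a rough illustration of the workflow described above (not the exact commands from the repository), trimming the dependency list for the AnalyticDB case could be done either by editing `pyproject.toml` by hand or with `poetry remove`, after which an export similar to the one the Dockerfile performs regenerates `requirements.txt`:

```bash
# Drop the providers you don't use (AnalyticDB example from the list above);
# editing pyproject.toml directly works just as well
poetry remove pinecone-client weaviate-client pymilvus qdrant-client redis llama-index \
  azure-identity azure-search-documents supabase psycopg2 pgvector

# Regenerate requirements.txt the way the Dockerfile's poetry export step does
# (flags shown here are illustrative and may differ from the actual Dockerfile)
poetry export --without-hashes -f requirements.txt -o requirements.txt
```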

docs/providers/analyticdb/setup.md

Lines changed: 17 additions & 21 deletions
@@ -1,46 +1,43 @@
 # AnalyticDB
 
-[AnalyticDB]( https://www.alibabacloud.com/help/en/analyticdb-for-postgresql/latest/product-introduction-overview) is a distributed cloud-native vector database designed for storing documents and vector embeddings.
-As a high-performance vector database, it is fully compatible with PostgreSQL syntax, making it easy to use.
-Managed by Alibaba Cloud, AnalyticDB is a cloud-native database with a powerful vector compute engine.
-Its out-of-the-box experience enables processing of billions of data vectors and offers a wide range of features, including indexing algorithms, structured and unstructured data capabilities, real-time updates, distance metrics, scalar filtering, and time travel searches.
-Additionally, it provides full OLAP database functionality and an SLA commitment for production use.
-
-## Install requirements
-Running the following command to install requirement packages. It will install `psycopg2cffi` package.
+[AnalyticDB](https://www.alibabacloud.com/help/en/analyticdb-for-postgresql/latest/product-introduction-overview) is a distributed cloud-native vector database designed for storing documents and vector embeddings. It is a high-performance vector database that is fully compatible with PostgreSQL syntax, making it easy to use. Managed by Alibaba Cloud, AnalyticDB offers a powerful vector compute engine, processing billions of data vectors and providing a wide range of features, including indexing algorithms, structured and unstructured data capabilities, real-time updates, distance metrics, scalar filtering, and time travel searches. Additionally, it offers full OLAP database functionality and an SLA commitment for production use.
+
+## Install Requirements
+
+Run the following command to install the required packages, including the `psycopg2cffi` package:
+
 ```
-poetry install --extras "postgresql"
+poetry install --extras "postgresql"
 ```
-If your meet with this issue `Error: pg_config executable not found.`
-It appears that the `pg_config` executable is not found in your system. The `pg_config` utility is part of the PostgreSQL development package, which is required for building `psycopg2cffi`.
 
-To resolve this issue, you need to install the PostgreSQL development package on your system. Here's how to do it on different Linux distributions:
+If you encounter the `Error: pg_config executable not found.` issue, you need to install the PostgreSQL development package on your system. Follow the instructions for your specific Linux distribution:
 
-1. On Debian-based systems (e.g., Ubuntu):
+1. Debian-based systems (e.g., Ubuntu):
 
 ```bash
 sudo apt-get update
 sudo apt-get install libpq-dev
 ```
 
-2. On RHEL-based systems (e.g., CentOS, Fedora):
+2. RHEL-based systems (e.g., CentOS, Fedora):
 
 ```bash
 sudo yum install postgresql-devel
 ```
 
-3. On Arch-based systems (e.g., Manjaro, Arch Linux):
+3. Arch-based systems (e.g., Manjaro, Arch Linux):
 
 ```bash
 sudo pacman -S postgresql-libs
 ```
 
-4. On macOS
+4. macOS:
+
 ```bash
 brew install postgresql
 ```
 
-After installing the required package, try to install `psycopg2cffi` again. If the `pg_config` executable is still not found, you might need to add its location to your system's `PATH` variable. You can typically find the `pg_config` executable in the `bin` directory of your PostgreSQL installation, for example `/usr/pgsql-13/bin/pg_config`. To add it to your `PATH` variable, use the following command (replace the path with the correct one for your system):
+After installing the required package, try to install `psycopg2cffi` again. If the `pg_config` executable is still not found, add its location to your system's `PATH` variable. You can typically find the `pg_config` executable in the `bin` directory of your PostgreSQL installation, for example `/usr/pgsql-13/bin/pg_config`. To add it to your `PATH` variable, use the following command (replace the path with the correct one for your system):
 
 ```bash
 export PATH=$PATH:/usr/pgsql-13/bin
5148
**Environment Variables:**
5249

5350
| Name | Required | Description | Default |
54-
|------------------|----------|-------------------------------------|-------------------|
51+
| ---------------- | -------- | ----------------------------------- | ----------------- |
5552
| `DATASTORE` | Yes | Datastore name, set to `analyticdb` | |
5653
| `BEARER_TOKEN` | Yes | Secret token | |
5754
| `OPENAI_API_KEY` | Yes | OpenAI API key | |
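Putting this table together with the `PG_*` variables shown in the README hunk earlier, a typical AnalyticDB environment might be exported as sketched below; this is an editorial example, and every value is a placeholder:

```bash
export DATASTORE=analyticdb
export BEARER_TOKEN=<your_bearer_token>
export OPENAI_API_KEY=<your_openai_api_key>

# Connection settings for the AnalyticDB instance
export PG_HOST=<your_analyticdb_host>
export PG_PORT=<your_analyticdb_port>
export PG_USER=<your_analyticdb_username>
export PG_PASSWORD=<your_analyticdb_password>
export PG_DATABASE=<your_analyticdb_database>
export PG_COLLECTION=<your_analyticdb_collection>
```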
@@ -64,8 +61,7 @@ Now, try installing `psycopg2cffi` again using Poetry.
 
 ## AnalyticDB Cloud
 
-For a hosted [AnalyticDB Cloud](https://cloud.qdrant.io/) version, provide the AnalyticDB instance
-URL
+For a hosted [AnalyticDB Cloud](https://cloud.qdrant.io/) version, provide the AnalyticDB instance URL:
 
 **Example:**
 
@@ -83,4 +79,4 @@ A suite of integration tests verifies the AnalyticDB integration. Launch the tes
 
 ```bash
 pytest ./tests/datastore/providers/analyticdb/test_analyticdb_datastore.py
-```
+```
