When working with language models like DeepSeek or OpenAI-compatible APIs in your Python code, it’s often useful to test requests manually using Postman. This guide shows you how to replicate your Python OpenAI
SDK call using raw HTTP requests in Postman.
🧠 The Goal
Your Python code does the following:
```python
import os

from dotenv import load_dotenv
from openai import OpenAI

# Load .env variables
load_dotenv()

# Read values from environment
api_key = os.getenv("OPENAI_API_KEY")
base_url = os.getenv("OPENAI_BASE_URL")

# Create OpenAI client
client = OpenAI(api_key=api_key, base_url=base_url)

# Send chat completion request
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False,
)

print(response.choices[0].message.content)
```
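If you are setting this up from scratch, the two variables the script reads might look like this in a `.env` file (the key below is a placeholder; the base URL matches the endpoint used in the rest of this guide):

```
OPENAI_API_KEY=sk-your-key
OPENAI_BASE_URL=https://api.deepseek.com
```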
Now let’s test the same call in Postman.
✅ Step 1: Set the URL and HTTP Method
- Method: `POST`
- URL: `https://api.deepseek.com/v1/chat/completions`

This matches the `base_url` + `/v1/chat/completions` endpoint used by the OpenAI-compatible API.
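If it helps to see that mapping explicitly, here is a tiny sketch (not part of the original script) of how the Postman URL relates to the SDK's `base_url`:

```python
# In the Python script the SDK combines base_url with the chat-completions path;
# in Postman you paste the full URL yourself.
base_url = "https://api.deepseek.com"          # value of OPENAI_BASE_URL
endpoint = f"{base_url}/v1/chat/completions"   # the URL to use in Postman
print(endpoint)
```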
✅ Step 2: Set Headers
In Postman's Headers tab, add the following key-value pairs:
| Key | Value |
|---|---|
| Authorization | `Bearer YOUR_API_KEY` (e.g. `sk-abc123...`) |
| Content-Type | `application/json` |
🔐 Replace `YOUR_API_KEY` with your actual API key from your `.env` file.
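Outside Postman, the same two headers could be built in Python like this (a sketch; the key is read from the environment just as in the original script):

```python
import os

# The same two headers you add in Postman's Headers tab
api_key = os.getenv("OPENAI_API_KEY")  # e.g. sk-abc123...
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
```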
✅ Step 3: Configure the JSON Body
Go to the Body tab → Select raw → Set type to JSON.
Paste this JSON body:
{ "model": "deepseek-chat", "messages": [ { "role": "system", "content": "You are a helpful assistant" }, { "role": "user", "content": "Hello" } ], "stream": false }
This mirrors your Python code's `messages` and `model` arguments.
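One detail worth noting when translating by hand: Python booleans become lowercase JSON literals. Here is a quick sketch of the same payload as a Python dictionary (the fields mirror the keyword arguments in the original script):

```python
# Same fields as the keyword arguments passed to client.chat.completions.create
payload = {
    "model": "deepseek-chat",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    "stream": False,  # serialized as JSON false, as in the body above
}
```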
✅ Step 4: Send the Request
Click the Send button. You should get a response like:
{ "id": "7e1894a2-12d7-4649-864a-da8984118be3", "object": "chat.completion", "created": 1752505878, "model": "deepseek-chat", "choices": [ { "index": 0, "message": { "role": "assistant", "content": "Hello! How can I assist you today? 😊" }, "logprobs": null, "finish_reason": "stop" } ], "usage": { "prompt_tokens": 9, "completion_tokens": 11, "total_tokens": 20, "prompt_tokens_details": { "cached_tokens": 0 }, "prompt_cache_hit_tokens": 0, "prompt_cache_miss_tokens": 9 }, "system_fingerprint": "fp_8802369eaa_prod0623_fp8_kvcache" }
🎉 That means it’s working just like your Python script!
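If you later want the same raw call from Python without the OpenAI SDK, a minimal sketch using the third-party `requests` library (an alternative to the original script, not part of it) might look like this:

```python
import os

import requests
from dotenv import load_dotenv

load_dotenv()

url = f"{os.getenv('OPENAI_BASE_URL', 'https://api.deepseek.com')}/v1/chat/completions"
headers = {
    "Authorization": f"Bearer {os.getenv('OPENAI_API_KEY')}",
    "Content-Type": "application/json",
}
payload = {
    "model": "deepseek-chat",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    "stream": False,
}

resp = requests.post(url, headers=headers, json=payload, timeout=30)
resp.raise_for_status()
data = resp.json()

# Same field the SDK version prints: choices[0].message.content
print(data["choices"][0]["message"]["content"])
```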
🛠 Bonus: Use Postman Environment Variables
To avoid hardcoding sensitive values:
1. In Postman → Environments, create a new environment.
2. Add:
   - `API_BASE_URL`: `https://api.deepseek.com`
   - `API_KEY`: `sk-your-key`
3. In your request, use:
   - URL: `{{API_BASE_URL}}/v1/chat/completions`
   - Header: `Authorization: Bearer {{API_KEY}}`
Now your requests are portable and secure.
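On the Python side, the counterpart of these Postman variables is the `.env` + `os.getenv` pattern you already use. As an optional extra, a small sanity check (hypothetical, not in the original code) can fail fast if a variable is missing:

```python
import os

from dotenv import load_dotenv

load_dotenv()

# Fail fast if either value is missing, mirroring the Postman environment setup
for name in ("OPENAI_API_KEY", "OPENAI_BASE_URL"):
    if not os.getenv(name):
        raise RuntimeError(f"{name} is not set in your environment or .env file")
```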
📌 Summary
| Task | Equivalent in Postman |
|---|---|
| `api_key` | `Authorization: Bearer YOUR_API_KEY` header |
| `base_url` | Postman request URL (`https://api.deepseek.com`) |
| JSON `messages` list | Raw request body in JSON format |
| `.env` variables | Postman environment variables |
🔐 Don't Forget
- Never commit your `.env` file to version control.
- Avoid sharing Postman collections that contain raw keys.