Tags: mozilla-ai/any-llm
fix(platform): Cleanup dependencies. (#667)

- Drop unused dependencies. Requiring and pinning `bcrypt>5.0.0` caused problems for projects that use `passlib` (e.g. FastAPI templates).
- Replace `requests` with `httpx`, which is already a base dependency and provides the same functionality.
fix(dep): add missing requests dependency for platform build (#662)

Adds the `requests` dependency to the platform build. `requests` is used in `utils.py`, so a fresh run of any-llm in a clean environment fails without it. 😢
chore: Nebius through Portkey dropped qwen3-14b (#603)
feat(batch): Support for Native Batch API (#578)

Initial support for batch processing. This PR implements only the OpenAI batch backend, but sets the repo up for more implementations. An experimental decorator makes it clear that the Batch API interface may still change as implementations and usage expand.

The `any-llm` batch API requires you to pass a **path to a local JSONL file** containing your batch requests; the provider implementation automatically handles uploading and file management as needed. Providers handle batch processing differently:

- **OpenAI**: Requires uploading a file first, then creating a batch with the file ID
- **Anthropic** (future): Expects the file content passed directly in the request
- **Other providers**: May have their own unique requirements

By accepting a local file path, `any-llm` abstracts these provider differences and handles the implementation details automatically.
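As an illustration of the input this entry describes, the sketch below writes a small JSONL file in the OpenAI Batch API request shape (one request object per line). The model name, prompts, and helper name are placeholders, and the `any-llm` call that would consume this file is intentionally not shown, since that interface is experimental.

```python
import json
from pathlib import Path

def write_batch_file(path: str, prompts: list[str], model: str = "gpt-4o-mini") -> Path:
    """Write one OpenAI-style batch request per line to a JSONL file."""
    out = Path(path)
    with out.open("w", encoding="utf-8") as f:
        for i, prompt in enumerate(prompts):
            request = {
                "custom_id": f"request-{i}",   # unique ID to match results back later
                "method": "POST",
                "url": "/v1/chat/completions",
                "body": {
                    "model": model,
                    "messages": [{"role": "user", "content": prompt}],
                },
            }
            f.write(json.dumps(request) + "\n")
    return out

batch_path = write_batch_file("batch_requests.jsonl", ["Hello!", "What is 2 + 2?"])
print(len(batch_path.read_text(encoding="utf-8").splitlines()))  # 2
```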
fix(google): add usage data (#573)

The Gemini/Vertex AI providers were not attaching usage data to their completion chunks; this fix adds it.
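To make the bug concrete: in a streaming completion, usage is typically reported on only one chunk (often the last), so a consumer has to pick it up while concatenating content. The sketch below uses simplified stand-in `Chunk`/`Usage` types, not any-llm's actual classes.

```python
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class Usage:
    prompt_tokens: int = 0
    completion_tokens: int = 0

    @property
    def total_tokens(self) -> int:
        return self.prompt_tokens + self.completion_tokens

@dataclass
class Chunk:
    content: str
    usage: Optional[Usage] = None  # usually only the final chunk carries usage

def consume_stream(chunks: Iterable[Chunk]) -> tuple[str, Optional[Usage]]:
    """Concatenate streamed content and keep whichever usage a chunk reports."""
    parts: list[str] = []
    usage: Optional[Usage] = None
    for chunk in chunks:
        parts.append(chunk.content)
        if chunk.usage is not None:
            usage = chunk.usage
    return "".join(parts), usage

text, usage = consume_stream([Chunk("Hel"), Chunk("lo"), Chunk("", Usage(5, 2))])
print(text, usage.total_tokens)  # Hello 7
```

If a provider never sets `usage` on any chunk (the bug this entry fixes), the consumer gets `None` back instead of token counts.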
feat(mistral): add support for OpenAI-style response format dictionaries with Mistral (#535)

Support was previously limited to `params.response_format` being a Pydantic model. This PR also enables using an OpenAI-style schema dictionary, which will be useful when migrating the openai agents framework to use any-llm: mozilla-ai/any-agent#828
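For readers unfamiliar with the dictionary form this entry refers to, the sketch below shows the general shape of an OpenAI-style `json_schema` response format. The schema contents (`weather_report` and its fields) are illustrative placeholders, not taken from the PR.

```python
# OpenAI-style response_format dictionary: a named JSON schema plus a strict flag.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "weather_report",          # placeholder schema name
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {
                "city": {"type": "string"},
                "temperature_c": {"type": "number"},
            },
            "required": ["city", "temperature_c"],
            "additionalProperties": False,
        },
    },
}
print(response_format["type"])  # json_schema
```

With this change, such a dictionary can be passed as `params.response_format` instead of a Pydantic model.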
fix: threading approach in run_async_in_sync didn't wait for all tasks to complete (#487)

Fixes an error when running tasks in threads: the sync wrapper could return before all scheduled tasks had finished.
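The general pattern behind this fix can be sketched as follows: run the coroutine on a dedicated thread and join that thread, so the synchronous caller cannot return before every awaited task completes. This is a minimal stand-alone sketch, not any-llm's actual `run_async_in_sync` implementation.

```python
import asyncio
import threading
from typing import Any, Coroutine

def run_async_in_sync(coro: Coroutine[Any, Any, Any]) -> Any:
    """Run a coroutine to completion from synchronous code.

    Joining the worker thread guarantees we do not return until the
    coroutine, and everything it awaited, has finished.
    """
    result: dict[str, Any] = {}

    def runner() -> None:
        result["value"] = asyncio.run(coro)

    thread = threading.Thread(target=runner)
    thread.start()
    thread.join()  # wait for all work to complete before returning
    if "value" not in result:
        raise RuntimeError("coroutine did not complete")
    return result["value"]

async def gather_squares(n: int) -> list[int]:
    """Toy workload: square numbers concurrently with asyncio.gather."""
    async def square(x: int) -> int:
        await asyncio.sleep(0)
        return x * x
    return list(await asyncio.gather(*(square(i) for i in range(n))))

print(run_async_in_sync(gather_squares(4)))  # [0, 1, 4, 9]
```

The bug class this guards against is starting the work on a thread but returning without joining it, so callers observed partial results.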