a2a-python vs openrouter-runner
| | a2a-python | openrouter-runner |
|---|---|---|
| Mentions | 2 | 58 |
| Stars | 1,465 | 1,140 |
| Growth | 13.2% | - |
| Last Commit | 7 days ago | 4 months ago |
| Activity | 9.7 | 5.2 |
| Language | Python | Python |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
a2a-python
- How to use a2a-python to Create and Connect Github Agent with Google's Agent-to-Agent (A2A) Protocol
A2A SDK - The underlying A2A protocol implementation
- Implementing CurrencyAgent with A2A Python SDK
The official a2a-python SDK from Google has been frequently updated, and our tutorial needs to keep up. In this article, we'll implement a simple CurrencyAgent using version 0.2.3 of the a2a-python SDK.
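Below is a minimal sketch of what such an agent server looks like with the a2a-python SDK, loosely following the SDK's hello-world pattern. Because the SDK changes frequently (as noted above), module paths, field casing (camelCase in some 0.2.x releases, snake_case later), and whether `enqueue_event` is awaitable vary between versions; the `CurrencyAgentExecutor` name and the canned exchange-rate reply are placeholders rather than the article's actual implementation.

```python
# Sketch of a minimal A2A agent server; verify names against your installed
# a2a-python version, since the API has shifted across 0.2.x releases.
import uvicorn

from a2a.server.agent_execution import AgentExecutor, RequestContext
from a2a.server.apps import A2AStarletteApplication
from a2a.server.events import EventQueue
from a2a.server.request_handlers import DefaultRequestHandler
from a2a.server.tasks import InMemoryTaskStore
from a2a.types import AgentCapabilities, AgentCard, AgentSkill
from a2a.utils import new_agent_text_message


class CurrencyAgentExecutor(AgentExecutor):
    """Placeholder executor that answers every request with a canned rate."""

    async def execute(self, context: RequestContext, event_queue: EventQueue) -> None:
        # A real CurrencyAgent would read the user's question from the request
        # context and call an exchange-rate API; this just returns a fixed string.
        await event_queue.enqueue_event(new_agent_text_message("1 USD = 0.92 EUR"))

    async def cancel(self, context: RequestContext, event_queue: EventQueue) -> None:
        raise NotImplementedError("cancel is not supported")


if __name__ == "__main__":
    skill = AgentSkill(
        id="convert_currency",
        name="Currency conversion",
        description="Answers currency conversion questions",
        tags=["currency"],
        examples=["How many EUR is 100 USD?"],
    )
    card = AgentCard(
        name="Currency Agent",
        description="Simple currency conversion agent",
        url="http://localhost:9999/",
        version="1.0.0",
        default_input_modes=["text"],
        default_output_modes=["text"],
        capabilities=AgentCapabilities(streaming=True),
        skills=[skill],
    )
    handler = DefaultRequestHandler(
        agent_executor=CurrencyAgentExecutor(),
        task_store=InMemoryTaskStore(),
    )
    app = A2AStarletteApplication(agent_card=card, http_handler=handler)
    uvicorn.run(app.build(), host="0.0.0.0", port=9999)
```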
openrouter-runner
- How to Create Your Own Search Agent Using LangChain
```python
from langchain_openai import ChatOpenAI
from langchain_community.tools import DuckDuckGoSearchRun
from langchain.tools import Tool
from langchain.memory import ConversationBufferMemory
from langchain.agents.agent_types import AgentType
from langchain.agents import initialize_agent
from dotenv import load_dotenv
import os

load_dotenv()

def webSearchAgent(question):
    llm = ChatOpenAI(
        model="deepseek/deepseek-chat-v3-0324:free",
        base_url="https://openrouter.ai/api/v1",
        api_key=os.getenv("OPENROUTER_API_KEY"),
    )
    search = DuckDuckGoSearchRun()
    tools = [
        Tool(
            name="search",
            func=search.run,
            description="When you want real time data use this",
        )
    ]
    memory = ConversationBufferMemory(memory_key="chat_history")
    agent = initialize_agent(
        llm=llm,
        tools=tools,
        agent=AgentType.CONVERSATIONAL_REACT_DESCRIPTION,
        memory=memory,
        verbose=True,
        handle_parsing_errors=True,
    )
    result = agent.run(question)
    print(f"Agent:\n\n{result}")

webSearchAgent("Give me some latest news about Ai")
```
- Cracking the Opus: Red Teaming Anthropic’s Giant with Promptfoo
OpenRouter API Key → Create an account at OpenRouter and grab your key
- How I built a self-hosted AI automation stack without losing my mind
OpenRouter (multi-model API wrapper) https://openrouter.ai
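Because OpenRouter is OpenAI-API-compatible, wiring it into a self-hosted stack usually amounts to pointing an OpenAI client at its base URL. A minimal sketch, assuming the `openai` Python package and reusing the same free DeepSeek model slug from the LangChain example above:

```python
# Minimal sketch: OpenRouter exposes an OpenAI-compatible endpoint, so the
# standard openai client works once base_url points at openrouter.ai.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    # Example model slug; swap in whichever model you want to run.
    model="deepseek/deepseek-chat-v3-0324:free",
    messages=[{"role": "user", "content": "Summarize today's AI news in one sentence."}],
)
print(response.choices[0].message.content)
```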
- Reproducible LLM Benchmarking: GPT-5 vs Grok-4 with Promptfoo
- Building an AI Development Environment with Claude Code, Claude Router, and OpenRouter
{ "Providers": [ { "name": "openrouter", "api_base_url": "https://openrouter.ai/api/v1/chat/completions", "api_key": "sk-xxx", "models": [ "anthropic/claude-sonnet-4" ], "transformer": { "use": ["openrouter"] } } ], "Router": { "default": "openrouter,anthropic/claude-sonnet-4" }, "LOG": true, "HOST": "127.0.0.1", + "PORT": 23456 }
- 5 tools we wish were on the Awesome AI Tools list
OpenRouter - We love OpenRouter because it allows you to easily try out new models and load balance between models. We actually got an open source contribution for this one recently, so we should be supporting it ❤️
- The Untold Misadventures of Red Teaming Kimi K2 with Promptfoo
Link: Promptfoo - Open Source Tool for Evaluation and Red Teaming. Link: OpenRouter - For Moonshot Kimi K2 APIs. Link: Kimi K2 Model Page.
- How OpenRouter Unlocked Our Workshop Strategy
Then I discovered OpenRouter: "a unified API platform that provides access to a wide range of large language models with intelligent routing and automatic fallbacks." You can use whatever model you want with the same API key. But the feature I really needed was its provisioning API key system, which allowed me to generate one master key and programmatically:
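As a rough sketch of that workflow, the snippet below mints capped per-attendee keys from a single provisioning ("master") key. The `/api/v1/keys` endpoint, the payload fields, and the `create_workshop_key` helper are assumptions based on OpenRouter's provisioning-keys documentation, not something taken from the post, so verify them against the current docs:

```python
# Hedged sketch: use one provisioning ("master") key to create capped
# per-attendee runtime keys via OpenRouter's key-management API.
# Endpoint path and payload fields are assumptions; check the docs.
import os

import requests

PROVISIONING_KEY = os.environ["OPENROUTER_PROVISIONING_KEY"]  # the master key


def create_workshop_key(name: str, credit_limit: float) -> dict:
    """Create a runtime API key with a spending cap (in USD credits)."""
    resp = requests.post(
        "https://openrouter.ai/api/v1/keys",
        headers={"Authorization": f"Bearer {PROVISIONING_KEY}"},
        json={"name": name, "limit": credit_limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


# e.g. one capped key per workshop attendee
key_info = create_workshop_key("workshop-attendee-01", credit_limit=5.0)
print(key_info)
```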
- Anthropic teams use Claude Code
What’s different is all the open-weight models like Kimi-k2 or Qwen-3 Coder that are as good as, and depending on the task better than, Anthropic’s Sonnet model, for 80% less via OpenRouter [1] and other similar services.
You can use these models through Claude Code; I do it every day.
Some developers are running smaller versions of these LLMs on their own hardware, paying no one.
So I don’t think Anthropic and the other companies can dramatically increase their prices without losing the customers that helped them go from $0 to $4 billion in revenue in 3 years.
Users can easily move between different AI platforms with no lock-in, which makes it harder to increase prices and proceed to enshittify their platforms.
[1]: https://openrouter.ai/
- Show HN: I built an LLM chat app because we shouldn't need 10 AI subscriptions
What are some alternatives?
a2a-py-github-agent
ollama - Get up and running with OpenAI gpt-oss, DeepSeek-R1, Gemma 3 and other models.
a2a-python - Official Python SDK for the Agent2Agent (A2A) Protocol [Moved to: https://github.com/google-a2a/a2a-python]
llm - Access large language models from the command-line
a2a-python-currency - Implementing CurrencyAgent with A2A Python SDK
plandex - Open source AI coding agent. Designed for large projects and real world tasks.