LangChain v1.0: Welcome to the new LangChain documentation! If you encounter any issues or have feedback, please open an issue so we can improve. Archived v0 documentation can be found here. See the release notes and migration guide for a complete list of changes and instructions on how to upgrade your code.
LangChain is the easiest way to start building agents and applications powered by LLMs. With under 10 lines of code, you can connect to OpenAI, Anthropic, Google, and more. LangChain provides a pre-built agent architecture and model integrations to help you get started quickly and seamlessly incorporate LLMs into your agents and applications.

We recommend LangChain if you want to quickly build agents and autonomous applications. Use LangGraph, our low-level agent orchestration framework and runtime, when you have more advanced needs that require a combination of deterministic and agentic workflows, heavy customization, and carefully controlled latency.

LangChain agents are built on top of LangGraph in order to provide durable execution, streaming, human-in-the-loop, persistence, and more. You do not need to know LangGraph for basic LangChain agent usage.

Install

pip install -U langchain 

Create an agent

# pip install -qU "langchain[anthropic]" to call the model

from langchain.agents import create_agent


def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"


agent = create_agent(
    model="anthropic:claude-sonnet-4-5-20250929",
    tools=[get_weather],
    system_prompt="You are a helpful assistant",
)

# Run the agent
agent.invoke(
    {"messages": [{"role": "user", "content": "what is the weather in sf"}]}
)
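Notice that `get_weather` is a plain Python function: the agent can call it as a tool because its name, type hints, and docstring describe what it does and what arguments it takes. To build intuition for why that metadata matters, here is a simplified, illustrative sketch (not LangChain's actual implementation) of how a tool schema can be derived from a function using only the standard library:

```python
import inspect
from typing import get_type_hints


def get_weather(city: str) -> str:
    """Get weather for a given city."""
    return f"It's always sunny in {city}!"


def tool_schema(fn):
    """Derive a minimal tool schema from a plain function (illustrative only)."""
    hints = get_type_hints(fn)
    hints.pop("return", None)  # the model only needs the input parameters
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn),
        "parameters": {name: t.__name__ for name, t in hints.items()},
    }


schema = tool_schema(get_weather)
print(schema)
# {'name': 'get_weather', 'description': 'Get weather for a given city.', 'parameters': {'city': 'str'}}
```

This is why descriptive names, type hints, and docstrings matter when writing tools: they are the interface the model sees when deciding whether and how to call your function.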

Core benefits

