
Bonnie for CopilotKit

Posted on • Edited on • Originally published at copilotkit.ai

Build Your Own AI Stock Portfolio Agent with Agno + AG-UI

In this guide, you will learn how to integrate Agno agents with the AG-UI protocol.

Additionally, we will cover how to integrate the AG-UI and Agno agents with CopilotKit, allowing users to chat with the agent and stream its responses in the frontend.

Before we jump in, here is what we will cover:

  • What is the AG-UI protocol?

  • Integrating Agno agents with AG-UI protocol

  • Integrating a frontend to the AG-UI + Agno agents using CopilotKit

Here’s a preview of what we will be building:

What is the AG-UI protocol?

The Agent User Interaction Protocol (AG-UI), developed by CopilotKit, is an open-source, lightweight, event-based protocol that facilitates rich, real-time interactions between the frontend and AI agents.

The AG-UI protocol enables event-driven communication, state management, tool usage, and streaming AI agent responses.

Star AG-UI ⭐️

To send information between the frontend and your AI agent, AG-UI uses events such as:

  • Lifecycle events: These events mark the start or end of an agent’s task execution. Lifecycle events include RUN_STARTED and RUN_FINISHED events.

  • Text message events: These events handle streaming agent responses to the frontend. Text message events include TEXT_MESSAGE_START, TEXT_MESSAGE_CONTENT, and TEXT_MESSAGE_END events.

  • Tool call events: These events manage the agent’s tool executions. Tool call events include TOOL_CALL_START, TOOL_CALL_ARGS, and TOOL_CALL_END events.

  • State management events: These events keep the frontend and the AI agent state in sync. State management events include STATE_SNAPSHOT and STATE_DELTA events.

You can learn more about the AG-UI protocol and its architecture in the AG-UI docs.
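To make these events concrete, below is a minimal sketch (not taken from the demo repository) that encodes a complete run containing a single streamed assistant message. It uses the same ag_ui event classes and EventEncoder that the agent code later in this guide relies on; the thread, run, and message values are illustrative.

# A minimal sketch of the AG-UI event vocabulary: one run that streams a single
# assistant message. Field values are illustrative, not from the demo repo.
import uuid
from ag_ui.core import (
    EventType,
    RunStartedEvent,
    TextMessageStartEvent,
    TextMessageContentEvent,
    TextMessageEndEvent,
    RunFinishedEvent,
)
from ag_ui.encoder import EventEncoder

encoder = EventEncoder()        # Serializes events for Server-Sent Events streaming
message_id = str(uuid.uuid4())  # Every text message gets its own identifier

events = [
    # Lifecycle: the run has started
    RunStartedEvent(type=EventType.RUN_STARTED, thread_id="thread_1", run_id="run_1"),
    # Text message: start, stream a content chunk, then end
    TextMessageStartEvent(type=EventType.TEXT_MESSAGE_START, message_id=message_id, role="assistant"),
    TextMessageContentEvent(type=EventType.TEXT_MESSAGE_CONTENT, message_id=message_id, delta="Hello from the agent!"),
    TextMessageEndEvent(type=EventType.TEXT_MESSAGE_END, message_id=message_id),
    # Lifecycle: the run has finished
    RunFinishedEvent(type=EventType.RUN_FINISHED, thread_id="thread_1", run_id="run_1"),
]

for event in events:
    print(encoder.encode(event))  # Each event becomes one SSE "data: {...}" line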


Now that we have learned what the AG-UI protocol is, let us see how to integrate it with the Agno agent framework.

Let’s get started!

Prerequisites

To fully understand this tutorial, you need to have a basic understanding of React or Next.js.

We'll also make use of the following:

  • Python - a popular programming language for building AI agents; make sure it is installed on your computer (a quick environment check follows this list).

  • Agno - a full-stack framework for building Multi-Agent Systems with memory, knowledge, and reasoning.

  • OpenAI API Key - an API key to enable us to perform various tasks using the GPT models; for this tutorial, ensure you have access to the GPT-4 model.

  • CopilotKit - an open-source copilot framework for building custom AI chatbots, in-app AI agents, and text areas.
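Optionally, you can sanity-check your setup with a short, hypothetical script like the one below before running anything. It only assumes a recent Python 3 interpreter, the python-dotenv package (installed with the agent's dependencies), and an OPENAI_API_KEY in your environment or a .env file.

# Hypothetical sanity check: not part of the demo repo, just a quick way to confirm
# that Python and the OpenAI API key are available before running the agent.
import os
import sys

from dotenv import load_dotenv  # installed alongside the agent's dependencies

load_dotenv()  # read OPENAI_API_KEY from a local .env file, if present

print(f"Python version: {sys.version.split()[0]}")
if not os.getenv("OPENAI_API_KEY"):
    raise SystemExit("OPENAI_API_KEY is not set; add it to your environment or a .env file.")
print("OPENAI_API_KEY found; you're good to go.")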

Integrating Agno agents with AG-UI protocol

To get started, clone the Open AG-UI Demo repository, which consists of a Python-based backend (agent) and a Next.js frontend (frontend).

Next, navigate to the backend directory:

cd agent 

Then install the dependencies using Poetry:

poetry install 

After that, create a .env file with your OpenAI API key:

OPENAI_API_KEY=<<your-OpenAI-key-here>> 

Then run the agent using the command below:

poetry run python main.py 

To test the AG-UI + Agno integration, run the curl command below in your terminal (or in an online tool such as https://reqbin.com/curl):

curl -X POST "http://localhost:8000/agno-agent" \
  -H "Content-Type: application/json" \
  -d '{
    "thread_id": "test_thread_123",
    "run_id": "test_run_456",
    "messages": [
      {
        "id": "msg_1",
        "role": "user",
        "content": "Analyze AAPL stock with a $10000 investment from 2023-01-01"
      }
    ],
    "tools": [],
    "context": [],
    "forwarded_props": {},
    "state": {}
  }'
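If the integration is working, the endpoint responds with a Server-Sent Events stream. The exact payloads depend on your query and on the AG-UI wire format, but you should see a sequence of events roughly along these lines (illustrative only; tool call events may appear in place of the text message events when the agent requests a frontend render):

data: {"type": "RUN_STARTED", ...}
data: {"type": "STATE_SNAPSHOT", "snapshot": {...}}
data: {"type": "STATE_DELTA", "delta": [...]}
data: {"type": "TEXT_MESSAGE_START", ...}
data: {"type": "TEXT_MESSAGE_CONTENT", ...}
data: {"type": "TEXT_MESSAGE_END", ...}
data: {"type": "RUN_FINISHED", ...}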

Let us now walk through how the AG-UI protocol is integrated with the Agno agent framework.

Step 1: Create your Agno agent workflow

Before integrating the AG-UI protocol with an Agno agent, create your Agno agent workflow, as shown in the agent/stock_analysis.py file.

# Import necessary libraries and modules for the stock analysis workflow
from agno.agent.agent import Agent  # Core agent functionality
from agno.models.openai.chat import OpenAIChat  # OpenAI chat model integration
from agno.workflow.v2 import Step, Workflow, StepOutput  # Workflow management components
from ag_ui.core import EventType, StateDeltaEvent  # Event handling for UI updates
from ag_ui.core import AssistantMessage, ToolMessage  # Message types for the chat interface
import uuid  # For generating unique identifiers
import asyncio  # For asynchronous operations
from openai import OpenAI  # OpenAI API client
from dotenv import load_dotenv  # For loading environment variables
import os  # Operating system interface
import json  # JSON data handling
import yfinance as yf  # Yahoo Finance API for stock data
from datetime import datetime  # Date and time handling
import numpy as np  # Numerical computing
import pandas as pd  # Data manipulation and analysis
from prompts import insights_prompt, system_prompt  # Custom prompt templates

# Load environment variables from .env file (contains API keys, etc.)
load_dotenv()

# ...

# WORKFLOW DEFINITION: Complete stock analysis pipeline
# This workflow orchestrates all the steps in sequence:
# 1. Chat: Parse user input and extract parameters
# 2. Simulation: Gather historical stock data
# 3. Cash_allocation: Calculate portfolio performance and allocations
# 4. Gather_insights: Generate market insights
stock_analysis_workflow = Workflow(
    name="Mixed Execution Pipeline",
    steps=[chat, simultion, cash_allocation, gather_insights],  # Step functions defined above
)

# ...

Step 2: Create an endpoint with FastAPI

Once you have defined your Agno agent workflow, create a FastAPI endpoint and import the Agno agent workflow as shown in the agent/main.py file.

# Import necessary libraries for FastAPI web server and async operations
from fastapi import FastAPI  # Main FastAPI framework for web API
from fastapi.responses import StreamingResponse  # For streaming real-time responses to the client
import uuid  # For generating unique identifiers
from typing import Any  # Type hints for better code documentation
import os  # Operating system interface for environment variables
import uvicorn  # ASGI server for running FastAPI applications
import asyncio  # Asynchronous I/O operations and event loop management

# Import event system components from ag_ui.core for real-time UI updates
from ag_ui.core import (
    RunAgentInput,  # Input data structure for agent requests
    StateSnapshotEvent,  # Event for sending current state to UI
    EventType,  # Enumeration of all possible event types
    RunStartedEvent,  # Event signaling agent run has started
    RunFinishedEvent,  # Event signaling agent run has completed
    TextMessageStartEvent,  # Event for beginning text message streaming
    TextMessageEndEvent,  # Event for ending text message streaming
    TextMessageContentEvent,  # Event for streaming text content chunks
    ToolCallStartEvent,  # Event for beginning tool/function calls
    ToolCallEndEvent,  # Event for ending tool/function calls
    ToolCallArgsEvent,  # Event for streaming tool arguments
    StateDeltaEvent,  # Event for incremental state updates
)

# Import event encoder for formatting events for streaming
from ag_ui.encoder import EventEncoder  # Encodes events for client consumption
from typing import List  # Type hint for list types

# Import the main stock analysis workflow from our custom module
from stock_analysis import stock_analysis_workflow

# Initialize FastAPI application instance
app = FastAPI()

# MAIN API ENDPOINT: Handle stock analysis agent requests
# This endpoint receives investment queries and streams back real-time responses
@app.post("/agno-agent")
async def agno_agent(input_data: RunAgentInput):
    # ...

# SERVER STARTUP FUNCTION: Initialize and run the FastAPI server
def main():
    """Run the uvicorn server."""
    # Step 1: Get port from environment variable or default to 8000
    port = int(os.getenv("PORT", "8000"))
    # Step 2: Start uvicorn ASGI server with configuration
    uvicorn.run(
        "main:app",  # Module:app reference
        host="0.0.0.0",  # Listen on all network interfaces
        port=port,  # Port number
        reload=True,  # Auto-reload on code changes (development mode)
    )

# SCRIPT ENTRY POINT: Run server when script is executed directly
if __name__ == "__main__":
    main()  # Start the server

Step 3: Define an event generator

After creating a FastAPI endpoint, define an event generator that produces a stream of AG-UI protocol events, initialize the event encoder, and return the streaming response to the client or the frontend, as shown in the agent/main.py file.

# MAIN API ENDPOINT: Handle stock analysis agent requests
# This endpoint receives investment queries and streams back real-time responses
@app.post("/agno-agent")
async def agno_agent(input_data: RunAgentInput):
    try:
        # ASYNC GENERATOR: Streams events to client in real-time
        # This function generates a stream of events that get sent to the frontend
        async def event_generator():
            # Step 1: Initialize event streaming infrastructure
            encoder = EventEncoder()  # Encodes events for transmission
            event_queue = asyncio.Queue()  # Queue for handling events from workflow

            # Step 2: Define event emission callback function
            # This function gets called by workflow steps to send updates to UI
            def emit_event(event):
                event_queue.put_nowait(event)  # Add event to queue without blocking

            # Step 3: Generate unique message identifier for this conversation
            message_id = str(uuid.uuid4())

            # ...

    except Exception as e:
        # Step 23: Handle any errors during execution
        print(e)  # Log error for debugging

    # Step 24: Return streaming response to client
    # FastAPI will stream the events as Server-Sent Events (SSE)
    return StreamingResponse(event_generator(), media_type="text/event-stream")

Step 4: Configure AG-UI protocol lifecycle events

Once you have defined an event generator, define the AG-UI protocol lifecycle events that represent the lifecycle of an AG-UI + Agno agent workflow run as shown in the agent/main.py file.

# MAIN API ENDPOINT: Handle stock analysis agent requests
# This endpoint receives investment queries and streams back real-time responses
@app.post("/agno-agent")
async def agno_agent(input_data: RunAgentInput):
    try:
        # ASYNC GENERATOR: Streams events to client in real-time
        # This function generates a stream of events that get sent to the frontend
        async def event_generator():
            # ...

            # Step 4: Send initial "run started" event to client
            # Signals to the UI that the agent has begun processing
            yield encoder.encode(
                RunStartedEvent(
                    type=EventType.RUN_STARTED,
                    thread_id=input_data.thread_id,  # Conversation thread identifier
                    run_id=input_data.run_id,  # Unique run identifier
                )
            )

            # ...

            # Step 22: Send final "run finished" event
            # Signal to client that the entire agent run has completed
            yield encoder.encode(
                RunFinishedEvent(
                    type=EventType.RUN_FINISHED,
                    thread_id=input_data.thread_id,
                    run_id=input_data.run_id,
                )
            )

    except Exception as e:
        # Step 23: Handle any errors during execution
        print(e)  # Log error for debugging

    # Step 24: Return streaming response to client
    # FastAPI will stream the events as Server-Sent Events (SSE)
    return StreamingResponse(event_generator(), media_type="text/event-stream")

Step 5: Configure AG-UI protocol state management events

After defining AG-UI protocol lifecycle events, integrate AG-UI protocol state management events using the STATE_DELTA event in your Agno agent workflow steps, as shown in the agent/stock_analysis.py file.

# WORKFLOW STEP 1: Initial chat processing and parameter extraction
# This function handles the first interaction with the user query
async def chat(step_input):
    try:
        # ...

        # Step 3: Emit state change event to update UI
        # Uses JSON Patch operations to update the frontend state
        step_input.additional_data["emit_event"](
            StateDeltaEvent(
                type=EventType.STATE_DELTA,
                delta=[
                    {
                        "op": "add",  # Add new log entry
                        "path": "/tool_logs/-",  # Append to tool_logs array
                        "value": {
                            "message": "Analyzing user query",
                            "status": "processing",
                            "id": tool_log_id,
                        },
                    }
                ],
            )
        )
        await asyncio.sleep(0)  # Yield control to event loop

        # ...

        # Step 7: Update tool log status to completed
        # Find the last log entry and mark it as completed
        index = len(step_input.additional_data["tool_logs"]) - 1
        step_input.additional_data["emit_event"](
            StateDeltaEvent(
                type=EventType.STATE_DELTA,
                delta=[
                    {
                        "op": "replace",  # Update existing value
                        "path": f"/tool_logs/{index}/status",
                        "value": "completed",
                    }
                ],
            )
        )
        await asyncio.sleep(0)  # Yield control to event loop

        # ...

    except Exception as e:
        # Step 10: Handle errors gracefully
        print(e)  # Log error for debugging
        # Add an empty assistant message to maintain conversation flow
        a_message = AssistantMessage(id=response.id, content="", role="assistant")
        step_input.additional_data["messages"].append(a_message)
        return "end"  # Signal workflow termination

Then, in the FastAPI endpoint, initialize your Agno agent workflow state using the STATE_SNAPSHOT state management event, as shown below.

# MAIN API ENDPOINT: Handle stock analysis agent requests
# This endpoint receives investment queries and streams back real-time responses
@app.post("/agno-agent")
async def agno_agent(input_data: RunAgentInput):
    try:
        # ASYNC GENERATOR: Streams events to client in real-time
        # This function generates a stream of events that get sent to the frontend
        async def event_generator():
            # Step 1: Initialize event streaming infrastructure
            encoder = EventEncoder()  # Encodes events for transmission
            event_queue = asyncio.Queue()  # Queue for handling events from workflow

            # ...

            # Step 5: Send current state snapshot to client
            # Provides initial state including cash, portfolio, and logs
            yield encoder.encode(
                StateSnapshotEvent(
                    type=EventType.STATE_SNAPSHOT,
                    snapshot={
                        "available_cash": input_data.state["available_cash"],  # User's cash balance
                        "investment_summary": input_data.state["investment_summary"],  # Portfolio summary
                        "investment_portfolio": input_data.state["investment_portfolio"],  # Current holdings
                        "tool_logs": [],  # Initialize empty tool execution logs
                    },
                )
            )

            # ...

    except Exception as e:
        # Step 23: Handle any errors during execution
        print(e)  # Log error for debugging

    # Step 24: Return streaming response to client
    # FastAPI will stream the events as Server-Sent Events (SSE)
    return StreamingResponse(event_generator(), media_type="text/event-stream")

Step 6: Configure your Agno agent workflow with AG-UI protocol

Once you have initialized the Agno agent workflow state, integrate your Agno agent workflow with the AG-UI protocol, as shown in the agent/main.py file.

# MAIN API ENDPOINT: Handle stock analysis agent requests
# This endpoint receives investment queries and streams back real-time responses
@app.post("/agno-agent")
async def agno_agent(input_data: RunAgentInput):
    try:
        # ASYNC GENERATOR: Streams events to client in real-time
        # This function generates a stream of events that get sent to the frontend
        async def event_generator():
            # ...

            # Step 6: Start the stock analysis workflow as an async task
            # This runs the entire analysis pipeline in the background
            agent_task = asyncio.create_task(
                stock_analysis_workflow.arun(  # Execute workflow asynchronously
                    additional_data={
                        "tools": input_data.tools,  # Available tools/functions
                        "messages": input_data.messages,  # Conversation history
                        "emit_event": emit_event,  # Callback for sending UI updates
                        "available_cash": input_data.state["available_cash"],  # Cash balance
                        "investment_portfolio": input_data.state["investment_portfolio"],  # Holdings
                        "tool_logs": [],  # Initialize logs array
                    }
                )
            )

            # Step 7: Stream events from workflow while it's running
            # This loop processes events from the workflow and streams them to the client
            while True:
                try:
                    # Step 8: Wait for events from workflow (with timeout)
                    event = await asyncio.wait_for(event_queue.get(), timeout=0.1)
                    yield encoder.encode(event)  # Send event to client
                except asyncio.TimeoutError:
                    # Step 9: Check if the agent workflow has completed
                    if agent_task.done():
                        break  # Exit loop when workflow finishes

            # Step 10: Clear tool logs after workflow completion
            # Send event to reset tool logs in UI
            yield encoder.encode(
                StateDeltaEvent(
                    type=EventType.STATE_DELTA,
                    delta=[{"op": "replace", "path": "/tool_logs", "value": []}],
                )
            )

            # ...

    except Exception as e:
        # Step 23: Handle any errors during execution
        print(e)  # Log error for debugging

    # Step 24: Return streaming response to client
    # FastAPI will stream the events as Server-Sent Events (SSE)
    return StreamingResponse(event_generator(), media_type="text/event-stream")

Step 7: Configure AG-UI protocol tool events to handle Human-in-the-Loop breakpoint

After integrating your Agno agent workflow with AG-UI protocol, append a tool call message with a tool call name to the state, as shown in the cash allocation step in the agent/stock_analysis.py file.

# WORKFLOW STEP 3: Cash allocation and portfolio simulation
# This function calculates how investments would perform over time
async def cash_allocation(step_input):
    # Step 1: Validate that we have tool calls to process
    if step_input.additional_data["messages"][-1].tool_calls is None:
        return

    # Step 2: Initialize tool logging for allocation calculation
    tool_log_id = str(uuid.uuid4())
    step_input.additional_data["tool_logs"].append(
        {
            "id": tool_log_id,
            "message": "Calculating portfolio allocation",
            "status": "processing",
        }
    )

    # ...

    # Step 31: Add tool message to conversation
    step_input.additional_data["messages"].append(
        ToolMessage(
            role="tool",
            id=str(uuid.uuid4()),
            content="The relevant details had been extracted",  # Confirmation message
            tool_call_id=step_input.additional_data["messages"][-1].tool_calls[0].id,
        )
    )

    # Step 32: Request chart rendering through tool call
    step_input.additional_data["messages"].append(
        AssistantMessage(
            role="assistant",
            tool_calls=[
                {
                    "id": str(uuid.uuid4()),
                    "type": "function",
                    "function": {
                        "name": "render_standard_charts_and_table",  # Frontend rendering function
                        "arguments": json.dumps(
                            {"investment_summary": step_input.additional_data["investment_summary"]}
                        ),
                    },
                }
            ],
            id=str(uuid.uuid4()),
        )
    )

    # Step 33: Mark allocation calculation as completed
    index = len(step_input.additional_data["tool_logs"]) - 1
    step_input.additional_data["emit_event"](
        StateDeltaEvent(
            type=EventType.STATE_DELTA,
            delta=[
                {
                    "op": "replace",
                    "path": f"/tool_logs/{index}/status",
                    "value": "completed",
                }
            ],
        )
    )
    await asyncio.sleep(0)  # Yield control to event loop
    return

Then, define the AG-UI protocol tool call events that the agent uses to trigger a frontend action by its tool name and request user feedback, as shown in the agent/main.py file.

# MAIN API ENDPOINT: Handle stock analysis agent requests
# This endpoint receives investment queries and streams back real-time responses
@app.post("/agno-agent")
async def agno_agent(input_data: RunAgentInput):
    try:
        # ASYNC GENERATOR: Streams events to client in real-time
        # This function generates a stream of events that get sent to the frontend
        async def event_generator():
            # ...

            # Step 11: Process final workflow results and stream appropriate response
            # Check if the last message from the assistant contains tool calls or text
            if agent_task.result().step_responses[-1].content["messages"][-1].role == "assistant":
                if agent_task.result().step_responses[-1].content["messages"][-1].tool_calls:
                    # BRANCH A: Handle tool call responses (charts, analysis, etc.)
                    # for tool_call in state['messages'][-1].tool_calls:

                    # Step 12: Send tool call start event
                    yield encoder.encode(
                        ToolCallStartEvent(
                            type=EventType.TOOL_CALL_START,
                            tool_call_id=agent_task.result().step_responses[-1].content["messages"][-1].tool_calls[0].id,
                            tool_call_name=agent_task.result().step_responses[-1].content["messages"][-1]
                            .tool_calls[0]
                            .function.name,  # Name of function being called (e.g., render_charts)
                        )
                    )

                    # Step 13: Send tool call arguments
                    # Stream the arguments being passed to the tool/function
                    yield encoder.encode(
                        ToolCallArgsEvent(
                            type=EventType.TOOL_CALL_ARGS,
                            tool_call_id=agent_task.result().step_responses[-1].content["messages"][-1].tool_calls[0].id,
                            delta=agent_task.result().step_responses[-1].content["messages"][-1]
                            .tool_calls[0]
                            .function.arguments,  # JSON arguments for the function call
                        )
                    )

                    # Step 14: Send tool call completion event
                    # Signals that the tool call has finished
                    yield encoder.encode(
                        ToolCallEndEvent(
                            type=EventType.TOOL_CALL_END,
                            tool_call_id=agent_task.result().step_responses[-1].content["messages"][-1].tool_calls[0].id,
                        )
                    )
                else:
                    # ...

            # ...

    except Exception as e:
        # Step 23: Handle any errors during execution
        print(e)  # Log error for debugging

    # Step 24: Return streaming response to client
    # FastAPI will stream the events as Server-Sent Events (SSE)
    return StreamingResponse(event_generator(), media_type="text/event-stream")

Step 8: Configure AG-UI protocol text message events

Once you have configured AG-UI protocol tool events, define AG-UI protocol text message events in order to handle streaming agent responses to the frontend, as shown in the agent/main.py file.

# MAIN API ENDPOINT: Handle stock analysis agent requests
# This endpoint receives investment queries and streams back real-time responses
@app.post("/agno-agent")
async def agno_agent(input_data: RunAgentInput):
    try:
        # ASYNC GENERATOR: Streams events to client in real-time
        # This function generates a stream of events that get sent to the frontend
        async def event_generator():
            # ...

            # Step 11: Process final workflow results and stream appropriate response
            # Check if the last message from the assistant contains tool calls or text
            if agent_task.result().step_responses[-1].content["messages"][-1].role == "assistant":
                if agent_task.result().step_responses[-1].content["messages"][-1].tool_calls:
                    # ... (BRANCH A: tool call handling, shown in the previous step)
                else:
                    # BRANCH B: Handle text message responses

                    # Step 15: Start text message streaming
                    # Signal to UI that a text message is beginning
                    yield encoder.encode(
                        TextMessageStartEvent(
                            type=EventType.TEXT_MESSAGE_START,
                            message_id=message_id,
                            role="assistant",  # Message from AI assistant
                        )
                    )

                    # Step 16: Stream message content (if available)
                    # Only send content event if content is not empty
                    if agent_task.result().step_responses[-1].content["messages"][-1].content:
                        content = agent_task.result().step_responses[-1].content["messages"][-1].content

                        # Step 17: Split message into chunks for streaming effect
                        # Split content into 100 parts
                        n_parts = 100
                        part_length = max(1, len(content) // n_parts)  # Ensure at least 1 char per part
                        parts = [
                            content[i : i + part_length]
                            for i in range(0, len(content), part_length)
                        ]

                        # Step 18: Handle edge case where splitting creates too many parts
                        # If rounding produces more than n_parts chunks, merge the extras into the last part
                        if len(parts) > n_parts:
                            parts = parts[: n_parts - 1] + ["".join(parts[n_parts - 1 :])]

                        # Step 19: Stream each content chunk with a delay for typing effect
                        for part in parts:
                            yield encoder.encode(
                                TextMessageContentEvent(
                                    type=EventType.TEXT_MESSAGE_CONTENT,
                                    message_id=message_id,
                                    delta=part,  # Chunk of message content
                                )
                            )
                            await asyncio.sleep(0.05)  # Small delay for typing effect
                    else:
                        # Step 20: Handle case where no content was generated
                        # Send error message if content is empty
                        yield encoder.encode(
                            TextMessageContentEvent(
                                type=EventType.TEXT_MESSAGE_CONTENT,
                                message_id=message_id,
                                delta="Something went wrong! Please try again.",
                            )
                        )

                    # Step 21: End text message streaming
                    # Signal to UI that text message is complete
                    yield encoder.encode(
                        TextMessageEndEvent(
                            type=EventType.TEXT_MESSAGE_END,
                            message_id=message_id,
                        )
                    )

            # ...

    except Exception as e:
        # Step 23: Handle any errors during execution
        print(e)  # Log error for debugging

    # Step 24: Return streaming response to client
    # FastAPI will stream the events as Server-Sent Events (SSE)
    return StreamingResponse(event_generator(), media_type="text/event-stream")

Congratulations! You have integrated an Agno agent workflow with AG-UI protocol. Let’s now see how to add a frontend to the AG-UI + Agno agent workflow.

Integrating a frontend to the AG-UI + Agno agent workflow using CopilotKit

In this section, you will learn how to create a connection between your AG-UI + Agno agent workflow and a frontend using CopilotKit.

Let’s get started.

First, navigate to the frontend directory:

cd frontend 

Next, create a .env file with your OpenAI API key:

OPENAI_API_KEY=<<your-OpenAI-key-here>> 

Then install the dependencies:

pnpm install 

After that, start the development server:

pnpm run dev 

Navigate to http://localhost:3000, and you should see the AG-UI + Agno agent frontend up and running.


Let’s now see how to build the frontend UI for the AG-UI + Agno agent using CopilotKit.

Step 1: Create an HttpAgent instance

Before creating an HttpAgent instance, let’s understand what HttpAgent is.

HttpAgent is a client from the AG-UI library that bridges your frontend application with any AG-UI-compatible agent server.

To create an HttpAgent instance, define it in an API route as shown in the src/app/api/copilotkit/route.ts file.

// Import CopilotKit runtime components for AI agent integration
import {
  CopilotRuntime, // Core runtime for managing AI agents and conversations
  copilotRuntimeNextJSAppRouterEndpoint, // Next.js App Router integration helper
  OpenAIAdapter, // Adapter for OpenAI-compatible API endpoints
} from "@copilotkit/runtime";
// Import Next.js request type for proper TypeScript typing
import { NextRequest } from "next/server";
// Import HttpAgent for communicating with external AI agents
import { HttpAgent } from "@ag-ui/client";

// STEP 1: Initialize HTTP Agent for Stock Analysis Backend
// Create agent connection to our FastAPI stock analysis service
const agnoAgent = new HttpAgent({
  // Use environment variable for backend URL, fallback to localhost
  url: process.env.NEXT_PUBLIC_AGNO_URL || "http://0.0.0.0:8000/agno-agent",
});

// STEP 2: Configure OpenAI Service Adapter
// Set up adapter for OpenAI-compatible API communication
const serviceAdapter = new OpenAIAdapter();

// STEP 3: Initialize CopilotKit Runtime
// Create the main runtime that orchestrates AI agent interactions
const runtime = new CopilotRuntime({
  agents: {
    // Our FastAPI endpoint URL
    // @ts-ignore - Suppress TypeScript error for agent configuration
    agnoAgent: agnoAgent, // Register our stock analysis agent
  },
});

// Alternative simple runtime configuration (commented out)
// const runtime = new CopilotRuntime()

// STEP 4: Define POST Request Handler
// Export async function to handle incoming POST requests from CopilotKit
export const POST = async (req: NextRequest) => {
  // STEP 5: Create Request Handler with CopilotKit Integration
  // Configure the endpoint handler with our runtime and service adapter
  const { handleRequest } = copilotRuntimeNextJSAppRouterEndpoint({
    runtime, // Our configured CopilotKit runtime with agents
    serviceAdapter, // OpenAI adapter for LLM communication
    endpoint: "/api/copilotkit", // This API route's endpoint path
  });

  // STEP 6: Process and Return Request
  // Delegate request handling to CopilotKit's built-in handler
  // This will route requests to appropriate agents and handle responses
  return handleRequest(req);
};

Step 2: Set up CopilotKit provider

To set up the CopilotKit Provider, the <CopilotKit> component (https://docs.copilotkit.ai/reference/components/CopilotKit) must wrap the Copilot-aware parts of your application.

For most use cases, it's appropriate to wrap the CopilotKit provider around the entire app, e.g., in your layout.tsx, as shown below in the src/app/layout.tsx file.

// Next.js imports for metadata and font handling
import type { Metadata } from "next";
import { Geist, Geist_Mono } from "next/font/google";
// Global styles for the application
import "./globals.css";
// CopilotKit UI styles for AI components
import "@copilotkit/react-ui/styles.css";
// CopilotKit core component for AI functionality
import { CopilotKit } from "@copilotkit/react-core";

// Configure Geist Sans font with CSS variables for consistent typography
const geistSans = Geist({
  variable: "--font-geist-sans",
  subsets: ["latin"],
});

// Configure Geist Mono font for code and monospace text
const geistMono = Geist_Mono({
  variable: "--font-geist-mono",
  subsets: ["latin"],
});

// Metadata configuration for SEO and page information
export const metadata: Metadata = {
  title: "AI Stock Portfolio",
  description: "AI Stock Portfolio",
};

// Root layout component that wraps all pages in the application
export default function RootLayout({
  children,
}: Readonly<{
  children: React.ReactNode;
}>) {
  return (
    <html lang="en">
      <body className={`${geistSans.variable} ${geistMono.variable} antialiased`}>
        {/* CopilotKit wrapper that enables AI functionality throughout the app */}
        {/* runtimeUrl points to the API endpoint for AI backend communication */}
        {/* agent specifies which AI agent to use (agnoAgent for stock analysis) */}
        <CopilotKit runtimeUrl="/api/copilotkit" agent="agnoAgent">
          {children}
        </CopilotKit>
      </body>
    </html>
  );
}

Step 3: Set up a Copilot chat component

CopilotKit ships with a number of built-in chat components, which include CopilotPopup, CopilotSidebar, and CopilotChat.

To set up a Copilot chat component, define it as shown in the src/app/components/prompt-panel.tsx file.

// Client-side component directive for Next.js
"use client";

import type React from "react";
// CopilotKit chat component for AI interactions
import { CopilotChat } from "@copilotkit/react-ui";

// Props interface for the PromptPanel component
interface PromptPanelProps {
  // Amount of available cash for investment, displayed in the panel
  availableCash: number;
}

// Main component for the AI chat interface panel
export function PromptPanel({ availableCash }: PromptPanelProps) {
  // Utility function to format numbers as USD currency
  // Removes decimal places for cleaner display of large amounts
  const formatCurrency = (amount: number) => {
    return new Intl.NumberFormat("en-US", {
      style: "currency",
      currency: "USD",
      minimumFractionDigits: 0,
      maximumFractionDigits: 0,
    }).format(amount);
  };

  return (
    // Main container with full height and white background
    <div className="h-full flex flex-col bg-white">
      {/* Header section with title, description, and cash display */}
      <div className="p-4 border-b border-[#D8D8E5] bg-[#FAFCFA]">
        {/* Title section with icon and branding */}
        <div className="flex items-center gap-2 mb-2">
          <span className="text-xl">🪁</span>
          <div>
            <h1 className="text-lg font-semibold text-[#030507] font-['Roobert']">
              Portfolio Chat
            </h1>
            {/* Pro badge indicator */}
            <div className="inline-block px-2 py-0.5 bg-[#BEC9FF] text-[#030507] text-xs font-semibold uppercase rounded">
              PRO
            </div>
          </div>
        </div>
        {/* Description of the AI agent's capabilities */}
        <p className="text-xs text-[#575758]">
          Interact with the LangGraph-powered AI agent for portfolio visualization and analysis
        </p>
        {/* Available Cash Display section */}
        <div className="mt-3 p-2 bg-[#86ECE4]/10 rounded-lg">
          <div className="text-xs text-[#575758] font-medium">Available Cash</div>
          <div className="text-sm font-semibold text-[#030507] font-['Roobert']">
            {formatCurrency(availableCash)}
          </div>
        </div>
      </div>
      {/* CopilotKit chat interface with custom styling and initial message */}
      {/* Takes up majority of the panel height for conversation */}
      <CopilotChat
        className="h-[78vh] p-2"
        labels={{
          // Initial welcome message explaining the AI agent's capabilities and limitations
          initial: `I am a Crew AI agent designed to analyze investment opportunities and track stock performance over time. How can I help you with your investment query? For example, you can ask me to analyze a stock like "Invest in Apple with 10k dollars since Jan 2023". \n\nNote: The AI agent has access to stock data from the past 4 years only.`,
        }}
      />
    </div>
  );
}

Step 4: Sync AG-UI + Agno agent state with the frontend using CopilotKit hooks

In CopilotKit, CoAgents maintain a shared state that seamlessly connects your frontend UI with the agent's execution. This shared state system allows you to:

  • Display the agent's current progress and intermediate results

  • Update the agent's state through UI interactions

  • React to state changes in real-time across your application

You can learn more about CoAgents’ shared state in the CopilotKit docs.


To sync your AG-UI + Agno agent state with the frontend, use the CopilotKit useCoAgent hook to share the AG-UI + Agno agent state with your frontend, as shown in the src/app/page.tsx file.

"use client"; import { useCoAgent, } from "@copilotkit/react-core"; // ... export interface SandBoxPortfolioState { performanceData: Array<{ date: string; portfolio: number; spy: number; }>; } export interface InvestmentPortfolio { ticker: string; amount: number; } export default function OpenStocksCanvas() { // ... const [totalCash, setTotalCash] = useState(1000000); const { state, setState } = useCoAgent({ name: "agnoAgent", initialState: { available_cash: totalCash, investment_summary: {} as any, investment_portfolio: [] as InvestmentPortfolio[], }, }); // ... return ( <div className="h-screen bg-[#FAFCFA] flex overflow-hidden"> {/* ... */} </div>  ); } 
Enter fullscreen mode Exit fullscreen mode

Then render the AG-UI + Agno agent's state in the chat UI, which is useful for informing the user about the agent's state in a more in-context way.

To render the AG-UI + Agno agent's state in the chat UI, you can use the useCoAgentStateRender hook, as shown in the src/app/page.tsx file.

"use client"; import { useCoAgentStateRender, } from "@copilotkit/react-core"; import { ToolLogs } from "./components/tool-logs"; // ... export default function OpenStocksCanvas() { // ... useCoAgentStateRender({ name: "agnoAgent", render: ({ state }) => <ToolLogs logs={state.tool_logs} />,  }); // ... return ( <div className="h-screen bg-[#FAFCFA] flex overflow-hidden"> {/* ... */} </div>  ); } 
Enter fullscreen mode Exit fullscreen mode

If you execute a query in the chat, you should see the AG-UI + Agno agent’s task execution state rendered in the chat UI, as shown below.


Step 5: Implementing Human-in-the-Loop (HITL) in the frontend

Human-in-the-loop (HITL) allows agents to request human input or approval during execution, making AI systems more reliable and trustworthy. This pattern is essential when building AI applications that need to handle complex decisions or actions that require human judgment.

You can learn more about Human-in-the-Loop in the CopilotKit docs.

To implement Human-in-the-Loop (HITL) in the frontend, use the CopilotKit useCopilotAction hook with the renderAndWaitForResponse method, which allows returning values asynchronously from the render function, as shown in the src/app/page.tsx file.

"use client"; import { useCopilotAction, } from "@copilotkit/react-core"; // ... export default function OpenStocksCanvas() { // ... useCopilotAction({ name: "render_standard_charts_and_table", description: "This is an action to render a standard chart and table. The chart can be a bar chart or a line chart. The table can be a table of data.", renderAndWaitForResponse: ({ args, respond, status }) => { useEffect(() => { console.log(args, "argsargsargsargsargsaaa"); }, [args]); return ( <> {args?.investment_summary?.percent_allocation_per_stock && args?.investment_summary?.percent_return_per_stock && args?.investment_summary?.performanceData && ( <> <div className="flex flex-col gap-4"> <LineChartComponent data={args?.investment_summary?.performanceData} size="small" /> <BarChartComponent data={Object.entries( args?.investment_summary?.percent_return_per_stock ).map(([ticker, return1]) => ({ ticker, return: return1 as number, }))} size="small" /> <AllocationTableComponent allocations={Object.entries( args?.investment_summary?.percent_allocation_per_stock ).map(([ticker, allocation]) => ({ ticker, allocation: allocation as a number, currentValue: args?.investment_summary.final_prices[ticker] * args?.investment_summary.holdings[ticker], totalReturn: args?.investment_summary.percent_return_per_stock[ ticker ], }))} size="small" /> </div>  <button hidden={status == "complete"} className="mt-4 rounded-full px-6 py-2 bg-green-50 text-green-700 border border-green-200 shadow-sm hover:bg-green-100 transition-colors font-semibold text-sm" onClick={() => { debugger; if (respond) { setTotalCash(args?.investment_summary?.cash); setCurrentState({ ...currentState, returnsData: Object.entries( args?.investment_summary?.percent_return_per_stock ).map(([ticker, return1]) => ({ ticker, return: return1 as number, })), allocations: Object.entries( args?.investment_summary?.percent_allocation_per_stock ).map(([ticker, allocation]) => ({ ticker, allocation: allocation as a number, currentValue: args?.investment_summary?.final_prices[ticker] * args?.investment_summary?.holdings[ticker], totalReturn: args?.investment_summary?.percent_return_per_stock[ ticker ], })), performanceData: args?.investment_summary?.performanceData, bullInsights: args?.insights?.bullInsights || [], bearInsights: args?.insights?.bearInsights || [], currentPortfolioValue: args?.investment_summary?.total_value, totalReturns: ( Object.values( args?.investment_summary?.returns ) as number[] ).reduce((acc, val) => acc + val, 0), }); setInvestedAmount( ( Object.values( args?.investment_summary?.total_invested_per_stock ) as number[] ).reduce((acc, val) => acc + val, 0) ); setState({ ...state, available_cash: totalCash, }); respond( "Data rendered successfully. Provide a summary of the investments by not making any tool calls." ); } }}> Accept </button>  <button hidden={status == "complete"} className="rounded-full px-6 py-2 bg-red-50 text-red-700 border border-red-200 shadow-sm hover:bg-red-100 transition-colors font-semibold text-sm ml-2" onClick={() => { debugger; if (respond) { respond( "Data rendering rejected. Just give a summary of the rejected investments by not making any tool calls." ); } }}> Reject </button>  </>  )} </>  ); }, }); // ... return ( <div className="h-screen bg-[#FAFCFA] flex overflow-hidden"> {/* ... */} </div>  ); } 
Enter fullscreen mode Exit fullscreen mode

When the agent triggers a frontend action by its tool/action name to request human input or feedback during execution, the end user is prompted with a choice rendered inside the chat UI and can respond by pressing a button, as shown below.


Step 6: Streaming AG-UI + Agno agent responses in the frontend

To stream your AG-UI + Agno agent responses or results in the frontend, pass the agent’s state field values to the frontend components, as shown in the src/app/page.tsx file.

"use client"; import { useEffect, useState } from "react"; import { PromptPanel } from "./components/prompt-panel"; import { GenerativeCanvas } from "./components/generative-canvas"; import { ComponentTree } from "./components/component-tree"; import { CashPanel } from "./components/cash-panel"; // ... export default function OpenStocksCanvas() { const [currentState, setCurrentState] = useState<PortfolioState>({ id: "", trigger: "", performanceData: [], allocations: [], returnsData: [], bullInsights: [], bearInsights: [], currentPortfolioValue: 0, totalReturns: 0, }); const [sandBoxPortfolio, setSandBoxPortfolio] = useState< SandBoxPortfolioState[] >([]); const [selectedStock, setSelectedStock] = useState<string | null>(null); return ( <div className="h-screen bg-[#FAFCFA] flex overflow-hidden"> {/* Left Panel - Prompt Input */} <div className="w-85 border-r border-[#D8D8E5] bg-white flex-shrink-0"> <PromptPanel availableCash={totalCash} />  </div>  {/* Center Panel - Generative Canvas */} <div className="flex-1 relative min-w-0"> {/* Top Bar with Cash Info */} <div className="absolute top-0 left-0 right-0 bg-white border-b border-[#D8D8E5] p-4 z-10"> <CashPanel totalCash={totalCash} investedAmount={investedAmount} currentPortfolioValue={ totalCash + investedAmount + currentState.totalReturns || 0 } onTotalCashChange={setTotalCash} onStateCashChange={setState} />  </div>  <div className="pt-20 h-full"> <GenerativeCanvas setSelectedStock={setSelectedStock} portfolioState={currentState} sandBoxPortfolio={sandBoxPortfolio} setSandBoxPortfolio={setSandBoxPortfolio} />  </div>  </div>  {/* Right Panel - Component Tree (Optional) */} {showComponentTree && ( <div className="w-64 border-l border-[#D8D8E5] bg-white flex-shrink-0"> <ComponentTree portfolioState={currentState} />  </div>  )} </div>  ); } 
Enter fullscreen mode Exit fullscreen mode

If you query your agent and approve its feedback request, you should see the agent’s response or results streaming in the UI, as shown below.

Conclusion

In this guide, we have walked through the steps of integrating Agno agents with AG-UI protocol and then adding a frontend to the agents using CopilotKit.

While we’ve explored a couple of features, we have barely scratched the surface of the countless use cases for CopilotKit, ranging from building interactive AI chatbots to building agentic solutions—in essence, CopilotKit lets you add a ton of useful AI capabilities to your products in minutes.

Hopefully, this guide makes it easier for you to integrate AI-powered Copilots into your existing application.

Follow CopilotKit on Twitter and say hi, and if you'd like to build something cool, join the Discord community.

Top comments (19)

David

Def saving this one.
I've been diving into Agno and this looks like a cool project that will help me start to understand how to build with agents.
Thanks for taking the time to publish this
Very detailed, super helpful

Bonnie CopilotKit

I am happy to hear that, David.

Anik Sikder

This is a really practical guide. Seeing the AG-UI and Agno integration, it feels like building AI frontends has become much easier now. Thank you for explaining it so clearly.

Sayeed

Been hearing a lot about AGUI lately
Nice tutorial

Nathan Tarbert CopilotKit

That's awesome, Sayeed. Just curious where you've been hearing about AG-UI?

Bonnie CopilotKit

Thanks, Sayeed.

James

This is great, I've been waiting for an Agno guide. Thanks

Bonnie CopilotKit

You are welcome, James.

Sammy Scolling

Super dope!

Bonnie CopilotKit

Thanks, Sammy.

Morgan

Sick!

Nick

Great work

Bonnie CopilotKit

Thanks, Nick

Nathan Tarbert CopilotKit

This is awesome, Bonnie!

Great walkthrough!

Bonnie CopilotKit

Thanks, Nathan

Ferguson

I'm still unclear what AG-UI is... Can you explain it in a non-technical way?

Nathan Tarbert CopilotKit

Hi Ferguson, it’s basically a universal way for AI and the app’s interface to stay in sync.
The AI can show its progress step-by-step, and the UI can send your choices back instantly.

John Cook

I'm in charge of AI innovation, and I'm interested in CopilotKit. What's the best way to get a hold of someone there?

Nathan Tarbert CopilotKit

Hi John, thanks for leaving a comment.
Please DM me on Twitter if you'd like x.com/nathan_tarbert
I would love to chat with you and find out what you're building and your use case.
