# Agent Chat UI
Agent Chat UI is a Vite + React application which enables chatting with any LangGraph server with a `messages` key through a chat interface.
> [!NOTE]
> 🎥 Watch the video setup guide [here](https://youtu.be/lInrwVnZ83o).
## Setup
> [!TIP]
> Don't want to run the app locally? Use the deployed site here: [agentchat.vercel.app](https://agentchat.vercel.app)!
First, clone the repository, or run the [`npx` command](https://www.npmjs.com/package/create-agent-chat-app):
```bash
npx create-agent-chat-app
```
or
```bash
git clone https://github.com/langchain-ai/agent-chat-ui.git
cd agent-chat-ui
```
Install dependencies:
```bash
pnpm install
```
Run the app:
```bash
pnpm dev
```
The app will be available at `http://localhost:5173`.
## Usage
Once the app is running (or if using the deployed site), you'll be prompted to enter:
- **Deployment URL**: The URL of the LangGraph server you want to chat with. This can be a production or development URL.
- **Assistant/Graph ID**: The name of the graph, or the ID of the assistant, to use when fetching and submitting runs via the chat interface.
- **LangSmith API Key**: (only required when connecting to deployed LangGraph servers) Your LangSmith API key, used to authenticate requests sent to the LangGraph server.
After entering these values, click `Continue`. You'll then be redirected to a chat interface where you can start chatting with your LangGraph server.
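For reference, the three values might look like the following for a local development server started with `langgraph dev` (the URL, graph ID, and key below are illustrative placeholders, not required values):

```typescript
// Illustrative placeholder values only -- substitute your own deployment details.
// A local server started with `langgraph dev` typically listens on port 2024.
const connection = {
  deploymentUrl: "http://localhost:2024",      // Deployment URL
  assistantId: "agent",                        // Assistant/Graph ID
  apiKey: process.env.LANGSMITH_API_KEY ?? "", // LangSmith API Key (deployed servers only)
};
```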
## Hiding Messages in the Chat
You can control the visibility of messages within the Agent Chat UI in two main ways:
**1. Prevent Live Streaming:**
To stop messages from being displayed _as they stream_ from an LLM call, add the `langsmith:nostream` tag to the chat model's configuration. The UI normally uses `on_chat_model_stream` events to render streaming messages; this tag prevents those events from being emitted for the tagged model.
_Python Example:_
```python
from langchain_anthropic import ChatAnthropic

# Add tags via the .with_config method
model = ChatAnthropic().with_config(
    config={"tags": ["langsmith:nostream"]}
)
```
_TypeScript Example:_
```typescript
import { ChatAnthropic } from "@langchain/anthropic";

const model = new ChatAnthropic()
  // Add tags via the .withConfig method
  .withConfig({ tags: ["langsmith:nostream"] });
```
**Note:** Even if streaming is hidden this way, the message will still appear after the LLM call completes if it's saved to the graph's state without further modification.
**2. Hide Messages Permanently:**
To ensure a message is _never_ displayed in the chat UI (neither while streaming nor after being saved to state), add the `langsmith:do-not-render` tag to the chat model's configuration and prefix the message's `id` field with `do-not-render-` _before_ adding it to the graph's state. The UI filters out any message whose `id` starts with this prefix.
_Python Example:_
```python
result = model.invoke([messages])
# Prefix the ID before saving to state
result.id = f"do-not-render-{result.id}"
return {"messages": [result]}
```
_TypeScript Example:_
```typescript
const result = await model.invoke([messages]);
// Prefix the ID before saving to state
result.id = `do-not-render-${result.id}`;
return { messages: [result] };
```
This approach guarantees the message remains completely hidden from the user interface.
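Conceptually, the prefix check described above amounts to a filter like the following (a simplified sketch, not the actual UI source; the `Message` shape and function name are illustrative):

```typescript
// Simplified stand-in for a LangGraph message -- only the fields used here.
interface Message {
  id: string;
  content: string;
}

const DO_NOT_RENDER_PREFIX = "do-not-render-";

// Keep only messages whose `id` does not carry the hide prefix.
function visibleMessages(messages: Message[]): Message[] {
  return messages.filter((m) => !m.id.startsWith(DO_NOT_RENDER_PREFIX));
}
```

Because the check runs on every render, a message tagged this way stays hidden even after the graph's state is persisted and reloaded.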