# Agent Chat UI
Agent Chat UI is a Vite + React application that provides a chat interface for any LangGraph server whose state includes a `messages` key.
> [!NOTE]
> 🎥 Watch the video setup guide [here](https://youtu.be/lInrwVnZ83o).
## Setup
> [!TIP]
> Don't want to run the app locally? Use the deployed site here: [agentchat.vercel.app](https://agentchat.vercel.app)!
First, clone the repository, or run the [`npx` command](https://www.npmjs.com/package/create-agent-chat-app):
```bash
npx create-agent-chat-app
```
or
```bash
git clone https://github.com/langchain-ai/agent-chat-ui.git
cd agent-chat-ui
```
Install dependencies:
```bash
pnpm install
```
Run the app:
```bash
pnpm dev
```
The app will be available at `http://localhost:5173`.
## Usage
Once the app is running (or if using the deployed site), you'll be prompted to enter:
- **Deployment URL**: The URL of the LangGraph server you want to chat with. This can be a production or development URL.
- **Assistant/Graph ID**: The name of the graph, or the ID of the assistant, to use when fetching and submitting runs via the chat interface.
- **LangSmith API Key**: Your LangSmith API key, used to authenticate requests sent to LangGraph servers (only required when connecting to deployed LangGraph servers).
After entering these values, click `Continue`. You'll then be redirected to a chat interface where you can start chatting with your LangGraph server.
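For example, when pointing the UI at a local server started with `langgraph dev`, the values might look like the following (the `agent` graph name is just a common default from the LangGraph templates; yours may differ):

```text
Deployment URL:      http://localhost:2024
Assistant/Graph ID:  agent
LangSmith API Key:   (can be left empty for local development servers)
```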
## Hiding Messages in the Chat
You can control the visibility of messages within the Agent Chat UI in two main ways:
**1. Prevent Live Streaming:**
To stop messages from being displayed _as they stream_ from an LLM call, add the `langsmith:nostream` tag to the chat model's configuration. The UI normally uses `on_chat_model_stream` events to render streaming messages; this tag prevents those events from being emitted for the tagged model.
_Python Example:_
```python
from langchain_anthropic import ChatAnthropic

# Add tags via the .with_config method
# (ChatAnthropic requires a model; the name here is just an example)
model = ChatAnthropic(model="claude-3-5-haiku-latest").with_config(
    config={"tags": ["langsmith:nostream"]}
)
```
_TypeScript Example:_
```typescript
import { ChatAnthropic } from "@langchain/anthropic";

const model = new ChatAnthropic()
  // Add tags via the .withConfig method
  .withConfig({ tags: ["langsmith:nostream"] });
```
**Note:** Even if streaming is hidden this way, the message will still appear after the LLM call completes if it's saved to the graph's state without further modification.
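Conceptually, this works because streaming events carry the tags of the model that emitted them, so a renderer can skip `on_chat_model_stream` events tagged `langsmith:nostream`. A minimal sketch of that check (illustrative only — `should_render` is a hypothetical helper, not the UI's actual code):

```python
# Illustrative only: a renderer deciding whether to display a stream event.
def should_render(event: dict) -> bool:
    # Events other than chat-model stream chunks are unaffected by the tag.
    if event["event"] != "on_chat_model_stream":
        return True
    # Skip chunks emitted by models tagged langsmith:nostream.
    return "langsmith:nostream" not in event.get("tags", [])

events = [
    {"event": "on_chat_model_stream", "tags": ["langsmith:nostream"], "data": "hidden"},
    {"event": "on_chat_model_stream", "tags": [], "data": "shown"},
]
print([e["data"] for e in events if should_render(e)])  # ['shown']
```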
**2. Hide Messages Permanently:**
To ensure a message is _never_ displayed in the chat UI (neither during streaming nor after being saved to state), prefix its `id` field with `do-not-render-` _before_ adding it to the graph's state, along with adding the `langsmith:do-not-render` tag to the chat model's configuration. The UI explicitly filters out any message whose `id` starts with this prefix.
_Python Example:_
```python
result = model.invoke([messages])
# Prefix the ID before saving to state
result.id = f"do-not-render-{result.id}"
return {"messages": [result]}
```
_TypeScript Example:_
```typescript
const result = await model.invoke([messages]);
// Prefix the ID before saving to state
result.id = `do-not-render-${result.id}`;
return { messages: [result] };
```
This approach guarantees the message remains completely hidden from the user interface.
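The UI-side effect of the prefix amounts to a simple filter. The sketch below (with a hypothetical `visible_messages` helper, not the UI's actual code) shows what the guarantee looks like:

```python
# Hypothetical helper mirroring the UI's prefix filter; not the actual UI code.
HIDE_PREFIX = "do-not-render-"

def visible_messages(messages: list[dict]) -> list[dict]:
    """Return only messages whose id does not carry the hide prefix."""
    return [m for m in messages if not str(m.get("id", "")).startswith(HIDE_PREFIX)]

msgs = [
    {"id": "msg-1", "content": "shown"},
    {"id": "do-not-render-msg-2", "content": "hidden"},
]
print(visible_messages(msgs))  # [{'id': 'msg-1', 'content': 'shown'}]
```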