<!-- integrations/chainlit.md -->
---
layout: integration
name: Chainlit
description: Use Chainlit UI for your Haystack apps through Hayhooks
authors:
- name: deepset
socials:
github: deepset-ai
twitter: deepset_ai
linkedin: https://www.linkedin.com/company/deepset-ai/
pypi: https://pypi.org/project/hayhooks/
repo: https://github.com/deepset-ai/hayhooks
type: UI
report_issue: https://github.com/deepset-ai/hayhooks/issues
logo: /logos/chainlit.png
version: Haystack 2.0
toc: true
---

### Table of Contents

- [Overview](#overview)
- [Installation](#installation)
- [Usage](#usage)
- [License](#license)

## Overview

[Chainlit](https://chainlit.io/) is an open-source Python package for building
production-ready Conversational AI. By exposing your Haystack app (standalone agent or
pipeline) through [Hayhooks](https://github.com/deepset-ai/hayhooks) as
OpenAI-compatible endpoints, you can run the Chainlit chat UI inside your Hayhooks
server, giving you a zero-configuration frontend to interact with your deployed
pipelines without a separate client.

For full details, see the [Hayhooks Chainlit integration guide](https://deepset-ai.github.io/hayhooks/features/chainlit-integration).

## Installation

Install Hayhooks with the `chainlit` extra:

```bash
pip install "hayhooks[chainlit]"
```

## Usage

### Hayhooks Quick Start

The simplest way to enable the Chainlit UI is via the `--with-chainlit` flag:

```bash
hayhooks run --with-chainlit
```

This starts Hayhooks with the embedded Chainlit UI available at `http://localhost:1416/chat`.

### Create a Pipeline Wrapper

```python
# pipelines/my_chat/pipeline_wrapper.py
from typing import Generator

from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage

from hayhooks import BasePipelineWrapper, streaming_generator

class PipelineWrapper(BasePipelineWrapper):
def setup(self) -> None:
self.system_message = ChatMessage.from_system("You are a helpful assistant.")
chat_prompt_builder = ChatPromptBuilder()
llm = OpenAIChatGenerator(model="gpt-4o-mini")

self.pipeline = Pipeline()
self.pipeline.add_component("chat_prompt_builder", chat_prompt_builder)
self.pipeline.add_component("llm", llm)
self.pipeline.connect("chat_prompt_builder.prompt", "llm.messages")

def run_chat_completion(self, model: str, messages: list[dict], body: dict) -> Generator:
chat_messages = [self.system_message] + [
ChatMessage.from_openai_dict_format(msg) for msg in messages
]
return streaming_generator(
pipeline=self.pipeline,
pipeline_run_args={"chat_prompt_builder": {"template": chat_messages}},
)
```

Pipelines must support chat completion (e.g. using `streaming_generator` or `async_streaming_generator`). See [OpenAI compatibility](https://deepset-ai.github.io/hayhooks/features/openai-compatibility) and the [pipeline examples](https://deepset-ai.github.io/hayhooks/examples/overview/) for implementation details.
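The `messages` argument that Hayhooks passes to `run_chat_completion` uses the OpenAI chat format: a list of role/content dicts. As a minimal illustration of that format, here is a plain-Python sketch of picking out the latest user turn (Hayhooks also ships helpers for this; the function below is purely illustrative):

```python
def last_user_message(messages: list[dict]) -> str:
    """Return the content of the most recent 'user' turn in an
    OpenAI-format message list, or an empty string if there is none."""
    for msg in reversed(messages):
        if msg.get("role") == "user":
            return msg.get("content", "")
    return ""


# Example history in the OpenAI dict format passed to run_chat_completion:
history = [
    {"role": "user", "content": "What is Haystack?"},
    {"role": "assistant", "content": "An open-source LLM framework."},
    {"role": "user", "content": "And Hayhooks?"},
]

print(last_user_message(history))  # → And Hayhooks?
```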

### Run Hayhooks with UI

```bash
hayhooks run --with-chainlit --pipelines-dir ./pipelines
```

Navigate to [`http://localhost:1416/chat`](http://localhost:1416/chat) in your browser. You'll see your deployed pipeline and can start chatting!

![Chainlit UI](../images/hayhooks-chainlit.gif)
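Besides the chat UI, the same server exposes OpenAI-compatible endpoints you can call directly. A stdlib-only sketch of building such a request (the endpoint path and the `my_chat` pipeline name are assumptions; adjust them to your deployment):

```python
import json
from urllib import request


def build_chat_request(model: str, user_text: str) -> request.Request:
    """Build an OpenAI-style chat completion request for a Hayhooks server.

    `model` is the name of the deployed pipeline (assumed here to be
    "my_chat", matching the wrapper directory above).
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
        "stream": False,
    }
    return request.Request(
        "http://localhost:1416/v1/chat/completions",  # assumed endpoint path
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


# With the server running, this would print the assistant's reply:
# with request.urlopen(build_chat_request("my_chat", "Hello!")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```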

### Examples

Learn how to build an end-to-end agent with Haystack, Hayhooks, and Chainlit in the [Chainlit Weather Agent Example](https://github.com/deepset-ai/hayhooks/tree/main/examples/pipeline_wrappers/chainlit_weather_agent).

## License

`hayhooks` and `chainlit` are distributed under the terms of the [Apache-2.0](https://spdx.org/licenses/Apache-2.0.html) license.
<!-- integrations/openwebui.md -->
---
layout: integration
name: Open WebUI
description: Use Open WebUI as a chat frontend for your Haystack apps through Hayhooks
authors:
- name: deepset
socials:
github: deepset-ai
twitter: deepset_ai
linkedin: https://www.linkedin.com/company/deepset-ai/
pypi: https://pypi.org/project/hayhooks/
repo: https://github.com/deepset-ai/hayhooks
type: UI
report_issue: https://github.com/deepset-ai/hayhooks/issues
logo: /logos/openwebui.png
version: Haystack 2.0
toc: true
---

### Table of Contents

- [Overview](#overview)
- [Installation](#installation)
- [Usage](#usage)
- [License](#license)

## Overview

[Open WebUI](https://openwebui.com/) is an open-source chat UI for LLM apps. By exposing your Haystack pipelines and agents through [Hayhooks](https://github.com/deepset-ai/hayhooks) as OpenAI-compatible endpoints, you can use Open WebUI as the frontend: run Hayhooks and Open WebUI (separately or via Docker Compose), then connect Open WebUI to Hayhooks in Settings. You get streaming, optional [status and notification events](https://deepset-ai.github.io/hayhooks/features/openwebui-integration#open-webui-events), and optional [OpenAPI tool server](https://deepset-ai.github.io/hayhooks/features/openwebui-integration#openapi-tool-server) integration.

For full details, see the [Hayhooks Open WebUI integration guide](https://deepset-ai.github.io/hayhooks/features/openwebui-integration).

## Installation

Install Hayhooks:

```bash
pip install hayhooks
```

Install and run Open WebUI separately, e.g. with Docker:

```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e WEBUI_AUTH=False -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main
```

For a pre-wired setup, use the [Hayhooks + Open WebUI Docker Compose](https://github.com/deepset-ai/hayhooks-open-webui-docker-compose) (see [Quick Start with Docker Compose](https://deepset-ai.github.io/hayhooks/getting-started/quick-start-docker/)).

## Usage

### Connect Open WebUI to Hayhooks

1. Start Hayhooks with your pipelines:

```bash
hayhooks run --pipelines-dir ./pipelines
```

2. In Open WebUI, go to **Settings → Connections** and add a connection:
   - **API Base URL**: `http://localhost:1416` (from the Docker container above, use `http://host.docker.internal:1416`, since `localhost` inside the container does not reach the host; with Docker Compose, use `http://hayhooks:1416/v1`)
   - **API Key**: any value (Hayhooks does not require authentication)

3. In a new chat, select your deployed pipeline as the model.

Pipeline wrappers must support chat completion (e.g. implement `run_chat_completion` or `run_chat_completion_async`). See [OpenAI compatibility](https://deepset-ai.github.io/hayhooks/features/openai-compatibility) and the [pipeline examples](https://deepset-ai.github.io/hayhooks/examples/overview/) for implementation details.
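With streaming enabled, responses from the OpenAI-compatible endpoint arrive as standard server-sent events. A stdlib sketch of pulling the text delta out of one SSE line (assuming the standard OpenAI chunk shape — this is how clients like Open WebUI consume the stream):

```python
import json
from typing import Optional


def parse_sse_delta(line: str) -> Optional[str]:
    """Extract the text delta from one OpenAI-style SSE line.

    Returns None for non-data lines, the final "[DONE]" marker, and
    chunks that carry no content (e.g. a role-only first chunk).
    """
    prefix = "data: "
    if not line.startswith(prefix):
        return None
    body = line[len(prefix):].strip()
    if body == "[DONE]":
        return None
    chunk = json.loads(body)
    return chunk["choices"][0].get("delta", {}).get("content")


print(parse_sse_delta('data: {"choices": [{"delta": {"content": "Hi"}}]}'))  # → Hi
```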

### Optional: Open WebUI events

For status updates, notifications, and tool-call feedback in the UI, use helpers from `hayhooks.open_webui` in your pipeline and stream `OpenWebUIEvent` objects. See [Open WebUI Events](https://deepset-ai.github.io/hayhooks/examples/openwebui-events/) and the [open_webui_agent_events](https://github.com/deepset-ai/hayhooks/tree/main/examples/pipeline_wrappers/open_webui_agent_events) and [open_webui_agent_on_tool_calls](https://github.com/deepset-ai/hayhooks/tree/main/examples/pipeline_wrappers/open_webui_agent_on_tool_calls) examples.
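The actual helper names live in `hayhooks.open_webui` (see the docs linked above). As a rough illustration of what such an event carries, here is the general shape of a "status" event — the field layout below is an assumption based on Open WebUI's event format, not the Hayhooks API; in real code, build events with the Hayhooks helpers:

```python
def status_event(description: str, done: bool = False) -> dict:
    """Sketch of an Open WebUI "status" event payload.

    NOTE: this shape is an assumption for illustration only; use the
    helpers from hayhooks.open_webui to construct real events.
    """
    return {
        "event": {
            "type": "status",
            "data": {"description": description, "done": done},
        }
    }


# Yield such events from your streaming generator alongside text chunks, e.g.:
# yield status_event("Searching the web...")
```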

## License

`hayhooks` is distributed under the terms of the [Apache-2.0](https://spdx.org/licenses/Apache-2.0.html) license. Open WebUI is subject to its [own license](https://github.com/open-webui/open-webui).