1 change: 1 addition & 0 deletions README.md
@@ -7,6 +7,7 @@ This repository contains LaunchDarkly AI SDK packages for Python, including the
| Package | PyPI | README |
| ------- | ---- | ------ |
| [launchdarkly-server-sdk-ai](packages/sdk/server-ai) | [![PyPI](https://img.shields.io/pypi/v/launchdarkly-server-sdk-ai.svg)](https://pypi.org/project/launchdarkly-server-sdk-ai/) | [README](packages/sdk/server-ai/README.md) |
| [launchdarkly-server-sdk-ai-optimization](packages/optimization) | [![PyPI](https://img.shields.io/pypi/v/launchdarkly-server-sdk-ai-optimization.svg)](https://pypi.org/project/launchdarkly-server-sdk-ai-optimization/) | [README](packages/optimization/README.md) |

| AI Provider Packages | PyPI | README |
| -------------------- | ---- | ------ |
5 changes: 4 additions & 1 deletion packages/ai-providers/server-ai-langchain/README.md
@@ -1,6 +1,9 @@
# LaunchDarkly AI SDK - LangChain Provider

[![PyPI](https://img.shields.io/pypi/v/launchdarkly-server-sdk-ai-langchain.svg)](https://pypi.org/project/launchdarkly-server-sdk-ai-langchain/)
[![Actions Status](https://github.com/launchdarkly/python-server-sdk-ai/actions/workflows/ci.yml/badge.svg?branch=main)](https://github.com/launchdarkly/python-server-sdk-ai/actions/workflows/ci.yml)

[![PyPI](https://img.shields.io/pypi/v/launchdarkly-server-sdk-ai-langchain.svg?maxAge=2592000)](https://pypi.org/project/launchdarkly-server-sdk-ai-langchain/)
[![PyPI](https://img.shields.io/pypi/pyversions/launchdarkly-server-sdk-ai-langchain.svg)](https://pypi.org/project/launchdarkly-server-sdk-ai-langchain/)

> [!CAUTION]
> This package is in pre-release and not subject to backwards compatibility
7 changes: 5 additions & 2 deletions packages/ai-providers/server-ai-openai/README.md
@@ -1,6 +1,9 @@
# LaunchDarkly AI SDK OpenAI Provider

[![PyPI](https://img.shields.io/pypi/v/launchdarkly-server-sdk-ai-openai-dev.svg?style=flat-square)](https://pypi.org/project/launchdarkly-server-sdk-ai-openai-dev/)
[![Actions Status](https://github.com/launchdarkly/python-server-sdk-ai/actions/workflows/ci.yml/badge.svg?branch=main)](https://github.com/launchdarkly/python-server-sdk-ai/actions/workflows/ci.yml)

[![PyPI](https://img.shields.io/pypi/v/launchdarkly-server-sdk-ai-openai.svg?maxAge=2592000)](https://pypi.org/project/launchdarkly-server-sdk-ai-openai/)
[![PyPI](https://img.shields.io/pypi/pyversions/launchdarkly-server-sdk-ai-openai.svg)](https://pypi.org/project/launchdarkly-server-sdk-ai-openai/)

> [!CAUTION]
> This package is in pre-release and not subject to backwards compatibility
@@ -13,7 +16,7 @@ This package provides an OpenAI integration for the LaunchDarkly AI SDK.
## Installation

```bash
pip install launchdarkly-server-sdk-ai-openai-dev
pip install launchdarkly-server-sdk-ai-openai
```

## Quick Start
5 changes: 4 additions & 1 deletion packages/optimization/README.md
@@ -1,6 +1,9 @@
# LaunchDarkly AI SDK — optimization

[![PyPI](https://img.shields.io/pypi/v/launchdarkly-server-sdk-ai-optimization.svg?style=flat-square)](https://pypi.org/project/launchdarkly-server-sdk-ai-optimization/)
[![Actions Status](https://github.com/launchdarkly/python-server-sdk-ai/actions/workflows/ci.yml/badge.svg?branch=main)](https://github.com/launchdarkly/python-server-sdk-ai/actions/workflows/ci.yml)

[![PyPI](https://img.shields.io/pypi/v/launchdarkly-server-sdk-ai-optimization.svg?maxAge=2592000)](https://pypi.org/project/launchdarkly-server-sdk-ai-optimization/)
[![PyPI](https://img.shields.io/pypi/pyversions/launchdarkly-server-sdk-ai-optimization.svg)](https://pypi.org/project/launchdarkly-server-sdk-ai-optimization/)

> [!CAUTION]
> This package is in pre-release and not subject to backwards compatibility
22 changes: 14 additions & 8 deletions packages/sdk/server-ai/README.md
@@ -1,5 +1,11 @@
# LaunchDarkly Server-Side AI SDK for Python

[![Actions Status](https://github.com/launchdarkly/python-server-sdk-ai/actions/workflows/ci.yml/badge.svg?branch=main)](https://github.com/launchdarkly/python-server-sdk-ai/actions/workflows/ci.yml)
[![readthedocs](https://readthedocs.org/projects/launchdarkly-python-sdk-ai/badge/)](https://launchdarkly-python-sdk-ai.readthedocs.io/en/latest/)

[![PyPI](https://img.shields.io/pypi/v/launchdarkly-server-sdk-ai.svg?maxAge=2592000)](https://pypi.org/project/launchdarkly-server-sdk-ai/)
[![PyPI](https://img.shields.io/pypi/pyversions/launchdarkly-server-sdk-ai.svg)](https://pypi.org/project/launchdarkly-server-sdk-ai/)

This package contains the LaunchDarkly Server-Side AI SDK for Python (`launchdarkly-server-sdk-ai`).

> [!CAUTION]
@@ -90,16 +96,16 @@ if ai_config.enabled:
# Use with your AI provider
```
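The gating pattern shown above can be sketched in plain Python. This is a hypothetical illustration only: the attribute names (`enabled`, `model.name`, `messages`) and the helper `run_if_enabled` are assumptions for the sketch, not the SDK's guaranteed shape; consult the SDK reference for the real API.

```python
from types import SimpleNamespace


def run_if_enabled(ai_config, call_provider):
    """Only call the AI provider when the retrieved config is enabled."""
    if not ai_config.enabled:
        return None  # feature disabled: skip the provider entirely
    # Hand the config-driven model name and messages to the provider call.
    return call_provider(ai_config.model.name, ai_config.messages)


# Stand-in config object mimicking the assumed shape above.
cfg = SimpleNamespace(
    enabled=True,
    model=SimpleNamespace(name="gpt-4"),
    messages=[{"role": "system", "content": "You are a helpful assistant."}],
)

result = run_if_enabled(
    cfg, lambda name, msgs: f"called {name} with {len(msgs)} message(s)"
)
print(result)  # called gpt-4 with 1 message(s)
```

Keeping the enabled check in one helper means a disabled config short-circuits before any provider traffic is sent.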

## Chat for Conversational AI
## ManagedModel for Conversational AI

`Chat` provides a high-level interface for conversational AI with automatic conversation management and metrics tracking:
`ManagedModel` provides a high-level interface for conversational AI with automatic conversation management and metrics tracking:

- Automatically configures models based on AI configuration
- Maintains conversation history across multiple interactions
- Automatically tracks token usage, latency, and success rates
- Works with any supported AI provider (see [AI Providers](https://github.com/launchdarkly/python-server-sdk-ai#ai-providers) for available packages)
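The conversation management described above can be sketched in plain Python. This is a hypothetical illustration of the pattern (history appended on every call, provider invoked with the full transcript), not the SDK's implementation; `ManagedChatSketch` and its fields are invented names.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Message:
    role: str
    content: str


@dataclass
class ManagedChatSketch:
    """Invented sketch of conversation management, not the SDK's class."""

    # Stand-in for a provider call: takes the full history, returns a reply.
    complete: Callable[[List[Message]], str]
    history: List[Message] = field(default_factory=list)

    def invoke(self, text: str) -> Message:
        self.history.append(Message("user", text))
        reply = Message("assistant", self.complete(self.history))
        self.history.append(reply)  # history persists across invocations
        return reply

    def get_messages(self) -> List[Message]:
        return list(self.history)


# Echo provider for illustration.
chat = ManagedChatSketch(complete=lambda msgs: f"You said: {msgs[-1].content}")
print(chat.invoke("I need help with my order").content)
print(len(chat.get_messages()))  # 2: one user + one assistant message
```

Because every `invoke` passes the accumulated history to the provider, later turns can refer back to earlier ones, which is what makes follow-up questions like "What's the status?" resolvable.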

### Using Chat
### Using ManagedModel

```python
import asyncio
@@ -109,23 +115,23 @@ from ldai import LDAIClient, AICompletionConfigDefault, ModelConfig, LDMessage
# Use the same default_config from the retrieval section above
async def main():
context = Context.create("user-123")
chat = await ai_client.create_chat(
model = await ai_client.create_model(
'customer-support-chat',
context,
default_config,
variables={'customerName': 'John'}
)

if chat:
if model:
# Simple conversation flow - metrics are automatically tracked by invoke()
response1 = await chat.invoke('I need help with my order')
response1 = await model.invoke('I need help with my order')
print(response1.message.content)

response2 = await chat.invoke("What's the status?")
response2 = await model.invoke("What's the status?")
print(response2.message.content)

# Access conversation history
messages = chat.get_messages()
messages = model.get_messages()
print(f'Conversation has {len(messages)} messages')

asyncio.run(main())
```