diff --git a/agentic/setup-guide.mdx b/agentic/setup-guide.mdx
index b65f1fb..dd466f0 100644
--- a/agentic/setup-guide.mdx
+++ b/agentic/setup-guide.mdx
@@ -3,180 +3,177 @@ title: "Setup Guide"
description: "Step-by-step instructions for connecting Claude, ChatGPT, Cursor, Codex CLI, and n8n to the TabPFN MCP server."
---
-#### Claude Code
-
-
-After setup, checkout our [Claude skill](/agentic/tutorials/mcp-claude-skills) to get the most out of TabPFN MCP.
-
-
-```bash
-# If you haven't, install Claude Code
-npm install -g @anthropic-ai/claude-code
-
-# Navigate to your project
-cd your-tabpfn-project
-
-# Add TabPFN MCP (general access)
-claude mcp add --transport http tabpfn https://api.priorlabs.ai/mcp/server
-
-# Start coding with Claude
-claude
-
-# Authenticate the MCP tools by typing /mcp
-# This will trigger the OAuth flow
-/mcp
-```
-
-#### Claude.ai and Claude for desktop
-
-
-After setup, checkout our [Claude skill](/agentic/tutorials/mcp-claude-skills) to get the most out of TabPFN MCP.
-
-
-
-
-
- The fastest way to get started with Claude Desktop is the TabPFN MCP Bundle.
- Download it, double-click the file, and Claude Desktop will open the install dialog.
-
- 1. Enable network egress in Claude Desktop's Settings → Capabilities (required to reach `api.priorlabs.ai`)
- 2. Get your API token from [ux.priorlabs.ai](https://ux.priorlabs.ai) (log in or sign up, then copy from the dashboard)
- 3. Download the bundle:
-
-
- Install in Claude
-
-
- v0.1.0 · .mcpb · ~1 MB · macOS, Windows, Linux
-
-
- 4. Open the downloaded `.mcpb` file — Claude Desktop will show the install dialog
- 5. Paste your API token when prompted
+
+
+  After setup, check out our [Claude skill](/agentic/tutorials/mcp-claude-skills) to get the most out of TabPFN MCP.
+
+
+ ```bash
+ # If you haven't, install Claude Code
+ npm install -g @anthropic-ai/claude-code
+
+ # Navigate to your project
+ cd your-tabpfn-project
+
+ # Add TabPFN MCP (general access)
+ claude mcp add --transport http tabpfn https://api.priorlabs.ai/mcp/server
+
+ # Start coding with Claude
+ claude
+
+ # Authenticate the MCP tools by typing /mcp
+ # This will trigger the OAuth flow
+ /mcp
+ ```
-
- 1. Open Settings in the sidebar
- 2. Navigate to Connectors and select Add custom connector
- 3. Configure the connector:
- - Name: TabPFN
- - URL: https://api.priorlabs.ai/mcp/server
+
+
+  After setup, check out our [Claude skill](/agentic/tutorials/mcp-claude-skills) to get the most out of TabPFN MCP.
+
+
+
+
+
+
+ The fastest way to get started with Claude Desktop is the TabPFN MCP Bundle.
+ Download it, double-click the file, and Claude Desktop will open the install dialog.
+
+ 1. Enable network egress in Claude Desktop's Settings → Capabilities (required to reach `api.priorlabs.ai`)
+ 2. Get your API token from [ux.priorlabs.ai](https://ux.priorlabs.ai) (log in or sign up, then copy from the dashboard)
+ 3. Download the bundle:
+
+
+ Install in Claude
+
+
+ v0.1.0 · .mcpb · ~1 MB · macOS, Windows, Linux
+
+
+ 4. Open the downloaded `.mcpb` file — Claude Desktop will show the install dialog
+ 5. Paste your API token when prompted
+
+
+ 1. Open Settings in the sidebar
+ 2. Navigate to Connectors and select Add custom connector
+ 3. Configure the connector:
+ - Name: TabPFN
+ - URL: https://api.priorlabs.ai/mcp/server
+
+
+ Custom connectors using remote MCP are available on Claude and Claude Desktop for users on Pro, Max, Team, and Enterprise plans.
+
+
+
+ You may add the MCP server by editing the Claude Desktop config file:
+
+ 1. Locate your Claude Desktop config file based on your operating system
+ 2. Get your API key from [ux.priorlabs.ai](https://ux.priorlabs.ai) (log in or sign up, then copy from the dashboard)
+ 3. Edit the config file to add the TabPFN server:
+ ```json
+ {
+ "mcpServers": {
+ "tabpfn": {
+ "url": "https://api.priorlabs.ai/mcp/server",
+ "headers": {
+ "Authorization": "Bearer YOUR_API_KEY_HERE"
+ }
+ }
+ }
+ }
+ ```
+ 4. Replace `YOUR_API_KEY_HERE` with your actual API key from step 2
+ 5. Save the config file and restart Claude Desktop for the changes to take effect
+
+
+
+
+
+
+ Follow these steps to set up TabPFN as a connector in ChatGPT:
+
+  1. Open ChatGPT settings
+  2. Enable Developer mode:
+     - Go to Settings → Connectors → Advanced settings → Developer mode
+  3. In the Connectors tab, click `Create` to add a new connector:
+     - Give it a name: TabPFN
+     - MCP server URL: https://api.priorlabs.ai/mcp/server
+     - Authentication: OAuth
+  4. Click Create
- Custom connectors using remote MCP are available on Claude and Claude Desktop for users on Pro, Max, Team, and Enterprise plans.
+ Custom connectors using MCP are available on ChatGPT for Pro and Plus accounts on the web.
-
- You may add the MCP server by editing the Claude Desktop config file:
+
+ Codex CLI is OpenAI's local coding agent that can run directly from your terminal.
+
+ ```bash
+ # Install Codex
+ npm i -g @openai/codex
+
+ # Add TabPFN MCP
+ codex mcp add tabpfn --url https://api.priorlabs.ai/mcp/server
+
+ # Start Codex
+ codex
+ ```
+
+ When adding the MCP server, Codex will detect OAuth support and open your browser to authorize the connection.
+
+
+ To add TabPFN MCP to your Cursor environment, add the snippet below to your project-specific or global `.cursor/mcp.json` file manually. For more details, see the [Cursor documentation](https://docs.cursor.com/en/context/mcp).
- 1. Locate your Claude Desktop config file based on your operating system
- 2. Get your API key from [ux.priorlabs.ai](https://ux.priorlabs.ai) (log in or sign up, then copy from the dashboard)
- 3. Edit the config file to add the TabPFN server:
```json
{
- "mcpServers": {
- "tabpfn": {
- "url": "https://api.priorlabs.ai/mcp/server",
- "headers": {
- "Authorization": "Bearer YOUR_API_KEY_HERE"
- }
- }
+ "mcpServers": {
+ "tabpfn": {
+ "url": "https://api.priorlabs.ai/mcp/server"
}
}
+ }
```
- 4. Replace `YOUR_API_KEY_HERE` with your actual API key from step 2
- 5. Save the config file and restart Claude Desktop for the changes to take effect
+
+  Once the server is added, Cursor will attempt to connect and display a **Needs login** prompt. Click it to authorize Cursor to access your Prior Labs account.
+
+
+ Watch the video below to learn how to integrate TabPFN with n8n workflows.
+
+
-
-#### ChatGPT
-
-
-
-Follow these steps to set up TabPFN as a connector in ChatGPT:
-
-1. Enable Developer mode:
- - Go to Settings → Connectors → Advanced settings → Developer mode
-2. Open ChatGPT settings
-3. In the Connectors tab, `Create` a new connector:
- - Give it a name: TabPFN
- - MCP server URL: https://api.priorlabs.ai/mcp/server
- - Authentication: OAuth
-4. Click Create
-
-
-Custom connectors using MCP are available on ChatGPT for Pro and Plus accounts on the web.
-
-
-#### Codex CLI
-
-Codex CLI is OpenAI's local coding agent that can run directly from your terminal.
-
-```bash
-# Install Codex
-npm i -g @openai/codex
-
-# Add TabPFN MCP
-codex mcp add tabpfn --url https://api.priorlabs.ai/mcp/server
-
-# Start Codex
-codex
-```
-
-When adding the MCP server, Codex will detect OAuth support and open your browser to authorize the connection.
-
-#### Cursor
-
-To add TabPFN MCP to your Cursor environment, add the snippet below to your project-specific or global `.cursor/mcp.json` file manually. For more details, see the [Cursor documentation](https://docs.cursor.com/en/context/mcp).
-
-```json
-{
- "mcpServers": {
- "tabpfn": {
- "url": "https://api.priorlabs.ai/mcp/server"
- }
- }
-}
-```
-
-Once the server is added, Cursor will attempt to connect and display a Needs login prompt. Click on this prompt to authorize Cursor to access your Prior Labs account.
-
-#### n8n
-
-Watch the video below to learn how to integrate TabPFN with n8n workflows.
-
-
\ No newline at end of file
diff --git a/docs.json b/docs.json
index 68cb2aa..ad10f16 100644
--- a/docs.json
+++ b/docs.json
@@ -129,6 +129,7 @@
"group": "Integrations",
"icon": "cloud",
"pages": [
+ "integrations/foundry",
"integrations/sagemaker"
]
},
diff --git a/integrations/foundry.mdx b/integrations/foundry.mdx
new file mode 100644
index 0000000..e5c3552
--- /dev/null
+++ b/integrations/foundry.mdx
@@ -0,0 +1,217 @@
+---
+title: "Microsoft Foundry"
+description: "Access TabPFN in your secure Azure environment."
+---
+
+Access TabPFN directly from Azure AI Foundry with Azure-native endpoints and authentication. Usage is billed through your Azure subscription, and you are charged only for the compute resources needed to host TabPFN models.
+
+## Prerequisites
+* An active Azure subscription with access to [Azure AI Foundry](https://ai.azure.com/explore/models)
+* Azure quota for VM SKUs with GPU
+* TabPFN deployed as a MaaP (Model-as-a-Platform) endpoint in your Foundry project
+
+
+For a full list of supported VM SKUs, please visit the TabPFN Microsoft Foundry Model Card.
+
+
+## Getting Started
+1. Navigate to the Azure AI Foundry [Model Catalog](https://ai.azure.com/explore/models)
+2. Search for TabPFN and select TabPFN-2.5
+3. Click **Use this model** and follow the guided setup
+4. Once deployed, note your endpoint URL and API key from the deployment details page
+
+
+Microsoft Foundry hosts each TabPFN version as a separate model. When a new TabPFN version is released, it will appear as a distinct model in the catalog and must be deployed independently; existing deployments will not be updated automatically.
+
+
+## Installation
+
+A TabPFN model deployed through Microsoft Foundry can be accessed with any HTTP client.
+
+```bash
+pip install requests numpy pandas
+```
+
+## Usage Guide
+
+TabPFN on Azure Foundry exposes a single `POST /predict` HTTP endpoint. You send training data, labels, and test data in one request and get predictions back immediately, with no model training step.
+
+### Endpoint
+
+Authenticate using the API key from your deployment's **Keys and Endpoint** page in Azure ML Studio.
+
+```http
+POST https://..inference.ml.azure.com/predict
+Content-Type: application/json
+Authorization: Bearer
+```
+
+### Request
+
+
+ Training features. Accepts a row-oriented 2D array `[[f1, f2], [f1, f2], ...]`.
+
+
+
+ Training labels or targets. One value per training row.
+
+
+
+ Test features to predict for. Same format as `X_train`, without the target.
+
+
+
+  Controls the model's behavior.
+
+
+
+ `"classification"` or `"regression"`.
+
+
+
+ Model parameters. Safe to omit — defaults are applied server-side.
+ See the source for supported fields:
+ [classification](https://github.com/PriorLabs/TabPFN/blob/main/src/tabpfn/classifier.py#L203),
+ [regression](https://github.com/PriorLabs/TabPFN/blob/main/src/tabpfn/regressor.py#L211).
+
+
+
+ Controls output shape. See [Output types](#output-types) below.
+
+
+
+
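The row-oriented arrays above map directly onto pandas DataFrames. As a minimal sketch (the column names and values here are illustrative, not part of the API), `DataFrame.values.tolist()` produces the expected format:

```python
import pandas as pd

# Illustrative data; the feature names are hypothetical.
train = pd.DataFrame({"age": [25.0, 40.0, 31.0], "income": [30.0, 80.0, 55.0]})
y = [0, 1, 0]
test = pd.DataFrame({"age": [28.0], "income": [60.0]})

# The endpoint expects row-oriented 2D arrays, which
# DataFrame.values.tolist() produces directly.
payload = {
    "task_config": {
        "task": "classification",
        "predict_params": {"output_type": "probas"},
    },
    "X_train": train.values.tolist(),
    "y_train": y,
    "X_test": test.values.tolist(),
}

print(payload["X_train"])
# [[25.0, 30.0], [40.0, 80.0], [31.0, 55.0]]
```

The resulting `payload` can then be sent with `requests.post` exactly as in the examples below.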
+## Examples
+
+
+
+
+ Get a probability distribution over classes for each test row.
+
+ ```python
+ import requests
+
+ response = requests.post(
+ "https://..inference.ml.azure.com/predict",
+ headers={
+ "Content-Type": "application/json",
+ "Authorization": "Bearer ",
+ },
+ json={
+ "task_config": {
+ "task": "classification",
+ "predict_params": {"output_type": "probas"},
+ },
+ "X_train": [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]],
+ "y_train": [0, 1, 0],
+ "X_test": [[2.0, 3.0], [4.0, 5.0]],
+ },
+ )
+
+ result = response.json()
+ print(result["prediction"])
+ # [[0.12, 0.88], [0.55, 0.45]]
+ ```
+
+ `prediction` is a 2D array — one inner list per test row, one probability per class.
+
+
+
+
+ Get a single predicted class label for each test row.
+
+ ```python
+ import requests
+
+ response = requests.post(
+ "https://..inference.ml.azure.com/predict",
+ headers={
+ "Content-Type": "application/json",
+ "Authorization": "Bearer ",
+ },
+ json={
+ "task_config": {
+ "task": "classification",
+ "predict_params": {"output_type": "preds"},
+ },
+ "X_train": [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]],
+ "y_train": [0, 1, 0],
+ "X_test": [[2.0, 3.0], [4.0, 5.0]],
+ },
+ )
+
+ result = response.json()
+ print(result["prediction"])
+ # [1, 0]
+ ```
+
+ `prediction` is a 1D array — one class per test row.
+
+
+
+
+ Get a single predicted value for each test row.
+
+ ```python
+ import requests
+
+ response = requests.post(
+ "https://..inference.ml.azure.com/predict",
+ headers={
+ "Content-Type": "application/json",
+ "Authorization": "Bearer ",
+ },
+ json={
+ "task_config": {
+ "task": "regression",
+ "predict_params": {"output_type": "mean"},
+ },
+ "X_train": [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]],
+ "y_train": [10.0, 11.0, 9.5],
+ "X_test": [[2.0, 3.0], [4.0, 5.0]],
+ },
+ )
+
+ result = response.json()
+ print(result["prediction"])
+ # [10.2, 10.7]
+ ```
+
+  `prediction` is a 1D array when the default `mean` output type is used — one prediction per test row.
+
+
+
+
+## Output types
+
+### Classification
+
+TabPFN natively outputs class probabilities, giving you calibrated uncertainty estimates from a single model with no extra configuration.
+
+| Output type | Shape | Description |
+|---------------|-------|-------------|
+| `probas` *(default)* | `number[][]` | One probability list per test row |
+| `preds` | `number[]` | Predicted class label per test row |
+
+### Regression
+
+TabPFN models can provide a full predictive distribution rather than just point estimates, so you can extract quantiles or summary statistics with a single inference call.
+
+| Output type | Shape | Description |
+|---------------|-------|-------------|
+| `mean` *(default)* | `number[]` | Predicted mean per test row |
+| `median` | `number[]` | Predicted median per test row |
+| `mode` | `number[]` | Predicted mode per test row |
+| `quantiles` | `number[][]` | One list per quantile |
+| `full` | `object` | All outputs (mean, median, quantiles, etc.) |
+| `main` | `object` | Main outputs only |
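Note that `quantiles` is transposed relative to the other outputs: the outer list indexes quantile levels, not test rows. A small parsing sketch (the response values and quantile levels here are illustrative assumptions):

```python
# Hypothetical /predict response for output_type "quantiles":
# one inner list per quantile level, each holding one value per test row.
response_json = {
    "prediction": [
        [9.8, 10.1],   # lower quantile, per test row
        [10.2, 10.7],  # median, per test row
        [10.6, 11.3],  # upper quantile, per test row
    ]
}

# Assumed quantile levels; these are configured via predict_params
# (see the regressor source linked above for supported fields).
levels = [0.1, 0.5, 0.9]

# Transpose so each entry holds all quantile values for one test row.
per_row = list(zip(*response_json["prediction"]))

for i, row in enumerate(per_row):
    print(f"test row {i}: {dict(zip(levels, row))}")
# test row 0: {0.1: 9.8, 0.5: 10.2, 0.9: 10.6}
# test row 1: {0.1: 10.1, 0.5: 10.7, 0.9: 11.3}
```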
+
+---
+
+## Errors
+
+| Code | Cause |
+|------|-------|
+| `400` | Missing required fields or invalid JSON |
+| `415` | Content-Type is not `application/json` |
+| `422` | Validation error — e.g. `y_train` has multiple columns, invalid `output_type` |
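Most of these can be caught before the request leaves your machine. A minimal client-side validation sketch (the checks and messages are illustrative helpers, not part of the API):

```python
def validate_payload(payload: dict) -> list[str]:
    """Client-side checks mirroring the common 400/422 causes above."""
    errors = []
    # 400: missing required fields
    for field in ("task_config", "X_train", "y_train", "X_test"):
        if field not in payload:
            errors.append(f"missing required field: {field}")
    x, y = payload.get("X_train"), payload.get("y_train")
    if x is not None and y is not None:
        # 422: y_train must have exactly one value per training row
        if len(x) != len(y):
            errors.append("X_train and y_train row counts differ")
        if any(isinstance(v, (list, tuple)) for v in y):
            errors.append("y_train must be a single column")
    return errors

print(validate_payload({"X_train": [[1.0]], "y_train": [0, 1], "X_test": [[2.0]]}))
# ['missing required field: task_config', 'X_train and y_train row counts differ']
```

Running checks like these before `requests.post` turns most `400`/`422` responses into immediate local errors.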
\ No newline at end of file