diff --git a/README.md b/README.md
index e05f8dfe..954bbf32 100644
--- a/README.md
+++ b/README.md
@@ -35,6 +35,7 @@ Perfect for building **RAG pipelines** with real-time retrieval, **AI agents** w
| **[Vector Search](#retrieval)**<br>*Similarity search with metadata filters* | **[LLM Memory](#llm-memory)**<br>*Agentic AI context management* | **Async Support**<br>*Async indexing and search for improved performance* |
| **[Complex Filtering](#retrieval)**<br>*Combine multiple filter types* | **[Semantic Routing](#semantic-routing)**<br>*Intelligent query classification* | **[Vectorizers](#vectorizers)**<br>*8+ embedding provider integrations* |
| **[Hybrid Search](#retrieval)**<br>*Combine semantic & full-text signals* | **[Embedding Caching](#embedding-caching)**<br>*Cache embeddings for efficiency* | **[Rerankers](#rerankers)**<br>*Improve search result relevancy* |
+| | | **[MCP Server](#mcp-server)**<br>*Expose an existing Redis index to MCP clients* |
@@ -50,7 +51,16 @@ Install `redisvl` into your Python (>=3.9) environment using `pip`:
pip install redisvl
```
+Install the MCP server extra when you want to expose an existing Redis index through MCP:
+
+```bash
+pip install redisvl[mcp]
+```
+
+The `redisvl[mcp]` extra requires Python 3.10 or newer.
+
> For more detailed instructions, visit the [installation guide](https://docs.redisvl.com/en/latest/user_guide/installation.html).
+> For MCP concepts and setup, see the [RedisVL MCP docs](https://docs.redisvl.com/en/latest/concepts/mcp.html) and the [MCP how-to guide](https://docs.redisvl.com/en/latest/user_guide/how_to_guides/mcp.html).
## Redis
@@ -525,11 +535,45 @@ usage: rvl []
Commands:
index Index manipulation (create, delete, etc.)
+ mcp Run the RedisVL MCP server
version Obtain the version of RedisVL
stats Obtain statistics about an index
```
-> Read more about [using the CLI](https://docs.redisvl.com/en/latest/overview/cli.html).
+The `mcp` command runs the RedisVL MCP server over stdio; the [MCP Server](#mcp-server) section below covers the launch command and the `--read-only` flag.
+
+> Read more about [using the CLI](https://docs.redisvl.com/en/latest/overview/cli.html) and [running RedisVL MCP](https://docs.redisvl.com/en/latest/user_guide/how_to_guides/mcp.html).
+
+### MCP Server
+
+RedisVL includes an MCP server that lets MCP-compatible clients search or upsert data in an existing Redis index through a small, stable tool contract.
+
+The server:
+
+- connects to one existing Redis Search index
+- reconstructs the schema from Redis at startup
+- uses the configured vectorizer for query embedding and optional upsert embedding
+- exposes `search-records` and, unless read-only mode is enabled, `upsert-records`
+
+Run it over stdio with:
+
+```bash
+uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml
+```
+
+Use `--read-only` when clients should only search:
+
+```bash
+uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml --read-only
+```
+
+For configuration details, tool arguments, and examples, see the [RedisVL MCP docs](https://docs.redisvl.com/en/latest/concepts/mcp.html) and the [MCP how-to guide](https://docs.redisvl.com/en/latest/user_guide/how_to_guides/mcp.html).
## 🚀 Why RedisVL?
@@ -542,6 +586,7 @@ Built on the [Redis Python](https://github.com/redis/redis-py/tree/master) clien
For additional help, check out the following resources:
- [Getting Started Guide](https://docs.redisvl.com/en/stable/user_guide/01_getting_started.html)
+- [RedisVL MCP](https://docs.redisvl.com/en/latest/concepts/mcp.html)
- [API Reference](https://docs.redisvl.com/en/stable/api/index.html)
- [Redis AI Recipes](https://github.com/redis-developer/redis-ai-resources)
diff --git a/docs/concepts/index.md b/docs/concepts/index.md
index 0e522b1a..a68d0802 100644
--- a/docs/concepts/index.md
+++ b/docs/concepts/index.md
@@ -47,6 +47,13 @@ Vector, filter, text, hybrid, and multi-vector query options.
Vectorizers for embeddings and rerankers for result optimization.
:::
+:::{grid-item-card} 🧠 MCP
+:link: mcp
+:link-type: doc
+
+How RedisVL exposes an existing Redis index to MCP clients through a stable tool contract.
+:::
+
:::{grid-item-card} 🧩 Extensions
:link: extensions
:link-type: doc
@@ -65,5 +72,6 @@ search-and-indexing
field-attributes
queries
utilities
+mcp
extensions
```
diff --git a/docs/concepts/mcp.md b/docs/concepts/mcp.md
new file mode 100644
index 00000000..854b6a91
--- /dev/null
+++ b/docs/concepts/mcp.md
@@ -0,0 +1,102 @@
+---
+myst:
+ html_meta:
+ "description lang=en": |
+ RedisVL MCP concepts: how the RedisVL MCP server exposes an existing Redis index to MCP clients.
+---
+
+# RedisVL MCP
+
+RedisVL includes an MCP server that exposes a Redis-backed retrieval surface through a small, deterministic tool contract. It is designed for AI applications that want to search or maintain data in an existing Redis index without each client reimplementing Redis query logic.
+
+## What RedisVL MCP Does
+
+The RedisVL MCP server sits between an MCP client and Redis:
+
+1. It connects to an existing Redis Search index.
+2. It inspects that index at startup and reconstructs its schema.
+3. It instantiates the configured vectorizer for query embedding and optional upsert embedding.
+4. It exposes stable MCP tools for search and, optionally, upsert.
+
+This keeps the Redis index as the source of truth for search behavior while giving MCP clients a predictable interface.
+
+## How RedisVL MCP Runs
+
+RedisVL MCP works with a focused model:
+
+- One server process binds to exactly one existing Redis index.
+- The server uses stdio transport.
+- Search behavior is owned by configuration, not by MCP callers.
+- The vectorizer is configured explicitly.
+- Upsert is optional and can be disabled with read-only mode.
+
+## Config-Owned Search Behavior
+
+MCP callers can control:
+
+- `query`
+- `limit`
+- `offset`
+- `filter`
+- `return_fields`
+
+MCP callers do not choose:
+
+- which index to target
+- whether retrieval is `vector`, `fulltext`, or `hybrid`
+- query tuning parameters such as hybrid fusion or vector runtime settings
+
+That behavior lives in the server config under `indexes.<name>.search`. The response includes `search_type` as informational metadata, but it is not a request parameter.
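+
+As a sketch, a caller-visible request therefore carries only those arguments (the field names and values here are illustrative, not part of the contract):
+
+```json
+{
+  "query": "regional failover",
+  "limit": 5,
+  "offset": 0,
+  "filter": { "field": "category", "op": "eq", "value": "operations" },
+  "return_fields": ["title", "content"]
+}
+```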
+
+## Single Index Binding
+
+The YAML config uses an `indexes` mapping with one configured entry. That binding points to an existing Redis index through `redis_name`, and every tool call targets that configured index.
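+
+A minimal sketch of that binding (the `knowledge` names are illustrative):
+
+```yaml
+indexes:
+  knowledge:
+    redis_name: knowledge
+```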
+
+## Schema Inspection and Overrides
+
+RedisVL MCP is inspection-first:
+
+- the Redis index must already exist
+- the server reconstructs the schema from Redis metadata at startup
+- runtime field mappings remain explicit in config
+
+In some environments, Redis metadata can be incomplete for vector field attributes. When that happens, `schema_overrides` can patch missing attrs for fields that were already discovered. It does not create new fields or change discovered field identity.
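+
+A sketch of such a patch, assuming a previously discovered vector field named `embedding` whose dims were not recoverable from Redis metadata:
+
+```yaml
+schema_overrides:
+  fields:
+    - name: embedding
+      type: vector
+      attrs:
+        dims: 1536
+        datatype: float32
+```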
+
+## Read-Only and Read-Write Modes
+
+RedisVL MCP always registers `search-records`.
+
+`upsert-records` is only registered when the server is not in read-only mode. Read-only mode is controlled by:
+
+- the CLI flag `--read-only`
+- or the environment variable `REDISVL_MCP_READ_ONLY=true`
+
+Use read-only mode when Redis is serving approved content to assistants and another system owns ingestion.
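+
+For example, the stdio launch from the how-to guide can be made read-only through the environment instead of the flag:
+
+```bash
+REDISVL_MCP_READ_ONLY=true uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml
+```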
+
+## Tool Surface
+
+RedisVL MCP exposes two tools:
+
+- `search-records` searches the configured index using the server-owned search mode
+- `upsert-records` validates and upserts records, embedding them when needed
+
+These tools follow a stable contract:
+
+- request validation happens before query or write execution
+- filters support either raw strings or a RedisVL-backed JSON DSL
+- error codes are mapped into a stable set of MCP-facing categories
+
+## Why Use MCP Instead of Direct RedisVL Calls
+
+Use RedisVL MCP when you want a standard tool boundary for agent frameworks or assistants that already speak MCP.
+
+Use direct RedisVL client code when your application should own index lifecycle, search construction, data loading, or richer RedisVL features directly in Python.
+
+RedisVL MCP is a good fit when:
+
+- multiple assistants should share one approved retrieval surface
+- you want search behavior fixed by deployment config
+- you need a read-only or tightly controlled write boundary
+- you want to reuse an existing Redis index without rebuilding retrieval logic in every client
+
+For setup steps, config, commands, and examples, see {doc}`/user_guide/how_to_guides/mcp`.
diff --git a/docs/user_guide/how_to_guides/index.md b/docs/user_guide/how_to_guides/index.md
index c03d705d..fd24fbfc 100644
--- a/docs/user_guide/how_to_guides/index.md
+++ b/docs/user_guide/how_to_guides/index.md
@@ -39,6 +39,7 @@ How-to guides are **task-oriented** recipes that help you accomplish specific go
:::{grid-item-card} 💻 CLI Operations
- [Manage Indices with the CLI](../cli.ipynb) -- create, inspect, and delete indices from your terminal
+- [Run RedisVL MCP](mcp.md) -- expose an existing Redis index to MCP clients
:::
::::
@@ -59,6 +60,7 @@ How-to guides are **task-oriented** recipes that help you accomplish specific go
| Optimize index performance | [Optimize Indexes with SVS-VAMANA](../09_svs_vamana.ipynb) |
| Decide on storage format | [Choose a Storage Type](../05_hash_vs_json.ipynb) |
| Manage indices from terminal | [Manage Indices with the CLI](../cli.ipynb) |
+| Expose an index through MCP | [Run RedisVL MCP](mcp.md) |
```{toctree}
:hidden:
@@ -74,4 +76,5 @@ Optimize Indexes with SVS-VAMANA <../09_svs_vamana>
Cache Embeddings <../10_embeddings_cache>
Use Advanced Query Types <../11_advanced_queries>
Write SQL Queries for Redis <../12_sql_to_redis_queries>
+Run RedisVL MCP <mcp>
```
diff --git a/docs/user_guide/how_to_guides/mcp.md b/docs/user_guide/how_to_guides/mcp.md
new file mode 100644
index 00000000..d5aef922
--- /dev/null
+++ b/docs/user_guide/how_to_guides/mcp.md
@@ -0,0 +1,402 @@
+---
+myst:
+ html_meta:
+ "description lang=en": |
+ How to run the RedisVL MCP server, configure it, and use its search and upsert tools.
+---
+
+# Run RedisVL MCP
+
+This guide shows how to run the RedisVL MCP server against an existing Redis index, configure its behavior, and use the MCP tools it exposes.
+
+For the higher-level design, see {doc}`/concepts/mcp`.
+
+## Before You Start
+
+RedisVL MCP assumes all of the following are already true:
+
+- you have Python 3.10 or newer
+- you have Redis with Search capabilities available
+- the Redis index already exists
+- you know which text field and vector field the server should use
+- you have installed the vectorizer provider dependencies your config needs
+
+Install the MCP extra:
+
+```bash
+pip install redisvl[mcp]
+```
+
+If your vectorizer needs a provider extra, install that too:
+
+```bash
+pip install redisvl[mcp,openai]
+```
+
+## Start the Server
+
+Run the server over stdio:
+
+```bash
+uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml
+```
+
+Run it in read-only mode to expose search without upsert:
+
+```bash
+uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml --read-only
+```
+
+You can also control boot settings through environment variables:
+
+| Variable | Purpose |
+|----------|---------|
+| `REDISVL_MCP_CONFIG` | Path to the MCP YAML config |
+| `REDISVL_MCP_READ_ONLY` | Disable `upsert-records` when set to `true` |
+| `REDISVL_MCP_TOOL_SEARCH_DESCRIPTION` | Override the search tool description |
+| `REDISVL_MCP_TOOL_UPSERT_DESCRIPTION` | Override the upsert tool description |
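+
+As a sketch, a boot driven entirely by environment variables (assuming the server falls back to these variables when the corresponding flags are omitted, as the table describes):
+
+```bash
+export REDISVL_MCP_CONFIG=/path/to/mcp.yaml
+export REDISVL_MCP_READ_ONLY=true
+rvl mcp
+```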
+
+## Example Config
+
+This example binds one logical MCP server to one existing Redis index called `knowledge`.
+
+The config uses `${REDIS_URL}` and `${OPENAI_API_KEY}` as environment-variable placeholders. These values are resolved when the server starts. You can also use `${VAR:-default}` to provide a fallback value.
+
+```yaml
+server:
+ redis_url: ${REDIS_URL}
+
+indexes:
+ knowledge:
+ redis_name: knowledge
+
+ vectorizer:
+ class: OpenAITextVectorizer
+ model: text-embedding-3-small
+ api_config:
+ api_key: ${OPENAI_API_KEY}
+
+ schema_overrides:
+ fields:
+ - name: embedding
+ type: vector
+ attrs:
+ dims: 1536
+ datatype: float32
+
+ search:
+ type: hybrid
+ params:
+ text_scorer: BM25STD
+ stopwords: english
+ vector_search_method: KNN
+ combination_method: LINEAR
+ linear_text_weight: 0.3
+
+ runtime:
+ text_field_name: content
+ vector_field_name: embedding
+ default_embed_text_field: content
+ default_limit: 10
+ max_limit: 25
+ max_upsert_records: 64
+ skip_embedding_if_present: true
+ startup_timeout_seconds: 30
+ request_timeout_seconds: 60
+ max_concurrency: 16
+```
+
+### What This Config Means
+
+- `redis_name` must point to an index that already exists in Redis
+- `search.type` fixes retrieval behavior for every MCP caller
+- `runtime.text_field_name` tells full-text and hybrid search which field to search
+- `runtime.vector_field_name` tells the server which vector field to use
+- `runtime.default_embed_text_field` tells upsert which text field to embed when a record needs embedding
+- `schema_overrides` is only for patching incomplete field attrs discovered from Redis
+
+## Tool Contracts
+
+RedisVL MCP exposes a small, implementation-owned contract.
+
+### `search-records`
+
+Arguments:
+
+- `query`
+- `limit`
+- `offset`
+- `filter`
+- `return_fields`
+
+Example request payload:
+
+```json
+{
+ "query": "incident response runbook",
+ "limit": 2,
+ "offset": 0,
+ "filter": {
+ "and": [
+ { "field": "category", "op": "eq", "value": "operations" },
+ { "field": "rating", "op": "gte", "value": 4 }
+ ]
+ },
+ "return_fields": ["title", "content", "category", "rating"]
+}
+```
+
+Example response payload:
+
+```json
+{
+ "search_type": "hybrid",
+ "offset": 0,
+ "limit": 2,
+ "results": [
+ {
+ "id": "knowledge:runbook:eu-failover",
+ "score": 0.82,
+ "score_type": "hybrid_score",
+ "record": {
+ "title": "EU failover runbook",
+ "content": "Restore traffic after a regional failover.",
+ "category": "operations",
+ "rating": 5
+ }
+ }
+ ]
+}
+```
+
+Notes:
+
+- `search_type` is response metadata, not a request argument
+- when `return_fields` is omitted, RedisVL MCP returns all non-vector fields
+- requests that include the configured vector field in `return_fields` are rejected
+- `filter` accepts either a raw string or a JSON DSL object
+
+### `upsert-records`
+
+Arguments:
+
+- `records`
+- `id_field`
+- `skip_embedding_if_present`
+
+Example request payload:
+
+```json
+{
+ "records": [
+ {
+ "doc_id": "doc-42",
+ "content": "Updated operational guidance for failover handling.",
+ "category": "operations",
+ "rating": 5
+ }
+ ],
+ "id_field": "doc_id"
+}
+```
+
+Example response payload:
+
+```json
+{
+ "status": "success",
+ "keys_upserted": 1,
+ "keys": ["knowledge:doc-42"]
+}
+```
+
+Notes:
+
+- this tool is not registered in read-only mode
+- records that need embedding must contain the field named by `runtime.default_embed_text_field`
+- when `skip_embedding_if_present` is `true`, records that already contain the vector field can skip re-embedding
+
+## Search Examples
+
+### Read-Only Vector Search
+
+Use read-only mode when assistants should only retrieve data:
+
+```bash
+uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp.yaml --read-only
+```
+
+With a `search.type` of `vector`, callers send only the query, filters, pagination, and field projection:
+
+```json
+{
+ "query": "cache invalidation incident",
+ "limit": 3,
+ "return_fields": ["title", "content", "category"]
+}
+```
+
+### Raw String Filter
+
+Pass a raw Redis filter string through unchanged:
+
+```json
+{
+ "query": "science",
+ "filter": "@category:{science}",
+ "return_fields": ["content", "category"]
+}
+```
+
+### JSON DSL Filter
+
+The DSL supports logical operators and type-checked field operators:
+
+```json
+{
+ "query": "science",
+ "filter": {
+ "and": [
+ { "field": "category", "op": "eq", "value": "science" },
+ { "field": "rating", "op": "gte", "value": 4 }
+ ]
+ },
+ "return_fields": ["content", "category", "rating"]
+}
+```
+
+### Pagination and Field Projection
+
+```json
+{
+ "query": "science",
+ "limit": 1,
+ "offset": 1,
+ "return_fields": ["content", "category"]
+}
+```
+
+### Hybrid Search With `schema_overrides`
+
+Use `schema_overrides` when Redis inspection cannot recover complete vector attrs, then keep hybrid behavior in config:
+
+```yaml
+schema_overrides:
+ fields:
+ - name: embedding
+ type: vector
+ attrs:
+ algorithm: flat
+ dims: 1536
+ datatype: float32
+ distance_metric: cosine
+
+search:
+ type: hybrid
+ params:
+ text_scorer: BM25STD
+ stopwords: english
+ vector_search_method: KNN
+ combination_method: LINEAR
+ linear_text_weight: 0.3
+```
+
+The MCP caller still sends the same request shape:
+
+```json
+{
+ "query": "legacy cache invalidation flow",
+ "filter": { "field": "category", "op": "eq", "value": "release-notes" },
+ "return_fields": ["title", "content", "release_version"]
+}
+```
+
+## Upsert Examples
+
+### Auto-Embed New Records
+
+If a record does not include the configured vector field, RedisVL MCP embeds the field named by `runtime.default_embed_text_field` and writes the result:
+
+```json
+{
+ "records": [
+ {
+ "content": "First upserted document",
+ "category": "science",
+ "rating": 5
+ },
+ {
+ "content": "Second upserted document",
+ "category": "health",
+ "rating": 4
+ }
+ ]
+}
+```
+
+### Update Existing Records With `id_field`
+
+```json
+{
+ "records": [
+ {
+ "doc_id": "doc-1",
+ "content": "Updated content",
+ "category": "engineering",
+ "rating": 5
+ }
+ ],
+ "id_field": "doc_id"
+}
+```
+
+### Control Re-Embedding With `skip_embedding_if_present`
+
+```json
+{
+ "records": [
+ {
+ "doc_id": "doc-2",
+ "content": "Existing content",
+ "category": "science",
+ "rating": 4
+ }
+ ],
+ "id_field": "doc_id",
+ "skip_embedding_if_present": false
+}
+```
+
+Set `skip_embedding_if_present` to `false` when you want the server to regenerate embeddings during upsert. In most cases, the caller should omit the vector field and let the server manage embeddings from `runtime.default_embed_text_field`.
+
+## Troubleshooting
+
+### Missing MCP Dependencies
+
+If `rvl mcp` reports missing optional dependencies, install the MCP extra:
+
+```bash
+pip install redisvl[mcp]
+```
+
+If the configured vectorizer needs a provider SDK, install that provider extra too.
+
+### Unsupported Python Runtime
+
+RedisVL MCP requires Python 3.10 or newer even though the core package supports Python 3.9. Use a newer interpreter for the MCP server process.
+
+### Configured Redis Index Does Not Exist
+
+The server only binds to an existing index. Create the index first, then point `indexes.<name>.redis_name` at that index name.
+
+### Missing Required Environment Variables
+
+YAML values support `${VAR}` and `${VAR:-default}` substitution. Missing required variables fail startup before the server registers tools.
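+
+For example, a fallback default keeps local startup working when `REDIS_URL` is unset (the URL is illustrative):
+
+```yaml
+server:
+  redis_url: ${REDIS_URL:-redis://localhost:6379}
+```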
+
+### Vectorizer Dimension Mismatch
+
+If the vectorizer dims do not match the configured vector field dims, startup fails. Make sure the embedding model and the effective vector field dimensions are aligned.
+
+### Hybrid Config Requires Native Runtime Support
+
+Some hybrid params depend on native hybrid support in Redis and redis-py. If your environment does not support that path, remove native-only params such as `knn_ef_runtime` or upgrade Redis and redis-py.
diff --git a/docs/user_guide/index.md b/docs/user_guide/index.md
index 5d2cf6df..c9be86d2 100644
--- a/docs/user_guide/index.md
+++ b/docs/user_guide/index.md
@@ -39,7 +39,17 @@ Schema → Index → Load → Query
**Solve specific problems.** Task-oriented recipes for LLM extensions, querying, embeddings, optimization, and storage.
+++
-LLM Caching • Filtering • Vectorizers • Reranking
+LLM Caching • Filtering • MCP • Reranking
+:::
+
+:::{grid-item-card} 🧠 MCP Setup
+:link: how_to_guides/mcp
+:link-type: doc
+
+**Expose Redis through MCP.** Run the RedisVL MCP server, configure one existing index, and use search or optional upsert tools.
+
++++
+stdio transport • One index • Search and upsert
:::
:::{grid-item-card} 💻 CLI Reference
diff --git a/docs/user_guide/installation.md b/docs/user_guide/installation.md
index cfa1bb32..56704379 100644
--- a/docs/user_guide/installation.md
+++ b/docs/user_guide/installation.md
@@ -31,6 +31,7 @@ $ pip install redisvl[vertexai] # Google Vertex AI embeddings
$ pip install redisvl[bedrock] # AWS Bedrock embeddings
# Other optional features
+$ pip install redisvl[mcp] # RedisVL MCP server support (Python 3.10+)
$ pip install redisvl[langcache] # LangCache managed service integration
$ pip install redisvl[sql-redis] # SQL query support
```
@@ -44,7 +45,7 @@ $ pip install redisvl\[openai\]
You can install multiple optional dependencies at once:
```bash
-$ pip install redisvl[openai,cohere,sentence-transformers]
+$ pip install redisvl[mcp,openai,cohere,sentence-transformers]
```
To install **all** optional dependencies at once:
@@ -53,6 +54,10 @@ To install **all** optional dependencies at once:
$ pip install redisvl[all]
```
+```{note}
+The core RedisVL package supports Python 3.9+, but the `redisvl[mcp]` extra requires Python 3.10 or newer because the MCP server depends on `fastmcp`.
+```
+
## Install RedisVL from Source
To install RedisVL from source, clone the repository and install the package using `pip`:
diff --git a/spec/MCP-production-example.md b/spec/MCP-production-example.md
index 30a9978e..16e0739b 100644
--- a/spec/MCP-production-example.md
+++ b/spec/MCP-production-example.md
@@ -77,7 +77,6 @@ The behavioral contract stays the same. The operational controls around networki
```yaml
server:
redis_url: ${REDIS_URL}
- read_only: true
indexes:
knowledge:
@@ -97,6 +96,15 @@ indexes:
dims: 1536
datatype: float32
+ search:
+ type: hybrid
+ params:
+ text_scorer: BM25STD
+ stopwords: english
+ vector_search_method: KNN
+ combination_method: LINEAR
+ linear_text_weight: 0.3
+
runtime:
text_field_name: content
vector_field_name: embedding
@@ -109,6 +117,12 @@ indexes:
max_concurrency: 16
```
+Run the server in read-only mode with the CLI flag or environment variable instead of YAML:
+
+```bash
+uvx --from redisvl[mcp] rvl mcp --config /path/to/mcp_config.yaml --read-only
+```
+
Why this is realistic:
- The index already exists and is discovered automatically.
@@ -127,7 +141,6 @@ Request:
```json
{
"query": "How do we mitigate elevated cache miss rate after a regional failover?",
- "search_type": "vector",
"limit": 5,
"filter": {
"and": [
@@ -152,7 +165,6 @@ Request:
```json
{
"query": "deprecation of legacy cache invalidation flow",
- "search_type": "hybrid",
"limit": 3,
"filter": {
"field": "product",
diff --git a/spec/MCP.md b/spec/MCP.md
index c160db79..e2fb9320 100644
--- a/spec/MCP.md
+++ b/spec/MCP.md
@@ -342,7 +342,7 @@ Tool executions are bounded by an async semaphore (`runtime.max_concurrency`). R
{
"or": [
{ "field": "rating", "op": "gte", "value": 4.5 },
- { "field": "is_pinned", "op": "eq", "value": true }
+ { "field": "category", "op": "eq", "value": "featured" }
]
}
]
@@ -433,7 +433,6 @@ Not registered when read-only mode is enabled.
|----------|------|----------|---------|-------------|
| `records` | list[object] | yes | - | non-empty and `len(records) <= runtime.max_upsert_records` |
| `id_field` | str | no | `null` | if set, must exist in every record |
-| `embed_text_field` | str | no | `runtime.default_embed_text_field` | must exist in every record |
| `skip_embedding_if_present` | bool | no | `runtime.skip_embedding_if_present` | if false, always re-embed |
### Response Contract
@@ -449,7 +448,7 @@ Not registered when read-only mode is enabled.
### Upsert Semantics
1. Validate input records before writing.
-2. Resolve `embed_text_field`.
+2. Use `runtime.default_embed_text_field` for records that require embedding.
3. Respect `skip_embedding_if_present` (default true): only generate embeddings for records missing configured vector field.
4. Populate configured vector field.
5. Call `AsyncSearchIndex.load`.
@@ -664,7 +663,7 @@ Note: Full n8n MCP client support depends on n8n's MCP implementation. Refer to
- filter behavior
- `test_upsert_tool.py`
- insert/update success
- - id_field/embed_text_field validation failures
+ - id_field validation failures
- read-only mode excludes tool
### Deterministic Verification Commands