You can install it from PyPI or Docker. By default, the conversion features use Ollama running locally. Cloud providers (OpenAI, Anthropic, Mistral, DeepSeek, Gemini) are also supported and require only an API key.
From PyPI, install the package with pip:

```shell
pip install opensyndrome
```

Then run it:

```shell
opensyndrome --help
```
With Docker, build the image, tagged `opensyndrome`:

```shell
docker build -t opensyndrome .
```

Run the container interactively, removing it when it exits:

```shell
docker run --rm -it opensyndrome
```

To read a `.env` file, mount it:

```shell
docker run --rm -it \
  -v "$(pwd)/.env:/app/.env:ro" \
  opensyndrome
```

To name the container and keep it around:

```shell
docker run --name opensyndrome-cli -it opensyndrome
```

First, download the schema and definitions in order to work with the CLI locally.
```shell
opensyndrome download schema
opensyndrome download definitions
```

The files will be placed in the `.open_syndrome` folder in `$HOME`.
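As a sketch, the download location can be inspected programmatically (the folder name follows the docs above; `list_downloaded` is an illustrative helper, not part of the CLI):

```python
from pathlib import Path

# Folder used by `opensyndrome download` (per the docs above)
OSI_DIR = Path.home() / ".open_syndrome"

def list_downloaded(directory: Path = OSI_DIR) -> list[str]:
    """Return the file names found in the download folder, if it exists."""
    if not directory.is_dir():
        return []
    return sorted(p.name for p in directory.iterdir() if p.is_file())
```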
The provider and model can be set via environment variables so you don't have to pass them on every command:
```shell
OPENSYNDROME_PROVIDER=ollama  # ollama (default), openai, anthropic, mistral, deepseek, gemini
OPENSYNDROME_MODEL=mistral    # overrides the provider's default model
```

Copy `.env.example` to `.env` and fill in the relevant values:
| Provider | Required env var | Default model |
|---|---|---|
| ollama | — (runs locally) | mistral |
| openai | OPENAI_API_KEY | gpt-4o |
| anthropic | ANTHROPIC_API_KEY | claude-3-haiku-20240307 |
| mistral | MISTRAL_API_KEY | mistral-large-latest |
| deepseek | DEEPSEEK_API_KEY | deepseek-chat |
| gemini | GEMINI_API_KEY | gemini-1.5-flash |
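As a sketch of how these settings combine (the defaults mirror the table above; the function is illustrative, not the CLI's internal API):

```python
import os

# Default model per provider, mirroring the table above
DEFAULT_MODELS = {
    "ollama": "mistral",
    "openai": "gpt-4o",
    "anthropic": "claude-3-haiku-20240307",
    "mistral": "mistral-large-latest",
    "deepseek": "deepseek-chat",
    "gemini": "gemini-1.5-flash",
}

def resolve_settings(env=os.environ) -> tuple[str, str]:
    """Pick the provider (default: ollama); OPENSYNDROME_MODEL,
    if set, overrides the provider's default model."""
    provider = env.get("OPENSYNDROME_PROVIDER", "ollama")
    model = env.get("OPENSYNDROME_MODEL") or DEFAULT_MODELS[provider]
    return provider, model
```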
For Ollama, the model must be pulled before use: `ollama pull mistral`. You can also override the Ollama base URL with `OLLAMA_BASE_URL` (default: `http://localhost:11434`).
Ollama models tested: `llama3.2`, `mistral`, `deepseek-r1`. Known to not work well with structured output: `qwen2.5-coder`.
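For illustration, here is a rough sketch of how a structured-output request to a local Ollama server is shaped (payload construction only, nothing is sent; how opensyndrome itself builds its requests may differ). Ollama's `/api/chat` endpoint accepts a JSON Schema in the `format` field:

```python
import json
import os

def build_chat_request(prompt: str, schema: dict) -> tuple[str, bytes]:
    """Build (url, body) for a structured-output chat request to Ollama.
    OLLAMA_BASE_URL overrides the default local endpoint."""
    base = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434").rstrip("/")
    payload = {
        "model": os.environ.get("OPENSYNDROME_MODEL", "mistral"),
        "messages": [{"role": "user", "content": prompt}],
        "format": schema,  # a JSON Schema object constrains the reply shape
        "stream": False,
    }
    return f"{base}/api/chat", json.dumps(payload).encode()
```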
If you do not pass `-hr` or `-hf`, an editor will open for you to enter the definition.
```shell
# see some examples from ECDC: https://www.ecdc.europa.eu/en/all-topics/eu-case-definitions

# pass the definition as inline text
opensyndrome convert -hr "Any person with pneumonia"

# pass the definition from a TXT file
opensyndrome convert -hf definition.txt

# use a specific provider and model
opensyndrome convert -hr "Any person with pneumonia" --provider openai --model gpt-4o

# translate the JSON to a specific language and edit it right after conversion
opensyndrome convert --language "Português do Brasil" --edit

# include a validation step after conversion
opensyndrome convert --validate
```

To turn a JSON definition back into human-readable text:

```shell
opensyndrome humanize <path-to-json-file>
opensyndrome humanize <path-to-json-file> --provider anthropic
opensyndrome humanize <path-to-json-file> --model mistral-large-latest --language "Português do Brasil"
```

To validate a JSON definition:

```shell
opensyndrome validate <path-to-json-file>
```

To get started with development, you need to have `uv` installed.
```shell
uv sync
```

You only need to do the following if you are a maintainer adding a new OSI schema or updating an existing one.
Since Ollama requires a specific, simpler JSON format, we need to generate an Ollama-compatible schema.
To do this, we use `datamodel-code-generator` to generate a Pydantic schema. Run the following command to update it:

```shell
make ollama_schema
```

This creates a `schema.py` file in the root of the project. Be careful when editing this file manually.
If you find this repository helpful, feel free to cite our publication, *The Open Syndrome Definition*:
```bibtex
@misc{ferreira2025opensyndromedefinition,
  title={The Open Syndrome Definition},
  author={Ana Paula Gomes Ferreira and Aleksandar Anžel and Izabel Oliva Marcilio de Souza and Helen Hughes and Alex J Elliot and Jude Dzevela Kong and Madlen Schranz and Alexander Ullrich and Georges Hattab},
  year={2025},
  eprint={2509.25434},
  archivePrefix={arXiv},
  primaryClass={cs.AI},
  url={https://arxiv.org/abs/2509.25434},
}
```