All contributions are highly appreciated! Start by forking the repository on GitHub and setting up Deepnote Toolkit for local development.
mise automatically manages Python, Java, and other tool versions:
- Install mise (see the mise "Getting started" guide).
- Run the setup tasks:

  ```bash
  mise install    # Installs Python 3.12 and Java 17
  mise run setup  # Installs dependencies and pre-commit hooks
  ```
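The two tasks above rely on the repository's mise configuration. As an illustrative sketch only (the repository's actual `mise.toml` may declare these differently), such tasks are typically defined like this:

```toml
# Illustrative sketch -- not the repository's actual file
[tools]
python = "3.12"
java = "17"

[tasks.setup]
description = "Install dependencies and pre-commit hooks"
run = ["poetry install", "poetry poe setup-hooks"]
```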
If you prefer a manual setup instead of mise:

- Install Poetry (see the Poetry "Installation" guide).
- Install Java 17 (required for PySpark tests):
  - macOS: `brew install openjdk@17`
  - Ubuntu/Debian: `sudo apt-get install openjdk-17-jdk`
  - RHEL/Fedora: `sudo dnf install java-17-openjdk-devel`
- Set up a virtual environment for the development package:

  ```bash
  # If Python 3.10 is available, point Poetry to it
  $ poetry env use 3.10
  ```

- Verify the virtual environment location:

  ```bash
  $ poetry env info
  ```

- Install dependencies:

  ```bash
  $ poetry install
  ```

- Install the Poe the Poet Poetry plugin:

  ```bash
  $ poetry self add 'poethepoet[poetry_plugin]'
  ```

- Install pre-commit hooks:

  ```bash
  $ poetry poe setup-hooks
  ```

- Verify the installation:

  ```bash
  $ poetry poe lint
  $ poetry poe format
  ```

Troubleshooting:

- If `poetry install` fails with the error `library 'ssl' not found`:

  ```bash
  $ env LDFLAGS="-I/opt/homebrew/opt/openssl/include -L/opt/homebrew/opt/openssl/lib" poetry install
  ```

- If `poetry install` fails while installing `pymssql`, install `freetds` via Homebrew (`brew install freetds`).
Tests run against all supported Python versions using nox in Docker for reproducible environments.
Using mise:

```bash
# Run unit tests (no coverage by default)
mise run test

# Run unit tests with coverage
mise run test:coverage

# Run tests quickly without nox/coverage overhead
mise run test:quick tests/unit/test_file.py
mise run test:quick tests/unit/test_file.py::TestClass::test_method -v

# Pass custom arguments (including --coverage)
mise run test -- --coverage tests/unit/test_file.py
```

Using nox directly:

```bash
# Run unit tests without coverage
poetry run nox -s unit

# Run unit tests with coverage
poetry run nox -s unit -- --coverage

# Run a specific test file
poetry run nox -s unit -- tests/unit/test_file.py
```

Using the Docker test scripts:

```bash
# Run unit tests
TEST_TYPE="unit" TOOLKIT_VERSION="local-build" ./bin/test

# Run integration tests
TEST_TYPE="integration" TOOLKIT_VERSION="local-build" TOOLKIT_INDEX_URL="http://localhost:8000" ./bin/test

# Or use the test-local script for both unit and integration tests
./bin/test-local

# Run a specific file with test-local
./bin/test-local tests/unit/test_file.py

# ... or a specific test
./bin/test-local tests/unit/test_file.py::TestClass::test_method
```

- Kernel dependencies: add them to `[tool.poetry.dependencies]` in `pyproject.toml`.
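For illustration, a kernel dependency entry might look like this (the `pandas` line and its version constraint are examples, not the repository's actual contents):

```toml
[tool.poetry.dependencies]
pandas = ">=2.0.0"  # example: ends up in the kernel bundle, available in notebooks
```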
```bash
# Add a package to the kernel bundle (available in notebooks)
$ poetry add pandas

# Add a package with a specific version
$ poetry add "pandas>=2.0.0"

# Add a development dependency
$ poetry add --group dev pytest
```

After adding dependencies, run the tests to verify compatibility:

```bash
$ ./bin/test-local
```

To develop deepnote-toolkit against a locally running Deepnote Cloud with hot-reload:
- Build the local development image:

  ```bash
  docker build -t deepnote/jupyter-for-local:local -f ./dockerfiles/jupyter-for-local-hotreload/Dockerfile .
  ```

- Set the `DEEPNOTE_TOOLKIT_SOURCE_PATH` environment variable to point at the folder containing the toolkit source. It can go either in `.zshrc` (or the equivalent file for your shell) or be set per shell session with `export DEEPNOTE_TOOLKIT_SOURCE_PATH=...`. If it is not set, Deepnote Cloud will try to resolve it to `../deepnote-toolkit` relative to the Deepnote Cloud root folder.
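For example, to persist it in `.zshrc` (the path below is a placeholder; point it at wherever you cloned the toolkit):

```shell
# Placeholder path; replace with your actual deepnote-toolkit checkout
export DEEPNOTE_TOOLKIT_SOURCE_PATH="$HOME/code/deepnote-toolkit"
```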
- In the Deepnote Cloud repository, run:

  ```bash
  pnpm dev:app:local-toolkit
  ```

  This mounts your toolkit source into the container and installs it in editable mode. Toolkit module code changes are reflected after a kernel restart (use the "Restart kernel" action in Deepnote Cloud).
Each PR creates a review application for testing. Access it via GitHub checks. Monitor logs in Grafana:
```
{pod="p-PROJECT_ID", container="notebook"}
```
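LogQL line filters can narrow the stream further; for instance (`|=` is a standard LogQL line filter, and `PROJECT_ID` remains a placeholder):

```
{pod="p-PROJECT_ID", container="notebook"} |= "error"
```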
We use Docker to ensure reproducible environments due to Jupyter libraries' binary dependencies:
- `builder.Dockerfile`: creates Python package bundles for each supported version (3.10–3.13), generates the kernel and server bundles, and packages the toolkit for distribution using Poetry.
- `test.Dockerfile`: provides a consistent test environment for running unit and integration tests across Python versions using nox. Used both locally and in the CI/CD pipeline.
- `jupyter-for-local.Dockerfile`: creates a development environment with Jupyter integration, used for local development from the docker-compose setup in Deepnote Cloud.
- `jupyter-for-local-hotreload.Dockerfile`: creates a development environment that expects the toolkit source to be mounted at `/toolkit`. Used by Deepnote employees for development against a locally running Deepnote Cloud.
To release a new version to production:
- Merge your changes to main. This will automatically trigger a GitHub Actions workflow that runs the test suite and a staging deployment.
- Trigger a new GitHub Release in the GitHub UI.
- Monitor the GitHub Actions workflows and ensure a successful production deployment.
Note: The production release pipeline automatically creates two PRs in the ops and app-config repositories:
- A staging PR that updates staging values and is auto-merged
- A production PR that updates production values and requires manual approval and merge
Important: Always test the changes in the staging environment before approving and merging the production PR to ensure everything works as expected.