This document explains how to run and understand the test suite for the combinatorial_codes package.
Run all tests:

```shell
python run_tests.py
# or
pytest tests/ -v
```

Run only the Milo example tests (the specific ones requested):

```shell
python run_tests.py --milo
```

The test suite is organized into several test classes:
Tests for the example codes and their expected properties.
- `test_milo_example_simplicial_violators`: ✅ Verifies that `C.simplicial_violators()` returns the expected array of 19 values
- `test_milo_example_obstructions`: ✅ Verifies that `C.Obstructions()` returns `(False, 17)`
- `test_milo_example_consistency`: Tests basic consistency properties
Basic functionality tests for other example codes (eyes, open not closed, closed not open).
Core functionality tests for the CombinatorialCode class.
Tests for randomly generated codes using bernoulli_random_code.
Tests for package-level features like C extension status checking.
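Randomized tests like the `bernoulli_random_code` ones stay deterministic only if they are seeded. A minimal sketch of that pattern, using a hypothetical stand-in generator (this is not the package's actual `bernoulli_random_code` implementation or signature):

```python
import random

def bernoulli_random_code(n_neurons, n_words, p, seed=None):
    """Stand-in generator (hypothetical, not the package's API):
    each codeword is a bitmask in which each neuron fires with probability p."""
    rng = random.Random(seed)
    return sorted({
        sum(1 << i for i in range(n_neurons) if rng.random() < p)
        for _ in range(n_words)
    })

def test_random_code_is_deterministic_under_seed():
    # Same seed -> identical code, so CI runs are reproducible.
    a = bernoulli_random_code(6, 20, 0.5, seed=42)
    b = bernoulli_random_code(6, 20, 0.5, seed=42)
    assert a == b, "seeded generation must be reproducible"
    # Every codeword fits in n_neurons bits.
    assert all(0 <= w < 2**6 for w in a)

test_random_code_is_deterministic_under_seed()
```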
The main tests you requested are:

- Simplicial Violators Test:

  ```python
  C = example_code("example by Milo")
  C.simplicial_violators() == array([4, 5, 8, 12, 16, 20, 32, 64, 65, 144, 256, 528, 1040, 1536, 2048, 2052, 2056, 2320, 18432], dtype=uint64)
  ```

- Obstructions Test:

  ```python
  C.Obstructions() == (False, 17)
  ```

Both of these tests PASS ✅
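One subtlety in the first check: `==` on a NumPy array is elementwise and returns a boolean array, so the test suite presumably collapses the comparison with `np.array_equal` (or `.all()`). A minimal illustration using the expected values above, with a copy standing in for the actual `C.simplicial_violators()` result:

```python
import numpy as np

expected = np.array([4, 5, 8, 12, 16, 20, 32, 64, 65, 144, 256, 528,
                     1040, 1536, 2048, 2052, 2056, 2320, 18432],
                    dtype=np.uint64)

# Stand-in for the real C.simplicial_violators() call.
result = expected.copy()

# `result == expected` alone is an elementwise boolean array, which is
# ambiguous in a bare `assert`; np.array_equal yields a single True/False.
assert np.array_equal(result, expected)
assert len(expected) == 19  # the "19 values" mentioned above
```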
```shell
# All tests
python run_tests.py

# Only Milo example tests
python run_tests.py --milo

# Fast tests only
python run_tests.py --fast

# With coverage report
python run_tests.py --coverage
```

```shell
# All tests
pytest tests/ -v

# Specific test
pytest tests/test_combinatorial_codes.py::TestExampleCodes::test_milo_example_simplicial_violators -v

# With coverage
pytest tests/ --cov=combinatorial_codes --cov-report=html
```

Sample output:

```
===================================== test session starts =====================================
platform darwin -- Python 3.13.3, pytest-8.4.1, pluggy-1.6.0
cachedir: .pytest_cache
rootdir: /Users/vui1/Documents/GitHub/combinatorial_codes
configfile: pytest.ini
plugins: cov-6.2.1
collected 12 items

tests/test_combinatorial_codes.py::TestExampleCodes::test_milo_example_simplicial_violators PASSED [  8%]
tests/test_combinatorial_codes.py::TestExampleCodes::test_milo_example_obstructions PASSED [ 16%]
tests/test_combinatorial_codes.py::TestExampleCodes::test_milo_example_consistency PASSED [ 25%]
...
================================ 12 passed, 1 warning in 2.09s ================================
```
The tests require:

- `pytest >= 6.0`
- `pytest-cov` (for coverage reports)
- The `combinatorial_codes` package installed in development mode

These are automatically installed with:

```shell
pip install -e ".[test]"
```

To add new tests:
- Add test methods to the appropriate test class in `tests/test_combinatorial_codes.py`
- Follow the naming convention: `test_*`
- Use descriptive docstrings
- Include both positive and negative test cases
- Use appropriate assertions with clear error messages
Example:

```python
def test_my_new_feature(self):
    """Test description of what this test verifies."""
    # Arrange
    C = example_code("eyes")
    # Act
    result = C.my_new_method()
    # Assert
    assert result == expected_value, f"Expected {expected_value}, got {result}"
```

The package includes both C extensions and Numba fallbacks. Tests verify:
- Both performance modes work correctly
- Results are consistent between C and Numba implementations
- Graceful fallback when C extensions are unavailable
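The fallback and consistency checks can be sketched with a try/except import pattern. This is an illustrative stand-alone example: the `_c_ext` module name and `popcount` function are assumptions, not the package's actual internals, and the pure-Python version acts as the reference implementation:

```python
def popcount_py(x):
    """Pure-Python fallback: number of set bits in a codeword bitmask."""
    count = 0
    while x:
        x &= x - 1  # clear the lowest set bit
        count += 1
    return count

try:
    # Hypothetical C extension; if it is unavailable, fall back gracefully.
    from combinatorial_codes._c_ext import popcount as popcount_fast
except ImportError:
    popcount_fast = popcount_py

def test_implementations_agree():
    # Whichever backend is active must match the reference implementation.
    for x in (0, 1, 2320, 18432, 2**64 - 1):
        assert popcount_fast(x) == popcount_py(x)

test_implementations_agree()
```

The same shape works for comparing C and Numba code paths: compute with both backends and assert equality on a shared set of inputs.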
This test suite is designed to work in CI environments. All tests should:
- Complete within reasonable time limits
- Not require external dependencies beyond those in `setup.py`
- Provide clear error messages when they fail
- Be deterministic (no random failures)