A pytest plugin that allows you to control the order in which tests run based on their tags (markers) or fixtures. Perfect for coordinating test execution in CI/CD pipelines, managing test dependencies, and optimizing test suite performance.
- 🎯 Tag-based Ordering: Order tests by pytest markers (fast → slow → integration)
- 🔧 Fixture-based Ordering: Order tests by the fixtures they use (db → redis → cache)
- ⚡ Unmatched Test Handling: Control how tests without matching tags/fixtures are handled
- 🚫 Skip Unmatched Tests: Option to skip tests that don't match the order list entirely
- 🛡️ Error Validation: Automatic validation of fixture availability for reliable ordering
- 📊 Performance Optimized: Minimal overhead with efficient sorting algorithms
- 🔍 Comprehensive Testing: Full test suite with unit tests and integration tests
- 🎭 Interactive Demo: Built-in demo showing all features with detailed logging
```shell
pip install pytest-conductor
```

See pytest-conductor in action with our interactive demo:
```shell
# Clone the repository
git clone https://github.com/your-username/pytest-conductor.git
cd pytest-conductor

# Run the interactive demo
hatch run demo
```

The demo shows:
- Tag ordering (fast → slow → integration)
- Fixture ordering (basic_calculator → advanced_calculator → sample_data)
- Unmatched test handling (first, last, none)
- Error handling for invalid configurations
The repository includes a complete example project demonstrating real-world usage:
```shell
# Navigate to the example project
cd example

# Install dependencies and pytest-conductor
hatch run pip install -e ../

# Run tests with coordination
hatch run pytest --tag-order fast slow -v
hatch run pytest --fixture-order basic_calculator advanced_calculator -v
```

The example project includes:
- Calculator application with basic and advanced operations
- Comprehensive test suite with different tags and fixtures
- Global and local fixtures demonstrating scope validation
- Integration tests showing all plugin features
The plugin supports two mutually exclusive ordering methods:
- Tag Ordering: Order tests by their pytest markers/tags
- Fixture Ordering: Order tests by the fixtures they use
Note: `--tag-order` and `--fixture-order` are mutually exclusive. You can only specify one of them at a time.
Use the `--tag-order` option to specify the order in which tags should run:

```shell
pytest --tag-order fast slow integration
```

This will run all tests with the `fast` tag first, then `slow` tests, then `integration` tests.
Use the `--fixture-order` option to specify the order in which fixtures should run:

```shell
pytest --fixture-order db redis cache
```

This will run all tests that use the `db` fixture first, then tests using `redis`, then tests using `cache`.
The plugin includes robust error handling to prevent configuration issues:
When using fixture ordering, the plugin validates that all specified fixtures are available to all tests:
```shell
# This will fail if 'nonexistent_fixture' is not available to all tests
pytest --fixture-order nonexistent_fixture
```

Error Message:

```
ValueError: Fixtures not available to all tests: nonexistent_fixture. Fixture ordering requires all fixtures to be globally available. Make sure these fixtures are defined in a conftest.py file that is accessible to all tests, or use tag ordering instead.
```
The plugin will raise an error if you try to use both `--tag-order` and `--fixture-order` at the same time:

```shell
# This will fail
pytest --tag-order fast slow --fixture-order db redis
```

Error Message:

```
ValueError: --tag-order and --fixture-order are mutually exclusive. Please specify only one of them.
```
- Use global fixtures defined in a root-level `conftest.py` file
- Use tag ordering for tests with local or conditional fixtures
- Test your configuration with `--collect-only` to catch issues early
Use the `--unmatched-order` option to control how tests without matching tags/fixtures are handled:

```shell
# Run unmatched tests first
pytest --tag-order fast slow --unmatched-order first
pytest --fixture-order db redis --unmatched-order first

# Run unmatched tests last
pytest --tag-order fast slow --unmatched-order last
pytest --fixture-order db redis --unmatched-order last

# Run unmatched tests in any order (default)
pytest --tag-order fast slow --unmatched-order any
pytest --fixture-order db redis --unmatched-order any

# Skip unmatched tests entirely
pytest --tag-order fast slow --unmatched-order none
pytest --fixture-order db redis --unmatched-order none
```

The `--unmatched-order none` option is a new feature that allows you to skip tests that don't match your specified order list entirely. This is useful when you want to run only a specific subset of tests:
```shell
# Run only fast and slow tests, skip all others
pytest --tag-order fast slow --unmatched-order none

# Run only tests using specific fixtures, skip all others
pytest --fixture-order db redis --unmatched-order none
```

Example test file for tag-based ordering:

```python
import pytest

@pytest.mark.fast
def test_fast_operation():
    """This test will run first when using --tag-order fast slow"""
    assert True

@pytest.mark.slow
def test_slow_operation():
    """This test will run second when using --tag-order fast slow"""
    assert True

def test_no_tags():
    """This test has no tags - behavior depends on --unmatched-order"""
    assert True

@pytest.mark.fast
@pytest.mark.slow
def test_multiple_tags():
    """This test has multiple tags - uses the first one in the order"""
    assert True
```

Example test file for fixture-based ordering:

```python
import pytest

@pytest.fixture
def db():
    """Database fixture."""
    return {"type": "database"}

@pytest.fixture
def redis():
    """Redis fixture."""
    return {"type": "redis"}

def test_db_operation(db):
    """This test will run first when using --fixture-order db redis"""
    assert db["type"] == "database"

def test_redis_operation(redis):
    """This test will run second when using --fixture-order db redis"""
    assert redis["type"] == "redis"

def test_no_fixtures():
    """This test has no fixtures - behavior depends on --unmatched-order"""
    assert True

def test_multiple_fixtures(db, redis):
    """This test has multiple fixtures - uses the first one in the order"""
    assert db["type"] == "database"
    assert redis["type"] == "redis"
```

- `--tag-order TAG1 TAG2 ...`: Specify the order of tags for test execution
- `--fixture-order FIXTURE1 FIXTURE2 ...`: Specify the order of fixtures for test execution
- `--unmatched-order {any,first,last,none}`: How to handle tests without matching tags/fixtures
  - `any`: Run unmatched tests in any order (default)
  - `first`: Run unmatched tests before tagged/fixture tests
  - `last`: Run unmatched tests after tagged/fixture tests
  - `none`: Skip unmatched tests entirely
Note: `--tag-order` and `--fixture-order` are mutually exclusive. You can only specify one of them at a time.
- The plugin extracts tags from test markers (`pytest.mark`)
- Tests are sorted based on the specified tag order
- Tests with multiple tags use the highest-priority tag (first in the order)
- Tests without tags are handled according to the `--unmatched-order` setting
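The sorting described above can be sketched in a few lines of Python. This is a hypothetical illustration of the idea (the function and test names are invented), not the plugin's actual implementation:

```python
# Illustrative sketch of tag-based ordering: each test is keyed by the
# position of its highest-priority tag in the requested order; tests
# without a matching tag fall into a separate "unmatched" bucket.
def tag_sort_key(tags, tag_order, unmatched="last"):
    matched = [tag_order.index(t) for t in tags if t in tag_order]
    if matched:
        return (1, min(matched))  # best (earliest) tag wins
    # unmatched bucket: before (0) or after (2) all matched tests
    return (0, 0) if unmatched == "first" else (2, 0)

tests = {
    "test_multi": {"fast", "slow"},  # multiple tags -> 'fast' wins
    "test_slow": {"slow"},
    "test_plain": set(),             # no tags -> unmatched
}
order = ["fast", "slow", "integration"]
ranked = sorted(tests, key=lambda name: tag_sort_key(tests[name], order))
print(ranked)  # ['test_multi', 'test_slow', 'test_plain']
```

With `unmatched="first"`, `test_plain` would instead sort ahead of both tagged tests.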
- The plugin extracts fixture names from test function parameters
- Tests are sorted based on the specified fixture order
- Tests with multiple fixtures use the highest-priority fixture (first in the order)
- Tests without fixtures are handled according to the `--unmatched-order` setting
When a test has multiple tags or fixtures that are in your specified order, the plugin will:
- Run the test only once (no duplication)
- Use the highest priority tag/fixture (the one that appears first in your order list)
```python
@pytest.mark.fast
@pytest.mark.slow
def test_multiple_tags():
    """This test has both 'fast' and 'slow' tags."""
    assert True
```

When running `pytest --tag-order fast slow integration`, this test will:
- Run once (not twice)
- Run in the `fast` group (since 'fast' comes first in the order)
```python
def test_multiple_fixtures(db, redis, cache):
    """This test uses multiple fixtures."""
    assert True
```

When running `pytest --fixture-order db redis cache`, this test will:
- Run once (not multiple times)
- Run in the `db` group (since 'db' comes first in the order)
The plugin handles fixtures defined in `conftest.py` files the same way as regular fixtures:
- Global conftest fixtures (in the root `conftest.py`) are detected normally
- Nested conftest fixtures (in subdirectory `conftest.py` files) are also detected
- The plugin looks at the test function's parameter names, regardless of where the fixture is defined

All fixtures in your `--fixture-order` list must be available to all tests that might use them. The plugin will raise an error if any fixture in your order list is not available to all tests, ensuring reliable ordering behavior.
```
tests/
├── conftest.py              # global_fixture
├── unit/
│   ├── conftest.py          # unit_fixture
│   └── test_unit.py         # uses both global_fixture and unit_fixture
└── integration/
    ├── conftest.py          # integration_fixture
    └── test_integration.py  # uses global_fixture and integration_fixture
```
When running `pytest --fixture-order global_fixture unit_fixture integration_fixture`:
- Tests in `unit/` will run first if they use `global_fixture` or `unit_fixture`
- Tests in `integration/` will run first if they use `global_fixture` or `integration_fixture`
- The plugin doesn't need to know where the fixture is defined - it just looks at the test parameters
For deeply nested directory structures, the plugin works seamlessly:
```
tests/
├── conftest.py                # global_fixture
├── api/
│   ├── conftest.py            # api_fixture
│   ├── v1/
│   │   ├── conftest.py        # v1_fixture
│   │   └── test_endpoints.py  # uses global_fixture, api_fixture, v1_fixture
│   └── v2/
│       ├── conftest.py        # v2_fixture
│       └── test_endpoints.py  # uses global_fixture, api_fixture, v2_fixture
└── database/
    ├── conftest.py            # db_fixture
    ├── mysql/
    │   ├── conftest.py        # mysql_fixture
    │   └── test_queries.py    # uses global_fixture, db_fixture, mysql_fixture
    └── postgresql/
        ├── conftest.py        # postgres_fixture
        └── test_queries.py    # uses global_fixture, db_fixture, postgres_fixture
```
Key Points:
- No special configuration needed - the plugin automatically detects all fixtures used by tests
- Fixture scope doesn't matter - whether fixtures are session, module, class, or function scope
- Conftest inheritance works - fixtures from parent directories are available to child tests
- Ordering is based on test parameters - not fixture definitions
Best Practices for Complex Fixture Structures:
- Use descriptive fixture names that indicate their purpose (e.g., `mysql_db`, `redis_cache`)
- Consider using fixture prefixes to group related fixtures (e.g., `api_v1_client`, `api_v2_client`)
- When ordering by fixtures, list them in the order you want tests to run
Tests that don't have any of the specified tags or fixtures are handled according to the `--unmatched-order` setting:

- `any` (default): Run unmatched tests in any order
- `first`: Run unmatched tests before all tagged/fixture tests
- `last`: Run unmatched tests after all tagged/fixture tests
- `none`: Skip unmatched tests entirely (they won't run)
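The four policies can be summarized with a small sketch (hypothetical helper and test names, not the plugin's actual code):

```python
# Given tests already split into a matched (ordered) group and an
# unmatched group, each --unmatched-order policy places the unmatched
# group differently.
def apply_policy(matched, unmatched, policy):
    if policy == "first":
        return unmatched + matched
    if policy == "none":
        return matched  # unmatched tests are deselected entirely
    # "last" pins unmatched after matched; "any" makes no guarantee,
    # shown here as the same concatenation for simplicity.
    return matched + unmatched

matched = ["test_fast", "test_slow"]
unmatched = ["test_no_tags"]
print(apply_policy(matched, unmatched, "first"))
# ['test_no_tags', 'test_fast', 'test_slow']
print(apply_policy(matched, unmatched, "none"))
# ['test_fast', 'test_slow']
```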
The plugin guarantees that:
- No test runs twice - even with multiple matching tags/fixtures
- Tests run in the specified order - within each priority group
- Unmatched tests are handled predictably - based on your `--unmatched-order` setting
- Fixture dependencies are respected - pytest's own fixture ordering still applies
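The within-group guarantee is exactly what a stable sort over the collected tests provides, as this standalone sketch shows (hypothetical test names; not the plugin's code):

```python
# Python's sorted() is stable: tests with equal keys keep their original
# collection order, so each priority group preserves file order.
collected = ["test_a_slow", "test_b_fast", "test_c_slow", "test_d_fast"]
tag_of = {"test_a_slow": "slow", "test_b_fast": "fast",
          "test_c_slow": "slow", "test_d_fast": "fast"}
order = ["fast", "slow"]

ranked = sorted(collected, key=lambda t: order.index(tag_of[t]))
print(ranked)
# ['test_b_fast', 'test_d_fast', 'test_a_slow', 'test_c_slow']
```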
```shell
pytest --tag-order fast slow
pytest --tag-order unit integration --unmatched-order last
pytest --tag-order smoke full --unmatched-order last
pytest --fixture-order db cache
pytest --fixture-order api db --unmatched-order last
pytest --fixture-order simple expensive --unmatched-order first
```

The plugin includes comprehensive tests for edge cases. To verify the behavior:
```shell
# Run the edge case test suite
pytest src/unit_tests/test_edge_cases.py -v

# Run practical examples
pytest src/unit_tests/test_edge_case_examples.py -v

# Test that tests with multiple tags run only once
pytest src/unit_tests/test_edge_case_examples.py --tag-order fast slow integration -v

# Test that tests with multiple fixtures run only once
pytest src/unit_tests/test_edge_case_examples.py --fixture-order db redis cache -v

# Test unmatched tests running first
pytest src/unit_tests/test_edge_case_examples.py --tag-order fast slow --unmatched-order first -v

# Test unmatched tests running last
pytest src/unit_tests/test_edge_case_examples.py --tag-order fast slow --unmatched-order last -v
```

```shell
# Clone the repository
git clone https://github.com/your-username/pytest-conductor.git
cd pytest-conductor

# Install in development mode
pip install -e .
```

The project uses a comprehensive test structure:
```
src/
├── pytest_conductor/        # Main plugin code
├── unit_tests/              # Unit tests for the plugin itself
│   ├── test_tag_ordering.py
│   ├── test_fixture_ordering.py
│   ├── test_fixture_validation.py
│   ├── test_unmatched_none.py
│   └── ...                  # (8 test files total)
└── integration_tests/       # Integration tests using the example project
    ├── test_pytest_conductor_integration.py
    └── README.md
```
```shell
# Run unit tests only (test the plugin's core functionality)
hatch run unit-tests

# Run integration tests only (test with real-world example project)
hatch run integration-tests

# Run all tests
hatch run unit-tests && hatch run integration-tests

# Run the interactive demo (shows test coordination with detailed logging)
hatch run demo
```

```shell
# Navigate to example project
cd example

# Install pytest-conductor in the example environment
hatch run pip install -e ../

# Run example tests with coordination
hatch run pytest --tag-order fast slow -v
hatch run pytest --fixture-order basic_calculator advanced_calculator -v
```

```
pytest-conductor/
├── src/
│   ├── pytest_conductor/      # Main plugin code
│   ├── unit_tests/            # Unit tests for the plugin
│   └── integration_tests/     # Integration tests with example project
├── example/                   # Complete example project
│   ├── src/
│   │   ├── calculator/        # Example application
│   │   └── tests/             # Example tests with various tags/fixtures
│   └── pyproject.toml         # Example project configuration
├── pyproject.toml             # Main project configuration
└── README.md                  # This file
```
To avoid warnings about unknown markers, you can register your custom markers in your `pyproject.toml` or `pytest.ini` file:
```toml
[tool.pytest.ini_options]
markers = [
    "fast: marks tests as fast",
    "slow: marks tests as slow",
    "integration: marks tests as integration tests",
    "unit: marks tests as unit tests",
]
```

Or in `pytest.ini`:

```ini
[pytest]
markers =
    fast: marks tests as fast
    slow: marks tests as slow
    integration: marks tests as integration tests
    unit: marks tests as unit tests
```