ThemisDB uses Google Test for unit and integration testing.

```bash
# CMake + CTest
cmake --build build --target test

# Or directly with CTest
cd build && ctest --output-on-failure

# Run specific test executable
./build/tests/storage_tests
./build/tests/aql_tests
./build/tests/vector_tests

# With Google Test filters
./build/tests/storage_tests --gtest_filter="RocksDB*"
```

Critical test suite:

```bash
# Windows
.\run_critical_tests.bat

# Linux
./run_critical_tests.sh
```

Extended test suite:

```bash
# Windows
.\run_extended_tests.bat

# Linux
./run_extended_tests.sh
```

Unit tests:

- Test individual components in isolation
- Fast execution (< 1 second per test)
- No external dependencies (mock/stub as needed)
- Location: `tests/unit/`

```cpp
TEST(VectorIndexTest, InsertAndSearch) {
    VectorIndex index(128);  // dimension
    std::vector<float> vec = generateRandomVector(128);
    index.insert(1, vec);

    auto results = index.search(vec, 5);  // top-5
    ASSERT_EQ(results[0].id, 1);
}
```

Integration tests:

- Test interaction between components
- May use real dependencies (RocksDB, etc.)
- Slower execution (< 10 seconds per test)
- Location: `tests/integration/`

```cpp
TEST(DatabaseIntegrationTest, QueryWithIndex) {
    Database db("test_db");
    db.createTable("users", schema);
    db.createIndex("users", "name");
    db.insert("users", user_data);

    auto results = db.query("SELECT * FROM users WHERE name = 'Alice'");
    ASSERT_EQ(results.size(), 1);
}
```

Benchmarks:

- Performance measurement
- Uses Google Benchmark
- Location: `benchmarks/`

```cpp
static void BM_VectorSearch(benchmark::State& state) {
    VectorIndex index(128);
    prepareIndex(index, 10000);  // 10k vectors
    auto query = generateRandomVector(128);

    for (auto _ : state) {
        auto results = index.search(query, 10);
        benchmark::DoNotOptimize(results);
    }
}
BENCHMARK(BM_VectorSearch);
```

When adding a new feature:
- Unit tests for core functionality
- Integration tests for component interaction
- Edge case tests for boundary conditions
- Error handling tests for failure modes
```cpp
// Example: Complete test coverage for a feature
TEST(FeatureTest, BasicFunctionality) { /* ... */ }
TEST(FeatureTest, EdgeCase_EmptyInput) { /* ... */ }
TEST(FeatureTest, EdgeCase_MaxSize) { /* ... */ }
TEST(FeatureTest, ErrorHandling_InvalidInput) { /* ... */ }
TEST(FeatureTest, ErrorHandling_OutOfMemory) { /* ... */ }
```

When fixing a bug:

- Regression test that reproduces the bug
- Test should fail before fix, pass after fix
```cpp
// Example: Regression test for bug #123
TEST(RegressionTest, Issue123_MemoryLeakInSnapshotCleanup) {
    // Setup that triggers the bug
    Database db;
    auto snapshot = db.createSnapshot();

    // This should not leak memory
    snapshot.reset();

    // Verify cleanup (using memory profiling tools)
    ASSERT_NO_MEMORY_LEAKS();
}
```

Code coverage:

```bash
# Configure with coverage enabled
cmake -B build -DCMAKE_BUILD_TYPE=Debug -DTHEMIS_ENABLE_COVERAGE=ON

# Build and run tests
cmake --build build
cd build && ctest

# Generate report (requires lcov/gcovr)
cmake --build build --target coverage
```

Coverage targets:

- Critical paths: 90%+ coverage
- Core modules: 80%+ coverage
- Overall project: 70%+ coverage
Priority for test coverage:
- Data integrity (storage, transactions)
- Query correctness (AQL parser, executor)
- Concurrency safety (MVCC, locks)
- API contracts (public interfaces)
Use fixtures for shared setup:

```cpp
class DatabaseTest : public ::testing::Test {
protected:
    void SetUp() override {
        db_ = std::make_unique<Database>(test_db_path_);
        db_->createTable("test_table", schema_);
    }

    void TearDown() override {
        db_.reset();
        cleanupTestData();
    }

    std::unique_ptr<Database> db_;
    std::string test_db_path_ = "/tmp/test_db";
    Schema schema_ = createTestSchema();
};

TEST_F(DatabaseTest, InsertAndQuery) {
    // Use db_ directly
    db_->insert("test_table", data);
    auto results = db_->query("SELECT * FROM test_table");
    ASSERT_EQ(results.size(), 1);
}
```

Test data:

- Small test data: store in `tests/data/`
- Generated data: create in the test's `SetUp()`/fixture
- Temporary data: use `/tmp/` or the OS temp directory
Use mocks for:

- External services (network, databases)
- Expensive operations (file I/O, computation)
- Non-deterministic behavior (time, random)
```cpp
using ::testing::_;
using ::testing::Return;

class MockVectorIndex : public VectorIndex {
public:
    MOCK_METHOD(SearchResults, search,
                (const std::vector<float>&, int),
                (override));
    MOCK_METHOD(void, insert,
                (int, const std::vector<float>&),
                (override));
};

TEST(QueryExecutorTest, UsesVectorIndex) {
    MockVectorIndex mock_index;
    QueryExecutor executor(&mock_index);

    EXPECT_CALL(mock_index, search(_, 10))
        .WillOnce(Return(mock_results));

    executor.execute("VECTOR SEARCH ...");
}
```

For long-running benchmarks, set the time unit and a minimum runtime:

```cpp
BENCHMARK(BM_InsertOneMillionRecords)
    ->Unit(benchmark::kMillisecond)
    ->MinTime(5.0);  // Run for at least 5 seconds
```

Document expected performance:
```cpp
// Target: < 10ms for 10k vector search
TEST(PerformanceTest, VectorSearchLatency) {
    auto start = std::chrono::high_resolution_clock::now();
    index.search(query, 10);
    auto end = std::chrono::high_resolution_clock::now();

    auto duration = std::chrono::duration_cast<std::chrono::milliseconds>(
        end - start
    ).count();

    EXPECT_LT(duration, 10) << "Search took " << duration << "ms";
}
```

Tests run automatically on:
- Every push to feature branches
- Every pull request
- Scheduled nightly builds
See `.github/workflows/ci.yml` for the CI setup.
```bash
# Run with timeout
ctest --timeout 30
```

Google Test has no built-in per-test timeout flag; to limit individual tests, set the CTest `TIMEOUT` property in CMake, e.g. `set_tests_properties(my_tests PROPERTIES TIMEOUT 10)`.

Flaky tests:

- Identify and fix race conditions
- Use deterministic test data
- Add retries only as last resort
Each test should be independent:
```cpp
// ✅ Good: Clean state for each test
TEST_F(DatabaseTest, Test1) {
    db_->insert(...);  // Fresh db_ from SetUp()
}

// ❌ Bad: Tests depend on execution order
TEST_F(DatabaseTest, Test2) {
    // Assumes Test1 already ran - BAD!
    db_->query(...);
}
```

For architecture-specific testing:
- Threading tests: Document in `docs/architecture.md`
- Concurrency tests: Document locking strategies
Best practices:

- Fast tests: Keep unit tests under 1 second
- Clear assertions: Use descriptive messages
- One concept per test: Test one thing at a time
- Arrange-Act-Assert: Structure tests clearly
- No test logic: Tests should be simple and obvious
```cpp
TEST(ExampleTest, DescriptiveName) {
    // Arrange: Set up test data
    Database db;
    auto data = createTestData();

    // Act: Perform the operation
    auto result = db.insert(data);

    // Assert: Verify the outcome
    ASSERT_TRUE(result.success);
    EXPECT_EQ(db.count(), 1);
}
```