Benchmarking MVP #1443

@mattias-p

Description

We should have a basic benchmark framework for tracking performance variation across versions. Additional features should be added as follow-ups.

  • The benchmark framework should include:
    • A set of name servers serving a set of zones
    • A CLI tool that runs Zonemaster Engine on a set of zones and produces a report
    • A list of zones to run the CLI tool on
    • Instructions for running the benchmark
  • The report should be in a machine-readable format and should include:
    • Benchmark start time
    • Machine hostname
    • Zonemaster Engine version
    • Average per-zone duration
    • Individual per-zone durations
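
The report fields above could be produced by a small wrapper around the engine run. Below is a minimal sketch in Python; the `run_zone` helper is a hypothetical placeholder for whatever actually invokes Zonemaster Engine on one zone (it only simulates work here), and the engine version string would be supplied by the real harness.

```python
import json
import socket
import time
from datetime import datetime, timezone

def run_zone(zone):
    """Hypothetical placeholder: invoke Zonemaster Engine on one zone.

    A real harness would call the engine here; this stub only
    simulates work so the report shape can be demonstrated.
    """
    time.sleep(0.01)

def benchmark(zones, engine_version):
    """Run the engine on each zone and build a machine-readable report."""
    start_time = datetime.now(timezone.utc).isoformat()
    durations = {}
    for zone in zones:
        t0 = time.monotonic()
        run_zone(zone)
        durations[zone] = time.monotonic() - t0
    return {
        "start_time": start_time,
        "hostname": socket.gethostname(),
        "engine_version": engine_version,
        "average_duration": sum(durations.values()) / len(durations),
        "zone_durations": durations,
    }

if __name__ == "__main__":
    # Zone names and version string are illustrative only.
    report = benchmark(["example.com", "example.net"], "ENGINE-VERSION")
    print(json.dumps(report, indent=2))
```

JSON keeps the report machine-readable while staying easy to diff across versions; the per-zone durations make it possible to spot regressions in individual zones rather than only in the average.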

Perhaps the test-zone-data could be reused for this purpose. While that data is certainly not representative of the real world, we do control it, and it is readily available.

N.B., MVP is an abbreviation of Minimum Viable Product.
