Perl Template Roundup October 2010: Methodology
All benchmark data is generated using the Template::Benchmark Perl module, so if any of the following needs explanation it may be worth following that link and reading the documentation.
For each supported template engine and each template feature
supported by that engine, the benchmark_template_engines
script is run with only that template engine and feature enabled.
This helps to prevent the benchmarks from contaminating
each other.
For example, one engine uses $& as part of its parsing, which notoriously slows down every regexp in the entire perl process once it has been encountered.
The other benefit is that it makes it easier for me to interrupt the benchmark generation and resume where I left off. That's fairly important, since it takes 4 CPU-days to run all the benchmarks and, if run contiguously, that's going to clash with daily backups and other busy periods on the machine.
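The run-one-combination-at-a-time, skip-what's-done approach described above can be sketched roughly as follows. This is a Python sketch, not the actual driver: the engine and feature names are invented, and `run_one_benchmark` is a hypothetical stand-in for spawning `benchmark_template_engines` with a single engine and feature enabled.

```python
import json
from pathlib import Path

def run_all(engines, features, outdir, run_one_benchmark):
    """Run each (engine, feature) combination as its own isolated step,
    skipping any combination that already has a saved result so that an
    interrupted multi-day run can resume where it left off."""
    outdir = Path(outdir)
    outdir.mkdir(parents=True, exist_ok=True)
    for engine in engines:
        for feature in features.get(engine, []):
            outfile = outdir / f"{engine}--{feature}.json"
            if outfile.exists():
                # Already benchmarked on a previous (interrupted) run.
                continue
            result = run_one_benchmark(engine, feature)
            outfile.write_text(json.dumps(result))
```

Because each result lands in its own file as soon as it is produced, killing the driver loses at most the combination currently in flight.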
Each benchmark is run for a duration of 20 CPU-seconds, which appears to keep random fluctuations under the 10% mark. How far under 10% I'm afraid I don't know; I've not checked exhaustively, and I'd rather be vague than misleading.
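The idea of timing for a fixed CPU budget, rather than a fixed iteration count, is what averages out the noise. A minimal Python sketch of that idea (with the budget as a parameter; the 20-second figure above would be passed in):

```python
import time

def rate_for_cpu_budget(func, budget_seconds):
    """Call func repeatedly until roughly budget_seconds of CPU time has
    been consumed, then return the rate in iterations per CPU-second.
    A longer budget averages out more run-to-run fluctuation."""
    iterations = 0
    start = time.process_time()  # CPU time, not wall-clock time
    while (elapsed := time.process_time() - start) < budget_seconds:
        func()
        iterations += 1
    return iterations / elapsed
```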
Each benchmark comes in 1-, 15- and 30-repeat versions. "Repeats" refers to how many times the given template feature is used in the benchmark template, so for scalar_variable that means 1, 15 or 30 variables are replaced in the template.
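In other words, the repeats value just controls how many copies of the feature snippet get pasted into the benchmark template. A sketch, using a made-up `[% ... %]` placeholder syntax purely for illustration:

```python
def build_template(snippet, repeats):
    """Build a benchmark template containing `repeats` copies of the
    feature under test, one per line, each with a distinct variable name."""
    return "\n".join(snippet.format(n=i) for i in range(1, repeats + 1))

# A scalar_variable-style feature snippet (placeholder syntax is invented):
SCALAR_VARIABLE = "[% scalar_variable_{n} %]"
```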
Using the --json option to benchmark_template_engines, the results of each benchmark are saved as a JSON structure in a file.
Eventually, 4+ days later, when all the benchmarks have run, I have a directory full of .json files, and the next step... well, the next step after taking a backup, so I don't lose all that computing time, is to collate the data from all those separate files into one big monolithic dataset.
Then that dataset is sliced in various directions to produce the data used in the charts.
This data is then saved as a single report.json file, so that I don't have to repeat the 5-6 minutes of collation, sorting, slicing and so on.
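The collate-then-cache step might look something like the following sketch (the file layout and the decision to store the dataset as a plain JSON list are assumptions for illustration; the real collation and slicing is more involved):

```python
import json
from pathlib import Path

def collate(results_dir, report_file):
    """Merge every per-benchmark .json file into one dataset, caching the
    merged result in report_file so the expensive collation step only
    happens once; later runs just reload the cached report."""
    report_file = Path(report_file)
    if report_file.exists():
        # Reuse the cached report rather than re-collating everything.
        return json.loads(report_file.read_text())
    dataset = [json.loads(p.read_text())
               for p in sorted(Path(results_dir).glob("*.json"))]
    report_file.write_text(json.dumps(dataset))
    return dataset
```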
And finally, after all that, a script turns that report.json file into the pretty HTML pages and Google Charts that you see on this site.