Trust, and verify
Sometimes people just make up stuff because the facts don't fit the story they want to tell you.
You will probably come across things people say about the benchmarks game. Do they show that what they claim is true? If they provide a URL, does the linked content actually confirm what they say?
Perhaps they are genuinely mistaken. Perhaps the content changed.
Wtf kind of benchmark counts the jvm startup time?
Compare the times against [pdf] Java Microbenchmark Harness reports:
23.367 ± 0.062
4.061 ± 0.054
0.112 ± 0.001
JVM start-up, JIT compilation, OSR, etc. take effect quickly; for most of these workloads, a cold versus warmed-up comparison shows only a minuscule difference. Note the exception.
In stark contrast to the traditional expectation of warmup, some benchmarks exhibit slowdown, where the performance of in-process iterations drops over time.
It warmed up. But it still runs at least 65x slower.
– That's why super-optimization…
No. Repeated measurements of the Java pi-digits program without restarting the JVM did not make the Java program 65x slower.
That really would have been ridiculous. The JavaOne expert's slides mysteriously fail to show the sum being divided by 65 to give an average.
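The arithmetic slip is easy to reproduce. Here is a minimal sketch (the workload and names are hypothetical, not taken from the slides): time a toy task 65 times inside one JVM, and note that reporting the accumulated total instead of the per-run average makes the program look exactly 65x slower.

```java
// Hypothetical sketch: 65 repeated in-JVM measurements of a toy workload.
// Reporting the summed total instead of the average inflates the result
// by exactly the repetition count.
public class RepeatedTiming {
    // Stand-in workload; not the actual pi-digits program.
    static long workload() {
        long acc = 0;
        for (int i = 0; i < 1_000_000; i++) acc += i;
        return acc;
    }

    public static void main(String[] args) {
        final int runs = 65;
        long total = 0;
        for (int i = 0; i < runs; i++) {
            long t0 = System.nanoTime();
            workload();
            total += System.nanoTime() - t0;
        }
        double average = (double) total / runs;
        // total / average == 65: the "65x slower" figure falls out of
        // forgetting this one division.
        System.out.printf("total=%dns average=%.0fns ratio=%.1fx%n",
                total, average, total / average);
    }
}
```

The point is not the timing itself but the bookkeeping: the ratio of the summed total to the per-run average is the repetition count, by construction.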
…to dispute a decision you basically need to pray the maintainer reopens it for some reason.
No. Followup comments could always be made in the project ticket tracker. There was a public discussion forum, etc. etc.
A brilliant hack was rejected; someone took the opportunity to push traffic to their personal blog.
There's a reason they call it a game.
The name "benchmarks game" signifies nothing more than the fact that programmers contribute programs that compete (while trying to remain comparable) for fun, not money.
It's what you make of it.
From April 2017 through March 2018, Google Analytics showed 477,419 users.
Popular enough that many web search results show web spam - be careful!
[chart: unique page views for go.html, python.html, etc., March 2018]
Once upon a time…
Doug Bagley had a burst of crazy curiosity:
When I started this project, my goal was to compare all the major scripting languages. Then I started adding in some compiled languages for comparison…
That project was abandoned in 2002, restarted in 2004 by Brent Fulgham, continued from 2008 by Isaac Gouy, and interrupted in 2018 by the Debian Alioth hosting service EOL. Everything has changed; several times.