Nearly All Mobile Device Makers Cheat on Benchmarks, Except Apple and Motorola
Following Tuesday's report that Samsung artificially inflates its benchmarking scores, well-respected hardware review site AnandTech has published evidence suggesting nearly all mobile manufacturers, with the exception of Apple and Motorola, use CPU/GPU optimizations to game benchmark tests.
Samsung and other OEMs use a variety of methods to boost device performance when a benchmark is detected. For example, on the Galaxy S 4 Samsung raised its thermal limits (and maximum GPU frequency) to gain an edge on certain benchmarks, and also drove its CPU voltage/frequency to the highest state whenever a benchmark was detected, a tactic also employed by other manufacturers including LG, HTC, and ASUS.
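The tactic described above amounts to checking whether the running app matches a known benchmark and, if so, switching the CPU into its fastest state. A minimal sketch of that logic might look like the following; the package names and governor strings here are illustrative assumptions, not code taken from any actual OEM firmware:

```python
# Hypothetical sketch of the benchmark-detection tactic described above.
# The benchmark package names and governor names are illustrative only.

KNOWN_BENCHMARKS = {
    "com.antutu.ABenchMark",          # AnTuTu
    "com.glbenchmark.glbenchmark27",  # GLBenchmark
    "com.quicinc.vellamo",            # Vellamo
}

def governor_for(package_name: str) -> str:
    """Return the CPU frequency governor an OEM-style hook might select."""
    if package_name in KNOWN_BENCHMARKS:
        # Pin the CPU at its maximum frequency for the benchmark's run.
        return "performance"
    # Ordinary apps get the default power-saving governor.
    return "ondemand"
```

Because the check keys on specific package names, renaming or repackaging a benchmark sidesteps the optimization entirely, which is why AnandTech can expose the behavior by comparing scores from stock and disguised builds of the same test.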
In the table below, AnandTech highlights devices that detect benchmarks and immediately respond with maximum CPU frequency.
With the exception of Apple and Motorola, literally every single OEM we’ve worked with ships (or has shipped) at least one device that runs this silly CPU optimization. It’s possible that older Motorola devices might’ve done the same thing, but none of the newer devices we have on hand exhibited the behavior. It’s a systemic problem that seems to have surfaced over the last two years, and one that extends far beyond Samsung.
AnandTech notes that discovering which devices have optimized for which benchmarks is a continual "cat and mouse" game, since benchmarks known to be targeted must then be avoided.
The only realistic solution is to continue to evolve the suite ahead of those optimizing for it. The more attention you draw to certain benchmarks, the more likely they are to be gamed. We constantly play this game of cat and mouse on the PC side, it's just more frustrating in mobile since there aren’t many good benchmarks to begin with. […]
There's no single solution here, but rather a multi-faceted approach to make sure we’re ahead of the curve. We need to continue to rev our test suite to stay ahead of any aggressive OEM optimizations, we need to petition the OEMs to stop this madness, we need to work with the benchmark vendors to detect and disable optimizations as they happen and avoid benchmarks that are easily gamed.
Despite all of the effort that OEMs put into benchmark optimizations, the gains are negligible: CPU tests showed a 0 to 5 percent performance increase, and GPU benchmarks an increase of less than 10 percent.