Mea-Culpa: It Should Have Been Caught Earlier

Section By Andrei Frumusanu

As stated on the previous page, I had initially seen the effects of this behaviour back in January, when I was reviewing the Kirin 970 in the Mate 10. The numbers I originally obtained showed worse-than-expected performance for the Mate 10, which was being beaten by the Mate 9. When we discussed the issue with Huawei, they attributed it to a firmware bug and pushed me a newer build which resolved the performance issues. At the time, Huawei never explained what that 'bug' was, and I didn't push the issue, as performance bugs do happen.

For the Kirin 970 SoC review, I went through my testing and published the article. Later on, in the P20 reviews, I observed the same lower performance again. As Huawei had previously told me it was a firmware issue, I attributed the poor performance to a similar problem and expected Huawei to 'fix' the P20 in due course.

In hindsight, it is pretty obvious there has been some less-than-honest communication from Huawei. The newly detected performance issues were not actually issues at all: they were the real representation of the SoC's performance. As the results were somewhat lower, and Huawei was insisting the chip was highly competitive, I would never have expected these numbers to be genuine.

It's worth noting here that I naturally test with our custom benchmark versions, as they enable us to extract more data from the tests than just a simple FPS value. It never crossed my mind to test the public versions of the benchmarks to check for any discrepancy in behaviour. Suffice it to say, this will change in our future testing, with numbers verified on both versions.

Analyzing the New Competitive Landscape

With all that being said, our past published results for Kirin 970 devices were mostly correct, as we had used a variant of the benchmark that wasn't detected by Huawei's firmware. There is one exception, however: we weren't using a custom version of 3DMark at the time. I've now re-tested 3DMark and updated the corresponding figures in past reviews to reflect the correct peak and sustained performance.

As far as I could tell in my testing, the cheating behaviour has only been introduced in this year's devices; phones such as the Mate 9 and P10 were not affected. To be more precise, it seems that only devices on EMUI 8.0 and newer are affected. In our discussions, Huawei told us that this was purely a software implementation, which also corroborates our findings.
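To give a sense of what a purely software-side mechanism like this can look like, here is a minimal, hypothetical sketch of an app-whitelist check of the kind described above. Every identifier and value in it (the package names, `thermal_limit_for`, the milliwatt budgets) is my own illustrative assumption, not anything taken from Huawei's actual firmware.

```python
# Hypothetical illustration only: a package-name whitelist of the sort a
# firmware could use to detect known benchmark apps. These identifiers and
# numbers are assumptions for the sketch, not Huawei's real implementation.
BENCHMARK_PACKAGES = {
    "com.glbenchmark.glbenchmark27",         # assumed public GFXBench build
    "com.futuremark.dmandroid.application",  # assumed public 3DMark build
}

DEFAULT_THERMAL_LIMIT_MW = 4500  # normal sustained power budget (illustrative)
RAISED_THERMAL_LIMIT_MW = 8500   # raised budget for whitelisted apps (illustrative)

def thermal_limit_for(package_name: str) -> int:
    """Return the power budget to apply for the currently running app.

    A whitelisted public benchmark gets the raised limit; everything else,
    including a renamed custom benchmark build, gets the default one.
    """
    if package_name in BENCHMARK_PACKAGES:
        return RAISED_THERMAL_LIMIT_MW
    return DEFAULT_THERMAL_LIMIT_MW
```

Under a scheme like this, a custom benchmark build with a different package name falls through to the default budget, which is why our custom versions ended up reporting the SoC's true sustained performance.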

Here is the competitive landscape across our whole mobile GPU performance suite, with updated figures where applicable. We are also including new figures for the Honor Play, and the new introduction of the GFXBench 5.0 Aztec tests across all of our recent devices:

3DMark Sling Shot 3.1 Extreme Unlimited - Graphics 

3DMark Sling Shot 3.1 Extreme Unlimited - Physics 

GFXBench Aztec Ruins - High - Vulkan/Metal - Off-screen

GFXBench Aztec Ruins - Normal - Vulkan/Metal - Off-screen

GFXBench Manhattan 3.1 Off-screen 

GFXBench T-Rex 2.7 Off-screen

Overall, the graphs are very much self-explanatory. The Kirin 960 and Kirin 970 are lacking in both performance and efficiency compared to almost every device in our small test here. This is something Huawei is hoping to address with the Kirin 980 and features such as GPU Turbo.

Comments

  • goatfajitas - Tuesday, September 4, 2018 - link

    The tech world is far too hung up on benchmarking these days. Benchmarking is like the Kardashians of tech sites. The lowest form of entertainment. :P
  • R0H1T - Tuesday, September 4, 2018 - link

    So the mainland phone makers are cheating in benchmarks as well? I know this isn't a China-only thing, but it seems like they're trying to bite off more than they can chew.
  • goatfajitas - Tuesday, September 4, 2018 - link

    I am saying the tech world in general is far too hung up on it. Companies, tech sites and their visitors - so hung up on it and its perceived importance that companies pull crappy moves to appear to benchmark better.
  • MonkeyPaw - Tuesday, September 4, 2018 - link

    Maybe 5-10 years ago, such benchmarks were important, as the performance gain was quite noticeable. However, now I think we are well beyond the point of tangible gains on a smartphone, at least until the time that we expect more from the devices than the current usage model.
  • niva - Tuesday, September 4, 2018 - link

    I'm not sure how you don't notice 10-40% improvements in peak performance and efficiency between generations, gains are very tangible for everyone, in multiple ways, regardless of the usage model. Even if you just use your phone for making actual phone calls, you can notice the standby time increase, better radio reception, ability to answer calls while on LTE or wireless only. Maybe YOU don't notice these things, but please speak for yourself. Thank you!

    As for Huawei, the company is shady beyond belief. I consider the Nexus 6p the only Huawei phone I've ever wanted to get. I don't trust them, not one bit. Then again I don't trust Google either but Google seems to be an unavoidable evil I have to live with, and I do trust them quite a bit more than Huawei.
  • Samus - Wednesday, September 5, 2018 - link

    Benchmarks are always important. If a customer is shopping for a device based on performance, the metric they have to depend on is that measured by...benchmarks.
  • Flunk - Tuesday, September 4, 2018 - link

    Those brands aren't being sold here, so it's more of a deflection than a real answer. The only other recent example of this problem is the OnePlus 5, which is another Chinese phone. All Huawei is doing is making Chinese brands look bad.
  • techconc - Friday, September 21, 2018 - link

    No, it is normal behavior for a heavy GPU test to peak initially and then throttle back down as thermal limits are reached. What Huawei is doing is ignoring those thermal limitations and actually overheating devices for specifically named benchmarks.
  • kirsch - Tuesday, September 4, 2018 - link

    They very well may be. But that is a completely orthogonal discussion to companies cheating to show better results than they should.
  • Reflex - Tuesday, September 4, 2018 - link

    This right here. A lot of us would agree that benchmark results are not the end all/be all of a device. But in no way is that an appropriate response to an article about benchmark cheating.
