Video Roundup: HTC One X Quad-Core vs HTC One X Dual-Core

One of the biggest misconceptions in Android is that more cores always equals better performance. Much like the “ricer” vs muscle car argument, there are other factors involved when calculating speed. Qualcomm makes a pretty nice little dual-core S4 processor, whose more advanced, “A15-class” Krait cores provide plenty of torque. Compare that to Nvidia’s Tegra 3 quad-core processor, which, while based on the older Cortex-A9 architecture, can pull a bit more weight when performing tasks like graphics-intensive gaming.

It’s rare that we get to compare two nearly identical devices, with the same screen resolution, running the same version of Android (and mostly similar HTC software), only with different processors. I came across a few videos on YouTube showing off the differences in benchmarks between the HTC One XL (essentially the US version coming to AT&T) with its dual-core Qualcomm S4 processor, and the international HTC One X packing a quad-core Nvidia Tegra 3 processor. So, which version of the HTC One X performed better? Watch the videos below and see for yourself.

HTC One X vs HTC One XL

AnTuTu Benchmark

  • HTC One X (Tegra 3): 9,713
  • HTC One XL (S4): 5,640

HTC One XL vs HTC One X

AndEBench

  • HTC One XL (S4): 4,117 native, 161 Java
  • HTC One X (Tegra 3): 9,997 native, 311 Java

CF Bench

  • HTC One XL (S4): 9,531
  • HTC One X (Tegra 3): 13,719

AnTuTu Benchmark

  • HTC One XL (S4): 6,693
  • HTC One X (Tegra 3): 10,788

Quadrant Standard

  • HTC One XL (S4): 4,668
  • HTC One X (Tegra 3): 4,780

Smartbench

  • HTC One XL (S4): 3,002 / 2,669
  • HTC One X (Tegra 3): 4,453 / 2,697

GLBenchmark

  • HTC One XL (S4): 5,669 (50 fps)
  • HTC One X (Tegra 3): 7,325 (65 fps)

HTC One X (AT&T) vs HTC One X (International)

Quadrant Standard

  • AT&T HTC One X (S4): 4,794
  • HTC One X (Tegra 3): 4,988

Linpack

  • AT&T HTC One X (S4): 206.699 MFLOPS
  • HTC One X (Tegra 3): 132.184 MFLOPS

NenaMark2

  • AT&T HTC One X (S4): 57.9 fps
  • HTC One X (Tegra 3): 48.2 fps

Of course, you’ll always have those who say the results were skewed for whatever reason. I’ve heard everything from “one phone could have had more apps running in the background,” to “one was in airplane mode,” to “one was being held the wrong way,” or even that the two devices were different colors. And really, if that were the case, benchmarks would be completely useless given all those variables, so what would even be the point of them? That’s as good a question as any. I prefer to take them as they are, looking at the results and seeing if they match up with what I see in real-world performance. What’s interesting is that when comparing the dual-core and quad-core variations of the One X, Nvidia’s Tegra 3 mostly came out on top. However, things flip-flopped once the US AT&T version of the One X was compared against the Tegra 3 international version.
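
For a rough sense of how big those gaps actually are, here’s a quick back-of-the-envelope sketch in Python, using only the scores from the second video as listed above (nothing is assumed beyond those numbers):

    # Percent lead of the Tegra 3 One X over the S4 One XL,
    # using the second video's scores listed in this post.
    scores = {
        "AndEBench (native)": (9997, 4117),  # (Tegra 3, S4)
        "CF Bench": (13719, 9531),
        "AnTuTu": (10788, 6693),
        "Quadrant": (4780, 4668),
    }
    for name, (tegra3, s4) in scores.items():
        print(f"{name}: Tegra 3 leads by {(tegra3 / s4 - 1) * 100:.0f}%")

That works out to leads of roughly 143%, 44%, 61%, and 2%. Run the same math on the third video’s Linpack (206.699 vs 132.184) and NenaMark2 (57.9 vs 48.2 fps) numbers and the S4 instead comes out about 56% and 20% ahead, which is the flip-flop described above.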

What did you guys think of the results? Did you feel they were a bit unfair or biased? Have they changed your perceptions of dual-core vs quad-core processors? Let’s pretend you had two HTC One Xs in front of you: do you think you would even be able to discern which processor each was powered by?

[Android Forums: Official Sprint HTC EVO 4G LTE Pre-release Thread]

Thanks, Nate and Tommy!

Chris Chavez
I've been obsessed with consumer technology for about as long as I can remember, be it video games, photography, or mobile devices. If you can plug it in, I have to own it. Preparing for the day when Android finally becomes self-aware and I get to welcome our new robot overlords.

49 Comments

  1. Video cliff notes please? Can’t watch 12 mins of this stuff. Wanna gouge my ears out.

    1. The first video isn’t very accurate because the AT&T model was updating when he did the bootup test.

      Didn’t mean to reply to you with that lol

      1. I saw the sync icon but it turned off before he ran the test…?

      2. I noticed that as well, but my EVO 3D and Vivid both sync periodically (I always leave sync on), and the only reason I know is because of the icon. I’ve never noticed a tick of difference in speed.

    2. Sorry. I updated the post with all the results from the videos. =)

      1. We appreciate it!

  2. In the second one, it looks like the dual-core snappercrapper really is worse than the quad-core Tegra 3 when it comes to unbiased benchmarks (read: not developed by Qualcomm).

  3. You should use a cover picture that isn’t a One X & a One S

    1.  I looked at that picture for 5 minutes trying to figure out if it was just a lazy shooper or if the “simulated” screen images were just placed wrong…

      1. Yeah, because the One S was the same size as the One X, I was confused for a moment.

        1. Totally missed that. Lol

    2. Fixed with a more appropriate image… 

      1.  Haha! Much better.

  4. Could you just give us screenshots with labels please?

  5. I’m sorry guys, but in real-world practice with graphics and such, the S4 pulled through. It could dish out animations more smoothly than a Tegra 3, and more quickly. A benchmark means nothing if the real-world performance doesn’t back it up, and that is why I always watch ‘benchmarks’ that measure the things people actually care about. A couple of numbers doesn’t tell me anything about a phone.

  6. I don’t care about speed. I care about battery life.

    1. Which do you think will provide better battery life? The S4, using the more power-efficient 28nm process, or the Tegra 3, with its fifth companion core?

      1. On XDA the S4 is consistently showing better battery life. And these videos have an AT&T phone that doesn’t even have the right software update, which makes for a bad comparison.

        1. Just going by the nature of updates, and how slow they are to roll out — will they ever both be on the same software version?

          1. It’s not a matter of being on the same software version, because they never will be: one will be optimized for the Tegra 3 and the other for the S4’s Krait. What I’m saying is the software probably wasn’t tuned for the S4, because the phone hadn’t even been released at the time.

            Neither of the two phones will be used to its potential in real world applications though, not for a while.

  7. Why does the AT&T version (XL) have the HTC branding on top of the screen instead of the AT&T branding? It should have the AT&T branding…
    Also, it’s clocked at 1.2GHz instead of 1.5GHz?
    Something smells fishy here…

    1. Yup, the video isn’t of the final update; talk about messed-up benchmarks.

      1. Could just be the Rogers version

        1. It says AT&T when starting up, so probably a build that wasn’t final.

  8. I want a quad-core S4 with an Adreno 320 GPU

  9. So basically you trade an unnoticeable amount of speed for great leaps in battery life

  10. Krait is not an A15 nor is it A15 class.

    It falls somewhere in between the A9 and A15. The Cortex-A15 based SoCs are going to outperform the Krait by quite a bit.

    Also, mobile benchmarks are a poor judge of overall performance. The better SoC is the one whose extra features are actually used. The S4’s extra CPU horsepower will likely end up like the extra horsepower from the Exynos… largely unused.

    At least nVidia is making an effort to get devs to actually use the Tegra’s extra capabilities.

    1. Source for the Krait being between the A9 and A15?

      It’s an asynchronous processor, and I would submit it’s certainly on par with what to expect from the A15, except without the synchronous-core-speed kernel compromises.

      1. ARM’s own Cortex A15 site.

        http://www.arm.com/products/processors/cortex-a/cortex-a15.php

        The A15 is going to have many more features and a far more capable architecture vs the Krait.

        Examples: Krait only supports VFPv3 while the A15 will support VFPv4. The A15 is also going to have Jazelle, which will improve the overall performance of the JIT (Dalvik); increased/improved DSP and codec support; optimizations that could provide up to a 30% reduction in the memory needed to store instructions; and support for up to 1TB of memory.

        The A15 is supposed to get 3.5+ DMIPS/MHz (but 4.01 has also been reported) while the Krait racks up 3.3 DMIPS/MHz.
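
        For a rough sense of what those ratings mean side by side, here’s a quick back-of-the-envelope sketch in Python; the DMIPS/MHz figures are the ones quoted above, but the 1.5GHz clock is purely an illustrative assumption, not either chip’s rated speed:

            # Per-core Dhrystone throughput at a hypothetical, equal clock.
            KRAIT_DMIPS_PER_MHZ = 3.3  # figure quoted above
            A15_DMIPS_PER_MHZ = 3.5    # quoted above; up to 4.01 also reported
            CLOCK_MHZ = 1500           # assumption: both cores at 1.5GHz

            krait = KRAIT_DMIPS_PER_MHZ * CLOCK_MHZ  # 4,950 DMIPS per core
            a15 = A15_DMIPS_PER_MHZ * CLOCK_MHZ      # 5,250 DMIPS per core
            print(f"A15 leads Krait by {(a15 / krait - 1) * 100:.1f}%")

        At the conservative 3.5 figure that’s only about a 6% per-core edge at equal clocks; at the reported 4.01 it would be roughly 22%.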

        They also taped out a 20nm A15 last fall, while the Krait will be at 28nm for the next generation or two.

        You’ll also start seeing some devices with A15s in them around late fall or early winter… 6 months after the S4 devices hit… which was 6 months after the Tegra 3 hit.

        1. Jazelle will be interesting, especially if incorporated into the Dalvik.

          Yes, it does seem that the DMIPS/MHz race is 3.5 vs 3.3, hardly a huge difference and certainly placing both within the same class.

          1. When taking the other features into consideration, the A15 is above and beyond Krait in my eyes.

          2. I can respect that. I doubt that anyone getting a future A15-based processor in an Android device will have much to complain about.

            That said, the A15 design is somewhat generic, as it’s intended to be embodied in a number of different processors, most with the same overall goals.

            The Krait is specific to the S4, tailored to the other S4 cores and subsystems. Qualcomm doesn’t publish those details, favoring marketing of the end SoC instead, in contrast to ARM’s approach.

          3. The Krait is Qualcomm’s latest effort to out-do ARM’s own designs using an architecture that is itself largely based on ARM’s. The entire idea is flawed.

          4. I believe you’ll find that the flaw is the urban myth that Qualcomm CPUs, whether the old Scorpion or the new Krait, are based on ARM architectures. They aren’t.

          5. Debatable.

            What isn’t debatable is that they use ARM’s ISA. Qualcomm is the only SoC OEM that doesn’t use unmodified ARM Cortex architecture. If ARM changes or advances their architecture in a certain direction, Qualcomm will be the only company not guaranteed to follow and keep pace.

          6. Yes, they use the ubiquitous ARM v7 instruction set. Much like AMD has been known to execute the Intel instruction set without sharing the same hardware architecture. As for what-ifs, I would prefer to leave those discussions for when the day comes.

          7. Architecture debates aside…

            Qualcomm has a maximum throne time of 6 months. The OMAP5 prototype shown off at CES already scores better, and the S4’s performance advantage will go unused just like the Exynos and Hummingbird… all benchmarks but nothing else.

            Meanwhile, Nvidia is actually making an effort to get devs to take advantage of the Tegra. TI showed off a better-performing SoC. Samsung’s Galaxy S3 has mindshare. The Snapdragon SoCs are a poor choice.

          8. For those of us living in the USA, the programmable world modem on board makes the S4 a great choice: no increase in parts count or layout complexity to support our various cellular radio standards for 3G and 4G. I don’t care much for benchmarks, as few relate to end-user, everyday tasks. Qualcomm may have a maximum throne time of 6 months for the CPUs (and I think you’re being generous there), but the overall utility of the S4 will keep it relevant for some time.

          9. The programmable modem is going to be matched by the integrated Icera softmodems Nvidia is putting in the next-gen Tegras, which will also be sporting A15 cores and a better GPU (even better than the Adreno 320), plus a growing library of optimized software.

            Android is evolving with ARM, not Qualcomm. The next version of Android will be taking advantage of the features of the Cortex-A15, not the Krait. I’m sorry, but the Krait’s advantages will remain largely untapped and will be outclassed before anyone can blink twice.

            With everything that’s barreling down the pipeline, Krait’s relevance is going to be very limited unless someone makes use of the extra horsepower before the A15s hit… very unlikely.

          10. Both of the CPU architectures promise power savings by being able to execute commands more efficiently. Overall power efficiency will be dictated by the end products. The A15, like the Krait, simply accounts for a few of the many cores in the final SoC. Until they build them we can’t compare.

            That said, I might expect that much of power costs or savings in the real world on our devices will come down to software, same as it ever was.

            The S4 and the anticipated A15-based devices definitely tilt the balance in our favor on power.

          11. First off, thanks to you and EarlyMon for the knowledgeable debate. Could either of you chime in with (your estimation of) real-user battery life between the two classes?

          12. Krait is not somehow lesser than the A15. Qualcomm only licenses the ARM instruction set, not the actual chip architecture. Krait is not *based* on the A15 (or any other ARM standard); it is built to *rival* the A15. And most indications at this point are that Krait will be (other things equal) a bit more powerful than the A15 (very much like Scorpion was more powerful than the A8 it was designed to compete with; enough so that it still held its own against A9 chips).

            Where Qualcomm absolutely trounces the competition, though, is in radios. Right now, there is *nobody* else making LTE radios. It is *not* that the Tegra 3 is somehow incapable of supporting LTE; it’s that Nvidia is incapable of making its own radio (for now). Samsung is in the same boat (see the T-Mobile USA SGS2 and AT&T Skyrocket using Qualcomm chips because Sammy can’t build an HSPA+42 radio at all, and the rumors of LTE SGS3s using Qualcomm chips because Sammy can’t make a decent LTE radio).

            Will A15 chips be impressive? Sure. On the other hand, about the same time companies are getting 2.0GHz dual-core A15 chips into devices on the market, they will have to compete with the 2.5GHz quad-core Krait hitting the market at the same time. Oh yeah, and those A15 chips will *still* need a second, discrete (read: power-hungry) chip for baseband connectivity.

          13. @caynairb – Exactly, thank you.

            I was using an app and evidently not seeing all responses at the time. You summed it up beautifully.

        2. I won’t buy a new phone or tablet until they have at least a dual-core 2GHz Cortex-A15 chip in them. That’s how I can ensure I’m future-proof for a few years. Then I’ll buy new ones when the ARMv8 architecture arrives, with whatever chip comes after the Cortex-A15 that is built on top of it.

          1. You’ll likely see the ARMv8 around 2014 or 2015. That’s why it’s worthwhile to wait for the A15 SoCs to hit instead of jumping on the S4 whose current version isn’t even sporting the “good” GPU (Adreno 320) yet.

  11. The point here is that the Tegra 3 rips the S4 to shreds. My GNex kills these S4 benchmarks, although I am overclocked to 1.65GHz. Still, no removable battery, crappy Sense, and low performance means a total skip. If I were in the market for a new phone, it would be the S3, GNex, or One X (Tegra 3).

    1. Ummmm, no it doesn’t, so stop it. lol I swear people type just to be typing.

      1. I see two tests where the Tegra did not win, and in those two tests the results are useless… so how does the Tegra not win?

  12. Big deal about scores. I want to see it in action: how will it perform in my daily use?
