
NVIDIA Tegra K1 Dual-core and Quad-core Chips Put Through Benchmarks, Rip Apart the Competition


While we continue waiting ever so patiently for NVIDIA's Tegra K1 to show up in hardware we can actually buy, we shall proceed to drool over its benchmark scores. The quad-core (Cortex-A15) and 64-bit dual-core (Denver) models of the Tegra K1 were announced back in January at CES, featuring an almost unheard-of 192-core GPU based on the Kepler architecture found in NVIDIA's GeForce line of PC graphics cards.

To sum up the hype in simple terms, this mobile SoC promises desktop-grade graphics performance, which next-gen mobile game developers will be eager to take advantage of.

Apparently, there had yet to be an official clock speed announced for the CPU in the Tegra K1. Thanks to recent benchmarks from mydrivers.com, we now know that the 64-bit dual-core (Denver) variant of the Tegra K1 tops out at 3.0GHz and was tested with 2GB of RAM.

[Benchmark screenshot: 64-bit dual-core (Denver) Tegra K1 results, via MyDrivers]

As for the competition, which includes the Snapdragon 805, Snapdragon 801, Samsung's Exynos processors, and LG's Odin octa-core chip, the Tegra K1 is merciless in overall performance. With the dual-core and quad-core variants both scoring in the 43K range, the closest competitor lands around 37K, belonging to the Snapdragon 805. Most flagships at this point are running 2013's Snapdragon 800, which scores in the 34K range.
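
For rough context on how big those gaps are, here is a minimal back-of-the-envelope sketch in Python using the approximate score ranges quoted above. The exact totals vary from run to run (and these came from a pre-release demo unit), so treat the percentages as ballpark figures only.

```python
# Approximate overall benchmark totals quoted in this post (AnTuTu-style scores).
# Pre-release demo-unit numbers, so these are rough ranges, not final results.
scores = {
    "Tegra K1 (dual- and quad-core)": 43_000,
    "Snapdragon 805": 37_000,
    "Snapdragon 800 (2013 flagships)": 34_000,
}

k1 = scores["Tegra K1 (dual- and quad-core)"]
for chip, score in scores.items():
    lead = (k1 - score) / score * 100  # K1's lead over this chip, in percent
    print(f"{chip:32}  {score:6}  K1 lead: {lead:5.1f}%")
```

By that rough math, the K1 sits about 16% ahead of the Snapdragon 805 and roughly 26% ahead of the Snapdragon 800 in these particular runs.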


While it is cool to see the Tegra K1 perform so well in benchmarks, scores obviously can't compare to having a device in your hands that features the silicon. Rumor has it that NVIDIA will have a couple of products launching with the Tegra K1 sometime in the first half of 2014, but we shall remain patient for official word.

Via: MyDrivers | Neowin
  • _Flip_

    Nvidia Shield 2 is specifically built for the K1. Compared to a tablet, it's going to have more RAM, a bigger drive, HDMI ports, and other gaming-specific features that would make any gamer drool.

  • Coffee

    Should I get a shield or wait for a k1 tablet? Ugh decisions decisions, I really wish nvidia would just say how much a 7 inch k1 tablet would cost

  • Engineer

    No matter how well the Tegra K1 performs, you'll never see it used in phones in North America. Nvidia faces the same issue that the rest of the merchant chip vendors (except for Qualcomm) face … lack of a proper integrated modem in their SoC!

    Intel's Bay Trail / Merrifield chips are already ahead of everyone in terms of performance per watt, but they can't get into phones either, for the same reason … lack of a proper integrated modem that supports all 4G/3G standards.

  • http://twitter.com/jdrch jdrch

    Where have we seen this before?

  • https://www.facebook.com/christopher.johnsc chris_johns

    I'd love this chip in a tablet… not so much a phone

  • Kyle Cordiano

    Tegra Note 7 is a beast.

  • Stormy Knight

    The problems are that nVidia Tegra SoCs run hot and are power hungry. These benchmarks don't mean anything unless they come from actual available hardware. These numbers are to be taken not with a grain of salt, but with the entire shaker.

    • renz

      Will the K1 be too power hungry? We'll need to wait for actual hardware to test that, but the latest A15 revision used in the K1 has already been tweaked by ARM for better power consumption. Also, when AnandTech did their architecture preview of the K1, they mentioned that concerns about the K1 being too power hungry might be overblown.

      • https://www.facebook.com/christopher.johnsc chris_johns

        Ah yes, logic, shame so many people comment here before using this long lost art

  • Josh Meade

    The Snapdragon 805 will still have better battery life and more support… Name 3 phones that will actually have the Tegra K1

    • https://www.facebook.com/christopher.johnsc chris_johns

      Stop it

  • Kevin

    This is why we shouldn’t even be using quad-cores right now.

  • Justen DeBowles

    As a fan of NVIDIA and a Tegra Note owner… I like this news. :)
    Looks like the Tegra Note line will continue to be a monster in the performance category.

  • eilegz

    After getting bad support from nvidia for Android updates, everyone learned a lesson and moved on from any Tegra device…

    • Justen DeBowles

      The Shield (summer) and Tegra Note (November) both launched with 4.2 and were updated pretty quickly. My Note was already on 4.4.2 before the end of February, which is two major updates in about three months. That doesn't sound that bad to me.

    • renz

      That depends on the OEM as well. Just look at the nvidia Tegra Note: they already updated that tablet to the latest version of Android (4.4.2), but the HP Slate 7 Extreme is still on 4.2.2, since nvidia doesn't have a hand in updating that device.

    • https://www.facebook.com/christopher.johnsc chris_johns

      stop it

    • deppman

      Not only do the Tegra Note and the Shield both have 4.4.2, but they *also* have the 'move application and data to SD' capability enabled (since 4.3). This means their SD cards are far more usable than those in many devices that don't make this available even when they have an SD slot. Worse yet, Nexus devices (also on 4.4.2, IIRC) don't have microSD at all.

      In all, I think nVidia is trying hard to be a good Android vendor. And they are doing quite well so far (as others have also noted).

  • kixofmyg0t
    • Franklin Ramsey

      There are so many problems with this article that it is basically clickbait.

      First, that is a demo box of an unfinished product, so there is no way to actually see what the power draw is. However, Nvidia has already shown demos where, when scaled to under 1W of power, the unfinished chip was delivering roughly the same amount of processing power as an iPad 4. http://www.anandtech.com/show/7169/nvidia-demonstrates-logan-soc-mobile-kepler/2

      Second, they make a point of the demo running on an actively cooled system, as if there were no way it could provide the promised performance in a low-wattage solution, but they forget that Nvidia was already showing off reference tablets at CES that were drawing 5W of power and pushing numbers close to a PS3, and that was a prototype.

      I could go on, but I don’t see the point.

      • kixofmyg0t

        Wait….so you’re saying that a site called semiaccurate isn’t 100% accurate?

        Relax man. Even if those numbers are flubbed, nvidia has a long and proud tradition of over-promising and under-delivering in the Tegra line.

        Oh, and it doesn't help that they just obsoleted the K1's GPU architecture literally weeks ago with Maxwell. That 192-core SMX block that powers the K1? Even nvidia said it uses too much power and dropped it in favor of Maxwell's SM blocks.

        • Franklin Ramsey

          I was just pointing out that their article wasn’t something people should be taking seriously. If you knew that they aren’t accurate, why point out the article in the first place?

          • kixofmyg0t

            Because where there’s smoke, there’s fire.

            Looking at how the Tegra 4 murders battery life in the products it's in, and how the Shield requires active cooling… it's not hard to believe that the trend continues with the K1. Especially at the 3GHz claimed here.

          • deppman

            The Shield is designed for active cooling to provide optimal game performance without throttling. Any SoC in that environment would require the same. If you want to see throttling, or an exploding phone, try gaming for an hour on a Nexus 4 or Nexus 5 (the latter apparently bursts into flames).

            Also, how do you explain the Tegra Note 7? It stays surprisingly cool even after hours of continuous game play. Its battery easily runs 7 hours with demanding applications; 10 hours for video only. All this with front-facing speakers, a fantastic stylus, an SD card, and nearly twice the performance of a Nexus 5 or 7. Yeah, what a flop. Everyone that sees mine wants one, especially after I tell them it costs $199. Of course, I could spend three times as much and get a Note 3 with a slower stylus and marginally lower performance. But why waste the $450? Oh, I know – that wonderful, laggy, touch-wiz bloat!

            Also, the ASUS TF701T is quite a nice kit with a beautiful 2560×1600 display running the Tegra 4. It performs very well, and unlike the Nexus 5, doesn't explode when pushed. Also, at $369, it's a great deal.

            So, how many T4 devices do you own? My guess is it's somewhere less than 1 and not greater than 0. One final note: the Shield gets 14 hours of battery life, and they *removed* one third of the battery from the original design. Laptop magazine did a study and showed an average *system* draw of 4W during hours of 3D gaming.

          • Adrian Meredith

            What the Tegra 4 did or didn't do is totally irrelevant to a chip that has absolutely nothing in common with it.

      • Ryan5609

        I thought I also read somewhere that the K1 wasn't even destined for tablets or smartphones, but rather for other mobile computing devices that could manage the kind of power draw this thing requires. I am thinking Windows 8 hybrid laptops, Chromebooks, etc. I guess we will see.

        • Franklin Ramsey

          Well since they had a reference tablet at CES running Android, I’d say we would see the K1 in at least tablets.

    • renz

      I see 'semiaccurate' and I know I don't need to take it seriously. Not because it is 'semi-accurate', but because of Charlie D

    • deppman

      @kixofmyg0t:disqus how long have you been an automotive engineer? Yeah, that’s what I thought. Same with Charlie I’d bet.

      Because if you were, you'd notice that this is an automotive demonstration, not a tablet. And you'd know that the vehicle design operating temperature range is -40 to +125C. To meet that harsh a range of temperatures, or even a reasonable subset (say -20C to +90C), you need far more available juice and cooling capacity, respectively. Ever try to start a car at -20C? Wonder why it struggles so much harder? Ever opened your car door and found the interior temperature at 90C?

      That harsh environment is far different from a typical tablet or smartphone, which is often rated for +10 to +40C, or thereabouts. Not to put too fine a point on it, but my Nexus 4 *doubles* its Antutu score if I pop it in the freezer for a few minutes before I run the test, because the passive cooling barely works even within that pansy operating range.

      If you wanted to see a tablet running a K1, that was on display also, in the exact same packaging as the Tegra Note 7 (if not at CES, definitely at MWC). Although I wasn’t there, I got the impression that it was running continuously, in an enclosed rubber casing, *and* it didn’t explode or melt. I wouldn’t be surprised if it were power hungry relative to less powerful chips. But no way could it be to the magnitude that Charlie is thinking. Just do the math on the tablet – it would have melted.

  • Shadowstare

    Nice chip, what about battery life? A fast chip isn’t any good if the phone has a dead battery.

    • http://www.droid-life.com/ Tim-o-tato

      NVIDIA is still integrating their companion core with the Tegra K1, which is the battery-saving core. At least, that’s what I am assuming. Pretty darn sure they are, though.

      • renz

        For the A15 variant, yes. But based on the die shot mockup, the Denver variant will not have that companion core anymore.

      • Jason B

        Doubt that’ll help when you’re using that 192 “core” GPU in games. We’ll see how power hungry it is once it ships.

      • masterpfa

        Even with the companion core, the Nvidia T4 struggles to compete on the battery front. My Tegra Note 7 has had improved battery life ever since the KitKat upgrade to 4.4.2, but prior to that it was horrendous, and IMO it is still a little behind the current competitors on battery life alone.

  • hfoster52

    Insert Apple Fan Boy – I don’t see this running against the A7?

    • Diablo81588

      Because cross platform benchmarks are useless.

  • Asmodai

    Do we know if the dual-core 64-bit version is running a 64-bit OS, or is it just a 64-bit SoC running the tests on a 32-bit OS using ARMv8's backwards compatibility?

  • yahyoh

    Arm V8 :D

  • Dave

    This is probably only good news for people who buy an Asus tablet this fall.

  • http://infotainmentempire.blogspot.com Rob

    I’m guessing the SHIELD 2 will have the K1? Not that benchmarks are indicative of everyday use, but I’m pretty stoked the 800 isn’t that far behind the 801 and 805. Hell, the jump from 600 to 800 is pretty significant compared to the 800 to 801/805.

  • Keith Taylor

    Put this in the next round of the Asus Transformer Infinity line and I will get it.

  • Matthew

    ARMV8?!

  • SA_NYC

    I hear this "desktop performance" line a lot lately, which is interesting. But is it actually anywhere near true? I mean, until very recently I ran the 32-bit version of Office on my desktop, and I can assure you, that experience was night-and-day better than what I get with Office docs on my phone. So I guess I'm asking, is this a case of only being as strong as one's weakest link? And if that's the case, what are the other links in the phone performance chain? Memory could be one, I suppose: 2 or 3GB versus 8GB on my laptop. Is that likely to impact performance? What else, network connectivity, graphics performance, something else? I'm just trying to figure out if we can actually expect anything close to "desktop performance" when this new chip makes it out into the wild, or (more likely in my opinion) whether it will be an incremental improvement. Thoughts?

    • Cory_S

      It’s about 10 years behind. So, I guess really anything is desktop performance if you look back far enough.

    • DoctorJB

      It's similar to (if not above) the level of the Bay Trail CPUs, which are already in some low-power desktops. We've already seen that mobile CPUs are approaching parity with the experiment that was Windows RT. The vast majority of people don't multitask enough to need 8GB of RAM. Graphics performance typically tracks CPU performance because mobile SoCs are APUs, with the CPU and GPU being of the same generation (similar to most laptops).

  • The Narrator

    2GB of ram, ya lost me there.

    • http://www.droid-life.com/ Tim-o-tato

      Yeah, just because this test device was running 2GB at the time doesn't mean the devices it ships in will have 2GB.

    • DoctorJB

      You running Android VMs over there?!?

  • jeff manning

    Put it in a nexus now!

  • Cory_S

    Newer chip is faster. Neat.

    • Franklin Ramsey

      Well if the K1 Ships in the second half of this year and the Snapdragon 805 doesn’t start shipping till the second half of the year also, They’d both be new chips with the K1 being faster.

      • Voltism

        Looking at the difference between the 800 and 805, the k1 looks two generations ahead

      • Cory_S

        Doubt you will really ever see the 805 in much of anything. It doesn't have mobile radios. The 801/805 are to the 800 as the 600 was to the S4 Pro. An "S" generation in iPhone terminology.

      • kixofmyg0t

        See, that's the thing: you're assuming nvidia is actually gonna ship the K1 on time.

        What uses the Tegra 4? A handful of ASUS tablets and the SHIELD. I think a Windows tablet of some sort does too, but it's not important.

        What uses the Tegra 4i (not the A15-powered 4, but the A9-powered 4i)? Well, LG supposedly has a G2 Mini variant with it.

        So when can we expect a device with the Tegra K1 in it? Next year, when they announce its successor.

        • Franklin Ramsey

          First off, I didn’t assume anything. I made a true statement. “Well If the K1 ships in the second half of this year and the Snapdragon 805 doesn’t start shipping till the second half of the year also, They’d both be new chips with the K1 being faster.”

          See that If at the very beginning? That indicates it might not happen. I’m saying if Nvidia meets their ship time frame, the K1 would be “new” about the same time the Snapdragon 805 is expected to ship. I made no claim about how many products we would see with the K1, no statements about even wanting the K1. I just made a factual statement that if both chipset makers release their respective products at the same time, they would be new at the same time.

        • renz

          Actually, there are plenty of Tegra 4 tablets. In fact, I know of more tablets with the Tegra 4 than with the Snapdragon 800. Also, the T4i is just coming out, but it seems the performance is not that bad given the A9 cores used in the chip.

          http://www.tomshardware.com/news/wiko-wax-nvidia-tegra-4i-preview,26157.html

        • Socius

          The Surface 2 uses the Tegra 4, along with other phones in India/China that are sold out. The Tegra 4 is a damn good chip. The K1 will be epic if Nvidia switches to a 20nm fab, as I think heat/power on the K1 at 28nm may be higher than the competition's, which would mean a need for active cooling, excessive throttling, and low battery life. It is a great upgrade for Nvidia though. It will at the very least be competitive with the top dogs.

        • masterpfa

          Unless they release their own tablet as they did with the Tegra Note 7

    • michael arazan

      You could run the full versions of the newest Battlefield and Call of Duty with 3GHz, but you would need 4-8GB of RAM. It would be awesome on the go, jumping from a PC, Xbox, or PS2 to your tablet, if they ever do that.

  • http://www.DavidPat.com David Pat

    Yeah, the problem is that Nvidia is terrible at getting their chips into actual phones.

    • Eric R.

      Next Nexus tablet could have it.

      • Good_Ole_Pinocchio

        Google seems a bit comfortable with Qualcomm….I think they’ll keep it there.

        • KingofPing

          You’re joking, right? (I know, snark…sorry…)

          JBQ quit because of Qualcomm’s BS… I think Google as well as the Android dev community would *love* to see a decent Tegra Nexus.

          (I’m not saying it’s going to happen; I’m just expressing the dissatisfaction of many in the community around Qualcomm’s drivers and SoC.)

          • brkshr

            JBQ wasn’t happy because he couldn’t release binaries and/or factory images because of Qualcomm’s proprietary drivers. This is pretty much only a problem for the Nexus line that releases those things.

            Meanwhile, Qualcomm is still more developer friendly than Nvidia and seems to have fewer bugs than Nvidia. When Qualcomm chips do have bugs, they usually get fixed very quickly. I would choose Qualcomm over Nvidia any day! Nvidia has also been completely banned from my desktops/laptops. I have had WAY too many problems with Nvidia graphics cards over the past decade, and now that I've gotten into Linux, I hate them. I'll give their mobile platform a shot, but I would do it begrudgingly (like if a Nexus phone/tablet comes with it again, but I don't think Google would make that mistake again).

          • deppman

            Dude, if you are using Linux, nVidia is by far the best graphics solution. Their drivers are vastly better and more reliable than anything else. For Linux, nVidia > Intel > ATI/AMD. I am not the first person to note this: take a look at Linux system builders. I have yet to see *anyone* use AMD cards; it's Intel IG with an nVidia upgrade option. Catalyst on Linux is buggy and spotty and only alluring if you truly are a masochist. You might want to invest $150 in a nice 60W 750Ti Maxwell. It blows the doors off of almost anything near that price range from AMD on Linux, at one third the TDP.

          • brkshr

            I'm not going to argue with you, because I am not that into Linux. I only know the basic GUI stuff. I run Ubuntu, Mint, and TAILS pretty much how they come. I have just had nothing but problems on my computers that have Nvidia cards, with loss of functionality like HDMI out. From what I have heard, Nvidia is not developer friendly at all. Linus Torvalds even scolded Nvidia for this publicly ( https://www.youtube.com/watch?v=iYWzMvlj2RQ ).

            As far as my Windows computers go, I have also had nothing but problems with them, and so has my buddy. Both of our laptops shut down regularly from overheating. My buddy's pretty much completely self-destructed from overheating. The software that Nvidia provides is usually junk as well. I get 'program not responding' from Nvidia more often than I would like. Half the time, when I have a multiple-monitor setup, the Nvidia software doesn't do what I configured it to do, as far as mirroring, one monitor off, main display, and dual-display type stuff goes.

            I'm no computer whiz, but I do know enough to have been the IT guy for the doctor's office I worked at, and I regularly fix everyone's computers around me. So I know enough to know that I don't like Nvidia.

            Again, I'm sure you know more than me. But I figured I would let you know a little more about why I don't like Nvidia. I do believe that Nvidia has better performance than AMD cards, but I don't think they are as stable as AMD cards as far as hardware and software go. I would rather have reliability over performance. I'm not into games or anything like that.

          • deppman

            That was last year. This year, Torvalds has praised nVidia for releasing high-quality source for the K1 drivers.

            I run Kubuntu and have worked with desktop Linux since 1997, have administered dozens of Linux and Unix desktops, am an RHCE, and build a Linux box every few years. For top performance, your only option for either nVidia or AMD is the binary blob. The difference is that nVidia's works well and sometimes surpasses Windows performance. AMD, OTOH, is maybe 75% of the performance of its Windows counterpart – and far below nVidia at the same price point – *and* is buggy.

            Again, see my comment about system sellers – System76 ships nVidia only. ZaReason ships nVidia by default and offers AMD options below nVidia. Also take a look at https://dolphin-emu.org/blog/2013/09/26/dolphin-emulator-and-opengl-drivers-hall-fameshame/. nVidia gets the only Excellent rating. AMD gets a mediocre rating, and that was seemingly due to their *effort* to improve their drivers from their current state which is probably more accurately represented as ‘poor’. This also gives me pause when purchasing a mobile SoC seeing how badly Adreno and (especially) Mali fare.

            I have no desire to argue with you. I’m just trying to convince you that you really should give nVidia a look if you are running Linux. Again, the 750Ti is really a great card for Linux (http://www.phoronix.com/scan.php?page=article&item=nvidia_maxwell_benchmarks&num=1)

          • brkshr

            I didn't think you were trying to argue. I just know that I don't have the knowledge for a proper debate on this subject. I can only form my opinion from my past experiences with Nvidia, nothing really current.

            I appreciate you taking the time to let me know a little more about Nvidia. I will definitely have to take a second look at Nvidia’s current offerings. Thanks for the links and the info!

          • deppman

            Cool, thanks brkshr, and good luck!

          • Rick

            You "banned" Nvidia from your desktops/laptops… haha, you're funny. I can get both to work just fine; the only difference I see is that Nvidia has a better track record for releasing stable drivers. Nvidia is the benchmark title holder right now. Now, this all changes in the phone/tablet world. I have a 5-year-old HTC Sensation 4G with a Qualcomm Snapdragon S3 MSM8260 and it still works fine. I would love to see Tegra supported more, especially the Tegra K1.

    • WestFiasco

      Then they're terrible at supporting the chips in the phones they do get them into. I don't really have much faith in them at this point.