71 Comments

  • The Nvidia Shield 2 is specifically built for the K1. Compared to a tablet, it’s going to have more RAM, a bigger drive, HDMI ports, and other gaming-specific features that would make any gamer drool.

  • Should I get a Shield or wait for a K1 tablet? Ugh, decisions, decisions. I really wish Nvidia would just say how much a 7-inch K1 tablet would cost.

  • No matter how good the Tegra K1 performs, you’ll never see it used in phones in North America. Nvidia faces the same issue that the rest of the merchant chip vendors (except for Qualcomm) face: the lack of a proper integrated modem in their SoC!

    Intel’s Bay Trail / Merrifield chips are already ahead of everyone in terms of performance per watt, but they can’t get into phones either, for the same reason: no integrated modem that supports all the 4G/3G standards.

  • The problem is that nVidia’s Tegra SoCs are hot and power hungry. These benchmarks don’t mean anything unless they come from actual shipping hardware. These numbers are to be taken not with a grain of salt, but with the entire shaker.

    • Will the K1 be too power hungry? We’ll need to wait for actual hardware to test that, but the A15 revision used in the K1 has already been tweaked by ARM for better power consumption. Also, when AnandTech did their architecture preview of the K1, they mentioned that the concern about the K1 being too power hungry might be overrated.

  • The Snapdragon 805 will still have better battery life and more support… Name three phones that will actually have the Tegra K1.

  • As a fan of NVIDIA and a Tegra Note owner… I like this news. 🙂
    Looks like the Tegra Note line will continue to be a monster in the performance category.

  • After getting bad support from Nvidia for future Android updates, everyone learned their lesson and moved on from Tegra devices…

    • The Shield (summer) and Tegra Note (November) both launched with 4.2 and were updated pretty quickly. My Note was already on 4.4.2 before the end of February, which is two major updates in about three months. That doesn’t sound that bad to me.

    • That depends on the OEM as well. Just look at the Nvidia Tegra Note: they’ve already updated that tablet to the latest version of Android (4.4.2), but the HP Slate 7 Extreme is still on 4.2.2, since Nvidia doesn’t have a hand in updating that device.

    • Not only do the Tegra Note and the Shield both have 4.4.2, they *also* have the ‘move application and data to SD’ capability enabled (since 4.3). This means their SD cards are far more usable than on many devices that don’t make this available even though they have an SD slot. Worse yet, Nexus devices (also on 4.4.2, IIRC) don’t have microSD at all.

      All in all, I think nVidia is trying hard to be a good Android vendor, and they’re doing quite well so far (as others have also noted).

    • There are so many problems with this article that it is basically clickbait.

      First, that is a demo box of an unfinished product, so there is no way to actually see what the power draw is. However, Nvidia has already shown demos where, scaled to under 1 W of power, the unfinished chip was delivering roughly the same amount of processing power as an iPad 4. http://www.anandtech.com/show/7169/nvidia-demonstrates-logan-soc-mobile-kepler/2

      Second, they make a point of the system being actively cooled, claiming there’s no way it can deliver the promised performance in a low-wattage solution, but they forget that Nvidia was already showing off reference tablets at CES that used 5 W of power and pushed numbers close to a PS3, and that was a prototype.

      I could go on, but I don’t see the point.

      • Wait… so you’re saying that a site called SemiAccurate isn’t 100% accurate?

        Relax, man. Even if those numbers are flubbed, Nvidia has a long and proud tradition of over-promising and under-delivering in the Tegra line.

        Oh, and it doesn’t help that they obsoleted the K1’s GPU architecture just weeks ago with Maxwell. That 192-core SMX block that powers the K1? Even Nvidia said it uses too much power and dropped it in favor of Maxwell’s SM blocks.

        • I was just pointing out that their article wasn’t something people should be taking seriously. If you knew they aren’t accurate, why point to the article in the first place?

          • Because where there’s smoke, there’s fire.

            Looking at how the Tegra 4 murders battery life in the products it’s in, and how the Shield requires active cooling… it’s not hard to believe that the trend continues with the K1, especially at the 3 GHz claimed here.

          • The Shield is designed for active cooling to provide optimal game performance without throttling. Any SoC in that environment would require the same. If you want to see throttling, or an exploding phone, try gaming for an hour on a Nexus 4 or Nexus 5 (the latter apparently bursts into flames).

            Also, how do you explain the Tegra Note 7? It stays surprisingly cool even after hours of continuous game play. Its battery easily runs 7 hours with demanding applications, and 10 hours for video alone. All this with front-facing speakers, a fantastic stylus, an SD card, and nearly twice the performance of a Nexus 5 or 7. Yeah, what a flop. Everyone who sees mine wants one, especially after I tell them it costs $199. Of course, I could spend three times as much and get a Note 3 with a slower stylus and marginally lower performance. But why waste the $450? Oh, I know – that wonderful, laggy TouchWiz bloat!

            Also, the ASUS TF701T is quite a nice kit, with a beautiful 2560×1600 display running on the Tegra 4. It performs very well and, unlike the Nexus 5, doesn’t explode when pushed. And at $369, it’s a great deal.

            So, how many T4 devices do you own? My guess is somewhere less than 1 and not greater than 0. One final note: the Shield gets 14 hours of battery life, and they *removed* one third of the battery from the original design. Laptop magazine did a study and showed an average *system* draw of 4 W during hours of 3D gaming.

          • What the Tegra 4 did or didn’t do is totally irrelevant to a chip that has absolutely nothing in common with it.

      • I thought I also read somewhere that the K1 wasn’t even destined for tablets or smartphones, but rather for other mobile computing devices that could manage the kind of power draw this thing requires. I’m thinking Windows 8 hybrid laptops, Chromebooks, etc. I guess we will see.

        • Well, since they had a reference tablet running Android at CES, I’d say we’ll see the K1 in tablets at least.

    • I see ‘SemiAccurate’ and I know I don’t need to take it seriously, not because it is ‘semi-accurate’ but because of Charlie D.

    • @kixofmyg0t:disqus how long have you been an automotive engineer? Yeah, that’s what I thought. Same with Charlie, I’d bet.

      Because if you were, you’d notice that this is an automotive demonstration, not a tablet. And you’d know that the vehicle design operating temperature range is -40 to +125°C. To meet that harsh range of temperatures, or even a reasonable subset (say -20°C to +90°C), you need far more available juice and cooling capacity, respectively. Ever tried to start a car at -20°C? Wondered why it struggles so much harder? Ever opened your car door and found the interior temperature at 90°C?

      That harsh environment is far different from a typical tablet or smartphone, which is often rated for +10 to +40°C or thereabouts. Not to put too fine a point on it, but my Nexus 4 *doubles* its AnTuTu score if I pop it in the freezer for a few minutes before running the test, because the passive cooling barely works even within that pansy operating range.

      If you wanted to see a tablet running a K1, that was on display too, in the exact same packaging as the Tegra Note 7 (if not at CES, then definitely at MWC). Although I wasn’t there, I got the impression that it was running continuously, in an enclosed rubber casing, *and* it didn’t explode or melt. I wouldn’t be surprised if it were power hungry relative to less powerful chips, but no way could it be to the magnitude that Charlie is thinking. Just do the math on the tablet: it would have melted.
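
For anyone who wants to check this kind of thermal behavior on their own device: Linux-based systems (Android included) expose the kernel’s temperature sensors as thermal zones under /sys/class/thermal, reported in millidegrees Celsius. A minimal sketch, assuming only the Python standard library; which zones exist, what they are named, and whether they are readable without root varies by device:

```python
from pathlib import Path

def millic_to_celsius(raw: str) -> float:
    """Kernel thermal zones report temperature in millidegrees Celsius."""
    return int(raw.strip()) / 1000.0

def read_thermal_zones(base: str = "/sys/class/thermal") -> dict:
    """Map each readable zone's type (e.g. a hypothetical 'cpu-thermal') to °C."""
    zones = {}
    for zone in sorted(Path(base).glob("thermal_zone*")):
        try:
            kind = (zone / "type").read_text().strip()
            zones[kind] = millic_to_celsius((zone / "temp").read_text())
        except (OSError, ValueError):
            continue  # zone missing, unreadable, or permission-gated
    return zones

if __name__ == "__main__":
    for kind, temp in read_thermal_zones().items():
        print(f"{kind}: {temp:.1f} C")
```

Watching these values while running a benchmark loop is a rough but direct way to see when a SoC is nearing its throttling point.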

  • Nice chip, but what about battery life? A fast chip isn’t any good if the phone has a dead battery.

    • NVIDIA is still integrating their companion core, the battery-saving core, with the Tegra K1. At least, that’s what I’m assuming. Pretty darn sure they are, though.

      • For the A15 variant, yes, but judging from the die-shot mockup, the Denver variant won’t have those companion cores anymore.

      • Doubt that’ll help when you’re using that 192 “core” GPU in games. We’ll see how power hungry it is once it ships.

      • Even with the companion core, Nvidia’s T4 struggles to compete on the battery front. My Tegra Note 7’s battery life has improved ever since the KitKat upgrade to 4.4.2, but prior to that it was horrendous and, IMO, still a little behind the current competition on battery life alone.

  • Do we know if the dual-core 64-bit version is running a 64-bit OS, or is it just a 64-bit SoC running the tests on a 32-bit OS through ARMv8’s backwards compatibility?
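
One way to answer that question on an actual device is to compare the kernel’s reported architecture with the bitness of a running process: ARMv8 can run a 32-bit userspace (AArch32) on a 64-bit chip, so the two can legitimately disagree. A minimal sketch in Python, assuming nothing beyond the standard library:

```python
import platform
import struct

def userspace_bits() -> int:
    """Pointer width of the running process: 32 on a 32-bit userspace, 64 on a 64-bit one."""
    return struct.calcsize("P") * 8

# platform.machine() echoes the kernel's architecture string:
# "aarch64" on a 64-bit ARM kernel, "armv7l" (or similar) on a 32-bit one.
# A 64-bit kernel string combined with a 32-bit pointer width would mean the
# benchmark process is running in AArch32 compatibility mode.
print("kernel reports:", platform.machine())
print("this process is", userspace_bits(), "bit")
```

The same comparison applies to any benchmark binary: what matters for the scores is the bitness of the code actually executing, not just what the silicon supports.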

  • I’m guessing the SHIELD 2 will have the K1? Not that benchmarks are indicative of everyday use, but I’m pretty stoked the 800 isn’t that far behind the 801 and 805. Hell, the jump from the 600 to the 800 is pretty significant compared to the 800 to the 801/805.

  • Put this in the next round of the Asus Transformer Infinity line and I will get it.

  • I hear this “desktop performance” line a lot lately, which is interesting. But is it actually anywhere near true? I mean, until very recently I ran the 32-bit version of Office, and I can assure you, my experience with Office docs on the desktop was night-and-day better than what I get on my phone. So I guess I’m asking: is this a case of only being as strong as the weakest link? And if that’s the case, what are the other links in the phone performance chain? Memory could be one, I suppose: 2 or 3 GB versus 8 GB on my laptop. Is that likely to impact performance? What else: network connectivity, graphics performance, something else? I’m just trying to figure out whether we can actually expect anything close to “desktop performance” when this new chip makes it out into the wild, or (more likely, in my opinion) just an incremental improvement. Thoughts?

    • It’s about 10 years behind. So, I guess really anything is desktop performance if you look back far enough.

    • It’s similar to (if not above) the level of the Bay Trail CPUs, which are already in some low-power desktops. We’ve already seen that mobile CPUs are approaching parity with the experiment that was Windows RT. The vast majority of people don’t multitask enough to need 8 GB of RAM. Graphics performance is typically in line with CPU performance, because mobile uses APUs whose CPU and GPU are of the same generation (similar to most laptops).

    • Well, if the K1 ships in the second half of this year and the Snapdragon 805 doesn’t start shipping till the second half of the year either, they’d both be new chips, with the K1 being faster.

      • Looking at the difference between the 800 and the 805, the K1 looks two generations ahead.

      • I doubt you will ever really see the 805 in much of anything; it doesn’t have mobile radios. The 801/805 are to the 800 as the 600 was to the S4 Pro: an ‘S’ generation, in iPhone terminology.

      • See, that’s the thing: you’re assuming Nvidia is actually gonna ship the K1 on time.

        What uses the Tegra 4? A handful of ASUS tablets and the SHIELD. I think a Windows tablet of some sort does too, but it’s not important.

        What uses the Tegra 4i (not the A15-powered 4, but the A9-powered 4i)? Well, LG supposedly has a G2 Mini variant with it.

        So when can we expect a device with the Tegra K1 in it? Next year, when they announce its successor.

        • First off, I didn’t assume anything. I made a conditional statement: “If the K1 ships in the second half of this year and the Snapdragon 805 doesn’t start shipping till the second half of the year either, they’d both be new chips, with the K1 being faster.”

          See that “if” at the very beginning? It indicates that this might not happen. I’m saying that if Nvidia meets their shipping time frame, the K1 would be “new” at about the same time the Snapdragon 805 is expected to ship. I made no claim about how many products we would see with the K1, and no statement about even wanting the K1. I just made the factual point that if both chipset makers release their respective products at the same time, they would be new at the same time.

        • The Surface 2 uses the Tegra 4, along with other phones in India/China that are sold out. The Tegra 4 is a damn good chip. The K1 will be epic if Nvidia switches to a 20 nm fab, as I think heat/power on the K1 at 28 nm may be higher than the competition’s, which would mean active cooling, excessive throttling, and low battery life. It is a great upgrade for Nvidia, though. It will at the very least be competitive with the top dogs.

    • You could run the full versions of the newest Battlefield and Call of Duty at 3 GHz, but you would need 4–8 GB of RAM. It would be awesome to be able to go from a PC, Xbox, or PS2 to your tablet on the go, if they ever do that.

      • Google seems a bit comfortable with Qualcomm….I think they’ll keep it there.

        • You’re joking, right? (I know, snark…sorry…)

          JBQ quit because of Qualcomm’s BS… I think Google, as well as the Android dev community, would *love* to see a decent Tegra Nexus.

          (I’m not saying it’s going to happen; I’m just expressing the dissatisfaction of many in the community around Qualcomm’s drivers and SoC.)

          • JBQ wasn’t happy because he couldn’t release binaries and/or factory images because of Qualcomm’s proprietary drivers. That is pretty much only a problem for the Nexus line, which releases those things.

            Meanwhile, Qualcomm is still more developer-friendly than Nvidia and seems to have fewer bugs. When Qualcomm chips do have bugs, they usually get fixed very quickly. I would choose Qualcomm over Nvidia any day! Nvidia has also been completely banned from my desktops/laptops. I have had WAY too many problems with Nvidia graphics cards over the past decade, and now that I’ve gotten into Linux, I hate them. I’ll give their mobile platform a shot, but I would do it begrudgingly (like if a Nexus phone/tablet comes with it again, though I don’t think Google would make that mistake again).

          • Dude, if you are using Linux, nVidia is by far the best graphics solution. Their drivers are vastly better and more reliable than anything else. For Linux: nVidia > Intel > ATI/AMD. I am not the first person to note this. Take a look at Linux system builders: I have yet to see *anyone* use AMD cards; it’s Intel integrated graphics with an nVidia upgrade option. Catalyst on Linux is buggy and spotty, and only alluring if you truly are a masochist. Maybe you might want to invest $150 in a nice 60 W 750 Ti Maxwell. It blows the doors off almost anything near that price range from AMD on Linux, at one third the TDP.

          • I’m not going to argue with you, because I am not that into Linux. I only know the basic GUI stuff. I run Ubuntu, Mint, and TAILS pretty much as they come. I have just had nothing but problems on my computers that have Nvidia cards, with loss of functionality like HDMI out. From what I have heard, Nvidia is not developer-friendly at all. Linus Torvalds even scolded Nvidia for this publicly ( https://www.youtube.com/watch?v=iYWzMvlj2RQ ).

            As far as my Windows computers go, I have also had nothing but problems with them, and so has my buddy. Both of our laptops shut down regularly from overheating; my buddy’s pretty much completely self-destructed from it. The software that Nvidia provides is usually junk as well. I get ‘program not responding’ from Nvidia more often than I would like. Half the time, when I have a multiple-monitor setup, the Nvidia software doesn’t do what I configured it to do as far as mirroring, one monitor off, main display, and dual display go.

            I’m no computer whiz, but I do know enough to have been the IT guy for the doctor’s office I worked at, and I regularly fix everyone’s computers around me. So I know enough to know that I don’t like Nvidia.

            Again, I’m sure you know more than me, but I figured I would let you know a little more about why I don’t like Nvidia. I do believe Nvidia has better performance than AMD cards, but I don’t think they are as stable as AMD cards as far as hardware and software go. I would rather have reliability over performance; I’m not into games or anything like that.

          • That was last year. This year, Torvalds has praised nVidia for releasing high-quality source for the K1 drivers.

            I run Kubuntu, have worked with desktop Linux since 1997, have administered dozens of Linux and Unix desktops, am an RHCE, and build a Linux box every few years. For top performance, your only option for either nVidia or AMD is the binary blob. The difference is that nVidia’s works well and sometimes surpasses its Windows performance. AMD’s, OTOH, delivers maybe 75% of the performance of its Windows counterpart, far below nVidia at the same price point, *and* is buggy.

            Again, see my comment about system sellers: System76 ships nVidia only, and ZaReason ships nVidia by default and offers AMD options below nVidia. Also take a look at https://dolphin-emu.org/blog/2013/09/26/dolphin-emulator-and-opengl-drivers-hall-fameshame/. nVidia gets the only Excellent rating. AMD gets a mediocre rating, and that was seemingly due to their *effort* to improve their drivers from their current state, which is probably more accurately described as ‘poor’. This also gives me pause when purchasing a mobile SoC, seeing how badly Adreno and (especially) Mali fare.

            I have no desire to argue with you. I’m just trying to convince you that you really should give nVidia a look if you are running Linux. Again, the 750 Ti is a really great card for Linux (http://www.phoronix.com/scan.php?page=article&item=nvidia_maxwell_benchmarks&num=1).

          • I didn’t think you were trying to argue. I just know that I don’t have the knowledge to have a meaningful debate on this subject. I can only form my opinion from my past experiences with Nvidia, nothing really current.

            I appreciate you taking the time to let me know a little more about Nvidia. I will definitely have to take a second look at Nvidia’s current offerings. Thanks for the links and the info!

          • You “banned” Nvidia from your desktops/laptops… haha, you’re funny. I can get both to work just fine; the only difference I see is that Nvidia has a better track record for releasing stable drivers. Nvidia is the benchmark title holder right now. Now, this all changes in the phone/tablet world: I have a 5-year-old HTC Sensation 4G with a Qualcomm Snapdragon S3 MSM8260 and it still works fine. I would love to see Tegra supported more, especially the Tegra K1.

    • They’re also terrible at supporting the chips in the phones they do get them into. I don’t really have much faith in them at this point.
