The first Fermi benchmarks are in, but the NDA won’t let them be released for a few more weeks. Still, that doesn’t stop info from leaking out, much like the transistors on Nvidia’s new card. =)


ATI’s 5800 series currently dominates the graphics card scene and has gone unanswered by Nvidia for almost six months. Six months! What’s the deal? Even die-hard Nvidia fans are slowly jumping off the bandwagon for the obvious winner this generation. So when will we finally see the GTX 4xx series on the market? No, I’m not talking about another announcement, a demonstration, or even benchmarks. I want to know when I can walk into a store (or go to Newegg) and find it available to purchase. Next month? Next year? Maybe by the time ATI’s 6870 debuts?

Apparently, Nvidia is in a lot more trouble than they care to admit. This is a much bigger problem than a few delays can fix. Sure, their architecture looks great on paper, but fabrication has prevented it from becoming a reality. The latest news suggests that Fermi consumes almost 300 watts under load! That could power three of my laptops. What’s the problem with all that power? Just get a larger PSU, right? Well, more power means more than just a higher electric bill – it means heat. These chips apparently run hot. The problem with heat is that it changes resistance and, consequently, the normal operation of transistors. Heat effectively increases the leakage current of the transistors, which increases the overall power draw. But remember that increasing the power increases the temperature. This endless cycle is causing the downfall of Fermi.
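That feedback loop is easy to sketch numerically. Here’s a rough toy model – every constant below (the dynamic power, the leakage at 25 °C, the leakage growth per degree, the cooler’s thermal resistance) is a made-up number for illustration, not Fermi’s real specs. It just iterates “power heats the chip, heat raises leakage, leakage raises power” and checks whether the loop settles or runs away:

```python
import math

# All numbers below are assumptions for illustration, not real Fermi specs.
P_LEAK_25C = 30.0   # leakage power at 25 C, in watts (assumed)
LEAK_GROWTH = 0.03  # leakage grows ~3% per degree C (assumed)
R_THERMAL = 0.15    # cooler thermal resistance, C per watt (assumed)
T_AMBIENT = 25.0    # case/ambient temperature in C

def settle(p_dynamic, steps=200):
    """Iterate the power/heat feedback loop toward a steady state.

    Returns (total_power, temperature) if the loop stabilizes,
    or (None, None) if it thermally runs away.
    """
    temp = T_AMBIENT
    total = p_dynamic
    for _ in range(steps):
        # Hotter chip -> exponentially more leakage current.
        leak = P_LEAK_25C * math.exp(LEAK_GROWTH * (temp - 25.0))
        # More leakage -> more total power -> higher temperature.
        total = p_dynamic + leak
        temp = T_AMBIENT + R_THERMAL * total
        if temp > 150.0:       # past any realistic junction limit
            return None, None  # thermal runaway: no stable point
    return total, temp

stable = settle(150.0)   # settles at a safe operating point
runaway = settle(250.0)  # no stable point: returns (None, None)
```

With these made-up constants, 150 W of switching power settles around 60 °C, but 250 W blows past any sane junction temperature – which is the whole point: above some power level the cycle has no stable operating point, and the only electrical way out is to draw less dynamic power in the first place.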

There are several electrical fixes for this, but every scenario requires Nvidia to either lower the clock or reduce the number of shaders, both of which sacrifice the “enormous” gain Fermi has over competing ATI cards. Not only that, but the yield from fabrication is pathetically low – around 2 percent. Technical problems aside, at a yield that low, prices would be much too high to be practical or profitable for Nvidia. This is exactly why Nvidia has only been able to show off the card selectively. What they didn’t explain was how it took them thousands of chips to get a working one, at a cost of thousands of dollars. Oh, and hopefully they didn’t have to run it for more than a few minutes; environmentalists wouldn’t be too happy. Nvidia has a lot to overcome, and it may be another year until we see an actual product. Good luck, Nvidia!
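The yield arithmetic alone explains the pricing problem. A quick back-of-the-envelope – the 2 percent figure comes from the rumor above, but the wafer cost and die count here are purely made-up round numbers – just spreads one wafer’s cost over the dies that actually work:

```python
# Assumed round numbers for illustration only; the 2% yield is the
# rumored figure, everything else is a guess.
WAFER_COST = 5000.0   # assumed cost of one 40 nm wafer, in dollars
DIES_PER_WAFER = 100  # assumed candidate dies per 300 mm wafer

def cost_per_good_die(yield_rate):
    """Spread the whole wafer's cost over the working dies only."""
    good_dies = DIES_PER_WAFER * yield_rate
    return WAFER_COST / good_dies

low = cost_per_good_die(0.02)  # rumored yield: $2500 per working die
ok = cost_per_good_die(0.60)   # a healthy yield: ~$83 per working die
```

At these assumed numbers, a 2 percent yield puts each working die at $2,500 in wafer cost alone, versus roughly $83 at a healthy 60 percent yield – a 30x gap before packaging, testing, or the rest of the board even enter the picture.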

Check out the following article, my main source, for more information:
