
Alas poor nVidia, we knew ye sucked


11 replies to this topic

#1 Caspa

Caspa

    title available

  • Members
  • 938 posts
  • Location:Sunderland
  •  Shine On, Lurk Moar

Posted 29 March 2010 - 07:52 PM

I'm going to say it right now: nVidia is all but dead, dragging its festering corpse around for all to see. The fabled GTX 480, otherwise known as "Fermi", has arrived, been benchmarked and been greeted with a resounding, overwhelming "is that fucking it?". It sucks power like a desperate hooker and turns your chassis into a blast furnace. I am not joking here: the idle temperature is a whopping 65°C. That's the idle temp. Full load can reach 97°C while dragging total system power draw up towards 500W. Remember, this is the new flagship, single-GPU card that has been hyped up the arse for the past 9 months. This is the next-generation GPU progress we're meant to expect? Well, fuck that. In most cases it gets pissed all over by the 5870 and 5970, and in some cases even by the GTX 295. Even the fanboy crowd should be able to smell the ghastly whiff of rancid goat piss all over this one. Couple that with drivers written by drunken, blind, Korean rice farmers and you've got a recipe for a steaming pile of bullshit and chips.
Don't get me wrong, I do respect what nVidia has tried in terms of graphical innovation: pushing for better physics in gaming by having the GPU do more of the work instead of the CPU, and drastically improving Flash performance by utilizing the GPU. But I still think most of the development team are huffing paint thinners and giving each other piggyback rides.
ATI's big gimmick is multi-display technology: a GPU strong enough to drive six 1080p monitors at the same time. nVidia's big gimmick? With an expensive and hefty projector, while sitting at exactly the right distance from the screen and wearing a pair of expensive 3D glasses, you can have 3D in any game, so long as the game boasts the "nVidia, the way it's meant to be played" logo. Not to sound too cynical about all of this, but can I have one of those piggyback rides?
Every single one of nVidia's releases since the GTX 280, aside from the monstrous GTX 295, has been a rebrand or rename of an old card. Sometimes they'd tweak the GPU architecture or the cooling, but in most cases they'd be exactly the same. The Fermi GPU looks to bring about the same level of innovation, with a range of 470, 450 and 440 cards coming soon. Prepare to meet the release of these cards with the same level of excitement as you would if you were told somebody in Canada had farted.
nVidia have run out of ideas completely but won't admit defeat yet. They still seem intent on selling overpriced, underpowered cards until the fanboys give up completely and switch to ATI, and good luck to them with that strategy. If you're reading this and work for nVidia, good luck with your job hunting, and give me a fucking piggyback, you lazy, useless, unimaginative fuck.

Edited by Caspa, 29 March 2010 - 07:55 PM.

Hostile is a cunt.

Thought I'd have that here to save time.

#2 Beowulf

Beowulf

    Shipgirl

  • Advisors
  • 7,219 posts
  •  Azur Lane Fangirl

Posted 29 March 2010 - 09:25 PM

You do realize that a lot of this has to do with drivers, right? Performance will improve as driver releases mature and work better with the new tech. Benchmarks for brand-new cards usually aren't very impressive.

NZ.org | BBPCG
Discord: The Astronomer#1314
Steam


#3 Caspa

Caspa

    title available

  • Members
  • 938 posts
  • Location:Sunderland
  •  Shine On, Lurk Moar

Posted 29 March 2010 - 10:58 PM

Drivers? You honestly want to try defending nVidia with their drivers? nVidia's drivers are steaming piles of shit these days. You could train a chimp to write drivers with fewer bugs in them. Remember the glorious triumph of their last release? The one that stopped the fan from working and let the GPU overheat into a melted silicon/metallic mess? The most recent drivers nVidia released that weren't a buggy pile of rancid goat's piss were 190.62, and that was a long time ago. Most of the drivers before then, and all of the drivers since, have been buggy shitpiles.
Hostile is a cunt.

Thought I'd have that here to save time.

#4 Beowulf

Beowulf

    Shipgirl

  • Advisors
  • 7,219 posts
  •  Azur Lane Fangirl

Posted 29 March 2010 - 11:24 PM

No idea what your problem is. I have not had a single issue with nVidia cards, their drivers or their performance. Fuck off, buy an ATI and shut the fuck up already.

[EDIT] Like I thought, ATI had a brand-new driver release just before the latest nVidia cards launched. This always happens: the other guys push out performance drivers timed to boost their newest releases. nVidia's current driver batch is not yet tuned for the newly released cards, and subsequent driver releases will lift their performance. It happens every generation.

I have to ask, do you ever do any fucking research?

NZ.org | BBPCG
Discord: The Astronomer#1314
Steam


#5 Jeeves

Jeeves

    I write the interwebz

  • Members
  • 4,156 posts
  •  Friendly neighborhood standards Nazi

Posted 30 March 2010 - 07:45 AM

God help us all if we're ever to rely on NVIDIA writing drivers

World Domination Status: 2.7%


#6 ambershee

ambershee

    Nimbusfish Rawks

  • Hosted
  • 3,114 posts
  • Location:Derby, UK
  • Projects:Mutator Week & Unreal 3 Projects
  •  Mad Mod Boffin

Posted 30 March 2010 - 10:58 AM

ITT: ATI fanboy hormonal raging. It's like reading the comments on Nvidia demo videos on YouTube. Here's a protip: both manufacturers produce the goods. Buy whatever works best for you at the best price and shut up already; the debate went stale not long after the GeForce 6.

Edit: For the record, I can run eight monitors from my Nvidia-based machine.

Edited by ambershee, 30 March 2010 - 11:00 AM.


#7 Caspa

Caspa

    title available

  • Members
  • 938 posts
  • Location:Sunderland
  •  Shine On, Lurk Moar

Posted 30 March 2010 - 08:40 PM

Strangely enough, I'm not an ATI fanboy. I'm just rather sickened by nVidia these days and thoroughly disappointed by their latest release.
Hostile is a cunt.

Thought I'd have that here to save time.

#8 ambershee

ambershee

    Nimbusfish Rawks

  • Hosted
  • 3,114 posts
  • Location:Derby, UK
  • Projects:Mutator Week & Unreal 3 Projects
  •  Mad Mod Boffin

Posted 02 April 2010 - 12:43 AM

"Sickened" by a card with Dx11 capability, 1.5Gb dedicated video memory and that performed slightly better in benchmarks than it's ATI competitor (the 5870), yet priced at around £220 when the 5870 is around £280.

I don't really get your argument here.

If you want an overpriced, underpowered card, why not go and take a look at ATI's 5970? I can't find it cheaper than around £550 (approx. $1000) at the moment.

#9 Beowulf

Beowulf

    Shipgirl

  • Advisors
  • 7,219 posts
  •  Azur Lane Fangirl

Posted 02 April 2010 - 05:55 PM

I can buy a 5970 for as low as $700. ;)

NZ.org | BBPCG
Discord: The Astronomer#1314
Steam


#10 ambershee

ambershee

    Nimbusfish Rawks

  • Hosted
  • 3,114 posts
  • Location:Derby, UK
  • Projects:Mutator Week & Unreal 3 Projects
  •  Mad Mod Boffin

Posted 02 April 2010 - 06:23 PM

I'd still hardly call that "low" ;)

Also, hardware is cheaper in the US for no reason at all (and I'm going on UK prices).

#11 Beowulf

Beowulf

    Shipgirl

  • Advisors
  • 7,219 posts
  •  Azur Lane Fangirl

Posted 02 April 2010 - 11:06 PM

Hey. It's not $1000. ;)

NZ.org | BBPCG
Discord: The Astronomer#1314
Steam


#12 Guest_Guest_*

Guest_Guest_*
  • Guests

Posted 18 April 2010 - 07:17 PM

The Top 10 Innovations in the New NVIDIA Fermi Architecture,  
and the Top 3 Next Challenges

David Patterson 
Director, Parallel Computing Research Laboratory (Par Lab), U.C. Berkeley

September 30, 2009 

I believe the Fermi architecture is as big an architectural advance over G80 as G80 was over NV40. The combined result represents a giant step towards bringing GPUs into mainstream computing. [...]

http://www.nvidia.co...NVIDIAFermi.pdf

There is more to this than gaming performance. With Fermi, Nvidia introduced features that make their GPUs more useful for general-purpose computing tasks, among other things.
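As a rough illustration of the kind of general-purpose work being talked about, here is a minimal CUDA C sketch (a plain vector add, purely a hypothetical example, nothing in it is Fermi-specific):

// Minimal CUDA C sketch of a general-purpose GPU task: element-wise vector add.
// Illustrative only; relies on nothing beyond the basic CUDA runtime API.
#include <cuda_runtime.h>
#include <vector>

__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one element per thread
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);

    // Allocate device buffers and copy the inputs over.
    float *da, *db, *dc;
    cudaMalloc(&da, n * sizeof(float));
    cudaMalloc(&db, n * sizeof(float));
    cudaMalloc(&dc, n * sizeof(float));
    cudaMemcpy(da, ha.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    cudaMemcpy(hc.data(), dc, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(da); cudaFree(db); cudaFree(dc);
    return 0;
}

The point of Fermi's additions (caches, ECC memory, better double-precision throughput) is to make this sort of non-graphics code faster and more dependable, not to win game benchmarks.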

Personally I agree it's a monstrosity, but for others it will fit the bill.



