Let's not forget the process difference: Nvidia = 0.13 micron, ATI = 0.15. My Nvidia fanboydom remains secure, and $499 is well worth it in my opinion. Gabe Newell can put that in his pipe and smoke it. I knew this baby would come out before HL2 and Doom 3; my dreams have come true!
I like Nvidia, let's hope this supports my loyalties :)
QUOTE (DOOManiac @ Apr 14 2004, 06:25 PM): "(we use them at work)"

Where do you work?

EDIT: Wth? Where are the Quake III benchmarks? Those are the only scores I understand.
QUOTE (DOOManiac @ Apr 14 2004, 11:25 PM):
    QUOTE ($niper Chance @ Apr 14 2004, 05:09 PM): "What's with those female ports? They look nothing like standard VGA ports. What, we have to buy a new monitor or conversion plug for this? :/"
    Those are DVI plugs. Every card since like the GeForce 1 has had one of them on the back, and now Nvidia is finally helping to push monitor technology out of the stone age. It's for a much better video signal on flat-panel monitors. Don't worry, I'm sure the card will come with a dongle that converts it to VGA. ATI has DVI-only cards too (we use them at work), and they all come with a dongle to translate to VGA for the few CRTs we have.

I noticed that when I bought mine recently. I was worried for a second till I saw the adapter. Phew :D
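For anyone wondering what DVI actually buys you, here's a rough back-of-the-envelope sketch of what a single link can drive. The 165 MHz pixel clock is the real single-link TMDS limit; the 25% blanking overhead is just a typical ballpark figure, not a number from the spec:

```python
# Rough sketch: what resolutions/refresh rates fit in a single DVI link.
# Assumes the 165 MHz single-link TMDS pixel clock; the blanking overhead
# multiplier is a rough typical value, not taken from the DVI spec.

PIXEL_CLOCK_HZ = 165_000_000  # single-link TMDS limit

def max_refresh_hz(width, height, blanking_overhead=1.25):
    """Highest refresh rate a single link can drive at a given resolution,
    counting the blanking interval as extra pixels per frame."""
    pixels_per_frame = width * height * blanking_overhead
    return PIXEL_CLOCK_HZ / pixels_per_frame

for w, h in [(1280, 1024), (1600, 1200), (1920, 1440)]:
    print(f"{w}x{h}: ~{max_refresh_hz(w, h):.0f} Hz max")
```

That's roughly why 1600x1200 at 60 Hz is where a single link tops out. And since the signal stays digital end to end, a flat panel skips the DAC-to-ADC round trip that smears an analog VGA picture.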
QUOTE (Grillkohle @ Apr 14 2004, 04:21 PM): "O_O You could beat someone to death with that video card. I wonder what it weighs. If I could spare the money for a vid card like this one, I would wait until ATi releases their new 'flagship'. I just like their stuff better."

Do people even bother reading the articles that are posted anymore?
Too bad you need an extra slot for it and two power connectors. :/ But I guess that's the price to pay if you want to bring graphics to life. Too bad it really is that power-hungry.
EDIT: I wonder what ATi will show, though... let's hope they both are similar, and then we'll have the masses getting confused about which card to get :D :D
Meh. I'd rather wait for the GeForce LX 6660, slated for mass production in 2009. I've heard rumors that it harnesses the power of foul demons from the netherworld to do its bidding. They're projecting that it'll cost about five thousand dollars on release.
QUOTE (007Bistromath @ Apr 14 2004, 09:48 PM): "Meh. I'd rather wait for the GeForce LX 6660, slated for mass production in 2009. ... They're projecting that it'll cost about five thousand dollars on release."

Oh, MUCH more than that... I'd bet my soul that baby will be the king of the new engine in the works: http://www.shinyidol.com/crap/unreal3_0002.wmv

Just watch the video. It makes HL2 look like HL...
Looks like Valve took too long... :(
QUOTE (Crono5788 @ Apr 14 2004, 05:45 PM): "Where do you work? EDIT: Wth? Where are the Quake III benchmarks? Those are the only scores I understand."

I work at a small company called Funds For Learning (http://www.fundsforlearning.com). I get to program though, and that pwns.

And you don't use a five-year-old engine to benchmark top-of-the-line hardware...
QUOTE (DOOManiac): "And you don't use a five-year-old engine to benchmark top-of-the-line hardware..."

I dunno... I think every benchmark has its place, as an older engine might stress a different limit (like the CPU or something, I dunno).
A pure Quake 3 engine stresses NOTHING. Benchmarks where you're comparing 280 fps against 315 are absolutely worthless, especially when the engine has nothing that grinds the card like modern games do (bump mapping, per-pixel lighting, etc.).

Hell, I can't wait for DOOM 3 and Stalker just for benchmarking purposes...
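Just to back that up with arithmetic: convert fps to frame times and the 280-vs-315 "difference" evaporates. A quick sketch (the fps figures are just the ones from the post above):

```python
# Frames per second is a reciprocal scale: gaps at high fps mean almost
# nothing in actual per-frame time.

def frame_time_ms(fps):
    """Milliseconds the card spends on each frame at a given fps."""
    return 1000.0 / fps

for fps in (280, 315, 60, 75):
    print(f"{fps:>3} fps = {frame_time_ms(fps):6.2f} ms per frame")

# 280 fps -> 3.57 ms and 315 fps -> 3.17 ms: a 0.40 ms gap, pure noise.
# 60 fps -> 16.67 ms vs 75 fps -> 13.33 ms: a 3.3 ms gap you can feel.
```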
QUOTE (Omegaman! @ Apr 14 2004, 10:06 PM): "Oh, MUCH more than that... I'd bet my soul that baby will be the king of the new engine in the works. Just watch the video. It makes HL2 look like HL..."

I don't think you wanna go betting that, considering we *are* dealing with horrors from beyond, and all that.
From what I understand, the thing was supposed to be good for 2007, but there's some argument in the dev crew as to whether an elder sign or a baphomet would work better to prevent what they call "soulware failure."
The only reason it can achieve such high speeds is the massive heat sink. I thought their last one was bad, but this is just stupid.
Watch ATI come out with a card that is just as good, takes one slot and half the power, and still manages to beat it, like the last two times. Sheesh. (I'm hoping!)
Watch, out of nowhere, 3dfx jumps in with a card three times the speed of this and the new Radeon... (even though 3dfx is part of Nvidia now :P)
QUOTE (Leaderz0rz @ Apr 15 2004, 12:48 AM): "Well, if you have 600 dollars to blow and an extra 60 for a new power supply, go right ahead... and heh, no PCI Express support. Tisk, tisk."

That's because PCI Express isn't out there yet, chief.
The engineers at Nvidia must have rocks in their heads if they think anyone's going to consider this thing. What a lump of complete and utter uselessness. If a chip needs that much cooling oomph, it's running too fast and generating too much heat in the first place.

ATI will pwn the pants off it.

--Scythe--

QUOTE (Scythe): "If a chip needs that much cooling oomph, it's running too fast and generating too much heat in the first place."

Except the highest fan RPM was never triggered in testing, and a 50 MHz overclock on the core was still possible. Not to mention the whole heat sink is ALUMINUM, so it has to be a little bigger to get rid of all the heat.

Furthermore, it's a fairly quiet card.

If I had the money, I'd buy it in a heartbeat. I wouldn't mind the space it takes up, and my simple 40-dollar Thermaltake PSU has plenty of PSU connectors.
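On the aluminum point, here's a crude sketch of why an all-aluminum sink has to be bulkier than a copper one for the same heat load. The conductivities are textbook values; the straight 1/k scaling is a gross simplification that ignores fins and airflow entirely:

```python
# Crude scaling argument: conduction resistance goes roughly as 1/k, so an
# aluminum sink needs about k_cu/k_al times the metal in the conduction
# path to match a copper one. Real sinks are dominated by fin area and
# airflow, so treat this as a ballpark, not a design rule.

K_ALUMINUM = 205.0  # W/(m*K), pure aluminum (textbook value)
K_COPPER = 385.0    # W/(m*K), pure copper (textbook value)

ratio = K_COPPER / K_ALUMINUM
print(f"Aluminum needs roughly {ratio:.1f}x the bulk of copper, all else equal.")
```

Copper would run smaller but heavier and pricier, so an oversized aluminum sink paired with a slow fan is a reasonable trade for a quiet card.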
QUOTE: "Doesn't take much to get the fanbois foaming, does it? Looks like another dustbuster to me. We'll see if Nvidia have read the market right this time and people actually buy this puppy. Anyway, it's all good. If anything, it will move ATI to do better, sooner."

I'm pretty sure the articles even put emphasis on the fact that this *isn't* another dustbuster. That's why the heat sink is so damn big.
:0 Much sehks. Too bad I can't afford it :( , I can't even afford a 9800. Balls, I'll probably end up getting my first ever decent video card in 2012, when I'm 22 :(
QUOTE (version91x @ Apr 15 2004, 07:52 AM): "It's all a conspiracy, guys. The GeForce 5 is two GeForce 4s stuck together. Let's look at the proof:
- twice the size... check
- twice the power usage... check
- twice the cost... check
- twice the performance... check"

Your theory falls through in two areas:
- half the weight
- half the noise

- GF6 announced
- Unreal 3 announced
- GF6 the only card advanced enough for Unreal 3

NOES!

Unreal 3 hasn't been announced, damn it. That's just the latest version of the Unreal engine they were showing at GDC. And just because they showed it doesn't mean it will be in anything before two or three years' time.
QUOTE: "Watch ATI come out with a card that is just as good, takes one slot and half the power, and still manages to beat it, like the last two times. (I'm hoping!)"

You & me both.
The thing is huge.
I will certainly get it... next year... when the price drops to $60... yeah, that's it!