Trust me, you can. I've been overclocking everything that can be overclocked for years, and you CAN compensate for a lack of stream processors with a higher clock. The stock HD 7950 had a GPU clock of 850 MHz; mine is running at 1100 MHz.
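For a rough sense of the numbers, here's a back-of-the-envelope sketch (my own, using the commonly cited shader counts and clocks, so treat the figures as approximate): theoretical shader throughput scales with stream processors × clock.

```python
# Back-of-the-envelope shader throughput: 2 FLOPs per stream processor per clock
# (one fused multiply-add). The shader counts and clocks below are the commonly
# cited specs, not measurements.
def gflops(stream_processors, clock_mhz):
    return 2 * stream_processors * clock_mhz / 1000.0

print(f"HD 7950 @ 850 MHz (stock):      {gflops(1792, 850):7.0f} GFLOPS")
print(f"HD 7950 @ 1100 MHz (overclock): {gflops(1792, 1100):7.0f} GFLOPS")
print(f"HD 7970 @ 925 MHz (stock):      {gflops(2048, 925):7.0f} GFLOPS")
```

On paper the overclocked 7950 ends up slightly ahead of a stock 7970, although real-game scaling also depends on how far the memory clock can be pushed.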
Fanboyish glasses? NVIDIA cards are always significantly more expensive while not offering anything better worth mentioning. And I don't like their driver interface either. I've had my share of cards from both camps, changing sides depending on what was the best for the money. And frankly, NVIDIA hasn't had anything good for years.
Yeah, except better drivers, better driver interface, better performance at lower power consumption, and better multi-card experience.
AMD can't even come close to how far Nvidia has pulled away in the last couple of years. The only thing they can do is produce sub-par quality cards for a correspondingly sub-par price, or release lagging products. You do realize the "780 killer" is STILL unreleased six months after Nvidia took pretty much 95% of the high-end market without any resistance, and supposedly it "beats" the 780 by consuming 400 W and putting out the heat that goes with that? And all that catch-up comes three months before Nvidia releases its next architecture (not just chip refreshes). AMD has some work to do.
The only place where Nvidia is total shit is "gaming" card compute performance, because they artificially limit it so they can sell their Tesla product line. That's the only thing about AMD that I like: they don't have separate "gaming" (7xx), "workstation" (Quadro), and "compute" (Tesla) product lines. They just have "video cards." That, and the obviously cheap price if you don't care about quality.
On topic: Mantle will be as shit as CUDA due to being proprietary, and it's unnecessary for this game. No time should be spent even considering supporting stupidity like that.
Better drivers in what way? To me they are pretty much the same. Better interface? That's a matter of opinion, but I like CCC more. NVIDIA used to have a good interface, but the latest one is just bad. Better performance? Right. It's pretty much the same, with the two taking turns everywhere. Lower power consumption? Bollocks. Ever seen ZeroCore at work? And who cares about power consumption? I have a high-end card; I care about framerate, not power consumption. That's like having a V8 Corvette and worrying about gas mileage... Better multi-card? Maybe. To be frank, I never liked the idea from either camp, because both approaches are flawed by design. I prefer a faster single-GPU option over anything else...
You can't compare Mantle with CUDA. CUDA is a very specific, computation-only API; Mantle handles the whole rendering process. Only a few apps actually benefit from CUDA, whereas every single game could benefit from Mantle, as long as support for it is added. CUDA can only be compared to DirectCompute: they are both meant for the same thing and are basically the same thing in general.
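For anyone who hasn't touched GPU compute, "computation-only API" basically means "launch kernels over arrays and get numbers back"; there are no draw calls or render targets anywhere in it. A minimal sketch of that idea, here using Numba's CUDA bindings purely as an illustration (it assumes an NVIDIA GPU and the numba package installed):

```python
# Minimal compute-only GPU example: a SAXPY kernel (out = a*x + y).
# Nothing here touches the graphics pipeline; it's just number crunching.
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    i = cuda.grid(1)              # global thread index
    if i < out.size:
        out[i] = a * x[i] + y[i]

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)

d_x = cuda.to_device(x)           # explicit host -> device copies
d_y = cuda.to_device(y)
d_out = cuda.device_array_like(x)

threads = 256
blocks = (n + threads - 1) // threads
saxpy[blocks, threads](np.float32(2.0), d_x, d_y, d_out)   # kernel launch
out = d_out.copy_to_host()        # device -> host copy of the result
```

A rendering API like Mantle or Direct3D covers a much broader surface: command submission, pipeline state, render targets, and draw calls for whole frames, not just array kernels.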
Just went CrossFire with an AMD HD 7770. I was playing at 60-80 fps with one card; with CrossFire I now get 150-200 fps in NS2.
From what I've read on Mantle, if it benefits all GPUs and lessens the CPU bottleneck, even if only by a few percent, it's still something that would help. Anything right now that increases performance for any GPU is a plus.
While this debate about AMD/ATI vs. NVIDIA is off-topic for the thread, I still want to chime in with something.
When *most* people buy video cards, they aren't looking at the highest cost cards to buy, they are looking for the best cards for the price or best bang for the buck. As of this moment, ATI holds the top 3 spots for price vs performance. This is one of the charts that I use when I buy a new video card. I don't like to spend more than ~$100-110 on a video card.
NVIDIA has better cards if you don't mind spending the money, though. My 770 cost about $500.
Video cards are among the fastest-depreciating items in a computer. In six months you'll be able to get that 770 for half the price, if not less. As a result, I don't see the point in spending that much on a video card... unless you just have money to throw around.
It's just money, dawg. What's the point of having it if you're not going to spend it on cool things?
Having something for backup, in case something happens?
Also, what's the point in spending money on cool things if you can get 80% of the coolness for 50% of the price? Numbers are wild guesses.
Go big or go home (while maintaining emergency funds). :P
I don't buy a new card every year, so when I finally got around to buying another one, I wanted something near the top so it doesn't become outdated quite as fast. It's that or I spend half the money twice as often to make sure I don't fall out of date.
Even a mid-range ~$100-$110 card will last a gamer at least two years, unless you're a competitive player in the latest games who needs fps as high as possible to stay competitive.
Hell, I still use a Radeon 4850 for even the latest games... Admittedly, that's starting to push it even for me.
DC_Darkling:
Fine, fine... I'll go help stay off topic. *glares carefully at mods*
I chose NVIDIA the last time I replaced my card, somewhere within the last year.
My previous card, an ATI, was dying and/or performing abysmally on the drivers; I'm still not sure which.
I set out to form a budget and see what the best card for that budget would be, with two things in mind:
* had to be as strong as I could get.
* had to be as silent as I could get.
Those two do not mix, so it's an interesting combination that pushes you toward a lower-performing card within the same budget. ATI/AMD usually runs hotter for the same performance, but I kept an open mind.
I proceeded to search the net for benchmarks, but more importantly, real-life tests, where folks would simply shove the cards into the same system and PLAY lots of games on lots of settings.
I then put all the cards within budget next to each other and checked things like reported fps, stability, frametimes, etc. (frametimes were a gamble, still fairly new at the time).
In the end I found that at my resolution (Full HD) and within my budget, AMD and NVIDIA performed about equally. NVIDIA was ahead on cooling/silence.
The "equal performance" may have had an fps difference of 5, sometimes 10, depending on the game, in either direction, but with a very acceptable minimum in my eyes.
That almost made me go for NVIDIA, but I wanted another solid point.
Same fps, but NVIDIA had PhysX. I have PhysX games. So in the end NVIDIA won.
I now have a Gainward Phantom GTX 570. The Phantom series has huge coolers.
So yes, NVIDIA is better to a degree, and yes, they run less hot. But in the end the difference isn't huge and is very possibly down to drivers. Just go check the lists yourself and then pick.
Noise is irrelevant once you look at non-reference cards. My HD 7950 with the WindForce 3X cooler is set to run at 30% fan speed when idle and a maximum of 40% under load. There is a failsafe that pushes the fans to higher RPM if the GPU exceeds 80°C, but I have yet to see it reach that. I always run all games at the maximum possible settings, without exception. Pretty much absolute silence while gaming.
For a game like Natural Selection, I'd love to see (or rather, hear) TrueAudio support.
Just imagine being able to locate and track a skulk with your headset.
Wouldn't that be sweet?
Here is a TrueAudio demo:
Being able to locate and track a skulk with your headset would certainly be sweet! I'm very enthusiastic about 3D audio (especially for headphones) and I'd love to see more game developers implementing technologies that provide 3D audio for headphone users.
Having said that, I'm very sceptical about the demo video you posted. To me, that sounds like a bunch of binaural dummy head recordings that have been mixed together. If this is the case, then it's an absolute fraud.
I would love it to be true, though. I would love for the people at GenAudio to prove to me that each sound source in that demo started as a mono audio asset, and that they used software to position and move each sound around inside a virtual environment, and that the sounds were convolved using sophisticated filters to produce the final 3D sound field.
For now, though, I'm not convinced. The bit in the demo that I'm most sceptical about is the part where he puts a bag over your head. Here's why:
How would you record the unprocessed source audio for that effect?
How would you create a DSP to accurately reproduce that effect from the source sound(s)?
Why would you even want to calculate this effect in real-time when you could just use an actual binaural dummy head recording?
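As an aside, for anyone wondering what the pipeline I described above (mono source assets positioned in a virtual space and convolved with filters) actually looks like, here's a minimal sketch. The filenames are placeholders for a mono sound effect and a pair of head-related impulse responses (HRIRs) measured for one direction; a real system would interpolate between many measured directions as the source moves.

```python
# Minimal HRTF rendering sketch: convolve a mono source with the left/right
# head-related impulse responses (HRIRs) for a single direction.
# All filenames are placeholders.
import numpy as np
import soundfile as sf
from scipy.signal import fftconvolve

mono, sr = sf.read("skulk_footstep.wav")   # mono source asset
hrir_l, _ = sf.read("hrir_left.wav")       # impulse response, left ear
hrir_r, _ = sf.read("hrir_right.wav")      # impulse response, right ear

left = fftconvolve(mono, hrir_l)           # filter the source for each ear
right = fftconvolve(mono, hrir_r)

out = np.column_stack([left, right])
out /= np.max(np.abs(out)) + 1e-9          # simple peak normalisation
sf.write("skulk_binaural.wav", out, sr)
```

Doing that per source, per frame, with interpolated HRIRs and room reflections on top, is the kind of workload TrueAudio is pitched at offloading.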
I was also not convinced by their Doom 3 in-game demo video. I was not able to perceive any hint of sounds going behind me. The scientist's voice, for example, seems to only have basic left-to-right panning applied to it. The evil voices at 4:20 seemed more promising but the inconsistencies are concerning.
Since 2010, I've been a huge fan of a 3D audio solution called Rapture3D. Of all of the 3D audio systems that I've tested, this one is the best by far.
You can read more about my opinions on this subject in a post I wrote on my personal blog called: I Want HRTFs In My Games!
My suggestion thread on Steam Discussions might also be of interest: SOURCE ENGINE AUDIO :: Head-Related Transfer Functions.
Regarding Mantle (sorry for going off topic)...
If I were an AMD user, I might be much more interested in this. As it stands, I'm only vaguely interested in it. I do look forward to hearing about other people's experiences with it in December, when Battlefield 4 is patched to use it.
These days, more and more studios are licensing engines like Unreal, Cryengine, Frostbite, and Unity instead of creating their own engines. Thus, it's only a handful of companies that produce the technology that powers a very large portion of all 3D video games on the market. Provided that these companies do the heavy lifting involved with integrating and optimizing their engines for Mantle, we might end up seeing quite a lot of games automatically supporting Mantle as a result. It should be interesting to see how this pans out.
Much like @Ghosthree3, I'm way more excited about NVIDIA's new G-Sync technology, though. I'm looking forward to getting my hands on a G-Sync enabled monitor as soon as they're readily available.
NVIDIA has better cards if you don't mind spending the money, though. My 770 cost about $500.
Video cards are among the fastest-depreciating items in a computer. In six months you'll be able to get that 770 for half the price, if not less. As a result, I don't see the point in spending that much on a video card... unless you just have money to throw around.
Not really true; the GTX 580s I bought for around $500 each about two or three years ago sold on eBay recently for right about $400 each. So I lost about 20% over the two or three years I was using them, which really isn't bad. The cheaper cards will definitely lose value much faster, but the x70 and x80 models hold up pretty well because they're rarer and just overall better cards than the lower models.
Noise is irrelevant once you look at non-reference cards. My HD 7950 with the WindForce 3X cooler is set to run at 30% fan speed when idle and a maximum of 40% under load. There is a failsafe that pushes the fans to higher RPM if the GPU exceeds 80°C, but I have yet to see it reach that. I always run all games at the maximum possible settings, without exception. Pretty much absolute silence while gaming.
So you're telling him he's wrong because he switched and got a card that performs better, is quieter/cooler, has better driver support and extra features, and is within his budget? I also find it hard to believe you're managing to stay below 80°C at 40% fan speed unless you're playing some fairly easy-to-render games, or at a low resolution, with an ambient temperature of about 15°C. I'd honestly like to see some proof, because that would be somewhat impressive.
the new G-Sync tech (that will only work with nvidia cards).
That's exactly my biggest problem with Nvidia. I get why they do this from a business perspective, but this isn't helping us PC gamers one bit. It's practically saying FU to the other 48% of the market.
Go open standard or go home.
EDIT: And it's not 48%, it's more like 30%.
It does tap into technology they already have in their GTX cards... it's not like they can just make it automatically work with any card; they'd have to give the technology to ATI so ATI could make their cards work with it. In what world would that even slightly be a good idea? When has that ever happened, for anything?
It's no different from AMD Mantle, which is only compatible with their Graphics Core Next (GCN) based products.
http://en.wikipedia.org/wiki/Mantle_(API)
Mantle is a low-level API specification developed by AMD as an alternative to Direct3D and OpenGL. Currently the only implementation is for AMD's GCN graphics processing unit architecture, although there is speculation other GPU vendors might be able to implement it in future.
Speculation. Given the low-level nature of the API, I am doubtful.
Also, here's a tweet from Robert Hallock (PR for Radeon & Gaming @ AMD) in response to the question "Is Mantle open source?":
https://twitter.com/Thracks/status/383872285351739393
DC_Darkling:
@RejZoR
No, it is not irrelevant; it depends on the customer's 'needs'.
What you describe is, for me, unacceptable.
My current card, in my current PC case, has fan speeds I do not influence. The end result is that it keeps the card around 50-60 degrees, which is a very, very good thing for me, as my PC is on the sunny side of the house and it gets hot enough in here already.
Lesser-performing cards made it way, way too hot, which of course means sucking in hot air, making a bad situation... worse.
While it DOES keep the card cool, it also does so at a noise level where, even under load, I never hear the card. Ever.
For me, it was a perfect choice. It matches all my needs and expectations with zero downsides.
IronHorse (NS2 Developer, QA Manager):
That G-sync looks slick, but when will we see it exactly?
Only one monitor supports it so far... and NVIDIA doesn't have a good record of getting other companies to adopt its tech. If software adoption (PhysX) for their already-prevalent hardware has taken years, and even then sees little use due to its exclusive nature... I can only imagine how much of a failure it's going to be to get actual HARDWARE changes from manufacturers to support their exclusive system.
I wouldn't bet on either Mantle or G-Sync right now, and both for similar reasons.
Honestly it doesn't really bother me if G-Sync doesn't get installed into many monitors (though I have a feeling it might be an option for most newer ones in the next 2 years, with a massive price bump). I'll just get one of the ones that is supported.
That G-sync looks slick, but when will we see it exactly?
Only one monitor supports it so far..
Four manufacturers have signed up to support G-Sync (ASUS, BenQ, Philips, and ViewSonic). I think I remember reading that some of their monitors will become available early next year.
Well, if you want to talk open source, at least the G-Sync module looks like something you could install in most monitors if you have the technical skills. Also, if it does what the articles say it does, which is make FPS no longer an issue, well, that's pretty freaking awesome. Making 30 fps look like 60 fps would really be a game changer.
I'm not so sure about making fps no longer an issue. The only visual advantage, to me, would seem to be removing the extra stutter caused by the monitor refreshing more often than there are frames to update, and of course (not a visual thing) it's basically vsync without any of the bad side effects. I'm sure fps can still be low enough to look choppy.
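To put rough numbers on that, here's a little simulation sketch of my own (toy numbers, not from any of the articles): a game rendering a steady 45 fps shown on a fixed 60 Hz monitor with vsync versus on a variable-refresh display. On the fixed-refresh monitor the on-screen frame times alternate between one and two refresh intervals (16.7 ms and 33.3 ms), which is the judder G-Sync removes; it obviously can't invent frames that were never rendered.

```python
# Frame-pacing sketch: when does each rendered frame actually appear on screen?
# A fixed-refresh display with vsync can only flip on a refresh boundary;
# a variable-refresh (G-Sync-style) display flips as soon as the frame is done.
import math

RENDER_FPS = 45.0
REFRESH_HZ = 60.0

# Completion times of 30 frames rendered at a perfectly steady 45 fps (seconds).
render_times = [i / RENDER_FPS for i in range(1, 31)]

# Fixed 60 Hz + vsync: each frame appears at the next refresh tick after it finishes.
fixed = [math.ceil(t * REFRESH_HZ - 1e-9) / REFRESH_HZ for t in render_times]
# Variable refresh: each frame appears the moment it finishes.
variable = render_times

def intervals_ms(times):
    return [round((b - a) * 1000, 1) for a, b in zip(times, times[1:])]

print("fixed 60 Hz, on-screen intervals (ms):   ", intervals_ms(fixed)[:9])
print("variable refresh, on-screen intervals (ms):", intervals_ms(variable)[:9])
```

So the mismatch stutter goes away, but a genuinely low frame rate will still look choppy, just evenly choppy.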