So far I have only had ATI cards, but I'm going to buy a GTX 4x0 from Nvidia this summer, mainly because their new architecture brings many genuine innovations. The HD 5000 series is just a build on ATI's previous generations with DX11 slapped on the side. It performs poorly in DX11, so I think people who buy an HD 5000 these days are going to be sorry next year when many more DX11 titles start appearing.
Sorry, the 5000 series is true DX11; whoever said otherwise knows nothing. Not only does the 5890 kick the crap out of Nvidia's best offering, it is priced decently. The 400 series will still be a generation behind ATI.
When you say 5890 you must be imagining things, because I sure haven't seen any benchmarks or prices for that card.
If you have tried running the Heaven DX11 benchmark on an HD 5000 you will know that it performs poorly. It may be a true implementation of DX11, but the performance still leaves something to be desired; GF100 runs tessellation much better. As I also said in a recent post, have a look at this site: http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/27892-nvidia-s-geforce-gf100-under-microscope-7.html
You are kidding about the Heaven benchmark, right? Do you guys just make things up? Have you tried to run Heaven, and do you own a 5000-series ATI card? Because I do, and I have.
Also, I just read that link you posted, and may I say: if that doesn't sound like an Nvidia commercial, I don't know what does. Let's find more unbiased sources when trying to prove points.
OK, I know I'm biased, I admitted that. Go look in your local Best Buy and you'll see a ton of ATI cards and very, very few Nvidia cards.
First point: your replies are incoherent and hard to read.
Second point: biased means you will always side with Nvidia and will say and do anything to make ATI look bad, meaning your "facts" are nothing more than opinions.
Third point: Best Buy? Come on. The only two reasons for that are, first, that Best Buy sucks: they are nothing but salesmen, they don't care about selling quality, they push whatever has the highest price tag and leave ATI on the shelf because it's cheaper. The other reason is that their salespeople know nothing and are just there to sell products to clueless customers; they are just as convinced that Nvidia is better because of the crap spewed all over the internet and among fanboys like yourself.
Well, see, the reason I'm biased is that I wanted to go cheap and got burned on four separate occasions.
So yes, my bias is based on my personal experience, and I already stated it was a matter of opinion and personal preference.
These "facts" come from people who got burned by ATI and their craptastic cards, and frankly no amount of anything will change those opinions. I was laying out my thoughts; if you don't like it, get over it.
The fact is that user error contributes to failures. It's not ATI's fault that people don't know how to install cards properly. I have been using ATI and Nvidia cards for over 10 years. If you cannot get an ATI card to work, that is nobody's problem but your own.
Yes, of course, my bad: the card dies after it has been running for about three weeks, all on its own, and it's the customer's fault.
Right, of course.
Especially when you consider the computer was purchased fully assembled by the people who sell it.
But by all means, it was the customer's fault.
The customer doesn't care what the specifications are; they just want it to work. Advising people to use ATI has never been viable for me, because having to come back and fix things the customer doesn't know how to handle is not the best option.
I choose Nvidia because of the value it has had over ATI in the long run: the Nvidia cards have lasted longer than the ATI ones.
If you have gone 10 years without any issues, good for you, you're lucky. But not everyone has had your luck, and saying it's their fault because the card wasn't installed right is just stupid.
The people I know do know how to install a card, and the card works fine for a while, then dies or starts stuttering: graphical glitches, clipped corners, horrible FPS. Sometimes it is the card and not the user.
I am done, seeing as the poster above me wants to present himself as a knowledgeable computer repairman and knows nothing about bottlenecks or even specs, let alone stats. I have seen and heard enough.
Owner/Admin of GodlessGamer.com - Gaming news and reviews for the godless.
You are done because you have nothing valuable to bring to the table other than a biased site. I have been building computers for well over 10 years and have repaired more than I care to count, and not once have I seen one of my builds have problems with an ATI card. I can even pull out cards from back then, a 9600 Pro or 9700 Pro (ATI cards), and they still work and run to their full potential today. If you had an issue with a card dying, then it's the builder's fault or the card was faulty (which is rare). Some people just lack the intuition to build proper computers.
I have built two recent computers, both of which contain a 5870, and both of which ran Heaven. One has a 3.0 GHz Athlon 64 X2 and the other a Black Edition AMD 965 quad-core. At the native 1920x1080 with all graphics maxed, the quad-core averaged 45 fps in Heaven; the dual-core averaged 24 fps. Both machines run Windows 7 with 8 GB of RAM.
You must not be speaking to me; I didn't post any site. I am the one trying to defend ATI. Please quote the right person. O.o
Owner/Admin of GodlessGamer.com - Gaming news and reviews for the godless.
The only cards I've had to RMA were for some friends: an Nvidia 8600 GT and an Nvidia 8800 GTX Ultra. I had terrible AGP-related driver problems on my Nvidia 6600 GT for the first 3-4 months before they got them straightened out. My buddy had driver problems in WoW with his Radeon 9700 Pro, and another friend had driver issues in WoW with his Nvidia 5600 GT. Fact is, card issues are just luck of the draw.
Yeah my bad, didn't mean to quote you there. Sorry buddy
I've gone from an Nvidia 8800 GT to an ATI 5870. Over the years I've used various ATI and Nvidia cards, so I have no particular bias towards either. That said, right now the ATI 5000-series cards are superior to comparable Nvidia cards by a wide margin. When you add in what each card costs, going with ATI right now is a no-brainer: better performance for less money. The only people I can see not wanting to buy ATI over Nvidia right now are those waiting to see what Nvidia comes out with in the next few months, or Nvidia fanboys. By the way, PhysX is a gimmick that relies on the game developer coding it in, and not all Nvidia cards can take advantage of it. It's just not worth it for most developers, which is why you see only a handful of games that support it. The ATI cards being DirectX 11 capable has the potential to be much more useful and widespread, though we still haven't seen very many DX10 games yet, so who knows.
Saying that nobody uses PhysX is just plain ignorant. Nearly every commercial game engine uses it for its physics right now, and you'll see a lot of games that use it released in the coming years.
Saying that it's more work for developers to use PhysX (not sure if that was you) is an even bigger mistake. Coding your own physics engine could easily take as long as the core game engine itself, while PhysX would still have a lot of advantages over it. PhysX was a mistake on ATI's end: Nvidia offered to work together and allow ATI cards to also benefit from PhysX, but ATI went with Havok instead...
Don't get me wrong, I still prefer ATI over Nvidia. Both make good cards, but Nvidia took advantage of their market position a few years ago and overpriced theirs. IMO, they deserve to suffer now.
You know it: the best way to realize your dreams is to wake up and start moving. Never lose hope and always keep going.
No one uses hardware-accelerated PhysX. PhysX as a physics platform is very popular, though there are still more Havok-based games than PhysX-based games. But very few support hardware acceleration, and most that do only have minor effects that Nvidia added themselves. It definitely is more work for devs to add hardware-accelerated PhysX over just basic PhysX.
The idea that Nvidia offered to work together is a myth. PhysX was ported to CUDA, which only Nvidia can use. ATI could pay Nvidia to license PhysX and then port it to Stream or OpenCL, but Nvidia would still control the IP, so they'd be doing a ton of work on someone else's IP with no say in it. That's why they went with Havok and now Bullet: neither requires a licensing fee, and once ported to OpenCL either company could accelerate it with no additional work, since anyone can use OpenCL (even CPUs can run OpenCL kernels).
Nvidia and ATI are both on the OpenCL panel, and that is the area they should be working on, not CUDA or Stream. Both have their OpenCL compilers at the end of beta or released, from what I recall, so it just needs a push to get an open physics standard going.
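To make the "write once, accelerate anywhere" point concrete, here is a minimal sketch of a vendor-neutral physics step in OpenCL. This is illustrative only, not any shipping SDK's code; it assumes an OpenCL 1.0 implementation with the CL/cl.h header installed, and the kernel just applies gravity with one Euler integration step. The same source runs unchanged on an ATI GPU, an Nvidia GPU, or a CPU device:

```c
#include <stdio.h>
#include <CL/cl.h>

/* Kernel source: one Euler integration step under gravity. */
static const char *src =
    "__kernel void integrate(__global float4 *pos, __global float4 *vel,\n"
    "                        const float dt) {\n"
    "    int i = get_global_id(0);\n"
    "    float4 g = (float4)(0.0f, -9.81f, 0.0f, 0.0f);\n"
    "    vel[i] += g * dt;        /* accelerate */\n"
    "    pos[i] += vel[i] * dt;   /* move */\n"
    "}\n";

int main(void)
{
    enum { N = 1024 };                    /* particle count */
    static float pos[N * 4], vel[N * 4];  /* xyzw per particle */

    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    /* DEFAULT picks whatever is installed: ATI GPU, Nvidia GPU, or CPU. */
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "integrate", NULL);

    cl_mem dpos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 sizeof pos, pos, NULL);
    cl_mem dvel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 sizeof vel, vel, NULL);
    float dt = 1.0f / 60.0f;              /* one 60 Hz frame */
    clSetKernelArg(k, 0, sizeof dpos, &dpos);
    clSetKernelArg(k, 1, sizeof dvel, &dvel);
    clSetKernelArg(k, 2, sizeof dt, &dt);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dpos, CL_TRUE, 0, sizeof pos, pos, 0, NULL, NULL);
    printf("particle 0 y after one step: %f\n", pos[1]);
    return 0;
}
```

The whole argument is in the device query: swap CL_DEVICE_TYPE_DEFAULT for whatever hardware is actually present and nothing else changes, which is exactly what a CUDA-only PhysX port cannot offer.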
There are 120 games that use PhysX: en.wikipedia.org/wiki/PhysX#Games
188 games use Havok: www.havok.com/index.php
I'm guessing that's why ATI went with Havok instead; there were already more games supporting Havok than PhysX.
Yeah, the guy above me beat me to the answer, but here are the games that are actually GPU-accelerated with PhysX: www.nzone.com/object/nzone_physxgames_home.html
That is 16 out of the 120 that use PhysX, which is not very many to me.
Nvidia will always be ahead of ATI. Plus, Nvidia now owns Ageia's PhysX, so as more games start including hardware physics, Nvidia gains more of an edge. Until now only a few games had bothered to build an engine capable of using Ageia physics, since consumers had to go buy a dedicated Ageia physics card, which no one was doing. Now, with Nvidia dominating the market and including the physics capability on their new cards, I would bet more games will start using it.
Nvidia has led ATI for a long time and I'm sure it will continue for a long time. ATI is just there to help keep prices in check, the same way AMD is there to keep prices in check for Intel. But neither AMD's nor ATI's products will pass their competitors'.
Mm, ATI is in the lead technologically, and by a lot. I had an Nvidia card; I now have an ATI.
The biggest lead Nvidia has isn't the technology, it's the user-friendliness.
I could go to Nvidia's website and use their beta scan tool: it detected my system right away and gave me the French driver my system needed, along with everything else it required.
If I want to do the same thing with ATI, I can't; there is no system-scan feature, so I have to do the whole nightmare by hand.
And since I don't know everything ATI offers, ATI never says "hey, we detected that your system could use this nice feature, do you want it?" Of course I want it. That's the problem with ATI. But going by all the forums, ATI is in the lead by 7 or 8 months!
That's the only reason Nvidia seems smoother: it's always very user-friendly, so Nvidia users get every tool that's available. My ATI? I bet I'm missing a lot of the available AMD/ATI stuff that I would have to dig around to find (yuck).
It is pretty clear that ATI is on top now. Will it last as long as Nvidia's run with the 8800 and GTX 2xx series? Probably not, but for now it is not really a contest.
However, I have 2 x GTX 285 in my main gaming rig, so by the time I want to replace those, Nvidia may have already caught up or even surpassed ATI.
But again, at the moment it is no contest: ATI is number 1 and Nvidia is number 2, just like Intel is number 1 in CPUs and AMD is number 2, currently and for the near future, I bet.
If you are interested in subscription or PCU numbers for MMORPGs, check out my site: http://mmodata.blogspot.be/ Favorite MMORPGs: DAoC pre ToA-NF, SWG pre CU-NGE, EVE Online
Deactivate tessellation and it performs very well.
You might not know it yet, but you will when you get a DX11 card with tessellation from Nvidia: tessellation is a resource hog.
Did you ask yourself why Nvidia had to use an empty, modded GTX 295 case last September to claim they were ready for DX11 too? It's easy to say you're ready. ATI released its DX11 card last August, so do you honestly believe they've been twiddling their thumbs since then? Of course not. That said, I just wish ATI/AMD patching were more user-friendly, because on that front Nvidia is at least a year ahead of ATI or Intel, I'm certain of it, and that makes a huge difference for the average user.
Unity, T3D and UDK use PhysX, so just about every indie game uses PhysX. PhysX works on the GPU, Havok on the CPU, and the current trend is to move everything over to the GPU; physics simulation isn't going to be any different.
As for the bigger games: either it's an eye-blinding FPS with impressive physics, and those are the games that do use PhysX hardware acceleration (pretty sure Crysis had it), or it's an MMO or RTS, which barely uses any physics at all. The rest of them can use whatever the hell they want; they barely need any physics anyway.
Yep, I got ATI.
God, I miss the beta scan tool from Nvidia. It did way more than I ever realized; I see that now that I have to dig through all the available ATI stuff myself (yuck).
I'm sorry, what? Just because an engine supports something does not in any way mean that games made with it will be coded to use it. And just because you can name three engines does not mean every indie developer uses those engines; I am a developer, and frankly I couldn't care less about learning them.
There is really little to no difference between GPU- and CPU-accelerated physics on a decent system. PhysX is a joke; I said it once and I will say it again. Nvidia supporting it does not make their cards better. Instead of the CPU handling the physics, all that is happening is that it is being offloaded onto a less powerful hardware component. In case you did not know, the CPU usually runs at 25-50%, maybe 75%, capacity when running a game, while the GPU is usually at 75%+, so there is no advantage to GPU over CPU physics.
This argument is void of anything remotely informative or helpful.
Owner/Admin of GodlessGamer.com - Gaming news and reviews for the godless.
PhysX only works on the GPU if devs implement it that way, which is extra work. There is a very short list of games that do this: http://www.nzone.com/object/nzone_physxgames_home.html
The only mainstream ones that do are Batman and Mirror's Edge (UT3 only does it in a single mod with two levels designed to showcase PhysX). Both fall back to software PhysX (which is what 95% of PhysX games use) if no PhysX-capable GPU is present. That's why you still need the PhysX driver from Nvidia to play them even if you are on an ATI card.
Also, doing physics on the GPU isn't free: it takes GPU cycles away from rendering and lowers your FPS (even though you get more eye candy). It's better to do it on a cheap dedicated GPU like a 9800 GT than to blow cycles on your main GPU.
That's why I don't think it will really take off until AMD Fusion and Intel's on-chip GPUs arrive; then a physics SDK written for OpenCL will be able to run on those on-die GPGPUs and people won't have to buy an extra video card.
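As a rough illustration of that dedicated-physics-GPU idea, here is a small sketch (my own, not anyone's shipping code) of how OpenCL would let a game enumerate GPUs and park its physics queue on a secondary card while the primary one renders; with CL_DEVICE_TYPE_ALL the same logic could just as well pick up an on-die GPU like the Fusion parts mentioned above:

```c
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id plat;
    cl_device_id devs[8];
    cl_uint ndev = 0;

    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 8, devs, &ndev);
    if (ndev == 0) {
        fprintf(stderr, "no GPU device found\n");
        return 1;
    }

    /* If a second GPU (say, a cheap 9800GT-class card) is present,
       dedicate it to physics; otherwise share the primary GPU. */
    cl_device_id physics_dev = (ndev > 1) ? devs[1] : devs[0];

    char name[256];
    clGetDeviceInfo(physics_dev, CL_DEVICE_NAME, sizeof name, name, NULL);
    printf("physics will run on: %s\n", name);

    cl_context ctx = clCreateContext(NULL, 1, &physics_dev, NULL, NULL, NULL);
    cl_command_queue physics_q = clCreateCommandQueue(ctx, physics_dev, 0, NULL);

    /* ... build physics kernels and enqueue them on physics_q here,
       while the renderer keeps the primary GPU to itself ... */

    clReleaseCommandQueue(physics_q);
    clReleaseContext(ctx);
    return 0;
}
```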
Always used Nvidia, but when the 5000 series came out I couldn't say no. My 5850 is just amazing.
This whole discussion about PhysX vs. Havok is irrelevant. That type of technology is dying off, and I don't see it being used much in the future now that DX11 is out. So let's base this argument on actual card performance rather than some gimmick that has been collecting dust for years. That's like someone telling me Nvidia is better because they have 3D glasses compatible with three games. A joke.
This is not going to be a good year for Nvidia. They have been holding back technology since 2006 and have finally reached the point where they can no longer hold it back. For its time, the HD 2900 Pro was a magnificent card compared to the 8800 GTX. You can tell from its tech that the G80 architecture was rushed to market, whereas the HD 2900 was a complete revamp of the X1800 and met all of the DX10 standards. The 8800 GTX was a step up from the 7800 and failed to meet Microsoft's standards. After Nvidia got their way and those requirements were removed from DX10, they never got around to adding that tech, and now they are paying for it as they struggle to meet the DX11 standard. Now that DX11 is released, it's unlikely Microsoft will change it this time.
Fermi isn't going to put Nvidia on top this year. If anything, it will perform the way the HD 2900 did against the 8800 GTX. On top of that, you are looking at a 4-billion-transistor video card while ATI is two months ahead of schedule, putting its next release in April. If Fermi is faster than the HD 5870, that will only last a month once the HD 6870 is released. So it's not really a question of whether Nvidia will have the worst card selection this year; the question is whether Nvidia has an architecture for the future.
In 2006 ATI came out with an ambitious architecture that was ahead of its time, then eventually got it to perform better. Is Nvidia's architecture an ambitious one, or one they are rushing out to meet the DX11 standard? You can make the case in both directions: they are trying to make it smarter, but at the same time they have a poor history of advancing technology.
I don't care how much of a technological advantage ATI has. I prefer reliability, and I have always gotten that from Nvidia's cards and drivers. The last couple of times I used ATI, I had nothing but problems with it, dating back to the days of the original ATI Radeon series, which gave me pink textures while playing Counter-Strike. I've seen far too many firsthand cases of shoddy ATI drivers to consider investing in their cards.
In War - Victory. In Peace - Vigilance. In Death - Sacrifice.
But you are forgetting the poor quality of the G80's hardware. It was a hot chip, and the transistors didn't like the heat.
This comment seems typical of someone who has nothing solid to say about why they avoid ATI. I am not saying you're not telling the truth, more that I have heard that story many times before. I repeat what I told an earlier poster who claimed the same issue: user error seems to be associated with "broken" software.
They actually designated Fermi as the 400 series, and it will do DX11. The two cards will be the GTX 470 and GTX 480. The 300 line will either be skipped or used for rebranding the GTX 200 series, and it won't support DX11, since they already tainted that series number with the GT 310.
I think both have been putting out solid cards. ATI, IMO, gets an undeserved bad rap from the driver issues it had about 8 years ago; Nvidia had just as bad driver problems more recently in Vista. It's time to let that die.
Since both brands are reliable now, I look at price-to-performance, which ATI wins easily. That is good for me, because I have a laundry list of things that make Nvidia an evil company, and I don't like giving my money to evil companies.
A rare educated answer. This is the most truthful, informed, and unbiased answer you will find.
Sic semper tyrannis "Democracy broke down, not when the Union ceased to be agreeable to all its constituent States, but when it was upheld, like any other Empire, by force of arms."
Sold my GTX 260 on eBay and got a HIS 5850.