No offense, but in my 24+ years of PC gaming and GPUs, AMD cards have required the most work and driver manipulation to get the performance of Nvidia cards. Maybe I had bad luck with AMD cards, but I doubt it.
24 years? Really? Which AMD cards did you have again? You realize that AMD has only made video cards since 2006, I hope.
Refusing to buy an AMD card today because ATI had some driver problems a decade ago is like refusing to buy an Intel Ivy Bridge processor today because NetBurst was a bad architecture.
I have used ATI cards for a long, long time now, and I will never ever buy an ATI card again.
Setting up new drivers can take 1 minute or 5 hours. For some reason Catalyst Control Center may not be installed properly, so when you have driver 12.4 and upgrade to 12.6 with a clean install, you may end up with a shit install for some weird reason.
You do know to uninstall the old drivers before you install the new ones, don't you? If you have part of the Catalyst package from one version and part from another, that can indeed cause problems. But that's thoroughly PEBCAK territory.
Will a 15" MacBook pro with windows 7 handle this on high?
2.3GHz quad-core Intel Core i7
Turbo Boost up to 3.3GHz
4GB 1600MHz memory
500GB 5400-rpm hard drive
Intel HD Graphics 4000
NVIDIA GeForce GT 650M with 512MB of GDDR5 memory
Cheers!
It will handle the game, but not necessarily on high graphical settings. I find it interesting that Nvidia pushes 2 GB on laptop cards with no plausible use for that much video memory, while Apple says 1/4 of that amount is fine. I'd prefer to see that card come with 1 GB.
Originally posted by Quizzical:

The trouble with asking a hardware question like this outside of the hardware forum is that lots of people have opinions, but half of them don't know what they're talking about.

For the original poster, I think a video card upgrade makes sense for you. Your processor is plenty fast enough, and you have plenty of system memory, so those won't hold you back.

On general principle, I ask what power supply and case you have before making a recommendation. You never know when you'll find someone who went with a cheap junk "800 W" power supply because they were impressed with a high nominal wattage and didn't realize that it will explode if you try to pull 500 W from it. So, what power supply and case do you have?

For a video card, on a bang for the buck basis, I'd tend to favor a Radeon HD 7770 or 7870, depending on budget. In your case, I think you want something faster than a 7770, so a 7870 fits the bill. For example:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814102981

There are a number of cards out there that are faster than a Radeon HD 7870, but performance per dollar declines substantially, so you pay a lot more for only somewhat more performance. If you want to go high end, then you deal with that, I suppose.

A bunch of Nvidia fanboys have been recommending the GeForce GTX 660 Ti lately, perhaps because it launched last week. At launch prices, it was a decent value, but AMD has since slashed prices on the 7870.

The next step up is a Radeon HD 7950, which makes sense if you're looking to do an Eyefinity setup (due to the 3 GB of video memory) or GPGPU (which isn't for gaming). Otherwise, it's too much added cost as compared to a 7870 for not enough more performance.

At the high end, you have the GeForce GTX 670 and Radeon HD 7970. On a large budget, that might be where you want to look. The 7970 tends to be a little faster than the GTX 670, but they're close enough that each will win in some games. The 7970 tends to be a little more expensive than the GTX 670. The 7970 uses quite a bit more power, partially but not entirely due to the 3 GB of video memory. The 7970 also tends to have more overclocking headroom, largely because the GTX 670 is substantially constrained by memory bandwidth, and you can't overclock memory very far from the 1.5 GHz that it ships with.

AMD bins out their best Tahiti dies for the Radeon HD 7970 GHz Edition. This is basically just a 7970 with a higher stock clock speed, and usually a premium cooler. For example:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202001

That's where you'd want to look if you want the fastest card you can get.

Nvidia also offers a GeForce GTX 680, which is more expensive than a Radeon HD 7970 GHz Edition, but also slower. Hardly any of Nvidia's GK104 dies can meet GTX 680 specs, so Nvidia has to price them way too high to keep them from selling out. That's not what you want.

There's also the GeForce GTX 690, which is basically two GTX 680s on a single card. The only reason to get a dual GPU card is if you want to get two of them, for quad SLI. If you want SLI, then just get two GeForce GTX 670s. That's thoroughly unnecessary for a single monitor, however.

Do you have a good SSD? If not, then you might want to consider getting one. An SSD will greatly reduce your loading times. In Guild Wars 1, this made a huge difference in how the game felt, because of the frequent warping around. I'd expect that to be true in Guild Wars 2, too. Imagine using map travel and reliably having the game ready for you to move around about as quickly as you are.

SSDs don't cost as much as they used to, either. For example:
http://www.newegg.com/Product/Product.aspx?Item=N82E16820171567
Antec TruePower Quattro 1000W.
Also, I stick with Nvidia as I know them well and have had few problems with them.
As for an SSD: GW2 already loads fast enough. I think I'll pass and wait until SSD technology develops a little more.
Will a 15" MacBook pro with windows 7 handle this on high?
2.3GHz quad-core Intel Core i7
Turbo Boost up to 3.3GHz
4GB 1600MHz memory
500GB 5400-rpm hard drive
Intel HD Graphics 4000
NVIDIA GeForce GT 650M with 512MB of GDDR5 memory
Cheers!
Should run it on mid-high settings in Windows or mid settings on WINE/Crossover in OSX.
lol he can run on high with that setup.. Period!
If an underclocked laptop version of the slowest discrete GPU chip of the generation (from either vendor!) can run a game on high settings, then why do the higher end GPU chips exist?
Maybe it can; some games aren't very demanding. But many games will require settings turned down substantially.
I think I am going to remove and reinstall the 12.8 AMD drivers on Thursday night, then re-run my FurMark/MSI Afterburner tests and configs, push my 6870 up to something like 950/960 MHz core and 1150/1200 MHz memory, and see if that helps on Saturday.
My case is still under 95°F (35°C) at full load, so I have a lot of thermal headroom.
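As a quick sanity check on that conversion (a throwaway Python snippet; the function name is just for illustration):

    # Celsius to Fahrenheit: F = C * 9/5 + 32
    def c_to_f(celsius):
        return celsius * 9 / 5 + 32

    print(c_to_f(35))  # 95.0 -- 35C at full load really is 95F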
Originally posted by expresso:
My 6870 is quite sweet when combined with an i5 3570; it'll set you back about $160-$200. I get a good 60fps at 1080p on max settings out in PvE, but on average 40 in WvW, so I normally turn the gfx down a notch to maintain a 50fps minimum.
That is a lie.
I have an HD 6870 on a Phenom II X4 cranked up to 4.1GHz with 8GB Kingston HyperX, running GW2 off an Agility III SSD, and I do not get 60fps in PvE at 1080p on MAX settings - not even close. Maybe 40 if there aren't a lot of players around.
Why would I lie? Maybe GW2 likes Intel more than AMD. I also have 8GB DDR3 and an SSD. I literally built the system a few weeks ago, so it's all new hardware and a fresh install of Windows, etc.
I haven't played GW2, but I suspect that you're not actually maxing settings. Here, let me help you.
Open Catalyst Control Center and go to Gaming -> Image Quality -> Anti-Aliasing Mode. Make sure that's set to Super-sample AA. Now go back out to Image Quality, and then go to Anti-Aliasing. Make sure that's set to 8x and "use application settings" is not checked. Now go back into the game and try to max settings again, and watch your hardware choke.
Unless ATI cards have a way to display MSAA in deferred engines, doing that isn't going to make any difference, since the engine doesn't support external forms of AA.
I angered the clerk in a clothing shop today. She asked me what size I was and I said actual, because I am not to scale. I like vending machines 'cause snacks are better when they fall. If I buy a candy bar at a store, oftentimes, I will drop it... so that it achieves its maximum flavor potential. --Mitch Hedberg
Just wanted to comment that I found the 670 the best memory-OC card I have had from Nvidia in a LONG time. Using the Gigabyte Windforce, I can run the memory at 2.2GHz no problem. Really no reason to get the 680 over the 670. I'm hitting over a 10k P score in 3DMark 11 as well.
Color me skeptical that you can genuinely run memory binned for 1.5 GHz at 2.2 GHz without incident.
http://www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_670_Windforce/4.html
Look down toward the bottom of the page. The memory chips are rated at 1.5 GHz. If Gigabyte were to pay more for a higher bin of GDDR5 chips, they'd have given the memory a factory overclock.
Part of the GDDR5 memory specification is an error detection mechanism. When it sends information to or from memory, there's a parity check or something like it to check whether the data arrived correctly. If it didn't, then rather than crashing, it just sends the data again. If you clock memory too high, you end up with a lot of errors, so that your effective memory bandwidth once the errors are discarded isn't nearly as high as you think it is.
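If you want to see how that plays out, here's a rough sketch in Python (the error rates are made-up numbers purely for illustration; real EDC behavior is more complicated):

    # Rough model of GDDR5 bandwidth with EDC retries. GDDR5 moves 4 bits
    # per pin per memory clock, so a 256-bit bus at 1.5 GHz gives
    # 1.5 * 4 * 256 / 8 = 192 GB/s before any retries.
    def effective_bandwidth_gbs(clock_ghz, bus_bits=256, error_rate=0.0):
        raw = clock_ghz * 4 * bus_bits / 8
        # If a fraction error_rate of transfers fail and get resent,
        # only (1 - error_rate) of the raw bandwidth does useful work.
        return raw * (1 - error_rate)

    print(effective_bandwidth_gbs(1.5))                   # stock: 192 GB/s
    print(effective_bandwidth_gbs(2.2, error_rate=0.35))  # hypothetical 35% retry rate: ~183 GB/s

The point: a heavily-erroring "2.2 GHz" can deliver barely more useful bandwidth than a clean 1.5 GHz, even though monitoring tools happily report the higher clock.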
I can get you screenshots of my stress test and GPU-Z if you don't believe me.
Since you are looking at the price range of the card you mentioned above, I'm not going to be rude to you, or condescending to everyone in this thread who may be misinformed but offered advice.
What I am going to do though is give you some advice and hope it helps.
I have a couple graphics cards at my disposal and tested them on my machine. I have used a gigabyte 7870, my friend let me use the card you have your eye on the Gigabyte Geforce GTX 670 and my main card is this:
GIGABYTE GV-R797OC-3GD Radeon HD 7970 3GB
which after rebate is only $20 more than the 670. There are some other 7970s that run very close in price to the 670 and would likely outperform it.
Now, knowing how all 3 of those cards work within Guild Wars 2, I'd be pissed if someone recommended me the 7870 when I wanted something comparable to the Gigabyte 670. Side-by-side comparisons show that when I used the 7870, my FPS dropped some 20-30fps on average, with HUGE fps drops on occasion. The 7970 ran the game without any major dips and with the highest FPS average of all 3 cards. (Mind you, these cards were all tested on the same machine.)
I love my 670, but it was outperformed by the 7970.
Does the 7870 perform well? Yes, it does. But if someone is asking for a recommendation for a card that is better for around the same price as the 670, it makes no sense to me to be rude to people and then lowball a recommendation, which could end up leaving someone disappointed.
As Aerowyn has pointed out, the 670 is amazing; you totally won't be disappointed in it for that price range.
Of course, the higher your budget, the better the card's performance, but you didn't ask for that, so I won't bother going into detail on the higher-end, pricier cards.
Hope this helps. I mean, I am one of those who doesn't know what he's talking about, apparently, but eh, thought I'd give it a try.
I have the GTX 670 from EVGA....Love it.:)
Doesn't need OCing at all imo. Runs awesome...hehe.
Also, I stick with Nvidia as I know them well and have had few problems with them.
As for an SSD: GW2 already loads fast enough. I think I'll pass and wait until SSD technology develops a little more.
Your power supply should be able to handle whatever upgrade you want, then.
If you want to go with Nvidia, then grab a GeForce GTX 670. If you're going to pay substantially more than you would for a Radeon HD 7870, then you might as well at least get something substantially faster. The GTX 670 is priced reasonably as compared to the competition. The reason the GTX 670 doesn't perform that far behind the GTX 680 is that they have the same memory bandwidth and both are substantially constrained by memory bandwidth. The GTX 660 Ti loses a memory channel and then mismatches the others.
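To put rough numbers on that (using the published bus widths and the 6 GT/s effective data rate these cards ship with), a quick back-of-the-envelope sketch:

    # GDDR5 bandwidth back-of-the-envelope: bus width (bits) times
    # effective data rate (GT/s), divided by 8 bits per byte.
    def bandwidth_gbs(bus_bits, effective_gtps=6.0):
        return bus_bits * effective_gtps / 8

    print(bandwidth_gbs(256))  # GTX 670 and GTX 680: 192 GB/s each
    print(bandwidth_gbs(192))  # GTX 660 Ti: 144 GB/s (one channel fewer)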
But spending $400 on a video card while deciding that $80 for an SSD is too much is just ridiculous. Given your system, if I had my choice of getting an SSD or a new video card but couldn't get both, I'd go with the SSD. That's partially a matter of personal preferences, as I don't mind turning down graphical settings, but it's also partially because SSDs are a big deal. Unless you love sitting there staring at loading screens. I don't.
You're probably used to everything taking forever to load because that's how computers have always been. You don't realize that it could be otherwise.
Unless ATI cards have a way to display MSAA in deferred engines, doing that isn't going to make any difference, since the engine doesn't support external forms of AA.
Does GW2 use deferred rendering? If so, that does interfere with traditional MSAA. I'm not sure about SSAA, though. Unlike MSAA, SSAA doesn't need any knowledge of the geometry of the scene. SSAA basically just anti-aliases everything. It's the perfect form of anti-aliasing from an image quality perspective, but brings a huge performance hit because it's a dumb brute force method.
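If it helps, here's a minimal sketch of what SSAA amounts to, assuming a simple box filter and a 2x factor (the random array stands in for a frame rendered at double resolution):

    import numpy as np

    # Supersampling in a nutshell: render at k times the target resolution,
    # then average each k x k block of samples down to one output pixel.
    def downsample(hi_res_frame, k):
        h, w, c = hi_res_frame.shape
        return hi_res_frame.reshape(h // k, k, w // k, k, c).mean(axis=(1, 3))

    hi = np.random.rand(2160, 3840, 3)  # stand-in for a 1080p frame rendered at 2x
    lo = downsample(hi, 2)              # anti-aliased 1920x1080 output
    print(lo.shape)                     # (1080, 1920, 3)

Rendering 4x the pixels for a 2x factor is exactly where the huge performance hit comes from.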
There is actually a supersampling option in-game. Nvidia cards don't have an option for SSAA in the control panel; you need to use Nvidia Inspector. But any other type of AA, like MSAA, will not work in this game due to the engine.
SSAA doesn't work in game either as far as I could tell.
Through the ATI control panel and Nvidia Inspector, my guess is probably not; it's the same as games like TSW, where nothing much works to improve AA quality. You are basically stuck with trying to downsample, using the crappy in-game AA, or using SMAA. Although in TSW I don't mind the new TXAA, some find it too blurry for their tastes. A shame, too, because SGSSAA looks amazing if the game can support it. They did add a supersampling option in the in-game graphics options, but honestly it doesn't affect AA much at all and isn't worth the performance hit, IMHO.
That's an interesting chart, but it's surely going to be greatly influenced by other factors, most notably the video settings that people choose. A Radeon HD 7970 isn't merely 3 times as fast as Intel HD Graphics 3000. It's easily more than 10 times as fast.
Yeah, it's a very flawed presentation, but it seemed relevant to the thread. I hope ArenaNet's hardware performance metrics are much more detailed than that. I'm assuming the auto-detect for the chart was meant to make a wider range of cards appear to offer playable performance. Of course, all those cards do offer playable performance, but anyone who wants to stick to high/ultra settings is going to need to go with the higher-end cards.
I got the EVGA GTX 670 2GB for $389. Cheaper than the 680 and the performance margin is really small. In some cases it outperforms the 680.
I had no issues in GW2 on max settings: 61FPS. The only time it dropped was to about 56FPS, when I was around people.
Love the 670.
Edit:
System my BF and I built for GW2:
Case: Coolermaster HAF X
MB: ASRock Z68 Extreme3 Gen3 1155
GPU: EVGA GeForce GTX 670 2GB GDDR5
CPU: Intel Core i5 2500K 3.3GHz Quad Core (3.7GHz Turbo Boost)
Ram: 8GB Corsair Vengeance DDR3 1600
SSD: Intel 320 Series 120GB SATA II
OS: Windows 7 64bit Home Premium
"My Fantasy is having two men at once...
One Cooking and One Cleaning!"
---------------------------
"A good man can make you feel sexy,
strong and able to take on the whole world...
oh sorry...that's wine...wine does that..."