
Finally! It's here... MY BEAST!


Comments

  • Quizzical Member Legendary Posts: 25,499
    Originally posted by PieRad

    The R9 270 (rank 26) is ranked lower than the GTX 760 (rank 14), and the R9 280X (rank 13) is only one rank above the GTX 760, so I'm not sure that would be worth it...

    If I had to get something better than the GTX 760, I'd have to enter the high end of the High End list, which is very expensive when it comes to price/performance.

    G3D Mark: http://www.videocardbenchmark.net/high_end_gpus.html#

    Price/Performance: http://www.videocardbenchmark.net/high_end_gpus.html#value

    Do you care about games, or do you only care about synthetic benchmarks?  Because if the latter, then you shouldn't base your purchase on a synthetic widely known to be very unrepresentative of games.  That benchmark is very favorable to Nvidia today; before Fermi, it strongly preferred ATI cards.  Look, for example, at the GeForce GT 545 somehow beating the GeForce GTX 285; in real games, the 545 might typically get half the performance of the 285.

  • bliss14 Member Uncommon Posts: 595
    Originally posted by Quizzical
    Originally posted by PieRad

    The R9 270 (rank 26) is ranked lower than the GTX 760 (rank 14), and the R9 280X (rank 13) is only one rank above the GTX 760, so I'm not sure that would be worth it...

    If I had to get something better than the GTX 760, I'd have to enter the high end of the High End list, which is very expensive when it comes to price/performance.

    G3D Mark: http://www.videocardbenchmark.net/high_end_gpus.html#

    Price/Performance: http://www.videocardbenchmark.net/high_end_gpus.html#value

    Do you care about games, or do you only care about synthetic benchmarks?  Because if the latter, then you shouldn't base your purchase on a synthetic widely known to be very unrepresentative of games.  That benchmark is very favorable to Nvidia today; before Fermi, it strongly preferred ATI cards.  Look, for example, at the GeForce GT 545 somehow beating the GeForce GTX 285; in real games, the 545 might typically get half the performance of the 285.

    What are real people to do?

  • TheDarkrayne Member Epic Posts: 5,297

    God knows why people are claiming there are no real world differences between 1333 and 1866 memory.

    The difference will be massive. That should be your priority upgrade next. An SSD will help with loading times.. a memory upgrade (speed, not capacity beyond 8GB) will help with frames. An SSD will do nothing for frames.

    I don't suffer from insanity, I enjoy every minute of it.
  • Grunty Member Epic Posts: 8,657
    Originally posted by bliss14
    Originally posted by Quizzical
    Originally posted by PieRad

    The R9 270 (rank 26) is ranked lower than the GTX 760 (rank 14), and the R9 280X (rank 13) is only one rank above the GTX 760, so I'm not sure that would be worth it...

    If I had to get something better than the GTX 760, I'd have to enter the high end of the High End list, which is very expensive when it comes to price/performance.

    G3D Mark: http://www.videocardbenchmark.net/high_end_gpus.html#

    Price/Performance: http://www.videocardbenchmark.net/high_end_gpus.html#value

    Do you care about games, or do you only care about synthetic benchmarks?  Because if the latter, then you shouldn't base your purchase on a synthetic widely known to be very unrepresentative of games.  That benchmark is very favorable to Nvidia today; before Fermi, it strongly preferred ATI cards.  Look, for example, at the GeForce GT 545 somehow beating the GeForce GTX 285; in real games, the 545 might typically get half the performance of the 285.

    What are real people to do?

    I could post another picture.

    "I used to think the worst thing in life was to be all alone.  It's not.  The worst thing in life is to end up with people who make you feel all alone."  Robin Williams
  • Thorkune Member Uncommon Posts: 1,969
    Gratz on the new rig, OP.
  • PieRad Member Posts: 1,108

    Originally posted by Quizzical

    Originally posted by PieRad

    The R9 270 (rank 26) is ranked lower than the GTX 760 (rank 14), and the R9 280X (rank 13) is only one rank above the GTX 760, so I'm not sure that would be worth it...

    If I had to get something better than the GTX 760, I'd have to enter the high end of the High End list, which is very expensive when it comes to price/performance.

    G3D Mark: http://www.videocardbenchmark.net/high_end_gpus.html#

    Price/Performance: http://www.videocardbenchmark.net/high_end_gpus.html#value

    Do you care about games, or do you only care about synthetic benchmarks?  Because if the latter, then you shouldn't base your purchase on a synthetic widely known to be very unrepresentative of games.  That benchmark is very favorable to Nvidia today; before Fermi, it strongly preferred ATI cards.  Look, for example, at the GeForce GT 545 somehow beating the GeForce GTX 285; in real games, the 545 might typically get half the performance of the 285.

    I'll trust the benchmarks... 100% accurate? Probably not, but I don't need 100% accuracy to decide which card I want. The list looks okay to me: newer / better cards usually get better scores, and the results come from thousands of tests with the same testing software.

    I do look up the hardware I'm buying on different sites to see what people say about it, and also watch some comparison videos before I decide what to buy.

    But other than benchmarks, you've got nothing to compare two cards with except buying both (or watching videos of people who did just that).

     

     

    Originally posted by Vannor

    God knows why people are claiming there are no real world differences between 1333 and 1866 memory.

    The difference will be massive. That should be your priority upgrade next. An SSD will help with loading times.. a memory upgrade (speed, not capacity beyond 8GB) will help with frames. An SSD will do nothing for frames.

    Okay, I have to ask though: some people say that buying faster than 1600 is a waste. Should I buy 1600 or 1866?

     

     

    Originally posted by Grunty

    Originally posted by bliss14
    Originally posted by Quizzical
    Originally posted by PieRad

    -snip-

    -snip-

    -snip-

    I could post another picture.

    Please don't, the first one was scary.

     

     

    Originally posted by Thorkune
    Gratz on the new rig, OP.

    Thank you sir.

     


  • TheDarkrayne Member Epic Posts: 5,297
    Originally posted by PieRad

    Originally posted by Vannor

    God knows why people are claiming there are no real world differences between 1333 and 1866 memory.

    The difference will be massive. That should be your priority upgrade next. An SSD will help with loading times.. a memory upgrade (speed, not capacity beyond 8GB) will help with frames. An SSD will do nothing for frames.

    Okay, I have to ask though: some people say that buying faster than 1600 is a waste. Should I buy 1600 or 1866?

    It will depend on the game really and how CPU intensive it is. For example.. if you played EQ2 with the CPU shadows option then faster RAM would make a huge difference.

    Faster is better, just to be sure. You don't know which games are going to come your way.

    Some games use mostly VRAM, but you will come across plenty that don't. Games that were originally intended for consoles are ones that definitely don't use the VRAM to the full extent.. games such as Dark Souls or GTA IV (possibly V). You know how loads of people complain about terrible console ports? That's the main reason why. They could fix the optimization issues with some faster RAM but most are not willing to accept that. If you played something like a heavily modded Skyrim, faster RAM will make a world of difference because the scripts many mods use can be very CPU intensive. Highly optimized PC games like Crysis will show little difference between RAM speeds, but there will still be a slight difference. Games as optimized as Crysis are very rare though.

    You could definitely expect about an extra 5-10 frames at the very least at 1080p.. double that if it's a CPU intensive game. That's the difference between 1600 and 1866.. the difference between 1333 and 1866 will be more. Faster RAM is also better for overclocking, if you're into that.

    1600 is just the sweet spot when considering price vs performance, it's "mid-range".

    I don't suffer from insanity, I enjoy every minute of it.
  • PieRad Member Posts: 1,108
    Originally posted by Vannor
    Originally posted by PieRad

    Originally posted by Vannor

    God knows why people are claiming there are no real world differences between 1333 and 1866 memory.

    The difference will be massive. That should be your priority upgrade next. An SSD will help with loading times.. a memory upgrade (speed, not capacity beyond 8GB) will help with frames. An SSD will do nothing for frames.

    Okay, I have to ask though: some people say that buying faster than 1600 is a waste. Should I buy 1600 or 1866?

    It will depend on the game really and how CPU intensive it is. For example.. if you played EQ2 with the CPU shadows option then faster RAM would make a huge difference.

    Faster is better, just to be sure. You don't know which games are going to come your way.

    Some games use mostly VRAM, but you will come across plenty that don't. Games that were originally intended for consoles are ones that definitely don't use the VRAM to the full extent.. games such as Dark Souls or GTA IV (possibly V). You know how loads of people complain about terrible console ports? That's the main reason why. They could fix the optimization issues with some faster RAM but most are not willing to accept that. If you played something like a heavily modded Skyrim, faster RAM will make a world of difference because the scripts many mods use can be very CPU intensive.

    You could definitely expect about an extra 5-10 frames at the very least at 1080p.. double that if it's a CPU intensive game. That's the difference between 1600 and 1866.. the difference between 1333 and 1866 will be more. Faster RAM is also better for overclocking, if you're into that.

    1600 is just the sweet spot when considering price vs performance, it's "mid-range".

    Okay, thanks a lot, that was very informative.

     

    I will go with 1866MHz.


  • PieRad Member Posts: 1,108
    Originally posted by OG_Zorvan
    Originally posted by Grunty

    Well...

     

    He didn't deserve the Church Lady and I have too much respect for the LoLCats I usually post.

     

    So, what'd I miss? lol

    Nothing you'd want to see, trust me, it was scary.


  • sacredfool Member Uncommon Posts: 849

    I'd be interested in actually seeing some game benchmarks, Vannor.

    Unless you use an APU without a dedicated graphics card, I am sceptical of that 5 to 10 FPS gain.


    Originally posted by nethaniah

    Seriously Farmville? Yeah I think it's great. In a World where half our population is dying of hunger the more fortunate half is spending their time harvesting food that doesn't exist.


  • TheDarkrayne Member Epic Posts: 5,297
    Originally posted by sacredfool

    I'd be interested in actually seeing some game benchmarks, Vannor.

    Unless you use an APU without a dedicated graphics card, I am sceptical of that 5 to 10 FPS gain.

    Benchmark here using Ivy Bridge (games at the bottom, and they are, or were, exclusive to PC):

    http://www.xbitlabs.com/articles/memory/display/ivy-bridge-ddr3_4.html

    They show a 5-10% improvement in fps. Bear in mind that Crysis is highly optimized for PC, but there is still a noticeable improvement.

     

    I don't suffer from insanity, I enjoy every minute of it.
  • sacredfool Member Uncommon Posts: 849
    Originally posted by Vannor
    Originally posted by sacredfool

    I'd be interested in actually seeing some game benchmarks, Vannor.

    Unless you use an APU without a dedicated graphics card, I am sceptical of that 5 to 10 FPS gain.

    Benchmark here using Ivy Bridge (games at the bottom, and they are, or were, exclusive to PC):

    http://www.xbitlabs.com/articles/memory/display/ivy-bridge-ddr3_4.html

    They show a 5-10% improvement in fps. Bear in mind that Crysis is highly optimized for PC, but there is still a noticeable improvement.

    Ugh. I thought I might see something like this. Those are some large FPS numbers. 

    As the FPS drops the % difference gets smaller.  Let me try to explain:

     

    With such a huge FPS it's obvious the game runs really well on the system - the GPU and the CPU have no problems handling it. Since it's not graphics or processor bound, something else has to be the limit - and only then is RAM bandwidth the likely culprit.

    Basically, all we want to achieve is a frame time that is consistently below 15 msec (roughly 66+ FPS). Frame time is how long the system takes to process the frame and how long it stays on your screen. When the frame time is above 10 msec (i.e. below 100 FPS), RAM will not influence the FPS much, since at that point the game is throttled by the GPU or CPU.

    Unless your aim is to achieve a really high FPS for bragging rights, RAM bandwidth doesn't matter. It won't affect your FPS at around 30 to 60 FPS and you won't be able to tell the difference it makes at the 100+ FPS mark. 
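
    To put that arithmetic in one place (frame time is just 1000 divided by fps; the 10 msec cut-off is the one above, and the fps values are only examples):

    ```python
    # Frame time is the reciprocal of the frame rate, in milliseconds.
    def frame_time_ms(fps: float) -> float:
        return 1000.0 / fps

    # Illustrative frame rates; the 10 ms cut-off is the one discussed above.
    for fps in (30, 60, 66, 100, 144):
        t = frame_time_ms(fps)
        bound = "GPU/CPU is the throttle" if t > 10 else "RAM bandwidth can start to matter"
        print(f"{fps:>3} fps -> {t:5.1f} ms per frame ({bound})")
    ```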

    I won't even mention that higher frequency RAM is nowhere near as reliable due to some issues with stability. 

    Some benchmarks with results to back up my claims:

    http://www.tomshardware.com/reviews/memory-bandwidth-latency-gaming,3409.html

    http://www.youtube.com/watch?v=dWgzA2C61z4


    Originally posted by nethaniah

    Seriously Farmville? Yeah I think it's great. In a World where half our population is dying of hunger the more fortunate half is spending their time harvesting food that doesn't exist.


  • trancefate Member Uncommon Posts: 146
    I don't trust your mobo
  • PieRad Member Posts: 1,108

    Originally posted by sacredfool

    Originally posted by Vannor
    Originally posted by sacredfool

    I'd be interested in actually seeing some game benchmarks, Vannor.

    Unless you use an APU without a dedicated graphics card, I am sceptical of that 5 to 10 FPS gain.

    Benchmark here using Ivy Bridge (games at the bottom, and they are, or were, exclusive to PC):

    http://www.xbitlabs.com/articles/memory/display/ivy-bridge-ddr3_4.html

    They show a 5-10% improvement in fps. Bear in mind that Crysis is highly optimized for PC, but there is still a noticeable improvement.

    Ugh. I thought I might see something like this. Those are some large FPS numbers. 

    As the FPS drops the % difference gets smaller.  Let me try to explain:

     

    With such a huge FPS it's obvious the game runs really well on the system - the GPU and the CPU have no problems handling it. Since it's not graphics or processor bound, something else has to be the limit - and only then is RAM bandwidth the likely culprit.

    Basically, all we want to achieve is a frame time that is consistently below 15 msec (roughly 66+ FPS). Frame time is how long the system takes to process the frame and how long it stays on your screen. When the frame time is above 10 msec (i.e. below 100 FPS), RAM will not influence the FPS much, since at that point the game is throttled by the GPU or CPU.

    Unless your aim is to achieve a really high FPS for bragging rights, RAM bandwidth doesn't matter. It won't affect your FPS at around 30 to 60 FPS and you won't be able to tell the difference it makes at the 100+ FPS mark. 

    I won't even mention that higher frequency RAM is nowhere near as reliable due to some issues with stability. 

    Some benchmarks with results to back up my claims:

    http://www.tomshardware.com/reviews/memory-bandwidth-latency-gaming,3409.html

    http://www.youtube.com/watch?v=dWgzA2C61z4

    Wow, okay, so there is literally no difference in performance, even for an 800MHz difference, lol.

    I'm still getting new RAM though, because I have four 2GB sticks of DDR3 1333MHz, two of which are being reported as 1066MHz. I don't understand why, but it annoys me.

     

    But after seeing that video, I might just go 1600MHz instead of 1866.

     

     

    Originally posted by trancefate
    I don't trust your mobo

    Hmm, why?


  • Grunty Member Epic Posts: 8,657
    Originally posted by OG_Zorvan
    Originally posted by Grunty

    Well...

     

    He didn't deserve the Church Lady and I have too much respect for the LoLCats I usually post.

     

    So, what'd I miss? lol

    Ohhh, nothing much. I posted a picture in reply to Pocketprotector's second post. It was captioned "Well, isn't that special". 

    It was a PG-13 picture of a busty woman with red curly hair. Her head was turned to the side with a come-hither look in her eyes and her tongue sticking out.  She was also wearing a black negligee. None of the standard naughty bits were exposed.

    I received my first ever warning. Well, for pornography that is.

    Oh, yes. One other thing. She weighed about 170.

     

    How you doin', Z?

     

    I won't sully your thread anymore, Pierad.

    "I used to think the worst thing in life was to be all alone.  It's not.  The worst thing in life is to end up with people who make you feel all alone."  Robin Williams
  • TheDarkrayne Member Epic Posts: 5,297
    Originally posted by sacredfool
    Originally posted by Vannor
    Originally posted by sacredfool

    I'd be interested in actually seeing some game benchmarks, Vannor.

    Unless you use an APU without a dedicated graphics card, I am sceptical of that 5 to 10 FPS gain.

    Benchmark here using Ivy Bridge (games at the bottom, and they are, or were, exclusive to PC):

    http://www.xbitlabs.com/articles/memory/display/ivy-bridge-ddr3_4.html

    They show a 5-10% improvement in fps. Bear in mind that Crysis is highly optimized for PC, but there is still a noticeable improvement.

    Ugh. I thought I might see something like this. Those are some large FPS numbers. 

    As the FPS drops the % difference gets smaller.  Let me try to explain:

     

    With such a huge FPS it's obvious the game runs really well on the system - the GPU and the CPU have no problems handling it. Since it's not graphics or processor bound, something else has to be the limit - and only then is RAM bandwidth the likely culprit.

    Basically, all we want to achieve is a frame time that is consistently below 15 msec (roughly 66+ FPS). Frame time is how long the system takes to process the frame and how long it stays on your screen. When the frame time is above 10 msec (i.e. below 100 FPS), RAM will not influence the FPS much, since at that point the game is throttled by the GPU or CPU.

    Unless your aim is to achieve a really high FPS for bragging rights, RAM bandwidth doesn't matter. It won't affect your FPS at around 30 to 60 FPS and you won't be able to tell the difference it makes at the 100+ FPS mark. 

    I won't even mention that higher frequency RAM is nowhere near as reliable due to some issues with stability. 

    Some benchmarks with results to back up my claims:

    http://www.tomshardware.com/reviews/memory-bandwidth-latency-gaming,3409.html

    http://www.youtube.com/watch?v=dWgzA2C61z4

    Well, first.. as the total FPS drops the percentage increase doesn't change.. the extra frames gained does. If you've got around 55 FPS, the faster memory might add an extra 10%.. which gives you a well-rounded 60 fps. It also helps with the average FPS. Looking at max and min values is all well and good, but the main problem is noticeable drops and stutter. Faster memory reduces that kind of thing; it reduces stutter and increases 'consistent' fps. Plus, everything I said about CPU intensive games is still valid. Like I said, it depends on the game/application.

    Like on the link you provided, there's a clear difference with F1 but nothing very noticeable with Metro. Also, when benchmarking, people always use the most optimized games.. so there are never any proper comparisons for CPU intensive games. It's even more complicated when you consider things like medium vs high vs ultra settings.. another thing no one ever benchmarks properly.

    A recent example is Battlefield 4; it shows a massive difference when using faster memory. It was the reason I upgraded mine. In this benchmark of 1333 vs 2133 there's a 60-frame increase, almost a 60% gain, with CPU intensive settings, and still a decent 13% increase with GPU intensive settings:

    http://www.overclock.net/t/1438222/battlefield-4-ram-memory-benchmark

    And we're talking average fps here.. not max. The consistency is also better. Look at the Ultra test result and you can see that there are fewer inclines and declines with fast memory. The fps 'range' is higher, but how often the fps rises and falls is reduced.

    I think I got around an extra 10-15 frames at ultra settings, helping me keep a steady 60 fps with vsync enabled (without any noticeable drops I mean). I never tested it properly but there was a definite and very noticeable improvement, especially when doing something like walking out of a building into the open.
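
    As a loose illustration of what 'consistent' fps means versus a raw average (the two traces below are invented for the example, not measured from BF4 or the linked benchmark):

    ```python
    # Two made-up per-second fps traces for the same scene: one jumpy, one steady.
    from statistics import mean

    slow_ram = [42, 58, 40, 61, 39, 60, 41, 59]   # frequent big dips
    fast_ram = [55, 60, 57, 74, 58, 72, 59, 61]   # higher average, smaller dips

    def swing(samples):
        """Average frame-to-frame change in fps; lower means steadier delivery."""
        return mean(abs(a - b) for a, b in zip(samples, samples[1:]))

    for name, trace in (("slow RAM", slow_ram), ("fast RAM", fast_ram)):
        print(f"{name}: avg {mean(trace):.1f} fps, min {min(trace)}, "
              f"max {max(trace)}, swing {swing(trace):.1f} fps")
    ```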

    I don't suffer from insanity, I enjoy every minute of it.
  • sacredfool Member Uncommon Posts: 849
    Originally posted by Vannor

    A recent example is Battlefield 4; it shows a massive difference when using faster memory. It was the reason I upgraded mine. In this benchmark of 1333 vs 2133 there's a 60-frame increase, almost a 60% gain, with CPU intensive settings, and still a decent 13% increase with GPU intensive settings:

    http://www.overclock.net/t/1438222/battlefield-4-ram-memory-benchmark

    And we're talking average fps here.. not max. The consistency is also better. Look at the Ultra test result and you can see that there are fewer inclines and declines with fast memory. The fps 'range' is higher, but how often the fps rises and falls is reduced.

    I think I got around an extra 10-15 frames at ultra settings, helping me keep a steady 60 fps with vsync enabled (without any noticeable drops I mean). I never tested it properly but there was a definite and very noticeable improvement, especially when doing something like walking out of a building into the open.

    Interesting. I'll admit, this is actually pretty proper research there. I guess RAM can make a difference in games like BF4.

     

    Since most games are GPU bound however, I think it's pretty important to correct one statement you made. You said: 

    "Well, first.. as the total FPS drops the percentage increase doesn't change.. the extra frames gained does."

    Even the research you linked to shows that in "GPU-limited situations" the % increase drops as the framerate drops. At high framerates, with a 60% RAM frequency increase, the FPS increase was 58%. At lower FPS, the same RAM frequency increase gained him only a 13% FPS increase.

    • FPS increase 58%, RAM MHz increase 60%
    • -> in a CPU-limited situation the fps increase is nearly the same as the RAM MHz increase
    • 13% fps increase overall, 9 more minimum fps, 28 more maximum fps
    • -> even in GPU-limited situations there is still a good overall increase (spelled out below)
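
    Spelling that arithmetic out (the fps pairs are illustrative stand-ins chosen to land on those percentages, not values quoted from the linked benchmark):

    ```python
    # Relative increase from a before/after pair, as a percentage.
    def pct_increase(before: float, after: float) -> float:
        return (after - before) / before * 100.0

    # The RAM clock step discussed in the thread.
    print(f"RAM clock 1333 -> 2133 MHz: +{pct_increase(1333, 2133):.0f}%")  # ~60%

    # Hypothetical CPU/RAM-bound run: the fps gain tracks the clock gain closely.
    print(f"CPU-bound fps 100 -> 158:   +{pct_increase(100, 158):.0f}%")    # 58%

    # Hypothetical GPU-bound run: the same RAM step buys far less.
    print(f"GPU-bound fps 60 -> 68:     +{pct_increase(60, 68):.0f}%")      # ~13%
    ```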


    Originally posted by nethaniah

    Seriously Farmville? Yeah I think it's great. In a World where half our population is dying of hunger the more fortunate half is spending their time harvesting food that doesn't exist.


  • TheDarkrayne Member Epic Posts: 5,297
    Originally posted by sacredfool

    Even the research you linked to shows that in "GPU-limited situations" the % increase drops as the framerate drops. At high framerates, with a 60% RAM frequency increase, the FPS increase was 58%. At lower FPS, the same RAM frequency increase gained him only a 13% FPS increase.

    • FPS increase 58%, RAM MHz increase 60%
    • -> in a CPU-limited situation the fps increase is nearly the same as the RAM MHz increase
    • 13% fps increase overall, 9 more minimum fps, 28 more maximum fps
    • -> even in GPU-limited situations there is still a good overall increase

    That's not an accurate comparison because it's CPU intensive vs GPU intensive. The way to test it would be to show two GPU intensive tests, one more demanding, such as an older game vs a new game... like, say Crysis 1 vs Crysis 3 with both at max settings.

    Or something like comparing Skyrim vanilla vs Skyrim with 4k textures.

    I don't suffer from insanity, I enjoy every minute of it.
  • ReaperJoda Member Uncommon Posts: 76
    Originally posted by snoocky
    Errrr.. I'm sorry, but this is it?

    ....trolls will troll

  • sacredfool Member Uncommon Posts: 849
    Originally posted by Vannor

    That's not an accurate comparison because it's CPU intensive vs GPU intensive. 

    That's not what is happening. Both are just as CPU intensive; it's simply that in the 13% scenario the GPU is the bottleneck.

    (Not exactly accurate, but imagine it this way.)

    When you turn up the settings, the GPU produces fewer frames per second, which in turn means the CPU and RAM are less "busy" processing them. Even 1600 RAM can handle the lower number of high-detail frames, but it starts to struggle if the settings are low and the GPU is producing a whole load of frames per second.

    From the very thread you posted:

    The first scenario is on Ultra, which means it's GPU bound. The second scenario is RAM/CPU bound since it's on medium settings. 
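
    One way to picture where the bottleneck sits (a toy model that just assumes whichever stage is slowest caps the delivered frame rate; the per-stage numbers are invented):

    ```python
    # Toy model: delivered fps is capped by the slowest stage in the pipeline.
    def delivered_fps(gpu_fps: float, cpu_ram_fps: float) -> float:
        return min(gpu_fps, cpu_ram_fps)

    # Ultra settings: the GPU is the slow stage, so faster RAM barely shows up.
    print(delivered_fps(gpu_fps=65, cpu_ram_fps=110))   # 65
    print(delivered_fps(gpu_fps=65, cpu_ram_fps=160))   # still 65

    # Medium settings: the GPU races ahead, the CPU/RAM side sets the cap,
    # and raising that side lifts the number you actually see.
    print(delivered_fps(gpu_fps=200, cpu_ram_fps=110))  # 110
    print(delivered_fps(gpu_fps=200, cpu_ram_fps=160))  # 160
    ```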


    Originally posted by nethaniah

    Seriously Farmville? Yeah I think it's great. In a World where half our population is dying of hunger the more fortunate half is spending their time harvesting food that doesn't exist.


  • ViperDragon Member Uncommon Posts: 101
    Beautiful rig!

    A great list of free games (mostly MMORPGs): http://www.mytop10games.com/

  • TheDarkrayne Member Epic Posts: 5,297
    Originally posted by sacredfool
    Originally posted by Vannor

    That's not an accurate comparison because it's CPU intensive vs GPU intensive. 

    That's not what is happening. Both are just as CPU intensive; it's simply that in the 13% scenario the GPU is the bottleneck.

    (Not exactly accurate, but imagine it this way.)

    When you turn up the settings, the GPU produces fewer frames per second, which in turn means the CPU and RAM are less "busy" processing them. Even 1600 RAM can handle the lower number of high-detail frames, but it starts to struggle if the settings are low and the GPU is producing a whole load of frames per second.

    From the very thread you posted:

    The first scenario is on Ultra, which means it's GPU bound. The second scenario is RAM/CPU bound since it's on medium settings. 

    I can't argue with that.

    Either way, if the cost difference is minor.. it's fairly obvious that faster memory can benefit game performance in some games, depending on your complete system, whether it be consistency or max frames. So why not just pick some up? It certainly isn't going to make it any worse.

    I don't suffer from insanity, I enjoy every minute of it.
  • Doogiehowser Member Posts: 1,873
    Originally posted by snoocky
    Errrr.. I'm sorry, but this is it?

    Can you please refrain from trolling on hardware boards at least?

    I would like to see your wtfawesome computer specs though.

    "The problem is that the hardcore folks always want the same thing: 'We want exactly what you gave us before, but it has to be completely different.'
    -Jesse Schell

    "Online gamers are the most ludicrously entitled beings since Caligula made his horse a senator, and at least the horse never said anything stupid."
    -Luke McKinney

