ForgeLight Engine

Comments

  • RaunuRaunu Member UncommonPosts: 480
    Originally posted by tkreep
    Why haven't any MMOs used a watercolor style yet? I think it would be both unique and beautiful, and at the same time friendly to low-end and mid-range specs. It would also make any fantasy setting look very fantastical. Basically an updated and grittier Okami-style look.

    While I love the way Okami looks and feels, I'm not sure I'd want those graphics in an MMO. I'm not sure why; I just don't feel like they'd fit in an MMO world.

    - - "What if the hokey pokey really is what it's all about?" - -

  • UtinniUtinni Member EpicPosts: 2,209

    MMO engines will only ever get so good, since they have to work on a huge range of hardware to attract the most players.

     

    I downloaded PS2 the other day to see how it was, too, and was pretty impressed.

  • DejoblueDejoblue Member UncommonPosts: 307
    Originally posted by tkreep
    Why haven't any MMOs used a watercolor style yet? I think it would be both unique and beautiful, and at the same time friendly to low-end and mid-range specs. It would also make any fantasy setting look very fantastical. Basically an updated and grittier Okami-style look.

    GW2 screams water color.

  • Sk1ppeRSk1ppeR Member Posts: 511
    Originally posted by theNextEQ
    Originally posted by Sk1ppeR
    Ha! You paid all that money for an Ivy Bridge CPU, an expensive GPU, and an IPS monitor (lol), and you only got 1600MHz RAM. Joke's on you, buddy, joke's on you :P

    If you knew anything about gaming machines, you'd know that RAM speed makes VERY, VERY little difference in FPS.

    So you have DDR3-2133 in your machine. You MIGHT get 1 to 3 FPS more than I do..... joke's on YOU.

    When you're already pushing 100 FPS on the highest settings possible in every game on your system, spending more for bragging rights is just childish...

    You can LOL all you want, but don't act like you have anything better than a cheap Best Buy TN-panel monitor hooked up to your system.

    Nah, sadly there's no Best Buy in my country, but I do have a decent Samsung 3D monitor running at 120Hz. It's real nice :) And believe me, a 2133MHz RAM clock does make a difference :) and it's not that much more expensive, really. There's also such a thing as timings, which you want low, and a higher clock helps you achieve just that. After all, you're not playing on a console; you're playing on a Windows PC that has to keep ~70 processes alive while you play, including explorer and, if you're still on Windows 7, the Aero theme. But yeah, I'm the childish one, and you're butthurt xD. The only mistake I made two years ago was going for a dual-GPU build... that's bullshit on multiple levels. So far only CryEngine and Unreal Engine games have been able to utilize that power >.>
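    For what it's worth, the clock-vs-timings tradeoff is easy to put numbers on: true CAS latency in nanoseconds is just CL × 2000 / transfer rate (MT/s), so a higher clock with slightly looser timings can still come out ahead on latency. A rough sketch (the kit numbers are typical examples, not benchmarks, and the function name is mine):

```python
def cas_latency_ns(cl: int, mt_per_s: int) -> float:
    """True CAS latency in nanoseconds: CL cycles at an I/O clock of
    (transfer rate / 2) MHz, which works out to cl * 2000 / MT/s."""
    return cl * 2000.0 / mt_per_s

# Typical DDR3 kits: the faster kit has looser timings in cycles
# but still slightly lower latency in wall-clock time.
print(f"DDR3-1600 CL9:  {cas_latency_ns(9, 1600):.2f} ns")   # 11.25 ns
print(f"DDR3-2133 CL11: {cas_latency_ns(11, 2133):.2f} ns")  # ~10.31 ns
```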

  • ReehayReehay Member Posts: 172
    Originally posted by Sk1ppeR
    Ha! You paid all that money for an Ivy Bridge CPU, an expensive GPU, and an IPS monitor (lol), and you only got 1600MHz RAM. Joke's on you, buddy, joke's on you :P

    You're dumb. Every benchmark in the world shows the performance difference between 1600 and 2133 to be 4-5 FPS at the MOST, and only in certain games and on certain chipsets, such as the AMD APUs. In gaming-rig priority, RAM is dead last.

    Now go put your dunce cap on and sit in the corner while we all point and laugh at you.

  • Sk1ppeRSk1ppeR Member Posts: 511
    Originally posted by Reehay
    Originally posted by Sk1ppeR
    Ha! You paid all that money for an Ivy Bridge CPU, an expensive GPU, and an IPS monitor (lol), and you only got 1600MHz RAM. Joke's on you, buddy, joke's on you :P

    You're dumb. Every benchmark in the world shows the performance difference between 1600 and 2133 to be 4-5 FPS at the MOST, and only in certain games and on certain chipsets, such as the AMD APUs. In gaming-rig priority, RAM is dead last.

    Now go put your dunce cap on and sit in the corner while we all point and laugh at you.

    That's why douchebags like you upgrade their PCs every year: you're so smart that you bottleneck your CPUs with dirt-cheap Chinese RAM that isn't good enough for Windows, let alone high-quality gaming on Ultra settings.

    But yeah, I'll put my dunce cap on and play Crysis 3 on Ultra on my almost 3-year-old PC that I haven't touched since I assembled it. ^_^ *puts on my hat, points at you, and laughs my ass off*

    Now go shed those precious dollars on a Haswell CPU and a new motherboard, and keep blaming your otherwise good parts :)

    You might as well buy a Titan. I don't mind, since you're so smart c:

    PS: Why do you think the PS4 is running GDDR5 and the Xbox One is running DDR3 clocked at 2133MHz? :)

    By your logic, I suppose the engineers at Microsoft and Sony are dunces too? :O

    You must be smart, oh great one.

    Do you know how many dependencies there are between computer parts? Or do you just plug your parts into *some* motherboard and hope for the best score? :)

    And to answer yet another pointless EQ Next topic (I hope this game fails so all you guys cry like pinched schoolgirls) from a guy who, judging by the account name, probably works at Sony and is building a hype train, which is stupid, but anyway; Sony has its agents around building hype, false hype that is, but still.

    I'm not saying I don't like cartoony or photorealistic graphics, but I think this engine is a bit overkill. It's an MMO, after all, and you can have 100+ players at a single spot spamming skills, skill effects, and shit. People already cry that DFUW is resource-heavy, and we all know it looks quite bad. The provided screenshot looks good, and most people will enjoy it when running solo I suppose, but what happens when you group up for a large raid?

  • Gallus85Gallus85 Member Posts: 1,092
    Originally posted by Sk1ppeR
    *snip*

    PS: Why do you think the PS4 is running GDDR5 and the Xbox One is running DDR3 clocked at 2133MHz? :)

    I'm not saying I don't like cartoony or photorealistic graphics, but I think this engine is a bit overkill. It's an MMO, after all, and you can have 100+ players at a single spot spamming skills, skill effects, and shit.

    Sorry, I'm going to chime in here as the resident PC modding / OCing enthusiast....

    RAM speeds/timings make very, very little difference in gaming performance. The difference between loose timings at 1333MHz and high-end tight timings at 2133MHz and up is, on average, a 1 to 3 FPS increase. It really isn't that important. I've built hundreds of systems for people, and I OC just about all of them. Show me a single DDR3 benchmark that shows significant improvement from high speeds / tight timings.

    You claim bottlenecking, but take the same system and OC the CPU, or drop in a faster CPU, or OC the GPU, or drop in a faster GPU, and the FPS will rocket upwards with massive gains.

    Go from 1333MHz DDR3, OC it to 2133 and tighten the timings up, and you get like +2 FPS, lol. Sometimes not even that.

    Frankly, it's not that huge of a deal, because you can get 2133MHz kits now for about $25 more than what you'd pay for 1600MHz kits, so the price difference isn't drastic. But the gains are near zero, even going from the lowest to the highest.

    Also, why are the Xbone and PS4 running those specs? Because it's a great marketing gimmick at a very low cost. They spend an extra $3 per unit on memory speed after their massive bulk discounts (and like I said, RAM is cheap nowadays), and they get to throw that up on TV and in magazines to convince non-tech-savvy people (like yourself) that it's impressive and they should buy an Xbox.

    SEGA Genesis "Blast Processing" ring a bell?
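    For anyone who wants to sanity-check the marketing numbers: peak DDR3 bandwidth is just transfer rate × bus width × channels. A back-of-the-envelope sketch (assuming a typical desktop with a 64-bit bus per channel, dual channel; the function name is mine):

```python
def peak_bandwidth_gb_s(mt_per_s: int, channels: int = 2, bus_bytes: int = 8) -> float:
    """Theoretical peak bandwidth in GB/s for DDR memory running at
    `mt_per_s` megatransfers per second (not a measured figure)."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

for speed in (1333, 1600, 2133):
    print(f"DDR3-{speed}: {peak_bandwidth_gb_s(speed):.1f} GB/s")
```

    The headline bandwidth scales linearly with the clock, but game frame rates mostly don't, which is the whole point above.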

    Legends of Kesmai, UO, EQ, AO, DAoC, AC, SB, RO, SWG, EVE, EQ2, CoH, GW, VG:SOH, WAR, Aion, DF, CO, MO, DN, Tera, SWTOR, RO2, DP, GW2, PS2, BnS, NW, FF:XIV, ESO, EQ:NL

  • WaterlilyWaterlily Member UncommonPosts: 3,105
    Originally posted by SneakyTurtle

    I've noticed this in my pic too: the moon (which might as well be Luclin) is stretched out. It's an oval instead of round, and it doesn't look right. I wonder if this is a bug where they didn't map the texture properly.

    It's really off-putting, since everything else looks fine.

  • Gallus85Gallus85 Member Posts: 1,092
    Originally posted by Waterlily
    Originally posted by SneakyTurtle

    I've noticed this in my pic too: the moon (which might as well be Luclin) is stretched out. It's an oval instead of round, and it doesn't look right. I wonder if this is a bug where they didn't map the texture properly.

    It's really off-putting, since everything else looks fine.

    It happens because you're projecting 3D objects onto a flat surface (your monitor). The higher you push your FOV, the more objects get stretched/compressed to compensate for being 3D objects shown on a flat surface.

    This problem can be minimized by lowering your FOV.

    To see the problem exacerbated, look at someone playing on a multi-monitor display (3 monitors running a game at 5760x1200), and look at the far left of the left monitor and the far right of the right monitor. It'll look really stretched.
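    You can even put a number on the stretching. With a standard planar perspective projection, screen position goes as tan(φ), so radial magnification at angle φ off-center grows as sec²(φ) while tangential magnification only grows as sec(φ); a round moon at angle φ gets drawn roughly sec(φ) times wider than it is tall. A quick sketch (the function name is mine, purely for illustration):

```python
import math

def radial_stretch(angle_deg: float) -> float:
    """How much wider than tall a round object appears at `angle_deg`
    off the view axis under a planar perspective projection.
    Radial magnification ~ sec^2(phi), tangential ~ sec(phi),
    so the aspect distortion is their ratio: sec(phi)."""
    return 1.0 / math.cos(math.radians(angle_deg))

# At the edge of a 90-degree horizontal FOV (45 degrees off-center),
# a sphere is drawn ~41% wider than tall; at a 60-degree FOV, ~15%.
for fov in (60, 90, 120):
    print(f"{fov} deg FOV, screen edge: x{radial_stretch(fov / 2):.2f}")
```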

    Legends of Kesmai, UO, EQ, AO, DAoC, AC, SB, RO, SWG, EVE, EQ2, CoH, GW, VG:SOH, WAR, Aion, DF, CO, MO, DN, Tera, SWTOR, RO2, DP, GW2, PS2, BnS, NW, FF:XIV, ESO, EQ:NL

  • WaterlilyWaterlily Member UncommonPosts: 3,105
    Originally posted by Gallus85
    Originally posted by Waterlily
    *snip*

    It happens because you're projecting 3D objects onto a flat surface (your monitor). The higher you push your FOV, the more objects get stretched/compressed to compensate for being 3D objects shown on a flat surface.

    This problem can be minimized by lowering your FOV.

    I see. They should include a way to make sure objects aren't stretched, then. Maybe a scaling feature in the options to calibrate the dimensions correctly. I'd rather have small black bars left and right than a wrongly scaled environment.

  • Gallus85Gallus85 Member Posts: 1,092
    Originally posted by Waterlily

    I see. They should include a way to make sure objects aren't stretched, then. Maybe a scaling feature in the options to calibrate the dimensions correctly. I'd rather have small black bars left and right than a wrongly scaled environment.

    Lol. That's like saying, "Rockets would have an easier time getting into space if they didn't have to deal with gravity. They should find a way to negate the effects of gravity."

    Could it happen one day? Sure. But right now it's not within our technological reach.

    It's basically a hardware limitation. A holodeck-style setup would make more natural use of your eyes (looking at 3D objects in 3D space), so you wouldn't get this sort of effect.

    But your monitor is the main cause of the distortion, because you're viewing 3D objects in a 360-degree world rendered on a flat surface that takes up only a small part of your field of view.

    You can make it less noticeable by lowering your FOV substantially; that way the distortion is smaller and harder for your eyes to see.

    But it's a hardware limitation that is currently unavoidable. Most people prefer a larger FOV despite the distortion, because a low FOV makes the game feel like you're playing with blinders on; it's unnatural.

    Legends of Kesmai, UO, EQ, AO, DAoC, AC, SB, RO, SWG, EVE, EQ2, CoH, GW, VG:SOH, WAR, Aion, DF, CO, MO, DN, Tera, SWTOR, RO2, DP, GW2, PS2, BnS, NW, FF:XIV, ESO, EQ:NL

  • WaterlilyWaterlily Member UncommonPosts: 3,105

    I meant that you can stretch the game window in games like EQ, but I don't know what the right aspect ratio is.

    I'll make a pic..

  • WaterlilyWaterlily Member UncommonPosts: 3,105

    I'm exaggerating, but you can play EQ like this:

     

     

    and you can play it like this:

     

    It would be nice if developers said:

    "We designed this for a 16:9 ratio, for example; if you want the best results on your monitor, use this resolution," etc.
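    A small sketch of what such a calibration could do: given the aspect ratio a game was designed for and your monitor's resolution, compute the largest undistorted viewport and the black bars needed on each side (the function name is mine, purely illustrative):

```python
from fractions import Fraction

def letterbox(screen_w: int, screen_h: int, design_ratio: Fraction = Fraction(16, 9)):
    """Largest viewport with the design aspect ratio that fits the screen,
    returned as (view_w, view_h, side_bar, top_bar) in pixels."""
    if Fraction(screen_w, screen_h) >= design_ratio:
        # Screen is wider than the design ratio: bars on the left/right.
        view_h = screen_h
        view_w = int(screen_h * design_ratio)
        return view_w, view_h, (screen_w - view_w) // 2, 0
    # Screen is taller than the design ratio: bars on the top/bottom.
    view_w = screen_w
    view_h = int(screen_w / design_ratio)
    return view_w, view_h, 0, (screen_h - view_h) // 2

# A 16:10 monitor running a 16:9 game gets thin bars top and bottom:
print(letterbox(1920, 1200))  # (1920, 1080, 0, 60)
```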

  • Sk1ppeRSk1ppeR Member Posts: 511
    Originally posted by Gallus85
    Originally posted by Sk1ppeR
    *snip*

    Sorry, I'm going to chime in here as the resident PC modding / OCing enthusiast....

    RAM speeds/timings make very, very little difference in gaming performance. I've built hundreds of systems for people, and I OC just about all of them. Show me a single DDR3 benchmark that shows significant improvement from high speeds / tight timings.

    *snip*

    I'm quite certain you don't know what you're talking about, yep. Playing with voltage != overclocking. You're far from an overclocker. Have you ever overclocked your GPU's memory? Do you know how a computer system operates? North bridge / south bridge, ring any bells? What does 64-bit actually mean? Heap? Stack? Why do I mention the CPU when I trololol over RAM clocks?

    Here's a little secret: some CPUs can't actually use RAM faster than 1600MHz, so even if you clocked the RAM at 5GHz (I know, impossibru on DDR3, but it's just an example), you wouldn't see much gain at all. But if your chipset can go past that, it's a crime not to utilize it. Just saying. But meh, you're the overclocker here.

  • AceshighhhhAceshighhhh Member Posts: 185

    I have to disagree with you, OP, about Firefall looking better than Planetside 2.

    The Forgelight engine's lighting capabilities, textures, and just overall feel look WAY better than Firefall's to me. The cel-shaded, stylized, cartoony graphics are cool and all, but they just don't hold a candle to the beautiful, realistic aesthetics of PS2.

    I'd much prefer EQN look like PS2 than Firefall.

  • AlleinAllein Member RarePosts: 2,139
    Originally posted by Sk1ppeR
    *snip*

    I'm quite certain you don't know what you're talking about, yep. Playing with voltage != overclocking. You're far from an overclocker.

    Here's a little secret: some CPUs can't actually use RAM faster than 1600MHz, so even if you clocked the RAM at 5GHz (I know, impossibru on DDR3, but it's just an example), you wouldn't see much gain at all. But if your chipset can go past that, it's a crime not to utilize it.

    Can you give some examples of memory impacting gaming? Years ago it made a difference, but now it doesn't seem worth it to OC memory or get the "best" brand or whatever. I wouldn't mind some info if it will help my gaming experience. RAM is usually the part I put the least thought into when planning a new build.

  • Sk1ppeRSk1ppeR Member Posts: 511
    Originally posted by Allein
    *snip*

    Can you give some examples of memory impacting gaming? Years ago it made a difference, but now it doesn't seem worth it to OC memory or get the "best" brand or whatever. I wouldn't mind some info if it will help my gaming experience. RAM is usually the part I put the least thought into when planning a new build.

    Well, it affects your CPU throughput. If you're running Sandy Bridge, even if your motherboard supports more than 1600MHz, you wouldn't really see a benefit; maybe a slight one, but probably not. If you go for Ivy Bridge or Haswell (I'm not experienced with AMD CPUs, sorry, haven't used them in years), I recommend getting the clock the chipset supports. That way you'll squeeze some extra years out of your build. Sure, 1600 might be okay *today*, but with next-gen consoles right around the corner it's good to have, and as the overclocker above said, the difference is a few $$. I usually go for Kingston RAM because it's both cheap and effective; not that RAM is expensive anywhere these days, but still.

    Dual channel is a must. You can easily check it with CPU-Z; there are a few requirements to activate dual channel, but it's usually active if you buy a kit. You might also want to overclock your chipset's FSB (front-side bus) to match your RAM's speed; by default it runs pretty low so that it's stable. Also, when you buy 1600MHz RAM, make sure to enable that speed in your BIOS; most likely it will be running at 1333MHz by default. I highly suggest any PC enthusiast read up on FSB and RAM dependencies and how one affects the other.

    And because of the new consoles, I'm waiting for a GCN GPU (Graphics Core Next). Yeah, yeah, I know you're all nvidia fanboys and the Titan is THE GPU, but they charge an extra premium just for the trademark. I don't like that policy. *rebel* (lol)

    Bottom line: if your PC parts run in synergy, your PC will never leave you lagging like an old lady at a stoplight, unless you try to pump more out of it than it has. If you're curious, my almost-3-year-old PC is running *only* 4GB of RAM, but because it's clocked nicely I still get to enjoy ultra graphics at full HD. For me that's an achievement; not many can say that. And I still haven't overclocked the CPU (it's running the default clock) or the GPUs (same, no OC there).

  • Slavakk Member Posts: 36
    I buy high-price, well-rated quality parts but I don't OC — never had a need to... I tend to PvP or FPS most of the time, so I turn the GPU's graphics to ultra low... I prefer performance over visual quality... I'll jack up the graphics when questing/harvesting, sometimes raiding, to look at all the pretty colors... but if there is a chance I will find PvP I usually run the lowest settings... sometimes even in raids with lots of movement involved... I could easily run Ultra and max everything out and still get my 60 FPS, but I really don't need to cook my PC any more than it does already lol... But my CPUs are always Intel and my GPUs are always AMD's flagship at the time... I prefer 2-GPUs-on-1-board cards: 5970, 6990, 7990... just never got into the whole SLI/X-Fire thing... I run PS2 at both Ultra and Low settings depending on the situation and my PC doesn't really slow down or choke at either while playing, so I'm guessing I'm good for EQN... I have 3 PCs and all of them run 8 GB of RAM... they are each one generation behind the other and can still play all the games on the market now... And depending on how it all plays out with the classes or crafting, I might or might not open up my second SoE account and 2-box EQ again... 
  • Sk1ppeR Member Posts: 511

    Yep, what the guy above me said. Besides, RAM that can run at 2133 basically means it's built from higher-quality components. It's not really an overclock if it comes like that from the manufacturer; I've seen 1600 MHz and 2133 MHz RAM sticks with heatspreaders on them, and I would hardly call that overclocking. But to squeeze everything out of your PC you will need to check a few things. I highly suggest every one of you check your BIOS settings if you haven't already. Chances are your RAM is running at 1333 MHz even though the manufacturer says 1600. 
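    On Linux you can see both the rated and the currently configured DIMM speed with `dmidecode --type 17` (the exact field label varies by dmidecode version — "Configured Memory Speed" vs. "Configured Clock Speed"). A toy parser over that kind of output, with the sample text made up for illustration:

```python
import re

# Fabricated sample of the per-DIMM block `dmidecode --type 17` prints on Linux;
# real output varies by board and tool version.
SAMPLE = """\
Memory Device
        Speed: 1600 MT/s
        Configured Memory Speed: 1333 MT/s
"""

def rated_vs_configured(text):
    """Return (rated, configured) speeds in MT/s, or None where absent."""
    rated = re.search(r"^\s*Speed:\s*(\d+)", text, re.M)
    conf = re.search(r"^\s*Configured Memory Speed:\s*(\d+)", text, re.M)
    return (int(rated.group(1)) if rated else None,
            int(conf.group(1)) if conf else None)

rated, configured = rated_vs_configured(SAMPLE)
if rated and configured and configured < rated:
    print(f"RAM rated {rated} MT/s but running at {configured} MT/s -- check XMP in BIOS")
```

    If the configured speed comes back lower than the rated one, enabling the XMP profile in the BIOS is usually the fix.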

     

    And I agree with the guy above on one more thing: 2 GPUs is kind of hell. You have to dissipate a higher heat output (more coolers, more noise), and there are games, like Guild Wars 2, that just don't run well in CrossFireX/SLI mode. They just don't. So far, in my ~3 years running a dual-GPU build, only CryEngine and Unreal Engine games have run flawlessly. In all the other games I was forced to disable Crossfire to get higher FPS (fuck me, right?). It's a tough lesson I've learned: better to go for a card with 2 GPUs on 1 board than 2 separate cards if you chase performance. I do enjoy the boost when I play an Unreal/CryEngine game, but that's a rarity after all. The nVidia guys have the same problems. The engines are just not ready for dual-GPU modes yet.
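    The scaling complaint above is easy to put in numbers: a second GPU never doubles your frame rate, and with a bad driver profile it can even hurt. A crude model (the efficiency figures are illustrative guesses, not benchmarks):

```python
def dual_gpu_fps(single_gpu_fps, scaling_efficiency):
    """Estimated FPS with a second GPU.

    scaling_efficiency: 1.0 = perfect 2x scaling, 0.0 = no benefit,
    negative = a broken CrossFire/SLI profile actively costing frames.
    """
    return single_gpu_fps * (1.0 + scaling_efficiency)

base = 60.0
print(dual_gpu_fps(base, 0.9))   # well-supported engine: ~114 FPS
print(dual_gpu_fps(base, 0.0))   # no multi-GPU profile: still 60 FPS
print(dual_gpu_fps(base, -0.2))  # broken profile: worse than one card
```

    Which bucket a given game lands in is exactly what the driver profiles decide, hence the advice to check per-game CrossFire/SLI support before buying a second card.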

  • moosecatlol Member RarePosts: 1,531

     

    Why is this HERE! ^^^^^^^^^^^^^^^^^

    PlanetSide 2

    or

    Firefall

     

    Remember this is a discussion of pure graphical fidelity.

    I'd go so far as to say there is no better engine for dynamic lighting than the Forgelight engine, at least on the market today. It's honestly the aurora borealis/australis that does it for me. Moreover, does Firefall even have PhysX particle effects? I mean, come on, what is this, 2010?

    I believe what the OP is referring to, however, is the aesthetics. What the OP should realize is that PlanetSide 2 and EverQuest are two very different games with two very different design goals. The aesthetic design of PlanetSide 2 is naturally more barren than Firefall's, as PlanetSide 2 needed to create a world that also serves as a battlefield for infantry/armor/air, whereas Firefall only focuses on infantry map layouts.

     

    Ultimately, the Forgelight engine is superior in graphical fidelity to the Offset engine. That's just simple math.

  • Sk1ppeR Member Posts: 511

    Not sure if you are talking about engines used in MMOs or game engines in general, but if you mean overall, here's a list of the leviathans: 

    Frostbite

    CryEngine

    Unreal Engine

    InfinityWars Engine

    They have it all covered, from lighting to destruction. The level of detail is just sick on these. 

  • Aceshighhhh Member Posts: 185
    Originally posted by DMKano
    Originally posted by Sk1ppeR

    Not sure if you are talking about engines used in MMOs or game engines in general, but if you mean overall, here's a list of the leviathans: 

    Frostbite

    CryEngine

    Unreal Engine

    InfinityWars Engine

    They have it all covered, from lighting to destruction. The level of detail is just sick on these. 

    True, IMO CryEngine 3 is far superior to Forgelight, which only supports DX9; none of the more advanced DX11 features are supported in Forgelight, keep that in mind. 

    Perhaps they've upgraded the engine for EQN? It would be logical, seeing as it's 2013 and most modern "AAA" games out there support DX11.
