
Anyone else think we've reached our limit on graphics?

13 Comments

  • AercusAercus Member UncommonPosts: 775

    Originally posted by Anubisan

    Games will keep looking more and more realistic until eventually no one will be able to tell the difference. And then there will be something else to push the boundaries, like full 3D or something we haven't even comprehended yet. Personally, I just hope I'm around long enough to play MMOs on a holodeck!

     Well, halfway there, we already have holograms :)

  • HashbrickHashbrick Member RarePosts: 1,851


    Originally posted by MMO_Doubter

    Originally posted by cpc71783
    You sure about that? Check out voxels (essentially 3D pixels) and their use in Unlimited Detail. Unlimited Detail is produced without the need for a video card.
    Unlimited Detail uses points (voxels) instead of polygons, and renders an unlimited amount of detail in a virtual world. It does this by only rendering what is visible on the screen, instead of rendering the entire world around you all at the same time:

    Eat your words.
    Very interesting video.

    Unlimited Detail is what the 3DO was to the video game market: junk. It will be ignored; it's all theory, and while they try to say the process is simple, there is nothing anyone can do without an SDK. I just think that with the way the company is working with it and the direction it is going, it just won't make it. There are better ways and theories that will set the new standard than this.

    As for drblatz, I hope to hell not many companies jump on this ruse called "Donnybroke"; you seem to be a salesman for it, since if I look back through your post history all I see is donnybroke, donnybroke, donnybroke. The last thing we need is Microsoft controlling that. We can already thank them for terrible standards throughout the years, such as the lovely IE6-8, which is still behind and makes a developer's job a living hell. Not to mention the later DirectX releases that were very rough around the edges.

    [[ DEAD ]] - Funny - I deleted my account on the site using the cancel account button.  Forum user is separate and still exists with no way of deleting it. Delete it admins. Do it, this ends now.
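The "only render what is visible" idea quoted above is, at its core, view-frustum culling, something ordinary engines already do. A minimal 2D sketch in Python (names and numbers are illustrative; this is not Euclideon's actual algorithm):

```python
import math

def visible(points, cam_pos, cam_dir, fov_deg):
    """Keep only the points inside the camera's field of view:
    a toy stand-in for 'render only what is on screen'."""
    half_fov = math.radians(fov_deg) / 2.0
    out = []
    for p in points:
        dx, dy = p[0] - cam_pos[0], p[1] - cam_pos[1]
        dist = math.hypot(dx, dy)
        if dist == 0:
            continue
        # angle between the view direction and the direction to the point
        dot = (dx * cam_dir[0] + dy * cam_dir[1]) / dist
        if math.acos(max(-1.0, min(1.0, dot))) <= half_fov:
            out.append(p)
    return out

world = [(1, 0), (2, 0.5), (-3, 0), (0, 5)]
on_screen = visible(world, cam_pos=(0, 0), cam_dir=(1, 0), fov_deg=90)
```

Everything behind or beside the camera is skipped before any rendering work is spent on it; the debate in the thread is really about whether Unlimited Detail's per-pixel search does this better than polygon pipelines.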
  • ForceQuitForceQuit Member Posts: 350

    Originally posted by Hashbrick

    Unlimited Detail is what the 3DO was to the video game market: junk. It will be ignored; it's all theory, and while they try to say the process is simple, there is nothing anyone can do without an SDK. I just think that with the way the company is working with it and the direction it is going, it just won't make it. There are better ways and theories that will set the new standard than this.

    Again, Unlimited Detail Technology as of now is total snake oil. Yes, the concept is interesting, but they have so far shown zero capacity to provide hard answers to very real mathematical questions. Nor, as you have said, even a working SDK. Add to that totally invented terms like MASS CONNECTED PROCESSING and claims like "The result is a perfect pure bug free 3D engine that gives Unlimited Geometry running super fast, and it's all done in software," and I'm surprised anybody could possibly take this company seriously. The fact is, nothing is unlimited. You may be able to optimize how you render your data, sure, but calculating a simple fractal algorithm and slapping it onto a mesh is a very disingenuous way of saying "infinite detail." LOL...
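ForceQuit's point about fractals can be made concrete: procedural algorithms will happily generate as much geometry as you ask for, which is "unlimited" only in a cheap sense, since none of it is authored detail. A toy 1D midpoint-displacement sketch (illustrative only):

```python
import random

def midpoint_displace(left, right, depth, roughness=0.5, rng=None):
    """Recursively split a segment, nudging each midpoint by a random
    offset. Every extra level of depth doubles the sample count, so
    'detail' is unlimited in exactly this procedural, for-free sense."""
    rng = rng or random.Random(0)
    if depth == 0:
        return [left, right]
    mid = (left + right) / 2 + rng.uniform(-roughness, roughness)
    first = midpoint_displace(left, mid, depth - 1, roughness / 2, rng)
    second = midpoint_displace(mid, right, depth - 1, roughness / 2, rng)
    return first + second[1:]  # drop the duplicated midpoint

profile = midpoint_displace(0.0, 0.0, depth=4)  # 2**4 + 1 = 17 heights
```

Raising `depth` yields ever more samples at no authoring cost, which is why "infinite detail" demos of noisy terrain prove much less than they appear to.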

  • rebelhero1rebelhero1 Member Posts: 229

    Originally posted by wankydrake

    After looking at Crysis 2 screenshots, I don't see graphics really getting better than that in the next year or beyond. I know a lot of small details could get improved, but unless the whole 3D glasses thing catches on, I just can't see it getting higher than where we are now.

    We aren't even close.

    There are infinite things they could make HUMONGOUS improvements on.

    Most games are jerry-rigged for optimization, meaning a lot is sacrificed. Things you don't notice are missing, but would definitely notice if they were there.

    Imagine life-like hair, each strand being simulated, blowing in the wind.

    Imagine actual stitches on clothing.

    When your character gets wounded, he actually receives a proper wound WHERE he got hit.

    Trust me man, there's a LOT that hasn't been done. And by a lot, I mean a near-endless supply of awesome limited only by imagination and the time they put into it.

    Playing: *sigh* back to WoW :(
    --------
    Waiting for: SW:TOR, APB, WoD
    ---------
    Played and loved: Eve and WoW
    --------
    Played and hated: WoW:WotLK, Warhammer, every single F2P

  • SwampRobSwampRob Member UncommonPosts: 1,003

    Far from it. The problem is, not everyone has an ultra-fast computer or a high-speed connection. So the companies that make these games correctly tone down the graphics so that their games can be played by a wider audience.

  • Rockgod99Rockgod99 Member Posts: 4,640

    Another two console generations and we will be at 100% photorealism.

    Games out now are so good-looking that we will be able to play games like Uncharted 2, MGS4 and Gears of War 2 10-15 years from now and they will still hold up.

    With that said, I still think we have a couple of upgrades left.

    Just look at the newest games; they still have jaggies and slowdown.

    Playing: Rift, LotRO
    Waiting on: GW2, BP

  • ThorkuneThorkune Member UncommonPosts: 1,969

    Not even close I would think.

  • RobsolfRobsolf Member RarePosts: 4,607

    Originally posted by wankydrake

    After looking at Crysis 2 screenshots, I don't see graphics really getting better than that in the next year or beyond. I know a lot of small details could get improved, but unless the whole 3D glasses thing catches on, I just can't see it getting higher than where we are now.

    Every time I've come to think this, along comes a game that changes my mind.

    I remember when Star Wars: Jedi Knight came out, thinking "it just doesn't get any better than this." Now browser games can do better. TIE Fighter, with amazing 640x480 gfx: "dude, it's like I'm THERE!" :)

    There's a LOT further to go even now, particularly in backgrounds. But I do think there will come a point of diminishing returns.

    Also, we're currently looking at a screen which shows us a very limited field of vision. There's a LONG way to go in that direction.

  • LoktofeitLoktofeit Member RarePosts: 14,247

    Not sure if it's been mentioned before but here is some of the stuff currently being worked on:

     

    When it comes to graphics, I think we still have a lot left to surprise us in the near future.

    There isn't a "right" or "wrong" way to play, if you want to use a screwdriver to put nails into wood, have at it, simply don't complain when the guy next to you with the hammer is doing it much better and easier. - Allein
    "Graphics are often supplied by Engines that (some) MMORPG's are built in" - Spuffyre

  • brezelbrezel Member Posts: 202

    I saw Crysis 2 and thought to myself: "wow, these shots look photorealistic!" Well, after I clicked on the thumbnail and saw the same shot at a higher resolution, the illusion was gone.

    Maybe not in the next 1-2 years, but I bet we will see revolutionary graphics engines again! Technology never sleeps, and the companies want to make tons of cash with it! They will continue to improve graphics, using better textures and more realistic lighting.

    The next step will be real-time ray tracing in games. Realistic lighting is the key to success.

    http://www.youtube.com/watch?v=vPeZf8WJDM4
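The real-time ray tracing predicted above is built from one primitive operation repeated millions of times per frame: intersecting a ray with scene geometry. A minimal ray-sphere test in Python (illustrative sketch, not a full renderer; the ray direction is assumed normalized):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the distance along a normalized ray to its first hit on a
    sphere, or None on a miss: the core test inside any ray tracer."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    # quadratic at^2 + bt + c with a = 1 for a unit-length direction
    b = 2 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2   # nearer of the two roots
    return t if t >= 0 else None

# a ray fired down +z hits a unit sphere centered 5 units away at t = 4
hit = ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
```

Doing this per pixel, per bounce, per light is what makes real-time ray tracing so expensive, and why the lighting payoff the post mentions took dedicated hardware to reach.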

  • IzkimarIzkimar Member UncommonPosts: 568

    Although they can make games and such well above the graphical levels we have seen thus far, this is actually coming to a limiting point, because Moore's Law (for those who don't know, it states that the number of transistors on a chip doubles roughly every two years; Moore himself later revised his original one-year figure) is going to die soon if not rescued by new technologies. Silicon will soon be dead: the chips we use in computers are silicon, and the top-of-the-line chips are already down to what, 32nm? The problem is that every couple of years a new chip comes out that shrinks the feature size, and eventually we're going to get down to around 10nm, and that will be the stopping point; at around 8nm you can no longer confine the electrons (they tunnel), which will result in the chips short-circuiting. However, this could take another 5 years, yet when it does happen, all technological advances will stop until we learn to use a new type of technology such as quantum or DNA computers.
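For scale, the shrink described above can be sketched with the classic node arithmetic: each process generation multiplies the linear feature size by roughly 0.7 (halving transistor area) and arrives about every two years. A back-of-the-envelope Python sketch (the 0.7 factor and two-year cadence are rules of thumb, not guarantees):

```python
node_nm = 32.0    # roughly where top-end CPUs were at the time
limit_nm = 8.0    # the post's claimed physical wall
years = 0
while node_nm * 0.7 >= limit_nm:
    node_nm *= 0.7    # one full shrink: ~0.7x linear, ~0.5x area
    years += 2        # one shrink roughly every two years
# three more shrinks (32 -> 22.4 -> 15.7 -> 11.0 nm) in about 6 years
```

On those rules of thumb, the post's "another 5 years" of runway is at least the right order of magnitude.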

  • ThomasN7ThomasN7 Member CommonPosts: 6,690

    Technology waits for no man. I'm sure we'll see something even better within the next 5 years or so.

  • gauge2k3gauge2k3 Member Posts: 442

    No, we have not reached our limit on transistor size, and even if we do reach it, with the memristor now a proven concept it's only going to get better.

    I won't be satisfied until I can jack in like the matrix.

  • PhryPhry Member LegendaryPosts: 11,004

    Originally posted by WardTheGreat

    Although they can make games and such well above the graphical levels we have seen thus far, this is actually coming to a limiting point, because Moore's Law (for those who don't know, it states that the number of transistors on a chip doubles roughly every two years; Moore himself later revised his original one-year figure) is going to die soon if not rescued by new technologies. Silicon will soon be dead: the chips we use in computers are silicon, and the top-of-the-line chips are already down to what, 32nm? The problem is that every couple of years a new chip comes out that shrinks the feature size, and eventually we're going to get down to around 10nm, and that will be the stopping point; at around 8nm you can no longer confine the electrons (they tunnel), which will result in the chips short-circuiting. However, this could take another 5 years, yet when it does happen, all technological advances will stop until we learn to use a new type of technology such as quantum or DNA computers.

     Already we're using more and more cores in a single computer; I think Intel now has a 6-core chip with 12 virtual processors. There's nothing to say that instead of single processors becoming ever more powerful, you just have more processors working together; in a year's time it's not unreasonable to think we might be seeing PC systems with even more cores. The limit then will more than likely be the RAM and the operating systems themselves. What are the possibilities of a 128-bit operating system? After all, in order to fully utilise any advance in the core technology, there has to be the software to harness it.
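"More processors working together" already has direct software support; a minimal Python sketch of fanning independent per-row work out across all available cores (the `render_row` workload is made up for illustration):

```python
from multiprocessing import Pool

def render_row(y):
    # Stand-in for independent per-scanline work, e.g. shading one row
    # of pixels; real engines split frames across cores much like this.
    return sum((x * y) % 7 for x in range(256))

if __name__ == "__main__":
    with Pool() as pool:   # defaults to one worker process per CPU core
        rows = pool.map(render_row, range(240))
    total = sum(rows)
```

The catch the post hints at is real: software only scales this way when the work divides into independent pieces, which is why harnessing extra cores takes deliberate engineering.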

  • SwoogieSwoogie Member UncommonPosts: 399

    Originally posted by gauge2k3

    No, we have not reached our limit on transistor size, and even if we do that, with the memristor now a proven concept it's only going to get better.

    I won't be satisfied until I can jack in like the matrix.

     Actually, for processors we are nearing the limit. We have 32nm tech on these new i3/i5/i7 processors. They are silicon-based, and silicon CANNOT be stretched past 22nm. When this limit is reached in a couple of years, we will have to find another material to make our processors from.

     

     

    :(

  • Chile267Chile267 Member UncommonPosts: 141

    My tri-SLI, overclocked, water-cooled GTX 470s say no, we have not reached our limit on graphics. :)

  • EgnarEgnar Member Posts: 2

    We're still in the dark ages of computer graphics, in my opinion. I mean, seriously, it started with television and film, which means graphics technology of this kind has only been around for roughly a century; in the grand scheme of things, that is a tiny little speck of time.

    Computing power is increasing exponentially, which means that in 5 years the average desktop computer for somebody who can't afford a "crazy gaming rig" will have more than double the average power of today... Games like Crysis 2 will run on full settings without issue on computers fresh off the assembly line at Dell.

    What this means is that gaming and graphics companies are and will be looking to push the envelope to the next level, the next big thing. All we have right now is semi-realistic polygon-based modeling of 3D figures projected onto a 2D screen. With the advent of screens and technology to pan an image over multiple displays so that you gain a full perspective (as opposed to just the front), we'll see a lot more companies trying to delve into 3D.

  • IzkimarIzkimar Member UncommonPosts: 568

    All of our technology is silicon-based, and no matter how many cores we add, we will reach the limit of silicon technology. Each year the nanometers shrink down; even with GPUs and memory, it is all silicon-based. This has even been stated by Michio Kaku, a renowned physicist. We will eventually have to move on from silicon. Yes, we may be in the Dark Ages of computers, but do you think computing power is just some magical force that increases on its own every year and will never cease? No, not with current technology. Silicon has its limits, and if we can't successfully use something like DNA computing or quantum computing or some other sort of technology, then graphical and technological advances will ultimately stop until we do.

  • IzkimarIzkimar Member UncommonPosts: 568

    Originally posted by SaintViktor

    Technology waits for no man. I'm sure we'll see something even better within the next 5 years or so.

     Lolz, I find this statement funny. Technology does wait on man, as a matter of fact.

  • Mellow44Mellow44 Member Posts: 599

     Anyone else think we've reached our limit on graphics?

    Hardly.

    When games look like real life, THEN we will have reached our limit on graphics.

    I reckon we have a couple of decades before that happens.

    All those memories will be lost in time, like tears in the rain.

  • alakramalakram Member UncommonPosts: 2,301

    Originally posted by wankydrake

    After looking at crysis2 screenshots i dont see in the next year and up of graphics really geting better then that i know a lot of small details could get improved but unless the whole 3D glasses catch on i just cant see it geting higher then we are now.

    No, never. Some years ago, when PCs were getting started, Bill Gates (or someone else in PC computing) supposedly said that 640KB of RAM is more than enough. We run systems with 4,000KB of RAM now and growing... Same with graphics: we never have enough. There is always room for improvement, always.



  • IzkimarIzkimar Member UncommonPosts: 568

    4,000KB? More like 6 gigs, bro. Yet I do agree, and I hope we will continue to grow technologically; we're still gonna have to ditch silicon computing. DNA computing is already making its way: they've already made some chips out of it. Its potential is amazing; it is much cheaper and much faster than silicon. It's what we would need to go below the 22nm mark and retain stability, plus the machine would be much more of a beast than what silicon offers right now. Computer technology is going to change considerably; silicon was just the first step.
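The jump between the two figures in these posts (640KB then, ~6GB now) is easy to put in doubling terms; a quick Python check (the 6GB figure is the poster's, not a measured average):

```python
import math

then_kb = 640                # the apocryphal "640KB is enough" figure
now_kb = 6 * 1024 * 1024     # ~6GB, a high-end gaming rig of the day
doublings = math.log2(now_kb / then_kb)
# ~13.3 capacity doublings; at a doubling every couple of years, that
# roughly spans the three decades between the two figures
```

That steady doubling is exactly the growth curve the thread is debating the end of.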

  • astrob0yastrob0y Member Posts: 702

    We are soon there. I would say we'll reach the limit in six months. Then it won't get any better. Never ever again.

    I7@4ghz, 5970@ 1 ghz/5ghz, water cooled||Former setups Byggblogg||Byggblogg 2|| Msi Wind u100

  • DaywolfDaywolf Member Posts: 749

    Originally posted by WardTheGreat

    4,000 kb's?  More like 6 gigs bro. 

    'Tis the age of multi-core systems; it's not uncommon to find systems with far more, such as the Sun Fire X4600 M2 with 512GB of RAM (and that's not even the most). 4,000KB? (4MB) Well... good for an FSB, though many run more.

    M59, UO, EQ1, WWIIOL, PS, EnB, SL, SWG. MoM, EQ2, AO, SB, CoH, LOTRO, WoW, DDO+ f2p's, Demo’s & indie alpha's.

  • NytakitoNytakito Member Posts: 381

    Originally posted by WardTheGreat

    All of our technology is silicon-based, and no matter how many cores we add, we will reach the limit of silicon technology. Each year the nanometers shrink down; even with GPUs and memory, it is all silicon-based. This has even been stated by Michio Kaku, a renowned physicist. We will eventually have to move on from silicon. Yes, we may be in the Dark Ages of computers, but do you think computing power is just some magical force that increases on its own every year and will never cease? No, not with current technology. Silicon has its limits, and if we can't successfully use something like DNA computing or quantum computing or some other sort of technology, then graphical and technological advances will ultimately stop until we do.

     There are other technologies in the mix. I was just reading up on this a little while ago; I don't remember all the details, but two things stick out. One problem we have now, which is why multiple cores are prevalent, is the good old galactic speed limit known as the speed of light. Pumping electrons through nano-circuitry at ~3.0GHz is about as fast as we can go on one core with current technology. We are also reaching the limits of how small we can make things and still have them function in a non-quantum fashion.
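The speed-of-light point above is easy to check with a one-liner: at 3GHz, a signal moving at light speed covers only about 10cm per clock tick (and less in real wires), which is part of why clock speeds stalled and core counts rose instead:

```python
c_m_per_s = 299_792_458      # speed of light in a vacuum, m/s
clock_hz = 3.0e9             # a ~3GHz core
cm_per_cycle = c_m_per_s / clock_hz * 100
# ~10cm per tick: a signal can't even cross a motherboard in one cycle
```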

    "If I'd asked my customers what they wanted, they'd have said a faster horse." - Henry Ford
