Did modern graphics kill the seamless world?

Comments

  • fenistil Member Posts: 3,005
    Originally posted by botrytis
    Originally posted by Tamanous

    Advanced graphics has nothing to do with how seamless a game is. The limitation is the engine the world is designed in. How far you see and how much is rendered at once is controlled by your own client and system. Zones are controlled by the game engine and by how the game servers load balance. Developers limit themselves through their engine. Seamless worlds are possible now and forever into the future; the developers simply have to decide how lazy they want to be. It is about making the game appear seamless. It always has seams, but how much you notice them is up to the limitations put in place by developers.

    Also, how much money the game company pays for the servers for the game. That is the biggest issue.

    Yeah, you can use server processing power much more efficiently when a significant amount of gameplay takes place in lots of instances, especially now that virtualization is so advanced.

    With a seamless game you have a huge amount of processing power sitting idle in case some place in the game gets swamped with players. That means more servers and more powerful servers. It is common knowledge that server costs do not scale well and high-end servers cost insane money.

  • DavisFlight Member Common Posts: 2,556
    Originally posted by Quizzical
    Originally posted by DavisFlight

    First, there's more content packed into Vanguard in each square than most MMOs have in entire zones. It's not barren. Not remotely.
    Neither is Darkfall.

    Vanguard also isn't terribly seamless.  When you cross a zone boundary and the game freezes for several seconds while it loads the next zone, that's a seam.  It has a severe case of hitching from trying to load and unload various things as you move around within a zone.

    Sure, if you add an SSD, those problems go away.  But you know what else could make those problems go away?  Loading screens.

    I usually hitch for half a second on zone lines, and I don't hitch at all going into dungeons; they're just a continuous part of the world. Far more seamless than the majority of MMOs out there.

  • DavisFlight Member Common Posts: 2,556
    Originally posted by botrytis
    Originally posted by Tamanous

     

    Also, how much money the game company pays for the servers for the game. That is the biggest issue.

    Not really. If devs with a 30-man team and a $500k budget could make seamless, uninstanced MMOs on dial-up in 1999, then devs can do it today. It just takes genuinely good game design, and modern MMO devs just clone WoW instead.

  • Quizzical Member Legendary Posts: 25,507
    Originally posted by Tamanous

    How far you see and how much is rendered at once is controlled by your own client and system.

    That's one way to make a game world seamless:  make it so that you can't see very far away.  That means that you wouldn't have to load objects until they get closer, and since many objects never get close enough, you'd skip loading them entirely.
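
That load-only-what-is-near policy can be sketched in a few lines (a minimal illustration with hypothetical names, not any particular engine's streaming system; real engines stream in the background and add hysteresis so objects don't thrash at the boundary):

```python
def objects_to_load(player_pos, objects, view_distance):
    """Return only the objects within view distance of the player;
    anything that never comes close is never loaded at all."""
    px, py = player_pos
    return [
        obj for obj in objects
        if (obj["x"] - px) ** 2 + (obj["y"] - py) ** 2 <= view_distance ** 2
    ]

world_objects = [
    {"name": "tree", "x": 10, "y": 0},
    {"name": "castle", "x": 500, "y": 500},  # far away: skipped entirely
]
visible = objects_to_load((0, 0), world_objects, view_distance=50)
# Only the nearby tree is loaded; the distant castle never is.
```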

    But that comes with a huge drawback, namely, that you can't see very far.  Personally, I hate it when objects pop into view only once I'm close enough that I should have been able to see them from much further away.  Even worse is when objects don't pop into view until you're close enough that it mattered what was there before the object appeared.

    But my real objection is when objects pop up in the center of the screen.  When something scrolls in off of the side (e.g., from the player rotating the camera), I'm fine with that.  The way to have a short view distance without objects looking stupid when they pop into view is to have an isometric viewpoint, like 2D games traditionally have, rather than the modern 3D perspective.

    An isometric perspective does NOT mean that you're stuck with 2D graphics, however.  You can have nearly all modern 3D graphical effects with an isometric perspective.  Indeed, OpenGL used to have an API command (glOrtho) to do exactly that.  ("Used to have" because now if you want the same effect, you're supposed to build the projection matrix yourself and apply it in your vertex shaders rather than relying upon the fixed-function pipeline.)

    And better yet, you can have all of modern 3D graphics, and give players a choice of whether they want an isometric perspective or a 3D perspective.  This is actually pretty easy to do, like letting a player adjust the camera view distance.

    The first several stages of a modern graphics pipeline (whether in DirectX or OpenGL) are all about getting to the point where you have a bunch of triangles and specify the exact window position and depth of each vertex.  (The depth is to figure out which triangle is in "front" at a given point, so that whatever is in front is visible but blocks the view of whatever is behind it.)

    Actually getting to that point is a fair bit of work, as a game implicitly has a bunch of different coordinate systems floating around.  Any object that you can rotate needs to have its own coordinate system.  Each zone of your game world (which can be much smaller than "if you cross here, you have a loading screen" zones) has its own coordinate system, so that you can say that this tree is at this point on the ground instead of off over there.  There is also typically a camera coordinate system, where the camera is at the origin and the direction it is facing is one axis.

    Everything up to and including putting things in the camera coordinate system does not care what perspective you're using.  Converting from camera coordinates to screen and depth coordinates (technically clip coordinates, but that immediately gets converted to screen and depth coordinates) consists of multiplying by a single 4x4 matrix.  Change the matrix and you change the perspective.  Switching from 3D perspective to isometric consists of changing a single 4x4 matrix.  And probably only changing a few numbers in that matrix, no less.

    If all you change is a single matrix, it will work, though it will be less than ideal.  Some effects that depend on distance from the camera (e.g., tessellating an object into more triangles when closer and fewer when further away) should be changed to not depend on distance when using an isometric perspective.  But it's not that hard to do that and still give players an option to use either perspective.  It's probably easier than letting players switch between first person and third person perspective.
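
To make the single-matrix claim concrete, here is a minimal sketch in plain Python (hypothetical helper names following the old fixed-function conventions; a real renderer would do this on the GPU). The same camera-space point goes through two different 4x4 matrices, and only the perspective one makes screen position depend on depth:

```python
import math

def perspective(fov_y, aspect, near, far):
    """OpenGL-style perspective projection matrix (row-major 4x4)."""
    f = 1.0 / math.tan(fov_y / 2.0)
    return [
        [f / aspect, 0, 0, 0],
        [0, f, 0, 0],
        [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0, 0, -1, 0],
    ]

def orthographic(half_w, half_h, near, far):
    """Isometric-style orthographic projection: x/y no longer shrink with depth."""
    return [
        [1 / half_w, 0, 0, 0],
        [0, 1 / half_h, 0, 0],
        [0, 0, -2 / (far - near), -(far + near) / (far - near)],
        [0, 0, 0, 1],
    ]

def project(m, point):
    """Multiply a camera-space point by the matrix, then divide by w."""
    x, y, z = point
    out = [sum(row[i] * v for i, v in enumerate((x, y, z, 1.0))) for row in m]
    return [c / out[3] for c in out[:3]]

persp = perspective(math.pi / 2, 1.0, 0.1, 100.0)
ortho = orthographic(10.0, 10.0, 0.1, 100.0)

near_pt, far_pt = (1.0, 0.0, -2.0), (1.0, 0.0, -20.0)
# Under perspective, the distant point lands nearer the center of the screen;
# under orthographic, its screen position is independent of depth.
```

Everything before this step (object, zone, and camera coordinate transforms) is identical in both cases, which is why offering both perspectives is cheap.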

    And that isometric perspective doesn't mean "2.5D" graphics, either.  You could let players rotate the camera arbitrarily, or even make the camera follow the player like many games with a 3D perspective do.

    And this could easily be freed from some of the problems of traditional 2D games.  A lot of 2D games made it so that having a larger monitor resolution let you see further, which could be a major gameplay advantage.  2D sprites don't stretch well, but if the underlying graphics are 3D, then it's easy to let players adjust to make the game world appear larger or smaller, and pair any arbitrary view distance with any monitor resolution.

    Of course, then you get the problem that in order to see further, you need to load stuff faster.  And that means that people with an SSD have an advantage over people running the game off of a hard drive.  So you've really just traded one set of problems for another.  It can be done, and wouldn't even be all that hard, but whether it should be done is a different question.

  • Quizzical Member Legendary Posts: 25,507
    Originally posted by botrytis

    Also, how much money the game company pays for the servers for the game. That is the biggest issue.

    That has basically nothing to do with whether you want to make a game world seamless or not.  It's not even clear whether having zones separated by loading screens would increase or decrease server costs, but the effect would be trivial.  My guess is that zoned would be more expensive, because then you have to pay for bandwidth for players to download your loading screens.

  • Quizzical Member Legendary Posts: 25,507
    Originally posted by fenistil

    Yeah, you can use server processing power much more efficiently when a significant amount of gameplay takes place in lots of instances, especially now that virtualization is so advanced.

    With a seamless game you have a huge amount of processing power sitting idle in case some place in the game gets swamped with players. That means more servers and more powerful servers. It is common knowledge that server costs do not scale well and high-end servers cost insane money.

    Saying this processor core will do the computations for this zone and that core will do the computations for that zone would be a really stupid way to set it up.  The amount of processing work that a server has to do to keep track of what is going on in the game world is vastly less than what a client has to do, because most of the client's work is to draw things on the screen (or set things up to draw them on the screen, load things that will be drawn on the screen, etc.).

    The server cost for a given amount of computational power has gotten vastly cheaper as time passed, too.  Moore's Law applies on the server side, too--and perhaps more so than for client processors, since server workloads are much easier to scale to many processor cores.

    It will depend considerably on how things are implemented, but I'd expect Internet bandwidth to tend to be a much greater "server" cost for online games than processing power, memory, or storage.

  • Quizzical Member Legendary Posts: 25,507
    Originally posted by DavisFlight

    Not really. If devs with a 30-man team and a $500k budget could make seamless, uninstanced MMOs on dial-up in 1999, then devs can do it today. It just takes genuinely good game design, and modern MMO devs just clone WoW instead.

    Yes, they can do it today, the same way they did it in 1999:  restricting the amount of loading to what they required in 1999, so that the graphics look in some (but not all!) ways like they're straight out of 1999.  There are trade-offs.

  • fenistil Member Posts: 3,005
    Originally posted by Quizzical
    Originally posted by fenistil

    Yeah, you can use server processing power much more efficiently when a significant amount of gameplay takes place in lots of instances, especially now that virtualization is so advanced.

    With a seamless game you have a huge amount of processing power sitting idle in case some place in the game gets swamped with players. That means more servers and more powerful servers. It is common knowledge that server costs do not scale well and high-end servers cost insane money.

    Saying this processor core will do the computations for this zone and that core will do the computations for that zone would be a really stupid way to set it up.  The amount of processing work that a server has to do to keep track of what is going on in the game world is vastly less than what a client has to do, because most of the client's work is to draw things on the screen (or set things up to draw them on the screen, load things that will be drawn on the screen, etc.).

    The server cost for a given amount of computational power has gotten vastly cheaper as time passed, too.  Moore's Law applies on the server side, too--and perhaps more so than for client processors, since server workloads are much easier to scale to many processor cores.

    It will depend considerably on how things are implemented, but I'd expect Internet bandwidth to tend to be a much greater "server" cost for online games than processing power, memory, or storage.

    That's not what I am talking about.

    Imagine: the game is zoned and it also uses zone multiplying / instancing, and we're talking about an MMORPG.

    Then on release, when the game is packed, the game spawns several instances of the most popular zones.

    Let's assume each zone is pretty big and each zone instance runs on a separate physical server. Let's say 300 people per zone instance. (One zone instance per server or per core, whatever - I'm trying to show the system, not exact specifics.)

    Then, when players progress into the endgame and the strain on open-world zones is smaller, you can safely take the big pool of servers you used and dynamically reallocate them to instanced dungeons, arenas, battlegrounds, etc., where most of the playerbase will be.

    ---------------------

    Now take a seamless MMORPG. Of course the game is handled by multiple servers - each server takes care of a certain part of the game terrain (and the parts don't have to be equal, since cities, for example, can be more populated).

    BUT

    since there is no zone multiplying / instancing, one server has to be ready to handle more than the 300-player limit of the instanced-zone game. So you need a stronger server.

    Of course you can just make the internal 'grid / zone' lines smaller instead, so that each physical server / CPU / core handles a smaller area = fewer players = less strain. But there is a real limit to that, since you also want to avoid too-frequent internal zone / grid borders - because, as we know, that's the tricky part, and the more servers you have to synchronize with each other, the harder it gets and the more potential problems you have.

    Then you also can't really take those physical servers and have them process other things, because you need that processing power available at all times in case there is, for example, a player event (or mass PvP, if you use the seamless world for that) where hundreds of players go to one zone - they could crash the server that way if you don't have enough processing ready.

    You just have more control over processing power and its use in a game that uses a lot of instancing for both "open world" and endgame "instanced dungeon" gameplay, compared to, for example, mass PvP in a seamless world. Plus it is much easier to write server software that runs on fewer cores.
    Remember, I am not only talking about a simple comparison between, for example, four zones divided by loading screens vs. four zones without loading screens between them. Hope that is clear.

    Hope you understand me, because I shortened it - I really don't want to format it and write it better; that's the kind of thing I do when someone is paying me, not when I'm having a conversation on video game forums ;'p

  • ArChWind Member Uncommon Posts: 1,340
    Originally posted by Quizzical
    Originally posted by fenistil

    Yeah, you can use server processing power much more efficiently when a significant amount of gameplay takes place in lots of instances, especially now that virtualization is so advanced.

    With a seamless game you have a huge amount of processing power sitting idle in case some place in the game gets swamped with players. That means more servers and more powerful servers. It is common knowledge that server costs do not scale well and high-end servers cost insane money.

    Saying this processor core will do the computations for this zone and that core will do the computations for that zone would be a really stupid way to set it up.  The amount of processing work that a server has to do to keep track of what is going on in the game world is vastly less than what a client has to do, because most of the client's work is to draw things on the screen (or set things up to draw them on the screen, load things that will be drawn on the screen, etc.).

    The server cost for a given amount of computational power has gotten vastly cheaper as time passed, too.  Moore's Law applies on the server side, too--and perhaps more so than for client processors, since server workloads are much easier to scale to many processor cores.

    It will depend considerably on how things are implemented, but I'd expect Internet bandwidth to tend to be a much greater "server" cost for online games than processing power, memory, or storage.

    Actually, that is the case in most games. The processors are divided across the entire game world, but instanced onto whichever cores are free. If zone A is empty, its cores run zone B. If A and B are both full, processes get sent off to core C. A "core" can be a single processor or a different machine.

    The several engines that don't do this are new and very costly; they run one set of areas and nothing else. The problem is that building these seamless worlds requires more servers than if the world shard is split into zones.

    For example: if I were to build a 100 km square world with AI and a player CCU of 5,000, it would require between 40 and 100 cores. Each server costs in the neighborhood of $40,000, so maintaining 5,000 CCU would initially cost between $400,000 and $2,000,000. The same game world as a zoned shard sustaining 5,000 CCU would require 13 to 28 cores.

    ArChWind — MMORPG.com Forums

    If you are interested in making a MMO maybe visit my page to get a free open source engine.
  • Quizzical Member Legendary Posts: 25,507
    Originally posted by fenistil

    Now take a seamless MMORPG. Of course the game is handled by multiple servers - each server takes care of a certain part of the game terrain (and the parts don't have to be equal, since cities, for example, can be more populated).

    BUT

    since there is no zone multiplying / instancing, one server has to be ready to handle more than the 300-player limit of the instanced-zone game. So you need a stronger server.

    Of course you can just make the internal 'grid / zone' lines smaller instead, so that each physical server / CPU / core handles a smaller area = fewer players = less strain. But there is a real limit to that, since you also want to avoid too-frequent internal zone / grid borders - because, as we know, that's the tricky part, and the more servers you have to synchronize with each other, the harder it gets and the more potential problems you have.

    Then you also can't really take those physical servers and have them process other things, because you need that processing power available at all times in case there is, for example, a player event (or mass PvP, if you use the seamless world for that) where hundreds of players go to one zone - they could crash the server that way if you don't have enough processing ready.

    That's not a problem of a seamless world.  That's a problem of a stupid way to set up servers.  If you're buying game servers weak enough that you can only host 300 players on a server at a time, and if that's few enough to be a problem for you, then the problem is that whoever is setting up the servers is an idiot.  Make the game heavily zoned with lots of loading screens and you'd still run into all sorts of problems because whoever is setting up the servers is still an idiot.

    Let's ignore login servers, web site hosting, game downloads, and other such things that don't need to be on the servers that host the active game world.  Game servers don't really have to do anything more than they used to.  You check where players are in the game world, and periodically check to ensure that where the client claims to have moved to is possible.  You check to see if players hit mobs, and if mobs hit players.  You decide what mobs will do, and inform players of it.  You inform players of what other players who are nearby are doing.

    And more to the point, it's vastly less than the amount of work that a given client has to do for a game.  Or that a given client had to do for a game 10 years ago, for that matter.  In a given second, a client might have to take a model, compute some matrices relevant to it, and send a request that the video card process and display it hundreds of thousands of times, among other things.  The server that the player is connected to will only have to check on the player's location several times, send updates of what particular objects in the game world are doing a few dozen times (maybe hundreds in a very crowded area), determine whether a couple of attacks hit (whether player attacking mob or vice versa), compute the damage from those hits, and occasionally spawn a new mob or determine what loot a mob dropped.

    That might sound like a lot of things to do, but it's doing them a few times per second, rather than a few hundred thousand times per second.  And that makes a huge difference.  Remember that the client has to recompute a bunch of things every frame.  Servers only have to recompute things when it makes sense to do so.  Servers don't have to keep track of the graphics, but only the underlying physics, which is much simpler.
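
The tick-rate asymmetry above can be sketched as follows (illustrative numbers and hypothetical names, not any real engine's loop; the simulated duration is advanced without sleeping so the sketch is deterministic):

```python
TICK_RATE = 5          # authoritative server updates per second (illustrative)
CLIENT_FPS = 60        # a client redraws far more often than the server ticks

def server_tick(world, dt):
    """One authoritative update: positions, hits, spawns -- no graphics."""
    for player in world["players"]:
        player["x"] += player["vx"] * dt   # advance simple physics
    world["ticks"] += 1

def run_server(world, seconds):
    """Run the fixed-rate loop for a simulated duration."""
    dt = 1.0 / TICK_RATE
    for _ in range(int(seconds * TICK_RATE)):
        server_tick(world, dt)

world = {"players": [{"x": 0.0, "vx": 2.0}], "ticks": 0}
run_server(world, seconds=10)
# After 10 simulated seconds: 50 server ticks versus ~600 client frames,
# yet the player still ends up exactly 2.0 * 10 = 20 units along.
```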

    Take how many players could fit on a single physical server in 2000.  Now multiply that by 50.  That's about how many players could fit on a single physical server today.  (If you want to argue that it's closer to 30 or 100, I'm not going to quibble; my point is that it's vastly larger than 1.)  Processors, memory capacity, memory bandwidth, and storage capacity have all scaled with something akin to Moore's Law.  Today, if you want it, you can easily get a single physical server with 64 physical processor cores, 256 GB of system memory, and several gigabit ethernet ports for well under $10,000.  That may not be the configuration you'd buy for a given game--but that's because it would be overkill.

    And once again, transferring a player from one zone to another isn't hard to do.  Sending data back and forth over a LAN is vastly faster and easier than sending data back and forth over the Internet.

  • Quizzical Member Legendary Posts: 25,507
    Originally posted by ArChWind

    Actually, that is the case in most games. The processors are divided across the entire game world, but instanced onto whichever cores are free. If zone A is empty, its cores run zone B. If A and B are both full, processes get sent off to core C. A "core" can be a single processor or a different machine.

    The several engines that don't do this are new and very costly; they run one set of areas and nothing else. The problem is that building these seamless worlds requires more servers than if the world shard is split into zones.

    For example: if I were to build a 100 km square world with AI and a player CCU of 5,000, it would require between 40 and 100 cores. Each server costs in the neighborhood of $40,000, so maintaining 5,000 CCU would initially cost between $400,000 and $2,000,000. The same game world as a zoned shard sustaining 5,000 CCU would require 13 to 28 cores.

    Even if the game world appears seamless to players, you're still going to have a bunch of zones defined for internal server use (and for internal client use invisible to players, though likely not the same zones).  Most likely, you define the zones to be small enough that the computational needs for a given zone are far less than you'd have for a single processor core.  Then you can have the computations for different zones running in different threads, with far more threads than processor cores.  Threads for adjacent zones would have to communicate a bit for what is going on near the boundary between them, and with your database to find out when players log on or off and what they have.  But otherwise, each thread could be oblivious as to what is going on in most of the game world.

    So long as the total load on the processors, memory, and so forth is far less than what you have physically available, you can let the OS schedule which threads go on which cores and not have to worry about it.  If one thread needs 1/10 of a processor core one day and 1/5 of a core the next, the OS should be able to easily schedule that and have it work fine.  You only run into problems if you have a thread that needs most of a processor core all by itself, or if the total load from all threads isn't that far shy of the total amount of processing power available.
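
A toy illustration of that many-zones-on-few-cores model (hypothetical names, using Python's standard thread pool as a stand-in for the OS scheduler; a real server would use finer-grained scheduling and cross-zone messaging):

```python
from concurrent.futures import ThreadPoolExecutor

# Far more zones than cores: each zone tick is a small unit of work that
# the pool places wherever there is spare capacity.
ZONE_COUNT = 64
WORKER_THREADS = 4   # stand-in for a small number of physical cores

def tick_zone(zone_id, players_in_zone):
    """Simulate one update of a zone: cheap bookkeeping, no graphics.
    Returns a toy 'work result' proportional to the zone's population."""
    return zone_id, players_in_zone * 2

zones = {z: (z % 7) for z in range(ZONE_COUNT)}  # uneven player counts

with ThreadPoolExecutor(max_workers=WORKER_THREADS) as pool:
    results = dict(pool.map(lambda zp: tick_zone(*zp), zones.items()))
# All 64 zones got updated even though only 4 worker threads exist;
# no zone was pinned to any particular core.
```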

  • Quizzical Member Legendary Posts: 25,507

    One fundamental point is this:  a game's graphics and its underlying physics have nothing to do with each other.  They're programmed completely independently.  Of course, a good programmer tries to make sure that the game mechanics roughly match what players intuitively expect from the graphics.  But they have to be programmed separately, as there isn't anything intrinsic to computer programming that automatically does this for you.

    The computations to deal with graphics have gotten vastly more complicated as time passed.  And they're very important client-side for purposes of making a game world seamless.  But the computations for underlying game mechanics aren't any more complicated than they were 10 years ago, and often aren't much more complicated than they were 20 years ago.  And the server doesn't have to touch the graphics computations at all, so for purposes of making the game servers work, the graphical computations do not matter at all, no matter what they are.

    Some people have been trying to argue that if they could make seamless game worlds 10 years ago, they should still be able to do so today.  There are problems with that on the client side, but only on the client side.  Doing the server side stuff to make a seamless game world is vastly easier than it was 10 years ago, precisely because hardware has gotten so much more powerful.

  • ArChWind Member Uncommon Posts: 1,340
    Originally posted by Quizzical
    Originally posted by ArChWind

    Actually, that is the case in most games. The processors are divided across the entire game world, but instanced onto whichever cores are free. If zone A is empty, its cores run zone B. If A and B are both full, processes get sent off to core C. A "core" can be a single processor or a different machine.

    The several engines that don't do this are new and very costly; they run one set of areas and nothing else. The problem is that building these seamless worlds requires more servers than if the world shard is split into zones.

    For example: if I were to build a 100 km square world with AI and a player CCU of 5,000, it would require between 40 and 100 cores. Each server costs in the neighborhood of $40,000, so maintaining 5,000 CCU would initially cost between $400,000 and $2,000,000. The same game world as a zoned shard sustaining 5,000 CCU would require 13 to 28 cores.

    Even if the game world appears seamless to players, you're still going to have a bunch of zones defined for internal server use (and for internal client use invisible to players, though likely not the same zones).  Most likely, you define the zones to be small enough that the computational needs for a given zone are far less than you'd have for a single processor core.  Then you can have the computations for different zones running in different threads, with far more threads than processor cores.  Threads for adjacent zones would have to communicate a bit for what is going on near the boundary between them, and with your database to find out when players log on or off and what they have.  But otherwise, each thread could be oblivious as to what is going on in most of the game world.

    So long as the total load on the processors, memory, and so forth is far less than what you have physically available, you can let the OS schedule which threads go on which cores and not have to worry about it.  If one thread needs 1/10 of a processor core one day and 1/5 of a core the next, the OS should be able to easily schedule that and have it work fine.  You only run into problems if you have a thread that needs most of a processor core all by itself, or if the total load from all threads isn't that far shy of the total amount of processing power available.

    Back in 2000 the server was not doing that much work. Many of the functions of the world ran inside the client (a.k.a. the reason for hacking and cheating). Today's high-end game servers require physics calculations, advanced AI calculations, and more; more functions run on the servers and fewer on the client. The server is required to do 90% of the work, and the clients are little more than dumb terminals looking at what is going on inside the server.

    You should download the Zyzom open source shard and compile it. It is free and it does work; I have tested it. There are 13 servers in the shard. Have 100 of your buddies connect to one server and see what happens (average bandwidth is 12K bytes per second per connection - that is server/client communication, not internal LAN; we will disregard MySQL, client logins, and account functions, which need a lot of processor too). If you're lucky it won't lock up. Take the same system and put it on 7 machines and you will be lucky to run 1,000 CCU. The cap on the 13-server shard (13x4 = 52 cores) is 1,500 to 2,000, and the world is SMALL for a seamless one.
     
     
  • Quizzical Member Legendary Posts: 25,507
    Originally posted by ArChWind
    Back in 2000 the server was not doing that much work. Many of the functions of the world ran inside the client (a.k.a. the reason for hacking and cheating). Today's high-end game servers require physics calculations, advanced AI calculations, and more; more functions run on the servers and fewer on the client. The server is required to do 90% of the work, and the clients are little more than dumb terminals looking at what is going on inside the server.

    You should download the Zyzom open source shard and compile it. It is free and it does work; I have tested it. There are 13 servers in the shard. Have 100 of your buddies connect to one server and see what happens (average bandwidth is 12K bytes per second per connection - that is server/client communication, not internal LAN; we will disregard MySQL, client logins, and account functions, which need a lot of processor too). If you're lucky it won't lock up. Take the same system and put it on 7 machines and you will be lucky to run 1,000 CCU. The cap on the 13-server shard (13x4 = 52 cores) is 1,500 to 2,000, and the world is SMALL for a seamless one.
     
     

    The overwhelming majority of the computational work is graphical in nature, and the server doesn't have to touch that.  If a texture looks wrong on your computer, then that can't be used to cheat, so the server doesn't have to know it.

    The physics and AI calculations usually aren't that complicated.  Games did them server-side a decade ago and it worked.  Well, at least the smart ones did their physics computations server-side; if a game didn't, it was quickly overrun with cheaters.  Some games did suffer that fate, but some notably didn't.  AI computations intrinsically have to be done server-side, or else clients will get out of sync and, for example, not agree on whether a mob is dead.

    You can make a game where the physics and AI calculations are really complicated, and that will add a lot more work to what the server has to do.  Or you can make the server code really inefficient, which will greatly increase the hardware requirements.  But that's a problem of a bad game engine, and will give you the same problems in a heavily zoned game world with lots of loading screens as in a seamless world.  It's not a problem of seamless worlds.

    In order to cover up lag, physics computations have to be done both client- and server-side.  To some degree, the server has to trust that what the client tells it is correct (e.g., "I started moving forward 50 ms ago."), as if you don't actually move until the server finds out about it, game controls will feel really laggy and clumsy.  The server will have to verify that what the client told it happened is possible, but doesn't actually need to replicate all client-side physics computations.  In particular, the client will have to compute many things per-frame, while the server doesn't have to care about the client frame rate and can just compute things as often as makes sense.

    The server doesn't have to verify everything, but only the things that could be used to cheat.  That's how you get rubber-banding, for example.  The game computes on the client that a player moved such and such distance, and tells the server about it.  The server may look at it and say, yeah, that's possible (e.g., if the distance traveled in a given amount of time is no greater than how far the player "should" have been able to run, and there's nothing blocking the way), and accept it.  Or the server may say, no, that's impossible, you're actually back here, and tell the client to update its location accordingly.  But the server doesn't need to do that every frame like the client does; several times per second is enough.
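    A rough sketch of that kind of server-side plausibility check, in Python. All names, the 2D coordinates, and the speed limit are illustrative assumptions, not taken from any actual game engine:

    ```python
    import math

    MAX_SPEED = 7.0  # illustrative maximum run speed, in world units per second

    def validate_move(old_pos, new_pos, dt, max_speed=MAX_SPEED):
        """Server-side check: did the client move farther than physically possible?

        Returns the accepted position: the client's reported position if it is
        plausible, otherwise a clamped position (which, sent back to the client,
        produces the familiar "rubber-banding" snap).
        """
        dx = new_pos[0] - old_pos[0]
        dy = new_pos[1] - old_pos[1]
        dist = math.hypot(dx, dy)
        limit = max_speed * dt  # farthest the player could legally travel in dt
        if dist <= limit:
            return new_pos      # plausible: accept the client's report as-is
        # Implausible: clamp the move to the farthest reachable point along the
        # same direction, and the client gets told to snap back to it.
        scale = limit / dist
        return (old_pos[0] + dx * scale, old_pos[1] + dy * scale)
    ```

    As the post says, the server can run a check like this a few times per second against the client's reported positions; per-frame precision remains the client's job.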

  • ArChWindArChWind Member UncommonPosts: 1,340
    Originally posted by Quizzical
    Originally posted by ArChWind
    Back in 2000 the server was not doing that much work. Many of the functions of the world were inside the client (a.k.a. the reason for hacking and cheating). Today's high end game servers run physics calculations, advanced AI calculations, and more functions on the servers and fewer on the client. The server is required to do 90% of the work and the clients are nothing more than dumb terminals to look at what is going on inside the server.
     
    You should download the Ryzom open source shard and compile it. It is free and it does work. I have tested it. There are 13 servers in the shard. Have 100 of your buddies connect to one server and see what happens (average bandwidth is 12K bytes per second per connection; that is server/client communication, not internal LAN traffic. We will disregard MySQL, client logins, and account functions, which need a lot of processor on their own). If you're lucky it won't lock up. Take the same system, put it on 7 machines, and you will be lucky to run 1000 CCU. The cap on the 13-server shard (13 x 4 = 52 cores) is 1500 to 2000, and the world is SMALL for a seamless world.
     
     

    The overwhelming majority of the computational work is graphical in nature, and the server doesn't have to touch that.  If a texture looks wrong on your computer, then that can't be used to cheat, so the server doesn't have to know it.

    The physics and AI calculations usually aren't that complicated.  Games did them server-side a decade ago and it worked.  Well, at least the smart ones did physics computations server-side; if a game didn't, it would be quickly overrun with cheaters.  Some games did suffer that fate, but some notably didn't.  AI computations intrinsically have to be done server-side, or else clients will get out of sync and, for example, not agree on whether a mob is dead.

    You can make a game where the physics and AI calculations are really complicated, and that will add a lot more work to what the server has to do.  Or you can make the server code really inefficient, which will greatly increase the hardware requirements.  But that's a problem of a bad game engine, and will give you the same problems in a heavily zoned game world with lots of loading screens as in a seamless world.  It's not a problem of seamless worlds.

    In order to cover up lag, physics computations have to be done both client- and server-side.  To some degree, the server has to trust that what the client tells it is correct (e.g., "I started moving forward 50 ms ago."), because if you don't actually move until the server finds out about it, game controls will feel really laggy and clumsy.  The server will have to verify that what the client told it is possible, but doesn't actually need to replicate all client-side physics computations.  In particular, the client will have to compute many things per-frame, while the server doesn't have to care about the client frame rate and can just compute things as often as makes sense.

    The server doesn't have to verify everything, but only the things that could be used to cheat.  That's how you get rubber-banding, for example.  The game computes on the client that a player moved such and such distance, and tells the server about it.  The server may look at it and say, yeah, that's possible (e.g., if the distance traveled in a given amount of time is no greater than how far the player "should" have been able to run, and there's nothing blocking the way), and accept it.  Or the server may say, no, that's impossible, you're actually back here, and tell the client to update its location accordingly.  But the server doesn't need to do that every frame like the client does; several times per second is enough.

    I personally am not against it. In fact, many of my ideas deal with very large open seamless worlds, Quizzical. You need to read my past post history. In fact, I was just in a discussion thread here that got my post deleted, in which one person showed a bit of venom about the term seamless.

    ArChWind — MMORPG.com Forums

    If you are interested in making a MMO maybe visit my page to get a free open source engine.
  • 123443211234123443211234 Member UncommonPosts: 244

    Darkfall released fall 2008

    Darkfall Unholy Wars release Nov. 20 2012

  • DavisFlightDavisFlight Member CommonPosts: 2,556
    Originally posted by Quizzical
    Originally posted by DavisFlight

    Not really. If devs with a 30 man team and a 500k budget could make seamless uninstanced MMOs on dial up in 1999, then devs can do it today. It's just it takes actually good game design, and modern MMO devs just clone WoW instead.

    Yes, they can do it today, the same way they did it in 1999:  restricting the amount of loading to what they required in 1999, so that the graphics look in some (but not all!) ways like they're straight out of 1999.  There are trade-offs.

    I don't think you understand.

    In 1999, the graphics in those seamless games were the bleeding edge high tech. EverQuest was one of the games that made dedicated graphics cards actually a thing. You do not, under ANY circumstances, have to have bad graphics if you want a seamless world.

     

    Do you even read what you type?

  • QuizzicalQuizzical Member LegendaryPosts: 25,507
    Originally posted by DavisFlight

    I don't think you understand.

    In 1999, the graphics in those seamless games were the bleeding edge high tech. EverQuest was one of the games that made dedicated graphics cards actually a thing. You do not, under ANY circumstances, have to have bad graphics if you want a seamless world.

    Have you not noticed that today's bleeding edge graphics are rather more demanding on hardware than 1999's bleeding edge graphics?  You can easily make a seamless world today with graphics that would have been bleeding edge in 1999.  But it doesn't follow that it's easy to make a seamless world with graphics that are bleeding edge today.

    Processor speed has increased greatly since 1999.  Video card performance has increased greatly since 1999.  So has hard drive capacity, system memory capacity, system memory bandwidth, video memory capacity, video memory bandwidth, AGP/PCI Express bandwidth, hard drive sequential read/write speeds, and probably some other things that don't come to mind off hand.

    But you know what has barely budged since 1999?  Hard drive IOPS.  So if you make a game that, as compared to a bleeding edge game from 1999, demands 10 times the performance in every component, you've got performance to spare in most places.  But you'll completely choke on hard drive IOPS, and things won't load fast enough.

    Imagine if everything had increased greatly, but people were still only using 64 MB of system memory, like in 1999.  Think that greatly increasing performance on everything else would be enough to give today's bleeding edge graphics?  Or do you think that the lack of system memory would be something of a problem and hold a lot of things back?  Or imagine an otherwise really nice computer, except that it's running a 533 MHz Pentium III.  Think you'd be able to get modern bleeding edge graphics without requiring more processor performance?  Poor performance in one place holds the whole system back.

    SSDs bring the IOPS performance increases that you'd hope to see over more than a decade.  If you assume that everyone has an SSD, then making a seamless world with modern, bleeding edge graphics becomes easy again.  The problem is that doing so will mean making a game that will run very poorly on the computers that most gamers have.  That's not a viable way to make money--at least not until basically everyone does have an SSD, at which point I'd expect to see a resurgence in seamless worlds.

  • DavisFlightDavisFlight Member CommonPosts: 2,556
    Originally posted by Quizzical
    Originally posted by DavisFlight

    I don't think you understand.

    In 1999, the graphics in those seamless games were the bleeding edge high tech. EverQuest was one of the games that made dedicated graphics cards actually a thing. You do not, under ANY circumstances, have to have bad graphics if you want a seamless world.

    Have you not noticed that today's bleeding edge graphics are rather more demanding on hardware than 1999's bleeding edge graphics?

    Considering EQ alone was responsible for the shift towards dedicated graphics cards, no. Today's bleeding edge graphics are just as demanding as in the 90s, because it all scales up.

    MMO companies have done seamless worlds then, and they're doing seamless worlds now. It is a design choice.

    Or are you telling me that Arenanet doesn't have programmers as good as Darkfall's 30 developers in Greece?

  • BadSpockBadSpock Member UncommonPosts: 7,979
    Originally posted by DavisFlight
    Originally posted by Quizzical
    Originally posted by DavisFlight

    I don't think you understand.

    In 1999, the graphics in those seamless games were the bleeding edge high tech. EverQuest was one of the games that made dedicated graphics cards actually a thing. You do not, under ANY circumstances, have to have bad graphics if you want a seamless world.

    Have you not noticed that today's bleeding edge graphics are rather more demanding on hardware than 1999's bleeding edge graphics?

    Considering EQ alone was responsible for the shift towards dedicated graphics cards, no. Today's bleeding edge graphics are just as demanding as in the 90s, because it all scales up.

    MMO companies have done seamless worlds then, and they're doing seamless worlds now. It is a design choice.

    Or are you telling me that Arenanet doesn't have programmers as good as Darkfall's 30 developers in Greece?

    Actually the shift to dedicated graphics cards was due to OpenGL for the most part - and games like Quake II and Unreal were massively more popular than EQ.

    And you can't argue with Quiz. You just can't. He wins, every time. He is that good. Just listen and show respect and learn from him.

    And Darkfall graphics are an abomination of horrible - as well as their network code and physics. I couldn't play that game more than an hour before uninstalling, and it had NOTHING to do with the PvP or skill system - it was just that piss poor of quality. And this was only a couple of months ago.

  • itgrowlsitgrowls Member Posts: 2,951

    I think that the combination of two things killed it. 

    Economic instability

    Lack of competition

    Let's face it: first, the devs of any game are going to try to make it easier for players to play their games, and shards do this. It's easier to code teleports between multiple shards than to put in one massive world and deal with the crazy upkeep of one massive server chain's problems. They want to spend on a budget to get the most out of a game, so they usually make it for older machines.

    Second, there really isn't much in the way of MMO and tech competition. The two major CPU makers everyone buys from are AMD and Intel, and the two major GPU makers are ATI and Nvidia. And the two major game dev graphics APIs are DirectX and OpenGL, and I really wouldn't consider it a competition between those two, because we all know that DirectX (Microsoft) has held a monopoly over this part of the industry for a very long time, somehow without being noticed by anyone, despite monopolies being illegal last I checked.

    So really it's a lack of competition. 

    Look at the internet situation in the US. The reason there is so much trouble with internet connection services in the US is a lack of competition, which breeds lower standards of support, lower quality tech, and stagnant advancement.

  • DavisFlightDavisFlight Member CommonPosts: 2,556
    Originally posted by BadSpock
    Originally posted by DavisFlight
    Originally posted by Quizzical
    Originally posted by DavisFlight

    I don't think you understand.

    In 1999, the graphics in those seamless games were the bleeding edge high tech. EverQuest was one of the games that made dedicated graphics cards actually a thing. You do not, under ANY circumstances, have to have bad graphics if you want a seamless world.

    Have you not noticed that today's bleeding edge graphics are rather more demanding on hardware than 1999's bleeding edge graphics?

    Considering EQ alone was responsible for the shift towards dedicated graphics cards, no. Today's bleeding edge graphics are just as demanding as in the 90s, because it all scales up.

    MMO companies have done seamless worlds then, and they're doing seamless worlds now. It is a design choice.

    Or are you telling me that Arenanet doesn't have programmers as good as Darkfall's 30 developers in Greece?

    Actually the shift to dedicated graphics cards was due to open GL for the most part - and games like Quake II and Unreal were massively more popular than EQ.

    And you can't argue with Quiz. You just can't. He wins, every time. He is that good. Just listen and show respect and learn from him.

    And Darkfall graphics are an abomination of horrible - as well as their network code and physics.

    But he's wrong, he's outright wrong. Many devs in the past and present have done seamless worlds, both online and offline. If he's arguing that the limitation is server speed, that's one thing, but he's arguing hard drive speed. How does he explain games like Skyrim?

    As for Darkfall's net code, it is probably the best in the MMO genre. Do you see any other games that have full FPS real time combat on such a massive scale? No. You don't.

    As for the graphics, they're MUCH better than WoW, and are about to get even better.

     

  • ciomi76ciomi76 Member UncommonPosts: 6

    sure

     

  • QuizzicalQuizzical Member LegendaryPosts: 25,507
    Originally posted by DavisFlight
    Originally posted by Quizzical
    Originally posted by DavisFlight

    I don't think you understand.

    In 1999, the graphics in those seamless games were the bleeding edge high tech. EverQuest was one of the games that made dedicated graphics cards actually a thing. You do not, under ANY circumstances, have to have bad graphics if you want a seamless world.

    Have you not noticed that today's bleeding edge graphics are rather more demanding on hardware than 1999's bleeding edge graphics?

    Considering EQ alone was responsible for the shift towards dedicated graphics cards, no. Today's bleeding edge graphics are just as demanding as in the 90s, because it all scales up.

    MMO companies have done seamless worlds then, and they're doing seamless worlds now. It is a design choice.

    Or are you telling me that Arenanet doesn't have programmers as good as Darkfall's 30 developers in Greece?

    Except that it doesn't all scale up.  Hard drive IOPS hasn't increased much.  A 3.5" hard drive platter spinning at 7200 RPM spins at exactly the same speed as a 3.5" hard drive platter spinning at 7200 RPM that was produced way back in 1999.  There have been some slight improvements in IOPS due to a variety of tweaks that make it more likely that you can move a hard drive head to the right radial position to grab data on a given rotation rather than waiting for the next one.  But the expected rotational latency of 4.17 ms is an intrinsic property of 7200 RPM hard drives, and even being able to magically teleport a hard drive head to the right radius instantly would not reduce average read latency below that.

    It's simple physics, really.  A hard drive rotating at 7200 RPM rotates at 120 revolutions per second.  Each full revolution takes about 8.33 ms.  If you discover that you want to read a particular cluster and know where the cluster is, you have to wait for the platter to spin it around to where your drive head is.  The amount of time this takes is uniformly distributed between 0 ms and 8.33 ms, for an average value of about 4.17 ms.  And that's assuming that you can move the drive head to the right radius by the time the platter moves to the right spot the first time.  If not, then add another 8.33 ms per extra revolution it takes.  Various optimizations can and have tended to decrease the number of extra revolutions, but that 4.17 ms figure is simple physics and cannot ever be beaten with 7200 RPM hard drives.
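    The arithmetic in that paragraph is easy to check (illustrative Python that just restates the numbers above):

    ```python
    # Rotational-latency arithmetic for a 7200 RPM hard drive.
    rpm = 7200
    revs_per_second = rpm / 60.0                      # 120 revolutions per second
    ms_per_revolution = 1000.0 / revs_per_second      # ~8.33 ms per full revolution
    avg_rotational_latency_ms = ms_per_revolution / 2.0  # ~4.17 ms average wait

    # Even ignoring seek time entirely, that average latency caps a 7200 RPM
    # drive's random reads at roughly:
    max_random_iops = 1000.0 / avg_rotational_latency_ms  # ~240 IOPS, best case
    ```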

    Incidentally, the other thing that hasn't really scaled is memory latency, whether for system memory or video memory.  There, you're limited by a variety of factors, one of which is the speed at which an electrical current travels through a copper wire.  That, in turn, is limited by the speed of light in a vacuum, which is a meaningful restriction and has been for a long time.  The speed of light in a vacuum has long been believed to be a universal speed limit that you can't exceed by any means, ever.
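    To put rough numbers on the speed-of-light point (the 3 GHz clock here is an illustrative assumption, not a figure from the post):

    ```python
    # How far can any signal possibly travel in one processor clock cycle?
    c = 299_792_458.0            # speed of light in a vacuum, m/s
    clock_hz = 3.0e9             # an illustrative 3 GHz processor clock
    cycle_s = 1.0 / clock_hz     # one cycle lasts ~0.33 ns
    max_distance_cm = c * cycle_s * 100.0  # ~10 cm per cycle, absolute best case

    # Real electrical signals in copper travel at a fraction of c, so a round
    # trip to a memory module tens of centimeters away necessarily costs
    # multiple cycles before the memory chip itself even starts working.
    ```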

    There are still ways to make a seamless world, but it requires major sacrifices that didn't use to be relevant.  Since you cite Darkfall, I'd like to point out that Darkfall's graphics are far inferior to Guild Wars 2's graphics.  That's practically a poster child for "we made some major sacrifices in the game" to make the world seamless, among other things.

  • QuizzicalQuizzical Member LegendaryPosts: 25,507

    Look.  I'm not against seamless worlds here.  I've been occasionally linking screenshots in this very thread.  And they're taken from a (surprise!) seamless world.  And more seamless than Darkfall, WoW, Skyrim, or anything else you might want to cite as seamless, for that matter.  Not only are there no loading screens between zones, but the world is round, so there aren't even any "you have reached the end of the world.  Please stop and turn around" seams.  That's a perfectly real type of seam, and it can cover up an awful lot.

    But there are sacrifices to be made.  In my case, I do it by cutting out the hard drive entirely other than loading the game initially.  Extremely heavy use of tessellation in creative ways means that there is so little vertex data that it can all be loaded into video memory when the game is launched and left there until the game closes.  Textures are all randomly generated by the processor rather than loaded from storage.  It's seamless and it works--but it's very obviously not AAA quality graphics.
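    A toy illustration of the "textures randomly generated by the processor rather than loaded from storage" idea: seeded value noise, so every client regenerates the same texture with no disk reads. This sketch is purely illustrative and has nothing to do with the poster's actual engine:

    ```python
    import random

    def make_texture(size, seed=0):
        """Generate a square grayscale texture procedurally -- no disk I/O.

        A toy example of CPU-generated textures: smooth value noise built by
        bilinearly blending a seeded random lattice. Same seed, same texture,
        on every machine.
        """
        rng = random.Random(seed)
        lattice = [[rng.random() for _ in range(size + 1)] for _ in range(size + 1)]

        def smooth(t):  # smoothstep easing between lattice points
            return t * t * (3 - 2 * t)

        cells = 8  # number of noise cells across the texture
        texture = []
        for y in range(size):
            row = []
            for x in range(size):
                fx, fy = x * cells / size, y * cells / size
                x0, y0 = int(fx), int(fy)
                tx, ty = smooth(fx - x0), smooth(fy - y0)
                # bilinear blend of the four surrounding lattice values
                top = lattice[y0][x0] * (1 - tx) + lattice[y0][x0 + 1] * tx
                bot = lattice[y0 + 1][x0] * (1 - tx) + lattice[y0 + 1][x0 + 1] * tx
                row.append(top * (1 - ty) + bot * ty)
            texture.append(row)
        return texture
    ```

    The point of the design is exactly what the post describes: the only thing the hard drive ever does is load the executable, so hard drive IOPS stops being the bottleneck.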
