Does anyone have any knowledge or idea of why GW2 uses DirectX 9 instead of 11?
Comments
Because DirectX 11 is so full of bugs? Worst DirectX eva... f-o-r-e-v-a... my Sandlot impression.
Because the engine they are using is the same engine GW1 was built on, and that engine used DX9. We don't know if GW2 will support DX10 or 11.
What about Windows 8, will there be a DirectX 12? Plus, I read an article somewhere saying they want older rigs to be able to play GW2, and that some older rigs won't support DX11.
To keep down minimum specs, I would think. Not everyone has new graphics cards, and so far we haven't seen the game fully optimized. I will worry when the last beta goes by and we still haven't seen DX11, but till then I wouldn't worry.
You will not see DX 12 anytime in the near future.
Thank god... I still get a bunch of DX11 errors and crap on my new rig. Why can't Windows ever do anything right? Their SP1 pack for Win 7 has mad download problems... I mean, when will a computer software company ever just be cool and not put bugs in our software?
Something I posted before about this:
They started work on this game before there was a DX11, right around the time DX10 came out, so they used DX9; it had been out longer, and I bet a lot of the tools they use to make the game use DX9 as well.
It is not that easy to switch from one version of DX to the next. I have code for a game done in DX9; I told the compiler to use the DX10 SDK and got over 200,000 compile errors, and I'm not sure, but I think there are over 1,000,000 lines of code. It would take a lot of time to fix all the errors.
I am not sure how many lines of code GW2 has, but I read a post some time back saying WoW has over 5,000,000 lines of code, so think how long it would take to find and fix 200,000 errors in over 5,000,000 lines of code.
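To make that concrete, here is a minimal, hypothetical side-by-side of the very first thing any renderer does, creating a device (`hwnd` is assumed to be an existing window handle). None of the DX9 types or entry points exist in the DX10 headers, which is why a straight recompile fails from the first call onward:

```cpp
#include <d3d9.h>
#include <d3d10.h>

// DX9 path: everything flows through IDirect3D9 / IDirect3DDevice9.
IDirect3DDevice9* CreateDeviceDx9(HWND hwnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DPRESENT_PARAMETERS pp = {};      // this struct exists only in d3d9.h
    pp.Windowed   = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    IDirect3DDevice9* dev = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);
    return dev;
}

// DX10 path: different interfaces, and the swap chain moved out into DXGI.
ID3D10Device* CreateDeviceDx10(HWND hwnd)
{
    DXGI_SWAP_CHAIN_DESC sd = {};
    sd.BufferCount       = 1;
    sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    sd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    sd.OutputWindow      = hwnd;
    sd.SampleDesc.Count  = 1;
    sd.Windowed          = TRUE;
    ID3D10Device*   dev  = nullptr;
    IDXGISwapChain* swap = nullptr;
    D3D10CreateDeviceAndSwapChain(nullptr, D3D10_DRIVER_TYPE_HARDWARE, nullptr,
                                  0, D3D10_SDK_VERSION, &sd, &swap, &dev);
    return dev;
}
```

Multiply that kind of mismatch across every draw call, state change, and shader in a million-line codebase and the 200,000-error figure stops sounding surprising.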
I remember them confirming that the game will use DirectX 9 and 10. Basically, if your toaster can't run DirectX 10, it will be able to just play the game on 9, like we were able to do in the beta build.
But in general the game is meant to run on DirectX 10. I think someone asked in an interview about 11, but they didn't confirm anything on that. So it's probably a long way from that.
It was five years ago when they "confirmed" that the game would support DirectX 10. This was before DX11 was ever in beta.
Never heard of Linux? Yes, it's not suited for gaming, but still, you won't get bluescreens just because you have an old driver. Plus, someone even played GW2 on Linux: http://guildwars.incgamers.com/blog/comments/guild-wars-2-on-linux-it-works . I'm using Windows just for games atm; I hope I'll be able to play on Linux sometime in the future, so I can leave this crappy OS.
But it's a heavily modified version of said engine, so it shouldn't have too much to do with that; my guess is that it's more due to the optimization still ongoing.
Also, the fact that there's an indicator in the client options showing "Renderer: DirectX 9" is an indication that DX10/11 will be supported in the future. If only one of them were supported, it wouldn't make sense to put it there.
I think it's supposed to use DirectX 11 once the game goes live, methinks. Read it somewhere on their official beta forums, methinks.
I am glad DX10 will be used... I am no computer genius by any means. Can I replace my graphics card driver? Yes. Turn on my oven and cook a pizza? Sure thing. But it seems to me DX11 causes a lot of problems that have yet to be resolved...
Who cares? The game looks good, that's what matters.
I'm developing with DX11 regularly, and it's no more bugged than DX9 or DX10.
Well, it's written now under "Frequently Asked Questions" in the official wiki. And sure, a wiki is a wiki, but it hasn't been changed or denied, so I think they are sticking to their plans of having DX10.
Yep, I liked the Linux Red Hat system, but it couldn't run anything for games at that time. My friends who used it in business loved it, though... maybe someday we'll have software that doesn't crash your computer or freeze right as you're about to complete the level, lol.
DX10, but beta weekends were capped at DX9.
I really hope they can manage DX11 in the future somehow; my PC can handle it with ease, as it's a high-end Ivy Bridge build.
CPU: Intel Core i7-3770K 4GHz
GPU: ASUS HD 7970 DirectCU II TOP
MB: ASUS P8Z77-V DELUXE
Case: Cooler Master HAF X
RAM: Corsair 16GB 1600
PSU: Corsair Gold 850
HD: OCZ Vertex 4 256GB SSD
Most of the graphics cards on the market are not DirectX 11 compatible, so I doubt they will waste their time doing it ATM. I foresee them doing it down the line; actually, I hope they do what NCsoft Korea did with Aion and release a high-quality client for Guild Wars 2 at some point.
Windows 8 will bring DirectX 11.1, but not DirectX 12. Radeon HD 7000 series cards (7700 or higher, so not rebrands) support DirectX 11.1. GeForce 600 series cards do not, so Nvidia likely won't bring DirectX 11.1 support until Maxwell in 2014. It's unclear whether DirectX 11.1 will matter; there were only two games ever made that were built for DirectX 10.1.
-----
DirectX 11 is backward compatible to at least DirectX 9.0c, and possibly further. DirectX 10 is not backward compatible at all. If a game is coded in DirectX 10, it flatly will not run on cards that do not support DirectX 10 (GeForce 8000 series or later, Radeon HD 2000 series or later). Rather, what games did was to have both a DirectX 10 version and also a DirectX 9.0c version. Two separate code paths to maintain is cost++;, which is one reason why DirectX 10 never caught on.
If a game is coded in DirectX 11 and a video card only supports, say, DirectX 9.0c, then the game will run just fine, but disable any graphical features that the game uses but DirectX 9.0c does not support. The same is true for systems with a more recent video card running Windows XP, which doesn't support anything past DirectX 9.0c. This means that a game only has to code things once, so even if a game intends to use some features introduced in DirectX 10 but not 11, today it would be coded as a DirectX 11 game.
As for whether the newer DirectX versions matter, DirectX 10 didn't introduce anything interesting, and the only important graphical feature that DirectX 11 brought was tessellation. If Guild Wars 2 doesn't have tessellation, then it might as well be DirectX 9.0c, regardless of what code path the game uses internally.
Actually, as far as I am aware, DX10 or DX11 was seen at some of the conventions. There used to be an option to switch between the two. We also did get confirmation (albeit years ago) that they would support higher than DX9.
With all that, I do believe they have implemented something higher than DX9.
Originally posted by Quizzical: "DirectX 11 is backward compatible to at least DirectX 9.0c, and possibly further."
Mmhh, no, sorry. If a piece of software is coded to exclusively use DX11, it will not run on DX9 hardware; DX11 is no different from DX10 in that respect. It's definitely not automatic, as you claim in the rest of your post. If you want your engine/game to support DX11, but also DX10 and DX9, YOU (the developer) have to program different rendering paths.
Originally posted by Quizzical: "As for whether the newer DirectX versions matter, DirectX 10 didn't introduce anything interesting, and the only important graphical feature that DirectX 11 brought was tessellation."
Sorry, but... what a bunch of nonsense. You obviously have no clue about what you are trying to talk about. Actually, you could have done better by just reading Wikipedia... where did you get that "info", some misinformed gaming site or forum?
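For what it's worth, Direct3D 11 did add a middle ground between these two positions: "feature levels" let a single D3D11 code path target DX10- and even DX9-class GPUs (on Vista and later only; XP is still excluded), though the application must restrict itself to whatever level it actually gets, so the low-end paths still need real work. A minimal sketch of the standard fallback during device creation:

```cpp
#include <d3d11.h>

// Ask the D3D11 runtime for the best feature level the hardware supports,
// falling back as far as DX9-class hardware (level 9_1). The API stays
// D3D11 throughout; only the available capability set shrinks.
ID3D11Device* CreateBestDevice(D3D_FEATURE_LEVEL* outLevel,
                               ID3D11DeviceContext** outContext)
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,  D3D_FEATURE_LEVEL_9_2, D3D_FEATURE_LEVEL_9_1,
    };
    ID3D11Device* dev = nullptr;
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                                   0, wanted, ARRAYSIZE(wanted),
                                   D3D11_SDK_VERSION, &dev, outLevel,
                                   outContext);
    return SUCCEEDED(hr) ? dev : nullptr;
}
```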
There is a "Renderer: DirectX 9" on the options page. If the game was built only for directx 9 that wouldn't make any sense. I think there will be at least dx10 support, don't know about dx11 otherwise why have an option to show which renderer is used by the game?
There are internal code changes, such as DirectX 11 trying to make it easier for games to scale to more processor cores. That's good, but those aren't new graphical features visible to players. It's also just one of many factors determining how well games scale to many cores: there were games that could use several cores before DirectX 11, and there are DX11 games that don't scale well.
There are also some fancy lighting features introduced in DirectX 10 and 11, but even if games implement them, it's just more stupid things for players to disable and a waste of coding effort. AMD had their "ladybug" demo to show off one such feature, and it basically consisted of saying, "Look how we can use a new DirectX 11 feature to make a video look worse than before!" I have no idea why anyone would be excited about that. DirectX 11 also brought order-independent transparency in particular, but has any game ever implemented that in an interesting way? Or even an uninteresting way? The only commercial program that I'm aware of that implemented OIT isn't even a game.
What else is there? Compute shaders? Nice for GPGPU, sure. For games? It took nearly two years after DirectX 11 launched for Nvidia to come up with FXAA, but the only real advantage there is a lesser performance hit than MSAA. AMD's MLAA will tend to make games look worse if they contain text or a HUD.
I'm not saying that DirectX 10 and 11 didn't bring any new graphical features at all. But if you want to restrict to interesting features, tessellation is it.
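For anyone curious what that multi-core plumbing actually looks like, the mechanism in question is DX11's deferred contexts: worker threads record command lists that the immediate context later replays. A minimal hypothetical sketch (`device` and `immediate` are assumed to already exist):

```cpp
#include <d3d11.h>

// Record rendering work on a worker thread, replay it on the main thread.
void RecordAndReplay(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    // Worker-thread side: a deferred context only records; it never touches
    // the GPU directly, so each worker thread can own one.
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);
    // ... issue state changes and draw calls on `deferred` here ...
    ID3D11CommandList* commands = nullptr;
    deferred->FinishCommandList(FALSE, &commands);

    // Main-thread side: submit everything the worker recorded in one call.
    immediate->ExecuteCommandList(commands, FALSE);
    commands->Release();
    deferred->Release();
}
```

Whether a given engine gets real scaling out of this is, as the post says, up to the engine, not the API.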
It's more than just internal changes. Just the changes to the shader model between the different versions are very important.
And you can't dismiss features just because YOU think (in your opinion) they are "stupid things players disable". Just the fact that DX10 got rid of the fixed pipeline in favor of fully programmable ones is a major step ahead. And I won't even mention the vast improvements to occlusion culling in DX10. All this is "visible" to the player not only through better rendering with more effects, but also through improved performance, allowing more complex scenes to be rendered with far less GPU and CPU usage.
Sorry, but anyone who has used the three versions professionally for years can only be amused here. You can do impressive things with DX9, because the platform is not the end-all of it and the talent of the developers and artists plays a major role, but in DX10 and DX11 you can do things better, or faster, or both, not to mention things that are completely impossible to do in DX9.
To make an analogy... I know the words "atom", "neutron", "electron", "proton", etc., but that doesn't make me a nuclear scientist, and I definitely don't pretend to be one.
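One concrete, hypothetical illustration of "completely impossible to do in DX9": Shader Model 4.0 (DX10+) shaders can synthesize geometry from system values such as SV_VertexID, so a fullscreen pass needs no vertex buffer at all:

```cpp
// A fullscreen-triangle vertex shader driven purely by SV_VertexID, a
// SM4.0 system value with no DX9/SM3 equivalent. Drawn with Draw(3, 0),
// no vertex buffer and no input layout bound.
const char* kFullscreenTriangleVS = R"hlsl(
float4 main(uint id : SV_VertexID) : SV_Position
{
    // id 0,1,2 map to (-1,1), (3,1), (-1,-3): one triangle covering NDC.
    float2 uv = float2((id << 1) & 2, id & 2);
    return float4(uv.x * 2 - 1, 1 - uv.y * 2, 0, 1);
}
)hlsl";
// After compiling with D3DCompile and binding the shader:
//   context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
//   context->Draw(3, 0);
```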