Pointless thread, anyone who has a dual or quad-core has built-in physics. Especially with Intel and AMD offering 6, 8 & 12 core CPUs by the end of the year, there is no need to have the video card handle something the CPU, with its idle cores, can handle.
Not only that... but only 50% of people are using Nvidia, so programming for PhysX (which is a brand name) is pointless. Whereas almost everyone in the future will have multi-core computers... so programmers are more likely to program using Havok. Nvidia is using you guys with their marketing BS.
Using us...? lol
You wear a tin hat too....
The post was to state which games use PhysX, not to start a battle.
And from what I know, GPU PhysX helps offload work from the CPU, because a GPU is better at computing graphics..
It's the other way around..... GPUs CAN and will take over from the CPU (a PC made for gaming graphics... if MS will support it with an OS) and push out INTEL...... that is, if INTEL doesn't have something up their sleeve.
See, I too can make far-fetched statements with nothing to back them.
Ok first, senteces start with a capital letter...
Yes, I know what your Opening Post was all about. But since you do not understand the whole picture and only choose to concern yourself with your love for Nvidia, you posts are indeed laughable and fanboi'ish.
Calculating physical object within a game is nothing new, it's just that we havn't had dedicated hardware to do it on a large scale, or have a big enough player base with systems to have that capability. So, over the years some developers have dabbled in using several physics engines such as PhysX or HavoK.
PhysX is a BRAND NAME used by Nvidia as a marketing tool, to try and get people to buy their products, because some games have struck a deal with Nvidia to use CUDA/PhysX partially n their up and comming games. The problem is that even in Mirror's Edge, it's use was superficial and really doesn't do anything real... unlike the up and comming Ghost Busters, etc.
Where-as.. physics can also be computated on CPU aswell and with prices dropping so rapidly and (again) 6, 8 & 12 core computers comming out later this year... there is no need to worry about physics on a VIDEO card, because the CPU can and will handle it easily. Thus making nvidia's option obsolete. Because no programer is going to code specifically just for Nvidia, when they can just code for all.
You're just an Nvidia fanboy spewing marketing. It's a war and Nvidia is loosing, while CPU physics is going to be the clear winner. OpenCL ftw.
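To put the multi-core claim above in concrete terms: here is a minimal sketch, assuming an invented Particle struct and step() function (none of this is any engine's real code), of how a physics tick could be spread across whatever spare cores the CPU reports, using plain C++ threads.

```cpp
// Minimal sketch: integrating particle motion on spare CPU cores.
// Everything here (Particle, step) is illustrative, not engine code.
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Particle {
    float x, y, z;    // position
    float vx, vy, vz; // velocity
};

// Advance one slice of the particle array by dt seconds.
void step(std::vector<Particle>& p, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        p[i].vy -= 9.81f * dt; // gravity
        p[i].x += p[i].vx * dt;
        p[i].y += p[i].vy * dt;
        p[i].z += p[i].vz * dt;
    }
}

int main() {
    std::vector<Particle> particles(100000, Particle{0, 100, 0, 1, 0, 0});
    const float dt = 1.0f / 60.0f;
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 2; // the query may report 0 on some systems

    // Split the work evenly across however many cores the CPU reports.
    std::vector<std::thread> workers;
    std::size_t chunk = particles.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end = (c == cores - 1) ? particles.size() : begin + chunk;
        workers.emplace_back(step, std::ref(particles), begin, end, dt);
    }
    for (auto& w : workers) w.join();
}
```

A real engine would use a persistent job system rather than spawning threads every frame; whether those cores are actually idle during a frame is the part the argument glosses over.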
"W"ell here is a grammer Snob... (read my Quote).....".Lets not talk about my lack of Grammer,if you dont want to talk about your lack of Sex life..Nerd."
and here you go.....
Ok first, senteces(Miss spelled) start with a capital letter...
Yes, I know what your Opening Post was all about. But since you do not understand the whole picture and only choose to concern yourself with your love for Nvidia, you(Miss Spelled) posts are indeed laughable and fanboi'ish.
Calculating physical object within a game is nothing new, it's just that we havn't(Miss Spelled) had dedicated hardware to do it on a large scale, or have a big enough player base with systems to have that capability. So, over the years some developers have dabbled in using several physics engines such as PhysX or HavoK.
PhysX is a BRAND NAME used by Nvidia as a marketing tool, to try and get people to buy their products, because some games have struck a deal with Nvidia to use CUDA/PhysX partially n their up and comming(Miss Spelled) games. The problem is that even in Mirror's Edge, it's use was superficial and really doesn't do anything real... unlike the up and comming(Miss Spelled) Ghost Busters, etc.
Where-as.. physics can also be computated (Miss Spelled)on CPU aswell(Miss Spelled) and with prices dropping so rapidly and (again) 6, 8 & 12 core computers comming(Miss Spelled) out later this year... there is no need to worry about physics on a VIDEO card, because the CPU can and will handle it easily. Thus making nvidia's(Capital letter) option obsolete. Because no programer is going to code specifically just for Nvidia, when they can just code for all.
"You're just an Nvidia fanboy spewing marketing. It's a war and Nvidia is loosing, while CPU physics is going to be the clear winner. OpenCL ftw."
See other people can use spell check too...telling someone they have bad grammer, using bad grammer?...."indeed laughable."
"you posts are indeed laughable and fanboi'ish?"
you mean like:
"6, 8 & 12 core computers coming out later this year." AMD fan maybe?
And i love this
"PhysX is a BRAND NAME used by Nvidia as a marketing tool, to try and get people to buy their products, because some games have struck a deal with Nvidia to use CUDA/PhysX partially n their up and comming games..........."
"Thus making nvidia's option obsolete. Because no programer is going to code specifically just for Nvidia, when they can just code for all."
Well which is it, Nivida is Forceing the market or not....
And where are you getting all this wonderfull statements....Google...? Or are you realy a wolf in sheeps clothing?
AMD/ATI FAN
PS what hardwear you running?
"Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..
Originally posted by Sir_Drip: Viewperf is a benchmark that measures rendering capability in the viewports of 3D programs. The viewports are often based on game engine designs and DirectX. This is what makes them a close comparison to how a game could perform if it supported the ATI architecture. Also, like Nvidia, ATI updates its driver every 2 months.
Well, I just showed you that the 10% performance gain over the 8800GT went right out the window with a simple driver update.... Also, if you look at the release notes on those drivers, you will see performance gain values (%) per game. Do you see that with ATI drivers? Again, look at the dates on the drivers as well. You will see that they put out more drivers every 2 months than ATI. So why would any game maker (in their right mind) want to deal with ATI and their broken ways? Maybe in a few years ATI will get their shit straight and produce solid drivers that increase performance output with the games that are made today.
ATI updates their drivers once a month.
Why are NVIDIA releasing performance gains? Those are likely optimal numbers based on their setup, which one has to assume is as optimal as it can be. Is it realistic to expect to reach their "up to" performance gain? For whom?
And I don't go looking at a chart that says 20fps vs 24fps and then go, oh, a 20% performance increase. I'd rather go, oh, only a 4fps difference. 60fps vs 72fps is of less importance.
Why say that ATI doesn't increase performance with their drivers?
It's great that there is something like PhysX. If it is as godlike as it is described to be, no one wins from it being supported by only one manufacturer.
lol, well, that last post got modded.... I guess I will have to keep it down a little... don't want to get snitched on again.
"Beauty is only skin deep..." said the AMD/ATI fan. "Blah.. That's just what ugly people say..." said the Intel/Nvidia fan. You want price/performance, use the dollar menu..
Hey... Theory (the name fits you):
"PhysX is a BRAND NAME used by Nvidia as a marketing tool"
When did BRAND NAMES and MARKETING TOOLS start using their own drivers? Where did you read this?
"games have struck a deal with Nvidia to use CUDA/PhysX partially n their up and comming games." ... "Because no programer is going to code specifically just for Nvidia, when they can just code for all."
Now you are more confusing than trying to eat meatloaf with no mouth!
The 8 and 12 core CPUs are meant to be server chips. So what OS supports 6 cores? The 6-core CPUs (not Intel's) are most likely to be 3 cores x 2 under one chip. We all know about AMD's face-planting 3-core chips. The 6-core should be twice as bad!
You will be lucky to see PhysX on the CPU (CPU only) within 5 years.. So all you ATI fans, and the hardware you are using today, are out of luck.
PhysX was a decent idea that was overtaken by technology. There is no need to find more things for a GPU to do and no need for a separate PhysX processor. Doing it with a software API on the CPU is much more efficient. PhysX is rapidly becoming a thing of the past.
You got it all wrong!!! PhysX is here now and will be around for a long time, however they do it! That's why the games are looking better and better. So get used to it!
Colors...? That's all you got? Trying to sidestep all the other statements that you fumbled by lol'ing at colors? You burned me!!
You wanted to be the teacher.. so I busted out the red marker to grade your paper.
Back to the point:
WHO GIVES A CRAP WHAT THE FUTURE HOLDS.... we will all find out when the time comes. That's not going to sway my purchase of tech that I use to play the games of the present. I will buy new hardware when I need to.
WHO KEEPS A PC FOR MORE THAN 3 YEARS WITHOUT AN UPGRADE (OF ANY KIND)?
We all build PCs to play the games we want to play. We all upgrade from time to time, to keep up with games.
So what if there will be a 6-core or a 12-core that can do physics better 2 years from now...
Are you going to not upgrade your PC for 2 years so your rig will finally play a game with physics?
No... neither will any serious gamer. Gamers buy tech to play the games at hand... when the tech and games change, so do we.
By the time any OS supports 100% of the dual/quad cores we have now, we will have already upgraded to whatever is out when that day comes. LOOK HOW LONG IT HAS TAKEN TO EVEN SEE A FEW "TITLES" THAT ARE MULTITHREADED NOW, and you think 6, 8, and 12-core support will come quickly?.. lol.
This thread was to show people the games coming out THIS year and LAST year that can run physics. Not for you to say "hell no, IN 3 YEARS' TIME A CPU CAN DO IT BETTER THAN A GPU." Who gives a shit about 2~3 years from now? Will that help me get the full effect from a game like Red Faction next month.... No.
You're right, I will just wait a few years till 12 cores are supported by a fabled OS and games that support it, because you read it somewhere. Then and only then can I experience physics in my games.... lol
And if you say physics means nothing to gaming.... I know it's not a MASSIVE thing so far; it adds a little fluff to the game. But isn't that the same as upgrading your PC so you can run a little more AA or "up" your textures to clean up your favorite game? IF NOT, then why don't you trade in your PC for a console (that's cheaper)? If something extra is not your thing, why not play games @ 800x600 with everything on low?
WE PC gamers want more; that's why we don't play consoles. We want the best graphics/sound. That's why we spend more money on a PC instead of an XBOX or PS3. But then you state "well, physics is just too much".... "you can pay 1k more for your PC setup over a console to play games... but you can't spend 1.5k..."? Where do you all get off "drawing the line in the sand" for what is too much to spend on a PC?...
It's funny, cause if a console guy was saying "Spending more than 300 dollars for a PC is a waste cause Xbox is better price/performance," you would all be flaming him with... "But you can do so much more with a PC."
And yet you're shitting on physics, saying it's not worth the extra cash.... Hypocrites.
And why is it you think that physics is the only reason we went Nvidia... lol, there are a lot more important reasons... Try: they stomp ATI.
Also it seems to me like you all have ATI and can't run physics NOW.. for the games NOW, and a year from now.
And you're pissed, so you resort to saying physics is stupid and doesn't work. Is it to make up for the fact that you haven't really seen it, you haven't played it, and you can't use it until years from now unless you buy a new card or a CPU can do it?
Make statements all you want, but it's clear that you're crying a river about it.
That's the only truth.
One day ATI will get its drivers stable and keep them that way. And when they do, they can start work on physics and whatever. Then you can wait till they get the drivers stable on that (good luck). If the CPU starts doing the physics... will that really help most of you? Physics can cripple a PC, and most of you bargain-shop for hardware. Are you going to have that 12-core (server chip) when the time comes?
Yeah, all this could happen, but most of you are stating that tech you can't afford and/or that hasn't come out yet can beat what I like to run.... that's stupid.
Let the guy that runs high-end AMD/ATI hardware tell me at least...
Anvil, what rig are you running?
What physics games have you installed & played all the way through?
What frames did you get?
What did you think wasn't worth it?
I want to know if what you state is "hands-on" or word of mouth.
Post some proof.
We would all like to hear and see it.
By the way, E3 and CGE, PFG, HIK. cons are places where companies go to hype and market their products. And yet you don't think that you too have been suckered in by their marketing? The hardware battle that ALL gamers lose, when our PCs are outdated every month? And we have ALL been chasing the dream, filling their pockets.... lol, by your logic we have all been suckered.
Yeah, we all know that, buddy; we didn't need to go to a con to find that out.
And if we should all believe you... then what shall we do? Wait till the future comes to buy something... but then the new future will have something better, and we wait again... When do we get to buy something and play it?... lol
PS: Pointing out grammar is for people that don't have a valid point. You point it out trying to make the other guy look stupid, thinking it will make your retort more solid than the guy who doesn't care to run spell check. This is a hobby I do between gaming; I don't care if my grammar is wack. I post as fast as I can so I can get back to gaming.
"Beauty is only skin deep..." said the AMD/ATI fan. "Blah.. That's just what ugly people say..." said the Intel/Nvidia fan. You want price/performance, use the dollar menu..
"Why are NVIDIA releasing performance gains?"
It's called SUPPORT! Nvidia drivers work, and they are tweaking those drivers and getting more performance out of them, not fixing broken drivers so they work! So, the question again.... If you look at the release notes on those drivers, you will see performance gain values (%) per game. "Do you see that with ATI drivers?" It's a simple yes or no question.
"60fps vs 72fps is of less importance."
Now, if you're playing a game with everything on high and no AA or AF and only getting 60 frames, your statement above makes no sense! You would then have room to add some AA and/or AF to clean up some jagged edges in the game! I guess it has a little more importance than you thought!
All Nvidia fanboys: do me a favor. Google "nv_disp".
What do you see?
Stop pretending Nvidia's drivers are better than ATI's. They aren't (please prove me otherwise in a non-retarded, no-"nuff said" way). That whole pretense probably started as a joke somewhere that people didn't pick up on.
The only point where PhysX is interesting is when it is required, which means larger parts of the entire engine are using PPU-calculated physics. Right now, shit like shattering glass in Mirror's Edge doesn't mean shit, to be honest (a small difference that doesn't matter whatsoever for the game). But when we have realistic ballistics modelling, based on real-world physics, affecting traveling bullets, hell yeah. But for that to happen, ATI and Nvidia need to agree on the same API (like with DirectX), or publishers will have to write off half of the target group because they have ATI.
I'm so fucking tired of reading fanboys (on both ATI's and Nvidia's side) not knowing SHIT about what they talk about. I once actually read about a guy who was convinced that ATI was downgrading their drivers for some reason. Insane. Wouldn't surprise me if he posted on these forums, though.
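To make the "same API" point concrete: a minimal sketch, with invented names (PhysicsBackend, CpuBackend, GpuBackend are hypothetical, not any real library), of the kind of vendor-neutral interface the DirectX comparison implies. Game code targets the interface and never cares whose hardware runs the simulation.

```cpp
// Hypothetical sketch of a vendor-neutral physics API, in the spirit of
// the DirectX comparison above. Nothing here is a real library.
#include <iostream>
#include <memory>

struct PhysicsBackend {
    virtual ~PhysicsBackend() = default;
    virtual void simulate(float dt) = 0;
};

struct CpuBackend : PhysicsBackend {
    void simulate(float dt) override {
        std::cout << "CPU step, dt=" << dt << "\n"; // worker threads would go here
    }
};

struct GpuBackend : PhysicsBackend {
    void simulate(float dt) override {
        std::cout << "GPU step, dt=" << dt << "\n"; // CUDA/OpenCL dispatch would go here
    }
};

// Game code only ever sees the interface, so an ATI user, an Nvidia user,
// or a CPU-only user all get the same feature set.
int main() {
    bool hasGpuPhysics = false; // would be detected at runtime
    std::unique_ptr<PhysicsBackend> physics =
        hasGpuPhysics ? std::unique_ptr<PhysicsBackend>(new GpuBackend())
                      : std::unique_ptr<PhysicsBackend>(new CpuBackend());
    physics->simulate(1.0f / 60.0f);
}
```

The DirectX analogy is exactly this: one interface, many vendors competing underneath it.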
AMD Athlon64 X2 6400+ (Planning an upgrade)
2 GB PC5400 DDR2 RAM
Yes you've guessed it, HIS HD 4870 512 MB
Sounds to me like you haven't even played a game with PhysX. Or did you "read" about one game, out of 40+? The whole market is moving to add physics of some kind to their games. Cry 3, Unreal 3, new MMOs and FPS....
Keep wiping your ass with pinecones.... while the rest of us who want that "little extra" (as you call it) can have it.
Why? Cause we can.
But it looks like you can't. And that's why you are full of nerd rage.
Proof, you want proof... lol, you know how hard it would be to add up the last 10 years of forum hate on ATI and Nvidia... then average the two, then show you the numbers. Even avid gamers will tell you ATI has rocky drivers. Of course there is no perfect driver on either side. But I used to be a die-hard ATI fan, then I tried Nvidia and never looked back.
Well, what about the fact that ATI as a company had fallen off the grid until lately... Could part of it be shaky drivers....?
"Beauty is only skin deep..." said the AMD/ATI fan. "Blah.. That's just what ugly people say..." said the Intel/Nvidia fan. You want price/performance, use the dollar menu..
Just to put in a jab here, but not to sound completely like a fanboy: I can honestly say, out of the literally hundreds and hundreds of computers I have worked on in the past 10 years, that I hate ATI drivers with the fiery burning passion of a 1000 supernovas. I hate the way they load, I hate the problems I have loading them, their control system for their cards is horrible, and as far as I am concerned it is not laid out well for anyone but tech nerds, most of whom can't tell you what half the crap does. Also the fact that they don't support older hardware on newer OSes. I am talking about the guy who was writing 3rd-party drivers for ATI cards that allowed them not only to work with Vista but to run better than the ones ATI had released for XP, and unlocked all the features of the cards. They sure shut him down quick and threatened to sue him over those.
Don't get me wrong, Nvidia has had their share of issues, but I trust their system more than ATI's by a long shot. I wish 3dfx had stayed in business and not sold out to them, because they buried the best graphics system IMO, OpenGL, when they bought them out. As for now, the only problem I have with Nvidia is wishing the card prices would come down a bit so I can SLI some things... LOL. As for PhysX, I have played a few games with it and I must say the frill is nice, and I am glad they are finally adding tech like this into games; just another thing to step PCs away from consoles, as usual.
Thank you... well said.
"Beauty is only skin deep..." said the AMD/ATI fan. "Blah.. That's just what ugly people say..." said the Intel/Nvidia fan. You want price/performance, use the dollar menu..
This entire post is an oxymoron, because you are arguing two different points... Physics (deformable environments, etc.) is different from PhysX, which is nothing but proprietary middleware from Nvidia. Nothing new or even remarkable, and it will probably die out later this year.
Secondly, most of the games that have come out recently with PhysX have been questionable. Their implementation was horrible and added no real value to the title. Some upcoming games with Nvidia PhysX have already been reviewed, and those titles have fallen short of expectations.
That's not to say there isn't a future for hardware acceleration of in-game physics, but I highly doubt Nvidia's proprietary solution is going to win out or even be the platform most developers choose. Havok and OpenCL are much more robust and accessible, not to mention more powerful.
There is no denying that in-game physics is coming, but Nvidia isn't going to be at the helm. There is already way more support for open-ended physics, with multi-core support, as others have mentioned to you.
Dread, I think you're overreacting and got too hung up on Nvidia's marketing ploy... as (again) someone else has mentioned. Future games will make more use of cores than of CUDA.
___________________________
- Knowledge is power, I've been in school for 28 years!
"It's called SUPPORT! Nvidia drivers work, and they are tweaking those drivers and getting more performance out of them, not fixing broken drivers so they work! So, the question again.... If you look at the release notes on those drivers, you will see performance gain values (%) per game. "Do you see that with ATI drivers?" It's a simple yes or no question."
Because it is enough to say that there is a performance increase. The percentage alone says nothing. It is just a fancy number, but seemingly the trick works. Odd that it does, though, when they actually say "up to" x% increase, and mention that it is dependent on GPU, CPU, well, on the computer hardware as well as the game settings.
It's not a yes or no question; it's a quote taken out of context, and a question answered with another, unrelated question.
-------
"60fps vs 72fps is of less importance."
"Now, if you're playing a game with everything on high and no AA or AF and only getting 60 frames, your statement above makes no sense! You would then have room to add some AA and/or AF to clean up some jagged edges in the game! I guess it has a little more importance than you thought!"
24fps is 20% more than 20fps. 72fps is 20% more than 60fps. I never said anything about high settings, never said anything about AA or AF. Heck, why didn't you imply that I was playing at 320x200 resolution while you were at it...
What I was saying is that 20% can look like a large number, it is 1/5 of something, but in the end it is not as much as it seems. 24fps vs 20fps is "nothing", and if you are playing a game close to the screen refresh rate it is even less important. There is a higher chance that you notice the 4fps increase from 20 than the 12fps increase from 60fps.
And note that Nvidia does not tell you where their "up to" percentage increase applies...
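For what it's worth, the percentage-vs-frames point checks out numerically. A quick sketch using the exact numbers from the posts above; frame time is where the felt difference shows up:

```cpp
// Quick arithmetic on the numbers quoted above: the same 20% gain
// buys very different amounts of frame time at 20 fps vs 60 fps.
#include <cstdio>

int main() {
    const double pairs[2][2] = { {20.0, 24.0}, {60.0, 72.0} };
    for (const auto& p : pairs) {
        double gainPct = (p[1] - p[0]) / p[0] * 100.0;  // both come out to 20%
        double msSaved = 1000.0 / p[0] - 1000.0 / p[1]; // frame-time reduction
        std::printf("%2.0f -> %2.0f fps: +%.0f%%, %.1f ms less per frame\n",
                    p[0], p[1], gainPct, msSaved);
    }
}
```

It prints that 20 -> 24fps saves about 8.3 ms per frame while 60 -> 72fps saves about 2.8 ms, which is the point being made: equal percentages, very different perceived gains.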
Originally posted by Erowid420: This entire post is an oxymoron, because you are arguing two different points... Physics (deformable environments, etc.) is different from PhysX, which is nothing but proprietary middleware from Nvidia. Nothing new or even remarkable, and it will probably die out later this year.
Thanks, that actually got lost and forgotten in the discussion.
Me.
"Odd that it does, though, when they actually say "up to" x% increase, and mention that it is dependent on GPU, CPU, well, on the computer hardware as well as the game settings."
It's odd that you think.. that this is odd. There are single cores, dual cores, 3-core (junk), and quad cores. What is odd is that you would think they all have the same performance. Same with GPUs and SLI setups. As for game settings... Vsync caps the frames @ 60fps. So why is this so hard to grasp? OMG!
"Now, if you're playing a game with everything on high and no AA or AF and only getting 60 frames, your statement above makes no sense! You would then have room to add some AA and/or AF to clean up some jagged edges in the game! I guess it has a little more importance than you thought!"
"What I was saying is that 20% can look like a large number, it is 1/5 of something, but in the end it is not as much as it seems."
Well.. anyone knows that 60fps is the magic number when it comes to games. This is the number that you are shooting for, or at least around it. So if you're only getting 20 frames, you better get a smaller display, lower the resolution, or get a better GFX card! Now let's stick with the 60fps and the jump to 72fps in (FREE) performance gains. Wouldn't you say that is nice for something that was free? 2nd: if you are running a 30" display... would you want the extra power?
And yes, the drivers have yielded more performance, even more so when using my 9800GTXs in SLI. I guess it's just me and my resolutions @ 5040x1050 and 3840x1024 that get the extra power, and no one else.
I guess you will never understand!
"24fps is 20% more than 20fps. 72fps is 20% more than 60fps. I never said anything about high settings, never said anything about AA or AF. Heck, why didn't you imply that I was playing at 320x200 resolution while you were at it..."
That was an example of what the extra "FREE" performance could do for someone. @ 320x200 resolution there would be no performance gain whatsoever, and I'm not going to explain it to you either!
Can you define "free performance", please..?
BTW, when you quote someone, please use QUOTES (you know: " "). I was confused reading your reply for a sec, thnx.
___________________________
- Knowledge is power, I've been in school for 28 years!
"It's odd that you think.. that this is odd. There are single cores, dual cores, 3-core (junk), and quad cores. What is odd is that you would think they all have the same performance. Same with GPUs and SLI setups. As for game settings... Vsync caps the frames @ 60fps. So why is this so hard to grasp? OMG!"
Where did I say anything about 3 cores or whatever else? Did I even mention Vsync or SLI? I'm talking about an argument made by you, that NVIDIA "rocks" because they present their performance increases in percentages. I said nothing else, nothing; at this point you are just putting words into my mouth.
[cut]
"I guess you will never understand!"
Sure, maybe so. But what you are missing is... that their 40%, 22% or whatever means nothing. It is just PR, nothing else. The differences, the variations, in computer setups... well, I already said that above. You are even admitting it in your post, but instead you argue against something that I did not say.
To trick people into thinking, hey, they are good; and it works, does it not?
I can't read that link. But does that make a difference in the context of the percentage's importance? Anyway, that would make the argument that NVIDIA "rocks" because they put percentages in their release notes worth even less.
Just more misinformation, added to the other claim in this thread that ATI only releases drivers every second month (and doesn't do performance fixes).
Maybe NVIDIA is the best, maybe not, but could people base that on things that are true (and matter) instead of on things that really do not matter or are not even a difference between the two? That could actually, in the end, make this whole hype about PhysX look, I don't know, smaller?
PhysX is worthless. It's marketing hype to make you believe you need it. You don't. It is of no consequence in the games that use it, and it has been rejected by Intel and AMD. That means it's a non-starter and will be left behind in favor of an open API, namely Havok.
BTW, ATI has the best driver team in the industry. Monthly updates are something Nvidia cannot touch.
Well, it's about F'n time someone answered the question from 5 pages ago. What gets me is that out of all you guys that use ATI hardware.... only one of you reads the release notes for your drivers. SAD! Though it's only for one game (Lost Planet), they do show it in % gains, and for different cards.
Hey Orphen/Orphes.... you can go wash the shit off your face now! LOL!
I asked why Nvidia is releasing their performance gains; I also asked why you are using that as an argument to say NVIDIA > ATI... Obviously that was a lie from your side. My fault was that I trusted you on it. Now, when you are called on it, you turn it in your favour... seriously. You were the only one saying and implying what ATI doesn't do. :S
I also said that their percentage means nothing.
Fabricate lies all you want, act like a child arguing in a sandbox for all I care. :S
This is my post; try reading it again, again and again until you actually understand what is written in it.
-------------
ATI updates their drivers once a month.
Why are NVIDIA releasing performance gains? Those are likely optimal numbers based on their setup, which one has to assume is as optimal as it can be. Is it realistic to expect to reach their "up to" performance gain? For whom?
And I don't go looking at a chart that says 20fps vs 24fps and then go, oh, a 20% performance increase. I'd rather go, oh, only a 4fps difference. 60fps vs 72fps is of less importance.
Why say that ATI doesn't increase performance with their drivers?
It's great that there is something like PhysX. If it is as godlike as it is described to be, no one wins from it being supported by only one manufacturer.
---------------
Instead of misquoting, taking things out of context, and making it look like something it isn't.
"Because it is enough to say that there is a performance increase. The percentage alone says nothing. It is just a fancy number, but seemingly the trick works. Odd that it does, though, when they actually say "up to" x% increase, and mention that it is dependent on GPU, CPU, well, on the computer hardware as well as the game settings."
Which you turned into meaning something else, when I was pointing at the very thing that you imply I would not understand or know. Like this:
"Odd that it does, though, when they actually say "up to" x% increase, and mention that it is dependent on GPU, CPU, well, on the computer hardware as well as the game settings."
"It's odd that you think.. that this is odd. There are single cores, dual cores, 3-core (junk), and quad cores. What is odd is that you would think they all have the same performance. Same with GPUs and SLI setups. As for game settings... Vsync caps the frames @ 60fps. So why is this so hard to grasp? OMG!"
lol, I just gotta ask: are all you guys from 55six so aggressive? I mean, shit, I've rarely seen a bunch of guys from the same clan be so straight about their opinions, to put it diplomatically.
Comments
I'm so broke. I can't even pay attention.
"You have the right not to be killed"
lol well that last posts got Modded.... well i guess i will have to keep it down a little...dont want to Get sniched again.
"Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..
Hey...Theory (the name fits you)
"PhysX is a BRAND NAME used by Nvidia as a marketing tool"
When did BRAND NAMES and MARKETING TOOLS start using their own drivers? Where did you read this?
"games have struck a deal with Nvidia to use CUDA/PhysX partially n their up and comming games. Because no programer is going to code specifically just for Nvidia, when they can just code for all."
Now you are more confusing than trying to eat meatloaf with no mouth!
The 8- and 12-core CPUs are meant to be server parts. So what OS supports 6 cores? The 6-core CPUs (not Intel's) are most likely to be 3 cores x 2 under one chip. We all know about AMD face-planting with its 3-core chips. The 6-core should be twice as bad!
You will be lucky to see PhysX on the CPU (CPU only) within 5 years. So all you ATI fans, and the hardware you are using today, are out of luck.
PhysX was a decent idea that was overtaken by technology. There is no need to find more things for a GPU to do and no need for a separate PhysX processor. Doing it with a software API on the CPU is much more efficient. PhysX is rapidly becoming a thing of the past.
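For reference, "doing it with a software API on the CPU" in practice means chopping the physics step across however many cores are idle. A toy sketch of that idea, assuming a bare particle world; none of this is PhysX or Havok code, just an illustration:

#include <algorithm>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

// Toy particle state; a stand-in for whatever a real engine integrates.
struct Particle { float x, y, z, vx, vy, vz; };

// Integrate one slice of the particle array for a single 60 Hz step.
static void integrate(std::vector<Particle>& p, size_t lo, size_t hi, float dt) {
    for (size_t i = lo; i < hi; ++i) {
        p[i].vy -= 9.81f * dt;   // gravity
        p[i].x  += p[i].vx * dt;
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
    }
}

int main() {
    std::vector<Particle> world(100000, Particle{0.0f, 100.0f, 0.0f, 1.0f, 0.0f, 0.0f});
    const float dt = 1.0f / 60.0f;
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    // One slice of the world per core; this is the "idle cores" argument.
    std::vector<std::thread> pool;
    const size_t chunk = world.size() / cores + 1;
    for (unsigned t = 0; t < cores; ++t) {
        size_t lo = t * chunk;
        size_t hi = std::min(world.size(), lo + chunk);
        if (lo < hi) pool.emplace_back(integrate, std::ref(world), lo, hi, dt);
    }
    for (auto& th : pool) th.join();

    std::printf("stepped %zu particles on %u threads\n", world.size(), cores);
    return 0;
}

Whether that beats a GPU doing the same thing is exactly what's being argued in this thread: the GPU wins on raw parallel arithmetic, the CPU wins on not needing vendor-specific middleware.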
You got it all wrong!!! PhysX is here now and will be around for a long time, however they do it! That's why the games are looking better and better. So get used to it!
Colors...? That's all you got? Trying to sidestep all the other statements that you fumbled by lol'ing at colors? You burned me!!
You wanted to be the teacher... so I busted out the red marker to grade your paper.
Back to the point
WHO GIVES A CRAP WHAT THE FUTURE HOLDS... we will all find out when the time comes. That's not going to sway my purchase of tech that I use to play the games of the present. I will buy new hardware when I need to.
WHO KEEPS A PC FOR MORE THAN 3 YEARS WITHOUT AN UPGRADE (OF ANY KIND)?
We all build PCs to play the games we want to play. We all upgrade from time to time to keep up with games.
So what if there will be a 6-core or a 12-core that can do physics better 2 years from now...
Are you going to skip upgrading your PC for 2 years so your rig will finally play a game with physics?
No... neither will any serious gamer. Gamers buy tech to play the games at hand... when the tech and games change, so do we.
By the time any OS supports 100% of the dual/quad cores we have now, we will have already upgraded to whatever is out when that day comes. LOOK HOW LONG IT HAS TAKEN TO EVEN SEE A FEW "TITLES" THAT ARE MULTITHREADED NOW, and you think 6-, 8-, and 12-core support will come quick?.. lol.
This thread was to show people the games coming out THIS year and LAST year that can run physics. Not for you to say "hell no, IN 3 YEARS' TIME A CPU CAN DO IT BETTER THAN A GPU." Who gives a shit about 2~3 years from now? Will that help me get the full effect from a game like Red Faction next month? ...No.
You're right, I will just wait a few years till 12 cores are supported by a fabled OS and by games that support it, because you read it somewhere. Then and only then can I experience physics in my games... lol.
And if you say physics means nothing to gaming... I know it's not a MASSIVE thing so far; it adds a little fluff to the game. But isn't that the same as upgrading your PC so you can run a little more AA or "up" your textures to clean up your favorite game? IF NOT, then why don't you trade in your PC for a console (that's cheaper)? If something extra is not your thing, why not play games @ 800x600 with everything on low?
WE PC gamers want more; that's why we don't play consoles. We want the best graphics/sound. That's why we spend more money on a PC instead of an Xbox or PS3, but then you state "well, physics is just too much"... "you can pay 1k more for your PC setup over a console to play games... but you can't spend 1.5k..."? Where do you all get off "drawing the line in the sand" for what is too much to spend on a PC?...
It's funny, 'cause if a console guy said "spending more than 300 dollars on a PC is a waste 'cause the Xbox is better price/performance," you would all be flaming him with... "But you can do so much more with a PC!"
And yet you're shitting on physics, saying it's not worth the extra cash... Hypocrites.
And why is it you think that physics is the only reason we went Nvidia... lol, there are a lot more important reasons... Try: they stomp ATI.
Also, it seems to me like you all have ATI and can't run PhysX NOW... for the games NOW, and a year from now.
And you're pissed, so you resort to saying physics is stupid and doesn't work. Is it to make up for the fact that you haven't really seen it, you haven't played it, and you can't use it until years from now, unless you buy a new card or a CPU can do it?
Make statements all you want, but it's clear that you're crying a river about it.
That's the only truth.
One day ATI will get its drivers stable and keep them that way. And when they do, they can start work on physics and whatever. Then you can wait till they get the drivers stable on that (good luck). If the CPU starts doing the physics... will that really help most of you? Physics can cripple a PC, and most of you bargain-shop for hardware. Are you going to have that 12-core (server chip) when the time comes?
Yeah, all this could happen, but most of you are claiming that tech you can't afford, and/or that hasn't come out yet, can beat what I like to run... that's stupid.
Let the guy that runs that high-end AMD/ATI hardware... tell me at least:
Anvil, what rig are you running?
What physics games have you installed & played all the way through?
What frames did you get?
What did you think wasn't worth it?
I want to know if what you state is "hands-on" or word of mouth.
Post some proof.
We would all like to hear and see it.
By the way, E3 and CGE, PFG, HIK. cons are places where companies go to hype and market their products, and yet you don't think that you too have been suckered in by their marketing? The hardware battle that all gamers lose, when our PCs are outdated every month? And we have all been chasing the dream, filling their pockets... lol, by your logic we have all been suckered.
Yeah, we all know that, buddy; we didn't need to go to a con to find that out.
And if we should all believe you... then what shall we do? Wait till the future comes to buy something... but then the new future will have something better, and we wait again... When do we get to buy something and play it?... lol
PS: Pointing out grammar is for people that don't have a valid point. You point it out trying to make the other guy look stupid, thinking it will make your retort more solid than the guy who doesn't care to run spell check. This is a hobby I do between gaming; I don't care if my grammar is wack. I post this as fast as I can so I can get back to gaming.
"Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..
"Why are NVIDIA releasing performance gains?"
It's called SUPPORT! Nvidia drivers work, and they are tweaking those drivers to get more performance out of them, not fixing broken drivers so they work! So, the question again... Also, if you look at the release notes on those drivers, you will see performance gain values (%) per game. "Do you see that with ATI drivers?" It's a simple yes or no question.
"60fps vs 72fps is of less importance."
Now, if you're playing a game with everything on high and no AA or AF and only getting 60 frames, your statement above makes no sense! You would then have room to add some AA and/or AF to clean up some jagged edges in the game! I guess it has a little more importance than you thought!
All Nvidia fanboys: do me a favor. Google "nv_disp".
What do you see?
Stop pretending Nvidia's drivers are better than ATI's. They aren't (please prove me otherwise in a way that isn't a retarded "nuff said"). That whole pretense probably started as a joke somewhere that people didn't pick up on.
The only point where PhysX is interesting is when it is required, which means larger parts of the entire engine are using PPU-calculated physics. Right now, stuff like shattering glass in Mirror's Edge doesn't mean shit, to be honest (a small difference that means nothing whatsoever for the game). But when we have realistic ballistics modeling, based on real-world physics, affecting traversing bullets: hell yeah. For that to happen, though, ATI and Nvidia need to agree on the same API (like with DirectX), or publishers will have to write off half the target group for having ATI.
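To make the ballistics point concrete, here is roughly what "real-world physics affecting traversing bullets" boils down to numerically: gravity plus air drag, integrated per time step. A bare-bones sketch; the drag constant is made up for illustration, and real engines use proper drag tables and better integrators:

#include <cmath>
#include <cstdio>

int main() {
    double x = 0.0, y = 1.5;       // position (m); muzzle 1.5 m above ground
    double vx = 800.0, vy = 0.0;   // muzzle velocity (m/s), fired level
    const double k  = 0.003;       // lumped drag coefficient (1/m), illustrative
    const double g  = 9.81;        // gravity (m/s^2)
    const double dt = 0.0005;      // integration step (s)

    double t = 0.0;
    while (y > 0.0) {              // step until the bullet reaches the ground
        const double v = std::sqrt(vx * vx + vy * vy);
        vx += -k * v * vx * dt;          // quadratic drag opposes velocity
        vy += (-k * v * vy - g) * dt;    // drag plus gravity
        x  += vx * dt;
        y  += vy * dt;
        t  += dt;
    }
    std::printf("impact: %.0f m downrange after %.2f s, %.0f m/s remaining\n",
                x, t, std::sqrt(vx * vx + vy * vy));
    return 0;
}

Cheap as this is for one projectile, running it for thousands per frame is where hardware acceleration, on whichever side of the fence, starts to matter.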
I'm so fucking tired of reading about fanboys (on both ATI's and Nvidia's side) not knowing SHIT about what they talk about. I once actually read about a guy who was convinced that ATI was downgrading their drivers for some reason. Insane. Wouldn't surprise me if he posted on these forums, though.
AMD Athlon64 X2 6400+ (Planning an upgrade)
2 GB PC5400 DDR2 RAM
Yes you've guessed it, HIS HD 4870 512 MB
Sounds to me like you haven't even played a game with PhysX. Or did you "read" about one game, out of 40+? The whole market is moving to add physics of some kind to their games: Cry 3, Unreal 3, new MMOs and FPSs...
Keep wiping your ass with pinecones... while the rest of us who want that "little extra" (as you call it) can have it.
Why? 'Cause we can.
But it looks like you can't. And that's why you are full of nerd rage.
Proof? You want proof... lol. Do you know how hard it would be to add up the last 10 years of forum hate on ATI and Nvidia... then average the two, then show you the numbers? Even avid gamers will tell you ATI has rocky drivers. Of course there is no perfect driver on either side. But I used to be a die-hard ATI fan, then I tried Nvidia and never looked back.
Well, what about the fact that ATI as a company had fallen off the grid until lately... Could part of it be shaky drivers?
"Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..
Just to put in a jab here, but not to sound completely like a fanboy: I can honestly say, out of the literally hundreds and hundreds of computers I have worked on in the past 10 years, that I hate ATI drivers with the fiery burning passion of 1000 supernovas. I hate the way they load, I hate the problems I have loading them, their control system for their cards is horrible, and as far as I am concerned it's not laid out well for anyone but tech nerds, most of whom can't tell you what half the crap does. There's also the fact that they don't support older hardware on newer OSes. I am talking about the guy who was writing 3rd-party drivers for ATI cards that allowed them not only to work with Vista but to run better than the ones ATI had released for XP, and unlocked all the features of the cards. They sure shut him down quick and threatened to sue him over those.
Don't get me wrong, Nvidia has had their share of issues, but I trust their system more than ATI's by a long shot. I wish 3dfx had stayed in business and not sold out to them, because they buried the best graphics system IMO, Glide, when they bought them out. For now the only problem I have with Nvidia is wishing the card prices would come down a bit so I can SLI some things... LOL. As for PhysX, I have played a few games with it and I must say the frill is nice, and I am glad they are finally adding tech like this into games; just another thing to step PCs away from consoles, as usual.
Thank you... well said.
"Beauty is only skin deep..." said the AMD/ATI fan. "Blah... That's just what ugly people say..." said the Intel/Nvidia fan. You want price/performance? Use the dollar menu.
This entire post is an oxymoron, because you are arguing two different points... Physics (deformable environments, etc.) is different from PhysX, which is nothing but proprietary middleware from Nvidia. Nothing new or even remarkable, and it will probably die out later this year.
Secondly, most of the games that have come out recently with PhysX have been questionable. Their implementation was horrible and added no real value to the title. Some upcoming games with Nvidia PhysX have already been reviewed, and those titles have fallen short of expectations.
That's not to say there isn't a future for hardware acceleration of in-game physics, but I highly doubt Nvidia's proprietary solution is going to win out or even be the platform most developers choose. Havok and OpenCL are much more robust and accessible, not to mention more powerful.
There is no denying that in-game physics is coming, but Nvidia isn't going to be at the helm. There is already way more support for open-ended physics, with multi-core support, as others have mentioned to you.
Dread, I think you're overreacting and got too hung up on Nvidia's marketing ploy... as (again) someone else has mentioned. Future games will make more use of cores than of CUDA.
___________________________
- Knowledge is power, I've been in school for 28 years!
A lot of those FPS games you listed are crap. Have fun.
-----------------------------
Real as Reality Television!!!
Me.
I'm so broke. I can't even pay attention.
"You have the right not to be killed"
Thanks, that actually was lost and forgotten in the discussion.
I'm so broke. I can't even pay attention.
"You have the right not to be killed"
Me.
"Odd that it does though when they actually say up to x% increase, and that they mention that it is dependent on GPU, CPU, well dependant on the computer hardware aswell as the game settings".
It's odd that you think.. that this is odd. There are single cores, Dual cores, 3 core (junk), and Quad cores. What is odd is that you would think they all have the same performances? Aswell with GPU's, and SLI setups. As for game settings...Vsinc caps the frames @ 60fps. So why is this so hard to grasp? OMG!
Now if your playing a game with everything on high and no AA or AF and only getting 60 frames. Your statment above makes no sence! You would then have room to add some AA and or AF to clean up some jagged edges in the game! I guess it has a little more importance than you thought!
"What I was saying is that 20% (could be) is a large number, it is 1/5 of something, but in the end is is not as much as it seems."
Well.. anyone knows that 60fps is the magic number when it comes to games. This is the number that you are shooting for or at least around it. So if your only getting 20 frames you better get a smaller display, lower the resolution or a better GFX card! Now lets stick with the 60fps and the jump to 72fps performance (FREE) gains. Now wouldn't you say that is nice for somthing that was free? 2nd. If you are running a 30" display...would you want the extra power?
And yes the drivers have yelded more performance and more so when using my 9800GTX's in SLI. I guess it's just me and my resolutions @5040x1050 and 3840x 1024 that gets the extra power and no one else.
I guess you will never understand!
"24fps is 20% more than 20fps. 72fps is 20% more then 60fps. Never said anything about high setting, never said anything about AA or AF. Heck, why didn't you imply that I was playing a game at 320x200 resolution while you was at it..."
That was a example of what the extra "FREE" performance could do for someone. @ 320x200 resolution there would be no performance gain what so ever and Im not going to explain it to you ether!
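The 60-vs-72 argument reads more clearly in frame times than in frame rates. A quick sketch of the per-frame budget that a (claimed) free 20% buys, assuming the gain is real on a given rig:

#include <cstdio>

int main() {
    const double before = 60.0, after = 72.0; // fps before/after a driver update
    const double ms_before = 1000.0 / before; // ~16.67 ms per frame
    const double ms_after  = 1000.0 / after;  // ~13.89 ms per frame
    // ~2.78 ms freed per frame: budget that can be spent on AA/AF
    // instead of raw frame rate, which is the point being argued above.
    std::printf("%.2f ms -> %.2f ms per frame (%.2f ms of headroom)\n",
                ms_before, ms_after, ms_before - ms_after);
    return 0;
}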
Can you define "free performance," please?
BTW, when you quote someone, please use QUOTES (you know: " "). I was confused reading your reply for a sec, thnx.
___________________________
- Knowledge is power, I've been in school for 28 years!
Does anyone know how an 8600 GTS would perform as a PhysX processor?
I'm so broke. I can't even pay attention.
"You have the right not to be killed"
ATI also writes their performance increases in %:
www2.ati.com/relnotes/Catalyst_93_release_notes.pdf
I can't read that link. But does that make a difference to the point about the percentages' importance? Anyway, it would make the argument that NVIDIA "rock" because they put percentages in their release notes worth even less.
Just another piece of misinformation, added to the other claim in this thread that ATI only releases drivers every second month (and doesn't do performance fixes).
Maybe NVIDIA is the best, maybe not, but could people base that on things that are true (and matter) instead of on things that really don't matter, or aren't even a difference between the two? That could actually, in the end, make this whole PhysX hype, I don't know, look like less?
I'm so broke. I can't even pay attention.
"You have the right not to be killed"
PhysX is worthless. It's marketing hype to make you believe you need it. You don't. It's of no consequence in the games that use it, and it has been rejected by Intel and AMD. That means it's a non-starter and will be left behind in favor of a cross-vendor API, namely Havok.
BTW, ATI has the best driver team in the industry. Monthly updates are something Nvidia can't touch.
I asked why Nvidia is releasing their performance gains; I also asked why you are using that as an argument to say NVIDIA > ATI... Obviously that was a lie from your side. My fault was that I trusted you on it. Now, when you are called on it, you turn it in your favour... seriously. You were the only one saying and implying what ATI doesn't do. :S
I also said that their percentages mean nothing.
Fabricate all the lies you want; act like a child arguing in a sandbox for all I care. :S
This is my post; try reading it again, again, and again until you actually understand what is written in it.
-------------
ATI updates their drivers once a month.
Why are NVIDIA releasing performance gains? It's likely those are optimal numbers based on their setup, which one has to assume is as optimal as it can be. Is it realistic to expect to reach their "up to" performance gain? For whom?
And I don't look at a chart that says 20fps vs 24fps and go, "oh, a 20% performance increase." I'd rather go, "oh, only a 4fps difference." 60fps vs 72fps is of less importance.
Why say that ATI doesn't increase performance with their drivers?
It's great that there is something like PhysX. If it is as godlike as it is described to be, no one wins on it being supported by only one manufacturer.
---------------
Instead of misquoting and taking things out of context, and making it look like something it isn't.
Which you turned into meaning something else, when I was pointing out the very thing you imply I would not understand or know. Like this:
I'm so broke. I can't even pay attention.
"You have the right not to be killed"
lol, I just gotta ask: are all you guys from 55six so aggressive? I mean, shit, I've rarely seen a bunch of guys from the same clan be so blunt about their opinions, to put it diplomatically.