
PhysX was pointless...?


Comments

  • OrphesOrphes Member UncommonPosts: 3,039
    Originally posted by Sir_Drip





    He never answered the question (Jack did) but instead tried to debunk Nvidia and state that the percentages were lies and/or didn't matter. 2nd...



    I NEVER stated those were lies. I stated that those numbers were worthless and questioned why NVIDIA released them. I questioned why a percentage could be a motivator to say NVIDIA is good.


    You see... I DO NOT HAVE TO ANSWER ALL YOUR QUESTIONS TO MAKE A POST ABOUT SOMETHING IN YOUR POST.


    Get over yourself.



    When the next question about drivers comes about, how much do you want to bet he reads the release notes before opening his trap?
    I do not need to read a release note from ATI to answer a claim that NVIDIA is good because they use percentages.
    3rd... Now that the cat is out of the bag... how much would you like to bet that he will not start a post stating that (his) ATI % performance gains are lies, there just to sell more products, and mean nothing for anyone?
    I have not posted that any other % performance increase is a lie in the first place.
    More than likely it will be the other way around! That's a shiit salesman at its finest!
    It is you, nobody else, who is putting words into my mouth. If you find that a good basis for argumentation, go ahead and do it. :S You are nothing but insulting and rude about a normal discussion and normal questions about things.


    If you can't take it that people DO NOT share your view, you are going to have to learn to.
     

    I'm so broke. I can't even pay attention.
    "You have the right not to be killed"

  • AtomicZombieAtomicZombie Member Posts: 76

    The topic discussion is about PhysX.

    If you're going to participate in this discussion, make sure you stay on topic. Personal attacks are not tolerated on the forum, as per the Rules of Conduct.

     

    ~AtomicZombie

    On the whole human beings want to be good, but not too good, and not quite all the time.

  • Sir_DripSir_Drip Member Posts: 133
    Originally posted by Sir_Drip


     
    Originally posted by Orphes

    Originally posted by Sir_Drip



    ------

    Why are NVIDIA releasing performance gains?
     
    It's called SUPPORT! Nvidia drivers work, and they are tweaking those drivers and getting more performance out of them, not fixing broken drivers just to make them work. So the question again... Also, if you look at the release notes on those drivers you will see performance gain values (%) per game. "Do you see that with ATI drivers?" It's a simple yes or no question.
    Because it is enough to say that there is a performance increase. The percentage, alone, says nothing. It is just a fancy number, but seemingly the trick does work. Odd that it does, though, when they actually say up to x% increase, and mention that it is dependent on GPU, CPU, well, dependent on the computer hardware as well as the game settings.


    It's not a yes or no question; it's a quote taken out of context, and a question answered with another, unrelated question.
    -------
    60fps vs 72fps is of less importance.
     
    Now if you're playing a game with everything on high, no AA or AF, and only getting 60 frames, your statement above makes no sense! You would then have room to add some AA and/or AF to clean up some jagged edges in the game! I guess it has a little more importance than you thought!
    24fps is 20% more than 20fps. 72fps is 20% more than 60fps. I never said anything about high settings, never said anything about AA or AF. Heck, why didn't you imply that I was playing a game at 320x200 resolution while you were at it...

     
    What I was saying is that 20% (could be) is a large number, it is 1/5 of something, but in the end it is not as much as it seems. 24fps vs 20fps is "nothing", and if you are playing a game close to the screen refresh rate it is even less important. There is a higher chance that you notice the 4fps increase from 20fps than the 12fps increase from 60fps.


    And note that Nvidia does not tell you where their "up to" percentage increase applies...
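
    To put the arithmetic being argued here in one place, a minimal Python sketch (illustrative only; the 20fps and 60fps baselines are just the hypothetical figures used above) of how the same relative gain becomes very different absolute gains:

        # A fixed percentage gain translates into different absolute fps
        # depending on the baseline frame rate.
        def absolute_gain(baseline_fps: float, percent_gain: float) -> float:
            """Convert a relative gain into an absolute fps difference."""
            return baseline_fps * percent_gain / 100.0

        for baseline in (20, 60):
            gain = absolute_gain(baseline, 20)
            print(f"{baseline}fps + 20% = {baseline + gain:.0f}fps (only {gain:.0f}fps more)")
        # 20fps + 20% = 24fps (only 4fps more)
        # 60fps + 20% = 72fps (only 12fps more)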
     
     

    Me.



     

    "Odd that it does though when they actually say up to x% increase, and that they mention that it is dependent on GPU, CPU, well dependant on the computer hardware aswell as the game settings".

    It's odd that you think... that this is odd. There are single cores, dual cores, triple cores (junk), and quad cores. What is odd is that you would think they all have the same performance. The same goes for GPUs and SLI setups. As for game settings... Vsync caps the frames at 60fps. So why is this so hard to grasp? OMG!

    Now if you're playing a game with everything on high, no AA or AF, and only getting 60 frames, your statement above makes no sense! You would then have room to add some AA and/or AF to clean up some jagged edges in the game! I guess it has a little more importance than you thought!

     

     

    "What I was saying is that 20% (could be) is a large number, it is 1/5 of something, but in the end is is not as much as it seems."

    Well... anyone knows that 60fps is the magic number when it comes to games. This is the number you are shooting for, or at least close to it. So if you're only getting 20 frames, you'd better get a smaller display, lower the resolution, or a better GFX card! Now let's stick with the 60fps and the jump to 72fps of performance (FREE) gains. Wouldn't you say that is nice for something that was free? 2nd: if you are running a 30" display... would you want the extra power?
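
    Tying the vsync remark above to the 60-to-72fps example, a small sketch (assuming a hypothetical 60 Hz display and a simple cap at the refresh rate; real behaviour also depends on buffering):

        REFRESH_HZ = 60  # hypothetical 60 Hz display

        def displayed_fps(rendered_fps: float, vsync: bool) -> float:
            # With vsync on, the screen never shows more frames than it refreshes.
            return min(rendered_fps, REFRESH_HZ) if vsync else rendered_fps

        for fps in (60, 72):
            print(f"rendered {fps}fps -> displayed {displayed_fps(fps, vsync=True)}fps with vsync on")
        # rendered 60fps -> displayed 60fps with vsync on
        # rendered 72fps -> displayed 60fps with vsync on
        # The extra 12fps never reaches the screen, but it is headroom that can
        # be spent on AA/AF or a higher resolution before dropping below the cap.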

    And yes, the drivers have yielded more performance, and more so when using my 9800GTXs in SLI. I guess it's just me and my resolutions at 5040x1050 and 3840x1024 that get the extra power, and no one else.
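
    For context on those resolutions, a quick sketch of the pixel counts involved (the single 1680x1050 screen is my assumed baseline for comparison, not something stated in the post):

        # Pixels pushed per frame at the stated multi-monitor resolutions
        # versus an assumed single 1680x1050 screen.
        for label, w, h in [("single 1680x1050", 1680, 1050),
                            ("triple-wide 5040x1050", 5040, 1050),
                            ("triple-wide 3840x1024", 3840, 1024)]:
            print(f"{label}: {w * h / 1e6:.2f} megapixels per frame")
        # single 1680x1050: 1.76 megapixels per frame
        # triple-wide 5040x1050: 5.29 megapixels per frame
        # triple-wide 3840x1024: 3.93 megapixels per frame
        # Roughly 2-3x the pixels of a single screen, which is where a
        # driver-side gain is most likely to be felt.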

    I guess you will never understand!

     

     

     

    "24fps is 20% more than 20fps. 72fps is 20% more then 60fps. Never said anything about high setting, never said anything about AA or AF. Heck, why didn't you imply that I was playing a game at 320x200 resolution while you was at it..."

    That was an example of what the extra "FREE" performance could do for someone. At 320x200 resolution there would be no performance gain whatsoever, and I'm not going to explain it to you either!

     

     



     


  • Erowid420Erowid420 Member Posts: 93
    Originally posted by arakel


    PhysX is worthless. It's marketing hype to make you believe you need it. You don't. It is of no consequence in games that use it and has been rejected by Intel and AMD. That means it's a non-starter and will be left behind in favour of an open-source API, namely Havok.
     
    BTW, ATI has the best driver team in the industry. Monthly updates are something Nvidia cannot touch.



     

    This^^ ..!

     

    But there are several people who constantly require proof and hand-holding. So they will never understand, nor is it worth our time to educate them. I had two posts deleted in this thread that went into further detail, but I was abused for knowing better and for suggesting that people crawl out from under Nvidia's marketing.

    Perhaps, since a moderator is here, we can discuss things civilly now.

    ___________________________

    - Knowledge is power, I've been in school for 28 years!

  • DeserttFoxxDeserttFoxx Member UncommonPosts: 2,402

    It's not that PhysX was pointless; there are only 2 video card companies, ATI and Nvidia, and only 1 of them supports PhysX.

     

    It's hard for the gaming industry to get behind it because only one of the major developers supports it; if ATI had PhysX, more devs would be comfortable making their games fully support it.

     

    It's the same deal with DX10.1: Nvidia still doesn't make 10.1 cards, so devs don't make any 10.1 games.

    Those who make peaceful resolutions impossible make violent resolutions inevitable. - John F. Kennedy

    Life... is the shit that happens while you wait for moments that never come - Lester Freeman

    Lie to no one. If there's somebody close to you, you'll ruin it with a lie. If they're a stranger, who the fuck are they that you gotta lie to them? - Willie Nelson

  • Erowid420Erowid420 Member Posts: 93
    Originally posted by DeserttFoxx


    It's not that PhysX was pointless; there are only 2 video card companies, ATI and Nvidia, and only 1 of them supports PhysX.
     
    It's hard for the gaming industry to get behind it because only one of the major developers supports it; if ATI had PhysX, more devs would be comfortable making their games fully support it.
     
    It's the same deal with DX10.1: Nvidia still doesn't make 10.1 cards, so devs don't make any 10.1 games.



     

    DirectX 10.1 and Shader 4.1 are both STANDARDS...  Nvidia's PhysX is not.

    Though, it's funny that Nvidia's cards don't support the latest standards. Go figure.

     

    ___________________________

    - Knowledge is power, I've been in school for 28 years!

  • viralzviralz Member Posts: 78

    Sigh. Fine, I will break it down for you. All the products with physics, CrossFire, SLI, "gamer", "Fatality", or other such nomenclature are specifically designed to appeal to those with more money than common sense. Oh, you will have a small increase in performance... but who cares? The human eye cannot distinguish 30fps from 120fps anyway. Enjoy spending $1000 on video cards to get an extra 15fps in Crysis, a good benchmark but a crappy game.
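
    Taking the numbers in that rant at face value (they are the poster's hypothetical figures, not a benchmark), the cost-per-frame arithmetic looks like this:

        # Hypothetical figures from the post above: $1000 of extra GPU spend for +15fps.
        extra_cost_usd = 1000
        extra_fps = 15
        print(f"${extra_cost_usd / extra_fps:.2f} per additional frame per second")
        # ~$66.67 per extra fps, which is the "more money than common sense" point.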


  • The user and all related content has been deleted.

    "Freedom is just another name for nothing left to lose" - Janis Joplin

  • noquarternoquarter Member Posts: 1,170


    Originally posted by viralz
    Sigh. Fine, I will break it down for you. All the products with physics, CrossFire, SLI, "gamer", "Fatality", or other such nomenclature are specifically designed to appeal to those with more money than common sense. Oh, you will have a small increase in performance... but who cares? The human eye cannot distinguish 30fps from 120fps anyway. Enjoy spending $1000 on video cards to get an extra 15fps in Crysis, a good benchmark but a crappy game.


    I've always known Xs and 1s make things faster, that's common sense.

  • viralzviralz Member Posts: 78
    Originally posted by noquarter


     

    Originally posted by viralz

    Sigh. Fine, I will break it down for you. All the products with physics, CrossFire, SLI, "gamer", "Fatality", or other such nomenclature are specifically designed to appeal to those with more money than common sense. Oh, you will have a small increase in performance... but who cares? The human eye cannot distinguish 30fps from 120fps anyway. Enjoy spending $1000 on video cards to get an extra 15fps in Crysis, a good benchmark but a crappy game.

     



    I've always known Xs and 1s make things faster, that's common sense.

     

    Lol, then good! Don't waste money by supporting schemes like SLI/CF. The more people buy into that crap, the more they will market stupid crap instead of quality products.


  • Sir_DripSir_Drip Member Posts: 133
    Originally posted by Orphes


     

    Originally posted by Sir_Drip


    Originally posted by Jackcolt
     
    ATI also write their performance increases in %:

    www2.ati.com/relnotes/Catalyst_93_release_notes.pdf

     

     

    Well, it's about F'n time someone answered the question from 5 pages ago. What gets me is that out of all you guys that use ATI hardware... only one of you read the release notes for your drivers. SAD! Though it's only for one game (Lost Planet), they do show it in % gains and for different cards.

    Hey Orphen/ Orphes.... you can go wash the shiit off your face now! LOL!




    I asked why Nvidia is releasing their performance gains; I also asked you why you are using it as an argument to say NVIDIA > ATI... Obviously that was a lie from your side. My fault was that I trusted you on that. Now when you are called on it, you turn it into your favour... seriously. You were the only one saying and implying what ATI doesn't. :S



    I also said that their percentage means nothing.



    Fabricate all the lies you want, act like a child arguing in a sandbox for all I care. :S



    This is my post; try reading it again, again and again until you actually understand what is written in it.



    -------------

    ATI updates their drivers once a month.

     

    Why are NVIDIA releasing performance gains? It's likely that those are optimal numbers based on their setup, which one has to assume is as optimal as it can be. Is it realistic to expect to reach their "up to" performance gain? For whom?

    And I don't go looking at a chart that says 20fps vs 24fps and then go, "oh, a 20% performance increase". I rather go, "oh, only a 4fps difference". 60fps vs 72fps is of less importance.

    Why say that ATI doesn't increase performance with their drivers?

    It's great that there is something like PhysX. If it is as godlike as it seems to be described, no one wins from it being supported by only one manufacturer.

    ---------------

     

    Instead of misquoting, taking things out of context, and making it look like something it isn't.

     



    Because it is enough to say that there is a performance increase. The percentage, alone, says nothing. It is just a fancy number, but seemingly the trick does work. Odd that it does, though, when they actually say up to x% increase, and mention that it is dependent on GPU, CPU, well, dependent on the computer hardware as well as the game settings.

     

    Which you turned into meaning something else, when I was pointing out the very thing you imply I would not understand or know. Like this:

     




    Odd that it does, though, when they actually say up to x% increase, and mention that it is dependent on GPU, CPU, well, dependent on the computer hardware as well as the game settings.

    It's odd that you think... that this is odd. There are single cores, dual cores, triple cores (junk), and quad cores. What is odd is that you would think they all have the same performance. The same goes for GPUs and SLI setups. As for game settings... Vsync caps the frames at 60fps. So why is this so hard to grasp? OMG!

     



     

    Don't forget about this one...


  • OrphesOrphes Member UncommonPosts: 3,039
    Originally posted by Sir_Drip

    Originally posted by Orphes


     

    Originally posted by Sir_Drip


    Originally posted by Jackcolt
     
    ATI also write their performance increases in %:

    www2.ati.com/relnotes/Catalyst_93_release_notes.pdf

     

     

    Well, it's about F'n time someone answered the question from 5 pages ago. What gets me is that out of all you guys that use ATI hardware... only one of you read the release notes for your drivers. SAD! Though it's only for one game (Lost Planet), they do show it in % gains and for different cards.

    Hey Orphen/ Orphes.... you can go wash the shiit off your face now! LOL!




    I asked why Nvidia is releasing their performance gains; I also asked you why you are using it as an argument to say NVIDIA > ATI... Obviously that was a lie from your side. My fault was that I trusted you on that. Now when you are called on it, you turn it into your favour... seriously. You were the only one saying and implying what ATI doesn't. :S



    I also said that their percentage means nothing.



    Fabricate all the lies you want, act like a child arguing in a sandbox for all I care. :S



    This is my post; try reading it again, again and again until you actually understand what is written in it.



    -------------

    ATI updates their drivers once a month.

     

    Why are NVIDIA releasing performance gains? It's likely that those are optimal numbers based on their setup, which one has to assume is as optimal as it can be. Is it realistic to expect to reach their "up to" performance gain? For whom?

    And I don't go looking at a chart that says 20fps vs 24fps and then go, "oh, a 20% performance increase". I rather go, "oh, only a 4fps difference". 60fps vs 72fps is of less importance.

    Why say that ATI doesn't increase performance with their drivers?

    It's great that there is something like PhysX. If it is as godlike as it seems to be described, no one wins from it being supported by only one manufacturer.

    ---------------

     

    Instead of misquoting, taking things out of context, and making it look like something it isn't.

     



    Because it is enough to say that there is a performance increase. The percentage, alone, says nothing. It is just a fancy number, but seemingly the trick does work. Odd that it does, though, when they actually say up to x% increase, and mention that it is dependent on GPU, CPU, well, dependent on the computer hardware as well as the game settings.

     

    Which you turned into meaning something else, when I was pointing out the very thing you imply I would not understand or know. Like this:

     




    Odd that it does, though, when they actually say up to x% increase, and mention that it is dependent on GPU, CPU, well, dependent on the computer hardware as well as the game settings.

    It's odd that you think... that this is odd. There are single cores, dual cores, triple cores (junk), and quad cores. What is odd is that you would think they all have the same performance. The same goes for GPUs and SLI setups. As for game settings... Vsync caps the frames at 60fps. So why is this so hard to grasp? OMG!

     



     

    Don't forget about this one...

     

    Ok...

    Let's reiterate it, as per your request.

    1. I am saying that the percentage is worthless because of what you are acknowledging with this quote from you, as you admit that there are different hardware setups. And you are also saying that I am saying the exact opposite of what I said in the first place. I have said all along that the difference in hardware setup makes the performance increase vary.

    "There are single cores, Dual cores, 3 core (junk), and Quad cores. What is odd is that you would think they all have the same performances? Aswell with GPU's, and SLI setups."

    2. I am saying that it is odd that this marketing scheme works, to the degree that people use it in an argument. You, on the other hand, misquote me and say that I think it is odd that there is different hardware. The quote from me, and how you misquoted it, is shown in the quoting above.

    "The percentage, alone, says nothing. It is just a fancy number, but seemingly the trick does work. Odd that it does though when they actually say up to x% increase, and that they mention that it is dependent on GPU, CPU, well dependant on the computer hardware aswell as the game settings."

     

    3. It does not matter that much that a performance increase is 12fps at 60fps when it is /only/ 4fps at whatever resolution I want to play the game at, hardware not mentioned. But then again, the hardware is not specified in the first place.

    4. Repost this all you want. It still does not change what I said in my first post on this matter: that the percentage doesn't mean anything, as they are not providing the hardware, and that one in general cannot expect to get the stated performance increase (see the sketch below).

    5. A lot of your posts in this thread were removed because of flaming, trolling and the like. I think that both of you need to learn to socialise with people instead of being asshats. That's a friendly and honest tip. Your reposting of this makes your motive for being on this forum pretty clear: you are not here to discuss and share views on things, the apparent reason is to troll/flame and win the internets.
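
    To illustrate point 4 with a rough Python sketch (hypothetical numbers only; per point 4, the notes state neither the baseline fps nor the hardware behind an "up to" figure):

        # An "up to 20%" claim, with no baseline fps or hardware specified,
        # is compatible with very different real-world outcomes.
        UP_TO_PERCENT = 20  # hypothetical headline figure from a release note

        for baseline in (15, 30, 60, 120):  # fps on unknown hardware/settings
            best_case = baseline * UP_TO_PERCENT / 100
            print(f"baseline {baseline:>3}fps: gain is anywhere from ~0 to {best_case:.0f}fps")
        # Without knowing the baseline, or the hardware that produces it, the
        # headline percentage pins down almost nothing.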

    I'm so broke. I can't even pay attention.
    "You have the right not to be killed"

  • Sir_DripSir_Drip Member Posts: 133
    Originally posted by Sir_Drip


     
    Originally posted by Orphes

    Originally posted by Sir_Drip



    ------

    Why are NVIDIA releasing performance gains?
     
    It's called SUPPORT! Nvidia drivers work, and they are tweaking those drivers and getting more performance out of them, not fixing broken drivers just to make them work. So the question again... Also, if you look at the release notes on those drivers you will see performance gain values (%) per game. "Do you see that with ATI drivers?" It's a simple yes or no question.
    Because it is enough to say that there is a performance increase. The percentage, alone, says nothing. It is just a fancy number, but seemingly the trick does work. Odd that it does, though, when they actually say up to x% increase, and mention that it is dependent on GPU, CPU, well, dependent on the computer hardware as well as the game settings.


    It's not a yes or no question; it's a quote taken out of context, and a question answered with another, unrelated question.
    -------
    60fps vs 72fps is of less importance.
     
    Now if you're playing a game with everything on high, no AA or AF, and only getting 60 frames, your statement above makes no sense! You would then have room to add some AA and/or AF to clean up some jagged edges in the game! I guess it has a little more importance than you thought!
    24fps is 20% more than 20fps. 72fps is 20% more than 60fps. I never said anything about high settings, never said anything about AA or AF. Heck, why didn't you imply that I was playing a game at 320x200 resolution while you were at it...

     
    What I was saying is that 20% (could be) is a large number, it is 1/5 of something, but in the end it is not as much as it seems. 24fps vs 20fps is "nothing", and if you are playing a game close to the screen refresh rate it is even less important. There is a higher chance that you notice the 4fps increase from 20fps than the 12fps increase from 60fps.


    And note that Nvidia does not tell you where their "up to" percentage increase applies...
     
     

    Me.



     

    "Odd that it does though when they actually say up to x% increase, and that they mention that it is dependent on GPU, CPU, well dependant on the computer hardware aswell as the game settings".

    It's odd that you think... that this is odd. There are single cores, dual cores, triple cores (junk), and quad cores. What is odd is that you would think they all have the same performance. The same goes for GPUs and SLI setups. As for game settings... Vsync caps the frames at 60fps. So why is this so hard to grasp? OMG!

    Now if you're playing a game with everything on high, no AA or AF, and only getting 60 frames, your statement above makes no sense! You would then have room to add some AA and/or AF to clean up some jagged edges in the game! I guess it has a little more importance than you thought!

     

     

    "What I was saying is that 20% (could be) is a large number, it is 1/5 of something, but in the end is is not as much as it seems."

    Well... anyone knows that 60fps is the magic number when it comes to games. This is the number you are shooting for, or at least close to it. So if you're only getting 20 frames, you'd better get a smaller display, lower the resolution, or a better GFX card! Now let's stick with the 60fps and the jump to 72fps of performance (FREE) gains. Wouldn't you say that is nice for something that was free? 2nd: if you are running a 30" display... would you want the extra power?

    And yes, the drivers have yielded more performance, and more so when using my 9800GTXs in SLI. I guess it's just me and my resolutions at 5040x1050 and 3840x1024 that get the extra power, and no one else.

    I guess you will never understand!

     

     

     

    "24fps is 20% more than 20fps. 72fps is 20% more then 60fps. Never said anything about high setting, never said anything about AA or AF. Heck, why didn't you imply that I was playing a game at 320x200 resolution while you was at it..."

    That was an example of what the extra "FREE" performance could do for someone. At 320x200 resolution there would be no performance gain whatsoever, and I'm not going to explain it to you either!

     

     

    .

     


  • Sir_DripSir_Drip Member Posts: 133
    Originally posted by Sir_Drip

    Originally posted by Orphes


     

    Originally posted by Sir_Drip


    Originally posted by Jackcolt
     
    ATI also write their performance increases in %:

    www2.ati.com/relnotes/Catalyst_93_release_notes.pdf

     

     

    Well, it's about F'n time someone answered the question from 5 pages ago. What gets me is that out of all you guys that use ATI hardware... only one of you read the release notes for your drivers. SAD! Though it's only for one game (Lost Planet), they do show it in % gains and for different cards.

    Hey Orphen/ Orphes.... you can go wash the shiit off your face now! LOL!




    I asked why Nvidia is releasing their performance gains; I also asked you why you are using it as an argument to say NVIDIA > ATI... Obviously that was a lie from your side. My fault was that I trusted you on that. Now when you are called on it, you turn it into your favour... seriously. You were the only one saying and implying what ATI doesn't. :S



    I also said that their percentage means nothing.



    Fabricate all the lies you want, act like a child arguing in a sandbox for all I care. :S



    This is my post; try reading it again, again and again until you actually understand what is written in it.



    -------------

    ATI updates their drivers once a month.

     

    Why are NVIDIA releasing performance gains? It's likely that those are optimal numbers based on their setup, which one has to assume is as optimal as it can be. Is it realistic to expect to reach their "up to" performance gain? For whom?

    And I don't go looking at a chart that says 20fps vs 24fps and then go, "oh, a 20% performance increase". I rather go, "oh, only a 4fps difference". 60fps vs 72fps is of less importance.

    Why say that ATI doesn't increase performance with their drivers?

    It's great that there is something like PhysX. If it is as godlike as it seems to be described, no one wins from it being supported by only one manufacturer.

    ---------------

     

    Instead of misquoting, taking things out of context, and making it look like something it isn't.

     



    Because it is enough to say that there is a performance increase. The percentage, alone, says nothing. It is just a fancy number, but seemingly the trick does work. Odd that it does, though, when they actually say up to x% increase, and mention that it is dependent on GPU, CPU, well, dependent on the computer hardware as well as the game settings.

     

    Which you turned into meaning something else, when I was pointing out the very thing you imply I would not understand or know. Like this:

     




    Odd that it does, though, when they actually say up to x% increase, and mention that it is dependent on GPU, CPU, well, dependent on the computer hardware as well as the game settings.

    It's odd that you think... that this is odd. There are single cores, dual cores, triple cores (junk), and quad cores. What is odd is that you would think they all have the same performance. The same goes for GPUs and SLI setups. As for game settings... Vsync caps the frames at 60fps. So why is this so hard to grasp? OMG!

     



     

    Don't forget about this one...



     

    .


  • SurfriderSurfrider Member UncommonPosts: 302

    Locked:  Obviously the conversation is finished.

This discussion has been closed.