
PhysX was pointless...?

Dreadknot357 Member Posts: 148

Someone said PhysX was pointless... that gaming companies rarely use it...

They said not to buy Nvidia, go ATI... (in short).

Yeah, I will be staying with Nvidia.

Below is a partial listing of current and upcoming AGEIA PhysX-accelerated titles available for PC and console owners alike. PC titles can take advantage of the PhysX Accelerator to provide an enhanced gaming experience.

Titles I think make Nvidia worth buying are in red, but that's just me...

ALL GAMES ON THIS LIST HAVE PHYSX

Game Title Developer Platform

2 Days to Vegas Steel Monkeys PC

Adrenalin 2: Rush Hour Gaijin Entertainment PC

Age of Empires III Destineer Studios PC, Mac

Age of Empires III: The WarChiefs Destineer Studios Mac

Alpha Prime Black Element Software PC

Auto Assault Net Devil PC

Backbreaker Natural Motion TBA

B.A.S.E. Jumping Digital Dimension Development PC

Bet on Soldier: Blackout Saigon Kylotonn Entertainment PC

Bet on Soldier: Blood of Sahara Kylotonn Entertainment PC

Bet on Soldier: Blood Sport Kylotonn Entertainment PC

Beowulf Ubisoft PS3, X360

Captain Blood Akella PC, X360

Cellfactor: Combat Training Artificial Studios, Immersion Games PC

Cellfactor: Revolution Artificial Studios, Immersion Games PC



Crazy Machines II FAKT Software PC

Cryostasis Action Forms PC

Dark Physics The Game Creators PC

Desert Diner Tarsier Studios PC

Dragonshard Atari PC

Dusk 12 Orion PC

Empire Above All IceHill PC

Empire Earth III Mad Dog Software PC





Fury Auran Games PC

Gears Of War Epic Games PC, X360

Gluk'Oza: Action GFI Russia PC

Gothic 3 Piranha Bytes PC

Gunship Apocalypse FAKT Software PC

Heavy Rain Quantic Dream PC

 

Hunt, The Orion PC



Infernal Metropolis Software PC

Inhabited island: Prisoner of Power Orion PC

Joint Task Force Most Wanted Entertainment PC

KumaWAR Kuma Reality Games PC

Magic Ball 3 Alawar Entertainment PC

Mass Effect BioWare PC, X360

Medal of Honor: Airborne EA Los Angeles PC, X360

Metro 2033 4A Games PC

Monster Madness: Battle for Suburbia Artificial Studios PC, X360

Monster Truck Maniax Legendo Entertainment PC

Myst Online: URU Live Cyan Worlds PC

Open Fire BlueTorch Studios PC

Paragraph 78 Gaijin Entertainment PC



PT Boats: Knights of the Sea Akella PC

Rail Simulator Kuju Entertainment Ltd PC

Rise Of Nations: Rise Of Legends Big Huge Games PC

Roboblitz Naked Sky Entertainment PC, X360

Sacred 2 ASCARON Entertainment PC

Sherlock Holmes: The Awakened Frogwares Game Development Studio PC

Showdown: Scorpion B-COOL Interactive PC

Silverfall Monte Cristo PC

Sovereign Symphony Ceidot Game Studios PC

Speedball 2 Kylotonn Entertainment PC

Stalin Subway, The Orion PC

Stoked Rider: Alaska Alien Bongfish Interactive Entertainment PC

Switchball Atomic Elbow PC

Tension Ice-pick Lodge PC

Tom Clancy's Ghost Recon Advanced Warfighter GRIN PC, X360

Tom Clancy's Ghost Recon Advanced Warfighter 2 GRIN, Ubisoft Paris PC, X360

Tom Clancy's Rainbow Six Vegas Ubisoft Montreal PC, PS3, X360

Tom Clancy's Splinter Cell: Double Agent (multiplayer) Ubisoft Shanghai PC, X360

Tortuga: Two Treasures Ascaron Entertainment PC

Two Worlds Reality Pump PC

Ultra Tubes Eipix PC

Unreal Tournament 3 Epic Games PC, PS3, X360

Unreal Tournament 3: Extreme Physics Mod Epic Games PC

Warfare GFI Russia PC

Warmonger: Operation Downtown Destruction Net Devil PC

W.E.L.L. Online Sibilant Interactive PC

Winterheart's Guild Zelian Games PC, X360

WorldShift Black Sea Studios PC

Mirror's Edge DICE PC (thanks to Synthetick)

Red Faction PC

And more important: Unreal Engine 3 and CryEngine 3, which are two newer engines that a lot of MMOs are running on or considering.

Some of the MMOs

Tera Bluehole Studio............OMG

Blade and Soul...........................OMG

Mortal Online,............................OMG

Global Agenda, 

 Fallen Earth Icarus Studios PC

Pirates of the Burning Sea Flying Lab Software PC

Huxley Webzen, Inc PC, X360 .................................OMG

Hero's Journey Simutronics PC

City of Villains Cryptic Studios PC

Entropia Universe MindArk PC

 

And more to come... it's the future.

 

Add more if you know any

 

About Unreal Engine 3 PhysX: http://www.unrealtechnology.com/technology.php

Powered by NVIDIA PhysX.

Rigid body physics system supporting player interaction with physical game objects, ragdoll character animation, complex vehicles, and dismemberable objects.

Cloth simulation.

Soft body simulation.

'Physical Material' system that allows per-object or per-surface properties such as friction, sounds and effects.

Physics-driven sound.

Fully integrated support for physics-based vehicles, including player control, AI, and networking.

Gameplay-driven physical animation – capable of driving physics based on animation, and blending the results in many ways.

Unreal PhAT, the visual physics modeling tool built into UnrealEd that supports creation of optimized collision primitives for models and skeletal animated meshes; constraint editing; and interactive physics simulation and tweaking in-editor.

Fracture tool in UnrealEd allows you to take a mesh and break it into pieces.
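To give a feel for what the rigid-body side of this list looks like at the code level, here is a minimal sketch written from memory in the style of the AGEIA/NVIDIA PhysX 2.x SDK that UE3-era games shipped with (exact class and constant names may differ between SDK versions): create the SDK, make a scene with gravity, drop one dynamic box into it, and step the simulation.

#include <cstdio>
#include "NxPhysics.h"

// Minimal PhysX 2.x-style sketch: one scene, one falling box, one second of simulation.
int main() {
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!sdk) { std::printf("PhysX init failed\n"); return 1; }

    // Scene with standard gravity.
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    NxScene* scene = sdk->createScene(sceneDesc);

    // One dynamic box (1x1x1 half-extents) dropped from 10 units up.
    NxBodyDesc bodyDesc;
    NxBoxShapeDesc boxDesc;
    boxDesc.dimensions = NxVec3(1.0f, 1.0f, 1.0f);
    NxActorDesc actorDesc;
    actorDesc.shapes.pushBack(&boxDesc);
    actorDesc.body = &bodyDesc;
    actorDesc.density = 10.0f;
    actorDesc.globalPose.t = NxVec3(0.0f, 10.0f, 0.0f);
    NxActor* box = scene->createActor(actorDesc);

    // Step the simulation at 60 Hz for one second.
    for (int i = 0; i < 60; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->flushStream();
        scene->fetchResults(NX_RIGID_BODY_FINISHED, true);
    }
    NxVec3 p = box->getGlobalPosition();
    std::printf("box fell to (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);

    sdk->releaseScene(*scene);
    NxReleasePhysicsSDK(sdk);
    return 0;
}

Roughly speaking, the ragdolls, vehicles, cloth and fracture pieces listed above all hang off the same scene-and-actor setup this sketch creates.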

 

About CryENGINE 3: http://www.crytek.com/technology/cryengine-3/specifications

Video:  http://www.gamespot.com/pc/action/crysis/video/6206970/gdc-2009-crysis-warhead-cryengine-3-demo-video

Demo:  http://www.fileplanet.com/197606/190000/fileinfo/CryENGINE-3-Trailer-(HD)

CryENGINE® 3 is the first Xbox 360™, PlayStation® 3, MMO, DX9 and DX10 all-in-one game development solution that is next-gen ready – with scalable computation and graphics technologies. With CryENGINE® 3 you can start the development of your next generation games today. CryENGINE® 3 is the only solution that provides multi-award winning graphics, physics and AI out of the box. The complete game engine suite includes the famous CryENGINE® 3 Sandbox™ editor, a production-proven, 3rd generation tool suite designed and built by AAA developers. CryENGINE® 3 delivers everything you need to create your AAA games.



•Integrated Multi-threaded High Performance Physics Engine

CryENGINE® 3 physics can be applied to almost everything in a game world, including buildings, props, trees and vegetation, to realistically model reactions to forces such as: wind currents, explosions, gravity, friction and collisions with other objects, without the need of external middleware.



•Interactive & Destructible Environments

All environments in CryENGINE® 3 can be dynamically physicalized, regardless of their nature (wood, steel, concrete, natural vegetation, cloth, etc.). This allows procedural destruction of as much of the environment as the game requires. All broken objects or parts can be interactive, with realistic properties such as mass, buoyancy, etc. applied to the debris.



•Advanced Rope Physics

Bendable vegetation which responds to wind, rain or character movement, realistically interactive rope bridges, and physically driven creature tentacle animations are just some of the uses to which we’ve put our rope physics technology.
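Rope and vegetation effects like the ones described above are commonly implemented as a chain of particles linked by fixed-length constraints and advanced with Verlet integration. The snippet below is a self-contained toy version of that general technique in C++, not CryENGINE code; the particle count and constants are made up.

#include <cmath>
#include <cstdio>
#include <vector>

// Toy rope: particles connected by distance constraints, integrated with Verlet steps.
struct Particle { float x, y, oldx, oldy; bool pinned; };

int main() {
    const int   N       = 20;       // number of particles in the rope
    const float restLen = 0.5f;     // rest distance between neighbours
    const float dt      = 1.0f / 60.0f;

    std::vector<Particle> rope(N);
    for (int i = 0; i < N; ++i)
        rope[i] = { i * restLen, 0.0f, i * restLen, 0.0f, i == 0 };  // pin the first particle

    for (int step = 0; step < 600; ++step) {
        // 1) Verlet integration with gravity.
        for (auto& p : rope) {
            if (p.pinned) continue;
            float vx = p.x - p.oldx, vy = p.y - p.oldy;
            p.oldx = p.x; p.oldy = p.y;
            p.x += vx;
            p.y += vy - 9.81f * dt * dt;
        }
        // 2) Relax the distance constraints a few times so the rope keeps its length.
        for (int iter = 0; iter < 8; ++iter) {
            for (int i = 0; i + 1 < N; ++i) {
                Particle& a = rope[i];
                Particle& b = rope[i + 1];
                float dx = b.x - a.x, dy = b.y - a.y;
                float len = std::sqrt(dx * dx + dy * dy);
                if (len < 1e-6f) continue;
                float diff = (len - restLen) / len * 0.5f;
                if (!a.pinned) { a.x += dx * diff; a.y += dy * diff; }
                if (!b.pinned) { b.x -= dx * diff; b.y -= dy * diff; }
            }
        }
    }
    std::printf("rope tip hangs at (%.2f, %.2f)\n", rope[N - 1].x, rope[N - 1].y);
    return 0;
}

CryENGINE's actual implementation is its own, but this constraint-chain idea is the common core behind rope bridges, vines and tentacles.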



 

"Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..

Comments

  • WisebutCruel Member Posts: 1,089

    Aye, stick with Nvidia. PhysX isn't pointless, it's just taken a while to really catch on, mainly due to people not having the hardware to support it. Most gamers now do, so the industry is developing for it at a steadier pace.

    Also, PhysX started out as a proprietary card and software, and most people saw no need for another expensive card in their system that really had no use given the few games that supported it. Once Nvidia bought PhysX and incorporated it into its graphics cards, popularity picked up as well.

  • Grunty Member EpicPosts: 8,657

    It's Nvidia PhysX now. Ageia was bought out by Nvidia a couple of years ago.

    "I used to think the worst thing in life was to be all alone.  It's not.  The worst thing in life is to end up with people who make you feel all alone."  Robin Williams
  • WisebutCruel Member Posts: 1,089
    Originally posted by grunty


    It's Nvidia PhysX now. Ageia was bought out by Nvidia a couple of years ago.

    I already said that. You lose 5 forum points and get a slap with a limp trout.

  • Quizzical Member LegendaryPosts: 25,531

    Paying $500 for a physics card for a machine that would be obsolete by the time games you wanted to play could make much use of it was, indeed, pointless.

  • Loke666 Member EpicPosts: 21,441

    I stay with Nvidia myself, not just for PhysX but also because I don't like ATI's drivers.

    ATI makes very good hardware but has always been bad at making drivers that use the cards' potential; Nvidia's drivers generally work a lot better and need a lot less tweaking.

    Of course they also cost more, so I understand that ATI cards are the best choice for many people. If you are good at tweaking stuff and have a limited budget, then ATI can be a good alternative.

  • Dreadknot357 Member Posts: 148
    Originally posted by Quizzical


    Paying $500 for a physics card for a machine that would be obsolete by the time games you wanted to play could make much use of it was, indeed, pointless.



     

    Ahhh, what are you talking about... we are talking about Nvidia GPUs that now run PhysX... lol, not PhysX cards.

    Clueless.

    And what machine are you talking about? Obsolete? All games are looking to run PhysX now... stay with the two cans and a string instead of the cell phone, right?

    And waiting for a game? That is a list of games you can play now and in the next year...

    And who cares if it was bought out by Nvidia? It does not change the programming or how Nvidia uses it... lol.

    Try again.

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..

  • Synthetick Member Posts: 977

    I'm not sweating missing out on PhysX with my ATI, assuming the drivers ever get completed. I'll continue to use my ATI, I believe.

    http://www.ngohq.com/graphic-cards/15245-about-physx-with-ati.html



    Also, Mirror's Edge supports PhysX.

     


  • Copeland Member Posts: 1,955

    I'm pretty sure that Windows 7 will allow the PhysX drivers to be run with both ATI and Nvidia cards. Nvidia isn't trying to monopolize PhysX. I'm not sure if we'll see a benefit, since on Nvidia it runs on the GPU, and I think in Windows 7, for it to work with ATI, it would run via a virtual GPU. Apparently the virtual GPU will also allow non-DX10-compatible cards to run DX10. I haven't played with the Windows 7 beta yet, so I don't know if any of this stuff is working yet. Probably not, but I do recall reading an article about it somewhere.

  • Dreadknot357 Member Posts: 148
    Originally posted by Copeland


    I'm pretty sure that Windows 7 will allow the PhysX drivers to be run with both ATI and Nvidia cards. Nvidia isn't trying to monopolize PhysX. I'm not sure if we'll see a benefit being that it will run in Nvidia GPU's and i think in Windows 7 for it to work with ATI it will be running via virtual GPU. Apparently the virtual GPU will also allow non dx10 compatible cards to run dx10. I haven't played with the Windows 7 beta yet so i don't know if any of this stuff is working yet. Probably not but i do recall reading an article about it somewhere.

    I know that Nvidia is not trying to keep PhysX from ATI, as far as I know...

    But Windows 7 running it for ATI? I don't think so... I could be wrong.

    The DX10 thing you're talking about, I think, is that DX11 cards support DX10 through Windows 7... I also read about it but didn't learn much; I was looking for something different at the time.

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..

  • Synthetick Member Posts: 977
    Originally posted by Copeland


    I'm pretty sure that Windows 7 will allow the PhysX drivers to be run with both ATI and Nvidia cards. Nvidia isn't trying to monopolize PhysX. I'm not sure if we'll see a benefit being that it will run in Nvidia GPU's and i think in Windows 7 for it to work with ATI it will be running via virtual GPU. Apparently the virtual GPU will also allow non dx10 compatible cards to run dx10. I haven't played with the Windows 7 beta yet so i don't know if any of this stuff is working yet. Probably not but i do recall reading an article about it somewhere.



    There are ATI drivers being made now that aren't restricted to Windows 7 to work. I honestly haven't read anything about Windows 7 solving the issue; there's some in-depth discussion on what's going on with ATI getting PhysX. I just browsed through it to make sure I was going to be able to use it soon on my ATI, and really, hardware and everything associated with it isn't my cup of tea (unless we're talking music equipment), so I could be mistaken.


  • Copeland Member Posts: 1,955


    Originally posted by Dreadknot357
    Originally posted by Copeland I'm pretty sure that Windows 7 will allow the PhysX drivers to be run with both ATI and Nvidia cards. Nvidia isn't trying to monopolize PhysX. I'm not sure if we'll see a benefit being that it will run in Nvidia GPU's and i think in Windows 7 for it to work with ATI it will be running via virtual GPU. Apparently the virtual GPU will also allow non dx10 compatible cards to run dx10. I haven't played with the Windows 7 beta yet so i don't know if any of this stuff is working yet. Probably not but i do recall reading an article about it somewhere.
    i know that Nvidia is not tring to Keep PhysX from ATI  that i know of...
     
    But the Windows 7 running it for ATI?  i dont think so...I could be wrong
    The DX10 thing your talking about. is that DX11 Cards support DX10 throught Windows 7 i think....I also read about it i didnt learn much tho....  i was looking for something diffrent when i was looking.

    Nope, I am specifically talking about the functionality of the Windows 7 virtual GPU. PhysX, although integrated into the Nvidia GPUs, is also a separate set of drivers. These drivers can be used on the virtual GPU. Since it's a virtual GPU, it will be able to make use of DX10 even if your card is an older DX9 model.

    When I first read the article I was thinking, "well, that's kind of stupid, taking the functionality back from the card's GPU and putting that burden back on the CPU," but then I started to think about it and I can see how that functionality could actually help low-end PCs be more compatible. A low-end Core i7 PC with a low-end video card could actually benefit cheaply from the virtual GPU, given enough RAM.

    We'll see.

  • WisebutCruel Member Posts: 1,089
    Originally posted by Dreadknot357

    Originally posted by Quizzical


    Paying $500 for a physics card for a machine that would be obsolete by the time games you wanted to play could make much use of it was, indeed, pointless.



     

    ahhhh what are you talking about...we are talking about Nivida GPU's that now use PhysX .....lol  not PhysX Cards

    Clueless 

    and what Machine are you talking about?  Obsolete   All games are looking to Runn Physx now....Stay with the 2 cans and string  instead of the Cell Phone...right?

    and waiting for a game?  that is a list of games you can play now  and in the next year...... 

    and who cares if it was bought out by nividia  it does not change the programing or how Nivida uses it.....lol

    try again

    He was referring to before it became integrated by Nvidia. Which is why he said "was, indeed, pointless".

  • Cleffy Member RarePosts: 6,414

    I don't really count PhysX as worth it on a GPU in its current incarnation. The nVidia architecture itself is not really powerful enough to take full advantage of both physics and graphics acceleration; something will have to be sacrificed to do it unless you use SLI. I think ATI's route has been thought out a bit more, with Havok and OpenCL support. By supporting an open architecture they were able to concentrate on their hardware. In the case of ATI versus nVidia hardware, it's completely dependent on whether companies bothered to use the power of ATI in their engines, which is the only real triumph for nVidia: they have a wider developer network.

    The truth is ATI has had the most powerful GPU since the HD 2900; it just was not supported by game developers and still isn't. In fact an ATI card can perform up to 300% better than even the GTX 295 simply because of its progressive hardware, and it has been able to support real-time raytracing for three years now. It's nVidia at this time holding the gaming industry back with non-competitive hardware and using its developer network to deny the actual power of an ATI card.

  • Dreadknot357 Member Posts: 148
    Originally posted by Cleffy


    I don't really count PhysX as worth it on a GPU in its current incarnation.  The nVidia architecture itself is not really powerful enough to take full advantage of both physics and graphics acceleration, something will have to be sacraficed to do it unless you use SLI.  I think ATIs route has been thought out a bit more with Havok and OpenCL support.  By supporting an open architecture they were able to concentrate on their hardware.  In the case of ATI verse nVidia hardware, its completely dependant on if companies bothered to use the power of ATI on their engines which is the only real triumph to nVidia.  They have a wider developer network.
    The truth is ATI had the most powerful GPU since the HD2800, it just was not supported by game developers and still isn't.  Infact an ATI card can perform up to 300% better then even the GTX 295 simply because of its progressive hardware and has been able to support Real Time Raytracing for 3 years now.  Its nVidia at this time holding the gaming industry back with non-competitive hardware and using its developers network to deny the actual power of an ATI card.



     

    "I don't really count PhysX as worth it on a GPU in its current incarnation. The nVidia architecture itself is not really powerful enough to take full advantage of both physics and graphics acceleration, something will have to be sacraficed to do it unless you use SLI"

    Well Cleffy, please don't be another one of these "copy and paste" guys... running low-end hardware while posting about what people can and can't do. I'm running out of steam responding to people like that...

    That being said, this statement is wrong. I am running one of the most demanding PhysX games without SLI on full max settings at 1920 x 1200 on a 26" monitor. Oh yeah, it is pushing my system, but it's just fine.

    A lot of people are afraid of what the future holds for some reason. Every month there is new tech, and either you all say "it can't be done" or "no one will ever use that"... why? Must everything be a threat to your own PC? If it's not the value or brand of your own hardware, you all shit on it? I've really come to believe it's this simple fact: "you all hate on what you can't have."

    I'm posting facts to help people out... two of the most hyped games on this site, "Fallen Earth" and "Global Agenda," run on Unreal Engine 3. With PhysX they will play the way they were made, and I don't think a lot of people know this.

    People always end up upgrading to play a game that's coming out. And to play these next-gen games they will need Nvidia (as things stand now). Of course it's possible for ATI to figure out a way too. Right now they can't.

    You're posting theories and hypotheses from unconfirmed sources, when I'm posting facts. All you are doing is making people doubt, based on propaganda. Have you played a PhysX game on a system that can run it? If not, why are you here posting against it? Sounds like ATI hate-mongering to me.

     

    "The truth is ATI had the most powerful GPU since the HD2800, it just was not supported by game developers and still isn't. Infact an ATI card can perform up to 300% better then even the GTX 295 simply because of its progressive hardware and has been able to support Real Time Raytracing for 3 years now. Its nVidia at this time holding the gaming industry back with non-competitive hardware and using its developers network to deny the actual power of an ATI card."



     

    So ATI is the strongest tech, but it can't be used because it's blocked into being second best by Nvidia...? So that would make the logical human go Nvidia, right?

    A guy (Nvidia, doing the blocking) is stealing Jim's car (games). Jim has a .357 Magnum (Nvidia GPU), and Jim's friend has a .50 cal (ATI) in the trunk, blocked by the guy. Jim's friend tells him that IF he could get to the trunk and grab the .50 cal (ATI GPU), it would be a better gun to stop the guy with... instead of the .357 Magnum (Nvidia GPU). But Jim doesn't wait; he shoots the guy with the .357 Magnum, because it's readily available and it's up to the job (the global market as shaped by Nvidia and the devs).

    So... I would go with the .357... even if Nvidia has forced the market and devs to go Nvidia. If you want to play games the way the devs made them, with all the bells and whistles, you have to go Nvidia.

    It was funny: I was reading about the game Cryostasis before I got it. So far it's the most demanding game for physics. All the ATI fans were saying the game was like any other, and the Nvidia fans were like "what are you talking about?" (Could it be they were looking at two different games?)

    That being said, there is a massive jump in games with PhysX, and if you can't run PhysX, your game will look flat and dull. ATI fans that say "whatever"... you're full of shit. If you don't care about playing a game the way it was supposed to be played, then why did you upgrade your computer at all? Shit, you could just run games at 800 x 600... why not, if visuals don't matter? Yeah, right. BS. You're pissed because you can't run it with your hardware. Proof that you all say one thing and do another...

    There is a guy making an ATI driver that can run PhysX (with the help of Nvidia... lol). All the ATI fans were pissed because it wasn't coming out fast enough... hypocrites.

    So what is it... what's holding you back, pride? For ATI? What, because ATI was treated unfairly... ohhhh? Or is it that you don't have the money to upgrade again? Or that you should have shopped around better?

    I was an OG ATI user when I first started gaming. I had five or six PCs that ran ATI GPUs; the last one I still have, an X850 Platinum Pro, in the box. I got sick of the drivers; the "Cat" (Catalyst) is crap and always has been. I don't care if your card is an ICBM: if it can't get off the ground because of failed drivers, it's useless. If ATI finds a driver for PhysX... it will be "stable" like all the other ATI drivers have been... lol, that's a joke.

    I don't give a shit who wins this stupid war, as long as when a game comes out there is hardware that can max it... I will buy whoever is on top. And Nvidia has been stealing cars for longer. Unethical or not... who cares.

     

     

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..

  • Cleffy Member RarePosts: 6,414

    I still don't see the point in PhysX. There are already many physics APIs that can be used, and Havok is the most widely used of them. This is why AMD is of more value, since they support hardware acceleration for Havok instead of PhysX. Unlike PhysX, Havok can also be run on the CPU, which now outspecs the original PhysX card that PhysX games are tuned for. To me a simple API addition to something that's already being done is only a cheap selling point, especially when the competition and your product already support the existing APIs. To me a graphics card should be about advancing graphics. nVidia isn't doing that and has actually been halting this progress by not meeting Microsoft's DirectX ambitions. I would rather have AMD, with its capability of 300% more calculations per processing unit, more processing units, real-time raytracing and tessellation, than PhysX.

  • Sir_Drip Member Posts: 133
    Originally posted by Cleffy


    I still don't see the point in PhysX.  There are already many physic APIs that can be used, and Havok is the most widely used physic API.  This is why AMD is of more value since they support hardware acceleration for Havok instead of PhysX.  Unlike PhysX, Havok can also be run on the CPU which now outspecs the original PhysX Card that PhysX games adhere to.  To me a simple API addition to something thats already being done is only a cheap selling point.  Especially when the competition and your product already support the existing APIs.  To me a graphics card should be about advancing graphics.  nVidia isn't doing that and has actually been halting this progress by not meeting Microsofts DirectX ambitions.  I would rather have AMD with its capability of 300% higher calculations per processing unit, more processing units, capability to do real-time raytracing, and tesselation then PhysX.



     

    I would rather have AMD with its capability of 300% higher calculations per processing unit, more processing units, capability to do real-time raytracing, and tesselation then PhysX.

     

     

    REPLY - Now that's some funny shit there!! Since when has AMD/ATI ever beaten anyone by 300% at anything? Would one of you benchmark KINGS show me, please? Smells like 300% bullshit! I've looked at 3DMark and Vantage scores and haven't seen either one beating the other by 300%.

     

    To me a graphics card should be about advancing graphics. nVidia isn't doing that and has actually been halting this progress by not meeting Microsofts DirectX ambitions.

     

    REPLY - I wouldn't be so fast to blame Nvidia... I would blame the game makers, though. If you can play the games at max settings, what more can you do? The game makers are what's slowing the DirectX ambitions.

     

    The thing about Havok is... whether ATI or Nvidia cards are used, they both see the same things in the game; it works with both. With the PhysX engine, Nvidia sees the eye candy and ATI doesn't! So...

     

    Havok can also be run on the CPU which now outspecs the original PhysX Card that PhysX games adhere to.

     

    REPLY - "Original PhysX card"? If you're talking about the old Ageia cards... then yes, they are crap compared to the Nvidia GPUs used today for the PhysX engine. Prolly 10x faster and better than the old Ageia cards. For you ATI fans... you will have to purchase a PhysX card if you want to see PhysX! I think BFG makes one for you guys.

    ATI and PhysX... you guys prolly won't see that for another two years or so, or a year after W7, seeing how ATI can't even keep up with their drivers, let alone trying to add PhysX to those broken drivers! LOL! It would look like a monkey fucking a football... not knowing which end to start with! LOL!


  • Cleffy Member RarePosts: 6,414

    It's all about supporting the architecture. Games right now do not support the ATI architecture; however, 3D apps do support it, and the difference can be seen with Viewperf comparing two competing video cards.

    8800GT vs HD3870 - Viewperf Benchmark

    That's a 250% performance increase for an HD3870 against a card that performs 10% better in gaming. It is hard to find benchmarks of consumer video cards in CAD programs, but with some searching you will find they consistently favor ATI hardware, by margins of up to 300%.

    The reason for this is how the ATI stream processor works. It can process 4 similar pieces of information in 1 processing cycle, compared to 1 on nVidia's architecture. This is perfect for rendering polygons or computing RGBA color values, since those come in groups of 4. Also, through this increased amount of calculation they are capable of real-time raytracing. The only problem is games do not support the architecture, since it was released only 3 years ago and nVidia has the largest market share.
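    To put the vec4-versus-scalar point above in plain code, here is a tiny CPU-side analogy in C++ (my own toy illustration, not shader or driver code): a 4-wide unit applies one operation to a whole RGBA color, while a scalar unit issues four separate operations for the same work. Whether real game shaders actually keep all four slots busy is exactly what the "up to 300%" claim hinges on.

    #include <cstdio>

    // Toy contrast between one 4-wide (vec4) operation and four scalar operations.
    struct Vec4 { float x, y, z, w; };

    // One conceptual issue slot on a 4-wide unit: scale a whole RGBA color at once.
    Vec4 scale_vec4(Vec4 c, float s) {
        return { c.x * s, c.y * s, c.z * s, c.w * s };
    }

    // The same work on a purely scalar unit: four separate multiplies.
    void scale_scalar(float c[4], float s) {
        c[0] *= s;  // slot 1
        c[1] *= s;  // slot 2
        c[2] *= s;  // slot 3
        c[3] *= s;  // slot 4
    }

    int main() {
        Vec4 a = scale_vec4({0.2f, 0.4f, 0.6f, 1.0f}, 0.5f);
        float b[4] = {0.2f, 0.4f, 0.6f, 1.0f};
        scale_scalar(b, 0.5f);
        std::printf("vec4:   %.2f %.2f %.2f %.2f\n", a.x, a.y, a.z, a.w);
        std::printf("scalar: %.2f %.2f %.2f %.2f\n", b[0], b[1], b[2], b[3]);
        return 0;
    }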

  • Smikis Member UncommonPosts: 1,045

    Wow, another post about how awesome PhysX is. Unreal doesn't support PhysX... just the 3 maps that the PhysX mod does.

    Sacred 2: PhysX adds extra falling leaves from the trees (things you won't notice).

    No one plays Cryostasis; stop praising the game just because it sounds like Crysis.

    None of those MMOs ever said anything about supporting PhysX.

    I own Nvidia cards, 2 of them actually, so I obviously have PhysX support. The only game I care about PhysX for is Diablo 3; the rest don't have PhysX. There is only 1 game so far with real PhysX, and that's Mirror's Edge.

    The others add some falling leaves, like Sacred and the Unreal mod... both of them add leaves, and that's about the only PhysX in those games.

    It's nice, yes, but it's not a reason to get Nvidia over ATI just because of PhysX.

  • Sir_Drip Member Posts: 133
    Originally posted by Cleffy


    Its all about supporting architecture.  Games right now do not support ATI architecture, however 3D apps do support it and the difference can be seen with viewperf comparing 2 competing video cards.
    8800GT vs HD3870-Viewperf Benchmark
    Thats a 250% perfomance increase for an HD3870 against a card that performs 10% better in gaming.  It is hard to find benchmarks of consumer video cards in CAD programs, but with some search you will find it consistently favors ATI hardware by even margins up to 300%.
    The reason for this is how the ATI stream processor works.  It can process 4 similiar pieces of information on 1 process cycle compared to the 1 on nVidias architecture.  This is perfect for rendering polies or coming up with RGB information since they are 4 pieces of information.  Also through this increased amount of calculations they are capable of real-time raytracing.  The only problem is games do not support the architecture since it was released only 3 years ago, and nVidia has the largest market share.



     

    That link you provided is over a year old. Now, I can understand that ATI prolly only put out maybe two (2) driver updates in that year's time. However... Nvidia put out like 20 of them in the past year... So take a look at this and tell me what you don't see when downloading your drivers.

    For the past year with game performance in mind...

    http://www.nvidia.com/object/winxp_185.85_whql.html

    http://www.nvidia.com/object/winxp_182.50_whql.html

    http://www.nvidia.com/object/winxp_182.08_whql.html

    http://www.nvidia.com/object/winxp_182.06_whql.html

    http://www.nvidia.com/object/winxp_181.22_whql.html

    http://www.nvidia.com/object/winxp_181.20_whql.html

    http://www.nvidia.com/object/winxp_180.48_whql.html

    http://www.nvidia.com/object/winxp_178.24_whql.html

    http://www.nvidia.com/object/winxp_178.13_whql.html

    Half a year's worth of drivers for my 9800 GTXs, and they support 6-series and up GPUs! Now... why would any game maker (in their right mind) want to work with (or adapt to) ATI and their broken ways? Because it's prolly broken! Again... no one's beating anyone by 300%... I use my GFX cards for gaming, not to test programs that have no bearing on gaming!

    OK, now try to find those same 2 cards with today's drivers and let's see benchies...

    These benches are from late last year, but newer than the one you provided; prolly tested on the 178.24 drivers, like 8 drivers ago.

    http://www.tomshardware.com/charts/gaming-graphics-charts-q3-2008/Assassins-Creed-v1.02,739.html

    http://www.tomshardware.com/charts/gaming-graphics-charts-q3-2008/Call-of-Duty-4-v1.6,745.html

    http://www.tomshardware.com/charts/gaming-graphics-charts-q3-2008/Crysis-v1.21,754.html

    http://www.tomshardware.com/charts/gaming-graphics-charts-q3-2008/Enemy-Territory-Quake-Wars-v1.4,763.html

    http://www.tomshardware.com/charts/gaming-graphics-charts-q3-2008/Half-Life-2-Episode-2,769.html

    http://www.tomshardware.com/charts/gaming-graphics-charts-q3-2008/Mass-Effect,778.html

    http://www.tomshardware.com/charts/gaming-graphics-charts-q3-2008/Microsoft-Flight-Simulator-X-SP2,784.html

    last but not least...

    http://www.tomshardware.com/charts/gaming-graphics-charts-q3-2008/Sum-of-FPS-Benchmarks-Totals,795.html

    Just think what it would look like if they were running today's drivers??? Prolly look like murder!

    But at any rate those cards are on their way out anyway.


  • Sir_Drip Member Posts: 133
    Originally posted by Smikis


    wow another post about how awesome physixs are , unreal doesnt support physixs.. just 3 maps, which physixs mode does,
     
    sacred 2, physixs adds extra falling leaves from trees ( thigns you wont notice )
     
    noone plays cryostasis, stop praising game, cuz it sounds like crysis,
     
    none of those mmos ever said anything about supporting physixs,
     
    I own nvidia cards, 2 of them actually, and i obviously have physixs,, support.. only game i care for physixs is diablo 3, rest doesnt have physixs, there are only 1 game so far which have real phyxis thats mirrors edge,
     
    others adds some falling leaves. like sacred, and ureal mod.. both of them add leaves, thats about only physixs in those games..
     
    its nice yes, but its not the point  in getting nvidia over ati just cuz of physixs



     

    So... if you look, it's a little more than just a few extra leaves and stuff! Watch all the videos you can and you will find where they used the PhysX engine to construct the buildings, so when a building is blown up or falling, you see it behave like a real building. When using the PhysX engine to build the buildings, they found that if they didn't build them right, the buildings couldn't support themselves and would crumble to the ground. They tell you about it in the videos (a rough sketch of that support idea follows the links below)...

    http://pc.ign.com/dor/objects/14235421/red-faction-guerrilla/videos/redfactionlivewire_072308.html

    http://media.pc.gamespy.com/media/142/14235421/vids_1.html

    http://www.redfaction.com/
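    Here is that rough toy sketch (my own illustration, not the Red Faction or PhysX code): treat the building as a grid of blocks and flood-fill from the ground; any block that can no longer reach the ground through other blocks breaks off and gets handed to the physics engine as falling debris.

    #include <cstdio>
    #include <queue>
    #include <string>
    #include <utility>
    #include <vector>

    // Toy structural-support check: '#' = block, '.' = empty, bottom row is the ground.
    // The right-hand column has had its lower blocks blown out, so its top blocks fall.
    int main() {
        std::vector<std::string> world = {
            "..#.#..",
            "..#.#..",
            "..#....",
            "..#....",
            "#######",
        };
        const int H = (int)world.size(), W = (int)world[0].size();

        // Breadth-first search upward from the ground row: every block we can
        // reach is supported, everything else should break off.
        std::vector<std::vector<bool>> supported(H, std::vector<bool>(W, false));
        std::queue<std::pair<int, int>> q;
        for (int x = 0; x < W; ++x)
            if (world[H - 1][x] == '#') { supported[H - 1][x] = true; q.push({H - 1, x}); }

        const int dy[4] = {-1, 1, 0, 0}, dx[4] = {0, 0, -1, 1};
        while (!q.empty()) {
            auto [y, x] = q.front(); q.pop();
            for (int d = 0; d < 4; ++d) {
                int ny = y + dy[d], nx = x + dx[d];
                if (ny < 0 || ny >= H || nx < 0 || nx >= W) continue;
                if (world[ny][nx] == '#' && !supported[ny][nx]) {
                    supported[ny][nx] = true;
                    q.push({ny, nx});
                }
            }
        }

        // Print the result: 'v' marks blocks that just lost support and will fall.
        for (int y = 0; y < H; ++y) {
            for (int x = 0; x < W; ++x)
                std::putchar(world[y][x] == '#' && !supported[y][x] ? 'v' : world[y][x]);
            std::putchar('\n');
        }
        return 0;
    }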


  • Cleffy Member RarePosts: 6,414

    The point I was trying to make was about supporting the architecture. If they supported the ATI hardware architecture, it could perform up to 300% better. Heck, if they supported DX10.1 without supporting the rest of ATI's hardware, it would perform better. The fact is game developers don't support it, which is why you have to look elsewhere for this support, particularly CAD programs, since half of them support ATI's architecture.

    Viewperf is a benchmark that measures rendering capability in the viewports of 3D programs. The viewports are often based on game-engine designs and DirectX. This is what makes them a close comparison to how a game could perform if it supported the ATI architecture. Also, like nVidia, ATI updates its drivers every 2 months. I find your explanation long, and it does not actually address my point about supporting the architecture and how nVidia continues to hold back graphics capability in games.

    Also, I find PhysX to be a redundant API, just like CUDA. There are already open-source and widely supported APIs that are further developed and do the same thing, particularly Havok and OpenCL. The only things I have seen people do with PhysX are things that are already possible with Havok, Newton, and the other APIs: cloth effects and destruction effects.

  • Sir_Drip Member Posts: 133
    Originally posted by Cleffy


    The point I was trying to make was about supporting architecture.  If they supported the ATI hardware architecture, it can perform up to 300% better.  Heck if they supported DX10.1 without supporting the rest of ATIs hardware, it would perform better.  The fact is game developers don't support it which is why you have to look else where for this support.  Particularly CAD programs since half of them support ATI's architecture.
    Viewperf is a benchmark that measures rendering capability in viewports to 3D programs.  The viewports are often based on game engine designs and Direct X.  This is what makes them a close comparison to how a game could perform if it supported the ATI architecture.  Also like nVidia, ATI updates its driver every 2 months.  I find your explanation long and does not actually target my point about supporting architecture and how nVidia continues to hold back graphics capability in games.
    Also I find PhysX to be a redundant API just like CUDA.  There are already open source and widely supported APIs that are further developed and do the same thing.  Particularly Havok and OpenCL.  The only thing I have seen people do with PhysX are things that are already possible with Havok, Newton, and your other APIs.  Cloth effects and destruction effects.



     

    The point I was trying to make was about supporting architecture.

     

    REPLY - Yep, got you on that... but I think you should be talking to the game makers and card makers about that...

     

    If they supported the ATI hardware architecture, it can perform up to 300% better. Heck if they supported DX10.1 without supporting the rest of ATIs hardware, it would perform better. The fact is game developers don't support it which is why you have to look else where for this support. Particularly CAD programs since half of them support ATI's architecture.

     

    REPLY - "Game developers don't support it." "Half of CAD programs support ATI's architecture." So why should I care? I don't use CAD, and it's not supported by games... I use my GFX cards for gaming!

     

    Viewperf is a benchmark that measures rendering capability in viewports to 3D programs. The viewports are often based on game engine designs and Direct X. This is what makes them a close comparison to how a game could perform if it supported the ATI architecture. Also like nVidia, ATI updates its driver every 2 months.

     

    Well, I just showed you that the 10% performance gain over the 8800 GT went right out the window with a simple driver update... Also, if you look at the release notes on those drivers you will see performance-gain values (%) per game. Do you see that with ATI drivers? Again, look at the dates on the drivers as well; you will see that they put out more drivers every 2 months than ATI does. So why would any game maker (in their right mind) want to deal with ATI and their broken ways? Maybe in a few years ATI will get their shit straight and produce solid drivers that increase performance in the games that are made today.


  • Dreadknot357 Member Posts: 148
    Originally posted by Smikis


    wow another post about how awesome physixs are , unreal doesnt support physixs.. just 3 maps, which physixs mode does,
     
    sacred 2, physixs adds extra falling leaves from trees ( thigns you wont notice )
     
    noone plays cryostasis, stop praising game, cuz it sounds like crysis,
     
    none of those mmos ever said anything about supporting physixs,
     
    I own nvidia cards, 2 of them actually, and i obviously have physixs,, support.. only game i care for physixs is diablo 3, rest doesnt have physixs, there are only 1 game so far which have real phyxis thats mirrors edge,
     
    others adds some falling leaves. like sacred, and ureal mod.. both of them add leaves, thats about only physixs in those games..
     
    its nice yes, but its not the point  in getting nvidia over ati just cuz of physixs



     You again....

    Hey, was I telling people to go Nvidia over ATI? ...No. The post was for ATI fans who were saying "don't go Nvidia, because PhysX doesn't add up"...

    Listen here, forum cherry-picker... Unreal 3 supports a lot of PhysX. If a dev does not add it to a game, that's not Nvidia's fault. And the point is, if something does come out that supports PhysX, you won't hear "I can't run that with my card."

    "No one plays Cryostasis, stop praising the game, cuz it sounds like Crysis"?

    So you make statements about games that you haven't played? ...Wow.

    The fact that you commented on 3 of the 40 on the list... it seems to me that you haven't even PLAYED a game with PhysX yet? And yet you know so much about what games have PhysX, and what they are like?

    Did you just Google this, read one post that some other avid hater wrote, and repost his statements? ...WTF is wrong with you.

    Did you not get your Prozac today?

    And who praised Cryostasis? I said it has the most demanding PhysX to date...

    It sounds like Crysis? ...You mean you know nothing about the game and yet you bash it...

    Well, if UE3 supports PhysX and the game is made with it, I'm sure it will be supported... and the fact that a lot of them did state that they will support it makes you look like a blind flamer again...

    Did you play any of the 40+ games on that list that support PhysX? No? Then how can you say only 1 game really supports it?

    Did you even play Mirror's Edge, which you say does?

    Look, folks, we have a winner... you get the "I don't think before I speak" award!

    Everybody clap for the teenager that borrowed his mom's computer to look at porn... and post stupid shit online.

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..

  • Anvil_Theory Member Posts: 106

    Pointless thread; anyone who has a dual- or quad-core CPU has built-in physics. Especially with Intel and AMD offering 6-, 8- and 12-core CPUs by the end of the year, there is no need to have the video card handle something the CPU, with its idle cores, can handle.

    Not only that... but only 50% of people are using Nvidia, so programming for PhysX (which is a brand name) is pointless. Whereas almost everyone in the future will have multi-core computers... so programmers are more likely to program using Havok.

    Nvidia is using you guys with their marketing BS.
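    To picture the "idle cores" point above: a lot of physics work parallelizes well across CPU threads, because most bodies can be integrated independently within a frame. The sketch below is plain C++ with std::thread, a toy illustration only (nothing to do with Havok's actual API); the body count and constants are invented.

    #include <algorithm>
    #include <cstdio>
    #include <functional>
    #include <thread>
    #include <vector>

    // Toy CPU physics: spread the per-frame integration of many bodies over all cores.
    struct Body { float y; float vy; };

    void step_slice(std::vector<Body>& bodies, size_t begin, size_t end, float dt) {
        for (size_t i = begin; i < end; ++i) {
            bodies[i].vy += -9.81f * dt;          // gravity
            bodies[i].y  += bodies[i].vy * dt;    // integrate position
            if (bodies[i].y < 0.0f) {             // bounce off the ground plane
                bodies[i].y  = 0.0f;
                bodies[i].vy = -bodies[i].vy * 0.5f;
            }
        }
    }

    int main() {
        std::vector<Body> bodies(100000, Body{10.0f, 0.0f});
        const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
        const float dt = 1.0f / 60.0f;

        for (int frame = 0; frame < 600; ++frame) {        // ten seconds of simulation
            std::vector<std::thread> workers;
            const size_t chunk = bodies.size() / cores + 1;
            for (unsigned c = 0; c < cores; ++c) {
                size_t begin = c * chunk;
                size_t end   = std::min(bodies.size(), begin + chunk);
                if (begin < end)
                    workers.emplace_back(step_slice, std::ref(bodies), begin, end, dt);
            }
            for (auto& w : workers) w.join();              // sync before the next frame
        }
        std::printf("body[0] ended at y = %.3f using %u cores\n", bodies[0].y, cores);
        return 0;
    }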

     

  • Dreadknot357 Member Posts: 148
    Originally posted by Anvil_Theory


    Pointless thread, anyone who has a dual or quad-core has built-in physic. Specially with Intel and AMD offering 6, 8 & 12 core CPU's by the end of the year, there is no need to have the video card handle something the CPU, with idle cores can handle.
     
    Not only that...  but only 50% of the people are using Nividia, so programming for PhysX (which is a brand name) is pointless. Where-as almost everyone in the future will have multi-core computers...  so programers are more likely to program using Havok.
    Nvidia is using you guys with their marketing BS.
     



     

    Using us...? lol

    You wear a tinfoil hat too...

    The post was to state what games use PhysX, not to start a battle.

    And from what I know, GPU PhysX helps offload work from the CPU, because a GPU is better at computing graphics...

    It's the other way around... GPUs CAN and will take over from the CPU (for PCs made for gaming graphics... if MS will support it with an OS) and push out INTEL... that is, if INTEL doesn't have something up their sleeve.

    See, I too can make far-fetched statements with nothing to back them up.

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..

This discussion has been closed.