
Intel/Nvidia vs. AMD/ATI Official Fight Club.

Dreadknot357 Member Posts: 148

I am making this thread for one reason: to help stop derailing other threads with "our" fights.

I and many others have helped derail threads with our differences.

So I say:

Let's make a place where, if you're here, you deal with what could and will be said about your favorite hardware.

 

So let's make some ground rules for this thread. Intel/Nvidia and AMD/ATI fans, let's add some ground rules for the debates to come.

(Once we get some solid rules...or no rules, I will set up this thread for the fight club.)

............................................................................................................................................................................

I vote for rules

 

Rule #1: If another thread turns into a fight (Intel/Nvidia vs. AMD/ATI), someone yell out: TAKE IT OUTSIDE.

Just copy and paste the quote and bring it here. Fight away. This helps keep our crap out of people's faces (and they don't have to cry).

 

Rule #2: Post your PC specs. (A lot of people don't see the point of this, but it matters when asking about experience with hardware-related issues. If you have used the hardware, you have more experience with it, in most cases.)

 

Rule #3: PROVE EVERYTHING.....

 

Rule #4: PROVE EVERYTHING..... If you can't, then add /theory/, /opinion/, or /hands-on experience/ to your statements. Maybe someone else can confirm them.

 

..................................................................................................................................................................

 

That's all I've got for now... so add some rules. Or don't.

So if you have beef or fuel for the fire... post it here.

PS: If you all don't want to get down with this, then stop the derailment crying. Because, like it or not, people will fight about hardware. At least I'm trying to help you out.

 

"Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..
image
image

«134

Comments

  • Dreadknot357 Member Posts: 148

    To start this off: people keep saying that GPUs doing the work of CPUs will not happen, and that CUDA is marketing hype.

    You all like to read... here you go. And though it's not for gaming... yet (theory), this could very well be the future (theory):

    http://www.nvidia.com/object/preconfigured_clusters.html

    And before you start saying it's Nvidia hype because it's coming from them,

    take a look at these companies putting them out:

    http://www.nvidia.com/object/tesla_preconfigured_clusters_wtb.html

    I'm not stating that ATI is not working on it... but I haven't seen it.

    ............................................................................................................................

    Antec 1200 full tower

    i7 920 OC @ 4.2 GHz on air

    Cooler Master V8 CPU cooler

    EVGA X58 SLI MB

    Corsair DDR3 6 GB 1600 9-9-9-24 1T

    EVGA GTX 285 stock, in SLI

    WD Raptor X 150 GB x2, RAID 0

    Corsair 1000 W PSU

    Creative Sound Blaster X-Fi Titanium Fatal1ty Champion

    Pioneer Elite A/V receiver

    Infinity 5.1 speakers

    Samsung 26-inch SyncMaster LCD, 1920x1200

    Wacom Cintiq 21-inch digital art monitor

    Logitech G15 keyboard

    Logitech MX laser mouse

    Logitech 5.1 surround

    (Vantage P 26,008)

     

     

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..
    image
    image

  • Jackcolt Member Uncommon Posts: 2,170

    GPUs will do CPUs' work, definitely. I can't remember exact numbers, but a GPU can handle so many more threads at the same time that you can make matrix operations (if you know what those are, you'll know how important they are in so many fields, even gaming) dramatically faster. Matrix operations are some of the most frequently used and most demanding operations, so optimizing them of course means better performance, but also, for example, more efficient process automation in companies, which in the end means more income. One common matrix problem is solving a system of linear equations. The standard algorithm for that runs in O(n^3), which is extremely high.
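    The O(n^3) figure above matches Gaussian elimination, the standard solver for linear systems. A minimal pure-Python sketch (an illustration of the textbook algorithm, not code from any library mentioned in this thread) shows where the three nested loops, and hence the cubic cost, come from:

    ```python
    def solve(A, b):
        """Solve A x = b by Gaussian elimination with partial pivoting.

        The three nested loops over the n rows/columns are what give
        the classic O(n^3) cost.
        """
        n = len(A)
        # Work on an augmented copy so the caller's data is untouched.
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):                      # loop 1: each pivot column
            # Partial pivoting: swap in the row with the largest pivot.
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for row in range(col + 1, n):         # loop 2: each row below
                f = M[row][col] / M[col][col]
                for k in range(col, n + 1):       # loop 3: each entry in the row
                    M[row][k] -= f * M[col][k]
        # Back-substitution (only O(n^2), so it doesn't change the total).
        x = [0.0] * n
        for i in range(n - 1, -1, -1):
            x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
        return x
    ```

    The row updates in the middle loop are independent of each other, which is exactly the kind of work a GPU's many threads can do simultaneously.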

    As for the main debate, I think I'll pass. I foresee a lot of curse words and angry people. (:P)

     


  • Erowid420 Member Posts: 93
    Originally posted by Dreadknot357


    *snip*

    I think you are a really confused individual!

    Nvidia (in those links) is offering a SERVICE, with their products in mind, yet somehow you keep wanting to spin it with AMD in mind. Neither of your links illustrates Nvidia dominance or future viability with physics.

    And after pulling up your post history, it's clear you don't grasp even the fundamentals of how a GPU or CPU processes physics, and you are indeed persuaded by marketing hype. CUDA is what it is... it's nothing remarkable or awesome; it's just Nvidia's way of doing physics calculations on programmable shaders. Problem is, they are losing the battle and have really poured on the marketing hype in the last couple of months. Looks like you fell for it.

    I have an old Ageia card sitting around here somewhere... I use CUDA to aid in folding, yet neither will be a key player come Christmas when it comes to in-game physics.

    Nvidia's approach was a great idea when CPUs were single-core, but in the near future multi-core CPUs will be taken to a new level. We will see machines selling this Christmas with 12 cores. Then add in the excitement over Larrabee's numbers, along with DirectX 11 engines that have already been shown using CPU physics... anyone can note (and many have) the direction game physics is headed.

    Though, it is humorous to watch kids learn about life.

     

    ___________________________

    - Knowledge is power; I've been in school for 28 years!

  • Dreadknot357 Member Posts: 148
    Originally posted by Jackcolt


    *snip*

    Come on, I will play fair... lol. Hey, I agree with the point you just made... even if I didn't understand the last part.

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..
    image
    image

  • Dreadknot357 Member Posts: 148
    Originally posted by Erowid420

    *snip*

    And we have a winner....

    "Nvidia (in those links) is offering a SERVICE, with their products in mind, yet somehow you keep wanting to spin it with AMD in mind. Neither of your links illustrates Nvidia dominance or future viability with physics."

    Tesla, an Nvidia product, being used as a CPU... in supercomputers, to crunch data... and a service, CUDA.

    Did I even mention physics?

    "And after pulling up your post history, it's clear you don't grasp even the fundamentals of how a GPU or CPU processes physics, and you are indeed persuaded by marketing hype. CUDA is what it is... it's nothing remarkable or awesome; it's just Nvidia's way of doing physics calculations on programmable shaders. Problem is, they are losing the battle and have really poured on the marketing hype in the last couple of months. Looks like you fell for it."

    I never stated how a GPU processes physics, beside what sort of things PhysX can add to your games.

    And that a GPU is better at it than a CPU, as of today. Well, since you're a forum stalker trying to call me out into a fight, now you've got one.

    I fell for what? Buying an Nvidia... LOL. If you were any good at stalking, you would have read that I have been using Nvidia well before PhysX was around. PhysX is just a bonus.

    ONCE AGAIN: WHAT PHYSX GAMES HAVE YOU PLAYED, ON AN NVIDIA SYSTEM?

    I already stated it's not the be-all and end-all for gaming... it's for people that want a little more. You didn't read that, did you?

    And if you don't care about always pushing to get the best out of a game... why do you use a PC for gaming?

    Just get a PS3 or an Xbox. You buy a PC to get a little bit more from a game: AA, AF, higher res, PhysX.

    I posted to state what games have it NOW and what games are coming out WITH it next year. I posted because ATI fans said it was pointless, because "LIKE only 3 games have it." Proving a fact.

    Did I ever state how physics is processed or programmed? No.

    Did I ever state that ATI will never be able to do it? No.

    So now let's get to you.

    "And after pulling up your post history, it's clear you don't grasp even the fundamentals of how a GPU or CPU processes physics, and you are indeed persuaded by marketing hype." (PROVE THIS)

    "CUDA is what it is... it's nothing remarkable or awesome; it's just Nvidia's way of doing physics calculations on programmable shaders. Problem is, they are losing the battle and have really poured on the marketing hype in the last couple of months. Looks like you fell for it." (PROVE THIS)

    "I have an old Ageia card sitting around here somewhere... I use CUDA to aid in folding, yet neither will be a key player come Christmas when it comes to in-game physics." (PROVE THIS)

    "Nvidia's approach was a great idea when CPUs were single-core, but in the near future multi-core CPUs will be taken to a new level. We will see machines selling this Christmas with 12 cores." (PROVE THAT IT WILL BE EVEN 50% SUPPORTED BY W7... AND LET'S SEE IF IT EVEN IS BY RELEASE)

    "Then add in the excitement over Larrabee's numbers, along with DirectX 11 engines that have already been shown using CPU physics... anyone can note (and many have) the direction game physics is headed." (SHOW ME. PROVE THIS)

    "Though, it is humorous to watch kids learn about life." (DID YOU CHECK MY AGE... KID? You want to get personal with me?)

    If you want to go down that road.....

    PS: See, this is what happens when one of you .......................... with me... then I hammer you back, and then none of you remember why I attacked. So it's OK to egg me into a personal brawl with below-the-belt shots, but it's not OK for me to fight back with harsh below-the-belt shots? Are you kidding?





     

     

    Lololololol, you kidding me?

    Anvil_Theory - Advanced Member
    Real Name: Joash Brookie
    Member since April 16, 2009
    Last Visit: May 10, 2009
    39 year old Male from Stardock, MI, United States

    Erowid420 - Advanced Member
    Real Name: Jaohn Belkin
    Member since May 13, 2009
    Last Visit: May 15, 2009
    34 year old Male from Mid west, MI, United States

    You made a new profile to come mess with me... what a tool!!!

     

     

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..
    image
    image

  • Dreadknot357 Member Posts: 148

    Success stories from Tesla. Also, as you can see on the site, they are doing this with Tesla alongside i7/AMD/Xeon-based CPUs.

    Computational professionals, such as scientists and research engineers, are facing increasingly difficult computing challenges on a daily basis, such as drug research, oil and gas exploration and computational finance challenges. With the world’s first teraflop many-core processor, NVIDIA® Tesla™ high performance computing (HPC) solutions enable the necessary transition to energy efficient parallel computing power. With 240 cores per processor and a CUDA architecture that simplifies application development, Tesla™ scales to solve the world’s most important computing challenges—more quickly and accurately.



    Industry's first massively multi-threaded architecture, with a 240-processor computing core.

    Many-core architecture delivers optimum scaling across HPC applications.

    Scales to thousands of processor cores to solve large-scale problems by splitting the problem across multiple GPUs.

    High-efficiency computing platform for energy-conscious organizations.

    NVIDIA CUDA™ technology unlocks the power of the Tesla™ many-core computing products.

    Seamlessly fits into existing HPC environments.
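    The "splitting the problem across multiple GPUs" bullet above is plain data decomposition: partition the input, run the same kernel on every slice, then reduce the partial results. A small Python sketch of that pattern, with a thread pool standing in for the GPUs (illustrative only; real Tesla code would use CUDA kernels, not Python threads):

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def dot_chunk(chunk):
        # Stands in for the kernel one "device" would run on its slice.
        xs, ys = chunk
        return sum(x * y for x, y in zip(xs, ys))

    def distributed_dot(xs, ys, n_devices=4):
        """Split a dot product across n_devices workers and sum the partials."""
        size = (len(xs) + n_devices - 1) // n_devices  # ceil-divide the input
        chunks = [(xs[i:i + size], ys[i:i + size])
                  for i in range(0, len(xs), size)]
        with ThreadPoolExecutor(max_workers=n_devices) as pool:
            # Map = run the kernel per slice; sum = reduce the partials.
            return sum(pool.map(dot_chunk, chunks))
    ```

    The same map-then-reduce shape is why these clusters "scale to thousands of processor cores": each extra device just takes a smaller slice.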



     

    Successful Tesla Stories Include:

    Medical Imaging

    --------------------------------------------------------------------------------

    TechniScan Medical System incorporates the Tesla™ technology to develop a new imaging system for complete ultrasound scans in half the time and with greater accuracy. It eliminates the delay in generating test results so that patients and doctors have a fast and efficient device that delivers results at the pace of modern medicine.





    Life Sciences

    --------------------------------------------------------------------------------

    Utilizing the Tesla™ technology, the National Cancer Institute completed its cancer medical research calculations 12 times faster than with traditional x86 based servers. The research results can now be accumulated in 10 minutes as opposed to 2 hours.

    (This is the funniest one yet... yes, the National Cancer Institute is helping Nvidia complete the Death Star... to scam the world into Nvidia's marketing plans... lol. Or maybe the National Cancer Institute is a marketing ploy, using dying kids to further Nvidia's grip on the galaxy...)



    Banking/Financial

    --------------------------------------------------------------------------------

    Hanweck Associates can now instantaneously evaluate trade activities and investments in the stock and commodities markets.

     





    Computational Fluid Dynamics

    --------------------------------------------------------------------------------

    The National Center for Atmospheric Research (NCAR) has achieved a 10x improvement in speed for Microphysics and a 20% improvement in overall speed for the Weather Research & Forecasting Models (WRF). Weather agencies worldwide are now able to more quickly produce real-time weather forecasting and climate prediction.
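    The gap between the 10x microphysics speedup and the only 20% overall WRF improvement is what Amdahl's law predicts when just part of a workload is accelerated. A quick sketch (the ~18.5% runtime fraction below is back-calculated for illustration, not a figure NCAR reported):

    ```python
    def amdahl(fraction, speedup):
        """Overall speedup when `fraction` of the runtime is accelerated by
        `speedup` and the remaining (1 - fraction) is unchanged (Amdahl's law)."""
        return 1.0 / ((1.0 - fraction) + fraction / speedup)

    # If microphysics is ~18.5% of WRF's runtime, a 10x kernel speedup
    # yields roughly the reported 1.2x (20%) overall improvement.
    print(round(amdahl(0.185, 10), 2))  # -> 1.2
    ```

    This is the standard caveat for any GPU-offload story: the un-accelerated remainder of the code caps the end-to-end gain, no matter how fast the kernel gets.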





    Geographic Information Services

    --------------------------------------------------------------------------------

    Calculations for Manifold’s Geographic Information Services (GIS) applications, which previously took 20 minutes to complete, are now done in 30 seconds. Moreover, calculations that previously took 30 to 40 seconds are now real-time. With its new CUDA acceleration, Manifold is helping to tap previously inaccessible fuel reserves, track the progress of pollutants in the air, and give more precise information to police and fire emergency response teams.





    Oil and Gas

    --------------------------------------------------------------------------------

    Headwave’s geophysical data analysis solutions, implemented on NVIDIA’s Tesla computing solutions, allow geophysicists to apply advanced filters to their data and instantly see results even on multi-terabyte datasets. Geophysicists can now analyze the original acquired seismic (“pre-stack”) data in multiple dimensions as part of their daily workflow. As a result, Headwave is able to increase compute rates and reduce time spent in manual operations by 100x.





    CAD/CAM/CAE

    --------------------------------------------------------------------------------

    OptiTex’s 3D CAD/CAM design technology enables clothing designers to simulate the look and movement of clothing designs on virtual models, allowing them to review, refine and measure samples before the first piece of fabric is ever cut. OptiTex 3D now achieves up to a 10x performance increase. Development time and time to market for a seasonal clothing collection can now take a mere 35 days as opposed to the typical 190-day period.

     

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..
    image
    image

  • neorandom Member Posts: 1,681

    A track record of over 10 years clearly shows that Intel and Nvidia offer better support for their products, regardless of who is 1 point ahead on benchmarks at the time.

  • Cleffy Member Rare Posts: 6,414

    I think on graphics, Intel is screwed. They have burnt their bridges with both ATI and nVidia over support for their motherboards. Next year when Larrabee is introduced, you are going to be stuck with an all-Intel board, with Intel themselves not really understanding what makes a good graphics chip. Just from the press releases on Larrabee you can tell it won't be able to handle the massive simultaneous rendering a GPU takes care of. I think next year will be a bad year for Intel, just like the P4 era was for them. On the graphics front they have to compete with nVidia and ATI. On the processing front they have to compete with AMD, which has closed a 3-year gap in 1 on a new platform that heavily favors AMD (hybrid CPU/GPU chips).

    On the GPU-computation side, OpenCL is more the standard than CUDA because of its open nature. Development for it is faster and it's more widely utilized. Neither AMD nor nVidia offers a better card for these tasks when it comes to professional cards. The only difference is with consumer cards, AMD's cards being more closely designed to their professional cards and capable of higher processing power as a result. There is an extreme benefit to this technology, and it makes CPUs an even more moot point. Larrabee cannot compete with a FireGL or Quadro when it comes to these sorts of tasks.

  • Erowid420 Member Posts: 93
    Originally posted by Dreadknot357

    *snip*

    No, I wrote a letter to the lead staff member here, accused them of having biased and bogus moderators, and said I suspected both of you had something to do with it.

    I have been baiting you, knowing you would rage at all costs. Which is captured above. Your logic is self-deprecating and aggressive, but overspoken. So I wanted to pique your interest and give you cause to suspect me... to see what you'd do. And lo and behold, you post personal information that cannot be had otherwise. It's not public.

    Now me and the staff KNOW! (Thanks.)

    Secondly, your post is too emotional and off topic. You seem to talk a lot about yourself and not the topic or the technology at hand. Please touch on the points I've made, or go home. If you don't understand the facts or the latest news, then go educate yourself. It isn't some secret that Nvidia is slowly getting shut out of several markets, that ATI's sales are gaining market share, and that Nvidia is resorting to tactics like rebranding the same card 3 times due to the lack of product placement and advanced tech. Yes, Nvidia is losing ground, and the CEO has been in hot sauce for a while...

    ...all of that was 8 months ago and old news, but it helps shape the political landscape. Not to mention the ego war between Intel and Nvidia. We get it... you are an excited little boy with a new present called PhysX, and now you must be the beacon of light (go go fanboi). But you are not a true enthusiast, or OCer, because we have been discussing deformable objects and physics in games for years, on most message boards, conferences and blogs. The battle (if there ever was one) for a physics standard is not over, but the platform they are going to be using for their in-game physics will be the CPU... since EVERYONE has one. It's a simple win/win situation.

    Go educate yourself on these matters and stop being a lemming for these companies. You younger doodz are so quick to boast and take sides, when all people want to do is pass along the latest and greatest knowledge. So, play whatever games you have to play, but if you are a slow learner or have to be force-fed the truth, I'm not your man. I don't have the patience to work with people who have to be spoon-fed the answers.

     

     

     

    ___________________________

    - Knowledge is power; I've been in school for 28 years!
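For anyone weighing the "in-game physics will land on the CPU" claim above, it helps to see what a CPU-side physics step actually is. Below is a minimal sketch of semi-implicit Euler integration, the common per-frame update in game physics; all names are made up for illustration, not taken from any engine discussed in this thread:

```python
# Minimal sketch of a CPU-side physics step (semi-implicit Euler).
# Hypothetical names; illustrative only, not from any engine above.
def step(positions, velocities, forces, mass, dt):
    """Advance each particle one timestep on the CPU."""
    new_pos, new_vel = [], []
    for p, v, f in zip(positions, velocities, forces):
        v2 = v + (f / mass) * dt   # update velocity first (semi-implicit)
        new_vel.append(v2)
        new_pos.append(p + v2 * dt)
    return new_pos, new_vel

# One particle starting at rest under gravity, one 0.1 s step.
pos, vel = step([0.0], [0.0], [-9.81], 1.0, 0.1)
```

A loop like this over every object, every frame, is exactly the workload that scales with more CPU cores.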

  • Sir_DripSir_Drip Member Posts: 133
    Originally posted by Erowid420

    Originally posted by Dreadknot357

    Originally posted by Erowid420

    Originally posted by Dreadknot357


    To start this off: people keep saying GPUs doing the work of CPUs will not happen, and that CUDA is marketing hype.
    You all like to read... here you go. And though it's not for gaming... yet (theory), this could very well be the future (theory):
    http://www.nvidia.com/object/preconfigured_clusters.html
     
    And before you start saying it's Nvidia hype because it's coming from them,
    take a look at these companies putting them out:
    http://www.nvidia.com/object/tesla_preconfigured_clusters_wtb.html
    I'm not stating that ATI is not working on it... but I haven't seen it.
    ............................................................................................................................
    Antec 1200 full tower

    i7 920 OC'd @ 4.2GHz on air

    Cooler Master V8 CPU cooler

    EVGA X58 SLI MB

    Corsair DDR3 6GB 1600MHz 9-9-9-24 1T

    EVGA GTX 285 (stock) in SLI

    WD Raptor X 150GB x2, RAID 0

    Corsair 1000W PSU

    Creative Sound Blaster X-Fi Titanium Fatal1ty Champion

    Pioneer Elite A/V receiver

    Infinity 5.1 speakers

    Samsung 26-inch LCD SyncMaster, 1920x1200

    Wacom Cintiq 21-inch digital art monitor

    Logitech G15 keyboard

    Logitech MX laser mouse

    Logitech 5.1 surround

    (Vantage P 26,008)
     
     



     

    I think you are a really confused individual!

    Nvidia (in those links) is offering a SERVICE, with their products in mind; yet somehow you keep wanting to spin it with AMD in mind. Neither of your links illustrates Nvidia's dominance or future viability in physics.

    And after pulling up your post history, it's clear you do not grasp even the fundamentals of how a GPU or CPU processes physics, and are indeed persuaded by marketing hype. CUDA is what it is... it's nothing remarkable or awesome; it's just Nvidia's way of calculating physics and using programmable shaders. Problem is, they are losing the battle and have really poured on the marketing hype in the last couple of months. Looks like you fell for it.

    I have an old Ageia card sitting around here somewhere... I use CUDA to help with folding, yet neither will be a key player come Christmas when it comes to in-game physics.

    Nvidia's approach was a great idea when CPUs were single-core, but in the near future multi-core CPUs will be taken to a new level. We will see machines selling this Christmas with 12 cores. Then add in the excitement over Larrabee's numbers, along with DirectX 11 engines already shown using CPU physics... anyone can note (and many have) the direction game physics is headed.

    Though, it is humorous to watch kids learn about life.

     

     

    and we have a winner....

     

    "Nvidia (in those links) is offering a SERVICE, with their products in mind, yet, somehow you keep wanting to spin it with AMD in mind. Neither of your links illustrates Nvidia dominance or future viability with Physics.?"

    Tesla  a  nvidia  product  being used as A CPU.... In super computers  for use  to Crunch data....   and a service CUDA 

    Did i even mention Physics?   

    "And after pulling up your post history it's clear you do not even grasp even the fundementals of how GPU or CPU process physics and are indeed persuaded by marketing hype. CUDA is what it is.. it's nothing remarkable or awsome, it's just Nvidia's version of calculating physics and using programable shaders. Problem is, they are loosing the battle and have really poured on the marketing hype in the last couple of months. Looks like you fell for it.

    I never stated how A GPU process Physics  beside what sort of things PhysX can ad to your Games.

    And that  A GPU is better at it than a CPU.  As of Date.  Well since your A forum stalker trying to call me out  into a fight   now you got one.

    I fell for what. buying a Nvidia...LOL  if you were any good at stalking you would have read that i have been using Nvidia well before Physix was around.  PhysX  is just a bouns. 

    ONCE AGAIN  WHAT PHYSX GAMES HAVE YOU PLAYED? ON A NVIDIA SYSTEM.?

    I already Stated  it's not the end all for Gaming... Its for people that want alttle more.  you didnt read that did you.

    Or that fact that if you dont care about always pushing to get the best out of a game..... Why do you use a PC for gaming

    Just get a PS3 or a XBOX. .........  you buy a PC to get a little bit more from a game...AA, AF, higher Rez , Physix 

    i posted  to state what games have it NOW and what games are coming out WITH it next year.   I posted because ATI fans said it was pointless cause  "LIKE only 3 games have it"  Proving a Fact.

    Did i ever state how physics work or programed....no

    Did i ever state that ATI will never be able...no

    So now lets get to you.

    And after pulling up your post history it's clear you do not even grasp even the fundementals of how GPU or CPU process physics and are indeed persuaded by marketing hype.  (PROVE THIS)

    CUDA is what it is.. it's nothing remarkable or awsome, it's just Nvidia's version of calculating physics and using programable shaders. Problem is, they are loosing the battle and have really poured on the marketing hype in the last couple of months. Looks like you fell for it.  (PROVE THIS)

    I have an old Agea card sitting around here somewhere... I use CUDA to aid in folding, yet neither will be a key player come xmass when it comes to ingame physics. (PROVE THIS)

    Nvidias approach was a great idea when CPUs were single cores, but in the near future multi-core CPUs will be taken to a new level. We will see machines selling this Xmass with 12 cores.(PROVE  THAT IT WILL BE EVEN 50% SUPPORTED BY W7.....And LETS SEE IF IT EVEN IS BY RELEASE)

    Then add in the excitment over Larrabee's numbers, along with DirectX 11 engines that are already shown using CPU physics...(SHOW ME)  anyone can (and have) noted the direction game physics are headed. (PROVE THIS )

    Though, it is humorous to watch kids learn about life ( DID YOU CHECK MY AGE... KID ?   you want to get personal with me?)

     

    if you want to go down that road.....

    PS:  see this is what happens when one of you .......................... with me......  then I Hammer  you Back ,   and then none of you  remeber  why i attacked.  so its ok to egg me into a personal brawl  with bellow the belt shots.  but its not ok for me to fight back with harsh below the belt shots.....Are you kidding. 





     

     

     lololololol... you kidding me?

    Anvil_Theory - Advanced Member
    Real Name: Joash Brookie
    Member since April 16, 2009
    Last Visit: May 10, 2009
    39 year old Male from Stardock, MI, United States

    Erowid420 - Advanced Member
    Real Name: Jaohn Belkin
    Member since May 13, 2009
    Last Visit: May 15, 2009
    34 year old Male from Mid west, MI, United States

     

    you made a new profile to come mess with me......what a tool!!!

     

     



     

     

    No, I wrote a letter to the lead staff member here accusing them of biased and bogus moderation, and I suspected both of you had something to do with it.

    [...]

     

     

     



     

    "No, I wrote a letter to the Lead Staff member here and accused them of biased and bogus moderators and I suspected both of you had something to do with it."

    Yeah... we have been on our TeamSpeak laughing at your complaint for the last 3.5 hours. I think I even heard someone say, "There once was a man named... Jedd." You have been nothing but trouble, with your baiting and stuff.

    First of all, you will need to close your extra account(s); you only need one here. Second, abide by the rules of the thread: if you cannot provide proof, then you will need to shut your mouth and go somewhere else. Provide something; I don't care if it's links all day long... something! Third, abide by the rules of the thread: post your system specs. You can lie (all you want on this one) three times a week for all I care... however, don't be shocked if someone throws you a curveball and calls you out on it!! Fourth, read all of the forum posts and links before you comment (really look into it... you might learn something). This thread was made for a reason, so that other people don't have to hear what they don't want to hear!

    I'm sorry that your personal information was shown for everyone to see (you know... the extra accounts). And I know that only the staff could see that information... right? Would you like us to take away a star from Dread, or maybe myself? Would that make you feel better?

    I can see your mouth moving... but I can't hear you!

    Abide by the rules!

     

    EVGA 780i SLI

    E8400 @ 4.05GHz

    2x 9800GTX SLI 804/1704/2249

    OCZ 1000W PSU

    Patriot DDR2 1200MHz @ 1150MHz, 5-5-5-12

    Matrox TH2Go (digital) widescreen gaming

    3x 20.2" Sceptre X20WG-NagaII displays @ 5040x1050 or 3840x1024 res across 3 screens

    Antec 900 case

    AC Freezer 7 Pro CPU cooler (lapped)

    G11 keyboard

    G5 mouse

    Tt Spirit RS RAM coolers

    2x WD 160GB 7200 16MB cache SATA 3.0, RAID 0

    Maxtor 250GB 7200 SATA 3.0 16MB cache storage drive

    Pioneer DVR-111L DVD burner

    Realtek HD onboard sound

    Klipsch 2.1 THX speakers

    Sennheiser HD-280 Pro headphones

    Labtec mic

    Been running this system for over a year!

     


  • VyronVyron Member Posts: 55

    This is a discussion forum, not an internet fistfight.

    Keep it clean, folks.

  • JackcoltJackcolt Member UncommonPosts: 2,170

    Damn, that didn't take long. Faster than I had expected. Kudos to the mod for not deleting anything, though.

    Okay, time to throw a few sticks in the fire.

    The ATI equivalent of CUDA is called Stream. They both do basically the same thing, which is providing a general-purpose processing architecture for their GPUs. Both have their own low-level and high-level APIs readily available for developers to use. Here is the catch: both CUDA and Stream support, or plan to support, the OpenCL API. So if developers choose to use OpenCL, both ATI and nVidia users will be able to use whatever product they are making. As regards rules 3 and 4, this is proven: you can find basically all of what I just said on the developers' sites or Wikipedia (I know, it can contain false information, but it's true in this case).

    So looking to the future, games will still support both ATI and nVidia. Of course nVidia will probably, like now, try to pay developers to integrate their own API for some effects, which then, like now, enhances the experience in some way (like in Mirror's Edge, where glass shatters more correctly). General-purpose processing will be done by both sides, and game-critical physics is going to use a unified API, which is OpenCL. That is, of course, unless one side can pay publishers enough money to cover the loss of a huge market group, which would be the other side.

    So as I've said before, I don't find PhysX interesting, right now at least. What you gain from it is minuscule. I mean, I played Mirror's Edge through, and if the glass had shattered more realistically I probably wouldn't have noticed it. So what I'm saying is, right now I can't see a reason to buy nVidia for PhysX alone. Of course, if you prefer their drivers and the price/performance is the same, PhysX would just be a bonus.

    FFS, now I'm in the thread. But please do as the mod says and keep it civil.
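To make the CUDA/Stream/OpenCL point above concrete: all three expose the same basic model, a "kernel" function executed once per work-item over an index space. Here is a plain-Python stand-in for that model (illustrative only; the function and harness names are made up, and a real runtime maps each index to a GPU thread instead of looping):

```python
# Sketch of the GPGPU programming model shared by CUDA, Stream, and OpenCL:
# one kernel function, executed once per work-item over an index space.
def saxpy_kernel(i, a, x, y, out):
    out[i] = a * x[i] + y[i]      # each work-item touches one element

def launch(kernel, n, *args):
    """Stand-in for a kernel launch over n work-items."""
    for i in range(n):            # on a GPU these all run in parallel
        kernel(i, *args)

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
out = [0.0] * 3
launch(saxpy_kernel, 3, 2.0, x, y, out)  # out = [12.0, 24.0, 36.0]
```

Writing the kernel against a vendor-neutral layer like OpenCL, rather than CUDA or Stream directly, is exactly what lets the same code target both ATI and nVidia hardware.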


  • Dreadknot357Dreadknot357 Member Posts: 148
    Originally posted by Erowid420

    [...]

    NO ONE SAW THIS POST AND COMMENTED... lol. I say something and I'm under fire?????

    WOW. Younger "doodz"? Hey, if any of your profiles are true... you are 4 years older...

    And I'm the one that takes heat...?

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..

  • Dreadknot357Dreadknot357 Member Posts: 148
    Originally posted by Jackcolt


    Damn, that didn't take long. Faster than I had expected. Kudos to the mod for not deleting anything, though.
    Okay, time to throw a few sticks in the fire.
    The ATI equivalent of CUDA is called Stream. They both do basically the same thing, which is providing a general-purpose processing architecture for their GPUs. Both have their own low-level and high-level APIs readily available for developers to use. Here is the catch: both CUDA and Stream support, or plan to support, the OpenCL API. So if developers choose to use OpenCL, both ATI and nVidia users will be able to use whatever product they are making. As regards rules 3 and 4, this is proven: you can find basically all of what I just said on the developers' sites or Wikipedia (I know, it can contain false information, but it's true in this case).
    So looking to the future, games will still support both ATI and nVidia. Of course nVidia will probably, like now, try to pay developers to integrate their own API for some effects, which then, like now, enhances the experience in some way (like in Mirror's Edge, where glass shatters more correctly). General-purpose processing will be done by both sides, and game-critical physics is going to use a unified API, which is OpenCL. That is, of course, unless one side can pay publishers enough money to cover the loss of a huge market group, which would be the other side.
    So as I've said before, I don't find PhysX interesting, right now at least. What you gain from it is minuscule. I mean, I played Mirror's Edge through, and if the glass had shattered more realistically I probably wouldn't have noticed it. So what I'm saying is, right now I can't see a reason to buy nVidia for PhysX alone. Of course, if you prefer their drivers and the price/performance is the same, PhysX would just be a bonus.
    FFS, now I'm in the thread. But please do as the mod says and keep it civil.

    I said I agree with ATI & Nvidia working on physics together.

    You are lumping me and Sir_Drip together on a lot of issues; my OP is a little different from his...

    ALL I stated was that some ATI fan said:

    "Don't go Nvidia, cause there are only 3 games that have working PhysX."...

    So I showed people that there are over 40, and some of the biggest titles this year will have it......

    The fact that many of you have said "I can't see the reason for buying nVidia alone for PhysX"...

    I never said that. I... Sir_Drip may have.

    WE BOUGHT NVIDIA for its power/support/OC ability, and we used it before it had physics...

    The PhysX driver is just an extra bonus.

    You're all hung up on that?

    Also, you played one broken game?

    All the fighting we have done... and you say PhysX is nothing after one game that was broken...?

    Ever watch MythBusters?

    How do they test things and get data?

    With a control and many different samples, then run the numbers... to get real data.
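The control-plus-many-samples point is exactly how a fair benchmark works, too. A minimal sketch (hypothetical harness names, Python for illustration): time the same workload over several runs and report the mean and spread against a control, rather than arguing from a single game:

```python
import statistics
import time

def benchmark(fn, runs=5):
    """Time fn over several runs; report mean and spread, not one number."""
    times = []
    for _ in range(runs):
        t0 = time.perf_counter()
        fn()
        times.append(time.perf_counter() - t0)
    return statistics.mean(times), statistics.stdev(times)

# A control workload and a candidate, measured the same way.
mean_ctrl, sd_ctrl = benchmark(lambda: sum(range(100_000)))
mean_test, sd_test = benchmark(lambda: sum(range(200_000)))
```

One run of one game is a single sample with no control; numbers like these only mean something once the spread across runs is small relative to the difference between the two workloads.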

    It's funny, cause I asked all of you about 15 times what games you have played with PhysX... (50 posts and 2 weeks later)

    Now I get one answer... you played ONE PhysX-enabled game...

    I'm sorry, but that is crazy...

    All of this... and all of your debate ammo comes from ONE game.

    One game defines all PhysX?

    You have to admit, that is beyond all sorts of reason.

    PS: Jack... did you see that guy up there and the nuts falling out of his head... lol. Man, you thought I was bad...

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..

  • Dreadknot357Dreadknot357 Member Posts: 148
    Originally posted by Erowid420

    Originally posted by Dreadknot357

    Originally posted by Erowid420

    Originally posted by Dreadknot357


    to start this off  People Keep saying GPU's doing the work of CPUS  will not happen.  And CUDA  is a marketing hype.
    you all like to read...here you go.   And though  its not for gaming.... yet (theory)   this could very well be the future (theory)
    http://www.nvidia.com/object/preconfigured_clusters.html
     
    And before  you Start going its Nvidia Hype  cause its coming from them.   
    Take a Look At  these  Companys  putting them out.
    http://www.nvidia.com/object/tesla_preconfigured_clusters_wtb.html
    Im not stateing that ATI  is not working on it.....  But i havent seen it.
    ............................................................................................................................
    Antec 1200 full tower

    i7 920 0C @ 4.2 on air

    Coolmaster V8 CPU cooler

    EVGA x58 SLI MB

    Corsair DDR3 6 gigs 1600 9.9.9:24 1T

    EVGA GTX 285 Stock IN SLI

    WD Raptor X 150gig X2 Raid 0

    Corsiar h1000watt PSU

    Creative Sound Blaster X-Fi Titanium Fatal1ty Champion.

    Pioneer Elite A/V receiver.

    Infinity Sound Speakers 5.1

    Smasung 26 inch LCD syncmaster 1900x1200

    Wacom Cintq 21 inch digtal art monitor

    logitech G15 keayboard

    logitech MX laser mouse

    logitech 5.1suround

    (Vantage P 26,008)
     
     



     

    I think you are a really confused individual!

    Nvidia (in those links) is offering a SERVICE, with their products in mind, yet, somehow you keep wanting to spin it with AMD in mind. Neither of your links illustrates Nvidia dominance or future viability with Physics.

    And after pulling up your post history it's clear you do not even grasp even the fundementals of how GPU or CPU process physics and are indeed persuaded by marketing hype. CUDA is what it is..  it's nothing remarkable or awsome, it's just Nvidia's version of calculating physics and using programable shaders. Problem is, they are loosing the battle and have really poured on the marketing hype in the last couple of months. Looks like you fell for it.

    I have an old Agea card sitting around here somewhere... I use CUDA to aid in folding, yet neither will be a key player come xmass when it comes to ingame physics.

    Nvidias approach was a great idea when CPUs were single cores, but in the near future multi-core CPUs will be taken to a new level. We will see machines selling this Xmass with 12 cores.Then add in the excitment over Larrabee's numbers, along with DirectX 11 engines that are already shown using CPU physics...  anyone can (and have) noted the direction game physics are headed.

    Though, it is humorous to watch kids learn about life.

     

     

    and we have a winner....

     

    "Nvidia (in those links) is offering a SERVICE, with their products in mind, yet, somehow you keep wanting to spin it with AMD in mind. Neither of your links illustrates Nvidia dominance or future viability with Physics.?"

    Tesla, an Nvidia product, being used in place of a CPU... in supercomputers, to crunch data... and a service, CUDA.

    Did I even mention physics?

    "And after pulling up your post history, it's clear you do not grasp even the fundamentals of how a GPU or CPU processes physics, and are indeed persuaded by marketing hype. CUDA is what it is... it's nothing remarkable or awesome, it's just Nvidia's way of calculating physics using programmable shaders. Problem is, they are losing the battle and have really poured on the marketing hype in the last couple of months. Looks like you fell for it."

    I never stated how a GPU processes physics, besides what sort of things PhysX can add to your games.

    And that a GPU is better at it than a CPU, as of today. Well, since you're a forum stalker trying to call me out into a fight, now you've got one.

    I fell for what, buying an Nvidia card? LOL. If you were any good at stalking you would have read that I was using Nvidia well before PhysX was around. PhysX is just a bonus.

    ONCE AGAIN: WHAT PHYSX GAMES HAVE YOU PLAYED, ON AN NVIDIA SYSTEM?

    I already stated it's not the end-all for gaming... it's for people that want a little more. You didn't read that, did you?

    Or the fact that if you don't care about always pushing to get the best out of a game... why do you use a PC for gaming?

    Just get a PS3 or an Xbox... you buy a PC to get a little bit more from a game: AA, AF, higher res, PhysX.

    I posted to state what games have it NOW and what games are coming out WITH it next year. I posted because ATI fans said it was pointless because "LIKE only 3 games have it". Proving a fact.

    Did I ever state how physics is processed or programmed? No.

    Did I ever state that ATI will never be able to do it? No.

    So now lets get to you.

    And after pulling up your post history, it's clear you do not grasp even the fundamentals of how a GPU or CPU processes physics, and are indeed persuaded by marketing hype. (PROVE THIS)

    CUDA is what it is... it's nothing remarkable or awesome, it's just Nvidia's way of calculating physics using programmable shaders. Problem is, they are losing the battle and have really poured on the marketing hype in the last couple of months. Looks like you fell for it. (PROVE THIS)

    I have an old Ageia card sitting around here somewhere... I use CUDA to aid in folding, yet neither will be a key player come Xmas when it comes to in-game physics. (PROVE THIS)

    Nvidia's approach was a great idea when CPUs were single-core, but in the near future multi-core CPUs will be taken to a new level. We will see machines selling this Xmas with 12 cores. (PROVE THAT IT WILL EVEN BE 50% SUPPORTED BY W7... AND LET'S SEE IF IT EVEN IS BY RELEASE)

    Then add in the excitement over Larrabee's numbers, along with DirectX 11 engines that have already been shown using CPU physics... (SHOW ME) anyone can note (and many have) the direction game physics is headed. (PROVE THIS)

    Though, it is humorous to watch kids learn about life. (DID YOU CHECK MY AGE... KID? You want to get personal with me?)

     

    if you want to go down that road.....

    PS: see, this is what happens when one of you .......................... with me... then I hammer you back, and then none of you remember why I attacked. So it's OK to egg me into a personal brawl with below-the-belt shots, but it's not OK for me to fight back with harsh below-the-belt shots... Are you kidding?





     

     

     lololololol... you kidding me

    Anvil_Theory - Advanced Member
    Real Name: Joash Brookie
    Member since April 16, 2009
    Last Visit: May 10, 2009
    39 year old Male from Stardock, MI, United States

    Erowid420 - Advanced Member
    Real Name: Jaohn Belkin
    Member since May 13, 2009
    Last Visit: May 15, 2009
    34 year old Male from Mid west, MI, United States

     

    You made a new profile to come mess with me... what a tool!!!

     

     



     

     

    No, I wrote a letter to the lead staff member here, accused them of bias and bogus moderators, and said I suspected both of you had something to do with it.

    I have been baiting you, knowing you would rage at all costs. Which is captured above. Your logic is self-deprecating and aggressive, but overblown. So I wanted to pique your interest and give you cause to suspect me... to see what you would do. And lo and behold, you post personal information that cannot be had otherwise. It's not public.

    Now me and the staff KNOW! (Thanks.)

     

     

    Secondly, your posts are too emotional and off topic. You seem to talk a lot about yourself and not the topic or the technology at hand. Please touch on the points I've made, or go home. If you don't understand the facts or the latest news, then go educate yourself. It's no secret that Nvidia is slowly getting shut out of several markets, that ATI sales are gaining market share, and that Nvidia is resorting to tactics like rebranding the same card 3 times, due to the lack of product placement and advanced tech. Yes, Nvidia is losing ground, and the CEO has been in hot water for a while...

     

    ...all of that was 8 months ago and old news, but it helps shape the political landscape. Not to mention the ego war between Intel and Nvidia. We get it... you are an excited little boy with a new present called PhysX and now you must be the beacon of light (go go fanboi). But you are not a true enthusiast, or overclocker, because we have been discussing deformable objects and physics in games for years, on most message boards, conferences and blogs. The battle (if there ever was one) for a physics standard is not over, but the platform games will use for their in-game physics is the CPU... since EVERYONE has one. It's a simple win/win situation.

    Go educate yourself on these matters and stop being a lemming for these companies. You younger doodz are so quick to boast and take sides, when all people want to do is pass along the latest and greatest knowledge. So, play whatever games you have to play, but if you are a slow learner or have to be force-fed the truth, I'm not your man. I don't have the patience to work with kids that have to be spoon-fed the answers.

     

     

     

    So you have refused to prove any of the statements that you have posted, across 4 threads and 2 profiles...

     

    Your tech talk and whatever else you have posted now look like the rambles of a bullshitter.

    I at least have admitted what I think is opinion, or theory, and have backed up facts with hands-on experience and links to solid sources.

    You shadow the facts with smoke and mirrors.

    You have refused to follow any of the rules of this thread (which you don't have to).

    But it makes you look like you're afraid to let the truth out...

    You want to debate with me... prove your statements.

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..
    image
    image

  • Carl132p Member Uncommon Posts: 538

    This thread is about comparing performance. Teraflops is apparently a measure of some performance. Teraflops is the most ridiculous thing I've ever seen taken seriously. Therefore this thread is ridiculous.
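For what it's worth, a "teraflops" number is just theoretical peak arithmetic throughput: shader cores x clock x floating-point ops per core per cycle. A quick sketch of the formula using the commonly quoted GTX 285 figures (240 shaders at 1476 MHz, 3 FLOPs per cycle; treat them as illustrative, since real games sustain far less than peak, which is rather Carl's point):

```python
# Peak FLOPS = cores x clock (Hz) x FLOPs per core per cycle.
# These are the usual spec-sheet numbers for a GTX 285, used here only
# to illustrate the formula -- peak throughput is not a benchmark.
cores = 240
clock_hz = 1.476e9
flops_per_cycle = 3

peak_flops = cores * clock_hz * flops_per_cycle
peak_tflops = peak_flops / 1e12
print(f"theoretical peak: {peak_tflops:.2f} TFLOPS")  # theoretical peak: 1.06 TFLOPS
```

The number says nothing about memory bandwidth, driver overhead, or how much of the chip a given game can actually keep busy, which is why a single teraflops figure is a poor stand-in for gaming performance.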

  • Dreadknot357 Member Posts: 148
    Originally posted by Carl132p


    This thread is about comparing performance. Teraflops is apparently a measure of some performance. Teraflops is the most rediculous serious thing ive ever seen. therefore this thread is rediculous.



     

    Hmmmmm, comparing teraflops?... reads through the thread... Am I missing something?

    Hmmm. Thread/rules: did not read or follow... Hmmm... I will let this one go.

    "Beauty is only is only skin deep..." said the AMD/ATI fan. "Blah..Thats just what ugly people say..." said the Intel/Nvidia fan. You want price / performance, use the dollar menu..
    image
    image

  • Erowid420 Member Posts: 93
    Originally posted by Dreadknot357

    Originally posted by Erowid420

    Originally posted by Dreadknot357

    Originally posted by Erowid420

    Originally posted by Dreadknot357


    To start this off: people keep saying GPUs doing the work of CPUs will not happen, and that CUDA is marketing hype.
    You all like to read... here you go. And though it's not for gaming... yet (theory), this could very well be the future (theory).
    http://www.nvidia.com/object/preconfigured_clusters.html
     
    And before you start going "it's Nvidia hype because it's coming from them"...
    Take a look at these companies putting them out:
    http://www.nvidia.com/object/tesla_preconfigured_clusters_wtb.html
    I'm not stating that ATI is not working on it... but I haven't seen it.
    ............................................................................................................................
    Antec 1200 full tower

    i7 920 OC'd @ 4.2GHz on air

    Cooler Master V8 CPU cooler

    EVGA X58 SLI motherboard

    Corsair DDR3 6GB 1600 9-9-9-24 1T

    EVGA GTX 285, stock, in SLI

    WD Raptor X 150GB x2, RAID 0

    Corsair 1000W PSU

    Creative Sound Blaster X-Fi Titanium Fatal1ty Champion

    Pioneer Elite A/V receiver

    Infinity 5.1 speakers

    Samsung 26-inch SyncMaster LCD, 1920x1200

    Wacom Cintiq 21-inch digital art monitor

    Logitech G15 keyboard

    Logitech MX laser mouse

    Logitech 5.1 surround

    (Vantage P 26,008)
     
     



     

    I think you are a really confused individual!

    Nvidia (in those links) is offering a SERVICE, with their products in mind, yet somehow you keep wanting to spin it with AMD in mind. Neither of your links illustrates Nvidia dominance or future viability with physics.


    So you have refused to prove any of the statements that you have posted, across 4 threads and 2 profiles...

     

    Your tech talk and whatever else you have posted now look like the rambles of a bullshitter.

    I at least have admitted what I think is opinion, or theory, and have backed up facts with hands-on experience and links to solid sources.

    You shadow the facts with smoke and mirrors.

    You have refused to follow any of the rules of this thread (which you don't have to).

    But it makes you look like you're afraid to let the truth out...

    You want to debate with me... prove your statements.



     

    I'm sorry, is that^^ English?

    I have already proven you wrong; if you cannot see this, then you are hopeless. Several people have also given you the answer and sent you the correct information. If you are unable to study information that's not part of marketing hype, then I can see why you keep failing to grasp these facts. That's probably why you are such an emotional ball of obscenities, defending yourself all the time.

    Though, I have to admit it is funny watching you struggle with this subject, because you were so sure of yourself when you were quoting marketing hype... lol (though it is always the same with fanbois)

     

     

     

     

    ___________________________

    - Knowledge is power, I've been in school for 28 years!

  • Jakeadunk Member Posts: 142

    I have an Intel chip and an ATI card; I fail to see the point here.

    Smarter than the average bear? That is assuming bears are smart.

  • Dreadknot357 Member Posts: 148
    Originally posted by Erowid420


    I'm sorry, is that^^ English?

    I have already proven you wrong; if you cannot see this, then you are hopeless. Several people have also given you the answer and sent you the correct information. If you are unable to study information that's not part of marketing hype, then I can see why you keep failing to grasp these facts. That's probably why you are such an emotional ball of obscenities, defending yourself all the time.

    Though, I have to admit it is funny watching you struggle with this subject, because you were so sure of yourself when you were quoting marketing hype... lol (though it is always the same with fanbois)

     

     

     

     

     

     

    Maybe you haven't understood the meaning of the word PROVE...



    Proving something is not done by just making a statement...

    you have to show some evidence.....

    Are you back to capping on grammar? Didn't we have this convo..?

    Quoting marketing hype?... I guess you missed the success story of a real organization using a GPU to do data crunching...

     



    "Utilizing the Tesla™ technology, the National Cancer Institute completed its cancer medical research calculations 12 times faster than with traditional x86 based servers. The research results can now be accumulated in 10 minutes as opposed to 2 hours".

    (This is the funniest one yet... yes, the National Cancer Institute is helping Nvidia complete the Death Star... to scam the world into Nvidia's marketing plans... lol. Or maybe the National Cancer Institute is a marketing ploy, using dying kids to further Nvidia's grip on the galaxy...)
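For the record, the quoted figures are at least internally consistent; a two-line check (this only verifies the press release's arithmetic, not the benchmark itself):

```python
# Sanity-check the quoted Tesla numbers: 2 hours on x86 servers vs
# 10 minutes on Tesla should match the claimed 12x speedup.
before_min = 2 * 60   # "2 hours" on traditional x86-based servers
after_min = 10        # "10 minutes" on Tesla
speedup = before_min / after_min
print(speedup)  # 12.0
```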

     

    ...and many more companies.

     



    Are you still on the PhysX point... when we are talking about GPUs AND CPUs?

    Well, I guess you sidestepped the whole thing where you think there is a conspiracy involving Nvidia, and that me and Sir Drip are mods for MMORPG? Care to enlighten us with the comments that led you to that point?

    You keep sidestepping your rambles. If someone can believe there is a conspiracy between mods and members to mess with him, and at the same time think Nvidia is out to get everyone... it forms a pattern of paranoia.

    Which would make all sane people second-guess your statements... If you truly believe this, post some sources besides hearsay...

    You do realize that between the times you run to the mods snitching on us... people ARE reading what you have written, full of emotional obscenities, defending yourself all the time... You do realize this, right?



    Or is that also a conspiracy?

    I'm not talking about PhysX in the OP. You're taking a post from another thread, where I stated what games have physics... If that's what you want to debate, state the subject first, so someone knows what you're talking about.

    If you're going to debate, get on the same subject... and quit posting BS knowing people will not read that far back to see if you're full of it or not...

    You want to debate... pick a subject.

    Find a post with a sentence where I said something, and quote it... then we can debate...

    Otherwise......

     

    As stated before

     

    You keep posting Comments made by Sir drip and are putting them in my mouth.....

    You realize there are 2 different people with the Same sig right?

    You're having a temper tantrum...

    You have gone off the deep end...

    You think Intel/Nvidia is a conspiracy,

    And that me and Sir Drip are mods for MMORPG,

    And that everyone is out to get you and ATI...

    You have lost all touch with reality because I made you look foolish about correcting my grammar with bad grammar, and you snapped.

    So you made a second profile to mess with me.

    It's funny watching a middle-aged man break down on the Internet...

    If I was this foolish kid you say I am... why are your split personalities fighting with me?

    You are by far the funniest person I have met on the Internet...

    And you justify all the nagging my wife gives me for being on the PC all the time... lol

    And you think that if you keep pushing me, you will get me to respond harshly so I will get banned...

    Funny how you think that you're mature...

     

    "Beauty is only skin deep..." said the AMD/ATI fan. "Blah... That's just what ugly people say..." said the Intel/Nvidia fan. You want price/performance, use the dollar menu..

  • miagisan Member Posts: 5,156

    I use an AMD 4200+ chip and an Nvidia 8800GT vid card...

     

    Haven't upgraded in years and can run everything almost on max...

     

    I go with parts that are good and stable. I like AMD chips because they work very well and cost a lot less than Intel. The 8800GT was one of the only Nvidia cards worth spending a little extra on, but my next vid card will definitely be a 5870 when they drop in price. Couldn't give a damn about the fanboyism "omg nvidia/ati ftw" crap... I buy what's good and economical. Don't need to flex my computer e-peen.


  • Dreadknot357 Member Posts: 148
    Originally posted by miagisan


    I use an AMD 4200+ chip and an Nvidia 8800GT vid card...
     
    Haven't upgraded in years and can run everything almost on max...
     
    I go with parts that are good and stable. I like AMD chips because they work very well and cost a lot less than Intel. The 8800GT was one of the only Nvidia cards worth spending a little extra on, but my next vid card will definitely be a 5870 when they drop in price. Couldn't give a damn about the fanboyism "omg nvidia/ati ftw" crap... I buy what's good and economical. Don't need to flex my computer e-peen.



     

    Well said. If you read back, there are many times I have said just this.

    My loyalty is to gaming... whatever gets me to the top in performance wins in my book. Right now Intel/Nvidia is what I need. Somewhere in the heat of battle, someone called me a fanboi... Yeah, right now I am a fan of Intel/Nvidia...

    I still have my old AMD and ATI parts around... but I have the extra money for the best, so I went that road.

    If AMD/ATI keep their game solid for more than a few years and can punk Intel/Nvidia, I will go with them.

    People in forums like to cherry-pick comments, or just read the last post and make assumptions. Then a person has to go back and explain themselves. It derails the thread, and then the fight starts.

    "Beauty is only skin deep..." said the AMD/ATI fan. "Blah... That's just what ugly people say..." said the Intel/Nvidia fan. You want price/performance, use the dollar menu..

  • LAHScorp Member Posts: 22

    Just going to say this to get it off my chest: plain and simple, none of your hardware specs and shit matter. In 6 months your shit will be obsolete anyway, so why pay so much money for bragging rights? Right now my specs are as follows.

    AMD Phenom II X4 940 processor @ 3.0GHz stock (no game fully utilizes this; dual cores are only used on the newest of games, and few at that)

    Sapphire ATI Radeon HD 4870 1GB @ stock speeds; runs everything at the highest settings besides Crysis (which no one can run maxed unless maybe you use 3-4 graphics cards, lol, according to Tom's Hardware).

    Kingston HyperX 1066 DDR2 2x2GB memory

    Sound Blaster Audigy 4 Pro sound card; plays as much as I need to hear and all I could ask for.

    PC Power & Cooling PSU, 750W continuous / 825W peak, 80+ certified

    XClio Supertower Case

    ASRock A780 mobo.

    This system plays anything I could ever ask it to and only cost me $819.02. I don't need bragging rights; I need a computer that will play what I need it to, and this will play everything to date on highest, even Crysis as long as AA is off. Why fight over something this stupid? The only thing I can say about either side is that AMD is cheaper, hands down. I can't stand the whole i7 hype when you don't even utilize that shit. Another thing I can't stand is overclockers; what are you overclocking for when you already run things on max settings? Dear god, people, I don't think I need to remind you that you are lucky to even have what you've got. Leave it at that. Intel, Nvidia, ATI, and AMD all have their strong points and weaknesses, but one thing is for sure: they all get the job done, so why be involved in a retarded war over who has the best?

    I think the i7 920 is decently priced but just can't compete on price with AMD providing their quad cores at $189.99, lol, and the new 955 AM3 3.2GHz processor at $249.99 (Newegg price). It's almost $100 more for an Intel i7 processor that I wouldn't even fully use to begin with. I think without AMD/ATI/Nvidia I would find it hard to upgrade to a good machine. They all have good products, and so does Intel; I just think Intel overprices theirs a bit too much. I mean, $1,000 for a processor? Seriously? No thanks, I'll stick to my good ole AMD :P.

    In closing, buy what you need to get you through. I am 99.9% sure I will not have to upgrade for the next 2-3 years; games are just now starting to use multi-core units, and it'll take them another 2 years before using quad cores becomes standard. The only thing I can actually see myself investing in is a graphics card, and I don't mind spending $200-$300 on a good one. The only reason I would do that is the new DX11 cards coming out, and even those will take a while to be utilized in games; they are still trying to implement DX10 at decent frame rates (Age of Conan, for example). So I may not even have to invest in one of those for a few years. Just depends, I guess. (This entire post is my opinion, not meant to flame; I just want a serious answer as to why people fight over things that aren't utilized.) As awkward as it sounds, I like mixing AMD/Nvidia; it just seems to work decently for me, lol. I only switched to ATI to test out the 4870, and I've got to say I'm impressed with how well it's done. Still love Nvidia though ^_^.
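    For what it's worth, the "games don't use your quad core" point comes down to whether the developers actually spawn worker threads. A minimal C++ sketch (hypothetical, not from any real engine; `run_workers` and the fixed 1000-unit work chunk are made up for illustration) of splitting work across however many cores the CPU reports:

    ```cpp
    #include <atomic>
    #include <cstdio>
    #include <thread>
    #include <vector>

    // Spawn n worker threads that each complete a fixed chunk of "work".
    // Returns the total work done once all workers have joined.
    int run_workers(unsigned n) {
        std::atomic<int> work_done{0};
        std::vector<std::thread> workers;
        for (unsigned i = 0; i < n; ++i)
            workers.emplace_back([&work_done] { work_done += 1000; });
        for (auto& t : workers) t.join();
        return work_done.load();
    }

    int main() {
        // hardware_concurrency() reports logical cores; it may return 0 if unknown
        unsigned cores = std::thread::hardware_concurrency();
        if (cores == 0) cores = 1;
        std::printf("%u core(s) -> %d units of work\n", cores, run_workers(cores));
        return 0;
    }
    ```

    A single-threaded game would only ever call the equivalent of `run_workers(1)` no matter how many cores you paid for, which is the gist of the argument above.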

  • Orphes Member Uncommon Posts: 3,039
    Originally posted by Dreadknot357


     
    People in forums like to cherry-pick comments, or just read the last post and make assumptions. Then a person has to go back and explain themselves. It derails the thread, and then the fight starts.

     

    Hahahaha, tell that to your brother in arms... This is getting rather silly. You are saying that it is stupid to quote out of context; tell that to your friend.

    http://www.mmorpg.com/discussion2.cfm/thread/234814/page/4#2832784

    I'm so broke. I can't even pay attention.
    "You have the right not to be killed"

  • Dreadknot357 Member Posts: 148
    Originally posted by Jakeadunk


    I have an Intel chip and an ATI card; I fail to see the point here.



     

    I don't even know what the point is anymore; the OP was about GPUs doing the job of a CPU... lol

    Someone had stated that CUDA was a marketing ploy to get people to buy inferior Nvidia GPUs. They said Nvidia, and all the game devs and companies that say they use Nvidia, have made a deal with the Republic to destroy AMD/ATI...

    And I am somehow helping Nvidia and MMORPG make all ATI fans look stupid for not buying an Nvidia GPU.

    And I'm the one that is stupid for buying an inferior, overpriced Nvidia GPU, because PhysX is a myth made by the dark side to control the galaxy... and I gave in to the dark side by falling for the PhysX marketing hype... I think.

    Even though my OP is about a GPU doing a CPU's job.

    And in another thread I stated that there are now 40+ PhysX-enabled games... lololol

    So yeah, where are we?

    "Beauty is only skin deep..." said the AMD/ATI fan. "Blah... That's just what ugly people say..." said the Intel/Nvidia fan. You want price/performance, use the dollar menu..
