If 3D gaming on one monitor isn't enough for you, maybe three will suffice.
"Nvidia came out in force today, unveiling the next generation of Tegra as well as bolstering its presence in the 3D marketplace. This concerns the latter.
I stopped by the Nvidia booth earlier today to catch up on the latest and greatest in 3D tech. The press conference earlier today covered most of the bases, but the 3D Vision Surround demo beckoned. 3D Vision Surround takes the existing 3D Vision concept/hardware and triples it. In other words, you're playing a PC title, in 3D, over three different screens.
The hardware needed inside your gaming machine isn't as simple as AMD's approach, where a single Radeon 5000-series card can power three displays. The Surround setup requires two Nvidia cards in SLI; the demo on the show floor was running two "Next Generation" GF100 GPUs. One GPU powers the left and center displays, and the other powers the right display. In this case, the displays weren't monitors but three 120 Hz DLP projectors pointed at one very long screen. If you happen to have three 120 Hz monitors lying around, you should be good to go.
There's no word on pricing and specific GPU compatibility just yet (I was told it will work with most Nvidia GPUs), and the launch time frame is within the next three months."
Interesting Tech!
Comments
Some disadvantages: it requires two graphics cards, and it also requires 120 Hz monitors.
ATI's concept sounds much better too.
ATI has already done this and much better.
In 3D?
This is the year of 3D, apparently: all the new TVs and tech at CES and such. I read about all the companies Nvidia is working with to support 3D in games, and yet nothing from ATI.
What do you mean by 3D in games?
www.nvidia.com/object/3D_Vision_Main.html
Thanks for the link. That shows me what they are doing, and it looks very interesting. I will keep my eye on this for sure.
No problem. They've actually had 3D Vision for a while. If you have a newer Nvidia card, you can turn it on even if you don't have the full setup, just to take a look at the effect. Just look for 3D in the Nvidia control panel.
Great, now I can watch my PhysX in full 3D!
For those that couldn't tell, I was being sarcastic. This is just a pathetic gimmick, and Nvidia needs to step up to the plate and make some decent DX11 GPUs before they end up being the red-headed stepchild like ATI was for so many years.
ATI/AMD have upped their game and Nvidia dropped the ball; this 3D thing is a gimmick. Any post-8800 GPU will run it, but you need the glasses and the monitor, and you need two cards for more than one screen. Really, few gamers use multiple monitors; how many of those are 120 Hz, and how many would want to play with 3D glasses?
ATI runs three monitors off one card (Eyefinity), and their cards are better, cheaper, and have DX11. Nvidia needs to pull their fingers out and make good cards; nothing else is going to make me buy Nvidia over ATI. This is a nice marketing tool, but any serious gamer is just going to look and say, "WTF have those guys been smoking?"
I'm certain ATI will counter with something similar, and then they'll just embarrass Nvidia, saying, "Hey, we have 3D too, but it only requires one card that performance-wise tears two Nvidia cards a new one, and it's DX11-ready!" The time and money could have been better spent, in my opinion, instead of going toward a 3D gimmick and putting down DX11 because you don't have it on your cards yet.
"Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron's cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end for they do so with the approval of their own conscience"
C.S. Lewis
My thoughts exactly, really.
It's time to release some serious information about Fermi. Instead, they keep bullshitting about random gimmicks such as PhysX (a gimmick that software physics engines already handle reasonably well) and 3D (again, a gimmick that costs a lot for minimal gains).
Of course, some might say that Nvidia is trying to move away from the old raw-FPS race and bring new technologies to the market, but that doesn't sell me. It's time to see some real benchmarks, information, and specs for their new generation.
Still, using 3D in games causes such a huge performance hit that you can't play any newer games at decent settings without a top-end computer.
Nvidia in recent years has been trying to push its own standards that aren't supported by other hardware. They have also been holding back serious GPU development since 2006. Closed hardware formats like Tegra, PhysX, and 3D Stereo Vision are doomed to fail, with Nvidia losing market share.
It's not actually that hard to implement stereoscopic vision in 3D games: you just render two cameras instead of one, paired closely to one another. Then you need a video card that can double its workload and sync to shutter glasses. There are plenty of open hardware formats that do this in OpenCL, so capability shouldn't be the issue for ATI. ATI has had stereoscopic 3D support since 2008.
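To make the "two cameras" idea concrete, here's a minimal sketch of how a renderer derives the two eye positions from one game camera. The function name and the 0.065 m interocular distance are illustrative assumptions, not anything from Nvidia's or ATI's actual drivers:

```python
# Minimal sketch of stereoscopic camera setup: the scene is rendered
# twice per frame, once per eye, with the two virtual cameras offset
# horizontally along the camera's right vector.

def stereo_eye_positions(camera_pos, right_vector, eye_separation=0.065):
    """Return (left_eye, right_eye) positions, each offset from the
    game camera by half the interocular distance along right_vector."""
    half = eye_separation / 2.0
    left_eye = tuple(c - half * r for c, r in zip(camera_pos, right_vector))
    right_eye = tuple(c + half * r for c, r in zip(camera_pos, right_vector))
    return left_eye, right_eye

# Example: camera at eye height, looking down -Z, right vector +X.
left_eye, right_eye = stereo_eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
print(left_eye, right_eye)
# The renderer then draws the frame from each position in turn,
# alternating buffers in sync with the shutter glasses at 120 Hz.
```

This is also why the performance hit is so large: the GPU does roughly double the rendering work every frame.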
ATI would handle the doubled rendering load far better than Nvidia. Their cards just have so much processing juice in them.
Try this link out: www.mmorpg.com/discussion2.cfm/thread/268255/The-Future-is-HereHolographiclike-gaming.html ... it is here ... and it is cheap.
LOL, Nvidia has been announcing cards for 7 months now, ever since ATI released the ATI 5xxx series. Anyone who thinks ATI has been sleeping all those 7 months truly doesn't know ATI.
AMD is with ATI. Intel? Hmm, no news from them, and no news means they are on to something very big. They might delay like they often do, but when they do launch, I bet only ATI might be able to keep up.
In case you guys didn't know, Intel has been on the 32 nm process for a month or two now.
Intel is close to releasing their next-gen gamer stuff.
Will the Intel graphics be 32 nm? Hmm, don't know, but it wouldn't surprise me. The rumor is their graphics sits on a similar die, right beside a 32 nm monster of a processor. If this is true, some might be in for a nasty surprise, since we all know Intel's process makes it possible to pack a lot more punch compared to AMD.
Huh? Larrabee got canned.
Well, yes. We need those next-gen cards soon; ATI is leading right now, and 3D stuff like this sounds rather useless to me.
But on the other hand, they shouldn't stress to release a card too early either; both companies (and 3dfx and Matrox) have done so in the past with bad results. I'd rather wait 9 months to get something that really kicks than have a slightly upgraded card with DX11.
ATI, for their part, should work harder on the software side of their cards.
3D will change things eventually, but stuff like this is just a waste of money on something that is cool but useless. We are still a few years away from good and useful 3D gaming. It might open up the road for stuff that is actually useful in the long run, but the idea with 3D glasses was something Nvidia actually bought from 3dfx a long time ago. I remember my dad having something like that 10 years ago or so; I wasn't impressed then either.
Wait 9 months for a card that kicks ass? If you're going to wait that long, you could never buy a card, because another 9 months will bring something else twice as badass; that's past the halfway point of a generation's lifespan. Speaking of which, Fermi at 6 months after the 5800 series is almost not even the same generation; it's halfway to next-next gen.
3D does seem gimmicky right now, and you're right, it needs a few years still, but I'm still excited to try it out. I thought Matrox TripleHead2Go was a really expensive gimmick, but after it became affordable through Eyefinity and cheap monitors, I tried it out, and it really is game-changing. So I'm glad Nvidia is adding it too, though having to use SLI to get it does mess up the "affordable" part.
Can't say enough how awesome three monitors is, though; it really got me back into gaming. So I don't want to write 3D off without having tried it, but it does need to get much more affordable and standardized first. Maybe next Christmas.
This is why games SHOULD be using the PhysX engine; it takes a ton of load off the CPU and GPU and is MUCH faster at computing physics.
Nvidia is the moneybags; if there is anything out there that they can't do, they will probably buy the company or license the program/engine. This is exactly what Epic does with the Unreal Engine; they have brought on board all sorts of developers to further enhance it.
Epic has proven that you can have VERY high-end graphics without the load that usually goes with them, and if you add in the PhysX engine, you should be able to produce one hell of a game. Plus, unlike the old days of EQ2, there is better compression, which allows you to maintain a good-looking texture and lessen the load.
If a game is written in DX10 with the proper engine/software, there is no reason we can't have a movie-like production for a game. The problem after that is getting the developers to spend the time to create sequences and animations; they are too cheap and set goals to make a profit, not to make great games.
Never forget Three Mile Island, and never trust a government official or company spokesman.
I don't disagree with you, but quad-core CPUs also have plenty of untapped horsepower to do more advanced physics than they are doing right now. The real problem, like you said, is dev support. Instead of low physics everywhere, we could get medium physics on quad cores and high physics on (dedicated) GPGPUs.
As far as 3D Vision and PhysX go, the physics really ought to be done on the CPU or a dedicated GPGPU, or you're just eating into GPU cycles you need to push polys for the 3D. The physics calcs aren't free; any physics you do on the GPU eats into the already bottlenecked graphics subsystem.
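The budget argument above can be made concrete with some back-of-the-envelope arithmetic. The 2 ms physics cost below is a made-up illustrative figure, not a measurement:

```python
# Illustrative frame-budget arithmetic: shutter-glasses stereo needs
# 120 Hz output, so each frame must finish in ~8.33 ms. Any time the
# GPU spends on physics comes straight out of the rendering budget.

refresh_hz = 120
frame_budget_ms = 1000.0 / refresh_hz        # ~8.33 ms per frame at 120 Hz
gpu_physics_ms = 2.0                         # hypothetical GPU physics cost

render_budget_ms = frame_budget_ms - gpu_physics_ms
print(f"Total frame budget:        {frame_budget_ms:.2f} ms")
print(f"Left for rendering on GPU: {render_budget_ms:.2f} ms")
# Moving physics to the CPU or a dedicated GPGPU hands those 2 ms
# back to the renderer, which matters most when 3D already doubles
# the rendering work per frame.
```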