Comments
I'm inclined to say that's definitely pushing it. But seeing as cut-down cards usually have low power requirements, you might be in luck. I'd recommend visiting the forum of the manufacturer of the card you're considering and seeing what they have to say there.
No, it won't. I would be surprised if any decent gaming rig ran on a 305 watt power supply (assuming it was made after 2006).
Playing: EVE Online
Favorite MMOs: WoW, SWG Pre-cu, Lineage 2, UO, EQ, EVE online
Looking forward to: Archeage, Kingdom Under Fire 2
KUF2's Official Website - http://www.kufii.com/ENG/ -
No. You'll need a minimum of 400W for that.
That power supply must be at least 10 years old and is worthless today. You will need 450W for even a minimum-spec computer today. I have 750W myself, but I recommend 600W+, more if you want the latest high-end GFX cards.
400W was standard 6 years ago; less than that takes us back to the Pentium days.
Actually, you're wrong. It will run; I asked around on several IT forums and they all say it will run with ease.
Actually, you're wrong. It will run; I asked around on several IT forums and they all say it will run with ease.
And it will burn your PSU, just like the guy's PC I saw when I was visiting my former employers, which they were fixing (nothing there to fix really; what's fried is fried).
That 300W PSU will probably be at 30% efficiency, which results in an actual 100W output.
If you actually think your PSU can produce 300W, you're uneducated. The best PSUs have 80-85% efficiency (80% really, but since I haven't seen all of them, there might be some over 80%), and old ones like yours will have a terribly low percentage. It will run; the question is for how long. Are you willing to risk killing your PSU, your motherboard, your GPU, probably even your CPU? Well, more or less everything.
I was reading a lot of reviews on Newegg and it turns out quite a few people have 300W PSUs working with the 8600GT. I'm going to buy it.
GOOD LUCK!
Actually, you're wrong. It will run; I asked around on several IT forums and they all say it will run with ease.
And it will burn your PSU, just like the guy's PC I saw when I was visiting my former employers, which they were fixing (nothing there to fix really; what's fried is fried).
That 300W PSU will probably be at 30% efficiency, which results in an actual 100W output.
If you actually think your PSU can produce 300W, you're uneducated. The best PSUs have 80-85% efficiency (80% really, but since I haven't seen all of them, there might be some over 80%), and old ones like yours will have a terribly low percentage. It will run; the question is for how long. Are you willing to risk killing your PSU, your motherboard, your GPU, probably even your CPU? Well, more or less everything.
What efficiency are you talking about?
30% efficiency would mean that the PSU has to draw around 300W from the wall outlet just to produce 100W.
With 80% efficiency, producing 240W would mean drawing 300W from the wall outlet.
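To put numbers on that: efficiency describes the AC-to-DC conversion, not the rating. A quick sketch (just an illustration, nothing measured):

```python
# Rough sketch of how PSU efficiency relates wall draw to DC output.
# efficiency = DC watts delivered to the components / AC watts pulled from the outlet

def wall_draw(dc_output_w, efficiency):
    """AC watts pulled from the outlet to deliver a given DC output."""
    return dc_output_w / efficiency

def dc_output(wall_draw_w, efficiency):
    """DC watts delivered to the components for a given wall draw."""
    return wall_draw_w * efficiency

print(round(dc_output(300, 0.30)))   # 90  -> ~90W delivered from a 300W wall draw at 30%
print(round(wall_draw(240, 0.80)))   # 300 -> 300W from the wall to deliver 240W at 80%
```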
I'm so broke. I can't even pay attention.
"You have the right not to be killed"
What power supply?
What is your other hardware?
A calculation example:
- 1 fan, around 0.3A
- Quad-Core Q9550 60W, 5A
- 250GB Western Digital Hard Drive 0.4A
- 1 DVD/RW 2A
That would make around 8A under load for those things.
Taking a Fortron 300W for example:
FSP300-60EP 300W --- 12V1 8A, 12V2 13A
12V1 would go to the CPU. 12V2 would then have 10A left for running your mainboard and your GPU.
This article claims that the 8600GT, their model, draws 50W. That is 4.2A.
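If you want to redo that math yourself, here is the same tally as a quick sketch (the amp figures are just the estimates listed above, not measurements):

```python
# Rough 12V amperage tally for the example build above (figures are estimates).
components_amps = {
    "case fan": 0.3,
    "Quad-Core Q9550 (60W on 12V)": 5.0,
    "250GB WD hard drive": 0.4,
    "DVD/RW": 2.0,
}
total_amps = sum(components_amps.values())      # 7.7A, call it ~8A under load
print(round(total_amps, 1))

# FSP300-60EP rails: 12V1 = 8A (feeds the CPU), 12V2 = 13A (everything else).
left_on_12v2 = 13 - (total_amps - 5.0)          # ~10.3A left for mainboard + GPU
gpu_amps = 50 / 12                              # 8600GT at ~50W is ~4.2A on 12V
print(round(left_on_12v2, 1), round(gpu_amps, 1))
```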
-----
Looking through some more reviews, this states that the full system power consumption is 188W under load.
The test system.
That contains a Silverstone 850W, with efficiency greater than 80%.
Let's give it 85% efficiency. That would mean about 160W delivered by the PSU after the AC-DC conversion. Note that I haven't seen in the review whether they already deducted the efficiency loss.
That 160W covers every power rail inside that computer, but let's pretend there is nothing other than the important 12V rail.
160W is 13.3A on 12V (versus 15.7A if they already deducted efficiency). And since there is some usage on other voltages inside the computer, it is pretty safe to say that their setup does not use more than 13.3A on 12V.
(They are using this to measure the power consumption)
And that system, in the review, would be able to run on that 300W Fortron.
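Roughly, that works out like this (a sketch, assuming the 188W figure was measured at the wall):

```python
# Sketch: translate the review's measured system draw into amps on the 12V side.
measured_at_wall_w = 188        # full system under load, from the review
efficiency = 0.85               # assumed for the Silverstone 850W

delivered_w = measured_at_wall_w * efficiency   # ~160W actually fed to the components
amps_if_wall_reading = delivered_w / 12         # ~13.3A if 188W was measured at the wall
amps_if_dc_reading = measured_at_wall_w / 12    # ~15.7A if efficiency was already deducted

print(round(delivered_w), round(amps_if_wall_reading, 1), round(amps_if_dc_reading, 1))
```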
I'm so broke. I can't even pay attention.
"You have the right not to be killed"
The PSU wattage rating is what it outputs AFTER the inefficiency from power conversion, so 305W is what you will get from the PSU, though it actually pulls more than that from the wall.
The efficiency will be anywhere between 70% and 85%, depending on how much power you are drawing. At low loads the efficiency is around 70%, so if you were drawing only 100W from the PSU it would be drawing 142W from the wall. At max load the efficiency is usually around 80%, which would mean you would be using 305W but drawing 381W from the wall.
So the PSU, when brand new, could output 305W, but by now with capacitor aging it's probably down to 250W output, and cheaper PSUs are sometimes rated by peak wattage instead of max sustained wattage, so you may get less.
However, looking around at tech sites that measure power consumption for the whole system with a specific video card installed, a system with the 8600GT seems to hit 185-200W peak, so I think you could get away with it if you really needed to. But I do agree that you should figure out a way to get a better PSU.
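If you want to sanity-check it, here is a rough back-of-the-envelope version (the 250W derated figure is my guess, and I'm assuming those site measurements were taken at the wall):

```python
# Back-of-the-envelope check: does a tired 305W PSU still cover an 8600GT system?
rated_output_w = 305            # label rating, DC output side
derated_output_w = 250          # rough guess after years of capacitor aging
system_peak_wall_w = 200        # high end of whole-system peaks with an 8600GT
efficiency = 0.80               # assumed conversion efficiency at that load

dc_load_w = system_peak_wall_w * efficiency     # ~160W actually asked of the PSU
headroom_w = derated_output_w - dc_load_w       # ~90W of margin left
print(round(dc_load_w), round(headroom_w))
```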