I am running a Gigabyte 6850 to a 24" Samsung P2450H at 1920x1080. I have the latest drivers installed, including the ATI CCC. There is just an overall washed-out, fuzzy lack of quality on the monitor when an HDMI cable is used. When I use a DVI cable everything is perfect and beautiful. I thought HDMI was supposed to be better than DVI. I have searched forum after forum and can't find a solution, and many others are also having this problem.
I've tried:
Comments
HDMI cables are just smaller than DVI; knowing ATI, it's a Catalyst issue and will likely never be fixed.
Doesn't sound like the monitor is broken, since it works great on DVI according to you yourself.
PS: HDMI cables support resolutions of 1920x1080.
DVI cables support up to 2560x1600 on a big enough screen with a strong enough graphics card driving it.
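As a back-of-the-envelope check on those resolution claims, the limiting factor is the link's pixel clock: single-link DVI tops out around 165 MHz, dual-link at roughly double that. The sketch below uses a rough 20% blanking overhead as an assumption, not exact VESA timings:

```python
# Rough pixel-clock estimate: active pixels times refresh rate,
# plus an assumed ~20% blanking overhead (not exact CVT timings).
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.20):
    return width * height * refresh_hz * blanking_overhead / 1e6

SINGLE_LINK_DVI_MHZ = 165.0  # TMDS clock limit of one DVI link
DUAL_LINK_DVI_MHZ = 330.0    # two links in parallel

for w, h in [(1920, 1080), (2560, 1600)]:
    clk = pixel_clock_mhz(w, h, 60)
    link = "single-link" if clk <= SINGLE_LINK_DVI_MHZ else "dual-link"
    print(f"{w}x{h} @ 60 Hz needs ~{clk:.0f} MHz -> {link} DVI")
```

So 1920x1080 at 60 Hz fits comfortably within a single link (which is also what early HDMI carries), while 2560x1600 is where dual-link DVI becomes necessary.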
While this won't help you use HDMI - I cannot see the advantage of DVI versus HDMI. Both formats utilize a purely digital signal with no loss.
If you really want to use HDMI you would need to isolate the culprit. This would require a second monitor or TV that can accept HDMI (and preferably DVI as well) to see if it is fuzzy on that display as well. If so then you can pinpoint the video card as the culprit, otherwise it might just be the way your display outputs an HDMI signal.
The really cool thing about HDMI is its ability to carry an audio signal along with the video. While this is wonderful in the home theater world it likely has no current benefit in the PC world, until perhaps your video card can pass that signal along, and even then it would only help if your display had built in speakers, etc.
I would do more research on the benefits of running HDMI on your PC versus DVI - I am betting in terms of picture quality there is no difference at all.
- MMORPG.COM Staff -
The dead know only one thing: it is better to be alive.
http://forum.ecoustics.com/bbs/messages/34579/122868.html
Read that, it'll explain the whole DVI vs HDMI thing.
TL;DR: The only major difference between DVI and HDMI cables is that HDMI cables carry an audio signal on top of a video signal. Personally, I've found that when it comes to computer monitors you're better off using a DVI cable, and if you're using a TV as a monitor you're better off using an HDMI cable.
Edit: An admin could also beat me to it >.>
Strange, because DVI and HDMI share common digital pins for the image. Not familiar with the monitor, but are there any built-in options for scaling? 1080p televisions often have something of the sort, and you generally want them disabled entirely when using them as a computer monitor, since they can noticeably wreck the image quality.
A Modest Proposal for MMORPGs:
That the means of progression would not be mutually exclusive from the means of enjoyment.
Wait, I found the answer here!
http://www.tomshardware.com/forum/281389-33-cant-1080p-resolution-samsung-p2450h
Apparently my monitor has two settings: AV mode and PC mode. It was on AV mode. Upon switching it to PC mode all is good!
Really there isn't any noticeable improvement over DVI, about equal as far as I can tell.
Thank you all for the quick responses! (man I feel dumb)
Oh yeah, forgot to ask you if it was a TV with AV/PC mode options - you said monitor, so I figured it was made purely for PC!
If it works with a DVI cable, then use a DVI cable. HDMI isn't better than DVI; the only real advantage of HDMI is that the standard includes a way to send audio as well as video. If you don't need that, then I'd use DVI just so that the cable can't come loose. HDMI is mainly for televisions, which really do need audio. Both formats are in the process of being phased out in favor of DisplayPort, but that could take a while to become the standard. AMD, Apple, and Dell are the main advocates of DisplayPort.
With any digital type of cable, the card computes some picture, it goes through the cable and arrives at the monitor exactly as the card computed it, and then the monitor displays the picture as closely as it can to the data it was sent. Slight distortions are generally the fault of the monitor, but that shouldn't vary by whether it's an HDMI cable or a DVI cable.
Big distortions could be the fault of defective hardware on either end. It could also be a driver issue, as you're still using very early drivers. That's a problem inherent to buying a card of a new architecture right after it comes out. Make sure you get Catalyst 10.12 when it releases later this month (any day now), as those are the first drivers for your card that have a chance of being mature.
HDMI is compressed, DVI is uncompressed and has a larger bandwidth. Use DVI or DisplayPort if at all possible.
The issue with HDMI before was that audio needed special driver support under Windows Vista. This was fixed by ATI in 2006, and by nVidia only recently. If everything is set up properly with the monitor, it should display correctly.
BTW, everyone but monitor panel makers advocates for DisplayPort. The big reasons are the lack of a licensing fee and the superior bandwidth. Without Samsung and LG on board, it's not going to take off at all. Luckily LG is starting to use DisplayPort, so Samsung has to follow suit.
What about a monitor that's HDMI only?
I have an HDMI cable from the monitor to the graphics card.
I also have one that's HDMI at the monitor and converts to DVI at the graphics card.
Which one should I use?
HDMI is not compressed, it's pure digital data.