Author Topic: DVI or VGA?  (Read 5129 times)
Tony B.
« on: August 02, 2008, 10:26:09 AM »

Hi, I recently acquired a Samsung 213T monitor.  I hooked it up over DVI and calibrated it with an i1Display 2.  During calibration the monitor only allows brightness control (no contrast, user color, etc.).  I went to Samsung's site and found that the monitor only enables the contrast, user color, etc. controls when using the VGA input.

So, is it better to use DVI with only brightness control or VGA with the other controls?

During calibration, looking at the RGB advanced mode over the DVI hookup, the levels were pegged to one side or the other.  Will the i1Display 2 compensate for this well enough?
Also, the contrast was fixed at 50% (again, not adjustable), but it was already centered during the contrast-adjustment step, so no adjustment was needed there anyway.

Tony
hsmeets
« Reply #1 on: August 02, 2008, 04:47:38 PM »

Tony,

Same here: DVI and only a brightness control (which actually adjusts the backlight brightness).  No problem getting a good print/screen match, but paper and screen remain different media anyway....

I still favor DVI over VGA. DVI is a digital output from the graphics card straight into the LCD panel. If you choose VGA, the graphics card first has to generate an analog signal from the digital bits, and when that analog signal arrives at the LCD it must be digitized again... what would be the benefit of that? Compared to DVI, VGA looked degraded on my monitor.

Cheers,

Huib
www.huibsmeets.com
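[Editor's note: the digital-to-analog-to-digital round trip described above can be sketched numerically. This is a toy model, not measured data: the gain error and noise figures are assumed values, purely illustrative of why the extra conversions can only hurt.]

```python
# Toy sketch: an 8-bit framebuffer value sent over DVI (identity)
# versus a VGA-style round trip: DAC -> analog cable -> monitor ADC.
# Gain error and noise levels are assumptions for illustration.
import random

random.seed(42)

def dvi_path(value):
    """DVI: the 8-bit code arrives at the panel unchanged."""
    return value

def vga_path(value, gain=0.98, noise=0.8):
    """VGA: DAC to a 0-0.7 V level with a small gain error, add
    cable noise, then the monitor's ADC quantizes back to 8 bits."""
    volts = (value / 255.0) * 0.7 * gain            # graphics-card DAC
    volts += random.gauss(0.0, noise * 0.7 / 255)   # cable/termination noise
    code = round(max(0.0, min(0.7, volts)) / 0.7 * 255)  # monitor ADC
    return code

codes = range(256)
dvi_err = max(abs(dvi_path(c) - c) for c in codes)
vga_err = max(abs(vga_path(c) - c) for c in codes)
print(dvi_err, vga_err)  # DVI is exact; the VGA path drifts by a few codes
```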

tagor
« Reply #2 on: August 06, 2008, 09:57:41 AM »

You want to use DVI. If you use VGA, you can only lose quality. The calibration software can do the equivalent of the contrast controls your monitor offers in VGA mode.

Your LCD panel is a digital device (hopefully with 8 bits of resolution per color). With DVI, you have a 1:1 mapping onto that range. With VGA, the output is mapped onto a subset of that range.
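[Editor's note: the point that software can stand in for a hardware contrast control can be sketched as a video-LUT adjustment, which is roughly what calibration packages load into the graphics card. A hypothetical sketch, with the trade-off made visible: a LUT that compresses 256 inputs yields fewer than 256 distinct output levels.]

```python
# Hedged sketch: emulating a "contrast" control in a 256-entry
# video LUT when the DVI monitor only exposes a backlight knob.
# The contrast factor is an assumed illustrative value.
def contrast_lut(contrast=0.85):
    """Scale 8-bit codes around mid-gray (128) by `contrast`."""
    lut = []
    for code in range(256):
        out = round(128 + (code - 128) * contrast)
        lut.append(max(0, min(255, out)))
    return lut

lut = contrast_lut(0.85)
print(len(set(lut)))  # fewer than 256 distinct output levels remain
```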

The brightness setting normally controls the dimming of the backlight.

-- Tilo
Roberto Chaves
« Reply #3 on: August 06, 2008, 12:57:27 PM »

As the others have said, you should use DVI instead of VGA.

If you use a VGA cable, the graphics card converts the digital signal to analog (which causes a bit of quality loss), and the monitor then has to convert the analog signal back to digital (more loss there).
Because the signal is analog, you are able to control how it maps back into a digital signal, hence the contrast, color, etc. adjustments.
With DVI it's all digital: no unnecessary conversions and no need for the additional controls on your monitor.

Please be aware that DVI-I cables can send both digital and analog signals.
Make sure you get a digital DVI signal from your graphics card to your LCD monitor.

P.S. Some monitors actually let you fake analog-style controls on a digital input, and I find it scary that resources are put into such pointless development...

Best regards,
 Roberto Chaves
 www.tabi.se
cricketer 1
« Reply #4 on: September 09, 2008, 03:29:08 PM »

Quote from: Roberto Chaves, Aug 6 2008, 12:57 PM
"As the others have said, you should use DVI instead of VGA."

I am coming late to this discussion, but hope someone can help me get DVI output from my HP Pavilion A450n (Pentium 4, Windows XP SP2) with an Nvidia GeForce FX 5200 card to my Samsung model 245BW 24" LCD monitor, via the card's DVI output connector to the DVI input of the monitor over an appropriate DVI cable.

I have no problems with the VGA format, but the monitor will not accept a DVI input (assuming my HP is outputting one). I see some coloured squares and a message along the lines of "your computer does not support DVI" (not the precise wording, but close enough). I have tried several times, including re-loading the monitor drivers from the supplied CD, rebooting the computer, etc. Is there something in My Computer > hardware > display settings I should have selected to obtain DVI output? I was using a VGA monitor previously. Any help will be appreciated.
mbalensiefer
« Reply #5 on: September 09, 2008, 10:32:33 PM »

My computer has an HDMI output, which I use with an HDMI-to-DVI-D adapter and a DVI-D cable into my monitor. Is there any data loss here?