Okay, the title was a bit dramatic but it got yer attention, eh? Anyway, I got a new LCD screen to replace my ancient CRTs, and the VGA input works no prob, but when I try DVI I get a "no signal". The cable is new, and I've switched the monitor's input settings... What else should I try? (Perhaps a BIOS setting?) If it makes a noticeable difference I'll get an HDMI cable and prolly post a similar thread.... Thx in advance. Tippin'
One thing I noticed is that some monitors or video cards need the monitor attached to the computer when it's turned on. Is your video card an onboard one or a plug-in one? I've never come across an option to turn off one of the connections, but if it's an onboard card, then maybe your BIOS has that in there somewhere.
Hmm, maybe the manufacturer of the LCD monitor provides its own drivers for it? On some PCs my monitor only worked with those installed, I think.
I upgraded to HDMI ($3 USD) on my LCD and noticed a significant difference, especially in video playback and sound.
Lemme get the DVI working and I'll deal with HDMI. @ Suggestions: No drivers or anything came w/ the monitor, just 1 power cable and 2 video cables. The vid card is a... well, card... not onboard. Did not try to reboot w/ DVI connected. I suppose I can try that... It's connected now (along w/ VGA). I'll just reboot and change the monitor settings. ED: No DVI signal on reboot. Nothing in the BIOS pertaining to VGA, DVI, HD, etc.... (just onboard, PCI, PCIe). Tippin'
Hi, in the graphics driver settings, have you set your primary display device to digital? Can you reduce the DVI frequency for high resolutions in the driver settings? Maybe the cable is broken or the wrong one; there's DVI-I, DVI-A and DVI-D. Regards, hideo
In your graphics driver settings, under monitor, make sure the refresh rate is set to 60 Hz, no higher (that's Hz, not MHz). LCD monitors often just don't sync if you have a high rate like 85 Hz in there.
The vid card has it set to 60 Hz, and looking up the card's specs, it just says "DVI" output (along w/ VGA, HDMI ofc). Yarrgh. Tippin'
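Side note on the units being thrown around here: the refresh rate is in Hz (60, 75, 85...), while DVI's bandwidth limit is a pixel clock in MHz. A quick sanity check, using the standard CEA-861 timing figures for 1080p60 (the native mode of a 24" 1080p panel) against single-link DVI's 165 MHz pixel-clock ceiling:

```python
# Does 1920x1080 @ 60 Hz fit within single-link DVI?
# Timing figures are the standard CEA-861 values for 1080p60;
# 165 MHz is the single-link DVI pixel-clock limit from the DVI spec.

H_TOTAL = 2200   # 1920 active + 280 blanking pixels per scanline
V_TOTAL = 1125   # 1080 active + 45 blanking lines per frame
REFRESH_HZ = 60

SINGLE_LINK_DVI_LIMIT_HZ = 165_000_000  # 165 MHz

pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH_HZ
print(f"pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")  # 148.5 MHz
print("fits single-link DVI:", pixel_clock_hz <= SINGLE_LINK_DVI_LIMIT_HZ)
```

So 1080p60 is comfortably inside single-link DVI; bandwidth isn't the problem here, which points back at cabling or display-selection settings.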
1. You might need to set the monitor's input to DVI.
2. You might need to set the GFX card's output to DVI.
3. You will need a true DVI setup. DVI connectors can carry a digital signal (DVI-D), an analog signal (DVI-A), or both (DVI-I). Be sure to have the right cable, and make sure your monitor can use DVI-D if that's what your card outputs. Some cheap monitors only have a DVI-A input, which makes them NOT digital.
4. Your monitor, GFX card output or cable might just be broken.

Normally a DVI setup works right out of the box. Check http://en.wikipedia.org/wiki/Digital_Visual_Interface
GL
First thing I tried... No sig on DVI or HDMI, just VGA input. Donked around the card's manager/control panel and haven't seen anything to change the output. It's an Nvidia GeForce GT 220. Doubt the monitor is too crappy... It's an Asus yadayada244. I'm using whatever cable came with it. I'll check later to see if the monitor is DVI-D (I'd think it would be) and whether the card is DVI-A... I did look on the Nvidia page to check out the card and it just says "DVI" output. :/ Yea, that's what I was hoping for... :/ Tempted to get an HDMI cable just to see wtf happens.... Thx for the suggestions all, and if you have any more, bring 'em. Tippin'
The Nvidia GeForce GT 220 has a multi-display setup available, which means it can drive at least 2 displays. IF you have a VGA cable connected to your card AND connect a second cable (DVI), it might be that the card is still set to drive only one display. VGA is usually the first "device" for the GFX card. The setting for that is a Windows setting (where you choose the resolution for the display(s)).

For a test you could try:
- Disconnect all cables besides the DVI on the GT 220. Still no signal?
- Try a DVI-to-VGA adapter; they are often included if you buy a new GFX card, maybe a friend has one. Connect your new display like: GFX DVI-out -> DVI-to-VGA adapter -> display VGA-input. If it works, your GFX card is fine.
- Else, connect your old analog monitor the same way. If this does NOT work, either your DVI cable is broken or the wrong type, OR your GFX DVI output is disabled or broken.

Please tell me the exact name of the monitor (I could not find any Asus 244 model = 24" Rev. 4... what's the model branch? VE?). Which OS are you using?
Check: right-click on desktop -> Settings -> Settings. You should see two "displays" in the window... check if both are attached (right-click). I had a similar case where one display was not attached. gl
I did notice before, under "Settings", when I first hooked up the monitor, that it showed 2 monitors available (even when only 1 cable was attached). I should have tinkered around there, as I can get the wallpaper only when activating #2 and then switching the monitor to DVI.... Hey, so at least I know the cable and the DVI output/inputs are functional! That's a nice start. I think you all have led me on the right path and I'll mess with it a bit more. BTW - the monitor is an Asus VE249H. If you have any other tips and hints, post 'em and I'll mess with 'em when I get back. Thank you all! Tippin'
Attach the monitor, then make it the primary in settings. I feel kinda dumb as it was that simple, but then again... I AM dumb! Tippin'
Ah right, I can't remember what I did with mine when I installed them but sounds familiar. Glad you got it sorted. *Thread Closed*