GFX card voltages... This has me stumped!
08-02-2008 12:10 PM
Hi guys
Righty, I'm using an ASRock motherboard, model K7S41GX.
I have 2 spare GFX cards lying around here - one really old with only 8MB, the other turns out to be a GeForce2 (which I didn't realise - I already have one of these in another machine, and for its age and the age of my systems it works rather well!).
Anyway, last night I was thinking about bringing my old CRT monitor back into service for a dual-monitor setup, BUT (and this is where it gets interesting) my motherboard seems to disable the onboard graphics when I plug in an older PCI GFX card (quote from Device Manager: "Cirrus Logic 5446 Compatible Graphics Adapter").
So I thought I would use one of these AGP GFX cards, BUT... my motherboard manual says it must be a 1.5V graphics card and not a 3.3V one. I've googled the GeForce2 for voltages but can't find any information about this anywhere (the model number of the card is MS-8826).
I assume the manufacturers leave this information out because AGP is presumably one set standard? So how the heck do I know whether I can use this GFX card?
OR is there a way to get my onboard GFX working alongside a PCI/AGP GFX card?
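For what it's worth, one quick way to see which adapters Windows itself detects is a minimal Python sketch like the one below (assuming Python 3.7+ is installed on the box; wmic ships with XP onwards, though it is deprecated on very recent Windows). If the onboard adapter vanishes from the list once the PCI card is fitted, the BIOS has disabled it rather than Windows merely hiding it.

    import subprocess

    # List the video controllers Windows can see via WMI.
    out = subprocess.run(
        ["wmic", "path", "Win32_VideoController", "get", "Name"],
        capture_output=True, text=True,
    ).stdout

    # Compare this output with and without the PCI card fitted: if the
    # onboard chip is missing, the BIOS disabled it before Windows booted.
    for line in out.splitlines():
        if line.strip():
            print(line.strip())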
I need a new signature... i'm bored of the old one!
3 REPLIES
Re: GFX card voltages... This has me stumped!
08-02-2008 12:19 PM
This might help you: http://www.ertyu.org/steven_nikkel/agpcompatibility.html
Superusers are not staff, but they do have a direct line of communication into the business in order to raise issues, concerns and feedback from the community.
Re: GFX card voltages... This has me stumped!
08-02-2008 12:22 PM
http://www.ertyu.org/steven_nikkel/agpcompatibility.html
The above explains the differences in hardware and the AGP specs, so as long as you know what speed your mobo's AGP slot works at and what speed your graphics card works at, you should be able to work out whether the cards you have will work.
You cannot use both the on-board graphics and an AGP/PCI card: the mobo is designed to disable the on-board graphics when it detects another graphics card, and there is no way around this, as the mobo was never designed to allow both to work at the same time. The only way to drive 2 monitors is either a dual-head graphics card that supports 2 monitors, or 2 graphics cards - AGP & PCI, or 2 PCI in your case.
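To make the keying rule concrete, here is a rough Python sketch of the logic from the linked page (the slot and card names below are purely illustrative, not from any spec): a card and a slot are only compatible if they share at least one signalling voltage.

    # Illustrative names only; the rule is: compatible if and only if the
    # card's notch voltages and the slot's key voltages overlap.
    SLOT_VOLTAGES = {
        "3.3v_keyed": {"3.3"},
        "1.5v_keyed": {"1.5"},          # e.g. the K7S41GX's AGP slot
        "universal":  {"3.3", "1.5"},
    }
    CARD_VOLTAGES = {
        "3.3v_notch": {"3.3"},
        "1.5v_notch": {"1.5"},
        "dual_notch": {"3.3", "1.5"},   # 'universal' cards fit either slot
    }

    def fits(slot, card):
        return bool(SLOT_VOLTAGES[slot] & CARD_VOLTAGES[card])

    print(fits("1.5v_keyed", "dual_notch"))  # True - dual-notched card is fine
    print(fits("1.5v_keyed", "3.3v_notch"))  # False - 3.3V-only card won't fit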
Re: GFX card voltages... This has me stumped!
08-02-2008 1:23 PM
Well, I got it sorted - it only took a few minutes to suss out once I realised that Windows can rearrange monitor positions etc.
Turns out my GeForce2 is AGP 2x and 4x and is keyed correctly for 1.5V (thanks for the links guys - tremendously helpful!).
Anyway, it's all up and running with my old CRT monitor in reserve (and I'd forgotten how much that thing could make my eyes buuurrrrnnnnnnn!).
Ah well, I'll have to hunt down another flatscreen when I can find the spare £s.
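If anyone wants to double-check that the second screen really is active, a tiny Windows-only sketch (assuming a Python install; ctypes is in the standard library):

    import ctypes

    # SM_CMONITORS (metric index 80) is the number of display monitors
    # attached to the desktop.
    SM_CMONITORS = 80
    count = ctypes.windll.user32.GetSystemMetrics(SM_CMONITORS)
    print("Active monitors:", count)  # expect 2 with the CRT attached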
I need a new signature... i'm bored of the old one!