Ok, before I got the new computer we were using my wife's, an AMD Athlon XP 1800. We play Star Wars Galaxies, and when I was playing it on her computer with a 64 MB video card on my 21-inch monitor I was getting 1-12 FPS in a city environment with a lot of other players. I figured the low FPS was due to her computer being somewhat slower. Well, when I got the new computer I plugged it into my monitor, fired up the game, and wham, I'm getting 1-12 FPS in the same environment, even though this video card is a GeForce 5700LE with 256 MB. Now the kicker: I plugged her computer into her monitor and she was getting 20-30 FPS. Can a monitor really make that kind of difference?
Here's why I'm not sure what is going on. The only way I can see a monitor impacting performance is if it's locking you into a fixed resolution higher than the one you're running on the other machine.
For example, if the "Slow" monitor only runs at 1280x1024 but the "Fast" monitor will go at 640x480, that could explain it.
Still, I would think that you could almost always move the resolution down, unless it's an LCD with a fixed native resolution or something.
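One quick way to sanity-check this: dump the desktop resolution each machine is actually running and compare it to the resolution the game is set to. This is just a rough sketch assuming Windows and Python with ctypes, nothing SWG-specific:

    # Query the current desktop resolution via the Win32 user32 API.
    # Run this on both machines and compare the numbers.
    import ctypes

    SM_CXSCREEN = 0  # width of the primary display, in pixels
    SM_CYSCREEN = 1  # height of the primary display, in pixels

    user32 = ctypes.windll.user32
    width = user32.GetSystemMetrics(SM_CXSCREEN)
    height = user32.GetSystemMetrics(SM_CYSCREEN)

    print(f"Desktop resolution: {width}x{height}")

If the "slow" setup reports something like 1280x1024 and the "fast" one reports 640x480 (or the game's own video settings differ between the two), that's your FPS gap right there.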
I have about as close to zero experience with programming as you can get with an engineering degree. I've got an idea to streamline some things at my company, and I want some basic advice on what I should start learning so that I can implement it.