Does the size of a monitor impact game performance?

The size of the monitor doesn't matter, but its resolution does. Because you're now using a monitor with a higher resolution than your previous one, your graphics card needs more video memory to store each frame, since the frames are larger. Due to this, you're going to see a general decrease in performance in all games you play at your monitor's full resolution - the video card simply has more work to do to produce a larger image.
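
As a rough back-of-the-envelope sketch (real games keep several buffers, so the numbers below only illustrate the trend, not actual VRAM use), the raw size of a single frame scales directly with resolution:

    # Rough single-frame size: width * height * bytes per pixel.
    # Real renderers keep multiple buffers (double/triple buffering, depth,
    # intermediate targets), so actual VRAM use is much higher; this only
    # shows how the cost grows with resolution.
    def frame_mb(width, height, bytes_per_pixel=4):
        return width * height * bytes_per_pixel / (1024 * 1024)

    for w, h in [(1280, 720), (1920, 1080), (2560, 1440)]:
        print(f"{w}x{h}: {frame_mb(w, h):.1f} MB per 32-bit frame")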

As for Minecraft, make sure you have 64-bit Java installed. Without 64-bit Java, Minecraft can't utilize all of your available memory, which may be the cause of the lag issue. If this doesn't work, you might also want to try OptiFine - a mod for Minecraft that can significantly increase performance.
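
If you'd rather verify which Java you have from a script than by eye, one rough check (just a heuristic; inspecting your Java installation manually works equally well) is to look for "64-Bit" in the banner that `java -version` prints to stderr:

    import subprocess

    # `java -version` writes its banner to stderr; on a 64-bit HotSpot JVM the
    # banner normally mentions "64-Bit Server VM". This is a heuristic check,
    # not an official API.
    result = subprocess.run(["java", "-version"], capture_output=True, text=True)
    banner = result.stderr or result.stdout
    print("64-bit Java detected" if "64-Bit" in banner else "This looks like 32-bit Java")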


I didn't read most of your question, but the answer to the title question is yes, insofar as "size" refers to "screen resolution" (in pixels).

Every pixel has to be processed, in some way or another, individually... and with complicated 3D rendering, each pixel often has to be recalculated and reprocessed several times as various effects are applied: shadows, geometry shaders, lighting, screen-space ambient occlusion, culling, anti-aliasing, and so on. Each of these operations effectively has to look at a large swath of pixels (or indeed, every pixel on the screen) and modify their color components (whether they're stored as Red/Green/Blue or in some other arrangement).

The number of pixels on the screen grows quadratically if the width and height increase together. To use simple numbers (these are not real resolutions in common use): if you upgrade from a 500x500 screen to a 1000x1000 screen, you don't just have twice as many pixels -- you have four times as many. 500 * 500 is 250,000; 1000 * 1000 is 1,000,000. Take it to a logical extreme like 32000x24000 and you're at 768 million pixels per frame; I honestly doubt any existing graphics hardware could play modern games at a respectable frame rate at such a resolution. The numbers don't LOOK that big written as width x height, but multiplied out they represent an enormous amount of data to process.
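
To make that scaling concrete, here is the same arithmetic spelled out (the resolutions are the made-up examples from above, not real display modes):

    # Pixel count is width * height, so doubling both dimensions quadruples
    # the number of pixels the card has to touch every frame.
    base = 500 * 500
    for w, h in [(500, 500), (1000, 1000), (32000, 24000)]:
        pixels = w * h
        print(f"{w}x{h}: {pixels:,} pixels ({pixels // base}x the 500x500 baseline)")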

The physical size of the monitor is not important. The screen resolution (number of pixels) directly affects the performance, because more data (more pixels) means more processing at each stage of rendering.


I'll start by saying that these are all very good answers, but I wanted to take the time to see for myself what kind of numbers I could generate on my own two large screens.

Since at least one answer mentioned the GPU's memory and the OP mentions scenes with many objects, I will be measuring different levels of VRAM consumption. Since another answer mentioned resolution, I will do fullscreen rendering on one screen and compare those numbers with fullscreen rendering on both screens; this can be thought of as a bigger monitor at a higher resolution, since there is twice the screen real estate. I will also do a small test with two mirrored images, simulating a larger screen area while keeping the same resolution.

Using the compared numbers, I can demonstrate what taxes a graphics card.

The Rig:

  • Quad-core Intel i7-2600K @ 5 GHz
  • MSI GTX 580 Lightning Xtreme
    • This version is clocked higher than stock by MSI, but I haven't overclocked it further for this test
    • This version has 3 GB of GDDR5
    • Driver version 304.79; completely clean install
  • 16 GB of DDR3 system RAM
  • 2x 32" LED screens running at 1920x1080
    • One connected by HDMI, the other connected from the card's DisplayPort output through a passive adapter to the screen's HDMI input
  • I will be monitoring VRAM usage using MSI Afterburner v2.2.1

The First Test:

For this first test, we are just going to sit at an idle Windows 7 Aero-enabled desktop. I did not have a utility to measure FPS outside of Minecraft.

  • 116-118 MB usage
    • Both screens on, and in 1920x1080 mode
  • 89-91 MB usage
    • One screen on, and in 1920x1080 mode
  • 78-80 MB usage
    • One screen on, and in 800x600 mode
  • 73-74 MB usage
    • Both screens on, and in 800x600 mode

-- This test shows that 2x monitors does NOT mean 2x VRAM usage, so one can expect that there is a lot more overhead than just framebuffering. It's also worth noting that in hi-res mode one monitor uses ~77% of the memory of two, and in low-res mode one monitor uses ~93% of the memory of two, further illustrating the overhead cost. -- This implies that @Huskehn's answer is misleading, in that higher resolutions alone do not have much practical impact on VRAM usage.
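
As a quick sanity check, here are those ratios computed from the midpoints of the ranges reported above (note that at 800x600 the two-screen reading actually came out slightly below the one-screen reading, so the ~93% figure is simply the smaller reading divided by the larger):

    # Midpoints of the idle-desktop readings above, in MB.
    both_1080, one_1080 = 117.0, 90.0   # 1920x1080: both screens vs. one
    both_800,  one_800  = 73.5, 79.0    # 800x600:  both screens vs. one

    # If VRAM use were purely per-screen framebuffers, dropping to one screen
    # would roughly halve the figure; instead the two readings in each pair
    # stay close together, pointing to a large fixed overhead.
    print(f"1920x1080: {one_1080 / both_1080:.0%}")                              # ~77%
    print(f"800x600:   {min(one_800, both_800) / max(one_800, both_800):.0%}")   # ~93%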


The Second Test:

This test uses VLC to perform fullscreen Blu-ray playback (Talladega Nights, for posterity). I did not have a utility to measure FPS outside of Minecraft.

  • 172 MB usage
    • Both screens on, single fullscreen and in 1920x1080 mode
    • For giggles, I increased playback to 8x in this mode, and saw a 2 MB usage increase
  • 217 MB usage
    • One screen on, dual fullscreen and in 1920x1080 mode
  • 125 MB usage
    • One screen on, and in 1920x1080 mode
  • 106 MB usage
    • Both screens on, single fullscreen and in 800x600 mode
  • 130 MB usage
    • Both screens on, dual fullscreen and in 800x600 mode
  • 94 MB usage
    • One screen on, and in 800x600 mode

-- Not much to say about this test, other than that VRAM usage was rock steady the whole time; the only change came when I altered the playback speed, and memory usage dropped back to normal once I returned to regular speed, which indicates the bump was extra frame buffering. Another point worth making is that neither video became choppy or looked slow, showing how trivial non-3D applications are for the card.


The Third Test:

In this test, I will be running a 32-bit Windows XP virtual machine fullscreen on my second monitor using VirtualBox v4.1.18. The VM has been given 128 MB of video memory (though later tests suggest the VM isn't strictly limited to that amount) and has 2D and 3D acceleration enabled.
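
For reference, that kind of guest video setup can also be scripted through VirtualBox's `VBoxManage` tool; here's a minimal sketch (the VM name "WinXP" is a placeholder, and the switches shown are the ones from the 4.x-era CLI used here):

    import subprocess

    # Hypothetical VM name -- substitute your own. These switches set the
    # guest's video memory to 128 MB and enable 2D/3D acceleration, matching
    # the configuration described above. The VM must be powered off first.
    vm_name = "WinXP"
    subprocess.run([
        "VBoxManage", "modifyvm", vm_name,
        "--vram", "128",
        "--accelerate3d", "on",
        "--accelerate2dvideo", "on",
    ], check=True)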

I've skipped a few modes, since the other tests already show that the differences follow a predictable pattern.

  • 151-156 MB usage
    • Both screens on, and in 1920x1080 mode
  • 127 MB usage
    • Both screens on, VM screen (and VM itself) in 800x600 mode, and host in 1920x1080 mode

-- I did not measure whether consumption differed when the VM's screen was at 1080 versus when the VM's resolution was internally changed to 800.


The Fourth Test:

Playing Minecraft!

There's a lot of information in this one, and it is probably the most important section for @Solignis.

Firstly, because of the way Windows handles fullscreen applications, I couldn't run two MC clients at maximum size simultaneously, so I ran the second client inside the previously mentioned VM and measured the results. My brother and I play side-by-side like this all the time, and we never have any issues! Secondly, I ran a Minecraft server in the background so that both clients would render the same world and a near-identical view. I ran both clients, teleported both players to the same spot, faced them in the same direction, and then closed the clients to flush anything previously rendered; I then started them back up and didn't move them. I started the server first and noticed that VRAM went from 156 MB to 168 MB. After disconnecting both clients the first time, I noticed that a steady 230 MB was now in use. The server was also configured for maximum render distance.

The MC graphics settings were the following: "Fancy", smooth lighting ON, 3D anaglyph OFF, GUI scale AUTO, particles ALL, render FAR, performance MAX, bobbing ON, adv. OGL OFF, clouds ON. While the host client runs just fine with adv. OGL ON, at over 280 FPS, it was causing the VM to crawl; I believe this is because of VirtualBox's currently limited support for newer OpenGL implementations. FPS figures were taken from Minecraft's 'F3' debug stats.

All tests were recorded in 1920x1080 except where noted otherwise.

  • 530 MB usage @ ~300 FPS (Host)
    • Client fullscreen in host on screen 1, idle desktop on screen 2, no VM running
  • 530 MB usage @ ~305 FPS (Host)
    • Client fullscreen in host on screen 1 and 2 (duplicated screen), no VM running
  • 482 MB usage @ ~330 FPS (Host)
    • Client fullscreen in host on screen 1, second screen off, no VM running
  • 480 MB usage @ ~430 FPS (Host)
    • Client fullscreen in host on screen 1, second screen idle desktop, both 800x600, no VM running
  • 547 MB usage @ ~250 FPS (Host)
    • Client fullscreen in host on screen 1, idle VM desktop on screen 2
  • 408 MB usage @ ~70 FPS (VM)
    • Client fullscreen in VM on screen 2, idle host desktop on screen 1
    • FPS started around 60, then slowly dropped to 45, then climbed to 81 after all chunks were fetched
  • 803-805 MB usage @ ~200 FPS (Host), @ ~50 FPS (VM)
    • Client fullscreen in host on screen 1, client fullscreen in VM on screen 2

-- Note that MC caches a lot, and different scenes will cause different amounts of VRAM usage, so while I chose a complex scene with lots of fire and many pistons moving, this is still a near 'best case' scenario: looking around will cause a lot more things to be fetched, cached, and especially redrawn, because the whole screen is changing rather than just a few parts. This isn't to say, however, that MC won't clear its cache when it needs to. You can see a steady increase in VRAM as far-away chunks get rendered, and there was a HUGE difference between the time it took for the VM to load blocks compared to the host system. The recovery of the framerate after the chunks were loaded demonstrates that, for Minecraft at least, more FPS is lost in fetching things than in actually displaying them. The final test showed that the 'MultiplayerChunkCache' for this setup is 961 (presumably a 31x31 grid of chunks around the player), after which it starts swapping things out, but this didn't prevent VRAM from rising. This caching can also explain why there was not a significant drop in VRAM after switching to a smaller resolution while the client was still running.

-- With regard to the number of pixels in relation to performance, as suggested by @Diogo, the first few runs show that doubling or halving the number of pixels doesn't equate to doubling or halving the performance. Using FPS as the measure of performance, only about 10% was gained by cutting the number of rendered pixels in half, since the second screen wasn't rendering anything difficult anyway. I then mirrored my displays: VRAM usage didn't change, and the FPS changed only marginally. In the tests with a client running on the host as well as in the VM, neither the host's nor the VM's Minecraft dropped by 50% in FPS; each dropped by only about 25%, despite fullscreen rendering on both, so pixel count alone isn't a reliable predictor of performance.


The Final Test:

Well... just how high can I push the VRAM usage?!

After many minutes of wandering around my server in Minecraft on both VM and host clients, I eventually hit 1685 MB usage when things started leveling off hard. After many more minutes, I was able to hit 1869 MB, and then it would not budge. Even with pistons every which way and fiery netherrack all around me. Even after such HIGH memory usage, the game was still 100% playable on both the host AND VM; ~233 and ~52 FPS, respectively. I had done it. I tapped out Minecraft. Twice. On the same machine!

So then I started up Skyrim. When I minimized my host's MC client, my VRAM only dropped to 1837 MB.

Outside in the lands of Skyrim, I was able to attain a total of 2636 MB of usage, while the VM's Minecraft was still at ~50 FPS. I didn't measure Skyrim's FPS, but it was visibly high.

Annoyed at not hitting the peak, I then minimized Skyrim, dropping to 2617 MB, and opened up Civ 5, which uses DX 11. Once it was loaded, my VRAM shot up to 2884 MB, and I was prompted with a Windows "Your computer is low on memory..." dialog that pointed to a Java process (either the server or a client), even though only 9.77 GB of my 16 GB of system RAM was in use. I loaded my saved game, and that did it: I hit my 3072 MB maximum! My second screen went blank, and my first screen started flashing furiously and dropped to a reduced resolution. Fearing for my card, I quickly powered off my PC, but not before seeing some warning dialog about "Unreferenced memory at 0x00...." Up until this point, FPS was still high on both screens.


The Conclusion:

If you've made it this far, then kudos for reading. This is a roundabout way of saying it, but the problem @Solignis has is unlikely to be GPU- or CPU-based; it is more likely a poor server connection or insufficient Java/heap settings. @Philippe and @libertas are on the right track with their answers.

The Minecraft tests run at 800x600 and at 1920x1080 by themselves show that even though 1920x1080 has 4.32 times as many pixels as 800x600 (2,073,600 vs. 480,000), performance was not affected by anything close to that factor.

The duplicated-screen Minecraft test was a way of simulating a bigger screen (the two screens combined) displaying the same 1920x1080 image, just spread across more physical screen space. Short of having an actual larger screen with the same resolution, this is about as conclusive as it gets in showing that physical size alone does not noticeably affect game performance.

The dual Minecraft tests show that rendering a higher-resolution image (the two screens combined) does have an impact on performance, but doubling the picture did not halve the framerate or make the game noticeably slow. This implies that you would already have to be experiencing slowness in Minecraft for the small impact of a modest size increase to make the game very slow.

The final test was a way to see if there was a relation between VRAM consumption and performance, and I found that there wasn't one -- that is, until you hit your maximum VRAM, at which point performance instantly drops to zero. No matter how much information the GPU was keeping track of, it didn't noticeably affect framerates. I have also shown that, unless you plan on doing ridiculous things like playing multiple games side by side at once, 3 GB of video memory is absolutely pointless in the current generation of gaming; memory is a practically useless spec to chase on a GPU, and 1-1.5 GB will be more than enough for anyone.

Minecraft was a very good program for these tests because there is only one set of textures, whereas bigger games often have differently-sized textures depending on the resolution they are run at, which would introduce extra variables and skew the tests. Being a multiplayer game was another boon, since it let me reliably render a near-duplicate image on both clients. The fact that Minecraft is lightweight enough to run inside a VM was another useful benefit: it let me demonstrate extreme edge cases, run two fullscreen applications at once, and finally determine whether VirtualBox's "Video Memory" setting actually restricts how much memory a VM can use on the host GPU.