Issues with Video Performance on Hyper-V Hosts
This little “issue” on one of my Hyper-V hosts had been giving me a headache for a while: for no apparent reason, the host began to have trouble rendering specific screens. It wasn’t happening on my Dell servers, just on a couple of self-built “no-name” servers. Since these servers were used by our QA department, people sometimes logged on to them directly from the servers’ console, and the display issues drove them nuts. So I decided to investigate further.
After a bit of Googling I found this Microsoft KB article:
Video performance may decrease when a Windows Server 2008-based computer has the Hyper-V role enabled and an accelerated display adapter installed.
Reading the KB, it turns out that this issue occurs when a device driver or other kernel-mode component makes frequent memory allocations with the PAGE_WRITECOMBINE protection flag while the hypervisor is running. When the kernel memory manager allocates memory using the WRITECOMBINE attribute, it must flush the Translation Lookaside Buffer (TLB) and the cache for the specific page. However, when the Hyper-V role is enabled, the TLB is virtualized by the hypervisor.
Therefore, every TLB flush sends an intercept into the hypervisor. This intercept instructs the hypervisor to flush the virtual TLB. This is an expensive operation that introduces a fixed overhead cost to virtualization. Usually, this is an infrequent event in supported virtualization scenarios. However, some video graphics drivers may cause this operation to occur very frequently during certain operations. This significantly magnifies the overhead in the hypervisor.
It turns out that this “issue” usually happens with self-built servers, and not with brand-name servers such as HP ProLiant or Dell PowerEdge. It is also most often reported when the Hyper-V hosts are using ATI display adapters.
So, what should you do to fix this “issue,” which shouldn’t have been there in the first place? Currently, the guidance is simple: when you enable the Hyper-V role in Windows Server 2008, do not install drivers for high-performance accelerated graphics adapters. This behavior does not occur when you use the Vga.sys or Vgapnp.sys generic video drivers that are included with Windows Server 2008.
To revert to the generic video driver, you can uninstall any high performance vendor-specific video driver.
To uninstall the video adapter driver, open Server Manager and navigate to Diagnostics > Device Manager.
Expand Display Adapters, right-click on your high performance vendor-specific video driver, and select Uninstall.
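If you prefer the command line (for example, to script this across several hosts), the same cleanup can be done with the built-in PnPUtil tool. A minimal sketch follows; note that the `oem12.inf` name is a placeholder you must replace with whatever INF number PnPUtil actually lists for your display driver on that host.

```shell
:: List all third-party driver packages in the driver store.
:: Look for the entry whose class is "Display adapters" and note its
:: published name (something like oem12.inf).
pnputil -e

:: Force-delete the vendor video driver package from the driver store.
:: Replace oem12.inf with the published name found above.
pnputil -f -d oem12.inf
```

After removing the package, rescan for hardware changes (or reboot) and Windows should fall back to the generic VGA driver.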
BTW, this shouldn’t happen on servers running a Server Core installation with the Hyper-V role, since Server Core doesn’t use accelerated graphics drivers.