Folks, I'm having some performance issues with one of our applications that used to run on a physical server but is now being QA'd on a virtual server. The application is very CPU-bound. Here is the environment/scenario:
Physical Server:
Windows 2000 32-bit, dual Xeon 2.4 GHz CPUs, 4 GB RAM
Virtual Server:
Windows 2008 64-bit, single Xeon 3.4 GHz CPU (fastest CPU core on the ESX server), 4 GB RAM
ESX server where VM lives:
ESX 3.5, dual quad-core 3.4 GHz CPUs, 64 GB RAM
The host is only about 30% utilized nominally in terms of overall CPU (50% at peaks).
Application:
The application is a DCOM object and is thread-based; a single instance of the application (the one I am concerned with) can only utilize a single CPU or core. (It only uses a single CPU on the physical server and therefore only needs a single CPU on the virtual server.) We have proven this by configuring the virtual server with two CPUs: the application's nightly processing routine took exactly the same time. There is also some moderate-to-heavy network I/O, as the application prepares many text files on a file share and then BCPs the data into SQL tables on a database server.
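To help rule the file-share I/O in or out as the slow step, I was considering timing just the write-to-share portion in isolation on both servers with something like the sketch below (the share path is a placeholder; substitute the real UNC path the application uses):

```python
import os
import time


def time_share_write(path, size_mb=100, chunk_kb=64):
    """Write size_mb of data to `path` and return throughput in MB/s.

    `path` is a placeholder -- point it at a file on the same share the
    application writes to, e.g. r'\\fileserver\share\probe.bin'.
    """
    chunk = b"x" * (chunk_kb * 1024)
    total = size_mb * 1024 * 1024
    written = 0
    start = time.perf_counter()
    with open(path, "wb") as f:
        while written < total:
            f.write(chunk)
            written += len(chunk)
        f.flush()
        os.fsync(f.fileno())  # force the data out so we time real I/O
    elapsed = max(time.perf_counter() - start, 1e-9)
    os.remove(path)  # clean up the probe file
    return (written / (1024 * 1024)) / elapsed
```

If the physical and virtual servers show similar MB/s to the same share, the gap is more likely CPU scheduling than network/storage I/O.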
Results:
On the Physical Server, the entire application runs through its daily routine in about 90 minutes.
On the Virtual Server, the same run takes about 30-35 minutes longer (approximately 120-125 minutes).
In both cases, the application utilizes 90-100% of a single CPU throughout the job's run time.
The application is not memory intensive; in both environments, its maximum memory usage is about 380 MB.
Notes:
The network interfaces on the physical and virtual servers have the same link speed and are on the same subnet.
There are no firewalls involved.
The VMDK file for the virtual server is stored on an EMC array attached to the ESX host via iSCSI.
We have tweaked multiple VM-specific settings (paravirtualization, CPU/MMU virtualization) to try to increase the performance of the virtual machine, to no avail.
Obviously this is a significant difference in how long the application's jobs take to complete. Can anyone think of something obvious that I have missed, or something I should try, to rectify these performance issues? In theory, if the application is CPU-bound, I should be seeing a 30-40% increase in performance (estimated from CPU clock speed alone). This could be skewed lower or higher by a number of other limiting factors, but I won't worry about those for now.
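As a sanity check of that per-core speed assumption, I could run a small single-threaded benchmark on both servers and compare. The loop below is an arbitrary integer workload, not representative of the real application, but the ratio of the two servers' times should roughly track their relative per-core speed:

```python
import time


def single_core_benchmark(iterations=2_000_000):
    """Run a fixed single-threaded integer workload and return elapsed seconds.

    Lower is faster; the ratio of two servers' results approximates their
    relative per-core performance for this (arbitrary) workload.
    """
    start = time.perf_counter()
    acc = 0
    for i in range(iterations):
        acc = (acc * 31 + i) % 1_000_003  # cheap integer mixing to keep the CPU busy
    return time.perf_counter() - start
```

Running it a few times on each server and taking the minimum would reduce noise from other load on the box; if the virtual server doesn't show the expected ~30-40% advantage here either, the problem is below the application.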
Anyone have any ideas or know of any VMware resources that might be able to help me out here?
Thanks,
--Kevin