I have a DroboPro (v1.1.4) with 4x1TB 7200RPM drives being used to host VM files for two ESXi 4.0.1 servers. The ESXi hosts are connected to the DroboPro via iSCSI (through a Dell PC2824 GigE switch dedicated to the storage network).
I am seeing very high disk latency on both LUNs, which are formatted with VMFS: average latency is currently 80-280ms, with spikes as high as 3700ms(!). I would expect this to average around 10-20ms.
On top of this, I am only getting about 10-20MB/s of disk throughput, with occasional spikes up to about 35MB/s. All virtual machines on the device start to show performance degradation once combined disk activity (reads/writes) reaches 6MB/s or so.
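For what it's worth, here is a sketch of how throughput can be sanity-checked from inside a Linux guest whose virtual disk lives on the DroboPro-backed datastore (the file path is arbitrary; `conv=fdatasync` makes dd flush to disk before reporting, so the number reflects the storage path rather than the guest's page cache):

```shell
# Write 256MB sequentially and force it to disk before dd reports throughput.
dd if=/dev/zero of=/tmp/ddtest.bin bs=1M count=256 conv=fdatasync
# Clean up the test file afterwards.
rm -f /tmp/ddtest.bin
```

The figure dd prints on its final line should roughly match what esxtop reports for the LUN while the test runs.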
I have followed the best-practices documentation for ESXi configuration, but I'm still seeing this slow performance. I plan to upgrade to the latest DroboPro firmware (1.1.5) this weekend, but I don't expect that to be a magic bullet.
Is there anything I might be overlooking that would cause performance this poor? I had hoped to migrate our VMware Server 2.0 machines onto ESXi with the DroboPro as storage, but I am already seeing these issues after migrating only about 20% of our VM environment. Any advice is appreciated.