DroboPro - high disk latency with ESXi 4.0

I have a DroboPro (v1.1.4) with 4x1TB 7200RPM drives being used to host VM files for two ESXi 4.0.1 servers. The ESXi hosts are connected to the DroboPro via iSCSI (through a Dell PC2824 GigE switch dedicated to the storage network).
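
As a quick sanity check of the network path, independent of the Drobo itself, something like the Python sketch below can time repeated TCP connections to the DroboPro's iSCSI portal on the standard iSCSI port 3260 (the portal IP here is a placeholder for your own). Consistently low, stable connect times would suggest the switch path is healthy and the latency is accruing on the storage device itself.

```python
#!/usr/bin/env python
# Time repeated TCP connects to the iSCSI portal (standard port 3260).
# Stable single-digit-millisecond results point the finger away from
# the GigE network path and toward the storage device.
import socket
import time

PORTAL_IP = "192.168.10.50"   # placeholder: your DroboPro's iSCSI IP
PORTAL_PORT = 3260            # standard iSCSI target port
SAMPLES = 20

times_ms = []
for _ in range(SAMPLES):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(2.0)
    start = time.time()
    try:
        s.connect((PORTAL_IP, PORTAL_PORT))
        times_ms.append((time.time() - start) * 1000.0)
    finally:
        s.close()
    time.sleep(0.1)

print("connect latency: min %.2f ms / avg %.2f ms / max %.2f ms" % (
    min(times_ms), sum(times_ms) / len(times_ms), max(times_ms)))
```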

I am seeing very high disk latency on both VMFS-formatted LUNs. Average latency is currently 80-280 ms, with spikes as high as 3700 ms(!). I would expect this value to average around 10-20 ms.
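
To pin down where that latency is accruing, esxtop's disk views are the standard tool (DAVG = device latency, KAVG = kernel/queuing latency, GAVG = total latency seen by the guest); a batch capture such as `esxtop -b -d 2 -n 30 > stats.csv` can then be post-processed offline. Here is a rough Python sketch for that, assuming the latency counters in the batch CSV carry "MilliSec/Command" in their headers (column naming can vary by ESX(i) build, so adjust the match string if yours differ):

```python
#!/usr/bin/env python
# Summarize latency counters from an esxtop batch capture, e.g.:
#   esxtop -b -d 2 -n 30 > stats.csv
# Assumption: latency columns have headers containing "MilliSec/Command"
# (e.g. "Average Driver MilliSec/Command"); adjust the match if needed.
import csv
import sys

def summarize(path):
    with open(path) as f:
        rows = list(csv.reader(f))
    header, samples = rows[0], rows[1:]
    for col, name in enumerate(header):
        if "MilliSec/Command" not in name:
            continue
        vals = []
        for row in samples:
            try:
                vals.append(float(row[col]))
            except (ValueError, IndexError):
                pass
        if vals and max(vals) > 0:
            print("%-70s avg %6.1f ms  max %6.1f ms" % (
                name[-70:], sum(vals) / len(vals), max(vals)))

if __name__ == "__main__":
    summarize(sys.argv[1] if len(sys.argv) > 1 else "stats.csv")
```

If DAVG dominates, the bottleneck is the array; if KAVG is high, look at queuing on the host side.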

On top of this, I am only getting about 10-20 MB/s of disk throughput, with occasional spikes up to about 35 MB/s. All virtual machines on these LUNs start to show performance hits once combined disk activity (reads plus writes) reaches 6 MB/s or so.
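
For a repeatable throughput number, a crude sequential-write test run inside one of the guests can help; a minimal Python sketch (it writes 512 MB and calls fsync so the OS page cache doesn't inflate the result):

```python
#!/usr/bin/env python
# Crude sequential-write throughput test to run inside a guest VM.
# fsync() forces the data out of the OS page cache so the result
# reflects the storage path rather than RAM. Remove test.bin after.
import os
import time

SIZE_MB = 512
CHUNK = b"\0" * (1024 * 1024)   # 1 MB per write

start = time.time()
with open("test.bin", "wb") as f:
    for _ in range(SIZE_MB):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())
elapsed = time.time() - start

print("wrote %d MB in %.1f s -> %.1f MB/s" % (SIZE_MB, elapsed, SIZE_MB / elapsed))
os.remove("test.bin")
```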

I have followed the best-practices documentation for the ESXi configuration, but I'm still seeing this slow performance. I plan to upgrade to the latest firmware (1.1.5) this weekend, but I don't expect that to be a magic bullet.

Is there anything I might be overlooking that would cause this poor performance? I had hoped to migrate our VMware Server 2.0 machines onto ESXi with the DroboPro as storage, but I am already seeing performance problems after migrating only about 20% of our VM environment. Any advice is appreciated.

I’m interested in the outcome of your DroboPro with ESXi 4.0. Did the 1.1.5 firmware help at all? According to the release notes, that version is certified for VMware ESXi 3.5. TIA

I had a similar issue with larger drives and fewer spindles. I ended up replacing the four 1 TB drives with eight 640 GB WD Black drives with 32 MB cache. It helped, but this is more of a dev unit than a production one. I am not very happy with the iSCSI connection under ESXi 4. I just keep it for library backups now.
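
For what it's worth, the spindle math backs that up: a 7200 RPM SATA drive handles very roughly 75-100 random IOPS, so going from four to eight spindles roughly doubles the random I/O the array can absorb before queues (and latency) build up. A back-of-envelope sketch, using a rule-of-thumb midpoint rather than measured figures:

```python
#!/usr/bin/env python
# Back-of-envelope: aggregate random IOPS scales with spindle count.
# ~75-100 IOPS per 7200 RPM SATA spindle is a common rule of thumb;
# real numbers also depend on the Drobo's BeyondRAID overhead.
IOPS_PER_7200RPM_SPINDLE = 80   # rule-of-thumb midpoint, not measured

for spindles in (4, 8):
    total = spindles * IOPS_PER_7200RPM_SPINDLE
    print("%d spindles -> ~%d random IOPS before queuing sets in" % (spindles, total))
```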