165-hour Data Protection, normal?

I just replaced a 1.5TB drive with a 2TB drive. Does a 165-hour rebuild estimate sound normal? This is crazy!

Update: it just jumped to 186 hours!!!
Now 204… I’m getting concerned.

Is it in use during the rebuild?

For your setup I would say 72 hours is more typical.

Using it during a rebuild dramatically increases rebuild times.

I suppose Spotlight may have been taking a look at it. I also changed my Plex media server to stop scanning the library for now, and the estimate has dropped to 136 hours. Hopefully it keeps dropping; that was getting scary. I have most of that data backed up, but I do need to update my backups. Hopefully the rebuild will complete safely so I can get to that.

Update: down to 79 hours. Much calmer now, and less hair-pulling going on.
Update (6:10PM PDT): down to 46 hours

Hi, this sounds about right (it might even rise a bit more).

NeilR made a post in one (or several related) threads about the potentially exponential rise in rebuild times, driven by the sheer number of files and the amount of data in use as drives get bigger.

Since I only have the 4-slot models (v1 and v2), I thought maybe you did too (until I scrolled down and saw the picture) :slight_smile:
With so many drives, I’d imagine the rebuild time would probably be around 200% of what it would normally be for a 4-slot…

But as you have dual drive redundancy enabled (which I can’t actually tell from here; maybe the “hard drive failures” setting shows dual drive redundancy), that might in turn add more time, or it might simplify the process, since there’s one more drive available to act as a mirror, shaving off 25%. So I’d take a guess that it will take about 175% of what it would normally take on a 4-slot with 2TB drives (not sure what that value currently stands at, though… @NeilR, prompt?) :slight_smile:

BTW LtCarter, if you are doing anything with the Drobo data (such as having virus scanners running in the background, reading/writing to the Drobo, or cataloging/indexing stuff on it, etc.), then that could also affect the remaining-time value and make it fluctuate.

Hey, we all updated the post at the same time :)

(Heh, that Drobo Dashboard pic is quite sleek. How much is a DroboPro again?) :slight_smile:

When I last upgraded my Drobo V2, the initial estimated time remaining was quite wild, and after a couple of hours it settled down. It apparently takes some time to get accurate benchmark references. So LtCarter’s experience is probably normal.

Paul, I would expect a DroboPro to do a relay far quicker than a Drobo V2, apples to apples, which would mean 4 drives in the DroboPro. You’re supposed to get better performance for 5x the $$$, aside from some extra slots.

With 8 slots, it should not require more writing to the replaced drive; after all, only one drive’s parity is being rebuilt. But OTOH it has to read 4 extra drives to compute the parity (though that is happening concurrently?). Just to say there are variables moving in opposite directions when trying to compare a 4-slot Drobo V2 to a DroboPro. I only had 3TB of data and required 63 hours or so to rebuild each drive, so with over 3x the data I guess the DroboPro, on a relative basis, isn’t doing too badly…

The 72 hours is probably about right; I suspect the estimated time is stable now. If so, that suggests it would require about 560 hours to upgrade that DroboPro to larger drives (e.g. 3TB). That’s about 3 weeks. And that is the part I haven’t figured out in terms of the scalability of the Drobo…
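For what it’s worth, here is the back-of-the-envelope arithmetic behind that 560-hour figure as a quick Python sketch. It assumes roughly 70 hours of relayout per drive swap and all 8 DroboPro bays upgraded one at a time; both numbers are guesses based on this thread, not measured values.

```python
# Rough estimate of a full drive-by-drive upgrade on an 8-bay DroboPro.
# Assumes ~70 hours of relayout per replaced drive (a guess drawn from the
# 46-79 hour estimates reported above), not a measured figure.
hours_per_swap = 70
drives_to_upgrade = 8

total_hours = hours_per_swap * drives_to_upgrade
print(f"{total_hours} hours ≈ {total_hours / 24:.1f} days "
      f"≈ {total_hours / (24 * 7):.1f} weeks")
# -> 560 hours ≈ 23.3 days ≈ 3.3 weeks
```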

On my Drobo V2 with 3TB of data, I figured it would be faster to reformat and reload rather than upgrade 2 drives with the data in place, so obviously a 4-disk upgrade is far, far faster to reload. That assumes my throughput estimate is realistic; I probably figured about 10MB/s, which would be a challenge with small files but might be achievable with large files. But I never tried to load 3TB of data in one shot, and I don’t know the speeds and feeds of a DroboPro.
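To put rough numbers on that comparison, here is a minimal sketch assuming the ~3TB of data and ~10MB/s sustained throughput mentioned above. Real throughput will vary a lot with file sizes and interface, so treat it as an illustration only.

```python
# Reformat-and-reload time versus two in-place drive upgrades.
# All figures are the rough estimates quoted above, not measurements.
data_tb = 3              # amount of data to copy back after reformatting
throughput_mb_s = 10     # assumed sustained write speed

data_mb = data_tb * 1_000_000                # decimal units: 1 TB = 1,000,000 MB
reload_hours = data_mb / throughput_mb_s / 3600
print(f"Reload: ~{reload_hours:.0f} hours")          # ~83 hours

in_place_hours = 2 * 63                              # two drives at ~63 hours each
print(f"In-place upgrade: ~{in_place_hours} hours")  # ~126 hours
```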

@ lt

Yeah, I tend to say rebuilds take 48-72 hours for reasonably full Drobos.

@ paul

1300 GBP, more or less.

thanks Neil
thanks Doc

(LtCarter, it would be cool to know how long the final rebuild took) :slight_smile:
No bets… just good to know, and I might even start another Google spreadsheet of rebuild facts like the other one, since people have started using it :slight_smile:

I’ll keep an eye on it and update when it’s finished.

I noticed this morning, while sitting next to it for a bit before leaving for work, that it abruptly powered down, rebooted, then continued the rebuild with about the same 20-hour estimate it had before rebooting. Not sure what that was all about, and I wonder how many times it’s done that since I replaced the drive.

It would probably be a good idea to pull a log after the relay completes and open a support case to determine the cause of the reboot(s). I believe that can sometimes be related to a bad (marginal) drive, and you want to identify that drive ASAP. Since you have no user-accessible logging history to determine the frequency, it would be good preventive medicine.

Finished this morning at about 5:30AM PDT

We don’t know exactly when you started, plus there are time zone differences from your OP, but it sounds like about 36 hours?

Sorry, I mentioned PST earlier but meant PDT. The rebuild started at around 4:30PM on the 28th and ended at about 5:30AM on the 30th, so about 37 hours.

Not bad…

Well done.
Looks like it was quicker than my estimate, which is a good thing :slight_smile:

So just to follow up, I submitted my log file and opened a support case. It turns out a couple of drives were experiencing “full time outs”: 3 time outs for one drive and 5 for the other. I’ve pulled the drive that timed out 5 times and replaced it with a brand new drive. I’ll probably do the other one soon, once I get a new replacement.

I’m going to call Drobo support later today to ask if that means the drives are failing.

Usually it does (mean that it has failed).

Run the manufacturer’s full diagnostics on the drive and see if it passes.