I’m looking for fast transfer speeds to support working on HD video. My thought is to get a Drobo 5N and hook it up via Ethernet to the router on my network, connected with Cat6 cables. I would like to put two SSDs in the device (Samsung Pro SSDs), perhaps striped together, with the other three bays used for 7200 RPM drives. What is the fastest read/write speed I could get with this device? Are there any limitations I should be aware of?
This would be for reading video files directly from the network drive for video FX work.
A couple things - first off, some of your optimizations aren’t going to do a lot of good. A set of WD Greens can easily max out the throughput of the 5N, so buying expensive drives will buy you very little. Same goes for Cat6 (even if I do have it run in my house): under most conditions Cat6 won’t give an appreciable benefit over Cat5e. Also, the 5N’s bays are 3.5" and trayless, so mounting 2.5" SSDs will require an adapter like an IcyDock; you’d need a plan for that.
However, most fundamentally you don’t seem to understand how a Drobo works. You don’t set up RAID levels, different groups of disks, etc. You give it storage, it gives you redundancy. So there is no “striping two disks and three others…”. You get one pool of redundant storage - period.
Seriously, there have been a dozen or so posts lately, all of which are asking about manual RAID settings. Where is this all coming from?
I can’t speak to all those other posts, but I appreciate your explanation and the clarification about Cat5e and the speed.
I saw a throughput figure of 600 Mbps for the 5N somewhere; perhaps I was mistaken? I’d like to get a bit more than that, but the only options I see are large servers that wouldn’t fit in my home office (overkill). Maybe no one makes a fast RAID box in a smaller size/class?
The 5N can easily push north of 800 Mbps (100 MB/s) with plain Green drives. At that point you’re going to have a hard time getting much closer to the full gigabit; there’s a certain amount of protocol overhead and such that is just really hard to overcome.
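For a back-of-the-envelope sense of where that ceiling comes from, here’s a quick calculation (a rough sketch assuming standard 1500-byte frames and no IP/TCP options; real-world numbers land lower once retransmits and file-sharing protocol overhead are added):

```python
# Back-of-envelope: best-case TCP payload rate over gigabit Ethernet
# with standard 1500-byte frames (no jumbo frames, no IP/TCP options).

LINK_RATE_MBPS = 1000              # raw gigabit line rate

# Per-frame cost on the wire:
PREAMBLE = 8                       # preamble + start-of-frame delimiter
ETH_HEADER = 14                    # MACs + EtherType
FCS = 4                            # frame check sequence (CRC)
INTERFRAME_GAP = 12                # mandatory idle between frames
MTU = 1500                         # standard Ethernet payload

IP_HEADER = 20                     # IPv4, no options
TCP_HEADER = 20                    # TCP, no options

wire_bytes = PREAMBLE + ETH_HEADER + MTU + FCS + INTERFRAME_GAP   # 1538
payload_bytes = MTU - IP_HEADER - TCP_HEADER                      # 1460

efficiency = payload_bytes / wire_bytes
print(f"efficiency:  {efficiency:.1%}")                            # ~94.9%
print(f"max payload: {LINK_RATE_MBPS * efficiency:.0f} Mbps")      # ~949 Mbps
print(f"           = {LINK_RATE_MBPS * efficiency / 8:.0f} MB/s")  # ~119 MB/s
```

AFP/SMB add their own headers and round trips on top of that, which is why ~100 MB/s from a consumer NAS over gigabit is already a good result.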
If you need more than 1 Gbps of throughput, Synology devices have multiple 1 Gbps links that can be aggregated with the proper switch, although then you have to have a way of consuming that bandwidth on your computer’s end. Netgear is also starting to ship 10 Gbps “prosumer” equipment that’s not cheap, but it is network-based and faster than 1 Gbps. If direct-attached storage (DAS) is possible, there are always Thunderbolt enclosures as well.
Okay, I’ll look more into Synology and other equipment that can handle this. I’m guessing you need a card to be able to handle a fibre optic connection?
You don’t need fiber optic; we’re just starting to see support for 10GbE over copper (and there was always stuff like Twinax, although that’s far outside this discussion). However, I still don’t see how your computer is going to connect to the network at a matching speed, which potentially makes this moot.
Is there reading material you can direct me to so I can educate myself more on this topic? I’m trying to figure out how to network my computer at a matching, or close to matching, speed.
Long story short, your computer needs to have an interface capable of same/faster speed, on a bus capable of same/faster speed. The bottleneck is the smallest pipe.
Case in point: a Gigabit Ethernet card on standard 32-bit, 33 MHz PCI will generally never achieve full 1 Gbps throughput, because that bus maxes out at 133 MB/s (1064 Mbps) shared across all devices on the bus.
A little more modern: a USB 2.0 Gigabit Ethernet adapter will never get more than 480 Mbps of throughput, as that is the USB 2.0 spec limit (and real-world USB 2.0 throughput runs well below even that).
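To make the “smallest pipe” idea concrete, here’s a toy comparison of a few computer-to-NAS paths (a sketch using the spec figures above plus the ~800 Mbps 5N figure from earlier; the PCIe number assumes a v1.x x1 slot):

```python
# Toy model: end-to-end throughput is capped by the slowest stage in
# the path. Figures are the spec limits discussed above, plus the
# ~800 Mbps Drobo 5N number from earlier in the thread.

def path_limit(stages: dict) -> tuple:
    """Return (stage_name, Mbps) for the slowest stage in a path."""
    return min(stages.items(), key=lambda kv: kv[1])

paths = {
    "GigE NIC on legacy 32-bit/33 MHz PCI": {
        "GigE link": 1000,
        "PCI bus": 1064,      # 133 MB/s * 8; theoretical max, shared
                              # across all devices, so usable share is lower
        "NAS": 800,
    },
    "GigE adapter on USB 2.0": {
        "GigE link": 1000,
        "USB 2.0 bus": 480,   # spec limit
        "NAS": 800,
    },
    "GigE NIC on PCIe x1 (v1.x)": {
        "GigE link": 1000,
        "PCIe x1 bus": 2000,  # ~250 MB/s usable per direction
        "NAS": 800,
    },
}

for name, stages in paths.items():
    stage, mbps = path_limit(stages)
    print(f"{name}: limited by {stage} at {mbps} Mbps")
```

Note how the USB 2.0 path is the only one where the computer side, rather than the NAS, becomes the bottleneck; that’s the point of putting the NIC on an adequate bus.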
diamondsw might know more than I do (or maybe I just did it wrong), but the last time I researched link aggregation (aka teaming), it was great for servers since it increases the maximum total throughput, but the maximum throughput for a single client was still limited to the speed of a single link. In other words, a server with dual teamed GigE interfaces can deliver 2 Gbps of output total, but a single client (even one with its own dual teamed GigE interfaces) can only consume a max of 1 Gbps from that server; see the sketch below.
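That matches how aggregation generally works: the bonding layer hashes each conversation (by MAC and/or IP pair, depending on policy) to pick one physical link, so a single flow never spans links. Here’s an illustrative sketch of the idea; NUM_LINKS and the hash inputs are assumptions for illustration, not any vendor’s actual algorithm:

```python
import zlib

# Illustrative sketch of per-flow link selection in link aggregation
# (802.3ad/LACP-style). Real hash policies vary (layer 2, layer 2+3,
# layer 3+4); this is not any vendor's actual algorithm.

NUM_LINKS = 2  # two teamed GigE ports (assumed for this example)

def pick_link(src_ip: str, dst_ip: str) -> int:
    """Hash the conversation to one physical link (layer-3-style policy)."""
    return zlib.crc32(f"{src_ip}-{dst_ip}".encode()) % NUM_LINKS

# One client <-> one server: every packet hashes identically, so the
# whole conversation rides a single 1 Gbps link.
print("single client -> link", pick_link("192.168.1.10", "192.168.1.20"))

# Many clients: different conversations hash to different links, so the
# server's aggregate output can approach 2 Gbps even though each client
# individually tops out at 1 Gbps.
for client in ("192.168.1.11", "192.168.1.12", "192.168.1.13"):
    print(client, "-> link", pick_link(client, "192.168.1.20"))
```

Because the same pair of endpoints always hashes to the same link, one client tops out at one link’s speed, while many clients’ flows spread across links and let the server’s aggregate approach the combined capacity.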
In this case, unless you’re doing a lot of downloading or have lots of other network traffic, a single Gigabit Ethernet interface on PCI Express x1 or better would be sufficient to avoid additional bottlenecking.
If you’re pulling from multiple servers/locations, then a multi-port teamed interface on PCI Express x4 or better could help, though that would require a “smart” or better switch.