MD100 and 2650's

Mike P Moore Mike.Moore at appliedbiosystems.com
Fri Jun 6 13:14:58 CDT 2008


If you have a 2650 already, the only option for hardware RAID would be to 
use the following card:

Adaptec 4800SAS - PCI-X (not PCI-Express) SAS RAID adapter with one x4 
external SAS connector and two x4 internal SAS connectors.  This controller 
supports RAID 0, 1, 10, 5, 6, 50 and 60, plus some other RAID options.  It 
has 128 MB (small) of cache on the card.

This is the ONLY PCI-X SAS RAID controller I have been able to find that 
supports anything higher than RAID 0/1 in hardware.
 
If software RAID is acceptable, you could use the following:

LSI SAS3800X SAS HBA - two x4 external SAS connectors, no internal 
connectors.  I've used these on PE SC1425's to connect up multiple 
MD1000's, running Linux md software RAID under CentOS 4.
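For the software-RAID route, this is roughly what building an md array over 
HBA-attached disks looks like under CentOS.  A hedged sketch only: the device 
names /dev/sdb through /dev/sdf are placeholders for whatever disks your 
MD1000 presents, and this must run as root.

```shell
# Create a 5-disk RAID-5 md array from drives presented by the SAS HBA.
# Device names are hypothetical -- substitute the disks on your system.
mdadm --create /dev/md0 --level=5 --raid-devices=5 \
    /dev/sdb /dev/sdc /dev/sdd /dev/sde /dev/sdf

# Put a filesystem on it and record the array so it assembles at boot.
mkfs.ext3 /dev/md0
mdadm --detail --scan >> /etc/mdadm.conf
```

You can watch the initial resync in /proc/mdstat; the array is usable while 
it rebuilds, just slower.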

There may be a few other options from both LSI Logic and Adaptec, with 
varying combinations of internal/external SAS ports and support for 
RAID 0/1/10, if that is what you are looking for.  Here is a chart from 
LSI Logic comparing various HBAs:

http://www.lsi.com/documentation/scg/SCG_LSI-Adaptec_Comparison_Gd_120507.pdf

If your 2850 has PCI-Express slots, then you could try using a PERC 5e/6e 
controller; otherwise your options would be the same as for the 2650.

Here are a few things I have found out over time.
If you can use a PERC 5e/6e, I'd recommend the PERC 6e with the largest 
cache you can get (512 MB, I believe), as this will help with write 
performance.

Turn off any read-ahead caching in the PERC firmware for any volumes you 
create, as the read-ahead caching will compete with the write cache on the 
controller.  For RAID 5, you want as much write cache on the controller as 
you can get.
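Since the PERCs are LSI MegaRAID-based, the cache policy can also be changed 
from the OS rather than the firmware setup screen.  A sketch assuming LSI's 
MegaCli utility is installed (it is not part of CentOS; selector syntax may 
vary slightly between MegaCli versions):

```shell
# Assumes LSI's MegaCli tool; -LAll/-aAll select every logical drive on
# every adapter.  Run as root.
MegaCli -LDSetProp NORA -LAll -aAll   # disable read-ahead on all volumes
MegaCli -LDSetProp WB   -LAll -aAll   # enable write-back caching
```

The same policies can be set per-volume in the PERC BIOS (Ctrl-R at boot) if 
you'd rather not install the tool.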

It seems to help performance if you keep the RAID-5 stripes to 8 drives or 
fewer; this reduces contention on the spindles between the parity I/O and 
the data I/O.

PERC 5 controllers (I don't know about PERC 6) can only support up to 8 
RAID-5 stripe sets.  I recently built a 90-drive RAID 50 array, and I had 
to use 8 x 11-drive RAID-5 stripes instead of the 11 x 8-drive stripes I 
wanted to use.
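The capacity side of that trade-off is easy to work out: each RAID-5 span 
gives up one drive's worth of space to parity, so wider spans waste less.  A 
small illustrative calculation (the function name is just for this example):

```shell
#!/bin/sh
# Usable data drives in a RAID 50 laid out as SPANS x DRIVES-per-span
# RAID-5 sets: each span loses one drive to parity.
raid50_data_drives() {
    spans=$1
    per_span=$2
    echo $(( spans * (per_span - 1) ))
}

raid50_data_drives 8 11   # 8 spans of 11 drives -> prints 80
raid50_data_drives 11 8   # 11 spans of 8 drives -> prints 77
```

So the 8 x 11 layout actually yields a bit more usable space; the cost is 
wider spans and more spindle contention per stripe set, per the note above.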

Good Luck!  And feel free to contact me if you have any questions!

- Mike

Michael P. Moore
Senior Network Engineer
Applied Biosystems - High Throughput Discovery Division
P: 508-383-7486
ONNET: 685-7486




Alan Bunch <alabun at udfc.com> 
Sent by: linux-poweredge-bounces at dell.com
06/06/2008 10:29 AM

To
linux-poweredge at dell.com
cc

Subject
MD100 and 2650's






The goal is a large stack of disk for a disk-to-disk-to-tape scheme.  We 
want to rsync the main datacenter disk to a stack of disk at a remote site 
and then back up to tape from there.
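[Something like the following could implement that scheme; the host, paths, 
and tape device below are placeholders, not anything from this thread:]

```shell
# Placeholders throughout: adjust paths, host, and tape device to your setup.
# Stage 1: mirror the main datacenter volume to the remote disk stack.
rsync -a --delete /data/ remote-site:/backup/data/

# Stage 2 (run at the remote site): stream the synced copy to tape.
tar -cf /dev/st0 -C /backup data
```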

What would you use to connect an MD100 to a PE 2650?

It does not need to be Dell supported, but it does need strong CentOS 5 
support.

As an alternative, I have a 2850 that might be available for this.

alabun

_______________________________________________
Linux-PowerEdge mailing list
Linux-PowerEdge at dell.com
http://lists.us.dell.com/mailman/listinfo/linux-poweredge
Please read the FAQ at http://lists.us.dell.com/faq


