No Boot on RAID with 5+ drives

Hello.

We have a Dell PowerEdge R710 server with six 1 TB drives in a RAID 10 array.
It's a hardware RAID using a PERC 6/i integrated controller.

Rocky Linux sees the virtual disk and installs fine with no issues.

After the install, we reboot the machine and it goes straight to the GRUB rescue shell.

When I try RAID 10 with 4 drives it works; with 5 or more it fails to boot.

The same issue occurs with RAID 5 if there are 5 or more drives.

The issue also occurs with both BIOS and UEFI installs.

We have at least two other R710s configured with RAID 10 using six 1 TB drives, but they are running CentOS 7.9.

Thanks for any assistance!

It's the PERC in the R710. RHEL 6.2 or so and up doesn't like a boot disk that's more than 2 TB.

I have an R710 with 6x 1 TB disks in it slated to be a search node for Security Onion (CentOS 7 based), and once I split it into a 1 TB virtual disk for booting and a separate array for the other five drives, it boots CentOS just fine. With all six disks in one array, it went to the GRUB menu after a CentOS 7 install, exactly as you described, and the Security Onion install disc would just poop its pants immediately after booting. I found this out after much head scratching and googling. Once I get a sufficient supply of round tuits I will get the 5x 1 TB RAID partitioned the way SO wants it and get SO installed.

But it gets worse: RHEL 8 and up don't ship a driver for the PERC, so a separate driver disk is needed at install time for RHEL 8 and its derivatives.
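If you want to confirm you're hitting that 2 TB limit, something like this should show it (assuming the PERC virtual disk shows up as /dev/sda; adjust to match your system):

```
# Raw size of the virtual disk in bytes. Anything over ~2.2 TB (2^32 x 512-byte
# sectors) is more than an MBR/msdos partition table can address.
lsblk -b -d -o NAME,SIZE /dev/sda

# Which disk label the installer actually wrote: msdos (MBR) vs gpt
parted /dev/sda print | grep -i 'partition table'
```

For RAID 10 the arithmetic matches what you're seeing: four 1 TB drives give a 2 TB virtual disk, which just squeaks under the limit, while six give 3 TB, which doesn't.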

I finally got it to boot. UEFI does not work at all no matter what: it installs, but then does not boot.

I added inst.gpt to the installer boot parameters to force it to use GPT with a biosboot partition instead of EFI.

That got it to work.
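For anyone else who runs into this, here is roughly what that looks like, assuming the PERC virtual disk ends up as /dev/sda on your box:

```
# At the Rocky installer boot menu, edit the kernel line (Tab on a BIOS boot)
# and append the option:
#   inst.gpt
#
# After the install you can confirm it did what you wanted: the disk label
# should be gpt, and there should be a small (~1 MiB) partition flagged
# bios_grub that holds GRUB's core image.
parted /dev/sda print
```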
