CentOS 7 Kickstart – Software RAID 10 and LVM

centos, kickstart, lvm, partition, redhat

I am not sure why I am having this issue, so I hope someone can spot something I am missing.

I created a kickstart file for a test CentOS 7 automated install. Nothing generates a warning except the storage section that handles partitioning. This is that section:

clearpart --all --initlabel --drives=/dev/sda,/dev/sdb,/dev/sdc,/dev/sdd,/dev/sde,/dev/sdf,/dev/sdg,/dev/sdh

part raid.1 --size=1024 --ondisk=/dev/sda
part raid.2 --size=1024 --ondisk=/dev/sdb
part raid.3 --size=1024 --ondisk=/dev/sdc
part raid.4 --size=1024 --ondisk=/dev/sdd
part raid.5 --size=1024 --ondisk=/dev/sde
part raid.6 --size=1024 --ondisk=/dev/sdf
part raid.7 --size=1024 --ondisk=/dev/sdg
part raid.8 --size=1024 --ondisk=/dev/sdh

part raid.9 --size=256 --ondisk=/dev/sda
part raid.10 --size=256 --ondisk=/dev/sdb
part raid.11 --size=256 --ondisk=/dev/sdc
part raid.12 --size=256 --ondisk=/dev/sdd
part raid.13 --size=256 --ondisk=/dev/sde
part raid.14 --size=256 --ondisk=/dev/sdf
part raid.15 --size=256 --ondisk=/dev/sdg
part raid.16 --size=256 --ondisk=/dev/sdh

part raid.17 --size=20480 --ondisk=/dev/sda
part raid.18 --size=20480 --ondisk=/dev/sdb
part raid.19 --size=20480 --ondisk=/dev/sdc
part raid.20 --size=20480 --ondisk=/dev/sdd
part raid.21 --size=20480 --ondisk=/dev/sde
part raid.22 --size=20480 --ondisk=/dev/sdf
part raid.23 --size=20480 --ondisk=/dev/sdg
part raid.24 --size=20480 --ondisk=/dev/sdh

raid /boot --fstype="xfs" --device=boot --level=10 raid.1 raid.2 raid.3 raid.4 raid.5 raid.6 raid.7 raid.8
raid /boot/efi --fstype="efi" --device=boot_efi --level=10 raid.9 raid.10 raid.11 raid.12 raid.13 raid.14 raid.15 raid.16
raid pv.1 --fstype="lvmpv" --device=root --level=10 raid.17 raid.18 raid.19 raid.20 raid.21 raid.22 raid.23 raid.24

volgroup vg1 pv.1

logvol / --fstype="xfs" --size=1 --grow --name=root --vgname=vg1

bootloader --append=" crashkernel=auto" --location=mbr

I am trying to create three partitions:

  • /boot – 1024 MiB size, formatted to xfs, RAID 10
  • /boot/efi – 256 MiB size, formatted to efi, RAID 10
  • / – 20 GiB size, formatted to xfs, RAID 10 + LVM

I am using the graphical install so I can review everything quickly. It looks like it is marking /boot/efi as efi, yet I still get the error below, which prevents me from completing the installation.

No valid boot loader target device found. See below for details.
For a UEFI installation you must include a EFI System Partition on a GPT-formatted disk, mounted at /boot/efi.

The other oddity I am seeing is that it is not using my values for the partition sizes. Based on the kickstart file above, these are the sizes I am seeing:

  • /boot – should be 1024 MiB, CentOS 7 makes it 4092 MiB
  • /boot/efi – should be 256 MiB, CentOS 7 makes it 1020 MiB
  • / – should be 20 GiB, CentOS 7 makes it 79.93 GiB

I would appreciate any assistance on this.

Best Answer

Your sizes are exactly what they should be given those part commands. Your first partition on each device is 1024 MiB and you have 8 devices in a RAID10, so that is 1024 * 8 / 2 = 4096 MiB. For RAID10, the size of the volume is the number of active devices times the size of the smallest member, divided by 2.
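Applying that same formula to the other two arrays lines up with the numbers the installer shows you (the small shortfalls are presumably RAID metadata overhead):

/boot      :  1024 MiB * 8 / 2 =  4096 MiB   (installer shows 4092 MiB)
/boot/efi  :   256 MiB * 8 / 2 =  1024 MiB   (installer shows 1020 MiB)
/ (pv.1)   : 20480 MiB * 8 / 2 = 81920 MiB   (installer shows 79.93 GiB)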

I highly doubt a software RAID10 is valid for an EFI partition, and unless something has changed it isn't going to be valid for your /boot partition either. I suspect your only choice there is a simple RAID1 volume. It is valid to have a RAID1 volume that spans 8 devices, so you could try changing your /boot and /boot/efi arrays over to RAID1. With RAID1 the size of the volume will just be the size of the smallest active member.
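As a rough, untested sketch, keeping your member partitions as they are and only changing the level on the two boot arrays would look like this; whether anaconda actually accepts an EFI System Partition on an md RAID1 is something you would still need to verify:

# /boot and /boot/efi moved to RAID1 across the same eight members
raid /boot --fstype="xfs" --device=boot --level=1 raid.1 raid.2 raid.3 raid.4 raid.5 raid.6 raid.7 raid.8
raid /boot/efi --fstype="efi" --device=boot_efi --level=1 raid.9 raid.10 raid.11 raid.12 raid.13 raid.14 raid.15 raid.16

# the LVM PV for / can stay on RAID10
raid pv.1 --fstype="lvmpv" --device=root --level=10 raid.17 raid.18 raid.19 raid.20 raid.21 raid.22 raid.23 raid.24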
