Linux – What’s the best practice for mass Linux system deployment

deployment linux

If you are trying to install 500 Linux systems over the network at the same time, the bottleneck will be the NFS/HTTP/FTP (or whatever) server that holds the files needed for the installation.

IMO, this can only be solved by adding more installation servers and round-robining requests across them.

Is there any better solution to this problem? Something like "P2P Linux installation"?

UPDATE:
I need to describe my situation more specifically. Currently I'm deploying RHEL using kickstart + NFS. When I try to deploy 500 RHEL systems concurrently, the NFS server sees enormous traffic and every install slows to a crawl. Setting up more NFS servers is one solution, but I don't think it's a good one.
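For context, the only part of the kickstart file that matters here is the install-source line; a minimal sketch (hostnames and paths below are made up):

```
# Hypothetical kickstart fragment: every client pulls the install tree from
# one NFS server, which becomes the choke point when 500 clients start at once.
nfs --server=nfs01.example.com --dir=/exports/rhel

# The "add more servers" workaround means generating per-group kickstarts that
# each point at a different mirror, e.g. an HTTP mirror for the second group:
# url --url=http://mirror02.example.com/rhel
```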

Best Answer

This is usually where multicast imaging comes in. Tools like Clonezilla SE or Symantec Ghost can send the data over multicast, which lets you push the image out to all 500 systems at once at essentially the same speed as pushing it out to a single system.
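Clonezilla SE uses udpcast underneath; as a rough illustration of the mechanism (the image name, device, and receiver count below are assumptions, not a drop-in recipe), the server streams one copy of a golden image and every client joins the same multicast session:

```
# On the image server: send the image once, waiting until roughly all 500
# clients have joined before starting the multicast transfer.
udp-sender --file rhel-golden.img --min-receivers 500 --nokbd

# On each client (e.g. booted into a PXE rescue environment): write the
# incoming multicast stream straight to the local disk.
udp-receiver --file /dev/sda --nokbd
```

The point is that the server's outbound bandwidth is spent once per deployment rather than once per client, so adding more clients barely changes the total transfer time.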