SSH – How to manage AWS VPC ssh access accounts and keys across multiple instances

amazon-iam, amazon-vpc, amazon-web-services, ssh

I am setting up a standard AWS VPC structure: a public subnet, some private subnets, hosts on each, an ELB, etc. Operational network access will be via either an ssh bastion host or an OpenVPN instance.

Once on the network (bastion or openvpn), admins use ssh to access the individual instances.

From what I can tell, all of the docs seem to assume a single user with sudo rights and a single public ssh key. But is that really best practice? Isn't it much better to have each admin access each host under their own account?

I could deploy accounts and ssh public keys to each server by hand, but that rapidly becomes unmanageable.
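For context, the manual approach looks something like the sketch below, repeated for every admin on every instance (the account name and key here are placeholders, not from the question):

```shell
#!/bin/sh
# Sketch of manual per-host provisioning. On a real instance you would
# first create the account, e.g.:
#   sudo useradd --create-home --shell /bin/bash jdoe
# The key-installation steps below are the part repeated per user, per host.

provision_admin() {
  home_dir=$1   # the admin's home directory
  pubkey=$2     # the admin's ssh public key
  mkdir -p "$home_dir/.ssh"
  printf '%s\n' "$pubkey" >> "$home_dir/.ssh/authorized_keys"
  chmod 700 "$home_dir/.ssh"                  # sshd rejects keys with loose perms
  chmod 600 "$home_dir/.ssh/authorized_keys"
}

# Example against a scratch directory (a real run would target /home/jdoe):
provision_admin "$(mktemp -d)/jdoe" 'ssh-ed25519 AAAA... jdoe@example'
```

Multiply that by every user, every key rotation, and every instance, and the scaling problem is obvious.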

How do people recommend managing user accounts? I've looked at:

  • IAM: It doesn't look like IAM has a method for automatically distributing accounts and ssh keys to VPC instances.
  • IAM via LDAP: IAM doesn't have an LDAP API.
  • LDAP: Set up my own LDAP servers (redundant, of course). A bit of a pain to manage, but still better than managing accounts on every host, especially as we grow.
  • Shared ssh key: Rely on the VPN/bastion logs to track user activity. I don't love it, but…

What do people recommend?

NOTE: I moved this over after accidentally posting it on Stack Overflow.

Best Answer

IAM is strictly for access to create/modify/destroy AWS resources. It has nothing to do with granting ssh access to your servers. If your users do need access to the AWS API, then yes, absolutely give them IAM users.

With regard to creating users, deploying keys, and so on: just use a configuration management system. I use Ansible for this, and it's a breeze. I can deploy my users, their keys, and their sudo privileges to a single server just as easily as I can to 1000 servers.
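As a rough illustration of the Ansible approach, a playbook along these lines creates a personal account and installs a key for each admin (the `admins` list, user names, and key paths are assumptions for the sketch, not from the answer):

```yaml
# Hypothetical playbook sketch; requires the ansible.posix collection.
- hosts: all
  become: true
  vars:
    admins:
      - name: alice
        key: "{{ lookup('file', 'keys/alice.pub') }}"
      - name: bob
        key: "{{ lookup('file', 'keys/bob.pub') }}"
  tasks:
    - name: Create a personal account for each admin
      ansible.builtin.user:
        name: "{{ item.name }}"
        shell: /bin/bash
        groups: sudo        # or wheel, depending on the distro
        append: true
      loop: "{{ admins }}"

    - name: Install each admin's public key
      ansible.posix.authorized_key:
        user: "{{ item.name }}"
        key: "{{ item.key }}"
      loop: "{{ admins }}"
```

Adding an admin then becomes a one-line change to the `admins` list plus a playbook run, and removing one is just as easy with `state: absent`.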

Whatever you do, do not under any circumstances permit the use of a shared account, shared keys, etc.
