I have created a kops cluster and am getting the below error when logging in to the cluster.
Error log:

```
INFO! KUBECONFIG env var set to /home/user/scripts/kube/kubeconfig.yaml
INFO! Testing kubectl connection....
error: You must be logged in to the server (Unauthorized)
ERROR! Test Failed, AWS role might not be recongized by cluster
```
I use a script for IAM authentication and log in to the server with the proper role before connecting.
I am able to log in to another server in the same environment, and I have tried different Kubernetes versions and different configurations.
The KUBECONFIG has no problem; it has the same entries and token details as the other cluster.
I can see the token with the 'aws-iam-authenticator' command.
I went through most of the articles on this and they didn't help.
Best Answer
It seems to be an AWS authorization issue. At cluster creation, only the IAM user who created the cluster has admin rights on it, so you may need to add your own IAM user first.
1- Start by verifying the IAM user identity used implicitly in all commands:
aws sts get-caller-identity
If your aws-cli is set up correctly, you will get output similar to this:
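The exact values will differ; the `UserId` and `Arn` below are illustrative placeholders, and the account ID is the example value used in this answer:

```json
{
    "UserId": "AIDASAMPLEUSERID",
    "Account": "12344455555",
    "Arn": "arn:aws:iam::12344455555:user/Toto"
}
```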
We will refer to the value in Account as YOUR_AWS_ACCOUNT_ID in step 3 (in this example, YOUR_AWS_ACCOUNT_ID="12344455555").
2- Once you have this identity, you have to add it to the AWS role binding to get EKS permissions.
3- You will need to edit the ConfigMap used by kubectl to add your user:
kubectl edit -n kube-system configmap/aws-auth
In the editor that opens, create a username you want to use to refer to yourself in the cluster, YOUR_USER_NAME (for simplicity you may use the same as your AWS user name, for example Toto); you will need it in step 4. Then add the AWS account ID you found in your identity info at step 1, YOUR_AWS_ACCOUNT_ID (don't forget to keep the quotes ""), in the mapUsers and mapAccounts sections.
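A minimal sketch of the resulting aws-auth ConfigMap (Toto and 12344455555 are the example placeholders from above; substitute your own values and keep any existing mapRoles content untouched):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  # ... keep the existing mapRoles section as-is ...
  mapUsers: |
    - userarn: arn:aws:iam::12344455555:user/Toto   # uses YOUR_AWS_ACCOUNT_ID and your IAM user name
      username: Toto                                # YOUR_USER_NAME, needed again in step 4
  mapAccounts: |
    - "12344455555"                                 # YOUR_AWS_ACCOUNT_ID, keep the quotes
```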
4- Finally, you need to create a role binding on the Kubernetes cluster for the user specified in the ConfigMap.
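One way to do step 4 (a sketch, assuming you want cluster-wide admin rights for the mapped user; pick a narrower ClusterRole if that is too broad) is a ClusterRoleBinding:

```yaml
# ClusterRoleBinding granting the mapped user full cluster access.
# "Toto" is the YOUR_USER_NAME placeholder from the aws-auth ConfigMap.
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: toto-cluster-admin
subjects:
  - kind: User
    name: Toto
    apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: ClusterRole
  name: cluster-admin
  apiGroup: rbac.authorization.k8s.io
```

Save it to a file (e.g. a hypothetical binding.yaml) and apply it with `kubectl apply -f binding.yaml`; after that, kubectl commands authenticated as that IAM user should be authorized.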