Attempting to make an SSH connection to my EKS worker nodes fails with this error:
$ kubectl plugin ssh ip-192-168-107-182.ec2.internal -i ~/certificates/MY-KEYPAIR.pem
MissingParameter: status
SSHing into Worker Node with IP: The
code:
ssh: Could not resolve hostname the: nodename nor servname provided, or not known
Can't SSH
error: exit status 1
$
Why does the kubectl ssh plugin think the worker node's name or IP is "The"?
The problem here is that the Kubernetes version is "too new" (1.11+): the script_ssh.sh script depends on .spec.externalID, which was deprecated and has now been removed. From the Kubernetes 1.11 changelog:
Kubelets will no longer set externalID in their node spec. This feature has been deprecated since v1.1. (#61877, @mikedanese)
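Since .spec.externalID is empty on 1.11+ nodes, one possible workaround is to skip the plugin and look the address up from the node's .status.addresses instead, then SSH directly. This is a sketch, not the plugin's own logic; the node name and key path are taken from the report above, and ec2-user is an assumed login for the EKS AMI (it may differ on your image):

```shell
#!/usr/bin/env bash
# Workaround sketch: resolve the worker node's external IP from
# .status.addresses (still populated on 1.11+) instead of the
# removed .spec.externalID, then SSH to it directly.
NODE=ip-192-168-107-182.ec2.internal

# JSONPath filter selects the address entry whose type is ExternalIP.
EXTERNAL_IP=$(kubectl get node "$NODE" \
  -o jsonpath='{.status.addresses[?(@.type=="ExternalIP")].address}')

# ec2-user is an assumption; adjust for your node AMI.
ssh -i ~/certificates/MY-KEYPAIR.pem ec2-user@"$EXTERNAL_IP"
```

If the nodes have no public address (private subnets), substitute InternalIP in the filter and connect from inside the VPC or through a bastion.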