sudo: /usr/bin/tee command not found #1251

Closed
Empty2k12 opened this issue May 1, 2024 · 8 comments · Fixed by #1329
Labels
🐛 bug Something isn't working ⌛ pending author's response Requested additional information from the reporter

Comments

@Empty2k12

Describe the bug
Creating a snippet throws Error: error transferring file: sudo: /usr/bin/tee /var/lib/vz/snippets/vendor-config.yaml: command not found.

To Reproduce

  1. Run the below command.
  2. PROXMOX_VE_ENDPOINT is the https URL with port; PROXMOX_VE_USERNAME is a PAM-realm user with API admin privileges. The user has all visudo permissions set.
  3. ssh user@10.176.95.110 sudo pvesm apiinfo returns APIVER 10 APIAGE 1.
  4. ssh user@10.176.95.110 sudo /usr/bin/tee works.
  5. Running try_sudo(){ if [ $(sudo -n pvesm apiinfo 2>&1 | grep "APIVER" | wc -l) -gt 0 ]; then sudo $1; else $1; fi }; try_sudo /usr/bin/tee on the host works. I am not proficient enough in Go and have not dug deep enough into the source to determine whether it is actually trying to run sudo /usr/bin/tee.
terraform {
  required_providers {
    proxmox = {
      source = "bpg/proxmox"
      version = "0.55.0"
    }
  }
}

variable "PROXMOX_VE_ENDPOINT" {
  type = string
}

variable "PROXMOX_VE_USERNAME" {
  type = string
}

variable "PROXMOX_VE_PASSWORD" {
  type = string
}

provider "proxmox" {
  endpoint = var.PROXMOX_VE_ENDPOINT
  username = var.PROXMOX_VE_USERNAME
  password = var.PROXMOX_VE_PASSWORD
  insecure = true

  ssh {
    agent = true
  }
}

resource "proxmox_virtual_environment_file" "gerodev-kube1-1-vendor-config" {
  content_type = "snippets"
  datastore_id = "local"
  node_name    = "kube1"

  source_file {
    path = "../proxmox/vendor-config.yaml"
  }
}

Please note this example has been created as a minimal repro. I am actually using the Pulumi provider, which uses the same plugin; the same error happens there.
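For reference, the sudo-detection logic quoted in step 5 of the reproduction can be written out as a standalone script. This is a sketch reconstructed from that step, not the provider's actual Go implementation; the snippet path in the comment is taken from the error message, and `"$@"` replaces the original `$1` so that commands with arguments also work.

```shell
#!/bin/sh
# Sketch of the sudo-detection wrapper quoted in the reproduction steps.
# If a passwordless "sudo pvesm apiinfo" succeeds, the command is run
# via sudo; otherwise it is run directly.
try_sudo() {
  if sudo -n pvesm apiinfo 2>/dev/null | grep -q "APIVER"; then
    sudo "$@"
  else
    "$@"
  fi
}

# On a PVE node, the failing path could be exercised with, e.g.:
#   printf 'test\n' | try_sudo /usr/bin/tee /var/lib/vz/snippets/vendor-config.yaml
# Harmless demonstration:
try_sudo echo "ok"
```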

Expected behavior
The snippet file is uploaded to the node's snippets datastore without an error.

  • Single or clustered Proxmox: Clustered
  • Provider version (ideally it should be the latest version): see above
  • Terraform version:

        OpenTofu v1.7.0
        on darwin_arm64
        + provider registry.opentofu.org/bpg/proxmox v0.55.0
    
  • OS (where you run Terraform from): macOS 14.3.1 (23D60)
  • Debug logs (TF_LOG=DEBUG terraform apply): tofu.log
@Empty2k12 Empty2k12 added the 🐛 bug Something isn't working label May 1, 2024
@bpg
Owner

bpg commented May 2, 2024

Hi @Empty2k12 👋🏼

ssh user@10.176.95.110 sudo /usr/bin/tee works.

I see you've configured the username 'user' for SSH access, but the username argument is not specified in the provider's ssh config. The provider is therefore falling back to the default username, i.e. the one configured in the provider block via var.PROXMOX_VE_USERNAME.
As I can see from the debug log, that user is different, and most likely is not configured for sudo.

You'd need to add username = "user" to the ssh block in the provider config, and everything should work:

  ssh {
    agent    = true
    username = "user"
  }

Please let me know if it solves the issue.

@bpg bpg added the ⌛ pending author's response Requested additional information from the reporter label May 2, 2024
@Empty2k12
Author

Hello @bpg, thanks for your quick response. I have just renamed the user in this issue; the user is actually called gero. Adding username = "gero" (which should not be necessary) does not solve the issue.

Actually, the permissions for the user gero are broader than required:

gero ALL=(ALL) NOPASSWD: ALL
gero ALL=(root) NOPASSWD: /sbin/pvesm
gero ALL=(root) NOPASSWD: /sbin/qm
gero ALL=(root) NOPASSWD: /usr/bin/tee /var/lib/vz/*
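Given those sudoers entries, one quick sanity check is to run the exact tee invocation non-interactively as the SSH user, which is roughly what the provider does over SSH. This is a generic check, not the provider's own code; the `_sudo_check` filename is a hypothetical placeholder, and the path must fall under the `/var/lib/vz/*` wildcard in the sudoers rule.

```shell
# Run on the PVE node as the SSH user (gero in this thread).
# "sudo -n" fails instead of prompting for a password, mimicking a
# non-interactive SSH session.
if printf 'test\n' | sudo -n /usr/bin/tee /var/lib/vz/snippets/_sudo_check >/dev/null 2>&1; then
  echo "tee via sudo: ok"
else
  echo "tee via sudo: FAILED"
fi
```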

@bpg
Owner

bpg commented May 2, 2024

Interesting... 🤔 Do you have the same user set up in the same way on all nodes in your PVE cluster?

@Empty2k12
Author

Yes, I have.

@bpg
Owner

bpg commented May 3, 2024

That really puzzles me. Could you check the syslog on your node? Perhaps we can find some clues there. For example, on my test node I have these logs from the snippet upload:

May 02 20:44:03 pve sshd[1894819]: Accepted key ED25519 SHA256:ciWV4gSYzOwR7dxeBWt/UZq5dF2qXoJOeT8SFQzJFDM found at /home/terraform/.ssh/authorized_keys:1
May 02 20:44:03 pve sshd[1894819]: Accepted publickey for terraform from X.X.X.X port 63876 ssh2: ED25519 SHA256:ciWV4gSYzOwR7dxeBWt/UZq5dF2qXoJOeT8SFQzJFDM
May 02 20:44:03 pve sshd[1894819]: User child is on pid 1894830
May 02 20:44:03 pve sshd[1894830]: Starting session: command for terraform from X.X.X.X port 63876 id 0
May 02 20:44:03 pve sudo[1894833]: terraform : PWD=/home/terraform ; USER=root ; COMMAND=/sbin/pvesm apiinfo
May 02 20:44:03 pve sudo[1894833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
May 02 20:44:03 pve sudo[1894833]: pam_unix(sudo:session): session closed for user root
May 02 20:44:03 pve sudo[1894837]: terraform : PWD=/home/terraform ; USER=root ; COMMAND=/usr/bin/tee /var/lib/vz/snippets/cloud-config.yaml
May 02 20:44:03 pve sudo[1894837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
May 02 20:44:03 pve sudo[1894837]: pam_unix(sudo:session): session closed for user root
May 02 20:44:03 pve sshd[1894830]: Close session: user terraform from X.X.X.X port 63876 id 0

Also, what type of shell is configured for root on your node?
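A quick way to answer that question on the node is to read field 7 of root's passwd entry. This is a generic Linux check, not specific to the provider:

```shell
# Print root's login shell (field 7 of its passwd entry).
getent passwd root | cut -d: -f7

# If it is zsh (or anything other than bash), it can be switched back;
# run this as root on the PVE node:
#   chsh -s /bin/bash root
```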

@mattburchett

I just came across this same problem, and changing my shell back to bash from zsh seems to have solved the problem.

I have some pretty custom zsh configs, so I wouldn't be entirely surprised if my issue was there, but I am now curious if the OP was using a shell that isn't bash. 🙂

@HaziFlorinMarian

Thank you @mattburchett for your comment!
I faced the same issue, and indeed this provider is not fully compatible with zsh.

@bpg
Copy link
Owner

bpg commented May 27, 2024

The requirement for the default root shell has been documented in #1329
