
Pipework error: "Invalid arguments for command" #344

Closed · tknerr opened this issue Jan 27, 2015 · 4 comments

tknerr commented Jan 27, 2015

Hi!

I'm trying to get private networking running on one of our build servers.

So far I always get this error:

...
==> web: Setting up private networks...
 INFO driver: Configuring network interface for env-jenkins_web_1422398666776_38149 using 192.168.34.13 and bridge vlxcbr1
 INFO subprocess: Starting process: ["/usr/bin/sudo", "/usr/local/bin/vagrant-lxc-wrapper", "/var/lib/jenkins/.vagrant.d/gems/gems/vagrant-lxc-1.1.0/scripts/pipework", "vlxcbr1", "env-jenkins_web_1422398666776_38149", "192.168.34.13/24"]
DEBUG subprocess: Selecting on IO
DEBUG subprocess: stderr: Invalid arguments for command /var/lib/jenkins/.vagrant.d/gems/gems/vagrant-lxc-1.1.0/scripts/pipework, provided args: ["vlxcbr1", "env-jenkins_web_1422398666776_38149", "192.168.34.13/24"]
DEBUG subprocess: Waiting for process to exit. Remaining to timeout: 32000
DEBUG subprocess: Exit status: 1
ERROR warden: Error occurred: There was an error executing ["sudo", "/usr/local/bin/vagrant-lxc-wrapper", "/var/lib/jenkins/.vagrant.d/gems/gems/vagrant-lxc-1.1.0/scripts/pipework", "vlxcbr1", "env-jenkins_web_1422398666776_38149", "192.168.34.13/24"]

For more information on the failure, enable detailed logging by setting
the environment variable VAGRANT_LOG to DEBUG.
...

This was already run with VAGRANT_LOG=DEBUG.

Any ideas on how to better debug such issues?
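
One way to narrow it down (as done further down this thread) is to re-run the wrapped command by hand so the wrapper's own error is visible; the arguments below are copied verbatim from the log above:

$ sudo /usr/local/bin/vagrant-lxc-wrapper /var/lib/jenkins/.vagrant.d/gems/gems/vagrant-lxc-1.1.0/scripts/pipework vlxcbr1 env-jenkins_web_1422398666776_38149 192.168.34.13/24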

fgrehm (Owner) commented Jan 28, 2015

That is coming from the sudo wrapper script. Did you regenerate the sudo wrapper after upgrading to 1.1.0?
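
For reference, the wrapper is regenerated with the plugin's sudoers command (assuming vagrant-lxc 1.x):

$ vagrant lxc sudoers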

tlfbrito commented Feb 9, 2015

I also have this issue when running LXC inside another LXC container.

Scenario:

  • LXC container A
  • LXC container B (with private network)

Container B runs inside container A, and container B crashes at "Setting up private networks...".

And yes, I've regenerated the sudo wrapper.

@gregoryolsen

Hi,

I just updated #378 with what looks like the same error from vagrant-lxc-wrapper:

$ vagrant up --provider=lxc
[sudo] password for geek: 
Bringing machine 'deb7a' up with 'lxc' provider...
==> deb7a: Checking if box 'fgrehm/wheezy64-lxc' is up to date...
==> deb7a: Setting up mount entries for shared folders...
    deb7a: /vagrant => /srv/vagrant/lxc/deb7a
    deb7a: /vagrant_home_root => /home/vagrant/root
==> deb7a: Starting container...
There was an error executing ["sudo", "/usr/local/bin/vagrant-lxc-wrapper", "cat", "/srv/lxc/deb7a/config"]

For more information on the failure, enable detailed logging by setting
the environment variable VAGRANT_LOG to DEBUG.

Running the failed command from a terminal shows the error from vagrant-lxc-wrapper:

$ sudo /usr/local/bin/vagrant-lxc-wrapper cat /srv/lxc/deb7a/config
Invalid arguments for command /bin/cat, provided args: ["/srv/lxc/deb7a/config"]
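
That rejection comes from the whitelist baked into the generated wrapper: each allowed command is paired with argument patterns, and anything that does not match is refused. Below is a minimal Ruby sketch of that behavior, assuming a simplified single-entry whitelist; it is illustrative only, not the actual generated script:

#!/usr/bin/env ruby
# Simplified whitelist-style sudo wrapper (illustrative only; the real
# script generated by `vagrant lxc sudoers` is more elaborate and also
# resolves bare command names like "cat" to full paths).

# Hypothetical whitelist: full command path => allowed argument patterns.
WHITELIST = {
  '/bin/cat' => [%r{\A/var/lib/lxc/.+\z}]
}

command, *args = ARGV
patterns = WHITELIST[command]

if patterns && args.all? { |arg| patterns.any? { |pat| arg =~ pat } }
  exec(command, *args)
else
  # This branch prints the error quoted above.
  puts "Invalid arguments for command #{command}, provided args: #{args.inspect}"
  exit 1
end

With containers living under /srv/lxc instead of the default /var/lib/lxc, a pattern like the one above never matches, which is exactly the failure mode the commit referenced below addresses.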

Perhaps #344 and #378 are the same problem?

In my case it just started happening when vagrant-lxc was updated.

Please see #378 for complete info with DEBUG.

Thanks

gregoryolsen pushed a commit to gregoryolsen/vagrant-lxc that referenced this issue Jun 8, 2016
Use lxc.lxcpath in lxc.conf to resolve sudo wrapper base_path.

Fixes: fgrehm#344 and fgrehm#378
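
A minimal sketch of the idea behind that commit (the helper name and fallback are illustrative, not the actual patch): ask LXC for the configured lxc.lxcpath instead of hardcoding the default base path:

# Illustrative helper: resolve the LXC base path from the system
# configuration instead of assuming /var/lib/lxc.
def lxc_base_path
  # `lxc-config lxc.lxcpath` prints the configured container path,
  # honoring overrides in lxc.conf such as "lxc.lxcpath = /srv/lxc".
  path = `lxc-config lxc.lxcpath 2>/dev/null`.strip
  path.empty? ? '/var/lib/lxc' : path
end

With the base path resolved this way, the generated whitelist patterns also cover containers under a custom lxcpath such as /srv/lxc.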
fgrehm added the ignored label Nov 17, 2022
fgrehm (Owner) commented Nov 17, 2022

Hey, sorry for the silence here, but this project is looking for maintainers 😅

As per #499, I've added the ignored label and will close this issue. Thanks for the interest in the project, and LMK if you want to step up and take ownership of this project on that other issue 👋

fgrehm closed this as completed Nov 17, 2022