Running LocalModel.build_cpr_model returns No such file or directory: 'docker' #2833

veronikayurchuk opened this issue Apr 2, 2024 · 0 comments

I was trying to run notebooks/community/prediction/custom_prediction_routines/SDK_Pytorch_Custom_Predict.ipynb.

At the "Build your custom container" step, the execution of

local_model = LocalModel.build_cpr_model(
    USER_SRC_DIR,
    f"{REGION}-docker.pkg.dev/{PROJECT_ID}/{REPOSITORY}/{IMAGE}",
    predictor=CustomPyTorchPredictor,
    requirements_path=os.path.join(USER_SRC_DIR, "requirements.txt"),
)
raises the error "FileNotFoundError: [Errno 2] No such file or directory: 'docker'".
See the full traceback below.

/opt/conda/lib/python3.10/subprocess.py:955: RuntimeWarning: line buffering (buffering=1) isn't supported in binary mode, the default buffer size will be used
  self.stdin = io.open(p2cwrite, 'wb', bufsize)
/opt/conda/lib/python3.10/subprocess.py:961: RuntimeWarning: line buffering (buffering=1) isn't supported in binary mode, the default buffer size will be used
  self.stdout = io.open(c2pread, 'rb', bufsize)
---------------------------------------------------------------------------
FileNotFoundError                         Traceback (most recent call last)
Cell In[21], line 7
      3 from google.cloud.aiplatform.prediction import LocalModel
      4 from src_dir_pytorch_tracking.predictor import \
      5     CustomPyTorchPredictor  # Update this path as the variable $USER_SRC_DIR to import the custom predictor.
----> 7 local_model = LocalModel.build_cpr_model(
      8     USER_SRC_DIR,
      9     f"{REGION}-docker.pkg.dev/{PROJECT_ID}/{REPOSITORY}/{IMAGE}",
     10     predictor=CustomPyTorchPredictor,
     11     requirements_path=os.path.join(USER_SRC_DIR, "reqs.txt"),
     12 )

File ~/.local/lib/python3.10/site-packages/google/cloud/aiplatform/prediction/local_model.py:381, in LocalModel.build_cpr_model(cls, src_dir, output_image_uri, predictor, handler, base_image, requirements_path, extra_packages, no_cache)
    376     environment_variables["PREDICTOR_CLASS"] = predictor_class
    378 is_prebuilt_prediction_image = helpers.is_prebuilt_prediction_container_uri(
    379     base_image
    380 )
--> 381 _ = build.build_image(
    382     base_image,
    383     src_dir,
    384     output_image_uri,
    385     python_module=_DEFAULT_PYTHON_MODULE,
    386     requirements_path=requirements_path,
    387     extra_requirements=_DEFAULT_SDK_REQUIREMENTS,
    388     extra_packages=extra_packages,
    389     exposed_ports=[DEFAULT_HTTP_PORT],
    390     environment_variables=environment_variables,
    391     pip_command="pip3" if is_prebuilt_prediction_image else "pip",
    392     python_command="python3" if is_prebuilt_prediction_image else "python",
    393     no_cache=no_cache,
    394 )
    396 container_spec = gca_model_compat.ModelContainerSpec(
    397     image_uri=output_image_uri,
    398     predict_route=DEFAULT_PREDICT_ROUTE,
    399     health_route=DEFAULT_HEALTH_ROUTE,
    400 )
    402 return cls(serving_container_spec=container_spec)

File ~/.local/lib/python3.10/site-packages/google/cloud/aiplatform/docker_utils/build.py:533, in build_image(base_image, host_workdir, output_image_name, python_module, requirements_path, extra_requirements, setup_path, extra_packages, container_workdir, container_home, extra_dirs, exposed_ports, pip_command, python_command, no_cache, **kwargs)
    530 joined_command = " ".join(command)
    531 _logger.info("Running command: {}".format(joined_command))
--> 533 return_code = local_util.execute_command(
    534     command,
    535     input_str=dockerfile,
    536 )
    537 if return_code == 0:
    538     return Image(output_image_name, home_dir, work_dir)

File ~/.local/lib/python3.10/site-packages/google/cloud/aiplatform/docker_utils/local_util.py:45, in execute_command(cmd, input_str)
     26 def execute_command(
     27     cmd: List[str],
     28     input_str: Optional[str] = None,
     29 ) -> int:
     30     """Executes commands in subprocess.
     31 
     32     Executes the supplied command with the supplied standard input string, streams
   (...)
     43         Return code of the process.
     44     """
---> 45     with subprocess.Popen(
     46         cmd,
     47         stdin=subprocess.PIPE,
     48         stdout=subprocess.PIPE,
     49         stderr=subprocess.STDOUT,
     50         universal_newlines=False,
     51         bufsize=1,
     52     ) as p:
     53         if input_str:
     54             p.stdin.write(input_str.encode("utf-8"))

File /opt/conda/lib/python3.10/subprocess.py:971, in Popen.__init__(self, args, bufsize, executable, stdin, stdout, stderr, preexec_fn, close_fds, shell, cwd, env, universal_newlines, startupinfo, creationflags, restore_signals, start_new_session, pass_fds, user, group, extra_groups, encoding, errors, text, umask, pipesize)
    967         if self.text_mode:
    968             self.stderr = io.TextIOWrapper(self.stderr,
    969                     encoding=encoding, errors=errors)
--> 971     self._execute_child(args, executable, preexec_fn, close_fds,
    972                         pass_fds, cwd, env,
    973                         startupinfo, creationflags, shell,
    974                         p2cread, p2cwrite,
    975                         c2pread, c2pwrite,
    976                         errread, errwrite,
    977                         restore_signals,
    978                         gid, gids, uid, umask,
    979                         start_new_session)
    980 except:
    981     # Cleanup if the child failed starting.
    982     for f in filter(None, (self.stdin, self.stdout, self.stderr)):

File /opt/conda/lib/python3.10/subprocess.py:1863, in Popen._execute_child(self, args, executable, preexec_fn, close_fds, pass_fds, cwd, env, startupinfo, creationflags, shell, p2cread, p2cwrite, c2pread, c2pwrite, errread, errwrite, restore_signals, gid, gids, uid, umask, start_new_session)
   1861     if errno_num != 0:
   1862         err_msg = os.strerror(errno_num)
-> 1863     raise child_exception_type(errno_num, err_msg, err_filename)
   1864 raise child_exception_type(err_msg)

FileNotFoundError: [Errno 2] No such file or directory: 'docker'
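For what it's worth, the traceback bottoms out in `subprocess.Popen` failing to locate a `docker` executable, so the root cause appears to be that the docker CLI is not on PATH in this environment. A minimal sanity check, standard library only (the helper name is mine, not part of the SDK):

```python
import shutil

def has_cli(name: str) -> bool:
    """Return True if an executable called `name` is on PATH."""
    return shutil.which(name) is not None

# build_cpr_model shells out to the docker CLI, so this must be True
# for the build step to succeed.
print("docker available:", has_cli("docker"))
```

If this prints `False`, the build cannot work regardless of the notebook code.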

Expected Behavior

A successful run of notebooks/community/prediction/custom_prediction_routines/SDK_Pytorch_Custom_Predict.ipynb

Actual Behavior

Error: FileNotFoundError: [Errno 2] No such file or directory: 'docker'

Steps to Reproduce the Problem

  1. Open Workbench or Colab Enterprise
  2. Run notebooks/community/prediction/custom_prediction_routines/SDK_Pytorch_Custom_Predict.ipynb
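The same FileNotFoundError can be reproduced without Vertex AI at all, since `subprocess.Popen` raises it for any command whose executable is missing from PATH. A small sketch (the command name below is a placeholder, not a real binary):

```python
import subprocess

try:
    # Stand-in for the `docker build ...` command the SDK runs internally.
    subprocess.Popen(
        ["docker-not-on-path-example", "build", "."],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
    )
except FileNotFoundError as exc:
    # Same error class and errno as in the traceback above.
    print(exc)
```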

Specifications

  • Version: Python 3.10.13, PyTorch 1.12
  • Platform: Workbench or Colab Enterprise