Running Docker Containers as Jobs¶
You can execute Docker images in Predictive Learning (PrL), just as you can with any model stored in PrL.
About Docker Images¶
Keep the following points in mind when pushing or executing Docker images:
- Do not store access data such as usernames, tokens, and secrets within Docker images, even though the storage location itself is secure.
- The Docker containers you create have limited external connectivity; at run time they can only download Python or R libraries. Include all other data and libraries the job requires in the Docker image itself (see the sketch after this list).
- Jupyter notebooks created in the Predictive Learning environments may not work as is in a customer's local environment.
- Exposing ports from your Docker container has no effect: containers run in isolation, and other components cannot access them.
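Because run-time connectivity is limited, dependencies and reference data should be baked into the image at build time. The following is a minimal sketch of that approach; the base image, library list, and file names (train.py, reference_data/) are illustrative assumptions, not PrL requirements.

# Minimal sketch: bundle everything the job needs while building the image,
# since the running container has only limited external connectivity.
FROM python:3.11-slim

# Install the required libraries at build time.
RUN pip install --no-cache-dir pandas scikit-learn

# Copy the job code and any reference data into the image as well.
COPY train.py /app/train.py
COPY reference_data/ /app/reference_data/

ENTRYPOINT ["python", "/app/train.py"]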
Additional Code Required in Dockerfiles¶
Depending on the type of input your container uses, the Dockerfile used to build it needs a few extra lines so that the environment can stage input data and extract outputs.
For Data Exchange or Data Lake Input¶
The Dockerfile for Data Exchange or Integrated Data Lake input requires the following:
RUN ["mkdir", "/tmp/input"]
RUN ["mkdir", "/tmp/output"]
RUN chmod -R 777 /tmp
For IoT Input¶
The Dockerfile for IoT input requires the following:
RUN ["mkdir", "/tmp/input"]
RUN ["mkdir", "/tmp/output"]
RUN chmod -R 777 /tmp
For PrL Storage Inputs or Outputs¶
The Dockerfile for PrL storage inputs and outputs requires the following:
RUN ["mkdir", "/tmp/input"]
RUN ["mkdir", "/tmp/output"]
RUN chmod -R 777 /tmp