Job Manager Service¶
The Job Manager Service allows customers to execute analytical and data science models, which consist of arbitrary paragraphs of code. Currently, the Job Manager Service only supports Apache Zeppelin virtual notebooks.
The Job Manager Service is currently only available in region Europe 1.
To access this service, you need the respective roles listed in Job Manager roles and scopes.
The Job Manager Service follows an execution pattern, which consists of the following steps:
- Validate the input parameters, checking their existence and accessibility.
- Start the execution environment.
- Prepare input data.
- Execute Zeppelin virtual notebook.
- Store results in the user-defined output directory.
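The five steps above can be sketched as a single job lifecycle function. The helper names and the parameters (`modelId`, `inputDir`, `outputDir`) are illustrative assumptions, not part of the documented API; each step is stubbed so only the control flow is visible.

```python
def validate(params):
    # Step 1: validate input parameters (existence and accessibility).
    for key in ("modelId", "inputDir", "outputDir"):
        if key not in params:
            raise ValueError(f"missing parameter: {key}")

def start_environment(params):
    # Step 2: start the execution environment (stubbed).
    return {"status": "running"}

def prepare_input(params):
    # Step 3: prepare input data for the environment (stubbed).
    return [params["inputDir"] + "/data.csv"]

def execute_notebook(env, inputs):
    # Step 4: execute the Zeppelin virtual notebook (stubbed).
    return {"processed": len(inputs)}

def store_results(results, output_dir):
    # Step 5: store results in the user-defined output directory.
    return {"location": output_dir, "results": results}

def run_job(params):
    validate(params)
    env = start_environment(params)
    inputs = prepare_input(params)
    results = execute_notebook(env, inputs)
    return store_results(results, params["outputDir"])
```

Note that validation happens before any expensive step is started, mirroring the order of the list above.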
Model Management Service¶
The Job Manager Service uses the API of the Model Management Service to access the analytical models to be executed. The respective models must be accessible by the tenant.
Data Exchange Service¶
The Job Manager Service uses the API of the Data Exchange Service to read input data and export output data. The respective locations must be accessible by the tenant.
Predictive Learning Services¶
The Job Manager Service runs the specified jobs in a virtual environment. This environment must be defined using the Predictive Learning services.
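A minimal sketch of how such an environment definition might be assembled on the client side. All field names (`instanceCount`, `runtime`, and so on) are assumptions for illustration and do not reflect the actual Predictive Learning schema.

```python
import json

# Hypothetical required fields for an environment definition;
# the real Predictive Learning schema may differ.
REQUIRED_FIELDS = ("name", "instanceCount", "runtime")

def build_environment(name, instance_count=1, runtime="zeppelin"):
    """Assemble an illustrative environment definition as JSON."""
    env = {"name": name, "instanceCount": instance_count, "runtime": runtime}
    missing = [f for f in REQUIRED_FIELDS if f not in env]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return json.dumps(env)
```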
Apache Zeppelin Notebooks¶
Apache Zeppelin is a virtual notebook environment. Apache Zeppelin notebooks support various interpreters for many languages and frameworks, including Scala, Python and Java. They are useful in at least two general scenarios:
- training a model to obtain an inference model
- performing inference or prediction tasks
Training a model usually requires substantial computation resources such as memory, storage, bandwidth and CPU. The Job Manager Service exposes an API designed to:
- Validate provided input before proceeding with expensive operations
- Perform necessary cleanups regardless of the success or failure of the execution
- Retry expensive operations automatically in case of failure
- Record important outputs for the user to backtrace errors
- Minimize the usage of expensive resources
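The validate-first, retry, and cleanup behaviors listed above can be sketched with two small helpers. Everything here is illustrative client-side logic; the service itself provides these guarantees, and the helper names are assumptions.

```python
import time

def with_retries(operation, attempts=3, delay=0.0):
    """Retry an expensive operation, recording each failure so
    errors can be backtraced later (hypothetical helper)."""
    errors = []
    for attempt in range(1, attempts + 1):
        try:
            return operation(), errors
        except Exception as exc:
            errors.append(f"attempt {attempt}: {exc}")
            time.sleep(delay)
    raise RuntimeError("; ".join(errors))

def run_with_cleanup(operation, cleanup):
    """Run an operation and always perform cleanup,
    regardless of success or failure."""
    try:
        return operation()
    finally:
        cleanup()
```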
Limitations¶
- All input files have to pass the MindSphere Gateway, which has its own limitations.
- Each execution is started in a separate execution environment.
- Setting up the execution environment can take up to 30 minutes. Keep this in mind for time-sensitive prediction or inference tasks.
- The preparation time for input data and results scales linearly with file size.
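Because environment setup alone can take up to 30 minutes, a client should poll the job status with a generous timeout. A minimal sketch, assuming a `get_status` callable and the terminal state names `SUCCEEDED` and `FAILED` (both are assumptions, not the documented status values):

```python
import time

def wait_for_job(get_status, timeout_s=2400, poll_s=30):
    """Poll a job's status until it reaches a terminal state.
    The 40-minute default timeout allows for the up-to-30-minute
    environment startup; both defaults are illustrative."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("SUCCEEDED", "FAILED"):
            return status
        time.sleep(poll_s)
    raise TimeoutError("job did not finish before the deadline")
```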
Example Scenario¶
A developer wants to train an Apache Zeppelin notebook model for anomaly detection. The developer uses the Job Manager Service to (re)train this model.
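A sketch of how a client might assemble the submission body for such a training job, tying together a model from the Model Management Service with input and output locations from the Data Exchange Service. The field names and values are hypothetical, not the documented request schema.

```python
import json

def build_training_job(model_id, input_location, output_location):
    """Assemble an illustrative job-submission body. All field
    names below are assumptions, not the real Job Manager schema."""
    return {
        "modelId": model_id,                # model accessible via Model Management
        "inputLocation": input_location,    # Data Exchange input location
        "outputLocation": output_location,  # Data Exchange output location
        "environmentName": "training-env",  # environment defined in Predictive Learning
    }

body = json.dumps(build_training_job("anomaly-model", "in-folder", "out-folder"))
```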
Except where otherwise noted, content on this site is licensed under the MindSphere Development License Agreement.