Running Jobs on a Workstation
This topic describes the basic usage of the `ml run local` command to schedule jobs on a workstation using MissingLink's Resource Manager.
What is a job?
A MissingLink job is a set of one or more commands.
When you run a command on your computer or server, more than the executed file is involved: environment variables, a code environment, command arguments, and the operating system are all part of executing the command.
A MissingLink job is very similar. For example, when running the command:
$ echo "Hello, $NAME"
You are assuming that all of the following are involved:
- An environment variable (`NAME`).
- Command-line arguments for `echo`.
- Your operating system, which provides the `echo` command and the shell that replaces `$NAME` with its value.
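Run directly in a local shell, the same pieces are easy to see:

```shell
# The environment variable, the argument string, and the shell's
# expansion of $NAME all cooperate to produce the output.
NAME="World"
echo "Hello, $NAME"
# prints: Hello, World
```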
To better understand the principles at work when running jobs with MissingLink Resource Management, familiarize yourself with the basics of Docker.
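Conceptually, such a job is similar to running the command inside a container yourself. A rough, illustrative equivalent (this is plain Docker, not MissingLink itself):

```shell
# --env    -> a container environment variable (-e)
# --image  -> the Docker image to run (here, bash)
# --command -> the command executed inside the container
docker run --rm -e NAME="World!" bash bash -c 'echo "Hello $NAME"'
```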
To illustrate, examine a simple "hello world" job:
$ ml run local --command 'echo "Hello $NAME"' --env NAME "World!" --image bash
Flags for the `ml run local` command
The following describes the flags that are available for configuring general settings, more advanced settings, and authentication to cloud providers. The flags are available for both the `ml run local` and `ml run xp` commands.
- `--command` specifies a single command to be executed.
- `--env` specifies a single environment variable. See Environment variables.
- `--image` specifies the Docker image to be used for this job. In this case, all that is needed is the `bash` image. MissingLink provides a few images that contain practically everything you need for machine learning.
- `--project` specifies where this job should be scheduled and ensures that experiments run from this job are assigned to the desired project in MissingLink. This parameter, together with the organization, can be set to a default value so that you do not need to provide it on every command. For more information, see Setting Defaults.
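Putting these flags together, a full invocation might look like the following sketch (the project name `my-project` is a placeholder, not a real project):

```shell
$ ml run local \
    --command 'echo "Hello $NAME"' \
    --env NAME "World!" \
    --image bash \
    --project my-project
```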
Flags for more advanced settings
There are more advanced features available with the `ml run local` command for running jobs on your workstation. Use of these flags is described in:
- Using Recipe Files and Python Requirements Files
- Specifying Inputs and Outputs for Jobs
- Confidential Data
- Git Access Policies
Flags for authentication to cloud providers
If your data is stored in the cloud, you must authenticate the job with your cloud credentials. For more information, see Allowing Access in Hybrid Cloud.
When using Resource Management commands, you can set default organization and project values by running:
$ ml defaults set org MY_ORG
$ ml defaults set project MY_PROJECT
Once you have set the defaults, you can shorten the hello world example above to:
$ ml run xp --command 'echo "Hello $NAME"' --env NAME "World!" --image bash
From this point on, the organization and project parameters do not need to be added; they are taken from the defaults.