types
Argument
name
: string
Argument name.
value
: string
Argument value.
CloudInstanceType
name
: string
Name of DataSphere VM configuration.
DockerImageSpec
imageUrl
: string
Docker image URL.
username
: string
Username for container registry.
One of password
Password for container registry.
passwordPlainText
: string
Plaintext password.
passwordDsSecretName
: string
ID of the DataSphere secret containing the password.
Environment
vars
: string
Environment variables.
One of dockerImage
pythonEnv
: PythonEnv
ExtendedWorkingStorage
Extended working storage configuration.
StorageType
STORAGE_TYPE_UNSPECIFIED
SSD
type
: StorageType
Storage type.
sizeGb
: int64
Storage size in gigabytes.
File
desc
: FileDesc
sha256
: string
SHA256 of the file.
sizeBytes
: int64
File size in bytes.
compressionType
: FileCompressionType
File compression info.
FileDesc
path
: string
Path of the file on the filesystem.
var
: string
Variable to use in cmd substitution.
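The var field lets the run command refer to a file symbolically; a sketch of that substitution, assuming a ${VAR}-style placeholder syntax (the real syntax may differ):

```python
def substitute_vars(cmd: str, file_descs: list[dict]) -> str:
    """Replace ${var} placeholders in cmd with each FileDesc's path
    (illustrative; the actual placeholder syntax may differ)."""
    for desc in file_descs:
        if "var" in desc:
            cmd = cmd.replace("${%s}" % desc["var"], desc["path"])
    return cmd

cmd = substitute_vars(
    "python ${MAIN} --data ${DATA}",
    [{"path": "train.py", "var": "MAIN"},
     {"path": "data/input.csv", "var": "DATA"}],
)
# cmd == "python train.py --data data/input.csv"
```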
FileUploadError
One of fileType
outputFileDesc
: FileDesc
logFileName
: string
description
: string
Error description.
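The fileType oneof identifies which file the upload error belongs to; a sketch of reading it (failed_file_name and the sample record are hypothetical):

```python
def failed_file_name(err: dict) -> str:
    """Return the failed file's identity from the fileType oneof:
    an output file's path, or otherwise a log file's name."""
    if "outputFileDesc" in err:
        return err["outputFileDesc"]["path"]
    return err["logFileName"]

name = failed_file_name({"logFileName": "stdout.log",
                         "description": "upload timed out"})  # example error
```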
GracefulShutdownParameters
timeout
: google.protobuf.Duration
signal
: int64
default 15 (SIGTERM)
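The default signal value 15 corresponds to SIGTERM, which can be confirmed with Python's standard signal module (the record values are illustrative):

```python
import signal

graceful_shutdown = {"timeout": "30s", "signal": 15}  # illustrative values
sig = signal.Signals(graceful_shutdown["signal"])
# sig.name == "SIGTERM" on POSIX systems
```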
Job
Instance of the job.
id
: string
ID of the job.
name
: string
Name of the job.
desc
: string
Description of the job.
createdAt
: google.protobuf.Timestamp
Job creation timestamp.
finishedAt
: google.protobuf.Timestamp
Job finish timestamp.
status
: JobStatus
Status of the job.
config
: string
Config of the job, copied from the configuration file.
createdById
: string
ID of the user who created the job.
projectId
: string
ID of the project.
jobParameters
: JobParameters
dataExpiresAt
: google.protobuf.Timestamp
Job data expiration timestamp.
dataCleared
: bool
Marks if the job data has been cleared.
outputFiles
: File
Output files of the job.
logFiles
: File
Job log files.
diagnosticFiles
: File
Job diagnostics files.
dataSizeBytes
: int64
Job total data size.
startedAt
: google.protobuf.Timestamp
Job start timestamp.
statusDetails
: string
Details of the job status.
actualCloudInstanceType
: CloudInstanceType
Actual VM instance type the job is running on.
parentJobId
: string
Reference to the parent job.
fileErrors
: FileUploadError
Failed uploads.
outputDatasets
: OutputDataset
Created datasets.
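With startedAt and finishedAt both present, a job's wall-clock runtime is their difference; a sketch using RFC 3339 strings standing in for google.protobuf.Timestamp, with made-up values:

```python
from datetime import datetime

def runtime_seconds(job: dict) -> float:
    """Seconds between startedAt and finishedAt (RFC 3339 strings
    here, standing in for google.protobuf.Timestamp values)."""
    parse = lambda s: datetime.fromisoformat(s.replace("Z", "+00:00"))
    return (parse(job["finishedAt"]) - parse(job["startedAt"])).total_seconds()

job = {  # minimal Job-like record with made-up values
    "id": "job-id",
    "startedAt": "2024-01-01T10:01:30Z",
    "finishedAt": "2024-01-01T10:11:30Z",
}
seconds = runtime_seconds(job)  # 600.0
```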
JobParameters
Job parameters.
inputFiles
: File
List of input files.
outputFiles
: FileDesc
List of output files descriptions.
s3MountIds
: string
List of DataSphere S3 mount IDs.
datasetIds
: string
List of DataSphere dataset IDs.
cmd
: string
Job run command.
env
: Environment
Job environment description.
attachProjectDisk
: bool
Whether the project disk should be attached to the VM.
cloudInstanceTypes
: CloudInstanceType
VM specification.
extendedWorkingStorage
: ExtendedWorkingStorage
Extended working storage configuration.
arguments
: Argument
List of literal arguments.
outputDatasets
: OutputDatasetDesc
List of descriptions of datasets to create.
gracefulShutdownParameters
: GracefulShutdownParameters
Graceful shutdown settings.
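Putting the fields together, a minimal JobParameters payload might look like the following sketch (all values, including the instance configuration name and the ${MODEL} placeholder, are made-up examples):

```python
job_parameters = {
    "inputFiles": [{"desc": {"path": "train.py"}}],
    "outputFiles": [{"path": "model.bin", "var": "MODEL"}],
    "cmd": "python train.py --out ${MODEL}",
    "env": {
        "pythonEnv": {
            "pythonVersion": "3.10",
            "requirements": ["numpy>=1.24"],
        }
    },
    "attachProjectDisk": False,
    "cloudInstanceTypes": [{"name": "c1.4"}],  # made-up configuration name
    "gracefulShutdownParameters": {"timeout": "30s", "signal": 15},
}
```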
JobResult
returnCode
: int64
Execution return code.
OutputDataset
desc
: OutputDatasetDesc
Dataset description.
id
: string
ID of the created dataset.
OutputDatasetDesc
name
: string
Name to create the dataset with.
description
: string
Description to show in the UI.
labels
: string
Dataset labels.
sizeGb
: int64
Size of the dataset to create.
var
: string
Variable name to substitute in cmd, as in FileDesc.
PipOptions
indexUrl
: string
--index-url option.
extraIndexUrls
: string
--extra-index-url option (one per value).
trustedHosts
: string
--trusted-host option (one per value).
noDeps
: bool
--no-deps option.
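A sketch of rendering PipOptions into pip command-line arguments (pip_args is a hypothetical helper; the repeated fields emit one flag per value):

```python
def pip_args(opts: dict) -> list[str]:
    """Render a PipOptions record into pip command-line arguments
    (hypothetical helper, not part of the API)."""
    args = []
    if "indexUrl" in opts:
        args += ["--index-url", opts["indexUrl"]]
    for url in opts.get("extraIndexUrls", []):
        args += ["--extra-index-url", url]
    for host in opts.get("trustedHosts", []):
        args += ["--trusted-host", host]
    if opts.get("noDeps"):
        args.append("--no-deps")
    return args

args = pip_args({"indexUrl": "https://pypi.org/simple", "noDeps": True})
# args == ["--index-url", "https://pypi.org/simple", "--no-deps"]
```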
PythonEnv
condaYaml
: string
Conda YAML.
localModules
: File
List of local modules descriptions.
pythonVersion
: string
Python version, reduced to major.minor.
requirements
: string
List of pip requirements.
pipOptions
: PipOptions
Pip install options.
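The pythonVersion field carries only the major.minor part; reducing a full version string is straightforward (reduce_version is an illustrative helper):

```python
def reduce_version(full: str) -> str:
    """Reduce a full Python version like '3.10.12' to the major.minor
    form the pythonVersion field expects."""
    major, minor = full.split(".")[:2]
    return f"{major}.{minor}"

version = reduce_version("3.10.12")  # "3.10"
```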
StorageFile
file
: File
url
: string
File URL.