List
Lists jobs.
TypeScript

```typescript
import { cloudApi, serviceClients, Session } from "@yandex-cloud/nodejs-sdk";

const ListProjectJobRequest =
  cloudApi.datasphere.jobs_project_job_service.ListProjectJobRequest;

(async () => {
  const authToken = process.env["YC_OAUTH_TOKEN"];
  const session = new Session({ oauthToken: authToken });
  const client = session.client(serviceClients.ProjectJobServiceClient);

  const result = await client.list(
    ListProjectJobRequest.fromPartial({
      // projectId: "projectId",
      // pageSize: 0,
      // pageToken: "pageToken",
      // filter: "filter"
    })
  );
  console.log(result);
})();
```
Python

```python
import os

import grpc
import yandexcloud
from yandex.cloud.datasphere.v2.jobs.project_job_service_pb2 import ListProjectJobRequest
from yandex.cloud.datasphere.v2.jobs.project_job_service_pb2_grpc import ProjectJobServiceStub

token = os.getenv("YC_OAUTH_TOKEN")
sdk = yandexcloud.SDK(token=token)
service = sdk.client(ProjectJobServiceStub)

response = service.List(
    ListProjectJobRequest(
        # project_id = "projectId",
        # page_size = 0,
        # page_token = "pageToken",
        # filter = "filter"
    )
)
print(response)
```
ListProjectJobRequest
projectId : string
ID of the project.
pageSize : int64
The maximum number of results per page to return. If the number of available results is larger than page_size, the service returns a ListProjectJobResponse.page_token that can be used to get the next page of results in subsequent list requests.
pageToken : string
Page token. To get the next page of results, set page_token to the ListProjectJobResponse.page_token returned by a previous list request.
filter : string
Restrictions:
- only the status field is supported;
- only the IN operator is supported.
Example (only running jobs): "status IN (EXECUTING, UPLOADING_OUTPUT)".
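As a sketch of how these request fields fit together (reusing the `service` stub and `ListProjectJobRequest` import from the Python example above; the project ID is a placeholder), a request for only running jobs might look like this:

```python
# A minimal sketch: list only running jobs.
# "<your-project-id>" is a placeholder, not a real project ID.
running = service.List(
    ListProjectJobRequest(
        project_id="<your-project-id>",
        page_size=50,
        # filter supports only the status field and the IN operator
        filter="status IN (EXECUTING, UPLOADING_OUTPUT)",
    )
)
print(running.jobs)
```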
ListProjectJobResponse
jobs : Job
Instances of the jobs.
nextPageToken : string
This token allows you to get the next page of results for list requests. If the number of results is larger than ListProjectJobRequest.page_size, use the next_page_token as the value for the ListProjectJobRequest.page_token query parameter in the next list request. Each subsequent list request will have its own page_token to continue paging through the results.
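A minimal pagination sketch, again assuming the `service` stub from the Python example above and a placeholder project ID: keep passing next_page_token back as page_token until it comes back empty.

```python
# Sketch: page through all jobs of a project.
page_token = ""
all_jobs = []
while True:
    response = service.List(
        ListProjectJobRequest(
            project_id="<your-project-id>",  # placeholder
            page_size=100,
            page_token=page_token,
        )
    )
    all_jobs.extend(response.jobs)
    page_token = response.next_page_token
    if not page_token:  # empty token: no more pages
        break
print(f"total jobs: {len(all_jobs)}")
```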
Job
Instance of the job.
id : string
ID of the job.
name : string
Name of the job.
desc : string
Description of the job.
createdAt : google.protobuf.Timestamp
Job creation timestamp.
finishedAt : google.protobuf.Timestamp
Job finish timestamp.
status : JobStatus
Status of the job.
config : string
Config of the job, copied from configuration file.
createdById : string
ID of the user who created the job.
projectId : string
ID of the project.
jobParameters : JobParameters
Job parameters.
dataExpiresAt : google.protobuf.Timestamp
Job data expiration timestamp.
dataCleared : bool
Marks if the job data has been cleared.
outputFiles : File
Output files of the job.
logFiles : File
Job log files.
diagnosticFiles : File
Job diagnostics files.
dataSizeBytes : int64
Job total data size.
startedAt : google.protobuf.Timestamp
Job start timestamp.
statusDetails : string
Details.
actualCloudInstanceType : CloudInstanceType
Actual VM instance type the job is running on.
parentJobId : string
Reference to the parent job.
fileErrors : FileUploadError
Failed uploads.
outputDatasets : OutputDataset
Created datasets.
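As a rough illustration of reading these fields (using the `response` from the Python example above; in Python the fields are accessed by their snake_case protobuf names):

```python
# Sketch: print a few fields of each listed job.
for job in response.jobs:
    created = job.created_at.ToDatetime()  # google.protobuf.Timestamp
    print(f"{job.id}  {job.name}  status={job.status}  created={created}")
    if job.data_cleared:
        print("  job data has already been cleared")
```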
JobParameters
Job parameters.
inputFiles : File
List of input files.
outputFiles : FileDesc
List of output file descriptions.
s3MountIds : string
List of DataSphere S3 mount IDs.
datasetIds : string
List of DataSphere dataset IDs.
cmd : string
Job run command.
env : Environment
Job environment description.
attachProjectDisk : bool
Whether the project disk should be attached to the VM.
cloudInstanceTypes : CloudInstanceType
VM specification.
extendedWorkingStorage : ExtendedWorkingStorage
Extended working storage configuration.
arguments : Argument
List of literal arguments.
outputDatasets : OutputDatasetDesc
List of dataset descriptions to create.
gracefulShutdownParameters : GracefulShutdownParameters
Graceful shutdown settings.
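A short sketch of inspecting these parameters on a listed job, assuming `job` is one of the Job instances returned above (field access uses the snake_case protobuf names of the fields described here):

```python
# Sketch: inspect the parameters a job was submitted with.
params = job.job_parameters
print("cmd:", params.cmd)
print("attach project disk:", params.attach_project_disk)
for instance_type in params.cloud_instance_types:
    print("requested VM configuration:", instance_type.name)
for input_file in params.input_files:
    print("input:", input_file.desc.path, "-", input_file.size_bytes, "bytes")
```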
File
desc : FileDesc
sha256 : string
SHA256 of the file.
sizeBytes : int64
File size in bytes.
compressionType : FileCompressionType
File compression info.
CloudInstanceType
name : string
Name of DataSphere VM configuration.
FileUploadError
One of fileType
outputFileDesc : FileDesc
logFileName : string
description : string
OutputDataset
desc : OutputDatasetDesc
Dataset description.
id : string
ID of the created dataset.
FileDesc
path : string
Path of the file on the filesystem.
var : string
Variable to use in cmd substitution.
Environment
vars : string
Environment variables.
One of dockerImage
dockerImageResourceId : string
dockerImageSpec : DockerImageSpec
pythonEnv : PythonEnv
ExtendedWorkingStorage
Extended working storage configuration.
StorageType
- STORAGE_TYPE_UNSPECIFIED
- SSD
type : StorageType
sizeGb : int64
Argument
name : string
value : string
OutputDatasetDesc
name : string
Name to create the dataset with.
description : string
Description to show in the UI.
labels : string
sizeGb : int64
Size of the dataset to create.
var : string
Variable name to substitute in cmd, as in FileDesc.
GracefulShutdownParameters
timeout : google.protobuf.Duration
signal : int64
Defaults to 15 (SIGTERM).
DockerImageSpec
imageUrl : string
Docker image URL.
username : string
Username for container registry.
One of password
Password for container registry.
passwordPlainText : string
Plaintext password.
passwordDsSecretName : string
ID of DataSphere secret containing password.
PythonEnv
condaYaml : string
Conda YAML.
localModules : File
List of local modules descriptions.
pythonVersion : string
Python version reduced to major.minor.
requirements : string
List of pip requirements.
pipOptions : PipOptions
Pip install options.
PipOptions
indexUrl : string
--index-url option.
extraIndexUrls : string
--extra-index-url options.
trustedHosts : string
--trusted-host options.
noDeps : bool
--no-deps option.
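For orientation only, a hypothetical sketch (not SDK code) of how these fields map onto pip command-line flags; the `pip_flags` helper and its arguments are made up for illustration:

```python
# Hypothetical helper: translate PipOptions-style values into pip flags.
def pip_flags(index_url="", extra_index_urls=(), trusted_hosts=(), no_deps=False):
    flags = []
    if index_url:
        flags += ["--index-url", index_url]
    for url in extra_index_urls:
        flags += ["--extra-index-url", url]
    for host in trusted_hosts:
        flags += ["--trusted-host", host]
    if no_deps:
        flags.append("--no-deps")
    return flags

print(pip_flags(index_url="https://pypi.org/simple", no_deps=True))
```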