Module google_datapipelines1::api
Structs
Central instance for accessing all Datapipelines-related resource activities.
Pipeline job details specific to the Dataflow API. These are encapsulated here so that additional executors can store their specific details separately.
The environment values to be set at runtime for a Flex Template.
Definition of the job information maintained by the pipeline. Fields in this entity are retrieved from the executor API (e.g. Dataflow API).
Launch Flex Template parameter.
A request to launch a Dataflow job from a Flex Template.
Parameters to provide to the template being launched.
A request to launch a template.
Response message for ListJobs.
Response message for ListPipelines.
The main pipeline entity and all the necessary metadata for launching and managing linked jobs.
Request message for RunPipeline.
Response message for RunPipeline.
The environment values to set at runtime.
Details of the schedule the pipeline runs on.
The version of the SDK used to run the job.
Request message for StopPipeline.
Workload details for creating the pipeline jobs.
A generic empty message that you can re-use to avoid defining duplicated empty messages in your APIs. A typical example is to use it as the request or the response type of an API method. For instance: service Foo { rpc Bar(google.protobuf.Empty) returns (google.protobuf.Empty); }. The JSON representation of Empty is an empty JSON object, {}.
The Status type defines a logical error model that is suitable for different programming environments, including REST APIs and RPC APIs. It is used by gRPC. Each Status message contains three pieces of data: error code, error message, and error details. You can find out more about this error model and how to work with it in the API Design Guide.
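The message structs above are plain data types: every field is an Option and each struct implements Default, so a request body can be built up from the empty message. A minimal sketch; the GoogleCloudDatapipelinesV1Pipeline name, its field spellings, and the type_ escaping follow the usual google-apis-rs code generation and are assumptions here:

```rust
use google_datapipelines1::api::GoogleCloudDatapipelinesV1Pipeline;

fn main() {
    // Start from the empty message and set only the fields the call needs.
    // Field names are the snake_case forms of the JSON fields; the reserved
    // word `type` is escaped as `type_` by the code generator.
    let pipeline = GoogleCloudDatapipelinesV1Pipeline {
        name: Some(
            "projects/my-project/locations/us-central1/pipelines/my-pipeline".to_string(),
        ),
        display_name: Some("my-pipeline".to_string()),
        type_: Some("PIPELINE_TYPE_BATCH".to_string()),
        ..Default::default()
    };
    println!("{:?}", pipeline);
}
```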
Lists pipelines. Returns a “FORBIDDEN” error if the caller doesn’t have permission to access it.
Creates a pipeline. For a batch pipeline, you can pass scheduler information. Data Pipelines uses the scheduler information to create an internal scheduler that runs jobs periodically. If the internal scheduler is not configured, you can use RunPipeline to run jobs.
Deletes a pipeline. If a scheduler job is attached to the pipeline, it will be deleted.
Looks up a single pipeline. Returns a “NOT_FOUND” error if no such pipeline exists. Returns a “FORBIDDEN” error if the caller doesn’t have permission to access it.
Lists jobs for a given pipeline. Returns a “FORBIDDEN” error if the caller doesn’t have permission to access it.
Updates a pipeline. If successful, the updated Pipeline is returned. Returns a “NOT_FOUND” error if the pipeline doesn’t exist. If UpdatePipeline does not return successfully, you can retry the request until you receive a successful response.
Creates a job for the specified pipeline directly. You can use this method when the internal scheduler is not configured and you want to trigger the job directly or through an external system. Returns a “NOT_FOUND” error if the pipeline doesn’t exist. Returns a “FORBIDDEN” error if the user doesn’t have permission to access the pipeline or run jobs for the pipeline.
Freezes pipeline execution permanently. If there’s a corresponding scheduler entry, it’s deleted, and the pipeline state is changed to “ARCHIVED”. However, pipeline metadata is retained.
A builder providing access to all methods supported on project resources. It is not used directly, but through the Datapipelines hub.
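All of the call builders listed above are reached through that hub. A minimal sketch of constructing the hub and listing pipelines, assuming the standard google-apis-rs setup for this crate; the locations_list_pipelines method name is derived from the REST method projects.locations.listPipelines, and the parent path is a placeholder:

```rust
use google_datapipelines1::{hyper, hyper_rustls, oauth2, Datapipelines};

#[tokio::main]
async fn main() {
    // Authenticate with an installed-flow client secret (one of several
    // supported oauth2 flows); a real secret would be read from disk.
    let secret: oauth2::ApplicationSecret = Default::default();
    let auth = oauth2::InstalledFlowAuthenticator::builder(
        secret,
        oauth2::InstalledFlowReturnMethod::HTTPRedirect,
    )
    .build()
    .await
    .unwrap();

    // The hub is the central instance; every call goes through it.
    let hub = Datapipelines::new(
        hyper::Client::builder().build(
            hyper_rustls::HttpsConnectorBuilder::new()
                .with_native_roots()
                .https_or_http()
                .enable_http1()
                .build(),
        ),
        auth,
    );

    // List pipelines under a parent; this returns a "FORBIDDEN" error if
    // the caller doesn't have permission to access them.
    match hub
        .projects()
        .locations_list_pipelines("projects/my-project/locations/us-central1")
        .doit()
        .await
    {
        Ok((_response, list)) => println!("pipelines: {:?}", list.pipelines),
        Err(e) => eprintln!("error: {}", e),
    }
}
```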
Enums
Identifies an OAuth2 authorization scope. A scope is needed when requesting an authorization token.
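Data Pipelines only defines the broad cloud-platform scope. A minimal sketch, assuming the generated enum exposes it as Scope::CloudPlatform and implements AsRef&lt;str&gt;, as in other google-apis-rs crates:

```rust
use google_datapipelines1::Scope;

fn main() {
    // The variant maps to the scope URL sent during token acquisition.
    assert_eq!(
        Scope::CloudPlatform.as_ref(),
        "https://www.googleapis.com/auth/cloud-platform"
    );
}
```

On a call builder, a scope is attached with .add_scope(Scope::CloudPlatform) before .doit(); if none is set, the generated builders typically fall back to a default scope for the method.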