pipelines
Creates, updates, deletes, gets or lists a pipelines resource.
Overview
Name | pipelines |
Type | Resource |
Id | databricks_workspace.deltalivetables.pipelines |
Fields
The following fields are returned by SELECT queries:
- get
Name | Datatype | Description |
---|---|---|
name | string | |
cluster_id | string | |
pipeline_id | string | |
creator_user_name | string | |
run_as_user_name | string | |
latest_updates | array | |
spec | object | |
state | string | |
- listpipelines
Name | Datatype | Description |
---|---|---|
name | string | |
pipeline_id | string | |
creator_user_name | string | |
latest_updates | array | |
state | string | |
Methods
The following methods are available for this resource:
Name | Accessible by | Required Params | Optional Params | Description |
---|---|---|---|---|
get | select | deployment_name | | |
listpipelines | select | deployment_name | | Lists pipelines defined in the Delta Live Tables system. |
create | insert | deployment_name | | Creates a new data processing pipeline based on the requested configuration. If successful, this method returns the ID of the new pipeline. |
update | update | deployment_name | | Updates a pipeline with the supplied configuration. |
delete | delete | deployment_name | | Deletes a pipeline. |
stop | exec | deployment_name | | Stops the pipeline by canceling the active update. If there is no active update for the pipeline, this request is a no-op. |
Parameters
Parameters can be passed in the WHERE clause of a query. Check the Methods section to see which parameters are required or optional for each operation.
Name | Datatype | Description |
---|---|---|
deployment_name | string | The Databricks Workspace Deployment Name (default: dbc-abcd0123-a1bc) |
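For example, any of the queries below can be scoped to a specific workspace by setting deployment_name in the WHERE clause. The following is a minimal sketch, assuming the documented default deployment name and that standard SQL aggregates apply to the returned rows:
SELECT COUNT(*) AS pipeline_count
FROM databricks_workspace.deltalivetables.pipelines
WHERE deployment_name = 'dbc-abcd0123-a1bc' -- required
;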
SELECT examples
- get
No description available.
SELECT
name,
cluster_id,
pipeline_id,
creator_user_name,
run_as_user_name,
latest_updates,
spec,
state
FROM databricks_workspace.deltalivetables.pipelines
WHERE deployment_name = '{{ deployment_name }}' -- required
;
- listpipelines
Lists pipelines defined in the Delta Live Tables system.
SELECT
name,
pipeline_id,
creator_user_name,
latest_updates,
state
FROM databricks_workspace.deltalivetables.pipelines
WHERE deployment_name = '{{ deployment_name }}' -- required
;
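Other returned columns can be combined with the required parameter to narrow the result set. A minimal sketch, assuming 'RUNNING' is a state value of interest (the extra filter is illustrative):
SELECT
name,
pipeline_id,
state
FROM databricks_workspace.deltalivetables.pipelines
WHERE deployment_name = '{{ deployment_name }}' -- required
AND state = 'RUNNING'
;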
INSERT examples
- create
Creates a new data processing pipeline based on the requested configuration. If successful, this method returns the ID of the new pipeline.
INSERT INTO databricks_workspace.deltalivetables.pipelines (
data__id,
data__name,
data__storage,
data__target,
data__schema,
data__continuous,
data__development,
data__photon,
data__edition,
data__channel,
data__catalog,
data__serverless,
data__allow_duplicate_names,
data__dry_run,
data__configuration,
data__clusters,
data__libraries,
data__trigger,
data__filters,
data__notifications,
data__deployment,
data__ingestion_definition,
deployment_name
)
SELECT
'{{ id }}',
'{{ name }}',
'{{ storage }}',
'{{ target }}',
'{{ schema }}',
{{ continuous }},
{{ development }},
{{ photon }},
'{{ edition }}',
'{{ channel }}',
'{{ catalog }}',
{{ serverless }},
{{ allow_duplicate_names }},
{{ dry_run }},
'{{ configuration }}',
'{{ clusters }}',
'{{ libraries }}',
'{{ trigger }}',
'{{ filters }}',
'{{ notifications }}',
'{{ deployment }}',
'{{ ingestion_definition }}',
'{{ deployment_name }}'
RETURNING
pipeline_id
;
- Manifest
# Description fields are for documentation purposes
- name: pipelines
  props:
    - name: deployment_name
      value: string
      description: Required parameter for the pipelines resource.
    - name: id
      value: string
    - name: name
      value: string
    - name: storage
      value: string
    - name: target
      value: string
    - name: schema
      value: string
    - name: continuous
      value: boolean
    - name: development
      value: boolean
    - name: photon
      value: boolean
    - name: edition
      value: string
    - name: channel
      value: string
    - name: catalog
      value: string
    - name: serverless
      value: boolean
    - name: allow_duplicate_names
      value: boolean
    - name: dry_run
      value: boolean
    - name: configuration
      value: object
    - name: clusters
      value: Array of object
    - name: libraries
      value: Array of object
    - name: trigger
      value: object
    - name: filters
      value: object
    - name: notifications
      value: Array of object
    - name: deployment
      value: object
    - name: ingestion_definition
      value: object
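In practice only a subset of these properties needs to be supplied on create. The following is a minimal, illustrative sketch; the pipeline name, storage path, and notebook library shown are assumptions, and complex fields such as libraries are passed as JSON-encoded strings, as in the full template above:
INSERT INTO databricks_workspace.deltalivetables.pipelines (
data__name,
data__storage,
data__development,
data__libraries,
deployment_name
)
SELECT
'my_dlt_pipeline', -- illustrative pipeline name
'dbfs:/pipelines/my_dlt_pipeline', -- illustrative storage root
true, -- run the pipeline in development mode
'[{"notebook": {"path": "/Users/someone@example.com/dlt_notebook"}}]', -- illustrative notebook library (JSON string)
'{{ deployment_name }}'
RETURNING
pipeline_id
;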
UPDATE examples
- update
Updates a pipeline with the supplied configuration.
UPDATE databricks_workspace.deltalivetables.pipelines
SET
data__id = '{{ id }}',
data__name = '{{ name }}',
data__storage = '{{ storage }}',
data__target = '{{ target }}',
data__schema = '{{ schema }}',
data__continuous = {{ continuous }},
data__development = {{ development }},
data__photon = {{ photon }},
data__edition = '{{ edition }}',
data__channel = '{{ channel }}',
data__catalog = '{{ catalog }}',
data__serverless = {{ serverless }},
data__pipeline_id = '{{ pipeline_id }}',
data__allow_duplicate_names = {{ allow_duplicate_names }},
data__expected_last_modified = {{ expected_last_modified }},
data__configuration = '{{ configuration }}',
data__clusters = '{{ clusters }}',
data__libraries = '{{ libraries }}',
data__trigger = '{{ trigger }}',
data__filters = '{{ filters }}',
data__notifications = '{{ notifications }}',
data__deployment = '{{ deployment }}',
data__ingestion_definition = '{{ ingestion_definition }}'
WHERE
deployment_name = '{{ deployment_name }}' -- required
;
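The same statement shape can be used to change only a subset of writable fields. A minimal sketch that carries the pipeline identifier from the full template and toggles development mode; which fields the service expects together is not restated here, so treat the selection as illustrative:
UPDATE databricks_workspace.deltalivetables.pipelines
SET
data__pipeline_id = '{{ pipeline_id }}',
data__development = true
WHERE
deployment_name = '{{ deployment_name }}' -- required
;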
DELETE examples
- delete
Deletes a pipeline.
DELETE FROM databricks_workspace.deltalivetables.pipelines
WHERE deployment_name = '{{ deployment_name }}' -- required
;
Lifecycle Methods
- stop
Stops the pipeline by canceling the active update. If there is no active update for the pipeline, this request is a no-op.
EXEC databricks_workspace.deltalivetables.pipelines.stop
@deployment_name='{{ deployment_name }}' -- required
;