
# feature_kafka_configs

Creates, updates, deletes, gets, or lists a `feature_kafka_configs` resource.

## Overview

| Name | Type | Id |
|------|------|----|
| feature_kafka_configs | Resource | databricks_workspace.ml.feature_kafka_configs |

## Fields

The following fields are returned by `SELECT` queries:

| Name | Datatype | Description |
|------|----------|-------------|
| name | string | Name of the Kafka config. |
| auth_config | object | Authentication configuration for connection to topics. |
| backfill_source | object | A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward filling data. The schema for this source must match exactly that of the key and value schemas specified for this Kafka config. |
| bootstrap_servers | string | A comma-separated list of host/port pairs pointing to the Kafka cluster. |
| extra_options | object | Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (`kafka.*`). |
| key_schema | object | Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided. |
| subscription_mode | object | Options to configure which Kafka topics to pull data from. |
| value_schema | object | Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided. |

## Methods

The following methods are available for this resource:

| Name | Accessible by | Required Params | Optional Params | Description |
|------|---------------|-----------------|-----------------|-------------|
| get | select | name, deployment_name | | Get a Kafka config. |
| list | select | deployment_name | page_size, page_token | List Kafka configs. |
| create | insert | deployment_name, kafka_config | | Create a Kafka config. |
| update | update | name, update_mask, deployment_name, kafka_config | | Update a Kafka config. |
| delete | delete | name, deployment_name | | Delete a Kafka config. |

## Parameters

Parameters can be passed in the `WHERE` clause of a query. Check the Methods section to see which parameters are required or optional for each operation.

| Name | Datatype | Description |
|------|----------|-------------|
| deployment_name | string | The Databricks Workspace Deployment Name (default: dbc-abcd0123-a1bc) |
| name | string | Name of the Kafka config. |
| update_mask | object | The list of fields to update. |
| page_size | integer | The maximum number of results to return. |
| page_token | string | Pagination token to go to the next page based on a previous query. |

## SELECT examples

Get a Kafka config. During PrPr, Kafka configs can be read and used when creating features under the

```sql
SELECT
  name,
  auth_config,
  backfill_source,
  bootstrap_servers,
  extra_options,
  key_schema,
  subscription_mode,
  value_schema
FROM databricks_workspace.ml.feature_kafka_configs
WHERE name = '{{ name }}' -- required
  AND deployment_name = '{{ deployment_name }}' -- required
;
```

## INSERT examples

Create a Kafka config. During PrPr, Kafka configs can be read and used when creating features under

```sql
INSERT INTO databricks_workspace.ml.feature_kafka_configs (
  kafka_config,
  deployment_name
)
SELECT
  '{{ kafka_config }}' /* required */,
  '{{ deployment_name }}'
RETURNING
  name,
  auth_config,
  backfill_source,
  bootstrap_servers,
  extra_options,
  key_schema,
  subscription_mode,
  value_schema
;
```
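The `kafka_config` parameter is an object whose top-level keys appear to mirror the Fields table above. A minimal sketch with an inline payload; the nested shapes of `auth_config`, `subscription_mode`, and the schema objects are not documented here and are left as template placeholders:

```sql
-- Sketch only: top-level keys taken from the Fields table above; the
-- extra_options entry shows a standard Kafka consumer option (kafka.*).
INSERT INTO databricks_workspace.ml.feature_kafka_configs (
  kafka_config,
  deployment_name
)
SELECT
  '{
    "bootstrap_servers": "host1:9092,host2:9092",
    "subscription_mode": {{ subscription_mode }},
    "value_schema": {{ value_schema }},
    "extra_options": {"kafka.security.protocol": "SSL"}
  }' /* required */,
  '{{ deployment_name }}'
RETURNING
  name,
  bootstrap_servers
;
```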

## UPDATE examples

Update a Kafka config. During PrPr, Kafka configs can be read and used when creating features under

```sql
UPDATE databricks_workspace.ml.feature_kafka_configs
SET
  kafka_config = '{{ kafka_config }}'
WHERE
  name = '{{ name }}' -- required
  AND update_mask = '{{ update_mask }}' -- required
  AND deployment_name = '{{ deployment_name }}' -- required
  AND kafka_config = '{{ kafka_config }}' -- required
RETURNING
  name,
  auth_config,
  backfill_source,
  bootstrap_servers,
  extra_options,
  key_schema,
  subscription_mode,
  value_schema
;
```
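Updates are partial: only the fields named in `update_mask` are changed. A sketch of rotating brokers while leaving schemas and auth untouched, assuming the mask is given as comma-separated field paths into `kafka_config` (the exact mask format is not specified on this page):

```sql
-- Hypothetical: update only bootstrap_servers; the mask format below
-- (comma-separated field paths) is an assumption, not documented here.
UPDATE databricks_workspace.ml.feature_kafka_configs
SET
  kafka_config = '{"bootstrap_servers": "new-host1:9092,new-host2:9092"}'
WHERE
  name = '{{ name }}' -- required
  AND update_mask = 'bootstrap_servers' -- assumed format
  AND deployment_name = '{{ deployment_name }}' -- required
  AND kafka_config = '{"bootstrap_servers": "new-host1:9092,new-host2:9092"}' -- required
RETURNING
  name,
  bootstrap_servers
;
```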

## DELETE examples

Delete a Kafka config. During PrPr, Kafka configs can be read and used when creating features under

```sql
DELETE FROM databricks_workspace.ml.feature_kafka_configs
WHERE name = '{{ name }}' -- required
  AND deployment_name = '{{ deployment_name }}' -- required
;
```