From 7ddc4293a0d19a03efd9339d4a5efa72aa409500 Mon Sep 17 00:00:00 2001
From: Miles Yucht
Date: Thu, 11 Apr 2024 14:51:01 +0200
Subject: [PATCH] Release v0.25.0

### Behavior Changes

* Override INVALID_PARAMETER_VALUE on fetching non-existent job/cluster ([#591](https://github.com/databricks/databricks-sdk-py/pull/591)). When getting a job or cluster by an ID that doesn't exist, the API currently returns a 400 response, corresponding to the `InvalidParameterValue` exception. With this change, the SDK raises a `ResourceDoesNotExist` exception instead in this circumstance; both exception classes are importable from `databricks.sdk.errors`. To handle this change, update your `except` blocks from:
```py
try:
    w.jobs.get_by_id("123")
except InvalidParameterValue as e:
    ...
```
to
```py
try:
    w.jobs.get_by_id("123")
except ResourceDoesNotExist as e:
    ...
```
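If your code needs to run against SDK versions from both before and after this change, one migration option is to catch both exceptions. A minimal sketch (assuming an already-configured workspace and a job ID `"123"` that may not exist):
```py
from databricks.sdk import WorkspaceClient
from databricks.sdk.errors import InvalidParameterValue, ResourceDoesNotExist

w = WorkspaceClient()
try:
    w.jobs.get_by_id("123")
except (InvalidParameterValue, ResourceDoesNotExist):
    # The job doesn't exist: ResourceDoesNotExist is raised on v0.25.0 and
    # later, InvalidParameterValue on earlier versions.
    ...
```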
### Internal Changes
* Check downstream backwards compatibility ([#600](https://github.com/databricks/databricks-sdk-py/pull/600)).
* Add support for upcoming Marketplace package ([#608](https://github.com/databricks/databricks-sdk-py/pull/608)).

API Changes:

 * Changed `cancel_refresh()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order (see the migration sketch after this list).
 * Changed `create()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
 * Changed `delete()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
 * Changed `get()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
 * Changed `get_refresh()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
 * Changed `list_refreshes()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
 * Changed `run_refresh()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
 * Changed `update()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
 * Removed `databricks.sdk.service.catalog.AzureManagedIdentity` dataclass.
 * Removed `full_name` field for `databricks.sdk.service.catalog.CancelRefreshRequest`.
 * Added `table_name` field for `databricks.sdk.service.catalog.CancelRefreshRequest`.
 * Changed `custom_metrics` field for `databricks.sdk.service.catalog.CreateMonitor` to `databricks.sdk.service.catalog.MonitorMetricList` dataclass.
 * Removed `full_name` field for `databricks.sdk.service.catalog.CreateMonitor`.
 * Changed `inference_log` field for `databricks.sdk.service.catalog.CreateMonitor` to `databricks.sdk.service.catalog.MonitorInferenceLog` dataclass.
 * Changed `notifications` field for `databricks.sdk.service.catalog.CreateMonitor` to `databricks.sdk.service.catalog.MonitorNotifications` dataclass.
 * Changed `snapshot` field for `databricks.sdk.service.catalog.CreateMonitor` to `any` dataclass.
 * Changed `time_series` field for `databricks.sdk.service.catalog.CreateMonitor` to `databricks.sdk.service.catalog.MonitorTimeSeries` dataclass.
 * Added `table_name` field for `databricks.sdk.service.catalog.CreateMonitor`.
 * Changed `azure_managed_identity` field for `databricks.sdk.service.catalog.CreateStorageCredential` to `databricks.sdk.service.catalog.AzureManagedIdentityRequest` dataclass.
 * Removed `full_name` field for `databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest`.
 * Added `table_name` field for `databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest`.
 * Removed `full_name` field for `databricks.sdk.service.catalog.GetLakehouseMonitorRequest`.
 * Added `table_name` field for `databricks.sdk.service.catalog.GetLakehouseMonitorRequest`.
 * Removed `full_name` field for `databricks.sdk.service.catalog.GetRefreshRequest`.
 * Added `table_name` field for `databricks.sdk.service.catalog.GetRefreshRequest`.
 * Removed `full_name` field for `databricks.sdk.service.catalog.ListRefreshesRequest`.
 * Added `table_name` field for `databricks.sdk.service.catalog.ListRefreshesRequest`.
 * Changed `quartz_cron_expression` field for `databricks.sdk.service.catalog.MonitorCronSchedule` to be required.
 * Changed `timezone_id` field for `databricks.sdk.service.catalog.MonitorCronSchedule` to be required.
 * Removed `databricks.sdk.service.catalog.MonitorCustomMetric` dataclass.
 * Removed `databricks.sdk.service.catalog.MonitorCustomMetricType` dataclass.
 * Removed `databricks.sdk.service.catalog.MonitorDestinations` dataclass.
 * Removed `databricks.sdk.service.catalog.MonitorInferenceLogProfileType` dataclass.
 * Removed `databricks.sdk.service.catalog.MonitorInferenceLogProfileTypeProblemType` dataclass.
 * Changed `custom_metrics` field for `databricks.sdk.service.catalog.MonitorInfo` to `databricks.sdk.service.catalog.MonitorMetricList` dataclass.
 * Changed `drift_metrics_table_name` field for `databricks.sdk.service.catalog.MonitorInfo` to be required.
 * Changed `inference_log` field for `databricks.sdk.service.catalog.MonitorInfo` to `databricks.sdk.service.catalog.MonitorInferenceLog` dataclass.
 * Changed `monitor_version` field for `databricks.sdk.service.catalog.MonitorInfo` to be required.
 * Changed `notifications` field for `databricks.sdk.service.catalog.MonitorInfo` to `databricks.sdk.service.catalog.MonitorNotifications` dataclass.
 * Changed `profile_metrics_table_name` field for `databricks.sdk.service.catalog.MonitorInfo` to be required.
 * Changed `snapshot` field for `databricks.sdk.service.catalog.MonitorInfo` to `any` dataclass.
 * Changed `status` field for `databricks.sdk.service.catalog.MonitorInfo` to be required.
 * Changed `table_name` field for `databricks.sdk.service.catalog.MonitorInfo` to be required.
 * Changed `time_series` field for `databricks.sdk.service.catalog.MonitorInfo` to `databricks.sdk.service.catalog.MonitorTimeSeries` dataclass.
 * Removed `databricks.sdk.service.catalog.MonitorNotificationsConfig` dataclass.
 * Changed `refresh_id` field for `databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
 * Changed `start_time_ms` field for `databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
 * Changed `state` field for `databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
 * Added `trigger` field for `databricks.sdk.service.catalog.MonitorRefreshInfo`.
 * Removed `any` dataclass.
 * Removed `databricks.sdk.service.catalog.MonitorTimeSeriesProfileType` dataclass.
 * Removed `full_name` field for `databricks.sdk.service.catalog.RunRefreshRequest`.
 * Added `table_name` field for `databricks.sdk.service.catalog.RunRefreshRequest`.
 * Changed `azure_managed_identity` field for `databricks.sdk.service.catalog.StorageCredentialInfo` to `databricks.sdk.service.catalog.AzureManagedIdentityResponse` dataclass.
 * Removed `name` field for `databricks.sdk.service.catalog.TableRowFilter`.
 * Added `function_name` field for `databricks.sdk.service.catalog.TableRowFilter`.
 * Changed `custom_metrics` field for `databricks.sdk.service.catalog.UpdateMonitor` to `databricks.sdk.service.catalog.MonitorMetricList` dataclass.
 * Removed `full_name` field for `databricks.sdk.service.catalog.UpdateMonitor`.
 * Changed `inference_log` field for `databricks.sdk.service.catalog.UpdateMonitor` to `databricks.sdk.service.catalog.MonitorInferenceLog` dataclass.
 * Changed `notifications` field for `databricks.sdk.service.catalog.UpdateMonitor` to `databricks.sdk.service.catalog.MonitorNotifications` dataclass.
 * Changed `snapshot` field for `databricks.sdk.service.catalog.UpdateMonitor` to `any` dataclass.
 * Changed `time_series` field for `databricks.sdk.service.catalog.UpdateMonitor` to `databricks.sdk.service.catalog.MonitorTimeSeries` dataclass.
 * Added `table_name` field for `databricks.sdk.service.catalog.UpdateMonitor`.
 * Changed `azure_managed_identity` field for `databricks.sdk.service.catalog.UpdateStorageCredential` to `databricks.sdk.service.catalog.AzureManagedIdentityResponse` dataclass.
 * Changed `azure_managed_identity` field for `databricks.sdk.service.catalog.ValidateStorageCredential` to `databricks.sdk.service.catalog.AzureManagedIdentityRequest` dataclass.
 * Removed `operation` field for `databricks.sdk.service.catalog.ValidationResult`.
 * Added `aws_operation` field for `databricks.sdk.service.catalog.ValidationResult`.
 * Added `azure_operation` field for `databricks.sdk.service.catalog.ValidationResult`.
 * Added `gcp_operation` field for `databricks.sdk.service.catalog.ValidationResult`.
 * Removed `databricks.sdk.service.catalog.ValidationResultOperation` dataclass.
 * Added `databricks.sdk.service.catalog.AzureManagedIdentityRequest` dataclass.
 * Added `databricks.sdk.service.catalog.AzureManagedIdentityResponse` dataclass.
 * Added `databricks.sdk.service.catalog.MonitorDestination` dataclass.
 * Added `databricks.sdk.service.catalog.MonitorInferenceLog` dataclass.
 * Added `databricks.sdk.service.catalog.MonitorInferenceLogProblemType` dataclass.
 * Added `databricks.sdk.service.catalog.MonitorMetric` dataclass.
 * Added `databricks.sdk.service.catalog.MonitorMetricType` dataclass.
 * Added `databricks.sdk.service.catalog.MonitorNotifications` dataclass.
 * Added `databricks.sdk.service.catalog.MonitorRefreshInfoTrigger` dataclass.
 * Added `any` dataclass.
 * Added `databricks.sdk.service.catalog.MonitorTimeSeries` dataclass.
 * Added `databricks.sdk.service.catalog.ValidationResultAwsOperation` dataclass.
 * Added `databricks.sdk.service.catalog.ValidationResultAzureOperation` dataclass.
 * Added `databricks.sdk.service.catalog.ValidationResultGcpOperation` dataclass.
 * Added `clone_from` field for `databricks.sdk.service.compute.ClusterSpec`.
 * Removed `databricks.sdk.service.compute.ComputeSpec` dataclass.
 * Removed `databricks.sdk.service.compute.ComputeSpecKind` dataclass.
 * Added `clone_from` field for `databricks.sdk.service.compute.CreateCluster`.
 * Added `clone_from` field for `databricks.sdk.service.compute.EditCluster`.
 * Added `databricks.sdk.service.compute.CloneCluster` dataclass.
 * Added `databricks.sdk.service.compute.Environment` dataclass.
 * Changed `update()` method for [a.workspace_assignment](https://databricks-sdk-py.readthedocs.io/en/latest/account/workspace_assignment.html) account-level service to return `databricks.sdk.service.iam.PermissionAssignment` dataclass.
 * Removed `any` dataclass.
 * Removed `compute_key` field for `databricks.sdk.service.jobs.ClusterSpec`.
 * Removed `compute` field for `databricks.sdk.service.jobs.CreateJob`.
 * Added `environments` field for `databricks.sdk.service.jobs.CreateJob`.
 * Removed `databricks.sdk.service.jobs.JobCompute` dataclass.
 * Removed `compute` field for `databricks.sdk.service.jobs.JobSettings`.
 * Added `environments` field for `databricks.sdk.service.jobs.JobSettings`.
 * Removed `compute_key` field for `databricks.sdk.service.jobs.RunTask`.
 * Removed `databricks.sdk.service.jobs.TableTriggerConfiguration` dataclass.
 * Removed `compute_key` field for `databricks.sdk.service.jobs.Task`.
 * Added `environment_key` field for `databricks.sdk.service.jobs.Task`.
 * Changed `table` field for `databricks.sdk.service.jobs.TriggerSettings` to `databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` dataclass.
 * Changed `table_update` field for `databricks.sdk.service.jobs.TriggerSettings` to `databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` dataclass.
 * Added `databricks.sdk.service.jobs.JobEnvironment` dataclass.
 * Added `databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` dataclass.
 * Added `databricks.sdk.service.marketplace` package.
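For the `w.lakehouse_monitors` methods above, the practical impact is that the monitored table is now identified by a required `table_name` argument instead of `full_name` (a sketch based on the field renames listed above; `main.sales.orders` is a hypothetical table name, and the exact signatures are in the linked service docs — passing arguments by keyword sidesteps the changed positional order):
```py
# from 0.25.0 on:
w.lakehouse_monitors.get(table_name="main.sales.orders")
# previously:
# w.lakehouse_monitors.get(full_name="main.sales.orders")
```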
OpenAPI SHA: 94684175b8bd65f8701f89729351f8069e8309c9, Date: 2024-04-11
---
 .codegen/_openapi_sha                         | 2 +-
 .gitattributes                                | 1 +
 CHANGELOG.md                                  | 136 +
 databricks/sdk/__init__.py                    | 79 +
 databricks/sdk/service/catalog.py             | 611 +--
 databricks/sdk/service/compute.py             | 101 +-
 databricks/sdk/service/iam.py                 | 40 +-
 databricks/sdk/service/jobs.py                | 120 +-
 databricks/sdk/service/marketplace.py         | 3571 +++++++++++++++++
 databricks/sdk/version.py                     | 2 +-
 docs/account/iam/workspace_assignment.rst     | 14 +-
 docs/dbdataclasses/catalog.rst                | 118 +-
 docs/dbdataclasses/compute.rst                | 19 +-
 docs/dbdataclasses/iam.rst                    | 4 -
 docs/dbdataclasses/index.rst                  | 1 +
 docs/dbdataclasses/jobs.rst                   | 10 +-
 docs/dbdataclasses/marketplace.rst            | 624 +++
 docs/workspace/catalog/lakehouse_monitors.rst | 52 +-
 docs/workspace/catalog/registered_models.rst  | 16 +-
 .../workspace/catalog/storage_credentials.rst | 12 +-
 docs/workspace/compute/clusters.rst           | 12 +-
 docs/workspace/iam/permissions.rst            | 6 +-
 docs/workspace/index.rst                      | 1 +
 docs/workspace/jobs/jobs.rst                  | 6 +-
 .../marketplace/consumer_fulfillments.rst     | 36 +
 .../marketplace/consumer_installations.rst    | 78 +
 .../marketplace/consumer_listings.rst         | 71 +
 .../consumer_personalization_requests.rst     | 50 +
 .../marketplace/consumer_providers.rst        | 31 +
 docs/workspace/marketplace/index.rst          | 21 +
 .../marketplace/provider_exchange_filters.rst | 54 +
 .../marketplace/provider_exchanges.rst        | 113 +
 docs/workspace/marketplace/provider_files.rst | 56 +
 .../marketplace/provider_listings.rst         | 65 +
 .../provider_personalization_requests.rst     | 36 +
 ...provider_provider_analytics_dashboards.rst | 50 +
 .../marketplace/provider_providers.rst        | 64 +
 .../update_workspace_assignment_on_aws.py     | 6 +-
 38 files changed, 5824 insertions(+), 465 deletions(-)
 create mode 100755 databricks/sdk/service/marketplace.py
 create mode 100644 docs/dbdataclasses/marketplace.rst
 create mode 100644 docs/workspace/marketplace/consumer_fulfillments.rst
 create mode 100644 docs/workspace/marketplace/consumer_installations.rst
 create mode 100644 docs/workspace/marketplace/consumer_listings.rst
 create mode 100644 docs/workspace/marketplace/consumer_personalization_requests.rst
 create mode 100644 docs/workspace/marketplace/consumer_providers.rst
 create mode 100644 docs/workspace/marketplace/index.rst
 create mode 100644 docs/workspace/marketplace/provider_exchange_filters.rst
 create mode 100644 docs/workspace/marketplace/provider_exchanges.rst
 create mode 100644 docs/workspace/marketplace/provider_files.rst
 create mode 100644 docs/workspace/marketplace/provider_listings.rst
 create mode 100644 docs/workspace/marketplace/provider_personalization_requests.rst
 create mode 100644 docs/workspace/marketplace/provider_provider_analytics_dashboards.rst
 create mode 100644 docs/workspace/marketplace/provider_providers.rst

diff --git a/.codegen/_openapi_sha b/.codegen/_openapi_sha
index bba9504f1..0aa4b1028 100644
--- a/.codegen/_openapi_sha
+++ b/.codegen/_openapi_sha
@@ -1 +1 @@
-d38528c3e47dd81c9bdbd918272a3e49d36e09ce
\ No newline at end of file
+94684175b8bd65f8701f89729351f8069e8309c9
\ No newline at end of file
diff --git a/.gitattributes b/.gitattributes
index 3b4f3ee3b..9f80ba241 100755
--- a/.gitattributes
+++ b/.gitattributes
@@ -8,6 +8,7 @@ databricks/sdk/service/dashboards.py linguist-generated=true
 databricks/sdk/service/files.py linguist-generated=true
 databricks/sdk/service/iam.py linguist-generated=true
 databricks/sdk/service/jobs.py linguist-generated=true
+databricks/sdk/service/marketplace.py linguist-generated=true
 databricks/sdk/service/ml.py linguist-generated=true
 databricks/sdk/service/oauth2.py linguist-generated=true
 databricks/sdk/service/pipelines.py linguist-generated=true
diff --git a/CHANGELOG.md b/CHANGELOG.md
index ac140b061..ae886dd19 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,141 @@
 # Version changelog
 
+## 0.25.0
+
+### Behavior Changes
+
+* Override INVALID_PARAMETER_VALUE on fetching non-existent job/cluster ([#591](https://github.com/databricks/databricks-sdk-py/pull/591)). When getting a job or cluster by an ID that doesn't exist, the API currently returns a 400 response, corresponding to the `InvalidParameterValue` exception. With this change, the SDK raises a `ResourceDoesNotExist` exception instead in this circumstance; both exception classes are importable from `databricks.sdk.errors`. To handle this change, update your `except` blocks from:
+```py
+try:
+    w.jobs.get_by_id("123")
+except InvalidParameterValue as e:
+    ...
+```
+to
+```py
+try:
+    w.jobs.get_by_id("123")
+except ResourceDoesNotExist as e:
+    ...
+```
+
+### Internal Changes
+* Check downstream backwards compatibility ([#600](https://github.com/databricks/databricks-sdk-py/pull/600)).
+* Add support for upcoming Marketplace package ([#608](https://github.com/databricks/databricks-sdk-py/pull/608)).
+
+API Changes:
+
+ * Changed `cancel_refresh()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
+ * Changed `create()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
+ * Changed `delete()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
+ * Changed `get()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
+ * Changed `get_refresh()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
+ * Changed `list_refreshes()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
+ * Changed `run_refresh()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
+ * Changed `update()` method for [w.lakehouse_monitors](https://databricks-sdk-py.readthedocs.io/en/latest/workspace/lakehouse_monitors.html) workspace-level service with new required argument order.
+ * Removed `databricks.sdk.service.catalog.AzureManagedIdentity` dataclass.
+ * Removed `full_name` field for `databricks.sdk.service.catalog.CancelRefreshRequest`.
+ * Added `table_name` field for `databricks.sdk.service.catalog.CancelRefreshRequest`.
+ * Changed `custom_metrics` field for `databricks.sdk.service.catalog.CreateMonitor` to `databricks.sdk.service.catalog.MonitorMetricList` dataclass.
+ * Removed `full_name` field for `databricks.sdk.service.catalog.CreateMonitor`.
+ * Changed `inference_log` field for `databricks.sdk.service.catalog.CreateMonitor` to `databricks.sdk.service.catalog.MonitorInferenceLog` dataclass.
+ * Changed `notifications` field for `databricks.sdk.service.catalog.CreateMonitor` to `databricks.sdk.service.catalog.MonitorNotifications` dataclass.
+ * Changed `snapshot` field for `databricks.sdk.service.catalog.CreateMonitor` to `any` dataclass.
+ * Changed `time_series` field for `databricks.sdk.service.catalog.CreateMonitor` to `databricks.sdk.service.catalog.MonitorTimeSeries` dataclass.
+ * Added `table_name` field for `databricks.sdk.service.catalog.CreateMonitor`.
+ * Changed `azure_managed_identity` field for `databricks.sdk.service.catalog.CreateStorageCredential` to `databricks.sdk.service.catalog.AzureManagedIdentityRequest` dataclass.
+ * Removed `full_name` field for `databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest`.
+ * Added `table_name` field for `databricks.sdk.service.catalog.DeleteLakehouseMonitorRequest`.
+ * Removed `full_name` field for `databricks.sdk.service.catalog.GetLakehouseMonitorRequest`.
+ * Added `table_name` field for `databricks.sdk.service.catalog.GetLakehouseMonitorRequest`.
+ * Removed `full_name` field for `databricks.sdk.service.catalog.GetRefreshRequest`.
+ * Added `table_name` field for `databricks.sdk.service.catalog.GetRefreshRequest`.
+ * Removed `full_name` field for `databricks.sdk.service.catalog.ListRefreshesRequest`.
+ * Added `table_name` field for `databricks.sdk.service.catalog.ListRefreshesRequest`.
+ * Changed `quartz_cron_expression` field for `databricks.sdk.service.catalog.MonitorCronSchedule` to be required.
+ * Changed `timezone_id` field for `databricks.sdk.service.catalog.MonitorCronSchedule` to be required.
+ * Removed `databricks.sdk.service.catalog.MonitorCustomMetric` dataclass.
+ * Removed `databricks.sdk.service.catalog.MonitorCustomMetricType` dataclass.
+ * Removed `databricks.sdk.service.catalog.MonitorDestinations` dataclass.
+ * Removed `databricks.sdk.service.catalog.MonitorInferenceLogProfileType` dataclass.
+ * Removed `databricks.sdk.service.catalog.MonitorInferenceLogProfileTypeProblemType` dataclass.
+ * Changed `custom_metrics` field for `databricks.sdk.service.catalog.MonitorInfo` to `databricks.sdk.service.catalog.MonitorMetricList` dataclass.
+ * Changed `drift_metrics_table_name` field for `databricks.sdk.service.catalog.MonitorInfo` to be required.
+ * Changed `inference_log` field for `databricks.sdk.service.catalog.MonitorInfo` to `databricks.sdk.service.catalog.MonitorInferenceLog` dataclass.
+ * Changed `monitor_version` field for `databricks.sdk.service.catalog.MonitorInfo` to be required.
+ * Changed `notifications` field for `databricks.sdk.service.catalog.MonitorInfo` to `databricks.sdk.service.catalog.MonitorNotifications` dataclass.
+ * Changed `profile_metrics_table_name` field for `databricks.sdk.service.catalog.MonitorInfo` to be required.
+ * Changed `snapshot` field for `databricks.sdk.service.catalog.MonitorInfo` to `any` dataclass.
+ * Changed `status` field for `databricks.sdk.service.catalog.MonitorInfo` to be required.
+ * Changed `table_name` field for `databricks.sdk.service.catalog.MonitorInfo` to be required.
+ * Changed `time_series` field for `databricks.sdk.service.catalog.MonitorInfo` to `databricks.sdk.service.catalog.MonitorTimeSeries` dataclass.
+ * Removed `databricks.sdk.service.catalog.MonitorNotificationsConfig` dataclass.
+ * Changed `refresh_id` field for `databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
+ * Changed `start_time_ms` field for `databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
+ * Changed `state` field for `databricks.sdk.service.catalog.MonitorRefreshInfo` to be required.
+ * Added `trigger` field for `databricks.sdk.service.catalog.MonitorRefreshInfo`.
+ * Removed `any` dataclass.
+ * Removed `databricks.sdk.service.catalog.MonitorTimeSeriesProfileType` dataclass.
+ * Removed `full_name` field for `databricks.sdk.service.catalog.RunRefreshRequest`.
+ * Added `table_name` field for `databricks.sdk.service.catalog.RunRefreshRequest`.
+ * Changed `azure_managed_identity` field for `databricks.sdk.service.catalog.StorageCredentialInfo` to `databricks.sdk.service.catalog.AzureManagedIdentityResponse` dataclass.
+ * Removed `name` field for `databricks.sdk.service.catalog.TableRowFilter`.
+ * Added `function_name` field for `databricks.sdk.service.catalog.TableRowFilter`.
+ * Changed `custom_metrics` field for `databricks.sdk.service.catalog.UpdateMonitor` to `databricks.sdk.service.catalog.MonitorMetricList` dataclass.
+ * Removed `full_name` field for `databricks.sdk.service.catalog.UpdateMonitor`.
+ * Changed `inference_log` field for `databricks.sdk.service.catalog.UpdateMonitor` to `databricks.sdk.service.catalog.MonitorInferenceLog` dataclass.
+ * Changed `notifications` field for `databricks.sdk.service.catalog.UpdateMonitor` to `databricks.sdk.service.catalog.MonitorNotifications` dataclass.
+ * Changed `snapshot` field for `databricks.sdk.service.catalog.UpdateMonitor` to `any` dataclass.
+ * Changed `time_series` field for `databricks.sdk.service.catalog.UpdateMonitor` to `databricks.sdk.service.catalog.MonitorTimeSeries` dataclass.
+ * Added `table_name` field for `databricks.sdk.service.catalog.UpdateMonitor`.
+ * Changed `azure_managed_identity` field for `databricks.sdk.service.catalog.UpdateStorageCredential` to `databricks.sdk.service.catalog.AzureManagedIdentityResponse` dataclass.
+ * Changed `azure_managed_identity` field for `databricks.sdk.service.catalog.ValidateStorageCredential` to `databricks.sdk.service.catalog.AzureManagedIdentityRequest` dataclass.
+ * Removed `operation` field for `databricks.sdk.service.catalog.ValidationResult`.
+ * Added `aws_operation` field for `databricks.sdk.service.catalog.ValidationResult`.
+ * Added `azure_operation` field for `databricks.sdk.service.catalog.ValidationResult`.
+ * Added `gcp_operation` field for `databricks.sdk.service.catalog.ValidationResult`.
+ * Removed `databricks.sdk.service.catalog.ValidationResultOperation` dataclass.
+ * Added `databricks.sdk.service.catalog.AzureManagedIdentityRequest` dataclass.
+ * Added `databricks.sdk.service.catalog.AzureManagedIdentityResponse` dataclass.
+ * Added `databricks.sdk.service.catalog.MonitorDestination` dataclass.
+ * Added `databricks.sdk.service.catalog.MonitorInferenceLog` dataclass.
+ * Added `databricks.sdk.service.catalog.MonitorInferenceLogProblemType` dataclass.
+ * Added `databricks.sdk.service.catalog.MonitorMetric` dataclass.
+ * Added `databricks.sdk.service.catalog.MonitorMetricType` dataclass.
+ * Added `databricks.sdk.service.catalog.MonitorNotifications` dataclass.
+ * Added `databricks.sdk.service.catalog.MonitorRefreshInfoTrigger` dataclass.
+ * Added `any` dataclass.
+ * Added `databricks.sdk.service.catalog.MonitorTimeSeries` dataclass.
+ * Added `databricks.sdk.service.catalog.ValidationResultAwsOperation` dataclass.
+ * Added `databricks.sdk.service.catalog.ValidationResultAzureOperation` dataclass.
+ * Added `databricks.sdk.service.catalog.ValidationResultGcpOperation` dataclass.
+ * Added `clone_from` field for `databricks.sdk.service.compute.ClusterSpec`.
+ * Removed `databricks.sdk.service.compute.ComputeSpec` dataclass.
+ * Removed `databricks.sdk.service.compute.ComputeSpecKind` dataclass.
+ * Added `clone_from` field for `databricks.sdk.service.compute.CreateCluster`.
+ * Added `clone_from` field for `databricks.sdk.service.compute.EditCluster`.
+ * Added `databricks.sdk.service.compute.CloneCluster` dataclass.
+ * Added `databricks.sdk.service.compute.Environment` dataclass.
+ * Changed `update()` method for [a.workspace_assignment](https://databricks-sdk-py.readthedocs.io/en/latest/account/workspace_assignment.html) account-level service to return `databricks.sdk.service.iam.PermissionAssignment` dataclass.
+ * Removed `any` dataclass.
+ * Removed `compute_key` field for `databricks.sdk.service.jobs.ClusterSpec`.
+ * Removed `compute` field for `databricks.sdk.service.jobs.CreateJob`.
+ * Added `environments` field for `databricks.sdk.service.jobs.CreateJob`.
+ * Removed `databricks.sdk.service.jobs.JobCompute` dataclass.
+ * Removed `compute` field for `databricks.sdk.service.jobs.JobSettings`.
+ * Added `environments` field for `databricks.sdk.service.jobs.JobSettings`.
+ * Removed `compute_key` field for `databricks.sdk.service.jobs.RunTask`.
+ * Removed `databricks.sdk.service.jobs.TableTriggerConfiguration` dataclass.
+ * Removed `compute_key` field for `databricks.sdk.service.jobs.Task`.
+ * Added `environment_key` field for `databricks.sdk.service.jobs.Task`.
+ * Changed `table` field for `databricks.sdk.service.jobs.TriggerSettings` to `databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` dataclass.
+ * Changed `table_update` field for `databricks.sdk.service.jobs.TriggerSettings` to `databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` dataclass.
+ * Added `databricks.sdk.service.jobs.JobEnvironment` dataclass.
+ * Added `databricks.sdk.service.jobs.TableUpdateTriggerConfiguration` dataclass.
+ * Added `databricks.sdk.service.marketplace` package.
+
+OpenAPI SHA: 94684175b8bd65f8701f89729351f8069e8309c9, Date: 2024-04-11
+
 ## 0.24.0
 
 ### Improvements and Bug Fixes
diff --git a/databricks/sdk/__init__.py b/databricks/sdk/__init__.py
index 5aa0ab02a..df8111e8f 100755
--- a/databricks/sdk/__init__.py
+++ b/databricks/sdk/__init__.py
@@ -38,6 +38,12 @@
                                         PermissionsAPI, ServicePrincipalsAPI,
                                         UsersAPI, WorkspaceAssignmentAPI)
 from databricks.sdk.service.jobs import JobsAPI
+from databricks.sdk.service.marketplace import (
+    ConsumerFulfillmentsAPI, ConsumerInstallationsAPI, ConsumerListingsAPI,
+    ConsumerPersonalizationRequestsAPI, ConsumerProvidersAPI,
+    ProviderExchangeFiltersAPI, ProviderExchangesAPI, ProviderFilesAPI,
+    ProviderListingsAPI, ProviderPersonalizationRequestsAPI,
+    ProviderProviderAnalyticsDashboardsAPI, ProviderProvidersAPI)
 from databricks.sdk.service.ml import ExperimentsAPI, ModelRegistryAPI
 from databricks.sdk.service.oauth2 import (CustomAppIntegrationAPI,
                                            OAuthPublishedAppsAPI,
@@ -164,6 +170,11 @@ def __init__(self,
         self._clusters = ClustersExt(self._api_client)
         self._command_execution = CommandExecutionAPI(self._api_client)
         self._connections = ConnectionsAPI(self._api_client)
+        self._consumer_fulfillments = ConsumerFulfillmentsAPI(self._api_client)
+        self._consumer_installations = ConsumerInstallationsAPI(self._api_client)
+        self._consumer_listings = ConsumerListingsAPI(self._api_client)
+        self._consumer_personalization_requests = ConsumerPersonalizationRequestsAPI(self._api_client)
+        self._consumer_providers = ConsumerProvidersAPI(self._api_client)
         self._credentials_manager = CredentialsManagerAPI(self._api_client)
         self._current_user = CurrentUserAPI(self._api_client)
         self._dashboard_widgets = DashboardWidgetsAPI(self._api_client)
@@ -194,6 +205,14 @@ def __init__(self,
         self._permissions = PermissionsAPI(self._api_client)
         self._pipelines = PipelinesAPI(self._api_client)
         self._policy_families = PolicyFamiliesAPI(self._api_client)
+        self._provider_exchange_filters = ProviderExchangeFiltersAPI(self._api_client)
+        self._provider_exchanges = ProviderExchangesAPI(self._api_client)
+        self._provider_files = ProviderFilesAPI(self._api_client)
+        self._provider_listings = ProviderListingsAPI(self._api_client)
+        self._provider_personalization_requests = ProviderPersonalizationRequestsAPI(self._api_client)
+        self._provider_provider_analytics_dashboards = ProviderProviderAnalyticsDashboardsAPI(
+            self._api_client)
+        self._provider_providers = ProviderProvidersAPI(self._api_client)
         self._providers = ProvidersAPI(self._api_client)
         self._queries = QueriesAPI(self._api_client)
         self._query_history = QueryHistoryAPI(self._api_client)
@@ -286,6 +305,31 @@ def connections(self) -> ConnectionsAPI:
         """Connections allow for creating a connection to an external data source."""
         return self._connections
 
+    @property
+    def consumer_fulfillments(self) -> ConsumerFulfillmentsAPI:
+        """Fulfillments are entities that allow consumers to preview installations."""
+        return self._consumer_fulfillments
+
+    @property
+    def consumer_installations(self) -> ConsumerInstallationsAPI:
+        """Installations are entities that allow consumers to interact with Databricks Marketplace listings."""
+        return self._consumer_installations
+
+    @property
+    def consumer_listings(self) -> ConsumerListingsAPI:
+        """Listings are the core entities in the Marketplace."""
+        return self._consumer_listings
+
+    @property
+    def consumer_personalization_requests(self) -> ConsumerPersonalizationRequestsAPI:
+        """Personalization Requests allow customers to interact with the individualized Marketplace listing flow."""
+        return self._consumer_personalization_requests
+
+    @property
+    def consumer_providers(self) -> ConsumerProvidersAPI:
+        """Providers are the entities that publish listings to the Marketplace."""
+        return self._consumer_providers
+
     @property
     def credentials_manager(self) -> CredentialsManagerAPI:
         """Credentials manager interacts with Identity Providers to perform token exchanges using stored credentials and refresh tokens."""
@@ -436,6 +480,41 @@ def policy_families(self) -> PolicyFamiliesAPI:
         """View available policy families."""
         return self._policy_families
 
+    @property
+    def provider_exchange_filters(self) -> ProviderExchangeFiltersAPI:
+        """Marketplace exchanges filters curate which groups can access an exchange."""
+        return self._provider_exchange_filters
+
+    @property
+    def provider_exchanges(self) -> ProviderExchangesAPI:
+        """Marketplace exchanges allow providers to share their listings with a curated set of customers."""
+        return self._provider_exchanges
+
+    @property
+    def provider_files(self) -> ProviderFilesAPI:
+        """Marketplace offers a set of file APIs for various purposes such as preview notebooks and provider icons."""
+        return self._provider_files
+
+    @property
+    def provider_listings(self) -> ProviderListingsAPI:
+        """Listings are the core entities in the Marketplace."""
+        return self._provider_listings
+
+    @property
+    def provider_personalization_requests(self) -> ProviderPersonalizationRequestsAPI:
+        """Personalization requests are an alternative to instantly available listings."""
+        return self._provider_personalization_requests
+
+    @property
+    def provider_provider_analytics_dashboards(self) -> ProviderProviderAnalyticsDashboardsAPI:
+        """Manage templated analytics solution for providers."""
+        return self._provider_provider_analytics_dashboards
+
+    @property
+    def provider_providers(self) -> ProviderProvidersAPI:
+        """Providers are entities that manage assets in Marketplace."""
+        return self._provider_providers
+
     @property
     def providers(self) -> ProvidersAPI:
         """A data provider is an object representing the organization in the real world who shares the data."""
diff --git a/databricks/sdk/service/catalog.py b/databricks/sdk/service/catalog.py
index ed171108a..567a96eea 100755
--- a/databricks/sdk/service/catalog.py
+++ b/databricks/sdk/service/catalog.py
@@ -320,7 +320,34 @@ def from_dict(cls, d: Dict[str, any]) -> AwsIamRoleResponse:
 
 @dataclass
-class AzureManagedIdentity:
+class AzureManagedIdentityRequest:
+    access_connector_id: str
+    """The Azure resource ID of the Azure Databricks Access Connector. Use the format
+    /subscriptions/{guid}/resourceGroups/{rg-name}/providers/Microsoft.Databricks/accessConnectors/{connector-name}."""
+
+    managed_identity_id: Optional[str] = None
+    """The Azure resource ID of the managed identity. Use the format
+    /subscriptions/{guid}/resourceGroups/{rg-name}/providers/Microsoft.ManagedIdentity/userAssignedIdentities/{identity-name}.
+    This is only available for user-assigned identities. For system-assigned identities, the
+    access_connector_id is used to identify the identity. If this field is not provided, then we
+    assume the AzureManagedIdentity is for a system-assigned identity."""
+
+    def as_dict(self) -> dict:
+        """Serializes the AzureManagedIdentityRequest into a dictionary suitable for use as a JSON request body."""
+        body = {}
+        if self.access_connector_id is not None: body['access_connector_id'] = self.access_connector_id
+        if self.managed_identity_id is not None: body['managed_identity_id'] = self.managed_identity_id
+        return body
+
+    @classmethod
+    def from_dict(cls, d: Dict[str, any]) -> AzureManagedIdentityRequest:
+        """Deserializes the AzureManagedIdentityRequest from a dictionary."""
+        return cls(access_connector_id=d.get('access_connector_id', None),
+                   managed_identity_id=d.get('managed_identity_id', None))
+
+
+@dataclass
+class AzureManagedIdentityResponse:
     access_connector_id: str
     """The Azure resource ID of the Azure Databricks Access Connector. Use the format
     /subscriptions/{guid}/resourceGroups/{rg-name}/providers/Microsoft.Databricks/accessConnectors/{connector-name}."""
@@ -336,7 +363,7 @@ class AzureManagedIdentity:
     assume the AzureManagedIdentity is for a system-assigned identity."""
 
     def as_dict(self) -> dict:
-        """Serializes the AzureManagedIdentity into a dictionary suitable for use as a JSON request body."""
+        """Serializes the AzureManagedIdentityResponse into a dictionary suitable for use as a JSON request body."""
         body = {}
         if self.access_connector_id is not None: body['access_connector_id'] = self.access_connector_id
         if self.credential_id is not None: body['credential_id'] = self.credential_id
@@ -344,8 +371,8 @@ def as_dict(self) -> dict:
         return body
 
     @classmethod
-    def from_dict(cls, d: Dict[str, any]) -> AzureManagedIdentity:
-        """Deserializes the AzureManagedIdentity from a dictionary."""
+    def from_dict(cls, d: Dict[str, any]) -> AzureManagedIdentityResponse:
+        """Deserializes the AzureManagedIdentityResponse from a dictionary."""
         return cls(access_connector_id=d.get('access_connector_id', None),
                    credential_id=d.get('credential_id', None),
                    managed_identity_id=d.get('managed_identity_id', None))
@@ -1256,7 +1283,7 @@ class CreateMonitor:
     """Name of the baseline table from which drift metrics are computed from. Columns in the monitored
     table should also be present in the baseline table."""
 
-    custom_metrics: Optional[List[MonitorCustomMetric]] = None
+    custom_metrics: Optional[List[MonitorMetric]] = None
     """Custom metrics to compute on the monitored table. These can be aggregate metrics, derived
     metrics (from already computed aggregate metrics), or drift metrics (comparing metrics across
     time windows)."""
@@ -1264,13 +1291,10 @@ class CreateMonitor:
     data_classification_config: Optional[MonitorDataClassificationConfig] = None
     """The data classification config for the monitor."""
 
-    full_name: Optional[str] = None
-    """Full name of the table."""
-
-    inference_log: Optional[MonitorInferenceLogProfileType] = None
+    inference_log: Optional[MonitorInferenceLog] = None
     """Configuration for monitoring inference logs."""
 
-    notifications: Optional[MonitorNotificationsConfig] = None
+    notifications: Optional[MonitorNotifications] = None
     """The notification settings for the monitor."""
 
     schedule: Optional[MonitorCronSchedule] = None
@@ -1284,10 +1308,13 @@ class CreateMonitor:
     expression independently, resulting in a separate slice for each predicate and its complements.
     For high-cardinality columns, only the top 100 unique values by frequency will generate slices."""
 
-    snapshot: Optional[MonitorSnapshotProfileType] = None
+    snapshot: Optional[MonitorSnapshot] = None
     """Configuration for monitoring snapshot tables."""
 
-    time_series: Optional[MonitorTimeSeriesProfileType] = None
+    table_name: Optional[str] = None
+    """Full name of the table."""
+
+    time_series: Optional[MonitorTimeSeries] = None
     """Configuration for monitoring time series tables."""
 
     warehouse_id: Optional[str] = None
@@ -1302,7 +1329,6 @@ def as_dict(self) -> dict:
         if self.custom_metrics: body['custom_metrics'] = [v.as_dict() for v in self.custom_metrics]
         if self.data_classification_config:
             body['data_classification_config'] = self.data_classification_config.as_dict()
-        if self.full_name is not None: body['full_name'] = self.full_name
         if self.inference_log: body['inference_log'] = self.inference_log.as_dict()
         if self.notifications: body['notifications'] = self.notifications.as_dict()
         if self.output_schema_name is not None: body['output_schema_name'] = self.output_schema_name
@@ -1311,6 +1337,7 @@ def as_dict(self) -> dict:
             body['skip_builtin_dashboard'] = self.skip_builtin_dashboard
         if self.slicing_exprs: body['slicing_exprs'] = [v for v in self.slicing_exprs]
         if self.snapshot: body['snapshot'] = self.snapshot.as_dict()
+        if self.table_name is not None: body['table_name'] = self.table_name
         if self.time_series: body['time_series'] = self.time_series.as_dict()
         if self.warehouse_id is not None: body['warehouse_id'] = self.warehouse_id
         return body
@@ -1320,18 +1347,18 @@ def from_dict(cls, d: Dict[str, any]) -> CreateMonitor:
         """Deserializes the CreateMonitor from a dictionary."""
         return cls(assets_dir=d.get('assets_dir', None),
                    baseline_table_name=d.get('baseline_table_name', None),
-                   custom_metrics=_repeated_dict(d, 'custom_metrics', MonitorCustomMetric),
+                   custom_metrics=_repeated_dict(d, 'custom_metrics', MonitorMetric),
                    data_classification_config=_from_dict(d, 'data_classification_config',
                                                          MonitorDataClassificationConfig),
-                   full_name=d.get('full_name', None),
-                   inference_log=_from_dict(d, 'inference_log', MonitorInferenceLogProfileType),
-                   notifications=_from_dict(d, 'notifications', MonitorNotificationsConfig),
+                   inference_log=_from_dict(d, 'inference_log', MonitorInferenceLog),
+                   notifications=_from_dict(d, 'notifications', MonitorNotifications),
                    output_schema_name=d.get('output_schema_name', None),
                    schedule=_from_dict(d, 'schedule', MonitorCronSchedule),
                    skip_builtin_dashboard=d.get('skip_builtin_dashboard', None),
                    slicing_exprs=d.get('slicing_exprs', None),
-                   snapshot=_from_dict(d, 'snapshot', MonitorSnapshotProfileType),
-                   time_series=_from_dict(d, 'time_series', MonitorTimeSeriesProfileType),
+                   snapshot=_from_dict(d, 'snapshot', MonitorSnapshot),
+                   table_name=d.get('table_name', None),
+                   time_series=_from_dict(d, 'time_series', MonitorTimeSeries),
                    warehouse_id=d.get('warehouse_id', None))
@@ -1454,7 +1481,7 @@ class CreateStorageCredential:
     aws_iam_role: Optional[AwsIamRoleRequest] = None
     """The AWS IAM role configuration."""
 
-    azure_managed_identity: Optional[AzureManagedIdentity] = None
+    azure_managed_identity: Optional[AzureManagedIdentityRequest] = None
     """The Azure managed identity configuration."""
 
     azure_service_principal: Optional[AzureServicePrincipal] = None
@@ -1495,7 +1522,8 @@ def as_dict(self) -> dict:
     def from_dict(cls, d: Dict[str, any]) -> CreateStorageCredential:
         """Deserializes the CreateStorageCredential from a dictionary."""
         return cls(aws_iam_role=_from_dict(d, 'aws_iam_role', AwsIamRoleRequest),
-                   azure_managed_identity=_from_dict(d, 'azure_managed_identity', AzureManagedIdentity),
+                   azure_managed_identity=_from_dict(d, 'azure_managed_identity',
+                                                     AzureManagedIdentityRequest),
                    azure_service_principal=_from_dict(d, 'azure_service_principal', AzureServicePrincipal),
                    cloudflare_api_token=_from_dict(d, 'cloudflare_api_token', CloudflareApiToken),
                    comment=d.get('comment', None),
@@ -3068,14 +3096,16 @@ class ModelVersionInfoStatus(Enum):
 
 @dataclass
 class MonitorCronSchedule:
-    pause_status: Optional[MonitorCronSchedulePauseStatus] = None
-    """Whether the schedule is paused or not"""
+    quartz_cron_expression: str
+    """The expression that determines when to run the monitor. See [examples].
+
+    [examples]: https://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html"""
 
-    quartz_cron_expression: Optional[str] = None
-    """A cron expression using quartz syntax that describes the schedule for a job."""
+    timezone_id: str
+    """The timezone id (e.g., ``"PST"``) in which to evaluate the quartz expression."""
 
-    timezone_id: Optional[str] = None
-    """A Java timezone id. The schedule for a job will be resolved with respect to this timezone."""
+    pause_status: Optional[MonitorCronSchedulePauseStatus] = None
+    """Read only field that indicates whether a schedule is paused or not."""
 
     def as_dict(self) -> dict:
         """Serializes the MonitorCronSchedule into a dictionary suitable for use as a JSON request body."""
@@ -3095,62 +3125,12 @@ def from_dict(cls, d: Dict[str, any]) -> MonitorCronSchedule:
 
 class MonitorCronSchedulePauseStatus(Enum):
-    """Whether the schedule is paused or not"""
+    """Read only field that indicates whether a schedule is paused or not."""
 
     PAUSED = 'PAUSED'
     UNPAUSED = 'UNPAUSED'
 
 
-@dataclass
-class MonitorCustomMetric:
-    definition: Optional[str] = None
-    """Jinja template for a SQL expression that specifies how to compute the metric. See [create metric
-    definition].
-
-    [create metric definition]: https://docs.databricks.com/en/lakehouse-monitoring/custom-metrics.html#create-definition"""
-
-    input_columns: Optional[List[str]] = None
-    """Columns on the monitored table to apply the custom metrics to."""
-
-    name: Optional[str] = None
-    """Name of the custom metric."""
-
-    output_data_type: Optional[str] = None
-    """The output type of the custom metric."""
-
-    type: Optional[MonitorCustomMetricType] = None
-    """The type of the custom metric."""
-
-    def as_dict(self) -> dict:
-        """Serializes the MonitorCustomMetric into a dictionary suitable for use as a JSON request body."""
-        body = {}
-        if self.definition is not None: body['definition'] = self.definition
-        if self.input_columns: body['input_columns'] = [v for v in self.input_columns]
-        if self.name is not None: body['name'] = self.name
-        if self.output_data_type is not None: body['output_data_type'] = self.output_data_type
-        if self.type is not None: body['type'] = self.type.value
-        return body
-
-    @classmethod
-    def from_dict(cls, d: Dict[str, any]) -> MonitorCustomMetric:
-        """Deserializes the MonitorCustomMetric from a dictionary."""
-        return cls(definition=d.get('definition', None),
-                   input_columns=d.get('input_columns', None),
-                   name=d.get('name', None),
-                   output_data_type=d.get('output_data_type', None),
-                   type=_enum(d, 'type', MonitorCustomMetricType))
-
-
-class MonitorCustomMetricType(Enum):
-    """The type of the custom metric."""
-
-    CUSTOM_METRIC_TYPE_AGGREGATE = 'CUSTOM_METRIC_TYPE_AGGREGATE'
-    CUSTOM_METRIC_TYPE_DERIVED = 'CUSTOM_METRIC_TYPE_DERIVED'
-    CUSTOM_METRIC_TYPE_DRIFT = 'CUSTOM_METRIC_TYPE_DRIFT'
-    MONITOR_STATUS_ERROR = 'MONITOR_STATUS_ERROR'
-    MONITOR_STATUS_FAILED = 'MONITOR_STATUS_FAILED'
-
-
 @dataclass
 class MonitorDataClassificationConfig:
     enabled: Optional[bool] = None
@@ -3169,48 +3149,58 @@ def from_dict(cls, d: Dict[str, any]) -> MonitorDataClassificationConfig:
 
 @dataclass
-class MonitorDestinations:
+class MonitorDestination:
     email_addresses: Optional[List[str]] = None
     """The list of email addresses to send the notification to. A maximum of 5 email addresses is
     supported."""
 
     def as_dict(self) -> dict:
-        """Serializes the MonitorDestinations into a dictionary suitable for use as a JSON request body."""
+        """Serializes the MonitorDestination into a dictionary suitable for use as a JSON request body."""
         body = {}
         if self.email_addresses: body['email_addresses'] = [v for v in self.email_addresses]
         return body
 
     @classmethod
-    def from_dict(cls, d: Dict[str, any]) -> MonitorDestinations:
-        """Deserializes the MonitorDestinations from a dictionary."""
+    def from_dict(cls, d: Dict[str, any]) -> MonitorDestination:
+        """Deserializes the MonitorDestination from a dictionary."""
        return cls(email_addresses=d.get('email_addresses', None))
 
 
 @dataclass
-class MonitorInferenceLogProfileType:
-    granularities: Optional[List[str]] = None
-    """List of granularities to use when aggregating data into time windows based on their timestamp."""
+class MonitorInferenceLog:
+    timestamp_col: str
+    """Column that contains the timestamps of requests. The column must be one of the following: - A
+    ``TimestampType`` column - A column whose values can be converted to timestamps through the
+    pyspark ``to_timestamp`` [function].
+
+    [function]: https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.functions.to_timestamp.html"""
 
-    label_col: Optional[str] = None
-    """Column of the model label."""
+    granularities: List[str]
+    """Granularities for aggregating data into time windows based on their timestamp. Currently the
+    following static granularities are supported: {``"5 minutes"``, ``"30 minutes"``, ``"1 hour"``,
+    ``"1 day"``, ``"n week(s)"``, ``"1 month"``, ``"1 year"``}."""
 
-    model_id_col: Optional[str] = None
-    """Column of the model id or version."""
+    model_id_col: str
+    """Column that contains the id of the model generating the predictions. Metrics will be computed
+    per model id by default, and also across all model ids."""
 
-    prediction_col: Optional[str] = None
-    """Column of the model prediction."""
+    problem_type: MonitorInferenceLogProblemType
+    """Problem type the model aims to solve. Determines the type of model-quality metrics that will be
+    computed."""
 
-    prediction_proba_col: Optional[str] = None
-    """Column of the model prediction probabilities."""
+    prediction_col: str
+    """Column that contains the output/prediction from the model."""
 
-    problem_type: Optional[MonitorInferenceLogProfileTypeProblemType] = None
-    """Problem type the model aims to solve."""
+    label_col: Optional[str] = None
+    """Optional column that contains the ground truth for the prediction."""
 
-    timestamp_col: Optional[str] = None
-    """Column of the timestamp of predictions."""
+    prediction_proba_col: Optional[str] = None
+    """Optional column that contains the prediction probabilities for each class in a classification
+    problem type. The values in this column should be a map, mapping each class label to the
+    prediction probability for a given sample. The map should be of PySpark MapType()."""
 
     def as_dict(self) -> dict:
-        """Serializes the MonitorInferenceLogProfileType into a dictionary suitable for use as a JSON request body."""
+        """Serializes the MonitorInferenceLog into a dictionary suitable for use as a JSON request body."""
         body = {}
         if self.granularities: body['granularities'] = [v for v in self.granularities]
         if self.label_col is not None: body['label_col'] = self.label_col
@@ -3222,19 +3212,20 @@ def as_dict(self) -> dict:
         return body
 
     @classmethod
-    def from_dict(cls, d: Dict[str, any]) -> MonitorInferenceLogProfileType:
-        """Deserializes the MonitorInferenceLogProfileType from a dictionary."""
+    def from_dict(cls, d: Dict[str, any]) -> MonitorInferenceLog:
+        """Deserializes the MonitorInferenceLog from a dictionary."""
         return cls(granularities=d.get('granularities', None),
                    label_col=d.get('label_col', None),
                    model_id_col=d.get('model_id_col', None),
                    prediction_col=d.get('prediction_col', None),
                    prediction_proba_col=d.get('prediction_proba_col', None),
-                   problem_type=_enum(d, 'problem_type', MonitorInferenceLogProfileTypeProblemType),
+                   problem_type=_enum(d, 'problem_type', MonitorInferenceLogProblemType),
                    timestamp_col=d.get('timestamp_col', None))
 
 
-class MonitorInferenceLogProfileTypeProblemType(Enum):
-    """Problem type the model aims to solve."""
+class MonitorInferenceLogProblemType(Enum):
+    """Problem type the model aims to solve. Determines the type of model-quality metrics that will be
+    computed."""
 
     PROBLEM_TYPE_CLASSIFICATION = 'PROBLEM_TYPE_CLASSIFICATION'
     PROBLEM_TYPE_REGRESSION = 'PROBLEM_TYPE_REGRESSION'
@@ -3242,6 +3233,23 @@ class MonitorInferenceLogProfileTypeProblemType(Enum):
 
 @dataclass
 class MonitorInfo:
+    table_name: str
+    """The full name of the table to monitor.
+    Format: __catalog_name__.__schema_name__.__table_name__."""
+
+    status: MonitorInfoStatus
+    """The status of the monitor."""
+
+    monitor_version: str
+    """The version of the monitor config (e.g. 1,2,3). If negative, the monitor may be corrupted."""
+
+    profile_metrics_table_name: str
+    """The full name of the profile metrics table. Format:
+    __catalog_name__.__schema_name__.__table_name__."""
+
+    drift_metrics_table_name: str
+    """The full name of the drift metrics table. Format:
+    __catalog_name__.__schema_name__.__table_name__."""
+
     assets_dir: Optional[str] = None
     """The directory to store monitoring assets (e.g. dashboard, metric tables)."""
 
@@ -3249,40 +3257,30 @@ class MonitorInfo:
     """Name of the baseline table from which drift metrics are computed from. Columns in the monitored
     table should also be present in the baseline table."""
 
-    custom_metrics: Optional[List[MonitorCustomMetric]] = None
+    custom_metrics: Optional[List[MonitorMetric]] = None
     """Custom metrics to compute on the monitored table. These can be aggregate metrics, derived
     metrics (from already computed aggregate metrics), or drift metrics (comparing metrics across
     time windows)."""
 
     dashboard_id: Optional[str] = None
-    """The ID of the generated dashboard."""
+    """Id of dashboard that visualizes the computed metrics. This can be empty if the monitor is in
+    PENDING state."""
 
     data_classification_config: Optional[MonitorDataClassificationConfig] = None
     """The data classification config for the monitor."""
 
-    drift_metrics_table_name: Optional[str] = None
-    """The full name of the drift metrics table. Format:
-    __catalog_name__.__schema_name__.__table_name__."""
-
-    inference_log: Optional[MonitorInferenceLogProfileType] = None
+    inference_log: Optional[MonitorInferenceLog] = None
     """Configuration for monitoring inference logs."""
 
     latest_monitor_failure_msg: Optional[str] = None
     """The latest failure message of the monitor (if any)."""
 
-    monitor_version: Optional[str] = None
-    """The version of the monitor config (e.g. 1,2,3). If negative, the monitor may be corrupted."""
-
-    notifications: Optional[MonitorNotificationsConfig] = None
+    notifications: Optional[MonitorNotifications] = None
     """The notification settings for the monitor."""
 
     output_schema_name: Optional[str] = None
     """Schema where output metric tables are created."""
 
-    profile_metrics_table_name: Optional[str] = None
-    """The full name of the profile metrics table. Format:
-    __catalog_name__.__schema_name__.__table_name__."""
-
     schedule: Optional[MonitorCronSchedule] = None
     """The schedule for automatically updating and refreshing metric tables."""
 
@@ -3291,16 +3289,10 @@ class MonitorInfo:
     expression independently, resulting in a separate slice for each predicate and its complements.
     For high-cardinality columns, only the top 100 unique values by frequency will generate slices."""
 
-    snapshot: Optional[MonitorSnapshotProfileType] = None
+    snapshot: Optional[MonitorSnapshot] = None
     """Configuration for monitoring snapshot tables."""
 
-    status: Optional[MonitorInfoStatus] = None
-    """The status of the monitor."""
-
-    table_name: Optional[str] = None
-    """The full name of the table to monitor.
-    Format: __catalog_name__.__schema_name__.__table_name__."""
-
-    time_series: Optional[MonitorTimeSeriesProfileType] = None
+    time_series: Optional[MonitorTimeSeries] = None
     """Configuration for monitoring time series tables."""
 
     def as_dict(self) -> dict:
@@ -3335,23 +3327,23 @@ def from_dict(cls, d: Dict[str, any]) -> MonitorInfo:
         """Deserializes the MonitorInfo from a dictionary."""
         return cls(assets_dir=d.get('assets_dir', None),
                    baseline_table_name=d.get('baseline_table_name', None),
-                   custom_metrics=_repeated_dict(d, 'custom_metrics', MonitorCustomMetric),
+                   custom_metrics=_repeated_dict(d, 'custom_metrics', MonitorMetric),
                    dashboard_id=d.get('dashboard_id', None),
                    data_classification_config=_from_dict(d, 'data_classification_config',
                                                          MonitorDataClassificationConfig),
                    drift_metrics_table_name=d.get('drift_metrics_table_name', None),
-                   inference_log=_from_dict(d, 'inference_log', MonitorInferenceLogProfileType),
+                   inference_log=_from_dict(d, 'inference_log', MonitorInferenceLog),
                    latest_monitor_failure_msg=d.get('latest_monitor_failure_msg', None),
                    monitor_version=d.get('monitor_version', None),
-                   notifications=_from_dict(d, 'notifications', MonitorNotificationsConfig),
+                   notifications=_from_dict(d, 'notifications', MonitorNotifications),
                    output_schema_name=d.get('output_schema_name', None),
                    profile_metrics_table_name=d.get('profile_metrics_table_name', None),
                    schedule=_from_dict(d, 'schedule', MonitorCronSchedule),
                    slicing_exprs=d.get('slicing_exprs', None),
-                   snapshot=_from_dict(d, 'snapshot', MonitorSnapshotProfileType),
+                   snapshot=_from_dict(d, 'snapshot', MonitorSnapshot),
                    status=_enum(d, 'status', MonitorInfoStatus),
                    table_name=d.get('table_name', None),
-                   time_series=_from_dict(d, 'time_series', MonitorTimeSeriesProfileType))
+                   time_series=_from_dict(d, 'time_series', MonitorTimeSeries))
 
 
 class MonitorInfoStatus(Enum):
@@ -3365,38 +3357,109 @@ class MonitorInfoStatus(Enum):
 
 
 @dataclass
-class MonitorNotificationsConfig:
-    on_failure: Optional[MonitorDestinations] = None
+class MonitorMetric:
+    name: str
+    """Name of the metric in the output tables."""
+
+    definition: str
+    """Jinja template for a SQL expression that specifies how to compute the metric. See [create metric
+    definition].
+
+    [create metric definition]: https://docs.databricks.com/en/lakehouse-monitoring/custom-metrics.html#create-definition"""
+
+    input_columns: List[str]
+    """A list of column names in the input table the metric should be computed for. Can use
+    ``":table"`` to indicate that the metric needs information from multiple columns."""
+
+    output_data_type: str
+    """The output type of the custom metric."""
+
+    type: MonitorMetricType
+    """Can only be one of ``"CUSTOM_METRIC_TYPE_AGGREGATE"``, ``"CUSTOM_METRIC_TYPE_DERIVED"``, or
+    ``"CUSTOM_METRIC_TYPE_DRIFT"``. The ``"CUSTOM_METRIC_TYPE_AGGREGATE"`` and
+    ``"CUSTOM_METRIC_TYPE_DERIVED"`` metrics are computed on a single table, whereas the
+    ``"CUSTOM_METRIC_TYPE_DRIFT"`` compare metrics across baseline and input table, or across the
+    two consecutive time windows. - CUSTOM_METRIC_TYPE_AGGREGATE: only depend on the existing
- CUSTOM_METRIC_TYPE_AGGREGATE: depends only on the existing + columns in your table - CUSTOM_METRIC_TYPE_DERIVED: depends on previously computed aggregate + metrics - CUSTOM_METRIC_TYPE_DRIFT: depends on previously computed aggregate or derived metrics""" + + def as_dict(self) -> dict: + """Serializes the MonitorMetric into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.definition is not None: body['definition'] = self.definition + if self.input_columns: body['input_columns'] = [v for v in self.input_columns] + if self.name is not None: body['name'] = self.name + if self.output_data_type is not None: body['output_data_type'] = self.output_data_type + if self.type is not None: body['type'] = self.type.value + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> MonitorMetric: + """Deserializes the MonitorMetric from a dictionary.""" + return cls(definition=d.get('definition', None), + input_columns=d.get('input_columns', None), + name=d.get('name', None), + output_data_type=d.get('output_data_type', None), + type=_enum(d, 'type', MonitorMetricType)) + + +class MonitorMetricType(Enum): + """Can only be one of ``"CUSTOM_METRIC_TYPE_AGGREGATE"``, ``"CUSTOM_METRIC_TYPE_DERIVED"``, or + ``"CUSTOM_METRIC_TYPE_DRIFT"``. The ``"CUSTOM_METRIC_TYPE_AGGREGATE"`` and + ``"CUSTOM_METRIC_TYPE_DERIVED"`` metrics are computed on a single table, whereas + ``"CUSTOM_METRIC_TYPE_DRIFT"`` metrics compare metrics across the baseline and input table, or + across two consecutive time windows. - CUSTOM_METRIC_TYPE_AGGREGATE: depends only on the existing + columns in your table - CUSTOM_METRIC_TYPE_DERIVED: depends on previously computed aggregate + metrics - CUSTOM_METRIC_TYPE_DRIFT: depends on previously computed aggregate or derived metrics""" + + CUSTOM_METRIC_TYPE_AGGREGATE = 'CUSTOM_METRIC_TYPE_AGGREGATE' + CUSTOM_METRIC_TYPE_DERIVED = 'CUSTOM_METRIC_TYPE_DERIVED' + CUSTOM_METRIC_TYPE_DRIFT = 'CUSTOM_METRIC_TYPE_DRIFT' + + +@dataclass +class MonitorNotifications: + on_failure: Optional[MonitorDestination] = None """Who to send notifications to on monitor failure.""" + on_new_classification_tag_detected: Optional[MonitorDestination] = None + """Who to send notifications to when new data classification tags are detected.""" + def as_dict(self) -> dict: - """Serializes the MonitorNotificationsConfig into a dictionary suitable for use as a JSON request body.""" + """Serializes the MonitorNotifications into a dictionary suitable for use as a JSON request body.""" body = {} if self.on_failure: body['on_failure'] = self.on_failure.as_dict() + if self.on_new_classification_tag_detected: + body['on_new_classification_tag_detected'] = self.on_new_classification_tag_detected.as_dict() return body @classmethod - def from_dict(cls, d: Dict[str, any]) -> MonitorNotificationsConfig: - """Deserializes the MonitorNotificationsConfig from a dictionary.""" - return cls(on_failure=_from_dict(d, 'on_failure', MonitorDestinations)) + def from_dict(cls, d: Dict[str, any]) -> MonitorNotifications: + """Deserializes the MonitorNotifications from a dictionary.""" + return cls(on_failure=_from_dict(d, 'on_failure', MonitorDestination), + on_new_classification_tag_detected=_from_dict(d, 'on_new_classification_tag_detected', + MonitorDestination)) @dataclass class MonitorRefreshInfo: + refresh_id: int + """Unique ID of the refresh operation.""" + + state: MonitorRefreshInfoState + """The current state of the refresh.""" + + start_time_ms: int + """Time at which refresh operation was initiated
(milliseconds since 1/1/1970 UTC).""" + end_time_ms: Optional[int] = None - """The time at which the refresh ended, in epoch milliseconds.""" + """Time at which refresh operation completed (milliseconds since 1/1/1970 UTC).""" message: Optional[str] = None """An optional message to give insight into the current state of the job (e.g. FAILURE messages).""" - refresh_id: Optional[int] = None - """The ID of the refresh.""" - - start_time_ms: Optional[int] = None - """The time at which the refresh started, in epoch milliseconds.""" - - state: Optional[MonitorRefreshInfoState] = None - """The current state of the refresh.""" + trigger: Optional[MonitorRefreshInfoTrigger] = None + """The method by which the refresh was triggered.""" def as_dict(self) -> dict: """Serializes the MonitorRefreshInfo into a dictionary suitable for use as a JSON request body.""" @@ -3406,6 +3469,7 @@ def as_dict(self) -> dict: if self.refresh_id is not None: body['refresh_id'] = self.refresh_id if self.start_time_ms is not None: body['start_time_ms'] = self.start_time_ms if self.state is not None: body['state'] = self.state.value + if self.trigger is not None: body['trigger'] = self.trigger.value return body @classmethod @@ -3415,7 +3479,8 @@ def from_dict(cls, d: Dict[str, any]) -> MonitorRefreshInfo: message=d.get('message', None), refresh_id=d.get('refresh_id', None), start_time_ms=d.get('start_time_ms', None), - state=_enum(d, 'state', MonitorRefreshInfoState)) + state=_enum(d, 'state', MonitorRefreshInfoState), + trigger=_enum(d, 'trigger', MonitorRefreshInfoTrigger)) class MonitorRefreshInfoState(Enum): @@ -3428,39 +3493,51 @@ class MonitorRefreshInfoState(Enum): SUCCESS = 'SUCCESS' +class MonitorRefreshInfoTrigger(Enum): + """The method by which the refresh was triggered.""" + + MANUAL = 'MANUAL' + SCHEDULE = 'SCHEDULE' + + @dataclass -class MonitorSnapshotProfileType: +class MonitorSnapshot: def as_dict(self) -> dict: - """Serializes the MonitorSnapshotProfileType into a dictionary suitable for use as a JSON request body.""" + """Serializes the MonitorSnapshot into a dictionary suitable for use as a JSON request body.""" body = {} return body @classmethod - def from_dict(cls, d: Dict[str, any]) -> MonitorSnapshotProfileType: - """Deserializes the MonitorSnapshotProfileType from a dictionary.""" + def from_dict(cls, d: Dict[str, any]) -> MonitorSnapshot: + """Deserializes the MonitorSnapshot from a dictionary.""" return cls() @dataclass -class MonitorTimeSeriesProfileType: - granularities: Optional[List[str]] = None - """List of granularities to use when aggregating data into time windows based on their timestamp.""" +class MonitorTimeSeries: + timestamp_col: str + """Column that contains the timestamps of requests. The column must be one of the following: - A + ``TimestampType`` column - A column whose values can be converted to timestamps through the + pyspark ``to_timestamp`` [function]. + + [function]: https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.functions.to_timestamp.html""" - timestamp_col: Optional[str] = None - """The timestamp column. This must be timestamp types or convertible to timestamp types using the - pyspark to_timestamp function.""" + granularities: List[str] + """Granularities for aggregating data into time windows based on their timestamp. 
Currently the + following static granularities are supported: {``"5 minutes"``, ``"30 minutes"``, ``"1 hour"``, + ``"1 day"``, ``"<n> week(s)"``, ``"1 month"``, ``"1 year"``}.""" def as_dict(self) -> dict: - """Serializes the MonitorTimeSeriesProfileType into a dictionary suitable for use as a JSON request body.""" + """Serializes the MonitorTimeSeries into a dictionary suitable for use as a JSON request body.""" body = {} if self.granularities: body['granularities'] = [v for v in self.granularities] if self.timestamp_col is not None: body['timestamp_col'] = self.timestamp_col return body @classmethod - def from_dict(cls, d: Dict[str, any]) -> MonitorTimeSeriesProfileType: - """Deserializes the MonitorTimeSeriesProfileType from a dictionary.""" + def from_dict(cls, d: Dict[str, any]) -> MonitorTimeSeries: + """Deserializes the MonitorTimeSeries from a dictionary.""" return cls(granularities=d.get('granularities', None), timestamp_col=d.get('timestamp_col', None)) @@ -4207,7 +4284,7 @@ class StorageCredentialInfo: aws_iam_role: Optional[AwsIamRoleResponse] = None """The AWS IAM role configuration.""" - azure_managed_identity: Optional[AzureManagedIdentity] = None + azure_managed_identity: Optional[AzureManagedIdentityResponse] = None """The Azure managed identity configuration.""" azure_service_principal: Optional[AzureServicePrincipal] = None @@ -4280,7 +4357,8 @@ def as_dict(self) -> dict: def from_dict(cls, d: Dict[str, any]) -> StorageCredentialInfo: """Deserializes the StorageCredentialInfo from a dictionary.""" return cls(aws_iam_role=_from_dict(d, 'aws_iam_role', AwsIamRoleResponse), - azure_managed_identity=_from_dict(d, 'azure_managed_identity', AzureManagedIdentity), + azure_managed_identity=_from_dict(d, 'azure_managed_identity', + AzureManagedIdentityResponse), azure_service_principal=_from_dict(d, 'azure_service_principal', AzureServicePrincipal), cloudflare_api_token=_from_dict(d, 'cloudflare_api_token', CloudflareApiToken), comment=d.get('comment', None), @@ -4583,7 +4661,7 @@ def from_dict(cls, d: Dict[str, any]) -> TableInfo: @dataclass class TableRowFilter: - name: str + function_name: str """The full name of the row filter SQL UDF.""" input_column_names: List[str] @@ -4593,14 +4671,15 @@ class TableRowFilter: def as_dict(self) -> dict: """Serializes the TableRowFilter into a dictionary suitable for use as a JSON request body.""" body = {} + if self.function_name is not None: body['function_name'] = self.function_name if self.input_column_names: body['input_column_names'] = [v for v in self.input_column_names] - if self.name is not None: body['name'] = self.name return body @classmethod def from_dict(cls, d: Dict[str, any]) -> TableRowFilter: """Deserializes the TableRowFilter from a dictionary.""" - return cls(input_column_names=d.get('input_column_names', None), name=d.get('name', None)) + return cls(function_name=d.get('function_name', None), + input_column_names=d.get('input_column_names', None)) @dataclass @@ -4993,7 +5072,7 @@ class UpdateMonitor: """Name of the baseline table from which drift metrics are computed from. Columns in the monitored table should also be present in the baseline table.""" - custom_metrics: Optional[List[MonitorCustomMetric]] = None + custom_metrics: Optional[List[MonitorMetric]] = None """Custom metrics to compute on the monitored table.
These can be aggregate metrics, derived metrics (from already computed aggregate metrics), or drift metrics (comparing metrics across time windows).""" @@ -5001,13 +5080,10 @@ class UpdateMonitor: data_classification_config: Optional[MonitorDataClassificationConfig] = None """The data classification config for the monitor.""" - full_name: Optional[str] = None - """Full name of the table.""" - - inference_log: Optional[MonitorInferenceLogProfileType] = None + inference_log: Optional[MonitorInferenceLog] = None """Configuration for monitoring inference logs.""" - notifications: Optional[MonitorNotificationsConfig] = None + notifications: Optional[MonitorNotifications] = None """The notification settings for the monitor.""" schedule: Optional[MonitorCronSchedule] = None @@ -5018,10 +5094,13 @@ class UpdateMonitor: expression independently, resulting in a separate slice for each predicate and its complements. For high-cardinality columns, only the top 100 unique values by frequency will generate slices.""" - snapshot: Optional[MonitorSnapshotProfileType] = None + snapshot: Optional[MonitorSnapshot] = None """Configuration for monitoring snapshot tables.""" - time_series: Optional[MonitorTimeSeriesProfileType] = None + table_name: Optional[str] = None + """Full name of the table.""" + + time_series: Optional[MonitorTimeSeries] = None """Configuration for monitoring time series tables.""" def as_dict(self) -> dict: @@ -5031,13 +5110,13 @@ def as_dict(self) -> dict: if self.custom_metrics: body['custom_metrics'] = [v.as_dict() for v in self.custom_metrics] if self.data_classification_config: body['data_classification_config'] = self.data_classification_config.as_dict() - if self.full_name is not None: body['full_name'] = self.full_name if self.inference_log: body['inference_log'] = self.inference_log.as_dict() if self.notifications: body['notifications'] = self.notifications.as_dict() if self.output_schema_name is not None: body['output_schema_name'] = self.output_schema_name if self.schedule: body['schedule'] = self.schedule.as_dict() if self.slicing_exprs: body['slicing_exprs'] = [v for v in self.slicing_exprs] if self.snapshot: body['snapshot'] = self.snapshot.as_dict() + if self.table_name is not None: body['table_name'] = self.table_name if self.time_series: body['time_series'] = self.time_series.as_dict() return body @@ -5045,17 +5124,17 @@ def as_dict(self) -> dict: def from_dict(cls, d: Dict[str, any]) -> UpdateMonitor: """Deserializes the UpdateMonitor from a dictionary.""" return cls(baseline_table_name=d.get('baseline_table_name', None), - custom_metrics=_repeated_dict(d, 'custom_metrics', MonitorCustomMetric), + custom_metrics=_repeated_dict(d, 'custom_metrics', MonitorMetric), data_classification_config=_from_dict(d, 'data_classification_config', MonitorDataClassificationConfig), - full_name=d.get('full_name', None), - inference_log=_from_dict(d, 'inference_log', MonitorInferenceLogProfileType), - notifications=_from_dict(d, 'notifications', MonitorNotificationsConfig), + inference_log=_from_dict(d, 'inference_log', MonitorInferenceLog), + notifications=_from_dict(d, 'notifications', MonitorNotifications), output_schema_name=d.get('output_schema_name', None), schedule=_from_dict(d, 'schedule', MonitorCronSchedule), slicing_exprs=d.get('slicing_exprs', None), - snapshot=_from_dict(d, 'snapshot', MonitorSnapshotProfileType), - time_series=_from_dict(d, 'time_series', MonitorTimeSeriesProfileType)) + snapshot=_from_dict(d, 'snapshot', MonitorSnapshot), + table_name=d.get('table_name', 
None), + time_series=_from_dict(d, 'time_series', MonitorTimeSeries)) @dataclass @@ -5180,7 +5259,7 @@ class UpdateStorageCredential: aws_iam_role: Optional[AwsIamRoleRequest] = None """The AWS IAM role configuration.""" - azure_managed_identity: Optional[AzureManagedIdentity] = None + azure_managed_identity: Optional[AzureManagedIdentityResponse] = None """The Azure managed identity configuration.""" azure_service_principal: Optional[AzureServicePrincipal] = None @@ -5236,7 +5315,8 @@ def as_dict(self) -> dict: def from_dict(cls, d: Dict[str, any]) -> UpdateStorageCredential: """Deserializes the UpdateStorageCredential from a dictionary.""" return cls(aws_iam_role=_from_dict(d, 'aws_iam_role', AwsIamRoleRequest), - azure_managed_identity=_from_dict(d, 'azure_managed_identity', AzureManagedIdentity), + azure_managed_identity=_from_dict(d, 'azure_managed_identity', + AzureManagedIdentityResponse), azure_service_principal=_from_dict(d, 'azure_service_principal', AzureServicePrincipal), cloudflare_api_token=_from_dict(d, 'cloudflare_api_token', CloudflareApiToken), comment=d.get('comment', None), @@ -5346,7 +5426,7 @@ class ValidateStorageCredential: aws_iam_role: Optional[AwsIamRoleRequest] = None """The AWS IAM role configuration.""" - azure_managed_identity: Optional[AzureManagedIdentity] = None + azure_managed_identity: Optional[AzureManagedIdentityRequest] = None """The Azure managed identity configuration.""" azure_service_principal: Optional[AzureServicePrincipal] = None @@ -5392,7 +5472,8 @@ def as_dict(self) -> dict: def from_dict(cls, d: Dict[str, any]) -> ValidateStorageCredential: """Deserializes the ValidateStorageCredential from a dictionary.""" return cls(aws_iam_role=_from_dict(d, 'aws_iam_role', AwsIamRoleRequest), - azure_managed_identity=_from_dict(d, 'azure_managed_identity', AzureManagedIdentity), + azure_managed_identity=_from_dict(d, 'azure_managed_identity', + AzureManagedIdentityRequest), azure_service_principal=_from_dict(d, 'azure_service_principal', AzureServicePrincipal), cloudflare_api_token=_from_dict(d, 'cloudflare_api_token', CloudflareApiToken), databricks_gcp_service_account=_from_dict(d, 'databricks_gcp_service_account', @@ -5426,36 +5507,68 @@ def from_dict(cls, d: Dict[str, any]) -> ValidateStorageCredentialResponse: @dataclass class ValidationResult: - message: Optional[str] = None - """Error message would exist when the result does not equal to **PASS**.""" + aws_operation: Optional[ValidationResultAwsOperation] = None + """The operation tested.""" + + azure_operation: Optional[ValidationResultAzureOperation] = None + """The operation tested.""" - operation: Optional[ValidationResultOperation] = None + gcp_operation: Optional[ValidationResultGcpOperation] = None """The operation tested.""" + message: Optional[str] = None + """An error message is present when the result does not equal **PASS**.""" + result: Optional[ValidationResultResult] = None """The results of the tested operation.""" def as_dict(self) -> dict: """Serializes the ValidationResult into a dictionary suitable for use as a JSON request body.""" body = {} + if self.aws_operation is not None: body['aws_operation'] = self.aws_operation.value + if self.azure_operation is not None: body['azure_operation'] = self.azure_operation.value + if self.gcp_operation is not None: body['gcp_operation'] = self.gcp_operation.value + if self.message is not None: body['message'] = self.message - if self.operation is not None: body['operation'] = self.operation.value if self.result is not None: body['result']
= self.result.value return body @classmethod def from_dict(cls, d: Dict[str, any]) -> ValidationResult: """Deserializes the ValidationResult from a dictionary.""" - return cls(message=d.get('message', None), - operation=_enum(d, 'operation', ValidationResultOperation), + return cls(aws_operation=_enum(d, 'aws_operation', ValidationResultAwsOperation), + azure_operation=_enum(d, 'azure_operation', ValidationResultAzureOperation), + gcp_operation=_enum(d, 'gcp_operation', ValidationResultGcpOperation), + message=d.get('message', None), result=_enum(d, 'result', ValidationResultResult)) -class ValidationResultOperation(Enum): +class ValidationResultAwsOperation(Enum): + """The operation tested.""" + + DELETE = 'DELETE' + LIST = 'LIST' + PATH_EXISTS = 'PATH_EXISTS' + READ = 'READ' + WRITE = 'WRITE' + + +class ValidationResultAzureOperation(Enum): """The operation tested.""" DELETE = 'DELETE' + HIERARCHICAL_NAMESPACE_ENABLED = 'HIERARCHICAL_NAMESPACE_ENABLED' LIST = 'LIST' + PATH_EXISTS = 'PATH_EXISTS' + READ = 'READ' + WRITE = 'WRITE' + + +class ValidationResultGcpOperation(Enum): + """The operation tested.""" + + DELETE = 'DELETE' + LIST = 'LIST' + PATH_EXISTS = 'PATH_EXISTS' READ = 'READ' WRITE = 'WRITE' @@ -6818,7 +6931,7 @@ class LakehouseMonitorsAPI: def __init__(self, api_client): self._api = api_client - def cancel_refresh(self, full_name: str, refresh_id: str): + def cancel_refresh(self, table_name: str, refresh_id: str): """Cancel refresh. Cancel an active monitor refresh for the given refresh ID. @@ -6830,7 +6943,7 @@ def cancel_refresh(self, full_name: str, refresh_id: str): Additionally, the call must be made from the workspace where the monitor was created. - :param full_name: str + :param table_name: str Full name of the table. :param refresh_id: str ID of the refresh. @@ -6841,24 +6954,24 @@ def cancel_refresh(self, full_name: str, refresh_id: str): headers = {} self._api.do('POST', - f'/api/2.1/unity-catalog/tables/{full_name}/monitor/refreshes/{refresh_id}/cancel', + f'/api/2.1/unity-catalog/tables/{table_name}/monitor/refreshes/{refresh_id}/cancel', headers=headers) def create(self, - full_name: str, + table_name: str, assets_dir: str, output_schema_name: str, *, baseline_table_name: Optional[str] = None, - custom_metrics: Optional[List[MonitorCustomMetric]] = None, + custom_metrics: Optional[List[MonitorMetric]] = None, data_classification_config: Optional[MonitorDataClassificationConfig] = None, - inference_log: Optional[MonitorInferenceLogProfileType] = None, - notifications: Optional[MonitorNotificationsConfig] = None, + inference_log: Optional[MonitorInferenceLog] = None, + notifications: Optional[MonitorNotifications] = None, schedule: Optional[MonitorCronSchedule] = None, skip_builtin_dashboard: Optional[bool] = None, slicing_exprs: Optional[List[str]] = None, - snapshot: Optional[MonitorSnapshotProfileType] = None, - time_series: Optional[MonitorTimeSeriesProfileType] = None, + snapshot: Optional[MonitorSnapshot] = None, + time_series: Optional[MonitorTimeSeries] = None, warehouse_id: Optional[str] = None) -> MonitorInfo: """Create a table monitor. @@ -6872,7 +6985,7 @@ def create(self, Workspace assets, such as the dashboard, will be created in the workspace where this call was made. - :param full_name: str + :param table_name: str Full name of the table. :param assets_dir: str The directory to store monitoring assets (e.g. dashboard, metric tables). 
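The `table_name` rename and the renamed `Monitor*` classes come together in `create()`. The following is a minimal, illustrative sketch rather than part of the patch: the catalog, schema, column names, and the metric definition are placeholders, while the method and dataclass signatures follow the v0.25.0 classes shown above.

```py
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import (MonitorMetric, MonitorMetricType,
                                            MonitorTimeSeries)

w = WorkspaceClient()

# Hypothetical table; monitors are now addressed by `table_name`, not `full_name`.
table = "main.sales.orders"

info = w.lakehouse_monitors.create(
    table_name=table,
    assets_dir=f"/Workspace/Shared/monitoring/{table}",
    output_schema_name="main.sales_monitoring",
    time_series=MonitorTimeSeries(timestamp_col="event_ts",
                                  granularities=["1 day"]),
    custom_metrics=[
        MonitorMetric(
            name="avg_order_value",
            definition="avg(`{{input_column}}`)",  # illustrative Jinja template
            input_columns=["order_value"],
            output_data_type="double",
            type=MonitorMetricType.CUSTOM_METRIC_TYPE_AGGREGATE,
        )
    ],
)
print(info.status, info.monitor_version)
```

Per the updated `MonitorInfo` docstrings, `dashboard_id` may still be empty on the returned object while the monitor is in `PENDING` state.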
@@ -6881,14 +6994,14 @@ def create(self, :param baseline_table_name: str (optional) Name of the baseline table from which drift metrics are computed from. Columns in the monitored table should also be present in the baseline table. - :param custom_metrics: List[:class:`MonitorCustomMetric`] (optional) + :param custom_metrics: List[:class:`MonitorMetric`] (optional) Custom metrics to compute on the monitored table. These can be aggregate metrics, derived metrics (from already computed aggregate metrics), or drift metrics (comparing metrics across time windows). :param data_classification_config: :class:`MonitorDataClassificationConfig` (optional) The data classification config for the monitor. - :param inference_log: :class:`MonitorInferenceLogProfileType` (optional) + :param inference_log: :class:`MonitorInferenceLog` (optional) Configuration for monitoring inference logs. - :param notifications: :class:`MonitorNotificationsConfig` (optional) + :param notifications: :class:`MonitorNotifications` (optional) The notification settings for the monitor. :param schedule: :class:`MonitorCronSchedule` (optional) The schedule for automatically updating and refreshing metric tables. @@ -6898,9 +7011,9 @@ def create(self, List of column expressions to slice data with for targeted analysis. The data is grouped by each expression independently, resulting in a separate slice for each predicate and its complements. For high-cardinality columns, only the top 100 unique values by frequency will generate slices. - :param snapshot: :class:`MonitorSnapshotProfileType` (optional) + :param snapshot: :class:`MonitorSnapshot` (optional) Configuration for monitoring snapshot tables. - :param time_series: :class:`MonitorTimeSeriesProfileType` (optional) + :param time_series: :class:`MonitorTimeSeries` (optional) Configuration for monitoring time series tables. :param warehouse_id: str (optional) Optional argument to specify the warehouse for dashboard creation. If not specified, the first @@ -6926,12 +7039,12 @@ def create(self, headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } res = self._api.do('POST', - f'/api/2.1/unity-catalog/tables/{full_name}/monitor', + f'/api/2.1/unity-catalog/tables/{table_name}/monitor', body=body, headers=headers) return MonitorInfo.from_dict(res) - def delete(self, full_name: str): + def delete(self, table_name: str): """Delete a table monitor. Deletes a monitor for the specified table. @@ -6946,7 +7059,7 @@ def delete(self, full_name: str): Note that the metric tables and dashboard will not be deleted as part of this call; those assets must be manually cleaned up (if desired). - :param full_name: str + :param table_name: str Full name of the table. @@ -6954,9 +7067,9 @@ def delete(self, full_name: str): headers = {} - self._api.do('DELETE', f'/api/2.1/unity-catalog/tables/{full_name}/monitor', headers=headers) + self._api.do('DELETE', f'/api/2.1/unity-catalog/tables/{table_name}/monitor', headers=headers) - def get(self, full_name: str) -> MonitorInfo: + def get(self, table_name: str) -> MonitorInfo: """Get a table monitor. Gets a monitor for the specified table. @@ -6970,7 +7083,7 @@ def get(self, full_name: str) -> MonitorInfo: the monitor. Some information (e.g., dashboard) may be filtered out if the caller is in a different workspace than where the monitor was created. - :param full_name: str + :param table_name: str Full name of the table. 
:returns: :class:`MonitorInfo` @@ -6978,10 +7091,10 @@ def get(self, full_name: str) -> MonitorInfo: headers = {'Accept': 'application/json', } - res = self._api.do('GET', f'/api/2.1/unity-catalog/tables/{full_name}/monitor', headers=headers) + res = self._api.do('GET', f'/api/2.1/unity-catalog/tables/{table_name}/monitor', headers=headers) return MonitorInfo.from_dict(res) - def get_refresh(self, full_name: str, refresh_id: str) -> MonitorRefreshInfo: + def get_refresh(self, table_name: str, refresh_id: str) -> MonitorRefreshInfo: """Get refresh. Gets info about a specific monitor refresh using the given refresh ID. @@ -6993,7 +7106,7 @@ def get_refresh(self, full_name: str, refresh_id: str) -> MonitorRefreshInfo: Additionally, the call must be made from the workspace where the monitor was created. - :param full_name: str + :param table_name: str Full name of the table. :param refresh_id: str ID of the refresh. @@ -7004,11 +7117,11 @@ def get_refresh(self, full_name: str, refresh_id: str) -> MonitorRefreshInfo: headers = {'Accept': 'application/json', } res = self._api.do('GET', - f'/api/2.1/unity-catalog/tables/{full_name}/monitor/refreshes/{refresh_id}', + f'/api/2.1/unity-catalog/tables/{table_name}/monitor/refreshes/{refresh_id}', headers=headers) return MonitorRefreshInfo.from_dict(res) - def list_refreshes(self, full_name: str) -> Iterator[MonitorRefreshInfo]: + def list_refreshes(self, table_name: str) -> Iterator[MonitorRefreshInfo]: """List refreshes. Gets an array containing the history of the most recent refreshes (up to 25) for this table. @@ -7020,7 +7133,7 @@ def list_refreshes(self, full_name: str) -> Iterator[MonitorRefreshInfo]: Additionally, the call must be made from the workspace where the monitor was created. - :param full_name: str + :param table_name: str Full name of the table. :returns: Iterator over :class:`MonitorRefreshInfo` @@ -7029,11 +7142,11 @@ def list_refreshes(self, full_name: str) -> Iterator[MonitorRefreshInfo]: headers = {'Accept': 'application/json', } res = self._api.do('GET', - f'/api/2.1/unity-catalog/tables/{full_name}/monitor/refreshes', + f'/api/2.1/unity-catalog/tables/{table_name}/monitor/refreshes', headers=headers) return [MonitorRefreshInfo.from_dict(v) for v in res] - def run_refresh(self, full_name: str) -> MonitorRefreshInfo: + def run_refresh(self, table_name: str) -> MonitorRefreshInfo: """Queue a metric refresh for a monitor. Queues a metric refresh on the monitor for the specified table. The refresh will execute in the @@ -7046,7 +7159,7 @@ def run_refresh(self, full_name: str) -> MonitorRefreshInfo: Additionally, the call must be made from the workspace where the monitor was created. - :param full_name: str + :param table_name: str Full name of the table. 
:returns: :class:`MonitorRefreshInfo` @@ -7055,23 +7168,23 @@ def run_refresh(self, full_name: str) -> MonitorRefreshInfo: headers = {'Accept': 'application/json', } res = self._api.do('POST', - f'/api/2.1/unity-catalog/tables/{full_name}/monitor/refreshes', + f'/api/2.1/unity-catalog/tables/{table_name}/monitor/refreshes', headers=headers) return MonitorRefreshInfo.from_dict(res) def update(self, - full_name: str, + table_name: str, output_schema_name: str, *, baseline_table_name: Optional[str] = None, - custom_metrics: Optional[List[MonitorCustomMetric]] = None, + custom_metrics: Optional[List[MonitorMetric]] = None, data_classification_config: Optional[MonitorDataClassificationConfig] = None, - inference_log: Optional[MonitorInferenceLogProfileType] = None, - notifications: Optional[MonitorNotificationsConfig] = None, + inference_log: Optional[MonitorInferenceLog] = None, + notifications: Optional[MonitorNotifications] = None, schedule: Optional[MonitorCronSchedule] = None, slicing_exprs: Optional[List[str]] = None, - snapshot: Optional[MonitorSnapshotProfileType] = None, - time_series: Optional[MonitorTimeSeriesProfileType] = None) -> MonitorInfo: + snapshot: Optional[MonitorSnapshot] = None, + time_series: Optional[MonitorTimeSeries] = None) -> MonitorInfo: """Update a table monitor. Updates a monitor for the specified table. @@ -7086,21 +7199,21 @@ def update(self, Certain configuration fields, such as output asset identifiers, cannot be updated. - :param full_name: str + :param table_name: str Full name of the table. :param output_schema_name: str Schema where output metric tables are created. :param baseline_table_name: str (optional) Name of the baseline table from which drift metrics are computed from. Columns in the monitored table should also be present in the baseline table. - :param custom_metrics: List[:class:`MonitorCustomMetric`] (optional) + :param custom_metrics: List[:class:`MonitorMetric`] (optional) Custom metrics to compute on the monitored table. These can be aggregate metrics, derived metrics (from already computed aggregate metrics), or drift metrics (comparing metrics across time windows). :param data_classification_config: :class:`MonitorDataClassificationConfig` (optional) The data classification config for the monitor. - :param inference_log: :class:`MonitorInferenceLogProfileType` (optional) + :param inference_log: :class:`MonitorInferenceLog` (optional) Configuration for monitoring inference logs. - :param notifications: :class:`MonitorNotificationsConfig` (optional) + :param notifications: :class:`MonitorNotifications` (optional) The notification settings for the monitor. :param schedule: :class:`MonitorCronSchedule` (optional) The schedule for automatically updating and refreshing metric tables. @@ -7108,9 +7221,9 @@ def update(self, List of column expressions to slice data with for targeted analysis. The data is grouped by each expression independently, resulting in a separate slice for each predicate and its complements. For high-cardinality columns, only the top 100 unique values by frequency will generate slices. - :param snapshot: :class:`MonitorSnapshotProfileType` (optional) + :param snapshot: :class:`MonitorSnapshot` (optional) Configuration for monitoring snapshot tables. - :param time_series: :class:`MonitorTimeSeriesProfileType` (optional) + :param time_series: :class:`MonitorTimeSeries` (optional) Configuration for monitoring time series tables. 
:returns: :class:`MonitorInfo` @@ -7130,7 +7243,7 @@ def update(self, headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } res = self._api.do('PUT', - f'/api/2.1/unity-catalog/tables/{full_name}/monitor', + f'/api/2.1/unity-catalog/tables/{table_name}/monitor', body=body, headers=headers) return MonitorInfo.from_dict(res) @@ -7795,9 +7908,19 @@ def list(self, Whether to include registered models in the response for which the principal can only access selective metadata for :param max_results: int (optional) - Max number of registered models to return. If catalog and schema are unspecified, max_results must - be specified. If max_results is unspecified, we return all results, starting from the page specified - by page_token. + Max number of registered models to return. + + If both catalog and schema are specified: - when max_results is not specified, the page length is + set to a server configured value (10000, as of 4/2/2024). - when set to a value greater than 0, the + page length is the minimum of this value and a server configured value (10000, as of 4/2/2024); - + when set to 0, the page length is set to a server configured value (10000, as of 4/2/2024); - when + set to a value less than 0, an invalid parameter error is returned; + + If neither schema nor catalog is specified: - when max_results is not specified, the page length is + set to a server configured value (100, as of 4/2/2024). - when set to a value greater than 0, the + page length is the minimum of this value and a server configured value (1000, as of 4/2/2024); - + when set to 0, the page length is set to a server configured value (100, as of 4/2/2024); - when set + to a value less than 0, an invalid parameter error is returned; :param page_token: str (optional) Opaque token to send for the next page of results (pagination). :param schema_name: str (optional) @@ -8079,7 +8202,7 @@ def create(self, name: str, *, aws_iam_role: Optional[AwsIamRoleRequest] = None, - azure_managed_identity: Optional[AzureManagedIdentity] = None, + azure_managed_identity: Optional[AzureManagedIdentityRequest] = None, azure_service_principal: Optional[AzureServicePrincipal] = None, cloudflare_api_token: Optional[CloudflareApiToken] = None, comment: Optional[str] = None, @@ -8094,7 +8217,7 @@ def create(self, The credential name. The name must be unique within the metastore. :param aws_iam_role: :class:`AwsIamRoleRequest` (optional) The AWS IAM role configuration. - :param azure_managed_identity: :class:`AzureManagedIdentity` (optional) + :param azure_managed_identity: :class:`AzureManagedIdentityRequest` (optional) The Azure managed identity configuration. :param azure_service_principal: :class:`AzureServicePrincipal` (optional) The Azure service principal configuration. @@ -8213,7 +8336,7 @@ def update(self, name: str, *, aws_iam_role: Optional[AwsIamRoleRequest] = None, - azure_managed_identity: Optional[AzureManagedIdentity] = None, + azure_managed_identity: Optional[AzureManagedIdentityResponse] = None, azure_service_principal: Optional[AzureServicePrincipal] = None, cloudflare_api_token: Optional[CloudflareApiToken] = None, comment: Optional[str] = None, @@ -8231,7 +8354,7 @@ def update(self, Name of the storage credential. :param aws_iam_role: :class:`AwsIamRoleRequest` (optional) The AWS IAM role configuration. - :param azure_managed_identity: :class:`AzureManagedIdentity` (optional) + :param azure_managed_identity: :class:`AzureManagedIdentityResponse` (optional) The Azure managed identity configuration. 
:param azure_service_principal: :class:`AzureServicePrincipal` (optional) The Azure service principal configuration. @@ -8280,7 +8403,7 @@ def update(self, def validate(self, *, aws_iam_role: Optional[AwsIamRoleRequest] = None, - azure_managed_identity: Optional[AzureManagedIdentity] = None, + azure_managed_identity: Optional[AzureManagedIdentityRequest] = None, azure_service_principal: Optional[AzureServicePrincipal] = None, cloudflare_api_token: Optional[CloudflareApiToken] = None, databricks_gcp_service_account: Optional[DatabricksGcpServiceAccountRequest] = None, @@ -8302,7 +8425,7 @@ def validate(self, :param aws_iam_role: :class:`AwsIamRoleRequest` (optional) The AWS IAM role configuration. - :param azure_managed_identity: :class:`AzureManagedIdentity` (optional) + :param azure_managed_identity: :class:`AzureManagedIdentityRequest` (optional) The Azure managed identity configuration. :param azure_service_principal: :class:`AzureServicePrincipal` (optional) The Azure service principal configuration. diff --git a/databricks/sdk/service/compute.py b/databricks/sdk/service/compute.py index 1639daa62..d3cf5c647 100755 --- a/databricks/sdk/service/compute.py +++ b/databricks/sdk/service/compute.py @@ -388,6 +388,23 @@ def from_dict(cls, d: Dict[str, any]) -> ClientsTypes: return cls(jobs=d.get('jobs', None), notebooks=d.get('notebooks', None)) +@dataclass +class CloneCluster: + source_cluster_id: str + """The cluster that is being cloned.""" + + def as_dict(self) -> dict: + """Serializes the CloneCluster into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.source_cluster_id is not None: body['source_cluster_id'] = self.source_cluster_id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CloneCluster: + """Deserializes the CloneCluster from a dictionary.""" + return cls(source_cluster_id=d.get('source_cluster_id', None)) + + @dataclass class CloudProviderNodeInfo: status: Optional[List[CloudProviderNodeStatus]] = None @@ -1420,6 +1437,10 @@ class ClusterSpec: """Attributes related to clusters running on Microsoft Azure. If not specified at cluster creation, a set of default values will be used.""" + clone_from: Optional[CloneCluster] = None + """When specified, this clones libraries from a source cluster during the creation of a new + cluster.""" + cluster_log_conf: Optional[ClusterLogConf] = None """The configuration for delivering spark logs to a long-term storage destination. Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified for one cluster. 
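Before continuing with the compute changes, the catalog-side pieces above fit together in storage-credential validation: `validate()` now takes an `AzureManagedIdentityRequest`, and each `ValidationResult` carries a cloud-specific operation enum. A hedged sketch, with placeholder Azure resource IDs and storage URL; the `url` argument and the `results` field are assumed from the SDK's validate API, not shown in this patch:

```py
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import AzureManagedIdentityRequest

w = WorkspaceClient()

# Validate an Azure managed identity against a storage location before
# creating a credential; the connector ID and URL are placeholders.
resp = w.storage_credentials.validate(
    azure_managed_identity=AzureManagedIdentityRequest(
        access_connector_id="/subscriptions/<id>/resourceGroups/<rg>/providers/"
        "Microsoft.Databricks/accessConnectors/<connector>"),
    url="abfss://container@account.dfs.core.windows.net/path",
)
for check in resp.results or []:
    # Each result now reports the cloud-specific operation that was tested.
    print(check.azure_operation, check.result, check.message)
```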
@@ -1554,6 +1575,7 @@ def as_dict(self) -> dict: body['autotermination_minutes'] = self.autotermination_minutes if self.aws_attributes: body['aws_attributes'] = self.aws_attributes.as_dict() if self.azure_attributes: body['azure_attributes'] = self.azure_attributes.as_dict() + if self.clone_from: body['clone_from'] = self.clone_from.as_dict() if self.cluster_log_conf: body['cluster_log_conf'] = self.cluster_log_conf.as_dict() if self.cluster_name is not None: body['cluster_name'] = self.cluster_name if self.cluster_source is not None: body['cluster_source'] = self.cluster_source.value @@ -1589,6 +1611,7 @@ def from_dict(cls, d: Dict[str, any]) -> ClusterSpec: autotermination_minutes=d.get('autotermination_minutes', None), aws_attributes=_from_dict(d, 'aws_attributes', AwsAttributes), azure_attributes=_from_dict(d, 'azure_attributes', AzureAttributes), + clone_from=_from_dict(d, 'clone_from', CloneCluster), cluster_log_conf=_from_dict(d, 'cluster_log_conf', ClusterLogConf), cluster_name=d.get('cluster_name', None), cluster_source=_enum(d, 'cluster_source', ClusterSource), @@ -1679,29 +1702,6 @@ def from_dict(cls, d: Dict[str, any]) -> CommandStatusResponse: status=_enum(d, 'status', CommandStatus)) -@dataclass -class ComputeSpec: - kind: Optional[ComputeSpecKind] = None - """The kind of compute described by this compute specification.""" - - def as_dict(self) -> dict: - """Serializes the ComputeSpec into a dictionary suitable for use as a JSON request body.""" - body = {} - if self.kind is not None: body['kind'] = self.kind.value - return body - - @classmethod - def from_dict(cls, d: Dict[str, any]) -> ComputeSpec: - """Deserializes the ComputeSpec from a dictionary.""" - return cls(kind=_enum(d, 'kind', ComputeSpecKind)) - - -class ComputeSpecKind(Enum): - """The kind of compute described by this compute specification.""" - - SERVERLESS_PREVIEW = 'SERVERLESS_PREVIEW' - - class ContextStatus(Enum): ERROR = 'Error' @@ -1754,6 +1754,10 @@ class CreateCluster: """Attributes related to clusters running on Microsoft Azure. If not specified at cluster creation, a set of default values will be used.""" + clone_from: Optional[CloneCluster] = None + """When specified, this clones libraries from a source cluster during the creation of a new + cluster.""" + cluster_log_conf: Optional[ClusterLogConf] = None """The configuration for delivering spark logs to a long-term storage destination. Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified for one cluster. 
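The new `clone_from` field threads through `ClusterSpec`, `CreateCluster`, and `EditCluster`, and the `create()`/`edit()` wrappers later in this file accept it directly. A minimal sketch, assuming a placeholder source cluster ID and the SDK's `select_spark_version`/`select_node_type` helpers (which are part of the clusters extension, not this patch):

```py
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import CloneCluster

w = WorkspaceClient()

# Libraries installed on the (placeholder) source cluster are cloned onto
# the new cluster at creation time.
cluster = w.clusters.create_and_wait(
    cluster_name="cloned-libraries",
    spark_version=w.clusters.select_spark_version(long_term_support=True),
    node_type_id=w.clusters.select_node_type(local_disk=True),
    num_workers=1,
    autotermination_minutes=60,
    clone_from=CloneCluster(source_cluster_id="0123-456789-abcdefgh"),
)
print(cluster.cluster_id, cluster.state)
```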
@@ -1884,6 +1888,7 @@ def as_dict(self) -> dict: body['autotermination_minutes'] = self.autotermination_minutes if self.aws_attributes: body['aws_attributes'] = self.aws_attributes.as_dict() if self.azure_attributes: body['azure_attributes'] = self.azure_attributes.as_dict() + if self.clone_from: body['clone_from'] = self.clone_from.as_dict() if self.cluster_log_conf: body['cluster_log_conf'] = self.cluster_log_conf.as_dict() if self.cluster_name is not None: body['cluster_name'] = self.cluster_name if self.cluster_source is not None: body['cluster_source'] = self.cluster_source.value @@ -1919,6 +1924,7 @@ def from_dict(cls, d: Dict[str, any]) -> CreateCluster: return cls(autotermination_minutes=d.get('autotermination_minutes', None), aws_attributes=_from_dict(d, 'aws_attributes', AwsAttributes), azure_attributes=_from_dict(d, 'azure_attributes', AzureAttributes), + clone_from=_from_dict(d, 'clone_from', CloneCluster), cluster_log_conf=_from_dict(d, 'cluster_log_conf', ClusterLogConf), cluster_name=d.get('cluster_name', None), cluster_source=_enum(d, 'cluster_source', ClusterSource), @@ -2592,6 +2598,10 @@ class EditCluster: """Attributes related to clusters running on Microsoft Azure. If not specified at cluster creation, a set of default values will be used.""" + clone_from: Optional[CloneCluster] = None + """When specified, this clones libraries from a source cluster during the creation of a new + cluster.""" + cluster_log_conf: Optional[ClusterLogConf] = None """The configuration for delivering spark logs to a long-term storage destination. Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified for one cluster. @@ -2722,6 +2732,7 @@ def as_dict(self) -> dict: body['autotermination_minutes'] = self.autotermination_minutes if self.aws_attributes: body['aws_attributes'] = self.aws_attributes.as_dict() if self.azure_attributes: body['azure_attributes'] = self.azure_attributes.as_dict() + if self.clone_from: body['clone_from'] = self.clone_from.as_dict() if self.cluster_id is not None: body['cluster_id'] = self.cluster_id if self.cluster_log_conf: body['cluster_log_conf'] = self.cluster_log_conf.as_dict() if self.cluster_name is not None: body['cluster_name'] = self.cluster_name @@ -2758,6 +2769,7 @@ def from_dict(cls, d: Dict[str, any]) -> EditCluster: return cls(autotermination_minutes=d.get('autotermination_minutes', None), aws_attributes=_from_dict(d, 'aws_attributes', AwsAttributes), azure_attributes=_from_dict(d, 'azure_attributes', AzureAttributes), + clone_from=_from_dict(d, 'clone_from', CloneCluster), cluster_id=d.get('cluster_id', None), cluster_log_conf=_from_dict(d, 'cluster_log_conf', ClusterLogConf), cluster_name=d.get('cluster_name', None), @@ -2969,6 +2981,37 @@ def from_dict(cls, d: Dict[str, any]) -> EditResponse: return cls() +@dataclass +class Environment: + """The environment entity used to preserve the serverless environment side panel and jobs' + environment for non-notebook tasks. In this minimal environment spec, only pip dependencies are + supported. Next ID: 5""" + + client: str + """* User-friendly name for the client version: “client”: “1” The version is a string, + consisting of the major client version""" + + dependencies: Optional[List[str]] = None + """List of pip dependencies, as supported by the version of pip in this environment. Each + dependency is a pip requirement file line + https://pip.pypa.io/en/stable/reference/requirements-file-format/ Allowed dependency could be + <requirement specifier>, <archive url/path>, <local project path> (WSFS or Volumes in + Databricks), <vcs project url>. E.g.
dependencies: ["foo==0.0.1", "-r + /Workspace/test/requirements.txt"]""" + + def as_dict(self) -> dict: + """Serializes the Environment into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.client is not None: body['client'] = self.client + if self.dependencies: body['dependencies'] = [v for v in self.dependencies] + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> Environment: + """Deserializes the Environment from a dictionary.""" + return cls(client=d.get('client', None), dependencies=d.get('dependencies', None)) + + @dataclass class EventDetails: attributes: Optional[ClusterAttributes] = None @@ -6208,6 +6251,7 @@ def create(self, autotermination_minutes: Optional[int] = None, aws_attributes: Optional[AwsAttributes] = None, azure_attributes: Optional[AzureAttributes] = None, + clone_from: Optional[CloneCluster] = None, cluster_log_conf: Optional[ClusterLogConf] = None, cluster_name: Optional[str] = None, cluster_source: Optional[ClusterSource] = None, @@ -6256,6 +6300,8 @@ def create(self, :param azure_attributes: :class:`AzureAttributes` (optional) Attributes related to clusters running on Microsoft Azure. If not specified at cluster creation, a set of default values will be used. + :param clone_from: :class:`CloneCluster` (optional) + When specified, this clones libraries from a source cluster during the creation of a new cluster. :param cluster_log_conf: :class:`ClusterLogConf` (optional) The configuration for delivering spark logs to a long-term storage destination. Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified for one cluster. If @@ -6364,6 +6410,7 @@ def create(self, if autotermination_minutes is not None: body['autotermination_minutes'] = autotermination_minutes if aws_attributes is not None: body['aws_attributes'] = aws_attributes.as_dict() if azure_attributes is not None: body['azure_attributes'] = azure_attributes.as_dict() + if clone_from is not None: body['clone_from'] = clone_from.as_dict() if cluster_log_conf is not None: body['cluster_log_conf'] = cluster_log_conf.as_dict() if cluster_name is not None: body['cluster_name'] = cluster_name if cluster_source is not None: body['cluster_source'] = cluster_source.value @@ -6404,6 +6451,7 @@ def create_and_wait( autotermination_minutes: Optional[int] = None, aws_attributes: Optional[AwsAttributes] = None, azure_attributes: Optional[AzureAttributes] = None, + clone_from: Optional[CloneCluster] = None, cluster_log_conf: Optional[ClusterLogConf] = None, cluster_name: Optional[str] = None, cluster_source: Optional[ClusterSource] = None, @@ -6432,6 +6480,7 @@ def create_and_wait( autotermination_minutes=autotermination_minutes, aws_attributes=aws_attributes, azure_attributes=azure_attributes, + clone_from=clone_from, cluster_log_conf=cluster_log_conf, cluster_name=cluster_name, cluster_source=cluster_source, @@ -6491,6 +6540,7 @@ def edit(self, autotermination_minutes: Optional[int] = None, aws_attributes: Optional[AwsAttributes] = None, azure_attributes: Optional[AzureAttributes] = None, + clone_from: Optional[CloneCluster] = None, cluster_log_conf: Optional[ClusterLogConf] = None, cluster_name: Optional[str] = None, cluster_source: Optional[ClusterSource] = None, @@ -6546,6 +6596,8 @@ def edit(self, :param azure_attributes: :class:`AzureAttributes` (optional) Attributes related to clusters running on Microsoft Azure. If not specified at cluster creation, a set of default values will be used. 
+ :param clone_from: :class:`CloneCluster` (optional) + When specified, this clones libraries from a source cluster during the creation of a new cluster. :param cluster_log_conf: :class:`ClusterLogConf` (optional) The configuration for delivering spark logs to a long-term storage destination. Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified for one cluster. If @@ -6654,6 +6706,7 @@ def edit(self, if autotermination_minutes is not None: body['autotermination_minutes'] = autotermination_minutes if aws_attributes is not None: body['aws_attributes'] = aws_attributes.as_dict() if azure_attributes is not None: body['azure_attributes'] = azure_attributes.as_dict() + if clone_from is not None: body['clone_from'] = clone_from.as_dict() if cluster_id is not None: body['cluster_id'] = cluster_id if cluster_log_conf is not None: body['cluster_log_conf'] = cluster_log_conf.as_dict() if cluster_name is not None: body['cluster_name'] = cluster_name @@ -6696,6 +6749,7 @@ def edit_and_wait( autotermination_minutes: Optional[int] = None, aws_attributes: Optional[AwsAttributes] = None, azure_attributes: Optional[AzureAttributes] = None, + clone_from: Optional[CloneCluster] = None, cluster_log_conf: Optional[ClusterLogConf] = None, cluster_name: Optional[str] = None, cluster_source: Optional[ClusterSource] = None, @@ -6724,6 +6778,7 @@ def edit_and_wait( autotermination_minutes=autotermination_minutes, aws_attributes=aws_attributes, azure_attributes=azure_attributes, + clone_from=clone_from, cluster_id=cluster_id, cluster_log_conf=cluster_log_conf, cluster_name=cluster_name, diff --git a/databricks/sdk/service/iam.py b/databricks/sdk/service/iam.py index 5db539a54..27f448ccb 100755 --- a/databricks/sdk/service/iam.py +++ b/databricks/sdk/service/iam.py @@ -890,7 +890,7 @@ class PermissionsRequest: request_object_type: Optional[str] = None """The type of the request object. Can be one of the following: authorization, clusters, cluster-policies, directories, experiments, files, instance-pools, jobs, notebooks, pipelines, - registered-models, repos, serving-endpoints, or sql-warehouses.""" + registered-models, repos, serving-endpoints, or warehouses.""" def as_dict(self) -> dict: """Serializes the PermissionsRequest into a dictionary suitable for use as a JSON request body.""" @@ -915,7 +915,7 @@ class PrincipalOutput: """The display name of the principal.""" group_name: Optional[str] = None - """The group name of the groupl. Present only if the principal is a group.""" + """The group name of the group. Present only if the principal is a group.""" principal_id: Optional[int] = None """The unique, opaque id of the principal.""" @@ -1135,7 +1135,9 @@ def from_dict(cls, d: Dict[str, any]) -> UpdateRuleSetRequest: @dataclass class UpdateWorkspaceAssignments: permissions: List[WorkspacePermission] - """Array of permissions assignments to update on the workspace.""" + """Array of permissions assignments to update on the workspace. 
Note that excluding this field will + have the same effect as providing an empty list which will result in the deletion of all + permissions for the principal.""" principal_id: Optional[int] = None """The ID of the user, service principal, or group.""" @@ -1238,20 +1240,6 @@ class UserSchema(Enum): URN_IETF_PARAMS_SCIM_SCHEMAS_EXTENSION_WORKSPACE_2_0_USER = 'urn:ietf:params:scim:schemas:extension:workspace:2.0:User' -@dataclass -class WorkspaceAssignmentsUpdated: - - def as_dict(self) -> dict: - """Serializes the WorkspaceAssignmentsUpdated into a dictionary suitable for use as a JSON request body.""" - body = {} - return body - - @classmethod - def from_dict(cls, d: Dict[str, any]) -> WorkspaceAssignmentsUpdated: - """Deserializes the WorkspaceAssignmentsUpdated from a dictionary.""" - return cls() - - class WorkspacePermission(Enum): ADMIN = 'ADMIN' @@ -2607,7 +2595,7 @@ def get(self, request_object_type: str, request_object_id: str) -> ObjectPermiss :param request_object_type: str The type of the request object. Can be one of the following: authorization, clusters, cluster-policies, directories, experiments, files, instance-pools, jobs, notebooks, pipelines, - registered-models, repos, serving-endpoints, or sql-warehouses. + registered-models, repos, serving-endpoints, or warehouses. :param request_object_id: str The id of the request object. @@ -2655,7 +2643,7 @@ def set(self, :param request_object_type: str The type of the request object. Can be one of the following: authorization, clusters, cluster-policies, directories, experiments, files, instance-pools, jobs, notebooks, pipelines, - registered-models, repos, serving-endpoints, or sql-warehouses. + registered-models, repos, serving-endpoints, or warehouses. :param request_object_id: str The id of the request object. :param access_control_list: List[:class:`AccessControlRequest`] (optional) @@ -2686,7 +2674,7 @@ def update(self, :param request_object_type: str The type of the request object. Can be one of the following: authorization, clusters, cluster-policies, directories, experiments, files, instance-pools, jobs, notebooks, pipelines, - registered-models, repos, serving-endpoints, or sql-warehouses. + registered-models, repos, serving-endpoints, or warehouses. :param request_object_id: str The id of the request object. :param access_control_list: List[:class:`AccessControlRequest`] (optional) @@ -3378,7 +3366,8 @@ def list(self, workspace_id: int) -> Iterator[PermissionAssignment]: parsed = PermissionAssignments.from_dict(json).permission_assignments return parsed if parsed is not None else [] - def update(self, workspace_id: int, principal_id: int, permissions: List[WorkspacePermission]): + def update(self, workspace_id: int, principal_id: int, + permissions: List[WorkspacePermission]) -> PermissionAssignment: """Create or update permissions assignment. Creates or updates the workspace permissions assignment in a given account and workspace for the @@ -3389,16 +3378,19 @@ def update(self, workspace_id: int, principal_id: int, permissions: List[Workspa :param principal_id: int The ID of the user, service principal, or group. :param permissions: List[:class:`WorkspacePermission`] - Array of permissions assignments to update on the workspace. - + Array of permissions assignments to update on the workspace. Note that excluding this field will + have the same effect as providing an empty list which will result in the deletion of all permissions + for the principal. 
+ :returns: :class:`PermissionAssignment` """ body = {} if permissions is not None: body['permissions'] = [v.value for v in permissions] headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } - self._api.do( + res = self._api.do( 'PUT', f'/api/2.0/accounts/{self._api.account_id}/workspaces/{workspace_id}/permissionassignments/principals/{principal_id}', body=body, headers=headers) + return PermissionAssignment.from_dict(res) diff --git a/databricks/sdk/service/jobs.py b/databricks/sdk/service/jobs.py index b5287f530..34db760ab 100755 --- a/databricks/sdk/service/jobs.py +++ b/databricks/sdk/service/jobs.py @@ -356,10 +356,6 @@ def from_dict(cls, d: Dict[str, any]) -> ClusterInstance: @dataclass class ClusterSpec: - compute_key: Optional[str] = None - """The key of the compute requirement, specified in `job.settings.compute`, to use for execution of - this task.""" - existing_cluster_id: Optional[str] = None """If existing_cluster_id, the ID of an existing cluster that is used for all runs. When running jobs or tasks on an existing cluster, you may need to manually restart the cluster if it stops @@ -379,7 +375,6 @@ class ClusterSpec: def as_dict(self) -> dict: """Serializes the ClusterSpec into a dictionary suitable for use as a JSON request body.""" body = {} - if self.compute_key is not None: body['compute_key'] = self.compute_key if self.existing_cluster_id is not None: body['existing_cluster_id'] = self.existing_cluster_id if self.job_cluster_key is not None: body['job_cluster_key'] = self.job_cluster_key if self.libraries: body['libraries'] = [v.as_dict() for v in self.libraries] @@ -389,8 +384,7 @@ def as_dict(self) -> dict: @classmethod def from_dict(cls, d: Dict[str, any]) -> ClusterSpec: """Deserializes the ClusterSpec from a dictionary.""" - return cls(compute_key=d.get('compute_key', None), - existing_cluster_id=d.get('existing_cluster_id', None), + return cls(existing_cluster_id=d.get('existing_cluster_id', None), job_cluster_key=d.get('job_cluster_key', None), libraries=_repeated_dict(d, 'libraries', compute.Library), new_cluster=_from_dict(d, 'new_cluster', compute.ClusterSpec)) @@ -478,9 +472,6 @@ class CreateJob: access_control_list: Optional[List[iam.AccessControlRequest]] = None """List of permissions to set on the job.""" - compute: Optional[List[JobCompute]] = None - """A list of compute requirements that can be referenced by tasks of this job.""" - continuous: Optional[Continuous] = None """An optional continuous property for this job. The continuous property will ensure that there is always one run executing. Only one of `schedule` and `continuous` can be used.""" @@ -501,6 +492,9 @@ class CreateJob: """An optional set of email addresses that is notified when runs of this job begin or complete as well as when this job is deleted.""" + environments: Optional[List[JobEnvironment]] = None + """A list of task execution environment specifications that can be referenced by tasks of this job.""" + format: Optional[Format] = None """Used to tell what is the format of the job. This field is ignored in Create/Update/Reset calls. 
When using the Jobs API 2.1 this value is always set to `"MULTI_TASK"`.""" @@ -582,12 +576,12 @@ def as_dict(self) -> dict: body = {} if self.access_control_list: body['access_control_list'] = [v.as_dict() for v in self.access_control_list] - if self.compute: body['compute'] = [v.as_dict() for v in self.compute] if self.continuous: body['continuous'] = self.continuous.as_dict() if self.deployment: body['deployment'] = self.deployment.as_dict() if self.description is not None: body['description'] = self.description if self.edit_mode is not None: body['edit_mode'] = self.edit_mode.value if self.email_notifications: body['email_notifications'] = self.email_notifications.as_dict() + if self.environments: body['environments'] = [v.as_dict() for v in self.environments] if self.format is not None: body['format'] = self.format.value if self.git_source: body['git_source'] = self.git_source.as_dict() if self.health: body['health'] = self.health.as_dict() @@ -610,12 +604,12 @@ def from_dict(cls, d: Dict[str, any]) -> CreateJob: """Deserializes the CreateJob from a dictionary.""" return cls(access_control_list=_repeated_dict(d, 'access_control_list', iam.AccessControlRequest), - compute=_repeated_dict(d, 'compute', JobCompute), continuous=_from_dict(d, 'continuous', Continuous), deployment=_from_dict(d, 'deployment', JobDeployment), description=d.get('description', None), edit_mode=_enum(d, 'edit_mode', JobEditMode), email_notifications=_from_dict(d, 'email_notifications', JobEmailNotifications), + environments=_repeated_dict(d, 'environments', JobEnvironment), format=_enum(d, 'format', Format), git_source=_from_dict(d, 'git_source', GitSource), health=_from_dict(d, 'health', JobsHealthRules), @@ -1260,28 +1254,6 @@ def from_dict(cls, d: Dict[str, any]) -> JobCluster: new_cluster=_from_dict(d, 'new_cluster', compute.ClusterSpec)) -@dataclass -class JobCompute: - compute_key: str - """A unique name for the compute requirement. This field is required and must be unique within the - job. `JobTaskSettings` may refer to this field to determine the compute requirements for the - task execution.""" - - spec: compute.ComputeSpec - - def as_dict(self) -> dict: - """Serializes the JobCompute into a dictionary suitable for use as a JSON request body.""" - body = {} - if self.compute_key is not None: body['compute_key'] = self.compute_key - if self.spec: body['spec'] = self.spec.as_dict() - return body - - @classmethod - def from_dict(cls, d: Dict[str, any]) -> JobCompute: - """Deserializes the JobCompute from a dictionary.""" - return cls(compute_key=d.get('compute_key', None), spec=_from_dict(d, 'spec', compute.ComputeSpec)) - - @dataclass class JobDeployment: kind: JobDeploymentKind @@ -1374,6 +1346,30 @@ def from_dict(cls, d: Dict[str, any]) -> JobEmailNotifications: on_success=d.get('on_success', None)) +@dataclass +class JobEnvironment: + environment_key: str + """The key of an environment. It has to be unique within a job.""" + + spec: Optional[compute.Environment] = None + """The environment entity used to preserve the serverless environment side panel and jobs' + environment for non-notebook tasks. In this minimal environment spec, only pip dependencies are + supported.
Next ID: 5""" + + def as_dict(self) -> dict: + """Serializes the JobEnvironment into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.environment_key is not None: body['environment_key'] = self.environment_key + if self.spec: body['spec'] = self.spec.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> JobEnvironment: + """Deserializes the JobEnvironment from a dictionary.""" + return cls(environment_key=d.get('environment_key', None), + spec=_from_dict(d, 'spec', compute.Environment)) + + @dataclass class JobNotificationSettings: no_alert_for_canceled_runs: Optional[bool] = None @@ -1582,9 +1578,6 @@ def from_dict(cls, d: Dict[str, any]) -> JobRunAs: @dataclass class JobSettings: - compute: Optional[List[JobCompute]] = None - """A list of compute requirements that can be referenced by tasks of this job.""" - continuous: Optional[Continuous] = None """An optional continuous property for this job. The continuous property will ensure that there is always one run executing. Only one of `schedule` and `continuous` can be used.""" @@ -1605,6 +1598,9 @@ class JobSettings: """An optional set of email addresses that is notified when runs of this job begin or complete as well as when this job is deleted.""" + environments: Optional[List[JobEnvironment]] = None + """A list of task execution environment specifications that can be referenced by tasks of this job.""" + format: Optional[Format] = None """Used to tell what is the format of the job. This field is ignored in Create/Update/Reset calls. When using the Jobs API 2.1 this value is always set to `"MULTI_TASK"`.""" @@ -1684,12 +1680,12 @@ class JobSettings: def as_dict(self) -> dict: """Serializes the JobSettings into a dictionary suitable for use as a JSON request body.""" body = {} - if self.compute: body['compute'] = [v.as_dict() for v in self.compute] if self.continuous: body['continuous'] = self.continuous.as_dict() if self.deployment: body['deployment'] = self.deployment.as_dict() if self.description is not None: body['description'] = self.description if self.edit_mode is not None: body['edit_mode'] = self.edit_mode.value if self.email_notifications: body['email_notifications'] = self.email_notifications.as_dict() + if self.environments: body['environments'] = [v.as_dict() for v in self.environments] if self.format is not None: body['format'] = self.format.value if self.git_source: body['git_source'] = self.git_source.as_dict() if self.health: body['health'] = self.health.as_dict() @@ -1711,12 +1707,12 @@ def as_dict(self) -> dict: @classmethod def from_dict(cls, d: Dict[str, any]) -> JobSettings: """Deserializes the JobSettings from a dictionary.""" - return cls(compute=_repeated_dict(d, 'compute', JobCompute), - continuous=_from_dict(d, 'continuous', Continuous), + return cls(continuous=_from_dict(d, 'continuous', Continuous), deployment=_from_dict(d, 'deployment', JobDeployment), description=d.get('description', None), edit_mode=_enum(d, 'edit_mode', JobEditMode), email_notifications=_from_dict(d, 'email_notifications', JobEmailNotifications), + environments=_repeated_dict(d, 'environments', JobEnvironment), format=_enum(d, 'format', Format), git_source=_from_dict(d, 'git_source', GitSource), health=_from_dict(d, 'health', JobsHealthRules), @@ -3366,10 +3362,6 @@ class RunTask: """The cluster used for this run. 
If the run is specified to use a new cluster, this field is set once the Jobs service has requested a cluster for the run.""" - compute_key: Optional[str] = None - """The key of the compute requirement, specified in `job.settings.compute`, to use for execution of - this task.""" - condition_task: Optional[RunConditionTask] = None """If condition_task, specifies a condition with an outcome that can be used to control the execution of other tasks. Does not require a cluster to execute and does not support retries or @@ -3520,7 +3512,6 @@ def as_dict(self) -> dict: if self.attempt_number is not None: body['attempt_number'] = self.attempt_number if self.cleanup_duration is not None: body['cleanup_duration'] = self.cleanup_duration if self.cluster_instance: body['cluster_instance'] = self.cluster_instance.as_dict() - if self.compute_key is not None: body['compute_key'] = self.compute_key if self.condition_task: body['condition_task'] = self.condition_task.as_dict() if self.dbt_task: body['dbt_task'] = self.dbt_task.as_dict() if self.depends_on: body['depends_on'] = [v.as_dict() for v in self.depends_on] @@ -3563,7 +3554,6 @@ def from_dict(cls, d: Dict[str, any]) -> RunTask: return cls(attempt_number=d.get('attempt_number', None), cleanup_duration=d.get('cleanup_duration', None), cluster_instance=_from_dict(d, 'cluster_instance', ClusterInstance), - compute_key=d.get('compute_key', None), condition_task=_from_dict(d, 'condition_task', RunConditionTask), dbt_task=_from_dict(d, 'dbt_task', DbtTask), depends_on=_repeated_dict(d, 'depends_on', TaskDependency), @@ -4450,7 +4440,7 @@ def from_dict(cls, d: Dict[str, any]) -> SubmitTask: @dataclass -class TableTriggerConfiguration: +class TableUpdateTriggerConfiguration: condition: Optional[Condition] = None """The table(s) condition based on which to trigger a job run.""" @@ -4468,7 +4458,7 @@ class TableTriggerConfiguration: allowed value is 60 seconds.""" def as_dict(self) -> dict: - """Serializes the TableTriggerConfiguration into a dictionary suitable for use as a JSON request body.""" + """Serializes the TableUpdateTriggerConfiguration into a dictionary suitable for use as a JSON request body.""" body = {} if self.condition is not None: body['condition'] = self.condition.value if self.min_time_between_triggers_seconds is not None: @@ -4479,8 +4469,8 @@ def as_dict(self) -> dict: return body @classmethod - def from_dict(cls, d: Dict[str, any]) -> TableTriggerConfiguration: - """Deserializes the TableTriggerConfiguration from a dictionary.""" + def from_dict(cls, d: Dict[str, any]) -> TableUpdateTriggerConfiguration: + """Deserializes the TableUpdateTriggerConfiguration from a dictionary.""" return cls(condition=_enum(d, 'condition', Condition), min_time_between_triggers_seconds=d.get('min_time_between_triggers_seconds', None), table_names=d.get('table_names', None), @@ -4494,10 +4484,6 @@ class Task: field is required and must be unique within its parent job. On Update or Reset, this field is used to reference the tasks to be updated or reset.""" - compute_key: Optional[str] = None - """The key of the compute requirement, specified in `job.settings.compute`, to use for execution of - this task.""" - condition_task: Optional[ConditionTask] = None """If condition_task, specifies a condition with an outcome that can be used to control the execution of other tasks. 
Does not require a cluster to execute and does not support retries or @@ -4523,6 +4509,10 @@ class Task: """An optional set of email addresses that is notified when runs of this task begin or complete as well as when this task is deleted. The default behavior is to not send any emails.""" + environment_key: Optional[str] = None + """The key that references an environment spec in a job. This field is required for Python script, + Python wheel and dbt tasks when using serverless compute.""" + existing_cluster_id: Optional[str] = None """If existing_cluster_id, the ID of an existing cluster that is used for all runs. When running jobs or tasks on an existing cluster, you may need to manually restart the cluster if it stops @@ -4621,7 +4611,6 @@ class Task: def as_dict(self) -> dict: """Serializes the Task into a dictionary suitable for use as a JSON request body.""" body = {} - if self.compute_key is not None: body['compute_key'] = self.compute_key if self.condition_task: body['condition_task'] = self.condition_task.as_dict() if self.dbt_task: body['dbt_task'] = self.dbt_task.as_dict() if self.depends_on: body['depends_on'] = [v.as_dict() for v in self.depends_on] @@ -4629,6 +4618,7 @@ def as_dict(self) -> dict: if self.disable_auto_optimization is not None: body['disable_auto_optimization'] = self.disable_auto_optimization if self.email_notifications: body['email_notifications'] = self.email_notifications.as_dict() + if self.environment_key is not None: body['environment_key'] = self.environment_key if self.existing_cluster_id is not None: body['existing_cluster_id'] = self.existing_cluster_id if self.for_each_task: body['for_each_task'] = self.for_each_task.as_dict() if self.health: body['health'] = self.health.as_dict() @@ -4657,13 +4647,13 @@ def as_dict(self) -> dict: @classmethod def from_dict(cls, d: Dict[str, any]) -> Task: """Deserializes the Task from a dictionary.""" - return cls(compute_key=d.get('compute_key', None), - condition_task=_from_dict(d, 'condition_task', ConditionTask), + return cls(condition_task=_from_dict(d, 'condition_task', ConditionTask), dbt_task=_from_dict(d, 'dbt_task', DbtTask), depends_on=_repeated_dict(d, 'depends_on', TaskDependency), description=d.get('description', None), disable_auto_optimization=d.get('disable_auto_optimization', None), email_notifications=_from_dict(d, 'email_notifications', TaskEmailNotifications), + environment_key=d.get('environment_key', None), existing_cluster_id=d.get('existing_cluster_id', None), for_each_task=_from_dict(d, 'for_each_task', ForEachTask), health=_from_dict(d, 'health', JobsHealthRules), @@ -4822,10 +4812,10 @@ class TriggerSettings: pause_status: Optional[PauseStatus] = None """Whether this trigger is paused or not.""" - table: Optional[TableTriggerConfiguration] = None + table: Optional[TableUpdateTriggerConfiguration] = None """Old table trigger settings name. 
Deprecated in favor of `table_update`.""" - table_update: Optional[TableTriggerConfiguration] = None + table_update: Optional[TableUpdateTriggerConfiguration] = None def as_dict(self) -> dict: """Serializes the TriggerSettings into a dictionary suitable for use as a JSON request body.""" @@ -4841,8 +4831,8 @@ def from_dict(cls, d: Dict[str, any]) -> TriggerSettings: """Deserializes the TriggerSettings from a dictionary.""" return cls(file_arrival=_from_dict(d, 'file_arrival', FileArrivalTriggerConfiguration), pause_status=_enum(d, 'pause_status', PauseStatus), - table=_from_dict(d, 'table', TableTriggerConfiguration), - table_update=_from_dict(d, 'table_update', TableTriggerConfiguration)) + table=_from_dict(d, 'table', TableUpdateTriggerConfiguration), + table_update=_from_dict(d, 'table_update', TableUpdateTriggerConfiguration)) class TriggerType(Enum): @@ -5115,12 +5105,12 @@ def cancel_run_and_wait(self, run_id: int, timeout=timedelta(minutes=20)) -> Run def create(self, *, access_control_list: Optional[List[iam.AccessControlRequest]] = None, - compute: Optional[List[JobCompute]] = None, continuous: Optional[Continuous] = None, deployment: Optional[JobDeployment] = None, description: Optional[str] = None, edit_mode: Optional[JobEditMode] = None, email_notifications: Optional[JobEmailNotifications] = None, + environments: Optional[List[JobEnvironment]] = None, format: Optional[Format] = None, git_source: Optional[GitSource] = None, health: Optional[JobsHealthRules] = None, @@ -5143,8 +5133,6 @@ def create(self, :param access_control_list: List[:class:`AccessControlRequest`] (optional) List of permissions to set on the job. - :param compute: List[:class:`JobCompute`] (optional) - A list of compute requirements that can be referenced by tasks of this job. :param continuous: :class:`Continuous` (optional) An optional continuous property for this job. The continuous property will ensure that there is always one run executing. Only one of `schedule` and `continuous` can be used. @@ -5160,6 +5148,8 @@ def create(self, :param email_notifications: :class:`JobEmailNotifications` (optional) An optional set of email addresses that is notified when runs of this job begin or complete as well as when this job is deleted. + :param environments: List[:class:`JobEnvironment`] (optional) + A list of task execution environment specifications that can be referenced by tasks of this job. :param format: :class:`Format` (optional) Used to tell what is the format of the job. This field is ignored in Create/Update/Reset calls. When using the Jobs API 2.1 this value is always set to `"MULTI_TASK"`. 
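For context on the jobs changes above: this release replaces the job-level `compute` list with `environments` and the task-level `compute_key` with `environment_key`. Below is a minimal sketch of a `create()` call against the new API. It assumes a configured `WorkspaceClient`; the `compute.Environment` field names (`client`, `dependencies`) and the workspace path are illustrative assumptions, not taken from this diff:

```py
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute, jobs

w = WorkspaceClient()

# Job-level environment that tasks reference by key
# (replaces the removed JobCompute / compute_key pair).
env = jobs.JobEnvironment(
    environment_key='default',
    # Assumed Environment fields: a client version and pip dependencies.
    spec=compute.Environment(client='1', dependencies=['pandas']),
)

job = w.jobs.create(
    name='example-serverless-job',
    environments=[env],
    tasks=[
        jobs.Task(
            task_key='main',
            # Required for Python script, Python wheel and dbt tasks
            # when using serverless compute.
            environment_key='default',
            # Illustrative path, not from this diff.
            spark_python_task=jobs.SparkPythonTask(python_file='/Workspace/path/to/main.py'),
        )
    ],
)
```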
@@ -5225,12 +5215,12 @@ def create(self, body = {} if access_control_list is not None: body['access_control_list'] = [v.as_dict() for v in access_control_list] - if compute is not None: body['compute'] = [v.as_dict() for v in compute] if continuous is not None: body['continuous'] = continuous.as_dict() if deployment is not None: body['deployment'] = deployment.as_dict() if description is not None: body['description'] = description if edit_mode is not None: body['edit_mode'] = edit_mode.value if email_notifications is not None: body['email_notifications'] = email_notifications.as_dict() + if environments is not None: body['environments'] = [v.as_dict() for v in environments] if format is not None: body['format'] = format.value if git_source is not None: body['git_source'] = git_source.as_dict() if health is not None: body['health'] = health.as_dict() diff --git a/databricks/sdk/service/marketplace.py b/databricks/sdk/service/marketplace.py new file mode 100755 index 000000000..d559d98f0 --- /dev/null +++ b/databricks/sdk/service/marketplace.py @@ -0,0 +1,3571 @@ +# Code generated from OpenAPI specs by Databricks SDK Generator. DO NOT EDIT. + +from __future__ import annotations + +import logging +from dataclasses import dataclass +from enum import Enum +from typing import Dict, Iterator, List, Optional + +from ._internal import _enum, _from_dict, _repeated_dict, _repeated_enum + +_LOG = logging.getLogger('databricks.sdk') + +# all definitions in this file are in alphabetical order + + +@dataclass +class AddExchangeForListingRequest: + listing_id: str + + exchange_id: str + + def as_dict(self) -> dict: + """Serializes the AddExchangeForListingRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.exchange_id is not None: body['exchange_id'] = self.exchange_id + if self.listing_id is not None: body['listing_id'] = self.listing_id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> AddExchangeForListingRequest: + """Deserializes the AddExchangeForListingRequest from a dictionary.""" + return cls(exchange_id=d.get('exchange_id', None), listing_id=d.get('listing_id', None)) + + +@dataclass +class AddExchangeForListingResponse: + exchange_for_listing: Optional[ExchangeListing] = None + + def as_dict(self) -> dict: + """Serializes the AddExchangeForListingResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.exchange_for_listing: body['exchange_for_listing'] = self.exchange_for_listing.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> AddExchangeForListingResponse: + """Deserializes the AddExchangeForListingResponse from a dictionary.""" + return cls(exchange_for_listing=_from_dict(d, 'exchange_for_listing', ExchangeListing)) + + +class AssetType(Enum): + + ASSET_TYPE_DATA_TABLE = 'ASSET_TYPE_DATA_TABLE' + ASSET_TYPE_GIT_REPO = 'ASSET_TYPE_GIT_REPO' + ASSET_TYPE_MEDIA = 'ASSET_TYPE_MEDIA' + ASSET_TYPE_MODEL = 'ASSET_TYPE_MODEL' + ASSET_TYPE_NOTEBOOK = 'ASSET_TYPE_NOTEBOOK' + ASSET_TYPE_UNSPECIFIED = 'ASSET_TYPE_UNSPECIFIED' + + +class Category(Enum): + + ADVERTISING_AND_MARKETING = 'ADVERTISING_AND_MARKETING' + CLIMATE_AND_ENVIRONMENT = 'CLIMATE_AND_ENVIRONMENT' + COMMERCE = 'COMMERCE' + DEMOGRAPHICS = 'DEMOGRAPHICS' + ECONOMICS = 'ECONOMICS' + EDUCATION = 'EDUCATION' + ENERGY = 'ENERGY' + FINANCIAL = 'FINANCIAL' + GAMING = 'GAMING' + GEOSPATIAL = 'GEOSPATIAL' + HEALTH = 'HEALTH' + LOOKUP_TABLES = 'LOOKUP_TABLES' + MANUFACTURING = 'MANUFACTURING' + MEDIA = 'MEDIA' + 
OTHER = 'OTHER' + PUBLIC_SECTOR = 'PUBLIC_SECTOR' + RETAIL = 'RETAIL' + SCIENCE_AND_RESEARCH = 'SCIENCE_AND_RESEARCH' + SECURITY = 'SECURITY' + SPORTS = 'SPORTS' + TRANSPORTATION_AND_LOGISTICS = 'TRANSPORTATION_AND_LOGISTICS' + TRAVEL_AND_TOURISM = 'TRAVEL_AND_TOURISM' + + +@dataclass +class ConsumerTerms: + version: str + + def as_dict(self) -> dict: + """Serializes the ConsumerTerms into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.version is not None: body['version'] = self.version + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ConsumerTerms: + """Deserializes the ConsumerTerms from a dictionary.""" + return cls(version=d.get('version', None)) + + +@dataclass +class ContactInfo: + """contact info for the consumer requesting data or performing a listing installation""" + + company: Optional[str] = None + + email: Optional[str] = None + + first_name: Optional[str] = None + + last_name: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ContactInfo into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.company is not None: body['company'] = self.company + if self.email is not None: body['email'] = self.email + if self.first_name is not None: body['first_name'] = self.first_name + if self.last_name is not None: body['last_name'] = self.last_name + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ContactInfo: + """Deserializes the ContactInfo from a dictionary.""" + return cls(company=d.get('company', None), + email=d.get('email', None), + first_name=d.get('first_name', None), + last_name=d.get('last_name', None)) + + +class Cost(Enum): + + FREE = 'FREE' + PAID = 'PAID' + + +@dataclass +class CreateExchangeFilterRequest: + filter: ExchangeFilter + + def as_dict(self) -> dict: + """Serializes the CreateExchangeFilterRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.filter: body['filter'] = self.filter.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreateExchangeFilterRequest: + """Deserializes the CreateExchangeFilterRequest from a dictionary.""" + return cls(filter=_from_dict(d, 'filter', ExchangeFilter)) + + +@dataclass +class CreateExchangeFilterResponse: + filter_id: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the CreateExchangeFilterResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.filter_id is not None: body['filter_id'] = self.filter_id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreateExchangeFilterResponse: + """Deserializes the CreateExchangeFilterResponse from a dictionary.""" + return cls(filter_id=d.get('filter_id', None)) + + +@dataclass +class CreateExchangeRequest: + exchange: Exchange + + def as_dict(self) -> dict: + """Serializes the CreateExchangeRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.exchange: body['exchange'] = self.exchange.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreateExchangeRequest: + """Deserializes the CreateExchangeRequest from a dictionary.""" + return cls(exchange=_from_dict(d, 'exchange', Exchange)) + + +@dataclass +class CreateExchangeResponse: + exchange_id: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the CreateExchangeResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.exchange_id is not None: 
body['exchange_id'] = self.exchange_id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreateExchangeResponse: + """Deserializes the CreateExchangeResponse from a dictionary.""" + return cls(exchange_id=d.get('exchange_id', None)) + + +@dataclass +class CreateFileRequest: + file_parent: FileParent + + marketplace_file_type: MarketplaceFileType + + mime_type: str + + display_name: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the CreateFileRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.display_name is not None: body['display_name'] = self.display_name + if self.file_parent: body['file_parent'] = self.file_parent.as_dict() + if self.marketplace_file_type is not None: + body['marketplace_file_type'] = self.marketplace_file_type.value + if self.mime_type is not None: body['mime_type'] = self.mime_type + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreateFileRequest: + """Deserializes the CreateFileRequest from a dictionary.""" + return cls(display_name=d.get('display_name', None), + file_parent=_from_dict(d, 'file_parent', FileParent), + marketplace_file_type=_enum(d, 'marketplace_file_type', MarketplaceFileType), + mime_type=d.get('mime_type', None)) + + +@dataclass +class CreateFileResponse: + file_info: Optional[FileInfo] = None + + upload_url: Optional[str] = None + """Pre-signed POST URL to blob storage""" + + def as_dict(self) -> dict: + """Serializes the CreateFileResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.file_info: body['file_info'] = self.file_info.as_dict() + if self.upload_url is not None: body['upload_url'] = self.upload_url + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreateFileResponse: + """Deserializes the CreateFileResponse from a dictionary.""" + return cls(file_info=_from_dict(d, 'file_info', FileInfo), upload_url=d.get('upload_url', None)) + + +@dataclass +class CreateInstallationRequest: + accepted_consumer_terms: Optional[ConsumerTerms] = None + + catalog_name: Optional[str] = None + + listing_id: Optional[str] = None + + recipient_type: Optional[DeltaSharingRecipientType] = None + + repo_detail: Optional[RepoInstallation] = None + """for git repo installations""" + + share_name: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the CreateInstallationRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.accepted_consumer_terms: + body['accepted_consumer_terms'] = self.accepted_consumer_terms.as_dict() + if self.catalog_name is not None: body['catalog_name'] = self.catalog_name + if self.listing_id is not None: body['listing_id'] = self.listing_id + if self.recipient_type is not None: body['recipient_type'] = self.recipient_type.value + if self.repo_detail: body['repo_detail'] = self.repo_detail.as_dict() + if self.share_name is not None: body['share_name'] = self.share_name + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreateInstallationRequest: + """Deserializes the CreateInstallationRequest from a dictionary.""" + return cls(accepted_consumer_terms=_from_dict(d, 'accepted_consumer_terms', ConsumerTerms), + catalog_name=d.get('catalog_name', None), + listing_id=d.get('listing_id', None), + recipient_type=_enum(d, 'recipient_type', DeltaSharingRecipientType), + repo_detail=_from_dict(d, 'repo_detail', RepoInstallation), + share_name=d.get('share_name', None)) + + +@dataclass +class 
CreateListingRequest: + listing: Listing + + def as_dict(self) -> dict: + """Serializes the CreateListingRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.listing: body['listing'] = self.listing.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreateListingRequest: + """Deserializes the CreateListingRequest from a dictionary.""" + return cls(listing=_from_dict(d, 'listing', Listing)) + + +@dataclass +class CreateListingResponse: + listing_id: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the CreateListingResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.listing_id is not None: body['listing_id'] = self.listing_id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreateListingResponse: + """Deserializes the CreateListingResponse from a dictionary.""" + return cls(listing_id=d.get('listing_id', None)) + + +@dataclass +class CreatePersonalizationRequest: + """Data request messages also create a lead (maybe)""" + + intended_use: str + + accepted_consumer_terms: ConsumerTerms + + comment: Optional[str] = None + + company: Optional[str] = None + + first_name: Optional[str] = None + + is_from_lighthouse: Optional[bool] = None + + last_name: Optional[str] = None + + listing_id: Optional[str] = None + + recipient_type: Optional[DeltaSharingRecipientType] = None + + def as_dict(self) -> dict: + """Serializes the CreatePersonalizationRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.accepted_consumer_terms: + body['accepted_consumer_terms'] = self.accepted_consumer_terms.as_dict() + if self.comment is not None: body['comment'] = self.comment + if self.company is not None: body['company'] = self.company + if self.first_name is not None: body['first_name'] = self.first_name + if self.intended_use is not None: body['intended_use'] = self.intended_use + if self.is_from_lighthouse is not None: body['is_from_lighthouse'] = self.is_from_lighthouse + if self.last_name is not None: body['last_name'] = self.last_name + if self.listing_id is not None: body['listing_id'] = self.listing_id + if self.recipient_type is not None: body['recipient_type'] = self.recipient_type.value + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreatePersonalizationRequest: + """Deserializes the CreatePersonalizationRequest from a dictionary.""" + return cls(accepted_consumer_terms=_from_dict(d, 'accepted_consumer_terms', ConsumerTerms), + comment=d.get('comment', None), + company=d.get('company', None), + first_name=d.get('first_name', None), + intended_use=d.get('intended_use', None), + is_from_lighthouse=d.get('is_from_lighthouse', None), + last_name=d.get('last_name', None), + listing_id=d.get('listing_id', None), + recipient_type=_enum(d, 'recipient_type', DeltaSharingRecipientType)) + + +@dataclass +class CreatePersonalizationRequestResponse: + id: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the CreatePersonalizationRequestResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.id is not None: body['id'] = self.id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreatePersonalizationRequestResponse: + """Deserializes the CreatePersonalizationRequestResponse from a dictionary.""" + return cls(id=d.get('id', None)) + + +@dataclass +class CreateProviderRequest: + provider: ProviderInfo + + def as_dict(self) -> dict:
+ """Serializes the CreateProviderRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.provider: body['provider'] = self.provider.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreateProviderRequest: + """Deserializes the CreateProviderRequest from a dictionary.""" + return cls(provider=_from_dict(d, 'provider', ProviderInfo)) + + +@dataclass +class CreateProviderResponse: + id: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the CreateProviderResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.id is not None: body['id'] = self.id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> CreateProviderResponse: + """Deserializes the CreateProviderResponse from a dictionary.""" + return cls(id=d.get('id', None)) + + +class DataRefresh(Enum): + + DAILY = 'DAILY' + HOURLY = 'HOURLY' + MINUTE = 'MINUTE' + MONTHLY = 'MONTHLY' + NONE = 'NONE' + QUARTERLY = 'QUARTERLY' + SECOND = 'SECOND' + WEEKLY = 'WEEKLY' + YEARLY = 'YEARLY' + + +@dataclass +class DataRefreshInfo: + interval: int + + unit: DataRefresh + + def as_dict(self) -> dict: + """Serializes the DataRefreshInfo into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.interval is not None: body['interval'] = self.interval + if self.unit is not None: body['unit'] = self.unit.value + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> DataRefreshInfo: + """Deserializes the DataRefreshInfo from a dictionary.""" + return cls(interval=d.get('interval', None), unit=_enum(d, 'unit', DataRefresh)) + + +@dataclass +class DeleteExchangeFilterResponse: + + def as_dict(self) -> dict: + """Serializes the DeleteExchangeFilterResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> DeleteExchangeFilterResponse: + """Deserializes the DeleteExchangeFilterResponse from a dictionary.""" + return cls() + + +@dataclass +class DeleteExchangeResponse: + + def as_dict(self) -> dict: + """Serializes the DeleteExchangeResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> DeleteExchangeResponse: + """Deserializes the DeleteExchangeResponse from a dictionary.""" + return cls() + + +@dataclass +class DeleteFileResponse: + + def as_dict(self) -> dict: + """Serializes the DeleteFileResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> DeleteFileResponse: + """Deserializes the DeleteFileResponse from a dictionary.""" + return cls() + + +@dataclass +class DeleteInstallationResponse: + + def as_dict(self) -> dict: + """Serializes the DeleteInstallationResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> DeleteInstallationResponse: + """Deserializes the DeleteInstallationResponse from a dictionary.""" + return cls() + + +@dataclass +class DeleteListingResponse: + + def as_dict(self) -> dict: + """Serializes the DeleteListingResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> DeleteListingResponse: + """Deserializes the DeleteListingResponse from a dictionary.""" + return 
cls() + + +@dataclass +class DeleteProviderResponse: + + def as_dict(self) -> dict: + """Serializes the DeleteProviderResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> DeleteProviderResponse: + """Deserializes the DeleteProviderResponse from a dictionary.""" + return cls() + + +class DeltaSharingRecipientType(Enum): + + DELTA_SHARING_RECIPIENT_TYPE_DATABRICKS = 'DELTA_SHARING_RECIPIENT_TYPE_DATABRICKS' + DELTA_SHARING_RECIPIENT_TYPE_OPEN = 'DELTA_SHARING_RECIPIENT_TYPE_OPEN' + + +@dataclass +class Exchange: + name: str + + comment: Optional[str] = None + + created_at: Optional[int] = None + + created_by: Optional[str] = None + + filters: Optional[List[ExchangeFilter]] = None + + id: Optional[str] = None + + linked_listings: Optional[List[ExchangeListing]] = None + + updated_at: Optional[int] = None + + updated_by: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the Exchange into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.comment is not None: body['comment'] = self.comment + if self.created_at is not None: body['created_at'] = self.created_at + if self.created_by is not None: body['created_by'] = self.created_by + if self.filters: body['filters'] = [v.as_dict() for v in self.filters] + if self.id is not None: body['id'] = self.id + if self.linked_listings: body['linked_listings'] = [v.as_dict() for v in self.linked_listings] + if self.name is not None: body['name'] = self.name + if self.updated_at is not None: body['updated_at'] = self.updated_at + if self.updated_by is not None: body['updated_by'] = self.updated_by + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> Exchange: + """Deserializes the Exchange from a dictionary.""" + return cls(comment=d.get('comment', None), + created_at=d.get('created_at', None), + created_by=d.get('created_by', None), + filters=_repeated_dict(d, 'filters', ExchangeFilter), + id=d.get('id', None), + linked_listings=_repeated_dict(d, 'linked_listings', ExchangeListing), + name=d.get('name', None), + updated_at=d.get('updated_at', None), + updated_by=d.get('updated_by', None)) + + +@dataclass +class ExchangeFilter: + exchange_id: str + + filter_value: str + + filter_type: ExchangeFilterType + + created_at: Optional[int] = None + + created_by: Optional[str] = None + + id: Optional[str] = None + + name: Optional[str] = None + + updated_at: Optional[int] = None + + updated_by: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ExchangeFilter into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.created_at is not None: body['created_at'] = self.created_at + if self.created_by is not None: body['created_by'] = self.created_by + if self.exchange_id is not None: body['exchange_id'] = self.exchange_id + if self.filter_type is not None: body['filter_type'] = self.filter_type.value + if self.filter_value is not None: body['filter_value'] = self.filter_value + if self.id is not None: body['id'] = self.id + if self.name is not None: body['name'] = self.name + if self.updated_at is not None: body['updated_at'] = self.updated_at + if self.updated_by is not None: body['updated_by'] = self.updated_by + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ExchangeFilter: + """Deserializes the ExchangeFilter from a dictionary.""" + return cls(created_at=d.get('created_at', None), + created_by=d.get('created_by', None), + 
exchange_id=d.get('exchange_id', None), + filter_type=_enum(d, 'filter_type', ExchangeFilterType), + filter_value=d.get('filter_value', None), + id=d.get('id', None), + name=d.get('name', None), + updated_at=d.get('updated_at', None), + updated_by=d.get('updated_by', None)) + + +class ExchangeFilterType(Enum): + + GLOBAL_METASTORE_ID = 'GLOBAL_METASTORE_ID' + + +@dataclass +class ExchangeListing: + created_at: Optional[int] = None + + created_by: Optional[str] = None + + exchange_id: Optional[str] = None + + exchange_name: Optional[str] = None + + id: Optional[str] = None + + listing_id: Optional[str] = None + + listing_name: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ExchangeListing into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.created_at is not None: body['created_at'] = self.created_at + if self.created_by is not None: body['created_by'] = self.created_by + if self.exchange_id is not None: body['exchange_id'] = self.exchange_id + if self.exchange_name is not None: body['exchange_name'] = self.exchange_name + if self.id is not None: body['id'] = self.id + if self.listing_id is not None: body['listing_id'] = self.listing_id + if self.listing_name is not None: body['listing_name'] = self.listing_name + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ExchangeListing: + """Deserializes the ExchangeListing from a dictionary.""" + return cls(created_at=d.get('created_at', None), + created_by=d.get('created_by', None), + exchange_id=d.get('exchange_id', None), + exchange_name=d.get('exchange_name', None), + id=d.get('id', None), + listing_id=d.get('listing_id', None), + listing_name=d.get('listing_name', None)) + + +@dataclass +class FileInfo: + created_at: Optional[int] = None + + display_name: Optional[str] = None + """Name displayed to users for applicable files, e.g. 
embedded notebooks""" + + download_link: Optional[str] = None + + file_parent: Optional[FileParent] = None + + id: Optional[str] = None + + marketplace_file_type: Optional[MarketplaceFileType] = None + + mime_type: Optional[str] = None + + status: Optional[FileStatus] = None + + status_message: Optional[str] = None + """Populated if status is in a failed state with more information on reason for the failure.""" + + updated_at: Optional[int] = None + + def as_dict(self) -> dict: + """Serializes the FileInfo into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.created_at is not None: body['created_at'] = self.created_at + if self.display_name is not None: body['display_name'] = self.display_name + if self.download_link is not None: body['download_link'] = self.download_link + if self.file_parent: body['file_parent'] = self.file_parent.as_dict() + if self.id is not None: body['id'] = self.id + if self.marketplace_file_type is not None: + body['marketplace_file_type'] = self.marketplace_file_type.value + if self.mime_type is not None: body['mime_type'] = self.mime_type + if self.status is not None: body['status'] = self.status.value + if self.status_message is not None: body['status_message'] = self.status_message + if self.updated_at is not None: body['updated_at'] = self.updated_at + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> FileInfo: + """Deserializes the FileInfo from a dictionary.""" + return cls(created_at=d.get('created_at', None), + display_name=d.get('display_name', None), + download_link=d.get('download_link', None), + file_parent=_from_dict(d, 'file_parent', FileParent), + id=d.get('id', None), + marketplace_file_type=_enum(d, 'marketplace_file_type', MarketplaceFileType), + mime_type=d.get('mime_type', None), + status=_enum(d, 'status', FileStatus), + status_message=d.get('status_message', None), + updated_at=d.get('updated_at', None)) + + +@dataclass +class FileParent: + file_parent_type: Optional[FileParentType] = None + + parent_id: Optional[str] = None + """TODO make the following fields required""" + + def as_dict(self) -> dict: + """Serializes the FileParent into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.file_parent_type is not None: body['file_parent_type'] = self.file_parent_type.value + if self.parent_id is not None: body['parent_id'] = self.parent_id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> FileParent: + """Deserializes the FileParent from a dictionary.""" + return cls(file_parent_type=_enum(d, 'file_parent_type', FileParentType), + parent_id=d.get('parent_id', None)) + + +class FileParentType(Enum): + + LISTING = 'LISTING' + PROVIDER = 'PROVIDER' + + +class FileStatus(Enum): + + FILE_STATUS_PUBLISHED = 'FILE_STATUS_PUBLISHED' + FILE_STATUS_SANITIZATION_FAILED = 'FILE_STATUS_SANITIZATION_FAILED' + FILE_STATUS_SANITIZING = 'FILE_STATUS_SANITIZING' + FILE_STATUS_STAGING = 'FILE_STATUS_STAGING' + + +class FilterType(Enum): + + METASTORE = 'METASTORE' + + +class FulfillmentType(Enum): + + INSTALL = 'INSTALL' + REQUEST_ACCESS = 'REQUEST_ACCESS' + + +@dataclass +class GetExchangeResponse: + exchange: Optional[Exchange] = None + + def as_dict(self) -> dict: + """Serializes the GetExchangeResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.exchange: body['exchange'] = self.exchange.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> GetExchangeResponse: + """Deserializes the 
GetExchangeResponse from a dictionary.""" + return cls(exchange=_from_dict(d, 'exchange', Exchange)) + + +@dataclass +class GetFileResponse: + file_info: Optional[FileInfo] = None + + def as_dict(self) -> dict: + """Serializes the GetFileResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.file_info: body['file_info'] = self.file_info.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> GetFileResponse: + """Deserializes the GetFileResponse from a dictionary.""" + return cls(file_info=_from_dict(d, 'file_info', FileInfo)) + + +@dataclass +class GetLatestVersionProviderAnalyticsDashboardResponse: + version: Optional[int] = None + """version here is the latest logical version of the dashboard template""" + + def as_dict(self) -> dict: + """Serializes the GetLatestVersionProviderAnalyticsDashboardResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.version is not None: body['version'] = self.version + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> GetLatestVersionProviderAnalyticsDashboardResponse: + """Deserializes the GetLatestVersionProviderAnalyticsDashboardResponse from a dictionary.""" + return cls(version=d.get('version', None)) + + +@dataclass +class GetListingContentMetadataResponse: + next_page_token: Optional[str] = None + + shared_data_objects: Optional[List[SharedDataObject]] = None + + def as_dict(self) -> dict: + """Serializes the GetListingContentMetadataResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + if self.shared_data_objects: + body['shared_data_objects'] = [v.as_dict() for v in self.shared_data_objects] + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> GetListingContentMetadataResponse: + """Deserializes the GetListingContentMetadataResponse from a dictionary.""" + return cls(next_page_token=d.get('next_page_token', None), + shared_data_objects=_repeated_dict(d, 'shared_data_objects', SharedDataObject)) + + +@dataclass +class GetListingResponse: + listing: Optional[Listing] = None + + def as_dict(self) -> dict: + """Serializes the GetListingResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.listing: body['listing'] = self.listing.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> GetListingResponse: + """Deserializes the GetListingResponse from a dictionary.""" + return cls(listing=_from_dict(d, 'listing', Listing)) + + +@dataclass +class GetListingsResponse: + listings: Optional[List[Listing]] = None + + next_page_token: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the GetListingsResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.listings: body['listings'] = [v.as_dict() for v in self.listings] + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> GetListingsResponse: + """Deserializes the GetListingsResponse from a dictionary.""" + return cls(listings=_repeated_dict(d, 'listings', Listing), + next_page_token=d.get('next_page_token', None)) + + +@dataclass +class GetPersonalizationRequestResponse: + personalization_requests: Optional[List[PersonalizationRequest]] = None + + def as_dict(self) -> dict: + """Serializes the GetPersonalizationRequestResponse
into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.personalization_requests: + body['personalization_requests'] = [v.as_dict() for v in self.personalization_requests] + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> GetPersonalizationRequestResponse: + """Deserializes the GetPersonalizationRequestResponse from a dictionary.""" + return cls( + personalization_requests=_repeated_dict(d, 'personalization_requests', PersonalizationRequest)) + + +@dataclass +class GetProviderResponse: + provider: Optional[ProviderInfo] = None + + def as_dict(self) -> dict: + """Serializes the GetProviderResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.provider: body['provider'] = self.provider.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> GetProviderResponse: + """Deserializes the GetProviderResponse from a dictionary.""" + return cls(provider=_from_dict(d, 'provider', ProviderInfo)) + + +@dataclass +class Installation: + installation: Optional[InstallationDetail] = None + + def as_dict(self) -> dict: + """Serializes the Installation into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.installation: body['installation'] = self.installation.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> Installation: + """Deserializes the Installation from a dictionary.""" + return cls(installation=_from_dict(d, 'installation', InstallationDetail)) + + +@dataclass +class InstallationDetail: + catalog_name: Optional[str] = None + + error_message: Optional[str] = None + + id: Optional[str] = None + + installed_on: Optional[int] = None + + listing_id: Optional[str] = None + + listing_name: Optional[str] = None + + recipient_type: Optional[DeltaSharingRecipientType] = None + + repo_name: Optional[str] = None + + repo_path: Optional[str] = None + + share_name: Optional[str] = None + + status: Optional[InstallationStatus] = None + + token_detail: Optional[TokenDetail] = None + + tokens: Optional[List[TokenInfo]] = None + + def as_dict(self) -> dict: + """Serializes the InstallationDetail into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.catalog_name is not None: body['catalog_name'] = self.catalog_name + if self.error_message is not None: body['error_message'] = self.error_message + if self.id is not None: body['id'] = self.id + if self.installed_on is not None: body['installed_on'] = self.installed_on + if self.listing_id is not None: body['listing_id'] = self.listing_id + if self.listing_name is not None: body['listing_name'] = self.listing_name + if self.recipient_type is not None: body['recipient_type'] = self.recipient_type.value + if self.repo_name is not None: body['repo_name'] = self.repo_name + if self.repo_path is not None: body['repo_path'] = self.repo_path + if self.share_name is not None: body['share_name'] = self.share_name + if self.status is not None: body['status'] = self.status.value + if self.token_detail: body['token_detail'] = self.token_detail.as_dict() + if self.tokens: body['tokens'] = [v.as_dict() for v in self.tokens] + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> InstallationDetail: + """Deserializes the InstallationDetail from a dictionary.""" + return cls(catalog_name=d.get('catalog_name', None), + error_message=d.get('error_message', None), + id=d.get('id', None), + installed_on=d.get('installed_on', None), + listing_id=d.get('listing_id', 
None), + listing_name=d.get('listing_name', None), + recipient_type=_enum(d, 'recipient_type', DeltaSharingRecipientType), + repo_name=d.get('repo_name', None), + repo_path=d.get('repo_path', None), + share_name=d.get('share_name', None), + status=_enum(d, 'status', InstallationStatus), + token_detail=_from_dict(d, 'token_detail', TokenDetail), + tokens=_repeated_dict(d, 'tokens', TokenInfo)) + + +class InstallationStatus(Enum): + + FAILED = 'FAILED' + INSTALLED = 'INSTALLED' + + +@dataclass +class ListAllInstallationsResponse: + installations: Optional[List[InstallationDetail]] = None + + next_page_token: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ListAllInstallationsResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.installations: body['installations'] = [v.as_dict() for v in self.installations] + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListAllInstallationsResponse: + """Deserializes the ListAllInstallationsResponse from a dictionary.""" + return cls(installations=_repeated_dict(d, 'installations', InstallationDetail), + next_page_token=d.get('next_page_token', None)) + + +@dataclass +class ListAllPersonalizationRequestsResponse: + next_page_token: Optional[str] = None + + personalization_requests: Optional[List[PersonalizationRequest]] = None + + def as_dict(self) -> dict: + """Serializes the ListAllPersonalizationRequestsResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + if self.personalization_requests: + body['personalization_requests'] = [v.as_dict() for v in self.personalization_requests] + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListAllPersonalizationRequestsResponse: + """Deserializes the ListAllPersonalizationRequestsResponse from a dictionary.""" + return cls(next_page_token=d.get('next_page_token', None), + personalization_requests=_repeated_dict(d, 'personalization_requests', + PersonalizationRequest)) + + +@dataclass +class ListExchangeFiltersResponse: + filters: Optional[List[ExchangeFilter]] = None + + next_page_token: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ListExchangeFiltersResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.filters: body['filters'] = [v.as_dict() for v in self.filters] + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListExchangeFiltersResponse: + """Deserializes the ListExchangeFiltersResponse from a dictionary.""" + return cls(filters=_repeated_dict(d, 'filters', ExchangeFilter), + next_page_token=d.get('next_page_token', None)) + + +@dataclass +class ListExchangesForListingResponse: + exchange_listing: Optional[List[ExchangeListing]] = None + + next_page_token: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ListExchangesForListingResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.exchange_listing: body['exchange_listing'] = [v.as_dict() for v in self.exchange_listing] + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListExchangesForListingResponse: + 
"""Deserializes the ListExchangesForListingResponse from a dictionary.""" + return cls(exchange_listing=_repeated_dict(d, 'exchange_listing', ExchangeListing), + next_page_token=d.get('next_page_token', None)) + + +@dataclass +class ListExchangesResponse: + exchanges: Optional[List[Exchange]] = None + + next_page_token: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ListExchangesResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.exchanges: body['exchanges'] = [v.as_dict() for v in self.exchanges] + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListExchangesResponse: + """Deserializes the ListExchangesResponse from a dictionary.""" + return cls(exchanges=_repeated_dict(d, 'exchanges', Exchange), + next_page_token=d.get('next_page_token', None)) + + +@dataclass +class ListFilesResponse: + file_infos: Optional[List[FileInfo]] = None + + next_page_token: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ListFilesResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.file_infos: body['file_infos'] = [v.as_dict() for v in self.file_infos] + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListFilesResponse: + """Deserializes the ListFilesResponse from a dictionary.""" + return cls(file_infos=_repeated_dict(d, 'file_infos', FileInfo), + next_page_token=d.get('next_page_token', None)) + + +@dataclass +class ListFulfillmentsResponse: + fulfillments: Optional[List[ListingFulfillment]] = None + + next_page_token: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ListFulfillmentsResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.fulfillments: body['fulfillments'] = [v.as_dict() for v in self.fulfillments] + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListFulfillmentsResponse: + """Deserializes the ListFulfillmentsResponse from a dictionary.""" + return cls(fulfillments=_repeated_dict(d, 'fulfillments', ListingFulfillment), + next_page_token=d.get('next_page_token', None)) + + +@dataclass +class ListInstallationsResponse: + installations: Optional[List[InstallationDetail]] = None + + next_page_token: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ListInstallationsResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.installations: body['installations'] = [v.as_dict() for v in self.installations] + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListInstallationsResponse: + """Deserializes the ListInstallationsResponse from a dictionary.""" + return cls(installations=_repeated_dict(d, 'installations', InstallationDetail), + next_page_token=d.get('next_page_token', None)) + + +@dataclass +class ListListingsForExchangeResponse: + exchange_listings: Optional[List[ExchangeListing]] = None + + next_page_token: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ListListingsForExchangeResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.exchange_listings: 
body['exchange_listings'] = [v.as_dict() for v in self.exchange_listings] + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListListingsForExchangeResponse: + """Deserializes the ListListingsForExchangeResponse from a dictionary.""" + return cls(exchange_listings=_repeated_dict(d, 'exchange_listings', ExchangeListing), + next_page_token=d.get('next_page_token', None)) + + +@dataclass +class ListListingsResponse: + listings: Optional[List[Listing]] = None + + next_page_token: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ListListingsResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.listings: body['listings'] = [v.as_dict() for v in self.listings] + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListListingsResponse: + """Deserializes the ListListingsResponse from a dictionary.""" + return cls(listings=_repeated_dict(d, 'listings', Listing), + next_page_token=d.get('next_page_token', None)) + + +@dataclass +class ListProviderAnalyticsDashboardResponse: + id: str + + dashboard_id: str + """dashboard_id will be used to open Lakeview dashboard.""" + + version: Optional[int] = None + + def as_dict(self) -> dict: + """Serializes the ListProviderAnalyticsDashboardResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.dashboard_id is not None: body['dashboard_id'] = self.dashboard_id + if self.id is not None: body['id'] = self.id + if self.version is not None: body['version'] = self.version + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListProviderAnalyticsDashboardResponse: + """Deserializes the ListProviderAnalyticsDashboardResponse from a dictionary.""" + return cls(dashboard_id=d.get('dashboard_id', None), + id=d.get('id', None), + version=d.get('version', None)) + + +@dataclass +class ListProvidersResponse: + next_page_token: Optional[str] = None + + providers: Optional[List[ProviderInfo]] = None + + def as_dict(self) -> dict: + """Serializes the ListProvidersResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + if self.providers: body['providers'] = [v.as_dict() for v in self.providers] + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListProvidersResponse: + """Deserializes the ListProvidersResponse from a dictionary.""" + return cls(next_page_token=d.get('next_page_token', None), + providers=_repeated_dict(d, 'providers', ProviderInfo)) + + +@dataclass +class Listing: + summary: ListingSummary + """Next Number: 26""" + + detail: Optional[ListingDetail] = None + + id: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the Listing into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.detail: body['detail'] = self.detail.as_dict() + if self.id is not None: body['id'] = self.id + if self.summary: body['summary'] = self.summary.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> Listing: + """Deserializes the Listing from a dictionary.""" + return cls(detail=_from_dict(d, 'detail', ListingDetail), + id=d.get('id', None), + summary=_from_dict(d, 'summary', ListingSummary)) + + +@dataclass +class ListingDetail: + assets: 
Optional[List[AssetType]] = None + """Type of assets included in the listing. e.g. GIT_REPO, DATA_TABLE, MODEL, NOTEBOOK""" + + collection_date_end: Optional[int] = None + """The ending date timestamp for when the data spans""" + + collection_date_start: Optional[int] = None + """The starting date timestamp for when the data spans""" + + collection_granularity: Optional[DataRefreshInfo] = None + """Smallest unit of time in the dataset""" + + cost: Optional[Cost] = None + """Whether the dataset is free or paid""" + + data_source: Optional[str] = None + """Where/how the data is sourced""" + + description: Optional[str] = None + + documentation_link: Optional[str] = None + + embedded_notebook_file_infos: Optional[List[FileInfo]] = None + + file_ids: Optional[List[str]] = None + + geographical_coverage: Optional[str] = None + """Which geo region the listing data is collected from""" + + license: Optional[str] = None + """ID 20, 21 removed, don't use. License of the data asset - Required for listings with model-based + assets""" + + pricing_model: Optional[str] = None + """What the pricing model is (e.g. paid, subscription, paid upfront); should only be present if + cost is paid. TODO: Not used yet, should deprecate if we will never use it""" + + privacy_policy_link: Optional[str] = None + + size: Optional[float] = None + """size of the dataset in GB""" + + support_link: Optional[str] = None + + tags: Optional[List[ListingTag]] = None + """Listing tags - Simple key value pair to annotate listings. When should I use tags vs dedicated + fields? Using tags avoids the need to add new columns in the database for new annotations. + However, this should be used sparingly since tags are stored as key-value pairs. Use tags only: + 1. If the field is optional and won't need to have a NOT NULL integrity check. 2. The value is + fairly fixed, static and low cardinality (e.g. enums). 3.
The value won't be used in filters or + joins with other tables.""" + + terms_of_service: Optional[str] = None + + update_frequency: Optional[DataRefreshInfo] = None + """How often data is updated""" + + def as_dict(self) -> dict: + """Serializes the ListingDetail into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.assets: body['assets'] = [v.value for v in self.assets] + if self.collection_date_end is not None: body['collection_date_end'] = self.collection_date_end + if self.collection_date_start is not None: body['collection_date_start'] = self.collection_date_start + if self.collection_granularity: body['collection_granularity'] = self.collection_granularity.as_dict() + if self.cost is not None: body['cost'] = self.cost.value + if self.data_source is not None: body['data_source'] = self.data_source + if self.description is not None: body['description'] = self.description + if self.documentation_link is not None: body['documentation_link'] = self.documentation_link + if self.embedded_notebook_file_infos: + body['embedded_notebook_file_infos'] = [v.as_dict() for v in self.embedded_notebook_file_infos] + if self.file_ids: body['file_ids'] = [v for v in self.file_ids] + if self.geographical_coverage is not None: body['geographical_coverage'] = self.geographical_coverage + if self.license is not None: body['license'] = self.license + if self.pricing_model is not None: body['pricing_model'] = self.pricing_model + if self.privacy_policy_link is not None: body['privacy_policy_link'] = self.privacy_policy_link + if self.size is not None: body['size'] = self.size + if self.support_link is not None: body['support_link'] = self.support_link + if self.tags: body['tags'] = [v.as_dict() for v in self.tags] + if self.terms_of_service is not None: body['terms_of_service'] = self.terms_of_service + if self.update_frequency: body['update_frequency'] = self.update_frequency.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListingDetail: + """Deserializes the ListingDetail from a dictionary.""" + return cls(assets=_repeated_enum(d, 'assets', AssetType), + collection_date_end=d.get('collection_date_end', None), + collection_date_start=d.get('collection_date_start', None), + collection_granularity=_from_dict(d, 'collection_granularity', DataRefreshInfo), + cost=_enum(d, 'cost', Cost), + data_source=d.get('data_source', None), + description=d.get('description', None), + documentation_link=d.get('documentation_link', None), + embedded_notebook_file_infos=_repeated_dict(d, 'embedded_notebook_file_infos', FileInfo), + file_ids=d.get('file_ids', None), + geographical_coverage=d.get('geographical_coverage', None), + license=d.get('license', None), + pricing_model=d.get('pricing_model', None), + privacy_policy_link=d.get('privacy_policy_link', None), + size=d.get('size', None), + support_link=d.get('support_link', None), + tags=_repeated_dict(d, 'tags', ListingTag), + terms_of_service=d.get('terms_of_service', None), + update_frequency=_from_dict(d, 'update_frequency', DataRefreshInfo)) + + +@dataclass +class ListingFulfillment: + listing_id: str + + fulfillment_type: Optional[FulfillmentType] = None + + recipient_type: Optional[DeltaSharingRecipientType] = None + + repo_info: Optional[RepoInfo] = None + + share_info: Optional[ShareInfo] = None + + def as_dict(self) -> dict: + """Serializes the ListingFulfillment into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.fulfillment_type is not None: body['fulfillment_type'] 
= self.fulfillment_type.value + if self.listing_id is not None: body['listing_id'] = self.listing_id + if self.recipient_type is not None: body['recipient_type'] = self.recipient_type.value + if self.repo_info: body['repo_info'] = self.repo_info.as_dict() + if self.share_info: body['share_info'] = self.share_info.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListingFulfillment: + """Deserializes the ListingFulfillment from a dictionary.""" + return cls(fulfillment_type=_enum(d, 'fulfillment_type', FulfillmentType), + listing_id=d.get('listing_id', None), + recipient_type=_enum(d, 'recipient_type', DeltaSharingRecipientType), + repo_info=_from_dict(d, 'repo_info', RepoInfo), + share_info=_from_dict(d, 'share_info', ShareInfo)) + + +@dataclass +class ListingSetting: + filters: Optional[List[VisibilityFilter]] = None + """filters are joined with `or` conjunction.""" + + visibility: Optional[Visibility] = None + + def as_dict(self) -> dict: + """Serializes the ListingSetting into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.filters: body['filters'] = [v.as_dict() for v in self.filters] + if self.visibility is not None: body['visibility'] = self.visibility.value + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListingSetting: + """Deserializes the ListingSetting from a dictionary.""" + return cls(filters=_repeated_dict(d, 'filters', VisibilityFilter), + visibility=_enum(d, 'visibility', Visibility)) + + +class ListingShareType(Enum): + + FULL = 'FULL' + SAMPLE = 'SAMPLE' + + +class ListingStatus(Enum): + """Enums""" + + DRAFT = 'DRAFT' + PENDING = 'PENDING' + PUBLISHED = 'PUBLISHED' + SUSPENDED = 'SUSPENDED' + + +@dataclass +class ListingSummary: + """Next Number: 26""" + + name: str + + listing_type: ListingType + + categories: Optional[List[Category]] = None + + created_at: Optional[int] = None + + created_by: Optional[str] = None + + created_by_id: Optional[int] = None + + exchange_ids: Optional[List[str]] = None + + git_repo: Optional[RepoInfo] = None + """if a git repo is being created, a listing will be initialized with this field as opposed to a + share""" + + metastore_id: Optional[str] = None + + provider_id: Optional[str] = None + + provider_region: Optional[RegionInfo] = None + + published_at: Optional[int] = None + + published_by: Optional[str] = None + + setting: Optional[ListingSetting] = None + + share: Optional[ShareInfo] = None + + status: Optional[ListingStatus] = None + """Enums""" + + subtitle: Optional[str] = None + + updated_at: Optional[int] = None + + updated_by: Optional[str] = None + + updated_by_id: Optional[int] = None + + def as_dict(self) -> dict: + """Serializes the ListingSummary into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.categories: body['categories'] = [v.value for v in self.categories] + if self.created_at is not None: body['created_at'] = self.created_at + if self.created_by is not None: body['created_by'] = self.created_by + if self.created_by_id is not None: body['created_by_id'] = self.created_by_id + if self.exchange_ids: body['exchange_ids'] = [v for v in self.exchange_ids] + if self.git_repo: body['git_repo'] = self.git_repo.as_dict() + if self.listing_type is not None: body['listingType'] = self.listing_type.value + if self.metastore_id is not None: body['metastore_id'] = self.metastore_id + if self.name is not None: body['name'] = self.name + if self.provider_id is not None: body['provider_id'] = self.provider_id + 
if self.provider_region: body['provider_region'] = self.provider_region.as_dict() + if self.published_at is not None: body['published_at'] = self.published_at + if self.published_by is not None: body['published_by'] = self.published_by + if self.setting: body['setting'] = self.setting.as_dict() + if self.share: body['share'] = self.share.as_dict() + if self.status is not None: body['status'] = self.status.value + if self.subtitle is not None: body['subtitle'] = self.subtitle + if self.updated_at is not None: body['updated_at'] = self.updated_at + if self.updated_by is not None: body['updated_by'] = self.updated_by + if self.updated_by_id is not None: body['updated_by_id'] = self.updated_by_id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListingSummary: + """Deserializes the ListingSummary from a dictionary.""" + return cls(categories=_repeated_enum(d, 'categories', Category), + created_at=d.get('created_at', None), + created_by=d.get('created_by', None), + created_by_id=d.get('created_by_id', None), + exchange_ids=d.get('exchange_ids', None), + git_repo=_from_dict(d, 'git_repo', RepoInfo), + listing_type=_enum(d, 'listingType', ListingType), + metastore_id=d.get('metastore_id', None), + name=d.get('name', None), + provider_id=d.get('provider_id', None), + provider_region=_from_dict(d, 'provider_region', RegionInfo), + published_at=d.get('published_at', None), + published_by=d.get('published_by', None), + setting=_from_dict(d, 'setting', ListingSetting), + share=_from_dict(d, 'share', ShareInfo), + status=_enum(d, 'status', ListingStatus), + subtitle=d.get('subtitle', None), + updated_at=d.get('updated_at', None), + updated_by=d.get('updated_by', None), + updated_by_id=d.get('updated_by_id', None)) + + +@dataclass +class ListingTag: + tag_name: Optional[ListingTagType] = None + """Tag name (enum)""" + + tag_values: Optional[List[str]] = None + """String representation of the tag value. 
Values should be string literals (no complex types)""" + + def as_dict(self) -> dict: + """Serializes the ListingTag into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.tag_name is not None: body['tag_name'] = self.tag_name.value + if self.tag_values: body['tag_values'] = [v for v in self.tag_values] + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ListingTag: + """Deserializes the ListingTag from a dictionary.""" + return cls(tag_name=_enum(d, 'tag_name', ListingTagType), tag_values=d.get('tag_values', None)) + + +class ListingTagType(Enum): + + LISTING_TAG_TYPE_LANGUAGE = 'LISTING_TAG_TYPE_LANGUAGE' + LISTING_TAG_TYPE_TASK = 'LISTING_TAG_TYPE_TASK' + LISTING_TAG_TYPE_UNSPECIFIED = 'LISTING_TAG_TYPE_UNSPECIFIED' + + +class ListingType(Enum): + + PERSONALIZED = 'PERSONALIZED' + STANDARD = 'STANDARD' + + +class MarketplaceFileType(Enum): + + EMBEDDED_NOTEBOOK = 'EMBEDDED_NOTEBOOK' + PROVIDER_ICON = 'PROVIDER_ICON' + + +@dataclass +class PersonalizationRequest: + consumer_region: RegionInfo + + comment: Optional[str] = None + + contact_info: Optional[ContactInfo] = None + """contact info for the consumer requesting data or performing a listing installation""" + + created_at: Optional[int] = None + + id: Optional[str] = None + + intended_use: Optional[str] = None + + is_from_lighthouse: Optional[bool] = None + + listing_id: Optional[str] = None + + listing_name: Optional[str] = None + + metastore_id: Optional[str] = None + + provider_id: Optional[str] = None + + recipient_type: Optional[DeltaSharingRecipientType] = None + + share: Optional[ShareInfo] = None + + status: Optional[PersonalizationRequestStatus] = None + + status_message: Optional[str] = None + + updated_at: Optional[int] = None + + def as_dict(self) -> dict: + """Serializes the PersonalizationRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.comment is not None: body['comment'] = self.comment + if self.consumer_region: body['consumer_region'] = self.consumer_region.as_dict() + if self.contact_info: body['contact_info'] = self.contact_info.as_dict() + if self.created_at is not None: body['created_at'] = self.created_at + if self.id is not None: body['id'] = self.id + if self.intended_use is not None: body['intended_use'] = self.intended_use + if self.is_from_lighthouse is not None: body['is_from_lighthouse'] = self.is_from_lighthouse + if self.listing_id is not None: body['listing_id'] = self.listing_id + if self.listing_name is not None: body['listing_name'] = self.listing_name + if self.metastore_id is not None: body['metastore_id'] = self.metastore_id + if self.provider_id is not None: body['provider_id'] = self.provider_id + if self.recipient_type is not None: body['recipient_type'] = self.recipient_type.value + if self.share: body['share'] = self.share.as_dict() + if self.status is not None: body['status'] = self.status.value + if self.status_message is not None: body['status_message'] = self.status_message + if self.updated_at is not None: body['updated_at'] = self.updated_at + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> PersonalizationRequest: + """Deserializes the PersonalizationRequest from a dictionary.""" + return cls(comment=d.get('comment', None), + consumer_region=_from_dict(d, 'consumer_region', RegionInfo), + contact_info=_from_dict(d, 'contact_info', ContactInfo), + created_at=d.get('created_at', None), + id=d.get('id', None), + intended_use=d.get('intended_use', None), + 
is_from_lighthouse=d.get('is_from_lighthouse', None), + listing_id=d.get('listing_id', None), + listing_name=d.get('listing_name', None), + metastore_id=d.get('metastore_id', None), + provider_id=d.get('provider_id', None), + recipient_type=_enum(d, 'recipient_type', DeltaSharingRecipientType), + share=_from_dict(d, 'share', ShareInfo), + status=_enum(d, 'status', PersonalizationRequestStatus), + status_message=d.get('status_message', None), + updated_at=d.get('updated_at', None)) + + +class PersonalizationRequestStatus(Enum): + + DENIED = 'DENIED' + FULFILLED = 'FULFILLED' + NEW = 'NEW' + REQUEST_PENDING = 'REQUEST_PENDING' + + +@dataclass +class ProviderAnalyticsDashboard: + id: str + + def as_dict(self) -> dict: + """Serializes the ProviderAnalyticsDashboard into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.id is not None: body['id'] = self.id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ProviderAnalyticsDashboard: + """Deserializes the ProviderAnalyticsDashboard from a dictionary.""" + return cls(id=d.get('id', None)) + + +@dataclass +class ProviderInfo: + name: str + + business_contact_email: str + + term_of_service_link: str + + privacy_policy_link: str + + company_website_link: Optional[str] = None + + dark_mode_icon_file_id: Optional[str] = None + + dark_mode_icon_file_path: Optional[str] = None + + description: Optional[str] = None + + icon_file_id: Optional[str] = None + + icon_file_path: Optional[str] = None + + id: Optional[str] = None + + is_featured: Optional[bool] = None + """is_featured is accessible by consumers only""" + + published_by: Optional[str] = None + """published_by is only applicable to data aggregators (e.g. Crux)""" + + support_contact_email: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the ProviderInfo into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.business_contact_email is not None: + body['business_contact_email'] = self.business_contact_email + if self.company_website_link is not None: body['company_website_link'] = self.company_website_link + if self.dark_mode_icon_file_id is not None: + body['dark_mode_icon_file_id'] = self.dark_mode_icon_file_id + if self.dark_mode_icon_file_path is not None: + body['dark_mode_icon_file_path'] = self.dark_mode_icon_file_path + if self.description is not None: body['description'] = self.description + if self.icon_file_id is not None: body['icon_file_id'] = self.icon_file_id + if self.icon_file_path is not None: body['icon_file_path'] = self.icon_file_path + if self.id is not None: body['id'] = self.id + if self.is_featured is not None: body['is_featured'] = self.is_featured + if self.name is not None: body['name'] = self.name + if self.privacy_policy_link is not None: body['privacy_policy_link'] = self.privacy_policy_link + if self.published_by is not None: body['published_by'] = self.published_by + if self.support_contact_email is not None: body['support_contact_email'] = self.support_contact_email + if self.term_of_service_link is not None: body['term_of_service_link'] = self.term_of_service_link + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ProviderInfo: + """Deserializes the ProviderInfo from a dictionary.""" + return cls(business_contact_email=d.get('business_contact_email', None), + company_website_link=d.get('company_website_link', None), + dark_mode_icon_file_id=d.get('dark_mode_icon_file_id', None), + dark_mode_icon_file_path=d.get('dark_mode_icon_file_path', 
None), + description=d.get('description', None), + icon_file_id=d.get('icon_file_id', None), + icon_file_path=d.get('icon_file_path', None), + id=d.get('id', None), + is_featured=d.get('is_featured', None), + name=d.get('name', None), + privacy_policy_link=d.get('privacy_policy_link', None), + published_by=d.get('published_by', None), + support_contact_email=d.get('support_contact_email', None), + term_of_service_link=d.get('term_of_service_link', None)) + + +@dataclass +class RegionInfo: + cloud: Optional[str] = None + + region: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the RegionInfo into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.cloud is not None: body['cloud'] = self.cloud + if self.region is not None: body['region'] = self.region + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> RegionInfo: + """Deserializes the RegionInfo from a dictionary.""" + return cls(cloud=d.get('cloud', None), region=d.get('region', None)) + + +@dataclass +class RemoveExchangeForListingResponse: + + def as_dict(self) -> dict: + """Serializes the RemoveExchangeForListingResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> RemoveExchangeForListingResponse: + """Deserializes the RemoveExchangeForListingResponse from a dictionary.""" + return cls() + + +@dataclass +class RepoInfo: + git_repo_url: str + """the git repo url e.g. https://github.com/databrickslabs/dolly.git""" + + def as_dict(self) -> dict: + """Serializes the RepoInfo into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.git_repo_url is not None: body['git_repo_url'] = self.git_repo_url + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> RepoInfo: + """Deserializes the RepoInfo from a dictionary.""" + return cls(git_repo_url=d.get('git_repo_url', None)) + + +@dataclass +class RepoInstallation: + repo_name: str + """the user-specified repo name for their installed git repo listing""" + + repo_path: str + """refers to the full url file path that navigates the user to the repo's entrypoint (e.g. 
a + README.md file, or the repo file view in the unified UI) should just be a relative path""" + + def as_dict(self) -> dict: + """Serializes the RepoInstallation into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.repo_name is not None: body['repo_name'] = self.repo_name + if self.repo_path is not None: body['repo_path'] = self.repo_path + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> RepoInstallation: + """Deserializes the RepoInstallation from a dictionary.""" + return cls(repo_name=d.get('repo_name', None), repo_path=d.get('repo_path', None)) + + +@dataclass +class SearchListingsResponse: + listings: Optional[List[Listing]] = None + + next_page_token: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the SearchListingsResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.listings: body['listings'] = [v.as_dict() for v in self.listings] + if self.next_page_token is not None: body['next_page_token'] = self.next_page_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> SearchListingsResponse: + """Deserializes the SearchListingsResponse from a dictionary.""" + return cls(listings=_repeated_dict(d, 'listings', Listing), + next_page_token=d.get('next_page_token', None)) + + +@dataclass +class ShareInfo: + name: str + + type: ListingShareType + + def as_dict(self) -> dict: + """Serializes the ShareInfo into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.name is not None: body['name'] = self.name + if self.type is not None: body['type'] = self.type.value + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> ShareInfo: + """Deserializes the ShareInfo from a dictionary.""" + return cls(name=d.get('name', None), type=_enum(d, 'type', ListingShareType)) + + +@dataclass +class SharedDataObject: + data_object_type: Optional[str] = None + """The type of the data object. 
Could be one of: TABLE, SCHEMA, NOTEBOOK_FILE, MODEL, VOLUME""" + + name: Optional[str] = None + """Name of the shared object""" + + def as_dict(self) -> dict: + """Serializes the SharedDataObject into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.data_object_type is not None: body['data_object_type'] = self.data_object_type + if self.name is not None: body['name'] = self.name + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> SharedDataObject: + """Deserializes the SharedDataObject from a dictionary.""" + return cls(data_object_type=d.get('data_object_type', None), name=d.get('name', None)) + + +class SortBy(Enum): + + SORT_BY_DATE = 'SORT_BY_DATE' + SORT_BY_RELEVANCE = 'SORT_BY_RELEVANCE' + SORT_BY_TITLE = 'SORT_BY_TITLE' + SORT_BY_UNSPECIFIED = 'SORT_BY_UNSPECIFIED' + + +@dataclass +class SortBySpec: + sort_by: SortBy + """The field on which to sort the listing.""" + + sort_order: SortOrder + """The order in which to sort the listing.""" + + def as_dict(self) -> dict: + """Serializes the SortBySpec into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.sort_by is not None: body['sort_by'] = self.sort_by.value + if self.sort_order is not None: body['sort_order'] = self.sort_order.value + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> SortBySpec: + """Deserializes the SortBySpec from a dictionary.""" + return cls(sort_by=_enum(d, 'sort_by', SortBy), sort_order=_enum(d, 'sort_order', SortOrder)) + + +class SortOrder(Enum): + + SORT_ORDER_ASCENDING = 'SORT_ORDER_ASCENDING' + SORT_ORDER_DESCENDING = 'SORT_ORDER_DESCENDING' + SORT_ORDER_UNSPECIFIED = 'SORT_ORDER_UNSPECIFIED' + + +@dataclass +class TokenDetail: + bearer_token: Optional[str] = None + + endpoint: Optional[str] = None + + expiration_time: Optional[str] = None + + share_credentials_version: Optional[int] = None + """These field names must follow the delta sharing protocol. Original message: + RetrieveToken.Response in managed-catalog/api/messages/recipient.proto""" + + def as_dict(self) -> dict: + """Serializes the TokenDetail into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.bearer_token is not None: body['bearerToken'] = self.bearer_token + if self.endpoint is not None: body['endpoint'] = self.endpoint + if self.expiration_time is not None: body['expirationTime'] = self.expiration_time + if self.share_credentials_version is not None: + body['shareCredentialsVersion'] = self.share_credentials_version + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> TokenDetail: + """Deserializes the TokenDetail from a dictionary.""" + return cls(bearer_token=d.get('bearerToken', None), + endpoint=d.get('endpoint', None), + expiration_time=d.get('expirationTime', None), + share_credentials_version=d.get('shareCredentialsVersion', None)) + + +@dataclass +class TokenInfo: + activation_url: Optional[str] = None + """Full activation url to retrieve the access token. 
It will be empty if the token is already + retrieved.""" + + created_at: Optional[int] = None + """Time at which this Recipient Token was created, in epoch milliseconds.""" + + created_by: Optional[str] = None + """Username of Recipient Token creator.""" + + expiration_time: Optional[int] = None + """Expiration timestamp of the token in epoch milliseconds.""" + + id: Optional[str] = None + """Unique id of the Recipient Token.""" + + updated_at: Optional[int] = None + """Time at which this Recipient Token was updated, in epoch milliseconds.""" + + updated_by: Optional[str] = None + """Username of Recipient Token updater.""" + + def as_dict(self) -> dict: + """Serializes the TokenInfo into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.activation_url is not None: body['activation_url'] = self.activation_url + if self.created_at is not None: body['created_at'] = self.created_at + if self.created_by is not None: body['created_by'] = self.created_by + if self.expiration_time is not None: body['expiration_time'] = self.expiration_time + if self.id is not None: body['id'] = self.id + if self.updated_at is not None: body['updated_at'] = self.updated_at + if self.updated_by is not None: body['updated_by'] = self.updated_by + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> TokenInfo: + """Deserializes the TokenInfo from a dictionary.""" + return cls(activation_url=d.get('activation_url', None), + created_at=d.get('created_at', None), + created_by=d.get('created_by', None), + expiration_time=d.get('expiration_time', None), + id=d.get('id', None), + updated_at=d.get('updated_at', None), + updated_by=d.get('updated_by', None)) + + +@dataclass +class UpdateExchangeFilterRequest: + filter: ExchangeFilter + + id: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the UpdateExchangeFilterRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.filter: body['filter'] = self.filter.as_dict() + if self.id is not None: body['id'] = self.id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdateExchangeFilterRequest: + """Deserializes the UpdateExchangeFilterRequest from a dictionary.""" + return cls(filter=_from_dict(d, 'filter', ExchangeFilter), id=d.get('id', None)) + + +@dataclass +class UpdateExchangeFilterResponse: + filter: Optional[ExchangeFilter] = None + + def as_dict(self) -> dict: + """Serializes the UpdateExchangeFilterResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.filter: body['filter'] = self.filter.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdateExchangeFilterResponse: + """Deserializes the UpdateExchangeFilterResponse from a dictionary.""" + return cls(filter=_from_dict(d, 'filter', ExchangeFilter)) + + +@dataclass +class UpdateExchangeRequest: + exchange: Exchange + + id: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the UpdateExchangeRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.exchange: body['exchange'] = self.exchange.as_dict() + if self.id is not None: body['id'] = self.id + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdateExchangeRequest: + """Deserializes the UpdateExchangeRequest from a dictionary.""" + return cls(exchange=_from_dict(d, 'exchange', Exchange), id=d.get('id', None)) + + +@dataclass +class UpdateExchangeResponse: + exchange: Optional[Exchange] = None + + def 
as_dict(self) -> dict: + """Serializes the UpdateExchangeResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.exchange: body['exchange'] = self.exchange.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdateExchangeResponse: + """Deserializes the UpdateExchangeResponse from a dictionary.""" + return cls(exchange=_from_dict(d, 'exchange', Exchange)) + + +@dataclass +class UpdateInstallationRequest: + installation: InstallationDetail + + installation_id: Optional[str] = None + + listing_id: Optional[str] = None + + rotate_token: Optional[bool] = None + + def as_dict(self) -> dict: + """Serializes the UpdateInstallationRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.installation: body['installation'] = self.installation.as_dict() + if self.installation_id is not None: body['installation_id'] = self.installation_id + if self.listing_id is not None: body['listing_id'] = self.listing_id + if self.rotate_token is not None: body['rotate_token'] = self.rotate_token + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdateInstallationRequest: + """Deserializes the UpdateInstallationRequest from a dictionary.""" + return cls(installation=_from_dict(d, 'installation', InstallationDetail), + installation_id=d.get('installation_id', None), + listing_id=d.get('listing_id', None), + rotate_token=d.get('rotate_token', None)) + + +@dataclass +class UpdateInstallationResponse: + installation: Optional[InstallationDetail] = None + + def as_dict(self) -> dict: + """Serializes the UpdateInstallationResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.installation: body['installation'] = self.installation.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdateInstallationResponse: + """Deserializes the UpdateInstallationResponse from a dictionary.""" + return cls(installation=_from_dict(d, 'installation', InstallationDetail)) + + +@dataclass +class UpdateListingRequest: + listing: Listing + + id: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the UpdateListingRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.id is not None: body['id'] = self.id + if self.listing: body['listing'] = self.listing.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdateListingRequest: + """Deserializes the UpdateListingRequest from a dictionary.""" + return cls(id=d.get('id', None), listing=_from_dict(d, 'listing', Listing)) + + +@dataclass +class UpdateListingResponse: + listing: Optional[Listing] = None + + def as_dict(self) -> dict: + """Serializes the UpdateListingResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.listing: body['listing'] = self.listing.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdateListingResponse: + """Deserializes the UpdateListingResponse from a dictionary.""" + return cls(listing=_from_dict(d, 'listing', Listing)) + + +@dataclass +class UpdatePersonalizationRequestRequest: + status: PersonalizationRequestStatus + + listing_id: Optional[str] = None + + reason: Optional[str] = None + + request_id: Optional[str] = None + + share: Optional[ShareInfo] = None + + def as_dict(self) -> dict: + """Serializes the UpdatePersonalizationRequestRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + 
if self.listing_id is not None: body['listing_id'] = self.listing_id + if self.reason is not None: body['reason'] = self.reason + if self.request_id is not None: body['request_id'] = self.request_id + if self.share: body['share'] = self.share.as_dict() + if self.status is not None: body['status'] = self.status.value + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdatePersonalizationRequestRequest: + """Deserializes the UpdatePersonalizationRequestRequest from a dictionary.""" + return cls(listing_id=d.get('listing_id', None), + reason=d.get('reason', None), + request_id=d.get('request_id', None), + share=_from_dict(d, 'share', ShareInfo), + status=_enum(d, 'status', PersonalizationRequestStatus)) + + +@dataclass +class UpdatePersonalizationRequestResponse: + request: Optional[PersonalizationRequest] = None + + def as_dict(self) -> dict: + """Serializes the UpdatePersonalizationRequestResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.request: body['request'] = self.request.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdatePersonalizationRequestResponse: + """Deserializes the UpdatePersonalizationRequestResponse from a dictionary.""" + return cls(request=_from_dict(d, 'request', PersonalizationRequest)) + + +@dataclass +class UpdateProviderAnalyticsDashboardRequest: + id: Optional[str] = None + """id is an immutable property and can't be updated.""" + + version: Optional[int] = None + """The version of the dashboard template to which the user should be updated; currently this is + expected to equal the latest version of the dashboard template""" + + def as_dict(self) -> dict: + """Serializes the UpdateProviderAnalyticsDashboardRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.id is not None: body['id'] = self.id + if self.version is not None: body['version'] = self.version + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdateProviderAnalyticsDashboardRequest: + """Deserializes the UpdateProviderAnalyticsDashboardRequest from a dictionary.""" + return cls(id=d.get('id', None), version=d.get('version', None)) + + +@dataclass +class UpdateProviderAnalyticsDashboardResponse: + id: str + """id and version should be the same as in the request""" + + dashboard_id: str + """The newly created Lakeview dashboard for the user""" + + version: Optional[int] = None + + def as_dict(self) -> dict: + """Serializes the UpdateProviderAnalyticsDashboardResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.dashboard_id is not None: body['dashboard_id'] = self.dashboard_id + if self.id is not None: body['id'] = self.id + if self.version is not None: body['version'] = self.version + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdateProviderAnalyticsDashboardResponse: + """Deserializes the UpdateProviderAnalyticsDashboardResponse from a dictionary.""" + return cls(dashboard_id=d.get('dashboard_id', None), + id=d.get('id', None), + version=d.get('version', None)) + + +@dataclass +class UpdateProviderRequest: + provider: ProviderInfo + + id: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the UpdateProviderRequest into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.id is not None: body['id'] = self.id + if self.provider: body['provider'] = self.provider.as_dict() + return body + + @classmethod + def from_dict(cls, d: 
Dict[str, any]) -> UpdateProviderRequest: + """Deserializes the UpdateProviderRequest from a dictionary.""" + return cls(id=d.get('id', None), provider=_from_dict(d, 'provider', ProviderInfo)) + + +@dataclass +class UpdateProviderResponse: + provider: Optional[ProviderInfo] = None + + def as_dict(self) -> dict: + """Serializes the UpdateProviderResponse into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.provider: body['provider'] = self.provider.as_dict() + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> UpdateProviderResponse: + """Deserializes the UpdateProviderResponse from a dictionary.""" + return cls(provider=_from_dict(d, 'provider', ProviderInfo)) + + +class Visibility(Enum): + + PRIVATE = 'PRIVATE' + PUBLIC = 'PUBLIC' + + +@dataclass +class VisibilityFilter: + filter_type: Optional[FilterType] = None + + filter_value: Optional[str] = None + + def as_dict(self) -> dict: + """Serializes the VisibilityFilter into a dictionary suitable for use as a JSON request body.""" + body = {} + if self.filter_type is not None: body['filterType'] = self.filter_type.value + if self.filter_value is not None: body['filterValue'] = self.filter_value + return body + + @classmethod + def from_dict(cls, d: Dict[str, any]) -> VisibilityFilter: + """Deserializes the VisibilityFilter from a dictionary.""" + return cls(filter_type=_enum(d, 'filterType', FilterType), filter_value=d.get('filterValue', None)) + + +class ConsumerFulfillmentsAPI: + """Fulfillments are entities that allow consumers to preview installations.""" + + def __init__(self, api_client): + self._api = api_client + + def get(self, + listing_id: str, + *, + page_size: Optional[int] = None, + page_token: Optional[str] = None) -> Iterator[SharedDataObject]: + """Get listing content metadata. + + Get a high level preview of the metadata of listing installable content. + + :param listing_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`SharedDataObject` + """ + + query = {} + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', + f'/api/2.1/marketplace-consumer/listings/{listing_id}/content', + query=query, + headers=headers) + if 'shared_data_objects' in json: + for v in json['shared_data_objects']: + yield SharedDataObject.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + def list(self, + listing_id: str, + *, + page_size: Optional[int] = None, + page_token: Optional[str] = None) -> Iterator[ListingFulfillment]: + """List all listing fulfillments. + + Get all listings fulfillments associated with a listing. A _fulfillment_ is a potential installation. + Standard installations contain metadata about the attached share or git repo. Only one of these fields + will be present. Personalized installations contain metadata about the attached share or git repo, as + well as the Delta Sharing recipient type. 
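A minimal consumer-side sketch of the `get` iterator above, which previews a listing's installable content. The `w.consumer_fulfillments` accessor name and the listing ID are assumptions, not something this diff shows:

```py
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Page through the content preview; the iterator follows next_page_token
# internally, so no manual pagination is needed.
# NOTE: accessor name `consumer_fulfillments` is an assumption.
for obj in w.consumer_fulfillments.get(listing_id="<listing-id>", page_size=50):
    print(obj.data_object_type, obj.name)
```
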
+ + :param listing_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`ListingFulfillment` + """ + + query = {} + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', + f'/api/2.1/marketplace-consumer/listings/{listing_id}/fulfillments', + query=query, + headers=headers) + if 'fulfillments' in json: + for v in json['fulfillments']: + yield ListingFulfillment.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + +class ConsumerInstallationsAPI: + """Installations are entities that allow consumers to interact with Databricks Marketplace listings.""" + + def __init__(self, api_client): + self._api = api_client + + def create(self, + listing_id: str, + *, + accepted_consumer_terms: Optional[ConsumerTerms] = None, + catalog_name: Optional[str] = None, + recipient_type: Optional[DeltaSharingRecipientType] = None, + repo_detail: Optional[RepoInstallation] = None, + share_name: Optional[str] = None) -> Installation: + """Install from a listing. + + Install payload associated with a Databricks Marketplace listing. + + :param listing_id: str + :param accepted_consumer_terms: :class:`ConsumerTerms` (optional) + :param catalog_name: str (optional) + :param recipient_type: :class:`DeltaSharingRecipientType` (optional) + :param repo_detail: :class:`RepoInstallation` (optional) + for git repo installations + :param share_name: str (optional) + + :returns: :class:`Installation` + """ + body = {} + if accepted_consumer_terms is not None: + body['accepted_consumer_terms'] = accepted_consumer_terms.as_dict() + if catalog_name is not None: body['catalog_name'] = catalog_name + if recipient_type is not None: body['recipient_type'] = recipient_type.value + if repo_detail is not None: body['repo_detail'] = repo_detail.as_dict() + if share_name is not None: body['share_name'] = share_name + headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } + + res = self._api.do('POST', + f'/api/2.1/marketplace-consumer/listings/{listing_id}/installations', + body=body, + headers=headers) + return Installation.from_dict(res) + + def delete(self, listing_id: str, installation_id: str): + """Uninstall from a listing. + + Uninstall an installation associated with a Databricks Marketplace listing. + + :param listing_id: str + :param installation_id: str + + + """ + + headers = {'Accept': 'application/json', } + + self._api.do('DELETE', + f'/api/2.1/marketplace-consumer/listings/{listing_id}/installations/{installation_id}', + headers=headers) + + def list(self, + *, + page_size: Optional[int] = None, + page_token: Optional[str] = None) -> Iterator[InstallationDetail]: + """List all installations. + + List all installations across all listings. 
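A hedged sketch of installing a listing's payload with the `create` method above; the `w.consumer_installations` accessor name, listing ID, and catalog name are assumptions:

```py
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Install the listing's share into a target catalog; accepted_consumer_terms,
# recipient_type, etc. are optional and omitted here.
# NOTE: accessor name and the two string values are placeholders/assumptions.
installation = w.consumer_installations.create(listing_id="<listing-id>",
                                               catalog_name="main")
print(installation)

# Uninstalling later takes both identifiers:
# w.consumer_installations.delete(listing_id="<listing-id>",
#                                 installation_id="<installation-id>")
```
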
+ + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`InstallationDetail` + """ + + query = {} + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', + '/api/2.1/marketplace-consumer/installations', + query=query, + headers=headers) + if 'installations' in json: + for v in json['installations']: + yield InstallationDetail.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + def list_listing_installations(self, + listing_id: str, + *, + page_size: Optional[int] = None, + page_token: Optional[str] = None) -> Iterator[InstallationDetail]: + """List installations for a listing. + + List all installations for a particular listing. + + :param listing_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`InstallationDetail` + """ + + query = {} + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', + f'/api/2.1/marketplace-consumer/listings/{listing_id}/installations', + query=query, + headers=headers) + if 'installations' in json: + for v in json['installations']: + yield InstallationDetail.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + def update(self, + listing_id: str, + installation_id: str, + installation: InstallationDetail, + *, + rotate_token: Optional[bool] = None) -> UpdateInstallationResponse: + """Update an installation. + + This update API modifies the fields stored in the installation table and may also interact with + external services for fields not stored there: 1. the token is rotated if the rotate_token flag is + true; 2. the token is forcibly rotated if the rotate_token flag is true and the token_info field is + empty. + + :param listing_id: str + :param installation_id: str + :param installation: :class:`InstallationDetail` + :param rotate_token: bool (optional) + + :returns: :class:`UpdateInstallationResponse` + """ + body = {} + if installation is not None: body['installation'] = installation.as_dict() + if rotate_token is not None: body['rotate_token'] = rotate_token + headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } + + res = self._api.do( + 'PUT', + f'/api/2.1/marketplace-consumer/listings/{listing_id}/installations/{installation_id}', + body=body, + headers=headers) + return UpdateInstallationResponse.from_dict(res) + + +class ConsumerListingsAPI: + """Listings are the core entities in the Marketplace. They represent the products that are available for + consumption.""" + + def __init__(self, api_client): + self._api = api_client + + def get(self, id: str) -> GetListingResponse: + """Get listing. + + Get a published listing in the Databricks Marketplace that the consumer has access to. 
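A sketch of token rotation via the `update` method above: fetch each installation's detail and pass it back with `rotate_token=True`. The accessor name and the assumption that `InstallationDetail` exposes an `id` field are not confirmed by this diff:

```py
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Force-rotate the Delta Sharing token on every installation of a listing.
for detail in w.consumer_installations.list_listing_installations(listing_id="<listing-id>"):
    w.consumer_installations.update(
        listing_id="<listing-id>",
        installation_id=detail.id,  # assumes InstallationDetail carries its `id`
        installation=detail,
        rotate_token=True)
```
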
+ + :param id: str + + :returns: :class:`GetListingResponse` + """ + + headers = {'Accept': 'application/json', } + + res = self._api.do('GET', f'/api/2.1/marketplace-consumer/listings/{id}', headers=headers) + return GetListingResponse.from_dict(res) + + def list(self, + *, + assets: Optional[List[AssetType]] = None, + categories: Optional[List[Category]] = None, + is_free: Optional[bool] = None, + is_private_exchange: Optional[bool] = None, + is_staff_pick: Optional[bool] = None, + page_size: Optional[int] = None, + page_token: Optional[str] = None, + provider_ids: Optional[List[str]] = None, + sort_by_spec: Optional[SortBySpec] = None, + tags: Optional[List[ListingTag]] = None) -> Iterator[Listing]: + """List listings. + + List all published listings in the Databricks Marketplace that the consumer has access to. + + :param assets: List[:class:`AssetType`] (optional) + Matches any of the following asset types + :param categories: List[:class:`Category`] (optional) + Matches any of the following categories + :param is_free: bool (optional) + Filters each listing based on if it is free. + :param is_private_exchange: bool (optional) + Filters each listing based on if it is a private exchange. + :param is_staff_pick: bool (optional) + Filters each listing based on whether it is a staff pick. + :param page_size: int (optional) + :param page_token: str (optional) + :param provider_ids: List[str] (optional) + Matches any of the following provider ids + :param sort_by_spec: :class:`SortBySpec` (optional) + Criteria for sorting the resulting set of listings. + :param tags: List[:class:`ListingTag`] (optional) + Matches any of the following tags + + :returns: Iterator over :class:`Listing` + """ + + query = {} + if assets is not None: query['assets'] = [v.value for v in assets] + if categories is not None: query['categories'] = [v.value for v in categories] + if is_free is not None: query['is_free'] = is_free + if is_private_exchange is not None: query['is_private_exchange'] = is_private_exchange + if is_staff_pick is not None: query['is_staff_pick'] = is_staff_pick + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + if provider_ids is not None: query['provider_ids'] = [v for v in provider_ids] + if sort_by_spec is not None: query['sort_by_spec'] = sort_by_spec.as_dict() + if tags is not None: query['tags'] = [v.as_dict() for v in tags] + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', '/api/2.1/marketplace-consumer/listings', query=query, headers=headers) + if 'listings' in json: + for v in json['listings']: + yield Listing.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + def search(self, + query: str, + *, + assets: Optional[List[AssetType]] = None, + categories: Optional[List[Category]] = None, + is_free: Optional[bool] = None, + is_private_exchange: Optional[bool] = None, + page_size: Optional[int] = None, + page_token: Optional[str] = None, + provider_ids: Optional[List[str]] = None, + sort_by: Optional[SortBy] = None) -> Iterator[Listing]: + """Search listings. + + Search published listings in the Databricks Marketplace that the consumer has access to. This query + supports a variety of different search parameters and performs fuzzy matching. 
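For the `list` filters above, a sketch combining `is_free` with a sort spec. The `SortBy`/`SortOrder`/`SortBySpec` names come straight from this diff; the `w.consumer_listings` accessor and the `databricks.sdk.service.marketplace` module path are assumptions:

```py
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.marketplace import SortBy, SortBySpec, SortOrder

w = WorkspaceClient()

# Free listings sorted by title; the iterator handles pagination itself.
spec = SortBySpec(sort_by=SortBy.SORT_BY_TITLE,
                  sort_order=SortOrder.SORT_ORDER_ASCENDING)
for listing in w.consumer_listings.list(is_free=True, sort_by_spec=spec):
    print(listing.summary.name)
```
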
+ + :param query: str + Fuzzy-matched search query + :param assets: List[:class:`AssetType`] (optional) + Matches any of the following asset types + :param categories: List[:class:`Category`] (optional) + Matches any of the following categories + :param is_free: bool (optional) + :param is_private_exchange: bool (optional) + :param page_size: int (optional) + :param page_token: str (optional) + :param provider_ids: List[str] (optional) + Matches any of the following provider ids + :param sort_by: :class:`SortBy` (optional) + + :returns: Iterator over :class:`Listing` + """ + + # Capture the search string before `query` is rebound to the request's query-parameter dict. + search_query = query + query = {} + if assets is not None: query['assets'] = [v.value for v in assets] + if categories is not None: query['categories'] = [v.value for v in categories] + if is_free is not None: query['is_free'] = is_free + if is_private_exchange is not None: query['is_private_exchange'] = is_private_exchange + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + if provider_ids is not None: query['provider_ids'] = [v for v in provider_ids] + if search_query is not None: query['query'] = search_query + if sort_by is not None: query['sort_by'] = sort_by.value + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', + '/api/2.1/marketplace-consumer/search-listings', + query=query, + headers=headers) + if 'listings' in json: + for v in json['listings']: + yield Listing.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + +class ConsumerPersonalizationRequestsAPI: + """Personalization Requests allow customers to interact with the individualized Marketplace listing flow.""" + + def __init__(self, api_client): + self._api = api_client + + def create( + self, + listing_id: str, + intended_use: str, + accepted_consumer_terms: ConsumerTerms, + *, + comment: Optional[str] = None, + company: Optional[str] = None, + first_name: Optional[str] = None, + is_from_lighthouse: Optional[bool] = None, + last_name: Optional[str] = None, + recipient_type: Optional[DeltaSharingRecipientType] = None + ) -> CreatePersonalizationRequestResponse: + """Create a personalization request. + + Create a personalization request for a listing. 
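A sketch of the fuzzy `search` above, sorted by relevance. `SortBy.SORT_BY_RELEVANCE` is taken from this diff; the accessor name, module path, and search term are assumptions:

```py
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.marketplace import SortBy

w = WorkspaceClient()

# Fuzzy search across published listings, most relevant first.
for listing in w.consumer_listings.search("weather", is_free=True,
                                          sort_by=SortBy.SORT_BY_RELEVANCE):
    print(listing.summary.name)
```
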
+ + :param listing_id: str + :param intended_use: str + :param accepted_consumer_terms: :class:`ConsumerTerms` + :param comment: str (optional) + :param company: str (optional) + :param first_name: str (optional) + :param is_from_lighthouse: bool (optional) + :param last_name: str (optional) + :param recipient_type: :class:`DeltaSharingRecipientType` (optional) + + :returns: :class:`CreatePersonalizationRequestResponse` + """ + body = {} + if accepted_consumer_terms is not None: + body['accepted_consumer_terms'] = accepted_consumer_terms.as_dict() + if comment is not None: body['comment'] = comment + if company is not None: body['company'] = company + if first_name is not None: body['first_name'] = first_name + if intended_use is not None: body['intended_use'] = intended_use + if is_from_lighthouse is not None: body['is_from_lighthouse'] = is_from_lighthouse + if last_name is not None: body['last_name'] = last_name + if recipient_type is not None: body['recipient_type'] = recipient_type.value + headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } + + res = self._api.do('POST', + f'/api/2.1/marketplace-consumer/listings/{listing_id}/personalization-requests', + body=body, + headers=headers) + return CreatePersonalizationRequestResponse.from_dict(res) + + def get(self, listing_id: str) -> GetPersonalizationRequestResponse: + """Get the personalization request for a listing. + + Get the personalization request for a listing. Each consumer can make at *most* one personalization + request for a listing. + + :param listing_id: str + + :returns: :class:`GetPersonalizationRequestResponse` + """ + + headers = {'Accept': 'application/json', } + + res = self._api.do('GET', + f'/api/2.1/marketplace-consumer/listings/{listing_id}/personalization-requests', + headers=headers) + return GetPersonalizationRequestResponse.from_dict(res) + + def list(self, + *, + page_size: Optional[int] = None, + page_token: Optional[str] = None) -> Iterator[PersonalizationRequest]: + """List all personalization requests. + + List personalization requests for a consumer across all listings. + + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`PersonalizationRequest` + """ + + query = {} + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', + '/api/2.1/marketplace-consumer/personalization-requests', + query=query, + headers=headers) + if 'personalization_requests' in json: + for v in json['personalization_requests']: + yield PersonalizationRequest.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + +class ConsumerProvidersAPI: + """Providers are the entities that publish listings to the Marketplace.""" + + def __init__(self, api_client): + self._api = api_client + + def get(self, id: str) -> GetProviderResponse: + """Get a provider. + + Get a provider in the Databricks Marketplace with at least one visible listing. 
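A sketch of reviewing a consumer's personalization requests with the `list` iterator above; the fields printed (`listing_name`, `status`, `intended_use`) are defined on `PersonalizationRequest` earlier in this diff, while the accessor name is an assumption:

```py
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# One PersonalizationRequest per listing, across all listings.
for req in w.consumer_personalization_requests.list():
    print(req.listing_name, req.status, req.intended_use)
```
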
+ + :param id: str + + :returns: :class:`GetProviderResponse` + """ + + headers = {'Accept': 'application/json', } + + res = self._api.do('GET', f'/api/2.1/marketplace-consumer/providers/{id}', headers=headers) + return GetProviderResponse.from_dict(res) + + def list(self, + *, + is_featured: Optional[bool] = None, + page_size: Optional[int] = None, + page_token: Optional[str] = None) -> Iterator[ProviderInfo]: + """List providers. + + List all providers in the Databricks Marketplace with at least one visible listing. + + :param is_featured: bool (optional) + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`ProviderInfo` + """ + + query = {} + if is_featured is not None: query['is_featured'] = is_featured + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', + '/api/2.1/marketplace-consumer/providers', + query=query, + headers=headers) + if 'providers' in json: + for v in json['providers']: + yield ProviderInfo.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + +class ProviderExchangeFiltersAPI: + """Marketplace exchanges filters curate which groups can access an exchange.""" + + def __init__(self, api_client): + self._api = api_client + + def create(self, filter: ExchangeFilter) -> CreateExchangeFilterResponse: + """Create a new exchange filter. + + Add an exchange filter. + + :param filter: :class:`ExchangeFilter` + + :returns: :class:`CreateExchangeFilterResponse` + """ + body = {} + if filter is not None: body['filter'] = filter.as_dict() + headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } + + res = self._api.do('POST', '/api/2.0/marketplace-exchange/filters', body=body, headers=headers) + return CreateExchangeFilterResponse.from_dict(res) + + def delete(self, id: str): + """Delete an exchange filter. + + Delete an exchange filter + + :param id: str + + + """ + + headers = {'Accept': 'application/json', } + + self._api.do('DELETE', f'/api/2.0/marketplace-exchange/filters/{id}', headers=headers) + + def list(self, + exchange_id: str, + *, + page_size: Optional[int] = None, + page_token: Optional[str] = None) -> Iterator[ExchangeFilter]: + """List exchange filters. + + List exchange filter + + :param exchange_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`ExchangeFilter` + """ + + query = {} + if exchange_id is not None: query['exchange_id'] = exchange_id + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', '/api/2.0/marketplace-exchange/filters', query=query, headers=headers) + if 'filters' in json: + for v in json['filters']: + yield ExchangeFilter.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + def update(self, id: str, filter: ExchangeFilter) -> UpdateExchangeFilterResponse: + """Update exchange filter. + + Update an exchange filter. 
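A sketch of inspecting the filters that gate one exchange via the `list` method above; the `w.provider_exchange_filters` accessor and the exchange ID are assumptions, and the `ExchangeFilter` fields are not shown in this hunk, so the example only prints the dataclass:

```py
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Enumerate the access filters configured on one exchange.
for f in w.provider_exchange_filters.list(exchange_id="<exchange-id>"):
    print(f)
```
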
+ + :param id: str + :param filter: :class:`ExchangeFilter` + + :returns: :class:`UpdateExchangeFilterResponse` + """ + body = {} + if filter is not None: body['filter'] = filter.as_dict() + headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } + + res = self._api.do('PUT', f'/api/2.0/marketplace-exchange/filters/{id}', body=body, headers=headers) + return UpdateExchangeFilterResponse.from_dict(res) + + +class ProviderExchangesAPI: + """Marketplace exchanges allow providers to share their listings with a curated set of customers.""" + + def __init__(self, api_client): + self._api = api_client + + def add_listing_to_exchange(self, listing_id: str, exchange_id: str) -> AddExchangeForListingResponse: + """Add an exchange for listing. + + Associate an exchange with a listing + + :param listing_id: str + :param exchange_id: str + + :returns: :class:`AddExchangeForListingResponse` + """ + body = {} + if exchange_id is not None: body['exchange_id'] = exchange_id + if listing_id is not None: body['listing_id'] = listing_id + headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } + + res = self._api.do('POST', + '/api/2.0/marketplace-exchange/exchanges-for-listing', + body=body, + headers=headers) + return AddExchangeForListingResponse.from_dict(res) + + def create(self, exchange: Exchange) -> CreateExchangeResponse: + """Create an exchange. + + Create an exchange + + :param exchange: :class:`Exchange` + + :returns: :class:`CreateExchangeResponse` + """ + body = {} + if exchange is not None: body['exchange'] = exchange.as_dict() + headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } + + res = self._api.do('POST', '/api/2.0/marketplace-exchange/exchanges', body=body, headers=headers) + return CreateExchangeResponse.from_dict(res) + + def delete(self, id: str): + """Delete an exchange. + + This removes a listing from marketplace. + + :param id: str + + + """ + + headers = {'Accept': 'application/json', } + + self._api.do('DELETE', f'/api/2.0/marketplace-exchange/exchanges/{id}', headers=headers) + + def delete_listing_from_exchange(self, id: str): + """Remove an exchange for listing. + + Disassociate an exchange with a listing + + :param id: str + + + """ + + headers = {'Accept': 'application/json', } + + self._api.do('DELETE', f'/api/2.0/marketplace-exchange/exchanges-for-listing/{id}', headers=headers) + + def get(self, id: str) -> GetExchangeResponse: + """Get an exchange. + + Get an exchange. + + :param id: str + + :returns: :class:`GetExchangeResponse` + """ + + headers = {'Accept': 'application/json', } + + res = self._api.do('GET', f'/api/2.0/marketplace-exchange/exchanges/{id}', headers=headers) + return GetExchangeResponse.from_dict(res) + + def list(self, + *, + page_size: Optional[int] = None, + page_token: Optional[str] = None) -> Iterator[Exchange]: + """List exchanges. 
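A provider-side sketch of the exchange listing methods above; accessor name and the exchange ID are assumptions:

```py
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Exchanges visible to this provider.
for exchange in w.provider_exchanges.list():
    print(exchange)

# Listings attached to one particular exchange.
for el in w.provider_exchanges.list_listings_for_exchange(exchange_id="<exchange-id>"):
    print(el)
```
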
+ + List exchanges visible to provider + + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`Exchange` + """ + + query = {} + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', + '/api/2.0/marketplace-exchange/exchanges', + query=query, + headers=headers) + if 'exchanges' in json: + for v in json['exchanges']: + yield Exchange.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + def list_exchanges_for_listing(self, + listing_id: str, + *, + page_size: Optional[int] = None, + page_token: Optional[str] = None) -> Iterator[ExchangeListing]: + """List exchanges for listing. + + List exchanges associated with a listing + + :param listing_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`ExchangeListing` + """ + + query = {} + if listing_id is not None: query['listing_id'] = listing_id + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', + '/api/2.0/marketplace-exchange/exchanges-for-listing', + query=query, + headers=headers) + if 'exchange_listing' in json: + for v in json['exchange_listing']: + yield ExchangeListing.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + def list_listings_for_exchange(self, + exchange_id: str, + *, + page_size: Optional[int] = None, + page_token: Optional[str] = None) -> Iterator[ExchangeListing]: + """List listings for exchange. + + List listings associated with an exchange + + :param exchange_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`ExchangeListing` + """ + + query = {} + if exchange_id is not None: query['exchange_id'] = exchange_id + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', + '/api/2.0/marketplace-exchange/listings-for-exchange', + query=query, + headers=headers) + if 'exchange_listings' in json: + for v in json['exchange_listings']: + yield ExchangeListing.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + def update(self, id: str, exchange: Exchange) -> UpdateExchangeResponse: + """Update exchange. 
+ + Update an exchange + + :param id: str + :param exchange: :class:`Exchange` + + :returns: :class:`UpdateExchangeResponse` + """ + body = {} + if exchange is not None: body['exchange'] = exchange.as_dict() + headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } + + res = self._api.do('PUT', f'/api/2.0/marketplace-exchange/exchanges/{id}', body=body, headers=headers) + return UpdateExchangeResponse.from_dict(res) + + +class ProviderFilesAPI: + """Marketplace offers a set of file APIs for various purposes such as preview notebooks and provider icons.""" + + def __init__(self, api_client): + self._api = api_client + + def create(self, + file_parent: FileParent, + marketplace_file_type: MarketplaceFileType, + mime_type: str, + *, + display_name: Optional[str] = None) -> CreateFileResponse: + """Create a file. + + Create a file. Currently, only provider icons and attached notebooks are supported. + + :param file_parent: :class:`FileParent` + :param marketplace_file_type: :class:`MarketplaceFileType` + :param mime_type: str + :param display_name: str (optional) + + :returns: :class:`CreateFileResponse` + """ + body = {} + if display_name is not None: body['display_name'] = display_name + if file_parent is not None: body['file_parent'] = file_parent.as_dict() + if marketplace_file_type is not None: body['marketplace_file_type'] = marketplace_file_type.value + if mime_type is not None: body['mime_type'] = mime_type + headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } + + res = self._api.do('POST', '/api/2.0/marketplace-provider/files', body=body, headers=headers) + return CreateFileResponse.from_dict(res) + + def delete(self, file_id: str): + """Delete a file. + + Delete a file + + :param file_id: str + + + """ + + headers = {'Accept': 'application/json', } + + self._api.do('DELETE', f'/api/2.0/marketplace-provider/files/{file_id}', headers=headers) + + def get(self, file_id: str) -> GetFileResponse: + """Get a file. + + Get a file + + :param file_id: str + + :returns: :class:`GetFileResponse` + """ + + headers = {'Accept': 'application/json', } + + res = self._api.do('GET', f'/api/2.0/marketplace-provider/files/{file_id}', headers=headers) + return GetFileResponse.from_dict(res) + + def list(self, + file_parent: FileParent, + *, + page_size: Optional[int] = None, + page_token: Optional[str] = None) -> Iterator[FileInfo]: + """List files. + + List files attached to a parent entity. + + :param file_parent: :class:`FileParent` + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`FileInfo` + """ + + query = {} + if file_parent is not None: query['file_parent'] = file_parent.as_dict() + if page_size is not None: query['page_size'] = page_size + if page_token is not None: query['page_token'] = page_token + headers = {'Accept': 'application/json', } + + while True: + json = self._api.do('GET', '/api/2.0/marketplace-provider/files', query=query, headers=headers) + if 'file_infos' in json: + for v in json['file_infos']: + yield FileInfo.from_dict(v) + if 'next_page_token' not in json or not json['next_page_token']: + return + query['page_token'] = json['next_page_token'] + + +class ProviderListingsAPI: + """Listings are the core entities in the Marketplace. They represent the products that are available for + consumption.""" + + def __init__(self, api_client): + self._api = api_client + + def create(self, listing: Listing) -> CreateListingResponse: + """Create a listing. 
+
+        Create a new listing.
+
+        :param listing: :class:`Listing`
+
+        :returns: :class:`CreateListingResponse`
+        """
+        body = {}
+        if listing is not None: body['listing'] = listing.as_dict()
+        headers = {'Accept': 'application/json', 'Content-Type': 'application/json', }
+
+        res = self._api.do('POST', '/api/2.0/marketplace-provider/listing', body=body, headers=headers)
+        return CreateListingResponse.from_dict(res)
+
+    def delete(self, id: str):
+        """Delete a listing.
+
+        Delete a listing.
+
+        :param id: str
+
+
+        """
+
+        headers = {'Accept': 'application/json', }
+
+        self._api.do('DELETE', f'/api/2.0/marketplace-provider/listings/{id}', headers=headers)
+
+    def get(self, id: str) -> GetListingResponse:
+        """Get a listing.
+
+        Get a listing.
+
+        :param id: str
+
+        :returns: :class:`GetListingResponse`
+        """
+
+        headers = {'Accept': 'application/json', }
+
+        res = self._api.do('GET', f'/api/2.0/marketplace-provider/listings/{id}', headers=headers)
+        return GetListingResponse.from_dict(res)
+
+    def list(self, *, page_size: Optional[int] = None, page_token: Optional[str] = None) -> Iterator[Listing]:
+        """List listings.
+
+        List listings owned by this provider.
+
+        :param page_size: int (optional)
+        :param page_token: str (optional)
+
+        :returns: Iterator over :class:`Listing`
+        """
+
+        query = {}
+        if page_size is not None: query['page_size'] = page_size
+        if page_token is not None: query['page_token'] = page_token
+        headers = {'Accept': 'application/json', }
+
+        while True:
+            json = self._api.do('GET', '/api/2.0/marketplace-provider/listings', query=query, headers=headers)
+            if 'listings' in json:
+                for v in json['listings']:
+                    yield Listing.from_dict(v)
+            if 'next_page_token' not in json or not json['next_page_token']:
+                return
+            query['page_token'] = json['next_page_token']
+
+    def update(self, id: str, listing: Listing) -> UpdateListingResponse:
+        """Update listing.
+
+        Update a listing.
+
+        :param id: str
+        :param listing: :class:`Listing`
+
+        :returns: :class:`UpdateListingResponse`
+        """
+        body = {}
+        if listing is not None: body['listing'] = listing.as_dict()
+        headers = {'Accept': 'application/json', 'Content-Type': 'application/json', }
+
+        res = self._api.do('PUT', f'/api/2.0/marketplace-provider/listings/{id}', body=body, headers=headers)
+        return UpdateListingResponse.from_dict(res)
+
+
+class ProviderPersonalizationRequestsAPI:
+    """Personalization requests are an alternative to instantly available listings. They control the
+    lifecycle of personalized solutions."""
+
+    def __init__(self, api_client):
+        self._api = api_client
+
+    def list(self,
+             *,
+             page_size: Optional[int] = None,
+             page_token: Optional[str] = None) -> Iterator[PersonalizationRequest]:
+        """All personalization requests across all listings.
+
+        List personalization requests to this provider. This will return all personalization requests,
+        regardless of which listing they are for.
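+
+        A short usage sketch (illustrative only; the ``w.provider_personalization_requests``
+        accessor name is an assumption):
+
+            from databricks.sdk import WorkspaceClient
+
+            w = WorkspaceClient()
+            # Iterate over every personalization request across all of this provider's
+            # listings; the iterator follows next_page_token transparently.
+            for req in w.provider_personalization_requests.list(page_size=50):
+                print(req)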
+
+        :param page_size: int (optional)
+        :param page_token: str (optional)
+
+        :returns: Iterator over :class:`PersonalizationRequest`
+        """
+
+        query = {}
+        if page_size is not None: query['page_size'] = page_size
+        if page_token is not None: query['page_token'] = page_token
+        headers = {'Accept': 'application/json', }
+
+        while True:
+            json = self._api.do('GET',
+                                '/api/2.0/marketplace-provider/personalization-requests',
+                                query=query,
+                                headers=headers)
+            if 'personalization_requests' in json:
+                for v in json['personalization_requests']:
+                    yield PersonalizationRequest.from_dict(v)
+            if 'next_page_token' not in json or not json['next_page_token']:
+                return
+            query['page_token'] = json['next_page_token']
+
+    def update(self,
+               listing_id: str,
+               request_id: str,
+               status: PersonalizationRequestStatus,
+               *,
+               reason: Optional[str] = None,
+               share: Optional[ShareInfo] = None) -> UpdatePersonalizationRequestResponse:
+        """Update personalization request status.
+
+        Update personalization request. This method only permits updating the status of the request.
+
+        :param listing_id: str
+        :param request_id: str
+        :param status: :class:`PersonalizationRequestStatus`
+        :param reason: str (optional)
+        :param share: :class:`ShareInfo` (optional)
+
+        :returns: :class:`UpdatePersonalizationRequestResponse`
+        """
+        body = {}
+        if reason is not None: body['reason'] = reason
+        if share is not None: body['share'] = share.as_dict()
+        if status is not None: body['status'] = status.value
+        headers = {'Accept': 'application/json', 'Content-Type': 'application/json', }
+
+        res = self._api.do(
+            'PUT',
+            f'/api/2.0/marketplace-provider/listings/{listing_id}/personalization-requests/{request_id}/request-status',
+            body=body,
+            headers=headers)
+        return UpdatePersonalizationRequestResponse.from_dict(res)
+
+
+class ProviderProviderAnalyticsDashboardsAPI:
+    """Manage the templated analytics solution for providers."""
+
+    def __init__(self, api_client):
+        self._api = api_client
+
+    def create(self) -> ProviderAnalyticsDashboard:
+        """Create provider analytics dashboard.
+
+        Create provider analytics dashboard. Returns a Marketplace-specific `id`, not to be confused with
+        the Lakeview dashboard id.
+
+        :returns: :class:`ProviderAnalyticsDashboard`
+        """
+
+        headers = {'Accept': 'application/json', }
+
+        res = self._api.do('POST', '/api/2.0/marketplace-provider/analytics_dashboard', headers=headers)
+        return ProviderAnalyticsDashboard.from_dict(res)
+
+    def get(self) -> ListProviderAnalyticsDashboardResponse:
+        """Get provider analytics dashboard.
+
+        Get provider analytics dashboard.
+
+        :returns: :class:`ListProviderAnalyticsDashboardResponse`
+        """
+
+        headers = {'Accept': 'application/json', }
+
+        res = self._api.do('GET', '/api/2.0/marketplace-provider/analytics_dashboard', headers=headers)
+        return ListProviderAnalyticsDashboardResponse.from_dict(res)
+
+    def get_latest_version(self) -> GetLatestVersionProviderAnalyticsDashboardResponse:
+        """Get latest version of provider analytics dashboard.
+
+        Get the latest version of the provider analytics dashboard.
+
+        :returns: :class:`GetLatestVersionProviderAnalyticsDashboardResponse`
+        """
+
+        headers = {'Accept': 'application/json', }
+
+        res = self._api.do('GET', '/api/2.0/marketplace-provider/analytics_dashboard/latest', headers=headers)
+        return GetLatestVersionProviderAnalyticsDashboardResponse.from_dict(res)
+
+    def update(self, id: str, *, version: Optional[int] = None) -> UpdateProviderAnalyticsDashboardResponse:
+        """Update provider analytics dashboard.
+
+        Update provider analytics dashboard.
+
+        :param id: str
+          The id is an immutable property and can't be updated.
+        :param version: int (optional)
+          The version of the dashboard template to update the user's dashboard to; this is expected to
+          equal the latest version of the dashboard template.
+
+        :returns: :class:`UpdateProviderAnalyticsDashboardResponse`
+        """
+        body = {}
+        if version is not None: body['version'] = version
+        headers = {'Accept': 'application/json', 'Content-Type': 'application/json', }
+
+        res = self._api.do('PUT',
+                           f'/api/2.0/marketplace-provider/analytics_dashboard/{id}',
+                           body=body,
+                           headers=headers)
+        return UpdateProviderAnalyticsDashboardResponse.from_dict(res)
+
+
+class ProviderProvidersAPI:
+    """Providers are entities that manage assets in the Marketplace."""
+
+    def __init__(self, api_client):
+        self._api = api_client
+
+    def create(self, provider: ProviderInfo) -> CreateProviderResponse:
+        """Create a provider.
+
+        Create a provider.
+
+        :param provider: :class:`ProviderInfo`
+
+        :returns: :class:`CreateProviderResponse`
+        """
+        body = {}
+        if provider is not None: body['provider'] = provider.as_dict()
+        headers = {'Accept': 'application/json', 'Content-Type': 'application/json', }
+
+        res = self._api.do('POST', '/api/2.0/marketplace-provider/provider', body=body, headers=headers)
+        return CreateProviderResponse.from_dict(res)
+
+    def delete(self, id: str):
+        """Delete provider.
+
+        Delete a provider.
+
+        :param id: str
+
+
+        """
+
+        headers = {'Accept': 'application/json', }
+
+        self._api.do('DELETE', f'/api/2.0/marketplace-provider/providers/{id}', headers=headers)
+
+    def get(self, id: str) -> GetProviderResponse:
+        """Get provider.
+
+        Get a provider's profile.
+
+        :param id: str
+
+        :returns: :class:`GetProviderResponse`
+        """
+
+        headers = {'Accept': 'application/json', }
+
+        res = self._api.do('GET', f'/api/2.0/marketplace-provider/providers/{id}', headers=headers)
+        return GetProviderResponse.from_dict(res)
+
+    def list(self,
+             *,
+             page_size: Optional[int] = None,
+             page_token: Optional[str] = None) -> Iterator[ProviderInfo]:
+        """List providers.
+
+        List provider profiles for the account.
+
+        :param page_size: int (optional)
+        :param page_token: str (optional)
+
+        :returns: Iterator over :class:`ProviderInfo`
+        """
+
+        query = {}
+        if page_size is not None: query['page_size'] = page_size
+        if page_token is not None: query['page_token'] = page_token
+        headers = {'Accept': 'application/json', }
+
+        while True:
+            json = self._api.do('GET',
+                                '/api/2.0/marketplace-provider/providers',
+                                query=query,
+                                headers=headers)
+            if 'providers' in json:
+                for v in json['providers']:
+                    yield ProviderInfo.from_dict(v)
+            if 'next_page_token' not in json or not json['next_page_token']:
+                return
+            query['page_token'] = json['next_page_token']
+
+    def update(self, id: str, provider: ProviderInfo) -> UpdateProviderResponse:
+        """Update provider.
+ + Update provider profile + + :param id: str + :param provider: :class:`ProviderInfo` + + :returns: :class:`UpdateProviderResponse` + """ + body = {} + if provider is not None: body['provider'] = provider.as_dict() + headers = {'Accept': 'application/json', 'Content-Type': 'application/json', } + + res = self._api.do('PUT', f'/api/2.0/marketplace-provider/providers/{id}', body=body, headers=headers) + return UpdateProviderResponse.from_dict(res) diff --git a/databricks/sdk/version.py b/databricks/sdk/version.py index f8ab8c2e1..8c308d723 100644 --- a/databricks/sdk/version.py +++ b/databricks/sdk/version.py @@ -1 +1 @@ -__version__ = '0.24.0' +__version__ = '0.25.0' diff --git a/docs/account/iam/workspace_assignment.rst b/docs/account/iam/workspace_assignment.rst index a09af197c..1ce06996e 100644 --- a/docs/account/iam/workspace_assignment.rst +++ b/docs/account/iam/workspace_assignment.rst @@ -61,7 +61,7 @@ :returns: Iterator over :class:`PermissionAssignment` - .. py:method:: update(workspace_id: int, principal_id: int, permissions: List[WorkspacePermission]) + .. py:method:: update(workspace_id: int, principal_id: int, permissions: List[WorkspacePermission]) -> PermissionAssignment Usage: @@ -82,9 +82,9 @@ workspace_id = os.environ["DUMMY_WORKSPACE_ID"] - a.workspace_assignment.update(workspace_id=workspace_id, - principal_id=spn_id, - permissions=[iam.WorkspacePermission.USER]) + _ = a.workspace_assignment.update(workspace_id=workspace_id, + principal_id=spn_id, + permissions=[iam.WorkspacePermission.USER]) Create or update permissions assignment. @@ -96,7 +96,9 @@ :param principal_id: int The ID of the user, service principal, or group. :param permissions: List[:class:`WorkspacePermission`] - Array of permissions assignments to update on the workspace. - + Array of permissions assignments to update on the workspace. Note that excluding this field will + have the same effect as providing an empty list which will result in the deletion of all permissions + for the principal. + :returns: :class:`PermissionAssignment` \ No newline at end of file diff --git a/docs/dbdataclasses/catalog.rst b/docs/dbdataclasses/catalog.rst index 55011b026..f9e45e60e 100644 --- a/docs/dbdataclasses/catalog.rst +++ b/docs/dbdataclasses/catalog.rst @@ -73,7 +73,11 @@ These dataclasses are used in the SDK to represent API requests and responses fo :members: :undoc-members: -.. autoclass:: AzureManagedIdentity +.. autoclass:: AzureManagedIdentityRequest + :members: + :undoc-members: + +.. autoclass:: AzureManagedIdentityResponse :members: :undoc-members: @@ -753,7 +757,7 @@ These dataclasses are used in the SDK to represent API requests and responses fo .. py:class:: MonitorCronSchedulePauseStatus - Whether the schedule is paused or not + Read only field that indicates whether a schedule is paused or not. .. py:attribute:: PAUSED :value: "PAUSED" @@ -761,44 +765,21 @@ These dataclasses are used in the SDK to represent API requests and responses fo .. py:attribute:: UNPAUSED :value: "UNPAUSED" -.. autoclass:: MonitorCustomMetric - :members: - :undoc-members: - -.. py:class:: MonitorCustomMetricType - - The type of the custom metric. - - .. py:attribute:: CUSTOM_METRIC_TYPE_AGGREGATE - :value: "CUSTOM_METRIC_TYPE_AGGREGATE" - - .. py:attribute:: CUSTOM_METRIC_TYPE_DERIVED - :value: "CUSTOM_METRIC_TYPE_DERIVED" - - .. py:attribute:: CUSTOM_METRIC_TYPE_DRIFT - :value: "CUSTOM_METRIC_TYPE_DRIFT" - - .. py:attribute:: MONITOR_STATUS_ERROR - :value: "MONITOR_STATUS_ERROR" - - .. 
py:attribute:: MONITOR_STATUS_FAILED - :value: "MONITOR_STATUS_FAILED" - .. autoclass:: MonitorDataClassificationConfig :members: :undoc-members: -.. autoclass:: MonitorDestinations +.. autoclass:: MonitorDestination :members: :undoc-members: -.. autoclass:: MonitorInferenceLogProfileType +.. autoclass:: MonitorInferenceLog :members: :undoc-members: -.. py:class:: MonitorInferenceLogProfileTypeProblemType +.. py:class:: MonitorInferenceLogProblemType - Problem type the model aims to solve. + Problem type the model aims to solve. Determines the type of model-quality metrics that will be computed. .. py:attribute:: PROBLEM_TYPE_CLASSIFICATION :value: "PROBLEM_TYPE_CLASSIFICATION" @@ -829,7 +810,24 @@ These dataclasses are used in the SDK to represent API requests and responses fo .. py:attribute:: MONITOR_STATUS_PENDING :value: "MONITOR_STATUS_PENDING" -.. autoclass:: MonitorNotificationsConfig +.. autoclass:: MonitorMetric + :members: + :undoc-members: + +.. py:class:: MonitorMetricType + + Can only be one of ``"CUSTOM_METRIC_TYPE_AGGREGATE"``, ``"CUSTOM_METRIC_TYPE_DERIVED"``, or ``"CUSTOM_METRIC_TYPE_DRIFT"``. The ``"CUSTOM_METRIC_TYPE_AGGREGATE"`` and ``"CUSTOM_METRIC_TYPE_DERIVED"`` metrics are computed on a single table, whereas the ``"CUSTOM_METRIC_TYPE_DRIFT"`` compare metrics across baseline and input table, or across the two consecutive time windows. - CUSTOM_METRIC_TYPE_AGGREGATE: only depend on the existing columns in your table - CUSTOM_METRIC_TYPE_DERIVED: depend on previously computed aggregate metrics - CUSTOM_METRIC_TYPE_DRIFT: depend on previously computed aggregate or derived metrics + + .. py:attribute:: CUSTOM_METRIC_TYPE_AGGREGATE + :value: "CUSTOM_METRIC_TYPE_AGGREGATE" + + .. py:attribute:: CUSTOM_METRIC_TYPE_DERIVED + :value: "CUSTOM_METRIC_TYPE_DERIVED" + + .. py:attribute:: CUSTOM_METRIC_TYPE_DRIFT + :value: "CUSTOM_METRIC_TYPE_DRIFT" + +.. autoclass:: MonitorNotifications :members: :undoc-members: @@ -856,11 +854,21 @@ These dataclasses are used in the SDK to represent API requests and responses fo .. py:attribute:: SUCCESS :value: "SUCCESS" -.. autoclass:: MonitorSnapshotProfileType +.. py:class:: MonitorRefreshInfoTrigger + + The method by which the refresh was triggered. + + .. py:attribute:: MANUAL + :value: "MANUAL" + + .. py:attribute:: SCHEDULE + :value: "SCHEDULE" + +.. autoclass:: MonitorSnapshot :members: :undoc-members: -.. autoclass:: MonitorTimeSeriesProfileType +.. autoclass:: MonitorTimeSeries :members: :undoc-members: @@ -1341,7 +1349,7 @@ These dataclasses are used in the SDK to represent API requests and responses fo :members: :undoc-members: -.. py:class:: ValidationResultOperation +.. py:class:: ValidationResultAwsOperation The operation tested. @@ -1351,6 +1359,50 @@ These dataclasses are used in the SDK to represent API requests and responses fo .. py:attribute:: LIST :value: "LIST" + .. py:attribute:: PATH_EXISTS + :value: "PATH_EXISTS" + + .. py:attribute:: READ + :value: "READ" + + .. py:attribute:: WRITE + :value: "WRITE" + +.. py:class:: ValidationResultAzureOperation + + The operation tested. + + .. py:attribute:: DELETE + :value: "DELETE" + + .. py:attribute:: HIERARCHICAL_NAMESPACE_ENABLED + :value: "HIERARCHICAL_NAMESPACE_ENABLED" + + .. py:attribute:: LIST + :value: "LIST" + + .. py:attribute:: PATH_EXISTS + :value: "PATH_EXISTS" + + .. py:attribute:: READ + :value: "READ" + + .. py:attribute:: WRITE + :value: "WRITE" + +.. py:class:: ValidationResultGcpOperation + + The operation tested. + + .. 
py:attribute:: DELETE + :value: "DELETE" + + .. py:attribute:: LIST + :value: "LIST" + + .. py:attribute:: PATH_EXISTS + :value: "PATH_EXISTS" + .. py:attribute:: READ :value: "READ" diff --git a/docs/dbdataclasses/compute.rst b/docs/dbdataclasses/compute.rst index b990234f6..7ed50c973 100644 --- a/docs/dbdataclasses/compute.rst +++ b/docs/dbdataclasses/compute.rst @@ -75,6 +75,10 @@ These dataclasses are used in the SDK to represent API requests and responses fo :members: :undoc-members: +.. autoclass:: CloneCluster + :members: + :undoc-members: + .. autoclass:: CloudProviderNodeInfo :members: :undoc-members: @@ -236,17 +240,6 @@ These dataclasses are used in the SDK to represent API requests and responses fo :members: :undoc-members: -.. autoclass:: ComputeSpec - :members: - :undoc-members: - -.. py:class:: ComputeSpecKind - - The kind of compute described by this compute specification. - - .. py:attribute:: SERVERLESS_PREVIEW - :value: "SERVERLESS_PREVIEW" - .. py:class:: ContextStatus .. py:attribute:: ERROR @@ -445,6 +438,10 @@ These dataclasses are used in the SDK to represent API requests and responses fo :members: :undoc-members: +.. autoclass:: Environment + :members: + :undoc-members: + .. autoclass:: EventDetails :members: :undoc-members: diff --git a/docs/dbdataclasses/iam.rst b/docs/dbdataclasses/iam.rst index a095098b1..9cafb78df 100644 --- a/docs/dbdataclasses/iam.rst +++ b/docs/dbdataclasses/iam.rst @@ -288,10 +288,6 @@ These dataclasses are used in the SDK to represent API requests and responses fo .. py:attribute:: URN_IETF_PARAMS_SCIM_SCHEMAS_EXTENSION_WORKSPACE_2_0_USER :value: "URN_IETF_PARAMS_SCIM_SCHEMAS_EXTENSION_WORKSPACE_2_0_USER" -.. autoclass:: WorkspaceAssignmentsUpdated - :members: - :undoc-members: - .. py:class:: WorkspacePermission .. py:attribute:: ADMIN diff --git a/docs/dbdataclasses/index.rst b/docs/dbdataclasses/index.rst index f44361cc2..893e488d7 100644 --- a/docs/dbdataclasses/index.rst +++ b/docs/dbdataclasses/index.rst @@ -12,6 +12,7 @@ Dataclasses files iam jobs + marketplace ml oauth2 pipelines diff --git a/docs/dbdataclasses/jobs.rst b/docs/dbdataclasses/jobs.rst index beb29da75..6d5853617 100644 --- a/docs/dbdataclasses/jobs.rst +++ b/docs/dbdataclasses/jobs.rst @@ -197,10 +197,6 @@ These dataclasses are used in the SDK to represent API requests and responses fo :members: :undoc-members: -.. autoclass:: JobCompute - :members: - :undoc-members: - .. autoclass:: JobDeployment :members: :undoc-members: @@ -227,6 +223,10 @@ These dataclasses are used in the SDK to represent API requests and responses fo :members: :undoc-members: +.. autoclass:: JobEnvironment + :members: + :undoc-members: + .. autoclass:: JobNotificationSettings :members: :undoc-members: @@ -679,7 +679,7 @@ These dataclasses are used in the SDK to represent API requests and responses fo :members: :undoc-members: -.. autoclass:: TableTriggerConfiguration +.. autoclass:: TableUpdateTriggerConfiguration :members: :undoc-members: diff --git a/docs/dbdataclasses/marketplace.rst b/docs/dbdataclasses/marketplace.rst new file mode 100644 index 000000000..50226a5d5 --- /dev/null +++ b/docs/dbdataclasses/marketplace.rst @@ -0,0 +1,624 @@ +Marketplace +=========== + +These dataclasses are used in the SDK to represent API requests and responses for services in the ``databricks.sdk.service.marketplace`` module. + +.. py:currentmodule:: databricks.sdk.service.marketplace +.. autoclass:: AddExchangeForListingRequest + :members: + :undoc-members: + +.. 
autoclass:: AddExchangeForListingResponse + :members: + :undoc-members: + +.. py:class:: AssetType + + .. py:attribute:: ASSET_TYPE_DATA_TABLE + :value: "ASSET_TYPE_DATA_TABLE" + + .. py:attribute:: ASSET_TYPE_GIT_REPO + :value: "ASSET_TYPE_GIT_REPO" + + .. py:attribute:: ASSET_TYPE_MEDIA + :value: "ASSET_TYPE_MEDIA" + + .. py:attribute:: ASSET_TYPE_MODEL + :value: "ASSET_TYPE_MODEL" + + .. py:attribute:: ASSET_TYPE_NOTEBOOK + :value: "ASSET_TYPE_NOTEBOOK" + + .. py:attribute:: ASSET_TYPE_UNSPECIFIED + :value: "ASSET_TYPE_UNSPECIFIED" + +.. py:class:: Category + + .. py:attribute:: ADVERTISING_AND_MARKETING + :value: "ADVERTISING_AND_MARKETING" + + .. py:attribute:: CLIMATE_AND_ENVIRONMENT + :value: "CLIMATE_AND_ENVIRONMENT" + + .. py:attribute:: COMMERCE + :value: "COMMERCE" + + .. py:attribute:: DEMOGRAPHICS + :value: "DEMOGRAPHICS" + + .. py:attribute:: ECONOMICS + :value: "ECONOMICS" + + .. py:attribute:: EDUCATION + :value: "EDUCATION" + + .. py:attribute:: ENERGY + :value: "ENERGY" + + .. py:attribute:: FINANCIAL + :value: "FINANCIAL" + + .. py:attribute:: GAMING + :value: "GAMING" + + .. py:attribute:: GEOSPATIAL + :value: "GEOSPATIAL" + + .. py:attribute:: HEALTH + :value: "HEALTH" + + .. py:attribute:: LOOKUP_TABLES + :value: "LOOKUP_TABLES" + + .. py:attribute:: MANUFACTURING + :value: "MANUFACTURING" + + .. py:attribute:: MEDIA + :value: "MEDIA" + + .. py:attribute:: OTHER + :value: "OTHER" + + .. py:attribute:: PUBLIC_SECTOR + :value: "PUBLIC_SECTOR" + + .. py:attribute:: RETAIL + :value: "RETAIL" + + .. py:attribute:: SCIENCE_AND_RESEARCH + :value: "SCIENCE_AND_RESEARCH" + + .. py:attribute:: SECURITY + :value: "SECURITY" + + .. py:attribute:: SPORTS + :value: "SPORTS" + + .. py:attribute:: TRANSPORTATION_AND_LOGISTICS + :value: "TRANSPORTATION_AND_LOGISTICS" + + .. py:attribute:: TRAVEL_AND_TOURISM + :value: "TRAVEL_AND_TOURISM" + +.. autoclass:: ConsumerTerms + :members: + :undoc-members: + +.. autoclass:: ContactInfo + :members: + :undoc-members: + +.. py:class:: Cost + + .. py:attribute:: FREE + :value: "FREE" + + .. py:attribute:: PAID + :value: "PAID" + +.. autoclass:: CreateExchangeFilterRequest + :members: + :undoc-members: + +.. autoclass:: CreateExchangeFilterResponse + :members: + :undoc-members: + +.. autoclass:: CreateExchangeRequest + :members: + :undoc-members: + +.. autoclass:: CreateExchangeResponse + :members: + :undoc-members: + +.. autoclass:: CreateFileRequest + :members: + :undoc-members: + +.. autoclass:: CreateFileResponse + :members: + :undoc-members: + +.. autoclass:: CreateInstallationRequest + :members: + :undoc-members: + +.. autoclass:: CreateListingRequest + :members: + :undoc-members: + +.. autoclass:: CreateListingResponse + :members: + :undoc-members: + +.. autoclass:: CreatePersonalizationRequest + :members: + :undoc-members: + +.. autoclass:: CreatePersonalizationRequestResponse + :members: + :undoc-members: + +.. autoclass:: CreateProviderRequest + :members: + :undoc-members: + +.. autoclass:: CreateProviderResponse + :members: + :undoc-members: + +.. py:class:: DataRefresh + + .. py:attribute:: DAILY + :value: "DAILY" + + .. py:attribute:: HOURLY + :value: "HOURLY" + + .. py:attribute:: MINUTE + :value: "MINUTE" + + .. py:attribute:: MONTHLY + :value: "MONTHLY" + + .. py:attribute:: NONE + :value: "NONE" + + .. py:attribute:: QUARTERLY + :value: "QUARTERLY" + + .. py:attribute:: SECOND + :value: "SECOND" + + .. py:attribute:: WEEKLY + :value: "WEEKLY" + + .. py:attribute:: YEARLY + :value: "YEARLY" + +.. 
autoclass:: DataRefreshInfo + :members: + :undoc-members: + +.. autoclass:: DeleteExchangeFilterResponse + :members: + :undoc-members: + +.. autoclass:: DeleteExchangeResponse + :members: + :undoc-members: + +.. autoclass:: DeleteFileResponse + :members: + :undoc-members: + +.. autoclass:: DeleteInstallationResponse + :members: + :undoc-members: + +.. autoclass:: DeleteListingResponse + :members: + :undoc-members: + +.. autoclass:: DeleteProviderResponse + :members: + :undoc-members: + +.. py:class:: DeltaSharingRecipientType + + .. py:attribute:: DELTA_SHARING_RECIPIENT_TYPE_DATABRICKS + :value: "DELTA_SHARING_RECIPIENT_TYPE_DATABRICKS" + + .. py:attribute:: DELTA_SHARING_RECIPIENT_TYPE_OPEN + :value: "DELTA_SHARING_RECIPIENT_TYPE_OPEN" + +.. autoclass:: Exchange + :members: + :undoc-members: + +.. autoclass:: ExchangeFilter + :members: + :undoc-members: + +.. py:class:: ExchangeFilterType + + .. py:attribute:: GLOBAL_METASTORE_ID + :value: "GLOBAL_METASTORE_ID" + +.. autoclass:: ExchangeListing + :members: + :undoc-members: + +.. autoclass:: FileInfo + :members: + :undoc-members: + +.. autoclass:: FileParent + :members: + :undoc-members: + +.. py:class:: FileParentType + + .. py:attribute:: LISTING + :value: "LISTING" + + .. py:attribute:: PROVIDER + :value: "PROVIDER" + +.. py:class:: FileStatus + + .. py:attribute:: FILE_STATUS_PUBLISHED + :value: "FILE_STATUS_PUBLISHED" + + .. py:attribute:: FILE_STATUS_SANITIZATION_FAILED + :value: "FILE_STATUS_SANITIZATION_FAILED" + + .. py:attribute:: FILE_STATUS_SANITIZING + :value: "FILE_STATUS_SANITIZING" + + .. py:attribute:: FILE_STATUS_STAGING + :value: "FILE_STATUS_STAGING" + +.. py:class:: FilterType + + .. py:attribute:: METASTORE + :value: "METASTORE" + +.. py:class:: FulfillmentType + + .. py:attribute:: INSTALL + :value: "INSTALL" + + .. py:attribute:: REQUEST_ACCESS + :value: "REQUEST_ACCESS" + +.. autoclass:: GetExchangeResponse + :members: + :undoc-members: + +.. autoclass:: GetFileResponse + :members: + :undoc-members: + +.. autoclass:: GetLatestVersionProviderAnalyticsDashboardResponse + :members: + :undoc-members: + +.. autoclass:: GetListingContentMetadataResponse + :members: + :undoc-members: + +.. autoclass:: GetListingResponse + :members: + :undoc-members: + +.. autoclass:: GetListingsResponse + :members: + :undoc-members: + +.. autoclass:: GetPersonalizationRequestResponse + :members: + :undoc-members: + +.. autoclass:: GetProviderResponse + :members: + :undoc-members: + +.. autoclass:: Installation + :members: + :undoc-members: + +.. autoclass:: InstallationDetail + :members: + :undoc-members: + +.. py:class:: InstallationStatus + + .. py:attribute:: FAILED + :value: "FAILED" + + .. py:attribute:: INSTALLED + :value: "INSTALLED" + +.. autoclass:: ListAllInstallationsResponse + :members: + :undoc-members: + +.. autoclass:: ListAllPersonalizationRequestsResponse + :members: + :undoc-members: + +.. autoclass:: ListExchangeFiltersResponse + :members: + :undoc-members: + +.. autoclass:: ListExchangesForListingResponse + :members: + :undoc-members: + +.. autoclass:: ListExchangesResponse + :members: + :undoc-members: + +.. autoclass:: ListFilesResponse + :members: + :undoc-members: + +.. autoclass:: ListFulfillmentsResponse + :members: + :undoc-members: + +.. autoclass:: ListInstallationsResponse + :members: + :undoc-members: + +.. autoclass:: ListListingsForExchangeResponse + :members: + :undoc-members: + +.. autoclass:: ListListingsResponse + :members: + :undoc-members: + +.. 
autoclass:: ListProviderAnalyticsDashboardResponse + :members: + :undoc-members: + +.. autoclass:: ListProvidersResponse + :members: + :undoc-members: + +.. autoclass:: Listing + :members: + :undoc-members: + +.. autoclass:: ListingDetail + :members: + :undoc-members: + +.. autoclass:: ListingFulfillment + :members: + :undoc-members: + +.. autoclass:: ListingSetting + :members: + :undoc-members: + +.. py:class:: ListingShareType + + .. py:attribute:: FULL + :value: "FULL" + + .. py:attribute:: SAMPLE + :value: "SAMPLE" + +.. py:class:: ListingStatus + + Enums + + .. py:attribute:: DRAFT + :value: "DRAFT" + + .. py:attribute:: PENDING + :value: "PENDING" + + .. py:attribute:: PUBLISHED + :value: "PUBLISHED" + + .. py:attribute:: SUSPENDED + :value: "SUSPENDED" + +.. autoclass:: ListingSummary + :members: + :undoc-members: + +.. autoclass:: ListingTag + :members: + :undoc-members: + +.. py:class:: ListingTagType + + .. py:attribute:: LISTING_TAG_TYPE_LANGUAGE + :value: "LISTING_TAG_TYPE_LANGUAGE" + + .. py:attribute:: LISTING_TAG_TYPE_TASK + :value: "LISTING_TAG_TYPE_TASK" + + .. py:attribute:: LISTING_TAG_TYPE_UNSPECIFIED + :value: "LISTING_TAG_TYPE_UNSPECIFIED" + +.. py:class:: ListingType + + .. py:attribute:: PERSONALIZED + :value: "PERSONALIZED" + + .. py:attribute:: STANDARD + :value: "STANDARD" + +.. py:class:: MarketplaceFileType + + .. py:attribute:: EMBEDDED_NOTEBOOK + :value: "EMBEDDED_NOTEBOOK" + + .. py:attribute:: PROVIDER_ICON + :value: "PROVIDER_ICON" + +.. autoclass:: PersonalizationRequest + :members: + :undoc-members: + +.. py:class:: PersonalizationRequestStatus + + .. py:attribute:: DENIED + :value: "DENIED" + + .. py:attribute:: FULFILLED + :value: "FULFILLED" + + .. py:attribute:: NEW + :value: "NEW" + + .. py:attribute:: REQUEST_PENDING + :value: "REQUEST_PENDING" + +.. autoclass:: ProviderAnalyticsDashboard + :members: + :undoc-members: + +.. autoclass:: ProviderInfo + :members: + :undoc-members: + +.. autoclass:: RegionInfo + :members: + :undoc-members: + +.. autoclass:: RemoveExchangeForListingResponse + :members: + :undoc-members: + +.. autoclass:: RepoInfo + :members: + :undoc-members: + +.. autoclass:: RepoInstallation + :members: + :undoc-members: + +.. autoclass:: SearchListingsResponse + :members: + :undoc-members: + +.. autoclass:: ShareInfo + :members: + :undoc-members: + +.. autoclass:: SharedDataObject + :members: + :undoc-members: + +.. py:class:: SortBy + + .. py:attribute:: SORT_BY_DATE + :value: "SORT_BY_DATE" + + .. py:attribute:: SORT_BY_RELEVANCE + :value: "SORT_BY_RELEVANCE" + + .. py:attribute:: SORT_BY_TITLE + :value: "SORT_BY_TITLE" + + .. py:attribute:: SORT_BY_UNSPECIFIED + :value: "SORT_BY_UNSPECIFIED" + +.. autoclass:: SortBySpec + :members: + :undoc-members: + +.. py:class:: SortOrder + + .. py:attribute:: SORT_ORDER_ASCENDING + :value: "SORT_ORDER_ASCENDING" + + .. py:attribute:: SORT_ORDER_DESCENDING + :value: "SORT_ORDER_DESCENDING" + + .. py:attribute:: SORT_ORDER_UNSPECIFIED + :value: "SORT_ORDER_UNSPECIFIED" + +.. autoclass:: TokenDetail + :members: + :undoc-members: + +.. autoclass:: TokenInfo + :members: + :undoc-members: + +.. autoclass:: UpdateExchangeFilterRequest + :members: + :undoc-members: + +.. autoclass:: UpdateExchangeFilterResponse + :members: + :undoc-members: + +.. autoclass:: UpdateExchangeRequest + :members: + :undoc-members: + +.. autoclass:: UpdateExchangeResponse + :members: + :undoc-members: + +.. autoclass:: UpdateInstallationRequest + :members: + :undoc-members: + +.. 
autoclass:: UpdateInstallationResponse + :members: + :undoc-members: + +.. autoclass:: UpdateListingRequest + :members: + :undoc-members: + +.. autoclass:: UpdateListingResponse + :members: + :undoc-members: + +.. autoclass:: UpdatePersonalizationRequestRequest + :members: + :undoc-members: + +.. autoclass:: UpdatePersonalizationRequestResponse + :members: + :undoc-members: + +.. autoclass:: UpdateProviderAnalyticsDashboardRequest + :members: + :undoc-members: + +.. autoclass:: UpdateProviderAnalyticsDashboardResponse + :members: + :undoc-members: + +.. autoclass:: UpdateProviderRequest + :members: + :undoc-members: + +.. autoclass:: UpdateProviderResponse + :members: + :undoc-members: + +.. py:class:: Visibility + + .. py:attribute:: PRIVATE + :value: "PRIVATE" + + .. py:attribute:: PUBLIC + :value: "PUBLIC" + +.. autoclass:: VisibilityFilter + :members: + :undoc-members: diff --git a/docs/workspace/catalog/lakehouse_monitors.rst b/docs/workspace/catalog/lakehouse_monitors.rst index 453dbf73b..75f861d85 100644 --- a/docs/workspace/catalog/lakehouse_monitors.rst +++ b/docs/workspace/catalog/lakehouse_monitors.rst @@ -11,7 +11,7 @@ catalog). Viewing the dashboard, computed metrics, or monitor configuration only requires the user to have **SELECT** privileges on the table (along with **USE_SCHEMA** and **USE_CATALOG**). - .. py:method:: cancel_refresh(full_name: str, refresh_id: str) + .. py:method:: cancel_refresh(table_name: str, refresh_id: str) Cancel refresh. @@ -24,7 +24,7 @@ Additionally, the call must be made from the workspace where the monitor was created. - :param full_name: str + :param table_name: str Full name of the table. :param refresh_id: str ID of the refresh. @@ -32,7 +32,7 @@ - .. py:method:: create(full_name: str, assets_dir: str, output_schema_name: str [, baseline_table_name: Optional[str], custom_metrics: Optional[List[MonitorCustomMetric]], data_classification_config: Optional[MonitorDataClassificationConfig], inference_log: Optional[MonitorInferenceLogProfileType], notifications: Optional[MonitorNotificationsConfig], schedule: Optional[MonitorCronSchedule], skip_builtin_dashboard: Optional[bool], slicing_exprs: Optional[List[str]], snapshot: Optional[MonitorSnapshotProfileType], time_series: Optional[MonitorTimeSeriesProfileType], warehouse_id: Optional[str]]) -> MonitorInfo + .. py:method:: create(table_name: str, assets_dir: str, output_schema_name: str [, baseline_table_name: Optional[str], custom_metrics: Optional[List[MonitorMetric]], data_classification_config: Optional[MonitorDataClassificationConfig], inference_log: Optional[MonitorInferenceLog], notifications: Optional[MonitorNotifications], schedule: Optional[MonitorCronSchedule], skip_builtin_dashboard: Optional[bool], slicing_exprs: Optional[List[str]], snapshot: Optional[MonitorSnapshot], time_series: Optional[MonitorTimeSeries], warehouse_id: Optional[str]]) -> MonitorInfo Create a table monitor. @@ -46,7 +46,7 @@ Workspace assets, such as the dashboard, will be created in the workspace where this call was made. - :param full_name: str + :param table_name: str Full name of the table. :param assets_dir: str The directory to store monitoring assets (e.g. dashboard, metric tables). @@ -55,14 +55,14 @@ :param baseline_table_name: str (optional) Name of the baseline table from which drift metrics are computed from. Columns in the monitored table should also be present in the baseline table. 
- :param custom_metrics: List[:class:`MonitorCustomMetric`] (optional) + :param custom_metrics: List[:class:`MonitorMetric`] (optional) Custom metrics to compute on the monitored table. These can be aggregate metrics, derived metrics (from already computed aggregate metrics), or drift metrics (comparing metrics across time windows). :param data_classification_config: :class:`MonitorDataClassificationConfig` (optional) The data classification config for the monitor. - :param inference_log: :class:`MonitorInferenceLogProfileType` (optional) + :param inference_log: :class:`MonitorInferenceLog` (optional) Configuration for monitoring inference logs. - :param notifications: :class:`MonitorNotificationsConfig` (optional) + :param notifications: :class:`MonitorNotifications` (optional) The notification settings for the monitor. :param schedule: :class:`MonitorCronSchedule` (optional) The schedule for automatically updating and refreshing metric tables. @@ -72,9 +72,9 @@ List of column expressions to slice data with for targeted analysis. The data is grouped by each expression independently, resulting in a separate slice for each predicate and its complements. For high-cardinality columns, only the top 100 unique values by frequency will generate slices. - :param snapshot: :class:`MonitorSnapshotProfileType` (optional) + :param snapshot: :class:`MonitorSnapshot` (optional) Configuration for monitoring snapshot tables. - :param time_series: :class:`MonitorTimeSeriesProfileType` (optional) + :param time_series: :class:`MonitorTimeSeries` (optional) Configuration for monitoring time series tables. :param warehouse_id: str (optional) Optional argument to specify the warehouse for dashboard creation. If not specified, the first @@ -83,7 +83,7 @@ :returns: :class:`MonitorInfo` - .. py:method:: delete(full_name: str) + .. py:method:: delete(table_name: str) Delete a table monitor. @@ -99,13 +99,13 @@ Note that the metric tables and dashboard will not be deleted as part of this call; those assets must be manually cleaned up (if desired). - :param full_name: str + :param table_name: str Full name of the table. - .. py:method:: get(full_name: str) -> MonitorInfo + .. py:method:: get(table_name: str) -> MonitorInfo Get a table monitor. @@ -120,13 +120,13 @@ the monitor. Some information (e.g., dashboard) may be filtered out if the caller is in a different workspace than where the monitor was created. - :param full_name: str + :param table_name: str Full name of the table. :returns: :class:`MonitorInfo` - .. py:method:: get_refresh(full_name: str, refresh_id: str) -> MonitorRefreshInfo + .. py:method:: get_refresh(table_name: str, refresh_id: str) -> MonitorRefreshInfo Get refresh. @@ -139,7 +139,7 @@ Additionally, the call must be made from the workspace where the monitor was created. - :param full_name: str + :param table_name: str Full name of the table. :param refresh_id: str ID of the refresh. @@ -147,7 +147,7 @@ :returns: :class:`MonitorRefreshInfo` - .. py:method:: list_refreshes(full_name: str) -> Iterator[MonitorRefreshInfo] + .. py:method:: list_refreshes(table_name: str) -> Iterator[MonitorRefreshInfo] List refreshes. @@ -160,13 +160,13 @@ Additionally, the call must be made from the workspace where the monitor was created. - :param full_name: str + :param table_name: str Full name of the table. :returns: Iterator over :class:`MonitorRefreshInfo` - .. py:method:: run_refresh(full_name: str) -> MonitorRefreshInfo + .. 
py:method:: run_refresh(table_name: str) -> MonitorRefreshInfo Queue a metric refresh for a monitor. @@ -180,13 +180,13 @@ Additionally, the call must be made from the workspace where the monitor was created. - :param full_name: str + :param table_name: str Full name of the table. :returns: :class:`MonitorRefreshInfo` - .. py:method:: update(full_name: str, output_schema_name: str [, baseline_table_name: Optional[str], custom_metrics: Optional[List[MonitorCustomMetric]], data_classification_config: Optional[MonitorDataClassificationConfig], inference_log: Optional[MonitorInferenceLogProfileType], notifications: Optional[MonitorNotificationsConfig], schedule: Optional[MonitorCronSchedule], slicing_exprs: Optional[List[str]], snapshot: Optional[MonitorSnapshotProfileType], time_series: Optional[MonitorTimeSeriesProfileType]]) -> MonitorInfo + .. py:method:: update(table_name: str, output_schema_name: str [, baseline_table_name: Optional[str], custom_metrics: Optional[List[MonitorMetric]], data_classification_config: Optional[MonitorDataClassificationConfig], inference_log: Optional[MonitorInferenceLog], notifications: Optional[MonitorNotifications], schedule: Optional[MonitorCronSchedule], slicing_exprs: Optional[List[str]], snapshot: Optional[MonitorSnapshot], time_series: Optional[MonitorTimeSeries]]) -> MonitorInfo Update a table monitor. @@ -202,21 +202,21 @@ Certain configuration fields, such as output asset identifiers, cannot be updated. - :param full_name: str + :param table_name: str Full name of the table. :param output_schema_name: str Schema where output metric tables are created. :param baseline_table_name: str (optional) Name of the baseline table from which drift metrics are computed from. Columns in the monitored table should also be present in the baseline table. - :param custom_metrics: List[:class:`MonitorCustomMetric`] (optional) + :param custom_metrics: List[:class:`MonitorMetric`] (optional) Custom metrics to compute on the monitored table. These can be aggregate metrics, derived metrics (from already computed aggregate metrics), or drift metrics (comparing metrics across time windows). :param data_classification_config: :class:`MonitorDataClassificationConfig` (optional) The data classification config for the monitor. - :param inference_log: :class:`MonitorInferenceLogProfileType` (optional) + :param inference_log: :class:`MonitorInferenceLog` (optional) Configuration for monitoring inference logs. - :param notifications: :class:`MonitorNotificationsConfig` (optional) + :param notifications: :class:`MonitorNotifications` (optional) The notification settings for the monitor. :param schedule: :class:`MonitorCronSchedule` (optional) The schedule for automatically updating and refreshing metric tables. @@ -224,9 +224,9 @@ List of column expressions to slice data with for targeted analysis. The data is grouped by each expression independently, resulting in a separate slice for each predicate and its complements. For high-cardinality columns, only the top 100 unique values by frequency will generate slices. - :param snapshot: :class:`MonitorSnapshotProfileType` (optional) + :param snapshot: :class:`MonitorSnapshot` (optional) Configuration for monitoring snapshot tables. - :param time_series: :class:`MonitorTimeSeriesProfileType` (optional) + :param time_series: :class:`MonitorTimeSeries` (optional) Configuration for monitoring time series tables. 
:returns: :class:`MonitorInfo` diff --git a/docs/workspace/catalog/registered_models.rst b/docs/workspace/catalog/registered_models.rst index d08c6aa2e..6a60c4f6d 100644 --- a/docs/workspace/catalog/registered_models.rst +++ b/docs/workspace/catalog/registered_models.rst @@ -132,9 +132,19 @@ Whether to include registered models in the response for which the principal can only access selective metadata for :param max_results: int (optional) - Max number of registered models to return. If catalog and schema are unspecified, max_results must - be specified. If max_results is unspecified, we return all results, starting from the page specified - by page_token. + Max number of registered models to return. + + If both catalog and schema are specified: - when max_results is not specified, the page length is + set to a server configured value (10000, as of 4/2/2024). - when set to a value greater than 0, the + page length is the minimum of this value and a server configured value (10000, as of 4/2/2024); - + when set to 0, the page length is set to a server configured value (10000, as of 4/2/2024); - when + set to a value less than 0, an invalid parameter error is returned; + + If neither schema nor catalog is specified: - when max_results is not specified, the page length is + set to a server configured value (100, as of 4/2/2024). - when set to a value greater than 0, the + page length is the minimum of this value and a server configured value (1000, as of 4/2/2024); - + when set to 0, the page length is set to a server configured value (100, as of 4/2/2024); - when set + to a value less than 0, an invalid parameter error is returned; :param page_token: str (optional) Opaque token to send for the next page of results (pagination). :param schema_name: str (optional) diff --git a/docs/workspace/catalog/storage_credentials.rst b/docs/workspace/catalog/storage_credentials.rst index db64eae90..e3a5ac33e 100644 --- a/docs/workspace/catalog/storage_credentials.rst +++ b/docs/workspace/catalog/storage_credentials.rst @@ -15,7 +15,7 @@ To create storage credentials, you must be a Databricks account admin. The account admin who creates the storage credential can delegate ownership to another user or group to manage permissions on it. - .. py:method:: create(name: str [, aws_iam_role: Optional[AwsIamRoleRequest], azure_managed_identity: Optional[AzureManagedIdentity], azure_service_principal: Optional[AzureServicePrincipal], cloudflare_api_token: Optional[CloudflareApiToken], comment: Optional[str], databricks_gcp_service_account: Optional[DatabricksGcpServiceAccountRequest], read_only: Optional[bool], skip_validation: Optional[bool]]) -> StorageCredentialInfo + .. py:method:: create(name: str [, aws_iam_role: Optional[AwsIamRoleRequest], azure_managed_identity: Optional[AzureManagedIdentityRequest], azure_service_principal: Optional[AzureServicePrincipal], cloudflare_api_token: Optional[CloudflareApiToken], comment: Optional[str], databricks_gcp_service_account: Optional[DatabricksGcpServiceAccountRequest], read_only: Optional[bool], skip_validation: Optional[bool]]) -> StorageCredentialInfo Usage: @@ -45,7 +45,7 @@ The credential name. The name must be unique within the metastore. :param aws_iam_role: :class:`AwsIamRoleRequest` (optional) The AWS IAM role configuration. - :param azure_managed_identity: :class:`AzureManagedIdentity` (optional) + :param azure_managed_identity: :class:`AzureManagedIdentityRequest` (optional) The Azure managed identity configuration. 
:param azure_service_principal: :class:`AzureServicePrincipal` (optional) The Azure service principal configuration. @@ -145,7 +145,7 @@ :returns: Iterator over :class:`StorageCredentialInfo` - .. py:method:: update(name: str [, aws_iam_role: Optional[AwsIamRoleRequest], azure_managed_identity: Optional[AzureManagedIdentity], azure_service_principal: Optional[AzureServicePrincipal], cloudflare_api_token: Optional[CloudflareApiToken], comment: Optional[str], databricks_gcp_service_account: Optional[DatabricksGcpServiceAccountRequest], force: Optional[bool], new_name: Optional[str], owner: Optional[str], read_only: Optional[bool], skip_validation: Optional[bool]]) -> StorageCredentialInfo + .. py:method:: update(name: str [, aws_iam_role: Optional[AwsIamRoleRequest], azure_managed_identity: Optional[AzureManagedIdentityResponse], azure_service_principal: Optional[AzureServicePrincipal], cloudflare_api_token: Optional[CloudflareApiToken], comment: Optional[str], databricks_gcp_service_account: Optional[DatabricksGcpServiceAccountRequest], force: Optional[bool], new_name: Optional[str], owner: Optional[str], read_only: Optional[bool], skip_validation: Optional[bool]]) -> StorageCredentialInfo Usage: @@ -180,7 +180,7 @@ Name of the storage credential. :param aws_iam_role: :class:`AwsIamRoleRequest` (optional) The AWS IAM role configuration. - :param azure_managed_identity: :class:`AzureManagedIdentity` (optional) + :param azure_managed_identity: :class:`AzureManagedIdentityResponse` (optional) The Azure managed identity configuration. :param azure_service_principal: :class:`AzureServicePrincipal` (optional) The Azure service principal configuration. @@ -204,7 +204,7 @@ :returns: :class:`StorageCredentialInfo` - .. py:method:: validate( [, aws_iam_role: Optional[AwsIamRoleRequest], azure_managed_identity: Optional[AzureManagedIdentity], azure_service_principal: Optional[AzureServicePrincipal], cloudflare_api_token: Optional[CloudflareApiToken], databricks_gcp_service_account: Optional[DatabricksGcpServiceAccountRequest], external_location_name: Optional[str], read_only: Optional[bool], storage_credential_name: Optional[str], url: Optional[str]]) -> ValidateStorageCredentialResponse + .. py:method:: validate( [, aws_iam_role: Optional[AwsIamRoleRequest], azure_managed_identity: Optional[AzureManagedIdentityRequest], azure_service_principal: Optional[AzureServicePrincipal], cloudflare_api_token: Optional[CloudflareApiToken], databricks_gcp_service_account: Optional[DatabricksGcpServiceAccountRequest], external_location_name: Optional[str], read_only: Optional[bool], storage_credential_name: Optional[str], url: Optional[str]]) -> ValidateStorageCredentialResponse Validate a storage credential. @@ -220,7 +220,7 @@ :param aws_iam_role: :class:`AwsIamRoleRequest` (optional) The AWS IAM role configuration. - :param azure_managed_identity: :class:`AzureManagedIdentity` (optional) + :param azure_managed_identity: :class:`AzureManagedIdentityRequest` (optional) The Azure managed identity configuration. :param azure_service_principal: :class:`AzureServicePrincipal` (optional) The Azure service principal configuration. diff --git a/docs/workspace/compute/clusters.rst b/docs/workspace/compute/clusters.rst index 84f2b7968..db6f22991 100644 --- a/docs/workspace/compute/clusters.rst +++ b/docs/workspace/compute/clusters.rst @@ -72,7 +72,7 @@ - .. 
py:method:: create(spark_version: str [, apply_policy_default_values: Optional[bool], autoscale: Optional[AutoScale], autotermination_minutes: Optional[int], aws_attributes: Optional[AwsAttributes], azure_attributes: Optional[AzureAttributes], cluster_log_conf: Optional[ClusterLogConf], cluster_name: Optional[str], cluster_source: Optional[ClusterSource], custom_tags: Optional[Dict[str, str]], data_security_mode: Optional[DataSecurityMode], docker_image: Optional[DockerImage], driver_instance_pool_id: Optional[str], driver_node_type_id: Optional[str], enable_elastic_disk: Optional[bool], enable_local_disk_encryption: Optional[bool], gcp_attributes: Optional[GcpAttributes], init_scripts: Optional[List[InitScriptInfo]], instance_pool_id: Optional[str], node_type_id: Optional[str], num_workers: Optional[int], policy_id: Optional[str], runtime_engine: Optional[RuntimeEngine], single_user_name: Optional[str], spark_conf: Optional[Dict[str, str]], spark_env_vars: Optional[Dict[str, str]], ssh_public_keys: Optional[List[str]], workload_type: Optional[WorkloadType]]) -> Wait[ClusterDetails] + .. py:method:: create(spark_version: str [, apply_policy_default_values: Optional[bool], autoscale: Optional[AutoScale], autotermination_minutes: Optional[int], aws_attributes: Optional[AwsAttributes], azure_attributes: Optional[AzureAttributes], clone_from: Optional[CloneCluster], cluster_log_conf: Optional[ClusterLogConf], cluster_name: Optional[str], cluster_source: Optional[ClusterSource], custom_tags: Optional[Dict[str, str]], data_security_mode: Optional[DataSecurityMode], docker_image: Optional[DockerImage], driver_instance_pool_id: Optional[str], driver_node_type_id: Optional[str], enable_elastic_disk: Optional[bool], enable_local_disk_encryption: Optional[bool], gcp_attributes: Optional[GcpAttributes], init_scripts: Optional[List[InitScriptInfo]], instance_pool_id: Optional[str], node_type_id: Optional[str], num_workers: Optional[int], policy_id: Optional[str], runtime_engine: Optional[RuntimeEngine], single_user_name: Optional[str], spark_conf: Optional[Dict[str, str]], spark_env_vars: Optional[Dict[str, str]], ssh_public_keys: Optional[List[str]], workload_type: Optional[WorkloadType]]) -> Wait[ClusterDetails] Usage: @@ -125,6 +125,8 @@ :param azure_attributes: :class:`AzureAttributes` (optional) Attributes related to clusters running on Microsoft Azure. If not specified at cluster creation, a set of default values will be used. + :param clone_from: :class:`CloneCluster` (optional) + When specified, this clones libraries from a source cluster during the creation of a new cluster. :param cluster_log_conf: :class:`ClusterLogConf` (optional) The configuration for delivering spark logs to a long-term storage destination. Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified for one cluster. If @@ -227,7 +229,7 @@ See :method:wait_get_cluster_running for more details. - .. 
py:method:: create_and_wait(spark_version: str [, apply_policy_default_values: Optional[bool], autoscale: Optional[AutoScale], autotermination_minutes: Optional[int], aws_attributes: Optional[AwsAttributes], azure_attributes: Optional[AzureAttributes], cluster_log_conf: Optional[ClusterLogConf], cluster_name: Optional[str], cluster_source: Optional[ClusterSource], custom_tags: Optional[Dict[str, str]], data_security_mode: Optional[DataSecurityMode], docker_image: Optional[DockerImage], driver_instance_pool_id: Optional[str], driver_node_type_id: Optional[str], enable_elastic_disk: Optional[bool], enable_local_disk_encryption: Optional[bool], gcp_attributes: Optional[GcpAttributes], init_scripts: Optional[List[InitScriptInfo]], instance_pool_id: Optional[str], node_type_id: Optional[str], num_workers: Optional[int], policy_id: Optional[str], runtime_engine: Optional[RuntimeEngine], single_user_name: Optional[str], spark_conf: Optional[Dict[str, str]], spark_env_vars: Optional[Dict[str, str]], ssh_public_keys: Optional[List[str]], workload_type: Optional[WorkloadType], timeout: datetime.timedelta = 0:20:00]) -> ClusterDetails + .. py:method:: create_and_wait(spark_version: str [, apply_policy_default_values: Optional[bool], autoscale: Optional[AutoScale], autotermination_minutes: Optional[int], aws_attributes: Optional[AwsAttributes], azure_attributes: Optional[AzureAttributes], clone_from: Optional[CloneCluster], cluster_log_conf: Optional[ClusterLogConf], cluster_name: Optional[str], cluster_source: Optional[ClusterSource], custom_tags: Optional[Dict[str, str]], data_security_mode: Optional[DataSecurityMode], docker_image: Optional[DockerImage], driver_instance_pool_id: Optional[str], driver_node_type_id: Optional[str], enable_elastic_disk: Optional[bool], enable_local_disk_encryption: Optional[bool], gcp_attributes: Optional[GcpAttributes], init_scripts: Optional[List[InitScriptInfo]], instance_pool_id: Optional[str], node_type_id: Optional[str], num_workers: Optional[int], policy_id: Optional[str], runtime_engine: Optional[RuntimeEngine], single_user_name: Optional[str], spark_conf: Optional[Dict[str, str]], spark_env_vars: Optional[Dict[str, str]], ssh_public_keys: Optional[List[str]], workload_type: Optional[WorkloadType], timeout: datetime.timedelta = 0:20:00]) -> ClusterDetails .. py:method:: delete(cluster_id: str) -> Wait[ClusterDetails] @@ -276,7 +278,7 @@ .. py:method:: delete_and_wait(cluster_id: str, timeout: datetime.timedelta = 0:20:00) -> ClusterDetails - .. 
py:method:: edit(cluster_id: str, spark_version: str [, apply_policy_default_values: Optional[bool], autoscale: Optional[AutoScale], autotermination_minutes: Optional[int], aws_attributes: Optional[AwsAttributes], azure_attributes: Optional[AzureAttributes], cluster_log_conf: Optional[ClusterLogConf], cluster_name: Optional[str], cluster_source: Optional[ClusterSource], custom_tags: Optional[Dict[str, str]], data_security_mode: Optional[DataSecurityMode], docker_image: Optional[DockerImage], driver_instance_pool_id: Optional[str], driver_node_type_id: Optional[str], enable_elastic_disk: Optional[bool], enable_local_disk_encryption: Optional[bool], gcp_attributes: Optional[GcpAttributes], init_scripts: Optional[List[InitScriptInfo]], instance_pool_id: Optional[str], node_type_id: Optional[str], num_workers: Optional[int], policy_id: Optional[str], runtime_engine: Optional[RuntimeEngine], single_user_name: Optional[str], spark_conf: Optional[Dict[str, str]], spark_env_vars: Optional[Dict[str, str]], ssh_public_keys: Optional[List[str]], workload_type: Optional[WorkloadType]]) -> Wait[ClusterDetails] + .. py:method:: edit(cluster_id: str, spark_version: str [, apply_policy_default_values: Optional[bool], autoscale: Optional[AutoScale], autotermination_minutes: Optional[int], aws_attributes: Optional[AwsAttributes], azure_attributes: Optional[AzureAttributes], clone_from: Optional[CloneCluster], cluster_log_conf: Optional[ClusterLogConf], cluster_name: Optional[str], cluster_source: Optional[ClusterSource], custom_tags: Optional[Dict[str, str]], data_security_mode: Optional[DataSecurityMode], docker_image: Optional[DockerImage], driver_instance_pool_id: Optional[str], driver_node_type_id: Optional[str], enable_elastic_disk: Optional[bool], enable_local_disk_encryption: Optional[bool], gcp_attributes: Optional[GcpAttributes], init_scripts: Optional[List[InitScriptInfo]], instance_pool_id: Optional[str], node_type_id: Optional[str], num_workers: Optional[int], policy_id: Optional[str], runtime_engine: Optional[RuntimeEngine], single_user_name: Optional[str], spark_conf: Optional[Dict[str, str]], spark_env_vars: Optional[Dict[str, str]], ssh_public_keys: Optional[List[str]], workload_type: Optional[WorkloadType]]) -> Wait[ClusterDetails] Usage: @@ -343,6 +345,8 @@ :param azure_attributes: :class:`AzureAttributes` (optional) Attributes related to clusters running on Microsoft Azure. If not specified at cluster creation, a set of default values will be used. + :param clone_from: :class:`CloneCluster` (optional) + When specified, this clones libraries from a source cluster during the creation of a new cluster. :param cluster_log_conf: :class:`ClusterLogConf` (optional) The configuration for delivering spark logs to a long-term storage destination. Two kinds of destinations (dbfs and s3) are supported. Only one destination can be specified for one cluster. If @@ -445,7 +449,7 @@ See :method:wait_get_cluster_running for more details. - .. 
py:method:: edit_and_wait(cluster_id: str, spark_version: str [, apply_policy_default_values: Optional[bool], autoscale: Optional[AutoScale], autotermination_minutes: Optional[int], aws_attributes: Optional[AwsAttributes], azure_attributes: Optional[AzureAttributes], cluster_log_conf: Optional[ClusterLogConf], cluster_name: Optional[str], cluster_source: Optional[ClusterSource], custom_tags: Optional[Dict[str, str]], data_security_mode: Optional[DataSecurityMode], docker_image: Optional[DockerImage], driver_instance_pool_id: Optional[str], driver_node_type_id: Optional[str], enable_elastic_disk: Optional[bool], enable_local_disk_encryption: Optional[bool], gcp_attributes: Optional[GcpAttributes], init_scripts: Optional[List[InitScriptInfo]], instance_pool_id: Optional[str], node_type_id: Optional[str], num_workers: Optional[int], policy_id: Optional[str], runtime_engine: Optional[RuntimeEngine], single_user_name: Optional[str], spark_conf: Optional[Dict[str, str]], spark_env_vars: Optional[Dict[str, str]], ssh_public_keys: Optional[List[str]], workload_type: Optional[WorkloadType], timeout: datetime.timedelta = 0:20:00]) -> ClusterDetails + .. py:method:: edit_and_wait(cluster_id: str, spark_version: str [, apply_policy_default_values: Optional[bool], autoscale: Optional[AutoScale], autotermination_minutes: Optional[int], aws_attributes: Optional[AwsAttributes], azure_attributes: Optional[AzureAttributes], clone_from: Optional[CloneCluster], cluster_log_conf: Optional[ClusterLogConf], cluster_name: Optional[str], cluster_source: Optional[ClusterSource], custom_tags: Optional[Dict[str, str]], data_security_mode: Optional[DataSecurityMode], docker_image: Optional[DockerImage], driver_instance_pool_id: Optional[str], driver_node_type_id: Optional[str], enable_elastic_disk: Optional[bool], enable_local_disk_encryption: Optional[bool], gcp_attributes: Optional[GcpAttributes], init_scripts: Optional[List[InitScriptInfo]], instance_pool_id: Optional[str], node_type_id: Optional[str], num_workers: Optional[int], policy_id: Optional[str], runtime_engine: Optional[RuntimeEngine], single_user_name: Optional[str], spark_conf: Optional[Dict[str, str]], spark_env_vars: Optional[Dict[str, str]], ssh_public_keys: Optional[List[str]], workload_type: Optional[WorkloadType], timeout: datetime.timedelta = 0:20:00]) -> ClusterDetails .. py:method:: ensure_cluster_is_running(cluster_id: str) diff --git a/docs/workspace/iam/permissions.rst b/docs/workspace/iam/permissions.rst index 24c790432..47ff4f37f 100644 --- a/docs/workspace/iam/permissions.rst +++ b/docs/workspace/iam/permissions.rst @@ -80,7 +80,7 @@ :param request_object_type: str The type of the request object. Can be one of the following: authorization, clusters, cluster-policies, directories, experiments, files, instance-pools, jobs, notebooks, pipelines, - registered-models, repos, serving-endpoints, or sql-warehouses. + registered-models, repos, serving-endpoints, or warehouses. :param request_object_id: str The id of the request object. @@ -157,7 +157,7 @@ :param request_object_type: str The type of the request object. Can be one of the following: authorization, clusters, cluster-policies, directories, experiments, files, instance-pools, jobs, notebooks, pipelines, - registered-models, repos, serving-endpoints, or sql-warehouses. + registered-models, repos, serving-endpoints, or warehouses. :param request_object_id: str The id of the request object. 
:param access_control_list: List[:class:`AccessControlRequest`] (optional) @@ -175,7 +175,7 @@ :param request_object_type: str The type of the request object. Can be one of the following: authorization, clusters, cluster-policies, directories, experiments, files, instance-pools, jobs, notebooks, pipelines, - registered-models, repos, serving-endpoints, or sql-warehouses. + registered-models, repos, serving-endpoints, or warehouses. :param request_object_id: str The id of the request object. :param access_control_list: List[:class:`AccessControlRequest`] (optional) diff --git a/docs/workspace/index.rst b/docs/workspace/index.rst index e466303be..4d7eabff8 100644 --- a/docs/workspace/index.rst +++ b/docs/workspace/index.rst @@ -13,6 +13,7 @@ These APIs are available from WorkspaceClient files/index iam/index jobs/index + marketplace/index ml/index pipelines/index serving/index diff --git a/docs/workspace/jobs/jobs.rst b/docs/workspace/jobs/jobs.rst index 0dccebb1a..bd0b0c6bb 100644 --- a/docs/workspace/jobs/jobs.rst +++ b/docs/workspace/jobs/jobs.rst @@ -120,7 +120,7 @@ .. py:method:: cancel_run_and_wait(run_id: int, timeout: datetime.timedelta = 0:20:00) -> Run - .. py:method:: create( [, access_control_list: Optional[List[iam.AccessControlRequest]], compute: Optional[List[JobCompute]], continuous: Optional[Continuous], deployment: Optional[JobDeployment], description: Optional[str], edit_mode: Optional[JobEditMode], email_notifications: Optional[JobEmailNotifications], format: Optional[Format], git_source: Optional[GitSource], health: Optional[JobsHealthRules], job_clusters: Optional[List[JobCluster]], max_concurrent_runs: Optional[int], name: Optional[str], notification_settings: Optional[JobNotificationSettings], parameters: Optional[List[JobParameterDefinition]], queue: Optional[QueueSettings], run_as: Optional[JobRunAs], schedule: Optional[CronSchedule], tags: Optional[Dict[str, str]], tasks: Optional[List[Task]], timeout_seconds: Optional[int], trigger: Optional[TriggerSettings], webhook_notifications: Optional[WebhookNotifications]]) -> CreateResponse + .. py:method:: create( [, access_control_list: Optional[List[iam.AccessControlRequest]], continuous: Optional[Continuous], deployment: Optional[JobDeployment], description: Optional[str], edit_mode: Optional[JobEditMode], email_notifications: Optional[JobEmailNotifications], environments: Optional[List[JobEnvironment]], format: Optional[Format], git_source: Optional[GitSource], health: Optional[JobsHealthRules], job_clusters: Optional[List[JobCluster]], max_concurrent_runs: Optional[int], name: Optional[str], notification_settings: Optional[JobNotificationSettings], parameters: Optional[List[JobParameterDefinition]], queue: Optional[QueueSettings], run_as: Optional[JobRunAs], schedule: Optional[CronSchedule], tags: Optional[Dict[str, str]], tasks: Optional[List[Task]], timeout_seconds: Optional[int], trigger: Optional[TriggerSettings], webhook_notifications: Optional[WebhookNotifications]]) -> CreateResponse Usage: @@ -158,8 +158,6 @@ :param access_control_list: List[:class:`AccessControlRequest`] (optional) List of permissions to set on the job. - :param compute: List[:class:`JobCompute`] (optional) - A list of compute requirements that can be referenced by tasks of this job. :param continuous: :class:`Continuous` (optional) An optional continuous property for this job. The continuous property will ensure that there is always one run executing. Only one of `schedule` and `continuous` can be used. 
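Note: `create()` above drops the `compute` list in favor of `environments` (documented in the next hunk). A minimal, hypothetical migration sketch follows — `JobEnvironment(environment_key=...)` and the task-level `environment_key` reference are assumptions inferred from this doc, not verified SDK behavior:

```py
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

# Sketch only: `environments` replaces the removed `compute` list. Each
# environment is assumed to carry an `environment_key` that tasks reference.
created = w.jobs.create(
    name="sdk-environments-example",
    environments=[jobs.JobEnvironment(environment_key="default")],
    tasks=[
        jobs.Task(
            task_key="main",
            environment_key="default",  # assumed task-level reference
            notebook_task=jobs.NotebookTask(notebook_path="/Users/someone@example.com/example"),
        )
    ])
print(f"created job {created.job_id}")
```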
@@ -175,6 +173,8 @@ :param email_notifications: :class:`JobEmailNotifications` (optional) An optional set of email addresses that is notified when runs of this job begin or complete as well as when this job is deleted. + :param environments: List[:class:`JobEnvironment`] (optional) + A list of task execution environment specifications that can be referenced by tasks of this job. :param format: :class:`Format` (optional) Used to tell what is the format of the job. This field is ignored in Create/Update/Reset calls. When using the Jobs API 2.1 this value is always set to `"MULTI_TASK"`. diff --git a/docs/workspace/marketplace/consumer_fulfillments.rst b/docs/workspace/marketplace/consumer_fulfillments.rst new file mode 100644 index 000000000..4ea7a9c29 --- /dev/null +++ b/docs/workspace/marketplace/consumer_fulfillments.rst @@ -0,0 +1,36 @@ +``w.consumer_fulfillments``: Consumer Fulfillments +================================================== +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ConsumerFulfillmentsAPI + + Fulfillments are entities that allow consumers to preview installations. + + .. py:method:: get(listing_id: str [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[SharedDataObject] + + Get listing content metadata. + + Get a high-level preview of the metadata of a listing's installable content. + + :param listing_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`SharedDataObject` + + + .. py:method:: list(listing_id: str [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[ListingFulfillment] + + List all listing fulfillments. + + Get all listing fulfillments associated with a listing. A _fulfillment_ is a potential installation. + Standard installations contain metadata about the attached share or git repo. Only one of these fields + will be present. Personalized installations contain metadata about the attached share or git repo, as + well as the Delta Sharing recipient type. + + :param listing_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`ListingFulfillment` + \ No newline at end of file diff --git a/docs/workspace/marketplace/consumer_installations.rst b/docs/workspace/marketplace/consumer_installations.rst new file mode 100644 index 000000000..3cdb00a5a --- /dev/null +++ b/docs/workspace/marketplace/consumer_installations.rst @@ -0,0 +1,78 @@ +``w.consumer_installations``: Consumer Installations +==================================================== +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ConsumerInstallationsAPI + + Installations are entities that allow consumers to interact with Databricks Marketplace listings. + + .. py:method:: create(listing_id: str [, accepted_consumer_terms: Optional[ConsumerTerms], catalog_name: Optional[str], recipient_type: Optional[DeltaSharingRecipientType], repo_detail: Optional[RepoInstallation], share_name: Optional[str]]) -> Installation + + Install from a listing. + + Install the payload associated with a Databricks Marketplace listing. + + :param listing_id: str + :param accepted_consumer_terms: :class:`ConsumerTerms` (optional) + :param catalog_name: str (optional) + :param recipient_type: :class:`DeltaSharingRecipientType` (optional) + :param repo_detail: :class:`RepoInstallation` (optional) + For git repo installations. + :param share_name: str (optional) + + :returns: :class:`Installation` + + + .. 
py:method:: delete(listing_id: str, installation_id: str) + + Uninstall from a listing. + + Uninstall an installation associated with a Databricks Marketplace listing. + + :param listing_id: str + :param installation_id: str + + + + + .. py:method:: list( [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[InstallationDetail] + + List all installations. + + List all installations across all listings. + + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`InstallationDetail` + + + .. py:method:: list_listing_installations(listing_id: str [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[InstallationDetail] + + List installations for a listing. + + List all installations for a particular listing. + + :param listing_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`InstallationDetail` + + + .. py:method:: update(listing_id: str, installation_id: str, installation: InstallationDetail [, rotate_token: Optional[bool]]) -> UpdateInstallationResponse + + Update an installation. + + This update API updates the fields defined in the installation table and interacts with external + services for the fields not included in the installation table: 1. the token will be rotated if the + rotateToken flag is true; 2. the token will be forcibly rotated if the rotateToken flag is true and + the tokenInfo field is empty. + + :param listing_id: str + :param installation_id: str + :param installation: :class:`InstallationDetail` + :param rotate_token: bool (optional) + + :returns: :class:`UpdateInstallationResponse` + \ No newline at end of file diff --git a/docs/workspace/marketplace/consumer_listings.rst b/docs/workspace/marketplace/consumer_listings.rst new file mode 100644 index 000000000..4bef0319d --- /dev/null +++ b/docs/workspace/marketplace/consumer_listings.rst @@ -0,0 +1,71 @@ +``w.consumer_listings``: Consumer Listings +========================================== +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ConsumerListingsAPI + + Listings are the core entities in the Marketplace. They represent the products that are available for + consumption. + + .. py:method:: get(id: str) -> GetListingResponse + + Get listing. + + Get a published listing in the Databricks Marketplace that the consumer has access to. + + :param id: str + + :returns: :class:`GetListingResponse` + + + .. py:method:: list( [, assets: Optional[List[AssetType]], categories: Optional[List[Category]], is_free: Optional[bool], is_private_exchange: Optional[bool], is_staff_pick: Optional[bool], page_size: Optional[int], page_token: Optional[str], provider_ids: Optional[List[str]], sort_by_spec: Optional[SortBySpec], tags: Optional[List[ListingTag]]]) -> Iterator[Listing] + + List listings. + + List all published listings in the Databricks Marketplace that the consumer has access to. + + :param assets: List[:class:`AssetType`] (optional) + Matches any of the following asset types + :param categories: List[:class:`Category`] (optional) + Matches any of the following categories + :param is_free: bool (optional) + Filters each listing based on whether it is free. + :param is_private_exchange: bool (optional) + Filters each listing based on whether it is a private exchange. + :param is_staff_pick: bool (optional) + Filters each listing based on whether it is a staff pick. 
+ :param page_size: int (optional) + :param page_token: str (optional) + :param provider_ids: List[str] (optional) + Matches any of the following provider ids + :param sort_by_spec: :class:`SortBySpec` (optional) + Criteria for sorting the resulting set of listings. + :param tags: List[:class:`ListingTag`] (optional) + Matches any of the following tags + + :returns: Iterator over :class:`Listing` + + + .. py:method:: search(query: str [, assets: Optional[List[AssetType]], categories: Optional[List[Category]], is_free: Optional[bool], is_private_exchange: Optional[bool], page_size: Optional[int], page_token: Optional[str], provider_ids: Optional[List[str]], sort_by: Optional[SortBy]]) -> Iterator[Listing] + + Search listings. + + Search published listings in the Databricks Marketplace that the consumer has access to. This query + supports a variety of different search parameters and performs fuzzy matching. + + :param query: str + Fuzzy matches query + :param assets: List[:class:`AssetType`] (optional) + Matches any of the following asset types + :param categories: List[:class:`Category`] (optional) + Matches any of the following categories + :param is_free: bool (optional) + :param is_private_exchange: bool (optional) + :param page_size: int (optional) + :param page_token: str (optional) + :param provider_ids: List[str] (optional) + Matches any of the following provider ids + :param sort_by: :class:`SortBy` (optional) + + :returns: Iterator over :class:`Listing` + \ No newline at end of file diff --git a/docs/workspace/marketplace/consumer_personalization_requests.rst b/docs/workspace/marketplace/consumer_personalization_requests.rst new file mode 100644 index 000000000..63ead75d3 --- /dev/null +++ b/docs/workspace/marketplace/consumer_personalization_requests.rst @@ -0,0 +1,50 @@ +``w.consumer_personalization_requests``: Consumer Personalization Requests +========================================================================== +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ConsumerPersonalizationRequestsAPI + + Personalization Requests allow customers to interact with the individualized Marketplace listing flow. + + .. py:method:: create(listing_id: str, intended_use: str, accepted_consumer_terms: ConsumerTerms [, comment: Optional[str], company: Optional[str], first_name: Optional[str], is_from_lighthouse: Optional[bool], last_name: Optional[str], recipient_type: Optional[DeltaSharingRecipientType]]) -> CreatePersonalizationRequestResponse + + Create a personalization request. + + Create a personalization request for a listing. + + :param listing_id: str + :param intended_use: str + :param accepted_consumer_terms: :class:`ConsumerTerms` + :param comment: str (optional) + :param company: str (optional) + :param first_name: str (optional) + :param is_from_lighthouse: bool (optional) + :param last_name: str (optional) + :param recipient_type: :class:`DeltaSharingRecipientType` (optional) + + :returns: :class:`CreatePersonalizationRequestResponse` + + + .. py:method:: get(listing_id: str) -> GetPersonalizationRequestResponse + + Get the personalization request for a listing. + + Get the personalization request for a listing. Each consumer can make at *most* one personalization + request for a listing. + + :param listing_id: str + + :returns: :class:`GetPersonalizationRequestResponse` + + + .. py:method:: list( [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[PersonalizationRequest] + + List all personalization requests. 
+ + List personalization requests for a consumer across all listings. + + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`PersonalizationRequest` + \ No newline at end of file diff --git a/docs/workspace/marketplace/consumer_providers.rst b/docs/workspace/marketplace/consumer_providers.rst new file mode 100644 index 000000000..6f329d132 --- /dev/null +++ b/docs/workspace/marketplace/consumer_providers.rst @@ -0,0 +1,31 @@ +``w.consumer_providers``: Consumer Providers +============================================ +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ConsumerProvidersAPI + + Providers are the entities that publish listings to the Marketplace. + + .. py:method:: get(id: str) -> GetProviderResponse + + Get a provider. + + Get a provider in the Databricks Marketplace with at least one visible listing. + + :param id: str + + :returns: :class:`GetProviderResponse` + + + .. py:method:: list( [, is_featured: Optional[bool], page_size: Optional[int], page_token: Optional[str]]) -> Iterator[ProviderInfo] + + List providers. + + List all providers in the Databricks Marketplace with at least one visible listing. + + :param is_featured: bool (optional) + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`ProviderInfo` + \ No newline at end of file diff --git a/docs/workspace/marketplace/index.rst b/docs/workspace/marketplace/index.rst new file mode 100644 index 000000000..8c56654f4 --- /dev/null +++ b/docs/workspace/marketplace/index.rst @@ -0,0 +1,21 @@ + +Marketplace +=========== + +Manage AI and analytics assets such as ML models, notebooks, and applications in an open marketplace + +.. toctree:: + :maxdepth: 1 + + consumer_fulfillments + consumer_installations + consumer_listings + consumer_personalization_requests + consumer_providers + provider_exchange_filters + provider_exchanges + provider_files + provider_listings + provider_personalization_requests + provider_provider_analytics_dashboards + provider_providers \ No newline at end of file diff --git a/docs/workspace/marketplace/provider_exchange_filters.rst b/docs/workspace/marketplace/provider_exchange_filters.rst new file mode 100644 index 000000000..ceca51e63 --- /dev/null +++ b/docs/workspace/marketplace/provider_exchange_filters.rst @@ -0,0 +1,54 @@ +``w.provider_exchange_filters``: Provider Exchange Filters +========================================================== +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ProviderExchangeFiltersAPI + + Marketplace exchange filters curate which groups can access an exchange. + + .. py:method:: create(filter: ExchangeFilter) -> CreateExchangeFilterResponse + + Create a new exchange filter. + + Add an exchange filter. + + :param filter: :class:`ExchangeFilter` + + :returns: :class:`CreateExchangeFilterResponse` + + + .. py:method:: delete(id: str) + + Delete an exchange filter. + + Delete an exchange filter. + + :param id: str + + + + + .. py:method:: list(exchange_id: str [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[ExchangeFilter] + + List exchange filters. + + List exchange filters. + + :param exchange_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`ExchangeFilter` + + + .. py:method:: update(id: str, filter: ExchangeFilter) -> UpdateExchangeFilterResponse + + Update exchange filter. + + Update an exchange filter. 
+ + :param id: str + :param filter: :class:`ExchangeFilter` + + :returns: :class:`UpdateExchangeFilterResponse` + \ No newline at end of file diff --git a/docs/workspace/marketplace/provider_exchanges.rst b/docs/workspace/marketplace/provider_exchanges.rst new file mode 100644 index 000000000..d53fd823d --- /dev/null +++ b/docs/workspace/marketplace/provider_exchanges.rst @@ -0,0 +1,113 @@ +``w.provider_exchanges``: Provider Exchanges +============================================ +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ProviderExchangesAPI + + Marketplace exchanges allow providers to share their listings with a curated set of customers. + + .. py:method:: add_listing_to_exchange(listing_id: str, exchange_id: str) -> AddExchangeForListingResponse + + Add an exchange for listing. + + Associate an exchange with a listing. + + :param listing_id: str + :param exchange_id: str + + :returns: :class:`AddExchangeForListingResponse` + + + .. py:method:: create(exchange: Exchange) -> CreateExchangeResponse + + Create an exchange. + + Create an exchange. + + :param exchange: :class:`Exchange` + + :returns: :class:`CreateExchangeResponse` + + + .. py:method:: delete(id: str) + + Delete an exchange. + + This removes an exchange from the Marketplace. + + :param id: str + + + + + .. py:method:: delete_listing_from_exchange(id: str) + + Remove an exchange for listing. + + Disassociate an exchange from a listing. + + :param id: str + + + + + .. py:method:: get(id: str) -> GetExchangeResponse + + Get an exchange. + + Get an exchange. + + :param id: str + + :returns: :class:`GetExchangeResponse` + + + .. py:method:: list( [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[Exchange] + + List exchanges. + + List exchanges visible to the provider. + + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`Exchange` + + + .. py:method:: list_exchanges_for_listing(listing_id: str [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[ExchangeListing] + + List exchanges for listing. + + List exchanges associated with a listing. + + :param listing_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`ExchangeListing` + + + .. py:method:: list_listings_for_exchange(exchange_id: str [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[ExchangeListing] + + List listings for exchange. + + List listings associated with an exchange. + + :param exchange_id: str + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`ExchangeListing` + + + .. py:method:: update(id: str, exchange: Exchange) -> UpdateExchangeResponse + + Update exchange. + + Update an exchange. + + :param id: str + :param exchange: :class:`Exchange` + + :returns: :class:`UpdateExchangeResponse` + \ No newline at end of file diff --git a/docs/workspace/marketplace/provider_files.rst b/docs/workspace/marketplace/provider_files.rst new file mode 100644 index 000000000..f719ca65f --- /dev/null +++ b/docs/workspace/marketplace/provider_files.rst @@ -0,0 +1,56 @@ +``w.provider_files``: Provider Files +==================================== +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ProviderFilesAPI + + Marketplace offers a set of file APIs for various purposes, such as preview notebooks and provider icons. + + .. 
py:method:: create(file_parent: FileParent, marketplace_file_type: MarketplaceFileType, mime_type: str [, display_name: Optional[str]]) -> CreateFileResponse + + Create a file. + + Create a file. Currently, only provider icons and attached notebooks are supported. + + :param file_parent: :class:`FileParent` + :param marketplace_file_type: :class:`MarketplaceFileType` + :param mime_type: str + :param display_name: str (optional) + + :returns: :class:`CreateFileResponse` + + + .. py:method:: delete(file_id: str) + + Delete a file. + + Delete a file. + + :param file_id: str + + + + + .. py:method:: get(file_id: str) -> GetFileResponse + + Get a file. + + Get a file. + + :param file_id: str + + :returns: :class:`GetFileResponse` + + + .. py:method:: list(file_parent: FileParent [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[FileInfo] + + List files. + + List files attached to a parent entity. + + :param file_parent: :class:`FileParent` + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`FileInfo` + \ No newline at end of file diff --git a/docs/workspace/marketplace/provider_listings.rst b/docs/workspace/marketplace/provider_listings.rst new file mode 100644 index 000000000..d26c5293e --- /dev/null +++ b/docs/workspace/marketplace/provider_listings.rst @@ -0,0 +1,65 @@ +``w.provider_listings``: Provider Listings +========================================== +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ProviderListingsAPI + + Listings are the core entities in the Marketplace. They represent the products that are available for + consumption. + + .. py:method:: create(listing: Listing) -> CreateListingResponse + + Create a listing. + + Create a new listing. + + :param listing: :class:`Listing` + + :returns: :class:`CreateListingResponse` + + + .. py:method:: delete(id: str) + + Delete a listing. + + Delete a listing. + + :param id: str + + + + + .. py:method:: get(id: str) -> GetListingResponse + + Get a listing. + + Get a listing. + + :param id: str + + :returns: :class:`GetListingResponse` + + + .. py:method:: list( [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[Listing] + + List listings. + + List listings owned by this provider. + + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`Listing` + + + .. py:method:: update(id: str, listing: Listing) -> UpdateListingResponse + + Update listing. + + Update a listing. + + :param id: str + :param listing: :class:`Listing` + + :returns: :class:`UpdateListingResponse` + \ No newline at end of file diff --git a/docs/workspace/marketplace/provider_personalization_requests.rst b/docs/workspace/marketplace/provider_personalization_requests.rst new file mode 100644 index 000000000..32cdbdbb3 --- /dev/null +++ b/docs/workspace/marketplace/provider_personalization_requests.rst @@ -0,0 +1,36 @@ +``w.provider_personalization_requests``: Provider Personalization Requests +========================================================================== +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ProviderPersonalizationRequestsAPI + + Personalization requests are an alternative to instantly available listings. Control the lifecycle of + personalized solutions. + + .. py:method:: list( [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[PersonalizationRequest] + + All personalization requests across all listings. 
+ + List personalization requests to this provider. This will return all personalization requests, + regardless of which listing they are for. + + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`PersonalizationRequest` + + + .. py:method:: update(listing_id: str, request_id: str, status: PersonalizationRequestStatus [, reason: Optional[str], share: Optional[ShareInfo]]) -> UpdatePersonalizationRequestResponse + + Update personalization request status. + + Update a personalization request. This method only permits updating the status of the request. + + :param listing_id: str + :param request_id: str + :param status: :class:`PersonalizationRequestStatus` + :param reason: str (optional) + :param share: :class:`ShareInfo` (optional) + + :returns: :class:`UpdatePersonalizationRequestResponse` + \ No newline at end of file diff --git a/docs/workspace/marketplace/provider_provider_analytics_dashboards.rst b/docs/workspace/marketplace/provider_provider_analytics_dashboards.rst new file mode 100644 index 000000000..cc29e089f --- /dev/null +++ b/docs/workspace/marketplace/provider_provider_analytics_dashboards.rst @@ -0,0 +1,50 @@ +``w.provider_provider_analytics_dashboards``: Provider Providers Analytics Dashboards +===================================================================================== +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ProviderProviderAnalyticsDashboardsAPI + + Manage templated analytics solutions for providers. + + .. py:method:: create() -> ProviderAnalyticsDashboard + + Create provider analytics dashboard. + + Create a provider analytics dashboard. Returns a Marketplace-specific `id`, not to be confused with + the Lakeview dashboard id. + + :returns: :class:`ProviderAnalyticsDashboard` + + + .. py:method:: get() -> ListProviderAnalyticsDashboardResponse + + Get provider analytics dashboard. + + Get the provider analytics dashboard. + + :returns: :class:`ListProviderAnalyticsDashboardResponse` + + + .. py:method:: get_latest_version() -> GetLatestVersionProviderAnalyticsDashboardResponse + + Get latest version of provider analytics dashboard. + + Get the latest version of the provider analytics dashboard. + + :returns: :class:`GetLatestVersionProviderAnalyticsDashboardResponse` + + + .. py:method:: update(id: str [, version: Optional[int]]) -> UpdateProviderAnalyticsDashboardResponse + + Update provider analytics dashboard. + + Update the provider analytics dashboard. + + :param id: str + id is an immutable property and can't be updated. + :param version: int (optional) + The version of the dashboard template to update the user to. Currently, this is expected to equal + the latest version of the dashboard template. + + :returns: :class:`UpdateProviderAnalyticsDashboardResponse` + \ No newline at end of file diff --git a/docs/workspace/marketplace/provider_providers.rst b/docs/workspace/marketplace/provider_providers.rst new file mode 100644 index 000000000..610c9602e --- /dev/null +++ b/docs/workspace/marketplace/provider_providers.rst @@ -0,0 +1,64 @@ +``w.provider_providers``: Provider Providers +============================================ +.. currentmodule:: databricks.sdk.service.marketplace + +.. py:class:: ProviderProvidersAPI + + Providers are entities that manage assets in Marketplace. + + .. py:method:: create(provider: ProviderInfo) -> CreateProviderResponse + + Create a provider. 
+ + Create a provider. + + :param provider: :class:`ProviderInfo` + + :returns: :class:`CreateProviderResponse` + + + .. py:method:: delete(id: str) + + Delete provider. + + Delete a provider. + + :param id: str + + + + + .. py:method:: get(id: str) -> GetProviderResponse + + Get provider. + + Get a provider profile. + + :param id: str + + :returns: :class:`GetProviderResponse` + + + .. py:method:: list( [, page_size: Optional[int], page_token: Optional[str]]) -> Iterator[ProviderInfo] + + List providers. + + List provider profiles for the account. + + :param page_size: int (optional) + :param page_token: str (optional) + + :returns: Iterator over :class:`ProviderInfo` + + + .. py:method:: update(id: str, provider: ProviderInfo) -> UpdateProviderResponse + + Update provider. + + Update a provider profile. + + :param id: str + :param provider: :class:`ProviderInfo` + + :returns: :class:`UpdateProviderResponse` + \ No newline at end of file diff --git a/examples/workspace_assignment/update_workspace_assignment_on_aws.py b/examples/workspace_assignment/update_workspace_assignment_on_aws.py index 48a26a40b..f12e85891 100755 --- a/examples/workspace_assignment/update_workspace_assignment_on_aws.py +++ b/examples/workspace_assignment/update_workspace_assignment_on_aws.py @@ -12,6 +12,6 @@ workspace_id = os.environ["DUMMY_WORKSPACE_ID"] -a.workspace_assignment.update(workspace_id=workspace_id, - principal_id=spn_id, - permissions=[iam.WorkspacePermission.USER]) +_ = a.workspace_assignment.update(workspace_id=workspace_id, + principal_id=spn_id, + permissions=[iam.WorkspacePermission.USER])
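As a consumer-side usage sketch for the new Marketplace services documented above (the service and method names follow these docs; the `Listing` fields read in the loops, such as `id`, are assumptions for illustration):

```py
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Browse free, published listings visible to this consumer.
for listing in w.consumer_listings.list(is_free=True, page_size=10):
    print(listing.id)  # assumed field on Listing

# Fuzzy-search across listings the consumer has access to.
for hit in w.consumer_listings.search(query="forecasting"):
    print(hit.id)
```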