Add support for direct dataplane access #453
Conversation
…vant to direct dataplane access
databricks-sdk-java/src/main/java/com/databricks/sdk/core/http/RequestOptions.java
...ks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/OAuthHeaderFactoryFromSuppliers.java
…/RequestOptions.java Update class description of RequestOptions Co-authored-by: Renaud Hartert <renaud.hartert@databricks.com>
databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksConfig.java
```java
public ServingEndpointsDataPlaneAPI(
    ApiClient apiClient, DatabricksConfig config, ServingEndpointsAPI servingEndpointsAPI) {
  impl = new ServingEndpointsDataPlaneImpl(apiClient, config, servingEndpointsAPI);
}
```
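For context, a hedged sketch of the pattern this constructor sets up: the wrapper keeps the control-plane `ServingEndpointsAPI` so the impl can look up an endpoint's dataplane URL and authorization details, and the `DatabricksConfig` so it can mint dataplane-scoped OAuth tokens. The `query` method and the `ServingEndpointsDataPlaneService` interface below are illustrative assumptions, not the generated code.

```java
import com.databricks.sdk.core.ApiClient;
import com.databricks.sdk.core.DatabricksConfig;
import com.databricks.sdk.service.serving.QueryEndpointInput;
import com.databricks.sdk.service.serving.QueryEndpointResponse;
import com.databricks.sdk.service.serving.ServingEndpointsAPI;

// Sketch: the public API class delegates to an impl that sends requests to the
// endpoint's dataplane URL instead of the workspace host.
public class ServingEndpointsDataPlaneAPI {
  private final ServingEndpointsDataPlaneService impl; // assumed service interface

  public ServingEndpointsDataPlaneAPI(
      ApiClient apiClient, DatabricksConfig config, ServingEndpointsAPI servingEndpointsAPI) {
    impl = new ServingEndpointsDataPlaneImpl(apiClient, config, servingEndpointsAPI);
  }

  // Assumption: query() forwards to the impl, which resolves the dataplane URL
  // and OAuth token before issuing the request.
  public QueryEndpointResponse query(QueryEndpointInput request) {
    return impl.query(request);
  }
}
```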
This code should be generated; is this PR meant to be merged after the code generation PR?
I am not sure how the procedure for syncing the generated code and the hand-written code works, but I have made a PR on universe for the updated templates. Perhaps you can take a look at it to make sure everything is fine, and then I will remove it from this PR along with WorkspaceClient and the implementation :)
SG, let's merge the codegen first 👍
renaudhartert-db left a comment
LGTM, to be merged once the code generation PR has been merged.
databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksConfig.java
This reverts commit ec211f3.
If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:
Trigger:
Inputs:
Checks will be approved automatically on success.
## Release v0.52.0

### New Features and Improvements

* Added Direct-to-Dataplane API support, allowing users to query route optimized model serving endpoints ([#453](#453)).

### API Changes

* Added `workspaceClient.dashboardEmailSubscriptions()` service and `workspaceClient.sqlResultsDownload()` service.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.ClusterAttributes`.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.ClusterDetails`.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.ClusterSpec`.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.CreateCluster`.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.EditCluster`.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.UpdateClusterResource`.
* Added `tags` field for `com.databricks.sdk.service.pipelines.CreatePipeline`.
* Added `tags` field for `com.databricks.sdk.service.pipelines.EditPipeline`.
* Added `tags` field for `com.databricks.sdk.service.pipelines.PipelineSpec`.
* Added `maxProvisionedConcurrency` and `minProvisionedConcurrency` fields for `com.databricks.sdk.service.serving.ServedEntityInput`.
* Added `maxProvisionedConcurrency` and `minProvisionedConcurrency` fields for `com.databricks.sdk.service.serving.ServedEntityOutput`.
* Added `maxProvisionedConcurrency` and `minProvisionedConcurrency` fields for `com.databricks.sdk.service.serving.ServedModelInput`.
* Added `maxProvisionedConcurrency` and `minProvisionedConcurrency` fields for `com.databricks.sdk.service.serving.ServedModelOutput`.
* Added `DELTASHARING_CATALOG`, `FOREIGN_CATALOG`, `INTERNAL_CATALOG`, `MANAGED_CATALOG`, `MANAGED_ONLINE_CATALOG`, `SYSTEM_CATALOG` and `UNKNOWN_CATALOG_TYPE` enum values for `com.databricks.sdk.service.catalog.CatalogType`.
* Added `GA4_RAW_DATA`, `POWER_BI`, `SALESFORCE`, `SALESFORCE_DATA_CLOUD`, `SERVICENOW`, `UNKNOWN_CONNECTION_TYPE` and `WORKDAY_RAAS` enum values for `com.databricks.sdk.service.catalog.ConnectionType`.
* Added `OAUTH_ACCESS_TOKEN`, `OAUTH_M2M`, `OAUTH_REFRESH_TOKEN`, `OAUTH_RESOURCE_OWNER_PASSWORD`, `OAUTH_U2M`, `OAUTH_U2M_MAPPING`, `OIDC_TOKEN`, `PEM_PRIVATE_KEY`, `SERVICE_CREDENTIAL` and `UNKNOWN_CREDENTIAL_TYPE` enum values for `com.databricks.sdk.service.catalog.CredentialType`.
* Added `CATALOG`, `CLEAN_ROOM`, `CONNECTION`, `CREDENTIAL`, `EXTERNAL_LOCATION`, `EXTERNAL_METADATA`, `FUNCTION`, `METASTORE`, `PIPELINE`, `PROVIDER`, `RECIPIENT`, `SCHEMA`, `SHARE`, `STAGING_TABLE`, `STORAGE_CREDENTIAL`, `TABLE`, `UNKNOWN_SECURABLE_TYPE` and `VOLUME` enum values for `com.databricks.sdk.service.catalog.SecurableType`.
* Added `TERADATA` enum value for `com.databricks.sdk.service.pipelines.IngestionSourceType`.
* Added `OIDC_FEDERATION` enum value for `com.databricks.sdk.service.sharing.AuthenticationType`.
* [Breaking] Changed `securableType` field for `com.databricks.sdk.service.catalog.ConnectionInfo` to type `com.databricks.sdk.service.catalog.SecurableType` class.
* [Breaking] Changed `catalogType` field for `com.databricks.sdk.service.catalog.SchemaInfo` to type `com.databricks.sdk.service.catalog.CatalogType` class.
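To make the headline feature concrete, here is a minimal sketch of querying a route optimized serving endpoint through the new Direct-to-Dataplane support. The `servingEndpointsDataPlane()` accessor name is an assumption based on the release note above, and the endpoint name and input payload are placeholders.

```java
import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.service.serving.QueryEndpointInput;
import com.databricks.sdk.service.serving.QueryEndpointResponse;
import java.util.List;
import java.util.Map;

public class DataPlaneQueryExample {
  public static void main(String[] args) {
    // Picks up credentials via Databricks unified authentication
    // (environment variables, config profiles, etc.).
    WorkspaceClient w = new WorkspaceClient();

    // The request is routed directly to the endpoint's dataplane URL rather
    // than through the workspace host; field names are assumptions.
    QueryEndpointResponse response =
        w.servingEndpointsDataPlane()
            .query(
                new QueryEndpointInput()
                    .setName("my-route-optimized-endpoint")
                    .setDataframeRecords(List.of(Map.of("feature", 1.0))));

    System.out.println(response.getPredictions());
  }
}
```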
What changes are proposed in this pull request?
This PR introduces direct dataplane access to the Java SDK.
Key Changes
1. OAuth Header Factory Implementation
   - New `OAuthHeaderFactory` interface that combines `HeaderFactory` and `TokenSource` functionality (a sketch follows this list)
2. Error Token Source
   - New `ErrorTokenSource` implementation that provides clear error messages when OAuth tokens are not supported
3. OAuth Authentication Flow Changes
   - OAuth credential providers now return an `OAuthHeaderFactory` instead of a basic `HeaderFactory`
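To make the first two items concrete, here is a minimal sketch of what the combined interface might look like. It assumes `HeaderFactory` exposes `Map<String, String> headers()` and `TokenSource` exposes `Token getToken()`; the `fromSuppliers` helper below is a simplified stand-in for this PR's `OAuthHeaderFactoryFromSuppliers`, not the SDK's actual definition.

```java
import com.databricks.sdk.core.HeaderFactory;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Sketch: an OAuth-aware header factory that is also a token source, so callers
// can ask either for ready-to-send headers or for the raw OAuth token (needed
// when exchanging tokens for direct dataplane access).
public interface OAuthHeaderFactory extends HeaderFactory, TokenSource {

  // Simplified stand-in for OAuthHeaderFactoryFromSuppliers: builds an
  // instance from a token supplier and a header supplier.
  static OAuthHeaderFactory fromSuppliers(
      Supplier<Token> tokenSupplier, Supplier<Map<String, String>> headerSupplier) {
    return new OAuthHeaderFactory() {
      @Override
      public Token getToken() {
        return tokenSupplier.get();
      }

      @Override
      public Map<String, String> headers() {
        return new HashMap<>(headerSupplier.get());
      }
    };
  }
}
```

And the error source, again as a hedged sketch rather than the PR's exact code:

```java
import com.databricks.sdk.core.DatabricksException;

// Sketch: a TokenSource for credential providers that cannot supply OAuth
// tokens; getToken() fails fast with a descriptive message instead of
// returning an unusable token.
public class ErrorTokenSource implements TokenSource {
  private final String message; // assumption: error text injected at construction

  public ErrorTokenSource(String message) {
    this.message = message;
  }

  @Override
  public Token getToken() {
    throw new DatabricksException(message);
  }
}
```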
How is this tested?

Added unit tests for all new classes, including `OAuthHeaderFactory` and `ErrorTokenSource`, along with tests for `config.getTokenSource()`. The feature has been manually validated end-to-end by creating a model serving endpoint with route optimization enabled and successfully sending queries through the dataplane access path.
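For illustration, a sketch of the kind of `config.getTokenSource()` unit test described above; the host and OAuth M2M credentials are placeholders, and the exact setup the real tests use may differ.

```java
import static org.junit.jupiter.api.Assertions.assertNotNull;

import com.databricks.sdk.core.DatabricksConfig;
import org.junit.jupiter.api.Test;

public class DatabricksConfigTokenSourceTest {

  @Test
  void getTokenSourceIsExposedForOAuthConfigs() {
    // Placeholder host and M2M credentials; no network call is made just by
    // obtaining the token source (fetching a token from it would be one).
    DatabricksConfig config =
        new DatabricksConfig()
            .setHost("https://example.cloud.databricks.com")
            .setClientId("placeholder-client-id")
            .setClientSecret("placeholder-client-secret");

    assertNotNull(config.getTokenSource());
  }
}
```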