Conversation

@emmyzhou-db emmyzhou-db commented May 26, 2025

What changes are proposed in this pull request?

This PR introduces direct dataplane access to the Java SDK.

Key Changes

1. OAuth Header Factory Implementation

  • Introduced OAuthHeaderFactory interface that combines HeaderFactory and TokenSource functionality
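The idea can be sketched as follows. Everything here is a simplified stand-in for the SDK's actual `HeaderFactory`, `TokenSource`, and `Token` types, which differ in detail; the point is just how one interface exposes both ready-made auth headers and the raw OAuth token needed for dataplane access.

```java
import java.util.Map;

public class OAuthHeaderFactorySketch {
  // Simplified stand-ins for the SDK types; real signatures may differ.
  record Token(String accessToken, String tokenType) {}

  interface TokenSource {
    Token getToken();
  }

  interface HeaderFactory {
    Map<String, String> headers();
  }

  // The combined interface: callers can ask either for ready-made auth
  // headers or for the underlying OAuth token itself.
  interface OAuthHeaderFactory extends HeaderFactory, TokenSource {
    static OAuthHeaderFactory fromTokenSource(TokenSource source) {
      return new OAuthHeaderFactory() {
        @Override
        public Token getToken() {
          return source.getToken();
        }

        @Override
        public Map<String, String> headers() {
          Token t = source.getToken();
          return Map.of("Authorization", t.tokenType() + " " + t.accessToken());
        }
      };
    }
  }

  public static void main(String[] args) {
    OAuthHeaderFactory factory =
        OAuthHeaderFactory.fromTokenSource(() -> new Token("abc123", "Bearer"));
    System.out.println(factory.headers().get("Authorization")); // Bearer abc123
  }
}
```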

2. Error Token Source

  • Added ErrorTokenSource implementation that provides clear error messages when OAuth tokens are not supported
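The behavior can be sketched like this (illustrative names and message text, not the SDK's exact API):

```java
public class ErrorTokenSourceSketch {
  record Token(String accessToken) {}

  interface TokenSource {
    Token getToken();
  }

  // A TokenSource for auth types that cannot produce OAuth tokens: instead of
  // returning a bogus token, it fails with a descriptive message.
  static final class ErrorTokenSource implements TokenSource {
    private final String message;

    ErrorTokenSource(String message) {
      this.message = message;
    }

    @Override
    public Token getToken() {
      throw new IllegalStateException(message);
    }
  }

  public static void main(String[] args) {
    TokenSource source =
        new ErrorTokenSource("OAuth tokens are not supported for this auth type");
    try {
      source.getToken();
    } catch (IllegalStateException e) {
      System.out.println(e.getMessage());
    }
  }
}
```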

3. OAuth Authentication Flow Changes

  • Modified OAuth auth types to return OAuthHeaderFactory instead of basic HeaderFactory
  • Updated credential providers to use the new OAuth header factory system:
    • Azure Service Principal
    • OAuth M2M Service Principal
    • External Browser
    • Databricks CLI
    • GitHub OIDC
    • Azure GitHub OIDC

How is this tested?

Added unit tests for all new classes, including OAuthHeaderFactory and ErrorTokenSource, along with tests for config.getTokenSource(). The feature was also manually validated end-to-end by creating a model serving endpoint with route optimization enabled and successfully sending queries through the dataplane access path.
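The config.getTokenSource() accessor exercised by those tests can be sketched as below; `Config` and `Token` are simplified stand-ins for DatabricksConfig and the SDK's token type, not the actual classes:

```java
public class ConfigTokenSourceSketch {
  record Token(String accessToken) {}

  interface TokenSource {
    Token getToken();
  }

  // Stand-in for DatabricksConfig: once credentials are resolved, the config
  // exposes the provider's token source so dataplane clients can fetch tokens.
  static final class Config {
    private final TokenSource tokenSource;

    Config(TokenSource tokenSource) {
      this.tokenSource = tokenSource;
    }

    TokenSource getTokenSource() {
      return tokenSource;
    }
  }

  public static void main(String[] args) {
    Config config = new Config(() -> new Token("tok-123"));
    System.out.println(config.getTokenSource().getToken().accessToken()); // tok-123
  }
}
```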

…/RequestOptions.java


Update class description of RequestOptions

Co-authored-by: Renaud Hartert <renaud.hartert@databricks.com>
Comment on lines +21 to +23
public ServingEndpointsDataPlaneAPI(
ApiClient apiClient, DatabricksConfig config, ServingEndpointsAPI servingEndpointsAPI) {
impl = new ServingEndpointsDataPlaneImpl(apiClient, config, servingEndpointsAPI);
@renaudhartert-db renaudhartert-db May 27, 2025


This code should be generated, is this PR meant to be merged after the code generation PR?

Contributor Author


I am not sure how the procedure for syncing the generated code with the hand-written code works, but I have made a PR on universe for the updated templates. Perhaps you can take a look at it to make sure everything is fine, and then I will remove it from this PR along with WorkspaceClient and the implementation :)

@renaudhartert-db

SG, let's merge the codegen first 👍

@renaudhartert-db renaudhartert-db left a comment


LGTM, to be merged once we have merged the code generation PR first.

@emmyzhou-db emmyzhou-db requested a review from hectorcast-db May 27, 2025 13:14
@github-actions

If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:

Trigger:
go/deco-tests-run/sdk-java

Inputs:

  • PR number: 453
  • Commit SHA: 4ea2d8ffe6e15b2b5bdb302dc2d8022daaf82fbe

Checks will be approved automatically on success.

@renaudhartert-db renaudhartert-db added this pull request to the merge queue May 27, 2025
Merged via the queue into main with commit 3c4dd89 May 27, 2025
15 checks passed
@renaudhartert-db renaudhartert-db deleted the emmyzhou-db/code-gen branch May 27, 2025 15:11
deco-sdk-tagging bot added a commit that referenced this pull request May 27, 2025
## Release v0.52.0

### New Features and Improvements
* Added Direct-to-Dataplane API support, allowing users to query route optimized model serving endpoints ([#453](#453)).

### API Changes
* Added `workspaceClient.dashboardEmailSubscriptions()` service and `workspaceClient.sqlResultsDownload()` service.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.ClusterAttributes`.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.ClusterDetails`.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.ClusterSpec`.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.CreateCluster`.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.EditCluster`.
* Added `remoteShuffleDiskIops`, `remoteShuffleDiskThroughput` and `totalInitialRemoteShuffleDiskSize` fields for `com.databricks.sdk.service.compute.UpdateClusterResource`.
* Added `tags` field for `com.databricks.sdk.service.pipelines.CreatePipeline`.
* Added `tags` field for `com.databricks.sdk.service.pipelines.EditPipeline`.
* Added `tags` field for `com.databricks.sdk.service.pipelines.PipelineSpec`.
* Added `maxProvisionedConcurrency` and `minProvisionedConcurrency` fields for `com.databricks.sdk.service.serving.ServedEntityInput`.
* Added `maxProvisionedConcurrency` and `minProvisionedConcurrency` fields for `com.databricks.sdk.service.serving.ServedEntityOutput`.
* Added `maxProvisionedConcurrency` and `minProvisionedConcurrency` fields for `com.databricks.sdk.service.serving.ServedModelInput`.
* Added `maxProvisionedConcurrency` and `minProvisionedConcurrency` fields for `com.databricks.sdk.service.serving.ServedModelOutput`.
* Added `DELTASHARING_CATALOG`, `FOREIGN_CATALOG`, `INTERNAL_CATALOG`, `MANAGED_CATALOG`, `MANAGED_ONLINE_CATALOG`, `SYSTEM_CATALOG` and `UNKNOWN_CATALOG_TYPE` enum values for `com.databricks.sdk.service.catalog.CatalogType`.
* Added `GA4_RAW_DATA`, `POWER_BI`, `SALESFORCE`, `SALESFORCE_DATA_CLOUD`, `SERVICENOW`, `UNKNOWN_CONNECTION_TYPE` and `WORKDAY_RAAS` enum values for `com.databricks.sdk.service.catalog.ConnectionType`.
* Added `OAUTH_ACCESS_TOKEN`, `OAUTH_M2M`, `OAUTH_REFRESH_TOKEN`, `OAUTH_RESOURCE_OWNER_PASSWORD`, `OAUTH_U2M`, `OAUTH_U2M_MAPPING`, `OIDC_TOKEN`, `PEM_PRIVATE_KEY`, `SERVICE_CREDENTIAL` and `UNKNOWN_CREDENTIAL_TYPE` enum values for `com.databricks.sdk.service.catalog.CredentialType`.
* Added `CATALOG`, `CLEAN_ROOM`, `CONNECTION`, `CREDENTIAL`, `EXTERNAL_LOCATION`, `EXTERNAL_METADATA`, `FUNCTION`, `METASTORE`, `PIPELINE`, `PROVIDER`, `RECIPIENT`, `SCHEMA`, `SHARE`, `STAGING_TABLE`, `STORAGE_CREDENTIAL`, `TABLE`, `UNKNOWN_SECURABLE_TYPE` and `VOLUME` enum values for `com.databricks.sdk.service.catalog.SecurableType`.
* Added `TERADATA` enum value for `com.databricks.sdk.service.pipelines.IngestionSourceType`.
* Added `OIDC_FEDERATION` enum value for `com.databricks.sdk.service.sharing.AuthenticationType`.
* [Breaking] Changed `securableType` field for `com.databricks.sdk.service.catalog.ConnectionInfo` to type `com.databricks.sdk.service.catalog.SecurableType` class.
* [Breaking] Changed `catalogType` field for `com.databricks.sdk.service.catalog.SchemaInfo` to type `com.databricks.sdk.service.catalog.CatalogType` class.
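For callers, the two breaking changes mainly affect comparisons: code that matched `securableType` or `catalogType` against string literals should now compare enum constants. A sketch with a stand-in enum (values taken from the list above, not the full SDK enum):

```java
public class SecurableTypeMigrationSketch {
  // Stand-in for com.databricks.sdk.service.catalog.SecurableType; values are
  // a subset of those listed in the release notes.
  enum SecurableType { CATALOG, SCHEMA, TABLE, UNKNOWN_SECURABLE_TYPE }

  // Before: string checks such as "TABLE".equals(info.getSecurableType()).
  // After: compare the enum constants directly.
  static boolean isTable(SecurableType type) {
    return type == SecurableType.TABLE;
  }

  public static void main(String[] args) {
    System.out.println(isTable(SecurableType.TABLE));  // true
    System.out.println(isTable(SecurableType.SCHEMA)); // false
  }
}
```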