Saturday, December 23, 2023

Announcing new identity and access management enhancements to make Databricks easily accessible to your users


We’re excited to share new identity and access management features to help simplify the setup and scaling of Databricks for admins. Unity Catalog is at the heart of governance on the Databricks Data Intelligence Platform. Part of Unity Catalog is our identity and access management capabilities, designed with the following principles:

  1. Build secure, scalable, and ubiquitous identity and access management for onboarding, managing, and collaborating.
  2. Enable customers to easily control access to Databricks using intuitive, extensible, and audit-ready permissions.
  3. Develop world-class, highly scalable authentication for browser and API access to enable customers and partners to simply and securely leverage the power of the Databricks Data Intelligence Platform.

In this blog, we’ll provide a refresher on existing identity and access management features and introduce new investments to simplify the Databricks admin experience. These investments include easy logins from Power BI and Tableau, simplified single sign-on setup via unified login, OAuth authorization, and running jobs using the identity of a service principal as a security best practice.

Seamlessly connect Power BI and Tableau to Databricks on AWS using single sign-on

Power BI and Tableau are two of the most popular third-party data tools on Databricks. The ability to securely connect from Power BI and Tableau to Databricks with single sign-on is now generally available on AWS. Databricks leverages OAuth to allow users to access Databricks from these tools using single sign-on. This simplifies login for users and reduces the risk of leaked credentials. OAuth partner applications for Power BI and Tableau are enabled in your account by default.

To get started, check out our docs page or watch this demo video for Power BI.

Authenticate users to Databricks using unified login on AWS

Single sign-on (SSO) is a key security best practice that lets you authenticate your users to Databricks using your preferred identity provider. At Databricks, we offer SSO across all three clouds. On Azure and GCP, we offer SSO for your account and workspaces by default in the form of Microsoft Entra ID (formerly Azure Active Directory) and Google Cloud Identity, respectively. On AWS, Databricks offers support for a variety of identity providers such as Okta, Microsoft Entra ID, and OneLogin using either SAML or OIDC.

This summer, we launched unified login, a new feature that simplifies SSO for Databricks on AWS accounts and workspaces. Unified login lets you manage one SSO configuration for your account and every Databricks workspace associated with it. With SSO enabled for your account, you can turn on unified login for all or specific workspaces. This setup uses an account-level SSO configuration for Databricks access, simplifying user authentication across your account’s workspaces. Unified login is already in use on thousands of production workspaces.

Unified Login

Unified login is GA and enabled automatically on accounts created after June 21, 2023. The feature is in public preview for accounts created before June 21, 2023. To enable unified login, see how to set up SSO in your Databricks account console.

Automate service principal access to Databricks with OAuth on AWS

We’re excited to announce that OAuth for service principals is generally available on AWS. On Azure and GCP, we support OAuth via Azure and Google tokens, respectively. Service principals are Databricks identities for use with automated tools, jobs, and applications. It is a security best practice to use service principals instead of users for production automation workflows for the following reasons:

  • Production workflows that run using service principals are not impacted when users leave the organization or change roles.
  • If all processes that act on production data run using service principals, interactive users don’t need any write, delete, or modify privileges in production. This eliminates the risk of a user overwriting production data by accident.
  • Using service principals for automated workflows allows users to better protect their own access tokens.

OAuth is an open standard protocol that authorizes users and service accounts to APIs and other resources without revealing their credentials. OAuth for service principals uses the OAuth client credentials flow to generate OAuth access tokens that can be used to authenticate to Databricks APIs. OAuth for service principals has the following benefits for authenticating to Databricks:

  • Uses Databricks service principals, instead of users, for authentication.
  • Uses short-lived (one-hour) access tokens as credentials to reduce the risk of credentials being leaked.
  • Expired OAuth access tokens can be automatically refreshed using Databricks tools and SDKs.
  • Can authenticate to all Databricks APIs that the service principal has access to, at both the account and workspace level. This allows you to automate the creation and setup of workspaces in a single script.

To use OAuth for service principals, see Authentication using OAuth for service principals.
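To make the client credentials flow concrete, here is a minimal Python sketch that builds (but does not send) the token request a service principal would issue. The hostname, client ID, and secret are placeholder values; the endpoint path and form fields follow the documented machine-to-machine OAuth flow.

```python
import base64
import urllib.request


def build_token_request(host: str, client_id: str, client_secret: str) -> urllib.request.Request:
    """Construct the OAuth client-credentials token request for a
    Databricks service principal (built here but intentionally not sent)."""
    url = f"https://{host}/oidc/v1/token"
    # The service principal's client ID and secret go in an HTTP Basic auth header.
    auth = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    # Request a token valid for all APIs the service principal can access.
    body = b"grant_type=client_credentials&scope=all-apis"
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Basic {auth}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        method="POST",
    )


# Placeholder workspace hostname and credentials for illustration only.
req = build_token_request("my-workspace.cloud.databricks.com", "sp-client-id", "sp-client-secret")
```

In practice you would send this request (for example with `urllib.request.urlopen`) and pass the returned access token as a `Bearer` token to Databricks REST APIs; the Databricks SDKs can handle this exchange, including refreshing the short-lived token, for you.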

Securely run Databricks jobs as a service principal

Databricks Workflows orchestrates data processing, machine learning, and analytics pipelines on the Databricks Data Intelligence Platform. A Databricks job is a way to run your data processing and analysis applications in a Databricks workspace. By default, jobs run as the identity of the job owner. This means that the job assumes the permissions of the job owner and can only access data and Databricks objects that the job owner has permission to access.

We’re excited to announce that you can now change the identity that the job runs as to a service principal. This means that the job assumes the permissions of that service principal instead of the owner and ensures that the job won’t be affected by a user leaving your organization or switching departments. Running a job as a service principal is generally available on AWS, Azure, and GCP. Check out Run a job as a service principal in the docs to get started.
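Besides the workspace UI, the run-as identity can be changed programmatically through the Jobs API. The sketch below only constructs the partial-update payload for the jobs update endpoint; the job ID and service principal application ID are placeholder values.

```python
import json


def run_as_update_payload(job_id: int, sp_application_id: str) -> dict:
    """Build a partial-update payload for the Jobs API "update" endpoint
    that switches a job's run-as identity to a service principal."""
    return {
        "job_id": job_id,
        "new_settings": {
            # run_as takes the service principal's application ID;
            # fields not listed in new_settings are left unchanged.
            "run_as": {"service_principal_name": sp_application_id},
        },
    }


# Placeholder job ID and application ID for illustration only.
payload = run_as_update_payload(1234, "00000000-0000-0000-0000-000000000000")
body = json.dumps(payload)
```

The caller issuing this update needs manage permission on the job and must have access to the service principal; once applied, subsequent runs execute with the service principal's permissions rather than the owner's.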

“Running Databricks workflows using service principals allows us to separate the workflows’ permissions, their execution, and their lifecycle from users, therefore making them more secure and robust”

— George Moldovan, Product Owner, Raiffeisen Bank International

Identity and Access Management Best Practices on Databricks

At Databricks, we’re committed to scaling with you as your organization grows. We covered a lot in today’s blog, highlighting our key investments in our identity and access management platform through Unity Catalog on Databricks. With a slew of new identity and access management features now available, you might wonder what “good” looks like as you build your data governance strategy with Databricks.

We recommend you check out our identity and access management docs pages for the latest best practices (AWS | Azure | GCP) or watch our Data + AI Summit 2023 session “Best Practices for Setting Up Databricks SQL at Enterprise Scale”.


