Tabular Editor x Databricks (Part 1)

It’s an exciting time for SpaceParts Co.! One of their divisions is in the final stages of releasing a new data platform, built using Databricks. 

But for Borp – who is an Enterprise Power BI Developer at SpaceParts Co. – it’s a time of uncertainty. 

“Databricks?!” he thinks to himself. “Isn’t that for Data Science boffins and rock star Data Engineers? What even is a Data Lakehouse?”

Borp will now be responsible for building Enterprise Power BI Semantic Models using Databricks, but he’s not quite sure where to start. 

He has a ton of questions:

“How do I even access Databricks?”  

“Will I be able to find the data I need for my Semantic Models easily?” 

“Can I still use Tabular Editor to build my models on Databricks?” 

Don’t worry, Borp – we’ve got your back!

Tabular Editor x Databricks

In this series of articles and videos, we’ll give an overview of how Power BI developers can use Databricks, and of course how you can benefit from using Tabular Editor whilst you do so.

We won’t just be spending all our time in Tabular Editor. We’ll look at what Databricks is, how to navigate its interface to explore your data, which features can help you as a Power BI Developer, best practices in Databricks itself, and how to connect Tabular Editor so you can take advantage of these features in Power BI.

We’ll also introduce some handy scripts you can use in conjunction with Databricks and Tabular Editor to take your semantic models to the next level. 

SpaceParts Co. Data

If you would like to follow along as Borp gets to grips with using the new data platform, we have some great news – the SpaceParts Co dataset is now available on the Databricks Marketplace!

The Databricks Marketplace is an open marketplace for exchanging data products. On it, you can find Databricks integrations through their Partner Connect network, Solution Accelerators, AI Models, and datasets.

The SpaceParts Co. dataset is available as a free asset to help you learn how to use Tabular Editor with Databricks. The dataset uses Delta Sharing, an open protocol developed by Databricks for securely sharing data with other organizations.

Getting set up

To set up a dataset as a consumer in Databricks Marketplace, you need access to a Databricks workspace and specific permissions within Unity Catalog. You may need a Databricks administrator to set this up on your behalf.

Here’s what’s required: 

  • CREATE CATALOG and USE PROVIDER permissions on the Unity Catalog metastore attached to your workspace. These allow you to manage shared data products. 
  • Alternatively, if you don’t have these permissions, you need the metastore admin role, which grants broader access. 
  • Additionally, you must have the USE MARKETPLACE ASSETS privilege on the Unity Catalog metastore. This is enabled by default but can be restricted by an admin. 

If your workspace was automatically enabled for Unity Catalog, the workspace admin has these permissions and can grant them to other users. You can request access from your Databricks account admin or metastore admin if needed. 
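
If you (or your Databricks admin) prefer to script these grants rather than use the Catalog Explorer UI, here is a minimal sketch of what they might look like when run from a Databricks notebook, where spark is the SparkSession the notebook provides. The principal name is purely a placeholder.

```python
# Minimal sketch: grant the Marketplace consumer privileges listed above.
# Assumes a Databricks notebook (where `spark` is the pre-created SparkSession)
# and that you hold the metastore admin role, or can otherwise grant these.
principal = "borp@spaceparts.co"  # hypothetical user or group name

for privilege in ("CREATE CATALOG", "USE PROVIDER", "USE MARKETPLACE ASSETS"):
    spark.sql(f"GRANT {privilege} ON METASTORE TO `{principal}`")
```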

As an alternative to the above, you can sign up for the Databricks Free Edition. The Free Edition provides all the features you need to follow along with this series, including admin rights for any configuration required.

In addition to setting up the SpaceParts Co. dataset as a Delta Share, we also recommend that you clone the data into your own Databricks catalog. This enables additional features that are not currently supported by Delta Shares, and it moves the data into your own tenant and region. We’ve included a notebook with the dataset to make this nice and easy.
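
To give a sense of what that copy involves, here is a rough sketch, with placeholder catalog names, that recreates each shared schema and table in a catalog of your own using CREATE TABLE ... AS SELECT. It is not the notebook included with the dataset, which remains the easiest route.

```python
# Rough sketch of copying the shared tables into a local catalog.
# Assumes a Databricks notebook (where `spark` is the pre-created SparkSession).
source_catalog = "spaceparts_share"  # hypothetical name of the Delta Share catalog
target_catalog = "spaceparts"        # hypothetical name of your own catalog

spark.sql(f"CREATE CATALOG IF NOT EXISTS {target_catalog}")

# Walk every schema and table in the share and copy it across.
for schema_row in spark.sql(f"SHOW SCHEMAS IN {source_catalog}").collect():
    schema = schema_row[0]  # first column is the schema name
    if schema == "information_schema":
        continue
    spark.sql(f"CREATE SCHEMA IF NOT EXISTS {target_catalog}.{schema}")
    for table_row in spark.sql(f"SHOW TABLES IN {source_catalog}.{schema}").collect():
        table = table_row[1]  # columns are (schema, tableName, isTemporary)
        spark.sql(
            f"CREATE TABLE IF NOT EXISTS {target_catalog}.{schema}.{table} "
            f"AS SELECT * FROM {source_catalog}.{schema}.{table}"
        )
```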

Once you’re set up, you’re ready to follow along with Borp as he familiarizes himself with Databricks and learns how to use Tabular Editor to build better Power BI semantic models, faster.

Look out for the next part of this series, where we will give an overview of Databricks.
