Introduction
Can tables created in Fabric be seamlessly referenced and edited in Databricks?
Many people may have this question.
In this article, we walk through the following use case:
- Using tables created in Fabric within Databricks.
For details on the prerequisite settings, please refer to the previous article.
This article is part of a four-part series:
- Overview and Purpose of Interoperability
- Detailed Configuration of Hub Storage
- Using tables created in Fabric within Databricks (this article)
- Using tables created in Databricks within Fabric
Linking Tables Created in Fabric to Databricks
Creating a New Table in Fabric
Upload a CSV file to the Fabric Lakehouse.
:::note info
The CSV file used in this article is sales.csv
from the following Microsoft documentation:
Create a Microsoft Fabric Lakehouse
:::
From the CSV file, select [Load to Table] > [New Table].
Specify `ext`, the shortcut created in the hub storage, as the schema.
Verifying the Created Table
You can confirm that a new table has been created in the Lakehouse.
A `create_from_fabric_sales` folder is created in the `ext` folder of the hub storage.
(This means that the newly created table physically exists in the hub storage.)
You can also confirm that the table is in Delta format.
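A Delta table is simply a folder of Parquet data files plus a `_delta_log` transaction-log directory, so the format can be recognized from the folder layout alone. As a minimal sketch (assuming the table folder is accessible on a local or mounted path; the path handling here is illustrative, not a Fabric or Databricks API):

```python
import os

def is_delta_table(table_path: str) -> bool:
    """A folder is a Delta table if it contains a _delta_log
    directory holding the JSON transaction log."""
    return os.path.isdir(os.path.join(table_path, "_delta_log"))
```

This is the same signal the engines themselves use: both Fabric and Databricks treat any folder with a valid `_delta_log` as a Delta table, which is what makes the hub-storage folder readable from either side.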
At this point, as expected, the table created in Fabric is not yet visible in Databricks.
Enabling Databricks to Access Fabric Tables
Use the Databricks SQL Editor to create an external table.
Specify the hub storage folder path (the folder of the table created in Fabric) in the Location field.
```sql
CREATE TABLE <table_name>
USING DELTA
LOCATION 'abfss://<container_name>@<ADLS2_name>.dfs.core.windows.net/<folder_name>/<table_folder_name>';
```
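The LOCATION value must follow the ABFSS URI format for ADLS Gen2 (`abfss://<container>@<account>.dfs.core.windows.net/<path>`). A small helper that assembles this URI, with hypothetical container and account names used purely for illustration:

```python
def abfss_location(container: str, account: str, *folders: str) -> str:
    """Build the ABFSS URI for a hub-storage folder.
    Format: abfss://<container>@<account>.dfs.core.windows.net/<path>"""
    path = "/".join(folders)
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"

# Hypothetical names for illustration only:
print(abfss_location("hub", "mystorageacct", "ext", "create_from_fabric_sales"))
# -> abfss://hub@mystorageacct.dfs.core.windows.net/ext/create_from_fabric_sales
```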
Then, you can view tables created in Fabric from the [Catalog].
Viewing and Analyzing Tables Created in Fabric with Databricks (BI Creation)
From the [Dashboard] in Databricks, you can create a new dashboard and select an external table (i.e., a table created in Fabric) from [Data] > [Select Table].
Thus, it is possible to analyze tables created in Fabric using Databricks.
Editing Tables Created in Fabric with Databricks (DML)
Try executing an UPDATE statement (a DML statement) from the SQL Editor in Databricks.

```sql
UPDATE create_from_fabric_sales
SET Item = 'No.1 Item'
WHERE Item = 'Road-150 Red, 48';
```
Of course, you can confirm that the changes have been reflected on the Databricks side.
Although the edit was made from Databricks, the changes are reflected on the Fabric side as well, which you can confirm with a query such as:

```sql
SELECT Item, SUM(Quantity * UnitPrice) AS Revenue
FROM Fabric_Lakehouse.ext.create_from_fabric_sales
GROUP BY Item
ORDER BY Revenue DESC;
```
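The aggregation in the query above (revenue per item, sorted descending) can be sketched in plain Python. The sample rows below are made up for illustration and are not taken from the actual sales.csv:

```python
from collections import defaultdict

# Hypothetical sample rows standing in for the sales table
rows = [
    {"Item": "No.1 Item", "Quantity": 2, "UnitPrice": 3578.27},
    {"Item": "Mountain-100 Silver, 38", "Quantity": 1, "UnitPrice": 3399.99},
    {"Item": "No.1 Item", "Quantity": 1, "UnitPrice": 3578.27},
]

# SUM(Quantity * UnitPrice) GROUP BY Item
revenue = defaultdict(float)
for r in rows:
    revenue[r["Item"]] += r["Quantity"] * r["UnitPrice"]

# ORDER BY Revenue DESC
for item, total in sorted(revenue.items(), key=lambda kv: -kv[1]):
    print(item, round(total, 2))
```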
Therefore, it is possible to edit tables created in Fabric using Databricks (DML statements).
Conclusion
From the above, we have confirmed that
"Tables created in Fabric can be used in Databricks."
Once the hub storage is set up, it is relatively easy to achieve interoperability between Fabric and Databricks.
In the next article, we will introduce the reverse case:
"Using tables created in Databricks in Fabric."
▽ Next article
▽ Previous article