
Exploring Analytical Options with Dynamics 365 Finance and Operations: Link to Fabric

I’ve recently been exploring various analytical options within Dynamics 365 Finance and Operations, and one that I’ve delved deeply into is Link to Fabric.

There is a walkthrough guide available on the Microsoft FastTrack GitHub repo: see Hands-on Lab: Link to Fabric from Dynamics 365 finance and operations apps.

This guide is an excellent starting point and should be one of the first things you try. However, it's important to understand that this approach has limitations that may make it unsuitable for some real-world scenarios. Let's discuss those limitations and what I have been exploring.

Background

I want to join multiple tables to create denormalised views that I can report on. My goal is to use Direct Lake mode in the semantic model; specifically, I wanted to avoid reimporting data into Power BI for reporting.

Key Limitations

The first limitation you’ll encounter is:

By design, only tables in the semantic model derived from tables in a Lakehouse or Warehouse support Direct Lake mode. Although tables in the model can be derived from SQL views in the Lakehouse or Warehouse, queries using those tables will fall back to DirectQuery mode.

FinOps data is highly normalised, so following the Microsoft lab, which relies on views, would not work for me.

Solutions

A solution to this problem is to create a Delta table that can be loaded from the query/view.

Here are several ways to do this:

1. Import Data Using a Data Pipeline: This method is easy to configure but can be slow and is not ideal for large volumes of data. It also only works within the same workspace.

2. Import Data Using Dataflow Gen2: Also easy to configure, but the copy is likewise limited to the same workspace.

3. Import Using a Stored Procedure: Simple to set up, but it shares the same limitation as Dataflow Gen2, working only at the SQL analytics endpoint level and not across workspaces.

4. Import Using a Notebook: This method has a higher learning curve but offers the best performance and flexibility.


Scenarios

Personally, I lean towards using Notebooks. I have explored the Spark SQL option, as it has the lowest learning curve of the languages you can use in a Notebook.

Below is a simple query to get you started. It assumes your tables are already exported to Fabric.

Select statement with a join

A simple select query to get you started with your first notebook.
%%sql
SELECT
    party.recid AS PartyId
    ,party.name AS Name
    ,COALESCE(party.namealias, '') AS ShortName
    ,COALESCE(postal.countryregionid, '') AS Country
    ,COALESCE(postal.state, '') AS State
    ,COALESCE(postal.city, '') AS City
    ,COALESCE(postal.street, '') AS Street
    ,COALESCE(postal.zipcode, '') AS PostCode
    ,COALESCE(phone.locator, '') AS PhoneNumber
    ,COALESCE(email.locator, '') AS Email
FROM dirpartytable party
LEFT OUTER JOIN logisticspostaladdress postal ON postal.location = party.primaryaddresslocation
AND postal.validto > current_date() -- filters only valid(effective) addresses
LEFT OUTER JOIN logisticselectronicaddress phone ON phone.recid = party.primarycontactphone
LEFT OUTER JOIN logisticselectronicaddress email ON email.recid = party.primarycontactemail

You should see a table result showing below your query.
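The COALESCE wrappers in the query above matter because unmatched LEFT JOIN rows come back as NULLs. Here is a minimal standalone illustration of that pattern using Python's built-in sqlite3 module (the tables and values are made up for the demo, not real F&O data):

```python
import sqlite3

# Hypothetical sample data, just to show how LEFT JOIN + COALESCE
# turns missing matches into empty strings instead of NULLs.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE party  (recid INTEGER, name TEXT, primaryaddresslocation INTEGER);
CREATE TABLE postal (location INTEGER, city TEXT);
INSERT INTO party  VALUES (1, 'Contoso', 10), (2, 'Fabrikam', 99);
INSERT INTO postal VALUES (10, 'Seattle');   -- no row for location 99
""")

rows = con.execute("""
SELECT party.name, COALESCE(postal.city, '') AS City
FROM party
LEFT OUTER JOIN postal ON postal.location = party.primaryaddresslocation
ORDER BY party.recid
""").fetchall()

print(rows)  # [('Contoso', 'Seattle'), ('Fabrikam', '')]
```

Fabrikam has no matching address row, so without COALESCE its City would be NULL; with it, the reporting layer sees a clean empty string.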




Create table if not exists

The next step is to create a new Delta table and copy your selection into it.
%%sql
CREATE TABLE IF NOT EXISTS fact_dirpartytable
USING DELTA AS
SELECT
    party.recid AS PartyId
    ,party.name AS Name
    ,COALESCE(party.namealias, '') AS ShortName
    ,COALESCE(postal.countryregionid, '') AS Country
    ,COALESCE(postal.state, '') AS State
    ,COALESCE(postal.city, '') AS City
    ,COALESCE(postal.street, '') AS Street
    ,COALESCE(postal.zipcode, '') AS PostCode
    ,COALESCE(phone.locator, '') AS PhoneNumber
    ,COALESCE(email.locator, '') AS Email
FROM dirpartytable party
LEFT OUTER JOIN logisticspostaladdress postal ON postal.location = party.primaryaddresslocation
AND postal.validto > current_date() -- filters only valid(effective) addresses
LEFT OUTER JOIN logisticselectronicaddress phone ON phone.recid = party.primarycontactphone
LEFT OUTER JOIN logisticselectronicaddress email ON email.recid = party.primarycontactemail

This is a one-off copy: if the table already exists, the statement does nothing and no new data is loaded.
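If you need the table refreshed on every run instead, Delta Lake also supports CREATE OR REPLACE TABLE, which overwrites the existing table with the latest query result. A sketch of the same load using that syntax, assuming your Fabric Spark runtime supports it:

```sql
%%sql
CREATE OR REPLACE TABLE fact_dirpartytable
USING DELTA AS
SELECT
    party.recid AS PartyId
    ,party.name AS Name
    ,COALESCE(party.namealias, '') AS ShortName
    ,COALESCE(postal.countryregionid, '') AS Country
    ,COALESCE(postal.state, '') AS State
    ,COALESCE(postal.city, '') AS City
    ,COALESCE(postal.street, '') AS Street
    ,COALESCE(postal.zipcode, '') AS PostCode
    ,COALESCE(phone.locator, '') AS PhoneNumber
    ,COALESCE(email.locator, '') AS Email
FROM dirpartytable party
LEFT OUTER JOIN logisticspostaladdress postal ON postal.location = party.primaryaddresslocation
AND postal.validto > current_date() -- filters only valid (effective) addresses
LEFT OUTER JOIN logisticselectronicaddress phone ON phone.recid = party.primarycontactphone
LEFT OUTER JOIN logisticselectronicaddress email ON email.recid = party.primarycontactemail
```

Scheduling the notebook then keeps fact_dirpartytable current for the Direct Lake semantic model.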

In the next blog post, I will cover a few different scenarios.






