The F&O Twist on Process Mining – Use a data entity to connect a process


After writing and creating some video content about the basics of Power Automate Process Mining, in this blog I will walk through an example of how to ingest data from Dynamics 365 F&O. I get a lot of questions about this topic, so here, finally, is a blog that will hopefully answer many of them.

Overview of available connectors

Before I dive into the details: as I mentioned in a previous post, there is no dedicated connector for loading data from Dynamics 365 F&O. This doesn’t mean you can’t use data from F&O for Process Mining. There are several options available, such as:

  • Azure Data Lake – You can export data to a Data Lake and format it as an event log.
  • Dataverse – You can use virtual tables, or fill a physical table and use it as an event log.
  • Web API – With this connector, you can use OData to retrieve records from a table via the public data entities available in Dynamics 365 F&O (see the sketch after this list).
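To give an idea of what the Web API option looks like in practice, here is a minimal Python sketch of such an OData call. This is illustrative only: the environment URL, the entity name WorkflowHistoryEntity, and the column names are placeholders I made up, and authentication against F&O normally runs through an Azure AD app registration.

```python
import requests

# Placeholder values -- replace with your own environment and Azure AD details.
BASE_URL = "https://yourenvironment.operations.dynamics.com"
ENTITY = "WorkflowHistoryEntity"  # hypothetical public data entity name
TOKEN = "<bearer token from your Azure AD app registration>"

# OData query against the F&O /data endpoint; $select limits the payload to
# the columns needed for an event log, $top keeps the sample small.
url = f"{BASE_URL}/data/{ENTITY}?$select=CaseId,Activity,EventTime&$top=100"

response = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
response.raise_for_status()

for record in response.json()["value"]:
    print(record)
```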

There are more options besides these. Which option to choose depends on several factors, such as performance, storage cost, capacity, transformation needs, and the complexity of the source data. Organizations might also have IT staff with skills in specific technologies, which can influence their preference.

Data entity to connect a process

In this blog, I will not elaborate on the differences between these options or when to choose which one. That is a topic I will cover in a future post. The goal for now is to take one process and bring its data into Process Mining. I created a demo before, which I also presented at the DynamicsMinds conference in May 2023. That demo was based on an ISV solution and on business events starting a Power Automate flow to fill a table with events in Dataverse. I received several questions asking if I could share the example. Unfortunately, as it was based on an ISV product, the prerequisite would be acquiring that solution. Besides, the components involved are not the simplest to start with.

I found another process in the standard application: the workflow history. The advantage of the workflow history is that the data is stored in tables that can easily be transposed into an event log. Starting from the warehouse material movement analysis, I checked how Microsoft connected the data from F&O to Process Mining. During some earlier talks I had with Microsoft, I thought they mentioned using Dataverse, but there is only a staging table in the F&O database, and I couldn’t find any trace in Dataverse. It turns out Microsoft is using the Web API connector to start the data flow. In that case, a data entity is required to connect to the data via an OData call.

For the workflow history, there is no staging table and no data entity available. When I looked at the table structure of the workflow history, I concluded that it provides the base information directly, without needing a staging table to transform all the data into event logs. So, only a data entity needs to be created. After that, you have the prerequisites in place to create a process like the example in the next screenshot.
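For context: Process Mining needs an event log with at least a case ID, an activity name, and a timestamp per event. The sketch below shows, with hypothetical column names loosely modeled on what workflow tracking records contain, how workflow history rows could map onto that shape.

```python
import pandas as pd

# Hypothetical records shaped like workflow history rows; the real tables
# expose similar information (workflow instance, step/outcome, date, user).
records = pd.DataFrame(
    {
        "WorkflowInstanceId": ["WF-001", "WF-001", "WF-001"],
        "TrackingType": ["Submission", "Approval", "Completion"],
        "CreatedDateTime": pd.to_datetime(
            ["2024-01-02 09:00", "2024-01-03 11:30", "2024-01-03 11:31"]
        ),
        "UserId": ["julia", "mike", "system"],
    }
)

# Process Mining needs at minimum a case ID, an activity, and a timestamp.
event_log = records.rename(
    columns={
        "WorkflowInstanceId": "CaseId",
        "TrackingType": "Activity",
        "CreatedDateTime": "StartTimestamp",
    }
)
print(event_log)
```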

Note that this is quite a simple example. If you need to mine more complex processes, like order to cash or procure to pay, you would need to gather data from multiple tables. I will share an example of one of these processes in a future blog post.

YouTube learning video

I created a video about how I discovered the technical details Microsoft itself uses for the warehouse material movement analysis, what exact information is available for the workflow history, the details of the required data entity, and how to create the process shown in the screenshot above. This video provides further details and might answer many of the questions you sent me over the past period. You can also watch more learning videos about Process Mining in this playlist on YouTube: F&O Twist on Process Mining

Below the video, you can download the data entity I created for workflow history process mining. Of course, you can also challenge yourself and create the data entity on your own. Don’t forget to read the closing words at the end of this blog.

Download

The data entity used in the example can be downloaded and used without restrictions. As it is a stand-alone data entity, I didn’t follow all best practices, such as using label files and mapping privileges to duties.

There is more…

As mentioned, the Web API connector is just one of the options for connecting Dynamics 365 F&O processes to Power Automate Process Mining. More connectors will be explained in other blog posts.

The data entity I created is based on directly accessible data. If you need more attributes that require some business logic, a staging table that collects data for process mining might be a better option.

The data entity itself does not filter data for a particular recent period. You can extend the entity with a date range filter, or use a staging table with a cleanup process similar to what Microsoft implemented for the standard warehouse material movement analysis.
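As an illustration of the date range approach, here is a sketch of what such a filter could look like as an OData $filter clause, reusing the hypothetical entity and column names from the earlier example.

```python
# Hypothetical example: restrict the OData call to recent events by adding
# a $filter on the timestamp column, instead of pulling the full history.
url = (
    "https://yourenvironment.operations.dynamics.com/data/WorkflowHistoryEntity"
    "?$select=CaseId,Activity,EventTime"
    "&$filter=EventTime ge 2024-01-01T00:00:00Z"
)
print(url)
```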

Because the data entity has some additional attributes, you can connect the process to the desktop version of Process Mining and start more advanced analysis.

Even after several blogs, there are more topics to discuss about process mining. In my next posts, I will, for example, write about:

  • How to create event logs from Dynamics 365 F&O transactional data by concatenating data from multiple tables
  • Power Automate Process Mining desktop features
  • Create your own Power BI dashboards for Process Mining

If you want to learn more about Process Mining, you can explore the documentation on Microsoft Learn: Overview of process mining and task mining in Power Automate – Power Automate | Microsoft Learn



I do hope you liked this post and that it will add value to your daily work as a professional. If you have related questions or feedback, don’t hesitate to use the comment feature below.


That’s all for now. Till next time!
