The F&O Twist on Process Mining – Order to Cash example


During various presentations this year, I discussed Process Mining and how to feed it with Microsoft Dynamics 365 Finance and Operations data. The presentations ended with an Order-to-Cash example where data is retrieved and synchronized from Dynamics 365. In this post, I will share the details so you can use them for your own learning.

Presentations

Last year, I started sharing content on this website and YouTube on how to get started with Process Mining. Besides writing blogs and recording videos, I had another goal: preparing a scenario that is often mentioned as a valid use case for process mining. I wanted to create a demo of an Order-to-Cash or Procure-to-Pay process in Process Mining, where the data for the event log comes from Microsoft Dynamics 365 Finance and Operations.

I have used the following metaphor several times: “Say we have an electric car with a lot of interesting technology, bringing driving to the next level. Compare the electric car with Process Mining. It needs to be fueled from a charging station; in this metaphor, that is Dynamics 365. But where is the charging cable? How do we connect the car to the charging station?”

At the beginning of this year, I spent a lot of my spare time thinking about the best way to get Dynamics 365 F&O data transformed into event logs. In my opinion, there is no single best option at this moment. Event logs can be prepared in various ways, which often require customization.

I set myself some constraints: get the data without any customization or manual transformation. I will elaborate on some of the constraints and their consequences further in this blog. The solution direction for the Order-to-Cash process was to make tables available via the Fabric link. After the first full initialization, the tables are synchronized periodically. The Fabric Lakehouse comes with a SQL analytics endpoint, so a SQL statement can be used with the SQL database connector in Process Mining. With that, I was ready for a series of presentations about this topic.

DynamicsMinds | May 27 – 29, 2024 Portorož, Slovenia

My first presentation where I showed the fully connected process was at DynamicsMinds. At that time, the data looked messy due to the order in which the Dynamics 365 demo data had been created. For example, the data for order confirmations was created after the order creation, packing slip posting, and invoice booking. I therefore excluded the order confirmations from the process and showed data from only one company to get a better-looking process map.

Community Summit | October 13-17, 2024 San Antonio, Texas

This was my first Community Summit in the United States, after attending the User Summits held in Europe between 2016 and 2020. When I found out that some connections were not syncing correctly, I recreated environments and started cleaning some data. As I was relying on the created date of records, I changed the data in the database so that order confirmations were sent before the packing slips, which gave a better-looking process map. The resulting Order-to-Cash process map contained only a few mistakes, as I had missed some records.

DynUG høstkonferanse | November 26-27, 2024 Oslo, Norway 

My colleagues at Next365 Norway submitted me as a presenter for a user group conference in Norway. While I could have covered several different topics, the organization had a strong preference for the Process Mining topic. This presentation came with another challenge: I had content prepared for an hour, but found out that only 30 minutes were allocated for my session. Luckily, as this was the last session before lunch, I got some additional time and managed to cover everything I wanted to talk about at a faster pace.

Dynamics 365 Community Call ERP Edition | December 11, 2024 online

The Dynamics 365 Community Calls are held twice a month and are open to everyone in the Dynamics 365 community. One call focuses on CRM and the other on ERP. The calls start with some community updates and upcoming events, after which two presenters each talk about their topic for about 20 minutes. On December 11, I talked about how to get started with Process Mining. As part of the presentation, I was able to show a fully cleaned data set for the Order-to-Cash process.

During the last presentation, I promised to share the details of how to get the Dynamics 365 transactional data for the sales process into Process Mining. All Process Mining content is now summarized on a landing page on this website. You can find it in the main menu or click the link: The F&O Twist on Process Mining.

Order-to-Cash

As mentioned above, I wanted a flow without customizations, manual interactions, or data updates. Looking at the data model and table contents, I had to create a union view of several tables. All tables needed for this demo have the Created date system field enabled by default. Initially, I thought of the Procure-to-Pay process, including workflow approvals, but not all of its tables have the Created date field enabled, so critical timestamp information is missing.

There is no Finance and Operations connector for Process Mining available, so I explored other options. When I read about the SQL analytics endpoint available for a Lakehouse, I chose to set up a Synapse Link, which I replaced a few weeks later with the Fabric link.

For the demos I used during my presentations, the query was limited to the mandatory event log attributes. I only added one additional attribute to be able to filter on or analyze differences between companies. Even for this limited query, data needs to be retrieved from the following six tables:

  • SalesTable
  • CustConfirmJour
  • CustPackingSlipJour
  • CustInvoiceJour
  • CustTrans
  • CustSettlement

I have enabled these tables in my environment for the Fabric link. After the initialization, I could find them in my Fabric Lakehouse. In the SQL analytics endpoint, you can write a SQL query and save it as a view, and the view can then be accessed in Power Automate by using the SQL database connector.
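Before building the full query, you can quickly verify that the six tables were synchronized. Below is a minimal sketch against the SQL analytics endpoint; it assumes the tables landed in the default dbo schema with the lowercase names used in the statement later in this post.

-- Check that the six Order-to-Cash tables are available in the Lakehouse
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = 'dbo'
  AND TABLE_NAME IN ('salestable', 'custconfirmjour', 'custpackingslipjour', 'custinvoicejour', 'custtrans', 'custsettlement');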

A full demo of the setup can be watched in the video below. A fun fact: this is the first video I recorded and edited using Microsoft Clipchamp. I spent quite some time learning the editing options of this tool before I got to the end result. The video is available in 4K resolution, although the device I used for recording appears not to be fully 4K. I will use another laptop for the recording part next time.

SQL statement

In the video above, I used an SQL statement to union data from several tables. You can view and copy the statement below.

Each activity for the view requires retrieving data from a different table, or setting another filter and taking the timestamp from a different field. The activities are combined with the UNION command in the full SQL code. To get a case ID that is unique across all legal entities, I concatenated the fields DataAreaId and SalesId; as the same sales order number can exist in different legal entities, using the sales order number alone could mix up activities from different companies.

-- Activity 1: sales order created (SalesStatus 3 = Invoiced, 4 = Canceled)
SELECT dataareaid + '|' + salesid AS CaseID, createdon AS TimeStamp, 'Sales order created' AS Activity, dataareaid AS Company
FROM salestable
WHERE salesstatus IN (3, 4)
UNION
-- Activity 2: sales order canceled; the modified date serves as the cancellation timestamp
SELECT dataareaid + '|' + salesid AS CaseID, modifiedon AS TimeStamp, 'Sales order canceled' AS Activity, dataareaid AS Company
FROM salestable
WHERE salesstatus = 4
UNION
-- Activity 3: sales confirmation sent
SELECT custconfirmjour.dataareaid + '|' + custconfirmjour.salesid AS CaseID, custconfirmjour.createdon AS TimeStamp, 'Sales confirmation sent' AS Activity, custconfirmjour.dataareaid AS Company
FROM custconfirmjour
INNER JOIN salestable ON custconfirmjour.salesid = salestable.salesid AND custconfirmjour.dataareaid = salestable.dataareaid
WHERE salestable.salesstatus IN (3, 4)
UNION
-- Activity 4: sales packing slip posted
SELECT custpackingslipjour.dataareaid + '|' + custpackingslipjour.salesid AS CaseID, custpackingslipjour.createdon AS TimeStamp, 'Sales packing slip posted' AS Activity, custpackingslipjour.dataareaid AS Company
FROM custpackingslipjour
INNER JOIN salestable ON custpackingslipjour.salesid = salestable.salesid AND custpackingslipjour.dataareaid = salestable.dataareaid
WHERE salestable.salesstatus IN (3, 4)
UNION
-- Activity 5: sales invoice posted
SELECT dataareaid + '|' + salesid AS CaseID, createdon AS TimeStamp, 'Sales invoice posted' AS Activity, dataareaid AS Company
FROM custinvoicejour
WHERE salesid <> ''
UNION
-- Activity 6: payment; the settlement record links the payment transaction to the invoice transaction
SELECT paymtrans.dataareaid + '|' + custinvoicejour.salesid AS CaseID, paymtrans.createdon AS TimeStamp, 'Payment' AS Activity, paymtrans.dataareaid AS Company
FROM custtrans AS paymtrans
INNER JOIN custsettlement ON paymtrans.recid = custsettlement.transrecid AND paymtrans.dataareaid = custsettlement.dataareaid
INNER JOIN custtrans AS invoicetrans ON custsettlement.offsetrecid = invoicetrans.recid AND custsettlement.dataareaid = invoicetrans.dataareaid
INNER JOIN custinvoicejour ON invoicetrans.invoice = custinvoicejour.invoiceid AND invoicetrans.transdate = custinvoicejour.invoicedate AND invoicetrans.dataareaid = custinvoicejour.dataareaid
WHERE custinvoicejour.salesid <> ''
GO
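To make the statement reusable from the SQL database connector, you can save it as a view in the SQL analytics endpoint. Below is a minimal sketch with a made-up view name, OrderToCashEvents; in practice, the view body is the full UNION statement above.

-- Save the event log query as a view (the view name is a made-up example)
CREATE VIEW dbo.OrderToCashEvents AS
SELECT dataareaid + '|' + salesid AS CaseID, createdon AS TimeStamp, 'Sales order created' AS Activity, dataareaid AS Company
FROM salestable
WHERE salesstatus IN (3, 4);
-- In practice, paste all six UNION branches from the statement above as the view body.

In Power Automate, the SQL database connector can then query dbo.OrderToCashEvents as if it were a table.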

In case you want to add more attributes to support your analysis in process mining, you will most likely need more tables, and the SQL statement will become more complex.
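As an illustration, here is a sketch of the first branch extended with the customer account as an extra attribute. CustAccount is a field on SalesTable; note that every other branch of the UNION must then return a matching Customer column as well, which for the journal tables means an extra join back to SalesTable.

-- First branch extended with one extra attribute; all other branches
-- must return a matching Customer column for the UNION to stay valid
SELECT dataareaid + '|' + salesid AS CaseID, createdon AS TimeStamp, 'Sales order created' AS Activity, dataareaid AS Company, custaccount AS Customer
FROM salestable
WHERE salesstatus IN (3, 4)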

Note that the query above does not restrict the data to a recent period; it retrieves the full history. For process mining purposes, such as checking the effect of improvements, you may be interested in analyzing only the last 30 or 60 days.
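A simple way to do that is a date filter on top of the saved view; the sketch below assumes the hypothetical view from above and keeps only the events of the last 60 days. Be aware that a plain timestamp filter truncates cases that started before the cutoff.

-- Keep only events from the last 60 days (cases spanning the cutoff are truncated)
SELECT *
FROM dbo.OrderToCashEvents
WHERE [TimeStamp] >= DATEADD(day, -60, GETDATE());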

There is more…

In this example, I used the option to get all data from transaction tables and convert it into an event log using SQL statements. Depending on the use case, you can also use business events, data events, or add customizations. Which solution to choose depends on a lot of variables.

All solutions come with a cost. This example consumes additional Dataverse storage, as tables were enabled for the Fabric link. Retrieving data via Fabric also requires a Power BI or Fabric subscription. If you want to store data in a specific Dataverse table or your own Data Lake, it will consume space, which again involves cost. Possibly, some components are already part of your subscription, or data from some tables is already required for reporting; in that case, the cost is shared. Customizations also come with a cost: not only for building them, but also during updates, when you need to validate that they still work.

I mentioned the Procure-to-Pay process earlier in this post. As not all related tables have the Created date field enabled, a customization is required. You can consider enabling the system field or building another approach using, e.g., data events.



I do hope you liked this post and that it will add value in your daily work as a professional. If you have related questions or feedback, don’t hesitate to use the comment feature below.


That’s all for now. Till next time!

