Speed up data import with (data) entity execution parameters
This post covers the Entity execution parameters, which can be found on the Data import/export framework parameters form. They give you an option to gain performance when importing a large number of records using the data management features.
Entity execution parameters
The entity execution parameters let you specify how the workload should be divided when performing a data import using the batch framework. In Microsoft Dynamics AX 2012, we used to specify the number of tasks when creating the processing group. In the current product, it can be defined upfront as a parameter.
Use the following path to open the Entity execution parameters: System administration > Workspaces > Data management; then click the Framework parameters tile. The Data management workspace can also be started from your default dashboard.
On the Framework parameters form, select the Entity settings tab and click the Configure entity execution parameters button to open the next form.
As you can see in the picture above, you can specify, per entity, a threshold at which the work will be split and how many tasks will then be created. You can also create variations for the same entity; for example, use 4 tasks for smaller files and 8 tasks for larger ones.
When you specify these parameters with a task count, the import creates multiple threads across which the workload is divided and run in parallel. In my experience, this can make a huge difference in performance during data import.
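Conceptually, the mechanism works like a threshold table: the file's record count is matched against the configured thresholds, and the work is split into that many parallel tasks. The sketch below illustrates the idea in Python; the thresholds, task counts, and helper names are illustrative, not the framework's actual implementation.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical entity execution parameters: when the record count reaches
# a threshold, the import is split into the given number of tasks.
EXECUTION_PARAMETERS = [
    {"threshold": 100_000, "task_count": 8},   # larger files: 8 tasks
    {"threshold": 10_000, "task_count": 4},    # smaller files: 4 tasks
]

def resolve_task_count(record_count):
    """Pick the task count for the highest threshold the file reaches."""
    for rule in sorted(EXECUTION_PARAMETERS,
                       key=lambda r: r["threshold"], reverse=True):
        if record_count >= rule["threshold"]:
            return rule["task_count"]
    return 1  # below every threshold: a single task

def split_into_tasks(records, task_count):
    """Divide the records into roughly equal chunks, one per task."""
    chunk = -(-len(records) // task_count)  # ceiling division
    return [records[i:i + chunk] for i in range(0, len(records), chunk)]

def import_chunk(chunk):
    # Placeholder for the real per-task import work.
    return len(chunk)

records = list(range(25_000))                 # pretend staging data
task_count = resolve_task_count(len(records)) # 4 with the rules above
with ThreadPoolExecutor(max_workers=task_count) as pool:
    imported = sum(pool.map(import_chunk,
                            split_into_tasks(records, task_count)))
```

With the two rules above, a 25,000-record file gets 4 tasks and a 120,000-record file gets 8, which mirrors the "variations for the same entity" behavior described earlier.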
Import in batch
To actually benefit from these settings, you have to run the import using the batch framework. The Import button on the Import project form starts a single thread on the server. Instead, use the Import in batch button on the Import options menu.
Then complete the parameters on the batch slider to schedule the batch job. While the batch job is executing, you can monitor how many tasks were created on the View tasks form. The Execution details form has a link to the batch job; from there, you can drill down into the tasks for this job.
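Besides the Import in batch button, the same kind of import can also be triggered from code through the Data management package REST API (the ImportFromPackage action). The sketch below only builds the request body; the environment URL, data project name, and legal entity are placeholders, so verify the action path and parameters against the official Data management package API documentation for your version.

```python
import json

# Placeholder environment URL and action path for the Data management
# package API; confirm against the official documentation.
BASE_URL = "https://yourenvironment.operations.dynamics.com"
ACTION = ("/data/DataManagementDefinitionGroups/"
          "Microsoft.Dynamics.DataEntities.ImportFromPackage")

payload = {
    "packageUrl": "<blob-url-of-uploaded-package>",  # from GetAzureWriteUrl
    "definitionGroupId": "CustomerImport",           # your data project name
    "executionId": "",       # empty lets the system generate one
    "execute": True,         # run the import, not just stage the package
    "overwrite": True,
    "legalEntityId": "USMF", # placeholder company
}

request_body = json.dumps(payload)
# An OAuth bearer token is required; you would POST request_body to
# BASE_URL + ACTION with that token in the Authorization header.
```

Imports scheduled this way also run as batch jobs, so the entity execution parameters apply to them as well.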
Good to know…
Some data entities, such as Fixed assets, are not built to support multiple tasks. When you try to set up execution parameters for these entities, you get an error as shown in the screenshot below.
I do hope you liked this post and that it will add value to your daily work as a professional. If you have related questions or feedback, don't hesitate to use the comment feature below.
That’s all for now. Till next time!
Hello Andre,
This setting is no longer possible with the Customers V2 entity.
Microsoft no longer allows changing the number of tasks.
Thanks for the feedback, Ada!
Hello André,
The customer data entity no longer supports multiple tasks (threads) and takes a huge amount of time compared to other imports.
I have never succeeded in importing a big file (50,000 customers). Did anyone succeed without splitting it into smaller files or creating a new data entity?
Hi Marc,
A bit of a late reply… Which exact version of Dynamics 365 are you using? I'm aware that Customers V2 has a limitation here. I haven't checked the latest versions.
Is this also applicable when using the entity via OData?
Hi Rajeshni,
Apologies for the late reply. This is not applicable for OData, only for data management, where the execution is performed using batch jobs.
Hi Andre,
We need to upload at least 1.5 lakh (150,000) records through the Budget entry entity in an on-premises environment, but it creates huge log files, takes more than 24 hours, and then fails. There is no customization involved. Please help us find out how we can make the upload faster.
Hi Raj,
I cannot provide free consultancy on this site. Personally, I have not used the Budget entry entity with a huge number of records. You can use system monitoring to see if the problem is related to storage, indexing (missing or fragmented), CPU, memory, etc.
If you need more help, you can post a question on the Dynamics Community or hire someone who can help find the root cause of your performance issue.