Tuesday, 24 May 2016

Data Migration Framework Walkthrough Video

1. Check parameters:
Services - Microsoft Dynamics AX Data Import/Export Framework service.

USMF >> Data import export framework >> Setup >> Data import/export framework parameters.
We need to set up the shared working directory. When the Data Migration Framework is installed, it fills in the service connection URL.
Paste the URL into Internet Explorer and check that the service responds.

2. Check source data formats:
We need to set up the source data formats.
USMF >> Data import export framework >> Setup >> Source data formats
New record >> Source name - AX, Description - AX, Type - AX. Save.
New record >> Source name - CSV, Description - Comma test, Type - File.
General >> File format: Delimited, First row header - checked, Row delimiter - {CR}{LF}, Column delimiter - Comma {,}, Text qualifier - ".
Application - here we need to select the dimensions.
Dimension code - check Business unit, Department, Cost centre.
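
For reference, a file that matches these settings would look roughly like this (the column names and values below are made-up examples, not taken from the demo data):

    ITEMID,PRODUCTNAME,BUSINESSUNIT
    "D0001","Surface mouse",002
    "D0002","Keyboard, wireless",003

The first row is the header, every row ends with {CR}{LF}, and the text qualifier is what stops the comma inside "Keyboard, wireless" from being treated as a column delimiter.
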
3. Target entities:

This is where we define what we are going to import; the import is targeted at entities.
We need to check the entity (the table name, I think).
Each entity has three components:
i) Staging table - where the imported data first lands.
ii) Entity class - additional processing or modification of the data can be done in this class (see the sketch below).
iii) Target entity - the query that maps to the target tables.
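
To give an idea of the kind of additional processing the entity class can hold, here is a minimal X++ sketch of a staging-to-target transformation method in the style DIXF uses. The method name, field names and sequence number are my own illustration and are not taken from the standard Product entity class; "entity" is the staging table buffer declared in the entity class.

    [DMFTargetTransformationAttribute(true),
     DMFTargetTransformationDescAttribute("Default the search name"),
     DMFTargetTransformationSequenceAttribute(11),
     DMFTargetTransFieldListAttribute([fieldStr(DMFProductEntity, SearchName)])]
    public container generateSearchName(boolean _stagingToTarget = true)
    {
        container res;

        if (_stagingToTarget)
        {
            // When the staging record has no search name, fall back to the product name.
            res = entity.SearchName ? [entity.SearchName] : [entity.ProductName];
        }

        return res;
    }

Methods like this show up as functions in the target mapping and run while the data is copied from staging to target.
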

Example :

  • Copy data to single entity:

Select Asset in Target entities >> Modify target mapping.

Click Mapping details ...
We can change the mapping or the classes, but once done we need to click Generate mapping >> Yes.
Then save and close.

Entity structure - this shows us the target tables the data is going to. The levels are important: they define the order in which the data enters the different tables. When we process the functions, we process them based on the levels mentioned here.
Run business validations and Run business logic in insert or update method - these are checked on some entities and not on others; depending on the entity, we need to decide.
We need to check the same settings on the processing group; both places should match (both checked or both unchecked).

Click >> Target fields

USMF >> Data import export framework >> Common >> Processing groups

Create a new processing group.
Group name - ProductsAX, Description - Products AX.
Click on Entities. We can select any number of entities to include in the processing group; for this example I select one.

Select Entity - Product, Source data format - AX, and click Select.

Select Item id - 1000, click OK, and close the form.

Click on "Get staging data" in the processing group form and click OK. Click Run. You can run in batch or directly; I run it directly without checking Batch processing. Click OK.

Infolog - "1 product record inserted in staging."

In the processing group >> click on Execution history >> View staging data; we can see the record that was inserted.
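
If you want to check the staging table from code rather than the form, a small job along these lines works. It assumes the standard DMFProductEntity staging table; the execution ID value is only an example, so copy the real one from the Execution history form.

    static void BJU_CheckProductStaging(Args _args)
    {
        DMFProductEntity staging;   // staging table behind the Product entity

        // Count the rows loaded for our processing group and execution.
        select count(RecId) from staging
            where staging.DefinitionGroup == 'ProductsAX'      // processing group name
               && staging.ExecutionId     == 'ProductsAX-1';   // example execution id

        info(strFmt("%1 record(s) in Product staging for group %2",
                    staging.RecId, 'ProductsAX'));
    }
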

Now we want to populate a CSV file from the template.

Create a new processing group >> Group name - Products CSV, Description - Products CSV.
Click on Entities.
Entity - Product, change the Source data format to CSV.
Sample file path - we need to select the folder below, where we pasted the Product entity file.
Example here - C:\BJU

Click on Generate source sample file. Choose the fields and click "Generate sample file", then click Notepad. This can be used as the sample file.

We can find a lot of sample files in the folder below:
C:\Program Files (x86)\Microsoft Dynamics AX\60\Client\DemoFiles\Delimited

Copy the Product entity file from this folder to C:\BJU or C:\DIXF.
====================
Click on Generate source mapping - infolog: "Product entity mapping is done successfully."
Click on Modify source mapping just to see the mappings.
Click on Preview source file. Then we can see the data in preview.

=======================================================
Go to Processing groups >> select ProductsAX >> click on Execution history >> View staging data >> Export to file >> select processing group - Products CSV and click OK. The file is generated on your desktop.
Open that file and modify it: add one more line for another product, changing the product name. We can even update the previously existing record.
Save the file.
Copy this file to C:\BJU.
Select the Products CSV processing group >> set the sample file path to the new file we pasted.

Click on Preview source file; we can see the 2 records. "Preview error details" shows any errors raised during the process.

In the processing group >> Products CSV >> Get staging data. Click OK, click Run >> OK.
Infolog - "2 product records inserted in staging."
Go to Execution history >> View staging data >> you can see 2 records now.

Now go to the Execution history screen >> Copy data to target >> click Run.
Infolog - "Data written to target 'Product' ('1' records created, '1' records updated)"


You can check that one new product was created and product 1000 was modified.
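
To double-check from code, a small job like the one below can confirm it. It assumes the Product entity writes to EcoResProduct and that the product number is the same 1000 used above.

    static void BJU_VerifyImportedProduct(Args _args)
    {
        EcoResProduct product;

        // 1000 is the product we exported, edited in the CSV and re-imported.
        select firstOnly product
            where product.DisplayProductNumber == '1000';

        if (product)
        {
            info(strFmt("Product %1 found (RecId %2)", product.DisplayProductNumber, product.RecId));
        }
        else
        {
            info("Product 1000 was not found.");
        }
    }
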
https://www.youtube.com/watch?v=TZiAAIk35Qg  - Customer table Import
========================================================================
  • Copy data to composite entity:
This is where we can copy from a single file into 2 different entities.
---------------------------------------------------------------------
Creating new entities & other features.
Create a custom entity 
Compare data between companies 
Copy data between companies
Quick import/export 
Staging clean up.
--------------------------------------------------------------------

USMF >> Data import export framework >> Common >> Create a custom entity for data import/export

Click Next >> select CustTable >> select the menu item.
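
The wizard generates a staging table, an entity class and a target entity query for the selected table. Roughly, the generated entity class declaration looks like the sketch below; the DMFCustTableEntity / DMFCustTableEntityClass names are just whatever you typed in the wizard, shown here as an example.

    class DMFCustTableEntityClass extends DMFEntityBase
    {
        DMFCustTableEntity  entity;   // staging buffer filled from the source
        CustTable           target;   // target buffer the framework writes to
    }

You can then add transformation methods to this class (like the generateSearchName sketch earlier), for example to generate voucher numbers as described in the Stoneridge article linked below.
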

Important links on DIXF:

https://stoneridgesoftware.com/generating-voucher-numbers-when-importing-general-journals-with-dief-in-ax-2012/

