Scenario 2: Data Migration / Data Transfer

In this scenario, we're going to use DMT to perform a data transfer or data migration from a legacy application into a new CRM application.

Let's say the legacy application data is stored in 3 tables within an Access database:

  • Customer: contains the name and address of each customer
  • Product: contains product details (ID, name, description, unit price)
  • Order: contains order lines (order ID, product, customer, quantity)

In the new CRM application database (Oracle) the tables and columns are different:

  • Client: contains customer names
  • Address: contains customer addresses
  • Prod: contains product details
  • Order: contains order lines

Note: For our example, we chose an Oracle and an Access database, but this would work in exactly the same way with any other database.
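For reference, the destination schema might look like the sketch below. The table and column names are taken from the load mappings in Steps 2 and 3, but the data types, sizes, and constraints are illustrative assumptions, not part of the scenario:

```sql
-- Illustrative DDL for the new CRM schema.
-- Column names come from the load mappings in Steps 2 and 3;
-- types and constraints are assumptions.
CREATE TABLE CRM.CLIENT (
    CLIENT_ID   NUMBER PRIMARY KEY,
    NAME        VARCHAR2(100)
);

CREATE TABLE CRM.ADDRESS (
    CLIENT_ID   NUMBER REFERENCES CRM.CLIENT (CLIENT_ID),
    ADDRESS     VARCHAR2(200),
    CITY        VARCHAR2(50),
    POST_CODE   VARCHAR2(20),
    STATE       VARCHAR2(50),
    COUNTRY     VARCHAR2(50)
);

CREATE TABLE CRM.PROD (
    ID          NUMBER PRIMARY KEY,
    NAME        VARCHAR2(100),
    PROD_DESC   VARCHAR2(500),
    UNIT_PRICE  NUMBER(10,2)
);

-- ORDER is a reserved word in Oracle, so a table with that name
-- would need a quoted identifier.
CREATE TABLE CRM."ORDER" (
    ID          NUMBER PRIMARY KEY,
    PROD_ID     NUMBER REFERENCES CRM.PROD (ID),
    CLIENT_ID   NUMBER REFERENCES CRM.CLIENT (CLIENT_ID),
    QTY         NUMBER
);
```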

You can easily implement this scenario with DMT:


Step 1: Create the Export Group/Jobs (in the Export Tab)

  • Create an Export Group to contain all the Export Jobs
    • Name: Export Legacy
    • Group Type: Schedule Group
    • Scheduling Frequency: Manual
  • In the group create a Job to export Customer data
    • Name: Export Legacy Customers
    • Database Connection: Connection to the legacy Access database
    • Data File: Legacy Customers.csv
    • SQL Statement: Enter the SQL statement to export customer data from the legacy database

    • e.g.
      SELECT CUST_ID, NAME, ADDR1, ADDR2, CITY, ZIP, STATE, COUNTRY
      FROM CUSTOMER
  • In the group create a Job to export Product data
    • Name: Export Legacy Products
    • Database Connection: Connection to the legacy Access database
    • Data File: Legacy Products.csv
    • SQL Statement: Enter the SQL statement to export product data from the legacy database

    • e.g.
      SELECT PROD_ID, NAME, DESC_TEXT, UNIT_PRICE
      FROM PRODUCT
      WHERE DEMISED = 'N'
  • In the group create a Job to export Order data
    • Name: Export Legacy Orders
    • Database Connection: Connection to the legacy Access database
    • Data File: Legacy Orders.csv
    • SQL Statement: Enter the SQL statement to export order data from the legacy database

    • e.g.
      SELECT ORDER_ID, PROD_ID, CUST_ID, QTY
      FROM [ORDER]
      (ORDER is a reserved word in Access SQL, so the table name must be bracketed)

Note: Scheduling parameters are not required for the jobs because they belong to a Schedule Group (i.e. they will run automatically when the group is run).

Once the export group and jobs are created, you can validate them to make sure they are error-free and ready to run (right-click the group, then "Validate Group").

Step 2: Create the Load Group/Jobs for Customer information (in the Load Tab)

Because the legacy Customer table is split into two tables in the new system (Client and Address), we're going to use a Multi Table Group to perform this split. The Multi Table Group contains two jobs that load the Client and Address tables respectively.

  • Create a Load Group to load the customer information
    • Name: Load Legacy Customers
    • Group Type: Multi Table Group
    • Linked to Ext.: Select "Export Legacy -> Export Legacy Customers" job to automatically link to the Legacy Customers.csv data file produced by the Export Legacy Customers job.
    • Scheduling Frequency: Manual
  • In the Group, create a Load Job to load the Client table
    • Name: Load Client Table
    • Database Connection: Connection to the new CRM Oracle database
    • Dest. Schema/Table: CRM / CLIENT
    • Load Mapping: Enter the mapping for each column from the data file to the destination table, e.g.

      Active | Dest. Column (in dest. database) | Value (from src. data file) | Comments
      Yes    | CLIENT_ID                        | $DFC_[CUST_ID]              | CUST_ID used in CLIENT and ADDRESS tables
      Yes    | NAME                             | $DFC_[NAME]                 |
  • In the Group, create a Load Job to load the Address table
    • Name: Load Address Table
    • Database Connection: Connection to the new CRM Oracle database
    • Dest. Schema/Table: CRM / ADDRESS
    • Load Mapping: Enter the mapping for each column from the data file to the destination table, e.g.

      Active | Dest. Column (in dest. database) | Value (from src. data file)         | Comments
      Yes    | CLIENT_ID                        | $DFC_[CUST_ID]                      | CUST_ID used in CLIENT and ADDRESS tables
      Yes    | ADDRESS                          | $DFC_[ADDR1] || ' ' || $DFC_[ADDR2] | Use the Oracle || operator to concatenate the 2 address fields into one (Eval must be unchecked)
      Yes    | CITY                             | $DFC_[CITY]                         |
      Yes    | POST_CODE                        | $DFC_[ZIP]                          |
      Yes    | STATE                            | $DFC_[STATE]                        |
      Yes    | COUNTRY                          | $DFC_[COUNTRY]                      |
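Conceptually, for each customer row in the data file, the Multi Table Group performs the equivalent of the two inserts sketched below. This is not SQL you write yourself (DMT generates the load statements from the mapping); it is only a sketch of the result, with the bind names standing in for the $DFC_ data file columns:

```sql
-- Sketch of what the two load jobs amount to for one data file row.
-- :CUST_ID, :NAME, etc. stand for the $DFC_ data file columns.
INSERT INTO CRM.CLIENT (CLIENT_ID, NAME)
VALUES (:CUST_ID, :NAME);

INSERT INTO CRM.ADDRESS (CLIENT_ID, ADDRESS, CITY, POST_CODE, STATE, COUNTRY)
VALUES (:CUST_ID,
        :ADDR1 || ' ' || :ADDR2,   -- Oracle || concatenates the two address fields
        :CITY, :ZIP, :STATE, :COUNTRY);
```

One convenient Oracle behaviour here: concatenating a NULL with || yields the other operand, so a missing ADDR2 simply leaves a trailing space rather than making the whole ADDRESS value NULL.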

Note 1: In the above example, we used CUST_ID as the key that identifies clients in both the CLIENT and ADDRESS tables. If no ID were available in the legacy database, or if we wanted to generate a new one, DMT can automatically generate a primary key and use it as a foreign key in load jobs.
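How DMT generates such keys is internal to the tool, but for comparison, the standard way to generate new IDs in plain Oracle is a sequence. A sketch, assuming a hypothetical sequence named CLIENT_SEQ (not part of the scenario):

```sql
-- Illustrative only: a standard Oracle sequence for new client IDs.
-- CLIENT_SEQ is a hypothetical name, not something DMT creates.
CREATE SEQUENCE CLIENT_SEQ START WITH 1 INCREMENT BY 1;

-- Each reference to NEXTVAL returns the next generated value,
-- e.g. when inserting a new client:
INSERT INTO CRM.CLIENT (CLIENT_ID, NAME)
VALUES (CLIENT_SEQ.NEXTVAL, :NAME);
```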

Note 2: Before you can enter load mappings for the load jobs, you need to look up the column definitions for the data file and/or the destination table.

Note 3: The prefix $DFC_ indicates a Data File Column.

Once the load group and jobs for customer info are created, you can validate them to make sure they are error-free and ready to run (right-click the group, then "Validate Group").

Step 3: Create the Load Group/Jobs for Products and Orders information (in the Load Tab)

  • Create a Load Group to load the products and orders information
    • Name: Load Legacy Products and Orders
    • Group Type: Schedule Group
    • Scheduling Frequency: Manual
  • In the Group, create a Load Job to load the Product table
    • Name: Load Product Table
    • Linked to Ext.: Select "Export Legacy -> Export Legacy Products" job to automatically link to the "Legacy Products.csv" data file produced by the "Export Legacy Products" job
    • Database Connection: Connection to the new CRM Oracle database
    • Dest. Schema/Table: CRM / PROD
    • Load Mapping: Enter the mapping for each column from the data file to the destination table, e.g.

      Active | Dest. Column (in dest. database) | Value (from src. data file) | Comments
      Yes    | ID                               | $DFC_[PROD_ID]              |
      Yes    | NAME                             | $DFC_[NAME]                 |
      Yes    | PROD_DESC                        | $DFC_[DESC_TEXT]            |
      Yes    | UNIT_PRICE                       | $DFC_[UNIT_PRICE]           |
  • In the Group, create a Load Job to load the Order table
    • Name: Load Order Table
    • Linked to Ext.: Select "Export Legacy -> Export Legacy Orders" job to automatically link to the "Legacy Orders.csv" data file produced by the "Export Legacy Orders" job
    • Database Connection: Connection to the new CRM Oracle database
    • Dest. Schema/Table: CRM / ORDER
    • Load Mapping: Enter the mapping for each column from the data file to the destination table, e.g.

      Active | Dest. Column (in dest. database) | Value (from src. data file) | Comments
      Yes    | ID                               | $DFC_[ORDER_ID]             |
      Yes    | PROD_ID                          | $DFC_[PROD_ID]              |
      Yes    | CLIENT_ID                        | $DFC_[CUST_ID]              |
      Yes    | QTY                              | $DFC_[QTY]                  |
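As in Step 2, each mapping row describes one destination column, so the Order load amounts to an insert of the following shape. Again this is only a sketch (DMT generates the actual statement); note that ORDER is a reserved word in Oracle, so a table with that name would need a quoted identifier:

```sql
-- Sketch of the Order load for one data file row.
-- :ORDER_ID, :PROD_ID, :CUST_ID, :QTY stand for the $DFC_ data file columns.
INSERT INTO CRM."ORDER" (ID, PROD_ID, CLIENT_ID, QTY)
VALUES (:ORDER_ID, :PROD_ID, :CUST_ID, :QTY);
```

Because PROD_ID and CLIENT_ID reference rows loaded in the earlier jobs, the Product and Customer loads should run before the Order load, which is the order used in Step 4.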

Note 1: Scheduling parameters are not required for the jobs because they belong to a Schedule Group (i.e. they will run automatically when the group is run).

Note 2: Before you can enter load mappings for the load jobs, you need to look up the column definitions for the data file and/or the destination table.

Note 3: The prefix $DFC_ indicates a Data File Column.

Once the load group and jobs for Product and Order info are created, you can validate them to make sure they are error-free and ready to run (right-click the group, then "Validate Group").

Step 4: Run the Migration

All we need to do now is run the migration (i.e. the groups and jobs we've just created). Since this is a one-off data migration, we won't use the scheduler; instead we'll run the groups and jobs manually from the client. Just right-click the group or job to run and select Run in the context menu.

  • Run the Export Group "Export Legacy".
    This runs the 3 jobs within the group and produces 3 data files: "Legacy Customers.csv", "Legacy Products.csv", "Legacy Orders.csv".
  • Run the Load Group "Load Legacy Customers".
    This runs the 2 jobs within the group and loads the "Legacy Customers.csv" data file into the CLIENT and ADDRESS tables.
  • Run the Load Group "Load Legacy Products and Orders".
    This runs the 2 jobs within the group:
    "Load Product Table" loads the "Legacy Products.csv" data file into the PROD table.
    "Load Order Table" loads the "Legacy Orders.csv" data file into the ORDER table.

The data from the legacy database has now been migrated to the new database.