Deal Processing
This section covers the functionalities involved in the DMIG tool. Each of the tables and its related functionalities are
explained, and sample setups are given. Please use the menu MB.DMMAIN for operations using the DM tool.
It has three menus, namely Migration Pre-Requisite Check, Client System to T24 Migration and T24 to T24 Migration, to facilitate
efficient working with the DM tool.
A sample screenshot of the screen is shown below.
Menus
System Dates
Table
DATES
Purpose
The DATES application is checked for the current system Date before start of Data Migration.
Check Holiday Setup
Table
HOLIDAY
Purpose
The HOLIDAY application is checked for the current holiday setup.
Check Migration User
Table
USER
Purpose
The USER application is checked for the existence of Data Migration users.
Check the Licensed Product
Table
SPF
Purpose
The SPF application is checked for the license in field number 36, LICENSE.CODE.
Check Company Record Setup
Table
COMPANY
Purpose
The COMPANY application is checked for the current company record setup.
Check T24 Application Auto Id Setup
Table
AUTO.ID.START
Purpose
The AUTO.ID.START application is checked for the ID setup of the applications in scope for
Migration.
Check DM Tool Details
Enquiry
ENQUIRY DM.TOOL.DETAILS
Purpose
Displays the details of the DM Tool that will be used for the validation and load.
Table
DM.INFORMATION
Purpose
The application is used to summarise the number of records loaded. The number of
counts per application is available in a live file, DATA.MIG.COUNT. This application
keeps track of the changes in the load count.
Detailed Working Procedure
This is the table where the parameters are provided. It can be accessed from the Command line of the Browser.
The ID for this application must be SYSTEM only.
After selecting the correct ID, you will get a screen as below. Input the application name and commit it.
DM.INFORMATION – Verify
After doing the initial setup of parameters, DM.INFORMATION is verified. On
Verification:
The count for each application is done and updated in the fields LIVE Count, NAU Count, HIS Count.
The value is updated in a LIVE file DATA.MIG.COUNT.
After committing the record, we need to verify it. Select the record name to be verified and click the
button as shown.
The latest count of the application in the area will be the first multivalue set.
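The change-tracking behaviour described above can be sketched as follows. This is an illustrative model only (the function name record_counts and the dictionary layout are invented for the example); the tool itself stores the counts as multivalue sets in DATA.MIG.COUNT.

```python
# Sketch of how DM.INFORMATION could track load counts: each verification
# inserts the latest LIVE/NAU/HIS counts as the FIRST multivalue set, so the
# newest figures always come first and earlier counts are preserved below.
def record_counts(history, live, nau, his):
    history.insert(0, {"LIVE": live, "NAU": nau, "HIS": his})
    return history

counts = []
record_counts(counts, live=100, nau=5, his=0)    # first verification
record_counts(counts, live=250, nau=2, his=10)   # later verification
print(counts[0])                                 # latest counts come first
```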
This menu contains applications that are used while migrating data from a Client’s system to T24.
It consists of T24 Data Loading and T24 Recon Data Extract.
A sample screenshot of the screen is shown below.
Table
DM.PREREQUISITE
Purpose
Creation of folders, copying of data files to the respective folders and creating the DM.SERVICE.CONTROL (DSC) are
necessary for the load of data to T24. DM.PREREQUISITE creates the necessary folders and moves the
data files to their respective paths. We will also be able to create folders for Reconciliation.
The DSC is an integral part of data load. This application also creates specified DSCs with appropriate field values
through Open Financial Service (OFS).
OFS.SOURCE
The following OFS.SOURCE record DSC.CREATE must be available in the deployed area.
DM.PREREQUISITE – Parameters
This is the table where the parameters are provided. It can be accessed from the Command
line of the Browser.
The ID for this application must be in the format:
A 3-digit number followed by an asterisk (*), followed by a valid user name that must be present in the USER table.
(The format is mandatory.)
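The ID rule above can be expressed as a simple pattern check. A minimal sketch, assuming a plain regular expression and an in-memory stand-in for the USER table (the helper name is invented):

```python
import re

# DM.PREREQUISITE ID format: a 3-digit number, an asterisk, then a user name
# that must exist in the USER table.
def is_valid_prerequisite_id(record_id, valid_users):
    match = re.fullmatch(r"\d{3}\*(.+)", record_id)
    return bool(match) and match.group(1) in valid_users

users = {"MIGUSER", "INPUTTER"}                        # stand-in for the USER table
print(is_valid_prerequisite_id("001*MIGUSER", users))  # True: 3 digits, *, known user
print(is_valid_prerequisite_id("01*MIGUSER", users))   # False: only 2 digits
```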
DM.PREREQUISITE – Verify
After doing the initial setup of parameters, DM.PREREQUISITE is verified. On verification, four processes are triggered:
Creation of the folders VERIFY01, VERIFY02, VERIFY03, LOAD01, AUTH01 and DMV under the specified folders.
Copying of the data file to the above folders in their respective paths.
Creation of DSC records via OFS.
Creation of Reconciliation folders.
If DMD records with function ‘R’ or ‘D’ are not available, the DSC will not be created. This will be captured as an override.
For other DMDs with function ‘V’, ‘I’ or ‘A’, if the record is not present, it will be an error.
After committing the record, we need to verify it. Type the record name to be verified and click the
button as shown.
The folders are created for Validate, Input and Authorise and also the Reconciliation folders are created as shown
below and the data file is also copied to the respective path.
TAFC:
TAFJ:
Table
DATA.MAPPING.VERIFICATION
Purpose
The table is used to do the initial setup for the verification of the Data Mapping Sheet and the
DM.MAPPING.DEFINITION.
Detailed Working Procedure
This is the table where the parameters are provided. It can be accessed from the Command line of the Browser.
The ID for this application can be up to 65 characters.
After committing the record, we need to verify it. Type the record name to be verified and click the
button as shown.
The compared DMV fields and the DMD fields are populated in the file as below.
By providing the data file name only, the values in the data file will be validated and the PASS/FAIL status will be
updated based on the validation status of each record. The validation status of each record will be stored in a
separate file as a result.
The record values in the data file can be validated based on the verification type provided by the
user. The following are the verification types:
FIRST
POSITION
RANDOM
1. FIRST
If the datafile name is given and the verification type is “FIRST” then only the first string in the datafile will be
validated and the result for that particular record will be stored in a separate file in the DMV folder. A sample
record is provided below for reference.
On verifying the DMV, the PASS/FAIL status will be updated and the result will be stored in the DMV folder. Only
the file for data file will be created as the field verification is not provided in this case.
TAFC:
The result will be stored in the DMV folder as below.
TAFJ:
Screen shot for the text and log files location.
The status of the validated datafile will be populated in the file as below.
2. POSITION
If the datafile name is given and the verification type is “POSITION” then it’s mandatory to provide values for the
fields “FIRST.POSITION” and “LAST.POSITION”. The number of strings present between the two position values in
the datafile will be validated and the result for those records will be stored in a separate file in the DMV folder.
On verifying the DMV, the PASS/FAIL status will be updated and the result will be stored in the DMV folder. Only
the file for data file will be created as the field verification is not provided in this case.
3. RANDOM
If the datafile name is given and the verification type is “RANDOM” then a string will be picked randomly from
datafile and will be validated. The result for that particular record will be stored in a separate file in the DMV
folder. A sample record is provided below for reference.
On verifying the DMV, the PASS/FAIL status will be updated and the result will be stored in the DMV folder. Only
the file for datafile will be created as the field verification is not provided in this case.
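The three verification types can be summarised in one selection step. The sketch below is illustrative only (the function and argument names are invented); it shows which records each type picks from the data file before validation:

```python
import random

def select_records(lines, verification_type, first_position=None, last_position=None):
    if verification_type == "FIRST":
        return lines[:1]                       # only the first string in the file
    if verification_type == "POSITION":
        # FIRST.POSITION and LAST.POSITION are mandatory; the strings between
        # the two 1-based positions (inclusive) are validated
        return lines[first_position - 1:last_position]
    if verification_type == "RANDOM":
        return [random.choice(lines)]          # one randomly picked string
    raise ValueError("unknown verification type")

datafile = ["REC1|...", "REC2|...", "REC3|...", "REC4|..."]
print(select_records(datafile, "FIRST"))
print(select_records(datafile, "POSITION", first_position=2, last_position=3))
```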
Table
DM.MAPPING.DEFINITION
Purpose
This is the base mapping definition table, which maps the incoming data with T24™ fields.
The following items are defined here.
T24™ table to be loaded
Type of load to be done (example – OFS or flat jBASE write)
Type of action to be performed (example – only validate the data, or actual update)
Function to be used (allowed functions are Input, Authorise, Delete, Reverse); if not mentioned, the Input function is used by default
Way to identify each field in a transaction/record (example – by delimiter or position)
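The two ways of identifying fields mentioned above (by delimiter or by position) can be sketched as below; the field values and positions are invented for illustration:

```python
# Identify fields in an incoming record either by a delimiter character or
# by fixed 1-based character positions (fixed-width layout).
def fields_by_delimiter(record, delimiter):
    return record.split(delimiter)

def field_by_position(record, start, end):
    return record[start - 1:end]               # 1-based, inclusive positions

delimited = "100045,USD,1500.00"
print(fields_by_delimiter(delimited, ","))     # ['100045', 'USD', '1500.00']
fixed = "100045USD1500.00"
print(field_by_position(fixed, 7, 9))          # 'USD'
```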
DMD Record:
The below screenshot shows the $ARC file of the FUNDS.TRANSFER record.
600000002 to 600000010
The first multi value for APPL.FIELD.NAME must be DATA.UPLOAD.GRP with FIELD.POSITION as 1. The second multi
value for APPL.FIELD.NAME must be DATA.UPLOAD.SEQ with FIELD.POSITION as 2.
This configuration and sequence must be maintained for Sequential Upload. For Non Sequential Upload, these
values can be ignored.
The screen shot for reference.
Data File
The first block is Group 1 and the second block is Group 2. (N number of groups can be included in the data
file.) The first field in the data file must be the Group ID, followed by the second field, which must be the sequence
number for the respective group.
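The group/sequence layout described above can be sketched as follows, assuming a comma-delimited file (the delimiter and record contents are invented for the example):

```python
# The first field is the Group ID, the second is the Sequence number within
# that group; records must be processed in sequence order per group.
def group_and_sequence(record):
    parts = record.split(",")
    return parts[0], int(parts[1])

data_file = [
    "GRP1,1,CUSTOMER,100045",
    "GRP1,2,ACCOUNT,1000450001",
    "GRP2,1,CUSTOMER,100046",
]
for rec in sorted(data_file, key=group_and_sequence):
    print(group_and_sequence(rec))
```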
Sequential Upload Process
Upload
Verify the DM.SERVICE.CONTROL>I.LIMITU.
The F.DM.SERVICE.DATA.FILE will have 10 record keys, one for each of the 10 groups in the data file.
The OFS.REQUEST.DETAIL (ORD) lists all the records in the data file, as processed.
Processed OFS.REQUEST.DETAIL
Uploaded Records:
Global Limit: 600000001.0010000.01
It is a program used to create the Versions and DMDs (DM.MAPPING.DEFINITION) for any inputtable
application in T24. This program is executed at the jshell prompt. During program execution, it will prompt for:
Application name
Suffix of DMD name
Once the user provides the appropriate values, the program takes care of creating the respective Versions (if the
record does not exist in the environment) for the corresponding DMD, and also the DMDs in the environment.
Prerequisites
The following record must exist as a prerequisite for successful execution of this routine:
VERSION > VERSION,RNT
Process Flow:
Once the application name is keyed in, all inputtable fields are extracted from STANDARD.SELECTION for that
particular application and then the DMD is created.
AUTO.DMD.CREATION is modified now. Previously it was working as a mainline routine. Now, it has been
converted to a service.
Please follow the below steps for running the service AUTO.DMD.CREATION.
Below is the sample screenshot of the automatically created DMD for the CUSTOMER application.
DM.MAPPING.DEFINITION>V.SUFFIX.
DM Service Control
Table
DM.SERVICE.CONTROL
Purpose
This work-file application is used to control the actual load process. The following details must be provided when
the DM.SERVICE.CONTROL is set up.
Load Company – The Company for which the data is loaded. If this information is not provided in the incoming tape, it must be provided here.
Data file directory and file name – A valid directory name and file name.
Number of Sessions – The number of sessions with which the TSA.SERVICE must be run.
Run Status – To Request for a STOP of the TSA.SERVICE.
DM.SERVICE.CONTROL>I.CUSTOM
TSA.WORKLOAD.PROFILE record
The number of sessions in the DM.SERVICE.CONTROL record is updated as number of agents in this record.
TSA.WORKLOAD.PROFILE>DM.SERVICE-I.CUSTOM
As can be seen, the user in the TSA.SERVICE record is updated from the DM.SERVICE.CONTROL.
On verification, the status of the DM.SERVICE.CONTROL record is marked as ‘START’. The same is updated into
TSA.SERVICE record as well. The control log is updated with the starting
request details.
Then start the TSM.
Data File:
TAFC:
TAFJ:
Immediately the TSA.SERVICE is marked for STOP. The agents will stop after processing the current records.
Modifying the number of agents at Run time
To modify the number of agents during run time, pick up the DM.SERVICE.CONTROL record in Verify mode. Update
the number of sessions as required. This will be updated in the TSA.WORKLOAD.PROFILE application. Once the
TSM reaches its review time, it will either spawn more or fewer agents as required (in Phantom mode) or request
the running of more or fewer agents (in Debug mode).
Right now this option is available only in Classic.
Updating of Time Field in DSC:
The DSC is enhanced to capture the Start and Stop time fields when the service is started. The field
DSC.TYPE is included in DM.SERVICE.CONTROL and can be set to either DME or DMD. If the ID of the
DM.SERVICE.CONTROL record is from the DM.MAPPING.DEFINITION application, the value DMD is given; if the ID
is from DM.EXTRACTOR, DME is given in the field DSC.TYPE.
DM.SERVICE.CONTROL ID from DM.MAPPING.DEFINITION
If the id of the DM.SERVICE.CONTROL is from DM.MAPPING.DEFINITION application then the value DMD is given.
DM.SERVICE.CONTROL>V.CUSTOM
After the extraction is complete we will find that the time fields in DSC have been updated.
DM.SERVICE.CONTROL>V.ACCINT
Accept Override.
On verifying DM.SERVICE.CONTROL the error details will be moved to DM.SERVICE.CONTROL.HIST and the error details in DSC will be removed.
The TSA.WORKLOAD.PROFILE is updated correctly. The agents required are updated as 4 once the corresponding
DSC is verified after the number of agents has been updated in the DSC record.
TSA.WORKLOAD.PROFILE>DM.SERVICE-V.CUSTOM
Service Control
Table
TSA.SERVICE
Purpose
The Menu is used to Start or Stop the TSA.SERVICE record TSM or any other service that needs to be started or
stopped manually.
Service Status
Enquiry
ENQUIRY DM.ENQ.TSA.STATUS
Purpose
This enquiry provides the information for all Data Migration services as well as the TSM status.
Once the TSA.SERVICE has been marked as START, if the TSM is running in phantom mode the TSA (agent) will be
started automatically. If it is running in debug mode, then the agents will have to be initiated manually to
start the actual process.
Detailed Working Procedure
The Enquiry DM.ENQ.TSA.STATUS is used to get the details of TSA.SERVICEs and to monitor the status of the
services from T24. The selection criteria are as shown below.
We need to refresh the enquiry regularly to know the status of the services. For this we need to enable auto
refresh as shown below.
Click on the more options link.
The result will be as below and the query will refresh every 5 seconds.
Application backup
Table
DM.APP.BACKUP
Purpose
Taking a backup of the application is a necessary step in the process of migration. The following steps explain the
usage of the application DM.APP.BACKUP, by which we can take the backup of any application.
The entire operation can be successfully completed using the Browser.
DM.APP.BACKUP - parameters
This is the table where the parameters are provided. It can be accessed from the MENU line of the Browser.
The Folder for taking the Application backup will be created under DM.BACKUP.
After the transaction is processed, the TSA.SERVICE is automatically stopped and the BATCH record is updated
accordingly.
Running the Service
1. The TSA.SERVICE, DOB-APP.CUSTOMER will be started automatically on verifying the
application DM.APP.BACKUP.
2. Start the TSM by inputting and authorising the TSA.SERVICE record TSM to “Start”.
3. Initiate the TSM in the classic mode like below.
4. The records from the APPLICATION will be moved to the backup folder.
After:
Count the number of records in the backup folder to verify that all the records have been moved.
AA.MAPPING
Table
DM.AA.MAPPING
Purpose
This application is used to extract the AA products with their respective Properties and Property class.
Description: The description of your report/interface.
Extract Type: Specifies the task that will be done when the application is verified. FULL is specified for the entire
extract of AA Products and Properties. The options ‘Only Product’ and ‘Properties’ are reserved for future use.
Product: This field is reserved for future use.
Input a record.
The output will be stored in AA.DMS.OUT directory under the particular product line name.
DM.PRE.PROCESS (Multi-Threading)
Table
DM.PRE.PROCESS
Purpose
The scope of this section is to explain Multi-Threading in DM.PRE.PROCESS in detail.
Detailed Working Procedure
In DM.SERVICE.CONTROL, the field Pre Process is included. The field is mandatory and its value can be provided,
based on the vetting table, as SINGLE or MULTI.
DM.SERVICE.CONTROL screenshot with field included:
DM.SERVICE.CONTROL>V.TESTPP1
Changes in DM.SERVICE.CONTROL.RUN
The following are the changes done in DM.SERVICE.CONTROL.RUN
Table
DM.PREREQUISITE
Purpose
To create the necessary directories for reconciliation extracts.
T24 Data Extract for Reconciliation
Table
DM.EXTRACTOR
Purpose
The application DM.EXTRACTOR is used for extraction of data from T24.
Service Control
Table
TSA.SERVICE
Purpose
To start and stop the services during extraction process.
Service Status
Table
DM.PREREQUISITE
Purpose
To monitor the services status during extraction process.
Directory Creation for Reconciliation extract
Table
DM.PREREQUISITE
Purpose
To create the necessary directories for reconciliation extracts.
DM.PREREQUISITE
This is the table where the parameters are provided. It can be accessed from the Command line of the Browser.
The ID for this application must be in the format:
A 3-digit number followed by an asterisk (*), followed by a valid user name that must be present in the USER table.
(The format is mandatory.)
DM.PREREQUISITE – verify
After doing the initial setup of parameters, DM.PREREQUISITE is verified. On verification, the below process is
triggered: creation of the Reconciliation folders.
After committing the record, we need to verify it. Type the record name to be verified and click the
button as shown.
Table
DM.EXTRACTOR
Purpose
The application DM.EXTRACTOR is used for extraction of data from T24.
DM.EXTRACTOR – parameters
This is the table where the parameters are provided. It can be accessed from the Command line of the Browser.
There are no specifications for the ID of this application.
DM.EXTRACTOR – authorize
After doing the initial setup of parameters, DM.EXTRACTOR is authorised. On authorisation, three tables are updated:
1. TSA.WORKLOAD.PROFILE is updated/created with Id as
DM.EXTRACTOR-<ID of DM.EXTRACTOR>
2. TSA.SERVICE is updated/created with Id as
DM.EXTRACTOR-<ID of DM.EXTRACTOR>
3. BATCH is updated/created with Id as
DM.EXTRACTOR-<ID of DM.EXTRACTOR>
After the transactions are processed, the TSA.WORKLOAD.PROFILE, TSA.SERVICE and BATCH records are created as
below.
TSA.WORKLOAD.PROFILE
TSA.SERVICE
BATCH
The output file specified in the application is generated in the output directory.
After authorising the record, the service has to be started as per the below screen shot.
Once the service is complete, the OFS will be listed as shown below.
OUTPUT:
TSA.SERVICE record:
On authorisation of the DM.EXTRACTOR record a record in TSA.SERVICE will be created.
Sample Output :
Example:
VALUE[F] : To append to the first position of the field value
VALUE[L] : To append to the last position of the field value
VALUE[NUMBER] : To append to any position in the field value
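The three append positions can be modelled in a few lines. This is an illustrative sketch only (the helper name is invented); the tool expresses the same idea with the VALUE[F], VALUE[L] and VALUE[NUMBER] syntax:

```python
def append_value(field, value, position):
    if position == "F":                        # VALUE[F]: at the first position
        return value + field
    if position == "L":                        # VALUE[L]: at the last position
        return field + value
    # VALUE[NUMBER]: insert at the given 1-based character position
    return field[:position - 1] + value + field[position - 1:]

print(append_value("100045", "GB", "F"))       # 'GB100045'
print(append_value("100045", "00", "L"))       # '10004500'
print(append_value("100045", "-", 4))          # '100-045'
```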
The field values are provided as given in the below screen shot to extract from more than one application using
the append condition.
Create a record in the DM.EXTRACTOR.
After running the service, target file has been created in the output directory.
Extracted file.
Trim:
Trim is the condition used to remove a value from the position mentioned within the field value.
The format is [position].
Example:
[number1, number2]
[13,19] : Trim from 13th position to 19th position
Or [number]
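The trim formats [number1,number2] and [number] can be sketched as one function (the helper name is invented for illustration):

```python
# Remove the characters between two 1-based positions (inclusive), or a
# single character when only one position is given.
def trim(field, first, last=None):
    if last is None:
        last = first                           # single-position form [number]
    return field[:first - 1] + field[last:]

account = "GB12345678901234567890"
print(trim(account, 13, 19))                   # trims positions 13 to 19
print(trim("100-045", 4))                      # '100045'
```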
The field values are provided as given in the below screen shot to extract from more than one application using
the trim condition.
After running the service, target file has been created in the output directory.
Extracted file.
Replace:
Replace is the condition used to replace one value with another at the position mentioned in the
field value.
The format is VALUE[POSITION].
Example:
VALUE [number1, number2]
Or VALUE[number]
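The replace formats VALUE[number1,number2] and VALUE[number] can likewise be sketched as a single function (the helper name is invented):

```python
# Overwrite the characters between two 1-based positions (inclusive) with
# VALUE, or the single character at one position.
def replace_value(field, value, first, last=None):
    if last is None:
        last = first                           # single-position form VALUE[number]
    return field[:first - 1] + value + field[last:]

print(replace_value("100-045", "0", 4))        # '1000045'
print(replace_value("USD1500", "EUR", 1, 3))   # 'EUR1500'
```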
The field values are provided as given in the below screen shot to extract from more than one application using
the replace condition.
After running the service, target file has been created in the output directory.
Extracted file.
Specific Conversions
For specific conversions, the FLD.TYPE field should be set to MANIPULATE.
System Variables
Variables like !TODAY, !USER, !NEXT.WORKING.DAY, etc. can be used.
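A minimal sketch of how such variables could be resolved at extraction time. The variable names come from the text; the resolution logic below (including the naive next-working-day calculation, which ignores the HOLIDAY table) is invented for illustration:

```python
from datetime import date, timedelta

def resolve_system_variables(value, today, user):
    next_working_day = today + timedelta(days=1)   # naive: no holiday lookup
    substitutions = {
        "!NEXT.WORKING.DAY": next_working_day.strftime("%Y%m%d"),
        "!TODAY": today.strftime("%Y%m%d"),
        "!USER": user,
    }
    for name, resolved in substitutions.items():
        value = value.replace(name, resolved)
    return value

print(resolve_system_variables("!TODAY/!USER", date(2021, 5, 8), "MIGUSER"))
```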
DM.EXTRACTOR Record
EXTRACTED DATA:
Arithmetic Operation
EXTRACTED DATA:
Absolute value
Absolute value can be extracted. FLD.NAME should be set to A DMFLD.ONLINE.BALANCE.
DM.EXTRACTOR Record
EXTRACTED DATA:
Substitute
DM.EXTRACTOR Record
EXTRACTED DATA:
Convert
EXTRACTED DATA:
Extract
The values in the record are extracted from position N1 to N2.
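The Extract operation reduces to a character slice. A sketch, with invented sample data:

```python
# Take the characters of a field value from position N1 to N2 (1-based,
# inclusive), as the Extract operation does.
def extract(value, n1, n2):
    return value[n1 - 1:n2]

print(extract("GB29NWBK60161331926819", 5, 8))   # 'NWBK'
```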
DM.EXTRACTOR Record:
EXTRACTED DATA:
In order to create an internal account, we should define the three fields LOCAL.CCY, DC.TOA.CATEGORY and
DC.DAO.START.ID.
Where,
DC.TOA.CATEGORY – the take-over account category, which must be unused, must be specified.
DC.DAO.START.ID – the Department Account Officer (DAO) ID, which is a part of the DATA.CAPTURE ID, must be specified.
The value “OPEN.ACTUAL.BAL” in the field FLD.NAME will extract the values for the fields Sign, Transaction
Reference and Amount FCY/Amount LCY as mentioned below.
The final extracted file with the balanced batches is stored in “DC.BALANCED.LIST.txt” and the same must be
considered.
LD.SCHEDULE.DEFINE (future repayment) Extraction
The data is extracted from LD and LD.SCHEDULE.DEFINE (future repayment); based on the multi-value field, the future
repayment will be extracted.
Before extracting the data, the LD.SCHEDULE.DEFINE fields in the DME record must be in the below format. The @ID
is only required for extracting the future repayment data. Here, the LD records with a future repayment schedule will
be extracted.
DM.EXTRACTOR Record:
The FLD.APP field must have the value LD.SCHEDULE.DEFINE, FLD.NAME and FLD.APP.FLD must have the value ‘@ID’,
and DATACAPTURE must be set to ‘NO’.
After starting the service for the DME record, the LD contracts with future repayment will be extracted.
Extracted file:
The FILE.TYPE field must be provided with the value $ARC for extracting archived records. If the field FILE.TYPE is
left blank the live file will be extracted for the particular application.
After starting the service for the DME record, the LD contracts with future repayment will be extracted.
Extracted file:
Service Control
Table
TSA.SERVICE
Purpose
The Menu is used to Start or Stop the TSA.SERVICE record TSM or any other service that
needs to be started or stopped manually.
Service Status
Enquiry
ENQUIRY DM.ENQ.TSA.STATUS
Purpose
This enquiry provides the information for all Data Migration services as well as the TSM status.
Once the TSA.SERVICE has been marked as START, if the TSM is running in phantom mode the TSA (agent) will be
started automatically. If it is running in debug mode, then the agents will have to be initiated manually to
start the actual process.
Detailed Working Procedure
The Enquiry DM.ENQ.TSA.STATUS is used to get the details of TSA.SERVICEs and to monitor the status of the
services from T24. The selection criteria are as shown below.
We need to refresh the enquiry regularly to know the status of the services. For this we need to enable auto
refresh as shown below.
Click on the more options link.
The result will be as below and the query will refresh every 5 seconds.
T24 to T24 migration has the same menus as in Client System to T24 Migration and the procedure is also the same as
seen above.
The below routines create the dump file from the source environment.
T24.DAB.SOURCE.BUILD
T24.DAB.SOURCE.BUILD.LOAD
T24.DAB.SOURCE.BUILD.SELECT
T24.DAB.SOURCE.BUILD.POST
The below routines create the dump file from the target environment.
T24.DAB.TARGET.BUILD
T24.DAB.TARGET.BUILD.LOAD
T24.DAB.TARGET.BUILD.SELECT
T24.DAB.TARGET.BUILD.POST
The below routines create the output dump file, which is a comparison of the source and target dumps, in the target
environment.
T24.DAB.COMPARE.BUILD
T24.DAB.COMPARE.BUILD.LOAD
T24.DAB.COMPARE.BUILD.SELECT
T24.DAB.COMPARE.BUILD.POST
The Data Audit Build tool is developed for all the data tables.
In this document, we have shown both AUDIT and NON-AUDIT field data tables. Similarly, the procedure can be
used for all the data tables.
Steps to be done in Source environment
RECORD CREATION
The inputs required for DAB have to be provided in this record, as shown in the below screenshot.
Browser View:
Putty View:
SOURCE.SL.ID – the saved list ID with the selected applications has to be provided in this field. If not, the build will
be created for all the applications.
The table names have to be provided in the saved list, as shown in the below screenshot.
TAFJ:
To view records from the template T24.DAB.INFO give S as shown below in the screen shot.
The data record is shown below from the template T24.DAB.INFO.
Putty View:
Putty View:
This T24.DAB.SOURCE.BUILD service will pick the data from T24.SAB.INFO & T24.DAB.INFO.
Start the service “T24.DAB.SOURCE.BUILD” to create the source build.
STEP 3: Start the TSM service
TAFJ:
Check whether the date and status has been updated in T24.SAB.INFO.
STEP 5: Check the source dump in T24.DAB/SOURCE folder under the bnk.run
1. The source dump file will be saved in the below directory.
../bnk.run/T24.DAB/SOURCE
The dump saved in the SOURCE path will be named T24.DAB.SOURCE.BUILD (current system date & time).
TAFJ:
1. Source dump file will be saved in the below directory.
../TAFJ/T24.DAB/SOURCE
Please select the data table dump folder from Source dump files.
And View of the record contents in the data dump is shown in the below screen shot.
Then pack all the records from T24.DAB.INFO and release in the target environment.
Putty View:
TAFJ:
Check whether the date and status has been updated in T24.SAB.INFO.
To view records from the template T24.DAB.INFO give S as shown below in the screen shot.
STEP 9: Check the target dump in the T24.DAB/TARGET folder under the bnk.run
The target dump file will be saved in the below directory.
../bnk.run/T24.DAB/TARGET
The dump will be created in the TARGET path with the ID T24.DAB.TARGET.BUILD (current system date & time).
Please select the particular data table dump folder from Target dump files.
View of the record contents in the target data dump is shown in the below screen shot.
TAFJ:
The target dump file will be saved in the below directory.
../TAFJ/T24.DAB/TARGET
Please select the particular data table dump folder from Target dump files.
View of the record contents in the target data dump is shown in the below screen shot.
Steps to be done for comparison between Source dump and Target dump
STEP 1: Clear files
Before running every service, the below-mentioned files have to be cleared:
CLEAR.FILE F.TSA.STATUS
CLEAR.FILE F.BATCH.STATUS
CLEAR.FILE F.EB.EOD.ERROR
Before running the service, see in the below screenshot that SOURCE.STATUS and TARGET.STATUS are completed, and then
input REVERSE.COMPARE – Y/N.
Since we have only test data, the volume is much smaller than client data, and the reverse process will take
more time to compare. If you wish to continue, press:
Y – comparison of the reverse process, which compares the target dump to the source dump
N – Skip the reverse process comparison
TAFJ
See in the below screenshot that, after the dump creation is completed, the T24.SAB.INFO application includes the date
and completed status.
STEP 6: Check the result dump in the T24.DAB/RESULT folder under the bnk.run
1. The result dump file will be saved in the below directory.
../bnk.run/T24.DAB/RESULT
The dump will be saved in the RESULT path with the ID T24.DAB.RESULT.BUILD (current system date & time).
TAFJ:
The result dump file will be saved in the below directory.
../bnk.run/T24.DAB/RESULT
The dump will be saved in the RESULT path with the ID T24.DAB.RESULT.BUILD (current system date & time).
STEP 7: Result files
The result files will be created as:
T24.DAB.MISMATCH.BASE
T24.DAB.SOURCE.BASE
T24.DAB.TARGET.BASE
T24DAB.AA.ACTIVITY.CLASS.SOURCE.NEW
T24DAB.AA.ACTIVITY.CLASS.TARGET.NEW
T24.DAB.MISMATCH.BASE – The field level mismatches between source and target will be captured in this file.
TAFJ:
T24.DAB.MISMATCH.BASE – The field-level mismatches between source and target will be captured in this
file.
T24.DAB.SOURCE.BASE- The new records in the source environment will be captured in this file.
T24.DAB.TARGET.BASE - The new records in the target environment will be captured in this file.
T24DAB.AA.ACTIVITY.CLASS.TARGET.NEW- The new record contents in target environment will be captured in this
file.
TAFJ:
T24DAB.ACCOUNT.SOURCE.NEW
T24DAB.ACCOUNT.TARGET.NEW
The below routines create the output dump file, which is a comparison of the source and target dumps, in the target
environment.
T24.OAB.COMPARE.BUILD
T24.OAB.COMPARE.BUILD.LOAD
T24.OAB.COMPARE.BUILD.SELECT
T24.OAB.COMPARE.BUILD.POST
The Object Audit Build tool is developed for all the Local and Core Source.
TAFC:
Create base in source environment.
Step 1: Save list creation:
Create a saved list (User defined name) and give the list of BP names.
JED &SAVEDLISTS& JSHOW.COMMAN.BP
Create the test3lib under the TAFJ Path and transfer the checked jar in binary mode
t24@localhost:DEV>CREATE-FILE test3lib TYPE=UD
Before running each and every service, clear the below-mentioned files:
CLEAR.FILE F.TSA.STATUS
CLEAR.FILE F.BATCH.STATUS
CLEAR.FILE F.EB.EOD.ERROR
The inputs required for OAB have to be provided in this record, as shown in the below screenshot.
Step 3: RECORD CREATION
OAB.SOURCE.SL.ID – the saved list ID with the selected BPs has to be provided in this field. If not, the build will be
created for all the local and core sources.
The BP list has to be provided in the saved list, as shown in the below screenshot.
Browser View:
Putty View:
TAFJ:
Step 5: Start the T24.OAB.SOURCE.BUILD service
Start the TSM first and then start the service T24.OAB.SOURCE.BUILD in Auto mode.
Browser View:
Putty View:
Start the service “T24.OAB.SOURCE.BUILD” to create the dump with the source provided in the saved list (if a
saved list is provided) or with all the BP lists.
Step 6: Start the TSM service
TAFJ:
Step 8: Check the source dump in T24.OAB/SOURCE folder under the bnk.run
The Output structure of the base from the Source environment is as follows:
1. The source base file will be saved in the below directory.
../bnk.run/T24.OAB/SOURCE
TAFJ:
../bnk.run/T24.OAB/SOURCE
The base will be saved in the SOURCE path as T24.OAB.BASE.SOURCE, stamped with the present system date &
time.
To view the source base select the file using the following command in the specified path.
TAFJ:
TAFJ:
JED &SAVEDLISTS& TSHOW.LIB
Start the TSM first and then start the service T24.OAB.TARGET.BUILD in Auto mode.
Step 4: Start the TSM
Putty View:
Start the service “T24.OAB.TARGET.BUILD” to create the dump with the source provided in the saved list (if a
saved list is provided) or with all the BP lists.
Step 6: Start the TSM service
TAFJ:
Check whether the date and status has been updated in T24.SAB.INFO.
Step 8: Check the target dump in the T24.OAB/TARGET folder under the bnk.run
The Output structure of the base from the Target environment is as follows:
The target base file will be saved in the below directory.
../bnk.run/T24.OAB/TARGET
TAFJ:
./TAFJ/T24.OAB/TARGET
The base will be saved in the TARGET path as T24.OAB.BASE.TARGET, stamped with the present system date & time.
To view the source base select the file using the following command in the specified path.
TAFJ:
To view the source base, select the file using the following command in the specified path.
Putty View:
Start the service “T24.OAB.COMPARE.BUILD” to compare the routines.
TAFJ:
See the below screen shot: after the dump creation is completed, the T24.SAB.INFO application includes the date and
completed status.
Step 7: Check the source dump in T24.OAB/RESULT folder under the bnk.run
The Output structure of the comparison result from the Target environment is as follows:
The output of the comparison will be saved in the below directory.
../ bnk.run/T24.OAB/RESULT
TAFJ:
./TAFJ/T24.OAB/RESULT
TAFJ:
T24.OAB.MATCH - It is based on the comparison result between the Source and Target base. The result contains
the matched Contents and also the new routines (routines found in Source Base and not in Target Base).
All the output files will have the format:
BP.NAME>ROUTINE.NAME>DD MTH YYYY>HH:MM:SS>NEW
Where <DD MTH YYYY> is the date when the routine was last compiled,
<HH:MM:SS> is the time when the routine was last compiled, and
<NEW> applies only when a new routine exists and has been compiled in the target environment.
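The naming layout above can be illustrated with a small helper. This is a sketch: the function name is hypothetical, and the separator handling and NEW flag behaviour are read off the format described.

```python
from datetime import datetime

def audit_file_name(bp_name, routine_name, compiled_at, is_new=False):
    """Build an audit file name in the documented layout:
    BP.NAME>ROUTINE.NAME>DD MTH YYYY>HH:MM:SS>NEW"""
    parts = [
        bp_name,
        routine_name,
        compiled_at.strftime("%d %b %Y").upper(),  # date of the last compile
        compiled_at.strftime("%H:%M:%S"),          # time of the last compile
    ]
    if is_new:
        parts.append("NEW")  # only for a routine newly compiled in the target
    return ">".join(parts)
```

For example, a routine compiled on 5 August 2021 at 10:30:00 would produce `DM.BP>DM.CLEANUP>05 AUG 2021>10:30:00`.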
Below is the sample for the matched object audits.
TAFJ:
T24.OAB.UNMATCH - It is based on the comparison result between the Source and Target base. The result
contains the unmatched Contents. Below is the sample for the unmatched object audits.
TAFJ:
T24.OAB.NEW - It is based on the comparison result between the Source and Target base. The result contains the
new routines (routines found in Target Base and not in Source Base). Below is the sample for the new object
audits. There are no new routines in the pack; hence a blank file is generated.
TAFJ:
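Taken together, the three result files amount to a set comparison between the two bases. A minimal sketch follows; record contents are simplified to plain values, whereas the tool itself compares compiled object dumps.

```python
def compare_bases(source, target):
    """source/target: dict of routine name -> object content (a simplification).
    T24.OAB.MATCH   : identical contents, plus routines only in the Source base
    T24.OAB.UNMATCH : same routine name, different contents
    T24.OAB.NEW     : routines only in the Target base"""
    match = sorted(n for n in source if n not in target or source[n] == target[n])
    unmatch = sorted(n for n in source if n in target and source[n] != target[n])
    new = sorted(n for n in target if n not in source)
    return match, unmatch, new
```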
DM.REF.TABLE
It is the data migration reference table for AA Migration. It contains three fields – DESCRIPTION,
APPLICATION, FIELD.NAME. If we want to extract any specific data (field value) from any valid T24 table, we need to
specify the application name and the field names for which the values have to be extracted.
DM.AA.CUSTOMER.PROXY
Description:
The DM.AA.CUSTOMER.PROXY is a concat table containing the arrangement id and the proxy Customer id. It is a multi value
set. The concat file id must be the Customer id provided in the AA.ARRANGEMENT.ACTIVITY record. Before we
update the concat file, we need to specify the required fields in DM.REF.TABLE (Ref 4.4.1). Based on the reference
table, it updates the DM.AA.CUSTOMER.PROXY table.
Functional Process:
The arrangement id and Customer id are extracted from the AA.ARRANGEMENT.ACTIVITY table based on certain
conditions. First select the AA records whose ACTIVITY field has ‘PROXY.SERVICES-NEW-ARRANGEMENT’ and whose
FIELD.NAME has ‘PROXY’.
If all the records have the value ‘PROXY’ in the field FIELD.NAME, the corresponding proxy Customer and arrangement id
will be updated in the concat file.
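The selection above can be sketched with simplified record structures. The dictionary layout below is an assumption for illustration; in T24 these are multi-value fields of AA.ARRANGEMENT.ACTIVITY records.

```python
def build_customer_proxy_concat(activities):
    """activities: iterable of dicts with keys ACTIVITY, FIELD.NAME (a list of
    multi-values), CUSTOMER and ARRANGEMENT, standing in for
    AA.ARRANGEMENT.ACTIVITY records. Returns the concat table keyed by
    Customer id, with a multi-valued list of arrangement ids."""
    concat = {}
    for rec in activities:
        if rec["ACTIVITY"] != "PROXY.SERVICES-NEW-ARRANGEMENT":
            continue
        # only when every FIELD.NAME value is 'PROXY' is the record taken up
        if all(f == "PROXY" for f in rec["FIELD.NAME"]):
            concat.setdefault(rec["CUSTOMER"], []).append(rec["ARRANGEMENT"])
    return concat
```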
Sample: AA.ARRANGEMENT.ACTIVITY record.
T24.AA.INT.ACT.DEFAULT
The subroutine will be attached as an input routine in the specified version, with creation of the PGM.FILE and EB.API entries. This
routine will default the value ‘PROXY.ARRANGEMENT’ in the FIELD.NAME field and update the corresponding
arrangement id in the FIELD.VALUE field of the AA.ARRANGEMENT.ACTIVITY table (selected from
DM.AA.CUSTOMER.PROXY). It is a multi value set.
Prerequisites:
Create the PGM.FILE entry for the T24.AA.INT.ACT.DEFAULT.
While inputting or Loading the data via this version, the specified fields will be defaulted from the concat table
DM.AA.CUSTOMER.PROXY.
DM.AA.CUSTOMER
Description:
The DM.AA.CUSTOMER table is a concat file containing the arrangement id for the corresponding PROXY CUSTOMER. The
arrangement id field is a multi value set. The concat file id must be the Customer id provided in the
AA.ARRANGEMENT.ACTIVITY (Proxy) record. Before we update the concat file, we need to specify the required
fields in DM.REF.TABLE (Ref 4.4.1). Based on this reference table, it updates the DM.AA.CUSTOMER table.
Functional Process:
The arrangement id is extracted from the AA.ARRANGEMENT.ACTIVITY table based on a certain condition. First select
the AA records whose ACTIVITY field has 'INTERNET.SERVICES-NEW-ARRANGEMENT’ and update the arrangement id in
the concat table.
Sample: AA.ARRANGEMENT.ACTIVITY record.
T24.EB.EXT.ARR.ACT.DEFAULT
The subroutine will be attached as an input routine in the specified version, with creation of the PGM.FILE and EB.API entries. This
routine will default the values in the ARRANGEMENT field of the EB.EXTERNAL.USER table from the concat table
DM.AA.CUSTOMER.
Prerequisites
Create the PGM.FILE entry for the T24.EB.EXT.ARR.ACT.DEFAULT.
While inputting or Loading the data via this version, the arrangement id will be defaulted from the concat table
DM.AA.CUSTOMER.
This is used to convert the data file of SC.PERF.DETAIL into the required utility format file and to update the concat
file SC.PERF.DETAIL.CONCAT.
Routine Updation
DM.SC.FORMAT:
This utility is used to convert the provided data file into the RUN.PERFORMANCE.TAKEON
execution format.
Required Input for DM.SC.FORMAT:
Steps to be followed for Program execution:
Create a directory PERF.DATA in the ../bnk.run path
Enter the path and file name
In PERF.DATA directory create a file data.txt and save it
Run the program DM.SC.FORMAT; it will create a new data file from data.txt.
Expected output
Before processing by DM.SC.FORMAT, the data file is in the format:
DM.SC.PERF.CONCAT.UPDATE
The routine will update the concat file SC.PERF.DETAIL.CONCAT based on the loaded SC.PERF.DETAIL records. It
contains the SC.PERF.DETAIL id with the corresponding date (YYYYMM).
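The concat update can be sketched as a grouping by YYYYMM. The SC.PERF.DETAIL id layout used here, with a trailing YYYYMMDD date component, is an assumption for illustration.

```python
def update_perf_concat(perf_ids):
    """Group loaded SC.PERF.DETAIL ids under their YYYYMM date, mirroring
    the described SC.PERF.DETAIL.CONCAT update. Assumes ids ending in a
    '.YYYYMMDD' date component (illustrative only)."""
    concat = {}
    for rec_id in perf_ids:
        yyyymm = rec_id.rsplit(".", 1)[-1][:6]  # YYYYMM part of the date suffix
        concat.setdefault(yyyymm, []).append(rec_id)
    return concat
```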
SC.PERF.DETAIL:
Data Compare
DM.DATA.COMPARE
Purpose
The scope of the table is to compare both the Extracts and to provide the output of the comparison.
Description: The Description for the comparison.
Compare Type: The field specifies whether to compare environment or file.
Recon Mapping: The DM.COMPARE.FIELD.MAPPING table record which will be used for mapping of the reconciliation Extracts.
Id Key: This field will hold the Primary key (ID) position of the record.
Legacy File Name: This field is used for information only to store the Legacy Data Extract.
Source File Path: The directory Name where the Extract is provided by the Bank.
Source File Name: The File Name of the Extract provided by the Bank.
Target File Path: The Directory Name where the Recon Extract from T24 is located.
Target File Name: The File Name of the Recon Extract from T24.
Output File Path: The Output File Path where the reconciliation extract is available after the comparison.
Output File Name: The Output File name of the reconciliation extract after the comparison.
No. Of Agents: The number of agents that will be required to run the comparison can be mentioned in this field.
Application Name: The application name for the comparison.
Field Position.1: This field specifies the position of the field and applicable only for environment compare.
Source Field.1: This field specifies the first field of source data file.
Data File:
In the below source file, the first two record ids end with 004, 005; in the target file, the first two record ids end
with 005, 004.
SOURCE.txt
TARGET.txt
DM.DATA.COMPARE record
Record ID: TEST
Output:
The source and target new records are created in the separate file.
Mismatch records
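The key-based comparison described by the fields above can be sketched as follows. This is a simplification; the tool itself runs with multiple agents and writes its results to the configured Output File Path.

```python
def compare_extracts(source_lines, target_lines, id_pos=0, delim="|"):
    """Compare two delimited extracts on their primary-key field.
    Returns (source_only, target_only, mismatched) record ids, so a mere
    reordering of records does not count as a difference."""
    def index(lines):
        recs = {}
        for line in lines:
            fields = line.rstrip("\n").split(delim)
            recs[fields[id_pos]] = fields  # key the record by its id field
        return recs
    src, tgt = index(source_lines), index(target_lines)
    source_only = sorted(k for k in src if k not in tgt)
    target_only = sorted(k for k in tgt if k not in src)
    mismatched = sorted(k for k in src if k in tgt and src[k] != tgt[k])
    return source_only, target_only, mismatched
```

With the reordered sample above, records found in both files with identical contents are matched regardless of position, and only a genuine content difference is reported as a mismatch.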
DM.DATA.COMPARE writes the comparison errors in a file.
In the below source file, the first two record ids end with 004, 005. But in the target file, the first record is in the
second position.
SOURCE.txt
TARGET.txt
DM.DATA.COMPARE record
Record ID: TEST
After updating the error log file, the source and target concat files are cleared by the routine.
DM Scheduler
Table
DM.SERVICE.SCHEDULER
Purpose
The services for validating, loading and authorising Data in T24 need to be scheduled and automated. Most of the tasks are performed
manually and need to be eliminated. The multithread service will perform the tasks in an automated way to eliminate the manual effort to an extent.
Description: The description of the SCHEDULER.
Verification Dss: This field is used to store the DM.SERVICE.SCHEDULER ID which needs to be completed successfully in order for the current service to
be started.
Run Status: To indicate whether the service needs to be started or stopped. The two options available are START and STOP.
DM Service Id: To store the DM.SERVICE.CONTROL ID which needs to be started. This is a multi-value field.
Verification: To store the DM.SERVICE.CONTROL ID which needs to be completed successfully in order for the service, associated to the multivalued set,
to be started. This is an associated multi-value field.
Err Threshold: To specify the maximum number of errors that will be tolerated before marking the service as failed. This is an associated multi-value
field.
No Of Errors: No input field to update the number of records in error from the corresponding DM.SERVICE.CONTROL after the service is complete. This is
an associated multivalue field.
Skip Service: To specify if the service needs to be run or can be skipped. The two options are YES and NO.
Dsc Status: No input field which will be updated with values 0, 1, 2 or 3 based on the current status of the service associated with the field DM Service
Id.
Dss Status: No input field which will be updated with values 0, 1, 2 or 3 based on the current status of all the services in the SCHEDULER.
Status Detail: No input field which will be updated with some details for information only.
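The decision described by the Err Threshold, No Of Errors and Skip Service fields can be sketched as follows. The status words here are illustrative; the tool itself records numeric Dsc/Dss Status codes (0, 1, 2 or 3).

```python
def service_outcome(no_of_errors, err_threshold, skip_service="NO"):
    """Decide a scheduled service's fate from the SCHEDULER fields described
    above. Status labels are illustrative, not the tool's literal codes."""
    if skip_service == "YES":
        return "SKIPPED"      # Skip Service says the service can be skipped
    if no_of_errors > err_threshold:
        return "FAILED"       # errors exceed the tolerated maximum
    return "COMPLETED"
```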
Data File:
Output:
Once the Scheduler is completed, the DSS updates the Scheduler Status as Scheduler Completed, and also indicates
No of Errors as 0 for both (positive test case).
DM.DATA.TRANSLATION
Table
DM.DATA.TRANSLATION
Purpose
The data provided by the Client will contain values that correspond to the Legacy Data Format which cannot be
loaded into T24 without any TRANSLATION. Hence a valid TRANSLATION must be done so that the Data is converted
into a T24 Data format and can be loaded.
Application: The application for which the TRANSLATION is carried out.
Source Dir: The source directory of the data file consisting of Legacy ID is provided in this field. (Please note that any error log generated from the
TRANSLATION will be available in this directory)
Source Fname: The Datafile name consisting of Legacy ID is given.
Target Dir: The name of the output directory where the converted file will be available after the TRANSLATION is over. Any error log generated will be
available as mentioned above. By default DATA.BP is defaulted which can be modified as per the user.
Target Fname: The name of the converted data file with T24 Ids must be given here.
FM Delim: The Field Marker that is used in the datafile to delimit field values. By default ‘|’ is defaulted which can be modified as per the user.
VM Delim: The Value Marker that is used in the datafile to delimit field values. By default ‘::’ is defaulted which can be modified as per the user.
SM Delim: The Sub Value Marker that is used in the datafile to delimit field values. By default ‘!!’ is defaulted which can be modified as per the user.
Mapping Table: Any Concat file name that will be used for the TRANSLATION can be provided in this field. This is a multivalue field.
Pos Convert: The positions in the data file where the TRANSLATION has to be done and replaced with T24 Ids. This is a sub value field associated with
Mapping Table field.
Pos Value: The fixed value with which the position in the data file is replaced during the TRANSLATION. This is an associated
sub value set with the Pos Check field.
Pos Routine: A subroutine can be attached to this field that will be used for TRANSLATION logic. This is an associated sub value set with Pos Check field.
No. Of Service: The number of agents that will be required to run the TRANSLATION can be mentioned in this field.
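The positional TRANSLATION described above can be sketched in a few lines. This is a simplification, assuming the default FM ('|') and VM ('::') delimiters and a plain dictionary standing in for a concat Mapping Table.

```python
def translate_record(line, mapping, positions, fm="|", vm="::"):
    """Replace legacy ids with T24 ids at the given 1-based field positions,
    using the default FM and VM delimiters described above. `mapping` stands
    in for the concat Mapping Table (legacy id -> T24 id)."""
    fields = line.split(fm)
    for pos in positions:
        values = fields[pos - 1].split(vm)  # multi-values share one position
        fields[pos - 1] = vm.join(mapping.get(v, v) for v in values)
    return fm.join(fields)
```

Values with no entry in the mapping are left as-is, so untranslated positions pass through unchanged.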
Output
Thus any changes that are to be done in the data file can be made through this table.
When loading the UTF-converted data file, or a data file which consists of more than 1024 characters, there should
not be any broken records in DM.SERVICE.DATA.FILE, which means no record is broken into two.
Verifying DSC to start the service:
After starting the service, while listing the OFS.REQUEST.DETAIL, it is found that there is no breakage of records even if they
consist of special characters or more than 1024 characters.
In the above screen shots, it is found that there is no truncation of records even if there are special characters or more
than 1024 characters.
DM.AA.LENDING.BAL.EXTRACT
Balances can be extracted for Reconciliation Purpose using the routine DM.AA.LENDING.BAL.EXTRACT.
Execution:
OUTPUT File:
Sample Output:
DM.AA.LENDING.SCHEDULE.EXTRACT
OUTPUT File:
Sample Output:
DM.DC.BATCH.UPDATE
DM.DC.BATCH.UPDATE can be done via DSC, and it also checks for the updation of DC.BATCH.CONTROL and
DC.DEPT.BATCH.CONTROL.
It also checks the ID format for Data Capture: the number of characters, the DC prefix and the current Julian Date
in T24.
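The ID format check described above can be sketched as follows. Only the DC prefix and the T24 Julian date are taken from the text; the overall length of 10 characters and the trailing numeric sequence are assumptions for illustration.

```python
def valid_dc_batch_id(batch_id, julian_today, length=10):
    """Check a Data Capture batch id the way the text describes: expected
    number of characters, 'DC' prefix, and today's T24 Julian date (YYDDD).
    The exact length and the trailing sequence are illustrative."""
    return (len(batch_id) == length
            and batch_id.startswith("DC")
            and batch_id[2:7] == julian_today   # YYDDD Julian date portion
            and batch_id[7:].isdigit())         # assumed numeric sequence
```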
DSC record:
If the batch already exists in the environment, then the below error will be thrown.
Mnemonic duplication
On loading the file, if there is any duplication of mnemonics, it will be captured and an error message will be created
for the same.
Starting the DM.SERVICE.CONTROL.
During AA load, if the AA product is proofed and not published, an error log is created.
AA.PRODUCT record shows that the product SMALL.BUSINESS.LOAN is in proofed status.
After running the service, if the product used in the data file is SMALL.BUSINESS.LOAN, then an error log is created as
shown below.
TDM.DATA.STRUCTURE.EXTRACT
It is a program which is used to extract all the fields and properties of any table given in the saved list.
Create the savedlists by entering the name of a specific application.
Once the savedlists name is given, it will extract the field properties of that application and store them in
the T24.DATA.STRUCTURE.LIST directory in .xls format.
ENQUIRY DM.ENQ.ISSUE.LOG
The purpose of the ENQUIRY DM.ENQ.ISSUE.LOG is to get the number of errors, the number of records validated and the
error details of the load process.
To run the enquiry.
Selection Criteria
Output of the selection showing the number of errors, the number of records validated and the error details.
UTF8 conversion
The UTF8 conversion of data file for swift character handling has to be incorporated in generic DM Tool.
The data file with Arabic character.
After verifying the DMP, we need to specify the value ‘Yes’ for converting the data file, and the converted data file is
generated in the same path.
DM.CLEANUP
The purpose of the routine is to clean up the existing DMDs, DSCs, TSAs (DM.SERVICE) and BATCH records related to DM SERVICE and VERSIONS.
The lib & bin files (GR0800005bin, GR0800005lib) and the DM.BP and T24MIG.BP folders are decataloged and deleted.
For this we need to run program DM.CLEANUP.
Prerequisites
To export the corresponding LIB and BIN :
export JBCDEV_LIB=GR0800005lib
Clean-up Process:
Conclusions
Thus the document explains the functionality of the non-core T24™ Data Migration Tool that is used for Data extract
from T24 and Load into T24 from any Client system. This could be used as a base for using the Data Migration tool in any T24
data Extract & Load process.
Published on: 14/05/2019