Technical Specifications

Scenario 1: Field Mapping


We have material data in the legacy system that has to be loaded into the SAP tables. Any duplicate records in the legacy file must be deleted, and as per the business requirement the material number field must be the primary key.

Step 1] Bring the legacy (source) file into the Designer environment.

Go to the Format tab, right-click on Flat Files, and choose New.

Step 2]
Give the file name, the root directory, the original file name, and the delimiter used between the data. If the source file has a header or title row, set Skip row header to Yes; otherwise SAP treats the title row as a data record. If you also want the column names written to the target, set Write row header to Yes.
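
To picture what these flat file settings mean, here is a minimal Python sketch of reading such a delimited file (illustrative only; the file name, delimiter, and columns are assumptions, not taken from the actual legacy file):

```python
import csv

# Assumed sample legacy file "materials.txt", comma-delimited, with a title row.
with open("materials.txt", newline="") as f:
    reader = csv.reader(f, delimiter=",")   # the delimiter between the data
    header = next(reader)                   # Skip row header = Yes: the first row is column names, not a record
    rows = [dict(zip(header, row)) for row in reader]

# Write row header = Yes would mean writing `header` out again when producing the target.
print(header)
print(rows[:3])
```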

Step 3]
Now we have the flat file in the BODS environment and can use it to load data into the SAP table.
We will drag and drop this file into the staging area.

Step 4]
Next we take the Query transform from the palette and drop it into the staging area. From the same palette we drag and drop a template table as the target; later, using this template, we will create a permanent table.
Once the source file, the transform, and the target table are in the staging area, make the connections between them.

Step 5]
Double-click the Query transform to open it for field mapping.

Select all the fields in Schema In, right-click, and select Map to Output. After this, all the fields will go to the output, which completes the field mapping.

Step 6]
As per our requirement we have to make the material number a key field. To do this, right-click the material number field in the Schema Out section and click Primary Key.

Step 7]
As per the second requirement we have to delete any duplicate records in the source file. To do this, the Select Distinct option should be enabled under the SELECT tab of the Query transform.
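
As a rough Python sketch of what Select Distinct does to the record set (illustrative only, not the BODS implementation; the sample field names and values are assumptions):

```python
# rows: records read from the legacy file, e.g. {"MATNR": "23", "MAKTX": "Bolt"}
def select_distinct(rows):
    """Drop exact duplicate records, keeping the first occurrence of each."""
    seen = set()
    unique_rows = []
    for row in rows:
        fingerprint = tuple(sorted(row.items()))   # the whole record, order-independent
        if fingerprint not in seen:
            seen.add(fingerprint)
            unique_rows.append(row)
    return unique_rows

rows = [{"MATNR": "23", "MAKTX": "Bolt"},
        {"MATNR": "23", "MAKTX": "Bolt"},    # duplicate record, removed
        {"MATNR": "24", "MAKTX": "Nut"}]
print(select_distinct(rows))
```

In this scenario the duplicates are full copies of a record, so after Select Distinct only one record per material number remains, which is what the primary key requires.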

Step 8]
Once all the settings are done as per the requirement, save the job, check it for errors, and then execute it.

Step 9]
After the execution completes, check the data in the target table to verify whether it came through as per the requirement.

Scenario 2: Joins

We have two files in the legacy system. One file contains basic information such as material number, material name, and material cost; the other file contains material number, plant (to which the material belongs), and storage location of the material. The requirement is to have all of this information (material number, material name, material cost, plant, and storage location) in a single table.

Solution: as mentioned, we have two files in the source system but we want all the data in a single table. To achieve this kind of requirement we will use joins.

Step 1] Bring the legacy (source) file into the Designer environment.

Go to the Format tab, right-click on Flat Files, and choose New.

Step 2]
Give the file name, the root directory, the original file name, and the delimiter used between the data. If the source file has a header or title row, set Skip row header to Yes; otherwise SAP treats the title row as a data record. If you also want the column names written to the target, set Write row header to Yes.

Step 3]
Now we have the flat file in the BODS environment and can use it to load data into the SAP table.
We will drag and drop this file into the staging area.

Repeat steps 1, 2, and 3 from Scenario 1 to bring the second file into the environment.

Step 4]
Next we take the Query transform from the palette and drop it into the staging area. From the same palette we drag and drop a template table as the target; later, using this template, we will create a permanent table.
Once the source files, the transform, and the target table are in the staging area, make the connections between them.

Step 5]

Double-click the Query transform to open it for field mapping.
Select all the fields in Schema In except the material number from the second file (we do not want two columns holding the same value), right-click, and select Map to Output. After this, all the selected fields will go to the output, which completes the field mapping.

Step 6]
Now go to the FROM tab and select the table for the left side of the join.

Step 7]
Select the type of join (we use an inner join, which suits our scenario), then select the right table and define the join condition.

Step 8]
In the join condition, drag and drop the material number field from the left table, add the equals symbol, drag and drop the material number field from the right table, and click OK.
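
The effect of this inner join can be pictured with a small Python sketch (illustrative only; the field names and sample values for the two legacy files are assumptions):

```python
# left_rows: first file, e.g. material number, name, cost
left_rows = [{"MATNR": "23", "NAME": "Bolt", "COST": "1.20"},
             {"MATNR": "24", "NAME": "Nut", "COST": "0.80"}]
# right_rows: second file, e.g. material number, plant, storage location
right_rows = [{"MATNR": "23", "PLANT": "1000", "STOR_LOC": "0001"},
              {"MATNR": "24", "PLANT": "1000", "STOR_LOC": "0002"}]

def inner_join(left, right, key="MATNR"):
    """Combine records whose material numbers match, like the inner join in the Query transform."""
    right_by_key = {row[key]: row for row in right}
    joined = []
    for l in left:
        r = right_by_key.get(l[key])
        if r is not None:                                                   # inner join: keep only matches
            combined = dict(l)
            combined.update({k: v for k, v in r.items() if k != key})       # no duplicate key column
            joined.append(combined)
    return joined

print(inner_join(left_rows, right_rows))
```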

Step 9]
Come back, save the job, check it for errors, and execute it. Finally, check the data in the target table to verify whether it came through as per our requirement.

Step 10]
Here the material number, material name, and cost come from the first (left) table, while the plant and storage location come from the right table.

Scenario 3: Data Scrubbing

Data scrubbing is one of the common business requirements. For this kind of requirement we add custom fields in Schema Out (manually adding fields to the target table) as per the business requirement.
We have one legacy file that contains material number, material type, industry sector, and base unit of measure. As per the requirement we have to add one field holding the description of the material type, derived from the material type field, and we want the material numbers in descending order.

Step 1]
Bring the legacy (source) file into the Designer environment.

Go to the Format tab, right-click on Flat Files, and choose New.

Step 2]
Give the file name, the root directory, the original file name, and the delimiter used between the data. If the source file has a header or title row, set Skip row header to Yes; otherwise SAP treats the title row as a data record. If you also want the column names written to the target, set Write row header to Yes.

Step 3]
Now we have the flat file in the BODS environment and can use it to load data into the SAP table.
We will drag and drop this file into the staging area.

Step 4]
Next we take the Query transform from the palette and drop it into the staging area. From the same palette we drag and drop a template table as the target; later, using this template, we will create a permanent table.
Once the source file, the transform, and the target table are in the staging area, make the connections between them.

Step 5]
Double-click the Query transform to open it for field mapping.
Select all the fields in Schema In, right-click, and select Map to Output. After this, all the fields will go to the output, which completes the field mapping.

Step 6]
Once we are done with the field mapping we have to add a new field to the target table. To do this, right-click the query name in Schema Out and select New Output Column.

Step 7]
Then give the name and data type of the field.

Once you give the name, data type, and length of the field, hit the Save button. Now you will be able to see the custom field in the target table.

Step 8]
So far we have added the custom field to the target table; now we can give that field the condition that decides what should be written in it with respect to the material type. Open the query, click on the custom field, and under the Mapping tab use the DECODE function.

When we use the DECODE function, writing the default condition is mandatory; otherwise the DECODE function cannot work.
E.g., here in the DECODE function we specify that if the MTART field (material type) is equal to ROH, then RAW MATERIAL is written in the custom field on the same row.
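
A minimal Python sketch of this DECODE logic (illustrative only; the ROH example comes from the text above, while the second condition/value pair, the default, and the sample records are assumptions):

```python
def material_type_description(mtart):
    """Mimics decode(MTART = 'ROH', 'RAW MATERIAL', ..., default) as a chain of condition/value pairs."""
    if mtart == "ROH":
        return "RAW MATERIAL"
    if mtart == "HALB":                 # assumed extra condition/value pair, not from the original example
        return "SEMIFINISHED PRODUCT"
    return "OTHER"                      # the mandatory default value

rows = [{"MATNR": "25", "MTART": "ROH"}, {"MATNR": "24", "MTART": "HALB"}]   # assumed sample records
for row in rows:
    row["MAT_TYPE_DESC"] = material_type_description(row["MTART"])
print(rows)
```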

Step 9]
Go to the ORDER BY tab, drag and drop the MATNR field, and set the sort order to descending.
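
Continuing the sketch above, the descending ORDER BY on the material number corresponds to a simple reverse sort:

```python
# Sort the output records by material number in descending order,
# mirroring the ORDER BY tab setting in the Query transform.
rows_sorted = sorted(rows, key=lambda row: row["MATNR"], reverse=True)
print(rows_sorted)
```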

Step 10]
Come back, save the job, check it for errors, and execute it. Finally, check the data in the target table to verify whether it came through as per our requirement.

On the basis of the material type we got the description in the new custom field, and we sorted the data on the basis of the material number.

Scenario 4: Lookup Function


We have data in a file or table, legacy or SAP, that contains a material number field, and the requirement is that we have to change each and every material number to a new value.
E.g.

old_material_number   new_material_number
23                    n23
24                    n24
25                    n25

As in the data above, we have to change every material number to its new material number.

Step 1]
We will create one text file with two columns, one holding the old values and the second holding the new values.
Once we create the file with the old and new data, we will load it into an SAP table and make that table a permanent table, because the lookup function only works on a permanent table.

Once we click on Import Table, it becomes a permanent table.
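
As a sketch of what that two-column text file could look like, here is a small Python snippet that writes it (the file name lookup.txt is taken from the step below; the sample values follow the example at the start of this scenario):

```python
# Old material number -> new material number pairs from the example above.
lookup_rows = [("23", "n23"), ("24", "n24"), ("25", "n25")]

with open("lookup.txt", "w", newline="") as f:
    f.write("old_material_number,new_material_number\n")   # column names
    for old, new in lookup_rows:
        f.write(f"{old},{new}\n")
```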

Step 2]
- We take the file which holds the current data.
- Take a Query transform and map all the fields into the output schema.
- Click on the material number field, then click on Functions.
- After clicking on Functions, click on the lookup function and go to lookup.txt.

Click Next to continue.

In the above screenshot:

1: We mention the permanent table name which we created in Step 1.
2: For the condition column of the lookup table, we specify that the old material number field from the permanent table is equal to the material number field from the file we are loading.
3: For the output column of the lookup table, we mention the new material number field from the permanent table.

This means that wherever the material number in the loading file is equal to the material number in the permanent table (old_material_number), the value from new_material_number belonging to the permanent table is returned.

Note: in the picture above, fields shown in the same color belong to the same table.
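
The matching logic described above can be pictured in Python (an illustration of the behaviour only, not the BODS lookup function itself; field names and sample values are assumptions):

```python
# The permanent lookup table loaded in Step 1: old_material_number -> new_material_number.
permanent_table = {"23": "n23", "24": "n24", "25": "n25"}

# Records from the file we are loading.
loading_rows = [{"MATNR": "23"}, {"MATNR": "25"}, {"MATNR": "99"}]

for row in loading_rows:
    # Where the material number equals old_material_number in the permanent table,
    # return the corresponding new_material_number; otherwise keep the value as it is.
    row["MATNR"] = permanent_table.get(row["MATNR"], row["MATNR"])

print(loading_rows)   # [{'MATNR': 'n23'}, {'MATNR': 'n25'}, {'MATNR': '99'}]
```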

Step 3]
Come back, save the job, check it for errors, and execute it. Finally, check the data in the target table to verify whether it came through as per our requirement.

Final output

Scenario 5: Data Aggregation


We have one file in which we maintain the customer and sales data, i.e., which customer is buying which material on what date. The requirement is to find out how many times the same customer is buying the same product.

Step 1]
Bring the legacy (source) file into the Designer environment.

Go to the Format tab, right-click on Flat Files, and choose New.

Step 2]
Give the file name, the root directory, the original file name, and the delimiter used between the data. If the source file has a header or title row, set Skip row header to Yes; otherwise SAP treats the title row as a data record. If you also want the column names written to the target, set Write row header to Yes.

Step 3]
Now we have the flat file in the BODS environment and can use it to load data into the SAP table.
We will drag and drop this file into the staging area.

Step 4]
We take a Query transform here, named ‘sorting’, and make the connections.

Step 5]
Here we map only the customer and material fields, because in the target table we only want these two fields along with one custom field that indicates the repetition of the order.

We want to show the count of repeated orders from the same customer. To achieve this we have to sort the data on the basis of customer number and material number; for that we take the Query transform and use it to sort the data.

Step 6]

For further operations we use one more Query transform, named ‘count’, and map the fields.

Now we have the data in sorted format.

After the sorting we have to use GROUP BY, because we want the data grouped by the combination of material and customer. Later we add one custom field which will show the repetition of the order.

Step 7]
We add one custom field named ‘no_of_count’. In this field, under the Mapping tab, we use the COUNT function; this function returns how many times the order has been repeated.
Finally we take one template table as the target.
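
Taken together, the sort, GROUP BY, and COUNT steps amount to counting how often each customer/material combination occurs, as sketched below in Python (illustrative only; field names and sample data are assumptions):

```python
from collections import Counter

# Assumed sample of the customer/sales file.
sales_rows = [
    {"CUSTOMER": "C001", "MATERIAL": "23", "DATE": "2019-12-01"},
    {"CUSTOMER": "C001", "MATERIAL": "23", "DATE": "2019-12-15"},
    {"CUSTOMER": "C002", "MATERIAL": "24", "DATE": "2019-12-20"},
]

# GROUP BY customer and material, with COUNT as the custom field 'no_of_count'.
counts = Counter((row["CUSTOMER"], row["MATERIAL"]) for row in sales_rows)

target_rows = [
    {"CUSTOMER": customer, "MATERIAL": material, "no_of_count": count}
    for (customer, material), count in counts.items()
]
print(target_rows)
```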

Step 8]
Once all the settings are done as per the requirement, save the job, check it for errors, and then execute it.

Step 9]
After completion of the execution, check the data in the target table to verify whether it came through as per the requirement.
