The main purpose of a database is to store data so that it can be used later for analysis. Any large enterprise stores its business data in a database so that the data can be used to analyze the business.
Any database (Oracle, MS Access, SQL Server, Sybase, etc.) stores data in the form of a structure called a TABLE.
Example: a Customer table with Customer Number as its primary key.
Every table must have a primary key (in exceptional cases a table can exist without a primary key).
We have 2 types of columns in a table:
1) Key columns
2) Non-key columns
Key Column: a column that is part of the primary key.
Non-Key Column: a column that is not part of the primary key.
All non-key columns act as attributes or properties of the key columns.
1) OLTP
2) OLAP
In the case of an extended star schema, the fact table is connected to the dimension tables, each dimension table is connected to SID tables, and the SID tables are connected to the master data tables.
The fact table and dimension tables are inside the cube.
The SID tables and master data tables are outside the cube.
One fact table can be connected to 16 dimension tables, and one dimension table can be assigned a maximum of 248 SID tables (248 characteristics).
Master data & SID tables:
Every characteristic InfoObject will have its own SID table to convert the alphanumeric value to a numeric value. A key figure InfoObject will not have an SID table, because the key figure value is already numeric.
Whenever we insert a value into a characteristic InfoObject, the system generates an SID number (a numeric value) in the SID table.
Each characteristic can have its own master data tables (ATTR, TEXT, HIER).
The attribute table stores all the attribute/property data.
The text table stores descriptions in multiple languages.
The hierarchy table stores the parent-child data.
As you can observe in the above picture, the Material attribute table holds attribute information (Material Group, Material Price), and the Material text table holds descriptions in multiple languages.
So when we load master data, SIDs are generated in the SID table.
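The SID generation described above can be sketched as a toy simulation (this is not SAP code; the class and value names are invented for illustration):

```python
# Hypothetical sketch: an SID table maps each alphanumeric characteristic
# value to a numeric surrogate ID, generated once per distinct value.
class SidTable:
    def __init__(self):
        self.sids = {}       # characteristic value -> SID
        self.next_sid = 1    # assuming simple sequential numbering

    def get_or_create(self, value):
        # Loading master data: generate an SID only for unseen values
        if value not in self.sids:
            self.sids[value] = self.next_sid
            self.next_sid += 1
        return self.sids[value]

material_sids = SidTable()
print(material_sids.get_or_create("MAT-A100"))  # 1
print(material_sids.get_or_create("MAT-B200"))  # 2
print(material_sids.get_or_create("MAT-A100"))  # 1 (reused, no new SID)
```

Repeating a value reuses its SID, which is why the cube can store compact numeric keys while the master data tables keep the full alphanumeric values.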
Fact Table & Dimension Tables:
Fact Table:
The fact table holds dimension IDs and key figures.
Maximum dimension IDs: 16
Maximum key figures: 233
When we model a key figure inside the fact table it acts as a fact, because the property of an InfoCube is additive.
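The additive property can be illustrated with a minimal sketch (field names and values are invented): fact rows sharing the same dimension IDs have their key figures summed.

```python
# Sketch of the additive property of an InfoCube fact table: rows with the
# same dimension-ID combination aggregate by summing their key figures.
from collections import defaultdict

fact_rows = [
    {"dim_ids": (1, 7), "quantity": 10, "revenue": 100.0},
    {"dim_ids": (1, 7), "quantity": 5,  "revenue": 50.0},
    {"dim_ids": (2, 7), "quantity": 3,  "revenue": 30.0},
]

aggregated = defaultdict(lambda: {"quantity": 0, "revenue": 0.0})
for row in fact_rows:
    bucket = aggregated[row["dim_ids"]]
    bucket["quantity"] += row["quantity"]
    bucket["revenue"] += row["revenue"]

print(aggregated[(1, 7)])  # {'quantity': 15, 'revenue': 150.0}
```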
SAP BW Architecture:
Points to be Noted regarding SAP BW:
In SAP BW we work with objects such as InfoCubes, ODS objects, InfoSources, DataSources, InfoPackages, update rules, transfer rules, and BEx queries.
In SAP BW we will have 2 types of objects:
1) Standard or Business Content objects: all standard objects have technical names starting with the number 0.
2) Custom objects: custom objects have technical names starting with Y or Z.
Pre-requisites:
Source system connection between the flat file and SAP BW.
Steps:
1) Design the Info Cube.
2) Implement the Design in SAP BW
2.1) Create the Info Area
2.2) Create the Info Object Catalogs
2.3) Create the Info Objects
2.4) Create the Info Cube
2.5) Loading the Info Cube from Flat file.
2.5.1) Create the Application Component
2.5.2) Create the Info Source
2.5.3) Assign the Data Source to Info Source
2.5.4) Activate the Transfer rules
2.5.5) Create the Update Rules
2.5.6) Create the Info Package and schedule the load
Deleting Data in Info Cube:
1) Delete by request
2) Selective deletion
3) Delete the entire cube contents
ETL Process:
When we start the InfoPackage, SAP BW triggers the loading process with the steps below:
1. SAP BW sends a request to the source system.
2. It gets an OK confirmation from the source system.
3. SAP BW then sends the data request.
4. Based on the data selections, the data is extracted from the source system.
5. The extracted data is transferred to SAP BW (transfer methods: PSA & IDoc).
6. The data is then loaded into the data target through transfer rules and update rules.
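The handshake above can be condensed into a toy walkthrough (all function and field names are invented stand-ins, not real SAP BW APIs):

```python
# Toy simulation of the load sequence: request, confirmation, data request,
# selection-based extraction, transfer, and load to the data target.
def run_load(source_data, selections):
    log = []
    log.append("request sent to source system")
    log.append("confirmation OK received")
    log.append("data request sent")
    # extraction honours the data selections from the InfoPackage
    extracted = [r for r in source_data if r["region"] in selections]
    log.append(f"extracted {len(extracted)} records")
    log.append("transferred to SAP BW (PSA/IDoc)")
    log.append("loaded to data target via transfer & update rules")
    return extracted, log

rows = [{"region": "AMR", "qty": 1}, {"region": "EUR", "qty": 2}]
extracted, log = run_load(rows, selections={"AMR"})
print(len(extracted))  # 1
```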
Transfer Methods [PSA & IDOC]:
PSA:
Persistent Staging Area
It is a 2-dimensional table.
Data is transferred directly to the PSA table, and information is transferred through Info IDocs.
When we activate the transfer rules with PSA as the transfer method, the system automatically creates the PSA table.
Every DataSource has its own PSA table.
We can find the PSA table of a DataSource by using the T-code SE11.
Structure of the PSA table: transfer structure + 4 technical fields (Request No, Data Packet ID, Record No, Partition No).
The PSA holds a replica of the data coming from the source.
We can edit data in the PSA.
Error handling is possible with the PSA.
We can reload records from the PSA to the data target by using reconstruction.
We can delete data in the PSA (generally we delete PSA data that is older than 7 days).
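The PSA record structure described above can be sketched as follows (the technical key fields follow the text; the transfer-structure fields are invented for illustration):

```python
# Sketch of a PSA record: transfer structure fields plus the 4 technical
# fields (request number, data packet ID, record number, partition number).
def make_psa_record(request_no, packet_id, record_no, partition_no,
                    **transfer_fields):
    record = {
        "REQUEST": request_no,      # loading request number
        "DATAPAKID": packet_id,     # data packet ID
        "RECORD": record_no,        # record number within the packet
        "PARTNO": partition_no,     # partition number
    }
    record.update(transfer_fields)  # replica of the source data
    return record

rec = make_psa_record("REQU_001", 1, 1, 0, CUSTOMER="C100", AMOUNT=250.0)
print(rec["REQUEST"], rec["CUSTOMER"])  # REQU_001 C100
```

Because the technical fields identify the load request and packet, reconstruction can pick out exactly the records of one request.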
Allowing Special Characters into SAP BW - RSKC [76].
RSALLOWEDCHAR
IDOC:
Intermediate Document
It is the standard used to transfer data in an SAP environment.
Data is transferred through data IDocs, and information is transferred through Info IDocs.
IDoc maintenance
With whatever data selections we run the init update, the delta update should also run with the same data selections.
In the InfoPackage we can see the delta update option only if the DataSource supports delta.
Transfer Rules:
By using transfer rules, we can map between the transfer structure and the communication structure.
Types of transfer rules:
1) Direct mapping: we use this option to map the value from a source field in the transfer structure to the target InfoObject in the communication structure.
2) Constant: we use this option to specify a fixed/constant value for the records loaded through the transfer rules.
3) Formula: we use this option to implement a formula by using the formula editor.
4) Routine [transfer routine]: we use this option to transform the data by using ABAP/4 code. When implementing a transfer routine we must refer to the fields by using the structure name TRAN_STRUCTURE, e.g. TRAN_STRUCTURE-/BIC/PRICE. When debugging, the name of the transfer routine is formed as COMPUTE_FIELDNAME.
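The four rule types can be contrasted in one sketch applied to a single source record (the structures mimic the text, but every field name is invented):

```python
# Sketch of the four transfer rule types mapping a transfer-structure
# record into the communication structure.
def apply_transfer_rules(tran_structure):
    return {
        "CUSTOMER": tran_structure["CUSTOMER"],       # 1) direct mapping
        "SOURCE": "FLATFILE",                         # 2) constant value
        "AMOUNT": tran_structure["QTY"] * tran_structure["PRICE"],  # 3) formula
        "REGION": tran_structure["REGION"].strip().upper(),         # 4) routine
    }

comm_structure = apply_transfer_rules(
    {"CUSTOMER": "C100", "QTY": 4, "PRICE": 2.5, "REGION": " amr "})
print(comm_structure["AMOUNT"], comm_structure["REGION"])  # 10.0 AMR
```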
Update Rules:
Update rules specify the mapping between the source object and the target object. We use update rules to perform all kinds of transformations. Update rules update the data into the data target.
Types of Update rules:
Keyfigure:
Source key figure or direct mapping
Formula: we use this option to implement a formula by using the formula editor.
Routine or update routine: we use this option to transform the value of a key figure by using ABAP/4 code. When implementing an update routine we must refer to the fields by using the structure name COMM_STRUCTURE, e.g. COMM_STRUCTURE-/BIC/PRICE. When debugging, the name of the update routine is formed as ROUTINE_001.
Routine with unit: we use this option to transform the value of a key figure and also the value of the unit characteristic associated with it by using ABAP/4 code.
Characteristic:
Source characteristic or direct mapping
Constant
Master data attribute of: we use this option to feed the value by doing a lookup to the master data tables.
Formula
Routine
Initial value: populates no value (NULL by default)
Time Characteristics:
Source characteristic or direct mapping (automatic time conversion)
Constant
Master data attribute of: we use this option to feed the value by doing a lookup to the master data tables.
Formula
Routine
Initial value: populates no value (NULL by default)
Time distribution: we use this option to distribute values from a higher-level time characteristic to lower-level time characteristics.
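Time distribution can be sketched as an even split of a monthly value over the days of that month (a simplified assumption; the actual distribution depends on the time characteristics involved):

```python
# Sketch of time distribution: a monthly key figure spread evenly over
# the days of that month (higher-level -> lower-level time characteristic).
import calendar

def distribute_month_to_days(year, month, value):
    days = calendar.monthrange(year, month)[1]  # number of days in month
    share = value / days
    return {day: share for day in range(1, days + 1)}

daily = distribute_month_to_days(2024, 2, 290.0)  # Feb 2024 has 29 days
print(len(daily), round(daily[1], 2))  # 29 10.0
```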
Start Routine:
Start routine is executed before individual update rules.
Start routine is executed packet by packet.
So we use a start routine to implement any kind of logic that is supposed to be executed before the update rules.
When implementing a start routine we work with the internal table DATA_PACKAGE.
Sample code:
* Delete every record whose region is not AMR
LOOP AT DATA_PACKAGE.
  IF DATA_PACKAGE-/BIC/ZCREG <> 'AMR'.
    DELETE DATA_PACKAGE.
  ENDIF.
ENDLOOP.
Return Table:
We use this option when we want to split one record from the source to multiple
records in the data target.
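The return-table behavior can be sketched like this (a one-to-many fan-out; the record layout and field names are invented):

```python
# Sketch of a return-table style split: one source record fans out into
# one target record per component.
def split_record(record):
    results = []
    for material, qty in record["components"].items():
        results.append({"ORDER": record["ORDER"],
                        "MATERIAL": material,
                        "QTY": qty})
    return results

rows = split_record({"ORDER": "O-1", "components": {"M-A": 2, "M-B": 5}})
print(len(rows))  # 2
```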
Difference between transfer rules & update rules:
Transfer rules just transfer the data; update rules update the data into the data target.
Transfer rules are specific to the source system; update rules are specific to the data target.
Different data flow designs:
One InfoSource to multiple data targets - Yes
Multiple InfoSources to a single data target - Yes
One InfoSource can be assigned multiple DataSources - Yes
The same DataSource cannot be assigned to multiple InfoSources.
Master Data:
Detailed information about any entity is called master data.
Ex: detailed information about a customer -> Customer master data.
In SAP BW, we have 3 types of master data:
ATTR
TEXT
HIER
ATTR: used to store all the attributes/properties of an entity.
TEXT: used to store all the descriptions in different languages.
HIER: used to store parent-child data.
How to load Master data ATTR & TEXT from Flatfile
Steps:
Create the Application component
Create the Info Source of type Direct Update
Assign the DataSource to InfoSource
Activate the Transfer rules for ATTR & Text DataSources
Create the Info Packages and schedule the loads
Note: we need to create one InfoPackage for the ATTR DataSource and one for the TEXT DataSource.
Hierarchies:
When do we go for hierarchies:
When the characteristics are related 1:M and, in reporting, we need to display the values using a hierarchy (tree-like display).
Types of Hierarchies:
- Hierarchy Not Time dependent
- Hierarchy Structure Dependent on Time
If we have an InfoObject 'A', when we create InfoObject 'C' by taking 'A' as the template, all the properties of 'A' are copied to 'C'. We can change the properties of 'C', and we can load separate master data for InfoObject 'C'.
When do we go with reference:
When we want to create a new master data object that is supposed to hold data which is already a subset of some other master data object, we create the master data object with reference.
Reference examples:
- Sold-to Party, Ship-to Party, Bill-to Party, and Payer are created with reference to Customer.
- Sender Cost Center and Receiver Cost Center are created with reference to Cost Center.
Converting Master Data into a Data Target:
ODS
ODS (Operational Data Store):
An ODS is also an InfoProvider, like an InfoCube.
An ODS is 2-dimensional.
The property of an ODS is: overwrite.
We prefer an ODS for detailed-level reporting.
We also use an ODS for staging.
An ODS contains 3 tables:
New Data Table
Active Data Table
Change Log
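The overwrite property can be sketched as a keyed store (a simplification with invented field names; a real ODS also records before/after images in the change log):

```python
# Sketch of the ODS overwrite property: records are keyed on the key
# fields, and a new load with the same key replaces the data fields
# instead of adding to them (unlike an additive InfoCube).
ods_active = {}

def load_to_ods(records):
    for rec in records:
        key = (rec["ORDER"],)          # key fields
        ods_active[key] = {"STATUS": rec["STATUS"],
                           "AMOUNT": rec["AMOUNT"]}  # data fields

load_to_ods([{"ORDER": "O-1", "STATUS": "OPEN", "AMOUNT": 100.0}])
load_to_ods([{"ORDER": "O-1", "STATUS": "CLOSED", "AMOUNT": 100.0}])
print(ods_active[("O-1",)]["STATUS"])  # CLOSED (overwritten, not summed)
```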
1. Active Data Table:
Technical name: /BIC/AXXXXXX00 (or /BI0/ for content objects)
Structure: all key fields [primary key] + all data fields + Recordmode
Used for reporting.
The active data table is the source when we schedule an init/full update for a data mart.
2. New Data Table:
Technical name: /BIC/AXXXXXX40 (or /BI0/ for content objects)
Structure: technical keys [loading request no + data packet no + record no] (primary key) + all key fields + all data fields + Recordmode
This is the first table where data is staged in the ODS.
Requests that are activated together will have the same activation request, so if we delete a particular request, it deletes all the other requests in the ODS with the same activation request.
Activation can run serially or in parallel.
Data Marts:
Case 1:
Loading Data from ODS to Info Cube
Pre-requisites:
1) Myself Source System Connection
2) Application Component - Data Mart (DM)
Steps:
1) Identify the Source object and the Target Object.
SO - yo_sd01
TO - yc_dm1
2) Check whether the source object has the "EXPORT GENERATE DATA SOURCE". If it is not there, we have to explicitly generate it.
SO - yo_sd01 - EGDS - 8yo_sd01
3) Connect the Source object to the Target Object with the help of Update rules.
4) Find the InfoSource that is created automatically when we build the update rules; this InfoSource is also assigned to the Myself source system connection under the application component (Data Marts).
5) Create the Info Package and Schedule the load.
Case 2:
Loading Data from Info cube to Info Cube
Pre-requisites:
1) Myself Source System Connection
2) Application Component - Data Mart (DM)
Steps:
1) Identify the Source object and the Target Object.
SO - yc_dm1
TO - yc_dm2
2) Check whether the source object has the "EXPORT GENERATE DATA SOURCE". If it is not there, we have to explicitly generate it.
3) Connect the Source object to the Target Object with the help of Update rules.
4) Find the InfoSource that is created automatically when we build the update rules; this InfoSource is also assigned to the Myself source system connection under the application component (Data Marts).
5) Create the Info Package and Schedule the load.
- How do we correct the delta load?
- Different update mechanisms to different data targets?
In the case of an ODS, we cannot do a full update after doing INIT or DELTA updates, because this will reset the delta management of the ODS. So to overcome with