Generic Extractions
1. What is the T-code for generic extraction?
A: RSO2.
2. What types of datasources can we create?
A: 1. transaction data 2. master data attributes 3. texts.
3. Why can't we generate a hierarchy datasource?
A: Because in SAP R/3 the database design of the OLTP system follows the ER model, and no hierarchies are maintained there; whenever there is a 1:many relationship, the information is split into different tables linked by primary key and foreign key relationships.
4. In what ways can we generate a generic datasource?
A: Table, view, InfoSet and function module. (Domain is available for text datasources.)
5. When we create a generic datasource it asks for three descriptions, short, medium and long; which one is mandatory?
A: All are mandatory.
6. When we are generating a datasource using a table, is the extract structure created automatically or not?
A: Yes.
7. What is the difference between creating a generic datasource using a view and using a function module?
A: In case of a function module, the extract structure is created explicitly. In case of a view, the extract structure is generated automatically.
8. When do you prefer a view and when do you prefer a function module?
A: When the relationship is 1:1 a view is preferred, and when the relationship is 1:many a function module is preferred.
9. What is the importance of the selection check box when you are generating the datasource?
A: Whichever field has the selection check box selected is made available in the data selection tab of the InfoPackage, so we can use it to extract data selectively.
9. What is the importance of the hide check box?
A: It acts on the transfer structure: whichever field you select hide for will not be available in the transfer structure, at the BW side as well as at the R/3 side.
10. When do you go for a generic datasource?
A: 1. When no business content datasource is readily available, we go for a generic datasource.
2. Even if a business content datasource is available and is already being used, if you want to simulate the same kind of extractor, we go for generic extraction.
11. Did you create generic datasources? If yes, tell me the scenarios.
A: Scenarios:
1. To extract open deliveries:
The datasource 2LIS_12_VCITM is available as part of business content to extract all the delivery item-level information. So, to extract only open deliveries, we built a generic datasource with a function module. (Logic: we used three conditions to identify an open delivery:
> A delivery is considered open if it has no PGI (post goods issue) document.
> A delivery is considered open if it has a PGI document and this PGI document does not contain a billing document.
> A delivery is considered open if it has a PGI document, this PGI document contains a billing document, and this billing document has not yet been posted to CO-PA.
To implement this logic: the database table LIPS contains all the delivery item-level information, so we collect all the records from LIPS and cross-check whether each delivery is an open delivery or not; if it is, we insert the records into E_T_DATA.
To check whether a given delivery is open or not:
> 1. No PGI document: we use the table VBFA (sales document flow) to find the PGI document for a given delivery; else
> 2. PGI document without a billing document: using VBFA, find the PGI document for the given delivery and feed this PGI document back into VBFA to find the subsequent billing document.
> 3. Billing document not yet posted to CO-PA: if a billing document is available for the given delivery, we use the VBRP table to find the posting status of the billing document; if the posting status is not equal to 'C' it is an open delivery, and if it equals 'C' it is not an open delivery.
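The three conditions above amount to a short decision chain. The following Python sketch is illustrative only: the real check runs inside an ABAP function module reading LIPS/VBFA/VBRP, and the three lookup callables here are hypothetical stand-ins for those table reads.

```python
# Illustrative sketch of the open-delivery check; the lookup callables
# stand in for VBFA/VBRP reads done in ABAP in the real extractor.

def is_open_delivery(delivery, find_pgi, find_billing, posting_status):
    """Return True if the delivery is 'open' per the three conditions."""
    pgi = find_pgi(delivery)            # VBFA: delivery -> PGI document
    if pgi is None:
        return True                     # 1) no PGI document yet
    billing = find_billing(pgi)         # VBFA: PGI -> subsequent billing doc
    if billing is None:
        return True                     # 2) PGI exists, but no billing doc
    # 3) billing exists but is not yet posted to CO-PA ('C' = posted)
    return posting_status(billing) != "C"
```

A delivery only counts as closed once every step of the document flow is complete and posted, which is why the checks are ordered from earliest to latest stage.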
Scenario 2: (this scenario is suitable only if SAP R/3 is 3.1v)
In SAP R/3 3.1v there are no business content datasources to extract FI-AR information, so we went about building a generic datasource using a VIEW built on two tables, BSID and BSAD.
Scenario 3:
The target could be loaded only with a flexible update, and a flexible update can be assigned only to a transaction datasource; but the business content datasource 0MATERIAL_ATTR is a master data attribute datasource, which cannot be used for this. To solve the problem we built the transaction data datasource ZMATERIAL_ATTR, which simulates 0MATERIAL_ATTR.
12. Can we set up delta for a generic datasource?
A: Yes.
13. What are the different ways of setting up the delta in generic extraction?
A: Calday (if we set up delta on the basis of calendar day, we can run the delta only once per day, and that too at the end of the day, to minimize the chance of missing delta records).
Numeric pointer (this type of delta is suitable only when we are extracting data from a table which supports only creation of new records, not changes to existing records; e.g. CATSDB, the HR time management table).
Timestamp (using a timestamp we can run the delta multiple times per day, but we need to use a safety lower limit and a safety upper limit of at least 5 minutes, i.e. 300 seconds).
14. In what scenarios did we set up delta in generic datasources?
A: Scenario I: for the open deliveries scenario, no delta (why? Because we always wanted to extract a snapshot of the open deliveries).
Scenario II: Yes, we set up delta here based on calday, using the posting date field.
Scenario III: When we were simulating the generic datasource for 0MATERIAL_ATTR, we implemented delta through coding in the function module.
15. Can we do datasource enhancement for a generic datasource? If yes, how?
A: Yes. Go to T-code RSA6, select the datasource, click on the 'Enhance extract structure' button and specify the fields we want to add, prefixing them with ZZ. After this, go back to T-code RSA6, select the datasource, click on the Change button and deselect the hide check box and the 'field only known in exit' check box. Then we write the enhancement code to populate the data into the enhanced fields using the enhancement RSAP0001. This enhancement contains four function exits: EXIT_SAPLRSAP_001 (for transaction data datasources), EXIT_SAPLRSAP_002 (for master data attributes), EXIT_SAPLRSAP_003 (for master data texts) and EXIT_SAPLRSAP_004 (for master data hierarchies).
16. When we are generating the datasource we specify a certain application component; what are we doing there?
A: We are assigning the datasource to the application component.
17. When you have set up delta for a generic datasource and you run a delta load at the BW side, where does the data come from?
A: RSA7 (the delta queue).
18. When you are setting up the delta in generic datasources, what are the different images we can set up?
A: We can set up two types of images: one is 'new status for changed records', the other is 'additive delta'.
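The difference between the two image types can be sketched as follows. The record layout (a single key figure called 'amount') is hypothetical, purely for illustration.

```python
# Sketch of the two delta image types named above, on a hypothetical record.

def delta_record(before, after, mode):
    if mode == "new_status":
        # 'New status for changed records': send the complete after-image;
        # the target must overwrite, hence the ODS requirement before a cube.
        return dict(after)
    if mode == "additive":
        # 'Additive delta': send only the change in the key figure, which
        # a cube can aggregate directly without overwriting.
        return {**after, "amount": after["amount"] - before["amount"]}
    raise ValueError(f"unknown delta image mode: {mode}")
```

This is why the two types have different target requirements: an after-image must replace the old record, while an additive delta can simply be summed into a cube.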
19. When I'm setting up the delta for a generic datasource, if I select the update method 'new status for changed records', can I load data directly to the cube?
A: No. (Why? If the update type is set to 'new status for changed records', it becomes mandatory to load the data into an ODS first and then from the ODS to the cube.)
20. What is the difference between business content datasources and generic datasources?
A: In case of business content datasources, the datasource is readily available in the delivered version; in order to use it we have to install that particular datasource to create a copy in the active version, using the RSA5 T-code: select the datasource and click on 'transfer datasource'. In case of generic datasources we create our own datasources. We prefer business content datasources with respect to performance.
21. What are the steps of the generic extraction method?
A: Go to RSO2, select the type of datasource, specify the name of the datasource and click on the create button. It takes you to the next screen, where we basically assign the datasource to a particular application component and specify the table name, view name, function module or InfoSet query depending upon the requirement, provide the short, medium and long descriptions and save it. It then takes you to the next screen, which shows the detailed information about the datasource and the extract structure, and allows us to customize the transfer structure using the hide check box and the selection check box (for the data selection tab); then save it to generate the datasource.
21. What is an ALE pointer (ALE delta)?
A: This is one way of setting up the delta, based on pre-defined fields defined by SAP for some tables.
22. What is a datasource?
A: A datasource defines the ES (extract structure) and TS (transfer structure) at the source system, and the TS at the BW system.
23. When does the TS at the SAP R/3 side get generated?
A: When you activate the transfer rules in BW, a transfer structure is created at the SAP R/3 side.
24. How do we find out the TS in the source system?
A: Go to SE11, select the radio button 'Data type', type in /BI* and press the function key F4; click on the 'new selection' button below, and in the short description type in *NAME OF THE DATASOURCE* and press Enter. It then gives the name of the transfer structure.

Notes (LO structures and T-codes):
MCVBAK — communication structure
2LIS_11_VAHDR — datasource; extract structure MC11VA0HDR; setup table MC11VA0HDRSETUP
LBWG — delete setup tables
SMQ1 — delete delta queue
LBWQ — delete extractor queue
LBWE — LO Cockpit (maintain extract structures, set up the V3 job)
SM13 — update queue
RSA7 — delta queue
RSA5 — transfer business content datasources
19. Did you create any user-defined information structure? If yes, what is the scenario?
A: Yes: to extract pricing information from the KONV and KONP tables, we built our own user-defined information structure S608 and extracted the data.
LO-Cockpit
1. What is the difference between LIS and LO?
A: 1. Level of information: LO provides datasources specific to the level of information, for example 2LIS_11_VAHDR (sales order header), 2LIS_11_VAITM (sales order item) and 2LIS_11_VASCL (sales order schedule line), whereas LIS provides information structures specific to sales order information as a whole.
2. LO uses the V3 update (an asynchronous background process), which gives the option of executing the LUWs in the background, whereas in LIS we use V1 or V2.
3. In LO we fill the setup tables, and once the data is completely extracted to the BW side we can delete the setup tables again using the T-code LBWG. In LIS we never delete the contents of the information structure once it is filled; apart from this, the information structure is also filled with delta records, which are of no use.
2. What are the steps for generating an LO datasource?
A: Since LO datasources are given as part of business content in the delivered version, to create a copy of the datasource in the active version go to T-code RSA5, select the datasource and click on the 'transfer datasource' button. To maintain the ready-made extract structure given along with the datasource, go to T-code LBWE, click on the 'Maintenance' link and maintain the extract structure by moving fields in from the communication structures; if a required field is not available in the communication structure, we go for enhancing the datasource. Since the extract structure is re-generated, we need to generate the datasource by clicking the datasource link, customize the extract structure using the check boxes 'selection', 'hide', 'inversion' and 'field only known in exit', and save it. Then choose the delta update mode, queued delta, from among direct delta, serialized V3 update, unserialized V3 update and queued delta, and click on the 'Inactive' link to activate the datasource. Then replicate the datasource on the BW side and start migrating the data.
3. How do we migrate the data in LO?
A: To be on the safe side, we first delete the setup tables using the T-code LBWG, specifying the application component; we also clear the delta queues using the T-code SMQ1 and the extractor queues using the T-code LBWQ, and then lock the related T-codes. Then we run the statistical setup using the T-code OLI*BW to fill the setup tables, and run the init loads or full loads to extract all the data from the setup tables to the BW side. Once that is done successfully, we go back to R/3 LBWE to set up the periodicity of the V3 job, which could be hourly or two-hourly depending on the number of transactions (the more the transactions, the more frequently the V3 job should be scheduled). Finally the locks can be released, so that users can start recording transactions again.
NOTE: These migrations are always done at weekends after office hours, and the T-codes are usually locked and unlocked by the Basis people.
4. What are the different delta update modes? Explain each.
A. Direct delta: LUWs are directly posted to the delta queue (RSA7), and we extract the LUWs from the delta queue to SAP BW by running delta loads. Direct delta degrades OLTP system performance, because when LUWs are posted directly to the delta queue (RSA7) the application is kept waiting until all the enhancement code is executed.
B. Queued delta: LUWs are posted to the extractor queue (LBWQ); by scheduling the V3 job we move the documents from the extractor queue (LBWQ) to the delta queue (RSA7), and we extract the LUWs from the delta queue to SAP BW by running delta loads. Queued delta is recommended by SAP; it maintains the extractor log, which helps us handle LUWs which are missed.
C. Serialized V3 update: LUWs are posted to the update queue (SM13); by scheduling the V3 job we move the documents from the update queue (SM13) to the delta queue (RSA7) in a serialized fashion, and we extract the LUWs from the delta queue to SAP BW by running delta loads. Since the LUWs are moved in a serial fashion from the update queue to the delta queue, an error document blocks the subsequent documents; and as it sorts the documents based on creation time, there is every possibility of frequent failures in the V3 job and of missing delta records. It also degrades OLTP system performance, as it forms multiple segments with respect to changes in the language.
D. Unserialized V3 update: LUWs are posted to the update queue (SM13); by scheduling the V3 job we move the documents from the update queue (SM13) to the delta queue (RSA7), and we extract the LUWs from the delta queue to SAP BW by running delta loads. Since the LUWs are not moved in a serial fashion from the update queue to the delta queue, an error document does not block the subsequent documents and there is no sorting of documents based on creation time. It improves OLTP system performance, as it forms one segment per language.
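The queued-delta flow described above can be modelled as two queues and two jobs. This is a toy sketch only: the queue names are labels for LBWQ and RSA7, and the real queues are managed by SAP, not by user code.

```python
from collections import deque

# Toy model of the queued-delta flow: a posting writes the LUW to the
# extractor queue (LBWQ), the scheduled V3 job moves LUWs to the delta
# queue (RSA7), and a BW delta load drains the delta queue.

extractor_queue = deque()   # stands in for LBWQ
delta_queue = deque()       # stands in for RSA7

def post_document(luw):
    extractor_queue.append(luw)      # posting lands the LUW in LBWQ

def run_v3_job():
    # Moves every waiting LUW from LBWQ to RSA7, preserving order.
    while extractor_queue:
        delta_queue.append(extractor_queue.popleft())

def run_delta_load():
    # BW delta load: reads and then clears the delta queue.
    extracted = list(delta_queue)
    delta_queue.clear()
    return extracted
```

The point of the intermediate queue is that document posting only has to write to LBWQ, so the application is not kept waiting on the extraction logic, which runs later in the scheduled V3 job.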
5. How do you enhance an LO datasource?
A: The enhancement procedure is the same as for any other datasource; the only difference is that instead of appending the fields to the extract structure, we append them to the communication structure and move the fields from the communication structure to the extract structure via the Maintenance link in T-code LBWE. Deselect the hide check box and the 'field only known in exit' check box when generating the datasource, then write the enhancement code to populate the data into the enhanced fields using the enhancement RSAP0001. This enhancement contains four function exits: EXIT_SAPLRSAP_001 (for transaction data datasources), EXIT_SAPLRSAP_002 (for master data attributes), EXIT_SAPLRSAP_003 (for master data texts) and EXIT_SAPLRSAP_004 (for master data hierarchies). We use the internal table C_T_DATA in the coding.
6. What is the T-code for deleting setup tables?
A: LBWG.
7. What is the importance of the delta queue?
A: The delta queue (RSA7) maintains two images, one delta image and the other a repeat delta. When we run the delta load in the BW system it sends the delta image, and whenever a delta load fails and we run the repeat delta, it sends the repeat delta records.
8. What does the 'total' column indicate in the RSA7 delta queue?
A: Each set of LUWs, whether available in both delta and repeat delta or not, is counted as one LUW.
9. What do you mean by an LUW?
A: LUW stands for Logical Unit of Work. When we create a new document it forms a new image 'N'; whenever there is a change to an existing document it forms a before image 'X' and an after image ' ', and these before and after images together constitute one LUW. A before image is always backed up by an after image.
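How a document change produces the image pair described above can be sketched as follows. The record layout and the 'recordmode' field name are illustrative stand-ins, not the actual SAP structures.

```python
# Sketch of how the images described above make up one LUW.

def build_luw(old_rec, new_rec):
    """New document -> a single 'N' image; changed document -> a before
    image 'X' plus an after image ' ', which together form one LUW."""
    if old_rec is None:
        return [{**new_rec, "recordmode": "N"}]    # new document
    before = {**old_rec, "recordmode": "X"}        # before image (old values)
    after = {**new_rec, "recordmode": " "}         # after image (new values)
    return [before, after]                          # one LUW
```

The before image lets the target reverse the old values, and the after image applies the new ones, which is how a change can be represented without re-reading the whole document.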
10. When we are initializing the datasource, do we need to lock the postings in SAP R/3?
A: Yes; that is the reason we normally do these data migration steps during weekends, in non-office hours, so as not to disturb the business.
11. How do we re-initialize an LO datasource?
A: First we lock the transactions and check what postings are pending in the extractor queue (LBWQ), in the case of queued delta, or in the update queue (SM13), in the case of serialized and unserialized V3 update. If we don't want the delta records in SM13 and LBWQ we can delete them; if we do want them in SAP BW, we schedule the V3 job, collect the LUWs into the delta queue and extract them from the delta queue to SAP BW by running delta loads. Once the queues are cleaned up, delete the setup tables using the T-code LBWG, specifying the application component; then run the statistical setup using the T-code OLI*BW to fill the setup tables, and run the init loads or full loads to extract all the data from the setup tables to the BW side. Once that is done successfully, we go back to R/3 LBWE to set up the periodicity of the V3 job, which could be hourly or two-hourly depending on the number of transactions (the more the transactions, the more frequently the V3 job should be scheduled). Finally the locks can be released, so that users can start recording transactions again.
NOTE: These migrations are always done at weekends after office hours, and the T-codes are usually locked and unlocked by the Basis people.
CO-PA Extraction
1. What is the naming convention for a CO-PA datasource?
A: 1_CO_PA_(client number: any 3-digit number except 000 and 066)_(any meaningful name).
2. On what basis do we generate the CO-PA datasource?
A: Since a CO-PA datasource is generated from an operating concern, we call it a customer-generated extractor.
3. What is the T-code to generate the CO-PA datasource?
A: KEB0.
4. Did you generate the datasource as costing-based or accounting-based? And what is the difference?
A: We generated the CO-PA datasource as costing-based.
Costing-based is preferred in the case of a product-based company; it also considers costs like advertisement cost and disbursement cost. In the costing-based case the company code field is mandatory, and it gives a detailed split of all the values.
Accounting-based is preferred in the case of a service-based company. In the accounting-based case the company code, controlling area and cost element fields are mandatory, and it gives cumulated values for analysis.
5. Do we have a business content InfoSource in the case of CO-PA?
A: No; only an application proposal or a user-defined InfoSource.
6. In the case of the CO-PA extractor, how is the delta handled?
A: CO-PA runs delta based on a timestamp. We can look at the timestamps set for a datasource using the T-code KEB2.
7. What do you mean by safety delta?
A: When we initialize the CO-PA datasources, we run the delta load only after a minimum gap of half an hour between the initialize delta update and the delta update; this time gap is called the safety delta.
8. Explain the steps to generate the CO-PA datasource.
A: Go to T-code KEB0 and specify the name of the CO-PA datasource following the naming convention 1_CO_PA_(client number: any 3-digit number except 000 and 066)_(any meaningful name); select the radio button 'Create', specify the operating concern, choose 'costing-based' or 'accounting-based' depending on the requirement, and click on the Execute button. In the next screen give the short, medium and long descriptions, specify the field name for partitioning (the profitability segment), click on the 'InfoCatalog' button and save the datasource; the datasource is then generated. Then replicate the datasource on the BW side and start migrating the data.
9. What do you mean by the summarization level, or the field name for partitioning?
A: It is nothing but the profitability segment that we specify when we post an entry.
10. Explain the data structures, or the different tables, in CO-PA.
A: CE1XXXX — actual line items.
CE2XXXX — planned line items.
CE3XXXX — segment level.
CE4XXXX — segment table.