Data Services Within The SAP BI Staging Process
1.
2.
Start the Data Warehousing Workbench using transaction code RSA1 and select Modeling from the
navigation area. Pick the entry Source Systems.
3.
Position the cursor on the External System folder, and choose Create from the context menu
as shown above.
4.
Enter Logical System Name and Source System Name as shown above and hit Continue.
5.
Data Services will start an RFC Server program and indicate to SAP BI that it is ready to
receive RFC calls. To identify itself as the RFC Server representing this SAP BI Source
System, a keyword is exchanged; in the screenshot above it is "DEMO_SRC". This is the
Registered Server Program under which the Data Services RFC Server registers itself with SAP.
Therefore, provide the same Program ID that you want to use when starting the RFC Server
on the Data Services side. All other settings for the Source System can remain at their
default values. To complete the definition of the Source System, save it.
6.
Now, start the RFC Server on the Data Services side so that we can test the connection and
load data from Data Services into the SAP BI system.
a)
Open MS-DOS command prompt. (Start > Run > Type CMD & hit OK)
b)
c)
d)
The server here is 172.17.10.162 and the System Number is 00. The router string
would hence be "/H/172.17.10.162/S/3300". The /H/ stands for the hostname, the /S/
for the port number. The port number can be derived from the system number:
system number 00 means port 3300. The same applies to the SAP Gateway; in a
standard standalone installation the value for this is sapgw00.
e)
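The naming conventions in step d) can be sketched as a small helper. This is an illustrative sketch only, assuming the usual SAP convention that the router/gateway port is 33NN and the gateway service name is sapgwNN for system number NN; the function name is hypothetical.

```python
# Hypothetical helper illustrating how the router string and the SAP
# Gateway service name are derived from the host and the two-digit
# system number, following the convention described above (33NN / sapgwNN).
def sap_connection_strings(host: str, sysnr: str):
    # /H/ introduces the hostname, /S/ the port; system number NN -> port 33NN
    router = f"/H/{host}/S/33{sysnr}"
    gateway_service = f"sapgw{sysnr}"
    return router, gateway_service

# Values from this PoC: host 172.17.10.162, system number 00
router, gw = sap_connection_strings("172.17.10.162", "00")
print(router)  # /H/172.17.10.162/S/3300
print(gw)      # sapgw00
```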
7.
8.
9.
The newly entered 3rd party system would appear in the list of External Systems as shown
below.
PoC Loading unstructured data from Flat files into SAP BI using BusinessObjects Data
Services
Let us take a sample tab-delimited flat file as shown below. (Note: the sample data is shown in an
Excel file for illustration purposes only. In the actual PoC we will be using a text file.)
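A tab-delimited file like the one above can be parsed with any standard CSV reader. This is a minimal sketch with made-up data; the actual PoC file and its column names appear only in screenshots, so CUST_ID, NAME, and COUNTRY are stand-in names.

```python
import csv
import io

# Hypothetical sample rows mimicking the tab-delimited PoC file;
# the real column names and values are shown only in the screenshots.
sample = "CUST_ID\tNAME\tCOUNTRY\n1\tAlice\tIN\n2\tBob\tUS\n3\tCarol\tDE\n"

# Parse the tab-delimited content: first row is the header, rest are records
rows = list(csv.reader(io.StringIO(sample), delimiter="\t"))
header, records = rows[0], rows[1:]
print(header)        # ['CUST_ID', 'NAME', 'COUNTRY']
print(len(records))  # 3
```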
11. The new datastore BW_PoC_Target_Datastore is now available under Local Object
Library as shown below.
Hit Yes. Now you should see your sample data in the editor window. Note that the
Designer automatically assigned appropriate data types based on the input data.
PoC_File_Format is now available under the Flat Files tree in the Local Object Library as shown
below.
7. Let us now validate the source data. Assume we want only those
records where the country is IN (India) or US.
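The rule the Validation transform will apply can be sketched in plain code: rows whose country is IN or US pass, everything else is routed to the fail target. This is an illustrative sketch only; the field name COUNTRY is a stand-in, and in the actual PoC the rule is configured inside the transform, not written by hand.

```python
# Sketch of the validation rule: pass rows whose COUNTRY is IN or US,
# route all other rows to the "failed" target (see the later steps).
def validate(records):
    passed, failed = [], []
    for rec in records:
        (passed if rec["COUNTRY"] in ("IN", "US") else failed).append(rec)
    return passed, failed

data = [{"COUNTRY": "IN"}, {"COUNTRY": "US"}, {"COUNTRY": "DE"}]
passed, failed = validate(data)
print(len(passed))  # 2
print(len(failed))  # 1
```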
8. Drag the Validation transform from the Transforms tab of the Local Object Library onto the
Data Flow workspace.
12. Drag the target for failed records from any of the available datastores, e.g. Database, Flat
Files, Excel Workbooks, BW Target, etc. In our case, I have created a target datastore for an
Oracle XE schema on my local machine, so I have used a template table as the target for
the failed records.
14. Connect the Validation box with target Invalid_Customer for failing records.
Now, we need SAP BI target for the records which pass the validation. We first have to
create DataSource/InfoSource on the SAP BI side.
In order to create InfoSource, we need InfoObjects corresponding to our sample flat file
data.
1.
2.
Start the Data Warehousing Workbench using transaction code RSA1 and
select Modeling from the navigation area. Pick the entry InfoObjects.
3.
Create the InfoObjects corresponding to the columns in the flat file as shown in the following
three screenshots.
4.
5.
Now, let us create a DataSource on the SAP BI side as a target for our sample data.
The following three screenshots show that the system does not allow us to create a DataSource in a
source system of type External System.
It also suggests an alternative way to achieve the same. It appears we cannot create a BI 7.0
DataSource for a source system of type External System.
As per the method described in the earlier screenshot, we will create InfoSource 3.x.
Follow the steps as given below.
Find your InfoSource, and open its subtree. Position the cursor on the DataSource name, and
use the option Import from the context menu.
Afterwards, the DataSource will be available in the Object Library for your SAP BI system.
Open the workflow you created in the previous section. Open the Datastore tab strip in the
Object Library. Search for the datastore that you created for the SAP BI system as the target (in
our case BW_PoC_Target_Datastore). Drag the structure onto the canvas of the data flow that loads
into the SAP BI system. Connect the existing Validation transform with the new target.
Double click this new target to open the Target Table Editor as shown below. Go to Options tab
and select Column Comparison = Compare by position.
This is done so that even if the column names do not match (InfoObject names cannot be more than 9
characters), we can still carry out the data movement.
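The effect of Compare by position can be sketched as follows: source values are assigned to target columns by their order, not by their names. The column names used here are hypothetical stand-ins for the PoC's InfoObjects.

```python
# Sketch of "Compare by position": the n-th source value goes to the
# n-th target column, so a short InfoObject name (max 9 characters,
# e.g. CUSTID) can receive data from a longer source column name.
def map_by_position(src_row, target_columns):
    return dict(zip(target_columns, src_row))

src = ["1", "Alice", "IN"]                       # source column order
target_cols = ["CUSTID", "CUSTNAME", "COUNTRY"]  # hypothetical InfoObjects
print(map_by_position(src, target_cols))
```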
Validate your DataFlow.
Since the loading process has to be initiated by the SAP BI system, we have to create a batch file
for the execution of the Job in Data Services. This batch file can subsequently be called by an
InfoPackage in the SAP BI system.
Open the Data Services Management Console by choosing the corresponding menu entry from
the Tools menu.
Log in with your user credentials. In the Management Console, navigate to the Administrator and
open your repository (or All Repositories) under the Batch folder.
Switch to the Batch Job Configuration tab strip and find the Job you want to schedule. Choose
the option Export Execution Command.
Provide a File Name for the batch file. Leave the other settings at their default values and press
the Export button.
Note
Since the maximum length of the file name field in the InfoPackage is limited to 44
characters, the file name you enter must not exceed 40 characters (44 characters minus 4
characters for the .bat extension).
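The length constraint from the note above can be expressed as a one-line check. A minimal sketch; the helper name and the example job name are made up.

```python
# The InfoPackage file-name field holds at most 44 characters including
# the ".bat" extension, so the base name may be at most 40 characters.
def batch_name_ok(base_name: str) -> bool:
    return len(base_name + ".bat") <= 44

print(batch_name_ok("PoC_Load_Flat_File_Job"))  # True
print(batch_name_ok("x" * 41))                  # False: 45 chars with ".bat"
```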
In our case, the following two files get created after the export under the C:\temp folder.
Before you switch to the SAP BI system, check that the RFC server is still running.
Switch to the SAP BI system you want to load the data to. Open the Data Warehousing
Workbench and find your InfoSource / DataSource. Create an InfoPackage for the DataSource.
Switch to the 3rd Party Selection tab strip. Press the Refresh Sel. Fields button to display the input
fields.
Enter the File Name that you have provided in the creation of the batch file.
Switch to the Processing tab strip. Select Only PSA for updating the data to. Hit Save.
Switch to the Schedule tab strip. Ensure Start Data Load Immediately is selected. Hit Start. This
starts executing the job designed in Data Services using the batch file.
Click Continue.
The following screen shows the data loaded into the PSA. In our example we transferred the records
that passed all our data quality measures, i.e. all records whose Country equals either IN or
US. In total there are 3 records.
Records not satisfying the validation criteria, i.e. records with countries other than IN or US, are
stored in the template table as desired.
The screenshot below shows the failing records stored in the Oracle XE schema.