
ETL-Based Replication

Configuring Data Stores
i) SAP Application
ii) SAP BW Source
iii) SAP BW Target
iv) DB (MS SQL)
v) SAP HANA

We can bring data from different sources:
1) SAP System
2) Non-SAP System
3) File System

A) Data from SAP System

In SAP systems we have two types of source: a) SAP ERP and b) SAP Data Warehouse (SAP BW).

a) SAP Data Warehouse

Before bringing data from SAP BW we need to connect SAP Data Services (DS) and SAP BW. This takes two steps.

i) Create an RFC connection in the Management Console of SAP Data Services:
- Log on to the SAP DS Management Console and click on Administrator.
- Select SAP Connections and click on RFC Server Interface.
- On the RFC Server Interface Configuration tab, click Add, provide the credentials of the SAP BW system, and click Apply.
- On the RFC Server Interface Status tab, select our connection and click Start. The connection should turn green.

ii) Create a source system in SAP BW:
- Log on to the SAP BW system and provide T-code RSA1.
- Go to the Source Systems tab, expand the Data Services folder, right-click on DS40, and select "Connection Parameters".
- Provide the RFC connection in Program-Id___________.

- Save and test the connection.

Note: DS40 is the source system name, provided by SAP.

1) Project #1: Table -> HANA
- Import the table into the data store first.
- Create a project, job, work flow, and data flow.
- Take an ABAP data flow and provide the required details about the SAP BW server.
- Inside the ABAP data flow, drag the required table from the data store, take a Query transform and a Data Transport, and connect them.
- After the ABAP data flow, take a Query transform and a table (as target) from the HANA data store, connect them, and execute the job. (A sketch of a possible HANA target table follows below.)
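The notes do not show what the HANA target looks like. As a minimal sketch, assuming an illustrative schema DS_TGT and made-up column names (none of these identifiers come from the notes), the target could be a plain column table:

    -- Hypothetical HANA target table for Project #1; all names are assumed.
    CREATE COLUMN TABLE "DS_TGT"."BW_SALES_COPY" (
        "DOC_NUMBER" NVARCHAR(10),   -- document key taken over from the BW table
        "CALDAY"     DATE,           -- posting date
        "AMOUNT"     DECIMAL(17,2)   -- key figure
    );

In practice a DS template table is often used instead, so the table is created on first execution; the manual DDL just makes the target structure explicit.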

2) Project #2: Open Hub -> HANA

STEPS AT BI
- Log on to the SAP BW system and provide T-code RSA1.
- Go to the InfoProvider tab and check whether the InfoProvider (InfoCube, DSO, MP, IS) contains data.

- Go to the Open Hub tab and create an Open Hub destination.
- Provide a name, the InfoProvider type (Cube, DSO, etc.), and the InfoProvider name.
- Set the Destination Type to "Third-party tool" and provide the RFC connection name (DS40).
- Activate the Open Hub destination.
- Create a transformation between the Open Hub destination and the cube, map the fields, and activate.

- Create a DTP and activate it.
- Go to the Process Chain tab, right-click on Unassigned Nodes, and select "Create Process Chain".
- Create a Start variant: provide a name and description and select the scheduling option.
- Select the DTP process type, select our DTP, and connect the Start variant and the DTP process type.
- Activate the process chain.

STEPS AT DS
- Log on to DS and import our Open Hub table into the data store.
- Create a project, job, work flow, and data flow.
- Drag the Open Hub table in, select the option "Execute process chain before reading", and select the process chain.
- Connect the Open Hub table and the HANA table through a Query transform and map the fields.
- Execute the job.

Note: The system will take updated records from BI (by default the process chain will take only updated records).
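For a sanity check on the BW database side: an Open Hub destination stages its output in a generated table, conventionally named /BIC/OH<destination name>. Assuming a destination called ZOH_SALES (the name is illustrative, not from these notes), a row count shows what the DTP delivered:

    -- ZOH_SALES is an assumed destination name; /BIC/OH is the standard
    -- prefix for generated Open Hub destination tables.
    SELECT COUNT(*) AS staged_rows
    FROM "/BIC/OHZOH_SALES";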

b) SAP ERP

Project #1: Table -> HANA

STEPS AT SAP ECC6
- Log on to SAP ECC6 and provide T-code SE11.
- Check table VBAK.

STEPS AT SAP DS
- Import the table into the data store first.
- Create a project, job, work flow, and data flow.
- Take an ABAP data flow and provide the required details about the SAP ECC server.
- Inside the ABAP data flow, drag the required table from the data store, take a Query transform and a Data Transport, and connect them.
- After the ABAP data flow, take a Query transform and a table (as target) from the HANA data store, connect them, and execute the job.

Project #2: Extractor -> HANA

STEPS AT SAP ERP
(For Standard Extractors)
- Log on to SAP ECC6 and provide T-code RSA5.
- Select the required extractors and click on Activate.
- Provide T-code RSA6 to change an extractor based on requirements.

(For Customized Extractors)
- Provide T-code SE11 and check tables VBAK and VBAP.
- Click on View, provide a name (it should start with Z), and select "Database View".
- Provide the table names, create the join VBAP.VBELN = VBAK.VBELN, select the fields from the tables, and activate. (A plain-SQL sketch of this view follows below.)
- Provide T-code RSO2, select the transaction data type, provide a name, and click on Create.
- Provide the Application Component and the view name, then save.

STEPS AT DS
- Import the required extractor.
- Create a project, job, work flow, and data flow.
- Drag our extractor, connect it with the SAP HANA table using a Query transform, map the fields, and run the job.
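Expressed as plain SQL (the SE11 dialog builds the same thing graphically), the database view could look like the sketch below. The view name ZV_SALES and the selected fields are assumed; only the Z-prefix rule and the VBELN join come from the notes:

    -- Hypothetical database view joining sales headers (VBAK) to items (VBAP).
    CREATE VIEW ZV_SALES AS
    SELECT k.VBELN,   -- sales document number (join key)
           k.ERDAT,   -- creation date (header)
           p.POSNR,   -- item number
           p.MATNR    -- material (item)
    FROM VBAK AS k
    INNER JOIN VBAP AS p
            ON p.VBELN = k.VBELN;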

B) Data from Non-SAP System

Project #1: Table -> HANA (Change Data Capture)

Step 1: Create a new project and a new job, and name the job Initial. Create two global variables at job level: $GV_STARTTIME and $GV_ENDTIME.

Step 2: Take two scripts and name them Initial_Start and Initial_End, take a new work flow, and put it in between them; join the three in sequence (Initial_Start -> work flow -> Initial_End).

Step 3: Double-click on the Initial_Start script and write the syntax below:

    $GV_STARTTIME = '1996.01.01 12:00:00';
    $GV_ENDTIME = sysdate();

Step 4: Double-click on the Initial_End script and write the logic below:

    sql('cdc', 'DELETE FROM CDC_TIME');
    sql('cdc', 'INSERT INTO CDC_TIME VALUES ({$GV_ENDTIME})');

Step 5: Double-click on the new work flow and take a new data flow. Double-click on the data flow, take the employee table (ODS_EMPLOYEE) as source, take a Query transform, and join both of them.

Step 6: Double-click on the Query transform, map the required fields from Schema In to Schema Out, then go to the WHERE clause tab and provide the syntax below:

    (ODS_EMPLOYEE.LAST_UPDATE > $GV_STARTTIME) and (ODS_EMPLOYEE.LAST_UPDATE < $GV_ENDTIME)

Run the job.
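The notes name the CDC_TIME control table but never define it. A minimal sketch, assuming a single timestamp column (the layout is an assumption): after the DELETE/INSERT pair in Initial_End runs, the table always holds exactly one row with the end time of the last run, which a follow-up delta job could read as its next $GV_STARTTIME.

    -- Hypothetical shape of the CDC_TIME control table in the 'cdc' datastore;
    -- only the table name appears in the notes, the column is assumed.
    CREATE TABLE CDC_TIME (
        LAST_TIME TIMESTAMP NOT NULL  -- end time of the last completed run
    );

    -- Optional seed row so the very first DELETE has something defined to clear.
    INSERT INTO CDC_TIME VALUES ('1996-01-01 12:00:00');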

C) File System
- Log on to the DS system.
- Go to the File Format tab and create a new file format: provide the name, directory, file, and date format, and set skip header = yes.
- We will get a preview of the file.
- Create a project, job, work flow, and data flow.
- Drag the file format in and make it the source.
- Connect it to the HANA table through a Query transform and map the fields.
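For intuition only: the same flat-file options (field delimiter, header skipping, date format) exist in plain HANA SQL's IMPORT FROM statement. This is not what Data Services generates; it is a sketch with an assumed server-side path and target table, handy for checking a file outside DS:

    -- Path and target table are assumed; the options mirror the file-format
    -- settings above (comma-delimited, skip header row, explicit date format).
    IMPORT FROM CSV FILE '/hana/work/employees.csv'
    INTO "DS_TGT"."FILE_EMPLOYEES"
    WITH RECORD DELIMITED BY '\n'
         FIELD DELIMITED BY ','
         SKIP FIRST 1 ROW
         DATE FORMAT 'YYYY-MM-DD';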