SAP BW Extraction

Table of Contents

1. Extraction
   1.1 Introduction
   1.2 Step-by-step control flow for a successful data extraction with SAP BW
2. Data Extraction from SAP Source Systems
   2.1 Introduction
       2.1.1 Process
       2.1.2 Plug-in for R/3 Systems
   2.2 Transfer Method - PSA and IDoc
       2.2.1 Introduction
       2.2.2 Persistent Staging Area (PSA): Definition; Use
       2.2.3 IDocs: Definition; Example
       2.2.4 Two Methods to transfer data: Differences and advantages - PSA; ALE (data IDoc)
   2.3 DataSource
       2.3.1 Assigning DataSources to InfoSources and Fields to InfoObjects
       2.3.2 Maintaining DataSources
       2.3.3 Transferring Business Content DataSources into Active Version
       2.3.4 Extraction Structure
       2.3.5 Transfer Structure
       2.3.6 Replication of DataSources: Entire Metadata; Application Component Hierarchy; Metadata of an Application Component; Single DataSource
   2.4 Data Extraction Logistics
       2.4.1 Data Extraction Illustration: Full Load; Delta Load
   2.5 LO Cockpit Functions
       2.5.1 Maintain Extract Structures
       2.5.2 Maintain Data Sources
       2.5.3 Activating update
       2.5.4 Controlling update
       2.5.5 Setup Tables
       2.5.6 Serialized V3
       2.5.7 Queued Delta
       2.5.8 Direct Delta
       2.5.9 Unserialized V3
   2.6 LO Data Sources Data Flow in R/3
       2.6.1 Filling up the Appended Structure
       2.6.2 Regenerate & Check the Customized Objects
   2.7 Structure of Delta Method for LO Cockpit Data Sources
       2.7.1 Delta Management in extraction
       2.7.2 Step-by-Step Maintenance
   2.8 Delta Method
       2.8.1 Master Data
       2.8.2 Transactional Data
       2.8.3 Delta Process
   2.9 Delta Method Properties
       2.9.1 Delta Initialization
       2.9.2 Delta Extraction
   2.10 Delta Queue Functions: Benefits; Limits; Update Modes (V1 Update; V2 Update; V3 Update); Direct Delta (V1 update); Queued delta (V1 + V3 updates); Un-serialized V3 Update (V1/V2 + V3 updates)
   2.11 Generic extraction
   2.12 Generic Data Types: Master Data (Texts; Attributes; Hierarchies); Time-dependent Attributes; Time-dependent Texts; Time-dependent Texts and Attributes; Language-dependent Texts; Transactional data
   2.13 Generic Data sources: Create Generic extraction [Master data]; Extraction Structure; Editing the DataSource in the Source System; Replication of DataSources (Replication Process Flow; Deleting DataSources during Replication; Automatic Replication during Data Request)
   2.14 Enhancing Business Content
3. Extraction with Flat Files
   3.1 Introduction
   3.2 Data from Flat Files (7.x): Basic Steps of Data Flow (ETL process); Step-by-Step to upload Master Data from Flat File to InfoObjects
   3.3 Data from Flat Files (3.x)
   3.4 Extracting Transaction and Master Data using Flat Files
   3.5 Data Types that can be extracted using Flat Files
4. DB Connect
   4.1 Introduction
   4.2 Loading data from an SAP-supported DBMS into BI: Process Flow
5. Universal Data Integration
   5.1 Introduction: Process Description
   5.2 Using Relational UD Connect Sources (JDBC): Using InfoObjects with UD Connect; Use of Multiple Database Objects as UD Connect Source Object; Aggregated Reading and Quantity Restriction; Using SAP Namespace for Generated Objects
   5.3 Creating a UD Connect source system
   5.4 Creating a DataSource for UD Connect
   5.5 Configuring the BI Java Connector: Deploying the user data source's JDBC driver to the server; JNDI Names; Cloning the Connections; Testing the Connections; Connector Properties
   5.6 BI JDBC Connector
   5.7 BI XMLA Connector
6. XML Integration
   6.1 Introduction
   6.2 Benefits of XML Integration: Changing Business Standards and their adoption; SAP's Internet Business Framework; End-to-End Web Business Processes; Web-based business solutions; Open Business Document Exchange over the Internet
   6.3 Business Integration with XML: Factors leading to the emergence of XML-enabled SAP solutions; XML Solutions for SAP services; Incorporating XML Standards; Internet Security Standards; SAP applications with XML
   6.4 How to Customize the Business Connector (BC): Components of Business Connector; Add New Users to BC; Add SAP Systems; Add Router Tables; Access functionality in the Business Connector
7. Data Mart Interface
   7.1 Introduction
   7.2 Special Features
   7.3 Architectures: Replicating Architecture; Aggregating Architecture
   7.4 Data Mart Interface in the Myself System
   7.5 Data Mart Interface between Several Systems: In the Source BI; In the Target BI; Generating Export DataSources for InfoProviders; Generating Master Data Export DataSources; Transactional Data Transfer Using the Data Mart Interface; Transferring Texts and Hierarchies for the Data Mart Interface; Delta Process
8. Virtual InfoCubes
   8.1 Introduction
   8.2 Create Virtual InfoCube
   8.3 Different Types: SAP RemoteCube (Structure; Integration; Creating a SAP RemoteCube); RemoteCube (Structure; Integration; Restriction; Dependencies); Virtual InfoCubes with Services (Structure; Description of the interfaces for user-defined function modules; Additional parameters for variant 2 for transferring hierarchy restrictions; Method for determining the correct variant for the interface)

1. Extraction

1.1 Introduction

Extraction programs that read data from extract structures and send it, in the required format, to the Business Information Warehouse also belong to the data staging mechanisms in the SAP R/3 system, as well as in the SAP Strategic Initiative products such as APO, CRM and SEM. These extraction tools are implemented on the source system side during implementation and support various releases. The IDoc structures or tRFC data record structures (if the user chooses to use the PSA - Persistent Staging Area) that are generated from the transfer structures for the Business Information Warehouse on the source system side are used for this. These then collect the requested data and send it in the required transfer format using BAPIs to the SAP Business Information Warehouse. In non-SAP applications, similar extraction programs can be implemented with the help of third-party providers.

The OLTP extraction tables form the basis of a DataSource on an R/3 OLTP system. The structure is written in the OLTP using the data elements that describe the available data, usually from a table view. For an R/3 OLTP source system, the "DataSource Replication" step is provided: the DataSource is replicated with its relevant properties in BW. Once there, the user can assign it to an InfoSource. The user can request the metadata for a DataSource, the metadata for an application component, or all the metadata of a source system:

To replicate the metadata of a DataSource, in the BW Administrator Workbench choose Source System Tree → Source System → DataSource Overview → Application Components → DataSource → Context Menu (right mouse click) → Replicate DataSources.

To replicate the metadata of an application component from a source system into BW, in the BW Administrator Workbench choose Source System Tree → Source System → DataSource Overview → Application Components → Context Menu (right mouse click) → Replicate DataSources.

To update all the metadata of a source system, in the BW Administrator Workbench choose Source System Tree → Source System → Context Menu (right mouse click) → Replicate DataSources.

All application data must be described in SAP BW using metadata. The InfoObjects used for this are not just transaction and master data but also relationship sets such as attributes or hierarchies for master data. The SAP BW extractors carry out a number of functions in the SAP OLTP in order to guarantee smooth communication between SAP OLTP and SAP BW. In non-SAP systems, these tasks must be carried out by external software that is set up on the BAPI interfaces of SAP BW. Virtually any source of data can be extracted for use in the Business Information Warehouse. Various businesses have already been certified as official third-party providers in this field; a complete list can be found on the SAPnet BW homepage.

Two options for data transfer are possible:

- TRFC (chosen by selecting PSA in the transfer structure): TRFCs are faster than ALE, but a limit of 255 fields and 1962 bytes per data record is imposed. The load process takes place in two steps, and tracing of the status of each step is provided.
- ALE (IDoc): Here a 1000-byte limit is imposed. The transfer is asynchronous and runs in the background.

Benefits of TRFC:

- Improved performance when loading data.
- Overcoming the 1000-byte limit (now a maximum of 255 fields and 1962 bytes).
- Possibility to synchronize InfoSources.
- API to access the data stored in the ODS (read and update).

1.2 Step-by-step control flow for a successful data extraction with SAP BW:

1. An InfoPackage is scheduled for execution at a specific point of time or for a certain system- or user-defined event.
2. Once the defined point of time is reached, the SAP BW system starts a batch job that sends a request IDoc to the SAP source system.
3. The request IDoc arrives in the source system and is processed by the IDoc dispatcher, which calls the BI Service API to process the request.
4. The BI Service API checks the request for technical consistency. Possible error conditions include specification of DataSources unavailable in the source system and changes in the DataSource setup or the extraction process that have not yet been replicated to the SAP BW system.
5. The BI Service API calls the extractor in initialization mode to allow for extractor-specific initializations before actually starting the extraction process. The generic extractor, for example, opens an SQL cursor based on the specified DataSource and selection criteria.
6. The BI Service API calls the extractor in extraction mode. One data package per call is returned to the BI Service API, and customer exits are called for possible enhancements. The extractor takes care of splitting the complete result set into data packages according to the IDoc control parameters. The BI Service API continues to call the extractor until no more data can be fetched.
7. The BI Service API finally sends a final status IDoc notifying the target system that request processing has finished (successfully or with errors specified in the status IDoc).

2. Data Extraction from SAP Source Systems

2.1 Introduction

Extractors are one of the data retrieval mechanisms in the SAP source system. An extractor can fill the extract structure of a DataSource with data from SAP source system datasets.

2.1.1 Process

There are application-specific extractors, each of which is hard-coded for the DataSource delivered with BW Business Content and which fill the extract structure of the DataSource. In addition, there are generic extractors, with which the user can extract more data from the SAP source system and transfer it into BW.

In a metadata upload, the DataSource, including its relevant properties, is replicated in the BW. Once there, the user can assign it to an InfoSource, and the DataSource fields are made available to be assigned to BW InfoObjects. After specifying the data flow in the BW Administrator Workbench by maintaining the transfer rules, the user can schedule an InfoPackage in the Scheduler; the data loading process is then triggered in the source system by a request IDoc. Only when the user calls up the generic extractor by naming the DataSource does it know which data is to be extracted, from which tables it should read the data, and in which structure.

This is how it fills different extract structures and DataSources, and this is how generic extractors allow the extraction of data that cannot be made available within the framework of Business Content. The user can run generic data extraction in R/3 source system application areas such as LIS, CO-PA, FI-SL and HR. This is how LIS, for example, uses generic extraction to read info structures; DataSources are generated on the basis of these (individually) defined info structures. Regardless of the application, the user can generically extract master data attributes or texts, or transaction data, from all transparent tables, database views or SAP Query functional areas, or by using a function module. In that case we speak of generic DataSources; with generic data extraction from applications, we speak of customer-defined DataSources. The user can generate user-specific DataSources here, reached by choosing Source System → Context Menu (right mouse click) → Customizing Extractors in the BW Administrator Workbench - Modeling. Further information can be found in the implementation guide for data extraction from SAP source systems.

2.1.2 Plug-in for R/3 Systems

BW-specific source system functions, extractors and DataSources are delivered by so-called plug-ins. Communication between the R/3 source system and the SAP Business Information Warehouse is only possible if the appropriate plug-in is installed in the source system.

2.2 Transfer Method - PSA and IDoc

2.2.1 Introduction

This information is taken from help.sap.com and some other sources and has been rearranged to explain the concept better. If time permits, we will discuss this in detail in class.

2.2.2 Persistent Staging Area (PSA)

2.2.2.1 Definition

The Persistent Staging Area (PSA) is the initial storage area for requested transaction data, master data attributes, and texts from various source systems within the Business Information Warehouse. The data is stored in the form of the transfer structure and is not modified in any way, which means that if the data contained errors in the source system, it may still contain errors.

2.2.2.2 Use

The requested data is stored in transparent, relational database tables. The user can check the quality of the requests: their usefulness, the sequence in which they are arranged, and how complete they are. When the user loads flat files, the data does not remain completely unchanged, since it may be modified by conversion routines (for example, the date format 31.12.1999 might be converted to 19991231 in order to ensure the uniformity of the data).

2.2.3 IDocs

2.2.3.1 Definition

Standard SAP format for electronic data interchange between systems (Intermediate Document). The IDoc interface exchanges business data with an external system and consists of the definition of a data structure along with processing logic for this data structure. The business data is saved in IDoc format in the IDoc interface and is forwarded as IDocs. Different message types (for example, delivery confirmations or purchase orders) normally represent different specific formats, known as IDoc types; multiple message types with related content can be assigned to one IDoc type. IDocs are used both in Electronic Data Interchange (EDI) and for data distribution in a system group (ALE). If an error occurs, exception handling is triggered using SAP tasks; the agents who are responsible for these tasks and have the relevant authorizations are defined in the IDoc interface.

2.2.3.2 Example:

The IDoc type ORDERS01 transfers the logical message types ORDERS (purchase order) and ORDRSP (order confirmation).

2.2.4 Two Methods to transfer data

Basically, there are two transfer methods for SAP systems:

Information is sent from the source system (no data) through the IDoc interface (info IDocs). This information can be, for example, the number of data records extracted or information for the monitor. With the PSA transfer method, IDoc containers are not used to send the data; instead, the data is transferred directly in the form of the transfer structure. With the IDoc method, IDoc interface technology is used to pack the data into IDoc containers.

2.2.4.1 Differences and advantages:

2.2.4.1.1 PSA

1. Uses TRFC as the transfer log.
2. Number of fields per data record: restricted to 255.
3. Data record length: max. 1962 bytes.
4. The more common technology, since it brings with it better load performance and gives the user the option of using the PSA as an inbound data store (for master and transaction data).
5. Error handling is possible. The most advantageous thing about the PSA is that the user can view the records and edit them if any are in error; that is not the case with IDocs.

Advantage: Improved performance, since larger data packages can be transported.

2.2.4.2 ALE (data IDoc)

1. Uses info and data IDocs.
2. Data record length: max. 1000 bytes.
3. Used with hierarchies.
4. The user is not able to view the data in IDocs while the data is being transferred.

Advantage: More detailed log through the control record and status record of the data IDoc.

2.3 Data Source

Data that logically belongs together is stored in the source system in the form of DataSources. A DataSource consists of a quantity of fields that are offered for data transfer into BW. It also describes the properties of the extractor belonging to it. The DataSource is technically based on the fields of the extraction structure; by defining a DataSource, these fields can be enhanced as well as hidden (or filtered) for the data transfer.

2.3.1 Assigning DataSources to InfoSources and Fields to InfoObjects

In the DataSource overview for a source system in the Administrator Workbench - Modeling, the user assigns DataSources to InfoSources and fields to InfoObjects in the transfer rules maintenance. To do this, choose Assign InfoSource from the context menu (right mouse button) of a DataSource. The user can:

1. Choose an InfoSource from a list containing InfoSources sorted according to the agreement of their technical names, or
2. Create a new InfoSource.

The user can also assign several DataSources to one InfoSource. This is used, for example, if the user wants to gather data from different sources into a single InfoSource, that is, if data from different IBUs that logically belongs together is grouped together in BW. If the user uses this assignment option, there is also the additional option of assigning an unmapped DataSource to an InfoSource; this assignment takes place in the same way in the transfer rules maintenance.

The fields of a DataSource are assigned to InfoObjects in BW. In the transfer rules maintenance, the user determines how the fields of the transfer structure are transferred into the InfoObjects of the communication structure. Data is transferred from the source system into the SAP Business Information Warehouse in the transfer structure.

There are four types of DataSource:

- DataSources for transaction data
- DataSources for master data, which can be:
  1. DataSources for attributes
  2. DataSources for texts
  3. DataSources for hierarchies

DataSources are used for extracting data from a source system and for transferring data into BW. As regards the data transfer into BW, they make the source system data available on request in the form of the (if necessary, filtered and enhanced) extraction structure. During a metadata upload, the properties of the DataSource relevant to BW are replicated in BW.

2.3.2 Maintaining DataSources

Source system DataSources are processed in the customizing for extractors. The user gets to customizing via the context menu (right mouse button) of the relevant source system in the source system tree of the BW Administrator Workbench - Modeling.

Or the user can go directly to the DataSource maintenance screen by choosing Maintaining DataSource in Source System from the context menu of the source system DataSource overview.

2.3.3 Transferring Business Content DataSources into Active Version

Business Content DataSources of a source system are only available in BW for transferring data if the user has transferred them in their active versions in the source system and then carried out a metadata upload. In order to transfer and activate a DataSource delivered by SAP with Business Content, the user first has to transfer it from the delivered version (D version) into the active version (A version). To do this, in the BW Administrator Workbench choose Goto → Modeling → Source Systems → Source System → Context Menu (right mouse click) → Customizing Extractors → Subsequent Processing of DataSources; or select the source system in the source system tree of the BW Administrator Workbench and choose Customizing Extractors → Business Information Warehouse → Business Content DataSources / Activating SAP Business Content → Transfer Business Content DataSources from the context menu (right mouse button). With a metadata upload, the active version of the DataSource is finally replicated in BW. An InfoSource in BW requires at least one DataSource for data extraction if the user wants to transfer data from a source system into BW using a Business Content DataSource.

2.3.4 Extraction Structure

In the extraction structure, data from a DataSource is staged in the source system. The extraction structure contains the set of fields that are offered by an extractor in the source system for the data loading process; DataSource data that logically belongs together is staged in the flat structure of the extract structure. The user can edit DataSource extraction structures in the source system. In particular, the user can determine the DataSource fields in which extraction structure fields are hidden from the transfer, meaning filtering the extraction structure, and/or enhance the DataSource with fields, meaning completing the extraction structure.

2.3.5 Transfer Structure

The transfer structure is the structure in which the data is transported from the source system into the SAP Business Information Warehouse. It is a selection of DataSource fields from a source system and provides BW with all the source system information available for a business process.

In the transfer structure maintenance screen, the user specifies the DataSource fields to transfer into BW. When the user activates the transfer rules in BW, a transfer structure identical to the one in BW is created in the source system from the DataSource fields. The data is transferred 1:1 from the transfer structure of the source system into the BW transfer structure; from there it is transferred, using the transfer rules, into the BW communication structure. A transfer structure always refers to a DataSource in a source system and an InfoSource in a BW.

2.3.6 Replication of DataSources

2.3.6.1 Replication of the Entire Metadata (Application Component Hierarchy and DataSources) of a Source System

- Choose Replicate DataSources in the Data Warehousing Workbench in the source system tree through the source system context menu, or
- Choose Replicate DataSources in the Data Warehousing Workbench in the DataSource tree through the root node context menu.

2.3.6.2 Replication of the Application Component Hierarchy of a Source System

Choose Replicate Tree Metadata in the Data Warehousing Workbench in the DataSource tree through the root node context menu.

2.3.6.3 Replication of the Metadata (DataSources and Possibly Application Components) of an Application Component

Choose Replicate Metadata in the Data Warehousing Workbench in the DataSource tree through an application component context menu.

2.3.6.4 Replication of a DataSource of a Source System

- Choose Replicate Metadata in the Data Warehousing Workbench in the DataSource tree through a DataSource context menu, or
- In the initial screen of the DataSource repository (transaction RSDS), select the source system and the DataSource and then choose DataSource → Replicate DataSource.

Using this function, the user can also replicate an individual DataSource that did not previously exist in the BI system. This is not possible in the view for the DataSource tree, since a DataSource that has not yet been replicated is not displayed there.

2.4 Data Extraction Logistics

A new technique to extract logistics information, consisting of a series of standard extract structures (that is, standard DataSources) delivered in the Business Content.

2.4.1 Data extraction Illustration

Data extraction is the process of loading data from OLTP to OLAP (BW/BI). Data can be extracted in two modes:

1. Full Load - the entire data set available at the source is loaded to BW/BI.
2. Delta Load - only the new/changed/deleted data is loaded.

2.4.1.1 Full Load:

Document posting means creating a transaction, that is, writing into the application/transaction tables. So whenever a sales order is created (a document is posted), the transaction is written into the database tables / application tables / transaction tables (for example VBAK, VBAP, EKKO, EKPO). From a BW perspective, whenever the user is doing a full load, setup tables are used.


2.4.1.2 Delta Load:

The various update methods are discussed below:

1. Serialized V3
2. Direct Delta
3. Queued Delta
4. Unserialized V3

Before that we need to understand V1, V2 and V3 updates. These are different work processes on the application server that take the update LUW from the running program and execute it; they are separated in order to optimize the transaction processing capabilities. In short: a V1 update runs synchronously with the document posting, a V2 update runs asynchronously with lower priority, and V3 updates are collected in the update tables and processed later by a scheduled batch job (the update collective run).
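For intuition, a V1 or V2 update module is an ordinary function module whose processing type is set to an update mode in the Function Builder: the posting program only registers it, and the registered work runs when the transaction commits, while V3 modules additionally wait for a collective run. A minimal ABAP sketch (the function module name and parameter are illustrative assumptions, not taken from this document):

REPORT z_update_demo.

DATA ls_vbak TYPE vbak.                 " sales document header just posted

" The posting transaction only registers the update module here;
" nothing is executed yet.
CALL FUNCTION 'Z_UPDATE_SALES_STATISTICS' IN UPDATE TASK
  EXPORTING
    is_vbak = ls_vbak.

" COMMIT WORK hands the registered LUW to the update work processes:
" V1 modules run first, then V2 modules; V3 modules stay parked in the
" update tables until a collective run (e.g. scheduled in LBWE) reads them.
COMMIT WORK.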

2.5 LO Cockpit Functions

2.5.1 Maintain Extract Structures

Here the user can add additional fields from the communication structures available to the extract structure.

2.5.2 Maintain Data Sources

In the DataSource maintenance screen, the user can customize the DataSource by using the following settings per field: field name, short text, selection, hide field, inversion (cancellation field or reverse posting), and field only known in customer exit.

2.5.3 Activating update

By setting the DataSource as active, data is written into the extract structures, both online (from document postings) and during the completion of the setup tables (also called restructure tables or LO initialization tables).

2.5.4 Controlling update

This concerns which delta update mode the user is using and how the data load is controlled based on the volume of data. Depending on the update mode, a job has to be scheduled with which the updated data is transferred in the background into the central delta management (the delta queue). LO Cockpit supports four types of update modes (the delta modes we have already listed): Serialized V3 update, Direct Delta, Queued Delta and Unserialized V3 update.

2.5.5 Setup Tables

Setup tables are used to initialize delta loads and for full loads; they are part of the LO extraction scenario. Direct access to the application tables is not permitted for the extractors, hence setup tables are there to collect the required data from the application tables. With this option the user avoids pulling from R/3 directly, as field values need to be brought together from multiple tables. The setup tables are the base tables for the DataSource used for full upload, and full update is possible in LO extractors, so a pure full upload reads from them. The setup table name is the extract structure name followed by SETUP: names start with 'MC', followed by the application component ('01'/'02' etc.) and the last digits of the DataSource name, and end with SETUP. Equally, it is the communication structure name (on the R/3 side; the user can check it in LBWE as well) followed by 'SETUP'. The user can see the data in the setup tables, and when a load fails, the load can be re-run to pull the data from the setup tables. However, setup tables do not receive the delta data created after the init: in a full update, whatever data is present in the setup tables (from the last fill) is sent to BW. So if the user's full update should get ALL data from the source system, the setup tables need to be deleted and re-filled.

2.5.6 Serialized V3

Take the example of the same PO item changing many times in quick succession. V1 (with its enqueue mechanism) ensures that the OLTP tables are updated consistently. The update table receives these update records, which may or may not be in the correct sequence (as there is no locking) by the time they reach BW. Since the update table records carry a timestamp, the V3 job can sequence the records correctly when it runs and thus achieve 'serialization'. 'Serialized V3' was designed to ensure this correct sequence of update records going from the update tables to the delta queue (and then to BW).

2.5.7 Queued Delta (the third update method)

With queued delta update mode, the extraction data (for the relevant application) is written to an extraction queue (instead of to the update data, as in V3) and can be transferred to the BW delta queues by an update collective run. After activating this method, up to 10000 document deltas/changes to one LUW are cumulated per DataSource in the BW delta queues. If the user uses this method, it is necessary to schedule a job to regularly transfer the data to the BW delta queues; as always, the simplest way to perform the scheduling is via the "Job control" function in LBWE. SAP recommends scheduling this job hourly during normal operation after a successful delta initialization, but there is no fixed rule: it depends on the peculiarities of every specific situation (business volume, reporting needs and so on).

2.5.8 Direct Delta (the second delta update method in our list)

With this update mode, each document posting is directly transferred into the BW delta queue. Each document posting with delta extraction therefore leads to exactly one LUW in the respective BW delta queues. Just to remember: 'LUW' stands for Logical Unit of Work and can be considered an inseparable sequence of database operations that ends with a database commit (or a rollback if an error occurs).

2.5.9 Unserialized V3: (The last one)

With this update mode, as the name suggests, the extraction data continues to be written to the update tables using a V3 update module and is then read and processed by a collective update run (through LBWE), as previously executed during the V3 update. But the unserialized V3, which we can consider the serializer's brother, disowns its sibling's main characteristic: data is read in the update collective run without taking the sequence into account and is then transferred to the BW delta queues. Issues: it is only suitable for data targets for which the correct sequence of changes is not important, e.g. material movements; in addition, the V2 update has to be successful.

2.6 LO Data Sources Data Flow in R/3:

DataSources reside in the source system, so if I need to customize a DataSource, I need to go to the source system. One can directly log on to the source system (R/3 in our case), or from BW one can also remotely log on to the source system.

1. Logon to BW.

   a. Go to the Administrator Workbench (RSA1).
   b. Go to Source Systems.
   c. Choose the R/3 source system (where the user's DataSource resides).
2. Right-click & go to "Customizing for Extractors". It takes the user to the source system (R/3).

3. Choose "Edit DataSources and Application Component Hierarchy".
4. Go to the "0CUSTOMER_ATTR" Customer Number DataSource (or whatever DataSource the user decides to customize).
5. Click on the DataSource.
6. Scroll down to reach the appended structure.

7. Double-click on the appended structure - ZBIW_KNA1_S1 (or the one the user has chosen) - and add the fields (which the user wishes to add to the DataSource). It will append this to the extract structure of the DataSource.
8. Check, save & activate.
9. Go back.

10. Click on 0CUSTOMER_ATTR (the DataSource).
11. Now click on the extract structure - BIW_KNA1_S (in my case) - and check out the appended field below the append structure.

2.6.1 Filling up the Appended Structure

1. Post appending the structure, we need to fill this append with data. For this:
   a. Log on to the source system (R/3).
   b. Go to transaction code CMOD.
   c. Choose the project which the user is using for master data enhancement - project "ZSDBWNEW" in my case. (The user needs to create a project in case enhancement is being done on this BW system for the first time.)
   d. Click on Enhancement Assignments.
   e. Click on Change.

2. Press Continue - in case it prompts for some confirmation like "Enhancement project already active" etc.
3. Choose the following:
   a. Enhancement - RSAP0001
   b. Click on Components.
4. It will take the user to -

5. Choose "EXIT_SAPLRSAP_002" - the enhancement component for master data attributes.
6. Click on it.

7. We need to write ABAP code to fill up the appended fields in "0CUSTOMER_ATTR" through the include mentioned here, i.e. include ZXRSAU02.
8. Double-clicking on this include takes the user to the ABAP Editor to put in the code. Go to change mode.
9. Make sure the tables KNVV, KNA1, KNKK, KNVP (these are the tables from which I need to extract the values of the appended fields) and the structure BIW_KNVV_S are declared at the start of the code.
10. This code is to be inserted before the ENDCASE of the already written code; a sketch of what such an insertion could look like follows below.
11. Do the "Check" before Save.
12. If no errors are found, then Save & Activate.
13. Go back.
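The following is only an illustrative sketch of the inserted code: the appended field names (ZZKDGRP, ZZVKORG), the KNVV lookup, and the assumption that the exit hands the extracted records over in table I_T_DATA typed to the extract structure BIW_KNA1_S are examples, not the actual code of this document (parameter names can vary by release). In an include that already contains a CASE statement, only the WHEN branch is added before the existing ENDCASE:

" Include ZXRSAU02 - user exit EXIT_SAPLRSAP_002 (master data attributes)
DATA: ls_cust  TYPE biw_kna1_s,      " extract structure of 0CUSTOMER_ATTR
      lv_tabix TYPE sy-tabix.

CASE i_chabasnm.                     " basic characteristic being extracted
  WHEN '0CUSTOMER'.
    LOOP AT i_t_data INTO ls_cust.
      lv_tabix = sy-tabix.
      " Fill the appended ZZ fields from the sales area data (KNVV)
      SELECT kdgrp vkorg UP TO 1 ROWS
        FROM knvv
        INTO (ls_cust-zzkdgrp, ls_cust-zzvkorg)
        WHERE kunnr = ls_cust-kunnr.
      ENDSELECT.
      MODIFY i_t_data FROM ls_cust INDEX lv_tabix.
    ENDLOOP.
ENDCASE.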

14. Do the Check again, and if no errors are found, then Activate.
15. Come out. (Keep activating while going back on every step.)
16. Now the final step - we need to regenerate the DataSource and make sure the newly added attributes (starting with ZZ) are not hidden.

2.6.2 Regenerate & Check the Customized Objects

1. Log on to the BW system - go to the Administrator Workbench (RSA1) - go to Source Systems - choose the source system (R/3 in our case), right-click & go to "Customizing for Extractors".
2. Choose "Edit DataSources and Application Component Hierarchy".
3. Go to the "0CUSTOMER_ATTR" Customer Number DataSource & click on CHANGE.
4. Scroll down.

5. Scroll down & go to the fields starting with ZZ & make sure these are not hidden (remove the tick from the HIDE checkbox), as shown below.
6. Click on DataSource & Generate. Keep clicking on the confirmation (green check) if it prompts for any confirmation.
7. Go back.
8. Next steps are checking the extractors & loading the data.

2.7 Structure of Delta Method for LO Cockpit Data Sources

The delta InfoPackage extracts the data from the delta queue into the SAP BI system and is usually scheduled as part of a process chain. The data is extracted into the persistent staging area, which forms the first physical layer in BI, from where it is further staged into a DSO (which can be part of a pass-through layer or an EDW layer) and an InfoCube, in overwrite or summation mode. Note the negative values of the key figures for the before-image record in the PSA table. When data from an LO DataSource is updated to a DSO with the setting "unique record", the before image is ignored.

Generic extractors of type (extraction method) F2 with delta process AIE (after image via extractor) use a pull delta model. 'F2' means the data is extracted by means of a function module that, in contrast to 'F1', occupies a simplified interface (see the documentation for data element ROFNAME_S). Whenever delta data is requested from BW, it is pulled via the delta queue; the delta LUWs are saved to the repeat delta table, and only the repeat delta LUWs are visible in RSA7 (in the screenshot referenced here, Total = 2 refers to the number of LUWs in the repeat delta table). For normal F1-type extractors, both delta and repeat delta LUWs are visible in RSA7.
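To make the function-module extraction concrete, here is a minimal, hedged sketch of a custom extractor modeled on the interface of the standard template RSAX_BIW_GET_DATA_SIMPLE; the DataSource name ZBW_ORDERS, the extract structure ZBW_S_ORDERS and the VBAP selection are illustrative assumptions:

FUNCTION z_bw_get_orders.
" Interface modeled on the template RSAX_BIW_GET_DATA_SIMPLE:
" importing I_REQUNR, I_DSOURCE, I_MAXSIZE, I_INITFLAG; tables
" I_T_SELECT, I_T_FIELDS, E_T_DATA (STRUCTURE ZBW_S_ORDERS);
" exceptions NO_MORE_DATA, ERROR_PASSED_TO_MESS_HANDLER.

  STATICS sv_cursor TYPE cursor.

  IF i_initflag = 'X'.
    " Initialization call: validate the request; a real extractor would
    " also store I_T_SELECT / I_T_FIELDS to build a dynamic WHERE clause.
    IF i_dsource <> 'ZBW_ORDERS'.
      RAISE error_passed_to_mess_handler.
    ENDIF.
  ELSE.
    IF sv_cursor IS INITIAL.
      OPEN CURSOR WITH HOLD sv_cursor FOR
        SELECT vbeln posnr matnr kwmeng FROM vbap.
    ENDIF.
    " Every call returns one data package of at most I_MAXSIZE rows;
    " the BI Service API keeps calling until NO_MORE_DATA is raised.
    FETCH NEXT CURSOR sv_cursor
      INTO CORRESPONDING FIELDS OF TABLE e_t_data
      PACKAGE SIZE i_maxsize.
    IF sy-subrc <> 0.
      CLOSE CURSOR sv_cursor.
      CLEAR sv_cursor.
      RAISE no_more_data.
    ENDIF.
  ENDIF.
ENDFUNCTION.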

2.7.1 Delta Management in extraction

The LO Cockpit configuration screen (LBWE) contains the following key parameters:

- Maintenance of extract structures
- Maintaining InfoSources
- Activating the update
- Job control
- Update mode:
  a. Serialized V3 Update: The document data is collected in the order it was created and transferred into BW as a batch job. The transfer sequence is the same as the order in which the data was created.
  b. Direct Delta: The extraction data is transferred directly from the document postings into the BW delta queue. The transfer sequence is the same as the order in which the data was created.
  c. Queued Delta: The extraction data from document postings is collected in an extraction queue, from which a periodic collective run transfers the data into the BW delta queue.
  d. Unserialized V3 Update: The opposite of the Serialized V3 Update: the order in the BW delta queue does not have to be the same as the order in which the data was posted.

2.7.2 Step-by-Step Maintenance

We first need to log on to the R/3 system and call transaction SBIW (Display IMG). There are two setup tables in R/3, which exist and get filled with the logistics data before it is extracted into BW. Navigate as per the screenshot below and delete the data from the setup tables.

Select the application component and execute the deletion of data. Click YES on the popup; an information message will be displayed in the message box.

Call transaction LBWE and navigate as per the screenshot below. Click on Maintain Extract Structure to maintain the fields.

An information message will be displayed. In the above screenshot, click on the Update Overview text to reach the following screen; this takes the user to SM13 for any related table updates. Execute there.

Now go back to the previous screen and click on BW Maintenance Delta Queue. This takes the user to transaction RSA7 to view the delta queues, if any exist.

Click back to reach this pop-up. Now on the main LBWE screen, the user can see a RED status in front of the DataSource. Click on Run; it will prompt for confirmation of the entries in the extract structure. Assign a request so that it generates the extract structure successfully.

Now click on the DataSource as shown below. Assign a request to reach the DataSource screen, where the properties related to the fields can be modified.

As the user assigns this and comes back, the status color changes to YELLOW. Now go to the BW system and replicate the related DataSource from the exact source system.

Now go back to the R/3 system, click on the ACTIVE parameter under Job Control, and assign a request. The status color then turns GREEN, and the user can assign the update mode as well.

Now, in the BW system, create transformations from the DataSource to the InfoProvider. Create an InfoPackage and a DTP to load the data.

2.8 Delta Method

With this process, the new information that is generated daily in the source system must be sent to BW. The principal reason for this is the volume of information that is imported into BW every day. This information is stored in the delta queue (transaction RSA7). The delta queue contains only the information that has not yet been sent to BW, plus the last request; the data is taken from here when BW requests information from the source system. The DataSources that should be used are:

2.8.1 Master Data
0ARTICLE_ATTR
0ARTICLE_TEXT
0CUSTOMER_ATTR
0CUSTOMER_TEXT
0PLANT_ATTR
0PLANT_TEXT
0MATL_GROUP_TEXT
0MAT_VEND_ATTR

8. the user must mark the data source that the user wants to activate and select Activate Data Source.0VENDOR_ATTR 0VENDOR_TEXT 0SALESORG_ATTR 0SALESORG_TEXT 0SALES_DIST_TEXT 0SALES_GRP_TEXT 0SALES_OFF_TEXT 0SALESORG_ATTR 0SALESORG_TEXT 0SALES_DIST_TEXT 0SALES_GRP_TEXT 0SALES_OFF_TEXT 2. .2 TRANSACTIONAL DATA 2LIS_02_SCL 2LIS_03_BF 2LIS_03_BX 2LIS_03_UM 2LIS_13_VDITM 2LIS_40_REVAL In the screen Installation of Data Source from Business Content.

2.8.3 Delta Process

The delta process is a feature of the extractor and specifies how data is to be transferred. As a DataSource attribute, it specifies how the DataSource data is passed on to the data target. From this the user can derive, for example, for which data a DataSource is suited, and how the update and serialization are to be carried out. The delta process controls whether adding or overwriting is permitted.

 Forming deltas with after, before and reverse images that are updated directly in the delta queue (technical name of the delta process in the system: ABR). An after image shows the status after the change, a before image shows the status before the change with a negative sign, and the reverse image also shows the negative sign next to the record while indicating it for deletion. This serializes the data by packet, since the same key can be copied more than once within a request. In this case, adding and overwriting are permitted, so this process supports an update in an ODS object as well as in an InfoCube. This delta process is used by LIS DataSources.

 The extractor delivers additive deltas that are serialized by request (technical name of the delta process in the system: ADD). It only supports the addition of field values. This process supports an update in an ODS object as well as in an InfoCube, but it does not support the direct update of data in an InfoCube: an ODS object must always be in operation when the user updates data in an InfoCube.

 Forming deltas with after images, which are updated directly in the delta queue (technical name of the delta process in the system: AIM/AIMD). An after image shows the status after the change. For numeric key figures, this process only supports overwriting and not adding; otherwise incorrect results would come about. This serialization is necessary since the extractor delivers each key only once within a request, and otherwise changes in the non-key fields would not be copied over correctly. An ODS object must always be in operation when the user updates data in an InfoCube. This process is used in FI-AP/AR for transferring line items, while the variation of the process in which the extractor can also send records with the deletion flag (AIMD) is used in capacity planning in BBP.

2.9 Delta Method Properties

2.9.1 Delta Initialization

In contrast with other Business Content and generic datasources, the LO datasources use the concept of setup tables to carry out the initial data extraction process. The data extractors for HR, FI etc. extract data by directly accessing the application tables, but the LO extractors do not access the application tables directly: the restructuring/setup tables are cluster tables that hold the respective application data. The presence of the restructuring/setup tables prevents the BI extractors from directly accessing the frequently updated, large logistics application tables; they are only used for the initialization of data to BI.

For loading data into the BI system for the first time, the setup tables have to be filled. A job is executed to fill the setup tables, and the init InfoPackage extracts the initial data into BI as a one-time activity. The data can be deleted from the setup tables after successful extraction into BI, to avoid redundant storage. The setup tables in SAP have the naming convention <Extraction structure>SETUP, and the compressed data from the application tables stored there can be viewed through SE11. Thus the datasource 2LIS_11_VAITM, having extract structure MC11VA0ITM, has the setup table MC11VA0ITMSETUP.

2.9.2 Delta Extraction

Once the initialization of the logistics transaction data datasource is successfully carried out, all subsequent new and changed records are extracted to the BI system using the delta mechanism supported by the datasource; the delta queue for the datasource gets automatically generated after successful delta initialization. The LO datasources support the ABR delta mechanism, which is both DSO and InfoCube compatible. The ABR delta creates deltas with after, before and reverse images that are updated directly to the delta queue. The after image provides the status after the change, a before image gives the status before the change with a minus sign, and a reverse image sends the record with a minus sign for deleted records. The serialization plays an important role if the delta records have to be updated into a DSO in overwrite mode. For example, if the quantity of ordered material in sales document 1000 is changed from 10 to 14, the data gets extracted as shown in the table below.
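As an illustration (a sketch only; the actual PSA layout depends on the extract structure, with the record mode flag marking the image type), the change from 10 to 14 produces a before/after image pair in the delta queue:

   Sales document   Record mode          Quantity
   1000             X (before image)     -10
   1000             (blank, after image)  14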

2.9.3 Update Modes

Before elaborating on the delta methods available for the LO datasources, it is necessary to understand the various update modes available for the logistics applications within the SAP ECC 6.0 system. The following three update methods are available:

a) V1 Update
b) V2 Update
c) V3 Update

While carrying out a transaction, for example the creation of a sales order (VA01), the user enters data and saves the transaction. From a logistics application perspective, the data entered by the user is directly used for creating the order; at the same time, since the logistics applications have an integrated controlling aspect, it also indirectly forms part of the information for management reporting. The SAP system treats these two events, generated by the creation of the order, with different priorities by using two different update modes: the V1 update and the V2 update, with the former being the time-critical activity. The creation of the order takes a higher priority than the result calculations triggered by the entry; the latter are often termed statistical updates. Apart from these two update modes, SAP also supports a collective run, called the V3 update, which carries out updates in the background. Whether a delta is generated for a document change is determined by the LO application.

The type of delta provided by the LO datasources is a push delta, i.e. the delta data records from the respective application are pushed to the delta queue before they are extracted to BI as part of the delta update. This is a very important aspect of the logistics datasources, as the very program that updates the application tables for a transaction also triggers/pushes the data for the information systems. The update modes are discussed separately below.

2.9.3.1 V1 Update

A V1 update is carried out for critical or primary changes, which affect objects with a controlling function in the SAP system, for example the creation of a sales order (VA01). These updates are time-critical, synchronous updates. With V1 updates, the program that outputs the statement COMMIT WORK AND WAIT waits until the update work process returns the status of the update; the program then responds to errors separately. The V1 updates are executed under the SAP locks of the transaction that creates the update, thereby ensuring consistency of data and preventing simultaneous updates. The most important aspect is that V1 synchronous updates can never be processed a second time. During the creation of an order, the V1 update writes data into the application tables and the order gets processed. The V1 updates are carried out as a priority, in contrast to V2 updates, though the V2 updates are usually also processed straight away. The V1 updates are processed sequentially in a single update work process and belong to the same database LUW.

2.9.3.2 V2 Update

A V2 update, in contrast to V1, is executed for less critical, secondary changes; these are pure statistical updates resulting from the transaction. They are asynchronous in nature and are carried out in a separate LUW, not under the locks of the transaction that creates them. They are often executed in the work process specified for V2 updates. If this is not the case, the V2 components are processed by a V1 update process, but the V1 updates must be processed before the V2 updates.

2.9.3.3 V3 Update

Apart from the above-mentioned V1 and V2 updates, the SAP system also has another update method, called the V3 update, which consists of collective run function modules. Compared to the V1 and V2 updates, the V3 update is a batch asynchronous update, which is carried out when a report (RSM13005) starts the update in background mode. Unlike the V1 and V2 updates, the V3 update does not happen automatically. All function module calls are collected, aggregated and updated together, and are handled in the same way as V2 update modules. If one of the function modules increments a statistical entry by one, this is called up 10 times during the course of the transaction. Implemented as a V2 update, the statistics update runs 10 times after the V1 update has been completed, i.e. the database is updated 10 times. But when executed as a V3 update, the update can be executed at any time in one single operation, with the update being carried out in one database operation at a later point in time. This largely reduces the load on the system.

2.10 Delta Queue Functions

The LO datasources implement their delta functionality using the above update methods, either individually or in combination. SAP provides different mechanisms for pushing the data into the delta queue; these are called update modes. The update modes available with the LO datasources are:

a. Direct Delta

b. Queued Delta
c. Un-serialized V3 Delta

The figure below shows the update modes for the delta mechanism from the LO Cockpit. The following sections discuss in detail the concept behind the three delta update modes, the scenarios in which they are used, and their advantages and disadvantages.

2.10.1 Direct Delta (V1 update)

A direct delta updates the changed document data directly, as an LUW, to the respective delta queues. A logistics transaction posting leads to an entry in the application tables, and the delta records are posted directly to the delta queue using the V1 update. The data available in the delta queue is then extracted periodically to the BI system. When using this update mode, no document postings should be carried out during delta initialization in the concerned logistics application, from the start of the recompilation run in the OLTP until all delta init requests have been successfully updated in BW.

Advantages of direct delta:
a. Writing to the delta queue within the V1 posting process ensures serialization by document.
b. Extraction is independent of V2 updating.
c. No additional monitoring of update data or of an extraction queue is required.
d. Recommended for customers with a small number of documents.

Disadvantages of direct delta:
a. Not suitable for scenarios with a high number of document changes.
b. Setup and delta initialization are required before document postings are resumed; the data from documents posted during the re-initialization process is completely lost.
c. V1 is more heavily burdened.

2.10.2 Queued Delta (V1 + V3 updates)

In the queued delta update mode, the logistics application pushes the data from the concerned transaction into an extraction queue by means of the V1 update. The data is collected in the extraction queue, and a scheduled background job transfers it to the delta queue with an update collection run, in a similar manner to the V3 update. Depending on the concerned application, up to 10,000 delta extractions of documents can be aggregated into one LUW in the delta queue for a datasource. The data pushed by the logistics application can be viewed with the logistics queue overview function in the SAP ECC 6.0 system (transaction LBWQ). The collection job uses report RMBWV311, and the collection function modules have the naming convention MCEX_UPDATE_<application>, e.g. MCEX_UPDATE_11 for sales orders. SAP recommends the queued delta process for customers with a high volume of documents (more than 10,000 document changes, i.e. creations, changes or deletions, performed each day for the application in question), with the collection job for extraction from the extraction queue scheduled on an hourly basis.

2.10.2.1 Benefits

 When the user needs to perform a delta initialization in the OLTP (that is, when the setup tables are filled and a delta init request is posted in BW), the document postings relevant for the involved application can be opened again as soon as the execution of the recompilation run (or runs, if several are running in parallel) ends. Thanks to the logic of this method, the system is able to collect new document data during the delta init upload too (with one strong recommendation: remember to avoid the update collective run before all delta init requests have been successfully updated in the user's BW!).
 In contrast to direct delta, the collection of new document data during the delta initialization request can reduce the downtime of the restructuring run.
 By writing to the extraction queue within the V1 update process (which is thereby more burdened than by using V3), serialization is ensured by using the enqueue concept; the collective run clearly performs better than the serialized V3, and in particular the slow-down caused by documents posted in multiple languages does not apply in this method.
 The entire extraction process is independent of the V2 update process.
 In contrast to the V3 collective run, event handling is possible here, because a definite end of the collective run is identifiable: when the collective run for an application ends, an event (&MCEX_nn, where nn is the number of the application) is automatically triggered, and it can thus be used to start a subsequent job.

2.10.2.2 Limits

 V1 is more heavily burdened compared to V3.
 Administrative overhead of the extraction queue.

2.10.3 Un-serialized V3 Update (V1/V2 + V3 updates)

In this mode of delta update, the concerned logistics application writes data to update tables, which in turn transfer the data to the delta queue by means of a collective run call, the V3 update. Once the data is updated to the update tables by the logistics application, it is retained there until it is read and processed by a collective update run, a scheduled background job (the V3 update job), which updates all the entries in the update tables to the delta queue.

As the name suggests, this update is un-serialized, i.e. this mode of update does not ensure serialization of the documents posted to the delta queue. This means that the entries in the delta queue need not correspond to the actual sequence of updates that happened in the logistics application. This is important if the data from the datasource is further updated to a DSO in overwrite mode, as the last entry would overwrite the previous entries, resulting in erroneous data. An un-serialized delta update, where used, should therefore always update data either to an InfoCube or to a DSO with key figures in summation mode. It is also advisable to avoid the un-serialized V3 update for documents that are subject to a large number of changes when it is necessary to track those changes.
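A short worked example (hypothetical values): suppose the quantity in a sales item is changed from 10 to 12 and, shortly afterwards, from 12 to 15. If the collective run delivers the second change before the first, a DSO in overwrite mode ends up showing 12, although the source document holds 15. Because the LO datasources also deliver before images with reversed signs, a DSO with key figures in summation mode posts only the net changes, (-10, +12) and (-12, +15), so the final total of 15 is reached regardless of the order in which the records are processed; the same holds for an InfoCube, which is always additive.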

2.11 Generic extraction
Generic R/3 data extraction allows us to extract virtually any R/3 data. Generic data extraction is a function in Business Content that supports the creation of DataSources based on database views or InfoSet queries; an InfoSet is similar to a view but allows outer joins between tables. The generic delta service supports delta extractors on monotonic 'delta attributes' like timestamp, calendar day, or a numeric pointer (e.g. document number, counter); the attribute must be strictly monotonically increasing with time, and only one attribute can be defined as the delta attribute. For extracting data from the VBAK table, the Logistics Extraction Cockpit is the recommended method rather than a generic extractor.
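For the function-module-based variant of generic extraction, the extractor has to implement the interface of the SAP template function module RSAX_BIW_GET_DATA_SIMPLE (the function module name is what the data element ROFNAME_S mentioned earlier refers to). The following is a minimal sketch only: the names Z_BIW_GET_DATA_SALES, ZSALES and ZOE_SALES are hypothetical, and the selections passed in I_T_SELECT are ignored for brevity.

   FUNCTION z_biw_get_data_sales.
   *"------------------------------------------------------------------
   *" Interface copied from the SAP template RSAX_BIW_GET_DATA_SIMPLE:
   *"  IMPORTING  VALUE(I_REQUNR)   TYPE SRSC_S_IF_SIMPLE-REQUNR
   *"             VALUE(I_DSOURCE)  TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
   *"             VALUE(I_MAXSIZE)  TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
   *"             VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
   *"  TABLES     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
   *"             I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
   *"             E_T_DATA   STRUCTURE ZOE_SALES OPTIONAL
   *"  EXCEPTIONS NO_MORE_DATA  ERROR_PASSED_TO_MESS_HANDLER
   *"------------------------------------------------------------------
     STATICS: s_cursor TYPE cursor,   " database cursor kept across calls
              s_first  TYPE c.        " 'X' until the first data call

   * Initialization call: BW only hands over the request parameters
   * here; no data may be returned yet.
     IF i_initflag = 'X'.
       s_first = 'X'.
       EXIT.
     ENDIF.

   * First data call: open a cursor on the source table (ZSALES is a
   * hypothetical application table whose line type matches ZOE_SALES).
     IF s_first = 'X'.
       CLEAR s_first.
       OPEN CURSOR WITH HOLD s_cursor FOR
         SELECT * FROM zsales.
     ENDIF.

   * Every subsequent call returns one package of at most I_MAXSIZE rows.
     FETCH NEXT CURSOR s_cursor
           APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
           PACKAGE SIZE i_maxsize.
     IF sy-subrc <> 0.
       CLOSE CURSOR s_cursor.
       RAISE no_more_data.            " signals BW that extraction is done
     ENDIF.
   ENDFUNCTION.

BW first calls the module once with I_INITFLAG = 'X' to transfer the request parameters, and then calls it repeatedly to fetch data packages until NO_MORE_DATA is raised.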

2.11.1 Create Generic Extraction [Master Data]

1. Under transaction SBIW: this step gives the user the option of creating and maintaining generic DataSources for transaction data, master data attributes or texts, from any kind of transparent table, database view or SAP Query functional area, or via a function module, regardless of the application. This enables the user to use the generic extraction of data.

2. Create a generic DataSource: a) Select the DataSource type and assign a technical name to it. b) Choose Create. The screen for creating a generic DataSource appears.

3. a) Choose an application component to which the DataSource is to be assigned. b) Enter the descriptive texts; the user can choose these freely. c) Choose Generic Delta.

4. Specify the delta-specific field and the type of this field, and maintain the settings for the generic delta. Specify a safety interval: the safety interval should be set so that no document is missed, even if it had not yet been stored in the DB table when the extraction took place.
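A small worked example (assumed values): if the delta-relevant field is a timestamp, the previous delta ran up to 10:00:00, and a lower safety interval of 300 seconds is set, the next delta selects records from 09:55:00 onwards instead of from 10:00:00. A record that was committed late, with a timestamp of 09:58:30, is then still extracted. The overlap means some records are extracted twice, which is why a lower safety interval should be combined with an after-image delta (AIE) and an overwriting data target such as an ODS; with an additive delta (ADD), the overlap would duplicate key figure values.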

5. Select the delta type: New status for changed records (i.e. after image, AIE), which can be used with an ODS as the data target, or Additive delta (i.e. aggregated data records, ADD). Then choose Save.

6. After step 4, the screen of step 3 comes back. Now choose Save again.

7. This generates the DataSource. Choose Save again. After generating the DataSource, the user will see the Delta Update flag selected. The delta attributes can be monitored in the delta queue (RSA7); in systems as of basis release 4.0B, the user can display the current value of the delta-relevant field in the delta queue. Delta extraction is enabled by the data selection logic: whenever a delta is extracted, the extracted data is also stored in the delta queue tables, to serve as a fallback when an error occurs during the update of the BW system. The user will then see a '1' in this field (the extract counts as one LUW), and the records can even be displayed in a detail screen. Note that the LUW count does not equal the number of changed records in the source table: most of the time it will be ZERO, though it can also have the value 1.

2.12 Generic Data Types

2.12.1 Master Data

In SAP Business Information Warehouse (SAP BW), three different types of master data can be differentiated in InfoObjects: attributes, texts and hierarchies.

2.12.1.1 Attributes
Master data attributes are fields that are used to provide a more detailed description of master data elements. These attributes are used to display additional information so that results can be better understood. An attribute table can be used by several InfoCubes. This ensures a higher level of transparency for the user and a more comprehensive consistency. An example of a master data attribute is the country of the supplier that goes with the supplier number.

2.12.1.2 Texts
Texts are used to describe a master record. In SAP BW, up to three texts can be maintained for each master record: one short text, one medium text and one long text. An example of a master data text is the name of the supplier that goes with the supplier number.

2.12.1.3 Hierarchies
Hierarchies can be used in the analysis to describe alternative views of the data. A hierarchy consists of a quantity of nodes that have a parent-child relationship with one another. The structures can be defined in a version-specific as well as a time-dependent manner. An example of this is the cost center hierarchy.

2.12.2 Functions

2.12.2.1 Time-dependent Attributes
If the characteristic has at least one time-dependent attribute, a time interval is specified for this attribute. As master data must exist in the database for the entire period between 01.01.1000 and 12.31.9999, the gaps are filled automatically.

2.12.2.2 Time-dependent Texts
If the user creates time-dependent texts, the text for the key date is always displayed in the query.

2.12.2.3 Time-dependent Texts and Attributes
If both the texts and the attributes are time-dependent, the time intervals do not have to agree.

2.12.2.4 Language-dependent Texts
In the characteristic InfoObject maintenance, the user can determine whether the texts are language-specific (for example, product names: German 'Auto', English 'car') or not (for example, customer names). If they are language-dependent, the user has to upload all texts with a language indicator. Only the texts in the selected language are displayed.

2.12.3 Transactional Data
Any business event in SAP R/3 leads to transaction data being generated. SAP R/3 handles transaction data at the document level or at the summary level. Document-level transaction data is detailed data and consists of headers, line items, and schedule lines. Summary transaction data is used in SAP R/3 mostly for reporting purposes. Transaction data includes origination data, monthly account data, competitive data, and macroeconomic data. It is regularly updated and loaded into the system. When loading transactional data, the system utilizes elements of the master data to process the transactions.

Generic extractors are of three types:
1. Based on a table/view
2. Based on an InfoSet query
3. Based on a function module

2.13 Generic DataSources

A generic DataSource is defined by an extraction structure and a transfer structure on the source system side (e.g. SAP R/3) and a transfer structure on the BW side.

ES -> what data is to be extracted from the source system.
TS -> what data is to be transferred from the source system to BW; the transfer structure is a group of fields that indicates how the data comes in from the source system to BW.

Data that logically belongs together is stored in the source system in the form of DataSources. A DataSource consists of a quantity of fields that are offered for data transfer into BI. Additionally, the DataSource describes the properties of the associated extractor with regard to data transfer to BI. These fields can be enhanced as well as hidden (or filtered) for the data transfer. DataSources make the source system data available to BI on request in the form of the (if necessary, filtered and enhanced) extraction structure; the DataSource is technically based on the fields of this structure. Upon replication, the BI-relevant properties of the DataSource are made known in BI. DataSources are used for extracting data from an SAP source system and for transferring data into BI.

2.13.1 Extraction Structure

In the extraction structure, data from a DataSource is staged in the source system. It contains the set of fields that are offered by an extractor in the source system for the data loading process. The user can edit DataSource extraction structures in the source system, using transaction SBIW.

2.13.2 Editing the DataSource in the Source System

The user can edit DataSources in the source system. This means completing the extraction structure, i.e. filtering the extraction structure and/or enhancing the DataSource with fields. In the source system, the user can determine the DataSource fields in which extraction structure fields are hidden from the transfer. In this way, the user avoids generating too many DDIC objects unnecessarily as long as the DataSource is not yet being used, that is, as long as a transformation does not yet exist for the DataSource.

In the DataSource maintenance in BI, the user determines which fields from the DataSource are actually transferred. Data is transferred into the input layer of BI, the Persistent Staging Area (PSA); data transfer processes facilitate the further distribution of the data from the PSA to other targets. In the transformation, the user determines what the assignment of fields from the DataSource to InfoObjects from BI should look like; the rules that the user sets in the transformation apply here. There is no implicit assignment of objects with the same names: the assignment of source system objects to BI objects takes place exclusively and centrally in BI.

2.13.3 Replication of DataSources

In the SAP source system, the DataSource is the BI-relevant metaobject that makes source data available in a flat structure for data transfer into BI. In the source system, a DataSource can have the SAP delivery version (D version: object type R3TR OSOD) or the active version (A version: object type R3TR OSOA). The metadata from the SAP source systems is not dependent on the BI metadata. Replication allows the user to make the relevant metadata known in BI so that data can be read more quickly. In particular, information is only retained if it is required for data extraction.

2.13.3.1 Replication Process Flow

In the first step, the D versions are replicated: only the DataSource header tables of BI Content DataSources are saved in BI as the D version. Replicating the header tables is a prerequisite for collecting and activating BI Content. In transaction SBIW in the source system, choose Business Information Warehouse → Subsequent Processing of DataSources.

In the second step, the A versions are replicated: DataSources (R3TR RSDS) are saved in the M version in BI with all relevant metadata, and 3.x DataSources (R3TR ISFS) are saved in BI in the A version with all the relevant metadata. If, during replication, the system determines that the D version of a DataSource in the source system, or the associated BI Content (shadow objects of the DataSource, R3TR SHDS, or shadow objects of the mapping, R3TR SHMP), is not or is no longer available, the system automatically deletes the D version in BI.

2.13.3.2 Deleting DataSources during Replication

DataSources are only deleted during replication if the user performs the replication for an entire source system or for a particular DataSource. When the user replicates the DataSources for a particular application component, the system does not delete any DataSources, because they may have been assigned to another application component in the meantime.

2.13.3.3 Automatic Replication during Data Request

The user can use a setting in the InfoPackage maintenance, under Extras → Synchronize Metadata, to define that automatic synchronization of the metadata in BI with the metadata in the source system takes place whenever there is a data request. If this indicator is set, the DataSource is automatically replicated from BI upon each data request if the DataSource has changed in the source system.

2.14 Enhancing Business Content

SAP provides Business Content, which is used in BW for extraction, reporting and analysis. The steps involved in using the Business Content and generating reports are:

1. Log on to the SAP R/3 server.
2. Install the Business Content. The transaction code (Tcode) is RSA5.

3. Check whether the Business Content is ready for use. Tcode RSA6.
4. Enhance the structure. Here I am using BOM (Bill of Materials) from the table VBAP in Sales and Distribution (SD); it should be mapped into 0FI_GL_4 from VBAP when the condition is satisfied. Enhance the field BOM into the structure.
5. Activate the enhanced structure.
6. A user exit is written for this purpose. Include the following code in the user exit:

   TABLES: AUFK, VBAP.

   DATA: Temp_ZZORDERCAT       LIKE DTFIGL_4-ZZORDERCAT,
         Temp_KDAUF            LIKE DTFIGL_4-KDAUF,
         Temp_ZZSEQUENCENUMBER LIKE DTFIGL_4-ZZSEQUENCENUMBER,
         Temp_ZZBOM            LIKE DTFIGL_4-ZZBOM,
         Temp_sorder           LIKE DTFIGL_4.

   CASE i_datasource.
     WHEN '0FI_GL_4'.
       LOOP AT C_t_data INTO Temp_sorder.
*        Read order category, sales order and sequence number of the
*        order belonging to the current line item.
         SELECT SINGLE AUTYP FROM AUFK INTO Temp_ZZORDERCAT
           WHERE AUFNR = Temp_sorder-AUFNR AND BUKRS = Temp_sorder-BUKRS.
         SELECT SINGLE KDAUF FROM AUFK INTO Temp_KDAUF
           WHERE AUFNR = Temp_sorder-AUFNR AND BUKRS = Temp_sorder-BUKRS.
         SELECT SINGLE SEQNR FROM AUFK INTO Temp_ZZSEQUENCENUMBER
           WHERE AUFNR = Temp_sorder-AUFNR AND BUKRS = Temp_sorder-BUKRS.
*        Read the BOM from the sales document item.
         SELECT SINGLE STLNR FROM VBAP INTO Temp_ZZBOM
           WHERE GSBER = Temp_sorder-GSBER AND AUFNR = Temp_sorder-AUFNR.
         Temp_sorder-ZZORDERCAT       = Temp_ZZORDERCAT.
         Temp_sorder-KDAUF            = Temp_KDAUF.
         Temp_sorder-ZZSEQUENCENUMBER = Temp_ZZSEQUENCENUMBER.
         Temp_sorder-ZZBOM            = Temp_ZZBOM.
         MODIFY C_t_data FROM Temp_sorder.
       ENDLOOP.
   ENDCASE.

7. Activate the user exit. A field Bill of Material (ZZBOM) from VBAP (SD) has now been enhanced into the 0FI_GL_4 structure. The Business Content has been installed and activated.
8. Log on to the BW server.
9. Create an InfoObject for the enhanced field ZZBOM.

10. Replicate the DataSource from SAP R/3 to the BW server.
11. Create an ODS.
12. Assign the DataSource.
13. Assign the objects corresponding to the key fields and data fields.
14. Map the fields manually; since ZZBOM is an enhancement, it should be mapped to the InfoObject created by us.

15. Create an InfoSource and an InfoPackage.
16. Schedule the load. The data is extracted from SAP R/3 when we schedule the load from the InfoPackage.
17. The data is then loaded into the ODS, which is the data target for the InfoPackage; the load can be monitored. The data load is successful.
18. Create update rules for the ODS.
19. Create an InfoCube. Create the relevant dimensions for the cube and assign the corresponding characteristics, time characteristics and key figures.
20. Activate the InfoCube and create update rules in order to load the data from the ODS.

21. Update the ODS; the InfoCube is loaded successfully.
22. Manage the InfoCube to view the available data. The data is available in the InfoCube.
23. As per the requirement, the General Ledger InfoCube is created and ready for reporting.
24. Create a new query using the Query Designer (Query Designer for 0FI_GL_4), for example a General Ledger balance sheet for the periods 1995, 1996 and 1997.

3. Extraction with Flat Files

The SAP Business Information Warehouse allows the user to analyze data from operative SAP applications as well as from all other business applications and external data sources such as databases, online services and the Internet. SAP BW enables Online Analytical Processing (OLAP), which processes information from large amounts of operative and historical data; OLAP technology enables multi-dimensional analyses from various business perspectives. The Business Information Warehouse server for core areas and processes, pre-configured with Business Content, ensures that the user can look at information within the entire enterprise. In selected roles in a company, Business Content offers the information that employees need to carry out their tasks. As well as roles, Business Content contains other pre-configured objects such as InfoCubes, queries, key figures and characteristics, which make BW implementation easier.

3.1 Data from Flat Files (7.0)

BI supports the transfer of data from flat files, that is, files in ASCII format (American Standard Code for Information Interchange) or CSV format (Comma Separated Value). For example, if budget planning for a company's branch offices is done in Microsoft Excel, this planning data can be loaded into BI so that a plan-actual comparison can be performed. The data for the flat file can be transferred to BI from a workstation or from an application server.

1. The user creates a DataSource in BI, defining the metadata for the file in BI.
2. The user defines a file source system.
3. The user creates an InfoPackage that includes the parameters for data transfer to the PSA, determines the parameters for the data transfer, and schedules the data request.

The user can establish if, and which, delta processes are supported during maintenance of the transfer structure. With additive deltas, the extracted data is added in BW; DataSources with this delta process type can supply both ODS objects and InfoCubes with data. During transfer of the new status for modified records, the values are overwritten in BW; DataSources with this delta process type can write the data into ODS objects and master data tables.

3.2 Data from Flat Files (3.x)

For flat files, the DataSource is maintained manually in SAP BW; the definition and updating of the metadata take place in the DataSource maintenance of BI. The transfer of data to SAP BW takes place via a file interface. The structure of the flat file and the metadata (transfer structure of the DataSource) defined in SAP BW have to correspond to one another to enable correct data transfer. Make especially sure that the sequence of the InfoObjects corresponds to the sequence of the columns in the flat file. The user can find more information about this, as well as about creating InfoSources for flat files, under Flexibly Updating Data from Flat Files, Updating Master Data from a Flat File, and Uploading Hierarchies from Flat Files, and under Maintaining InfoPackages: Procedure for Flat Files.

3.3 Extracting Transaction and Master Data using Flat Files

Go to transaction RSO2, select the radio button Transaction data, give it a technical name and click on the Create button.

Give the APPLICATION COMPONENT and TABLE NAME and enter the descriptions. Then click on the SAVE button.

Select the fields to be extracted; otherwise click on Hide. For transaction data, KEY FIGURES and their REFERENCE fields are compulsory, so include the key figures along with the reference values. Then click on the SAVE button. Next, go to the BW side and replicate the user DataSource by selecting, in the source system tab, the application component that we already assigned on the R/3 side.

-> Click on Assign InfoSource in the context menu of the replicated DataSource. -> Create the InfoSource with flexible update.

-> Assign the transfer rules by selecting, via the F4 help, the necessary InfoObject for each and every extracted field, and then click on Activate.

-> Go to InfoProvider and create an InfoCube.




-> After creating the InfoCube, create update rules for that InfoCube.


-> Go to the InfoSource, select the user DataSource, and create an InfoPackage.


3.4 Data Types that can be extracted using Flat Files

Flat files are data files that contain records with no structured relationships. This type of file contains no inherent information about the data, and interpreting it requires additional knowledge, such as the file format properties. Modern database management systems use a more structured approach to file management (such as one defined by the Structured Query Language) and therefore have more complex storage arrangements; for this reason, a plain data file is referred to as a flat file. Typical flat file extensions are .txt, .csv, .lst and .lis; for example, .csv is a comma-separated flat file. Many database management systems offer the option to export data to comma-delimited files.

3.4.1 Basic Steps of Data Flow (ETL process)

1. Determine the source system (flat file/SAP R/3/web services etc.).
2. Establish an interface w.r.t. the source system used.
3. Create a DataSource that has a structure similar to the source structure. Consider two factors: the name of the file, and a structure that is the same as the source.
4. Create an InfoPackage to transfer data from the DataSource to the PSA (Persistent Staging Area).
5. Create a target (InfoProvider: InfoObjects/InfoCubes/DSO etc.).
6. Create a transformation rule to map source and target fields.
7. Create a DTP (Data Transfer Process) to transport the data stored within the PSA to the final data target.

3.4.2 Step-by-Step to upload Master Data from Flat File to InfoObjects

Transaction code RSA1: Data Warehousing Workbench.

Source/Input: Create a flat file of type CSV, with the following structure:

Customer  Language  Land  City     Name
260       EN        US    SEATTLE  JOHN
261       DE        DE    HAMBURG  SAM
262       DE        DE    WALDORF  MICHEAL

Right-click on the InfoProvider column header and select "Create InfoArea".
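Saved as a CSV file (assuming a comma as the separator and one header row, matching the DataSource settings used later in this walkthrough), the input file would look like this:

Customer,Language,Land,City,Name
260,EN,US,SEATTLE,JOHN
261,DE,DE,HAMBURG,SAM
262,DE,DE,WALDORF,MICHEAL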

Right click on the created InfoArea and create InfoObject Catalog .Enter the details – InfoArea name and Description The InfoArea – IA for Customer appears on the InfoProvider list.

Enter a suitable name and description for the InfoObject Catalog. Select "Char." or "Key Figure" based on the type of data (master data/transaction data) that needs to be created. In this case we are creating master data; hence we need to create a characteristics InfoObject, so select "Char." and click on Create. The InfoObject Catalog appears right under the InfoArea created earlier. Now right-click on the InfoObject Catalog and create an InfoObject.

In the pop-up for the InfoObject, enter the characteristic name and description. In the right window, the properties of the characteristic "CUSTNO" are visible.

 In the General tab, enter the data type and length. Note: the "Attribute Only" field is not checked. The first characteristic InfoObject created is treated as the key value of the master data table to be created; this means that this characteristic is a primary key field in the master data table. If the box were checked, it would instead become an attribute of another characteristic key field, i.e. just another normal field in a master data table.
 In the Business Explorer tab, no changes are to be made.

 In the Master data/texts tab:
   Check "With Master data", to create a master data table with the naming convention /BIC/P<characteristic name>.
   Check "With Texts", to create a corresponding text table for this master data table with two default fields, Language and Description. Select "short/medium/long texts" and also "texts language dependent".
 Select "Char. is InfoProvider" (as our InfoObject is our target in this scenario) and enter the InfoArea in the box below.
 In the Attributes tab, enter the fields (non-key fields) that need to be maintained within the master data table. Ex: Customer Id is the primary key field; Land, City and Name are attributes.

Save + Activate => the P table gets created.

On entry of each field, hit [Enter]; in the pop-up box, select the 1st option, "Create Attribute as Characteristic". Enter the data type and the length. [Enter] Repeat the same for the other fields too.

[Activate]: activate all dependent InfoObjects.

Create the file interface: Source Systems  File  Create <new file system>, or use the existing file source system.

DataSource creation/selection: on the Navigator tab, select DataSource. Click on the "[X]" button to choose the source system.

The technical name of Application component APC_CUSTOMER now gets saved as => ZAPC_CUSTOMER Right click on the Application component and create DataSource . Ex: Name: APC_CUSTOMER Click ok.Choose the Source System  File  <Flat File source system name> Right click on the header and Create Application Component. Enter the name and description for Application Component.

Enter a suitable DataSource name: DS_Cust. Source system: flat file. Select the data type of the DataSource: Master Data Attributes, as we are creating a master data table. Note: select Master Data Attributes while uploading master data from the file (DS1); select Master Data Texts while uploading text descriptions for the master data records (DS2).

In the right window, the user can view the DataSource properties; the details can be viewed in the pane below. In a DataSource, we need to mention two details: the name of the flat file (ASCII or CSV) to be uploaded, and the structure (layout) of the input file, mentioning the fields.

 General Info tab: provides information on the DataSource created.
 Extraction tab: provides details of the file to be extracted. Make the following changes:
   Select the name of the file using F4; search for the file on the user's local desktop.
   Header rows to be ignored = "1" (if the file has a header row with column titles) or "0" (if the file has no header row).
   Select the data format "Separated with Separator (for example, CSV)" and enter the data separator as ",".
 Proposal tab: provides a quick view of the data in the flat file to be uploaded. Click on the "Load Example" tab.

 Fields tab: provides details of all the fields used to create the data structure. In the Template InfoObject field, enter the field names (characteristics/attributes) to map with the fields from the source file. Note: the fields 0LANGU and 0TXTSH are standard SAP-defined characteristics InfoObjects. [Enter] or [Save]. [Copy]: all properties of the characteristics (e.g. data types, length etc.) are copied here. [Save] [Activate]
 Preview tab: gives a preview of the data from the flat file loaded into the structure. Click on "Read Preview Data".

InfoPackage: An InfoPackage helps to transport data from the DataSource structures into the PSA tables. (This is similar to the transfer of data from work areas into internal tables in R/3 ABAP.) Right-click on the DataSource and create an InfoPackage. Enter the InfoPackage name, IP_Cust, and a suitable description; select the corresponding DataSource from the list below and [Save]. In the right window, we have the InfoPackage properties:

 Data Selection tab: to initiate the loading of data into the InfoPackage. Select "Start Data Load Immediately" and click on [Start].
 Then click on the [Monitor] button next to "Process Chain Mgmt", or [F6].

In the next screen, we can view the status of the data loading to the PSA. Click on [PSA Maintenance] or [Ctrl + F8] to maintain the PSA table with the flat file values. In the pop-up defining the number of records per request, click OK. This leads to the PSA display screen, with the table and the respective data in it. Click on [Back] to come back to the DataSource screen.

Transformation: This is a rule that is defined for mapping the source fields in the DataSource to the final target (basically, the InfoProviders from which BI extracts the reports). Note: normally, the transformation can be created on the DataSource or on the target. Here, we are creating a transformation on the DataSource. Right-click on the DataSource  Create Transformation.

In the pop-up:
 For the Target of the Transformation:
   Enter Object Type: InfoObject (this is our target in this scenario).
   Enter Subtype of Object: Attributes (as we are considering master data only, not transactional data).
   Enter the name of the InfoObject we created, which is the InfoObject (master data table) that has already been created.
 For the Source of the Transformation:
   Enter Object Type: DataSource (this is our source structure).
   Enter the name of the DataSource used.
   Enter the source system used.
Click on OK.

Note: while uploading master data, select subtype of object "Attributes" (IP1 for DS1); while uploading text descriptions, select subtype of object "Texts" (IP2 for DS2).

In the next screen pop-up, if the mapping is done correctly, we get the number of proposals (rules) generated to map the source fields to the target fields. Click on OK. In the next window, we can see a graphical representation of the mapping of source fields to target fields. Save and Activate.

Data Transfer Process (DTP): The DTP is a process used to transfer data from the PSA tables into the data targets, based on the transformation (mapping) rules. DataSource  right-click  Create Data Transfer Process.

In the pop-up:
 The DTP name is described as: <DataSrc/SourceSys -> InfoObject>
 DTP type: Standard
 Target of DTP: Object Type: InfoObject (in this case); Subtype of Object: Attribute; Name: <name of InfoObject>
 Source of DTP: Object Type: DataSource; DataSource: <name>; Source System: flat file
Click on OK. In the DTP properties window:
 Extraction tab: select Extraction Mode as "Full". [Save] [Activate]

 Update tab: no changes to be made.
 Execute tab: click on the [Execute] button.

Click on [Refresh] until the yellow icons turn green. In case a red icon appears, we need to track the error and rectify it. Go back. To view the contents of the target (InfoObject), there are two ways:

1. Go to the InfoObject CUSTNO  Master data/texts tab  double-click on the master data table created (/BIC/P<char name>)  Table display  Execute.

2. On the Transformation/DTP  Attribute  right-click  Manage  in the right window, Contents tab  [Contents]  F8.

The text table data display is language-dependent. For Lang = 'EN', the corresponding short description is displayed.

For Lang = 'DE', the corresponding short description is displayed.

Observations: In the above master data table display, the fields Customer Name, Customer Land and Customer City have been created and loaded with data as per the file. These three fields were created as a result of checking "With Master data" for the characteristic. The contents of the master data are stored in the newly created transparent table /BIC/P<char. name>. The fields Language and Description have likewise been created and loaded with data as per the file. These two fields were created by default, as a result of checking "With Texts" for the characteristic. The contents of the text data are stored in the newly created transparent table /BIC/T<char. name>.
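As a sketch (assuming the characteristic is named CUSTNO and the attributes Land, City and Name were created as /BIC/ characteristics), the generated tables would look roughly like this:

/BIC/PCUSTNO (master data table)
   /BIC/CUSTNO   (key)
   OBJVERS       (key, object version)
   CHANGED       (change flag)
   /BIC/LAND
   /BIC/CITY
   /BIC/NAME

/BIC/TCUSTNO (text table)
   /BIC/CUSTNO   (key)
   LANGU         (key, language)
   TXTSH         (short description)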

4. DB Connect
4.1 Introduction
The DB Connect enhancements to the database interface allow the user to transfer data straight into BI from the database tables or views of external applications. The user can use tables and views in database management systems that are supported by SAP to transfer data, and uses DataSources to make the data known to BI. The data is processed in BI in the same way as data from all other sources.

4.2 Loading Data from an SAP-supported DBMS into BI

Using DB Connect, in addition to extracting data using the standard connection, SAP provides options for extracting data from external systems: the user can extract data from tables/views in other database management systems (DBMS). There are two types of classification: one is the BI DBMS and the other is the source DBMS. SAP DB Connect only supports certain DBMS:

 Max DB [previously SAP DB]
 Informix
 Microsoft SQL Server
 Oracle
 IBM DB2/390, IBM DB2/400, IBM DB2 UDB

These DBMS are supported on their respective operating system versions only if SAP has released a DBSL; if not, they do not meet the requirements and hence cannot perform DB Connect. The steps are as follows:

1. Connecting a database to the source system: in this process, we use a DataSource and, with the usual data acquisition process, transfer data from the DB into the BI system, to make the data available to BI and transfer it to the respective InfoProviders defined in the BI system.
2. Direct access to the external DB using a DataSource.

4.2.1 Process Description

Go to RSA1 → Source Systems → DB Connect → Create.

Now, create the source system using:
1. Logical System Name → MSSQL
2. Source System Name → MS SQL DB Connect
3. Type & Release

Under DB Connect, we can now see the name of our source system (MS SQL DB Connect); the logical DB Connect name is MSSQL.

In DataSources, we need to create an application component area to continue with the export: go to RSA1  DataSources  Create Application Component. After creating an application component area called "ac_test_check", we now have to create a DataSource in that component area. So right-click the application component area → Create DataSource (as in the figure below).

we can specify the Table/View here .The Data source name here is “ds_ac_tech” The Source System here is the defined “MSSQL” The type of data type data source that we have here is “Master Data Attributes” The below screen shot describes how to perform extraction or loading using a Table/View. As the standard adapter is “Database Table” (by default).

Now, choose the source object from the DB Object Names; we have selected "EMPLOYEES" as the table/view. Or we can choose the table/view → "REGION".

We have two database fields: Region ID and Region Description. As the DataSource has to be activated before it is loaded, we "ACTIVATE" it once.

After activation, the four data records are displayed: Eastern, Western, Northern & Southern. Right-click the DataSource DS_AC_TECH → Create InfoPackage. We now create an InfoPackage called "IP_DS_AC_TECH", with MSSQL as the source system.

Once done, we schedule the InfoPackage → "Start". Now we need to create an InfoArea in order to create an InfoProvider (like an InfoCube).

After creating the InfoCube, we check for the data in the PSA via "Manage the PSA". This can also be done using the key combination Ctrl + Shift + F6.

. which describes all the Database fields. and we have to specify the Data source fields. we can view the following factors 1. 4. 3. 5. types & length. In the next tab we have “PROPOSAL”.he number of records displayed: 4 No’s Using the PSA Maintenance. 2. Status Data Packet Data records REGION ID REGION Description The Table/View “CUSTOMERS” is now chosen for Extraction.



Now we create an InfoPackage → IP_TEST_CUST. Then go to RSA1 → InfoObjects → InfoObject (Test) → Create InfoObject Catalog.

Now. Region Description à Region2 (Region) . We now create 2 Info objects & pass the Region ID & Region Description to the 2 objects. we can preview the Region ID & Region Description. 1.

2. Region ID → REG_ID (Region ids)

These are the two objects created under the InfoObject catalog "test2":
 Region (REGION2)
 Region ids (REG_ID)

we create a transformation using “Create Transformation” on the Region ids (Attributes) .We create Characteristic as Info Provider for the Master Data loading in the “Info Provider” section  Insert Characteristic as Info Provider Now.

we now perform a DTP Creation on the same Region ID (Attribute) We choose the Target system (default) as Info object  Region ID  REG_ID & the Source Type as Data source with the Source System  MSSQL .We now choose the Source System after this à MSSQL  MS SQL DB CONNECT After checking the Transformation mappings on the Region ID.


After this step, we proceed with creating an InfoPackage  IP_DS_TEDDY, which has MSSQL as its source system. Further, we start the scheduling of the InfoPackage. Once the InfoPackage has been triggered, we can go to "Maintain PSA" and monitor the status of the data in the PSA.

Further, we EXECUTE the DTP, and we can then monitor the transfer of data from the PSA to the InfoCube.



5. Universal Data Integration

5.1 Introduction

UD Connect (Universal Data Connect) uses Application Server J2EE connectivity to enable reporting and analysis of relational SAP and non-SAP data. To connect to data sources, UD Connect can use the JCA-compatible (J2EE Connector Architecture) BI Java Connector. There are two ways to use the data: firstly, the user can read the data directly in the source using a VirtualProvider; secondly, the user can extract the data, load it into BI, and store it there physically.

 UD Connect Source: the UD Connect sources are the instances that can be addressed as data sources using the BI JDBC Connector.
 UD Connect Source Object: UD Connect source objects are relational data store tables in the UD Connect source.
 Source Object Element: source object elements are the components of UD Connect source objects, i.e. the fields in the tables.

5.2 Process Flow

1. Create the connection to the data source with the relational or multi-dimensional source objects (relational database management system with tables and views) on the J2EE Engine.
2. Create RFC destinations on the J2EE Engine and in BI to enable communication between the J2EE Engine and BI.
3. Model the InfoObjects required in accordance with the source object elements in BI.
4. Define a DataSource in BI.
5. The user can then extract the data, or access it directly, provided that the conditions for this are met.

5.3 Creating a UD Connect Source System

1. Maintain the settings in the Implementation Guide for SAP NetWeaver → Business Intelligence → UDI Settings by Purpose → UD Connect Settings.
2. In the source system tree in the Data Warehousing Workbench, choose Create in the context menu of the UD Connect folder.
3. Select the required RFC destination for the J2EE Engine.
4. Specify a logical system name.

5.4 Creating a DataSource for UD Connect

To transfer data from UD Connect sources to BI, the metadata (information about the source object and the source object elements) must be created in BI in the form of a DataSource.

1. Select the application component where the user wants to create the DataSource and choose Create DataSource. On the next screen, enter a technical name for the DataSource, select the type of DataSource, and choose Copy. The DataSource maintenance screen appears.
2. Select the General tab. Enter descriptions for the DataSource (short, medium, long). If required, specify whether the DataSource is initial non-cumulative and might produce duplicate data records in one request.
3. Select the Extraction tab.
a) Define the delta process for the DataSource.
b) Specify whether the user wants the DataSource to support direct access to data. (UD Connect does not support real-time data acquisition.)
c) The system displays Universal Data Connect (Binary Transfer) as the adapter for the DataSource. Choose Properties if the user wants to display the general adapter properties.
d) Select the name of the connector and select JDBC as the connector type. Specify the name of the source system if it has not already been derived from the logical system name.
e) Select the UD Connect source object. A connection to the UD Connect source is established, and all source objects available in the selected UD Connect source can be selected using the input help. Choose Continue.
4. Select the Proposal tab. The system displays the elements of the source object (for JDBC these are the fields) and creates a mapping proposal for the DataSource fields. The mapping proposal is based on the similarity of the names of the source object element and the DataSource field, and on the compatibility of the respective data types. Note that source object elements can have a maximum of 90 characters; both upper and lower case are supported.
a) Check the mapping and change the proposed mapping as required. Assign the non-assigned source object elements to free DataSource fields. The user cannot map elements to fields if the types are incompatible; in that case the system displays an error message. If the system detects changes between the proposal and the field list when switching from the Proposal tab to the Fields tab, a dialog box is displayed where the user can specify whether to copy the changes from the proposal to the field list.
5. Define the Fields tab. Here, the user can edit the fields that were transferred to the field list of the DataSource from the Proposal tab.
a) Under Transfer, specify the decision-relevant DataSource fields that the user wants to be available for extraction and transferred to BI. All fields are selected by default.
b) If required, change the data type for a field.
c) Select the fields for which the user wants to be able to set selection criteria when scheduling a data request using an InfoPackage. Data for this type of field is transferred in accordance with the selection criteria specified in the InfoPackage.
d) Choose the selection options (such as EQ, BT) that the user wants to be available for selection in the InfoPackage.
e) Under Field Type, specify whether the data to be selected is language-dependent or time-dependent, as required.
f) If the user did not transfer the field list from a proposal, the user can define the fields of the DataSource directly: choose Insert Row and enter a field name.
g) Specify whether the source provides the data in the internal or the external format. If the user chooses an external format, ensure that the output length of the field (external length) is correct. If required, specify a conversion routine that converts data from the external format to the internal format.
h) If required, change the values for the key fields of the source. These fields are generated as a secondary index in the PSA. This is important in ensuring good performance for data transfer process selections, in particular with semantic grouping.
i) Under Template InfoObject, specify InfoObjects for the fields of the DataSource. Entering InfoObjects here does not equate to assigning them to the DataSource fields; assignments are made in the transformation. This simply allows the user to transfer the technical properties of the InfoObjects to the DataSource fields. When the user defines the transformation, the system proposes the InfoObjects entered here as InfoObjects that the user might want to assign to a field.

8. Check, save, and activate the DataSource.
9. Select the Preview tab. If the user selects Read Preview Data, the number of data records specified in the field selection is displayed in a preview. This function allows the user to check whether the data formats and data are correct.

5.5.5 Using Relational UD Connect Sources (JDBC)

5.5.5.1 Aggregated Reading and Quantity Restriction

In order to keep the volume of data that is generated during UD Connect access to a JDBC data source as small as possible, each SELECT statement generated by the JDBC adapter receives a GROUP BY clause that uses all recognized characteristics; the recognized key figures are aggregated. What is recognized as a key figure or characteristic, and which methods are used for aggregation, depends on the properties of the associated InfoObjects modeled in SAP BW for this access. The amount of extracted data is not restricted. To prevent exceeding the storage limitations of the J2EE server, packages of around 6,000 records are transferred to the calling ABAP module.
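To illustrate the effect of this aggregated reading, the following minimal JDBC sketch issues the kind of grouped statement the adapter would generate. The SALES table, its columns, the connection URL, and the credentials are illustrative assumptions only, not objects delivered with SAP BW:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class AggregatedReadSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical JDBC URL and credentials; replace with real values.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:BWSRC", "bw_reader", "secret");
             Statement stmt = con.createStatement()) {

            // REGION and PRODUCT stand in for recognized characteristics,
            // REVENUE for a recognized key figure: the statement groups by
            // all characteristics and aggregates the key figure instead of
            // fetching raw rows.
            ResultSet rs = stmt.executeQuery(
                    "SELECT REGION, PRODUCT, SUM(REVENUE) AS REVENUE "
                  + "FROM SALES GROUP BY REGION, PRODUCT");

            while (rs.next()) {
                System.out.printf("%s %s %s%n",
                        rs.getString(1), rs.getString(2), rs.getBigDecimal(3));
            }
        }
    }
}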

5.5.5.2 Use of Multiple Database Objects as a UD Connect Source Object

Currently, only one database object (table or view) can be used as a UD Connect source object; the JDBC scenario does not support joins. However, if multiple objects are to be used in the form of a join, a database view should be created that provides this join, and this view is then used as the UD Connect source object. The view offers further benefits:
• The database user selected from SAP BW for access is only permitted to access these objects.
• Using the view, the user can run type conversions that cannot be made by the adapter (generation of the ABAP data types DATS, TIMS, etc.).

5.6 BI JDBC Connector

Sun's JDBC (Java Database Connectivity) is the standard Java API for relational database management systems (RDBMS). The BI JDBC Connector allows the user to connect applications built with the BI Java SDK to over 170 JDBC drivers, supporting data sources such as Teradata, Oracle, Microsoft SQL Server, Microsoft Access, DB2, Microsoft Excel, and text files such as CSV. This connector is fully compliant with the J2EE Connector Architecture (JCA). The connector adds the following functionality to existing JDBC drivers:
• Standardized connection management that is integrated into user management in the portal
• A standardized metadata service, provided by the implementation of JMI capabilities based on CWM
• A query model independent of the SQL dialect in the underlying data source

The BI JDBC Connector implements the BI Java SDK's IBIRelational interface. The user can also create systems in the portal that are based on this connector, and can use the BI JDBC Connector to make these data sources available in SAP BI systems using UD Connect.

5.6.1 Deploying the JDBC Driver of the Data Source to the Server

The user does this in SAP NetWeaver Application Server's Visual Administrator:
1. Start the Visual Administrator.
2. On the Cluster tab, select Server x → Services → JDBC Connector.
3. Select the Drivers node on the Runtime tab.
4. From the icon bar, choose Create New Driver or Data Source.
5. In the DB Driver field of the Add Driver dialog box, enter a name for the JDBC driver.
6. Navigate to the JDBC driver's JAR file and select it.
7. To select additional JAR files, select Yes when prompted; when finished, select No.
8. Save the settings.

5.6.2 Configuring the BI Java Connector

To prepare a data source for use with the BI Java SDK or with UD Connect, the user first needs to configure the properties the BI Java Connector uses to connect to the data source. In the process, a reference to the JDBC driver of the data source is configured by performing the following steps:
1. In the Connector Container service, select the BI JDBC Connector in the Connectors tree.
2. Choose the Resource Adapter tab.
3. In the Loader Reference box, choose Add to add a reference to the JDBC driver.
4. Enter library:<jdbc driver name> and choose OK. The <jdbc driver name> is the name the user entered for the driver when loading it (see Prerequisites in BI JDBC Connector).
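As a quick sanity check of the values entered here, the driver class and connection URL can be tried outside the server with plain JDBC. In this sketch the driver class, URL, user, and password are placeholders for the user's actual values:

import java.sql.Connection;
import java.sql.DriverManager;

public class DriverCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder driver class; use the class shipped in the driver JAR
        // that will be referenced in the Visual Administrator.
        Class.forName("oracle.jdbc.OracleDriver");
        // Placeholder URL and credentials; the same values the connector
        // will later use.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:BWSRC", "bw_reader", "secret")) {
            System.out.println("Driver OK: "
                    + con.getMetaData().getDatabaseProductName());
        }
    }
}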

5.6.2.1 Testing the Connections

After the user has configured the BI Java Connector, a rough installation check can be performed by displaying the page for the connector on the server. Perform the tests for the connector by visiting the corresponding URLs.

5.6.2.2 JNDI Names

When creating applications with the BI Java SDK, refer to a connector by its JNDI name. The BI JDBC Connector has the JNDI name SDK_JDBC.
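A minimal sketch of such a lookup is shown below, using the generic JCA CCI interfaces. It assumes the code runs in, or is configured against, the J2EE server that hosts the connector, and the cast to ConnectionFactory is illustrative rather than the SDK's documented API:

import javax.naming.InitialContext;
import javax.resource.cci.Connection;
import javax.resource.cci.ConnectionFactory;

public class JndiLookupSketch {
    public static void main(String[] args) throws Exception {
        // The naming context setup is assumed to come from the hosting
        // J2EE server, so no provider properties are passed here.
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory =
                (ConnectionFactory) ctx.lookup("SDK_JDBC");

        Connection connection = factory.getConnection();
        // ... issue queries through the BI Java SDK's relational interface ...
        connection.close();
    }
}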

5.6.2.3 Cloning the Connections

The user can clone an existing connection by using the Clone button in the toolbar.

5.6.2.4 Connector Properties

Refer to the table of BI JDBC Connector Properties for the required and optional properties to configure for the connector.

5.7 BI XMLA Connector

Microsoft's XMLA (XML for Analysis) facilitates Web services-based, platform-independent access to OLAP providers. The BI XMLA Connector enables the exchange of analytical data between a client application and a data provider working over the Web, using a SOAP-based XML communication API. The XMLA Connector sends commands to an XMLA-compliant OLAP data source in order to retrieve the schema rowsets and obtain a result set. The BI XMLA Connector allows the user to connect applications built with the BI Java SDK to data sources such as Microsoft Analysis Services, MicroStrategy, Hyperion, MIS, and BW 3.x. The user can also use the BI XMLA Connector to make these data sources available in SAP BI systems via UD Connect, or create systems in the portal based on this connector. The BI XMLA Connector implements the BI Java SDK's IBIOlap interface. This connector is fully compliant with the J2EE Connector Architecture (JCA).
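As an illustration of the SOAP traffic involved, the following sketch posts an XMLA Discover request for the DISCOVER_DATASOURCES schema rowset. The endpoint URL is a placeholder for an XMLA-compliant provider, and authentication and error handling are omitted:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class XmlaDiscoverSketch {
    public static void main(String[] args) throws Exception {
        // XMLA Discover request asking the provider for its data sources.
        String soap =
            "<SOAP-ENV:Envelope xmlns:SOAP-ENV=\"http://schemas.xmlsoap.org/soap/envelope/\">"
          + "<SOAP-ENV:Body>"
          + "<Discover xmlns=\"urn:schemas-microsoft-com:xml-analysis\">"
          + "<RequestType>DISCOVER_DATASOURCES</RequestType>"
          + "<Restrictions/><Properties/>"
          + "</Discover>"
          + "</SOAP-ENV:Body></SOAP-ENV:Envelope>";

        // Placeholder endpoint of an XMLA-compliant OLAP provider.
        URL url = new URL("http://olapserver/xmla/msxisapi.dll");
        HttpURLConnection http = (HttpURLConnection) url.openConnection();
        http.setRequestMethod("POST");
        http.setDoOutput(true);
        http.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        http.setRequestProperty("SOAPAction",
                "\"urn:schemas-microsoft-com:xml-analysis:Discover\"");

        try (OutputStream out = http.getOutputStream()) {
            out.write(soap.getBytes(StandardCharsets.UTF_8));
        }
        try (Scanner in = new Scanner(http.getInputStream(), "UTF-8")) {
            while (in.hasNextLine()) {
                System.out.println(in.nextLine()); // schema rowset as XML
            }
        }
    }
}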

5.7.1 Using InfoObjects with UD Connect

When modeling InfoObjects in BI, note that the InfoObjects have to correspond to the source object elements with regard to the type description and length description. For more information, see the documentation on data type compatibility. The following restrictions apply when using InfoObjects:
• Alpha conversion is not supported
• The use of conversion routines is not supported
• Upper and lower case must be enabled
These InfoObject settings are checked when the objects are generated.

5.7.2 Using the SAP Namespace for Generated Objects

The program-technical objects that are generated during generation of a DataSource for UD Connect can be created in transportable or local format. Transportable means that the generated objects can be transferred to another SAP BW system using the correction and transport system. In general, this is used to transfer the infrastructure into the productive environment, because productive systems block system changeability for security reasons. The transportability of an object depends on, among other things, the namespace in which it is created. The delivery status allows for the generation of transportable objects in the SAP namespace. If a classic SAP system landscape of this type exists, the objects are created in the development system and assigned to package RSSDK_EXT; this package is especially designated for these objects. The objects are also added to a transport request that the user creates or that already exists. After the transport request is finished, the objects can be transported.

Because the generated objects are ABAP development objects, the user should be aware of the following dependencies:
• System changeability: These objects can only be generated in systems whose system changeability permits this. In general, these are development systems.
• Developer key: The user must be authorized as a developer. A developer key must be procured and entered exactly once per user and system.

If this appears to be too laborious (see the dependencies listed above), there is also the option of switching to the generation of local objects. To do this, the user runs the RSSDK_LOCALIZE_OBJECTS report in the ABAP Editor (transaction SE38). The system then switches to local generation: the objects generated afterward are not transportable, while the status of already generated objects does not change. If the report is executed again, the generation is changed back to transportable, and all new objects are created as transportable.

• Object key: Because the generated objects are created in the SAP namespace, an object key is required. Like the developer key, this is customer-specific and can also be procured online; generation requires the customer-specific installation number. The key is to be entered exactly once per object and system. Afterwards, the object is released for further changes as well, so no further effort is required if there are repeated changes to the field list or similar. The system administrator knows this procedure and should be included in the procurement.

6. XML Integration

6.1 Introduction

SAP's Business Connector (SAP BC) is a business integration tool that allows SAP customers to communicate with other SAP customers or SAP marketplaces. This middleware component uses the Internet as its communication platform and XML/HTML as its data format, thus seamlessly integrating different IT architectures with R/3. The Business Connector allows integration with R/3 via open, non-proprietary technology: it integrates the RFC server and client and provides an XML layer over R/3 functionality. It comes with an XML automation that converts SAP's RFC format into XML, and supports both synchronous and asynchronous RFC protocols, so that no SAP R/3 automation is required at the receiving end. SAP BC has built-in integration support for SAP's specifications of the IDoc-XML and RFC-XML standards; that is, whenever dealing with messages conforming to these standards, it supports the integration. Through its integration with XML, it enables the exchange of structured business documents over the Internet by providing a common standard through which different applications and IT systems can communicate and exchange business data. SAP BC makes this communication easier through its XML conversions.

6.2 Benefits of XML Integration

• End-to-End Web Business Processes
• Open Business Document Exchange over the Internet
• XML Solutions for SAP services

6.2.1 End-to-End Web Business Processes

The Internet bridges the gap between different businesses, systems, and users, and facilitates them doing business via the Web.

XML acts as a uniform standard for exchanging business data, through which heterogeneous applications can communicate with one another over uniform interfaces and in a language that everyone involved can understand. With XML, simple and complex structures can be presented at any data level and for any category of data. This makes the process of data transfer a lot easier by providing a common structure for all business documents. XML messages are translated into the corresponding SAP-internal calls whenever required, and converted back into XML format when received from the SAP system, enhancing the existing programming model for distributed applications formed by ALE, IDoc, and BAPI. Non-SAP applications can now be used by SAP customers owing to XML compliance, and third-party systems using the XML standard data format are increasing rapidly. A customer using SAP can add a range of add-on products and services to their existing applications. For example, using a company's services in the cyber marketspace allows customer data to be received and directly stored on the vendor system, as both are using data formatted in XML.

6.2.2 Open Business Document Exchange over the Internet

Companies no longer act in isolation: instead, they are integrated into geographically distributed production networks, pursuing the production process in cooperation with other firms. This means that exchanging data quickly and securely between applications and systems is an increasingly important requirement. The SAP Business Connector uses the hypertext transfer protocol (HTTP) to exchange XML-based documents over the Internet. It also ensures secure exchange of documents with the help of Secure Sockets Layer (SSL) technology, and allows the implementation of maps, branches, and loops without any coding, using the developer tool. In addition, SAP BC extends business scenarios across firewalls, enabling a secure flow of business documents without requiring changes to established security infrastructures. SAP BC provides the openness and flexibility to comply with emerging business semantics that are continuously changing.

6.2.3 XML Solutions for SAP services

SAP BC makes all mySAP.com solutions accessible via XML-based business documents. It supports all major existing interfaces provided by SAP with the help of the XML-based Interface Repository (IFR), and empowers SAP customers to benefit from SAP functionality over the Internet. The IFR gives the option of downloading XML schemas for operational use and provides a uniform XML interface representation despite different implementation technologies such as RFC, IDoc, and BAPI.

6.3 Business Integration with XML

Business processes are increasingly characterized by structures operating between companies.

The ability to integrate and analyze business data across applications, components, and heterogeneous systems extends the traditional business environment and provides users with a complete view of the business. XML integration is also easier than using proprietary communication formats like SAP's BAPIs and RFCs. XML formatting and message handling with the help of the SAP Business Connector allow customers to use an industry-wide accepted format for data exchange between the SAP system and partner systems, including the historically grown proprietary systems that are increasingly common in an integrated world. This has found widespread customer acceptance, as it reduces the integration and maintenance cost of interface integration during the implementation of non-SAP systems. Furthermore, customers can now exchange data in the XML standard by using the Internet infrastructure through a much more user-friendly Web browser. An XML environment for data exchange over the Internet via protocols such as HTTP, HTTPS, and FTP fully supports collaborative business scenarios.

6.3.1 Incorporating XML Standards

Various XML standards are supported by SAP, among them the SAP XML extensions for IDocs and BAPIs. Data is presented according to an SAP specification either in IDoc-XML or BAPI-XML. Messages are stored in a generic envelope; this envelope contains metadata that controls, among other things, the routing of the messages. SAP supports two different standards for these envelopes: Microsoft BizTalk and a format similar to the Simple Object Access Protocol (SOAP). Packaging SAP BAPIs in a standard envelope offers several advantages, including direct processing of messages by external applications and a uniform system of error handling. SAP BC uses pre-prepared representations of the SAP interfaces as XML Commerce Business Language (xCBL) messages to facilitate communication with the MarketSet marketplaces.

6.3.2 SAP's Internet Business Framework

SAP's Internet Business Framework (IBF) attempts to address business collaboration issues by enabling integration with Internet technologies at the user, component, and business-process levels:
• User integration is achieved by providing a single point of Web-based access to the workplace (that is, local and company-wide systems) and the marketplace (that is, systems across a number of companies).

• Component integration consists of the integration of Internet technologies at the front-end and application-server levels with the aid of HTML, HTTP, and XML messages.
• Business-process integration across company boundaries is accomplished through the exchange of transactions between companies based on open Internet standards.
6.3.3 SAP applications with XML

SAP BC provides an add-on XML layer over R/3 functions to ensure compatibility of non-SAP applications with R/3-internal data structures and protocols. The SAP Business Connector can help achieve seamless B2B integration between businesses through the integration framework. For example, there can be real-time integration between supplier inventories and an enterprise's SAP system, or between multi-vendor product, price, and availability information and a customer's purchasing application. The SAP-proprietary RFC format is converted to XML (or HTML) so that no SAP software is needed on the other end of the communication line, and developing applications does not require SAP R/3 knowledge. SAP BC has built-in support for SAP's specifications of IDoc-XML and RFC-XML; whenever the user deals with messages conforming to these standards, integration is supported out of the box. For cases where other XML formats are required, the user can create maps with a graphical tool or insert the user's own mapping logic. XML-based communication over the Internet is achieved through SAP's Business Connector, as shown in Figure 1.

Figure 1

6.3.4 Factors leading to emergence of XML-enabled SAP solutions

6.3.4.1 Changing Business Standards and their adoption

Mapping between SAP business documents and XML-based business documents can be done easily with the SAP Business Connector, whose flexible architecture makes it easy to add specific schemas and interfaces. To comply with business document standards, the Business Connector provides for the automated generation of server- and client-side code, thus solving the interconnectivity problem by enabling uniform transactions among customers. A business model based on these standards allows companies to move towards becoming make-to-order businesses, replete with all the marketing, sales, distribution, manufacturing, and other logistics-driven operational cost savings.
6.3.4.2 Internet Security Standards

The effectiveness of e-commerce is premised on a secure exchange of information, and the SAP Business Connector provides the security essential to an online business transaction: business partners can be authenticated and business documents can be securely exchanged. The SAP Business Connector supports the well-established standard encryption technology Secure Sockets Layer (SSL) for secure document exchange, and digital signatures ensure the authentication of crucial data. SAP communication is achieved through IDocs, BAPIs, and RFCs; these documents must be made Internet-compliant. This conversion is achieved using the Business Connector, which converts SAP documents into XML so that they can be exchanged using Internet protocols.

6.4 Web-based business solutions
The SAP Business Connector allows businesses to effectively use intra- and inter-enterprise information. For example, companies can use the SAP Business Connector to retrieve catalog information from a supplier's Web site and to integrate the information with internal processes, in real time.
6.4.1 Components of Business Connector

The Business Connector consists of a Server and an Integrator; the Integrator can be used to add any additional functionality to the Server. The SAP Business Connector, in association with a third-party XML-enabled software product, can only be used for collaborative business-to-business scenarios

such as transmitting catalog information, managing purchase orders and availability-to-promise checks, acknowledging purchase orders, and handling invoices. The requirements for XML interface certification for SAP's complementary software include:
• Use of HTTP and HTTPS protocols with the SAP Business Connector
• Customization for a specific business application
• Sending and receiving the communication objects (i.e. IDocs, BAPIs, or RFCs)

6.5 How to Customize Business Connector (BC)
1. Start the Business Connector by calling <BC_dir>/bin/server.bat. If the user wants debug output, enter server.bat -debug <debuglevel> -log <filename>; the debug level cannot be greater than 10.
2. Open the Business Connector administration screen in a web browser window and enter the username and corresponding password.
6.5.1 Add New Users to BC

3. To transmit data from an SAP system to the BC, the same user needs to exist in both systems. To create users in the BC, click Security > Users and Groups > Add and Remove Users.
4. Enter the desired SAP user, assign the corresponding password, and click Create Users. This creates the user. Select the newly created user in the Groups box section and make sure that 'Select Group' = "Administrators". Now add the user to the Administrators group by clicking the button below the right selection box, and click Save Changes to save the settings.
6.5.2 Add SAP Systems

5. All the proposed SAP system(s) should be added within the Business Connector. To achieve this, click Adapters > SAP, which opens a new window. In the new window, click SAP > SAP Servers > Add SAP Server.
6. Enter the necessary information for the SAP server (System, Login Defaults, Server Logon, and Load Balancing) and choose Save (as illustrated in screen 1).

6.5.3 Add Router Tables

7. Clicking Adapters > Routing opens a new window. All incoming calls to the Business Connector are scrutinized, and routing information about the 'Sender', 'Receiver' and 'MsgType' is extracted. Using the rules, the Business Connector finds the recipient and the format that should be used to send the call to this particular recipient.
8. Enter information like 'Sender', 'Receiver' and 'Msg Type'. With "Add Rule" the rule is created; other details like "Transport" and "Transport Parameters" must also be provided. After entering all the details, click Save and enable the rule by clicking No under the "Enabled?" column.

6.5.4 Access Functionality in the Business Connector

9. The user cannot use the SAP Business Connector in a web application (e-Commerce, Procurement, etc.), but can use it to facilitate a business-to-business transaction in an EDI-like manner: for example, the user can send an XML document to a vendor, and the vendor sends an XML packet back. To do this, post a document containing the XML format of the IDoc or BAPI/RFC call to a Business Connector service, for example the /sap/InboundIdoc service, as sketched below.
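A minimal sketch of such a post follows; the host and port (localhost:5555), the BC user and password, and the skeletal MATMAS01 payload are all assumptions to be replaced with the user's actual values:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class InboundIdocPost {
    public static void main(String[] args) throws Exception {
        // Skeletal IDoc-XML payload; a real document carries the full
        // control record and data segments of the IDoc type being sent.
        String idocXml = "<?xml version=\"1.0\"?><MATMAS01><IDOC BEGIN=\"1\">"
                       + "</IDOC></MATMAS01>";

        // Assumed host/port of the Business Connector and a BC user
        // created as described in section 6.5.1.
        URL url = new URL("http://localhost:5555/sap/InboundIdoc");
        HttpURLConnection http = (HttpURLConnection) url.openConnection();
        http.setRequestMethod("POST");
        http.setDoOutput(true);
        http.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        http.setRequestProperty("Authorization", "Basic "
                + Base64.getEncoder().encodeToString(
                        "bcuser:password".getBytes(StandardCharsets.UTF_8)));

        try (OutputStream out = http.getOutputStream()) {
            out.write(idocXml.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP status: " + http.getResponseCode());
    }
}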

7. Data Mart Interface

7.1 Introduction

The data mart interface makes it possible to update data from one InfoProvider to another. It supports data exchange between multiple BI systems (the data-delivering system is referred to as the source BI, the data-receiving system as the target BI) as well as data exchange between BI systems and other SAP systems. The individual BI systems arranged in this way are called data marts. Data marts can be used in different ways:
• They save a subset of the data of a Data Warehouse in another database, possibly even in a different location.
• They are smaller units of a Data Warehouse.
• They are stored as intentionally redundant segments of the (logical, global) overall system (Data Warehouse).

A BI system defines itself as the source system for another BI system by:
• Providing metadata
• Providing transaction data and master data

An export DataSource is needed to transfer data from a source BI into a target BI. The InfoProviders of the source BI are used as the sources of data here. Export DataSources for InfoCubes and DataStore objects contain all the characteristics and key figures of the InfoProvider. Export DataSources for master data contain the metadata for all attributes, texts, and hierarchies for an InfoObject.

7.2 Special Features

• Changes to the metadata of the source system can only be added to the export DataSources by regenerating the export DataSources.
• The user can only generate an export DataSource from an InfoCube if the InfoCube is activated and the name of the InfoCube is at least one character shorter than the maximum length of a name, since the DataSource name is made up of the InfoCube name and a prefix.
• The Delete function is not supported at this time.

7.3 Data Mart Interface in the Myself System

The data mart interface in the Myself system is used to connect the BW system to itself: the user can update data from data targets into other data targets within the same system, since the ODS data can be used directly as a DataSource in the same system. The user can update data directly from ODS objects into other data targets, using transfer rules and update rules, the same as in an SAP source system. The user can carry out a data clean-up, update the data to another ODS object, and merge the objects together. For example, data about orders and deliveries may reside in two separate ODS objects; updating the ODS data into another ODS object makes it possible to track the status of orders, for example to see which orders are open or delivered in part, and thereby to trace the status in reporting. The user can also import InfoCube data into BW again via an InfoSource and thereby fill another InfoCube.

7.4 Data Mart Interface between Several Systems

Data marts are found both in the maintenance and in the definition. The data mart interface can be used between two BI systems, or between another SAP system and a BI system. Here too, the user can group together data from one or more source systems in a BI system, or continue to work in several BI systems. Data marts are used for the following reasons:

• To enable data separation relating to the task area on the one hand, and to be able to perform analyses of the state of the entire dataset on the other
• To enable large amounts of data to be processed
• To increase the respective speeds of data transfer, logging, and analysis
• To make it possible to have fewer complexities when constructing and implementing a Data Warehouse
• To achieve improved concision and maintenance of the individual Data Warehouses
• To construct hub-and-spoke scenarios in which a BI system stands in the middle and the data from distributed systems runs together and is standardized

7.4.1 Architectures

These functions can produce different architectures in a Data Warehouse landscape:
• Replication Architecture
• Aggregation Architecture

7.4.1.1 Replicating Architecture

If the user selects this architecture, data from one BI server is replicated into further BI systems, where it is available as source data.

7.4.1.2 Aggregating Architecture

With an aggregating architecture, data is grouped together from two or more BI servers and is then available for further processing.

7.4.2 Process Flow

To access the functions of the data mart, the data mart interface is used. Before a BI system can request data from another BI system, it must have information about the structure of the data to be requested; to define an existing BI system as a source system, the user therefore has to upload the metadata from the source BI into the target BI.

7.4.2.1 In the Source BI

The user generates an export DataSource for the respective InfoProvider in the source BI. This export DataSource includes an extraction structure, which contains all the characteristics and key figures of the InfoProvider.

7.4.2.2 In the Target BI

• If required, define the source system using the source system tree. This is only necessary if the source system has not yet been created in the target BI.
• Replicate the metadata of the export DataSource from the source BI into the target BI. The user can replicate all the metadata of a source system, or only the metadata of the DataSource.
• Activate the DataSource.
• Create the InfoProvider into which the data is to be loaded.
• Using the context menu for the DataSource, create a data transfer process with the InfoProvider into which the data is to be loaded as the target. A default transformation is created at the same time. The complete data flow is displayed in the InfoProvider tree under the InfoProvider.
• Create an InfoPackage for the DataSource using the context menu. The source BI system is specified as the source by default.

• Schedule the InfoPackage and the data transfer process. We recommend that the user always use process chains for loading processes.

7.4.3 Generating Export DataSources for InfoProviders

The export DataSource is needed to transfer data from a source BI system into a target BI system; the selected InfoProvider can then be used as a DataSource for another BI system.
1. In the Data Warehousing Workbench in the BI source system, choose Modeling and select the InfoProvider tree.
2. Generate the export DataSource using the context menu of the InfoProvider. To do this, choose Additional Functions → Generate Export DataSource.
The technical name of the export DataSource is made up of the number 8 together with the name of the InfoProvider. The user must therefore make sure that the length of the technical name of each object is no longer than 28 characters. Example: technical name of an InfoCube: COPA; technical name of the export DataSource: 8COPA.

7.4.4 Generating Master Data Export DataSources

The export DataSource is needed to transfer data from a source BI system into a target BI system. The user can generate an export DataSource for master data, and thus for individual InfoObjects. When the user creates an InfoObject or a master data InfoSource in the source BI system, all the metadata is created that is required to extract master data (all attributes, texts, and hierarchies) from an InfoObject.
1. Select the InfoObject tree in the Data Warehousing Workbench in the source BI system.
2. Generate the export DataSource from the context menu of the InfoObject. To do this, choose Additional Functions → Generate Export DataSource.
The technical name of the export DataSource is:
• 8******M for attributes (M stands for master data)
• 8******T for texts
• 8******H for hierarchies
(The asterisks (*) stand for the source InfoObject.)

7.4.5 Transactional Data Transfer Using the Data Mart Interface

The data transfer is the same as the data transfer in an SAP system; it corresponds to the data transfer for an R/3 system.

7.4.5.1 Delta Process

Using the data mart interface, the user can transfer data by full upload as well as by delta requests. The system reads the data from the fact tables of the delivering BI system, taking into account the specified dimension-specific selections. A distinction is made between InfoCubes and DataStore objects:
• The InfoCube that is used as an export DataSource is first initialized, meaning that the current status is transferred into the target BI system. When the next upload is performed, only those requests are transferred that have come in since initialization. Only those requests are transferred that have been rolled up successfully in the aggregates; if no aggregates are used, only those requests are transferred that are set to Qualitative OK in the InfoCube administration.
• For DataStore objects, the requests in the change log of the DataStore object are used as the basis for determining the delta. Only the change log requests that have arisen from activating the DataStore object data are transferred.

7.4.5.2 Restriction

It is possible to make only one selection for each target system for the delta; different target systems can each be filled in this way. For example, the user first makes a selection using cost center 1 and loads deltas for this selection. If, later on, the user also decides to load a delta for cost center 2 in parallel to the cost center 1 delta, the delta can only be requested in full for both cost centers, meaning that it is then impossible to execute deltas separately for the different selections.

7.4.6 Transferring Texts and Hierarchies for the Data Mart Interface

The transfer of master data is scheduled in the Scheduler. If the user wants to load texts or hierarchies using the Scheduler, they must first be created in the source system. When loading hierarchies, the user gets to the available hierarchies from the Hierarchy Selection tab page by using the Available Hierarchies in OLTP pushbutton. Select the hierarchy and schedule the loading.

8. Virtual InfoCubes

8.1 Introduction

Virtual InfoCubes store data only virtually, not physically, and are made available to reporting as InfoProviders. A virtual InfoCube can be used to act like an InfoCube and feel like an InfoCube, but it is not an InfoCube: it has the same structure as an InfoCube, with dimensions, characteristics, and key figures, but the actual generation of data for the cube depends on how the user implements it.

8.2 Creating a Virtual InfoCube

In BI7, right-click on any InfoArea and choose Create Virtual Provider. For direct access, choose Virtual Provider with 'direct access'. Upon viewing the data in such a virtual provider, a remote function call is made directly to R/3 to get the values on the fly.

There is an option to use the old R/3 InfoSource: the user can create an InfoSource with transfer rules, and when creating the virtual InfoProvider, go to the InfoSource tab and choose that InfoSource. The provider then uses the structure in the InfoSource automatically, and the flow of data automatically links R/3, the transfer rules, and the virtual InfoProvider. Note that no update rules exist with this kind of setup.

With the BI7 setup, the user has a more flexible approach: a transformation can be created to configure the 'transfer logic' of the virtual InfoProvider, along with a start routine, end routine, or any other transformation technique available with transformation rules. When using the BI7 setup, the user needs to create a sort of 'pseudo' DTP, which doesn't actually do anything (meaning the user does not 'execute' it to start a data transfer): choose the DTP related to the transformation and save it. After all is done, right-click on the virtual InfoProvider and choose 'Activate Direct Access'.

The user can also choose to get values from a function module of the user's own writing, so the implementation details are really up to the user.

8.3 Different Types

A distinction is made between the following types:

• SAP RemoteCube
• RemoteCube
• Virtual InfoCube with Services

8.3.1 SAP RemoteCube

An SAP RemoteCube is a RemoteCube that allows the definition of queries with direct access to transaction data in other SAP systems.

Use SAP RemoteCubes if:
• The user needs very up-to-date data from an SAP source system
• The user accesses only a small amount of data from time to time
• Only a few users execute queries simultaneously on the database

Do not use SAP RemoteCubes if:
• The user requests a large amount of data in the first query navigation step, and no appropriate aggregates are available in the source system
• A lot of users execute queries simultaneously
• The user frequently accesses the same data

8.3.1.1 Creating an SAP RemoteCube

The SAP RemoteCube can be used in reporting in cases where it does not differ from other InfoProviders.
1. Choose the InfoSource tree or InfoProvider tree in the Administrator Workbench under Modeling.
2. In the context menu, choose Create InfoCube.
3. Select SAP RemoteCube as the InfoCube type and enter the InfoSource.
4. Using an indicator, the user can define whether a unique source system is assigned to the InfoCube. Otherwise the source system must be selected in the relevant query; in this case the characteristic (0LOGSYS) is added to the InfoCube definition.
5. On the next screen the user can check the defined SAP RemoteCube and adjust it if necessary before activation.
6. The user can then assign source systems to the SAP RemoteCube. To do this, choose Assign Source System from the context menu of the SAP RemoteCube. In the dialog box that follows, choose one or several source systems and select Save Assignments. The source system assignments are local to the system and are not transported.

8.3.1.2 Structure

SAP RemoteCubes are defined based on an InfoSource with flexible updating and copy the characteristics and key figures of the InfoSource. Master data and hierarchies are not read directly in the source system; they are already replicated in BW when the user executes a query. The transaction data is called in the source system during execution of a query. During this process, the selections are provided to the InfoObjects if the transformation is only a simple mapping of the InfoObject. If the user has specified a constant in the transfer rules, the data is transferred only if this constant can be fulfilled. With more complex transformations, such as routines or formulas, the selections cannot be transferred, and it takes longer to read the data in the source system because the amount of data is not limited. To prevent this, the user can create an inversion routine for every transfer routine. Inversion is not possible with formulas, which is why SAP recommends that the user use formulas instead of routines.

8.3.1.3 Integration

To be assigned to an SAP RemoteCube, a source system must meet the following requirements:
• The BW Service API functions (contained in the SAP R/3 Plug-In) are installed.
• The release status of the source system is at least 4.0B.
• In BW, a source system ID has been created for the source system.
• DataSources from the source system that are released for direct access are assigned to the InfoSource of the SAP RemoteCube, and there are active transfer rules for these combinations.

8.3.2 RemoteCube

A RemoteCube is an InfoCube whose transaction data is not managed in the Business Information Warehouse but externally; only the structure of the RemoteCube is defined in BW. The data is read for reporting using a BAPI from another system. Using a RemoteCube, the user can carry out reporting on data in external systems without having to physically store transaction data in BW. The user can, for example, include an external system from market data providers using a RemoteCube. In this way, the user can reduce the administrative work on the BW side and also save memory space.

8.3.2.1 Structure

When reporting using a RemoteCube, the Data Manager, instead of using a BasicCube filled with data, calls the RemoteCube BAPI and transfers the parameters:
• Selection
• Characteristics
• Key figures

As a result, the external system transfers the requested data to the OLAP processor.

8.3.2.2 Integration

To report using a RemoteCube, the user has to carry out the following steps:
1. In BW, create a source system for the external system that the user wants to use.
2. Define the required InfoObjects.
3. Load the master data: create a master data InfoSource for each characteristic, and load texts and attributes.
4. Define the RemoteCube.
5. Define the queries based on the RemoteCube.

8.3.3 Virtual InfoCubes with Services

A virtual InfoCube with services is an InfoCube that does not physically store its own data in BW; the data source is a user-defined function module. The user uses a virtual InfoCube with services to display data from non-BW data sources in BW without having to copy the data set into the BW structures. The data can be local or remote. The user can also apply the user's own calculations to change the data before it is passed to the OLAP processor. In comparison to the RemoteCube, the virtual InfoCube with services is more generic: it offers more flexibility, but also requires more implementation effort. This function is used primarily in the SAP Strategic Enterprise Management (SEM) application.

8.3.3.1 Structure

When the user creates an InfoCube, its type can be specified. If the user chooses Virtual InfoCube with Services as the type, an extra Detail pushbutton appears on the interface. This pushbutton opens an additional dialog box in which the user defines the services.
1. Enter the name of the function module that the user wants to use as the data source for the virtual InfoCube. There are different default variants for the interface of this function module, depending on the properties of the data source. A method for determining the correct variant, together with the description of the interfaces, is given at the end of this documentation.
2. The user has a number of options for defining the properties of the data source more precisely. Depending on these properties, the Data Manager provides services to convert the parameters and data. The next step is to select options for converting/simplifying the selection conditions; the user does this by selecting the Convert Restrictions option. These conversions only change the transfer table in the user-defined function module. The result of the query is not changed, because the restrictions that are not processed by the function module are checked later in the OLAP processor. Options:
• No restrictions: if this option is selected, any restrictions are passed to the InfoCube.

• Only global restrictions: if this option is selected, only global restrictions (FEMS = 0) are passed to the function module. Other restrictions (FEMS > 0) that are created, for example, by setting restrictions on columns in queries, are deleted.
• Expand hierarchy restrictions: if this option is selected, restrictions on hierarchy nodes are converted into the corresponding restrictions on the characteristic value.
3. SID support: if the data source of the function module can process SIDs, the user should select this option. If this is not possible, the characteristic values are read from the data source and the Data Manager determines the SIDs dynamically. In this case, wherever possible, restrictions that are applied to SID values are converted automatically into the corresponding restrictions for the characteristic values.
4. With navigation attributes: if this option is selected, navigation attributes and restrictions applied to navigation attributes are passed to the function module. If this option is not selected, the navigation attributes are read in the Data Manager once the user-defined function module has been executed. In this case, the user needs to have selected the characteristics that correspond to these attributes in the query. Restrictions applied to the navigation attributes are not passed to the function module in this case.
5. Internal format (key figures): in SAP systems, a separate format is often used to display currency key figures. The value in this internal format differs from the correct value in that the decimal places are shifted; the currency tables are used to determine the correct value for this internal representation. If this option is selected, the OLAP processor incorporates this conversion during the calculation.
6. Pack RFC: this option packs the parameter tables in BAPI format before the function module is called, and unpacks the data table that is returned by the function module after the call is performed. Since this option is only useful in conjunction with a remote function call, the user also has to define a logical system that is used to determine the target system for the remote function call if this option is selected.
7. Simplify selections: currently this option is not yet implemented.

8.3.3.2 Dependencies

If the user uses a remote function call, SID support must be switched off and the hierarchy restrictions must be expanded.

8.3.3.2.1 Description of the Interfaces for User-Defined Function Modules

Variant 1:

Variant 2:

Variant 3:

SAP advises against using this interface. The interface is intended for internal use only, and only half of it is given here. Note that SAP may change the structures used in the interface.

8.3.3.2.2 Additional Parameters for Variant 2 for Transferring Hierarchy Restrictions

With hierarchy restrictions, an entry with 'COMPOP' = 'HI' (for hierarchy) is created at the appropriate place in table I_T_RANGE (for FEMS 0) or I_TX_RANGETAB (for FEMS > 0), and the 'LOW' field contains a number that can be used to read the corresponding hierarchy restriction from table I_TSX_HIER, using field 'POSIT'. Table I_TSX_HIER has the following type:

8.3.3.2.3 Method for Determining the Correct Variant for the Interface

The following list describes the procedure for determining the correct interface for the user-defined function module. Go through the list from top to bottom; the first appropriate case is the variant that the user should use:
• If Pack RFC is activated: Variant 1
• If SID support is deactivated: Variant 2
