SAP BW Extraction

Table of Contents

1. Extraction
   1.1 Introduction
   1.2 Step-by-step control flow for a successful data extraction with SAP BW
2. Data Extraction from SAP Source Systems
   2.1 Introduction (Process; Plug-in for R/3 Systems)
   2.2 Transfer Method - PSA and IDoc
       2.2.1 Introduction
       2.2.2 Persistent Staging Area (PSA): Definition, Use
       2.2.3 IDocs: Definition, Example
       2.2.4 Two Methods to transfer data: Differences and advantages (PSA; ALE data IDoc)
   2.3 DataSource
       2.3.1 Assigning DataSources to InfoSources and Fields to InfoObjects
       2.3.2 Maintaining DataSources
       2.3.3 Transferring Business Content DataSources into Active Version
       2.3.4 Extraction Structure
       2.3.5 Transfer Structure
       2.3.6 Replication of DataSources (entire metadata; application component hierarchy; metadata of an application component; a single DataSource)
   2.4 Data Extraction Logistics
       2.4.1 Data Extraction Illustration: Full Load, Delta Load
   2.5 LO Cockpit Functions
       2.5.1 Maintain Extract Structures
       2.5.2 Maintain Data Sources
       2.5.3 Activating update
       2.5.4 Controlling update
       2.5.5 Setup Tables
       2.5.6 Serialized V3
       2.5.7 Queued Delta
       2.5.8 Direct Delta
       2.5.9 Unserialized V3
   2.6 LO Data Sources Data Flow in R/3
       2.6.1 Filling up the Appended Structure
       2.6.2 Regenerate & Check the Customized Objects
   2.7 Structure of Delta Method for LO Cockpit Data Sources
       2.7.1 Delta Management in extraction
       2.7.2 Step-by-Step Maintenance
   2.8 Delta Method
       2.8.1 Master Data
       2.8.2 Transactional Data
       2.8.3 Delta Process
   2.9 Delta Method Properties
       2.9.1 Delta Initialization
       2.9.2 Delta Extraction
       2.9.3 Update Modes (V1 Update, V2 Update, V3 Update)
   2.10 Delta Queue Functions
       2.10.1 Direct Delta (V1 update)
       2.10.2 Queued delta (V1 + V3 updates)
       2.10.3 Un-serialized V3 Update (V1/V2 + V3 updates)
   2.11 Generic extraction (Benefits; Limits)
   2.12 Generic Data Types
       2.12.1 Master Data: Attributes, Texts, Hierarchies (including time-dependent attributes, time-dependent texts, time-dependent texts and attributes, language-dependent texts)
       2.12.2 Transactional data
   2.13 Generic Data sources
       2.13.1 Create Generic extraction [Master data] (Extraction Structure; Functions)
       2.13.2 Editing the DataSource in the Source System
       2.13.3 Replication of DataSources (Replication Process Flow; Deleting DataSources during Replication; Automatic Replication during Data Request)
   2.14 Enhancing Business Content
3. Extraction with Flat Files
   3.1 Data from Flat Files (7.0)
   3.2 Data from Flat Files (3.x): Basic Steps of Data Flow (ETL process); Step-by-Step to upload Master Data from Flat File to InfoObjects
   3.3 Extracting Transaction and Master Data using Flat Files
   3.4 Data Types that can be extracted using Flat Files
4. DB Connect
   4.1 Introduction
   4.2 Loading data from SAP Supporting DBMS into BI (Process Description; Process Flow)
5. Universal Data Integration
   5.1 Using InfoObjects with UD Connect
   5.2 Using SAP Namespace for Generated Objects
   5.3 Creating UD source system
   5.4 Creating a DataSource for UD Connect
   5.5 Using Relational UD Connect Sources (JDBC): Aggregated Reading and Quantity Restriction; Use of Multiple Database Objects as UD Connect Source Object
   5.6 BI JDBC Connector: deploying the user data source's JDBC driver to the server; Configuring BI Java Connector; Connector Properties; Testing the Connections; JNDI Names; Cloning the Connections
   5.7 BI XMLA Connector
6. XML Integration
   6.1 Introduction
   6.2 SAP's Internet Business Framework
   6.3 Business Integration with XML: Changing Business Standards and their adoption; Open Business Document Exchange over the Internet; Internet Security Standards; Web-based business solutions
   6.4 XML Solutions for SAP services: End-to-End Web Business Processes; Benefits of XML Integration; SAP applications with XML; Factors leading to emergence of XML-enabled SAP solutions; Incorporating XML Standards
   6.5 How to Customize Business Connector (BC): Add New Users to BC; Add SAP Systems; Add Router Tables; Access functionality in the Business Connector
   6.6 Components of Business Connector
7. Data Mart Interface
   7.1 Introduction
   7.2 Special Features (Architectures: Replicating Architecture, Aggregating Architecture)
   7.3 Data Mart Interface in the Myself System
   7.4 Data Mart Interface between Several Systems (In the Source BI; In the Target BI)
   7.5 Transactional Data Transfer Using the Data Mart Interface (Delta Process; Restriction)
   7.6 Transferring Texts and Hierarchies for the Data Mart Interface (additional parameters for variant 2 for transferring hierarchy restrictions; method for determining the correct variant for the interface)
   7.7 Generating Export DataSources for InfoProviders
   7.8 Generating Master Data Export DataSources
   Virtual InfoCubes: SAP RemoteCube (Structure, Integration, Creating a SAP RemoteCube); Remote Cube (Structure, Integration, Create Virtual InfoCube); Virtual InfoCubes with Services (Structure, description of the interfaces for user-defined function modules)

1. Extraction

1.1 Introduction

Extraction programs that read data from extract structures and send it, in the required format, to the Business Information Warehouse belong to the data staging mechanisms of the SAP R/3 system as well as of the SAP Strategic Initiative products such as APO, CRM and SEM. These extraction tools are implemented on the source system side during implementation and support various releases. The IDoc structures or tRFC data record structures (if the user chooses to use the PSA - Persistent Staging Area) that are generated from the transfer structures for the Business Information Warehouse on the source system side are used for this. These then collect the requested data and send it in the required transfer format to the SAP Business Information Warehouse using BAPIs. For non-SAP applications, similar extraction programs can be implemented with the help of third-party providers.

The OLTP extraction tables form the basis of a DataSource on an R/3 OLTP system. The structure is written in the OLTP, usually from a table view, using the data elements that describe the available data. For an R/3 OLTP source system, the 'DataSource Replication' step is provided, in which the DataSource is duplicated with its relevant properties in BW. Once there, the user can assign it to an InfoSource. The user can request the metadata of a single DataSource, the metadata of an application component, or all the metadata of a source system:

- To replicate the metadata of a DataSource, choose Source System Tree → The user's Source System → DataSource Overview → The user's Application Components → The user's DataSource → Context Menu (right mouse click) → Replicate DataSources in the BW Administrator Workbench.
- To replicate the metadata from a source system into BW for an application component, choose Source System Tree → The user's Source System → DataSource Overview → The user's Application Components → Context Menu (right mouse click) → Replicate DataSources in the BW Administrator Workbench.
- To update all the metadata of a source system, choose Source System Tree → The user's Source System → Context Menu (right mouse click) → Replicate DataSources in the BW Administrator Workbench.

All application data must be described in SAP BW using metadata. The InfoObjects used for this are not just transaction and master data, but also relationship sets such as attributes or hierarchies for master data. The SAP BW extractors carry out a number of functions in the SAP OLTP in order to guarantee smooth communication between SAP OLTP and SAP BW. In non-SAP systems, these tasks must be carried out by external software that is set up on the BAPI interfaces of SAP BW. Various businesses have already been certified as official third-party providers in this field; a complete list can be found on the SAPnet BW homepage. Virtually any source of data can be extracted for use in the Business Information Warehouse.

Two options for data transfer are possible:

- TRFC (chosen by selecting PSA in the transfer structure): tRFCs are faster than ALE, but a limit of 255 fields and 1962 bytes per data record is imposed.
- ALE (IDoc): here a limit of 1000 bytes per data record is imposed.

Benefits of TRFC:

- Improved performance when loading data.
- Overcoming the 1000-byte limit (now a maximum of 255 fields and 1962 bytes).
- API to access the data stored in the ODS (read and update).
- Possibility to synchronize InfoSources.
- The load process takes place in two steps, and tracing of the status of each step is provided.
- The transfer is asynchronous and runs in the background.

1.2 Step-by-step control flow for a successful data extraction with SAP BW:

1. An InfoPackage is scheduled for execution at a specific point of time or for a certain system- or user-defined event.
2. Once the defined point of time is reached, the SAP BW system starts a batch job that sends a request IDoc to the SAP source system.
3. The request IDoc arrives in the source system and is processed by the IDoc dispatcher, which calls the BI Service API to process the request.
4. The BI Service API checks the request for technical consistency. Possible error conditions include specification of DataSources unavailable in the source system and changes in the DataSource setup or the extraction process that have not yet been replicated to the SAP BW system.
5. The BI Service API calls the extractor in initialization mode to allow for extractor-specific initializations before actually starting the extraction process. The generic extractor, for example, opens an SQL cursor based on the specified DataSource and selection criteria.
6. The BI Service API calls the extractor in extraction mode. One data package per call is returned to the BI Service API, and customer exits are called for possible enhancements. The extractor fills the extract structure of the DataSource with data from the SAP source system datasets and takes care of splitting the complete result set into data packages according to the IDoc control parameters (see the sketch after this list).
7. The BI Service API continues to call the extractor until no more data can be fetched.
8. The BI Service API finally sends a final status IDoc notifying the target system that request processing has finished (successfully or with errors specified in the status IDoc).
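Steps 5 to 7 can be made concrete with the interface that function-module extractors implement. The following ABAP sketch is modeled on SAP's example module RSAX_BIW_GET_DATA_SIMPLE; the module name Z_BIW_GET_DATA, the DataSource name and the source table are hypothetical placeholders, not part of this document.

FUNCTION z_biw_get_data.
* Sketch of a function-module extractor. The interface (defined in
* SE37, not in the source code) imports I_DSOURCE (DataSource name),
* I_INITFLAG (initialization-call flag) and I_MAXSIZE (package size),
* has tables I_T_SELECT / I_T_FIELDS (selections, requested fields)
* and E_T_DATA (typed with the extract structure), and declares the
* exceptions NO_MORE_DATA and ERROR_PASSED_TO_MESS_HANDLER.

  STATICS: s_cursor  TYPE cursor,    " database cursor kept between calls
           s_counter TYPE sy-tabix.  " number of packages sent so far

  IF i_initflag = 'X'.
*   Step 5: initialization call - validate the request, buffer selections
    IF i_dsource <> 'ZMY_DATASOURCE'.   " illustrative DataSource name
      RAISE error_passed_to_mess_handler.
    ENDIF.
  ELSE.
*   Step 6: extraction call - return exactly one data package per call
    IF s_counter = 0.
      OPEN CURSOR WITH HOLD s_cursor FOR
        SELECT * FROM zmy_table.        " illustrative source table
    ENDIF.
    FETCH NEXT CURSOR s_cursor
          APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
          PACKAGE SIZE i_maxsize.
    IF sy-subrc <> 0.
*     Step 7: result set exhausted - stop the call loop
      CLOSE CURSOR s_cursor.
      RAISE no_more_data.
    ENDIF.
    s_counter = s_counter + 1.
  ENDIF.

ENDFUNCTION.

The BI Service API keeps calling the module until NO_MORE_DATA is raised, which is exactly the loop described in steps 6 and 7.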

2. Data Extraction from SAP Source Systems

2.1 Introduction

Extractors are one of the data retrieval mechanisms in the SAP source system. An extractor can fill the extract structure of a DataSource with data from SAP source system datasets.

2.1.1 Process

In a metadata upload, the DataSource, including its relevant properties, is replicated in BW. Once there, the user can assign it to an InfoSource, and the DataSource fields are made available to be assigned to BW InfoObjects. After specifying the data flow in the BW Administrator Workbench by maintaining the transfer rules, the user can schedule an InfoPackage in the Scheduler. The data loading process is triggered in the source system by a request IDoc.

There are application-specific extractors, each of which is hard-coded for the DataSource delivered with BW Business Content and fills the extract structure of that DataSource. In addition, there are generic extractors, with which the user can extract further data from the SAP source system and transfer it into BW. Only when the user calls up the generic extractor by naming the DataSource does it know which data is to be extracted, from which tables it should read it, and in which structure. This is how one generic extractor fills different extract structures and DataSources.

The DataSource data for these types is read generically and transferred into BW. This is how generic extractors allow the extraction of data that cannot be made available within the framework of Business Content. Regardless of the application, the user can generically extract master data attributes or texts, or transaction data, from all transparent tables, database views or SAP Query functional areas, or by using a function module. The user can run generic data extraction in R/3 source system application areas such as LIS, CO-PA, FI-SL and HR. LIS, for example, uses generic extraction to read info structures; DataSources are generated on the basis of these (individually) defined info structures. With generic data extraction from applications we speak of customer-defined DataSources; otherwise we speak of generic DataSources. The user can generate user-specific DataSources here, and gets there by choosing The user's Source System → Context Menu (right mouse click) → Customizing Extractors in the BW Administrator Workbench - Modeling. Further information can be found in the implementation guide to data extraction from SAP source systems.

2.1.2 Plug-in for R/3 Systems

Communication between the R/3 source system and the SAP Business Information Warehouse is only possible if the appropriate plug-in is installed in the source system. BW-specific source system functions, extractors and DataSources are delivered by these so-called plug-ins.

2.2 Transfer Method - PSA and IDoc

2.2.1 Introduction

This information is taken from help.sap.com and some other sources as well, and has been rearranged to make the concept easier to understand. If time permits, we will discuss this in the class in detail.

2.2.2 Persistent Staging Area (PSA)

2.2.2.1 Definition

The Persistent Staging Area (PSA) is the initial storage area for requested transaction data, master data attributes, and texts from various source systems within the Business Information Warehouse. The data is stored in the form of the transfer structure and is not modified in any way, which means that if the data contained errors in the source system, it may still contain errors. When the user loads flat files, the data does not remain completely unchanged, since it may be modified by conversion routines (for example, the date format 31.12.1999 might be converted to 19991231 in order to ensure the uniformity of the data).

2.2.2.2 Use

The requested data is stored in transparent, relational database tables. The user can check the quality of the requests, their usefulness, the sequence in which they are arranged, and how complete they are.

2.2.3 IDocs

2.2.3.1 Definition

An IDoc (Intermediate Document) is the standard SAP format for electronic data interchange between systems. The IDoc interface exchanges business data with an external system; it consists of the definition of a data structure, known as IDoc types, along with processing logic for this data structure. The business data is saved in IDoc format in the IDoc interface and is forwarded as IDocs. IDocs are used both in Electronic Data Interchange (EDI) and for data distribution in a system group (ALE). If an error occurs, exception handling is triggered using SAP tasks; the agents who are responsible for these tasks and have the relevant authorizations are defined in the IDoc interface.

2.2.3.2 Example:

The IDoc type ORDERS01 transfers the logical message types ORDERS (purchase order) and ORDRSP (order confirmation). Different message types (for example, delivery confirmations or purchase orders) normally represent different specific formats. Multiple message types with related content can be assigned to one IDoc type.

2.2.4 Two Methods to transfer data

Basically, there are two transfer methods for SAP systems:

2.2.4.1 Differences and advantages:

2.2.4.1.1 PSA

1. The more common technology, since it brings better load performance and gives the user the option of using the PSA as an inbound data store (for master and transaction data).
2. Uses TRFC as the transfer log.
3. Number of fields per data record: restricted to 255.
4. Data record length: max. 1962 bytes.
5. Advantage: improved performance, since larger data packages can be transported. The most advantageous thing about the PSA is that we can view the data and edit it if there are errors in the records; with IDocs, the user is not able to view the data while it is being transferred.

2.2.4.2 ALE (data IDoc)

1. Used with hierarchies.
2. Uses info IDocs and data IDocs. Info IDocs carry information from the source system (no data) through the IDoc interface; this information can be, for example, the number of data records extracted or information for the monitor.
3. Data record length: max. 1000 bytes.
4. Advantage: a more detailed log through the control record and status record of the data IDoc.

2.3 Data Source

Data that logically belongs together is stored in the source system in the form of DataSources. A DataSource consists of a quantity of fields that are offered for data transfer into BW; it also describes the properties of the extractor belonging to it. The DataSource is technically based on the fields of the extraction structure; these fields can be enhanced as well as hidden (or filtered) for the data transfer. With the PSA transfer method, the data is transferred directly in the form of the transfer structure; IDoc containers are not used to send the data. With the IDoc method, IDoc interface technology is used to pack the data into IDoc containers.

DataSources are used for extracting data from a source system and for transferring data into BW. DataSources make the source system data available to BW, on request, in the form of the (if necessary, filtered and enhanced) extraction structure. During a metadata upload, the properties of the DataSource relevant to BW are replicated in BW. There are four types of DataSource:

- DataSources for transaction data
- DataSources for master data, which can be:
  1. DataSources for attributes
  2. DataSources for texts
  3. DataSources for hierarchies

2.3.1 Assigning DataSources to InfoSources and Fields to InfoObjects

In the DataSource overview for a source system in the Administrator Workbench - Modeling, using the context menu (right mouse button) of a DataSource, the user can:

1. Choose an InfoSource from a list containing InfoSources sorted according to the agreement of their technical names, or
2. Create a new InfoSource.

Several DataSources can be assigned to one InfoSource. This is used, for example, if data from different IBUs that logically belongs together is to be grouped together in BW. The user assigns DataSources to InfoSources and fields to InfoObjects in the transfer rules maintenance; the fields of a DataSource are assigned to InfoObjects in BW there. In the transfer rules maintenance, the user determines how the fields of the transfer structure are transferred into the InfoObjects of the communication structure. As regards the data transfer into BW, there is also the additional option of assigning an unmapped DataSource to an InfoSource; this assignment takes place in the same way in the transfer rules maintenance. To do this, choose Assign InfoSource.

2.3.2 Maintaining DataSources

Source system DataSources are processed in the customizing for extractors. The user gets to customizing via the context menu (right mouse button) of the

relevant source system in the source system tree of the BW Administrator Workbench - Modeling. Alternatively, the user can go directly to the DataSource maintenance screen by choosing Maintain DataSource in Source System from the context menu of the source system DataSource overview. In a SAP source system, the user has the option of filtering and enhancing the extraction structure in order to determine the DataSource fields: this means filtering the extraction structure and/or enhancing the DataSource with fields. In particular, the user can determine the DataSource fields for which extraction structure fields are hidden from the transfer.

2.3.3 Transferring Business Content DataSources into Active Version

Business Content DataSources of a source system are only available in BW for transferring data if the user has transferred them in their active versions in the source system and then carried out a metadata upload. In order to transfer and activate a DataSource delivered by SAP with Business Content, the user first has to transfer it from the delivered version (D version) into the active version (A version). To do this, select the source system in the source system tree of the BW Administrator Workbench and, using the context menu (right mouse button), choose Customizing Extractors → Business Information Warehouse → Business Content DataSources / Activating SAP Business Content → Transfer Business Content DataSources. If the user wants to transfer data from a source system into BW using a Business Content DataSource, the active version of the DataSource is finally replicated in BW with a metadata upload.

2.3.4 Extraction Structure

In the extraction structure, data from a DataSource is staged in the source system. The extraction structure contains the set of fields that are offered by an extractor in the source system for the data loading process. The user can edit DataSource extraction structures in the source system, meaning completing, filtering and enhancing the extraction structure. An InfoSource in BW requires at least one DataSource for data extraction.

2.3.5 Transfer Structure

The transfer structure is the structure in which the data is transported from the source system into the SAP Business Information Warehouse. It provides BW with all the source system information available for a business process. DataSource data that logically belongs together is staged in the flat structure of the extraction structure.

In the transfer structure maintenance screen, the user specifies the DataSource fields that are to be transferred into BW; the transfer structure is thus a selection of DataSource fields from a source system. A transfer structure always refers to a DataSource in a source system and an InfoSource in BW. When the user activates the transfer rules in BW, a transfer structure identical to the one in BW is created in the source system from the DataSource fields. The data is transferred 1:1 from the transfer structure of the source system into the BW transfer structure; from there it is transferred, using the transfer rules, into the BW communication structure.

2.3.6 Replication of DataSources

2.3.6.1 Replication of the Entire Metadata (Application Component Hierarchy and DataSources) of a Source System

- Choose Replicate DataSources in the Data Warehousing Workbench in the source system tree through the source system context menu, or
- Choose Replicate DataSources in the Data Warehousing Workbench in the DataSource tree through the root node context menu.

2.3.6.2 Replication of the Application Component Hierarchy of a Source System

Choose Replicate Tree Metadata in the Data Warehousing Workbench in the DataSource tree through the root node context menu.

2.3.6.3 Replication of the Metadata (DataSources and Possibly Application Components) of an Application Component

Choose Replicate Metadata in the Data Warehousing Workbench in the DataSource tree through an application component context menu.

2.3.6.4 Replication of a DataSource of a Source System

- Choose Replicate Metadata in the Data Warehousing Workbench in the DataSource tree through a DataSource context menu, or
- In the initial screen of the DataSource repository (transaction RSDS), select the source system and the DataSource and then choose DataSource → Replicate DataSource.

Using this function, the user can also replicate an individual DataSource that did not exist in the BI system so far. This is not possible in the view for the DataSource tree, since a DataSource that has not been replicated so far is not displayed there.

2.4 Data Extraction Logistics

This is a technique to extract logistics information, and it consists of a series of standard extract structures (that is, standard DataSources) delivered in the Business Content.

2.4.1 Data Extraction Illustration

Data extraction is the process of loading data from OLTP to OLAP (BW/BI). Data can be extracted in two modes:

1. Full load: the entire data available at the source is loaded to BW/BI.
2. Delta load: only the new/changed/deleted data is loaded.

2.4.1.1 Full Load:

Document posting means creating a transaction; from a BW perspective, it means writing into the application/transaction tables. So whenever a sales order is created (a document is posted), the transaction is written into the database tables / application tables / transaction tables (e.g. VBAK, VBAP, EKKO, EKPO). Whenever the user does a full load, setup tables are used.


2.4.1.2 Delta Load:

Only the new/changed/deleted data is loaded. Various types of update methods are discussed below:

1. Serialized V3
2. Direct Delta
3. Queued Delta
4. Unserialized V3

Before that, we need to understand V1, V2 and V3 updates. These are different work processes on the application server that take the update LUW from the running program and execute it. They are separated in order to optimize the transaction processing capabilities.

2.5 LO Cockpit Functions

2.5.1 Maintain Extract Structures

Here the user can add additional fields to the extract structure from the communication structures available.

2.5.2 Maintain Data Sources

In the DataSource maintenance screen, the user can customize the DataSource by using the following field properties: field name, short text, selection, hide field, inversion or cancellation field (reverse posting), and field only known in customer exit.

2.5.3 Activating update

By setting the update as active, data is written into the extract structures, both online and during the filling of the setup tables (restructure tables / LO initialization tables).

2.5.4 Controlling update

This concerns the delta update mode the user is using and how the user controls the data load based on the volume of data. Depending on the update mode, a job has to be scheduled with which the updated data is transferred in the background into the central delta management (delta queue).

2.5.5 Setup Tables

Direct access to the application tables is not permitted, and we also want to avoid pulling from R/3 directly, since field values often have to be brought together from multiple tables; hence setup tables are there to collect the required data from the application tables. This is part of the LO extraction scenario. The setup tables are the base tables for a DataSource used in a full upload, so full update is possible in LO extractors: in a full update, whatever data is present in the setup tables (from the last initialization) is sent to BW. However, the setup tables do not receive the delta data posted after the init, so if a full update should get ALL data from the source system, the user needs to delete and re-fill the setup tables. When a load fails, the user can re-run the load to pull the data from the setup tables, and the user can also view the data in the setup tables. Setup table names start with 'MC', followed by the application component ('01'/'02' etc.) and the last digits of the DataSource name, and end with 'SETUP'; in other words, the setup table name is the extract structure name (shown on the R/3 side, which the user can check in LBWE) followed by 'SETUP'. For example, the extract structure MC11VA0ITM has the setup table MC11VA0ITMSETUP.

2.5.6 Serialized V3

Take the example of the same PO item changing many times in quick succession. V1 (with its enqueue mechanism) ensures that the OLTP tables are updated consistently, but the update table receives these update records in a sequence that may or may not be correct (as there is no locking), and in that sequence they would reach BW. Since the update table records carry a timestamp, the V3 job can sequence these records correctly when it runs and thus achieve 'serialization'. 'Serialized V3' was meant to ensure this correct sequence of update records going from the update tables to the delta queue (and then to BW).

2.5.7 Queued Delta (the third update method)

With the queued delta update mode, the extraction data (for the relevant application) is written into an extraction queue (instead of into the update data, as in V3) and can be transferred to the BW delta queues by an update collective run. Up to 10,000 document deltas/changes to one LUW are cumulated per DataSource in the BW delta queues. If the user uses this method, it is necessary to schedule a job to regularly transfer the data to the delta queues. As always, the simplest way to perform the scheduling is via the "Job control" function in LBWE. SAP recommends scheduling this job hourly during normal operation after a successful delta initialization, but there is no fixed rule: it depends on the peculiarities of every specific situation (business volume, reporting needs and so on).

2.5.8 Direct Delta (the second delta update method in our list)

With this update mode, each document posting is directly transferred into the BW delta queue; each document posting with delta extraction leads to exactly one LUW in the respective BW delta queues. Just to remember: 'LUW' stands for Logical Unit of Work, and it can be considered an inseparable sequence of database operations that ends with a database commit (or a rollback if an error occurs).

2.5.9 Unserialized V3 (the last one)

With this update mode, which we can consider the serializer's brother, the extraction data continues to be written to the update tables using a V3 update module and is then read and processed by a collective update run (through LBWE), as previously executed during the V3 update. But, as the name of this method suggests, the unserialized V3 disowns the main characteristic of its brother: data is read in the update collective run without taking the sequence into account and is then transferred to the BW delta queues. Issues: it is only suitable for data targets for which the correct sequence of changes is not important, e.g. material movements, and the V2 update has to be successful.

2.6 LO Data Sources Data Flow in R/3:

DataSources reside in the source system, so if I need to customize a DataSource, I need to go to the source system. One can log in to the source system (R/3 in our case) directly, or log in to it remotely from BW:

1. Logon to BW.

Right Click & go to “Customizing for Extractors”. Choose R/3 Source System (where the user DataSource resides) . Go to Administrator Workbench (RSA1). Go to Source systems. b. 2.a. It shall take the user to Source System ( R/3) . c.

Scroll down to reach the appended structure . Choose “Edit DataSources and Application Component Hierarchy” 4. Click on DataSource 6.3. Go to “0CUSTOMER_ATTR” Customer Number DataSource ( or whatever DataSource the user decide to customize) 5.

9. Check. Double Click on appended structure – ZBIW_KNA1_S1 (or the user chosen one) & add the fields ( which the user wish to add to DataSource) 8.7. save & activate . Go Back .It will append this to Extract Structure of DataSource.

Now Click on Extract Structure – BIW_KNA1_S (in my case). .10. check out the appended field below append structure. Click on 0CUSTOMER_ATTR (DataSource) 11.

2.6.1 Filling up the Appended Structure

1. After appending the structure, we need to fill this append with data. For this:
   a. Logon to the source system (R/3).
   b. Go to transaction code CMOD.
   c. Choose the project which the user is using for the master data enhancement - project "ZSDBWNEW" in my case. (The user needs to create a project in case the user is doing an enhancement on this BW system for the first time.)
   d. Click on Enhancement Assignments.
   e. Click on Change.
2. Choose the following:
   a. Enhancement - RSAP0001
   b. Click on Components.
3. Press Continue in case it prompts for some confirmation like "Enhancement project already active" etc.
4. It will take the user to the list of customer-exit function modules.
5. Choose "EXIT_SAPLRSAP_002" and click on it.

6. We need to write ABAP code to fill up the appended fields in "0CUSTOMER_ATTR" through the include mentioned here, i.e. include ZXRSAU02.
7. Double-clicking on the include will take the user to the ABAP Editor to put in the code.
8. Go to change mode.
9. Make sure the tables KNVV, KNA1, KNKK and KNVP (these are the tables from which I need to extract the values of the appended fields) are declared at the start of the code.
10. This code is to be inserted before the "ENDCASE" of the already written code (a sketch of such code follows this list).
11. Do a "Check" before saving; if no errors are found, save and activate.
12. Go back.
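The include code typically follows the pattern below. This is a minimal sketch only: the appended field ZZKLIMK and its source (the credit limit KNKK-KLIMK) are illustrative assumptions, not the document's actual field mapping, and in older plug-in releases the import parameter is named I_ISOURCE rather than I_DATASOURCE. Since the document says the code goes before the existing ENDCASE, in practice only the WHEN branch is added to the CASE statement already present in the include; it is shown in full here for readability.

* Include ZXRSAU02 - processed by customer exit EXIT_SAPLRSAP_002
* (enhancement RSAP0001, master data attributes).
DATA: l_s_kna1 TYPE biw_kna1_s,   " extract structure of 0CUSTOMER_ATTR
      l_tabix  TYPE sy-tabix.

CASE i_datasource.
  WHEN '0CUSTOMER_ATTR'.
    LOOP AT c_t_data INTO l_s_kna1.
      l_tabix = sy-tabix.
*     Look up the value for the appended field (illustrative: credit
*     limit from table KNKK for the customer currently being extracted)
      SELECT SINGLE klimk FROM knkk
             INTO l_s_kna1-zzklimk
             WHERE kunnr = l_s_kna1-kunnr.
      MODIFY c_t_data FROM l_s_kna1 INDEX l_tabix.
    ENDLOOP.
ENDCASE.

The same pattern applies to transaction data, where the include is ZXRSAU01 and the exit is EXIT_SAPLRSAP_001.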

13. Come out of the editor.
14. Do the check again and, if there are no errors, activate.
15. Go back and then activate, keeping activating while going back on every step.

2.6.2 Regenerate & Check the Customized Objects

Now the final step: we need to regenerate the DataSource and make sure the newly added attributes (starting with ZZ) are not hidden (HIDE).

1. Logon to the BW system, go to the Administrator Workbench (RSA1), go to Source Systems, and choose the source system (R/3 in our case).
2. Right-click and go to "Customizing for Extractors".
3. Choose "Edit DataSources and Application Component Hierarchy".
4. Go to the "0CUSTOMER_ATTR" Customer Number DataSource, click on CHANGE, and scroll down.

Scroll down & Go to Fields starting with ZZ & make sure these are not HIDE (remove the sign from HIDE check box). 7.5. 6. Click on DataSource & Generate. Next Steps are Checking Extractors & Loading data . Go back. Keep clicking on Right Sign if it prompts for any confirmation. as shown below 8.

2.7 Structure of Delta Method for LO Cockpit Data Sources

The delta InfoPackage is executed to extract data from the delta queue into the SAP BI system, and it is scheduled as part of a process chain. The data is extracted to the persistent staging area, which forms the first physical layer in BI (it can be part of a pass-through layer or an EDW layer), from where the data is further staged into a DSO and an InfoCube. The data can be updated to a DSO in overwrite or summation mode. Note the negative values of the key figures for the before-image record in the PSA table; when the data from an LO DataSource is updated to a DSO with the setting "unique record", the before image is ignored.

Whenever we request delta data from BW, the data is pulled via the delta queue, and the delta LUWs are saved in the repeat delta table. Generic extractors of type (extraction method) 'F2' with delta process AIE (After Image via Extractor) use a pull delta model: the data is extracted by means of a function module that, in contrast to 'F1', occupies a simplified interface (see the documentation for data element ROFNAME_S). For such F2 extractors only the repeat delta LUWs are visible in RSA7, while for the normal F1-type extractors both delta and repeat delta LUWs are visible in RSA7. In the screenshot below, Total = 2 refers to the number of LUWs in the repeat delta table.

2.7.1 Delta Management in extraction

The LO cockpit configuration screen (LBWE) contains the following key parameters:

- Maintenance of extract structures
- Maintaining InfoSources
- Activating the update
- Job control
- Update mode:
  a. Serialized V3 Update: the document data is collected in the order it was created and transferred into BW as a batch job. The transfer sequence is the same as the order in which the data was created.
  b. Direct Delta: the extraction data is transferred directly from the document postings into the BW delta queue. The transfer sequence is the same as the order in which the data was created.
  c. Queued Delta: the extraction data from document postings is collected in an extraction queue, from which a periodic collective run is used to transfer the data into the BW delta queue. The transfer sequence is the same as the order in which the data was created.
  d. Unserialized V3 Update: this method is the opposite of the serialized V3 update; the order of the data in the BW delta queue does not have to be the same as the order in which it was posted.

2.7.2 Step-by-Step Maintenance

We first need to log into the R/3 system and call transaction SBIW (Display IMG). There are two setup tables in R/3, which exist and get filled with the logistics data before it is extracted into BW. Here, navigate as per the screenshot below and delete the data from the setup tables.

Click on the clock symbol to delete the existing contents of the setup tables, if any, so that they are free and we can go ahead and configure our logistics DataSources. Select the application component and execute the deletion of data. Click YES on the popup; an information message will be displayed in the message box.

Call transaction LBWE and navigate as per the below screenshot. Click on "Maintain Extract Structure" to maintain the fields; an information message will be displayed.

In the above screenshot: click on the "Update Overview" text to reach the following screen. This will take the user to SM13 for any pending table updates; execute there.

Now go back to the previous screen and click on "BW Maintenance Delta Queue". This will take the user to transaction RSA7 to view the delta queues, if any.

Click back to reach this pop-up. Now, on the main LBWE screen, the user can see a RED status before the DataSource. Click on Run; it will prompt for confirmation of the entries in the extract structure. Assign a request so that it generates the extract structure successfully.

Now click on the DataSource as below, and assign a request to get to the DataSource screen, where the properties related to the fields can be modified.

As the user assign this and come back the user will see a change in the status color as YELLOW. . Now go to the BW System and Replicate the related Datasource from the Exact Source system.

Now go back to the R/3 system, click on the ACTIVE parameter under Job Control, and assign a request. The user will now see the status color turn GREEN, and the user can then assign the update mode as well.

8. Now in the BW system create transformations from the DataSource to the InfoProvider. Create an InfoPackage and a DTP to load the data.

2.8 Delta Method

With this process, the new information that is generated daily in the source system must be sent to BW. The principal reason for this is the volume of information that is imported into BW every day. This information is stored in the delta queue (transaction RSA7). The delta queue contains only the information that has not yet been sent to BW, plus the last request; the data is taken from here when BW asks the source system for information. The DataSources that should be used are:

2.8.1 Master Data

0ARTICLE_ATTR
0ARTICLE_TEXT
0CUSTOMER_ATTR
0CUSTOMER_TEXT
0PLANT_ATTR
0PLANT_TEXT
0MATL_GROUP_TEXT
0MAT_VEND_ATTR

0VENDOR_ATTR
0VENDOR_TEXT
0SALESORG_ATTR
0SALESORG_TEXT
0SALES_DIST_TEXT
0SALES_GRP_TEXT
0SALES_OFF_TEXT

2.8.2 Transactional Data

2LIS_02_SCL
2LIS_03_BF
2LIS_03_BX
2LIS_03_UM
2LIS_13_VDITM
2LIS_40_REVAL

In the screen Installation of DataSource from Business Content, the user must mark the DataSource that the user wants to activate and select Activate DataSource.

2.8.3 Delta Process

The delta process is a feature of the extractor and specifies how data is to be transferred. As a DataSource attribute, it specifies how the DataSource data is passed on to the data target, and how the update and serialization are to be carried out. From this the user can derive, for example, for which data targets a DataSource is suited. The delta process also controls whether adding or overwriting is permitted.

- Forming deltas with after, before and reverse images that are updated directly in the delta queue (technical name of the delta process in the system: ABR). An after image shows the status after the change; a before image shows the status before the change, with a negative sign; and the reverse image also carries a negative sign while flagging the record for deletion. This serializes the delta packets. The process supports an update in an ODS object as well as in an InfoCube; adding and overwriting are both permitted. This delta process is used by LO DataSources.
- The extractor delivers additive deltas that are serialized by request (technical name of the delta process in the system: ADD). This serialization is necessary since the extractor delivers each key only once within a request; otherwise changes in the non-key fields would not be copied over correctly. It only supports the addition of fields. This process supports an update in an ODS object as well as in an InfoCube. This delta process is used by LIS DataSources.
- Forming deltas with after images, which are updated directly in the delta queue (technical name of the delta process in the system: AIM/AIMD). This serializes data by packet, since the same key can be copied more than once within a request. For numeric key figures, this process only supports overwriting and not adding; otherwise incorrect results would come about. It does not support the direct update of data in an InfoCube: an ODS object must always be in operation when the user updates data in an InfoCube. It is used in FI-AP/AR for transferring line items, while the variation of the process in which the extractor can also send records with the deletion flag (AIMD) is used in BBP.
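For reference (not stated explicitly above, but standard in SAP BI), these image types correspond to values of the record mode field (0RECORDMODE) carried in the delta records:

  (blank)  after image
  X        before image
  N        new image
  A        additive image (used by the ADD delta process)
  R        reverse image
  D        delete flag (the record is to be deleted)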

2.9 Delta Method Properties

2.9.1 Delta Initialization

In contrast with other Business Content and generic DataSources, which extract data by directly accessing the application tables (as the data extractors for HR, FI etc. do), the LO DataSources use the concept of setup tables to carry out the initial data extraction process; the LO extractors do not access the application tables directly. The restructuring/setup tables are cluster tables that hold the respective application data, and their presence prevents the BI extractors from directly accessing the frequently updated, large logistics application tables; they are used only for the initialization of data to BI. For loading data into the BI system for the first time, the setup tables have to be filled: a job is executed to fill them, the init InfoPackage extracts the initial data into BI, and the BI system extracts the data as a one-time activity for the initial data load. The data can be deleted from the setup tables after successful data extraction into BI to avoid redundant storage. The setup tables in SAP have the naming convention <Extraction structure>SETUP, and the compressed data from the application tables stored there can be viewed through SE11. Thus the DataSource 2LIS_11_VAITM, having the extract structure MC11VA0ITM, has the setup table MC11VA0ITMSETUP.

2.9.2 Delta Extraction

Once the initialization of the logistics transaction data DataSource has been successfully carried out, all subsequent new and changed records are extracted to the BI system using the delta mechanism supported by the DataSource. The LO DataSources support the ABR delta mechanism, which is both DSO and InfoCube compatible. The ABR delta creates deltas with after, before and reverse images that are updated directly to the delta queue, which gets automatically generated after successful delta initialization. The after image provides the status after the change, a before image gives the status before the change with a minus sign, and a reverse image sends the record with a minus sign for deleted records. The serialization plays an important role if the delta records have to be updated into a DSO in overwrite mode. For example, in sales document 1000, if the quantity of ordered material is changed from 10 to 14, the data gets extracted as shown in the table below.
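The referenced table is not preserved in this copy; based on the image logic just described, the extracted records would look roughly like this (an illustrative sketch - the exact PSA field layout will differ):

  Sales document  Item  Image                        Record mode  Quantity
  1000            10    New image (initial posting)  N            10
  1000            10    Before image (after change)  X            -10
  1000            10    After image (after change)   (blank)      14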

The type of delta provided by the LO DataSources is a push delta, i.e. the delta data records from the respective application are pushed to the delta queue before they are extracted to BI as part of the delta update. Whether a delta is generated for a document change is determined by the LO application. This is a very important aspect for the logistics DataSources, as the very program that updates the application tables for a transaction also triggers/pushes the data for the information systems, by means of an update type, which can be a V1 or a V2 update.

2.9.3 Update Modes

Before elaborating on the delta methods available for LO DataSources, it is necessary to understand the various update modes available for the logistics applications within the SAP ECC 6.0 system. Consider, for example, the creation of a sales order (VA01) in the system. From a logistics application perspective, the data entered by the user is directly used for creating the order, and it also indirectly forms part of the information for management information reporting; the latter is often termed a statistical update. The former, having an integrated controlling aspect, is a time-critical activity: the creation of the order takes a higher priority than the result calculations triggered by the entry. The data entered by the user is used by the logistics application for achieving both of these aspects, and the SAP system treats the two events generated by the creation of the order with different priorities by using two different update modes, the V1 update and the V2 update. Apart from these two update modes, SAP also supports a collective run, called the V3 update, which carries out updates in the background. The three update methods, discussed separately below, are:

a) V1 Update
b) V2 Update
c) V3 Update

2.9.3.1 V1 Update

While carrying out a transaction, the user enters data and saves the transaction. A V1 update is carried out for critical or primary changes, which affect objects that have a controlling function in the SAP system, for example the creation of a sales order. These updates are time-critical, synchronous updates.

With V1 updates, the program that issues the statement COMMIT WORK AND WAIT waits until the update work process outputs the status of the update; the program can then respond to errors separately. The V1 updates are processed sequentially in a single update work process, and they belong to the same database LUW. These updates are executed under the SAP locks of the transaction that creates the update, thereby ensuring consistency of data and preventing simultaneous updates. The most important aspect is that V1 synchronous updates can never be processed a second time. The V1 updates are carried out as a priority, in contrast to V2 updates, though the V2 updates are usually also processed straight away. If this is not the case, the V2 components are processed by a V1 update process, but the V1 updates must be processed before the V2 update.

2.9.3.2 V2 Update

A V2 update, in contrast with V1, is executed for less critical, secondary changes, and these are pure statistical updates resulting from the transaction. They are asynchronous in nature. They are carried out in a separate LUW and not under the locks of the transaction that creates them, and they are often executed in the work process specified for V2 updates.

2.9.3.3 V3 Update

Apart from the above-mentioned V1 and V2 updates, the SAP system also has another update method called the V3 update, which consists of collective run function modules. Compared to the V1 and V2 updates, the V3 update is a batch asynchronous update, which is carried out when a report (RSM13005) starts the update in background mode. The V3 update does not happen automatically, unlike the V1 and V2 updates. All function module calls are collected, aggregated and updated together, and are handled in the same way as V2 update modules. For example, if one of the function modules increments a statistical entry by one and is called up 10 times during the course of the transaction, then implementing it as a V2 update means it runs 10 times after the V1 update for the transaction has completed, i.e. the database is updated 10 times. But when executed as a V3 update, the update can be executed at any time in one single operation, carried out in one database operation at a later point in time. This largely reduces the load on the system.
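A minimal ABAP sketch of how a transaction registers its update parts; the function module names and the order structure are hypothetical. Whether a registered module runs as V1 or V2 is an attribute of the function module itself (update with immediate vs. delayed start), and V3 collective-run modules are likewise collected rather than executed at commit:

  DATA ls_order TYPE zorder_s.           " hypothetical order structure

  * Register the time-critical V1 part (e.g. writing the order itself).
  CALL FUNCTION 'Z_V1_DOCUMENT_UPDATE' IN UPDATE TASK
    EXPORTING
      is_order = ls_order.

  * Register the statistical V2 part (e.g. incrementing statistics).
  CALL FUNCTION 'Z_V2_STATISTICS_UPDATE' IN UPDATE TASK
    EXPORTING
      is_order = ls_order.

  * COMMIT WORK AND WAIT triggers the update and waits until the update
  * work process reports the status back, as described above.
  COMMIT WORK AND WAIT.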

2.10 Delta Queue Functions

The LO DataSource implements its delta functionality using the above update methods, either individually or in combination. SAP provides different mechanisms for pushing the data into the delta queue; these are called update modes. The different update modes available with LO DataSources are:

a. Direct Delta
b. Queued Delta
c. Un-serialized V3 Delta

The following sections discuss in detail the concept behind the three delta update modes, the scenarios in which they are used, and their advantages and disadvantages. The figure below shows the update modes for the delta mechanism from the LO Cockpit.

2.10.1 Direct Delta (V1 update)

A direct delta updates the changed document data directly, as an LUW, to the respective delta queue. A logistics transaction posting leads to an entry in the application tables, and the delta records are posted directly to the delta queue using the V1 update. The data available in the delta queue is then extracted periodically to the BI system.

Advantages of direct delta:
a. Writing to the delta queue within the V1 posting process ensures serialization by document.
b. Recommended for customers with a low number of documents.
c. Extraction is independent of V2 updating.
d. No additional monitoring of update data or extraction queue is required.

Disadvantages of direct delta:
a. Not suitable for scenarios with a high number of document changes, since V1 is more heavily burdened.
b. Setup and delta initialization are required before document postings are resumed.
c. When using this update mode, no document postings should be carried out during delta initialization in the concerned logistics application, from the start of the recompilation run in the OLTP until all delta init requests have been successfully updated in BW.

The data from documents posted during the reinitialization process is completely lost.

2.10.2 Queued Delta (V1 + V3 updates)

In the queued delta update mode, the logistics application pushes the data from the concerned transaction into an extraction queue by means of the V1 update. The data is collected in the extraction queue, and a scheduled background job transfers the data from the extraction queue to the delta queue with an update collection run, in a similar manner to the V3 update. Depending on the concerned application, up to 10,000 delta extractions of documents can be aggregated into one LUW in the delta queue for a DataSource. The data pushed by the logistics application can be viewed in the logistics queue overview function in the SAP ECC 6.0 system (transaction LBWQ). SAP recommends the queued delta process for customers with a high volume of documents, with the collection job for extraction from the extraction queue scheduled on an hourly basis.

2.10.2.1 Benefits

- When the user needs to perform a delta initialization in the OLTP, the document postings relevant for the involved application can be opened again as soon as the execution of the recompilation run (or runs, if several are running in parallel) ends - that is, when the setup tables are filled and a delta init request has been posted in BW - because, thanks to the logic of this method, the system is able to collect new document data during the delta init upload too (with a deeply felt recommendation: remember to avoid the update collective run before all delta init requests have been successfully updated in BW!).
- In contrast to direct delta, this process is especially recommended for customers with a high occurrence of documents (more than 10,000 document changes - creation, change or deletion - performed each day for the application in question).
- In the initialization process, the collection of new document data during the delta initialization request can reduce the downtime of the restructuring run.
- By writing to the extraction queue within the V1 update process (which is more heavily burdened than by using V3), the serialization is ensured by using the enqueue concept; the collective run clearly performs better than the serialized V3, and in particular the slowing-down due to documents posted in multiple languages does not apply in this method.
- The entire extraction process is independent of the V2 update process.
- In contrast to the V3 collective run, event handling is possible here, because a definite end of the collective run is identifiable: when the collective run for an application ends, an event (&MCEX_nn, where nn is the number of the application) is automatically triggered and can thus be used to start a subsequent job. The collection job uses the report RMBWV311 (for sales orders), and the collection function modules have the naming convention MCEX_UPDATE_<Application>, e.g. MCEX_UPDATE_11 for sales orders.

2.10.2.2 Limits

- V1 is more heavily burdened compared to V3.
- Administrative overhead of the extraction queue.

2.10.3 Un-serialized V3 Update (V1/V2 + V3 updates)

In this mode of delta update, the concerned logistics application writes data to update tables, which in turn transfer the data to the delta queue by means of a collection run called the V3 update. Once the data has been written to the update tables by the logistics application, it is retained there until it is read and processed by a collective update run - a scheduled background job, the V3 update job - which updates all the entries in the update tables to the delta queue.

As the name suggests, this update is un-serialized, i.e. this mode of update does not ensure serialization of the documents posted to the delta queue. This means that the entries in the delta queue need not correspond to the actual sequence of updates that happened in the logistics application. This matters if the data from the DataSource is further updated to a DSO in overwrite mode, as the last entry would overwrite the previous entries, resulting in erroneous data; for example, if a quantity changes from 10 to 12 and then to 14 but the records reach the delta queue out of sequence, a DSO in overwrite mode could end up showing 12. An un-serialized delta update, when used, should therefore always update data either to an InfoCube or to a DSO with key figures in summation mode. It is also advisable to avoid the un-serialized V3 update for documents subject to a large number of changes when it is necessary to track those changes.

2.11 Generic extraction
Generic R/3 data extraction allows us to extract virtually any R/3 data. Generic data extraction is a function in Business Content that supports the creation of DataSources based on database views or InfoSet queries (an InfoSet is similar to a view but allows outer joins between tables). The generic delta service supports delta extractors on monotonic 'delta attributes' such as timestamp, calendar day, or a numeric pointer (e.g. document number, counter); the attribute must be strictly monotonically increasing with time - for example, document numbers 1000, 1001, 1002 are assigned in time order, so the last extracted value identifies where the next delta starts. Only one attribute can be defined as the delta attribute. For extracting data from the VBAK table, the Logistics Extraction Cockpit is the recommended method.

2.11.1 Create Generic Extraction [Master Data]

1. Under transaction SBIW - this step gives the user the option of creating and maintaining generic DataSources for transaction data, master data attributes or texts, from any kind of transparent table, database view, SAP Query functional area, or via a function module, regardless of application. This enables the user to use generic extraction of data.

2. Create a generic DataSource: a) Select the DataSource type and assign a technical name to it. b) Choose Create. The screen for creating a generic DataSource appears.

3. a) Choose an application component to which the DataSource is to be assigned. b) Enter the descriptive texts; the user can choose these freely. c) Choose Generic Delta.

4. Specify the delta-specific field and the type of this field. Maintain the settings for the generic delta and specify a safety interval. The safety interval should be set so that no document is missed - even one that was not yet stored in the DB table when the extraction took place.

5. Select the delta type: New status for changed records (i.e. after image, AIE), which can be used with an ODS data target; or Additive Delta (i.e. aggregated data records, ADD). Then choose Save. (A sketch of the resulting selection logic follows after this procedure.)

6. After saving the delta settings, the screen of step 3 comes back. Now choose Save again.

7. Choose Save again. This will generate the DataSource. After generating the DataSource, the user will see the Delta Update flag selected. Delta attributes can be monitored in the delta queue (RSA7), where the user can display the current value of the delta-relevant field. Whenever a delta is extracted, the extracted data is also stored in the delta queue tables (in systems as of Basis release 4.0B) to serve as a fallback when an error occurs during the update of the BW system. Delta is enabled by the data selection logic. Note that the LUW count shown does not equal the number of changed records in the source table: most of the time it will be ZERO, but it can also have the value 1, in which case the user will see a '1' in this field (the extract counts as one LUW) and the records can be displayed in a detail screen.
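As referenced in step 5, here is a rough sketch of the selection logic behind the generic delta service. The table ZSD_ORDERS, the field CHANGED_TS and the variable names are hypothetical; the real selection is generated internally by the service rather than hand-coded:

  * Delta window derived from the stored pointer and the safety intervals.
  DATA: lv_lower TYPE timestamp,   " last pointer minus lower safety interval
        lv_upper TYPE timestamp.   " current time minus upper safety interval
  DATA  lt_delta TYPE STANDARD TABLE OF zsd_orders.

  SELECT * FROM zsd_orders
    INTO TABLE lt_delta
    WHERE changed_ts >  lv_lower
      AND changed_ts <= lv_upper.

  * Records inside the safety interval may be selected again in the next
  * run; an after-image delta (AIE) into a DSO in overwrite mode absorbs
  * such duplicates.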

2.12 Generic Data Types

2.12.1 Master Data

In SAP BW, three different types of master data can be differentiated in InfoObjects.

2.12.1.1 Attributes

Master data attributes are fields that are used to provide a more detailed description of master data elements. These attributes are used to display additional information so that results can be better understood. An example of a master data attribute is the country of the supplier that goes with the supplier number. An attribute table can be used by several InfoCubes. This ensures a higher level of transparency for the user and a more comprehensive consistency.

2.12.1.2 Texts

Texts are used to describe a master record. In SAP Business Information Warehouse (SAP BW), up to three texts can be maintained for each master record: one short text, one medium text, and one long text. An example of a master data text is the name of the supplier that goes with the supplier number.

2.12.1.3 Hierarchies

Hierarchies can be used in the analysis to describe alternative views of the data. A hierarchy consists of a quantity of nodes that have a parent-child relationship with one another. An example of this is the cost center hierarchy. The structures can be defined in a version-specific as well as a time-dependent manner.

2.12.2 Functions

2.12.2.1 Time-dependent Attributes

If the characteristic has at least one time-dependent attribute, a time interval is specified for this attribute. As master data must exist in the database for the period from 01.01.1000 to 12.31.9999, the gaps are filled automatically.

2.12.2.2 Time-dependent Texts

If the user creates time-dependent texts, the text for the key date is always displayed in the query.

2.12.2.3 Time-dependent Texts and Attributes

If the texts and the attributes are time-dependent, the time intervals do not have to agree.

2.12.2.4 Language-dependent Texts

In the Characteristic InfoObject maintenance, the user can determine whether the texts are language-specific (for example, with product names: German Auto, English car) or not (for example, customer names). If they are language-dependent, the user has to upload all texts with a language indicator. Only the texts in the selected language are displayed.

2.12.3 Transactional Data

Transaction data includes origination data, competitive data, monthly account data, and macroeconomic data. It is regularly updated and loaded into the system. Any business event in SAP R/3 leads to transaction data being generated. SAP R/3 handles transaction data at the document level or at the summary level. Document-level transaction data is detailed data and consists of headers, line items, and schedule lines. Summary transaction data is used in SAP R/3 mostly for reporting purposes. When loading transactional data, the system utilizes elements of the master data to process the transactions.

Generic extractors are of 3 types:
1. Based on table/view
2. Based on InfoSet query
3. Based on function module

2.13 Generic DataSources

A generic DataSource is defined by an extract structure and a transfer structure on the source system side (e.g. SAP R/3) and a transfer structure on the BW side:

ES -> which data is to be extracted from the source system.
TS -> which data is to be transferred from the source system to BW; it is a group of fields that indicates how the data comes from the source system into BW.

Data that logically belongs together is stored in the source system in the form of DataSources. A DataSource consists of a quantity of fields that are offered for data transfer into BI. The DataSource is technically based on the fields of the extraction structure; these fields can be enhanced as well as hidden (or filtered) for the data transfer. Additionally, the DataSource describes the properties of the associated extractor with regard to data transfer to BI. DataSources make the source system data available to BI on request in the form of the (if necessary, filtered and enhanced) extraction structure. Upon replication, the BI-relevant properties of the DataSource are made known in BI. DataSources are thus used for extracting data from an SAP source system and for transferring data into BI.
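A simplified sketch of how these structures relate (illustrative only):

  [Source system]  application tables
        -> extract structure (ES): fields offered by the extractor
        -> transfer structure (TS): fields actually sent to BW
  [BW]  transfer structure -> transfer rules -> InfoSource -> data target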

In the DataSource maintenance in BI, the user determines which fields from the DataSource are actually transferred. In the transformation, the user determines what the assignment of fields from the DataSource to InfoObjects in BI should look like; the rules set in the transformation apply here. Data is transferred into the input layer of BI, the Persistent Staging Area (PSA). Data transfer processes facilitate the further distribution of the data from the PSA to other targets.

2.13.1 Extraction Structure

In the extraction structure, data from a DataSource is staged in the source system. It contains the set of fields that are offered by an extractor in the source system for the data loading process. The user can edit DataSource extraction structures in the source system: in transaction SBIW, choose Business Information Warehouse -> Subsequent Processing of DataSources.

2.13.2 Editing the DataSource in the Source System

The user can edit DataSources in the source system, meaning completing the extraction structure - that is, filtering the extraction structure and/or enhancing the DataSource with fields. Here, the user can also determine the DataSource fields in which extraction structure fields are hidden from the transfer.

2.13.3 Replication of DataSources

In the SAP source system, a DataSource can have the SAP delivery version (D version: object type R3TR OSOD) or the active version (A version: object type R3TR OSOA). In particular, information is only retained if it is required for data extraction. The metadata from the SAP source systems is not dependent on the BI metadata. The assignment of source system objects to BI objects takes place exclusively and centrally in BI; there is no implicit assignment of objects with the same names. Replication allows the user to make the relevant metadata known in BI so that data can be read more quickly.

2.13.3.1 Replication Process Flow

In the first step, the D versions are replicated: only the DataSource header tables of BI Content DataSources are saved in BI as the D version. Replicating the header tables is a prerequisite for collecting and activating BI Content. In this way, the user avoids generating too many DDIC objects unnecessarily as long as the DataSource is not yet being used,

that is, as long as a transformation does not yet exist for the DataSource. In the second step, the A versions are replicated: DataSources (R3TR RSDS) are saved in BI in the M version and 3.x DataSources (R3TR ISFS) are saved in BI in the A version, in each case with all the relevant metadata.

2.13.3.2 Deleting DataSources during Replication

DataSources are only deleted during replication if the user performs replication for an entire source system or for a particular DataSource. If, during replication, the system determines that the D version of a DataSource in the source system, or the associated BI Content (shadow objects of the DataSource, R3TR SHDS, or shadow objects of the mapping, R3TR SHMP), is not or no longer available in BI, the system automatically deletes the D version in BI. When the user replicates DataSources for a particular application component, the system does not delete any DataSources, because they may have been assigned to another application component in the meantime.

2.13.3.3 Automatic Replication during Data Request

The user can use a setting in the InfoPackage maintenance under Extras -> Synchronize Metadata to define that, whenever there is a data request, automatic synchronization of the metadata in BI with the metadata in the source system takes place. If this indicator is set, the DataSource is automatically replicated upon each data request if it has changed in the source system.

2.14 Enhancing Business Content

SAP provides Business Content, which is used in BW for extracting, reporting and analysis. The steps involved in using the Business Content and generating reports are:

1. Log on to the SAP R/3 server.
2. Install the Business Content. The transaction code (Tcode) is RSA5.

3. Check if the Business Content is ready for use. Tcode: RSA6.
4. Enhance the structure: add the field BOM to the extract structure. Here I am using BOM (Bill of Materials) from the table VBAP in Sales and Distribution (SD). Hence it should be mapped into 0FI_GL_4 from VBAP when the condition is satisfied.
5. Include the following code in the user exit (a user exit is written for this purpose):

* Declarations for the enhancement of DataSource 0FI_GL_4
TABLES: AUFK, VBAP.

DATA: Temp_sorder           LIKE DTFIGL_4,
      Temp_KDAUF            LIKE DTFIGL_4-KDAUF,
      Temp_ZZSEQUENCENUMBER LIKE DTFIGL_4-ZZSEQUENCENUMBER,
      Temp_ZZBOM            LIKE DTFIGL_4-ZZBOM,
      Temp_ZZORDERCAT       LIKE DTFIGL_4-ZZORDERCAT.

CASE i_datasource.
  WHEN '0FI_GL_4'.
    LOOP AT C_t_data INTO Temp_sorder.
*     Read the sales order, sequence number and order category from AUFK
      SELECT SINGLE KDAUF FROM AUFK INTO Temp_KDAUF
        WHERE AUFNR = Temp_sorder-AUFNR AND BUKRS = Temp_sorder-BUKRS.
      SELECT SINGLE SEQNR FROM AUFK INTO Temp_ZZSEQUENCENUMBER
        WHERE AUFNR = Temp_sorder-AUFNR AND BUKRS = Temp_sorder-BUKRS.
      SELECT SINGLE AUTYP FROM AUFK INTO Temp_ZZORDERCAT
        WHERE AUFNR = Temp_sorder-AUFNR AND BUKRS = Temp_sorder-BUKRS.
*     Read the BOM (STLNR) from VBAP
      SELECT SINGLE STLNR FROM VBAP INTO Temp_ZZBOM
        WHERE GSBER = Temp_sorder-GSBER AND AUFNR = Temp_sorder-AUFNR.
*     Move the read values into the extract record and write it back
      Temp_sorder-KDAUF            = Temp_KDAUF.
      Temp_sorder-ZZSEQUENCENUMBER = Temp_ZZSEQUENCENUMBER.
      Temp_sorder-ZZORDERCAT       = Temp_ZZORDERCAT.
      Temp_sorder-ZZBOM            = Temp_ZZBOM.
      MODIFY C_t_data FROM Temp_sorder.
    ENDLOOP.
ENDCASE.

6. Activate the enhanced structure.
7. Activate the user exit.
8. A field Bill of Material (ZZBOM) from VBAP (SD) has now been enhanced into the FI_GL_4 structure.
9. Business Content has been installed and activated.
10. Log on to the BW server.
11. Create an InfoObject for the enhanced field ZZBOM.

12. Replicate the DataSource from SAP R/3 to the BW server.
13. Assign the DataSource.
14. Create an ODS. Assign the objects corresponding to the key fields and data fields.
15. Map the fields manually, since ZZBOM is an enhancement; it should be mapped to the InfoObject created by us.

16. Create update rules for the ODS.
17. Create an InfoSource and an InfoPackage.
18. Schedule the load. The data is extracted from SAP R/3 when we schedule the load from the InfoPackage.
19. The data is then loaded into the ODS, which is the data target for the InfoPackage. The load into the ODS can be monitored; the data load is successful.
20. Create an InfoCube.
21. Create the relevant dimensions for the cube.
22. Assign the corresponding characteristics, time characteristics and key figures.
23. Activate the InfoCube and create update rules in order to load the data from the ODS.

24. Update the ODS; the InfoCube is successfully loaded.
25. Manage the InfoCube to view the available data.
26. Data is now available in the InfoCube.
27. Create a new query using the Query Designer - here, the Query Designer for 0FI_GL_4.
28. General Ledger Balance Sheet for the periods 1995, 1996 and 1997: as per the requirement, the General Ledger InfoCube is created and ready for reporting.

3. Extraction with Flat Files

The SAP Business Information Warehouse allows the user to analyze data from operative SAP applications as well as from all other business applications and external data sources such as databases, online services and the Internet. The SAP Business Information Warehouse enables Online Analytical Processing (OLAP), which processes information from large amounts of operative and historical data; OLAP technology enables multi-dimensional analyses from various business perspectives. The Business Information Warehouse Server for core areas and processes, pre-configured with Business Content, enables the user to look at information within the entire enterprise. In selected roles in a company, Business Content offers the information that employees need to carry out their tasks. As well as roles, Business Content contains other pre-configured objects such as InfoCubes, queries, key figures and characteristics, which make BW implementation easier.

3.1 Data from Flat Files (7.0)

BI supports the transfer of data from flat files: files in ASCII format (American Standard Code for Information Interchange) or CSV format (Comma Separated Value). For example, if budget planning for a company's branch offices is done in

2. as well as about creating InfoSources for flat files under:  Flexibly Updating Data from Flat Files  Updating Master Data from a Flat File  Uploading Hierarchies from Flat Files The structure of the flat file and the metadata (transfer structure of the DataSource) defined in SAP BW have to correspond to one another to enable correct data transfer. 1. 3. the extracted data is added in BW. the values are overwritten in BW. that is. With additive deltas. Make especially sure that the sequence of the InfoObjects corresponds to the sequence of the columns in the flat file. delta transfer in the case of flexible updating is supported.2 Data from Flat Files (3. The transfer of data to SAP BW takes place via a file interface. The user can find more information about this.3 Extracting Transaction and Master Data using Flat Files Go to T-CODE SE11 and select the radio button TRANSACTION DATA and give the some technical name and click on CREATE button. The user can find more information under Maintaining InfoPackages Procedure for Flat Files. defining the metadata for the user file in BI. The user defines a file source system. The user creates a DataSource in BI. the DataSource is done manually for flat files in SAP BW. The user can establish if and which delta processes are supported during maintenance of the transfer structure. 3. this planning data can be loaded into BI so that a plan-actual comparison can be performed. Determine the parameters for data transfer in an InfoPackage and schedule the data request.Microsoft Excel.x) The metadata update takes place in DataSource maintenance of BI. The data for the flat file can be transferred to BI from a workstation or from an application server. DataSources with this delta process type can write the data into ODS objects and master data tables. The user creates an InfoPackage that includes the parameters for data transfer to the PSA. For flat files. During transfer of the new status for modified records. . DataSources with this delta process type can supply both ODS objects and InfoCubes with data. 3.Definition and updating of metadata.

Then click on SAVE button. TABLE NAME and give Descriptions.Give APLICATION COMPONENT. .

which we already assigned in R/3 side. For transaction data KEY FIGURES and REFERENCE are compulsory. . And then -> Go to BW side and Replicate the user datasource by selecting the application component.Select the fields like master data otherwise click on hide. in the source system tab. Some key figures along with the reference values. So select click on SAVE button.

-> Click on Assign InfoSource in the context menu of the replicated DataSource. -> Create the InfoSource as flexible update.

-> Assign the transfer rules by selecting the necessary InfoObject for each extracted field (choosing via F4 for each and every InfoObject), and then click on Activate.

-> Go to InfoProvider and create an InfoCube.

.

.

.

-> After creating the InfoCube, create update rules for that InfoCube.

.

-> Go to the InfoSource, select the DataSource and create an InfoPackage.

.

3.4 Data Types That Can Be Extracted Using Flat Files

Flat files are data files that contain records with no structured relationships. This type of file contains no inherent information about the data, and interpretation requires additional knowledge, such as the file format properties. Modern database management systems use a more structured approach to file management (such as one defined by the Structured Query Language) and therefore have more complex storage arrangements; for this reason, a simple data file is referred to as a flat file. Examples: .txt, .lst, .lis; .csv is a comma-separated flat file. Many database management systems offer the option to export data to a comma-delimited file.

3.4.1 Basic Steps of Data Flow (ETL process):

1. Determine the source system (flat file/SAP R/3/Web Services etc.).
2. Establish an interface w.r.t. the source system used.
3. Create a DataSource, which has a structure similar to the source structure. Consider two factors: the name of the file, and a structure the same as the source.
4. Create an InfoPackage to transfer data from the DataSource to the PSA (Persistent Staging Area).
5. Create a target (InfoProvider: InfoObjects/InfoCubes/DSO etc.).
6. Create a transformation rule to map source and target fields.
7. Create a DTP (Data Transfer Process) to transport the data stored within the PSA to the final data target.

3.4.2 Step-by-Step: Upload Master Data from Flat File to InfoObjects

Transaction code - RSA1: Data Warehouse Workbench

Source/input: create a flat file of type CSV with the following structure:

  Customer  Language  Land  City     Name
  260       EN        US    SEATTLE  JOHN
  261       DE        DE    HAMBURG  SAM
  262       DE        DE    WALDORF  MICHEAL

Right click on the InfoProvider column header. Select "Create InfoArea".
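Saved as a CSV file (assuming a comma as separator and one header row, matching the DataSource settings used later), the input file would contain:

  CUSTOMER,LANGUAGE,LAND,CITY,NAME
  260,EN,US,SEATTLE,JOHN
  261,DE,DE,HAMBURG,SAM
  262,DE,DE,WALDORF,MICHEAL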

Right click on the created InfoArea and create InfoObject Catalog .Enter the details – InfoArea name and Description The InfoArea – IA for Customer appears on the InfoProvider list.

Enter a suitable name and description for the InfoObject Catalog. Select 'Char.' or 'Key Figure' based on the type of data (master data / transaction data) that needs to be created. In this case we are creating master data; hence we need to create a Characteristics InfoObject. Select 'Char.' and click on Create. The InfoObject Catalog appears right under the InfoArea created earlier. Now, right click on the InfoObject Catalog and create an InfoObject.

In the pop-up for the InfoObject, enter the Characteristic name and description. On the right window, the properties of the Characteristic 'CUSTNO' are visible:
- In the General tab, enter the Data Type and Length.
- In the Business Explorer tab, no changes are to be made.
The first Characteristic InfoObject created can be treated as the key value of the master table to be created. Note: the 'Attribute only' field is not checked. This means that this Characteristic is a primary key field in the master data table. If the box is checked, it instead becomes an attribute of this Characteristic key field, or just another normal field in the master data table.

- In the Master data/texts tab, check "With Master data" to create a master data table, which has the naming convention /BIC/P<characteristic name>.
- Check "With Texts" to create a corresponding text table for this master data table, with two default fields, Language and Description; select 'short/medium/long texts' and, if required, 'texts language dependent'.
- Enter the InfoArea in the box below.
- Select "Char. is InfoProvider" (as our InfoObject is our target in this scenario).
Save + Activate => the P table gets created.
- In the Attribute tab, enter the fields (non-key fields) that need to be maintained within the master data table. Ex: Customer Id - primary key field; Land; City.

On entry of each field, hit [Enter] and, in the pop-up box, select the first option - "Create Attribute as Characteristic". Enter the Data Type and the Length. [Enter]. Repeat the same for the other fields too.

[Activate]. Activate all dependent InfoObjects.

Create the file interface: Source Systems -> File -> Create <New File System>, or use the existing file source system.

DataSource creation/selection: On the Navigator tab, select DataSource. Click on the '[X]' button to choose the source system.

Choose the source system: File -> <flat file source system name>. Right click on the header and create an Application Component. Enter the name and description for the Application Component, e.g. Name: APC_CUSTOMER, and click OK. The technical name of the Application Component APC_CUSTOMER now gets saved as ZAPC_CUSTOMER. Right click on the Application Component and create a DataSource.

Enter a suitable DataSource name: DS_Cust. Source system: flat file. Select the data type of the DataSource: Master Data Attributes, as we are creating a master data table. Note: select Master Data Attributes while uploading master data from the file (DS1); select Master Data Text while uploading text descriptions for the master data records (DS2).

In a DataSource, we need to mention 2 details: the name of the flat file (ASCII or CSV) to be uploaded, and the structure (layout) of the input file, mentioning the fields.

- General Info tab - provides information on the DataSource created.
- Extraction tab - provides details of the file to be extracted. In the right window, make the following changes: select the name of the file using F4 and search for the file on the user's local desktop; set Header rows to be ignored = '1' (if the file has a header row with column titles) or '0' (if the file has no header row); select the Data Format "Separated with Separator (for example, CSV)"; and enter the Data Separator as ','.
- Proposal tab - provides a quick view of the data in the flat file to be uploaded. Click on the 'Load Example' tab; the details can be viewed in the pane below, where the user can view the DataSource properties.
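Putting the Extraction tab entries together for this example (the file path is hypothetical):

  Name of the file:          C:\<path on local desktop>\customer.csv
  Header rows to be ignored: 1
  Data format:               Separated with Separator (for example, CSV)
  Data separator:            ,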

- Fields tab - provides details of all fields used to create the data structure. In the Template InfoObject field, enter the field names (characteristics/attributes) to map with the fields from the source file. Note: the fields 0LANGU and 0TXTSH are standard SAP-defined characteristic InfoObjects. [Enter] or [Save]. [Copy]: all properties of the characteristics (e.g. data types, length etc.) are copied here. [Save] [Activate].
- Preview tab - gives a preview of the flat file data loaded into the structure. Click on "Read Preview Data".

Right click on the DataSource and create an InfoPackage. InfoPackage: an InfoPackage helps to transport data from the DataSource structure into the PSA tables. (This is similar to the transfer of data from work areas to internal tables in R/3 ABAP.) Enter the InfoPackage name, IP_Cust, enter a suitable description, select the corresponding DataSource from the list below and [Save]. On the right window we have the InfoPackage properties: in the Data Selection tab, to initiate loading of data into the InfoPackage, select "Start Data Load Immediately" and click on [Start]. Then click on the [Monitor] button next to "Process Chain Mgmt", or [F6].

Click on [Back] and come back to the DataSource screen. In the next screen we can view the status of the data loading to the PSA, with the tables and the respective data in them. Click on [PSA Maintenance] or [Ctrl + F8] to maintain the PSA table with the flat file values. In the pop-up, defining the number of records per request, click OK. This leads to the PSA display screen.

Transformation: this is a rule that is defined for mapping the source fields in the DataSource to the final target (basically, the InfoProviders from where BI extracts the reports). Note: normally, the transformation can be created on the DataSource or on the target; in this case, we are creating a transformation on the DataSource. Right click on the DataSource -> Create Transformation. In the pop-up:
- For the target of the transformation: enter Object Type: InfoObject (this is our target in this scenario); enter Subtype of Object: Attributes (as we are considering master data only, not transactional data); enter the name of the InfoObject we created - here, our target is the InfoObject (master data table), which has already been created.
- For the source of the transformation: enter Object Type: DataSource (this is our source structure); enter the name of the DataSource used; enter the source system used. Click on OK.
Note: while uploading master data, select Subtype of Object: 'Attributes' (IP1 for DS1); while uploading text descriptions, select Subtype of Object: 'Texts' (IP2 for DS2).

In the next screen pop-up, we get the number of proposals (rules) generated to map source fields to target fields. Click on OK. On the next window we can see a graphical representation of the mapping of source fields to target fields, if the mapping is done correctly. Save and Activate.

Data Transfer Process (DTP): the DTP is a process used to transfer data from the PSA tables into the data targets, based on the transformation (mapping) rules. DataSource -> right click -> Create Data Transfer Process. In the pop-up:
- The DTP name is described as: <DataSrc/SourceSys -> InfoObject>
- DTP type: Standard
- Target of DTP: Object Type: InfoObject (in this case); Subtype of Object: Attribute; Name: <name of InfoObject>
- Source of DTP: Object Type: DataSource; DataSource: <name>; Source System: flat file
Click on OK. In the DTP properties window, on the Extraction tab, select Extraction Mode 'Full'. [Save] [Activate]


Update tab - no changes to be made. Execute tab - click on the [Execute] button.

Click on [Refresh] until the yellow icons turn green. In case a red icon appears, we need to track the error and rectify it. Go back. To view the contents of the target (InfoObject), there are 2 ways: 1. Go to the InfoObject CUSTNO -> Master data/texts tab -> double click on the master data table created, /BIC/P<char name> -> Table display -> Execute.

2. On the Transformation/DTP -> Attribute -> right click -> Manage -> right window, Contents tab -> [Contents] -> F8.

Text table data display is language dependent: for Lang = 'EN', the corresponding short description is displayed.

For Lang = ‘DE’, display corresponding Short Desc.

Observations: In the above master data table display, the fields Customer Name, Customer Land and Customer City have been created and loaded with data as per the file. These 3 fields were created as a result of checking "With Master data" for the Characteristic, and the contents of the master data are stored in the newly created transparent table /BIC/P<char. name>. The fields Language and Description have likewise been created and loaded with data as per the file; these 2 fields were created by default, as a result of checking "With Texts" for the Characteristic, and the contents of the text data are stored in the newly created transparent table /BIC/T<char. name>.

4. DB Connect
4.1 Introduction
The DB Connect enhancements to the database interface allow the user to transfer data straight into BI from the database tables or views of external applications. The user can use tables and views in database management systems that are supported by SAP to transfer data, and uses DataSources to make the data known to BI. The data is processed in BI in the same way as data from all other sources.

In addition to extracting data using the standard connection, the user can extract data from tables/views in other database management systems (DBMS); SAP thus provides options for extracting data from external systems. There are 2 DBMS involved: one is the BI DBMS and the other is the source DBMS.

SAP DB Connect only supports certain DBMS, and only if SAP has released a DBSL; both source and BI DBMS are supported on their respective operating system versions. The supported DBMS are:

- MaxDB [previously SAP DB]
- Informix
- Microsoft SQL Server
- Oracle
- IBM DB2/390, IBM DB2/400, IBM DB2 UDB

If these requirements are not met, DB Connect cannot be performed.

4.2 Loading Data from an SAP-Supported DBMS into BI

The steps are as follows:
1. Connect the database to the source system (direct access to the external DB).
2. Using a DataSource, make the data available to BI and transfer it via the usual data acquisition process to the respective InfoProviders defined in the BI system. For this, the structure of the table/view must be known to BI.

4.2.1 Process Description

Go to RSA1 -> Source Systems -> DB Connect -> Create

Now create the source system using: 1. Logical System Name -> MSSQL; 2. Source System Name -> MS SQL DB Connect; 3. Type & Release. Under DB Connect, we can now see the name of our source system (MS SQL DB Connect); the logical DB Connect name is MSSQL.

In DataSources we need to create an Application Component area to continue. Go to RSA1 -> DataSources -> Create Application Component. After creating an Application Component area called "ac_test_check", we now have to create a DataSource in the component area. So right click the Application Component area -> Create DataSource (as in the figure below).

The DataSource name here is "ds_ac_tech", the source system is the defined "MSSQL", and the type of DataSource that we have here is "Master Data Attributes". The screenshot below describes how to perform extraction or loading using a table/view. As the standard adapter is "Database Table" (by default), we can specify the table/view here.

Now choose the DataSource table from the DB Object Names; we have selected "EMPLOYEES" as the table/view. Or we can choose the table/view -> "REGION".

we “ACTIVATE” it once.We have 2 database fields  Region  Region ID Description Now that the Data source has to be activated before it is loaded. .

After activation, the data records (4) are displayed: Western, Eastern, Northern and Southern. Right click the DataSource DS_AC_TECH -> Create InfoPackage. We now create an InfoPackage called "IP_DS_AC_TECH", with MSSQL as the source system.

we need to create an Info Area to create an Info provider (like Info Cube) .Once done we perform a schedule on the Info package à “Start” Now.

After creating the InfoCube, we check for the data in the PSA via "Manage the PSA". This can also be done using the key combination Ctrl + Shift + F6.

The number of records displayed is 4. Using the PSA maintenance, we can view the following factors: 1. Status; 2. Data Packet; 3. Data Records; 4. REGION ID; 5. REGION Description. The table/view "CUSTOMERS" is now chosen for extraction. In the next tab we have "PROPOSAL", which describes all the database fields; here we have to specify the DataSource fields, types and lengths.

.

.

Now we create an InfoPackage -> IP_TEST_CUST. Then go to RSA1 -> InfoObjects -> InfoObject Catalog (test) -> Create InfoObject Catalog.

Now we can preview the Region ID and Region Description. We create 2 InfoObjects and pass the Region ID and Region Description to the 2 objects:

1. Region Description -> REGION2 (Region)

2. Region ID -> REG_ID (Region ids)

Now, these are the 2 InfoObjects created under the InfoObject catalog "test2": Region (REGION2) and Region ids (REG_ID).

We use the Characteristic as an InfoProvider for the master data loading: in the InfoProvider section -> Insert Characteristic as InfoProvider. Now we create a transformation using "Create Transformation" on the Region ids (Attributes).

we now perform a DTP Creation on the same Region ID (Attribute) We choose the Target system (default) as Info object  Region ID  REG_ID & the Source Type as Data source with the Source System  MSSQL .We now choose the Source System after this à MSSQL  MS SQL DB CONNECT After checking the Transformation mappings on the Region ID.

.

Once the info package has been triggered we can go to “Maintain PSA” & monitor the status of data in PSA .After this step. Further we start the scheduling of the Info package. we proceed with creating an Info package  IP_DS_TEDDY which has the source system as MSSQL.

Further, we EXECUTE the DTP, and we can monitor the transfer of data from the PSA to the InfoCube.

.

.

5. Universal Data Integration

5.1 Introduction

UD Connect (Universal Data Connect) uses Application Server J2EE connectivity to enable reporting and analysis of relational SAP and non-SAP data. To connect to data sources, UD Connect can use the JCA-compatible (J2EE Connector Architecture) BI Java Connector.

- UD Connect Source: the UD Connect sources are the instances that can be addressed as data sources using the BI JDBC Connector.
- UD Connect Source Object: UD Connect source objects are relational data store tables in the UD Connect source.
- Source Object Element: source object elements are the components of UD Connect source objects - fields in the tables.

The data can be used in two ways. Firstly, the user can extract the data, load it into BI and store it there physically. Secondly, the user can read the data directly in the source using a VirtualProvider, provided that the conditions for this are met.

5.2 Process Flow

1. Create the connection to the data source with the relational or multi-dimensional source objects (relational database management system with tables and views) on the J2EE Engine.
2. Create RFC destinations on the J2EE Engine and in BI to enable communication between the J2EE Engine and BI.
3. Model the InfoObjects required in accordance with the source object elements in BI.
4. Define a DataSource in BI.

5.3 Creating a UD Connect Source System

1. In the source system tree in the Data Warehousing Workbench, choose Create in the context menu of the UD Connect folder.
2. Select the required RFC destination for the J2EE Engine.
3. Specify a logical system name.

3. a) Define the delta process for the DataSource. Select the Proposal tab. specify whether the DataSource is initial non-cumulative and might produce duplicate data records in one request. e) Select the UD Connect source object. The user cannot . 6. enter a technical name for the DataSource. select the type of DataSource and choose Copy. medium. A connection to the UD Connect source is established. Note that source object elements can have a maximum of 90 characters. The mapping proposal is based on the similarity of the names of the source object element and DataSource field and the compatibility of the respective data types. On the next screen. Assign the non-assigned source object elements to free DataSource fields. the metadata (information about the source object and source object elements) must be create in BI in the form of a DataSource. long).5. 8. a) b) Enter descriptions for the DataSource (short. Choose Continue. Select the General tab. Select the application component where the user want to create the DataSource and choose Create DataSource. The DataSource maintenance screen appears. b) Specify whether the user want the DataSource to support direct access to data. Select the name of the connector. Specify the name of the source system if it has not already been derived from the Logical system name. Select the Extraction tab. Both upper and lower case are supported. 5. If required. a) Check the mapping and change the proposed mapping as required.4 Creating a DataSource for UD Connect To transfer data from UD Connect sources to BI. 2. c) UD Connect does not support real-time data acquisition. Choose Properties if the user want to display the general adapter properties. 5. The system displays the elements of the source object (for JDBC it is these fields) and creates a mapping proposal for the DataSource fields. Select JDBC as the connector type. 7. d) The system displays Universal Data Connect (Binary Transfer) as the adapter for the DataSource. All source objects available in the selected UD Connect source can be selected using input help. 1. 9. 4.

6. Data for this type of field is transferred in accordance with the selection criteria specified in the InfoPackage. change the data type for a field. specify whether the data to be selected is languagedependent or time-dependent.b) map elements to fields if the types are incompatible. This is important in ensuring good performance for data transfer process selections. Under Field Type. These fields are generated as a secondary index in the PSA. as required. Define the Fields tab. Entering InfoObjects here does not equate to assigning them to DataSource fields. c) d) e) f) g) h) i) If the user did not transfer the field list from a proposal. Here. the user can define the fields of the DataSource directly. change the values for the key fields of the source. When the user define the transformation. If this happens. specify InfoObjects for the fields of the DataSource. specify a conversion routine that converts data from an external format to an internal format. Change the entries if required. the system displays an error message. specify the decision-relevant DataSource fields that the user wants to be available for extraction and transferred to BI. Under Template InfoObject. Choose the selection options (such as EQ. If required. Specify whether the source provides the data in the internal or external format. Select the fields that the user wants to be able to set selection criteria for when scheduling a data request using an InfoPackage. BT) that the user wants to be available for selection in the InfoPackage. in particular with semantic grouping If required. Assignments are made in the transformation. If the user chooses an External Format. a dialog box is displayed where the user can specify whether the user want to copy changes from the proposal to the field list. ensure that the output length of the field (external length) is correct. the user can edit the fields that the user transferred to the field list of the DataSource from the Proposal tab. If required. the system proposes the InfoObjects the user entered here as InfoObjects that the user might want to assign to a field . All fields are selected by default. Choose Insert Row and enter a field name. The user can specify InfoObjects in order to define the DataSource fields. If the system detects changes between the proposal and the field list when switch from the Proposal tab to the Fields tab. a) b) Under Transfer. This allows the user to transfer the technical properties of the InfoObjects to the DataSource field. Choose Copy to Field List to select the fields that the user want to transfer to the field list for the DataSource.

7. Check, save, and activate the DataSource.
8. Select the Preview tab. If the user selects Read Preview Data, the number of data records the user specified in the field selection is displayed in a preview. This function allows the user to check whether the data formats and data are correct.

5.5.5 Using Relational UD Connect Sources (JDBC)

5.5.5.1 Aggregated Reading and Quantity Restriction

In order to keep the data mass that is generated during UD Connect access to a JDBC data source as small as possible, each select statement generated by the JDBC adapter receives a group by clause that uses all recognized characteristics, and the recognized key figures are aggregated. What is recognized as a key figure or characteristic, and which methods are used for aggregation, depends on the properties of the associated InfoObjects modeled in SAP BW for this access. The amount of extracted data is not restricted. However, to prevent exceeding the storage limitations of the J2EE server, packages with around 6,000 records are transferred to the calling ABAP module.

5.5.5.2 Use of Multiple Database Objects as UD Connect Source Object

Currently, only one database object (table, view) can be used for a UD Connect source; the JDBC scenario does not support joins. However, if multiple objects are to be used in the form of a join, a database view should be created that provides this join, and this view is to be used as the UD Connect source object. The view offers more benefits:
 The database user selected from SAP BW for access is only permitted to access these objects.
 Using the view, the user can run type conversions that cannot be made by the adapter (generation of the ABAP data types DATS, TIMS, etc.).
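The following plain-JDBC sketch illustrates both points above: a database view that packages the join, and the kind of aggregating GROUP BY statement the adapter generates against it. All driver, table, column, and connection details are illustrative assumptions rather than values from this document; in a real UD Connect scenario the BI JDBC Connector issues such statements itself.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class UdConnectStyleRead {
        public static void main(String[] args) throws Exception {
            // Illustrative connection; in UD Connect the BI JDBC Connector
            // manages connections via the driver deployed on the J2EE server.
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:BWSRC", "bwuser", "secret");
            Statement stmt = con.createStatement();

            // A view that provides the join, since the JDBC scenario itself
            // does not support joins across several source objects.
            stmt.execute("CREATE VIEW V_SALES AS "
                    + "SELECT o.COST_CENTER, i.AMOUNT "
                    + "FROM ORDERS o JOIN ORDER_ITEMS i ON o.ORDER_ID = i.ORDER_ID");

            // The kind of statement the JDBC adapter generates: GROUP BY on all
            // recognized characteristics, aggregation of the recognized key figures.
            ResultSet rs = stmt.executeQuery(
                    "SELECT COST_CENTER, SUM(AMOUNT) AS AMOUNT "
                    + "FROM V_SALES GROUP BY COST_CENTER");
            while (rs.next()) {
                System.out.println(rs.getString(1) + " " + rs.getBigDecimal(2));
            }
            con.close();
        }
    }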

5.6 BI JDBC Connector

Sun's JDBC (Java Database Connectivity) is the standard Java API for Relational Database Management Systems (RDBMS). The BI JDBC Connector allows the user to connect applications built with the BI Java SDK to over 170 JDBC drivers, supporting data sources such as Teradata, Oracle, Microsoft SQL Server, Microsoft Access, DB2, Microsoft Excel, and text files such as CSV. The user can also use the BI JDBC Connector to make these data sources available in SAP BI systems using UD Connect, and the user can also create systems in the portal that are based on this connector. This connector is fully compliant with the J2EE Connector Architecture (JCA). The connector adds the following functionality to existing JDBC drivers:
• Standardized connection management that is integrated into user management in the portal
• A standardized metadata service, provided by the implementation of JMI capabilities based on CWM
• A query model independent of the SQL dialect in the underlying data source
The BI JDBC Connector implements the BI Java SDK's IBIRelational interface.

5.6.1 Deploy the user data source's JDBC driver to the server

The user does this in SAP NetWeaver Application Server's Visual Administrator:
1. Start the Visual Administrator.
2. On the Cluster tab, select Server x → Services → JDBC Connector.
3. In the right frame, select the Drivers node on the Runtime tab.
4. From the icon bar, choose Create New Driver or Data Source.
5. In the DB Driver field in the Add Driver dialog box, enter a name for the user JDBC driver.
6. Navigate to the user JDBC driver's JAR file and select it.
7. To select additional JAR files, select No when prompted, and select Yes when finished.

5.6.2 Configuring BI Java Connector

To prepare a data source for use with the BI Java SDK or with UD Connect, the user first needs to configure the properties in the BI Java Connector used to connect to the data source. In the service Connector Container, configure a reference to the JDBC driver of the user data source. This can be done by performing the following steps:
1. Select the BI JDBC Connector in the Connectors tree.
2. Choose the Resource Adapter tab.
3. In the Loader Reference box, choose Add to add a reference to the user JDBC driver.
4. Enter library:<jdbc driver name> and choose OK. The <jdbc driver name> is the name the user entered for the driver when it was loaded (see Prerequisites in BI JDBC Connector).
5. Save the settings.

5.6.2.1 Testing the Connections

After the user has configured the BI Java Connector, the user can perform a rough installation check by displaying the page for the connector in the user server. Perform the tests for the connector by visiting the URLs for the connector pages (the list of URLs is not reproduced in this copy).

5.6.2.2 JNDI Names

When creating applications with the BI Java SDK, refer to a connector by its JNDI name: the BI JDBC Connector has the JNDI name SDK_JDBC.

5.6.2.3 Cloning the Connections

The user can clone an existing connection by using the Clone button in the toolbar.
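As a minimal sketch of how the JNDI name from 5.6.2.2 is used: an application running inside the J2EE server can look the connector up through the standard JNDI API. The JCA ConnectionFactory cast and the connection handling below are assumptions about the deployment; the BI Java SDK classes that actually consume the connection are not shown.

    import javax.naming.InitialContext;
    import javax.resource.cci.Connection;
    import javax.resource.cci.ConnectionFactory;

    public class ConnectorLookup {
        public static void main(String[] args) throws Exception {
            // Inside the J2EE server, the deployed connector is registered
            // in JNDI under its connector name.
            InitialContext ctx = new InitialContext();

            // SDK_JDBC is the JNDI name of the BI JDBC Connector; treating
            // the bound object as a JCA ConnectionFactory is an assumption.
            ConnectionFactory factory = (ConnectionFactory) ctx.lookup("SDK_JDBC");

            // Connections to the configured data source are obtained from the
            // factory; the BI Java SDK then works with them through its
            // IBIRelational interface.
            Connection con = factory.getConnection();
            con.close();
        }
    }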

5.6.3 Connector Properties

Refer to the table below for the required and optional properties to configure for the user connector: BI JDBC Connector Properties. (The properties table is not reproduced in this copy.)

5.7 BI XMLA Connector

Microsoft's XMLA (XML for Analysis) facilitates Web services-based, platform-independent access to OLAP providers. The BI XMLA Connector enables the exchange of analytical data between a client application and a data provider working over the Web. The XMLA Connector sends commands to an XMLA-compliant OLAP data source in order to retrieve the schema rowsets and obtain a result set, using a SOAP-based XML communication API. The BI XMLA Connector allows the user to connect applications built with the BI Java SDK to data sources such as Microsoft Analysis Services, Hyperion, MicroStrategy, MIS, and BW 3.x. This connector is fully compliant with the J2EE Connector Architecture (JCA). The BI XMLA Connector implements the BI Java SDK's IBIOlap interface. The user can also use the BI XMLA Connector to make these data sources available in SAP BI systems via UD Connect, or the user can create systems in the portal based on this connector.
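For illustration, the sketch below posts a hand-built XMLA Discover request over HTTP, the kind of SOAP call the connector performs internally. The endpoint URL is a placeholder, and the envelope follows the public XML for Analysis specification rather than an API confirmed by this document.

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class XmlaDiscover {
        public static void main(String[] args) throws Exception {
            // Placeholder endpoint of an XMLA-compliant OLAP provider.
            URL url = new URL("http://olap-host/xmla/msxisapi.dll");

            // A Discover request asking the provider for its data sources.
            String envelope =
                "<SOAP-ENV:Envelope xmlns:SOAP-ENV=\"http://schemas.xmlsoap.org/soap/envelope/\">"
              + "<SOAP-ENV:Body>"
              + "<Discover xmlns=\"urn:schemas-microsoft-com:xml-analysis\">"
              + "<RequestType>DISCOVER_DATASOURCES</RequestType>"
              + "<Restrictions/><Properties/>"
              + "</Discover>"
              + "</SOAP-ENV:Body></SOAP-ENV:Envelope>";

            HttpURLConnection con = (HttpURLConnection) url.openConnection();
            con.setRequestMethod("POST");
            con.setDoOutput(true);
            con.setRequestProperty("Content-Type", "text/xml");
            con.setRequestProperty("SOAPAction",
                    "\"urn:schemas-microsoft-com:xml-analysis:Discover\"");
            try (OutputStream out = con.getOutputStream()) {
                out.write(envelope.getBytes("UTF-8"));
            }
            // The provider answers with a schema rowset describing its data sources.
            System.out.println("HTTP status: " + con.getResponseCode());
        }
    }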

5.7.1 Using InfoObjects with UD Connect

When modeling InfoObjects in BI, note that the InfoObjects have to correspond to the source object elements with regard to the type description and length description. For more information, see the documentation on data type compatibility. The following restrictions apply when using InfoObjects:
• Alpha conversion is not supported
• The use of conversion routines is not supported
• Upper and lower case must be enabled
These InfoObject settings are checked when the objects are generated.

5.7.2 Using SAP Namespace for Generated Objects

The program-technical objects that are generated during generation of a DataSource for UD Connect can be created in transportable or local format. Transportable means that the generated objects can be transferred to another SAP BW system using the correction and transport system. The transportability of an object depends on, among other things, the namespace in which it is created. The delivery status allows for the generation of transportable objects in the SAP namespace; all new objects are created as transportable. Because these objects are generated in the SAP namespace, the user should be aware of the following dependencies:

 System changeability: These objects can only be generated in systems whose system changeability permits this. In general, these are development systems, because productive systems block system changeability for security reasons. If a classic SAP system landscape of this type exists, then the objects are created in the development system and assigned to package RSSDK_EXT. This package is especially designated for these objects. The objects are also added to a transport request that the user creates or that already exists. After the transport request is finished, it is used to transfer the infrastructure into the productive environment.

 Key: Because the generated objects are ABAP development objects, the user must be authorized as a developer. A developer key must be procured and entered; the key has to be procured and entered exactly once per user and system. Also, because the generated objects were created in the SAP namespace, an object key is required. Like the developer key, this is customer-specific and can also be procured online; generation requires the customer-specific installation number. The object key is to be entered exactly once per object and system. Afterwards, the object is released for further changes as well, so further efforts are not required if there are repeated changes to the field list or similar. The system administrator knows this procedure and should be included in the procurement.

If this appears to be too laborious (see the dependencies listed above), there is also the option of switching to generation of local objects. To do this, the user runs the RSSDK_LOCALIZE_OBJECTS report in the ABAP Editor (transaction SE38). The system then switches to local generation: the objects generated afterward are not transportable. The status of already generated objects does not change. If the report is executed again, the generation is changed back to transportable.

6. XML Integration

6.1 Introduction

SAP's Business Connector (SAP BC) is a business integration tool that allows SAP customers to communicate with other SAP customers or SAP marketplaces. This middleware component uses the Internet as communication platform and XML/HTML as data format, thus seamlessly integrating different IT architectures with R/3. The Business Connector allows integration with R/3 via open and non-proprietary technology. SAP BC integrates the RFC server and client and provides an XML layer over R/3 functionality. It comes with an XML automation that converts SAP's RFC format into XML and supports both synchronous and asynchronous RFC protocols, so that no SAP R/3 automation is required at the receiving end. SAP BC has built-in integration support for SAP's specifications of the IDoc-XML and RFC-XML standards; hence, whenever dealing with messages conforming to these standards, it supports the integration. Through its integration with XML, it enables the exchange of structured business documents over the Internet by providing a common standard through which different applications and IT systems can communicate and exchange business data.

6.2 Benefits of XML Integration
 End-to-End Web Business Processes
 Open Business Document Exchange over the Internet
 XML Solutions for SAP services

6.2.1 End-to-End Web Business Processes

The Internet bridges the gap between different businesses, systems, and users, and facilitates them to do business via the web. SAP BC makes this communication easier by its XML conversions.

SAP BC provides openness and flexibility to comply with emerging business semantics that continuously keep on changing. It supports all major existing interfaces that are provided by SAP with the help of the XML-based Interface Repository (IFR) and empowers SAP customers to benefit from SAP functionality over the Internet. XML messages are translated into a corresponding SAP internal call whenever required and converted back into XML format when received from the SAP system, enhancing the existing programming model for distributed applications formed by ALE, IDoc, and BAPI. This IFR gives the option of downloading XML schemas for operational use and provides a uniform XML interface representation despite different implementation technologies such as RFC, IDoc, and BAPI.

6.2.2 Open Business Document Exchange over the Internet

SAP Business Connector uses hypertext transfer protocol (HTTP) to exchange XML-based documents over the Internet. XML acts as a uniform standard for exchanging business data. This makes the process of data transfer a lot easier by providing a common structure for all business documents. SAP BC extends business scenarios across firewalls, enabling a secure flow of business documents without requiring changes to established security infrastructures. It also ensures secure exchange of documents with the help of its Secure Socket Layer (SSL) technology and allows implementation of maps, branches, and loops without any coding by using the developer tool.

6.2.3 XML Solutions for SAP services

SAP BC makes all mySAP.com solutions accessible via XML-based business documents. A customer using SAP can use a range of add-on products and services with their existing applications. Non-SAP applications can now be used by SAP customers owing to XML compliance, and these third-party systems using the XML standard data format are increasing rapidly. For example, using a company's services in the cyber marketspace allows customer data to be received and directly stored on the vendor's system, as both use data formatted in XML.

6.3 Business Integration with XML

Business processes are increasingly characterized by structures operating between companies. Companies no longer act in isolation: instead, they are integrated into geographically distributed production networks, pursuing the production process in cooperation with other firms. This means that exchanging data quickly and securely between applications and systems is an increasingly important requirement. With XML, simple and complex structures can be presented at any data level and for any category of data, through which heterogeneous applications can communicate with one another over uniform interfaces and in a language which everyone involved can understand.

The ability to integrate and analyze business data across applications, components, and heterogeneous systems extends the traditional business environment and provides users with a complete view of the business. XML integration is also easier than using proprietary communication formats like SAP's BAPIs and RFCs. XML formatting and message handling with the help of the SAP Business Connector allow customers to use an industry-wide accepted format for data exchange between the SAP system and partner systems, including historically grown proprietary systems. This has found widespread customer acceptance, as it reduces the integration and maintenance cost of interface integration during the implementation of non-SAP systems. Furthermore, customers can now exchange data in the XML standard by using the Internet infrastructure through a much more user-friendly Web browser. An XML environment of data exchange over the Internet via security protocols such as HTTP, HTTPS, and FTP fully supports collaborative business scenarios, increasingly common in an integrated world.

6.3.1 Incorporating XML Standards

Various XML standards are supported by SAP. Data is presented according to an SAP specification either in IDoc-XML or BAPI-XML, the SAP XML extensions for IDocs and BAPIs. SAP uses pre-prepared representations of the SAP interfaces to XML Commerce Business Language (xCBL) messages to facilitate communication with the MarketSet marketplaces. Messages are stored in a generic envelope. This envelope contains metadata that controls, among other things, the routing of the messages. SAP supports two different standards for these envelopes: Microsoft BizTalk and a format similar to the Simple Object Access Protocol (SOAP). Packaging SAP BAPIs in a standard envelope offers several advantages, including direct processing of messages by external applications and a uniform system of error handling.
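To make the envelope idea concrete, the following sketch prints a schematic BizTalk/SOAP-style wrapper around a BAPI message. The element names and routing fields are purely illustrative assumptions; the actual layouts are defined by the BizTalk and SOAP specifications mentioned above.

    public class EnvelopeSketch {
        public static void main(String[] args) {
            // Header metadata controls, among other things, message routing;
            // the business payload (here a BAPI call) travels in the body.
            String message =
                "<Envelope>\n"
              + "  <Header>\n"
              + "    <From>CUSTOMER_A</From>\n"
              + "    <To>VENDOR_B</To>\n"
              + "    <MessageType>BAPI</MessageType>\n"
              + "  </Header>\n"
              + "  <Body>\n"
              + "    <BAPI_SALESORDER_CREATEFROMDAT2>\n"
              + "      <!-- BAPI parameters as XML elements -->\n"
              + "    </BAPI_SALESORDER_CREATEFROMDAT2>\n"
              + "  </Body>\n"
              + "</Envelope>\n";
            System.out.println(message);
        }
    }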

6.3.2 SAP's Internet Business Framework

SAP's Internet Business Framework (IBF) attempts to address business collaboration issues by enabling integration with Internet technologies at the user, component, and business-process levels:
 User integration is achieved by providing a single point of Web-based access to the workplace (that is, local and company-wide systems) and the marketplace (that is, systems across a number of companies).
 Component integration consists of the integration of Internet technologies at the front-end and application-server levels with the aid of HTML, HTTP, and XML messages.
 Business process integration across company boundaries is accomplished through the exchange of transactions between companies based on open Internet standards.
6.3.3 SAP applications with XML

SAP BC provides an add-on XML layer over R/3 functionality to ensure compatibility of non-SAP applications with R/3-internal data structures and protocols. The SAP Business Connector can help achieve seamless B2B integration between businesses through the integration framework. For example, there can be real-time integration between supplier inventories and an enterprise's SAP system, or between multi-vendor product, price, and availability information and a customer's purchasing application. The SAP-proprietary RFC format is converted to XML (or HTML) so that no SAP software is needed on the other end of the communication line, and developing applications does not require SAP R/3 knowledge. SAP BC has built-in support for SAP's specifications of IDoc-XML and RFC-XML. Whenever the user deals with messages conforming to these standards, integration is supported out of the box. For cases where other XML formats are required, the user can create maps with a graphical tool or insert the user's own mapping logic. XML-based communication over the Internet is achieved through SAP's Business Connector, as illustrated in Figure 1.

Figure 1

6.3.4 Factors leading to emergence of XML-enabled SAP solutions

6.3.4.1 Changing Business Standards and their adoption

Mapping between SAP business documents and XML-based business documents can be easily done with the SAP Business Connector. The flexible architecture of the SAP Business Connector makes it easy to add specific schemas and interfaces. To comply with business document standards, the Business Connector provides automated generation of server- and client-side code, thus solving the interconnectivity problem by enabling uniform transactions among customers. A business model based on these standards allows companies to move towards becoming make-to-order businesses, replete with all the marketing, sales, distribution, manufacturing, and other logistics-driven operational cost savings.
6.3.4.2 Internet Security Standards

The effectiveness of e-commerce is premised on a secure exchange of information. The SAP Business Connector provides the security essential to an online business transaction: business partners can be authenticated, and business documents can be securely exchanged. The SAP Business Connector supports the well-established standard encryption technology Secure Socket Layer (SSL) for secure document exchange, and digital signatures ensure the authentication of crucial data. SAP communication is achieved through IDocs, BAPIs, and RFCs. These documents must be made Internet-compliant; this conversion is achieved using the Business Connector, which converts SAP documents into XML so that they can be exchanged using Internet protocols.

6.4 Web-based business solutions
The SAP Business Connector allows businesses to effectively use intra- and inter-enterprise information. For example, companies can use the SAP Business Connector to retrieve catalog information from a supplier's Web site and to integrate the information with internal processes, in real time.
6.4.1 Components of Business Connector

The Business Connector consists of a Server and an Integrator. The Integrator can be used to add any additional functionality to the Server. The SAP Business Connector, in association with a third-party XML-enabled software product, can only be used for collaborative business-to-business scenarios such as transmitting catalog information, managing purchase orders and availability-to-promise checks, acknowledging purchase orders, and handling invoices.

The requirements for XML interface certification for SAP's complementary software include:
 Use of HTTP and HTTPS protocols with the SAP Business Connector
 Customization for a specific Business Application
 Sending and receiving the communication objects (i.e., IDocs, BAPIs, or RFCs)

6.5 How to Customize Business Connector (BC)
1. Start the Business Connector by calling <BC_dir>/bin/server.bat. If the user wants debug output, enter server.bat -debug <debuglevel> -log <filename>. The debug level cannot be greater than 10.
2. Open the Business Connector administration screen in a web browser window. Enter the Username and corresponding password.
6.5.1 Add New Users to BC

3. If the user wants to transmit data from an SAP system to the BC, the same user must exist in both systems. To create users in the BC, click Security > Users and Groups > Add and Remove Users.
4. Enter the desired SAP user, assign the corresponding password, and click Create Users. This creates a user. Mark the just-created user in the Groups box section and make sure that 'Select Group' = "Administrators". Now add the user to the Administrators group by clicking the add icon below the right selection box. Click Save Changes to save the settings.
6.5.2 Add SAP Systems

5. All the proposed SAP system(s) should be added within the Business Connector. To achieve this, click Adapters > SAP, which opens a new window. In the new window click SAP > SAP Servers > Add SAP Server.
6. Enter the necessary information for the SAP server (System, Login Defaults, Server Logon, and Load Balancing) and choose Save (as illustrated in screen 1).

6.5.3 Add Router Tables

7. All incoming calls to the Business Connector are scrutinized, and routing information is extracted about the 'Sender', 'Receiver', and 'MsgType'. Using the routing rules, the Business Connector finds the recipient and the format that should be used to send the call to this particular recipient. Clicking Adapters > Routing opens a new window. Enter information like 'Sender', 'Receiver', and 'MsgType'.
8. With "Add Rule" the rule is created; other details like "Transport" and "Transport Parameters" must be provided. After entering all the details, click Save and enable the rule by clicking No under the "Enabled?" column.

6.5.4 Access functionality in the Business Connector

The user cannot use the SAP Business Connector in a web application (e-Commerce, Procurement, etc.), but can use it to facilitate a business-to-business transaction in an EDI-like manner. For example, the user can send an XML document to a vendor, and the vendor sends an XML packet back.

9. Post a document containing the XML format of the IDoc or BAPI/RFC call to the Business Connector service. For example, a document can be posted to the /sap/InboundIdoc service of the Business Connector, as sketched below.
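The statements referred to in step 9 are not reproduced in this copy of the document. As a rough sketch, an external Java application could post an IDoc-XML document to the service over HTTP as shown below; host, port, credentials, and the IDoc skeleton are placeholder assumptions rather than values from the source.

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Base64;

    public class PostToBusinessConnector {
        public static void main(String[] args) throws Exception {
            // Placeholder host and port of the Business Connector server.
            URL url = new URL("http://bc-host:5555/sap/InboundIdoc");

            // Minimal IDoc-XML skeleton; a real document carries the complete
            // control record (EDI_DC40) and data records of the IDoc type.
            String idocXml =
                "<ORDERS05><IDOC BEGIN=\"1\">"
              + "<EDI_DC40 SEGMENT=\"1\"><MESTYP>ORDERS</MESTYP></EDI_DC40>"
              + "</IDOC></ORDERS05>";

            HttpURLConnection con = (HttpURLConnection) url.openConnection();
            con.setRequestMethod("POST");
            con.setDoOutput(true);
            con.setRequestProperty("Content-Type", "text/xml");
            // The BC user created in section 6.5.1, sent as HTTP basic authentication.
            String auth = Base64.getEncoder()
                    .encodeToString("bcuser:secret".getBytes("UTF-8"));
            con.setRequestProperty("Authorization", "Basic " + auth);
            try (OutputStream out = con.getOutputStream()) {
                out.write(idocXml.getBytes("UTF-8"));
            }
            System.out.println("HTTP status: " + con.getResponseCode());
        }
    }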

7. Data Mart Interface

7.1 Introduction

The data mart interface makes it possible to update data from one InfoProvider to another:
 Data exchange between multiple BI systems: the data-delivering system is referred to as the source BI, the data-receiving system as the target BI. The individual BI systems arranged in this way are called data marts.
 Data exchange between BI systems and other SAP systems.

Data marts can be used in different ways:
 They save a subset of the data of a Data Warehouse in another database, possibly even in a different location.
 They are stored as intentionally redundant segments of the (logical, global) overall system (Data Warehouse).
 They are smaller units of a Data Warehouse.

A BI system defines itself as the source system for another BI system by:
 Providing metadata
 Providing transaction data and master data

An export DataSource is needed to transfer data from a source BI into a target BI. The InfoProviders of the source BI are used as the sources of data here. Export DataSources for InfoCubes and DataStore objects contain all the characteristics and key figures of the InfoProvider. Export DataSources for master data contain the metadata for all attributes, texts, and hierarchies for an InfoObject.

7.2 Special Features

 Changes to the metadata of the source system can only be added to the export DataSources by regenerating the export DataSources.
 The user can only generate an export DataSource from an InfoCube if:
   - The InfoCube is activated
   - The name of the InfoCube is at least one character shorter than the maximum length of a name, since the DataSource name is made up of the InfoCube name and a prefix.
 The Delete function is not supported at this time.

7.3 Data Mart Interface in the Myself System

The data mart interface in the Myself system is used to connect the BW system to itself, since the ODS data can be used directly as a DataSource in the same system. This means the user can update data from data targets into other data targets within the system. The user can import InfoCube data by InfoSource into BW again and, for example, fill another InfoCube. The user can update data directly from ODS objects into other data targets, using transfer rules and update rules. For example, data about orders and deliveries are kept in two separate ODS objects; updating ODS data into another ODS object makes it possible to track the status of orders, that is, to see which orders are open, delivered in part, etc. In order to trace the status in reporting, the user can carry out a data clean-up, update the data to another ODS object, and merge the objects together.

7.4 Data Mart Interface between Several Systems

Using the data mart interface, the user can group together data from one or more source systems in a BI system, or continue to work with it in several BI systems. Data marts are found both in the maintenance and in the definition, the same as in the SAP source system. The data mart interface can be used between two BI systems or between another SAP system and a BI system.

7.4.1 Architectures

The functions can produce different architectures in a Data Warehouse landscape. Such architectures serve, among others, the following purposes:
 To enable large amounts of data to be processed
 To increase the respective speeds of data transfer, logging, and analysis
 To make it possible to have fewer complexities when constructing and implementing a Data Warehouse
 To achieve improved concision and maintenance of the individual Data Warehouses
 To enable data separation relating to the task area, on the one hand, and to be able to perform analyses of the state of the entire dataset on the other
 To construct hub-and-spoke scenarios in which a BI system stands in the middle and the data from distributed systems runs together and is standardized

Two architectures can be distinguished:
 Replication Architecture
 Aggregation Architecture

7.4.1.1 Replicating Architecture

If the user selects this architecture, the data for a BI server is available as source data and can be updated in further target BI systems.

7.4.1.2 Aggregating Architecture

With aggregating architecture, data is grouped together from two or more BI servers and is then available for further processing.

7.4.2 Process Flow

To define an existing BI system as a source system, the user has to upload the metadata from the source BI into the target BI. To access the functions of the data mart, we use the data mart interface.

7.4.2.1 In the Source BI

Before a BI system can request data from another BI system, it must have information about the structure of the data to be requested. To do this, the user generates an export DataSource for the respective InfoProvider in the source BI. This export DataSource includes an extraction structure, which contains all the characteristics and key figures of the InfoProvider.

7.4.2.2 In the Target BI

 If required, define the source system. This is only necessary if the source system has not yet been created in the target BI.
 Create the InfoProvider into which the data is to be loaded.
 Replicate the metadata of the user export DataSource from the source BI into the target BI. Using the source system tree, the user can replicate all the metadata of a source system, or only replicate the metadata of the user DataSource.
 Activate the DataSource.
 Using the context menu for the user DataSource, create a data transfer process with the InfoProvider into which the data is to be loaded as the target. A default transformation is created at the same time. The user source BI system is specified as the source by default.
 Create an InfoPackage at the user DataSource using the context menu.
 The complete data flow is displayed in the InfoProvider tree under the user InfoProvider.

Schedule the InfoPackage and the data transfer process. We recommend that the user always use process chains for loading processes.

7.4.3 Generating Export DataSources for InfoProviders

The export DataSource is needed to transfer data from a source BI system into a target BI system. The user can use the selected InfoProvider as a DataSource for another BI system.

1. In the Data Warehousing Workbench in the BI source system, choose Modeling and select the InfoProvider tree.
2. Generate the export DataSource using the context menu of the user InfoProvider. To do this, choose Additional Functions → Generate Export DataSource.

The technical name of the export DataSource is made up of the number 8 together with the name of the InfoProvider. Example: technical name of an InfoCube: COPA; technical name of the export DataSource: 8COPA.

7.4.4 Generating Master Data Export DataSources

The export DataSource is needed to transfer data from a source BI system into a target BI system. The user can generate an export DataSource for master data.

1. Select the InfoObject tree in the Data Warehousing Workbench in the source BI system.
2. Generate the export DataSource from the context menu of the user InfoObject. To do this, choose Additional Functions → Generate Export DataSource.

By doing this, all the metadata is created that is required to extract master data (all attributes, texts, and hierarchies) from an InfoObject, and thus for individual InfoObjects. The technical name of the export DataSource is 8******M for attributes (M stands for master data), 8******T for texts, and 8******H for hierarchies. (The asterisks (*) stand for the source InfoObject.) When the user creates an InfoObject or a master data InfoSource in the source BI system, the user must therefore make sure that the length of the technical name of each object is no longer than 28 characters.

7.4.5 Transactional Data Transfer Using the Data Mart Interface

The data transfer is the same as the data transfer in an SAP system; it corresponds to the data transfer for an R/3 system. The system reads the data from the fact tables of the delivering BI system, taking into account the specified dimension-specific selections.

7.4.5.1 Delta Process

Using the data mart interface, the user can transfer data by full upload as well as by delta requests. A distinction is made between InfoCubes and DataStore objects:
 The InfoCube that is used as an export DataSource is first initialized, meaning that the current status is transferred into the target BI system. When the next upload is performed, only those requests are transferred that have come in since initialization. Only those requests are transferred that have been rolled up successfully in the aggregates. If no aggregates are used, only those requests are transferred that are set to Qualitative OK in the InfoCube administration.
 For DataStore objects, the requests in the change log of the DataStore object are used as the basis for determining the delta. Only the change log requests that have arisen from reactivating the DataStore object data are transferred.

7.4.5.2 Restriction

It is possible to make only one selection for each target system for the delta. Different target systems can also be filled in this way. Example: the user first makes a selection using cost center 1 and loads deltas for this selection. Later on, the user also decides to load a delta for cost center 2 in parallel to the cost center 1 delta. The delta can then only be requested in full for both cost centers, meaning that it is impossible to execute deltas separately for the different selections.

7.4.6 Transferring Texts and Hierarchies for the Data Mart Interface

The transfer of master data is scheduled in the Scheduler. If the user wants to load texts or hierarchies using the Scheduler, the user must first create them in the source system. When the user loads hierarchies, the available hierarchies are reached from the Hierarchy Selection tab page by using the pushbutton Available Hierarchies in OLTP. Select the hierarchy and schedule the loading.

8. Virtual InfoCubes

8.1 Introduction

Virtual InfoCubes only store data virtually, not physically, and are made available to Reporting as InfoProviders. A virtual remote cube can be used to act like an InfoCube and feel like an InfoCube, but it is not an InfoCube: it has a structure similar to an InfoCube, with dimensions, characteristics, and key figures. The actual generation of data for this cube depends on the user's implementation. Upon viewing the data in this virtual provider, a remote function call is made directly to R/3 to get the values on the fly. The user can also choose to get values from a function module of the user's choosing, so the implementation details are really up to the user.

8.2 Creating a Virtual InfoCube

In BI7, right-click on any InfoArea and choose Create Virtual Provider. The user then chooses Virtual Provider with 'direct access'. There is an option to use the old R/3 InfoSource: the user can create an InfoSource with transfer rules and such, and when the user creates the virtual InfoProvider, it will use the structure in the InfoSource automatically, with the flow of data automatically linking R/3, the transfer rules, and the virtual InfoProvider. Note that no update rules exist with this kind of setup. If the user uses an InfoSource, go to the InfoSource tab and choose the InfoSource.

With the BI7 setup, however, the user has a more flexible approach, in that a transformation can be created to configure the 'transfer logic' of the virtual InfoProvider, along with a start routine, end routine, or any other transformation technique available with a transformation rule. If the user is using the BI7 setup, the user needs to create a sort of 'pseudo' DTP, which doesn't actually do anything, meaning the user does not 'execute' it to start a data transfer. After all is done, choose the DTP related to the transformation and save it. For direct access, right-click on the virtual InfoProvider and choose 'Activate Direct Access'.

8.3 Different Types

A distinction is made between the following types:

1. SAP RemoteCube
2. RemoteCube
3. Virtual InfoCube with Services

8.3.1 SAP RemoteCube

An SAP RemoteCube is a RemoteCube that allows the definition of queries with direct access to transaction data in other SAP systems.

Use SAP RemoteCubes if:
 The user needs very up-to-date data from an SAP source system
 The user only accesses a small amount of data from time to time
 Only a few users execute queries simultaneously on the database

Do not use SAP RemoteCubes if:
 The user requests a large amount of data in the first query navigation step, and no appropriate aggregates are available in the source system
 A lot of users execute queries simultaneously
 The user frequently accesses the same data

8.3.1.1 Creating an SAP RemoteCube

The SAP RemoteCube can be used in reporting in cases where it does not differ from other InfoProviders.

1. Choose the InfoSource tree or InfoProvider tree in the Administrator Workbench under Modeling.
2. In the context menu, choose Create InfoCube.
3. Select SAP RemoteCube as InfoCube type and enter the user InfoSource.
4. The user can define whether a unique source system is assigned to the InfoCube using an indicator. Otherwise the user must select the source system in the relevant query. In this case the characteristic (0LOGSYS) is added to the InfoCube definition.
5. In the next screen the user can check the defined SAP RemoteCube and adjust it if necessary before activation.
6. The user can then assign source systems to the SAP RemoteCube. To do this, choose Assign Source System from the context menu in the user SAP RemoteCube. In the dialog box that follows, choose one or several source systems and select Save Assignments. The source system assignments are local to the system and are not transported.

8.3.1.2 Structure

SAP RemoteCubes are defined based on an InfoSource with flexible updating. They copy the characteristics and key figures of the InfoSource. Master data and hierarchies are not read directly in the source system; they are already replicated in BW when the user executes a query. The transaction data is called during execution of a query in the source system. During this process, the selections are provided to the InfoObjects if the transformation is only a simple mapping of the InfoObject. If the user has specified a constant in the transfer rules, the data is transferred only if this constant can be fulfilled. With more complex transformations such as routines or formulas, the selections cannot be transferred. It then takes longer to read the data in the source system, because the amount of data is not limited. To prevent this, the user can create an inversion routine for every transfer routine. Inversion is not possible with formulas, which is why SAP recommends that the user use formulas instead of routines.

8.3.1.3 Integration

To be assigned to an SAP RemoteCube, a source system must meet the following requirements:
 BW Service API functions (contained in the SAP R/3 plug-in) are installed.
 The release status of the source system is at least 4.0B.
 In BW, a source system ID has been created for the source system.
 DataSources from the source system that are released for direct access are assigned to the InfoSource of the SAP RemoteCube, and there are active transfer rules for these combinations.

8.3.2 RemoteCube

A RemoteCube is an InfoCube whose transaction data is not managed in the Business Information Warehouse but externally. Only the structure of the RemoteCube is defined in BW; the data is read for reporting using a BAPI from another system. Using a RemoteCube, the user can carry out reporting using data in external systems without having to physically store transaction data in BW. The user can, for example, include an external system from market data providers using a RemoteCube. By doing this, the user can reduce the administrative work on the BW side and also save memory space.

8.3.2.1 Structure

When reporting using a RemoteCube, instead of using a BasicCube filled with data, the Data Manager in BW calls the RemoteCube BAPI and transfers the parameters:
 Selection
 Characteristics
 Key figures

As a result, the external system transfers the requested data to the OLAP processor.

8.3.2.2 Integration

To report using a RemoteCube, the user has to carry out the following steps:
1. In BW, create a source system for the external system that the user wants to use.
2. Define the required InfoObjects.
3. Load the master data:
    Create a master data InfoSource for each characteristic
    Load texts and attributes
4. Define the RemoteCube.
5. Define the queries based on the RemoteCube.

8.3.3 Virtual InfoCubes with Services

A virtual InfoCube with services is an InfoCube that does not physically store its own data in BW. The data source is a user-defined function module, and the data can be local or remote. The user uses a virtual InfoCube with services if the user wants to display data from non-BW data sources in BW without having to copy the data set into the BW structures. The user can also use the user's own calculations to change the data before it is passed to the OLAP processor. This function is used primarily in the SAP Strategic Enterprise Management (SEM) application. In comparison to the RemoteCube, the virtual InfoCube with services is more generic: it offers more flexibility, but also requires more implementation effort.

8.3.3.1 Structure

When the user creates an InfoCube, the user can specify the type. If the user chooses Virtual InfoCube with Services as the type, an extra Detail pushbutton appears on the interface. This pushbutton opens an additional dialog box, in which the user defines the services.

1. Enter the name of the function module that the user wants to use as the data source for the virtual InfoCube. There are different default variants for the interface of this function module. Depending on these properties, the data manager provides services to convert the parameters and data. One method for determining the correct variant, together with the description of the interfaces, is given at the end of this documentation.
2. The user has a number of options for defining the properties of the data source more precisely. The next step is to select options for converting/simplifying the selection conditions; the user does this by selecting the Convert Restrictions options. These conversions only change the transfer table in the user-defined function module; the result of the query is not changed, because the restrictions that are not processed by the function module are checked later in the OLAP processor. Options:
 No restrictions: If this option is selected, restrictions are not passed to the InfoCube.

 Only global restrictions: If this option is selected, only global restrictions (FEMS = 0) are passed to the function module. Other restrictions (FEMS > 0) that are created, for example, by setting restrictions on columns in queries, are deleted.
 Simplify selections: Currently this option is not yet implemented.
 Expand hierarchy restrictions: If this option is selected, restrictions on hierarchy nodes are converted into the corresponding restrictions on the characteristic value.
 With navigation attributes: If this option is selected, navigation attributes and restrictions applied to navigation attributes are passed to the function module. In this case, the user needs to have selected the characteristics that correspond to these attributes. If this option is not selected, the navigation attributes are read in the data manager once the user-defined function module has been executed. Restrictions applied to the navigation attributes are not passed to the function module in this case.
 Internal format (key figures): In SAP systems, a separate format is often used to display currency key figures. The value in this internal format is different from the correct value in that the decimal places are shifted. The user uses the currency tables to determine the correct value for this internal representation. If this option is selected, the OLAP processor incorporates this conversion during the calculation.
 SID support: If the data source of the function module can process SIDs, the user should select this option. If this is not possible, the characteristic values are read from the data source and the data manager determines the SIDs dynamically. In this case, SID support must be switched off and the hierarchy restrictions must be expanded.
 Pack RFC: This option packs the parameter tables in BAPI format before the function module is called and unpacks the data table that is returned by the function module after the call is performed. Since this option is only useful in conjunction with a remote function call, the user should select it wherever a remote function call is used.

8.3.3.2 Dependencies

If the user uses a remote function call, the user has to define a logical system that is used to determine the target system for the remote function call.

8.3.3.2.1 Description of the interfaces for user-defined function modules

Variant 1: (interface definition not reproduced in this copy)

Variant 2: (interface definition not reproduced in this copy)

8.3.3.2.2 Additional parameters for variant 2 for transferring hierarchy restrictions

With hierarchy restrictions, an entry for the field 'COMPOP' = 'HI' (for hierarchy) is created at the appropriate place in table I_T_RANGE (for FEMS 0) or I_TX_RANGETAB (for FEMS > 0), and the 'LOW' field contains a number that can be used to read the corresponding hierarchy restriction from table I_TSX_HIER, using field 'POSIT'. Table I_TSX_HIER has the following type: (type definition not reproduced in this copy)

Variant 3: SAP advises against using this interface. The interface is intended for internal use only, and only half of it is given here. Note that SAP may change the structures used in the interface.

8.3.3.3 Method for determining the correct variant for the interface

The following list describes the procedure for determining the correct interface for the user-defined function module. Go through the list from top to bottom; the first appropriate case is the variant that the user should use:
 If Pack RFC is activated: Variant 1
 If SID Support is deactivated: Variant 2
