Posted by madhu on Thursday, June 2, 2011
1. How to use a Virtual Key Figure / Characteristic?
Ans: A "virtual" characteristic or key figure gets its value assigned at query runtime and must not be loaded with data into the data target, so no change to the existing update rules is needed. The implementation can be divided into the following steps:
1. Create the InfoObject (key figure / characteristic) and attach it to the InfoProvider.
2. Implement the BADI RSR_OLAP_BADI (set a filter on the InfoProvider while defining the BADI implementation).
3. Add the InfoObject to the query.

2. Query Performance Tips:
Ans:
i. Don't show too much data in the initial view of the report output.
ii. Limit the level of hierarchies in the initial view.
iii. Always use mandatory variables.
iv. Utilize filters based on InfoProviders.
v. Suppress result rows if not needed.
vi. Eliminate or reduce "not" logic in the query selection.

3. DataStore Objects:
Ans: Standard DSO – a maximum of 16 key fields can be created.
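The runtime logic for a virtual characteristic from step 2 above typically goes into the COMPUTE method of the BAdI interface IF_EX_RSR_OLAP_BADI. A minimal sketch, assuming the field names (ZREGION, ZDOCID) and the lookup table ZDOC_REGION are hypothetical placeholders, not from the original post:

```abap
METHOD if_ex_rsr_olap_badi~compute.
  " Called once per data record at query runtime. C_S_DATA contains
  " the fields registered in the DEFINE method of the same BAdI.
  FIELD-SYMBOLS: <l_region> TYPE any,
                 <l_docid>  TYPE any.

  ASSIGN COMPONENT 'ZREGION' OF STRUCTURE c_s_data TO <l_region>.
  ASSIGN COMPONENT 'ZDOCID'  OF STRUCTURE c_s_data TO <l_docid>.

  IF <l_region> IS ASSIGNED AND <l_docid> IS ASSIGNED.
    " Derive the virtual characteristic from another field of the
    " record (hypothetical lookup - replace with the real derivation).
    SELECT SINGLE region FROM zdoc_region INTO <l_region>
           WHERE docid = <l_docid>.
  ENDIF.
ENDMETHOD.
```

Because COMPUTE runs for every record of the query result set, keep the derivation cheap (buffer lookups in a static internal table rather than a SELECT per record in production code).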
4. Types of DataStore Objects:
Standard DSO, Write-Optimized DSO and DSO for Direct Update. The DataStore object for direct update differs from the standard DataStore object in how the data is processed: in a standard DataStore object, data is stored in different versions (active, delta, modified), whereas a DataStore object for direct update contains data in a single version. Data is therefore stored in precisely the same form in which it was written by the application. In the BI system, you can use a DataStore object for direct update as a data target for an analysis process (APD).

Type | Structure | Data Supply | SID Generation | Example
Standard DSO | Three tables: activation queue, table of active data, change log | From data transfer process | Yes | Standard operational scenarios
Write-Optimized DSO | Table of active data only | From data transfer process | No | Exclusive saving of new, unique data records, e.g. the posting process for documents in retail; in many cases used as the EDW layer for saving data
DSO for Direct Update | Table of active data only | From APIs (e.g. an analysis process) | No | APD

5. Different Types of InfoCubes:
- Standard InfoCube (with physical data store).
- VirtualProvider (without its own physical data store in the BI system):
  - Based on a data transfer process and a DataSource with 3.x InfoSource: allows the definition of queries with direct access to transaction data in other SAP source systems.
  - Based on a BAPI: the data is not processed in the BI system but is read from an external system for reporting using a BAPI.
  - Based on a function module: a user-defined function module is used as the data source.

6. Line Item Dimension:
Ans: If the dimension table size (number of rows) exceeds 20% of the fact table size, the dimension should be flagged as a line item dimension. The system then does not create a dimension table; instead, the SID table of the characteristic takes the role of the dimension table. Removing the dimension table has the following advantages:
- When loading transaction data, no IDs are generated for the entries in the dimension table. This number range operation can compromise performance precisely in the case where a degenerated dimension is involved.
- The SQL-based queries are simpler, and in many cases the database optimizer can choose better execution plans.
Nevertheless, it also has a disadvantage: a dimension marked as a line item cannot subsequently include additional characteristics. This is only possible with normal dimensions.
Scenario: 0IS_DOCID (Document Identification) InfoObject used as a line item dimension.

7. High Cardinality:
Ans: This means that the dimension is to have a large number of instances (that is, a high cardinality). A general rule is that a dimension has a high cardinality when the number of dimension entries is at least 20% of the fact table entries. This information is used to carry out optimizations on a physical level depending on the database platform; different index types are used than is normally the case. If you are unsure, do not select a dimension as having high cardinality.

8. Real-Time Cube:
Ans: Real-time-enabled InfoCubes can be distinguished from standard InfoCubes by their ability to support parallel write accesses, whereas standard InfoCubes are technically optimized for read accesses. Real-time InfoCubes are used when creating planning data; the data is written to the InfoCube by several users at the same time. Standard InfoCubes are not suitable for this and should be used if you only need read access (such as for reading reference data). For real-time InfoCubes, a reduced read performance is compensated for by the option to write in parallel (transactionally) and an improved write performance.
Real-time-enabled InfoCubes can be filled with data using two different methods: using the BW-BPS transactions for creating planning data, and using BW staging. You can switch the real-time InfoCube between these two methods: from the context menu for your real-time InfoCube in the InfoProvider tree, choose Switch Real-Time InfoCube. "Real-Time InfoCube Can Be Planned, Data Loading Not Allowed" is selected by default, and the cube is filled using BW-BPS functions. If you change this setting to "Real-Time InfoCube Can Be Loaded with Data, Planning Not Allowed", you can then fill the cube using BW staging.

9. Remodeling:
Ans: Remodeling is a new feature available as of NW04s BI 7.0 which enables you to change the structure of an InfoCube that is already loaded, without disturbing its data. Using remodeling, a characteristic can be deleted, or added/replaced with a constant value, the value of another InfoObject (in the same dimension), the value of an attribute of another InfoObject (in the same dimension), or a value derived using a customer exit. Similarly, a key figure can be deleted, or added and populated using a constant value or a customer exit. This feature does not yet support remodeling of DSOs and InfoObjects.
For the customer exit option, the code is written in SE24 by creating a new class. The class should implement the interface IF_RSCNV_EXIT, and the code is written in the method IF_RSCNV_EXIT~EXIT.
Note the following before you start the remodeling process:
- Back up the existing data.
- During remodeling, the InfoCube is locked for any changes or data loads, so make sure you stall all data loads for this InfoCube until the process finishes.
- If you are adding or replacing a key figure, compress the cube first to avoid inconsistencies, unless all the records in the InfoCube are unique.
Note the following after you finish the remodeling process and start daily loads and querying this InfoCube:
- All objects dependent on the InfoCube, such as transformations and MultiProviders, will have to be re-activated.
- Adjust queries based on this InfoCube to accommodate the changes made.
- If a new field was added using remodeling, don't forget to map it in the transformation rules for future data loads.
- If aggregates exist, they need to be reconstructed.

10. Difference between With Export and Without Export migration of a 3.x DataSource:
Ans:
- With Export (recommended): allows you to revert back to the 3.x DataSource, transfer rules, etc.
- Without Export: does not allow you to ever revert back to the 3.x DataSource.

11. Customer Exit for Query Variables:
Ans: The customer exit for variables is called a maximum of three times; the steps are distinguished by I_STEP.
- I_STEP = 1: before the processing of the variable pop-up; called for every variable of processing type "customer exit". You can use this step to fill your variable with default values.
- I_STEP = 2: after the processing of the variable pop-up; called only for those variables that are not marked "ready for input" and are set to "mandatory variable entry".
- I_STEP = 3: after all variable processing; called only once, not per variable. Here you can validate the user entries. Note that you cannot overwrite the user input values into a variable with this customer exit; you can only derive values for other variables or validate the user entries.

12. Difference between Calculated Key Figure and Formula:
Ans: The replacement of formula variables with the processing type Replacement Path acts differently in calculated key figures and formulas:
- If you use a formula variable with "Replacement from the Value of an Attribute" in a calculated key figure, the system automatically adds the drilldown according to the reference characteristic for the attribute. The system then evaluates the variable for each characteristic value of the reference characteristic. Afterwards, the calculated key figure is calculated and, subsequently, all of the other operations are executed, meaning all additional aggregations, calculated key figures, and formulas. The system only calculates the operators assembled in the calculated key figure itself before the aggregation using the reference characteristic.
- If you use a formula variable with "Replacement from the Value of an Attribute" in a formula element, the variable is only calculated if the reference characteristic is uniquely specified in the respective row, column, or filter.

13. Constant Selection:
Ans: In the Query Designer, you use selections (e.g. a characteristic restriction in a restricted key figure) to determine the data you want to display at report runtime. You can alter the selections at runtime using navigation and filters, which allows you to further restrict the selections. The Constant Selection function allows you to mark a selection in the Query Designer as constant. This means that navigation and filtering have no effect on that selection at runtime.

14. How to create a Generic DataSource using a Function Module:
Ans: A structure is created first for the extract structure, which will contain all the DataSource fields. Then a function module is created by copying the template FM RSAX_BIW_GET_DATA_SIMPLE, and the code is modified as per the requirement. Type-Pools: SBIWA, SRSC. For delta functionality, if the base tables (from which data will be fetched) contain separate date and time fields, include a dummy timestamp field in the extract structure created for the FM and use this field in the code (by splitting the timestamp into date and time).

15. In which scenario have you used a Generic DataSource?
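The three I_STEP calls described above are usually coded in the function exit EXIT_SAPLRRS0_001 (include ZXRSRU01, enhancement RSR00001). A hedged sketch — the variable name ZCUMONTH and the default-value logic are hypothetical examples, not from the original post:

```abap
" Include ZXRSRU01 of function exit EXIT_SAPLRRS0_001 (enhancement RSR00001)
DATA: l_s_range TYPE rsr_s_rangesid.

CASE i_vnam.
  WHEN 'ZCUMONTH'.                     " hypothetical customer-exit variable
    IF i_step = 1.                     " before the variable pop-up: fill default
      l_s_range-low  = sy-datum(6).    " current month YYYYMM
      l_s_range-sign = 'I'.
      l_s_range-opt  = 'EQ'.
      APPEND l_s_range TO e_t_range.
    ENDIF.
  WHEN OTHERS.
    IF i_step = 3.
      " After all variables are processed (called once, not per variable):
      " validate the combined user entries here; you can raise an error
      " message to force the selection screen to reappear, but you cannot
      " overwrite the user's input values.
    ENDIF.
ENDCASE.
```

The I_T_VAR_RANGE import parameter carries the values of the other variables, which is what makes I_STEP = 2 and I_STEP = 3 useful for derivations and cross-variable validation.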
Ans: We had a requirement to send Contract (Sales Order) data from BI to MERICS (an external system). The selection criteria to extract the data were:
- Billing Plan Date (FPLT-AFDAT) < current month date
- Billing Status (FPLT-FKSAF) = Not yet processed
- Contract Type (VBAK-AUART) = Fixed Price (ZSCC)
- Item Category (VBAP-PSTYV) = ZSV2
We could have used DataSource 2LIS_11_VAITM and enhanced it with the FPLT fields. But the problem was that whenever there is a status change in a billing plan, that change is not captured by 2LIS_11_VAITM. Therefore we created a generic DataSource using a function module.

16. Generic DataSource safety interval – lower limit and upper limit? What are the delta-specific fields? When to choose 'New status for changed records' and when 'Additive Delta'?
Ans:
Safety Interval Upper Limit: The upper limit for the safety interval contains the difference between the current highest value at the time of the delta (or delta init) extraction and the data that has actually been read. If this value is initial, the current time stamp is always used as the upper limit of the extraction: the lower limit of the next extraction is then seamlessly joined to the upper limit of the last extraction, and records that are created during the extraction cannot be extracted.
For example, a time stamp is used to determine the delta: if you start the extraction at 12:00:00 with no safety interval and the extract runs for 15 minutes, the delta pointer will read 12:15:00, and the subsequent delta will read records created/changed on or after 12:15:00 — all records created or changed during the extraction are skipped. So if your extractor takes half an hour to run, your safety upper limit should ideally be half an hour or more; this way, records created during the extraction are not missed. Estimate the extraction time for your DataSource and set the safety upper limit accordingly so that no records are skipped.
With a safety interval, records can be transferred twice; for the reasons above, this is unavoidable. If the extracted data is master data, this is not a problem: the system only transfers after-images that overwrite the status in BW, so a record can be extracted into BW twice without any issues. But with an additive delta you need to be careful not to double your records — either extract during periods of very low activity or use smaller safety limits to make sure data does not get duplicated.
Safety Interval Lower Limit: This field contains the value subtracted from the highest value of the previous delta extraction to determine the lowest value of the time stamp for the next delta extraction; its value is the same as the previous upper limit minus a safety margin. This safety margin needs to be big enough to contain all values in the extraction which already had a time stamp when the last extraction was carried out but which were not read.
Guidelines:
- If the delta field is a time stamp, use an upper limit of 1800 seconds (30 minutes) and leave the lower limit blank. This loads the delta in BW as of 30 minutes old.
- If the delta field is a date (record create date or change date), use an upper limit of 1 day and leave the lower limit blank. This loads the delta in BW as of yesterday.
- If the delta field is a numeric pointer, i.e. a generated record number like in the GLPCA table, use the lower limit (a count of 10-100) and leave the upper limit blank; the lower limit backs up the starting sequence number. If a value of 10 is used, the last 10 records are loaded again. This may result in some records being processed more than once; therefore, be sure such a DataSource is only feeding an ODS object.
Delta-Specific Fields:
- Time Stamp – a DEC15 field which always contains the time stamp of the last change to a record in the local time format.
- Calendar Day – a DATS8 field which always contains the day of the last change.
- Numeric Pointer – a field containing a number that increases with each new record, e.g. a generated record number.
New status for changed records: each record to be loaded delivers the new status for the key figures and characteristics. DataSources with this delta type can write to ODS objects or master data tables.
Additive Delta: the key figures for extracted data are added up in BW. DataSources with this delta type can supply data to ODS objects and InfoCubes.

17. How to fill up the set-up tables, and the related transaction?
Ans: Set-up tables for the logistics DataSources are filled using the application-specific statistical set-up transactions (OLI*BW, reachable via SBIW); transaction LBWG deletes the set-up tables per application.

18. Different Types of DTP:
- Standard DTP – used to update data from the PSA to data targets (InfoCube, DSO, etc.).
- Error DTP – used to update error records from the error stack to the corresponding data targets.
- DTP for Direct Access – the only available option for VirtualProviders.

19. Maximum characteristics and key figures allowed in a dimension / cube?
- Max. characteristics in a dimension – 248
- Max. key figures in a cube – 233

20. How to create an optimized InfoCube?
- Define lots of small dimensions rather than a few large dimensions.
- The size of the dimension tables should account for less than 10% of the fact table.
- If the size of a dimension table amounts to more than 10% of the fact table, mark the dimension as a line item dimension.

21. Difference between DSO and InfoCube:
DSO:
- Use: consolidation of data in the data warehouse layer; loading delta records that can subsequently be updated to InfoCubes or master data tables; operative analysis (when used in the operational data store).
- Type of data: non-volatile (in the data warehouse layer) or volatile (in the operational data store); transactional, document-type data (line items).
- Type of data update: overwrite (in rare cases: addition only).
- Data structure: flat, relational database tables with semantic key fields.
- Type of data analysis: reporting at a high level of granularity; flat reporting with a low level of granularity; the number of query records should be strictly limited by the choice of key fields; drill-through to document level; individual document display possible using the report-report interface.
InfoCube:
- Use: aggregation and performance optimization for multidimensional reporting; analytical and strategic data analysis.
- Type of data: non-volatile; aggregated data, totals.
- Type of data update: addition only.
- Data structure: enhanced star schema (fact table and dimension tables).
- Type of data analysis: multidimensional data analysis (OLAP); use of InfoCube aggregates.

22. Give an example where a DSO is used for addition, not overwrite.

23. Difference between 3.x and 7.0 — the new features / major differences include:
1. SAP BW is now formally known as BI (part of NetWeaver 2004s). It implements Enterprise Data Warehousing (EDW).
2. The Data Warehousing Workbench replaces the Administrator Workbench (you can still revert to the old transactions too).
3. Functional enhancements have been made for the DataStore object: a new type of DataStore object and enhanced settings for performance optimization.
4. There are functional changes to the Persistent Staging Area (PSA); load through PSA has become a must, and it looks like we do not have the option to bypass the PSA. Yes, you can't skip this.
5. The transformation replaces the transfer and update rules. Transformations are in and routines are passe!
6. BI supports real-time data acquisition.
7. Monitoring has been improved with a new portal-based cockpit, which means you would need an EP person in your project for implementing the portal! :) Not like 3.5.
8. Search functionality has improved: you can search any object.
9. In InfoSets you can now include InfoCubes as well.
10. The Remodeling transaction helps you add new key figures and characteristics and handles historical data as well without much hassle. This is only for InfoCubes. Remodeling of InfoProviders supports you in Information Lifecycle Management.
11. The BI accelerator (for now only for InfoCubes) helps in reducing query run time by almost a factor of 10. The BI accelerator is a separate box and would cost more; vendors for these would be HP or IBM.
12. New authorization objects have been added.
13. The DataSource: there is a new object concept for the DataSource. From BI, remote activation of DataSources is possible in SAP source systems.
14. Options for direct access to data have been enhanced.
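The function-module DataSource described in questions 14 and 15 follows the cursor-based pattern of the template RSAX_BIW_GET_DATA_SIMPLE. A condensed, hedged sketch of the FM body — the extract structure and the WHERE clause are illustrative placeholders, not the actual MERICS logic:

```abap
" Skeleton of a generic extractor copied from RSAX_BIW_GET_DATA_SIMPLE.
" S_S_IF stores the interface data (selections, package size) between calls.
STATICS: s_s_if    TYPE srsc_s_if_simple,
         s_counter TYPE sy-tabix,
         s_cursor  TYPE cursor.

IF i_initflag = sbiwa_c_flag_on.
  " First (init) call: check the DataSource name and save the
  " selection criteria (i_t_select) and requested fields (i_t_fields).
  s_s_if-maxsize = i_maxsize.
ELSE.
  IF s_counter = 0.
    " First data call: open the database cursor (simplified selection).
    OPEN CURSOR WITH HOLD s_cursor FOR
      SELECT fplnr fpltr afdat fksaf FROM fplt
        WHERE afdat < sy-datum.
  ENDIF.
  " Fetch one data package per call into the extract structure table.
  FETCH NEXT CURSOR s_cursor
        APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
        PACKAGE SIZE s_s_if-maxsize.
  IF sy-subrc <> 0.
    CLOSE CURSOR s_cursor.
    RAISE no_more_data.    " signals the service API that extraction is done
  ENDIF.
  s_counter = s_counter + 1.
ENDIF.
```

The service API keeps calling the FM until NO_MORE_DATA is raised, which is why the cursor and counters must be STATICS rather than local variables.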
New features in BI 7 compared to earlier versions (summary):
a) Renamed ODS as DataStore.
b) Inclusion of the write-optimized DataStore, which does not have any change log and whose requests do not need activation.
c) Unification of transfer and update rules into one level of transformation; the Data Transfer Process (DTP) replaces the transfer and update rules. In the transformation we can now write a "Start Routine, Expert Routine and End Routine".
d) New data flow capabilities such as the Data Transfer Process (DTP) and Real-time Data Acquisition (RDA); enhanced and graphical transformation capabilities, such as drag-and-relate options.
e) Push of XML data into the BI system (into the PSA) without Service API or delta queue.
f) Introduction of the BI accelerator, which significantly improves performance.
g) Load through PSA has become mandatory, and there is no IDoc transfer method in BI 7.0.
h) User management (includes a new concept for analysis authorizations) for more flexible BI end-user authorizations.

24. What is Extended Star Schema?

25. What is Compression, RollUp, Attribute Change Run?
Ans:
RollUp: You can automatically roll up and transfer into the aggregates those requests in the InfoCube with "green traffic light" status. The process terminates if no active, initially filled aggregates exist in the system.
Compression: When you compress the cube, the system compresses the InfoCube content by deleting the request IDs, with saved data quality, which improves performance and saves disk space. If aggregates exist, only requests that have already been rolled up are compressed; that is, when we roll up data load requests, we roll them up into all the aggregates of the InfoCube and then carry on with the compression of the cube. If no aggregates exist, the system compresses all requests that have yet to be compressed. For performance and disk space reasons, it is recommended to roll up a request as soon as possible and then compress the InfoCube; so first do the aggregate rollup, then compression. The "COMPRESS AFTER ROLLUP" option ensures that all the data is rolled up into aggregates before doing the compression. Compression with Zero Elimination means that data rows with all key figures = 0 are deleted.
Attribute Change Run: The attribute change run activates changed master data and hierarchies and adjusts the affected aggregates accordingly.

26. What is Change Run? How to resolve when an attribute change run fails because of a locking problem?

27. What errors have you faced during transport of objects?

28. What steps need to be followed when a process in a process chain fails and we need to make it green to proceed further?
1. Right-click the failed process and go to 'Display Messages'. From the Chain tab, get the VARIANT and INSTANCE values. In some cases INSTANCE is not available; in that case we take the Job Count number.
2. Go to table RSPCPROCESSLOG. Give the VARIANT and INSTANCE and get LOGID, TYPE, VARIANT, INSTANCE, BATCHDATE and BATCHTIME.
3. Execute program RSPC_PROCESS_FINISH (SE38) by providing LOGID, CHAIN, TYPE, VARIANT, INSTANCE, BATCHDATE, BATCHTIME and STATE = 'G'.

29. What is a Rule Group in Transformation?
Ans: A rule group is a group of transformation rules; a transformation can contain multiple rule groups. Rule groups allow you to combine various rules, meaning that for one characteristic you can create different rules for different key figures.
Key points about rule groups:
- A default rule group, called the Standard Group, is created for every transformation. This group contains all the default rules and one transformation rule for each key field of the target.
- The standard rule group cannot be deleted; only additionally created groups can be deleted.
Example: In the source system, Actual and Plan Amount are represented as separate fields. In the business warehouse, a single key figure Amount represents both, differentiated by the characteristic Version (Version = 010 represents Actual Amount and Version = 020 represents Plan Amount in this example).
Records in the source system:
CompanyCode | Account | FiscalYear/Period | ActualAmount | PlanAmount
1000 | 5010180001 | 01/2008 | 100 | 400
1000 | 5010180001 | 02/2008 | 200 | 450
1000 | 5010180001 | 03/2008 | 300 | 500
Records in the business warehouse:
CompanyCode | Account | FiscalYear/Period | Version | Amount
1000 | 5010180001 | 01/2008 | 010 | 100
1000 | 5010180001 | 02/2008 | 010 | 200
1000 | 5010180001 | 03/2008 | 010 | 300
1000 | 5010180001 | 01/2008 | 020 | 400
1000 | 5010180001 | 02/2008 | 020 | 450
1000 | 5010180001 | 03/2008 | 020 | 500
To achieve this, in the standard rule group (target) we set the characteristic Version to the constant value 010 and use direct assignment from Actual Amount to Amount in the target field. Another rule group (new rule group) is created where we set Version to the constant value 020 and use direct assignment from Plan Amount to Amount in the target field.

30. Processing Type Replacement Path for Variables, with Examples:
Ans: The processing type Replacement Path can be used with characteristic value variables, text variables and formula variables. You use the Replacement Path to specify the value that automatically replaces the variable when you execute the query or Web application.
- Text and formula variables with the processing type Replacement Path are replaced by a corresponding characteristic value. Note that formula variables have to contain numbers in their names so that the formula variable represents a value after replacement.
- Characteristic value variables with the processing type Replacement Path are replaced by the results of a query.

31. Why can we not use a DSO to load inventory data?
Ans: DataStore objects cannot admit any stock (non-cumulative) key figures (see Notes 752492 and 782314) and, among other things, they do not have a validity table. Therefore, they cannot calculate stocks in terms of BW technology, and ODS objects cannot be used as non-cumulative InfoCubes.
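Step 3 of the process-chain repair above can be wrapped in a small one-off SE38 report. A hedged sketch — the parameter list shown is the commonly used subset of RSPC_PROCESS_FINISH; verify the actual signature in SE37 before use:

```abap
" Hypothetical repair report: sets a failed process-chain step to green.
" The input values come from RSPCPROCESSLOG as described in steps 1-2.
REPORT zfix_chain_step.

PARAMETERS: p_logid TYPE rspc_logid,
            p_chain TYPE rspc_chain,
            p_type  TYPE rspc_type,
            p_var   TYPE rspc_variant,
            p_inst  TYPE rspc_instance.

START-OF-SELECTION.
  CALL FUNCTION 'RSPC_PROCESS_FINISH'
    EXPORTING
      i_logid    = p_logid
      i_chain    = p_chain
      i_type     = p_type
      i_variant  = p_var
      i_instance = p_inst
      i_state    = 'G'.    " 'G' = green, lets the chain continue
```

Running this with the values read from RSPCPROCESSLOG marks the step successful so the remaining chain processes can be triggered.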
This can be achieved multiple ways like logic in infopackage routine. you can influence the aggregation behavior of calculated key figures and obtain improved performance during calculation. Select the characteristic Product and from the context menu. 32. overwriting will happen thus not affecting the data integrity). But as a matter of fact. are replaced by the results of a query. o Characteristic value variables with the processing type Replacement Path. selections identifying only the changed records and so on. By choosing this attribute. · Attribute Value The variable value is replaced with the value of an attribute.o Text and formula variables with the processing type Replacement Path are replaced by a corresponding characteristic value. You reach the Save Variable dialog step 7. 2. if you look at the data load it would say FULL LOAD instead of DELTA. Choose Next. · Name (Text) The variable value is replaced with the name of the characteristic. we had a code in the infopackage that looks at when the previous request was loaded. 1. The Variable Wizard appears. you can create a reference to the characteristic for which the variable is defined. · External Characteristic Value Key The variable value is replaced with an external value of the characteristic (external/internal conversion). In my past experience. Choose the attribute Reference to Characteristic (Constant 1). choose New Variable. Enter the query Top 5 Products. Hierarchy Attribute The variable value is replaced with a value of a hierarchy attribute. You reach the Replacement Path dialog step. 4. using that date calculates the month and loads data for which CALMONTH is between the previously loaded date and today's date (since the data target is an ODS. 5. This allows you to determine how the sales for these five. Enter a variable name and a description. Flat Aggregates If you have less than 16 characters in an aggregate ( including the time . 
An additional field appears for entering the hierarchy attribute. You need this setting for sign reversal with hierarchy nodes o Example: Replacement with Query You want to insert the result for the query Top 5 products as a variable in the query Sales – Calendar year / month. When replacing the variable with an attribute value. data and package dimensions) then the characteristic SIDs are stored as Line items meaning the E Fact table of the . Choose the processing type Replacement Path. 3. Choose Exit You are now able to insert the variable into the query Sales – Calendar year / month. Pseudo Delta This is different from the normal delta in a way that.Note that formula variables have to contain numbers in their names so that the formula variable represents a value after replacement. Choose Next. 6. 33. its only pulling the records that are changed or created after the previous load. Replacement with a characteristic value : Replace Variable with · Key The variable value is replaced with the characteristic key. An additional field appears for entering the attribute. top-selling products has developed month for month. even if there is a duplicate selection.
01.meaning that the aggregate more or less is like a standard table with the necessary SIDs and nothing else. Flat aggregates can be rolled up on DB Server (without loading data into Application Server) 34. executed LUWs in the TRFC outbound of the source system. In SAP BW. Alternatively. SQL traces. After the report is executed. into which you copy the source code of the correction instructions. b) Workload Analysis: You use transaction code ST03 . Performance Metric measures. Execute the report. there is no way of reading the data from a buffer and transferring this to the master data tables. you can import the attached correction instructions into your system and create an executable program in the customer namespace for this using transaction SE38.aggregate ( Assuming that your aggregate is compressed ) will have 16 columns and these will have the SIDs only. terminated request.(e. It should be larger than or the same as the number of records for the last. It is a general Basis tool but can be leveraged for BW. Master Data Load failure recovery Steps : Issue : A delta update for a master data DataSource is aborted. Solution : Import the next PI or CRM patch into your source system and execute the RSA1BDCP report. 19:30:00). 20010131193000 for 31. as yet.g. a dialog box appears with the number of records that should have the 'unread' status. After you execute the report.2001.Therefore. which are transferred into BW during the next upload. there are no. The report contains 3 parameters: 1. should be displayed as YYYYMMDDHHMMSS (for example. What is KPI ? (1) Predefined calculations that render summarized and/or aggregated information. KPIs are put in place and visible to an organization to indicate the level of progress and status of change efforts in an organization. Performance Monitoring and Analysis tools in BW: a) System Trace: Transaction ST01 lets you do various levels of system trace such as authorization checks. 
and industry experts to ensure that they reflect best practices.hence the name FLAT . P_OS (DataSource): Name of the DataSource 2.For this time stamp select the time stamp of the last successful delta request of this DataSource in the corresponding BW system. KPIs are industry-recognized measurements on which to base critical business decisions. which is useful in making strategic decisions.. table/buffer trace etc. 35. Business Content KPIs have been developed based upon input from customers. partners. 36. 20010131193000 for January 31. 19:30:00).. 2001. P_TIME (generation time stamp):The generation date and time of the first change pointer.In addition. Check the plausibility of this number of records. (2) Also known as Performance Measure.The data was sent to the BW in this case but it was not posted in the PSA. You do not have any further tables like dimension tables etc for an aggregate in this case . change the status of the last (terminated) request in BW to 'green' and request the data in 'delta' mode. P_RS (BIW system): logical name of the BW system 3.
In table RSSDLINIT check for the record with the problematic datasource. An important feature of this tool is the ability to retrieve important IDoc information. That's it. If you do not have a existing full infopackage. The error occurs in the FUNCTION-POOL FORM RSM1_CHECK_FOR_DELTAUPD. KPIs for FI datasource : Accounts Payable : o DataSource 0FI_AP_4 (Vendors: Line Items with Delta Extrcation): DataSource Field BI Info Object DMSOL 0DEBIT_LC (Debit Amount in local currency) DMHAB 0CREDIT_LC (Credit Amount in local currency) . These steps have always worked for me. g) ABAP Runtime Analysis Tool: Use transaction SE30 to do a runtime analysis of a transaction. Solution : Try to open a existing full infopackage if you have it. it throws the same error. 4. Follow the steps in the note 852443 . After you open the infopacke remove the delta initialization from the infopackage as shown below. e) BW Technical Content Analysis: SAP Standard Business Content 0BWTCT that needs to be activated. Enqueue trace. It is a very helpful tool if you know the program or routine that you suspect is causing a performance bottleneck. 38. you will be able to open existing full infopackage because it is not going to check delta consistency. 3. This error typically occurs when delta is not in sync between source system and BW system. d) Performance Analysis: Transaction ST05 enables you to do performance traces in different are as namely SQL trace. Go to RSRQ transaction and enter the RNR number and say execute. follow the steps in the OSS note if this doesn't work for you. Of course you need to do your delta init again as we made last delta init red. You can re initialize the delta and start using the infopackage. It might happen when you copy new environments or when you refresh you QA or DEV boxes from production. Now change the status of the request to red. 37. 2. It contains several InfoCubes. 1. 
38. Runtime Error MESSAGE_TYPE_X when opening an InfoPackage in BW :
Ans : You sometimes run into the error message 'Runtime error MESSAGE_TYPE_X' when you try to open an existing delta InfoPackage, and it won't even let you create a new InfoPackage. The error occurs in the function pool, in FORM RSM1_CHECK_FOR_DELTAUPD. It typically occurs when the delta is not in sync between the source system and the BW system, for example when you copy new environments or when you refresh your QA or DEV boxes from production.
Solution : Follow the steps in OSS note 852443. There are many troubleshooting steps in that note; you can go through all of them, or follow the steps below, which have always worked for me :
1. In table RSSDLINIT, check for the record with the problematic DataSource and get the request number (RNR) from the record.
2. Go to transaction RSRQ, enter the RNR and execute. It will show the monitor screen of the actual delta-init request.
3. Change the status of that request to red.
4. Try to open an existing full InfoPackage if you have one; you will be able to open it, because a full InfoPackage does not check delta consistency. After you open the InfoPackage, remove the delta initialization: menu Scheduler -> Initialization options for source system -> select the entry -> click the delete button.
After that you will be able to open the existing delta InfoPackage and run it. Of course, you need to do your delta init again, since we set the last delta init to red; re-initialize the delta and start using the InfoPackage. If you do not have an existing full InfoPackage, follow the steps in the OSS note instead.
If the last delta request terminated, the data was sent to BW but was not posted in the PSA. Check the plausibility of the number of records; in addition, change the status of the last (terminated) request in BW to 'green' and request the data in 'delta' mode. The following parameters are involved :
1. P_OS (DataSource) : name of the DataSource
2. P_RS (BIW system) : logical name of the BW system
3. P_TIME (generation time stamp) : the generation date and time of the first change pointer. For this time stamp, select the time stamp of the last successful delta request of this DataSource in the corresponding BW system (e.g. 20010131193000 for January 31, 2001, 19:30:00).
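The generation time stamp format used above is a plain 14-digit string, YYYYMMDDHHMMSS (e.g. 20010131193000 for January 31, 2001, 19:30:00). A small sketch of parsing and producing it, not SAP code:

```python
from datetime import datetime

def parse_bw_timestamp(ts: str) -> datetime:
    """Parse a 14-digit BW-style time stamp (YYYYMMDDHHMMSS)."""
    return datetime.strptime(ts, "%Y%m%d%H%M%S")

def format_bw_timestamp(dt: datetime) -> str:
    """Render a datetime in the 14-digit time stamp format."""
    return dt.strftime("%Y%m%d%H%M%S")

stamp = parse_bw_timestamp("20010131193000")
print(stamp)                        # 2001-01-31 19:30:00
print(format_bw_timestamp(stamp))   # 20010131193000
```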
39. KPIs for FI DataSources :
Accounts Payable :
o DataSource 0FI_AP_4 (Vendors: Line Items with Delta Extraction) :
  DataSource Field -> BI InfoObject
  DMSOL -> 0DEBIT_LC (Debit amount in local currency)
  DMHAB -> 0CREDIT_LC (Credit amount in local currency)
  DMSHB -> 0DEB_CRE_LC (Amount in local currency with +/- signs)
  WRSOL -> 0DEBIT_DC (Debit amount in foreign currency)
  WRHAB -> 0CREDIT_DC (Credit amount in foreign currency)
  WRSHB -> 0DEB_CRE_DC (Foreign currency amount with +/- signs)
o DataSource 0FI_AP_6 (Vendor Sales Figures via Delta Extraction) :
  UM01S -> 0DEBIT (Total debit postings)
  UM01H -> 0CREDIT (Total credit postings)
  UM01K -> 0BALANCE (Cumulative balance)
  UM01U -> 0SALES (Sales for the period)
Accounts Receivable :
o DataSource 0FI_AR_4 (Customers: Line Items with Delta Extraction) :
  ZBD1T -> 0DSCT_DAYS1 (Days for cash discount 1)
  ZBD2T -> 0DSCT_DAYS2 (Days for second cash discount)
  ZBD3T -> 0NETTERMS (Deadline for net conditions)
  DMSOL -> 0DEBIT_LC (Debit amount in local currency)
  DMHAB -> 0CREDIT_LC (Credit amount in local currency)
  DMSHB -> 0DEB_CRE_LC (Amount in local currency with +/- signs)
  WRSOL -> 0DEBIT_DC (Debit amount in foreign currency)
  WRHAB -> 0CREDIT_DC (Credit amount in foreign currency)
General Ledger :
o DataSource 0FI_GL_4 :
  WRBTR -> 0AMOUNT (Amount)
  DMBTR -> 0VALUE_LC (Amount in local currency)
  DMSOL -> 0DEBIT_LC (Debit amount in local currency)
  DMHAB -> 0CREDIT_LC (Credit amount in local currency)
  DMSHB -> 0DEB_CRE_LC (Amount in local currency with +/- signs)
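A field such as 0DEB_CRE_LC ("amount in local currency with +/- signs") combines debits and credits into one signed value, which in FI conventionally means debits carry a positive sign and credits a negative sign so that line items can be summed into a balance. The sign convention is stated here as an assumption for illustration; the function name is invented:

```python
def signed_amount(debit: float, credit: float) -> float:
    """Combine a debit and a credit into a single signed amount:
    debits positive, credits negative (assumed FI sign convention)."""
    return debit - credit

# A line item with a 400.00 debit and a 150.00 credit nets to +250.00.
print(signed_amount(400.00, 150.00))  # 250.0
```

Summing such signed amounts over all line items reproduces the cumulative balance that fields like UM01K / 0BALANCE report directly.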
o DataSources 3FI_GL_0L_TT (Leading Ledger (Totals)) and 3FI_GL_Y1_TT (Non-leading Ledger (Statutory) (Totals) - Y1) :
  DEBIT -> 0DEBIT (Total debit postings)
  CREDIT -> 0CREDIT (Total credit postings)
  BALANCE -> 0BALANCE (Cumulative balance)
  TURNOVER -> 0SALES (Sales for the period)
  QUANTITY -> 0QUANTITY (Quantity)
Example of display key figures used in master data : in 0MATERIAL, the display key figures are 0HEIGHT (Height), 0GROSS_WT (Gross Weight), 0LENGTH (Length) and 0GROSS_CONT (Gross Content).
40. InfoCube Optimization :
o When designing an InfoCube, it is most important to keep the size of each dimension table as small as possible.
o One should also try to minimise the number of dimensions.
o Both of these objectives can usually be met by building your dimensions from characteristics that are related to each other in a 1:1 manner (for example, each state is in one country) or that have only a small number of entries.
o Characteristics that have a many-to-many relationship to each other should not be placed in the same dimension; otherwise the dimension table could be huge.
o Generally, a characteristic that has a large number of entries should be in a dimension by itself, and that dimension should be flagged as a "line item" dimension in a SAP InfoCube.
o It is generally recommended to do this if the dimension table size (number of rows) exceeds 10% of the fact table's size.
41. How do you handle Init without data transfer through DTP ?
Ans : Under the 'Execute' tab of the DTP, select the processing mode "No data transfer; delta status in source: fetched".
42. What all custom reports have you created in your project ?
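The 10% guideline in item 40 for flagging a line-item dimension can be expressed as a simple check. This is a sketch only; the threshold comes from the text above, while the function and parameter names are invented:

```python
def should_be_line_item(dim_rows: int, fact_rows: int,
                        threshold: float = 0.10) -> bool:
    """Flag a dimension as a 'line item' dimension when its table size
    (number of rows) exceeds the given fraction of the fact table's size."""
    if fact_rows == 0:
        return False  # empty cube: nothing to decide yet
    return dim_rows / fact_rows > threshold

# 150,000 dimension rows vs. 1,000,000 fact rows = 15% -> flag it
print(should_be_line_item(dim_rows=150_000, fact_rows=1_000_000))  # True
# 50,000 vs. 1,000,000 = 5% -> a normal dimension is fine
print(should_be_line_item(dim_rows=50_000, fact_rows=1_000_000))   # False
```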