SAP BW Interview Questions

1) How do we do the SD and MM configuration for BW?

You need to activate the DataSources in the R/3 system and maintain the logon information for the logical system: in SM59, choose the RFC destination of the BW system and, under Logon/Security, maintain the user credentials. Maintain the control parameters for data transfer and fill the setup tables via SBIW. These are the main prerequisites.

From an SD perspective, you as a BW consultant should first understand the basic SD process flow on the R/3 side (search the forum for "SD process flow" and you will find a wealth of information on the flow, the tables and the transactions involved in SD). Next you need to understand the process flow that has been implemented at the client's site: how the SD data flows, what the integration points with other modules are, and how the integration happens. This knowledge is essential when modeling your BW design. From a BW perspective you need to know all the SD extractors and what information they bring, and then look at all the cubes and ODS objects delivered for SD.

1. What is the t-code to see the log of a transport connection?
In RSA1 -> Transport Connection you can collect the queries and the roles and then transport them (enable the transport in SE10 and import it in STMS):
1. RSA1.
2. Transport Connection (button in the left bar menu).
3. SAP Transport -> Object Types (button in the left bar menu).
4. Find Query Elements -> Query.
5. Find your query.
6. Group the necessary objects.
7. Transport the objects (transport icon).
8. Release the transport (SE10).
9. Import the transport (STMS).

2. LO/MM inventory DataSources and the significance of the marker?
The marker is like a checkpoint used when you upload data from the inventory DataSources. 2LIS_03_BX delivers the stock initialization (current stock) and 2LIS_03_BF the material movements. After loading the BX initialization you should compress the request in the cube, and only then load the data from the other DataSource (BF); if you skip this, you get data mismatches at BEx level because the system gets confused. Following SAP's standard recommendation for the compression ("Collapse") settings in the cube:
- 2LIS_03_BX (Stock Initialization for Inventory Management): compress with marker update, i.e. leave the "No marker update" checkbox unchecked.
- 2LIS_03_BF (Goods Movements from Inventory Management): compress the historical (setup) loads with the "No marker update" checkbox checked; later delta loads are compressed with it unchecked.
- 2LIS_03_UM (Revaluations): handled in the same way as 2LIS_03_BF.

3. How can you navigate to see the error IDocs?
If the load is otherwise fine, check the IDocs in the source system: go to BD87, enter your user ID and the date, and execute. You will find the IDocs with red status; select the erroneous IDoc, right-click and choose "Process manually". These red IDocs need to be reprocessed; you can take the help of your ALE/IDoc or Basis team, or push them manually from the BD87 screen. Also try to find out why these IDocs got stuck there.
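To complement the BD87 navigation described above, the IDoc control records can also be inspected directly. The sketch below is only an illustration: it assumes the standard control record table EDIDC and status 51 ("application document not posted") as the error status of interest; the status values relevant for your loads should be verified in BD87.

  REPORT z_check_error_idocs.
  * Count today's IDocs that ended in error status 51 (application
  * document not posted). EDIDC is the IDoc control record table;
  * the status value used here is an assumption to verify.
  DATA lv_count TYPE i.

  SELECT COUNT(*) FROM edidc
    INTO lv_count
    WHERE status = '51'
      AND credat = sy-datum.

  WRITE: / 'IDocs in error status 51 today:', lv_count.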

4) Difference between V1, V2 and V3 jobs in extraction?
V1 Update: whenever you create a transaction in R/3 (e.g. a sales order), the entries get into the R/3 application tables (VBAK, VBAP, ...); this takes place in the V1 update. V2 Update: the V2 update starts a few seconds after the V1 update, and in this update the values get into the statistical tables from which we do the extraction into BW. V3 Update: it is purely for BW extraction; it is a collective update that can be scheduled and executed at a later time.

5. What are statistical update and document update?
Synchronous updating (V1 update): the statistics update is made synchronously with the document update. If problems occur during updating that terminate the statistics update, the original documents are NOT saved. The cause of the termination should be investigated and the problem solved; subsequently the documents can be entered again. (Radio button: V1 updating.)
Asynchronous updating (V2 update): with this update type, the document update is made separately from the statistics update. A termination of the statistics update has NO influence on the document update (in contrast to the V1 update). (Radio button: V2 updating.)
Asynchronous updating (V3 update): with this update type, updating is also made separately from the document update. The difference from the V2 update lies in the time schedule: if the V3 update is active, the update can be executed at a later time. In contrast to V1 and V2 updates, no single documents are updated; the V3 update is therefore also described as a collective update. (Radio button: Updating in V3 update program.)

6. Do you have any idea how to improve the performance of BW?
Typical levers, discussed elsewhere in this document, are aggregates, database indexes, cube compression and partitioning, plus tuning the data load itself (appropriate delta methods and well-designed process chains).

7) How can you decide whether query performance is slow or fast?
You can check that in transaction RSRT: execute the query in RSRT, then go to SE16 and display table RSDDSTAT (BW 3.x) or RSDDSTAT_DM (BI 7.0). There you can view the details about the query, such as the time taken to execute it and the timestamps.

8) What is the statistical setup, and why is it needed?
Follow these steps to fill the setup tables:
1. Go to transaction RSA3 and see whether any data is available for your DataSource. If data is there in RSA3, go to transaction LBWG (delete setup data) and delete the data by entering the relevant application.
2. Go to transaction SBIW --> Settings for Application-Specific DataSources --> Logistics --> Managing Extract Structures --> Initialization --> Filling in the Setup Table --> Application-Specific Setup of Statistical Data --> perform the setup for the relevant application.
3. In OLI*BW (for example OLI7BW for the statistical setup of old documents: orders), give the name of the run and execute. Now all the available records from R/3 are loaded into the setup tables.
4. Go to transaction RSA3 and check the data.
5. Go to transaction LBWE and make sure the update mode for the corresponding DataSource is serialized V3 update.
6. Go to the BW system, create an InfoPackage and, under the Update tab, select "Initialize delta process"; schedule the package. Now all the data available in the setup tables is loaded into the data target.
7. For the delta records, go to LBWE in R/3 and change the update mode for the corresponding DataSource to direct or queued delta. By doing this the records bypass SM13 and go directly to RSA7. In transaction RSA7 you can see a green light, and as soon as new records are posted you can see them in RSA7.
8. Go to the BW system and create a new InfoPackage for the delta loads; under the Update tab you can see the "Delta update" radio button.
9. Now you can go to your data target and see the delta records.
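As a quick plausibility check after step 3, you can look at the setup table itself. Setup tables are generated per extract structure with the suffix SETUP; the sketch below assumes the sales order header structure MC11VA0HDR (DataSource 2LIS_11_VAHDR) and should be adapted and verified in SE11 for your application.

  REPORT z_check_setup_table.
  * Rough check: how many records did the statistical setup run (OLI7BW)
  * write for sales order headers? The table name <extract structure>SETUP
  * is an assumption based on the standard naming convention.
  DATA lv_cnt TYPE i.

  SELECT COUNT(*) FROM mc11va0hdrsetup INTO lv_cnt.

  WRITE: / 'Records in setup table MC11VA0HDRSETUP:', lv_cnt.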
9. Why do we have to construct setup tables?
The R/3 database structure for accounting is much simpler than the logistics structure.

Once you post in a ledger, that posting is done; you can correct it, but that just creates another posting. BI can get its information directly out of this (relatively) simple database structure. In LO, however, you can have one order with multiple deliveries to more than one delivery address, and the payer can also be different. When one item (order line) changes, this can be reflected in the order, supply, delivery, invoice, etc. Therefore a special record structure is built for logistics reporting, and this structure is now used by BI. In order to have this special structure filled with your starting position, you must run a setup; from that moment on, R/3 keeps filling this LO database. If you did not run the setup, BI would only start with data from the moment you switch on the filling of LO (with the Logistics Cockpit).

10. How can you eliminate duplicate records in TD and MD?
Try to check the system logs through SM21 for the same.

11. What is the use of the marker in MM?
The marker update is just like a checkpoint: it gives the snapshot of the stock on a particular date, i.e. when the marker was last updated. Because we are using a non-cumulative key figure, it would take a lot of time to calculate the current stock at report runtime; to overcome this we use the marker update. Marker updates do not summarize the data. In inventory management scenarios we have to calculate the opening and closing stock on a daily basis; to facilitate this, we set a marker that adds and subtracts the values for each record. In the absence of the marker update, the data would simply be added up and would not provide the correct values.

12. Tell me about web templates?
You get information on where the web template details are stored from the following tables:
RSZWOBJ       Storage of the web objects
RSZWOBJTXT    Texts for templates/items/views
RSZWOBJXREF   Structure of the BW objects in a template
RSZWTEMPLATE  Header table for BW HTML templates
You can check these tables and search for your web template entry. However, if I understand your question correctly, you will have to open the template in the WAD and make the corrections there.

13. What is a dashboard?
A dashboard can be created using the Web Application Designer (WAD) or the Visual Composer (VC). A dashboard is a collection of reports, views and links in a single view (for example, iGoogle is a dashboard). It is a graphical reporting interface that displays KPIs (Key Performance Indicators) as charts and graphs - in other words, a performance management system. When we want to look at all the organization's measures and how they are performing with a helicopter view, we need a report that shows the trends in a graphical display quickly; these reports are called dashboard reports. We could still report these measures individually, but by keeping all measures on a single page we create a single access point for the users to view all the information available to them. This saves a lot of precious time, gives clarity on the decisions that need to be taken, and helps the users understand the measure trends within the business flow.
Creating a dashboard (dashboards can be built with Visual Composer and WAD):
(1) Create all BEx queries with the required variants and tune them well.
(2) Differentiate table queries and graph queries.
(3) Choose the graph type that meets your requirement.
(4) Draw the layout of how the dashboard page should look.
(5) Create a web template that has a navigation block / selection information.
(6) Keep the navigation block fields common across the measures.
(7) Include the relevant web items in the web template.

The steps to be followed in the creation of a dashboard using the WAD are summarized below:
1) Open a new web template in the WAD.
2) Define the tabular layout as per the requirements so as to embed the necessary web items.
3) Place the appropriate web items in the appropriate tabular grids.
4) Assign queries to the web items (a query assigned to a web item is called a data provider).
5) Take care that the navigation block's selection parameters are common across all the BEx queries of the affected data providers.
6) Set the properties of the individual web items as per the requirements; they can be modified in the Properties window or in the HTML code.
7) Use the URL produced when this web template is executed in the portal/intranet.
8) Deploy the URL/iView to the users through the portal/intranet.

14. How do we gather the requirements for an implementation project?
One of the biggest and most important challenges in any implementation is gathering and understanding the end user and process team functional requirements. These functional requirements represent the scope of the analysis needs and expectations (both now and in the future) of the end users. The process involves one seemingly simple task: find out exactly what the end users' analysis requirements are, both now and in the future, and build the BW system to these requirements. Although simple in concept, in practice gathering and reaching a clear understanding and agreement on a complete set of BW functional requirements is not always so simple. The requirements typically involve all of the following:
- Business reasons for the project and the business questions answered by the implementation.
- Source systems that are involved and the scope of information needed from each.
- Intended audience and stakeholders and their analysis needs.
- Any major transformation that is needed in order to provide the information.
- Security requirements to prevent unauthorized use.
- Critical success factors for the implementation.
Accelerated SAP question and answer database: the question and answer database (QADB) is a simple although aging tool designed to facilitate the creation and maintenance of your business blueprint. Customers are provided with a customer input template for each application that collects the data; the question and answer format is standard across applications to facilitate easier use by the project team.
Issues database: another tool used in the blueprinting phase is the issues database. This database stores any open concerns and pending issues that relate to the implementation. Centrally storing this information assists in gathering and then managing issues to resolution, so that important matters do not fall through the cracks. You can track the issues in the database, assign them to team members, and update the database accordingly; as such, the entries also serve to document the implementation.

How can you solve data mismatch tickets between R/3 and BW?
For example, for field 0STREET: check the mapping on the BW side for 0STREET in the transfer rules; check the data in the PSA for the same field; if the PSA also does not have the complete data, then check the field in RSA3 in the source system.

16) What is a replacement path? Tell me one scenario.
A replacement path variable is filled automatically at query runtime - for example with the value of a characteristic, of another variable, or with the result of another query - instead of being entered by the user. See also the document SDS_BW_Replacement%20Path%20Variables.html (http://www.solutions.com/documents/SDS_BW_Replacement%20Path%20Variables.html).

17. What is the difference between PSA & IDoc?
In BI 7, the PSA is used only for the data load from the source system into BW.

19) What do we do in the Business Blueprint stage?
SAP has defined a business blueprint phase to help extract pertinent information about your company that is necessary for the implementation. These blueprints are in the form of questionnaires that are designed to probe for information that uncovers how your company does business. The kinds of questions asked are germane to the particular business function, as seen in the following sample questions: 1) What information do you capture on a purchase order? 2) What information is required to complete a purchase order? Each business blueprint document essentially outlines your future business processes and business requirements.

20) How do we decide what cubes have to be created?
It depends on your project requirements. Normally your BW customization, and the creation of new InfoProviders, depends on your source system. If your source system is R/3 and your users use only the R/3 standard business scenarios (SD, MM, FI, etc.), then you do not need to create any InfoProviders or enhance anything in the existing BW Business Content; but in 99% of cases this is not possible. If your business requirement differs from the delivered scenarios (the BI Content cubes), then you opt for customized cubes; customized cubes are not mandatory for every project. If your source system is something other than R/3, then you have to go for customization of all your objects. For example, in my first project we implemented BW for Solution Manager: we activated all the business content, but the source system had new scenarios for message escalation, ageing calculation, etc., so according to their business scenario we could not use the standard business content as delivered. We took the existing InfoObjects, created the new InfoObjects that were not there in the business content, and then created custom DataSources up to the InfoProviders as well as the reports; surely they should have included their new business scenarios and enhancements in the content.

21) Who makes the Technical and Functional Specifications?
Technical Specification: here we mention all the BW objects (InfoObjects, DataSources, InfoSources and InfoProviders), the data flow and the behaviour of the data load (delta or full), and also the expected duration of cube activation or creation. Pure BW technical things are in this document; it is not an end-user document.
Functional Specification: here we describe the business requirements, i.e. which business areas we are implementing (SD, MM, FI, etc.), and we give the KPI and deliverable report details to the users. This document is shared between the functional consultants and the business users, and it is applicable for end users as well. Functional specs are also called software requirements. The technical spec is then derived from the functional spec.

22) Give me one example of a Functional Specification and explain what information we get from it?
Functional specs are the requirements of the business user. Let's say the functional spec says:
- The report should return values for 12 months of data depending on the fiscal year that the user enters, or it should display quarterly values.
- The calculations or formulas in the report will be displayed with a precision of one decimal place.
- The Company variable should default to USA, but if the user wants to change it, they can check the drop-down list and choose another country.
- The user should be able to enter a key date.
From this, the technical spec and the approach follow: what the technical and display name of the report will be; which InfoProvider it is based on; what the technical names of the objects are that you will create as a result of this report; how you set up the variables and where you do it (to create the variables, certain InfoObjects such as Fiscal Year and Fiscal Version must be available in the system so that they can be used as user entry variables); how you will get the 12 months of data; what changes in properties you will make to get the required precision; who will be authorized to run this report; and so on. The same explanation goes for the rest of the requirements.

23) What is customization? How do we do it in LO? How do we do basic LO extraction for SAP R/3 - BW?
The procedure is essentially the statistical setup described in question 8:
1. Go to transaction RSA3 and see whether any data is available for your DataSource; if data is there, delete it with LBWG (delete setup data) for the relevant application.
2. Go to SBIW --> Settings for Application-Specific DataSources --> Logistics --> Managing Extract Structures --> Initialization --> Filling in the Setup Table --> Application-Specific Setup of Statistical Data and perform the setup for the relevant application, or run the OLI*BW transaction directly (for example OLI7BW for the statistical setup of old order documents), give the name of the run and execute. All available records from R/3 are then loaded into the setup tables.
3. Check the data in RSA3, and in LBWE maintain the extract structure (select the fields you need) and make sure the update mode for the DataSource is serialized V3 update.
4. In BW, create an InfoPackage, select "Initialize delta process" under the Update tab and schedule it; the data in the setup tables is loaded into the data target.
5. For the delta records, change the update mode in LBWE to direct or queued delta (the records then bypass SM13 and go directly to RSA7), create a delta InfoPackage in BW, and check the delta records in the data target.
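Customization frequently also means enhancing an LO extract structure with additional fields (the "DataSource enhancement" tickets mentioned later in this document). A minimal sketch of the classic user-exit approach is shown below; it assumes enhancement RSAP0001 with include ZXRSAU01, DataSource 2LIS_02_HDR with extract structure MC02M_0HDR, and a hypothetical appended field ZZREGIO filled from the vendor master - all names should be verified in your system.

  * Include ZXRSAU01 (user exit EXIT_SAPLRSAP_001 for transaction data).
  * Sketch only: fill the hypothetical appended field ZZREGIO with the
  * vendor's region from LFA1 for purchasing header records.
  DATA ls_data TYPE mc02m_0hdr.

  CASE i_datasource.
    WHEN '2LIS_02_HDR'.
      LOOP AT c_t_data INTO ls_data.
        SELECT SINGLE regio FROM lfa1
          INTO ls_data-zzregio
          WHERE lifnr = ls_data-lifnr.
        MODIFY c_t_data FROM ls_data.
      ENDLOOP.
  ENDCASE.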

24) When we use "Maintain DataSource", what do we do? What do we maintain?
In the DataSource / extract structure maintenance (LBWE for LO, RSA6 for other DataSources) you maintain which fields of the extract structure are transferred, which are available for selection, which are hidden, and which are used for cancellation (see the question on HIDE, SELECT and CANCELLATION fields towards the end of this document).

25) Tickets and authorization in SAP Business Warehouse: what are tickets? Give an example.
Tickets are the tracking tool by which the users track the work that we do; a ticket can be a change request, a data load issue, or whatever. Tickets are raised during a support project and can be any kind of issue or problem. If a support person faces an issue, he asks the operator to raise a ticket; the operator raises the ticket and assigns it to the respective person. Tickets raised by the client are generally considered based on priority - high priority, low priority and so on, or critical versus moderate, where critical means the most complicated issues. Critical can mean "needs to be solved within a day or half a day", depending on the client; if a ticket is of high priority it has to be resolved ASAP, while a low-priority ticket is considered only after attending to the high-priority tickets. The concept of a ticket varies from contract to contract between companies. After solving the issue, the ticket is closed by informing the client that the issue is solved. The typical tickets in production support work could be:
1. Loading any missing master data attributes/texts - done by scheduling the InfoPackages for the attributes/texts mentioned by the client.
2. Creating ad-hoc hierarchies - create hierarchies in RSA1 for the InfoObject.
3. Validating the data in cubes/ODS - by using validation reports or by comparing BW data with R/3.
4. Resolving any loads that run into errors - analyze the error and take suitable action.
5. Adding/removing fields in any of the master data/ODS/cubes - depends upon the requirement.
6. DataSource enhancement.
7. Creating ad-hoc reports or new reports based on the requirements of the client.

26) What is the attribute change run?
The attribute change run is nothing but adjusting the master data after it has been loaded from time to time, so that the system can generate or adjust the SIDs and you do not have any problems when loading the transaction data into the data targets. It is used for the realignment of the master data, and it is generally run whenever there is a change in the master data. In detail, the hierarchy/attribute change run, which activates hierarchy and attribute changes and adjusts the corresponding aggregates, is divided into four phases:
1. Finding all affected aggregates.
2. Setting up all affected aggregates again and writing the result into a new aggregate table.
3. Renaming the new aggregate table; while renaming, it is not possible to execute queries (in some databases, which cannot rename the indexes, the indexes are also created in this phase).
4. Activating the attributes and hierarchies.

27) Different types of delta updates?
Delta loads bring any new or changed records created after the last upload. Most of the standard SAP DataSources come delta-enabled, but some are not; in that case you can do a full load into the ODS and then a delta from the ODS to the cube. If you create generic DataSources, you have the option of creating a delta on a calendar day, a timestamp or a numeric pointer field (this can be a document number, etc.).

To do a delta, you first have to initialize the delta on the BW side and then set up the delta in the source system. You will be able to see the delta changes coming into the delta queue through RSA7 on the R/3 side. The delta mechanism is the same for both master data and transaction data loads. For the LO cockpit there are three delta update modes, in contrast to the old default setting (serialized V3 update):
- Direct delta: with this update mode, the extraction data is transferred with each document posting directly into the BW delta queue; each document posting with delta extraction is posted for exactly one LUW in the respective BW delta queue.
- Queued delta: with this update mode, the extraction data for the affected application is collected in an extraction queue instead of the update tables, and can be transferred to the BW delta queue by an updating collective run; up to 10,000 delta extractions of documents per LUW are compressed for each DataSource.
- Non-serialized V3 update: with this update mode, the extraction data for the application is written as before into the update tables with the help of a V3 update module and is transferred to the BW delta queue by an updating collective run; the data in the updating collective run is read from the update tables without regard to sequence.

28) Function modules: 1) UNIT_CONVERSION_SIMPLE and 2) MD_CONVERT_MATERIAL_UNIT - explain how to use these, if possible with a well-explained example.
Both are used for quantity (unit of measure) conversion. Quantity conversion allows you to convert key figures with units that have different units of measure in the source system into a uniform unit of measure in the BI system, or into different target units of measure; in part it is based on the quantity conversion functionality of the SAP NetWeaver Application Server. The conversion of units of measure is required to convert business measurements into other units. Business measurements encompass physical measurements, which are assigned to a dimension, and non-dimensional (countable) measurements such as palettes, units or pieces. You differentiate between conversions within the same dimension ID (T006-DIMID) - for example measurements of length, where 1 m = 100 cm is a linear correlation and Meter and Centimeter both belong to dimension ID LENGTH - and conversions involving different dimension IDs, for example number versus weight, where 1 unit = 25 g, Unit has dimension ID AAAADL and Gram has dimension ID MASS. You can also perform InfoObject-specific conversions (for example material-specific conversions) between units that do not belong to the same dimension, as in this example:

Number  Unit            Number  Unit
1       Chocolate bar   25      g
1       Small carton    12      Chocolate bar
1       Large carton    20      Small carton
1       Europallet      40      Large carton

(Reference: Quantity Conversion, http://help.sap.com/saphelp_nw2004s/helpdata/en/27/b65c42b4e05542e10000000a1550b0/frameset.htm.) This function enables the conversion of updated data records from the source unit of measure into a target unit of measure, or into different target units of measure if the conversion is repeated. The business transaction rules of the conversion are established in the quantity conversion type; in terms of functionality, quantity conversion is structured similarly to currency translation. A small calling sketch for the two function modules follows below.
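A minimal calling sketch for the two function modules, for example inside a transfer/update rule or transformation routine. The concrete values (grams to kilograms, the material number, the units PAL and CAR) are only illustrative assumptions, and the parameter names should be checked in SE37 on your release.

  REPORT z_quantity_conversion_demo.
  * 1) Simple conversion within one dimension (grams -> kilograms),
  *    based on the central units of measure (table T006).
  DATA: lv_in  TYPE p DECIMALS 3 VALUE '1000.000',
        lv_out TYPE p DECIMALS 3.

  CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
    EXPORTING
      input    = lv_in
      unit_in  = 'G'
      unit_out = 'KG'
    IMPORTING
      output   = lv_out
    EXCEPTIONS
      conversion_not_found = 1
      units_missing        = 2
      OTHERS               = 3.
  IF sy-subrc = 0.
    WRITE: / '1000 G =', lv_out, 'KG'.
  ENDIF.

  * 2) Material-specific conversion (e.g. pallets -> cartons) using the
  *    conversion factors maintained in the material master (MARM).
  DATA: lv_qty_pal TYPE p DECIMALS 3 VALUE '2.000',
        lv_qty_car TYPE p DECIMALS 3.

  CALL FUNCTION 'MD_CONVERT_MATERIAL_UNIT'
    EXPORTING
      i_matnr  = '000000000000004711'   " hypothetical material number
      i_in_me  = 'PAL'
      i_out_me = 'CAR'
      i_menge  = lv_qty_pal
    IMPORTING
      e_menge  = lv_qty_car
    EXCEPTIONS
      error_in_application = 1
      error                = 2
      OTHERS               = 3.
  IF sy-subrc = 0.
    WRITE: / '2 PAL =', lv_qty_car, 'CAR'.
  ENDIF.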
Quantity conversion is based on quantity conversion types. The conversion type is a combination of different parameters (conversion factors, source and target units of measure) that determine how the conversion is performed. You differentiate between conversions for which you only need to enter a source and a target unit of measure, and conversions for which specifying these values alone is not sufficient; for the latter you have to enter a conversion factor that is derived from a characteristic or a characteristic combination (compound characteristic) and its corresponding properties. A typical InfoObject-specific case is the material example above: two palettes (PAL) of material 4711 were ordered, and this order quantity has to be converted into the stock quantity in cartons (CAR).
Integration: the quantity conversion type is stored for future use and is available for quantity conversions in the transformation rules for InfoCubes and in the Business Explorer. In the transformation rules for InfoCubes you can specify, for each key figure or data field, whether quantity conversion is performed during the update; in certain cases you can also run quantity conversion in user-defined routines in the transformation rules. In the Business Explorer you can establish a quantity conversion in the query definition, or translate quantities at query runtime (translation at runtime is more limited than in the query definition). For more information, see Quantity Conversion Types (http://help.sap.com/saphelp_nw2004s/helpdata/en/1c/1b5d427609c153e10000000a1550b0/frameset.htm).

A quantity conversion type is thus a combination of different parameters that establish how the conversion is performed; the parameters that determine the conversion factors are the source and target units of measure and the option you choose for determining the conversion factors.
Conversion factors - the following options are available:
- Using a reference InfoObject: the system tries to determine the conversion factors from the reference InfoObject you have chosen, i.e. from the associated quantity DataStore object. If you are performing InfoObject-specific conversions (for example material-specific conversions), this option is the most suitable.
- Using the central units of measure (T006): conversion can only take place if the source and target units of measure belong to the same dimension (for example kilograms to grams, meters to kilometers, and so on). If you only want to perform conversions within the same dimension, this option is the most suitable, because the system only accesses one database table.
- Using the reference InfoObject if available, the central units of measure (T006) if not: the system first tries to determine the conversion factors using the quantity DataStore object you have defined; if it cannot determine factors from the quantity DataStore object, it tries again using the central units of measure.
- Using the central units of measure (T006) if available, the reference InfoObject if not: the system first tries to find the conversion factors in the central units of measure table; if conversion factors are not found in the basic table (T006), the system searches again in the quantity DataStore object, trying to find factors that match the attributes of the data record.
With the last two options the system tries to determine conversion factors at each stage and, as soon as it finds factors, uses them to perform the conversion. The option you choose should depend on how you want to spread the conversions, because these settings affect performance; the decision must be based strictly on the data set. If the source and target units of measure belong to the same dimension for 80% of the data records that you want to convert, first try to determine the factors using the central units of measure, and accept that the system will have to search the second table for the remaining 20%. Note that if you only use the reference InfoObject and you want to convert, say, 1000 grams into kilograms but the factors are not defined in the quantity DataStore object, the system cannot perform the conversion, even though this is a very simple conversion.
The option "Conversion Factor from InfoObject" (as with "Exchange Rate from InfoObject" in currency translation types) is only available when you load data: the key figure you enter here has to exist in the InfoProvider, and the attribute this key figure has in the data record is taken as the conversion factor.
Source unit of measure: the source unit of measure is the unit of measure that you want to convert; it is determined dynamically from the data record or from a specified InfoObject (characteristic). During the data load the source unit of measure can be determined either from the data record or using a specified characteristic that bears master data; data records that have the same unit key as the source unit of measure are converted. This is not the same as defining currency attributes, where you set a currency attribute on the Business Explorer tab page in characteristic maintenance - with quantity conversion types you determine the InfoObject in the quantity conversion type itself. In reporting you can specify a fixed source unit of measure or determine the source unit of measure using a variable (the variables defined for InfoObject 0UNIT are used); in planning functions you can use a fixed source unit of measure.

Target unit of measure - you have the following options for determining the target unit of measure:
- You can enter a fixed target unit of measure in the quantity conversion type (for example 'UNIT').
- You can specify an InfoObject in the quantity conversion type that is used to determine the target unit of measure during the conversion. Under "InfoObject for Determining Unit of Measure", all InfoObjects are listed that have at least one attribute of type unit; you have to select one of these attributes as the corresponding quantity attribute.
- Alternatively, you can determine that the target unit of measure is determined at query runtime: in the Query Designer, under the properties for the relevant key figure, you specify either a fixed target unit of measure or a variable to determine the target unit of measure.
- Target quantity using InfoSet: this setting covers the same functionality as "InfoObject for Determining Target Quantity". If the InfoObject that you want to use to determine the target quantity is unique in the InfoSet (it only occurs once in the whole InfoSet), you can enter it under "InfoObject for Determining Target Quantity"; you only have to enter it under "Target Quantity Using InfoSet" if the InfoObject occurs more than once in the InfoSet. For example, the InfoSet contains InfoProviders A and B, and both A and B contain InfoObject X with a quantity attribute; in this case you have to specify exactly whether you want to use X from A or X from B to determine the target quantity. Field aliases are used in an InfoSet to ensure uniqueness. All the active InfoSets in the system can be displayed using input help, and as long as you have selected an InfoSet you can select an InfoObject; all the InfoObjects with quantity attributes contained in the InfoSet can be displayed using input help.
The values in the input help correspond to the values in table T006 (units of measure). You reach the maintenance of the units of measure in SAP Customizing Implementation Guide -> SAP NetWeaver -> General Settings -> Check Units of Measure.
29) An SAP BW functional consultant is responsible for the following key activities:
- Maintain project plans and manage all project activities, many of which are executed by resources not directly managed by the project leader (central BW development team, source system developers, business key users).
- Liaise with key users to agree reporting requirements and report designs.
- Translate requirements into design specifications (report specs, master data, functional specs).
- Write and execute test plans and scripts.
- Coordinate and manage business/user testing.
- Deliver training to key users.
- Coordinate and manage productionization and rollout activities.
- Track CIP (continuous improvement) requests, work with users to prioritize, plan and manage CIP.
An SAP BW technical consultant is responsible for:
- SAP BW extraction using the standard data extractors and the available development tools, for SAP and non-SAP data sources.
- SAP ABAP programming with BW.
- Data modeling: star schema, master data, ODS and cube design in BW; data mapping/translation.
- The data loading process and procedures (performance tuning).
- Query and report development using the BEx Analyzer and Query Designer.
- Web report development using the Web Application Designer.

29. Production support
In production support there are two kinds of jobs which you will mostly be doing: 1. looking into data load errors and solving the errors related to the data loads; 2. solving the tickets raised by the users - a user raises a ticket when they face any problem with a query, such as a report showing wrong values or incorrect data, or when the system response is slow or the query runtime is high. Other than this, you will also be doing some enhancements to the existing cubes and master data, but that is done on requirement.

We can perform the following daily activities in production:
1. Monitoring data load failures through RSMO.
2. Monitoring process chains daily/weekly/monthly.
3. Performing the change run for hierarchies.
4. Checking the aggregate roll-ups.
The activities in a typical production support role would be: scheduling; R/3 job monitoring; BW job monitoring; taking corrective action for failed data loads (data loading could be via process chains or manual loads); working on tickets with small changes in reports or in AWB objects; modifying BW reports as per the needs of the users; creating aggregates and ad-hoc hierarchies in the production system; regression testing when a version/patch upgrade is done; resolving urgent user issues; helpline activities.

30) How do you convert a BEx query global structure to a local structure (steps involved)?
Steps:
1. In the BEx Analyzer, from the SAP Business Explorer toolbar, choose the open-query icon (the icon that looks like a folder).
2. On the SAP BEx Open dialog box, choose Queries.
3. Select the desired InfoCube and choose New.
4. On the "Define the query" screen, in the left frame, expand the Structure node.
5. Drag and drop the desired structure into either the Rows or the Columns frame.
6. Select the global structure, right-click and choose "Remove reference". A local structure is created.
Remember that you cannot revert the changes made to the global structure in this way; you would have to delete the local structure and then drag and drop the global structure into the query definition again. Use a local structure when you want to add structure elements that are unique to the specific query; changing a global structure changes the structure for all the queries that use it. When you try to save a global structure, a dialog box prompts you to confirm changes to all queries - that is how you identify a global structure.

31) What is the use of "Define cell" in BEx, and where is it useful?
Cells in BEx: when you define selection criteria and formulas for structural components and there are two structural components in a query, generic cell definitions are created at the intersection of the structural components; these determine the values to be presented in the cell. Cell-specific definitions allow you to define explicit formulas and selection conditions for cells, and in this way to override the implicitly created cell values; this function allows you to design much more detailed queries. In addition, you can define cells that have no direct relationship to the structural components; these cells are not displayed and serve as containers for help selections or help formulas. You need two structures to enable the cell editor in BEx: every query has one structure for the key figures, and you then have to create another structure with selections or formulas inside. With two structures, the cross between them results in a fixed reporting area of n rows by m columns, and the cross of any row with any column can be defined as a formula in the cell editor. This is useful when you want one cell to behave differently from the general behaviour described in your query definition. For example, imagine you have the following, where % is the formula kfB / kfA * 100:

      kfA   kfB   %
chA    6     4    66%
chB   10     2    20%
chC    8     4    50%

If you want the % in row chC to be the sum of the % of chA and the % of chB (chC% = chA% + chB%), then in the cell editor you can write a formula specifically for that cell as the sum of the two cells above it, giving:

      kfA   kfB   %
chA    6     4    66%
chB   10     2    20%
chC    8     4    86%

See also http://help.sap.com/saphelp_nw70/helpdata/en/4f/472e42e1ef5633e10000000a155106/frameset.htm.

32) What is the SAP GUI, and what is it used for?
SAP Graphical User Interface: the SAP GUI is the client in SAP R/3's three-tier architecture of database, application server and client. It is software that runs on a Microsoft Windows, Apple Macintosh or Unix desktop and allows a user to access SAP functionality in SAP applications such as mySAP ERP and SAP Business Information Warehouse (now called SAP Business Intelligence). You need the SAP GUI to log on to and use the SAP systems.

Manager Round Review Questions

33) What is the RMS application?
SAP Records Management is a component of the SAP Web Application Server for the

electronic management of records; even paper-based information can be part of an electronic record in SAP RMS. In one record, all the information objects of a business transaction, such as business documents, are grouped together in a transparent hierarchical structure, so SAP Records Management provides more than an electronic representation of the conventional paper record. Quick access to information is a key factor for performing business successfully, and Records Management guarantees this quick access. The RMS divides the various business units logically, thereby making it possible to give particular groups of users access to particular records. By converting paper records to electronic records, an organisation can enjoy all the advantages of a paper-free office: no storage costs for records, no cost-intensive copying procedures, and optimal retrieval of information. (See also http://rmsitservices.co.uk/upgrade.pdf.)

34) Bug resolution for the RMS application?
35) Development tasks for RMS release work?
The main task is the complete life-cycle development of SAP authorization roles and RMSs, and the technical development of the roles and authorization objects; this includes participating in the high-level and low-level design.

Tools used for business process modeling: 1. BPM, 2. ARIS, etc. (see https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/a8ffd911-0b01-0010679e-d47dade98cdd). Business Process Management with SAP NetWeaver and ARIS for SAP NetWeaver provides procedure models, methods, technologies and reference content for modeling, configuring, executing and monitoring business processes, as needed within a company's business processes.
Process modeling: a process model is an abstraction of a process and describes all the aspects of the process:
- Activities: steps that are executed within the process.
- Roles: users or systems that execute the activities.
- Artifacts: objects, such as a purchase order, that are processed by the process.
Processes within a company can be modeled on multiple abstraction levels and from numerous different viewpoints; depending on the organization, different individuals or departments are responsible for modeling the processes from the business and the technical perspectives. A deciding factor for the success of business process modeling is therefore that all those involved have a common understanding of the business processes and "speak the same language". To implement and utilize innovative processes and strategies successfully, you must convert the business process views into technical views and relate both views. Business Process Management in SAP NetWeaver provides a common methodology for all levels of process modeling; this common methodology forms a common reference framework for all project participants and links models on multiple abstraction levels:
- Business process models describe the process map and process architecture of a company, from value chain diagrams and event-driven process chains right up to end-to-end processes.
- Process configuration models support the process-driven configuration and implementation of processes.
- Process execution models support service-based process execution.
Process flow for the configuration:
1. Configure the application components in SAP Solution Manager. In the Business Blueprint, transactions can already be assigned to process steps from the reference model; you can also assign transactions to any additional processes and steps you have defined, and you can edit the Implementation Guide, thereby specifying how your business processes are to run in the SAP system.
2. Configure the integration scenario and integration process (PI): you adapt the defined integration scenarios and integration processes to your specific system landscape. You can use wizards for the configuration.
3. Use metadata (PI): you specify the necessary metadata for your integration requirements, such as data types, message interfaces, mappings and collaboration profiles (communication party, service and communication channel).

36) What is BP master data?
BP master data is nothing but the business partner data used in CRM; master tables describe the BP master data. The basic table is BUT000. Steps to view this table: go to transaction SE16, specify the table you want to view (in this case BUT000), click the "Table contents" icon (or press Enter), and you can find the entries by entering a selection or by viewing the total number of entries. Note that you cannot set an automatic code for BPs; however, you could use a formatted search to bring up the next code, provided the code you are using has a logical sequence. You can assign this formatted search to the BP Code field, and the user can then trigger it (Shift+F2) when creating a new BP. If you want a separate number range for each BP type, the user needs to set the BP type field before using the formatted search. I have also included this kind of function in an add-on: the query is still the same, but the user leaves the BP Code field blank and the add-on populates it when the user clicks the Add button.
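As an illustration of reading the central business partner table mentioned above, the following sketch selects a few business partners from BUT000; the field names PARTNER, TYPE and NAME_ORG1 are assumed from the standard table definition and should be verified in SE11.

  REPORT z_show_business_partners.
  * List a handful of business partners from the central BP table BUT000.
  TYPES: BEGIN OF ty_bp,
           partner   TYPE but000-partner,
           type      TYPE but000-type,      " BP category (person/org/group)
           name_org1 TYPE but000-name_org1, " first name line of an organization
         END OF ty_bp.
  DATA lt_bp TYPE STANDARD TABLE OF ty_bp.

  SELECT partner type name_org1
    FROM but000
    INTO TABLE lt_bp
    UP TO 10 ROWS.

  FIELD-SYMBOLS <ls_bp> TYPE ty_bp.
  LOOP AT lt_bp ASSIGNING <ls_bp>.
    WRITE: / <ls_bp>-partner, <ls_bp>-type, <ls_bp>-name_org1.
  ENDLOOP.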

37) Describe the BP master data / authorization objects.
Authorization objects - the SAP R/3 authorization concept: fundamental to SAP R/3 security is the authorization concept, which allows the assignment of broad or finely defined authorizations/permissions for system access. To understand SAP R/3 security, one needs to thoroughly understand the authorization concept. (ABAP stands for Advanced Business Application Programming language; ABAP/4 is the 4GL - fourth-generation programming language - that was used to develop all SAP R/3 applications.)
Authorization objects: authorization objects can best be described as locks that limit access to SAP R/3 system objects, such as programs, transaction codes and data entry screens. Authorization objects are classified and catalogued in the system based on functionality, such as FI (financial accounting) or HR (human resources); these classifications are called object classes. Developers and programmers can create new authorization objects through the developers' workbench, the ABAP Workbench, in SAP R/3. There can be up to 10 fields in an authorization object, but all 10 fields are not used in all objects. The most common field in an authorization object is the activity field (ACTVT); the activities are predefined activity codes that reside in a table named TACT, for example "01" create or generate, "02" change, "03" read/display, "04" print or edit message, and "06" delete. The next most common field is an organizational field, such as company code or plant.
Authorizations: authorizations are the keys that can open the authorization objects. Typically, an authorization contains a specific set of values for one or all of the fields of a particular authorization object; depending on the SAP R/3 version, there are approximately 800 standard authorizations. Several authorizations may be required to perform a task such as creating a material master record; based upon the design, these authorizations can be limited to: access to the transaction code (TCODE) to create a material master, access to a specific material, and authorization to work in a particular plant in the system. If a field is not restricted, the authorization has an asterisk (*) as the field value. (The transaction codes contained in a role can be checked in table AGR_TCODES.) An example of an authorization is as follows:

Field                   Value
ACTVT (Activity)        01
BUKRS (Company Code)    0010

This particular authorization grants the user access to create, for company code 0010, the specific object that is locked by the authorization object. The following authorization grants total access to all activities for all company codes:

Field                   Value
ACTVT (Activity)        *
BUKRS (Company Code)    *

40) What is 0RECORDMODE?
It is an InfoObject that is automatically activated when you activate a DSO in BW. 0RECORDMODE is used to identify the delta images in BW, and it is used in the DSO. In R/3 the extractors likewise have a field, 0CANCEL (ROCANCEL), which holds the delta image indicator; whenever you extract data from R/3 using LO or generic extraction, this field 0CANCEL is mapped to 0RECORDMODE in BW, and this is how BW identifies the delta images.
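For reference, the standard delta image values carried in 0RECORDMODE/ROCANCEL can be summarized as in the sketch below (the value list follows the standard SAP domain for the record mode; this is an illustration only).

  REPORT z_recordmode_values.
  * Illustration of how the standard 0RECORDMODE values are interpreted
  * when a DSO change log or a delta-capable DataSource delivers records.
  DATA lv_recordmode TYPE c LENGTH 1.

  CASE lv_recordmode.
    WHEN space. " After image: the state of the record after the change
    WHEN 'X'.   " Before image: the state before the change (key figures negated)
    WHEN 'N'.   " New image: the record was newly created
    WHEN 'A'.   " Additive image: only the difference of the key figures is sent
    WHEN 'D'.   " Delete: the record with this key has to be deleted
    WHEN 'R'.   " Reverse image: cancels a record that was already loaded
  ENDCASE.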

41) What is the difference between a filter and restricted key figures? Examples and steps in BI?
A filter restriction applies to the entire query, whereas an RKF (restricted key figure) is a restriction applied to one key figure. Suppose, for example, you have a key figure called Sales in your cube and you want to analyse data only after 2006. You put a global restriction at query level by putting Fiscyear > 2006 in the filter; this makes only the data with a fiscal year greater than 2006 available for the query to process or show, i.e. the data for 2001, 2002, 2003, 2004, 2005 and 2006 is unavailable to the query, and the query is only left with the data for 2007 and 2008. Within that data you now want to show sales in 2007 and sales in 2008 against materials, like below:

Material   Sales in 2007   Sales in 2008
M1         200             300
M2         400             700

For this you need to create two RKFs: "Sales in 2007" is one RKF defined on the key figure Sales restricted by Fiscyear = 2007, and similarly "Sales in 2008" is an RKF defined on Sales restricted by Fiscyear = 2008. So the filter makes the restriction at query level, while each RKF restricts an individual key figure column to show only 2007, only 2008, and so on.

42) How do you create conditions and exceptions in BI 7.0? (I know it in BW 3.5.)
In the Query Designer, conditions and exceptions are defined on their own tabs next to the Filter and Rows/Columns tabs. There are two ways of finding whether a query already has exceptions: 1. open the query in the BEx Query Designer - if you find an Exceptions tab at the right side of the Filter and Rows/Columns tabs, the query has exceptions; 2. execute the queries one by one - the ones with background colours for exception reporting are the ones with exceptions. From the query name or description alone you cannot judge whether a query has any exceptions.

43) The FI business flow related to BW - case studies or scenarios?
FI flow: basically there are five major topics/areas in FI:
1. GL Accounting - related tables are SKA1 and SKB1 for master data; BSIS and BSAS hold the transaction data.
2. Accounts Receivable - related to the customer; all the SD-related data, when transferred to FI, creates these documents; related tables are BSID and BSAD.
3. Accounts Payable - related to the vendor; all the MM-related documents, when transferred to FI, create these documents; related tables are BSIK and BSAK.
4. Special Purpose Ledger - which is rarely used.
5. Asset Management.
In CO there are Profit Center Accounting and Cost Center Accounting. The data of all six of the above open/cleared-item tables is present in BKPF and BSEG; you can link these tables with the help of BELNR and GJAHR, and with dates also.
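To make the BKPF/BSEG link concrete: BSEG is a cluster table in classic R/3, so instead of a database join you typically read the headers first and then the line items with FOR ALL ENTRIES, as in this small sketch (the company code and fiscal year values are placeholders).

  REPORT z_fi_doc_read_demo.
  * Read FI document headers for one company code / fiscal year and then
  * the corresponding line items; BKPF and BSEG are linked via BUKRS,
  * BELNR and GJAHR.
  DATA: lt_bkpf TYPE STANDARD TABLE OF bkpf,
        lt_bseg TYPE STANDARD TABLE OF bseg.

  SELECT * FROM bkpf
    INTO TABLE lt_bkpf
    UP TO 100 ROWS
    WHERE bukrs = '0010'      " placeholder company code
      AND gjahr = '2008'.     " placeholder fiscal year

  IF lt_bkpf IS NOT INITIAL.
    SELECT * FROM bseg
      INTO TABLE lt_bseg
      FOR ALL ENTRIES IN lt_bkpf
      WHERE bukrs = lt_bkpf-bukrs
        AND belnr = lt_bkpf-belnr
        AND gjahr = lt_bkpf-gjahr.
  ENDIF.

  WRITE: / 'Headers:', lines( lt_bkpf ), 'Line items:', lines( lt_bseg ).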
Q) I want to delete a BEx query that is in the production system through a request. Is anyone aware how to do it?
A) Have you tried the RSZDELETE transaction?

Q) What are the five ASAP methodology phases?
A: Project Preparation, Business Blueprint, Realization, Final Preparation, and Go-Live & Support.
1. Project Preparation: in this phase, decision makers define clear project objectives and an efficient decision-making process. A project charter is issued and an implementation strategy is outlined; project managers are mainly involved in this phase.
2. Business Blueprint: this is a detailed documentation of your company's requirements - discussions with the client about their needs, requirements, etc.
3. Realization: in this phase the actual implementation of the project takes place (development of objects etc.), and we are involved in the project from here onwards; the objects we need to develop are decided or modified depending on the client's requirements.
4. Final Preparation: the final preparation before going live, i.e. conducting the pre-go-live activities, testing, end-user training, etc. End-user training is given at the client site, where you train the users how to work in the new environment, as they are new to the technology.
5. Go-Live & Support: the project has gone live and is in production; the project team supports the end users.

Q) What is the landscape of R/3 and what is the landscape of BW?
A) The landscape of BW (and similarly of R/3) consists of a development system, a testing (quality) system and a production system.
Development system: all the implementation work is done in this system (i.e. analysis, development and modification of objects), and from here the objects are transported to the testing system; but before transporting, an initial test known as unit testing (testing of the objects) is done in the development system.
Testing/Quality system: the quality check and the integration testing are done in this system.
Production system: all the extraction part takes place in this system.

Q) Difference between an InfoCube and an ODS?
A: An InfoCube is structured as a star schema (extended star schema), where a fact table is surrounded by different dimension tables that are linked with DIM IDs; data-wise, you have aggregated data in the cubes and no overwrite functionality. An ODS is a flat structure (a flat table) with no star schema concept, which holds granular (detailed-level) data and has overwrite functionality.

Q) What does an InfoCube contain?
A) Each InfoCube has one fact table and a maximum of 16 dimensions (13 user-defined plus 3 system-defined: time, unit and data packet).

Q) What is an ODS?
See http://sapbibobj.blogspot.com/2010/10/data-store-objects.html and also http://sapbibobj.blogspot.com/2010/09/differeces-between-dso-and-infocube.html.

Q) What is an InfoSet?
A) An InfoSet is a special view of a dataset, such as a logical database, a table join, a table or a sequential file, and it is used by SAP Query as a data source. InfoSets determine the tables, or the fields in these tables, that can be referenced by a report; in most cases InfoSets are based on logical databases, and SAP Query includes a component for maintaining InfoSets. When you create an InfoSet, a DataSource in an application system is selected. In BW you define the data sources in an InfoSet using one or more ODS objects or InfoObjects; an InfoSet can contain data from one or more tables that are connected to one another by key fields, and these can be connected using joins. The data sources specified in the InfoSet form the basis of the InfoSet Query. The InfoSet Query functions allow you to report using flat data tables (master data reporting), choosing InfoObjects or ODS objects as data sources; you can navigate in BW to an InfoSet Query, and you can also drill through to BEx queries and InfoSet Queries from a second BW system that is connected as a data mart.

Q) Differences between the STAR schema and the extended star schema?
A) In the star schema, a fact table is in the center, surrounded by dimension tables, and the dimension tables contain the master data. In the extended star schema the dimension tables do not contain master data; instead, the master data is stored in separate master data tables divided into attributes, texts and hierarchies. These master data tables and the dimension tables are linked with each other via SID keys; the master data tables are independent of the InfoCube and are reusable in other InfoCubes.

Q) How many dimensions are in a cube?
A) 16 dimensions (13 user-defined and 3 system pre-defined: time, unit and data packet).

Q) What does the fact table contain?
A fact table consists of key figures. Each fact table can contain a maximum of 233 key figures; a dimension can contain up to 248 freely available characteristics.

Q) What does the SID table contain?
SID keys, linking the dimension tables with the master data tables (attributes, texts, hierarchies).

Q) What does the attribute table contain?
Master attribute data.

Q) What does the text table contain?
Master text data: short text, medium text, long text, and the language key if it is language-dependent.

Q) What does the hierarchy table contain?
Master hierarchy data.

Q) Difference between display attributes and navigational attributes?
A: A display attribute is used only for display purposes in the report, whereas a navigational attribute is used for drilling down in the report. We do not need to maintain a navigational attribute in the cube as a characteristic in order to drill down (that is the advantage).

Q) Is partitioning possible for an ODS?
A) No, it is possible only for a cube.

Q) How would we delete the data in an ODS?
A) By request IDs, by selective deletion, and by change log entry deletion.

Q) How would we delete the data in the change log table of an ODS?
A) Context menu of the ODS → Manage → Environment → Change log entries.

Q) What are the extra fields that the PSA contains?
A) Four technical fields: the request ID, the data packet number, the partition value and the record number.
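Relating to the SID, attribute and text tables described a few questions above: the master data tables are generated with a fixed naming convention. The sketch below lists the convention for 0MATERIAL (custom InfoObjects use the /BIC/ namespace instead of /BI0/) and reads one SID as an example; table and field names follow the standard convention but should be verified in SE11.

  REPORT z_masterdata_tables_demo.
  * Generated master-data tables for InfoObject 0MATERIAL (naming convention):
  *   /BI0/SMATERIAL - SID table
  *   /BI0/PMATERIAL - time-independent attributes
  *   /BI0/QMATERIAL - time-dependent attributes
  *   /BI0/TMATERIAL - texts
  *   /BI0/XMATERIAL - SIDs of time-independent navigation attributes (X table)
  *   /BI0/YMATERIAL - SIDs of time-dependent navigation attributes (Y table)
  DATA lv_sid TYPE rssid.

  SELECT SINGLE sid FROM /bi0/smaterial
    INTO lv_sid
    WHERE material = 'TEST-001'.   " hypothetical material number

  IF sy-subrc = 0.
    WRITE: / 'SID for material TEST-001:', lv_sid.
  ELSE.
    WRITE: / 'Material not found in /BI0/SMATERIAL.'.
  ENDIF.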

The data is stored in cluster tables, from where it is read when the initialization is run. It is important that during the initialization phase no one generates or modifies application data, at least until the tables can be set up. The extraction setup reads the dataset that you want to process (such as customer orders, with tables like VBAK, VBAP) & fills the relevant communication structure with the data. Q) Why do we delete the setup tables (LBWG) & fill them (OLI*BW)? A) Initially we don't delete the setup tables, but when we make a change to the extract structure we go for it. We are changing the extract structure, which means there are some newly added fields that were not there before. So to get the required data (i.e. only the data which is required, avoiding redundancy) we delete and then refill the setup tables. This also refreshes the statistical data. Q) Why partitioning? A) For performance tuning. Q) What types of partitioning are there for BW? There are two partitioning performance aspects for BW (Cube & PSA): A) Query data retrieval performance improvement: partitioning by (say) date range improves data retrieval by making best use of database [data range] execution plans and indexes (of, say, an Oracle database engine). B) Transactional load partitioning improvement: partitioning based on expected load volumes and data element sizes; improves data loading into the PSA and cubes by InfoPackages (e.g. without timeouts). Q) Can you add a new field at the ODS level? Sure you can. An ODS is nothing but a table. Q) Can a characteristic InfoObject be an InfoProvider? Of course. Q) What are indexes? Indexes are database indexes, which help in retrieving data quickly. Q) Different types of attributes? A) Navigational attributes, Display attributes, Time-dependent attributes, Compounding attributes, Transitive attributes, Currency attributes. Q) What are the different variables used in BEx? A) Different variables are: Texts, Formulas, Hierarchies, Hierarchy nodes & Characteristic values. Variable types are: Manual entry / default value, Replacement path, SAP exit, Customer exit, Authorization. Q) What are process chains? A) The TCode is RSPC. A process chain is nothing but a grouping of processes: a sequence of processes scheduled in the background & waiting to be triggered by a specific event. The process variant (start variant) is the place where the process chain knows where to start. There should be a minimum and a maximum of one start variant in each process chain.

In the start variant we specify when the process chain should start, by giving a date and time or by choosing to start immediately. Some of these processes trigger an event of their own that in turn triggers other processes. Ex: Start chain → Delete BCube indexes → Load data from the source system to PSA → Load data from PSA to DataTarget ODS → Load data from ODS to BCube → Create indexes for BCube after loading data → Create database statistics → Roll up data into the aggregate → Restart chain from beginning. Q) What are process types & process variants? A) Process types are: General services, Load process & subsequent processing, Data target administration, Reporting agent & Other BW services. The process variant (start variant) is the place where the process type knows when & where to start. Q) For what do we use HIDE fields, SELECT fields & CANCELLATION fields? A) Selection fields: the only purpose is that when we check this column, the field will appear in the InfoPackage data selection tab. Hide fields: these fields are not transferred to the BW transfer structure. Cancellation: it will reverse the posted documents of key figures defined by the customer by multiplying them with -1, thereby nullifying the value; I think this is reverse posting. Q) Types of updates? A) Full update, Init delta update & Delta update. Q) How can I compare data in R/3 with data in a BW cube after the daily delta loads? Are there any standard procedures for checking them or matching the number of records? A) You can go to R/3 TCode RSA3 and run the extractor. It will give you the number of records extracted. Then go to the BW monitor to check the number of records in the PSA and check to see if it is the same; also check the monitor header tab. RSA3 is a simple extractor checker program that allows you to rule out extraction problems in R/3. It is simple to use, but only really tells you if the extractor works. To use RSA3, go to it and enter the extractor, ex: 2LIS_02_HDR. Click execute and you will see the record count; you can also go to display that data. You are not modifying anything, so what you do in RSA3 has no effect on data quality afterwards. Since records that get updated into Cube/ODS structures are controlled by update rules, it will not tell you how many records should be expected in BW for a given load. You have that information in the monitor RSMO during and after data loads. From RSMO for a given load you can determine how many records were passed through the transfer rules from R/3, how many targets were updated, and how many records passed through the update rules. It also gives you error messages from the PSA. However, you will not be able to determine what is in the cube compared to what is in the R/3 environment; you will need to compare records on a 1:1 basis against records in R/3 transactions for the functional area in question. I would recommend enlisting the help of the end user community to assist, since they presumably know the data. Q) X & Y tables? X-table = a table to link material SIDs with SIDs for time-independent navigation attributes. Y-table = a table to link material SIDs with SIDs for time-dependent navigation attributes. There are four types of SID tables: X (time-independent navigational attributes SID tables), Y (time-dependent navigational attributes SID tables), H (hierarchy SID tables) and I (hierarchy structure SID tables).

Q) How do I know which table (in SAP BW) contains the technical name / description and creation data of a particular report (reports that are created using BEx Analyzer)? A) There is no single such table in BW; if you want to know such details while you are opening a particular query, press the Properties button and you will see all the details you wanted. You will find your information about technical names and descriptions of queries in the following tables: Directory of all reports (table RSRREPDIR) and Directory of the reporting component elements (table RSZELTDIR). For workbooks and the connections to queries, check the where-used list for reports in workbooks (table RSRWORKBOOK) and Titles of Excel Workbooks in the InfoCatalog (table RSRWBINDEXT). Q) What is a LUW in the delta queue? A) A LUW, from the point of view of the delta queue, can be an individual document, a group of documents from a collective run, or a whole data packet of an application extractor. Q) Why does the number in the 'Total' column on the overview screen of transaction RSA7 differ from the number of data records that is displayed when you call the detail view? A) The number on the overview screen corresponds to the total of LUWs (see also the first question) that were written to the qRFC queue and that have not yet been confirmed. The detail screen displays the records contained in the LUWs. Both the records belonging to the previous delta request and the records that do not meet the selection conditions of the preceding delta init requests are filtered out; thus, only the records that are ready for the next delta request are displayed on the detail screen. In particular, a possibly existing customer exit is not taken into account. Q) Why does transaction RSA7 still display LUWs on the overview screen after successful delta loading? A) Only when a new delta has been requested does the source system learn that the previous delta was successfully loaded to the BW System. Then the LUWs of the previous delta may be confirmed (and also deleted). In the meantime, the LUWs must be kept for a possible repetition of the delta request. This means that the number on the overview screen does not change when the first delta is loaded to the BW System. Q) Why is there a DataSource with '0' records in RSA7 if delta exists and has also been loaded successfully? It is most likely that this is a DataSource that does not send delta data to the BW System via the delta queue but directly via the extractor (delta for master data using ALE change pointers). Such a DataSource should not be displayed in RSA7. This error is corrected with BW 2.0B Support Package 11. Q) Do the entries in table ROIDOCPRMS have an impact on the performance of the loading procedure from the delta queue?

A) The impact is limited. If performance problems are related to the loading process from the delta queue, then refer to the application-specific notes (for example in the CO-PA area, in the logistics cockpit area and so on). Caution: as of PlugIn 2000.2 patch 3, the entries in table ROIDOCPRMS are as effective for the delta queue as for a full update. Please note, however, that LUWs are not split during data loading for consistency reasons. This means that when very large LUWs are written to the delta queue, the actual package size may differ considerably from the MAXSIZE and MAXLINES parameters. Q) What is the purpose of the function 'Delete data and meta data in a queue' in RSA7? What exactly is deleted? A) You should act with extreme caution when you use the deletion function in the delta queue. It is comparable to deleting an InitDelta in the BW System and should preferably be executed there. You do not only delete all data of this DataSource for the affected BW System, but also lose the entire information concerning the delta initialization. Then you can only request new deltas after another delta initialization. When you delete the data, the LUWs kept in the qRFC queue for the corresponding target system are confirmed. Physical deletion only takes place in the qRFC outbound queue if there are no more references to the LUWs. The deletion function is, for example, intended for a case where the BW System from which the delta initialization was originally executed no longer exists or can no longer be accessed. Q) I loaded several delta inits with various selections. For which one is the delta loaded? A) For delta, all selections made via delta inits are summed up. This means that a delta for the 'total' of all delta initializations is loaded. Q) How many selections for delta inits are possible in the system? A) With simple selections (intervals without complicated join conditions or single values), you can make up to about 100 delta inits. It should not be more. With complicated selection conditions, it should be only up to 10-20 delta inits. Reason: with many selection conditions that are joined in a complicated way, too many 'where' lines are generated in the generated ABAP source code, which may exceed the memory limit. Q) What is the relationship between RSA7 and the qRFC monitor (transaction SMQ1)? A) The qRFC monitor basically displays the same data as RSA7. The internal queue name must be used for selection on the initial screen of the qRFC monitor. This is made up of the prefix 'BW, the client and the short name of the DataSource. For DataSources whose names are 19 characters long or shorter, the short name corresponds to the name of the DataSource. For DataSources whose names are longer than 19 characters (for delta-capable DataSources only possible as of PlugIn 2001.1), the short name is assigned in table ROOSSHORTN. In the qRFC monitor you cannot distinguish between repeatable and new LUWs. Moreover, the data of a LUW is displayed there in an unstructured manner. Q) I intend to copy the source system, make a client copy. What will happen with the delta? Should I initialize again after that?

A) Before you copy a source client or source system, make sure that your deltas have been fetched from the delta queue into BW and that no delta is pending. After the client copy, an inconsistency might occur between the BW delta tables and the OLTP delta tables, as described in Note 405943. After the client copy, no more deltas are loaded into BW. The delta must be initialized in any case, since delta depends on both the BW system and the source system. Even if no dump 'MESSAGE_TYPE_X' occurs in BW when editing or creating an InfoPackage, you should expect that the delta has to be initialized after the copy. After the system copy, the table will contain the entries with the old logical system name, which are no longer useful for further delta loading from the new logical system. Table ROOSPRMSC will probably be empty in the OLTP, since this table is client-independent. Q) Despite the delta request being started after completion of the collective run (V3 update), it does not contain all documents. Only another delta request loads the missing documents into BW. What is the cause of this "splitting"? A) The collective run submits the open V2 documents for processing to the task handler, which processes them in one or several parallel update processes in an asynchronous way. For this reason, plan a sufficiently large "safety time window" between the end of the collective run in the source system and the start of the delta request in BW. An alternative solution where this problem does not occur is described in Note 505700. Q) In SMQ1 (qRFC Monitor) I have status 'NOSEND'. In the table TRFCQOUT, some entries have the status 'READY', others 'RECORDED'; ARFCSSTATE is 'READ'. What do these statuses mean? Which values in the field 'Status' mean what, and which values are correct and which are alarming? Are the statuses BW-specific or generally valid in qRFC? A) NOSEND in SMQ1 means nothing (see Note 378903). The value 'U' in field 'NOSEND' of table TRFCQOUT is discomforting. Tables TRFCQOUT and ARFCSSTATE: status READ means that the record was read once, either in a delta request or in a repetition of the delta request. However, this does not mean that the record has successfully reached the BW yet. The status READY in TRFCQOUT and RECORDED in ARFCSSTATE means that the record has been written into the delta queue and will be loaded into BW with the next delta request or a repetition of a delta. In any case, only the statuses READ, READY and RECORDED in both tables are considered to be valid. The status EXECUTED in TRFCQOUT can occur temporarily; it is set before starting a delta extraction for all records with status READ present at that time. The records with status EXECUTED are usually deleted from the queue in packages within a delta request directly after setting the status, before extracting a new delta. If you see such records, it means that either a process which is confirming and deleting records that have been loaded into BW is successfully running at the moment, or, if the records remain in the table for a longer period of time with status EXECUTED, it is likely that there are problems with deleting the records which have already been successfully loaded into BW. In this state, no more deltas are loaded into BW. Every other status is an indicator of an error or an inconsistency. Q) How and where can I control whether a repeat delta is requested? A) Via the status of the last delta in the BW request monitor. If the request is RED, the next load will be of type 'Repeat'. If you need to repeat the last load for certain reasons, set the request in the monitor to red manually. For the contents of the repeat, see Question 14. Delta requests set to red despite the data already being updated lead to duplicate records in a subsequent repeat, if they have not been deleted from the data targets concerned beforehand.

Q) In BW we need to write ABAP routines. I wish to know when and what type of ABAP routines we have to write. Also, are these routines written in update rules? I will be glad if this is clarified with real-time scenarios and a few examples. A) We write our routines in the start routines in the update rules or in the transfer structure (you can choose between writing them in the start routine or directly behind the different characteristics). In the transfer structure you just click on the yellow triangle behind a characteristic and choose "routine". In the update rules you can choose "start routine" or click on the triangle with the green square behind an individual characteristic. Usually we only use a start routine when it does not concern one single characteristic (for example when you have to read the same table for 4 characteristics). We used ABAP routines, for example: to convert to uppercase (transfer structure); to convert values out of a third-party tool with different keys into the same keys as our SAP system uses (transfer structure); to select only a part of the data from an InfoSource when updating the InfoCube (start routine); etc. I found this info in one of the previous postings. I hope this helps. Q) There is one ODS and 4 InfoCubes. We send data at one time to all cubes. If one cube got a lock error, how can you rectify the error? A) Go to TCode SM66, see which one is locked, select that PID from there, then go to TCode SM12 and unlock it. This happens when lock errors occur during scheduling. Q) Difference between Calculated Key Figure & Formula? A) Q) Variables in Reporting? A) Characteristic values, Text, Hierarchies, Hierarchy nodes & Formula elements. Q) Variable processing types in Reporting? A) Manual, Replacement path, SAP Exit, Customer Exit, Authorization. Q) Why do we use the RSR00001 enhancement? A) For enhancing the customer exit in reporting. The exit is processed in steps: Step 1 - before selection, Step 2 - after selection, Step 3 - all variables processed at the same time. Q) We need to find the table in which query and variable assignments are stored. We must read this table in a user exit to get which variables are used in a query. A) Check out tables RSZELTDIR and RSZCOMPDIR for query BEx elements, and RSZGLOBV (the variable table). Step 1: get the query ID from table RSRREPDIR (field RSRREPDIR-COMPUID); then use this ID for the tables starting with RSZEL* (for example, to check when the variable name VNAM = 'ZVC_FY1' or 'ZFISPER1' is used in the query).
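The following is only a rough sketch of such a customer exit variable, assuming the standard include ZXRSRU01 of enhancement RSR00001 (function module EXIT_SAPLRRS0_001). ZVC_FY1 is the variable name mentioned above; the derivation of its value (current calendar year from the system date) is purely an illustrative assumption and not taken from this document.

  * Sketch only: include ZXRSRU01 (enhancement RSR00001).
  * I_VNAM   = technical name of the variable being processed
  * I_STEP   = 1 before the selection screen, 2 after user entry,
  *            3 once all variables have been processed
  * E_T_RANGE = range table (SIGN/OPT/LOW/HIGH) returned to the query
    DATA: L_S_RANGE TYPE RSR_S_RANGESID.

    CASE I_VNAM.
      WHEN 'ZVC_FY1'.
        IF I_STEP = 2.                     "after the selection screen
          CLEAR L_S_RANGE.
          L_S_RANGE-SIGN = 'I'.
          L_S_RANGE-OPT  = 'EQ'.
          L_S_RANGE-LOW  = SY-DATUM(4).    "illustrative: current year
          APPEND L_S_RANGE TO E_T_RANGE.
        ENDIF.
    ENDCASE.

In step 2 the exit fills E_T_RANGE, and the query then uses that range as the variable's value; variables not handled in the CASE are simply left to their normal processing.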

Q) What is an aggregate? A) Aggregates are small or baby cubes, a subset of an InfoCube. Roll-up: when data is loaded into the cube for a second time, we have to roll up to make that data available in the aggregate. Flat aggregate: when you have more than 15 characteristics for an aggregate, the system will generate that aggregate as a flat aggregate to increase performance. Q) How can we stop loading data into an InfoCube? A) First you have to find out the job name from the monitor screen for this load (monitor screen → Header tab strip). Then in SM37 (job monitoring) in R/3, select this job and delete it from the menu; just check the jobs in SM37, and you also have to delete the request in BW. Cancellation is not advisable for a delta load. Repair bad data and subsequent data targets with delta update. Posted by Aaron Wang in aaron.wang3 on 11/04/2007 9:44:37 PM. Initialization / delta update is what we do every day and is the update method for most data targets. The delta mechanism ensures the consistency of transactional data. So if some data is wrong in the early stage of the data flow, we have to correct not only the data itself but also the related records in subsequent data targets. Unfortunately such kind of data can't be avoided in daily extraction. Here is one example: some FI documents were transferred in R/3 with LSMW's background mode, and for one of the line items the 'Negative posting signal' was entered as 'x' while the right value should be 'X'. This line item stays fine in my ODS Z1CCWO01 but causes an error while being loaded into several cubes during the process chain's run. In some situations the data in the ODS is correct but an error is reported while loading from the ODS to the cubes. Request 25344 is bad. It's easy to modify the data in the PSA of the ODS and then reconstruct it, but please don't do that immediately: if we simply delete the delta requests in the cubes afterwards and then reload the delta again, it won't meet your expectation, because the delta is broken! If we want to fix the data in both the ODS and the cubes and keep the delta right, just take the following steps: 1. Set the request's QM status in all your data targets (in my case, 4 cubes and 1 ODS) to red and then delete all of them. This will cause a repeat delta load in the following steps.

2. Open table RSBODSLOGSTATE via SE16 and change the fields 'Max. delta slice that was extracted by all receivers' and 'Max. delta slice extracted until now' to the latest correct request number (25344 to 24860 in my case). 3. Open table RSDMDELTA via SE16; this is where the successfully data-marted requests are stored. 4. Delete the request that contains the bad data (25344 in my case), but still mark the QM status in the 'Requests' tab as red (not in the Monitor). Since the data mart status of the bad request has been cleared, we can thus delete the request and modify the data in the PSA. 5. Modify the data in the PSA.

6. Re-construct the bad request (25344) in the ODS and activate it. 7. Load the delta package in the data mart from the ODS to the subsequent data targets. A warning of 'repeat delta' will pop up; choose 'request again'. The above steps ensure a correct delta after the PSA change and reconstruction, for all data targets in the data flow. It's somewhat complicated, but I still haven't found any better solution short of redoing the whole initialization (which may contain several million records and affect all the data targets!). If you have any questions or comments, please post them. Business Intelligence Interview Questions. I have got these queries from someone and I would like to share my views. I don't claim my answers to be correct, and hence wherever you don't agree, please post them. Question: How do you generally approach an analytics project? Answer: Any project should start from defining the scope of the project, and the approach should be not to deviate from that scope. Then the project should be functionally divided into smaller modules, generally done by project managers along with technical and functional leads.

To move ahead, according to the defined scope of the project they start gathering requirements while interacting with the clients. The functional leads then decide on majorly three things (they have a discussion with the technical leads and try to reach a solution): 1. Technical leads decide what schemas to create and what requirements are going to be fulfilled by each schema. 2. Technical leads discuss all this with the developers and try to close the requirements. 3. Simultaneously, testing and deployment are planned in a phased manner. The questions are very specific to OBIEE and I don't have much experience in that, hence you may not agree with my answers; there are more questions which I will try to answer later, but wherever you disagree please post a comment and let me know too. Question: What are the challenges you faced while making reports? Answer: Making a report has never been a difficult task, but the problem comes when users are reluctant to adopt a new system. I have experienced that if you are not able to create the report in exactly the way they are used to seeing it, they will keep asking for changes. Clients try to match the data with their existing reports and you never get the correct results; you need to discover that by analysing and digging deep into the matter, otherwise you discover these things at a later stage, come to know of all the problems, and are held responsible for the delay. Hence always consult the SPOC (Single Point of Contact) and try to understand the logic they have used to generate their reports. Your approach should be to first show them what they want to see and then add more information to the report. Question: How does analytics process your request when you create your requests? Answer: If the question means how the Oracle BI Analytics Server processes the user requests, the answer is: 1. checking and verifying credentials, 2. breaking the request into threads (as Oracle BI is a multi-threaded server), 3. processing the requests and managing the cached results. The Oracle BI Server converts the logical SQL submitted by the client into optimised physical SQL, which is then sent to the backend database. Also, in between, it performs various tasks like converting the user operations (such as user selections) to form a logical SQL and

converting the results received from the database into a user-presentable form, etc. Question: What will you do when your report is not fetching the right data? Answer: This is the biggest problem in report creation and verification. There could be two reasons for a report not fetching the right data: 1. Mostly clients do not have correct data in their database, and on top of that, to correct the results they make some changes at the report level to bring about the desired result, which you may not be aware of while creating the reports. 2. If the database values are correct, then there could be a problem with the joins and relations in the schema. Question: How are we going to decide which schema we are going to implement in the data warehouse? Answer: One way is what is mentioned in the question above. If you ask me to blindly create schemas for the warehouse without knowing any requirements, I will simply first divide the schemas on the basis of the functional areas of an organisation, which are similar to the modules in an ERP, like sales, production, purchase, inventory, finance, HR etc., and I will broadly describe the expected analysis an organisation would like to do in every module. I think this way you would be able to complete at least 40-50% of the requirements. To move ahead, study the data and the business and you can create a few more schemas. Question: From where do you get the logical query of your request? Answer: The logical SQL generated by the server can be viewed in BI Answers. If I have not understood the question, please raise your voice. Question: What major challenges did you face while creating the RPD? Answer: Every now and then there are problems with the database connections, but the real problem while creating the repository (RPD) files comes with complex schemas made on OLTP systems consisting of lots of joins, and with checking the results. The type of join made needs to be checked: by default it is an inner join, but sometimes the requirement demands other types of joins. There are a lot of problems with date formats as well.

Question: What are global filters and how do they differ from column filters? Answer: Column filter - simply a filter applied on a column, which we can use to restrict our column values while pulling the data, or in charts to see the related content. Global filter - not sure; I understand this filter has an impact across the application, but I really don't understand where and how it can be used. I have heard of global variables but not global filters. Question: How do you make the Delivery Profiles work? When we use the SA System, how does the SA Server understand that it needs to use it for getting the user profile information? Where do you configure the scheduler? Answer: I am not sure if I am correct, but we configure the OBIEE scheduler in the database. Question: How do you hide certain columns from a user? Answer: Application access-level security - do not add the column in the report, and do not add the column in the presentation layer. Question: How can we enable drills on a given column's data? Answer: To enable drill-down for a column, it should be included in the hierarchy in OBIEE. Hyperion IR has a drill-anywhere feature where you don't have to define anything and can drill to any available column. Question: Is drill-down possible without the attribute being part of a hierarchical dimension? Answer: No. Question: How do you do conditional formatting? Answer: While creating a chart in BI Answers, you can define the conditions and apply colour formatting. Question: What is guided navigation? Answer: I think it is just the arrangement of hyperlinks to guide the user to navigate between the reports to do the analysis. Question: How is the Webcat file deployed across environments?

Question: How do the users created at the RPD, Answers and Dashboards levels differ?

Answer: RPD users can do administrator tasks like adding new data sources, creating hierarchies and changing column names, whereas Answers users may create new charts and edit those charts, and Dashboard users may only view and analyse the dashboard, or can edit the dashboard by adding/removing chart objects. Question: Online/offline mode - how does it impact development and deployment? Answer: Online mode - you can make changes in the RPD file and push in changes, which will be immediately visible to the users who are already connected; this feature we may use in a production environment. Offline mode - can be useful in a test or development environment. Question: Explain the schema in your last project.

Question: What happens if you reconcile/sync both (the RPD and the DB)?

Technical Business Intelligence Questions and Answers Many questions asked during the interview will test the applicant's knowledge of the field and their ability to use business intelligence software and other related applications. Q: Could you please explain the concept of business intelligence? A: Business intelligence is the management and collection of data that is used to help a business in the decision making process. The gathered data can also be used to predict the outcome of various business operations. There are a few key steps in business intelligence, which include: the gathering of data, analysis of that data, a review of the situation, risk evaluation and then using all of this information to make the best decision for the business. This data and analysis can be used to make financial and sales decisions, and also help a company gain an edge over its competitors. Q: What are some of the standard tools used in business intelligence? A: Some of the standard business intelligence tools are: - BusinessObjects - Crystal Reports - Micro Strategy - Microsoft OLAP - Qlik View Note: Make sure that the most frequently used solutions are mentioned, as well as new and successful programs. This will demonstrate your interest in the field and knowledge of trends. Both are very important. Q: Describe what Online Analytical Processing is.

A: Online analytical processing, or OLAP, is a versatile tool that analyzes data stored within a multidimensional database. It allows the user to isolate pieces of information and view them from many different perspectives. For example: Sales of a particular product in April can be compared to the sales of the same product in September. On the other hand, sales of a particular product can also be compared to other products sold in the area. OLAP software programs can also be used for data mining purposes. Q: Please explain what a universe is in business intelligence. A: A universe is terminology used in the BusinessObjects application. It is actually the semantic layer between the end user and the data warehouse. A universe masks the complex, traditional database structure and replaces it with familiar business terminology. This makes it easier for the end user to understand and use.
Q: What is an aggregate table? A: Aggregate tables summarize information gathered from existing warehouse data. An example could be yearly or monthly sales information. These aggregate tables are typically used to reduce query time, as the actual table is likely to have millions of records. Rather than retrieving the information from the actual table, it is taken from the aggregate table, which is much smaller. Retrieving that information directly would take quite a bit of time and would also put a huge strain on the server. Q: Please explain what business intelligence dashboards are. A: A business intelligence dashboard is, more or less, a reporting tool that tells a business how it is performing at a particular point in time. It consolidates important pieces of information and creates a visual display so that a user can see whether or not the company is in good shape. A dashboard's interface is usually customizable and can pull real-time data.

Behavioral BI Questions Aside from technical questions, the applicant will likely be asked about how they perform certain tasks and what they would do in certain situations. These are much like the typical behavioral questions asked during an interview, but are still geared towards the business intelligence field. These can be questions about data, analytics or reporting methods. Below are some potential questions and tips on how to answer them. Q: How much experience do you have with dashboards, reporting tools and scorecards? A: Be as thorough as possible and completely honest. If you have any experience at all in this field, there is a good chance you are pretty familiar with each of these tools. Tell the employer how long you have been working with these tools and how often you used them (i.e. daily or weekly). Q: What is your method of analyzing data? Please provide some examples. A: The interviewer is looking to find out how you approach data analysis via examples of what you have done in the past. Try to choose instances where you took a different approach or pinpointed something that was previously overlooked. Q: What is the most important report you have created? Was this report easily understood by others? Were they able to grasp the implications of that data?

A: The employer wants to know if you are capable of turning complicated, complex data into a report that is easily understood by others in the company. You may be able to create compelling reports, but if the person who receives the report cannot comprehend the implications of your data, all of your hard work will mean nothing. Again, be thorough with your answer and give as much detail about the report as possible. Depending on what position you have applied for, the questions may vary. Business intelligence interview questions may be a bit more in-depth and technical in nature, but they are important in determining which candidates are truly knowledgeable in the area and able to provide the enterprise with the support it needs. Try not to be intimidated by the wording of the questions and focus on the core of what is being asked. Jot down some questions to ask during an interview before the meeting, as you will have an opportunity to ask some of your own towards the end. SAP BW/BI Interview Questions Part-1 1. What are Start routines, Transfer routines and Update routines? Ans: Start routines: The start routine is run for each data package after the data has been written to the PSA and before the transfer rules have been executed. The entire data package in the transfer structure format is used as a parameter for the routine. It has no return value. Its purpose is to execute preliminary calculations and to store them in global data structures; this structure or table can be accessed in the other routines. Transfer / Update routines: They are defined at the InfoObject level. It is like the start routine, but it is independent of the DataSource. It allows complex computations for a key figure or a characteristic. We can use this to define global data and global checks (a minimal sketch of such a transfer routine appears after question 4 below). 2. What is the difference between ODS, InfoCube and MultiProvider? Ans: a) ODS: provides granular data, allows overwrite, and the data is in transparent tables; it provides operational reports. b) CUBE: follows the star schema; we can only append data, i.e. it has the additive property; it provides analytical reports. c) MultiProvider: it is the logical unification of physical InfoProviders. It does not have physical data. There should be one characteristic in common for unifying the physical InfoProviders. 3. What are the steps involved in LO extraction? Ans: The steps are: a) RSA5 Select the DataSources b) LBWE Maintain DataSources and activate extract structures c) LBWG Delete setup tables d) 0LI*BW Fill setup tables e) RSA3 Check extraction and the data in the setup tables f) LBWQ Check the extraction queue g) LBWF Log for LO extract structures h) RSA7 BW delta queue monitor. 4. What are the extractor types? Ans: There are three types of extractors: Application-specific - BW Content (FI, HR, CO, SAP CRM, LO Cockpit) and customer-generated extractors (FI-SL, CO-PA); Cross-application - generic extractors based on table, view, InfoSet, function module.
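As a hedged illustration of the transfer routines described in question 1 (and of the "convert to uppercase" example mentioned elsewhere in this document), here is a minimal sketch of the routine body that sits behind a characteristic in BW 3.x transfer rules; the field name TRAN_STRUCTURE-SORTL is an assumed example, not taken from this document.

  * Sketch of a characteristic transfer routine (BW 3.x).
  * TRAN_STRUCTURE = current record of the transfer structure,
  * RESULT         = value passed on to the communication structure.
    RESULT = TRAN_STRUCTURE-SORTL.
    TRANSLATE RESULT TO UPPER CASE.
  * RETURNCODE <> 0 skips the record, ABORT <> 0 cancels the whole package.
    RETURNCODE = 0.
    ABORT      = 0.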

5. What are the options available in transfer rules for mapping? Ans: InfoObject, Constant, Routine, Formula. 6. Explain how you used start routines in your project? Ans: Start routines are used for mass processing of records. In a start routine, all the records of the data package are available for processing, so we can process all these records together in the start routine. In one scenario, we wanted to apply size percentages to the forecast data. For example, if material M1 is forecast at, say, 100 in May, then after applying the size % (Small 20%, Medium 40%, Large 20%, Extra Large 20%) we wanted to have 4 records against the one single record that comes in the InfoPackage. This is achieved in the start routine. 7. What is the table that is used in start routines? Ans: The table structure will always be the structure of an ODS or InfoCube. For example, if it is an ODS, then the active table structure will be the table. 8. What are return tables? Ans: When we want to return multiple records instead of a single value, we use the return table in the update routine. Example: if we have the total telephone expense for a cost center, using a return table we can get the expense per employee (a minimal sketch of such a routine appears after question 13 below). 9. What is compression? Ans: When we perform compression, the data in the InfoCube moves from the F fact table to the E fact table. All the request IDs are deleted and compressed into a single request. When you perform the compression choosing the checkbox "with zero elimination", key figures with zero values will also be deleted. As a whole, compression saves database space and increases query performance. 10. What is rollup? Ans: This is used to load new requests (delta records) in the InfoCube into the aggregates. If we have not performed a rollup, then the current data (which is not rolled up) will not be available for reporting. 11. What is table partitioning and what are the benefits of partitioning an InfoCube? Ans: It is the method of dividing the fact table, which improves query performance; partitioning helps the report run faster, as data is stored in the relevant partitions. We can perform partitioning only when we have either 0CALMONTH or 0FISCPER in the InfoCube. When we partition, the E fact table of the InfoCube is partitioned, and hence we should perform compression followed by partitioning in order to get the partitioning effect. 12. How would you optimize the dimensions? Ans: We should define as many dimensions as possible, and we have to take care that no single dimension crosses more than 20% of the fact table size. If a dimension table crosses more than 20% of the size of the fact table, we should declare the dimension as a line item dimension (there should not be more than one characteristic assigned to a dimension in order to declare it as a line item dimension). 13. How do you find the size of the dimension table? Ans: With the program SAP_INFOCUBE_DESIGNS.
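Here is a hedged sketch of the return-table case from question 8, assuming a BW 3.x key figure routine created with the "Return table" checkbox. The lookup table ZEMP_CC and all /BIC/ field names are illustrative assumptions only, not objects from this document.

  * Sketch only: key figure routine with "Return table" (BW 3.x update rules).
  * One incoming cost-center record fans out into one row per employee.
    DATA: L_S_RESULT LIKE LINE OF RESULT_TABLE,
          L_T_EMP    TYPE STANDARD TABLE OF ZEMP_CC,
          L_S_EMP    TYPE ZEMP_CC,
          L_LINES    TYPE I.

    SELECT * FROM ZEMP_CC INTO TABLE L_T_EMP
           WHERE COSTCENTER = COMM_STRUCTURE-COSTCENTER.
    DESCRIBE TABLE L_T_EMP LINES L_LINES.

    IF L_LINES > 0.
      LOOP AT L_T_EMP INTO L_S_EMP.
        CLEAR L_S_RESULT.
        L_S_RESULT-COSTCENTER     = COMM_STRUCTURE-COSTCENTER.
        L_S_RESULT-/BIC/ZEMPLOYEE = L_S_EMP-EMPLOYEE.
        "split the cost-center total evenly across its employees
        L_S_RESULT-/BIC/ZTELEXP   = COMM_STRUCTURE-/BIC/ZTELEXP / L_LINES.
        APPEND L_S_RESULT TO RESULT_TABLE.
      ENDLOOP.
    ENDIF.
    RETURNCODE = 0.

The routine appends one row per employee to RESULT_TABLE instead of returning a single RESULT value, which is exactly what the "Return table" option changes in the generated routine interface.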

14. What are conversion routines for units and currencies in the update rule? Ans: Using this option we can write ABAP code for unit / currency conversion. If we enable this flag, then the unit of the key figure appears in the ABAP code as an additional parameter. For example, we can convert units in pounds to kilos (a minimal sketch of such a routine appears after question 31 below). 15. Can an InfoObject be an InfoProvider, how and why? Ans: Yes. When we want to report on characteristics or master data, we can make an InfoObject an InfoProvider. For example, we can make 0CUSTOMER an InfoProvider and report on it. We have to right-click on the InfoArea and select "Insert characteristic as data target". 16. What is Open Hub Service? Ans: The Open Hub Service (process) enables us to distribute data from an SAP BW system into external systems. In BW 3.5 an object called an InfoSpoke is used to send data from BW to external systems; in the BI 7 version we use an Open Hub Destination (object) for this purpose. 17. How much time does it take to extract 1 million (10 lakh) records into an InfoCube? Ans: This depends: if you have complex coding in the update rules it will take longer, or else it will take less than 30 minutes. 18. What is BW Statistics and what is its use? Ans: They are a group of Business Content InfoCubes which are used to measure performance for query and load monitoring, OLAP and warehouse management. There are some standard reports on these InfoCubes which provide information about the BW system performance as a whole. It also shows the usage of aggregates. 19. What are the delta options available when you load from a flat file? Ans: The 3 options for delta management with flat files: Full upload; New status for changed records (ODS object only); Additive delta (ODS object & InfoCube). 20. Can we make a datasource support delta? Ans: If this is a custom (user-defined) datasource you can make the datasource delta-enabled. While creating the datasource from RSO2, after entering the datasource name and pressing create, in the next screen there is a button at the top which says generic delta. Generic delta can be enabled based on: a) Time stamp b) Calendar day c) Numeric pointer, such as document number & counter. 21. What is ASAP Methodology? Ans: ASAP stands for Accelerated SAP, a methodology which is used to implement and deliver the project efficiently. The five stages of the ASAP methodology are: Project Preparation, Business Blueprint, Realization, Final Preparation & Go-Live and Support. 1. Project Preparation: In this phase the system landscape will be set up. 2. Business Blueprint: It is a detailed documentation of your company's requirements.

3. Realization: In this phase the implementation of the project takes place (development of objects etc.), and we are involved in the project from here onwards (i.e. the objects we need to develop are modified depending on the client's requirements). 4. Final Preparation: Final preparation before going live, i.e. testing, conducting pre-go-live activities, end user training etc. End user training is given at the client site, where you train the users how to work with the new environment. 5. Go-Live & Support: The project has gone live and is in production. The project team supports the end users, as they are new to the technology. 22. Some data is uploaded twice into the InfoCube. How do you correct it? Ans: Selective deletion if the data is already compressed, otherwise delete the duplicate request. 23. Can a number of datasources have one InfoSource? Ans: Yes, of course. For example, for loading texts and hierarchies we use different datasources but the same InfoSource which is already used for the attributes. 24. Can many InfoSources be assigned to one InfoProvider? Ans: Yes. 25. Can many transaction datasources be assigned to one InfoSource? Ans: No, it's a one-to-one assignment in this case. 26. Why do we have setup tables in LO extraction only but not in the other extractions? Ans: As the historical data volume in the case of the logistics modules (SD, MM and PP) is high, we have setup tables, which are intermediate tables of type transparent table. 27. What are the types of data update we have in BW? Ans: Full, Delta, Initialize delta (init), Repair full. 28. What is early delta? Ans: 29. When do we go for initialization without data transfer? Ans: 30. Why do we delete the setup tables (LBWG) & fill them (OLI*BW)? Ans: For the first time, when we are loading the historical data into the setup tables, as a precaution we delete the setup tables in order to avoid any junk data residing in them. Also, when a DataSource is enhanced, we go for deleting the setup tables followed by the statistical setup (filling the setup tables). Currency conversions can be written in update rules. Why not in transfer rules? Ans: Q) Difference between display attributes and navigational attributes? Ans: A display attribute is one which is used only for display purposes in the report, whereas a navigational attribute is used for drilling down in the report. We don't need to maintain a navigational attribute in the cube as a characteristic (that is the advantage) to drill down. 31) What are the different variables used in BEx? Ans: Different variables are: Texts, Formulas, Hierarchies, Hierarchy nodes & Characteristic values.
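Going back to question 14, here is a hedged sketch of a key figure routine with the unit flag enabled, so that UNIT is available as an additional changing parameter alongside RESULT; the COMM_STRUCTURE field names are illustrative assumptions only.

  * Sketch only: key figure routine with unit handling (BW 3.x update rules).
    IF COMM_STRUCTURE-UNIT = 'LB'.
      "convert pounds to kilograms
      RESULT = COMM_STRUCTURE-/BIC/ZWEIGHT * '0.453592'.
      UNIT   = 'KG'.
    ELSE.
      RESULT = COMM_STRUCTURE-/BIC/ZWEIGHT.
      UNIT   = COMM_STRUCTURE-UNIT.
    ENDIF.
    RETURNCODE = 0.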

32) What is the processing type of a variable? Ans: The way the variable gets its input value is called the processing type. We have the following five processing types: a) Manual entry / default value b) Replacement path c) SAP exit d) Customer exit e) Authorization. 33) How many levels can we go to in reporting? Ans: We can drill down to any level by using navigational attributes and jump targets (RRI). 34) What are indexes? Ans: Indexes are database indexes, which help in retrieving data faster. 35) What is the significance of KPIs? Ans: KPIs (Key Performance Indicators) indicate the performance of a company. These are key figures. 36) What types of partitioning are there for BW? Ans: There are two partitioning performance aspects for BW (Cube & PSA). 37) How can I compare data in R/3 with data in a BW cube after the daily delta loads? Are there any standard procedures for checking them or matching the number of records? Ans: You can go to R/3 T-Code RSA3 and run the extractor. It will give you the number of records extracted. Then go to the BW monitor to check the number of records in the PSA and check to see if it is the same; also check the monitor header tab. 38) What is the difference between writing a routine in transfer rules and writing a routine in update rules? Ans: If you are using the same InfoSource to update data in more than one data target, it is better to write it in the transfer rules, because you can assign one InfoSource to more than one data target, whereas whatever logic we write in the update rules is specific to one particular data target. 39) Can one InfoSource send data to multiple data targets? Ans: Yes. 40) What does the number in the 'Total' column in transaction RSA7 mean? Ans: The 'Total' column displays the number of LUWs that were written in the delta queue and that have not yet been confirmed. The number includes the LUWs of the last delta request (for repetition of a delta request) and the LUWs for the next delta request. A LUW only disappears from the RSA7 display when it has been transferred to the BW System and a new delta request has been received from the BW System.

Some real-time questions: 1) Name the two tables that provide detailed information about a datasource. 2) How and when can you control whether a repeat delta is requested? 3) How can you improve the performance of a query? 4) How do you prevent duplicate records at the data target level? 5) What is a virtual cube and what is its significance? 6) What are the different methods of creating a generic datasource? 7) How do you connect a new data target to an existing data flow? 8) What is partitioning? 9) What are SAP batch processes? 10) How do you improve InfoCube design performance? 12) Is there any difference between a repair run and a repair request? If yes, please explain in detail. 13) What is the difference between a process chain and an InfoPackage group, and what is the difference between partitioning and aggregates? Answers: Q 3) Query performance can be improved by building aggregates that contain all the characteristics & key figures used in the query. Q 5) Virtual cube: an InfoProvider with transaction data that is not stored in the object itself, but which is read directly for analysis and reporting purposes. The relevant data can be from the BI system or from other SAP or non-SAP systems. VirtualProviders only allow read access to data. Q 6) Different methods of creating a generic datasource using transaction RSO2: a) Extraction from DB table or view b) Extraction from SAP Query c) Extraction by function module. Q 1) Important BW datasource-relevant tables: ROOSOURCE (the header table for OLTP DataSources in the source system).

For Q 8) I think you mean table partitioning. You use partitioning to improve performance; you can only partition on 0CALMONTH or 0FISCPER. Generic extraction is done using: 1. Views 2. InfoSet queries 3. Function modules. Hi Santosh, please note down the Q&A below; these are some of the real-time questions. Q) Under which menu path is the Test Workbench to be found, including in earlier releases? The menu path is: Tools - ABAP Workbench - Test - Test Workbench. Q) I want to delete a BEx query that is in the production system through a request. Is anyone aware how to do it? A) Have you tried the RSZDELETE transaction? Q) Errors while monitoring process chains. A) In process chains you add many process types. During data loading, for example after loading data into an InfoCube, you roll up data into aggregates: this roll-up is a process type which you keep after the process type for loading data into the cube, and this rolling up into aggregates might fail. Another one is after you load data into an ODS, you activate the ODS data (another process type); this might also fail.

Q) In the Monitor → Details (Header/Status/Details), under Processing (data packet): Everything OK → context menu of Data Package 1 (1 record): Everything OK → Simulate update. (Here we can debug the update rules or transfer rules.) Then SM50 → Program/Mode → Program → Debugging, and debug this work process. Q) Can we make a datasource support delta? A) If this is a custom (user-defined) datasource you can make the datasource delta-enabled. While creating the datasource from RSO2, after entering the datasource name and pressing create, in the next screen there is a button at the top which says generic delta. Generic delta services support delta extraction for generic extractors according to: Time stamp, Calendar day, Numeric pointer (such as document number & counter). Only one of these attributes can be set as a delta attribute. Delta extraction is supported for all generic extractors, whether based on tables/views, SAP Query or function modules. The delta queue (RSA7) allows you to monitor the current status of the delta attribute. If you want more details about this, there is a chapter on it in the Extraction book; you will find it in the last pages. Q) PSA cleansing. A) You know how to edit the PSA. I don't think you can delete single records; you have to delete the entire PSA data for a request. Q) Workbooks, as a general rule, should be transported with the role. Here are a couple of scenarios:

1. If both the workbook and its role have been previously transported, then the role does not need to be part of the transport. 2. If the role exists in both dev and the target system but the workbook has never been transported, then an additional step will have to be taken after import: locate the WorkbookID via table RSRWBINDEXT (in dev, and verify the same exists in the target system) and proceed to manually add it to the role in the target system via transaction code PFCG -- ALWAYS use Ctrl-C / Ctrl-V copy/paste for manually adding! 3. If the role does not exist in the target system, you should transport both the role and the workbook. Keep in mind that a workbook is an object unto itself and has no dependencies on other objects. Thus, you do not receive an error message from the transport of 'just a workbook' -- even though it may not be visible, it will exist (verified via table RSRWBINDEXT). Overall, as a general rule, you should transport roles with workbooks, and then you have a choice of transporting the role (recommended) or just the workbook. Q) What are the five ASAP methodologies? A: Project Preparation, Business Blueprint, Realization, Final Preparation & Go-Live and Support. 1. Project Preparation: In this phase, decision makers define clear project objectives and an efficient decision-making process (i.e. discussions with the client, like what are his needs and requirements etc.). A project charter is issued and an implementation strategy is outlined in this phase. Project managers will be involved in this phase (I guess). Q) How much time does it take to extract 1 million (10 lakh) records into an InfoCube? A) This depends: if you have complex coding in update rules it will take longer, or else it will take less than 30 minutes.

2. production system Development system: All the implementation part is done in this sys. as they are new to the technology. Realization: In this only. End user training is given that is in the client site you train them how to work with the new environment. Business Blueprint: It is a detailed documentation of your company's requirements. (i. Production system: All the extraction part takes place in this sys. testing. Q) What is landscape of R/3 & what is landscape of BW. end user training etc. Q) How do you measure the size of infocube? . The Project team will be supporting the end users. modification etc) and from here the objects are transported to the testing system. what are the objects we need to develop are modified depending on the client's requirements). 3. conducting pre-go-live. 4.e. Go-Live & support: The project has gone live and it is into production.e. but before transporting an initial test known as Unit testing (testing of objects) is done in the development sys.e. (I. Analysis of objects developing. 5. Testing/Quality system: quality check is done in this system and integration testing is done. Final Preparation: Final preparation before going live i. Then Landscape of b/w: u have the development system. testing system. the implementation of the project takes place (development of objects etc) and we are involved in the project from here only.. Landscape of R/3 not sure.

Q) Difference between InfoCube and ODS? A: An InfoCube is structured as a star schema (extended), where a fact table is surrounded by different dimension tables that are linked with DIM IDs. An ODS is a flat structure (flat table) with no star-schema concept, which holds granular data (detailed level) and has overwrite functionality; an InfoCube has no overwrite functionality. Data-wise, you will have aggregated data in the cubes. Q) Difference between display attributes and navigational attributes? A: A display attribute is one which is used only for display purposes in the report, whereas a navigational attribute is used for drilling down in the report. We don't need to maintain a navigational attribute in the cube as a characteristic (that is the advantage) to drill down. Q) Some data is uploaded twice into the InfoCube. How to correct it? A: But how is that possible? If you load it manually twice, then you can delete it by request ID. Q) Can a number of datasources have one InfoSource? A) Yes, of course. For example, for loading texts and hierarchies we use different data sources but the same InfoSource. Q) Brief the data flow in BW. A) Data flows from the transactional system to the analytical system (BW). The DataSources on the transactional system need to be replicated on the BW side and attached to an InfoSource and update rules respectively. Flat-file datasources do not support 0RECORDMODE in extraction (the 0RECORDMODE values are: ' ' after image, 'X' before image, 'A' add/additive image, 'D' delete, 'R' reverse, 'N' new image). Q) Can you add a new field at the ODS level? Sure you can. An ODS is nothing but a table. Q) Currency conversions can be written in update rules. Why not in transfer rules?

Q) Where is the PSA data stored? In the PSA table. Q) What is the significance of an ODS? It holds granular data (detailed level). Q) What is data size? The volume of data one data target holds (in number of records). Q) What is the procedure to update data into data targets? FULL and DELTA. Q) Different types of InfoCubes? Basic, and Virtual (remote, SAP remote and multi). A virtual cube is used, for example, if you consider railway reservations: all the information has to be updated online. A virtual cube is like a structure; whenever the table is updated, the virtual cube fetches the data from the table and displays the report online. For designing a virtual cube you have to write the function module that links it to the table. FYI, you will get the information here: https://www.sdn.sap.com/sdn/index.sdn - search for Designing Virtual Cube and you will get good material on designing the function module. Q) As we use SBWNN, SBIW1, SBIW2 for delta update in LIS, then what is the procedure in the LO Cockpit? There is no LIS in the LO Cockpit. We will have DataSources, and they can be maintained (append fields). Refer to the white paper on LO Cockpit extractions. Q) Why do we delete the setup tables (LBWG) & fill them (OLI*BW)? A) Initially we don't delete the setup tables, but when we make a change to the extract structure we go for it. We are changing the extract structure, which means there are some newly added fields that were not there before. So to get the required data (i.e. only the data which is required, avoiding redundancy) we delete and then refill the setup tables. This also refreshes the statistical data. The data is stored in cluster tables, from where it is read when the initialization is run. It is important that during the initialization phase no one generates or modifies application data, at least until the tables can be set up. The extraction setup reads the dataset that you want to process (such as customer orders, with tables like VBAK, VBAP) & fills the relevant communication structure with the data.

Q) INFOSET QUERY? A) Can be made of ODSs and characteristic InfoObjects with master data.
Q) IF THERE ARE 2 DATASOURCES, HOW MANY TRANSFER STRUCTURES ARE THERE, IN R/3 OR IN BW? A) 2 in R/3 and 2 in BW.
Q) BRIEF SOME STRUCTURES USED IN BEX. A) Rows and Columns; you can create structures there.
Q) WHAT ARE THE DIFFERENT VARIABLES USED IN BEX? A) The different variables are: Texts, Formulas, Hierarchies, Hierarchy nodes and Characteristic values. The variable types are: Manual entry / default value, Replacement path, SAP exit, Customer exit and Authorization.
Q) HOW MANY LEVELS CAN YOU GO IN REPORTING? A) You can drill down to any level by using navigational attributes and jump targets.
Q) PROCESS CHAINS: IF YOU HAVE USED THEM, HOW WILL YOU SCHEDULE DATA DAILY? A) There should be some tool to run the job daily (SM37 jobs), etc.
Q) TOOLS USED FOR PERFORMANCE TUNING? A) ST22, number ranges, delete indexes before load.
Q) What types of partitioning are there for BW? A) There are two partitioning performance aspects for BW (Cube & PSA): (a) Query data retrieval performance improvement: partitioning by (say) date range improves data retrieval by making best use of database [data range] execution plans and indexes (of, say, the Oracle database engine). (b) Transactional load partitioning improvement: partitioning based on expected load volumes and data element sizes improves data loading into the PSA and Cubes by InfoPackages (e.g. without timeouts).

Q) How can I compare data in R/3 with data in a BW Cube after the daily delta loads? Are there any standard procedures for checking them or matching the number of records? A) You can go to R/3 TCode RSA3 and run the extractor; it will give you the number of records extracted. Then go to the BW Monitor to check the number of records in the PSA and check whether it is the same (also check the monitor header tab). RSA3 is a simple extractor checker program that allows you to rule out extract problems in R/3. It is simple to use, but it only really tells you if the extractor works: go to RSA3, enter the extractor (e.g. 2LIS_02_HDR), click execute, and you will see the record count; you can also display that data. You are not modifying anything, so what you do in RSA3 has no effect on data quality afterwards. Since records that get updated into Cubes/ODS structures are controlled by update rules, RSA3 will not tell you how many records should be expected in BW for a given load. You have that information in the monitor RSMO during and after data loads: from RSMO, for a given load you can determine how many records were passed through the transfer rules from R/3, how many targets were updated, and how many records passed through the update rules; it also gives you error messages from the PSA. However, if you are looking for a specific record, you will not be able to determine what is in the Cube compared to what is in the R/3 environment; you will need to compare records on a 1:1 basis against records in R/3 transactions for the functional area in question. I would recommend enlisting the help of the end user community to assist, since they presumably know the data.
Q) What is the difference between writing a routine in transfer rules and writing a routine in update rules? A) If you are using the same InfoSource to update data in more than one data target, it is better to write the routine in the transfer rules, because you can assign one InfoSource to more than one data target, whereas whatever logic you write in the update rules is specific to one particular data target.
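To make the transfer-rule option above concrete, here is a minimal sketch of the body of a characteristic routine in the transfer rules (BW 3.x). The surrounding FORM ... ENDFORM interface is generated by BW; the source field /BIC/ZZCUSTNAME is a hypothetical transfer structure field, used only to illustrate the uppercase conversion mentioned later in this document.

  * Body of a transfer rule routine for one characteristic (BW 3.x).
  * TRAN_STRUCTURE, RESULT, RETURNCODE and ABORT are provided by the
  * generated FORM; /BIC/ZZCUSTNAME is a hypothetical source field.
    RESULT = TRAN_STRUCTURE-/bic/zzcustname.
    TRANSLATE RESULT TO UPPER CASE.   "normalize to uppercase
    RETURNCODE = 0.                   "0 = record is OK
    ABORT = 0.                        "<> 0 would abort the whole data package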

Q) Routine with Return Table? A) Update rules generally only have one return value. However, by choosing the checkbox "Return table", you can create a routine in the tab strip for key figure calculation; the corresponding key figure routine then no longer has a return value, but a return table. You can then generate as many key figure values as you like from one data record.
Q) Start routines? A) Start routines can be written in both update rules and transfer rules. For example, suppose you want to restrict (delete) some records based on conditions before they get loaded into the data targets; you can specify this in the start routine of the update rules. A statement like DELETE DATA_PACKAGE with a WHERE condition deletes records based on that condition.
Q) X & Y Tables? A) X-table = a table to link material SIDs with SIDs for time-independent navigation attributes. Y-table = a table to link material SIDs with SIDs for time-dependent navigation attributes. There are four types of SID tables: X (time-independent navigational attribute SID tables), Y (time-dependent navigational attribute SID tables), H (hierarchy SID tables) and I (hierarchy structure SID tables).
Q) Filters & Restricted Key Figures (real time example)? A) Restricted key figures you can have for an SD cube: billed quantity, billing value, no. of billing documents as RKFs.
Q) Line-Item Dimension (give me a real time example)? A) Invoice no. or document no. is a real-time example.
Q) What does the number in the 'Total' column in Transaction RSA7 mean? A) The 'Total' column displays the number of LUWs that were written to the delta queue and that have not yet been confirmed. The number includes the LUWs of the last delta request (for repetition of a delta request) and the LUWs for the next delta request. A LUW only disappears from the RSA7 display when it has been transferred to the BW System and a new delta request has been received from the BW System.
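Referring back to the "Routine with Return Table" question above, the following is a minimal sketch of the body of such a key figure routine (BW 3.x). COMM_STRUCTURE, ICUBE_VALUES and RESULT_TABLE are the structures the routine editor provides, as also described later in this document; the key figure and source fields are hypothetical.

  * Body of a key figure routine with "Return table" checked: several
  * key figure records are generated from one source record.
  * /BIC/ZQTY_OPEN, /BIC/ZQTY_ORD and /BIC/ZQTY_DEL are hypothetical fields.
    DATA: ls_result LIKE LINE OF RESULT_TABLE.

    REFRESH RESULT_TABLE.
    MOVE-CORRESPONDING ICUBE_VALUES TO ls_result.   "take over the characteristics
    ls_result-/bic/zqty_open = COMM_STRUCTURE-/bic/zqty_ord.
    APPEND ls_result TO RESULT_TABLE.               "first generated record
    ls_result-/bic/zqty_open = COMM_STRUCTURE-/bic/zqty_del.
    APPEND ls_result TO RESULT_TABLE.               "second generated record
    RETURNCODE = 0.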

Q) In which table (in SAP BW) can I find the technical name / description and creation data of a particular report (reports created using BEx Analyzer)? A) There is no single such table in BW; while you are opening a particular query, press the properties button and you will see all the details you want. You will, however, find information about technical names and descriptions of queries in the following tables: Directory of all reports (table RSRREPDIR) and Directory of the reporting component elements (table RSZELTDIR); for workbooks and the connections to queries, check the where-used list for reports in workbooks (table RSRWORKBOOK) and the titles of Excel workbooks in the InfoCatalog (table RSRWBINDEXT).
Q) What is a LUW in the delta queue? A) A LUW, from the point of view of the delta queue, can be an individual document, a group of documents from a collective run, or a whole data packet of an application extractor.
Q) Why does the number in the 'Total' column in the overview screen of Transaction RSA7 differ from the number of data records that is displayed when you call the detail view? A) The number on the overview screen corresponds to the total of LUWs (see also the first question) that were written to the qRFC queue and that have not yet been confirmed. The detail screen displays the records contained in the LUWs. Both the records belonging to the previous delta request and the records that do not meet the selection conditions of the preceding delta init requests are filtered out; thus, only the records that are ready for the next delta request are displayed on the detail screen. In the detail screen of Transaction RSA7, a possibly existing customer exit is not taken into account.
Q) Why does Transaction RSA7 still display LUWs on the overview screen after successful delta loading? A) Only when a new delta has been requested does the source system learn that the previous delta was successfully loaded to the BW System. Then the LUWs of the previous delta may be confirmed (and also deleted). In the meantime, the LUWs must be kept for a possible delta request repetition. In particular, the number on the overview screen does not change when the first delta is loaded to the BW System.

Q) Why are selections not taken into account when the delta queue is filled? A) Filtering according to selections takes place when the system reads from the delta queue. This is necessary for reasons of performance. Q) Why is there a DataSource with '0' records in RSA7 if delta exists and has also been loaded successfully? It is most likely that this is a DataSource that does not send delta data to the BW System via the delta queue but directly via the extractor (delta for master data using ALE change pointers). Such a DataSource should not be displayed in RSA7. This error is corrected with BW 2.0B Support Package 11. Q) Do the entries in table ROIDOCPRMS have an impact on the performance of the loading procedure from the delta queue? A) The impact is limited. If performance problems are related to the loading process from the delta queue, then refer to the application-specific notes (for example in the CO-PA area, in the logistics cockpit area and so on).

Caution: As of Plug In 2000.2 patch 3, the entries in table ROIDOCPRMS are as effective for the delta queue as for a full update. Please note, however, that LUWs are not split during data loading for consistency reasons. This means that when very large LUWs are written to the DeltaQueue, the actual package size may differ considerably from the MAXSIZE and MAXLINES parameters. Q) Why does it take so long to display the data in the delta queue (for example approximately 2 hours)? A) With Plug In 2001.1 the display was changed: the user has the option of defining the amount of data to be displayed, restricting it, selectively choosing the number of a data record, making a distinction between the 'actual' delta data and the data intended for repetition, and so on. Q) What is the purpose of the function 'Delete data and meta data in a queue' in RSA7? What exactly is deleted? A) You should act with extreme caution when you use the deletion function in the delta queue. It is comparable to deleting an InitDelta in the BW System and should preferably be executed there. You not only delete all data of this DataSource for the affected BW System, but also lose all the information concerning the delta initialization. You can then only request new deltas after another delta initialization.

When you delete the data, the LUWs kept in the qRFC queue for the corresponding target system are confirmed. Physical deletion only takes place in the qRFC outbound queue if there are no more references to the LUWs. The deletion function is for example intended for a case where the BW System, from which the delta initialization was originally executed, no longer exists or can no longer be accessed. Q) Why does it take so long to delete from the delta queue (for example half a day)? A) Import PlugIn 2000.2 patch 3. With this patch the performance during deletion is considerably improved. Q) Why is the delta queue not updated when you start the V3 update in the logistics cockpit area? A) It is most likely that a delta initialization had not yet run or that the delta initialization was not successful. A successful delta initialization (the corresponding request must have QM status 'green' in the BW System) is a prerequisite for the application data being written in the delta queue. Q) What is the relationship between RSA7 and the qRFC monitor (Transaction SMQ1)? A) The qRFC monitor basically displays the same data as RSA7. The internal queue name must be used for selection on the initial screen of the qRFC monitor. This is made up of the prefix 'BW, the client and the short name of the DataSource. For DataSources whose name are 19 characters long or shorter, the short name corresponds to the name of the DataSource. For DataSources whose name is longer than 19 characters (for delta-capable DataSources only possible as of PlugIn 2001.1) the short name is assigned in table ROOSSHORTN. In the qRFC monitor you cannot distinguish between repeatable and new LUWs. Moreover, the data of a LUW is displayed in an unstructured manner there. Q) Why are the data in the delta queue although the V3 update was not started? A) Data was posted in background. Then, the records are updated directly in the delta queue (RSA7). This happens in particular during automatic goods receipt posting (MRRS). There is no duplicate transfer of records to the BW system. See Note 417189.

Q) Why does button 'Repeatable' on the RSA7 data details screen not only show data loaded into BW during the last delta but also data that were newly added, i.e. 'pure' delta records? A) Was programmed in a way that the request in repeat mode fetches both actually repeatable (old) data and new data from the source system. Q) I loaded several delta inits with various selections. For which one is the delta loaded? A) For delta, all selections made via delta inits are summed up. This means, a delta for the 'total' of all delta initializations is loaded. Q) How many selections for delta inits are possible in the system? A) With simple selections (intervals without complicated join conditions or single values), you can make up to about 100 delta inits. It should not be more. With complicated selection conditions, it should be only up to 10-20 delta inits. Reason: With many selection conditions that are joined in a complicated way, too many 'where' lines are generated in the generated ABAP source code that may exceed the memory limit. Q) I intend to copy the source system, i.e. make a client copy. What will happen with may delta? Should I initialize again after that? A) Before you copy a source client or source system, make sure that your deltas have been fetched from the DeltaQueue into BW and that no delta is pending. After the client copy, an inconsistency might occur between BW delta tables and the OLTP delta tables as described in Note 405943. After the client copy, Table ROOSPRMSC will probably be empty in the OLTP since this table is client-independent. After the system copy, the table will contain the entries with the old logical system name that are no longer useful for further delta loading from the new logical system. The delta must be initialized in any case since delta depends on both the BW system and the source system. Even if no dump 'MESSAGE_TYPE_X' occurs in BW when editing or creating an InfoPackage, you should expect that the delta have to be initialized after the copy. Q) Is it allowed in Transaction SMQ1 to use the functions for manual control of processes? A) Use SMQ1 as an instrument for diagnosis and control only. Make changes to BW queues only after informing the BW Support or only if this is explicitly requested in a note for

component 'BC-BW' or 'BW-WHM-SAPI'.
Q) Despite my deleting the delta init, LUWs are still written into the DeltaQueue? A) In general, delta initializations and deletions of delta inits should always be carried out at a time when no posting takes place. Otherwise, buffer problems may occur: if a user started an internal mode at a time when the delta initialization was still active, he/she posts data into the queue even though the initialization has been deleted in the meantime.
Q) Despite the delta request being started after completion of the collective run (V3 update), it does not contain all documents. What is the cause for this "splitting"? A) The collective run submits the open V2 documents for processing to the task handler, which processes them in one or several parallel update processes in an asynchronous way. For this reason, plan a sufficiently large "safety time window" between the end of the collective run in the source system and the start of the delta request in BW; otherwise, only another delta request loads the missing documents into BW. An alternative solution where this problem does not occur is described in Note 505700.
Q) In SMQ1 (qRFC Monitor) I have status 'NOSEND'. In the table TRFCQOUT, some entries have the status 'READY', others 'RECORDED'; ARFCSSTATE is 'READ'. What do these statuses mean? Which values in the field 'Status' mean what, which values are correct and which are alarming? Are the statuses BW-specific or generally valid in qRFC? A) Tables TRFCQOUT and ARFCSSTATE: status READ means that the record was read once, either in a delta request or in a repetition of the delta request; however, this does not mean that the record has successfully reached BW yet. The status READY in TRFCQOUT and RECORDED in ARFCSSTATE means that the record has been written into the DeltaQueue and will be loaded into BW with the next delta request or a repetition of a delta. This is the case in your system. In any case, only the statuses READ, READY and RECORDED in both tables are considered to be valid. The status EXECUTED in TRFCQOUT can occur temporarily; it is set before starting a delta extraction for all records with status READ present at that time. The records with status EXECUTED are usually deleted from the queue in packages within a delta request directly after setting the status, before a new delta is extracted. If you see such records, it means that either a process which is confirming and deleting records already loaded into BW is running at the moment, or, if the records remain in the table for a longer period of time with status EXECUTED, it is likely that there are problems with deleting the records which have already been successfully loaded into BW; in this state, no more deltas are loaded into BW. Every other status is an indicator for an error or an inconsistency.

NOSEND in SMQ1 means nothing (see Note 378903). The value 'U' in field 'NOSEND' of table TRFCQOUT is discomforting, however, which is of no practical importance.
Q) How and where can I control whether a repeat delta is requested? A) Via the status of the last delta in the BW Request Monitor. If the request is RED, the next load will be of type 'Repeat'. If you need to repeat the last load for certain reasons, set the request in the monitor to red manually. For the contents of the repeat, see Question 14. Delta requests set to red despite the data already being updated lead to duplicate records in a subsequent repeat, if they have not been deleted from the data targets concerned before.
Q) The extract structure was changed when the DeltaQueue was empty. Afterwards new delta records were written to the DeltaQueue. When loading the delta into the PSA, it shows that some fields were moved. The same result occurs when the contents of the DeltaQueue are listed via the detail display. Why are the data displayed differently? What can be done? A) Make sure that the change of the extract structure is also reflected in the database and that all servers are synchronized. We recommend resetting the buffers using Transaction $SYNC. If the extract structure change is not communicated synchronously to the server where delta records are being created, the records are written with the old structure until the new structure has been generated. This may have disastrous consequences for the delta; in this case, the delta needs to be re-initialized.
Q) As of PI 2003.1, the Logistic Cockpit offers various types of update methods. Which update method is recommended in logistics? According to which criteria should the decision be made? How can I choose an update method in logistics? A) See the recommendation in Note 505700.
Q) Are there particular recommendations regarding the data volume the DeltaQueue may grow to without facing the danger of a read failure due to memory problems? A) There is no strict limit (except for the restricted number range of the 24-digit QCOUNT counter in the LUW management table and the restrictions regarding the volume and number of records in a database table).

That limit is of rather small practical importance, since a comparable limit already applies when writing to the DeltaQueue. When estimating "smooth" limits, both the number of LUWs and the average data volume per LUW are important. To avoid memory problems, we recommend bundling data (usually documents) already when writing to the DeltaQueue, to keep the number of LUWs small (partly this can be set in the applications, e.g. in the Logistics Cockpit). The data volume of a single LUW should not be considerably larger than 10% of the memory available to the work process for data extraction (in a 32-bit architecture with a memory volume of about 1 GByte per work process, 100 MBytes per LUW should not be exceeded); if this limit is observed, correct reading is guaranteed in most cases. But for other, BW-specific, reasons a program-internal limit additionally ensures that never more than 1 million LUWs are read and fetched from the database per DeltaRequest; if this limit is reached within a request, the DeltaQueue must be emptied by several successive DeltaRequests. We recommend, however, trying not to reach that limit, but triggering the fetching of data from the connected BWs already when the number of LUWs reaches a 5-digit value. If the number of LUWs cannot be reduced by bundling application transactions, you should at least make sure that the data are fetched from all connected BWs as quickly as possible; as a rule, the frequency should not be higher than one DeltaRequest per hour.
Q) Can we filter the fields at Transfer Structure?
Q) Can we load data directly into an InfoObject without extraction - is it possible?
Q) I would like to display the date the data was uploaded on the report. Usually we load the transactional data nightly. Is there any easy way to include this information on the report for users, so that they know the validity of the report? A) If I understand your requirement correctly, you want to display the date on which data was loaded into the data target from which the report is being executed. If so, configure your workbook to display the text elements in the report. This displays the relevance-of-data field, which is the date on which the data load took place.

Q) WHAT IS TRANSACTIONAL CUBE? A) Transactional InfoCubes differ from standard InfoCubes in that the former have an improved write access performance level. IF WE R SHEDULED DAILY. Standard Basic cubes are not suitable for this. the transactional InfoCube was developed to meet the demands of SAP Strategic Enterprise Management (SEM). Q) HOW CAN U ANALIZE THE PROJECT AT FIRST? Prepare Project Plan and Environment Define Project Management Standards and Procedures Define Implementation Standards and Procedures Testing & Go-live + supporting. a) We can set the time. . THROUGH WHICH NETWORK. data is written to the InfoCube (possibly by several users at the same time) and re-read as soon as possible. VPN is nothing but one sort of network where we can connect to the client systems sitting in offshore through RAS (Remote access server). Q) HOW MANY DAYS CAN WE KEEP THE DATA IN PSA. WEEKLY AND MONTHLY. Q) HOW CAN U GET THE DATA FROM CLIENT IF U R WORKING ON OFFSHORE PROJECTS. a) VPN…………….Virtual Private Network. Standard InfoCubes are technically optimized for read-only access and for a comparatively small number of simultaneous accesses. We can copy from other infoobject if it is same. WE SEND DATA AT TIME TO ALL CUBES IF ONE CUBE GOT LOCK ERROR.Yes. HOW CAN U RECTIFY THE ERROR? Go to TCode sm66 then see which one is locked select that pid from there and goto sm12 TCode then unlock it this is happened when lock errors are occurred when u scheduled. meaning that. Instead. Q) Can anybody tell me how to add a navigational attribute in the BEx report in the rows? A) Expand dimension under left side panel (that is infocube panel) select than navigational attributes drag and drop under rows panel. Q) THERE is one ODS AND 4 INFOCUBES. We load data from PSA if it is already in PSA.

Q) Wondering how I can get the values: for example, if I run a report for the month range 01/2004 - 10/2004, the monthly value should actually be divided by the number of months that I selected. Which variable should I use?
Q) Why is it that every time I switch from InfoProvider to InfoObject, or from one item to another while in modeling, I always get the message "Reading Data" or "constructing workbench" and it runs for minutes? Any way to stop this?
Q) Can anyone give me info on how the BW delta works? I would also like to know about 'before image and after image'; I am currently in a BW project and have to write start routines for delta loads.
Q) Is there any way to delete cube contents within update rules from an ODS data source? The reason for this would be to delete (or zero out) a cube record in an "Open Order" cube if the open order quantity was 0. I've tried using the 0RECORDMODE but that doesn't work. Also, would it be easier to write a program that would be run after the load and delete the records with a zero open quantity? A) Yes, you can do it, but it is not "deleting cube contents with update rules"; it is only possible to avoid that some content is updated into the InfoCube, using the start routine. In the start routine of the update rules you can write ABAP code: loop at all the records and delete the records that match the condition "the open order quantity was 0" (a minimal sketch follows below). You also have to think about before and after images in the case of a delta upload: in that case you may delete the change record and keep the old one, so that after the change you have the wrong information.
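A minimal sketch of the start routine described in the answer above (BW 3.x update rules). DATA_PACKAGE is the table provided by the generated start routine; /BIC/ZOPENQTY is a hypothetical key figure holding the open order quantity.

  * Start routine body: drop records whose open order quantity is zero
  * before they reach the InfoCube.
    DELETE DATA_PACKAGE WHERE /bic/zopenqty = 0.

  * The same thing written as an explicit loop, in case further checks
  * on each record are needed:
    LOOP AT DATA_PACKAGE.
      IF DATA_PACKAGE-/bic/zopenqty = 0.
        DELETE DATA_PACKAGE.        "deletes the current record
      ENDIF.
    ENDLOOP.

    ABORT = 0.                      "0 = continue with the reduced data package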

Q) I am very new to BW. I would like to clarify a doubt regarding the delta extractor. If I am correct, by using delta extractors the data that has already been scheduled will not be uploaded again. Say, for a specific scenario, Sales: I have uploaded all the sales orders created till yesterday into the cube. Now say I make changes to any of the open records which were already uploaded. What happens when I schedule it again? Will the same record be uploaded again with the changes, or will the changes get applied to the previous record?
Q) In BW we need to write ABAP routines. I wish to know when and what type of ABAP routines we have to write. Also, are these routines written in update rules? I would be glad if this could be clarified with real-time scenarios and a few examples. A) We write our routines in the start routines of the update rules or in the transfer structure (you can choose between writing them in the start routines or directly behind the individual characteristics). In the transfer structure you just click on the yellow triangle behind a characteristic and choose "routine"; in the update rules you can choose "start routine" or click on the triangle with the green square behind an individual characteristic. Usually we only use a start routine when it does not concern one single characteristic (for example when you have to read the same table for 4 characteristics). We used ABAP routines, for example: to convert to uppercase (transfer structure); to convert values out of a third-party tool with different keys into the same keys as our SAP system uses (transfer structure); to select only a part of the data from an InfoSource when updating the InfoCube (start routine); etc. I hope this helps.
Q) What does an InfoCube contain? A) Each InfoCube has one FactTable & a maximum of 16 dimensions (13 user-defined + 3 system-defined: time, unit & data packet).
Q) What does the FACT Table contain?

A) A FactTable consists of key figures. Each fact table can contain a maximum of 233 key figures.
Q) What does the ATTRIBUTE table contain? A) Master attribute data.
Q) What does the TEXT table contain? A) Master text data: short text, medium text, long text & the language key if it is language dependent.
Q) What does the Hierarchy table contain? A) Master hierarchy data.
Q) What does the SID table contain? A) SID keys linked with the dimension tables & the master data tables (attributes, texts, hierarchies).
Q) How many dimensions are in a CUBE? A) 16 dimensions (13 user-defined & 3 system pre-defined [time, unit & data packet]).
Q) What is the advantage of the extended STAR schema? Differences between STAR schema & extended schema? A) In a STAR SCHEMA, a FACT table is in the center, surrounded by dimension tables, and the dimension tables contain the master data. In the extended schema the dimension tables do not contain master data; instead it is stored in master data tables divided into attributes, texts & hierarchies. These master data and dimension tables are linked with each other via SID keys. Master data tables are independent of the InfoCube & reusable in other InfoCubes. A dimension can contain up to 248 freely available characteristics.
Q) Do data packets exist even if you don't enter the master data (when created)?
Q) Where in BW do you go to add a character like \ or # so that BW will accept it? This is transaction data which loads fine into the PSA but not the data target. A) Check transaction SPRO, then click the "Goggles" button => Business Information Warehouse => Global Settings => 2nd point in the list. (I hope you can use my "guide"; my BW is in German, so I don't know all the English descriptions.)

Q) How would we delete the data in ODS? A) By request IDs.Q) When are Dimension ID's created? A) When Transaction data is loaded into InfoCube. Q) When are SID's generated? A) When Master data loaded into Master Tables (Attr. Q) Different types of Attributes? A) Navigational attribute. Q) How would we delete the data in change log table of ODS? A) Context menu of ODS â†‟ Manage â†‟ Environment â†‟ change log entries. Q) Display attributes? A) You can show DISPLAY attributes in a report. which are used only for displaying. Data packet … Q) Partitioning possible for ODS? A) No. Selective deletion & change log entry deletion. Transitive attributes. Text. It's possible only for Cube. Q) Why partitioning? A) For performance tuning. Currency attributes. Q) What are the extra fields does PSA contain? A) (4) Record id. on general tab checked as attribute only. Display attributes. Q) How does u recognize an attribute whether it is a display attribute or not? A) In Edit characteristics of char. Compounding attributes. Q) Compounding attribute? A) Q) Time dependent attributes? A) Q) Currency attributes? . Q) Transitive Attributes? A) Navigational attributes having nav attr…these nav attrs are called transitive attrs Q) Navigational attribute? A) Are used for drill down reporting (RRI). Q) Have you ever tried to load data from 2 InfoPackages into one cube? A) Yes. Time dependent attributes. Hierarchies).

Q) Authorization relevant object. Why is authorization needed? A)
Q) How do we convert a master data InfoObject to a data target? A) InfoArea > InfoProvider (context menu) > Insert characteristic data as DataTarget.
Q) Steps in LIS Extraction? A)
Q) Steps in LO Extraction? A) * Maintain extract structures (R/3) * Maintain DataSources (R/3) * Activate extract structures (R/3) * Delete setup tables / set up extraction (R/3) * Replicate DataSources in BW * Assign InfoSources * Maintain communication structures / transfer rules * Maintain InfoCubes & update rules * InfoPackage for the delta initialization * Set up periodic V3 update * InfoPackage for delta uploads.
Q) Steps in FlatFile Extraction? A)
Q) Different delta types in LO? A) Direct Delta, Queued Delta, Unserialized V3 update, Serialized V3 update. Direct Delta: with every document posted in R/3, the extraction data is transferred directly into the BW delta queue; each document posting with delta extraction becomes exactly one LUW in the corresponding delta queue. Queued Delta: the extraction data from the application is collected in an extraction queue instead of as update data, and can be transferred to the BW delta queue by an update collective run, as in the V3 update.
Q) How do we load the data if a flat file consists of both master and transaction data? A) Using the flexible update method while creating the InfoSource.
Q) What does the LO Cockpit contain?

A) * Maintaining extract structures * Maintaining DataSources * Activating updates * Controlling updates.
Q) SM37 - Scheduling background jobs.
Q) RSKC - Character permit checker (maintain the permitted extra characters in BW).
Q) ST22 - Checking short dumps.
Q) LBWE - TCode for Logistics extractors (LO Cockpit).
Q) RSA7 - Delta Queue (allows you to monitor the current status of the delta queue).
Q) RSA3 - Extract checker.
Q) LBWG - Delete set-up tables in LO's.
Q) OLI*BW - Fill set-up tables.
Q) RSO2 - Maintaining generic DataSources.
Q) RSA6 - Maintain DataSources.
Q) MC21 - Creating user-defined Information Structures for LIS (an Information Structure corresponds to an InfoSource in SAP BW).
Q) MC24 - Creating updating rules for LIS Information Structures.
Q) LBW0 - TCode for LIS.
Q) RSDBC - DB Connect.
Q) RSMO - Monitoring of data loads.
Q) RSCUSTV6 - Partitioning of PSA.
Q) RSRT - Query monitor.
Q) RSRV - Analysis and repair of BW objects.
Q) RRMX - BEx Analyzer.
Q) RSBBS - Report-to-Report Interface (RRI).
Q) SPRO - IMG (to make configurations in BW).
Q) RSDDV - Maintaining aggregates.
Q) RSDCUBEM - Delete or change an InfoCube.
Q) RSD5 - Data packet characteristics maintenance.
Q) RSBOH1 - Open Hub Service: create an InfoSpoke.
Q) PFCG - Role maintenance; assign users to these roles.
Q) SE03 - Changeability of the BW namespace.

Q) What are Process Types & Process variant? A) Process types are General services. Process chains nothing but grouping processes.Q) RSMONMESS -.Table to find out delta update methods.Definition Q) CMOD . .e. Process variant (start variant) is the place the process type knows when & where to start. Q) Difference between MasterData & Transaction InfoPackage? A) 5 tabs in Masterdata & 6 tabs in Transaction data. here we specify when should the process chain start by giving date and time or if you want to start immediately Some of theses processes trigger an event of their own that in-turn triggers other processes.ABAP Dictionary Q) SE09 . Data Target Administration. Load Process & subsequent processing. is a sequence of processes scheduled in the background & waiting to be triggered by a specific event. Init Delta Update & Delta Update. Q) ROOSOURCE .Program Compare Q) SE11 . the extra tab in Transaction data is DATA TARGETS."Messages for the monitor" table. Q) Types of Updates? A) Full Update.Implementation guide Q) Statistical Update? A) Q) What are Process Chains? A) TCode is RSPC. Q) RODELTAM . Ex: . Reporting agent & Other BW services. before image & after image) Q) SMOD .Transport Organizer (Extended View) Q) SBIW .Transport Organizer (workbench organizer) Q) SE10 .Project Management enhancing Q) SPAU .Finding for modes of records (i. Process variant (start variant) is the place the process chain knows where to start. There should be min and max one start variant in each process chain.Start chain â†‟ Delete BCube indexes â†‟ Load data from the source system to PSA â†‟ Load data from PSA to DataTarget ODS â†‟ Load data from ODS to BCube â†‟ Create Indexes for BCube after loading data â†‟ Create database statistics â†‟ Roll-Up data into the aggregate â†‟ Restart chain from beginning.

you will replicate the datasources from 2nd R/3 system into 2nd BW system. Third. Always Update data. You have to send your extractors first to the corresponding R/3 Q Box and replicate that to BW. Then you have to do this transport in BW. hide fields -. testing and then production Q) Functionality of InitDelta & Delta Updates? A) Q) What is Change-run ID? .These fields are not transferred to BW transfer structure.. Development. you have to replicate the datasources into the 2nd BW system…and then transport BW objects. This is only possible when we use MM & SD modules.and nullifying the value.The only purpose is when we check this column. even if no master data exits for the data)? A) Because while loading the data it has to create SID keys for transaction data. SELECT fields & CANCELLATION fields? A) Selection fields-. Q) For what we use HIDE fields. Q) InfoPackage groups? A) Q) Explain the status of records in Active & change log tables in ODS when modified in source system? a) Q) Why it takes more time while loading the transaction data even to load the transaction without master data (we check the checkbox. the field will appear in InfoPackage Data selection tab. First you will transport all the datasources from 1st R/3 system to 2nd R/3 System.It will reverse the posted documents of keyfigures of customer defined by multiplying it with -1..Q) For Full update possible while loading data from R/3? A) InfoPackage â†‟ Scheduler â†‟ Repair Request flag (check). A) When it comes to transporting for R/3 and BW. Cancellation . Second. I think this is reverse posting Q) Transporting. u should always transport all the R/3 Objects first………once you transport all the R/3 objects to the 2nd system. you will transport all the BW Objects from 1st BW system to 2nd BW system.

Q) Variable processing types in Reporting? A) Manual entry / default value, Replacement path, SAP Exit, Customer Exit, Authorization.
Q) Why do we use the RSR00001 enhancement? A) For enhancing the customer exit in reporting.
Q) Variables in Reporting? A) Characteristic values, Text, Hierarchies, Hierarchy nodes & Formula elements.
Q) Difference between Filters & Conditions? A)
Q) What is NODIM? A) It strips the unit/dimension from a key figure; for example, it converts 5 lts + 5 kgs into 10.
Q) What are Exceptions for? How can I get the PINK color? A)
Q) Currency conversions? A)
Q) Difference between Calculated Key Figure & Formula? A)
Q) When does a transfer structure contain more fields than the communication structure of an InfoSource? A) If we use a routine to fill a field of the communication structure from several fields of the transfer structure, the transfer structure may contain more fields than the communication structure. The total number of InfoObjects in the communication structure & extract structure may be different, since InfoObjects can be copied to the communication structure from all the extract structures.
Q) What is the use of Filters? A) Filters restrict data.
Q) What is the use of Conditions? A) To retrieve data based on particular conditions, like less than, greater than, less than or equal, etc.
Q) What is the PSA, and what is the technical name of the PSA? Uses? A) It is used for cleansing purposes; at this stage we can load directly from the PSA without extracting from R/3 again, for example when we want to delete the data in an InfoProvider and then re-load it.

a DataSource in an application system is selected. and is used by SAP Query as a source data. You can also drill-through to BEx queries and InfoSet Queries from a second BW system.A) To differentiate values with colors. The data sources specified in the InfoSet form the basis of the InfoSet Query. table. they need authorization. __You define the data sources in an InfoSet. Q) Can Favorites accessed by other users? a) No. The InfoSet Query functions allow you to report using flat data tables (master data reporting). These can be connected using joins. table join. An InfoSet can contain data from one or more tables that are connected to one another by key fields. When you create an InfoSet. that is Connected as a data mart. • Collective update (V3 update) . using one or more ODS objects or InfoObjects. and sequential file. InfoSets determine the tables or fields in these tables that can be referenced by a report. SAP Query includes a component for maintaining InfoSets. by adding relevant colors u can get pink. InfoSets are based on logical databases. In most cases. • Asynchronous update (V2 update) Document update and the statistics update take place separately in different tasks. such as logical database. Q) Why SAPLRSAP is used? A) We use these function modules for enhancing in r/3. Q) What are workbooks & uses? A) Q) Where are workbooks saved? A) Workbooks are saved in favorites. Choose InfoObjects or ODS objects as data sources. Q) LO's? A) Synchronous update (V1 update) Statistics update is carried out at the same time as the document update in the same task. Q) What is InfoSet? A) An InfoSet is a special view of a dataset. Navigating in a BW to an InfoSet Query.

- Again, the document update is made separately from the statistics update. However, in contrast to the V2 update, the V3 collective statistics update must be scheduled as a job. Successfully scheduling the update will ensure that all the necessary information structures are properly updated when new or existing documents are processed. Scheduling intervals should be based on the amount of activity on the particular OLTP system: for example, a development system with a relatively low or no volume of new documents may only need to run the V3 update on a weekly basis, while a full production environment with hundreds of transactions per hour may have to be updated every 15 to 30 minutes. SAP standard background job scheduling functionality may be used in order to schedule the V3 updates successfully. It is possible to verify that all V3 updates have completed successfully via transaction SM13. This transaction takes you to the "UPDATE RECORDS: MAIN MENU" screen; at this screen, enter an asterisk as your user (for all users), flag the radio button 'All' and hit enter. Any outstanding V3 updates will be listed. While a non-executed V3 update will not hinder your OLTP system, by administering the V3 update jobs properly your information structures will be current and overall performance will be improved. Business Content: Business Content is the umbrella term for the preconfigured BW objects delivered by SAP. Business Content includes R/3 extractor programs, DataSources, InfoObjects, InfoSources, InfoCubes, Queries, Roles, and Workbooks. These objects provide ready-made solutions to basic business information requirements and are used to accelerate the implementation of a BW. When activating Business Content in BW, choose the additional Business Content objects that you want to include from the Grouping menu. Groupings gather together all of the objects from a single area: Only Necessary Objects: only those additional objects that are needed to activate the objects that you have selected are included (minimum selection).

Nope. Does anyone know how the system gets the data for these 2 info objects? A) From context menu of source system à transfer global settings. all the update rules are active between ODS and cube and all of them are one to one mappings (not even a single ABAP routine in UpdateRules). Do you have any clue why would the job log show as complete. I have loaded the ODS with R/3 data without any problem. I'm working on BW-3. Q) I have checked up sm37. why would the load from ODS to cube fail? (There are no lock entries in the system). This request can be added again after a system copy has been made. it gets saved but not getting activated. I am using 0FIAP_O03 to load 0FIAP_C03 (InfoSource 0FI_AP_4 loads the ODS). why would you want to go with export datasource when you would have already created an export datasource by creating update rules between ODS and cube! Sorry for the silly question. while I tried activating it in Info provider. I have tried that too before posting the question. where as no data appears in the cube? Regarding the export datasource issue. the request fails miserably. based on FISCVARNT you can take data for 0FISCYEAR and 0FISCPER Q) I am facing an odd problem in my dev box. If not the job was never started. Backup for System Copy: You use this setting to collect some of the objects into a transport request. A) Hi. throws up error. Q) I found 0fiscyear has no master data table. Maybe you have to do a 'Generate Export Datasource' by right-click on the ODS. I have checked for the data in the cube. Does anyone have any idea? A) You must have a Job Log in SM37. No data there either. In Data Flow Afterwards: All the objects that receive data from another object are collected.RSA1. In Data Flow Before and Afterwards: All the objects that both deliver and receive data are collected. Q) Actually I'm trying to create a simple standard. I enabled only BEx report in Settings. In fact. the job shows as complete! But the request status under InfoCube à manage à request tab is still showing yellow. In fact. when I want to do the delta initialization in the cube using ODS data. If the cube and the corresponding update rules are active and the data loads immaculately perfect until ODS. but any help in this regard is highly appreciated. The errors are . Now.1version. I saw the data in New Data table and then after activation I am able to see the data in Active Data table and Change Log table. [order -Flat data] ODS. Assuming a false status. and 0fiscper has master data table t009.In Data Flow Before: All the objects that pass data on to another object are collected.

But once I enter this object. Any ideas on how I should go about this? Q) For the current project I am looking at. Is it advisable to separate their BW systems to 2 different clients as well? Or is it recommended to actually fit both in one? What's the best practice? Q) I am creating a CO-PA datasource. I tried to insert the 0RECORDMODE object on the right hand side.. 2 Could not write in to the output device T'. you know all the VVOCO and those things. but My client wants that the default name should be the same name as query name. said continue. Both companies are on different clients in R3. it didn't add to the DATA FILED Folder side along with my keyfigures. characteristics from Segment Item and Line model and others including Calculated Key figures are available fine.Error: So. to Data fields folder. BOOK2. they have 2 major companies in their Group. I cannot see any value fields. Is there any way I can set to make value fields available? Q) While executing the query it generally takes the default name of the excel sheet as BOOK1. So. Their configuration and system setup is different in R3. I try to create CO-PA datasource for Transaction data using KEB0. except the value fields. Also the status is inactive. I couldn't find the 0RECORDMODE object.. BW Delta process: updatemode object in the left hand side KF or Char Info objects. it shows the object while trying to find 0RECORDMODE in INSERT INFOOBJECTS options [on right click at Data Fields Folder]. However.1. . I just tried to activate the ODS object I'm getting the error 'could not write in to the output device T' in status bar.. What could be the error? Q) I need to populate the results of my query into another InfoCube. I successfully set the business content datasources into active versions in R/3. Then.

Q) We need to find the table in which query and variable assignments are stored. We must read this table in a user exit to determine which variables are used in a query. A) I found this info in one of the previous postings: check out tables RSZELTDIR and RSZCOMPDIR for query BEx elements (characteristics etc.); RSRREPDIR is the query table and RSZGLOBV the variable table. Step 1 - get the query id from table RSRREPDIR (field RSRREPDIR-COMPUID); then use this id for the tables starting with RSZEL*. In the user exit you code per variable, e.g. WHEN 'ZFISPER1' (the name of the variable) or when VNAM = 'ZVC_FY1'; the exit is called with I_STEP = 1 (before selection), I_STEP = 2 (after selection) and I_STEP = 3 (all variables processed at the same time).
Q) Actually the ODS has data till date (09 Dec), which is coming from 2 DataSources, but the InfoCube has data only up to 8 November, as we have deleted a few requests because there was a mismatch of notification complaints encountered during reporting. So we tried to update the data through "Update ODS Data in data target" by selecting the delta update option, and we are getting an error message "Delta update for ZUODEC02 (ODS) is invalidated". Please let me know the solution to this problem. Can we go for an Initialize delta update again for the cube?
Q) How is the display of USD in currency controlled, to be seen as USD or $? A) You can control the currency display with the following customizing point: BW Customizing Implementation Guide > Reporting-relevant Settings > General Reporting Settings > Set Alternative Currency Display. In this table you can specify the symbol or string you want to use; if this table is empty for USD, the symbol used in BEx output is $.
Q) Considering that I have in this InfoCube 6 dimensions, 30 characteristics and 20 key figures, do you see any simple way to make my upload process easier? Do you think that the server will support that amount of data? Or which considerations should I add to make sure that this upload process will run?
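A minimal sketch of such a reporting customer exit (enhancement RSR00001, function EXIT_SAPLRRS0_001, include ZXRSRU01), using the variable names ZVC_FY1 and ZFISPER1 quoted above. The derivation of the fiscal year from the entered period is only an assumed example.

  * Include ZXRSRU01: fill exit variable ZVC_FY1 (fiscal year) from the
  * user entry for ZFISPER1 (fiscal year/period). Runs at I_STEP = 2,
  * i.e. after the selection screen.
  DATA: l_s_range     TYPE rsr_s_rangesid,
        l_s_var_range LIKE rrrangeexit.

  CASE i_vnam.
    WHEN 'ZVC_FY1'.
      IF i_step = 2.
        READ TABLE i_t_var_range INTO l_s_var_range
             WITH KEY vnam = 'ZFISPER1'.
        IF sy-subrc = 0.
          l_s_range-sign = 'I'.
          l_s_range-opt  = 'EQ'.
          l_s_range-low  = l_s_var_range-low(4).  "year part of YYYYPPP
          APPEND l_s_range TO e_t_range.
        ENDIF.
      ENDIF.
  ENDCASE.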

Q) What is an aggregate? A) Aggregates are small or "baby" cubes, a subset of an InfoCube. Roll-up: when data is loaded into the cube a second time, we have to roll it up to make that data available in the aggregate. Flat aggregate: when you have more than 15 characteristics in an aggregate, the system will generate that aggregate as a flat aggregate to increase performance.
Q) X & Y Tables? A) X-tables and Y-tables only have a primary-key relationship: an X-table (SID key plus SID columns per time-independent navigational attribute) or a Y-table (SID key plus SID columns per time-dependent navigational attribute).
Q) Routine with Return Table? A) Update rules generally only have one return value. However, by choosing "Return table", you can create a routine in the tab strip key figure calculation; the corresponding key figure routine then no longer has a return value, but a return table. You can then generate as many key figure values as you like from one data record. In the routine editor you will find the calculated characteristic values in the structure ICUBE_VALUES; change these values accordingly (in the above example: Employee), fill the field for the relevant key figure (in the above example: Sales revenue), and use this to fill the return table RESULT_TABLE.
Q) Deleting data from the PSA? A) Context menu of the PSA and delete data, or context menu > Edit several requests and delete the required request; also delete the request in the reconstruction tab of the cube's Manage screen.
Q) If you update data from an ODS to a data target, the system will generate an InfoSource with which prefix? A) It will be generated with a prefix 8 in front of the InfoSource name.
Q) How do you physically check a data load? A) At the bottom of the screen while updating it executes with the tRFC8 naming convention - just have a look when you update from an ODS.
Q) What is line-item data, and in which scenario do you use a line-item dimension? A) A line-item dimension in a fact table does not have a dimension table; it connects

directly with the SID table at its sole characteristic. you will see a warning if you try to carry out a report using an InfoCube while the compression is running. and reduces performance in Reporting. compression has an additional effect on query performance. However. which is included in the fact table in the packet dimension. With non-cumulative InfoCubes. and the reply time is therefore reduced. This unnecessarily increases the volume of data. This makes it possible to pay particular attention to individual requests. One advantage of the request ID concept is that you can subsequently delete complete requests from the InfoCube. entire requests can be inserted at the same time. Compressing InfoCubes Use When you load data into the InfoCube. the marker for non-cumulatives in non-cumulative InfoCubes is updated. Compressing one request takes approx. If you are using an Oracle database as your BW database. and bring data from different requests together into one single request (request ID 0). as the system has to aggregate using the request ID every time you execute a query. This means that. With other manufacturers' databases. it will eliminate the dimension table. Using compressing. as the compressed data can no longer be deleted from the InfoCube using its request IDs. you can also carry out a report using the relevant InfoCube while the compression is running. with the exception of the request ID) to appear more than once in the fact table. Fact table directly linked to the SID table. the request ID concept can also cause the same data record (all characteristics agree. Also. you can eliminate these disadvantages. If you run the compression for a non-cumulative InfoCube. You can schedule the function immediately or in the background. on the whole. In this case you can only report on the relevant InfoCube when the compression has finished running. You must be absolutely certain that the data loaded into the InfoCube is correct. If you want to avoid the InfoCube containing entries whose key figures are zero values (in reverse posting for example) you can run a zero-elimination at the same time as . 2. When there is only one characteristic in a dimension. and link it to other events.5 ms per data record. Functions You can choose request IDs and release them to be compressed. Each of these requests has its own request ID. This function is critical. the summarization time (including the time to update the markers) will be about 5 ms per data record. less data is read for a non-cumulative query.

Everyday for client information we usually check the data loading thru' monitor. the entries. In this case. Suppose there is no open request. Zero-elimination is permitted only for InfoCubes. summarize a request as soon as you have established that it is correct. where exclusively key figures with the aggregation behavior 'SUM' appear. Q) Our is a SAP-BW with a non-sap source system. All the queries which u create or edit are assigned to this request. Not only into excel but also to application server and also to database tables. are deleted from the fact table. InfoCubes (Master data attributes and texts) & not on hierarchies and multiproviders. Q) If you use process chains. There should always be one transport open if you want to create or edit queries.the compression. where all the key figures are equal to 0. then you cannot edit or create any queries Once you have some queries in a request and you want to transport then you can release this one and immediately create a new one so that u can again create or change queries. Q) You can convert currencies at 2 places? A) One is at update rule level & one at the front-end (reports lo) Q) You can create infospokes only on 4 objects. Is there any ways so that I can connect our mail-server to it & get automatic e-mails from BW server about every time after the transaction data is loaded? A) Go to SBWP Tcode there u have set for getting automatic mails. . Visualize the schedule by using network applications and Centrally control and monitor the processes. you can Automate the complex schedules in BW with the help of Event-controlled processing. A) ODS. Activities For performance reasons. and to save space on the memory. Q) Transportation For BEx you can have only one transport request. and is no longer to be removed from the InfoCube. You are not permitted to run a zero-elimination with non-cumulative values in particular. I mean for data offloading from BW to excel.

Cancellation is not suggestible for delta load. A) In BPS. Q) How to reset the data mart status in the ODS. herein you can > schedule > your whole Process chain (at once. Q) We are facing a problem with authorizations for BPS. Q) How can we stop loading data into infocube? A) First you have to find out what is the job name from the monitor screen for this load monitor screen lo Header Tabstrip lo untadhi. and so on) Call the icon "process types" > . Manage Reset the Data Mart Status (third column). Right now. Which transaction is used for resetting the status? A) Go to the ODS and from context menu. SM37 (job monitoring) in r/3 select this job and from the menu u can delete the job. That has nothing to do with Transactional Data. you can use user specific variables or you can set up a Variable of type exit. we are not sure if there is user exit concept available as it is for BW variables. In order to simplify the control of the necessary authorizations. just check them out in sm37 and also u have to delete the request in BW. by day. We have a large number of agents that need to enter plan data via a layout. will it trigger delta for the 2LIS_11_VAITM extractor? A) I suppose you are talking of changing the "Ship to" address information in the Ship to master data. Q) Here are the steps for Process chain A) Call transaction RSPC Create a process chain with the start process. There are some options. Address is changed for ShipTo on the sales order document (not Master data) and this address is saved in ADRC table. You can also have a variable of type authorization which uses the security / authorization of the BW system.A) Write a transfer routine for the infoObjects to extend the length of the data type. Therefore no delta will be in the 2LIS_11_VAITM extractors. by an event. we would like to filter via something similar to a user exit using a function module in order to avoid having to define authorization objects for each of the agents who have access to the systems. If necessary restart the load from the DM which at InfoSource = "8ODS-Name" using Update Tab" Initialize Delta Process" Initial Load without Data Transfer" after resetting the Data Mart Status. Q) If the Ship to address is changed.

within the process types, call the InfoPackage you need to load the data from the source system to the ODS > activate the ODS > update the ODS into the InfoCube > roll up the data from the InfoCube into the aggregate. The connections between the steps from 2 until 7 are made by EVENTS: you can create events by pressing the right mouse button down and dragging from the predecessor process to the successor process; the system asks you whether the successor should run independent of the result, only on error, or only on success.
Copying process chains: there is a way to copy a process chain. I am using BW 3.1C; perhaps this works in 3.0 as well. In the RSPC view, open the desired process chain to be copied and type COPY in the transaction-calling field at the top left of the screen. The process chain will then be copied. In such a way you can copy process chains.
Q) I want to add the ADNR (address number) of the ship-to party to the standard extractor in LBWE. We have one-time ship-to recipients and we want to be able to report on the STATE of the recipient rather than the state of the sold-to. In order to add this field, could I add the ADNR to the include MCPARTUSR section of the MCPARTNER structure (and hopefully those fields would then be available in any of the MCVBAK, MCVBAP communication structures?), or add an additional append ZMCPART (?), and then populate the field with a user exit during the extraction process? Is that what I need to do to get it into the communication structures for LBWE, the Logistics Cockpit, to use? Has anyone attempted something like this before? I haven't seen many posts or documentation that talk about specifics - I saw a bunch on master data augmentation, but not on the transaction extractors. I THINK it would work but am not certain yet; I just wanted verification that I might be going in the right direction. A) We ultimately did add a few fields to the structure and then populated them with a user exit during the extraction process. Another option: every document is associated with an address, so why not give 0DOCUMENT an attribute called ADDR (or a full set of attributes that include addresses), then build an InfoSet on R/3 to populate the 0DOCUMENT object directly with addresses, and treat the address as "master" data of the 0DOCUMENT object rather than part of the transactional data.
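A minimal sketch of the user-exit approach described above, for filling an appended field during extraction (enhancement RSAP0001, function EXIT_SAPLRSAP_001 for transaction data, include ZXRSAU01). The extract structure name MC11VA0ITM and the appended field ZZADNR are assumptions - check LBWE/RSA6 for the actual structure of your DataSource; the R/3 address number field in VBPA is ADRNR.

  * Include ZXRSAU01: fill the appended ship-to address number for every
  * extracted sales item record of 2LIS_11_VAITM.
  DATA: l_s_item TYPE mc11va0itm.          "assumed item extract structure

  CASE i_datasource.
    WHEN '2LIS_11_VAITM'.
      LOOP AT c_t_data INTO l_s_item.
        SELECT SINGLE adrnr FROM vbpa      "ship-to partner of the document
          INTO l_s_item-zzadnr
          WHERE vbeln = l_s_item-vbeln
            AND posnr = '000000'           "header-level partner, simplified
            AND parvw = 'WE'.              "'WE' = ship-to party (internal code)
        MODIFY c_t_data FROM l_s_item.
      ENDLOOP.
  ENDCASE.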

Q) Does anybody know how to create data marts for master data characteristics? For example, I have two master data characteristics, 0MAT_PLANT and ZMATPLT, in the same BW system, and I want to upload the data from 0MAT_PLANT to ZMAT_PLANT using a delta-enabled InfoPackage.
A) Data marts are a functionality of an ODS (you need an extraction program created by the ODS to make the upload into the data target); they are not a functionality of master data tables. Therefore create one ODS for your object, e.g. for 0MAT_PLANT.

Q) How do you check the version of a particular InfoObject?
A) In table RSDIOBJ.

Q) I have a break on one document between R/3 and BW. All the other documents that were posted during that time were extracted by BW except this one. The document does exist in BSEG. What could be the reason?
A) The record may be missing because of logic in the update rules or transfer rules. First check whether the record exists in RSA3. Then check in the PSA whether that record reached BW or not. Then check the update rules; there may be a filter on that particular record in the update or transfer rules.

Q) What is the difference between R/3 drill-down reporting and BW reporting? The two seem to function similarly, with slice and dice. If they function similarly, then that means we don't have to implement BW and can just use the R/3 drill-down reporting tool, am I right?
A) The major benefits of reporting with BW over R/3 are performance and analysis. 1. Performance: heavy reporting along with regular OLTP transactions can produce a lot of load on both the R/3 system and the database (CPU, memory, disks, etc.). Just take a look at the load put on your system during a month end, quarter end or year end, and now imagine that occurring even more frequently. 2. Data analysis: BW uses data warehouse and OLAP concepts for storing and analyzing data, whereas R/3 was designed for transaction processing. With a lot of work you can get the same analysis out of R/3, but it would most likely be easier from BW.

Features in BI 7.0 (NetWeaver 2004s)
Below are the features in the SAP BI 7.0 version. Some of the features are new and the others are tweaked from previous versions.

Metadata Search (Developer Functionality):

1. It is possible to search BI metadata (such as InfoCubes, InfoObjects, queries and Web templates) using the TREX search engine. This search is integrated into the Metadata Repository, the Data Warehousing Workbench and, to some degree, the object editors.
2. With the simple search, a search for one or all object types is performed in technical names and in texts. During the text search, lower and upper case are ignored, and the object is also found when the case in the text differs from that in the search term.
3. With the advanced search, you can also search in attributes. These attributes are specific to each object type. Beyond that, the search can be restricted for all object types according to the person who last changed the object and according to the time of the change. For example, you can search for all queries that were changed in the last month and that include both the term "overview" in the text and the characteristic customer in the definition. Further functions include searching in the delivered (A) version, fuzzy search, and the option of linking search terms with "AND" and "OR".
4. Because the advanced search described above offers more extensive options for searching metadata, the function "Generation of Documents for Metadata" in the administration of document management (transaction RSODADMIN) was deleted. You have to schedule (delta) indexing of metadata as a regular job (transaction RSODADMIN).

Effects on Customizing: installation of the TREX search engine, creation of an RFC destination for the TREX search engine, entering the RFC destination into table RSODADMIN_INT, determining the relevant object types, and initial indexing of the metadata.

Remote Activation of DataSources (Developer Functionality):
1. In BI, you can activate DataSources remotely from the BI system. This activation is subject to an authorization check: authorization object S_RO_BCTRA is checked, and you need role SAP_RO_BCTRA. The authorization is valid for all DataSources of a source system.
2. When activating Business Content in BI, if you trigger the transfer of the Business Content in the active version, the system checks the authorizations remotely and issues a warning if you lack authorization to activate the DataSources. When the objects are collected, the results of the authorization check are based on the cache.

3. If you have the necessary authorizations for activation, the DataSources in the source system are transferred to the active version and replicated in the BI system. The source-system-dependent objects are activated in the BI system.
4. If you lack the necessary authorization for activation, the system issues a warning for the DataSources, and BW issues an error for the corresponding source-system-dependent objects (transformations, transfer rules, transfer structure, InfoPackage, process chain, process variant). In this case, you have to activate the DataSources in the source system manually and then replicate them to the BI system. Alternatively, you can use Customizing for the extractors to manually transfer the required DataSources in the source system from the Business Content, replicate them in the BI system, and then transfer the corresponding source-system-dependent objects from the Business Content.
5. Source systems and/or BI systems have to have at least BI Service API SAP NetWeaver 2004s; otherwise remote activation is not supported.

How to automate flat file loads into SAP BW?
Flat file uploads are a pain in the neck in most cases, but often we have no option except to go with them, because of non-SAP source systems or because business users have to consolidate source files from different systems. In any case, automating these flat file loads saves much needed time and headache for the SAP BW support team, avoids the delay of emailing the file to the support team, and also helps with the security of sensitive data in the flat files. There are different options to automate flat file loads; it depends on your system architecture, the available interfaces, the type of scheduling tool being used, and so on, and you have to choose the best option for your environment. For instance, there are scheduling tools (like IBM Maestro/Tivoli) that can start jobs based on the presence of a file on the application server; it is relatively easy and straightforward to automate if you are using such a tool.

There are three critical parts in automating flat file loads:
1. Getting the flat file onto the application server (the flat file needs to be on the app server to be able to schedule the InfoPackage in the background; from a client workstation you cannot schedule it in the background).
2. Detecting the flat file automatically, starting the job/process chain, and the actual loading of the file.
3. Post-load cleanup steps like archiving/moving files, sending user notifications, checking for format errors, and keeping track of files and timestamps for troubleshooting if there are any data issues.

Getting the flat file onto the app server: For this step you can use an FTP client to upload the file from the user's computer to a designated directory/folder on the app server. Users can upload the file with an FTP client like WS_FTP Pro, and you can give the user/user group upload access to the folder where the flat files reside before they are loaded into BW. This works where a single user or small user group is responsible for generating the flat files; it won't work for a large user group because of file overwrite and timestamp issues.

The better approach for a large user group is to use FTP scripts to consolidate the files and transfer them to the app server from Windows shared folders. This is how the second approach works:
1. Users put their source files onto a common shared Windows NT drive (folder).
2. A batch script consolidates the files (if there is more than one) and FTPs the result to the app server.
3. This Windows batch script can be scheduled to run every 30 minutes or 1 hour, depending on the requirement.

The advantage of this approach is that no FTP-client training is needed for the end users; the script takes care of the multiple-files issue, and it is scalable, meaning new users can easily be allowed to upload files since all they need is access to the shared folder.

Detecting the flat file and starting the job/process chain: This is the critical part. If you are using a scheduling tool that supports it (I know that IBM Maestro/Tivoli does; other tools might as well), all you need to do is make the jobs dependent on the file: you define the file name and directory path when you define the jobs in the tool, and the scheduling tool checks for the file at regular intervals (10 or 15 minutes) and starts the schedule when the file is present. Check your scheduling tool's features to see whether it can schedule jobs based on the presence of a file on the app server.

Don't worry if you are not using a scheduling tool, or your scheduling tool does not support this; there are other options. You can write an ABAP program to check whether a file is present on the app server. This is how it works:
1. The ABAP program is scheduled to run every 30 minutes or 1 hour, depending on the requirement.
2. The program triggers an event if the file is present.
3. A process chain is scheduled based on the event from the step above, and this process chain loads the file into BW. In this way the ABAP program is used to trigger the process chain that loads the actual file.
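A minimal sketch of such a checker program is shown below. The directory, file name and the background event name Z_FLATFILE_ARRIVED are assumptions for illustration; the event would have to exist (transaction SM62) and be set as the start condition of the process chain.

REPORT z_check_flatfile.

* Sketch: poll the application server for the flat file and raise a
* background event that the process chain is waiting on. Directory,
* file name and event name are examples only.
CONSTANTS: c_file  TYPE string     VALUE '/interfaces/bw/flatfile.csv',
           c_event TYPE btceventid VALUE 'Z_FLATFILE_ARRIVED'.

OPEN DATASET c_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc = 0.
* File exists and is readable: close it again and raise the event
  CLOSE DATASET c_file.
  CALL FUNCTION 'BP_EVENT_RAISE'
    EXPORTING
      eventid                = c_event
    EXCEPTIONS
      bad_eventid            = 1
      eventid_does_not_exist = 2
      eventid_missing        = 3
      raise_failed           = 4
      OTHERS                 = 5.
  IF sy-subrc <> 0.
    WRITE: / 'Could not raise background event', c_event.
  ENDIF.
ELSE.
* No file yet: nothing to do until the next scheduled run
  WRITE: / 'File not found, nothing to do'.
ENDIF.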

Post-load and cleanup activities: You need to do some housekeeping to avoid overlapping loads, notify the users, and keep track of files and timestamps for troubleshooting if there are any data issues. You also need to resubmit the schedule at the end of the load to be able to load multiple files in a single day.
1. Rename the file at the beginning of the process chain/schedule; for instance, if your file name is flatfile.csv, rename it to flatfile.load. By doing this you avoid overlapping loads. Let us assume your ABAP program runs every 30 minutes, it detects the file and starts the load at 10 AM, and the file is huge, so it takes more than 30 minutes to load. In this scenario the next run of the ABAP program at 10:30 AM would detect the same file and trigger the process chain again before the first load is complete; renaming the file avoids this issue. You can use a UNIX command (if your scheduling tool supports it), an ABAP program or an SM36 job for the rename.
2. Archive the file to an archive directory after the successful load and notify the users.
3. Check the flat file for formatting issues: the most common problems with flat files are formatting issues. You can write an ABAP program to check for formatting issues beforehand, notify the users, and skip the data load.
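As an illustration of such a pre-check, the sketch below counts the delimiter-separated columns in each line of the file and reports lines that do not match the expected column count. The file path, separator and expected field count are assumptions for the example.

REPORT z_check_flatfile_format.

* Sketch: validate a csv file on the application server before loading.
* Path, separator and expected column count are examples only.
CONSTANTS: c_file   TYPE string VALUE '/interfaces/bw/flatfile.csv',
           c_sep    TYPE c LENGTH 1 VALUE ',',
           c_fields TYPE i VALUE 12.

DATA: l_line   TYPE string,
      lt_cols  TYPE TABLE OF string,
      l_count  TYPE i,
      l_lineno TYPE i,
      l_errors TYPE i.

OPEN DATASET c_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
  WRITE: / 'File not found, nothing to check'.
  RETURN.
ENDIF.

DO.
  READ DATASET c_file INTO l_line.
  IF sy-subrc <> 0.
    EXIT.                            " end of file reached
  ENDIF.
  l_lineno = l_lineno + 1.
  SPLIT l_line AT c_sep INTO TABLE lt_cols.
  DESCRIBE TABLE lt_cols LINES l_count.
  IF l_count <> c_fields.
    l_errors = l_errors + 1.
    WRITE: / 'Unexpected number of columns in line', l_lineno.
  ENDIF.
ENDDO.
CLOSE DATASET c_file.

IF l_errors > 0.
* Here you could notify the users (for example by mail) and skip
* raising the event that starts the load.
  WRITE: / l_errors, 'formatting error(s) found, load should be skipped'.
ENDIF.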

How to change process chain (PC) status in SAP BW?
There are scenarios where you need to change the status of a whole process chain or of a particular step in a process chain, typically because you need to mark a failed step as successful so that the dependent steps get processed. It is easy to change the status of a step if it is a data load: you can just change the status of the request in the monitor, and that in turn changes the status in the process chain. The problem is other types of processes, like master data activation or custom ABAP programs; in these cases there is no straightforward way to change the status.

Step-by-step instructions to change the process chain status:
1. Right-click the failed step in the process chain monitor and go to "Displaying messages".
2. Go to the "Chain" tab and note down the variant, instance and start date.
3. Go to SE16, table RSPCPROCESSLOG, enter the variant, instance and start date from step 2, and note down the log_id, type, variant and instance.
4. Go to transaction SE37, execute the function module RSPC_PROCESS_FINISH, enter the values from step 3, enter the new status 'G' in the status field and execute the FM. This sets the status of the process chain (PC) step.
5. After you set the status using the FM, go to the monitor screen of the process chain and you will notice the changed status. The dependent steps in the process chain will now start running.
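If this has to be done regularly, the SE37 call can be wrapped in a small report along the lines of the sketch below. The parameter names I_LOGID, I_TYPE, I_VARIANT, I_INSTANCE and I_STATE are the commonly quoted ones for RSPC_PROCESS_FINISH, but the interface can differ between releases, so verify it in SE37 before relying on this.

REPORT z_finish_pc_step.

* Sketch only: sets a hanging process chain step to status 'G'
* (successfully completed) using the values read from RSPCPROCESSLOG
* in the steps above. Verify the interface of RSPC_PROCESS_FINISH in
* SE37 on your release; parameter names may vary.
PARAMETERS: p_logid TYPE rspc_logid,
            p_type  TYPE rspc_type,
            p_var   TYPE rspc_variant,
            p_inst  TYPE rspc_instance.

CALL FUNCTION 'RSPC_PROCESS_FINISH'
  EXPORTING
    i_logid    = p_logid
    i_type     = p_type
    i_variant  = p_var
    i_instance = p_inst
    i_state    = 'G'.

WRITE: / 'Process set to status G (green)'.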

SAP BW - Navigational Attributes - Importance, Usage, How to?
A navigational attribute is like any other master data attribute except that we can navigate (filter, drill down and select) on it in reports. As you know, we can normally navigate only on the characteristics in an InfoCube or MultiProvider; however, there are many scenarios where you want users to navigate in reports on a master data attribute, because it is not always possible to have every required field as a characteristic. To add a new characteristic to an InfoCube you would either have to do a lookup in BW or enhance the extractor on the ECC side; this is cumbersome, delta on the new fields may not work, and, most importantly, the history data will not have the new field populated. The solution is to use navigational attributes: no lookups, no delta worries, and the navigational attribute works for history data as well. The only disadvantage is that it will impact report performance a little, so you don't want too many navigational attributes in an InfoCube.

How to turn on navigational attributes? First switch them on in the master data: go to RSD1 -> Attribute tab -> Detail/Navigation Attributes -> Navigational Attribute on/off column. After this, switch them on in the required InfoCubes and MultiProviders under the navigational attributes section.

Caution: Transport of the master data object might take a long time when you switch on navigational attributes in cases where the master data volume is high. This is because SAP inserts SIDs into the SID table (/BIC/X*) for the new navigational attributes.

SAP BW: PSA and Changelog Deletion
It is best practice to have a strategy for deleting PSA and changelog data.

PSA Deletion in SAP BW/BI (Business Intelligence)
For PSA data, the best practices in my view are:
1. Transaction data: keep up to 45 days (delta or full); delete the requests older than 45 days. The rationale behind 45 days is that month-end reporting is done within the first two weeks of the month, so by keeping 45 days we have the opportunity to reload/recover data from the PSA if there are issues with the data.
2. Master data: it depends on attributes or texts. Attribute delta loads: 2 weeks. Attribute full loads: 1 week. Texts: one week, or no PSA at all.
3. Change log data: I think there is a lot of confusion in forums on deleting changelog data. Some say we can delete the changelog data with no problem; some say we will get incorrect data if there are any updates for the deleted old data. For the record, change log data can be deleted, and there are no issues with it even when we get changes for the records we deleted: after the delta load is complete, the records in the change log table are never used for anything again unless you want to reconstruct/reload data request by request.

Here is how the change log / delta from the ODS to the cube works:
1. Data is first loaded into the new table (this is before ODS activation).
2. ODS activation: during activation SAP compares the records in the new table with the records in the active table, generates changelog records, and puts them under a unique ID which corresponds to the request ID in the active table. The records in the new table are deleted at the end of activation.
3. When you do a delta from the ODS to the cube, it loads the records from the changelog table based on the request ID.

Even if you need to reload, you do it by either full or delta-init loads, or the data is loaded from the active table. Actual deletion of the change log can be done either in process chains or from the ODS > Manage -> Environment -> Delete change log data.

How to delete changelog or PSA data in SAP BW? Now that you have decided to delete change log and PSA data that has been sitting out there, how do you know how many requests there are and which are the big change logs/PSAs to get rid of first to save some much needed space? You can go to table RSTSODSREQUEST and export the list to an Excel or Access file; it gives the number of requests by PSA and ODS.

Tips and Tricks to improve SAP BW - BI performance
This is a wonderful presentation by Dr. Bjarne Berg: 20 technical tips and tricks to speed up SAP Business Intelligence (BW - BI) query and report performance, by Dr. Bjarne Berg, who is considered an SAP BI guru. The presentation covers the following topics: Performance Issues & Tips (Multiproviders and Partitioning, Aggregates, Query Design & Caching, Hardware & Servers), Designing for Performance (InfoCubes and DSOs, BI Accelerator: Sizing and Implementation, Management and Costs), and Early Watch Reports. Download the presentation from the attachments section below, and check out the resources to learn more about Dr. Bjarne Berg and his work.

Information Broadcasting in SAP BI 7.0
Information broadcasting is used to send reports to users by email, publish reports to the portal, or send reports to a printer. SAP has added new features to information broadcasting in SAP BI 7.0 and improved the overall functionality. In BI 7.0 you can also broadcast workbooks in addition to queries, and there is no need to set up a separate precalculation server, which you had to set up in 3.5 to be able to broadcast workbooks. You can now also send reports in PDF format in BI 7.0.

The first document attached to this article, "Information Broadcasting with SAP Netweaver-04", gives an overview of the information broadcasting features in SAP BI 7.0, explaining with examples where and how to use information broadcasting in different scenarios. The second document, "How-to-troubleshoot-information-broadcasting.pdf", gives step-by-step instructions on troubleshooting information broadcasting in SAP BW/BI. Getting information broadcasting to work with workbooks is not a trivial thing, and this document helps in the troubleshooting process. Download these documents from the attachments section below.

SAP Transaction Codes (tcodes): T-code Search Results for "datasource"
RSA6 .Maintain DataSources Basis .BW Service API RSDS .DataSource BW .Data Staging

Generate DataSources Enterprice Controlling .Enhancements Basis .Modeling .Assign Gen.BW Service API RSA15 .Data Staging CRMBWST .Information System KEB1 .RSA2 .BW Delta Queue Monitor Basis .Logistics Information System (LIS) RSA1 .Install Business Content Basis .BW Service API FCIWCU00 .: Customizing Cockpit Logistics .Genertd DataSource for BW Status Obj Cross Application .MOLAP DataSource creation BW . Project Systems .SAPI DataSource Repository Basis . DataSource for BW Status Obj.Create CO-PA DataSource CO .datasource related Transaction Codes---------------------RSA3 .Customer Enhancements SE11 . Ledger DataSource/Ledger FI .Maintain DataSources Basis .Administrator Workbench FAGLBW03 .BW Service API RSO2 .OLAP Technology BWST .DataSource BW .Dictionary Maintenance RSDS .BW Service API LBWE .DataSource Repository Basis .EC-EIS/BP: Generate DataSource Enterprice Controlling .SAPI DataSource (Old GUI) Basis .Oltp Metadata Repository Basis .DataSource Documentation BW .Data Staging SAP Transaction Codes (tcodes): T-code Search Results for "LO Cockpit" LO Cockpit x Find Tcodes .BW Service API RSA7 .DW Workbench BW .BW Service API RSA6 .General Ledger Accounting RSDSD .Forwarding to SAP BW KCBW .LO Data Ext.ABAP Dictionary Maintenance Basis .BW Service API KEB0 .CrossApplication Components RSDPMOLAPDS .Executive Information System --------------------.Extractor Checker Basis .BW Service API RSA5 .Profitability Analysis RSA2OLD .DW Workbench: DataSource Tree BW .CO-PA Hierarchy DataSource CO .Administrator Workbench CMOD .Profitability Analysis RSA8 .Gener.

: Delivery Logistics .BW Service API RSA6 .Customer Enhancements RSA5 .RFC SM37 .Install Business Content Basis . Struct.Logistics Information System (LIS) SAP Transaction Codes (tcodes): T-code Search Results for "Copa Datasource" Copa Datasource x Find Tcodes --------------------.: Customizing Cockpit Logistics .Delete Newly Reorg.Client/Server Technology OLI7BW .Overview of job selection Basis . BW Data Logistics .Administrate Update Records Basis .Logistics Queue Overview Logistics .Start DBA Cockpit Basis .BW Service API CMOD .SAPI DataSource Repository Basis .Logistics Information System (LIS) LBWG .BW Service API SPRO .Customizing Project Management (IMG) DBACOCKPIT .LO Cockpit related Transaction Codes---------------------LBWE .Logistics Information System (LIS) RSA3 .: Invoices Logistics . Str.BW Service API . Str.Background Processing OLI9BW .BW Service API LBWQ . VIS Extr.Enhancements Basis .Maintain DataSources Basis .Logistics Information System (LIS) RSA6 .BW Delta Queue Monitor Basis .Logistics Information System (LIS) RSA7 .BW Service API BMBC .Reorg.Maintain DataSources Basis .Reorg. of VIS Extr.Customizing .Edit Project Basis .Logistics Information System (LIS) SE11 .--------------------. VIS Extr.: Order Logistics .Reorg.Extractor Checker Basis .BW Service API RSA7 .BIW in IMG for OLTP Basis .Batch Information Cockpit Logistics .Dictionary Maintenance RSA2 .LO Data Ext.DB2 Universal Database for UNIX / NT SMQ1 .Copa Datasource related Transaction Codes---------------------RSA3 .BW Service API SM13 .ABAP Dictionary Maintenance Basis .BW Service API SBIW .Batches OLI8BW .Extractor Checker Basis .qRFC Monitor (Outbound Queue) Basis .BW Delta Queue Monitor Basis .

CO-PA Line Item Entry CO .Line Item Display .Modeling .DW Workbench BW .Dictionary Maintenance CMOD .Work Clearance Management OAC0 .Profitability Analysis SE16 .BW Service API RSDS .SAPI DataSource Repository Basis .Content Management Service .RSA5 .LO Data Ext.Maintain Categories Basis .View maintenance VV2_T258I_V CO .KPRO Administration Basis .BIW in IMG for OLTP Basis .Create CO-PA DataSource CO .Maintain Derivation Strategy CO .DW Workbench BW .ABAP Editor KE4I .ABAP Dictionary Maintenance Basis .Workbench Utilities KEDR .Data Staging KE30 .BW Service API KPRO .Profitability Analysis RSA2 .Enhancements Basis .BW Service API SE38 .Data Browser Basis .Modeling .Actual Data CO .Execute profitability report CO .Customer Enhancements KEB0 .Profitability Analysis KE24 .Install Business Content Basis .Content Management Service RSA1 .Customizing Project Management (IMG) OACT .content extractors related Transaction Codes---------------------WCM .CMS Customizing Content Repositories Basis .Customizing .Administrator Workbench RSA5 .Content Management Service CSADMIN .Install Business Content Basis .DataSource BW .ABAP Editor Basis .Logistics Information System (LIS) SE11 .Profitability Analysis SAP Transaction Codes (tcodes): T-code Search Results for "content extractors" content extractors x Find Tcodes --------------------.Content Server Administration Basis .Work Clearance Management PM .Profitability Analysis SBIW .Edit Project Basis .: Customizing Cockpit Logistics .Administrator Workbench LBWE .BW Service API RSO2 .Oltp Metadata Repository Basis .Profitability Analysis KE21N .BW Service API RSA1 .Content Management Service SPRO .

ABAP Editor XSLT .BIW in IMG for OLTP Basis .BW Service API .ABAP Dictionary Maintenance Basis .RFC Destinations (Display/Maintain) Basis .Workbench Utilities RSA2 .Extractor Checker Basis .Modeling .BW Service API RSDS .Dictionary Maintenance SAP Transaction Codes (tcodes): T-code Search Results for "generic datasource" generic datasource x Find Tcodes --------------------.RSA3 .Maintain DataSources Basis .SAPI DataSource Repository Basis .Print and Output Management OAC3 .LO Data Ext.XSLT tester Basis .Workflow Definition: Administration Basis .: Customizing Cockpit Logistics .SAP ArchiveLink: Links Basis .Data Staging SE16 .Dictionary Maintenance CMOD .Output Controller Basis .SAP Business Workflow SXMB_MONI .Install Business Content Basis .Integration Engine .ICM Monitor Basis .Administrator Workbench LBWE .generic datasource related Transaction Codes---------------------RSA3 .RFC SE11 .BW Service API RSA7 .ABAP Editor Basis .Enhancements Basis .ArchiveLink SBIW .HTTP Service Hierarchy Maintenance Basis .DW Workbench BW .BIW in IMG for OLTP Basis .ABAP Dictionary Maintenance Basis .Logistics Information System (LIS) SE11 .BW Delta Queue Monitor Basis .BW Service API SWDC .BW Service API RSA1 .Integration Engine SE38 .Internet Communication Framework SMICM .XML Generic Technology SP01 .Monitoring Basis .BW Service API RSA6 .Extractor Checker Basis .Oltp Metadata Repository Basis .Customer Enhancements SBIW .BW Service API SICF .BW Service API RSO2 .DataSource BW .Data Browser Basis .Internet Communication Manager SM59 .BW Service API RSA5 .

Authorization and Role Management CMOD .OLAP Technology ST01 .Environment SM37 . BW .Role Maintenance Basis .Customizing .ABAP dump analysis Basis .Logistics Information System (LIS) SE37 .Customer Enhancements RSECADMIN .ABAP Editor Basis .Manage Analysis Authorizations BW .Deactivate Pers.Business Explorer RRMB .Business Explorer RS_PERS_BOD_ACTIVATE .Activation of BEx Personalization BW .System Trace Basis .Internet Communication Framework SWDC .DW Workbench BW . for BEx Open BW .Logistics Queue Overview Logistics .Low Level Layer SICF .SAP Business Workflow SAP Transaction Codes (tcodes): T-code Search Results for "BWA" BWA x Find Tcodes MEREP_BWAFDEL .ABAP Editor LBWQ .Start the Business Explorer Analyzer BW . Runtime SAP Transaction Codes (tcodes): T-code Search Results for "Bex" Bex x Find Tcodes RS_PERS_ACTIVATE .Enhancements Basis .RFC ST22 .Activate BEx Open Pers.Edit Project Basis .ABAP Function Modules Basis . for BW and Classes Cross Application .Start of the report monitor BW .Table Maint.Syntax.Bex related Transaction Codes---------------------RSRT .Overview of job selection Basis .Modeling .delivery of BWAFMAPP entries Basis .Business Explorer RSA1 . Compiler.qRFC Monitor (Outbound Queue) Basis .Function Builder CTBW .OLAP Technology RRMX .Upload Screens from BEx Browser BW .Business Explorer --------------------.SAP NetWeaver Mobile .Business Explorer RS_PERS_BOD_DEACTIVA .HTTP Service Hierarchy Maintenance Basis .Workflow Definition: Administration Basis .SE38 .Customizing Project Management (IMG) PFCG .Administrator Workbench SPRO .Background Processing SMQ1 .

Administrator Workbench SAP Transaction Codes (tcodes): T-code Search Results for "transport" transport x Find Tcodes STMS .C FI Transport Chart of Accounts FI .Basic Functions VT04 .Create Stock Transport Order MM .OLAP Technology SLG1 .Transportation .CO-PA: Transport tool CO .--------------------.Transport Organizer (Extended) Basis .Basis Application Log RSA1 .Modeling .Transportation Worklist Logistics Execution .Organizational Management /SAPAPO/SCC_TL1 .TREX Administration Tool Basis . MM .Transport Organizer Tools Basis .Production Orders ME27 .Background Processing RSDDBIAMON .Business Explorer RSDDV .Set up PD Transport Connection Basis .Transport Organizer COSS .BI Accelerator Maintenance Monitor BW .Basic Functions OOCR .Transport Organizer Basis .Transport Management System SE10 .BI Accelerator Maintenance Monitor BW .Create Transport Scheduling Agmt.Manual Transport Link Basis .Transfer Bank Data from BIC Database Cross Application .Transport of C Tables PP .Transport Management System Basis .Transport Organizer SE01 .Start of the report monitor BW .Translation Tools OBY9 .Set Up Transport Organizer Basis .Bank SM37 .OLAP Technology RSRT .Transport Organizer SE09 .Overview of job selection Basis .Basic Functions SLXT .Purchasing GCTR .Transport Organizer SE06 .OLAP Technology RSRV .BWA related Transaction Codes---------------------TREXADMIN .Transport from Report Writer objects FI .Translation Transport Basis .Transport Organizer Basis .DW Workbench BW .Application Log: Display Logs Basis .Maintaining Aggregates/BIA Index BW .OLAP Technology BIC .Organizational Management KE3I .Profitability Analysis ME37 .Transportation Lanes MM .Transport Organizer SE03 .Analysis and Repair of BW Objects BW .TREX ABAP + JAVA API RSDDBIAMON2 .Purchasing RE_RHMOVE30 .

Business Explorer RSA1 . Conversion Program.Syntax.Project System LECI .DW Workbench BW . MC.ABAP Dictionary Maintenance Basis .Transport Organization Customizing CO .Legacy System Migration Workbench Basis .Basic Functions ME6Z .User and Authorization Management SE14 . SPDD STMS .OLAP Technology RRMX .Financials Basis QCCY . Compiler.Transport Management System Basis .Purchasing STMS_PATH .Destination for Transport Methods Financials .Customizing .Background Processing SU01 .Role Maintenance Basis .Start of the report monitor BW .Transport Management System OKE5 .Edit Project Basis . Runtime LSMW ./SAPAPO/TSOBJ .Overview of job selection Basis .Quality Management MEKX .ABAP dump analysis Basis .Data Browser Basis .Transport Vendor Evaluation Tables MM .Vendor Evaluation FINB_TR_DEST .Utilities for Dictionary Tables Basis .Dictionary Maintenance SARA .Modeling .Overhead Cost Controlling SAP Transaction Codes (tcodes): T-code Search Results for "delete bex" delete bex x Find Tcodes --------------------.Transport Connection DP/SNP Project Systems .Function Builder SE11 .Archive Development Kit ST22 .Administrator Workbench SE38 .Activation Program.Transport Management System SE37 .Transport Condition Types Purchasing MM .Start the Business Explorer Analyzer BW .Customizing Project Management (IMG) SE16 .User Maintenance Basis .Archive Administration Basis .TMS Transport Routes Basis . DB Utility.ABAP Editor SPRO .ABAP Editor Basis .delete bex related Transaction Codes---------------------RSRT .Customer Enhancements .Register Means of Transport/Visitor Logistics Execution .Legacy System Migration Workbench CMOD .Workbench Utilities PFCG .Transport QM tolerance key QM .Authorization and Role Management SM37 .Enhancements Basis .ABAP Function Modules Basis .

Print and Output Management SPAD .Parameters for Automatic Payment FI .Spool Administration (Test) Basis .Call View Maintenance Basis .ABAP Editor Basis .TemSe Administration Basis .Communication Services: Mail.Installation Check: Spool Basis .Output Controller Basis .RFC . SMS.bad character related Transaction Codes---------------------RSKC .Audit Information System OMMZ .Administration Basis .Overview of job selection Basis .Spool and related areas Basis .Print and Output Management RSPO0055 .Print and Output Management SPAD .Spool Administration Basis .Print and Output Management SP00 .Maintaining the Permittd Extra Chars BW .Spool Parameters for WM Print Ctrl Logistics Execution .spool related Transaction Codes---------------------SP01 .Other Functions --------------------.Background Processing SP02 .Warehouse Management SM59 .RFC Destinations (Display/Maintain) Basis .Financial Accounting SAP Transaction Codes (tcodes): T-code Search Results for "bad character" bad character x Find Tcodes --------------------.Schedule Background Job Basis .Print and Output Management SPAT .SE16N .General Table Display CO .Spool Parameters Basis .Controlling SM30 .Display Spool Requests Basis .Audit Information System RSPFPAR_SPOOL .R/3 Syslog SAP Transaction Codes (tcodes): T-code Search Results for "spool" spool x Find Tcodes SP02 .Online System Log Analysis Basis .Print and Output Management SPOV .Spool Request Overview Basis .Table Maintenance Tool SM21 . Fax.ABAP Editor SM36 .Background Processing SP12 .Display Spool Requests Basis .Print and Output Management SM37 .SAPconnect .Online System Log Analysis Basis .Print and Output Management SE38 . Telephony SM21 .Print and Output Management SCOT .R/3 Syslog F110 .Spool Administration Basis .

Contains master data common to all cubes.Integration Engine STRUST . The star schema?      Uses generated numeric keys and aggregates in its own tables for faster access.Edit Project Basis .Function Builder F103 .Transfer Bank Data from BIC Database Cross Application .I18N Unicode SE16 .Internationalization Basis .Bank F104 . Supports slowly changing dimensions.Security SO10 .Spool Administration Basis .C FI Maintain Table T030F FI . Of these 16.Unicode Postconversion Basis .Customizing . How many dimensions are there in a cube? There are a total of 16 dimensions in a cube.Monitoring Basis .ABAP/4 Reporting: Receivables Prov.htm 1. 3.SAPscript Font Maintenance Basis .Workbench Utilities SE73 .ABAP Dictionary Maintenance Basis . What is the “myself data mart”? .XSLT tester Basis .Dictionary Maintenance BIC . 2.Customizing Project Management (IMG) I18N .XML Generic Technology SXMB_MONI .Data Browser Basis .Basic Functions SPRO .Basic Functions SPAD .Print and Output Management SE11 . This leaves the customer with 13 dimensions.ABAP Editor Basis . unit and request.Integration Engine . What are the advantages of an Extended star schema of BW vs. Uses an external hierarchy.Basic Functions SE38 .SAPscript: Standard Texts Basis .ABAP Function Modules Basis .ABAP Editor OBXD .Internationalization (I18N) XSLT .Basic Functions SUMG .ABAP/4 Reporting: Trnsfr Receivables FI .SE37 .SAPscript OB04 . Supports multiple languages.com/books/bw.C FI Maintain Table T030 FI . 3 are predefined by SAP and these are time.Trust Manager Basis . What is the transaction for the Administrator work bench? Transaction RSA1 4. FI .SAPscript BW Interview Questions Written by Kevin Wilson To get the full 201 Interview Questions go to http://201interviewquestions.

A BW system feeding data to itself is called the "myself data mart". It is created automatically and uses ALE for data transfer.

5. What are the data types supported by characteristics?
NUMC (numeric), CHAR (up to 60 characters), DATS (date), TIMS (time).

6. What is meant by compounding?
Compounding defines the superior InfoObject which must be combined to define an object. For example, when you define a cost center, the controlling area is the compounding (superior) object.

7. What are the types of attributes?
Display-only attributes are only for display and no analysis can be done on them. Navigational attributes behave like regular characteristics. For example, assume that we have the customer characteristic with country as a navigational attribute: in the BEx query you can create filters or variables for country and you can also use the drill-down feature, so you will be able to analyze the data using both customer and country.

8. What is the enhancement user exit for BEx reporting?
RSR00001.

9. What is a characteristics variable?
You can have dynamic input for characteristics using a characteristic variable. For example, if you are developing a sales report for a given product, you will define a variable for 0MATERIAL.

10. What is a condition?
If you want to filter on key figures or do a ranked analysis, you use a condition. For example, you can use a condition to report on the top 10 customers, or on customers with more than a million dollars in annual sales.

11. What is a calculated key figure?
A calculated key figure is used to do complicated calculations on key figures, such as mathematical functions, percentage functions and total functions. For example, you can have a calculated key figure to calculate sales tax based on your sale price.

12. What is an aggregate?
Aggregates are mini cubes. They are used to improve performance when executing queries; you can equate them to indexes on a table. Aggregates are transparent to the user.

13. What are the 10 decision points of data warehousing?
Identify a fact table; identify the dimension tables; define the granularity of the fact table (how detailed do you want the data to be); define the attributes of the entities; define pre-calculated key figures; identify slowly changing dimensions; identify aggregates; how often is the data extracted; from which system is the data to be extracted; how long will the data be kept.

14. What is compression or collapse?
This is the process by which we delete the request IDs, which leads to space savings and improved performance. All the regular requests are stored in the F table; when you compress, the request ID is deleted and the data is moved from the F table to the E table. This saves space and improves performance, but the disadvantage is that you cannot delete the compressed requests individually. You can, however, still use selective deletion.

15. What are non-cumulative key figures?
These are key figures that are not summarized (unlike sales); they are always shown in relation to a point in time. Examples are head count and inventory amount: we don't add up the head count, we ask how many employees we had as of last quarter. If you are using non-cumulative key figures in a cube, the cube should be compressed as often as possible to improve performance.

16. What is an InfoSet?
An InfoSet is an InfoProvider that delivers data by joining data from different sources, such as an ODS and master data; you can also do an outer join in an InfoSet. InfoSets can be used to combine transactional data with master data: for example, if you have quantity in the transaction data and price as an attribute of the material, you can build an InfoSet on the transaction data and the material and then do calculations based on the material price in BEx. Another usage: if you have an ODS, you can disable BEx reporting on it (in the settings) and use the ODS in an InfoSet for reporting, which leads to improved performance.

17. What options are available in the transfer rule?
- Assign an InfoObject: direct transfer, no transformation.
- Assign a constant: for example, if you are loading data for a specified country from a flat file, you can make the country (US) a constant and assign the value explicitly.
- ABAP routine: for example, assume you are getting a flat file of legacy data and the cost center is in a field you have to "massage" to get it in, or you want to do some complex string manipulation; in this case an ABAP routine is most appropriate.
- Formula: for simple calculations use a formula, e.g. if you want to convert all lower-case characters to upper case, use the TOUPPER formula. You can use the formula builder to help put your formulas together.
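As a small illustration of the ABAP-routine option, the sketch below trims a legacy cost center value, converts it to upper case and adds the leading zeros BW usually expects. The field names (tran_structure-costcenter, result) merely stand in for whatever the generated routine frame provides in your system, so treat this purely as a sketch of the logic.

* Sketch of logic typically placed inside a transfer/update routine.
* tran_structure-costcenter and result stand in for the fields of the
* generated routine frame in your system; adjust the names accordingly.
DATA: l_costcenter TYPE c LENGTH 10.

l_costcenter = tran_structure-costcenter.      " raw legacy value
CONDENSE l_costcenter NO-GAPS.                 " remove blanks
TRANSLATE l_costcenter TO UPPER CASE.          " normalise the case

* Add the leading zeros master data usually expects (ALPHA conversion)
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
  EXPORTING
    input  = l_costcenter
  IMPORTING
    output = l_costcenter.

result = l_costcenter.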

How to improve your body language for the interview
First, to change your body language you must be aware of your body language. Notice how you sit, how you stand, how you use your hands and legs, and what you do while talking to someone. You might want to practice in front of a mirror; yeah, it might seem silly, but this will give you good feedback on how you look to other people and give you an opportunity to practise a bit before going out into the world. And no one is watching you as much as you think;

they are worrying about their own problems. 5. You might sit with your legs almost ridiculously far apart or sit up straight in a tense pose all the time. Just play around a bit. 3. Lean. 2. Take bits and pieces you like from different people. Don’t slouch. sit up straight – but in a relaxed way. Don’t cross your arms or legs – You have probably already heard you shouldn‟t cross your arms as it might make you seem defensive or guarded. If you slow down your movements you‟ll feel calmer. You might also want observe friends. But don‟t be the first to laugh at your own jokes. Smile and laugh – lighten up. Don’t be afraid to take up some space – Taking up space by for example sitting or standing with your legs apart a bit signals self-confidence and that you are comfortable in your own skin. Or lean back too much or you might seem arrogant and distant. 7. But fake it til you make it is a useful way to learn something new. feelings work backwards too. not in a too tense manner. but not too much – If you want to show that you are interested in what someone is saying. And people aren‟t looking as much as you think. open and relaxed or whatever you want to communicate. Try using what you can learn from them. Nod when they are talking – nod once in a while to signal that you are listening. Try to relax. Try to loosen up by shaking the shoulders a bit and move them back slightly. Keep your arms and legs open. movie stars or other people you think has good body language. 8. Observe what they do and you don‟t. don‟t take yourself too seriously. 1. If you want to show that you‟re confident in yourself and relaxed lean back a bit. But don‟t overdo it and peck like Woody Woodpecker. People will be a lot more inclined to listen to you if you seem to be a positive person. smile and laugh when someone says something funny. Keeping too much eye-contact might creep people out. give them all some eye contact to create a better connection and see if they are listening. Relax a bit. If you are not used to keeping eye-contact it might feel a little hard or scary in the beginning but keep working on it and you‟ll get used to it. If you sit up straight you will feel more energetic and in control. lean toward the person talking. but don’t stare – If there are several people you are talking to. Have eye contact. Your feelings will actually reinforce your new behaviours and feelings of weirdness will dissipate. Giving no eye-contact might make you seem insecure. Relax your shoulders – When you feel tense it‟s easily winds up as tension in your shoulders. Then try it out. They might move up and forward a bit.Another tip is to close your eyes and visualize how you would stand and sit to feel confident. See yourself move like that version of yourself. practice and monitor yourself to find a comfortable balance. But don‟t lean in too much or you might seem needy and desperate for some approval. That‟s ok. In the beginning easy it‟s to exaggerate your body language. And remember. it . Some of these tips might seem like you are faking something. If you smile a bit more you will feel happier. role models. This goes for your legs too. 4. 6.

you might do the same. Walking slower not only makes you seem more calm and confident. Use your hands more confidently – instead of fidgeting with your hands and scratching your face use them to communicate what you are trying to say. And don‟t let your hands flail around. Mirror . use them with some control. Smile when you are introduced to someone but don‟t keep a smile plastered on your face. it might make you seem insecure and a bit lost. Keep you head up . Let people have their personal space. But don‟t use them to much or it might become distracting. when the two of you get a good connection. 9. However. 18. 16. 14. 11. If he leans forward. Declutter your movements if you are all over the place. you will start to mirror each other unconsciously. Slow down a bit – this goes for many things. it will also make you feel less stressed. don‟t hold anything in front of your heart as it will make you seem guarded and distant. Lower your drink – don‟t hold your drink in front of your chest.Often when you get along with a person. If she holds her hands on her thighs. Use your hands to describe something or to add weight to a point you are trying to make. Especially things like keeping you head up might take time to correct if you have spent thousands of days . You‟ll seem nervous and fidgeting can be a distracting when you try to get something across. phase out or transform fidgety movement and nervous ticks such as shaking your leg or tapping your fingers against the table rapidly. keep a positive. Don’t stand too close –one of the things we learned from Seinfeld is that everybody gets weirded out by a close-talker. they might think that the spine ends where the neck begins and therefore crane the neck forward in a Montgomery Burns-pose. Keep your head up straight and your eyes towards the horizon. you might lean forward. 12. 13. Then weirdness will ensue. 15. You can change your body language but as all new habits it takes a while. Keep a good attitude – last but not least.Don‟t keep your eyes on the ground. don‟t invade it. don‟t snap you‟re neck in their direction. slow down and focus your movements. If someone addresses you. To make the connection better you can try a bit of proactive mirroring. Don’t touch your face – it might make you seem nervous and can be distracting for the listeners or the people in the conversation. Try to relax. Keep you whole spine straight and aligned for better posture.makes you seem nervous and needy. turn it a bit more slowly instead. That means that you mirror the other person‟s body language a bit. How you feel will come through in your body language and can make a major difference. 10. you‟ll seem insincere. For information on how make yourself feel better read 10 ways to change how you feel and for relaxation try A very simple way to feel relaxed for 24 hours. Lower it and hold it beside your leg instead. open and relaxed attitude. But don‟t react instantly and don‟t mirror every change in body language. Don’t fidget – try to avoid. Your spine ends in the back of your head. Realise where you spine ends – many people (including me until recently) might sit or stand with a straight back in a good posture. 17. In fact.

looking at your feet. If you try to change too many things at once it might become confusing and feel overwhelming. Take a couple of these body language bits to work on every day for three to four weeks. By then they should have developed into new habits and something you'll do without even thinking about it. If not, keep at it until it sticks. Then take another couple of things you'd like to change and work on them.
