Oracle BI Applications 7.9: Implementation for Oracle EBS
Activity Guide

D55409GC10 Edition 1.0 October 2008 D56566

Author
Jim Sarokin

Copyright © 2008, Oracle. All rights reserved.

Disclaimer

This document contains proprietary information and is protected by copyright and other intellectual property laws. You may copy and print this document solely for your own use in an Oracle training course. The document may not be modified or altered in any way. Except where your use constitutes "fair use" under copyright law, you may not use, share, download, upload, copy, print, display, perform, reproduce, publish, license, post, transmit, or distribute this document in whole or in part without the express authorization of Oracle.

The information contained in this document is subject to change without notice. If you find any problems in the document, please report them in writing to: Oracle University, 500 Oracle Parkway, Redwood Shores, California 94065 USA. This document is not warranted to be error-free.

Technical Contributors and Reviewers
Dan Hilldale Mitravinda Kolachalam Manmohit Saggi Phillip Scott Kasturi Shekhar Albert Walker Jr.

Editors
Raj Kumar Daniel Milne Joyce Raftery

Restricted Rights Notice

If this documentation is delivered to the United States Government or anyone using the documentation on behalf of the United States Government, the following notice is applicable: U.S. GOVERNMENT RIGHTS The U.S. Government's rights to use, modify, reproduce, release, perform, display, or disclose these training materials are restricted by the terms of the applicable Oracle license agreement and/or the applicable U.S. Government contract.

Trademark Notice

Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners.

Publisher
Giri Venugopal

Contents

Practice 2-1: Matching Oracle Business Analytics Warehouse Components ... 5
Solutions 2-1: Matching the Oracle Business Analytics Warehouse Components ... 7
Practice 2-2: Locating the Oracle Business Analytics Warehouse Components ... 8
Solutions 2-2: Locating Oracle Business Analytics Warehouse Components ... 9
Practice 4-1: Configuring the Training Environment ... 11
Practice 5-1: Exploring Oracle BI ETL Metadata ... 27
Solutions 5-1: Exploring Oracle BI ETL Metadata ... 35
Practice 6-1: Working with Informatica Designer ... 39
Practice 7-1: Creating and Running an Informatica Workflow ... 47
Practice 8-1: Exploring a Prebuilt SDE Mapping ... 51
Practice 8-2: Exploring a Prebuilt SIL Mapping ... 57
Practice 9-1: Exploring the DAC ... 63
Solutions 9-1: Exploring the DAC ... 73
Practice 10-1: Configuring Common Areas and Dimensions Before Running a Full Load ... 77
Practice 10-2: Configuring General Ledger Account Hierarchies ... 80
Practice 10-3: Mapping Oracle GL Natural Accounts to Group Account Numbers ... 85
Practice 10-4: Creating a New Metric Based on a New Group Account Number ... 89
Practice 11-1: Customizing DAC Metadata ... 93
Practice 13-1: Creating a Custom SDE Mapping ... 99
Practice 13-2: Creating a Custom SIL Mapping ... 103
Practice 13-3: Adding DAC Tasks and Running Customized ETL ... 107
Practice 14-1: Adding a New Dimension in the OBAW ... 115
Practice 14-2: Creating an SDE Mapping to Load the Dimension Staging Table ... 118
Practice 14-3: Creating an SIL Mapping to Load the Dimension Table ... 122
Practice 14-4: Creating an SDE Mapping to Load the Fact Staging Table ... 125
Practice 14-5: Creating an SIL Mapping to Load the Fact Table ... 127
Practice 14-6: Adding DAC Tasks and Running Customized ETL ... 130



Lesson 2: Oracle Business Intelligence Applications Architecture Overview

Practice 2-1: Matching Oracle Business Analytics Warehouse Components

Goals: To match the core Oracle Business Analytics Warehouse components with their corresponding functions
Scenario: You have received your Oracle Business Analytics Warehouse software, and you begin by validating your knowledge of its components.
Outcome: You will have a list of the components and terms related to the software you are about to deploy.
Time: 10–15 minutes

Instructions:
1. Match the Oracle Business Analytics Warehouse component names on the left to their descriptions on the right. Write the appropriate letter of the description in the blank.

Component
1. Mapping Designer
2. Oracle Business Analytics Warehouse Database
3. Analytical System
4. Informatica Designer
5. DAC
6. Informatica Repository
7. Dimensional Schema
8. Fact Table
9. Transformations
10. Batch
11. Repository Manager
12. DW Database Server Machine
13. Administrator Workstation Machine
14. Dimension Table
15. ETL
16. Transactional System
17. Mappings

Description
a. Contains the components used to create and administer the data warehouse
b. Extract, transform, and load data
c. Program that runs ETL to load the data warehouse
d. Transform data between source and target
e. Online analytical processing
f. Relational tables, custom-built for the Oracle Business Analytics Warehouse, that store mappings, transformations, and other metadata
g. Enables you to administer the Informatica Repository
h. Enables you to create and modify the Informatica mappings, transformations, and target tables
i. Online transaction processing
j. Contains the data warehouse
k. Format for data that allows for effective querying
l. Table in a dimensional schema that stores descriptions; has a single primary key
m. Central table in a dimensional schema; the only table with multiple joins to other tables
n. Used to run, manage, schedule, and configure ETL
o. Set of instructions for retrieving data, performing computations, and loading data
p. Used to create mappings
q. Database composed of dimensional schemas that stores data warehouse data

Lesson 2: Oracle Business Intelligence Applications Architecture Overview

Solutions 2-1: Matching the Oracle Business Analytics Warehouse Components

Answers:
1. Mapping Designer: p. Used to create mappings
2. Oracle Business Analytics Warehouse Database: q. Database composed of dimensional schemas that stores data warehouse data
3. Analytical System: e. Online analytical processing
4. Informatica Designer: h. Enables you to create and modify the Informatica mappings, transformations, and target tables
5. DAC: n. Used to run, manage, schedule, and configure ETL
6. Informatica Repository: f. Relational tables, custom-built for the Oracle Business Analytics Warehouse, that store mappings, transformations, and other metadata
7. Dimensional Schema: k. Format for data that allows for effective querying
8. Fact Table: m. Central table in a dimensional schema; the only table with multiple joins to other tables
9. Transformations: d. Transform data between source and target
10. Batch: c. Program that runs ETL to load the data warehouse
11. Repository Manager: g. Enables you to administer the Informatica Repository
12. DW Database Server Machine: j. Contains the data warehouse
13. Administrator Workstation Machine: a. Contains the components used to create and administer the data warehouse
14. Dimension Table: l. Table in a dimensional schema that stores descriptions; has a single primary key
15. ETL: b. Extract, transform, and load data
16. Transactional System: i. Online transaction processing
17. Mappings: o. Set of instructions for retrieving data, performing computations, and loading data

Lesson 2: Oracle Business Intelligence Applications Architecture Overview

Practice 2-2: Locating the Oracle Business Analytics Warehouse Components

Goals: To determine the recommended location for each core component
Scenario: You have received your Oracle Business Analytics Warehouse software, and you begin by validating your knowledge of its components.
Outcome: You will have a list of the component locations for the Oracle Business Analytics Warehouse.
Time: 10–15 minutes

Instructions:
1. Using the list provided, place a check mark in the appropriate column for the recommended locations for setup and installation of each item.

Component | OBAW Database Component | ETL Repositories Component | ETL Servers Component | ETL Clients Component
DAC Client | | | |
DAC Server | | | |
DAC Repository | | | |
Oracle Business Analytics Warehouse Database | | | |
Informatica Integration Services | | | |
Informatica Repository Service | | | |
Informatica Workflow Manager | | | |
Informatica Client (Repository Manager, Designer, and so on) | | | |
Informatica Repository | | | |
Oracle Business Analytics Warehouse Tables | | | |

Lesson 2: Oracle Business Intelligence Applications Architecture Overview

Solutions 2-2: Locating Oracle Business Analytics Warehouse Components

Answers:
1. Using the list provided, place a check mark in the appropriate column for the recommended locations for setup and installation of each item.

Component | OBAW Database Component | ETL Repositories Component | ETL Servers Component | ETL Clients Component
DAC Client | | | | X
DAC Server | | | X |
DAC Repository | | X | |
Oracle Business Analytics Warehouse Database | X | | |
Informatica Integration Services | | | X |
Informatica Repository Service | | | X |
Informatica Workflow Manager | | | | X
Informatica Client (Repository Manager, Designer, and so on) | | | | X
Informatica Repository | | X | |
Oracle Business Analytics Warehouse Tables | X | | |


Lesson 4: Installing Oracle BI Applications

Practice 4-1: Configuring the Training Environment

Goals: To configure the training environment before you populate and customize the Oracle Business Analytics Warehouse
Scenario: The Oracle Business Intelligence platform and Oracle BI Applications have already been installed in this training environment. This includes Informatica PowerCenter, the Data Warehouse Administration Console (DAC), two database schemas (DAC and INFA) that were created during the installation process, and one database schema (BIAPPS) that is included as part of the training environment:
DAC: contains DAC repository tables
INFA: contains Informatica repository tables
BIAPPS: contains Oracle E-Business Suite source data
In this practice you perform additional, post-installation configuration of the DAC and Informatica in your environment. Being familiar with these tasks is also a convenient debugging aid, because most configuration issues arise from the steps performed in this practice.
Time: 40–50 minutes

Instructions:
1. To review the steps that were used to install the Oracle BI components, double-click the Installing shortcut on the desktop to view demonstrations of the Oracle BI Applications and Informatica installation processes. When you are done with the demonstrations, close the browser.
2. Modify the Informatica PowerCenter initialization file to disable validation of data code pages. The validation is disabled here because the code page is identical for both the source and target databases in this training environment; it is not necessary to validate the code page when data is moved between compatible source and target databases.
a. In Windows Explorer, navigate to C:\Informatica\PowerCenter8.1.1\client\bin.
b. Double-click powrmart.ini to open the file using Notepad.
c. Locate the following entry at the end of the file:
[Code Pages]
ValidateDataCodePages=Yes
d. Disable the validation of data code pages by changing the value to No:
[Code Pages]
ValidateDataCodePages=No
e. Select File > Save to save the powrmart.ini file after the modification.
f. Select File > Exit.
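If you want to confirm the change later (for example, while troubleshooting a code-page error), the setting can be read back programmatically. The following Python 3 sketch is an optional aid and not part of the official steps; it assumes Python is available on the training machine and uses the powrmart.ini path from step 2a.

    import configparser

    # Path to the PowerCenter client initialization file edited in step 2 (training environment path).
    INI_PATH = r"C:\Informatica\PowerCenter8.1.1\client\bin\powrmart.ini"

    config = configparser.ConfigParser()
    try:
        config.read(INI_PATH)
        value = config.get("Code Pages", "ValidateDataCodePages", fallback="<missing>")
    except configparser.Error:
        # powrmart.ini is not always strict INI, so fall back to a plain text scan.
        value = next((line.split("=", 1)[1].strip()
                      for line in open(INI_PATH)
                      if line.strip().startswith("ValidateDataCodePages")), "<missing>")

    print("ValidateDataCodePages =", value)  # Expected value after step 2d: No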

3. Identify the machine name. You will be using this name throughout this course, referred to hereafter as <machine name>.
a. On the desktop, right-click the My Computer icon and select Properties.
b. Select the Computer Name tab and make a note of the machine name.
c. Click OK to close the System Properties window.
4. Copy source files and lookup files. You need to copy source files and lookup files from the Oracle Business Intelligence Applications installation directory to the Informatica directory on the Informatica PowerCenter Services machine. In this training environment, everything is installed on one machine, so you copy between directories.
a. Navigate to C:\OracleBI\dwrep\Informatica\SrcFiles and copy all the source files in the directory.
b. Paste all the source files into C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles. Replace any existing files.
c. Copy all the lookup files from C:\OracleBI\dwrep\Informatica\LkpFiles and paste them into C:\Informatica\PowerCenter8.1.1\server\infa_shared\LkpFiles.
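If you later need to repeat the copy operations in step 4 (or script them for another environment), a minimal Python 3 sketch along the following lines performs the same copies. It is illustrative only; the paths are the ones listed above, and the source directories are assumed to exist and to contain flat files only.

    import os
    import shutil

    # (source directory, destination directory) pairs from steps 4a-4c.
    COPIES = [
        (r"C:\OracleBI\dwrep\Informatica\SrcFiles",
         r"C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles"),
        (r"C:\OracleBI\dwrep\Informatica\LkpFiles",
         r"C:\Informatica\PowerCenter8.1.1\server\infa_shared\LkpFiles"),
    ]

    for src_dir, dest_dir in COPIES:
        for name in os.listdir(src_dir):
            # Overwrite any existing files, as the practice instructs.
            shutil.copy2(os.path.join(src_dir, name), os.path.join(dest_dir, name))
            print("copied", name, "->", dest_dir)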

5. Restore the Oracle Business Intelligence prebuilt repository. An Informatica repository file named Oracle_BI_DW_Base.rep is installed into the OracleBI\dwrep\Informatica\Repository directory during the Oracle Business Intelligence Applications installation. You use the restore option in the Informatica PowerCenter Administration Console to load this prebuilt Oracle_BI_DW_Base repository.
a. Navigate to C:\OracleBI\dwrep\Informatica\Repository.
b. Copy the file Oracle_BI_DW_Base.rep and paste it into C:\Informatica\PowerCenter8.1.1\server\infa_shared\Backup.
c. Double-click the Services icon on the desktop and verify that the service Informatica Services 8.1.1 SP4 is started. If it is not started, start it and wait a minute before continuing with the next step.
d. Select Start > Programs > Informatica PowerCenter 8.1.1 > Services > Launch Admin Console.
e. Log in as admin with password admin.
f. In the left pane, select the Oracle_BI_DW_Base repository service that was created during the installation process.
g. In the right pane, select the Properties tab.
h. Click Edit in the General properties area.
i. Set the Operating Mode to Exclusive and click OK.
j. Click Yes when prompted to restart the repository service. After a few minutes, verify that the service is running.
k. Select Actions > Delete Contents.
l. In the Delete contents for Oracle_BI_DW_Base dialog box, enter Administrator as the repository username and Administrator as the password, and then click OK.
m. Wait for a Succeeded message. The repository now has no content.
n. Select Actions > Restore Contents.

o. In the Restore Contents dialog box, select Oracle_BI_DW_Base from the Select backup file list and select the Restore as new check box.
p. Verify that Complete is selected.
q. Click OK to start the restore process. This takes about 5-10 minutes to complete.
r. Verify that the restore completes successfully. The screenshot below shows a partial view.
s. Click Save to save the log file to the desktop.
t. Click Close.
6. Promote the repository to a global repository. When a repository is restored, the repository becomes a stand-alone repository. After restoring the repository, you need to promote it to a global repository.
a. In the Properties tab, click Edit in the General properties area.

b. Change the OperatingMode value to Normal.
c. Click OK.
d. If prompted, enter Administrator for the repository username and password.
e. Verify that the repository service Oracle_BI_DW_Base is enabled and running in normal mode.
7. Set PowerCenter Integration Services custom properties.
a. In the left pane, select Oracle_BI_DW_Base_Integration_Service.
b. In the right pane, display the Properties tab.
c. Scroll down to the Custom properties area and click Edit.
d. Create the custom properties in the table below by clicking Add to display new Name and Value fields. Be sure to leave a space between biapps@orcl and obaw@orcl:

Custom Properties Name | Custom Properties Value
SiebelUnicodeDBFlag | No
ServerPort | 4006
overrideMpltVarWithMapVar | Yes
SiebelUnicodeDB | biapps@orcl obaw@orcl

e. Click OK. You should receive a message that integration service properties were updated.
f. Log out of the Administration Console and close the browser.
8. Set up the DAC client.
a. Copy Hibernate libraries to the appropriate DAC directories. To run the DAC client or DAC server, you need to have libraries from an open source software product called Hibernate. Hibernate libraries are not installed as part of Oracle Business Intelligence Applications 7.9 but have to be downloaded from the Hibernate Web site (http://www.hibernate.org). Oracle recommends that you download Hibernate Core Package Version 3.2.x GA or later. Newer versions of Hibernate Core Package 3.2 are now available (for example, Hibernate Core Package Version 3.2.5 GA); DAC is supported on the libraries of these versions also.

For this training, the Hibernate libraries have already been downloaded to your training environment.
i. Navigate to C:\PracticeFiles\hibernate-3.2.
ii. Copy the Hibernate files from the C:\PracticeFiles\hibernate-3.2 directory to the C:\OracleBI\DAC directory as described in the following table. Copy only the files described in the table; you do not need to copy any of the other files in the C:\PracticeFiles\hibernate-3.2 directory.

Files | Copy from | Copy to
*.jar | \hibernate-3.2\lib | \DAC\lib
hibernate3.jar | \hibernate-3.2 | \DAC\lib
hibernate-configuration-3.0.dtd | \hibernate-3.2\src\org\hibernate | \DAC
hibernate-mapping-3.0.dtd | \hibernate-3.2\src\org\hibernate | \DAC

b. Install JDBC drivers for DAC database connectivity. You must install the appropriate JDBC driver in the DAC\lib directory to enable DAC database connectivity. In this training environment you copy the driver from the Oracle database directory.
i. Navigate to C:\oracle\product\11.1.0\db_1\sqldeveloper\jdbc\lib.
ii. Copy ojdbc14.jar.
iii. Paste ojdbc14.jar into C:\OracleBI\DAC\lib.
c. Open the DAC config.bat file and verify the connection configuration for the DAC repository:
i. In the C:\OracleBI\DAC directory, right-click config.bat and select Edit.
ii. Verify that the DAC_HOME variable points to the directory where DAC is installed. In this training environment it should be set to DAC_HOME=C:\OracleBI\DAC. Make sure there are no spaces in the path reference.
iii. Verify that the JAVA_HOME variable points to the directory where the Java SDK is installed. In this training environment it should be set to JAVA_HOME=C:\jdk1.5.0_12.
iv. Close config.bat.
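Because a missing file in the DAC directories is a common cause of DAC client startup errors, it can be worth verifying the results of step 8 before moving on. This optional Python 3 sketch simply checks that the expected files are in place and shows the two config.bat variables for a quick visual check; the paths are the training environment paths used above.

    import os

    DAC_HOME = r"C:\OracleBI\DAC"

    # Files that step 8 should have put in place.
    expected_files = [
        os.path.join(DAC_HOME, "lib", "hibernate3.jar"),
        os.path.join(DAC_HOME, "lib", "ojdbc14.jar"),
        os.path.join(DAC_HOME, "hibernate-configuration-3.0.dtd"),
        os.path.join(DAC_HOME, "hibernate-mapping-3.0.dtd"),
    ]
    for path in expected_files:
        print("OK      " if os.path.isfile(path) else "MISSING ", path)

    # Print the DAC_HOME and JAVA_HOME lines from config.bat.
    with open(os.path.join(DAC_HOME, "config.bat")) as bat:
        for line in bat:
            if "DAC_HOME" in line or "JAVA_HOME" in line:
                print(line.rstrip())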

9. Enable DAC client communication with Informatica PowerCenter. The DAC client uses the Informatica pmrep and pmcmd command-line programs to communicate with Informatica PowerCenter. The DAC client uses pmrep to synchronize DAC tasks with Informatica workflows and to keep the DAC task source and target tables' information up-to-date.
a. Install the Informatica pmcmd and pmrep command-line programs.
i. The pmrep program is installed in the Informatica PowerCenter Client and Informatica PowerCenter Services bin directories. Because of the requirement to co-locate the DAC client with the PowerCenter client, the pmrep program is already available on the machine for the DAC client to use. In this training environment, PowerCenter Services 8.1.1 SP4 has been installed on the same machine as the DAC client and PowerCenter Client 8.1.1 SP4.
ii. The pmcmd program is installed in the PowerCenter Services bin directories. Copy the pmcmd.exe program from the C:\Informatica\PowerCenter8.1.1\server\bin directory to the C:\Informatica\PowerCenter8.1.1\client\bin directory.
b. Set environment variables for the DAC client. In order for the DAC client to be able to use the pmrep and pmcmd programs, you need to define the path of the Informatica domain file, domains.infa.
i. Verify that the domains.infa file is located in C:\Informatica\PowerCenter8.1.1.
ii. On the desktop, right-click My Computer and select Properties.
iii. Select Advanced > Environment Variables.
iv. In the Environment Variables dialog box, under System variables, click New to create a new system variable.
v. Name the variable INFA_DOMAINS_FILE.
vi. Set the variable value to C:\Informatica\PowerCenter8.1.1\domains.infa. The path should include the name of the file.
vii. Click OK to close the dialog box.
viii. In the Environment Variables dialog box, under System variables, select the path system variable and click Edit.
ix. In the Edit System Variable dialog box, add the directory path to the Informatica PowerCenter binaries to the end of the path environment variable. Be sure to include the semicolon before the directory path. The screenshot shows only a partial view:

;C:\Informatica\PowerCenter8.1.1\client\bin
x. Click OK to close the Edit System Variable dialog box.
xi. Click OK to close the Environment Variables dialog box.
xii. Click OK to close the System Properties dialog box.
c. Verify that the DAC client is able to use pmrep and pmcmd.
i. From a Windows command prompt, execute pmrep and then pmcmd. The test is successful if the pmrep and pmcmd prompts appear.
ii. Close the command window.
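If the pmrep or pmcmd prompt does not appear, the environment variables set in step 9b are the usual cause. This optional Python 3 sketch (run it from a new command prompt so that it sees the updated variables) checks both settings.

    import os
    import shutil

    # INFA_DOMAINS_FILE must name the domains.infa file itself, not just its folder.
    domains_file = os.environ.get("INFA_DOMAINS_FILE")
    print("INFA_DOMAINS_FILE =", domains_file)
    print("file exists:", bool(domains_file) and os.path.isfile(domains_file))

    # The PowerCenter client bin directory must be on the path for pmrep and pmcmd to resolve.
    for tool in ("pmrep", "pmcmd"):
        print(tool, "found at:", shutil.which(tool))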

10. Create a DAC connection. A DAC connection is a stored set of login details that enable you to log into the DAC client and connect to the DAC repository.
a. To start the DAC client, select Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client. The DAC client opens.
b. In the Login dialog box, select Configure.
c. In the Configuring dialog box, select Create Connection, and then click Next.
d. Enter the appropriate connection details as specified in the table:

Name | DAC
Connection type | Oracle (Thin)
Instance | orcl
Database Host | <machine name>
Database Port | 1521

e. Click Test and verify that the connection was successfully established.
f. Click Close.
g. Click Finish to save the connection details and return to the login dialog box.
h. If necessary, select the DAC connection from the Connection list.
i. Enter dac as the table owner name and password.
j. Click Test Connection.
k. Select Apply, and click Login.
11. Importing DAC metadata. Note: Typically, when you log into DAC and connect to a DAC repository for the first time, the DAC detects that the DAC schema does not exist in the database and you are asked whether you want to create a repository. For this training, the DAC schema and repository tables have already been created in your training environment, so you are not prompted to create the DAC repository. Typically, the next step would be to use the DAC client to import DAC metadata into the DAC repository schema. As part of this process, you would specify the source system applications (Oracle 11.5.10, Siebel 8.0, and so on) for which you import the ETL metadata. Because importing DAC metadata is a time-consuming process, this step has been omitted from the training; the DAC metadata for the Oracle 11.5.10 and Siebel 8.0 source systems has already been imported into the DAC repository in your training environment. For more information on this process, please refer to the section on importing the DAC metadata in the Oracle Business Intelligence Applications Fusion Edition Installation and Configuration Guide.
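The DAC connection above uses the Oracle thin (JDBC) driver, so connection failures are almost always a host, port, instance, or credential problem. As an optional cross-check outside the DAC client, the following Python 3 sketch attempts the same login. It assumes the cx_Oracle driver is installed (it is not part of the training image) and that you substitute your machine name for <machine name>.

    import cx_Oracle

    # Same details as the DAC connection: dac/dac against the orcl instance.
    dsn = cx_Oracle.makedsn("<machine name>", 1521, sid="orcl")

    try:
        conn = cx_Oracle.connect("dac", "dac", dsn)
        print("Connection to the DAC repository schema succeeded (Oracle", conn.version + ")")
        conn.close()
    except cx_Oracle.DatabaseError as exc:
        print("Connection failed:", exc)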

12. Create the Oracle Business Analytics Warehouse tables. The Oracle Business Analytics Warehouse tables are created by the DAC client. Before starting this procedure, you must create a database schema with a role named SSE_ROLE for the data warehouse; a database schema named OBAW with the role SSE_ROLE has already been created for you in this training environment. The DAC client uses ODBC connections to the Oracle Business Analytics Warehouse database for this procedure; an ODBC connection named OBAW has already been created for you in this training environment.
a. In the DAC client, select Tools > ETL Management > Configure.
b. In the Sources dialog box, select Oracle as the database platform for both the target data warehouse and the source transactional database.
c. Click OK to display the Data Warehouse Configuration Wizard.
d. Select the Create Data Warehouse Tables check box and click Next. The Data Warehouse tab becomes active.
e. Enter the appropriate details of the database schema in which you want to store the data warehouse, as specified in the following table. Leave the Container field blank; if you leave it blank, DAC creates a container by default for the source business applications (Oracle 11.5.10, Siebel 8.0, and so on) that you selected when you imported the seed data into the DAC metadata repository.

Container | leave empty for all containers
Table Owner | obaw
Password | obaw
ODBC Data Source | OBAW
Data Area | obaw_data
Index Area | obaw_index

f. Click Start. The Run Status tab displays information about the process, including details of any conflicts between containers.
g. After a minute or two, verify that a "Success" message is displayed, indicating that the data warehouse tables have been created. If a "Failure" message is displayed, the data warehouse tables have not been created and the createtables.log file is not generated; use the log information in \OracleBI\DAC\config\generate_ctl.log to diagnose the error. Ask your instructor if you need assistance.
h. If you want to see log information about the process, use the following log files:
\OracleBI\DAC\config\createtables.log: a log of the ddlimp process
\OracleBI\DAC\config\generate_ctl.log: a log of the schema definition process
i. Click Finish.
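When the wizard reports a failure, the two log files listed in step 12h are the quickest way to find the cause. This optional Python 3 sketch prints any suspicious lines from them; the paths are the ones given above, and the keyword list is only a reasonable starting filter.

    import os

    LOG_DIR = r"C:\OracleBI\DAC\config"
    KEYWORDS = ("error", "fail", "ora-")  # case-insensitive markers worth reviewing

    for name in ("createtables.log", "generate_ctl.log"):
        path = os.path.join(LOG_DIR, name)
        print("---", name, "---")
        if not os.path.isfile(path):
            print("not generated")
            continue
        with open(path, errors="replace") as log:
            for lineno, line in enumerate(log, start=1):
                if any(key in line.lower() for key in KEYWORDS):
                    print(lineno, line.rstrip())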

13. Configure the connection between the DAC server and the DAC repository. The DAC repository that you connect to using the DAC client is the one that stores the DAC server repository connection information that you specify in this procedure. Because the training environment uses Windows, you can use the DAC client to configure a DAC server that runs in the same \DAC folder.
a. In the DAC client, select Tools > DAC Server Management > DAC Server Setup.
b. Click Yes in the Server Setup message to display the Server Configuration dialog box.
c. In the Repository Connection Information tab, enter the appropriate information. Because the DAC server is running on the same machine as the DAC client, click Populate from preconfigured client connection to populate the fields with connection details from the DAC client:

Connection type | Oracle (Thin)
Instance | orcl
Database Host | <machine name>
Database Port | 1521
Table owner name | dac
Password | dac

d. Click Save.
e. Click Test Connection to make sure the DAC repository connection works. You should receive a message "Connection was successfully established!"
f. Click OK and verify the server configuration information. (For this training, skip the Email Configuration.)
g. Click Close.
14. Set DAC system properties. You need to set these properties to ensure proper integration between the DAC client, the DAC server, and Informatica.
a. In the DAC client, click the Setup button to display the Setup view.
b. Display the DAC System Properties tab.
c. Set values for the following properties:

InformaticaParameterFileLocation | C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles
Main Informatica Repository | Oracle_BI_DW_Base
Repository Name | DAC
DAC Server Host | <machine name>

d. Click Save.
15. Register the Informatica Services in the DAC.
a. Click the Informatica Servers tab.
b. Register the Informatica Integration Services service by modifying the record with Name = Oracle_BI_DW_Server. Select Oracle_BI_DW_Server in the top pane and then modify the fields in the Edit subtab. Accept all defaults except for the following:

Server Hostname | <machine name>
Password | Administrator

c. Click Save.
d. Register the Informatica repository service by modifying the record with Name = INFORMATICA_REP_SERVER. Accept all defaults except for the following:

Server Hostname | <machine name>
Password | Administrator

e. Click Save.
16. Set the transactional and data warehouse physical data sources in the DAC.
a. In the DAC client, click Setup to open the Setup view, and then click the Physical Data Sources tab. The Physical Data Sources tab displays a pre-created record for the data warehouse with the name DataWarehouse and one or more records for the OLTP sources. The records that are created by DAC for the OLTP sources depend on the business application source systems you selected when importing the DAC metadata. In this training environment you should see ORA_11_5_10 and SEBL_80 sources.
b. Select DataWarehouse in the top pane.
c. Select the Edit subtab and accept all defaults except for the following:

Instance | orcl
Table Owner | obaw
Table Owner Password | obaw
DB Host | <machine name>
Port | 1521
Default Index Space | OBAW_INDEX

d. Click Save.
e. Click Test Connection to test the connection, and then click OK. You should receive the message "Connection to DataWarehouse successfully established!" It may take a moment for the connection to be established.

f. Select ORA_11_5_10.
g. Select the Edit subtab and accept all defaults except for the following:

Connection type | Oracle (Thin)
Instance | orcl
Table Owner | biapps
Table Owner Password | biapps
DB Host | <machine name>
Port | 1521

h. Click Save.
i. Test the connection, and then click OK. You should receive the message "Connection to ORA_11_5_10 successfully established!"
17. Start the DAC server. Restart the Informatica Services 8.1.1 SP4 service, restart the DAC client, and test the Informatica connections.
a. Close the DAC client.
b. Restart the Informatica Services 8.1.1 SP4 service.
c. Start the DAC server by selecting Start > Programs > Oracle Business Intelligence > Oracle DAC > Start DAC Server. The command window flashes.
d. Restart the DAC client by selecting Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client, and log in to the DAC connection as dac with the password dac.
e. Click Setup to open the Setup view, and then click the Informatica Servers tab.
f. Select Oracle_BI_DW_Server in the upper pane.
g. Click Test Connection in the lower pane to test the connection, and then click OK. You should receive a message "Connection to Oracle_BI_DW_Server successfully established!"
h. Select INFORMATICA_REP_SERVER.
i. Click Test Connection to test the connection, and then click OK. You should receive a message "Connection to INFORMATICA_REP_SERVER successfully established!"
j. Confirm that the DAC client is connected to the DAC server by looking at the DAC server monitor icon in the upper-right corner of the DAC client. The icon color should be orange, indicating that the DAC client is connected to the DAC server and the DAC server is idle. When you mouse over the icon, you should see the message "DAC Server is idle." Wait until the icon turns orange before proceeding to the next step.
18. Configure the relational connections in the Informatica Workflow Manager.
a. Verify that the Informatica Services 8.1.1 SP4 service is started.
b. To log in to Workflow Manager, select Start > Programs > Informatica PowerCenter 8.1.1 > Client > PowerCenter Workflow Manager.
c. In the Welcome window, deselect Show this message at startup and click OK.
d. Select Repositories in the Repository Navigator.
e. In the menu, select Repository > Add to display the Add Repository dialog box.
f. In the Add Repository dialog box, enter Oracle_BI_DW_Base in the Repository field and Administrator in the Username field. Click OK to save the details.
g. Select Repository > Connect to display the Connect to Repository dialog box.
h. Use the Domain list to select the domain (Domain_<machine name>). Please note: If no domain is visible in the list, click Add to display the Add Domain dialog box. (If the Connection Settings area is not displayed, click More.) At the Add Domain dialog box, specify the name of the domain that was created when you installed Informatica PowerCenter Services (for example, Domain_<machine name>), the fully qualified host name for the gateway host (for example, <machine name>), and the port for the gateway port (for example, 6001). Click OK.
i. Enter Administrator as the password.

j. Click Connect. The Repository Navigator in the left pane should look similar to the screenshot.
k. Select Connections > Relational to display the Relational Connection Browser. You need to create a connection for each transactional (OLTP) database and a connection for the Oracle Business Analytics Warehouse (OLAP) database.
l. Click New to display the Select Subtype dialog box, select Oracle, and click OK.
m. Enter the details for the DataWarehouse connection object as specified in the following table. You must specify DataWarehouse exactly as it appears in the Physical Data Sources tab in the DAC Setup view.

Name | DataWarehouse
User Name | obaw
Password | obaw
Connect String | orcl

n. Click OK.
o. Click New again and repeat the steps to create the ORA_11_5_10 connection object. Again, you must specify the OLTP connection ORA_11_5_10 exactly as it appears in the Physical Data Sources tab in the DAC Setup view.

Name | ORA_11_5_10
User Name | biapps
Password | biapps
Connect String | orcl

p. Click OK.
q. Click Close.
r. Select Repository > Exit to close Workflow Manager. Leave the DAC client open for the next practice.
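Because the ETL workflows fail at run time if either relational connection has the wrong credentials, a quick login check can save a failed run later. This optional Python 3 sketch tries both logins used above; as with the earlier sketch, it assumes the cx_Oracle driver is installed and that <machine name> is replaced with your machine name.

    import cx_Oracle

    dsn = cx_Oracle.makedsn("<machine name>", 1521, sid="orcl")

    # Relational connection name -> (user, password), matching the tables above.
    connections = {
        "DataWarehouse": ("obaw", "obaw"),
        "ORA_11_5_10": ("biapps", "biapps"),
    }

    for name, (user, password) in connections.items():
        try:
            conn = cx_Oracle.connect(user, password, dsn)
            print(name + ": login as " + user + " succeeded")
            conn.close()
        except cx_Oracle.DatabaseError as exc:
            print(name + ": login as " + user + " failed:", exc)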


Expand the SDE_ORA11510_Adaptor folder in the Repository Navigator.10. If the Welcome dialog box appears. focus on the metadata objects and their general use in the ETL process. 2. c. and naming conventions You have installed and configured the software components required to run ETL for the Oracle Business Analytics Warehouse.9: Implementation for Oracle EBS 27 . you explore some of the prebuilt metadata used to perform ETL.1. Double-click Oracle_BI_DW_Base. 1. b. At this point. Rather. Enter Administrator as the username and password and click Connect. Start the Informatica Designer tool and connect to the Informatica Repository. you use Informatica tools and the DAC client to explore some of the prebuilt metadata used in the Oracle Business Analytics Warehouse ETL process. d. which are all covered in more detail in subsequent lessons. 20–30 minutes Scenario Outcome Time Instructions: In this exercise.5. This folder contains the Informatica repository source dependent extract (SDE) metadata for the Oracle E-Business Suite application. Explore an SDE adaptor folder.1 SP4 is started. Select Start > Programs > Informatica PowerCenter 8. deselect Show this message at startup and click OK. a. Verify that the service Informatica Services 8. processes. a. e. do not concern yourself with specific details about the metadata. or the DAC client. You will be able to relate the fundamental steps of the Oracle Business Analytics Warehouse ETL process to prebuilt metadata in the Oracle Informatica Repository and the DAC repository. version 11.1 > Client > PowerCenter Designer. Before loading and customizing the data warehouse.1. Oracle BI Applications 7. the Informatica tools.Lesson 5: Understanding the ETL Process Practice 5-1: Exploring Oracle BI ETL Metadata Goals To explore some of the prebuilt Oracle Business Intelligence ETL metadata to gain a highlevel understanding of some of the key elements.

or pass data. you can ignore the other subfolders in the Sources folder. target definitions. c.) d.9: Implementation for Oracle EBS . (For this training. Target definition objects provide detailed descriptions of objects or files that contain target data in mappings. modify. Expand the mapping SDE_ORA_GLRevenueFact. Mappings contain a set of mapplets. The Mappings folder contains the Informatica repository mapping objects for this adaptor. Transformation objects are used in mappings to generate. Based on the name of this mapping. and load data. Expand the Mappings folder and locate the mapping SDE_ORA_GLRevenueFact. Is this table located in the transactional database or the data warehouse? 28 Oracle BI Applications 7. Expand the Targets folder. What is the target table for this mapping? c. transform. or pass data. Expand the OLTP and OLAP subfolders. Expand the Mapplets folder. The Mapplets folder contains the Informatica repository mapplet objects for this adaptor. a. g. The Transformations folder contains the Informatica repository transformation objects for this adaptor. Expand the Mappings folder. Explore an SDE mapping. Source definition objects provide detailed descriptions of tables or files that provide source data in mappings. The Targets folder contains the Informatica repository target definition objects for this adaptor. Mapplets are reusable objects that are used in mappings in the same way as transformations to generate. modify. e. source definitions. These subfolders contain the Informatica repository source definition objects for the transactional (OLTP) and data warehouse (OLAP) databases for this adaptor.Lesson 5: Understanding the ETL Process b. 3. f. and transformations used to extract. what do you surmise is the purpose of this mapping? b. Expand the Transformations folder. Expand the Sources folder.

a. Notice that there are two lookup transformations for this mapplet: LKP_CUSTLOC_CUST_LOC_ID and LKP_LOC_CURRENCY. g. RA_CUSTOMER_TRX_LINES_ALL. Working with Informatica Designer. and Lesson 8. That is because this mapplet contains lookup transformations. application. and output the data to the next transformation object in the SDE mapping. Explore SDE mapplets. Are these sources located in the transactional database or the data warehouse database? d. These mapplets are transformation objects in the SDE mapping. Instead. Expand the Transformation Instances folder. c. This folder contains the Informatica Repository source independent load (SIL) metadata. b. which can look up data in a flat file or a relational table. Working with Informatica Designer. and Lesson 8. RA_CUST_TRX_LINE_GL_DIST_ALL. Exploring SDE and SIL Mappings. Non-Siebel SDE adapter folders typically do not expose sources directly in the mappings. These are extract mapplets that may contain relational. You learn more about this in the next lesson. Expand Oracle_BI_DW_Base > SILOS. they use the concept of Business Component mapplets. Expand the Source Instances folder and notice there are four source definitions for this mapplet: RA_CUSTOMER_TRX_ALL. Locate and expand the mapplet mplt_SA_ORA_GLRevenueFact. You learn more about this in the next lesson. f. Scroll to locate the mapplet mplt_BC_ORA_GLRevenueFact. or synonym. 4. 5. What type of table is this? e. make any necessary transformations to the data. Expand the Transformation Instances folder. Notice there are no source instances for this mapplet. and RA_SALESREPS_ALL. e.Lesson 5: Understanding the ETL Process d. view. They are used to extract data from an Oracle source. Expand SDE_ORA11510_Adaptor > Mapplets to examine the two mapplets for the SDE_ORA_GLRevenueFact mapping. f. This folder contains the additional transformation objects for this mapplet. Notice that there are no source instances for this mapping.9: Implementation for Oracle EBS 29 . a. or flat file sources. Expand the Transformation Instances folder and notice that there are two mapplets defined for this mapping: mplt_BC_ORA_GLRevenueFact and mplt_SA_ORA_GLRevenueFact. Exploring SDE and SIL Mappings. Notice that the SILOS folder contains all of the Oracle BI Applications 7. Explore the SILOS folder.

Expand the Transformation Instances folder and notice that there are many more transformation objects associated with an SIL mapping than with an SDE mapping. what do you surmise is the purpose of this mapping? b. Expand SIL_GLRevenueFact > Source Instances. Can you think of any instances where a Source Independent Loading mapping would include a source table in the transactional database? 30 Oracle BI Applications 7. What type of table is this? e. i. Based on the name of this mapping. b. Explore an SIL mapping. What type of table is this? g. Expand the Mappings folder and locate the mapping SIL_GLRevenueFact. a. You explore these transformations in more detail in Lesson 8: Exploring SDE and SIL Mappings. Which SDE mapping populates this table? h.Lesson 5: Understanding the ETL Process same subfolders contained in an SDE_*_Adapter folder. Notice that one of the sources for this mapping is W_GL_REVN_FS. Is this table located in the transactional database or the data warehouse? d.9: Implementation for Oracle EBS . Expand the Sources folder. Is this table located in the transactional database or the data warehouse? f. Are the sources listed here in the transactional database or the data warehouse? 6. What is the target table for this mapping? c. Expand SIL_GLRevenueFact > Target Instances.

Full mode refers to data loaded for the first time or data that is truncated and then loaded. This workflow also contains instructions on how to execute the task for the SDE_ORA_GLRevenueFact mapping. b. select Tools > Workflow Manager. Notice that there is another workflow named SDE_ORA_GLRevenueFact_Full. Notice that the mapping for this task is SDE_ORA_GLRevenueFact. a. Explore workflows in Informatica Workflow Manager. which extracts and moves data from source tables to the W_GL_REVN_FS fact staging table.Lesson 5: Understanding the ETL Process 7. Notice there are two tasks for this workflow: a Start task and a task named SDE_ORA_GLRevenueFact. Oracle BI Applications 7. click the Object tab. This workflow contains instructions on how to execute the task for the mapping SDE_ORA_GLRevenueFact. In the Repository Navigator. d. If necessary. A table can be loaded in full mode or incremental mode. Right-click the SDE_ORA_GLRevenueFact task (not the workflow) and select Properties. which extracts and moves data from source tables to the W_GL_REVN_FS fact staging table.9: Implementation for Oracle EBS 31 . e. In Informatica PowerCenter Designer. expand SDE_ORA11510_Adapter > Workflows > SDE_ORA_GLRevenueFact to view the components of the SDE_ORA_GLRevenueFact workflow. Incremental mode refers to new or changed data being added to the existing data. f. c. Click Cancel to close the Properties dialog box. g.

Select Repository > Exit to close Workflow Manager. k. select the Oracle 11. Click Cancel to close the Properties dialog box. a. Notice that there is another workflow named SIL_GLRevenueFact_Full. follow the instructions below to start the DAC client. 32 Oracle BI Applications 7. click the Object tab. m. which loads data from the fact staging table and other data warehouse tables into the W_GL_REVN_F fact table. Notice that there are two tasks for this workflow: a Start task and a task named SIL_GLRevenueFact.Lesson 5: Understanding the ETL Process h. o. expand SILOS > Workflows > SIL_GLRevenueFact to view the components of the SIL_GLRevenueFact workflow. If necessary.5. If necessary. In the Repository Navigator.10 container in the list. Click the Tasks tab in the upper-right corner (not the Tasks subtab). i. Right-click the SIL_GLRevenueFact task (not the workflow) and select Properties. Explore DAC SDE tasks. b. a. If not. which should still be open from the previous set of practices. c. b. It may take a few seconds for the DAC client to open. Select Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client. Return to the DAC client.9: Implementation for Oracle EBS . 9. n. j. Log in to the DAC connection with table owner name dac and password dac (these should be the default login values). Select Repository > Exit to close Designer. 8. Notice that the mapping for this task is SIL_GLRevenueFact. click Design in the upper-left corner of the DAC client to open the Design view. l. This workflow contains instructions on how to execute the task for the SIL_GLRevenueFact mapping. If necessary.

Click Go and verify that the SIL_GLRevenueFact task is returned by the query and is the only task visible in the upper window. e. Based on your exploration of workflows in Informatica Workflow Manager. Which Informatica mapping is executed by the workflows that are called and run by this DAC task? Oracle BI Applications 7. h. Click Go and verify that the SDE_ORA_GLRevenueFact task is returned by the query and is the only task visible in the upper window. what is the target table for this DAC task? l. click the Edit subtab. Notice that the execution type for this task is Informatica. which call and run Informatica workflows in the Informatica repository. Click the Query button. which uses SDE workflows and mappings to extract data from source transactional tables and to load fact staging tables in the Oracle Business Analytics Warehouse (OBAW). c. Most DAC tasks are Informatica task types. f. To verify your answer. enter SIL_GLRevenueFact. a. m. Click the Source Tables subtab to view the source tables for this DAC task.Lesson 5: Understanding the ETL Process d. click the Target Tables subtab. i. In the Name field. f. d. g. Which Informatica mapping is executed by the workflows that are called and run by this DAC task? k. which Informatica workflows are called and run by this DAC task? j. Based on your exploration of workflows in Informatica Workflow Manager. which Informatica workflows are called and run by this DAC task? g. e. Recall that this is the SIL mapping you explored in Informatica Designer earlier in this practice. which call and run Informatica workflows in the Informatica repository. enter SDE_ORA_GLRevenueFact. b. If necessary. Notice that the task phase for this task is Extract Fact.9: Implementation for Oracle EBS 33 . Notice that the execution type for this task is Informatica. Most DAC tasks are Informatica task types. Based on your exploration of this SDE_ORA_GLRevenueFact mapping in Informatica Designer. Click the Query button. Explore DAC SIL tasks. 10. Notice that the task phase for this task is Load Fact. click the Edit subtab. which uses SIL workflows and mappings to load data from fact staging tables into fact tables in the OBAW. If necessary. In the Name field.

To verify your answer. Click the Source Tables subtab to view the source tables for the SIL_GLRevenueFact DAC task. click the Target Tables subtab. What is the primary source table for the SIL_GLRevenueFact task? l. 34 Oracle BI Applications 7. What type of table is this? m. Select File > Close to close the DAC client. what is the target table for this DAC task? i. j.9: Implementation for Oracle EBS .Lesson 5: Understanding the ETL Process h. k. Based on your exploration of this SIL_GLRevenueFact mapping in Informatica Designer. Which DAC task loads this table? n.

c. what do you surmise is the purpose of this mapping? This source dependent extract (SDE) mapping extracts Oracle source data that is used to load the general ledger revenue fact table. Expand the Mappings folder and locate the mapping SDE_ORA_GLRevenueFact. Based on the name of this mapping. 6. RA_CUSTOMER_TRX_LINES_ALL. what do you surmise is the purpose of this mapping? The purpose of this mapping is to load the general ledger revenue fact table in the data warehouse. 6.b.a. Are these sources located in the transactional database or the data warehouse database? Transactional database Expand the Sources folder. .e.Lesson 5: Understanding the ETL Process Solutions 5-1: Exploring Oracle BI ETL Metadata Answers: 3. What is the target table for this mapping? W_GL_REVN_F Is this table located in the transactional database or the data warehouse? Data warehouse What type of table is this? Fact table Expand SIL_GLRevenueFact > Source Instances. 3.d. Is this table located in the transactional database or the Oracle BI Applications 7. RA_CUST_TRX_LINE_GL_DIST_ALL.b. Are the sources listed here in the transactional database or the data warehouse? Data warehouse Expand the Mappings folder and locate the mapping SIL_GLRevenueFact. Notice that one of the sources for this mapping is W_GL_REVN_FS. 6.c. What is the target table for this mapping? W_GL_REVN_FS Is this table located in the transactional database or the data warehouse? Data warehouse What type of table is this? Fact staging table Expand the Source Instances folder and notice there are four source definitions for this mapplet: RA_CUSTOMER_TRX_ALL.a. and RA_SALESREPS_ALL.b. 6. 5.d. Expand the mapping SDE_ORA_GLRevenueFact.c. 3. Expand SIL_GLRevenueFact > Target Instances.9: Implementation for Oracle EBS 35 3. 6. Based on the name of this mapping. 4.

6. 10.j.f. 9. 10. 10. 10.h. Notice that the execution type for this task is Informatica.g. which Informatica workflows are called and run by this DAC task? SDE_ORA_GLRevenueFact and SDE_ORA_GLRevenueFact_Full Which Informatica mapping is executed by the workflows that are called and run by this DAC task? SDE_ORA_GLRevenueFact Based on your exploration of this SDE_ORA_GLRevenueFact mapping in Informatica Designer.i. which call and run Informatica workflows in the Informatica repository. which call and run Informatica workflows in the Informatica repository. What type of table is this? Fact staging table Which SDE mapping populates this table? SDE_ORA_GLRevenueFact Can you think of any instances where a Source Independent Loading mapping would include a source table in the transactional database? Source Independent Loading mappings are used to process data in the staging tables and other data warehouse tables and load it into the ultimate target tables. Based on your exploration of workflows in the Informatica Workflow Manager. 9.i. Most DAC tasks are Informatica task types. 6.Lesson 5: Understanding the ETL Process data warehouse? Data Warehouse 6. Based on your exploration of workflows in Informatica Workflow Manager.f.l. what is the target table for this DAC task? W_GL_REVN_FS Notice that the execution type for this task is Informatica. so typically these mappings would not include a source table in transactional database. 36 Oracle BI Applications 7.g.k. what is the target table for this DAC task? W_GL_REVN_F What is the primary source table for the SIL_GLRevenueFact task? W_GL_REVN_FS What type of table is this? Fact staging table 9. 10.k.9: Implementation for Oracle EBS . which Informatica workflows are called and run by this DAC task? SIL_GLRevenueFact and SIL_GLRevenueFact_Full Which Informatica mapping is executed by the workflows that are called and run by this DAC task? SIL_GLRevenueFact Based on your exploration of this SIL_GLRevenueFact mapping in Informatica Designer. Most DAC tasks are Informatica task types.

Which DAC task loads this table? SDE_ORA_GLRevenueFact Oracle BI Applications 7.9: Implementation for Oracle EBS 37 .Lesson 5: Understanding the ETL Process 10.m.


Lesson 6: Working with Informatica Designer

Practice 6-1: Working with Informatica Designer

Goal: To use Informatica Designer tools to build a source dependent extract mapping
Scenario: The primary goal of this practice is to become familiar with Informatica Designer and its tools. In this set of practices, you create a source dependent extract (SDE) mapping that extracts data from a source table and moves the data into a fact staging table. In the practices for the next lesson, you use Informatica Workflow Manager to run the mapping and to verify the results. Later in this course, in lessons 14 and 15, you build custom SDE and SIL mappings and use the DAC to run the mappings and to verify the results. To accomplish this, you use custom tables provided specifically for this training. These tables have very small datasets, so you can focus less on the data being moved by ETL and more on Informatica Designer tools and the steps for building mappings. Thus, the secondary goal of this practice is to become familiar with some ETL mapping components and the steps to create them.
Outcome: When you complete this practice, you will have an SDE mapping that extracts data from the REVN table and loads it into the WC_REVN_FS staging table.
Time: 30–40 minutes

Instructions:
1. Open Informatica Repository Manager and create a custom Informatica repository folder. All modifications to the Oracle_BI_DW_Base repository should be done in custom folders.
a. Select Start > Programs > Informatica PowerCenter 8.1.1 > Client > PowerCenter Repository Manager.
b. If the Welcome dialog box appears, deselect Show this message at startup and click OK.
c. Double-click Oracle_BI_DW_Base.
d. Enter Administrator as the username and password and click Connect.
e. Select Folder > Create.
f. Name the new folder CUSTOM_SDE and click OK.
g. Click OK to confirm that the folder has been successfully created.
2. Import the source for the SDE mapping.
a. Select Tools > Designer to open Informatica PowerCenter Designer.
b. Right-click the CUSTOM_SDE folder and select Open. Alternatively, you can double-click the CUSTOM_SDE folder to open it.
c. Verify that the CUSTOM_SDE folder is bolded and that the toolbar displays CUSTOM_SDE – (Oracle_BI_DW_Base) in the drop-down field in the upper-left corner.
d. Source Analyzer should already be open in the workspace. If not, select Tools > Source Analyzer.

e. Select Sources > Import from Database. The Import Tables dialog box appears.
f. Select ETL_LAB_OLTP (Oracle in OraDb11g_home1) from the ODBC data source list. This ODBC data source has already been created for your training environment.
g. Enter etl_lab_oltp for the username, owner name, and password. Use lowercase for all three values. Click Connect.
h. Expand ETL_LAB_OLTP > TABLES.
i. Select the REVN table and click OK. REVN is displayed in the Source Analyzer workspace.
j. In the Navigator, expand CUSTOM_SDE > Sources and verify that the ETL_LAB_OLTP source appears and contains the REVN table.
k. In the Output window, verify there is a message indicating that the source ETL_LAB_OLTP:REVN is inserted.
l. Select Repository > Save.
3. Preview the source data.
a. In the Source Analyzer window, right-click the REVN table and select Preview Data.
b. Select ETL_LAB_OLTP (Oracle in OraDb11g_home1) for the ODBC data source, enter etl_lab_oltp for the username, owner name, and password (lowercase for all three values), and click Connect. The Preview Data window appears.
c. Note that REVN has five rows of revenue data with a primary key (ROW_ID), foreign keys for product and person data, and a LAST_UPDATE_DATE column. Recall that LAST_UPDATE_DATE is used in incremental loads to identify changed records: LAST_UPDATE_DATE is compared to the DAC parameter $$LAST_EXTRACT_DATE to determine which records have been updated since the last ETL run.
d. Click Close to close the Preview Data window.
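The same five rows can also be inspected outside Designer, which is useful later when you want to confirm what the SDE mapping actually extracted. This optional Python 3 sketch uses the etl_lab_oltp credentials from step 3; it assumes the cx_Oracle driver is installed and that the training database is reachable locally as the orcl instance.

    import cx_Oracle

    conn = cx_Oracle.connect("etl_lab_oltp", "etl_lab_oltp",
                             cx_Oracle.makedsn("localhost", 1521, sid="orcl"))
    cursor = conn.cursor()

    # Preview the REVN source rows, including the column used for incremental loads.
    cursor.execute("SELECT ROW_ID, LAST_UPDATE_DATE FROM REVN ORDER BY ROW_ID")
    for row_id, last_update_date in cursor:
        print(row_id, last_update_date)

    # During an incremental run, an SDE filter compares LAST_UPDATE_DATE with the DAC
    # parameter $$LAST_EXTRACT_DATE so that only rows changed since the last run are extracted.
    conn.close()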

   a. Select Tools > Target Designer.
   b. Select Targets > Import from Database.
   c. Select the ETL_LAB_DW (Oracle in OraDb11g_home1) ODBC data source to connect to the target database.
   d. Enter etl_lab_dw for the username, owner name, and password, and click Connect. Use lowercase for all three values.
   e. Expand ETL_LAB_DW > TABLES.
   f. Select the WC_RVF_FS table.
   g. Click OK. WC_RVF_FS is displayed in the Target Designer workspace.
   h. In the Output window, verify there is a message indicating that the target WC_RVF_FS is inserted.
   i. In the Navigator, expand CUSTOM_SDE > Targets and verify that the WC_RVF_FS target is visible.
   j. Select Repository > Save.
5. In the Target Designer window, examine the WC_RVF_FS table definition. Notice that this staging table has the following required columns:
   - INTEGRATION_ID: Stores the primary key or the unique identifier of a record as in the source table.
   - DATASOURCE_NUM_ID: Stores the data source from which the data is extracted.
   It also has placeholders for the source columns that will be mapped to this target when you build the SDE mapping.

In the steps that follow, you create a source dependent extract mapping named SDE_C_RevenueFact. The C in the naming convention indicates that this is a custom mapping. This mapping extracts and moves data from the source table, REVN, to the target fact staging table, WC_RVF_FS. The data does not undergo any major transformations in this mapping as it is moved from the source to the target fact staging table. The completed mapping will look similar to this screenshot:

At all times, ensure that you are working in the CUSTOM_SDE folder. Verify that the CUSTOM_SDE folder is bolded and the Designer toolbar displays CUSTOM_SDE – (Oracle_BI_DW_Base) in the drop-down field in the upper-left corner.

6. Create a mapplet to extract the revenue data from the source.
   a. Select Tools > Mapplet Designer.
   b. Select Mapplets > Create.
   c. Name the mapplet mplt_BC_C_RevenueFact and click OK.
7. Create parameters to pass values for the last extract date and the initial extract date to the mapplet. These parameters are used to identify records that have changed since the last ETL run. The values for these parameters are passed from the DAC during ETL run time. Because you will use Informatica Workflow Manager, and not the DAC, to run and verify the SDE mapping in the next practice, you hard-code the value for the $$LAST_EXTRACT_DATE parameter. Please note that hard-coding the value is for training purposes only and is not the recommended practice.


a. Select Mapplets > Parameters and Variables. b. Click the Add a new variable to this table button. c. Enter the following values:
Name: $$LAST_EXTRACT_DATE
Type: Parameter
Data type: Date/time
Initial Value: 05/02/2003 21:02:44

d. Click the Add a new variable to this table button again. e. Enter the following values:
Name: $$INITIAL_EXTRACT_DATE
Type: Parameter
Data type: Date/time
Initial Value: <leave blank>

f. Click OK. 8. Add a source definition to the mapplet. a. In the Repository Navigator, expand CUSTOM_SDE > Sources > ETL_LAB_OLTP. b. Drag the REVN source into the Mapplet Designer. By default, a source qualifier transformation named SQ_REVN is created. c. If desired, select Layout > Zoom Percent to change the layout of the mapplet in the Mapplet Designer. 9. Add a mapplet output transformation to the mapplet. a. Select Transformation > Create to open the Create Transformation dialog box. b. In the list, select Mapplet Output. c. Name the transformation MAPO_REVN_EXTRACT. d. Click Create. e. Click Done. f. Drag MAPO_REVN_EXTRACT to the right of SQ_REVN. g. Drag each column from SQ_REVN to a blank row in MAPO_REVN_EXTRACT. h. Check your work. Your mapplet should look similar to the screenshot:

10. Generate the default SQL query for the source qualifier. a. Double-click the SQ_REVN source qualifier. b. Click the Properties tab.


c. For the Sql Query transformation attribute, in the Value field, click the down arrow to open the SQL Editor. d. Click inside the SQL: field. e. Click Generate SQL to display the default query. f. Modify the default SQL query to include a WHERE clause that compares the LAST_UPDATE_DATE column to the $$LAST_EXTRACT_DATE parameter. Hint: You can enter the port and variable manually, or double-click the corresponding objects in the left pane to add them to the query:
WHERE (REVN.LAST_UPDATE_DATE > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS'))
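For reference, after you add the WHERE clause, the statement in the SQL Editor should look roughly like the following. This is a sketch only: the SELECT list is whatever Generate SQL produced for the REVN columns in your environment (the five ports ROW_ID, PROD_ID, PERSON_ID, REVF, and LAST_UPDATE_DATE), so the exact text may differ slightly.

   SELECT REVN.ROW_ID, REVN.PROD_ID, REVN.PERSON_ID, REVN.REVF, REVN.LAST_UPDATE_DATE
   FROM REVN
   WHERE (REVN.LAST_UPDATE_DATE > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS'))

At run time, Informatica replaces $$LAST_EXTRACT_DATE with the parameter value (in this practice, the hard-coded initial value), so only rows updated after that date are extracted.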

g. Click OK to close SQL Editor. h. Click Apply in the Edit Transformations dialog box. i. Click OK to close the Edit Transformations dialog box. 11. Validate the mapplet. a. Select Layout > Arrange. b. In the Select Outputs dialog box, verify that MAPO_REVN_EXTRACT is selected and click OK. c. If necessary, select View > Output to view the Output window. d. Click the Validate tab of the Output window e. Select Mapplets > Validate. You should receive the message Mapplet mplt_BC_C_RevenueFact is VALID. f. If your mapplet is valid, select Repository > Save to update the repository. If you receive a message that the mapplet is not valid, review the steps in this practice and try to troubleshoot. If you need assistance, ask your instructor. 12. Create an SDE mapping. a. Select Tools > Mapping Designer. b. Select Mappings > Create. c. Name the mapping SDE_C_RevenueFact, and click OK. 13. Create a parameter to pass a value for the DATASOURCE_NUM_ID to the mapping. DATASOURCE_NUM_ID is a unique number assigned by the DAC to a data source so that the data can be identified in the data warehouse. By default, the unique number assigned to the Oracle E-Business Suite source is 4. This value is passed from the DAC during ETL run time. Because you will use Informatica Workflow Manager and not the DAC to run and verify the SDE mapping in the next practice, you hard code the value for the $$DATASOURCE_NUM_ID


parameter. Please note that hard-coding this value is for training purposes only and is not the recommended practice. a. Select Mappings > Parameters and Variables. b. Click the Add a new variable to this table button. c. Enter the following values:
Name: $$DATASOURCE_NUM_ID
Type: Parameter
Data type: integer
Initial Value: 4

d. Click OK. 14. Add the mapplet to the mapping. a. In the Repository Navigator, expand CUSTOM_SDE > Mapplets. b. Drag mplt_BC_C_RevenueFact from the Navigator into the Mapping Designer workspace. 15. Create a new Expression transformation and add it to the mapping. An Expression transformation is used to calculate values in a single row before writing to the target. In this step, you use the $$DATASOURCE_NUM_ID parameter that you created in an earlier step to place a value of 4 in DATASOURCE_NUM_ID. a. Select Transformation > Create. b. Select Expression from the transformation type list. c. Enter EXPTRANS as the transformation name. d. Click Create. e. Click Done. f. In mplt_BC_C_RevenueFact, use Shift-click to select all five ports, ROW_ID, PROD_ID, PERSON_ID, REVF, and LAST_UPDATE_DATE, and drag them to the EXPTRANS transformation. g. Double-click EXPTRANS. h. Click the Ports tab. i. Click the Add a new port to this transformation button. j. Give the new port the following attributes:
Name: DATASOURCE_NUM_ID
Data type: Decimal
Prec: 10
Port Type: Output

k. If necessary, select the DATASOURCE_NUM_ID port and use the up arrow to move it to the top of the port list. l. For the DATASOURCE_NUM_ID port, click the down arrow button in the Expression column to open Expression Builder. m. Delete the existing formula (DATASOURCE_NUM_ID). n. Click the Variables tab in the left pane.



   o. Expand the Mapping Parameters folder and double-click the $$DATASOURCE_NUM_ID variable to add it to the expression.
   p. Click OK.
   q. Click Apply and OK.
   When you run ETL in the next practice, the DATASOURCE_NUM_ID port will be populated with the initial value (4) defined for the $$DATASOURCE_NUM_ID variable.
16. Add and link the target definition. Target definitions vary by mapping. In this practice, the target for data extracted from the REVN source is the WC_RVF_FS fact staging table in the ETL_LAB_DW database.
   a. In the Repository Navigator, expand CUSTOM_SDE > Targets and select WC_RVF_FS.
   b. Drag WC_RVF_FS into the Mapping Designer and place it to the right of EXPTRANS.
   c. Drag ROW_ID from EXPTRANS to INTEGRATION_ID in the WC_RVF_FS target definition. INTEGRATION_ID stores the primary key or the unique identifier of a record.
   d. Drag DATASOURCE_NUM_ID, PROD_ID, PERSON_ID, REVF, and LAST_UPDATE_DATE from EXPTRANS onto their corresponding ports in the WC_RVF_FS target definition. The mapping should look similar to the screenshot:
17. Validate the mapping.
   a. Select Layout > Arrange.
   b. Select WC_RVF_FS as the target and click OK.
   c. If necessary, select View > Output to view the Output window.
   d. Click the Validate tab of the Output window.
   e. Select Mappings > Validate. You should receive the message Mapping SDE_C_RevenueFact is VALID.
   f. If your mapping is valid, select Repository > Save. If you receive a message that the mapping is not valid, review the steps in this practice and try to troubleshoot. If you need assistance, ask your instructor.
   g. Leave Informatica Designer open for the next practice.


Lesson 7: Working with Informatica Workflow Manager

Practice 7-1: Creating and Running an Informatica Workflow

Goals
To use Informatica Workflow Manager to create a workflow for the SDE mapping you created in the previous practice

Scenario
In the practice for the previous lesson you created a source dependent extract (SDE) mapping that extracts data from a source table and moves the data into a fact staging table. Now you use Informatica Workflow Manager to create a workflow to run the mapping. As in the practice for the previous lesson, you use custom tables provided specifically for this training. Please note that the DAC is used to run ETL mappings in production environments, whereas Informatica client tools are used for testing.

Outcome
You have a workflow associated with the SDE mapping and have loaded the WC_RVF_FS fact staging table.

Time
20–30 minutes

Instructions:
1. Start the Informatica Workflow Manager.
   a. Verify that the Informatica Services 8.1.1 SP4 service is started.
   b. If the Informatica Designer is still open from the previous practice, select Tools > Workflow Manager.
   c. If the Informatica Designer is not open, follow these steps: Select Start > Programs > Informatica PowerCenter 8.1.1 > Client > PowerCenter Workflow Manager. Double-click Oracle_BI_DW_Base. Enter Administrator as the username and password and click Connect.
2. Add a database connection to the target database.
   a. Select Connections > Relational to open the Relational Connection Browser.
   b. Click New.
   c. In the Select Subtype dialog box, select Oracle and click OK.
   d. In the Connection Object Definition dialog box, enter the following values:
      Name: etl_lab_dw
      User Name: etl_lab_dw
      Password: etl_lab_dw
      Connect String: ORCL
   e. Click OK to return to the Relational Connection Browser.

j. i. In the Workflow Designer window. Enter the name s_SDE_C_RevenueFact. f. In the Mappings dialog box. If it is not open. a. b. 48 Oracle BI Applications 7. enter the name s_SDE_C_RevenueFact. g. For the $Target connection value attribute. select ETL_LAB_OLTP and click OK. In the Create Workflow dialog box. For the $Source connection value attribute. In the General tab. Create a workflow for the SDE mapping you created. Click the Properties tab. This workflow contains instructions on how to execute the session task for the mapping SDE_C_RevenueFact. enter the following values: Name User Name Password Connect String etl_lab_oltp etl_lab_oltp etl_lab_oltp ORCL d. c. d. Verify that the CUSTOM_SDE folder is open. Click Done. Select Tools > Workflow Designer. Edit the session task properties. e. a. click the down arrow in the Value field to open the Connection Browser. Select Workflows > Create. b. c. drag the s_SDE_C_RevenueFact session off of the Start task. d. Click OK to return to the Relational Connection Browser. In the Create Task dialog box. select the Session task type. In the Connection Browser. Click Create.9: Implementation for Oracle EBS . e. l. select Oracle and click OK. 6. Click New in the Relational Connection Browser. Click Close to close the Relational Connection Browser. Click OK. h. k. double-click CUSTOM_SDE to open it or right-click CUSTOM_SDE and select Open. Add a database connection to the source database. Select Tasks > Create to add a session task to the workflow.Lesson 7: Working with Informatica Workflow Manager 4. select the SDE_C_RevenueFact mapping and click OK. In the Connection Object Definition dialog box. In the Select Subtype dialog box. Select Tasks > Link Task. which extracts and moves the data from the source table REVN to the target fact staging table WC_RVF_FS. select Fail parent if this task fails and Fail parent if this task does not run. f. In the Workflow Designer workspace. double-click the s_SDE_C_RevenueFact session task. click the down arrow in the Value field to open the Connection Browser. e. 5. a. then click the Start task. b. m. c. and drag the link to the s_SDE_C_RevenueFact session task.

a. Select Layout > Arrange > Horizontal. change the Target load type attribute from Bulk to Normal. t. review the steps in this practice and try to troubleshoot. enter a value of 1. Click the Validate tab of the Output window. Select Start > Programs > Oracle – OraDb11g_home1 > Application Development > SQL Plus.9: Implementation for Oracle EBS 49 . Select Workflows > Validate. If your workflow is valid. Return to Informatica Workflow Manager. n. in a production environment you would use connection variables generated by the DAC to designate source and target connections.SQ_REVN source. b. In the Connection Browser. c. Verify that the query returns a count of zero. select Repository > Save. scroll down and select Truncate target table option. for the Stop on errors attribute. select the mplt_BC_C_RevenueFact. use SQL*Plus to query the WC_RVF_FS target table and confirm that there is no data. r. b. select ETL_LAB_DW and click OK. In the left pane. In the Error handling section. select ETL_LAB_DW. f. Note: As you will see in subsequent practices. Enter ETL_LAB_DW as the username and click Enter. d. In the Properties section. Execute the s_SDE_C_RevenueFact workflow to load the data into WC_RVF_FS. In the Connections setting in the right pane. e. 7. Click the Config Object tab of the Edit Tasks dialog box. a. Before running the workflow. i. Oracle BI Applications 7. j. Enter ETL_LAB_DW as the password and click Enter. h. At the SQL> prompt. p. and then click OK. c. s. WC_RVF_FS is the staging table in the ETL_LAB_DW database that will be populated by running the s_SDE_C_RevenueFact workflow you just created. k.Lesson 7: Working with Informatica Workflow Manager g. In the Properties section. execute the following query: SELECT COUNT(*) FROM WC_RVF_FS. q. o. Because you are in a test environment. In the left pane. select the WC_RVF_FS target. click the down-arrow button in the Value field to edit the target connection. select View > Output to view the Output window. In the Relational Connection Browser. If you receive a message that the workflow is not valid. d. Click the Mapping tab. If you need assistance. a. 8. ask your instructor. 9. and then click OK. click the down-arrow button in the Value field to edit the source connection. If necessary. select ETL_LAB_OLTP. You should receive the message Workflow s_SDE_C_RevenueFact is VALID. you are manually selecting the relational connections you created earlier. l. e. Validate the workflow. Click Apply. m. Click OK. In the Relational Connection Browser. In the Connections setting in the right pane. Leave SQL*Plus open.

9: Implementation for Oracle EBS . right-click the s_SDE_C_RevenueFact session task object in Workflow Monitor. If you troubleshoot successfully. Monitor the progress and verify that the workflow and the session task both have a status of Succeeded before continuing. c. b. select Get Task Log and try to troubleshoot. a. a. If the workflow returns a status of Failed. try re-running the workflow. which should still be open. ask your instructor. select Get Workflow Log and try to troubleshoot. 50 Oracle BI Applications 7. execute the following query: SELECT COUNT(*) FROM WC_RVF_FS. right-click s_SDE_C_RevenueFact. Workflow Manager. Return to SQL*Plus. and Designer. In the Repository Navigator. Informatica PowerCenter Workflow Monitor should open when you start the workflow. Close SQL*Plus. Use SQL*Plus to query the WC_RVF_FS table to verify that the data has been loaded. Troubleshooting may require changes to the workflow in Workflow Manager or changes to the SDE mapping in Designer. 10. 11. right-click the s_SDE_C_RevenueFact workflow object in Workflow Monitor. When the both the workflow and session complete successfully. expand CUSTOM_SDE > Workflows. Monitor the progress of the workflow in Workflow Monitor. At the SQL> prompt. If the session task returns a status of Failed. Verify that the query returns five rows of data. d.Lesson 7: Working with Informatica Workflow Manager b. If you need assistance. and select Start Workflow. b. close Workflow Monitor.
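If you want to look beyond the row count, you can also query the loaded rows themselves from SQL*Plus. A minimal sketch, assuming the staging columns used earlier in the mapping (adjust the column list if your table differs):

   SELECT INTEGRATION_ID, DATASOURCE_NUM_ID, PROD_ID, PERSON_ID, REVF, LAST_UPDATE_DATE
   FROM WC_RVF_FS;

Each of the five rows should show DATASOURCE_NUM_ID = 4, the hard-coded initial value of the $$DATASOURCE_NUM_ID parameter, and INTEGRATION_ID should carry the ROW_ID values extracted from REVN.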

Verify that the Explore_SDE folder is open in Designer. Expand the Explore_SDE folder. f. Close Repository Manager. Select Folder > Create. Name the new folder Explore_SDE and click OK. Copy a mapping to the new folder. c. j. b. Click OK to confirm that the folder has been successfully created. Oracle BI Applications 7. Select Tools > Designer to open Designer. Notice that at this point it contains empty subfolders. c. e.9: Implementation for Oracle EBS 51 . h. i. An understanding of the anatomy of prebuilt mappings will assist you in the configuration and customization of ETL mappings. g. an expression transformation. b. Enter Administrator as the username and password and click Connect. Extract mappings generally consist of a source table or business component. e. 2. Business components are packaged as mapplets. Note that individual mappings may differ substantially from the examples provided here. Select the mapping SDE_ORA_GLRevenueFact. Verify that the new Explore_SDE folder is visible in the Repository Navigator in Designer. Select Explore_SDE in the Repository Navigator. f. g.1.Lesson 8: Exploring SDE and SIL Mappings Practice 8-1 Exploring a Prebuilt SDE Mapping Goals Scenario To explore a prebuilt source dependent extract (SDE) mapping in the Oracle Informatica repository The main goal of this practice is to explore the anatomy of a typical prebuilt SDE mapping in the Informatica repository. Double-click Oracle_BI_DW_Base. Select Edit > Paste. Select Edit > Copy. Business components are used to extract data from the source system. Expand SDE_ORA11510_Adaptor > Mappings. Open Informatica Repository Manager and create a new Informatica repository folder.1 > Client > PowerCenter Repository Manager. which reside in source-specific folders within the repository. The folder is open if it is bolded and appears as Explore_SDE – (Oracle_BI_DW_Base) in the list in the upper-left corner of the Designer toolbar. Time 10–15 minutes Instructions: 1. a. This practice explores the function and components of a typical SDE mapping. and a staging table. a. d. Select Start > Programs > Informatica PowerCenter 8. d. Select the Explore_SDE folder.

You explore each component in detail later in this practice. b. select Layout > Zoom Percent and select a percent to modify the mapping layout in the Mapping Designer workspace. expand Explore_SDE > Mappings.Lesson 8: Exploring SDE and SIL Mappings h. c. Drag SDE_ORA_GLRevenueFact into Mapping Designer. Explore the parameters and variables for the mapping. 3. This mapping extracts revenue data from tables in the Oracle E-Business Suite source system and stores the data in the W_GL_REVN_FS fact staging table in the Oracle Business Analytics Warehouse. e.9: Implementation for Oracle EBS . Click Yes to confirm that you want to copy SDE_ORA_GLRevenueFact. transformations. a. Expand the subfolders under Explore_SDE and notice that all of the related repository objects for this mapping were also copied to the folder: sources. Select Tools > Mapping Designer to open Mapping Designer in the workspace. Note that there are four components in the mapping and that the data flow is from left to right. d. k. You are copying this mapping to a custom folder as a precaution to ensure that you do not inadvertently make changes to the mapping in the SDE_ORA11510_Adaptor folder while you explore it. Explore the SDE_ORA_GLRevenue mapping. Select Repository > Save. Expand Explore_SDE > Mappings and confirm that the SDE_ORA_GLRevenueFact mapping is copied to the Explore_SDE folder. mplt_BC_ORA_GLRevenueFact Mapplet Exp_W_GL_REVN_FS_Integration_Id Expression mplt_SA_ORA_GLRevenueFact Mapplet Extracts revenue transactions from the source system Calculates values in a single row before writing to the target Converts source-specific data elements into standard formats and then stores them in a staging table Stores revenue transactions extracted from the source system W_GL_REVN_FS Target Definition 4. i. If desired. The following table provides a high-level explanation of the function of each component in the mapping. and mapplets. j. targets. 52 Oracle BI Applications 7. If necessary.

there are four tables imported from the Oracle E-Business Suite source: RA_CUSTOMER_TRX_LINES_ALL. Click Cancel to close the Declare Parameters and Variables dialog box without making any changes. a.Lesson 8: Exploring SDE and SIL Mappings a. If desired. which means they pass data to the next object in the mapplet. Click the Ports tab. h. i. Notice that all ports are input/output. Explore the mplt_BC_ORA_GLRevenueFact mapplet. l. When you add a relational or a flat file source definition to a mapplet or mapping. Right-click mplt_BC_ORA_GLRevenueFact and select Open Mapplet to open it in the Mapplet Designer. or validate the SQL. and RA_SALESREPS_ALL. The three component types are: source definition. j. Note that there are three component types in the mapplet and that the data flow is from left to right. so they contain only output ports. modify. 5. Source definitions are imported into the Informatica repository via the Source Analyzer. g. f. all four source definitions connect to a single source qualifier. select Layout > Zoom Percent and select a percent to modify the layout in the Mapplet Designer workspace. In this example. click the down arrow in the Value field to open the SQL Editor to display the SQL statement that is used to retrieve the data from the sources. Double-click the SQ_GL_REVENUE_EXTRACT source qualifier to open the Edit Transformations dialog box. b. Click the Properties tab. This mapplet extracts revenue data from tables in the Oracle E-Business Suite source system. Click Cancel to close the Edit Transformations dialog box. Values for these parameters are passed by the DAC during run time. and mapplet output transformation. k. Input/output ports receive data and pass it unchanged. c. Source definitions provide data. e. You can use the SQL Editor to view. d. RA_CUST_TRX_LINE_GL_DIST_ALL. Recall that this parameter is used to identify records that have changed since the last ETL run and that the value for this parameter is passed from the DAC during run time. you need to connect it to a source qualifier transformation. b. RA_CUSTOMER_TRX_ALL. For the Sql Query transformation attribute. In this example. c. Notice that there are four source definitions for this mapplet.) Oracle BI Applications 7. Notice that all ports are output ports. these parameters are included in expressions for ports in the Exp_W_GL_REVN_FS_Integration_Id expression transformation.9: Implementation for Oracle EBS 53 . (You define parameters and variables for mapplets similar to how you define them for mappings: by selecting Mapplets > Parameters and Variables to open the Declare Parameters and Variables dialog box. Recall that source definitions represent tables or files that provide source data. Click the Ports tab. source qualifier. Scroll to the bottom of the SQL statement and notice that the SQL contains the $$LAST_EXTRACT_DATE parameter. The source qualifier transformation represents the rows that the Informatica Integration Service reads when it runs a session. Select Mappings > Parameters and Variables to open the Declare Parameters and Variables dialog box. Double-click RA_CUSTOMER_TRX_ALL to open the Edit Transformations dialog box. Notice there are two parameters listed for this mapping: $$DATASOURCE_NUM_ID and $$TENANT_ID. As you will see.
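To make the incremental-extract pattern concrete, the following is a simplified, hypothetical sketch of how a query of this kind restricts the extract by last-update date. It is not the shipped SQL: the actual statement joins all four source tables and selects many more columns, and the join and date columns shown here are assumptions based on the standard Oracle E-Business Suite schema. The TO_DATE mask follows the same pattern used in the custom mapping you built earlier.

   SELECT TRX.CUSTOMER_TRX_ID, LINES.CUSTOMER_TRX_LINE_ID, TRX.LAST_UPDATE_DATE
   FROM RA_CUSTOMER_TRX_ALL TRX, RA_CUSTOMER_TRX_LINES_ALL LINES
   WHERE LINES.CUSTOMER_TRX_ID = TRX.CUSTOMER_TRX_ID
     AND TRX.LAST_UPDATE_DATE > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')

Because the DAC passes the value of $$LAST_EXTRACT_DATE at run time, only transactions touched since the previous ETL run are extracted during an incremental load.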

Click the down arrow in the Expression field to open Expression Editor. 54 Oracle BI Applications 7. Double-click MAPO_GL_REVENUE_EXTRACT to open the Edit Transformations dialog box. c. Click the Ports tab. which converts source-specific data elements into standard formats and then stores them in a staging table. Notice that this tab lists the source definitions associated with this source qualifier. a. Alternatively. c. Click Cancel to close the SQL Editor. 7. 6. there are also input ports. Explore the mplt_SA_ORA_GLRevenueFact mapplet. f. If necessary. Scroll to locate the TENANT_ID port.9: Implementation for Oracle EBS . and output ports with expressions. Drag the mplt_SA_ORA_GLRevenueFact mapplet into the Mapplet Designer. Click the Ports tab. Alternatively. Recall that this value is a unique identifier for each data source and is passed by the DAC during run time. expand Explore_SDE > Mapplets. n. Double-click the Exp_W_GL_REVN_FS_Integration_Id expression transformation to open the Edit Transformations dialog box. Notice the Functions. q. g. i. e. Click Cancel to close the Edit Transformations dialog box. Notice that all ports are input. b. Click the Sources tab. the MAPO_GL_REVENUE_EXTRACT mapplet output transformation is the “target” of this mapplet and receives data from the SQ_GL_REVENUE_EXTRACT source qualifier. Notice that port type varies by port.Lesson 8: Exploring SDE and SIL Mappings m. a. Scroll to locate the DATASOURCE_NUM_ID port. and the Numeric and Operator keypads. you can click the Mapping Designer icon on the toolbar. An expression transformation is used to calculate values in a single row before writing to the target. h. Explore the Exp_W_GL_REVN_FS_Integration_Id expression transformation. The MAPO_GL_REVENUE_EXTRACT mapplet output transformation passes output from the mapplet to the next transformation in the mapping (Exp_W_GL_REVN_FS_Integration_Id Expression in this example). d. Click Cancel to close the Edit Transformations dialog box. Select Tools > Mapplet Designer. In this example. While most of the ports are input/output ports that receive data and pass it unchanged. variable ports. Click Cancel to close Expression Editor. Input ports receive data. Notice the value in the Expression field. you can click the Mapplet Designer icon on the toolbar. You can use these to build expressions in the Expression Editor. The source independent load (SIL) mapping then uses an Analytic Data Interface (ADI) mapplet to pick up these records. Ports. DATASOURCE_NUM_ID is an output port that gets its value from the $$DATASOURCE_NUM_ID parameter. which are already transformed into standard format. This is a Source Adapter mapplet. The SDE_ORA_GLRevenueFact mapping should still be open. p. b. Click Cancel to close the Edit Transformations dialog box. which are used to store values across rows. and Variables tabs in the left pane. r. Select Tools > Mapping Designer. Notice that the parameter $$TENANT_ID is used in an IIF conditional expression. o.

and a mapplet output transformation. Notice that the :LKP reference qualifier is used in the expression to return values from the LKP_CUSTLOC_CUST_LOC_ID unconnected lookup transformation. d. b. Click the Ports tab and notice that all ports are input ports. Click the Ports tab. An unconnected lookup transformation is a stand-alone transformation that is not linked to other transformations in a mapping or mapplet. 8. e. Explore the target definition. Click the Ports tab and notice that all ports are input ports with the EXT_ prefix. h. Double-click the LKP_LOC_CURRENCY lookup transformation to open the Edit Transformations dialog box. Click Cancel. f. i.Lesson 8: Exploring SDE and SIL Mappings d. Double-click the MAPO_SAI_GL_REVENUE output transformation. it is output through a new port. Click Cancel to close the Edit Transformations dialog box. Click Cancel to close the Edit Transformations dialog box. n. A lookup transformation is called in an expression that uses the :LKP reference qualifier. Select Tools > Mapping Designer or click the Mapping Designer icon The SDE_ORA_GLRevenueFact mapping should still be open. a mapplet input transformation. the data is passed to the expression transformation as input only. Scroll through the ports and notice that if the input data is transformed. m. The components are two unconnected lookup procedures. Click Cancel to close the Edit Transformations dialog box. on the toolbar. p. Click the Ports tab and notice that all ports are output.9: Implementation for Oracle EBS 55 . Double-click the MAPI_SAI_GL_REVENUE input transformation. Double-click the EXP_SAI_GL_REVENUE expression transformation. Click the Properties tab and notice that the lookup table name is W_LOC_CURR_CODE_TMP. g. j. an expression transformation. This transformation receives the output of the Exp_W_GL_REVN_FS_Integration_Id expression transformation in the SDE_ORA_GLRevenueFact mapping and passes the data to the EXP_SAI_GL_REVENUE expression transformation in this mapplet. This lookup transformation retrieves the local currency code from W_LOC_CURR_CODE_TMP. Click Cancel to close the Expression Editor. Locate the EXT_CUST_SOLD_TO_ID port and click the down arrow in the Expression field to open the Expression Editor. Notice also that the TO_CHAR function is used to convert the data. Notice that the EXT_*output ports of the mplt_SA_ORA_GLRevenueFact mapplet are used to populate the ports in the W_GL_REVN_FS target definition. This is the fact staging table Oracle BI Applications 7. k. which is prefixed with EXT_. l. Click Cancel to close the Edit Transformations dialog box. c. These ports exactly match the input ports of an Analytic Data Interface (ADI) mapplet in the corresponding SIL mapping. These are unconnected lookup transformations. r. Double-click the W_GL_REVN_FS target definition. s. o. q. The ports have been renamed with the INP_ prefix. a. Note that there are five components in the mapplet and that the data flow is from left to right. e. After the data is transformed. Notice the LKP_LOC_CURRENCY and LKP_CUSTLOC_CUST_LOC_ID lookup transformations.

Select Repository > Exit to close Informatica PowerCenter Designer.Lesson 8: Exploring SDE and SIL Mappings in the Oracle Business Analytics Warehouse. 56 Oracle BI Applications 7.9: Implementation for Oracle EBS . This table is one of the source definitions in the corresponding SIL mapping. SIL_GLRevenueFact. f.

Lesson 8: Exploring SDE and SIL Mappings Practice 8-2 Exploring a Prebuilt SIL Mapping Goals Scenario To explore a prebuilt source independent load (SIL) mapping in the Oracle Informatica repository The main goal of this practice is to explore the anatomy of a typical. d. a. Select Folder > Create. but rather from the warehouse staging tables. Select Edit > Copy. perform transformations. Select the mapping SIL_GLRevenueFact. Open Informatica Repository Manager and create a new Informatica repository folder. Select Start > Programs > Informatica PowerCenter 8. because the data undergoes more transformation before being passed to the target. Whereas the targets of SDE mappings are OBAW staging tables. Note that individual mappings may differ substantially from the examples provided here. Double-click Oracle_BI_DW_Base. d. Time 15–20 minutes Instructions: 1. i. Select Explore_SIL in the Repository Navigator. SIL mappings select data from Oracle Business Analytics Warehouse (OBAW) staging tables.9: Implementation for Oracle EBS 57 . h. Expand the Explore_SIL folder. prebuilt SIL mapping in the Informatica repository. Verify that the Explore_SIL folder is open in Designer. This practice explores the function and components of a typical SIL mapping. a. j. Copy a mapping to the new folder. Close Repository Manager. the targets of SIL mappings are the final OBAW tables. Oracle BI Applications 7. 2. c. e. b. Select Tools > Designer to open Designer. SIL mappings differ from SDE mappings with regard to source and target tables. SIL mappings are typically more complex than SDE mappings. Click OK to confirm that the folder has been successfully created. Also. Name the new folder Explore_SIL and click OK. c. Enter Administrator as the username and password and click Connect. An understanding of the anatomy of prebuilt SIL mappings will assist you in the configuration and customization of ETL mappings.1. Verify that the new Explore_SIL folder is visible in the Repository Navigator in Designer.1 > Client > PowerCenter Repository Manager. g. SIL mappings are source independent in that they do not select data from the transactional source. e. f. Expand SILOS > Mappings. and load data into OBAW tables. The folder is open if it is bolded and appears as Explore_SIL – (Oracle_BI_DW_Base) in the list in the upper-left corner of the Designer toolbar. b. Notice that at this point it contains empty subfolders.

which are populated by dimension SIL mappings. and target definition. In addition. Select Edit > Paste. 3. Select the Explore_SIL folder. you explore the objects and the general flow of the SIL mapping. Explore the source definitions. This mapping moves data from a fact staging table in the Oracle Business Analytics Warehouse (OBAW) into the W_GL_REVN_F fact table in the OBAW. They include dimension tables. the SIL mapping contains the following transformation objects: filter and update strategy. mapplet. lookups. Drag SIL_GLRevenueFact into Mapping Designer. expression. If necessary. Explore the SIL_GLRevenueFact mapping. expand Explore_SIL > Mappings. lookup procedure. The data undergoes significant changes in this mapping (through several mapplets. i. In the remainder of this practice. Again. Open Mapping Designer in the workspace. e. Notice that the source definitions are all located in the OBAW. and mapplets. d. Expand Explore_SIL > Mappings and confirm that the SIL_GLRevenueFact mapping is copied to the Explore_SIL folder. c. h. j.Lesson 8: Exploring SDE and SIL Mappings f. 58 Oracle BI Applications 7. Click Yes to confirm that you want to copy SIL_GLRevenueFact. If desired. source qualifier. you are copying this mapping to a custom folder as a precaution to ensure that you do not inadvertently make changes to the mapping in the SILOS folder while you explore it. Expand the subfolders under Explore_SIL and notice that all of the related repository objects for this mapping were also copied to the folder: sources. b. transformations. targets. which is populated by the SDE_ORA_GLRevenueFact mapping that you explored in the previous practice. g. select Layout > Zoom Percent and select a percent to modify the mapping layout in the Mapping Designer workspace.9: Implementation for Oracle EBS . Notice that the SIL mapping has many of the same transformation objects as the SDE mapping you explored in the previous practice: source definition. 4. and the W_GL_REVN_FS fact staging table. k. and transformations) as it is moved from the source definition tables to the target table. Select Repository > Save. a.

c. Oracle BI Applications 7. Explore the Exp_W_GL_REVN_F_Update_Flg expression transformation. e. open it in Mapplet Designer. W_GL_REVN_FS. If you would like to explore the processing for this mapplet in more depth. double-click it to open the Edit Transformations dialog box. Click the down arrow in the Expression field to open the Expression Editor and view the evaluation logic for the port. accessed on the Properties tab. which uniquely identifies each ETL run. which is located downstream in this mapping. The value for ETL_PROC_WID is passed by the DAC during run time to the $$ETL_PROC_WID parameter in the MPLT_GET_ETL_PROC_WID mapplet. d. a.9: Implementation for Oracle EBS 59 . contains the SQL statement that is used to retrieve data from the sources. Explore the MPLT_GET_ETL_PROC_WID mapplet. f.Lesson 8: Exploring SDE and SIL Mappings 5. to minimize the number of records to be cached by this lookup. Click Cancel to close the Expression Editor. This ID is also displayed as the Process ID on the Current Run / Run History screen in the DAC. This source qualifier transformation performs the same basic function as the source qualifier in the SDE mapping. Click Cancel to close the Edit Transformations dialog box. ETL_PROC_WID stores the ID of the ETL process information. Together. The SQL statement of this lookup transformation is overridden with a join with the staging area table. This cached lookup transformation is responsible for looking up the required DATASOURCE_NUM_ID and INTEGRATION_ID columns in the target table. which is used to take actions such as insert and delete in the update strategy transformation. The table lists the possible values for the UPDATE_FLG port: I B U D X Insert new record Insert new record and mark for soft delete Update existing record Update existing record and mark for soft delete Reject b. 7. Rows that have the same INTEGRATION_ID and DATASOURCE_NUM_ID will be updated in the target table. 8. Explore the Sq_W_GL_REVN_FS source qualifier transformation. The source qualifier transformation represents the row set retrieved from the source objects before undergoing subsequent transformations. UPDATE_FLG. Explore the Lkp_W_GL_REVN_F lookup procedure. 6. Click the Ports tab and select the UPDATE_FLG port. Read the description for an understanding of the evaluation logic of the port. double-click it to open the Edit Transformations dialog box. If you would like to explore this lookup procedure in more depth. If you would like to explore this source qualifier transformation in more depth. these two columns are the primary key that uniquely identifies rows in the target table. The rest of the rows will be inserted. These columns are compared against the corresponding staging area columns to detect new records or identify changed records. The description explains that this expression transformation evaluates the value of the “Update Flag” port. W_GL_REVN_F. The Sql Query transformation attribute. Double-click the Exp_W_GL_REVN_F_Update_Flg expression transformation to open the Edit Transformations dialog box. This is a reusable mapplet that looks up and retrieves the ETL_PROC_WID from the W_PARAM_G table in the OBAW.
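For orientation, the override on a lookup such as Lkp_W_GL_REVN_F typically selects just the key columns from the target fact table and joins to the staging table so that only keys present in the current load are cached. A hypothetical sketch (the shipped override differs in detail):

   SELECT W_GL_REVN_F.INTEGRATION_ID, W_GL_REVN_F.DATASOURCE_NUM_ID
   FROM W_GL_REVN_F, W_GL_REVN_FS
   WHERE W_GL_REVN_F.INTEGRATION_ID = W_GL_REVN_FS.INTEGRATION_ID
     AND W_GL_REVN_F.DATASOURCE_NUM_ID = W_GL_REVN_FS.DATASOURCE_NUM_ID

Limiting the cache to keys that actually appear in the staging table is what keeps the lookup small; incoming rows whose keys are found here are treated as updates, and rows whose keys are not found are treated as inserts.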

Currency conversions are required because your business might have transactions involving multiple currencies. This mapplet is responsible for getting the correct exchange rates for a given date. 12. open the transformation. 14. 11.Lesson 8: Exploring SDE and SIL Mappings 9. X_CUSTOM. It also loads the exchange rates required to convert the document amount into each of the three global currencies. You will learn more about configuring global currencies in Lesson 10 “Configuring Analytical Applications. This EXP_Custom expression transformation is part of that methodology. This transformation is used to decide whether incoming rows should be inserted or updated in the target table based on the value of the UPDATE_FLG port. open Upd_W_GL_REVN_F_Ins_Upd. This expression transformation is used when you need to customize the mapping by adding columns to an existing fact or dimension table. Explore the Upd_W_GL_REVN_F_Ins_Upd update strategy transformation. 10. and click the down arrow in the Value field of the Filter Condition transformation attribute. Out of the box. To view the update strategy logic. but they should follow the same route through the mapping as X_CUSTOM. which marks a safe path through the mapping. open it in Mapplet Designer. dimension surrogate key resolution. Jobs done in this mapplet include code-name pair resolution. Explore the EXP_Custom expression transformation. If you would like to explore the processing for this mapplet in more depth. and open the expression for the Update Strategy Expression attribute. Recall that the value of this port was set for each row in the Exp_W_GL_REVN_F_Update_Flg expression transformation. Oracle BI Applications provides a methodology to extend the prebuilt ETL mappings to include additional columns and load the data into existing OBAW tables. Explore the mplt_SIL_GLRevenueFact mapplet. system columns generation. and so forth. Both the prebuilt ETL mappings and OBAW tables are extensible. To view the filter condition logic. open it in Mapplet Designer. which are the common currencies used by the OBAW. and global currencies. Oracle Business Intelligence Applications provides three global currencies. All extension logic should follow the same route through the mapping as X_CUSTOM. You can add additional transformations to the mapping. click the Properties tab. Explore the MPLT_LKP_W_CUSTOMER_FIN_PROFL_D mapplet. click the Properties tab. The OBAW stores amounts in document currency (the currency of the transaction). This mapplet is responsible for transforming specific types of columns in the target table W_GL_REVN_F. The logic is as follows: If 60 Oracle BI Applications 7. Explore the mplt_Curcy_Conversion_Rates mapplet. This mapplet does a lookup to resolve the CUSTOMER_FIN_PROFL_WID column. open it in Mapplet Designer. If you would like to explore the processing for this mapplet in more depth. the columns must first be passed through the ETL process. which is the key to the customer accounts dimension in W_GL_REVN_F. In order to see additional columns in the data warehouse. Thus. Explore the Fil_W_GL_REVN_F filter transformation.” If you would like to explore the processing for this mapplet in more depth. where a currency conversion to any of the three possible global currencies is involved. the load mapping loads the document currency and local currency amounts into the target table. This process of customizing and extending the ETL mappings and the OBAW is covered in depth in lessons 1214. and it has a single placeholder column. 
This filter transformation uses an IIF formula to filter out records that have an UPDATE_FLG value of “X”. 13. local currency (the currency in which accounting entries are recorded). the target table has two amount columns and three exchange rate columns.9: Implementation for Oracle EBS . For every monetary amount extracted from the source.

Lesson 8: Exploring SDE and SIL Mappings UPDATE_FLG for a row is set to I (insert new record) or B (insert new record and mark for soft delete). It distinguishes the transaction type of the record. whether it is open. cleared. The ACCT_DOC_TYPE_WID port in this table is the foreign key to the W_XACT_TYPE_D table. 15. 16. All the transaction amounts are stored in document currency and local currency and the table also maintains three global currency exchange rates. the record is flagged for insertion. or unposted. Close all open Informatica clients. If UPDATE_FLG for a row is set to D (update existing record and mark for soft delete) or U (update existing record). The DOC_STATUS_WID port is the foreign key to the W_STATUS_D table. This fact table stores all the revenue transactions for the Oracle BI Financial Analytics application in the OBAW. Oracle BI Applications 7. Explore the W_GL_REVN_F target definition. posted. the record is flagged for update. That completes your exploration of a typical SIL mapping.9: Implementation for Oracle EBS 61 . It is used to determine the status of the record. All other records are flagged for rejection.
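As an aside on the W_GL_REVN_F foreign keys described above, a reporting-style query joins the fact table to its dimensions through those key columns. A hypothetical sketch, assuming ROW_WID as the surrogate key column of W_STATUS_D (the usual Oracle Business Analytics Warehouse convention; verify the column name in your schema):

   SELECT F.INTEGRATION_ID, F.DOC_STATUS_WID, S.*
   FROM W_GL_REVN_F F, W_STATUS_D S
   WHERE F.DOC_STATUS_WID = S.ROW_WID

The same pattern applies to ACCT_DOC_TYPE_WID and the W_XACT_TYPE_D table.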


These are all the subject areas associated with the selected source system container. a. The DAC allows you to view and modify the repository application objects based on the source system container you specify. It allows you to create. The DAC Design view provides access to functionality related to creating and managing subject areas.9: Implementation for Oracle EBS 63 . automatically selects the Tables tab in the top pane. the Source System Container list appears to the right of the view buttons. c. Open the DAC client. 2. The navigation tree displays all the metadata corresponding to the selected source system container. objects. Log in to the DAC connection using dac as the table owner name and password. and displays all the tables in list mode in the list. b. b. 20–30 minutes Time Instructions: 1. high-performing environment. Notice that the subject areas listed in the navigation tree correspond to the subject areas listed in the list. Explore the Design view. You cannot change the metadata for preconfigured containers. If you want to customize the metadata in a preconfigured container. If necessary. and monitor modular data warehouse applications in a parallel.Lesson 9: Working with the Data Warehouse Administration Console Practice 9-1: Exploring the DAC Goal Scenario To explore the Oracle BI Data Warehouse Administration Console (DAC) to become familiar with its functionality. click the Subject Area tab in the top pane. This expands the navigation tree to display all the tables corresponding to this container. All DAC repository objects are associated with a source system container. and so forth. Double-click Tables in the navigation tree. execute. select the Oracle 11. configure. f. g. The navigation tree is visible on the left side of the DAC window. a. Select Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client. Notice that the tree root nodes in the navigation tree correspond to the tabs in the top pane of the DAC window on the right: Subject Areas. You learn how to create a custom container and the associated metadata in Lesson 11: “Customizing DAC Metadata and Running an Execution Plan. which stores application objects in a hierarchical framework that defines a data warehouse application. e.10 source system container in the list. Oracle BI Applications 7.” d. It allows you to select the source system container that holds the repository objects that correspond to a specific source system. Indices. Expand the Subject Areas node in the navigation tree.5. If necessary. When the Design view is active. Tables. click Design on the toolbar to navigate to the Design view. you must first make a copy of the container. If necessary. and properties The DAC provides a framework for the entire life cycle of data warehouse implementations.

m. n. 64 Oracle BI Applications 7. Query functionality is available in every DAC screen. Extract. Explore source system folders. Double-click Subject Areas in the navigation tree or click the Subject Areas tab. If desired. warehouse. Querying is a way to locate one or more records that meet your specified criteria.9: Implementation for Oracle EBS . b. Double-click any one of the tables in the navigation tree. only the tasks associated with the configuration tag will be chosen by the DAC when the subject area is assembled. Click the Name column to sort the table records by name. enter W_GL_REVN_F and click Go. Click the Source System Folders tab. drag column headings to reorder them. The following table provides a description of some of these common elements: Name Inactive Owner Edit Description A column that specifies the name of the repository. In the Name field. a. Right-click anywhere in the list to view the right-click menu. and Post Load. k.5. A column that specifies the source system container in which the object was created A subtab that allows you to edit an object that is selected in the top pane window (preconfigured objects cannot be edited) A subtab that displays and allows you to view or edit a description of the object selected in the top pane c. Inactive objects do not participate in the ETL process. The following table provides a description of these properties: Configuration Tag Indicates whether configuration tag tasks are the only tasks associated Tasks Only with this subject area that will participate in the ETL process. Do not select any right-click commands. j. or transactional database object A column that indicates whether an object is inactive. Load. 3. You can use query commands and operators to define your query criteria. Where are the physical folders located? 4. Close the right-click menu.10 container and that each logical folder points to a physical folder. a. Notice that some of the DAC interface tabs have common elements. The commands available in right-click menus depend on which tab is active. i. as well as the tasks that are associated with the tables. Notice some of the additional properties associated with each subject area.Lesson 9: Working with the Data Warehouse Administration Console h. A subject area also includes the tasks required to load the subject area tables. Click the Query button. Subject areas are assigned to execution plans. which can be scheduled for full or incremental loads. The table is displayed in singlerecord mode in the list. l. If this check box is selected. The Subject Areas tab lists all the subject areas associated with the selected source system container. Explore subject areas. A subject area is a logical grouping of tables related to a particular subject or application context. such as columns or subtabs. b. listed for the Oracle 11. Notice there are three logical folders. Double-click Tables in the navigation tree to return to list mode.

The following table provides a description of properties specific to tables: Table Type Warehouse Image Suffix Is MultiSet Indicates the type of table: file. The Tables tab allows you to view and edit existing tables and to create new tables for custom containers. hierarchy. remove. Table types can be fact. e. aggregate. j. and so on. applicable only to Teradata databases Oracle BI Applications 7. If necessary. click the Edit subtab. f. It includes the following properties: Parent Group Phase Autogenerated Is Group Displays the task group name if the task belongs to a task group Identifies the task phase of the ETL process Indicates whether the task was generated by the DAC’s task generation process Indicates whether the task is a task group h. Tables in the DAC are physical database tables defined in the database schema. This tab displays the tables that are associated with the selected subject area.Lesson 9: Working with the Data Warehouse Administration Console Last Designed Indicates the date and time the subject area was last assembled d. Click the Financials . and so on Indicates whether the table is a warehouse table. If the warehouse flag is not selected. Click the Tasks subtab. It includes the following properties: Include Tasks Indicates whether the configuration tag tasks will be executed with the selected subject area Context Disabled Indicates (if checked) that the configuration tag is globally disabled (set as Inactive in the Configuration Tags parent tab) 5. Click the Tables tab in the top pane. a. Explore tables. b. Notice that some of the subject area properties are displayed in this tab. This tab displays the tasks associated with the selected subject area. These can be transactional database tables or data warehouse tables. the schema creation process will not include this table.Revenue subject area. This tab displays the configuration tags that are associated with this subject area. Indicates the suffix for image tables. applicable to Siebel source tables only Indicates whether the table is a MultiSet table. source.Revenue subject area are a task group? Hint: Use the query feature. and allows you to add. aggregate. Click the Description subtab to view a brief description of the subject area. Click the Configuration Tags subtab. Click the Tables subtab. i. The Tables tab lists all the tables associated with the selected source system container.9: Implementation for Oracle EBS 65 . as well as flat files that can be sources or targets. dimension. and inactivate tasks for a custom container. How many of the tasks associated with the Financials . g. dimension. Notice the properties associated with each table. It allows you to add or remove tables for custom containers.

You should have a clear understanding of when and where the index will be used at the time of registering the index. This tab displays a read-only list of tasks that use the selected table as a target. This tab displays a read-only list of indices that belong to the selected table. d. Click the Target for Tasks (RO) subtab. This tab lists the columns associated with the selected table. Recall that this is a fact staging table. e. The bulk load will fail if there are indices on the table. b. For example. Click the Multi-Column Statistics subtab. This tab displays a read-only list of tasks that use the selected table as a source. While this improves the ETL performance. g. The following table provides a description of properties specific to indices: Table Name Index Usage Table for which an index is created Usage of index: ETL or Query. the index will not be dropped and the load will fail. Explore indices. Click the Conditional for Tasks (RO) subtab.9: Implementation for Oracle EBS . the preconfigured workflows have the bulk load option turned on. a. Click the Indices (RO) subtab. Notice the properties associated with each index. applicable only to Teradata databases c. The index will be created with the Allow Reverse Scan option. it is important to keep the index definitions in sync with the database. all the indices as defined in the repository will be dropped before the data is loaded and will be created after the data is loaded automatically. # Unique Columns Is Unique Is Clustered Is Bitmap Allow Reverse Scan 66 Oracle BI Applications 7. It also allows you to add columns and foreign key column relationships to the selected table for custom containers. There can be only one clustered index per table. Related tables participate in the ETL process in addition to the tables that are associated with this table. h. Click the Columns subtab. For unique indices. Therefore. f. and it is not registered in the repository. i. An ETL index is typically used during the ETL process. Click the Indices tab in the top pane. The Indices tab displays a list of all the indices associated with the selected source system container. k. when a table is going to be truncated. Click the Related Tables subtab. 6. This tab is applicable to Teradata databases only. A Query index is an index used only during the reporting process. the number of columns that will be unique Indicates whether the index is unique Indicates whether the index is clustered. Enter W_GL_REVN_F in the Name field and click Go. It is recommended that you do not register any indices for source tables. j. if you create an index on the database. Click the Query button. Indicates whether the index is of the bitmap type Applicable only for DB2-UDB databases. Click the Source for Tasks (RO) subtab. This tab displays a read-only list of tasks that are optional tasks for the selected table. During the ETL process.Lesson 9: Working with the Data Warehouse Administration Console Has Unique Primary Index Indicates whether the table has a Unique Primary Index.

You learned about setting up these connections when you configured the DAC in Practice 4-1: Configuring the Training Environment. A task is a unit of work for loading one or more tables. Notice the Command for Incremental Load and Command for Full Load properties. Index properties are specific to different database types. This tab lists the database types that apply to the selected index. f. Notice the folder name.Lesson 9: Working with the Data Warehouse Administration Console Always Drop & Create Indicates whether the index will be dropped and created regardless of whether the table is being loaded using a full load or incremental load c. The load commands for this task are SDE_ORA_GLRevenueFact and SDE_ORA_GLRevenueFact_Full. truncate properties. e. respectively. Click the Databases subtab. Informatica.” b. will dependent tasks be executed? Oracle BI Applications 7. Query for the SDE_ORA_GLRevenueFact task. These are the logical database connections for the primary source database and primary target database. If this task fails. A table can be loaded in full mode or incremental mode.9: Implementation for Oracle EBS 67 . Full mode refers to data loaded for the first time or data that is truncated and then loaded. It includes the following properties: Position Sort Order Indicates the position of the column in the index Indicates whether the sort order is ascending or descending e. Incremental mode refers to new or changed data being added to the existing data. Click the Tasks tab in the top pane. and commands for full or incremental loads. You can use the list to display database-specific properties.or post-SQL commands executed with this task? i. Notice the Primary Source and Primary Target properties. Click the Columns subtab. Recall that Extract is the logical name of the source system folder that points to the physical folder SDE_ORA11510_Adaptor in the Informatica repository (Oracle_BI_DW_Base). d. Is this task executed via an external program. d. You learn more about assembling subject areas in Lesson 11 “Customizing DAC Metadata and Running an Execution Plan. The Tasks tab lists all the tasks associated with the selected source system container. Recall that these commands correspond to the Informatica workflows that execute the tasks for the Informatica mapping SDE_ORA_GLRevenueFact. When you assemble a subject area. Explore tasks. A task comprises the following: source and target tables. the index will not be created. c. the DAC assigns tasks to it. phase. In which phase of the ETL process does this task occur? g. SQL file. Tasks that are automatically assigned to the subject area by the DAC are indicated by the Autogenerated flag in the Tasks subtab of the Subject Areas tab. execution type. 7. Are there any pre. or a stored procedure? h. Extract. If no database type is indicated. This tab displays the list of columns that make up the index. a.

j. Are the source tables for this task located in the transactional database or the data warehouse?
k. Which of the following is not a source table for this task?
RA_CUSTOMER_TRX_ALL
RA_CUSTOMER_TRX_LINES_ALL
RA_CUST_TRX_LINE_GL_DIST_ALL
W_GL_REVN_F
l. What is the target table for this task?
m. What type of table is the target table?
n. Query for the SIL_GLRevenueFact task.
o. In which phase of the ETL process does this task occur?
p. Is this task executed via an external program, Informatica, a SQL file, or a stored procedure?
q. Are the source tables for this task located in the transactional database or the data warehouse?
r. Which source table for this task is the target table for the SDE_ORA_GLRevenueFact task?
s. What is the primary source table for this task and what kind of table is it?
t. What is the target table for this task?
u. What type of table is the target table?
v. True or false? The target table is truncated regardless of whether a full or incremental load is run.

w. What happens to indices when the target table is truncated?
8. Explore task groups.
a. Click the Task Groups tab. A task group is a group of tasks that you define because you want to impose a specific order of execution. A task group is considered to be a "special task."
b. Select TASK_GROUP_Extract_BusinessLocationDimension in the list.
c. Click the Child Tasks subtab. This tab lists all the tasks that belong to the selected task group. The execution order identifies the order in which the tasks are executed.
d. Click the Source Tables (RO) tab. This tab displays a read-only list of the tables used for getting data by the task group. This tab also identifies the tasks to which the tables belong.
e. Click the Target Tables (RO) tab. This tab displays a read-only list of the tables into which the task group loads data. This tab also identifies the tasks to which the tables belong.
9. Explore configuration tags.
a. Click the Configuration Tags tab. A configuration tag is an object that controls the inclusion of tasks in subject areas. When a task is tagged, it is not eligible to be included in the collection of tasks for any subject area, unless the tag is part of the subject area definition (Include Task property).
b. Scroll to the bottom of the list and select the Oracle – Extract Value Set Hierarchies configuration tag.
c. Click the Tasks subtab, which lists the tasks associated with the configuration tag selected in the top window.
d. Click the Subject Areas subtab to view the subject areas that belong to a configuration tag or to add subject areas to a configuration tag for custom containers.
e. Notice the Configuration Tag Tasks Only field. This field indicates whether configuration tag tasks are the only tasks associated with this subject area that will participate in the ETL process. If this check box is selected, only the tasks associated with the configuration tag will be chosen by the DAC when the subject area is assembled.
10. Explore source system parameters.
a. Click the Source System Parameters tab. The Source Systems Parameters tab lists all the source system parameters associated with the selected source system container. It allows you to edit existing parameters and to configure new ones for custom containers.
b. Scroll through the source system parameters to get a sense of the parameters and their values. For example, notice the $$DATASOURCE_NUM_ID parameter, which you worked with in

earlier practices.
11. Explore the Setup view.
a. Click Setup on the toolbar to navigate to the Setup view. The Setup view provides access to functionality related to setting up DAC system properties, Informatica servers, database connections, and email notification.
b. Click the DAC Systems Properties tab. This tab enables you to configure various properties that determine the behavior of the DAC server. You already worked with some of these properties when you configured the DAC in Practice 4-1: Configuring the Training Environment. For more information about DAC systems properties, refer to the Oracle Business Intelligence Data Warehouse Administration Console Guide. The following table lists a few of the properties and their function.
Analyze Frequency (in days): For DAC metadata tables, the frequency (in days) at which the DAC client updates the table and index statistics for the DAC repository. The value must be numerical.
Auto Restart ETL: When set to True: An ETL that is running when the DAC server abnormally terminates will continue running when the DAC server is restarted. When set to False: An ETL that is running when the DAC server abnormally terminates will not automatically restart when the DAC server restarts. The ETL status will be updated to Failed. An administrator will have to manually restart the ETL.
DAC Server Host: The host name of the machine where the DAC server resides. The DAC server and a given DAC repository have a one-to-one mapping; that is, you can only run one DAC server against any given DAC repository. Thus, in the repository you must specify the network host name of the machine where the DAC server is to be run. This property also takes the value localhost. However, this value is provided for development and testing purposes and should not be used in a production environment. You cannot use an IP address for this property.
InformaticaFileParameterLocation: The directory where the Informatica parameter file is stored
Repository Name: A unique name for the DAC repository
c. Click the Informatica Servers tab. The Informatica Servers tab enables you both to register one or more Informatica servers and one Informatica Repository server and to specify how many workflows can be executed in parallel on each server. The DAC server load balances across the servers.
d. Click the Physical Data Sources tab. The Physical Data Sources tab provides access to the connection properties for the physical data sources. In this tab, you can view and edit existing physical data source connections and create new ones.
e. Click the Email Recipients tab. This tab enables you to set up a list of email addresses that will be notified about the status of the ETL process.

12. Explore the Execute view.
a. Click Execute on the toolbar to navigate to the Execute view. The Execute view provides access to functionality that allows you to run, schedule, and monitor execution plans.
b. If necessary, click the Execution Plans tab in the top pane to view the existing execution plans.
c. Select the Financials_Oracle 11.5.10 execution plan.
d. Does the Financials_Oracle 11.5.10 execution plan always execute a full load?
e. Is the Financials_Oracle 11.5.10 execution plan active?
f. What is the number of prune days assigned to this execution plan? Recall that the DAC subtracts the number of prune days from the LAST_REFRESH_DATE of a given source and supplies this as the value for the $$LAST_EXTRACT_DATE parameter.
g. Will the tables associated with this execution plan be analyzed?
h. Is the Financials – Revenue subject area associated with this execution plan?
i. In which Informatica folder are the SDE mappings located? Hint: Click the Parameters tab.
j. Which tasks must be completed before this execution plan is run?
k. Which tasks must be completed after this execution plan is run?
l. Click the Ordered Tasks subtab.
m. Does the SIL_EmployeeDimension task execute before or after the SIL_GLRevenueFact task?
n. How do you determine this?
o. What is the primary source for the SIL_GLRevenueFact task?

p. What is the primary target for the SIL_GLRevenueFact task?
q. In which task phase of the ETL process is SIL_GLRevenueFact executed?
r. Which ETL mappings (commands) are used to run the SIL_GLRevenueFact task?
s. If necessary, select the SIL_GLRevenueFact task in the Ordered Tasks subtab.
t. Click Details.
u. How many tasks are immediate successors to the SIL_GLRevenueFact task?
v. What is the target table for the SIL_GLRevenueFact task?
w. Is W_GL_REVN_FS a source table for the SIL_GLRevenueFact task?
x. Close Details.
y. Leave the DAC open. You will learn about the remaining Execute tabs (Current Run, Run History, Scheduler) in the practices for Lesson 11 "Customizing DAC Metadata and Running an Execution Plan."

Solutions 9-1: Exploring the DAC
Answers
3. Notice there are three logical folders, Extract, Load, and Post Load, listed for the Oracle 11.5.10 container and that each logical folder points to a physical folder. Where are the physical folders located? In the Oracle Informatica repository (Oracle_BI_DW_Base)
4. How many of the tasks associated with the Financials – Revenue subject area are a task group? Hint: Use the query feature. 16
7.f. In which phase of the ETL process does this task occur? Extract
7.g. Is this task executed via an external program, Informatica, a SQL file, or a stored procedure? Informatica
7.h. Are there any pre- or post-SQL commands executed with this task? No
7.i. If this task fails, will dependent tasks be executed? No. The Continue on Error property is not selected. When this check box is selected, the dependent tasks are not stopped if the command fails.
7.j. Are the source tables for this task located in the transactional database or the data warehouse? Transactional
7.k. Which of the following is not a source table for this task? W_GL_REVN_F
7.l. What is the target table for this task? W_GL_REVN_FS
7.m. What type of table is the target table? Fact staging table
7.o. In which phase of the ETL process does this task occur? Load fact
7.p. Is this task executed via an external program, Informatica, a SQL file, or a stored procedure? Informatica

7.q. Are the source tables for this task located in the transactional database or the data warehouse? Data warehouse
7.r. Which source table for this task is the target table for the SDE_ORA_GLRevenueFact task? W_GL_REVN_FS
7.s. What is the primary source table for this task and what kind of table is it? W_GL_REVN_FS, a fact staging table
7.t. What is the target table for this task? W_GL_REVN_F
7.u. What type of table is the target table? Fact table
7.v. True or false? The target table is truncated regardless of whether a full or incremental load is occurring. False. Only Truncate for Full Load is selected. This indicates that the target table will be truncated only when a full load is occurring.
7.w. What happens to indices when the target table is truncated? Any indices registered for this table are dropped before the command is executed and then re-created after the command completes successfully. When indices are dropped and created, the table is analyzed so that the index statistics are up-to-date.
12.d. Does the Financials_Oracle 11.5.10 execution plan always execute a full load? No. The Full Load Always property is not selected.
12.e. Is the Financials_Oracle 11.5.10 execution plan active? Yes. The Inactive property is not selected.
12.f. What is the number of prune days assigned to this execution plan? 30
12.g. Will the tables associated with this execution plan be analyzed? Yes. The Analyze property is selected.
12.h. Is the Financials – Revenue subject area associated with this execution plan? Yes. Click the Subject Areas subtab to see the subject areas associated with this execution plan.
12.i. In which Informatica folder are the SDE mappings located? SDE_ORA11510_Adaptor. Click the Parameters subtab to see the selected execution plan's parameters for database connections and Informatica folders.
12.j. Which tasks must be completed before this execution plan is run? None. There are no tasks listed in the Preceding Tasks subtab.

12.k. Which tasks must be completed after this execution plan is run? None. There are no tasks listed in the Following Tasks subtab.
12.m. Does the SIL_EmployeeDimension task execute before or after the SIL_GLRevenueFact task? Before
12.n. How do you determine this? One way is to examine the Depth property in the Ordered Tasks tab. The Depth property determines the level of a task's dependency. Tasks that have no dependency have a depth of 0. Tasks that depend on other tasks of depth 0 have a depth of 1, and so on. The SIL_EmployeeDimension task has a depth of 26, so it executes before the SIL_GLRevenueFact task, which has a depth of 29.
12.o. What is the primary source for the SIL_GLRevenueFact task? DataWarehouse
12.p. What is the primary target for the SIL_GLRevenueFact task? DataWarehouse
12.q. In which task phase of the ETL process is SIL_GLRevenueFact executed? Load Fact
12.r. Which ETL mappings (commands) are used to run the SIL_GLRevenueFact task? SIL_GLRevenueFact_Full for full loads, SIL_GLRevenueFact for incremental loads
12.u. How many tasks are immediate successors to the SIL_GLRevenueFact task? Two: SDE_ORA_Stage_GLRevenueFact_AGGRDerive and SDE_ORA_Stage_GLRevenueFact_GRFDerive
12.v. What is the target table for the SIL_GLRevenueFact task? W_GL_REVN_F
12.w. Is W_GL_REVN_FS a source table for the SIL_GLRevenueFact task? Yes, it is the primary source.


Practice 10-1: Configuring Common Areas and Dimensions Before Running a Full Load
Goal
To explore pre-load configuration steps that apply to Oracle Business Intelligence Applications deployed with any source system
Scenario
This practice explores configuration settings for Oracle Business Intelligence that you need to apply for any applications you deploy (for example, Oracle Financial Analytics, Oracle Human Resources). This includes steps required before a full data load and steps for controlling your data set. You explore the configuration settings in the pre-built Oracle 11.5.10 source container.
Time
10–15 minutes
Instructions:
1. Configure the initial extract date. The initial extract date is required when you extract data for a full load. The specified initial extract date will be used as a filter on the creation date of OLTP data in the selected full extract mapping. It reduces the volume of data in the initial load. When you set the Initial Extract Date parameter, make sure that you set it to the beginning of an accounting period, and not a date in the middle of an accounting period. In this example, you want to extract data from the first fiscal month of 2004, which begins December 1, 2003.
a. If necessary, select the Design view in the DAC client.
b. If necessary, select the Oracle 11.5.10 container from the list to the right of the Execute button.
c. Click the Source System Parameters tab.
d. Scroll or query to locate the $$INITIAL_EXTRACT_DATE parameter.
e. Display the Edit tab.
f. If you were working in an editable custom container, you would enter an initial extract date in the Value field. You learn how to do this in the next set of practices.
2. Configure global currencies. Currency conversions are required because your business might have transactions involving multiple currencies. To create a meaningful report, you have to use a common currency. Out of the box, Oracle Business Intelligence Applications provides three global currencies, which are the common currencies used by the data warehouse. The global currency is useful when creating enterprise-wide reports. For example, if your organization is a multinational enterprise that has its headquarters in the United States, you probably want to choose US dollars (USD) as one of the three global currencies.
a. In the Source Systems Parameters tab, scroll or query to locate the parameters used to set the global currencies and verify that the values are set according to the following table, where USD = US Dollars:

Parameter: $$GLOBAL1_CURR_CODE  Value: USD
Parameter: $$GLOBAL2_CURR_CODE  Value: USD
Parameter: $$GLOBAL3_CURR_CODE  Value: USD
b. Note that you must spell the currencies as they are spelled in your OLTP source system.
3. Configure exchange rate types. When Oracle Business Intelligence Applications converts your transaction records' amount from document currency to global currencies, it also requires the exchange rate types to use to perform the conversion. Oracle Business Intelligence Applications provides three global exchange rate types for you to configure. For each of the global currencies, Oracle Business Intelligence Applications also allows you to specify the exchange rate type to use to perform the conversion.
a. In the Source Systems Parameters tab, scroll or query to locate the three parameters used to configure the global currency exchange rate types and verify that the values are set according to the following table:
Parameter: $$GLOBAL1_RATE_TYPE  Value: Corporate
Parameter: $$GLOBAL2_RATE_TYPE  Value: Corporate
Parameter: $$GLOBAL3_RATE_TYPE  Value: Corporate
b. Note that you must spell the exchange rate type values as they are spelled in your OLTP source system.
c. Scroll or query to locate the $$DEFAULT_LOC_RATE_TYPE parameter, which is used to configure the exchange rate type for document currency to local currency conversion, and verify that it is set to Corporate.
4. Configure fiscal calendars. The default installation of Oracle Business Intelligence Applications supports one fiscal calendar. Fiscal data is first loaded in the W_DAY_D table, and then the SIL mappings read data from W_DAY_D and load data into the aggregate fiscal time dimension tables such as Fiscal Week, Fiscal Month, Fiscal Quarter, and Fiscal Year. You may choose to provide fiscal calendar information in terms of the fiscal weeks of your organization or in terms of the fiscal months of your organization. In either case, you would need to enter accurate fiscal data, as there is no check done with the Informatica mappings. In this example, you explore setting up fiscal calendars by fiscal month. In this case, the SIL mappings are designed to derive the Fiscal Week from the start date and end date of a fiscal month by grouping days into periods of seven days.
a. Navigate to C:\OracleBI\dwrep\Informatica\SrcFiles and open fiscal_month.csv. Important: Please note that you are exploring this file in the C:\OracleBI\dwrep\Informatica\SrcFiles directory. In your development and production environments, you would modify the .csv files in the \Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles directory. Recall that in Practice 4-1 you copied this file and other .csv files from this directory to the \Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles directory so the files can be read by the Informatica mappings. This is true for this file and all other .csv files discussed in this set of practices.
b. Notice that the file contains the fiscal year, the fiscal month, and the start date of the fiscal month in YYYYMMDD format.
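To make step 4b concrete, here is a hypothetical sketch of the kind of rows fiscal_month.csv holds. The column headings and sample values below are assumptions for illustration only and are not copied from the shipped file, so check the actual header row in your environment before editing it.
    YEAR,MONTH,START_DT
    2004,1,20031201
    2004,2,20040101
    2004,3,20040201
Each row supplies one fiscal month: the fiscal year, the fiscal month number, and the month's start date in YYYYMMDD format, which is exactly the information the step above says the file carries.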

c. Close fiscal_month.csv without making any changes.
5. Verify the fiscal calendar tasks in the DAC.
a. Return to the DAC and click the Tasks tab.
b. Query for the SIL_DayDimension_FiscalMonth_Extract task and verify that it is active (that the inactive box is not checked). This task is active because in this example you have chosen to provide fiscal information in terms of fiscal months.
c. Query for the SIL_DayDimension_FiscalWeek_Extract task and verify that it is inactive (that the inactive box is checked). This task is not active because in this example you have chosen to provide fiscal information in terms of fiscal months, not fiscal weeks.
6. Configure DATASOURCE_NUM_ID, which is a system column in the data warehouse that uniquely identifies a data source category and indicates which source systems the data comes from. Data sources that are supported by Oracle BI have predefined DATASOURCE_NUM_ID values. For example, the value 4 indicates an Oracle 11.5.10 data source and the value 1 indicates a Siebel source.
a. In the DAC client, click the Setup view.
b. Click the Physical Data Sources tab.
c. Select the ORA_11_5_10 data source.
d. In the Edit subtab, verify that the Data Source Number value is set to 4 for the Oracle 11.5.10 data source.
e. Leave the DAC open.
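If you want to see the effect of this setting after a load has run, a quick SQL*Plus check such as the following can be used. This is only a sketch: it assumes the obaw warehouse schema used later in these practices and assumes the fact table carries the standard DATASOURCE_NUM_ID column, which is not stated explicitly in this step.
    SELECT DISTINCT DATASOURCE_NUM_ID
    FROM   W_GL_REVN_F;
Rows extracted from the Oracle 11.5.10 source should show the value 4 in this column.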

Practice 10-2: Configuring General Ledger Account Hierarchies
Goal
To configure general ledger account hierarchies using general ledger accounting flexfield value set definitions
Scenario
There are two ways to set up hierarchies in Oracle Financial Analytics: by using flexfield value set definitions for general ledger accounting or by using the Financial Statement Generator (FSG) report definition. In this example, you explore the steps for setting up general ledger account hierarchies using general ledger accounting flexfield value sets definitions. For more information about using the Financial Statement Generator (FSG) report definition, refer to the Oracle Business Intelligence Applications Fusion Edition Installation and Configuration Guide.
Whichever method you choose to set up general ledger account hierarchies, you store the hierarchy information in the W_HIERARCHY_D table. Setting up hierarchies using general ledger accounting flexfield value sets definitions requires establishing relationships between three data warehouse tables: W_HIERARCHY_D, which is a generic hierarchy table that stores the hierarchies; W_GL_ACCOUNT_D, which stores general ledger accounts and general ledger code combinations; and W_GL_BALANCE_A, which stores general ledger account balances aggregated by general ledger account segment codes and segment attributes.
Time
40–50 minutes
Instructions:
1. Explore general ledger account hierarchies in W_HIERARCHY_D. W_HIERARCHY_D stores up to 20 hierarchies in a flattened structure, including the leaf nodes. Each record stores the top node to leaf node path in the HIER_CODE to HIER20_CODE columns. Columns HIER_CODE to HIER20_CODE and HIER_NAME to HIER20_NAME store the code/name values for each level within a hierarchy. The following screenshot shows a partial view of W_HIERARCHY_D:

2. Explore the file_glacct_segment_config_ora.csv file. Before you run the ETL process for general ledger accounts, you need to specify the segments that you want to analyze. To specify the segments that you want to analyze, you use this ETL configuration file.
a. Navigate to C:\OracleBI\dwrep\Informatica\SrcFiles.
b. Open the file file_glacct_segment_config_ora.csv. The screenshot shows a partial view of the file:
This means that in E-Business Suite, chart of accounts 101 has department information stored in SEGMENT2 and value set ID = 1002471. Chart of accounts 50214 has department information stored in SEGMENT4 and value set ID = 1002725. In this file, SEG1 stores Department information, SEG2 stores Account information, and SEG3 stores Company information. The objective of this configuration file is to make sure that when segment information is extracted into the warehouse table W_GL_ACCOUNT_D, segments with the same nature from different charts of accounts are stored in the same column in W_GL_ACCOUNT_D. In this file, you need to specify the segments of the same nature in the same column. Thus, segments from both charts of accounts (101 and 50214) are stored in the same SEG1/SEG1_VALUESETID paired values. Therefore, using this file allows you to uniformly store segments of the same nature in the same column in the data warehouse regardless of how they are stored in EBS. Because there are 30 paired segment columns in W_GL_ACCOUNT_D, you could proceed to add up to 30 paired values. However, you should only add as many as you need to analyze your facts by value set hierarchies.
c. Explore W_GL_ACCOUNT_D. When ETL is run, all SEG1 segments (Department) from all charts of accounts will be stored in the ACCOUNT_SEG1* columns, all SEG2 segments (Account) from all charts of accounts will be stored in the ACCOUNT_SEG2* columns, all SEG3 segments (Company) from all charts of accounts will be stored in the ACCOUNT_SEG3* columns, and so forth if you add more paired values. The screenshot shows a populated W_GL_ACCOUNT_D based on the information provided in the file_glacct_segment_config_ora.csv file.
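As an illustration of the pairing that file_glacct_segment_config_ora.csv expresses, the rows below are hypothetical: the header labels are assumptions and the real file carries additional SEG2/SEG3 pairs, so treat this only as a sketch of the chart-of-accounts-to-SEG1 mapping described in step 2b (the segment names and value set IDs shown are the ones quoted above).
    CHART OF ACCOUNTS ID,SEG1,SEG1_VALUESETID
    101,SEGMENT2,1002471
    50214,SEGMENT4,1002725
Both rows place Department in the SEG1 pair, which is why departments from both charts of accounts end up in the same ACCOUNT_SEG1* columns of W_GL_ACCOUNT_D.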

The screenshot shows only a partial view of W_GL_ACCOUNT_D:
As expected, segments with value set ID 1002471 are stored in ACCOUNT_SEG1* columns, and segments with value set ID 1002472 are stored in ACCOUNT_SEG2* columns.
d. Scroll to the bottom of file file_glacct_segment_config_ora.csv. Notice that aggregation is set to Y for all segment columns. This means that in W_GL_BALANCE_A (where you store GL account balances at an aggregated level), you want to store GL account balances at the company, account, and department level instead of at the GL code combination level.
e. Close file_glacct_segment_config_ora.csv without making any changes.
f. In the DAC client, click the Subject Areas tab.
g. Scroll or query to locate the Financials – General Ledger subject area.
h. With the Financials – General Ledger subject area selected, click the Configuration Tags subtab.
i. Verify that the tag Oracle – Extract Value Set Hierarchies is active (that the inactive box is not checked). This tag is active because in this example you have chosen to set up hierarchies in Oracle Financial Analytics using the flexfield value set definitions for general ledger accounting, not using the Financial Statement Generator (FSG) report definition.
j. Verify that the tag Oracle – Extract FSG Hierarchies is inactive (that the inactive box is checked). This tag is inactive because in this example you have chosen to set up hierarchies in Oracle Financial Analytics using the flexfield value set definitions for general ledger accounting. Refer to the Oracle Business Intelligence Applications Fusion Edition Installation and Configuration Guide for more information about using the Financial Statement Generator (FSG) report definition to set up hierarchies.
3. Explore the hierarchy information in the physical layer of the preconfigured repository.
a. Navigate to C:\OracleBI\server\Repository.
b. Double-click OracleBIAnalyticsApps.rpd to open it in the Administration Tool.
c. Log in as Administrator with password SADMIN.
d. In the physical layer, expand Oracle Data Warehouse > Catalog > dbo.
e. Scroll to locate Dim_W_HIERARCHY_D_ValueSet1. Note that Dim_W_HIERARCHY_D_ValueSet1 is an alias table that points to the W_HIERARCHY_D table. This table is used to establish the repository hierarchies based on the flexfield value set definitions you explored earlier.
f. Double-click Dim_W_HIERARCHY_D_ValueSet1 to open the Alias Physical Table properties dialog box. (Recall that W_HIERARCHY_D is a generic hierarchy table. It stores hierarchies in a flattened structure. Columns HIER_CODE to

HIER20_CODE and HIER_NAME to HIER20_NAME store the code/name values for each level within a hierarchy. Each record stores the top node to leaf node path in the HIER_CODE to HIER20_CODE columns.)
g. Click Cancel to close the properties dialog box.
h. Right-click Dim_W_HIERARCHY_D_ValueSet1 and select Physical Diagram > Object(s) and Direct Joins.
i. Notice that Dim_W_HIERARCHY_D_ValueSet1 joins to Dim_W_GL_ACCOUNT_D, which is an alias that points to the W_GL_ACCOUNT_D table, and to Fact_Agg_W_GL_BALANCE_A, which is an alias that points to the W_GL_BALANCE_A table. Recall that W_GL_BALANCE_A stores general ledger account balances aggregated by general ledger account codes and segment attributes and W_GL_ACCOUNT_D stores general ledger accounts and general ledger code combinations.
j. Double-click the connector between Dim_W_HIERARCHY_D_ValueSet1 and Dim_W_GL_ACCOUNT_D to open the Physical Foreign Key dialog box.
k. Notice the join in the Expression field:
Dim_W_HIERARCHY_D_ValueSet1.HIER_CODE = Dim_W_GL_ACCOUNT_D.ACCOUNT_SEG1_ATTRIB AND Dim_W_HIERARCHY_D_ValueSet1.HIER20_CODE = Dim_W_GL_ACCOUNT_D.ACCOUNT_SEG1_CODE
The code for the highest level in the hierarchy (HIER_CODE) joins to the column in W_GL_ACCOUNT_D that stores the segment one attribute. The code for the last level (the detail leaf level) in the hierarchy (HIER20_CODE) joins to the column in W_GL_ACCOUNT_D that stores the segment one code.
l. Click Cancel to close the Physical Foreign Key dialog box.
m. Double-click the connector between Dim_W_HIERARCHY_D_ValueSet1 and Fact_Agg_W_GL_BALANCE_A to open the Physical Foreign Key dialog box.
n. Notice the join in the Expression field:
Dim_W_HIERARCHY_D_ValueSet1.HIER_CODE = Fact_Agg_W_GL_BALANCE_A.ACCOUNT_SEG1_ATTRIB AND Dim_W_HIERARCHY_D_ValueSet1.HIER20_CODE = Fact_Agg_W_GL_BALANCE_A.ACCOUNT_SEG1_CODE
The code for the highest level in the hierarchy (HIER_CODE) joins to the column in W_GL_BALANCE_A that stores the segment one attribute. The code for the last level in the hierarchy (HIER20_CODE) joins to the column in W_GL_BALANCE_A that stores the segment one code. W_GL_BALANCE_A has only six segment columns. So, if you have more than six hierarchies, join only the first six to W_GL_BALANCE_A but join all hierarchies to W_GL_ACCOUNT_D as in the previous step.
o. Click Cancel to close the Physical Foreign Key dialog box.
p. Close the Physical Diagram.
4. Explore the hierarchy information in the business model and mapping layer of the preconfigured repository.

a. In the Business Model and Mapping layer, expand Core and scroll to locate Dim – GL ValueSetHierarchy1.
b. Right-click the Dim – GL ValueSetHierarchy1 logical table (not the logical table source) and select Business Diagram > Selected Tables and Direct Joins to open the Logical Table Diagram, which displays all the logical fact tables that have a logical join to the logical hierarchy table Dim – GL ValueSetHierarchy1.
c. Close the Logical Table Diagram.
d. Expand Dim – GL ValueSetHierarchy1 > Sources and double-click Dim_W_HIERARCHY_D_ValueSet1 to open the Logical Table Source dialog box.
e. Click the Column Mapping tab and notice that the logical code/name columns map to the physical code/name columns in Dim_W_HIERARCHY_D_ValueSet1.
f. Click the Content tab.
g. In the WHERE clause filter, notice that a HIER_CODE filter is specified to restrain the output of the logical table to one hierarchy only:
"Oracle Data Warehouse".Catalog.dbo.Dim_W_HIERARCHY_D_ValueSet1.HIER_CODE = 1002470
Here, 1002470 is the value set hierarchy ID of the segment for which you are creating the hierarchy. In this example, 1002470 corresponds to the highest level hierarchy in W_HIERARCHY_D.
h. Click Cancel to close the Logical Table Source dialog box.
i. Close OracleBIAnalyticsApps.rpd without saving.
j. Close the Oracle BI Administration Tool.
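The Content tab filter in the step above means that, at query time, this logical table only ever returns rows for the 1002470 value set hierarchy. A rough SQL equivalent of what that filter does against the warehouse is sketched below; this is for understanding only (using only the column names named in this practice) and is not a query the repository literally issues.
    SELECT HIER_CODE, HIER_NAME, HIER20_CODE, HIER20_NAME
    FROM   W_HIERARCHY_D
    WHERE  HIER_CODE = 1002470;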

Practice 10-3: Mapping Oracle GL Natural Accounts to Group Account Numbers
Goal
To map Oracle general ledger natural accounts to group account numbers
Scenario
Oracle EBS General Ledger (GL) does not contain business attributes that represent real world entities such as supplier, customer, employee, and so on; for example, the supplier dimension in accounts payables (AP) and the customer dimension in account receivables (AR). This information resides in the subledgers. In Oracle GL, the transactions are tracked at an account level and used more for bookkeeping purposes. Therefore, in order to facilitate reporting on the GL transactions in a data warehouse environment, OBIEE Financial Applications uses the group account number to categorize the accounting entries. It is critical that general ledger account numbers are mapped to the group account numbers (or domain values) because the group account number is used during data extraction as well as front-end reporting.
Time
10–15 minutes
Instructions:
1. Explore W_GL_ACCOUNT_D, which is the data warehouse dimension table that stores all the general ledger accounts associated with any organization. The GROUP_ACCOUNT_NUM field in W_GL_ACCOUNT_D denotes the nature of the general ledger accounts (for example, cash account or payroll account). The screenshot provides a partial snapshot of W_GL_ACCOUNT_D:
2. Explore the file_group_acct_names.csv file.
a. Navigate to C:\OracleBI\dwrep\Informatica\SrcFiles and open file_group_acct_names.csv. This file provides a list of group account numbers that you can use. For example, AP equals ACCOUNTS PAYABLES and AR equals ACCOUNTS RECEIVABLE.

It includes the financial statement item code. For example. A list of domain values for general ledger account numbers is also provided in the Oracle Business Analytics Warehouse Data Model Reference.csv without making any changes. Each row maps all accounts within the specified account number range and within the given chart of account ID to a group account number. b. Navigate to C:\OracleBI\dwrep\Informatica\SrcFiles. a. in the partial view of the file in the screenshot below. which determines the nature of the account. Explore the data model reference. all accounts within the account number range from 4110 to 4110 that have a chart of accounts ID equal to 101 are assigned to the Revenue group account number.csv.csv file. Close file_group_acct_names. Open file_group_acct_codes_ora.Lesson 10: Configuring Analytical Applications RECEIVABLE. All accounts within the account number range from 5110 to 5110 that have a chart of accounts ID equal to 101 are assigned to Cost of Goods Sold (COGS) group account 86 Oracle BI Applications 7.9: Implementation for Oracle EBS . Explore the file_group_acct_codes_ora. The screenshot shows only a partial view of the table: 4. This file provides the logic for assigning accounts. 3. The screenshot provides only a partial view of file_group_acct_names.csv: b.

If you create a new group account number, add a new row to this file. You must also provide the general ledger account category code (GL_ACCOUNT_CAT_CODE), which determines whether the account is a balance sheet account (BS) or a profit and loss account (PL). If you need to create a new group of account numbers, you can create new rows in the file_group_acct_names.csv file. You can then assign GL accounts to the new group of account numbers in the file_group_acct_codes_ora.csv file.
c. Close file_group_acct_codes_ora.csv without making any changes.
5. Explore the file_grpact_fstmt.csv file.
a. Open the C:\OracleBI\dwrep\Informatica\SrcFiles\file_grpact_fstmt.csv file. This file specifies the relationship between a group account number and a financial statement item code. You must map the new group account number to one of the following financial statement item codes: AP, AR, COGS, REVENUE, TAX, OTHERS. By mapping your GL accounts against the group account numbers and then associating the group account number to a financial statement item code, you indirectly associate the GL account numbers to the financial statement item codes as well. The screenshot shows a partial view of the file_grpact_fstmt.csv file:
These financial statement item codes correspond to the following six base fact tables in the Financial Analytics product:
AP base fact (W_AP_XACT_F)
AR base fact (W_AR_XACT_F)
Revenue base fact (W_GL_REVN_F)
Cost of Goods Sold base fact (W_GL_COGS_F)
Tax base fact (W_TAX_XACT_F)
GL Journal base fact (W_GL_OTHER_F)
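A hypothetical fragment of file_grpact_fstmt.csv, restating the relationship just described, might look like the rows below. The header labels and the CASH group account are assumptions used only to show the shape of the file; the six financial statement item codes themselves are the ones listed in step 5a.
    GROUP_ACCT_NUM,FIN_STMT_ITEM_CODE
    REVENUE,REVENUE
    COGS,COGS
    CASH,OTHERS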

b. Close file_grpact_fstmt.csv without making any changes.

Practice 10-4: Creating a New Metric Based on a New Group Account Number
Goals
To create a new metric in the repository after adding a new group account number
Scenario
When you add a new group account number in file_group_acct_codes_ora.csv, it does not automatically show up in your reports. In the Oracle BI repository you need to create a new metric that maps to the new group account number.
Time
10–15 minutes
Instructions:
1. Explore how measures are mapped to group account numbers in the OBI repository.
a. Navigate to C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles.
b. Open file_group_acct_codes_ora.csv.
c. Scroll to line 11 and notice that there is a group account number named PPAID EXP, which stands for prepaid expenses. Leave the file open.
d. Select Start > Programs > Oracle Business Intelligence > Administration to open the Administration Tool.
e. Select File > Open > Offline.
f. Select OracleBIAnalyticsApps.rpd and click Open. The repository is large, so it may take a moment to open.
g. Log in as Administrator with password SADMIN.
h. In the Presentation layer, expand Financials – GL Balance Sheet > Facts – Balance Sheet Statement.
i. Right-click Prepaid Expenses and select Display Related > Logical Column to open the Query Repository dialog box.
j. Select the Prepaid Expenses logical column in the list and click Go To.
k. The Prepaid Expenses logical column is displayed in the Business Model and Mapping layer.
l. Double-click the Prepaid Expenses logical column in the Business Model and Mapping layer to open the Logical Column dialog box.

m. Click Cancel to close the Logical Column dialog box.
2. Create a new group account number in file_group_acct_codes_ora.csv.
a. Return to file_group_acct_codes_ora.csv and insert a new row below the first row with the following data:
b. Save and close file_group_acct_codes_ora.csv. Click Yes if prompted to keep the CSV format.
c. In the Business Model and Mapping layer, right-click the Prepaid Expenses measure and select Duplicate.
d. Scroll down, locate the duplicate measure Prepaid Expense#1, and rename it to TEST.
e. Double-click TEST to open the Logical Column dialog box. If necessary, click the General tab and notice that the corresponding fact table (GL Balance fact) in the BMM layer joins with the GL Account Dimension and filters for those GL accounts that belong to the group account number PPAID EXP. Thus, the metric Prepaid Expenses is the total amount coming from accounts which have a group account number equal to PPAID EXP.
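The screenshot that gave the exact row for step 2a is not reproduced here. Purely as a hypothetical illustration of the row's format (the chart of accounts ID and the account range below are invented and must be replaced with values that exist in your source), the inserted row follows the same pattern as the existing rows and ends with the new TEST group account number used later in this practice:
    101,1110,1110,TEST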

f. In the filter, replace PPAID EXP with TEST. The metric TEST is now the total amount coming from accounts that have a group account number equal to TEST.
g. Click OK to close the Logical Column dialog box.
h. Drag the new TEST measure to Financials – GL Balance Sheet > Facts – Balance Sheet Statement in the Presentation layer. The new measure is now available to be viewed in Oracle BI Presentation Services requests and dashboards.
i. Save the repository. Do not check global consistency.
j. Close the repository and the Administration Tool.
3. Typically at this point, you would use the DAC to run the Financials_Oracle 11.5.10 execution plan, which would extract data from the Oracle E-Business Suite source system, transform the data via Informatica mappings, and load the data into the appropriate tables in the Oracle Business Analytics Data Warehouse. However, running an initial extract for Financials_Oracle 11.5.10 takes 4-5 hours, which is prohibitive given the time constraints of this training. Therefore, you run ETL for a custom execution plan in the next set of practices. That execution plan completes in approximately 90 minutes.


Practice 11-1: Customizing DAC Metadata
Goals
To customize the DAC by creating a custom container, a new subject area, and an execution plan
Scenario
Oracle Business Intelligence Applications provides preconfigured subject areas and execution plans. You can change these preconfigured objects or create new objects to correspond to your particular business requirements. The DAC metadata for a source system is held in a container. You cannot change the metadata for preconfigured containers. If you want to customize the metadata in a preconfigured container, you must first make a copy of the container. The DAC keeps track of all customizations in the copied container, so that at any time you can find the newly created objects and modified objects, as well as the original objects.
Please notice that a preconfigured Financials – Revenue subject area already exists. But for training purposes, after examining your business requirements, you determine that the only subject area you want to analyze is Financials – Revenue. In this practice, you create and run a custom subject area and execution plan limited to the Financials - Revenue fact table.
Time
70–90 minutes (depending on how much time the ETL takes to run)
Instructions:
1. Create a copy of a preconfigured source system container.
a. Return to the DAC client, which should still be open. If not, select Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client and log in to the DAC connection using dac as the table owner name and password.
b. Select File > New Source System Container.
c. Select Create as a Copy of Existing Container.
d. Select the Oracle 11.5.10 container from the list and click OK.
e. In the ID field, enter Custom.
f. In the Name field, enter Custom.
g. Click OK. It should take about five minutes to complete. When you see the success message, click OK.
2. Configure the initial extract date.
a. If necessary, select the Design view in the DAC client.
b. If necessary, select the Custom container from the list to the right of the Execute button.
c. Click the Source System Parameters tab.
d. Scroll or query to locate the $$INITIAL_EXTRACT_DATE parameter.
e. Display the Edit subtab.
f. Click the check icon in the Value field to open the Enter Parameter Value dialog box.
g. In the Date field, click the calendar icon to open the Date dialog box.
h. Enter December 1, 2003, Hour: 12 AM, Minute: 0, Second: 0.

i. Click OK to close the Date dialog box.
j. Click OK to close the Enter Parameter Value dialog box.
k. Click Save.
3. Create a custom subject area.
a. Select the Subject Areas tab.
b. Click New.
c. In the Edit subtab, enter Custom - Revenue as the name of the subject area and click Save.
d. Select the Tables subtab.
e. Click Add/Remove.
f. In the Choose Tables dialog box, query for W_GL_REVN_F.
g. Notice that in the left-hand list of the Choose Tables dialog box, the W_GL_REVN_F table appears in a green italic font to indicate that the DAC object is a referenced object, in this case owned by the Oracle 11.5.10 container. (A cloned object is a referenced object that has been modified in the referencing container.)
h. Click Add to add W_GL_REVN_F to the right-hand list of tables that belong to the custom subject area. In the right-hand list, the W_GL_REVN_F table appears in standard font. This indicates that the object is not a reference or a cloned object but is unique to the current container.
i. Click OK in the Adding window to acknowledge that W_GL_REVN_F is added to the subject area.
j. Click OK to close the Choose Tables dialog box.
k. Click Save.
l. Click the Tasks subtab and notice that, because you have never assembled the Custom – Revenue subject area, the subject area currently has no tasks associated with it. Also notice that the Last Designed attribute of the Custom – Revenue subject area is null. This attribute stores the time stamp for the most recent assembly of a subject area.
4. Assemble the subject area. The DAC assembles a subject area by determining what dimensions and other related tables are required and what tasks are needed to load these tables.
a. Verify that the Custom – Revenue subject area is selected and click Assemble.
b. In the Assembling window, select the Selected record only option and click OK.
c. Click Yes to confirm that you want to proceed. The Updating dialog box opens. The assembly of the subject area takes several minutes.
d. Click OK to acknowledge that the subject area has been successfully assembled.
e. Click Refresh in the Tasks subtab to refresh the list of tasks associated with the custom subject area.
f. Scroll through the list and notice that the DAC has assembled a list of all the tasks that prepare, extract, and load the W_GL_REVN_F fact table and all of its related tables. Notice that all tasks generated by the assembly of the subject area are marked as Autogenerated, which identifies tasks that are added by the DAC.
g. In the upper-right corner of the Task subtab, verify that 1 of 214 is displayed, indicating that there are 214 tasks associated with this subject area.
h. Notice also that the Last Designed attribute for the Custom – Revenue subject area is now populated.
i. Click Save.

5. Create a custom execution plan and add the Custom – Revenue subject area to it.
a. Click the Execute button to open the Execute view.
b. Click New.
c. In the Edit subtab, enter Custom – Revenue as the name of the execution plan and click Save.
d. Select the Subject Areas child tab.
e. Click Add/Remove.
f. In the table list on the left, select the Custom – Revenue subject area and click Add to add it to the list of subject areas on the right belonging to the custom execution plan.
g. Click OK in the Adding window to acknowledge that the Custom – Revenue subject area is added to the execution plan.
h. Click OK to close the Choose Subject Areas dialog box.
i. Click Save.
6. Generate execution plan parameters for the Custom – Revenue execution plan that you have created. Execution plan parameters include the Informatica repository folders related to the container that the execution plan belongs to, as well as data source connections, and the execution plan-level parameters that specify connection details for the transactional database sourced by the execution plan as well as the warehouse that it targets. Recall that, as previously noted, the connection details specified in the execution plan parameters must match the physical data sources specified in the Setup view, which, in turn, must match the source and target relational connections configured in the Informatica repository. At run time, the DAC Server compiles a file for each Informatica DAC task that contains Informatica Server parameters and writes it to the c:\Informatica\PowerCenter8.1.1\server\infa_shared\SessLogs directory. This file includes generic as well as task-level parameters set at the task object level. The task's parameter file, once generated, is referenced in the PMCMD command issued for the task during the execution of an execution plan.
a. In the Execution Plan list, select the Custom – Revenue execution plan.
b. Select the Parameters child tab and click Generate. A list of seed parameters is generated. The parameter names are generated as they will appear in the parameter file generated at run time.
c. Click OK to acknowledge that new values have been set for the execution plan parameters.
d. You need only update the pertinent connection parameters. Enter the following values to update the DATASOURCE parameters, which are to be updated:
DBConnection_OLAP: DataWarehouse
DBConnection_OLTP: ORA_11_5_10
FlatFileConnection: ORA_11_5_10_Flatfile
If you need to verify that the values are correct, you can navigate to the Setup view and check the names of the Source and Warehouse type connections.
e. Click Save.
7. Build the execution plan. When you build an execution plan, the DAC compiles and sets precedence for the tasks required to load the subject areas included in the plan.
a. Click the Ordered Tasks child tab for the Custom – Revenue execution plan and notice that, because you have never built the execution plan, it currently has no tasks associated

b. Tasks that have no dependencies have a depth of 0.Lesson 11: Customizing DAC Metadata and Running an Execution Plan b. this attribute stores the time stamp indicating when the execution plan was last built. Run the custom execution plan. Monitor the ETL plan execution. indicating that a plan is being executed. Also. extract. Verify that the DAC Server is started. 10. e. Tasks that depend on other tasks having a depth of 0 have a depth of 1. b. Accept the default and click OK in the Building window. f. Click OK to acknowledge that the request has been successfully submitted to the Informatica Server. c. Also.9: Implementation for Oracle EBS . d. 9.” When the DAC client cannot establish a connection to the DAC Server. If the client is connected to a server that is running an ETL process. As with assembling a subject area. indicating its execution precedence at plan run time. the icon resembles a green electrical plug with a lightning sign superimposed on it. Click Auto Refresh and set the automatic refresh frequency to 30 seconds. verify that 1 of 214 is displayed. In the Starting ETL dialog box. and so on. click Yes to confirm that you want to start the execution plan. Verify that the Custom – Revenue execution plan is selected and click Build. select Start > Programs > Oracle Business Intelligence > Oracle DAC > Start DAC Server and verify that the DAC Server Monitor icon resembles an orange electrical plug in a socket before continuing. again accept the default to perform the operation for the selected record only and click OK. a. g. As with subject areas. which means that the client is connected to the server and that the server is idle. a. Click OK to acknowledge that the Custom – Revenue execution plan has been successfully built. building the execution plan takes several minutes. Scroll through the list and notice that the DAC has assembled a list of all the tasks that prepare. If the DAC Server is not started. Select the Current Run tab. When you mouse over the orange icon it should say “DAC Server is idle. the Server Monitor icon resembles a red electrical plug. Notice that the DAC Server Monitor icon has changed from yellow to green. In the upper-right corner of the Ordered Tasks subtab. c. a. Notice that a time stamp now appears in the Last Designed column for the execution plan. In the Execution Plans tab. 8. In the second Building window. d. with it. Click Refresh in the Ordered Tasks subtab to refresh the list of tasks associated with the execution plan. Click the Run Now button in the Top Pane toolbar. Verify that the DAC Server Monitor icon in the upper-right corner of the DAC client resembles an orange electrical plug in a socket. indicating that there are 214 tasks associated with this execution plan. and load the tables belonging to the subject areas included in the execution plan. b. Select the Custom – Revenue run and confirm that it has a run status of Running. 96 Oracle BI Applications 7. notice that the Last Designed attribute of the execution plan is null. notice that each task is assigned a task depth. h. select the Custom – Revenue execution plan. c.

d. While tasks are still running, the execution plan's status is Running.
e. Select the Tasks subtab to view task statuses within the execution plan.
f. Use the list to view different task statuses: All, Queued, Running, and so on.
g. When all tasks have completed (after approximately 60 minutes), verify that Run Status = Completed, Status Description = Finished, Number of Failed Tasks = 0, and Number of Successful Tasks = 214.
If the execution plan fails: When an execution plan is executed and a task fails, the status of the tasks that are dependent on the failed task is changed to Stopped. When all the tasks have been run, and if one or more tasks have failed, the execution plan's status is changed to Failed. You can check the tasks that have failed in the Current Run tab of the Execute view, fix the problems, and then re-queue the failed tasks by changing the status to Queued. You can then restart the ETL. All the tasks will then be rerun. Tasks with a Completed status are skipped. You can also manually run a task, change its status to Completed, and then restart the ETL. If you need assistance, ask your instructor.
11. View the run history for the execution plan.
a. Select the Run History tab.
b. Select the Custom – Revenue execution plan that you just ran.
c. Right-click the Custom – Revenue run and select Get Run Information > Get log file.
d. In the Input dialog box, either accept the file name or modify it and click OK.
e. In the Fetching log file dialog box, notice the path that the log file has been saved to and click OK.
f. In Windows Explorer, navigate to C:\OracleBI\DAC\ServerLog and open the Custom – Revenue.x.log file, where 'x' is the process ID number.
g. Scroll through the file and confirm that there are no steps listed under Failed Sessions or Queued Sessions.
h. Close the log file.
12. Query the data warehouse to verify that tables contain data.
a. Select Start > Programs > Oracle – OraDb11g_home1 > Application Development > SQL Plus.
b. Enter obaw as the user name.
c. Enter obaw as the password.
d. Enter the following SQL query to verify that the W_GL_REVN_F table contains data:
SELECT COUNT (*) from W_GL_REVN_F;
e. Verify that 105882 rows are returned.
f. Close SQL*Plus.
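Beyond the single count in step 12, you may also want to sanity-check the staging table that feeds W_GL_REVN_F. The sketch below assumes the same obaw schema and the W_GL_REVN_FS staging table named earlier in this guide; the guide does not state an expected count for the staging table, so treat the second query as an informal check only.
    SELECT COUNT(*) FROM W_GL_REVN_F;
    SELECT COUNT(*) FROM W_GL_REVN_FS;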


Practice 13-1: Creating a Custom SDE Mapping
Goals
To create a source dependent extract (SDE) mapping that can be used to move data from a column in a source system into the W_ORG_DS staging extension table in preparation for loading it into the W_ORG_D table in the data warehouse
Scenario
In this type 1 customization, you extract data from a column in a source system and load the data into an existing data warehouse table. In this example, your company has used the ATTRIBUTE5 column in the HZ_CUST_ACCOUNTS table in the EBS source system to capture data related to accounts. You want to extract the data from this column and ultimately load it into a custom column in the organization dimension table, W_ORG_D, in the data warehouse. The first step is to build a custom SDE mapping. To build the custom SDE mapping, you copy an existing mapping and workflow into a custom folder in Informatica and then modify the mapping and workflow. After you run ETL, you verify that the data warehouse target table, W_ORG_D, is populated with these rows.
Time
15–20 minutes
Instructions:
1. Use SQL*Plus to examine data in the source table.
a. Select Start > Programs > Oracle – OraDb11g_home > Application Development > SQL Plus to start SQL*Plus.
b. Log in as biapps with password biapps.
c. Run the following select statement:
select count(*) from HZ_CUST_ACCOUNTS where ATTRIBUTE5 is NOT NULL;
d. Verify that 27 rows are returned.
e. Close SQL*Plus.
2. Copy an existing mapping and workflow to a custom folder.
a. If necessary, open Informatica Repository Manager.
b. Connect to the Oracle_BI_DW_Base repository.
c. Navigate to SDE_ORA11510_Adaptor > Mappings.
d. Select the SDE_ORA_OrganizationDimension_Customer mapping.
e. Copy the SDE_ORA_OrganizationDimension_Customer mapping and paste it into the CUSTOM_SDE folder.
f. Click Yes when asked if you want to copy SDE_ORA_OrganizationDimension_Customer.
g. Verify that the SDE_ORA_OrganizationDimension_Customer mapping is visible in CUSTOM_SDE > Mappings.
h. Navigate to SDE_ORA11510_Adaptor > Workflows.
i. Copy the SDE_ORA_OrganizationDimension_Customer workflow and paste it into the CUSTOM_SDE folder. Typically, you would create two workflows, one to be used for a full load and the other to be used for an incremental load. Both workflows are based on the same

j. Click Yes to confirm the copy.
k. In the Copy Wizard, select Reuse and Apply this resolution to > All conflicts.
l. Click Next.
m. In the Copy Summary window, click Finish.
n. Verify that the SDE_ORA_OrganizationDimension_Customer workflow is visible in CUSTOM_SDE > Workflows.
3. Edit the target definition to include the required column.
a. In Repository Manager, select Tools > Designer to open Informatica Designer.
b. Verify that the CUSTOM_SDE folder is open in the repository navigator in Informatica Designer.
c. Select Tools > Target Designer.
d. Navigate to CUSTOM_SDE > Targets > W_ORG_DS.
e. Drag W_ORG_DS into the Target Designer window.
f. Double-click W_ORG_DS in the Target Designer window to open the Edit Tables dialog box.
g. Click the Columns tab.
h. Scroll to the bottom and select the last column in the list, X_CUSTOM.
i. Click the Add a new column to this table button, which creates a NEWFIELD column.
j. Change the name of the NEWFIELD column to X_ACCOUNT_LOG and verify that the data type is set to varchar2(10).
k. Click Apply.
l. Click OK to close the Edit Tables dialog box.
4. Map the column in the mapplet.
a. Select Tools > Mapping Designer.
b. Navigate to CUSTOM_SDE > Mappings.
c. Drag the SDE_ORA_OrganizationDimension_Customer mapping into the Mapping Designer.
d. Right-click the mplt_BC_ORA_OrganizationDimension_Customer mapplet and select Open Mapplet.
e. In the Mapplet Designer, locate the HZ_CUST_ACCOUNTS source definition.
f. Drag the ATTRIBUTE5 column from the HZ_CUST_ACCOUNTS source definition to a blank port in the SQ_BCI_CUSTOMERS source qualifier.
g. Drag the ATTRIBUTE5 column from the SQ_BCI_CUSTOMERS source qualifier to a blank port in the EXP_CUSTOMERS expression.
h. Drag the ATTRIBUTE5 column from the EXP_CUSTOMERS expression to a blank port in the MAPO_CUSTOMERS output transformation.
5. Edit the SQL override in the source qualifier.
a. Double-click the SQ_BCI_CUSTOMERS source qualifier.

b. Click the Properties tab.
c. Click the down arrow in the Value field for the SQL Query transformation attribute to open the SQL Editor.
d. Scroll down and add the HZ_CUST_ACCOUNTS.ATTRIBUTE5 column immediately after the last column in the SELECT clause. Hint: Use the Ports tab on the left to add the column. Be sure to add a comma before HZ_CUST_ACCOUNTS.ATTRIBUTE5. (The result is sketched after step 8 below.)
e. Click OK to close the SQL Editor.
f. Click the Ports tab.
g. Scroll to the bottom and verify that the ATTRIBUTE5 port has been added.
h. Click Apply and OK in the Edit Transformations dialog box.
6. Validate the mapplet.
a. If necessary, select View > Output to view the Output window.
b. Select Mapplets > Validate to verify that there are no inconsistencies in the mapplet. You should get the message "Mapplet mplt_BC_ORA_OrganizationDimension_Customer is VALID."
c. If your mapplet is valid, select Repository > Save to update the repository.
7. Create a new custom expression transformation in the mapping.
a. Return to the mapping by selecting Tools > Mapping Designer or by clicking the Mapping Designer icon on the toolbar.
b. Select Transformation > Create.
c. Select Expression in the list.
d. Enter X_CUSTOM as the name.
e. Click Create.
f. Click Done.
8. Map the column in the mapping.
a. Drag the ATTRIBUTE5 column from the mplt_BC_ORA_OrganizationDimension_Customer mapplet to the X_CUSTOM expression.
b. Double-click the X_CUSTOM expression to open the Edit Transformations dialog box.
c. Click the Ports tab.
d. As a best practice, rename the port to indicate both the table and column it comes from: HZ_CUST_ACCOUNTS_ATTRIBUTE5. If the mapping is changed and the related exposed objects are replaced, this will make it easier to reconnect, because the custom expression will not be replaced.
e. Click OK to close the Edit Transformations dialog box.
f. Drag HZ_CUST_ACCOUNTS_ATTRIBUTE5 from the X_CUSTOM expression to the X_ACCOUNT_LOG port in the W_ORG_DS target definition.
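As a reference for step 5, the following is a minimal sketch of how the tail end of the edited SELECT list might look. It is not the full override - the query generated for mplt_BC_ORA_OrganizationDimension_Customer selects many more columns and tables, which are omitted here - but it shows where the new column and its leading comma go:

    SELECT
        HZ_CUST_ACCOUNTS.CUST_ACCOUNT_ID,   -- existing columns in the override (abbreviated)
        HZ_CUST_ACCOUNTS.ATTRIBUTE5         -- new column added at the end; note the comma before it
    FROM HZ_CUST_ACCOUNTS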

9. Validate your work and update the repository.
a. If necessary, select View > Output to view the Output window.
b. Click the Validate tab of the Output window and select Mappings > Validate to verify that there are no inconsistencies in the mapping. You should get the message "Mapping SDE_ORA_OrganizationDimension_Customer is VALID."
c. If your mapping is valid, select Repository > Save to update the repository.
10. Edit the workflow.
a. Select Tools > Workflow Manager.
b. Verify that the CUSTOM_SDE folder is open.
c. If necessary, in Workflow Manager, select Tools > Workflow Designer to open Workflow Designer.
d. Expand CUSTOM_SDE > Workflows.
e. Drag the SDE_ORA_OrganizationDimension_Customer workflow into the Workflow Designer window.
f. In the Workflow Designer window, double-click the SDE_ORA_OrganizationDimension_Customer task to open the Edit Tasks dialog box.
g. In the General tab, verify that the Fail parent if this task fails and Fail parent if this task does not run options are selected.
h. Click the Properties tab.
i. Change the session log file name to CUSTOM_SDE.SDE_ORA_OrganizationDimension_Customer.log.
j. Change the parameter file name to CUSTOM_SDE.SDE_ORA_OrganizationDimension_Customer.txt.
k. Verify that the $Source connection value is set to $DBConnection_OLTP.
l. Verify that the $Target connection value is set to $DBConnection_OLAP.
m. Click the Config Object tab.
n. For the Stop on errors attribute, enter a value of 1.
o. Click the Mapping tab.
p. In the left pane, select Sources > mplt_BC_ORA_OrganizationDimension_Customer.SQ_BCI_CUSTOMERS.
q. Verify that the connection value is set to $DBConnection_OLTP.
r. For Targets > W_ORG_DS, verify that the connection value is set to $DBConnection_OLAP.
s. Change the Target load type attribute from Bulk to Normal.
t. Click Apply.
u. Click OK.
v. If necessary, select View > Output to view the Output window.
w. Click the Validate tab in the Output window.
x. Select Workflows > Validate. You should get the message Workflow SDE_ORA_OrganizationDimension_Customer is VALID.
y. If your workflow is valid, select Repository > Save to update the repository.


Practice 13-2: Creating a Custom SIL Mapping
Goals: To create a source independent load (SIL) mapping that can be used to move data into the W_ORG_D table from the W_ORG_DS staging table
Scenario: You have built an SDE mapping to extract data from the source system and load it into the W_ORG_DS staging table in the data warehouse. You now create an SIL mapping to move the data from the W_ORG_DS staging table into the W_ORG_D dimension table. To build the SIL mapping, you copy an existing mapping and workflow into a custom folder in Informatica and then modify the mapping and workflow.
Time: 15–20 minutes

Instructions:
1. Create a custom Informatica repository folder and copy the SIL mapping and workflow needed for customization. a. Return to Informatica Repository Manager, which should still be open. b. Select Oracle_BI_DW_Base in the Repository Navigator. c. Select Folder > Create. d. In the Create Folder dialog box, name the folder CUSTOM_SILOS. e. Click OK. f. Click OK to confirm that the folder was successfully created. g. Navigate to SILOS > Mappings > SIL_OrganizationDimension. h. Copy the SIL_OrganizationDimension mapping and paste it into the CUSTOM_SILOS folder. i. Click Yes when asked if you want to copy. j. Navigate to SILOS > Workflows > SIL_OrganizationDimension. k. Copy the SIL_OrganizationDimension workflow and paste it into the CUSTOM_SILOS folder. l. Click Yes when asked if you want to copy. m. In the Copy Wizard, select Reuse and Apply this resolution to… > All Conflicts. n. Click Next. o. In the Copy Summary window, click Finish. 2. Edit the source definition to include the required columns. a. Select Tools > Designer to return to Informatica Designer. b. If you do not see the new CUSTOM_SILOS folder, right-click Oracle_BI_DW_Base, select Disconnect, and then reconnect as Administrator with password Administrator. c. Verify that the CUSTOM_SILOS folder is visible in the repository navigator. d. Open CUSTOM_SILOS. e. Select Tools > Source Analyzer.


f. Navigate to CUSTOM_SILOS > Sources > OLAP > W_ORG_DS. g. Drag W_ORG_DS into the Source Analyzer window. h. Double-click W_ORG_DS in the Source Analyzer window to open the Edit Tables dialog box. i. Click the Columns tab. j. Scroll to the bottom and select the last column in the list, X_CUSTOM. k. Click the Add a new column to this table button, which creates a NEWFIELD column. l. Rename the NEWFIELD column to X_ACCOUNT_LOG and verify that the data type is set to varchar2(10). m. Click Apply. n. Click OK. 3. Edit the target definition to include the required columns. a. Select Tools > Target Designer. b. Navigate to CUSTOM_SILOS > Targets > W_ORG_D. c. Drag W_ORG_D into the Target Designer window. d. Double-click W_ORG_D in the Warehouse Designer window to open the Edit Tables dialog box. e. Click the Columns tab. f. Scroll to the bottom and select the last column in the list, X_CUSTOM. g. Click the Add a new column to this table button, which creates a NEWFIELD column. h. Rename the NEWFIELD column to X_ACCOUNT_LOG and verify that the data type is set to VARCHAR2(10). i. Click Apply. j. Click OK. 4. Edit the SIL mapping. a. Select Tools > Mapping Designer. b. Navigate to CUSTOM_SILOS > Mappings > SIL_OrganizationDimension. c. Drag the SIL_OrganizationDimension mapping into the Mapping Designer window. d. Drag the X_ACCOUNT_LOG column from the W_ORG_DS source definition to the blank port below the X_CUSTOM port in the Sq_W_ORG_DS source qualifier. e. Locate and double-click the Fil_W_ORG_D filter. f. Click the Ports tab. g. Scroll to the bottom and select the X_CUSTOM port. h. Click the button to add a new port. i. Name the port X_ACCOUNT_LOG and verify Prec is set to 10. j. Click Apply and OK. k. Drag the X_ACCOUNT_LOG column from the Sq_W_ORG_DS source qualifier to the corresponding port in the Fil_W_ORG_D filter. l. Drag the column X_ACCOUNT_LOG from the Fil_W_ORG_D filter to a blank port in the EXP_Custom expression. m. Drag the column X_ACCOUNT_LOG from the EXP_Custom expression to a blank port in the Upd_W_ORG_D_Ins_Upd update strategy.


n. Drag the column X_ACCOUNT_LOG from the Upd_W_ORG_D_Ins_Upd update strategy to the corresponding column in the W_ORG_D target definition. 5. Edit the SQL override in the source qualifier. a. Double-click the Sq_W_ORG_DS source qualifier. b. Click the Properties tab. c. Click the down arrow in the Value field for the SQL Query transformation attribute to open the SQL Editor. d. Add the W_ORG_DS.X_ACCOUNT_LOG column immediately after W_ORG_DS.X_CUSTOM in the SELECT clause. Hint: Use the Ports tab in the left pane to add the column. Be sure to add a comma after W_ORG_DS.X_CUSTOM. e. Click OK to close the SQL Editor. f. Click Apply and OK in the Edit Transformations dialog box. 6. Validate your work and update the repository. a. If necessary, select View > Output to view the Output window. b. Click the Validate tab of the Output window and select Mappings > Validate to verify that there are no inconsistencies in the mapping. You should get the message Mapping SIL_OrganizationDimension is VALID. c. If your mapping is valid, select Repository > Save to update the repository. 7. Edit the workflow. a. Select Tools > Workflow Manager. b. In order to see the new CUSTOM_SILOS folder, right-click Oracle_BI_DW_Base, select Disconnect, and then reconnect as Administrator with password Administrator. c. Verify that the CUSTOM_SILOS folder is visible. d. Open the CUSTOM_SILOS folder. e. Select Tools > Workflow Designer. f. Navigate to CUSTOM_SILOS > Workflows > SIL_OrganizationDimension. g. Drag the SIL_OrganizationDimension workflow into the Workflow Designer window. h. Double-click the SIL_OrganizationDimension task. i. In the General tab, verify that the Fail parent if this task fails and Fail parent if this task does not run options are selected. j. Click the Properties tab. k. Change the session log file name to CUSTOM_SILOS.SIL_OrganizationDimension.log. l. Change the parameter file name to CUSTOM_SILOS.SIL_OrganizationDimension.txt. m. Verify that $Source connection value is set to $DBConnection_OLAP. n. Verify that $Target connection value is set to $DBConnection_OLAP. o. Click the Config Object tab. p. For the Stop on errors attribute, enter a value of 1. q. Click the Mapping tab. r. For Sources > Sq_W_ORG_DS, verify that the connection value is set to $DBConnection_OLAP.

s. For Targets > W_ORG_D, verify that the connection value is set to $DBConnection_OLAP.
t. Verify that the Target load type is set to Normal.
u. Click Apply.
v. Select Workflows > Validate.
w. You should get the message Workflow SIL_OrganizationDimension is VALID.
x. If your workflow is valid, select Repository > Save to update the repository.
y. Close all open Informatica applications.
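As a reference for step 5 of this practice, here is a similarly abbreviated sketch of the edited SELECT list in Sq_W_ORG_DS. The real override selects the full W_ORG_DS column list; only the tail end is shown, to illustrate where the comma and new column are placed:

    SELECT
        W_ORG_DS.INTEGRATION_ID,    -- existing columns (abbreviated)
        W_ORG_DS.X_CUSTOM,          -- previously the last column; a comma is added after it
        W_ORG_DS.X_ACCOUNT_LOG      -- newly added column
    FROM W_ORG_DS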

Practice 13-3: Adding DAC Tasks and Running Customized ETL
Goals: To configure the DAC to extract data from the source and load it into the custom column in the data warehouse
Scenario: You have built the sessions and workflows in Informatica Workflow Manager to run the SDE and SIL mappings. Now you must modify DAC tasks, add them to a custom subject area and execution plan, and run the ETL to load the data into the custom column in the W_ORG_D dimension table in the data warehouse.
Time: 40–45 minutes
Instructions:
1. If necessary, log in to the DAC client.
a. Select Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client.
b. Log in as dac with password dac.
2. Add the new column object for the W_ORG_DS and W_ORG_D tables to the DAC.
Recall that when you built the SDE and SIL mappings in the two previous practices, you added the X_ACCOUNT_LOG column to the W_ORG_DS and W_ORG_D tables. Now you must add this new column object to the data warehouse. There are two methods for adding a new object to the data warehouse. One method is to use DAC's Data Warehouse Configurator to create the physical columns for the tables in the data warehouse database. The other method is to add the column definitions directly in the data warehouse database and then use the DAC's Import from Database command to add the new columns in the DAC. In this step and the steps that follow, you use the first method. You use the second method in the next lesson.
a. Click Design to open the DAC design view.
b. If necessary, select the Custom container in the list.
c. Select Tables in the top pane.
d. Query for W_ORG_DS.
e. Click the Columns child tab.
f. Query to confirm that the X_ACCOUNT_LOG column does not yet exist in W_ORG_DS.
g. Click New.
h. Enter the following values:
Name: X_ACCOUNT_LOG
Position: 226
Data Type: VARCHAR
Length: 150
Precision: 0
Nullable: Selected

i. Click Save.
j. Return to the top pane Tables tab, query for W_ORG_D, and repeat the steps to add the same column to W_ORG_D with the following values (position is the only value that is different):
Name: X_ACCOUNT_LOG
Position: 255
Data Type: VARCHAR
Length: 150
Precision: 0
Nullable: Selected
k. Click Save.
3. Use SQL*Plus to verify that the column does not yet exist in the tables in the data warehouse.
a. Select Start > Programs > Oracle – OraDB10g_home > Application Development > SQL Plus to open SQL*Plus.
b. Log in as obaw with password obaw.
c. At the SQL> prompt, enter: select x_account_log from W_ORG_DS;
d. You should receive the error message: "X_ACCOUNT_LOG" invalid identifier.
e. Repeat for W_ORG_D.
f. Leave SQL*Plus open.
4. Create the new column in the data warehouse database.
a. Select Tools > ETL Management > Configure.
b. Select Oracle as the source and target databases and click OK.
c. In the Data Warehouse Configuration Wizard, select Create Data Warehouse Tables and click Next.
d. Enter or verify the data warehouse information:
Container: Custom
Table Owner: OBAW
Password: OBAW
ODBC Data Source: OBAW
Data Area: OBAW_DATA
Index Area: OBAW_INDEX
e. Click Start. This may take a few minutes.
f. You should receive a message that all tasks successfully finished. If you receive an error message, check the log files.
g. Click Finish.
h. Return to SQL*Plus and run the select statements again to verify that the X_ACCOUNT_LOG column now exists in both W_ORG_DS and W_ORG_D. You should receive messages that 888 rows exist in W_ORG_DS and 889 rows exist in W_ORG_D.
i. Leave SQL*Plus open.
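The wizard generates and executes the DDL for you, so there is nothing to type in this step. Purely for orientation, and assuming the column definitions entered above, the change it makes is roughly equivalent to the following statements, which is why the step 3 query now returns (empty) rows instead of failing with ORA-00904:

    ALTER TABLE W_ORG_DS ADD (X_ACCOUNT_LOG VARCHAR2(150));
    ALTER TABLE W_ORG_D  ADD (X_ACCOUNT_LOG VARCHAR2(150));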

5. Create custom logical and physical task folders for the custom folders you created in the Informatica repository.
a. Navigate to Tools > Seed Data > Task Folders.
b. Use the New button to create four new custom logical and physical task folders for the custom folders you created in the Informatica repository:
Name: Custom_Extract, Type: Logical
Name: Custom_Load, Type: Logical
Name: CUSTOM_SDE, Type: Physical
Name: CUSTOM_SILOS, Type: Physical
c. Click Save.
d. Click Close.
6. Register the new custom folders.
a. Navigate to Design > Source System Folders.
b. Use the New button and the Edit tab to register the four new custom folders:
Logical Folder: Custom_Extract, Physical Folder: CUSTOM_SDE
Logical Folder: Custom_Load, Physical Folder: CUSTOM_SILOS
c. Click Save.
7. Modify existing tasks to use the custom mappings and workflows.
a. Navigate to Design > Tasks.
b. Query for SDE_ORA_OrganizationDimension_Customer.
c. In the Edit child tab, for this exercise, change the folder name to Custom_Extract.
d. Change Command for Full Load from SDE_ORA_OrganizationDimension_Customer_Full to SDE_ORA_OrganizationDimension_Customer. This is because you only copied and modified the SDE_ORA_OrganizationDimension_Customer workflow in the CUSTOM_SDE folder. This is for training purposes only and not the recommended procedure.
e. Save and click Yes to confirm changes in the Updating… message.
f. Right-click SDE_ORA_OrganizationDimension_Customer and select Synchronize Tasks.
g. Click Yes when prompted to proceed in the Synchronizing Tasks… message. This may take a few moments.
h. When synchronization completes, click OK in the Synchronizing task(s) message box.

m. c. Display the Execution Plans tab and click New. change Command for Full Load from SIL_OrganizationDimension_Full to SIL_OrganizationDimension. e. Click Save. 8. 9. Click Save. b. When synchronization completes. o. d. Verify that the Custom Organization Dimension subject area is selected. l. j. Click the Execute button to open the Execute view. Use the Add/Remove button to add the two tasks to the subject area: SDE_ORA_OrganizationDimension_Customer Load into Organization Dimension For this exercise and demonstration only. Click Add to add W_ORG_D to the custom subject area. Click New. In the Edit child tab. a. b. query for W_ORG_D. n. i. Enter Custom Organization Dimension as the name of the execution plan and click Save. h. k. Save and click Yes to confirm changes. c. f. d. c. Click the Target Tables child tab and verify that Truncate Always and Truncate for Full Load are both selected for the W_ORG_DS target table. Click Add/Remove. 10.Lesson 13: Adding Columns to an Existing Dimension Table i. Click the Subject Areas tab.9: Implementation for Oracle EBS . Select the Subject Areas child tab. Click Yes in the Synchronizing task message box. Return to the Tasks tab in the top pane and query for Load into Organization Dimension. p. Click Add/Remove. Enter Custom Organization Dimension as the name of the subject area and click Save. e. j. Click OK in the Adding window to acknowledge that W_ORG_D is added to the subject area. Create a custom subject area. Add the tasks to the subject area. For this exercise. d. it is acceptable to manually add only these two tasks to the subject area instead of assembling the subject area. In the Choose Tables dialog box. 110 Oracle BI Applications 7. Create a custom execution plan and add the custom subject area to it. click OK in the Synchronizing task(s) message box. Select the Tables child tab. Click OK to close the Choose Tables dialog box. a. Right-click Load into Organization Dimension and select Synchronize Tasks. change the folder name to Custom_Load. Click the Tasks child tab. a. This is for training purposes only and not the recommended procedure. g. This is because you only copied and modified the SIL_OrganizationDimension workflow in the CUSTOM_SILOS folder. b.

e. In the left table list, select Custom Organization Dimension and click Add to add it to the right list of subject areas belonging to the custom execution plan.
f. Click OK in the Adding window to acknowledge that the Custom Organization Dimension subject area was added to the execution plan.
g. Click OK.
h. Click Save.
11. Generate execution plan parameters for the custom execution plan.
a. In the Execution Plan list, verify that Custom Organization Dimension is selected.
b. Select the Parameters child tab.
c. Click Generate. A list of parameters is generated.
d. Click OK to acknowledge successful parameter generation.
e. Leave the FOLDER parameters as they are.
f. Set the DATASOURCE parameters:
DBConnection_OLTP: ORA_11_5_10
DBConnection_OLAP: DataWarehouse
If you need to verify that the values are correct, you can navigate to Setup > Physical Data Sources and check the names of the Source and Warehouse type connections.
g. Click Save.
12. Build the execution plan.
a. Verify that the Custom Organization Dimension execution plan is selected and click Build.
b. Click Yes to proceed.
c. Accept the default and click OK in the Building… window.
d. In the second Building window, again accept the default to perform the operation for the selected record only and click OK. Building the execution plan may take several minutes.
e. Click OK to acknowledge that the Custom Organization Dimension execution plan has been successfully built.
f. Notice that a time stamp now appears in the Last Designed column for the execution plan.
g. Click the Ordered Tasks child tab for the Custom Organization Dimension execution plan and verify that the SDE_ORA_OrganizationDimension_Customer and Load into Organization Dimension tasks are listed. The QUERY_INDEX_CREATION task is also listed.
13. Reset the refresh date for the data warehouse.
Because the full load you ran in the previous practice is recorded in the DAC repository, the refresh date for the transactional source tables is stored for your last run. This will cause an incremental update to be initiated based on the refresh date. This step will clear the refresh date from the source tables in the OLTP and allow a full load for your custom execution plan.
a. Click Setup and select the Physical Data Sources tab.
b. Click Refresh in the top pane.
c. Select the ORA_11_5_10 connection.
d. Select the Refresh Dates subtab, and verify that there are refresh dates corresponding to your initial load of the OBAW.
e. Repeat for the DataWarehouse connection.

f. Select Tools > ETL Management > Reset Data Warehouse.
g. In the Reset Data Warehouse dialog box, confirm that you want to reset by entering the text and click Yes.
h. Click OK to confirm that the reset was successful.
i. Click Refresh in the lower pane and verify that the refresh dates for the transactional and data warehouse tables are cleared.
14. Before running the execution plan, query W_ORG_D and verify that there is no data in the X_ACCOUNT_LOG column.
a. Return to SQL*Plus. If necessary, log in as obaw with password obaw.
b. Run the following SQL statement: select count(*) from W_ORG_D where X_ACCOUNT_LOG is NOT NULL;
c. Verify that no records are returned.
15. Verify that the DAC Server is started.
a. Verify that the DAC Server Monitor icon in the upper-right corner of the DAC client resembles an orange electrical plug in a socket. When you move the cursor (mouse over) over the orange icon, it should say "DAC Server is idle," which means that the client is connected to the server and the server is idle. When the DAC client cannot establish a connection to the DAC Server, the Server Monitor icon resembles a red electrical plug. If the client is connected to a server that is running an ETL process, the icon resembles a green electrical plug with a lightning sign superimposed on it.
b. If the DAC Server is not started, select Start > Programs > Oracle Business Intelligence > Oracle DAC > Start DAC Server and verify that the DAC Server Monitor icon resembles an orange electrical plug in a socket before continuing.
16. Run the custom execution plan.
a. Return to the Execution Plans tab and select the Custom Organization Dimension execution plan.
b. Click the Run Now button in the Top Pane toolbar.
c. In the Starting ETL dialog box, click Yes to confirm that you want to start the execution plan.
d. Click OK to acknowledge that the request has been successfully submitted to the Informatica Server.
e. Note that the DAC Server Monitor icon has changed from yellow to green, indicating that a plan is being executed.
17. Monitor the ETL plan execution.
a. Select the Current Run tab.
b. Select the Custom Organization Dimension run and confirm that it has a Run Status of Running.
c. Write down the Process ID of the current run.
d. Select the Tasks tab in the bottom pane to view task status within the execution plan.
e. Use the list to view different task statuses.
f. Click Refresh to refresh the status.
g. Click Auto Refresh to verify the refresh frequency of 30 seconds.

18. View Run History.
a. When all tasks have completed (about 2 minutes), select the Run History Top Pane tab to view the log file for the execution plan you just ran.
b. Select the Custom Organization Dimension execution plan you just ran.
c. Verify that run status = Completed, the number of failed tasks = 0, and the number of successful tasks = 3.
d. Right-click Custom Organization Dimension and select Get Run information > Get log file.
e. In the Input dialog box, click OK to accept the default log file name.
f. In the Fetching log file dialog box, notice the path to which the log file has been saved and click OK.
g. In Windows Explorer, navigate to C:\OracleBI\DAC\ServerLog and open Custom_Organization_Dimension.#.log. Note that the naming convention of the log files includes the ETL Process ID that you recorded in a previous step.
h. Scroll down to the bottom of the file to the list of step statuses and confirm that there are no steps listed under Failed Sessions or Queued Sessions.
19. After running the execution plan, query W_ORG_D and verify that there is now data in the X_ACCOUNT_LOG column.
a. Return to SQL*Plus, which should still be open. If not, open SQL*Plus and log in as obaw with password obaw.
b. Run the following SQL statement: select count(*) from W_ORG_D where X_ACCOUNT_LOG is NOT NULL;
c. Verify that 27 records are returned.


Lesson 14: Adding a New Dimension in the OBAW

Practice 14-1: Adding a New Dimension in the OBAW
Goals: To create a new dimension table and a new dimension staging table in the data warehouse
Scenario: In this practice, you run a DDL script to create a new dimension table and a dimension staging table based on the standard data warehouse structure (with appropriate system columns). You then register the new source table and its staging table in the DAC repository and associate it with the appropriate database connection.
Time: 15–20 minutes
Instructions:
1. Run a script to create a new dimension table and a new dimension staging table in the data warehouse.
a. If necessary, open SQL*Plus and log in as obaw with password obaw.
b. Navigate to C:\PracticeFiles.
c. Open the partner.sql file and examine the SQL. The script creates two new tables: a dimension table named WC_PARTNER_D and a dimension staging table named WC_PARTNER_DS. Note that the dimension staging table contains the required columns DATASOURCE_NUM_ID and INTEGRATION_ID. The dimension table contains these two required columns as well as the required ETL_PROC_WID column. (A simplified sketch of this DDL appears after step 2.)
d. Copy the SQL and paste it into SQL*Plus.
e. Run the script.
f. Close the partner.sql file.
g. Close SQL*Plus.
2. Import the tables into the DAC.
a. If necessary, open the DAC client.
b. Navigate to the Design view.
c. Verify that the Custom container is selected.
d. Click the Tables tab in the top pane.
e. Right-click anywhere in the list and select Import from database > Import Database Tables.
f. Choose the DataWarehouse data source.
g. In the Table Name Filter field, enter WC_PARTNER*.
h. Click Read Tables.
i. Click OK to confirm that reading tables is complete.
j. Select Import for the WC_PARTNER_DS and WC_PARTNER_D tables.
k. Click Import Tables.
l. Click OK to confirm that the import was successful.
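The authoritative DDL is in the partner.sql file itself; the following is only a simplified sketch of the kind of statements it contains, so you know what to expect when you examine it in step 1 (column names other than the required system columns are illustrative):

    CREATE TABLE WC_PARTNER_DS (
        ROW_ID             VARCHAR2(30),
        PARTNER_NAME       VARCHAR2(50),
        PARTNER_LOC        VARCHAR2(50),
        DATASOURCE_NUM_ID  NUMBER(10),    -- required in every staging table
        INTEGRATION_ID     VARCHAR2(30)   -- required in every staging table
    );

    CREATE TABLE WC_PARTNER_D (
        ROW_WID            NUMBER(10),    -- surrogate key populated by the SIL mapping
        PARTNER_NAME       VARCHAR2(50),
        PARTNER_LOC        VARCHAR2(50),
        DATASOURCE_NUM_ID  NUMBER(10),
        INTEGRATION_ID     VARCHAR2(30),
        ETL_PROC_WID       NUMBER(10)     -- required ETL process ID column
    );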

3. Set properties for the tables.
a. Query for the WC_PARTNER_DS and WC_PARTNER_D tables in the list, or select Original from the list.
b. Set the Warehouse flag for both tables.
c. Set the Dimension table type for WC_PARTNER_D and the Dimension Stage table type for WC_PARTNER_DS.
d. Save.
4. Import table columns.
a. Select the WC_PARTNER_D table.
b. Click the Columns subtab and notice that no columns are visible for this table.
c. Right-click WC_PARTNER_D and select Import From Database > Import Database Columns.
d. Accept the default, Selected record only, and click OK.
e. Pick the DataWarehouse data source.
f. Click Read Columns.
g. Click OK to confirm that reading the columns was successful. The columns appear in the column list.
h. Select All in the list.
i. Click Import Columns.
j. Click OK to confirm that importing the columns was successful.
k. Click Refresh in the Columns subtab to verify that the columns are now visible in the DAC and that all the column properties are accurate.
l. Repeat the steps for WC_PARTNER_DS. No properties need to be modified.
5. Add a foreign key column to W_GL_REVN_F.
a. Query for W_GL_REVN_F.
b. In the Columns child tab, click Refresh to view the columns and sort on position. There should be 108 columns in W_GL_REVN_F.
c. Add a new column named PARTNER_WID with the following properties:
Position: 109
Data Type: Number
Length: 10
Precision: 0
Foreign Key to Table: WC_PARTNER_D
Foreign Key to Column: ROW_WID
Nullable: Selected
Default Value: 0
d. Save.
6. Add a foreign key column to W_GL_REVN_FS.
a. Query for W_GL_REVN_FS.

b. In the Columns child tab, sort on position. There should be 102 columns in W_GL_REVN_FS.
c. Add a new column named PARTNER_ID with the following properties:
Position: 103
Data Type: VARCHAR
Length: 30
Precision: 0
Nullable: Selected
Default Value: 0
d. Save.
7. Create the new PARTNER_WID and PARTNER_ID columns in the W_GL_REVN_F and W_GL_REVN_FS tables in the data warehouse database.
a. Select Tools > ETL Management > Configure.
b. Select Oracle as the source and target database and click OK.
c. In the Data Warehouse Configuration Wizard, select Create Data Warehouse Tables, and click Next.
d. Enter or verify the data warehouse information:
Container: Custom
Table Owner: obaw
Password: obaw
ODBC Data Source: obaw
Data Area: obaw_data
Index Area: obaw_index
e. Click Start. This may take a few minutes.
f. You should receive a message that all tasks are successfully finished. If you receive an error message, check the log files.
g. Click Finish.
h. Navigate to C:\OracleBI\dac\conf\sqlgen\ctlfile to examine the SQL control file.
i. Open the oracle_bi_dw.ctl file with Notepad.
j. Search for W_GL_REVN_F.
k. Scroll to look for column number 109 and verify that the PARTNER_WID column was generated.
l. Search for W_GL_REVN_FS.
m. Scroll to look for column number 103 and verify that the PARTNER_ID column was generated.
n. Return to SQL*Plus.
o. Run desc W_GL_REVN_F to verify that the PARTNER_WID column now exists in the W_GL_REVN_F table in the data warehouse database.
p. Run desc W_GL_REVN_FS to verify that the PARTNER_ID column now exists in the W_GL_REVN_FS table in the data warehouse database.

Practice 14-2: Creating an SDE Mapping to Load the Dimension Staging Table
Goals: To create a new SDE mapping to load the dimension staging table
Scenario: You have a spreadsheet (CSV file) with the dimension data that you want to import into the data warehouse. You use Informatica tools to create a new SDE mapping to load the data into the dimension staging table that you created in the previous practice.
Time: 15–20 minutes
Instructions:
1. Examine the data.
a. Navigate to C:\PracticeFiles and open the Partner.csv file.
b. Examine the data. The file contains five rows of data with values for row ID, partner name, and partner location. (An illustrative sketch of the file layout appears after step 2.)
c. Close the file.
d. Copy the partner.csv file and paste it into C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles.
2. Import the source file into Informatica.
a. Open Informatica Designer.
b. Open the CUSTOM_SDE folder.
c. Select Tools > Source Analyzer.
d. Select Sources > Import from File.
e. In the Open Flat File dialog box, navigate to C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles.
f. Select All Files in the "Files of type" list.
g. Select partner.csv and click OK.
h. In the Flat File Import Wizard, verify that Delimited is selected.
i. Enter Partner as the name for this source.
j. Select Import field names from first line.
k. Click Next.
l. Accept all defaults in step 2 of the Flat File Import Wizard.
m. Click Next.
n. Change Length/Prec for PARTNER_NAME and PARTNER_LOC to 50.
o. Click Finish. The Partner source appears in the Source Analyzer and is added as a source in CUSTOM_SDE > Sources > FlatFile.
p. Save the repository.
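The actual rows are in the practice file; as an illustration only (these values are made up, and the real file has five rows), Partner.csv is a simple comma-delimited file with the field names on the first line:

    ROW_ID,PARTNER_NAME,PARTNER_LOC
    1001,Acme Partners,San Francisco
    1002,Globex Trading,Chicago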

e. Click Create. Add the source to the SDE mapping. Drag the Partner source into the Mapping Designer window and notice that a source qualifier transformation is created. a. Select Tools > Options. Select Transformation > Create. c. Select Expression in the list. e. and click OK. Click Connect. f. Name the new mapping SDE_Custom_PartnerDimension and click OK. 4. c. Enter obaw as Username. Expand obaw > Tables. select the Tables tab. Click the Add a new variable to this table button. g. Drag all three columns from SQ_Partner Source Qualifier to the EXPTRANS expression. b. h. Select Mappings > Create. b. Click OK. Oracle BI Applications 7. c. Enter EXPTRANS as the name for the transformation. a. select WC_PARTNER_DS. Click Done. Select Mapping Designer in the Tools list and verify that the Create Source Qualifiers when opening Sources option is selected. e. d. Owner Name. f. Navigate to CUSTOM_SDE > Sources > Flat File.9: Implementation for Oracle EBS 119 . b. Create the SDE mapping. f. Select Tools > Mapping Designer. First. Select Mappings > Parameters and Variables to open the Declare Parameters and Variables dialog box. Save the repository. b. 5. Add a new parameter with the following values: Name Type Data type Prec Scale Initial Value $$DATASOURCE_NUM_ID Parameter decimal 10 0 15 j. Select the obaw ODBC data source. and Password. Click OK to close the Declare Parameters and Variables dialog box. a. d.Lesson 14: Adding a New Dimension in the OBAW Select Targets > Import from Database. Add an expression transformation to the mapping. In the Options dialog box. i. 3. c. g. d. WC_PARTNER_DS appears in the Warehouse Designer window and is added as a target in CUSTOM_SDE > Targets. turn on the option to create a Source Qualifier transformation when adding a new source.

g. Double-click the EXPTRANS expression.
h. Click the Ports tab.
i. Add a new output port with the following properties:
Port Name: DATASOURCE_NUM_ID
Data type: decimal
Prec: 10
Scale: 0
I: Not selected
O: Selected
V: Not selected
j. Click the down arrow in the Expression field to open Expression Builder.
k. Delete the existing expression.
l. Click the Variables tab.
m. Expand Mapping parameters.
n. Double-click $$DATASOURCE_NUM_ID to add it to the expression.
o. Click OK.
p. Click Apply.
q. Click OK.
6. Add the target to the mapping.
a. Drag the WC_PARTNER_DS target to the right of the EXPTRANS expression transformation.
b. Drag ROW_ID from EXPTRANS to INTEGRATION_ID in WC_PARTNER_DS.
c. Link the other three columns in EXPTRANS to their corresponding columns in WC_PARTNER_DS.
d. Select Mappings > Validate. You should receive the message Mapping SDE_Custom_PartnerDimension is VALID.
e. Save the repository.
7. Create the SDE workflow.
Typically, you would create two workflows, one to be used for a full load and the other to be used for an incremental load. Both workflows are based on the same mapping, which is executed during both full and incremental loads. This provides an opportunity to tune each of these load scenarios. For the purposes of this training, you create only one workflow for all the mappings in this set of practices. Please note that this is for training purposes only and is not the recommended practice.
a. Select Tools > Workflow Manager.
b. Verify that the CUSTOM_SDE folder is open.
c. Select Workflows > Create.
d. Name the workflow SDE_Custom_PartnerDimension and click OK.
e. Select Tasks > Create.
f. Leave task type set to Session and name the task SDE_Custom_PartnerDimension.

g. Click Create.
h. Associate the SDE_Custom_PartnerDimension mapping with the session and click OK.
i. Click Done to close the Create Task dialog box.
j. Select Tasks > Link Task and link the Start task to the SDE_Custom_PartnerDimension session.
8. Edit the workflow session properties.
a. Double-click the SDE_Custom_PartnerDimension task.
b. On the General tab, select both Fail parent if this task fails and Fail parent if this task does not run.
c. Click the Properties tab.
d. Verify that the value for the $Source connection value attribute is empty.
e. For the $Target connection value attribute, click the down arrow in the Value field.
f. Select Use Connection Variable and set the variable to $DBConnection_OLAP.
g. Click OK.
h. Click the Config Object tab.
i. For the Stop on errors attribute, enter a value of 1.
j. Click the Mapping tab.
k. In the left pane, select the Sources node and then the SQ_Partner source.
l. Under Readers in the right pane, verify that the SQ_Partner reader is set to File Reader.
m. In the left pane, select Targets > WC_PARTNER_DS.
n. In the Connection settings in the right pane for the WC_PARTNER_DS instance, click the down-arrow button in the Value property to edit its target connection.
o. In the Relational Connection Browser, click the Use Connection Variable option button, enter $DBConnection_OLAP, and click OK.
p. Set the target load type to Normal.
q. Click Apply.
r. Click OK.
9. Validate the workflow.
a. Select Workflows > Validate. You should receive the message Workflow SDE_Custom_PartnerDimension is VALID.
b. Save the repository.

Practice 14-3: Creating an SIL Mapping to Load the Dimension Table
Goals: To create a new SIL mapping to load the dimension table
Scenario: In the previous practice, you created an SDE mapping to load the dimension staging table. In this practice, you use Informatica tools to create a new SIL mapping to load the dimension table.
Time: 15–20 minutes
Instructions:
1. Import the SIL source.
a. In Informatica Designer, open the CUSTOM_SILOS folder.
b. Select Tools > Source Analyzer.
c. Select Sources > Import from Database.
d. Select the obaw ODBC data source.
e. Set username, owner name, and password to obaw and click Connect.
f. Select OBAW > Tables > WC_PARTNER_DS and click OK. WC_PARTNER_DS appears in the Warehouse Designer window and is added as a source in CUSTOM_SILOS > Sources.
2. Import the SIL target.
a. Select Tools > Target Designer.
b. Select Targets > Import from Database.
c. Select the obaw ODBC data source.
d. Enter obaw as Username, Owner Name, and Password and click Connect.
e. Expand OBAW > Tables, select WC_PARTNER_D, and click OK. WC_PARTNER_D appears in the Warehouse Designer window and is added as a target in CUSTOM_SILOS > Targets.
f. Double-click WC_PARTNER_D in the Target Designer window.
g. Click the Columns tab.
h. Change the ROW_WID key type to PRIMARY KEY.
i. Click Apply and OK.
j. Save the repository.
3. Create the SIL mapping.
a. Select Tools > Mapping Designer.
b. Select Mappings > Create.
c. Name the mapping SIL_Custom_PartnerDimension and click OK.

4. Add the source and target to the mapping.
a. Drag the WC_PARTNER_DS source into the mapping.
b. Drag the WC_PARTNER_D target into the mapping and place it to the right of the SQ_WC_PARTNER_DS source qualifier.
c. Link the four columns in the SQ_WC_PARTNER_DS source qualifier to their corresponding columns in the WC_PARTNER_D target definition.
5. Add a sequence transformation to the mapping that updates ROW_WID in the target definition.
a. Select Transformation > Create.
b. Select Sequence Generator in the type list.
c. Name the transformation SEQTRANS.
d. Click Create.
e. Click Done.
f. Drag NEXTVAL from the SEQTRANS sequence transformation to ROW_WID in the target definition.
6. Add a mapplet to the mapping that retrieves the ETL process ID.
a. Navigate to CUSTOM_SILOS > Mapplets > MPLT_GET_ETL_PROC_WID. (This mapplet was copied to this folder when you copied the SIL_OrganizationDimension mapping in an earlier practice.)
b. Drag MPLT_GET_ETL_PROC_WID into the mapping and place it near the target definition.
c. Drag INTEGRATION_ID from the source qualifier to the corresponding column in the MPLT_GET_ETL_PROC_WID mapplet.
d. Drag ETL_PROC_WID from the MPLT_GET_ETL_PROC_WID mapplet to the corresponding column in the target definition.
7. Validate the mapping.
a. Select Mappings > Validate. You should receive the message Mapping SIL_Custom_PartnerDimension is VALID.
b. Save the repository.
8. Create a new workflow for the SIL_Custom_PartnerDimension mapping.
a. Open Workflow Manager.
b. Verify that the CUSTOM_SILOS folder is open.
c. Create a new workflow named SIL_Custom_PartnerDimension.
d. Select Tasks > Create.
e. Leave task type set to Session and name the task SIL_Custom_PartnerDimension.
f. Click Create.
g. Associate the SIL_Custom_PartnerDimension mapping with the session and click OK.
h. Click Done to close the Create Task dialog box.
i. Select Tasks > Link Task and link the Start task to the SIL_Custom_PartnerDimension session.
9. Edit the workflow session properties.


a. Double-click the SIL_Custom_PartnerDimension session. b. On the General tab, select both Fail parent if this task fails and Fail parent if this task does not run. c. Click the Properties tab. d. Set the $Source connection value and $Target connection value attributes to $DBConnection_OLAP. e. Click the Config Object tab. f. For the Stop on errors attribute, enter a value of 1. g. Click the Mapping tab. h. In the left pane, select Sources > SQ_WC_PARTNER_DS. i. In the Connections settings in the right pane for the SQ_WC_PARTNER_DS instance, click the down-arrow button in the Value property to edit its source connection. j. In the Relational Connection Browser, click the Use Connection Variable option button and enter $DBConnection_OLAP, and then click OK. k. In the left pane, select Targets > WC_PARTNER_D. l. In the Connection settings in the right pane for the WC_PARTNER_D instance, click the down-arrow button in the Value property to edit its target connection. m. In the Relational Connection Browser, click the Use Connection Variable option button and enter $DBConnection_OLAP, and then click OK. n. Set target load type to Normal. o. Click Apply. p. Click OK. 10. Validate the workflow. a. Select Workflows > Validate. You should receive the message Workflow SIL_Custom_PartnerDimension is VALID. b. Save the repository.
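To recap what the SIL mapping built in this practice does, here is a conceptual SQL equivalent. This is a sketch only - the actual work is performed by the Informatica session, the staging column names are assumed from the earlier practices, ROWNUM stands in for the SEQTRANS Sequence Generator, and 0 stands in for the ETL process ID supplied by MPLT_GET_ETL_PROC_WID:

    INSERT INTO WC_PARTNER_D
        (ROW_WID, INTEGRATION_ID, PARTNER_NAME, PARTNER_LOC, DATASOURCE_NUM_ID, ETL_PROC_WID)
    SELECT
        ROWNUM,                 -- stand-in for the Sequence Generator (NEXTVAL)
        STG.INTEGRATION_ID,
        STG.PARTNER_NAME,
        STG.PARTNER_LOC,
        STG.DATASOURCE_NUM_ID,
        0                       -- stand-in for the ETL process ID from the mapplet
    FROM WC_PARTNER_DS STG;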




Practice 14-4: Creating an SDE Mapping to Load the Fact Staging Table
Goals: To create a new SDE mapping to load the fact staging table
Scenario: You copy and modify an existing mapping and workflow to create a custom SDE mapping and workflow to load the fact staging table.
Time: 15–20 minutes

Instructions:
1. Copy an existing SDE mapping to the custom folder.
a. Open Informatica Repository Manager.
b. Navigate to SDE_ORA11510_Adaptor > Mappings.
c. Copy the SDE_ORA_GLRevenueFact mapping and paste it into the CUSTOM_SDE folder.
d. Click Yes to confirm the copy.
e. Expand CUSTOM_SDE > Mappings and verify that the SDE_ORA_GLRevenueFact mapping is copied.
2. Copy an existing SDE workflow to the custom folder.
a. Navigate to SDE_ORA11510_Adaptor > Workflows.
b. Copy the SDE_ORA_GLRevenueFact workflow and paste it into the CUSTOM_SDE folder.
c. Click Yes to confirm the copy. The Copy Wizard window opens. Select Reuse and apply this resolution to all conflicts.
d. Click Next.
e. Click Finish.
f. Expand CUSTOM_SDE > Workflows and verify that the SDE_ORA_GLRevenueFact workflow is copied.
3. Modify the target for the SDE mapping.
a. Open Informatica Designer.
b. Right-click the CUSTOM_SDE folder and select Disconnect.
c. Right-click the CUSTOM_SDE folder again and select Connect to see the changes you made in Repository Manager.
d. Open the CUSTOM_SDE folder.
e. Navigate to CUSTOM_SDE > Targets.
f. Select Tools > Target Designer.
g. Drag W_GL_REVN_FS into the Target Designer.


h. Double-click W_GL_REVN_FS to open it.
i. Click the Columns tab.
j. Scroll to the bottom and select the X_CUSTOM column.
k. Click the Add a new column to this table button.
l. Name the column PARTNER_ID.
m. Set data type to varchar2 and prec to 15.
n. Click Apply and OK.
o. Select Repository > Save.

4. Modify the SDE mapping.
a. Select Tools > Mapping Designer.
b. Drag SDE_ORA_GLRevenueFact into the Mapping Designer.
c. Select Transformation > Create.
d. Select Expression in the list.
e. Enter X_CUSTOM as the name.
f. Click Create.
g. Click Done.
h. Drag the CUST_TRX_TYPE_ID column from the mplt_BC_ORA_GLRevenueFact mapplet into the X_CUSTOM expression transformation.
i. Double-click the X_CUSTOM transformation to open it.
j. Click the Ports tab.
k. Rename CUST_TRX_TYPE_ID to PARTNER_ID. For this exercise, you rename CUST_TRX_TYPE_ID and use it as the foreign key to the new WC_PARTNER_D dimension table. This is because the Partner spreadsheet you use as a source for the dimension uses known RA_CUSTOMER_TRX_ALL.CUST_TRX_TYPE_ID values as ROW_ID values. Please note this is for training purposes only and is not the recommended practice.
l. Click Apply and OK.
m. Drag PARTNER_ID from the X_CUSTOM expression transformation to PARTNER_ID in the W_GL_REVN_FS target.
n. Select Mappings > Validate. You should receive the message Mapping SDE_ORA_GLRevenueFact is VALID.
o. Save the repository.
5. Validate the workflow.
a. Select Tools > Workflow Manager.
b. Select Tools > Workflow Designer.
c. If necessary, open the CUSTOM_SDE folder. It may be necessary to disconnect and reconnect to see the workflows.
d. Select CUSTOM_SDE > Workflows.
e. Drag the SDE_ORA_GLRevenueFact workflow into the Workflow Designer.
f. Select Workflows > Validate. You should receive the message Workflow SDE_ORA_GLRevenueFact is VALID.

Practice 14-5: Creating an SIL Mapping to Load the Fact Table
Goals: To create a new SIL mapping to load the fact table
Scenario: In the previous practice, you modified an existing SDE mapping to load the fact staging table. In this practice, you use Informatica tools to modify an existing SIL mapping and validate the workflows used to load the fact table.
Time: 15–20 minutes
Instructions:
1. Copy an existing SIL mapping to the custom folder.
a. Open Informatica Repository Manager.
b. Navigate to SILOS > Mappings.
c. Copy the SIL_GLRevenueFact mapping and paste it into the CUSTOM_SILOS folder.
d. Click Yes to confirm the copy.
e. Expand CUSTOM_SILOS > Mappings and verify that the SIL_GLRevenueFact mapping is copied.
2. Copy an existing SIL workflow to the custom folder.
a. Navigate to SILOS > Workflows.
b. Copy the SIL_GLRevenueFact workflow and paste it into the CUSTOM_SILOS folder.
c. Click Yes to confirm the copy.
d. In the Copy Wizard, select Reuse and Apply this resolution to all conflicts.
e. Click Next.
f. Click Finish.
g. Expand CUSTOM_SILOS > Workflows and verify that the SIL_GLRevenueFact workflow is copied.
3. Import sources.
a. Return to Informatica Designer, which should still be open.
b. To see the changes made in Repository Manager, right-click CUSTOM_SILOS and select Disconnect, and then right-click CUSTOM_SILOS and select Connect.
c. Open the CUSTOM_SILOS folder.
d. Select Tools > Source Analyzer.
e. Select Sources > Import from Database.
f. Select the obaw ODBC data source, enter obaw as user name, owner name, and password, and click Connect.

g. Expand obaw > Tables.
h. Use Ctrl+click to select the WC_PARTNER_D and W_GL_REVN_FS tables.
i. Click OK.
j. Select Apply to all tables, Retain user-defined PK-FK relationships, and Retain user-defined Descriptions, and click Replace.
k. Select CUSTOM_SILOS > Sources > obaw and verify that the WC_PARTNER_D and W_GL_REVN_FS tables appear as sources.
4. Import the target.
a. Select Tools > Target Designer.
b. Select Targets > Import from Database.
c. Select the obaw ODBC data source, enter obaw as user name, owner name, and password, and click Connect.
d. Expand obaw > Tables.
e. Select W_GL_REVN_F and click OK.
f. Select CUSTOM_SILOS > Targets and verify that the W_GL_REVN_F table appears as a target.
5. Add the WC_PARTNER_D source to the SIL mapping.
a. Select Tools > Mapping Designer.
b. Select CUSTOM_SILOS > Mappings > SIL_GLRevenueFact.
c. Drag SIL_GLRevenueFact into the Mapping Designer.
d. Select CUSTOM_SILOS > Sources > OBAW > WC_PARTNER_D.
e. Drag WC_PARTNER_D into the mapping.
f. Delete the SQ_WC_PARTNER_D source qualifier from the mapping.
g. Drag ROW_WID from WC_PARTNER_D onto a blank port in the Sq_W_GL_REVN_FS source qualifier.
6. Modify the source qualifier.
a. Double-click Sq_W_GL_REVN_FS to open it.
b. Click the Ports tab.
c. Rename ROW_WID to PARTNER_WID.
d. Click the Properties tab.
e. Open the SQL Query.
f. At the end of the SELECT clause, add: , WC_PARTNER_D.ROW_WID
g. At the end of the FROM clause, add: LEFT OUTER JOIN WC_PARTNER_D ON W_GL_REVN_FS.PARTNER_ID = WC_PARTNER_D.INTEGRATION_ID
(No change is needed at the end of the WHERE clause, because the outer join condition is carried in the FROM clause. The combined result is sketched after this step.)
h. Click Apply and OK.
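Putting the edits in step 6 together, the reworked override has roughly the following shape. The real query selects the full W_GL_REVN_FS column list (only a few columns are shown here), but the join and the new ROW_WID column are the important parts:

    SELECT
        W_GL_REVN_FS.INTEGRATION_ID,
        W_GL_REVN_FS.DATASOURCE_NUM_ID,
        W_GL_REVN_FS.PARTNER_ID,
        W_GL_REVN_FS.X_CUSTOM,
        WC_PARTNER_D.ROW_WID      -- resolved partner surrogate key, mapped to PARTNER_WID
    FROM W_GL_REVN_FS
    LEFT OUTER JOIN WC_PARTNER_D
        ON W_GL_REVN_FS.PARTNER_ID = WC_PARTNER_D.INTEGRATION_ID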

7. Map PARTNER_WID through the transformations in the mapping.
a. Drag PARTNER_WID from Sq_W_GL_REVN_FS to a blank port in the EXP_Custom expression transformation.
b. Double-click EXP_Custom to open it.
c. Rename the PARTNER_WID port to IN_PARTNER_WID and set it as an Input port only.
d. Add a new port named PARTNER_WID and set it as an Output port only.
e. Click the down arrow to open the Expression Editor for the PARTNER_WID column.
f. Delete the existing expression and enter the following expression: IIF(ISNULL(IN_PARTNER_WID),0,IN_PARTNER_WID)
You do this to manage any nulls, updating them to zero so that they can be loaded into the target table.
g. Click OK.
h. Click Apply and OK.
i. Drag PARTNER_WID from EXP_Custom to the blank port below X_CUSTOM in Upd_W_GL_REVN_F_Ins_Upd.
j. Drag PARTNER_WID from Upd_W_GL_REVN_F_Ins_Upd to PARTNER_WID in the W_GL_REVN_F target.
8. Validate the mapping.
a. Select Mappings > Validate. You should receive the message Mapping SIL_GLRevenueFact is VALID.
b. Save the repository.
9. Validate the workflow.
a. Select Tools > Workflow Manager.
b. Select Tools > Workflow Designer.
c. Verify that the CUSTOM_SILOS folder is open.
d. Expand CUSTOM_SILOS > Workflows.
e. Drag the SIL_GLRevenueFact workflow into the Workflow Designer.
f. Select Workflows > Validate. You should receive the message Workflow SIL_GLRevenueFact is VALID.
g. Save the repository.
h. Close all open Informatica clients. Save all changes if prompted.

Practice 14-6: Adding DAC Tasks and Running Customized ETL
Goals: To configure the DAC to run a custom execution plan
Scenario: You have built the sessions and workflows in Informatica Workflow Manager to run the SDE and SIL mappings. Now you must modify the DAC tasks, add them to a custom subject area and execution plan, and run the ETL to load data into the W_GL_REVN_F fact table in the data warehouse.
Time: 20–30 minutes
Instructions:
1. Create and synchronize tasks in DAC.
a. If necessary, open the DAC client and log in as dac with password dac.
b. Verify that the Custom container is selected.
c. Navigate to Design > Tasks.
d. Create a new task with the following properties:
Name: Custom Extract for Partner Dimension
Command for Incremental Load: SDE_Custom_PartnerDimension
Command for Full Load: SDE_Custom_PartnerDimension
Folder Name: Custom_Extract
Primary Source: FlatFileConnection
Primary Target: DBConnection_OLAP
Task Phase: Extract Dimension
Execution Type: Informatica
Priority: 5
e. Save the task.
f. Right-click the task and select Synchronize Tasks.
g. Click OK to accept the default, Selected record only.
h. Click Yes when prompted to update source and target tables.
i. When synchronization completes, click OK in the Synchronizing task message box.
j. Click the Target Tables child tab and verify that both Truncate Always and Truncate for Full Load are selected for the WC_PARTNER_DS target table.
k. Click Save.

l. Create another new task with the following properties:
Name: Custom Load into Partner Dimension
Command for Incremental Load: SIL_Custom_PartnerDimension
Command for Full Load: SIL_Custom_PartnerDimension
Folder Name: Custom_Load
Primary Source: DBConnection_OLAP
Primary Target: DBConnection_OLAP
Task Phase: Load Dimension
Execution Type: Informatica
Priority: 5
m. Save the task.
n. Right-click the task and synchronize the tasks for the selected record only.
o. Click OK.
p. Click Yes when prompted to update source and target tables.
q. When synchronization completes, click OK in the Synchronizing task message box.
r. Create another new task with the following properties:
Name: Custom Extract for Revenue Fact
Command for Incremental Load: SDE_ORA_GLRevenueFact
Command for Full Load: SDE_ORA_GLRevenueFact
Folder Name: Custom_Extract
Primary Source: DBConnection_OLTP
Primary Target: DBConnection_OLAP
Task Phase: Extract Fact
Execution Type: Informatica
Priority: 5
s. Save the task.
t. Right-click the task and synchronize the tasks for the selected record only.
u. Click OK.
v. Click Yes when prompted to update source and target tables.
w. When synchronization completes, click OK in the Synchronizing task message box.
x. Click the Target Tables child tab and verify that both Truncate Always and Truncate for Full Load are selected for the W_GL_REVN_FS target table.
y. Click Save.
z. Create another new task with the following properties:

      Name: Custom Update for Revenue Fact
      Command for Incremental Load: SIL_GLRevenueFact
      Command for Full Load: SIL_GLRevenueFact
      Folder Name: Custom_Load
      Primary Source: DBConnection_OLAP
      Primary Target: DBConnection_OLAP
      Task Phase: Update Fact
      Execution Type: Informatica
      Priority: 5
   aa. Save the task.
   bb. Right-click the task and synchronize tasks for the selected record only.
   cc. Click OK.
   dd. Click Yes when prompted to update source and target tables.
   ee. When synchronization completes, click OK in the Synchronizing task message box.

2. Create a new subject area.
   a. Click the Subject Areas tab.
   b. Create and save the new subject area named Custom Revenue Update.
   c. Click the Tables child tab and use the Add/Remove button to add W_GL_REVN_F.
   d. Click the Tasks child tab.
   e. Use the Add/Remove button to add the four custom tasks:
      Custom Extract for Partner Dimension
      Custom Extract for Revenue Fact
      Custom Load into Partner Dimension
      Custom Update for Revenue Fact
   f. Save the subject area.

3. Create a new execution plan.
   a. Create and save a new execution plan named Custom Revenue Update.
   b. Click the Subject Areas child tab and add the Custom Revenue Update subject area.
   c. Save the execution plan.
   d. Click the Parameters child tab and click Generate to generate parameters for the execution plan.
   e. Set the data sources as follows:
      DBConnection_OLAP: DataWarehouse
      DBConnection_OLTP: ORA_11_5_10
      FlatFileConnection: ORA_11_5_10_Flatfile
   f. Click Execute.
   g. Click Build.
   h. Accept the default and click OK in the Building window.

   i. In the second Building window, again accept the default and click OK. As with assembling a subject area, building the execution plan may take several minutes.
   j. Click OK to acknowledge that the execution plan has been successfully built.
   k. Click the Ordered Tasks child tab.
   l. Verify that the expected four custom tasks appear in the list along with the QUERY_INDEX_CREATION task.

4. Reset the refresh date for the data warehouse.
   a. Select Tools > ETL Management > Reset Data Warehouse.
   b. In the Reset Data Warehouse dialog box, confirm that you want to reset by entering the text and click Yes.
   c. Click OK to confirm that the reset was successful.

5. Before running the execution plan, query WC_PARTNER_D and verify that there is no data in the table.
   a. Open SQL*Plus.
   b. Log in as obaw with password obaw.
   c. Run the following SQL statement:
      select count(*) from WC_PARTNER_D where PARTNER_NAME is not null;
   d. Verify that the query returns a count of 0.

6. Run the execution plan.
   a. In the Execution Plans tab, select Custom Revenue Update.
   b. Click the Run Now button in the Top Pane toolbar.
   c. In the Starting ETL dialog box, click Yes to confirm that you want to start the execution plan.
   d. Click OK to acknowledge that the request has been successfully submitted to the Informatica Server.

7. Monitor the ETL plan execution.
   a. Select the Current Run tab.
   b. Select the Custom Revenue Update run and confirm that it has a Run Status of Running.
   c. Note that the DAC Server Monitor icon has changed from yellow to green, indicating that a plan is being executed.
   d. Write down the process ID of the current run.
   e. Select the Tasks tab in the bottom pane to view task status within the execution plan.
   f. Use the drop-down menu to view different task statuses.
   g. Click Refresh to refresh the status.
   h. Click Auto Refresh to verify the refresh frequency of 30 seconds.
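While the execution plan from step 6 is running, or once it completes, you can optionally watch the staging tables fill from the same SQL*Plus session used in step 5. This is only a suggested sketch, not part of the practice; it assumes the WC_PARTNER_DS and W_GL_REVN_FS staging tables referenced in step 1 are visible to the obaw login.

   -- Optional checks: both staging tables are truncated on each run (Truncate Always),
   -- so nonzero counts confirm that the custom SDE tasks extracted rows in this run.
   select count(*) from WC_PARTNER_DS;
   select count(*) from W_GL_REVN_FS;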

8. View Run History.
   a. When the execution plan completes (approximately 5 minutes), select the Run History tab.
   b. Select the Custom Revenue Update execution plan you just ran.
   c. Verify that Run Status = Completed, Number of Failed Tasks = 0, and Number of Successful Tasks = 5.
   d. Right-click Custom Revenue Update and select Get Run information > Get log file.
   e. In the Input dialog box, click OK to accept the default log file name.
   f. In the Fetching log file dialog box, note the path to which the log file has been saved and click OK.
   g. In Windows Explorer, navigate to C:\OracleBI\DAC\ServerLog\ and open Custom_Revenue_Update.#.log. Note that the naming convention of the log files includes the ETL Process ID that you recorded in a previous step.
   h. Scroll down to the bottom of the file to the list of step statuses and confirm that there are no steps listed under Failed Sessions or Queued Sessions. If any tasks fail, use the log files to troubleshoot. If you need assistance, ask your instructor.

9. After running the execution plan, query WC_PARTNER_D and verify that there is now data in the table.
   a. Open SQL*Plus.
   b. Log in as obaw with password obaw.
   c. Run the following SQL statement:
      select count(*) from WC_PARTNER_D where PARTNER_NAME is not null;
   d. Verify that the query returns a count of 5, indicating that five partner records were loaded.
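As an optional extension of step 9, a join such as the one below confirms that the loaded fact rows actually resolve to the new partner dimension. This is a sketch rather than part of the practice; it assumes WC_PARTNER_D follows the usual warehouse convention of a ROW_WID surrogate key that W_GL_REVN_F.PARTNER_WID references, which may differ in your environment.

   -- Optional check: count the revenue fact rows whose PARTNER_WID matches a row
   -- in the new partner dimension (ROW_WID is assumed to be the surrogate key).
   select count(*)
   from W_GL_REVN_F f
   join WC_PARTNER_D d on f.PARTNER_WID = d.ROW_WID;

A count close to the number of revenue fact rows suggests the PARTNER_WID lookup in the SIL mapping is resolving correctly; rows defaulted to 0 will not appear in the join.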
