Lab Guide
Section Objectives
In this section, you will:
Hosted images are run behind the Oracle firewall. To access your image
from a non-Oracle machine you must establish a limited VPN connection
to the OPN Enablement Environment (OPNEE) via an Array SSL
connection.
1. Obtain your dedicated Array SSL credentials from your instructor.
2. Refer to the document OPNEE_Access_Instructions.pdf that you
have been provided.
NOTES
Once connected via Array SSL you will not have access to network
resources (e.g., the Internet, local printers) outside of the OPNEE.
If you disconnect from the network, you will have to re-establish this
connection to access your hosted image.
2. For the exercises in this section, the following services are needed.
Oracle DB Services
o Database: ORCL (bi)
OracleJobSchedulerORCL
o Listener (bi)
OBIEE 11g Services
o OBIEE 11g
o Note that due to a bug in the process control software,
OBIEE shows a red status even when started.
However, OBIEE does start correctly; just wait patiently
while the command window executes – it takes
approximately 10 minutes.
BI Apps 7.9.6.3
o Informatica Server
o DAC Server
3. Processes can be stopped and started by selecting the check box to the
left of the service, and then selecting Stop or Start at the top of the
page. The status of the services (a green flag means 'up', a red flag
'down', and '?' means the status is unknown) can be updated by
selecting the update icon on the right.
Start Informatica Server…
1. Start the Informatica server for the exercises in this section.
2. Your training instructor will provide you with the client installers.
Extract the file (such as 901HF2_Client_Installer_win32-x86.zip) to a
temporary directory.
3. Go to the folder where you extracted the file. Open the file install.exe
in the Client folder to launch the installer. In the screen that appears,
select the default option Install Informatica 9.0.1 with Hotfix 2. Click
Next>.
8. Proceed through the installation screens.
10. New icons are created in the Start menu under Start > All Programs >
Informatica 9.0.1 > Client > PowerCenter Client. Start PowerCenter
Repository Manager to configure the connection to the Informatica
PowerCenter server running on the VM.
Username: Administrator
Note that if you have 64-bit Windows and installed the 32-bit
client tools, you have to run the 32-bit ODBC Administrator at
C:\Windows\SysWOW64\odbcad32.exe
Click on the System DSN Tab
Click on the Add button and select the “Oracle BI Server” driver in
the subsequent screen. Click on Finish.
Enter the name of the Server in the subsequent step and click on
Next
OBIEE Ad Hoc Reports
Section Objectives
In this section, you will:
Build a report using OBIEE Analyses
Create a new dashboard and add the created report to it
Create pivot table and graph views for the report
Start Services…
1. Make sure that the needed services are running. The database, database
listener and OBIEE server should be running.
2. Log in to OBIEE with the username weblogic and the password
password1. Select New Analysis and select the subject area
"Service - CRM Activities" from the popup menu.
8. We’re going to build a report that shows the actual cost by group
activity and time. Select the following columns:
Time > Week
Activity > Activity Name, Activity Type
Facts Activity Cost > Facts Activity Costs, Time Cost Contract
You can either drag and drop them from the left to the right or
double-click the columns to add them.
The columns appear in the same order as you add them. To reorder
them drag and drop the columns. When you start dragging the column
a grey bar shows where the column will be moved.
When you're done, click Save in the top right corner of the
screen. Create a new folder called Boot Camp under Shared Folders.
Save the analysis to the new folder, for example with the name Activity
Cost Report for Boot Camp.
9. So far we’ve done all the report definition in the Criteria tab. To view
the results of the analysis select the tab Results on top left. A default
layout with a title and a table view shows up.
10. Let’s now create a new dashboard where we’ll place the new report.
Select New Dashboard. Set the name for example to Boot Camp
and select the Location to be the folder that you created in the previous
step. Leave the default selection Add content now and click OK.
11. You now enter the dashboard edit mode and can add content. Find the
analysis that you created using the Catalog pane at the bottom left. Drag
and drop it onto the canvas. Save the changes by clicking the icon at the
top right.
12. To view the dashboard click on the Run button on top right.
3. Click the icon at the top right of the analysis and select
Edit Analysis in the popup menu.
5. Maybe you didn’t notice it, but actually a new view was added to the
bottom of the compound layout. To edit it, click on the Edit View
button. Alternatively you can also select the view in the Views pane at
the bottom left.
6. The pivot table edit mode opens up. Drag and drop the column Week to
the Pivot Table Prompts box; this lets users view the
results per week. Drag and drop the column Activity Name to the
Excluded box, since we don't need that column for this view. You can see
the changes in the upper pane. When you're done, click Done at the top right.
7. You now return to the compound layout and can see the changed
pivot layout. You can test how the prompt works, too. Clicking Done
in the previous view didn't save the analysis, so do it now by clicking
Save at the top right.
8. Let's create a new pie graph view. Click New View > Graph > Pie.
10. We’ve now created three different views – a table, a pivot table and a
graph from the same base query. Note how we can exclude the
columns we don’t need in the pivot table and graph views.
11. If we decide that we don't need one of the views, we can delete it. Say
we don't need the default table view. We can remove it from the
compound layout by clicking the icon at the top right of the table. If we're
not going to use it at all, then the best practice is to delete it completely.
This can be done by selecting it in the Views pane and clicking the icon
Remove View from Analysis. Save after the changes.
12. Finally, we can have a look at the dashboard, which now shows the
modified analysis.
13. Note that if you deleted the table view, you're no longer using the column
Activity Name. According to best practice, you shouldn't
have any columns in the base query that you don't use anywhere. The
reason is simple: they consume resources for nothing. So remove
any unnecessary columns from the analyses. Go to the analysis edit
mode and delete the column as shown in the screenshot below. Save
the analysis after the changes.
Section Objectives
In this section, we will walk through the following dashboards:
Start Services…
1. Make sure that the needed services are running. The database, database
listener and OBIEE server should be running.
2. Log in to OBIEE with the username MSTERN and the password
welcome1. This user is specifically configured for Sales Analytics.
3. Navigate to Sales Pipeline dashboard as shown below.
4. The report Forecast and Actual versus Quota shows users where
they stand with respect to quota, based on their forecast and the actual
revenue they have produced so far in the quarter.
9. Have a look at the other pages and reports within the Pipeline
dashboard.
4. The # of Reps with Closed Deals report shows sales reps that have
closed deals.
10. The "Top performers" report shows how the sales reps are doing.
11. Have a look at the other pages and reports within the Sales
Effectiveness dashboard.
3. The Close Rate by Source report shows what sources are generating the
highest and lowest close rates.
5. The Average Sales Cycle by Source report shows the average sales cycle
based on the opportunity source.
9. Browse the other pages and reports within the Demand Generation
dashboard.
5. The Top Accounts report shows order revenue, opportunity revenue and
number of orders for the top 10 accounts.
6. The Account History report shows the number of active accounts and open
opportunities for each quarter.
8. The Product Penetration report in the Products dashboard page helps users
identify the products with the deepest customer penetration.
9. Have a look at the other reports and dashboard pages in the Customers
dashboard.
3. The Win Loss Trend report shows the organization's overall win/loss
progress.
4. The Reason Lost report shows the biggest reasons leading to competitive
losses across all deals or against a specific competitor.
5. The Win Loss Scorecard report tells how the organization is doing against
all of its primary competitors.
6. The Recent Competitive Wins report shows a summary of the most
recent wins against all competitors.
7. Have a look at the other reports and dashboard pages in the Competitors
dashboard.
The dashboard provides the Service Executive with full and detailed
insight into the pulse of the Service Organization. High-level
dashboards provide intelligence around service performance with
respect to important customers, the financial performance of the
service organization, and general service resolution trends.
1. Log in to OBIEE with the username MSTERN and the password
welcome1.
2. Navigate to the Service > Service Executive dashboard as shown below.
3. The Top Accounts with Open-Critical SRs report displays the top accounts
ranked by the number of open critical SRs.
6. Have a look at the remaining reports and dashboard pages in the Service
Executive dashboard.
3. The Service Request Trends report displays the total number of service
requests by severity or source for the current year.
4. The First Time Fix Rate report displays the total number of service
requests that were closed within one activity, by service request severity.
7. Have a look at the other dashboard pages and reports within the Service
Manager dashboard.
3. The Call Back Queue report displays the employee’s call back queue.
4. The Volume and Cost report displays the closed activity volume
distribution by activity actual end dates and associated total costs to
complete the activities.
6. The Activities by Type and Cost report displays the activity volume
distribution by type with total estimated cost.
Start Services…
For the exercises in this section, make sure that the following services are
running:
Database and listener
DAC server
Informatica PowerCenter services
OBIEE
1. Build the source and target objects in the EDW schema and create
synonyms in the OBIA7963 schema to point to the newly created tables.
(If the schema used by OBIEE is different from OBIA7963, then modify
accordingly.) You can use either SQL*Plus or SQL Developer to create
the tables. Consult your training instructor for details.
CREATE TABLE EDW.WC_ASSET_DS
(
DATASOURCE_NUM_ID NUMBER(10),
INTEGRATION_ID VARCHAR2(30 BYTE),
ASSET_REGION VARCHAR2(30 BYTE),
ASSET_TYPE VARCHAR2(30 BYTE)
);
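The synonyms mentioned in step 1 could be created along the following lines. This is a sketch: it assumes the dimension table WC_ASSET_D is also created in the EDW schema, and the schema names follow the lab setup above.

```sql
-- Synonyms in the OBIA7963 schema (used by OBIEE in this lab)
-- pointing at the tables created in EDW. Adjust the schema names
-- if your environment differs.
CREATE SYNONYM OBIA7963.WC_ASSET_DS FOR EDW.WC_ASSET_DS;
CREATE SYNONYM OBIA7963.WC_ASSET_D FOR EDW.WC_ASSET_D;
```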
1. The instructor should have given you the ASSET_TYPE.csv file. Copy
it to the Informatica source file folder, typically something like
6. The flat file now appears in the sources and in the Source Analyzer
screen. Save the changes.
8. Ask your instructor which ODBC driver is the correct one to connect to
the database. Here we're using the ODBC connection called
DataWarehouse. If there isn't an existing ODBC connection, ask your
instructor for the ODBC creation details. The following are example
screenshots for creating an ODBC connection.
10. From the list of available drivers, select the best option for the Oracle
database. If an Oracle database is installed locally, then there's an
obvious choice. Otherwise you might need to install Oracle Instant
Client.
11. Fill in the details. You can use any name for the Data Source Name.
The TNS Service Name is typically ORCL, the username edw and the
password edw. Test the connection.
12. If for some reason you couldn't set up the ODBC connection, you can
define the table manually in Target Analyzer through the menu
Targets > Create… Again, ask your instructor for more detail if you
need to do this.
13. Now import the table WC_ASSET_DS. Here we’re using the ODBC
connection DataWarehouse to import the table.
14. The table appears in Target Analyzer canvas and in the Targets folder.
Save.
15. Next we'll build the SDE mapping that takes the data from the CSV file
to the staging table WC_ASSET_DS. Switch to Mapping Designer
mode. Create a new mapping through Mappings > Create… and name it
Custom_SDE_AssetDimension.
16. Drag and drop the source and target that we defined to the canvas from
the left pane.
19. Drag and drop the columns from the source qualifier
SQ_ASSET_TYPE to the expression. Connect the columns to the
correct ports in the target table. Something like this.
20. Set the DATASOURCE_NUM_ID in the expression to
$$DATASOURCE_NUM_ID.
22. Now let's build the SIL mapping to load the data from the staging table
WC_ASSET_DS to the dimension table WC_ASSET_D. Go to the
folder Custom_SIL. Go to Source Analyzer mode and import the
source table WC_ASSET_DS through Sources > Import from Database.
Save.
24. Next let's copy a mapplet from the SILOS folder to our custom folder
so that we can reuse the out-of-the-box logic. Open the folder SILOS.
Look for MPLT_GET_ETL_PROC_WID in the Mapplets subfolder.
Copy it with Edit > Copy or CTRL-C. Open the folder Custom_SIL
again. Set the focus on the Mapplets subfolder in Custom_SIL.
Paste the mapplet with Edit > Paste or CTRL-V. Save.
26. Drag and drop the source table WC_ASSET_DS and target table
WC_ASSET_D to the canvas. Drag and drop also the mapplet
MPLT_GET_ETL_PROC_WID.
28. Add an output port called ROW_WID, data type decimal and precision
10 to the expression. In this lab we’re taking some shortcuts in order
not to make the lab too long. Normally ROW_WID would be
populated with a number from a sequence transformation so that each
record would get a unique ROW_WID. In this case we’ll just set a
static value -1. Connect the ROW_WID to the target table. Right-click
and Arrange All. Save. The outcome should look more or less like this.
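As a point of reference, the "normal" approach described above corresponds in plain SQL to a database sequence such as the one below. In the mapping itself this would be a Sequence Generator transformation rather than SQL, and the sequence name here is purely illustrative.

```sql
-- Illustrative only: how a unique ROW_WID would normally be generated.
CREATE SEQUENCE EDW.WC_ASSET_D_SEQ START WITH 1 INCREMENT BY 1;

-- Each inserted dimension record would then receive the next value,
-- e.g. EDW.WC_ASSET_D_SEQ.NEXTVAL, instead of the static -1 used
-- as a shortcut in this lab.
```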
4. Select the mapping with the same name to be associated with the session.
7. Select the options Fail parent if this task fails and Fail parent if this task
does not run in the properties of the session instance. Save.
8. Now let's create the SIL session and workflow. Open the folder
Custom_SIL. Go to Task Developer mode. Create a new session through
Tasks > Create… and name it Custom_SIL_AssetDimension.
9. Select the mapping with the same name to be associated with the session.
12. Select the options Fail parent if this task fails and Fail parent if this task
does not run in the properties of the session instance. Save.
After that we create new logical and physical folders to point to our
custom Informatica folders. A logical folder is used within DAC,
while a physical folder has the same name as the corresponding
Informatica folder. In this case both folders have the same name.
1. Start the DAC Client at Programs > DataWarehouse Administration
Console > Client.
2. The default username and password are fine, just click Login.
3. Create a new source system container by navigating to File > New Source
System container. Provide a unique id and name.
4. Create two new logical folders with the names Custom_Extract and
Custom_Load by selecting Tools > Seed Data > Task Logical Folders.
5. Create two new physical folders with the names Custom_SDE and
Custom_SIL by selecting Tools > Seed Data > Task Physical Folders.
3. Right-click the task name in the upper pane and select Synchronize
tasks. Select the option Add missing tables and click OK. DAC will
process the command for some time and then display the added tables.
4. Check the source and target table properties. Set the Data Source to
FlatFileConnection for ASSET_TYPE and to DBConnection_OLAP for
WC_ASSET_DS. Set the WC_ASSET_DS to be truncated always since
it’s a staging table.
6. Right-click the task name in the upper pane and select Synchronize
tasks. Select the option Add missing tables and click OK. DAC will
process the command for some time and then display the added tables.
W_PARAM_G is added because it is used in the out-of-the-box mapplet.
7. Check the source and target table properties. Set the Data Source to
DBConnection_OLAP for WC_ASSET_DS, W_PARAM_G and
WC_ASSET_D. Set WC_ASSET_D to be truncated only during a
full load since it's a dimension table. (Note however that we've taken
some shortcuts in this lab and the mapping Custom_SIL_AssetDimension
doesn't have a proper update strategy in place. So with each execution the
same records would be inserted again into the table, which normally would
not be correct.)
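For illustration, the update strategy that the note above says is missing would, expressed in plain SQL rather than as Informatica transformations, correspond roughly to a MERGE on the natural key:

```sql
-- Sketch only: in the actual mapping this logic would be built with
-- Lookup and Update Strategy transformations, not SQL.
MERGE INTO EDW.WC_ASSET_D d
USING EDW.WC_ASSET_DS s
   ON (d.INTEGRATION_ID = s.INTEGRATION_ID
       AND d.DATASOURCE_NUM_ID = s.DATASOURCE_NUM_ID)
WHEN MATCHED THEN UPDATE SET
   d.ASSET_REGION = s.ASSET_REGION,
   d.ASSET_TYPE = s.ASSET_TYPE
WHEN NOT MATCHED THEN INSERT
   (DATASOURCE_NUM_ID, INTEGRATION_ID, ASSET_REGION, ASSET_TYPE)
   VALUES (s.DATASOURCE_NUM_ID, s.INTEGRATION_ID,
           s.ASSET_REGION, s.ASSET_TYPE);
```

With this logic a re-run updates existing dimension records instead of inserting duplicates, which is the behavior the lab shortcut omits.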
9. DAC now has imported the metadata of the table columns and indexes.
Check the column view for the tables to make sure.
1. Go to Design > Subject Areas and create a new subject area, for example
with the name Custom Load Asset Dimension.
2. Go to the Tables tab in the subject area properties and add the table
WC_ASSET_D.
3. Generate the parameters for the execution plan in the tab Parameters. Set
the DBConnection_OLAP to DataWarehouse. Check against the
screenshot.
4. Since we defined the tables as synonyms in the schema, let's select the
synonyms and unselect all the other types of database objects.
9. A physical diagram opens up showing the two tables. Let's define a join
based on INTEGRATION_ID between them by clicking New Join.
(Here again we're taking a shortcut. Usually INTEGRATION_ID would
not be used for joins but rather the WIDs, the surrogate keys created in the
warehouse during the ETL process. For example, the join could be
established between a product dimension's ROW_WID column and the
column PRODUCT_WID in a fact table. Also, typically two dimension
tables would not be joined together; a dimension table usually joins to a
fact table. In this case, creating a ROW_WID in the custom dimension and
then having foreign keys in the fact table(s) to the dimension would have
been the preferred approach.)
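To make the parenthetical note concrete, a WID-based join of the kind described would look something like this in SQL. The fact table W_ACTIVITY_F and its ASSET_WID column are hypothetical examples, not objects created in this lab.

```sql
-- Hypothetical fact table joining to the custom dimension via its
-- surrogate key ROW_WID instead of INTEGRATION_ID.
SELECT d.ASSET_REGION, d.ASSET_TYPE, f.ACTIVITY_COST
FROM W_ACTIVITY_F f
JOIN WC_ASSET_D d ON f.ASSET_WID = d.ROW_WID;
```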
10. In the Business Model and Mapping Layer open the business model Core
and navigate to Dim – Asset. Drag and drop the columns
ASSET_REGION and ASSET_TYPE from the physical table alias
Custom_Dim_WC_ASSET_D_Asset to the Dim – Asset in the Business
Model and Mapping Layer.
11. The columns appear at the end of the list of columns. Rename them to
Asset Region and Custom Asset Type. (There is already a column Asset
Type, so we can't use the same name.)
Open the Service - CRM Assets subject area in the Presentation Layer.
Drag and drop the two custom columns to the presentation table Asset
Attributes. Check in the changes to the repository, save and check the
consistency. Note: If it is not possible to save the RPD in the online
mode – for example OBIEE fails when trying to do so – then ask your
trainer how to save it locally and deploy it using Enterprise Manager.
WITH
SAWITH0 AS (select distinct T346071.ASSET_REGION as c1,
T346071.ASSET_TYPE as c2
from
WC_ASSET_D T346071 /* Custom_Dim_WC_ASSET_D_Asset */ )
select distinct 0 as c1,
D1.c1 as c2,
D1.c2 as c3
from
SAWITH0 D1
order by c2, c3
14. We’ve now added the new custom tables to the data warehouse. After that
we built the ETL mappings, sessions and workflows. Thereafter we
defined the tasks, subject area and execution plan in DAC and ran the data
load. We then imported the custom table metadata to OBIEE repository
and modeled it in the Business Model and Mapping Layer and
Presentation Layer. As the last step we tested the new development by
creating an analysis in the OBIEE front-end.
Section Objectives
At the end of this section, you will be able to:
Start services…
1. For the exercises in this section make sure that the database,
database listener and OBIEE services are running.
2. Open the WebLogic Administration Console at
http://demodrive:9704/console and log in with weblogic / password1.
The Administration Console is used, for example, to manage users and
user groups.
3. Navigate to Security Realms > myrealm > Users and Groups >
Groups > New. Create a new group called FinancialAnalysts. Set
the description to Financial Analysts Group. (According to the
naming standard, plural is to be used for group names.) Leave the
authentication provider at the default. Click OK.
5. From the list of users select the user that you just created. Click on
the user name to go to the properties of the user. Set the groups of
the user in the tab Groups. From the list of parent groups on the left
select the groups BIAuthors and FinancialAnalysts and move them
to the Chosen box on the right. Save.
step, just go to the role properties and continue from the next step.)
10. Check that the group has been added. Click OK at the top right of the
window to save the application role.
11. Click the role BIAuthor in the list of application roles. It's one of the
out-of-the-box roles. This role basically grants the permissions
necessary to create dashboards and analyses for other users. See
the list of members. Remember how we added our user to the group
BIAuthors? By belonging to the BIAuthors group, a user gets assigned to
the BIAuthor application role and therefore has extensive rights to create
content.
12. You can exit from the role properties by pressing Cancel at the top
right.
18. You can still see the dashboard with this user because the user dnoonan
has been granted the BIAuthors role, and that role has full control of
the dashboard General Ledger and the dashboard page Balance Sheet.
Of course, in any real in-house system this would be set correctly, and
somebody who doesn't have access to the data wouldn't see the
corresponding dashboard, either.
19. Try creating a new analysis, and you'll see that the user has no access
to the subject area Financials - GL Balance Sheet. All the other subject
areas are still available.
20. Now log in with the user that you created, which has been granted
the role Financial Analyst. You should be able to see the dashboard.
When you create a new analysis you can select the subject area
Financials - GL Balance Sheet.
Great, we've now seen how to set object-level security with an application
role. With these security settings we've restricted access to Financials -
GL Balance Sheet so that only users with the Financial Analyst role, or an
administrator or system role, can access it.
What if we'd like all users to be able to access the dashboards created on
Financials - GL Balance Sheet, but allow only the Financial Analyst role
to create new dashboards and reports? In the next section, let's change the
repository permissions back, and set the security with the front-end
privileges instead.
1. For the exercises in this section make sure that the database, database
listener and OBIEE services are running.
2. Go back to the repository in online mode. Double-click the
presentation catalog Financials - GL Balance Sheet. Click
Permissions… and set the permissions back to Default for BIAuthor
and BIConsumer.
5. Scroll down the list until you find Subject Area: "Financials - GL
Balance Sheet" and the privilege to access it within Oracle BI
Answers. Click on the current privilege holder Authenticated User to
change the properties. Remove the privilege from Authenticated User,
and grant it to the Financial Analyst role instead.
6. Log out and log in again with the user dnoonan / password Admin123,
who has only been granted the application role BIAuthor. Notice how
you can see the reports on the dashboard page General Ledger –
Balance Sheet. However, this user is not able to create new analyses in
the subject area Financials - GL Balance Sheet.
7. Log out and log in with the user we created previously, which has
been granted the Financial Analyst role. With this user we can create
analyses in the subject area Financials - GL Balance Sheet.
To sum up, the difference between the two security approaches is the
following:
1) In the first exercise we set the permissions of the whole subject area
Financials - GL Balance Sheet in the OBIEE repository. The users who
don’t have permissions to the subject area can’t see the reports or
dashboards based on it, nor can they create new reports in that subject
area.
2) In this exercise we modified the front-end privileges. As a result the
users will still be able to consume reports and dashboards based on
Financials - GL Balance Sheet but unless they have the role Financial