7. ASO data loads are typically more flexible than BSO data loads. ASO supports
load buffers, which can combine data arriving from multiple data sources in
memory — adding, subtracting, and so on — before committing it to the cube.
8. There is no need to identify sparse and dense dimensions.
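As a sketch of that load-buffer flexibility, a typical ASO load via MaxL might look like the following; the application, database, and file names are all hypothetical, and values arriving for the same cell are combined in the buffer before being committed:

```
/* initialize a load buffer, stage two source files into it, then commit once */
alter database MyApp.MyDb initialize load_buffer with buffer_id 1;
import database MyApp.MyDb data
    from server data_file 'region1.txt' to load_buffer with buffer_id 1
    on error abort;
import database MyApp.MyDb data
    from server data_file 'region2.txt' to load_buffer with buffer_id 1
    on error abort;
/* committing the buffer writes the combined data to the cube in one pass */
import database MyApp.MyDb data from load_buffer with buffer_id 1;
```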
Outline Fragmentation
ASO has a quirky behaviour when it comes to outline maintenance: if you
amend or delete outline members periodically, the outline keeps growing bigger and
bigger. This hurts retrieval performance and slows subsequent maintenance jobs.
To rectify this, in my implementations I usually add the following
scheduled steps to the monthly outline maintenance job, using a temp application:
Note:
Real app: This is the live production Essbase ASO cube
Temp app: Temporary cube that is used for maintenance process
Empty outline: can be saved in any server directory; it contains the empty dimensions
of the Real app with the minimum required members. The idea is to rebuild the app
from scratch every time to prevent outline fragmentation.
Steps:
1.
2.
3.
4.
The steps above also help with system availability, by minimizing the downtime
required to update the application.
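Since the original step list did not survive, here is one possible MaxL sketch of such a rebuild. All application, database, and file names are hypothetical, and the exact dimension-build mechanism (rules file vs. outline copy) depends on how the empty outline is stored:

```
/* 1. build the temp app from the saved empty-outline source */
create application TempApp;
create database TempApp.Main;
import database TempApp.Main dimensions
    from server data_file 'empty_dims.txt' using server rules_file 'DimBld'
    on error write to 'dimbld.err';
/* 2. reload level-0 data previously exported from the real app */
import database TempApp.Main data
    from server data_file 'lev0.txt'
    on error write to 'load.err';
/* 3. swap the freshly built app in place of the fragmented one */
drop application RealApp cascade;
create application RealApp as TempApp;
/* 4. clean up */
drop application TempApp cascade;
```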
Compression
A compression dimension is not mandatory, but it helps performance. The compression
dimension is a dynamic dimension, and it should supply the column headers in a data load file.
Ideal compression is achieved when the leaf-level member count is evenly divisible by 16,
since ASO stores values along the compression dimension in bundles of 16; Accounts is the
typical choice of compression dimension.
This entry was posted in Work Stuffs and tagged Oracle, Hyperion, Essbase,
ASO, Performance, Outline, Fragmentation. Bookmark the permalink.
I had around 3 GB of data. I was loading the level-0 data and the current month's data. Data loading
took around 1 hour, but the calculation took 3 days. I was using these settings:
Dimensions: 8 (dense-2, sparse-6)
Data file cache: 100 MB
Data cache: 10 MB
Index cache: 100 MB
Using this script for the calculation:
SET AGGMISSG ON;
SET CALCPARALLEL 4;
SET UPDATECALC OFF;
SET MSG SUMMARY;
SET MSG DETAIL;
CALC ALL;
I also tried this script:
SET AGGMISSG ON;
SET CALCPARALLEL 4;
SET UPDATECALC OFF;
SET MSG SUMMARY;
SET MSG DETAIL;
AGG (s1, s2, s3, s4);   /* the sparse dimensions */
As I was using dynamic calc on all upper-level members of the dense dimensions, I was calculating only
the sparse dimensions. I don't have any formulas on the sparse dimensions, hence the AGG command.
Let me know how I can improve the calculation time.
5 Replies
I suggest:
1) Comment out SET MSG DETAIL; it adds overhead by writing to the application log for every block
that is calculated.
2) Change SET CALCPARALLEL to 3. (How many processors do you have?)
3) Make the calculation utilize the cache: try using SET CACHE ALL;
3a) What are your cache size settings in the essbase.cfg file? Check that.
4) Note from the Tech Reference: when a dimension contains fewer than six consolidation levels, AGG
is typically faster than CALC. Conversely, the CALC command is usually faster on dimensions with
six or more levels. How many levels do you have in your sparse dimensions?
5) What are your other two sparse dims?
6) As for your day-long data load for only 3 GB of data: how is your data file ordered? It should be
d1, d2, s1, s2, s3, s4, s5, s6 for maximum performance.
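Putting suggestions 1–3 together, the revised calc script might look like the following (s1–s4 are the placeholder sparse-dimension names from the original post, and CALCPARALLEL 3 assumes at least three processors):

```
SET AGGMISSG ON;
SET CALCPARALLEL 3;   /* matched to available processors */
SET UPDATECALC OFF;
SET MSG SUMMARY;
/* SET MSG DETAIL; removed: per-block logging adds overhead */
SET CACHE ALL;        /* let the calculation use the calc cache */
AGG (s1, s2, s3, s4);
```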
Apologies about item 6. I misread the post re: data load time.
1. First of all, set up your essbase.cfg file with the following settings:
CALCCACHE TRUE | FALSE
CALCCACHEHIGH | CALCCACHEDEFAULT | CALCCACHELOW
UPDATECALC TRUE | FALSE (intelligent calculation)
Also set the agent delay and net delay.
2. CALC ALL calculates and aggregates the entire database based on the database outline. Instead of
using CALC ALL, try calculating with
CALC DIM (dim1, dim2);
or
AGG (dim3, dim4);
Use CALC DIM to calculate the dense dimensions and AGG for the sparse dimensions.
3. Make sure your outline follows the hourglass model.
Let me know how it goes.
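As a sketch, a corresponding essbase.cfg fragment might look like this; the numeric values are illustrative only and must be tuned to the server's available memory:

```
; calc cache sizes in bytes, selected per script via SET CACHE HIGH|DEFAULT|LOW
CALCCACHE TRUE
CALCCACHEHIGH  100000000
CALCCACHEDEFAULT 1000000
CALCCACHELOW      200000
; disable intelligent calculation globally
UPDATECALC FALSE
; agent and network timing
AGENTDELAY 20
NETDELAY 500
```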
Hi Harsha,
You could perform the calculation in two different ways.
First, by doing an export/import of the level-0 data: your index and page files will be reduced
before your aggregation calc script runs.
Second, if you need to keep the upper-level data for the previous month, I suggest you FIX on the
current month and clear the upper-level data for that month only. This way, you avoid wasting time
in your aggregation calc scripts.
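The second approach might be sketched as a calc script like this (&CurrMonth is an assumed substitution variable holding the current month, and s1–s6 stand in for the poster's six sparse dimensions):

```
/* clear only the current month's upper-level blocks, then re-aggregate them */
FIX (&CurrMonth)
    CLEARBLOCK UPPER;
ENDFIX
FIX (&CurrMonth)
    AGG (s1, s2, s3, s4, s5, s6);
ENDFIX
```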