HFM Material
Entity Dimension:
Default Currency: defines the currency of a particular entity. Data in this currency is translated up to the ultimate parent currency.
Allow Adjustments: if you enable Allow Adjustments for an entity, you are allowed to post journals to that entity; otherwise journal posting is not allowed.
IsICP: specifies whether the entity is used in the intercompany elimination process.
Allow Adjustments from Children: this option is set on parent entities; if it is not selected, you are not allowed to post journals to the parent from its children.
User Defined: a VB string attribute used in member lists and business rules.
Holding Company: a virtual ultimate parent entity, used by the consolidation routine based on the holding method; it allows data to roll up smoothly.
For example: suppose the entity TCS sits under "Tata Group". The two child entities' data rolls up to "Tata Group", but if "Tata Group" itself has revenues or expenses, you are not allowed to input data at the parent level, so we need to create a virtual entity; this is called the holding company.
Security As Partner: used for intercompany transactions; it should be selected wherever IsICP is selected for the entity.
Default Parent: an optional field, defined as per the tree; if you do not define it, nothing happens.
Custom Dimension
Is Calculated: if you select Is Calculated, you are not allowed to input data; the data comes from the rule file. It is typically used for opening balances.
Switch Sign for Flow: used for cash-flow statements and working-capital calculations.
For example: if you have a debtors account and it increases or decreases, selecting Switch Sign for Flow reverses the sign behaviour (+ to -).
Switch Type for Flow: if you select Switch Type for Flow, it changes the behaviour of the account type.
For example: suppose a wages account has 100 in January, 100 in February and 100 in March. Viewed as YTD, the values display as 100, 200 and 300. If you select Switch Type for Flow for this account, it behaves like a balance-sheet account instead: 100, 100 and 100 at YTD.
(In rules such a custom flag can be tested, e.g. HS.C1.UD1("XXX") = True.)
Submission Group: submission groups can be defined from 0 to 99. If you set a member's submission group to 0, security and the reporting cycle do not apply to that member; users can load and see its data even if process management has not been started.
If you have two phases, we first need to start "First Pass" for the first phase and then move to the next phase.
If a consolidation takes four hours to finish and a user wants to load data in the meantime, we would have to run the consolidation again, which takes even more time; this is the reason we define submission groups and phases.
Account Type
Revenue: income-statement account (credit nature); accumulates at YTD and translates at the flow rate.
Expense: income-statement account (debit nature); accumulates at YTD and translates at the flow rate.
Asset: balance-sheet account (debit nature); does not accumulate at YTD and translates at the balance rate.
Liability: balance-sheet account (credit nature); does not accumulate at YTD and translates at the balance rate.
Flow: used for cash-flow statements, and can be used for income statements as well. If the account type is "Flow", those accounts are translated with currency rates and accumulate at YTD values.
Balance: similar to Flow accounts, but these accounts do not accumulate at YTD values.
Group Label: used for headings; these accounts have no accounting behaviour. If you have accounts that you do not want to roll up, you can set the account type to "Group Label".
Balance Recurring: for example, take the Singapore location, where the government maintains one common tax rate for the entire year. For accounts of this kind we select account type "Balance Recurring". Once you enter the tax rate in January it applies to all months, and if you enter a different rate mid-year it applies from that month to the end of the year.
Dynamic: used for ratio analysis; these accounts have no aggregation behaviour (no addition or subtraction).
Difference between a calculation and a dynamic rule: for a calculation you need to right-click and choose Calculate, but for a dynamic rule there is no such step; it is applied on the fly.
Is Calculated: if an account is flagged Is Calculated, you are not allowed to input data; these accounts get their data from rules.
IsConsolidated: if you select IsConsolidated, the account's data rolls up to its parent account member; if you do not select it, the data does not roll up at the parent level.
IsICP: defines whether the account is used for intercompany elimination.
Plug Account: a plug account stores the difference between two individual transactions. A plug account's default parent should be #root, or we can define a hierarchy if there is more than one.
Important: during the ICP elimination process, the two accounts must share the same plug account; if you do not assign the proper plug account, the elimination will not happen correctly.
Custom Top Member: defines the relationship between an account and the custom dimensions; it helps you with journal postings.
XBRL: eXtensible Business Reporting Language; we use it with UD4.
Enable Custom Aggregation: used for custom dimensions to roll up data. If you select custom aggregation, the custom dimension members' data rolls up to the custom parent member.
Calc Attribute: for example, if a cell gets its data from the rule file, right-clicking the cell and choosing Cell Information displays the rule logic.
ICP Top Member: similar to Custom Top Member, but it defines the relationship between the account and ICP; using this attribute we can restrict [ICP None]. If you set the ICP Top Member to Bangalore, the user is no longer allowed to select [ICP None].
Default Currency: here we define the ultimate parent currency; it is the target currency.
Default Rate for Balance Accounts: defines which rate we need to use when converting balance-sheet accounts (opening balance).
Default Rate for Flow Accounts: defines which rate we need to use for income-statement accounts (closing rate).
Use PVA for Balance Accounts & Use PVA for Flow Accounts: these select the translation method. Typically all balance accounts use the VA method and all income-statement accounts use the PVA method.
Value Added (VA) method: a straightforward method; it takes the current month's YTD value and multiplies it by the current month's currency rate.
Periodic Value Added (PVA) method: not as direct; it takes the current month's YTD value, deducts the prior month's YTD value, and multiplies the remaining periodic amount by the current month's currency rate.
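The difference between the two methods can be seen with a small worked example (all figures and rates below are made up for illustration):

```vb
' Illustrative numbers only:
' Feb local YTD = 200, Mar local YTD = 300
' Feb rate = 1.12, Mar rate = 1.10
' Translated Feb YTD = 200 * 1.12 = 224
'
' VA  (Value Added):          Mar translated YTD = 300 * 1.10      = 330
' PVA (Periodic Value Added): Mar periodic       = 300 - 200       = 100
'                             Mar translated     = 100 * 1.10      = 110
'                             Mar translated YTD = 224 + 110       = 334
```

So under PVA each period is translated at its own month's rate and the translated amounts accumulate, while under VA the whole YTD balance is retranslated at the latest rate.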
ICP Aggregation Weight: this attribute defines how ICP data rolls up to ICP Top. If you set "1", 100% of the data rolls up to ICP Top; if you set 0.5, 50% rolls up. The ICP aggregation weight can be between -1 and 1.
Default Value for Active: if you set "1", all entities are in active status; if you set "0", all entities are in inactive status.
If an entity is closed down because of losses, we cannot delete it from the metadata because we have past data for it; in that case we deactivate the entity in Ownership Management and write a NoInput rule to restrict it.
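The NoInput restriction for such a deactivated entity might look like this minimal sketch (the entity name is an assumption):

```vb
Sub NoInput()
    ' Block all data entry for a closed entity that still holds history.
    HS.NoInput "E#ClosedSub"
End Sub
```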
Validation Account: here we define the validation account name; these are validation checks for the data, and we need to write the corresponding logic in the rule file. If the validation account is not equal to zero, you are not allowed to close the reporting cycle.
Consolidation Rules: if you set "Yes", we need to load a consolidation rule; if you set "No", the default consolidation is used.
Org by Period: this attribute defines whether a new consolidation structure can coexist with a past consolidation structure.
For example: suppose entity AB is associated with both Parent A and Parent B, and the requirement is that the first six months of data should consolidate to Parent A and the next six months to Parent B. We can define these structures with the help of Ownership Management, provided Org by Period is selected in the Application Settings.
Parent: if you set this restriction, it limits the Value dimension members; users can see data from <Entity Currency> (EC) through <Parent Curr Adjs> (PCA), but they cannot see data in [Parent], [Parent Adjs] or [Parent Total].
Use Security for Accounts, Entities, ICP, Custom and Scenario: here we select which dimensions security applies to.
Enable Metadata Security Filtering: if you enable this attribute, users only get access to the members they have rights to; the remaining metadata members are not visible to them.
Maximum Cell Text Size: here we define the size of cell text (e.g. 8,000 or 9,000 characters).
Maximum Document Size: here we can set -1 for an unlimited document size.
Use Submissions & Use Submissions for Accounts & Use Submissions for Custom: if you have submission groups in the Account or Custom dimensions, select this attribute.
FDM Application Name: optional, but when we are doing a migration it helps with taking a backup.
Scenario Dimension
Compared to all other dimensions, the Scenario dimension does not have the system-defined member [None].
Examples: Actual, Budget, Forecast, Statutory.
Default Frequency: defines the frequency of data loading; we can set YTD or MTD (periodic).
Zero View for Adj and Zero View for Non-Adj: these attributes determine how missing data should be interpreted — as zero in YTD view or as zero in periodic view.
For example, suppose we have 100 in January and we look at the data in February:
1. If the scenario Zero View is YTD and the view is YTD, the value is 0.
2. If the scenario Zero View is Periodic and the view is Periodic, the value is 0.
3. If the scenario Zero View is YTD and the view is Periodic, the value is -100.
4. If the scenario Zero View is Periodic and the view is YTD, the value is 100.
Zero View for Adj works the same way but applies when viewing journal (adjustment) data, while Zero View for Non-Adj applies to regular loaded data.
Consolidate YTD: if enabled, YTD balances are consolidated first, and at the next level periodic values are consolidated.
User Defined: a VB string attribute used in member lists and the rule file.
Supports Process Management: if you set "Yes", this scenario supports process management; if you set "No", it does not. If you set "A", it supports process management with email alerting.
Maximum Review Level: defines the number of review levels; the default is 3, and it can be set from 1 to 10.
Use Line Items: if you have line-item detail accounts, select this option. Line-item cells appear in green and behave like parents. If you enable line items on an existing account or scenario, you are not allowed to load the metadata, so in that case we need to create a new member.
Enable Data Audit: if you enable this attribute, we are able to see the data-load audit trail.
Value Dimension
The Value dimension is not used for a single reason; there are three reasons we use it:
1. It is used to load and identify data (data and adjustments).
2. It is used to translate the local currency to the parent currency.
3. It is used to maintain all the foreign currencies the application supports.
Members:
<Parent Currency>: used to translate data from the local currency to the parent currency, because the parent has a different currency from the child.
<Parent Curr Adjs>: used to post journals in the parent currency; these adjustments are linked to the currency, not to the parent entity.
[Parent]: similar to <Parent Curr Adjs>, but <Parent Curr Adjs> is linked to the currency while [Parent] is linked to a particular node.
[Parent Adjs]: used to post journals on a specific node; these adjustments are available only for parent entities where the attribute "Allow Adjustments from Children" is selected in the Entity dimension.
[Proportion]: depending on the set of consolidation rules, this member stores the parent total or a proportion of the parent total. If you have an 80% consolidation in Ownership Management, this member stores only 80% of the data. If you are using default consolidation, then PCON - POWN = PMIN, and PMIN should be zero; only then will the 80% consolidation work.
[Elimination]: stores the intercompany elimination data. Elimination happens after proportion, because in the consolidation rule we call the elimination subroutine.
[Contribution], [Contribution Adjs] & [Contribution Total]: I am not familiar with these members.
Exchange Rate Grid POV
1. In the rows we need Account and Custom1; the source currency is selected from Custom1.
2. In the columns we need Custom2; here we select the target currency.
3. If you take the custom dimensions in the reverse order, the source currency is divided by the target currency instead of multiplied.
4. The remaining dimensions — Entity, ICP, Custom3 and Custom4 — should be [None].
Triangulation is nothing but translating the source currency to another currency via the target currency. First, HFM translates the source currency to the target currency, multiplying by the rate; at the next level, the target currency is translated to the other currency in the reverse manner, dividing by the other currency's rate.
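A small worked example of the triangulation described above (the currencies and rates are made up for illustration):

```vb
' Translate INR -> EUR via the target (application) currency USD.
' Assumed rates: INR->USD = 0.012 (multiply), EUR->USD = 1.10 (divide).
'
' Step 1: source -> target (multiply): 1,000 INR * 0.012 = 12 USD
' Step 2: target -> other  (divide):   12 USD / 1.10     = 10.91 EUR (approx.)
```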
Data Form vs Data Grid:
3. We can link one form to another form; we cannot link grids.
4. We can make changes at the scripting level in a form; there is no scripting concept in a data grid.
5. In a data form we can see data and text; we cannot see text in a data grid.
6. In a form we can make simple math calculations with the help of the SCalc function; we cannot make calculations in a grid.
9. Forms have additional row and column properties (Add Member, cell text, style, colour, etc.); grids do not have those row and column properties.
Scale: it scales down the cell value. For example, if a cell value is 12,500 and you set the scale to 2, it displays the value as 125.
If you leave the property at its default, the form displays text; if you set it blank, the form displays data.
1 = start column
1 = start row
SCalc(Col(1) - Col(2))
Link Form: here we define the name of another form to link with the current form; if you double-click the chain symbol, the form associated with the current form is displayed.
Add Member: for example, if you have a fixed-asset hierarchy and you want to see some additional members in the same form, you can use the Add Member property.
Business Rules:
1. NoInput
2. Translation
3. Dynamic
4. Consolidation
5. Calculation
6. Validation
7. Transactions
8. Allocation
9. Input
NoInput: used to restrict dimension members from data entry. Typically we use the NoInput rule for opening balances, though a similar effect can be achieved using "Is Calculated" on the custom dimensions.
From my experience, I used the NoInput rule to restrict some entities, and to restrict all entities except [None] for entering exchange rates, because there is a chance users will select an entity when entering exchange rates. We could instead have restricted the Entity dimension in the data form, but we wrote a NoInput rule instead.
Difference between "Is Calculated" and NoInput: if you set "Is Calculated" on an account, no customisation is possible and you cannot enter data manually; those accounts get their data from rules. Using a NoInput rule we can customise the restriction for a particular POV.
Rule 1:
Sub NoInput()
    HS.NoInput "E#Ban.A#Defftax"
End Sub
Rule 2:
Sub NoInput()
    Dim Eli
    Dim i
    Eli = HS.Entity.List("Group", "[Base]")
    For i = LBound(Eli) To UBound(Eli)
        HS.NoInput "E#" & Eli(i) & ".A#Defftax"
    Next
End Sub
Note: when you concatenate a variable with a function string, you have to write it as " & Eli(i) & ".
Dynamic Rule:
When writing a dynamic rule we need to cross-check that the particular account's type is "Dynamic", and those accounts must be base-level members. Dynamic calculations are applied on the fly: after loading the rule file there is no need to run Calculate or Consolidate separately. Basically we use dynamic rules for ratio analysis, because these accounts have no aggregation behaviour (addition and subtraction).
Rule:
Sub Dynamic()
    If HS.Entity.Member = "Bangalore" Then
        ' dynamic ratio logic goes here
    End If
End Sub
Transactions Rule:
It is used for ICP transactions. When we use the traditional ICP data load, we need to tell the system that these ICP accounts support transactions.
Sub Transactions()
    HS.SupportsTran "A#SalesIC"
    HS.SupportsTran "A#WagesIC"
End Sub
Translation Rule:
The application settings Default Rate for Balance Accounts / Default Rate for Flow Accounts and Use PVA for Balance Accounts / Use PVA for Flow Accounts drive the default translation. To my knowledge, all balance-sheet accounts pick their rate from the default balance rate and all P&L accounts pick theirs from the default flow rate. So if you have a requirement that certain accounts should be converted with particular rates — either custom or historical — we need to write a rule. In my experience, all balance-sheet accounts use the VA method and all P&L accounts use the PVA method.
We can use the function HS.GetRate to get the rate for a given combination.
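As a rough sketch of an override translation rule of the kind described above (the account names, the HistRate rate account, and this particular use of HS.GetRate and HS.Trans are assumptions, not the project's actual rule):

```vb
Sub Translate()
    Dim dRate
    ' Pick up a historical rate stored on a rate account for this POV.
    dRate = HS.GetRate("A#HistRate")
    ' Translate share capital at that rate instead of the default
    ' balance-sheet rate.
    HS.Trans "A#ShareCapital", "A#ShareCapital", dRate, ""
End Sub
```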
Allocation Rule: used to move data from one POV to another POV; it is mandatory to load the member list file. After loading the rule file, first run Calculate and then Allocate.
Rule:
Sub Allocate()
    ' allocation logic goes here
End Sub
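For illustration, a simple allocation using HS.Exp might look like this (the account names and the 40% driver are assumptions):

```vb
Sub Allocate()
    ' Move 40% of overhead into an allocated-cost account on the same
    ' entity (only Account/ICP/Custom may appear on the destination side).
    HS.Exp "A#AllocOverhead = A#Overhead * 0.4"
End Sub
```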
Consolidation:
If you want 100% consolidation there is no need to write a rule; if you want some customisation — say, consolidating certain combinations but not others — then we write a consolidation rule.
1. Before writing the rule, first cross-check whether the Consolidation Rules attribute is set to "Yes" or "No" in the Application Settings.
OpenDataUnit: used to get the data for a POV; this function retrieves the data for every valid intersection of the Account, Custom and ICP dimensions — in other words, the subcube.
lNumItems = MyDataUnit.GetNumItems
GetItem: this function retrieves the data for the first record of the data unit, and so on, for consolidation.
IsConsolidated: this function checks for accounts where the "IsConsolidated" attribute is selected.
POwn: this function pulls the percentage of ownership for a particular entity.
PCon: a very important function in the consolidation rule, because it determines the data kept in [Proportion] and [Elimination] of the Value dimension.
Call HS.Con("", dPOwn, "")
If you use default consolidation and you want to run an 80% consolidation, you need to set both PCon and POwn to 80%; only then will it work. If you are using the routine-approach consolidation and you set POwn to 80%, the remaining 20% goes to PMin, which is nothing but the non-controlling interest.
Sub Consolidate()
    Dim MyDataUnit
    Dim lNumItems
    Dim i
    Dim dPOwn
    Set MyDataUnit = HS.OpenDataUnit("")
    dPOwn = HS.Node.POwn("")
    lNumItems = MyDataUnit.GetNumItems
    For i = 0 To lNumItems - 1
        ' process each record of the data unit here
    Next
End Sub
****** In the consolidation rule we use POwn only, and in the elimination rule we use PCon ******
Elimination rule: the elimination rule is nothing but an extension of the consolidation rule, because we call the elimination routine from within the consolidation rule; as per the Value dimension, the application first stores data in [Proportion] and then in [Elimination].
We have two ways to load ICP data: the traditional data-load method and the routine-approach method. If you are using the traditional data-load method, we need to write a Sub Transactions rule in the rule file:
Sub Transactions()
    HS.SupportsTran "A#ICPSales"
End Sub
1. First we need to identify the ICP entities and ICP accounts with the help of the IsICP attribute.
2. Define the plug account. (A plug account is used to identify the difference between two individual transactions; one transaction should be a debit and the other a credit, and those two accounts should share one common plug account. You can use any account type for a plug account, but if you want to see the difference amount it is better to give it the same account type as the transactions.) If you have only one plug account, its default parent should be #root; if you have more than one, it is better to create a separate hierarchy.
In our project we are not using the traditional ICP module; we are just using the routine approach.
Sub Eliminate()
    Dim strPlug
    Dim dPCon
    strPlug = HS.Account.PlugAcct(strAccount)
    dPCon = HS.Node.PCon("")
    ' elimination postings go here
End Sub
Calculation Rules:
Recently we have not received any rule requirements, but earlier on the same project I worked on creating some typical calculation rules: currency translation adjustments (CTA), carry-forward rules, moving-balance rules and ICP adjustment rules. We write each rule at the end of the file and call the subroutine from Sub Calculate.
Procedure for rules: when we receive any rule requirement, first we need to understand the logic from the client or your team lead.
1. When writing any rule, try to keep the Exp function small; I mean, try to declare variables in the rule.
2. Next, select the proper intersection (when specifying the intersection, follow the dimension order for better performance — declare the Value dimension first, then Scenario, then Entity).
3. You should not use the Scenario, Year, Period and Entity dimensions in the destination of the Exp function; only the Account, ICP and Custom dimensions can appear in the destination POV.
We had a requirement (I am not sure of the exact accounts): the client told me that certain accounts should be carried forward to the next year, so I suggested flagging them by setting UD1 to "RF", and I wrote a rule like:
For i = LBound(Ali) To UBound(Ali)
    HS.Exp "A#" & Ali(i) & ".Y#Cur = A#" & Ali(i) & ".P#Last.Y#Prior"
Next
End Sub
Currency Adjustment rule: I am not sure about the account names, but some income-statement accounts are associated with balance-sheet accounts to tally the trial balance, and we wrote a rule because those balance-sheet accounts should be translated with the opening rate when data moves from the income statement to the balance sheet.
Sub Calculate()
    ' CTA logic goes here
End Sub
I have also written some other calculation rules and ICP adjustment rules, and rules moving balances from several accounts to one target account.
Journals
Journals are nothing but adjustments of data. I don't have any hands-on experience posting journals; it is entirely an end-user job.
1. Before working with journals, first we need to open a period.
2. We need to cross-check whether the attribute "Allow Adjustments" or "Allow Adjustments from Children" is selected.
Why we post journals: if your balance sheet does not match because of translation issues, we need to post journals to suspense accounts; and if you have any extraordinary items, we need to post journals for them.
Where we post journals:
For example, we have two currencies (USD and INR), where USD is the ultimate parent currency. If you find mismatched data at the USD entity level, we need to post the journals in <Entity Curr Adjs>; if you find mismatched data at an INR entity, we need to post the journals in <Parent Curr Adjs>, because the INR data has to be translated to the parent.
Standard Templates: normal journals that are used frequently. If you create standard templates, you can select them from Manage Journals, and a journal posted from a standard template comes in with Working status. We can also create standard templates for auto-reversal journals.
Recurring Journals: also normal journal entries used frequently, but with recurring templates we first need to "generate" the template, after which it can be selected from Manage Journals; a journal posted from a recurring template comes in with Approved status. A user who wants to post journals from recurring templates must have the "recurring generate" access in Shared Services.
Auto-Reversal Journals: you can create an auto-reversal journal that you want reversed in the next period. For example, when you create and post an auto-reversal journal debiting sales in January, the system creates a credit to sales in February.
If you post journals in <Parent Curr Adjs>, the adjustments apply to the whole hierarchy, which means they apply to alternative hierarchies as well; if you post journals in [Parent Adjs], the adjustments apply only to that particular hierarchy.
Process Management (Reporting Cycle)
1. At the beginning of every month, on the 1st and 2nd working days, our HFM admin starts the reporting cycle and it moves to "First Pass".
2. Next we tell the users to load data; in our project, users load data from the 3rd to the 10th working day of each month.
3. Once the data load is complete, the submitters and reviewers review the data; once it moves from Review Level 1 to Review Level 2, Reviewer 1 can no longer access the data.
4. Submit
5. Approve
6. Publish
7. Once the data is published, our admin locks the entities so that no user can load data.
If any user wants to load data even after publishing, the data can still be loaded with admin access, so there is no need to go back through the cycle.
EPMA
In EPMA we have an option to create an application with local dimensions; it gives you the standard dimensions and also allows you to create custom dimensions. We also have an option to create a blank application that contains no dimensions; there we need to import an .ads file.
EPMA vs Classic:
1. In EPMA we need to Validate and Deploy for changes to take effect; in Classic we need to reload the changed metadata in Workspace.
3. In EPMA we can share dimension members across Hyperion products (Planning & Essbase); in Classic we cannot share metadata members with other Hyperion products.
4. In EPMA we first build the metadata and then validate the application; in Classic we first create the application and then build the metadata according to the custom dimensions.
Smart View:
It is used to view and modify data. In Smart View we mostly use the build functions to create templates, and sometimes we create ad-hoc analyses to view and modify the data.
Cascade: used for comparing two Excel sheets, because when you click Cascade the same sheet opens in the same window.
Drill-Through: first place your cursor on the cell; when you click Drill-Through, the expectation is that it communicates with the source system and displays the data from where it originated.
**** In Smart View, if you want to load zeros, we need to enable Submit Zero in the Data options. In Smart View we can also see the reporting-cycle status and the calculation status. ****
Manage POV: used to set your POV as a favourite; we can also use that POV for all sheets.
Build Functions:
HsSetValue: used to submit data from Smart View to HFM; in this function we need the cell reference to pick up the cell and submit it.
HsGetSheetInfo: used to find the sheet type and the active connection, URL and application name.
**** I worked on creating BS-Cashflow templates, IS-Cashflow, Balance Sheet entity-wise, Income Statement entity-wise and others. ****
Utility
Earlier we used a utility for the migration from 11.1.1.3 to 11.1.2.3, because LCM there did not support copying data, only artifacts. In 11.1.2.3 we can use LCM to take a backup of the application.
Copy Application Utility: we mostly used the Copy Application utility to take a backup of the application and to move artifacts from Dev to Prod. The disadvantage is that this utility copies all artifacts except security, so we need to extract the security file and load it into Production separately.
LCM: in Shared Services, if you click on the application, we can select the artifacts and click Export; the export then appears under the application system, where we can download it. Otherwise, when you click Export, it is saved under Oracle → Middleware → user projects → import and export folder.
Member Lists
1. System-defined: when you load metadata into the application, the system creates default member lists automatically.
2. User-defined (scripted):
A. Sub EnumMemberLists: here we define how many member lists are required for each dimension.
Sub EnumMemberLists()
    If HS.Dimension = "Entity" Then
        HS.SetMemberLists Array("MyEntityList")
    End If
End Sub
B. Sub EnumMembersInList: here we define the members for each required member list.
Sub EnumMembersInList()
    If HS.Dimension = "Account" And HS.MemberListID = 1 Then
        HS.AddMemberToList "Bank"
        HS.AddMemberToList "Cash"
    End If
End Sub
*****: when you create a member list for the Entity dimension, we need to specify both the parent and the child entity, because there is a chance those entities also appear in alternative hierarchies.
Attribute checks that can be used when building lists:
HS.Account.IsConsolidated = True
HS.Account.IsICP = True
HS.Entity.UD1 = "RF"
HS.Entity.DefaultCurrency = "USD"
Task Flows:
A task flow is nothing but a sequence of tasks; we can create a task flow to run consolidation, translation and calculation without any manual intervention. You create a task flow, link a series of tasks, and specify the time to run each task; the stages specify the tasks and the links specify how the flow should proceed.
Go to the Administration tab → Taskflows; here we have two options (Manage and Status).
Manage: this is the place where you create a new task flow.
2. Submit
4. Process tab: select the application and select the activity the task flow is for (Allocation, Consolidation, Calculation, Translation, Load Journals, ICP).
5. Schedule tab: specify the time and date to run the scheduled event.
In my project we have three tasks, running at 8 am, 10 am and 11 am. I have come across projects using other tools for task automation (1. Star Command Center and 2. EPM Mastro).
Ownership Management:
This is the place where we can make a particular entity active or inactive, and where we maintain PCon and POwn. You have to define the consolidation methods when you build the metadata. In our project we maintain the methods Subsidiary, Joint Venture, Equity, Global and Holding.
HFM Architecture
Client tier: the client tier provides the user interface and the ability to communicate with the application tier; here we can maintain data and metadata.
Web tier: the HFM web client provides most of the functionality — data grids and data forms are available in the browser, with the exception of security administration.
Middle (application) tier: it holds the domain intelligence and the connection to the database.
Database tier: this tier holds a relational database and maintains the data and metadata.
Consolidation Types:
1. Consolidate All: it consolidates all children irrespective of data modifications.
2. Consolidate All With Data: it ignores the #missing cells and consolidates the data.
If an entity's Active flag is set to '0', its data will not roll up to the parents.
Financial Reporting (HFR) Architecture:
1. Client tier: in the client tier we have three systems (HFM, Workspace & Smart View).
2. Application tier: HFR does not work independently, so we need to pull data from an application, either HFM or Essbase.
Snapshot: if you save a report as a snapshot, it contains static information.
Example: if you take a snapshot today and run the same report after five days, the old values are still displayed.
Dynamic Selection: for example, if you select the base-level members or descendants of an account member, the selection is dynamic; if you make any metadata changes in HFM, those changes are reflected in the report.
Static Selection: it means you define the members manually; metadata changes are not reflected in the report.
1.User POV
2.GRID POV
1.User POV : when user trying to view the report in work space it will allow them for change
dimensions member in user pov befor excuting report.
2.Grid POV : coming to grid POV first we need to enable grid pov in General properties.
If u enable grid pov the pov it will come and sit into the grid, it is going to over ride
the user POV. it means it is going to restrict dimension membres
For exmaple if u want see only dimension members (Scnario : Actual, Year : 2015) yes you can
restrict with help of grid Pov.
Set Up User Pov : if you have requirment like user want see one or more dimension members in
user Pov it means if u want see only 4 Entites in Entity Dimension , You can do this activity with
help of Setup User POV.
For Example i want to display entity dimension in user POV and i want to give access to user
for 2 or 3 entites access
It means you are going to restrict user POV with certain members.
Current POV: by default, users are not allowed to change dimension members that are placed in the rows and columns.
For example, if a report has Account in the rows and Period in the columns, and the requirement is to let users change the Period member, you can use the current POV; the Period dimension then moves into the user POV.
Prompt: a prompt is similar to the current POV, but it asks you to change the dimension member at run time (before the report executes); it does not sit in the user POV.
Functions:
1. Match
2. Range
3. Period Offset
4. Relative Member
5. Expression
Match: for example, if you want the report to show only the dimension members in a hierarchy whose names start with the letter 'S', the Match function retrieves those members.
Range: the Range function works only on the Period dimension. If the requirement is to see data from January to August, you can either pick the periods manually or use this function to give a range of periods.
Period Offset: this is also used for periods. For example, if you want to see data for the current month, the prior month, and the next month, this function achieves that, and it is more dynamic than selecting the months manually.
To make the report fully dynamic, you can combine the current POV or a prompt with the Period Offset function. For example, if you use a prompt, the report asks for an input period at run time; if the user selects August, the report displays July, August, and September data.
Relative Member: similar to Period Offset, but this function is used for the Year dimension.
Expression: ideal when, for example, you have selected an entire hierarchy for one account member in the Account dimension but do not want to see a couple of members that are not required; using an expression you can hide them.
Navigate: go to the view on the right side of the member selection, select Advanced, and using the 'AND' operator you can hide the members that are not required in the report.
Import and Export: import and export can be performed in Workspace; right-click on a report and you can export it (it is pretty easy).
Import: when you import the same report from development to production, we need to change the database connection (Explore > Tools > Change Database Connection; provide the development DB connection and the production DB connection, then select OK).
Books and Batches: books and batches are collections of reports. Books are used for execution; batches are used to schedule reports.
Creating books: go to File > New > Document and select "Collect reports into a book". We can add any number of reports to a book. Right-clicking on a book gives options to view it (HTML Preview, PDF Preview, Edit, or the entire book as a PDF).
Creating batches: creating a batch is similar to creating a book. After creating the batch, go to Navigate > Schedule > Batch Scheduler.
Click New Scheduled Batch, select the batch in the reporting folder, and specify the date and time to run it. There is also an option to generate a PDF and a snapshot for each report in the respective folder.
Interview purpose:
In our project there were a lot of requirements to make changes in existing data forms and reports; we have approximately 40 to 50 data forms and reports whose layouts did not match the production environment, and in addition we had to add new lines, custom headings, formulas, some functions, and other items.
We also had a requirement to create new Smart View templates for balance sheet - cash flow and income statement - cash flow, as well as some other templates.
We generated one report in production that we needed to recreate exactly as in test. When we looked at the data in the report, we found a negative value for an account that was not listed in test, because one of the users had posted journals in production instead of test.
When we found the mismatched value, I first created an ad hoc grid with the Value dimension, where I found the mismatch at <Entity Curr Adjs>. I then extracted the data at <Entity Curr Adjs> for the same intersection in test and production, and immediately sent emails to the client.
We had one more small requirement to create journal templates for user needs.
FDMEE
Tell me about your experience with FDMEE
1. Worked on creating locations, import formats, and mappings for dimensions
2. Worked on loading ownership data
3. Worked on loading exchange rate data
4. I can load journals through FDMEE
5. Worked on batch loader setup
6. Worked on SQL database tables for error handling
7. Involved in creating user security
8. Wrote 2 custom scripts (before-batch and after-batch); I also have knowledge of import scripts, but I did not write one in our project.
The client has 3 sources located on different networks, so we were unable to integrate them through one database network; the client also uses an older version of PeopleSoft, and that older version does not support direct integration with FDMEE.
The ERP people extract files and keep them in a common network folder. The network team uses an SFTP (Secure File Transfer Protocol) tool, and they wrote a Windows batch script to get the files from the network folder to our FDMEE source folder. We then have a custom script (a before-batch script) to get the files from the source folder to the open batch folder.
As for renaming the files, earlier we used a Windows script, but currently we rename the files with a Jython before-batch script.
3. How do you handle bulk files in a batch script without a Jython script?
When we receive source files from different ERPs, we first get all the file names using a Windows command in the command prompt. Once we have the file names, we copy and paste them into our template: the 1st column is the source file name, the 2nd is the sequence, the 3rd is the location name, the 4th is the period, and the 5th builds the target file name. The final column contains a formula that combines each source file with its target file naming convention. We copy this column into the command prompt, and it renames all the source files to the proper target naming convention; there is also an option to change the file format.
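The template workflow above can be sketched in a few lines of Python. The `FileID~Location~Category~Period~RR.txt` open-batch naming convention and the sample file names below are assumptions for illustration; the exact convention depends on your FDMEE version and batch setup.

```python
# Sketch of the bulk-rename template: build target file names from the
# template columns (sequence, location, period) and emit copy commands.
def target_file_name(seq, location, category, period):
    """Assumed open-batch name: FileID~Location~Category~Period~RR.txt."""
    return "%s~%s~%s~%s~RR.txt" % (seq, location, category, period)

# Hypothetical rows from the template spreadsheet described above
source_files = [
    ("GL_India_Jan.txt", "1", "INDIA_LOC", "Actual", "Jan-2015"),
    ("GL_US_Jan.txt",    "2", "US_LOC",    "Actual", "Jan-2015"),
]

# Emit the copy commands you would paste into the command prompt
for src, seq, loc, cat, per in source_files:
    print('copy "%s" "%s"' % (src, target_file_name(seq, loc, cat, per)))
```

Each printed line is one command-prompt `copy`, which is what the final template column produces in the workflow described above.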
Fill: it is used to add additional characters to the source value. For example, if the file has a source account named 100 and you set Fill = 00000, it will be displayed as 10000.
DRCR Split: when you load a general ledger, the file has 2 amount columns (debit and credit), but the import format has only one amount column. If you set DRCR Split in the import format, it ignores the credit value when there is a debit value in the column, and ignores the debit value when there is a credit value, because any account carries a value on either the debit or the credit side.
This expression works only for Excel files, and the delimiter should be fixed.
Flip Sign: we receive negative values from the source, and in HFM debit and credit are driven by the account type. If you select the sign change for an account, the value is multiplied by -1.
NZP: it is nothing but disabling zero suppression. To load zeros from the source, first set "enable zero loads" in the application options under the target application, and then define "NZP" in the amount column of the import format. You are then able to load zero values.
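As a rough illustration, the Fill, DRCR Split, and Flip Sign behaviours described above can be modelled as small Python functions. This is a sketch only: the padding rule, the zero check, and the sign handling for credits are assumptions, not FDMEE's actual implementation.

```python
def fill(value, pattern):
    """Fill expression sketch: pad the source member out to the pattern
    length, e.g. Fill = 00000 turns account '100' into '10000'."""
    return (value + pattern)[:len(pattern)]

def drcr_split(debit, credit):
    """DRCR Split sketch: collapse separate debit/credit columns into one
    amount column; an account carries a value on only one side."""
    if debit not in ("", "0.00"):
        return debit            # debit present: ignore the credit column
    return "-" + credit         # credit present: load it with a minus sign

def flip_sign(amount):
    """Flip Sign sketch: multiply the source amount by -1."""
    return str(float(amount) * -1)
```

For example, `fill("100", "00000")` gives `"10000"`, matching the account example in the notes.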
2. FDMEE requires a separate location for loading ownership data, because when you create the location the Value dimension should be "None"; it also requires a separate import format.
3. In the import format we have to hard-code the custom dimensions; these dimensions should be "None".
4. For ownership data, the mapping is created only for the parent entity.
5. For the ICP dimension we need to select the children that fall under the respective parent.
For example, if you map the parent entity India, you have to select the children Hyderabad, Bangalore, etc.
6. Create a data load rule for this location and select the import format.
In 11.1.2.3 we used the Copy Application utility to take a template of the ownership data; we made changes in the existing data and loaded it into HFM through FDMEE.
5. How to load exchange rates through FDMEE
FDMEE does not recognize system-defined members (for example, currencies), so we need to set "Check Intersection" to 'No'.
1. FDMEE requires a separate location and a separate import format.
2. In the import format we have to hard-code the dimensions C3, C4, Entity, and ICP.
4. Create a data load rule for the location and assign the import format.
Enter a label ID (it should be unique), and select the location, period, and scenario.
Next, go to the workbench and click Load Journal; after uploading the file, select Check and then Post.
If you make any changes in Excel before loading the file into FDMEE, you need to set the named range under the Formulas tab; those changes will then apply in FDMEE.
In the file, positive values go to the debit side of the journal, and negative values go to the credit side.
Security:
1. Shared Services: in Shared Services we give security roles to users or location groups.
Here we can give the roles Create Integration, Drill Through, and Intermediate 2-9.
Intermediate 2-9 are not mandatory roles; we need to define these roles in FDMEE under the security settings (data load rule, import format, batches, etc.).
Location-specific: in FDMEE, if you click "Maintain User Group", it creates groups in Shared Services according to the locations; there in Shared Services we can give security roles and assign the users to the groups.
1. Batch tables (batch group, batches, batch load ID audit): when you load data through the batch setup there is a disadvantage: the status fish icons turn orange, so we do not know whether the batch loaded successfully or not. If you have an after-batch script, you will get an email with the status of the batch load.
2. TDATAMAP: it is used to identify the mappings for all dimensions according to location. If you run SELECT * FROM TDATAMAP WHERE PARTITIONKEY = 1, it displays all the dimension mappings for that location; the partition key is nothing but the location ID.
3. TDATAMAPSEG: it is similar to TDATAMAP, but if you make any changes to an existing mapping, this table shows the status of the changes with the time and date.
4. TDATASEG: we use the TDATASEG table the most, because it is used to identify validation issues when you query it. For example, if you have a validation issue in Account or another dimension, we can use a 'like' mapping with source = * and target = Suspense; validation then succeeds and the export fails. In SQL it is difficult to find NULL values, so in this case we use a named member to catch the validation failures.
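The TDATASEG pattern above can be sketched against an in-memory SQLite table standing in for the FDMEE repository. The simplified columns (ACCOUNT for the source member, ACCOUNTX for the mapped target) and the sample rows are assumptions for illustration; the real table has many more columns.

```python
import sqlite3

# Stand-in for the FDMEE repository: a tiny TDATASEG-like table
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE TDATASEG (
    LOADID INTEGER, ACCOUNT TEXT, ACCOUNTX TEXT, AMOUNT REAL)""")
rows = [
    (101, "1000", "Sales",    500.0),
    (101, "2000", "Suspense", 250.0),   # caught by the like-* mapping
    (101, "3000", None,       125.0),   # no mapping at all
]
conn.executemany("INSERT INTO TDATASEG VALUES (?,?,?,?)", rows)

# Rows whose target fell through to the suspense member or stayed NULL:
# these are the source accounts that still need proper mappings
bad = conn.execute("""
    SELECT ACCOUNT FROM TDATASEG
    WHERE LOADID = 101 AND (ACCOUNTX = 'Suspense' OR ACCOUNTX IS NULL)
""").fetchall()
print([r[0] for r in bad])
```

The `ACCOUNTX = 'Suspense'` filter is the "named member" trick from the notes: mapping unmatched sources to a visible suspense member is easier to query than hunting for NULLs.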
Reasons for issues in Import, Validate, and Export:
1. Import: if you have any issues at import, there are a few possible reasons.
2. Validation: if you do not have a proper mapping to the target dimensions, you will see validation issues:
1. Invalid intersection
2. There is no such target member in HFM
Types of mapping:
Between: it is nothing but a range mapping. If you specify 100,199 mapped to the target Sales, then all members between these two will map to that one target account.
In: the In mapping is used to select members individually and map them to one target member.
Like: here we use wildcard characters (* and ?), but I used only the * wildcard.
If you specify 1*, then all members starting with '1' will map to one target account.
Example: if Custom 1 & Custom 2 = [None], it should be mapped to Land, irrespective of the account value in the source.
Select the proper dimension when you create a multi-dimension mapping: click Add, select the target dimension value, and select the dimension.
Scenario: if one account member is mapped in both an Explicit and a Between mapping, FDMEE gives first preference to the Explicit mapping, because the system follows the mapping sequence (Explicit, Between, In & Like).
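The precedence scenario above can be sketched as a small Python lookup. The mapping entries and member names are invented for illustration; only the evaluation order (Explicit, then Between, then In, then Like) comes from the notes.

```python
import fnmatch

# Hypothetical mapping tables, one per mapping type
explicit = {"1100": "Cash"}
between  = [(("100", "199"), "Sales")]      # inclusive range bounds
in_map   = [({"2100", "2300"}, "Debtors")]  # hand-picked members
like     = [("1*", "Assets")]               # wildcard patterns

def map_member(source):
    """Resolve a source member using the FDMEE precedence order."""
    if source in explicit:                  # 1. Explicit wins outright
        return explicit[source]
    for (lo, hi), target in between:        # 2. Between (string compare
        if lo <= source <= hi:              #    here, a simplification)
            return target
    for members, target in in_map:          # 3. In
        if source in members:
            return target
    for pattern, target in like:            # 4. Like (wildcards)
        if fnmatch.fnmatch(source, pattern):
            return target
    return None                             # unmapped: validation fails
```

For example, `map_member("1100")` returns `"Cash"` even though `1*` would also match, because the explicit entry is checked first.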
Jython:
In FDMEE we mostly use Jython scripting, and we can also use SQL scripts and VB scripts. We use SQL scripts for mapping purposes, because any mappings you create in FDMEE sit in the SQL TDATAMAP table, and performance-wise SQL scripts help. We can use SQL scripts for Between, In, and Like mappings under the data load mappings.
1. Event
2. Import
3. Custom
When we write a script we need to follow certain Jython rules:
1. Always use fdmAPI.logInfo to check whether a particular line executed or not.
2. Indentation (4 spaces).
3. I suggest writing the script in Notepad++ with the language set to Python.
Event script:
I worked on creating an email-alert script for after data load as an event script. We can use the same script for after validate too, so that if there are validation issues it sends emails to the client and our HFM admin. We also assigned the same script to the after-batch step under the batch setup.
1. import datetime
2. import time
3. import calendar
4. import smtplib
5. import httplib
6. import shutil
7. import codecs
And we used some fdmContext properties:
LOADID, FILENAME, PERIODNAME, LOCNAME
sender = "nagaraju.k@gmail.com"
receiver = "fdmadm2017@gmail.com"
message = "file loaded successfully"
try:
    smtpServer = smtplib.SMTP("smtp.gmail.com", 587)
    smtpServer.ehlo()
    smtpServer.starttls()
    smtpServer.login(sender, "Sairam123")
    smtpServer.sendmail(sender, receiver, message)
    fdmAPI.logInfo("Successfully sent email")
    smtpServer.quit()
except smtplib.SMTPException:
    fdmAPI.logInfo("Error: unable to send email")
import shutil
import codecs
import time
import datetime
from time import strptime
import calendar
import smtplib
import httplib
LoadID = str(fdmContext["LOADID"])
Loc_name = str(fdmContext["LOCNAME"])
Cat_name = str(fdmContext["CATNAME"])
Per_name = str(fdmContext["PERIODNAME"])
Filename = str(fdmContext["FILENAME"])
Coming to custom scripts, I worked on creating a script to copy files from the source folder to our open batch folder with the target file name. We also need to move the same source files to an archive folder, because if you do not move the files to the archive folder, they will run again and again in the next batch process. The files we receive are named with the location name, the delimiter '&', and the period and year according to the FDMEE period and year.
Script:
When writing the script, first we need to import the required libraries; here I imported:
import os
import shutil
Next we need to give the source folder path and the destination path; for the destination we should use fdmContext to make it dynamic (this helps when you migrate the scripts).
srcpath = 'C:\\Oracle\\FDMApp\\Sourcefiles'
Now use fdmAPI.logInfo if you want to know whether a particular line executed successfully or not:
fdmAPI.logInfo("***line4 destpath=" + destpath)
src_files = os.listdir(srcpath)
(this lists the files in the source folder, on-premise or on the same system)
Next we give the archive folder path and write an API log:
archivepath = 'C:\\Oracle\\FDMApp\\Sourcefiles\\Archive'
fdmAPI.logInfo("***line7 archivepath=" + archivepath)
Next we run a "for" loop to get the file names in the source folder; inside the for loop we have to give indentation (4 spaces):
for file_name in src_files:
    full_file_name = os.path.join(srcpath, file_name)
    if os.path.isfile(full_file_name):
After that we write the script to split the file name according to the source file delimiter:
        split_Name = file_name.split("&")
        Location_Name = split_Name[0]
        Month = split_Name[1]
After that we use "shutil" to copy and move the files:
        shutil.copy(full_file_name, destpath)
        shutil.move(full_file_name, archivepath)
Provide a new file name to get our target file naming convention as per the batch, then use the rename syntax (os.rename) to rename the target file.
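Putting the fragments above together, a self-contained sketch of the before-batch copy/rename flow might look like this. The folder layout, the `Location&Period` source naming, and the open-batch target name are assumptions taken from the notes, and fdmAPI.logInfo is replaced by print so the sketch runs standalone.

```python
import os
import shutil
import tempfile

def process_source_files(srcpath, destpath, archivepath):
    """Copy each source file to the open batch folder under its target
    name, then archive the original so it is not reprocessed."""
    for file_name in os.listdir(srcpath):
        full_file_name = os.path.join(srcpath, file_name)
        if not os.path.isfile(full_file_name):
            continue
        base = os.path.splitext(file_name)[0]
        split_name = base.split("&")            # e.g. INDIA_LOC&Jan-2015
        location_name, month = split_name[0], split_name[1]
        # Assumed open-batch convention: FileID~Location~Category~Period~RR
        target_name = "1~%s~Actual~%s~RR.txt" % (location_name, month)
        shutil.copy(full_file_name, os.path.join(destpath, target_name))
        shutil.move(full_file_name, archivepath)  # avoid re-running it
        print("processed " + file_name)

# Usage example in a throwaway directory tree
root = tempfile.mkdtemp()
for d in ("src", "openbatch", "archive"):
    os.mkdir(os.path.join(root, d))
open(os.path.join(root, "src", "INDIA_LOC&Jan-2015.txt"), "w").write("100,50\n")
process_source_files(os.path.join(root, "src"),
                     os.path.join(root, "openbatch"),
                     os.path.join(root, "archive"))
```

In the real script, the three paths would come from hard-coded server paths or fdmContext, and the prints would be fdmAPI.logInfo calls.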
Import script:
In our project I wrote a script to split dimension members, because we receive some source account names as a combination of Entity & Custom 1, and according to the import format we have to split them and map them to the target members.
When we write an import script we need to give it a name, and that name should be used in the function definition.
Syntax:
def Account(strField, strRecord):
strField: the field value that is cross-checked against the source file according to the import format
    return seglist[0].strip()
I wrote one more script to handle 2 amount columns in the source file: one of the users extracted 50 general ledgers, and those ledgers have 2 amount columns, but our import format has only one amount column.
If the user extracts the files in Excel format, we can load them using DRCR Split, but DRCR Split does not work for text documents.
You have to understand that any account will have a value on either the debit or the credit side.
Syntax:
    strDebit = strField.strip()
    seglist = strRecord.split(",")
    # split the whole record on "," to find the column with the credit values
    strCredit = seglist[5].strip()
    if strDebit != "0.00":
        glAmount = strDebit
    elif strCredit != "0.00":
        glAmount = "-" + strCredit
In the syntax above, if the credit value is not zero, it is loaded as a negative amount.
If you have blank fields in the source file, you will surely get import issues, so we need to fill these blank values with "0" or spaces; then we can use a "*" mapping mapped to the target "[None]". Otherwise we can use the "FillL" expression in the import format, or else write a Jython script.
Syntax:
def BlankValue(strField, strRecord):
    seglist = strField.strip()
    length = len(seglist)
    if length == 0:
        return "[None]"
    else:
        return seglist
Recently we received one more requirement to skip some entity values that are not required to be loaded into HFM.
In the source file we have 2100-210100, an entity name combined with an account name. In this situation the client told me to ignore the values of any entity whose name starts with 21.
Syntax:
    if strEntity.startswith("21"):
        return ""
    else:
        return strEntity
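Consolidating the import-script fragments above into one runnable sketch: the column positions, the entity delimiter, and the convention of returning an empty string to skip a row are assumptions for illustration; in a real FDMEE import script these functions receive strField and strRecord from the engine.

```python
def Amount(strField, strRecord):
    """One Amount field built from separate debit/credit columns
    (assumed to be columns 5 and 6 of a comma-delimited record)."""
    seglist = strRecord.split(",")
    strDebit = seglist[4].strip()
    strCredit = seglist[5].strip()
    if strDebit not in ("", "0.00"):
        return strDebit              # debit side loads as-is
    elif strCredit not in ("", "0.00"):
        return "-" + strCredit       # credit side loads negative
    return "0.00"

def Entity(strField, strRecord):
    """Split Entity-Account combinations and drop entities the client
    asked to ignore (names starting with 21)."""
    strEntity = strField.split("-")[0].strip()
    if strEntity.startswith("21"):
        return ""                    # assumed skip convention
    return strEntity

print(Amount("", "E1,100,C1,C2,250.00,0.00"))   # debit side
print(Entity("2100-210100", ""))                 # skipped entity
```

The sample record layout (`E1,100,C1,C2,250.00,0.00`) is invented; match the indexes to your own import format before using anything like this.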
FDMEE Integration
1. First we need to register a target application under Register and provide the source details; we have many sources (SAP, JD Edwards, and others).
2. Next, go to System Settings and provide a universal path for the application; when you create the folder, it creates the folders (data, inbox, and outbox).
3. In the same way, we need to provide an application root folder in the Application Settings as well.
4. Next, create the locations (here the locations are the source systems, either SAP or JD Edwards).
5. Next is period mapping: we have an option to define global mapping and source mapping. (If you create a global mapping, it is applicable to all applications registered in FDMEE; if you create a source mapping, it is applicable to a particular application.)
6. Category mapping (a category is nothing but the Scenario dimension mapping); in the same manner we can create global and source mappings.
7. Import format: the import format is nothing but the source-system record layout. Here we need to provide an import format name and the delimiter where required, and we create the import format for the file with the target dimensions.
8. Location: here we create a location, either a source or a target location; in our project we have 3 locations depending on where the data comes from.
9. Period mapping: we create a mapping for periods, either as a global or an application mapping.
(Global mapping means the mapping is applicable to all applications, where you have more than one profile.)
(Application mapping means we create a mapping for a particular application.)
10. Category mapping: a category is nothing but a scenario in HFM, so here we create mappings for the Actual and Forecast scenarios.
11. Mapping: here we define mappings for Account, Entity, Custom & ICP; the other dimensions are already covered by the Category (Scenario), Location (Value), and Period (Period and Year) mappings.
12. Data load rule: using data load rules we can give different import formats to one location; we can create many data load rules and assign different import formats.
11.1.2.3: here we need to install the ODI connector separately and integrate it with the ERP; it is a distributed component, which is Windows-based, so we cannot install 11.1.2.3 on Linux.
11.1.2.4: it is directly integrated with the ERP and completely Java-based, so we can install 11.1.2.4 on Linux; that is why we call it Exalytics.