
Datamart Configuration

2014-11-12
Murex Integration DMM Team

All intellectual property rights and other proprietary rights in and associated with the whole and every part of this
document (including all text, logos, graphics and images) shall at all times remain vested in Murex S.A.S. You shall do all
that is necessary to protect these rights, including but not limited to, taking all measures necessary to keep confidential
the information contained herein and/or notified by Murex S.A.S. and not, directly or indirectly, using or divulging, or
allowing to be used or divulged such information to or by any third party. In addition, you shall not reproduce, copy,
distribute, republish, download, display, post or transmit this document or any part thereof in any form or by any means
whatsoever. You may also not mirror any content contained herein on any other server. Nothing in this document or the
present notice shall be construed as conferring any license of any of Murex S.A.S.'s intellectual property rights, whether
by estoppel, implication or otherwise. Any unauthorised use of any content contained in this document may violate
copyright laws, trademark laws, the laws of privacy and publicity, and communications regulations and statutes. If you are
aware of any unauthorised use affecting our rights and interests in and associated with this document, you will
immediately notify Murex S.A.S.
Table of Contents
Datamart Configuration.......................................................................................................................................................1
1 - Overview..............................................................................................................................................................1
2 - Datamart functional architecture..........................................................................................................................1
3 - Dynamic tables....................................................................................................................................................2
4 - Datamart setup....................................................................................................................................................4
5 - Datamart feeders.................................................................................................................................................4
5.1 - Definition...................................................................................................................................................4
5.2 - Configuring a datamart feeder...................................................................................................................5
5.3 - Extracting data with a datamart feeder......................................................................................................7
5.3.1 - Accessing the datamart feeders list................................................................................................7
5.3.2 - Defining the extraction filter.............................................................................................................8
5.3.2.1 - Use a Predefined filter..........................................................................................................9
5.3.2.2 - Use a run-time filter.............................................................................................................10
5.3.3 - Defining the Data generation parameters.....................................................................................11
6 - Batch of datamart feeders.................................................................................................................................12
6.1 - Definition.................................................................................................................................................12
6.2 - Configuring a batch of datamart feeders.................................................................................................12
6.3 - Extracting data with a batch of datamart feeders....................................................................................14
6.4 - Batch over a range of dates....................................................................................................................15
7 - Datamart Extractions.........................................................................................................................................18
7.1 - Definition.................................................................................................................................................18
7.2 - Configuring a Datamart Extraction..........................................................................................................19
7.2.1 - Request parameters......................................................................................................................21
7.2.2 - Mx Keyword Parameters...............................................................................................................22
7.2.3 - Mx Security Keyword Parameter...................................................................................................23
7.2.4 - Mx Datamart Functions.................................................................................................................23
7.2.5 - Output Display Settings.................................................................................................................25
7.2.5.1 - Flat file................................................................................................................................25
7.2.5.2 - Viewer screen.....................................................................................................................25
7.2.5.3 - Excel file..............................................................................................................................25
7.2.5.4 - PDF file...............................................................................................................................26
8 - Batches of Datamart Extractions.......................................................................................................................26
8.1 - Definition.................................................................................................................................................26
8.2 - Configuring a batch of Datamart Extractions...........................................................................................26
8.3 - Running a Batch of Extractions...............................................................................................................27
8.4 - Report Publishing....................................................................................................................................28
8.4.1 - Files Sharing & Archiving..............................................................................................................28
8.4.1.1 - Dynamic Directories............................................................................................................31
8.4.1.2 - Public Folders.....................................................................................................................33
8.4.2 - Report Bursting.............................................................................................................................35
8.4.2.1 - Definition.............................................................................................................................35
8.4.2.2 - Settings...............................................................................................................................36
8.4.3 - Email notification...........................................................................................................................39

Appendix 1 - Menus differences between versions.........................................................................................................42

Appendix 2 - Functionalities differences between versions...........................................................................................43

Appendix 3 - Tables compatible within a feeder..............................................................................................................44

Appendix 4 - Effect of running a feeder or batch of feeders on previous data sets.....................................................47

Appendix 5 - Market Data Sets used for computation.....................................................................................................48


5.1 - Simulation Viewer...........................................................................................................................................48
5.2 - Market Data Viewer........................................................................................................................................48
5.3 - Dynamic tables test execution........................................................................................................................48
5.4 - Batch of feeder execution...............................................................................................................................49
5.4.1 - Standard Batch of feeder.....................................................................................................................49
5.4.2 - PLVAR Batch of feeder........................................................................................................................49
5.4.3 - Market Data Batch of feeder................................................................................................................50
5.5 - Using Live Market Data..................................................................................................................................51
5.5.1 - Prerequisites........................................................................................................................................51
5.5.2 - For Dynamic Tables.............................................................................................................................51
5.5.3 - For Datamart Feeders..........................................................................................................................52
5.5.3.1 - Using the SU_CONFIG Variable................................................................................................52
5.5.3.2 - Using the Global Filter in the batch of datamart feeder..............................................................52

Appendix 6 - Scope of trades included in the computation............................................................................................53


6.1 - Quick Overview on Trade Acceptance...........................................................................................................53
6.2 - Case where the boundary date is equal to the current date...........................................................................53
6.2.1 - Processing Center EOD organization..................................................................................................54
6.2.1.1 - Trades Implicit Acceptance mode..............................................................................................54
6.2.1.2 - Trades Manual Acceptance mode.............................................................................................55
6.2.2 - PLControl Center EOD organization....................................................................................................55
6.2.2.1 - Trades Manual Acceptance mode.............................................................................................55
6.2.2.2 - Trades Implicit Acceptance mode..............................................................................................56
6.3 - Case where the boundary date is inferior to the current date.........................................................................57

Appendix 7 - PRINTSRV service........................................................................................................................................59


7.1 - Generalities....................................................................................................................................................59
7.2 - Output settings...............................................................................................................................................59
7.3 - Other settings.................................................................................................................................................60

Datamart Configuration

This document describes the configuration of the Datamart: its architecture, tables and feeders.

1 - Overview

Given the sizeable amount of data present in an Mx production database, raw data alone is no longer suitable for making fast, reliable, and strategic decisions. The Datamart provides the means to produce concise information, such as P&L and audit figures, from stored data, thereby enabling a more intelligent analysis of the results and a possible interface with external systems.

The Datamart can thus be seen as a separate subset of the production database: a repository of data gathered from operational data and other sources, presented in terms that are familiar and easy to access, and designed for a particular purpose such as reporting.

Once installed, the Datamart needs to be properly configured for use by reports and other external tools. The aim of this
document is to outline the procedure of constructing a Datamart. Each of the steps involved in this process, mainly the
design of dynamic tables, the creation of reporting tables, and the definition of the feeders, is described in detail.

Its technical installation within the Mx database or on another RDBMS server is detailed in the Datamart Installation document (document ID 1926).

Note that the menu names in this document are those of version v3.1. However, the menus retain the same functionalities in all versions; the only observable difference between versions is the naming. Appendix 1 lists the useful menus in the reporting module and the paths used to access them in each version.

2 - Datamart functional architecture

The Datamart relies on the DataPublisher service - also called DAP - to compute its data (figure 1). The DataPublisher accesses data in the Mx production database and computes additional data to be stored in the datamart. These extractions can then be used for particular purposes by external systems - e.g. Actuate - or by some Mx processes - e.g. the simulation viewer.

Figure 1 : The DataPublisher service

Printed for MUREX+INTEGRATION Copyright 2014 Murex S.A.S. All rights reserved. 1
Datamart Configuration, 2014-11-12

The computations performed by the DAP rely, in most cases, on a dynamic table definition, or on the definition of the simulation view referenced by that dynamic table (if any). The reporting tables of the datamart are then populated according to the datamart mapping definition, which defines the relation between each reporting table and its dynamic table.

Figure 2 : Datamart Overview

The feeders and batches define the extractions to be done: they contain the list of tables to populate, the filters, etc. They
are launched by end-users from an Mx Session, and batches of feeders can be scheduled to run in EOD processing
scripts.

Once the reporting tables are populated, they can be used to build reports or extract computed data to external
environments.

In summary, the Datamart architecture is mainly based on three structures:

• The Dynamic tables, which act like views on the Murex application and define how to compute data.

• Datamart mapping, which defines the reporting datamodel. It describes the set of tables designed for reporting
and extraction purposes, each table being linked to a temporary table (dynamic table or other).

• Feeders, which are extraction definitions. They define the subset of reporting tables to be updated together
depending on business needs. Similarly, a batch is a group of feeders set to run simultaneously. Batches are
characterized by execution criteria, like filters and output. While feeders are used by end-users directly, batches
are meant to be automated in processing scripts.

Each of these structures is treated separately throughout the document. Note that reports and batches of reports behave the same way as feeders and batches of feeders. They cover an additional layer, the Actuate report parameterization, which is covered in a separate document (Reports Configuration).
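As a rough mental model, the three structures above and their relationships can be sketched as plain data classes. The class and field names below are illustrative only, not actual Murex objects:

```python
from dataclasses import dataclass, field

@dataclass
class DynamicTable:          # defines how to compute data (a reporting view)
    label: str

@dataclass
class ReportingTable:        # defined in the datamart mapping
    name: str
    source: DynamicTable     # each reporting table is linked to a view

@dataclass
class Feeder:                # an extraction definition, run by end-users
    label: str
    tables: list = field(default_factory=list)  # reporting tables to update

@dataclass
class Batch:                 # feeders grouped for automated (EOD) runs
    label: str
    feeders: list = field(default_factory=list)

# A minimal, invented configuration wiring the pieces together:
view = DynamicTable("PL_VIEW")
table = ReportingTable("REP_PL", source=view)
feeder = Feeder("EOD_PL_FEEDER", tables=[table])
batch = Batch("EOD_BATCH", feeders=[feeder])
```

The chain batch → feeder → reporting table → dynamic table mirrors the flow described above: batches schedule feeders, feeders update reporting tables, and each reporting table is computed from its dynamic table.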

3 - Dynamic tables

Dynamic tables, also known as reporting views or datamart views, define the data to be computed by the DAP in order to feed the datamart tables. Note that each reporting table is linked to a computation mode, such as a dynamic table, which determines how the table content is computed. When datamart tables are populated using feeders, the details of the computation mode are carried in each table definition, defined at the Datamart mapping level (section 4).

Dynamic tables form a denormalized access to the Mx data model. Their fields are derived from various tables for
transaction storage, tables for accounting entries, payment flows, simulation, etc.


In addition, dynamic tables give access to computed data not available in the Mx data model, such as PnL, MV, cash flow schedules, sensitivities, and so on.

The computation mechanism used in reporting views is shared with the other system applications, notably the simulations and PnL notepads, making the results consistent throughout the system.

The different kinds of data accessed by the dynamic tables are:

• Transaction related: access to the shared computation engine
• Simulation: access to user simulation views
• Accounting and payments modules
• Any static table of the MxG production data model
• And many others

Dynamic tables can be accessed from the menu Data extraction / Datamart / Reporting Views in a configurator session,
and from Middle Office / Reporting / Dynamic Tables Configuration in an end-user session (figure 3).

Figure 3 : Dynamic Table Configuration

A dynamic table is mainly defined by a Class type which determines the kind of data accessed by the current dynamic
table.

The Table fields, which form the reporting view's set of fields, are inherited from the Class type and its subclass. The Horizontal fields are user-definable fields, built using table fields and Mx parser functions. The Interactive fields are parameter-like fields whose value is determined at run-time by end-users; they are used for filtering purposes or for reorganizing data.

In addition, special flags allow the enhancement of the data display, such as the duplication of lines, consolidation
according to a certain grouping key, disabling some computations, etc.


Finally, a dynamic table may require some default configuration: setting the computing dates for revaluation of data,
pre-filters whose conditions are verified before the computations, and post-filters which filter the results.

Details on the configuration and use of dynamic tables are discussed in the Datamart Views (document ID 1817)
documentation.

4 - Datamart setup

The datamart mapping defines the structure of the reporting tables of the datamart. These tables can be based on an
underlying reporting view, in which case they form a subset of this view, or they can be based on an SQL query retrieval.

The datamart mapping characterizes the datamodel of the environment: the tables, the field formats and types, and the indexes. Refer to the Datamart Setup document (document ID 8853) for the steps involved in creating reporting tables.

Once defined in this menu, the reporting tables are created in the database, but they are still empty at this stage. It is then the role of the datamart feeders to populate the reporting tables, using the definition set in the datamart mapping screen.
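As an illustration of what the mapping step produces, the following sketch creates a hypothetical reporting table in an in-memory SQLite database. The table name REP_PL, its columns, and the index are invented for the example; the real structure and naming come from the datamart mapping screen:

```python
import sqlite3

# Illustrative only: a reporting table with a data-set key column and an
# index, as the datamart mapping would define it. The table is created
# empty; it is the feeder's job to populate it later.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE REP_PL (
        M_REF_DATA  INTEGER,  -- numeric key identifying the data set
        M_PORTFOLIO TEXT,
        M_PL        REAL
    )
""")
conn.execute("CREATE INDEX IX_REP_PL ON REP_PL (M_REF_DATA)")

# At this stage the reporting table exists but holds no rows yet.
count = conn.execute("SELECT COUNT(*) FROM REP_PL").fetchone()[0]
print(count)  # 0
```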

5 - Datamart feeders

5.1 - Definition

Datamart feeders are objects defining the extraction done from the production database into a given set of datamart
tables. Datamart feeders use the mapping definition described in the datamart setup screen. This implies that the way in
which the reporting tables are filled with a datamart feeder is implicit once the datamart mapping is defined.

The configuration of datamart feeders is performed in a configurator session, from the Data Extraction / Datamart /
Datamart Feeders menu. When entering this screen, the list of existing feeders is displayed (figure 4). Several operations
can then be performed:

• deleting a feeder by pressing the Delete key
• modifying one by pressing the space bar
• inserting a new one by pressing the Insert key.

These functions are also accessible from the toolbar menu Edit or via contextual menus (right click).

Figure 4 : Datamart Feeders


5.2 - Configuring a datamart feeder

To define a new Datamart Feeder, choose Edit/Insert from the main menu or press the Insert key. The Datamart Feeder is
defined by:

Label / Description: to differentiate the feeder.

Category: only used for the classification of feeders. End-user screens display the report list by categories. Existing categories can be reused, or a new category can be inserted at this stage. Click on the browse button (...) to get the list of categories, and press the Insert key to create a new one (figure 5).

Owner: defines whether it is a Murex or a Client extraction (figure 5). By default, the feeder is always defined with Client ownership, unlike reports, which can have either Murex or Client ownership. This criterion is filled automatically by the system; it is used when exporting and importing Murex reports between different environments.

Related Templates: the feeder can be linked to one or more reporting rights templates (figure 6). The feeder will only be accessible to groups of users belonging to the reporting rights templates selected at this stage. Templates of this kind are created in a configurator session from the Data Extraction / Datamart / Reporting Rights Template menu, and rights are set in the supervisor session, where a group of users is associated to one reporting rights template. Reporting rights templates are only available in versions v2000.2.11 and v3.1; in version v2000.2.10, reporting rights are managed with Reporting roles. Details on group rights are provided in the Datamart & Reporting Administration documentation.

Tables to update: a feeder can populate more than one table, as long as the tables are compatible. Use the Insert table button to select among the available list of tables (figure 7). The Main flag at the right of each table should be checked in special cases when the feeder is used for running reports in Actuate. When there is more than one table included in the feeder or the report, the run-time filter associated with it will be compatible with the selected main table. Therefore, all the reports and feeders that share the same main table will have the same filter appearing when launched.


Figure 5 : Selecting or adding a category

Figure 6 : Reporting rights templates list


Figure 7 : Datamart feeder tables

Several datamart tables can be populated with the same feeder. See Appendix 3 for more information about table links.

5.3 - Extracting data with a datamart feeder

5.3.1 - Accessing the datamart feeders list

Extraction of data is done by running a feeder from the Middle Office / Datamart Reporting / Datamart Feeders menu in an
end-user session.

The list of feeders must have been previously defined in a configurator session. Feeders are classified into categories, and the only visible feeders are those added to the reporting rights template to which the user's group is associated.

To run a feeder, select it, then right-click / Run (figure 8).


Figure 8 : Run a feeder

5.3.2 - Defining the extraction filter

Before running the extraction, the system will ask for an optional DAP filter (figure 9).

Figure 9 : Dap run-time filter

If the end-user does not personalize it, then the feeder will run with the default configuration specified at the dynamic table
level.

The default settings of the dynamic table can be partially overwritten in one of two ways: use a predefined filter or define a
run-time filter.

Whether overwriting the settings of the dynamic table using a Pre-defined filter, or a Run-time filter, the following rules are
applied to the output results:

• The computing dates of the dynamic table are overwritten with the computing dates defined at the filter level.
• The Pre-filter selection is the result of the intersection of filters defined in the dynamic table and the filter.
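These two rules can be expressed as a small sketch, assuming the dynamic table defaults and the filter are represented as plain dictionaries (an illustrative model, not the actual Murex representation):

```python
def effective_settings(table_defaults, filter_settings):
    """Combine dynamic-table defaults with a (pre-defined or run-time) filter.

    Rule 1: computing dates defined at the filter level replace the
            computing dates of the dynamic table.
    Rule 2: the pre-filter selection is the intersection of the selections
            defined in the dynamic table and in the filter.
    """
    dates = filter_settings.get("computing_dates") or table_defaults.get("computing_dates")
    table_sel = set(table_defaults.get("pre_filter", []))
    filter_sel = set(filter_settings.get("pre_filter", []))
    if table_sel and filter_sel:
        selection = table_sel & filter_sel   # both defined: intersect
    else:
        selection = table_sel or filter_sel  # only one defined: use it
    return {"computing_dates": dates, "pre_filter": sorted(selection)}

# Hypothetical portfolio pre-filters on each side; only the overlap survives.
result = effective_settings(
    {"computing_dates": ["2014-11-11"], "pre_filter": ["FX_DESK", "IR_DESK"]},
    {"computing_dates": ["2014-11-12"], "pre_filter": ["IR_DESK", "EQ_DESK"]},
)
print(result)  # {'computing_dates': ['2014-11-12'], 'pre_filter': ['IR_DESK']}
```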


5.3.2.1 - Use a Predefined filter

When the Use predefined filter box is checked in the previous screen, the screen in figure 10 appears.

Figure 10 : Pre-defined filter screen

This screen contains the list of interactive variables defined for the dynamic table. Set the values of the interactive
variables as needed.

To see the list of predefined filters, click on the browse button at the upper right corner of the screen (figure 11). At this
stage, it is possible to edit, delete or duplicate a predefined filter by right clicking on it. A new filter can even be inserted
and defined at this stage. To select a filter, either double click on it or right click / select.

Figure 11 : Pre-defined filter list

Predefined filters contain three sections similar to those of the dynamic table (figure 12). The first section contains general information about the filter, mainly the following components:

Label / Description: identifies the filter.

Visible by: determines whether the filter is available to the current user only, to the group to which the user belongs, or to everybody.

Activate late trading: if late trading is activated, all the deals and transactions entered after the closing time will not be taken into account.

In addition, the filter is composed of two other sections: the Data Selection section contains the dates and market parameters used to calculate the data, and the Pre-filter selection is used to filter the data to be computed.

• Date Selection:


Computing dates: can be used for revaluation or filtering. Refer to the Datamart Views (document ID 1817) documentation for more details about the possible selections and their meaning.

Market Data sets: by default, the market data set used to compute the data is the one attached to the user's current desk. However, it is possible to load another market data set per computing date by selecting one from the drop-down list (feature available in version v3.1 and active from the P&L monitor screen only).

• Pre-filter selection: This frame (figure 12) differs depending on the dynamic table class. Refer to the Datamart
Views (document ID 1817) documentation to get more details about the available pre-filters and their usage for
each dynamic table class.

Figure 12 : Pre-defined filter definition

5.3.2.2 - Use a run-time filter

If no filter has to be created, saved or reused, leave the Use predefined filter box unchecked. In the filter screen that appears (figure 13), define the filtering criteria for this particular extraction. The run-time filter contains the same sections defined for the pre-defined filter. Save and exit the screen.


Figure 13 : Run-time filter screen

5.3.3 - Defining the Data generation parameters

In the Data management frame, a tag (Label of data) needs to be given to the extraction in order to identify the data set. This helps users refer to it later, for instance to generate a report on this specific data (figures 10 and 13).

A data set corresponds to a specific usage or extraction. It is defined by:

• the User
• the Label of data, which forms the tag used to identify the data set
• the Generation date which is equivalent to the last computing date in the filter

These fields are converted in the reporting tables to a numeric key: M_REF_DATA, which is unique per data set.

All the tables updated within the same feeder or batch will have the same M_REF_DATA value for the new rows
populated with this extraction.

If the data is published, the user is not taken into account in the M_REF_DATA key, and the data set will be available to all users.

Once the name of the data extraction is set, specify whether the data should be present in the reporting tables once and
only once (One data set), once per day (One data set per day), or for every extraction (One data set per run).

To be more specific:

One data set: the extraction holding the same tag will be deleted and created again at each extraction.

One data set per day: the extraction holding the same tag will be present in one copy only per day. No matter how many times you run the same feeder with that generation date, the data set with the same data tag will be overwritten for that particular date.

One data set per run: every time this feeder is run, a new data set will be created, even if the same label of data is used.

Set the data generation parameters, then click on Proceed to run the extraction.
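The three modes can be illustrated with a hypothetical sketch of how runs might map to M_REF_DATA keys. The key-assignment logic is invented for illustration; only the reuse-versus-new-key behaviour mirrors the description above:

```python
from itertools import count

_keys = count(1)   # sequential M_REF_DATA values for the sketch
_existing = {}     # identity tuple -> previously assigned M_REF_DATA

def run_feeder(user, label, generation_date, mode):
    """Return the M_REF_DATA key under which this run stores its rows."""
    if mode == "one data set":
        ident = (user, label)                   # same tag: recreated each run
    elif mode == "one data set per day":
        ident = (user, label, generation_date)  # one copy per generation date
    elif mode == "one data set per run":
        return next(_keys)                      # always a brand-new data set
    else:
        raise ValueError(f"unknown mode: {mode}")
    return _existing.setdefault(ident, next(_keys))

# Re-running with the same tag and date reuses the key in per-day mode...
a = run_feeder("user1", "EOD_PL", "2014-11-12", "one data set per day")
b = run_feeder("user1", "EOD_PL", "2014-11-12", "one data set per day")
# ...while per-run mode creates a new data set on every execution.
c = run_feeder("user1", "EOD_PL", "2014-11-12", "one data set per run")
d = run_feeder("user1", "EOD_PL", "2014-11-12", "one data set per run")
```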

6 - Batch of datamart feeders

6.1 - Definition

Datamart feeders can be grouped together in a batch to perform several extractions automatically. Each extraction can be associated with a tag specified at the batch level; this tag can be used later, in the same way, to refer to the extraction. Batches of datamart feeders can be accessed from the datamart menu. A list of existing batches is displayed, and they can be modified, deleted or inserted in a similar way to the feeders.

Batches of datamart feeders can then be included in EOD processing scripts. The procedure is fully described in the Reporting Processing Scripts documentation.

6.2 - Configuring a batch of datamart feeders

Upon creation of a new batch, the screen in figure 14 appears.

Figure 14 : Creation of a Batch Of Feeders

A batch of feeders is defined as follows (figure 14):


Label / Description (1): identifies the batch of feeders.

Flag Optional / Mandatory for EOD (2): whenever this flag is checked in a batch of feeders definition, no report can run as long as this batch has not completed successfully. Once all the batches of feeders having this flag set are done, the execution of batches of reports (where data are pre-computed) takes place.

Can be overwritten by API (3): this feature is for batches of feeders driven through the reporting API.

Global Filter (4): to be specified after the insertion of the feeders. At this stage, a pre-existing filter can be used, or a new one can be added. These filters are the same as those that appear when running a feeder with a pre-defined filter.

Label of data (5): used in the same way as for Datamart feeders. This tag cannot be overridden by the end user at runtime. However, end users are allowed to create their own batches of feeders for their personal use (which is not possible with Datamart Feeders).

Data are published (6): additional flag used to make the data set available to all users. If it is not checked, the data set is private and only visible to that end user.

Data computed by several batches (7): an additional flag that offers the possibility to merge the current extraction with an existing one. If some data has already been extracted by a batch holding the same data tag, the data extracted by this batch is merged with the existing one. In other words, this data set has the same identification tag and the same reference number (M_REF_DATA) as the initial set. This is very useful for reports where you need to join data to get a particular output and display.

Scanner template (8): option added in a recent MxG2000 version. It is based on multiprocessing and is intended to optimize performance by parallelizing the computations into different processes (used only for computations involving deals). The description of this process is beyond the scope of this document; refer to the document Datamart Feeders in Parallel Mode (document ID 1834) for more information.

Exception template (9): allows catching exception errors and casting them to a specific level. For example, for known errors on deals it can be useful to cast them with a warning level to prevent the batch from finishing in error.

Email notification (10): allows setting up an email notification of the batch status (success or failure), sent with error details.

Command before and after (11): UNIX commands can be set to be executed on the server before and after the execution of the batch of feeders. The root path is the applicative directory path. Interactive parameters can also be used as input for the UNIX scripts to be executed; they are prefixed by the '%' character:

• USER
• GROUP
• DESK
• SESSION_ID
• BATCH_LBL
• OUTPUT_NAME


In addition, three important components have to be defined to complete the Batch's definition:

• View templates (12): one or more reporting rights templates to make the batch of feeders accessible to one or
more groups of users.
• Interactive variables (13): parameter-like fields, defined at the level of the dynamic table or the
reporting views.
• Feeders to run (14): determines the list of feeders to be added to this batch. All the Datamart feeders inserted
at this level should be compatible. Note that all the tables of the feeders listed in the same batch will have the same
identification tag and the same reference number.

Several Datamart tables can be populated with the same batch of feeders. See appendix 3 for more information about
table links.

6.3 - Extracting data with a batch of datamart feeders

In order to run a Batch of Datamart Feeders, open an end-user session and go to Middle Office / Datamart Reporting /
Batches Of Feeders. The list of Datamart batches appears (figure 14).

Note that the available batches can only be run: no filters and no data set can be defined at this stage.

Figure 14 : Running a Batch of feeder

However, an end user can create his own batch, but only using the set of Datamart feeders visible to him. These are
determined, as mentioned in section 5.2 (Configuring a datamart feeder), in the Reporting Rights Template definition
associated with the group to which the user belongs.

Note that the definition of a batch of feeders from an end-user session differs slightly from one defined from a
configurator session (figure 15). More specifically, the data generated cannot be made public, and cannot be merged with
other data sets. User-created batches are differentiated in the list by colour: white instead of the light blue used for the
predefined batches.


Figure 15 : Creation of a Batch of feeder from an end-user session

6.4 - Batch over a range of dates

Batches of feeders can be run on one or several dates by specifying the list of dates to be used as computing date 2, from
the Middle Office / Datamart Reporting / Execute Batches of Feeders menu in an end-user session, via the contextual
menu Run on date(s).

Figure 16 Run on date(s) execution mode


Figure 17 Batch over range date(s) feature

If your batch of feeders calculates data on only one computing date, select only the list of dates to be used as computing
date 2.

Figure 18 Batch of feeder with only one date configured


If your batch of feeders calculates data on several computing dates, select the list of dates to be used as computing date
2 and set the other dates according to the date type list.

Figure 19 Batch of feeder configured with 3 computing dates

Note that a new date type, 'Computing date 2 shifter', is available for this feature, allowing the end user to shift
computing dates 0 and 1 according to the list of computing dates 2 selected.


Figure 20 Computing date 2 shifter available for computing dates 0 & 1

In the 'View execution jobs' menu, the end user can see one line per computing date 2 selected; the batches are
executed sequentially.

7 - Datamart Extractions

7.1 - Definition

Datamart Extractions are datamart objects used to display a selection of the content of the Datamart in a Viewer
component, or to extract it into a CSV file. They can also be partially used as feeders to populate the datamart tables.

The following three steps will take place during the execution of a datamart extraction:

• Identification of a datamart dataset: this can be done by selecting an existing dataset or by creating a new one
(equivalent to a feeder execution). This is based on the list of tables in the definition of the datamart extraction.
• Selection of some data in this dataset: a SQL statement at the level of the definition of the Datamart extractions
specifies which data will be made available to the output. The user can choose four types of output: Viewer, flat
file (CSV), Excel XML file and PDF file.
• Display of the selected data in the Viewer or extraction of this data into a file. If the data is generated in a file on
the client side, it can be displayed in Excel after the extraction. Only Excel 2002 or later versions support the
Excel XML file.

The configuration of datamart extractions is performed in a configurator session, from the Data Extraction / Datamart /
Datamart Extractions menu. When entering this screen, the list of existing Datamart Extractions is displayed and several
operations can then be performed:

• deleting an extraction by pressing the Delete key
• modifying one by pressing the space bar
• inserting a new one by pressing Insert.

These functions are also accessible from the toolbar menu Edit or via contextual menus (right click).

7.2 - Configuring a Datamart Extraction

After pressing Insert in the Datamart Extractions list, a screen with the properties below will be displayed. Many of these
properties are also present in the feeder configuration and can be configured in the same manner (see section 5.2):

Label: Name given to the Datamart extraction.

Description: Description associated with the Datamart extraction.

Category: Used for classification purposes. Same as feeder configuration (section 5.2).

Owner: Specifies the ownership of the Datamart extraction. Same as feeder configuration (section 5.2).

Related Templates: The rights templates are managed in the same way as for the feeder definitions in section 5.2.

Scanner Template: This parameter is optional. When filled, it makes use of scanner engines to parallelize the computation and population of Datamart tables over several processors.

Extraction request: Defines the SQL request used to select the data from the Datamart.

Tables to update: The list of tables populated by this extraction. Refer to section 5.2 for configuration details.

The screenshot on figure 21 shows an example of a configured Datamart Extraction called PL_EXTRACT in the category
P&L and based on the datamart table MXOD_TP.REP. The SQL statement defining the selection of data is described in
the MXOD_PLCHECK extraction request. The configuration for this request is shown on figure 22.


Figure 21 : Datamart Extraction Definition


Figure 22 : Extraction Request Definition

The extraction request can be based on Datamart tables or on any Murex table. If the extraction request is based on a
Datamart table, it needs to be modified at runtime by the system to select the appropriate dataset in the Datamart.
It is therefore necessary to specify the name of one of the tables used (or its alias, if aliases are used) in the Main table
property; the system can then customize the statement accordingly.

The maximum length of a column name is 28 characters. Completion of the fields, including field aliases, in the SQL
request of the extraction is available using F10.

Several Datamart tables can be populated with the same extraction. See appendix 3 for more information about table
links.

7.2.1 - Request parameters

It is possible to add dynamic parameters to the request:

Syntax : @<parameter>:T , where @ identifies an upcoming parameter

• <parameter> stands for the label that appears upon execution of the extraction
• T stands for the parameter type: C, N or D, depending on whether the parameter is of type character, numeric or date
respectively

E.g.: where M_NB=@Trn_number:N

If the request runs on REP tables, the data set reference has to be passed to the request, to ensure that the request does
not run on the entire table but only on the data set generated by the extraction:

Syntax: where M_REF_DATA=@MxDataSetKey:N


The system replaces the interactive variable with the reference number just generated in the tables of the extraction.
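Putting these elements together, a minimal extraction request might look like the sketch below. The column choice and the numeric trade-number parameter label are illustrative assumptions; only MXOD_TP.REP, M_NB and the @MxDataSetKey:N convention come from this document:

```sql
-- Hypothetical extraction request (column list is an example only).
-- @MxDataSetKey:N is replaced by the system with the data set reference
-- just generated; @Trn_number:N prompts the user for a numeric value.
select M_NB, M_REF_DATA
from MXOD_TP.REP
where M_REF_DATA = @MxDataSetKey:N
  and M_NB = @Trn_number:N
```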

7.2.2 - Mx Keyword Parameters

In addition to the dynamic request parameters, a list of Mx Specific parameters are provided :

Filtering Criteria (Alias) / Type / Description / Root parameter naming / Example:

• Portfolios (String): access the list of Mx portfolios. Root parameter naming: MxPortfolio. Example: MxPortfolio
• Counterparts (String): access the list of Mx counterparties. Root parameter naming: MxCounterpart. Example: MxCounterpartMain
• Transaction Type (String): access the list of Mx transactions. Root parameter naming: MxFmlyGrpType. Example: MxFmlyGrpType1
• Instruments (String): access the list of Mx instruments. Root parameter naming: MxInstrument. Example: MxInstrumentPart1
• Historical Data sets (Double): access the list of Mx historical data sets. Root parameter naming: MxHistoricalData. Example: MxHistoricalDataFirst
• Data Set key (String): access the last data set generated. Root parameter naming: MxDataSetKey. Example: MxDataSetKey1
• SQL Expression (String): add an expression to a query, which can be replaced at run time. Root parameter naming: MxSQLExpression. Example: MxExpression

At run-time, Mx identifies the specific parameter and displays a prompt window accordingly. If no item is selected from the
list, no filtering is applied in the where clause.

Note that specific parameters returning a list do not require an operator between the column name and the @.

E.g.: column_name@MxPortfolio:C => the prompt window will contain the list of portfolios to which the user has access

The MxSQLExpression parameter defines an expression (a character string, a where clause, etc.) at batch level or at
run-time. Instead of duplicating the same extraction and modifying the where condition, you can use the MxSQLExpression
parameter to build one and only one extraction and define the where clause at batch level or at run-time. The
MxSQLExpression parameter must be declared inside a where condition.
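As an illustration, a single extraction could carry a replaceable condition as sketched below. The exact placeholder label (@MxExpression:C) and the column choice are assumptions based on the naming convention in the table above:

```sql
-- One single extraction; the @MxExpression:C placeholder sits inside the
-- where condition and is replaced at batch level or at run time, for
-- example by an extra condition such as:  and M_NB > 1000
select M_NB, M_REF_DATA
from MXOD_TP.REP
where M_REF_DATA = @MxDataSetKey:N @MxExpression:C
```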

The MxHistoricalData parameter can be defined at batch level or at run-time for intraday extractions:


Figure 19 : MxHistoricalData definition

This parameter can either be generic or have a constant value. If the user ticks 'Use fixed data reference', a window is
displayed containing all the sets of data available for this datamart table (the table defined in the datamart extraction
query on which the parameter MxHistoricalData is defined), and a value can be chosen.

The MxHistoricalData parameter can also refer to a dynamic set of data, defined by:

• Label of data: the label of the set of data to be read from the datamart table.
• Data visibility: whether the set of data to be retrieved from the datamart table is public (the set was fed by a batch of
feeders with the 'Data are published' flag ticked) or private (the set was published by a feeder or a batch of feeders with
the 'Data are published' property unticked).
• Date shifter: the shifter is relative to the date selected in the 'Relative to' drop-down list. The date shifter applied to
that date gives the computing date of the set of data to be retrieved.
• Relative to: this drop-down list contains three values: Login date (the date of the current session from which the
datamart extraction or batch of extractions is executed), Reporting date (the date filled in a configuration session from
the menu Data extraction / Datamart / Initialize reporting date), or Last snapshot (the last set of data generated for this
datamart table; it appears at the top of the list of available sets of data when looking at the table from a configuration
session via Data extraction / Datamart / View produced data by table).

7.2.3 - Mx Security Keyword Parameter

The parameter MxAuthorizedPortfolio is a hidden parameter that automatically filters out the portfolios to which the
group does not have access. The same naming convention applies.

7.2.4 - Mx Datamart Functions

A list of Mx Datamart Specific functions are provided:

Function Name / Input Parameter(s) / Function Description / Output Type:

• MxGetComputerDate(): returns the date of the user's computer. Output: String.
• MxGetComputerTime(): returns the time of the user's computer. Output: String.
• MxGetUser(): returns the name of the user who executed the extraction, e.g. MUREXBO. Output: String.
• MxGetGroup(): returns the group of the user who executed the extraction. Output: String.
• MxGetLoginDate(): returns the current session date. Output: String.
• MxGetDynAudit(<FIELD>, [Table], @MxDataSetKey/@MxHistoricalData): accesses the audit details of the generated data set (retrieves information from DYN_AUDIT_REP). Output: String. Parameters: FIELD is a column name in DYN_AUDIT_REP without the M_ prefix; Table is optional (if not filled, the main table is taken); the last parameter is the reference of the wanted data set. E.g. MxGetDynAudit(REP_DATE2,'PL_2.REP',830) returns the content of the field REP_DATE2 of the table DYN_AUDIT_REP where the output table is 'PL_2.REP' and the data reference is equal to 830.
• MxToDate(<STRING>): converts a character string (YYYY-MM-DD) to a date. Output: Date.
• MxToInt(<STRING>): converts a character string to an integer. Output: Integer.

All the functions can be used:

• in the Datamart setup, in the underlying request of a datamart table of type SQL
• in the request of a Datamart extraction
• in the Output settings of a Datamart extraction (Header and Footer).

When used in the Where clause of a request, the results of the functions MxGetComputerDate, MxGetComputerTime
and MxGetDynAudit should be converted to the date format using the function MxToDate.
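For example, such a conversion in a where clause could be sketched as follows (the table and the compared column are illustrative assumptions; the String-to-Date wrapping is the point being shown):

```sql
-- MxGetComputerDate() returns a String, so it is wrapped in MxToDate()
-- before being compared with a date column.
select M_NB
from MXOD_TP.REP
where M_REF_DATA = @MxDataSetKey:N
  and M_DATE = MxToDate(MxGetComputerDate())
```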

At run-time, Mx identifies the specific functions and returns the result accordingly.


7.2.5 - Output Display Settings

At run-time, three output modes are available.

7.2.5.1 - Flat file

• In the client directory - can be opened with Excel
• On the server (under the folder 'datamart_extractions', available only for batches of extractions)

Figure 20 : file output

7.2.5.2 - Viewer screen

• The Mx dynamic reporting interface.
• The output displays the results of the SQL query defined previously in the extraction definition.

Figure 21 : viewer output

7.2.5.3 - Excel file

• In the client directory - can be opened with Excel
• The generated file is valid for Excel 2002 or higher versions
• On the server in overwrite mode only (under the folder 'datamart_extractions', available only for batches of
extractions)

Figure 22 : Excel XML file output


7.2.5.4 - PDF file

• In the client directory - can be opened with Adobe Acrobat Reader
• On the server in overwrite mode only (under the folder 'datamart_extractions', available only for batches of
extractions)

Figure 23 : PDF file output

PRINTSRV service is mandatory for PDF output. Refer to appendix 7 for more information.

8 - Batches of Datamart Extractions

8.1 - Definition

One batch is a set of one or more Datamart Extractions to be run simultaneously.

The list of tables referenced within one batch will share the same M_REF_DATA for the same extraction.

Batches can be automated to run in EOD processing scripts.

8.2 - Configuring a batch of Datamart Extractions

• General information, Scanner Templates, Exception Templates: same behavior as for feeders
• List of extractions to execute: Each extraction can have its own output settings in Parameters
♦ If Viewer is chosen, the view layout will be written to a server CSV file and the view is displayed on the
graphical interface
♦ If File is chosen, 'Generate on server' must be checked to generate the file on the server
♦ If Table is chosen, an XTR file will be written on the server (Overwrite or Append mode)
♦ If PDF is chosen, a PDF file will be generated

PRINTSRV service is mandatory for PDF output. Refer to appendix 7 for more information.


Figure 24 : Extraction configuration within batch of extractions

• Data generation parameters
♦ Data pre-computed unchecked: the extractions act as a batch of feeders, with the filtering criteria defined in
the Global filter and Interactive variables
♦ Several Datamart tables can be populated with the same batch of extractions. See appendix 3 for
more information about table links

• Data pre-computed
♦ Extractions run on data pre-computed by a batch of feeders - Label of data
♦ Data are published: indicates whether the referenced data is private to the owner of the extraction or published to
ALL
♦ Data selection: refers to the date of the selected data set

If the flag 'Data are pre-computed' is ticked, the last snapshot of data generated can be retrieved using the label
'Last Snapshot'. The 'Last Snapshot' option should be used when the extraction serves as a feeder or as an intraday data
extractor.

Figure 23: To use the last snapshot of data generated

8.3 - Running a Batch of Extractions

From an end-user session, Middle office/Datamart reporting/Execute batch of datamart extractions

Right click on the batch and select 'Run set'

Batches of extractions cannot be configured at run-time.

End-users can define their own batch of extractions from this screen, however:

• Data extracted by this extraction can't be published



• Only extractions to which the current user has access can be added in the Batch
• The batch won't be visible to others

Batches of extractions can be called from processing scripts.

8.4 - Report Publishing

PRINTSRV service is mandatory for report publishing. Refer to appendix 7 for more information.

8.4.1 - Files Sharing & Archiving

All the extraction files based on the PDF and Excel formats are stored and organized in the menu Middle
office/Datamart reporting/View produced reports/Datamart. The storage and organization of extraction files is configured in a
batch of extractions at the level of Parameters, and concerns only extraction files generated on the server.

Figure 25 : View Produced Reports Menu

The storage of the extraction files uses the following parameters of the batch of extractions:

• Generate on server: specifies the generation of extraction files on the application server
♦ By user: the extraction is generated under datamart_extractions/<Customize_Folders>
♦ Generic: the extraction is generated under <Customize_Folders>. This allows the user to dump the
reports in any folder under the root application directory. Note that a black list of forbidden folders exists
and a warning message appears if you use one of them
• Server Output Path: customizes the folder where the extraction files are stored
• Public Folders: specifies the publication of the extraction files in public folders

Several functionalities are available in this menu by right clicking:


• Print Preview: open the extraction file
• Copy to client: copy an extraction file from the server to the client directory
• File audit: give various pieces of information concerning the extraction file
• Delete: delete a folder or an extraction file

Figure 32 : View Produced Reports Functionality


Figure 26 : Parameters for Extraction Files Storage and Organization


Figure 24 Parameters for Extraction Files Storage in generic mode

8.4.1.1 - Dynamic Directories

The extraction files are still generated under the default directory datamart_extractions when the flag 'By user' is
selected.

The user has the possibility to customize the default path using dynamic parameters:

• @BATCH_NAME: the current name of the batch of extractions
• @DATE(CMP): the computer date
• @DATE(DATA): the date of the data generation
• @DATE(SYS): the system date
• @DESK: the login desk
• @GROUP: the login group


• @USER: the login user
• @XTR_NAME: the name of the underlying extraction

In that case the default directory becomes datamart_extractions/<Customize_Folders> and the path is contextual with
respect to this root directory.

Figure 27 : Dynamic Directories for Storing the Extraction Files

For example: if the user fills the path with @USER/@XTR_NAME/@DATE(SYS), then the extraction file would be
generated under: Datamart_extractions/JOHN/pl_extraction/20090807.

The user can also use Unix path elements like '..', but be careful: all special characters will be replaced by '_'. This
applies to the following:

. | ; , ! @ # $ ( ) < > " ' ` ~ { } [ ] = + & ^ <space> <tab>


'/' and '\' are the separators used to specify another folder.

8.4.1.2 - Public Folders

The user has the possibility to publish data under Public Folders. In the reporting right templates, rights should be given to
groups to publish data in certain public folders. Public folders are added in the Reporting right templates and can be
shared by several groups.

Figure 28 : Public Folders Configuration in batch of extractions

In the output settings of the batch of extractions, the user can tick the 'Publish' box and select one or several public
folders from the proposed list; the generated extraction file is then stored under the selected public folders.

In that case, and if the flag 'By user' is selected, the extraction is stored both under datamart_extractions/<Customize_Folders>
and under datamart_extractions/<Public_Folder>/<Customize_Folders>.

A public folder can't have the same name as a user.

8.4.1.2.1 - Access Rights

A user can only access files stored under his user name, plus all shared files stored under the public folders his group has
access to.

A user has full access to his user name directory, but has limited access in the public folders. The rights on the public
folder (read, delete, write) are assigned in the reporting right templates in a supervisor session ('Reporting/Extraction
Directories').

8.4.1.2.1.1 - Supervisor Session

To create a Public Folder, use a supervisor session and modify the definition of the reporting template in
the tab Reporting / Extraction Directories.


Figure 29 : Public Folders Configuration

8.4.1.2.1.2 - Configurator Session

The storage and organization of extraction files is configured in a batch of extractions at the level of Parameters, using a
configurator session.


Figure 30 : Public Folders Configuration

8.4.2 - Report Bursting

8.4.2.1 - Definition

Report bursting is the option to distribute, out of one and only one execution, the report results to multiple recipients,
each of whom receives an instance of the report with a subset of the data.

It is used to handle both large amounts of data and a large number of users while maintaining and running one and
only one report.


8.4.2.2 - Settings

The Report Bursting option is available in batch mode only. It is defined in the Extraction output settings of each extraction
and is only applicable with the Viewer or PDF output.

The criteria needed to handle the Report bursting are:

• the Bursting key (one or more fields) used to split the data. The Bursting key can be composed of any field
available in the SQL extraction dictionary. To add a field to the Bursting key, drag and drop the field from the left
panel onto the right panel title bar.


Figure 33 : Bursting Key selection

• The recipient list (email addresses or disk location)
• Dynamic name or path for the resulting files: possibility to append the Bursting key to the filename, or to create a
directory per bursting key by adding the parameter @BurstingKey to the directory path.


Figure 34 : Report bursting filename or path

By default, the filename is generated as <report_name>+"_"+<Timestamp>+"_"+<BurstingKey>. It is mandatory to
append the BurstingKey to the filename or to add it to the path, in order to guarantee the uniqueness of the file per
directory.

Below are two examples of Report Bursting results:

• one report file per bursting key, obtained by adding the Bursting Key to the filename

Figure 35 : Results with bursting key appended to file name

• a dynamic path creation with the Bursting key values


Figure 36 : Results with Bursting key added to the dynamic path

8.4.3 - Email notification

Email notification rules can be added to scheduled jobs. They are set per extraction in the Output extraction settings tab.

Figure 37 : Email notification settings


An Email notification is defined with:

• an Email template (To, CC, BCC, Subject, Body), default or customized.

Figure 38 : Email template

• options

• On job success: if the job succeeds, send an email notification to the recipients in the Email template
• Send report: if the job succeeds, send an email notification to the recipients with the report result attached
• On job failure: if the job fails, send an email notification to the recipients in the Email template
• Send error: if the job fails, send an email notification to the recipients with the XML error file attached

Note that it's possible to define distinct notification rules per recipient list for the same report.

Figure 38 : Multiple email notification rules per report

Configuration:


mail-server information has to be specified in the following file:

fs/murex/mxres/printsrv/config.mxres

<MailServer>mail.fr.murex.com</MailServer>
<MailServerPort>25</MailServerPort>
<BreakAfterRowCount>1000</BreakAfterRowCount>
<MailSender>MX-Report-Service@murex.com</MailSender>

Appendix 1 - Menus differences between versions
The menus retain the same functionality in all versions; the only observable difference between versions is their naming.
Below is the list of useful menus in the reporting module and the different paths to access them depending on the
version.

Screen / V3.1 / V2000.2.11 / V2000.2.10:

Dynamic Tables Configuration
• V3.1: Data Extraction / Datamart / Reporting Views, or Middle Office / Reporting / Dynamic Tables Configuration
• V2000.2.11: Publisher / Datamart / Reporting Views, or Middle Office / Reporting / Dynamic Tables Configuration
• V2000.2.10: Publisher / Datamart / Reporting Views, or Middle Office / Reporting / Dynamic Tables Configuration

Reporting Tables
• V3.1: Data Extraction / Datamart / Datamart Mapping
• V2000.2.11: Publisher / Datamart / Datamart Mapping
• V2000.2.10: Publisher / Datamart / Datamart Setup

Datamart Feeders
• V3.1: Data Extraction / Datamart / Datamart Feeders, or Middle Office / Datamart Reporting / Datamart Feeders
• V2000.2.11: Publisher / Datamart / Datamart Feeders, or Middle Office / Datamart Reporting / Datamart Feeders
• V2000.2.10: Publisher / Datamart / Datamart Feeders, or Middle Office / Datamart Reporting / Datamart Feeders

Batches of Feeders
• V3.1: Data Extraction / Datamart / Batches Of Datamart Feeders, or Middle Office / Datamart Reporting / Batches Of Datamart Feeders
• V2000.2.11: Publisher / Datamart / Batches Of Datamart Feeders, or Middle Office / Datamart Reporting / Batches Of Datamart Feeders
• V2000.2.10: Publisher / Datamart / Batches Of Datamart Feeders, or Middle Office / Datamart Reporting / Batches Of Datamart Feeders

Datamart Extractions
• V3.1: Data Extraction / Datamart / Datamart Extraction
• V2000.2.11: NA
• V2000.2.10: NA

Appendix 2 - Functionalities differences between versions
Some functionalities have been added or modified in versions 2.11 and 3.1. Below is the list of functionalities that change from one version to another. Unchanged items or functions are not listed.

Function | V3.1 | V2000.2.11 | V2000.2.10
Reporting table based on SQL | Present | Present | Absent
Main table flag for the list of tables within a feeder | Present | Present | Absent
Running feeders with the market data set as parameter for computing dates | Present when the batch is launched from the P&L monitor only | Absent | Absent
Running feeders with Activate Late trading | Present | Absent | Absent
Rights on feeders/reports | Managed by a Reporting Rights Template per group | Managed by a Reporting Rights Template per group | Managed by one or more Reporting Roles per group
Scanner Template in Batches of Feeders | Present | Absent | Absent
M_REF_DATA | = User + data generation date + label of data set | = User + data generation date + label of data set | = User + data generation date + label of data set + desk
Report publishing (archiving, bursting and email notification) | Present | Absent | Absent
Extraction formatting: PDF and Excel, formatting layout, header/footer, front/last page | Present | Absent | Absent
Exception configuration | Present | Absent | Absent
Scanner Template | Trade, Portfolio, P&L Var and Market Data Loaders algorithms | Trade and Portfolio algorithms | Absent

Appendix 3 - Tables compatible within a feeder
Reporting tables are compatible if they are based on dynamic tables that share the same filters and the same date interpretation. For example, the classes ACCOUNTING and TRANSACTION use different filters and scanning tables. Moreover, within the TRANSACTION class, DYN_TRNRP_PL uses the computing dates as revaluation dates, whereas DYN_TRNRP_MK uses the computing dates to filter transactions. In both examples, the tables are incompatible and should be included in two different feeders. Below are the dynamic table types that are compatible:

• Transaction based in detailed mode:
♦ TRNRP_PL
♦ TRNPL
♦ TRNRP_CS
♦ TRNRP_DT
♦ TRNRP_MV
♦ TRNRP_CD
♦ TRNRP_SV
♦ TRNRP_XG
♦ Simulation View
• Transaction life cycle:
♦ TRNRP_MK
• Portfolio based:
♦ Theta Analysis
♦ PL Variance
♦ Transaction based in consolidated mode
• Accounting:
♦ Accounting
♦ Accounting (reporting)
• Market data:
♦ Data Dictionary Service
• Static:
♦ Copy Creation
• Value at Risk
• STP rights audit
• Trade version audit
• STP rights

The compatibility of reporting tables is checked at the level of the batch of feeders:

• Compatibility of BuiltOn flags
• Compatibility of filters
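These two checks can be sketched as a grouping pass. This is a hypothetical illustration only: the dictionaries and field names below are invented for readability and are not a Murex API.

```python
from collections import defaultdict

def group_compatible_tables(tables):
    """Group reporting tables into sets that could share a feeder.

    Each table is a dict with a 'builton' flag (the dynamic table type)
    and a set of 'filters'. Tables are compatible only when both match,
    mirroring the two batch-level checks above.
    """
    groups = defaultdict(list)
    for table in tables:
        key = (table["builton"], frozenset(table["filters"]))
        groups[key].append(table["name"])
    return list(groups.values())

tables = [
    {"name": "UT_TP.REP", "builton": "TRNRP_PL", "filters": {"portfolio"}},
    {"name": "UT_CS.REP", "builton": "TRNRP_PL", "filters": {"portfolio"}},
    {"name": "UT_MK.REP", "builton": "TRNRP_MK", "filters": {"portfolio"}},
]
print(group_compatible_tables(tables))
# The first two tables can share a feeder; UT_MK.REP needs its own.
```

Here the first two tables land in one group because both the BuiltOn flag and the filters match, while the TRNRP_MK-based table is isolated, as in the DYN_TRNRP_PL / DYN_TRNRP_MK example above.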


Figure 39 : Compatibility of dynamic tables in batches of feeders

Reporting tables that share the same date interpretation but not the same filtering can be joined within the same feeder or batch of feeders, provided that a table link is defined at the datamart table level.

To define such a table link:

• From a Configurator session, go to Data Extraction / Datamart / Datamart Setup (tables definition and mapping)
• Select the secondary table that must be linked to the main table
• Lock the rights
• Click on Attach data
• In the new window, click on Edit, then on View the linked tables
• Click on Edit, then on Insert linked table
• Define the link fields
• Save & Exit

For example, if we need to link the main table UT_TP.REP and the secondary table UT_MK.REP on the field NB, the
following link should be defined from the UT_MK.REP table definition:


Figure 40 : Table link insertion

Appendix 4 - Effect of running a feeder or batch of feeders on
previous data sets
When running a feeder, the system checks the existence of a data set holding the same label in all the existing reporting tables. Recall that each data set is identified by the numeric key M_REF_DATA, formed by the concatenation of the user, the label of the data set and the data generation date (section 5.3.3).

If the data generation parameters of the data sets holding the same label are the same as the ones specified in the current feeder, then these data sets are deleted from all the reporting tables according to the rules described in section 5.3.3 (one data set, one data set per run, one data set per day).

More precisely, if in the current feeder, the data management is set to:

• One data set: all the previous data sets holding this label are deleted in all the reporting tables
• One data set per day: the data sets holding this label and having the same data generation date are deleted
from all the reporting tables
• One data set per run: none of the data sets holding this label are deleted

In all cases, whether there exists a matching data set or not, a new M_REF_DATA is created.
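The three deletion rules can be sketched as follows. This is a hypothetical helper for illustration: the dictionary fields and mode names are invented, only the rules themselves come from the text above.

```python
def data_sets_to_delete(existing, label, mode, generation_date):
    """Return the M_REF_DATA keys that a feeder run would delete.

    existing: list of dicts with 'ref', 'label' and 'date' describing
    the data sets already present in the reporting tables.
    mode: 'one', 'per_day' or 'per_run', matching the three data
    management settings described above.
    """
    matching = [ds for ds in existing if ds["label"] == label]
    if mode == "one":
        # One data set: every previous data set holding this label goes.
        return [ds["ref"] for ds in matching]
    if mode == "per_day":
        # One data set per day: same label AND same data generation date.
        return [ds["ref"] for ds in matching if ds["date"] == generation_date]
    # One data set per run: nothing is deleted.
    return []

existing = [
    {"ref": 1, "label": "EOD_PL", "date": "2014-11-11"},
    {"ref": 2, "label": "EOD_PL", "date": "2014-11-12"},
    {"ref": 3, "label": "INTRADAY", "date": "2014-11-12"},
]
print(data_sets_to_delete(existing, "EOD_PL", "per_day", "2014-11-12"))  # [2]
```

Whatever the mode returns, a fresh M_REF_DATA is then created for the new run, as stated above.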

The same rules apply when a batch of feeders is run. However, if the data set is computed by several batches, then the matching data sets are only deleted from the reporting tables referenced by the current batch (and not from tables referenced by other batches).

Appendix 5 - Market Data Sets used for computation
This paragraph describes, for each type of dynamic table / batch of feeders execution, the different market data sets involved. These market data sets (MDS) are the ones used for data revaluation.

5.1 - Simulation Viewer

For all Murex user group types, the current session MDS (the user can jump from one MDS to another by hitting Ctrl+U) is used for computation in the Simulation Viewer.

5.2 - Market Data Viewer

For all Murex user group types, the current session MDS (the user can jump from one MDS to another by hitting Ctrl+U) is used for computation in the Market Data Viewer.

5.3 - Dynamic tables test execution

Below are the market data sets used when testing a standard or theta analysis dynamic table on screen.

Group type | MDS used for computations
Front Office | Current session MDS (Ctrl+U to switch)
Processing | Current session MDS (Ctrl+U to switch)
P&L Control | MDS of the relevant portfolio's closing entity

Below are the market data sets used when testing a PLVAR (P&L variance) dynamic table on screen.

Group type | MDS used for first date | MDS used for second date
Front Office | MDS specified in the PLVAR scenario definition | Current session MDS (Ctrl+U to switch)
Processing | MDS specified in the PLVAR scenario definition | Current session MDS (Ctrl+U to switch)
P&L Control | MDS specified in the PLVAR scenario definition | MDS of the relevant portfolio's closing entity


5.4 - Batch of feeder execution

5.4.1 - Standard Batch of feeder

Below are the market data sets used when running a standard batch of feeder.

Group type | MDS specified in the batch of feeder configuration | MDS used for computations
Front Office | Y | MDS specified in the batch of feeder configuration
Front Office | N | Current session MDS (Ctrl+U to switch)
Processing | Y | MDS specified in the batch of feeder configuration
Processing | N | Processing center default entity's MDS
P&L Control | Y | MDS specified in the batch of feeder configuration
P&L Control | N | MDS of the relevant portfolio's closing entity

Below are the market data sets used when running a standard batch of feeder from the P&L monitor screen.

Group type | MDS specified in the batch of feeder configuration | MDS used for computations
P&L Control | Y | MDS of the relevant portfolio's closing entity
P&L Control | N | MDS of the relevant portfolio's closing entity
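The resolution logic of the two tables above can be sketched as a small lookup. This is a plain illustrative transcription: the function and its string labels are invented for readability, not Murex code.

```python
def resolve_mds(group_type, mds_in_batch_config, from_pl_monitor=False):
    """Resolve which market data set a standard batch of feeders uses.

    Transcribes the two tables above; returns a short description of
    the MDS used for computations.
    """
    if from_pl_monitor:
        # Launched from the P&L monitor: P&L Control always uses the
        # closing entity MDS, whatever the batch configuration says.
        return "closing entity MDS"
    if mds_in_batch_config:
        # Y column: the MDS specified in the batch configuration wins.
        return "MDS specified in the batch configuration"
    # N column: the fallback depends on the group type.
    return {
        "Front Office": "current session MDS",
        "Processing": "processing center default entity MDS",
        "P&L Control": "closing entity MDS",
    }[group_type]

print(resolve_mds("Processing", False))  # processing center default entity MDS
```

The same shape applies to the PLVAR and Market Data tables in the next paragraphs; only the fallback values differ.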

5.4.2 - PLVAR Batch of feeder

Below are the market data sets used when running a PLVAR batch of feeder.

Group type | MDS specified in the batch of feeder configuration for first date | MDS used for first date | MDS used for second date
Front Office | Y | MDS specified in the batch of feeder configuration for first date | Current session MDS (Ctrl+U to switch)
Front Office | N | Current session MDS for first date (Ctrl+U to switch) | Current session MDS (Ctrl+U to switch)
Processing | Y | MDS specified in the batch of feeder configuration for first date | Processing center default entity's MDS
Processing | N | Processing center default entity's MDS for first date | Processing center default entity's MDS
P&L Control | Y | MDS specified in the batch of feeder configuration for first date | MDS of the relevant portfolio's closing entity
P&L Control | N | MDS of the relevant portfolio's closing entity for first date | MDS of the relevant portfolio's closing entity

Below are the market data sets used when running a PLVAR batch of feeder from the P&L monitor screen.

Group type | MDS specified in the batch of feeder configuration for first date | MDS used for first date | MDS used for second date
P&L Control | Y | MDS specified in the batch of feeder configuration for first date (contextual yesterday) | MDS of the relevant portfolio's closing entity
P&L Control | N | MDS of the relevant portfolio's closing entity for first date | MDS of the relevant portfolio's closing entity

5.4.3 - Market Data Batch of feeder

Below are the market data sets used when running a Market Data batch of feeder (Market Data feeders are based on
Dynamic tables of type Data Dictionary Service configured to run with a user defined Market Data Viewer).

Group type | MDS specified in the batch of feeder configuration | MDS used for computations
Front Office | Y | MDS specified in the batch of feeder configuration
Front Office | N | Current session MDS (Ctrl+U to switch)
Processing | Y | MDS specified in the batch of feeder configuration
Processing | N | Processing center default entity's MDS
P&L Control | Y | MDS specified in the batch of feeder configuration
P&L Control | N | Processing center default entity's MDS

Below are the market data sets used when running a Market Data batch of feeder from the P&L monitor screen.

Group type | MDS specified in the batch of feeder configuration | MDS used for computations
P&L Control | Y | Selected closing entity's MDS
P&L Control | N | Selected closing entity's MDS

5.5 - Using Live Market Data

5.5.1 - Prerequisites

The MX user who will run the dynamic table or the datamart feeder using the live market data from the cache (MDCS server) must have the correct assignment to the live market data.

For that, the following needs to be configured first:

• From a Configurator session, configure a Realtime Subscription Rule from the menu Market Data -> Contribution -> Subscription Rules and assign it to the FO desk settings from the menu Infrastructure -> FO Desk -> Settings.
• Assign the FO Desk settings to the front office desk from the menu Infrastructure -> Organization -> Front Office Desk.
• From a Supervisor session, attach the MX group to which the MX user belongs to the Front Office desk.

5.5.2 - For Dynamic Tables

To run the dynamic table using the live market data from the cache (MDCS server), the following setting is needed:

• From the menu Middle Office -> Reporting -> Reporting Setup, insert a reporting setup specific to the realtime market data connection, where the flag Realtime status is set to Connected (Automatic Retrieve)


• From the dynamic table, use the interactive variable SU_CONFIG to select the reporting setup created previously.

5.5.3 - For Datamart Feeders

It is possible to run the batch of datamart feeder on the live market data in the cache (MDCS server).

5.5.3.1 - Using the SU_CONFIG Variable

Set the interactive variable SU_CONFIG to select the reporting setup with the realtime connection (see the configuration in the previous paragraph).

5.5.3.2 - Using the Global Filter in the batch of datamart feeder

For that, the following configuration should be done:

• From the Global filter of the batch of feeders, click on the Market Data tab
• From the drop-down list MDC Server, choose the option:
♦ Default (Reporting Setup): to inherit from the defined Reporting setup. If no interactive variable SU_CONFIG is defined for this feeder, the settings of the main Reporting setup (defined under Middle Office / Reporting / Reporting Setup) are inherited instead
♦ Connected: to override the settings of the Reporting setup (default or user-defined)

Appendix 6 - Scope of trades included in the computation
This paragraph describes the scope of trades included when running the datamart feeders and the dynamic tables with different user profiles and at different stages of the business day in the financial institution.

For the EOD concepts and naming conventions, refer to the PL Control Center (document ID 2275) documentation.

The trade perimeter taken into account when running a computation depends on three main factors:

• Boundary date of the computation
• MX group profile of the user running the computation
• Acceptance of the trades

In short:

• Front Office and Processing groups scan all the trades with the highest available version with regard to the boundary date
• PLControl groups scan all the accepted trades with the highest available version with regard to the boundary date

6.1 - Quick Overview on Trade Acceptance

Two acceptance modes are available: Manual and Implicit.

• Implicit: the trades are automatically included in the official P&L (accepted) at insertion. Once the closing entity is closed down, trades inserted afterwards are only included in the next day's P&L. The accepted trades can be identified by an actual date in the system that is less than or equal to the processing center/PLCC date. The actual date can be found in the following fields, available for each version of the trade:
♦ TRN_EXT_DBF.M_ACT_DATE
♦ TRN_HDR_DBF.M_MRPL_DATE
• Manual: the trades are not included in the official P&L by default. The acceptance is either event driven via the workflow or done manually by the PL Control group. In this case, in addition to the condition on the trade's actual date, another field is used to confirm the acceptance:
♦ TRN_HDR_DBF.M_LAT
♦ TRN_EXT_DBF.M_LAT
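The two acceptance conditions can be sketched as follows. The helper below is hypothetical: only the field semantics (M_ACT_DATE / M_MRPL_DATE and M_LAT) come from the text, and the convention that M_LAT='N' marks an accepted trade is taken from the scanning condition described in section 6.3.

```python
import datetime

def is_accepted(act_date, plcc_date, mode, m_lat=None):
    """Decide whether a trade version counts as accepted.

    act_date / plcc_date: datetime.date values for the trade's actual
    date (M_ACT_DATE / M_MRPL_DATE) and the processing center / PLCC date.
    mode: 'implicit' or 'manual'. In manual mode the M_LAT flag must
    also confirm the acceptance (assumed here to be 'N' when accepted).
    """
    in_official_pl = act_date <= plcc_date
    if mode == "implicit":
        # Implicit: the actual date condition alone decides.
        return in_official_pl
    # Manual: the date condition plus the explicit acceptance flag.
    return in_official_pl and m_lat == "N"

today = datetime.date(2014, 11, 12)
print(is_accepted(datetime.date(2014, 11, 11), today, "implicit"))     # True
print(is_accepted(datetime.date(2014, 11, 12), today, "manual", "Y"))  # False
```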

6.2 - Case where the boundary date is equal to the current date

This section covers the case where the computation is done as of the current (contextual log in) date.

Depending on the trade acceptance mode, the EOD organization mode (Processing Center or PLControl center) and on
the point in time when the computation is run during the business day, the following illustrations show the scope of trades
included in the extraction.


6.2.1 - Processing Center EOD organization

In this case, the EOD organization of the financial institution is based on one Processing center managing all closing entities.

6.2.1.1 - Trades Implicit Acceptance mode

Below is the trade perimeter for the Processing Center EOD organization and Implicit Acceptance mode for the different MX profiles.

Figure 41 : Trades Perimeter for Processing Center EOD organization and Implicit Acceptance mode


6.2.1.2 - Trades Manual Acceptance mode

Below is the trade perimeter for the Processing Center EOD organization and Manual Acceptance mode for the different MX profiles.

Figure 42 : Trades Perimeter for Processing Center EOD Organization and Manual Acceptance mode

6.2.2 - PLControl Center EOD organization

In this case, the EOD organization of the financial institution is based on PLControl centers managing the closing entities.

6.2.2.1 - Trades Manual Acceptance mode

Below is the trade perimeter for the PLControl Center EOD organization and Manual Acceptance mode for the different MX profiles.


Figure 43 : Trades Perimeter for PLControl Center EOD Organization and Manual Acceptance mode

6.2.2.2 - Trades Implicit Acceptance mode

Below is the trade perimeter for the PLControl Center EOD organization and Implicit Acceptance mode for the different MX profiles.


Figure 44 : Trades Perimeter for PLControl Center EOD Organization and Implicit Acceptance mode

6.3 - Case where the boundary date is earlier than the current date

This is the case where the computation is launched as of a past date. The different groups evaluate the trades as of their last valid status on the boundary date.

Trades inserted after the boundary date are scanned by the extraction but not evaluated.

Trades on which market operations occurred after the boundary date are evaluated as per their state before the market operation.

Therefore, only the latest valid version of the trade on the boundary date is taken into account. Technically speaking:

• For Front Office and Processing groups: all trades in TRN_HDR_DBF are scanned. When the boundary date is earlier than the FO date (PC date for Processing groups):
♦ The trades whose actual date is later than the boundary date are not evaluated
♦ The last version with an actual date less than the boundary date is the one used in the computation
• For PL Control groups:
♦ In Implicit mode, only the trades whose actual date is less than or equal to the PC (PLCC) date are scanned.
♦ In Manual mode, only the trades whose actual date is less than or equal to the PC (PLCC) date and where M_LAT='N' are scanned.
♦ The trades that have an actual date later than the boundary date are not evaluated.
♦ The last accepted version with an actual date less than the boundary date is the one used in the computation
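The version selection above can be sketched as follows. This is a hypothetical helper for illustration; it assumes "an actual date less than the boundary date" means the actual date does not exceed the boundary date.

```python
import datetime

def version_used(versions, boundary_date):
    """Pick the trade version used for a past-dated computation.

    versions: list of (version_number, actual_date) tuples for one trade.
    Returns the highest version whose actual date does not exceed the
    boundary date, or None when the trade is scanned but not evaluated
    (inserted after the boundary date).
    """
    eligible = [v for v in versions if v[1] <= boundary_date]
    if not eligible:
        return None  # scanned by the extraction, but not evaluated
    return max(eligible)  # last valid version on the boundary date

versions = [
    (1, datetime.date(2014, 11, 10)),
    (2, datetime.date(2014, 11, 11)),
    (3, datetime.date(2014, 11, 13)),  # market operation after the boundary
]
print(version_used(versions, datetime.date(2014, 11, 12)))
```

Version 3, created after the boundary date, is ignored, so the trade is evaluated as per its state before the market operation, as described above.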

Appendix 7 - PRINTSRV service

7.1 - Generalities

The MXPRINTSRV service is mandatory for extractions and batches of extractions configured to be generated in PDF format or to be printed, and to retrieve the reports generated under the menu View produced reports.

Two points must be checked to make sure the service is available:

• Is the MXPRINTSRV service launched (launchmxj.app -printsrv)?
• Is the environment updated with the corresponding license rights?

7.2 - Output settings

When running an extraction on a printer, the user has to define which printer will execute the job. The PRINTSRV service retrieves the list of printers defined at server level (the server where the service is running).

Defining a PDF output is similar to defining a printed output, as page settings need to be set (width, height and margins).

Some default PRINTSRV settings may be updated under fs/public/mxres/printsrv/config.mxres in order to customize the
output.

This configuration file is formatted as below:

<PrinterSettings>
<PaperWidth>8.3</PaperWidth>
<PaperHeight>11.7</PaperHeight>
<Orientation>Landscape</Orientation>
<Margins>
<TopMargin>0.5</TopMargin>
<BottomMargin>0.5</BottomMargin>
<LeftMargin>0.5</LeftMargin>
<RightMargin>0.5</RightMargin>
<HeaderMargin>0.25</HeaderMargin>
<FooterMargin>0.25</FooterMargin>
</Margins>
<ColumnBorder>Y</ColumnBorder>
<RowBorder>Y</RowBorder>
<FontFactor>0</FontFactor>
<!--
<PrinterName></PrinterName>
-->
</PrinterSettings>

Note that the default paper size is 8.3 x 11.7 inches. It corresponds to the A4 format.

The text orientation of layouts can be defined at the level of the printsrv service and will be applied to all extraction files:
<Orientation>Landscape</Orientation>.


A modification of the file config.mxres requires a restart of the launcher (launchmxj.cmd -printsrv) and a restart of the service.

7.3 - Other settings

The maximum Java heap size defined for the PRINTSRV service can be modified in the launchmxj.app file. The value is set by default to 512M but it can be increased up to 2048M:

DEFAULT_PRINTSRV_JVM_ARGS=-Xms32M -Xmx512M (default Java heap size)

DEFAULT_PRINTSRV_JVM_ARGS=-Xms32M -Xmx2048M (maximum Java heap size)

This might be necessary when a large volume of data has to be printed.
