
How Not to Make a Mess of Data Consolidation
Building the foundations of a data-led financial institution


Data: the Holy Grail for modern businesses
All manner of organisations rely on data to carry out their business. And as digital transformation marches on, the volumes of data used by businesses only continue to grow.

That growth is accelerating. Data volumes created on an annual basis have doubled in the past two and a half years [1] and are forecast to double again in the coming three years.

Figure 1: Data volumes created annually, in zettabytes (rising from 2 ZB in 2010 to a forecast 149 ZB in 2024).



For instance, the data volumes generated by Internet of
Things (IoT) devices and sensors are expected to grow from
13.6 zettabytes in 2019 to 79.4 zettabytes in 2025 [2].

The financial sector is no exception. On the one hand, financial services institutions – especially banks and insurance brokers – own a huge asset. Data is the new oil, after all! On the other, data that isn't managed properly, isn't accessible or hasn't been properly verified can be just as useless as having no data at all. How can organisations ensure they can trust their data in a world where the volume of data generated picks up speed year by year?

Data consolidation is one means of managing vast quantities of data. As financial institutions realise how data can transform and grow their business, there is a growing need to implement data consolidation processes, bringing all of that data to one location while simultaneously removing redundant and erroneous data.
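To make this concrete, the short sketch below shows what "bringing data to one location while removing redundant and erroneous data" can look like in practice. It is a minimal illustration only, assuming pandas and invented column names (customer_id, balance) rather than any real institution's schema:

    import pandas as pd

    # Hypothetical extracts from two source systems; values are illustrative.
    core_banking = pd.DataFrame({
        "customer_id": [101, 102, 102, 103],
        "balance": [2500.0, 940.0, 940.0, -1.0],  # 102 duplicated, 103 invalid
    })
    crm = pd.DataFrame({
        "customer_id": [103, 104],
        "balance": [1200.0, 310.0],
    })

    # Consolidate: bring both sources into one place...
    combined = pd.concat([core_banking, crm], ignore_index=True)

    # ...then remove redundant rows and records failing a basic sanity rule.
    consolidated = (
        combined
        .drop_duplicates()          # redundant data
        .query("balance >= 0")      # erroneous data (the rule is illustrative)
        .sort_values("customer_id")
        .reset_index(drop=True)
    )

    print(consolidated)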

Although it may seem the perfect solution on the surface, it's all too easy for data consolidation projects to go wrong, with many failing to achieve the primary goal of data consolidation – making data easier to use.

References:
[1] https://www.statista.com/statistics/871513/worldwide-data-created/
[2] https://www.statista.com/statistics/1017863/worldwide-iot-connected-devices-data-size/

Common risks in
data consolidation
From the outside, migration projects always look relatively straightforward: take the existing system and copy it onto a new platform. However, we soon find that we have to deal with cleverly designed vendor lock-ins, version incompatibilities, complex legacy systems with a significant number of customisations, and regulatory requirements on reporting in the finance world.

And that's where the problems start. Any kind of lift-and-shift or greenfield approach to migrating and consolidating data introduces significant business risks in terms of business continuity. The only feasible approach in our experience is to build a new infrastructure landscape around the existing legacy system, using incremental change and replacement, so that – step by step – a new system emerges while business continuity is maintained.

Another risk that comes with improper data consolidation is a lack of data quality and master data governance. To ensure better results once migration is completed, it is not sufficient simply to copy data from one place to the other. To avoid bringing old problems to a new platform, the focus must be on enhancing data while migrating it, in line with solid (master) data governance. All this requires the involvement of the business side, whose availability must be ensured, as misalignments are one of the biggest threats for data consolidation projects. A lack of data governance and of end-user involvement during the migration project are two big threats to the migration itself.

Financial services organisations must also be cognisant of the temptation to implement a 'big bang' project. Modernising too much, too fast will inevitably lead to problems and failure: the size and complexity of the deliverables raise the probability of failure dramatically. An agile approach with multiple sprints (units of time in which project tasks are performed) reduces this risk while still achieving modernisation and Cloud migration goals.

The success of data consolidation projects therefore relies on how a given project is carried out and how data is migrated and stored.




A data decision
to make
For data consolidation to be useful, it requires careful thought and consideration as to how
data – once migrated – will be stored and used. Will data find a home in a warehouse, a
lake or a hub, and what are the differences between these? Let’s define each in turn:

DATA WAREHOUSE

A data warehouse is typically used to store structured data and to ensure availability for pre-defined and standardised reporting across all users inside the organisation.

DATA LAKE

In the context of big data, data lakes are selected to manage appropriate information and have proven to be a useful architecture. However, when using them, data scientists are needed to carry out the meaningful evaluation and consolidation of the data. This naturally has an impact on resourcing budgets through their need for IT skills and expertise.

DATA HUB

A data hub is a modern data management architecture enabling companies to consolidate and exchange data. It offers the possibility to perform differentiated data analyses and to generate comprehensive data science scenarios, amongst other things. There are different technologies that can be used when setting up a data hub architecture, all based on Cloud infrastructures.



Gartner research found that 57% of data and analytics leaders are investing in data warehouses,
46% are using data hubs and 39% are using data lakes. But it must be remembered that they
are not interchangeable alternatives, and their purpose must be understood and aligned with
business requirements.

DATA HUBS: THE IDEAL SOLUTION?

The intention of a data hub is to provide simplicity and to deliver on up-to-date business objectives, providing a 360-degree view of any facet of the business and allowing employees to gain access to the data they want, when they need it. The data hub is the go-to place for the core data within an enterprise. It centralises the enterprise's critical data, usually located across different applications, and enables seamless data sharing between diverse endpoints. It also acts as the main source of trusted data within governance initiatives.

Data hubs enable the collection of data in a highly efficient way, equipped with technology allowing
for the flexible implementation of reports and KPIs from various sources and applications. Using
data hubs can allow companies to achieve unprecedented agility, enabling quick responses to
new requirements in their individual market segments and other data-relevant changes.

But while data hubs provide some answers to the question of data consolidation, and may seem simple enough,
there are still a number of risks which financial institutions face when attempting to unite data into a single source.

Figure 2: Data hub overview. Sources – a finance and risk application, advanced analytics, business intelligence, operational applications, a data warehouse (structured for analytics) and a data lake ((un)structured, unrefined) – feed four stages: INGESTION (data from multiple sources, pre-processed in real time and in batches), PERSISTENCE (business objects stored, managed and maintained in a near-raw, write-once/read-many format), TRANSFORMATION (data quality rules, business rules and compliance rules/policies, plus analytics/ML/AI jobs) and ANALYTICS (machine learning and artificial intelligence generating business insights and KPIs for the business), with data monitoring, data security and data governance applied throughout.
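As a rough illustration of how the four stages in Figure 2 fit together, the sketch below walks records through ingestion, near-raw persistence, rule-based transformation and a simple analytics step. Every name in it (raw_store, the quality rules, the exposure KPI) is hypothetical, not a prescribed implementation:

    from datetime import datetime, timezone

    # INGESTION: accept records from multiple sources (real-time or batch),
    # pre-processing minimally by tagging provenance and arrival time.
    def ingest(source: str, record: dict) -> dict:
        return {"source": source, "ingested_at": datetime.now(timezone.utc), **record}

    # PERSISTENCE: write once, read many (a list stands in for lake/warehouse).
    raw_store: list[dict] = []

    # TRANSFORMATION: data quality / business / compliance rules.
    QUALITY_RULES = [
        lambda r: r.get("amount", 0) >= 0,               # illustrative quality rule
        lambda r: r.get("currency") in {"EUR", "USD"},   # illustrative compliance rule
    ]

    def transform(records: list[dict]) -> list[dict]:
        return [r for r in records if all(rule(r) for rule in QUALITY_RULES)]

    # ANALYTICS: generate a business KPI from the transformed data.
    def total_exposure(records: list[dict]) -> float:
        return sum(r["amount"] for r in records)

    raw_store.append(ingest("core_banking", {"amount": 120.0, "currency": "EUR"}))
    raw_store.append(ingest("crm", {"amount": -5.0, "currency": "EUR"}))  # fails a rule
    print(total_exposure(transform(raw_store)))  # -> 120.0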



How a well-defined
migration framework
can help
At Critical Software, we see data consolidation as the process
that collects all relevant data for the entire organisation, a
department or a line of business from wherever it is stored;
removes any redundancies or inconsistencies; and normalises
it in order to create a safe, accurate and clear single source of
truth. This can then be exploited by standard visualisation tools
or other applications in a safe, simple and consolidated way.
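A brief sketch of the "normalise into a single source of truth" step, assuming pandas and invented source conventions (per-system date formats and country labels):

    import pandas as pd

    # Two sources report the same fields in different conventions (illustrative).
    src_a = pd.DataFrame({"client": ["Acme"], "country": ["DE"],
                          "opened": ["2021-03-01"]})
    src_b = pd.DataFrame({"client": ["acme "], "country": ["Germany"],
                          "opened": ["01/03/2021"]})

    COUNTRY_MAP = {"Germany": "DE"}  # normalise country names to ISO codes

    def normalise(df: pd.DataFrame, date_format: str) -> pd.DataFrame:
        out = df.copy()
        out["client"] = out["client"].str.strip().str.title()
        out["country"] = out["country"].replace(COUNTRY_MAP)
        out["opened"] = pd.to_datetime(out["opened"], format=date_format)
        return out

    # One canonical, de-duplicated view: the single source of truth.
    truth = (pd.concat([normalise(src_a, "%Y-%m-%d"),
                        normalise(src_b, "%d/%m/%Y")])
               .drop_duplicates()
               .reset_index(drop=True))
    print(truth)  # both source rows collapse to one canonical record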

Our methodological approach for achieving this, referred to by its acronym BOT, is divided into three main phases: Build, Operate and Transfer.

Figure 3: The Critical Software migration framework. 1) BUILD – SET THE STAGE (assessment): customer and stakeholder research; define the project vision; define the desirable scope of the MVP; define project prerequisites (governance, team, logistics, IT); identify and analyse the data sources. 2.1) OPERATE – INCEPTION & ARCHITECTURE (research, analysis, design, certification): design the new journeys; design the architectural solution; design user interface prototypes (if applicable); define the architecture and technical journey. 2.2) OPERATE – DELIVERY (data analytics, development, Scrum): agile teams discover, refine, govern, orchestrate and visualise data. 3) TRANSFER (data science, data engineering): knowledge sharing and transfer. The whole framework is supported by centres of excellence in data analytics & business intelligence, artificial intelligence & machine learning, user experience & user interface, architecture & development, agile & Scrum, security, and data engineering.



BUILD

With data consolidation, it is common to have a clear notion of what is already provided and how. Typically, we suggest an assessment to identify major pain points and to align and merge future goals. This helps us define a project vision and design an MVP (minimum viable product) in terms of technologies, architecture and data processing.

INCEPTION & ARCHITECTURE

The analysis team refines and prepares the backlog to be integrated into the following sprints. This is a mixed team spanning business and client, with a project owner from the customer side and a business/technical analyst from Critical, supported by business domain experts. The technical leader adjusts the architecture to meet new requirements.

DELIVERY

The scrum teams iterate within each sprint on the implementation of the previously defined backlog. The main focus is on data-related areas, but the work also includes documentation, prototyping, support for key users, training, testing and bug fixing.

When managing this process, organisations can take advantage of a series of centres of excellence (CoEs), optimising the data consolidation process by implementing efficient project management and adopting best practices for building the consolidation platform. Some of these include:

AGILE

Using Agile methodologies for a data consolidation project enables organisations to ensure business continuity whilst their existing landscape is still operational and continues to provide results as before. As we build the new landscape around the existing legacy systems with incremental change and replacement, organisations benefit from an unbroken data analytics service while their solution is transformed gradually. Because of that, organisations achieve an earlier return on investment, better visibility and transparency, a more adaptive approach and higher quality.

USER EXPERIENCE

User experience should be at the heart of any data consolidation build. How users access relevant data through a consolidated platform is integral to its business success. A platform that is complex or does not hold the most up-to-date data would only damage an institution's ability to use data to its advantage.

Through user experience design (UxD) techniques like mapping, easy-to-use interfaces can be built which clearly label data, allowing users to identify with ease the information they are looking for.

CONSOLIDATE ELEMENT BY ELEMENT

Using Agile allows for element-by-element consolidation. We typically recommend migrating frameworks either KPI by KPI or market by market, or following any other element-focused approach. In any case, we focus on certain areas and consolidate them step by step.

Using this approach, financial services companies can ensure continuous operations while also facilitating data cleansing and data validation during the consolidation process. This requires significant effort and involvement from the business side, but ensures that no uncleansed data is migrated and that the consolidated KPIs and reports deliver business-validated results, as sketched below.
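A minimal sketch of the element-by-element pattern just described: each KPI is migrated in its own increment and only promoted once it passes validation against business-provided expectations. All names and numbers (LEGACY, EXPECTED, the tolerance) are hypothetical:

    # Element-by-element consolidation: migrate one KPI at a time and gate each
    # increment on business validation, so no uncleansed data reaches the new platform.
    LEGACY = {
        "net_interest_margin": [0.021, 0.019, 0.023],
        "cost_income_ratio":   [0.61, 0.59, None],   # contains an uncleansed value
    }

    # Business-validated expectations per KPI (purely illustrative figures).
    EXPECTED = {"net_interest_margin": 0.021, "cost_income_ratio": 0.55}
    TOLERANCE = 0.005

    consolidated: dict[str, list[float]] = {}

    for kpi, values in LEGACY.items():               # one sprint-sized increment per KPI
        cleansed = [v for v in values if v is not None]
        mean = sum(cleansed) / len(cleansed)
        if abs(mean - EXPECTED[kpi]) <= TOLERANCE:   # business validation gate
            consolidated[kpi] = cleansed             # promote to the new platform
        else:
            print(f"{kpi}: held back for business review (mean={mean:.4f})")

    print(sorted(consolidated))                      # -> ['net_interest_margin']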

The data governance conundrum
Yet within this process of building a data consolidation platform lies the need to establish a series of data governance guidelines. These assign responsibilities and limit freedom, introducing guidance for users and data processors on how to access and utilise the data within the framework.

Data governance aims to ensure that a dataset follows certain rules so that it can be processed by all consumers. A data governance framework is a set of data rules, organisational role delegations and processes aimed at bringing everyone in the organisation onto the same page.

Most companies already have some form of data governance for individual applications, business units or functions, even if the processes and responsibilities are informal. In practice, data governance is about establishing systematic, formal control over these processes and responsibilities. Doing so can help organisations remain responsive, especially as they grow to a size at which it is no longer efficient for individuals to perform cross-functional tasks. Several of the overall benefits of data management can only be realised after the institution has established systematic data governance.

Key to an effective data governance framework is compliance. Without rigorous adherence to rules, data governance – and even the entire consolidation project itself – could be put in jeopardy. One way to ensure this is to achieve significant buy-in from across the institution, rather than making governance the sole responsibility of project managers.

This buy-in should not come at a late stage of the data governance process. Arguably, the earlier stakeholders within the institution are able to buy into the governance structure taking shape, the better: they will have more time to adapt to new ways of handling data while increasing their awareness of the norms of data handling as they are established.

How can early engagement with stakeholders be achieved? That is where effective communication comes in. By outlining clearly not only what the data governance framework entails, but also why it is being implemented (to comply with data protection regulations, for instance), members of the organisation are much more likely to engage with the process – not only ensuring data is maintained in a safe and effective way, but also ensuring that the wider changes brought about by data consolidation do not fail.
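One way to make the "set of data rules and role delegations" above tangible is to encode the rules so they can be checked automatically. The sketch below is a toy illustration of that idea only – it is not the DGI framework itself, and every rule and role name in it is hypothetical:

    from dataclasses import dataclass
    from typing import Callable

    # A governance rule couples a data rule with an accountable role, echoing
    # the framework's mix of rules, decision rights and accountabilities.
    @dataclass
    class GovernanceRule:
        name: str
        owner_role: str                   # who is accountable for the rule
        check: Callable[[dict], bool]     # the data rule itself

    RULES = [
        GovernanceRule("iban_present", "data_steward",
                       lambda rec: bool(rec.get("iban"))),
        GovernanceRule("amount_non_negative", "finance_controller",
                       lambda rec: rec.get("amount", 0) >= 0),
    ]

    def audit(record: dict) -> list[str]:
        """Return the owner roles to notify for every rule the record violates."""
        return [f"{r.name} -> escalate to {r.owner_role}"
                for r in RULES if not r.check(record)]

    print(audit({"iban": "", "amount": -10.0}))
    # ['iban_present -> escalate to data_steward',
    #  'amount_non_negative -> escalate to finance_controller']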



Figure 4: The DGI Data Governance Framework © The Data Governance Institute. The framework organises governance into rules and rules of engagement (the "why": mission, focus areas with goals, metrics and success measures, and funding, data rules and definitions, decision rights, accountabilities, and control mechanisms); people and organisational bodies (the "who": data stakeholders, a data governance office, and data governance processes); and processes (the "when" and "how": develop a value statement, prepare a roadmap, plan and fund, design the program, deploy the program, govern the data, and monitor, measure and report). These cover activities such as aligning policies, requirements and controls; establishing decision rights and accountability; performing stewardship; managing change; defining data; issue resolution; specifying data quality requirements; building governance into technology; and stakeholder care and support.

Driving change with data consolidation
The key driver for consolidating data and traditional data warehouses should be future-oriented. Ensuring a future-proof data landscape that allows data-driven technologies to be used in the best possible way is a key goal for data-driven companies implementing solutions around artificial intelligence (AI) and Big Data.

The previously separate worlds of analytics roles and data roles are coming together, impacting not only technology but also the people and processes that support and use it. Consolidation also works to optimise the business – a financial institution with a good grasp of the data it is using is far better able to adapt to its customers and the market, improving its offering and subsequently increasing revenues.

Therefore, for Critical Software, business involvement is key to any data consolidation project. Any purely technical approach to a data consolidation project is bound to fail. Adopting a holistic, business-led view of a project allows organisations to embrace new data services while ensuring any legacy systems that need to be maintained are migrated successfully.

At Critical Software, we see data consolidation as the process of putting an organisation’s data in an
integrated environment, delivering clear business benefits in the form of:

• Immediate access to relevant and correct data.

• Increased efficiency due to consolidated data.

• Reduced operational expenses.

• Better alignment and compliance with data protection laws.

• Better compliance with regulatory requirements, e.g. from the ECB or BaFin.

Critical Software has been involved in various data consolidation projects, and our migration framework, combined with agile methodologies, has ensured they are delivered effectively and efficiently. Most importantly, however, our approach has ensured business involvement throughout every phase of the project – avoiding messy communications and keeping the project on track.

© Critical Software. All rights reserved.


To find out more about our
work, please get in touch:
info@criticalsoftware.com
We are CMMI Maturity Level 5 rated.
For a list of our certifications & standards 
visit our website.

criticalsoftware.com
info@criticalsoftware.com
