How Not to Make a Mess of Data Consolidation
Building the foundations of a data-led financial institution
criticalsoftware.com
info@criticalsoftware.com
White Paper How Not to Make a Mess of Data Consolidation
Figure: Volume of data created worldwide per year, 2010–2024, in zettabytes, rising from 2 ZB in 2010 to 149 ZB in 2024. Sources: [1], [2].
References:
[1] https://www.statista.com/statistics/871513/worldwide-data-created/
[2] https://www.statista.com/statistics/1017863/worldwide-iot-connected-devices-data-size/
Common risks in
data consolidation
From the outside, migration projects always look relatively straightforward: take the existing system and copy it onto a new platform. However, we soon find that we have to deal with cleverly designed vendor lock-ins, version incompatibilities, complex legacy systems with a significant number of customisations, and regulatory requirements on reporting in the finance world.

And that's where the problems start. Any kind of lift-and-shift or greenfield approach to migrating and consolidating data introduces significant business risks in terms of business continuity. The only feasible approach, in our experience, is to build a new infrastructure landscape around the existing legacy system, using incremental change and replacement, so that – step by step – a new system emerges while business continuity is maintained.

Another risk that comes with improper data consolidation is a lack of data quality and master data governance. To ensure better results once migration is completed, it is not sufficient just to copy data from one place to the other. To avoid bringing old problems to a new platform, the focus must be on enhancing data while migrating it, in line with solid (master) data governance. All this requires the involvement of the business side, and availability must be ensured, as misalignments are one of the biggest threats for data consolidation projects. Lack of data governance and of end-user involvement during the migration project are two big threats to the migration itself.

Financial services organisations must also be cognisant of the temptation to implement a 'big bang' project. Modernising too much, too fast will automatically lead to problems and, inevitably, failure. The size and complexity of deliverables raise the probability of failure dramatically. An agile approach with multiple sprints (units of time in which project tasks are performed) reduces this risk while still achieving modernisation and Cloud migration goals.

The success of data consolidation projects therefore relies on how a given project is carried out and how data is migrated and stored.
A data decision
to make
For data consolidation to be useful, it requires careful thought and consideration as to how
data – once migrated – will be stored and used. Will data find a home in a warehouse, a
lake or a hub, and what are the differences between these? Let’s define each in turn:
A data warehouse is typically used to store structured data and ensure availability for pre-defined and standardised reporting across all users inside the organisation.

In the context of big data, data lakes are selected to manage the appropriate information and have proven to be a useful architecture. However, when using them, data scientists are needed to carry out the meaningful evaluation and consolidation of the data. This naturally has an impact on resourcing budgets through their need for IT skills and expertise.

A data hub is a modern data management architecture enabling companies to consolidate and exchange data. It offers the possibility to perform differentiated data analyses and to generate comprehensive data science scenarios, amongst other things. There are different technologies that can be used when setting up a data hub architecture, all based on Cloud infrastructures.
The intention of a data hub is to provide simplicity and to support up-to-date business objectives, providing a 360-degree view of any facet of the business and allowing employees to access the data they want, when they need it. The data hub is the go-to place for the core data within an enterprise. It centralises the enterprise's critical data, usually located across different applications, and enables seamless data sharing between diverse endpoints. It also acts as the main source of trusted data within governance initiatives.

Data hubs enable the collection of data in a highly efficient way, equipped with technology allowing for the flexible implementation of reports and KPIs from various sources and applications. Using data hubs, companies can achieve unprecedented agility, enabling quick responses to new requirements in their individual market segments and to other data-related changes.
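To make the idea of a hub-style KPI layer concrete, the sketch below registers source feeds once and defines KPIs over all connected feeds. It is a hypothetical illustration only, not a specific product API; the class, feed names and field names are invented for the example.

```python
# Hypothetical sketch of a hub-style KPI layer: operational sources
# connect their feeds once, and KPIs are defined centrally and computed
# over whatever feeds are currently connected.

class DataHub:
    def __init__(self):
        self.feeds = {}  # source name -> list of records
        self.kpis = {}   # KPI name -> function over all records

    def connect(self, source, records):
        """Register (or replace) a source feed in the hub."""
        self.feeds[source] = records

    def define_kpi(self, name, fn):
        """Define a KPI once, independently of which sources exist."""
        self.kpis[name] = fn

    def compute(self, name):
        """Evaluate a KPI over the union of all connected feeds."""
        all_records = [r for feed in self.feeds.values() for r in feed]
        return self.kpis[name](all_records)

hub = DataHub()
hub.connect("cards", [{"amount": 120.0}, {"amount": 80.0}])
hub.connect("loans", [{"amount": 300.0}])
hub.define_kpi("total_volume", lambda rs: sum(r["amount"] for r in rs))
print(hub.compute("total_volume"))  # → 500.0
```

Adding a new market or application then means connecting one more feed; the KPI definitions stay unchanged, which is the agility the hub architecture aims for.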
But while data hubs provide some answers to the question of data consolidation, and may seem simple enough,
there are still a number of risks which financial institutions face when attempting to unite data into a single source.
Figure: A data hub architecture. Operational applications write data once into a data lake ((un)structured, unrefined) and a data warehouse (structured for analytics); business intelligence jobs then deliver KPIs for the business and analytics/ML/AI, with data monitoring, data security and data governance applied throughout.
How a well-defined
migration framework
can help
At Critical Software, we see data consolidation as the process
that collects all relevant data for the entire organisation, a
department or a line of business from wherever it is stored;
removes any redundancies or inconsistencies; and normalises
it in order to create a safe, accurate and clear single source of
truth. This can then be exploited by standard visualisation tools
or other applications in a safe, simple and consolidated way.
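The collect–deduplicate–normalise steps described above can be pictured in a small Python sketch. This is an illustration only: the record fields, the sources and the precedence rule are hypothetical, not part of any specific framework.

```python
# Hypothetical sketch: consolidate customer records from several source
# systems into one normalised, deduplicated single source of truth.

def normalise(record):
    """Normalise one raw record: unify types, trim whitespace, fix casing."""
    return {
        "customer_id": str(record["id"]).strip(),
        "name": record["name"].strip().title(),
        "country": record.get("country", "").strip().upper(),
    }

def consolidate(*sources):
    """Merge records from all sources, removing duplicates by customer_id.

    Later sources win on conflict – a stand-in for whatever precedence
    rule the business side agrees on.
    """
    single_source_of_truth = {}
    for source in sources:
        for raw in source:
            record = normalise(raw)
            single_source_of_truth[record["customer_id"]] = record
    return list(single_source_of_truth.values())

crm = [{"id": 17, "name": "ada lovelace ", "country": "uk"}]
billing = [{"id": "17", "name": "Ada Lovelace", "country": "UK"},
           {"id": "42", "name": "alan turing", "country": "uk"}]

records = consolidate(crm, billing)
print(len(records))  # → 2: the duplicate customer 17 collapses into one record
```

The essential point is the order of operations: normalisation happens before deduplication, so records that differ only in formatting are recognised as the same entity.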
Figure: Critical Software's migration framework. 1) Build: set the stage – assessment. 2.1) Operate: inception & architecture – research, analysis, design, certification. 2.2) Operate: delivery – data analytics, development, Scrum, data science, data engineering. 3) Transfer. The framework is supported by Centres of Excellence in data analytics & business intelligence, artificial intelligence & machine learning, user experience & user interface, architecture & development, Agile & Scrum, security, and data engineering.
AGILE

Using Agile methodologies for a data consolidation project enables organisations to ensure business continuity whilst their existing landscape is still operational and can provide results as it always has. As we build the new landscape around the existing legacy systems with incremental change and replacement, organisations benefit from an unbroken data analytics service while their solution is transformed gradually. Because of that, organisations will achieve an earlier return on investment, better visibility and transparency, a more adaptive approach and higher quality.

INCEPTION & ARCHITECTURE

The analysis team will refine and prepare the backlog to be integrated into the following sprints. This is a mixed team between business and client, with a project owner from the customer side and a business/technical analyst from Critical, supported by business domain experts. The technical leader will adjust the architecture to fit the new requirements.

DELIVERY

The scrum teams will iterate within each sprint on the implementation of the backlog previously defined. The main focus is on data-related areas, but the work also includes documentation, prototyping, support for key users, training, testing and bug fixing.

Through user experience design (UxD) techniques like mapping, easy-to-use interfaces can be built which clearly label data, allowing users to identify with ease the information they are looking for.

CONSOLIDATE ELEMENT-BY-ELEMENT

Using Agile allows for element-by-element consolidation. We typically recommend migrating frameworks either KPI by KPI or market by market, or taking any other element-focused approach. In any case, we focus on certain areas and consolidate them step by step.

Using this approach, financial services companies can ensure continuous operations while also facilitating data cleansing and data validation during the consolidation process. This requires significant effort and involvement from the business side, but ensures that no uncleansed data is migrated and that the consolidated KPIs and reports deliver business-validated results.
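The element-by-element approach can be pictured as a simple loop: each element (a KPI, a market) is cleansed and business-validated before it is migrated, and anything that fails validation stays on the backlog for a later sprint. A minimal sketch, with hypothetical element and field names:

```python
# Hypothetical sketch of element-by-element consolidation: each element
# is cleansed and validated; only validated elements are migrated, so no
# uncleansed data reaches the new platform.

def consolidate_elements(elements, cleanse, validate, migrate):
    migrated, backlog = [], []
    for element in elements:
        cleansed = cleanse(element)
        if validate(cleansed):       # stand-in for business-side sign-off
            migrate(cleansed)
            migrated.append(cleansed)
        else:
            backlog.append(element)  # revisit in a later sprint
    return migrated, backlog

# Toy example: KPIs migrate one by one; a KPI without a business owner
# fails validation and stays on the backlog.
kpis = [{"kpi": "revenue", "owner": "finance"},
        {"kpi": "churn", "owner": ""}]

target = []  # stand-in for the new platform
migrated, backlog = consolidate_elements(
    kpis,
    cleanse=lambda e: {**e, "kpi": e["kpi"].strip().lower()},
    validate=lambda e: bool(e["owner"]),
    migrate=target.append,
)
print([e["kpi"] for e in migrated], [e["kpi"] for e in backlog])
# → ['revenue'] ['churn']
```

The validation gate is the point where the business involvement described above enters the process: it decides what counts as clean, migratable data for each element.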
Figure 4: The DGI Data Governance Framework © The Data Governance Institute. Its components include goals, focus areas, metrics & success measures and funding (the 'what'); establishing decision rights, accountabilities, the data governance office and stakeholder care and support (the 'who'); and preparing a roadmap (the 'when').
The previously separated worlds of analytics roles and data roles are coming together, impacting not only technology but also the people and processes that support and use it. Consolidation also works to optimise the business – a financial institution which has a good grasp of the data it is using is far better able to adapt to its customers and to the market, improving its offering and subsequently increasing revenues.
Therefore, for Critical Software, business involvement is key to any data consolidation project. Any kind of
solely technical approach for a data consolidation project is bound to fail. Adopting a holistic, business-led view
of a project allows organisations to embrace new data services while ensuring any legacy systems that need to
be maintained are migrated successfully.
At Critical Software, we see data consolidation as the process of putting an organisation's data in an integrated environment, delivering clear business benefits.
Critical Software has been involved in various data consolidation projects and our migration framework,
combined with agile methodologies, has ensured their development efficacy and efficiency. Most importantly,
however, our approach has ensured business involvement throughout every phase of the project – avoiding any
messy communications and keeping the project on track.