
Level Up to Instant Action

A Comprehensive Guide for Enterprise Architects to Evaluate Stream Processing Technologies

Instant action is the next wave of business innovation.

Imagine the business value you could unlock by taking instant action in the moment of opportunity instead of after the fact.

For example: tracking deliveries or managing inventory in real time to enhance customer experience. Customizing interactions with customers while they’re browsing online. Confidently processing hundreds of thousands of cross-border instant payments per second. Declining fraudulent activity during the transaction. Continuously measuring risk. Sending alerts to service equipment before it breaks down. That’s the power of acting instantly on streaming data.

As the demand for instant action accelerates and requires new data technologies, the widespread availability of streaming data – often referred to as data in motion – combines with real-time stream processing to give businesses a clear advantage: they can jump on opportunities now rather than later. And while it can be argued that the traditional analytics model of storing data, querying it by analysts, and then acting on insights still has a role to play, the real opportunity is to take instant action on data as it is created.

Streaming data technologies exist to transform streaming data, or data in motion, into instant action. This guide will help you understand your options and how to evaluate them.

The increase in streaming data availability, advances in data processing, and the shift in consumer behavior toward instant gratification have created a new imperative for doing business in the digital economy. Innovative business leaders, architects, and application developers are working together to level up to real-time action. Whether you are modernizing applications or building next-generation apps to power your organization, this guide will help you select the right data processing technology to compete and thrive.

What’s in this guide?

Based on our work helping Fortune 100 customers add real-time capabilities to their applications, we created this guide about stream processing technologies that specifically support instant action. In it, we cover:

• Definition and fundamentals of real-time stream processing
• Key components and architecture of streaming data systems
• How to evaluate streaming data technologies
• Tips for choosing the right technology
• Implementation and integration strategies
• Best practices and case studies

Introduction

Batch processing vs. real-time stream processing

Let’s start with the basics and look at the distinction and relationship between batch processing and real-time stream processing.

Batch processing is when the processing and analysis happen on data already stored in a database, data warehouse, or data lake.

With batch processing, you’re working from what happened and analyzing data in fixed intervals. That’s because a pool of data must be collected before the data is processed, which delays action and opportunities to drive business outcomes.

[Figure: Batch processing – data sources feed batch processing, followed by query and analysis, followed by a wait. Real-time stream processing – a stream processing engine turns incoming data directly into instant action.]
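To make the distinction concrete, here is a minimal, illustrative Java sketch of our own (not from any specific product; the one-hour interval and the callback names are hypothetical). The batch style processes whatever has accumulated in storage on a fixed schedule, while the streaming style acts on each event the moment it arrives:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;
import java.util.function.Supplier;

public class BatchVsStream {

    // Batch model: collect first, process later. Analysis runs on a fixed
    // schedule over whatever has accumulated in storage; events that arrive
    // in between wait for the next run.
    static void runBatch(Runnable queryStoredData) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(queryStoredData, 0, 1, TimeUnit.HOURS);
    }

    // Streaming model: process each event as it arrives,
    // before it is ever written to a database.
    static void runStream(Supplier<String> nextEvent, Consumer<String> actOnEvent) {
        while (true) {
            actOnEvent.accept(nextEvent.get());
        }
    }
}
```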

Real-time stream processing is when data processing and analysis happen while the data is still in motion – before it is stored.

Real-time stream processing enables organizations to detect issues and opportunities automatically and take action instantly.

With stream processing, you’re working with data in motion – working from what is happening right now.

Utilizing the appropriate streaming data technologies empowers applications to seamlessly handle streaming data, enrich it with contextual information, and take automated actions, all before the data is written into the database. This ensures efficient processing, enhanced data quality, and optimal workflow management.

Advantages of real-time stream processing:

1. Detect and act instantly
Real-time stream processing continuously computes the data – eliminating delays in uncovering opportunities and threats that require immediate action.

2. Improve accuracy
Real-time stream processing can analyze high-velocity streams constantly at very small intervals, ensuring a level of granularity that is missed in batch analysis.

3. Save money
Real-time stream processing reduces server and storage costs by analyzing data immediately and identifying which data elements should be stored directly, aggregated, or discarded.
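To ground the “detect and act instantly” advantage, here is a minimal, hypothetical Java sketch: the broker address, the `transactions` topic carrying plain numeric amounts, and the 10,000 threshold are all assumptions for illustration. It inspects each event as it arrives, before anything is written to a database; a production deployment would typically run this logic inside a stream processing engine rather than a bare consumer loop:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class InstantActionSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed local broker
        props.put("group.id", "fraud-check");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("transactions")); // hypothetical topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                for (ConsumerRecord<String, String> record : records) {
                    double amount = Double.parseDouble(record.value()); // assume value is an amount
                    if (amount > 10_000) {
                        // Act in the moment: decline, alert, or route for review,
                        // before the event ever reaches a database.
                        System.out.println("ALERT: suspicious transaction " + record.key());
                    }
                }
            }
        }
    }
}
```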

Categories and uses of real-time stream processing

Real-time processing is used in two key ways:

1. To deploy analytical systems
Reduce the lag time between data creation and queries for end users to minimize the pitfall of stale data. The analytical model is supported by streaming databases such as Apache Druid, Apache Pinot, Rockset, SingleStore, etc.

2. To deploy operational systems
Enable applications to automate instant actions such as alerts, system optimizations, and live dashboard updates.

[Figure: Events from the Internet of Things, interactions, data products, and transactions flow into an event stream processing platform, which delivers messages to an operational application, an operational database, and an analytical database.]

Many people focus on the analytical model because it follows the familiar batch-processing paradigm of store data > query by analysts > act on insights. However, the operational model offers a significant opportunity to enable real-time action through the power of automation – going straight from event to action.

This guide focuses on the operational application model

In this guide, we focus on the operational systems that enable real-time action.

We believe the operational model is under-utilized in business for two key reasons:

1. It goes against the traditional “database-centric” mentality in which storing data is the primary goal.

2. Most data architectures are simply not equipped to handle instant action on data. Batch-oriented systems make it extremely difficult to achieve real-time action.

The real-time advantage presents opportunities that go far beyond “after the fact” reporting and analytics, and it’s these opportunities we hope to help you leverage.

Understanding Streaming Data Technologies

Events are the fuel.

An event is anything that happens at a clearly defined time and can be specifically recorded. Transactions are initiated by an event that triggers business processes that span multiple systems. For example, an e-commerce transaction begins with events from the consumer’s clickstream; it then requires both queries and transactions on the customer database, inventory database, shipping system, and payment system. Traditionally, this has required time-consuming intermediate steps to run periodically across multiple systems throughout the business process.

Event data is the data about events, actions, and behaviors that trigger business processes. A consumer signs up for an account, logs in, logs out, checks their account balance… Alone, each of these events may not be significant, but when taken together as a series of events and given adequate context, you can start to analyze trends, patterns, signals, and anomalies upon which to take action.

In today’s real-time world, data processing is enabled instantly through a single, multi-function system that draws upon all of the underlying subsystems simultaneously.
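To picture what gets recorded, here is a minimal sketch of an event as a Java record (Java 16+); the field names are illustrative, not a standard schema:

```java
import java.time.Instant;

// An event captures what happened, to whom or what, and exactly when.
public record AccountEvent(String accountId, String type, Instant occurredAt) {}
```

A single LOGIN or BALANCE_CHECK event says little on its own; a keyed, time-ordered series of such events, enriched with context, is what a stream processor mines for trends, patterns, signals, and anomalies.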

By analyzing the constant and continuous flow of event data, business leaders can gain insights into system performance, user behavior, user intent, and other valuable information that can drive decision-making and improve operational efficiency.

IDC predicts that by 2025, event streaming technologies will be used by 90% of the Global 1000 to deliver real-time intelligence and improve outcomes, such as customer experience.
What is streaming data?

Streaming data refers to the continuous flow of data generated from a wide range of sources, steadily and in real time. A constant stream of data is being generated all around us, transforming entire industries. E-commerce and online transactions, financial markets, social media, the Internet of Things (IoT), and AI/ML are all contributing to the volume and velocity of streaming data.

[Figure: Real-time stream processing. Event data – web server logs, sensor data, application logs, social media activity, financial transactions, clickstream data, photo uploads – flows into a stream processing engine, which surfaces trends, patterns, signals, and anomalies and triggers instant actions such as fraud checks, product recommendations, upsells, and system alerts.]

The emergence of streaming data technologies

Some people think of streaming data technology as the evolution of CEP (Complex Event Processing). CEP, which mostly focuses on high-level pattern matching, doesn’t take advantage of the value of granular real-time data, so it doesn’t address the demands of real-time business.

“There’s growing appetite for streaming data in a wide range of industries and use cases because companies are under ever-greater pressure to enable rapid time-to-insight from numerous data sources and types.”
- Jason Stamper, research manager, IDC

While streaming data, aka “data in motion,” is often eventually captured in a database or other store, it’s especially valuable while it’s in motion, before it hits a database, data warehouse, or data lake. This is why real-time streaming technology has emerged. Banks use stream processing to detect fraud as it’s happening and take action instantly. Retailers use stream processing to instantly customize product recommendations and pricing while a shopper is visiting a website, based on current and past purchases and browsing history.

Key components and streaming data architecture

Message bus
Provides a highly efficient and scalable event data storage system, used for managing and exchanging information between different services and systems.
Sample technologies: Apache Kafka, Apache Pulsar, AWS Kinesis

Stream processing engine
Lets you ingest, query, analyze, process, and deliver streaming data as it flows.
Sample technologies: Hazelcast, Apache Flink, Apache Storm

Reference database
High-speed storage of historical data, typically sourced from other systems of record, that can be merged with streaming data to enrich and add context to detect opportunities and threats and take instant action.
Sample technologies: Hazelcast, Redis, MongoDB

State store
A storage engine for managing state maintained by a stream processor. Stream processing platforms keep data in state stores (internal buffers) temporarily to support multi-stage streaming data flow pipelines; calculations on moving time windows; and checkpoints that support recovery and restarts.
Sample technologies: Hazelcast, RocksDB

Streaming database*
A data store designed to collect, process, and/or enrich an incoming series of data points (i.e., a data stream) in real time, typically immediately after the data is created, for downstream analytics.
Sample technologies: Hazelcast, Apache Druid, Apache Pinot, Rockset, SingleStore

*Streaming databases are part of the analytical model and are not discussed further in this guide, where we specifically focus on technologies that support real-time action.

[Figure: A typical stream processing architecture. Data sources feed a message bus (Apache Kafka / Apache Pulsar / Kinesis); stream processing applications ingest from the bus and run all operations on a stream processing engine backed by a reference database, systems of record, and a state store; results flow to data sinks and streaming databases.]
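To show how these components connect in practice, here is a minimal sketch using Hazelcast’s Pipeline API (5.x) with its Kafka connector. The broker address, the `orders` topic, and the `customers` reference map are hypothetical; the pipeline reads events off the message bus, enriches each one against a reference data store, and writes the result to a sink:

```java
import java.util.Properties;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.jet.kafka.KafkaSources;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;

public class ArchitectureSketch {
    public static void main(String[] args) {
        Properties kafkaProps = new Properties();
        kafkaProps.setProperty("bootstrap.servers", "localhost:9092"); // assumed local broker
        kafkaProps.setProperty("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        kafkaProps.setProperty("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        Pipeline pipeline = Pipeline.create();
        pipeline
            // Message bus: ingest events from a (hypothetical) Kafka topic.
            .readFrom(KafkaSources.<String, String>kafka(kafkaProps, "orders"))
            .withoutTimestamps()
            // Reference database: enrich each event with context held in an IMap,
            // looked up here by the Kafka record key (e.g., a customer ID).
            .mapUsingIMap("customers", e -> e.getKey(),
                    (order, customer) -> order.getValue() + " | customer=" + customer)
            // Data sink: in a real deployment this might be another topic,
            // a database, or an alerting service.
            .writeTo(Sinks.logger());

        HazelcastInstance hz = Hazelcast.bootstrappedInstance();
        hz.getJet().newJob(pipeline);
    }
}
```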

Comparison of streaming data technologies

Apache Flink
Component(s): Stream processing engine
Description: “Apache Flink is a framework and distributed processing engine for stateful computations over unbounded and bounded data streams. Flink has been designed to run in all common cluster environments, perform computations at in-memory speed and at any scale.” - flink.apache.org
Ideal use cases: Fraud detection, anomaly detection, rule-based alerting, business process monitoring, web applications (social networks)
Best known for: Popular stream processing engine, commonly used as the core engine by value-added cloud vendors

Apache Kafka
Component(s): Message bus
Description: “Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications.” - kafka.apache.org
Ideal use cases: Messaging, website activity tracking, metrics, log aggregation, stream processing, event sourcing, commit log
Best known for: Popular message bus used for event-driven architectures and real-time analytics, created/open-sourced by LinkedIn

Apache Pulsar
Component(s): Message bus
Description: “Apache® Pulsar™ is an open-source, distributed messaging and streaming platform built for the cloud.” - pulsar.apache.org
Ideal use cases: Streaming and queuing; Kafka, RabbitMQ, and SQS replacement
Best known for: Emerging alternative to Apache Kafka for messaging, includes the Kafka API

Apache Storm
Component(s): Stream processing engine
Description: “Apache Storm is a free and open source distributed real-time computation system.” - storm.apache.org
Ideal use cases: Real-time analytics, online machine learning, continuous computation, distributed RPC, ETL
Best known for: Once the most popular stream processing engine around, often associated with the Lambda Architecture for managing/analyzing big data

AWS Kinesis
Component(s): Message bus
Description: “Amazon Kinesis Data Streams is a serverless streaming data service that simplifies the capture, processing, and storage of data streams at any scale.” - aws.amazon.com/kinesis
Ideal use cases: Creating real-time applications, evolving from batch to real-time analytics, analyzing IoT device data, building video analytics applications
Best known for: A popular streaming data platform for AWS customers

Hazelcast
Component(s): Stream processing engine, reference database, state store, streaming database
Description: Hazelcast technology merges a real-time stream processing engine, fast data store, and operationalized machine learning into a hot data layer that powers real-time applications.
Ideal use cases: Fraud detection, anomaly detection, real-time offers/recommendations, predictive maintenance, AI/ML inference, rule-based alerting, business process monitoring, web applications (social networks)
Best known for: Extremely fast stream processing in a unified architecture that includes fast data storage, to simplify real-time deployments

MongoDB
Component(s): Reference database
Description: “MongoDB is a source-available cross-platform document-oriented database program. Classified as a NoSQL database program, MongoDB uses JSON-like documents with optional schemas.” - Wikipedia
Ideal use cases: Developing scalable applications with evolving data schemas
Best known for: Popular NoSQL document database that simplifies the storage of varying data structures and formats, with rudimentary streaming data capabilities

Redis
Component(s): Reference database
Description: “Redis is an open source (BSD licensed), in-memory data structure store used as a database, cache, message broker, and streaming engine.” - redis.io
Ideal use cases: Real-time analytics, including social media analytics, ad targeting, personalization, and IoT
Best known for: Popular in-memory key-value database widely used for low-cost caching, includes a storage model for streaming data

RocksDB
Component(s): State store
Description: “RocksDB is an embeddable persistent key-value store for fast storage.” - rocksdb.org
Ideal use cases: Viewing history and state of users on a website, spam detection applications, graph-search queries, caching data from Hadoop, message queues
Best known for: Simple data storage engine often embedded as the core technology in other database products

3 Key Criteria for Evaluating Streaming Data Technologies

Let’s turn to how you can evaluate streaming data technologies based on your business requirements and use cases, along with scalability, performance, and integration needs. In this section, we cover these three key criteria:

1. Identifying business requirements and use cases
2. Assessing scalability and performance capabilities
3. Evaluating integration capabilities with existing systems

1. Identifying business requirements and use cases
Successful projects and initiatives begin with understanding business requirements.
What are the purpose and goals for the initiative to bring real-time action into the
business? BNP Paribas Bank sought to expand revenue opportunities with retail banking
clients without having to make a huge resource investment. Their goal was to present
clients with the right financial product at the time the client was most likely to accept.

Streaming data technologies are ideal for situations when real-time interactions with
customers can immediately influence whether they make a purchase, realize a need,
open an account, or even commit fraud.

As you embark on the journey to put instant action to work for your organization, it’s
important to look at what you can do with real-time action and how these use cases
map back to business requirements.

Use Cases:

Instant action is great for business and gives you a competitive advantage in countless ways, including:

• Generate more revenue with real-time personalized offers
• Avoid costly losses with real-time fraud detection
• Reduce risk with real-time trade analytics
• Compete more effectively with fintech start-ups with real-time payments and banking
• Improve customer experience and loyalty with real-time tracking and inventory management in retail and logistics
• Get valuable alerts and insights from your connected devices with Internet of Things analytics
• Avoid disastrous failures and reduce the cost of repairing and replacing industrial equipment with predictive maintenance
• Identify more revenue opportunities in real time with gaming analytics
• Optimize resource allocation or identify threats with network monitoring
• Alert caregivers to non-obvious health risks with patient monitoring

Work with stakeholders to answer key questions, such as:

• What is the business trying to achieve?
• Where could we apply instant action to deliver business value?
• What could change if the business could act instantly on streaming data?
• What are the business requirements for latency, processing logic, and the volume and type of input data?
• Where will we have the most significant potential impact?
• What is the best use case(s) to start with? Why?
• What roadblocks might we encounter along the way?
• What limitations or challenges might we face when implementing the technology?
• How will we measure success?

This first step of identifying business requirements and use cases will form the foundation for evaluating streaming data technology options.

2. Assessing scalability and performance capabilities

For enterprise architects, scalability and performance are among the top concerns. Consider whether your chosen stream processing technology meets the following criteria:

• Able to handle changes in workload
• Can manage unexpected spikes in data volume
• Is highly resilient, as measured by service availability
• Can sustain usability and responsiveness in the face of workload and data volume increases and fluctuations

These are all critical considerations.

Contrary to popular belief, you don’t need to undergo an expensive or time-consuming bake-off between vendors. Instead, look for reputable published benchmarks and compare those to your business requirements. Benchmarking is a common and proven approach to identifying the best system for specific needs.

Leading vendors in the stream processing space will put forward their technology for benchmarking, so you can typically find these benchmark reports online. One such example is the Hasso Plattner Institute’s ESPBench Enterprise Stream Processing Benchmark.

Your goal in reviewing such benchmark reports is to determine how fast the technology can perform and how efficient it is. Then, determine if it will scale to your needs. Often, it’s better to select a technology that can perform beyond your current needs and then work with the vendor to right-size the solution for your organization.

Scaling down is far easier than realizing your chosen technology can’t handle your increasing needs, which often happens to businesses that don’t plan for tomorrow.

3. Evaluating integration capabilities with existing systems
Stitched together or unified? Like many other technology decisions, you must consider the
tradeoffs between leveraging technology you already have (even if it’s not ideal) versus adding
new technologies designed for your specific goals. The main objective is finding the best fit.
Maybe you already have a NoSQL database. How will you use it for your real-time architecture?
Does it have all the pieces for your real-time journey, or is it simply a part of your overall
solution? These should factor into your decision.

Simplicity is often a key requirement for new initiatives, and using what you already have often
appears to be the easiest. But simplicity doesn’t always look like what you think. Sometimes, it’s
more complex to use what you have. Quite often, the technology you’re adding aligns perfectly
with what you already have and ultimately reduces complexity in your architecture.

Choosing the Right Streaming Data Technologies for Your Unique Needs

Key Considerations for Required Capabilities

Let’s get down to the capabilities you should look at when selecting technology for stream processing:

Performance
Think throughput, latency, and efficiency. Look for proof that the system will perform, and refer to published benchmarks.

Scalability
Evaluating this capability requires planning. Consider your expectations to accommodate the extra load. Look at growth projections. Remember, it may be easier to add on with the same vendor than to suddenly realize the technology you chose can’t scale to your level.

Resilience
Bad things happen. A distributed disaster recovery strategy is essential for mission-critical deployments. Be sure your chosen technology provider can offer the benefit of resilience.

Enrichment
Tying together your streaming data with historical data is where the magic happens. Yes, you need streaming data. However, correlation with existing data is often underappreciated and under-valued in terms of how important and how difficult it is to implement. Stream processing sometimes requires external technologies (look for a vendor that integrates with data-in-motion and data-at-rest). Some technologies can do this in one platform to simplify the effort.

Stateful computations
Similar to our point about enrichment, consider the difficulty of integrating with other technologies to store state data as part of stream processing. Plan how you’ll architect this component and maintain it over time instead of running straight ahead and then realizing you need it.
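As a sketch of what stateful processing looks like in practice, here is a hypothetical windowed aggregation using Hazelcast’s Pipeline API (5.x); the `payments` map journal, account-ID keys, and one-minute window are all assumptions for illustration. The engine holds each key’s running total in its state store until the window closes, and the same stage could also be enriched against reference data, as in the earlier architecture sketch:

```java
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.jet.aggregate.AggregateOperations;
import com.hazelcast.jet.pipeline.JournalInitialPosition;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.Sources;
import com.hazelcast.jet.pipeline.WindowDefinition;

public class StatefulWindowSketch {
    public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p
            // Hypothetical source: an IMap named "payments" with its event journal
            // enabled, keyed by account ID with payment amounts as values.
            .readFrom(Sources.<String, Double>mapJournal("payments",
                    JournalInitialPosition.START_FROM_CURRENT))
            .withIngestionTimestamps()
            .groupingKey(e -> e.getKey())
            // Stateful computation: the engine keeps per-key running totals in its
            // state store and emits a result as each one-minute window closes.
            .window(WindowDefinition.tumbling(60_000))
            .aggregate(AggregateOperations.summingDouble(e -> e.getValue()))
            .writeTo(Sinks.logger());

        HazelcastInstance hz = Hazelcast.bootstrappedInstance();
        hz.getJet().newJob(p);
    }
}
```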

Matching technology features to your architecture

How will stream processing fit into your existing infrastructure? We’ve already discussed integration. Now, let’s evaluate how those choices lead to the features you’ll look for in stream processing technologies. Consider some common architectures in which you might deploy stream processing and how each approach impacts the features to look for:

Real-time stream processing
Think simplicity. Look for features, such as low-code connectors and a distributed processing framework, that support a simpler deployment that can connect multiple data sources and process them in a distributed and parallelized manner.

Digital integration hubs
Look for features like a built-in data store that lets you easily integrate data sources into a central repository. Similar to the above, simplicity is key.

Real-time machine learning inference (including anomaly detection and real-time recommendations)
Seek features that will help facilitate real-time machine learning, with machine learning inference and feature stores.

Event-driven microservices
Here, you need features that allow you to easily integrate with external technologies to quickly store stateful data and pass messages between microservices.

Considerations for deploying on-premises vs. cloud

You may have already decided whether you’re deploying your stream processing technology on-premises or in the cloud. Here we identify key considerations specific to stream processing:

Control
• On-premises: Ideal for companies that prefer to fully manage their technology stack, prefer to keep data control in-house, and have the staff/resources to maintain the technology.
• Cloud-based: Allows you to offload certain data management duties and maintenance costs. Ideal for companies that lack the internal resources to maintain on-premises solutions.

Cost
• On-premises: Suitable for companies that can handle the hardware, physical infrastructure, and maintenance costs associated with on-premises solutions.
• Cloud-based: Allows you to redirect the costs of maintaining and updating systems in-house to other business investments.

Performance
• On-premises: Performance is directly proportional to the underlying infrastructure and your ability to maintain, upgrade, or replace it as needed.
• Cloud-based: Takes advantage of the distributed resources of the cloud provider to place resources in the countries or regions that are consuming them. This results in lower latency and better overall performance, as well as conformance with local regulations regarding cross-border data movement.

Security
• On-premises: Allows you to retain more control over security, but requires you to have the appropriate staff to manage it.
• Cloud-based: Security is handled by the technology provider, so it’s important to look at their controls and the strength of their security teams. Consider how you can secure your deployments without losing performance and simplicity.

Cost analysis and budget considerations

As you set out to bring real-time action to your first use case, don’t forget about the bigger picture regarding ongoing requirements. For example, the total cost and budget will be an obvious consideration not only for the first use case but also for all subsequent use cases.

Since you will likely continue building more stream processing use cases, you want to choose the right technologies to simplify your journey. So even if the most popular technologies from the biggest companies seem like the safe choice now, you should consider what other cost ramifications will come into play when you expand your use of streaming data. You should look at the characteristics of the different technology options to see how they fit within your constraints. That will help you to get the best-fit technology that can serve your long-term needs.

To evaluate total cost, you need to think about the different technologies involved in the use case and the hardware requirements for various task-specific technologies. It’s not only about the incremental costs of newly added technologies, but also the financial commitment to adjacent technologies that ultimately might not be cost-effective.

A large part of managing the total cost comes down to consolidating technologies to reduce the total hardware footprint.

If you’re working with an on-premises or hybrid architecture, also consider that the more technologies you have, the more skillsets you need, the more burden on your team for maintenance, and the more points of failure you have to contend with.

Some of the unified platforms available today can help you cover all the capabilities you need while reducing architectural complexity.

3 Topics to Guide Your Implementation and Integration Plans

1. Planning the implementation process
2. Data ingestion and integration strategies
3. Optimizing and scaling streaming data architectures

1. Planning the implementation process

In any technology implementation, it’s important to start with the end in mind. Revisit the use case you set out to address and the desired business outcomes for your implementation of stream processing. These will be the guideposts by which you keep the project moving forward.

Fluidra – a major pool production, equipment, and accessories company – wanted to ensure that as inventory and prices changed on their e-commerce platform, customers could have a real-time view of what was available and at what price as they made inquiries. Going into the project with the end in mind allowed them to plan the implementation process more effectively.

In addition to knowing the business outcomes, it’s also important to specify the technical outcomes you’re looking to achieve. Fluidra’s existing digital platform was single-function/siloed, becoming out-of-date, and not capable of handling huge spikes in web visitor traffic. The company needed to process data much faster to support an augmented personalized customer experience. They also needed to integrate with other systems like Salesforce and logistics platforms. This prompted the design of a new architecture that would significantly accelerate data processing as well as decouple from systems that were inflexible and costly to maintain. Here they determined they would need a real-time processing technology that could deliver data immediately to the new services that they were building. The technical outcome was set.

2. Data ingestion and integration strategies

As you think about your use case, it’s important to identify which data you already collect that you can use, and what other data you need that you are not already collecting. Ingesting the right data is key to instant action, and it is a step that is often undervalued in the process. Getting the right data in, and leaving irrelevant data out, can have a huge impact on the quality of the entire project.

Along with ingesting the right data, you need to plan how you want the system to correlate the data to take the right actions. What will you do with the data you’re bringing in? The integration strategy is central to your success.

3. Optimizing and scaling streaming data architectures

In real-time stream processing, we have found that creating simplicity at the architecture level drives simplicity at the application level. In other words, take a top-down approach to your streaming data architecture. Try to avoid adding more and more components to your infrastructure and even replacing infrastructure if you can. Instead, manage risk by leveraging what you have already in place and filling in the capability gaps.

Based on our extensive work with customers and desire to help others start with best practices, we also recommend avoiding adding technical debt where possible. Look for quick wins that are consistent with your longer-term vision. It’s easy to ignore this advice for the sake of getting something done quickly, but that will only get you so far. Think about it now or you’ll be worrying about it later when facing impacts on performance, reliability, and/or security.

Best Practices for Designing and Architecting Streaming Data Systems

So far in this guide, we’ve given you many considerations for evaluating, choosing, and implementing real-time stream processing to help your organization uncover more opportunities. While there is much to think about, here we summarize three best practices for bringing your streaming data system to life:

01 Start with one use case.
Go for one win. Establish a use case that business teams agree on, stay focused on this one use case, and use the experience to build on other use cases in the future.

02 Leverage what you have.
Leverage what you already have as much as possible – don’t think of this as an overhaul. Take a top-down approach.

03 Keep collaborating.
Keep collaborating with the business teams on desired outcomes. This helps manage expectations and can prevent mistakes or oversights that inhibit project success.

Case Study: BNP Adds Real-Time Stream Processing to Promote Offers at the Right Time

BNP Paribas achieved a 400% increase in real-time loan offer conversions.

About BNP:
BNP Paribas Bank Polska, which has been listed on the Warsaw Stock Exchange since 2011, is a member of the BNP Paribas banking group, whose footprint spans 71 countries. In Poland, as a universal bank with a global reach, it provides services to retail customers and other segments including Wealth Management, microbusinesses, SMEs, and corporate banking.

Challenge:
The business challenge it faced was to increase the adoption of its products by its customer base. The bank’s marketing team identified a set of responses to specific customer situations, which would result in an offer for an upsell/cross-sell product. One straightforward offer would be to promote personal loans to any customer whose bank account balance was low and who could not withdraw the requested amount of cash from an ATM.

Solution:
As a fast and easy-to-use stream processing engine, Hazelcast was a natural fit for plugging into their publish/subscribe messaging bus, turning their environment into an event-driven architecture. This would allow them to act on events in real time, especially since they were already capturing information about customer interactions.

Results: 400% increase in conversions, 0% downtime.

Read the complete case study at hazelcast.com

Case Study: Fluidra Makes Waves With Real-Time Customer Experience

Fluidra overhauled its IT infrastructure with an accelerated timeline.

About Fluidra:
Fluidra is a multi-billion-dollar, multinational group listed on the IBEX 35, the benchmark index of the Spanish Stock Exchange. The company is the one-stop shop for everything related to pool production, equipment, and accessories. Core to the success of the business is its reliance on a dynamic and ambitious IT department, charged with driving innovation using technology to provide leading-edge customer experiences.

Challenge:
Fluidra was already reviewing its e-commerce platform as it recognized the need for modernization to future-proof its business. Due to their ERP-centric architecture, they also had existing latency and network bandwidth issues. The tight coupling with their ERP systems meant they were limited in their ability to modernize their front-end applications, making any digital transformation or migration to the cloud challenging. The pandemic further highlighted that the existing digital platform was single-function/siloed, becoming out-of-date, and not capable of handling huge spikes in user traffic.

Solution:
The Hazelcast stream processing component was chosen for the loading of stocks and prices for all products stored in shopping baskets in real time. Hazelcast was a crucial component in ensuring that as stock and prices changed, e-commerce customers had a real-time view of what was available and at what price as they made inquiries. All the pricing logic is hosted on Hazelcast to dynamically calculate unique pricing for each customer and product combination.

Results: 50-100% faster price calculations.

Read the complete case study at hazelcast.com

Final Thoughts

Taking instant action on streaming data when opportunities or risks present themselves enables companies to generate more revenue and avoid unnecessary costs and risks. Helping your organization uncover competitive advantages and business opportunities through real-time action is possible with the right stream processing technology and a smart approach that focuses on business value.

A range of streaming data technologies are available to help you transform real-time data into instant action. This guide has given you information to better understand your options and how to evaluate them.

Now it’s your turn to level up to instant action. Whether modernizing applications or building next-generation applications to power your organization, we hope you feel better equipped to select the right technology and help your organization compete and thrive in the next wave of digital innovation.

About Hazelcast

Act instantly on streaming data.

The world’s largest companies depend on Hazelcast to power their business-critical applications and instantly act on streaming data. As a global leader in real-time data technology, our mission is to equip businesses with the most intuitive and innovative real-time capabilities available to act instantly on streaming data to grow revenue, mitigate risk, and reduce costs.

Hazelcast’s technology uniquely combines a powerful real-time stream processing engine, a high-performance data store, and operationalized machine learning capabilities. This seamless integration forms a unified real-time data platform that empowers Fortune 500 companies with real-time applications.

Automate
Automate data architectures to enable instant action within windows of opportunity.

Streamline
Streamline data architectures to gain operational efficiency, including faster time to market and ROI.

Enhance
Enhance data architectures by future-proofing them to support business growth.

“In the streaming data space, it’s almost impossible to find a single platform that handles data preparation, reference data enrichment, streaming data processing, low latent storage, and machine learning inferences while supporting a handful of stream processing engines. Hazelcast is the rare vendor able to provide this functionality, and more, in a single solution.”
- Jelani Harper, GigaOm Analyst

For more information on how Hazelcast can help your company accelerate business outcomes with a unified real-time data platform, visit hazelcast.com.

Learn more at hazelcast.com
