Business Analytics

Reference
Architecture
2011 NATIONWIDE IT SUMMIT

ENABLE OUR PEOPLE


THE VALUE LAYERS OF THE BA REFERENCE ARCHITECTURE

From top to bottom, the value layers are:

• Analytics: information for the "masses"; actionable, embedded analytics
• Extended Enterprise Analysis: cross value chain information; advanced analytics; integrated performance management
• Dimensional: subject area focused; information access
• Data Warehouse: integrated, consistent
• Transaction Data: granular, operational transactions

THE BA REFERENCE ARCHITECTURE

The BA Reference Architecture is organized into eight pillars, each with its own components:

• Access: Web Browser, Portals, Devices, Web Services
• Advanced Analytics: Predictive Analytics, Data Mining, Simulation, Text Analytics, Optimization & Rules Management, Visualization
• BIBPM: Planning/Forecasting/Budgeting, Scorecards, Dashboards, Analysis, Reporting, Query, Monitoring
• Data Repositories: Dimensional Layer, Data Warehouses, Operational Data Stores, Time Persistent Repository, Master Data, Content Stores
• Data Integration: Extract/Subscribe, Initial Staging, Data Quality (Technical/Business), Clean Staging, Transforms, Load-Ready Publish, Load/Publish, Virtualization/Federation
• Master Data Management: SOA Orchestration Components, CRUD Transactional Components, Master Data Store, Data Load Components, Reference Data Management
• Content Management: Content Ingestion, Content Extraction, Content Services, Document Management Services, Records Management Services, Federation
• Sources: Enterprise, Unstructured, Transactional, Informational, External, Web

Cross-functional layers span all pillars:

• Business Process Management / Workflow
• Information Governance: Policy/Organization/Change Management, Data Architecture, Data Quality, Metadata
• Network Connectivity, Protocols & Access Middleware
• Hardware & Software Platforms

ACCESS LAYER

Access Layer components: Web Browser, Portals, Devices, Web Services

Access Layer Definition

The Access Layer provides the mechanism for end user or external
system interaction with the analytical, informational, and reporting
applications of the Information on Demand environment. The
interaction may be human or automated through a web service.

ACCESS LAYER

Access Architecture Component: Description

Web Browsers: A software front-end application used for retrieving and presenting information on the internet.

Portals: A method of consolidating multiple information sources on one site with common access control.

Devices: Any device attached to another computer or computing device that extends its functionality. The device has the ability to transmit and receive information.

Web Services: A service or set of services that can be accessed via the internet or other computing processing methods and that provide content, information, or processing services.

ADVANCED ANALYTICS LAYER

Advanced Analytics Layer components: Predictive Analytics, Data Mining, Simulation, Text Analytics, Optimization & Rules Management, Visualization

Advanced Analytics Layer Definition

The Analytics Layer uses data and models to provide insight to guide
decisions. It empowers clients to make more effective decisions and
build more productive systems based on more complete data,
consideration of all available options, and careful predictions of
outcomes and estimates of risk. This layer is typically composed of
various technological components designed to meet specific needs,
usually built from "best-of-breed" software and tools such as
SPSS and ILOG.

ADVANCED ANALYTICS LAYER

Advanced Analytics Architecture Component: Description

Data Mining: Focused on extracting patterns and previously unknown facts from large volumes of data. It helps businesses uncover key insights, patterns, and trends in data, then uses this insight to optimize business decisions. Data mining techniques can be divided into major categories including classification (arranging data into predefined groups), clustering (similar to classification, but the groups are not predefined), and regression (statistical analysis between a dependent variable and one or more independent variables).
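The three categories named above can be illustrated with minimal, standard-library-only sketches. The functions, data, and thresholds below are invented for illustration; production data mining would rely on dedicated tooling.

```python
# Minimal sketches of the three data mining categories: classification,
# clustering, and regression. All data and thresholds are illustrative.

def classify(amount, threshold=1000.0):
    """Classification: arrange records into predefined groups."""
    return "large" if amount >= threshold else "small"

def kmeans_1d(points, iters=10):
    """Clustering: group similar values; the groups are not predefined."""
    centers = [min(points), max(points)]  # start with two extreme centers
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            groups[nearest].append(p)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers

def linear_regression(xs, ys):
    """Regression: fit y = a + b*x between dependent and independent variables."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # (intercept, slope)

print(classify(2500.0))                         # "large"
print(kmeans_1d([1, 2, 3, 100, 110, 120]))      # two cluster centers
print(linear_regression([1, 2, 3], [2, 4, 6]))  # (0.0, 2.0)
```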
Optimization: Helps companies improve operational efficiency while also providing centralized business rules in robust repositories that can be used across applications. Business Optimization leverages advanced mathematical techniques to find the best solution to a complex problem with many decision options and constraints. It is a powerful analytical tool for calculating the best possible utilization of resources to help achieve a desired business result, such as reducing cost or processing time, or increasing profit, serviceability, and throughput.
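The resource-utilization idea above can be sketched with a toy product-mix problem. A real deployment would use an LP/MIP solver (the deck mentions ILOG); this standard-library sketch simply enumerates a small integer grid, and every number in it is invented for illustration.

```python
# Toy optimization: choose production quantities under two resource
# constraints to maximize profit, by brute-force enumeration.
# Coefficients are illustrative, not from any real business problem.

def best_mix(max_hours=100, max_material=80):
    # Product A: 2 h labor, 1 unit material, $30 profit each (illustrative)
    # Product B: 1 h labor, 2 units material, $20 profit each (illustrative)
    best = (0, 0, 0)  # (profit, qty_a, qty_b)
    for a in range(max_hours // 2 + 1):
        for b in range(max_material // 2 + 1):
            if 2 * a + b <= max_hours and a + 2 * b <= max_material:
                best = max(best, (30 * a + 20 * b, a, b))
    return best

profit, qty_a, qty_b = best_mix()
print(profit, qty_a, qty_b)  # 1600 40 20
```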
Predictive Analytics: Analyzes patterns found in historical and current transaction data, as well as attitudinal survey data, to predict potential future outcomes. The core of predictive analytics relies on capturing relationships between explanatory variables and developing models to predict future outcomes.
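The flow described above, capturing a relationship between an explanatory variable and an outcome from historical records and then scoring new cases, can be sketched as follows. The churn data and segment names are invented for illustration.

```python
# Hedged sketch of predictive analytics: learn per-segment churn rates
# from historical (segment, churned) records, then use them as scores.
from collections import defaultdict

def fit(history):
    """history: list of (segment, churned) pairs -> churn rate per segment."""
    counts = defaultdict(lambda: [0, 0])  # segment -> [churned, total]
    for segment, churned in history:
        counts[segment][0] += int(churned)
        counts[segment][1] += 1
    return {seg: c / t for seg, (c, t) in counts.items()}

# Invented historical records for illustration.
history = [("new", True), ("new", True), ("new", False),
           ("loyal", False), ("loyal", False), ("loyal", True),
           ("loyal", False)]
model = fit(history)
print(model["new"])    # 2 of 3 "new" customers churned
print(model["loyal"])  # 1 of 4 "loyal" customers churned
```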
Rules Management: Used to define, deploy, execute, monitor, and maintain the variety and complexity of business logic that is used by operational systems within an organization. Business rules are typically written using IF/THEN statements, decision tables, decision trees, and scorecards. Business rules describe the operations, business logic, and constraints that apply to an organization in achieving its goals.
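The IF/THEN style of rule mentioned above can be sketched as a small decision table evaluated top to bottom. The rule conditions and action names are invented for illustration.

```python
# A tiny rules-management sketch: ordered (condition, action) pairs,
# with a catch-all default at the end. Rule contents are illustrative.

RULES = [
    (lambda o: o["amount"] > 10_000,        "manual_review"),
    (lambda o: o["customer_type"] == "new", "credit_check"),
    (lambda o: True,                        "auto_approve"),  # default
]

def apply_rules(order):
    """Return the action of the first rule whose condition matches."""
    for condition, action in RULES:
        if condition(order):
            return action

print(apply_rules({"amount": 50_000, "customer_type": "loyal"}))  # manual_review
print(apply_rules({"amount": 200, "customer_type": "new"}))       # credit_check
print(apply_rules({"amount": 200, "customer_type": "loyal"}))     # auto_approve
```

A dedicated rules engine adds versioning, monitoring, and shared repositories on top of this basic match-first-rule evaluation.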
Simulation: Replicates a system, process, behavior, or business problem using advanced analytical techniques. It is used to perform "What If" analysis based on a set of parameters and input variables. Simulation models a business process to estimate the impact of management decisions or changes. Simulation enables companies to reproduce the dynamic behavior of a business process, in order to analyze workloads and potential bottlenecks. Simulation is an example of how technology can aid smarter decision-making. It can predict, for example, what will happen to an area if a new major facility is built, leading to improved planning of roads and public transportation. By providing insight into the impact of decisions and design alternatives, simulation can help companies determine the optimal path forward.
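A minimal "What If" sketch of the idea above, using Monte Carlo sampling: estimate how often daily demand exceeds capacity under an assumed demand distribution, then ask what changes if capacity is added. The distribution and all parameters are invented for illustration.

```python
# Monte Carlo "what if" sketch: probability of overload under an
# assumed normal demand distribution. Parameters are illustrative.
import random

def prob_overload(capacity, mean_demand, trials=100_000, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    over = sum(rng.gauss(mean_demand, 15) > capacity for _ in range(trials))
    return over / trials

# What happens if we add capacity?
print(prob_overload(capacity=100, mean_demand=90))  # frequent overload
print(prob_overload(capacity=130, mean_demand=90))  # rare overload
```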
Text Analytics: Analysis of textual patterns to provide business insight. It infers meaning from unstructured data.

Visualization: Allows clients to gain insight through diagrams, maps, schedules, charts, and images. Visualization techniques manipulate, transform, and render data based on points, lines, areas, volumes, images, or geometric primitives in any combination. Visualization is a powerful tool for understanding the behavior of complex systems. By building graphical displays, users can easily understand and interpret large volumes of data. Visualization is applicable to a wide range of business problems including financial trend analysis, monitoring of traffic and communication systems, analysis of social networks, and arrangement of large-scale text and image data.

ADVANCED ANALYTICS REFERENCE ARCHITECTURE

Advanced Analytics Reference Architecture techniques, by pillar component:

• Predictive Analytics: Generalized Linear Models, Multivariate Analysis, Exploratory Data Analysis, Experimental Design, Survival Analysis, Forecasting
• Data Mining: Hypothesis Testing, Longitudinal, Decision Trees, Neural Networks, Social Media Analytics, Segmentation & Clustering
• Simulation: Queuing Theory, Discrete Event, Agent Based, System Dynamics, Monte Carlo
• Text Analytics: Text Analytics
• Optimization & Rules Mgmt: Linear Programming, Integer Programming, Constraint Programming, Nonlinear Optimization, Business Rules
• Visualization: Geo-Spatial, Charts/Graphs, Networks & Linkage, Animation, Trees

Cross-functional layers: Business Process Management / Workflow; Information Governance; Metadata; Data Quality; Network Connectivity, Protocols & Access Middleware; Hardware & Software Platforms

AAO Reference Architecture Definition

The AAO Reference Architecture contains the components from the Advanced Analytics pillar of the BA Reference Architecture. The components in common between the two have different points of emphasis in the different reference architectures.

BUSINESS INTELLIGENCE AND BPM LAYER

BIBPM Layer components: Planning/Forecasting/Budgeting, Scorecards, Dashboards, Analysis, Reporting, Query, Monitoring

BIBPM Layer Definition

The BIBPM Layer empowers decision making and improved
business performance through the timely access, analysis, and
reporting of actionable, accurate, and personalized information. A
variety of applications may be supported, from static reporting to
balanced scorecards to monitoring tools that are embedded within an
operational process. This layer is typically composed of various
technological components designed to meet specific needs, usually
built from "best-of-breed" software and tools such as data
access models, query tools, and OLAP reporting tools.

BUSINESS INTELLIGENCE AND BPM LAYER

BIBPM Architecture Component: Description

Planning, Forecasting, Budgeting: Planning, Budgeting & Forecasting leverages analytics to align financial and operational plans, understand target values for key categories of revenue and expenditure, and evaluate expected business outcomes. It measures progress against leading industry best practices for the purpose of identifying opportunities to better link strategy to action, optimize budget allocations, and perform what-if analysis.

Dashboards & Scorecards: Dashboards & Scorecards provide a mechanism to translate corporate strategy into measures, targets, and initiatives across an organization and to achieve the visibility required to manage corporate performance. Dashboards are used by business managers to take immediate action and improve day-to-day business performance. They have a limited time horizon and are updated weekly and/or daily. Dashboards typically use leading indicators, provide drilldown capabilities, and leverage business activity monitoring and exception alerts. Scorecards are used by executives to perform cross-functional monitoring of progress towards achieving business strategy. They usually leverage historical indicators and provide limited drilldown capabilities. Scorecards have a longer time horizon and are updated monthly/quarterly.

Analysis & Reporting: Business Analytics & Reporting provides the ability to connect disparate, disconnected, and non-integrated data from departmental and functionally siloed sources into a consistent, commonly defined, and governed reporting format to enable timely and accurate analytical and reporting capabilities. It helps personalize information delivered to the user community and defines the "why" and "how" behind historically focused "what happened" analysis. Business Analytics & Reporting includes setting future direction, defining measures and targets, managing value drivers and analytic dimensions, and enabling insight and vision around business events. It evaluates business requirements and current enterprise-wide reporting processes in order to leverage the information architecture in the most effective and efficient manner across the organization.

Query: Relational ad hoc query and reporting capabilities.

Monitoring: Operational reporting and real-time monitoring of key performance indicators (KPIs) and operational metrics.
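At its simplest, the KPI monitoring described above reduces to threshold checks that raise exception alerts. A hedged sketch with invented KPIs and limits:

```python
# KPI monitoring sketch: compare current metrics against limits and
# report breaches. KPI names and limits are invented for illustration.

KPI_LIMITS = {"error_rate": 0.05, "avg_response_ms": 500}

def check_kpis(metrics):
    """Return the list of KPIs currently breaching their limit."""
    return [name for name, limit in KPI_LIMITS.items()
            if metrics.get(name, 0) > limit]

print(check_kpis({"error_rate": 0.02, "avg_response_ms": 700}))  # ['avg_response_ms']
print(check_kpis({"error_rate": 0.01, "avg_response_ms": 200}))  # []
```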

DATA REPOSITORY LAYER

Data Repository Layer components: Dimensional Layer, Data Warehouses, Operational Data Stores, Time Persistent Repository, Master Data, Content Stores

Data Repository Layer Definition

The Data Repository Layer contains the databases, data stores, and
related components that provide most of the storage for the data which
supports a BA environment. The Data Repository Layer's repositories are
not a replacement or replica of the operational databases which reside in the
Data Source Layer, but a complementary set of data repositories that
reshape data into formats necessary for making decisions and managing a
business. These database structures are represented by conceptual, logical,
and physical data models and data model types (e.g. 3NF, star/snowflake
schemas, unstructured, etc.).

DATA REPOSITORY LAYER

The types of data stores for analytic purposes are neither new nor groundbreaking. They represent the natural evolution of analytic data stores based on the dynamics of how information is used today. These design patterns in the data models represent the types of structures used in every analytic data store, so that our teams address the development of analytic data stores in a common and consistent manner.

Data Repository Architectural Layer Definitions

Dimensional Layer: Dimensional models are developed to support a single business function or process. A dimensional model is usually a subset of information found in the data warehouse, further transformed and re-shaped for a specific analytical application. It can contain both current and historical data, and typically contains summarized and aggregated data. Dimensional models are usually modeled as star schemas or snowflake models, and can be instantiated as views, materialized views, or tables.

Data Warehouses: A Data Warehouse is the main store of analytic information. The DW provides business subject area orientation in order to easily rationalize data from multiple subjects (such as different lines of business) and source systems. The data model is structured through its key designs to be able to accommodate updates either by traditional change data capture processes or by snapshot updates. Because of the dynamic, multiple-source-system and business subject area integration and change data capture, the optimal data model approach is 3rd Normal Form. These 3rd normal form Data Warehouse models are not transactional in format, but designed for optimal loading and reading of analytic information. A Data Warehouse may contain historical data, as well as rationalized transactional data, aggregated data, and derived/calculated data.

Operational Data Stores: An Operational Data Store (ODS) is the main repository and integration point of operational data from disparate systems. The ODS stores transaction-level detailed data used to satisfy common, integrated, enterprise-level operational data needs. The ODS contains current, non-redundant detailed data common across multiple systems or organizational units. ODSs are often the data stores used for Master Data Management hub environments instantiating core subject areas such as Customer and Product.
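The star schema mentioned above, one fact table joined to surrounding dimension tables, can be sketched with sqlite3. All table, column, and product names below are illustrative, not taken from the deck.

```python
# Star schema sketch: a fact table (fact_sales) keyed to two dimension
# tables (dim_product, dim_date). Names and data are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date    (date_id    INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, amount REAL);
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "Auto Policy"), (2, "Home Policy")])
conn.executemany("INSERT INTO dim_date VALUES (?, ?)", [(1, 2010), (2, 2011)])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 1, 100.0), (1, 2, 150.0), (2, 2, 80.0)])

# Typical dimensional query: aggregate facts by dimension attributes.
rows = conn.execute("""
    SELECT p.name, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date d    ON d.date_id    = f.date_id
    GROUP BY p.name, d.year
    ORDER BY p.name, d.year
""").fetchall()
print(rows)  # [('Auto Policy', 2010, 100.0), ('Auto Policy', 2011, 150.0), ('Home Policy', 2011, 80.0)]
```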

DATA REPOSITORY LAYER

Data Repository Architectural Layer Definitions (continued)

Time Persistent Repositories: Time Persistent Repositories (TPRs) are structures that are modeled after the application data stores and databases. A TPR is a raw (no data quality flagging), siloed-by-application source of transactional data. It is used for operational reporting, data quality profiling, and sourcing for Advanced Analytic applications such as SAS and SPSS. TPRs are "lightly conformed" enough to provide a common key structure, and thereby connectivity to the data warehouses and dimensional layers.

Master Data Stores: Master Data Stores contain the core business concepts and hierarchies of an organization, such as:

• Suppliers
• Products
• Customers
• Organization

These data stores are often designed as an ODS data structure and are referenced by both analytic and transactional systems.

Content Stores: Content stores contain unstructured data such as text, video, and documents, and non-traditional relational formats such as XML. The design of these data stores, using concepts such as content modeling, depends on the planned business usage and underlying technology.

DATA INTEGRATION LAYER

Data Integration Layer components: Extract/Subscribe, Initial Staging, Data Quality (Technical/Business), Clean Staging, Transforms, Load-Ready Publish, Load/Publish

Data Integration Layer Definition

The Data Integration Architectural Layer focuses on the processes
and environments that deal with the capture, qualification,
processing, and movement of data in order to prepare it for storage
in the Data Repository Layer, which is subsequently shared with the
analytical/access applications and systems. This layer may process
data in scheduled batch intervals or in near-real-time/"just-in-time"
intervals, depending on the nature of the data and the business
purpose for its use.

DATA INTEGRATION LAYER

Data Integration Architectural Layer Definitions

Extract/Subscribe: Extract/Subscribe is the set of processes that captures data, transactional or bulk, structured or unstructured, from various sources and lands it in an Initial Staging Area. It follows the architectural principle of "read once, write many" to ensure that the impact on source systems is minimized and data lineage is managed. Data from real-time sources that is intended for real-time targets only is not passed through Extract/Subscribe and may not land in the Initial Staging Area.

Initial Staging: Initial Staging is an optional "landing zone" where the copy of the data from sources is landed as a result of the extract/subscribe processing. One of the purposes of the Initial Staging Area is to persist source data in non-volatile storage to achieve the "pull it once from the source" goal.

Data Quality: Data Quality comprises the processes that qualify and cleanse the data, based upon technical and business process rules. Regardless of the data quality rules, the Data Quality layer should provide the following functionality:

• Cleansed Data Files: data cleansed using data quality criteria
• Reject Data Files: data records that fail the cleansing logic
• Reject Reports: a tabular report of the records that failed, with reason codes for review and remediation

Clean Staging: The Clean Staging Area is the next optional landing zone, which contains records that have passed all DQ checks. This data may be passed to processes that build load-ready files. The data may also become input to transformation processes which, in turn, produce new data sets. The Data Integration architecture should include an archiving facility for the files in the clean staging area.

Virtualization: Data virtualization is the process of abstracting, transforming, federating, and delivering data contained within a variety of information sources so that it may be accessed by consuming applications or users when requested, without regard to its physical storage or heterogeneous structure. This concept and software is commonly used within data integration, business intelligence, service-oriented architecture data services, cloud computing, enterprise search, and master data management.
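The cleansed-file / reject-file split described above can be sketched with an invented rule set: records failing any check go to the reject side with a reason code, everything else passes to the cleansed file.

```python
# Data quality sketch: split records into cleansed output and a reject
# report with reason codes. Rule names and records are illustrative.

RULES = {
    "missing_id":      lambda r: not r.get("id"),
    "negative_amount": lambda r: r.get("amount", 0) < 0,
}

def run_quality_checks(records):
    cleansed, rejects = [], []
    for rec in records:
        reasons = [code for code, failed in RULES.items() if failed(rec)]
        if reasons:
            rejects.append((rec, reasons))  # reject report entry
        else:
            cleansed.append(rec)            # cleansed data file entry
    return cleansed, rejects

clean, rejected = run_quality_checks([
    {"id": "A1", "amount": 10.0},
    {"id": "",   "amount": 5.0},
    {"id": "A2", "amount": -3.0},
])
print(len(clean), rejected)
```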

DATA INTEGRATION LAYER

Data Integration Architectural Layer Definitions

Transformation: The architecture supports three transformation types.

• Calculations & Splits: a transformation type that allows for the creation of new data elements (that extend the data set), or new data sets, derived from the source data. Calculations process data in a data set to produce derived data based on data transforms and computations. Splits divide a data set into subsets of fields that are then stored individually.

• Processing & Enrichment: a transformation type that provides the following capabilities: Joins (combining fields from multiple sources and storing the combined set); Lookups (combining fields from records with values from reference tables and storing the combined set); Aggregations (creation of new data sets derived from the combination of multiple sources and/or records); and Change Data Capture (identifying changed records from a source data set by comparing the values to the prior set from the source).

• Target Filtering: a target-specific transformation type that filters and formats multi-use data sources from the Clean Staging Area, making them load-ready for targets. Both vertical and horizontal filtering are performed. Vertical filtering passes only the data elements the target needs; horizontal filtering passes only the records that conform to the target's rules.

Load-Ready Publish: Load-Ready Publish is an optional staging area that is utilized to store target-specific load-ready files. If a target can take direct output from a data integration tool without storing the data first, storing it in the Load-Ready Staging Area may not be required.

Load/Publish: Load/Publish is a set of standardized processes. Loads are structured by subject area by data store, for example, Subject Areas in the Data Warehouse such as Involved Party. There are five types of physical load architectures:

• FTP to Target: in this type of load, the process is only responsible for depositing the output to the target environment.
• Piped data: the process executes a load routine on the target that takes data directly piped from the Target Specific Filter.
• RDBMS Utilities: e.g. DB2's Bulk Loader running on the target, with the Load-Ready Staging Area as the source.
• SQL: SQL writes directly to the target database.
• Message Publishing: for loading real-time data feeds to message queues.
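The Change Data Capture capability described above, comparing the current source extract to the prior extract to find what changed, can be sketched as a keyed diff. The record keys and fields are illustrative.

```python
# CDC sketch: diff two extracts keyed by record identifier to find
# inserts, updates, and deletes. Keys and fields are illustrative.

def capture_changes(prior, current):
    inserts = [k for k in current if k not in prior]
    deletes = [k for k in prior if k not in current]
    updates = [k for k in current if k in prior and current[k] != prior[k]]
    return {"insert": sorted(inserts), "update": sorted(updates),
            "delete": sorted(deletes)}

prior   = {"C1": {"name": "Ann"},   "C2": {"name": "Bob"}}
current = {"C2": {"name": "Bobby"}, "C3": {"name": "Cal"}}
print(capture_changes(prior, current))
# {'insert': ['C3'], 'update': ['C2'], 'delete': ['C1']}
```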

MASTER DATA MANAGEMENT LAYER

Master Data Management Layer components: SOA Orchestration Components, CRUD Transactional Components, Master Data Store, Data Load Components, Reference Data Management

Master Data Management Layer Definition

Master data management (MDM) is a set of disciplines,
technologies, and solutions to create and maintain consistent,
complete, contextual, and accurate business data for all
stakeholders across and beyond the enterprise.

MASTER DATA MANAGEMENT LAYER

MDM Architecture Component: Description

SOA Orchestration Components: The SOA Orchestration Components consist of the MDM Orchestration component (ESB) and the MDM Shared Services component. The MDM Orchestration component represents the real-time synchronization process to keep data in sync with external systems and/or applications. Typically this is facilitated via an Enterprise Service Bus (ESB), a streamlined, distributed integration middleware that combines message transformation, protocol transformation, and process orchestration to execute a services request/response.

CRUD Transactional Components: The MDM Shared Services component represents the CRUD (Create, Read, Update, Delete) transactional services required to maintain the data within the Master Data Store. These can be modified, and additional composite services can be created based on specific requirements.
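The CRUD shared services described above can be sketched with an in-memory dictionary standing in for the Master Data Store. The class and method names below are an invented illustration, not the MDM product's actual API.

```python
# Minimal CRUD service sketch over an in-memory "master data store".
# Class shape and keys are illustrative, not a real MDM API.

class MasterDataService:
    def __init__(self):
        self._store = {}

    def create(self, key, record):
        if key in self._store:
            raise ValueError(f"{key} already exists")
        self._store[key] = dict(record)

    def read(self, key):
        return self._store.get(key)

    def update(self, key, changes):
        self._store[key].update(changes)

    def delete(self, key):
        self._store.pop(key, None)

svc = MasterDataService()
svc.create("CUST-1", {"name": "Acme", "segment": "commercial"})
svc.update("CUST-1", {"segment": "enterprise"})
print(svc.read("CUST-1"))  # {'name': 'Acme', 'segment': 'enterprise'}
svc.delete("CUST-1")
print(svc.read("CUST-1"))  # None
```

In a real MDM hub these operations would run behind the ESB as shared services, with validation, survivorship, and audit logic around each call.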
Master Data Store: The Master Data Store component, or MDM Data Repository component, represents the logical and physical MDM database. The database is the underlying repository used to persist MDM global and contexted data elements from various source systems.

Data Load Components: The Data Load component, or MDM Data Integration (ETL/Batch) component, represents the Extract, Transformation and Load (ETL) and Bulk Data Movement (BDM) processes needed to load data into and extract data from the MDM Data Store. ETL typically represents the inbound movement of data into the Master Data Store via an initial load and an ongoing delta process. BDM typically represents the outbound movement of data to external systems and/or applications via an extract process.

Reference Data Management: The Reference Data Management component represents the processes in the MDM solution for controlling and managing reference data and master data domains across the enterprise to meet business and technical requirements that can change over time. Reference data can be described as data used to define characteristics of an identifier that are used within other databases or data processes in an enterprise. In many business processes, reference data is represented in master data domains like customers, products, accounts, contracts, and locations.

CONTENT MANAGEMENT LAYER

Content Management Layer components: Content Ingestion, Content Extraction, Content Services, Document Management Services, Records Management Services, Federation

Content Management Layer Definition

The Content Management Layer contains the services, technologies, and
processes used to capture, manage, store, preserve, and deliver
unstructured content. It provides global access and management of
digital assets used to collaborate and share information between a
company and its customers, suppliers, employees, and business partners.

CONTENT MANAGEMENT LAYER

CM Architecture Component: Description

Content Ingestion: Collects, classifies, analyzes, assigns metadata to, and stores content into Content Stores.

Content Extraction: Extracts data and metadata from content for use in indexing and analytics.

Content Services: Repository services that provide check-in/out, versioning, and permission management and enforcement, as well as higher-level application services.

Document Management Services: Provides lifecycle management, structured and collaborative authoring, automatic publishing, and support for multiple languages.

Records Management Services: Allows content and associated metadata to be treated as business records which can be held or disposed of according to business needs.

Federation: Consolidated search and retrieval of documents and metadata across multiple disparate content stores.
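The content ingestion step above, classify incoming unstructured content and attach metadata before storing it, can be sketched as follows. The categories, keyword rules, and metadata fields are invented for illustration.

```python
# Content ingestion sketch: keyword-based classification plus metadata
# assignment. Categories, keywords, and fields are illustrative.
import datetime

CATEGORIES = {"claim": ["claim", "accident"], "policy": ["policy", "premium"]}

def ingest(doc_id, text):
    lowered = text.lower()
    # Pick the first category whose keywords appear in the text.
    category = next((cat for cat, words in CATEGORIES.items()
                     if any(w in lowered for w in words)), "general")
    return {"id": doc_id, "category": category,
            "length": len(text),
            "ingested_at": datetime.date.today().isoformat()}

record = ingest("DOC-7", "Premium notice for auto policy renewal")
print(record["category"])  # policy
```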

CONTENT MANAGEMENT REFERENCE ARCHITECTURE

The CM Reference Architecture is organized into pillars, each with its own components:

• Access: Web Browser, Portals, Devices, Web Services
• Processing and Analytics: Collaboration, eDiscovery, Compliance Support, Ingestion Analytics, Content Classification, Content/Text Analytics, Process Analytics, Query/Search & Reporting, Modeling & Process Simulation, Scorecard, Visualization
• Content Repositories: Data Model, Library Services, Taxonomy Management, Object Management, Federation, Connectors, Records Management, Digital Asset Mgmt, Content Distribution
• Content Ingestion: Scan/Fax, Advanced Capture (OCR, adv. forms, etc.), eMail Capture, File System Capture, Report/Output Management, eForms, Speech to Text
• Content Sources: Paper, Office Documents, Reports, Business Applications, Audio/Video, Enterprise Applications, Other Content Repositories, Messaging, Social Media, Web

Cross-functional layers: Business Process Management / Case Management; Data Governance; Metadata; Content and Data Quality; Security and Data Privacy; Systems Management & Administration, Disaster Recovery; Network Connectivity, Protocols & Access Middleware, SOA/Web Services, Hosting, Hardware & Software Platforms

CM Reference Architecture Definition

The CM Reference Architecture contains the components from the Content Management pillar of the BA Reference Architecture, as well as selected components from many of the other pillars. The components in common between the two have different emphases in the different reference architectures.

INFORMATION GOVERNANCE CROSS FUNCTIONAL LAYER

Cross Functional Information Governance Definition

Information Governance is the orchestration of people, process, and
technology to enable an organization to leverage data as an
enterprise asset.

Cross Functional Layer: Information Governance

• Policy/Organization/Change Management
• Data Architecture
• Data Quality
• Metadata

INFORMATION GOVERNANCE CROSS FUNCTIONAL LAYER

IG Architecture Component: Description

Policy: Information Governance Policy provides strategic and operational direction to the enterprise. An Information Governance policy recognizes that corporate data is a critical corporate resource and will be managed as such. Policies are intended to guide and affect real-world decisions by placing limitations on a particular decision or process.

Organization: The administrative and functional structures required to achieve the objectives of the Information Governance program. Included in this module are the executive, management, and operational organizational structures and stewardship roles.

Change Management: Change Management activities provide the strategy for the creation and distribution of meaningful and appropriate communication of Information Governance messages to program stakeholders, including executives, as well as program- and project-level communities.

Data Architecture: Includes the data, application, and technology structures needed to support information governance processes and improve information management capabilities.

Metadata: Metadata Management is the capture, versioning, approval, usage, and analysis of the different types of metadata found in an Information Management environment.

Data Quality: Data Quality Management is defined by how effectively the data supports the data quality requirements of the transactions and decisions needed to meet an organization's strategic goals and objectives, as embodied in its ability to control its assets and conduct its core operations.
ARCHITECTURE CROSS FUNCTIONAL LAYER

Cross Functional Layer Architecture Definition

The Cross Functional Layer provides the cohesive foundation
whereupon the BA pillars can function in an integrated and seamless
fashion.

Cross Functional Layer components:

• Business Process Management / Workflow
• Network Connectivity, Protocols & Access Middleware
• Hardware & Software Platforms

ARCHITECTURE CROSS FUNCTIONAL LAYER

Cross Functional Architecture Component: Description

BPM: Business Process Management is a management approach based on continually improving and optimizing business processes, using business value and technology as drivers within a systematic approach to continuous improvement.

Infrastructure: The infrastructure for a Business Analytics environment is the physical hardware, foundational software (such as operating systems and security protocols), data storage, and network connectivity that form the foundation of the BA platform, on which the specific applications are built that allow the suite of applications to interact with the end user community. A key consideration in the BA arena is the development of workload-optimized systems and appliances that are designed and built for specific purposes, of which Netezza is a prime example.

Network, Connectivity, Protocols & Access Middleware: The platform and technology that allows communication between components, the formal description of message formats, and the rules for exchanging those messages. Security plays an important role in this sub-layer in that the appropriate controls and accesses need to be in place and continuously monitored.

Hardware & Software Platforms: The physical platforms for a Business Analytics environment, including the hardware components and the software applications that run on the platform.