Informatica HCL
Agenda
Service-oriented architecture
Domains, nodes, and services
Administering the domain
Log Service and Log Viewer
Managing folders, users, and permissions
Service-Oriented Architecture
Product Overview
Informatica components
Repository Service
Integration Service
Informatica Client
Repository Manager
Designer
Workflow Manager
Workflow Monitor
SOA Cont.
PowerCenter Domain
PowerCenter Domain
Nodes
Gateway Node
Service Manager
Services
Services
Core Services
Log Services
Administration console
Administration console
Domain Property
Node Property
Designer
To add source and target definitions to the repository
To create mappings that contain data transformation instructions
Overview - Sources
Relational - Oracle, Sybase, Informix, IBM DB2, Microsoft
SQL Server, and Teradata
File - Fixed and delimited flat file, COBOL file, and XML
Extended PowerConnect products for PeopleSoft, SAP
R/3, Siebel, and IBM MQSeries
Mainframe PowerConnect for IBM DB2 on MVS
Other - MS Excel and Access
Overview - Targets
Relational - Oracle, Sybase, Sybase IQ, Informix, IBM DB2,
Microsoft SQL Server, and Teradata.
File - Fixed and delimited flat files and XML
Extended -
PowerConnect for SAP BW to load data into SAP BW through the Integration Service.
PowerConnect for IBM MQSeries to load data into IBM MQSeries message queues.
Repository Manager
Repository
The Informatica repository tables have an open architecture
Metadata can include information such as
mappings describing how to transform source data
sessions indicating when you want the Integration Service to perform
the transformations
connect strings for sources and targets
Repository
Database connections
Global objects
Mappings
Mapplets
Multi-dimensional metadata
Reusable transformations
Sessions and batches
Shortcuts
Source definitions
Target definitions
Transformations
Repository
Exchange of metadata with other BI tools - metadata can be exported to and imported from BO, Cognos, etc. The objects exported or imported can be compared in their XML formats.
MX Views - this feature enables the user to view information on the server grids and the object history in the Repository Service.
Repository Security
Can plan and implement security using the following features:
User groups
Repository users
Repository privileges
Folder permissions
Locking
Types of Locks
There are five kinds of locks on repository objects:
Read lock - Created when you open a repository object in a folder
for which you do not have write permission
Write lock - Created when you create or edit a repository object
Execute lock - Created when you start a session or batch
Fetch lock - Created when the repository reads information about
repository objects from the database
Save lock - Created when the repository is being saved
Folders
Folders provide a way to organize and store all metadata in the repository,
including mappings and sessions
When a new version is created, the Designer creates a copy of all existing
mapping metadata in the folder and places it into the new version
Can copy a session within a folder, but you cannot copy an individual session
to a different folder
Folders
To copy all sessions within a folder to a different location,
you can copy the entire folder
Any mapping in a folder can use only those source and
target definitions or reusable transformations that are
stored:
in the same folder
in a shared folder and accessed through a shortcut
Folder permissions
Folder owner
Owners group
Shared or not shared
Folders
Folders have the following permission types:
Read permission
Write permission
Execute permission
Copying Folders
Copying Folders
Re-establish shortcuts
Choose an Integration Service
Copy connections
Copy persisted values
Compare folders
Replace folders
Comparing Folders
Versions to compare
Object types to compare
Direction of comparison
Comparing Folders
Comparing Folders
The comparison wizard displays the following user-customized information:
Similarities between objects
Differences between objects
Outdated objects
Folder Versions
Added Features in PowerCenter 8.1
Can run object queries that return shortcut objects. Can also run
object queries based on the latest status of an object. The query can
return local objects that are checked out, the latest version of
checked in objects, or a collection of all older versions of objects.
Sources
Targets
Transformations
Mapplets
Mappings
Sessions
Designer
Designer Appendix
Informatica's Designer is the client application used to create and
manage sources, targets, and the associated mappings between them.
The Integration Service uses the instructions configured in the
mapping and its associated session to move data from sources to
targets.
The Designer allows you to work with multiple tools, in multiple
folders, and in multiple repositories at a time.
The client application provides five tools with which to create
mappings:
Source Analyzer. Use to import or create source definitions for flat file,
ERP, and relational sources.
Target Designer. Use to import or create target definitions.
Transformation Developer. Used to create reusable objects that generate
or modify data.
Mapplet Designer. Used to create a reusable object that represents a set
of transformations.
Mapping Designer. Used to create mappings.
Designer Appendix
The Designer consists of the following windows:
Navigator. Use to connect to and work in multiple
repositories and folders. You can also copy objects and
create shortcuts using the Navigator
Workspace. Use to view or edit sources, targets,
Mapplets, transformations, and mappings. You can work
with a single tool at a time in the workspace
Output. Provides details when you perform certain tasks,
such as saving your work or validating a mapping. Right-click
the Output window to access window options, such as printing
output text, saving text to a file, and changing the font size
Overview. An optional window to simplify viewing
workbooks containing large mappings or a large number
of objects
Source Analyzer
The following types of source definitions can be imported,
created, or modified in the Source Analyzer:
Warehouse Designer
To create target definitions for file and relational sources
Import the definition for an existing target - Import the target definition
from a relational target
Create a target definition based on a source definition (a relational or
flat file source definition)
Manually create a target definition or design several related targets at
the same time
Mapping
Mappings represent the data flow between sources and
targets
When the Integration Service runs a session, it uses the
instructions configured in the mapping to read, transform,
and write data
Every mapping must contain the following components:
Source / Target definitions.
Transformation / Transformations.
Connectors Or Links.
Mapping
[Diagram: Source → Source Qualifier → Transformations → Target, with links (connectors) between the objects]
Sample Mapping
Mapping - Invalidation
On editing a mapping, the Designer invalidates sessions
under the following circumstances:
Add or remove sources or targets
Remove Mapplets or transformations
Replace a source, target, Mapplet, or transformation while
importing or copying objects
Add or remove Source Qualifiers or COBOL Normalizers, or
change the list of associated sources for these transformations
Add or remove a Joiner or Update Strategy transformation.
Add or remove transformations from a Mapplet in the mapping
Change the database type for a source
Mapping - Components
Every mapping requires at least one transformation
object that determines how the Integration Service reads
the source data:
Mapping - Updates
Mapping - Validation
The Designer marks a mapping valid for the following
reasons:
Connection validation - Required ports are connected and that all
connections are valid
Expression validation - All expressions are valid
Object validation - The independent object definition matches the
instance in the mapping
Mapping - Validation
Transformations used in
Informatica
Transformations
A transformation is a repository object that generates,
modifies, or passes data
The Designer provides a set of transformations that
perform specific functions
Transformations in a mapping represent the operations
the Integration Service performs on data
Data passes into and out of transformations through ports
that you connect in a mapping or mapplet
Transformations can be active or passive
Transformation - Types
An active transformation can change the number of rows
that pass through it
A passive transformation does not change the number of
rows that pass through it
Transformations can be connected to the data flow, or
they can be unconnected
An unconnected transformation is not connected to other
transformations in the mapping. It is called within another
transformation, and returns a value to that
transformation
Transformations - Types
Source Qualifier Transformation Object
This object is used to define the reader process, or the selection
for relational sources.
Can automatically generate the SQL; the user can override it and
write their own SQL.
In the source qualifier you can also specify filters on the source
data or distinct selection.
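As a rough sketch of what the generated reader query looks like (Python for illustration only; the function and names are hypothetical, not Informatica internals), a default query selects the connected ports and appends the optional source filter and distinct option:

```python
# Sketch (not Informatica's actual generator): build a default
# SELECT from the ports linked out of a Source Qualifier,
# honoring a source filter and the "Select Distinct" option.
def default_query(table, linked_ports, source_filter=None, distinct=False):
    select = "SELECT DISTINCT " if distinct else "SELECT "
    sql = select + ", ".join(f"{table}.{p}" for p in linked_ports)
    sql += f" FROM {table}"
    if source_filter:
        sql += f" WHERE {source_filter}"
    return sql

print(default_query("CUSTOMERS", ["CUST_ID", "NAME"],
                    source_filter="REGION = 'EMEA'", distinct=True))
# SELECT DISTINCT CUSTOMERS.CUST_ID, CUSTOMERS.NAME FROM CUSTOMERS WHERE REGION = 'EMEA'
```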
Transformations - Types
Expression Transformation Object
The Expression Transformation is used for data cleansing and scrubbing.
There are over 80 functions within PowerCenter, such as concatenate,
instring, rpad, ltrim and we use many of them in the Expression
Transformation.
We can also create derived columns and variables in the Expression
Transformation.
Filter Transformation
The Filter Transformation is used to do just that: filter data.
We can filter at the source using the Source Qualifier, but maybe I need
to filter the data somewhere within the pipeline, perhaps on an aggregate
that I've calculated, and we use the Filter Transformation for that. We
also use the Filter to branch and load a portion of the data into one target
based on some condition and the rest of the data into another target.
Transformations - Types
Aggregator Transformation
The Aggregator Transformation is used when doing those types of
functions that require sorting or a group by. Functions like SUM, AVG.
Aggregates are done in memory and we can accept pre-sorted input to
reduce the amount of memory necessary.
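Why sorted input reduces memory can be shown with a small sketch (Python for illustration, names hypothetical): when rows arrive pre-sorted on the group-by port, the aggregator only ever holds one group in memory instead of a cache keyed by every group:

```python
# Sketch: with pre-sorted input, an aggregation can stream one
# group at a time (itertools.groupby) instead of caching totals
# for all groups at once.
from itertools import groupby

def sum_by_key(sorted_rows):
    # sorted_rows: iterable of (group_key, value), pre-sorted by key
    return [(k, sum(v for _, v in grp))
            for k, grp in groupby(sorted_rows, key=lambda r: r[0])]

rows = [("A", 10), ("A", 5), ("B", 7)]
print(sum_by_key(rows))  # [('A', 15), ('B', 7)]
```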
Lookup Transformation
Use the Lookup Transformation to look up tables in the source database,
the target database or any other database as long as we have connectivity.
Lookups are by default cached in memory, however you can turn off the
caching by checking the option.
The lookup condition can be any type of Boolean expression.
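The caching behavior can be sketched as follows (Python for illustration; the class and names are hypothetical): the lookup table is read into memory on first use, and later probes hit the cache rather than the database:

```python
# Sketch of a cached lookup: the first probe loads the lookup
# source into a dict; subsequent probes hit memory.
class Lookup:
    def __init__(self, fetch_all):
        self._fetch_all = fetch_all   # callable simulating the DB query
        self._cache = None

    def lookup(self, key):
        if self._cache is None:       # build the cache on first use
            self._cache = dict(self._fetch_all())
        return self._cache.get(key)   # None when no row matches

lkp = Lookup(lambda: [(1, "gold"), (2, "silver")])
print(lkp.lookup(1))  # gold
```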
Transformations - Types
Sequence Generator Transformation
We provide a Sequence Generator Transformation for those target
databases that do not have these capabilities (such as MS-SQL 6.5) and
ours works just like Oracle's sequence generator.
You can specify the starting value, the increment value, the upper limit
and whether or not you want to reinitialize.
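The semantics of those properties can be sketched in a few lines (Python for illustration; property names are hypothetical): start value, increment, upper limit, and whether to cycle back to the start:

```python
# Sketch of sequence-generator semantics: start value, increment,
# upper limit, and optional reinitialization (cycle).
def sequence(start, increment, end, cycle=False):
    value = start
    while True:
        yield value
        value += increment
        if value > end:
            if not cycle:
                return        # sequence exhausted
            value = start     # reinitialize to the start value

seq = sequence(start=1, increment=1, end=3, cycle=True)
print([next(seq) for _ in range(5)])  # [1, 2, 3, 1, 2]
```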
Transformations - Types
External Procedure Transformation
The same holds true for external procedures written in C, C++, or Visual
Basic: they can be called from within the mapping process using the
External Procedure transformation.
External procedure transformations are reusable transformations and we
define those in the Transformation Developer (the fourth component tool
in the Designer).
Joiner Transformation
The Joiner Transformation is used for heterogeneous joins within a
mapping: perhaps we need to join a flat file with an Oracle table, or an
Oracle and a Sybase table, or two flat files together. (Joins are done in
memory and the memory profile is configurable!)
Sorter Transformation
The Sorter is used to sort the data in the Ascending or the Descending
order, generally used before aggregation of data.
Transformations - Types
Normalizer Transformation
The Normalizer Transformation is used when you have occurs or
arrays in your source data and you want to flatten the data into a
normalized table structure; PowerCenter will do this
automatically for you. The Normalizer is also used to pivot data.
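Flattening an OCCURS-style record can be sketched like this (Python for illustration; field names are invented for the example):

```python
# Sketch: flatten one record containing a repeating group (an
# OCCURS array) into normalized rows, one per occurrence.
def normalize(record, key_field, occurs_field):
    key = record[key_field]
    return [{key_field: key, "value": v, "occurs_index": i + 1}
            for i, v in enumerate(record[occurs_field])]

rec = {"CUST_ID": 7, "QUARTERLY_SALES": [100, 250, 80, 120]}
for row in normalize(rec, "CUST_ID", "QUARTERLY_SALES"):
    print(row)
```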
Ranking Transformation
This is used when you are doing a very specific data mart, such as
Top Performers, and you want to load the top 5 products and the
top 100 customers or the bottom 5 products.
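The top-N/bottom-N idea behind the Rank transformation can be sketched as follows (Python for illustration; names are hypothetical):

```python
# Sketch of Rank semantics: cache the input rows, then keep only
# the top (or bottom) N by the rank port.
import heapq

def rank(rows, n, key, top=True):
    rows = list(rows)   # input is cached before ranking, as in a session
    return (heapq.nlargest(n, rows, key=key) if top
            else heapq.nsmallest(n, rows, key=key))

sales = [("p1", 90), ("p2", 40), ("p3", 75), ("p4", 60)]
print(rank(sales, 2, key=lambda r: r[1]))  # [('p1', 90), ('p3', 75)]
```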
Transformation
Added Features in PowerCenter 8.1 (about 8.6 at the end of ppt)
Flat file lookup: Can now perform lookups on flat files. To create a
Lookup transformation using a flat file as a lookup source, the
Designer invokes the Flat File Wizard. To change the name or
location of a lookup between session runs, the lookup file parameter
can be used.
Dynamic lookup cache enhancements: When you use a dynamic lookup
cache, the PowerCenter Service can ignore some ports when it compares
values in lookup and input ports before it updates a row in the cache.
Also, you can choose whether the PowerCenter Service outputs old or
new values from the lookup/output ports when it updates a row. You
might want to output old values from lookup/output ports when you use
the Lookup transformation in a mapping that updates slowly changing
dimension tables.
Transformation
Custom transformation API enhancements: The Custom
transformation API includes new array-based functions
that allow you to create procedure code that receives and
outputs a block of rows at a time. Use these functions to
take advantage of the PowerCenter Server processing
enhancements.
Midstream XML transformations: Can now create an
XML Parser transformation or an XML Generator
transformation to parse or generate XML inside a
pipeline. The XML transformations enable you to extract
XML data stored in relational tables, such as data stored
in a CLOB column. You can also extract data from
messaging systems, such as TIBCO or IBM MQSeries.
Transformations - Properties
Port Name
Copied ports will inherit the name of contributing port
Copied ports with the same name will be appended with a number
Types Of Ports:
Data types
Transformations use internal data types
Data types of input ports must be compatible with data types of the
feeding output port
Aggregator Transformation
Performs aggregate
calculations
Components of the
Aggregator Transformation
Aggregate expression
Group by port
Sorted Input option
Aggregate cache
Aggregator Transformation
Expression Transformation
Can use the Expression transformation to
perform any non-aggregate calculations
Calculate values in a single row
test conditional statements before you output the results to target
tables or other transformations
Expression Transformation
Filter Transformation
Filter transformations should always be placed as close to the source as
possible, so that the volume of data carried forward in the pipeline is
reduced at or near the source itself.
Joiner Transformation
Joiner Transformation
Use the Joiner transformation to join two sources with at least one
matching port
It uses a condition that matches one or more pairs of ports between
the two sources
Requires two input transformations from two separate data flows
It supports the following join types
Normal (Default)
Master Outer
Detail Outer
Full Outer
Can be used to join two data streams that originate from the same
source.
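The four join types can be sketched on two in-memory row sets (Python for illustration; this is a simplification, not the engine's algorithm — in Informatica terms, "master outer" keeps unmatched detail rows and "detail outer" keeps unmatched master rows):

```python
# Sketch of Joiner join types on (key, value) rows from a master
# and a detail pipeline, matched on the key.
def joiner(master, detail, join_type="normal"):
    master_map = {}
    for k, v in master:
        master_map.setdefault(k, []).append(v)
    detail_keys = {k for k, _ in detail}
    out = []
    for dk, dv in detail:
        if dk in master_map:
            out.extend((dk, mv, dv) for mv in master_map[dk])
        elif join_type in ("master outer", "full outer"):
            out.append((dk, None, dv))          # unmatched detail row kept
    if join_type in ("detail outer", "full outer"):
        out.extend((mk, mv, None)               # unmatched master rows kept
                   for mk, mvs in master_map.items()
                   if mk not in detail_keys for mv in mvs)
    return out

master = [(1, "a"), (2, "b")]
detail = [(1, "x"), (3, "y")]
print(joiner(master, detail))                 # [(1, 'a', 'x')]
print(joiner(master, detail, "full outer"))   # adds (3, None, 'y') and (2, 'b', None)
```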
Joiner Transformation
Joiner on Multiple Sources.
Lookup Transformation
Used to look up data in a relational table, view, synonym or Flat File.
The Integration Service queries the lookup table based on the lookup
ports in the transformation.
It compares Lookup transformation port values to lookup table
column values based on the lookup condition.
Can use the Lookup transformation to perform many tasks, including:
Get a related value
Perform a calculation
Update slowly changing dimension tables
Types Of Lookups
Connected
Unconnected
Connected Lookup
Connected Lookup Transformation
Receives input values directly
from another transformation in
the pipeline
For each input row, the
Integration Service queries the
lookup table or cache based on
the lookup ports and the
condition in the transformation
Passes return values from the
query to the next
transformation
Unconnected Lookup
Unconnected Lookup Transformation
Lookup Caching
Router Transformation
A Router transformation tests
data for one or more conditions
and gives the option to route
rows of data that do not meet
any of the conditions to a
default output group
It has the following types of
groups:
Input
Output
For a Source Qualifier to generate a default query, it should have a linked output port.
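Router behavior — each row tested against every group condition, with non-matching rows going to the default group — can be sketched as follows (Python for illustration; names hypothetical). Note that, unlike a Filter, a row can land in every group whose condition it meets:

```python
# Sketch of Router semantics: route rows into user-defined output
# groups by condition; rows matching no condition go to DEFAULT.
def route(rows, groups):
    # groups: list of (group_name, predicate) pairs
    out = {name: [] for name, _ in groups}
    out["DEFAULT"] = []
    for row in rows:
        matched = False
        for name, pred in groups:
            if pred(row):
                out[name].append(row)
                matched = True
        if not matched:
            out["DEFAULT"].append(row)
    return out

res = route([5, 15, 25],
            [("LOW", lambda r: r < 10), ("MID", lambda r: 10 <= r < 20)])
print(res)  # {'LOW': [5], 'MID': [15], 'DEFAULT': [25]}
```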
Rank Transformation
Allows you to select only the top or bottom rank of data, not just one
value
Can use it to return
the largest or smallest numeric value in a port or group
the strings at the top or the bottom of a session sort order
During the session, the Integration Service caches input data until it
can perform the rank calculations
Can select only one port to define a rank
Rank Transformation
Rank Transformation Properties
Top/Bottom - Used to specify whether the top or the bottom
rank needs to be output.
Number of Ranks - Specifies the number of rows to be
ranked.
Transformation Scope - Transaction applies the rank
calculation to each transaction.
Stored Procedure Transformation
An unconnected Stored Procedure transformation can be
called by passing the input parameters through an
expression in the following format:
:SP.<Proc_Name>(Input, Output)
Custom Transformation
The Custom transformation operates in conjunction with procedures
that are created outside the Designer. The PowerCenter Service uses
generated functions to interface with the procedure, and the procedure
code should be developed using API functions. The PowerCenter
Service can pass a single row of data or an array depending on the
custom function.
The types of generated functions are:
Initialization Function - To initialize processes before data is passed to the
custom function.
Notification Function - To send notifications.
Deinitialization Function - To deinitialize the processes after data is
passed to the custom function.
Sorter Transformation
A Sorter Transformation is used to sort Data.
Types of Sorting:
Ascending/Descending
Case Sensitive - Does Case sensitive ascending or descending sort.
Distinct - Gives distinctly sorted output rows.
Sorter Transformation
Transformation Language
The Designer provides a transformation language to help
you write expressions to transform source data
With the transformation language, you can create a
transformation expression that takes the data from a port
and changes it
Can write expressions in the following transformations:
Aggregator
Expression
Filter
Rank
Router
Update Strategy
Transformation Language
Expressions can consist of any combination of the
following components:
Transformation Language
The functions available in PowerCenter are
Transformation Expressions
The pre-compiled and tested transformation expressions help you
create simple or complex transformation expressions:
Functions. Over 60 SQL-like functions allow you to change data in a
mapping.
Aggregates.
Transformation Expressions
Characters.
Character functions assist in the conversion, extraction and
identification of sub-strings.
For example, the following expression evaluates a string starting
from the end of the string. It finds the first space from the end
(that is, the last space) and removes the characters from that
space to the end of the line.
SUBSTR( CUST_NAME, 1, INSTR( CUST_NAME, ' ', -1, 1 ))
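A close Python analogue of the same logic (for illustration only; it also drops the trailing space that SUBSTR's length argument would keep):

```python
# Python analogue of SUBSTR(CUST_NAME, 1, INSTR(CUST_NAME, ' ', -1, 1)):
# find the last space and keep only what comes before it.
def trim_after_last_space(cust_name):
    pos = cust_name.rfind(" ")            # last space, -1 if none
    return cust_name[:pos] if pos >= 0 else cust_name

print(trim_after_last_space("PATRICIA JONES"))  # PATRICIA
```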
Conversions.
Conversion functions assist in the transformation of data from one
type to another.
For example, the following expression converts the dates in the
DATE_PROMISED port to text in the format MON DD YYYY:
TO_CHAR( DATE_PROMISED, 'MON DD YYYY' )
DATE_PROMISED - RETURN VALUE: 'Apr 01 1998'
Transformation Expressions
Dates.
Date functions help you round, truncate, or compare dates;
extract one part of a date; or perform arithmetic on a date.
For example, the following expression returns the month
portion of the date (4 for Apr 1 1997):
GET_DATE_PART( TO_DATE( 'Apr 1 1997 00:00:00', 'MON DD YYYY HH24:MI:SS' ), 'MM' )
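The two date examples above have direct Python analogues (for illustration only), covering both TO_CHAR formatting and extracting a single date part:

```python
# Python analogues of TO_CHAR(date, 'MON DD YYYY') and
# GET_DATE_PART(date, 'MM').
from datetime import datetime

d = datetime(1998, 4, 1)
print(d.strftime("%b %d %Y"))   # Apr 01 1998
print(d.month)                  # 4  (the 'MM' date part)
```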
Transformation Expressions
Miscellaneous.
Informatica also provides functions to assist in:
aborting or erroring out records
developing if-then-else structures
looking up values from a specified external or static table
testing values for validity (such as date or number format)
Transformation Expressions
Operators. Use transformation operators to create
transformation expressions to perform mathematical
computations, combine data, or compare data.
Constants. Use built-in constants to reference values that
remain constant, such as TRUE, FALSE, and NULL.
Variables. Use built-in variables to write expressions that
reference values that vary, such as the system date. You
can also create local variables within a transformation.
Return values. You can also write expressions that include
the return values from Lookup, Stored Procedure, and
External Procedure transformations
Reusable Transformations and Mapplets
Reusable Transformation
A Transformation is said to be in reusable mode when
multiple instances of the same transformation can be
created.
Reusable transformations can be used in multiple
mappings.
Creating Reusable transformations:
Design it in the Transformation Developer
Promote a standard transformation from the Mapping Designer.
Mapplet
A Mapplet is a reusable object that represents a set of
transformations
It allows to reuse transformation logic and can contain as
many transformations as needed
Mapplets can:
Expanded Mapplet
Mapplet - Components
Each Mapplet must include the following:
One Input transformation and/or Source Qualifier transformation
At least one Output transformation
Workflow Manager
Server Manager
1. Task Developer
2. Workflow Designer
3. Worklet Designer
Workflow Monitor
1. Gantt Chart
2. Task View
Workflow Manager
The Workflow Manager replaces the Server Manager in version 5.0.
Instead of running sessions, you now create a process called the
workflow in the Workflow Manager.
A workflow is a set of instructions on how to execute tasks such as
sessions, emails, and shell commands.
A session is now one of the many tasks you can execute in the
Workflow Manager.
The Workflow Manager provides other tasks such as Assignment,
Decision, and Events. You can also create branches with conditional
links. In addition, you can batch workflows by creating worklets in the
Workflow Manager.
Task Developer
Use the Task Developer to create tasks you want to execute in the
workflow.
Workflow Designer
Use the Workflow Designer to create a workflow by connecting
tasks with links. You can also create tasks in the Workflow
Designer as you develop the workflow.
Worklet Designer
Use the Worklet Designer to create a worklet.
Workflow Tasks
Command. Specifies a shell command run during the
workflow.
Control. Stops or aborts the workflow.
Decision. Specifies a condition to evaluate.
Email. Sends email during the workflow.
Event-Raise. Notifies the Event-Wait task that an event
has occurred.
Event-Wait. Waits for an event to occur before executing
the next task.
Session. Runs a mapping you create in the Designer.
Assignment. Assigns a value to a workflow variable.
Timer. Waits for a timed event to trigger.
Create Task
Workflow Monitor
PowerCenter 6.0 provides a new tool, the Workflow
Monitor, to monitor workflow, worklets, and tasks.
The Workflow Monitor displays information about
workflows in two views:
1. Gantt Chart view
2. Task view.
You can monitor workflows in online and offline mode.
[Gantt Chart legend: strong lines indicate hours]
Workflow Monitor
The Workflow Monitor includes the following performance and usability
enhancements:
When you connect to the PowerCenter Service, you no longer
distinguish between online or offline mode.
You can open multiple instances of the Workflow Monitor on one
machine.
The Workflow Monitor includes improved options for filtering tasks
by start and end time.
The Workflow Monitor displays workflow runs in Task view
chronologically with the most recent run at the top. It displays folders
alphabetically.
You can remove the Navigator and Output window.
pmrepagent
Discontinued. Use replacement commands in pmrep.
pmcmd
Updated to support new Integration Service functionality.
pmrep
Includes former pmrepagent commands. Also includes new syntax to
connect to a domain.
AddLicense
EnableService
GetLog
GetServiceStatus
RemoveNode
UpdateNode
And more
pmcmd Changes
pingservice instead of pingserver
getservicedetails instead of getserverdetails, etc.
Includes syntax to specify domain and Integration Service
information instead of PowerCenter Server information
aborttask
connect
gettaskdetails
startworkflow
And more
pmrep Changes
infacmd EnableService instead of pmrep EnableRepository
infacmd DefineRepositoryService instead of pmrep
AddRepository
Includes syntax to specify domain and Repository Service
information instead of Repository Server information
Connect
DeployFolder
Notify
Register
And more
Ported pmrepagent
Commands
Grid Option
[Diagram: Grid option - automatic failover/restart recovery, PowerCenter Adaptive Load Balancer, dynamic distribution, Sessions on Grid (SonG), dynamic partitions, heterogeneous hardware grid]
Pushdown Optimization
Pushdown Optimization
A session option that causes the Integration Service to
push some transformation logic to the source and/or target
database
You can choose source-side or target-side optimization, or
both
$$PushdownConfig mapping parameter
Benefits:
Can increase session performance
Maintain metadata and lineage in PowerCenter repository
Reduces movement of data (when source and target are on the same
database)
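The core idea — folding transformation logic into one SQL statement the database runs, instead of moving rows through the Integration Service — can be sketched like this (Python for illustration; the function and table/column names are invented, not what PowerCenter generates):

```python
# Sketch of pushdown optimization: filter + aggregate + load
# collapsed into a single INSERT ... SELECT run by the database.
def pushdown_sql(source, target, group_col, agg_col, predicate):
    return (f"INSERT INTO {target} ({group_col}, TOTAL) "
            f"SELECT {group_col}, SUM({agg_col}) FROM {source} "
            f"WHERE {predicate} GROUP BY {group_col}")

print(pushdown_sql("STG_SALES", "DW_SALES", "REGION", "AMOUNT", "YEAR = 2006"))
```

Because the data never leaves the database, movement between staging and target tables in the same database is avoided entirely.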
Use Cases
Batch transformation and load - staging and target tables
in the same target database
Transformation and load from real-time status table to
data warehouse in the same database
[Diagram: Step 1 - generated SQL extracts from the source DB, transforms (Joiner, Aggregator with GROUP BY, Router), and loads staging data into the target DB; Step 2 - extract from staging, transform (Lookup, Expression, Filter, Aggregator), and load into the warehouse target in the same database. Labels: transformations pushed to source or target database; generated SQL statement; databases supported]
Supported Transformations
To Source
Aggregator
Expression
Filter
Joiner
Lookup
Sorter
Union
To Target
Expression
Lookup
Target definition
Deployment:
Can assign owners and groups to folders and deployment groups
Can generate deployment control file (XML) to deploy folders and
deployment groups with pmrep
Partitioning Changes
Database partitioning
Works with Oracle in addition to DB2
Dynamic partitioning
Integration Service determines the number of partitions to create at
run time
Integration Service scales the number of session partitions based on
factors such as source database partitions or the number of nodes in
a grid
Useful if volume of data increases over time, or you add more CPUs
Informatica
Performance Tuning
Performance Tuning
First step in performance tuning is to identify the performance
bottleneck in the following order:
Target
Source
Mapping
Session
System
The most common performance bottleneck occurs when the Integration
Service writes to a target database.
Target Bottlenecks
Identifying
A target bottleneck can be identified by configuring the session to
write to a flat file target.
Optimizing
Dropping Indexes and Key Constraints before loading.
Increasing commit intervals.
Use of Bulk Loading / External Loading.
Source Bottlenecks
Identifying
Add a Filter transformation after the Source Qualifier with the condition
set to false, so that no data is processed past the Filter transformation.
If the time it takes to run the new session remains about the same, then
there is a source bottleneck.
In a test mapping remove all the transformations and if the
performance is similar, then there is a source bottleneck.
Optimizing
Optimizing the Query by using hints.
Use Informatica Conditional Filters if the source system lacks
indexes.
Mapping Bottlenecks
Identifying
If there is no source bottleneck, add a Filter transformation in the
mapping before each target definition. Set the filter condition to false so
that no data is loaded into the target tables. If the time it takes to run the
new session is the same as the original session, there is a mapping
bottleneck.
Optimizing
Configure for Single-Pass reading
Avoid unnecessary data type conversions.
Avoid database reject errors.
Use Shared Cache / Persistent Cache
Session Bottlenecks
Identifying
If there is no source, Target or Mapping bottleneck, then there may be a
session bottleneck.
Use Collect Performance Details. Any value other than zero in the
readfromdisk and writetodisk counters for Aggregator, Joiner, or Rank
transformations indicates a session bottleneck. Low (0-20%)
BufferInput_efficiency and BufferOutput_efficiency counter values also
indicate a session bottleneck.
(continued.)
Session Bottlenecks
Optimizing
Increase the number of partitions.
Tune session parameters.
DTM Buffer Size (6M - 128M)
Buffer Block Size (4K - 128K)
Data (2M - 24M) / Index (1M - 12M) Cache Size
Use incremental Aggregation if possible.
Incremental Aggregation
Incremental Aggregation
When writing to the target Integration Service
Updates modified aggregate groups in the target
Inserts new aggregate data
Deletes removed aggregate data
Ignores unchanged aggregate data
Saves modified aggregate data in the index and data files
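The insert/update behavior above can be sketched as follows (Python for illustration; the cache stands in for the index and data files, and names are hypothetical):

```python
# Sketch of incremental aggregation: historical group totals live
# in a cache; each run applies only the new (incremental) rows.
def incremental_aggregate(cache, new_rows):
    # cache: {group_key: total}; new_rows: iterable of (group_key, value)
    for key, value in new_rows:
        if key in cache:
            cache[key] += value       # update a modified aggregate group
        else:
            cache[key] = value        # insert a new aggregate group
    return cache                      # unchanged groups are ignored (kept as-is)

cache = {"A": 100, "B": 50}
print(incremental_aggregate(cache, [("A", 10), ("C", 5)]))
# {'A': 110, 'B': 50, 'C': 5}
```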
Incremental Aggregation
System Bottlenecks
Identifying
If there is no source, Target, Mapping or Session bottleneck, then
there may be a system bottleneck.
Use system tools to monitor CPU usage, memory usage, and
paging.
On Windows - Task Manager
On UNIX systems - tools like sar and iostat. For example, sar -u (% usage on
user, idle time, I/O waiting time)
Optimizing
Commit Points
Questions???
Informatica
Debugger
Debugger
To debug a mapping, you configure and run the Debugger from within the
Mapping Designer
When you run the Debugger, it pauses at breakpoints and allows you to view
and edit transformation output data
After you save a mapping, you can run some initial tests with a debug session
before you configure and run a session in the Server Manager
Debugger
After you set the instance name, breakpoint type, and optional data
condition, you can view each parameter in the Breakpoints section of
the Breakpoint Editor
Questions???
The Web Services Provider is the provider entity of the PowerCenter web service
framework that makes PowerCenter workflows and data integration functionality
accessible to external clients through web services.
Reporting Services
Reporting Service
An application service that runs the Data Analyzer
application in a PowerCenter domain. When you log in to
Data Analyzer, you can create and run reports on data in
a relational database or run the following PowerCenter
reports: PowerCenter Repository Reports, Data Analyzer
Data Profiling Reports, or Metadata Manager Reports.
The status indicator at the top of the right pane indicates when the service
has started running.
Note: Before you disable a Reporting Service, ensure that all users are
disconnected from Data Analyzer.
Cont..
PowerExchange for Web Services uses the Simple Object
Access Protocol (SOAP) to exchange information with
the web services provider and request web services.
SOAP is a protocol for exchanging information between
computers. It specifies how to encode XML data so that
programs on different operating systems can pass
information to each other. Web services hosts contain
WSDL files and web services.
HttpProxyServer
HttpProxyPort
HttpProxyUser
HttpProxyPassword
HttpProxyDomain
Cont..
Cont..
PowerCenter
Mapping Analyst for Excel
Cont..
PowerCenter
Mapping Architect for Visio
Cont..
Cont..
PowerCenter Transformations
HTTP Transformation
You can let the Integration Service determine the authentication
type of the HTTP server when the HTTP server does not return an
authentication type to the Integration Service. Or, you can specify
the authentication type for the Integration Service to use.
Unstructured Data Transformation
You can define additional ports and pass output rows to relational
targets from the Unstructured Data transformation. You can create
ports by populating the transformation from a Data Transformation
service. You can pass a dynamic service name to the Unstructured
Data transformation with source data.
Metadata Manager
Cont..
Thank You !!