Professional Documents
Culture Documents
Course Objectives
At the end of this course you will be able to:
Understand how to use all major PowerCenter components
Build basic ETL mappings
Create and run Workflows
Perform basic repository administration tasks
Troubleshoot problems
Agenda
Duration: 1.5 hrs per day; 4-week training course; Version 8.6
Decision Support
Data Warehouse
Aggregate Data
Cleanse Data
Consolidate Data
Apply Business Rules
De-normalize
Extract
Informatica as ETL
Client Objects
Designer
Repository Manager
Workflow Manager
Workflow Monitor
8
Informatica Repository
The Informatica repository is a set of tables that store the metadata created using the Informatica Client tools. Metadata is added to the repository tables when we perform tasks in the Informatica Client application such as developing mappings or creating sessions. The Workflow Manager adds metadata to the repository tables in the form of tasks and workflows. The Integration Service creates metadata in the repository such as start and finish times of tasks as well as workflow status.
Integration Service
The Integration Service reads mapping and session information from the repository. It extracts data from the mapping sources and stores the data in memory while it applies the transformation rules that are configured in the mapping. The Integration Service loads the transformed data into the mapping targets.
12
Repository Service
The Repository Service is an application service that manages the repository. It retrieves, inserts, and updates metadata in the repository database tables. Select a Repository Service in the Navigator to access information about the service.
Integration Service
Repository Service
Repository Manager
Repository Agent(s)
Repository
Installations Required
Oracle 10g (or any relational DB)
PL/SQL Developer or TOAD
Informatica 8.x
15
Lab
Installation and repository creation
Design Process
1. Create Source definition(s)
2. Create Target definition(s)
3. Create a Mapping
4. Create a Session Task
5. Create a Workflow from Task components
6. Run the Workflow and verify the results
Source Analyzer
Relational
XML file
Flat file
COBOL file
22
Data Previewer
Preview data in
Relational Sources
Flat File Sources
Relational Targets
Flat File Targets
Use Preview Data to verify the results (right mouse click on object)
Transformation Concepts
43
Transformation Concepts
By the end of this section you will be familiar with:
Transformation types and views
Transformation properties
Informatica data types
The Expression transformation
The Filter transformation
44
What is a transformation?
A Transformation is any part of a mapping that generates or modifies data.

Transformation Types: a transformation is either ACTIVE (can change the number of rows passing through it) or PASSIVE (row count is unchanged).
45
Transformations
Informatica PowerCenter provides the following objects for data transformation:
Source Qualifier: reads data from flat file and relational sources
XML Source Qualifier: reads XML data
ERP Source Qualifier: reads ERP object sources
Normalizer: reorganizes records from relational and flat file sources
Expression: performs row-level calculations
Aggregator: performs aggregate calculations
Filter: drops rows conditionally
Router: splits rows conditionally
Sorter: sorts data
46
Transformations
Transformation objects (continued):
Update Strategy: tags rows for insert, update, delete, reject
Lookup: looks up values and passes them to other objects
Joiner: joins heterogeneous sources
Stored Procedure: calls a database stored procedure
Union: performs a UNION ALL of two or more sources
External Procedure (TX): calls compiled code for each row
Sequence Generator: generates unique ID values
Rank: limits records to a top or bottom range
47
Port
A Port represents a single column of data. Every transformation object definition contains a collection of ports. Each port can be defined as one of:
Input port (receives data; a data sink)
Output port (provides data; a data source)
Input/Output port (passes data through; both sink and source)
Variable port (stores intermediate data values)
Lookup port (used in the Lookup transformation)
Return port (used in the Lookup transformation)
48
Port - Sample
Input Ports
49
Transformation Views
A transformation has three views:
Iconized view - shows the transformation in relation to the rest of the mapping
Normal view - shows the data flow through the transformation
Edit view - shows the transformation ports and properties, and allows editing
50
Normal View
Shows data flow through the transformation
Data passes through I/O ports unchanged
DATE_ENTERED passes into the transformation through an input port
The MONTH port's expression extracts the month from it
The month is passed out through the MONTH output port
51
Edit Mode
Allows users with write privileges to change or create transformation properties
Define port-level handling
Enter comments
Make reusable
Switch between transformations
Default value for the selected port
52
Default Values
For input and input/output ports, default values are used to replace null values
For output ports, default values are used to handle transformation errors (not null handling)
53
Native datatypes: specific to the source and target database types; displayed in source and target tables within Mapping Designer
Transformation datatypes: PowerCenter internal datatypes based on ANSI SQL-92; displayed in transformations within Mapping Designer
Transformation datatypes allow mix and match of source and target database types
When connecting ports, native and transformation datatypes must be compatible (or they must be converted)
54
Datatype Conversions
           Integer  Decimal  Double  Char  Date  Raw
Integer       X        X       X      X
Decimal       X        X       X      X
Double        X        X       X      X
Char          X        X       X      X     X
Date                                  X     X
Raw                                               X
All numeric data can be converted to all other numeric datatypes, i.e., integer, double, and decimal All numeric data can be converted to string and vice versa Date can be converted only to date and string and vice versa Raw (binary) can only be converted to raw Other conversions not listed above are not supported These conversions are implicit; no function is necessary
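The conversion rules above can be restated as a compatibility check. This is a conceptual Python sketch of the matrix, not Informatica code; the datatype names are illustrative:

```python
# Conceptual sketch of PowerCenter's implicit datatype conversion rules.
# Keys are source datatypes; values are the target datatypes an implicit
# conversion supports (per the matrix above).
NUMERIC = {"integer", "decimal", "double"}

CONVERTIBLE = {
    "integer": NUMERIC | {"char"},
    "decimal": NUMERIC | {"char"},
    "double":  NUMERIC | {"char"},
    "char":    NUMERIC | {"char", "date"},
    "date":    {"char", "date"},
    "raw":     {"raw"},        # raw (binary) converts only to raw
}

def can_convert(src: str, dst: str) -> bool:
    """Return True if an implicit conversion from src to dst is allowed."""
    return dst in CONVERTIBLE.get(src, set())
```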
55
Expression Transformation
Perform calculations using non-aggregate functions (row level)
Passive Transformation
Connected mode only
Ports: mixed; variables allowed
Create the expression in an output or variable port
Usage: performs the majority of data manipulation
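The row-level behavior can be sketched as a function applied to each row independently. This is a conceptual Python analogue (not Informatica code), reusing the DATE_ENTERED/MONTH example from the Normal View slide; the port names are illustrative:

```python
from datetime import date

def expression_transform(row: dict) -> dict:
    """Row-level, non-aggregate calculation: derive MONTH from DATE_ENTERED,
    passing other ports through unchanged (conceptual sketch)."""
    out = dict(row)                            # I/O ports pass through unchanged
    out["MONTH"] = row["DATE_ENTERED"].month   # output-port expression
    return out
```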
56
Expression Editor
An expression is a calculation or conditional statement Used in Expression, Aggregator, Rank, Filter, Router, Update Strategy Performs calculation based on ports, functions, operators, variables, literals, constants, and return values from other transformations
57
Expression Validation
The Validate or OK button in the Expression Editor will:
Parse the current expression
Perform remote port searching (resolves references to ports in other transformations)
Parse default values
Check spelling, the correct number of arguments in functions, and other syntactical errors
58
Informatica Functions
ASCII CHR CHRCODE CONCAT INITCAP INSTR LENGTH LOWER LPAD LTRIM RPAD RTRIM SUBSTR UPPER REPLACESTR REPLACECHR
Character Functions Used to manipulate character data CHRCODE returns the numeric value (ASCII or Unicode) of the first character of the string passed to this function
59
Informatica Functions
TO_CHAR (numeric) TO_DATE TO_DECIMAL TO_FLOAT TO_INTEGER TO_NUMBER ADD_TO_DATE DATE_COMPARE DATE_DIFF GET_DATE_PART LAST_DAY ROUND (date) SET_DATE_PART TO_CHAR (date) TRUNC (date)
Date Functions
Used to round, truncate, or compare dates; extract one part of a date; or perform arithmetic on a date
To pass a string to a date function, first use the TO_DATE function to convert it to a date/time datatype
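The TO_DATE-then-arithmetic pattern has a direct Python analogue; a conceptual sketch (the format string and field names are illustrative, not Informatica syntax):

```python
from datetime import datetime, timedelta

# Convert a string to a date/time value first (like TO_DATE), then apply
# date arithmetic (like ADD_TO_DATE).
def to_date(s: str, fmt: str = "%m/%d/%Y") -> datetime:
    return datetime.strptime(s, fmt)

def add_days(d: datetime, days: int) -> datetime:
    return d + timedelta(days=days)

# e.g. an order shipped 5 days after a string-typed order date
shipped = add_days(to_date("01/30/2024"), 5)
```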
60
Informatica Functions
ABS CEIL CUME EXP FLOOR LN LOG MOD MOVINGAVG MOVINGSUM POWER ROUND SIGN SQRT TRUNC
Numerical Functions Used to perform mathematical operations on numeric data Scientific Functions Used to calculate geometric values of numeric data
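MOVINGAVG is the least obvious function in this list; a conceptual Python sketch of its windowed behavior (an analogue, not Informatica semantics in every edge case):

```python
def moving_avg(values, span):
    """Conceptual analogue of MOVINGAVG: average of the current row and the
    preceding span-1 rows; yields None until span rows have been seen."""
    out = []
    for i in range(len(values)):
        if i + 1 < span:
            out.append(None)                       # not enough rows yet
        else:
            window = values[i - span + 1 : i + 1]  # current row + span-1 prior
            out.append(sum(window) / span)
    return out
```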
61
System Variables
$$$SessStartTime
Returns the system date value, as a string, at the moment the session starts; uses the system clock on the machine hosting the Informatica server
The format of the string is database-type dependent
Used in SQL overrides; has a constant value for the duration of the session

SYSDATE
62
Mappings
63
Mappings
By the end of this section you will be familiar with:
Mapping components
Source Qualifier transformation
Mapping validation
Data flow rules
Mapping parameters and variables
64
Mapping Designer
65
Mapping Validation
Mappings must be valid to be run
A Mapping must be end-to-end complete
All expressions must be valid
A Mapping must obey the data flow rules
Mappings are always validated when saved
Mappings can be validated without being saved
The Output Window will show why a Mapping is invalid
69
Connection Validation
Examples of invalid connections in a Mapping:
Connecting ports with mismatched datatypes
Connecting output ports to a Source
Connecting a Source to anything but a Source Qualifier or Normalizer transformation
Connecting an output port to another output port, or an input port to another input port
Connecting more than one active transformation to another transformation (invalid data flow)
Session Tasks
79
Session Tasks
After this section, you will be able to describe:
Session Task properties
How to create and configure Session Tasks
Transformation overrides
Session partitions
80
Navigator
Workspace
81
Worklet Designer
Creates objects that represent a set of tasks
Objects are reusable
Workflow Designer
Maps the execution of Sessions, Tasks and Worklets for the Informatica server
82
Session Task
Created to run a mapping (one mapping only)
Session Tasks can be created in the Task Developer or the Workflow Designer
Steps to create a Session Task:
Choose the Session button from the Tasks toolbar, or choose menu Tasks | Create
83
Session Task
Steps to create a Session Task (continued)
Double click on the session object
Valid Mappings are displayed in the dialog box
85
General Options
Performance Tab
86
Log Options
Error handling
87
Connections Tab
Worklets
93
Worklets
Can contain any task available in the Workflow Manager
Worklets run inside a Workflow
Can be nested

When to create a worklet:
Create a worklet when you want to reuse a set of workflow logic in several workflows.

The Worklet Designer is used to create and edit worklets
A worklet does not contain any scheduling or server information; to execute a worklet, include it in a workflow
Informatica does not keep a separate parameter file or log file for a worklet; the worklet's information is written to the workflow log

Types of Worklets
Batch
97
Workflows
98
Workflows
By the end of this section you will be familiar with:
Workflow Task properties
Workflow properties
Configuring Workflows
Workflow Connections
99
Workflow Designer
Combines Session Tasks, other types of Tasks, and Worklets into a set of instructions for the Informatica Server to accomplish data transformation and load
The simplest Workflow that can be created is composed of a Start Task, a link, and a Session Task
Session Task
100
Task Developer
Provides the basic building blocks of a Workflow
Three types of reusable Tasks can be created with the Task Developer:
Session - a set of instructions to execute a mapping
Command - specifies shell commands to run during the workflow
Email - sends email after the workflow has completed
Session Command Email
101
Command Task
Allows you to specify one or more Unix shell or DOS (NT, Win2000) commands to run during the Workflow
Can be referenced in a Session through the Session Components tab as a pre- or post-session command
Can be referenced as a Task component in a Workflow or Worklet
102
Command Task
103
Email Task
Can be configured to have the Informatica Server send email at any point in the Workflow
Email can be configured in a Session or as a Task in a Workflow or Worklet
104
Email Task
Email Variables
Workflow Designer
Sample Workflow
Links (required)
Session 1
Session 2
Developing Workflows
Create a new workflow in the Workflow Designer
111
Developing Workflows
Enter the Workflow Properties
Select a Workflow Schedule
112
Developing Workflows
113
Developing Workflows
Define events which can be used with the Raise Event Task
Define variables that can be used in later task objects (example: Decision Task)
114
Developing Workflows
115
Developing Workflows
Add Sessions and other Tasks to the Workflow
Connect all Workflow components with Links
Save the Workflow
Start the Workflow
Save Start Workflow
116
Connections
Configure server data access connections
118
Creating a Connection
Relational (Database) connection
119
Connection Properties
Relational (Database) connection
Environment SQL: SQL commands executed for each database connection
120
FTP Connection
Monitor Workflows
124
Monitor Workflows
By the end of this section you will be familiar with:
Workflow Monitor views
Actions initiated from the Workflow Monitor
Truncating Monitor Logs
125
The Workflow Monitor is the tool for monitoring Workflows and Tasks The Workflow Monitor shows details about a workflow or task in two views
Gantt Chart view
Task view
Monitor Workflows
Monitor Workflows
Workflow Monitor displays Workflows that have been run at least once
Can monitor a server in two modes:
Online - the Workflow Monitor continuously receives information from the Informatica Server and the Repository Server
Offline - the Workflow Monitor displays historic information about past workflow runs by fetching information from the repository
127
Monitoring Workflows
You can perform the following tasks in the Workflow Monitor:
Restart - restart tasks, workflows, or worklets
Stop - stop a task, workflow, or worklet
Abort - abort a task, workflow, or worklet
Resume - resume suspended workflows after you fix the failed task
View Session and Workflow logs

Stopping a Session Task means the server stops reading data
Abort has a timeout of 60 seconds; if the server has not finished processing and committing data by the timeout, the threads and processes associated with the session are killed
128
Monitoring Workflows
Task View
Task Name Workflow Name Worklet Name Start Time Completion Time
Ping Server
Status
Start, Stop, Abort, Resume Tasks, Workflows, and Worklets
Get Session Logs (right click on Task)
Monitoring Workflows
Task View
Monitoring filters can be set using drop down menus
130
Monitoring Workflows
Truncating Monitor Logs
The Repository Manager Truncate Log option clears the Workflow Monitor logs
pmcmd Command

pmcmd is a program that you can use to communicate with the Informatica server. You can use pmcmd in the following modes:
Command-line mode: the command-line syntax allows you to write scripts for scheduling workflows. Each command should include connection information for the Informatica server.
Interactive mode: you establish and maintain an active connection to the Informatica server, which allows you to issue a series of commands.
Command              Mode(s)                    Description
getserverdetails     Command line, interactive  Displays details including server status, information on active workflows, and timestamp information
connect / disconnect Interactive                Connects to / disconnects from the Informatica server
help                 Command line, interactive  Displays a list of pmcmd commands and syntax
waittask             Command line, interactive  Instructs the Informatica server to wait for the completion of a running task before starting another command
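In command-line mode, each invocation carries its own connection information. A hedged Python sketch assembling such a command line; the service and user names are placeholders, and the flag spellings assume pmcmd's -sv/-u/-p options:

```python
def pmcmd_command(command: str, service: str, user: str, password: str,
                  *args: str) -> str:
    """Assemble a pmcmd command line (command-line mode). Every call carries
    its own connection information, as described above. The names used here
    are illustrative placeholders, not a live configuration."""
    parts = ["pmcmd", command, "-sv", service, "-u", user, "-p", password]
    parts.extend(args)
    return " ".join(parts)

# Example: query service details from a script
line = pmcmd_command("getservicedetails", "IntSvc", "admin", "secret")
```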
134
Session Parameter
Represents values that we might change between sessions, such as a database connection or a source file. Used in session properties and defined in a parameter file. Can be specified when we start a session using a pmcmd command.

Types of Session Parameters:
Built-in ($PMSessionLogFile)
User-defined (database connections, source file names, target file names)

Parameters make the session more flexible and session management easier. Session parameters do not have default values.
135
Session Parameter
136
LOG FILE
Informatica creates log files for each workflow that it runs. The log files contain information about the tasks the Informatica server performs and statistics about the workflow and all the sessions in the workflow. If the writer or the target database rejects data during a session run, the Informatica server creates a file that contains the rejected rows.

Types of log files
137
Parameter file
Parameterization gives a standard feel to sessions and workflows. Session parameters represent values that we have to change between sessions, such as a database connection or source file. Mapping parameters are given in the session properties and then defined in a parameter file. During the session run, the PowerCenter Server resolves each reference to the parameter to that value.
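A parameter file is a plain-text file of section headers and name=value pairs. A hedged sample, assuming the usual [folder.WF:workflow.ST:session] header form; the folder, workflow, session, connection, and file names below are all illustrative:

```
[MyFolder.WF:wf_load_sales.ST:s_m_load_sales]
$DBConnection_Source=Oracle_Src
$InputFile_Orders=/data/in/orders.dat
$$LoadDate=01/01/2024
```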
Filter transformation
140
Filter transformation
By the end of this section you will be familiar with: Filter properties
141
Filter transformation
Drops rows conditionally
Active Transformation
Ports: all input/output
Specify a filter condition
Usage:
Filter rows from flat file sources
Single-pass source(s) into multiple targets
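The drop-rows-conditionally behavior can be sketched in a few lines. A conceptual Python analogue (not Informatica expression syntax); the AMOUNT port and threshold are illustrative:

```python
def filter_transform(rows, condition):
    """Active transformation: rows failing the filter condition are dropped
    (conceptual sketch)."""
    return [row for row in rows if condition(row)]

kept = filter_transform(
    [{"AMOUNT": 50}, {"AMOUNT": 500}],
    lambda r: r["AMOUNT"] > 100,   # filter condition, analogous to AMOUNT > 100
)
```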
142
Aggregator Transformation
143
Aggregator Transformation
By the end of this section you will be familiar with:
Aggregator properties
Aggregator expressions
Using sorted data
144
Aggregator Transformation
Performs aggregate calculations
Active Transformation
Connected mode only
Ports: mixed; variables allowed; Group By allowed
Create the expression in an output port
Usage: standard aggregations
145
Aggregator properties
Use to toggle Sorted input
146
Informatica Functions
Aggregate Functions
AVG COUNT FIRST LAST MAX MEDIAN MIN PERCENTILE STDDEV SUM VARIANCE
Return summary values for non-null values in selected ports
Aggregate functions can be used only in Aggregator transformations
Calculate a single value for all records in a group
Only one aggregate function can be nested within an aggregate function
Conditional statements can be used with these functions
147
Aggregator expressions
Aggregate functions are supported only in the Aggregator Transformation
Sorted Data
The Aggregator can handle sorted or unsorted data; sorted data can be aggregated more efficiently, decreasing total processing time
The PowerCenter server caches data for each group and releases the cached data upon reaching the first record of the next group
Data must be sorted according to the Aggregator's Group By ports
The performance gain will depend on varying factors
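The release-on-group-boundary behavior for sorted input can be sketched as a streaming sum. A conceptual Python analogue (not server internals); the DEPT/SAL port names are illustrative:

```python
def aggregate_sorted(rows, group_key, value_key):
    """With sorted input, only the current group is cached, and each group's
    result is released on the first row of the next group (conceptual sketch).
    Input must already be sorted by the group-by port."""
    results = []
    current_key, total = None, 0
    for row in rows:
        key = row[group_key]
        if key != current_key:
            if current_key is not None:
                results.append((current_key, total))   # release finished group
            current_key, total = key, 0
        total += row[value_key]
    if current_key is not None:
        results.append((current_key, total))           # release the last group
    return results
```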
149
No rows are released from Aggregator until all rows are aggregated
150
Each separate group (one row) is released as soon as that group is aggregated
151
Joiner transformation
152
Joiner transformation
By the end of this section you will be familiar with:
When to use a Joiner transformation
Joiner properties
Joiner expressions
Nested joins
153
Homogeneous Joins
Joins that are done with a SQL SELECT statement:
The Source Qualifier contains the join SQL
Tables are on the same DB server (or are synonyms)
The database server does the join work
Multiple homogeneous tables can be joined
154
Heterogeneous Joins
Joins that cannot be done with a SQL statement:
An Oracle table and a Sybase table
Two Informix tables on different servers
Two flat files
A flat file and a database table
155
Joiner transformation
Performs heterogeneous joins on records from different databases or flat file sources
Active Transformation
Connected
Ports: all input or input/output; M denotes a port that comes from the master source
Specify the join condition
Usage:
Join two flat files
Join two tables from different databases
Join a flat file with a relational table
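A heterogeneous join happens inside PowerCenter rather than in a database. A conceptual Python sketch, assuming the common cache-the-master, stream-the-detail pattern for a normal (inner) join; the row data is illustrative:

```python
def joiner(master_rows, detail_rows, key):
    """Heterogeneous join performed by the ETL engine, not a database:
    cache the master source, then stream the detail source against it
    (conceptual sketch of a normal/inner join)."""
    master_cache = {row[key]: row for row in master_rows}   # master is cached
    joined = []
    for detail in detail_rows:                              # detail streams
        master = master_cache.get(detail[key])
        if master is not None:                              # join condition met
            joined.append({**master, **detail})
    return joined
```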
156
Joiner properties
157
Joiner Conditions
158
Nested joins
Used to join three or more heterogeneous sources
159
Mid-mapping join
The Joiner does not accept input in the following situations:
Both input pipelines begin with the same Source Qualifier
Both input pipelines begin with the same Normalizer
Both input pipelines begin with the same Joiner
Either input pipeline contains an Update Strategy
Either input pipeline contains a connected or unconnected Sequence Generator transformation
Sorter Transformation
162
Sorter Transformation
By the end of this section you will be familiar with:
Sorter properties
Sorter limitations
163
Sorter Transformation
Can sort data from relational tables or flat files
The sort takes place on the Informatica server machine; no database server is involved in performing the sort
Multiple sort keys are supported
The Sorter transformation is often more efficient than a sort performed on a database with an ORDER BY clause
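Multi-key sorting on the server machine can be sketched with stable sorts. A conceptual Python analogue (not the Sorter's actual implementation); the port names are illustrative:

```python
def sorter(rows, keys):
    """Sort on one or more keys on the ETL server machine rather than via a
    database ORDER BY (conceptual sketch). keys is a list of
    (port_name, descending) pairs, applied left to right."""
    out = list(rows)
    # Stable sorts applied right-to-left make the leftmost key primary.
    for port, descending in reversed(keys):
        out.sort(key=lambda r: r[port], reverse=descending)
    return out
```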
164
Sorter Transformation
One or more sort keys defined
165
Sorter Properties
Cache size can be changed
Minimum is 8 MB. Ensure the configured cache size is actually available on the Informatica server, or the Session Task will fail
166
Lookup Transformation
167
Lookup Transformation
By the end of this section you will be familiar with:
Lookup principles
Lookup properties
Lookup techniques
168
Return value
169
Lookup Transformation
Looks up values from database objects and provides to other components in a mapping
Passive Transformation
Connected or Unconnected
Ports: mixed; L denotes a Lookup port; R denotes the port used as the return value in an unconnected lookup
Specify the Lookup condition
Usage:
Get related values
Verify whether records exist or data has changed
170
Lookup properties
Can override Lookup SQL
Toggle caching
Lookup Conditions
Multiple conditions are supported
173
Uncached
Each Mapping row needs one SQL SELECT
Rule of thumb: cache if the number of records in the Lookup table is small relative to the number of mapping rows
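The trade-off above can be sketched as a one-pass cache build followed by cheap probes. A conceptual Python analogue (not Informatica internals); the CUST_ID/NAME ports are illustrative:

```python
def make_cached_lookup(lookup_rows, key, return_port):
    """Cached lookup: one pass builds the cache, after which each mapping row
    is a dictionary probe instead of a SQL SELECT (conceptual sketch)."""
    cache = {row[key]: row[return_port] for row in lookup_rows}

    def lookup(value, default=None):
        return cache.get(value, default)

    return lookup

lookup_name = make_cached_lookup(
    [{"CUST_ID": 1, "NAME": "Acme"}], key="CUST_ID", return_port="NAME")
```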
Dynamic Lookup
176
Dynamic Lookup
By the end of this section you will be familiar with:
Dynamic lookup theory
Dynamic lookup advantages
Dynamic lookup limitations
177
Persistent caches
By default, Lookup caches are not persistent: when the Session completes, the cache is erased
The cache can be made persistent through the Lookup properties: when the Session completes, the persistent cache is stored in files on the server hard disk
The next time the Session runs, the cached data is loaded fully or partially into RAM and reused
This can improve performance, but stale data may pose a problem
Does NOT change the row type; use an Update Strategy transformation before or after the Lookup to flag rows for insert or update to the target
Ignore NULL property (per port): ignore NULL values from the input row and update the cache using only non-NULL input values
INSERT
DD_INSERT
Router transformation
192
Router transformation
By the end of this section you will be familiar with:
Using a Router
Router groups
193
Router transformation
Rows sent to multiple filter conditions
Active Transformation
Connected
Ports: all input/output
Specify a filter condition for each group
Usage: link source data in one pass to multiple filter conditions
Router Groups
Input group (always one)
User-defined groups
Default group (always one)
Each group has one condition
ALL group conditions are evaluated for each row
Group outputs can be ignored
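The evaluate-every-condition-per-row behavior can be sketched directly. A conceptual Python analogue (not Informatica syntax); the group names and conditions are illustrative:

```python
def router(rows, groups):
    """Router: every group condition is evaluated for each row, so a row can
    land in several user-defined groups; rows matching no condition go to the
    default group (conceptual sketch). groups maps group name -> condition."""
    out = {name: [] for name in groups}
    out["DEFAULT"] = []
    for row in rows:
        matched = False
        for name, condition in groups.items():
            if condition(row):
                out[name].append(row)
                matched = True
        if not matched:
            out["DEFAULT"].append(row)
    return out
```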
196
Reusable transformations
197
Reusable transformations
By the end of this section you will be familiar with:
Reusable transformation advantages
Reusable transformation limitations
Promoting transformations to reusable
Demoting reusable transformations
198
Transformation Developer
Reusable transformations
199
Reusable transformations
Define once - reuse many times Reusable Transformations
Can be a copy or a shortcut
Edit ports only in the Transformation Developer
Properties can be edited in the mapping
Instances dynamically inherit changes
Be careful: it is possible to invalidate mappings by changing reusable transformations
Sequence Generator Transformation

Passive Transformation
Connected
Ports: two predefined output ports, NEXTVAL and CURRVAL; no input ports allowed
Usage:
Generate sequence numbers
Shareable across mappings
205
Target Options
206
Target options
By the end of this section you will be familiar with:
Row operations
Load types
Constraint-based loading
Error handling
207
Target properties
Session Task: choose the target
208
Delete SQL
DELETE from <target> WHERE <primary key> = <pkvalue>
The SQL statement used will appear in the Session log file
209
Constraint-based Loading
Maintains referential integrity in the Targets
Example 1
With only One Active source, rows for Targets 1-3 will be loaded properly and maintain referential integrity
Example 2
With Two Active sources, it is not possible to control whether rows for Target 3 will be loaded before or after those for Target 2
The following transformations are Active sources: Advanced External Procedure, Source Qualifier, Normalizer, Aggregator, Sorter, Joiner, Rank, Mapplet (containing any of the previous transformations)
210
Transformation errors are written to the session log, not the .bad file
Multi-Task Workflows
Tasks can be run sequentially, like this:
Tasks shown are all Sessions, but they can also be other Tasks such as Commands, Timer, Email, etc.
214
Multi-Task Workflows
Tasks can be run concurrently, like this:
Tasks shown are all Sessions, but they can also be other Tasks such as Commands, Timer, Email, etc.
215
Multi-Task Workflows
Tasks can be run in a combination of concurrent and sequential patterns within one Workflow, like this:
Tasks shown are all Sessions, but they can also be other Tasks such as Commands, Timer, Email, etc.
216
Additional transformations
217
Additional transformations
By the end of this section you will be familiar with:
The Rank transformation
The Normalizer transformation
The Stored Procedure transformation
The External Procedure transformation
The Union transformation
218
Rank transformation
Filters the top or bottom range of records
Active Transformation
Connected
Ports: mixed; one predefined output port, RANKINDEX; variables allowed; Group By allowed
Usage: select a top/bottom number of records
219
Normalizer transformation
Normalizes records from relational or VSAM sources
Active Transformation
Connected
Ports: input/output or output
Usage:
Required for VSAM Source definitions
Normalize flat file or relational source definitions
Generate multiple records from one record
220
Normalizer transformation
221
Normalizer transformation
Generated Column ID
222
Stored Procedure Transformation

Passive Transformation
Connected or Unconnected
Ports: mixed; R denotes a port that will return a value from the stored function to the next transformation
Usage: perform transformation logic outside PowerMart / PowerCenter
223
Union Transformation
A multiple-input-group transformation that can be used to merge data from multiple pipelines or pipeline branches into one pipeline branch. Similar to the UNION ALL SQL statement: the Union transformation does not remove duplicate rows.
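The UNION ALL semantics, duplicates preserved, can be sketched in a few lines. A conceptual Python analogue (not Informatica code):

```python
def union_all(*pipelines):
    """UNION ALL semantics: merge pipelines into one branch without removing
    duplicate rows (conceptual sketch)."""
    merged = []
    for pipeline in pipelines:
        merged.extend(pipeline)
    return merged
```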
Java Transformation
Transformation type: Active/Passive, Connected
Java transformation behavior is based on the following events:
The transformation receives an input row
The transformation has processed all input rows
The transformation receives a transaction notification such as commit or rollback
HTTP Transformation
Transformation type: Passive, Connected
The HTTP transformation enables you to connect to an HTTP server to use its services and applications. When you run a session with an HTTP transformation, the Integration Service connects to the HTTP server and issues a request to retrieve data from or update data on the HTTP server, depending on how you configure the transformation.
231
SQL Transformation
Transformation type: Active/Passive, Connected
The SQL transformation processes SQL queries midstream in a pipeline. You can insert, delete, update, and retrieve rows from a database, and you can pass the database connection information to the SQL transformation as input data at run time. The transformation processes external SQL scripts or SQL queries that you create in an SQL editor, and returns rows and database errors.
Conditional lookups
By the end of this section you will know the conditional lookup:
Technique
Advantages
Limitations
234
Unconnected lookup
Always literally unconnected from other transformations. There are no blue data flow arrows leading to or from an unconnected lookup
Lookup function can be used within any transformation that supports expressions, such as this Aggregator
Unconnected lookup
236
Conditional Lookups
237
Return port
WARNING ! If the return port is not defined, the lookup function expression will be invalid
239
Connected Lookup:
Part of the mapping data flow
Returns multiple values (by linking output ports to another transformation)
Executed for every record passing through the transformation
More visible, shows where the lookup values are used
Default values are used

Unconnected Lookup:
Separate from the mapping data flow
Returns one value (by checking the Return (R) port option for the output port that provides the return value)
Only executed when the lookup function is called
Less visible, as the lookup is called from an expression within another transformation
Default values are ignored
240
Heterogeneous Targets
241
Heterogeneous Targets
By the end of this section you will be familiar with:
Heterogeneous target types
Heterogeneous target limitations
Target conversions
242
Oracle table
Flat file
Mapplets
247
Mapplets
By the end of this section you will be familiar with:
Mapplet advantages
Mapplet types
Mapplet limitations
248
Mapplet Designer
Mapplet Advantages
Useful for repetitive tasks / logic
Represents a set of transformations
Mapplets are reusable
Use an instance of a Mapplet in a Mapping
Changes to a Mapplet are inherited by all instances
The Server expands the Mapplet at runtime
Unsupported Transformations
You may use any transformation in a Mapplet except:
XML Source definitions
COBOL Source definitions
Normalizer
Pre- and post-session stored procedures
Target definitions
Other Mapplets
The resulting Mapplet has no input ports
When this Mapplet is used in a Mapping, it must be the first object in the data flow
255
A Mapplet Input transformation has NO input ports of its own
Only ports connected from the Input transformation to another transformation display in the resulting Mapplet
When connecting ports from the Input transformation, you may not connect the same port to more than one transformation
Transformation
Transformation
256
Mapplet Output
Use a Mapplet Output transformation to define Mapplet output ports
Mapplets must contain at least one Output transformation
An Output transformation must have at least one port connected to another transformation within the Mapplet
257
Passive
Active
Multiple Active Mapplets or Active and Passive Mapplets cannot populate the same target instance
261
Repository Topics
By the end of this section you should be familiar with:
The purpose of the Repository Server and Agent
The Repository Manager interface
Repository maintenance operations
Security and privileges
Object sharing, searching, and locking
262
Repository Service
Each repository has an independent architecture for the management of the physical repository tables Components: one Repository Service
Informatica Admin Console
Domain
Repository Agent(s)
Client overhead for repository management is greatly reduced by the Repository Service
Repository
Repository Management
Perform all repository maintenance tasks using the Informatica Admin Console
Maintenance tasks:
Create, Copy, Backup, Restore, Upgrade, Register, Un-register, Delete, Notify Users, Last activity log
265
Analysis Window
Dependency Window
Output Window
266
Steps:
267
User Management
GROUP STRUCTURE
Groups Users Privileges
Administrators Public
(all privileges) Use Designer Browse Repository Use Workflow Manager As defined
As defined
As defined
SECURITY CONTROL
Security Access To Issued By Issued To
Privileges Permissions
Repository Folder
270
Thank You
271