Extract, Transform, Load (ETL)
Torben Bach Pedersen, Aalborg University
ETL Overview
General ETL issues
- The ETL/DW refreshment process
- Building dimensions
- Building fact tables
- Extract
- Transformations/cleansing
- Load
A concrete ETL tool: MS Integration Services
- Demo
- Example ETL flow
The ETL process: Extract -> Transform -> Load
- Extract: extract relevant data
- Transform: transform data to DW format; build keys, etc.; cleansing of data
- Load: load data into the DW; build aggregates, etc.
DW Phases
- Design phase: modeling, DB design, source selection
- Loading phase: first load/population of the DW; based on all data in the sources
- Refreshment phase: keep the DW up-to-date wrt. source data changes
ETL/DW Refreshment
Refreshment Workflow
[Architecture diagram: operational source systems feed the data staging area; presentation servers hold data marts with conformed dimensions and facts (some marts with aggregate-only data) on the warehouse bus; the query side provides warehouse browsing, access and security, query management, standard reporting, and an activity monitor for reporting and data mining tools.]
Data Staging Area (DSA)
- No user queries (some do it)
- Sequential operations (few) on large data volumes
  - Performed by central ETL logic
  - Easily restarted
  - No need for locking, logging, etc.
  - RDBMS or flat files? (DBMSs have become better at this)
- Finished dimensions copied from the DSA to the relevant marts
- Allows centralized backup/recovery
  - Often too time-consuming to initially load all data marts after a failure
  - Thus, backup/recovery facilities are needed
  - Better to do this centrally in the DSA than in all data marts
ETL Construction Process
1) Make high-level diagram of source-destination flow
2) Test, choose and implement ETL tool
3) Outline complex transformations, key generation and job sequence for every destination table
4) Construct and test building static dimension
5) Construct and test change mechanisms for one dimension
6) Construct and test remaining dimension builds
7) Construct and test initial fact table build
8) Construct and test incremental update
9) Construct and test aggregate build (you do this later)
10) Design, construct, and test ETL automation
(Steps 4-6 concern the construction of dimensions.)
Building Dimensions
Static dimension table
- Relatively easy?
- Assignment of keys: production keys to DW keys using a mapping table
- Combination of data sources: find a common key?
- Check one-one and one-many relationships using sorting
Handling dimension changes
- Described in the last lecture
- Find the newest DW key for a given production key
- The table mapping production keys to DW keys must be updated (a sketch follows below)
Load of dimensions
- Small dimensions: replace
- Large dimensions: load only changes
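A minimal sketch of such a key map in SQL; the table and column names (keymap, prod_id, dw_key) and the T-SQL variable @prod_id are assumptions for illustration:

  -- One row per assignment of a DW surrogate key to a production key;
  -- valid_from records when the assignment was made.
  CREATE TABLE keymap (
    prod_id    INT      NOT NULL,  -- production key from the source
    dw_key     INT      NOT NULL,  -- surrogate key used in the DW
    valid_from DATETIME NOT NULL,
    PRIMARY KEY (prod_id, dw_key)
  );

  -- Find the newest DW key for a given production key:
  SELECT dw_key
  FROM keymap
  WHERE prod_id = @prod_id
    AND valid_from = (SELECT MAX(valid_from)
                      FROM keymap
                      WHERE prod_id = @prod_id);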
Initial load
- ETL for all data up till now
- Done when the DW is started the first time
- Often problematic to get correct historical data
- Very heavy: large data volumes
Incremental update
- Move only changes since the last load
- Done periodically (monthly/weekly/daily/hourly/...) after DW start
- Less heavy: smaller data volumes
- The relevant dimension rows for new facts must be in place
- Special key considerations if the initial load must be performed again
(A sketch of an incremental dimension update follows below.)
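A minimal sketch of an incremental (overwrite-style, Type 1) dimension update in SQL, assuming hypothetical tables dim_customer (the DW dimension) and delta_customer (changed source rows staged in the DSA):

  -- Overwrite attributes of existing members...
  UPDATE d
  SET    d.name = s.name, d.city = s.city
  FROM   dim_customer d
  JOIN   delta_customer s ON s.prod_id = d.prod_id;

  -- ...and insert members not seen before (surrogate key assumed
  -- generated by an IDENTITY column on dim_customer).
  INSERT INTO dim_customer (prod_id, name, city)
  SELECT s.prod_id, s.name, s.city
  FROM   delta_customer s
  WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                     WHERE  d.prod_id = s.prod_id);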
Types of data sources
Non-cooperative sources
- Snapshot sources: provide only a full copy of the source
- Specific sources: each is different, e.g., legacy systems
- Logged sources: write a change log (DB log)
- Queryable sources: provide a query interface, e.g., SQL
Cooperative sources
- Replicated sources: publish/subscribe mechanism
- Call back sources: call external code (the ETL) when changes occur
- Internal action sources: only internal actions when changes occur (DB triggers are an example)
Extract
Goal: fast extraction of relevant data
- Extraction from source systems can take a long time
Types of extracts:
- Extract applications (SQL): co-existence with other applications
- DB unload tools: much faster than SQL-based extracts
- Extract applications are sometimes the only solution
Extracts can take days/weeks
- A drain on the operational systems and on the DW systems
- => Extract/ETL only the changes since the last load (the delta)
Computing Deltas
Much faster to ETL only the changes since the last load
A number of methods can be used:
- Compare with the previous extract (stored in the DSA); the delta can easily be computed from current + last extract
  + Always possible
  + Handles deletions
  - Does not reduce extract time
- Timestamp rows in the source, updated by a DB trigger; extract only rows where timestamp > time of the last extract
  + Reduces extract time
  +/- Less operational overhead
  - Cannot (alone) handle deletions
  - The source system must be changed
- Messages: applications insert messages in a "queue" at updates
  + Works for all types of updates and systems
  - Operational applications must be changed, plus operational overhead
- DB triggers: triggers execute actions on INSERT/UPDATE/DELETE
  + Operational applications need not be changed
  + Enables real-time update of the DW
  - Operational overhead
- Replication based on the DB log: find changes directly in the DB log, which is written anyway
  + Operational applications need not be changed
  + No operational overhead
  - Not possible in some DBMSs (SQL Server, Oracle, and DB2 can do it)
(A trigger-based sketch follows below.)
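A minimal trigger sketch in SQL Server syntax; the source table customer(prod_id, ...) and the delta table read by the ETL are assumptions for illustration:

  -- Delta table the ETL reads (and empties) at each load.
  CREATE TABLE customer_delta (
    prod_id    INT      NOT NULL,
    op         CHAR(1)  NOT NULL,  -- 'I'nsert, 'U'pdate or 'D'elete
    changed_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
  );

  CREATE TRIGGER trg_customer_delta
  ON customer AFTER INSERT, UPDATE, DELETE
  AS
  BEGIN
    -- Rows only in "inserted" are inserts, rows only in "deleted"
    -- are deletes, and rows in both are updates.
    INSERT INTO customer_delta (prod_id, op)
    SELECT prod_id, 'I' FROM inserted
    WHERE  prod_id NOT IN (SELECT prod_id FROM deleted)
    UNION ALL
    SELECT prod_id, 'U' FROM inserted
    WHERE  prod_id IN (SELECT prod_id FROM deleted)
    UNION ALL
    SELECT prod_id, 'D' FROM deleted
    WHERE  prod_id NOT IN (SELECT prod_id FROM inserted);
  END;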
ETL Overview
General ETL issues
- The ETL process
- Building dimensions
- Building fact tables
- Extract
- Transformations/cleansing
- Load
A concrete ETL tool: MS Integration Services
- Demo
- Example ETL flow
Common Transformations
Data type conversions
- EBCDIC -> ASCII/Unicode
- String manipulations
- Date/time format conversions
Normalization/denormalization
- To the desired DW format
- Depending on the source format
Building keys
- A table matches production keys to surrogate DW keys
- Ensure correct handling of history, especially for total reload
(A small conversion sketch follows below.)
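A small sketch of typical conversions in SQL, assuming a hypothetical DSA table src_sales with text-typed columns; TRY_CONVERT requires SQL Server 2012+ (use plain CONVERT on older versions):

  SELECT
    TRY_CONVERT(date, sale_date_txt, 103) AS sale_date,  -- style 103 = dd/mm/yyyy
    UPPER(LTRIM(RTRIM(cust_name)))        AS cust_name,  -- normalize case/spacing
    CONVERT(decimal(10,2), amount_txt)    AS amount
  FROM src_sales;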
Data Quality
Data almost never has decent quality. Data in the DW must be:
- Precise: DW data must match known numbers, or an explanation is needed
- Complete: the DW has all relevant data, and the users know it
- Consistent: no contradictory data; aggregates fit with detail data
- Unique: the same thing is called the same and has the same key (e.g., customers)
- Timely: data is updated frequently enough, and the users know when
Cleansing
BI does not work on raw data
- Pre-processing necessary for good results
- Dirty data can disturb BI analyses if not handled:
  - Spellings, codings, ...
  - Production keys, comments, ...
  - City name instead of ZIP code, ...
  - Customer data, ...
  - Extra values used for programmer hacks
Types Of Cleansing
Conversion and normalization
- Text coding, date formats, etc.
- The most common type of cleansing
Special-purpose cleansing
- Normalize spellings of names, addresses, etc.
- Remove duplicates, e.g., duplicate customers (a sketch follows below)
Domain-independent cleansing
- Approximate, fuzzy joins on not-quite-matching keys (prefix, ...)
Rule-based cleansing
- User-specified rules, if-then style
- Automatic rules: use data mining to find patterns in the data
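A minimal duplicate-removal sketch in SQL, assuming a hypothetical staging table dsa_customer with a production key and a load timestamp; the CTE+DELETE idiom is SQL Server syntax:

  -- Keep only the newest row per production key.
  WITH ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY prod_id
                              ORDER BY load_time DESC) AS rn
    FROM dsa_customer
  )
  DELETE FROM ranked WHERE rn > 1;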
Cleansing
Mark facts with a Data Status dimension
- Normal, abnormal, outside bounds, impossible, ...
- E.g., customer about to cancel contract, ...
- Facts can be taken in/out of analyses
Uniform treatment of missing values, which can disturb BI tools
- Replace with NULLs (or estimates)
- Use an explicit NULL value rather than a "normal" value (0, -1, ...)
- Use NULLs only for measure values (estimates instead?)
- Use special dimension keys for NULL dimension values (a sketch follows below)
Aggregation issues
- Performance
- Statistical significance only for higher levels
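A minimal sketch of special dimension keys for NULL dimension values; the tables dim_customer and src_sales and the reserved key 0 are assumptions for illustration:

  -- A special "Unknown" member with surrogate key 0.
  INSERT INTO dim_customer (cust_key, prod_id, name)
  VALUES (0, NULL, 'Unknown');

  -- During fact load, facts that find no dimension row get key 0
  -- instead of a NULL dimension key.
  SELECT COALESCE(d.cust_key, 0) AS cust_key,
         s.amount
  FROM   src_sales s
         LEFT JOIN dim_customer d ON d.prod_id = s.cust_prod_id;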
Improving Data Quality
A data steward
- A given steward has the responsibility for certain tables
- Includes manual inspections and corrections!
Source-controlled improvements
- The optimal?
DW-controlled improvements
- Default values: "not yet assigned 157" as a note to the data steward
Checks on data quality
- Are totals as expected?
- Do results agree with an alternative source?
- Allow management to see weird data in their reports?
Load
Goal: fast loading into the DW
- Loading deltas is much faster than a total load
- SQL-based update is slow: large overhead (optimization, locking, etc.) for every SQL call
- DB load tools are much faster (see the sketch below)
- Some load tools can also perform UPDATEs
Index on tables slows the load a lot
- Drop the index and rebuild it after the load
- Can be done per partition
Parallelization
- Dimensions can be loaded concurrently
- Fact tables can be loaded concurrently
- Partitions can be loaded concurrently
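A minimal load sketch using SQL Server's BULK INSERT; the file path, table, and index names are assumptions for illustration:

  -- Drop the index, bulk load the delta file, rebuild the index.
  DROP INDEX ix_sales_date ON dw.dbo.sales_fact;

  BULK INSERT dw.dbo.sales_fact
  FROM 'C:\dsa\sales_delta.csv'
  WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

  CREATE INDEX ix_sales_date ON dw.dbo.sales_fact (date_key);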
Load
Relationships in the data
- Referential integrity must be ensured
- Can be done by the loader
Aggregates
- Must be built and loaded at the same time as the detail data
- Today, RDBMSes can often do this
Load tuning
- Load without log
- Sort the load file first
- Make only simple transformations in the loader
- Use loader facilities for building aggregates
- Use the loader within the same database
- Use partitions or several sets of tables (like MS Analysis)
ETL Tools
ETL tools from the big vendors
- Oracle Warehouse Builder
- IBM DB2 Warehouse Manager
- Microsoft Integration Services
Offer much functionality
- Data modeling
- ETL code generation
- Scheduling of DW jobs
Many others
- Hundreds of tools
- Often specialized tools for certain jobs (insurance cleansing, ...)
The "best" tool does not exist
- Choose based on your own needs
- Check first if the standard tools from the big vendors are OK
Issues
Files versus streams/pipes
- Streams/pipes: no disk overhead, fast throughput
- Files: easier restart, often the only possibility
ETL tool or hand coding?
- Code: easy start, co-existence with the IT infrastructure
- Tool: better productivity on subsequent projects
Load frequency
- ETL time depends on data volumes
- Daily load is much faster than monthly
- Applies to all steps in the ETL process
ETL Overview
General ETL issues
- The ETL process
- Building dimensions
- Building fact tables
- Extract
- Transformations/cleansing
- Load
A concrete ETL tool: MS Integration Services
- Demo
- Example ETL flow
Integration Services
Microsoft's ETL tool
- Part of SQL Server 2005
Tools
- Import/export wizard: simple transformations
- BI Development Studio: advanced development
IS Architecture
Packages
The central concept in IS
A package = an organized set of:
- Connections
- Control flow: tasks, workflow (precedence constraints)
- Data flow: sources, transformations, destinations
- Logging, transaction handling, etc.
Data Sources/Connections
All ODBC/OLE DB data sources
- Almost everything!
Examples:
- SQL Server, Access, Excel, dBase, HTML files, Paradox, text files, Oracle
Containers
Provide structure in packages:
- Foreach Loop container: runs a control flow repeatedly by using an enumerator
- For Loop container: runs a control flow repeatedly by testing a condition
- Sequence container: groups tasks and containers into control flows that are subsets of the package control flow
- Task Host container: provides services to a single task
Tasks
Many types:
- Data Flow: runs data flows
Data Preparation Tasks
- File System: operations on files
- FTP: up/down-load data
- Web Service: call a web service
- XML: work with XML data (XSLT, XPath, ...)
Workflow Tasks
- Execute Package: execute other IS packages, good for structure!
- Execute DTS2000 Package: execute legacy DTS packages
- Execute Process: run an external application/batch file
- Message Queue: send/receive MSMQ messages
- Send Mail: useful for notifications
- WMI Data Reader: read Windows Management Instrumentation data
- WMI Event Watcher: wait for a WMI event
Tasks 2
SQL Server Tasks
- Bulk Insert: fast load of data
- Execute SQL: execute any SQL query
- Transfer: databases, error messages, jobs, logins, master stored procedures, SQL Server objects
Scripting Tasks
- ActiveX Script: execute ActiveX scripts
- Script: execute VB .NET code
Analysis Services Tasks
- Analysis Services Processing: process dimensions, cubes, models
- Analysis Services Execute DDL: create/drop/alter cubes, models
- Data Mining Query: run data mining queries in DMX
Event Handlers
- Executables (packages, containers, tasks) can raise events
- Event handlers manage the events
- Similar to event handling in programming languages
Data flow elements
Sources
- Make external data available
Transformations
- Update, summarize, cleanse, merge, distribute data
Destinations
- Write data to a specific store
- Create an in-memory data set
Transformations
Business intelligence transformations
- Fuzzy Grouping: standardizes values in column data
- Fuzzy Lookup: looks up values in a reference table by fuzzy match
- Term Extraction: extracts terms from text
- Term Lookup: looks up terms and finds term counts
- Data Mining Query: runs data mining prediction queries
Row Transformations
- Character Map: applies string functions to character data
- Copy Column: adds copies of input columns to the output
- Data Conversion: converts data types
- Derived Column: populates columns using expressions
- Script Component: calls scripts
- OLE DB Command: calls SQL for each row
Transformations 2
Rowset Transformations
- Aggregate: performs aggregations
- Sort: sorts data
- Percentage Sampling: creates a sample data set by setting a percentage
- Row Sampling: creates a sample data set by setting a number of rows
- Pivot: creates a less normalized pivot table from a normalized table
- Unpivot: the reverse of Pivot
Split and Join Transformations
- Conditional Split: routes data rows to different outputs
- Multicast: distributes data sets to multiple outputs
- Union All: merges multiple data sets
- Merge: merges two sorted data sets
- Merge Join: joins two data sets using a FULL/LEFT/INNER join
- Lookup: looks up reference values by exact match
Transformations 3
Other Transformations
- Export Column: inserts data from a data flow into a file
- Import Column: reads data from a file and adds it to a data flow
- Audit: makes environment information available to the data flow
- Row Count: counts rows and stores the final count in a variable
- Slowly Changing Dimension: configures the update of an SCD
Custom Transformations
- For very specialized functionality
A Simple IS Case
Use BI Dev Studio/Import Wizard to copy TREO tables
Save in
- SQL Server
- File system (available from web page)
Package contents: DROP, CREATE, source, transformation, destination
Execute package
- Error messages? Dependencies can, however, be set up
Build an ETL flow using MS DTS that can do an initial (first-time) load of the data warehouse:
- Include logic for generating the special DW surrogate integer keys for the tables
- Discuss and implement basic transformations/data cleansing
Extensions:
- Extend the ETL flow to handle incremental loads, i.e., updates to the DW, both for dimensions and facts
- Extend the DW design and the ETL logic to handle slowly changing dimensions of Type 2 (a sketch follows below)
- Implement more advanced transformations/data cleansing
- Perform error handling in the ETL flow
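A minimal sketch of a Type 2 update in SQL, assuming a hypothetical dim_customer with valid_from/valid_to columns, a staged delta table delta_customer, a load-date variable @load_date, and a surrogate key generated by an IDENTITY column:

  -- Expire the current version of each changed member...
  UPDATE d
  SET    d.valid_to = @load_date
  FROM   dim_customer d
  JOIN   delta_customer s ON s.prod_id = d.prod_id
  WHERE  d.valid_to IS NULL;   -- current version

  -- ...and insert the new version.
  INSERT INTO dim_customer (prod_id, name, city, valid_from, valid_to)
  SELECT s.prod_id, s.name, s.city, @load_date, NULL
  FROM   delta_customer s;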
Hints
- Test SQL before putting it into IS
- Do one (or just a few) thing(s) at a time:
  - Build the first step and check that the result is as expected
  - Add the second step, execute both, and check the result
  - Add the third step, ...
- Only needed if doing incremental load
- Versions: only needed if handling slowly changing dimensions
Summary
General ETL issues
- The ETL process
- Building dimensions
- Building fact tables
- Extract
- Transformations/cleansing
- Load
A concrete ETL tool: MS Integration Services
- Demo
- Example ETL flow