Noble Datapump

Published by: dararaja on Jun 02, 2011
  • Background
  • Introduction
  • Why Use Datapump
  • Datapump Components
  • Outline
  • Datapump Speed
  • Datapump Control
  • Major Features
  • Additional Datapump Features
  • DataPump Export Setup
  • Default Datapump Directory
  • Export Preliminary Setup
  • Start Datapump Export Job
  • Example Export Parameter File
  • Datapump Import Setup
  • Import Preliminary Setup
  • Start Datapump Import Job
  • Example Import Parfile
  • Some Basic Parameters
  • Important Features
  • Exclude
  • Include
  • Network_Link
  • Filters
  • Table_Exists_Action
  • Import Parameters
  • Interactively Work With The Job
  • Check Status of Datapump Job
  • Kill Datapump Job
  • Interactive Commands
  • More Interactive Commands
  • Restrictions
  • Oracle 11g Features
  • More Oracle 11g Features
  • Oracle 11g OEM
  • Problems Encountered
  • Job Will Not Die
  • Review
  • Comments
  • The End



Background
  • Experience with Oracle Databases
  • Familiar with the Export/Import utility
  • RMAN backups
  • Datapump to replace the old Export/Import

Introduction
  • More features than the standard export/import
  • Use in addition to RMAN backups
  • Another means to upgrade to a higher version of Oracle

Why Use Datapump
  • DataPump handles Oracle 10g data types
  • DataPump features
  • DataPump speed

Datapump Components
  • Command line client expdp
  • Command line client impdp
  • DBMS_DATAPUMP (Datapump API)
  • DBMS_METADATA (Metadata API)

Outline
  • Datapump is faster than standard export/import
  • Setup for datapump export
  • Setup for datapump import
  • Datapump features
  • Experiences
  • Kill the job

Datapump Speed
  • The standard Export and Import utilities ran as clients
  • Datapump runs as part of the database instance on the database server
  • Datapump can do parallel work
      • Create multiple worker processes
      • Create multiple data files (file sets)

Datapump Control
  • Master Control Process
  • Master Table
  • Worker Process

Major Features
  • PARALLEL: maximum number of threads
  • START_JOB: ability to restart a job
  • ATTACH: detach from and reattach to a job
  • NETWORK_LINK: export and import over the network
  • REMAP_DATAFILE: import to a different datafile
  • REMAP_TABLESPACE: map to a new tablespace

Additional Datapump Features
  • Filter by using EXCLUDE and INCLUDE
  • VERSION: specify the version of objects (latest, compatible, or a version number)

DataPump Export Setup
  • Make a server file system directory
  • Create a database directory that references the file system directory
  • Grant read, write privileges on the directory
  • Grant privileges for full export
  • Create an export parameter file

Default Datapump Directory
  • Oracle default datapump directory: DATA_PUMP_DIR
  • $ORACLE_HOME/rdbms/log/
  • Information is found in the DBA_DIRECTORIES view

Export Preliminary Setup
  • mkdir /backup/<database>/datapump
  • Set up your environment for ORACLE_HOME and ORACLE_SID, then: sqlplus / as sysdba
  • create directory datapump_dir as '/backup/<database>/datapump';
  • grant read, write on directory datapump_dir to <dp_schema>;
  • grant exp_full_database to <dp_schema>;
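A minimal sketch of the preliminary setup as a script, assuming a database named mydb and a datapump account dp_user (both hypothetical stand-ins for <database> and <dp_schema>); it writes the SQL to a file so it can be reviewed before running it as SYSDBA. The /tmp prefix stands in for the real /backup file system.

```shell
#!/bin/sh
# Sketch only: DB, SCHEMA, and the /tmp prefix are illustrative
# stand-ins for <database>, <dp_schema>, and /backup/<database>.
DB=mydb
SCHEMA=dp_user
DPDIR=/tmp/backup/$DB/datapump

# 1. Make the server file system directory
mkdir -p "$DPDIR"

# 2. Write the directory/grant SQL for review, then run it as SYSDBA:
#      sqlplus / as sysdba @setup_datapump.sql
cat > setup_datapump.sql <<EOF
create directory datapump_dir as '$DPDIR';
grant read, write on directory datapump_dir to $SCHEMA;
grant exp_full_database to $SCHEMA;
EOF
```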

Start Datapump Export Job
  • expdp parfile=/backup/<database>/datapump/expdp_<database>_<db_schema>.par
  • expdp <db_schema>/<password> directory=datapump_dir schemas=<schema> dumpfile=expdp_<database>_<schema>.dmp parallel=4 job_name=job_<database>_<schema>

Example Export Parameter File
  • Userid=<dp_schema>/<password>
  • Dumpfile=expdp_<database>_<schema>.dmp
  • Logfile=expdp_<database>_<schema>.log
  • Directory=datapump_dir
  • Schemas=<schema>
  • Job_name=job_expdp_<database>_<schema>
  • Status=240
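A parameter file like the one above can be generated from a template; the database, schema, and account names below are hypothetical examples.

```shell
#!/bin/sh
# Generate an export parfile; all names here are examples, not
# anything the presentation prescribes.
DB=mydb
SCHEMA=hr
DPUSER=dp_user
PARFILE="expdp_${DB}_${SCHEMA}.par"

cat > "$PARFILE" <<EOF
Userid=${DPUSER}/password
Dumpfile=expdp_${DB}_${SCHEMA}.dmp
Logfile=expdp_${DB}_${SCHEMA}.log
Directory=datapump_dir
Schemas=${SCHEMA}
Job_name=job_expdp_${DB}_${SCHEMA}
Status=240
EOF

# The job would then be started with:
#   expdp parfile=expdp_mydb_hr.par
```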

Datapump Import Setup
  • Make a server file system directory
  • Create a database directory
  • Grant read privileges on the directory
  • Grant privileges for full import
  • Create an import parameter file

Import Preliminary Setup
  • mkdir /backup/<database>/datapump
  • Set up your environment for ORACLE_HOME and ORACLE_SID, then: sqlplus / as sysdba
  • create directory datapump_dir as '/backup/<database>/datapump';
  • grant read, write on directory datapump_dir to <dp_schema>;
  • grant imp_full_database to <dp_schema>;

Start Datapump Import Job
  • impdp parfile=/backup/<database>/datapump/impdp_<database>_<schema>.par
  • impdp <dp_schema>/<password> directory=datapump_dir table_exists_action=truncate dumpfile=expdp_<database>_<schema>.dmp parallel=4 job_name=job_impdp_<database>_<schema>

Example Import Parfile
  • Userid=<dp_schema>/<password>
  • Schemas=<schema>
  • Exclude=grant
  • Directory=datapump_dir
  • Dumpfile=expdp_<database>_<schema>.dmp
  • Table_exists_action=replace

Some Basic Parameters
  • Directory=datapump_dir (specify the datapump directory that has been defined in the database)
  • Schemas=User1,User2,User3
  • Dumpfile=datapump_job_file%U.dmp
  • Tables=Table1,Table2
  • Estimate=Statistics (the default is Blocks) to estimate the size of the export
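The %U in the Dumpfile value is a substitution variable: Data Pump expands it to a two-digit sequence number, producing one file per parallel worker. A sketch of the file names a parallel=4 job would generate:

```shell
#!/bin/sh
# Illustrate the %U expansion for Dumpfile=datapump_job_file%U.dmp
# when parallel=4: two-digit numbers starting at 01.
FILES=$(for i in 01 02 03 04; do
  echo "datapump_job_file${i}.dmp"
done)
echo "$FILES"
```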

Important Features
  • EXCLUDE: you can exclude schemas
  • REMAP_SCHEMA: user1 to user2
  • REMAP_TABLESPACE: user1_data to user2_data
  • SQLFILE: script of SQL (DDL) statements
  • STATUS: list status every few seconds
  • JOB_NAME: run as an instance job

Exclude
  • EXCLUDE=USER excludes a specific user and all objects of that user
  • EXCLUDE=GRANT excludes grant definitions (privileges) but not the objects of the user
  • EXCLUDE=VIEW, PACKAGE, FUNCTION excludes specific types of objects
  • EXCLUDE=INDEX:"LIKE 'EMP%'" excludes indexes whose names start with EMP

Include
  • INCLUDE=PROCEDURE includes just the procedure objects
  • INCLUDE=TABLE:"IN ('MANAGERS', 'FACILITIES')"
  • INCLUDE=INDEX:"LIKE 'JOB%'"
  • Note: the INCLUDE and EXCLUDE parameters are mutually exclusive
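Filter values like the ones above contain quote characters that a shell would otherwise strip, so they are easiest to supply through a parfile. A sketch using the slide's example names:

```shell
#!/bin/sh
# Put INCLUDE filters in a parfile so no shell escaping is needed.
# The quoted heredoc delimiter ('EOF') keeps the inner quotes literal.
cat > include_filters.par <<'EOF'
INCLUDE=TABLE:"IN ('MANAGERS', 'FACILITIES')"
INCLUDE=INDEX:"LIKE 'JOB%'"
EOF

# On the command line the same filter would need escaping, e.g.:
#   expdp ... INCLUDE=INDEX:\"LIKE \'JOB%\'\"
```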

Network_Link
  • NETWORK_LINK=database_link
  • For an export, data is retrieved from the referenced database and written to a datapump file
  • For an import, data retrieved from the referenced database is imported directly into the current database
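In network mode no dump file is involved: impdp pulls rows straight over the database link. A sketch of such a parfile, assuming a pre-created link named src_link (the link, account, and schema names are hypothetical):

```shell
#!/bin/sh
# Sketch of a network-mode import parfile; src_link, dp_user,
# and hr are hypothetical names.
cat > impdp_network.par <<'EOF'
Userid=dp_user/password
Network_link=src_link
Schemas=hr
Directory=datapump_dir
Logfile=impdp_network_hr.log
EOF

# Note the absence of a Dumpfile line; the data never lands on disk:
#   impdp parfile=impdp_network.par
```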

Filters
  • QUERY=employees:"WHERE department_id > 10 AND salary > 10000"
  • QUERY=salary:"WHERE manager_id <> 13"
  • What can I filter with EXCLUDE and INCLUDE? See DATABASE_EXPORT_OBJECTS, SCHEMA_EXPORT_OBJECTS, and TABLE_EXPORT_OBJECTS
  • For example: select object_path, comments from schema_export_objects where object_path not like '%/%';

Table_Exists_Action
  • Skip
  • Append
  • Truncate
  • Replace

Import Parameters
  • Remap_schema=User1:User2
  • Remap_tablespace=User1_tblspace:User2_tblspace
  • Transform=OID:N does not reuse the original object identifiers (OIDs)
  • Transform=segment_attributes:N is useful when you do not want to keep the original storage definition

Interactively Work With The Job
  • Check on the status
  • Stop the job
  • Restart the job
  • Kill the job

Check Status of Datapump Job
  • select job_name, operation, job_mode, state from user_datapump_jobs;
  • expdp <dp_schema>/<password> attach=<job_name>
  • Status
  • Exit_Client exits the client but leaves the job running
  • Continue_Client resumes logging
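The status query above can be kept in a small SQL script for reuse; a minimal sketch:

```shell
#!/bin/sh
# Write the monitoring query to a reusable script; it would be run
# as the schema that owns the job:
#   sqlplus <dp_schema>/<password> @dp_status.sql
cat > dp_status.sql <<'EOF'
select job_name, operation, job_mode, state
  from user_datapump_jobs;
EOF
```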

Kill Datapump Job
  • select job_name, operation, job_mode, state from user_datapump_jobs;
  • expdp <dp_schema>/<password> attach=<job_name>
  • Kill_job

Interactive Commands
  • Add_File adds a dumpfile to the dumpfile set
  • Continue_Client restarts the job if idle
  • Exit_Client exits the interactive session
  • Filesize sets the file size of new files (Add_File)
  • Help lists the interactive session commands
  • Kill_Job deletes the attached job and exits

More Interactive Commands
  • Parallel specifies the maximum number of active workers
      • Set to no more than twice the number of CPUs
      • Worker processes are created as needed
  • Reuse_Dumpfiles overwrites the dump file if it exists
  • Stop_Job stops job execution and exits the client
  • Start_Job starts or resumes the current job

Restrictions
  • DataPump is an Oracle utility, therefore the dump file can only be imported by datapump
  • You can still get the error "snapshot too old"
  • If the job is started using "/ as sysdba", you need to know the Oracle database system password to check status, kill the job, etc.

Oracle 11g Features
  • Compression: besides none and metadata_only, the new options are all and data_only
  • Encryption: Oracle 10g exported already-encrypted columns; Oracle 11g can encrypt all of the metadata and/or data
  • Data_Options: XML_CLOBS exports XML columns in uncompressed CLOB format

More Oracle 11g Features
  • Partition (on import only): partition options are departition and merge (all partitions); merge combines partitions into one table
  • Transportable: permits exporting metadata for specific tables
  • Remap_Data: enables data to be modified to obscure sensitive information

Oracle 11g OEM
  • Data Movement tab, Move Row Data
      • Export to Export files (expdp)
      • Import from Export files (impdp)
      • Import from Database (NETWORK_LINK)
      • Monitor Export and Import Jobs

Problems Encountered
  • A previous run has created a datapump datafile with the same name as the current job
  • Space on the tablespaces
      • The job suspends
      • Make the file extensible or add another datafile
  • NFS mounted soft
      • Set event 10298 and bounce the database
      • Migrate to Oracle 11g

Job Will Not Die
  • The datapump processes are killed
      • Drop the master table with the same job name
      • Delete the datapump file
  • The datapump file has been deleted or moved
      • Cannot now attach to the job
      • Drop the master table with the same job name
      • Delete the datapump file
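The master table shares the job's name, so cleaning up an orphaned job amounts to finding the owner and job name and dropping that table. A sketch that writes the cleanup SQL (the owner and job name are hypothetical examples):

```shell
#!/bin/sh
# Sketch: generate SQL to locate and drop an orphaned Data Pump
# master table; OWNER and JOB below are illustrative values.
OWNER=DP_USER
JOB=JOB_EXPDP_MYDB_HR

cat > dp_cleanup.sql <<EOF
-- Orphaned jobs show up here in a NOT RUNNING state:
select owner_name, job_name, state from dba_datapump_jobs;
-- The master table is named after the job; dropping it removes the job:
drop table ${OWNER}.${JOB};
EOF
```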

Review
  • Use datapump as another tool for the DBA
  • Take the time to set it up properly
  • Learn the basic and rich features
  • Create scripts for backups and refreshes

Comments
  • Comments or questions
  • Thank you for coming
  • References:
      • Oracle Database Utilities, 10g Release 2 (10.2), Part #B14215-01
      • Oracle Database Utilities, 11g Release 1 (11.1), Part #B28319-01
  • Oracle is a registered trademark of Oracle Corp.

The End
  • Speaker: Gary M. Noble
  • Session name: Data Pump
  • Contact information for further questions: noblegm@ldschurch.org
      • Be sure to include "UTOUG Training Days" in the title
  • Thank You
