
8/9/13 riazsoft: Oracle Data Pump

riazsoft.blogspot.in/2009/05/oracle-data-pump.html 1/3
Sunday, May 3, 2009
Oracle Data Pump
Oracle Data Pump is an efficient export/import utility, available from Oracle Database 10g, with a look and feel similar to the original Export and Import utilities.
It enables very high-speed movement of data and metadata from one database to another.
It is much more efficient, with greater control and management of export and import jobs.
With Data Pump, all jobs run primarily on the server using server processes (unlike the original Export and Import, which ran on the client).
Oracle 10g's new Data Pump utility is designed as the eventual replacement for the original Oracle Export and Import utilities.
Directory Object:
The server processes access files for Data Pump jobs using directory objects that identify the location of the files. Directory objects enforce a security model that DBAs can use to control access to these files.
Directory objects are needed to ensure data security and integrity; otherwise, users would be able to read data they should not have access to and perform unwarranted operations on the server.
What follows is an overview of Data Pump's suite of tools for extracting, transforming, and loading data within an Oracle database.
Step 1 (creating the directory):
Connect to SQL*Plus as the SYS user (as SYSDBA), ensure the target user account is unlocked, and give the user privileges to create and access the directory:
GRANT CREATE ANY DIRECTORY TO theuser;
CREATE OR REPLACE DIRECTORY dump_dir AS '/var/backup';
GRANT READ, WRITE ON DIRECTORY dump_dir TO theuser;
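To confirm the directory object exists and points at the intended operating-system path, a quick sanity check is to query DBA_DIRECTORIES (the object name `DUMP_DIR` is the example created above):

```sql
-- List the directory object created above and verify its OS path.
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DUMP_DIR';
```

If the row is missing, the CREATE DIRECTORY statement did not run as a sufficiently privileged user.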
Step 2:
i. Full Export/Import -
A full export is specified using the FULL parameter. In a full database export, the entire database is exported. This mode requires the EXP_FULL_DATABASE role.
Export command:
expdp system/password full=Y directory=dump_dir dumpfile=expdpdumpfile.dmp logfile=expdplogfile.log
Import command:
impdp system/password full=Y directory=dump_dir dumpfile=expdpdumpfile.dmp logfile=impdplogfile.log
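Instead of putting the password and every option on the command line (where it lands in the shell history), Data Pump also accepts a parameter file via the PARFILE option. A sketch of the full-export example above as a parameter file (the file name `full_exp.par` is just an illustration):

```
full=Y
directory=dump_dir
dumpfile=expdpdumpfile.dmp
logfile=expdplogfile.log
```

It is then invoked as `expdp system parfile=full_exp.par`; with no password on the command line, expdp prompts for it interactively.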
ii. Schema Export/Import
It is invoked using the SCHEMAS parameter. Without
the EXP_FULL_DATABASE role, only your own schema can be exported; with the role, several schemas can be exported at a time (and system privilege grants can be included as well).
expdp theuser/thepassword schemas=theuser directory=dump_dir dumpfile=theuser.dmp logfile=expdp_theuser.log
impdp theuser/thepassword schemas=theuser directory=dump_dir dumpfile=theuser.dmp logfile=impdp_theuser.log
To append imported data to an existing table, use the parameter:
TABLE_EXISTS_ACTION=APPEND
To limit the export/import to specific objects, the INCLUDE and EXCLUDE parameters can be used. With INCLUDE, only the specified objects are included in the export; with EXCLUDE, everything except the specified objects is exported.
expdp theuser/thepassword schemas=theuser include=TABLE:"IN ('STUDENT', 'DEPT')" directory=dump_dir dumpfile=theuser.dmp logfile=expdp_theuser.log
expdp theuser/thepassword schemas=theuser exclude=TABLE:"= 'COURSE'" directory=dump_dir dumpfile=theuser.dmp logfile=expdp_theuser.log
The QUERY parameter restricts the export to rows matching a WHERE clause, for example:
QUERY=CUSTOMERS:"WHERE TOTAL_SPENT > 10"
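Shell quoting around INCLUDE, EXCLUDE, and QUERY values is a common source of errors, since the quotes and parentheses must be escaped on the command line; putting the filters in a parameter file avoids the escaping entirely. A sketch combining the filters above (table and column names are the examples from this post; note that INCLUDE and EXCLUDE cannot be used in the same job):

```
schemas=theuser
directory=dump_dir
dumpfile=theuser.dmp
logfile=expdp_theuser.log
include=TABLE:"IN ('STUDENT', 'DEPT')"
query=CUSTOMERS:"WHERE TOTAL_SPENT > 10"
```

Run it with `expdp theuser/thepassword parfile=filtered_exp.par` (the parameter-file name is illustrative).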
Performance can be improved using the PARALLEL parameter. To allow multiple dump files to be created or read, use the "%U" wildcard in the DUMPFILE parameter:
expdp theuser/thepassword schemas=theuser directory=dump_dir parallel=4 dumpfile=theuser_%U.dmp logfile=expdp_theuser.log
Remap tablespace
If the tablespace names in the importing database/schema differ from the source database's tablespaces for the source schema, the tablespaces require remapping; use the REMAP_TABLESPACE parameter:
remap_tablespace=SRCDATA_SPC:data02,src_users:data02,SRCINDX_SPC_INDX:index01
Remap schema
During import (impdp), if the target user name differs from the schema name in the dump file, the schema needs to be remapped; use the REMAP_SCHEMA parameter on the impdp command:
remap_schema=src_schema:dest_schema
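Both remaps can be applied in a single import. A sketch using the example names from this post (schema and tablespace names are illustrative):

```
impdp system/password directory=dump_dir dumpfile=theuser.dmp logfile=impdp_remap.log remap_schema=src_schema:dest_schema remap_tablespace=SRCDATA_SPC:data02
```

The dump file itself is not modified; the remapping is applied only as the objects are created in the target database.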
(The versions of the export file and the importing database should match. The VERSION parameter can be used to change the version of the export dump file; for example, if the server is 11g and the exported file will be imported into a 10.1 database, use VERSION=10.1 on the export command on the 11g server.)
Notes:
All data pump actions are performed by multiple jobs/server
processes ( not DBMS_JOB jobs) controlled by a master control
process that uses Advanced Queuing. At runtime an advanced
queue table, named after the job name, is created and used by
the master control process.
The table is dropped on completion of the data pump job. The job
and the advanced queue can be named using the JOB_NAME
parameter.
Cancelling the client process does not stop the associated Data Pump job. Issuing Ctrl+C on the client during a job stops the client output and presents a command prompt; typing "status" at this prompt lets you monitor the current job.
The DBA_DATAPUMP_JOBS view can be used to monitor the
current jobs.
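For example, a quick look at the running jobs might use (column names as documented for this view):

```sql
-- Show Data Pump jobs, what they are doing, and their current state.
SELECT job_name, operation, state
FROM   dba_datapump_jobs;
```

A detached client can then reattach to a job by name, e.g. `expdp theuser/thepassword attach=SYS_EXPORT_SCHEMA_01` (the job name shown is the typical system-generated default, used here only as an illustration).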
Posted by riazsoft at 9:33 pm
Labels: data pump, expdp, impdp