
Transportable Tablespaces

You can move large amounts of data between databases simply by moving data files
from one database to another: you copy the data files from the source database to the
target database, then import the data dictionary information about the tablespaces from
the source database into the target database.
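
In outline, the whole transport is a metadata-only export on the source, a file copy, and a metadata-only import on the target; a minimal sketch using Data Pump (the tablespace test01, user vallep and file names mirror the worked example below):

    expdp vallep/vallep directory=data_pump_dir dumpfile=test.dmp transport_tablespaces=test01
    (copy test01.dbf and test.dmp to the target server)
    impdp system/system directory=data_pump_dir dumpfile=test.dmp transport_datafiles='test01.dbf'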

You use transportable tablespaces mainly in the context of a data warehouse. Some of the
important uses are:

- Moving data from the source database (OLTP) into a data warehouse database
- Moving data from a staging database into a data warehouse database
- Moving data from a data warehouse to a data mart
- Performing tablespace point-in-time recovery
- Archiving historical data

Transporting a tablespace

There are five steps to transport a tablespace:

1. Grant the necessary privileges
2. Make sure the tablespace is transportable
3. Generate the transportable tablespace set (data dictionary information)
4. Copy the data files to the target server
5. Perform the tablespace import

Privilege required to check a tablespace
    grant EXECUTE_CATALOG_ROLE to vallep;

Make sure a tablespace is transportable
    execute dbms_tts.transport_set_check('test01,test02', true);
    select * from transport_set_violations;
    Note: if there are any errors then check with Oracle to see how to get around them

Make the tablespaces read only
    alter tablespace test01 read only;
    alter tablespace test02 read only;
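
Before generating the export it is worth confirming that the tablespaces really are read only; a quick check against the data dictionary (assuming the tablespace names test01 and test02 from above):

    select tablespace_name, status
    from dba_tablespaces
    where tablespace_name in ('TEST01', 'TEST02');
    Note: both rows should show a STATUS of READ ONLY before you run the export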

Generate the transportable tablespace set
    expdp vallep/vallep directory=data_pump_dir dumpfile=test.dmp transport_tablespaces=test01,test02 include=triggers,constraints,grants
    Note: the tablespaces must be in read-only mode and only metadata (data dictionary information) will be contained in the data pump export

Copy the data to the target server
    Now copy all the data files and the data pump export to the target server

Tablespace import
    impdp system/system dumpfile=test.dmp transport_datafiles='test01.dbf','test02.dbf' directory=data_pump_dir
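
After the import the transported tablespaces remain read only on the target, so if you want to write to them you must switch them back yourself; a minimal follow-up, using the same tablespace names as above:

    alter tablespace test01 read write;
    alter tablespace test02 read write;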

Transporting tablespaces on different platforms

You can transport tablespaces between different server architectures; the only
requirement is that both platforms have the same endian format. Endian format refers to
the byte ordering of the file system and can be one of two types, big or little. If the
formats differ, you must convert the data files to the format you require.
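
If you are unsure which byte ordering each platform uses, Oracle publishes the list in the v$transportable_platform view; a simple way to see every supported platform and its endian format (the exact rows returned vary by Oracle version):

    select platform_id, platform_name, endian_format
    from v$transportable_platform
    order by platform_name;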

To check what format you have:

    select t.endian_format
    from v$transportable_platform t, v$database d
    where t.platform_name = d.platform_name;

The steps to transport a tablespace to a platform with a different endian format are:

1. Ensure that the tablespaces are self-contained
2. Make the tablespaces read-only
3. Export the metadata using Data Pump Export
4. Convert the data files to match the endian format
5. Copy the converted data files to the target system
6. Use the Data Pump Import utility to import the metadata

Source Server

Privilege required to check a tablespace
    grant EXECUTE_CATALOG_ROLE to vallep;

Make sure a tablespace is transportable
    execute dbms_tts.transport_set_check('test01,test02', true);
    select * from transport_set_violations;
    Note: if there are any errors then check with Oracle to see how to get around them

Make the tablespaces read only
    alter tablespace test01 read only;
    alter tablespace test02 read only;

Generate the transportable tablespace set
    expdp vallep/vallep directory=data_pump_dir dumpfile=test.dmp transport_tablespaces=test01,test02 include=triggers,constraints,grants
    Note: remember we are only exporting the metadata
Convert the tablespace on the source server (this can also be done on the target server)

Convert the data files (using CONVERT)
    rman> convert tablespace test01 to platform 'HP-UX (64-bit)' format '...'
    Note: Oracle will tell you the new file name of the converted file

Convert the data files (using DB_FILE_NAME_CONVERT)
    rman> convert tablespace test01 to platform 'HP-UX (64-bit)' db_file_name_convert = 'c:\oracle\test01.dbf','c:\convert\test01.dbf'
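
If you copy the unconverted files across first, the conversion can instead be run on the target using RMAN's CONVERT DATAFILE ... FROM PLATFORM form; a sketch only, with hypothetical paths and platform name:

    rman> convert datafile '/tmp/test01.dbf' from platform 'Microsoft Windows IA (32-bit)' format '/u01/oradata/test01.dbf'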
Copy the data to the target server
    Now copy all the data files and the data pump export to the target server, if not already copied
Target Server

Tablespace import
    impdp system/system dumpfile=test.dmp transport_datafiles='test01.dbf','test02.dbf' directory=data_pump_dir
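
Once the import finishes you can confirm that the tablespaces were plugged in correctly; a minimal check on the target, assuming the same tablespace names (PLUGGED_IN = YES marks a transported tablespace):

    select tablespace_name, status, plugged_in
    from dba_tablespaces
    where tablespace_name in ('TEST01', 'TEST02');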
