
Question: How do you process a huge file (a billion records) into an Oracle table?

Solution1:

We can import data from CSV files into the database through Oracle SQL Developer.

In the Data Import wizard, select the CSV import format, specify the location of the source data, and
click Next. On the Data Import > Destination tab, specify the Oracle connection, select the database
and the table to import the data into, and then click Next.

Solution2:

By using Oracle Database utilities (such as SQL*Loader or External Tables).
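
For illustration, a minimal external-table sketch is shown below. The directory path, the directory
object LANDING_DIR, the file name, and the column and table definitions are all hypothetical; the
idea is to expose the flat file as a queryable table, then move the rows into the target with a
direct-path, parallel insert.

    -- Expose the flat file as an external table (all names are examples).
    CREATE OR REPLACE DIRECTORY landing_dir AS '/data/landing';  -- requires CREATE ANY DIRECTORY

    CREATE TABLE customers_ext (
      id    NUMBER,
      name  VARCHAR2(100),
      email VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY landing_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '"'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('huge_file.csv')
    )
    REJECT LIMIT UNLIMITED
    PARALLEL;

    -- Direct-path, parallel insert into the target table.
    ALTER SESSION ENABLE PARALLEL DML;
    INSERT /*+ APPEND PARALLEL(customers, 8) */ INTO customers
    SELECT * FROM customers_ext;
    COMMIT;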

Solution3:

1. Split the huge file into 10 smaller files using a Python script and keep them in one
location (the landing zone path); a sketch of such a script follows this list.
2. Create 10 temporary tables and 10 mappings, generate scenarios for all 10 mappings,
and run them from an ODI package in asynchronous mode. In this way all 10
temporary tables are loaded in parallel.
3. Create another mapping to load the data from the temporary tables into the main table.
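
As a rough sketch of step 1, assuming the CSV has a single header row (the file name, landing
zone path, and chunk count are all illustrative), a Python script could round-robin the data
rows across 10 output files while repeating the header in each:

    import os

    SRC = "huge_file.csv"      # source file (hypothetical name)
    DST_DIR = "landing_zone"   # landing zone path (hypothetical)
    CHUNKS = 10

    os.makedirs(DST_DIR, exist_ok=True)

    with open(SRC, "r", encoding="utf-8") as src:
        header = src.readline()
        outs = [open(os.path.join(DST_DIR, f"part_{i}.csv"), "w", encoding="utf-8")
                for i in range(CHUNKS)]
        for out in outs:
            out.write(header)             # repeat the header in every chunk
        for i, line in enumerate(src):
            outs[i % CHUNKS].write(line)  # distribute data rows round-robin
        for out in outs:
            out.close()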

Solution4:

There is a Knowledge Module called IKM File to External Table, and we can also use IKM File to
SQL*Loader. We can use the below options at the KM level:

1. Hints.
2. Detection strategies.
3. Create temporary indexes on tables.
4. Gather stats on the table.
5. Analyze the table.
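
For example, the gather-stats option roughly corresponds to a DBMS_STATS call run after the
load; a minimal sketch, where the schema, table name, and parallel degree are all hypothetical:

    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname => 'STAGING',     -- schema (example)
        tabname => 'CUSTOMERS',   -- freshly loaded table (example)
        degree  => 8);            -- parallel degree (example)
    END;
    /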

Solution5:

Other than this, we can partition the target table and use partition exchange: load a staging table
that matches the structure of one partition, then swap it into the target.
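
A minimal sketch of the exchange step, assuming a partitioned target table ORDERS with a partition
P_2024_01 and an already loaded staging table STG_ORDERS of identical structure (all names are
hypothetical):

    ALTER TABLE orders
      EXCHANGE PARTITION p_2024_01
      WITH TABLE stg_orders
      INCLUDING INDEXES
      WITHOUT VALIDATION;

The swap is a data-dictionary operation, so it completes almost instantly regardless of row count.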
