Sometimes we have to take a logical backup of our database using the EXPDP utility. Consider a database of 400 GB. We cannot write the backup of a 400 GB database to a single filesystem if no single filesystem has that much free space, so we have to spread it across multiple filesystems.
In such a situation we can use the multiple-directories feature of the EXPDP utility. With this feature we can create dump files in multiple directories and distribute the logical backup across them.
Example :-
Consider two directory objects named DATAPUMP and TESTING in the Oracle database :-
SQL > create or replace directory DATAPUMP as '/amit/datapump';
Directory created.
SQL > create or replace directory TESTING as '/amit/testing';
Directory created.
SQL > select * from dba_directories;
OWNER DIRECTORY_NAME DIRECTORY_PATH
------------------------- ----------------------------------- ----------------------------------------
SYS TESTING /amit/testing
SYS DATAPUMP /amit/datapump
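If the export is run as a regular database user such as amit (as in the command below), that user also needs read and write privileges on both directory objects. A minimal sketch, assuming the directories are owned by SYS as shown above :-
SQL > grant read, write on directory DATAPUMP to amit;
SQL > grant read, write on directory TESTING to amit;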
Now take the logical backup using multiple directories :-
$ expdp amit/amit dumpfile=datapump:expdp_datapump_%U.dmp,TESTING:expdp_testing_%U.dmp compression=all parallel=2 &
We can check that multiple dump files have been created, one in each directory :-
bash-3.2$ pwd
/amit/datapump
bash-3.2$ ls -ltr|tail
-rw-r--r-- 1 oracle oinstall 89785 Aug 30 19:45 import.log
-rw-r----- 1 oracle oinstall 7680000 Aug 31 02:34 expdp_datapump_01.dmp
bash-3.2$ cd -
/amit/testing
bash-3.2$ ls -ltr|tail
total 731952
-rw-r--r-- 1 oracle oinstall 185540 May 31 17:40 export.log
-rw-r----- 1 oracle oinstall 229363712 Aug 31 02:34 expdp_testing_01.dmp
The PARALLEL parameter must be used when writing to multiple directories; otherwise expdp will not fail, but it will write only to the first directory. The number of directories should be equal to the PARALLEL value; only then will all the directories be used for writing.
%U is used to create multiple files in the same directory. For example, expdp_datapump_%U.dmp will create expdp_datapump_01.dmp, expdp_datapump_02.dmp, and so on. If we want expdp to write to multiple %U files in parallel, we again have to use the PARALLEL parameter.
We can also set the FILESIZE parameter with EXPDP; it will not create any dump file larger than the allocated file size.
$ expdp amit/amit dumpfile=datapump:expdp_datapump_%U.dmp,TESTING:expdp_testing_%U.dmp compression=all parallel=2 filesize=2g &
The above example will create dump files like expdp_datapump_01.dmp, expdp_testing_01.dmp, expdp_datapump_02.dmp, expdp_testing_02.dmp, and so on. Each dump file will have a maximum size of 2 GB.
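Since the exports above are started in the background with &, one optional way to keep an eye on the running job (not shown in the original output) is to query the Data Pump job view :-
SQL > select owner_name, job_name, operation, state, degree from dba_datapump_jobs;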
I hope this article helped you.
Regards,
Amit Rath
Hi Amit,
Your blog is very helpful.
If I want to create 10 files in the same directory, then what should the PARALLEL setting be?
Thanks Vikas for writing in. PFB command :-
expdp system/system dumpfile=datapump:expdp_datapump_test_%U.dmp compression=all parallel=10 full=y &
-rw-r----- 1 oracle oinstall 917504 Nov 29 12:16 expdp_datapump_test_06.dmp
-rw-r----- 1 oracle oinstall 36704256 Nov 29 12:16 expdp_datapump_test_10.dmp
-rw-r----- 1 oracle oinstall 68395008 Nov 29 12:16 expdp_datapump_test_03.dmp
-rw-r----- 1 oracle oinstall 45752320 Nov 29 12:16 expdp_datapump_test_08.dmp
-rw-r----- 1 oracle oinstall 28200960 Nov 29 12:16 expdp_datapump_test_07.dmp
-rw-r----- 1 oracle oinstall 53862400 Nov 29 12:16 expdp_datapump_test_05.dmp
-rw-r----- 1 oracle oinstall 52895744 Nov 29 12:16 expdp_datapump_test_04.dmp
-rw-r----- 1 oracle oinstall 81645568 Nov 29 12:16 expdp_datapump_test_02.dmp
-rw-r----- 1 oracle oinstall 61628416 Nov 29 12:16 expdp_datapump_test_01.dmp
-rw-r----- 1 oracle oinstall 35647488 Nov 29 12:16 expdp_datapump_test_09.dmp
Amit Rath
Very clear and helpful Post :)
Hi Amit,
I found this article very helpful in solving my issues. Thank you.
Hello Amit,
Could you please explain, with the same level of detail, how to restore the database from multiple dump files? Thank you.
Hi Mach,
impdp directory=datapump dumpfile=expdp_datapump_%U.dmp logfile=impdp.log parallel=5
Hope I clarified your doubt.
Thanks
Amit Rath
Amit,
Thanks for the wonderful article. Please can you assist me with the below query :-
1) While exporting we are using the below :-
datapump:expdp_datapump_%U.dmp,TESTING:expdp_testing_%U.dmp
So while importing, I am of the opinion that we also need to mention both directories. Else how will the import know that dump files are also present in the other directory?
Thanks once again for the wonderful articles.
Hi,
Yes, while importing you have to mention both directories in the below format :-
dumpfile=datapump:expdp_datapump_%U.dmp,TESTING:expdp_testing_%U.dmp
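For example, a full command along those lines (reusing the amit credentials and PARALLEL=2 from the export example above purely for illustration) would be :-
$ impdp amit/amit directory=datapump dumpfile=datapump:expdp_datapump_%U.dmp,TESTING:expdp_testing_%U.dmp logfile=impdp.log parallel=2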
Thanks
Amit Rath
Hi Amit,
I received 4 dump files for a client database that were exported to the same file share directory. The DB is around 1 TB.
Now due to a space constraint I was able to place 3 dumps on one file share and the 4th on another file share in a different directory. Since my dump directory can only point to one file share location, how can I import this full dump without having to place all the dumps in the same location?
Thanks
Abhishek
Hi Amit,
Suppose the export was run with the PARALLEL=4 option to create dump files on a single file share in a single directory, but now due to space issues I have placed the dump files into two file shares. Can I actually import them back from two different directories, assuming my dump directory can only point to one location?
Can you please help me? Thanks.
Abhishek
Amit Rath,
Thank you for all such detailed explanations and sharing. Please can you assist me with the below query :-
While importing, are we not supposed to also use the TESTING directory? Else how will Data Pump come to know that the dumps have been split and scattered across both directories?
Thanks once again.
Hi Amit,
I took the export in multiple directories as per your article. I want to import this into another database which has only one Data Pump directory. How can we import the backups without creating another directory?
Thanks for your article.
Hi Amit,
ReplyDeleteYour article very helpful in solving my issues.Thank You.
Suppose i want to imp one table
Thanks for this article. Sharing an article I have written on Data Pump to multiple files as well, but focusing on a network bandwidth issue.
http://codingsnapshot.blogspot.my/2016/05/oracle-data-pump-export-import-multiple.html
Hi Amit,
I have one doubt here. My DB size is around 2 TB and I am given 2 different locations to keep the dump files.
But the two file systems are not of equal size; one has 500 GB and the other has 1.5 TB. Is there any way to perform the export of the 2 TB database? Can we limit the export to 500 GB of files in one file system?
Thanks ,
Vikash
Hi Amit,
Can you please tell me how to write the dumpfile parameter if I want to take the export backup in more than 100 dump files of 1 GB each, as we have a limit of 99 files with the %U.dmp name?