dmk_dtpump.{ksh|ps1}

This chapter presents the Data Pump wrapper script dmk_dtpump.{ksh|ps1}.

All scripts provide a detailed synopsis; check the option “-h” or “--help”.

The execution of dmk_dtpump.{ksh|ps1} requires a configuration file, which contains the parameters described below.

In addition to the configuration file, the script accepts command-line arguments. Some of the configuration file parameters are also available on the command line. Values given on the command line take priority over their configuration file counterparts, which allows different behaviour using the same configuration file.
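This precedence rule can be sketched as follows; resolve_param is a hypothetical helper for illustration, not part of the actual dmk_dtpump code:

```shell
# Minimal sketch of the precedence rule described above: a value given on
# the command line overrides the corresponding configuration file value.
# resolve_param is a hypothetical illustration, not actual dmk_dtpump code.
resolve_param() {
    cli_value=$1
    cfg_value=$2
    if [ -n "$cli_value" ]; then
        echo "$cli_value"      # command-line value wins
    else
        echo "$cfg_value"      # fall back to the configuration file value
    fi
}

resolve_param ""  "1"    # no command-line override -> "1" from the config file
resolve_param "4" "1"    # e.g. -p 4 on the command line wins over Parallel="1"
```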

Example:

  • To launch a full Data Pump export, start the utility as follows (assuming the configuration file is stored at the default location):

dmk_dtpump.ksh -s ${ORACLE_SID} --export
  • To launch a full Data Pump import, start the utility as follows (assuming the configuration file is stored at the default location):

dmk_dtpump.ksh -s ${ORACLE_SID} --import

Once all parameters have been computed dmk_dtpump generates a Data Pump parameter file. You can also specify your own Data Pump parameter file for customized export/import operations.

  • The Microsoft Windows Task Scheduler does not launch the PowerShell layer directly.

However, starting PowerShell from the DOS layer can be achieved using the following syntax:

powershell.exe -noprofile \
-command "dmk_dtpump.ps1 -s ${ORACLE_SID} --export"

The PowerShell profile is similar to the Unix/Linux user profile (~/.profile, ~/.bash_profile). The “-noprofile” option bypasses the sourcing of the user profile.

Parameters

A hyphen character prefixes the command-line parameters, as they are used within a shell.

-h or --help

Display the Synopsis.

-s or --sid

The target database on which the Data Pump operation should be launched. This parameter is always required.

dmk_dtpump.ksh -s ${ORACLE_SID}

-e or --export

Execute a Data Pump export operation “expdp”.

-i or --import

Execute a Data Pump import operation “impdp”.

-tns or --tnsname

Use a remote connection through a listener. This option uses the local naming parameter file, so the specified value must be configured in tnsnames.ora.

ℹ️ This option is mandatory when executing dmk_dtpump against a PDB.

For example, the syntax for a full export of the PDB named DB122PDB within the CDB named DB122 is the following:

Administrator@WIN-KLICHMDL0LN:C:\ [db122 (CDB$ROOT)] db122
********* dbi services Ltd. *********
STATUS : OPEN
DB_UNIQUE_NAME : db122
OPEN_MODE : READ WRITE
LOG_MODE : ARCHIVELOG
DATABASE_ROLE : PRIMARY
FLASHBACK_ON : NO
FORCE_LOGGING : NO
VERSION : 12.2.0.1.0
CDB Enabled : YES
List PDB(s) READ ONLY : PDB$SEED
List PDB(s) READ WRITE : DB122PDB
*************************************
Administrator@WIN-KLICHMDL0LN:C:\ [db122 (CDB$ROOT)] cd $TNS_ADMIN
Administrator@WIN-KLICHMDL0LN:C:\oracle\network\admin\ [db122 (CDB$ROOT)] cat .\tnsnames.ora
db122pdb, db122pdb.it.dbi-services.com =
(DESCRIPTION =
 (ADDRESS = (PROTOCOL = TCP )( HOST = WIN-KLICHMDL0LN )( PORT = 1521 ))
 (CONNECT_DATA =
 (SERVICE_NAME = db122pdb )
 )
)
Administrator@WIN-KLICHMDL0LN:C:\ [db122 (CDB$ROOT)] dmk_dtpump.ps1 -s db122 -e -tns db122pdb
2020-07-31_16-48-10::dmk_dtpump_sid.pl::MainProgram ::INFO ==> AdminDir: [C:\oracle/admin/db122/]
2020-07-31_16-48-12::dmk_dtpump_sid.pl::MainProgram ::INFO ==> GlobalParams PARFILE:[C:\oracle/admin/db122//log/db122_db122pdb_expdp_20200731_164812.par]
2020-07-31_16-48-14::dmk_dtpump_sid.pl::MainProgram ::INFO ==> FULL=Y
[OK]::dbi-services-Ltd::EXPDP::dmk_dbbackup::db122::data_pump_retCode::0
Logfile is : C:\oracle\admin\db122\log\db122_db122pdb_expdp_20200731_164812.log
Administrator@WIN-KLICHMDL0LN:C:\ [rdbms122]

--checkparfile

Generates the Data Pump parameter file.

-c or --configfile

The dbi services best practice consists of saving the configuration file under the database admin directory, which makes the command-line parameter “ConfigFile” unnecessary:

cp ${DMK_DBBACKUP}/templates/dtpump_{export|import}.cfg \
 ${DMK_ORA_ADMIN_SID}/etc/dtpump_{export|import}.cfg

Otherwise the command-line parameter becomes mandatory:

dmk_dtpump.ksh -s ${ORACLE_SID} -c <path to the configuration file>

-u or --schemas

Data Pump schema export/import; comma-separated list.

-t or --tables

Data Pump table export/import; comma-separated list.

⚠️ Table export and schema export cannot be used simultaneously. If both -u and -t are used, the -t parameter is ignored and the full schemas are exported/imported.

⚠️ When using the system user as UserId in the configuration file, make sure the schema name is added when naming the tables, e.g. scott.dept.

-f or --parfile

This option makes it possible to use a parfile with specific parameters, for example to exclude schemas:

EXCLUDE=SCHEMA:"IN ('OUTLN','SYSTEM','SYSMAN','FLOWS_FILES','APEX_030200','APEX_PUBLIC_USER','ANONYMOUS','SCOTT','SYS')"

These parfile parameters are appended to the parfile generated automatically by the script (located in ${DMK_ORA_ADMIN_SID}/log) so that they are taken into account.

ParFile

#*************************************************************
# ParFile
# Datapump parameter PARFILE
# If a parfile is specified then the DMK_DBBACKUP configfile parameters
# are appended to this parfile except for parameters already specified
# within the parfile
ParFile=""

⚠️ A fully qualified path without variables is expected if dmk_dtpump is not executed from the directory where the parfile is located.
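The merge rule quoted in the ParFile comment above can be sketched as follows; this is a simplified illustration, not the actual DMK implementation:

```shell
# Simplified sketch of the parfile merge rule: a KEY=VALUE default is only
# appended when KEY is absent from the user parfile; otherwise the user's
# value wins. Illustration only, not the actual dmk_dtpump code.
merge_param() {
    parfile=$1
    line=$2
    key=${line%%=*}
    if grep -qi "^${key}=" "$parfile"; then
        return 0               # already set in the user parfile: keep it
    fi
    echo "$line" >> "$parfile" # otherwise append the configfile default
}

parfile=$(mktemp)
echo "DUMPFILE=my_custom_name.dmp" > "$parfile"

merge_param "$parfile" "DUMPFILE=default_name.dmp"   # skipped: already set
merge_param "$parfile" "DIRECTORY=DATA_PUMP_DIR"     # appended

cat "$parfile"
```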

UserId

#*************************************************************
# UserId
# Datapump parameter USERID
# This parameter is mandatory
# dbi services recommends using the SYSTEM account
UserId="system/manager"

DumpFile

#*************************************************************
# DumpFile
# Datapump parameter DUMPFILE
# Default is :
# ${ORACLE_SID}_<tns_name>_dtpump_expdp|impdp_<YYYYMMDD>_<HHMISS>.dmp
# In case the Parallel parameter is configured in the file and greater than 1,
# default is :
# ${ORACLE_SID}_<tns_name>_dtpump_expdp|impdp_<YYYYMMDD>_<HHMISS>_%U.dmp
# In case of parallel greater than 1, the default dumpfile name will override
# this DumpFile parameter.
DumpFile=""
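The default naming scheme described in the comments can be reconstructed as a short sketch (illustrative only; the real script may differ in details such as how <tns_name> is handled when no remote connection is used):

```shell
# Illustrative reconstruction of the default dump file naming scheme
# described above; not the actual DMK code.
default_dumpfile() {
    sid=$1       # ${ORACLE_SID}
    tns=$2       # <tns_name>
    op=$3        # expdp or impdp
    parallel=$4  # Parallel parameter value
    stamp=$(date +%Y%m%d_%H%M%S)
    if [ "$parallel" -gt 1 ]; then
        # one dump file per worker via the %U substitution variable
        echo "${sid}_${tns}_dtpump_${op}_${stamp}_%U.dmp"
    else
        echo "${sid}_${tns}_dtpump_${op}_${stamp}.dmp"
    fi
}

default_dumpfile DB1 db122pdb expdp 1
default_dumpfile DB1 db122pdb expdp 4
```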

Parallel

#*************************************************************
# Commandline : --Parallel or -p
# Datapump parameter PARALLEL
# Default is : "1"
Parallel=""

Consistent

#*************************************************************
# Commandline : --consistent
# Datapump parameter FLASHBACK_TIME
# Default is : "1" - means consistent
Consistent=""

⚠️The FLASHBACK_TIME parameter is only available for Data Pump exports!

DumpDir

#*************************************************************
# dtpump_DumpDir
# Datapump parameter DIRECTORY
# Default is : process aborted if no DUMPDIR specified
DumpDir="DATA_PUMP_DIR"

LogPath

#*************************************************************
# LogPath
# Log destination
# Default is : "" - $DMK_ORA_ADMIN_SID/log
LogPath=""

⚠️ A fully qualified path without variables is expected if not using the default location.

PreScript

#*************************************************************
# Launch a pre Export script
# the logfile is written under LogPath
# Default is : ""
PreScript=""

This gives the user the possibility to execute a batch script (Windows) or a shell script (Linux) before executing the export/import.

⚠️ If no fully qualified path is given, but only the script name preceded by ./ on Linux, dmk_dtpump is expected to be run from the directory where the script is located. Otherwise, the user needs to give the fully qualified path without using variables.

ℹ️ DMK_DBBACKUP considers only the return code of the script to check whether the execution was successful or not.
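As an illustration, a hypothetical Linux pre-script could verify free space in the dump destination and signal failure through its return code; the directory, threshold, and function name are examples, not DMK defaults:

```shell
#!/bin/sh
# Hypothetical pre-export check: fail when the dump destination has less
# free space than required. Only the return code matters to DMK_DBBACKUP.
check_free_space() {
    dir=$1
    required_kb=$2
    # POSIX df -P: column 4 of the second line is the available space in KB
    free_kb=$(df -Pk "$dir" | awk 'NR==2 {print $4}')
    if [ "$free_kb" -lt "$required_kb" ]; then
        echo "pre-script: only ${free_kb} KB free in ${dir}" >&2
        return 1
    fi
    echo "pre-script: ${free_kb} KB free in ${dir}, proceeding"
    return 0
}

# First argument: dump directory, second argument: required free space in KB
check_free_space "${1:-/tmp}" "${2:-1024}"
```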

PostScript

#*************************************************************
# Launch a post Export script
# the logfile is written under LogPath
# Default is : ""
PostScript=""

This gives the user the possibility to execute a batch script (Windows) or a shell script (Linux) after executing the export/import.

⚠️ If no fully qualified path is given, but only the script name preceded by ./ on Linux, dmk_dtpump is expected to be run from the directory where the script is located. Otherwise, the user needs to give the fully qualified path without using variables.

ℹ️ DMK_DBBACKUP considers only the return code of the script to check whether the execution was successful or not.

SmtpServer

#*************************************************************
# SmtpServer
# Infrastructure SMTP server
# Default is : "" - localhost
SmtpServer=""

Customer

#*************************************************************
# Name/ShortName of the customer
# Default is : "dbi-services"
Customer="dbi-services-Ltd"

-n or --nosuccessmail

#*************************************************************
# Commandline : --nosuccessmail -n
# "0" - means the successmail will be sent as well
# "1" - means that email will only be sent in case of problem or error
# Default is : "0"
NoSuccessMail="0"

MailRecipients

#*************************************************************
# MailRecipients
# comma separated list of mail recipients
# Default is : "" - empty
MailRecipients=""

MailFrom

ℹ️ Each parameter defined within the configuration file is automatically parsed at runtime, and environment variables used as values are implicitly expanded.

#*************************************************************
# MailFrom
# Hardcoded From sender address
# Default is "" - ${user}@${hostname}
MailFrom=""

Dump file name according to parallel processes

⚠️ When parallelism is used, the dump file name should contain the substitution variable %U in order to get one dump file per process. The software handles the following cases:

  1. If the Parallel parameter from the configuration file is greater than 1, the default dump file name will contain %U and will override the DumpFile parameter from the configuration file.

  2. If parallel is provided in an external parfile but no dump file name is, the software will exit with an appropriate error message.

  3. If both parallel and a dump file name are provided in an external parfile, the software will warn about the need to use %U.

  4. If only a dump file name is provided in an external parfile and the Parallel parameter in the configuration file is set to 1, no additional warning is raised.

  5. If only a dump file name is provided in an external parfile and the Parallel parameter in the configuration file is greater than 1, the software will warn about the need to use %U.
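To satisfy the %U requirement when parallelism comes from an external parfile, a hypothetical parfile could look like this (the file name is illustrative):

```
# Hypothetical external parfile combining parallelism with a %U dump
# file template, so that each worker writes its own dump file
PARALLEL=4
DUMPFILE=DB1_dtpump_expdp_%U.dmp
```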

Examples

Export and import a single table into a new schema

Let’s take an example where we export the DEPT table from the SCOTT schema and import it into the ALLEN schema.

To do so we will use the REMAP_SCHEMA Data Pump option. We previously created the ALLEN user. If the user already exists, the objects are added to the existing user’s schema. If the ALLEN user does not exist, the import automatically creates it with an unusable password; in that case the export needs to be done with the SYSTEM user and you will need to reset the user’s password after importing the data.

In our case the ALLEN user exists but has no tables.

SQL> select table_name from dba_tables where upper(owner) ='ALLEN';
no rows selected

Let’s export table DEPT from SCOTT schema with user SYSTEM.

oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] grep UserId
dtpump_export.cfg
# UserId
UserId="system/manager"
oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] dmk_dtpump.ksh -s
DB1 -e -t scott.dept
2019-12-20_15-55-57::dmk_dtpump_sid.pl::MainProgram ::INFO ==> AdminDir:
[/u01/app/oracle/admin/DB1]
2019-12-20_15-55-57::dmk_dtpump_sid.pl::MainProgram ::INFO ==>
GlobalParams
PARFILE:[/u01/app/oracle/admin/DB1/log/DB1_expdp_20191220_155557.par]
2019-12-20_15-55-58::dmk_dtpump_sid.pl::MainProgram ::INFO ==>
TABLES=[scott.dept]
[OK]::dbi-services-Ltd::EXPDP::dmk_dbbackup::DB1::data_pump_retCode::0
Logfile is : /u01/app/oracle/admin/DB1/log/DB1_expdp_20191220_155557.log

We will now prepare a parfile to import the DEPT table into the ALLEN schema.

oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] cat
import_scott_dept.par
REMAP_SCHEMA=scott:allen
oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] grep ^ParFile
dtpump_import.cfg
ParFile="import_scott_dept.par"
oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] grep ^DumpFile
dtpump_import.cfg
DumpFile="DB1_dtpump_expdp_20191220_155557.dmp"

We will import the data into new ALLEN schema.

oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] dmk_dtpump.ksh -s
DB1 -i -t scott.dept
2019-12-20_16-45-06::dmk_dtpump_sid.pl::MainProgram ::INFO ==> AdminDir:
[/u01/app/oracle/admin/DB1]
2019-12-20_16-45-06::dmk_dtpump_sid.pl::MainProgram ::INFO ==>
GlobalParams
PARFILE:[/u01/app/oracle/admin/DB1/log/DB1_impdp_20191220_164506.par]
[OK]::dbi-services-Ltd::IMPDP::dmk_dbbackup::DB1::data_pump_retCode::0
Logfile is : /u01/app/oracle/admin/DB1/log/DB1_impdp_20191220_164506.log

Checking, we can see that the DEPT table has been imported into the ALLEN schema.

SQL> select table_name from dba_tables where upper(owner) ='ALLEN';
TABLE_NAME
----------------------------------------------------------------------------
DEPT

Exporting and reimporting a schema

The ALLEN user exists with the DEPT table. We will export and import the ALLEN schema using the SYSTEM user.

SQL> select username from dba_users where upper(username) = 'ALLEN';
USERNAME
----------------------------------------------------------------------------
ALLEN
SQL> select table_name from dba_tables where upper(owner) ='ALLEN';
TABLE_NAME
----------------------------------------------------------------------------
DEPT

Let’s export ALLEN schema using default dtpump_export.cfg.

oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] grep ^UserId
dtpump_export.cfg
UserId="system/manager"
oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] dmk_dtpump.ksh -s
DB1 -e -u allen
2019-12-20_16-50-53::dmk_dtpump_sid.pl::MainProgram ::INFO ==> AdminDir:
[/u01/app/oracle/admin/DB1]
2019-12-20_16-50-53::dmk_dtpump_sid.pl::MainProgram ::INFO ==>
GlobalParams
PARFILE:[/u01/app/oracle/admin/DB1/log/DB1_expdp_20191220_165053.par]
2019-12-20_16-50-54::dmk_dtpump_sid.pl::MainProgram ::INFO ==>
SCHEMAS=[allen]
[OK]::dbi-services-Ltd::EXPDP::dmk_dbbackup::DB1::data_pump_retCode::0
Logfile is : /u01/app/oracle/admin/DB1/log/DB1_expdp_20191220_165053.log

Let’s drop ALLEN user.

SQL> drop user allen cascade;

User dropped.

SQL> select username from dba_users where upper(username) = 'ALLEN';

no rows selected

SQL> select table_name from dba_tables where upper(owner) ='ALLEN';

no rows selected

SQL>

We will import the ALLEN schema again using the previously exported dump file.

oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] grep ^UserId
dtpump_import.cfg
UserId="system/manager"
oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] grep ^DumpFile
dtpump_import.cfg
DumpFile="DB1_dtpump_expdp_20191220_165053.dmp"
oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] dmk_dtpump.ksh -s
DB1 -i -u allen
2019-12-20_16-56-47::dmk_dtpump_sid.pl::MainProgram ::INFO ==> AdminDir:
[/u01/app/oracle/admin/DB1]
2019-12-20_16-56-47::dmk_dtpump_sid.pl::MainProgram ::INFO ==>
GlobalParams
PARFILE:[/u01/app/oracle/admin/DB1/log/DB1_impdp_20191220_165647.par]
[OK]::dbi-services-Ltd::IMPDP::dmk_dbbackup::DB1::data_pump_retCode::0
Logfile is : /u01/app/oracle/admin/DB1/log/DB1_impdp_20191220_165647.log

Let’s check if ALLEN schema has been imported successfully.

SQL> select username from dba_users where upper(username) = 'ALLEN';
USERNAME
----------------------------------------------------------------------------
ALLEN
SQL> select table_name from dba_tables where upper(owner) ='ALLEN';
TABLE_NAME
----------------------------------------------------------------------------
DEPT

Handling dump file name for import

When using the import function, we always need to provide a dump file name. dmk_dtpump gives the possibility to put the name of the dump file to be imported either in the dtpump_import.cfg file or in a parfile used to extend the possibilities. This can be cumbersome, as a file then needs to be updated before each import. This part gives some examples of how to pass the dump file name on the command line without having to update any configuration file.

Let’s take an example and do an export of the scott schema.

oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] dmk_dtpump.ksh -s
DB1 -e -u scott
2021-02-19_15-15-18::dmk_dtpump_sid.pl::MainProgram ::INFO ==> AdminDir:
[/u01/app/oracle/admin/DB1]
2021-02-19_15-15-18::dmk_dtpump_sid.pl::MainProgram ::INFO ==>
GlobalParams
PARFILE:[/u01/app/oracle/admin/DB1/log/DB1_expdp_20210219_151518.par]
2021-02-19_15-15-19::dmk_dtpump_sid.pl::MainProgram ::INFO ==>
SCHEMAS=[scott]
[OK]::dbi-services-Ltd::EXPDP::dmk_dbbackup::DB1::data_pump_retCode::0
Logfile is : /u01/app/oracle/admin/DB1/log/DB1_expdp_20210219_151518.log

The dump file automatically generated during the export is DB1_dtpump_expdp_20210219_151518.dmp:

oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] grep -i DUMPFILE
/u01/app/oracle/admin/DB1/log/DB1_expdp_20210219_151518.log
DUMPFILE=DB1_dtpump_expdp_20210219_151518.dmp

Let’s say that we would like to import only the EMP table from the SCOTT schema and remap it to the ALLEN schema. We will then have to use an existing parfile with the REMAP_SCHEMA option.

oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] cat
import_scott_dept.par
REMAP_SCHEMA=scott:allen

In the same import command line we can provide the dump file name as follows:

oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] echo
"DUMPFILE=DB1_dtpump_expdp_20210219_151518.dmp" >>
import_scott_dept.par;dmk_dtpump.ksh -s DB1 -i -t SCOTT.EMP -f
import_scott_dept.par
2021-02-19_15-39-01::dmk_dtpump_sid.pl::MainProgram ::INFO ==> AdminDir:
[/u01/app/oracle/admin/DB1]
2021-02-19_15-39-01::dmk_dtpump_sid.pl::MainProgram ::INFO ==>
GlobalParams
PARFILE:[/u01/app/oracle/admin/DB1/log/DB1_impdp_20210219_153901.par]
[OK]::dbi-services-Ltd::IMPDP::dmk_dbbackup::DB1::data_pump_retCode::0
Logfile is : /u01/app/oracle/admin/DB1/log/DB1_impdp_20210219_153901.log

Let’s imagine I need to reuse the same parfile several times, and that the parfile already contains a DUMPFILE parameter.

Current DUMPFILE value is DB1_dtpump_expdp_20210219_151518.dmp.

I want to use new dump file DB1_dtpump_expdp_20210219_154121.dmp.

I will be able to run the following command :

oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] grep -i DUMPFILE
import_scott_dept.par
DUMPFILE=DB1_dtpump_expdp_20210219_151518.dmp
oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] sed -i
's/^DUMPFILE=.*$/DUMPFILE=DB1_dtpump_expdp_20210219_154121.dmp/g'
import_scott_dept.par;dmk_dtpump.ksh -s DB1 -i -t SCOTT.EMP -f
import_scott_dept.par
2021-02-19_16-03-34::dmk_dtpump_sid.pl::MainProgram ::INFO ==> AdminDir:
[/u01/app/oracle/admin/DB1]
2021-02-19_16-03-34::dmk_dtpump_sid.pl::MainProgram ::INFO ==>
GlobalParams
PARFILE:[/u01/app/oracle/admin/DB1/log/DB1_impdp_20210219_160334.par]
[OK]::dbi-services-Ltd::IMPDP::dmk_dbbackup::DB1::data_pump_retCode::0
Logfile is : /u01/app/oracle/admin/DB1/log/DB1_impdp_20210219_160334.log
oracle@vmdmkdbboradg1:/u01/app/oracle/admin/DB1/etc/ [DB1] grep -i DUMPFILE
import_scott_dept.par
DUMPFILE=DB1_dtpump_expdp_20210219_154121.dmp

Finally, imagine that I do not want to maintain any parfile; I can generate one on the fly:

oracle@vmdmkdbboradg1:/tmp/ [DB1] mydate=$(date +"%Y%m%d_%H%M%S");echo
"DUMPFILE=DB1_dtpump_expdp_20210219_154121.dmp" >
imp_${mydate}.par;dmk_dtpump.ksh -s DB1 -i -u SCOTT -f imp_${mydate}.par
2021-02-19_16-14-48::dmk_dtpump_sid.pl::MainProgram ::INFO ==> AdminDir:
[/u01/app/oracle/admin/DB1]
2021-02-19_16-14-48::dmk_dtpump_sid.pl::MainProgram ::INFO ==>
GlobalParams
PARFILE:[/u01/app/oracle/admin/DB1/log/DB1_impdp_20210219_161448.par]
[OK]::dbi-services-Ltd::IMPDP::dmk_dbbackup::DB1::data_pump_retCode::0
Logfile is : /u01/app/oracle/admin/DB1/log/DB1_impdp_20210219_161448.log
