This chapter presents the Data Pump wrapper script dmk_dtpump.{ksh|ps1}.
All scripts have a detailed synopsis; please check the option "-h" or "--help".
The execution of dmk_dtpump.{ksh|ps1} requires a configuration file, which contains the following parameters.
In addition to the configuration file, the script accepts command-line arguments. Some of the configuration file parameters are also available on the command line. Parameters given on the command line take priority over their configuration file values, which allows different behavior using the same configuration file.
Example:
To launch a full Data Pump export, start the utility as follows (assuming the configuration file is stored at the default location):
dmk_dtpump.ksh -s ${ORACLE_SID} --export
To launch a full Data Pump import, start the utility as follows (assuming the configuration file is stored at the default location):
dmk_dtpump.ksh -s ${ORACLE_SID} --import
Once all parameters have been computed, dmk_dtpump generates a Data Pump parameter file. You can also specify your own Data Pump parameter file for customized export/import operations.
The Microsoft Windows Task Scheduler does not launch the PowerShell layer directly. However, PowerShell can be started from the DOS layer using the following syntax:
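For example, assuming the script is installed under C:\dmk (the installation path is an assumption; adapt it to your environment):

```
powershell.exe -noprofile -file C:\dmk\bin\dmk_dtpump.ps1 -s DB122 --export
```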
The PowerShell profile is similar to the Unix/Linux user profile (~/.profile, ~/.bash_profile). The "-noprofile" option bypasses the sourcing of the user profile.
Parameters
A hyphen character prefixes the command-line parameters as they are used within a shell.
-h or --help
Display the Synopsis.
-s or --sid
The target database on which the Data Pump operation should be launched. This parameter is always required.
dmk_dtpump.ksh -s ${ORACLE_SID}
-e or --export
Execute a Data Pump export operation “expdp”.
-i or --import
Execute a Data Pump import operation “impdp”.
-tns or --tnsname
Use a remote connection through a listener. This option uses the local naming parameter file, so the specified value must be configured in tnsnames.ora.
ℹ️ This option is mandatory when executing dmk_dtpump on a PDB.
For example, the syntax for a full export of the PDB named DB122PDB in the CDB named DB122 is the following:
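A minimal sketch of such an invocation (assuming DB122PDB is configured as an alias in tnsnames.ora):

```
dmk_dtpump.ksh -s DB122 --export -tns DB122PDB
```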
The dbi services best practice consists of saving the configuration file under the database admin directory, which makes the command-line parameter "ConfigFile" unnecessary.
Otherwise, the command-line parameter becomes mandatory:
dmk_dtpump.ksh -s ${ORACLE_SID} -c <path to the configuration file>
-u or --schemas
Data Pump schema export/import; comma-separated list.
-t or --tables
Data Pump table export/import; comma-separated list.
⚠️ Table export and schema export cannot be used simultaneously. If both -u and -t are used, the -t parameter is ignored and the full schemas are exported/imported.
⚠️ When using the SYSTEM user as UserId in the configuration file, make sure the schema name is added when naming the tables, e.g. -t scott.dept.
-f or --parfile
This option makes it possible to use a parfile with specific parameters, for example to exclude schemas.
These parfile parameters are added to the parfile generated automatically by the script and located in ${DMK_ORA_ADMIN_SID}/log, so that they are taken into account.
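As an illustration, a parfile passed with -f could look like the following sketch (the excluded schema names are arbitrary examples):

```
# exclude.par - hypothetical parfile excluding two schemas from the export
EXCLUDE=SCHEMA:"IN ('OUTLN','APPQOSSYS')"
```

It would then be passed on the command line with dmk_dtpump.ksh -s ${ORACLE_SID} --export -f <path to exclude.par>.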
ParFile
#*************************************************************
# ParFile
# Datapump parameter PARFILE
# If a parfile is specified then the DMK_DBBACKUP configfile parameters
# are appended to this parfile except for parameters already specified
# within the parfile
ParFile=""
⚠️ A fully qualified path without variables is expected if dmk_dtpump is not executed from the directory where the parfile is located.
UserId
#*************************************************************
# UserId
# Datapump parameter USERID
# This parameter is mandatory
# dbi services recommends using the SYSTEM account
UserId="system/manager"
DumpFile
#*************************************************************
# DumpFile
# Datapump parameter DUMPFILE
# Default is :
#   ${ORACLE_SID}_<tns_name>_dtpump_expdp|impdp_<YYYYMMDD>_<HHMISS>.dmp
# In case the Parallel parameter is configured in the file and greater
# than 1, the default is :
#   ${ORACLE_SID}_<tns_name>_dtpump_expdp|impdp_<YYYYMMDD>_<HHMISS>_%U.dmp
# In case of parallel greater than 1 the default dumpfile name will
# override this DumpFile parameter.
DumpFile=""
Parallel
#*************************************************************
# Commandline : --Parallel or -p
# Datapump parameter PARALLEL
# Default is : "1"
Parallel=""
Consistent
#*************************************************************
# Commandline : --consistent
# Datapump parameter FLASHBACK_TIME
# Default is : "1" - means consistent
Consistent=""
⚠️The FLASHBACK_TIME parameter is only available for Data Pump exports!
DumpDir
#*************************************************************
# dtpump_DumpDir
# Datapump parameter DIRECTORY
# Default is : process aborted if no DUMPDIR specified
DumpDir="DATA_PUMP_DIR"
⚠️ A fully qualified path without variables is expected if not using the default location.
PreScript
#*************************************************************
# Launch a pre Export script
# the logfile is written under LogPath
# Default is : ""
PreScript=""
This gives the user the possibility to execute a batch script (Windows) or a shell script (Linux) before executing the export/import.
⚠️ If no fully qualified path is given, but only the script name preceded by ./ on Linux, dmk_dtpump is expected to be run from the directory where the script is located. Otherwise, the user needs to give the fully qualified path without using variables.
ℹ️ DMK_DBBACKUP considers only the return code of the script to check whether the execution was successful or not.
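Since only the exit status is evaluated, a pre-script can follow the sketch below (the directory check is an assumed example of a useful pre-export sanity test; the DMK_DUMP_DIR variable is purely illustrative):

```shell
#!/bin/sh
# Illustrative pre-script: verify that the dump directory is writable
# before the export starts. dmk_dtpump only evaluates the exit status:
# 0 means success, any other value is treated as an error.

check_dump_dir() {
    dir="$1"
    if [ -d "$dir" ] && [ -w "$dir" ]; then
        echo "pre-script: dump directory $dir is writable"
        return 0
    else
        echo "pre-script: dump directory $dir is missing or not writable" >&2
        return 1
    fi
}

# The script's exit status is the status of the last command executed.
check_dump_dir "${DMK_DUMP_DIR:-/tmp}"
```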
PostScript
#*************************************************************
# Launch a post Export script
# the logfile is written under LogPath
# Default is : ""
PostScript=""
This gives the user the possibility to execute a batch script (Windows) or a shell script (Linux) after executing the export/import.
⚠️ If no fully qualified path is given, but only the script name preceded by ./ on Linux, dmk_dtpump is expected to be run from the directory where the script is located. Otherwise, the user needs to give the fully qualified path without using variables.
ℹ️ DMK_DBBACKUP considers only the return code of the script to check whether the execution was successful or not.
SmtpServer
#*************************************************************
# SmtpServer
# Infrastructure SMTP server
# Default is : "" - localhost
SmtpServer=""
Customer
#*************************************************************
# Name/ShortName of the customer
# Default is : "dbi-services"
Customer="dbi-services-Ltd"
-n or --nosuccessmail
#*************************************************************
# Commandline : --nosuccessmail -n
# "0" -means the sucessmail will be sent as well
# "1" -means that email will be only sent in case of problem or error
# Default is : "0"
NoSuccessMail="0
MailRecipients
#*************************************************************
# MailRecipients
# comma separated list of mail recipients
# Default is : "" - empty
MailRecipients=""
MailFrom
Each parameter defined within the configuration file is automatically parsed at runtime, and environment variables used as values are implicitly expanded.
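For instance, a value such as the following (an illustrative dump file name) is expanded when the script runs:

```
DumpFile="${ORACLE_SID}_full_expdp.dmp"
```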
#*************************************************************
# MailFrom
# Hardcode address From sender
# Default is "" - ${user}@${hostname}
MailFrom=""
Dumpfile name according to parallel processes
⚠️ In case of parallel use, the dumpfile name should contain the variable %U in order to have one dump file per process. The software then handles the following cases:
- If the parallel parameter from the configuration file is greater than 1, the default dump file name contains %U and overrides the DumpFile parameter from the configuration file.
- If parallel is provided in an external parfile but no dumpfile name is given, the software exits with an appropriate error message.
- If both parallel and a dumpfile name are provided in an external parfile, the software informs about the need to use %U.
- If only a dumpfile name is provided in an external parfile and the parallel parameter in the configuration file is set to 1, no additional warning is issued.
- If only a dumpfile name is provided in an external parfile and the parallel parameter in the configuration file is greater than 1, the software informs about the need to use %U.
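For example, when both values come from an external parfile, the dumpfile entry must include %U (the names below are arbitrary illustrations):

```
# hypothetical parfile fragment
PARALLEL=4
DUMPFILE=DB122_expdp_%U.dmp
```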
Examples
Export and import a single table into a new schema
Let's take an example where we export the DEPT table from the SCOTT schema and import it into the ALLEN schema.
To do so we use the REMAP_SCHEMA Data Pump option. We previously created the ALLEN user. If the user exists, the objects are added to the existing user's schema. If the ALLEN user does not exist, the import automatically creates it with an unusable password; in that case the export needs to be done with the SYSTEM user and you will need to reset the user's password after importing the data.
In our case the ALLEN user exists but has no tables.
SQL> select table_name from dba_tables where upper(owner) ='ALLEN';
no rows selected
Let's export the DEPT table from the SCOTT schema with the SYSTEM user.
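A possible command sequence for this example (a sketch; the parfile name is an assumption, and depending on your configuration the dump file name may also need to be provided):

```
# export the DEPT table from the SCOTT schema
dmk_dtpump.ksh -s ${ORACLE_SID} --export -t scott.dept

# remap.par - hypothetical parfile used for the import, containing:
#   REMAP_SCHEMA=SCOTT:ALLEN

# import the table into the ALLEN schema
dmk_dtpump.ksh -s ${ORACLE_SID} --import -t scott.dept -f <path to remap.par>
```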
Checking, we can see that the DEPT table has been imported into the ALLEN schema.
SQL> select table_name from dba_tables where upper(owner) ='ALLEN';
TABLE_NAME
----------------------------------------------------------------------------
DEPT
Exporting and reimporting a schema
The ALLEN user exists with the DEPT table. We will export and import the ALLEN schema using the SYSTEM user.
SQL> select username from dba_users where upper(username) = 'ALLEN';
USERNAME
----------------------------------------------------------------------------
ALLEN
SQL> select table_name from dba_tables where upper(owner) ='ALLEN';
TABLE_NAME
----------------------------------------------------------------------------
DEPT
Let's export the ALLEN schema using the default dtpump_export.cfg.
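The export could be launched as follows (a sketch, using the configuration file at its default location):

```
dmk_dtpump.ksh -s ${ORACLE_SID} --export -u ALLEN
```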
SQL> drop user allen cascade;
User dropped.
SQL> select username from dba_users where upper(username) = 'ALLEN';
no rows selected
SQL> select table_name from dba_tables where upper(owner) ='ALLEN';
no rows selected
SQL>
We will import the ALLEN schema again using the previously exported dump file.
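The import could be launched as follows (a sketch; the dump file name is assumed to be provided via the configuration file or a parfile):

```
dmk_dtpump.ksh -s ${ORACLE_SID} --import -u ALLEN
```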
Let’s check if ALLEN schema has been imported successfully.
SQL> select username from dba_users where upper(username) = 'ALLEN';
USERNAME
----------------------------------------------------------------------------
ALLEN
SQL> select table_name from dba_tables where upper(owner) ='ALLEN';
TABLE_NAME
----------------------------------------------------------------------------
DEPT
Handling dump file name for import
When using the import function we always need to give a dump file name. dmk_dtpump gives the possibility to either add the name of the dump file to be imported in the dtpump_import.cfg file, or to put it into a parfile that can be used to extend the possibilities. This might be a little annoying, as a file needs to be updated before each import. This part gives some examples of how the dump file name can be used on the command line without having to update any configuration file.
Let’s take an example and do an export of the scott schema.
Let's say that we would like to import only the EMP table from the SCOTT schema and remap it to the ALLEN schema. We will then have to use an existing parfile with the REMAP_SCHEMA option.
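Such a parfile could look like the following sketch (the file name is an arbitrary example):

```
# remap_emp.par - hypothetical parfile
TABLES=SCOTT.EMP
REMAP_SCHEMA=SCOTT:ALLEN
```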