dmk_dtpump.{ksh|ps1}

This chapter presents the Data Pump wrapper script dmk_dtpump.{ksh|ps1}.

All scripts provide a detailed synopsis; please check the option "-h" or "--help".

The execution of dmk_dtpump.{ksh|ps1} requires a configuration file, which contains the following parameters.

In addition to the configuration file, the script requires command-line arguments. Some of the configuration file parameters are also available on the command line. Parameters given on the command line take priority over their configured values, which allows different behavior using the same configuration file.

Example:

  • To launch a full Data Pump export, start the utility as follows (assuming the configuration file is stored at the default location):

dmk_dtpump.ksh -s ${ORACLE_SID} --export
  • To launch a full Data Pump import, start the utility as follows (assuming the configuration file is stored at the default location):

dmk_dtpump.ksh -s ${ORACLE_SID} --import

Once all parameters have been computed dmk_dtpump generates a Data Pump parameter file. You can also specify your own Data Pump parameter file for customized export/import operations.
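The generated file is a plain Data Pump parfile. For a full export it might look like the following sketch (the directory object name is an assumption; the dump file name follows the pattern shown in the examples later in this chapter):

```
# Illustrative generated parfile for a full export;
# the DIRECTORY object name is an assumption.
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=DB1_dtpump_expdp_20210219_151518.dmp
LOGFILE=DB1_dtpump_expdp_20210219_151518.log
FULL=Y
```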

  • The Microsoft Windows Task Scheduler does not launch the PowerShell layer directly.

However, starting PowerShell from the cmd.exe layer can be achieved using the following syntax:

powershell.exe -noprofile ^
-command "dmk_dtpump.ps1 -s ${ORACLE_SID} --export"
The PowerShell profile is similar to the Unix/Linux user profile (~/.profile, ~/.bash_profile). The "-noprofile" option bypasses the sourcing of the user profile.

Parameters

A hyphen character prefixes the command-line parameters, as they are used within a shell.

-h or --help

Display the Synopsis.

-s or --sid

The target database on which the Data Pump operation should be launched. This parameter is always required.

dmk_dtpump.ksh -s ${ORACLE_SID}

-e or --export

Execute a Data Pump export operation ("expdp").

-i or --import

Execute a Data Pump import operation ("impdp").

-tns or --tnsname

Use a remote connection through a listener. This option relies on the local naming parameter file, so the specified value must be configured in tnsnames.ora.

ℹ️ This option is mandatory when executing dmk_dtpump on a PDB.

For example, for a full export of the PDB named DB122PDB in the CDB named DB122, the syntax would be the following:
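As a sketch, a full export of the PDB DB122PDB (CDB DB122) could be launched as follows — the TNS alias passed to -tns is an assumption and must exist in tnsnames.ora:

```
dmk_dtpump.ksh -s DB122 -tns DB122PDB --export
```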

--checkparfile

Generates the Data Pump parameter file.

-c or --configfile

The dbi services best practice consists of saving the configuration file under the database admin directory, which makes the command-line parameter "ConfigFile" unnecessary.

Otherwise, the command-line parameter becomes mandatory.

-u or --schemas

Data Pump schema export/import; comma-separated list.

-t or --tables

Data Pump table export/import; comma-separated list.

⚠️ Table export and schema export cannot be used simultaneously. If both -u and -t are used, the -t parameter is ignored and the full schemas are exported/imported.

⚠️ When using the system user as UserId in the configuration file, make sure the schema name is added when naming tables, for example -t scott.dept.
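For instance, a single-table export with a schema-qualified table name could look like this (sketch, assuming the configuration file at its default location):

```
dmk_dtpump.ksh -s ${ORACLE_SID} -t scott.dept --export
```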

-f or --parfile

This option makes it possible to use a parfile with specific parameters, for example to exclude schemas:

EXCLUDE=SCHEMA:"IN ('OUTLN','SYSTEM','SYSMAN','FLOWS_FILES','APEX_030200','APEX_PUBLIC_USER','ANONYMOUS','SCOTT','SYS')"

These parfile parameters are appended to the parfile generated automatically by the script, located in ${DMK_ORA_ADMIN_SID}/log, so that they are taken into account.
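A sketch of such an invocation (the parfile path is illustrative; use a fully qualified path when not running from the parfile's directory):

```
dmk_dtpump.ksh -s ${ORACLE_SID} --export -f /u01/app/oracle/admin/DB1/exclude.par
```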

ParFile

⚠️ A fully qualified path without variables is expected if dmk_dtpump is not executed from the directory where the parfile is located.

UserId

DumpFile

Parallel

Consistent

⚠️The FLASHBACK_TIME parameter is only available for Data Pump exports!

DumpDir

LogPath

⚠️ A fully qualified path without variables is expected if not using the default location.

PreScript

This gives the user the possibility to execute a batch script (Windows) or a shell script (Linux) before the export/import.

⚠️ If no fully qualified path is given, but only the script name preceded by ./ on Linux, dmk_dtpump is expected to be run from the directory where the script is located. Otherwise, the user needs to give the fully qualified path without using variables.

ℹ️ DMK_DBBACKUP considers only the return code of the script to check whether the execution was successful or not.
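Since only the return code is evaluated, a PreScript can be as simple as a free-space check. The following is a sketch (the directory and the threshold are assumptions, not part of the product):

```shell
#!/bin/sh
# Hypothetical PreScript body: dmk_dtpump evaluates only the return
# code (0 = continue with the export/import, non-zero = abort).
# check_space <dir> <required_kb> succeeds when <dir> has at least
# <required_kb> kilobytes of free space.
check_space() {
  free_kb=$(df -Pk "$1" | awk 'NR==2 {print $4}')
  [ "$free_kb" -ge "$2" ]
}

# A real PreScript would end with something like:
#   check_space /u01/app/oracle/admin/DB1/dpdump 1048576 || exit 1
# (the path and the 1 GB threshold are illustrative values)
```

A non-zero exit status from such a script makes dmk_dtpump stop before launching the Data Pump operation.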

PostScript

This gives the user the possibility to execute a batch script (Windows) or a shell script (Linux) after the export/import.

⚠️ If no fully qualified path is given, but only the script name preceded by ./ on Linux, dmk_dtpump is expected to be run from the directory where the script is located. Otherwise, the user needs to give the fully qualified path without using variables.

ℹ️ DMK_DBBACKUP considers only the return code of the script to check whether the execution was successful or not.

SmtpServer

Customer

-n or --nosuccessmail

MailRecipients

MailFrom

ℹ️ A hyphen character prefixes the command-line parameters, as they are used within a shell.

Each parameter defined within the configuration file is automatically parsed at runtime, and environment variables used as values are implicitly expanded. Syntax:
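As a sketch, a configuration entry could reference an environment variable that is expanded at runtime (the value shown is an assumption; the exact key/value syntax may differ):

```
DumpFile = ${ORACLE_SID}_dtpump_expdp.dmp
```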

Dump file name with parallel processes

⚠️ When parallelism is used, the dump file name should contain the %U variable so that one dump file per process is created. The software then handles the following cases:

  1. If the parallel parameter from the configuration file is greater than 1, the default dump file name contains %U and overrides the DumpFile parameter from the configuration file.

  2. If parallel is provided in an external parfile but no dump file name is, the software exits with an appropriate error message.

  3. If both parallel and a dump file name are provided in an external parfile, the software informs about the need to use %U.

  4. If only a dump file name is provided in an external parfile and the parallel parameter in the configuration file is set to 1, no additional warning is issued.

  5. If only a dump file name is provided in an external parfile and the parallel parameter in the configuration file is greater than 1, the software informs about the need to use %U.
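As a sketch, a parallel export defined in an external parfile would combine PARALLEL with a %U dump file name (the file name pattern is an assumption; %U is the standard Data Pump substitution variable that generates a two-digit number per file):

```
# One dump file per worker process, e.g. ..._01.dmp, ..._02.dmp
DUMPFILE=DB1_dtpump_expdp_%U.dmp
PARALLEL=4
```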

Examples

Export and import a single table into a new schema

Let's take an example where we export the DEPT table from the SCOTT schema and import it into the ALLEN schema.

To do so we use the REMAP_SCHEMA Data Pump option. We created the ALLEN user beforehand. If the user exists, the objects are added to the existing user schema. If the ALLEN user does not exist, the import automatically creates it with an unusable password; in that case the export needs to be done with the system user, and the user password must be reset after importing the data.

In our case ALLEN user exists but has no tables.

Let's export the DEPT table from the SCOTT schema with the SYSTEM user.

We will now prepare a parfile to import the table into the ALLEN schema.
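Such a parfile could contain just the standard Data Pump remapping directive (sketch):

```
# Remap imported objects from SCOTT to ALLEN
REMAP_SCHEMA=SCOTT:ALLEN
```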

We will import the data into new ALLEN schema.

Checking, we can see that the DEPT table has been imported into the ALLEN schema.

Exporting and reimporting a schema

The ALLEN user exists with the DEPT table. We will export and import the ALLEN schema using the SYSTEM user.

Let's export the ALLEN schema using the default dtpump_export.cfg.
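A sketch of the corresponding command (assumes the default configuration file location):

```
dmk_dtpump.ksh -s ${ORACLE_SID} -u ALLEN --export
```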

Let’s drop ALLEN user.

We will import the ALLEN schema again using the previously exported dump file.

Let’s check if ALLEN schema has been imported successfully.

Handling dump file name for import

When using the import function, we always need to give a dump file name. dmk_dtpump gives the possibility to add the name of the dump file either to the dtpump_import.cfg file or to a parfile, which can be used to extend the possibilities. This can be a little annoying, as a file needs to be updated before each import. This part gives some examples of how the dump file name can be used on the command line without having to update any configuration file.

Let’s take an example and do an export of the scott schema.

The dump file automatically generated during the export is the following one: DB1_dtpump_expdp_20210219_151518.dmp

Let's say we would like to import only the EMP table from the SCOTT schema and remap it to the ALLEN schema. We then have to use an existing parfile with the REMAP_SCHEMA option.

In the same import command line, we can provide the dump file name as follows:

Let's imagine I need to reuse the same parfile several times, and that the parfile already has a DUMPFILE parameter.

Current DUMPFILE value is DB1_dtpump_expdp_20210219_151518.dmp.

I want to use new dump file DB1_dtpump_expdp_20210219_154121.dmp.

I will be able to run the following command:

Now imagine that I do not need any parfile; the script will automatically generate one:
