I've got a customer who runs export scripts for his most important tables using Oracle Data Pump export. This is a pretty good solution for backing his important tables up to dump files, whether for fast recovery of specific tables (it is in no way a proper database backup) or for transferring them to his development environments. The problem is that the tables we export are very large, and the export process can take a very long time to complete (at least a few hours).

Sometimes we find ourselves needing to peek into the export process and figure out what it is doing: is it stuck, which table is it exporting right now, and how much time is left until the process completes. Since our scripts generally run in the background (from a timed cron job), there isn't really any output we can watch except the log files, and the output that goes into the log file is usually not enough to answer all the questions the customer is asking.
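One generic way to peek into a running export is Oracle's V$SESSION_LONGOPS view, where Data Pump publishes its overall progress. The query below is only a sketch, under the assumption that the job reports long operations with its job name in OPNAME (for Data Pump exports, SOFAR and TOTALWORK are typically measured in megabytes); adjust the OPNAME filter to match your own job names:

SELECT opname, sofar, totalwork, units,
       ROUND(sofar / totalwork * 100, 1) AS pct_done,
       time_remaining
FROM   v$session_longops
WHERE  opname LIKE 'SYS_EXPORT%'
AND    totalwork > 0
AND    sofar <> totalwork;

TIME_REMAINING is the server's own estimate in seconds, which at least approximates an answer to the "how much time is left" question.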
In another scenario, we're running an export (a full export, for example) that takes a very long time, and then our connection to the server is closed for some reason. It can be a server policy, or we may have closed it by mistake, but my most common reason is simply losing the network connection between my client and the server.

In this post we will look at the solution the Data Pump processes give us, and see how the fact that we're using Data Pump jobs actually makes these issues very simple to solve.

When we run the expdp command (or impdp for that matter, but let's talk about expdp for now) we actually use a command-line wrapper which tells the database to create a scheduler job – an internal job that runs PL/SQL commands inside the database server to perform the actual export. When our backup starts, it prints the name of the job it created at the beginning of the process:

~]$ expdp full=y directory=DATA_PUMP_DIR dumpfile=zohar.dmp logfile=zohar.log

Export: Release 11.2.0.4.0 - Production on Thu Feb 19 11:12:37 2015

Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

Starting "ZOHAR"."SYS_EXPORT_FULL_01": zohar/******** full=y directory=DATA_PUMP_DIR dumpfile=zohar.dmp logfile=zohar.log

We can find the same output in our log file (which in this case is "zohar.log"); here the job is called SYS_EXPORT_FULL_01.

When we need to get to the management interface while the client is still attached, we can press Ctrl+C at expdp run time. This will send the client to the background and give us the management interface prompt.

If we've lost our client connection, we can re-connect (or re-attach) to the management interface for that job. First of all, let's find the job name. If our window is no longer showing the output, we can use this query from the SQL*Plus prompt to extract it:

SQL> SELECT owner_name, job_name, operation, job_mode, state
     FROM dba_datapump_jobs;

OWNER_NAME  JOB_NAME            OPERATION  JOB_MODE  STATE
ZOHAR       SYS_EXPORT_FULL_01  EXPORT     FULL      EXECUTING
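While we're at the SQL*Plus prompt, a related trick: we can also see which database sessions belong to a Data Pump job by joining DBA_DATAPUMP_SESSIONS to V$SESSION. This is a sketch assuming we have access to both views; it maps each attached client and worker session back to its job name:

SELECT s.sid, s.serial#, s.username, d.job_name
FROM   dba_datapump_sessions d,
       v$session s
WHERE  s.saddr = d.saddr;

This is handy when we want to watch (or trace) the worker processes doing the actual export.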
Now that we have the name of the job, we can attach our client to it. To do this we run the expdp command with the attach parameter. This connects us back to the management interface and runs the "status" command for us:

~]$ expdp zohar attach=SYS_EXPORT_FULL_01

Export: Release 11.2.0.4.0 - Production on Thu Feb 19 11:19:20 2015
...
Start Time: Thursday, 19 February, 2015 11:17:03
...
CLIENT_COMMAND        zohar/******** full=y directory=DATA_PUMP_DIR dumpfile=zohar.dmp logfile=zohar.log
...
Dump File: /u01/app/oracle/product/11.2.0/dbhome_11204/rdbms/log/zohar.dmp
...
Object Type: DATABASE_EXPORT/SCHEMA/PACKAGE_BODIES/PACKAGE/PACKAGE_BODY

At this point we can issue several commands: change the original command to use parallel processes, add dump files, investigate the current state, stop the export process, or even reconnect the running client interface. Among the commands valid in interactive mode (as the HELP command lists them):

EXIT_CLIENT      Quit client session and leave job running.
FILESIZE         Default filesize (bytes) for subsequent ADD_FILE commands.
PARALLEL         Change the number of active workers for current job.
REUSE_DUMPFILES  Overwrite destination dump file if it exists.
STATUS           Frequency (secs) job status is to be monitored, where the default will show new status when available.
STOP_JOB         Orderly shutdown of job execution and exits the client.

If we want to stop the export process, we need to distinguish between two types of stop commands.
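As a sketch of what the stop commands look like at the interactive prompt (assuming the two variants in question are the restartable STOP_JOB, optionally with =IMMEDIATE, and the destructive KILL_JOB, which is the usual Data Pump pairing):

Export> stop_job=immediate

The job shuts down, but its master table stays in place, so we can later re-attach and restart it:

~]$ expdp zohar attach=SYS_EXPORT_FULL_01
Export> start_job
Export> continue_client

KILL_JOB, on the other hand, detaches all clients and deletes the job, master table included (and, for exports, typically the dump files as well), so there is nothing left to restart.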