Datapump export from standby/read only databases?



Occasionally I am reminded of the fact that the old exp tool still works against a standby or a database open read only, where the new and much improved datapump does not - "datapump is not looking so clever now", they say....

Then I say

ha - it can work, you just need to make use of a 'surrogate' database! - is that a new term I just invented there?

Anyway, the issue is that datapump has to write something to the database it's extracting from - the master table, which it uses to build and track its work - obviously in a read only or standby database this isn't possible (I'm ignoring logical standbys before you get clever, and snapshot standbys for that matter).
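As a quick aside - on a normal read-write database you can actually see this master table being used; it shows up as just another table owned by the exporting user, and the job itself appears in dba_datapump_jobs. Something like this (purely illustrative, run as a suitably privileged user) shows why a read only database is a problem:

SQL> -- how is the database open? read only/standby is where datapump struggles
SQL> select open_mode, database_role from v$database;

SQL> -- any running datapump job (and therefore its master table) shows up here
SQL> select owner_name, job_name, operation, state from dba_datapump_jobs;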

So how do we do this then?

Well, the answer lies in the surrogate comment I made earlier - we use a separate database to store the master table in, and fetch all of the data through this surrogate using a network_link.

So anyway - let's restate the issue - when you run datapump normally against a read only database you see this message:

expdp user/pass full=y

Export: Release 11.2.0.3.0 - Production on Tue Sep 16 19:07:14 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
ORA-31626: job does not exist
ORA-31633: unable to create master table "PUMPY.SYS_EXPORT_FULL_05"
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT", line 1020
ORA-16000: database open for read-only access


So to get round it you just need another database with a database link to the read only system. This can be on the same (or a different) server, but the version must be within the normal version rules for expdp. I would always suggest using the exact same version as the db you want to extract from.
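Before going any further it's worth a quick sanity check (just a suggestion) that the surrogate really is on a compatible version - run this on the surrogate and on the source and compare:

SQL> select banner from v$version;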

So let's switch to this surrogate database and run these steps to set things up:

SQL> create public database link pumpy connect to user identified by pass using 'tns alias of actual source we want to extract from.world';

Database link created.


This assumes of course that this user already exists in your read only database.....
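It's also worth a quick test that the link works and really is pointing at the read only database before kicking anything off - assuming the user at the far end can see v$database, something like:

SQL> select * from dual@pumpy;

SQL> select open_mode, database_role from v$database@pumpy;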

Now the db link is there, we just run datapump in the normal way - the only extra parameter is the network_link one:

 expdp / network_link=pumpy dumpfile=surrogate.dmp full=y

This then carries on as normal (as if you were exporting from the actual source) - but the master table is created in the surrogate and the data/metadata is all pulled over the db link:

Export: Release 11.2.0.3.0 - Production on Tue Sep 16 19:20:57 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
Starting "OSAORACLE"."SYS_EXPORT_FULL_02":  /******** network_link=pumpy dumpfile=surrogate.dmp full=y
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 1.894 GB
Processing object type DATABASE_EXPORT/TABLESPACE
Processing object type DATABASE_EXPORT/PROFILE
Processing object type DATABASE_EXPORT/SYS_USER/USER
Processing object type DATABASE_EXPORT/SCHEMA/USER
Processing object type DATABASE_EXPORT/ROLE
Processing object type DATABASE_EXPORT/GRANT/SYSTEM_GRANT/PROC_SYSTEM_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/GRANT/SYSTEM_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/ROLE_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/DEFAULT_ROLE

 --- some content cut out

. . exported "SYSTEM"."REPCAT$_USER_PARM_VALUES"             0 KB       0 rows
Master table "OSAORACLE"."SYS_EXPORT_FULL_02" successfully loaded/unloaded
******************************************************************************
Dump file set for OSAORACLE.SYS_EXPORT_FULL_02 is:
  /oracle/11.2.0.3.1.DB/rdbms/log/surrogate.dmp
Job "OSAORACLE"."SYS_EXPORT_FULL_02" successfully completed at 19:16:33


And there you have it...

Enjoy!






