A datapump bug or a feature, and an obscure workaround



 Another forum question prompted an interesting investigation for me today - again related to datapump (my favourite topic of late).

The question asked: if I do a datapump export with the tablespaces option (i.e. extract all objects located in certain tablespaces), can I then import just a certain schema's tables from that file? They had tried (and failed) to get this to work. I initially thought they must have made a mistake, but after a little investigation it turns out you can't actually do what you'd think you should be able to.

So is this requirement possible?

Let's quickly recap what was done in the test case and then see if we can solve this.

First up we create some tablespaces, users and tables:

SYS@DB>create tablespace xx;

Tablespace created.

SYS@DB>create tablespace xy;

Tablespace created.


SYS@DB>create user xx identified by xx default tablespace xx;

User created.

SYS@DB>create user xy identified by xy default tablespace xy;

User created.


SYS@DB>grant resource to xx;

Grant succeeded.

SYS@DB>grant resource to xy;

Grant succeeded.


SYS@DB>create table xx.tab1(col1 number);

Table created.

SYS@DB>create table xx.tab2(col1 number);

Table created.

SYS@DB>create table xy.tab1(col1 number);

Table created.

SYS@DB>create table xy.tab2(col1 number);

Table created.


Now we do a tablespace export (note this is not an export of the actual tablespace definitions, but of any objects located in the named tablespaces):

expdp / tablespaces=xx,xy

Export: Release 11.2.0.3.0 - Production on Fri Jul 11 16:24:13 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
Starting "OPS$ORACLE"."SYS_EXPORT_TABLESPACE_01":  /******** tablespaces=xx,xy
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "XX"."TAB1"                                     0 KB       0 rows
. . exported "XX"."TAB2"                                     0 KB       0 rows
. . exported "XY"."TAB1"                                     0 KB       0 rows
. . exported "XY"."TAB2"                                     0 KB       0 rows
Master table "OPS$ORACLE"."SYS_EXPORT_TABLESPACE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for OPS$ORACLE.SYS_EXPORT_TABLESPACE_01 is:
  /oracle/11.2.0.3.0.DB/rdbms/log/expdat.dmp
Job "OPS$ORACLE"."SYS_EXPORT_TABLESPACE_01" successfully completed at 16:24:19







Now if we try to import this, saying we just want to load schema 'XX', we get this:

impdp / schemas=xx

Import: Release 11.2.0.3.0 - Production on Fri Jul 11 16:29:26 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
ORA-31655: no data or metadata objects selected for job
ORA-39039: Schema expression " IN ('XX')" contains no valid schemas.
Master table "OPS$ORACLE"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
Starting "OPS$ORACLE"."SYS_IMPORT_SCHEMA_01":  /******** schemas=xx
Job "OPS$ORACLE"."SYS_IMPORT_SCHEMA_01" successfully completed at 16:29:27


And it doesn't work - the schemas filter matches nothing in the dump file.

We can work around the problem for a small, known list of tables like this (ignore the ORA-39151 error - the tables already exist, which itself proves the import would work if we wanted it to):

impdp / tables=xx.tab1

Import: Release 11.2.0.3.0 - Production on Fri Jul 11 16:24:48 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
Master table "OPS$ORACLE"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "OPS$ORACLE"."SYS_IMPORT_TABLE_01":  /******** tables=xx.tab1
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39151: Table "XX"."TAB1" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Job "OPS$ORACLE"."SYS_IMPORT_TABLE_01" completed with 1 error(s) at 16:24:50



One way to fix the problem is to run the import as the schema in question, rather than trying to specify schemas - see below (note I dropped the tables before the import so they loaded without error):

impdp xx/xx job_name=RICH include=TABLE directory=data_pump_dir

Import: Release 11.2.0.3.0 - Production on Fri Jul 11 19:01:24 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
ORA-39154: Objects from foreign schemas have been removed from import
Master table "XX"."RICH" successfully loaded/unloaded
Starting "XX"."RICH":  xx/******** job_name=RICH include=TABLE directory=data_pump_dir
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "XX"."TAB1"                                     0 KB       0 rows
. . imported "XX"."TAB2"                                     0 KB       0 rows
Job "XX"."RICH" successfully completed at 19:01:25


So that works fine, and we also get a warning (ORA-39154) that objects from foreign schemas - in this case XY - have been removed from the import.

That got me thinking: what if there were loads of tables but I only wanted to load a few (probably pretty unlikely)? I had a SQL query I could use to identify them, if only there were a table containing a list of the objects in the dump file (again, a pretty far-fetched scenario) - and I realised this could be achieved with a bit of trickery.

This case is so unlikely to happen that this solution probably has little value, but I thought I'd share it anyway as it could be useful in some case I haven't considered.

My idea was that the master table contains a list of all the objects - surely we could query that to get a dynamic list?
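The inner query on its own looks like this (a sketch - it assumes the job is started with job_name=RICH, so the master table created in the importing schema is also called RICH, as in the run below):

```sql
-- The Data Pump master table (named after the job, here RICH) holds
-- one row per item in the dump file; filter it for XX's tables.
select OBJECT_NAME
  from RICH
 where object_schema = 'XX'
   and object_type = 'TABLE';
```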

Here is what I came up with:

impdp xx/xx job_name=RICH include=TABLE:\"IN \(select OBJECT_NAME from RICH where object_schema=\'XX\' and object_type=\'TABLE\'\)\" keep_master=y



Note here that I explicitly name the job, which forces the master table to be created with that name - I can then refer to it in my query.
Note also the copious use of escape characters, as I did this without a parfile.
I kept the query to a minimum here - essentially just getting a list of all tables owned by XX - but the query could be anything you like.
The import still has to be run as the schema owner for it to work - DBA accounts will not work.
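To avoid all that shell escaping, the same options could instead go in a parfile (a sketch - the filename my_imp.par is my own choice, everything else mirrors the command above):

```
# my_imp.par - hypothetical parfile holding the same options, no shell escaping needed
job_name=RICH
include=TABLE:"IN (select OBJECT_NAME from RICH where object_schema='XX' and object_type='TABLE')"
keep_master=y
```

and then run it with impdp xx/xx parfile=my_imp.par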

Import: Release 11.2.0.3.0 - Production on Fri Jul 11 16:52:43 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
Master table "XX"."RICH" successfully loaded/unloaded
Starting "XX"."RICH":  xx/******** job_name=RICH include=TABLE:"IN (select OBJECT_NAME from RICH where object_schema='XX' and object_type='TABLE')" keep_master=y
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39151: Table "XX"."TAB1" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
ORA-39151: Table "XX"."TAB2" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Job "XX"."RICH" completed with 2 error(s) at 16:52:44


It throws errors as the tables already existed, but it proves that this approach does work.

Maybe this is useful to someone, but I doubt it - it makes interesting reading though, I hope.



1 comment:

  1. Hi.

    Please show the value of the parameter DEFERRED_SEGMENT_CREATION. I think that your tables were created, but not the segments for them. Unfortunately, I cannot check my theory right now, but I will try ASAP.
