Memory overflow (my memory that is) and more datapump related tricks



I'm finding more and more recently that I rediscover things. I've not completely forgotten about them, but they are so far from the surface that initially I think "have I seen that before?", followed by "hold on - I have seen this before", then "I've actually used it before", and finally "I've blogged about it before".......

I therefore won't write any more about that particular topic; instead I'll blog about something that I did rediscover this week but actually haven't written about before.

So here goes: a question was asked on the Oracle Technet forum about wanting to export data from a number of tables in database 'A' and load it into the same tables in database 'B'. The caveat was that any rows that already existed in 'B' should not be updated/replaced - only 'extra' data from 'A' should be loaded into 'B'.

So for example, if in database A one of the tables has this data
1
2
3

And in database B we have
1
2

Then only the row with a value of 3 should be 'inserted'; the other rows should be left alone.

Initially when reading this I thought "hmm, datapump can't do that" and I suggested using the merge command over a database link.

But something was niggling in the back of my mind (and I didn't like the fact that loads of long merge commands would have to be written), so I did a quick bit of research - and there are actually 2 ways to solve this (one requiring less typing than the other).
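For reference, the merge-over-a-database-link approach would have looked something like this for each table. This is just a sketch: the link name "dblink_to_a" is a placeholder, and it assumes the statement runs on database B with a link back to A.

```sql
-- Hypothetical sketch only: insert rows from database A that are missing in B.
-- Run on database B; "dblink_to_a" and the column list are assumptions.
MERGE INTO demo.test b
USING (SELECT col1 FROM demo.test@dblink_to_a) a
ON (b.col1 = a.col1)
WHEN NOT MATCHED THEN
  INSERT (col1) VALUES (a.col1);
```

One such statement would be needed per table - exactly the typing I wanted to avoid.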

The first one (with less typing) is to use something that was introduced in 11g datapump (and I guess is pretty rarely used) - here's an example of that:

First up we create a simple table with a primary key and put some data in

SYS@DB>create table demo.test (col1 number);

Table created.

SYS@DB>alter table demo.test add primary key (col1);

Table altered.

SYS@DB>insert into demo.test values (1);

1 row created.

SYS@DB>insert into demo.test values (2);

1 row created.

SYS@DB>commit;

Commit complete.

SYS@DB>



Then we export out that table

[oracle@server]:DB:[~]# expdp / tables=demo.test reuse_dumpfiles=y

Export: Release 11.2.0.3.0 - Production on Thu May 29 19:59:32 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
Starting "OPS$ORACLE"."SYS_EXPORT_TABLE_01":  /******** tables=demo.test reuse_dumpfiles=y
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 16 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
. . exported "DEMO"."TEST"                               5.015 KB       2 rows
Master table "OPS$ORACLE"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for OPS$ORACLE.SYS_EXPORT_TABLE_01 is:
  /oracle/11.2.0.3.0.DB/rdbms/log/expdat.dmp
Job "OPS$ORACLE"."SYS_EXPORT_TABLE_01" successfully completed at 19:59:38


Obviously loading that straight back in fails, as the default is to skip any tables that already exist

[oracle@server]:DB:[~]# impdp /

Import: Release 11.2.0.3.0 - Production on Thu May 29 20:00:03 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
Master table "OPS$ORACLE"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "OPS$ORACLE"."SYS_IMPORT_FULL_01":  /********
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39151: Table "DEMO"."TEST" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Job "OPS$ORACLE"."SYS_IMPORT_FULL_01" completed with 1 error(s) at 20:00:04


So instead of skipping the object, we say: if it's there, don't try to create it, just put the data into the existing object.


[oracle@server]:DB:[~]# impdp / table_exists_action=append

Import: Release 11.2.0.3.0 - Production on Thu May 29 20:00:44 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
Master table "OPS$ORACLE"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "OPS$ORACLE"."SYS_IMPORT_FULL_01":  /******** table_exists_action=append
Processing object type TABLE_EXPORT/TABLE/TABLE
Table "DEMO"."TEST" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
ORA-31693: Table data object "DEMO"."TEST" failed to load/unload and is being skipped due to error:
ORA-00001: unique constraint (DEMO.SYS_C0051678) violated
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Job "OPS$ORACLE"."SYS_IMPORT_FULL_01" completed with 1 error(s) at 20:00:45


It still fails of course, due to the primary key.

Let's now add the new 11g parameter 'data_options=skip_constraint_errors'

[oracle@server]:DB:[~]# impdp / table_exists_action=append data_options=skip_constraint_errors

Import: Release 11.2.0.3.0 - Production on Thu May 29 20:01:57 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
Master table "OPS$ORACLE"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "OPS$ORACLE"."SYS_IMPORT_FULL_01":  /******** table_exists_action=append data_options=skip_constraint_errors
Processing object type TABLE_EXPORT/TABLE/TABLE
Table "DEMO"."TEST" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "DEMO"."TEST"                               5.015 KB       0 out of 2 rows
2 row(s) were rejected with the following error:
ORA-00001: unique constraint (DEMO.SYS_C0051678) violated

Rejected rows with the primary keys are:
 Rejected row #1:
   column COL1: 1
 Rejected row #2:
   column COL1: 2
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Job "OPS$ORACLE"."SYS_IMPORT_FULL_01" successfully completed at 20:01:58


Now it still reports errors - but the good thing is it tried every row; it didn't just exit. In our test case, though, all the data is duplicated. Let's remove one row from the table and try again.

SYS@DB>delete from demo.test where col1=2;

1 row deleted.

SYS@DB>commit;

Commit complete.

SYS@DB>


Same command as last time

[oracle@server]:DB:[~]# impdp / table_exists_action=append data_options=skip_constraint_errors

Import: Release 11.2.0.3.0 - Production on Thu May 29 20:02:38 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
Master table "OPS$ORACLE"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "OPS$ORACLE"."SYS_IMPORT_FULL_01":  /******** table_exists_action=append data_options=skip_constraint_errors
Processing object type TABLE_EXPORT/TABLE/TABLE
Table "DEMO"."TEST" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "DEMO"."TEST"                               5.015 KB       1 out of 2 rows
1 row(s) were rejected with the following error:
ORA-00001: unique constraint (DEMO.SYS_C0051678) violated

Rejected rows with the primary keys are:
 Rejected row #1:
   column COL1: 1
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Job "OPS$ORACLE"."SYS_IMPORT_FULL_01" successfully completed at 20:02:39


Result - the 'extra' row loaded and it told us which row was the duplicate - so we have a solution. The only downside is that we will likely get a huge logfile, and it may be difficult to spot any genuine issues among all the expected PK failures.
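One rough way round that is to filter the expected ORA-00001 noise out of the import log afterwards. A minimal sketch (the log file name "import.log" is an assumption, and the printf just stands in for a real impdp log):

```shell
# Stand-in sample for a real impdp log file - replace with your actual log.
printf '%s\n' \
  'ORA-00001: unique constraint (DEMO.SYS_C0051678) violated' \
  'ORA-01652: unable to extend temp segment' > import.log

# Show ORA- errors that are NOT the expected primary-key violations.
grep 'ORA-' import.log | grep -v 'ORA-00001'
```

Anything that survives the second grep is worth a proper look.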


This is where solution 2 comes in - and it's actually available back in 10g: use of the query parameter on import. This is less widely used, I guess, and people may not realise that query can be used the same way on import as it is on export.


So if I come up with a simple query comparing the data in the dumpfile with what's already in the table, I can load just the rows that don't exist. In the example below, the initial part of the where clause (the "where col1" part) refers to data in the dumpfile; the subquery following the "not in" refers to the data already in the table in the database.

So let's give it a whirl

[oracle@server]:DB:[~]# impdp / table_exists_action=append query=DEMO\.TEST\:\"where col1 not in \(select col1 from demo\.test\)\"

Import: Release 11.2.0.3.0 - Production on Thu May 29 20:07:44 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
Master table "OPS$ORACLE"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "OPS$ORACLE"."SYS_IMPORT_FULL_01":  /******** table_exists_action=append query=DEMO.TEST:"where col1 not in (select col1 from demo.test)"
Processing object type TABLE_EXPORT/TABLE/TABLE
Table "DEMO"."TEST" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "DEMO"."TEST"                               5.015 KB       0 out of 2 rows
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Job "OPS$ORACLE"."SYS_IMPORT_FULL_01" successfully completed at 20:07:45


So it runs fine but doesn't do anything, as the rows already exist - again, let's delete one

SYS@DB>delete from demo.test where col1=2;

1 row deleted.

SYS@DB>commit;

Commit complete.

And if we now try the import again we see

[oracle@server]:DB:[~]# impdp / table_exists_action=append query=DEMO\.TEST\:\"where col1 not in \(select col1 from demo\.test\)\"

Import: Release 11.2.0.3.0 - Production on Thu May 29 20:09:07 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning option
Master table "OPS$ORACLE"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "OPS$ORACLE"."SYS_IMPORT_FULL_01":  /******** table_exists_action=append query=DEMO.TEST:"where col1 not in (select col1 from demo.test)"
Processing object type TABLE_EXPORT/TABLE/TABLE
Table "DEMO"."TEST" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "DEMO"."TEST"                               5.015 KB       1 out of 2 rows
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Job "OPS$ORACLE"."SYS_IMPORT_FULL_01" successfully completed at 20:09:08


And there we go - that works well and we don't get loads of errors - the downside being that we have to code the query clause for every single table that we want to load the missing rows into.
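To keep the per-table typing (and the shell escaping) manageable, those query clauses can live in a parameter file instead of on the command line. A sketch - the second table and its column are made-up placeholders:

```
# Hypothetical parfile, run as: impdp / parfile=selective_load.par
TABLE_EXISTS_ACTION=append
QUERY=DEMO.TEST:"where col1 not in (select col1 from demo.test)"
QUERY=DEMO.OTHER_TABLE:"where id not in (select id from demo.other_table)"
```

No backslash escaping is needed inside a parfile, which makes the query clauses much easier to read and maintain.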

So there you have it - 2 datapump solutions to the problem. Now all I have to do is not let them escape from my head again.....
