Put your query in a file named query.sql like this:
set current explain mode yes;
your query here;
set current explain mode no;
Then run:
db2batch -d DBNAME -f query.sql -i complete -iso CS | tee /tmp/output_batch.txt
db2exfmt -d DBNAME -1 -o /tmp/output_exfmt.txt
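The whole flow above can be wrapped in one small script. DBNAME and the sample query are placeholders, and the db2batch/db2exfmt steps only run if the DB2 client tools are installed:

```shell
#!/bin/sh
# Sketch: wrap a query with explain mode, run it through db2batch, then
# format the captured access plan with db2exfmt. DBNAME is a placeholder.
QUERY="select count(*) from syscat.tables"

cat > /tmp/query.sql <<EOF
set current explain mode yes;
$QUERY;
set current explain mode no;
EOF

if command -v db2batch >/dev/null 2>&1; then
    db2batch -d DBNAME -f /tmp/query.sql -i complete -iso CS | tee /tmp/output_batch.txt
    db2exfmt -d DBNAME -1 -o /tmp/output_exfmt.txt
fi
```

The `-1` option of db2exfmt uses default values for all its other options, so only the database and output file need to be given.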
This blog covers different aspects of software development and testing. It also contains general information useful for anyone who needs to test performance and analyze bottlenecks on application servers and databases. Disclaimer: test before you implement any advice; no warranty is provided. Before using the software mentioned in this blog, read and understand all license agreements. Use of the information contained in this blog is at your own risk.
Thursday, April 28, 2016
db2 backup online and rollforward
I want to share a situation where I got an error during the rollforward of a database restored from an online backup, because of missing transaction log files.
So I took a backup of the database:
db2 -v backup db DBNAME online to /DB_BACKUPS compress UTIL_IMPACT_PRIORITY 50 include logs without prompting
Then I transferred it to the destination computer, where I executed the restore:
db2 restore database MAXDB75 from /DB_BACKUPS taken at 20160419133051 into DBNAME NEWLOGPATH '/transationlogs' redirect
But during the rollforward I got:
db2 "rollforward database DBNAME to end of backup and complete"
SQL1273N An operation reading the logs on database "DBNAME" cannot continue because of a missing log file "S0017002.LOG" on database partition "0" and log stream "0".
I solved it by manually copying the old transaction logs into the new database log path.
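A minimal sketch of that workaround, assuming the old archived logs are still available in a directory on the target machine (the old log directory here is hypothetical; /transationlogs is the NEWLOGPATH given during the restore above):

```shell
#!/bin/sh
# Sketch of the manual fix: copy the old transaction logs into the new
# database log path, then retry the rollforward. Paths are examples only.
OLD_LOG_DIR=/old_logs          # hypothetical: where S0017002.LOG and friends were kept
NEW_LOG_DIR=/transationlogs    # the NEWLOGPATH used during the restore

if [ -d "$OLD_LOG_DIR" ] && command -v db2 >/dev/null 2>&1; then
    cp "$OLD_LOG_DIR"/S*.LOG "$NEW_LOG_DIR"/
    db2 "rollforward database DBNAME to end of backup and complete"
fi
```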
Labels:
backup,
db2,
restore,
rollforward
Tuesday, April 12, 2016
Using external sources with IBM InfoSphere Change Data Capture
The tool also allows external loads of data, without the constraint of using the tool itself to mirror or refresh the tables; however, you need to mark on the target database the CDC subscription with the start and end of the rows that have been unloaded from the source.
For example, customize the following commands and run them to mark the rows:
Run on the Source:
Run the mark commands before and after exporting the rows from a table:
dmmarkexternalunloadstart -I INSTANCE_NAME -s SUBSCRIPTION_NAME -t TABLE1
db2 export to /tmp/TABLE1.ixf of ixf messages /tmp/TABLE1_msgs.txt "select * from TABLE1"
dmmarkexternalunloadend -I INSTANCE_NAME -s SUBSCRIPTION_NAME -t TABLE1
Copy the file /tmp/TABLE1.ixf from the source to the target machine.
Run on the Target:
Import the exported data into the target:
db2 import from /tmp/TABLE1.ixf of ixf messages /tmp/TABLE1_msgs.txt insert into TABLE1
Start Mirroring:
On the CDC console, start mirroring on the specified CDC subscription. This transfers all rows that were inserted, updated or deleted during the export operation.
As an alternative to exporting all tables from the source, you can also take a backup on the source and restore it on the target.
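The source-side steps above can be sketched as one script; the instance, subscription and table names are the same placeholders used in the example, and the commands only run where the CDC tools are installed:

```shell
#!/bin/sh
# Source side: mark the external unload window around the export so CDC
# knows which rows were moved outside its control. Names are placeholders.
INSTANCE=INSTANCE_NAME
SUB=SUBSCRIPTION_NAME
TABLE=TABLE1

if command -v dmmarkexternalunloadstart >/dev/null 2>&1; then
    dmmarkexternalunloadstart -I "$INSTANCE" -s "$SUB" -t "$TABLE"
    db2 export to "/tmp/$TABLE.ixf" of ixf messages "/tmp/${TABLE}_msgs.txt" \
        "select * from $TABLE"
    dmmarkexternalunloadend -I "$INSTANCE" -s "$SUB" -t "$TABLE"
    # then copy /tmp/$TABLE.ixf to the target machine and import it there
fi
```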
Performance best practices with IBM InfoSphere Change Data Capture
In my experience with this data migration/synchronization tool, there are a couple of best practices to speed up the transfer of data from a source database to a target database, which I've used with DB2. Before starting the refresh or mirroring operations on the data, consider whether to:
- remove all the triggers present on the target database
- remove all foreign keys on the target database
Removing the triggers boosts speed a lot, because a huge transfer of rows into a target table that has a trigger defined can cause many consequent inserts and updates to be executed.
Removing the foreign keys lets you transfer the tables without caring about the order, because if you are transferring the content of all tables involved in the foreign keys, referential integrity is respected anyway.
If you are moving the database from an environment A to an environment B, after the source and target databases are synchronized you will probably have to recreate all triggers and foreign keys.
If instead you are using the tool for mirroring (continuously copying changes from source to target), for example into a secondary database used for data warehouse or reporting purposes, then you do not need triggers or foreign keys at runtime, so removing them is not a problem.
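As a sketch of the first two points, the DROP statements for triggers and foreign keys can be generated from the DB2 catalog views (MYSCHEMA and the output paths are placeholders); the generated files can be reviewed and run before the refresh, and the objects recreated afterwards:

```shell
#!/bin/sh
# Generate DROP statements for all triggers and foreign keys of one schema
# from the DB2 catalog views. Review the files before running them.
SCHEMA=MYSCHEMA

if command -v db2 >/dev/null 2>&1; then
    db2 -x "select 'DROP TRIGGER ' || trim(trigschema) || '.' || trim(trigname) || ';'
            from syscat.triggers where trigschema = '$SCHEMA'" > /tmp/drop_triggers.sql
    db2 -x "select 'ALTER TABLE ' || trim(tabschema) || '.' || trim(tabname) ||
            ' DROP FOREIGN KEY ' || trim(constname) || ';'
            from syscat.references where tabschema = '$SCHEMA'" > /tmp/drop_fks.sql
fi
```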