Many options are available when we need to transfer or migrate data between two different Oracle databases. One option is to use Oracle's built-in utilities, exp and imp. exp extracts data and other schema objects into a dump file, and imp reads that file and imports it into another database.
Why use exp/imp? One reason is speed. From my own experience, migrating 3,000 rows with manual INSERT statements can take a significant amount of time, while migrating 60,000 rows with exp/imp takes no more than a few minutes.
What will you need? A machine with a properly configured Oracle Database installation, a TNS record for both databases involved in the migration, and enough disk space to store the dump file, depending on how much data you have.
Let’s say we need to migrate data in the app1 schema from a development machine, DBDEV, to a staging machine, DBSTAGE. The tables to be migrated are PORT and CARD. DBSTAGE already has the tables created; all it needs is the data from DBDEV.
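Before starting, it can help to confirm that both TNS entries resolve from the machine where you will run the commands. A quick sketch using Oracle's tnsping utility, assuming DBDEV and DBSTAGE are the TNS aliases from the scenario above:

```shell
# Verify that both TNS aliases resolve and their listeners respond.
# Requires an Oracle client with tnsnames.ora configured.
tnsping DBDEV
tnsping DBSTAGE
```

If either command fails, fix the TNS configuration before attempting the export or import.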
First we need to export the data. The syntax for exporting tables looks like this:
exp username/password tables=table1,table2 file=table_data.dmp
If you don’t wish to include the password on the command line, you can use this format instead:
exp username tables=table1,table2 file=table_data.dmp
You will be prompted for the password when the command executes. Applying this to our scenario, the final command looks like this:
exp app1/password@DBDEV tables=PORT,CARD file=table_data.dmp
Executing this command presents an export status screen, which reports how much data was exported and whether the command finished with warnings or errors. Once you get your command prompt back, you should see a file named table_data.dmp. Next, we will use this file to import the data into DBSTAGE.
The syntax for imp is no different from exp:
imp username/password@database file=table_data.dmp
For our scenario, we should use this command:
imp app1/password@DBSTAGE file=table_data.dmp ignore=y
Notice that we now use the DBSTAGE connection instead of DBDEV, and also add the option ignore=y. Why? Since the tables already exist on DBSTAGE, this option prevents imp from stopping because the objects already exist on the target database. Without it, imp would fail with an error (typically IMP-00015, "following statement failed because the object already exists") when it tries to create tables that are already there.
Executing the above command presents a status screen similar to exp's, telling you how many rows have been imported.
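Putting the whole scenario together as a small shell sketch (note that passwords on the command line are visible in process listings; in practice, prefer the prompt form shown earlier):

```shell
# Export PORT and CARD from the development database
exp app1/password@DBDEV tables=PORT,CARD file=table_data.dmp

# Import into staging; ignore=y skips "object already exists" errors
imp app1/password@DBSTAGE file=table_data.dmp ignore=y
```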
Some additional information:
- If you hit a constraint error, such as ORA-02291: integrity constraint violated, make sure the data referenced by the constraint has also been imported. You can also disable the offending constraint before importing and re-enable it afterwards.
- If you are migrating a considerably large amount of data, it may be wise to exclude indexes from the export with the option indexes=n. You can rebuild the indexes after the import completes.
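The two tips above can be sketched in SQL*Plus. The constraint and index names here (FK_CARD_PORT, IDX_CARD_PORT) are hypothetical, made up for illustration; substitute the names from your own schema:

```shell
# Before importing: disable the foreign key so rows can load in any order.
# FK_CARD_PORT is an assumed name; find yours in USER_CONSTRAINTS.
sqlplus app1/password@DBSTAGE <<'EOF'
ALTER TABLE CARD DISABLE CONSTRAINT FK_CARD_PORT;
EOF

# ... run the imp command here ...

# After importing: re-enable the constraint (this validates the new rows)
# and rebuild an index that was excluded with indexes=n.
sqlplus app1/password@DBSTAGE <<'EOF'
ALTER TABLE CARD ENABLE CONSTRAINT FK_CARD_PORT;
ALTER INDEX IDX_CARD_PORT REBUILD;
EOF
```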