
AWS – Export and Import Using the Data Pump Utility


Introduction:
AWS Import/Export is a service that accelerates transferring large amounts of data into and out of AWS using physical storage devices, bypassing the Internet. It transfers data directly onto and off of storage devices you own using Amazon's high-speed internal network. This article focuses on the database side of the picture: exporting from and importing into Oracle databases hosted on AWS using the Data Pump (expdp/impdp) utility.

Refer to http://docs.aws.amazon.com/AWSImportExport/latest/DG/whatisdisk.html

Scenario 1: Take an export dump on the AWS database server for a set of schemas.

Use the following parfile to take the export on the AWS database server:

directory=<directory name>
dumpfile=<dumpfile name>
logfile=<logfile name>
schemas=<schema name>
compression=all
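As a minimal sketch of how the template above is used, the parfile can be written to disk and then passed to expdp. The file name my_export.par and the single schema STG_MTS_APP below are placeholder assumptions, not values mandated by the article:

```shell
# Sketch: write a Data Pump parfile matching the template above.
# "my_export.par" and the single schema are placeholders.
cat > my_export.par <<'EOF'
directory=DATA_PUMP_DIR
dumpfile=bkp_AWSNSMP.dpdmp
logfile=bkp_AWSNSMP.log
schemas=STG_MTS_APP
compression=all
EOF

# The export itself would then be started along these lines:
#   expdp system@AWSNSMP parfile=my_export.par
cat my_export.par
```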


For example, we use the AWSNSMP database to export the schemas and keep the dumpfile on the database server.

On Amazon RDS we must use the DATA_PUMP_DIR directory to hold the export dumpfile and logfile.

SQL> select name,open_mode from v$database;

NAME      OPEN_MODE
--------- --------------------
AWSNSMP   READ WRITE

SQL> select * from dba_directories;

OWNER      DIRECTORY_NAME           DIRECTORY_PATH                 ORIGIN_CON_ID
---------- ------------------------ ------------------------------ -------------
SYS        BDUMP                    /rdsdbdata/log/trace                       0
SYS        ADUMP                    /rdsdbdata/log/audit                       0
SYS        OPATCH_LOG_DIR           /rdsdbbin/oracle/Opatch                    0
SYS        OPATCH_SCRIPT_DIR        /rdsdbbin/oracle/Opatch                    0
SYS        DATA_PUMP_DIR            /rdsdbdata/datapump                        0
SYS        OPATCH_INST_DIR          /rdsdbbin/oracle/OPatch                    0

6 rows selected.


1. Create the parfile and include all required parameters.

vi AWSNSMP_WNL.par

directory=DATA_PUMP_DIR
dumpfile=bkp_AWSNSMP.dpdmp
logfile=bkp_AWSNSMP.log
schemas=STG_MTS_APP,STG_MTS_V,STG_MTS,STG_ENTRP_ADM2,STG_ENTRP_ADM2_V,STG_ENTRP_ADM2_RO,STG_ENTRP_ADM2_APP,STG_ENTRP_ADM1_APP,STG_ENTRP_ADM1,STG_ENTRP_ADM1_V,STG_ENTRP_ADM1_RO,STG_ENTRP_V,STG_ENTRP_RO,STG_ENTRP,STG_ENTRP_APP
compression=all
 


2. Start the export as follows.

$ expdp parfile=AWSNSMP_WNL.par

Export: Release 12.1.0.2.0 - Production on Wed Apr 26 02:28:27 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Username: system@AWSNSMP
Password:
 


3. Monitor the export log from the database.

Find the dumpfile and logfile names by querying the RDSADMIN.RDS_FILE_UTIL.LISTDIR function:

SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

FILENAME                                 TYPE         FILESIZE MTIME
---------------------------------------- ---------- ---------- ---------
bkp_AWSNSMP.dpdmp                        file          3051520 18-APR-17
bkp_AWSNSMP.log                          file            12118 18-APR-17
datapump/                                directory        4096 18-APR-17

Use the rdsadmin.rds_file_util.read_text_file function to view the export logfile:

SQL> select * from table
    (rdsadmin.rds_file_util.read_text_file(
        p_directory => 'DATA_PUMP_DIR',
        p_filename  => 'datapump/bkp_AWSNSMP.log'));

TEXT
--------------------------------------------------------------------------------
;;;
Export: Release 12.1.0.2.0 - Production on Tue Apr 18 10:16:29 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.
;;;
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01":  SYSTEM/********@AWSNSMP directory=DATA_PUMP_DIR dumpfile=bkp_AWSNSMP.dpdmp logfile=bkp_AWSNSMP.log schemas=STG_MTS_APP,STG_MTS_V,STG_MTS,STG_ENTRP_ADM2,STG_ENTRP_ADM2_V,STG_ENTRP_ADM2_RO,STG_ENTRP_ADM2_APP,STG_ENTRP_ADM1_APP,STG_ENTRP_ADM1,STG_ENTRP_ADM1_V,STG_ENTRP_ADM1_RO,STG_ENTRP_V,STG_ENTRP_RO,STG_ENTRP,STG_ENTRP_APP compression=all encryption_password=********

Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 28.62 MB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT

. . exported "STG_ENTRP_ADM2"."RECORD_SEQ_POSITION"           0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."REVINFO"                       0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SCHEDULED_FORM"                0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SIGNATURE"                     0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SIGNATURE_AUD"                 0 KB       0 rows
. . exported "STG_MTS"."MTS_DEPLOYMENT"                       0 KB       0 rows
Master table "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
  /rdsdbdata/datapump/bkp_AWSNSMP.dpdmp
Job "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully completed at Tue Apr 18 14:16:51 2017 elapsed 0 00:00:16
 


Scenario 2: Perform the export on a local server from the AWS database.

Connect to any local database server.

1.    Create a directory to hold the dumpfile and logfile.

SQL> select * from dba_directories where directory_name like '%AWS%';

OWNER      DIRECTORY_NAME       DIRECTORY_PATH                         ORIGIN_CON_ID
---------- -------------------- -------------------------------------- -------------
SYS        AWS_PUMP_DIR         /mounts/nsgpr01_exports/exp/AWS                    0
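The query output shows the directory object already exists on this local server. If it does not exist in your environment, it can be created along these lines (a sketch only; the path is the one from the output above, and the grant assumes the export runs as SYSTEM):

```sql
-- Sketch: create the local directory object used in this scenario.
CREATE OR REPLACE DIRECTORY AWS_PUMP_DIR AS '/mounts/nsgpr01_exports/exp/AWS';
GRANT READ, WRITE ON DIRECTORY AWS_PUMP_DIR TO SYSTEM;
```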


2.    Create a database link.


SQL> CREATE DATABASE LINK AWSNSMP_SYSTEM_LINK CONNECT TO SYSTEM IDENTIFIED BY <password> USING 'AWSNSMP';
Database link created.
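Before referencing the link in a parfile, it is worth confirming that it resolves and the remote AWS database responds. A quick sanity check (sketch):

```sql
-- Sketch: verify the database link by querying the remote v$database.
SELECT name, open_mode FROM v$database@AWSNSMP_SYSTEM_LINK;
```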


3.    Create a parfile that includes the NETWORK_LINK parameter and the encryption option.

vi AWSNSMP_NL.par
directory=AWS_PUMP_DIR
dumpfile=bkp_AWSNSMP.dpdmp
logfile=bkp_AWSNSMP.log
schemas=STG_MTS_APP,STG_MTS_V,STG_MTS,STG_ENTRP_ADM2,STG_ENTRP_ADM2_V,STG_ENTRP_ADM2_RO,STG_ENTRP_ADM2_APP,STG_ENTRP_ADM1_APP,STG_ENTRP_ADM1,STG_ENTRP_ADM1_V,STG_ENTRP_ADM1_RO,STG_ENTRP_V,STG_ENTRP_RO,STG_ENTRP,STG_ENTRP_APP
compression=all
NETWORK_LINK=AWSNSMP_SYSTEM_LINK
encryption_password=*******


4.    Initiate export.

$ expdp parfile=AWSNSMP_NL.par

Export: Release 12.1.0.2.0 - Production on Wed Apr 26 03:10:15 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Username: system
Password:


5.    Monitor the export log.

$ tail -f bkp_AWSNSMP.log


;;;
Export: Release 12.1.0.2.0 - Production on Tue Apr 18 10:12:00 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.
;;;
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Starting "SYS"."SYS_EXPORT_SCHEMA_01":  /******** AS SYSDBA parfile=AWSNSMP_NL.par
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 17.62 MB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA

. . exported "STG_ENTRP_ADM2"."REVINFO"                       0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SCHEDULED_FORM"                0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SIGNATURE"                     0 KB       0 rows
. . exported "STG_ENTRP_ADM2"."SIGNATURE_AUD"                 0 KB       0 rows
. . exported "STG_MTS"."MTS_DEPLOYMENT"                       0 KB       0 rows
Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
  /mounts/nsgpr01_exports/exp/AWS/bkp_AWSNSMP.dpdmp
Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at Tue Apr 18 10:14:21 2017 elapsed 0 00:02:18


Scenario 3: Perform a data load into the AWS database using import.

1.    Create a database link on the local server to communicate with the AWS database.

SQL> CREATE DATABASE LINK AWSNSTN_SYSTEM_LINK CONNECT TO SYSTEM IDENTIFIED BY ***** USING 'AWSNSTN';
Database link created.

 
2.    Transfer the dumpfile to the AWS database server.

Refer to the link below for details on transferring files to AWS:
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Oracle.Procedural.Importing.html

Use the DBMS_FILE_TRANSFER.PUT_FILE procedure to transfer the file to the AWS database server:
 

set timing on
set time on
BEGIN
DBMS_FILE_TRANSFER.PUT_FILE (
source_directory_object        => 'AWS_PUMP_DIR',
source_file_name               => 'expdp_NSMPAC_schema.dmp',
destination_directory_object   => 'DATA_PUMP_DIR',
destination_file_name          => 'expdp_NSMPAC_schema.dmp',
destination_database           => 'AWSNSTN_SYSTEM_LINK');
END;
/


source_directory_object --> local database directory where the dumpfile is located
source_file_name --> dumpfile name
destination_directory_object --> AWS database directory where the dumpfile should be placed
destination_file_name --> name to give the dumpfile on the AWS server
destination_database --> database link to the AWS database

3.    Monitor the transfer of files from the database.

Issue the following query to view the file names:

select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

$ sqlplus

SQL*Plus: Release 12.1.0.2.0 Production on Wed May 3 01:02:25 2017

Copyright (c) 1982, 2014, Oracle.  All rights reserved.

Enter user-name: SYSTEM@AWSNSTN
Enter password:


Last Successful login time: Tue May 02 2017 01:13:16 -04:00

Connected to:
Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

SQL> select name,open_mode from v$database;

NAME      OPEN_MODE
--------- --------------------
AWSNSTN   READ WRITE

SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

FILENAME                                 TYPE         FILESIZE MTIME
---------------------------------------- ---------- ---------- ---------
expdp_NSMPAC_schema.dmp                  file       2.0437E+10 18-APR-17
impdp_NSMPAC_schema.log                  file           164046 18-APR-17
datapump/                                directory        4096 19-APR-17


4.    Create a parfile for the import with the PARALLEL parameter to speed up the import process.

vi impdp_AWSNSTN_schemas.ctl

directory=DATA_PUMP_DIR
dumpfile=expdp_NSMPAC_schema.dmp
logfile=impdp_NSMPAC_schema.log
schemas=BRMGR01
transform=oid:n
parallel=16

  
5.    Start the import with nohup in the background.

nohup impdp SYSTEM/*****@AWSNSTN parfile=impdp_AWSNSTN_schemas.ctl &
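While the job runs, its state can also be checked from the standard Oracle Data Pump dictionary view (a sketch; requires appropriate privileges on the target database):

```sql
-- Sketch: list Data Pump jobs and their current state.
SELECT owner_name, job_name, operation, state
FROM   dba_datapump_jobs;
```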

6.    Monitor the import log from the database.

SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

FILENAME                                 TYPE         FILESIZE MTIME
---------------------------------------- ---------- ---------- ---------
expdp_NSMPAC_schema.dmp                  file       2.0437E+10 18-APR-17
impdp_NSMPAC_schema.log                  file           164046 18-APR-17
datapump/                                directory        4096 19-APR-17



The results above give the file names.
Use rdsadmin.rds_file_util.read_text_file to read the import logfile:

SQL> select * from table
    (rdsadmin.rds_file_util.read_text_file(
        p_directory => 'DATA_PUMP_DIR',
        p_filename  => 'datapump/impdp_NSMPAC_schema.log'));

TEXT
--------------------------------------------------------------------------------
;;;
Import: Release 12.1.0.2.0 - Production on Tue Apr 18 13:07:42 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.
;;;
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_SCHEMA_01":  SYSTEM/********@AWSNSTN parfile=impdp_AWSNSTN_schemas.ctl
Processing object type SCHEMA_EXPORT/USER

Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/AUDIT_OBJ
Processing object type SCHEMA_EXPORT/PACKAGE/COMPILE_PACKAGE/PACKAGE_SPEC/ALTER_PACKAGE_SPEC
Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/FUNCTIONAL_INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_INDEX/INDEX_STATISTICS
Job "SYSTEM"."SYS_IMPORT_SCHEMA_01" completed at Tue Apr 18 19:46:36 2017 elapsed 0 02:38:50

2462 rows selected.


Conclusion:
In this scenario-based article we walked through exporting from and importing into Oracle databases hosted on AWS using Data Pump, moving data from on-premises to AWS and vice versa.

BY
Name: Omer
Designation: Senior Database Engineer
Organization: Netsoftmate IT Solutions.
Email: info@netsoftmate.com