Overview
 
Amazon Web Services is a comprehensive, evolving cloud computing platform provided by Amazon. Amazon Relational Database Service (Amazon RDS) is a web service that makes it easier to set up, operate, and scale a relational database in the cloud.
 
In this article, we demonstrate how to establish a connection to an AWS RDS Oracle database using a SQL client.

Prerequisites
  • Install an Oracle SQL client on your client machine (Windows/Linux)
  • You can also use the Oracle SQL Developer tool to establish the connection


Procedure to Connect to AWS RDS Oracle Database

  • Add a TNS entry for the RDS instance to the tnsnames.ora file on the client machine.
    • Review the blog article on how to find the endpoint details of an AWS RDS instance at http://netsoftmate.blogspot.com/2018/07/how-to-find-hostname-from-aws-console.html

nsmprd =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = nsmprd.cjhtbujgykil.us-east-1.rds.amazonaws.com)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = nsmprd)
    )
  )
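Alternatively, if you prefer not to edit tnsnames.ora, the same endpoint details can be used directly in an EZConnect-style connect string. This is just a sketch using the endpoint shown above; substitute your own username:

C:\Users>sqlplus nsmsystem@//nsmprd.cjhtbujgykil.us-east-1.rds.amazonaws.com:1521/nsmprd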

  • Check connectivity to the AWS RDS Oracle instance using the tnsping utility
C:\Users>tnsping nsmprd

TNS Ping Utility for 64-bit Windows: Version 12.2.0.1.0 – Production on 13-JUN-2018 16:19:34

Copyright (c) 1997, 2016, Oracle.  All rights reserved.

Used parameter files:
D:\oracle12c\product\12.2.0\dbhome_1\network\admin\sqlnet.ora

Used TNSNAMES adapter to resolve the alias
Attempting to contact (DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = nsmprd.cjhtbujgykil.us-east-1.rds.amazonaws.com)(PORT = 1521)) (CONNECT_DATA = (SERVER = DEDICATED) (SERVICE_NAME = nsmprd)))
TNS-12535: TNS:operation timed out




Follow the steps below to resolve the timeout issue.

  • Check the security group and add inbound rules to the VPC security group. If the DB instance was created with the default security group, that group's firewall rules prevent external connections.

To know more about security groups, please go through the following link:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.RDSSecurityGroups.html

Add a rule to the security group:

  • Log in to the AWS console and click VPC under Networking & Content Delivery

  • In the VPC dashboard navigation pane, click Security Groups

  • Select the security group you want to update

  • Click Inbound Rules, then Edit to add new rules

  • Click Add another rule

  • Select the rule type from the drop-down list

  • Choose the Oracle port (1521) for inbound traffic

  • Click Save
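If you prefer the command line, the same inbound rule can be added with the AWS CLI. The sketch below assumes the AWS CLI is configured; the security group ID and client IP are placeholders, not values from this environment:

aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 1521 --cidr <your-client-ip>/32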

Now check the connectivity again using tnsping

C:\Users>tnsping nsmprd

TNS Ping Utility for 64-bit Windows: Version 12.2.0.1.0 – Production on 13-JUN-2018 18:37:22

Copyright (c) 1997, 2016, Oracle.  All rights reserved.

Used parameter files:
D:\oracle12c\product\12.2.0\dbhome_1\network\admin\sqlnet.ora


Used TNSNAMES adapter to resolve the alias
Attempting to contact (DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = nsmprd.cjhtbujgykil.us-east-1.rds.amazonaws.com)(PORT = 1521)) (CONNECT_DATA = (SERVER = DEDICATED) (SERVICE_NAME = nsmprd)))
OK (1500 msec)

We can see that the tnsping is now successful.


  • Now establish a connection to the AWS RDS Oracle instance

C:\Users>sqlplus

SQL*Plus: Release 12.2.0.1.0 Production on Wed Jun 13 18:37:30 2018

Copyright (c) 1982, 2016, Oracle.  All rights reserved.

Enter user-name: nsmsystem@nsmprd
Enter password:
Last Successful login time: Wed Jun 13 2018 14:31:06 +05:30

Connected to:
Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 – 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

SQL> select name, INSTANCE_NAME, OPEN_MODE, HOST_NAME, DATABASE_STATUS, logins, to_char(STARTUP_TIME,'DD-MON-YYYY HH24:MI:SS') "UP TIME" from v$database, v$instance;

NAME      INSTANCE_NAME    OPEN_MODE            HOST_NAME                      DATABASE_STATUS   LOGINS     UP TIME
——— —————- ——————– —————————— —————– ———- ————————-
NSMPRD    NSMPRD           READ WRITE           ip-10-1-2-24                   ACTIVE            ALLOWED    10-JUN-2018 09:27:22




Conclusion

In this article, we have learned how to establish a connection to an AWS RDS Oracle instance using a SQL client, and we have also seen how to add rules to the VPC security group.


Overview

Every AWS RDS instance has a DNS name and port number, together known as the endpoint. To establish a connection to your AWS DB instance you need this DNS name and port number, and you can find the endpoint details of an AWS RDS instance through the AWS console.

In this article we will demonstrate how to find the endpoint details from the AWS console.

Steps to find endpoint of AWS RDS Instance

  • Log in to the AWS console
Please see the previous article on how to log in to the AWS console.

  • From the console, go to Services, then RDS

  • Click on RDS

  • Click on Instances

  • Choose the Oracle database instance and view the database properties from the Instance actions tab.


  • Click See details to display the database instance properties.
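If the AWS CLI is configured, the endpoint details can also be retrieved without the console. A minimal sketch, assuming the DB instance identifier is nsmprd:

aws rds describe-db-instances --db-instance-identifier nsmprd --query "DBInstances[0].Endpoint"

The returned Endpoint block contains the Address (DNS name) and Port used in the tnsnames.ora entry.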


Conclusion
In this article, we have learned how to find the endpoint details of an AWS RDS instance from the AWS console.



Next: how to establish a connection between on-premises and the AWS Cloud…





By
Name : Omer
Designation : Senior Database Engineer
Organization: Netsoftmate IT Solutions

Amazon Web Services (AWS) is a secure, evolving cloud computing platform provided by Amazon. It offers compute power, database storage, content delivery and other functionality to help businesses scale up and grow.

Amazon Relational Database Service (Amazon RDS) is a web service that makes it easier to set up, operate, and scale a relational database in the cloud. It provides cost-efficient, resizable capacity for an industry-standard relational database and manages common database administration tasks.
To know more about Amazon RDS please visit https://aws.amazon.com/rds/

In this article we will demonstrate how to create an AWS RDS Instance.

Steps to create AWS RDS Instance (Oracle EE)


  • Log in to the AWS console (see the earlier article on how to log in to the console)
  • On the AWS console, click the RDS dashboard

  • Click Get started now, then on the next page select the engine

  • Click the Oracle engine to create an Oracle database instance

  • Select the edition; here we are selecting Enterprise Edition. In the next window, where more details are required, choose Bring Your Own License as the license model

  • Now choose the DB engine version; we selected the latest version

  • Now choose the DB instance class; we chose db.r4.xlarge, which has 4 vCPUs and 30 GB of memory. To know more about DB instance classes, see the AWS documentation

  • Provide the storage type; we selected Provisioned IOPS. To learn more about AWS storage types, see the AWS documentation

  • Provide the database settings, such as the DB name and the master username and password. This master username is the one we will use to connect to the database from the back end

  • After specifying the DB details, move to the next window to provide advanced configuration settings, such as VPC details and the security group
To learn more about VPC (virtual private cloud), see the AWS documentation
To learn about RDS security groups, please follow the link given earlier

  • Provide the VPC, security group, database name and port details

  • Under Database options, provide the database details: character set, option group and parameter group

  • After providing all details, click Launch DB instance (a sketch of the CLI equivalent is shown after this list)

  • Check the progress; instance creation is in progress

  • The instance is created and the initial backup is in progress

  • Now the database is available for use
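For reference, a roughly equivalent instance can be created from the AWS CLI. The sketch below mirrors the choices made above (Oracle EE, BYOL, db.r4.xlarge, Provisioned IOPS); the identifier, engine version, storage size, password and security group ID are illustrative placeholders only:

aws rds create-db-instance \
    --db-instance-identifier nsmprd \
    --engine oracle-ee \
    --engine-version 12.1.0.2.v8 \
    --license-model bring-your-own-license \
    --db-instance-class db.r4.xlarge \
    --storage-type io1 --iops 1000 --allocated-storage 100 \
    --db-name NSMPRD --port 1521 \
    --master-username nsmsystem --master-user-password <password> \
    --vpc-security-group-ids sg-0123456789abcdef0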


Conclusion
In this article we have learned how to create an AWS RDS instance. We have also touched on the virtual private cloud (VPC), RDS security groups, DB instance classes and storage types.



Next: how to find the host name of an AWS RDS instance…


By
Name : Omer
Designation : Senior Database Engineer
Organization: Netsoftmate IT Solutions


Introduction:
AWS Import/Export is a service that accelerates transferring large amounts of data into and out of AWS using physical storage devices, bypassing the Internet. AWS Import/Export transfers data directly onto and off of storage devices you own using the Amazon high-speed internal network.

Refer http://docs.aws.amazon.com/AWSImportExport/latest/DG/whatisdisk.html

Scenario 1: Take an export dump on the AWS database server for some of the schemas.

Use the following parfile to take the export on the AWS database server:

directory=<directory name>
dumpfile=<dumpfile name>
logfile=<logfile name>
schemas=<schema name>
compression=all


For example, we are using the AWSNSMP database to take a schema export and keep the dumpfile on the database server.

We need to use the DATA_PUMP_DIR directory for keeping the export dumpfile and logfiles.

SQL> select name,open_mode from v$database;

NAME      OPEN_MODE
——— ——————–
AWSNSMP   READ WRITE

SQL> select * from dba_directories;

OWNER      DIRECTORY_NAME           DIRECTORY_PATH                 ORIGIN_CON_ID
———- ———————— —————————— ————-
SYS        BDUMP                    /rdsdbdata/log/trace                       0
SYS        ADUMP                    /rdsdbdata/log/audit                       0
SYS        OPATCH_LOG_DIR           /rdsdbbin/oracle/Opatch                   0
SYS        OPATCH_SCRIPT_DIR        /rdsdbbin/oracle/Opatch                   0
SYS        DATA_PUMP_DIR            /rdsdbdata/datapump                        0
SYS        OPATCH_INST_DIR          /rdsdbbin/oracle/OPatch                    0

6 rows selected.
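Note that DATA_PUMP_DIR is pre-created on every RDS Oracle instance. If an additional directory is ever needed, RDS exposes a helper procedure for this; a quick sketch (the directory name here is only an example):

SQL> EXEC rdsadmin.rdsadmin_util.create_directory(p_directory_name => 'NSM_DUMP_DIR');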


1. Create the parfile with all required parameters.

vi AWSNSMP_WNL.par

directory=DATA_PUMP_DIR
dumpfile=bkp_AWSNSMP.dpdmp
logfile=bkp_AWSNSMP.log
schemas=STG_MTS_APP,STG_MTS_V,
STG_MTS,
STG_ENTRP_ADM2,
STG_ENTRP_ADM2_V,
STG_ENTRP_ADM2_RO,
STG_ENTRP_ADM2_APP,
STG_ENTRP_ADM1_APP,
STG_ENTRP_ADM1,
STG_ENTRP_ADM1_V,
STG_ENTRP_ADM1_RO,
STG_ENTRP_V,
STG_ENTRP_RO,
STG_ENTRP,
STG_ENTRP_APP
compression=all
 


2. Start export as follows.

$ expdp parfile=AWSNSMP_WNL.par

Export: Release 12.1.0.2.0 – Production on Wed Apr 26 02:28:27 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Username: system@AWSNSMP
Password:
 


3. Monitor the export log from the database.

Find out the file name by querying the RDSADMIN.RDS_FILE_UTIL.LISTDIR function from the database.

SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

FILENAME                                 TYPE         FILESIZE MTIME
—————————————- ———- ———- ———
bkp_AWSNSMP.dpdmp                         file         3051520  18-APR-17
bkp_AWSNSMP.log                           file         12118    18-APR-17
datapump/                                directory     4096     18-APR-17
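In addition to listing the files, the state of the Data Pump job itself can be checked from the standard dictionary view; a quick sketch:

SQL> select owner_name, job_name, operation, state from dba_datapump_jobs;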

Use the following function to view the export logfile:

SQL> select * from table
    (rdsadmin.rds_file_util.read_text_file(
        p_directory => 'DATA_PUMP_DIR',
        p_filename  => 'datapump/bkp_AWSNSMP.log'));   2    3    4

TEXT
——————————————————————————————————————————————————————————————————–
;;;
Export: Release 12.1.0.2.0 – Production on Tue Apr 18 10:16:29 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.
;;;
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 – 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Starting “SYSTEM”.”SYS_EXPORT_SCHEMA_01″:  SYSTEM/********@AWSNSMP directory=DATA_PUMP_DIR dumpfile=bkp_AWSNSMP.dpdmp logfile=bkp_AWSNSMP.log schemas=STG_MTS_APP,STG_MTS_V,STG_
MTS,STG_ENTRP_ADM2,STG_ENTRP_ADM2_V,STG_ENTRP_ADM2_RO,STG_ENTRP_ADM2_APP,STG_ENTRP_ADM1_APP,STG_ENTRP_ADM1,STG_ENTRP_ADM1_V,STG_ENTRP_ADM1_RO,STG_ENTRP_V,STG_ENTRP_RO,STG_ENTRP,STG_ENTRP_APP compression=all encryption_password=********

Estimate in progress using BLOCKS method…
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 28.62 MB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT


. . exported “STG_ENTRP_ADM2″.”RECORD_SEQ_POSITION”           0 KB       0 rows
. . exported “STG_ENTRP_ADM2″.”REVINFO”                       0 KB       0 rows
. . exported “STG_ENTRP_ADM2″.”SCHEDULED_FORM”                0 KB       0 rows
. . exported “STG_ENTRP_ADM2″.”SIGNATURE”                     0 KB       0 rows
. . exported “STG_ENTRP_ADM2″.”SIGNATURE_AUD”                 0 KB       0 rows
. . exported “STG_MTS”.”MTS_DEPLOYMENT”                      0 KB       0 rows
Master table “SYSTEM”.”SYS_EXPORT_SCHEMA_01″ successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
  /rdsdbdata/datapump/bkp_AWSNSMP.dpdmp
Job “SYSTEM”.”SYS_EXPORT_SCHEMA_01″ successfully completed at Tue Apr 18 14:16:51 2017 elapsed 0 00:00:16
 


Scenario 2: Perform the export on a local server from the AWS database.

 Connect to any local database server.

1.    Create a directory to accommodate the dumpfile and logfile.

SQL> select * from dba_directories where directory_name like '%AWS%';

OWNER      DIRECTORY_NAME       DIRECTORY_PATH                         ORIGIN_CON_ID
———- ——————– ————————————– ————-
SYS        AWS_PUMP_DIR         /mounts/nsgpr01_exports/exp/AWS                     0
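If the directory does not already exist on the local server, it can be created first. A minimal sketch using the path shown above (adjust the path and grants to your environment):

SQL> CREATE OR REPLACE DIRECTORY AWS_PUMP_DIR AS '/mounts/nsgpr01_exports/exp/AWS';
SQL> GRANT READ, WRITE ON DIRECTORY AWS_PUMP_DIR TO system;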


2.    Create database link.


SQL> CREATE DATABASE LINK AWSNSMP_SYSTEM_LINK CONNECT TO SYSTEM IDENTIFIED BY <password> USING 'AWSNSMP';
Database link created.
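Before running the export, it is worth confirming that the link resolves; a quick check:

SQL> select sysdate from dual@AWSNSMP_SYSTEM_LINK;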


3.    Create parfile and include network link parameter and encryption option.

vi AWSNSMP_NL.par
directory=AWS_PUMP_DIR
dumpfile=bkp_AWSNSMP.dpdmp
logfile=bkp_AWSNSMP.log
schemas=STG_MTS_APP,
STG_MTS_V,
STG_MTS,
STG_ENTRP_ADM2,
STG_ENTRP_ADM2_V,
STG_ENTRP_ADM2_RO,
STG_ENTRP_ADM2_APP,
STG_ENTRP_ADM1_APP,
STG_ENTRP_ADM1,
STG_ENTRP_ADM1_V,
STG_ENTRP_ADM1_RO,
STG_ENTRP_V,
STG_ENTRP_RO,
STG_ENTRP,
STG_ENTRP_APP
compression=all
NETWORK_LINK=AWSNSMP_SYSTEM_LINK encryption_password=*******


4.    Initiate export.

$ expdp parfile=AWSNSMP_NL.par

Export: Release 12.1.0.2.0 – Production on Wed Apr 26 03:10:15 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Username: system
Password:


5.    Monitor export log

$ tail -f bkp_AWSNSMP.log


;;;
Export: Release 12.1.0.2.0 – Production on Tue Apr 18 10:12:00 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.
;;;
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 – 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Starting “SYS”.”SYS_EXPORT_SCHEMA_01″:  /******** AS SYSDBA parfile=AWSNSMP_NL.par
Estimate in progress using BLOCKS method…
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 17.62 MB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA

. . exported “STG_ENTRP_ADM2″.”REVINFO”                       0 KB       0 rows
. . exported “STG_ENTRP_ADM2″.”SCHEDULED_FORM”                0 KB       0 rows
. . exported “STG_ENTRP_ADM2″.”SIGNATURE”                     0 KB       0 rows
. . exported “STG_ENTRP_ADM2″.”SIGNATURE_AUD”                 0 KB       0 rows
. . exported “STG_MTS”.”MTS_DEPLOYMENT”                      0 KB       0 rows
Master table “SYS”.”SYS_EXPORT_SCHEMA_01″ successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
  /mounts/nsgpr01_exports/exp/AWS/bkp_AWSNSMP.dpdmp
Job “SYS”.”SYS_EXPORT_SCHEMA_01″ successfully completed at Tue Apr 18 10:14:21 2017 elapsed 0 00:02:18


Scenario 3: Perform data load into AWS database using import.

1.    Create a database link on the local server to communicate with the AWS database.

SQL> CREATE DATABASE LINK AWSNSTN_SYSTEM_LINK CONNECT TO SYSTEM IDENTIFIED BY ***** USING 'AWSNSTN';
Database link created.

 
2.    Transfer the dumpfile to the AWS database server:

Refer to the link below for transferring files to AWS:
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Oracle.Procedural.Importing.html

Use the following DBMS_FILE_TRANSFER.PUT_FILE call to transfer the file to the AWS database server.
 

set timing on
set time on
BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE (
    source_directory_object      => 'AWS_PUMP_DIR',
    source_file_name             => 'expdp_NSMPAC_schema.dmp',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => 'expdp_NSMPAC_schema.dmp',
    destination_database         => 'AWSNSTN_SYSTEM_LINK');
END;
/


source_directory_object -> local database directory where your dumpfile is located
source_file_name -> dumpfile name
destination_directory_object -> AWS database directory where the dumpfile should be placed
destination_database -> database link name

3.    Monitor the transfer of files from the database.

Issue the following query to view the filenames:

select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

$ sqlplus

SQL*Plus: Release 12.1.0.2.0 Production on Wed May 3 01:02:25 2017

Copyright (c) 1982, 2014, Oracle.  All rights reserved.

Enter user-name: SYSTEM@AWSNSTN
Enter password:


Last Successful login time: Tue May 02 2017 01:13:16 -04:00

Connected to:
Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 – 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

SQL> select name,open_mode from v$database;

NAME      OPEN_MODE
——— ——————–
AWSNSTN   READ WRITE

SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

FILENAME                                           TYPE         FILESIZE MTIME
————————————————– ———- ———- ——————
expdp_NSMPAC_schema.dmp                             file       2.0437E+10  18-APR-17
impdp_NSMPAC_schema.log                             file       164046      18-APR-17
datapump/                                          directory     4096      19-APR-17


4.    Create a parfile for the import with the PARALLEL parameter to speed up the import process.

vi impdp_AWSNSTN_schemas.ctl
directory=DATA_PUMP_DIR dumpfile=expdp_NSMPAC_schema.dmp logfile=impdp_NSMPAC_schema.log schemas=BRMGR01 transform=oid:n parallel=16

  
5.    Start the import with nohup in the background:

nohup impdp SYSTEM/*****@AWSNSTN parfile=impdp_AWSNSTN_schemas.ctl &
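While the import is running, rough progress can also be estimated from the standard v$session_longops view. A sketch; the OPNAME filter assumes the default Data Pump job naming seen above:

SQL> select opname, target_desc, sofar, totalwork,
            round(sofar/totalwork*100, 1) pct_done
       from v$session_longops
      where opname like 'SYS_IMPORT%'
        and totalwork > 0
        and sofar <> totalwork;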

6.    Monitor the import log from the database.

SQL> select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;

FILENAME                                           TYPE         FILESIZE MTIME
————————————————– ———- ———- ——————
expdp_NSMPAC_schema.dmp                             file       2.0437E+10   18-APR-17
impdp_NSMPAC_schema.log                             file       164046       18-APR-17
datapump/                                          directory     4096       19-APR-17



From the above results you have the file names.
Use rdsadmin.rds_file_util.read_text_file to read the import logfile:

SQL> select * from table
    (rdsadmin.rds_file_util.read_text_file(
        p_directory => 'DATA_PUMP_DIR',
        p_filename  => 'datapump/impdp_NSMPAC_schema.log'));  2    3    4

TEXT
———————————————————————————————————————————————————————————————————————————————————-
;;;
Import: Release 12.1.0.2.0 – Production on Tue Apr 18 13:07:42 2017

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.
;;;
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 – 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Master table “SYSTEM”.”SYS_IMPORT_SCHEMA_01″ successfully loaded/unloaded
Starting “SYSTEM”.”SYS_IMPORT_SCHEMA_01″:  SYSTEM/********@AWSNSTN parfile=impdp_AWSNSTN_schemas.ctl
Processing object type SCHEMA_EXPORT/USER

Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/AUDIT_OBJ
Processing object type SCHEMA_EXPORT/PACKAGE/COMPILE_PACKAGE/PACKAGE_SPEC/ALTER_PACKAGE_SPEC
Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/FUNCTIONAL_INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_INDEX/INDEX_STATISTICS
Job “SYSTEM”.”SYS_IMPORT_SCHEMA_01″ completed at Tue Apr 18 19:46:36 2017 elapsed 0 02:38:50

2462 rows selected.
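Once the import has completed and the log has been reviewed, the dump file can be removed from the RDS instance to reclaim space. A minimal sketch using the standard UTL_FILE package; the file name matches the example above:

SQL> EXEC UTL_FILE.FREMOVE('DATA_PUMP_DIR', 'expdp_NSMPAC_schema.dmp');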


Conclusion:
In this scenario-based article, we have learned how to perform exports from AWS databases and imports into AWS databases, moving data from on-premises to AWS and vice versa.

BY
Name: Omer
Designation: Senior Database Engineer
Organization: Netsoftmate IT Solutions.
Email: info@netsoftmate.com