The idea for this blog came from my recent experience with migrating an Oracle database from on-premises to Amazon RDS for Oracle. This involved moving a 30TB database using the Oracle native datapump utility. The goal was to conduct a proof-of-concept (POC) to see if Amazon RDS could be a viable solution and to ensure the application performance met expectations.
The POC results were encouraging, but migrating such a large database using Data Pump with S3 integration was problematic. The process involved moving the dump files to an S3 bucket, enabling S3_INTEGRATION on the RDS database instance, transferring the files from S3 to RDS storage, and then importing from there. The dump files needed around 16TB of storage, which became part of the RDS instance's allocated storage and couldn't be reclaimed after the dump files were deleted. That leads to significant cost, since RDS storage is not cheap. Refer to Amazon S3 integration.
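For reference, the S3 route uses the rdsadmin S3 task API to copy the dump files onto the instance's own storage before the import can even begin. A minimal sketch of that transfer step, assuming a placeholder bucket name my-dump-bucket and the default DATA_PUMP_DIR directory:
-- Copy dump files from S3 onto RDS storage; this is the step that
-- permanently grows the allocated storage. Bucket name is a placeholder.
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
         p_bucket_name    => 'my-dump-bucket',
         p_directory_name => 'DATA_PUMP_DIR') AS task_id
FROM dual;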
A better solution is to use EFS_INTEGRATION. With EFS the storage is external to the database, but you can still import directly from the EFS mount. Although EFS is costlier than S3, it avoids tying up database storage long after the migration is done.
Let's consider the cost implications -
The message is clear: RDS storage is too expensive to use solely for a migration and then leave sitting idle. I'm showing the cost for one month of storage, but it could take much longer before the storage can be released or repurposed. For example, 12 months of RDS storage for 16TB would cost $45,219.84 ($3,768.32 per month).
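As a quick sanity check on those numbers, assuming the implied rate of $0.23 per GB-month: 16TB = 16,384 GB; 16,384 GB × $0.23 = $3,768.32 per month; and $3,768.32 × 12 = $45,219.84 per year.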
While migration performance is also important, the high cost alone makes a strong case for using EFS. I'll cover the performance aspect in a future post.
Let's now walk through what EFS integration looks like and how to perform export and import using the Data Pump utilities.
Prerequisites for this tutorial -
- Amazon RDS for Oracle database instance
- Elastic File System (EFS) target mount
- EC2 instance with Oracle database client installed
Please reach out to me if you require help with the above setup. I'll include configurations for each but won't delve into details on how to set them up.
**Amazon RDS for Oracle**
You can choose to create an EC2 server for RDS connectivity. This setup will include security groups to enable connectivity from the EC2 server to the RDS database on port 1521. We'll use this server as the Oracle database client.
**Elastic File System**
I am using us-east-1a and us-east-1b for all resources to keep traffic within the same Availability Zones, avoiding inter-AZ data transfer costs.
**EC2 instance with Oracle database client installed**
Connect to the RDS instance from the EC2 server (DB client)
You'll have to install the Oracle client. The easiest way is to download the client home from the Oracle website (https://www.oracle.com/database/technologies/oracle19c-linux-downloads.html) and unzip it.
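A minimal sketch of that setup, assuming the 19.3 Linux x64 database home zip and an install path of /u01/app/oracle/product/19.3.0/dbhome_1 (both are assumptions; adjust to your download and layout):
# Unzip the downloaded home and point the environment at it.
mkdir -p /u01/app/oracle/product/19.3.0/dbhome_1
unzip LINUX.X64_193000_db_home.zip -d /u01/app/oracle/product/19.3.0/dbhome_1
export ORACLE_HOME=/u01/app/oracle/product/19.3.0/dbhome_1
export PATH=$ORACLE_HOME/bin:$PATH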
[oracle@ip-10-0-12-223 ~]$ cd $ORACLE_HOME/network/admin/
[oracle@ip-10-0-12-223 admin]$ cat tnsnames.ora
demodb = (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST= demodb.ceb9h7hgowbh.us-east-1.rds.amazonaws.com)(PORT=1521))(CONNECT_DATA=(SID= demodb)))
[oracle@ip-10-0-12-223 admin]$
[oracle@ip-10-0-12-223 admin]$ sqlplus admin@demodb
SQL*Plus: Release 19.0.0.0.0 - Production on Sat Jul 27 02:40:19 2024
Version 19.3.0.0.0
Copyright (c) 1982, 2019, Oracle. All rights reserved.
Enter password:
Last Successful login time: Fri Jul 26 2024 00:35:18 +00:00
Connected to:
Oracle Database 19c Standard Edition 2 Release 19.0.0.0.0 - Production
Version 19.23.0.0.0
SQL> select name,open_mode from v$database;
NAME OPEN_MODE
--------- --------------------
DEMODB READ WRITE
SQL>
EFS Integration and Data Pump export-import
1. First, create a new option group and add the EFS_INTEGRATION option.
Note that you'll need the EFS ID to add EFS_INTEGRATION. Also, USE_IAM_ROLE is set to TRUE, so an IAM role must be associated with the RDS instance and must have access to the EFS mount.
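If you prefer the CLI, a sketch of the equivalent commands, assuming a placeholder option group name efs-integ-og (the engine name and version match the SE2 19c instance used in this post):
# Create an option group for Oracle SE2 19c.
aws rds create-option-group \
  --option-group-name efs-integ-og \
  --engine-name oracle-se2 \
  --major-engine-version 19 \
  --option-group-description "EFS integration for RDS Oracle"

# Add the EFS_INTEGRATION option, pointing at the file system
# and requiring an IAM role for access.
aws rds add-option-to-option-group \
  --option-group-name efs-integ-og \
  --options '[{"OptionName":"EFS_INTEGRATION","OptionSettings":[{"Name":"EFS_ID","Value":"fs-0965c2a7d95fc9e06"},{"Name":"USE_IAM_ROLE","Value":"TRUE"}]}]'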
2. Modify the RDS instance to use the newly created option group.
Click Modify.
Click Continue, choose "Apply immediately", and modify the DB instance.
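The CLI equivalent, again assuming the placeholder option group name:
# Attach the new option group and apply the change right away.
aws rds modify-db-instance \
  --db-instance-identifier demodb \
  --option-group-name efs-integ-og \
  --apply-immediately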
3. Create an IAM role and add permissions for EFS.
Role name: efs-integ-role-for-rds
Review and create the role.
4. Add this role to the RDS DB instance for the feature EFS_INTEGRATION.
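A sketch of steps 3 and 4 from the CLI, assuming hypothetical policy files trust.json and efs-access.json and a placeholder account ID; the permissions policy should grant the standard EFS client actions (e.g. elasticfilesystem:ClientMount and elasticfilesystem:ClientWrite) on the file system's ARN:
# trust.json lets the RDS service assume the role:
# {"Version":"2012-10-17","Statement":[{"Effect":"Allow",
#   "Principal":{"Service":"rds.amazonaws.com"},"Action":"sts:AssumeRole"}]}
aws iam create-role \
  --role-name efs-integ-role-for-rds \
  --assume-role-policy-document file://trust.json

# Attach the EFS access permissions described above.
aws iam put-role-policy \
  --role-name efs-integ-role-for-rds \
  --policy-name efs-access \
  --policy-document file://efs-access.json

# Step 4: associate the role with the DB instance for EFS_INTEGRATION.
aws rds add-role-to-db-instance \
  --db-instance-identifier demodb \
  --feature-name EFS_INTEGRATION \
  --role-arn arn:aws:iam::123456789012:role/efs-integ-role-for-rds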
5. Mount EFS on EC2 server (DB client).
Get mount command from EFS
- Connect to the server as root
mkdir /efsdir
sudo mount -t nfs4 -o nfsvers=4.1,rsize=1048576,wsize=1048576,hard,timeo=600,retrans=2,noresvport 10.0.5.27:/ /efsdir
cd /efsdir
mkdir datapump
chown oracle /efsdir/datapump
chgrp oinstall /efsdir/datapump
chmod -R 777 /efsdir/datapump
An entry can be added to /etc/fstab to keep EFS mounted across reboots.
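For example, a line like the following (the same options as the mount command above, plus _netdev so mounting waits for the network; treat it as a sketch and verify in your environment):
10.0.5.27:/  /efsdir  nfs4  nfsvers=4.1,rsize=1048576,wsize=1048576,hard,timeo=600,retrans=2,noresvport,_netdev  0 0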
6. Next, create an Oracle directory on Amazon RDS for Oracle.
These commands are run as the admin user; other users may need to be granted the proper privileges. Note that the file system path must begin with /rdsefs-.
BEGIN
rdsadmin.rdsadmin_util.create_directory_efs(
p_directory_name => 'DATA_PUMP_DIR_EFS',
p_path_on_efs => '/rdsefs-fs-0965c2a7d95fc9e06/datapump');
END;
/
fs-0965c2a7d95fc9e06 is the EFS ID.
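You can confirm the directory object was created with a standard dictionary query:
-- Verify the new directory object and its EFS-backed path
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DATA_PUMP_DIR_EFS';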
7. Verify that the database can write a file.
SQL> declare
f utl_file.file_type;
begin
f := utl_file.fopen ('DATA_PUMP_DIR_EFS', 'test.txt', 'w');
utl_file.put_line(f, 'test');
utl_file.fclose(f);
end;
/
PL/SQL procedure successfully completed.
SQL> !ls -l /efsdir/datapump/test.txt
-rw-r--r--. 1 3001 101 5 Jul 28 23:06 /efsdir/datapump/test.txt
SQL>
8. Create a test table, admin.demotable, to perform Data Pump export and import steps.
[oracle@ip-10-0-12-223 ~]$ sqlplus admin@demodb
SQL*Plus: Release 19.0.0.0.0 - Production on Sun Jul 28 22:56:51 2024
Version 19.3.0.0.0
Copyright (c) 1982, 2019, Oracle. All rights reserved.
Enter password:
Last Successful login time: Sun Jul 28 2024 21:44:33 +00:00
Connected to:
Oracle Database 19c Standard Edition 2 Release 19.0.0.0.0 - Production
Version 19.23.0.0.0
SQL> create table demotable (a number);
Table created.
SQL> insert into demotable values (1);
1 row created.
SQL> c/1/2
1* insert into demotable values (2)
SQL> /
1 row created.
SQL> commit;
Commit complete.
SQL> select * from demotable;
A
----------
1
2
SQL>
9. Export the table to EFS mount using the directory created earlier.
Run this as oracle OS user.
[oracle@ip-10-0-12-223 ~]$ expdp admin@demodb tables=admin.demotable directory=DATA_PUMP_DIR_EFS dumpfile=demotable.dmp logfile=expdp_demotable.log
Export: Release 19.0.0.0.0 - Production on Sun Jul 28 23:08:29 2024
Version 19.3.0.0.0
Copyright (c) 1982, 2019, Oracle and/or its affiliates. All rights reserved.
Password:
Connected to: Oracle Database 19c Standard Edition 2 Release 19.0.0.0.0 - Production
Starting "ADMIN"."SYS_EXPORT_TABLE_01": admin/********@demodb tables=admin.demotable directory=DATA_PUMP_DIR_EFS dumpfile=demotable.dmp logfile=expdp_demotable.log
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "ADMIN"."DEMOTABLE" 5.062 KB 2 rows
Master table "ADMIN"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ADMIN.SYS_EXPORT_TABLE_01 is:
/rdsefs-fs-0965c2a7d95fc9e06/datapump/demotable.dmp
Job "ADMIN"."SYS_EXPORT_TABLE_01" successfully completed at Sun Jul 28 23:08:56 2024 elapsed 0 00:00:19
[oracle@ip-10-0-12-223 ~]$
That's great: we can use the normal expdp utility that DBAs are already quite comfortable with.
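For a large migration like the 30TB one that motivated this post, you'd typically split the dump set into multiple files; a sketch, assuming a placeholder schema name and Enterprise Edition for the PARALLEL parameter (the demo instance here is Standard Edition 2, where Data Pump parallelism isn't available):
# Split the dump set into 50G pieces; %U numbers the files.
# PARALLEL requires Enterprise Edition; drop it on SE2.
expdp admin@demodb schemas=app_schema directory=DATA_PUMP_DIR_EFS \
  dumpfile=app_%U.dmp filesize=50G parallel=8 logfile=expdp_app.log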
10. Let's take it a step further and perform the import.
- Drop table admin.demotable.
SQL> drop table admin.demotable;
Table dropped.
SQL> select * from admin.demotable;
select * from admin.demotable
*
ERROR at line 1:
ORA-00942: table or view does not exist
SQL>
- Import table admin.demotable
[oracle@ip-10-0-12-223 ~]$ impdp admin@demodb tables=admin.demotable directory=DATA_PUMP_DIR_EFS dumpfile=demotable.dmp logfile=impdp_demotable.log
Import: Release 19.0.0.0.0 - Production on Sun Jul 28 23:13:28 2024
Version 19.3.0.0.0
Copyright (c) 1982, 2019, Oracle and/or its affiliates. All rights reserved.
Password:
Connected to: Oracle Database 19c Standard Edition 2 Release 19.0.0.0.0 - Production
Master table "ADMIN"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "ADMIN"."SYS_IMPORT_TABLE_01": admin/********@demodb tables=admin.demotable directory=DATA_PUMP_DIR_EFS dumpfile=demotable.dmp logfile=impdp_demotable.log
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "ADMIN"."DEMOTABLE" 5.062 KB 2 rows
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/MARKER
Job "ADMIN"."SYS_IMPORT_TABLE_01" successfully completed at Sun Jul 28 23:14:08 2024 elapsed 0 00:00:32
[oracle@ip-10-0-12-223 ~]$
[oracle@ip-10-0-12-223 ~]$ sqlplus admin@demodb
SQL*Plus: Release 19.0.0.0.0 - Production on Sun Jul 28 23:33:16 2024
Version 19.3.0.0.0
Copyright (c) 1982, 2019, Oracle. All rights reserved.
Enter password:
Last Successful login time: Sun Jul 28 2024 23:32:56 +00:00
Connected to:
Oracle Database 19c Standard Edition 2 Release 19.0.0.0.0 - Production
Version 19.23.0.0.0
SQL> select * from admin.demotable;
A
----------
1
2
SQL>
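And this is the payoff of the EFS approach: the dump files live outside the database, so once the import is done you can simply delete them from the mount and stop paying for that storage, instead of carrying it as allocated RDS storage:
# Remove the dump file from the EFS mount; EFS billing drops as
# soon as data is deleted, unlike RDS allocated storage.
rm /efsdir/datapump/demotable.dmp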
Reference documents -
- Amazon S3 integration: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/oracle-s3-integration.html
- Amazon EFS integration: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/oracle-efs-integration.html