Migrating to Oracle Autonomous Database requires careful planning, appropriate tool selection, and adherence to best practices. Oracle provides multiple migration methods and tools designed for different scenarios, from simple data loads to complex zero-downtime migrations. Understanding these options ensures successful database migrations with minimal disruption to business operations.
Migration Methods Overview
Application-Based Data Loading
SQL Developer Integration:
Data can be loaded into Autonomous Database through applications such as SQL Developer, providing a graphical interface for small to medium datasets.
SQL Developer Features:
- Import wizard for CSV, Excel, and other formats
- Table data import from other databases
- Query-based data insertion
- Ideal for development and testing scenarios
Limitations:
- Not recommended for large production databases
- Manual process requiring user intervention
- Limited parallelism and performance optimization
- Better suited for initial testing than full migrations
Object Store Staging (Recommended Approach)
Cloud Object Storage Integration:
The more efficient and preferred method to load data into Autonomous Database is to stage data into cloud object stores before importing.
Supported Object Stores:
- OCI Object Storage: Native Oracle Cloud storage
- AWS S3: Amazon Web Services buckets
- Azure Blob Storage: Microsoft Azure storage
- Google Cloud Storage: Google Cloud Platform storage
Object Storage Benefits:
- Decouples data movement from database import
- Enables parallel processing for faster loads
- Provides durable intermediate storage
- Reduces network constraints on database
- Supports multi-cloud migration scenarios
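Once dump files or flat files are staged in object storage, they can be pulled into the database with the DBMS_CLOUD package. The sketch below loads staged CSV files into an existing table; the credential, bucket URI, and table name are illustrative and assume a credential created with DBMS_CLOUD.CREATE_CREDENTIAL (shown later in this article):
-- Load staged CSV files from object storage into an existing table
BEGIN
    DBMS_CLOUD.COPY_DATA(
        table_name      => 'SALES_STAGE',      -- target table must already exist
        credential_name => 'OBJ_STORE_CRED',   -- object storage credential
        file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace/b/migration-bucket/o/sales*.csv',
        format          => json_object('type' value 'csv', 'skipheaders' value '1')
    );
END;
/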
Data Pump Migration
When to Use Data Pump
Version Requirements:
Use Data Pump to move data into a new Autonomous Database when the source database is Oracle Database 10.1 (Release 10g) or higher.
Oracle Data Pump is useful for migrating data between schemas, between databases of different versions running on different operating systems, and from on-premises databases to other on-premises databases or to Oracle Cloud.
Data Pump is Ideal For:
- Offline migrations with acceptable downtime
- Database versions 10g and higher
- Schema-level or full database migrations
- Migrations requiring data transformation
- Cross-platform migrations with character set conversion
Data Pump Migration Process
Three-Step Process:
- Export: Use Data Pump Export (expdp) to create dump files
- Transfer: Copy dump files to object storage
- Import: Use Data Pump Import (impdp) to load into Autonomous Database
Alternative Network Mode:
You can combine these steps into a network-mode import over a database link and avoid creating dump files altogether.
Data Pump Best Practices
Critical Configuration Guidelines:
1. Don't Use SYS as SYSDBA:
Never use the SYS user with SYSDBA privileges for Data Pump operations. Always use the ADMIN user in Autonomous Database.
2. Always Use a Parameter File:
Create a parameter file (parfile) to manage Data Pump parameters systematically and enable repeatable migrations.
Example Parameter File (export.par):
SCHEMAS=HR,SALES,FINANCE
DUMPFILE=export%U.dmp
DIRECTORY=data_pump_dir
PARALLEL=4
EXCLUDE=CLUSTER,DB_LINK
COMPRESSION=ALL
LOGFILE=export.log
3. Dumpfile Parameter:
The dumpfile parameter applies to both export and import operations, specifying dump file names and locations.
4. Filesize Parameter:
The filesize parameter applies to export only, controlling maximum size of each dump file in the export set.
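For example, combining the %U substitution variable in DUMPFILE with FILESIZE caps each dump file at a fixed size and lets Data Pump create as many numbered files as it needs, which also helps parallel import later (the sizes shown are illustrative):
DUMPFILE=export%U.dmp
FILESIZE=5G
PARALLEL=4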
5. Always Use Schema Mode:
Oracle recommends using schema mode when migrating to Autonomous Database. List the schemas you want to export with the SCHEMAS parameter.
Why Schema Mode:
- Granular control over what gets migrated
- Easier to manage than full database exports
- Reduces export/import time
- Simplifies troubleshooting
- Aligns with ADB's multi-schema architecture
6. Always Exclude Statistics:
Exclude database statistics during export as Autonomous Database automatically gathers optimized statistics post-import.
EXCLUDE=STATISTICS
7. Always Use Parallel:
Set the PARALLEL parameter to at least the number of CPUs provisioned for your Autonomous Database to speed up the migration (a quick check follows the guidelines below).
Parallelism Guidelines:
- Minimum: Number of OCPUs in Autonomous Database
- For HIGH service: Match OCPU count
- Multiple dump files enable parallel import
- Balance parallelism with available resources
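A quick way to size PARALLEL is to check the CPU count reported by the target Autonomous Database; a minimal sketch, run as the ADMIN user:
-- Number of CPUs visible to the Autonomous Database
SELECT name, value
FROM   v$parameter
WHERE  name = 'cpu_count';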
8. Always Remove Column Encryption:
Remove column-level encryption before migration as Autonomous Database handles encryption automatically through Transparent Data Encryption (TDE).
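On the source database you can identify column-encrypted data from DBA_ENCRYPTED_COLUMNS and decrypt it before export; the table and column names below are illustrative:
-- List columns protected by TDE column encryption
SELECT owner, table_name, column_name, encryption_alg
FROM   dba_encrypted_columns;
-- Remove column-level encryption before export (example object names)
ALTER TABLE hr.employees MODIFY (salary DECRYPT);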
9. Consider Using Compression:
Use Data Pump compression to reduce dump file sizes, transfer times, and storage costs.
Compression Options:
COMPRESSION=ALL          # Compress all data
COMPRESSION=METADATA_ONLY # Compress metadata only
COMPRESSION=DATA_ONLY    # Compress data only
10. Remove Segment Customization:
Use transform parameter to remove segment attributes that may not apply in Autonomous Database.
TRANSFORM=SEGMENT_ATTRIBUTES:N
This removes:
- TABLESPACE specifications
- STORAGE clauses
- LOGGING attributes
- Physical storage attributes
Additional Data Pump Recommendations
Exclude Unsupported Objects:
EXCLUDE=CLUSTER,DB_LINK,INDEXTYPE,MATERIALIZED_VIEW_LOG
Remap Tablespaces:
REMAP_TABLESPACE=USERS:DATA
Partition Merge (Optional):
PARTITION_OPTIONS=MERGE
Transform LOB Storage:
TRANSFORM=LOB_STORAGE:SECUREFILE
Use Latest Data Pump Version:
Oracle recommends using the latest Oracle Data Pump version when importing data from Data Pump files into your Autonomous Database, as it contains the latest enhancements and fixes.
Sample Data Pump Commands
Export Command:
expdp admin/password@sourcedb \
  SCHEMAS=HR,SALES \
  DUMPFILE=export%U.dmp \
  DIRECTORY=data_pump_dir \
  PARALLEL=4 \
  EXCLUDE=CLUSTER,DB_LINK,STATISTICS \
  COMPRESSION=ALL \
  TRANSFORM=SEGMENT_ATTRIBUTES:N \
  LOGFILE=export.log
Import from Object Storage:
impdp admin/password@adb_high \
  CREDENTIAL=obj_store_cred \
  DUMPFILE=https://objectstorage.region.oraclecloud.com/n/namespace/b/bucket/o/export%U.dmp \
  PARALLEL=4 \
  TRANSFORM=SEGMENT_ATTRIBUTES:N \
  REMAP_TABLESPACE=USERS:DATA \
  EXCLUDE=INDEX,CLUSTER \
  LOGFILE=import.log
Online Migration with Oracle GoldenGate
Zero-Downtime Migrations
GoldenGate for Continuous Sync:
For online migrations, Oracle GoldenGate can be used to keep old and new databases in sync, enabling near-zero downtime transitions.
GoldenGate Benefits:
- Continuous data replication
- Minimal downtime (minutes vs. hours)
- Bi-directional replication capabilities
- Heterogeneous source support
- Transaction-level consistency
GoldenGate Requirements:
- Oracle GoldenGate 12.3.0.1.2 or later for source
- Source database version 11.2.0.4 or later
- Network connectivity between source and target
- Non-integrated Replicats for Autonomous Database
GoldenGate Migration Process:
- Configure initial data load (Data Pump or direct load)
- Set up GoldenGate Extract on source database
- Configure GoldenGate Replicat on target Autonomous Database
- Start replication and verify synchronization
- Perform validation and cutover
- Redirect applications to Autonomous Database
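As an illustration of the replication step, a minimal pair of GoldenGate parameter files might look like the sketch below. Process names, credential aliases, and trail paths are illustrative, and on Autonomous Database the Replicat must be a non-integrated variant:
-- Extract parameter file on the source (exthr.prm)
EXTRACT exthr
USERIDALIAS ogg_src
EXTTRAIL ./dirdat/hr
TABLE hr.*;
-- Replicat parameter file on the target Autonomous Database (rephr.prm)
REPLICAT rephr
USERIDALIAS ogg_adb
MAP hr.*, TARGET hr.*;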
Migration Tools and Services
OCI Database Migration Service
Fully Managed Migration:
Database Migration Service is a fully managed cloud service that uses Oracle's Zero Downtime Migration (ZDM) engine and Oracle GoldenGate replication to provide enterprise-level database migration with minimal downtime.
Database Migration Service provides:
- Logical Migration: Schema and data migration using Data Pump
- Offline Migration: Export to object store, then import to ADB
- Online Migration: GoldenGate-based continuous replication
- Minimal Downtime: Business continuity during migration
- Validation: Pre-migration environment checks
Migration Modes:
Online Migration:
Online migration uses Oracle GoldenGate replication to continuously synchronize the source database with the target database.
Offline Migration:
With the offline method, Database Migration exports the data from the source database to Object Storage and then imports it into the target database using Data Pump.
Zero Downtime Migration (ZDM)
Enterprise Migration Engine:
ZDM provides automated migration workflows combining Data Pump and GoldenGate for near-zero downtime migrations.
ZDM Capabilities:
- Automated pre-migration validation
- Intelligent migration orchestration
- Rollback capabilities
- Progress monitoring and reporting
- Integration with OCI services
Oracle GoldenGate
Real-Time Replication:
Oracle GoldenGate enables real-time data integration and replication across heterogeneous systems.
Use Cases:
- Zero-downtime migrations
- Continuous data replication
- Disaster recovery configurations
- Data distribution across geographic regions
Cloud Premigration Advisor Tool (CPAT)
Pre-Migration Assessment:
Cloud Premigration Advisor Tool (CPAT) analyzes source databases for compatibility with Autonomous Database before migration.
CPAT Capabilities:
- Identifies incompatible objects and features
- Reports on unsupported database options
- Recommends remediation steps
- Generates comprehensive assessment reports
- Validates migration readiness
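CPAT is downloaded from My Oracle Support and run from the command line through its premigration script. Option names can vary between CPAT releases, so treat the invocation below as a sketch and confirm the flags with the tool's help output; the connect string, user, and output directory are illustrative:
./premigration.sh \
  --connectstring jdbc:oracle:thin:@//source-host:1521/srcpdb \
  --username SYSTEM \
  --targetcloud ATPS \
  --outdir /tmp/cpat_reports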
OCI Application Migration
Application and Database Migration:
Comprehensive service for migrating applications along with their databases to Oracle Cloud.
Features:
- Integrated application and database migration
- Dependency mapping and analysis
- Automated infrastructure provisioning
- Application compatibility validation
Move to Autonomous Database (MV2ADB)
Automated Migration Tool:
MV2ADB uses Oracle Data Pump, REST APIs, and the Oracle Cloud Infrastructure command-line interface (oci-cli) to export data from your on-premises database and load it into your Oracle Autonomous Database in the cloud.
MV2ADB Commands:
- auto: Automates complete migration process (export, upload, import)
- advisor: Analyzes source schemas for migration suitability
- report: Compares export and import results
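The exact MV2ADB syntax depends on the release documented in its My Oracle Support note, but a typical end-to-end run is driven by a single configuration file; the path and command below are a sketch, not a verified invocation:
# Run the full export, upload, and import flow from a configuration file (path is illustrative)
./mv2adb auto -conf /opt/mv2adb/conf/myadb.conf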
Migration Planning Considerations
Key Decision Factors
1. Dataset Size:
How large is the dataset to be imported?
- Small (<100GB): Direct application loads or SQL Developer
- Medium (100GB-1TB): Data Pump with parallelism
- Large (>1TB): Data Pump with object store staging, consider Data Lake Accelerator
- Very Large (>10TB): Full transportable Data Pump, multiple parallel operations
2. Import File Format:
What is the import file format?
- Structured Data: Data Pump dump files, CSV, delimited text
- Semi-Structured: JSON, XML
- Binary Formats: Parquet, ORC, Avro
- Database Backups: RMAN backups (requires conversion)
3. Non-Oracle Source Support:
Does the method support non-Oracle database sources?
- Oracle to Oracle: Data Pump, GoldenGate
- Non-Oracle to Oracle: OCI Database Migration (supports MySQL, PostgreSQL)
- Third-Party Tools: SQL Developer Migration Workbench, Oracle SQL*Loader
- Custom ETL: DBMS_CLOUD for file-based loads
4. Object Storage Support:
Does the method support using Oracle and third-party object storage?
- OCI Object Storage: Full native support
- AWS S3: Supported via DBMS_CLOUD
- Azure Blob Storage: Supported via DBMS_CLOUD
- Google Cloud Storage: Supported via DBMS_CLOUD
- On-Premises Storage: NFS mounts, database directories
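For third-party object stores, the pattern is the same as for OCI: create a DBMS_CLOUD credential holding the provider's keys, then reference the provider's native URI. The sketch below assumes an AWS access key pair and an illustrative S3 bucket and table:
-- Credential holding the AWS access key ID and secret access key
BEGIN
    DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'S3_CRED',
        username        => '<AWS access key ID>',
        password        => '<AWS secret access key>'
    );
END;
/
-- Load CSV files staged in an S3 bucket into an existing table
BEGIN
    DBMS_CLOUD.COPY_DATA(
        table_name      => 'ORDERS_STAGE',
        credential_name => 'S3_CRED',
        file_uri_list   => 'https://s3-us-west-2.amazonaws.com/my-bucket/orders*.csv',
        format          => json_object('type' value 'csv')
    );
END;
/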
Downtime Tolerance
Planned Downtime (Offline Migration):
- Data Pump export/import
- Full transportable export
- DBMS_CLOUD data loading
- SQL*Loader
Minimal Downtime (Online Migration):
- OCI Database Migration Service (online mode)
- Oracle GoldenGate replication
- Zero Downtime Migration (ZDM)
- Materialized view replication with refresh
Network Considerations
Direct Database Link:
- Fast for small databases
- Network bandwidth dependent
- Single point of failure
- No intermediate storage required
Object Store Staging:
- Decouples network from database operations
- Enables parallel transfers
- Provides checkpoint/restart capability
- Recommended for production migrations
Migration Workflow
Phase 1: Assessment and Planning
1. Run CPAT Analysis:
Run the Cloud Premigration Advisor Tool against the source database and review its findings before finalizing the migration approach (a sample invocation appears in the CPAT section above).
2. Inventory Source Database:
- Identify schemas and objects to migrate
- Document database size and growth rate
- Catalog dependencies and integrations
- Review custom code and packages
3. Determine Migration Strategy:
- Online vs. offline migration
- Direct load vs. object store staging
- Single operation vs. phased approach
- Pilot migration for validation
Phase 2: Preparation
1. Provision Autonomous Database:
- Select appropriate OCPU and storage
- Configure network access (public, private endpoint, ACLs)
- Enable required features and options
2. Create Cloud Resources:
-- Create object storage credential
BEGIN
    DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'OBJ_STORE_CRED',
        username => 'oracleidentitycloudservice/user@example.com',
        password => '<auth_token>'
    );
END;
/
3. Prepare Source Database:
- Collect current statistics
- Disable foreign key constraints (if needed)
- Set tablespaces to read-only (for transportable)
- Create Data Pump directory
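Creating the Data Pump directory on the source is standard SQL; the file system path and grantee below are illustrative:
-- Directory object pointing at a file system location on the source host
CREATE OR REPLACE DIRECTORY data_pump_dir AS '/u01/app/oracle/dpdump';
GRANT READ, WRITE ON DIRECTORY data_pump_dir TO hr;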
Phase 3: Migration Execution
1. Export Source Data:
expdp admin/password \
  PARFILE=export.par
2. Transfer to Object Storage:
# Using OCI CLI
oci os object put \
  --bucket-name migration-bucket \
  --file export01.dmp
3. Import to Autonomous Database:
impdp admin/password@adb_high \
  PARFILE=import.par
Phase 4: Validation
1. Object Count Verification:
-- Compare object counts
SELECT object_type, COUNT(*)
FROM dba_objects
WHERE owner = 'HR'
GROUP BY object_type
ORDER BY object_type;
2. Data Validation:
-- Compare row counts
SELECT table_name, num_rows
FROM dba_tables
WHERE owner = 'HR'
ORDER BY table_name;
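Note that NUM_ROWS in DBA_TABLES comes from optimizer statistics and may be stale; for exact verification, compare COUNT(*) results, optionally against the source over a database link. The table and link names below are illustrative:
-- Exact row-count comparison for one table (SOURCE_LINK is a hypothetical link back to the source)
SELECT (SELECT COUNT(*) FROM hr.employees)             AS target_rows,
       (SELECT COUNT(*) FROM hr.employees@source_link) AS source_rows
FROM   dual;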
3. Application Testing:
- Validate connectivity from applications
- Execute representative test cases
- Verify performance characteristics
- Test backup and recovery procedures
Phase 5: Cutover
1. Final Synchronization:
For online migrations, ensure final data sync is complete
2. Application Redirection:
Update connection strings to point to Autonomous Database
3. Monitoring:
Monitor application behavior and database performance post-cutover
4. Decommission:
Archive or decommission source database after validation period
Best Practices Summary
Planning Best Practices
- Run CPAT before migration to identify compatibility issues
- Test migration process in non-production environment first
- Document migration procedures and rollback plans
- Establish success criteria and validation procedures
- Plan for adequate network bandwidth and time windows
Execution Best Practices
- Use latest Data Pump version for best compatibility
- Always use schema mode for Autonomous Database
- Enable parallelism matching Autonomous Database OCPUs
- Stage data in object storage for production migrations
- Use HIGH database service for import operations
- Monitor progress and resource consumption
Performance Best Practices
- Compress dump files to reduce transfer time
- Use partition_options=merge for partitioned tables
- Exclude indexes during import and rebuild them afterward for speed (see the sketch after this list)
- Disable constraints during import, re-enable after
- Use Data Lake Accelerator for very large datasets
- Consider full transportable for largest databases
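One way to defer index creation, assuming the connection and parameter file from the earlier examples, is to import everything except indexes and then use SQLFILE mode to extract the index DDL into a script (written to the database directory) for review and parallel execution; treat this as a sketch rather than a prescribed workflow:
# Import data and metadata without indexes
impdp admin/password@adb_high PARFILE=import.par EXCLUDE=INDEX
# Write the index DDL to a script instead of executing it
impdp admin/password@adb_high PARFILE=import.par INCLUDE=INDEX SQLFILE=create_indexes.sql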
Security Best Practices
- Use dedicated credentials for migration operations
- Rotate credentials after migration completion
- Secure dump files with encryption when required
- Implement principle of least privilege for migration accounts
- Audit migration activities comprehensively
- Clean up intermediate storage after migration
Conclusion
Migrating to Oracle Autonomous Database requires understanding available methods, tools, and best practices. Data Pump provides reliable offline migration for databases 10g and higher, while Oracle GoldenGate enables near-zero downtime online migrations for mission-critical systems.
Key Migration Approaches:
Offline Migrations:
- Data Pump export/import via object storage (recommended)
- Database links for network mode import
- DBMS_CLOUD for file-based data loading
- SQL*Loader for formatted data files
Online Migrations:
- OCI Database Migration Service (fully managed)
- Zero Downtime Migration (ZDM) engine
- Oracle GoldenGate replication
- Materialized views with refresh
Critical Success Factors:
- Thorough pre-migration assessment with CPAT
- Appropriate tool selection based on requirements
- Adherence to Data Pump best practices
- Comprehensive testing and validation
- Detailed cutover planning and execution
Whether performing simple schema migrations or complex enterprise database transitions, Oracle provides the tools, services, and methodologies necessary for successful Autonomous Database migrations with minimal business disruption and maximum confidence.