
Database migration with the Schema Conversion Tool

Helen Anderson ・ Posted on Oct 31 '18 ・ 2 min read

Project Aurora (4 Part Series)

1) Database migration with the Schema Conversion Tool
2) Moving data around with the Database Migration Service
3) How to rock your first tech talk
4) The importance of database modernisation

No one likes having to refactor procs, carry over legacy objects 'just in case' and translate from one syntax to another. The AWS Schema Conversion Tool promises to take a lot of the pain out of migration. Here’s how it works at a high level:
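
To make 'translate from one syntax to another' concrete, here's a minimal sketch of the kind of rewrite a SQL Server to PostgreSQL move involves (the table and columns are hypothetical):

```sql
-- SQL Server (T-SQL): top-n query using TOP and GETDATE()
SELECT TOP 10 order_id, order_date, GETDATE() AS queried_at
FROM dbo.orders
ORDER BY order_date DESC;

-- PostgreSQL equivalent: LIMIT replaces TOP, NOW() replaces GETDATE()
SELECT order_id, order_date, NOW() AS queried_at
FROM orders
ORDER BY order_date DESC
LIMIT 10;
```

Multiply that by every proc, view, and function in the schema and the appeal of automating it becomes obvious.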

Step 1 - Open the AWS Schema Conversion Tool and create a new project

[Image from AWS Documentation]

Prerequisites: your new Amazon RDS or Aurora instance is deployed (a few clicks in the AWS console) and the necessary JDBC drivers for your source and target engines are installed on the machine running the tool.

Once you've got everything up and running, pick your source and target database types. The tool currently supports a number of database engines as sources, and Aurora and Redshift as destinations.

Limitations
You cannot use Redshift as a source if you are moving data into an Aurora database.

Step 2 - Connect to both the origin and destination servers

The origin will appear on the left and the destination on the right, with action items in the middle.

[Image from AWS Documentation]

Step 3 - Select the database you would like to analyse

Go to: Actions > Create Report

The tool will then examine all of the objects in that schema: the schema itself, tables, views, procedures, functions, and packages.
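
If you want a feel for how big the job is before running the report, a quick inventory of the source schema covers similar ground. A minimal sketch for a SQL Server source ('dbo' is a placeholder for your schema name):

```sql
-- Count the objects the assessment report will cover, grouped by type.
SELECT type_desc, COUNT(*) AS object_count
FROM sys.objects
WHERE schema_id = SCHEMA_ID('dbo')
  AND type IN ('U', 'V', 'P', 'FN', 'IF', 'TF')  -- tables, views, procs, functions
GROUP BY type_desc
ORDER BY object_count DESC;
```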

The tool will convert as much as possible automatically and provide detailed information about anything it couldn't convert. The report can be saved as a .csv or .pdf file for review.

[Image from AWS Documentation]

  • Green items can be resolved right away
  • Yellow items need minor tweaks
  • Red items need significant work to update

Limitations

  • We ran into trouble converting tables with covering indexes. SQL Server supports the INCLUDE clause, but as PostgreSQL did not at the time, we had to make manual changes to ensure those columns were captured in the new index (see the sketch after this list).

  • We found that the database engine version we were running on the Aurora side was out of sync with the SCT, and in some cases the SCT generated syntax the engine did not yet support, causing errors.
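
Here's a minimal sketch of the INCLUDE workaround from the first bullet (table, column, and index names are hypothetical):

```sql
-- SQL Server: a covering index with non-key columns in the INCLUDE clause
CREATE NONCLUSTERED INDEX ix_orders_customer
ON dbo.orders (customer_id)
INCLUDE (order_date, total_amount);

-- PostgreSQL rewrite: fold the included columns into the index key
-- so queries on customer_id can still be answered from the index alone
CREATE INDEX ix_orders_customer
ON orders (customer_id, order_date, total_amount);
```

The rewrite isn't a perfect equivalent (the extra key columns make the index larger and add them to the sort order), and PostgreSQL 11 has since added its own INCLUDE clause for covering indexes.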

Step 4 - Drill down into each object to check where changes need to be made

[Image from AWS Documentation]

When you're ready to commit, right-click the schema and select 'Apply to database'.

You can now go to your new Amazon RDS instance and refresh the database to pull over your work.
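
Once the refresh comes through, a quick count against the target is a handy sanity check. A minimal sketch assuming a PostgreSQL-compatible target ('myschema' is a placeholder for the schema the SCT created):

```sql
-- Tables and views that landed in the converted schema
SELECT table_type, COUNT(*) AS object_count
FROM information_schema.tables
WHERE table_schema = 'myschema'
GROUP BY table_type;

-- Converted functions and procedures
SELECT routine_type, COUNT(*) AS routine_count
FROM information_schema.routines
WHERE routine_schema = 'myschema'
GROUP BY routine_type;
```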

Migration of the data can also happen at this point if the necessary drivers are installed, or you can use the AWS Database Migration Service instead.

[Image from AWS Documentation]

And that's it. Incredibly easy to use and a massive time saver when it comes to migration, so you can spend more time upskilling users and less time tinkering with scripts and schemas.

Read more about the tool in the AWS documentation.


This post first appeared on helenanderson.co.nz

