Burke Holland for Microsoft Azure


Migrating data from Cosmos DB to local JSON files

We've selected our favorite tips and tricks created by Michael Crump and are delivering fresh technical content on Azure all April! Miss a day (or more)? Catch up with the series.

Don't have Azure? Grab a free subscription.

Using the Data Migration Tool with Cosmos DB

One task that seems to come up over and over is migrating data from one database/format into another. I recently used Cosmos DB as my database to store every tweet that came out of Ignite. Once I had the data and wouldn't be using Cosmos DB any more for that exercise, I needed to dump the data out to a local file to preserve the data and save money. Here is how I did it.

The Tools

Download and install the Azure DocumentDB Data Migration Tool

Get to Work

Ensure you have a Cosmos DB database and collection created that you wish to migrate out.

Go to Keys (inside your Cosmos DB blade in the portal) and copy the Primary Connection String.

You'll need to append the Database name to the end of the string. For example, appending Database=cosmosdb-ignite to the key copied earlier gives AccountEndpoint=https://mbcrump.documents.azure.com:443/;AccountKey=VxDEcJblah==;Database=cosmosdb-ignite. Save this for later.
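If you'd rather script that step (or just sanity-check the format), here's a minimal sketch in Python. The endpoint, key, and database name are the placeholder values from above, not real credentials.

```python
# Minimal sketch: append the Database name to the copied Primary Connection String.
# The endpoint, key, and database name below are placeholders -- substitute your own.
primary_connection_string = (
    "AccountEndpoint=https://mbcrump.documents.azure.com:443/;"
    "AccountKey=VxDEcJblah==;"
)
database_name = "cosmosdb-ignite"

# The migration tool expects the database appended as a Database=<name> segment.
connection_string = f"{primary_connection_string}Database={database_name}"
print(connection_string)
```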

Open the Data Migration Tool and under Source Information, select DocumentDB as shown below.

You'll need to add the ConnectionString (that we just built) along with the Collection name, which in my case is items. We'll take the defaults on the rest, press Verify, and if that succeeds, press Next.

In my case, I'll export to a local JSON file, select Prettify JSON, and press Next.

On the next screen, you'll see a View Command option that shows the command the tool will use to migrate your data. This is helpful if you just want to learn the syntax.
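For reference, the generated command for a DocumentDB-to-JSON-file transfer looks roughly like the sketch below. Treat it as an illustration: the exact flags can vary by tool version, and the connection string, collection, and output file name are just the example values from this post.

```
dt.exe /s:DocumentDB `
  /s.ConnectionString:"AccountEndpoint=https://mbcrump.documents.azure.com:443/;AccountKey=VxDEcJblah==;Database=cosmosdb-ignite" `
  /s.Collection:items `
  /t:JsonFile `
  /t.File:ignite-tweets.json `
  /t.Prettify
```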

Finally, you'll see that the transfer has completed, with over 100K items moved in a little under 2 minutes.

We now have our local JSON file and can use it however we want! Awesome!
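As a quick sanity check, you can load the exported file and poke at it. A minimal Python sketch is below; the file name is an assumption (use whatever path you chose on the target screen), and it assumes the tool wrote the documents as a single JSON array.

```python
import json

# Load the exported file; "ignite-tweets.json" is a placeholder name --
# use the path you specified in the migration tool's target settings.
with open("ignite-tweets.json", encoding="utf-8") as f:
    documents = json.load(f)  # assumes the export is one JSON array of documents

print(f"Exported {len(documents)} documents")
print(documents[0])  # peek at the first document
```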

Want more Cosmos DB? Check out our quickstarts and tutorials!

Also check out Monday's article from Jay Gordon: Uploading your JSON data to Cosmos DB using the MongoDB API.


We'll be posting articles every day in April, so stay tuned or jump ahead and check out more tips and tricks now.


Top comments (1)

Thai Anh Duc

I used the tool to import from a large JSON file into CosmosDB locally for my development environment. It worked perfectly.

