DEV Community

Laysa Uchoa


Copy OpenSearch index data to S3

It is good practice to back up your OpenSearch (OS) data to another storage service. That way, you can still access your data, and restore it, if something unexpected happens to your cluster.

In this article, you will learn how to dump your OpenSearch data to an AWS S3 bucket.

To copy the index data, we will use elasticsearch-dump. You can read the installation instructions on GitHub. From this library, we will use the elasticdump command to copy the input index data to a specific output.

Make sure the elasticsearch-dump tool is installed before continuing with the next steps.
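If you have Node.js available, the tool can be installed globally with npm, as described in the elasticsearch-dump README:

```shell
# Install elasticsearch-dump globally (requires Node.js and npm);
# this puts the elasticdump command on your PATH.
npm install -g elasticsearch-dump
```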

Copying data from OpenSearch to AWS S3

For this transfer, we will use:

  • an OpenSearch cluster as the input
  • an AWS S3 bucket as the output

You will need the following information about your OS cluster:

  • SERVICE_URI: your OpenSearch service URI.
  • INPUT_INDEX_NAME: the name of the index you want to copy from your input source.

And about your S3 bucket:

  • AWS credentials (ACCESS_KEY_ID and SECRET_ACCESS_KEY).
  • An S3 file path, e.g. s3://${BUCKET_NAME}/${FILE_NAME}.json.

Find more information about AWS credentials in the AWS docs.
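To keep the command below readable, these pieces can be collected into shell variables first. The values here are placeholders for illustration only; substitute your own cluster and bucket details:

```shell
# Placeholder values -- replace with your own cluster and bucket details.
SERVICE_URI="https://user:password@my-opensearch-host:9200"
INPUT_INDEX_NAME="logs-2023"
BUCKET_NAME="my-backup-bucket"
FILE_NAME="logs-2023-backup"

# elasticdump's input and output arguments are built from these variables:
echo "${SERVICE_URI}/${INPUT_INDEX_NAME}"
echo "s3://${BUCKET_NAME}/${FILE_NAME}.json"
```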

Export OpenSearch index data to S3

Use the elasticdump command to copy the data from your OpenSearch cluster to your AWS S3 bucket. Use your OpenSearch SERVICE_URI for the input. For the output, choose an AWS S3 file path, including the file name you want for your document.

elasticdump \
  --s3AccessKeyId "${ACCESS_KEY_ID}" \
  --s3SecretAccessKey "${SECRET_ACCESS_KEY}" \
  --input="${SERVICE_URI}/${INPUT_INDEX_NAME}" \
  --output="s3://${BUCKET_NAME}/${FILE_NAME}.json"
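This exports the index documents. If you also want to recreate the index structure on restore, you can run a second dump with --type=mapping (elasticdump's --type defaults to data). The _mapping file-name suffix here is just a convention, not a requirement:

```shell
# Also back up the index mapping, so the index structure
# can be recreated when restoring the data later.
elasticdump \
  --s3AccessKeyId "${ACCESS_KEY_ID}" \
  --s3SecretAccessKey "${SECRET_ACCESS_KEY}" \
  --input="${SERVICE_URI}/${INPUT_INDEX_NAME}" \
  --output="s3://${BUCKET_NAME}/${FILE_NAME}_mapping.json" \
  --type=mapping
```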

That is how you can copy your OpenSearch data to an S3 bucket. 🙋🏻‍♀️
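Restoring works in the opposite direction: elasticdump also accepts an S3 file as the input. A sketch, assuming the same variables as above:

```shell
# Restore: read the JSON dump from S3 and write it back into an index.
elasticdump \
  --s3AccessKeyId "${ACCESS_KEY_ID}" \
  --s3SecretAccessKey "${SECRET_ACCESS_KEY}" \
  --input="s3://${BUCKET_NAME}/${FILE_NAME}.json" \
  --output="${SERVICE_URI}/${INPUT_INDEX_NAME}"
```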

