<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Laurent Lemaire</title>
    <description>The latest articles on DEV Community by Laurent Lemaire (@lem01).</description>
    <link>https://dev.to/lem01</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F101418%2F4692c607-d312-48eb-89a1-8ab81060f480.jpeg</url>
      <title>DEV Community: Laurent Lemaire</title>
      <link>https://dev.to/lem01</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/lem01"/>
    <language>en</language>
    <item>
      <title>The Complete Redis Backup Guide (with examples)</title>
      <dc:creator>Laurent Lemaire</dc:creator>
      <pubDate>Mon, 15 Aug 2022 08:45:05 +0000</pubDate>
      <link>https://dev.to/simplebackups/the-complete-redis-backup-guide-with-examples-108o</link>
      <guid>https://dev.to/simplebackups/the-complete-redis-backup-guide-with-examples-108o</guid>
      <description>&lt;p&gt;In this article, we’ll see how you a real case example of a Redis backup process, end-to-end.&lt;/p&gt;

&lt;p&gt;We'll cover &lt;strong&gt;how to configure a Redis backup&lt;/strong&gt;, which binaries you should use and the most important settings you need to be aware of, as well as how to store your backup remotely and how to restore it.&lt;/p&gt;

&lt;p&gt;Let's get started!&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Redis?
&lt;/h2&gt;

&lt;p&gt;Redis is an open source in-memory key-value store written in C.&lt;br&gt;
Redis stands for Remote Dictionary Server and is used as a database, cache, queue system, and message broker.&lt;/p&gt;

&lt;p&gt;Redis is fast because its data is stored in memory, meaning that, unlike traditional databases, Redis doesn't need to access the disk.&lt;br&gt;
While writing this article I learned that Redis is often called a "Data Structure Server" because it provides data types similar to those found in modern programming languages.&lt;/p&gt;

&lt;p&gt;Some data structures that Redis provides are Strings, Lists, Sets, Hashes, and Sorted Sets.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://redis.io/topics/data-types"&gt;Redis Data Types →&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  What is Redis used for?
&lt;/h3&gt;

&lt;p&gt;Even though Redis could be used as your primary database, that's usually not how it's used.&lt;br&gt;
Here are the most common use cases for Redis:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cache:&lt;/strong&gt; Redis is used as a cache, keeping frequently accessed data in memory for fast reads.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Session storage:&lt;/strong&gt; Redis is used to store session data. Writing and reading data out of Redis is super fast which makes it an ideal candidate for session storage.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Message queue&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;There are a lot of different use cases for Redis, but these are the most common ones.&lt;/p&gt;
&lt;h3&gt;
  
  
  Where is Redis database stored?
&lt;/h3&gt;

&lt;p&gt;As stated above, Redis stores its data in memory. But depending on your use case, Redis can also copy the data to your disk.&lt;br&gt;
This obviously comes in handy when you have a large amount of data you need to be able to restore, and it is also why you might need to back it up.&lt;/p&gt;

&lt;p&gt;Redis regularly dumps its data to an RDB file on the disk based on how snapshots are configured.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Redis configuration&lt;/strong&gt;&lt;br&gt;
The &lt;code&gt;redis.config&lt;/code&gt; file contains your Redis configuration.&lt;/p&gt;

&lt;p&gt;The configuration file is located at &lt;code&gt;/etc/redis/redis.config&lt;/code&gt; and is straightforward: it's a list of instructions.&lt;br&gt;
You'll find a section called &lt;code&gt;#### SNAPSHOTTING ####&lt;/code&gt; in which you can define whether Redis should snapshot its data to the disk and how often it should do it.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;save 60 1000
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;This configuration will make Redis dump the dataset to the disk every 60 seconds if at least 1000 keys are changed.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://redis.io/docs/manual/config/"&gt;Learn more about Redis Configuration →&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Redis also works with AOF (Append-Only File), which is a way to store the data on the disk by logging all write operations received by the server.&lt;br&gt;
AOF won't be covered in this article, but it is worth knowing it exists, especially when you'll need to back up and restore your data.&lt;/p&gt;
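If you do want AOF in addition to RDB snapshots, the relevant directives in the same configuration file look like this (shown for orientation only, since AOF isn't covered here):

```plaintext
appendonly yes                    # enable the append-only file
appendfilename "appendonly.aof"   # file name of the AOF
appendfsync everysec              # fsync the AOF once per second
```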
&lt;h2&gt;
  
  
  How to back up Redis data
&lt;/h2&gt;

&lt;p&gt;Making a Redis backup is pretty easy. You'll need to make a fresh copy of the RDB file, compress it and save it somewhere safe.&lt;/p&gt;
&lt;p&gt;Redis offers 2 methods to "force" a snapshot:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;SAVE&lt;/code&gt;: forces a snapshot to be taken synchronously.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;BGSAVE&lt;/code&gt;: forces a snapshot to be taken asynchronously.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The easiest way is to use the &lt;code&gt;SAVE&lt;/code&gt; method, but it will block all other operations until the snapshot is done.&lt;br&gt;
Using &lt;code&gt;BGSAVE&lt;/code&gt; will make the server continue to accept commands and will not block other operations, but you'll have to figure out when the snapshot is done since this one is asynchronous.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So, if you want to make a backup you'll need to do the following:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Review / update your Redis configuration.&lt;/li&gt;
&lt;li&gt;Create a snapshot of your Redis data (known as a "dump" / "rdb file").&lt;/li&gt;
&lt;li&gt;Save the RDB file to a remote location.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  1. Review / update your Redis configuration
&lt;/h3&gt;

&lt;p&gt;You'll need to know where your snapshot file will be generated when using the redis-cli commands described below.&lt;/p&gt;

&lt;p&gt;The default location of your Redis config file is &lt;code&gt;/etc/redis/redis.config&lt;/code&gt;.&lt;br&gt;
You can also use this command to find the directory where Redis stores its files:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;redis-cli config get dir
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You can find the configuration documentation here: &lt;a href="https://redis.io/topics/config"&gt;https://redis.io/topics/config&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Create a snapshot of your Redis data
&lt;/h3&gt;

&lt;h4&gt;
  
  
  &lt;u&gt;Using the redis-cli &lt;code&gt;SAVE&lt;/code&gt; command&lt;/u&gt;
&lt;/h4&gt;

&lt;p&gt;This method will work synchronously to make a snapshot of your Redis database.&lt;br&gt;
Just ssh into your server and log in to the database command line interface:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;redis-cli
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You might have to authenticate to your database:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;auth YOUR_PASSWORD_HERE
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;SAVE
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;The output will be something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;OK
&lt;span class="o"&gt;(&lt;/span&gt;1.23s&lt;span class="o"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can then exit the redis-cli by typing &lt;code&gt;exit&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;At this stage, the &lt;code&gt;RDB&lt;/code&gt; file will be saved in &lt;code&gt;/var/lib/redis/&lt;/code&gt; and will be named &lt;code&gt;dump.rdb&lt;/code&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;u&gt;Using the redis-cli &lt;code&gt;BGSAVE&lt;/code&gt; command&lt;/u&gt;
&lt;/h4&gt;

&lt;p&gt;Using the asynchronous dump function, you'll need to make sure you are aware of the end of the process.&lt;br&gt;
One way to do it is to use &lt;code&gt;inotifywait&lt;/code&gt;, which will notify you when a change to the dump file is made.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Automate your Redis backups and store them on AWS S3
&lt;/h3&gt;

&lt;p&gt;As stated in the &lt;a href="https://redis.io/docs/manual/persistence/"&gt;Redis documentation&lt;/a&gt;, it's safe to copy the RDB file even while it is used by your running server:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Redis is very data backup friendly since you can copy RDB files while the database is running: the RDB is never modified once produced, and while it gets produced it uses a temporary name and is renamed into its final destination atomically using rename(2) only when the new snapshot is complete.&lt;/p&gt;
&lt;p&gt;This means that copying the RDB file is completely safe while the server is running. This is what we suggest:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Create a cron job in your server creating hourly snapshots of the RDB file in one directory, and daily snapshots in a different directory.&lt;/li&gt;
&lt;li&gt;Every time the cron script runs, make sure to call the find command to make sure too old snapshots are deleted: for instance you can take hourly snapshots for the latest 48 hours, and daily snapshots for one or two months. Make sure to name the snapshots with date and time information.&lt;/li&gt;
&lt;li&gt;At least one time every day make sure to transfer an RDB snapshot outside your data center or at least outside the physical machine running your Redis instance.&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;

&lt;p&gt;We'll create a script that will create our dump file, then upload it to Amazon S3.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;u&gt;Create a shell script that will dump the Redis database&lt;/u&gt;
&lt;/h4&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;cd ~
mkdir scripts
cd scripts
nano redis_backup.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the script below to it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/bash
rdb_file="/FOLDER_TO_YOUR_REDIS_DUMP_FILE/redis/dump.rdb"
redis_cli="/usr/bin/redis-cli"

DIR=`date +%d-%m-%y`
DEST=~/redis_backups/$DIR
mkdir $DEST

echo save| $redis_cli
exit 1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
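The script above uses the blocking `SAVE`. If you'd rather use `BGSAVE`, you need to know when the background save has finished before copying the file. Here is a minimal sketch using the `LASTSAVE` command instead of `inotifywait` (the function name is ours):

```shell
# Trigger BGSAVE, then poll LASTSAVE (the unix time of the last
# successful save) until it changes, meaning the snapshot completed.
wait_for_bgsave() {
  before=$(redis-cli LASTSAVE)
  redis-cli BGSAVE > /dev/null
  while [ "$(redis-cli LASTSAVE)" = "$before" ]; do
    sleep 1
  done
}
```

You could then call `wait_for_bgsave` in the script before the `cp` step.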



&lt;h4&gt;
  
  
  &lt;u&gt;Send the Redis DUMP to an AWS S3 bucket&lt;/u&gt;
&lt;/h4&gt;

&lt;p&gt;Append to &lt;code&gt;redis_backup.sh&lt;/code&gt; the following:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;BUCKET_NAME="YOUR_S3_BUCKET_NAME"
aws s3 cp $rdb_file s3://$BUCKET_NAME/redis_backups/ &amp;amp;&amp;amp; echo "Backup copied to S3"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  &lt;u&gt;Schedule the script to run every day at midnight&lt;/u&gt;
&lt;/h4&gt;

&lt;p&gt;First, let's CHMOD the script to make it executable:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;chmod&lt;/span&gt; +x ~/scripts/db_sync.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then create a cron job to run the script every day at midnight:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;crontab &lt;span class="nt"&gt;-e&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;0 0 &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; ~/scripts/redis_backup.sh &lt;span class="c"&gt;# take a backup every midnight&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
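The hourly/daily retention scheme suggested in the Redis documentation quote above can be sketched with two small helpers (the directory layout and file names are illustrative, not from the article):

```shell
# Build a snapshot file name carrying date and time information,
# as the Redis docs recommend (e.g. dump-2022-08-15-0900.rdb).
snapshot_name() {
  printf 'dump-%s.rdb' "$(date +%F-%H%M)"
}

# Delete .rdb snapshots in a directory older than a cutoff in minutes,
# e.g. prune_snapshots ~/redis_backups/hourly $((48 * 60))
prune_snapshots() {
  find "$1" -name '*.rdb' -mmin +"$2" -delete
}
```

A cron job could then copy `dump.rdb` to `"$dir/$(snapshot_name)"` and call `prune_snapshots` on both the hourly and daily directories.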



&lt;h2&gt;
  
  
  How to restore a Redis backup
&lt;/h2&gt;

&lt;p&gt;Now that you've made a backup, we'll see how to restore it from a &lt;code&gt;dump.rdb&lt;/code&gt; file.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;We recommend you first try this on a fresh Redis server.&lt;/strong&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Make sure AOF is disabled
&lt;/h3&gt;

&lt;p&gt;AOF stands for Append-Only File, which instructs Redis to log all operations in a &lt;code&gt;.aof&lt;/code&gt; file.&lt;br&gt;
Since we're restoring a backup, we need to disable AOF before restoring the data, as we don't want Redis to log all these operations.&lt;/p&gt;

&lt;p&gt;Open your configuration file (&lt;code&gt;redis.config&lt;/code&gt;) and make sure &lt;code&gt;appendonly&lt;/code&gt; is set to &lt;code&gt;no&lt;/code&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;appendonly no
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;


### Stopping the Redis server

Before being able to replace the

 ```dump.rdb```

 file, you'll need to stop the Redis server.



```shell
sudo service redis-server stop
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Restoring the Redis database
&lt;/h3&gt;

&lt;p&gt;Prior to restoring the database, you can copy the existing dump.rdb file in order to have a restore point in case something goes wrong.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo cp&lt;/span&gt; /home/redis/dump.rdb /home/redis/dump.rdb.bak
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can then copy the backup rdb file as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo cp&lt;/span&gt; /redis_backups/20220810/dump.rdb /home/redis/dump.rdb
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And finally make sure to apply the right permissions to the dump.rdb file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo chmod &lt;/span&gt;660 /home/redis/dump.rdb
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Restarting the Redis server
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo &lt;/span&gt;service redis-server start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
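Put together, the restore steps above amount to a short script. Here is a sketch (the service name and paths are taken from the examples above; adjust them to your setup):

```shell
# restore_rdb BACKUP_FILE LIVE_RDB_PATH
# Stops Redis, keeps a .bak restore point, swaps in the backup with
# the right permissions, then starts Redis again.
restore_rdb() {
  sudo service redis-server stop
  [ -f "$2" ] && sudo cp "$2" "$2.bak"   # restore point
  sudo cp "$1" "$2"
  sudo chmod 660 "$2"
  sudo service redis-server start
}
```

For example: `restore_rdb /redis_backups/20220810/dump.rdb /home/redis/dump.rdb`.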



&lt;h2&gt;
  
  
  Additional Resources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Official Redis documentation: &lt;a href="https://redis.io/docs/manual/persistence/"&gt;https://redis.io/docs/manual/persistence/&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Service to automate your Redis backups: &lt;a href="https://simplebackups.com/redis-backup/"&gt;https://simplebackups.com/redis-backup/&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;We've seen how to back up your Redis database and restore it, and we've seen how to automate the process.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;a href="https://simplebackups.com"&gt;SimpleBackups&lt;/a&gt; will save you a lot of time setting up scripts, ensuring they run without problems, all without code or maintenance.&lt;br&gt;
It will alert you when things go wrong, and allows you to store your backups on many cloud storage services like Google, DigitalOcean, Wasabi, Dropbox, and more…&lt;/em&gt;&lt;/p&gt;

</description>
      <category>redis</category>
      <category>backup</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Cloud Storage Comparison - Best Providers 2021</title>
      <dc:creator>Laurent Lemaire</dc:creator>
      <pubDate>Thu, 07 Oct 2021 13:59:58 +0000</pubDate>
      <link>https://dev.to/simplebackups/cloud-storage-comparison-best-providers-2021-2cnf</link>
      <guid>https://dev.to/simplebackups/cloud-storage-comparison-best-providers-2021-2cnf</guid>
      <description>&lt;p&gt;We've created a list of all major (and our preferred) providers, together with their prices using the region &lt;strong&gt;US-EAST-1&lt;/strong&gt; as reference.&lt;/p&gt;

&lt;p&gt;If you're looking to store your data on a Cloud Storage Provider, you've got plenty of choices.&lt;/p&gt;

&lt;p&gt;Prices change often, and what each provider offers is different.&lt;br&gt;
Once you know your needs in terms of storage and frequency of access, you'll easily be able to identify which provider fits you best.&lt;/p&gt;

&lt;p&gt;This list doesn't contain all the specifics each provider offers, as we tried to summarize what most people are looking for when evaluating an object storage provider.&lt;/p&gt;

&lt;p&gt;If you want to see any additional information in this comparison, &lt;a href="mailto:contact@simplebackups.io"&gt;just let us know&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Object Storage Providers Comparison
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Name&lt;/th&gt;
&lt;th&gt;Price /GB/month&lt;/th&gt;
&lt;th&gt;Min price/month&lt;/th&gt;
&lt;th&gt;Price/1000 requests&lt;/th&gt;
&lt;th&gt;Price Ingress&lt;/th&gt;
&lt;th&gt;Price Egress&lt;/th&gt;
&lt;th&gt;Link&lt;/th&gt;
&lt;th&gt;Extra&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;AWS S3&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.02&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;$0.01&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.09/GB/month (1GB/m included)&lt;/td&gt;
&lt;td&gt;&lt;a href="https://aws.amazon.com/s3/pricing/"&gt;https://aws.amazon.com/s3/pricing/&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;- 12 first months free   - 5GB free&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;AWS S3 - Infrequent access&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.01&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;$0.01&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;td&gt;&lt;a href="https://aws.amazon.com/s3/pricing/"&gt;https://aws.amazon.com/s3/pricing/&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;AWS Glacier&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;$0.03&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.01/GB&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://aws.amazon.com/s3/pricing/"&gt;https://aws.amazon.com/s3/pricing/&lt;/a&gt;   &lt;a href="https://aws.amazon.com/s3/pricing/"&gt;https://aws.amazon.com/s3/pricing/&lt;/a&gt;
&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Wasabi&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.01&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;&lt;a href="https://wasabi.com/cloud-storage-pricing/#cost-estimates"&gt;https://wasabi.com/cloud-storage-pricing/#cost-estimates&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Backblaze&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.01&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.01/GB&lt;/td&gt;
&lt;td&gt;&lt;a href="https://www.backblaze.com/b2/cloud-storage-pricing.html"&gt;https://www.backblaze.com/b2/cloud-storage-pricing.html&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;DigitalOcean Spaces&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.02&lt;/td&gt;
&lt;td&gt;$5 (250GB included)&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;&lt;a href="https://www.digitalocean.com/pricing/"&gt;https://www.digitalocean.com/pricing/&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;- Minimum 5$   - 1 TB of outbound transfer (additional 0.01/GB)   - Additional storage 0.02/GB&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Dropbox&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$12 (5TB included)&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;&lt;a href="https://www.dropbox.com/plans?tab=work#features"&gt;https://www.dropbox.com/plans?tab=work#features&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;- Reference: Plan standard   - Minimum 12$/m&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Google Cloud Storage&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.02&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;$0.01&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.12&lt;/td&gt;
&lt;td&gt;&lt;a href="https://cloud.google.com/storage/pricing"&gt;https://cloud.google.com/storage/pricing&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;- Reference: Standard   - 5GB free + 5000 operations&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;FileBase&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.01&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.0059/GB (1TB included)&lt;/td&gt;
&lt;td&gt;&lt;a href="https://filebase.com/"&gt;https://filebase.com/&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;- Minimum 5.99/m&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Azure Blob Storage&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.02&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;$0.0065-$0.0005&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;n/a&lt;/td&gt;
&lt;td&gt;&lt;a href="https://azure.microsoft.com/en-us/pricing/details/storage/blobs/#overview"&gt;https://azure.microsoft.com/en-us/pricing/details/storage/blobs/#overview&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;- Reference: Hot Storage&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Exoscale&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.02&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.02&lt;/td&gt;
&lt;td&gt;&lt;a href="https://www.exoscale.com/pricing/#storage"&gt;https://www.exoscale.com/pricing/#storage&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;UpCloud&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.02&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.01&lt;/td&gt;
&lt;td&gt;&lt;a href="https://upcloud.com/pricing/#object-storage"&gt;https://upcloud.com/pricing/#object-storage&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;- Reference 250GB plan&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Vultr&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.02&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.00&lt;/td&gt;
&lt;td&gt;$0.01/GB (1TB included)&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;td&gt;- Minimum: $5 (250GB)&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
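To turn the table into a monthly estimate for your own workload, multiply your stored volume by the price per GB and your expected downloads by the egress rate. For example, 100 GB stored on AWS S3 plus 50 GB of egress, using the rates above:

```shell
storage_gb=100   # data at rest
egress_gb=50     # monthly downloads
# $0.02/GB storage + $0.09/GB egress (AWS S3 rows above)
awk -v s="$storage_gb" -v e="$egress_gb" \
    'BEGIN { printf "$%.2f/month\n", s * 0.02 + e * 0.09 }'
# prints "$6.50/month"
```

Providers with free egress (Wasabi, DigitalOcean Spaces within quota) drop the second term entirely, which is why they can win for download-heavy workloads.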

&lt;p&gt;&lt;em&gt;This list is maintained and updated monthly (last update: 20/09/2021).&lt;/em&gt;&lt;/p&gt;

</description>
      <category>backup</category>
      <category>cloudprovider</category>
      <category>s3</category>
      <category>objectstorage</category>
    </item>
    <item>
      <title>The Ultimate MySQL Database Backup Script</title>
      <dc:creator>Laurent Lemaire</dc:creator>
      <pubDate>Sat, 13 Jun 2020 09:43:12 +0000</pubDate>
      <link>https://dev.to/simplebackups/the-ultimate-mysql-database-backup-script-12g2</link>
      <guid>https://dev.to/simplebackups/the-ultimate-mysql-database-backup-script-12g2</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Any database needs a backup&lt;/strong&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This article is part of “The Ultimate Backup Script” series we are creating to provide you with database backup scripts that not only allow you to create database backups, but also upload the backup dumps to Amazon S3 and automate the process daily.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--W2Feirq5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://simplebackups.io/images/uploads/simplebackups-mysql-backup-storage.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--W2Feirq5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://simplebackups.io/images/uploads/simplebackups-mysql-backup-storage.png" alt="MySQL Backup Illustration" title="MySQL Backup Illustration" width="800" height="536"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Why do we need a database backup?
&lt;/h3&gt;

&lt;p&gt;One might ask why a backup is necessary for a database. The answer is simple: a backup creates a copy of your physical, logical, and operational data, which you can store in a safe place such as Amazon S3. This copy comes into use if the running database gets corrupted. A database backup can include files like control files, datafiles, and archived redo logs.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why Amazon S3 for backup?
&lt;/h3&gt;

&lt;p&gt;For this tutorial, we have chosen Amazon S3 as it is a very common choice. You can do the same thing if you would like to use another cloud storage provider; the instructions won’t differ a lot as long as the provider is S3-compatible.&lt;/p&gt;

&lt;p&gt;Below we define some lesser-known terms used in this article:&lt;/p&gt;

&lt;h3&gt;
  
  
  What is Cron?
&lt;/h3&gt;

&lt;blockquote&gt;
&lt;p&gt;Cron is a software utility that offers time-based job scheduling on Unix operating systems. Developers use Cron to schedule commands or shell scripts so that they run at chosen times: daily, once a week, or any interval desired.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  What is Chmod?
&lt;/h3&gt;

&lt;blockquote&gt;
&lt;p&gt;chmod, short for “change mode”, enables the admin to set rules for file handling. In other words, with the help of the chmod system call, an administrator can change the access permissions of file system objects.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h4&gt;
  
  
  Database Backup Script for MySQL and Dumping to Amazon S3
&lt;/h4&gt;

&lt;p&gt;You can automate creating a backup and storing it on Amazon S3 within a few minutes. The bullets below summarize what you'll learn in this part of the article:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a script that automates the MySQL backup directory creation&lt;/li&gt;
&lt;li&gt;Upload/sync the backups with Amazon S3&lt;/li&gt;
&lt;li&gt;Cron will run this command every day (to back up)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step 1: Generating a shell script which will dump the MySQL database
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd&lt;/span&gt; ~
&lt;span class="nb"&gt;mkdir &lt;/span&gt;scripts
&lt;span class="nb"&gt;cd &lt;/span&gt;scripts
nano db_backup.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the script below to it&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;
&lt;span class="nv"&gt;DIR&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sb"&gt;`&lt;/span&gt;&lt;span class="nb"&gt;date&lt;/span&gt; +%d-%m-%y&lt;span class="sb"&gt;`&lt;/span&gt;
&lt;span class="nv"&gt;DEST&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;~/db_backups/&lt;span class="nv"&gt;$DIR&lt;/span&gt;
&lt;span class="nb"&gt;mkdir&lt;/span&gt; &lt;span class="nv"&gt;$DEST&lt;/span&gt;

mysqldump &lt;span class="nt"&gt;-h&lt;/span&gt; mysql_hostname &lt;span class="nt"&gt;-u&lt;/span&gt; mysql_user  &lt;span class="nt"&gt;-p&lt;/span&gt;&lt;span class="s2"&gt;"mysql_password"&lt;/span&gt; database_name &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="nv"&gt;$DEST&lt;/span&gt;/dbbackup.sql
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now chmod the script to make it executable:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;chmod&lt;/span&gt; +x ~/scripts/db_backup.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
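If your dumps are large, a variation worth considering is piping mysqldump through gzip, so the files later synced to S3 are already compressed. A sketch (the helper name is ours; the connection details mirror the script above):

```shell
# Dump one database and compress it on the fly.
# Usage: dump_db_gz database_name destination_directory
dump_db_gz() {
  mysqldump -h mysql_hostname -u mysql_user -p"mysql_password" "$1" \
    | gzip > "$2/$1.sql.gz"
}
```

In `db_backup.sh` you would then call `dump_db_gz database_name "$DEST"` instead of the plain mysqldump line.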



&lt;h3&gt;
  
  
  Step 2: Creating the shell script which syncs the backups with Amazon S3
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;nano db_sync.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy and paste the script below to it&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/bash
/usr/local/bin/aws s3 sync ~/db_backups s3://my-bucket-name
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now chmod the script to make it executable:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;chmod +x ~/scripts/db_sync.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 3: Creating the local folder that will be synced to Amazon S3
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd ~
mkdir db_backups
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 4: Time to configure the AWS CLI
&lt;/h3&gt;

&lt;p&gt;Before installing the AWS CLI you need to install &lt;code&gt;python-pip&lt;/code&gt;. Type the following commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apt-get update
apt-get -y install python-pip
curl "https://bootstrap.pypa.io/get-pip.py" -o "get-pip.py"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-install.html"&gt;Install the AWS CLI&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Type the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install awscli
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 5: Time to set up AWS key &amp;amp; Secret
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-files.html"&gt;Configuration and credential file settings&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NyualHXI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://simplebackups.io/images/uploads/image2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NyualHXI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://simplebackups.io/images/uploads/image2.png" alt="" width="796" height="397"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd ~
mkdir .aws
nano ~/.aws/config
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Paste in &lt;code&gt;key_id&lt;/code&gt;and &lt;code&gt;secret_access_key&lt;/code&gt; as shown below&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[default]
aws_access_key_id=AKIAIOSFODNN7EXAMPLE
aws_secret_access_key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 6: Set up the Cron (to automate the process)
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;crontab -e
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Paste the below commands at the bottom to automate the process&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;0 0 * * * ~/scripts/db_backup.sh # take a backup every midnight
0 2 * * * ~/scripts/db_sync.sh # upload the backup at 2am
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This way the backup script will run and also sync with Amazon S3 daily.&lt;/p&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;Hence, by using these scripts you can achieve 3 goals:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Creating the database backup via a shell script&lt;/li&gt;
&lt;li&gt;Uploading the dump to Amazon S3&lt;/li&gt;
&lt;li&gt;Automating the process using Cron&lt;/li&gt;
&lt;/ol&gt;




&lt;p&gt;&lt;em&gt;Have you tried &lt;a href="https://simplebackups.io/?ref=blog"&gt;SimpleBackups.io &lt;/a&gt; yet?&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;SimpleBackups will save you a lot of time setting up scripts and ensuring they run without problems. It will alert you when things go wrong, and allows you to store your backups on many cloud storage services like Google, DigitalOcean, Wasabi, Dropbox, and more…&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8pRDJsz---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://simplebackups.io/images/uploads/simplebackups-bringstorage.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8pRDJsz---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://simplebackups.io/images/uploads/simplebackups-bringstorage.png" alt="" width="800" height="543"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>mysql</category>
      <category>mariadb</category>
      <category>bash</category>
      <category>database</category>
    </item>
  </channel>
</rss>
