<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: James Ackah</title>
    <description>The latest articles on DEV Community by James Ackah (@james_ackah_d583c09a33715).</description>
    <link>https://dev.to/james_ackah_d583c09a33715</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2530189%2F0e30a1e4-d590-4e31-9eb9-e6da6cfd29d8.jpg</url>
      <title>DEV Community: James Ackah</title>
      <link>https://dev.to/james_ackah_d583c09a33715</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/james_ackah_d583c09a33715"/>
    <language>en</language>
    <item>
      <title>Streamlining Data Security and Accessibility Through Automation</title>
      <dc:creator>James Ackah</dc:creator>
      <pubDate>Thu, 05 Dec 2024 23:06:11 +0000</pubDate>
      <link>https://dev.to/james_ackah_d583c09a33715/streamlining-data-security-and-accessibility-through-automation-330b</link>
      <guid>https://dev.to/james_ackah_d583c09a33715/streamlining-data-security-and-accessibility-through-automation-330b</guid>
      <description>&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;p&gt;This project automates the process of scanning a specified directory for files that haven't been accessed in a configurable period (default: 90 days). These files are moved to an archival folder and uploaded to an AWS S3 bucket for long-term storage. A cron job is set up to run the script periodically, ensuring consistent archival and backup.&lt;/p&gt;
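&lt;p&gt;As a rough sketch, the core of such a script can be expressed with &lt;code&gt;find&lt;/code&gt; and the AWS CLI. The paths and bucket name below are placeholders, not the project's actual configuration:&lt;/p&gt;

```shell
#!/usr/bin/env bash
# Minimal sketch of the scan-and-archive step described above.
# SCAN_DIR, ARCHIVES_DIR, and S3_BUCKET are placeholder values.
SCAN_DIR="/data/projects"
ARCHIVES_DIR="/data/archives"
S3_BUCKET="s3://example-archive-bucket"
RETENTION_DAYS=90

archive_unused_files() {
  mkdir -p "$ARCHIVES_DIR"
  # Move regular files whose last access time is older than RETENTION_DAYS.
  find "$SCAN_DIR" -type f -atime "+$RETENTION_DAYS" -print0 |
    while IFS= read -r -d '' file; do
      mv -- "$file" "$ARCHIVES_DIR/"
    done
}

sync_to_s3() {
  # Requires a configured AWS CLI with write access to the bucket.
  aws s3 sync "$ARCHIVES_DIR" "$S3_BUCKET/"
}
```

&lt;p&gt;Note that &lt;code&gt;find -atime&lt;/code&gt; relies on access times being recorded; filesystems mounted with &lt;code&gt;noatime&lt;/code&gt; will not update them.&lt;/p&gt;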




&lt;h2&gt;
  
  
  Features
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Automated File Scanning&lt;/strong&gt;: Detects files that have not been accessed within the specified period and moves them to an archive directory.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AWS S3 Integration&lt;/strong&gt;: Syncs archived files to a designated S3 bucket for reliable cloud storage.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Configurable Retention Period&lt;/strong&gt;: Allows customization of the unused file retention period.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Logging&lt;/strong&gt;: Maintains a log file for monitoring and troubleshooting the archival process.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Error Handling&lt;/strong&gt;: Ensures robust operation with proper error handling and notifications.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Requirements
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;AWS CLI&lt;/strong&gt;: Installed and configured with appropriate IAM credentials for S3 access.

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html" rel="noopener noreferrer"&gt;AWS CLI Installation Guide&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Bash Shell&lt;/strong&gt;: The script is designed to run in a Unix/Linux environment.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cron&lt;/strong&gt;: For scheduling automated execution of the script.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sudo/Root Privileges&lt;/strong&gt;: Necessary for file management in system directories.&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Usage
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. Clone the Repository
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/your-username/automated-log-archival.git
&lt;span class="nb"&gt;cd &lt;/span&gt;automated-log-archival
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  2. Update Configuration
&lt;/h3&gt;

&lt;p&gt;Edit the script (&lt;code&gt;archive_and_backup.sh&lt;/code&gt;) to customize the following variables:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;SCAN_DIR&lt;/code&gt;: The directory to scan for unused files.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;RETENTION_DAYS&lt;/code&gt;: The period (in days) after which files are considered unused.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;ARCHIVES_DIR&lt;/code&gt;: The directory where files will be archived locally.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;S3_BUCKET&lt;/code&gt;: The AWS S3 bucket name for cloud backup.&lt;/li&gt;
&lt;/ul&gt;
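&lt;p&gt;For example, the configuration block at the top of the script might look like this (the values are illustrative):&lt;/p&gt;

```shell
# Example values only; adjust for your environment.
SCAN_DIR="/home/ubuntu/projects"       # directory scanned for unused files
RETENTION_DAYS=90                      # files unused this long get archived
ARCHIVES_DIR="/home/ubuntu/archives"   # local archive target
S3_BUCKET="s3://example-archive-bucket"  # placeholder bucket name
```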

&lt;h3&gt;
  
  
  3. Grant Execution Permissions
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;chmod&lt;/span&gt; +x archive_and_backup.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4. Test the Script
&lt;/h3&gt;

&lt;p&gt;Run the script manually to ensure it works as expected:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo&lt;/span&gt; ./archive_and_backup.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  5. Automate with Cron
&lt;/h3&gt;

&lt;p&gt;Schedule the script using a cron job:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo &lt;/span&gt;crontab &lt;span class="nt"&gt;-e&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Add the following line to run the script daily at midnight:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;0 0 &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; /path/to/archive_and_backup.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Logging
&lt;/h2&gt;

&lt;p&gt;The script logs its operations to:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/var/log/archive_cleanup.log
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Check this file for information on completed tasks or troubleshooting errors.&lt;/p&gt;
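&lt;p&gt;A quick way to review recent activity, assuming the log path above:&lt;/p&gt;

```shell
# Inspect recent archival activity; path as configured in the script.
LOG_FILE="/var/log/archive_cleanup.log"
if test -f "$LOG_FILE"; then
  tail -n 20 "$LOG_FILE"               # most recent entries
  grep -ci "error" "$LOG_FILE" || true # error count (grep exits 1 on no match)
fi
```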




&lt;h2&gt;
  
  
  Contributions
&lt;/h2&gt;

&lt;p&gt;Contributions, issues, and feature requests are welcome! Feel free to submit a pull request or open an issue.&lt;/p&gt;

&lt;h2&gt;
  
  
  License
&lt;/h2&gt;

&lt;p&gt;This project is licensed under the MIT License. See the &lt;code&gt;LICENSE&lt;/code&gt; file for details.&lt;/p&gt;




&lt;h2&gt;
  
  
  Author
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;James Ackah-Blay&lt;/strong&gt;  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://linkedin.com/in/jamesackahblay" rel="noopener noreferrer"&gt;LinkedIn Profile&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;





</description>
    </item>
    <item>
      <title>Automated Backup to S3</title>
      <dc:creator>James Ackah</dc:creator>
      <pubDate>Thu, 05 Dec 2024 22:51:05 +0000</pubDate>
      <link>https://dev.to/james_ackah_d583c09a33715/automated-backup-to-s3-3a9m</link>
      <guid>https://dev.to/james_ackah_d583c09a33715/automated-backup-to-s3-3a9m</guid>
      <description>&lt;h1&gt;
  
  
  Automated Backup to S3
&lt;/h1&gt;

&lt;p&gt;This project is a Bash script designed to automate the process of creating backups, uploading them to an Amazon S3 bucket, and sending email notifications to keep administrators informed about the status of the backup process.&lt;/p&gt;

&lt;h2&gt;
  
  
  Features
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Automated Backup:&lt;/strong&gt; Creates a compressed archive of the specified directory.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;S3 Integration:&lt;/strong&gt; Uploads the backup file to a designated Amazon S3 bucket.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Email Notifications:&lt;/strong&gt; Sends email alerts for each stage of the process (start, success, or failure).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;File Cleanup:&lt;/strong&gt; Deletes local backup files after successful upload to S3.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before using the script, ensure the following requirements are met:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;AWS CLI:&lt;/strong&gt; Installed and configured with appropriate credentials.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Email Client:&lt;/strong&gt; &lt;code&gt;mutt&lt;/code&gt; installed and configured to send emails.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Bash Shell:&lt;/strong&gt; Running on a Linux/Unix system with Bash.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Directories:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Source directory to be backed up: &lt;code&gt;/home/ubuntu/System_Data&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Backup destination: &lt;code&gt;/home/ubuntu/system_backup/&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
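&lt;p&gt;A hypothetical pre-flight check (not part of the script itself) can confirm these tools are present before the first run:&lt;/p&gt;

```shell
# Hypothetical helper: verify the tools backup.sh relies on are installed.
check_tools() {
  local missing=0
  for tool in "$@"; do
    if ! command -v "$tool" >/dev/null; then
      echo "Missing required tool: $tool"
      missing=$((missing + 1))
    fi
  done
  return "$missing"   # 0 when everything is available
}

check_tools tar aws mutt || echo "Install the missing tools before running backup.sh."
```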

&lt;h2&gt;
  
  
  Usage
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Clone or Create the Script:&lt;/strong&gt;&lt;br&gt;
Save the provided Bash script as &lt;code&gt;backup.sh&lt;/code&gt; on your system.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Modify Configuration:&lt;/strong&gt;&lt;br&gt;
Update the script variables as needed:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;BACKUP_SOURCE&lt;/code&gt;: Path to the directory to be backed up.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;BACKUP_DEST&lt;/code&gt;: Path where the backup file will be stored temporarily.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;S3_BUCKET&lt;/code&gt;: S3 bucket where the backup will be uploaded.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;ADMIN_EMAIL&lt;/code&gt;: Email address to receive notifications.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Set Execute Permissions:&lt;/strong&gt;&lt;br&gt;
Make the script executable:&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;chmod&lt;/span&gt; +x backup.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="4"&gt;
&lt;li&gt;
&lt;strong&gt;Run the Script:&lt;/strong&gt;
Execute the script manually or schedule it as a cron job:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;./backup.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="5"&gt;
&lt;li&gt;
&lt;strong&gt;Set Up Cron Job (Optional):&lt;/strong&gt;
Automate the backup by adding a cron job:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;crontab &lt;span class="nt"&gt;-e&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Add the following line to run the script daily at midnight:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;0 0 &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; /path/to/backup.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Script Workflow
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Sends an email notification that the backup process has started.&lt;/li&gt;
&lt;li&gt;Creates a compressed archive (&lt;code&gt;.tar.gz&lt;/code&gt;) of the specified directory.&lt;/li&gt;
&lt;li&gt;Checks whether the backup was created successfully:

&lt;ul&gt;
&lt;li&gt;If successful, send a success email.&lt;/li&gt;
&lt;li&gt;If failed, send a failure email and terminate the script.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Uploads the backup file to the specified S3 bucket.&lt;/li&gt;
&lt;li&gt;Checks whether the upload succeeded:

&lt;ul&gt;
&lt;li&gt;If successful, send a success email and delete the local backup file.&lt;/li&gt;
&lt;li&gt;If failed, send a failure email and terminate the script.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
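&lt;p&gt;The workflow above can be sketched as a single function. The bucket, e-mail address, and the &lt;code&gt;echo&lt;/code&gt; fallback in &lt;code&gt;notify&lt;/code&gt; are illustrative placeholders rather than the script's literal code:&lt;/p&gt;

```shell
# Sketch of the workflow above; placeholder values throughout.
BACKUP_SOURCE="/home/ubuntu/System_Data"
BACKUP_DEST="/home/ubuntu/system_backup/"
BACKUP_FILE="backup_$(date +%Y%m%d).tar.gz"
S3_BUCKET="s3://example-backup-bucket"
ADMIN_EMAIL="admin@example.com"

notify() {
  # Falls back to echo when mutt is unavailable (e.g. when testing locally).
  if command -v mutt >/dev/null; then
    echo "$2" | mutt -s "$1" "$ADMIN_EMAIL"
  else
    echo "[notify] $1: $2"
  fi
}

run_backup() {
  notify "Backup started" "Backing up $BACKUP_SOURCE"
  tar -czf "$BACKUP_DEST$BACKUP_FILE" -C "$(dirname "$BACKUP_SOURCE")" "$(basename "$BACKUP_SOURCE")"
  if test $? -ne 0; then
    notify "Backup FAILED" "tar could not create $BACKUP_FILE"
    return 1
  fi
  aws s3 cp "$BACKUP_DEST$BACKUP_FILE" "$S3_BUCKET/"
  if test $? -ne 0; then
    notify "Upload FAILED" "aws s3 cp to $S3_BUCKET failed"
    return 1
  fi
  notify "Backup complete" "$BACKUP_FILE uploaded; removing local copy"
  rm -f "$BACKUP_DEST$BACKUP_FILE"
}

# run_backup   # invoke after adjusting the paths above
```

&lt;p&gt;The local archive is deleted only after the upload returns success, so a failed transfer never discards the only copy.&lt;/p&gt;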

&lt;h2&gt;
  
  
  Example Configuration
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;BACKUP_SOURCE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"/home/ubuntu/System_Data"&lt;/span&gt;
&lt;span class="nv"&gt;BACKUP_DEST&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"/home/ubuntu/system_backup/"&lt;/span&gt;
&lt;span class="nv"&gt;BACKUP_FILE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"backup_&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;date&lt;/span&gt; +%Y%m%d&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;.tar.gz"&lt;/span&gt;
&lt;span class="nv"&gt;S3_BUCKET&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"s3://linuxacademy-awscli-backup"&lt;/span&gt;
&lt;span class="nv"&gt;ADMIN_EMAIL&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"jamesblay80@gmail.com"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Dependencies
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;tar&lt;/code&gt;: Used to compress the backup files.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;aws-cli&lt;/code&gt;: Used to upload files to S3.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;mutt&lt;/code&gt;: Used to send email notifications.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Error Handling
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;The script uses &lt;code&gt;$?&lt;/code&gt; to check the success of each operation.&lt;/li&gt;
&lt;li&gt;If any operation fails, the script sends a failure notification and exits with a non-zero status.&lt;/li&gt;
&lt;/ul&gt;
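&lt;p&gt;This is the classic check-and-bail idiom; a generic illustration (not the script's literal code):&lt;/p&gt;

```shell
# Run a command and report via $? whether it succeeded.
run_step() {
  "$@"
  if test $? -ne 0; then
    echo "Step failed: $*"
    return 1
  fi
}

run_step true    # succeeds silently
run_step false || echo "a failing step would trigger the failure notification"
```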

&lt;h2&gt;
  
  
  Notes
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Ensure the AWS CLI is configured with credentials that have appropriate permissions to upload to the specified S3 bucket.&lt;/li&gt;
&lt;li&gt;Email notifications require &lt;code&gt;mutt&lt;/code&gt; to be properly configured on the server.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Contact
&lt;/h2&gt;

&lt;p&gt;For questions or assistance, please get in touch with &lt;strong&gt;James Ackah-Blay&lt;/strong&gt; at &lt;a href="mailto:jamesblay80@gmail.com"&gt;jamesblay80@gmail.com&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>selflearning</category>
      <category>bashscripting</category>
    </item>
  </channel>
</rss>
