<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: gerkibz</title>
    <description>The latest articles on DEV Community by gerkibz (@gerkibz).</description>
    <link>https://dev.to/gerkibz</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F222639%2Fa42f305a-52dc-4858-ab53-b1768125ecd9.png</url>
      <title>DEV Community: gerkibz</title>
      <link>https://dev.to/gerkibz</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/gerkibz"/>
    <language>en</language>
    <item>
      <title>Moving data from Google Drive to an AWS S3 bucket</title>
      <dc:creator>gerkibz</dc:creator>
      <pubDate>Thu, 03 Mar 2022 16:05:27 +0000</pubDate>
      <link>https://dev.to/gerkibz/moving-data-from-google-drive-to-aws-s3-bucket-27hn</link>
      <guid>https://dev.to/gerkibz/moving-data-from-google-drive-to-aws-s3-bucket-27hn</guid>
      <description>&lt;h1&gt;
  
  
  &lt;em&gt;Introduction.&lt;/em&gt;
&lt;/h1&gt;

&lt;p&gt;Recent improvements in the cloud services sector have provided platforms for cheap, timely and easy data storage and manipulation. Even so, uploading data to some of these platforms can take time if they don't compress the files before uploading.&lt;/p&gt;

&lt;p&gt;For example, in my experience uploading data to AWS can be around three times slower than uploading the same data to a Google Drive folder, because Google Drive compresses the data beforehand.&lt;/p&gt;

&lt;p&gt;At times you may want to upload data to a site or model hosted on AWS from local storage or from Google Drive.&lt;/p&gt;

&lt;p&gt;In this tutorial I'll take you through uploading to Google Drive from local storage, uploading to AWS S3 from local storage, and sharing files between AWS S3 and Google Drive.&lt;/p&gt;




&lt;h1&gt;
  
  
  &lt;em&gt;Setup.&lt;/em&gt;
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Install rclone
&lt;/h2&gt;

&lt;p&gt;Head over to the &lt;a href="https://rclone.org/downloads/"&gt;rclone downloads page&lt;/a&gt; to get the setup.&lt;br&gt;
To install rclone on Linux/macOS/BSD systems, run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;1.2 curl https://rclone.org/install.sh | sudo bash
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Setup google drive
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Run:&lt;br&gt;
&lt;code&gt;rclone config&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Choose option n: New remote.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enter the remote name: google-drive&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For the storage option choose 16: Google Drive.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Leave client_id and client_secret blank to use rclone's built-in credentials.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For drive access choose scope 1: full access to all files for now; you can edit this later.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Leave root_folder_id blank to use the whole drive.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Leave service_account_file blank and choose n: No for the edit advanced config option.&lt;br&gt;
When asked whether to use auto config, choose yes; this opens a link in your default browser, prompting you to sign into your Google Drive account and grant rclone access.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Confirm the remote and quit the config.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
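
&lt;p&gt;Once the wizard finishes, rclone saves the remote to its config file (usually ~/.config/rclone/rclone.conf). With the choices above, the entry should look roughly like this (token redacted):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[google-drive]
type = drive
scope = drive
token = {"access_token":"...","token_type":"Bearer","expiry":"..."}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;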

&lt;p&gt;To test whether the connection was successful run:&lt;br&gt;
&lt;code&gt;rclone lsd &amp;lt;remote-name&amp;gt;:&lt;/code&gt;&lt;br&gt;
In my case I ran:&lt;br&gt;
&lt;code&gt;rclone lsd google-drive:&lt;/code&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Setup AWS S3
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Run:&lt;br&gt;
&lt;code&gt;rclone config&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Choose option n: New remote.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enter the remote name: aws-s3&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For the storage option choose 4: Amazon S3 Compliant Storage Providers including AWS, Alibaba, Ceph, Digital Ocean, Dreamhost, IBM COS, Minio, SeaweedFS, and Tencent COS.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For the S3 provider, pick option 1: Amazon Web Services (AWS) S3.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For how to get AWS credentials, pick 1: Enter AWS credentials in the next step.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Navigate to &lt;a href="https://console.aws.amazon.com/iam/home#/users%24new?step=final&amp;amp;accessKey&amp;amp;userNames=rclone&amp;amp;permissionType=policies&amp;amp;policies=arn:aws:iam::aws:policy%2FAdministratorAccess"&gt;aws user&lt;/a&gt; to create a user and obtain the access key ID and secret access key to enter below.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enter the access_key_id and secret_access_key of the user created above.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Choose the region your bucket is located in; in my case option 11: eu-central-1.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For the endpoint for the S3 API, match your bucket region; in my case I picked option 11: EU region.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For the canned ACL used when creating buckets and/or storing objects in S3, choose option 1: Owner gets FULL_CONTROL. No one else has access rights (default).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Choose option 1 for the server side encryption.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Choose the default storage class, option 1.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Confirm the remote and quit the config.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
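
&lt;p&gt;With the choices above, the S3 entry in ~/.config/rclone/rclone.conf should look roughly like this (keys redacted):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[aws-s3]
type = s3
provider = AWS
access_key_id = AKIA...
secret_access_key = ...
region = eu-central-1
acl = private
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;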




&lt;h1&gt;
  
  
  &lt;em&gt;Code run.&lt;/em&gt;
&lt;/h1&gt;

&lt;h3&gt;
  
  
  Local storage to google drive
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;rclone copy /local/path remote:path&lt;/code&gt; &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;copies /local/path to the remote&lt;/li&gt;
&lt;/ul&gt;
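
&lt;p&gt;For example, with the google-drive remote configured above (the local and remote paths here are hypothetical):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rclone copy ~/Documents/reports google-drive:backups/reports --progress
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;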

&lt;h3&gt;
  
  
  Local storage to AWS S3
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;rclone copy /local/path remote:path&lt;/code&gt; &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;copies /local/path to the remote&lt;/li&gt;
&lt;/ul&gt;
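
&lt;p&gt;For example, with the aws-s3 remote configured above (the bucket name and paths here are hypothetical):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rclone copy ~/Documents/reports aws-s3:my-bucket/reports --progress
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;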

&lt;h3&gt;
  
  
  Google drive to S3
&lt;/h3&gt;

&lt;p&gt;remote =&amp;gt; remote name&lt;br&gt;
&lt;code&gt;rclone copy source-remote:path dest-remote:path&lt;/code&gt; &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;copies the path on the source remote to the path on the destination remote&lt;/li&gt;
&lt;/ul&gt;
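
&lt;p&gt;For example, to copy from the google-drive remote to the aws-s3 remote configured above (bucket name and paths hypothetical):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rclone copy google-drive:backups/reports aws-s3:my-bucket/reports --progress
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Note that when copying between two different cloud providers, the data is streamed through the machine running rclone, so transfer speed depends on your local bandwidth.&lt;/p&gt;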

&lt;h2&gt;
  
  
  REFERENCES
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://rclone.org/"&gt;https://rclone.org/&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Connecting a local PostgreSQL database to Laravel in Ubuntu</title>
      <dc:creator>gerkibz</dc:creator>
      <pubDate>Sat, 15 Jan 2022 19:47:29 +0000</pubDate>
      <link>https://dev.to/gerkibz/connecting-local-postgresql-database-to-laravel-in-ubuntu-43bl</link>
      <guid>https://dev.to/gerkibz/connecting-local-postgresql-database-to-laravel-in-ubuntu-43bl</guid>
      <description>&lt;p&gt;Step 1.Install laravel:~&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -s "https://laravel.build/example-app?with=mysql,redis" | bash
composer create-project laravel/laravel example-app
cd example-app
php artisan serve
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Step 2. Install PostgreSQL:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt install postgresql postgresql-contrib
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Step 3. In the config/database.php file set the default database connection to pgsql:&lt;br&gt;
&lt;code&gt;'default' =&amp;gt; env('DB_CONNECTION', 'pgsql'),&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Step 4. Create a user and a database for your project (run as the postgres system user):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo -u postgres createuser Username
sudo -u postgres createdb database_name
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
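
&lt;p&gt;Alternatively, the same user and database can be created from inside the psql shell (names and password here are placeholders; the quotes around the user name preserve its capitalization):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo -u postgres psql
CREATE USER "Username" WITH PASSWORD 'password';
CREATE DATABASE database_name OWNER "Username";
\q
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;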

&lt;p&gt;Step 5. To set up the database connection, go to the .env file and set:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;DB_CONNECTION=pgsql
DB_HOST=127.0.0.1
DB_PORT=5432
DB_DATABASE=database_name
DB_USERNAME=Username
DB_PASSWORD=password
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Step 6. Start the PostgreSQL server:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo service postgresql start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Step 7. Run the migrations:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;php artisan migrate
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;For more information check out this project:&lt;br&gt;
&lt;a href="https://github.com/EKivutha/clothes_e-commerce"&gt;https://github.com/EKivutha/clothes_e-commerce&lt;/a&gt;&lt;br&gt;
For more PostgreSQL tricks check out:&lt;br&gt;
&lt;a href="https://gist.github.com/Kartones/dd3ff5ec5ea238d4c546"&gt;https://gist.github.com/Kartones/dd3ff5ec5ea238d4c546&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
