<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Hiromi</title>
    <description>The latest articles on DEV Community by Hiromi (@dentrodailha96).</description>
    <link>https://dev.to/dentrodailha96</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2189433%2F46d346f1-bff1-490e-900c-f67ffac61e76.jpg</url>
      <title>DEV Community: Hiromi</title>
      <link>https://dev.to/dentrodailha96</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/dentrodailha96"/>
    <language>en</language>
    <item>
      <title>Connecting Docker to Digital Ocean Droplet using Git Actions</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Tue, 10 Feb 2026 09:55:41 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/connecting-docker-to-digital-ocean-droplet-using-git-actions-5k0</link>
      <guid>https://dev.to/dentrodailha96/connecting-docker-to-digital-ocean-droplet-using-git-actions-5k0</guid>
      <description>&lt;p&gt;After connected the &lt;a href="https://github.com/dentrodailha96/arimura_cj" rel="noopener noreferrer"&gt;Sushi Project&lt;/a&gt; in a Droplet from Digital Ocean, and created the environment inside the virtual machine, it was essential to consider a CI/CD process to keep the application updated for all the developers involved in the process. Even more that the developers are working in different timezone. &lt;/p&gt;

&lt;p&gt;The platform chosen for our CI/CD was &lt;a href="https://docs.github.com/en/actions/get-started/understand-github-actions" rel="noopener noreferrer"&gt;GitHub Actions&lt;/a&gt;. It allows us to automate everything from the build process to the production pipeline using event-driven workflows. In simple words, changes that I push to my repository (local machine) will automatically update the app on my virtual machine (remote machine). &lt;/p&gt;

&lt;p&gt;Building on our previous experience, you should already be familiar with the concepts of &lt;a href="https://dev.to/dentrodailha96/leanings-docker-and-it-perks-33if"&gt;Docker Container&lt;/a&gt; and a &lt;a href="https://dev.to/dentrodailha96/leanings-docker-and-it-perks-33if"&gt;Droplet&lt;/a&gt;. In this section, I will explain the concept of a Container Registry and how to connect GitHub Actions to our droplet. &lt;/p&gt;

&lt;p&gt;The setup mentioned above is based on &lt;a href="https://faun.pub/full-ci-cd-with-docker-github-actions-digitalocean-droplets-container-registry-db2938db8246" rel="noopener noreferrer"&gt;Thao Truong's logic&lt;/a&gt;: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnfth7b7wmet15fh13dup.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnfth7b7wmet15fh13dup.png" alt=" " width="666" height="463"&gt;&lt;/a&gt;&lt;br&gt;
source: Thao Truong, 04/12/2021, &lt;a href="https://faun.pub/full-ci-cd-with-docker-github-actions-digitalocean-droplets-container-registry-db2938db8246" rel="noopener noreferrer"&gt;post&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;As you can see in the image, there is a component called &lt;strong&gt;Container Registry&lt;/strong&gt;. Initially, I didn't immediately understand why we should use one; if you feel the same, I recommend watching this &lt;a href="https://www.youtube.com/watch?v=76rX4s73MrM" rel="noopener noreferrer"&gt;video&lt;/a&gt;. The definition is: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Container Registry is a repository/collection of repositories used to store and access container images. &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;In other words, a container registry stores images: snapshots of static files on disk that can easily be copied into a container. Our goal is to use the registry to keep our Docker image updated and rebuilt inside our virtual machine every time we update our repository.&lt;/p&gt;
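&lt;p&gt;To make the flow concrete, here is a minimal sketch of how a registry image reference is assembled and pushed. The registry name, image name, and tag below are made-up placeholders; in the workflow later in this post they come from GitHub Actions variables and the short commit SHA.&lt;/p&gt;

```shell
# Hypothetical values -- replace with your own registry and image name.
REGISTRY="registry.digitalocean.com/my-registry"
IMAGE_NAME="sushi-app"
TAG="a1b2c3"  # e.g. a short commit SHA

# A registry image reference is just registry/name:tag
IMAGE_REF="$REGISTRY/$IMAGE_NAME:$TAG"
echo "$IMAGE_REF"

# docker tag "$IMAGE_NAME" "$IMAGE_REF"  # label the local image for the registry
# docker push "$IMAGE_REF"               # upload it so other machines can pull it
```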

&lt;h2&gt;
  
  
  Creating a Container Repository in Digital Ocean
&lt;/h2&gt;

&lt;p&gt;I recommend following these tutorials: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/watch?v=GikLmButNyQ" rel="noopener noreferrer"&gt;Video&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.digitalocean.com/products/container-registry/getting-started/quickstart/" rel="noopener noreferrer"&gt;DigitalOcean tutorial&lt;/a&gt; &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In my experience, these were easy to follow without issues, so I will focus on the GitHub Actions step.&lt;/p&gt;

&lt;h2&gt;
  
  
  Connecting the GitHub Actions to our Droplet:
&lt;/h2&gt;

&lt;p&gt;1) To create an Action workflow, you must have a repository containing your app files and a Dockerfile. Then, in the Actions tab, you have two paths:&lt;/p&gt;

&lt;p&gt;a) Choose a pre-built workflow that generates a .yml file, which you can then adapt to your needs. &lt;/p&gt;

&lt;p&gt;b) Choose the "Simple Workflow" option, which generates a basic .yml file, allowing you to write your script from scratch. &lt;/p&gt;

&lt;p&gt;Both will create the path &lt;em&gt;.github/workflows&lt;/em&gt; in your repository. &lt;/p&gt;

&lt;p&gt;For this guide, I will choose path "b" and develop our .yml file based on the one from &lt;a href="https://faun.pub/full-ci-cd-with-docker-github-actions-digitalocean-droplets-container-registry-db2938db8246" rel="noopener noreferrer"&gt;Thao Truong's post&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;2) Creating the .yml/.yaml file: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Following Thao Truong's structure as-is might fail due to outdated versions of actions and keys; therefore, here is my updated version:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;CICD&lt;/span&gt;

&lt;span class="c1"&gt;# 1&lt;/span&gt;
&lt;span class="c1"&gt;# Controls when the workflow will run&lt;/span&gt;
&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="c1"&gt;# Triggers the workflow on push events but only for the master branch&lt;/span&gt;
  &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt; &lt;span class="nv"&gt;main&lt;/span&gt; &lt;span class="pi"&gt;]&lt;/span&gt;

  &lt;span class="c1"&gt;# Allows you to run this workflow manually from the Actions tab&lt;/span&gt;
  &lt;span class="na"&gt;workflow_dispatch&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Image&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;version'&lt;/span&gt;
        &lt;span class="na"&gt;required&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
&lt;span class="c1"&gt;#2&lt;/span&gt;
&lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;REGISTRY&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ vars.REGISTRY}}&lt;/span&gt;
  &lt;span class="na"&gt;IMAGE_NAME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ vars.IMAGE_NAME}}&lt;/span&gt;

&lt;span class="c1"&gt;#3&lt;/span&gt;
&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;build_and_push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;outputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;sha&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ steps.short-sha.outputs.sha }}&lt;/span&gt;
    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Checkout the repo&lt;/span&gt; 
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v4&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;benjlevesque/short-sha@v3.0&lt;/span&gt;
        &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;short-sha&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;length&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;6&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;echo $SHA&lt;/span&gt;
        &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;SHA&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ steps.short-sha.outputs.sha }}&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Build container image&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;docker build -t ${{ vars.REGISTRY}}/${{ vars.IMAGE_NAME}}:${{ steps.short-sha.outputs.sha }}  ./code&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Install doctl&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;digitalocean/action-doctl@v2&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;token&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }}&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Log in to DigitalOcean Container Registry with short-lived credentials&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;doctl registry login --expiry-seconds &lt;/span&gt;&lt;span class="m"&gt;600&lt;/span&gt;

&lt;span class="c1"&gt;#given the limited plan, is better to always clean everything&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Remove all old images&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
          &lt;span class="s"&gt;doctl registry repository list-tags ${{ vars.IMAGE_NAME }} \&lt;/span&gt;
            &lt;span class="s"&gt;--no-header --format Tag \&lt;/span&gt;
            &lt;span class="s"&gt;| grep -v ${{ steps.short-sha.outputs.sha }} \&lt;/span&gt;
            &lt;span class="s"&gt;| xargs -I {} doctl registry repository delete-tag ${{ vars.IMAGE_NAME }} {} --force || true&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Push image to DigitalOcean Container Registry&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;docker push ${{ vars.REGISTRY}}/${{ vars.IMAGE_NAME}}:${{ steps.short-sha.outputs.sha }}&lt;/span&gt;

&lt;span class="c1"&gt;#delete leftovers from the cleaned CR&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Garbage collect registry&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;doctl registry garbage-collection start --include-untagged-manifests --force&lt;/span&gt;

  &lt;span class="na"&gt;deploy&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;needs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;build_and_push&lt;/span&gt;

    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Test SSH connection&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;appleboy/ssh-action@v0.1.6&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;host&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.HOST }}&lt;/span&gt;
          &lt;span class="na"&gt;username&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.USERNAME }}&lt;/span&gt;
          &lt;span class="na"&gt;key&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.SSHKEY }}&lt;/span&gt;
          &lt;span class="na"&gt;port&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;22&lt;/span&gt;
          &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
              &lt;span class="s"&gt;whoami&lt;/span&gt;
              &lt;span class="s"&gt;hostname&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Deploy to Digital Ocean droplet via SSH action&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;appleboy/ssh-action@v0.1.6&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;host&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.HOST}}&lt;/span&gt;
          &lt;span class="na"&gt;username&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.USERNAME}}&lt;/span&gt;
          &lt;span class="na"&gt;key&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.SSHKEY}}&lt;/span&gt;
          &lt;span class="na"&gt;port&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;22&lt;/span&gt;
          &lt;span class="na"&gt;script&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
            &lt;span class="s"&gt;# Login to registry&lt;/span&gt;
            &lt;span class="s"&gt;#docker login -u ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }} -p ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }} registry.digitalocean.com&lt;/span&gt;
            &lt;span class="s"&gt;echo "${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }}" | docker login registry.digitalocean.com -u ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }} --password-stdin&lt;/span&gt;
            &lt;span class="s"&gt;# Stop running container&lt;/span&gt;
            &lt;span class="s"&gt;docker stop ${{ vars.IMAGE_NAME }} || true&lt;/span&gt;
            &lt;span class="s"&gt;# Remove old container&lt;/span&gt;
            &lt;span class="s"&gt;docker rm ${{ vars.IMAGE_NAME}} || true&lt;/span&gt;
            &lt;span class="s"&gt;# Pull the new image&lt;/span&gt;
            &lt;span class="s"&gt;docker pull ${{ vars.REGISTRY }}/${{ vars.IMAGE_NAME }}:${{ needs.build_and_push.outputs.sha }}&lt;/span&gt;
            &lt;span class="s"&gt;# Run a new container from a new image&lt;/span&gt;
            &lt;span class="s"&gt;docker run -d \&lt;/span&gt;
            &lt;span class="s"&gt;--restart always \&lt;/span&gt;
            &lt;span class="s"&gt;--name ${{ vars.IMAGE_NAME}} \&lt;/span&gt;
            &lt;span class="s"&gt;-p 8080:80 \&lt;/span&gt;
            &lt;span class="s"&gt;${{ vars.REGISTRY}}/${{ vars.IMAGE_NAME}}:${{ needs.build_and_push.outputs.sha }}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I strongly recommend AVOIDING copying and pasting this code directly into your project. There are elements you will need to adapt, such as the branch name and the Dockerfile folder path inside your VM.&lt;/p&gt;

&lt;p&gt;Before committing and pushing your project, you must add your project variables to GitHub: &lt;/p&gt;

&lt;p&gt;2.a) Adding the variables: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;em&gt;Go to your repository -&amp;gt; Settings -&amp;gt; Secrets and variables -&amp;gt; Actions&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F35rh8lg7n2xntct5x1lw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F35rh8lg7n2xntct5x1lw.png" alt=" " width="800" height="412"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After that, add the corresponding information:&lt;/p&gt;

&lt;p&gt;Secrets tab: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;DIGITALOCEAN_ACCESS_TOKEN&lt;/strong&gt;: &lt;a href="https://www.youtube.com/watch?v=GikLmButNyQ" rel="noopener noreferrer"&gt;Token generated during the Container Registry setup&lt;/a&gt;. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;HOST&lt;/strong&gt;: The IPv4 address of your virtual machine&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;USERNAME&lt;/strong&gt;: Username used to access your virtual machine (this is likely &lt;em&gt;root&lt;/em&gt; if you are using DigitalOcean).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SSHKEY&lt;/strong&gt;: This part was tricky because my machine did not allow an RSA key. Therefore, based on &lt;a href="https://github.com/appleboy/ssh-action/tree/master?tab=readme-ov-file#-ssh-key-setup--openssh-compatibility" rel="noopener noreferrer"&gt;Appleboy/ssh-action&lt;/a&gt;, I generated an &lt;strong&gt;ED25519 Key&lt;/strong&gt;. Using this version, I didn't need to adjust any permissions, though this can often be a blocker depending on your specific case. Once the keys were created, I copied the Public Key (the one with .pub) into &lt;code&gt;authorized_keys&lt;/code&gt; inside my VM, and I copied the Private Key into the SSHKEY variable in GitHub.&lt;/li&gt;
&lt;/ul&gt;
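&lt;p&gt;For reference, the key generation step can be sketched as below. The directory and comment string are placeholder choices of mine, not required names; the empty passphrase is what a non-interactive Actions run needs.&lt;/p&gt;

```shell
# Generate a dedicated ED25519 key pair for the deploy connection.
# ./deploy-keys and the comment are placeholder choices; -N "" means no passphrase.
mkdir -p ./deploy-keys
ssh-keygen -t ed25519 -N "" -C "github-actions-deploy" -f ./deploy-keys/id_ed25519
ls ./deploy-keys
# id_ed25519      -> paste the private key into the SSHKEY secret on GitHub
# id_ed25519.pub  -> append this line to authorized_keys on the droplet
```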

&lt;p&gt;Additionally, based on this &lt;a href="https://stackoverflow.com/questions/78640406/github-actions-ssh-authentication-failure-ssh-handshake-failed-ssh-unable-t" rel="noopener noreferrer"&gt;Stack Overflow answer&lt;/a&gt;, I had to update my action version from &lt;em&gt;appleboy/ssh-action@v0.1.3&lt;/em&gt; to &lt;em&gt;appleboy/ssh-action@v0.1.6&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Variable tab:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;IMAGE_NAME&lt;/strong&gt;: The name given to the image&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;REGISTRY&lt;/strong&gt;: The DigitalOcean Container Registry URL generated by the platform. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;After completing these steps, I just needed to commit the .yml file and we were done. Since the CI/CD is triggered by pushes to the repository, the workflow will automatically start the pipeline. You can monitor the progress in the &lt;strong&gt;Actions&lt;/strong&gt; tab of your repository.&lt;/p&gt;

&lt;p&gt;I feel it’s important to share that I tried using AI to debug my code, but I actually lost more time than if I had just searched for the solution online. I truly recommend avoiding AI for debugging in this specific case.&lt;/p&gt;

</description>
      <category>githubactions</category>
      <category>github</category>
      <category>docker</category>
      <category>droplet</category>
    </item>
    <item>
      <title>SAP Datasphere and Python</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Wed, 04 Feb 2026 08:53:41 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/sap-datasphere-and-python-2024</link>
      <guid>https://dev.to/dentrodailha96/sap-datasphere-and-python-2024</guid>
      <description>&lt;p&gt;This will be a short post, but given the challenges from SAP, I do believe that is important for those that works with SAP.&lt;/p&gt;

&lt;p&gt;In recent years, SAP released the &lt;strong&gt;Business Central (SAP BC)&lt;/strong&gt; and Datasphere, which follow a similar philosophy to Microsoft Fabric (a comparison I like to use given my Azure experience). The concept is to integrate all sources through built-in APIs or integrations within the SAP Datasphere, process the information, and then deliver that data into SAP Analytics Cloud.&lt;/p&gt;

&lt;p&gt;The system itself is quite good when handling SAP-native data. In my experience, working with SAP products has always been fairly challenging, and the data is notoriously hard to extract. However, with this integrated approach, the entire process has become much smoother.&lt;/p&gt;

&lt;p&gt;In the beginning, I relied mainly on views and minor processing within dataflows. But once I tried the &lt;strong&gt;Python script task&lt;/strong&gt;, it was a game-changer!&lt;/p&gt;

&lt;p&gt;Obviously, it is somewhat limited because you are only allowed to use the &lt;strong&gt;NumPy and Pandas&lt;/strong&gt; libraries, alongside basic Python modules. Even so, this allowed me to move all my processing from SQL views directly into the dataflow. This enabled batching and parallel execution, making data loading significantly faster than using standard views.&lt;/p&gt;

&lt;h2&gt;
  
  
  But how to use Python Script from Datasphere?
&lt;/h2&gt;

&lt;p&gt;The trick is: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjrnqpo1keb33lxt0u36a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjrnqpo1keb33lxt0u36a.png" alt=" " width="800" height="232"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Understand that this is the main function that will process the data. Once the data is there, you can use pandas DataFrame logic with &lt;strong&gt;apply and lambda&lt;/strong&gt; to apply the processing to each row. &lt;/p&gt;
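&lt;p&gt;As a minimal sketch, assuming the usual script-operator shape where Datasphere hands the incoming rows to a function as a pandas DataFrame and expects a DataFrame back (the column names here are invented for illustration):&lt;/p&gt;

```python
import pandas as pd

# Datasphere calls this function with the incoming rows as a pandas DataFrame
# and expects the outgoing DataFrame back. NET_PRICE, QUANTITY and LINE_TOTAL
# are invented example columns.
def transform(data):
    # derive a new column row by row with apply + lambda
    data["LINE_TOTAL"] = data.apply(
        lambda row: row["NET_PRICE"] * row["QUANTITY"], axis=1
    )
    return data

# Local stand-in for the incoming data, just to exercise the function:
df = transform(pd.DataFrame({"NET_PRICE": [10.0, 2.5], "QUANTITY": [3, 4]}))
print(df["LINE_TOTAL"].tolist())  # [30.0, 10.0]
```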

&lt;p&gt;It is possible to add multiple functions inside this bigger function. This &lt;a href="https://community.sap.com/t5/technology-blog-posts-by-members/sap-datasphere-data-flow-scripts-and-generic-odata-unpacking-nested-values/ba-p/14085615" rel="noopener noreferrer"&gt;SAP Community post&lt;/a&gt; explains the process really well.&lt;/p&gt;

&lt;h2&gt;
  
  
  Tips for creating and debugging:
&lt;/h2&gt;

&lt;p&gt;1) It's really important to declare the columns you are creating in the Python Script Task &lt;strong&gt;AFTER&lt;/strong&gt; the incoming columns, and &lt;strong&gt;IN THE ORDER&lt;/strong&gt; in which they are created:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7xyltyx71uktjcvd8et.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7xyltyx71uktjcvd8et.png" alt=" " width="320" height="260"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg29lmof8uddoqgwge8e6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg29lmof8uddoqgwge8e6.png" alt=" " width="800" height="205"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;(Example from CLTGravesen)&lt;/p&gt;

&lt;p&gt;2) You are only able to visualize the incoming data and the output data. For this reason, I recommend creating a Projection task to filter the data. &lt;/p&gt;

&lt;p&gt;3) It's important to pay close attention to the data formats: if the incoming data differs in any way from the target table, it won't load. &lt;/p&gt;

&lt;p&gt;I hope this helps you in your next experience with SAP Datasphere. &lt;/p&gt;

</description>
      <category>sap</category>
      <category>python</category>
      <category>datasphere</category>
    </item>
    <item>
      <title>Creating an Ubuntu Droplet and connecting it to a local machine</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Tue, 03 Feb 2026 09:07:11 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/learning-creating-a-ubuntu-droplet-4o7g</link>
      <guid>https://dev.to/dentrodailha96/learning-creating-a-ubuntu-droplet-4o7g</guid>
      <description>&lt;p&gt;The next step of my &lt;a href="https://github.com/dentrodailha96/arimura_cj" rel="noopener noreferrer"&gt;Sushi Project&lt;/a&gt; was finding a cloud provider. Since I’m based in Europe and my customer is in Brazil, I needed a solution that guaranteed 24/7 uptime across regions.&lt;/p&gt;

&lt;p&gt;We decided to move forward with a DigitalOcean (DO) Ubuntu Droplet.&lt;/p&gt;

&lt;p&gt;To maximize efficiency, I needed to connect my local machine to the remote server. The first step was adding my SSH Key to the VM to ensure a secure, seamless connection.&lt;/p&gt;

&lt;h2&gt;
  
  
  But how to create an SSH key in Ubuntu?
&lt;/h2&gt;

&lt;p&gt;This &lt;a href="https://www.digitalocean.com/community/tutorials/how-to-configure-ssh-key-based-authentication-on-a-linux-server#step-1-creating-ssh-keys" rel="noopener noreferrer"&gt;tutorial&lt;/a&gt;, together with this &lt;a href="https://bizanosa.com/ubuntu-22-04-initial-server-setup-vultr/" rel="noopener noreferrer"&gt;blog post&lt;/a&gt;, shows how to create an SSH key on your local computer. &lt;/p&gt;

&lt;p&gt;One great tip I picked up from the &lt;a href="https://bizanosa.com/ubuntu-22-04-initial-server-setup-vultr/" rel="noopener noreferrer"&gt;Bizanosa&lt;/a&gt; blog was to create specific directories before generating SSH keys. This approach allows you to manage independent keys for different projects.&lt;/p&gt;

&lt;p&gt;I highly recommend this because as you juggle different environments, you may need to delete and recreate keys multiple times. By isolating them in their own folders, you gain a bigger margin for error—you can wipe a specific setup without accidentally breaking your other connections.&lt;/p&gt;

&lt;p&gt;Once the SSH key is created, the next step is to spin up a droplet in DigitalOcean. While the droplet creation process in DO is extremely intuitive, connecting your local machine to the VM can be a bit tricky if you only follow the standard documentation.&lt;/p&gt;

&lt;p&gt;To bridge that gap, I followed this &lt;a href="https://www.youtube.com/watch?v=DLy14Qit_uE" rel="noopener noreferrer"&gt;video tutorial by Bizanosa&lt;/a&gt;. It provided a much clearer walkthrough on how to generate the key and use the -i flag to specify the private key identity when logging in for the first time.&lt;/p&gt;

&lt;p&gt;During my SSH Key connection I had this error: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Fail to copy due to error: port 22: Connection refused &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This happened because I didn't have an SSH server running on my local machine, so it could not accept the connection. In this &lt;a href="https://askubuntu.com/questions/218344/why-am-i-getting-a-port-22-connection-refused-error" rel="noopener noreferrer"&gt;forum thread&lt;/a&gt; the error is described in detail and several solutions are offered. &lt;/p&gt;

&lt;p&gt;I did the "sudo ufw allow 22", adding firewall rules but not activate right away. Then, I needed to manually activate them, by running: &lt;/p&gt;

&lt;p&gt;&lt;code&gt;sudo systemctl start ssh  &lt;br&gt;
sudo systemctl enable ssh&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;After it, I could validate that it is running by:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;sudo netstat -anp | grep sshd&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Now, I am able to copy it to my droplet: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;ssh-copy-id -i {ssh key path in your local computer} {user}@{remote host} &lt;br&gt;
or &lt;br&gt;
ssh-copy-id -i {ssh key path in your local computer} {user}@{remote IP}&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Once copied, I was able to connect to the VM by running this command: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;ssh root@{remote IP} -i {ssh key local path}&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;BONUS:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you would like to confirm that the process was successful, you can open the &lt;code&gt;authorized_keys&lt;/code&gt; file and check whether your local public key is there.&lt;/p&gt;

</description>
      <category>ubuntu</category>
      <category>digitalocean</category>
      <category>cloud</category>
      <category>ssh</category>
    </item>
    <item>
      <title>Create a Docker container</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Sun, 25 Jan 2026 20:10:36 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/leanings-docker-and-it-perks-33if</link>
      <guid>https://dev.to/dentrodailha96/leanings-docker-and-it-perks-33if</guid>
      <description>&lt;p&gt;After a long year of trying to dive into new topics and some failed certification attempts, I figured out that I needed more practice and motivating hands-on projects. Therefore, I decided to support a friend by building an application for his business. &lt;/p&gt;

&lt;p&gt;Given that I had the power to choose all the steps and the structure of the project, I decided to invest a bit in my Cloud Engineering skills.&lt;/p&gt;

&lt;p&gt;The project will be described here: &lt;a href="https://github.com/dentrodailha96/arimura_cj" rel="noopener noreferrer"&gt;Sushi Project&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The project will use a DigitalOcean Ubuntu Droplet (DO). To transport everything developed locally, I decided to create a Docker container and use GitHub Actions to transfer it to DO.&lt;/p&gt;

&lt;p&gt;From this, I have some insights about the Docker workflow that might help you in the future.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installation via command line:
&lt;/h2&gt;

&lt;p&gt;I am using Ubuntu, and I prefer to install everything via the command line. It gives me a better understanding of what is being done and—low key—helps me avoid breaking my system again.&lt;/p&gt;

&lt;p&gt;I used the links [1] and [2] to install Docker and understand the process step-by-step. Below, I have documented the code from link [1] along with an explanation of each step:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Add Docker's official GPG key:
# update the package index
sudo apt update
# install helper packages that support the installation
sudo apt install ca-certificates curl
# create a keyrings directory to hold the trusted gpg keys (a docker requirement)
sudo install -m 0755 -d /etc/apt/keyrings 
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
# make the .asc key file readable by all users
sudo chmod a+r /etc/apt/keyrings/docker.asc

# Add the repository to Apt sources:
sudo tee /etc/apt/sources.list.d/docker.sources &amp;lt;&amp;lt;EOF
Types: deb
URIs: https://download.docker.com/linux/ubuntu
Suites: $(. /etc/os-release &amp;amp;&amp;amp; echo "${UBUNTU_CODENAME:-$VERSION_CODENAME}")
Components: stable
Signed-By: /etc/apt/keyrings/docker.asc
EOF
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Following these steps, I could then install Docker as the tutorial describes and test it. &lt;/p&gt;

&lt;p&gt;After understanding the code better, I was able to understand the software itself better. &lt;/p&gt;

&lt;h2&gt;
  
  
  But what is Docker?
&lt;/h2&gt;

&lt;p&gt;I’ll be honest: in the beginning, Docker sounded pretty confusing to me. I often mixed it up with GitHub concepts (I know, pretty dumb!) and didn't really understand the purpose. However, this &lt;a href="https://aws.plainenglish.io/docker-explained-simply-for-a-10-year-old-the-magic-box-for-computer-programs-94452b930d6b" rel="noopener noreferrer"&gt;blog post&lt;/a&gt; gave me a really good explanation.&lt;/p&gt;

&lt;p&gt;My overall takeaway is this: Docker is a place where we can put all our code along with every installation needed for it to run smoothly anywhere. Just like a local project environment, we are able to transfer these installations online so that everyone can run the project exactly the same way on their own computer.&lt;/p&gt;

&lt;p&gt;Amazing, don’t you think? However, as a fairly new "command line girl," I struggled a bit to create my first Docker container in Ubuntu.&lt;/p&gt;

&lt;h2&gt;
  
  
  So, here are some tips to make your life easier:
&lt;/h2&gt;

&lt;p&gt;1) Basic concepts: understand the basic concepts of Docker.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Dockerfile: the blueprint used to create an image.&lt;/li&gt;
&lt;li&gt;Docker image: the template for running Docker containers.&lt;/li&gt;
&lt;li&gt;Container: a running process based on an image.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In other words: the &lt;strong&gt;Dockerfile&lt;/strong&gt; is the recipe, the &lt;strong&gt;Image&lt;/strong&gt; is the prepared dough, and the &lt;strong&gt;Container&lt;/strong&gt; is the actual cookie after it’s been baked!&lt;/p&gt;

&lt;p&gt;This &lt;a href="https://www.youtube.com/watch?v=gAkwW2tuIqE" rel="noopener noreferrer"&gt;video&lt;/a&gt; explains this very well. :) &lt;/p&gt;
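&lt;p&gt;To make the blueprint idea concrete, here is a minimal, hypothetical Dockerfile sketch for a Python app (app.py and requirements.txt are placeholder file names):&lt;/p&gt;

```dockerfile
# Blueprint: each instruction becomes a layer of the resulting image
FROM python:3.12-slim
WORKDIR /app
COPY . .
RUN pip install -r requirements.txt
CMD ["python", "app.py"]
```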

&lt;p&gt;2) Use the commands; this helps you understand them better:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker build -t  {image_name} .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command builds your image. The -t flag stands for tag; it lets you give the image a human-readable name.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker images
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Returns all the images you have built on your computer. This is very useful for getting an image ID.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker ps / docker ps -a
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Returns all the containers currently running. The second command also returns containers that have already exited and shows when they stopped. It's a good way to get a container's ID.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -d -p host_port:container_port {container}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command runs your container. The -d (detached) flag runs it in the background, which is really useful for debugging the container while it is running. The -p (publish) flag allows external traffic to reach the application: host_port:container_port maps a port on the host to a port inside the container (for example, -p 8080:80 forwards host port 8080 to container port 80).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker exec {container} ls -la {WORKDIR path}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command is really useful for confirming that all the files were copied correctly from my machine into the Docker container. It really helps me debug and verify my Dockerfile. Note that you can only run this command while the container is running (for example, started in the background with -d).&lt;/p&gt;

&lt;p&gt;3) It is essential to understand what is written in your Dockerfile and docker-compose.yml. These files act as the structural blueprint and the step-by-step instructions for running your container. Getting the structure and software versions right is the secret to making sure your container runs correctly every time.&lt;/p&gt;

&lt;p&gt;If you are still feeling a little bit confused about how they differ, here is a great &lt;a href="https://medium.com/@ShantKhayalian/docker-vs-docker-compose-simple-and-fun-explanation-4811582127f7" rel="noopener noreferrer"&gt;post explaining the difference between these two&lt;/a&gt;.&lt;/p&gt;
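&lt;p&gt;As a tiny illustration of how the two relate, a hypothetical docker-compose.yml that builds the image from the local Dockerfile and publishes a port might look like this:&lt;/p&gt;

```yaml
# Hypothetical minimal compose file: builds the image from the local
# Dockerfile and maps host port 8080 to container port 80.
services:
  web:
    build: .
    ports:
      - "8080:80"
```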

&lt;p&gt;I hope this blog post can help you. :) &lt;/p&gt;

&lt;p&gt;Sources: &lt;br&gt;
[1] &lt;a href="https://docs.docker.com/engine/install/ubuntu/#install-using-the-repository" rel="noopener noreferrer"&gt;https://docs.docker.com/engine/install/ubuntu/#install-using-the-repository&lt;/a&gt;&lt;br&gt;
[2] &lt;a href="https://www.youtube.com/watch?v=tjqd1Fxo6HQ" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=tjqd1Fxo6HQ&lt;/a&gt;&lt;br&gt;
[3] &lt;a href="https://aws.plainenglish.io/docker-explained-simply-for-a-10-year-old-the-magic-box-for-computer-programs-94452b930d6b" rel="noopener noreferrer"&gt;https://aws.plainenglish.io/docker-explained-simply-for-a-10-year-old-the-magic-box-for-computer-programs-94452b930d6b&lt;/a&gt;&lt;br&gt;
[4] &lt;a href="https://www.youtube.com/watch?v=gAkwW2tuIqE" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=gAkwW2tuIqE&lt;/a&gt;&lt;/p&gt;

</description>
      <category>docker</category>
      <category>container</category>
      <category>ubuntu</category>
    </item>
    <item>
      <title>Basic Ubuntu commands</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Wed, 07 Jan 2026 21:45:10 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/basic-ubuntu-commands-1i98</link>
      <guid>https://dev.to/dentrodailha96/basic-ubuntu-commands-1i98</guid>
      <description>&lt;p&gt;After three months using Ubuntu and many ups and downs, I realized that there are some commands that are a "must-know" if I would like to embrace this technology. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1) sudo:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;This command is all over the place. It stands for "superuser do" and allows you to execute commands with the admin privileges of the root user. We use it for system maintenance, installing software, or modifying restricted files. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;2) nano:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A command used to open and edit files directly in the terminal.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;3) cd:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Used to change directories. &lt;/li&gt;
&lt;li&gt;If you type only "cd" you will be sent to the home directory. However, if you type "cd {directory path}", it will open the chosen directory. To move up to the parent directory you can type "cd ..", and "cd -" takes you back to the previous directory you were in.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;4) ls:&lt;/strong&gt; &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Lists the content of the current directory. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;5) mkdir:&lt;/strong&gt; &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;With this command it is possible to create directories.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;6) chown:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Who is trusted: changes the owner of a file or directory.&lt;/li&gt;
&lt;li&gt;It's possible to define the ownership for a group of Users. &lt;a href="https://www.ituonline.com/blogs/chown-vs-chmod/#:~:text=While%20chmod%20allows%20you%20to,to%20another%20user%20or%20group." rel="noopener noreferrer"&gt;This is relevant in a multi-user environment&lt;/a&gt;. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;7) chmod:&lt;/strong&gt; &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Who is allowed.&lt;/li&gt;
&lt;li&gt;Change access permissions (read, write, execute) for files and directories.&lt;/li&gt;
&lt;li&gt;Permission codes can be found &lt;a href="http://wiki.kartbuilding.net/chmod#:~:text=read%2C%20write%2C%20execute-,Common%20Permissions%20are%3A,execute%20permission%20to%20the%20user." rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
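&lt;p&gt;A quick chmod example (the file name is just a placeholder):&lt;/p&gt;

```shell
# Create a file and restrict it: owner read+write, group read, others nothing
touch report.txt
chmod 640 report.txt
ls -l report.txt   # the permissions column starts with -rw-r-----
```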

</description>
      <category>ubuntu</category>
    </item>
    <item>
      <title>Newbies in Ubuntu</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Wed, 15 Oct 2025 16:09:28 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/newbies-in-ubuntu-189m</link>
      <guid>https://dev.to/dentrodailha96/newbies-in-ubuntu-189m</guid>
      <description>&lt;p&gt;My 7-year-old Lenovo was reaching the end of its life. While the processor and memory were still fairly capable, it was no longer able to effectively handle Windows.&lt;/p&gt;

&lt;p&gt;Given this situation, and my interest in learning about Kubernetes, DevOps, and other cloud concepts, I decided to fully wipe and reboot my computer with Linux. I hadn't anticipated how customizable this experience would be. Here are some key learnings from my first week with Ubuntu:&lt;/p&gt;

&lt;p&gt;1) &lt;strong&gt;Microphone Configuration and Persistence:&lt;/strong&gt; The microphone was not configured correctly out of the box; part of the issue was the lack of noise cancellation. To resolve this, I first installed &lt;a href="https://en.ubunlog.com/how-to-configure-pulseaudio-in-ubuntu/" rel="noopener noreferrer"&gt;PulseAudio&lt;/a&gt;, then made several feature modifications within &lt;a href="https://www.informaticar.net/enable-noise-cancellation-in-ubuntu/" rel="noopener noreferrer"&gt;PulseAudio&lt;/a&gt;. However, I struggled to make these changes persist across reboots. I finally used the solution from &lt;a href="https://askubuntu.com/questions/962920/how-do-i-get-pulseaudio-to-start-automatically-in-ubuntu-17-04-and-17-10-for-fir" rel="noopener noreferrer"&gt;this Ask Ubuntu answer&lt;/a&gt; and executed the following command to ensure the changes "stuck":&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;echo 'pulseaudio --start' &amp;gt;&amp;gt; ~/.profile
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;2) &lt;strong&gt;Managing Python Virtual Environments:&lt;/strong&gt; Another essential command I learned early on was how to create and manage virtual environments. The first answer from &lt;a href="https://askubuntu.com/questions/1328392/how-to-activate-a-virtual-environment-in-ubuntu" rel="noopener noreferrer"&gt;this stack&lt;/a&gt; has been a life saver for my work so far.&lt;/p&gt;
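&lt;p&gt;For context, creating and activating an environment in the first place looks roughly like this (assuming python3 is installed; the directory name .env matches the delete command, though .venv is also common):&lt;/p&gt;

```shell
# Create a virtual environment in the .env directory
python3 -m venv .env
# Activate it (the shell prompt usually changes)
. .env/bin/activate
# ...install packages and work inside the environment...
deactivate
```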

&lt;ul&gt;
&lt;li&gt;Delete environment:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rm -rf .env
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;3) &lt;strong&gt;Keep Ubuntu updated to avoid crashing:&lt;/strong&gt; Talking with some experienced Ubuntu users, I learned that it is recommended to keep the system updated. They suggest running the code below every day (ideally) or at least every week:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt-get update; sudo apt-get full-upgrade
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;One alternative is to put these commands in a text file and then convert it into an executable file. To do it:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open your terminal and create a file:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;nano up.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;After confirming that the file was created (with the update commands inside), run the step below to make it executable:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo chmod a+x up.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Run the executable file:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo (path file) ./up.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
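&lt;p&gt;For reference, the file itself just needs the update commands. Creating it straight from the shell (instead of editing it in nano) could look like this:&lt;/p&gt;

```shell
# Write the update commands into up.sh, then make it executable
printf '%s\n' '#!/bin/sh' 'sudo apt-get update' 'sudo apt-get full-upgrade' > up.sh
chmod a+x up.sh
cat up.sh
```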



&lt;p&gt;4) &lt;strong&gt;Tools to help me learn:&lt;/strong&gt; One good practice I picked up at work is to diagram and document the things I learn. My favorite tools for this are &lt;a href="https://snapcraft.io/install/drawio/ubuntu#install" rel="noopener noreferrer"&gt;Draw IO&lt;/a&gt; and &lt;a href="https://askubuntu.com/questions/1439878/how-to-install-obsidian-1-0-3-on-ubuntu-22-04" rel="noopener noreferrer"&gt;Obsidian&lt;/a&gt;. One great tip for Obsidian is to integrate it with Git, which makes it possible to use your notes on any computer once you clone the repository. &lt;/p&gt;

&lt;p&gt;5) &lt;strong&gt;Git push&lt;/strong&gt;: Since I started working on Linux, I am back to using GitHub more, mainly from the command line. &lt;a href="https://www.techielass.com/convert-a-folder-to-a-git-repository/" rel="noopener noreferrer"&gt;This tutorial&lt;/a&gt; shows how to convert a folder to a repository.&lt;/p&gt;

&lt;p&gt;6) &lt;strong&gt;Find IP Address:&lt;/strong&gt; I found that some Ubuntu installations do not show the IP address in a straightforward way. To fix this, we can install the following tool:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt install net-tools 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After installing, you can run 'ifconfig'; the output includes the inet field, which is your IP address.&lt;/p&gt;

&lt;p&gt;Note that ifconfig is deprecated, so it's also possible to use 'ip addr show'; to understand the output, I &lt;a href="https://samuel-ricky.medium.com/how-to-interpret-the-output-of-ip-addr-show-8008c7c41dde" rel="noopener noreferrer"&gt;recommend reading this blog post&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;BONUS) &lt;strong&gt;MOST IMPORTANT TIP&lt;/strong&gt;: An update message often pops up automatically in Ubuntu (as seen in the image below). I strongly recommend not accepting these updates; in my experience, accepting them has broken my installation multiple times and forced me to reinstall Ubuntu from scratch. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmpnnjeh3eo2502o6mwwn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmpnnjeh3eo2502o6mwwn.png" alt=" " width="487" height="230"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ubuntu</category>
    </item>
    <item>
      <title>Finding S/4HANA Sample Data for Study purpose</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Mon, 13 Oct 2025 08:11:04 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/finding-s4hana-sample-data-for-study-purpose-1e8i</link>
      <guid>https://dev.to/dentrodailha96/finding-s4hana-sample-data-for-study-purpose-1e8i</guid>
      <description>&lt;p&gt;I have been learning SAP for a year and a half by now, and I can guarantee that it was a long journey until I started to sympathize with the tools and technology itself.&lt;/p&gt;

&lt;p&gt;Beyond getting access to the different tools (even paid versions), my biggest struggle was getting sample data from SAP that fits their system structure and complexity. &lt;/p&gt;

&lt;p&gt;Given this challenge, last Friday I had the opportunity to attend something called "SAP CodeJam". It's one of the events from the &lt;a href="https://community.sap.com/" rel="noopener noreferrer"&gt;SAP Community&lt;/a&gt;, and these events seem to happen all around the world. &lt;/p&gt;

&lt;p&gt;In this lesson, the expert in CAP (Cloud Application Programming model), &lt;a href="https://qmacro.org/" rel="noopener noreferrer"&gt;DJ Adams&lt;/a&gt;, presented a sample of how to do a CAP Service Integration. I will be honest with you: I haven't really mastered the SAP Business Application Studio, and even less the CAP Model from SAP. The highlight was that you can alternatively use Microsoft VS Code and Docker Desktop, which made the process way easier. &lt;/p&gt;

&lt;p&gt;Here is the Git repository of the CodeJam: &lt;a href="https://github.com/dentrodailha96/cap-service-integration-codejam" rel="noopener noreferrer"&gt;https://github.com/dentrodailha96/cap-service-integration-codejam&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you are interested in this process in particular, I really recommend checking it out!&lt;/p&gt;

&lt;p&gt;Now let's talk about SAP S/4HANA sample data. In this workshop, DJ Adams used data from an SAP Sandbox. In order to get the data, we accessed the &lt;a href="https://api.sap.com/" rel="noopener noreferrer"&gt;SAP Business Accelerator Hub&lt;/a&gt;, an SAP platform that combines different SAP products and provides their respective APIs to access the data. &lt;/p&gt;

&lt;p&gt;In the case presented, we used the Business Partner (A2X) data, and in exercise 3 of the repository, DJ Adams shows how to inspect the model to better understand which data we are working with. &lt;/p&gt;

&lt;h3&gt;
  
  
  So, how to get the data?
&lt;/h3&gt;

&lt;p&gt;I wrote some Python code that fetches the API response and parses it with the xml.etree library:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;#import libraries o interest
&lt;/span&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;pandas&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;pd&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;xml.etree.ElementTree&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;ET&lt;/span&gt;

&lt;span class="c1"&gt;# Get the URL in SAP Business Accelerator Hub Request
&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://sandbox.api.sap.com/s4hanacloud/sap/opu/odata/sap/API_BUSINESS_PARTNER/A_BusinessPartner?$top=50&amp;amp;$inlinecount=allpages&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="c1"&gt;# Get your personal API Key
&lt;/span&gt;&lt;span class="n"&gt;headers&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;APIKey&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{key from site}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;root&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ET&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fromstring&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;ns&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;atom&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;http://www.w3.org/2005/Atom&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;m&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;http://schemas.microsoft.com/ado/2007/08/dataservices/metadata&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;d&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;http://schemas.microsoft.com/ado/2007/08/dataservices&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="n"&gt;entries&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;entry&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;root&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;findall&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;atom:entry&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ns&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;props&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;entry&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;find&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;atom:content/m:properties&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ns&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;props&lt;/span&gt; &lt;span class="ow"&gt;is&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;bp&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;props&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;find&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;d:BusinessPartner&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ns&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;props&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;find&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;d:BusinessPartnerFullName&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ns&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nb"&gt;hash&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;props&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;find&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;d:BusinessPartnerUUID&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ns&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;entries&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;BusinessPartner&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;bp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;bp&lt;/span&gt; &lt;span class="ow"&gt;is&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;BusinessPartnerFullName&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="ow"&gt;is&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;BusinessPartnerUUID&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;hash&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="ow"&gt;is&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;
        &lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
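&lt;p&gt;The snippet above stops at building the entries list; turning it into a DataFrame is one more line, using the pandas import that is already there. A sketch with hypothetical sample values:&lt;/p&gt;

```python
import pandas as pd

# Hypothetical entries, shaped like those parsed from the OData feed above
entries = [
    {"BusinessPartner": "1000001",
     "BusinessPartnerFullName": "Example Partner",
     "BusinessPartnerUUID": "0894ef30-0000-0000-0000-000000000000"},
]

# One row per business partner, one column per parsed field
df = pd.DataFrame(entries)
print(df.shape)
```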



&lt;p&gt;After this step, you can load the entries into a DataFrame and play around just like with any other dataset. &lt;/p&gt;

&lt;p&gt;It's important to notice that not all fields are filled with information, but the main key fields have some connection, enabling the creation of a model. &lt;/p&gt;

</description>
      <category>sap</category>
      <category>cap</category>
    </item>
    <item>
      <title>Overthewire - Level 13</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Sun, 13 Jul 2025 09:25:49 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/overthewire-level-13-4691</link>
      <guid>https://dev.to/dentrodailha96/overthewire-level-13-4691</guid>
      <description>&lt;p&gt;The level 13, is not that intuitive as thought. &lt;/p&gt;

&lt;p&gt;We are going to use SSH keys, so below is a short explanation of the concept.  &lt;/p&gt;

&lt;p&gt;SSH Keys: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;They are a secure way to authenticate with remote servers, replacing the need for passwords.&lt;/li&gt;
&lt;li&gt;They come in pairs: a public key (shareable) and a private key. &lt;/li&gt;
&lt;li&gt;These keys are used to connect a local machine to remote servers; the machine proves your identity using the SSH key to gain access. Once authenticated, all data is encrypted and sent through the secure tunnel created between the machines. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With this concept in mind, we need to connect to the SSH server with the key. &lt;/p&gt;

&lt;p&gt;1) Connect to level 13, then run ls to find all the files in this directory.&lt;br&gt;
2) Once you have found the key, connect to the SSH server with it: &lt;/p&gt;

&lt;p&gt;&lt;code&gt;ssh -i [file] -p 2220 [SSH Server]@localhost&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;ssh = secure shell command&lt;br&gt;
-i [file] = identity file (the private key to use)&lt;br&gt;
-p 2220 = port to connect on &lt;br&gt;
[SSH Server] = the user to connect as on the remote machine (here, via localhost) &lt;/p&gt;

&lt;p&gt;3) This will automatically connect us to level 14. Once inside level 14, we need to cat the path '/etc/bandit_pass/bandit14' to find its password. &lt;/p&gt;

&lt;p&gt;I hope this helped you :) &lt;/p&gt;

</description>
    </item>
    <item>
      <title>OvertheWire - Learning Linux</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Sun, 29 Jun 2025 08:25:15 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/overthewire-learning-linux-5c80</link>
      <guid>https://dev.to/dentrodailha96/overthewire-learning-linux-5c80</guid>
      <description>&lt;p&gt;Hi, &lt;/p&gt;

&lt;p&gt;Given the current economic crisis, I decided to learn Linux to open up new opportunities. After some research, I noticed that "&lt;a href="https://overthewire.org/wargames/bandit/bandit14.html" rel="noopener noreferrer"&gt;Over the wire&lt;/a&gt;" is a great site to start exploring Linux and understanding the prompt from its perspective. &lt;/p&gt;

&lt;p&gt;Moreover, I don't have a portfolio yet, and I definitely need to improve my STAR explanations. So, in parallel to my learning, I will share my experience here and on GitHub. &lt;/p&gt;

&lt;p&gt;Here, I won't describe the challenges, but I would like to share links, terminal commands and steps that helped me understand how to complete each level. So, &lt;em&gt;los geht's!&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Level 1 to 12:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Level 0: 
In this level you just need to connect to the server. It is important to understand the logic of connecting to the server: &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;ssh {username}@{host} -p {port}&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Level 1:&lt;/strong&gt; &lt;br&gt;
Relevant commands:&lt;/p&gt;

&lt;p&gt;cat: Open file&lt;br&gt;
copy: ctrl + shift + C&lt;br&gt;
paste: ctrl + shift + v&lt;/p&gt;

&lt;p&gt;** &lt;a href="https://medium.com/@.Qubit/how-to-create-open-find-remove-dashed-filename-in-linux-27ee297d1740" rel="noopener noreferrer"&gt;Important: if the file is called "-", to open it we must use "cat &amp;lt; -" or "cat ./-" &lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Level 2:&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;To read files with spaces in the name, we must type the name of the file between quotes. Example: 'file name'&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Level 3:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;List all the files of a directory, including hidden ones: ls -a&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Level 4:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;du: returns the size of every subdirectory inside of a directory.&lt;br&gt;
du -h: shows the sizes in a human-readable format. &lt;/p&gt;

&lt;p&gt;** To open a file inside a directory, that is inside another directory you must define the "directory/filename".&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Level 5:&lt;/strong&gt;&lt;br&gt;
ls -lh: detailed (long) listing of the files in a directory, with human-readable sizes.&lt;/p&gt;

&lt;p&gt;find . -maxdepth x -type f ! -executable -ls&lt;/p&gt;

&lt;p&gt;find: search for files&lt;br&gt;
. : the directory to start from (here, the current one)&lt;br&gt;
-maxdepth x : how deep to descend into subdirectories&lt;br&gt;
-type f: restrict to regular files&lt;br&gt;
! : negates the next test&lt;br&gt;
-executable : matches executable files (negated here by !)&lt;br&gt;
-ls: list each match in ls -l format&lt;/p&gt;
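
&lt;p&gt;A minimal sketch of this find pattern, with a made-up /tmp directory and files:&lt;/p&gt;

```shell
# Hypothetical demo: find non-executable regular files up to 2 levels deep.
mkdir -p /tmp/find-demo/sub
echo "data" | tee /tmp/find-demo/sub/notes.txt   # a plain, non-executable file
touch /tmp/find-demo/run.sh
chmod +x /tmp/find-demo/run.sh                   # an executable, to be excluded
find /tmp/find-demo -maxdepth 2 -type f ! -executable -ls
```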

&lt;p&gt;&lt;strong&gt;- Level 6:&lt;/strong&gt;&lt;br&gt;
It's possible to find a file by its size, owner, and group anywhere on the server. The command below makes this possible: &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;find / -type f -size 33c -user bandit7 -group bandit6
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;find: search for files&lt;br&gt;
/: start the search from the root directory&lt;br&gt;
-type f: restrict to regular files&lt;br&gt;
-size 33c: match files of exactly 33 bytes (the c suffix means bytes) &lt;br&gt;
-user bandit7: match files owned by user bandit7 &lt;br&gt;
-group bandit6: match files owned by group bandit6&lt;/p&gt;
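
&lt;p&gt;Here is a hedged sketch of the -size test on made-up files (the -user and -group tests filter the same way, but need those accounts to exist):&lt;/p&gt;

```shell
# Hypothetical demo of find's -size filter with an exact byte count.
mkdir -p /tmp/size-demo
printf '%033d' 0 | tee /tmp/size-demo/exactly33   # exactly 33 bytes, no newline
echo "too short" | tee /tmp/size-demo/other       # 10 bytes, will not match
find /tmp/size-demo -type f -size 33c             # matches only exactly33
```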

&lt;p&gt;&lt;strong&gt;- Level 7:&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;pwd: show the current directory's path&lt;br&gt;
To find a string in the files inside a directory:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; grep -rnw 'directory path' -e '&amp;lt;string&amp;gt;'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;grep: searches plain-text input for lines that match a pattern&lt;br&gt;
-rnw: recursive, show line numbers, match whole words only&lt;br&gt;
-e: the pattern to search for&lt;/p&gt;
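
&lt;p&gt;A small sketch of this grep pattern (the /tmp tree and the string are invented for illustration):&lt;/p&gt;

```shell
# Hypothetical demo: grep -rnw locating a word inside a directory tree.
mkdir -p /tmp/grep-demo/sub
echo "the password is hunter2" | tee /tmp/grep-demo/sub/notes.txt
echo "nothing here" | tee /tmp/grep-demo/readme.txt
grep -rnw /tmp/grep-demo -e 'password'   # output format is file:line:matching line
```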

&lt;p&gt;&lt;strong&gt;- Level 8:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In order to analyse repeated lines in a file, we must combine the commands "sort" and "uniq" (uniq only detects duplicates on adjacent lines, so the input must be sorted first). &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sort &amp;lt;file.txt&amp;gt; | uniq -u (this case shows only the unique strings)
sort &amp;lt;file.txt&amp;gt; | uniq -c (this case shows the strings and how many times they repeat)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
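
&lt;p&gt;The sort + uniq combination can be sketched like this (file name and contents are made up):&lt;/p&gt;

```shell
# Hypothetical demo: isolating the non-repeated line with sort + uniq.
printf 'alpha\nbeta\nalpha\ngamma\nbeta\n' | tee /tmp/uniq-demo.txt
sort /tmp/uniq-demo.txt | uniq -u   # only "gamma" appears exactly once
sort /tmp/uniq-demo.txt | uniq -c   # each line prefixed with its count
```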

&lt;p&gt;&lt;strong&gt;- Level 9:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To find a string inside a document:&lt;/p&gt;

&lt;p&gt;strings {document name.type} | grep -A1 '{string}'&lt;/p&gt;

&lt;p&gt;strings = extracts the printable strings from a (possibly binary) file&lt;br&gt;
 grep = searches its input for a specific text&lt;br&gt;
 -A1 = also print 1 line after each match&lt;/p&gt;
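
&lt;p&gt;A sketch of the -A1 behaviour on a made-up text file (strings is piped into grep the same way when the input is binary):&lt;/p&gt;

```shell
# Hypothetical demo of grep -A1: print the match plus the line after it.
printf 'header\nmarker line\nsecretvalue\nfooter\n' | tee /tmp/grepA-demo.txt
grep -A1 'marker' /tmp/grepA-demo.txt   # prints "marker line" and "secretvalue"
```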

&lt;p&gt;&lt;strong&gt;- Level 10:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Base64 = a binary-to-text encoding scheme that represents binary data as an ASCII string by translating it into a radix-64 representation. In other words: some systems (like email or web URLs) don't handle raw binary data (like images or files) well, so Base64 converts that data into plain text that can safely be sent or stored.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  cat {document} | base64 --decode
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;cat = read the document&lt;br&gt;
| = pipe the output of the left command into the command on the right&lt;br&gt;
base64 = the encoding/decoding tool&lt;br&gt;
--decode = decode instead of encode&lt;/p&gt;
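
&lt;p&gt;A round-trip sketch with an invented message and /tmp file:&lt;/p&gt;

```shell
# Hypothetical round-trip demo of base64 encoding and decoding.
echo "hello bandit" | base64 | tee /tmp/b64-demo.txt   # aGVsbG8gYmFuZGl0Cg==
cat /tmp/b64-demo.txt | base64 --decode                # prints: hello bandit
```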

&lt;p&gt;&lt;strong&gt;- Level 11:&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;ROT13 = a simple letter-substitution cipher used to obscure text by shifting each letter 13 places.&lt;br&gt;
          tr 'A-Za-z' 'N-ZA-Mn-za-m'&lt;/p&gt;

&lt;p&gt;tr = translates (or deletes) characters in Linux; the two character sets above map every letter to its ROT13 counterpart.&lt;/p&gt;
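
&lt;p&gt;Because 13 is half the alphabet, applying the same tr mapping twice restores the original text (sample strings invented):&lt;/p&gt;

```shell
# Hypothetical ROT13 demo: the same tr mapping both encodes and decodes.
echo "Hello World" | tr 'A-Za-z' 'N-ZA-Mn-za-m'   # prints: Uryyb Jbeyq
echo "Uryyb Jbeyq" | tr 'A-Za-z' 'N-ZA-Mn-za-m'   # prints: Hello World
```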

&lt;p&gt;&lt;strong&gt;- Level 12:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Hexdump = a command that shows the content of a file in hexadecimal format (base 16). It is useful when a file does not contain text data (like images or compiled programs), so you can inspect the raw bytes.&lt;/p&gt;

&lt;p&gt;Steps to decode: &lt;a href="https://mayadevbe.me/posts/overthewire/bandit/level13/" rel="noopener noreferrer"&gt;https://mayadevbe.me/posts/overthewire/bandit/level13/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I hope this helps you like it helped me. :) &lt;/p&gt;

</description>
      <category>ubuntu</category>
      <category>programming</category>
      <category>learning</category>
    </item>
    <item>
      <title>SAP Datasphere - Users and Roles</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Tue, 29 Apr 2025 11:42:47 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/sap-datasphere-users-and-roles-12id</link>
      <guid>https://dev.to/dentrodailha96/sap-datasphere-users-and-roles-12id</guid>
      <description>&lt;p&gt;Once you begin working in the SAP ecosystem, you'll quickly realize the importance of granting access only to users responsible for specific tasks. SAP Datasphere is no exception—it offers a clear and intuitive environment for managing user roles and access rights according to each area of responsibility.&lt;/p&gt;

&lt;p&gt;For example, a Data Analyst will probably only need access to the data for modelling and creating measures, with no need to access user administration or platform connections. In this case, the SAP Datasphere administrator can create a specific role (or pick one of the pre-defined ones) and grant only the views needed for this purpose.&lt;/p&gt;

&lt;h3&gt;
  
  
  How to grant access to other users in SAP Datasphere?
&lt;/h3&gt;

&lt;p&gt;** You must be either the system owner or a DW Administrator to add a new user.&lt;br&gt;
1) Open SAP Datasphere and expand the left side bar; the lowest entry is Security (padlock symbol). &lt;br&gt;
2) Click the "+" sign and fill in the user's information.&lt;br&gt;
3) Before leaving the page, click Save (diskette symbol); the user will then have access. &lt;/p&gt;

&lt;p&gt;Sometimes the new user does not receive an email; in that case, just send them the tenant link and they will be able to access SAP Datasphere. &lt;/p&gt;

&lt;p&gt;Gaining access to the SAP Datasphere tenant does not grant access to everything by default. The tenant is divided into Spaces, which are used to integrate systems and store data for individual projects. After granting a user access, you will likely need to define their scopes within the specific workspace where they will be working.&lt;/p&gt;

&lt;h3&gt;
  
  
  How to Grant/Create Scopes for your user?
&lt;/h3&gt;

&lt;p&gt;1) Add the relevant scopes for your space:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You can use the pre-defined scopes or create new ones. &lt;/li&gt;
&lt;li&gt;Click "Scopes" inside the Scoped Role box and add the space of interest. 
2) Add a user:&lt;/li&gt;
&lt;li&gt;Click "Users" inside the Scoped Role box and click "+". &lt;/li&gt;
&lt;li&gt;Choose the option that fits your project best.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;After that, the user should see the relevant applications in their side navigation bar. If an app is missing, it usually means the user doesn’t have access to the corresponding scope. In that case, you’ll need to repeat step 2 to properly assign the required scope.&lt;/p&gt;

</description>
      <category>sap</category>
      <category>datasphere</category>
      <category>etl</category>
      <category>datae</category>
    </item>
    <item>
      <title>What is SAP Datasphere?</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Mon, 28 Apr 2025 14:39:59 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/what-is-sap-datasphere-1enn</link>
      <guid>https://dev.to/dentrodailha96/what-is-sap-datasphere-1enn</guid>
      <description>&lt;p&gt;SAP Datasphere is a "unified service for data integration, cataloging, semantic modelling, data warehousing and virtualizing workloads across SAP and &lt;em&gt;non-SAP data&lt;/em&gt;". &lt;/p&gt;

&lt;p&gt;In simpler terms, SAP Datasphere is a PaaS (Platform as a Service) solution that brings different types of data together into one place for easy analysis. You can connect it with SAP Analytics Cloud to create a Fabric environment — and the combination of these two powerful tools is known as SAP Business Technology Cloud.&lt;/p&gt;

&lt;p&gt;SAP Datasphere operates through a SAP BTP (Business Technology Platform) Cockpit account, which can also link to other services like SAP HANA Cloud, SAP AI Core, and more.&lt;/p&gt;

&lt;p&gt;However, I've noticed that not all tutorials out there are very straightforward when it comes to setting up integrations and finding complete solutions. And it's not only the tutorials: the whole SAP universe can be quite tricky.&lt;/p&gt;

&lt;p&gt;That’s why I’d like to share some steps, experiences, and tips from my own journey with SAP Datasphere — hopefully making it a bit easier for others starting out! 😊&lt;/p&gt;

&lt;p&gt;If you're new to SAP Datasphere, I highly recommend starting with &lt;a href="https://developers.sap.com/mission.data-warehouse-cloud-get-started.html" rel="noopener noreferrer"&gt;this hands-on tutorial&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;It offers an excellent opportunity to practice and explore the main features of the platform using a 90-day free trial — perfect for getting familiar with its possibilities! &lt;/p&gt;

&lt;p&gt;P.S.: I do not work for SAP; I have been studying the tool for a year. I work as a consultant, and developing this knowledge was a job requirement. However, I am always open to discussing and helping if anyone is interested in the topic. &lt;/p&gt;

</description>
      <category>sap</category>
      <category>datasphere</category>
      <category>etl</category>
    </item>
    <item>
      <title>SAP S/4HANA Cloud</title>
      <dc:creator>Hiromi</dc:creator>
      <pubDate>Tue, 05 Nov 2024 15:45:01 +0000</pubDate>
      <link>https://dev.to/dentrodailha96/sap-understanding-4f2p</link>
      <guid>https://dev.to/dentrodailha96/sap-understanding-4f2p</guid>
      <description>&lt;p&gt;I have been studying SAP in the last 8 months, and I was feeling the I need to structure my learnings. Given this challenge, I will give a try by creating blog posts regarding SAP topics.So, let's begin :) &lt;/p&gt;

&lt;p&gt;I started my learning journey with SAP S/4HANA Cloud. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;SAP S/4HANA Cloud is a Software-as-a-Service (SaaS) version of the SAP S/4HANA ERP System. &lt;/p&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;What is Software-as-a-Service?&lt;br&gt;
It is a software distribution model in which a cloud provider hosts applications and makes them available to end users over the internet. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;What is the SAP S/4HANA ERP System? &lt;br&gt;
ERP stands for Enterprise Resource Planning: the core business processes needed to run a company: finance, HR, manufacturing, supply chain, services, procurement, etc. "In a nutshell, it is a set of software applications that are intended to integrate and streamline business processes".&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This link will have a better summary regarding ERP: (under construction)&lt;/p&gt;

&lt;p&gt;In SAP words, SAP S/4HANA "&lt;em&gt;is a suite of integrated business applications that enable the planning of company resources according to the needs of the company&lt;/em&gt;".&lt;/p&gt;

&lt;p&gt;It is part of BES, Business Event Service, an event-driven architecture component in SAP that allows different systems, apps or processes to communicate and update each other. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxftlx726sv2datqdb2z7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxftlx726sv2datqdb2z7.png" alt=" " width="800" height="466"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This image shows the structure of SAP S/4HANA and its components. SAP S/4HANA Cloud is the business's back-end ERP in a cloud environment, and the connection with SAP Fiori provides the user interface.  &lt;/p&gt;

&lt;p&gt;&lt;a href="https://help.sap.com/docs/SAP_S4HANA_CLOUD/0f69f8fb28ac4bf48d2b57b9637e81fa/36b7901040cb4c18b557ca9cc90a7002.html" rel="noopener noreferrer"&gt;Here&lt;/a&gt; is possible to dive in in the layers from SAP S/4HANA Cloud integration framework with different SAP softwares, like SAP Datasphere. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd8pg7slv7p9wef6u5umb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd8pg7slv7p9wef6u5umb.png" alt=" " width="520" height="697"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tutorials to learn the software better in practice:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://developers.sap.com/mission.abap-env-connect-s4hana.html" rel="noopener noreferrer"&gt;https://developers.sap.com/mission.abap-env-connect-s4hana.html&lt;/a&gt;&lt;br&gt;
&lt;a href="https://developers.sap.com/group.abap-custom-ui-s4hana-cloud.html" rel="noopener noreferrer"&gt;https://developers.sap.com/group.abap-custom-ui-s4hana-cloud.html&lt;/a&gt;&lt;br&gt;
&lt;a href="https://developers.sap.com/tutorials/btp-cf-ext-s4hanacloud.html" rel="noopener noreferrer"&gt;https://developers.sap.com/tutorials/btp-cf-ext-s4hanacloud.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Sources:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://www.leanix.net/en/wiki/tech-transformation/what-is-s4hana-cloud#:%7E:text=S%2F4HANA%20Cloud%3F-,SAP%20S%2F4HANA%20Cloud%20is%20a%20Software%2Das%2Da,available%20as%20cloud%20ERP%20software" rel="noopener noreferrer"&gt;https://www.leanix.net/en/wiki/tech-transformation/what-is-s4hana-cloud#:~:text=S%2F4HANA%20Cloud%3F-,SAP%20S%2F4HANA%20Cloud%20is%20a%20Software%2Das%2Da,available%20as%20cloud%20ERP%20software&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.techtarget.com/searchcloudcomputing/definition/Software-as-a-Service" rel="noopener noreferrer"&gt;https://www.techtarget.com/searchcloudcomputing/definition/Software-as-a-Service&lt;/a&gt;&lt;br&gt;
mySAP ERP for Dummies (2005) - Ian Kimbell, Andreas Vogel&lt;/p&gt;

</description>
      <category>sap</category>
      <category>dataengineering</category>
      <category>erp</category>
      <category>cloud</category>
    </item>
  </channel>
</rss>
