
Lukas Behnke

Navigating the Cloud Resume Challenge with Azure

Introduction
The Cloud Resume Challenge is an innovative and practical way to gain hands-on experience with cloud technologies. Specifically, the Azure-focused version of the challenge offers a comprehensive journey through various aspects of Microsoft's cloud platform. This blog post outlines my experience and the valuable skills I acquired while completing the challenge. You can see the finished website at www.behnke.work, and the full code for the challenge is available on my GitHub.

1. Certification: Starting with the Basics
My first step was obtaining the AZ-900 certification, a foundational look at Azure services. Although a more advanced Azure certification could also suffice, the AZ-900 was a great starting point, providing a broad overview of cloud concepts and Azure specifics.

2. HTML: Building the Foundation
I created my resume in HTML, ensuring that it was no longer just a static Word document or PDF. This step reinforced my understanding of HTML basics and laid the groundwork for further styling and functionality.

3. CSS: Adding Style
Styling my resume with CSS was the next step. As a non-designer, my goal was to create a visually appealing layout without overcomplicating the design. This process improved my CSS skills and helped me appreciate the importance of website aesthetics.

4. Static Website: Deploying on Azure
Deploying the HTML resume as a static website on Azure Storage was an enlightening experience. It allowed me to explore Azure's storage capabilities and understand the nuances of cloud-based website hosting.

5. HTTPS: Securing the Site
Implementing HTTPS for the website, using Azure CDN, was crucial for security. This step was essential for protecting the site and its visitors, and it taught me about the implementation of SSL/TLS in cloud services.

6. DNS: Custom Domain
I then linked a custom domain name to my Azure CDN endpoint. This process involved navigating DNS settings and understanding how domain names and cloud resources interact. I bought my domain on GoDaddy.
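Concretely, this meant adding a CNAME record at the registrar that points the subdomain at the CDN endpoint. A sketch of what such a record looks like (the endpoint hostname below is a placeholder; the real one comes from the Azure CDN endpoint's overview page):

```
; Hypothetical DNS record configured at the registrar
www    CNAME    myresume-endpoint.azureedge.net.
```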

7. JavaScript: Adding Interactivity
Adding a visitor counter to the website using JavaScript was both challenging and rewarding. This feature required me to write a script that interacted dynamically with the page and backend services. I embedded the following script in my HTML:

```javascript
window.onload = function() {
    fetch('https://resumefunction59939new.azurewebsites.net/api/resume_counter?code=oeRwhdElrOPSqf3z3by-TyQO5bCZ3OsOdLqZMVK4eua9AzFuUoUWyw==')
        .then(response => {
            // Check if the request was successful
            if (!response.ok) {
                throw new Error('Network response was not ok');
            }
            return response.json();
        })
        .then(data => {
            // Display the counter value on the page
            document.getElementById("visitorCount").innerText = data.counter;
        })
        .catch(error => {
            // Fall back to a placeholder if the API call fails
            document.getElementById("visitorCount").innerText = "-";
            console.error('There was a problem with your fetch operation:', error);
        });
};
```

8. Database: Leveraging CosmosDB
For the visitor counter, I utilized Azure CosmosDB, accessed from Python through the SDK's CosmosClient (the Core/SQL API). This choice provided a scalable and serverless option for database management, which was cost-effective for the project.
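The counter lives in a single document in the container. Based on the function code later in this post, its shape is roughly the following (the field values here are illustrative):

```json
{
  "id": "1",
  "counter": 123
}
```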

9. API: Azure Functions
Creating an API with Azure Functions was a key learning moment. This serverless compute service allowed me to set up an HTTP-triggered function that interacted with the CosmosDB, abstracting the database operations from the front end.

10. Python: Expanding My Programming Skills
Writing Python code for the Azure Function was a great way to explore a different programming language. Python’s versatility in back-end development was a valuable skill to add to my repertoire.

```python
import json
import logging
import os

import azure.functions as func
from azure.cosmos import CosmosClient


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    # Connect to CosmosDB using credentials from the app settings
    endpoint = os.environ["COSMOS_DB_ENDPOINT"]
    key = os.environ["COSMOS_DB_KEY"]
    client = CosmosClient(endpoint, key)
    database_name = 'Resume'
    database = client.get_database_client(database_name)
    container_name = 'Items'
    container = database.get_container_client(container_name)

    query = "SELECT * FROM c"
    items = list(container.query_items(query=query, enable_cross_partition_query=True))

    if items:
        # Increment counter
        newCounter = items[0]['counter'] + 1
        items[0]['counter'] = newCounter

        # Replace the entire document in the database
        container.replace_item(item=items[0]['id'], body=items[0])

        # Create a JSON response with the updated counter
        response_body = json.dumps({"counter": newCounter})
        return func.HttpResponse(body=response_body, mimetype="application/json", status_code=200)
    else:
        return func.HttpResponse("No items found", status_code=404)
```
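For completeness, an HTTP-triggered Python function also needs a function.json describing its bindings. A minimal sketch of what that file typically looks like (the exact values may differ from my actual configuration; the function-level authLevel corresponds to the `code=` query parameter used by the front end):

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
```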

11. Tests: Ensuring Code Quality
Including tests for the Python code was crucial for maintaining code quality. Learning about writing effective tests in Python helped me understand the importance of test-driven development.
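As an illustration of the approach, the counter logic can be extracted into a plain function and exercised against a fake container, so no live CosmosDB connection is needed during tests. FakeContainer and increment_counter below are hypothetical names for this sketch, not taken from my actual repository:

```python
# A minimal unit-test sketch: a fake container stands in for the
# azure.cosmos container client, so the core logic runs locally.

class FakeContainer:
    """Stands in for the azure.cosmos container client in tests."""

    def __init__(self, items):
        self._items = items

    def query_items(self, query, enable_cross_partition_query=False):
        return iter(self._items)

    def replace_item(self, item, body):
        # Overwrite the stored document, as the real client would
        self._items[0] = body


def increment_counter(container):
    """The core logic of the Azure Function, extracted so it is testable."""
    items = list(container.query_items("SELECT * FROM c",
                                       enable_cross_partition_query=True))
    if not items:
        return None
    items[0]['counter'] += 1
    container.replace_item(item=items[0]['id'], body=items[0])
    return items[0]['counter']


def test_counter_increments():
    container = FakeContainer([{"id": "1", "counter": 41}])
    assert increment_counter(container) == 42


def test_no_items_returns_none():
    assert increment_counter(FakeContainer([])) is None
```

Running these with pytest verifies the increment and the empty-database path without touching Azure at all.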

12. Infrastructure as Code (IaC)
Using Azure Resource Manager (ARM) templates for defining infrastructure was a major learning curve. This approach to IaC was critical for efficient, error-free deployment of cloud resources.
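To give a flavour of what these templates look like, here is a minimal sketch of an ARM template declaring a single storage account (the apiVersion and property values are illustrative, not copied from my actual backend-arm.json):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2022-09-01",
      "name": "resumestorage59939",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "StorageV2"
    }
  ]
}
```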

13. Source Control: GitHub Repositories
For both the API and website, I used GitHub for version control. This practice ensured that my code was well managed and easily accessible.

14. CI/CD (Back end)
Setting up CI/CD with GitHub Actions for the backend was another highlight. This automated the testing and deployment process, making updates seamless and efficient.

```yaml
name: Deploy ARM Template

on: [push]

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest

    steps:
    - name: 'Checkout GitHub Action'
      uses: actions/checkout@v2

    - name: 'Login via Azure CLI'
      uses: azure/login@v1
      with:
        creds: ${{ secrets.AZURE_CREDENTIALS }}

    - name: 'Deploy ARM template'
      uses: azure/ARM-deploy@v1
      with:
        subscriptionId: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
        resourceGroupName: 'resumechallenge59939'
        template: './backend/backend-arm.json'
        parameters: './backend/parameters.json'

    - name: 'Deploy Azure Function'
      uses: azure/functions-action@v1
      with:
        app-name: 'resumefunction59939new'
        package: './backend/resume-function'
        publish-profile: ${{ secrets.AZURE_FUNCTIONAPP_PUBLISH_PROFILE }}
      # Set environment variables
      env:
        COSMOS_DB_ENDPOINT: ${{ secrets.COSMOS_DB_ENDPOINT }}
        COSMOS_DB_KEY: ${{ secrets.COSMOS_DB_KEY }}

    - name: 'Set Function App Settings'
      run: |
        az functionapp config appsettings set --name resumefunction59939new --resource-group resumechallenge59939 --settings COSMOS_DB_ENDPOINT=${{ secrets.COSMOS_DB_ENDPOINT }} COSMOS_DB_KEY=${{ secrets.COSMOS_DB_KEY }}

    - name: 'Logout'
      run: az logout
```

15. CI/CD (Front end)
Similar to the back end, I implemented CI/CD for the front end. This setup automatically updated the Azure Storage blob whenever I pushed new code, enhancing productivity and consistency.

```yaml
name: deploy_frontend
# deploys when a push is made from the frontend folder

on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}

      - name: Upload to blob storage
        uses: azure/CLI@v1
        with:
          inlineScript: |
            az storage blob upload-batch --account-name resumestorage59939 --auth-mode key -d '$web' -s frontend/web --overwrite

      # Azure logout
      - name: logout
        run: |
          az logout
        if: always()
```

16. Reflection: Learning and Growing
This project was a journey of learning and skill-building. From cloud fundamentals to advanced concepts like serverless computing and CI/CD, the experience has been invaluable. Publishing this blog post is not just a challenge requirement; it's a way to share my journey and inspire others to explore the possibilities within Azure.

Conclusion
The Azure edition of the Cloud Resume Challenge has been an engaging and educational experience. It provided a practical framework to apply and expand my cloud knowledge. For anyone looking to delve into cloud computing, especially with Azure, I highly recommend taking on this challenge.
