<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: TrueLime</title>
    <description>The latest articles on DEV Community by TrueLime (@truelime).</description>
    <link>https://dev.to/truelime</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Forganization%2Fprofile_image%2F1303%2Ff1d89007-4ffa-4312-81c6-9d94577a5612.jpg</url>
      <title>DEV Community: TrueLime</title>
      <link>https://dev.to/truelime</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/truelime"/>
    <language>en</language>
    <item>
      <title>How to Delete a Code Wiki in Azure DevOps</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Mon, 06 Oct 2025 06:26:17 +0000</pubDate>
      <link>https://dev.to/truelime/how-to-delete-a-code-wiki-in-azure-devops-4lf2</link>
      <guid>https://dev.to/truelime/how-to-delete-a-code-wiki-in-azure-devops-4lf2</guid>
      <description>&lt;p&gt;In one of my projects, I initially created a &lt;strong&gt;Code Wiki&lt;/strong&gt; next to the regular project Wiki in Azure DevOps, thinking it might be useful, but later realized I didn’t really need it. Everything seemed fine until I decided to clean it up and &lt;strong&gt;deleted all the content&lt;/strong&gt; from the Code Wiki. That’s when things got weird.&lt;/p&gt;

&lt;p&gt;Whenever I clicked on &lt;strong&gt;Wiki&lt;/strong&gt; in the left navigation, I ended up on a &lt;strong&gt;blank screen&lt;/strong&gt; where I could only create a new document. After saving it, nothing happened: no navigation options, no way to switch back to my main project Wiki. The only workaround I found was to &lt;strong&gt;manually add &lt;code&gt;.wiki&lt;/code&gt; to the query string&lt;/strong&gt; in the URL to reach the regular Wiki again. 🤦‍♂️  &lt;/p&gt;

&lt;p&gt;Even after digging through &lt;strong&gt;Azure DevOps project settings&lt;/strong&gt;, there was &lt;strong&gt;no option&lt;/strong&gt; to delete or reset the broken Code Wiki from the UI.&lt;/p&gt;

&lt;p&gt;After some research, I discovered that you can actually remove a Code Wiki using the &lt;strong&gt;Azure DevOps CLI&lt;/strong&gt;, and that finally solved it. 🚀&lt;/p&gt;




&lt;h2&gt;
  
  
  🧰 Steps to Delete a Code Wiki
&lt;/h2&gt;

&lt;p&gt;Open your terminal or PowerShell and run the following command (adjust it to your setup):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;az devops wiki delete &lt;span class="nt"&gt;--wiki&lt;/span&gt; &amp;lt;wikiName&amp;gt; &lt;span class="nt"&gt;--project&lt;/span&gt; &amp;lt;projectName&amp;gt; &lt;span class="nt"&gt;--yes&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace &lt;code&gt;&amp;lt;wikiName&amp;gt;&lt;/code&gt; with the name or ID of your Code Wiki, and &lt;code&gt;&amp;lt;projectName&amp;gt;&lt;/code&gt; with your Azure DevOps project name (if it’s not already set as default).&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;--yes&lt;/code&gt; parameter simply skips the confirmation prompt.&lt;/p&gt;
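
&lt;p&gt;If you don't remember the wiki's exact name or ID, the same azure-devops extension can list every wiki in a project first. A quick sketch (the project name below is just a placeholder):&lt;/p&gt;

```shell
# List all wikis (the project wiki and any code wikis) in a project;
# the name or ID shown in the output is what --wiki expects.
# "MyProject" is a placeholder for your own project name.
az devops wiki list --project "MyProject" --output table
```

&lt;p&gt;Code Wikis are typically named after the repository they were published from, so double-check the output before deleting.&lt;/p&gt;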




&lt;h2&gt;
  
  
  ⚙️ Installing the Azure DevOps CLI
&lt;/h2&gt;

&lt;p&gt;If you haven’t installed the Azure DevOps extension yet, you can do so in PowerShell with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;az extension add &lt;span class="nt"&gt;--name&lt;/span&gt; azure-devops
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can explore all available extensions using:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;az extension list-available &lt;span class="nt"&gt;--output&lt;/span&gt; table
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;p&gt;Just sharing this in case anyone else runs into the same problem; hopefully it saves you a few hours of head-scratching. 💡&lt;/p&gt;

</description>
      <category>azuredevops</category>
      <category>cli</category>
      <category>wiki</category>
      <category>devops</category>
    </item>
    <item>
      <title>Vibe Coding: Revolutionizing Software Development with GitHub Copilot Agents</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Fri, 11 Apr 2025 05:59:22 +0000</pubDate>
      <link>https://dev.to/truelime/vibe-coding-revolutionizing-software-development-with-github-copilot-agents-ojo</link>
      <guid>https://dev.to/truelime/vibe-coding-revolutionizing-software-development-with-github-copilot-agents-ojo</guid>
      <description>&lt;p&gt;Ever had those days when coding felt like a grind, as if you were stuck in a cycle of repetitive tasks and endless screen time? Yeah, I know the feeling. 😩 But things might just be changing for the better. Recently, I've discovered an exciting concept: &lt;strong&gt;vibe coding&lt;/strong&gt;. It's introducing a new era where coding integrates more seamlessly into our daily lives. The idea is to balance focusing on code with participating in everyday activities—a kind of part-time presence that maintains full-time productivity. Let me explain how &lt;strong&gt;GitHub Copilot Agents&lt;/strong&gt; in VS Code are making this possible.&lt;/p&gt;

&lt;h2&gt;
  
  
  Enter the Agents: More Than Just Code Completion
&lt;/h2&gt;

&lt;p&gt;GitHub Copilot got attention when it debuted as an assistant capable of completing code snippets. However, the introduction of &lt;strong&gt;Agents&lt;/strong&gt; takes it a step further—from suggesting lines of code to managing comprehensive coding tasks. Imagine having a coding buddy that doesn't just write code but also coordinates multiple steps in a workflow. 🛠️ All you need to do is sketch out your goal, jot down some clarifying remarks, and watch as the Agent takes your idea and runs with it. Intriguing, isn't it?&lt;/p&gt;

&lt;h2&gt;
  
  
  Vibe Coding: Multitasking's New Friend
&lt;/h2&gt;

&lt;p&gt;Ever tried cooking dinner while bug-fixing a component? I've often faced a choice between a properly cooked meal and functional markup. But with vibe coding, those tough choices are no longer an issue. Picture this: you're enjoying your favorite series and simply pause to approve a refactor that your Agent handled while you were distracted. It's more like directing than coding, and that's the charm. It lets coding fit into scenarios once thought impossible while maintaining quality. 🍕 + 🎥 + 👨‍💻&lt;/p&gt;

&lt;h2&gt;
  
  
  A Few Speed Bumps: Speed and Autonomy 😅
&lt;/h2&gt;

&lt;p&gt;Before you get too excited, vibe coding isn't without its quirks. The Agents might resemble a talented junior developer: excellent work, but they need a bit more time, especially with complex tasks. Still, considering the trade-off of outsourcing some cognitive load, the extra moments aren't too steep a price.&lt;/p&gt;

&lt;p&gt;Moreover, Agents aren't fully autonomous yet. The Agent still checks in with you for key decisions, asking for clarification or direction when it’s unsure how to proceed. In a way, it keeps you involved and ensures your expertise remains a critical part of the process—at least for now.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Tip:&lt;/strong&gt; VS Code's Agent Mode is available. Consider testing it to experience vibe coding firsthand: &lt;a href="https://code.visualstudio.com/blogs/2025/04/07/agentMode" rel="noopener noreferrer"&gt;VS Code Agent Mode&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;What's more, you can even let Copilot generate your commit messages. It's a small way to harmonize with vibe coding and avoid bland entries like "fixed bug." 🎶&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping Up: Looking Ahead
&lt;/h2&gt;

&lt;p&gt;So what does this mean for the future of coding? Perhaps it will become less of a task and more of a harmonious process, with its subtle warmth and humanity 🌍. I'm curious to see how this fits into my daily rhythm. If you're interested, fire up VS Code, enable Agent Mode, and give vibe coding a try. Let the machine handle the typing while you focus on strategizing about the grand picture.&lt;/p&gt;

&lt;p&gt;Thanks for reading, code less and vibe more!&lt;/p&gt;

</description>
      <category>digitaljeroen</category>
      <category>ai</category>
      <category>vibecoding</category>
      <category>githubcopilot</category>
    </item>
    <item>
      <title>Integrating AI Agents with n8n: Enhance Your Workflow Automation</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Thu, 10 Apr 2025 06:58:40 +0000</pubDate>
      <link>https://dev.to/truelime/integrating-ai-agents-with-n8n-enhance-your-workflow-automation-52pc</link>
      <guid>https://dev.to/truelime/integrating-ai-agents-with-n8n-enhance-your-workflow-automation-52pc</guid>
      <description>&lt;p&gt;As someone deeply immersed in digital experience architecture, I've always been fascinated by the idea of integrating AI agents into workflow automation. It's like adding a touch of intelligence to your workflows, almost as if they can think for themselves! 😃 Recently, I've been exploring n8n to see how much AI agents can elevate automation, and it's been an exciting journey!&lt;/p&gt;

&lt;h3&gt;
  
  
  The Challenge of Crafting Effective System Prompts
&lt;/h3&gt;

&lt;p&gt;The success of AI agent integration in n8n hinges on one crucial element: the &lt;strong&gt;system prompt&lt;/strong&gt;. This is not just any piece of text; it is the message that guides your agent's behavior, setting the tone, outlining its tasks, and determining its scope. A well-crafted system prompt can be the difference between your agent performing expertly or straying into unpredictable territory. Imagine instructing your agent like a helpful colleague, minus the coffee breaks. ☕&lt;/p&gt;

&lt;h3&gt;
  
  
  Designing the Perfect System Message
&lt;/h3&gt;

&lt;p&gt;Creating an effective system prompt can be quite the art form. The objective is to clearly describe the behaviors you wish to see, without letting the logic become overly complex. ✨ Here's a little insight: I've been refining my prompts with a custom GPT assistant, enhancing their clarity and reliability. It's not about using sophisticated language, but about fine-tuning the nuances and intent—a dynamic puzzle that keeps me engaged.&lt;/p&gt;

&lt;p&gt;Testing AI agents directly within n8n is a great strategy. It's akin to working with clay; you mold, test, and tweak until your AI agent behaves as envisioned. This approach allows for experimentation and re-running system messages to continually adjust the agent's behavior before putting it into action.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;💡 &lt;em&gt;Pro tip: Always test your AI agents within n8n for direct feedback, making it easier to refine prompts and behaviors.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Expanding Capabilities: Connecting the Dots
&lt;/h3&gt;

&lt;p&gt;A feature that significantly enhances n8n’s capabilities is the ability for AI agents to interact with external tools or trigger sub-workflows. This paves the way for dynamic processes, like auto-generating documents, managing APIs, or orchestrating multi-layer operations. Precise system prompts ensure your AI understands user intent accurately, thus enabling reliable, multi-step automations.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Endless Cycle: From Experiences to Innovations
&lt;/h3&gt;

&lt;p&gt;As I reflect on this integration journey, the rewards are inherently tied to the learning curve. Each adjustment brings new insights, but there's always room for enhancement. I'm curious about how others navigate this realm—have you approached prompt design differently or discovered unique hacks for validation processes? Let’s share experiences—I'm eager for a knowledge exchange! 🔄&lt;/p&gt;

&lt;p&gt;For those interested in elevating their automation capabilities, n8n offers an array of integrations worth exploring. Visit &lt;a href="https://n8n.io/" rel="noopener noreferrer"&gt;n8n.io&lt;/a&gt; if you're intrigued.&lt;/p&gt;

&lt;p&gt;Hope this gives your setup a boost. Let me know how the experience works for you!&lt;/p&gt;

</description>
      <category>digitaljeroen</category>
      <category>ai</category>
      <category>automation</category>
      <category>n8n</category>
    </item>
    <item>
      <title>Mastering Azure DevOps Pipeline Variables and Secrets: Tips and Best Practices</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Wed, 09 Apr 2025 06:03:18 +0000</pubDate>
      <link>https://dev.to/truelime/mastering-azure-devops-pipeline-variables-and-secrets-tips-and-best-practices-2fb8</link>
      <guid>https://dev.to/truelime/mastering-azure-devops-pipeline-variables-and-secrets-tips-and-best-practices-2fb8</guid>
      <description>&lt;p&gt;I recently found myself deeply immersed in optimizing our Azure DevOps pipelines. It's that moment when you uncover little hacks and best practices that make you wish you'd thought of them sooner. Aligning variable management with &lt;code&gt;appsettings.json&lt;/code&gt; led to a genuine "eureka" moment. It's like stumbling upon that perfectly fitting puzzle piece that was hiding in plain sight. 😅&lt;/p&gt;

&lt;h4&gt;
  
  
  The Organizational Challenge
&lt;/h4&gt;

&lt;p&gt;Managing the multitude of configurations across different environments can rapidly transform pipeline management into a headache 😫. When unchecked, variables can become chaotic, sabotaging the very efficiency they're supposed to bolster. Thankfully, Azure DevOps offers a feature called &lt;strong&gt;variable libraries&lt;/strong&gt;, which help bring order to this chaos.&lt;/p&gt;

&lt;h4&gt;
  
  
  Structuring Variables with Dot Notation
&lt;/h4&gt;

&lt;p&gt;To enhance clarity and organization, I started aligning our variable naming conventions with our &lt;code&gt;appsettings.json&lt;/code&gt; file using dot notation. Instead of opting for vague variable names, I designed a structure like &lt;code&gt;AppSettings.Database.ConnectionString&lt;/code&gt;. This tidy approach intuitively maps to application configurations. It also adds a layer of control over what flows into the pipeline—a true game changer! 💡&lt;/p&gt;
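
&lt;p&gt;For completeness, variable groups like this can also be created from the CLI instead of the Library UI. A minimal sketch, assuming the azure-devops extension is installed; the group name, project, and values below are all made up:&lt;/p&gt;

```shell
# Create a variable group whose keys mirror appsettings.json
# via dot notation (all names and values here are illustrative):
az pipelines variable-group create \
  --name "MyApp-Staging" \
  --project "MyProject" \
  --variables \
    "AppSettings.Database.ConnectionString=Server=staging-sql;Database=app" \
    "AppSettings.Cache.DurationSeconds=300"

# Then mark the connection string as secret so it is masked in
# logs and the UI (42 stands in for the group ID returned above):
az pipelines variable-group variable update \
  --group-id 42 \
  --name "AppSettings.Database.ConnectionString" \
  --secret true
```

&lt;p&gt;Scripting the groups this way also makes it easy to keep staging and production libraries structurally identical.&lt;/p&gt;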

&lt;h4&gt;
  
  
  Seamless YAML Integration
&lt;/h4&gt;

&lt;p&gt;A standout benefit is how easily these variable groups integrate with YAML pipelines. By linking variable groups, we automate and simplify environment-specific configurations. Imagine making a change in an environment without causing a storm in your pipeline ecosystem. With this approach, maintaining pipelines feels more like a pleasant stroll rather than a frantic sprint. I've noticed a significant reduction in our pipeline complexity, making maintenance and scaling so much easier.&lt;/p&gt;

&lt;h4&gt;
  
  
  💡 &lt;strong&gt;Security First: Guard Your Secrets&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Securing sensitive information isn't just advisable—it's essential. Azure DevOps lets you mark sensitive variables as secrets via the lock icon 💼. Secret values are masked in pipeline logs and hidden in the interface, protecting them from accidental exposure. By playing it safe, we can prevent unwanted surprises.&lt;/p&gt;

&lt;h4&gt;
  
  
  Key Vault Integration: A Seamless Experience
&lt;/h4&gt;

&lt;p&gt;Integrating &lt;strong&gt;Azure Key Vault&lt;/strong&gt; with Azure DevOps was another exciting discovery. By enabling the Key Vault option, secrets are safely incorporated without being directly stored within DevOps. You simply select your Azure subscription and pick the relevant Key Vault. Such integrations elevate variable management, offering unparalleled security and control.&lt;/p&gt;

&lt;p&gt;For those interested in delving deeper into the mechanics, Azure’s documentation provides valuable insights: &lt;a href="https://learn.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups?view=azure-devops&amp;amp;tabs=azure-pipelines-ui%2Cyaml" rel="noopener noreferrer"&gt;Variable Group Management in Azure DevOps&lt;/a&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  Wrapping Up
&lt;/h4&gt;

&lt;p&gt;This journey through optimizing Azure DevOps pipelines taught me how a bit of structure significantly reduces complexity and enhances security. If we tech folks can prioritize maintainability and scalability in our configs, maybe we'll spend more time enjoying coffee—and less time debugging 🤔.&lt;/p&gt;

&lt;p&gt;With security and organization in our pipelines strengthened, I'm eager to explore more automation possibilities with Azure DevOps. As cloud environments become increasingly intricate, keeping our pipeline management lean and efficient will undoubtedly be critical.&lt;/p&gt;

&lt;p&gt;Thanks for reading, happy building!&lt;/p&gt;

</description>
      <category>digitaljeroen</category>
      <category>azuredevops</category>
      <category>keyvault</category>
      <category>yaml</category>
    </item>
    <item>
      <title>Revolutionizing Prototyping with AI: How Lovable Enhances Efficiency</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Tue, 08 Apr 2025 11:12:17 +0000</pubDate>
      <link>https://dev.to/truelime/revolutionizing-prototyping-with-ai-how-lovable-enhances-efficiency-1obc</link>
      <guid>https://dev.to/truelime/revolutionizing-prototyping-with-ai-how-lovable-enhances-efficiency-1obc</guid>
      <description>&lt;p&gt;As a digital experience architect, I'm always on the lookout for new ways to streamline the prototyping phase. Recently, I discovered an intriguing AI-driven tool called &lt;strong&gt;Lovable&lt;/strong&gt;, which has genuinely changed how I approach building application solutions. 🌟 I wanted to experiment with AI-driven development using a simple e-commerce app, and Lovable seemed perfectly suited for the job.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Challenge: Fast and Functional Prototyping
&lt;/h3&gt;

&lt;p&gt;Prototyping has always felt like meticulously carving a sculpture from stone: rewarding yet labor-intensive. Traditionally, it meant laboring over architectural decisions, writing boilerplate code, and layering on necessary features just to test the viability of an idea. However, Lovable changes this dynamic by transforming well-structured prompts into vivid application prototypes. 🚀&lt;/p&gt;

&lt;h3&gt;
  
  
  A Guided Tour Through AI-Powered Creation
&lt;/h3&gt;

&lt;p&gt;Lovable breaks down the barriers to swift prototype generation. With it, turning intents into functional applications feels like engaging in a conversation. Whether creating a Progressive Web App or a simple shopping cart-enabled e-commerce site, Lovable constructs prototypes while offloading features like user authentication or data storage. 🌐 It helps your app stand independently with minimal input.&lt;/p&gt;

&lt;p&gt;But there's more: Lovable navigates architectural choices, initiates code generation, and anticipates needs you might not have explicitly detailed in your prompts. 🛠️ Its iterative refinement ability through follow-up prompts makes developing with Lovable resemble chatting with a well-informed assistant.&lt;/p&gt;

&lt;h3&gt;
  
  
  Keeping Things Real with GitHub Copilot
&lt;/h3&gt;

&lt;p&gt;Integrating Lovable into my workflow felt seamless when paired with &lt;strong&gt;GitHub Copilot&lt;/strong&gt;. Typically, I start by using Lovable to rapidly generate prototype versions for assessment. After uploading to GitHub, Copilot lets me refine and personalize the prototypes with minimal friction, efficiently handling tricky corner cases. 📈 By naturally compartmentalizing tasks between these two AI tools, my development process becomes streamlined and balanced.&lt;/p&gt;

&lt;h3&gt;
  
  
  Catching Your Creative Waves (But Watch the Credits!)
&lt;/h3&gt;

&lt;p&gt;While Lovable impresses with its functional elegance, it's important to be mindful with prompt usage, as credits can deplete faster than expected. Within those limits, Lovable still offers sufficient room to experiment without commitment.&lt;/p&gt;

&lt;p&gt;The AI-powered prototyping wave has undeniably arrived, bringing both possibilities and caution. Lovable makes this wave intuitive and enjoyable, shifting development towards more playful and vibe-centric coding. I strongly advocate diving into Lovable and embracing this lively approach.&lt;/p&gt;

&lt;p&gt;Check out Lovable for yourself: &lt;a href="https://lovable.dev" rel="noopener noreferrer"&gt;Lovable Official Site&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Final Thoughts: The Playful Dance of AI Development
&lt;/h3&gt;

&lt;p&gt;Reflecting on my experience with Lovable, I've learned that AI can transform technical barriers into creative pathways, offering a welcoming starting point for developers eager to innovate. As we steer towards more AI-driven solutions, understanding the balance between automated scaffolding and creative coding remains crucial. The dance of creativity and efficiency continues to fascinate me, and I'm excited to see where this journey leads next. 🎉&lt;/p&gt;

&lt;p&gt;Thanks for reading, happy building!&lt;/p&gt;

</description>
      <category>digitaljeroen</category>
      <category>ai</category>
      <category>prototyping</category>
      <category>development</category>
    </item>
    <item>
      <title>Harnessing AI Automation with n8n for Seamless Blog Writing</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Mon, 07 Apr 2025 14:38:31 +0000</pubDate>
      <link>https://dev.to/truelime/harnessing-ai-automation-with-n8n-for-seamless-blog-writing-1f8h</link>
      <guid>https://dev.to/truelime/harnessing-ai-automation-with-n8n-for-seamless-blog-writing-1f8h</guid>
      <description>&lt;p&gt;Over the past few weeks, I've been perfecting the way I create and share technical blog posts 📘. Introducing AI automation into my writing routine has truly transformed the process. As I explored ways to optimize my approach, I turned to &lt;strong&gt;n8n&lt;/strong&gt;, a comprehensive workflow automation tool that fits well with my tech-savvy mindset.&lt;/p&gt;

&lt;h2&gt;
  
  
  Simplifying the Writing Process with n8n
&lt;/h2&gt;

&lt;p&gt;I started by designing a custom workflow using n8n. If you haven't tried it yet, it's worth a look. This tool helped me develop a seamless AI-powered system that takes a rough idea and turns it into a polished article through several intelligent steps 🔄.&lt;/p&gt;

&lt;p&gt;The workflow begins by identifying the core message within the concept text. It then crafts a draft that captures the essence of what I aim to convey. This is not just a basic conversion; it refines the tone to match my personal writing style. And yes, my subtle use of emojis makes it through 😉.&lt;/p&gt;

&lt;h2&gt;
  
  
  Structuring for Success: JSON and the dev.to API
&lt;/h2&gt;

&lt;p&gt;Once the article feels ready, the workflow prepares it in a neat JSON format. This formatting is key for the next step—publishing via the &lt;strong&gt;dev.to API&lt;/strong&gt;. You can learn more about the API &lt;a href="https://dev.to/api"&gt;here&lt;/a&gt; 🌐.&lt;/p&gt;
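
&lt;p&gt;As a rough sketch of what that payload can look like (the field names follow the dev.to articles endpoint, but the title, body, and tags below are invented):&lt;/p&gt;

```shell
# Illustrative payload in the shape the dev.to articles endpoint
# expects; all values are made up.
PAYLOAD='{
  "article": {
    "title": "Harnessing AI Automation with n8n",
    "body_markdown": "Draft text produced by the workflow...",
    "published": false,
    "tags": ["ai", "automation", "n8n"]
  }
}'
echo "${PAYLOAD}"

# Publishing is then a single authenticated POST (needs a real key
# in DEVTO_API_KEY, so it is left commented out here):
# curl -s -X POST "https://dev.to/api/articles" \
#   -H "Content-Type: application/json" \
#   -H "api-key: ${DEVTO_API_KEY}" \
#   -d "${PAYLOAD}"
```

&lt;p&gt;Keeping &lt;code&gt;published&lt;/code&gt; set to &lt;code&gt;false&lt;/code&gt; creates a draft, which leaves room for a final human review before the post goes live.&lt;/p&gt;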

&lt;p&gt;This structured approach has greatly simplified the writing and publishing process for me. In the past, writing in English wasn't my strongest suit and often felt like an obstacle 🎯. Thanks to AI, that hurdle is now removed. Writing in English has become smooth, efficient, and even enjoyable.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Role of Transparency: #digitaljeroen
&lt;/h2&gt;

&lt;p&gt;To maintain transparency, I've introduced the &lt;strong&gt;#digitaljeroen&lt;/strong&gt; tag for each AI-assisted post. This tag is a nod to my digital counterpart, crucial in bringing these blog entries to life 📌. It's a fun way to acknowledge the digital help that allows me to share thoughts and insights more consistently.&lt;/p&gt;

&lt;h2&gt;
  
  
  Encouraging Exploration with n8n
&lt;/h2&gt;

&lt;p&gt;If you're thinking of exploring similar automation paths, I highly recommend n8n. It's an excellent platform that offers great flexibility to build your AI workflows. Check it out at &lt;a href="https://n8n.io" rel="noopener noreferrer"&gt;n8n.io&lt;/a&gt; and see how it might simplify your processes as it has mine. 🚀&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping It Up
&lt;/h2&gt;

&lt;p&gt;Leveraging AI and n8n in my blogging system has transformed writing and publishing into a more streamlined experience ✨. If you're interested in enhancing your workflow with AI, trying n8n might open new doors for you. Don't hesitate to share your thoughts or experiences, and if you want to connect on LinkedIn, I'd love to hear from you! 😊&lt;/p&gt;

&lt;p&gt;Connect with me: &lt;a href="https://www.linkedin.com/in/jeroenfurst/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;&lt;/p&gt;

</description>
      <category>digitaljeroen</category>
      <category>ai</category>
      <category>automation</category>
      <category>n8n</category>
    </item>
    <item>
      <title>Tips for Migrating from KX13 to Xperience by Kentico</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Thu, 26 Sep 2024 13:27:47 +0000</pubDate>
      <link>https://dev.to/truelime/tips-for-migrating-from-kx13-to-xperience-by-kentico-709</link>
      <guid>https://dev.to/truelime/tips-for-migrating-from-kx13-to-xperience-by-kentico-709</guid>
      <description>&lt;p&gt;Transitioning from Kentico Xperience 13 (KX13) to &lt;a href="https://www.kentico.com/platforms/xperience-by-kentico" rel="noopener noreferrer"&gt;XbyK&lt;/a&gt; is a significant move that requires careful planning. While you might be satisfied with your current platform and are a fan of Kentico 😉, considering a migration now can help you make the transition more cost-effectively in the future. Think of it like maintaining your car 🚗—regular check-ups and part replacements keep it running smoothly. Similarly, modernizing or replacing components of your platform over time sets a solid foundation for the next generation.&lt;/p&gt;

&lt;p&gt;Starting early allows your organization to strategize the best approach, and your agency or friendly neighbourhood &lt;a href="https://www.linkedin.com/company/truelime/" rel="noopener noreferrer"&gt;Kentico Partner&lt;/a&gt; 🦸 can offer valuable guidance 🤝. There's no need to do everything at once; gradual preparation can make the process much smoother.&lt;/p&gt;

&lt;h2&gt;
  
  
  Key Considerations for a Successful Migration
&lt;/h2&gt;

&lt;p&gt;Here are 10 essential steps to guide you through a smooth and efficient migration from Kentico Xperience 13 to Xperience by Kentico:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Understand that Migration is necessary 🛠️&lt;/strong&gt;&lt;br&gt;
An upgrade from KX13 to XbyK isn't possible due to significant technological changes. The biggest shift is the replacement of the Web Forms-based CMS with a modern, fast frontend technology: React ⚛️. This change means certain concepts and functionalities work differently in XbyK.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Evaluate the Kentico Roadmap 🗺️&lt;/strong&gt;&lt;br&gt;
Don't hesitate to get started—being informed puts you in control. By regularly checking &lt;a href="https://roadmap.kentico.com" rel="noopener noreferrer"&gt;roadmap.kentico.com&lt;/a&gt;, you'll stay up to date with upcoming features and monthly updates. This helps you plan your migration more effectively, especially with XbyK’s composable architecture. If certain functionalities can be replaced or updated within your current platform, it could make the transition smoother. Whenever possible, prioritize features already available in XbyK to streamline the process and ensure a seamless migration.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Prepare your MVC Application 🧰&lt;/strong&gt;&lt;br&gt;
Ensure your MVC application is running on modern .NET. If you're currently using .NET MVC 5, your first step should be migrating to .NET Core MVC. Since XbyK also uses .NET Core MVC, this will make your API easier to migrate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Align your Frontend Stack 🖥️&lt;/strong&gt;&lt;br&gt;
Make sure your frontend stack is compatible with .NET Core MVC and React. While the CMS is based on React, your frontend doesn't have to be. However, adopting React could offer additional benefits. Examine your current frontend code to see how it can be migrated to XbyK. If you're still storing unstructured HTML/markup in the database, prioritize updating these before beginning the migration.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Utilize Standard Modules 🧩&lt;/strong&gt;&lt;br&gt;
Replace custom-made solutions with standard modules from KX13. For example, if you use a custom URL module, consider switching to Kentico's standard solution called &lt;a href="https://docs.kentico.com/developers-and-admins/development/routing/content-tree-based-routing" rel="noopener noreferrer"&gt;content tree-based routing&lt;/a&gt;. This change will allow you to benefit immediately from XbyK's architectural setup.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Allocate Time for Content Migration ⏳&lt;/strong&gt;&lt;br&gt;
Since a direct upgrade isn't possible, use migration toolkits to transfer your content to XbyK. A great tool for this is the &lt;a href="https://github.com/Kentico/xperience-by-kentico-kentico-migration-tool" rel="noopener noreferrer"&gt;Kentico Migration Tool&lt;/a&gt;📦. An advantage of this approach is that necessary content models are created automatically. This method isn't limited to KX13; you can migrate from other platforms as well via the &lt;a href="https://github.com/Kentico/xperience-by-kentico-universal-migration-tool" rel="noopener noreferrer"&gt;Universal Migration Tool&lt;/a&gt;. Using a universal format like JSON can expedite the process. Familiarize yourself with content modeling concepts in XbyK, such as the content hub and reusable content, to maximize benefits.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;7. Leverage Page Builder and Widgets 🛠️&lt;/strong&gt;&lt;br&gt;
If you're already using the Page Builder and widgets, you'll be pleased to know they're compatible with XbyK. While KX13 widgets can be used in XbyK’s compatibility mode with minimal changes, you may want to explore the new &lt;a href="https://docs.kentico.com/developers-and-admins/customization/extend-the-administration-interface/ui-form-components" rel="noopener noreferrer"&gt;UI Form Components&lt;/a&gt; for enhanced editing capabilities. The migration can be done smoothly using the &lt;a href="https://github.com/Kentico/xperience-by-kentico-kentico-migration-tool" rel="noopener noreferrer"&gt;Kentico Migration Tool&lt;/a&gt;, which helps convert the Page Builder JSON data structure to be compatible with the new components. Keep in mind that while widget properties may need some adjustments, the views and view models will require little to no rewriting, ensuring a seamless transition.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;8. Clean up unused content and functionalities 🧹&lt;/strong&gt;&lt;br&gt;
Reducing the amount of content and unused functionalities before migration can save significant time. Now is an excellent opportunity to phase out or remove elements that are no longer necessary.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;9. Modernize your .NET Code 🔄&lt;/strong&gt;&lt;br&gt;
Update old .NET code to modern standards. This includes not only MVC but also any functionalities using packages based on outdated development techniques.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;10. Rethink your Infrastructure and Deployment ☁️&lt;/strong&gt;&lt;br&gt;
With XbyK's different structure, you no longer need two app services to host the application. Reflect on how you want to handle hosting in the future. Consider whether setting up infrastructure yourself is still the best use of your resources. &lt;a href="https://www.kentico.com/platforms/xperience-by-kentico/development/saas" rel="noopener noreferrer"&gt;Kentico SaaS&lt;/a&gt; could offer substantial benefits by handling much of the heavy lifting. With just a few clicks, you can have a complete environment ready, allowing you to focus on developing your application.&lt;/p&gt;

&lt;h2&gt;
  
  
  Final Thoughts 💡
&lt;/h2&gt;

&lt;p&gt;By starting preparations now, you can make the migration to XbyK a smoother and more cost-effective process. Regularly updating and modernizing components of your platform ensures you're ready for future advancements, such as expanding into other channels or leveraging AI 🤖. Kentico remains an excellent choice for organizations aiming to achieve digital growth without the need for complete replatforming. Reach out to your agency or &lt;a href="https://www.linkedin.com/company/truelime/" rel="noopener noreferrer"&gt;Kentico Partner&lt;/a&gt; for advice tailored to your specific needs—they're there to help you navigate this significant transition.&lt;/p&gt;

</description>
      <category>kentico</category>
      <category>xbyk</category>
      <category>dxp</category>
      <category>migration</category>
    </item>
    <item>
      <title>Embracing the Headless Channel in Xperience by Kentico</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Mon, 08 Jan 2024 14:20:07 +0000</pubDate>
      <link>https://dev.to/truelime/embracing-the-headless-channel-in-xperience-by-kentico-367a</link>
      <guid>https://dev.to/truelime/embracing-the-headless-channel-in-xperience-by-kentico-367a</guid>
      <description>&lt;p&gt;In December last year, Xperience by Kentico rolled out 🎁 a &lt;a href="https://community.kentico.com/blog/xperience-by-kentico-refresh-december-14-2023" rel="noopener noreferrer"&gt;significant refresh&lt;/a&gt;, introducing the Headless Channel functionality. This innovative feature is designed to enhance content delivery and management in the digital ecosystem. By leveraging the Headless Channel, users can now seamlessly distribute content across various platforms, including campaign websites, mobile apps, and microsites. This development marks a significant shift in how content is managed and delivered, offering greater flexibility and efficiency 💪. In this article, we'll dive into the Headless Channel and discover how it enhances digital content strategies.&lt;/p&gt;

&lt;h2&gt;
  
  
  Channel Management in Xperience by Kentico
&lt;/h2&gt;

&lt;p&gt;In Xperience by Kentico, channels are a pivotal concept for organizing and delivering content. These channels include &lt;a href="https://docs.xperience.io/xp/developers-and-admins/configuration/website-channel-management" rel="noopener noreferrer"&gt;website channels&lt;/a&gt; for traditional web content, &lt;a href="https://docs.xperience.io/xp/developers-and-admins/digital-marketing-setup/email-channel-management" rel="noopener noreferrer"&gt;email channels&lt;/a&gt; for digital marketing, and the brand new &lt;a href="https://docs.xperience.io/xp/developers-and-admins/configuration/headless-channel-management" rel="noopener noreferrer"&gt;headless channels&lt;/a&gt; for API-driven content delivery. &lt;/p&gt;

&lt;p&gt;Each channel type serves a specific purpose: website channels focus on content for web presentations, email channels are dedicated to managing and delivering marketing communications 📣, and headless channels provide greater flexibility for content distribution across various platforms. This segmentation not only refines content management and delivery strategies but also facilitates seamless integrations ⚙️, enhancing the effectiveness of digital content across various platforms.&lt;/p&gt;

&lt;h2&gt;
  
  
  Advantages of the Headless Channel
&lt;/h2&gt;

&lt;p&gt;The Headless Channel in Xperience by Kentico is designed to simplify content management and distribution. This functionality, which requires minimal development effort to implement, can be &lt;a href="https://docs.xperience.io/xp/developers-and-admins/configuration/headless-channel-management#Headlesschannelmanagement-Createheadlesschannels" rel="noopener noreferrer"&gt;easily activated and configured&lt;/a&gt;. It enables the efficient distribution of specific content across various applications, services, and online platforms.&lt;/p&gt;

&lt;p&gt;This approach is particularly beneficial for teams looking to enhance their digital presence without the need for extensive coding or software development. By leveraging the &lt;a href="https://graphql.org/" rel="noopener noreferrer"&gt;GraphQL&lt;/a&gt; API endpoint, developers can prepare and execute queries to retrieve the desired content. This makes the Headless Channel a valuable tool 🔨 for content managers and developers, offering ease of use and practicality.&lt;/p&gt;

&lt;h2&gt;
  
  
  The differences between REST and Headless
&lt;/h2&gt;

&lt;p&gt;If you're familiar with the REST service in Kentico Xperience, the Headless functionality is a notable enhancement, offering performance benefits and greater flexibility. While the REST service facilitates the management of system objects or pages, the Headless approach, using a GraphQL API endpoint, streamlines content distribution across multiple channels. Furthermore, it provides content editors with full control 💯 over the content they wish to distribute via the headless channel. This not only improves performance but also simplifies the process of repurposing existing content for different platforms.&lt;/p&gt;

&lt;h2&gt;
  
  
  Secured by default
&lt;/h2&gt;

&lt;p&gt;In Xperience by Kentico, the security of the Headless channel is organized through &lt;a href="https://docs.xperience.io/xp/developers-and-admins/configuration/headless-channel-management#Headlesschannelmanagement-ManageAPIkeys" rel="noopener noreferrer"&gt;token authentication&lt;/a&gt;. This model enhances data protection by securing the GraphQL API endpoint, requiring valid API keys 🔑 for access and permitting only authenticated requests with authorized security tokens.&lt;/p&gt;

&lt;p&gt;Furthermore, Xperience by Kentico's headless API supports Cross-Origin Resource Sharing (&lt;a href="https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS" rel="noopener noreferrer"&gt;CORS&lt;/a&gt;), allowing for effortless integration with web applications across different domains. Configuring the &lt;a href="https://docs.xperience.io/xp/developers-and-admins/development/content-retrieval/retrieve-headless-content#Retrieveheadlesscontent-Cross-originresourcesharing" rel="noopener noreferrer"&gt;CorsAllowedOrigins&lt;/a&gt; setting allows you to specify which domains are authorized to access the API, ensuring that responses include the necessary Access-Control-Allow-Origin header.&lt;/p&gt;
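&lt;p&gt;As a minimal sketch, the CORS setup above might look like this in &lt;code&gt;appsettings.json&lt;/code&gt;. The section and key names here are illustrative assumptions; check the linked documentation for the exact configuration keys:&lt;/p&gt;

```json
{
  "CMSHeadless": {
    "Enable": true,
    "CorsAllowedOrigins": [
      "https://campaign.example.com",
      "https://app.example.com"
    ]
  }
}
```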

&lt;h2&gt;
  
  
  Configuration of Headless content
&lt;/h2&gt;

&lt;p&gt;To effectively utilize the Headless Channel in Xperience by Kentico, it's essential to configure headless items. These items are key in determining what content types will be available through the GraphQL API. The approach involves identifying and setting up the specific content types required for your headless channel.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi1m67yor3ohztxinds2w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi1m67yor3ohztxinds2w.png" alt="Configure headless item" width="800" height="541"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For existing content stored in the Content Hub, these headless items play a pivotal role. They need to be configured to link with reusable content from the Content Hub, ensuring that this content, along with all its associated fields, is readily accessible through GraphQL queries.&lt;/p&gt;

&lt;h2&gt;
  
  
  Explained through an example scenario
&lt;/h2&gt;

&lt;p&gt;Consider the following example use case for Xperience by Kentico's headless capabilities, which involves the management and distribution of location data as reusable content. In this example, a 'location' 🌍 content model contains details such as address and directions, and it is utilized across multiple web pages through various web channels. &lt;/p&gt;

&lt;p&gt;With the headless functionality, this location data can now also be made available to external sources. By creating a headless item with a content item selector, these locations can be easily selected and offered from a central source.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5246gutd0unrehmdo1ip.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5246gutd0unrehmdo1ip.png" alt="Reusable content in headless item" width="800" height="217"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Through the built-in GraphQL UI, it is possible to easily extract the entered location data from the system. Only a few &lt;a href="https://docs.xperience.io/xp/developers-and-admins/configuration/headless-channel-management#Headlesschannelmanagement-ConfiguretheheadlessAPI" rel="noopener noreferrer"&gt;configuration adjustments&lt;/a&gt; are needed, such as enabling the GraphQL API endpoints. Activating GraphQL introspection is not mandatory, but it is quite handy as it makes the GUI tools available for exploring the schema.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Enhance your channel's API security by using the &lt;strong&gt;GraphQlEndpointPath&lt;/strong&gt; property to specify a custom and less predictable URL endpoint.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Then, we can easily retrieve our location data, including data from the linked address content item in the Content Hub.&lt;/p&gt;
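&lt;p&gt;A query along these lines could fetch that data from the GraphQL endpoint. This is a hypothetical sketch: the actual query and field names are derived from your content type code names, so use the introspection-powered GUI tools to discover the real schema:&lt;/p&gt;

```graphql
# Hypothetical query; names depend on your content model
{
  locations {
    locationName
    locationAddress {
      street
      city
    }
  }
}
```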

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6i7f1872kythr9p5va0u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6i7f1872kythr9p5va0u.png" alt="GraphQL result" width="800" height="333"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  In conclusion
&lt;/h2&gt;

&lt;p&gt;The introduction of headless functionality in Xperience by Kentico is expected to significantly impact future projects. &lt;a href="https://roadmap.kentico.com/c/189-headless-activity-tracking-data-submission-api" rel="noopener noreferrer"&gt;Headless activity tracking&lt;/a&gt; is already on the roadmap, and there's hope for expanding its capabilities to include various types of content from different channels. This indicates Kentico's commitment to evolving its headless capabilities to meet complex future needs. And with such advancements, opting for the headless channel will surely become a 'no-brainer' 😉&lt;/p&gt;

</description>
      <category>kentico</category>
      <category>headless</category>
      <category>dxp</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Kentico SaaS: First Contact</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Tue, 19 Sep 2023 10:45:08 +0000</pubDate>
      <link>https://dev.to/truelime/kentico-saas-first-contact-21bm</link>
      <guid>https://dev.to/truelime/kentico-saas-first-contact-21bm</guid>
      <description>&lt;p&gt;In this article, we will take a first look 👀 into the exciting new world of Kentico SaaS. Kentico SaaS offers customers a worry-free way to host their Xperience by Kentico websites. It provides a cloud environment equipped with all the essential components needed to run your platform smoothly. Furthermore, I will walk you through setting up a new Xperience by Kentico project from scratch, preparing it for SaaS deployment, integrating it with a CI/CD pipeline via Azure DevOps, and automatically rolling it out to various environments within Kentico SaaS. Finally, I will touch on Continuous Deployment, a much-anticipated new feature in Xperience by Kentico.&lt;/p&gt;

&lt;h2&gt;
  
  
  Exploring hosting options: Engage
&lt;/h2&gt;

&lt;p&gt;In today's world 🌎, customers expect to operate their websites seamlessly, whether it's to meet performance and scalability demands during peak moments or to ensure the security of sensitive data. Meeting these expectations is no longer an option but a fundamental requirement. Fortunately, services like Microsoft's Azure cloud ☁️ address many of these concerns when used as a platform-as-a-service. This is an area we have been diligently working on and improving for years.&lt;/p&gt;

&lt;p&gt;There are situations where it can be difficult to keep up with all the technological changes. If your agency lacks the resources to tackle this, such as providing education 🎓 for Certified DevOps Engineers, offering independent hosting services can be challenging. Organizations are increasingly looking for SaaS solutions with straightforward service levels.&lt;/p&gt;

&lt;p&gt;This is where Kentico SaaS can offer an appealing alternative 💪. Kentico SaaS offers secure, scalable, and cost-effective hosting for your platform. This service is specifically designed for Xperience by Kentico, ensuring that only essential components are deployed to run your platform. Furthermore, the platform undergoes continuous improvement, reducing the necessity for ongoing knowledge investments. If a new cloud 👽 technology becomes available, there's a good chance that it will be adopted and made available, ensuring your platform remains up-to-date.&lt;/p&gt;

&lt;p&gt;However, it's important to note that there are a few considerations, especially if you're accustomed to taking matters into your own hands and conducting thorough analyses 🧠. In such situations, you may find it necessary to enlist additional support to address any challenges in understanding platform errors, as obtaining comprehensive diagnostic info can be demanding, even though monitoring tools are available. On the other hand, you can also leverage this to your advantage 😉.&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating Kentico SaaS projects: Make it so
&lt;/h2&gt;

&lt;p&gt;Getting your application ready for Kentico SaaS was easier than I had anticipated. By running a few commands, you can make your Xperience by Kentico project SaaS-ready 💯. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Add the &lt;strong&gt;--cloud&lt;/strong&gt; parameter when creating your Xperience by Kentico project. This parameter installs a boilerplate project suitable for SaaS deployments.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The next steps involve building and configuring your application, and then preparing it for deployment. Kentico offers a PowerShell script for &lt;a href="https://docs.xperience.io/xp/developers-and-admins/deployment/deploy-to-the-saas-environment#DeploytotheSaaSenvironment-DeploytotheSaaSenvironmentdeploy" rel="noopener noreferrer"&gt;generating the Kentico SaaS deployment package&lt;/a&gt;. This package can be uploaded via the Xperience Portal, and then rolled out to different environments. This process can also be automated, as we'll discuss shortly.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flpaxy8uu1bmavujlr10v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flpaxy8uu1bmavujlr10v.png" alt="Xperience Portal" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Of course, the prerequisite is to &lt;a href="https://www.kentico.com/platforms/xperience-by-kentico/xap" rel="noopener noreferrer"&gt;subscribe to Kentico SaaS&lt;/a&gt;. There are several editions with various functionalities and features to consider. In this example, I'll be using the medium tier, which provides QA, UAT, and PROD environments, as I prefer a robust DTAP approach.&lt;/p&gt;

&lt;h2&gt;
  
  
  Streamlined Deployment: The Final Frontier
&lt;/h2&gt;

&lt;p&gt;Release pipelines have become essential building blocks for DevOps engineers. We typically use pipelines for two purposes: &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Compiling and automatically testing the application with each commit. &lt;/li&gt;
&lt;li&gt;Automatically building and deploying release-ready code.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Kentico has thoughtfully integrated these aspects into the Kentico SaaS service, offering APIs to deliver deployment packages and making it remarkably straightforward to integrate into your own release pipeline.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Send the package via a POST request to the deployment API endpoint: &lt;a href="https://xperience-portal.com/api/deployment/upload/PROJECT_GUID" rel="noopener noreferrer"&gt;https://xperience-portal.com/api/deployment/upload/PROJECT_GUID&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
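
&lt;p&gt;In a release pipeline, the upload itself is just an authenticated HTTP POST. Below is a minimal sketch in Python; the endpoint comes from the quote above, while the bearer-token header and zip content type are assumptions, so consult the Xperience Portal documentation for the exact authentication scheme:&lt;/p&gt;

```python
import urllib.request

# Endpoint from the deployment API; PROJECT_GUID is a placeholder.
url = "https://xperience-portal.com/api/deployment/upload/PROJECT_GUID"

# In a real pipeline this would be the generated deployment package file;
# a byte string stands in for the zip contents here.
package = b"...zip bytes..."

req = urllib.request.Request(
    url,
    data=package,
    method="POST",
    headers={
        "Authorization": "Bearer API_KEY",  # assumed auth scheme; API_KEY is a placeholder
        "Content-Type": "application/zip",  # assumed content type for the package
    },
)

# urllib.request.urlopen(req) would send the package and trigger the deployment.
```

&lt;p&gt;In Azure DevOps, this maps naturally onto a script step that runs right after the deployment package is generated.&lt;/p&gt;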

&lt;p&gt;Once a deployment package is received, Kentico immediately starts deploying 🚀 it to the target environment. This process takes a few minutes, and the environment is updated. Error handling is also built in, should the package encounter deployment issues.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Furh7i3lxao3mm4a4w7tn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Furh7i3lxao3mm4a4w7tn.png" alt="Ongoing deployment" width="522" height="261"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once tests are successful and you have the green light ✅ for the release, you can proceed to the next environment via the Xperience Portal. The portal provides all the necessary information to manage this process effectively, including an audit trail to ensure only authorized personnel can push releases.&lt;/p&gt;

&lt;h2&gt;
  
  
  The past is written, but the future is left for us to write
&lt;/h2&gt;

&lt;p&gt;Xperience by Kentico introduces a new feature called &lt;a href="https://docs.xperience.io/xp/developers-and-admins/ci-cd/continuous-deployment" rel="noopener noreferrer"&gt;Continuous Deployment&lt;/a&gt;. The purpose of Continuous Deployment is to facilitate incremental deployments. It shares similarities with Kentico's Continuous Integration but has some significant distinctions. The Continuous Integration feature serializes content and objects into XML files to maintain synchronization 🔄 across developer environments. While this approach is effective for version control, it is not well suited to deploying to other environments. Typically, additional tools like the &lt;a href="https://www.toolkitforkentico.com/" rel="noopener noreferrer"&gt;BizStream Toolkit&lt;/a&gt;, Content Staging, or custom deployment tools were used 😉.&lt;/p&gt;

&lt;p&gt;The workflow for Continuous Deployment is as follows: First, you need to generate the Continuous Deployment configuration file. The configuration file enables you to specify the restore mode (Create/CreateUpdate/Full) for each object, which determines the operations required in the target database during repository restoration. The next step is to generate the Continuous Deployment files. This can be achieved by using the following command: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;dotnet run --no-build -- --kxp-cd-store&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;
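
&lt;p&gt;For illustration only, the configuration file might contain entries along these lines. Every element name below is a hypothetical sketch rather than the real schema; always generate the file with Kentico's tooling and adjust the restore modes from there:&lt;/p&gt;

```xml
<!-- Hypothetical sketch; generate the real repository.config with Kentico's tooling -->
<RepositoryConfiguration>
  <!-- Restore mode per object type: Create / CreateUpdate / Full -->
  <ObjectTypeRestoreMode objectType="contentitem">CreateUpdate</ObjectTypeRestoreMode>
</RepositoryConfiguration>
```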

&lt;p&gt;When a release is submitted to the Xperience Portal, the Continuous Deployment files will be processed automatically. Once the release is completed, you will notice that the changes are available. This automation is an additional advantage when you use Kentico SaaS 😍.&lt;/p&gt;

&lt;h2&gt;
  
  
  To Boldly Go ✨
&lt;/h2&gt;

&lt;p&gt;So far, you've been able to read about my initial experiences with Kentico SaaS. I've demonstrated how simple it is to make a project SaaS-ready. I've also covered the generation of a deployment package and how to submit it to the Xperience Portal. What has pleasantly surprised 🎁 me, in particular, is the straightforwardness of the deployment process. You can see that Kentico has put considerable thought into keeping this process as simple as possible. Additionally, I find the new Continuous Deployment feature personally very appealing. It will be very interesting to see how it performs in upcoming Xperience by Kentico projects. To be continued!&lt;/p&gt;

</description>
      <category>kentico</category>
      <category>saas</category>
      <category>azure</category>
      <category>devops</category>
    </item>
    <item>
      <title>Troubleshooting "Our Services Aren't Available Right Now" Error in Safari with Azure Front Door</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Tue, 29 Aug 2023 11:13:48 +0000</pubDate>
      <link>https://dev.to/truelime/troubleshooting-our-services-arent-available-right-now-error-in-safari-with-azure-front-door-ono</link>
      <guid>https://dev.to/truelime/troubleshooting-our-services-arent-available-right-now-error-in-safari-with-azure-front-door-ono</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In my line of work, which often involves managing projects hosted on Azure and utilizing DXP platforms like &lt;a href="https://www.kentico.com/" rel="noopener noreferrer"&gt;Kentico&lt;/a&gt; and &lt;a href="https://umbraco.com/" rel="noopener noreferrer"&gt;Umbraco&lt;/a&gt;, Azure Front Door has become a vital component of our setup. We are strong advocates of this service and have utilized its capabilities to enhance various aspects of our applications. A great example of this is the built-in Web Application Firewall (WAF) capabilities and protection against DDoS attacks.&lt;/p&gt;

&lt;p&gt;However, a recent development has brought a new challenge to our attention, particularly concerning users who access our websites via Safari browsers. Interestingly, these users have reported encountering an unexpected issue. On occasion, they are met with a seemingly generic error message from Azure Front Door: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;"Our services aren't available right now. We're working to restore all services as soon as possible. Please check back soon."&lt;/strong&gt; &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This issue is interesting because Azure Front Door is configured in detection mode, with the aim of analyzing incoming traffic without actively blocking it. To determine the cause, a thorough analysis is necessary.&lt;/p&gt;

&lt;h2&gt;
  
  
  Root cause of the error
&lt;/h2&gt;

&lt;p&gt;Investigating the issue proved to be somewhat of a puzzle due to its sporadic behavior. However, we managed to make progress by reproducing the situation on a Mac. During testing, we noticed something intriguing: when Safari displayed the "Our Services Aren't Available Right Now" error message, we also observed a "&lt;strong&gt;421 Misdirected Request&lt;/strong&gt;" response in the browser. This helped guide our search for more information and led us to the following relevant post on &lt;a href="https://stackoverflow.com/questions/75165055/occasionally-receiving-421-response-code-from-azure-front-door-when-using-wildca" rel="noopener noreferrer"&gt;Stack Overflow&lt;/a&gt;. This post introduced us to the concept of "&lt;strong&gt;Domain Fronting&lt;/strong&gt;," a behavior inherent in Azure Front Door.&lt;/p&gt;

&lt;p&gt;Essentially, domain fronting is a technique that Azure Front Door blocks as a security measure: if the host header of an HTTP request doesn't match the domain in the original TLS SNI extension, the request can be rejected. Understanding this behavior shed light on the root cause behind the "Our Services Aren't Available Right Now" error experienced by Safari users and helped determine the next steps to be taken.&lt;/p&gt;

&lt;h2&gt;
  
  
  Hello, IT
&lt;/h2&gt;

&lt;p&gt;Reaching out to Microsoft was a logical step that provided valuable insights into the problem's cause, especially regarding its connection to the way Safari interprets TLS mismatches. In the course of our communication with Microsoft, we discussed the possibility of disabling domain fronting, as mentioned in the Stack Overflow post, as a potential measure to address the issue. This collaborative approach helped us to gain a more comprehensive understanding of the problem and explore potential solutions.&lt;/p&gt;

&lt;p&gt;It became clear that in our setup, the use of a multi-domain SSL certificate played a crucial role in the Safari-Azure Front Door relationship. The problem appears to be connected to how Safari interprets TLS mismatches and is not a bug within Azure Front Door. Microsoft suggested a potential solution, which involves using distinct certificates for each custom domain or considering the utilization of Azure Front Door managed certificates, incorporating individual Subject Alternative Names (SANs).&lt;/p&gt;

&lt;h2&gt;
  
  
  Fixing the problem
&lt;/h2&gt;

&lt;p&gt;As a solution to the problem, we ultimately opted to &lt;strong&gt;use a separate SSL certificate for each domain&lt;/strong&gt; hosted on the platform. Luckily, SSL certificate management can be automated within Azure Front Door. Initially, we encountered difficulties in provisioning the SSL certificates. Once again, Microsoft came to the rescue.&lt;/p&gt;

&lt;p&gt;To provision the SSL certificates, it's important that the domains are mapped to Azure Front Door via DNS. Additionally, you need to either add "digicert.com" to the CAA (Certification Authority Authorization) record or remove the CAA record altogether (leaving it blank and unlinked to any CA). Once this step is completed, Digicert can seamlessly issue the necessary SSL certificates.&lt;/p&gt;
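
&lt;p&gt;For reference, a CAA record that authorizes DigiCert uses the standard zone-file syntax shown below (example.com stands in for your own domain):&lt;/p&gt;

```
example.com.  3600  IN  CAA  0 issue "digicert.com"
```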

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this article, you've been able to read about how we went from a vague browser-specific issue to the eventual solution by implementing individual SSL certificates for the website domains in question. Along the way, it provided us with numerous insights into the capabilities of Azure Front Door. Hopefully, this article will help you if you encounter the same issue. If you have any tips or suggestions, please feel free to share them in the comments below.&lt;/p&gt;

</description>
      <category>devops</category>
      <category>azure</category>
      <category>safari</category>
      <category>frontdoor</category>
    </item>
    <item>
      <title>Essential Insights into Kentico Xperience's Hotfix Methodology for Developers</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Fri, 23 Jun 2023 12:35:18 +0000</pubDate>
      <link>https://dev.to/truelime/essential-insights-into-kentico-xperiences-hotfix-methodology-for-developers-2ep0</link>
      <guid>https://dev.to/truelime/essential-insights-into-kentico-xperiences-hotfix-methodology-for-developers-2ep0</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In this blog post, I cover some hidden magic of Kentico Xperience, specifically an internal method that silently operates in the background whenever a hotfix is implemented. While developers may not directly witness this process, it executes significant changes that can have a profound impact on the functionality of Kentico Xperience. Understanding and correctly executing this hotfix procedure is crucial to avoid potential problems that could arise within the system. &lt;/p&gt;

&lt;p&gt;In this post, I will explore the detailed workings of Kentico Xperience's hotfix methodology. By the end of this article, you will have a thorough grasp of how it operates, allowing you to apply hotfixes confidently and avoid potential errors.&lt;/p&gt;

&lt;h2&gt;
  
  
  Understanding the Hotfix Procedure
&lt;/h2&gt;

&lt;p&gt;The Kentico Xperience &lt;a href="https://docs.xperience.io/installation/hotfix-instructions-xperience-13" rel="noopener noreferrer"&gt;hotfix procedure&lt;/a&gt; involves several steps to ensure a successful implementation. First, you need to download and install the Kentico hotfix utility, which serves as a tool to facilitate the process. Once installed, you can launch the utility and follow the provided wizard, which guides you through the necessary steps. This includes executing the required database SQL scripts and updating the Kentico Xperience files, particularly the CMS files. Afterward, you need to open the live site application (MVC) in Visual Studio, where you will update the NuGet packages to the target hotfix version. To complete the procedure, you rebuild the solution and start the application to apply the hotfix effectively. By following these steps, you can ensure a smooth and error-free experience during the Kentico Xperience hotfix process.&lt;/p&gt;

&lt;h2&gt;
  
  
  Potential Challenges and Solutions during the Hotfix Procedure
&lt;/h2&gt;

&lt;p&gt;During the execution of the hotfix procedure in Kentico Xperience, various issues can arise that are immediately visible and require attention before proceeding to the next step. For instance, when executing the SQL script, it is possible for it to fail and provide instant feedback about the error encountered. Such errors need to be addressed and resolved before proceeding further. &lt;/p&gt;

&lt;p&gt;Similarly, when updating the NuGet packages and building the solution, compilation errors can occur, necessitating their resolution. Another potential area where errors can occur is when updating the administration interface through the hotfix wizard. In some cases, certain files may require manual comparison and updating, especially if they have been intentionally modified. &lt;/p&gt;

&lt;p&gt;Each of these steps collectively forms the foundation of the hotfix procedure, and it is crucial to perform them correctly. Once these steps are completed accurately, the next phase involves launching the CMS and the live site (MVC) application.&lt;/p&gt;

&lt;h2&gt;
  
  
  Testing and Validating the Hotfix
&lt;/h2&gt;

&lt;p&gt;To ensure a smooth production rollout, it is advisable to execute the SQL script on multiple database copies, preferably with representative data. This allows for thorough testing and helps identify any potential issues that may arise from variations in data between different environments. By closely mimicking the production environment, we can proactively address discrepancies and prevent complications during the hotfix deployment process.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Hidden Operations of Kentico Xperience Hotfixes
&lt;/h2&gt;

&lt;p&gt;Upon launching the hotfixed application, it is important to note that background tasks might still be in progress. Although not immediately visible during the loading process, these tasks can be traced in the event log, indicating the execution of additional operations specific to a particular hotfix version.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq9eb1qkbs7etfvazcmf5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq9eb1qkbs7etfvazcmf5.png" alt="Hotfix start and finish logged in Event Log" width="567" height="114"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A deeper analysis of the &lt;a href="https://docs.xperience.io/installation/hotfix-instructions-xperience-13/hotfix-instructions-xperience-13-source-code" rel="noopener noreferrer"&gt;source code&lt;/a&gt; reveals that class form definitions are updated in the background. These updates involve modifying or expanding table structures with new database fields. An example is the introduction of the &lt;strong&gt;DocumentUnpublishedRedirectUrl&lt;/strong&gt; field, which was added in &lt;a href="https://docs.xperience.io/release-notes-xperience-13#ReleasenotesXperience13-Ref8" rel="noopener noreferrer"&gt;refresh 8&lt;/a&gt;, also known as hotfix version 13.0.94. These updates are performed through the internal Kentico Xperience API, ensuring consistent synchronization of the database across all relevant locations.&lt;/p&gt;

&lt;h2&gt;
  
  
  Exploring the Importance of Class Form Definitions Updates
&lt;/h2&gt;

&lt;p&gt;The process of updating class form definitions in Kentico Xperience involves an additional level of complexity. It relies on a temporary table named &lt;strong&gt;[Temp_FormDefinition]&lt;/strong&gt;, which is populated with corresponding changes by the hotfix SQL script (see screenshot below). This step is particularly relevant for refreshes that introduce new functionalities to Kentico Xperience.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv2zji6wj8po7vuha51uk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv2zji6wj8po7vuha51uk.png" alt="Example SQL Script inserting the Temporary Table" width="800" height="215"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;[Temp_FormDefinition]&lt;/strong&gt; table contains an XML value that stores the class form definition. When Kentico Xperience detects the execution of a hotfix, it reads and processes this temporary table in the background. At the end of the process, the table is cleaned up and removed. Simultaneously, a Kentico Xperience setting is updated to store the hotfix version.&lt;/p&gt;

&lt;p&gt;Kentico Xperience utilizes two settings for this purpose: &lt;strong&gt;CMSHotfixVersion&lt;/strong&gt; and &lt;strong&gt;CMSHotfixDataVersion&lt;/strong&gt;. When the Kentico Xperience application starts, it compares the values of these settings. If they differ, the method responsible for reading the temporary table and processing the class form definitions is triggered. This ensures that any necessary updates are applied seamlessly during the startup process.&lt;/p&gt;
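
&lt;p&gt;As a quick sanity check, both settings can be inspected directly in the database. This sketch assumes the default &lt;strong&gt;CMS_SettingsKey&lt;/strong&gt; table of Kentico Xperience:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;SELECT KeyName, KeyValue FROM CMS_SettingsKey WHERE KeyName IN ('CMSHotfixVersion', 'CMSHotfixDataVersion')
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;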

&lt;h2&gt;
  
  
  Pitfalls of the Temporary Table in Kentico Xperience
&lt;/h2&gt;

&lt;p&gt;It is crucial to highlight the potential pitfalls that can occur if the temporary table, &lt;strong&gt;[Temp_FormDefinition]&lt;/strong&gt;, is left empty or is not fully processed. Incorrect execution can result in error messages, as illustrated by the example of the new &lt;strong&gt;DocumentUnpublishedRedirectUrl&lt;/strong&gt; field:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Caused exception:&lt;br&gt;
Invalid column name 'DocumentUnpublishedRedirectUrl'.&lt;br&gt;
, StackTrace: at CMS.DataEngine.AbstractDataConnection.HandleError(String queryText, Exception ex)&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;In a recent project, we encountered this exact issue. We had automated the hotfix rollout process, which included the automatic execution of the SQL script. However, for unknown reasons, the temporary table was either not populated or only partially populated. No hotfix-related notifications appeared in the event log, and the two settings, &lt;strong&gt;CMSHotfixVersion&lt;/strong&gt; and &lt;strong&gt;CMSHotfixDataVersion&lt;/strong&gt;, were already identical, so the class form definitions were never processed. As a result, we observed abnormal behavior and error messages caused by the source table structures not being updated. &lt;/p&gt;

&lt;p&gt;This showed us once again how important it is to maintain the correct sequence of operations during the hotfix procedure to avoid such issues.&lt;/p&gt;
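
&lt;p&gt;To rule out this scenario after running the SQL script, you can verify whether the temporary table still exists and contains the expected rows. A minimal check, assuming the default table name:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;IF OBJECT_ID('Temp_FormDefinition') IS NOT NULL SELECT COUNT(*) AS PendingDefinitions FROM Temp_FormDefinition
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;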

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In conclusion, this blog post has shed light on the under-the-hood background tasks that are crucial for a successful hotfix procedure in Kentico Xperience. While the more visible tasks are important, it is equally vital to recognize these background tasks and ensure they execute correctly. Neglecting them can lead to unexpected issues and complications within the application.&lt;/p&gt;

&lt;p&gt;Hope this helps. Happy developing!&lt;/p&gt;

</description>
      <category>kentico</category>
      <category>xperience</category>
      <category>upgrade</category>
      <category>hotfix</category>
    </item>
    <item>
      <title>Tackle 0 Byte files in Azure Blob Storage with ease using Azure PowerShell</title>
      <dc:creator>Jeroen Fürst</dc:creator>
      <pubDate>Tue, 03 Aug 2021 13:10:33 +0000</pubDate>
      <link>https://dev.to/truelime/tackle-0-byte-files-in-azure-blob-storage-with-ease-using-az-powershell-5fb7</link>
      <guid>https://dev.to/truelime/tackle-0-byte-files-in-azure-blob-storage-with-ease-using-az-powershell-5fb7</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Have you ever felt like you were facing a challenge that would take you a long time to solve? I had that feeling recently when I had to track down corrupted files in Azure Blob Storage. Fortunately, with the help of some Azure PowerShell scripts, I was able to easily trace and fix the files, which in my case meant deleting them.&lt;/p&gt;

&lt;p&gt;If this sounds familiar to you and if you are looking for a step-by-step plan to handle files in bulk, then you have come to the right place. In this article I will briefly discuss how the files got corrupted. I will then cover the manual approach that I took to locate the files. Finally, I will present a step-by-step plan of PowerShell scripts to find and delete the 0 Byte files.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cause of the problem: out of memory
&lt;/h2&gt;

&lt;p&gt;The application in question is a Digital eXperience Platform (&lt;a href="https://xperience.io" rel="noopener noreferrer"&gt;Kentico Xperience&lt;/a&gt;) that consists of typical Azure components such as an Azure Web App, Azure SQL Database and Azure Blob Storage for the storage of media files. Synchronization with the Product Information Management System takes place every night via a scheduled task, whereby the latest product data including images are updated and stored in Azure Blob Storage. The synchronization went smoothly until at some point the error message: &lt;code&gt;Exception type: System.OutOfMemoryException&lt;/code&gt; occurred. &lt;/p&gt;

&lt;p&gt;The synchronization failure was not directly linked to the broken images. Only after reports came in that images were randomly missing on the website did it become clear that there was more going on. Debugging the image processor showed that, due to the memory exception, thumbnails were generated as 0 Byte images spread over dozens of folders 😱. &lt;br&gt;
And since only the updated products are synced, searching for the broken thumbnail images is like searching for a needle in a haystack.&lt;/p&gt;

&lt;h2&gt;
  
  
  Manual approach to find and delete broken images
&lt;/h2&gt;

&lt;p&gt;In the search for the 0 Byte files I came across Azure temp and Azure cache folders on the Azure Web App. The application uses these folders to store temporary data from Azure Blob Storage. Through &lt;a href="https://github.com/projectkudu/kudu/wiki/Kudu-console" rel="noopener noreferrer"&gt;Kudu Console&lt;/a&gt;, a service available for Azure Web Apps, it is possible to navigate through these folders using the command prompt. The following command line script searches for all 0 Byte files and writes their paths to a txt file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;forfiles /S /M &lt;span class="k"&gt;*&lt;/span&gt;.&lt;span class="k"&gt;*&lt;/span&gt; /C &lt;span class="s2"&gt;"cmd /c if @fsize EQU 0 (if @isdir EQU FALSE echo @path)"&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; list.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Source: &lt;a href="https://www.ryadel.com/en/find-list-zero-byte-files-windows-linux-cmd-batch-terminal-prompt/" rel="noopener noreferrer"&gt;How to find and list zero byte files in Windows and Linux&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Now that we have a list of broken files, we can track them down in the Azure Blob Storage media container. For this I can highly recommend the &lt;a href="https://azure.microsoft.com/en-us/features/storage-explorer/" rel="noopener noreferrer"&gt;Microsoft Azure Storage Explorer&lt;/a&gt;  💯.&lt;/p&gt;

&lt;p&gt;This approach works fine if you only need to go through a reasonable amount of files and folders. In my case it concerned dozens of files in even more folders. That is why I went looking for a way to search and delete all 0 Byte files in a single go. Azure PowerShell to the rescue!&lt;/p&gt;

&lt;h2&gt;
  
  
  Azure PowerShell step-by-step guide to find and remove 0 Byte files
&lt;/h2&gt;

&lt;p&gt;This guide is based on Azure PowerShell commands. For more info check out how you can &lt;a href="https://docs.microsoft.com/en-us/powershell/azure/install-az-ps" rel="noopener noreferrer"&gt;install the Azure Az PowerShell module&lt;/a&gt;. The steps consist of Azure PowerShell commands to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Access the Azure subscription&lt;/li&gt;
&lt;li&gt;Specify the Azure Storage account &lt;/li&gt;
&lt;li&gt;Execute a search command to find the files&lt;/li&gt;
&lt;li&gt;Extend the search command with an instruction to delete&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In the following steps we will need to provide some data from Azure. It is recommended to check whether you have the required permissions to access the above-mentioned Azure resources via Azure PowerShell.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Note: I recommend testing the scripts extensively in a test environment first before going wild on production.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Connect to your Azure account&lt;/strong&gt;&lt;br&gt;
You can optionally specify the desired Azure tenant by passing in the &lt;code&gt;-TenantId&lt;/code&gt; parameter. See the &lt;a href="https://docs.microsoft.com/en-us/powershell/module/az.accounts/connect-azaccount" rel="noopener noreferrer"&gt;documentation&lt;/a&gt; for more info.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Connect-AzAccount 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step 2: Create the Azure storage context&lt;/strong&gt;&lt;br&gt;
Next, we will indicate which Azure Blob Storage we are going to target. This can be done using an &lt;a href="https://docs.microsoft.com/en-us/powershell/module/az.storage/new-azstoragecontext" rel="noopener noreferrer"&gt;Azure storage context&lt;/a&gt;. Since we need the context in the next steps, we store the Azure storage context in the &lt;code&gt;$Context&lt;/code&gt; variable.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$Context&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; New-AzStorageContext &lt;span class="nt"&gt;-StorageAccountName&lt;/span&gt; &lt;span class="s2"&gt;"&amp;lt; Account Name &amp;gt;"&lt;/span&gt; &lt;span class="nt"&gt;-StorageAccountKey&lt;/span&gt; &lt;span class="s2"&gt;"&amp;lt; Storage Key ends with == &amp;gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Step 3: Search for the files&lt;/strong&gt;&lt;br&gt;
Using the Azure storage context stored in the previous step, we can retrieve files from a desired Azure container. By extending the &lt;a href="https://docs.microsoft.com/en-us/powershell/module/az.storage/get-azstorageblob" rel="noopener noreferrer"&gt;Get-AzStorageBlob&lt;/a&gt; command with a &lt;a href="https://docs.microsoft.com/en-us/powershell/module/Microsoft.PowerShell.Core/Where-Object" rel="noopener noreferrer"&gt;Where-Object&lt;/a&gt; pipeline we can specify exactly what we need, namely all 0 Byte files.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Get-AzStorageBlob &lt;span class="nt"&gt;-Context&lt;/span&gt; &lt;span class="nv"&gt;$Context&lt;/span&gt; &lt;span class="nt"&gt;-Container&lt;/span&gt; &lt;span class="s2"&gt;"&amp;lt; Container Name &amp;gt;"&lt;/span&gt;  | Where-Object &lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="nv"&gt;$_&lt;/span&gt;.Length &lt;span class="nt"&gt;-eq&lt;/span&gt; 0&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You could extend the where condition to filter only images (jpg) by additionally passing in the content type:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Get-AzStorageBlob &lt;span class="nt"&gt;-Context&lt;/span&gt; &lt;span class="nv"&gt;$Context&lt;/span&gt; &lt;span class="nt"&gt;-Container&lt;/span&gt; &lt;span class="s2"&gt;"&amp;lt; Container Name &amp;gt;"&lt;/span&gt;  | Where-Object &lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="nv"&gt;$_&lt;/span&gt;.Length &lt;span class="nt"&gt;-eq&lt;/span&gt; 0 &lt;span class="nt"&gt;-and&lt;/span&gt; &lt;span class="nv"&gt;$_&lt;/span&gt;.ContentType &lt;span class="nt"&gt;-eq&lt;/span&gt; &lt;span class="s2"&gt;"image/jpeg"&lt;/span&gt;&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
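
&lt;p&gt;Before deleting anything, it can be reassuring to count the matches first. Piping the same command into &lt;a href="https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/measure-object" rel="noopener noreferrer"&gt;Measure-Object&lt;/a&gt; returns the number of 0 Byte blobs found:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Get-AzStorageBlob -Context $Context -Container "&amp;lt; Container Name &amp;gt;" | Where-Object {$_.Length -eq 0} | Measure-Object
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;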



&lt;p&gt;&lt;strong&gt;Step 4: Remove the 0 Byte files&lt;/strong&gt;&lt;br&gt;
Once you have tracked down the broken images, deleting them is super simple. You only need to add the &lt;a href="https://docs.microsoft.com/en-us/powershell/module/az.storage/remove-azstorageblob" rel="noopener noreferrer"&gt;Remove-AzStorageBlob&lt;/a&gt; pipeline at the end of the command from the previous step:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Get-AzStorageBlob &lt;span class="nt"&gt;-Context&lt;/span&gt; &lt;span class="nv"&gt;$Context&lt;/span&gt; &lt;span class="nt"&gt;-Container&lt;/span&gt; &lt;span class="s2"&gt;"&amp;lt; Container Name &amp;gt;"&lt;/span&gt;  | Where-Object &lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="nv"&gt;$_&lt;/span&gt;.Length &lt;span class="nt"&gt;-eq&lt;/span&gt; 0&lt;span class="o"&gt;}&lt;/span&gt; | Remove-AzStorageBlob
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With the additional JPEG content type filter applied:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Get-AzStorageBlob &lt;span class="nt"&gt;-Context&lt;/span&gt; &lt;span class="nv"&gt;$Context&lt;/span&gt; &lt;span class="nt"&gt;-Container&lt;/span&gt; &lt;span class="s2"&gt;"&amp;lt; Container Name &amp;gt;"&lt;/span&gt;  | Where-Object &lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="nv"&gt;$_&lt;/span&gt;.Length &lt;span class="nt"&gt;-eq&lt;/span&gt; 0 &lt;span class="nt"&gt;-and&lt;/span&gt; &lt;span class="nv"&gt;$_&lt;/span&gt;.ContentType &lt;span class="nt"&gt;-eq&lt;/span&gt; &lt;span class="s2"&gt;"image/jpeg"&lt;/span&gt;&lt;span class="o"&gt;}&lt;/span&gt; | Remove-AzStorageBlob
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
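
&lt;p&gt;Remove-AzStorageBlob also supports the common &lt;code&gt;-WhatIf&lt;/code&gt; parameter, which performs a dry run that shows what would be deleted without actually removing anything:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Get-AzStorageBlob -Context $Context -Container "&amp;lt; Container Name &amp;gt;" | Where-Object {$_.Length -eq 0} | Remove-AzStorageBlob -WhatIf
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;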



&lt;p&gt;And with that you have all the scripts needed to find the broken files and delete them in bulk 😊.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this article I shared my approach to tackling an issue in which images became corrupted due to memory problems in the Azure web application. In the first part of the article I showed how to manually detect and delete the broken files using a combination of Kudu Console and the Microsoft Azure Storage Explorer. Because my case involved a substantial number of files, I also provided a step-by-step guide of Azure PowerShell commands to search for the relevant files and delete them in one go.&lt;/p&gt;

&lt;p&gt;In the end I hope that you don't run into these situations and that your application continues to run smoothly. In all other cases I hope this post was helpful.&lt;/p&gt;

&lt;p&gt;Thank you for reading!&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>powershell</category>
    </item>
  </channel>
</rss>
