<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Matt Allford</title>
    <description>The latest articles on DEV Community by Matt Allford (@mattallford).</description>
    <link>https://dev.to/mattallford</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F367882%2Fd76bda01-4a37-4e0b-bb34-50bb4eb9cc63.jpeg</url>
      <title>DEV Community: Matt Allford</title>
      <link>https://dev.to/mattallford</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/mattallford"/>
    <language>en</language>
    <item>
      <title>Beyond skeleton pipelines: who owns your software pipeline?</title>
      <dc:creator>Matt Allford</dc:creator>
      <pubDate>Thu, 14 Aug 2025 00:41:08 +0000</pubDate>
      <link>https://dev.to/mattallford/beyond-skeleton-pipelines-who-owns-your-software-pipeline-3dn9</link>
      <guid>https://dev.to/mattallford/beyond-skeleton-pipelines-who-owns-your-software-pipeline-3dn9</guid>
      <description>&lt;p&gt;Your current software delivery processes are probably working fine. They build your code, maybe run a test or two, and get your software to production. That's no small achievement, but here's a question that might make you pause.&lt;/p&gt;

&lt;p&gt;When did someone last spend time improving them?&lt;/p&gt;

&lt;p&gt;If you're like many software teams I've worked with, the answer is probably "not recently". Maybe not ever.&lt;/p&gt;

&lt;p&gt;I've worked with development teams at different maturity levels and have repeatedly seen the same pattern: great teams who are busy building software but aren't large enough to have platform teams or dedicated operations folks responsible for the software delivery process. They typically start with a template from their cloud provider, their delivery tooling vendor, or, these days, their favorite LLM. Initially, it gets the job done, but I refer to these as skeleton pipelines: the bare minimum framework to ship code.&lt;/p&gt;

&lt;p&gt;Skeleton pipelines are a great starting point. They get you from nothing to shipping code quickly, but your pipelines are living, breathing artifacts. Before long, they need some meat added to the skeleton to provide functionality beyond the basics and make your delivery process robust and delightful.&lt;/p&gt;

&lt;p&gt;I've seen teams experience friction with their software release process but not treat it as a bug or feature improvement that needs to be logged, prioritized, and worked on. If your deployment pipelines are responsible for delivering your product to customers, shouldn't you treat them as part of your product? When was the last time someone added a pipeline improvement to your backlog or spent time making your deployment process better?&lt;/p&gt;

&lt;p&gt;The important shift is treating delivery process improvements as regular development work, not spare-time projects that inevitably get pushed aside.&lt;/p&gt;

&lt;h2&gt;
  
  
  The ownership vacuum
&lt;/h2&gt;

&lt;p&gt;Ask any group of engineers, "Who owns your software delivery pipelines?" and I bet you'll get different answers. When writing this post, I put that question into a search engine and reviewed the discussions; the answers weren't surprising. Some say developers should own what they build, others insist it's up to platform or DevOps engineers. Then there's the view that everyone is responsible because DevOps is about everyone working together. My observation is that often, no one owns it, and that's when it becomes a problem.&lt;/p&gt;

&lt;p&gt;The distinction that matters isn't really about ownership. It's about accountability. Someone needs to be accountable for ensuring your delivery process evolves, improves, and supports your organization with shipping software rather than being a friction-filled process people loathe working with. Without that accountability, pipelines can easily become unmanaged components that gradually deteriorate until they become technical debt and a bottleneck instead of an enabler.&lt;/p&gt;

&lt;p&gt;There's no one-size-fits-all answer for who should own and be accountable for your pipelines. You must discuss and agree on it based on your maturity, size, team structure, and individual skill sets. The important thing is that it's discussed and agreed upon, not left to assumptions. Ask yourself now whether everyone on the team knows how and where to raise an issue with the deployment process today. If the answer is no, start there. Determine who's responsible and establish clear channels for pipeline feedback, problems, and improvements.&lt;/p&gt;

&lt;h2&gt;
  
  
  The cost of skeleton pipelines
&lt;/h2&gt;

&lt;p&gt;This ownership vacuum creates predictable problems I've seen many times. Teams fall into the "if it ain't broke, don't fix it" mentality, leaving pipelines as-is for months or years. The issue is that what worked well 6, 12, or 18 months ago likely isn't fit for purpose today. New tooling and techniques exist now that didn't then. Your team will likely ship more frequently—or at least want to—and handle more complex deployments than when you first created that pipeline. The infrastructure you're deploying to has likely changed. Your company may now employ developers and engineers who weren't there when the pipeline was built. You're missing out on tooling and practices that could: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Save hours every week&lt;/li&gt;
&lt;li&gt;Bring new functionality to your software delivery lifecycle that adds business value&lt;/li&gt;
&lt;li&gt;Catch issues before they become customer problems&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Worse, they often become single points of knowledge. Whoever initially set up the pipeline may be the only person who understands how it works. When that person leaves or gets pulled onto other projects, your team inherits a mysterious system everyone fears touching.&lt;/p&gt;

&lt;p&gt;Fragile pipelines break at the worst possible moments. They miss opportunities to catch issues early, optimize build times, have predictable outcomes, or provide better feedback to developers. Maybe, most importantly, they frustrate your team and slow down what they're supposed to enable: shipping great software and products.&lt;/p&gt;

&lt;h2&gt;
  
  
  You can't afford to ignore this
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;We're too busy building features to worry about pipeline improvements.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Sound familiar? It's a common pushback I hear from teams, and I get it. When trying to ship products and keep customers happy, spending time on what feels like an internal process is often considered a luxury you can't afford.&lt;/p&gt;

&lt;p&gt;Platform teams are getting a lot of attention lately, and for good reason, as they work well for larger organizations. But if your company doesn't need everything else a platform team typically handles, creating one just for pipeline ownership is unnecessary. The reality is that you don't need a dedicated platform team to own your pipelines. You can borrow their mindset by treating your delivery process as part of your product. Pipeline improvements get backlog items, performance issues get bug tickets, and deployment pain points get the same attention as user-reported problems.&lt;/p&gt;

&lt;p&gt;Start by paying attention to where your process hurts: deployments that make you nervous, a team that loathes "release day", releases that fail for mysterious reasons, or manual testing of things your team could automate. These pain points are your roadmap for high-value improvements.&lt;/p&gt;

&lt;h2&gt;
  
  
  Making the investment
&lt;/h2&gt;

&lt;p&gt;Improvements may include: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Adding vulnerability scanning during builds&lt;/li&gt;
&lt;li&gt;Configuring integration tests against ephemeral infrastructure&lt;/li&gt;
&lt;li&gt;Properly storing and versioning your deployment artifacts&lt;/li&gt;
&lt;li&gt;Implementing flexible approval workflows&lt;/li&gt;
&lt;li&gt;Creating a robust deployment strategy&lt;/li&gt;
&lt;li&gt;Measuring deployment metrics&lt;/li&gt;
&lt;/ul&gt;
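&lt;p&gt;As one hedged illustration of the first item (assuming GitHub Actions and the open-source Trivy scanner; the step names, image tag, and action version are illustrative, and your CI system will differ), vulnerability scanning can be a single extra step in a build job:&lt;/p&gt;

```yaml
# Hypothetical GitHub Actions job fragment: scan the freshly built image
# and fail the build on serious findings. Names and versions are examples.
- name: Build image
  run: docker build -t myapp:${{ github.sha }} .

- name: Scan image for vulnerabilities (Trivy)
  uses: aquasecurity/trivy-action@0.28.0
  with:
    image-ref: myapp:${{ github.sha }}
    exit-code: "1"            # non-zero exit fails the pipeline
    severity: CRITICAL,HIGH   # only break the build on serious issues
```

&lt;p&gt;Even a small step like this moves a pain point from "something we should do" into the pipeline itself.&lt;/p&gt;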

&lt;p&gt;I'm not giving you a shopping list of things to do. Instead, I recognize that "you don't know what you don't know", because I've been there too, and you may be experiencing issues you didn't know had solutions.&lt;/p&gt;

&lt;p&gt;We have a white paper titled &lt;a href="https://octopus.com/whitepapers/ten-pillars-of-pragmatic-deployments" rel="noopener noreferrer"&gt;The ten pillars of pragmatic deployments&lt;/a&gt;. I bring this to your attention not for a checklist of everything you should do but for insight into what we think good deployments look like based on years of experience building a deployment automation tool. The ideas discussed in this post might help you enhance your software delivery process today or provide insight into issues and solutions you may not have been aware of.&lt;/p&gt;

&lt;h2&gt;
  
  
  Take ownership
&lt;/h2&gt;

&lt;p&gt;It's easy to overlook software delivery pipelines. They're usually out of sight until they get in your way, kind of like plumbing: invisible when it works and catastrophic when it doesn't. Excellent plumbing lets everything in your house function smoothly, and great delivery processes do the same for your product development.&lt;/p&gt;

&lt;p&gt;Claim ownership of your software delivery processes and ensure someone is accountable for their care and feeding. Talk to your engineers involved in developing and releasing software, and find out where the pain is. Implement easy-to-use feedback loops, add pipeline improvements to your backlog, and treat deployment pain points as bugs worth investing in and fixing.&lt;/p&gt;

&lt;p&gt;Your future self and team will thank you when your next deployment goes smoothly instead of keeping everyone up after hours or nervously huddling around a few pizzas doing your monthly release with crossed fingers.&lt;/p&gt;

</description>
      <category>devops</category>
      <category>cicd</category>
    </item>
    <item>
      <title>What GitOps changes about elevated access</title>
      <dc:creator>Matt Allford</dc:creator>
      <pubDate>Wed, 30 Jul 2025 23:41:26 +0000</pubDate>
      <link>https://dev.to/mattallford/what-gitops-changes-about-elevated-access-277l</link>
      <guid>https://dev.to/mattallford/what-gitops-changes-about-elevated-access-277l</guid>
      <description>&lt;p&gt;Recently, we surveyed the industry to gain insights into the adoption and challenges of real-world GitOps, with the results forming the &lt;a href="https://octopus.com/publications/state-of-gitops-report" rel="noopener noreferrer"&gt;State of GitOps report&lt;/a&gt;. While reviewing the trends and results, the data around one key finding jumped out at me:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;GitOps reduces elevated access&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Overall, 66% of respondents agreed. Among organizations with higher GitOps maturity, agreement rose to 77%, but interestingly, there was also a slight increase in disagreement among the highest performers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fee31lnuvp0mzwqbg0try.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fee31lnuvp0mzwqbg0try.png" alt="Bar chart showing responses to ‘GitOps reduces elevated access’ across groups with different GitOps maturity scores." width="800" height="597"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So, does GitOps change or reduce the need for elevated environment access, and should it?&lt;/p&gt;

&lt;h2&gt;
  
  
  How GitOps shifts access away from infrastructure
&lt;/h2&gt;

&lt;p&gt;The four GitOps principles encourage you to avoid manually logging into environments and performing tasks. The goal is to manage system state declaratively through version-controlled configuration and apply those states automatically using reconciliation agents.&lt;/p&gt;

&lt;p&gt;If fully embraced, GitOps agents continuously observe the system state, correcting any drift by reapplying the desired state defined in version control. Continuous reconciliation discourages manual changes and automatically reverses them if they occur.&lt;/p&gt;
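&lt;p&gt;To make the loop concrete, here is a toy sketch of the reconciliation pattern (plain Python, not any real agent’s code; the resource names are invented): compare the desired state from version control with the observed state, and emit the corrections.&lt;/p&gt;

```python
# Toy model of a GitOps reconciliation loop: desired state comes from
# version control, actual state is observed, and any drift (including
# manual changes) is corrected back toward the desired state.

def reconcile(desired: dict, actual: dict) -> dict:
    """Return the actions needed to converge actual state onto desired."""
    actions = {}
    for name, spec in desired.items():
        if actual.get(name) != spec:
            actions[name] = ("apply", spec)   # create, or correct drift
    for name in actual:
        if name not in desired:
            actions[name] = ("delete", None)  # remove unmanaged resources
    return actions

# Desired state as declared in Git...
desired = {"web": {"replicas": 3}, "db": {"replicas": 1}}
# ...and observed state after a manual tweak ("web") and a stray resource.
actual = {"web": {"replicas": 5}, "db": {"replicas": 1}, "debug-pod": {}}

print(reconcile(desired, actual))
# → {'web': ('apply', {'replicas': 3}), 'debug-pod': ('delete', None)}
```

&lt;p&gt;Real agents run this comparison continuously, on an interval or in response to events, which is what makes manual changes short-lived.&lt;/p&gt;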

&lt;p&gt;In traditional operations teams, direct access to systems over SSH or with tools like kubectl is common, but in a mature GitOps model such access should be a break-glass exception.&lt;/p&gt;

&lt;p&gt;Most survey respondents agree that GitOps should reduce the need for privileged production access by replacing hands-on changes with declarative, auditable workflows.&lt;/p&gt;

&lt;h2&gt;
  
  
  Rethinking what elevated access means with GitOps
&lt;/h2&gt;

&lt;p&gt;So why might some respondents to the survey disagree that embracing GitOps reduces elevated access?&lt;/p&gt;

&lt;p&gt;At lower maturity levels, teams may treat Git as a safe, developer-focused system. But as Git becomes the source of truth for managing production infrastructure and applications, a key question emerges:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If a developer can change a manifest in Git for a production environment, does that amount to elevated access?&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;In a GitOps model, Git becomes the control plane, meaning write access to a production configuration in Git, especially with automatic reconciliation, becomes a form of production access.&lt;/p&gt;

&lt;p&gt;High performers likely consider GitOps secure only when repositories are configured well and mature environment promotion workflows are in place. However, Git isn’t the only sensitive surface in a GitOps architecture. You must also secure the reconciliation controller and supporting workflow or promotion tooling with the same rigor. Together, these systems form the modern delivery pipeline and deserve the access scrutiny once reserved for direct infrastructure.&lt;/p&gt;

&lt;p&gt;High performers likely aren’t ignoring or rejecting GitOps’ security benefits, but instead acknowledge that you must treat Git itself as a sensitive operational boundary now more than ever. It’s now one of several critical control points in the delivery process that need strict governance.&lt;/p&gt;

&lt;p&gt;Interestingly, these same high performers agreed that GitOps reduces overall elevated access. So, the takeaway isn’t contradiction; it’s maturity. High-performing teams see GitOps as secure only when Git, the reconciliation controller, and supporting environment progression tooling are all treated as critical access points, with the appropriate controls in place.&lt;/p&gt;

&lt;h2&gt;
  
  
  Handling exceptions to the GitOps model
&lt;/h2&gt;

&lt;p&gt;In the early stages of adopting GitOps, it’s common for DevOps engineers, platform engineers, and even developers to rely on familiar tools like kubectl to inspect or tweak clusters and applications directly, especially in learning environments and in environments early in the deployment process, like dev. This lets them lean on familiar tools while gaining confidence in GitOps practices.&lt;/p&gt;

&lt;p&gt;The ultimate goal is for nobody to have direct access to production environments, even if read-only.&lt;/p&gt;

&lt;p&gt;In the &lt;a href="https://octopus.com/publications/state-of-gitops-report" rel="noopener noreferrer"&gt;State of GitOps report&lt;/a&gt;, we surfaced six core practices of GitOps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Declarative desired state&lt;/li&gt;
&lt;li&gt;Human-readable format&lt;/li&gt;
&lt;li&gt;Responsive code review&lt;/li&gt;
&lt;li&gt;Version control&lt;/li&gt;
&lt;li&gt;Automatic pull&lt;/li&gt;
&lt;li&gt;Continuous reconciliation&lt;/li&gt;
&lt;/ul&gt;
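&lt;p&gt;The first two practices are the easiest to picture. A hypothetical example of a declarative, human-readable desired state kept in version control (assuming Kubernetes; the names and image are invented):&lt;/p&gt;

```yaml
# Illustrative Kubernetes Deployment: the desired state (three replicas of
# a specific image version) is declared, reviewed, and versioned in Git.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: registry.example.com/web:1.4.2  # pinned, promotable version
```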

&lt;p&gt;You won’t embrace all of these GitOps practices on day one, and that’s ok. As we learned from the DevOps movement, embracing a GitOps infrastructure and application delivery approach involves continuous improvement, learning, tweaking, and time.&lt;/p&gt;

&lt;p&gt;With that said, even at high maturity levels with GitOps where you implement many or all of the practices, there may still be valid scenarios where folks need elevated access. For example, emergency responses (break-glass scenarios) or debugging complex or unreproducible issues.&lt;/p&gt;

&lt;p&gt;In these cases, it’s crucial to have:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Break-glass procedures. Documented steps for requesting, granting, and revoking temporary access. These should include automated expiration, require multi-party approval, and be used only in exceptional circumstances.&lt;/li&gt;
&lt;li&gt;Audit trails. You must log every access event, recording who accessed and changed what, when, and why.&lt;/li&gt;
&lt;li&gt;Controlled suspension of reconciliation. Sometimes, you must temporarily disable the GitOps agent to prevent it from overwriting a manual change made during incident response. You should log and time-bound this suspension, and include procedures to reconcile safely afterward.&lt;/li&gt;
&lt;li&gt;Post-incident recovery. After manual intervention, have a defined process for reconciling the system back to the desired state, committing required changes to Git, and re-enabling automatic reconciliation.&lt;/li&gt;
&lt;/ul&gt;
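&lt;p&gt;As a hedged sketch of the controlled-suspension item (assuming Flux as the reconciliation agent; other tools offer equivalent controls, and the names here are invented), the suspension itself can be a declarative, auditable change:&lt;/p&gt;

```yaml
# Hypothetical Flux Kustomization fragment: setting spec.suspend pauses
# reconciliation so an incident-response change isn't immediately reverted.
apiVersion: kustomize.toolkit.fluxcd.io/v1
kind: Kustomization
metadata:
  name: production-apps    # illustrative name
  namespace: flux-system
spec:
  interval: 10m
  sourceRef:
    kind: GitRepository
    name: platform-config  # illustrative name
  path: ./production
  suspend: true            # remove (or set false) to resume reconciliation
```

&lt;p&gt;Committing the change to Git, rather than pausing the agent out-of-band, keeps the suspension visible, reviewable, and easy to time-bound.&lt;/p&gt;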

&lt;p&gt;These situations should be rare. Ideally, releases have already passed through multiple environments that closely mirror production in both infrastructure and configuration. If an issue forces your developers or engineers to access environments directly, question why after it’s resolved. Could they have achieved the same outcome using other systems and tooling? Is there an element of education and training required? Do you need supplementary tooling or processes to help troubleshoot and resolve issues that surface after deployment? Work on a fix, and build it into your process so the situation doesn’t recur.&lt;/p&gt;

&lt;p&gt;Releasing to production shouldn’t be a production.&lt;/p&gt;

&lt;h2&gt;
  
  
  The evolving access conversation
&lt;/h2&gt;

&lt;p&gt;GitOps won’t immediately eliminate the need for a process for elevated production access, but it should help make such access an exception rather than a routine operation. You must treat Git and GitOps tooling with the same care and control as production infrastructure. The fundamentals still apply; they just apply in more places, and different places, than you’re used to.&lt;/p&gt;

&lt;p&gt;As you mature your GitOps adoption, the conversation shifts from “do we still need access?” to “are we securing the new access surface and tooling?”.&lt;/p&gt;

&lt;p&gt;Happy deployments!&lt;/p&gt;

</description>
      <category>gitops</category>
      <category>devops</category>
      <category>cicd</category>
    </item>
    <item>
      <title>Azure Functions: Managing Authentication and Secrets</title>
      <dc:creator>Matt Allford</dc:creator>
      <pubDate>Wed, 06 May 2020 06:37:07 +0000</pubDate>
      <link>https://dev.to/cloudskills/managing-authentication-and-secrets-4h4j</link>
      <guid>https://dev.to/cloudskills/managing-authentication-and-secrets-4h4j</guid>
      <description>&lt;h1&gt;
  
  
  Table Of Contents
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Prerequisites&lt;/li&gt;
&lt;li&gt;Step 1 — Configuring a Managed Identity and Role Based Access Control&lt;/li&gt;
&lt;li&gt;Step 2 — Creating a Cost Center Function&lt;/li&gt;
&lt;li&gt;Step 3 — Accessing Secrets from Azure Key Vault&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Introduction
&lt;/h3&gt;

&lt;p&gt;Azure Functions provide flexibility with your workflows and logic processes, no doubt about it. In parts one and two of this blog series, you created different types of Azure Functions, provided input data with a trigger, configured event-based triggers and used output bindings to send data to other applications.&lt;/p&gt;

&lt;p&gt;To date in this series, you have not worked directly with Azure resources using the &lt;code&gt;Az&lt;/code&gt; PowerShell module within the function itself. This is because access to Azure resources from the functions has not yet been configured. Additionally, you have not been shown how to handle passwords or secrets you may need within the function; you definitely don't want to store those in plain text in the &lt;code&gt;run.ps1&lt;/code&gt; file itself.&lt;/p&gt;

&lt;p&gt;In this guide, you will configure a managed identity on the function app and provide access to Azure resources for that identity using Azure role-based access control. Next, you will deploy an event grid function that shows this access working by setting tags on Azure resources based on logic within the function. Finally, you will learn how to securely access secrets stored in Azure Key Vault from within a function.&lt;/p&gt;

&lt;p&gt;When you're finished, you'll be able to securely provide Azure Functions with role-based access to Azure resources and use passwords and secrets stored in Azure Key Vault in your functions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before you begin this guide you'll need the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;(optional) Familiarity with PowerShell would be beneficial&lt;/li&gt;
&lt;li&gt;An &lt;a href="https://azure.microsoft.com/en-us/" rel="noopener noreferrer"&gt;Azure Subscription&lt;/a&gt;, you can create a free account if you don't have an existing subscription&lt;/li&gt;
&lt;li&gt;An Azure function app based on PowerShell to integrate with Azure and create a new function in. If you have the function app from post one in this series, you can use that. This post follows on from post two in this series and will be using the same function app &lt;code&gt;cloudskills20200406&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 1 — Configuring a Managed Identity and Role Based Access Control
&lt;/h2&gt;

&lt;p&gt;There will be times when your functions need to read, modify, and maybe even delete resources within your Azure subscriptions. By default, the functions you create do not have permission to manage any of your Azure resources, whether in the same or a separate subscription. By configuring a managed identity and setting role-based access control, the function app (and therefore all functions within it) gains permission to manipulate the resources you grant the managed identity access to.&lt;/p&gt;

&lt;p&gt;First, log in to the &lt;a href="https://portal.azure.com/" rel="noopener noreferrer"&gt;Azure Portal&lt;/a&gt;, search for &lt;strong&gt;Function App&lt;/strong&gt; and click on your function app. Our function app is called &lt;code&gt;cloudskills20200406&lt;/code&gt;, yours will be called something different.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F801o2d1fezxu2a1wrgmb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F801o2d1fezxu2a1wrgmb.png" alt="step1-1-open-function-app" width="754" height="337"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the function app opened, click on &lt;strong&gt;Platform features&lt;/strong&gt; and then click on &lt;strong&gt;Identity&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fzl41ivnf5g4xtcdh43kx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fzl41ivnf5g4xtcdh43kx.png" alt="step1-2-open-identity-configuration" width="772" height="430"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this post you will configure a system-assigned managed identity. Toggle the status to &lt;strong&gt;On&lt;/strong&gt; and click &lt;strong&gt;Save&lt;/strong&gt;. You'll be asked if you want to enable the managed identity; click &lt;strong&gt;Yes&lt;/strong&gt; to that as well. Note that the managed identity being registered will have the same name as your function app.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F4o6k31ov37g507yr6g8f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F4o6k31ov37g507yr6g8f.png" alt="step1-3-enable-system-identity" width="408" height="315"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After a few moments the Object ID for the managed identity will be shown on this page. Copy that to the clipboard as you will use it to assign a role to the managed identity using PowerShell.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fcawkbjfkplqwrz5do1u1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fcawkbjfkplqwrz5do1u1.png" alt="step1-4-copy-managed-identity-id" width="437" height="266"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You now need to assign a role to the newly created managed identity. In this example, the managed identity is assigned the Contributor role on the subscription. You may want to scope this further depending on your requirements, such as targeting a resource group rather than an entire subscription. To find your subscription ID, use &lt;code&gt;Get-AzSubscription&lt;/code&gt;. Open up &lt;a href="https://shell.azure.com/" rel="noopener noreferrer"&gt;Cloud Shell&lt;/a&gt; and run the following command to grant the managed identity object the Contributor role on an Azure subscription:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="n"&gt;New-AzRoleAssignment&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-ObjectId&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;8aebb402-14ab-4ce7-b5c9-b4122384ffa7&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-RoleDefinitionName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;Contributor&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Scope&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;/subscriptions/b0d214f3-4b5b-45f6-a841-db43c23acbba/&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this step, you configured the function app to use the managed identity and created a new role assignment in Azure to give the managed identity access to resources. You can now manage objects in Azure from within your functions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 2 — Creating a Cost Center Function
&lt;/h2&gt;

&lt;p&gt;In an earlier blog post in this series, you created an event grid function that was triggered by a particular event, processed some data, and then wrote the processed data out to a table in Azure Storage.&lt;/p&gt;

&lt;p&gt;In this step you will create another event grid function, but instead of writing out to another application or service, the function will use the access granted by the role you assigned in the previous step to modify an Azure resource directly. The function will watch for the creation of new resource groups and, based on the value of a tag named &lt;code&gt;Env&lt;/code&gt;, add another tag to the resource group named &lt;code&gt;CostCenter&lt;/code&gt; with an appropriate value: &lt;code&gt;0001&lt;/code&gt; if the Env tag is Development, or &lt;code&gt;0002&lt;/code&gt; if the Env tag is Production.&lt;/p&gt;

&lt;p&gt;This section gives only high-level steps, as you went through them in detail in the previous post in this series. If you haven't read that post, please refer to it for the detailed walk-through. In the Azure Portal, open your function app and create a new function based on an Azure Event Grid Trigger. Add an event grid subscription with the following settings:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Topic Types&lt;/strong&gt;: Azure Subscriptions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Filter to Event Types&lt;/strong&gt;: Resource Write Success&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fboty2iwv7goiav192r9w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fboty2iwv7goiav192r9w.png" alt="step2-1-event-grid-subscription" width="665" height="593"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, after your event grid subscription is created, you will be back in the editor looking at the &lt;code&gt;run.ps1&lt;/code&gt; file. Copy the PowerShell function below and click &lt;strong&gt;Save&lt;/strong&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="kr"&gt;param&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$TriggerMetadata&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="nv"&gt;$costCenterDevelopment&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"0001"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nv"&gt;$costCenterProduction&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"0002"&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="nv"&gt;$eventSubject&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'subject'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="kr"&gt;if&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$eventSubject&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;contains&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'resourcegroups'&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nv"&gt;$rg&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;Get-AzResourceGroup&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Id&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$eventSubject&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-ErrorAction&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;SilentlyContinue&lt;/span&gt;&lt;span class="w"&gt;

  &lt;/span&gt;&lt;span class="kr"&gt;if&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$rg&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;-and&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$rg&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Tags&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'Env'&lt;/span&gt;&lt;span class="p"&gt;]){&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="kr"&gt;switch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$rg&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Tags&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'Env'&lt;/span&gt;&lt;span class="p"&gt;]){&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"Development"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nv"&gt;$rg&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Tags&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Add&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'CostCenter'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$costCenterDevelopment&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="n"&gt;Set-AzResourceGroup&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$rg&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ResourceGroupName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Tag&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$rg&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Tags&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"Production"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nv"&gt;$rg&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Tags&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Add&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'CostCenter'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$costCenterProduction&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="n"&gt;Set-AzResourceGroup&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$rg&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ResourceGroupName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Tag&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$rg&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Tags&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Notice that this code leverages the PowerShell Az cmdlets to get and set resource groups. When a PowerShell function is created, some extra files are generated behind the scenes. You will explore those further in a later blog in this series, but one of them is &lt;code&gt;requirements.psd1&lt;/code&gt;, which contains the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="c"&gt;# This file enables modules to be automatically managed by the Functions service.&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="c"&gt;# See https://aka.ms/functionsmanageddependency for additional information.&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="c"&gt;#&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;@{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s1"&gt;'Az'&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'3.*'&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This means the Az PowerShell modules are automatically downloaded by the Functions service and are available for use within the function.&lt;/p&gt;

&lt;p&gt;Finally, go ahead and create an empty test resource group so the new function is triggered. During creation, make sure you assign a tag with a name of &lt;code&gt;Env&lt;/code&gt; and a value of either &lt;code&gt;Development&lt;/code&gt; or &lt;code&gt;Production&lt;/code&gt;. You can use the PowerShell example below to create a resource group with an appropriate tag:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="n"&gt;New-AzResourceGroup&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;costcenter-rg01&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Location&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'West US'&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Tag&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;@{&lt;/span&gt;&lt;span class="nx"&gt;Env&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"Development"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After a few moments, if you watch the log stream you will notice the function is triggered. If you check the tags on your resource group, you will now have an additional tag named &lt;code&gt;CostCenter&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F9t0ee7b9acd2o19gugwh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F9t0ee7b9acd2o19gugwh.png" alt="step2-2-log-stream-output" width="800" height="176"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fx763vnp8nul4i3cmerle.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fx763vnp8nul4i3cmerle.png" alt="step2-3-resource-group-tags" width="800" height="457"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This capability gives you an effectively unlimited range of options for managing and maintaining your Azure resources from Azure Functions, scoped through role-based access control. In the final step of this post, you will learn how to securely access secrets stored in Azure Key Vault, such as passwords and API keys, from your function.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3 — Accessing Secrets from Azure Key Vault &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;Occasionally you will want your function to connect to an endpoint that it is not authorized to access by default. This could be an Azure resource, such as the guest operating system of a virtual machine, or an external service such as an alerting system. It should go without saying, but just in case: it is not a good idea to save these secrets in the &lt;code&gt;run.ps1&lt;/code&gt; file in clear text. Historically, the method was to create a new "application setting" within the function app, which has a name and a value. Application settings can be read from the PowerShell function through an environment variable such as &lt;code&gt;$env:AppSettingName&lt;/code&gt;. They are encrypted at rest and transmitted over an encrypted channel, but they can still be exposed in plain text, and there are no advanced secrets management capabilities such as versioning or auditing.&lt;/p&gt;
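&lt;p&gt;As a minimal sketch, reading an application setting from &lt;code&gt;run.ps1&lt;/code&gt; looks like reading any other environment variable (the &lt;code&gt;AlertSystemAPI&lt;/code&gt; setting name matches the one created later in this step):&lt;/p&gt;

```powershell
# Read a function app application setting from the environment.
# Application settings are surfaced to the function as environment variables.
$apiKey = $env:AlertSystemAPI

if (-not $apiKey) {
    Write-Warning "Application setting 'AlertSystemAPI' is not defined."
}
```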

&lt;p&gt;In late 2018 Microsoft announced support for accessing Azure Key Vault secrets from a function app. The good news (for backward compatibility) is that this still uses a function application setting; instead of putting the secret itself in the value, you enter a reference to the Azure Key Vault secret.&lt;/p&gt;

&lt;p&gt;In this step you will add an application setting and store the value directly in it. Next, you will create a new Azure Key Vault, provide the managed identity we created in Step 2 with access to the Key Vault, create a new secret within the vault and, finally, update the application setting to reference the Key Vault secret instead of storing the value directly.&lt;/p&gt;

&lt;p&gt;First, select your function app in the Azure Portal, click on &lt;strong&gt;Platform features&lt;/strong&gt; and then click on &lt;strong&gt;Configuration&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F0fyh5tr12kp18i18mw63.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F0fyh5tr12kp18i18mw63.png" alt="step3-1-function-app-configuration" width="762" height="431"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;strong&gt;New Application Setting&lt;/strong&gt;. For the name enter &lt;code&gt;AlertSystemAPI&lt;/code&gt;, for the value enter &lt;code&gt;KeUkcwjdBIrmgEzvKOlRuNKowAfg&lt;/code&gt;, and then click &lt;strong&gt;OK&lt;/strong&gt;. This isn't a real key for anything; it is simply an example.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F3fnd4cbkj7zbxfb5h29l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F3fnd4cbkj7zbxfb5h29l.png" alt="step3-2-add-application-setting" width="800" height="363"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the configuration screen, click &lt;strong&gt;Save&lt;/strong&gt; and then click &lt;strong&gt;Continue&lt;/strong&gt;. At this point, you can reference the value of this new application setting by using &lt;code&gt;$Env:AlertSystemAPI&lt;/code&gt; in your PowerShell code inside &lt;code&gt;run.ps1&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Next, you will create an Azure Key Vault. The examples below use PowerShell, which is easier to follow than navigating through the portal with screenshots. You can run them from Azure Cloud Shell at &lt;a href="https://shell.azure.com" rel="noopener noreferrer"&gt;https://shell.azure.com&lt;/a&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="n"&gt;New-AzKeyVault&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;CSFunctionVault&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-ResourceGroupName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;functiondemo&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-location&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Australia East"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;By default the Key Vault has no access policy assigned, so no users or accounts can access the vault. Here, you will give the managed identity of the function app (the ObjectId used below is the same one we used in Step 1) access to get secrets from the new vault, and give your own account access to get and set secrets. Just make sure to update the values of &lt;code&gt;-ObjectId&lt;/code&gt; and &lt;code&gt;-UserPrincipalName&lt;/code&gt; in the commands below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="n"&gt;Set-AzKeyVaultAccessPolicy&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-VaultName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;CSFunctionVault&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-ObjectId&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;4f14fcc6-b345-4f8c-9589-5d21edd0772c&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-PermissionsToSecrets&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;get&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-PassThru&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="n"&gt;Set-AzKeyVaultAccessPolicy&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-VaultName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;CSFunctionVault&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-UserPrincipalName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;your.name&lt;/span&gt;&lt;span class="err"&gt;@&lt;/span&gt;&lt;span class="nx"&gt;domain.com&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-PermissionsToSecrets&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;set&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="nx"&gt;get&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-PassThru&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, you will create a new secret in the vault with a name of &lt;code&gt;AlertSystemAPI&lt;/code&gt; and a value of &lt;code&gt;KeUkcwjdBIrmgEzvKOlRuNKowAfg&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$secret&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;ConvertTo-SecureString&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-String&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;'KeUkcwjdBIrmgEzvKOlRuNKowAfg'&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-AsPlainText&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Force&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="n"&gt;Set-AzKeyVaultSecret&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-VaultName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;CSFunctionVault&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;AlertSystemAPI&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-SecretValue&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$secret&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the output of the last command, you will see an ID for the secret that has been created. Copy this ID to your clipboard; you will need it in a moment.&lt;/p&gt;
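&lt;p&gt;If you didn't copy the ID, you can retrieve it at any time with &lt;code&gt;Get-AzKeyVaultSecret&lt;/code&gt; (assuming the vault and secret names used earlier in this step):&lt;/p&gt;

```powershell
# Retrieve the secret's full identifier (URI), which includes the version
(Get-AzKeyVaultSecret -VaultName CSFunctionVault -Name AlertSystemAPI).Id
```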

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ftgfnwmj8lklrn665fyo8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ftgfnwmj8lklrn665fyo8.png" alt="step3-3-key-vault-secret-id" width="800" height="223"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Lastly, update the value of the application setting in the function app to reference the secret you just created. In the Azure Portal, navigate back to your function app, click on &lt;strong&gt;Platform features&lt;/strong&gt; and then &lt;strong&gt;Configuration&lt;/strong&gt;. Click the Edit button for the entry named &lt;code&gt;AlertSystemAPI&lt;/code&gt; and replace the value with the following, making sure to replace &amp;lt;SecretID&amp;gt; with the ID that you copied in the last step:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;@Microsoft.KeyVault(SecretUri=&amp;lt;SecretID&amp;gt;)&lt;/code&gt;&lt;/p&gt;
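&lt;p&gt;With a hypothetical secret URI, the complete value looks something like the line below; your vault name and the &lt;code&gt;{secret-version}&lt;/code&gt; segment will differ:&lt;/p&gt;

```text
@Microsoft.KeyVault(SecretUri=https://csfunctionvault.vault.azure.net/secrets/AlertSystemAPI/{secret-version})
```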

&lt;p&gt;For example:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1en0moejh1m1cr1nzzbn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1en0moejh1m1cr1nzzbn.png" alt="step3-4-modify-application-setting" width="800" height="143"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You still reference this value by using &lt;code&gt;$Env:AlertSystemAPI&lt;/code&gt; in your PowerShell code inside &lt;code&gt;run.ps1&lt;/code&gt;, but now instead of the value being stored in the function app, it is stored in a central Key Vault, giving you standard secrets management capabilities such as versioning and auditing.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;In this article you configured a managed identity on a function app and created an Azure role assignment on a subscription for that identity. This allows functions within the function app to manage the Azure resources it has delegated access to. Next, you deployed another event grid function and used logic within the PowerShell function to put a cost center tag on a new resource group, depending on which environment tag value was assigned to the resource group during deployment.&lt;/p&gt;

&lt;p&gt;Finally, you reviewed the application settings within a function app to understand how environment values are stored and retrieved. Though these are encrypted at rest and in flight, they lack some core secrets management capabilities, so you enhanced this by creating a secret in a new Azure Key Vault and referencing it from the function app's application settings.&lt;/p&gt;

&lt;p&gt;Next up in this series, you will look at developing and running functions locally in Visual Studio Code, followed by deploying a function to Azure directly from Visual Studio Code. Finally, you will create a new Azure function app that integrates with a git repo in Azure DevOps and uses a CI/CD pipeline to deploy updates to the function app.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>serverless</category>
      <category>devops</category>
      <category>functions</category>
    </item>
    <item>
      <title>Azure Functions: Creating a PowerShell Event Based Function</title>
      <dc:creator>Matt Allford</dc:creator>
      <pubDate>Sun, 03 May 2020 10:42:15 +0000</pubDate>
      <link>https://dev.to/cloudskills/azure-functions-creating-a-powershell-event-based-function-1aah</link>
      <guid>https://dev.to/cloudskills/azure-functions-creating-a-powershell-event-based-function-1aah</guid>
      <description>&lt;h3&gt;
  
  
  Introduction
&lt;/h3&gt;

&lt;p&gt;In the first post in this series, you created a basic Azure Function based on an HTTP webhook trigger and learned some of the fundamental concepts of function apps and functions. While webhook functions are certainly valuable and solve real problems, you can also build powerful workflows that are driven by event data.&lt;/p&gt;

&lt;p&gt;In this guide you will create a new function that integrates with Azure Event Grid, so that the function is triggered when a particular event takes place. You will analyze the data sent to the function by Event Grid to understand what you are working with.&lt;/p&gt;

&lt;p&gt;You'll also set up an Azure Storage account to use as an output binding from the function. When the function is triggered, you will use logic to get data from the triggered event and insert that into a new row within Azure table storage in the destination storage account.&lt;/p&gt;

&lt;p&gt;The goal in this guide is to trigger a function when a file is uploaded to blob storage in a specific storage account. If the file is a CSV, the function will collect some data from the incoming event (specifically, the file name of the uploaded CSV) and, using an output binding, insert a new row into an Azure Storage table with some data from the event.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5vf5oirnoj6eu6lekx2t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5vf5oirnoj6eu6lekx2t.png" alt="overview" width="800" height="311"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When you're finished, you'll have a good understanding of how event-driven data can be used with Azure Functions and have exposure to creating output bindings to send processed data in the logic of the function to external components.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before you begin this guide you'll need the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;(optional) Familiarity with PowerShell would be beneficial&lt;/li&gt;
&lt;li&gt;An &lt;a href="https://azure.microsoft.com/en-us/" rel="noopener noreferrer"&gt;Azure Subscription&lt;/a&gt;; you can create a free account if you don't have an existing subscription&lt;/li&gt;
&lt;li&gt;An Azure function app based on PowerShell to create a new function in. If you have the function app from post one in this series, you can use that.&lt;/li&gt;
&lt;li&gt;A storage account to upload blobs to, which will act as the Event Grid trigger source. In this demo I created a separate resource group and storage account by running the following PowerShell commands from &lt;a href="https://shell.azure.com" rel="noopener noreferrer"&gt;Azure Cloud Shell&lt;/a&gt;:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="n"&gt;New-AzResourceGroup&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;BlobStorage&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Location&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Southeast Asia"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="n"&gt;New-AzStorageAccount&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;csfunctionstg01&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-ResourceGroupName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;BlobStorage&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Location&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Southeast Asia"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-SkuName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;Standard_LRS&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 1 — Creating an Event Grid Based Function
&lt;/h2&gt;

&lt;p&gt;This guide uses the function app created in the previous post in this series. If you don't have a function app, refer back to that post to create one for use here.&lt;/p&gt;

&lt;p&gt;First, log in to the &lt;a href="https://portal.azure.com/" rel="noopener noreferrer"&gt;Azure Portal&lt;/a&gt; and search for &lt;strong&gt;Function App&lt;/strong&gt;. Our function app is called &lt;code&gt;cloudskills20200406&lt;/code&gt;, so click on that. Yours will be called something different.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6mcrppfnap6l0wnt01mz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6mcrppfnap6l0wnt01mz.png" alt="step1-1-open-function-app" width="681" height="308"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the panel on the left-hand side, click the &lt;strong&gt;+&lt;/strong&gt; next to &lt;strong&gt;Functions&lt;/strong&gt; to create a new function.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6dnwf3tbijet8i2c98z9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6dnwf3tbijet8i2c98z9.png" alt="step1-2-create-new-function" width="379" height="330"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Scroll down and select the &lt;strong&gt;Azure Event Grid trigger&lt;/strong&gt; and give the function a name. Ours will be called &lt;code&gt;CSEventGridTrigger&lt;/code&gt;. Click &lt;strong&gt;Create&lt;/strong&gt; to create the new function.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fl5zx35375uoqd3hkdxfx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fl5zx35375uoqd3hkdxfx.png" alt="step1-3-new-event-grid-trigger" width="800" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You will be taken to the PowerShell code editor for the new trigger. You will notice there is a parameter named &lt;code&gt;eventGridEvent&lt;/code&gt;. This is the variable that will contain the incoming event data, which we'll need to explore after configuring an Event Grid subscription.&lt;/p&gt;

&lt;p&gt;At the top of the window, click on &lt;strong&gt;Add Event Grid subscription&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fal52b1jy7cp32jzvcxn0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fal52b1jy7cp32jzvcxn0.png" alt="step1-4-add-event-grid-subscription" width="800" height="236"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On this page you are presented with some forms to fill out to create the event subscription, which will watch the storage account we created in the prerequisites section of this guide. A few things to note:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You can optionally create filters and use additional features of the event subscription to further target which events trigger the function. For example, you could create a filter here so that only events where a CSV file was uploaded to blob storage are delivered to the function. You'll be doing this logic in the PowerShell function itself; the filter is just an example&lt;/li&gt;
&lt;li&gt;For the topic type, observe the different resources that can be selected within an event subscription&lt;/li&gt;
&lt;li&gt;For the event type, in our example this is filtered to just &lt;code&gt;Blob created&lt;/code&gt;, as this is the only event we want to trigger the function&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Fill out the basic information for the event subscription and then click on &lt;strong&gt;Create&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvu5si4k06xodk4nykk17.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvu5si4k06xodk4nykk17.png" alt="step1-5-create-event-subscription" width="681" height="734"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You've created a new PowerShell function and configured an Event Grid subscription to send data to this new function based on particular event types.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 2 — Reviewing Event Grid Data Sent to the Function
&lt;/h2&gt;

&lt;p&gt;Now that you have an event grid subscription created, you need to understand what data and properties Event Grid sends to the function. Without this information, it will be difficult to manipulate the data and write logic into your function.&lt;/p&gt;

&lt;p&gt;In the template that was created for us, the &lt;code&gt;$eventGridEvent&lt;/code&gt; variable is output to the host, so if we trigger the function we can see the data in the Logs console. An alternative method is to convert the object to JSON and output that instead, giving us the full data structure to copy into a text editor and inspect.&lt;/p&gt;

&lt;p&gt;Replace line four with the following code and click &lt;strong&gt;save&lt;/strong&gt;. Your function should look the same as the image below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;ConvertTo-Json&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;Write-Output&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F8l6wav9q2wb9uc27nu2v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F8l6wav9q2wb9uc27nu2v.png" alt="step2-1-modify-eventgrid-output" width="800" height="236"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on &lt;strong&gt;Logs&lt;/strong&gt; at the bottom to reveal the log-streaming service. Click on Expand to make the window larger. You will come back to this window in a moment to review the output.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvpqpxt8681xktjmyfkco.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvpqpxt8681xktjmyfkco.png" alt="step2-2-open-log-streaming" width="800" height="159"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, you will perform an action that triggers the function and review the event data. Open another browser tab or window with the Azure Portal and navigate to the storage account that was created in the prerequisites section of this guide. After clicking on the storage account, use the search feature in the properties pane and search for &lt;strong&gt;Containers&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fto8lub3wc1p57ks8no34.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fto8lub3wc1p57ks8no34.png" alt="step2-3-open-blob-storage" width="611" height="289"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;strong&gt;+ Container&lt;/strong&gt; to add a new container. In this guide, the container is called &lt;code&gt;test&lt;/code&gt; and has the default private access.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ftwl9xttopy04h9g1ptnb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ftwl9xttopy04h9g1ptnb.png" alt="step2-4-create-blob-container" width="800" height="174"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on the newly created container, then click &lt;strong&gt;Upload&lt;/strong&gt; and select a small test file from your computer, such as a text file. Click &lt;strong&gt;Upload&lt;/strong&gt; to upload the file.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fsi6v5b01mv4ll4qw2d4e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fsi6v5b01mv4ll4qw2d4e.png" alt="step2-5-upload-test-file" width="800" height="162"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Switch back to the window or tab where the log streaming is running within the function. After a few moments you will see that the function has executed. In the log-stream output you will see a JSON body showing the properties and data sent to the function by the Event Grid service, as shown in the image below (the highlighted text is the JSON data). You can copy this data into a text editor to refer to as you build out the logic of your function. For example, there is a property in this JSON output named &lt;code&gt;subject&lt;/code&gt;. To reference or use that in the function itself, you can now use &lt;code&gt;$eventGridEvent.subject&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ff318ijbncfg6pae2a0vs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ff318ijbncfg6pae2a0vs.png" alt="step2-6-observe-function-log-stream" width="800" height="336"&gt;&lt;/a&gt;&lt;/p&gt;
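&lt;p&gt;To see how the properties in that JSON map to dot notation, here is a minimal sketch you can run in any PowerShell session. The payload below is a made-up, trimmed example of a Blob Created event (the IDs and paths are placeholders), not output captured from this function.&lt;/p&gt;

```powershell
# Illustrative, trimmed Blob Created payload; all values are placeholders
$sampleJson = @'
{
  "subject": "/blobServices/default/containers/test/blobs/demo.txt",
  "eventType": "Microsoft.Storage.BlobCreated",
  "eventTime": "2020-04-20T10:15:00.000Z",
  "id": "11111111-2222-3333-4444-555555555555",
  "data": { "api": "PutBlob", "contentType": "text/plain" }
}
'@

$event = $sampleJson | ConvertFrom-Json

# Properties are accessed with dot notation, exactly like $eventGridEvent in run.ps1
$event.subject      # /blobServices/default/containers/test/blobs/demo.txt
$event.data.api     # PutBlob
```

&lt;p&gt;Nested properties such as &lt;code&gt;data.api&lt;/code&gt; chain the same way, which is how the function code later reads the storage operation that raised the event.&lt;/p&gt;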

&lt;p&gt;When you are finished, close the log streaming console (or collapse it) so we can see the &lt;code&gt;run.ps1&lt;/code&gt; script again. Leave your other tab or window open with the storage account where you uploaded the blob; it will be used again later in this guide.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3 — Configuring an Azure Table Storage Output
&lt;/h2&gt;

&lt;p&gt;When you come up with an idea for a function to solve a problem or add value, the function itself often won't be the end of the line. Your function will often need to send data to another application or system, whether for logging purposes, triggering another downstream process, creating an alert, and so on. Azure Functions has a concept called output bindings, which are preconfigured connectors so you don't need to manually code the connection from your function. In this step, you will create and use an output binding to write data from the function into a row in Azure Table Storage.&lt;/p&gt;

&lt;p&gt;First, you need to create the output binding for the function. In this example, you are going to output the result to a new table named &lt;code&gt;FunctionOutput&lt;/code&gt;, hosted in the same storage account that is already being used by the function app. You probably wouldn't do this in production; it just saves creating another storage account, as we already have one we can use.&lt;/p&gt;

&lt;p&gt;In the left-hand side panel under your function, click on &lt;strong&gt;Integrate&lt;/strong&gt;. Under Outputs, click on &lt;strong&gt;+ New Output&lt;/strong&gt;, select &lt;strong&gt;Azure Table Storage&lt;/strong&gt; and click &lt;strong&gt;Select&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F93jdpylts8awdg0h4uo6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F93jdpylts8awdg0h4uo6.png" alt="step3-1-create-new-output" width="800" height="333"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Change the &lt;strong&gt;Table Name&lt;/strong&gt; to &lt;code&gt;FunctionOutput&lt;/code&gt;. Notice the parameter name is set to &lt;code&gt;outputTable&lt;/code&gt;; this is what will be used later in the PowerShell function output binding. Leave the default Storage account connection, as this will automatically select the storage account associated with the function app. If you did want to target a different storage account, you could click &lt;strong&gt;new&lt;/strong&gt; and select an alternative (or create a new one). Click &lt;strong&gt;save&lt;/strong&gt; to save the output binding.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgm2hj3iohjfcwh0i58s3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgm2hj3iohjfcwh0i58s3.png" alt="step3-2-configure-storage-output" width="800" height="248"&gt;&lt;/a&gt;&lt;/p&gt;
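&lt;p&gt;Under the hood, the portal's Integrate blade writes this binding into the function's &lt;code&gt;function.json&lt;/code&gt; file. A sketch of what the resulting output binding entry looks like is below; the &lt;code&gt;connection&lt;/code&gt; value assumes the default storage account setting, so treat it as illustrative rather than an exact copy of your file.&lt;/p&gt;

```json
{
  "type": "table",
  "direction": "out",
  "name": "outputTable",
  "tableName": "FunctionOutput",
  "connection": "AzureWebJobsStorage"
}
```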

&lt;p&gt;In the left-hand side panel, click on your function name to return to the &lt;code&gt;run.ps1&lt;/code&gt; script, where we will bring everything together: some PowerShell logic to pull properties from an incoming Event Grid trigger and output them to Azure Table Storage.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 4 - Putting Logic in the PowerShell Function
&lt;/h2&gt;

&lt;p&gt;Let's do a quick recap of where we are at. So far we have:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Created a new PowerShell function based on an Event Grid trigger&lt;/li&gt;
&lt;li&gt;Created an Event Grid subscription which is looking for a &lt;code&gt;Blob Created&lt;/code&gt; event and will send the event data to the newly created function&lt;/li&gt;
&lt;li&gt;Triggered the event by uploading a test file to the blob storage and viewed the contents of the event data that is sent to the function&lt;/li&gt;
&lt;li&gt;Created an output binding based on Azure Table Storage, using the storage account already associated with the function app&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Finally, you will put some logic into the &lt;code&gt;run.ps1&lt;/code&gt; file to process the event grid data and then use &lt;code&gt;Push-OutputBinding&lt;/code&gt; to send the processed data to Azure Table Storage.&lt;/p&gt;

&lt;p&gt;Below is a small example of what you will run in the PowerShell function. Copy and paste this into the &lt;code&gt;run.ps1&lt;/code&gt; file in the Azure Portal and click &lt;strong&gt;Save&lt;/strong&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="kr"&gt;param&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$TriggerMetadata&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="n"&gt;Write-Host&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"New event occurred - &lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eventType&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="nv"&gt;$SubjectName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;subject&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;-split&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"/"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nv"&gt;$FileName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$SubjectName&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;-1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="kr"&gt;if&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$Filename&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;-like&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"*.csv"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;

    &lt;/span&gt;&lt;span class="nv"&gt;$Data&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;PSCustomObject&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;@{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;partitionkey&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="err"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;eventTime&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;rowkey&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="err"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;FileName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$Filename&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;Action&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="err"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="err"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;api&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;

    &lt;/span&gt;&lt;span class="c"&gt;# Associate values to output bindings by calling 'Push-OutputBinding'.&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;Push-OutputBinding&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;outputTable&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Value&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$Data&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5vbxz761x0z217xtdkn0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5vbxz761x0z217xtdkn0.png" alt="step4-1-update-powershell-code" width="658" height="386"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's break down some components of the PowerShell code being used in this example.&lt;/p&gt;

&lt;p&gt;First, you define some parameters to be used within the function. &lt;code&gt;$eventGridEvent&lt;/code&gt; is the parameter that stores the data from the inbound Event Grid subscription. &lt;code&gt;Write-Host&lt;/code&gt; is then used simply to show an example of writing information to the log stream.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="kr"&gt;param&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$TriggerMetadata&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="n"&gt;Write-Host&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"New event occurred - &lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;eventType&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, the code takes the data stored in &lt;code&gt;$eventGridEvent.subject&lt;/code&gt; and splits it on the forward slash. You can tell this is needed by examining the output data from Step 2. After the split, we want the file name, which is always the last item and can be accessed using the &lt;code&gt;[-1]&lt;/code&gt; index. The file name is stored in a variable named &lt;code&gt;$FileName&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$SubjectName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;subject&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;-split&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"/"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nv"&gt;$FileName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$SubjectName&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nt"&gt;-1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
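&lt;p&gt;As a quick sanity check, you can run the same split against a sample subject string in any PowerShell session. The subject value below is illustrative, following the shape of the blob event subjects seen in Step 2.&lt;/p&gt;

```powershell
# Illustrative subject value for a Blob Created event
$subject = "/blobServices/default/containers/test/blobs/data.csv"

# Splitting on "/" produces an array; the file name is always the last element
$parts = $subject -split "/"
$fileName = $parts[-1]

$fileName                  # data.csv
$fileName -like "*.csv"    # True
```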



&lt;p&gt;Finally, if the file name ends in &lt;code&gt;.csv&lt;/code&gt;, a custom PowerShell object is created. Data is pulled from the Event Grid event by accessing properties of &lt;code&gt;$eventGridEvent&lt;/code&gt;, along with the file name extracted in the previous step. The &lt;code&gt;partitionkey&lt;/code&gt; and &lt;code&gt;rowkey&lt;/code&gt; values are specific to Azure Table Storage and together uniquely identify that row of data. You don't need to worry about these for this guide, but if you are following along and weren't familiar with Table Storage, you may have wondered why those properties were included.&lt;/p&gt;

&lt;p&gt;The last step is to use &lt;code&gt;Push-OutputBinding&lt;/code&gt; to send the data in the custom object to Azure Table Storage, specifically the endpoint we defined in Step 3.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="kr"&gt;if&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$Filename&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;-like&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"*.csv"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;

    &lt;/span&gt;&lt;span class="nv"&gt;$Data&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;PSCustomObject&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;@{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;partitionkey&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="err"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;eventTime&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;rowkey&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="err"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;FileName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$Filename&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;Action&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$eventGridEvent&lt;/span&gt;&lt;span class="err"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="err"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;api&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;

    &lt;/span&gt;&lt;span class="c"&gt;# Associate values to output bindings by calling 'Push-OutputBinding'.&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;Push-OutputBinding&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;outputTable&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Value&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$Data&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 5 - Triggering the Function and Reviewing Output
&lt;/h2&gt;

&lt;p&gt;Using the Azure Portal, go to the storage account you created in the prerequisites section of this guide. You should still have it open in another browser window or tab from Step 2, with the container and test file already in place. Upload another document by clicking &lt;strong&gt;Upload&lt;/strong&gt;, select a CSV file from your computer, and then click &lt;strong&gt;Upload&lt;/strong&gt; again.&lt;/p&gt;
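&lt;p&gt;If you prefer the command line, you could also upload the CSV from Cloud Shell using the Az PowerShell module instead of the portal. The storage account name and file path below are placeholders for your own values; this is a sketch, not a required step.&lt;/p&gt;

```powershell
# Placeholders: substitute your own storage account name, key, and file path
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" `
                            -StorageAccountKey $storageAccountKey

# Uploading into the 'test' container raises the Blob Created event,
# just like an upload through the portal
Set-AzStorageBlobContent -File "./sample.csv" -Container "test" -Context $ctx
```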

&lt;p&gt;Optionally, navigate back to the log stream console of the function to monitor the function being triggered.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F9eqfm820yu04tmb67kad.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F9eqfm820yu04tmb67kad.png" alt="step5-1-review-log-stream-output" width="800" height="175"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you have Azure Storage Explorer installed, you can use it to connect to the storage account being used by the function app and look at the table storage. There should be a new table named &lt;code&gt;FunctionOutput&lt;/code&gt; with a row of data that was inserted by the output binding.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6o4lis1ijtggpjuhm13v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6o4lis1ijtggpjuhm13v.png" alt="step5-2-review-table-storage-explorer" width="800" height="372"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Not everyone has Azure Storage Explorer installed and configured, and that's OK, though it is a great utility to have. In this next step you will use PowerShell in Azure Cloud Shell to review the output data.&lt;/p&gt;

&lt;p&gt;Open a new Cloud Shell, either by clicking the Cloud Shell icon in the Azure Portal or by navigating to &lt;a href="https://shell.azure.com" rel="noopener noreferrer"&gt;https://shell.azure.com&lt;/a&gt; in a new browser window or tab. If you still have a Cloud Shell tab open from a previous step you can use that, but you may need to reconnect your session. You can use the code below to look at the data in Table Storage in the same storage account being used by your function app. Just be sure to change &lt;code&gt;$ResourceGroup&lt;/code&gt; to the name of the resource group hosting your function app.&lt;/p&gt;

&lt;p&gt;The code below assumes you used the default storage account within the function app for the output binding and that there are no other tables in that storage account.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Install the aztable Ps Module&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="n"&gt;Install-Module&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;aztable&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="c"&gt;# Get the storage key for the storage account&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nv"&gt;$ResourceGroup&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"functiondemo"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nv"&gt;$storageaccount&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;Get-AzStorageAccount&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-ResourceGroupName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$ResourceGroup&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="nv"&gt;$storageAccountKey&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Get-AzStorageAccountKey&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-ResourceGroupName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$ResourceGroup&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$storageaccount&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;StorageAccountName&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Value&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="c"&gt;# Get a storage context&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nv"&gt;$stgcontext&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;New-AzStorageContext&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-StorageAccountName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$storageaccount&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;StorageAccountName&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-StorageAccountKey&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$storageAccountKey&lt;/span&gt;&lt;span class="w"&gt;


&lt;/span&gt;&lt;span class="nv"&gt;$table&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Get-AzStorageTable&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Context&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$stgcontext&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;CloudTable&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="n"&gt;Get-AzTableRow&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Table&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$table&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;ft&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
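&lt;p&gt;Once the table accumulates more rows, you don't have to return everything. The &lt;code&gt;AzTable&lt;/code&gt; module's &lt;code&gt;Get-AzTableRow&lt;/code&gt; supports server-side filtering; the file name below is a placeholder and &lt;code&gt;$table&lt;/code&gt; is the CloudTable object from the snippet above.&lt;/p&gt;

```powershell
# Return only rows whose FileName column matches a specific value
Get-AzTableRow -Table $table -ColumnName "FileName" -Value "data.csv" -Operator Equal

# Or use an OData-style filter string for more complex queries
Get-AzTableRow -Table $table -CustomFilter "Action eq 'PutBlob'"
```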



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Faqc4hy35dwvvbb5i1vaq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Faqc4hy35dwvvbb5i1vaq.png" alt="step5-3-review-table-storage-powershell" width="800" height="370"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can see that the data we collected in our custom object inside the function has been inserted into the Azure Storage table as a new row.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this article you created a new PowerShell function based on the Azure Event Grid template. You then configured an Event Grid subscription to listen for the &lt;code&gt;Blob created&lt;/code&gt; event type on a particular storage account and send the event data to the PowerShell function. Next, you tested the trigger and observed the contents of the event by converting it to JSON, viewable in the function log stream.&lt;/p&gt;

&lt;p&gt;Afterward, you configured an output binding for Azure Table Storage using the default storage account created with the function app. You then updated the code within the &lt;code&gt;run.ps1&lt;/code&gt; file so it inserts a subset of the event data into Azure Table Storage when a CSV file is uploaded to the source blob storage account. Finally, you uploaded a CSV file to trigger the logic end-to-end and reviewed the output in Azure Table Storage.&lt;/p&gt;

&lt;p&gt;While the specific example used in this guide may not provide immediate real-world value, the purpose was for you to observe and understand the power of event-based functions and explore methods of having the function pass data on to another system or process via an output binding. The options are effectively limitless, and you can now put your imagination to work building functions that add value for you and your customers.&lt;/p&gt;

&lt;p&gt;In the next post in this series, you will create another event-based function and explore the ability to have the function authenticate and modify Azure resources and access secrets from Azure Key Vault.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>serverless</category>
      <category>devops</category>
      <category>functions</category>
    </item>
    <item>
      <title>Azure Functions: An Introduction for DevOps Engineers</title>
      <dc:creator>Matt Allford</dc:creator>
      <pubDate>Tue, 21 Apr 2020 00:48:17 +0000</pubDate>
      <link>https://dev.to/cloudskills/azure-functions-an-introduction-for-devops-engineers-4bi3</link>
      <guid>https://dev.to/cloudskills/azure-functions-an-introduction-for-devops-engineers-4bi3</guid>
      <description>&lt;h1&gt;
  
  
  Table Of Contents
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Prerequisites&lt;/li&gt;
&lt;li&gt;Understanding Function App Plans&lt;/li&gt;
&lt;li&gt;Step 1 — Creating a Function App&lt;/li&gt;
&lt;li&gt;Step 2 — Creating Your First Azure Function&lt;/li&gt;
&lt;li&gt;Step 3 — Reviewing the Function in the Azure Portal&lt;/li&gt;
&lt;li&gt;Step 4 — Running the Function&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Introduction
&lt;/h3&gt;

&lt;p&gt;Azure Functions is a serverless compute service provided by Microsoft, enabling you to run small pieces of code on demand without needing to build the underlying infrastructure to run that code. You can write code in a variety of languages and publish these blocks of code as functions, which are then hosted in Azure in a container called a Function App. Functions run when they are "triggered", and you'll see that in action in this guide. Though functions can run for longer under the Premium and App Service plans, the ultimate goal of a function should be to do a particular task as efficiently as possible.&lt;/p&gt;

&lt;p&gt;In this first part of a multi-part guide, you will review Azure Function App plans and deploy an Azure Function App. Next, you will examine the high-level components of the Function App, deploy a PowerShell-based function, and review the logic of a templated PowerShell webhook function. Afterwards, you will trigger the function using several different methods and observe the output and log stream. When you're finished, you will have a PowerShell-based Azure Function that you can modify to accept different types of input and perform actions when triggered, along with a fundamental understanding of the capability Azure Functions can provide.&lt;/p&gt;

&lt;p&gt;By the end of the series, you will have deployed functions based on different triggers, worked with output bindings to send processed data to other applications, understood how authentication and secrets management work, developed, tested, and deployed functions in Visual Studio Code, and finally, put the function code under git management and built a CI/CD pipeline in Azure DevOps to deploy your function whenever new code is committed to the master branch.&lt;/p&gt;

&lt;h2&gt;Prerequisites&lt;/h2&gt;

&lt;p&gt;Before you begin this guide you'll need the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;(optional) Familiarity with PowerShell would be beneficial&lt;/li&gt;
&lt;li&gt;An &lt;a href="https://azure.microsoft.com/en-us/" rel="noopener noreferrer"&gt;Azure Subscription&lt;/a&gt;; you can create a free account if you don't have an existing subscription&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Understanding Function App Plans&lt;/h2&gt;

&lt;p&gt;Like any cloud service, it is important to understand what it will cost and what options are available for optimizing that cost. Three plan types can be selected when you create a function app, and each influences scalability, resource availability, and support for advanced features. The plan type cannot be changed afterwards, so make sure you understand the options below upfront:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Consumption.&lt;/strong&gt; This is the default hosting plan when creating a function app. Billing is based on the number of function executions, execution time, and memory used. You only pay for the time your functions are running, and scaling is handled automatically by Microsoft. At the time of writing, Microsoft offers a free monthly grant for the consumption plan of up to 1 million executions and 400,000 GB-s (gigabyte-seconds) of resource consumption. By default, functions running under a consumption plan time out after 5 minutes, though this can be extended to a maximum of 10 minutes.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Premium.&lt;/strong&gt; You can specify a number of pre-warmed instances that are always running and ready to execute your functions. At least one warm instance is required per plan at all times, meaning there is a minimum monthly cost even if you never trigger a function. In return, there is no cold start like you will see when triggering functions on a consumption plan. Functions running under a premium plan have a default timeout of 30 minutes and can be set to run for an unlimited time. Premium plans also provide advanced capabilities compared to consumption, such as connecting the Function App to an Azure Virtual Network (VNet) for more controlled connectivity.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;App Service Plan.&lt;/strong&gt; An App Service Plan is a set of dedicated compute resources in Azure where other App Service apps, such as a Web App, can run. Microsoft provides the ability to attach Azure Functions to an App Service Plan, meaning you can use the underlying resources in the plan for multiple services. Typically you would use this plan if you already have an App Service that isn't fully utilized, or if you want to provide a custom image for your functions to run on.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The biggest downside usually associated with the consumption plan is that the functions are cold, meaning there is a small delay between when the function is called and when it actually starts executing. You will observe this later in this guide. Think about what your function is doing and whether its response needs to be as quick as possible. If another process or a user is waiting on the result, it might be worthwhile to look at one of the other plans.&lt;/p&gt;
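&lt;p&gt;As a concrete illustration of the timeout behavior mentioned above, the timeout is controlled by the &lt;code&gt;functionTimeout&lt;/code&gt; setting in the Function App's &lt;code&gt;host.json&lt;/code&gt; file. The sketch below extends a consumption-plan app to its 10-minute maximum (the value uses the hh:mm:ss format):&lt;/p&gt;

```json
{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}
```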

&lt;h2&gt;Step 1 — Creating a Function App&lt;/h2&gt;

&lt;p&gt;In this step you will create an Azure Function App using the Azure Portal, selecting PowerShell as the runtime. This provides the resources you need as the backbone for creating your first function.&lt;/p&gt;

&lt;p&gt;First, it is important to understand the resources that are required when provisioning a function:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Function App.&lt;/strong&gt; Think of this as the logical container that lets you group functions together. It needs a globally unique name in the Azure cloud.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Storage Account.&lt;/strong&gt; Internally, functions use this storage for the operations of the function app.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Optionally, you can enable Application Insights on the Function App to provide monitoring of the function app and functions.&lt;/p&gt;

&lt;p&gt;Let's get stuck in and create your first function app. Log in to the &lt;a href="https://portal.azure.com/" rel="noopener noreferrer"&gt;Azure Portal&lt;/a&gt; and search for &lt;strong&gt;Function App&lt;/strong&gt; and then click &lt;strong&gt;Create Function App&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fzfard0wb8mhsf3gh3vm7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fzfard0wb8mhsf3gh3vm7.png" alt="Step1-1-Create-Function-App" width="800" height="495"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Fill out the basic information for the function app and then click on &lt;strong&gt;Next: Hosting&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Subscription&lt;/strong&gt;: Select the Azure subscription that you will deploy this function to.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Resource Group&lt;/strong&gt;: We're creating a new resource group for this demo called functiondemo&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Function App Name&lt;/strong&gt;: This is the name of your function app and forms the URL used to access functions within it. As noted above, it needs to be globally unique.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Publish&lt;/strong&gt;: You can publish a function as a Docker container rather than raw code, but for this demo select Code.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Runtime Stack&lt;/strong&gt;: Use this to select the language that your functions will be written in. For this demo, use PowerShell.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Version&lt;/strong&gt;: Depending on the language you select, there may be different runtime versions available. At the time of writing, version 6 is the only one available for PowerShell.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Region&lt;/strong&gt;: Select the Azure region where your function app will be hosted&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6l2g71nefp6uz63q3bs8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6l2g71nefp6uz63q3bs8.png" alt="Step1-2-Basic-App-Settings" width="674" height="690"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the next page you get some options for hosting the function app. Select your desired options and then click &lt;strong&gt;Next: Monitoring&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Storage Account&lt;/strong&gt;: A storage account is required for a function app. Either choose an existing account or create a new one. We're creating a new account for this demo.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Operating System&lt;/strong&gt;: Depending on your chosen runtime, you can select the underlying operating system your functions will run on. We're using Windows, as it is the only option for PowerShell.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Plan Type&lt;/strong&gt;: As we covered in the introduction, this is where you select the plan for the function app. This can't be changed later. For this demo we're using the consumption plan.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F4lz63h197s8jweemisn9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F4lz63h197s8jweemisn9.png" alt="Step1-3-Hosting-App-Settings" width="660" height="555"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can optionally enable Application Insights and add tags to the resources being provisioned. We aren't looking at Application Insights in this demo, so we've chosen not to enable it. Disable Application Insights, then click &lt;strong&gt;Review + Create&lt;/strong&gt;. Review your selections and click &lt;strong&gt;Create&lt;/strong&gt; to provision the resources.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fugo6jwixmadvilf48i2m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fugo6jwixmadvilf48i2m.png" alt="step1-4-review-and-create" width="528" height="722"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, let's look at what was provisioned. If you followed along you will have:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;An App Service Plan&lt;/li&gt;
&lt;li&gt;An Azure Storage account&lt;/li&gt;
&lt;li&gt;An App Service&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F23zncdlbrgc8897y4kwi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F23zncdlbrgc8897y4kwi.png" alt="step1-5-resource-review" width="564" height="129"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the next step you will look at the App Service and create a function.&lt;/p&gt;

&lt;h2&gt;Step 2 — Creating Your First Azure Function&lt;/h2&gt;

&lt;p&gt;In this step you will create a PowerShell Azure Function using the Azure Portal.&lt;/p&gt;

&lt;p&gt;Using the Azure Portal, search for Function App and click on the Function App you provisioned in the previous step. The Function App is currently empty, so click either the &lt;strong&gt;+&lt;/strong&gt; next to Functions on the left-hand side or the &lt;strong&gt;+ New Function&lt;/strong&gt; button to create a new function.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fslfc72z1jh0mwjgh4wdu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fslfc72z1jh0mwjgh4wdu.png" alt="step2-1-new-function" width="800" height="373"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Several tools and methods can be used to create and configure Azure Functions. One option is to create the function in the Azure Portal, which is suitable for getting started. In a later post in this series we will look at using Visual Studio Code for local development and publishing, but for now select &lt;strong&gt;In-Portal&lt;/strong&gt; and click &lt;strong&gt;Continue&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Flwcpei454hd926if3lt3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Flwcpei454hd926if3lt3.png" alt="step2-2-development-environment" width="800" height="438"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Microsoft provides several templates for functions to help you get started quickly depending on what you want your function to do. For this demo select &lt;strong&gt;Webhook + API&lt;/strong&gt; and click &lt;strong&gt;Create&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgouduj5zvfy23zts65q3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgouduj5zvfy23zts65q3.png" alt="step2-3-function-template" width="800" height="438"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A new function is created within the function app and you are presented with PowerShell code as a template for the function.&lt;/p&gt;

&lt;p&gt;In the next step, we will review some important files that have been created as part of the function and break down the example PowerShell function to understand the workflow.&lt;/p&gt;

&lt;h2&gt;Step 3 — Reviewing the Function in the Azure Portal&lt;/h2&gt;

&lt;p&gt;At the bottom of the screen, click the arrow next to Logs. The Logs window provides a streaming view of the logs, which is useful when debugging or troubleshooting a function. You will see this in action when we run the function in the next step.&lt;/p&gt;

&lt;p&gt;On the right-hand side of the screen, click on the arrow to expand a pane where you can view the files in the function.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fce9ixthz8gwhmfpivjl4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fce9ixthz8gwhmfpivjl4.png" alt="step3-1-expand-logs-and-files" width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Each function contains two important files, visible in the panel on the right-hand side, that you need to be familiar with. You can click on each file to view it in the editor:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Run.ps1&lt;/strong&gt;. This is the PowerShell function that is currently open in the portal editor. This is the code that gets run when your function is triggered.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;function.json&lt;/strong&gt;. This is a configuration file that defines the behavior of the function, such as how it gets triggered, what authorization is required, and what input and output bindings are defined. This example defines an HTTP in binding named &lt;code&gt;Request&lt;/code&gt; and an HTTP out binding named &lt;code&gt;Response&lt;/code&gt;; you will come back to these in a few moments when we break down the PowerShell function.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fctkx2wseshtkkefeabew.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fctkx2wseshtkkefeabew.png" alt="step3-2-function-files" width="381" height="220"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Speaking of authorization, three values can be used:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;anonymous&lt;/strong&gt;: No API Key is required and the function can be triggered by anyone&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;function&lt;/strong&gt;: A function-specific API Key is required in the call to the function&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;admin&lt;/strong&gt;: The master key is required in the call to the function&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;By default, functions use the &lt;strong&gt;function&lt;/strong&gt; authorization option.&lt;/p&gt;
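&lt;p&gt;Putting the bindings and the authorization level together, the &lt;code&gt;function.json&lt;/code&gt; generated by the Webhook + API template looks similar to the following sketch (your generated file may differ slightly):&lt;/p&gt;

```json
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "Request",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "Response"
    }
  ]
}
```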

&lt;p&gt;Click on &lt;strong&gt;Run.ps1&lt;/strong&gt; if you don't already have it selected and let's break down the logic in the example function.&lt;/p&gt;

&lt;p&gt;The first line ensures the &lt;code&gt;System.Net&lt;/code&gt; namespace is available to the function. Like a standard PowerShell function, some parameters have been defined:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Request&lt;/strong&gt;: This is the name of the "HTTP in" binding defined in &lt;code&gt;function.json&lt;/code&gt; and will contain the data coming in with the HTTP trigger, via a query or a body, and the data will be available in this variable in the rest of the script.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;TriggerMetadata&lt;/strong&gt;: This parameter is optional and can be used to access extra information about the trigger. The metadata available varies from binding to binding, but all bindings contain a &lt;code&gt;sys&lt;/code&gt; property with some data.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="kr"&gt;using&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kr"&gt;namespace&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;System.Net&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="c"&gt;# Input bindings are passed in via param block.&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="kr"&gt;param&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$Request&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$TriggerMetadata&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following is an example of using Write-Host to output information to the Log stream. You will see this in the Log console when we run the function a little later on.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Write to the Azure Functions log stream.&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="n"&gt;Write-Host&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"PowerShell HTTP trigger function processed a request."&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the following lines, we check whether a Name value has been provided by the person or process that triggered the function, either as a query parameter or in a body sent with the request.&lt;/p&gt;

&lt;p&gt;If a value was provided for &lt;code&gt;Name&lt;/code&gt;, it is stored in the variable &lt;code&gt;$name&lt;/code&gt; and the &lt;code&gt;$status&lt;/code&gt; variable is set to &lt;code&gt;[HttpStatusCode]::OK&lt;/code&gt;; the function then continues to the next block below. If a value for &lt;code&gt;Name&lt;/code&gt; cannot be found on the incoming request, the &lt;code&gt;$status&lt;/code&gt; variable is set to &lt;code&gt;[HttpStatusCode]::BadRequest&lt;/code&gt; and a custom message is put into the &lt;code&gt;$body&lt;/code&gt; variable.&lt;/p&gt;

&lt;p&gt;One way or another, when this piece of code completes, we will have values for &lt;code&gt;$status&lt;/code&gt; and &lt;code&gt;$body&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Interact with query parameters or the body of the request.&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nv"&gt;$name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$Request&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Query&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Name&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="kr"&gt;if&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;-not&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nv"&gt;$name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$Request&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Body&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Name&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="kr"&gt;if&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nv"&gt;$name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nv"&gt;$status&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;HttpStatusCode&lt;/span&gt;&lt;span class="p"&gt;]::&lt;/span&gt;&lt;span class="n"&gt;OK&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nv"&gt;$body&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Hello &lt;/span&gt;&lt;span class="nv"&gt;$name&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="kr"&gt;else&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nv"&gt;$status&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;HttpStatusCode&lt;/span&gt;&lt;span class="p"&gt;]::&lt;/span&gt;&lt;span class="n"&gt;BadRequest&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nv"&gt;$body&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Please pass a name on the query string or in the request body."&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The final part of the function calls &lt;code&gt;Push-OutputBinding&lt;/code&gt; to provide a value to an output so it can be processed. In this example, the output being used is an HTTP output, and the value passed to the Name parameter is &lt;code&gt;Response&lt;/code&gt;. Remember a few moments ago when you looked at &lt;code&gt;function.json&lt;/code&gt; and saw an HTTP binding with an out direction named &lt;code&gt;Response&lt;/code&gt;? That is what is being called now, and the data in the &lt;code&gt;-Value&lt;/code&gt; parameter will be sent to and processed by the output binding. You can create different types of output bindings, and that will be explored further in the next post in this series.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Associate values to output bindings by calling 'Push-OutputBinding'.&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="n"&gt;Push-OutputBinding&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;Response&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-Value&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="n"&gt;HttpResponseContext&lt;/span&gt;&lt;span class="p"&gt;]@{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;StatusCode&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$status&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nx"&gt;Body&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$body&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can edit the &lt;code&gt;function.json&lt;/code&gt; and &lt;code&gt;run.ps1&lt;/code&gt; files directly in the Azure Portal and click Save at the top to save your changes.&lt;/p&gt;

&lt;p&gt;In the next step, you will get your hands dirty and trigger the function in a few different ways and observe the output.&lt;/p&gt;

&lt;h2&gt;Step 4 — Running the Function&lt;/h2&gt;

&lt;p&gt;Now that you understand the logic of the function, in this step you will trigger it in a few different ways and observe its behavior, using the Azure Portal to run the function and watch the logs and output.&lt;/p&gt;

&lt;p&gt;In the Azure Portal, ensure you are still on the screen with the &lt;code&gt;run.ps1&lt;/code&gt; function and you have the logs visible from the panel at the bottom, and the Test area showing in the pane on the right. Your screen should look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fu918fseowaxxx56ywb8t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fu918fseowaxxx56ywb8t.png" alt="step4-1-prepare-portal" width="800" height="464"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the test area, ensure the HTTP method is set to GET, ensure there is nothing in the name or value fields under Query (if they are showing), and then click &lt;strong&gt;Run&lt;/strong&gt; at the bottom. Notice that the function takes a few seconds to complete; this is the cold start on the consumption plan. Because we didn't pass the function a name value, either in the query or in a request body, the function fails and returns a 400 response code with a message telling us to pass a name.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fdkjpf2it2bqdbuz9363b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fdkjpf2it2bqdbuz9363b.png" alt="step4-2-test-get-noquery" width="427" height="783"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Behind the scenes, this is the actual URL that was used to trigger the function:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://cloudskills20200406.azurewebsites.net/api/HttpTrigger1?code=jWiMZha0KyfCSUOEDUXb7F9TbOgMBCHIQP52nJqnoZdsZaKB2CBhMQ==
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If we break that URL down further, this section is the FQDN of the function app and the name of the function we are triggering:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://cloudskills20200406.azurewebsites.net/api/HttpTrigger1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And below, after &lt;code&gt;?code=&lt;/code&gt; is the function key that we need to append to the trigger, because the function is expecting this for authorization:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;?code=jWiMZha0KyfCSUOEDUXb7F9TbOgMBCHIQP52nJqnoZdsZaKB2CBhMQ==
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If the authorization in the function.json file were set to anonymous, we would not need to append this key. There are a few ways to get the key; the easiest from the screen you are on is to click &lt;code&gt;&amp;lt;/&amp;gt; Get Function URL&lt;/code&gt; at the top of the screen:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F73y9v1agtc6cpmbn54us.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F73y9v1agtc6cpmbn54us.png" alt="step4-3-get-function-url" width="737" height="198"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fceuoig37xhyxiuwjlctk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fceuoig37xhyxiuwjlctk.png" alt="step4-4-get-function-url" width="538" height="112"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Go over to the right-hand pane and this time click &lt;strong&gt;Add parameter&lt;/strong&gt; under &lt;code&gt;Query&lt;/code&gt;. Enter &lt;strong&gt;name&lt;/strong&gt; for the name, put your name in the value, and click Run again. This time you will see a status 200 returned, and the output will display "Hello &amp;lt;yourname&amp;gt;".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fr3s29yyxucadblg228m2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fr3s29yyxucadblg228m2.png" alt="step4-5-test-get-query" width="427" height="783"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Behind the scenes, this is the URL that was used to run the function this time. You can see the &lt;code&gt;name&lt;/code&gt; query parameter and its value appended to the end.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://cloudskills20200406.azurewebsites.net/api/HttpTrigger1?code=jWiMZha0KyfCSUOEDUXb7F9TbOgMBCHIQP52nJqnoZdsZaKB2CBhMQ==&amp;amp;name=Matt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
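&lt;p&gt;The same GET request can also be made from your own machine rather than the portal. Here is a rough PowerShell sketch: the URL is the one from this example, and the key value is a placeholder you must replace with your own function key.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Placeholder values: substitute your own function URL and function key
$uri  = 'https://cloudskills20200406.azurewebsites.net/api/HttpTrigger1'
$code = 'your-function-key'

# Append the key and the name query parameter, then invoke the function
Invoke-RestMethod -Method Get -Uri "$uri?code=$code&amp;amp;name=Matt"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;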



&lt;p&gt;Next, instead of sending the name as a query parameter, send it as a key-value pair in a JSON body. Change the HTTP method from GET to POST and, to clean up, click &lt;strong&gt;x&lt;/strong&gt; to remove the query parameter used in the last step. The Request Body will contain a default JSON payload with a key-value pair of &lt;code&gt;name&lt;/code&gt; and &lt;code&gt;Azure&lt;/code&gt;. Leave that as-is for this demo and click on &lt;strong&gt;Run&lt;/strong&gt;. Again, a status 200 is returned and the output changes to "Hello Azure".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F337v5hc1wpxhe3qzuy6g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F337v5hc1wpxhe3qzuy6g.png" alt="step4-6-test-post" width="427" height="780"&gt;&lt;/a&gt;&lt;/p&gt;
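&lt;p&gt;The equivalent POST request can be sketched in PowerShell from outside the portal as well; again, the URL and key below are placeholders for your own values.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Placeholder values: substitute your own function URL and function key
$uri  = 'https://cloudskills20200406.azurewebsites.net/api/HttpTrigger1'
$code = 'your-function-key'

# Send the name as a key-value pair in a JSON body instead of a query string
$body = @{ name = 'Azure' } | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri "$uri?code=$code" -Body $body -ContentType 'application/json'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;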

&lt;p&gt;Hopefully you've been watching the Logs window each time the function has been triggered, though it can sometimes be a little flaky. If not, examine it now. The first and third lines tell us that the trigger has been called and has finished executing. The second line is the output from line 7 in the Run.ps1 file, showing how &lt;code&gt;Write-Host&lt;/code&gt; can be used within a function to write information to the log stream.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2020-04-06T09:56:12.450 [Information] Executing 'Functions.HttpTrigger1' (Reason='This function was programmatically called via the host APIs.', Id=29592acd-dd21-4cc0-b5de-ca9481a9148b)
2020-04-06T09:56:12.494 [Information] INFORMATION: PowerShell HTTP trigger function processed a request.
2020-04-06T09:56:12.494 [Information] Executed 'Functions.HttpTrigger1' (Succeeded, Id=29592acd-dd21-4cc0-b5de-ca9481a9148b)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
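&lt;p&gt;For reference, that INFORMATION line is produced by the templated Run.ps1. A minimal sketch of the relevant portion looks like this; the exact template may differ slightly between runtime versions:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using namespace System.Net

# Input bindings are passed in via the param block
param($Request, $TriggerMetadata)

# This Write-Host call produces the INFORMATION line seen in the log stream
Write-Host "PowerShell HTTP trigger function processed a request."
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;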



&lt;h2&gt;Conclusion&lt;/h2&gt;

&lt;p&gt;In this article you have explored the capabilities of Azure Functions and reviewed the options available for running the functions in different App Plans.&lt;/p&gt;

&lt;p&gt;You deployed a new Function App in Azure and created a new function based on a webhook trigger. Once the function app was available, you reviewed the authorization options and examined the logic provided in the default templated PowerShell function.&lt;/p&gt;

&lt;p&gt;Finally, you triggered the function using several methods that provided varying results and observed the output both in the Azure Portal and in the Log Stream.&lt;/p&gt;

&lt;p&gt;In the next post in this series, you will work with PowerShell-based functions in more detail, creating examples where a function performs actions on Azure resources based on particular logic and outputs the results using a binding other than HTTP.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>serverless</category>
      <category>devops</category>
      <category>functions</category>
    </item>
  </channel>
</rss>
