Olivier Miossec

Azure Functions with PowerShell: How I started

I started working with PowerShell in Azure Functions v2 almost at the very beginning of the preview (on 30 April 2019, to be precise). The idea of creating serverless tools with PowerShell was really exciting.

What are Azure Functions? It's the Function as a Service (FaaS) model implemented on Azure. It lets developers create APIs and other solutions by focusing only on code. You can see it this way: you provide the code, Azure provides and manages everything else.

But why use PowerShell Core to create APIs? It's not the best language for that! You have a point: PowerShell is not a first-choice language for web development. But I have two reasons to use it.

The first one: even if I know Python and .NET, I am more skilled with PowerShell, and I can do many things that would take me more time in another language.

The second: Azure Functions is not only for HTTP APIs, it's an event-driven serverless platform. PowerShell is mostly an ops language for automating everything (but you can do OOP and TDD; it's a true language). Together, Azure Functions and PowerShell Core make an event-based serverless automation tool.

In my job, I have to manage Azure resources, stop and start VMs based on different events or conditions, react to what happens to a resource, move things, query external APIs, …

Before we talk about Azure Functions, you have to understand what a Function App in Azure is:

A function app is just a kind of web app, simple enough to deploy from the portal, ARM, or the CLI (but not from PowerShell). The app can contain one or more functions, and you are only billed when a function runs (with a free tier). A function is just a folder in the function app with a run.ps1 file that contains the function code and a function.json file for the configuration of the function's trigger and bindings. One last thing: Azure Functions with PowerShell comes loaded with the Az module to manage Azure resources.
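To make that concrete, here is a minimal sketch of a function.json, sitting next to its run.ps1 in the function folder. It uses the default binding names and authLevel from the HTTP trigger template; your own file depends on the trigger you pick.

```json
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "Request",
      "methods": [ "get", "post" ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "Response"
    }
  ]
}
```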

I had some troubles during my first days with Azure Functions, but no real problems so far except one: the lack of documentation. Every output operation on a binding uses Push-OutputBinding. It's the same cmdlet for Blob, Queue, SignalR, … but you don't always know the type and/or format expected for the output value. It's almost the same with the param values coming from the trigger.
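As an illustration, here is roughly what a run.ps1 for an HTTP trigger looks like (a sketch based on the default template; the binding names Request and Response have to match the function.json shown above):

```powershell
# run.ps1 for an HTTP-triggered function
using namespace System.Net

# The parameter names come from the "name" fields in function.json
param($Request, $TriggerMetadata)

$name = $Request.Query.Name

# Push-OutputBinding writes to any output binding; only the expected value type changes
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = "Hello $name"
})
```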

You have to test and try. I had to take a look at C# examples; they give clues about how you need to format the value.

Be careful: Azure Functions uses PowerShell Core, not PowerShell 5.1 or any other Windows PowerShell version. So update your development platform first.

I had to implement timer functions, functions that run on a schedule. It is simple: you put in your code and give a CRON expression to schedule the execution. But the default time zone is UTC, so a schedule written for 2 PM runs at 2 PM UTC, not 2 PM CET. You can change that at the function app level by adding an application setting, WEBSITE_TIME_ZONE. For the possible values, take a look at this page: https://docs.microsoft.com/en-us/windows-hardware/manufacture/desktop/default-time-zones. And yes, this setting handles daylight saving time.
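For reference, a timer trigger is just a function.json with a six-field CRON expression; the sketch below fires every day at 14:00 in whatever time zone WEBSITE_TIME_ZONE defines (UTC if you leave it unset).

```json
{
  "bindings": [
    {
      "name": "Timer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 14 * * *"
    }
  ]
}
```

With WEBSITE_TIME_ZONE set to, for example, "Romance Standard Time", that 14:00 really means 2 PM CET.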

I was more frustrated by the cold start, especially for the blob trigger. It can take more than 5 minutes to fire. It's really frustrating when debugging, but if the solution can wait a few minutes before starting and you do not handle a lot of blobs, it is not a big problem. If it is, you should take a look at Event Grid. That wasn't my case, so I didn't switch to Event Grid. But I tested it, and it is not that complex to do.

One of the greatest features is the ability to give an identity to a function app. It means you can give the function app a sort of Azure AD system account in the subscription: a system-assigned managed identity. It's like an Azure AD user or service account. You can grant the function app permissions on resources via RBAC. If you want to stop, start or change a VM, you can use the function app's identity to do that.
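Here is a rough sketch of how that fits together once the system-assigned identity is enabled: the role assignment is done once from your admin workstation (the object id, scope and names below are placeholders), and the profile.ps1 generated with a PowerShell function app already signs in with the identity, so your functions can call Az cmdlets directly.

```powershell
# One-time, from your workstation: give the function app's identity the minimum role it needs
# (placeholders: replace the object id and the scope with your own values)
New-AzRoleAssignment -ObjectId "<functionAppPrincipalId>" `
    -RoleDefinitionName "Virtual Machine Contributor" `
    -Scope "/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.Compute/virtualMachines/<vmName>"

# profile.ps1 (generated with the function app): sign in with the managed identity
if ($env:MSI_SECRET) {
    Disable-AzContextAutosave -Scope Process | Out-Null
    Connect-AzAccount -Identity
}

# Then, inside a function, plain Az cmdlets run under that identity
Stop-AzVM -ResourceGroupName "<rg>" -Name "<vmName>" -Force -NoWait
```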

It is great, but it can be dangerous too. Functions receive data from external sources, and that data should be considered untrusted by default. It doesn't matter whether it comes from a trigger or a binding, HTTP or queue: you should check the data type and format before doing anything.
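For example, a queue-triggered function could refuse to act on anything that doesn't match the shape it expects. The property name vmName here is purely illustrative.

```powershell
param($QueueItem, $TriggerMetadata)

# Treat the message as untrusted: validate shape and values before touching any resource
if (-not $QueueItem.vmName) {
    throw "Missing vmName in the queue message"
}
if ($QueueItem.vmName -notmatch '^[a-zA-Z0-9-]{1,64}$') {
    throw "vmName '$($QueueItem.vmName)' does not look like a valid VM name"
}
```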

The function identity is a great feature; it lets you automate Azure with more power and possibilities than an Azure Automation account. But avoid granting too many privileges. Only give the right privilege on the right resources. Do not give the Contributor role on a subscription or a resource group.

Remember that the identity is given to the function app, not to a single function. You should take that into consideration when using a managed identity.

Another thing I learned working with Azure Functions is to avoid writing a long script that does many things. There are many reasons for that. The first one, and it's obvious: Azure Functions uses the serverless pattern, and you should too. You should also remember that the default timeout is 5 minutes. You can adjust the timeout from 1 second to 10 minutes by setting functionTimeout in the host.json at the root of the function app.
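Raising the limit is one line in host.json (this sketch assumes the consumption plan, where 10 minutes is the maximum):

```json
{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}
```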

A function needs to be dedicated to one thing, just like a function in PowerShell (or any other language). Imagine that you need to call a REST API, save the result in an Azure SQL database, start a VM based on the result of the API, and log actions in an Azure Storage table.

You could do it all in one function, but it may not scale very well, and if you hit an error you may not be able to resume the operation and you may lose data.

Instead, you can leverage triggers and bindings to create multiple functions.

In this example, we can create one function that queries and parses the API result and then adds a message to a queue; if the condition is met, it also adds a message to a second queue. We can then create two queue-triggered functions: one that handles the database insertion, and another that starts the VM. For the logging, we can also use a queue-triggered function.
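A sketch of that first function could look like this. The binding names dbQueue and vmQueue, the API_URL application setting and the condition on $result.status are made up for the example and would have to match the queue output bindings declared in function.json.

```powershell
param($Timer)

# Call the external API (API_URL is an app setting, not something hard-coded)
$result = Invoke-RestMethod -Uri $env:API_URL

# Always hand the result to the function that writes it to the database
Push-OutputBinding -Name dbQueue -Value ($result | ConvertTo-Json -Compress)

# Only when the condition is met, ask the second function to start the VM
if ($result.status -eq 'degraded') {
    Push-OutputBinding -Name vmQueue -Value (@{ vmName = $result.vmName } | ConvertTo-Json -Compress)
}
```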

With queues, if something goes wrong in one function, the message stays in the queue and another run will take care of it.

Using several functions makes debugging and testing simpler, and writing the code is less complex.

The last thing I learned is to use PowerShell modules. Azure Functions will load any module present inside the modules folder at the root of the function app folder.

Instead of writing a script for each function, I started to create a module for each function app. All the business logic lives in the module, and the Azure Functions code only calls the cmdlets I created.

I did that because it's easier to automate unit and acceptance testing with Pester in a module.
Another reason is code reuse. It's common to use the same logic for several triggers. You may have business logic that needs to run periodically with a timer trigger and on demand with an HTTP trigger. Using a module prevents writing the same code twice.
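In practice the layout is simple: a modules folder at the root of the function app holds the module, and each run.ps1 becomes a thin wrapper. The module and cmdlet names below are hypothetical.

```powershell
# run.ps1 of an HTTP-triggered function; the real logic lives in
# modules/MyAppLogic/MyAppLogic.psm1, which the PowerShell worker loads automatically
using namespace System.Net

param($Request, $TriggerMetadata)

# Invoke-MyBusinessTask is a cmdlet exported by the (hypothetical) MyAppLogic module
$result = Invoke-MyBusinessTask -Name $Request.Query.Name

Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = ($result | ConvertTo-Json)
})
```

The same Invoke-MyBusinessTask call can then be reused from a timer-triggered run.ps1 and tested on its own with Pester.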

Azure Functions with PowerShell is still in preview, but I didn't find any major bug. Sometimes I had to restart the function app. The biggest problem is the documentation. You need to learn and experiment a lot.
From an ops perspective, Azure Functions is a big change. Serverless and loosely coupled patterns are new for PowerShell scripters (and something from another world for ops people who don't script). But learning them is worth the time spent failing, discovering and experimenting.

Top comments (3)

vinayrana8727

Hi, does an Azure PowerShell Core function on the consumption plan scale?

I have an event-based trigger which may run 1000 times a second and execute a PowerShell script. But I keep getting a warning that my execution is in queue.
I read somewhere that Azure PowerShell Core functions run on 1 vCPU core, but does that mean it won't scale when I have 1000 executions per second?

Jere Vekka

You wrote "In this example", but I can't find any :/
Should there be an example or are you talking just in general?

Olivier Miossec

Hi Jere
Yes, I should rewrite this part to make it clearer.
Thanks for the comment.