Mike Pfeiffer for CloudSkills.io

Originally published at cloudskills.io

Serverless Automation with PowerShell and Azure Functions

In this episode I had a chance to catch up with Eamon O’Reilly, Principal Program Manager on the Azure Functions team at Microsoft Ignite 2019. Find out how you can take advantage of PowerShell in Azure Functions, including native support for PowerShell Core 6 as well as the Azure Az modules, and more, to build event-based solutions.

You can listen to the interview here:
https://cloudskills.fm/049

Resources from the show:

Full Transcript:

Mike Pfeiffer:
What's up everybody? Welcome back to another episode. Super excited, happy to be here. We’re at Microsoft Ignite 2019 and I’m sitting with Eamon O’Reilly from Microsoft. Eamon, welcome to the show.

Eamon O’Reilly:
Hey, perfect Mike. Hey, thanks for having me.

Mike Pfeiffer:
Yeah, it’s awesome. I’m really excited about this episode because of what you’re working on at Microsoft and for the folks listening. Maybe you could let them know who you are, what you do at Microsoft.

Eamon O’Reilly:
Yup. Hi everyone. My name is Eamon O’Reilly and I work on the Azure Functions team. I’m mostly focused on automation scenarios, kind of bringing those into Azure and enabling a lot of our IT pro and DevOps users to get familiar with doing automation and doing Serverless.

Mike Pfeiffer:
Awesome. I’ve been looking forward to meeting you and talking to you today because this is an area for me where … I’ve loved PowerShell since it came out, and I’ve had … I think for me PowerShell really amplified my career. I know a lot of people have a similar story and now that we can use it and it functions in Azure. I think it’s going to open up a lot of cool automation capabilities for people.

Eamon O’Reilly:
Yeah.

Mike Pfeiffer:
So there’s so many places we could start-

Eamon O’Reilly:
I know, in some ways you get excited when we start talking about it because it opens up a whole avenue: how do I bring the automation I used to do and make that available in the cloud? And then all this other automation that was just so difficult, or that I wouldn’t even do before, becomes a lot easier now when you can take that PowerShell knowledge and bring it into the cloud.

Mike Pfeiffer:
Yeah. So for the folks that are listening, we’ve got tons of ops folks, tons of developers, however, there’s a lot of people that haven’t started working with Azure Functions yet. So maybe we could explain the basics of that, and then we can start diving into how a DevOps engineer could use PowerShell and Azure Functions.

Eamon O’Reilly:
So, if you are not familiar with Azure Functions in general, the main idea behind it is that, we call it a Serverless platform. It means you don’t have to manage the servers, you write your PowerShell scripts and we host and run them for you. You don’t have to worry about where it’s running or any of that. The second thing is, you only pay for what you use. So when your script is running, there’s a small cost because we’re spinning up backend compute for you. That’s all you pay for. And then the third thing is, it will scale automatically. So say you have a lot of invocations of your script you need to run, we’ll make sure that we spin up enough compute in the backend so that you don’t have to worry about, “Am I overloading my machine? What happened to it, did it just fail?” We’ll automatically guarantee reliability by scaling out.

Eamon O’Reilly:
So those are kind of the three pillars of Functions in general. And so when you take those into your mind, it really allows you to say, “Okay, how do I take advantage of that platform with my automation?” And so things I would normally do on premises, like run a script, perhaps to shut down all my test machines at night, I might schedule that with the Task Scheduler and hook up a script. You can now bring that same script into the cloud, hook it up using our own scheduler on top of Functions, and then it will run at scale. You never have to worry, was the machine down when it ran? Did I miss something? It will just happen.

Mike Pfeiffer:
All right.

Eamon O’Reilly:
You’re like, 50 machines, if you shut down 100, 500, we will actually scale out and kind of called more implications of that script as necessary to make sure all those things shut down for you.

Mike Pfeiffer:
Right. And I think that this is probably something that people don’t even realize they’re going to be able to do and when they get into this, it’s going to be a light bulb moment. One of the things you mentioned was kind of the cost model, you’re just paying for it when it’s running. A lot of us, what we do to your point earlier was, schedule a thing, we’re always polling, we’ve got this system sitting there, sometimes it’s not actually doing stuff. And now we can really just take advantage of paying for the stuff when we’re using it and maybe executing those Functions based off of an event instead of you’re checking something all the time.

Eamon O’Reilly:
Yeah. That was one of the disadvantages; we never really had a good way before. So you wrote your scheduled task or you’d run your script, and then you kind of had to poll. If you needed to check something more often, you increased the frequency of your task and it was always running, and sometimes it’s putting a load on the external system that you are calling, and people are like, “Why am I getting called all the time?” So then we’d have these backoff scripts we’d have to write. And it got actually quite complex-

Mike Pfeiffer:
Totally.

Eamon O’Reilly:
… to do really well on premises with scripts. And so, one of the things when we move to the cloud is, not only do you move to the Serverless concept, but you start to take advantage of the event-based services that are in Azure. A good example of that is a service called Event Grid. The nice thing about Event Grid is, as you make any changes or do any deployments inside of any of the Azure services, Event Grid gets notified, and that Event Grid service, which is built into Azure, now has the ability to call your Functions when those events happen, and you can subscribe to those, so it’s kind of a publisher/subscriber model.

Eamon O’Reilly:
So let’s say someone deploys a new virtual machine or someone deploys that can use SQL server, you can now say, when that happens, call my Function, because I want to be able to go do something and maybe it’s add a tag and add the cost center of the SQL server so that I could actually charge that back to the right group. Or add a tag that says, “Shut this down on the weekends, because it’s not being used. It’s just a dead machine.” But now before, you’d have to try to insert yourself into the mix and try to work with the developers or run something separately, every hour, looking from new machines [inaudible 00:05:34]. You don’t have to worry about any with that when you kind of go to the event based model because the guarantees delivery, so even if it couldn’t call you … If it fails it’ll call you again.

Eamon O’Reilly:
And so it’s just one scenario event based where you can start to move a lot of the things you used to have to worry about on premises. You can kind of just say that the cloud take care of those things for me.

Mike Pfeiffer:
Yeah. And I think that that is a place where people are going to be able to come up with some really creative solutions. It’ll be interesting to see how people use that to innovate and just solve problems, because you can subscribe to almost any event, so the possibilities are almost endless, it seems like. But kind of taking a step back a little bit, the Azure Functions runtime just went through a big update. You guys made a big announcement this week about PowerShell going to general availability, and maybe you could talk about the runtime platform, why the versioning considerations are important, what version of PowerShell we got out there, all that kind of stuff.

Eamon O’Reilly:
Definitely you kind of do need to understand what you’re working against. So when we built … So how Functions works is, they have this concept of language workers and so the core platform kind of gives you the ability to run these language workers inside of the Function service. So we do all that work for you. It’s all open source as well, so you could run it on premises-

Mike Pfeiffer:
Nice.

Eamon O’Reilly:
So you can take this work, actually these language workers, and run them on premises. But what we did was, we worked with the PowerShell team to build basically a PowerShell language worker for Azure Functions. And so what that enabled us to do is, we took PowerShell Core, and then we took all of the event-based programming model that Functions has and made that work seamlessly with PowerShell. And so when you develop your scripts, when they’re going to run, they’re going to run on basically a language worker in our service, which is running PowerShell 6. So you have to know: when I’m running scripts, I can’t take a dependency on WMF 5 or 5.1.

Mike Pfeiffer:
Okay. That’s super important.

Eamon O’Reilly:
Yeah. You need to know that. So if you have dependencies, the one thing we’ve seen is, Azure AD still has that dependency because it has some other libraries that aren’t available in Core yet. And so that’s one of those where you have to be careful how you use it. There are ways to work around it, but they’re not as pretty. You kind of shell out to powershell.exe, because it is running on a Windows [container 00:07:42], and that way you can call those AD commands. But it’s kind of a workaround until we get the Azure AD module in, because by default it’s only going to run PowerShell 6.
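
A hedged sketch of that workaround: because the worker runs on a Windows host, you can shell out to Windows PowerShell 5.1 (powershell.exe) for a module that is not yet PowerShell Core compatible. Authentication is omitted here and the commands shown are only an example.

```powershell
# Call the AzureAD module out-of-process from the PowerShell Core worker.
# Connect-AzureAD (or equivalent auth) would need to run inside the child process.
$json = powershell.exe -NoProfile -NonInteractive -Command "
    Import-Module AzureAD;
    Get-AzureADUser -Top 5 | Select-Object DisplayName, UserPrincipalName | ConvertTo-Json
"
$users = $json | ConvertFrom-Json
```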

Mike Pfeiffer:
Got it. So-

Eamon O’Reilly:
That’s probably the biggest one. Beyond that, everything else, we’ve done these previews, everything’s been kind of working really well over the last four months. That’s the only one that we’ve been kind of working through.

Mike Pfeiffer:
I can see where we’re going to use it a lot. But I guess one of the things that kind of popped into my head there was around the versions and things like that. And if you’re starting this from scratch, you’re building a Function app, building your first PowerShell Function, should you be doing that in the portal, or should you be using developer tooling, Visual Studio Code? How should we start attacking this?

Eamon O’Reilly:
There’s a simple tutorial we have and so we kind of recommend VS code now. I think a lot of the IT pros, have been working over the last few years, we kind of all started in ISE. It’ll actually work really well. And the debugger was great, all the IntelliSense we’ve got, it was really nice. And so I did most of my development there as well. But then over the years we’ve kind of stopped the investments in the ISE, and we’ve basically put a lot of our partial investments into VS code or visual studio.

Eamon O’Reilly:
So now our recommendation is to start in VS Code, because you get all that rich debugging and IntelliSense, and the nice thing is, we actually have a complete emulation of the host: that language worker that we run in the cloud, we actually run it on your machine as well. So this allows you basically to start in VS Code, write the script like you normally would, you don’t have to worry that much about the differences, but you can actually set a breakpoint, hit F5 and step through your code inside the language worker that’s running locally. And then when you deploy your code into Azure, it’s going to be the same language worker that we spin up and run for you.
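
For anyone who wants to try that local loop, here is roughly what it looks like from a terminal, assuming the Azure Functions Core Tools and the VS Code PowerShell extension are installed. The project and function names are placeholders.

```powershell
# Scaffold a PowerShell Function app and run it with the local language worker.
func init MyAutomationApp --worker-runtime powershell
cd MyAutomationApp
func new --name ShutdownVMs --template "Timer trigger"
func start   # hosts the functions locally; F5 in VS Code attaches the debugger
```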

Mike Pfeiffer:
That’s awesome. So-

Eamon O’Reilly:
So we kind of say, do your work locally, because you’re probably going to be quicker from an iteration standpoint. But if you just want to write a small little script and you’re just trying to get something done quick, the portal is available for you.

Mike Pfeiffer:
Sure.

Eamon O’Reilly:
If you know how to fix it, then get in there and make some changes. You can do all your editing in the portal as well. But our general recommendation is to start in VS Code, because even if you don’t have source control and all the DevOps and you just push straight to the cloud, it opens up the path: when you want to check in, or you want to start to get there, it’s easier to go from VS Code and then have that pushed out to the cloud.

Mike Pfeiffer:
I think VS Code is just such an awesome idea, and I was really pumped when they were talking about Visual Studio Online and being able to run in a browser now. So, with that and Cloud Shell, it’s just an insanely good time to be working in Azure. It doesn’t matter if you’re Ops or Dev, or anywhere in between. And so my next question would be, let’s say that we’ve got a Function app and we’re now getting into building Functions, maybe we’re doing it in Visual Studio Code. Inside of our Functions, our PowerShell Functions, we obviously can invoke those Functions through an event subscription, like you mentioned, but there are other bindings you guys have in Azure Functions. There’s all kinds of ways to wire up invocations of these Functions, right?

Eamon O’Reilly:
Yeah. So one of the really cool things about Functions, we basically call it event based. So what we’re opening up with PowerShell support in Functions is really event-based automation. And the idea behind event-based automation is that we have these concepts in Functions called triggers and bindings, like you mentioned. What triggers allow you to do is automatically subscribe to an event that happens. So the typical one is a timer, like a scheduled task. And so that’s a trigger event that gets called.

Eamon O’Reilly:
The other one we support, people are probably familiar with: webhooks, HTTP APIs, that’s another trigger event. But we also have triggers into other systems like Event Grid, like I talked about, but you might have an Azure queue where you’re queuing messages and you want to trigger when those messages arrive in the queue, or you’re using Cosmos DB or table storage. All of these have built-in triggers, so you don’t have to write the code to say, when something happens, go notify me or pass me the data. We’ve basically built a whole set of libraries. The good thing about adopting Functions is that we gain all the benefits that were written for the developers, but now we can use them inside of the IT pro scenarios as well. And so every time they keep adding more of these triggers, we inherit them for PowerShell as well without doing any work.

Mike Pfeiffer:
That’s really cool.

Eamon O’Reilly:
That is kind of nice. That’s how you get triggered, your notification; and then the bindings are how do I pass data back into those systems, or how do I go read additional data from other systems? Bindings are the same concept, where we write the code to do the connection to those systems, retrieve the data or write the data, and you have to write zero code. It’s all config based. So you just say, “Oh, I need to talk to that storage account with this queue, and I want to listen to the messages of this type, maybe put some filter on them.” You just specify that in config, and then anytime a message arrives there, the Function will get called.
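
To show how little code that ends up being, here is a hedged sketch of a queue-triggered function with a queue output binding. The queue names are placeholders; "AzureWebJobsStorage" is the default storage connection setting.

```json
{
  "bindings": [
    { "name": "QueueItem",       "type": "queueTrigger", "direction": "in",
      "queueName": "incoming-alerts",  "connection": "AzureWebJobsStorage" },
    { "name": "outputQueueItem", "type": "queue",        "direction": "out",
      "queueName": "processed-alerts", "connection": "AzureWebJobsStorage" }
  ]
}
```

```powershell
# run.ps1: the incoming message arrives as a parameter named after the trigger binding,
# and writing to the output binding is a single call, with no storage SDK code.
param($QueueItem, $TriggerMetadata)

Write-Information "Received message: $($QueueItem | ConvertTo-Json -Compress)"
Push-OutputBinding -Name outputQueueItem -Value "processed: $QueueItem"
```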

Mike Pfeiffer:
It’s awesome.

Eamon O’Reilly:
[inaudible 00:12:22]. So those are what allow you to connect all these systems together without having to worry about maintaining the connection pool, or running out of sockets, or how do I do the authentication? You spend a lot of time there, where you don’t really want to spend the time. And so that’s one of the things.

Mike Pfeiffer:
That’s super fascinating because based on what you’re saying, essentially I could go in and take a trigger that’s predefined and then have an output binding and I’m writing very little code compared to what I used to be doing.

Eamon O’Reilly:
Yes, that’s the whole perspective.

Mike Pfeiffer:
You’re just really very focused. Right?

Eamon O’Reilly:
Yeah. That’s kind of that developer model we talked about, we say that Functions is kind of consists of two things. One is that, the Serverless definition I mentioned earlier around auto scale, no servers and pay for what you use. But the other is that we have a programming model, which is what we call event based programming model, which is very unique to Functions but PowerShell and automation, event-based automation is exactly that.

Eamon O’Reilly:
You really want to get notified when something happens, like an alert: your machine is running out of disk space, or the CPU is running high, or the memory, or something is queuing up in SQL, your indexes are slow. You want to get notified of these things, then take some remediation action, maybe go add more compute to it. Maybe you have to open up a ticket in ServiceNow or some other ticketing system. All of those are really events, and so the idea behind this is, if you can take advantage of these triggers and bindings to integrate your systems together, then you can start to bring a lot of your continuous operations into the cloud and enable not just continuous integration and continuous deployment, but continuous operations as well. And that’s really what closes the loop, and the cloud, with Azure Functions and our other services, makes that a lot easier than it was in the past.

Mike Pfeiffer:
It’s awesome. So if I’m a DevOps engineer or just a developer that wants to use Azure and Azure Functions and I want to start getting my automation scripts into that kind of DevOps model, it’s an easy way for me to connect a GitHub repo and kind of get my code loaded into the Function app and-

Eamon O’Reilly:
Yeah. So, it’s very easy. We have some built in tooling that basically if you’re using Azure DevOps, we have some built in tooling that will create the project into Azure DevOps, but also set up the build pipeline. And the published pipeline so that when you checking your source code, it can automatically have the default on how to build. Well, partially there’s much build but maybe you have to go get some modules there.

Mike Pfeiffer:
Sure.

Eamon O’Reilly:
And the other thing is an automatic release. If you integrate into Azure DevOps, it sounds like it’s a complex system, but really it isn’t for a lot of the basic stuff, and with most IT people I talk to now, once we walk through the benefits of it, they start adopting it as well, because it allows you the approval mechanisms and other kinds of integrations you want to do. So that sets up that whole continuous pipeline, from your box, where you’re doing all your development and testing, all the way to publishing into an Azure Function.

Mike Pfeiffer:
Right. I mean the possibilities here are really cool because Azure DevOps, to your point, can get very complex. But you’re just literally checking a script into a repo that goes through a basic build process, and then you have maybe a couple of stages in a release pipeline where it’s like, let’s go to the development subscription, and we have a release gate before production, and so forth.

Eamon O’Reilly:
Exactly. You just get the basic building blocks; sometimes you don’t have to add in all the unit tests and all that, just get the right steps in place and then start building in some additional validation as you go through. You don’t have to do it all at once. I kind of tell people, before, we didn’t do anything, so whatever you add is going to be better than what we had before. So start down that journey and you can keep adding more and more things to it.

Mike Pfeiffer:
My source control 10 years ago for my PowerShell scripts was a folder on a server somewhere.

Eamon O’Reilly:
That is all we did. We copied it to some share … We decided who had access to it. So, who’s got access to the server? How do I get my scripts? I can only copy them, I can’t edit them. That was our source control. But it’s so much easier when we start adopting these cloud techniques, where you don’t have to think about a lot of this stuff anymore. Especially if your organization is already onboarded to these services, and a lot of times they have been. For us, from an automation standpoint, developer or IT, it’s very easy to just take advantage of those things.

Mike Pfeiffer:
Yeah. And this is going to be an awesome way for people that already have PowerShell skills. Because the PowerShell community is so big and vibrant, and there are so many people out there that have skills with it, and now they’re going to be able to bring those skills into Azure Functions. Super exciting. And as we’re talking about building up Function apps and our projects and stuff, how should we start designing solutions? Should we have one function per kind of thing we’re trying to accomplish? Or do we put a bunch of functions in one app? Like, how is it done?

Eamon O’Reilly:
Yes. That’s a great question. I think that it does take a second to kind of, what am I trying to accomplish? The kind of the recommendation we tell all of our customers is create kind of a Function app per scenarios. And so if you have scenarios around basically remediation of alerts on my SQL servers and maybe you’ve got five different remediation tasks you do, depending on the kind of events that happen, then the way a Function app works is, you can have multiple functions inside of those. And so those are related and that benefit of making them kind of adding all of your related functions into a functionality deployed, is that they all work together. If you have common files or common modules you want to share, you can all reference them very easily and start using them and then also get deployed as a single entity. So when you start doing the deployment, you know they’re all in sync when these set of scenarios or Functions all work together.

Eamon O’Reilly:
And so there isn’t really a cost other than the kind of the management costs of creating other Function apps, but generally, create them personally, then I think you’re going to be in a lot better shape cause as you do your build and dev and test, you kind of want to think about if you’re building a microservice, you want to be able to build that test it and publish it without affecting a bunch of other ones. And so if you try to overload and put too many Functions into your Function app, then you kind of got up to test everything, make sure it all works and then you can open up other problems.

Mike Pfeiffer:
Sure. So, keep it simple. Makes a lot of sense. I think one of the things that I was confused about when I very first started working with Functions was, I was thinking, “Oh, the Function app is just the Function itself.” But that’s not the case, according to what you just said. It’s like the Function app is kind of like a container, and it has multiple functions within it.

Eamon O’Reilly:
It is. Yeah. It’s really built for kind of sharing, probably-

Mike Pfeiffer:
I guess I should be careful-

Eamon O’Reilly:
… inside of it.

Mike Pfeiffer:
… using container as terminology, not to confuse people. Not as in a Docker container-

Eamon O’Reilly:
Yeah, exactly.

Mike Pfeiffer:
… more of an umbrella of all these functions.

Eamon O’Reilly:
Yeah. It’s almost like the host we have in the back end that we spin up to run it. I just call it compute, because we do support containers and other things and Functions, but we take care of that responsibility for you. Then you don’t really care what the computers in the backend. That’s the goal of Serverless really, is just write your code, kind of understand your limits. I’ve tried PowerShell 6, can be doing other stuff. But once you understand those limits, then focus on the business value.

Mike Pfeiffer:
So if I’m a PowerShell person here and I’m thinking about this first question probably coming into my head is what do I get inside the Function runtime? You mentioned PowerShell 6, what other modules can I get access to and what if I maybe built my own or I want something from PowerShell gallery, how do I navigate that?

Eamon O’Reilly:
That’s a great question. So one of the things we’ve been working on over the last few months is a feature called manage dependencies. And so as you know, there’s no package manager in PowerShell. As you look at kind of other languages like Node or Python, they kind of have package manager techs and then package managers like Pip or npm that can go and parse those and do work for you. And so one of the things we’ve introduced with Azure Functions is the requirements of PST one-five. And so what this allows you to do now is you can specify which modules you want to have available inside of your Function app. And so you can list easy, and then you can basically go 2.*, or 2.2.5.6 and then you can go SQL server, you can go into [Sunline 00:20:12], you can basically go down the list of modules you need from the gallery, you just specify them and which version you want. And then we will at runtime go and get all those for you and make them available onsite that host so that all those modules are there and manage for you and you didn’t even have to package them up or do anything.

Eamon O’Reilly:
And the really nice thing about it, kind of a unique feature that only exists in Azure Functions, it doesn’t exist anywhere else, is that when I call it out like that, the 2.1.*, then what we do is, periodically, every time your Function app starts, or once a day, we will go and check if there’s a newer version matching that star, and then we will automatically bring it down, but bring it down in a smart way so that we don’t kill off your existing ones; we bring down the updated version and then slot it in for the next Function invocation afterwards. And what we recently did with that feature was, we’ve seen so many people writing scripts where that automation is touching other systems that are under some compliance, like they’re under PCI or some other policy. And so then the automation gets inherited into that system.

Eamon O’Reilly:
And the biggest problem we have is, was there any security or critical fix that now I’m exposed to, because I’ve put my automation into those systems? So what we said is, if you take advantage of this minor-version star, we guarantee you that as those security and critical fixes come out, we can take those, install them automatically, and you don’t have to worry about it.

Mike Pfeiffer:
Wow. That’s a game changer.

Eamon O’Reilly:
It’s just one of those things that you’re always kind of worried about. Like, am I in a good state or not? We can guarantee you’re in the right state, but you still have full control. So if you trust some vendor, like, “I know those guys aren’t going to break me hopefully between those, so I’m going to take those. But other ones I don’t want to, I’ll just specify the version.” Because that allows you then to kind of get all the flexibility. And then if you write your own modules or you want to just kind of package them up, you can literally just create a modules folder at the root of your app. You just save the module from the gallery or just copying your modules into that folder, and then when you package up that module’s directory with it, upload it and then map it in so it’s available in your PS module.

Mike Pfeiffer:
Makes sense. So it’s just part of your project then.

Eamon O’Reilly:
Exactly. You don’t have to like import it it’s just already imported because we added it to the module pack.

Mike Pfeiffer:
Got it.

Eamon O’Reilly:
So, those are kind of the two ways to get your modules into the system: either using our requirements.psd1 with managed dependencies, or just uploading them to the Modules folder.

Mike Pfeiffer:
Okay. And so as long as it’s a module that works with PowerShell core, we’re in good shape?

Eamon O’Reilly:
Exactly. And for most of them, a lot of the work has been done for compatibility. So a lot of the Windows modules that didn’t work are starting to work better. And then with PowerShell 7, there’s more and more module coverage coming out all the time. So I think if most people do a quick test, they’d be surprised how much actually works in PowerShell 6. We’ve just been doing our job working with the PowerShell guys over the last few years.

Mike Pfeiffer:
That’s really awesome. So, basically build a module manifest, reference the modules you want that are either publicly available or build your own modules folder, get it in there. And this is easier than I think most people probably realize [crosstalk 00:23:06].

Eamon O’Reilly:
Exactly. So, you can do it from VS Code and publish it, or even from the portal: you could just go in and edit that requirements.psd1, like, “Oh, I’m missing a module,” you get “command not found,” just go in and type in the module you want, hit save, we’ll go get it for you, you write your script and it’s right there. And then you can update it if you want to, you have that control.

Mike Pfeiffer:
Sure.

Eamon O’Reilly:
Because that’s kind of a new concept to a lot of the, I think PowerShell users that like other languages kind of had some better support around, but hopefully this makes it easier.

Mike Pfeiffer:
And I like the fact that you can kind of pin your solution to a specific set of versioned modules. So it’s like, at the time that I wrote this, I know it worked with this specific version, because I’m like, “Oh, I checked that into my repository, it’s always going to work the same way I expect it to.”

Eamon O’Reilly:
Yeah. Sometimes you’re really careful Like, I don’t know if this thing is ever updating on me, because like, I wanted this, lock this down and I’ll take care of the security updates and the republishing and all of that. Because it’s an active project. I’m always doing. So then yeah, you should pin it to those versions. But if there’s the certain ones you’re like, “I just want the minor ones.” And we only support the minor updates at the moment. You can’t do like, ., because just to kind of protect users but also ourselves, because usually major version updates are the ones that have the breaking changes.

Mike Pfeiffer:
Makes sense.

Eamon O’Reilly:
Most people will guarantee backward compat between minor versions, they usually don’t between major.

Mike Pfeiffer:
And it’s good though that people can pick their way of doing it. Right?

Eamon O’Reilly:
Yeah, exactly.

Mike Pfeiffer:
Because they’ve got options.

Eamon O’Reilly:
Yeah. There’s nothing forced on them. It’s like whatever kind of works for your environment.

Mike Pfeiffer:
Okay. So here’s a common question that people have been asking, they’re saying, “How do I pick between Azure automation and PowerShell and Azure Functions?” So it’s kinda like if you’re stepping back from it and looking at it, you might not realize the kind of nuances and the difference between the two.

Eamon O’Reilly:
Yeah. So, I used to work on the Azure Automation team. We shipped that, I think it was in early 2014, and when we shipped it, we shipped it as a general purpose automation system, and it’s there to do three things: one was to deploy your resources, one was to respond to events, and one was to integrate into other systems. And we had these major capabilities to offer, like graphical [authoring 00:25:14] and PowerShell runbook support and [long running 00:25:16] and hybrid. So that was kind of the automation system for everything in Azure.

Eamon O’Reilly:
Since 2014, that was our general purpose automation system. But in the last five years we’ve started to build these domain-specific automation systems, so you can think of ARM: now we recommend customers use Azure Resource Manager for doing deployments into Azure, whereas before they might’ve just tried to script it up. Similarly with policy: before, people would apply their policy rules using automation, whereas now we recommend using Azure Policy to set that, and Blueprints if you want to do roll-outs across your whole subscription. Logic Apps is what we recommend now to integrate into systems, because it has over 300 connectors and it’s just automatic. And then Functions for event-based automation.

Eamon O’Reilly:
So we’ve kind of built these verticals that the idea behind it was you have to write less code, and we will take more of the responsibility because we’re now domain-specific scenarios. You don’t have to kind of code up everything in PowerShell like you used to. And so the role Function plays is that event based automation part. So anytime you want to respond to an event, we wouldn’t encourage you to use Functions to deploy all your resources in Azure. That’s not really what it’s designed for. Even try to integrate into third party systems like service now or some other ones. I generally would kind of recommend you use large gaps or you’re to enforce policy with Functions because you can use Azure policy, and maybe something like that for things that it can’t do but add value there.

Eamon O’Reilly:
So what I tell customers now is, first look at the domain-specific automation services we have and see if those can meet your needs. And if they don’t meet your needs in those areas, deployment, respond, integrate, orchestration, all those different services we have, then you can always use Azure Automation for the general purpose work, that’s available. But I think in most scenarios you’ll be able to do a lot of your work now using one of these services.

Mike Pfeiffer:
Right. That’s exciting. I can’t wait to get in there and kind of mess around with it. One of the things that we see people using a lot in Azure automation over the past couple of years was the startup shutdown scenario, right?

Eamon O’Reilly:
Yes.

Mike Pfeiffer:
It’s to shut down your servers at the end of the day, bring it back up at 8:00 AM in the morning type of thing. And if you poke around some of the places that you guys have solutions, like serverlesslibrary.net, you start looking at the Functions that are built in PowerShell, you see somebody already worked on that for same the stereo, right?

Eamon O’Reilly:
Exactly. Yeah. And those are the same, because that’s kind of like an event. Those are our timer triggers, is what we call them, which are scheduled tasks. So at eight o’clock, go see which machines you should shut down and shut them down. That was one of the most common scenarios we had in Azure Automation, but it’s the same type of scenario you would do in Functions, because it’s an event-based scenario.

Mike Pfeiffer:
Yeah, because the schedule is your event.

Eamon O’Reilly:
Exactly. The schedule is the event, and when seven o’clock hits, go shut those things down. The nice thing about it is, because we’ve hooked into it, we have native integration with Event Grid. So when you deploy your virtual machines now, you can automatically get a notification from Event Grid and then you can go and add a tag that basically says, “I’m going to auto-shutdown this machine.” And I even have a scenario where you could then put in the start and stop times. What that allows you to do is apply your general rules onto the virtual machine, but then the owner, like the dev comes in or some application owner, they can modify those start and stop times, so they only get affected when they’re actually there. And so there’s kind of advanced functionality you can-

Mike Pfeiffer:
That’s sweet.

Eamon O’Reilly:
… add on top of the straightforward shutdown.

Mike Pfeiffer:
There’s so many possibilities because it’s almost like the creativity is the only limitation in a way because there’s so many kind of different scenarios where you could bake this in.

Eamon O’Reilly:
Yeah. We spend so much time on the operations side doing this type of work: trying to ensure that our costs are well maintained across our infrastructure, or that we can respond to alerts quickly, or that we can integrate into other systems because we need approvals. All this kind of automation is being handwritten today in PowerShell and other scripting languages. But I think as you start to move into the cloud, it isn’t just pick up all my PowerShell and drop it in; it’s like, “Are there other services in Azure that can make my life a lot easier so I have less code to maintain longer term?” And that’s where Event Grid comes in, Azure Monitor comes in, Azure Policy comes in, some of these other native services for deployment; try to take advantage of those when you move over. Those PowerShell skills will still move over for everything else you’re trying to accomplish, but it isn’t just grab everything and drop it.

Mike Pfeiffer:
Sure.

Eamon O’Reilly:
You can, but that’s not really what we would recommend.

Mike Pfeiffer:
Sure. That makes a lot of sense. And going down that road a little bit further, you mentioned serverlesslibrary.net because they’ve got a lot of boilerplate code to get started. If I’m super brand new to PowerShell Functions, are there any places like that, other than serverlesslibrary.net, with some sample projects to get started with?

Eamon O’Reilly:
Yeah. That’s our recommendation right now. So we put a couple of simple tutorial ones to kind of get started in the docs. But what we’ve been doing is, encouraging people to push to the serverlesslibrary.net, and the reason for that is, we have integration in the portal so that when you go and try to create a PowerShell Function, there’s like a Serverless library icon.

Mike Pfeiffer:
Okay.

Eamon O’Reilly:
If you click on that from the portal experience, it brings you into the library.

Mike Pfeiffer:
Nice.

Eamon O’Reilly:
So it just kind of encourages you, because there’s nothing worse than a blank screen when you’re doing automation; that is the death of it, you should already [inaudible 00:30:43]. So what we’ve tried to do is push a few samples built by the engineering team up there, but then we’ve already seen the community; the PowerShell community is so good. The MVPs we have just … people are really in love with the capabilities it has, and so they’ve been pushing more and more samples up there. And so what I’m encouraging most people to do, now that we went generally available this week, is to try to push more and more samples up there. Not just because it helps the community, but it helps the overall service as well, so that more and more capabilities get pushed in there.

Mike Pfeiffer:
And I think that’s an important message for the folks listening because there are so many people listening to the show that are PowerShell focused. And I think that you guys out there listening, you may not even realize serverlesslibrary.net exists and now this is your calling to get out there and fill that thing up with awesome automation samples.

Eamon O’Reilly:
Yeah, we’re asking. So all you have to do is … It’s not too difficult, you basically have to have a GitHub account because that’s where you’re going to host your source code. So you create a GitHub account, create a GitHub repository. You just put in your code there and then you put a request pointing to that code and then we do a quick review, make sure it’s in good shape and then we add it into the Serverless library and it shows up along with everybody else. And then people can download it, upload it. And if they have issues they’ll go to your repo and they can always open issues if they need to. But since the process is like probably 10 or 15 minutes, you can kind of have the basic stuff done.

Mike Pfeiffer:
That’s a really cool scenario for becoming an open source contributor, leveraging your existing PowerShell skills and just getting in there and contributing.

Eamon O’Reilly:
It’s amazing. Even myself, because a lot of times we didn’t really have a place to put all this knowledge. We had tons of scripts doing all these things internally. I’d write lots of samples with customers, we’d go onsite and work on stuff and it was down that knowledge was basically lost. And then we were like, “Oh okay.” “Hold on, I have it on some folder.” Try to bring it back. I’ve been doing, I think a lot of our MVPs, and I think I encourage anyone out there is to try to look at GitHub now as a place where you can kind of share that knowledge and one thing it does, it benefits you because you kind of get out there and then people take advantage of your work, but then people also give you ideas. They kind of give you other things to think about, but you also can take advantage of other people’s samples. And so that’s the one thing we’re trying.

Eamon O’Reilly:
PowerShell is a very good community. We didn’t always have a good way to share the knowledge; we do have PowerShellGallery.com, but I think GitHub opens up another set of scenarios there.

Mike Pfeiffer:
I think it does, and I think the lights really start to come on when you start experimenting and getting into this kind of social coding thing that is taking off, right? And that’s where you start getting inspiration and new ideas sometimes, where … “Oh, I just tried that, now I’ve seen how I can solve that problem. Now I’ve got an idea how to solve this other one.”

Eamon O’Reilly:
Yeah. That’s how you’d be successful really, it’s like trying to reinvent the wheel over and over just isn’t going to work. So it’s the community aspect. That’s where we’ve seen huge gains in just the knowledge on what kind of scenarios you’re trying to do, How do we need to make the function service better? What works, what doesn’t work. That kind of interaction between the community and engineering team is really what drives us going forward. We saw it even with PowerShell when they open source, they got that great community interaction. Function is also open source from the beginning and so we saw the same type of thing. Mostly developers coming in and saying, “This doesn’t work well, how can you add all these bindings? And we’re missing the trigger for this.” And that going back and forth, people adding their own code in there. It’s the same idea now with PowerShell in the automation. This is an opportunity, I think for a lot of IT pros also get in to that space and your own career I think will benefit from it, but your company would benefit too because now everyone is sharing this knowledge and it’s not necessarily contained in a thousand different companies with the same stuff over and over again.

Mike Pfeiffer:
Yeah. I think we’ve all seen that now, that it really does make a big difference. Microsoft is I think the biggest open source contributor on our GitHub now.

Eamon O’Reilly:
Yeah, exactly.

Mike Pfeiffer:
It really has changed things, and we can see the maturity of Azure partly as a result of that. It’s really exciting. So, getting into the depths of this, maybe a little bit more of an advanced concept: the PowerShell folks listening are probably wondering how they can start consuming event data and what that actually means. Like, if an event does take place and I’m subscribed with a function, how do I know how to access it? Does it come in as a parameter? Do I have to parse JSON? What do we do there to handle those events?

Eamon O’Reilly:
Generally, events will come in as a parameter called [inaudible 00:35:04]. And typically it’s just a serialized object that has all of the properties. So, say you got an alert from Azure Monitor: it’s actually going to call us and send us the payload of the alert, basically what resource is this, this VM, this ID, here is the memory, here’s the activity it was doing. All of that payload basically gets serialized as JSON and sent to us. And because we see this as JSON, we actually convert it back so that you have an object inside of PowerShell that you can just start dotting properties into. And so the first thing I usually do when I’m integrating these systems is just dump out the parameter so that I can see it. And then once I see-
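
That "dump it out first" step can be as small as this sketch; the parameter name has to match whatever the trigger binding is called in function.json, so the name here is a placeholder.

```powershell
# run.ps1: first pass that only logs the incoming payload so you can inspect its schema.
param($InputEvent, $TriggerMetadata)

# -Depth matters because alert and Event Grid payloads nest several levels deep.
Write-Information ($InputEvent | ConvertTo-Json -Depth 10)
```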

Mike Pfeiffer:
You see the schema of an event.

Eamon O’Reilly:
Exactly. See what the schema looks like, then I know what I’m dealing with, and then typically I’ll just pick it up, copy it, and then I won’t keep doing it; I’ll use that as my test data. So I’ll go into VS Code and I’ll just create that value, and then I can pass that into my own function for iteration. I can tweak the values, change it, and I have something I can work with quickly.

Mike Pfeiffer:
So the good news is … you already understand objects as a PowerShell scripter, and when an event sends a payload over to you, you’re going to get that as an object, and you’re going to get to look at all the properties. To your point, you can dump all of it the first time and save it off so you understand what you’re supposed to be working against.

Eamon O’Reilly:
That’s the first thing I do. Because once you understand the schema and then you copy it out of there to stick it into VS Code, put it into a variable, then you can basically just go invoke rest methods, call that Function locally and then pass in that body. And it’s exactly what you’re going to get from Event Grid or from Azure alerts or from another third party API that’s calling you, because they all do the same thing. They basically package everything they have and give it to us instead of that payload. So if it was in JSON it’s going to commit to something else, so as long as you just dump it out, you know what you’re working with.

Mike Pfeiffer:
And that’s so powerful because like you mentioned monitoring, which there’s a lot of cool hooks with Azure monitor, but you can send that over and you can see what time it was, maybe what VM was impacted or whatever resource, all that information, and then you can use it as conditional checks to build the logic for your function. Right?

Eamon O’Reilly:
Exactly. What am I supposed to do with it? Say it’s a CPU I just got the alert on. How bad is it? Do I need to cool it down and see if it has recovered at all, or is it still pegged? Do I need to scale it up? Do I need to add more resources? Should I go add a ticket into my ServiceNow or some other ticketing system? All that logic that you do all the time, manually maybe, you can start to codify that business logic into that event. So now, once you’ve defined what that process is, it will happen over and over and over again.

Mike Pfeiffer:
Yeah. One of our customers has been doing a lot of experimenting around this concept and doing a lot of different things. It’s just been interesting, but they’ve been using it kind of as a way to police some of the operational habits of their team. And I know you guys are working on a lot of stuff with Azure Policy, where you may not have to wire up your own logic, but in this case these guys wanted to do some things when resource groups get created, so they can inspect what happened, and if it wasn’t done right they could go off and … deprovision stuff. It’s just a really, really cool time.

Eamon O’Reilly:
It is because you’re starting to … Like I always kind of say, you’re starting to transfer your knowledge into the system. Before we always heard that developers transferring all their knowledge into the application and that was what you’re trying to get and that IT was on the side trying to stitch it all together and figure out what’s going on.

Eamon O’Reilly:
We have kind of a common platform now, we hope, where we can start to put the knowledge that we have from the automation and IT pro side into the system as well. So now we have those two things working together, and that’s one of the reasons we brought the automation into the Functions experience: now we have the developers and the operations folks on the same platform. And so it’s now easy to share all the benefits that the developers get with new triggers and bindings; we get them on the automation side too. And as we do investments like managed dependencies, which we did for the automation side, we can make those available to developers in other languages as well, so we start to benefit from unifying the work we do. And similarly with Azure DevOps or some of these other systems, we’re all working in the same system. So as you start to do Azure DevOps together, it’s nice when the systems are also in sync.

Mike Pfeiffer:
Yeah. One repo, right? With-

Eamon O’Reilly:
One repo.

Mike Pfeiffer:
… the infrastructure code, app code.

Eamon O’Reilly:
Yeah. And then you do troubleshooting later. For those same troubleshooting steps, we integrate with Application Insights in Functions, and with PowerShell we map Write-Information, Write-Warning, Write-Error into the right fields in Application Insights. When you go in there, they look very natural to you as a PowerShell user, but you gain all the power of Application Insights for troubleshooting, diagnostics, exceptions.
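
In practice that just means using the standard streams you already know; for example (the $vm variable here is illustrative):

```powershell
# These calls surface in Application Insights as traces of the matching severity.
Write-Information "Starting remediation for $($vm.Name)"
Write-Warning     "CPU still above threshold after first attempt"
Write-Error       "Remediation failed for $($vm.Name)" -ErrorAction Continue
```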

Mike Pfeiffer:
That’s actually really fascinating. And I was kind of doing an oversight on my part, I wasn’t even thinking about output from functions. And so I mean if I’m debugging in the console, clearly I can just watch on the kind of live streams-

Eamon O’Reilly:
Console logs, yeah.

Mike Pfeiffer:
Console logs. But if I missed it because it was at night, I can get into Application Insights and see it there.

Eamon O’Reilly:
All of the data is stored in there. And the nice thing is we also give you host-level information, so that host we spin up, you can also see if it’s under memory contention or CPU pressure, because maybe that’s what’s affecting my runtime. Plus all of the logs you generate also get stored into the same system, and then you get all those nice visuals so you can see how things are integrated together inside. I know people may not be familiar with Application Insights, but they did a really good job at helping you troubleshoot and figure out what’s going on. And all of that power we just absorb as well into our automation scripts.

Mike Pfeiffer:
That’s really fascinating. Application insights is insanely sexy if anyone hasn’t looked at it. It’s honestly more of a developer topic.

Eamon O’Reilly:
It was, it was kind of developed for developers, but again, the idea is, why do we have all these benefits over there that we can’t take advantage of on the automation side? And we can. All we have to do is some work on the platform side to integrate these seamlessly together, and then I think it almost jump-starts us: instead of having to rebuild all these capabilities again, we actually have like 80% of it, because devs are basically doing 80% of the stuff we often do. We just need to add the 20% that’s very automation-specific in there as well.

Mike Pfeiffer:
And so a big takeaway there is make sure I’m also building an application insights resource as I’m spinning up a function app, which is the default anyway-

Eamon O’Reilly:
It does default to that, but yeah, that’s really powerful. I would advise that for almost anybody.

Mike Pfeiffer:
Okay. Any other kind of best practices shooting from the hip that we should think about as we get into this as developers of PowerShell Functions in Azure Functions?

Eamon O’Reilly:
Yeah. The one thing to call out is Functions … there are a couple of plans. Generally we say functions should be short-lived, do a specific task and then get out of there. So generally, we don’t recommend long-running functions. We have two tiers. We have this consumption tier in Functions, where you can only run for 10 minutes. So if you’re trying to do a task that takes more than 10 minutes, it’s going to fail on you and then you’re getting [crosstalk 00:41:56].

Mike Pfeiffer:
That’s good to know.

Eamon O’Reilly:
Basically can’t do 10 minutes. But we have this plan called premium, that you can run basically off for an hour guaranteed. And so there are scenarios where you might want to run it for a longer period of time, then you can kind of move to our premium plan. But you gain all of the scaling benefits, but there’s an upfront cost you have to pay for that initial instance. So the best practices is, keep things short lived. And the second thing is when you’re trying to keep them single-purpose, so it’s easier to troubleshoot. That’s always the best practice. And if you try to do a lot of different tasks, I’d look at moving to premium. Because even if you just picked a premium plan, you can still deploy lots of Function apps on it. So it’s not like one function app per premium plan. If you pick a premium plan-

Mike Pfeiffer:
Got it.

Eamon O’Reilly:
… what it gives you is a bigger instance to work on. We give you a very small instance in consumption, and so sometimes you’re hitting memory or socket limits and you’re like, “Oh, what’s going on?” Typically it’s that you’re trying to do too much inside of that PowerShell script you wrote. If that starts to happen to you, it’s sometimes easier just to say, “I’m going to move to a Premium plan.” You’ll pay maybe 130, 140 dollars a month as a starting point, and that’s all you need, and you scale up if you need to, but you can put a lot of your automation onto that one instance. So when you do the math, it’s not really going to cost you much, and you save yourself the troubleshooting and more.

Mike Pfeiffer:
And especially if you’re a big organization, huge IT organization, right? It makes total sense.

Eamon O’Reilly:
It’s totally coming around for some of these other things that you’re paying for like one VM running, it’s going to cost you more than that typically, over the month.

Mike Pfeiffer:
So I might’ve missed it when you mentioned it. So it’s 10 minute limitation, but if you go premium, what is the cap there on the execution time?

Eamon O’Reilly:
You get an hour.

Mike Pfeiffer:
An hour?

Eamon O’Reilly:
An hour guaranteed. It can go longer, we don’t try to kill it, but we just guarantee it for an hour-

Mike Pfeiffer:
Got it.

Eamon O’Reilly:
… that you can stay with.

Mike Pfeiffer:
And so when application developers are talking about Serverless and Functions, the conversation of cold start comes up a lot. And typically what that is, well, I’ll let you explain it for people that don’t know what it is. What is it and how do we navigate it?

Eamon O’Reilly:
So that’s a great question. It comes up every single time we talk about Serverless and the challenges when you talked about initially like what do we offer with Serverless? We had this automatic scare, you don’t have to pay for the servers, you only pay for what you use. The key one there is you only pay for what you use. And so what that means is, there’s no computer available for you until you need it. And so because we don’t want to pay for it either. And so what that happens is, if you haven’t called this Function, this script in a while, the first time it gets called, it’s like, “Oh, you just called us.” So we need to go find some compute inside of our backend service, give it to you so you can run. So this is what that call start is and it’s common across all Serverless platforms as there’s always that first delay.

Eamon O’Reilly:
And then we keep it around for a while, so as you call it more and more, the cold start goes away. So it’s only that initial time where you’ll see it.

Mike Pfeiffer:
And that’s an important note you made there. That’s not just a Microsoft Azure Functions that everybody, every platform.

Eamon O’Reilly:
Yeah. Because it’s pay for use, we’re not going to keep it around either. Everybody has this problem. So how we tackled it, obviously we’ve done a lot of interesting engineering in the background to try to keep things available for you, like pulling so that it’s a lot faster to allocate that compute. But in the premium plan, the reason it costs you that kind of upfront cost is we guarantee you one instance all the time. So we’re guaranteeing you compute that we’re going to make available to you. So you’re call start is zero, and what it means with a call start is, say you’re using like a third party approval system and you call a Function to go say like stop my machine.

Eamon O’Reilly:
If we cold start, we’d have to wait a couple of seconds or more for something to come up, go get it and then stop it. In Premium that instance is already there. So when you make that call, it’s going to run in exactly the millisecond it gets called; it’s just going to take the command, run the script and execute.

Mike Pfeiffer:
Got it.

Eamon O’Reilly:
And so that’s the idea of a call start and how we’ve kind of solved it with our premium plan and going forth.

Mike Pfeiffer:
That’s awesome. So, basically if I want the most capabilities, no call starts, longest execution time, I should go premium?

Eamon O’Reilly:
It is. In some ways I would recommend it, because I’ve just seen myself where a lot of these automation scripts tend to go a little longer, and you don’t have to stress about it all the time: am I hitting my 10 minutes? Did I just break something? And I think for a lot of people, you can start, if you’re playing around with POCs and learning, why not use consumption? It’s almost going to feel like it’s free with what you get on consumption. But then once you start thinking about production, when you’re going to have lots of this automation, I think I would upgrade into the Premium plan, and not only do you get those features, but you also get hybrid connectivity. So now your script that’s running can actually reach back into your local data center and perform actions on premises. We see some customers like, “Oh, I need to add a new user and I only have directory sync from my local AD up to Azure AD, so I need to make my change back in my local AD.”

Mike Pfeiffer:
Got you.

Eamon O’Reilly:
They can reach back to hybrid workers. And basically, we have this hybrid connections feature in Azure Functions; it uses Service Bus in the backend, but you can make standard [inaudible 00:47:03] calls and it’s all outbound-first from your machine, and it will take those commands, run them, and then send the results back up again. So, it’s not just managing your Azure resources, you can actually manage all of your on-premises resources in the same way.

Mike Pfeiffer:
That’s fascinating stuff. So I think one of the last questions I’ve got for you here, I can keep going for a couple hours-

Eamon O’Reilly:
I know. There’s just so much. I think we just touched on a lot of [crosstalk 00:47:22]-

Mike Pfeiffer:
I know. It’s awesome.

Eamon O’Reilly:
… but it’s good.

Mike Pfeiffer:
It is good. If I’m thinking about this and I’m usually like, when I run a script, I’ll put a CSV and I put it somewhere or I want to dump some data in a folder on my file system. How do I navigate that upon my PowerShell in the Serverless world? The PowerShell script in a Serverless world?

Eamon O’Reilly:
There’s kind of two ways. One is, if you’re familiar with Azure storage, there’s a feature called Azure files and basically what it does, it’s almost like a niche you would use locally. And so what it does is, we mount that Azure files from your storage account, onto the Azure Functions and so that storage account is your is your storage account, so it’s persisted. So you could put some stuff there if you needed it to be around temporarily or you can just create your own storage account and just upload the files there, like mapping the drive and then once you map the drive and you can just copy it. So you just do a normal copy item into that folder just like you would locally-

Mike Pfeiffer:
Got you.

Eamon O’Reilly:
… instead of copying into some internal share, you copy it into an Azure file share, which is just backed by Azure Storage.
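
Instead of mapping a drive, the same persistence can be done with the Az.Storage cmdlets; here is a hedged sketch where the storage account, key variable, and share name are placeholders.

```powershell
# Export a report locally, then push it into an Azure Files share so it survives the host.
$report   = Get-AzVM | Select-Object Name, ResourceGroupName, Location
$tempPath = Join-Path ([IO.Path]::GetTempPath()) 'vm-report.csv'
$report | Export-Csv -Path $tempPath -NoTypeInformation

$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey $env:STORAGE_KEY
Set-AzStorageFileContent -ShareName 'reports' -Source $tempPath -Path 'vm-report.csv' -Context $ctx -Force
```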

Mike Pfeiffer:
So I guess it is an important note then, when we’re developing Azure Functions in the Serverless world, they’re [stateless 00:48:34] completely. We’ve got to use things like Azure Storage or managed SQL, or whatever.

Eamon O’Reilly:
You have to assume they’re going to go away. They will go away. So never depend on temp stuff; if you put something in temp, make sure you only need it for that period of time, because it’s going to disappear. If you need to persist anything, leverage Azure Storage with file shares, because once you put it there, it’s yours, and all of the Functions that start up can also get access to it.

Mike Pfeiffer:
Awesome. Well, I think this is going to be a game changer for a lot of people. As we wrap up this episode, any last resources you want to point anybody to, or things coming down the road?

Eamon O’Reilly:
Yeah. There’s lots of stuff going on, to get started, I think the best place is the Azure docs. There is some good information, there’s kind of two of this tutorial up there and then we have that we we’re calling the reference article and there’s all really good information around concurrency, how to deal with concurrency inside of there, how to deal with deployments, integration with DevOps, VS Code. So the docs are really good place just to get started and kind of learn all the capabilities we have, but then I would would recommend Serverless library, but there’s some good samples showing up there to get started. You can build from those and then contribute hopefully up there as well. And then just kind of follow us on Twitter. There’s a lot of activity happening. And don’t just zero in on just the PowerShell part, there’s a lot of going out in Functions in general that you will benefit from as we add more and more capabilities.

Mike Pfeiffer:
Awesome. We’ll link all that up for everybody listening in the show notes and there you have it everybody. Serverless is not just for developers, it’s for everybody.

Eamon O’Reilly:
Yeah. We’ve kind of opened it up that’s dev plus ops-

Mike Pfeiffer:
Awesome.

Eamon O’Reilly:
… added together.

Mike Pfeiffer:
Well, Eamon O’Reilly, really appreciate you being here and I will see everybody on the next episode. Maybe we’ll have you back another time.

Eamon O’Reilly:
Yeah. It will be awesome-

Mike Pfeiffer:
Thanks so much.

Eamon O’Reilly:
… I really enjoyed it.
