Romanus Onyekwere

Introduction to Microsoft Security Copilot

Chapter 1: What Is Microsoft Copilot for Security?
Chapter 2: The Workflow of Microsoft Copilot for Security
Chapter 3: Get Started with Microsoft Copilot for Security
Chapter 4: Create Effective Prompts
Chapter 5: Handle Incidents with Microsoft Copilot for Security
Chapter 6: Use Microsoft Copilot for Security in Microsoft Defender
Chapter 7: Analyze Vulnerabilities with Microsoft Copilot for Security
Chapter 8: Analyze Suspicious Code with Microsoft Copilot for Security
Chapter 9: Hunt Threats with Microsoft Copilot for Security
Chapter 10: Create Your Promptbooks
Chapter 11: Query Uploaded Files

Chapter 1: What Is Microsoft Copilot for Security?
It is a virtual assistant powered by AI and designed for security analysis. Why do we need a Copilot for Security? Because it addresses some common challenges in cybersecurity. For example, we have too many security threats using all kinds of hacking tactics and techniques, too many alerts that overwhelm our security teams, and too many manual tasks to analyze, mitigate, and report security incidents. On the other side, we have too few skills covering multiple knowledge domains, too few analysts experienced in security operations, and too few hours for incident response.

So is it possible for us to manage security at machine speed and scale? A potential answer is artificial intelligence, or AI. Specifically, the rise of generative AI is transforming how we work and learn today. Generative AI became popular after OpenAI launched its ChatGPT application in late 2022. Powered by its large language model, ChatGPT can answer questions covering many domains, summarize documents, write articles and reports, analyze and generate code, and provide advice and recommendations. Partnering with OpenAI, Microsoft introduced Copilot, an AI-powered virtual companion that improves work productivity.

Building on the Copilot architecture, Microsoft announced a series of products, such as Copilot for Windows, Microsoft 365, Dynamics 365, Power Platform, GitHub, and Copilot for Security, which you will learn about in this course. There are two ways to use Microsoft Copilot for Security. You can directly access its portal at securitycopilot.microsoft.com and enter your prompts, just like you would with ChatGPT; this is called the standalone experience. Or you can use Copilot within Microsoft security solutions like Microsoft Defender, Intune, Entra, and Purview; this is called the embedded experience. What can you do with Microsoft Copilot for Security?

Here are some common use cases: summarize security incidents, investigate and respond to incidents, write security reports, analyze vulnerability impact, analyze suspicious code, and generate scripts for threat hunting. To do that, Copilot for Security can work with various security solutions, such as XDR (extended detection and response), SIEM (security information and event management), IAM (identity and access management), cloud security posture management, data protection, endpoint management, and threat intelligence.

Chapter 2: The Workflow of Microsoft Copilot for Security
Let's take one step deeper and look at the workflow of Microsoft Copilot for Security. Copilot for Security works as an orchestrator that interacts with three parties: user interfaces, including the Copilot for Security portal and Microsoft security solutions with Copilot embedded, like Microsoft Defender and Intune; plugins, for integrating with Microsoft security products like Defender and Sentinel and with third-party products like ServiceNow and Splunk; and AI services, including large language models like GPT, responsible AI checks on input prompts and output responses, and the underlying Azure OpenAI service.

Here's how it works. First, a user prompt is sent to Copilot for Security. Next, Copilot for Security selects the right plugins to pre-process the prompt so it can retrieve specific context. For example, it can call Microsoft Defender Threat Intelligence to get information about a vulnerability based on the CVE ID in the user prompt. This process is called grounding, and it helps the AI generate more relevant and actionable answers. Then the modified prompt is sent to the large language model, which generates results. Once the responsible AI check is completed, the LLM response is sent back to Copilot for Security.

Then Copilot for Security chooses plugins for post-processing, or grounding of the output. Finally, Copilot for Security sends the response, plus app commands if applicable, back to the requester. One thing I want to point out: the whole workflow is governed by the Microsoft Security Trust Boundary. In other words, your data is your data. It always remains within your company's boundary, and it is not used to train the foundation AI models.
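
In short, the round trip looks roughly like this:

```text
User prompt
  → Copilot for Security (orchestrator)
  → plugin pre-processing / grounding (e.g., a Microsoft Defender Threat Intelligence lookup)
  → large language model (Azure OpenAI service)
  → responsible AI check on the response
  → plugin post-processing (grounding for output)
  → response (+ app commands, if applicable) back to the requester
```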

Chapter 3: Get Started with Microsoft Copilot for Security
Let's get started with Microsoft Copilot for Security. Open a browser, then enter the URL securitycopilot.microsoft.com. I will sign in with my Microsoft work account. On the Copilot for Security portal, the top area shows your session history. If this is your first time on Copilot for Security, you will see some learning information. In the middle area, you can explore some promptbooks. A promptbook is a collection of prompts working together to complete a security task; for example, the vulnerability impact assessment promptbook has four prompts. The prompt bar is where you interact with Copilot. You can ask anything about security, or click the prompts icon for prompt suggestions. Besides promptbooks, I can also leverage system capabilities. Click "See all system capabilities." These are prebuilt prompts based on the plugins I have. I can search for a specific promptbook or system capability; for example, searching for "script" will find the matching ones for me.

Now let's enter a prompt. For example, to get an overview of recent cyber threats, I can enter, "Summarize cyber threats within the last seven days," then click submit. We get the response from Copilot for Security, and it provides some reference links. To see how Copilot worked on my request, I can expand the steps completed. First, based on my prompt, it chose the plugin Microsoft Defender Threat Intelligence. Then it looked up threat intelligence information and sent it to the large language model for processing. Finally, it prepared the response, ran safety checks, and composed the output message. I can rate Copilot's response by selecting "it looks right," "needs improvement," or "inappropriate." I can export the response by saving it to a Word document, sending an email, or copying it to the clipboard. I can also pin this response to the pin board so I can revisit it later. In addition, I can edit, resubmit, and delete this prompt.

Copilot for Security keeps a history of my sessions. I can edit the session name at the top; for example, change it to "Summarize recent cyber threats," then click save. Click Microsoft Copilot for Security at the top left to go back to the homepage, where I can see my recent session. Click the menu icon at the top left to view all my sessions, access the promptbook library, and adjust settings like preferences, user permissions, data, and privacy.

Go back to the homepage and click the sources icon in the prompt bar. Here, we can manage plugins. I can set up Copilot's access to Microsoft security products like Defender, Entra, Intune, and Sentinel, third-party products like ServiceNow and Splunk, and public websites, and I can add custom plugins. As we learned earlier, Microsoft Copilot for Security uses these plugins to pre-process your prompt and post-process the AI's response. I can also upload files, like my security policies, to add my organizational knowledge to Copilot and help it generate more relevant responses. Click the question icon at the bottom right of the homepage to view documentation, or click help to contact support.

Now you should know the general operations of Microsoft Copilot for Security, but this portal is not the only place to access it. We can also use the embedded experience in some Microsoft security products, like Microsoft Defender, which I will show you later in this course.

Chapter 4: Create Effective Prompts
Microsoft Copilot for Security is powered by generative AI. To help it generate better responses, we need to create effective prompts. What is a prompt? Basically, it's an instruction you send to generative AI. Unlike with programming languages such as Java or Python, we use natural language to create a prompt. For example, you can enter, "Suggest a seven-day California travel plan." Then, based on your prompt, a generative AI application like ChatGPT will generate the response.


Microsoft Copilot for Security is a generative AI application specializing in security, so your prompts should be security related. If you ask for a recommendation on your next vacation, Copilot for Security won't help with it. The effectiveness of your prompts directly affects the AI-generated responses. For example, let's compare two prompts: "List all incidents," versus, "Can you find the incidents within the last 60 days? It's for my status update with managers. Make output as a table with title, severity, status, and owner. Please check with Microsoft Defender." Which prompt do you think will better help you get what you want from Copilot for Security? Did you select the second one? Then you are right.

How can we create effective prompts? We just need to ask five basic questions: who, what, why, how, and where. Specifically, "who" means the role we want the AI to play. Copilot for Security has a predefined role as a knowledgeable security analyst, so we don't need to worry about this question, and we can simply address Microsoft Copilot for Security as "you." "What" means the goal we want to achieve with the AI, for example, to find the incidents within the last 60 days. "Why" means the context of our request; for example, the reason for finding incidents is my status update with managers. "How" means the output format we want, for example, output as a table with title and severity. And "where" means the source of our plugins or data, for example, check with Microsoft Defender to find incidents.
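
Mapped onto the example above, the elements line up like this:

```text
Who:   (predefined) Copilot for Security plays the knowledgeable security analyst
What:  Can you find the incidents within the last 60 days?
Why:   It's for my status update with managers.
How:   Make output as a table with title, severity, status, and owner.
Where: Please check with Microsoft Defender.
```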

Now, if we put these essential elements together, we'll get an effective prompt. In Microsoft Copilot for Security, you have three ways to provide prompts: write your prompts from scratch, use a system capability (a prebuilt prompt provided by one of your plugins), or use a promptbook (a collection of prompts working together to complete a task). Let's do a quick demo of using prompts in Microsoft Copilot for Security. Open the left menu, click My sessions, then click New session. In the prompt bar, enter a simple prompt, "List all incidents." Copilot for Security generates a response with some incidents. It looks okay, but it's not tailored to my status report. Now I will enter a prompt with my specific expectations: "Can you find the incidents within the last 60 days? It's for my status update with managers. Make output as a table with title, severity, status, and owner. Please check with Microsoft Defender." Click Submit. Expand the output message. Here's a table with the required information for my status report. I then click Export to Excel. So this prompt is more effective.

Chapter 5: Handle Incidents with Microsoft Copilot for Security
Incident investigation and response are critical to our security operations. Let's see how Microsoft Copilot for Security can help us handle incidents. In a typical security operations center, or SOC, security analysts spend significant time handling incidents. They use a variety of security systems, for example, XDR (extended detection and response) like Microsoft Defender, SIEM (security information and event management) like Microsoft Sentinel, and threat intelligence like Microsoft Defender Threat Intelligence. And they follow a multi-step workflow: triage incidents to assess security impact, determine incident severity, and assign owners; investigate incidents to correlate security events, discover associated entities, and collect evidence; respond to incidents to contain attacks, eradicate threats, and recover services. Along the way, security analysts also need to provide summaries and write reports. As you can see, the job of a security analyst is complex and time consuming. That's why we need Copilot for Security, an AI-powered virtual companion specializing in security. Working together with Copilot, security analysts can significantly improve their incident-handling productivity.

Now let's do a quick demo. Here's my Microsoft Copilot for Security. In the prompt bar, enter the prompt, "List incidents in Microsoft Defender within the last 30 days. Make output as a table." Expand the response message. I can see a table with an incident whose incident ID is 100. To quickly understand this incident, I can go back to the prompt bar and enter, "Summarize the key facts of incident number 100 in Microsoft Defender." Click submit. Copilot for Security provides a summary of the incident with some key points. Next, to help with my investigation, I will enter, "Find entities associated with that incident," and click submit. Entities involved with an incident include accounts, IP addresses, devices, URLs, files, and processes. Copilot found the device vm-win11, the user vmadmin, an IP address, and a suspicious file. These entities provide valuable context for further investigation. For example, if I want to know about the device vm-win11, I can enter, "Tell me about the device vm-win11." Copilot tells me the device ID, the operating system and version, and the primary user. I can also ask Copilot for Security to suggest how to respond to this incident. Copilot suggested contacting the device user, in this case Harry, to confirm the activity. It even wrote a sample email. The email looks very professional, so I can use it right away, and it will help me determine the next steps in the incident response process. Finally, I can ask Copilot for Security to "Write an incident report for non-technical managers." Copilot provides a good starting point that I can then tailor to meet my specific needs.
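
For reference, the sequence of prompts used in this incident-handling walkthrough was:

```text
1. List incidents in Microsoft Defender within the last 30 days. Make output as a table.
2. Summarize the key facts of incident number 100 in Microsoft Defender.
3. Find entities associated with that incident.
4. Tell me about the device vm-win11.
5. Suggest how to respond to this incident.
6. Write an incident report for non-technical managers.
```

A sequence like this is a good candidate for a custom promptbook, which we cover later in this course.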

Chapter 6: Use Microsoft Copilot for Security in Microsoft Defender

Microsoft Copilot for Security is embedded in Microsoft Defender. This is convenient for security analysts because they don't need to open another browser while investigating and responding to incidents. Here's my Microsoft Defender environment. From the list of incidents, let's select incident number 100. The Copilot for Security pane is embedded on the right-hand side. It automatically creates an incident summary and recommends some investigation actions. Under the attack story, I can see a suspicious PowerShell command line. Click it to see more details. It tells me that WINWORD.EXE executed a script. Click it again, and it shows the script. I'm not familiar with the parameters used here, but I can ask Copilot to analyze it, which helps me quickly understand the purpose of the script. Finally, I can ask Copilot to write an incident report. Click the three dots to show more options, then click Generate incident report. It drafts a report for me. I can also open it in the Copilot for Security standalone portal and continue my work there. Click My sessions to view my sessions; the sessions created in Microsoft Defender are also kept in my session history.

Chapter 7: Analyze Vulnerabilities with Microsoft Copilot for Security
Vulnerabilities are weaknesses in our systems that hackers can exploit to launch cyber attacks, so analyzing vulnerabilities is a common task for security professionals. In vulnerability analysis, security analysts often start with a Common Vulnerabilities and Exposures (CVE) entry. There are various sources where you can search for CVEs, for example, the official CVE List managed by MITRE, threat intelligence products like Microsoft Defender Threat Intelligence, and many threat intelligence feeds. The analysis work often includes a vulnerability summary, impacted technologies, threat actors, the tactics, techniques, and procedures (TTPs) used by hackers, and suggested actions for prevention and remediation. It takes time for security analysts to complete this research and analysis.

Now let's see how Microsoft Copilot for Security can help us. First, we'll find a sample CVE. This is the CVE website; we can click Search to search the CVE List. Let's use this CVE: CVE-2020-1472. Click Submit, and we can see the details about this CVE. Go to my Microsoft Copilot for Security. In the prompt bar, enter the prompt, "Summarize vulnerability CVE-2020-1472. Make a list of key points and impacted technologies. This is for my vulnerability impact analysis." Click submit. Copilot for Security provides a summary of that vulnerability with some key points. It also shows that the base score is 5.5, the severity level is medium, and the impacted technology is the Netlogon Remote Protocol. To defend against this vulnerability, I can ask Copilot for Security to suggest actions to handle it, categorized into prevention and remediation. Copilot recommends several actions for each. Finally, let's ask Copilot for Security to generate an executive report about this vulnerability for non-technical managers. Copilot writes a decent report with the essential elements of this vulnerability analysis, including summary, impact, and suggested actions.
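
The three prompts used in this analysis, which a later chapter turns into a reusable promptbook, were:

```text
1. Summarize vulnerability CVE-2020-1472. Make a list of key points and impacted
   technologies. This is for my vulnerability impact analysis.
2. Suggest actions to handle this vulnerability, categorized into prevention and
   remediation.
3. Generate an executive report about this vulnerability for non-technical managers.
```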

Chapter 8: Analyze Suspicious Code with Microsoft Copilot for Security
Security analysts need to deal with all kinds of suspicious code. Let's see how Microsoft Copilot for Security can help with that. Suspicious code analysis includes some common tasks, such as explaining what the code does, investigating its security impact and related vulnerabilities, recommending actions to defend against the malicious code, writing a code analysis report, and sharing your findings with your team members. It's impossible for security analysts to know every programming language and system command, and examining a complex script with hundreds of lines takes a long time. This is why we need Microsoft Copilot for Security to work alongside us.

Now, let's do a quick demo. Here's a sample incident in my Microsoft Defender. Under Attack story, click Suspicious PowerShell download or encoded command execution. I find a suspicious PowerShell script. Under the command line, I see a very long script, and part of it seems to be Base64-encoded. Fortunately, we can click Analyze to let Copilot for Security analyze it. Copilot summarizes the script with a step-by-step explanation. This embedded script analysis function is very convenient during my incident investigation. I can also click More options, then click Open in Copilot for Security. It takes me to the standalone portal, and I can continue my work there.

Now let's start a new session to analyze another script. To find a sample script, I will go to Exploit Database, a collection of public exploits used by many penetration testers and security researchers. For our demo, we will use this script. I'll copy the whole script and go back to my Microsoft Copilot for Security. This time, instead of entering my own prompt, I will use a system capability. Let's search for "Analyze the script or command." I will paste the sample script and run the prompt. Copilot provides a step-by-step breakdown of what the script does. To further investigate the script, I can enter, "Show me the CVEs related to the script." Copilot finds the CVE ID and provides the details of this vulnerability. I can then ask Copilot to recommend actions to defend against the script. Copilot suggests some actions, such as updating phpMyAdmin to the latest version. Finally, I can share my findings with my colleagues. Click the share icon at the top right, enter the name or email (in this case, I just sent it to myself), and click the Send button. Once my colleagues receive the invitation email, they can click the View in Copilot link to access my shared session.
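
Again, for quick reference, the standalone-portal part of this walkthrough used:

```text
1. System capability "Analyze the script or command", with the sample script pasted in.
2. Show me the CVEs related to the script.
3. Recommend actions to defend against the script.
```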


Chapter 9: Hunt Threats with Microsoft Copilot for Security
Let's look at how to use Microsoft Copilot for Security to hunt threats. What is threat hunting? It's a proactive approach to discovering potential cyber threats. In other words, security analysts don't wait for alerts to come; they actively search for signs of malicious behavior within their environments. There are some common activities in threat hunting. First, security analysts create a hypothesis, an assumption about a specific threat that might exist in the environment. Then they create hunting queries to analyze data from various sources. Once they discover a potential threat, security analysts take actions like reporting, investigation, and response. If you use Microsoft security products, you can hunt threats in Microsoft Defender and Microsoft Sentinel, and you need to use KQL (Kusto Query Language) to write hunting queries.

But the challenge is that security analysts may lack expertise in KQL and may not be familiar with the data schemas. Now let's see how Microsoft Copilot for Security can help us. Here's Advanced hunting in Microsoft Defender. I need to create KQL queries to hunt threats over the data in these schemas. For example, I can write AlertInfo | summarize count() by Category. Click Run query, and it returns the count of each alert category. However, threat hunting queries are usually much more complex. You can find a sample query by clicking Queries and choosing one from the list; for example, under Lateral Movement, I can choose Impersonated User Footprint. If I'm not familiar with KQL and the underlying data schemas, it will take me a long time to write this kind of hunting query.

Fortunately, we can ask Copilot for Security for help. To do that, let's create a new query window, then click Copilot. I can ask a question to generate a query, for example, "Find the devices with software linked to high severity CVEs." Copilot generates a hunting query and runs it for me. In the Results pane, it shows some matching records. Expand the first record. I can see that the device vm-win11 has an outdated Windows version that contains the vulnerability, along with the related CVE ID. After validating my hypothesis, I can take further actions: I can choose the record, click Take actions, and then decide whether I want to isolate the device, collect an investigation package, or restrict app execution.

Go back to the Copilot for Security pane. If I don't want Copilot to automatically run the query for me, I can click More options and uncheck that setting. I can then start a new chat and ask a new question: "Find all failed remote logins to the device vm-win11 within the last 30 days." Click Submit. After Copilot for Security generates the query, I can choose Run query or just Add to editor, continue editing the query, and click Run query when it's done.
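
Copilot writes these queries for you, so the exact KQL will vary, but hand-written sketches of the two hunts above, assuming the standard Defender advanced hunting tables DeviceTvmSoftwareVulnerabilities and DeviceLogonEvents, might look roughly like this:

```kql
// Sketch: devices running software linked to high- or critical-severity CVEs
DeviceTvmSoftwareVulnerabilities
| where VulnerabilitySeverityLevel in ("High", "Critical")
| summarize Cves = make_set(CveId) by DeviceName, SoftwareName, SoftwareVersion
| order by DeviceName asc
```

```kql
// Sketch: failed remote logons to vm-win11 within the last 30 days
DeviceLogonEvents
| where Timestamp > ago(30d)
| where DeviceName startswith "vm-win11"
| where LogonType in ("RemoteInteractive", "Network")
| where ActionType == "LogonFailed"
| project Timestamp, DeviceName, AccountName, RemoteIP, LogonType
| order by Timestamp desc
```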

Chapter 10: Create Your Promptbooks
Let's talk about how to create your promptbooks in Microsoft Copilot for Security. A promptbook is a collection of prompts that run in sequence automatically to complete a task. For example, investigating an incident or analyzing a vulnerability. Also, we can define the inputs for a promptbook. For example, an incident ID, a CVE ID, or additional context.

Microsoft Copilot for Security provides a promptbook library. You can find many prebuilt promptbooks created by Microsoft, or you can add your custom promptbooks to the library, and you can decide who can use them: anyone in your organization, or just you. Here's my Microsoft Copilot for Security. In a previous chapter, we wrote some prompts to analyze a vulnerability. Now we can create a custom promptbook to save those prompts as a series of actions, so the next time we need to analyze another vulnerability, we can just run this promptbook to complete the task automatically. I will click the checkboxes to select the prompts I want, then click the Create promptbook icon at the top. I will name this promptbook "Custom Promptbook - Analyze a Vulnerability." I can add a tag, for example, "demo." For the description, I will enter "Summarize a CVE, suggest actions for prevention and remediation, and create an executive report." The input for this promptbook is a CVE ID. To add this input, I modify the first prompt and replace the existing CVE number with a variable in angle brackets: <CVE ID>. Then click the confirm icon. At the bottom of the window, I can see the input field, CVE ID, has been added. I can select who can use this promptbook; I will choose anyone in my organization. Finally, click Create to create the promptbook.

Go to the promptbook library, and I can see my custom promptbook is there. I can filter promptbooks by source; for example, clicking Demo SOC, which is my demo organization, shows the custom promptbooks shared within my organization. To run this promptbook, I click the play icon to start a new session, provide the CVE ID, for example, CVE-2020-1472, and click Run. It runs the prompts in sequence automatically and generates the responses.
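
Conceptually, the finished promptbook looks something like this, with the <CVE ID> placeholder filled in at run time:

```text
Name:        Custom Promptbook - Analyze a Vulnerability
Tag:         demo
Description: Summarize a CVE, suggest actions for prevention and remediation,
             and create an executive report.
Input:       CVE ID
Prompts:
  1. Summarize vulnerability <CVE ID>. Make a list of key points and impacted
     technologies. This is for my vulnerability impact analysis.
  2. Suggest actions to handle this vulnerability, categorized into prevention
     and remediation.
  3. Generate an executive report about this vulnerability for non-technical managers.
Sharing:     Anyone in my organization
```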

Chapter 11: Query Uploaded Files
We can add our organizational knowledge to Microsoft Copilot for Security by uploading files. This method is based on a generative AI architecture approach called Retrieval-Augmented Generation, or RAG. By using this method, we ground the AI model with our own data so it will generate more relevant responses.

In Copilot for Security, you have two ways to connect your organizational knowledge base: upload files, or use the Azure AI Search plugin. Let's do a quick demo of uploading a file. I have a sample file called Incident Response Team Communication Plan. This document defines the roles and responsibilities of my team members during the incident response process. Now let's upload this file to Microsoft Copilot for Security. In the prompt bar, click the Sources icon. In the left menu, click Files, then click Upload file and choose the sample document. I already have my file uploaded, so I will just click Cancel. Wait for the file status to become ready. Now we can use the file name, or the keyword "uploaded files," in our prompts.

Go back to the prompt bar and enter a prompt: "Query the uploaded files to find who should provide detailed technical information about an incident." It tells me the person responsible is the Technical Lead, Henry. I can also use a system capability to query this document. Click the prompts icon in the prompt bar, then search for "query uploaded files." Enter my query, "List the primary contact persons during an incident response process." I can also provide instructions on how to compose the answer, for example, format the output as a table. Click Submit and expand the result table. Copilot helps me find the related information.
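
Putting the file reference and the formatting instruction together, a single prompt along these lines should work as well:

```text
Query the uploaded files: list the primary contact persons during an incident
response process, and format the output as a table.
```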
