Retrieving Key Vault Secrets from a Logic App using Managed Service Identity

Azure Logic Apps are great for automating tasks, and there is a plethora of actions available out of the box. However, sometimes these don't work the way you would like, so it comes down to falling back to the old tried-and-true methods. I recently came across this with Azure Key Vaults.

For the uninitiated, Azure Key Vaults are security enclaves deployed within Azure data centers that are used for the sole purpose of key and secret storage. This allows the escrow of these sensitive pieces of information in dedicated hardware, from which they can only be retrieved by an account with the proper permissions. Key Vaults are also ridiculously inexpensive, costing only $0.03/1,000 operations (for standard Key Vaults), which makes them a no-brainer.

Out of the box, Azure Logic Apps do have integration with Azure Key Vault via an action (in preview at the time of writing). However, this integration only offers two means of authenticating to the Key Vault:

  • Creating a connection using a named user
  • Authentication via a pre-created Service Principal

Azure Key Vault Actions

These options have a major drawback: the management of a password or secret for connectivity. If I sign in with my account (or a service account), then whenever I or that account is offboarded, this connection is broken and must be reestablished. Service Principals can help mitigate this, but the secret that's used must be managed, or set to never expire (which I prefer not to do, as it creates a security hole).

The fix: Managed Identities!

Managed Identities in Azure AD are like (group) Managed Service Accounts (gMSAs) in Active Directory: the principal is created with a randomly-generated password that Azure AD manages on its own. This significantly reduces the attack surface of the account, as the password is never exposed to anyone, and the lifecycle of a system-assigned identity is tied to the resource: when the resource is deleted, the Managed Identity is removed as well. Managed Identities can be used with many different Azure resources, including Virtual Machines, and in our case, Logic Apps!

Creating the Managed Identity

Creating the Managed Identity is extremely simple. Within the Logic App, navigate to the Identity area of the Logic App blade and switch the Status to On. This will automatically create the identity in Azure AD and provision all the backend items.

Once this configuration is saved, the Object ID will be displayed and the object can be found in Azure AD using the name of the Logic App or the Object ID.

Created Managed Identity
Managed Identity in Azure AD

Phew, that was difficult! Now we can assign that Managed Identity rights to our resources.

Assigning the Managed Identity Within Azure Key Vault

Once our Managed Identity is created, we can grant it access to secrets in the Key Vault. I'm going to create a secret for this Logic App to use.

The first step is to create the secret, which is no different than normal in a Key Vault. Navigate to the Key Vault, select Secrets, and add a new Secret.

Azure Key Vault Secret Creation 1
Azure Key Vault Secret Creation 2

Once the secret is created, we can set our Access Policy in the Key Vault. Note down the URI of this Secret; we will need it later.

First, we navigate to Access Policies and select Add New.

Key Vault Access Policies

In the Access Policy, we select our Managed Identity and give it permissions to read Secrets.

Selecting our Managed Identity
Setting the appropriate permissions

Our Managed Identity can now retrieve Secrets from this Key Vault. Now comes the fun part, where we use it in a Logic App.

Using Stored Secrets in a Logic App

Now we can put it all together.

Within the Logic App, instead of using the Key Vault action to retrieve Secrets, we will use an HTTP action with the following parameters:

  • Method: GET
  • URI: <Key Vault Secret URI>
  • Queries: api-version=7.0
  • Authentication: Managed Identity
  • Audience: https://vault.azure.net (This is accessible via the Add New Parameter dropdown)

Retrieving our Secret
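
For comparison, here is a minimal Python sketch of the same call made outside of a Logic App, assuming the requests and azure-identity packages; the vault and secret names are hypothetical:

```python
# Minimal sketch: fetch a Key Vault secret using a Managed Identity token.
# This only works from inside an Azure resource that has a Managed Identity.
import requests
from azure.identity import ManagedIdentityCredential

# Acquire a token for the Key Vault audience (https://vault.azure.net),
# the same audience configured on the Logic App HTTP action.
credential = ManagedIdentityCredential()
token = credential.get_token("https://vault.azure.net/.default")

secret_uri = "https://contoso-vault.vault.azure.net/secrets/MySecret"  # hypothetical
resp = requests.get(
    secret_uri,
    params={"api-version": "7.0"},
    headers={"Authorization": f"Bearer {token.token}"},
)
resp.raise_for_status()
print(resp.json()["value"])  # the secret itself lives in the "value" property
```

The response is a JSON object whose value property carries the secret, alongside id and attributes metadata.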

By running this manually, we now see that the Secret is returned:

Returned Secret

Now I can send this to a simple Parse JSON action to extract the secret's value, which can then be used in later actions.

Parse JSON and use in HTTP call

And that’s it! I’ve now retrieved a Secret from an Azure Key Vault using a Managed Identity!

Push MDATP Alerts to Log Analytics using Logic Apps

One of the questions that I get asked all the time is how to integrate cloud solutions into monitoring platforms. Whether it is Azure AD sign-in logs, Exchange Audit Logs, or anything else, the primary desire is a centralized location for these logs to provide a “single pane of glass”.

In the past, Rich and I have talked with clients about using Azure Log Analytics as that centralized platform. However, it was never really a SIEM, and many clients wanted to keep using their on-premises solutions, such as Splunk, ArcSight, or similar.

With the addition of Azure Sentinel on top of Azure Log Analytics, we're starting to have more conversations about leveraging cloud-native solutions instead of bolted-on products. Because of the work we do in the Microsoft security space, one of the things we wanted to do was push data from Microsoft Defender ATP (formerly Windows Defender ATP) into Log Analytics, where we can then write queries and alerts on it within Sentinel.

To do this, we decided to use Azure Logic Apps for two main reasons:

  1. It is low-cost, with a consumption-based model
  2. It provides a modular, graphical means of authoring workflows

This Logic App consists of five easy steps.

Step 1: The Trigger

#TriggerWarning

All Logic Apps must start with a trigger, either based on an event or a schedule. We decided to start this Logic App off simply: a trigger based on when a Defender ATP Alert occurred. Pretty simple, and we just used Rich’s account for the API integration.

Logic App Trigger

Step 2: API Call, The First

The first real step after receiving data via the trigger is to… Receive data.

Huh?

To push data into Log Analytics and have it parsed properly, we want to submit each alert (and its corresponding information) as a single JSON object to the Log Analytics endpoint. However, the output from the trigger is just the alert ID and machine ID. We could send that into the built-in Logic Apps action for retrieving alerts, but that returns a set of already-parsed fields rather than the single raw JSON object we want.

To fix this, our first step is to pull the alert directly from the MDATP API, which does give us our JSON object. To handle this connection, we have created an App Registration in Azure AD. We are using the following configuration:

  • Method: GET
  • URI: https://api.securitycenter.windows.com/api/alerts/<Alert ID from Step 1>
  • Authentication: Active Directory OAuth
  • Tenant: <AAD Tenant GUID>
  • Audience: https://api.securitycenter.windows.com
    • Note the lack of a trailing “/” at the end
  • Client ID: <App Registration Client ID>
  • Credential Type: Secret
  • Secret: <App Registration Key>

First API Call
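
Outside of Logic Apps, the same action can be sketched in Python: a standard client-credentials token request against the Azure AD token endpoint, followed by the GET. The placeholders mirror the parameters above:

```python
# Sketch of the first API call: get a client-credentials token, then fetch the alert.
import requests

tenant_id = "<AAD Tenant GUID>"
client_id = "<App Registration Client ID>"
client_secret = "<App Registration Key>"

# Request a token for the MDATP API audience (note: no trailing "/").
token_resp = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://api.securitycenter.windows.com",
    },
)
access_token = token_resp.json()["access_token"]

# Pull the raw alert as a single JSON object.
alert_id = "<Alert ID from Step 1>"
alert = requests.get(
    f"https://api.securitycenter.windows.com/api/alerts/{alert_id}",
    headers={"Authorization": f"Bearer {access_token}"},
).json()
```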

Step 3: API Call, The Second

Because alert info isn’t everything we need, we also decided to pull in machine information for a given alert. Luckily, this configuration is nearly identical to the other API call, just a different endpoint. For this, we’re using the following settings. Stop me if this looks familiar:

  • Method: GET
  • URI: https://api.securitycenter.windows.com/api/machines/<Machine ID from Step 1>
  • Authentication: Active Directory OAuth
  • Tenant: <AAD Tenant GUID>
  • Audience: https://api.securitycenter.windows.com
    • Note the lack of a trailing “/” at the end
  • Client ID: <App Registration Client ID>
  • Credential Type: Secret
  • Secret: <App Registration Key>

Second API Call
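
Continuing the sketch above, the second call simply reuses the same token against the machines endpoint:

```python
# Fetch the machine tied to the alert, reusing the token from the first call.
machine_id = "<Machine ID from Step 1>"
machine = requests.get(
    f"https://api.securitycenter.windows.com/api/machines/{machine_id}",
    headers={"Authorization": f"Bearer {access_token}"},
).json()
```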

Step 4: Putting it All Together

Now that we have all of our data, we can push it into Azure Log Analytics. However, we want to ensure that the two items are linked, and we don't want to maintain multiple tables. To do this, we simply join both JSON objects into a single one using the union() function in Logic Apps. Luckily, the two objects share few fields, so we don't have to worry about overwriting properties. The combined object is then passed to the in-box action that sends data to Log Analytics, into a custom log we've named MDATPAlerts. Note: log names in Log Analytics are case sensitive, so if you are creating multiple Logic Apps to forward data, our MDATPAlerts log would be different from an MDATPalerts log.

Sending data to Log Analytics
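
For the curious, here is a rough Python sketch of what this step does: merge the two objects (Logic Apps' union() behaves like the dict merge below, with the last object winning on a key collision) and post the result to the Log Analytics HTTP Data Collector API, which the in-box action wraps. The workspace ID and key are placeholders:

```python
# Sketch of sending a custom log record via the HTTP Data Collector API.
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests

def send_to_log_analytics(workspace_id: str, shared_key: str,
                          log_type: str, record: dict) -> None:
    body = json.dumps(record).encode("utf-8")
    rfc1123_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    # Sign the request with the workspace shared key (HMAC-SHA256).
    string_to_sign = (f"POST\n{len(body)}\napplication/json\n"
                      f"x-ms-date:{rfc1123_date}\n/api/logs")
    signature = base64.b64encode(
        hmac.new(base64.b64decode(shared_key),
                 string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()).decode()
    resp = requests.post(
        f"https://{workspace_id}.ods.opinsights.azure.com/api/logs",
        params={"api-version": "2016-04-01"},
        headers={
            "Authorization": f"SharedKey {workspace_id}:{signature}",
            "Log-Type": log_type,  # surfaces as the MDATPAlerts_CL table
            "x-ms-date": rfc1123_date,
            "Content-Type": "application/json",
        },
        data=body,
    )
    resp.raise_for_status()

# Merge the two API responses; union() keeps the last value on a collision.
merged = {**alert, **machine}
send_to_log_analytics("<Workspace ID>", "<Workspace Key>", "MDATPAlerts", merged)
```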

The Final Product

In the end, we now have all MDATP alerts flowing into our workspace, where we can generate alerts in Sentinel, create dashboards, and correlate against other potential IoCs. All of these logs land in our MDATPAlerts_CL log (CL meaning Custom Log) in Log Analytics and can be searched from there. Then the fun begins!

Running this Blog on Azure

One of the things that I get asked about a lot is real-world examples of running workloads on Azure. Because this blog is hosted in my Azure subscription, I figured this would be a great place to start! And the number of resources (and the way I've deployed them) may give some other people ideas for their own deployments.

So to begin, here is what I have deployed across 3 different Resource Groups (RGs).

Resource Group #1 – Azure DevOps

DevOps Resource Group

This is the easiest, so we may as well start here. When I was doing some functional load testing, I didn't have an Azure DevOps instance associated with my personal account, so I used an automatically generated one on Azure, which is deployed into its own Resource Group. That's it, just a DevOps instance.

  • Total Resources in RG #1: 1
    • Azure DevOps Organization

Resource Group #2 – The Real Stuff

Main Resource Group

Now on to my second Resource Group, which is responsible for hosting the actual blog and its directly required services. This is deployed on Azure Platform as a Service (PaaS), so there is not a server in sight! In fact, this is deployed 100% on containers, as most PaaS services are.

The blog itself runs on a single Azure App Service in North Central US, hosted on a Linux App Service Plan using the Basic tier (sorry for the performance). The Linux App Service runs a WordPress container, which hosts this site, backed by an Azure Database for MySQL. I am also storing all my media straight in an Azure Storage Account, which gives me 5 PB of storage (at the time of writing), versus the 10 GB of local storage that comes with the App Service.

To monitor the environment, I've got an App Insights workspace set up, with its shared dashboard resource and an anomalous-failures alert, all of which are separate resources.

The final item I have in this Resource Group is the SSL certificate bound to the site, which we'll discuss more in the next section.

  • Total Resources in RG #2: 8
    • Azure App Service
    • Azure App Service Plan
    • Azure Database for MySQL
    • Azure Storage Account
    • Application Insights Workspace
    • App Insights Shared Dashboard
    • App Insights anomalous failures alert
    • App Service Certificate

Resource Group #3 – Security is Kinda Important

Encryption Resource Group

The last item in the previous section is an App Service Certificate. You may or may not have noticed, but this site currently uses LetsEncrypt. LetsEncrypt is great because it's free, but the certificates are only good for 3 months at a time, rather than the usual 1-3 years. Rather than renewing this every 3 months manually, I'm automating the process using a great WebJob solution from Ohad Schneider called letsencrypt-webapp-renewer, which runs in an Azure App Service that has its own App Service Plan.

To keep costs low by staying on the Free tier, I am triggering this WebJob with a Function (hooray for 1M free executions and 400k free GB-s per month), which also has its own consumption-based App Service Plan.
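
As a rough illustration of that trigger, the Function only needs one call to the Kudu API's triggered-WebJobs endpoint; the app name, WebJob name, and publishing credentials below are hypothetical:

```python
# Rough sketch of triggering a WebJob from a Function via the Kudu API.
# App name, WebJob name, and publishing credentials are hypothetical.
import requests

app_name = "my-letsencrypt-app"            # hypothetical App Service name
webjob_name = "letsencrypt-webapp-renewer"
publish_user = "$my-letsencrypt-app"       # from the app's publish profile
publish_password = "<publish profile password>"

resp = requests.post(
    f"https://{app_name}.scm.azurewebsites.net/api/triggeredwebjobs/{webjob_name}/run",
    auth=(publish_user, publish_password),  # basic auth with publishing credentials
)
resp.raise_for_status()  # a 2xx response means the WebJob run was queued
```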

Finally, I’ve got an additional Storage Account to store required files and such for this solution.

  • Total Resources in RG #3: 5
    • Azure App Service
    • Azure Function
    • Azure App Service Plan x2
    • Azure Storage Account

Recap

In total, I am running 14 resources in Azure to support this blog, for a total running cost of about $50/mo, spread across 3 different Resource Groups. I could consolidate, but I like having some separation between resources. This also allows me to view the cost of individual components, so that I can better track my costs and projections.