Azure identity - managed identities and roles

by DotNetNerd 23. June 2023 07:41


For a while I have actually not had to do much configuration of app registrations, managed identities and so on, simply because I often join teams where other people are doing that part. So it has been fun and at times challenging to get back into it, as I have had to lately. To help others, and maybe a future me, I just want to write down a few notes that will help me remember a few key details.

First of all, when we are talking about a setup with a frontend application and a backend API app registration, the app roles and claim mapping should be done on the backend API, with roles being defined as part of the app registration and the mapping from e.g. groups being done in the enterprise application. The roles and claims will then become available in the access token once a user logs in using the client id of the frontend application, which uses the backend API.
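
The roles themselves are defined under appRoles in the backend app registration's manifest. As a rough sketch, with placeholder name, description and id, a role entry looks something like this:

"appRoles": [
    {
        "allowedMemberTypes": [ "User", "Application" ],
        "description": "Readers can call the read endpoints of the API",
        "displayName": "Reader",
        "id": "insert a new guid here",
        "isEnabled": true,
        "value": "Reader"
    }
]

The value is what ends up in the roles claim of the access token, and including "Application" in allowedMemberTypes is what later allows the role to be assigned to a managed identity.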

In most cases we will then also need to have services that should be able to call our API. Authenticating using a managed identity makes this quite simple, with the caveat being that roles can only be assigned using a script.

To run e.g. a Functions app that should be allowed access, you need to configure a managed identity and add the role to it using a PowerShell script like the one below. A user-assigned managed identity can be used if you wish to use it for multiple services; however, things like access to Key Vault using Key Vault references require a system-assigned managed identity - so you will likely need to either use a system-assigned managed identity and configure it for each service, or have both kinds of managed identity configured.

 

# Assigns an app role to the managed identity by POSTing an appRoleAssignment to Microsoft Graph.
# Requires an az login with permission to manage app role assignments.
$objectIdForManagedIdentity = "insert object (principal) id from managed identity"
$enterpriseApplicationObjectId = "insert object id of the backend API's enterprise application"
$roleIdGuid = "insert id of the app role defined on the backend app registration"

$uri = "https://graph.microsoft.com/v1.0/servicePrincipals/$objectIdForManagedIdentity/appRoleAssignments"
$body = @{
    principalId = $objectIdForManagedIdentity   # the identity that gets the role
    resourceId = $enterpriseApplicationObjectId # the service principal exposing the role
    appRoleId = $roleIdGuid                     # the role being assigned
} | ConvertTo-Json

az rest --method POST --url $uri --body $body
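
The ids can all be found in the portal, but a quick way to list the role ids defined on the backend app registration is through the az CLI - a sketch, assuming you know the backend API's application (client) id:

az ad sp show --id "insert backend api application (client) id" --query "appRoles[].{value:value, id:id}" --output table

The enterprise application's object id is shown on the overview of the backend API's enterprise application, and the managed identity's object (principal) id is shown on the Identity blade of the service using it.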

 

After that, when fetching a token using DefaultAzureCredential, the .default scope should be used, and the ManagedIdentityClientId should be set to ensure it uses the user-assigned identity.
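
As a small, hedged sketch (the resource group and app names below are placeholders): instead of hardcoding the client id, DefaultAzureCredential also picks it up from the AZURE_CLIENT_ID environment variable, so the user-assigned identity can be selected through an app setting on e.g. the Functions app:

# Point DefaultAzureCredential at the user-assigned identity via an app setting
az functionapp config appsettings set `
    --resource-group "my-resource-group" `
    --name "my-functions-app" `
    --settings "AZURE_CLIENT_ID=insert client id of the user assigned managed identity"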

Putting API Management in front of blob storage

by DotNetNerd 13. February 2023 09:12

A nice and simple way to expose static files is through Azure blob storage. If you are already using API Management, you might want to have requests go through there, in order to ensure you can move the files somewhere else in the future. It requires a few steps to get it to work though.

First of all, managed identity should be enabled on the API Management instance, and Access Control (IAM) should be configured on the container to allow API Management to access the files. In API Management the endpoint is added with the authentication-managed-identity policy, so that the request is authenticated when it is passed through. After that a number of headers should be removed, and the x-ms-version header, which is required to do Azure AD authentication, should be set when forwarding the request from API Management to the blob storage endpoint.
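
Granting the access can be done in the portal, but as a rough PowerShell sketch it could look like this (resource names are placeholders, and the Storage Blob Data Reader role is assumed to be enough for read-only access):

$apim = Get-AzApiManagement -ResourceGroupName "my-resource-group" -Name "my-apim-instance"
$storageAccount = Get-AzStorageAccount -ResourceGroupName "my-resource-group" -Name "mystorageaccount"

# Give API Management's system assigned identity read access to the container holding the files
New-AzRoleAssignment -ObjectId $apim.Identity.PrincipalId `
                     -RoleDefinitionName "Storage Blob Data Reader" `
                     -Scope "$($storageAccount.Id)/blobServices/default/containers/my-container"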

In my case I also wanted to avoid the .json extension in the endpoint, so the configuration ended up looking something like this.

<policies>
    <inbound>        
        <set-header name="Ocp-Apim-Subscription-Key" exists-action="delete" />
        <set-header name="Sec-Fetch-Site" exists-action="delete" />
        <set-header name="Sec-Fetch-Mode" exists-action="delete" />
        <set-header name="Sec-Fetch-Dest" exists-action="delete" />
        <set-header name="Accept" exists-action="delete" />
        <set-header name="Accept-Encoding" exists-action="delete" />
        <set-header name="Referer" exists-action="delete" />
        <set-header name="X-Forwarded-For" exists-action="delete" />
        <set-header name="x-ms-version" exists-action="override">
            <value>@{string version = "2017-11-09"; return version;}</value>
        </set-header>        
        <rewrite-uri template="/settings.json" copy-unmatched-params="true" />
        <authentication-managed-identity resource="https://storage.azure.com/" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
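
Once the policy is saved, a quick way to verify the setup is to call the new operation - something like this, where the host name, path and subscription key are placeholders for your own values:

Invoke-RestMethod -Uri "https://my-apim-instance.azure-api.net/config/settings" `
                  -Headers @{ "Ocp-Apim-Subscription-Key" = "insert subscription key" }

If everything is wired up correctly, the response is the content of settings.json served from blob storage.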

Updating API management from Azure Devops Pipeline

by DotNetNerd 14. December 2022 12:32

Recently I needed to update Azure API Management based on the Swagger specifications when our services were deployed. It seems like a pretty standard thing that there would be a task for, but it turned out to require a bit of PowerShell - it is still fairly simple though.

A general script to accomplish it could look like this bit of code.

 

[CmdletBinding()]
Param(
    [string] [Parameter(Mandatory=$true)] $ResourceGroupName,
    [string] [Parameter(Mandatory=$true)] $ServiceName,
    [string] [Parameter(Mandatory=$true)] $ApiName,
    [string] [Parameter(Mandatory=$true)] $SpecificationFilePath
)

$apiMgmtContext = New-AzApiManagementContext -ResourceGroupName $ResourceGroupName -ServiceName $ServiceName
$api = Get-AzApiManagementApi -Context $apiMgmtContext -ApiId $ApiName

if ($null -eq $api) {
    Write-Error "Failed to get API with name $ApiName"
    exit(1)
}

# The ApiVersionSetId on the API is a full resource path, so grab the last segment to look up the version set
$apiVersionSetId = $api.ApiVersionSetId.Substring($api.ApiVersionSetId.LastIndexOf("/")+1)
$apiVersionSet = Get-AzApiManagementApiVersionSet -Context $apiMgmtContext -ApiVersionSetId $apiVersionSetId

Import-AzApiManagementApi -Context $apiMgmtContext `
                        -SpecificationUrl $SpecificationFilePath `
                        -SpecificationFormat 'OpenApi' `
                        -Path $api.Path `
                        -ApiId $api.ApiId `
                        -ServiceUrl $api.ServiceUrl `
                        -ApiVersionSetId $apiVersionSet.Id
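
Run locally, the script just needs an authenticated Az session, so usage looks something like this (with placeholder values matching the parameters above):

Connect-AzAccount
.\UpdateApiManagement.ps1 -ResourceGroupName "resource_group_name" `
                          -ServiceName "management_api_service_name" `
                          -ApiName "unique_api_id" `
                          -SpecificationFilePath "https://my_appservice.azurewebsites.net/swagger/1.0.0/swagger.json"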


As shown, this can of course be used locally, but to run it from a release pipeline, the cleanest way I have found is to add the script to a separate repository and include it as an artifact. From there we just need an Azure PowerShell build step, configured as shown by the YAML below.

 

variables:
  swaggerUrl: 'https://my_appservice.azurewebsites.net/swagger/1.0.0/swagger.json'

steps:
- task: AzurePowerShell@5
  displayName: 'Azure PowerShell script: Update API Management'
  inputs:
    azureSubscription: 'xxx'
    ScriptPath: '$(System.DefaultWorkingDirectory)/_MyCompany.BuildScripts/UpdateApiManagement.ps1'
    ScriptArguments: '-ResourceGroupName "resource_group_name" -ServiceName "management_api_service_name" -ApiName "unique_api_id" -SpecificationFilePath $(swaggerUrl)'
    azurePowerShellVersion: LatestVersion

 

And that is all that is required to automate it and make it update along with the build. Nice and easy.

Rolling my own OAuth 1.0 client

by DotNetNerd 24. March 2020 08:18

OAuth is one of those things where I always wonder about how bad the state of libraries etc. is. It seems like a problem pretty much everyone will tackle, but I am still not able to find a simple library that works well with the .NET Core HttpClient without being a whole framework in itself, or at least very framework dependent.

More...

This side up please

by DotNetNerd 5. March 2020 11:29

On my current project we started seeing issues with images, especially ones taken using an iPhone, that were shown as rotated. Reading up on it I found that this is due to iOS using EXIF orientation, which is not always handled the same way on e.g. Windows.

I found a couple of functions that I modified to work with our TypeScript codebase, so that they utilize async/await and have a minimum of type information. I suspect they might be useful for others, or for myself later on, so here they are.

More...

Copying data on Azure in code

by DotNetNerd 5. February 2020 09:04

I have recently been looking at copying entire collections of data on Azure, in a way that should be able to run as either Azure Functions or WebJobs. This is useful for backups, and for simply moving data between environments. I didn't come across too many good samples of how to do this, so I expect it can be a useful topic for others who need to do the same thing.

More...

Online tools and resources

by dotnetnerd 9. October 2019 08:25

Every once in a while you run into a great online tool or resource that makes life as a developer easier. Moving between companies as I do, I often see that people are using services that provide lots of value, but are not necessarily well known by most. So in this post I will start by sharing a few of the services that I have come across lately.

More...

Azure AD B2C easy auth across frontend and backend

by DotNetNerd 2. April 2019 17:02

Recently I had the need to set up easy auth using Azure AD B2C to authenticate users across a frontend Azure Web App and an Azure Functions backend. Although it sounds like a regular scenario, the documentation I found could have been better. I don’t have the time to write the complete docs, but this blog post will outline the steps, so I can remember it for next time, and hopefully enable you to do the same kind of setup. Let’s get cracking.

More...

Why cloud native is a gamechanger

by dotnetnerd 9. November 2018 11:25

Cloud native is one of those words that make some people shake their heads and call BS. In some contexts I am one of those people. It does however also have its place, because building solutions that are cloud centric does come with a number of benefits and enables solutions that were very hard, if not impossible, pre-cloud.

Sure, you can script the setup of a server from scratch, but it requires quite a bit of work, it takes time to execute, and you still end up with an environment that requires updates and patching as soon as the script is a week old. In a cloud setup, good practices in the form of DevOps, mainly using the CLI, make this very attainable. Actually, the current environment I am working with combines an ARM template and a few lines of script, so we can spin up an entire environment in about 15 minutes. The only manual step is setting up the domain and SSL cert, but even that could be scripted if I wanted to.

More...

UI testing done right with Cypress.IO

by dotnetnerd 11. October 2018 12:07

Finally, someone has written a UI testing tool for the web and done it right! For at least 5 years I have been envious of the UI testing tools that were written for native application development. I have tried various tools for UI testing websites, but they all relied on Selenium, which sucks harder than my vacuum cleaner. No matter how much lipstick you put on a pig, it is still a pig, so the brittle nature of Selenium would bleed through, and require you to do updates to drivers as well as handle very low level things like the timing between a click and the actual page being re-rendered. So working with those tools has been slow and painful.

More...

Who am I?

My name is Christian Holm Diget, and I work as an independent consultant in Denmark, where I write code, give advice on architecture and help with training. On the side I get to do a bit of speaking and help with miscellaneous community events.

Some of my primary focus areas are code quality, programming languages and using new technologies to provide value.

Microsoft Certified Professional Developer

Microsoft Most Valuable Professional

