by DotNetNerd
23. June 2023 07:41
For a while I have not had to do much configuration of app registrations, managed identities and so on, simply because I often join teams where other people handle that part. So it has been fun, and at times challenging, to get back into it, as I have had to lately. To help others, and maybe a future me, I want to write down a few notes that will help me remember some key details.
First of all, when we are talking about a setup with a frontend application and a backend API app registration, the app roles and claim mapping should be done in the backend API: the roles are defined as part of the app registration, and the mapping from e.g. groups is done in the enterprise application. The roles and claims will then become available in the access token once a user logs in using the client id for the frontend application, which uses the backend API.
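Once that mapping is in place, the decoded access token carries the roles in a roles claim. A trimmed, illustrative payload (the role names, ids and URLs are placeholders, not values from a real token):

```json
{
  "aud": "api://00000000-0000-0000-0000-000000000000",
  "iss": "https://login.microsoftonline.com/my-tenant-id/v2.0",
  "roles": [
    "Orders.Read",
    "Orders.Write"
  ]
}
```

The backend API can then authorize calls by checking this claim, without the frontend needing to know about the roles at all.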
In most cases we will then also need services that should be able to call our API. Authenticating using managed identity makes this quite simple, with the caveat that roles can only be assigned to a managed identity using a script.
To allow e.g. a Functions app access, you need to configure a managed identity and add the role to it using a PowerShell script like the one below. A user assigned managed identity can be used if you wish to share it across multiple services; however, things like access to Key Vault through Key Vault references require a system assigned managed identity. So you will likely need to either use a system assigned managed identity and configure it for each service, or have both kinds of managed identity configured.
$objectIdForManagedIdentity = "insert object (principal) id from managed identity"
$enterpriseApplicationObjectId = "insert enterprise applications object id"
$roleIdGuid = "insert role id"
$uri = "https://graph.microsoft.com/v1.0/servicePrincipals/$objectIdForManagedIdentity/appRoleAssignments"
$body = @{
    principalId = $objectIdForManagedIdentity
    resourceId = $enterpriseApplicationObjectId
    appRoleId = $roleIdGuid
} | ConvertTo-Json
az rest --method POST --url $uri --body $body
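For reference, the same Graph call can be sketched in Python. The ids are placeholders, and the snippet only constructs the request - an actual POST would need a bearer token, which az rest handles for you:

```python
import json

# Placeholder ids - substitute the real values, just as in the script above.
managed_identity_object_id = "00000000-0000-0000-0000-000000000001"
enterprise_application_object_id = "00000000-0000-0000-0000-000000000002"
app_role_id = "00000000-0000-0000-0000-000000000003"

# The Graph endpoint that creates app role assignments for a service principal.
url = (
    "https://graph.microsoft.com/v1.0/servicePrincipals/"
    f"{managed_identity_object_id}/appRoleAssignments"
)

# Request body: who gets the role (principalId), on which enterprise
# application (resourceId), and which role (appRoleId).
body = json.dumps({
    "principalId": managed_identity_object_id,
    "resourceId": enterprise_application_object_id,
    "appRoleId": app_role_id,
})

print(url)
print(body)
```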
After that, when fetching a token using DefaultAzureCredential, the .default scope should be used and the ManagedIdentityClientId should be set, to ensure the user assigned identity is used.
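A minimal sketch of that token request, here using the Python azure-identity SDK (the app id URI is a placeholder, and the azure-identity import is kept inside the function since it needs the package and a managed identity to actually run; the .NET DefaultAzureCredential works the same way):

```python
def default_scope(app_id_uri: str) -> str:
    """Build the .default scope from an app registration's ID URI."""
    return f"{app_id_uri}/.default"

def get_api_token(app_id_uri: str, user_assigned_client_id: str) -> str:
    """Fetch an access token for the backend API as the managed identity."""
    from azure.identity import DefaultAzureCredential

    credential = DefaultAzureCredential(
        # Point the credential at the user assigned identity; without this
        # it may fall back to the system assigned one.
        managed_identity_client_id=user_assigned_client_id,
    )
    return credential.get_token(default_scope(app_id_uri)).token

# Placeholder ID URI for the backend API's app registration.
print(default_scope("api://11111111-1111-1111-1111-111111111111"))
```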
by DotNetNerd
13. February 2023 09:12
A nice and simple way to expose static files is through Azure Blob Storage. If you are already using API Management, you might want to have requests go through there, so you can move the files somewhere else in the future. It requires a few steps to get working though.
First of all, a managed identity should be enabled in API Management, and Access Control (IAM) should be configured on the container to allow API Management to access the file. In API Management the endpoint is added with the authentication-managed-identity policy, so authentication is handled as requests pass through. After that, a number of headers should be removed, and the x-ms-version header, which is required for Azure AD authentication, should be set when forwarding the request from API Management to the blob storage endpoint.
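The IAM part can be scripted as well. A sketch using the Azure CLI, assuming the built-in Storage Blob Data Reader role is sufficient - the principal id and storage account resource id are placeholders:

```shell
# Principal id of API Management's system assigned managed identity,
# and the resource id of the storage account - both placeholders.
PRINCIPAL_ID="00000000-0000-0000-0000-000000000000"
STORAGE_ID="/subscriptions/my-sub/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/myaccount"

# Grant read access to blobs, so API Management can fetch the file.
az role assignment create \
  --assignee "$PRINCIPAL_ID" \
  --role "Storage Blob Data Reader" \
  --scope "$STORAGE_ID"
```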
In my case I also wanted to avoid the .json extension in the endpoint, so the configuration ended up looking something like this.
<policies>
<inbound>
<set-header name="Ocp-Apim-Subscription-Key" exists-action="delete" />
<set-header name="Sec-Fetch-Site" exists-action="delete" />
<set-header name="Sec-Fetch-Mode" exists-action="delete" />
<set-header name="Sec-Fetch-Dest" exists-action="delete" />
<set-header name="Accept" exists-action="delete" />
<set-header name="Accept-Encoding" exists-action="delete" />
<set-header name="Referer" exists-action="delete" />
<set-header name="X-Forwarded-For" exists-action="delete" />
<set-header name="x-ms-version" exists-action="override">
<value>@{string version = "2017-11-09"; return version;}</value>
</set-header>
<rewrite-uri template="/settings.json" copy-unmatched-params="true" />
<authentication-managed-identity resource="https://storage.azure.com/" />
</inbound>
<backend>
<base />
</backend>
<outbound>
<base />
</outbound>
<on-error>
<base />
</on-error>
</policies>
by DotNetNerd
14. December 2022 12:32
Recently I needed to update Azure API Management based on the Swagger specifications when our services were deployed. It seems like such a standard thing that there would be a ready-made task for it, but it turned out to require a bit of PowerShell - it is still fairly simple though.
A general script to accomplish it could look like this bit of code.
[CmdletBinding()]
Param(
    [string] [Parameter(Mandatory=$true)] $ResourceGroupName,
    [string] [Parameter(Mandatory=$true)] $ServiceName,
    [string] [Parameter(Mandatory=$true)] $ApiName,
    [string] [Parameter(Mandatory=$true)] $SpecificationFilePath
)

$apiMgmtContext = New-AzApiManagementContext -ResourceGroupName $ResourceGroupName -ServiceName $ServiceName
$api = Get-AzApiManagementApi -Context $apiMgmtContext -ApiId $ApiName

if ($null -eq $api) {
    Write-Error "Failed to get API with name $ApiName"
    exit(1)
}

# Resolve the version set from the id at the end of the resource path
$apiVersionSetId = $api.ApiVersionSetId.Substring($api.ApiVersionSetId.LastIndexOf("/") + 1)
$apiVersionSet = Get-AzApiManagementApiVersionSet -Context $apiMgmtContext -ApiVersionSetId $apiVersionSetId

Import-AzApiManagementApi -Context $apiMgmtContext `
    -SpecificationUrl $SpecificationFilePath `
    -SpecificationFormat 'OpenApi' `
    -Path $api.Path `
    -ApiId $api.ApiId `
    -ServiceUrl $api.ServiceUrl `
    -ApiVersionSetId $apiVersionSet.Id
This can of course be used locally, but to run it from a release pipeline, the cleanest way I have found is to add it to a separate repository and include it as an artifact. From there we just need an Azure PowerShell build step, configured as shown in the YAML below.
variables:
  swaggerUrl: 'https://my_appservice.azurewebsites.net/swagger/1.0.0/swagger.json'

steps:
- task: AzurePowerShell@5
  displayName: 'Azure PowerShell script: Update API Management'
  inputs:
    azureSubscription: 'xxx'
    ScriptPath: '$(System.DefaultWorkingDirectory)/_MyCompany.BuildScripts/UpdateApiManagement.ps1'
    ScriptArguments: '-ResourceGroupName "resource_group_name" -ServiceName "management_api_service_name" -ApiName "unique_api_id" -SpecificationFilePath $(swaggerUrl)'
    azurePowerShellVersion: LatestVersion
And that is all that is required to automate it and make it update along with the build. Nice and easy.
by DotNetNerd
5. February 2020 09:04
I have recently been looking at copying entire collections of data on Azure, in a way that can run as either Azure Functions or Webjobs. This is useful for backups, and for simply moving data between environments. I didn't come across many good samples of how to do this, so I expect it can be a useful topic for others who need to do the same thing.
by DotNetNerd
2. April 2019 17:02
Recently I had the need to set up Easy Auth using Azure B2C to authenticate users across a frontend Azure Web App and an Azure Functions backend. Although it sounds like a regular scenario, the documentation I found could have been better. I don't have the time to write the complete docs, but this blog post will outline the steps, so I can remember it for next time, and hopefully enable you to do the same kind of setup. Let's get cracking.
by dotnetnerd
9. November 2018 11:25
Cloud native is one of those terms that make some people shake their heads and call BS. In some contexts I am one of those people. It does however also have its place, because building solutions that are cloud centric comes with a number of benefits, and enables solutions that were very hard, if not impossible, pre-cloud.
Sure, you can script the setup of a server from scratch, but it requires quite a bit of work, it takes time to execute, and you still end up with an environment that requires updates and patching as soon as the script is a week old. In a cloud setup, good DevOps practices, mainly using the CLI, make this very obtainable. The current environment I am working with combines an ARM template and a few lines of script, so we can spin up an entire environment in about 15 minutes. The only manual step is setting up the domain and SSL cert, but even that could be scripted if I wanted to.
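The scripted setup described above can be sketched with the Azure CLI - the resource group name, location, template file and parameter are all placeholders, and the real ARM template carries the actual environment definition:

```shell
# Create the resource group and deploy the ARM template into it.
az group create --name my-environment-rg --location westeurope

az deployment group create \
  --resource-group my-environment-rg \
  --template-file azuredeploy.json \
  --parameters environmentName=test
```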
by DotNetNerd
26. May 2016 11:27
One of the really nice things about Azure Web Apps is the support for running Webjobs. Most large web applications will at some point need data or media processed by a background process, and for that Webjobs are a perfect fit.
by dotnetnerd
11. May 2016 12:49
A few weeks ago Microsoft introduced the concept of "Cool" Blob Storage on Azure, which means you get REALLY cheap storage for data that you don't access very often - backup being an obvious use case. In my case I have used Dropbox for backups for a while, and although it works fine for a certain amount of data, it is not really a good fit for backing up the family photos and videos from the home NAS once a year.
by DotNetNerd
29. March 2016 12:59
A core value that Azure brings to modern projects is enabling developers to take control of the deployment process, and making it fast and painless. Sure, scalability is nice when and if you need it, but the speed and flexibility of setting up an entire environment for your application is always valuable - so for me this is the more important feature of Azure. Gone are the days of waiting days at best, more likely weeks, and maybe even months, for the IT department to create a new development or test environment.
by DotNetNerd
4. May 2015 05:22
At Build there were a number of huge announcements. It is amazing to see that Azure is keeping up the pace, and as a developer I am excited about the promise of services that will let me focus on implementing solutions rather than fiddling with servers and infrastructure.