Adatis BI Blogs

Converting Data Factory Azure Function Activities from Web to Native

Microsoft have recently added support to call Azure Functions natively within Data Factory. Prior to this, you only had one option if you wanted to leverage the serverless compute, which was through a Web activity. While this certainly did a job, it wasn't ideal as it exposed the function/host key within the HTTP request used to authenticate with the functions. As such, I expect quite a few people's pipelines will be in a similar state to my own. I have recently been through the process of upgrading them in line with the new native ADF functionality and, as part of this, I thought it would be worth sharing the process I went through so you don't suffer the same pains!

Linked Service

The first step in this process is to add a new linked service into your data factory. When you try to do this, you'll probably end up doing what always seems to happen when first searching for an Azure Function connector and search the prompt in front of you. This is filtered to data connectors by default, so the connector won't appear; you'll need to switch the tab at the top to Compute, at which point you'll see it. The next step is to add the URL of your functions endpoint. You'll then need to supply a key to the function app for authentication purposes. I strongly advise you to set up an Azure Key Vault and store the key in there; in ADF you can then access the key via a secret. This is simple to do, promotes best practice from a security perspective, and is also useful for CI/CD purposes. This key can be either a function key (specific to a single function) or a host key (access to all functions). Most of the tutorials on the web, as well as the official MS content, currently reference the function key, so at first I was slightly worried that you could only call functions via the function key. This is not the case, and the host key can be used too.
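To illustrate, a linked service definition sourcing the key from Key Vault looks roughly like the following (the names and the secret are placeholders, and it assumes a Key Vault linked service called AzureKeyVault has already been created in the factory):

```json
{
  "name": "AzureFunctionLinkedService",
  "properties": {
    "type": "AzureFunction",
    "typeProperties": {
      "functionAppUrl": "https://<your-function-app>.azurewebsites.net",
      "functionKey": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "AzureKeyVault",
          "type": "LinkedServiceReference"
        },
        "secretName": "<your-secret-name>"
      }
    }
  }
}
```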
If you're not too familiar with functions, the host key can be found within the application settings of your function app.

Pipelines

The next step in the process is replacing those Web activities with native Azure Function activities. This is a fairly simple process, but you'll no longer be calling a GET on an HTTP webhook directly via a URL string and passing in parameters. Instead, you'll need to specify the function you want to call and pass in the parameters via a JSON body as part of a POST to the endpoint. For any functions not requiring parameters, you'll need to use a GET rather than a POST with an empty body, as ADF has issues validating a body without any content. To pass the parameters into a body, you'll need to wrap them in JSON. I've provided examples of strings, integers, and expressions, but hopefully you get the point here.

Function Code

As part of this process, you'll also need to modify the C# code within your function apps. Before this upgrade, you may have been handling your functions through the HttpResponseMessage object, and your Web activity was happy to accept a response from this class in Data Factory. Now that we are using the native function activity, Data Factory will no longer accept an HTTP message as a response and instead wants a JSON object (JObject). As a result, the HttpResponseMessage needs to be replaced by an IActionResult object. The change is relatively trivial: you only need to decide on the type of result to create instead of just creating a response. To use this object, you'll need to import the Microsoft.AspNetCore.Mvc libraries through NuGet.
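Putting these changes together, a minimal sketch of the updated function shape might look like the following. The function and parameter names are illustrative, and it assumes the Microsoft.AspNetCore.Mvc and Newtonsoft.Json packages are referenced:

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Newtonsoft.Json.Linq;

public static class MyFunction
{
    [FunctionName("MyFunction")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        // The body arrives as a stream; read it and deserialise it into a JObject
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        JObject parameters = JObject.Parse(requestBody);

        // Individual parameters can then be read from the object by name
        string stringParameter = (string)parameters["stringParameter"];
        int integerParameter = (int)parameters["integerParameter"];

        // ADF expects a JSON object back, not an HttpResponseMessage
        return new OkObjectResult(new { status = "Succeeded" });
    }
}
```

The corresponding activity body in ADF would then be something like `{ "stringParameter": "hello", "integerParameter": 5, "expressionParameter": "@{pipeline().RunId}" }`, where the `@{...}` syntax is Data Factory's string interpolation for expressions.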
At this point, I ran into an issue: the dependencies required by Microsoft.AspNetCore.Mvc conflicted with my existing libraries, to the extent that separate NuGet packages wanted my Newtonsoft.Json library to be both equal to version 9.0.1 and greater than version 11.0.1. To solve this, I imported separate parts of the main library instead – in this instance, Microsoft.AspNetCore.Mvc.Abstractions and Microsoft.AspNetCore.Mvc.Core. These two libraries did not have the same constraints as the entire framework library. While the above code changes allowed me to send/receive a JObject, in most scenarios I also needed to POST parameters into the function, which meant modifying the code that reads them. The content is now treated as a stream, which then needs to be deserialised from JSON into an object; the parameters can then be read from that object by name.

Errors

If you've found this blog as part of a Google search, you'll more likely than not have hit the following generic error: { "errorCode": "3600", "message": "Error calling the endpoint.", "failureType": "UserError", "target": "Activate New Parent Pipeline" }. This is not a particularly helpful error, as it basically just tells you that ADF did not receive a correctly formatted response. More often than not, this will be due to an issue in your function code rather than something in ADF. I spent a while thinking it was an authentication error when it was not! You may get a slightly more verbose error if you check the function app logs through something such as Application Insights – again, I would strongly recommend you set this up as part of your Azure Functions service, as it's not ideal to rely on ADF to debug this.
Alternatively, I would also suggest setting up Postman to debug the issues locally, giving you a bit more control over the variables at runtime through the Locals output.

Conclusion

Hopefully you will find this guide useful if you're about to go through a similar process to my own. It might also be useful for anyone setting up functions in ADF for the first time, as it covers most of the content required for the two services to talk to one another.

Azure Functions Key Vault Integration

Introduction

A really useful feature that recently came into public preview for Azure Functions is native integration with Key Vault when retrieving application settings. This feature allows you to source application settings directly from Key Vault without making any changes to your function code and without having to implement any additional infrastructure code to handle connecting to Key Vault and retrieving secrets. Microsoft's original blog post announcing the feature has more detail.

Key Vault integration leverages another recent Azure Functions feature called managed identities. When creating a function app, you can easily create a system assigned managed identity by enabling it through the Azure Portal or by including a property within the ARM template; this allows you to interact with many Azure services using Azure AD authentication without the need to worry about managing service principals and the inevitable downtime that occurs when their keys expire.

As the feature is currently in preview, there is a limitation in that it doesn't support rotation of secrets, meaning only a single version of a secret is supported; however, I'll run through a workaround for that issue in this post.

Now that we've covered the basics of what the feature does, I'll run through a quick demo to show how it can help us when developing function apps. This post isn't intended to be an introduction to Azure Functions or Azure Key Vault, so I have assumed a basic knowledge of each and only covered the elements of the set up that relate to the Key Vault integration.

Demo

The first step is to create our resources in Azure.
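For reference, enabling the system assigned managed identity in an ARM template is just one extra property on the function app resource. A trimmed sketch (the rest of the site definition is omitted):

```json
{
  "type": "Microsoft.Web/sites",
  "apiVersion": "2018-11-01",
  "name": "[parameters('functionAppName')]",
  "kind": "functionapp",
  "identity": {
    "type": "SystemAssigned"
  },
  "properties": {}
}
```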
For the demo we need to create a new resource group that contains a function app with its associated storage account and an instance of Azure Key Vault.

Next, we need to enable the system assigned managed identity on our function app. We do this by navigating to the function app in the portal, clicking on the "Platform Features" tab and then clicking on "Identity". From there we can set the status of the system assigned managed identity to "On" and click save to apply the change.

Next, we need to grant our function app permission to retrieve secrets from Key Vault. We do this by navigating to Key Vault in the portal and clicking on "Access Policies", then clicking on "Add New" and clicking the "Select Principal" panel that is displayed. This brings up a new blade where we can search for the managed identity created for our function app in Azure AD. Now we've selected our principal, we need to give it the appropriate permissions: to retrieve secrets our function app only needs "Get" permission on secrets, so we can select that option and click OK. The Key Vault UI isn't the most intuitive here, and most people forget the final step: we need to click the save button to commit the changes to our access policies.

Now that we've created our Azure resources and set the function app up so that it can communicate with Key Vault, we need to add a secret to Key Vault and get our function app to retrieve it.
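If you prefer scripting over clicking through the portal, the identity and access policy steps above can be sketched with the Azure CLI as follows (the resource names are placeholders for your own):

```shell
# Enable the system assigned managed identity and capture its object id
principalId=$(az functionapp identity assign \
  --name my-function-app \
  --resource-group my-resource-group \
  --query principalId --output tsv)

# Grant the identity permission to read secrets from Key Vault
az keyvault set-policy \
  --name my-key-vault \
  --object-id "$principalId" \
  --secret-permissions get
```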
To do this I have created a script that adds the given secret to Key Vault and then sets the required application setting on our function app to allow the value to be retrieved. The application setting is simply a pointer to the secret, in the format @Microsoft.KeyVault(SecretUri=<secret URI, including version>). Because the URI must include the version of the secret, always deploying secrets through this script means the function app is always pointed at the latest version, which works around the current limitation around rotation. After running the script we can take a look at the application settings for our function app and see that the reference to the secret has been added.

Now we've added our secret to Key Vault and created the reference to it in our function app, we can test that our functions are able to retrieve the secret. To do this we can create a new HTTP triggered function that uses Environment.GetEnvironmentVariable to retrieve the value of our application setting. When we run the function, the result returned matches the value we added to Key Vault! Obviously, in the real world we wouldn't want to expose the value of our secret outside of our function, but this allows us to see the value that was returned by the app setting.

Conclusion

To conclude, we've shown how easy it is to integrate Azure Functions with Azure Key Vault. With the newly released integration we can leverage managed identities to access Key Vault without the need to write any additional code or take on the overhead of managing service principals. This means we can ensure that our secrets are stored safely in Key Vault, improving the security of our serverless application.