# Azure Functions
## Superfast: Using a Web App and Managed Identity to Invoke Function App Triggers
**TOC:** Introduction · Setup · References

### 1. Introduction

Many enterprises prefer not to use App Keys to invoke Function App triggers, as they are concerned that these fixed strings might be exposed. This method allows you to invoke Function App triggers using a Managed Identity instead, for enhanced security. I will provide examples in both Bash and Node.js.

### 2. Setup

**Step 1: Create a Linux Python 3.11 Function App**

1.1. Configure Authentication to block unauthenticated callers while allowing the Web App's Managed Identity to authenticate:

| Setting | Value |
| --- | --- |
| Identity provider | Microsoft |
| Choose a tenant for your application and its users | Workforce configuration |
| App registration type | Create |
| Name | [automatically generated] |
| Client secret expiration | [fit your business purpose] |
| Supported account type | Any Microsoft Entra directory - Multi-tenant |
| Client application requirement | Allow requests from any application |
| Identity requirement | Allow requests from any identity |
| Tenant requirement | Use default restrictions based on issuer |
| Token store | [checked] |

1.2. Create an anonymous trigger. Since your app is already protected by the App Registration, additional Function App-level protection is unnecessary; otherwise, you would need a Function Key to trigger it.

1.3. Once the Function App is configured, try accessing the endpoint directly - you should receive a 401 Unauthorized error, confirming that triggers cannot be accessed without proper Managed Identity authorization.

1.4. After making these changes, wait 10 minutes for the settings to take effect.

**Step 2: Create a Linux Node.js 20 Web App, Obtain an Access Token, and Invoke the Function App Trigger (Bash Example)**

2.1. Enable System Assigned Managed Identity in the Web App settings.

2.2. Open the Kudu SSH console for the Web App.

2.3. Run the following commands, making the necessary modifications:

- `subscriptionsID` → Replace with your Subscription ID.
- `resourceGroupsID` → Replace with your Resource Group ID.
- `application_id_uri` → Replace with the Application ID URI from your Function App's App Registration.
- `https://az-9640-myfa.azurewebsites.net/api/my_test_trigger` → Replace with your Function App trigger URL.

```bash
# Please set up the target resource values to match yours
subscriptionsID="01d39075-XXXX-XXXX-XXXX-XXXXXXXXXXXX"
resourceGroupsID="XXXX"
application_id_uri="api://9c0012ad-XXXX-XXXX-XXXX-XXXXXXXXXXXX"

# Variable setting (no need to change)
identityEndpoint="$IDENTITY_ENDPOINT"
identityHeader="$IDENTITY_HEADER"

# Install necessary tool
apt install -y jq

# Get access token
tokenUri="${identityEndpoint}?resource=${application_id_uri}&api-version=2019-08-01"
accessToken=$(curl -s -H "Metadata: true" -H "X-IDENTITY-HEADER: $identityHeader" "$tokenUri" | jq -r '.access_token')
echo "Access Token: $accessToken"

# Run trigger
response=$(curl -s -o response.json -w "%{http_code}" -X GET "https://az-9640-myfa.azurewebsites.net/api/my_test_trigger" -H "Authorization: Bearer $accessToken")
echo "HTTP Status Code: $response"
echo "Response Body:"
cat response.json
```

2.4. If everything is set up correctly, you should see a successful invocation result.
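If the call instead returns 401/403, one quick way to debug - not part of the original walkthrough, just a suggestion - is to decode the token payload and confirm its `aud` claim matches your Application ID URI. JWT segments are base64url-encoded without padding, so the sketch below restores padding before decoding:

```bash
# Optional sanity check: the "aud" claim should match application_id_uri
payload=$(echo "$accessToken" | cut -d '.' -f2 | tr '_-' '/+')
case $(( ${#payload} % 4 )) in
  2) payload="${payload}==" ;;
  3) payload="${payload}=" ;;
esac
echo "$payload" | base64 -d | jq '.aud'
```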
**Step 3: Invoke the Function App Trigger Using the Web App (Node.js Example)**

I have also provided a Node.js example, which you can modify accordingly, save to /home/site/wwwroot/callFunctionApp.js, and run:

```bash
cd /home/site/wwwroot/
vi callFunctionApp.js
npm init -y
npm install @azure/identity axios
node callFunctionApp.js
```

```javascript
// callFunctionApp.js
const { DefaultAzureCredential } = require("@azure/identity");
const axios = require("axios");

async function callFunctionApp() {
  try {
    const applicationIdUri = "api://9c0012ad-XXXX-XXXX-XXXX-XXXXXXXXXXXX"; // Change here
    const credential = new DefaultAzureCredential();

    console.log("Requesting token...");
    const tokenResponse = await credential.getToken(applicationIdUri);
    if (!tokenResponse || !tokenResponse.token) {
      throw new Error("Failed to acquire access token");
    }
    const accessToken = tokenResponse.token;
    console.log("Token acquired:", accessToken);

    const apiUrl = "https://az-9640-myfa.azurewebsites.net/api/my_test_trigger"; // Change here
    console.log("Calling the API now...");
    const response = await axios.get(apiUrl, {
      headers: {
        Authorization: `Bearer ${accessToken}`,
      },
    });

    console.log("HTTP Status Code:", response.status);
    console.log("Response Body:", response.data);
  } catch (error) {
    console.error(
      "Failed to call the function",
      error.response ? error.response.data : error.message
    );
  }
}

callFunctionApp();
```

Running the script should print the acquired token followed by a successful trigger response.

### 3. References

- Tutorial: Managed Identity to Invoke Azure Functions | Microsoft Learn
- How to Invoke Azure Function App with Managed Identity | by Krizzia | Medium
- Configure Microsoft Entra authentication - Azure App Service | Microsoft Learn

## Use a User-Assigned Managed Identity to Replace the Connection String in "AzureWebJobsStorage" for Function Apps
Managing the connectivity between the function app and the storage account is crucial, as the Azure Functions runtime is stored in the Azure storage account. In case of a disconnection, you might run into common errors such as "Azure Functions runtime is unreachable". Fortunately, Microsoft has a helpful guide that provides self-help troubleshooting steps for recovering your storage account when such errors occur: https://learn.microsoft.com/en-us/azure/azure-functions/functions-recover-storage-account

Previously, the only way to grant an Azure Function access to its runtime in a storage account was via the connection string in the "AzureWebJobsStorage" configuration. However, a new and more secure approach exists for granting a function app access to the storage account without compromising sensitive information: leveraging a managed identity to replace the connection string used in "AzureWebJobsStorage". By adopting this approach, you can ensure that secrets remain private while still granting the permissions the function app needs to operate seamlessly.

Detailed instructions are already available for replacing the connection string with a system-assigned identity: https://learn.microsoft.com/en-us/azure/azure-functions/functions-identity-based-connections-tutorial. It's worth noting that both system-assigned and user-assigned identities are supported in this scenario. For those who prefer to use a user-assigned identity to replace the connection string, here are the instructions (a CLI sketch of the same steps follows the list):

a. Prepare a user-assigned identity and copy its client ID for later use.
b. Grant the identity "Storage Blob Data Owner" on the storage account.
c. Assign the user-assigned identity to the function app.
d. Add the three corresponding app settings:

- `AzureWebJobsStorage__accountName` = storage account name (a system-assigned identity needs only this setting)
- `AzureWebJobsStorage__clientId` = client ID of the user-assigned identity
- `AzureWebJobsStorage__credential` = `managedidentity`

Then it is done. You should be able to run your function app correctly without "AzureWebJobsStorage".
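For readers who prefer scripting the portal steps, here is one possible Azure CLI sketch of steps (b)-(d); all resource names and IDs below are placeholders you would substitute with your own:

```bash
# (b) Grant the user-assigned identity "Storage Blob Data Owner" on the storage account
az role assignment create \
  --assignee <identity-client-id> \
  --role "Storage Blob Data Owner" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account>"

# (c) Assign the user-assigned identity to the function app
az functionapp identity assign \
  --name <function-app> \
  --resource-group <rg> \
  --identities "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.ManagedIdentity/userAssignedIdentities/<identity-name>"

# (d) Add the identity-based settings, then remove the old connection string
az functionapp config appsettings set \
  --name <function-app> \
  --resource-group <rg> \
  --settings "AzureWebJobsStorage__accountName=<storage-account>" \
             "AzureWebJobsStorage__clientId=<identity-client-id>" \
             "AzureWebJobsStorage__credential=managedidentity"

az functionapp config appsettings delete \
  --name <function-app> \
  --resource-group <rg> \
  --setting-names AzureWebJobsStorage
```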
## Azure Functions Flex Consumption is now generally available

We are excited to announce that Azure Functions Flex Consumption is now generally available. This hosting plan provides the highest performance for Azure Functions, with concurrency-based scaling for both HTTP and non-HTTP triggers, scale from zero to 1,000 instances, and no cold start with the Always Ready feature. Flex Consumption also allows you to enjoy seamless integration with your virtual network at no extra cost, ensuring secure and private communication with no considerable impact on your app's scale-out performance.

Learn more about how to achieve high HTTP scale with Azure Functions Flex Consumption, the engineering innovation behind it, and Project Legion, the platform behind Flex Consumption.

In addition to the fast scaling based on per-instance concurrency, you can choose between 2048 MB and 4096 MB instance sizes. As the function app receives requests, it will automatically scale from zero to as many instances of that instance size as needed, based on per-instance concurrency, and back to zero for cost efficiency when there are no more requests to process. You can also take advantage of the built-in integration with Azure Load Testing and the Performance Optimizer to optimize your HTTP functions for performance and cost.

Flex Consumption is now generally available for .NET 8 on the isolated worker model, Java 11, Java 17, Node 20, PowerShell 7.4, Python 3.10, and Python 3.11 in Australia East, East Asia, East US, North Europe, Southeast Asia, Sweden Central, UK South, and West US 2, and in preview in East US 2, South Central US, and West US 3. By December 9th, 2024, .NET 9 will also be generally available in Australia East, East Asia, East US, North Europe, Southeast Asia, Sweden Central, and UK South.

Besides the currently supported DevOps and dev tools like VS Code, Java tooling, Azure Pipelines tasks, and GitHub Actions, you can now use the latest Visual Studio 2022 v17.12 update or newer to create and publish to Flex Consumption apps.

The Flex Consumption plan offers competitive pricing with flexible options to fit your needs, with GA pricing taking effect on December 1, 2024. For detailed pricing information, please refer to the pricing page.

### Customer adoption and scenarios

We have been working with several internal and external customers during the public preview period, with hundreds of external customers actively using Flex Consumption.

> "At Yggdrasil, we immediately started adopting Flex Consumption functions when they went into public preview, as they offer the combination of cost-efficiency, scalability, and security features we need to run our company. We already have 100 Flex Consumption functions running in production, and expect to move at least another 50 functions now that the product has reached GA. We migrated to Flex from Consumption to have VNet integration and private endpoints." – Andreas Strandfelt, Partner & Senior Cloud Specialist at Yggdrasil Commodities ApS

> "What really matters to us is that the app scales up and down based on demand. Azure Functions Flex Consumption is very appealing to us because of how it dynamically scales based on the number of messages that are queued up in Azure Event Hubs." – Stephan Miehe, GitHub Senior Director (public case study)

> "We had a need to process a large queue, representing a significant volume of data with inconsistent availability.
> Azure Functions Flex Consumption dramatically simplified the code footprint needed to perform this embarrassingly parallel task and helped us complete it in a much shorter timeframe than we had expected." – Craig Presti, Office of the CTO, Microsoft AI project

### Going forward

In the upcoming months we look forward to rolling out even more features to Flex Consumption, including:

- **Availability zones:** Enabling availability zones will be possible for new and existing Flex Consumption apps.
- **512 MB instance size:** We will introduce a new, smaller instance size for more granular control.
- **Enhanced tooling support:** PowerShell modules and Terraform AzureRM support.
- **New language versions:** Support for the latest language versions like Node 22, Python 3.12, and Java 21.
- **Expanded regional availability:** The number of regions will continue to expand in early 2025, with UAE North, Central US, West US 3, South Central US, East US 2, West US, Canada Central, France Central, and Norway East coming first.
- **Metrics support:** Full Azure Monitor metrics support for Flex Consumption apps.
- **Deployment improvements:** Zero-downtime deployment to ensure no disruption to running executions.
- **More triggers:** Kafka and SQL triggers.
- **Closing features:** Addressing the limitations identified in Considerations. Please let us know which ones are most important to you!

### Get started!

Explore our reference samples, quickstarts, and comprehensive documentation to get started with the Azure Functions Flex Consumption hosting plan today!

## Retrieving Azure App Service Deployment Center Events - Monitoring
Hello Team,

I would like to know how to retrieve Azure App Service Deployment Center events. Specifically, I'm looking to integrate a webhook to capture trigger and deployment events from the Deployment Center.

Thanks,
Vinoth_Azure

## Building a TOTP Authenticator App on Azure Functions and Azure Key Vault
Two-factor authentication (2FA) has become a cornerstone of modern digital security, serving as a crucial defense against unauthorized access and account compromises. While many organizations rely on popular authenticator apps like Microsoft Authenticator, there's significant value in understanding how to build and customize your own TOTP (Time-based One-Time Password) solution. This becomes particularly relevant for those requiring specific customizations, enhanced security controls, or seamless integration with existing systems.

In this blog, I'll walk through building a TOTP authenticator application using Azure's modern cloud services. Our solution demonstrates using Azure Functions for server-side operations with Azure Key Vault for secrets management. A bonus section covers integrating with Azure Static Web Apps for the frontend. The solution supports the standard TOTP protocol (RFC 6238), ensuring compatibility with services like GitHub and Microsoft's own authentication systems.

While this implementation serves as a proof of concept rather than a production-ready system, it provides a solid foundation for understanding how authenticator apps work under the hood. By walking through the core components - from secret management to token generation - it shares valuable insights into both authentication systems and cloud architecture. This knowledge proves especially valuable for teams considering building custom authentication solutions or those looking to better understand the security principles behind 2FA.

### Understanding TOTP

Time-based One-Time Password (TOTP) is an algorithm that generates temporary passwords based on a shared secret key and the current time. The foundation of TOTP lies in its use of a shared secret key. When a user first enables 2FA with a service like GitHub, a unique secret key is generated. This key is then encoded into a QR code that the user scans with their authenticator app. This initial exchange of the secret is the only time it's transmitted between the service and the authenticator.

For example, a service will provide a QR code which, when decoded, contains text like this:

```
otpauth://totp/Test%20Token?secret=2FASTEST&issuer=2FAS
```

When we break down this URI:

- `otpauth://` specifies this is an OTP authentication URI
- `totp/` indicates this is time-based (as opposed to counter-based HOTP)
- `Test%20Token` is the account name (URL encoded)
- `secret=2FASTEST` is the shared secret key
- `issuer=2FAS` identifies the service providing the 2FA

Once the user scans this, the secret is shared with the app, and both the service and authenticator app use it in combination with the current time to generate codes. The process divides time into 30-second intervals. For each interval, the current Unix timestamp is combined with the secret key using a cryptographic hash function (HMAC-SHA1), which produces a consistent 6-digit code that both sides can generate independently.

Security in TOTP comes from several key design principles. The short 30-second validity window means that even if an attacker intercepts a code, they have very limited time to use it. The one-way nature of the hash function means that even with a valid code, an attacker cannot work backwards to discover the secret key. Additionally, since the system relies on UTC time, it works seamlessly across different time zones.

Most services implement a small amount of time drift tolerance. Since device clocks may not be perfectly synchronized, services typically accept codes from adjacent 30-second time windows. This provides a balance between security and usability, ensuring that slight time differences don't prevent legitimate authentication attempts while maintaining the security benefits of time-based codes.
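To make the algorithm concrete, here is a minimal sketch - not part of the original post, and separate from the library we'll use later - of what an authenticator does internally: base32-decode the shared secret, HMAC-SHA1 the current 30-second counter, and dynamically truncate the hash to 6 digits (RFC 6238/4226). The helper names are my own:

```javascript
const crypto = require('crypto');

// Decode a base32 secret (e.g. "2FASTEST") into raw bytes.
function base32Decode(input) {
  const alphabet = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ234567';
  let bits = '';
  for (const char of input.replace(/=+$/, '').toUpperCase()) {
    const index = alphabet.indexOf(char);
    if (index === -1) throw new Error('Invalid base32 character');
    bits += index.toString(2).padStart(5, '0');
  }
  const bytes = [];
  for (let i = 0; i + 8 <= bits.length; i += 8) {
    bytes.push(parseInt(bits.slice(i, i + 8), 2));
  }
  return Buffer.from(bytes);
}

// Generate a 6-digit TOTP for the current 30-second window.
function generateTotp(secret, timeStepSeconds = 30, digits = 6) {
  // The "counter" is simply the number of 30-second windows since the Unix epoch.
  const counter = Math.floor(Date.now() / 1000 / timeStepSeconds);
  const counterBuffer = Buffer.alloc(8);
  counterBuffer.writeBigUInt64BE(BigInt(counter));

  const hmac = crypto.createHmac('sha1', base32Decode(secret))
    .update(counterBuffer)
    .digest();

  // Dynamic truncation (RFC 4226): read 4 bytes at an offset given by the last nibble.
  const offset = hmac[hmac.length - 1] & 0x0f;
  const binary = hmac.readUInt32BE(offset) & 0x7fffffff;
  return String(binary % 10 ** digits).padStart(digits, '0');
}

console.log(generateTotp('2FASTEST')); // same code any authenticator shows this window
```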
TOTP has become the de facto standard for two-factor authentication across the internet. Its implementation in RFC 6238 ensures compatibility between different services and authenticator apps. This means that whether you're using Google Authenticator, Microsoft Authenticator, or building your own solution like we are, the underlying mechanics remain the same, providing a consistent and secure experience for users.

### Architecture

Our TOTP authenticator is built with security and scalability in mind, leveraging Azure's managed services to handle sensitive authentication data. The system consists of three main components: the web frontend, the backend API, and the secret storage.

- **Backend API:** Implemented as Azure Functions, our backend provides endpoints for managing TOTP secrets and generating tokens. We use Azure Functions because they provide excellent security features through managed identities, automatic HTTPS enforcement, and built-in scaling capabilities. The API will contain endpoints for adding new 2FA accounts and retrieving tokens.
- **Secret storage:** Azure Key Vault serves as our secure storage for TOTP secrets. This choice provides several crucial benefits: hardware-level encryption for secrets, detailed access auditing, and automatic key rotation capabilities. Azure Key Vault's managed identity integration with Azure Functions ensures secure, certificate-free access to secrets, while its global redundancy guarantees high availability.

### Prerequisites

To follow along with this blog, you'll need the following:

- **Azure subscription:** You will need an active subscription to host the services we will use. Make sure you have appropriate permissions to create and manage resources. If you don't have one, you can sign up here: https://azure.microsoft.com/en-us/pricing/purchase-options/azure-account
- **Visual Studio Code:** For the development environment, install Visual Studio Code. Other IDEs are available, though we will be benefiting from the extensions within this IDE. Download VS Code here: https://code.visualstudio.com/
- **VS Code Azure extensions (optional):** There are many different ways to deploy to Azure Static Web Apps and Azure Functions, but having one-click deploy functionality inside our IDE is extremely useful. To install on VS Code, head to Extensions > search "Azure Static Web Apps" > click Install, and do the same for the Azure Functions extension.

### Building the app

#### Deploying the resources

We will need to create at least an Azure Key Vault resource and, if you want to test the Function in the cloud (not just locally), an Azure Function App too. I've attached the Azure CLI commands to deploy these resources, though it can be done through the portal if that's more comfortable.
Firstly, create an Azure Key Vault resource:

```bash
az keyvault create \
  --name <your-kv-name> \
  --resource-group <your-rg> \
  --location <region>
```

Enable RBAC for your Azure Key Vault:

```bash
az keyvault update \
  --name <your-kv-name> \
  --enable-rbac-authorization true
```

Create a new Azure Function App:

```bash
az functionapp create \
  --name <app-name> \
  --storage-account <storage-name> \
  --consumption-plan-location <region> \
  --runtime node \
  --runtime-version 18 \
  --functions-version 4
```

Set the Azure Key Vault name as an environment variable in the Azure Function App:

```bash
az functionapp config appsettings set \
  --name <app-name> \
  --resource-group <your-rg> \
  --settings "KEY_VAULT_NAME=<your-kv-name>"
```

Grant your Azure Function App's managed identity access to Azure Key Vault:

```bash
az role assignment create \
  --assignee-object-id <function-app-managed-identity> \
  --role "Key Vault Secrets Officer" \
  --scope /subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.KeyVault/vaults/<your-kv-name>
```

#### Building the API

The backend of our authenticator app serves as the secure foundation for managing 2FA secrets and generating TOTP tokens. While it might be tempting to handle TOTP generation entirely in the frontend (as some authenticator apps do), our cloud-based approach offers several advantages. By keeping secrets server-side, we can provide secure backup and recovery options, implement additional security controls, and protect against client-side vulnerabilities.

The backend API will have two key responsibilities which the frontend will trigger:

1. Securely store new account secrets
2. Generate valid TOTP tokens on demand

First, we need to create an Azure Functions project in VS Code. The creation wizard will ask you to create a trigger, so let's start with (1) and create a trigger for processing new accounts:

Go to the Azure tab > click the Azure Functions icon > click Create New Project > choose a folder > choose JavaScript > choose Model V4 > choose HTTP trigger > provide a name ('accounts') > click Open in new window.

Let's make a few modifications to this base Function:

- Ensure that the only allowed HTTP method is POST, as there is no need to support both and we will make use of the request body allowed in POST requests.
- Clear everything inside the handler to make way for our upcoming code.

Now, let's work forward from this adjusted base:

```javascript
const { app } = require("@azure/functions");

app.http("accounts", {
  methods: ["POST"],
  authLevel: "anonymous",
  handler: async (request, context) => {
  }
});
```

This accounts endpoint will be responsible for securely storing new TOTP secrets when users add accounts to their authenticator. Here's what we need this endpoint to do:

1. Receive the new account details: the TOTP secret, account name, and issuer (extracted from the QR code on the frontend)
2. Validate the request, ensuring proper formatting of all fields and that the user is authenticated
3. Store the secret in Azure Key Vault with appropriate metadata
4. Return a success/failure status so the frontend can update accordingly

First, let's validate the incoming request data. When setting up two-factor authentication, services provide a QR code containing a URI in the otpauth:// format. This standardized format includes all the information we need to set up TOTP authentication. Assuming the frontend has decoded the QR code and sent us the resulting data, let's add some code to parse and validate this URI format. We'll use JavaScript's built-in URL class to handle the parsing, which will also take care of URL encoding/decoding for us.
Add the following code to the function:

```javascript
// First, ensure we have a JSON payload
let requestBody;
try {
  requestBody = await request.json();
} catch (error) {
  context.log('Error parsing request body:', error);
  return {
    status: 400,
    jsonBody: {
      error: 'Invalid request format',
      details: 'Request body must be valid JSON containing a TOTP URI'
    }
  };
}

// Check for the URI in the request
const { uri } = requestBody;
if (!uri || typeof uri !== 'string') {
  return {
    status: 400,
    jsonBody: {
      error: 'Missing or invalid TOTP URI',
      details: 'Request must include a "uri" field containing the TOTP setup URI'
    }
  };
}
```

This first section of code handles the basic validation of our incoming request data. We start by attempting to parse the request body as JSON using request.json(), wrapping it in a try-catch block to handle any parsing failures gracefully. If the parsing fails, we return a 400 Bad Request status with a clear error message. After successfully parsing the JSON, we check for the presence of a uri field in the request body and ensure it's a string value. This validation ensures we have the minimum required data before we attempt to parse the actual TOTP URI in the next step.

Let's now move on to parsing and validating the TOTP URI itself. This URI should contain all the important information: the type of OTP (TOTP in our case), the account name, the secret key, and optionally the issuer. Here's an example of a valid URI as provided by services:

```
otpauth://totp/Test%20Token?secret=2FASTEST&issuer=2FAS
```

To parse this, add the following code after our initial validation:

```javascript
// Parse and validate the TOTP URI
try {
  const totpUrl = new URL(uri);

  // Validate it's a TOTP URI
  if (totpUrl.protocol !== "otpauth:") {
    throw new Error("URI must use otpauth:// protocol");
  }
  if (totpUrl.host !== "totp") {
    throw new Error("URI must be for TOTP authentication");
  }

  // Extract the components
  const accountName = decodeURIComponent(totpUrl.pathname.split("/")[1]);
  const secret = totpUrl.searchParams.get("secret");
  const issuer = totpUrl.searchParams.get("issuer");

  // Validate required components
  if (!secret) {
    throw new Error("Missing secret in URI");
  }

  // Store the parsed data for the next step
  const validatedData = {
    accountName,
    secret,
    issuer: issuer || accountName, // Fall back to account name if issuer not specified
  };

  ...
} catch (error) {
  context.log("Error validating TOTP URI:", error);
  return {
    status: 400,
    jsonBody: {
      error: "Invalid TOTP URI",
      details: error.message,
    },
  };
}
```

We use JavaScript's built-in URL class to do the heavy lifting of parsing the URI components. We first verify this is actually a TOTP URI by checking the protocol and host. Then we extract the three key pieces of information: the account name (from the path), the secret key, and the issuer (both from the query parameters). We validate that the essential secret is present and store all this information in a validatedData object.

Now that we have our TOTP data properly validated and parsed, let's move on to setting up our Azure Key Vault integration. Firstly, we must install the required Azure SDK packages:

```bash
npm install @azure/identity @azure/keyvault-secrets
```

Now we can add the Azure Key Vault integration to our function.
Add these imports and client setup at the top of your file:

```javascript
const { DefaultAzureCredential } = require('@azure/identity');
const { SecretClient } = require('@azure/keyvault-secrets');
const { randomUUID } = require('crypto');

// Initialize Key Vault client
const credential = new DefaultAzureCredential();
const vaultName = process.env.KEY_VAULT_NAME;
const vaultUrl = `https://${vaultName}.vault.azure.net`;
const secretClient = new SecretClient(vaultUrl, credential);
```

This code sets up our connection to Azure Key Vault using Azure's managed identity authentication. The DefaultAzureCredential will automatically handle authentication when deployed to Azure, and our vault name comes from an environment variable to keep it configurable. Be sure to set the KEY_VAULT_NAME variable inside your local.settings.json file.

Now let's add the code to store our TOTP secret in Azure Key Vault. Add this after our URI validation:

```javascript
// Create a unique name for this secret
const secretName = `totp-${randomUUID()}`;

// Store the secret in Key Vault with metadata
try {
  await secretClient.setSecret(secretName, validatedData.secret, {
    contentType: 'application/json',
    tags: {
      accountName: validatedData.accountName,
      issuer: validatedData.issuer,
      type: 'totp-secret'
    }
  });

  context.log(`Stored new TOTP secret for account ${validatedData.accountName}`);

  return {
    status: 201,
    jsonBody: {
      message: 'TOTP secret stored successfully',
      secretName: secretName,
      accountName: validatedData.accountName,
      issuer: validatedData.issuer
    }
  };
} catch (error) {
  context.error('Error storing secret in Key Vault:', error);
  return {
    status: 500,
    jsonBody: {
      error: 'Failed to store TOTP secret'
    }
  };
}
```

When storing the secret, we use setSecret with three important parameters:

1. A unique name generated using a UUID (`totp-${randomUUID()}`). This ensures each secret has a globally unique identifier with no possibility of collisions, even across distributed systems. The resulting name looks like totp-123e4567-e89b-12d3-a456-426614174000.
2. The actual TOTP secret we extracted from the URI.
3. Metadata about the secret, including:
   - contentType marking this as JSON data
   - tags containing the account name and issuer, which help us identify the purpose of each secret without needing to retrieve its actual value
   - a type tag marking this specifically as a TOTP secret

If the storage succeeds, we return a 201 Created status with details about the stored secret (but never the secret itself). The returned secretName is particularly important, as it will be used later when we need to retrieve this secret to generate TOTP codes.

Now that we can securely store TOTP secrets, let's create our second endpoint, which generates the 6-digit codes. This endpoint will:

1. Retrieve a secret from Azure Key Vault using its unique ID
2. Generate a valid TOTP code based on the current time
3. Return the code along with its remaining validity time

Follow the same setup steps as earlier, and ensure you have an empty function. I've named it tokens and set it as a GET request:

```javascript
app.http('tokens', {
  methods: ['GET'],
  authLevel: 'anonymous',
  handler: async (request, context) => {
  }
});
```

Let's add the code to validate the query parameter and retrieve the secret from Azure Key Vault.
A valid request will look like this:

```
/api/tokens?id=totp-123e4567-e89b-12d3-a456-426614174000
```

We want to ensure the ID parameter exists and has the correct format:

```javascript
// Get the secret ID from query parameters
const secretId = request.query.get('id');

// Validate the secret ID format
if (!secretId || !secretId.startsWith('totp-')) {
  return {
    status: 400,
    jsonBody: {
      error: 'Invalid or missing secret ID. Must be in format: totp-{uuid}'
    }
  };
}
```

This code first checks if we have a properly formatted secret ID in our query parameters. The ID should start with totp- and be followed by a UUID, matching the format we used when storing secrets in our first endpoint. If the ID is missing or invalid, the function returns a 400 Bad Request response with a specific error message.

If the ID is valid, we should attempt to retrieve the secret from Azure Key Vault:

```javascript
try {
  // Retrieve the secret from Key Vault
  const secret = await secretClient.getSecret(secretId);

  ...
} catch (error) {
  context.error('Error retrieving secret:', error);
  return {
    status: 500,
    jsonBody: {
      error: 'Failed to retrieve secret'
    }
  };
}
```

If anything goes wrong during this process (like the secret doesn't exist or we have connection issues), we log the error and return a 500 Internal Server Error.

Now that we have the secret from Azure Key Vault, let's add the code to generate the 6-digit TOTP code. First, install the otp package:

```bash
npm install otp
```

Then add this import at the top of your file:

```javascript
const OTP = require('otp');
```

Now let's generate a 6-digit TOTP using this library and the data retrieved from Azure Key Vault:

```javascript
const totp = new OTP({ secret: secret.value });

// Generate the current token
const token = totp.totp();

// Calculate remaining seconds in this 30-second window
const timeRemaining = 30 - (Math.floor(Date.now() / 1000) % 30);

return {
  status: 200,
  jsonBody: {
    token,
    timeRemaining
  }
};
```

Let's break down exactly how this code generates our 6-digit TOTP code. When we generate a TOTP code, we're using our stored secret key to create a unique 6-digit number that changes every 30 seconds. The OTP library handles this through several steps behind the scenes. First, when we create a new OTP instance with new OTP({ secret: secret.value }), we're setting up a TOTP generator with our base32-encoded secret (like 'JBSWY3DPEHPK3PXP') that we retrieved from Azure Key Vault. When we call totp(), the library takes our secret and combines it with the current time to generate a code. It takes the current Unix timestamp, divides it by 30 to get the current time window, then uses this value and our secret in an HMAC-SHA1 operation. The resulting hash is then dynamically truncated to give us exactly 6 digits. This is why anyone with the same secret will generate the same code within the same 30-second window.

To help users know when the current code will expire, we calculate timeRemaining by finding out how far we are into the current 30-second window and subtracting that from 30. This gives users a countdown until the next code will be generated.

With both our endpoints complete, we now have a functional backend for our TOTP authenticator. The first endpoint securely stores TOTP secrets in Azure Key Vault, generating a unique ID for each one. The second endpoint uses these IDs to retrieve secrets and generate valid 6-digit TOTP codes on demand.
This server-side approach offers several advantages over traditional authenticator apps: our secrets are securely stored in Azure Key Vault rather than on user devices, we can easily back up and restore access if needed, and we can add additional security controls around code generation.

### Testing

First, we'll need to run the functions locally using the Azure Functions Core Tools. Open your terminal in the project directory and run:

```bash
func start
```

I'm using a website designed to check whether your 2FA app is working correctly. It creates a valid QR code and also calculates the TOTP on its end, so you can compare results. I highly recommend using it alongside me to test our solution: https://2fas.com/check-token/

It will present you with a QR code. You can scan it in your frontend, though you can copy/paste the below, which is the exact same value:

```
otpauth://totp/Test%20Token?secret=2FASTEST&issuer=2FAS
```

Now let's test our endpoints sequentially using curl (or Postman if you prefer). My functions started on port 7071; be sure to check yours before you send the request. Let's start with adding the above secret to Azure Key Vault:

```bash
curl -X POST http://localhost:7071/api/accounts \
  -H "Content-Type: application/json" \
  -d '{ "uri": "otpauth://totp/Test%20Token?secret=2FASTEST&issuer=2FAS" }'
```

This should return a response containing the generated secret ID (your UUID will be different):

```json
{
  "message": "TOTP secret stored successfully",
  "secretName": "totp-f724efb9-a0a7-441f-86c3-2cd36647bfcf",
  "accountName": "Test Token",
  "issuer": "2FAS"
}
```

Sidenote: if you head to Azure Key Vault in the Azure portal, you can see the saved secret.

Now we can use this secretName to generate TOTP codes:

```bash
curl http://localhost:7071/api/tokens?id=totp-f724efb9-a0a7-441f-86c3-2cd36647bfcf
```

The response will include a 6-digit code and the remaining time until it expires:

```json
{
  "token": "530868",
  "timeRemaining": 26
}
```

To prove that this is accurate, quickly look again at the website, and you should see the exact same code and a very similar time remaining. This confirms that your code is valid! You can keep generating new codes and checking them - remember that the code changes every 30 seconds, so be quick when testing and validating.

### Bonus: Frontend UI

While not the focus of this blog, as bonus content I've put together a React component which provides a functional interface for our TOTP authenticator. This component allows users to upload QR codes provided by other services, processes them to extract the TOTP URI, sends it to our backend for storage, and then displays the generated 6-digit code with a countdown timer.

I've followed a similar style to other known, modern authenticator apps. I recommend writing your own code for the user interface, as it's highly subjective.
However, the following is the full React component in case you can benefit from it:

```javascript
import React, { useState, useEffect, useCallback } from "react";
import { Shield, UserCircle, Plus, Image as ImageIcon } from "lucide-react";
import jsQR from "jsqr";

const TOTPAuthenticator = () => {
  const [secretId, setSecretId] = useState(null);
  const [token, setToken] = useState(null);
  const [timeRemaining, setTimeRemaining] = useState(null);
  const [localTimer, setLocalTimer] = useState(null);
  const [error, setError] = useState(null);
  const [isPasting, setIsPasting] = useState(false);

  // Count down locally between server refreshes
  useEffect(() => {
    let timerInterval;
    if (timeRemaining !== null) {
      setLocalTimer(timeRemaining);
      timerInterval = setInterval(() => {
        setLocalTimer((prev) => {
          if (prev <= 0) return timeRemaining;
          return prev - 1;
        });
      }, 1000);
    }
    return () => clearInterval(timerInterval);
  }, [timeRemaining]);

  // Decode a QR code image and register the account with the backend
  const processImage = async (imageData) => {
    try {
      const img = new Image();
      img.src = imageData;
      await new Promise((resolve, reject) => {
        img.onload = resolve;
        img.onerror = reject;
      });

      const canvas = document.createElement("canvas");
      const context = canvas.getContext("2d");
      canvas.width = img.width;
      canvas.height = img.height;
      context.drawImage(img, 0, 0);

      const imgData = context.getImageData(0, 0, canvas.width, canvas.height);
      const code = jsQR(imgData.data, canvas.width, canvas.height);
      if (!code) {
        throw new Error("No QR code found in image");
      }

      const response = await fetch("http://localhost:7071/api/accounts", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ uri: code.data }),
      });

      const data = await response.json();
      if (!response.ok) throw new Error(data.error);

      setSecretId(data.secretName);
      setToken({
        issuer: data.issuer,
        accountName: data.accountName,
        code: "--",
      });
      setError(null);
    } catch (err) {
      setError(err.message);
    } finally {
      setIsPasting(false);
    }
  };

  const handlePaste = useCallback(async (e) => {
    e.preventDefault();
    setIsPasting(true);
    setError(null);
    try {
      const items = e.clipboardData.items;
      const imageItem = Array.from(items).find((item) =>
        item.type.startsWith("image/")
      );
      if (!imageItem) {
        throw new Error("No image found in clipboard");
      }
      const blob = imageItem.getAsFile();
      const reader = new FileReader();
      reader.onload = async (event) => {
        await processImage(event.target.result);
      };
      reader.onerror = () => {
        setError("Failed to read image");
        setIsPasting(false);
      };
      reader.readAsDataURL(blob);
    } catch (err) {
      setError(err.message);
      setIsPasting(false);
    }
  }, []);

  const handleDrop = useCallback(async (e) => {
    e.preventDefault();
    setIsPasting(true);
    setError(null);
    try {
      const file = e.dataTransfer.files[0];
      if (!file || !file.type.startsWith("image/")) {
        throw new Error("Please drop an image file");
      }
      const reader = new FileReader();
      reader.onload = async (event) => {
        await processImage(event.target.result);
      };
      reader.onerror = () => {
        setError("Failed to read image");
        setIsPasting(false);
      };
      reader.readAsDataURL(file);
    } catch (err) {
      setError(err.message);
      setIsPasting(false);
    }
  }, []);

  const handleDragOver = (e) => {
    e.preventDefault();
  };

  // Fetch a fresh token from the backend, re-scheduling itself each window
  useEffect(() => {
    let interval;
    const fetchToken = async () => {
      try {
        const response = await fetch(
          `http://localhost:7071/api/tokens?id=${secretId}`
        );
        const data = await response.json();
        if (!response.ok) throw new Error(data.error);
        setToken((prevToken) => ({
          ...prevToken,
          code: data.token,
        }));
        setTimeRemaining(data.timeRemaining);
        const nextFetchDelay = data.timeRemaining * 1000 || 30000;
        interval = setTimeout(fetchToken, nextFetchDelay);
      } catch (err) {
        setError(err.message);
        interval = setTimeout(fetchToken, 30000);
      }
    };
    if (secretId) {
      fetchToken();
    }
    return () => clearTimeout(interval);
  }, [secretId]);

  if (!secretId) {
    return (
      <div className="w-[416px] max-w-full mx-auto bg-white rounded-xl shadow-md overflow-hidden">
        <div className="bg-[#0078D4] p-4 text-white flex items-center gap-2">
          <Shield className="mt-px" size={24} />
          <h2 className="text-xl font-semibold m-0">My Authenticator</h2>
        </div>
        <div className="p-6">
          <div
            className={`w-full p-10 border-2 border-dashed border-gray-300 rounded-lg text-center cursor-pointer transition-all duration-200 ${
              isPasting ? "bg-gray-100" : "bg-white"
            }`}
            onPaste={handlePaste}
            onDrop={handleDrop}
            onDragOver={handleDragOver}
            tabIndex={0}
          >
            <ImageIcon size={32} className="text-gray-600 mx-auto" />
            <p className="text-gray-600 mt-3 text-sm">
              {isPasting ? "Processing..." : "Paste or drop QR code here"}
            </p>
          </div>
          {error && <div className="text-red-600 text-sm mt-2">{error}</div>}
        </div>
      </div>
    );
  }

  return (
    <div className="w-[416px] max-w-full mx-auto bg-white rounded-xl shadow-md overflow-hidden">
      <div className="bg-[#0078D4] p-4 text-white flex items-center gap-2">
        <Shield className="mt-px" size={24} />
        <h2 className="text-xl font-semibold m-0">My Authenticator</h2>
      </div>
      <div className="flex items-center p-4 border-b">
        <div className="bg-gray-100 rounded-full w-10 h-10 flex items-center justify-center mr-4">
          <UserCircle size={24} className="text-gray-600" />
        </div>
        <div className="flex-1">
          <h3 className="text-base font-medium text-gray-800 m-0">
            {token?.issuer || "--"}
          </h3>
          <p className="text-sm text-gray-600 mt-1 m-0">
            {token?.accountName || "--"}
          </p>
        </div>
        <div className="text-right">
          <p className="text-2xl font-medium text-gray-800 m-0 mb-0.5">
            {token?.code || "--"}
          </p>
          <p className="text-xs text-gray-600 m-0">
            {localTimer || "--"} seconds
          </p>
        </div>
      </div>
      <div className="p-6">
        <div
          className={`w-full p-10 border-2 border-dashed border-gray-300 rounded-lg text-center cursor-pointer transition-all duration-200 ${
            isPasting ? "bg-gray-100" : "bg-white"
          }`}
          onPaste={handlePaste}
          onDrop={handleDrop}
          onDragOver={handleDragOver}
          tabIndex={0}
        >
          <ImageIcon size={32} className="text-gray-600 mx-auto" />
          <p className="text-gray-600 mt-3 text-sm">
            {isPasting ? "Processing..." : "Paste or drop QR code here"}
          </p>
        </div>
      </div>
      {error && <div className="text-red-600 text-sm mt-2">{error}</div>}
    </div>
  );
};

export default TOTPAuthenticator;
```

For deployment, I recommend Azure Static Web Apps because it offers built-in authentication, global CDN distribution, and seamless integration with our Azure Functions backend.

### Summary

In this blog, we've built a TOTP authenticator that demonstrates both the inner workings of two-factor authentication and modern cloud architecture. We've demystified how TOTP actually works - from the initial QR code scanning and secret sharing, to the time-based algorithm that generates synchronized 6-digit codes. By implementing this ourselves using Azure services like Azure Key Vault and Azure Functions, we've gained deep insights into both the security protocol and cloud-native development. While this implementation focuses on the core TOTP functionality, it serves as a foundation that you can build upon with features like authenticated multi-user support, backup codes, or audit logging.
Whether you're interested in authentication protocols, cloud architecture, or both, this project provides hands-on experience with real-world security implementations.

The complete source code for this project is available on my GitHub repository: https://github.com/stephendotgg/azure-totp-authenticator

Thanks for reading! Hopefully this has helped you understand TOTP and Azure services better.
## Building a Cryptographically Secure Product Licensing System on Azure Functions and Cosmos DB

Building a robust software licensing system requires careful consideration of cryptographic security, attack vectors, and implementation details. While many licensing systems rely on simple API calls or basic key validation, creating a more secure system demands a deeper understanding of cryptographic principles and secure communication patterns.

At its core, a secure licensing system must solve several fundamental challenges. How do we ensure that validation responses haven't been tampered with? How can we prevent replay attacks where valid responses are captured and reused? This article will present a robust solution to these challenges using cryptographic signatures, nonce validation, and secure key management.

However, it's important to acknowledge an uncomfortable truth: no licensing system is truly unbreakable. Since license validation code must ultimately run on untrusted machines, a determined attacker could modify the software to bypass these checks entirely. Our goal, therefore, is to implement security measures that make bypass attempts impractical and time-consuming, while providing a frictionless experience for legitimate users. We'll focus on preventing the most common attack vectors - response tampering and replay attacks - while keeping our implementation clean and maintainable.

### The security approach

Understanding the attack surface is crucial for building effective defenses. When a client application validates a license, it typically sends a request to a validation server and receives a response indicating whether the license is valid. Without proper security measures, attackers can exploit this process in several ways. They might intercept and modify the server's response, turning a "license invalid" message into "license valid." They could record a valid response and replay it later, bypassing the need for a real license key. Or they might reverse engineer the client's validation logic to understand and circumvent it.

Our solution addresses these vulnerabilities through multiple layers of security. At its foundation, we use RSA public-key cryptography to sign all server responses. The validation server holds the private key and signs each response with it, while the client applications contain only the public key. This means that while clients can verify that a response came from our server, they cannot generate valid responses themselves. Even if an attacker intercepts and modifies a response, the signature verification will fail.

To prevent replay attacks, we implement a nonce system. Each license check generates a unique random value (the nonce) that must be included in both the request and the signed response. The client verifies that the response contains the exact nonce it generated, ensuring that old responses cannot be reused. This effectively turns each license check into a unique cryptographic challenge that must be solved by the server.

### Architecture

Our licensing system is built around a secure API that handles license validation requests, with the architecture designed to support our security requirements. The system consists of three main components: the validation API, the license storage, and the client function.

- **Validation API:** Implemented as an Azure Function, providing a serverless endpoint that handles license verification requests. We chose Azure Functions for several reasons: they offer excellent security features like managed identities for accessing other Azure services, built-in HTTPS enforcement, and automatic scaling based on demand. The validation endpoint is stateless, with each request containing all necessary information for validation, making it highly reliable and easy to scale.
- **License storage:** We use Azure Cosmos DB, which provides several advantages for our use case. First, it offers strong consistency guarantees, ensuring that license status changes are immediately reflected across all regions. Second, it includes built-in encryption at rest and in transit, adding an additional security layer to our sensitive license data. Third, its flexible schema allows us to easily store different types of licenses and associated metadata.
- **Client function:** I'll be writing my client function in JavaScript, though it is easily replicable in whichever language is most relevant for your product. It handles nonce generation, request signing, and response verification, encapsulating these security details behind a clean, simple interface that application developers can easily integrate.

### Prerequisites

To follow along with this blog, you'll need the following:

- **Azure subscription:** You will need an active subscription to host the services we will use. Make sure you have appropriate permissions to create and manage resources. If you don't have one, you can sign up here: https://azure.microsoft.com/en-us/pricing/purchase-options/azure-account
- **Visual Studio Code:** For the development environment, install Visual Studio Code. Other IDEs are available, though we will be benefiting from the Azure Functions extension within this IDE. Download VS Code here: https://code.visualstudio.com/
- **VS Code Azure Functions extension (optional):** There are many different ways to deploy to Azure Functions, but having one-click deploy functionality inside our IDE is extremely useful. To install on VS Code, head to Extensions > search "Azure Functions" > click Install.

### Building the solution

#### Generating a public/private key pair

As outlined, we'll be using RSA cryptography as a security measure, so we'll need a public and private key pair. There are many ways to generate these, but the one I suggest does it all locally through the terminal, so there is no risk of exposing the private key. Run the following command in a local terminal:

```bash
ssh-keygen -t rsa -b 2048
```

Then follow the wizard through. It will save the files at C:\Users\your-user\.ssh. You should find an id_rsa file and an id_rsa.pub file. Both can be opened in Notepad or another text editor. The .pub file is the public key; the other is the private key. (One caveat worth noting: recent versions of ssh-keygen write private keys in the OpenSSH format by default, so if Node's crypto module rejects your key later, regenerate it with the -m PEM flag to get the PEM format it expects.)

It is extremely important to keep the private key absolutely confidential, while there is no risk associated with exposing the public key. If the private key were exposed, attackers could sign responses as if they were the server and circumvent license checks. Keep these keys handy, as we will need them for signing and verifying signatures in this system.

#### Deploying the resources

Before we dive into the implementation, we need to set up our Azure infrastructure. Our system requires two main components: an Azure Function to host our validation endpoint and a Cosmos DB instance to store our license data.

Let's start with Cosmos DB. From the Azure portal, create a new Cosmos DB account. Select the Azure Cosmos DB for NoSQL API - while Cosmos DB supports multiple APIs, this option provides the simplicity and flexibility we need for our license storage.
During creation, you can select the serverless pricing tier if you're expecting low traffic, or provisioned throughput for more predictable workloads. Make note of your endpoint URL and primary key - we'll need these later for our function configuration.

Once your Cosmos DB account is created, create a new database named licensing and a container named licenses. For the partition key, use /id, since we'll be using the license key itself as both the document ID and partition key. This setup ensures efficient lookups when validating licenses.

Next, let's deploy our Azure Function. Create a new Function App from the portal, selecting Node.js as your runtime stack and the newest version available. Choose the Consumption (Serverless) plan type - this way, you only pay for actual usage and get automatic scaling. Make sure you're using the same region as your Cosmos DB to minimize latency.

After the Function App is created, you'll need to configure your application settings. Navigate to the Configuration section and add the following settings:

- `COSMOS_ENDPOINT`: Your Cosmos DB endpoint URL
- `COSMOS_KEY`: Your Cosmos DB primary key
- `PRIVATE_KEY`: Your RSA private key for signing responses

You may also want to consider setting up Application Insights for monitoring. It's particularly useful for tracking license validation patterns and detecting potential abuse attempts. With these resources deployed, we're ready to implement our validation system. The next sections will cover the actual code implementation for both the server and client components.

#### Creating the validation API

As discussed, the validation API will be deployed as an Azure Function. While this use case only requires one endpoint, the benefit is that it is scalable, with the ability to add more endpoints as the scope widens. To create a Function project using the Azure Functions extension:

Go to the Azure tab > click the Azure Functions icon > click Create New Project > choose a folder > choose JavaScript > choose Model V4 > choose HTTP trigger > provide a name (e.g. 'validation') > click Open in new window.

This will have created and opened a brand new Function project with an HTTP trigger. This HTTP trigger will represent our validation endpoint and will be what the client calls to get a signed verdict on the license key provided by the user.

Now let's write the code for this validation endpoint. The base trigger provided by the setup process should look something like this:

```javascript
const { app } = require('@azure/functions');

app.http('validate', {
  methods: ['GET', 'POST'],
  authLevel: 'anonymous',
  handler: async (request, context) => {
    context.log(`Http function processed request for url "${request.url}"`);
    const name = request.query.get('name') || await request.text() || 'world';
    return { body: `Hello, ${name}!` };
  }
});
```

Let's make a few modifications to this base:

- Ensure that the only allowed HTTP method is POST, as there is no need to support both and we will make use of the request body allowed in POST requests.
- Clear everything inside the handler to make way for our upcoming code.
- Optional: the first parameter of the http function (mine is validate) represents the route that will be used. With the current setup, mine would be example.com/validate. If you want to change this path, now is the time, by adjusting that parameter.
Now, let's work forward from this adjusted base:

```javascript
const { app } = require('@azure/functions');

app.http('validate', {
  methods: ['POST'],
  authLevel: 'anonymous',
  handler: async (request, context) => {
  }
});
```

Our job now is to handle an incoming validation request. As outlined earlier, we must check the license key against the keys stored in our Cosmos DB database, sign the response with the private key, and then return it to the client.

First, let's define the format that we expect in the incoming body and give ourselves access to that data. This validation endpoint expects two data points: licenseKey and nonce. The following code pulls those two variables from the body and returns an error if they have not been provided:

```javascript
try {
  const body = await request.json();
  const { licenseKey, nonce } = body;

  if (!licenseKey) {
    return {
      status: 400,
      body: { error: 'Missing licenseKey' }
    };
  }

  if (!nonce) {
    return {
      status: 400,
      body: { error: 'Missing nonce' }
    };
  }

  ...
} catch (error) {
  return {
    status: 400,
    body: { error: 'Invalid JSON in request body' }
  };
}
```

Now that we know for sure a license key and nonce were provided, this is just a case of adding layers of checks. The first check to add is to ensure that the license key and nonce formats are valid. It makes sense to have a standardized format, such as a UUID, so we can reduce unnecessary database calls. I'll use the UUID format. It's important to remember the desired format so that we can (a) give frontend validation to the end user if they provide a wrongly formatted license key, and (b) ensure we generate a correctly formatted nonce.

To validate this, add the following regex as a const at the top of the class:

```javascript
const UUID_REGEX = /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;
```

Then add these two checks after our previous ones:

```javascript
if (!UUID_REGEX.test(licenseKey)) {
  return {
    status: 400,
    body: { error: 'Invalid license key format' }
  };
}

if (!UUID_REGEX.test(nonce)) {
  return {
    status: 400,
    body: { error: 'Invalid nonce format' }
  };
}
```

Now we are validating the format of both the license key and nonce using a UUID regular expression pattern. The code uses the test method to check if each string matches the expected UUID format (like "123e4567-e89b-4d3c-8456-426614174000"). If either the license key or nonce doesn't match this pattern, the function returns a 400 Bad Request response with a specific error message indicating which field had the invalid format. This validation ensures that we only process requests where both the license key and nonce are properly formatted UUIDs.

Next is the most important check: validating the license. As explained, I'm choosing to use Azure Cosmos DB for this, though this part may vary depending on your chosen data store. Before we add any code, let's first conceptualize a basic schema for our data. Something like this:

```json
{
  "type": "object",
  "required": ["id", "redeemed"],
  "properties": {
    "id": {
      "type": "string",
      "pattern": "^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$"
    },
    "redeemed": {
      "type": "boolean",
      "default": false
    }
  }
}
```

Cosmos DB is a great solution for this because we can easily add to the schema as the scope changes. For example, some potential ways to extend this functionality are:

- An expiry date
- A set of all activations, enforcing a cap
- Data to identify the license holder once redeemed
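For reference, a freshly issued, unredeemed license document matching this schema would look something like this (the key below is just an illustrative placeholder):

```json
{
  "id": "123e4567-e89b-4d3c-8456-426614174000",
  "redeemed": false
}
```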
Now that we know what data to search, let's add the code. First, we need to install the required dependency so we can use Cosmos DB:

```bash
npm install @azure/cosmos
```

Next, create a local.settings.json file if it doesn't already exist. Then, add two environment variables to the Values section, COSMOS_ENDPOINT and COSMOS_KEY:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "COSMOS_ENDPOINT": "X",
    "COSMOS_KEY": "X"
  }
}
```

Be sure to populate those variables with the actual data from your deployed Cosmos DB resource. We'll be using these in a moment to programmatically connect to it.

Add this import to the top of the function class:

```javascript
const { CosmosClient } = require('@azure/cosmos');
```

Now, let's add the code to connect to our Cosmos DB resource and read the data to validate the license:

```javascript
const cosmosClient = new CosmosClient({
  endpoint: process.env.COSMOS_ENDPOINT,
  key: process.env.COSMOS_KEY
});
const database = cosmosClient.database('licensing');
const container = database.container('licenses');

try {
  const { resource: license } = await container.item(licenseKey, licenseKey).read();

  if (!license) {
    return {
      status: 404,
      body: { error: 'License not found' }
    };
  }

  ...
} catch (dbError) {
  if (dbError.code === 404) {
    return {
      status: 404,
      body: { error: 'License not found' }
    };
  }
  return {
    status: 500,
    body: { error: 'Error validating license' }
  };
}
```

The first part of the code creates our connection to Cosmos DB. We use the CosmosClient class from the SDK, providing it with our database endpoint and key from environment variables. Then we specify that we want to use the licensing database and the licenses container within it, where our license documents are stored.

The core validation logic is surprisingly simple. We use the container's item method to look up the license directly using the provided license key. We use this key as both the item ID and partition key, which makes our lookups very efficient - it's like looking up a word in a dictionary rather than reading through every page. If no license is found, we return a 404 status code with a clear error message.

We've also implemented proper error handling that distinguishes between a license not existing (404) and other potential database errors (500). This gives clients clear feedback about what went wrong while keeping our error messages secure - we don't expose internal system details that could be useful to attackers. Also, at this point we don't need to do any cryptographic signing, because there is no benefit to an attacker in replaying a rejected validation.

Now, after confirming a license exists in our database, but before returning it to the client, we need to sign our response to prevent tampering. This is crucial for security - it ensures that responses can't be modified or forged, as they need a valid signature that can only be created with our private key and verified with our public key.

The signing process uses Node's built-in crypto module for RSA signing. First, we load our private key from environment variables. Then, for any valid license response, we create a SHA-256 signature of the JSON data, which we provide in the HTTP response.
Add this import to the top of the file:

const crypto = require('crypto');

Then, add your previously generated private key to our environment variables:

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "",
        "FUNCTIONS_WORKER_RUNTIME": "node",
        "COSMOS_ENDPOINT": "X",
        "COSMOS_KEY": "X",
        "PRIVATE_KEY": "X"
    }
}

Then, after our Cosmos DB lookup, add the following code to sign a valid response:

const respData = {
    valid: !license.redeemed,
    nonce,
    licenseKey
};

const sign = crypto.createSign('SHA256');
const dataString = JSON.stringify(respData);
sign.update(dataString);

return {
    jsonBody: {
        ...respData,
        signature: sign.sign(process.env.PRIVATE_KEY, 'base64')
    }
};

Now, every valid response includes both the license data and a cryptographic signature of that data. The client can then verify this signature using our public key to ensure the response hasn't been tampered with.

Finally, now that respData sets valid to the inverse of license.redeemed, let's make sure we also set redeemed to true in the database, so that the key cannot be used for future requests. Between signing the JSON and returning the data, add the following code:

await container.item(licenseKey, licenseKey).patch([
    { op: 'replace', path: '/redeemed', value: true }
]);

This marks the license as redeemed upon first use, ensuring it can only be validated once.

And that's it for the validation endpoint! We've built a secure HTTP endpoint that validates license keys through a series of steps. It first checks that the incoming request contains both a license key and nonce in the correct UUID format. Then it looks up the license in our Cosmos DB to verify its existence and redemption status. Finally, for valid licenses, it returns a cryptographically signed response that includes the license status and the original nonce. The signature ensures our responses can't be tampered with, while the nonce prevents replay attacks. Our error handling provides clear feedback without exposing sensitive system details, making the endpoint both secure and developer-friendly.
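One optional hardening worth noting before the full listing (it is not included in the final code below): two requests racing on the same key could both read redeemed: false before either patch lands. A hedged sketch of closing that window with an ETag precondition - assuming the read response exposes an etag and that the SDK surfaces a failed precondition as error code 412, as it does 404 above:

// Hypothetical variant of the lookup + patch with optimistic concurrency.
const { resource: license, etag } = await container.item(licenseKey, licenseKey).read();

// ... existence check and signing as before ...

try {
    await container.item(licenseKey, licenseKey).patch(
        [{ op: 'replace', path: '/redeemed', value: true }],
        { accessCondition: { type: 'IfMatch', condition: etag } }
    );
} catch (err) {
    if (err.code === 412) {
        // Precondition failed: a concurrent request redeemed the key first.
        return { status: 409, jsonBody: { error: 'License already redeemed' } };
    }
    throw err;
}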
The final code for this endpoint should look like this:

const { app } = require('@azure/functions');
const { CosmosClient } = require('@azure/cosmos');
const crypto = require('crypto');

const UUID_REGEX = /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;

app.http('validate', {
    methods: ['POST'],
    authLevel: 'anonymous',
    handler: async (request, context) => {
        try {
            const body = await request.json();
            const { licenseKey, nonce } = body;

            if (!licenseKey) {
                return { status: 400, jsonBody: { error: 'Missing licenseKey' } };
            }

            if (!nonce) {
                return { status: 400, jsonBody: { error: 'Missing nonce' } };
            }

            if (!UUID_REGEX.test(licenseKey)) {
                return { status: 400, jsonBody: { error: 'Invalid license key format' } };
            }

            if (!UUID_REGEX.test(nonce)) {
                return { status: 400, jsonBody: { error: 'Invalid nonce format' } };
            }

            const cosmosClient = new CosmosClient({
                endpoint: process.env.COSMOS_ENDPOINT,
                key: process.env.COSMOS_KEY
            });
            const database = cosmosClient.database('licensing');
            const container = database.container('licenses');

            try {
                const { resource: license } = await container.item(licenseKey, licenseKey).read();

                if (!license) {
                    return { status: 404, jsonBody: { error: 'License not found' } };
                }

                const respData = {
                    valid: !license.redeemed,
                    nonce,
                    licenseKey
                };

                const sign = crypto.createSign('SHA256');
                const dataString = JSON.stringify(respData);
                sign.update(dataString);

                await container.item(licenseKey, licenseKey).patch([
                    { op: 'replace', path: '/redeemed', value: true }
                ]);

                return {
                    jsonBody: {
                        ...respData,
                        signature: sign.sign(process.env.PRIVATE_KEY, 'base64')
                    }
                };
            } catch (dbError) {
                if (dbError.code === 404) {
                    return { status: 404, jsonBody: { error: 'License not found' } };
                }
                return { status: 500, jsonBody: { error: 'Error validating license' } };
            }
        } catch (error) {
            return { status: 400, jsonBody: { error: 'Invalid JSON in request body' } };
        }
    }
});

Adding the client function

All of this server-side validation is pointless unless we properly verify responses on the client side. Our signed responses and nonce system are security features that only work if we actually validate them. Imagine if someone intercepted the server response and modified it to always return valid: true - without verification, our client would happily accept this fraudulent response. Similarly, without nonce checking, someone could capture a valid response and replay it later, effectively reusing a one-time license multiple times.

Two critical checks need to happen when we get a response from our validation endpoint:

1. We need to verify that the response's signature is valid using our public key. This ensures the response actually came from our server and wasn't tampered with in transit. Even the smallest modification to the response data would cause the signature verification to fail.
2. We must confirm that the nonce in the response matches the one we generated for this specific request. The nonce is like a unique ticket - it should only be valid for one specific validation attempt. This prevents replay attacks where someone could capture a valid response and reuse it later.

As outlined earlier, this client check could be written in any programming language, in any environment that allows an HTTP request. For consistency, I'll do mine in JavaScript, though there would be no issues in porting this over to Java, Kotlin, C++, C#, Go, etc. This function would then be integrated into your product and triggered when the user provides a license key that needs validating.
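One practical detail before we write the function: the client needs the public key available at runtime. The code below reads it from process.env.PUBLIC_KEY; for a shipped product, bundling a PEM file alongside the application is a common alternative. Here is a small, hypothetical loader covering both - the public.pem file name and location are assumptions:

// Hypothetical public key loader. Prefers PUBLIC_KEY from the environment
// (which the validation function below uses), falling back to a PEM file
// bundled next to the application.
const fs = require('fs');
const path = require('path');

const PUBLIC_KEY = process.env.PUBLIC_KEY
    || fs.readFileSync(path.join(__dirname, 'public.pem'), 'utf8');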
Let's start with a base function, with crypto imported ready for the signature verification and the licenseKey parameter provided by the user:

const crypto = require('crypto');

async function validateLicense(licenseKey) {
}

Before we make our request to the validation server, we need to generate a unique nonce that we'll use to prevent replay attacks. Using the crypto module we just imported, we can generate a UUID for this purpose:

const nonce = crypto.randomUUID();

Now we can make our request to the validation endpoint:

try {
    const response = await fetch('https://your-function-url/validate', {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
        },
        body: JSON.stringify({ licenseKey, nonce })
    });

    if (!response.ok) {
        const errorData = await response.json();
        throw new Error(errorData.error || 'License validation failed');
    }

    ...
} catch (error) {
    console.error('License validation failed:', error.message);
    return false;
}

With our nonce generated, we make our HTTP request to the validation server. We send a POST request with our license key and nonce in the request body, making sure to set the Content-Type header to indicate we're sending JSON data.

The response handling is thorough but straightforward: if we get anything other than a successful response (indicated by response.ok being false), we attempt to parse the error message from the response body and throw an error. For successful responses, we parse the JSON body with const data = await response.json(); this gives us the validation result, along with the nonce and cryptographic signature we'll need to verify. This cleanly handles both successful validations and the various error cases (like invalid license keys or server errors) while ensuring we have proper data to perform our security checks.

Now, with data in hand, let's check the nonce:

if (data.nonce !== nonce) {
    return false;
}

This simple but crucial comparison verifies that the response we received was actually generated for our specific request. By checking data.nonce !== nonce, we ensure the nonce returned by the server exactly matches the one we generated. If there's any mismatch, we return false immediately - we don't even bother checking the signature or license status, because a nonce mismatch indicates either a replay attack (someone trying to reuse an old valid response) or response tampering. Think of it like a ticket number at a deli counter - if you hand over ticket #45 but they call out #46, something's wrong and you need to stop right there.

Now we perform our second and most crucial security check: cryptographic signature verification. Using Node's crypto module, we create a SHA-256 verifier and reconstruct the exact same data structure that the server signed - the validation result, nonce, and license key, in that specific order. We verify this data against the signature provided in the response using our public key. The server signed this data with its private key, and only the matching public key can verify it - any tampering with the response data would cause this verification to fail. If the signature is valid, we've confirmed two things: the response definitely came from our validation server (not an impersonator) and the data hasn't been modified in transit. Only then do we trust and return the validation result.
Let's add the signature check after the nonce check:

const verifier = crypto.createVerify('SHA256');
verifier.update(JSON.stringify({
    valid: data.valid,
    nonce: data.nonce,
    licenseKey: data.licenseKey
}));

const isSignatureValid = verifier.verify(
    process.env.PUBLIC_KEY,
    data.signature,
    'base64'
);

if (!isSignatureValid) {
    return false;
}

return data.valid;

The final code for this function should look like this:

const crypto = require('crypto');

async function validateLicense(licenseKey) {
    const nonce = crypto.randomUUID();

    try {
        const response = await fetch('https://your-function-url/validate', {
            method: 'POST',
            headers: {
                'Content-Type': 'application/json',
            },
            body: JSON.stringify({ licenseKey, nonce })
        });

        if (!response.ok) {
            const errorData = await response.json();
            throw new Error(errorData.error || 'License validation failed');
        }

        const data = await response.json();

        if (data.nonce !== nonce) {
            return false;
        }

        const verifier = crypto.createVerify('SHA256');
        verifier.update(JSON.stringify({
            valid: data.valid,
            nonce: data.nonce,
            licenseKey: data.licenseKey
        }));

        const isSignatureValid = verifier.verify(
            process.env.PUBLIC_KEY,
            data.signature,
            'base64'
        );

        if (!isSignatureValid) {
            return false;
        }

        return data.valid;
    } catch (error) {
        console.error('License validation failed:', error.message);
        return false;
    }
}

With our validation function complete, we can now reliably check license keys from any part of our application. The function is designed to be straightforward to use while handling all the security checks under the hood - just pass in a license key and await the result. It returns a simple boolean: true if the license is valid and verified, false for any kind of failure (invalid license, network errors, security check failures). Here's how you might use it:

const isValid = await validateLicense('123e4567-e89b-4d3c-8456-426614174000');

if (isValid) {
    // License is valid - enable features, start your app, etc.
} else {
    // License is invalid - show error, disable features, etc.
}

Testing

Let's run a couple of tests against this function. First, a valid license key that I've inserted into the database. The document looks like this:

{
    "id": "123e4567-e89b-4d3c-8456-426614174000",
    "redeemed": false
}

Here's the response:

{
    "valid": true,
    "nonce": "987fcdeb-51a2-4321-9b54-326614174000",
    "licenseKey": "123e4567-e89b-4d3c-8456-426614174000",
    "signature": "XkxZWR0eHpKTFE3VVhZT3JUEtYeXNJWUZ6SWJoTUtmMX0JhVXBHK1ZzN2lZYzdGSnJ6SEJa1VOCnBrWkhDU0xGS1ZFTDmFZN1BUeUlHRzl1V0tJPT0="
}

Running the function returns true, which means the response also passed the nonce and signature validation. I then tested with a license key that does not exist in the database, 0bd84e7b-d91e-47e8-81b8-f39a5c1f8c72, and the function returned false as expected.
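If you'd rather script those checks than run them by hand, here is a hypothetical smoke test built on validateLicense. The seeded key is a placeholder for a real, unredeemed document in your database; the second call should come back false because the first call marks the key as redeemed:

const crypto = require('crypto');
// Assumes validateLicense from above, and that seededKey exists in the
// 'licenses' container with redeemed: false (e.g. via the seeding sketch).

async function smokeTest() {
    const seededKey = '123e4567-e89b-4d3c-8456-426614174000'; // placeholder

    console.log('first use: ', await validateLicense(seededKey));           // expect true
    console.log('second use:', await validateLicense(seededKey));           // expect false (now redeemed)
    console.log('unknown:   ', await validateLicense(crypto.randomUUID())); // expect false (404)
}

smokeTest();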
Summary

Building a secure license validation system is a delicate balance between security and usability. The implementation we've created provides strong security guarantees through cryptographic signatures and nonce validation, while remaining straightforward to integrate into any application.

Let's recap the key security features we've implemented:

- UUID-based license keys and nonces for standardized formatting
- Server-side validation using Azure Cosmos DB for efficient license lookups
- Response signing using RSA public-key cryptography
- Nonce verification to prevent replay attacks
- Comprehensive error handling with secure error messages

While this system provides robust protection against common attack vectors like response tampering and replay attacks, it's important to remember that no licensing system is completely unbreakable. Since validation code must run on untrusted machines, a determined attacker could potentially modify the software to bypass these checks entirely. Our goal is to make bypass attempts impractical and time-consuming while providing a frictionless experience for legitimate users.

The architecture we've chosen - using Azure Functions and Cosmos DB - gives us plenty of room to grow. Some potential enhancements could include:

- Rate limiting to prevent brute force attempts
- IP-based activation limits
- License expiration dates
- Feature-based licensing tiers
- Usage analytics and reporting

The modular nature of our implementation means adding these features would require minimal changes to the core validation logic (the expiry sketch at the end of this article is one example). Our schema-less database choice means we can evolve our license data structure without disrupting existing functionality.

Remember to store your private key securely and never expose it in client-side code. The public key used for verification can be distributed with your client applications, but the private key should be treated as a critical secret and stored securely in your Azure configuration.

This concludes our journey through building a secure license validation system. While there's always room for additional security measures and features, this implementation provides a solid foundation that can be built upon as your needs evolve.
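As a parting illustration of that extensibility, here is roughly what a license-expiry check might look like inside the handler. The expiresAt field is an assumed addition to the schema - an ISO 8601 timestamp on the license document - not something the current documents carry:

// Hypothetical: inside the handler, after the license has been read.
// 'expiresAt' is an assumed ISO 8601 field on the license document.
if (license.expiresAt && new Date(license.expiresAt) < new Date()) {
    return { status: 403, jsonBody: { error: 'License expired' } };
}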