
[Part 2] - Practice Azure services - Enhance security by using Key Vault and App Configuration

 Scenario

In the previous post, we used a shared key to establish a connection to an Azure storage account (including Table Storage and Blob Storage) and a SQL Database. This method can lead to security vulnerabilities. To enhance the security of our application, we can leverage App Configuration along with Azure Key Vault. This post will illustrate how to integrate these services into our current application.

Let's get started.

Overview architecture


The app functions similarly to what was described in the earlier post. The difference is that we now save the connection strings (shared keys) for the storage account and SQL Database in Key Vault, and the configuration is managed in App Configuration instead of relying on the environment variables of Azure App Service. When the API starts up, it loads the configuration from an additional configuration provider, App Configuration, with the secret values sourced from Key Vault.

Set up Azure services

If you're unsure how to set up Azure App Configuration and Azure Key Vault, you can refer to these Microsoft articles:

Quickstart: Create an Azure App Configuration store | Microsoft Learn

Tutorial for using Azure App Configuration Key Vault references in an ASP.NET Core app | Microsoft Learn

You need to have all of the information below before proceeding to the next step.

Key Vault and new secrets


App Configuration


Activate Managed Identity for the API



Note the object (principal) ID; you will use it in the next step.

Grant your API (and your local dev identity) access to Key Vault

Azure role-based access control: assign the "Key Vault Secrets User" (or "Key Vault Administrator") role


Modify code

You can use the code sample from the previous post.

Add one more extension class that registers Azure App Configuration with the provided configuration builder (a sketch is shown below).
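Since the screenshot is not reproduced here, the following is a minimal sketch of such an extension class; the method name and wiring are illustrative, not necessarily the sample's exact code. It assumes the Microsoft.Extensions.Configuration.AzureAppConfiguration and Azure.Identity NuGet packages.

using Azure.Identity;
using Microsoft.Extensions.Configuration;

public static class AppConfigurationExtensions
{
    // Adds Azure App Configuration to the provided configuration builder and
    // resolves its Key Vault references with the app's identity
    // (managed identity on Azure, your developer account locally).
    public static IConfigurationBuilder AddAppConfiguration(
        this IConfigurationBuilder builder, string appConfigConnectionString)
    {
        return builder.AddAzureAppConfiguration(options =>
        {
            options.Connect(appConfigConnectionString)
                   .ConfigureKeyVault(kv => kv.SetCredential(new DefaultAzureCredential()));
        });
    }
}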


Modify "Program.cs" file


Run locally

Change the value of the config key "appConfig:ConnectionString".
Run the API from Visual Studio. If it runs successfully, deploy it to Azure App Service.

Deploy new changes to Azure App Service

Check API

Environment variables of App Service


API Swagger - OK: the app can interact with App Configuration and Key Vault



Over to you: when should you use App Configuration and Key Vault, and what are their pros and cons?


[Part 1] - Practice Azure services - Build a small storage management app - Upload document feature

This series aims to showcase the integration of Azure services. It consists of several sections.

In this initial part, we concentrate on creating a document upload feature utilizing various Azure services. For the first post, we will employ these services at a fundamental level.

Scenario

The document management feature is required to save various documents (such as import documents, export documents, etc.).

Prerequisites

- An Azure subscription.

- Visual Studio or Visual Studio Code.

- A basic understanding of the Azure services referenced in this article and the ability to create them (using the Azure portal or Bicep, for example) is necessary.

Tech stacks

- ASP.NET Core 8

- Vue.js 3, PrimeVue

How we use Azure services for this functionality

+ Azure Blob Storage: for uploading document files.

+ Azure Table Storage: to store the document types.

+ Azure SQL Database: to save the document information, such as document type, document URL...

+ Azure App Service: to host the storage API.

+ Static Web App: to serve the Vue single-page application (SPA).

To authenticate with SQL Database and the storage account, we will use connection strings (shared keys) for these services.

Overview architecture


1 - The user accesses the document management page; the document list is loaded from the SQL database, and the user wants to upload a new document.

2, 3 - The SPA obtains a SAS (shared access signature) token via the API.

4 - The SPA uses the SAS token to upload the document to Blob Storage.

5, 6 - The SPA saves the document information to the database via the API.
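As a rough illustration of steps 2-4 (the class, method, container, and parameter names are hypothetical, not the sample's exact code), the API can issue a short-lived SAS because it holds the storage account's shared key:

using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public static class SasTokenHelper
{
    // Issues a short-lived, write-only SAS URI for a single blob so the SPA
    // can upload the file directly to Blob Storage.
    public static Uri CreateUploadSasUri(
        string storageConnectionString, string containerName, string blobName)
    {
        var containerClient = new BlobContainerClient(storageConnectionString, containerName);
        var blobClient = containerClient.GetBlobClient(blobName);

        // GenerateSasUri works here because the client was created from the
        // account connection string (shared key).
        return blobClient.GenerateSasUri(
            BlobSasPermissions.Create | BlobSasPermissions.Write,
            DateTimeOffset.UtcNow.AddMinutes(15));
    }
}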


Code Example

You can use the upload doc sample or create your own API and front-end using your preferred tech stack.
This sample includes infrastructure code (not complete) and app code (the Vue app and the API).

Set up Azure services

If you're knowledgeable about Bicep or interested in learning it, you can use the Bicep code in the "infra" folder to deploy the Azure services. Of course, there are a few aspects you will need to modify or add. :)

The required Azure services (a sample is shown below; the service names are up to you)


Run locally

Storage API
1. Open the project in the folder "upload-doc\app\api"
2. Add configuration for Table Storage, Blob Storage, and SQL Database (a sketch of how the API might consume these settings is shown after this list)


3. Run code from your IDE.
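A sketch of how the API might consume these settings; the configuration key names are illustrative, so match them to the sample's appsettings.json or user secrets rather than copying them verbatim.

// Program.cs (sketch) -- wire up the storage and database clients from configuration.
using Azure.Data.Tables;
using Azure.Storage.Blobs;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddSingleton(new BlobServiceClient(
    builder.Configuration["Storage:BlobConnectionString"]));

builder.Services.AddSingleton(new TableServiceClient(
    builder.Configuration["Storage:TableConnectionString"]));

// The SQL Database connection string, consumed by EF Core, Dapper, etc.
var sqlConnectionString = builder.Configuration.GetConnectionString("SqlDatabase");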

Vue App
1. Open the Vue project: ".\upload-doc\app\webapp"
2. Change the API base URL in the "HttpService.ts" file


3. Run: npm install
4. Run: npm run dev

Deploy to Azure services

1. Deploy the API to the Azure App Service created in the previous step.
2. Check the API



3. Deploy the Vue app to the Static Web App created in the previous step.
4. Check the application



By now, you know how to integrate Azure Storage, Azure App Service, Azure SQL Database, and Static Web Apps for your own purposes.

Over to you
When should we use these services?

Azure Functions Fundamentals

This post is an introduction to Azure Functions.

What is Azure Functions?

Azure Functions lets us run pieces of code without worrying about the infrastructure; the cloud platform provides the infrastructure and scales the application for us.

Key Features:

  • Serverless: No need to manage or provision infrastructure. Azure takes care of everything for you.
  • Event-driven: Functions can be triggered by various events, such as file uploads, database changes, or HTTP requests.
  • Pay-per-use: You only pay for the resources consumed during function execution. This makes it a cost-effective option for lightweight, event-driven applications.

Azure Function Pricing Plans

Azure Functions operates on a pay-per-use pricing model, offering flexibility based on how your app scales. There are three main hosting plans:

  • Consumption Plan: The default option, where you are billed based on the resources consumed during function execution. This plan automatically scales based on demand.
  • Premium Plan: Adds additional features like VNET Integration and more control over scaling, while still following the pay-per-use model.
  • Dedicated (App Service) Plan: Allows you to run Azure Functions on dedicated VMs with more predictable scaling and resource allocation.

You can choose a plan based on your application's performance needs and budget. Check out Azure Functions Scaling documentation for more details.

Function Scenarios

Here are some common scenarios where Azure Functions excel:

  1. Process File Uploads: Automatically trigger actions such as file validation or transformation when files are uploaded to storage.
  2. Run Scheduled Tasks: Create scheduled functions to automate recurring tasks such as sending emails or database maintenance.
  3. Build a Scalable Web API: Easily build REST APIs that scale automatically based on traffic.
  4. Create a Reliable Messaging System: Azure Functions can integrate with Azure Service Bus or Kafka to process message queues and build asynchronous message-driven systems.

For more examples and use cases, visit Azure Functions Scenarios.

Triggers and Bindings

Azure Functions work based on triggers and bindings that define how a function is activated and how it interacts with other Azure services.

Triggers

A trigger is what causes your function to execute. Some common trigger types include:

  • HTTP requests
  • File uploads to Azure Blob Storage
  • Messages on Azure Service Bus queues or topics
  • Timer triggers for scheduled jobs

Bindings

Bindings provide a way to input and output data to and from your function. There are two types of bindings:

  • Input Binding: Provides data to your function from a source like a database or a file.
  • Output Binding: Sends data to a destination, such as a database or messaging service.

Azure Functions support various types of bindings such as for Azure Blob Storage, Service Bus, Event Hubs, and more.


Dependency Injection in Azure Functions

In a real-world application, your functions may depend on services like databases, caches, or APIs. Azure Functions support dependency injection (DI), which allows you to inject these services into your functions. This is especially useful when your function needs to manage resources or interact with external services.

In Azure Functions, you can configure DI through the IFunctionsHostBuilder interface. There are three service lifetimes you can configure for dependency injection:

  • Transient: A new instance is created each time the service is resolved.
  • Scoped: A new instance is created per function execution.
  • Singleton: A single instance is used for the lifetime of the host.
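A minimal sketch for the in-process model using FunctionsStartup and IFunctionsHostBuilder; the service names are placeholders, not part of any real project.

using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(MyFunctionApp.Startup))]

namespace MyFunctionApp
{
    // Placeholder service used only to illustrate registration; substitute your own.
    public interface IGreetingService { string Greet(string name); }
    public class GreetingService : IGreetingService
    {
        public string Greet(string name) => $"Hello, {name}!";
    }

    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            // Singleton: one instance for the lifetime of the host.
            builder.Services.AddSingleton<IGreetingService, GreetingService>();
            // Scoped: one instance per function execution.
            // builder.Services.AddScoped<IGreetingService, GreetingService>();
            // Transient: a new instance each time the service is resolved.
            // builder.Services.AddTransient<IGreetingService, GreetingService>();
        }
    }
}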

Best Practices for Azure Functions

To make the most of Azure Functions, here are some best practices you should follow:

  1. Choose the Correct Hosting Plan: The hosting plan affects how your app is scaled, how instances are allocated, and what resources are available. Ensure that you choose the appropriate plan for your app's size and functionality.
  2. Configure Storage Correctly: Azure Functions rely on Azure Storage for managing triggers and logging. Ensure that you have sufficient storage and that it’s configured properly.
  3. Organize Your Functions: Functions are deployed together and scale together. It’s essential to keep your functions organized to ensure they work well together.
  4. Write Robust Code: Ensure your functions are stateless, avoid long-running operations, and handle errors gracefully.
  5. Manage Connections Efficiently: Properly manage connections, whether it's with an HTTP client, Azure client, or SQL client to ensure optimal performance and resource usage.
  6. Error Handling and Retry: Azure Functions provides built-in retry policies, including fixed delay and exponential backoff strategies, to handle transient errors effectively.
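For illustration, here is a minimal sketch of a function-level retry policy using the isolated worker model; the function name, schedule, and retry values are only examples.

using System;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

public class CleanupFunction
{
    private readonly ILogger<CleanupFunction> _logger;

    public CleanupFunction(ILogger<CleanupFunction> logger) => _logger = logger;

    // Retry up to 5 times, starting at 4 seconds and backing off exponentially
    // up to 15 minutes between attempts.
    [Function("NightlyCleanup")]
    [ExponentialBackoffRetry(5, "00:00:04", "00:15:00")]
    public void Run([TimerTrigger("0 0 2 * * *")] TimerInfo timer)
    {
        _logger.LogInformation("Cleanup started at {Time}", DateTime.UtcNow);
        // ...do the work; an unhandled exception here triggers the retry policy.
    }
}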

Developing with .NET

Azure Functions supports .NET, and you can choose between the in-process model and the isolated worker process model.

  • In-Process Model: The function executes within the same process as the Azure Functions runtime, allowing for faster execution and easier debugging.
  • Isolated Worker Process: A more flexible model that runs the function in a separate process, providing better isolation and allowing the use of features like dependency injection in a .NET standard way.

For more details on developing with .NET, check out the following resources:

Example: C# Isolated Worker Process Model

Check it out here: C# Isolated Worker Functions.
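A minimal sketch of the isolated worker model: a Program.cs that boots the worker host plus one HTTP-triggered function (names and the response text are illustrative).

// Program.cs -- boots the isolated worker host.
using Microsoft.Extensions.Hosting;

var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()   // registers the isolated worker runtime
    .Build();

host.Run();

// HelloFunction.cs -- a simple HTTP-triggered function.
using System.Net;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;

public class HelloFunction
{
    [Function("Hello")]
    public HttpResponseData Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestData req)
    {
        var response = req.CreateResponse(HttpStatusCode.OK);
        response.WriteString("Hello from the isolated worker model!");
        return response;
    }
}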

If you prefer a simpler version that only highlights the key information, click here


Azure App Service Fundamentals

This post is an introduction to Azure App Service.

So, let's get started.

When it comes to hosting your web application, REST API, or mobile backend, Azure App Service offers an incredibly versatile platform-as-a-service (PaaS) solution. Whether you are a small startup or a large enterprise, Azure App Service provides a range of features to support your web apps, APIs, and mobile applications with minimal overhead. In this blog post, we'll dive into what Azure App Service is, its key components, and how to make the most out of its features.

What is Azure App Service?

Azure App Service is a fully managed hosting platform that enables developers to build, deploy, and scale web apps, APIs, and mobile backends. It supports a wide range of programming languages such as .NET, .NET Core, Java, Node.js, PHP, and Python. You can host your applications on either Linux or Windows operating systems, providing flexibility depending on your tech stack.

Why Choose Azure App Service? As a Platform-as-a-Service (PaaS), Azure App Service abstracts away the infrastructure management and lets you focus on building and deploying your apps. The platform automatically handles patching, scaling, and monitoring, which saves you time and resources while ensuring that your app stays available and performs well under load.

Supported Languages & OS:

  • Languages: .NET, .NET Core, Java, Node.js, PHP, Python
  • Operating Systems: Linux, Windows

What is an App Service Plan?

An App Service Plan (ASP) defines a set of compute resources for running your web apps. It outlines the operating system, region, number of virtual machine instances, size of VM instances, and pricing tier for your app. Each ASP helps determine the infrastructure and pricing structure for your app.

Key Components of an App Service Plan:

  • Operating System: Choose between Windows or Linux
  • Region: Select the region that best serves your audience (e.g., West US, East US)
  • VM Instances: Define how many instances (virtual machines) you need
  • VM Size: Choose between Small, Medium, or Large VM sizes
  • Pricing Tiers: Azure offers different pricing tiers:
    • Free/Shared: Basic hosting, ideal for testing or development.
    • Basic/Standard/Premium: For production workloads with varying levels of compute power.
    • PremiumV2/PremiumV3: Advanced features and performance for more demanding apps.
    • Isolated/IsolatedV2: Secure environments with isolated networks for enterprise-grade applications.

Scaling Your App

Scaling is a fundamental part of managing web apps on Azure App Service. When your app is hosted on an ASP, scaling can be done in two ways:

  1. Scale Up: Increase the size of the virtual machine (VM), which adds more CPU and memory.
  2. Scale Out: Increase the number of VM instances, which spreads the load and enhances performance during peak times.

Auto-scaling

Azure offers automatic scaling that helps adjust resources based on your app’s demand. For instance, when traffic spikes, Azure can automatically increase the number of instances, and when traffic drops, it will scale down. This option ensures optimal performance without the hassle of manual intervention.

You can also define scaling rules for your app to trigger scaling actions based on specific metrics, such as CPU usage or memory consumption.

Deployment Best Practices

Azure provides various deployment strategies, including Blue/Green Deployment via Deployment Slots. Using deployment slots, you can test new versions of your app in a non-production environment before swapping it with the live version. This technique helps ensure that there is no downtime during updates.

Key Deployment Practices:

  • Use Deployment Slots: Run your app in staging and swap it to production once tested.
  • Run from Package: For fast deployments and easy rollback, using a package-based deployment offers many benefits.
  • Diagnostic Logs: Set up logs to track any errors or issues with your deployments, enabling faster troubleshooting.

Configuration

Azure App Service provides flexible configuration options for setting up your app’s environment. You can configure common app settings such as environment variables, connection strings, and more through the Azure Portal or Azure CLI.
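As a small illustration (the setting names are hypothetical), App Service app settings are exposed to the app as environment variables, so the default ASP.NET Core configuration providers pick them up without extra code:

// Sketch: reading App Service configuration in ASP.NET Core.
var builder = WebApplication.CreateBuilder(args);

// An app setting named "MyFeature__ApiUrl" in the portal (double underscore as the
// hierarchy separator) is read like any other configuration value:
string? apiUrl = builder.Configuration["MyFeature:ApiUrl"];

// A connection string added in the portal's "Connection strings" section typically
// surfaces under the ConnectionStrings configuration section:
string? dbConnection = builder.Configuration.GetConnectionString("MyDatabase");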


Monitoring Your App

Effective monitoring is crucial for understanding your app’s performance and availability. Azure provides various tools to monitor your app's health:

  • Metrics: Track performance metrics like CPU, memory usage, response time, etc.
  • Resource Logs: Access detailed logs for in-depth troubleshooting.
  • Application Insights: Monitor your app’s availability, performance, and usage with Azure Monitor.

You can use Diagnostic Settings to route your logs to Azure Monitor Logs or Blob Storage for further analysis. These metrics and logs will help you proactively address issues and optimize your application for the best performance.


Networking Features

Azure App Service offers several networking features to ensure your app remains secure and performs well:

  • Virtual Network Integration: Allows your app to securely access resources in your private network.
  • Access Restrictions: Set up rules to restrict who can access your app.
  • Private Endpoints: Securely connect to your app through private IPs.
  • Azure Firewall: Control outbound traffic from your app to protect against security threats.


Authentication & Security

Azure App Service comes with built-in authentication features to help secure your web applications. You can integrate your app with various identity providers such as Microsoft Entra ID, Facebook, Google, Twitter, Apple, and more. This feature enables you to add user authentication quickly without building a custom authentication system from scratch.


If you prefer a simpler version that only highlights the key information, click here

Azure Service Bus

 1. Overview

  • Message broker
  • Can decouple applications and services
  • Some common scenarios:
    • Messaging: Transfer business data
    • Decouple application: client and server don’t have to be online at the same time.
    • Topics and subscriptions: enable 1:n relationships between publishers and subscribers.
    • Message sessions: implement workflows that require message ordering or message deferral.
2. Queue
  • FIFO
  • Use case: point-to-point communication (a quick sketch is shown below)
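A minimal sketch of point-to-point messaging with the Azure.Messaging.ServiceBus SDK; the queue name and the connection-string placeholder are illustrative.

using Azure.Messaging.ServiceBus;

// Producer: post a message to the "orders" queue.
await using var client = new ServiceBusClient("<service-bus-connection-string>");
ServiceBusSender sender = client.CreateSender("orders");
await sender.SendMessageAsync(new ServiceBusMessage("order #42 created"));

// Consumer: read from the same queue (point-to-point).
ServiceBusReceiver receiver = client.CreateReceiver("orders");
ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync();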




3. Topic
  • Topic and subscription: a subscription acts as a virtual queue on a topic. Each subscription receives a copy of the message.


4. Receive Mode
  • The default is PeekLock mode.
  • Receive modes: ReceiveAndDelete and PeekLock
    • ReceiveAndDelete:
      • Service Bus marks the message as consumed as soon as it is received.
      • Suitable when the application can tolerate not processing a message if a failure occurs.
    • PeekLock:
      • Two-stage receive (a sketch is shown below).
      • Suitable when the application cannot tolerate missing messages.
      • When Service Bus receives the request, it locks the message.
      • After successful processing -> call CompleteAsync.
      • If an error occurs -> call AbandonAsync.
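A sketch of the PeekLock flow using the current Azure.Messaging.ServiceBus SDK, where the settlement calls are CompleteMessageAsync and AbandonMessageAsync (CompleteAsync/AbandonAsync are the older SDK names); the queue name and placeholders are illustrative.

using System;
using Azure.Messaging.ServiceBus;

await using var client = new ServiceBusClient("<service-bus-connection-string>");

ServiceBusReceiver receiver = client.CreateReceiver(
    "orders",
    new ServiceBusReceiverOptions { ReceiveMode = ServiceBusReceiveMode.PeekLock });

// May return null if no message arrives within the default wait time.
ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync();

try
{
    // ...process the message...
    await receiver.CompleteMessageAsync(message);   // stage 2: settle on success
}
catch (Exception)
{
    // Release the lock so the message becomes available for another attempt.
    await receiver.AbandonMessageAsync(message);
    throw;
}
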
5. Security:

- Using Shared Access Signatures
  • In the Azure portal, you can define shared access policies for namespaces, queues, topics, and their subscriptions.
  • There are three policy rights:
    • Send
    • Listen
    • Manage
  • SAS can be applied at the namespace, entity (queue, topic), HTTP, or AMQP level.
  • Using SAS authorization
    • The connection string contains a rule name (SharedAccessKeyName) and a rule key (SharedAccessKey) => when we pass it to any constructor or factory, the SAS token provider is automatically created and populated.
  • Application use case: avoid using the default shared access policy
    • Grant the client only the roles/claims it needs.



7. ASP.NET Core - Fundamentals - Logs

This series is a collection of knowledge about ASP.NET Core. These are just my notes.

Part 1: Startup file.

Part 2: DI.

Part 3: Middleware.

Part 4: Host and Servers.

Part 5: Configurations.

Part 6: Environment.

Part 8: Error Handling.

Part 9: Routing.

Part 10: Make an HTTP Request.

Part 11: Static files.

Part 12: Authentication and Authorization.

Part 13: CORS.

Logging is one of the main approaches to troubleshooting an application, so don't forget to implement logging in your application.

Which kind of log provider is best: built-in or third-party?
If your app integrates with Azure or is deployed on Azure => use Application Insights.
If your app is deployed on-premises => use a structured-logging third party, for example, NLog.

  1. Log level

  • Trace (0) - LogTrace: Contains the most detailed messages. These messages may contain sensitive app data. They are disabled by default and should not be enabled in production.
  • Debug (1) - LogDebug: For debugging and development. Use with caution in production due to the high volume.
  • Information (2) - LogInformation: Tracks the general flow of the app. May have long-term value.
  • Warning (3) - LogWarning: For abnormal or unexpected events. Typically includes errors or conditions that don't cause the app to fail.
  • Error (4) - LogError: For errors and exceptions that cannot be handled. These messages indicate a failure in the current operation or request, not an app-wide failure.
  • Critical (5) - LogCritical: For failures that require immediate attention. Examples: data loss scenarios, out of disk space.
  • None (6): Specifies that a logging category should not write any messages.
  2. Built-in logging provider
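The built-in providers (Console, Debug, EventSource, and EventLog on Windows) are registered by the default host builder, so the usual pattern is simply to inject ILogger<T>. A minimal sketch (the controller and the log message are illustrative):

using System;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

[ApiController]
[Route("[controller]")]
public class WeatherController : ControllerBase
{
    private readonly ILogger<WeatherController> _logger;

    public WeatherController(ILogger<WeatherController> logger)
    {
        _logger = logger;
    }

    [HttpGet]
    public IActionResult Get()
    {
        // Written to all configured providers at the Information level.
        _logger.LogInformation("Getting weather at {Time}", DateTime.UtcNow);
        return Ok();
    }
}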

  3. Third-party logging provider
    • NLog
    • Serilog

  4. How to use Application Insights and telemetry
a. Step 1: Enable the .NET Core API to use Application Insights
    • Create an Application Insights resource via the Azure portal.
    • Get the instrumentation key or connection string.
b. Step 2: Configure Application Insights in the .NET Core API
    • Add the NuGet package:
      • Microsoft.ApplicationInsights.AspNetCore
    • ConfigureServices
public void ConfigureServices(IServiceCollection services)
{
    ...
    services.AddApplicationInsightsTelemetry(Configuration["ApplicationInsights:ConnectionString"]);
}
    • Important: some Azure regions require a connection string instead of an instrumentation key.
    • Note: You can also configure Application Insights following these recommendations.
It's best practice to have an extension method do the configuration; see:
public static IServiceCollection AddTelemetry(this IServiceCollection services, IConfiguration configuration)
{
    services.AddApplicationInsightsTelemetry(options =>
    {
        var appSettings = configuration.GetAppSettings();
        options.InstrumentationKey = appSettings.AIKey;
        // we want to know when the worker stops
        options.EnableHeartbeat = true;
        options.EnableAdaptiveSampling = false;
    });
    services.AddTelemetryService(configuration.GetAppSettings());
    return services;
}

c. Step 3: Raise a custom metric in Application Insights
    • Use TelemetryClient
TelemetryClient _telemetryClient;
...
[HttpGet]
public string Get()
{
    _telemetryClient.TrackMetric("CustomMetricExample22", 1, new Dictionary<string, string> { { "Value", "22" } });
    ...
}
      • The best practice is to create a wrapper class for TelemetryClient



8. [Updated] ASP.NET Core - Fundamentals - Error Handling

This series is a collection of knowledge about ASP.NET Core. These are just my notes.

Part 1: Startup file.

Part 2: DI.

Part 3: Middleware.

Part 4: Host and Servers.

Part 5: Configurations.

Part 6: Environment.

Part 7: Logs.

Part 9: Routing.

Part 10: Make an HTTP Request.

Part 11: Static files.

Part 12: Authentication and Authorization.

Part 13: CORS.

There are several ways to handle exceptions when working with ASP.NET Core.

1. Use a try-catch block
a. Best practices when using "throw"
    • throw ex: loses the stack trace for downstream layers
      • For example, if an exception occurs in the data layer and the business layer catches it and re-throws with 'throw ex', the API layer loses the stack trace.
      • Shouldn't be used like this.
    • throw: keeps the stack trace for downstream layers
      • For example, if an exception occurs in the data layer and the business layer catches it and re-throws with 'throw', the API layer gets the detailed stack trace.
      • Should be used like this.
    • throw new Exception(...): makes downstream layers lose the detailed stack trace
      • Only use it at the lowest level; downstream levels should use "throw" so the stack trace is preserved (see the sketch below).
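A minimal self-contained sketch of the difference (type and method names are illustrative):

using System;

public static class ThrowDemo
{
    private static void DataLayer() => throw new InvalidOperationException("DB error");

    public static void BusinessLayerBad()
    {
        try { DataLayer(); }
        catch (Exception ex)
        {
            // Re-throws but RESETS the stack trace: the API layer will see this
            // line as the origin instead of DataLayer().
            throw ex;
        }
    }

    public static void BusinessLayerGood()
    {
        try { DataLayer(); }
        catch (Exception)
        {
            // Re-throws and PRESERVES the original stack trace from DataLayer().
            throw;
        }
    }
}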


3. Validation DDD
There are 4 possible ways to implement validation.

  • IsValid method: transition an entity to a state (potentially invalid) and ask it to validate itself.
  • Validation in application services.
  • TryExecute pattern.
  • Execute / CanExecute pattern.
There are two major types of validation.

  • Task-based where you don’t need to map validation errors to UI elements.
  • CRUD-y where you do need to do that.
The Execute / TryExecute pattern works best for task-based scenarios. For CRUD scenarios, you need to choose between Execute / TryExecute and validation in application services. The choice comes down to purity versus ease of implementation.

4. Centralize exception handling with IExceptionFilter
  • Implement an exception filter class (a combined sketch of the three steps is shown after this list)


  • Add to the configuration method


  • Throw an exception when needed.
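A combined sketch of the three steps above, since the original screenshots are not reproduced here (the class name and error shape are illustrative):

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;

// 1. Implement an exception filter.
public class ApiExceptionFilter : IExceptionFilter
{
    public void OnException(ExceptionContext context)
    {
        context.Result = new ObjectResult(new { error = context.Exception.Message })
        {
            StatusCode = StatusCodes.Status500InternalServerError
        };
        context.ExceptionHandled = true;
    }
}

// 2. Register it when adding controllers (Program.cs / ConfigureServices):
//    builder.Services.AddControllers(options => options.Filters.Add<ApiExceptionFilter>());

// 3. Any exception thrown from an action is now converted into a 500 response:
//    throw new InvalidOperationException("Something went wrong");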


5. [From ASP.NET Core 8] - Use IExceptionHandler

The built-in exception handler middleware uses IExceptionHandler implementations to handle exceptions.

This interface has a single TryHandleAsync method:
  • If the handler can handle the exception => return true.
  • If not => return false.
  • Custom handling logic can be implemented in this method.
Basic usage:

Implementation of IExceptionHandler

public class GlobalExceptionHandler : IExceptionHandler
    {
        private readonly ILogger<GlobalExceptionHandler> logger;

        public GlobalExceptionHandler(ILogger<GlobalExceptionHandler> logger)
        {
            this.logger = logger;
        }

        public async ValueTask<bool> TryHandleAsync(
            HttpContext httpContext,
            Exception exception,
            CancellationToken cancellationToken)
        {
            this.logger.LogError(exception, "Exception occurred: {Message}", exception.Message);

            var problemDetails = new ProblemDetails
            {
                Status = StatusCodes.Status500InternalServerError,
                Title = "Server error -- apply IExceptionHandler for Global"
            };

            httpContext.Response.StatusCode = problemDetails.Status.Value;

            await httpContext.Response
                .WriteAsJsonAsync(problemDetails, cancellationToken);

            return true;
        }
    }


Use the handler in the Program file

// Use the exception handler
builder.Services.AddExceptionHandler<GlobalExceptionHandler>();
builder.Services.AddProblemDetails();

var app = builder.Build();

app.UseExceptionHandler();

The exception handler in the new way (sample)


References:

[Microsoft Learn] - Handle errors in ASP.NET Core
