How to use Azure CDN to boost website performance and SEO?

There are websites out there that were developed when cloud services were not so prevalent. In some cases, they need a boost to thrive. Luckily, that is not so complicated.

Rafael Pereira
7 min read · Oct 20, 2020
Slow services don’t last.

Website loading times are super important. Nowadays, users have no patience for slow websites, and a website's overall performance and responsiveness are among the most important factors in retaining users. Loading times also have a considerable business impact due to their effect on SEO (Search Engine Optimization).

There are a lot of websites with a worldwide reach that are being served from a single location. That presents a challenge, especially if the website delivers lots of content. Luckily, cloud services help us deliver the content from locations closer to the end-user, as is the case with CDNs (Content Delivery Networks).

CDNs are groups of geographically distributed servers working together to provide fast delivery of internet content.

This article will guide you through the process of migrating assets from an existing on-premises website to the Azure cloud and serving them with an Azure CDN:

  1. Overall Architecture;
  2. Azure Storage Account & Blob Container configuration;
  3. Migration of existing assets with AzCopy;
  4. Azure CDN Profile configuration;
  5. Required Application Changes;

Overall architecture

To take the most out of the Azure CDN, we need to ensure that the content’s origin is as accessible as possible to the CDN nodes. This way, even the first load (when the content is not yet cached on the CDN) will be pretty fast.

One of the best ways to achieve low latencies on content loading for an Azure CDN is to use Azure Blob Storage as the content origin. That’s what we will use.

High-level architecture of the Azure-powered CDN solution

Our CDN solution requires the following Microsoft Azure Resources to be created:

  • Azure Subscription: is a logical container used to provision resources in Azure;
  • Azure Resource Group: is a container that holds related resources for an Azure solution;
  • Azure Storage Account and one Blob Container: where the assets will be permanently stored;
  • Azure CDN profile: configuration of the CDN that will deliver the assets stored on the Azure Blob Container;

The solution described is a simple baseline. More complex use cases may require several Storage Accounts in different locations and object replication between them.
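As a starting point, the Resource Group that will hold all of these resources can be created with the Azure CLI. The group name and location below are the ones reused in the later commands of this article:

```shell
# Create the Resource Group that will hold the Storage Account and the CDN Profile
# (assumes you are already logged in with `az login` and have a subscription selected)
az group create \
--name azureassetssample \
--location eastus
```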

Storage Account Configuration

This part assumes you already have an Azure Subscription and a Resource Group to hold the necessary resources. Important parameters to set up for the Azure Storage Account:

  • Location: the primary location where the assets will be available; The closer this location is to the place where the assets are loaded to the website, the better;
  • Replication: in this case, we have selected Read-Access Geo-Redundant Storage, as we want to be able to failover to another region in case the primary location goes offline;
  • Data protection: you may want extra protection when some assets are accidentally deleted. Point-In-Time restore, Soft-Delete, or Blob-Versioning are advisable in those cases;

Creating the Storage Account using Azure Portal

Creating the Storage Account that will hold the Blob Container with the assets.

Creating the Storage Account using Azure CLI

az storage account create \
--name websitestaticcontent \
--resource-group azureassetssample \
--location eastus \
--sku Standard_RAGRS
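The data-protection options mentioned above are not covered by `az storage account create`. As a sketch, soft delete and blob versioning can be enabled afterwards with a command along these lines (the 7-day retention is an illustrative value):

```shell
# Enable soft delete (7-day retention) and blob versioning on the new account.
# Adjust the retention period to your own requirements.
az storage account blob-service-properties update \
--account-name websitestaticcontent \
--resource-group azureassetssample \
--enable-delete-retention true \
--delete-retention-days 7 \
--enable-versioning true
```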

Creating the Blob Container using the Azure Portal

Now we need to create a Blob Container to hold our assets and choose the public access level. It is possible to enable anonymous access for blobs and containers, but it’s not advisable when content belongs to different users. Additionally, third parties would be able to use your assets directly without being authorized to do so. Hence, we will make the container private and use SAS (Shared Access Signature) tokens to access our assets.

Setting up the Azure Blob Container.

Creating the Blob Container using the Azure CLI

az storage container create \
--name assets \
--account-name websitestaticcontent \
--public-access off

Migration of existing assets

Microsoft Azure offers several options to migrate on-premises data to Azure. For huge quantities of data, it is even possible to ship disk drives to Microsoft so that they are loaded directly at the data center.

For this tutorial, we will load the on-premises data to Azure using the AzCopy command-line utility. This utility is available for Windows, Linux, and Mac. Here are the required steps:

  1. Connect AzCopy to Azure;
  2. Generate a SAS token for the Storage Account so that AzCopy can upload the assets to the blob container;
  3. Finally, execute the AzCopy to upload the on-premises data to the container;

Executing the process described using Azure CLI and AzCopy:

# Generate a SAS token and specify the permissions
# (in this case add, create, delete, list, read, and write).
# Add an expiration date for the SAS (important to ensure that the
# token cannot be wrongfully misused later if it is leaked).
az storage container generate-sas \
--account-name websitestaticcontent \
--name assets \
--expiry 2020-11-22T00:00Z \
--permissions acdlrw
# Sample output: "se=...&sp=...&sv=...&sr=...&sig=..."

# Log AzCopy in to Azure (requires the device code flow)
azcopy login

# Run AzCopy to load the local assets.
# On the destination container's Url, add the SAS token generated in the previous step.
azcopy sync \
"assets/" \
"https://{StorageAccount}.blob.core.windows.net/assets?{SAS}" \
--recursive=true

Testing the Assets Migration

After the assets are uploaded to the Azure Blob Container, you may copy the Blob Container URL, append the path to a blob/file, and access it in the browser. If you use the SAS token and it hasn’t expired yet, you should see the blob. Without a SAS token, the following error should be displayed:
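The same check can be run from a terminal with curl. The blob name below is a hypothetical example; for a private container, the anonymous request should fail with an error response:

```shell
# Without a SAS token (private container): Azure returns an error response
curl -i "https://websitestaticcontent.blob.core.windows.net/assets/logo.png"

# With a valid, unexpired SAS token appended, the blob should be returned (HTTP 200)
curl -i "https://websitestaticcontent.blob.core.windows.net/assets/logo.png?{SAS}"
```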

The error returned when no SAS token is passed to access a Blob.

This means that the configuration is working properly and that the assets are safely available on Azure.

CDN Profile Configuration

Now that the data is on an Azure Blob Container, we need to ensure the assets will be geographically distributed and cached by using an Azure CDN. The next step is, therefore, to create an Azure CDN Profile. Here are the parameters we need to pay special attention to when creating one:

  • CDN Profile / Pricing tier: in this use case, we will use the default Standard Microsoft. However, if you deliver large files or you need special optimizations for video streaming, you should compare the offerings available and select the one that better suits your business requirements;
  • CDN Profile / CDN Endpoint Name: the CDN Url (it will be suffixed by “azureedge.net”);
  • CDN Profile / Origin hostname: where the assets will come originally from. In this case, the Url to the Azure Storage Account we have created before;
  • CDN Profile / Endpoint Caching Rules /Query String Caching behavior: as we will use SAS tokens (passed as Query String parameters) to ensure only authorized users/apps can access the assets, we need to select Cache Every Unique URL. The default is to Ignore query string, which wouldn’t meet the security requirements;

CDN Profile Configuration using Azure Portal

1. Creating the CDN Profile and the first Endpoint:

Creating and configuring the CDN Profile that will deliver the assets.

2. After the CDN Profile is created, we need, as explained, to change the caching behavior of the Endpoint (select the Endpoint and then Caching Rules):

Changing the Query String Caching behavior for the created Endpoint.

CDN Profile Configuration using Azure CLI

The following commands replicate the process shown above using the Azure Portal:

# Creating the CDN Profile
az cdn profile create \
--name assetsglobaldelivery \
--resource-group azureassetssample \
--sku Standard_Microsoft

# Adding an Origin Endpoint to the previously created CDN Profile
# UseQueryString = "Cache Every Unique URL" on the Azure Portal interface
az cdn endpoint create \
--name globalreach \
--origin websitestaticcontent.blob.core.windows.net 80 443 \
--profile-name assetsglobaldelivery \
--resource-group azureassetssample \
--query-string-caching-behavior UseQueryString
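To confirm the endpoint was created with the intended caching setting, the property can be read back with `az cdn endpoint show` (the JMESPath query below assumes the `queryStringCachingBehavior` property name exposed by the CDN endpoint resource):

```shell
# Read back the query-string caching behavior of the new endpoint
az cdn endpoint show \
--name globalreach \
--profile-name assetsglobaldelivery \
--resource-group azureassetssample \
--query queryStringCachingBehavior
```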

Required Application Changes

We are almost ready to go. But we still need to ensure that our application uploads and saves the assets on the Azure Blob Container. Even more important, when an asset is added to a webpage, it must point to the CDN Url, and a SAS token must be added to the Query String.

As this logic varies per technology stack (CMS, custom-tailored app, frameworks, etc.), no detailed instructions will be given for this part. Still, here is a list of recommendations:

  • When possible, catch an asset uploaded event and upload it to the Azure Blob Container (you will need a connection string with the right permissions on the container, of course). This way, you would still have a copy on-premises;
  • Centralize the Asset Url construction logic and specify the CDN Url;
  • Reuse SAS tokens whenever possible to make the most out of the CDN cache. However, never reuse SAS tokens for different users;
  • IMPORTANT: blob names are case-sensitive, and some characters are not supported in paths and filenames on Azure Blob Containers. Check the naming documentation, and if your assets’ paths are not supported, a uniformization process will be necessary!
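The recommendation to centralize the URL-construction logic can be sketched as a tiny shell helper. All names and the SAS token below are illustrative placeholders:

```shell
# Hypothetical helper that builds the final CDN URL for an asset path.
# CDN_HOST matches the endpoint name chosen earlier; the SAS token is a placeholder.
CDN_HOST="globalreach.azureedge.net"
SAS_TOKEN="se=2020-11-22&sp=r&sig=placeholder"

asset_url() {
  echo "https://${CDN_HOST}/assets/$1?${SAS_TOKEN}"
}

asset_url "img/logo.png"
# → https://globalreach.azureedge.net/assets/img/logo.png?se=2020-11-22&sp=r&sig=placeholder
```

Keeping this in one place makes it trivial to swap the CDN host or rotate SAS tokens without touching every page template.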

After the application is wired up with the Azure Blob Container and the Azure CDN, you are ready to go live and boost your app. Don’t forget to measure the impact on both performance and SEO. A change is not an improvement until we prove it with measurements!

