In the last post in this series, I cover using GitHub Actions to automatically update the website content in the storage account. I also cover updating the function code and settings for the visitor counter.

Using GitHub Actions Workflow

GitHub Actions allows you to automate software development workflows that integrate with your GitHub repositories. You can build and deploy software packages stored in your repositories by defining actions in a workflow.

If you’ve worked with Azure Pipelines, the concept is very similar. Both are written in YAML and have integrations with third-party solutions and platforms. I’ve built some pipelines in the past, so working with GitHub Actions was an easier task for me.

Updating Static Website Content

Microsoft also provides a great guide for deploying a static website to Azure Storage using GitHub Actions. Before building the workflow, I created a service principal for authenticating to Azure. The service principal has the Contributor role on the resource group where all the components are stored. I then configured a GitHub secret with the service principal credentials so the workflow can authenticate to Azure.
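If you want to create a similar service principal, a command along these lines works. The display name here is just a placeholder, and you should substitute your own subscription ID; the --sdk-auth switch outputs the credentials as JSON in the format the azure/login action expects, which you then paste into a repository secret (mine is named AZURE_CREDENTIALS):

    az ad sp create-for-rbac \
        --name "github-actions-azureresume" \
        --role Contributor \
        --scopes "/subscriptions/<subscription-id>/resourceGroups/azureresume-rg" \
        --sdk-auth

Scoping the role assignment to the single resource group keeps the service principal from having permissions anywhere else in the subscription.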

GitHub Actions executes the workflow when a commit or pull request is made against the main branch in the repo. The workflow begins by using the stored GitHub secret to authenticate to Azure using the Azure CLI. Next, the workflow copies the website content from the repository to the Azure Storage container. This is the same command I included in my deployment script, so I already had it ready.
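The trigger for this behavior lives at the top of the workflow file. A minimal version looks like this (the workflow name is my own choice):

    name: Deploy website
    on:
      push:
        branches: [ main ]
      pull_request:
        branches: [ main ]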

Once the copy is completed, the workflow purges the Azure CDN endpoint content. This action ensures that the new website content is displayed in a timely manner since the CDN caches content. I added the --no-wait switch because purging the content can take a few minutes, and I wanted the workflow to complete without waiting.

Here are the steps for logging in, updating the content, and purging the CDN endpoint:

    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - name: Upload to blob storage
        uses: Azure/cli@1.0.4
        with:
          inlineScript: |
            az storage blob upload-batch --account-name stjbtazureresumeprod --destination '$web' --source frontend
      - name: Purge CDN endpoint
        uses: Azure/cli@1.0.4
        with:
          inlineScript: |
            az cdn endpoint purge --content-paths "/*" --profile-name "cdnp-azureresume" --name "cdne-jeffbrowntech-me" --resource-group "azureresume-rg" --no-wait

Updating the Azure Function Code

Since I wrote the VisitorCounter function code in the portal, I couldn't use the az functionapp CLI commands to deploy it. However, I found that Azure Functions stores the code in the associated storage account using Azure Files. I navigated through the file share and found the function's files: run.ps1 and function.json. Using an Azure CLI action, I used the az storage file upload command to upload the repo files to the file share locations.
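If you want to confirm the paths before writing the workflow step, you can list the function's files in the share from the CLI. The account and share names below are the same ones used in the workflow step that follows:

    az storage file list \
        --account-name stjbtazureresumeprod \
        --share-name func-jbtresume-prod-001 \
        --path "site/wwwroot/VisitorCounter" \
        --output table

This should show run.ps1 and function.json, matching what you see when browsing the share in the portal.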

Here is the upload step from the workflow:

- name: Copy function code files
  uses: Azure/cli@1.0.4
  with:
    inlineScript: |
      az storage file upload --account-name stjbtazureresumeprod --share-name func-jbtresume-prod-001 --source ./backend/VisitorCounter/run.ps1 --path "site/wwwroot/VisitorCounter"
      az storage file upload --account-name stjbtazureresumeprod --share-name func-jbtresume-prod-001 --source ./backend/VisitorCounter/function.json --path "site/wwwroot/VisitorCounter"

GitHub stores the completed workflow in the repository under the .github/workflows folder. Navigate to the main.yml file and select Edit to open the workflow editor again. You can view my full workflow in my repository using the following link:

JeffBrownTech/azure-resume-project/.github/workflows/main.yml


That completes the Azure Resume Challenge! I found this to be a fun challenge that forced me to try a few different things that I had not worked with before. I’m finding the Azure CLI a nice alternative to PowerShell and will continue to use it in the future.

In the future, I will work on revamping the project to include Terraform to deploy all the Azure resources. I will then include the deployment as part of the GitHub Actions workflow so the whole solution is automated.