
Low-code/serverless (integration) environments and testability

This month I attended the Integrate 2018 integration congress, which was held from June 4-6 in London. During these three days the attendees and I were informed about the latest low-code and serverless integration solutions in the Azure cloud: think of solutions like Logic Apps, Azure Functions, Event Grid and API Management.

One general theme I was missing during all these sessions is the testability of the solution. For BizTalk my team invested a lot of time in writing unit tests for our solutions. One of my colleagues wrote a unit test framework, based on MSTest, which lets us test our XML schemas, XSLTs and pipeline components, for example. Our CI/CD pipeline executes these tests every time code is checked in, and the BizTalk solution does not have to be deployed to an actual BizTalk environment. With these low-code/serverless tools it feels like I am back in the "works on my machine" (actual environment) and labour-intensive manual testing era. None of the speakers mentioned the concepts of unit testing and automated integration/acceptance testing for these products during the conference.
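To give an idea of the kind of tests I mean: below is a minimal sketch (not my colleague's actual framework) of an MSTest class that validates a test message against an XSD and runs an XSLT map, all without a deployed environment. The schema, map and message file names are hypothetical placeholders.

using System.Xml;
using System.Xml.Schema;
using System.Xml.Xsl;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class OrderMapTests
{
    [TestMethod]
    public void InputMessage_ShouldValidateAgainstSchema()
    {
        var schemas = new XmlSchemaSet();
        schemas.Add(null, @"Schemas\PurchaseOrder.xsd"); // hypothetical schema

        var settings = new XmlReaderSettings
        {
            ValidationType = ValidationType.Schema,
            Schemas = schemas
        };
        settings.ValidationEventHandler += (s, e) => Assert.Fail(e.Message);

        using (var reader = XmlReader.Create(@"TestMessages\PurchaseOrder.xml", settings))
        {
            while (reader.Read()) { } // read the whole document, failing on any schema violation
        }
    }

    [TestMethod]
    public void OrderMap_ShouldProduceInvoiceNumber()
    {
        var transform = new XslCompiledTransform();
        transform.Load(@"Maps\PurchaseOrderToInvoice.xslt"); // hypothetical map

        var output = new XmlDocument();
        using (var writer = output.CreateNavigator().AppendChild())
        {
            transform.Transform(@"TestMessages\PurchaseOrder.xml", null, writer);
        }

        // Assert on the transformed output instead of eyeballing it in a test environment.
        Assert.IsNotNull(output.SelectSingleNode("//InvoiceNumber"));
    }
}

Tests like these run in seconds on the build agent, which is exactly what I am missing for the low-code/serverless stack.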

Within our company we have a team of developers who use the low-code Mendix platform. This also seems to be a platform that lacks the possibility to write unit tests. Observing that team, I see that they do a lot of manual testing. They have a few testers in their Scrum team who do all the manual testing to verify that the apps are doing the right things. On the positive side they are utilizing the test plan functionality of VSTS Test Manager for their manual testing, but on the negative side they still have not invested the time to write automated web tests for their (web) apps.
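As a rough sketch of what such an automated web test could look like, the example below uses Selenium WebDriver with MSTest. The URL and element ids are hypothetical placeholders, not the team's real app.

using Microsoft.VisualStudio.TestTools.UnitTesting;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

[TestClass]
public class LoginPageTests
{
    [TestMethod]
    public void ValidUser_ShouldSeeDashboard()
    {
        using (IWebDriver driver = new ChromeDriver())
        {
            driver.Navigate().GoToUrl("https://myapp.example.com/login"); // hypothetical URL

            driver.FindElement(By.Id("username")).SendKeys("testuser");
            driver.FindElement(By.Id("password")).SendKeys("P@ssw0rd!");
            driver.FindElement(By.Id("login-button")).Click();

            // The test fails if the dashboard header never shows up.
            Assert.IsTrue(driver.FindElement(By.Id("dashboard-header")).Displayed);
        }
    }
}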

Possible solutions for testing serverless applications like Logic Apps

Back to my own serverless/low-code integration stack. I don't like the idea that my integration solutions need a lot of manual testing every time I make a change to them. What are possible solutions to mitigate this?

To verify that your ARM templates (which is what Logic Apps essentially are) do not contain mistakes, you can for example use JSON Lint to check that the JSON code is valid. You can add this as a step in your VSTS CI builds by installing it via npm: add the Node Tool Installer step and the npm step, and in the npm step install JSON Lint with the command install jsonlint -g. To check the actual JSON code you have to add a command line step; with the command FOR /r %%A IN (*.json) DO jsonlint %%A I can verify that every JSON file contains valid JSON. The next step, to verify that the ARM template is deployable, is doing a trial Azure Resource Group deployment with the 'Validation only' deployment mode. It's not foolproof, but it will detect most errors. You can extend the validation of the ARM template by using Pester; I found the article "Testing ARM Templates with Pester" written by Justin Yoo which describes that scenario.
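If you prefer to keep this kind of check next to your other tests instead of (or in addition to) the jsonlint command line step, the same idea can be expressed as an MSTest test that fails the CI build when any JSON file in the repository does not parse. This is only a sketch; the way the repository root is resolved is an assumption about the project layout.

using System;
using System.IO;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Newtonsoft.Json.Linq;

[TestClass]
public class ArmTemplateJsonTests
{
    [TestMethod]
    public void AllJsonFiles_ShouldContainValidJson()
    {
        // Walk up from the test output folder to the repository root (assumed layout).
        var repoRoot = Path.GetFullPath(Path.Combine(AppContext.BaseDirectory, @"..\..\.."));

        foreach (var file in Directory.GetFiles(repoRoot, "*.json", SearchOption.AllDirectories))
        {
            try
            {
                // Parsing is enough to catch missing commas, quotes and brackets.
                JToken.Parse(File.ReadAllText(file));
            }
            catch (Newtonsoft.Json.JsonReaderException ex)
            {
                Assert.Fail($"{file} contains invalid JSON: {ex.Message}");
            }
        }
    }
}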

Based on my search findings, unit testing Logic Apps does not seem to be possible. We have to deploy our Logic Apps and use the concept of automated integration/acceptance tests with BDD frameworks like SpecFlow. I found the following articles on the web about this solution path:
I will have to spend some time investigating whether SpecFlow is a solution for my team. It looks better than doing only manual testing, that's for sure.
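To make the idea a bit more concrete, here is a rough sketch of SpecFlow step bindings for an acceptance test against a deployed Logic App with an HTTP trigger. The scenario text, the environment variable name and the response contract are assumptions for illustration, not a proven setup.

// Gherkin scenario this binding class implements (in the .feature file):
//
//   Scenario: Order message is accepted by the integration
//     Given an order message for customer "Contoso"
//     When the message is posted to the Logic App endpoint
//     Then the Logic App accepts the message

using System;
using System.Net.Http;
using System.Text;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using TechTalk.SpecFlow;

[Binding]
public class LogicAppOrderSteps
{
    private static readonly HttpClient Client = new HttpClient();
    private string _orderJson;
    private HttpResponseMessage _response;

    [Given(@"an order message for customer ""(.*)""")]
    public void GivenAnOrderMessageForCustomer(string customer)
    {
        _orderJson = $"{{ \"customer\": \"{customer}\", \"quantity\": 1 }}";
    }

    [When(@"the message is posted to the Logic App endpoint")]
    public void WhenTheMessageIsPosted()
    {
        // The callback URL (including its SAS signature) is injected by the release pipeline.
        var url = Environment.GetEnvironmentVariable("LOGICAPP_TRIGGER_URL");
        var content = new StringContent(_orderJson, Encoding.UTF8, "application/json");
        _response = Client.PostAsync(url, content).GetAwaiter().GetResult();
    }

    [Then(@"the Logic App accepts the message")]
    public void ThenTheLogicAppAcceptsTheMessage()
    {
        Assert.IsTrue(_response.IsSuccessStatusCode,
            $"Expected a success status code but got {(int)_response.StatusCode}.");
    }
}

The downside remains that the Logic App has to be deployed before these tests can run, so they belong in the release pipeline rather than in the CI build.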

Comments

  1. The term "low-code app development" didn't exist until a few years ago, but the concept isn't a new one: business users who see an opportunity to optimize a process take it upon themselves to create their own apps. Rather than spending the time and manual effort to code an app from scratch out of common features and components, low-code development platforms let developers work from existing templates and drag prebuilt elements, forms, and objects together to give a particular department or team the simple working app they need with a lot less hassle.


