Mendix is a great platform for building apps, both large and small. The Mendix platform even provides a very usable service portal for deploying the developed Mendix apps. However, all of this is done by hand: someone has to manually create a deployment package and deploy it to the correct environment. This is fine when there are only a few apps to deploy, but as you build more and more apps, it becomes cumbersome. This is where continuous integration (CI) and continuous delivery (CD) give us more speed and control.
How to achieve this with Mendix
Mendix offers a free-to-use API for CI tools, known as the Mendix Deploy API. A link to all the functions can be found here. The Deploy API provides everything needed to implement a proper pipeline for various scenarios: checking for a new commit, building a new package and deploying the newly built package. Beyond those primary functions, more functions are available to configure the complete environment via the API. Think of constants, scheduled events and custom runtime settings.
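As a minimal illustration of talking to the Deploy API from PowerShell, the sketch below lists your apps. The endpoint and header names follow the public Deploy API documentation; the username and API key are placeholders you must replace with your own.

```powershell
# Sketch: list your Mendix apps via the Deploy API.
# The credentials below are placeholders, not working values.
$Headers = @{
    'Mendix-Username' = 'user@example.com'                        # placeholder
    'Mendix-ApiKey'   = '00000000-0000-0000-0000-000000000000'    # placeholder
}

# Invoke-RestMethod parses the JSON response into PowerShell objects.
$Apps = Invoke-RestMethod -Uri 'https://deploy.mendix.com/api/1/apps' `
                          -Method Get -Headers $Headers

$Apps | Select-Object Name, Url
```

The same header pattern applies to every other Deploy API call, which is why the script keeps the credentials in a separate data file.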
Let’s dive in and make it work
In the following example, I’ll use Windows PowerShell as the tool to provide CI for Mendix. Why PowerShell? For starters, it is pre-installed on any Mendix development machine, since Mendix development runs on Windows. And since Microsoft is contributing more and more to the open-source community, the next version of PowerShell, PowerShell Core, also runs on other operating systems such as Linux and macOS, and even on ARM. This makes it possible for everybody to use it locally and on any server, with the option to execute it automatically via a background job.
My interest was sparked by the PowerShell example already present on the Mendix documentation site. The prerequisites are mentioned there: having a current version of PowerShell, making sure scripts can run on your system with ‘Set-ExecutionPolicy RemoteSigned’, and how to obtain the credentials to build and deploy apps.
Now I want to take it a few steps further by enhancing the script, making it more modular and following some best practices so the scripts can be shared without compromising security. To that end, we split the CI script into its core parts.
The structure is set up to keep the CI script easy to update and share:
- We have a script functions file with all the functions and the logic to get it working. This can be placed somewhere central, like a GitHub repo.
- We have a script data file with all the specific environment data to use in the CI pipeline. This can be placed inside the Mendix repository of your app.
- We have a script data file with the Mendix credentials. You should keep this for yourself.
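A natural format for both data files is a PowerShell data file (`.psd1`) containing a hashtable. The file names and key names below are illustrative, not necessarily the exact ones my script uses:

```powershell
# app-environment.psd1 — app-specific data, kept in the app's repository.
# Key names are illustrative.
@{
    AppName     = 'MyMendixApp'
    Branch      = 'trunk'        # branch to build packages from
    Environment = 'Test'         # target environment for deploys
}
```

```powershell
# credentials.psd1 — Mendix API credentials, kept to yourself (outside any repo).
@{
    Username = 'user@example.com'                        # placeholder
    ApiKey   = '00000000-0000-0000-0000-000000000000'    # placeholder
}
```

A `.psd1` file can be read safely with `Import-PowerShellDataFile`, which evaluates only data, not code.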
By placing the script functions file and the app’s script data file in their corresponding repositories, updates reach everybody who’s involved in the project automatically.
Some explanation about the files
The script functions file is the most important one to get something done, but it contains no data about the app or the app environment, nor any security details. It does contain all the functions to create a build and deploy it to the environment of choice. By default, this is Test.
It takes four parameters at the time of writing this post: two for the data files and two for script behaviour. Do I want to deploy directly? And do I want to set the configuration during the deploy? The latter is still work in progress. Both are optional and are interpreted as ‘false’ when not provided.
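In PowerShell, those four parameters could be declared roughly like this. The names are illustrative; the two `[switch]` parameters are `$false` by default when omitted, matching the behaviour described above:

```powershell
param(
    [Parameter(Mandatory = $true)]
    [string] $AppDataFile,        # path to the app/environment data file

    [Parameter(Mandatory = $true)]
    [string] $CredentialsFile,    # path to the Mendix credentials file

    [switch] $DoDeploy,           # deploy directly after a successful build?
    [switch] $SetConfiguration    # set environment configuration during deploy (WIP)
)
```

Using `[switch]` instead of `[bool]` is idiomatic here: callers just add `-DoDeploy` instead of passing an explicit value.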
The script data file for the app environment contains the main information, such as which branch we want to use for builds and the environment to deploy to. Soon, environment settings such as constants and scheduled events will be included as well.
The script data file for the Mendix API credentials is quite self-explanatory.
The flow to get it deployed
The flow to get a new package deployed.
The steps to take to get a new deployment onto the specified environment are the ones shown above. The first decision is there for performance: we avoid creating a new build when the same revision has already been packaged. The second one is determined by the ‘-dodeploy’ parameter.
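The flow above can be outlined in PowerShell like this. The helper function names are placeholders for the helpers in the functions file, not the actual names:

```powershell
# Hypothetical outline of the pipeline flow; helper names are placeholders.
$Revision = Get-LatestRevision -Branch $Branch

if (-not (Test-PackageExists -Revision $Revision)) {
    # Only build when this revision has not been packaged before.
    $PackageId = New-DeploymentPackage -Revision $Revision
}
else {
    # Reuse the package built earlier for the same revision.
    $PackageId = Get-ExistingPackage -Revision $Revision
}

if ($DoDeploy) {
    Deploy-Package -PackageId $PackageId -Environment $Environment
}
```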
Get the script working on your pc
Make sure to start PowerShell with administrator privileges so the script is allowed to execute. Note the absolute paths where the necessary files are located and run the script via the PowerShell console.
Example with a build only.
Example with a build and a deploy.
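Such console invocations might look like this. The script and data file paths are examples; adjust them to your own locations:

```powershell
# Build only (no -DoDeploy switch, so it defaults to false)
.\ci-functions.ps1 -AppDataFile 'C:\ci\app-environment.psd1' `
                   -CredentialsFile 'C:\ci\credentials.psd1'

# Build and deploy to the configured environment
.\ci-functions.ps1 -AppDataFile 'C:\ci\app-environment.psd1' `
                   -CredentialsFile 'C:\ci\credentials.psd1' -DoDeploy
```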
Now that we have all the necessary pieces in place, we can automate it. Create a batch file to use with a scheduled task (every 5 minutes, for example) or just double-click it when needed. Keep in mind that this batch file should also stay out of the repository, unless you all share the same locations and file names referenced inside it. Now you no longer need to go to the cloud portal and wait before you can continue. Automation can easily save you 15 minutes or more with every deploy, so I suggest applying this right from the start if possible.
Example of batch file contents.
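A batch file along these lines would do the trick; the paths are examples and should point at your own script and data file locations:

```batch
@echo off
powershell.exe -ExecutionPolicy RemoteSigned ^
  -File "C:\ci\ci-functions.ps1" ^
  -AppDataFile "C:\ci\app-environment.psd1" ^
  -CredentialsFile "C:\ci\credentials.psd1" -DoDeploy
```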
A test of the created batch file. It works!
Next steps to do
The script currently works as it is supposed to. However, there are some enhancements I will likely add, such as logging to an external file for traceability and setting the environment configuration.
The tools I used to build the script and test every iteration are PowerShell ISE and Visual Studio Code with the PowerShell extension.
The script I use and keep up to date can be found here. The repository also includes some template files, which can be used and adapted to your app’s details.
I hope you take some time during the first sprint of your project to get things up and running and, with that, set up this CI pipeline. It’s pretty straightforward and will save you a lot of time right from the start.
Bart Poelmans – Mendix Expert @ Bizzomate