Powerful new tool for automated testing
By: Luther Jolliff
In April 2018, Puppet announced the launch of Continuous Delivery for Puppet Enterprise v1. The software's automated testing capabilities seemed very cool and were adopted by some, but it did not reach mainstream appeal. Then came CD4PE v2. Suddenly, powerful new features such as Impact Analysis and individual module deployments sprang up within months of each other, and bugs that had plagued users since day one were resolved with every release. Puppet has clearly made the development of this application one of its top priorities, and once our team took the service for a test drive on a Kubernetes cluster (we even made a Helm chart), it was easy to see why. In this post we will give a short walk-through of what the service does and how it is used.
CD4PE uses Docker and Puppet Pipelines to automatically run custom tests on code commits or any other specified trigger. Pipelines can be configured to deploy individual modules to an environment once all tests have passed. The examples shown below are based on version 2.7.1.
This process is similar to testing solutions found in services like Jenkins, but the interface is (in our opinion) much easier to use and navigate. We were running automated tests within an hour of installing the software.
To demonstrate, we will show screenshots of the UI as a commit moves through the pipeline:
Above are several modules we are managing with CD4PE. The status of each added Puppet module changes from green to red if a test in its pipeline fails. To add a new module, simply follow the prompts and add the provided webhook to your repository configuration.
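For GitHub-backed repositories, the webhook the prompts hand you maps onto GitHub's standard hook-creation payload, roughly like the sketch below. The endpoint URL and secret here are placeholders; use the exact values CD4PE gives you.

```json
{
  "name": "web",
  "active": true,
  "events": ["push", "pull_request"],
  "config": {
    "url": "https://cd4pe.example.com/github/push",
    "content_type": "json",
    "secret": "<token-from-the-cd4pe-ui>"
  }
}
```

Subscribing to both `push` and `pull_request` events is what lets the pipeline distinguish commits from pull requests later on.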
It’s good to see a lot of green! Opening the module gives the user a live list of past jobs. Each “job” represents a test that is run against your code, which will be shown in more detail later. This pipeline runs two tests against our commit, then deploys the code if both tests are successful. No deployment is made if the trigger was a pull request. A pipeline identical to this one can even be generated automatically!
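The gating logic just described can be sketched in a few lines of shell. The function and variable names here are hypothetical, purely to illustrate the decision CD4PE makes: deploy only when every test passes and the trigger was not a pull request.

```shell
# Sketch of the pipeline's deploy gate (names are illustrative, not CD4PE's).
run_pipeline() {
  trigger="$1"  # "commit" or "pull_request"
  test_a="$2"   # exit status of the first job (0 = pass)
  test_b="$3"   # exit status of the second job (0 = pass)
  if [ "$test_a" -eq 0 ] && [ "$test_b" -eq 0 ] && [ "$trigger" != "pull_request" ]; then
    echo "deploy"
  else
    echo "skip"
  fi
}

run_pipeline commit 0 0        # prints "deploy"
run_pipeline pull_request 0 0  # prints "skip" -- PRs are tested, never deployed
run_pipeline commit 0 1        # prints "skip" -- a failed test blocks deploy
```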
Any pipeline can be manually triggered or deployed. If the deploy environment does not have its own branch, a new one will be created (as of version 2.8.0).
Note: Each branch can get its own unique pipeline
You can either create your own jobs or use one of the several that come built in. The defaults for each job can also be modified here. Nothing too mind-blowing, but notice how clean and easy to use the UI is.
As the screenshot above shows, you can easily customize a job and set commands to execute upon success or failure of a test. The ability to run your tests inside a Docker container is very convenient; just make sure the image you use has the necessary packages installed (in this instance, “pdk”).
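As a rough sketch of what such a job boils down to, the snippet below builds the Docker invocation for running `pdk` against a module checkout. The `puppet/pdk` image name and the `/builds/my_module` path are assumptions for illustration; substitute whatever image and checkout path your jobs actually use.

```shell
# Build (but don't execute) the docker command a CD4PE job might run
# to exercise a Puppet module inside a container.
pdk_job_cmd() {
  # $1 = path to the module checkout, $2 = pdk subcommand to run
  printf 'docker run --rm -v %s:/module -w /module puppet/pdk pdk %s\n' "$1" "$2"
}

# Dry-run: print what the two jobs in our example pipeline would execute.
pdk_job_cmd /builds/my_module validate
pdk_job_cmd /builds/my_module 'test unit'
```

Mounting the checkout with `-v` and setting the working directory with `-w` keeps the container itself stateless, which is why the image only needs the test tooling installed.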
Jobs need hardware to run on, and you can specify which job hardware to use by adding tags. Running a curl command on a node is all it takes to configure it as job hardware and link it to the CD4PE service (similar to another piece of software we all know…).
That covers the basics! Access control can be managed with users and groups, just like in the Puppet Console. Available source control options are GitHub, GitHub Enterprise, GitLab, Bitbucket, and Azure DevOps. You can download the software here to try it for yourself, and there is also a PDK-compatible Puppet module to simplify the required configuration. Hopefully this was an enlightening guide to one of Puppet’s most exciting services, a powerful alternative to many DevOps solutions.