The case for e2e testing grows stronger every day. Development speed keeps increasing, and the best way to keep riding that wave is to use e2e testing for your applications. Developers can push code and be sure that bugs which would usually slip past manual regression testing are caught by the automated e2e scripts. That way we can release more often and be confident that deploying new code will not break the application.
E2e testing is interpreted in many ways, so let's lay down the rules to make sure we are all on the same page for the purpose of this blog post.
End-to-end testing is a testing type where you test your whole application, from "end to end". It ensures that all pieces of the application function and work together as expected. There are many e2e testing frameworks; we will accomplish our goal with TestCafe and Cucumber.js.
To be more specific, we will test a web application end-to-end the way a user would: through the UI, using TestCafe as the testing tool.
TestCafe is a Node.js tool for automated e2e web testing. It is free and open source, easy to set up, and works on all popular platforms.
With TestCafe we can run tests on multiple browsers on multiple platforms, write stable tests in latest JS and TS versions and integrate our tests with CI pipelines (in our case a Jenkins pipeline).
By far the biggest feature is that TestCafe does not use a web driver. This takes a huge load off our operations support, because there is no need to upgrade web drivers for multiple browsers whenever either the testing software or the browser itself is updated. We simply update the browser, and TestCafe keeps working perfectly.
With Cucumber, we will write down our business flows and map those flows on actual test code.
The whole idea of mapping business flows grew out of TDD (Test-Driven Development) and eventually evolved into BDD (Behaviour-Driven Development). BDD frameworks are built around DSLs that use natural-language constructs to express the behaviour and outcomes of a certain piece of software. BDD really shines when the business problem is complex.
The end result of BDD development is the delivery of working, tested software that matters.
Let’s get into the detailed testing setup to understand how Cucumber actually runs the test code.
Cucumber is basically a test runner. It loads the feature files before running the test code, and in that process uses regular expressions to map each feature step to the written test code. Only then can you get proper test results in the end. In the report file, each feature step will have one of four flags set: failed, passed, pending or undefined.
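To make the mapping concrete, here is a simplified, self-contained sketch of the mechanism. This is an illustration of the idea, not Cucumber's actual implementation:

```javascript
// Each regex pattern is registered together with a handler; for a given
// step text, the first matching pattern wins, and its capture groups
// are passed to the handler as arguments.
const steps = [];

function registerStep(pattern, handler) {
  steps.push({ pattern, handler });
}

function runStep(text) {
  for (const { pattern, handler } of steps) {
    const match = pattern.exec(text);
    if (match) return handler(...match.slice(1));
  }
  // Cucumber flags steps with no matching definition as "undefined".
  return 'undefined';
}

registerStep(/^I open the "([^"]+)" page$/, (page) => `opened:${page}`);
```

Calling `runStep('I open the "Blog" page')` would match the registered pattern and pass `"Blog"` to the handler, while a step with no matching pattern would come back flagged as undefined.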
To help understand the Feature file structure check the image below:
Let’s take a look at our feature file that we will use for this demo:
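As a rough illustration of the shape such a file takes (the steps below are invented for this sketch, not the actual Comsysto feature file):

```gherkin
Feature: Blog section
  A visitor can browse the blog and read individual posts

  Scenario: Visitor opens the blog page
    Given I am on the home page
    When I navigate to the "Blog" section
    Then I see a list of blog posts
```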
The feature file describes a part of the Comsysto website. On line 2 we give a short description of the feature. Subsequently, we detail our scenarios, but in a way that is not too complicated: they should be easily readable and comprehensible for both developers and business.
There is much detail, practice and finesse involved in creating good feature files. Use the "three amigos" principle if possible; writing features is all about collaboration!
Now that we have our feature file, let’s take a look at a code snippet that will be executed for the first scenario:
As we run our tests, Cucumber automatically does all the mapping for us.
If you are wondering what the "page" object is: it is a Selector object that we load from our custom library. That way you keep all your page-object selectors in one place and save yourself a bunch of code.
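The page-object idea can be sketched like this. The names are illustrative, and in the real project the entries are TestCafe Selector objects (e.g. `Selector('nav a[href="/blog"]')`); plain strings are used here so the sketch runs without TestCafe installed:

```javascript
// Hypothetical page-object module: every selector for a page lives in
// one place, so step definitions never repeat raw CSS selectors.
const homePage = {
  blogLink: 'nav a[href="/blog"]',
  searchInput: 'input[type="search"]',
};

// Small helper that fails loudly on a typo in a selector name.
function selectorFor(page, name) {
  if (!(name in page)) throw new Error(`Unknown selector: ${name}`);
  return page[name];
}
```

A step definition can then ask for `selectorFor(homePage, 'blogLink')` instead of hard-coding the CSS selector in every test.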
To get more familiar with what happens in the Jenkins pipeline, let’s run the tests directly on our workstation first.
This will not work on Windows; please use macOS or Linux.
First download the repository:
Then install dependencies.
Be sure to have BOTH Chrome and Firefox browser installed
and then run the tests
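The three steps above boil down to roughly the following commands (the repository URL and the exact script name are in the original post and its package.json; placeholders are used here):

```shell
git clone <repository-url>
cd <project-directory>
npm install    # installs TestCafe, Cucumber and helper dependencies
npm test       # script name assumed; check package.json for the actual one
```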
Right after running the command, TestCafe will bring up a new browser window and start executing the tests.
After the testing is finished, you will see a small and not so pretty report in the command line:
The actual error stack trace is located above the summary so you can see in more detail what exactly happened if a certain test failed.
If you have feature files that are missing a certain step definition, Cucumber will generate the code stubs for you, so you only need to implement the test logic and do not have to fiddle around with preparing everything. Try it out by commenting out a Given, When or Then block in the test code file.
Let’s see how it looks on a simple example:
Generated stubs after running Cucumber
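The generated stubs look roughly like the snippet below. The shape follows typical Cucumber.js output; a stand-in `Given()` registrar is included so the sketch runs without Cucumber installed (in the real project it comes from the cucumber module):

```javascript
// Stand-in for Cucumber's Given() so this sketch is self-contained.
const registeredSteps = [];
function Given(pattern, fn) {
  registeredSteps.push({ pattern, fn });
}

// This is the shape of a stub Cucumber generates for an unmatched step:
Given('I am on the home page', function () {
  // Write code here that turns the phrase above into concrete actions
  return 'pending';
});
```

Until you replace the body with real test logic, the step is reported with the pending flag rather than failing the run.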
This project is built here:
so you can take a look at the Pipeline without downloading or installing anything.
We run everything inside Docker, so to run the demo yourself you need to have the Docker daemon running. No Docker? Get it here.
Jenkins is built from the Jenkins LTS Docker image. On top of that, we install Node and npm so we can run JS helper scripts for parsing/updating data and generating HTML reports. Further configuration is done with Groovy: we copy various configuration files, install plugins, set the default user for accessing the Jenkins GUI, and get rid of things like the CSP rules that would block us from viewing our HTML reports.
Jenkins is not triggered by a Git hook; instead, it polls GitHub for new commits. This is not best practice, because every couple of minutes we use resources on our master node to scan for changes, but it lets the project run on any machine without any initial configuration.
Avoid running resource-intensive stuff on your Jenkins master node; use slaves.
Jenkins has this project already built in and ready to execute the e2e tests and produce HTML reports. You can add more job configurations and include them by default here.
Jenkins will be set up with all necessary plugins installed. You can update the list of plugins here.
You can also use this file to update your plugins. Just bump the version in the file and re-build the Docker Image.
To get to the point where Jenkins is set up and running the way we want, we simply install the necessary npm modules
and after that we will build the Docker images:
With the previous command, we actually run 2 npm scripts:
This command builds two containers: one is Jenkins, and the other is the container on which the tests are actually executed (the slave).
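The two scripts run something along these lines (image names and paths are assumptions for illustration, not the repository's actual values):

```shell
docker build -t jenkins-e2e ./docker/jenkins     # the Jenkins master image
docker build -t browsers-e2e ./docker/browsers   # the slave with Chrome and Firefox
```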
As you can see, many useful commands have been saved for later use as npm scripts. Check them out in the package.json file.
The browsers container includes Node and npm as well as two browsers, Chrome and Firefox. When the Jenkins pipeline is triggered, the tests run in both browsers in parallel. That saves time in the long run, because we do not have to wait for one pipeline step to finish before the other can start.
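In a Declarative Pipeline, such a parallel step can be sketched like this (stage names and the test commands are illustrative, not the demo's actual Jenkinsfile):

```groovy
stage('E2E tests') {
    parallel {
        stage('Chrome')  { steps { sh 'npm run test:chrome' } }
        stage('Firefox') { steps { sh 'npm run test:firefox' } }
    }
}
```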
If you implement parallel execution in your systems, be careful with scenarios that change data, for example updating a login password!
Details about the containers can be found in the Dockerfile:
Most of the code is well commented so you can use it as documentation.
This npm script will run the following command:
and start the Jenkins server in the background.
Let's explain this command in more detail, because it was upgraded along the way.
First, we store the current working directory so we can use it to set the correct path for the jenkins_home parameter. The tests take a long time to execute, and by default Jenkins will assume the Cucumber process is hanging and break the build; in our case, we set the heartbeat check option to a custom value to bypass the problem.
We also pass our local Docker socket to the container so that the container itself can spawn new containers on the host machine. Since the Jenkins slave is also a Docker container, this setup comes in quite handy.
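Put together, the command looks roughly like this. The image name and the exact heartbeat system property are assumptions based on the description above, not copied from the repository:

```shell
# Sketch only: persist jenkins_home next to the working directory,
# mount the Docker socket, and relax the durable-task heartbeat check.
WORKDIR=$(pwd)
docker run -d \
  -p 8080:8080 \
  -v "$WORKDIR/jenkins_home:/var/jenkins_home" \
  -v /var/run/docker.sock:/var/run/docker.sock \
  --env JAVA_OPTS="-Dorg.jenkinsci.plugins.workflow.steps.durable_task.DurableTaskStep.HEARTBEAT_CHECK_INTERVAL=3600" \
  jenkins-e2e
```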
If you want to stop the Jenkins container execute the following npm script:
All the changes you may have made through the Jenkins GUI will be preserved (everything stored in the Jenkins home folder).
Access it on http://localhost:8080
with default username admin and password admin.
You can change the default admin password later through the Jenkins admin console or in the docker/jenkins/bin/security.groovy configuration file. Don't forget to rebuild the Docker image after you make a change.
Click on the testcafe-demo pipeline and start the build by clicking Build Now.
As the tests run you can view the details in the Blue Ocean dashboard.
And now for the grand finale, what we have all been waiting for: the test report.
The final report is generated using a .js library. Experiments with injecting custom metadata into the .json report generated by Cucumber went well, so we have more information available before generating the HTML report. There is definitely room for improvement in that direction, such as attaching screenshots or dynamically collecting browser information.
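The metadata injection can be sketched like this. The report shape is simplified and the property names are illustrative; the demo's actual helper scripts may structure things differently:

```javascript
// Cucumber writes one JSON object per feature; before generating the
// HTML report we enrich each feature with extra metadata such as the
// browser and OS the tests ran on.
function addMetadata(report, metadata) {
  return report.map((feature) => ({ ...feature, metadata: { ...metadata } }));
}

const sampleReport = [{ name: 'Blog section', elements: [] }];
const enriched = addMetadata(sampleReport, { browser: 'chrome', os: 'linux' });
```

The HTML report generator can then read `feature.metadata` and display the environment information alongside the test results.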
It is also possible to just use the generic Cucumber HTML plugin in Jenkins, but this report is more eye-catching and includes more data from the start (like the OS and browser on which the tests were executed).
Builds like these can be set as downstream builds, so they trigger after your integration tests have finished executing or after the final build artifact has been deployed to the pre-production environment.
Since this post was first drafted, the Blog page changed and tests that had been passing started to fail, which became a nice little addition to the blog. Did the requirements change, did developers forget to update the tests after a code change, or do we have a bug at hand? You be the judge.
By implementing e2e testing we make our applications more stable and less prone to defects caused by implementing new code.
We have shown that it is possible to implement solid e2e testing practices using TestCafe and Cucumber with automation software like Jenkins. Leveraging Docker gives us more flexibility for orchestrating Jenkins and its slaves, which conserves both resources and time.