
How Jonar Shortened Their Repetitive UI Testing by 80% and Increased Their Test Coverage


Over the course of one year:

  • Lines of Selenium code
  • 60K+ test elements executed
  • ~$500K saved in test automation costs
  • ~$200K saved in test maintenance costs

Highlight:

Jonar wanted to shorten their UI testing to keep up with their company’s two-week sprints. The QA team did not have coding skills and simple recording tools were not an option, as their test scenarios were long, end-to-end processes, with frequent updates. They chose TestCraft as their codeless UI test automation platform for simple and fast automated test creation and execution.

  • Jonar reduced the time spent on repetitive regression testing in each two-week sprint by 80%, freeing up time for their QA team to tackle a wider range of tests in greater depth.
  • Jonar detected and resolved 200 issues at an early stage of the software development cycle. Fixing these issues earlier saved them the extra time and cost of mitigating issues at a later stage of development.
  • Due to TestCraft’s self-healing mechanism, Jonar reported high test resiliency, with tests rarely breaking (less than 5% of the time) when there was no real issue.
  • Using TestCraft exposed technical debt that we didn’t even realize we had. Finding and fixing critical bugs early in the development process improved our code quality tremendously and better prepared us for scale.

    Jon Ruby CEO of Jonar

What Jonar Was Looking for in a QA Automation Tool

For many companies, ERP software can seem complicated, intimidating and downright scary. Jonar, a Canada-based ISV, created ParagonERP in order to combat this bad reputation. Their cloud-based ERP solution offers all the features and functionality needed for a robust ERP system, while also offering a simple setup, ease-of-use, and infinite customization abilities.

While the Jonar team conducts a wide range of tests on a continual basis, UI testing was still proving to be a major bottleneck. For each two-week sprint, it would take their two-person QA team 16 hours on average to do UI testing manually. Not only was their UI testing very time-consuming, but Jonar also found that the tests were not in-depth enough to provide the highest quality feedback. Conducting manual UI tests was wasting valuable time and resources.

In order to shorten this process, Jonar searched for an automated UI testing platform for their QA testers. In their search for this tool, Jonar identified three needs:

  • Easy interface. Previous test automation tools that Jonar tested were cumbersome, making test creation overly tedious. Their test automation tool of choice would need an exceptional UI to help them create and execute their tests in a timely fashion.
  • Simple test replication. Instead of re-recording test scenarios, Jonar was looking for a tool where they could replicate test scenarios easily to use them in future test flows.
  • Ability to read off of external data files. Jonar’s tests routinely included repeatable tests with different data sets from an external source. Other tools that they considered did not have this must-have capability.
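The third requirement above, data-driven testing from an external file, can be illustrated with a minimal sketch. Everything here is invented for illustration (the `place_order` stand-in, the column names): the point is simply that one recorded flow is replayed once per row of external data.

```python
import csv
import io

# Hypothetical sketch (names like `place_order` are invented): the same
# recorded test steps replayed once per row of an external data file,
# which is the data-driven capability Jonar required.
def place_order(sku, qty):
    # Stand-in for the recorded UI flow; a real run would drive the browser.
    return {"sku": sku, "qty": int(qty), "status": "ok"}

def run_data_driven(csv_text):
    # Execute the flow once per data row, collecting one result per row.
    return [place_order(row["sku"], row["qty"])
            for row in csv.DictReader(io.StringIO(csv_text))]

orders = run_data_driven("sku,qty\nA-100,2\nB-200,5\n")
```

In a codeless platform the same idea is configured rather than coded: the tester points a test at a spreadsheet and the platform supplies one run per row.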

Why Codeless Test Automation Rather Than Coding Using Selenium?

In their initial search for an automated UI testing platform, Jonar’s development team proposed using Selenium for the task. Yet Jonar’s CEO, Jon Ruby, wanted the QA team to manage the company’s test automation efforts, and the QA team did not have the coding skills necessary to work extensively with Selenium.

Unlike other agile companies, Jonar’s QA team and development team work separately when it comes to software testing. While the teams interact with each other on many other tasks and communicate before and during releases, the various testing done at Jonar (e.g. unit testing) falls under the sole charge of QA. When it came to UI testing, Ruby concluded that it didn’t make sense process-wise to delegate this task to the development team. If developers were to use Selenium, they would ultimately be the ones to validate their own tests due to their coding knowledge. Ruby described the inherent bias that would come out of such an arrangement as having “inmates running the insane asylum.”

  • We wanted our QA team to take over the test automation process. Since QA is the team that focuses primarily on user experience, they should be in charge of creating and executing UI tests as well. Delegating this to the development team would be like having ‘inmates running the insane asylum.’

    Jon Ruby CEO of Jonar

Since QA is the team that focuses primarily on user experience, Ruby felt strongly that QA should be in charge of creating and executing UI tests as well. They would be best equipped to check the development team’s work without bias and ultimately ensure that ParagonERP was moving in the right direction. Therefore, in order to harness the power of the QA team without recruiting test engineers, Jonar opted for codeless test automation.

Faster Test Creation with Visual Test Modeling

Jonar quickly found that TestCraft had the ease of use needed for faster test creation alongside the depth of functionality needed for the highly complex level of their tests. Using TestCraft’s unique visual test modeling, Jonar was able to build up a library of modular pieces that they could add to multiple test flows. They were also able to use data through an external file.

Reusing and merging components for future tests was a major time-saver when building new, large tests and modifying existing ones. Jonar would only need to apply a change once, and it would automatically propagate to every other test that used that component. Having the building blocks for tests readily available and easily customizable made building large tests significantly easier, while shortening test creation time to a matter of minutes.
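The change-once-apply-everywhere behavior described above can be sketched in a few lines. This is not TestCraft’s implementation, just an illustration under assumed names: flows store references to shared components and resolve them at run time, so editing a component updates every flow that uses it.

```python
# Sketch of component reuse (all names are invented): steps are stored once
# under a component name; flows reference components, so a single edit to a
# component is picked up by every flow that uses it.
components = {
    "login":    ["open /login", "type username", "type password", "click submit"],
    "checkout": ["open /cart", "click pay"],
}

def build_flow(*names):
    # A flow is only a list of component names, resolved at run time.
    return list(names)

def steps_for(flow):
    # Expand component names into their current steps.
    return [step for name in flow for step in components[name]]

smoke = build_flow("login", "checkout")
components["login"].append("assert dashboard visible")  # change once...
# ...and steps_for(smoke) now includes the new step automatically.
```

The design choice that matters is late binding: because a flow holds names rather than copies of the steps, there is no stale duplicate to maintain.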

Example of a TestCraft test scenario, using reusable components


By replacing their manual test routine with an automated one, Jonar reduced their 16 hours of testing during a two-week sprint to just 3 hours. Reducing their automated UI testing time also created a positive ripple effect. By using TestCraft to automate simple, yet time-consuming test flows, Jonar’s QA team could start focusing more time and energy on doing the complex test scenarios. During each two-week sprint, Jonar’s QA team could now dedicate more resources to tests that require human attention, such as exploratory tests and break tests. By freeing up more time for QA to tackle a wider range of tests on a more in-depth level, TestCraft helped enhance Jonar’s product quality overall.

 

Jonar's TestCraft UI testing dashboard


In addition to increasing their test coverage and testing more frequently, Jonar was also able to do this without hiring any additional personnel to conduct UI tests or focus on complex scenarios. The ability to produce high-quality tests while maintaining their existing QA team made a major financial impact, saving Jonar roughly $20,000 in staff time every two weeks. These cost savings allowed Jonar the flexibility to better allocate their resources for future company needs.

Faster Issue Mitigation and Exposing of Technical Debt

Another major benefit of TestCraft’s unique visual modeling was Jonar’s ability to find bugs more quickly. Since Jonar could test more frequently, they were able to find bugs earlier in the development process.

Over the course of a year, Jonar found roughly 200 defects that they would not have found without TestCraft. Of these, 50% were either critical or would have had a negative effect on end users. Fixing these issues earlier saved Jonar the extra time and cost of mitigating them at a later stage of development.

Chart showing the increased relative cost of fixing bugs at later stages in development.
Source: Agile Modeling

TestCraft also helped Jonar expose technical debt that they didn’t realize they had. Some of the bugs they found were symptomatic of larger issues that applied to multiple cases.

High Test Resiliency Due to AI Technology

Due to TestCraft’s AI-based, self-healing technology, the Jonar team reported over 4,300 elements that were auto-fixed throughout the year, with minimal instances of test breakage. This translated into major test maintenance time and cost savings for the company (see below). This high test resiliency, along with fixing critical bugs, allowed Jonar to better prepare for scale.
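The self-healing behavior described above can be approximated with a simple fallback strategy. This is a hypothetical sketch, not TestCraft’s actual algorithm: the runner remembers several identifying attributes per element and, when the primary locator stops matching, tries the alternates before declaring a failure.

```python
# Minimal sketch of the self-healing idea (simplified, invented names): each
# element is known by several attributes; when the primary locator no longer
# matches, the runner falls back to the next one instead of failing the test.
def find_element(dom, locators):
    # Try each (strategy, value) pair in priority order.
    for strategy, value in locators:
        for el in dom:
            if el.get(strategy) == value:
                return el
    return None

# After a release renamed the button's id, the id locator fails but the
# text fallback still finds it, so the test keeps running.
dom = [{"id": "btn-save-v2", "text": "Save", "css": ".save"}]
button = find_element(dom, [("id", "btn-save"), ("text", "Save")])
```

Production self-healing systems typically go further, using scoring or machine learning across many attributes, but the fallback chain conveys why locator churn need not break a test.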

Case Study Savings Stats

Continuous Feature Improvement and Test Automation Support

While going through this change in mindset, Jonar benefited from the constant innovation they saw on the TestCraft platform. Jonar especially appreciated TestCraft’s ability to merge multiple specs into a single spec when creating subsequent tests. This and other frequent usability improvements made Jonar’s test creation more intuitive and simplified their testing process. As they developed these testing best practices, Jonar felt better equipped with the resources they needed to test properly.

  • With TestCraft, we are treated like a partner instead of a customer. That is a huge plus.

    Ami Cumberbatch Senior Manager of DevOps at Jonar

In addition to these developments, Jonar built a strong relationship with the TestCraft team, reaching out regularly for support and suggestions. Throughout their work with TestCraft, Jonar was in touch with the team to discuss everything from test creation details to determining overall ROI from test automation. This partnership has been essential to Jonar’s continued work with TestCraft, giving them a space to keep developing and implementing test automation best practices.

Jonar’s Software Testing “Aha” Moment

While Jonar was ultimately looking for a codeless test automation tool to speed up their UI testing, they ended up getting more during the first few months of implementing TestCraft. Working with TestCraft prompted Jonar to reflect on their software testing processes, and after nine months they underwent a major shift in mindset that improved their test automation efforts tremendously. The shift came from addressing two questions: how to automate most effectively, and how to measure test automation success.

What is the Best Way to Approach Test Automation?

When Jonar started getting into UI test automation, they began by immediately diving into their larger, more complex processes. Yet they found that without first establishing a foundation of simpler base test cases, it was harder to delve into the complicated test scenarios. Using TestCraft, Jonar was empowered to think more deeply about their internal processes and plan more efficiently for how to organize their test automation efforts.

After gaining a better understanding of their own internal processes, as well as QA processes in general, Jonar settled on a “go wide, then deep” approach to test automation. Starting with simple but annoying, repetitive tests laid the groundwork for using the library of test models they created within TestCraft on more complex scenarios down the road. This made their testing more organized and effective.

Determining Test Automation Success

This introspection also affected how Jonar determined whether or not their test automation was successful. Originally, Jonar would declare a test successful only if the entire flow ran perfectly. While this may make sense on the surface, Ruby discovered that this thinking was inherently flawed, as it framed finding bugs as a negative rather than a positive.

Using TestCraft, Ruby realized that finding bugs that caused tests to fail was actually a measure of success, not failure. The fact that tests did not initially run smoothly meant the QA team was catching many errors on the development side, exposing issues they could not detect manually before because testers would simply work around them. Now that Jonar’s tests run smoothly, Ruby is confident that their code quality and ability to scale have improved dramatically. Jonar also has a safety net: if they add something new and the tests break, their team will know exactly what the issue is and can fix it quickly.


Putting Customers First with Automated UI Testing

Determined to make their customers their highest priority, Jonar continues to ensure that their UI is providing the best user experience possible. With the help of TestCraft, Jonar established a strong automated UI testing foundation that has set them up for future growth.

Free Trial

Please register here and start your free trial