Thirteen lessons learned in Test Automation – part 2

June 30, 2020

We all know the benefits of test automation, but can test automation really advance your testing mission? Do all test automation projects succeed in adding the desired value? The reality is that some test automation efforts, unfortunately, fail soon after they are initiated due to various pitfalls.

I have faced various challenges along my testing journey and experienced different situations: some successful, others that taught me, sometimes the hard way, how to avoid extensive trial and error and how to proceed when embarking on the promise of test automation. You can find part one here.

Here is part 2 of the 13 “lucky” lessons that I’ve learned from my experience.

 

8. Less is more

“Sometimes, doing less is more than enough.”

Context: We integrated our test automation into CI, primarily to get transparency and, above all, fast feedback. Each newly written test was automatically added to the suite. The test suite kept growing and growing and became hard to follow. Eventually, we realized that a test automation run lasting multiple hours is not fast feedback. We were able to mitigate this by running the tests in parallel; however, the feedback loop was still not as short as we wanted.

We realized that we did not need all these tests to reach our goal, so we defined a new approach. We reviewed our tests and chose which of them would be part of the daily CI run and which would run less frequently. We created separate execution pipelines for specific test types or specific functionality: smoke tests ran daily, regression ran weekly, and functionality-based tests ran only when changes were made within that particular area.

I’ve learned that less is more: we do not need everything to run at the same time. By grouping our tests into smaller pieces we can still reach our goal, with less coverage on daily runs. Just because a test exists doesn’t mean it needs to run all the time.
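A split like this can be sketched with test markers; here is a minimal example using pytest. The marker names, test names, and CI commands below are hypothetical illustrations, not taken from the project described:

```python
# Minimal sketch: grouping tests into pipeline subsets with pytest markers.
# Marker names ("smoke", "regression") and the tests are hypothetical.
import pytest


@pytest.mark.smoke
def test_login_page_loads():
    assert True  # placeholder for a fast, critical-path check


@pytest.mark.regression
def test_full_checkout_flow():
    assert True  # placeholder for a slower, broader check


# Each CI pipeline then selects a subset instead of running everything:
#   daily job:   pytest -m smoke
#   weekly job:  pytest -m regression
```

In a real project, the markers would also be registered in `pytest.ini` so that typos in marker names raise warnings instead of silently deselecting tests.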

9. Don’t reinvent the wheel

Just realign and use it better

Context: I had a colleague who was very passionate about technology and started building his own tool for testing the REST API calls in the project. He spent considerable time constructing the tool: different classes for HTTP requests, handling of header types, result parsing, and various utility methods to keep it clean. After a while, the application changed, and he needed to update quite a few parts of the tool. He found himself maintaining not only the written tests but also the tool itself. As it was all very complex, it was hard for other colleagues to use. It cost him a lot of time and, sometimes, frustration.

To make things less complex and time-consuming, you can select a well-constructed open-source tool from the market and adjust your framework to your needs, without investing a lot of time in designing your own. There are plenty of tools and libraries that can support your automation without you spending the time to build your own and reinvent the wheel.

I’ve learned that an open-source tool is a real asset that you can use and realign to fit your context, and you can successfully achieve your goal.
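As an illustration, a well-maintained open-source library such as Python’s requests already handles connections, headers, status codes, and JSON parsing, so a REST check can stay a few lines long. The base URL and the expected `id`/`name` fields below are hypothetical placeholders:

```python
# Sketch of an API check built on the open-source "requests" library instead
# of a homegrown HTTP tool. The endpoint shape and expected fields are
# hypothetical.
import requests


def get_user(base_url: str, user_id: int) -> dict:
    # requests handles the connection and headers for us
    resp = requests.get(f"{base_url}/users/{user_id}", timeout=10)
    resp.raise_for_status()  # fail fast on 4xx/5xx, no hand-rolled status logic
    return resp.json()       # built-in JSON parsing, no custom parser needed


def is_valid_user(payload: dict) -> bool:
    # A small, reusable assertion helper, kept separate from the HTTP plumbing
    return isinstance(payload.get("id"), int) and bool(payload.get("name"))
```

Keeping the validation helper separate from the HTTP call means the interesting part of the check can be unit-tested without any network access at all.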

10. Have instrumented code

Make your work easier

Context: I was writing UI tests on a project developed 8 years ago. The HTML elements did not have any IDs, names, or clear attributes that I could use to identify the web elements. My XPaths were long chains of nodes and axes: very hard to maintain and, sometimes, very hard to understand. It was tough to continue like that. Talking with my developer colleagues, we reached the conclusion that we needed to instrument the code a little and add new attributes to the web elements. So we did! Not for the entire application, due to its complexity and size, but in all newly developed areas. It worked like a charm!

I’ve learned that each new UI product should be designed with automation in mind from the start, so that the UI can be easily instrumented. You cannot do that thoroughly for existing legacy products.

I’ve learned that not having a clear UI can be quite a pain for test automation, and having an instrumented code, with clear attributes for each web element, helps a lot.
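To make the contrast concrete, here is a small sketch. The `data-testid` attribute name is a common instrumentation convention, and both locators are hypothetical, not taken from the project described:

```python
# Before instrumentation: a position-based XPath that breaks whenever the
# page layout shifts (hypothetical example).
BRITTLE_XPATH = "/html/body/div[2]/div[1]/form/div[3]/button[1]"


# After instrumentation: markup like <button data-testid="login-submit">
# lets one short, stable selector do the job.
def by_testid(test_id: str) -> str:
    """Build a CSS selector from an instrumented data-testid attribute."""
    return f'[data-testid="{test_id}"]'


# With Selenium this would be used roughly as:
#   driver.find_element(By.CSS_SELECTOR, by_testid("login-submit"))
```

The instrumented selector survives layout changes because it depends on one attribute the team controls, not on the position of every ancestor node.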

11. Maintainability is king

“Make it work. Make it right. Make it fast.” – Kent Beck

Context: I was the new tester on a large project, and my role there was to write test automation. Yeah! So I started to analyze the existing solution. I was amazed to find a linear framework structure, elements identified by absolute paths, duplicated code, hardcoded test data, a lack of modularity, and so on. The result was that 40–50% of the tests were failing due to bad automation code. We managed to fix around 10% of the tests, only to find them failing in another area. It was pointless. We had only one solution: to write a reusable and maintainable framework. So we did. After the framework was in place, design time increased by 40%, while maintenance time dropped from 60% to 10%.

From my perspective, we cannot call it automation if there is no well-structured, code-based framework in place. We cannot get results in the long run without designing a framework around the main test automation standards: the Page Object Model, modular design, factory patterns, element extensions, separate layers, test data management, error handling, and so on.

I’ve learned that a test framework helps us achieve the desired results for our automation project in a systematic way, at the cost of more design time but with far better maintainability. If you do not have a framework in place … you’re lost.
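As a sketch of the core of such a framework, here is a minimal Page Object (all page, locator, and field names are hypothetical). The page object owns its locators and actions, so when the UI changes, the fix lives in one class instead of in every test:

```python
# Minimal Page Object Model sketch; locator values and names are hypothetical.
# Tests talk to the page object, never to raw selectors, so a UI change is
# fixed in exactly one place.


class LoginPage:
    # Locators are defined once, in the page object that owns them
    USERNAME = ("css selector", "[data-testid='username']")
    PASSWORD = ("css selector", "[data-testid='password']")
    SUBMIT = ("css selector", "[data-testid='login-submit']")

    def __init__(self, driver):
        # "driver" is any object exposing find_element(strategy, locator),
        # e.g. a Selenium WebDriver
        self.driver = driver

    def login(self, user: str, password: str) -> None:
        # One readable action that hides all element handling from the tests
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()
```

A test then reads `LoginPage(driver).login("alice", "secret")` and never mentions a selector; because the driver is injected, the page object can even be exercised with a fake driver in unit tests.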

12. What is written needs to be maintained

“The time to repair the roof is when the sun is shining.”

Context: We write and write new tests, and after that … we write new ones, or we plan to. Most of the time, we forget to include the time that is also needed to maintain the tests. For any change in the application, the associated automated tests also need adjustments. And so we find ourselves spending more time maintaining existing tests than writing new ones or, worse, excluding them from the execution suite. Not accounting for maintenance effort at the beginning can lead to lower test coverage or buggy tests.

I’ve learned that every test that we automate is one more test that we need to maintain. I’ve learned that we need to include maintenance in our day-to-day activity and include maintenance time when we estimate our automation effort.

13. Treat your automation code as production code

Automation is not an “Instant Fix.”

Context: Over my years of experience, I’ve heard about a lot of different situations in the testing community. Stories such as: “We do not have time to maintain the tests, so we are just going to put that on hold” (and it stays on hold), “We can take a junior developer/tester and have him/her do automation for one month, and that’s it” (with no preparation), “Just do some automated tests so we have some”, “Test automation review? Why?” Do any of these situations sound familiar?

Test automation is designed to check code that is going into production. How can we trust the check if it is not well written and up to date? If we have a bug in the application, we first fix the bug and then push the fix. We need to do the same with test automation: keep well-written, up-to-date tests that check living, breathing production code.

I’ve learned that we need to hold test automation to the same quality attributes as production code. I’ve learned that we need people with software development skills, and an appetite for testing, to write automated tests. The automation mindset is essential.

These are the remaining test automation lessons I’ve learned throughout my test automation journey. Now that you have all 13 lessons, what experiences in test automation led to your own lessons-learned list? I am looking forward to hearing your stories!

Codruta Bunea
Test Lead and Expert
codruta.bunea @ tss-yonder .com


