April 16, 2007

Performance Testing Lessons Learned

Posted by Ben Simo

Web and client/server load testing can easily become a complex task. Most people I've met got started in load testing with only minimal training in using the test tools. This is how I got started in load testing -- although I had an advantage in that I had been exposed to load testing of communications systems. I also had experience with automated single-user performance testing. I had led some small-scale manual load tests with multiple testers on a conference call hitting the same client-server application at once. (And we found some show-stopping bugs doing that manual testing.) I had watched others perform load tests. I had read numerous load test plans and reports. However, I had never directly participated in executing automated load tests... then I was asked to lead a load testing project.

Through the years, I have made many mistakes designing, scripting, and executing load tests. Load testing easily becomes complex. Tool sales people sometimes tell us that nearly anyone can create tests with their tools. (Yet buying test tools is sometimes just like buying a new car: the salesman tells you that the car is reliable and has a great warranty; then the finance person warns of everything that could go wrong that isn't covered in the warranty and tries to sell you an extended warranty and maintenance contract.) Learning the mechanics of how to use a tool is often the easy part. It's what you do with the tool that matters.

Here is the short list of some of the important performance/load testing lessons I have learned. Some I learned from my own experience. Some I learned from the failures of others.

  • Bad assumptions waste time and effort
    • Ask questions
    • Performance testing is often exploratory
    • Expect surprises
    • Prepare to adapt
  • Get to know the people in your neighborhood: no single person or group is likely to have all the required information
    • Subject-matter experts
    • Developers
    • System administrators
    • Database administrators
    • Network engineers
  • Don’t script too much too soon: you may end up tossing out much of what you script
    • Applications change
    • Real usage is difficult to estimate
    • Tool limitations may be discovered
  • Different processes have different impacts: what users do can be as important as, or more important than, how many users are doing it
    • Include high-use processes (80/20 rule)
    • Include high-risk processes
    • Include “different” processes
  • Modularize scripts: simplify script maintenance -- but only when you intend to run the script again
  • Data randomization is not always a good thing: randomization can make result comparison difficult
  • Code error detection and handling
    • Don’t assume that your tool will handle errors
    • Catch and report errors when and where they happen
    • Realize that errors may change simulated user activity
  • Know your tools and test environment
    • Tool’s supported protocols and licensing
    • Load generator and network resource usage
    • Load balancing and caching mechanisms
    • Beware of test and production environment differences
  • Try to translate results into stories that matter to the applicable stakeholders
    • Tests are run to answer questions: don't overwhelm your audience with hundreds of numbers if they just want a "yes" or "no" answer
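A few of these lessons can be shown together in code. The sketch below is a minimal Python illustration, not a real tool script: the process names, weights, and the `run_process` stub are all hypothetical stand-ins for scripted transactions. It shows a weighted workload mix (the 80/20 rule, with high-risk and "different" processes still included), a seeded random generator so runs are repeatable and results remain comparable, and per-request error detection that records what failed, where, rather than assuming the tool will notice.

```python
import random

# Hypothetical workload mix. Weights follow the 80/20 rule: a few
# high-use processes dominate, but high-risk and "different" processes
# are still represented.
PROCESS_WEIGHTS = {
    "search": 50,       # high-use
    "view_item": 30,    # high-use
    "checkout": 15,     # high-risk: exercises payment code
    "admin_report": 5,  # "different": heavy query, rarely run
}

def pick_process(rng):
    """Choose the next process for a simulated user by weight."""
    names = list(PROCESS_WEIGHTS)
    weights = [PROCESS_WEIGHTS[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

def run_process(name):
    """Stand-in for real scripted requests; raises to simulate a failure."""
    if name == "checkout":
        raise RuntimeError("HTTP 500 from payment service")  # simulated
    return "ok"

def simulate_user(iterations, seed=0):
    """Run one virtual user, catching and recording errors per request.

    The fixed seed makes the run repeatable, so results from different
    builds can be compared despite the randomized workload.
    """
    rng = random.Random(seed)
    results = {"ok": 0, "errors": []}
    for i in range(iterations):
        name = pick_process(rng)
        try:
            run_process(name)
            results["ok"] += 1
        except Exception as exc:
            # Report the error when and where it happens, with context:
            # which iteration, which process, and what went wrong.
            results["errors"].append((i, name, str(exc)))
    return results
```

In a real test, `run_process` would be the tool's scripted transactions, and the recorded errors would feed the report; note that once a process fails, a real user might abandon the session, so errors may change the simulated activity that follows.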


and finally…


  • Most performance and load-related problems are due to software code or configuration, not hardware

    • Don’t throw more hardware at a software problem


          4 Comments:

          April 28, 2007  
          Shrini Kulkarni wrote:

          Would it not be better to use
          "browser based and non browser based" instead of "Web and client/server .."
          as Web apps are also client server based apps. It would be like saying "Eating fruits and apples can be ..."

          I have seen it as a common practice to use web and client/server as two distinct sets of apps. In a strict technical sense, they are not. You might also use "Windows thick client based apps vs Web browser based apps"

          Let us practice to be as precise as possible ...

          Shrini

          April 29, 2007  
          Ben Simo wrote:

          Shrini,

          I agree that web applications are also client/server applications. I used those terms as distinct sets of applications simply because that's how the terms are usually used. I prefer your more precise terms.

          Web applications are becoming thicker with Javascript (especially AJAX implementations) and are therefore more like traditional Windows thick "client/server" applications than they have ever been in the past.

          Thanks,

          Ben

          June 11, 2007  
          Rahul Verma wrote:

          Hi Ben,

          Your post reminded me of some of the not-so-good experiences I had with performance testing. You have listed some good points which a performance tester should take care of.

          You have inspired me to pen down my own experiences with performance testing. I will work in that direction.

          Regards,
          Rahul Verma

          May 16, 2012  
          LoadRunner Training wrote:

          The last one is good "Don’t throw more hardware at a software problem" use this as short cut to resolve performance issues. Very good post..


          Jaipal
          http://iloveloadrunner.com