
Performance Testing Planning


Before starting performance testing activities, a detailed plan should be created that defines how performance testing will be done from both a technical and a business perspective. This article describes the performance test planning steps, with an explanation of each.

Objective of Performance Testing 

To determine the performance/speed/effectiveness of the application under the expected workload. The main goal of performance testing is to check a software program's:

Speed – Identify whether the application responds quickly.

Scalability – To determine the maximum user load the application can handle.

Stability – To find the stability of the application under varying loads.

Performance Test Planning Steps: 

   1. Performance Test Requirements: 

  • Why is performance testing needed? This requires confirmation from the client/dev team: are there any enhancements or changes in the code, environment setup, or hardware setup?

Example:

  • Enhancements were made to the homepage.
  • The XYZ JVM memory size was increased/decreased.
  • Enhancements were made to the backend through web services.
  • Test Environment: Confirmation is required that the PT environment has been set up properly for performance testing.
  • Identify the critical business scenarios that need to be tested in PT.

Example: We need to find which business scenarios clients use under peak load.

  • Scenario 1: Purchase Order
  • Scenario 2: Search Products
  • Scenario 3: Payment Action
  • Test Data Setup (Do we have the test IDs for our PT?)

Example:

  • For Scenario 1 & 2, we need login credentials.
  • For Scenario 3, dummy credit card details are required.
  • Test Volume/Workload
    • Identify the test volumes to be achieved in testing based on production data.
    • If the application is going live for the first time, the test volume can be approximated from client requirements/expectations.
  • Test Executions

Example: What are the types of testing needed?

  • Load Test (2 Hrs)
  • Ramp-Up Test (Ex: for the first 1 Hr, 100 Vusers access the business scenario; then every 20 or 30 mins the Vuser count increases by 25% until the 2 Hr or 3 Hr mark).
  • Endurance Test (to identify whether the application remains stable over a continuous period, such as business hours from 10 to 5).
  • Baseline Test (to capture the application's behaviour once the code is first ready).
  • Servers and databases involved in testing (get the details of the servers and databases involved in the performance testing; this is helpful when analysing the results).

Example:

  •   ABC Server
  •   DB001 Database
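The ramp-up test described above can be sketched numerically. The following Python snippet is an illustration only (not tool-specific; the function name and defaults are my own), assuming 100 Vusers for the first hour and a 25% increase every 30 minutes until the 3-hour mark:

```python
# Illustrative sketch (not LoadRunner-specific) of the ramp-up test:
# hold 100 Vusers for the first hour, then increase the Vuser count by 25%
# every 30 minutes until the 3-hour mark.

def rampup_schedule(initial_vusers=100, step_mins=30, step_pct=25, total_mins=180):
    """Return (minute, vuser_count) checkpoints for the ramp-up test."""
    schedule = [(0, initial_vusers)]
    vusers = initial_vusers
    minute = 60  # ramping starts after the first steady hour
    while minute <= total_mins:
        vusers = round(vusers * (1 + step_pct / 100))
        schedule.append((minute, vusers))
        minute += step_mins
    return schedule

print(rampup_schedule())
# [(0, 100), (60, 125), (90, 156), (120, 195), (150, 244), (180, 305)]
```

A table like this is worth preparing up front, since the Controller scenario schedule has to be configured with these checkpoints.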

2. PT Shakedown:

Check the application manually to make sure the business scenarios work correctly in the performance testing/QA/UAT environment after the environment setup is completed.

3. PT Scripting (Using a Performance Testing Tool):

Here, assume we have selected LoadRunner as the performance testing tool for scripting.

  • Identify the protocol for your application using the Protocol Advisor in LoadRunner and proceed with scripting. Script the business scenarios with one user, and make sure the script works for multiple users with your enhancements.
  • Run each script with one Vuser in the Controller/Performance Center to identify the average response time (this is helpful for calculating the pacing value needed to achieve the desired load).
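The single-Vuser response time feeds directly into the pacing calculation. As a rough sketch (the formula is the commonly used one; the function and variable names are my own, not LoadRunner API):

```python
# Pacing: the interval at which each Vuser starts a new iteration so that the
# target transaction volume is reached within the test duration.

def pacing_seconds(vusers, target_iterations, duration_secs):
    """Seconds between iteration starts for each Vuser."""
    return vusers * duration_secs / target_iterations

# Example: 50 Vusers must complete 3600 iterations in a 2-hour (7200 s) test.
p = pacing_seconds(vusers=50, target_iterations=3600, duration_secs=7200)
print(p)  # 100.0 -> each Vuser starts an iteration every 100 seconds

# Sanity check: the pacing must exceed the measured average response time
# (say 6 s from the single-Vuser run), otherwise the load is unachievable
# with this Vuser count and the count must be increased.
avg_response = 6.0
assert p > avg_response
```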

The following standards/enhancements should be applied effectively (they will change based on client requirements):

  • Remove unwanted cookies from the script.
  • Add transaction names based on the steps/requests so that the response time of each transaction is reported properly.
  • Do parameterization (the values the user enters) and correlation (values returned by the server) properly.
  • Add the required headers.
  • Insert think time (a real user will spend a few seconds on every step).
  • Write functions/logic to reduce the code size (if applicable).
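Correlation can be illustrated outside the tool. The sketch below is plain Python that only mimics the spirit of LoadRunner's boundary-based capture (as in web_reg_save_param's left/right boundaries); the response body and token are invented for the example:

```python
import re

# Hypothetical server response containing a dynamic token that must be
# correlated: captured from the response at runtime, then replayed in
# subsequent requests.
response = '<input type="hidden" name="csrf_token" value="a1b2c3d4"/>'

def correlate(body, left, right):
    """Capture whatever sits between two fixed boundary strings."""
    m = re.search(re.escape(left) + r'(.*?)' + re.escape(right), body)
    return m.group(1) if m else None

token = correlate(response, 'value="', '"/>')
print(token)  # a1b2c3d4

# Parameterized values (what the user enters) come from a data table instead:
credentials = [("user1", "pass1"), ("user2", "pass2")]  # dummy test accounts
```

The distinction in the bullet above is exactly this: parameterized values are supplied by the tester, while correlated values must be captured fresh from each server response.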

4. PT Test Execution (Using Controller/Performance Center):

As per the requirements, create the load test plan using the Controller/Performance Center:

Step 1: Create a new test plan (provide the test plan name, e.g., TestPlan_1).

Step 2: Pull the scripts to be tested into the created test plan.

Step 3: Set up the scenario based on requirements (Ex: set a 1 Hr/2 Hr test duration and the pacing value needed to achieve the volume).

Step 4: Assign Load Generators based on the virtual user count in your test scenario.

Step 5: Run the scenario to proceed with the test execution.

5. PT Report:

Once the expected tests are completed, we need to analyse the results and identify the bottlenecks. (Once the issues are fixed, the test executions can be re-run.)

Example bottleneck checks:

  • Whether all transaction response times meet the SLAs (e.g., with an acceptable PT response time of 5 secs or below, if a transaction takes more than 5 secs we need to identify the cause).
  • Compare the transaction response times with previous releases (if applicable).
  • Whether the hits per second and throughput have met expectations (Ex: we may send many requests to the server (hits per second) but receive fewer responses (throughput) because of a connection pool issue).
  • Memory leaks (whether GC shows a healthy periodic saw-tooth pattern for all JVMs involved in PT).
  • CPU utilization and memory utilization.
  • Verify that the indexes and views are working fine.
  • Identify any queries that take too long (this issue can lead to high response times).

6. PT Result Review Meeting:

Share the results with the stakeholders and schedule a meeting to briefly explain the test results and provide performance improvement suggestions (about response times, code tuning, or memory size).

7. PT Signoff:

After the review meeting with the stakeholders/dev team, we can send the signoff approval or conditional approval for the project.

Note: If an issue couldn't be replicated in the current release/sprint, and it won't affect production or may be an environmental issue, then we can give conditional approval and try to reproduce it in the next release/sprint.

Conclusion: 

All of the metrics above should be captured in the performance test results and provided to the client, so the application can be optimized for quick response and cost can be saved by fixing issues before the application goes to production.