Reporting performance results: a how-to.

This is a quick and simple page highlighting one issue: I have seen too many performance test reports hacked together from screenshots, graphs, and tables, then thrown out there with no context and no explanation. This needs to be addressed.

  1. For external reports, give some context: who you are, what your roles and responsibilities are, and what your contact details are.
  2. If any of your audience are external to the project at hand, or if you are external, provide a project overview.
  3. Always give a management summary of the test results.
  4. Detailed test results (graphs, tables, etc.) should be provided to back up any salient points in the summary.
  5. More in-depth results can be provided, depending on the target audience.
  6. Be aware of commercial confidentiality if any of the audience is external.
  7. Consider general readability: if you do need to include large amounts of data, for example, consider putting them in an appendix.

Point 3 above, the management summary, is the main requirement of any report.


Below is a selection of summaries from different projects at various stages of development. The main point is that most of the audience only needs to read this summary to get the full picture. I would usually also paste the summary directly into the email, so your audience doesn't even have to click through to the report:


You get the idea...


In the Cartesian Elements Ltd group of companies