Issues identified during a test run are stored in a Report. You can access all of a Project's reports on the Test Assets page by clicking the Project's View Report button. Clicking the red Findings button discussed earlier takes you straight to the most recent report belonging to the current test run.
The picture below shows an example report of a test run based on this tutorial. The Summary tab of the report contains general information about the test run, including an expandable summary of passed/failed tests and trend charts. Note that the information on this tab is finalized only after the test has completed; it may be incomplete while the test is running.
The Findings tab, shown below, lists all the findings (also called Observations). The table contains high-level information about each finding, such as:
- The Message and Fields being tested at the time
- The name of the detection mechanism that picked up the anomaly
- Whether the issue could be reproduced or not (requires the Analysis feature to be enabled in the Project configuration)
Clicking on a finding reveals more information about it. Each finding is presented on three tabs, as shown in the screenshot below:
- Observation: This tab contains observations about the unexpected behaviour detected. The information presented here, and its format, vary based on which detection mechanism picked up the issue.
- Test Cases: This tab shows a list of messages sent (and in certain cases messages received) right before the issue was detected. More on this a bit later.
- Replays: Findings can be re-tested by clicking the Reproduce button in the top-left corner of the screen. If the issue is still present at the time of the re-test, it will be reported under the Replays tab.
Since we used the GDB monitor in this tutorial, it is no surprise that the Observation tab, shown below, contains information obtained from the GDB server.
If you look at the first line of the assembly instructions and then check the rax register's value by scrolling a bit further down, you can see that we have identified a NULL pointer dereference bug.
As mentioned earlier, the Test Cases tab provides a list of messages that, at the time, could be used to trigger the issue. When the Analyze findings during testing feature is enabled for the Project, this list is accurate: it contains only the exact messages that had to be sent to trigger the problem.
You can inspect each message sent by clicking the Eye icon (View). You can also download these messages in different formats for manual testing, or include them in your regression test suite. However, as you may have noticed, you do not have to maintain regression tests manually: you can quickly re-run them by clicking the Reproduce button or by calling the appropriate API endpoint.
That is all for this tutorial. You may want to check out the next tutorial to learn how to work with Message Templates, as it will help you with:
- Customizing and improving your test cases
- Testing custom and proprietary applications
- Optimizing test performance and run time