Penetration Test Report Example: What to Expect

What a good UK penetration test report looks like — and what to push back on

If you are buying a penetration test for the first time, the report is the bit you should care about most. The testing is important, but what you actually hand to an auditor, an enterprise customer, or an insurer is the report — and the gulf between a good one and a thin one is enormous. This article walks through the structure of a CREST-grade UK pen test report, shows a worked example finding, and calls out the things that separate an engineering-useful report from security theatre.

Section 1: Attestation letter

One to two pages, signed by a qualified individual at the testing firm. The attestation states the client name, the scope that was tested, the dates the test ran, the methodology used (OWASP WSTG, OWASP ASVS, OSSTMM), the lead tester's CREST certifications, and a single-sentence outcome statement like "No critical or high-severity vulnerabilities were identified; two medium findings have been communicated and were remediated during the engagement window".

This is the document you send to procurement. It contains no finding detail, no architectural specifics, and no evidence that a determined attacker could use — which is exactly why it is safe to share broadly. Good attestation letters are clear about what was and was not in scope. Vague attestations that hedge everything ("the testing identified some areas for improvement") tend to be written by testers who know the scope was too narrow.

Section 2: Executive summary

Two to four pages. Written for a non-technical reader — a CEO, a head of sales, a compliance officer. A good executive summary includes:

  • A heat-map or bar chart of findings by severity.
  • Business-language descriptions of the top three issues. Not "XSS in /settings/profile" but "An attacker who tricks a logged-in user into clicking a malicious link can hijack that user's session and see all data visible to them".
  • A remediation status column. If you fixed things during the test window, this is where it is highlighted.
  • A comparison to the previous test if this is not the first engagement.

The executive summary is the part most customers and auditors actually read. If it reads as dense engineering prose with CVSS numbers, the tester has not done their job.

Section 3: Scope and methodology

A short section listing what was in scope — URLs, IP ranges, user roles — and what was out of scope. The methodology should reference recognised standards (OWASP WSTG for web, OWASP MASVS for mobile, OWASP API Top 10 for APIs — see our API Top 10 walkthrough). Scope creep or scope surprises after kick-off are where many tests go wrong; an explicit scope section anchors both sides to what was agreed.

Section 4: Findings

The bulk of the report. Each finding is its own two-to-three-page section with a consistent structure:

  1. Finding title and ID (e.g. F-004: IDOR in invoice download endpoint).
  2. Severity rating — typically CVSS 3.1 or 4.0 base score, mapped onto Critical / High / Medium / Low / Informational.
  3. Affected component — URL, parameter, or service.
  4. Description — one paragraph of what the vulnerability is and why it is a problem.
  5. Reproduction steps — ordered steps a developer can follow to see the issue for themselves, usually with redacted screenshots.
  6. Impact — what an attacker could do if they exploit it.
  7. Remediation guidance — concrete advice, not just a reference to an OWASP cheat sheet. Good reports show the vulnerable code pattern and the fixed code pattern where applicable.
  8. References — relevant CVEs, OWASP entries, framework documentation.
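The severity rating in item 2 is usually a mechanical mapping from a CVSS base vector to a band, which makes it checkable. A minimal sketch of the scope-unchanged CVSS 3.1 arithmetic — weights and the round-up rule are taken from the CVSS 3.1 specification, and this sketch only handles S:U vectors:

```python
# Sketch: map a CVSS 3.1 base vector to a score and severity band.
# Metric weights and the Roundup() rule follow the CVSS 3.1 spec;
# scope-changed (S:C) vectors use different PR weights and equations
# and are deliberately out of scope here.
import math

WEIGHTS = {
    "AV": {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2},
    "AC": {"L": 0.77, "H": 0.44},
    "PR": {"N": 0.85, "L": 0.62, "H": 0.27},  # scope-unchanged values
    "UI": {"N": 0.85, "R": 0.62},
    "C":  {"H": 0.56, "L": 0.22, "N": 0.0},
    "I":  {"H": 0.56, "L": 0.22, "N": 0.0},
    "A":  {"H": 0.56, "L": 0.22, "N": 0.0},
}

def base_score(vector: str) -> float:
    """CVSS 3.1 base score for a scope-unchanged vector string."""
    m = dict(part.split(":") for part in vector.split("/"))
    assert m.get("S") == "U", "sketch only handles scope unchanged"
    iss = 1 - ((1 - WEIGHTS["C"][m["C"]])
               * (1 - WEIGHTS["I"][m["I"]])
               * (1 - WEIGHTS["A"][m["A"]]))
    impact = 6.42 * iss
    exploitability = (8.22 * WEIGHTS["AV"][m["AV"]] * WEIGHTS["AC"][m["AC"]]
                      * WEIGHTS["PR"][m["PR"]] * WEIGHTS["UI"][m["UI"]])
    if impact <= 0:
        return 0.0
    raw = min(impact + exploitability, 10.0)
    # Spec's Roundup(): round up to one decimal, via integer arithmetic
    # to dodge floating-point edge cases.
    return math.ceil(round(raw * 100000) / 10000) / 10

def severity_band(score: float) -> str:
    if score == 0:
        return "Informational"  # CVSS calls this "None"
    if score < 4:
        return "Low"
    if score < 7:
        return "Medium"
    if score < 9:
        return "High"
    return "Critical"
```

The worked example's vector below scores 6.5 under these equations, i.e. Medium — this is exactly the check to run when a rating looks inflated or deflated relative to its vector.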

Worked example: a medium-severity IDOR finding

To make this concrete, here is how a medium finding reads in a good report. The product is a fictional SaaS called Acme Billing.

F-004: Insecure Direct Object Reference on invoice download endpoint
Severity: Medium (CVSS 3.1 base 6.5 — AV:N/AC:L/PR:L/UI:N/S:U/C:H/I:N/A:N)
Affected: GET /api/v1/invoices/{id}/download

Description. The invoice download endpoint enforces authentication but not authorisation. Any authenticated user can retrieve any invoice in the system by incrementing the id path parameter. An attacker who registers for a free trial account is able to enumerate and download invoices belonging to other tenants.

Reproduction. (1) Create a free trial account. (2) Authenticate to the application and capture the session token. (3) Issue GET /api/v1/invoices/1/download, .../2/download, etc. Invoices belonging to other accounts are returned.

Impact. Disclosure of financial data belonging to other customers. Under UK GDPR this is a reportable personal data breach if any named individual is referenced in the invoices.

Remediation. Scope invoice retrieval by the authenticated user's organisation. In Laravel, add a global scope on the Invoice model that filters by organisation_id, or use a policy gate in the controller: Gate::authorize('download', $invoice). Do not rely on unguessable IDs as the control; use server-side authorisation.
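The Laravel-specific advice generalises to any stack: enforce authorisation server-side against the requester's tenant, so that a foreign id behaves exactly like a missing one. A framework-agnostic sketch of the pattern in Python — all names here (User, Invoice, fetch_invoice) are illustrative, not from any real codebase:

```python
# Sketch of the IDOR remediation pattern: scope record retrieval by the
# authenticated user's organisation instead of trusting the raw id.
# The vulnerable version is simply `return db[invoice_id]`.
from dataclasses import dataclass

@dataclass
class User:
    id: int
    organisation_id: int

@dataclass
class Invoice:
    id: int
    organisation_id: int
    amount_pence: int

class NotFound(Exception):
    pass

def fetch_invoice(user: User, invoice_id: int,
                  db: dict[int, Invoice]) -> Invoice:
    invoice = db.get(invoice_id)
    # A foreign-tenant invoice and a nonexistent invoice return the same
    # error, so the endpoint does not leak which ids exist.
    if invoice is None or invoice.organisation_id != user.organisation_id:
        raise NotFound(f"invoice {invoice_id} not found")
    return invoice
```

Answering 404 rather than 403 for foreign ids is a deliberate choice: a 403 confirms the record exists, turning the endpoint into an enumeration oracle even after the data leak is fixed.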

Section 5: Appendices

Tooling used, testers involved (with CREST certifications listed), the engagement timeline, and any raw evidence the client has asked to be preserved. This is where compliance teams verify that the test was run by qualified individuals over the stated window.

Section 6: The retest addendum

Ordered separately a few weeks after the main report, once you have fixed the findings. A good retest addendum restates each finding, summarises what the client claimed to have fixed, records what the tester verified, and updates the status to Closed / Risk Accepted / Partial / Unchanged. For enterprise procurement, the retest addendum is often more valuable than the original report because it evidences the remediation loop. Not every testing firm includes retest as standard; ask whether it is included in the quote or an extra — at YUPL, retest is included in the fixed price for every engagement.

Red flags in a pen test report

  • No reproduction steps. If you cannot reproduce the issue from the report, your engineers cannot fix it and your auditor cannot verify it.
  • "Use a WAF" as remediation advice. A WAF is a control you may or may not run; remediation guidance should address the root cause.
  • Copy-pasted boilerplate. If two findings describe the same class of issue and the "remediation" section is identical word-for-word, chances are the tester is padding.
  • CVSS scores without justification. A finding rated High should have a CVSS vector that supports High. If the vector does not match the rating, the rating is not defensible.
  • No informational findings. A test with zero informational findings either had an artificially narrow scope or the tester did not surface the things that are not strictly vulnerabilities but matter for defence-in-depth.

What to do with the report once you have it

First, read the executive summary with your engineering lead, and challenge any severity rating that does not match your understanding of the system before you accept it. Second, triage the findings into your issue tracker within three working days of receiving the report, not three weeks. Third, schedule remediation with fixed target dates by severity (Critical and High within 30 days, Medium within 90, Low and Informational into the next quarter). Fourth, book the retest as soon as you are confident the work is complete. Fifth, share the attestation letter with procurement reviewers; share the full report only under NDA, and only with those who legitimately need finding-level detail.
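The fixed target dates in the third step can be derived mechanically once findings are in the tracker. A small sketch using the windows suggested above — the 30/90-day figures are this article's suggestion rather than a standard, and "next quarter" is approximated here as 180 days:

```python
# Sketch: derive remediation due dates from severity using the target
# windows suggested in this article (Critical/High 30 days, Medium 90,
# Low and Informational deferred, approximated as 180 days).
from datetime import date, timedelta

REMEDIATION_WINDOW_DAYS = {
    "Critical": 30,
    "High": 30,
    "Medium": 90,
    "Low": 180,
    "Informational": 180,
}

def remediation_due(severity: str, report_received: date) -> date:
    """Due date for a finding, counted from the day the report landed."""
    return report_received + timedelta(days=REMEDIATION_WINDOW_DAYS[severity])
```

Stamping these dates onto tickets at triage time is what turns the severity table into an auditable SLA rather than a vague intention.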

How much of this is on the report vs on you

A good report is half the equation. The other half is whether your team turns findings into closed tickets quickly. The most common failure mode is not a bad tester — it is a good test followed by a slow remediation cycle. Fix that side of the loop and the report becomes a genuine asset: proof that you run a security programme, not just that you once ran a test.

If you are about to buy your first penetration test and want to see an example of what a YUPL report looks like before you commit, get in touch. We share redacted samples before the first call so you know what you are paying for.

About the author. Spencer Schotel is CTO of YUPL, a CREST-accredited UK agency specialising in Laravel engineering and penetration testing.