TestKase

Comparing & Consolidating Scans

Compare two scan versions side-by-side, or roll multiple independent scans into one consolidated report.


Once you have more than one accessibility scan, two questions come up:

  1. "Did our last release fix what we said it did?" — answered by Compare.
  2. "What does our overall accessibility look like across these N independent scans?" — answered by Consolidate.

TestKase ships both as first-class reports.

Compare two versions of a scheduled scan

Use Compare when a single scan runs on a schedule (or you've manually re-run the same scan multiple times) and you want a focused diff between any two of those runs.

When to use it

  • Verify a release: compare the run from before deploy with the run from after.
  • Track regressions over a quarter: compare run #1 with run #12.
  • Sign off on an audit fix: compare the audit-time scan with the post-fix scan.

How it works

  • Pick the parent (root) scan in your scan list.
  • Choose any two version numbers from its run history (e.g. Run #3 and Run #7).
  • TestKase auto-orders them oldest-first and builds a side-by-side diff:
    • Score delta (+12 points, -3 points)
    • Issues fixed (in the older run, gone in the newer)
    • Issues introduced (not in the older, present in the newer)
    • Issues unchanged
    • Per-rule and per-page rollups
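
The fixed / introduced / unchanged split above is straightforward set arithmetic over issue identities. Here's a minimal sketch of that idea in Python; the issue-key format (`rule@url`) and the run data are made up for illustration, and the real diff is computed server-side by TestKase:

```python
# Sketch of the fixed/introduced/unchanged classification described above.
# Issue keys here are hypothetical "rule@url" strings; TestKase builds the
# actual diff on the backend.

def diff_runs(older: set[str], newer: set[str]) -> dict[str, set[str]]:
    """Classify issue keys between two runs of the same root scan."""
    return {
        "fixed": older - newer,       # in the older run, gone in the newer
        "introduced": newer - older,  # not in the older, present in the newer
        "unchanged": older & newer,   # present in both runs
    }

# Sample data: issues found in Run #3 vs Run #7
run_3 = {"color-contrast@/home", "image-alt@/about", "label@/signup"}
run_7 = {"color-contrast@/home", "aria-roles@/pricing"}

result = diff_runs(run_3, run_7)
```

With this sample data, two issues were fixed, one was introduced, and one carried over unchanged.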

Open the report

  • From the scan list, open the parent scan → Compare versions → pick the two run numbers → Compare.
  • Direct link: /web-scanner/compare?rootScanId=<id>&v1=<n>&v2=<n>.

API

POST /api/v1/accessibility-testing/compare
Content-Type: application/json
{
  "rootScanId": 191,
  "versions": [3, 7]
}

Order doesn't matter — the backend sorts the two versions oldest-first before producing the diff.
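
A client can mirror that normalization when building the request body, so callers never have to care which run came first. This helper is a hypothetical convenience, not part of TestKase:

```python
def compare_payload(root_scan_id: int, v1: int, v2: int) -> dict:
    """Build the compare request body, sorting versions oldest-first
    (the backend does this anyway, so argument order is irrelevant)."""
    return {"rootScanId": root_scan_id, "versions": sorted([v1, v2])}
```

Passing the versions in either order yields the same payload.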

Paginated issues from the comparison:

POST /api/v1/accessibility-testing/compare/issues?page=1&limit=25&impact=critical&status=introduced
Content-Type: application/json
{ "rootScanId": 191, "versions": [3, 7] }
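
Assembling the pagination and filter parameters with the standard library keeps the query string correctly encoded. A small sketch (the helper name is made up; the endpoint path and parameters are from the example above):

```python
from urllib.parse import urlencode

def issues_url(base: str, **filters) -> str:
    """Append pagination/filter query parameters to an issues endpoint."""
    return f"{base}?{urlencode(filters)}"

url = issues_url(
    "/api/v1/accessibility-testing/compare/issues",
    page=1, limit=25, impact="critical", status="introduced",
)
```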

Limits

  • Exactly two versions per compare call. For multi-run trend analysis, run sequential compares or wait for the trend dashboard.
  • Both versions must belong to the same root scan (same URL list, same WCAG settings — that's what makes the diff meaningful).
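
Both limits can be checked client-side before calling the API to fail fast with a clearer message. A hypothetical sketch (the server enforces these rules authoritatively):

```python
def compare_request_errors(versions: list[int], root_scan_ids: set[int]) -> list[str]:
    """Pre-flight check mirroring the two compare limits above."""
    errors = []
    if len(versions) != 2:
        errors.append("exactly two versions per compare call")
    if len(root_scan_ids) != 1:
        errors.append("both versions must belong to the same root scan")
    return errors
```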

Consolidate multiple independent scans

Use Consolidate when you have separate scans (different URLs, different times, different scan groups) and you want one combined view — typically for an audit or stakeholder presentation.

When to use it

  • Your site has 5 scan groups (marketing, app, docs, pricing, blog) and you want one combined accessibility score for the company.
  • You ran independent scans for different products and want to share a single report with a stakeholder.
  • You're producing an end-of-quarter compliance summary across many scan groups.

How it works

  • Pick two or more completed scans from the scan list.
  • TestKase merges the issues, deduplicates by rule + selector + URL, and produces:
    • Combined accessibility score (weighted by URL count)
    • Per-scan and overall rollups
    • Unified WCAG conformance matrix across all included URLs
    • Affected-pages and affected-components tables that span every included scan
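
The merge described above can be sketched as dedup-by-key plus a URL-count-weighted average. The field names and the exact weighting formula below are assumptions based on this description, not TestKase's server-side implementation:

```python
# Hypothetical sketch of consolidation: dedupe issues by (rule, selector,
# url) and weight the combined score by each scan's URL count.

def consolidate(scans: list[dict]) -> dict:
    seen: set[tuple] = set()
    issues = []
    for scan in scans:
        for issue in scan["issues"]:
            key = (issue["rule"], issue["selector"], issue["url"])
            if key not in seen:          # keep the first occurrence only
                seen.add(key)
                issues.append(issue)
    total_urls = sum(s["urlCount"] for s in scans)
    score = sum(s["score"] * s["urlCount"] for s in scans) / total_urls
    return {"issues": issues, "combinedScore": round(score, 1)}

# Sample data: two scans sharing one duplicate issue
scans = [
    {"score": 90.0, "urlCount": 30, "issues": [
        {"rule": "image-alt", "selector": "img.hero", "url": "/"},
    ]},
    {"score": 70.0, "urlCount": 10, "issues": [
        {"rule": "image-alt", "selector": "img.hero", "url": "/"},  # duplicate
        {"rule": "label", "selector": "#email", "url": "/signup"},
    ]},
]
report = consolidate(scans)
```

Note how the larger scan (30 URLs) pulls the combined score toward its own, which is what URL-count weighting means in practice.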

Open the report

  • Scan list → multi-select scan rows → Consolidate action.
  • Direct link: /web-scanner/consolidate?scanIds=<id>,<id>,….
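
The direct link is just the selected scan IDs comma-joined into the scanIds parameter. A trivial sketch (helper name is hypothetical):

```python
def consolidate_link(scan_ids: list[int]) -> str:
    """Build the direct consolidate URL from selected scan IDs."""
    return "/web-scanner/consolidate?scanIds=" + ",".join(map(str, scan_ids))
```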

Export as PDF

The consolidated report can be exported as a single PDF for executive sharing, audits, and compliance attestations:

POST /api/v1/accessibility-testing/consolidate/export-pdf
Content-Type: application/json
{ "scanIds": [12, 47, 88] }

Returns application/pdf with Content-Disposition: attachment; filename="consolidated-report-<date>.pdf".

API

POST /api/v1/accessibility-testing/consolidate
Content-Type: application/json
{ "scanIds": [12, 47, 88] }

Paginated issues:

POST /api/v1/accessibility-testing/consolidate/issues?page=1&limit=25&impact=serious
Content-Type: application/json
{ "scanIds": [12, 47, 88] }

Limits

  • Minimum two scan IDs.
  • Scans must be in completed status; pending, running, or failed scans are skipped.
  • All scans must belong to your workspace (or be shared with a team you're a member of).
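
Since non-completed scans are skipped anyway, a client can filter its selection up front and catch the minimum-count rule early. A hypothetical pre-flight sketch:

```python
def eligible_scan_ids(scans: list[dict]) -> list[int]:
    """Keep only completed scans and enforce the two-scan minimum client-side."""
    ids = [s["id"] for s in scans if s["status"] == "completed"]
    if len(ids) < 2:
        raise ValueError("consolidation needs at least two completed scans")
    return ids

# Sample data: one running scan gets dropped
scans = [
    {"id": 12, "status": "completed"},
    {"id": 47, "status": "running"},
    {"id": 88, "status": "completed"},
]
ids = eligible_scan_ids(scans)
```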

Compare vs Consolidate — which one?

Question | Use
"What changed between two runs of the same scan?" | Compare
"What's our overall accessibility across multiple different scans?" | Consolidate
One root scan, two snapshots in time | Compare
Multiple independent scans, one combined view | Consolidate
Need a side-by-side diff with introduced/fixed sections | Compare
Need a single PDF for an executive | Consolidate