Panel Stats

Quality-monitoring snapshot for the ICSAC advanced AI review panel. Published openly so the health of our review process is as auditable as the reviews themselves.

Generated 2026-04-30 · 6 reviews total, 6 in the last 30 days

30-day rates

Recommend rate: 66.7%
Reject rate: 0.0%
Disagreement rate: 66.7%
Authenticity-flag rate (all-time): 0.0%
Review Quality Control flag rate: 0.0%

Every panel review is audited by a separate Review Quality Control pass. The flag rate above is the fraction of audited reviews where the audit surfaced a concern that warranted a second look from a human editor before the acceptance decision.
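The rates above are trailing-window fractions over recent reviews. A minimal sketch of that computation, assuming a hypothetical per-review record (field names such as `outcome`, `disagreement`, and `qc_flagged` are illustrative, not the real schema, and the sample data is made up):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical review records; field names and values are assumptions.
reviews = [
    {"outcome": "RECOMMEND", "disagreement": True, "qc_flagged": False,
     "created": datetime(2026, 4, 10, tzinfo=timezone.utc)},
    {"outcome": "RECOMMEND", "disagreement": False, "qc_flagged": False,
     "created": datetime(2026, 4, 12, tzinfo=timezone.utc)},
    {"outcome": "REVIEW_FURTHER", "disagreement": True, "qc_flagged": False,
     "created": datetime(2026, 4, 20, tzinfo=timezone.utc)},
]

def window_rates(reviews, now, days=30):
    """Rates over reviews created inside the trailing `days`-day window."""
    cutoff = now - timedelta(days=days)
    recent = [r for r in reviews if r["created"] >= cutoff]
    n = len(recent) or 1  # guard against an empty window
    return {
        "recommend_rate": sum(r["outcome"] == "RECOMMEND" for r in recent) / n,
        "reject_rate": sum(r["outcome"] == "REJECT" for r in recent) / n,
        "disagreement_rate": sum(r["disagreement"] for r in recent) / n,
        "qc_flag_rate": sum(r["qc_flagged"] for r in recent) / n,
    }

rates = window_rates(reviews, now=datetime(2026, 4, 30, tzinfo=timezone.utc))
```

With the three sample records above this yields a 66.7% recommend rate and a 0.0% reject rate, mirroring the shape of the snapshot.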

Recommendation mix (all-time)

Outcome            Count  Share
RECOMMEND            4     67%
REVIEW_FURTHER       1     17%
REJECT               0      0%
PAUSED_AI_FAILURE    1     17%

Per-dimension score distribution

Mean-of-means across reviews, bucketed into bins from 1 (poor) to 5 (excellent). Drift in the Methodological Transparency or Citation Integrity columns toward the top bins is the early warning sign we watch for — it would mean the panel is starting to rubber-stamp thin methodology.

Dimension                    Mean  1-1.99  2-2.99  3-3.99  4-4.99  5
Domain Fit                   4.48    0       0       1       3     1
Methodological Transparency  3.94    0       1       0       4     0
Internal Consistency         3.92    0       1       0       4     0
Citation Integrity           3.28    0       2       2       1     0
Novelty Signal               4.06    0       0       2       3     0
Authorship Authenticity      4.20    0       1       0       3     1
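The mean-of-means bucketing behind the table can be sketched as follows. This is a minimal illustration under the assumption that each review contributes one mean score per dimension; the example scores are made up, not panel data:

```python
def bin_index(score):
    """Map a score on the 1-5 scale to its bin: 0 for 1-1.99, ..., 4 for exactly 5."""
    if score >= 5:
        return 4
    return int(score) - 1

def distribution(per_review_scores):
    """per_review_scores: one list of sub-scores per review, for one dimension.

    Returns the mean-of-means and the count of review means in each
    of the five bins (1-1.99, 2-2.99, 3-3.99, 4-4.99, 5).
    """
    means = [sum(s) / len(s) for s in per_review_scores]
    counts = [0] * 5
    for m in means:
        counts[bin_index(m)] += 1
    return sum(means) / len(means), counts

# Illustrative input: five reviews, two sub-scores each for one dimension.
mean, counts = distribution([[4, 5], [4, 4], [3, 3], [5, 5], [4, 4]])
```

Here the review means are 4.5, 4.0, 3.0, 5.0, and 4.0, so `counts` comes out as `[0, 0, 1, 3, 1]`: one review in the 3-3.99 bin, three in 4-4.99, and one at exactly 5.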

This snapshot is regenerated on each accepted submission and on manual refresh. Raw review data is private; this page exposes only aggregate quality signals.