Panel Stats
Quality-monitoring snapshot for the ICSAC advanced AI review panel. Published openly so the health of our review process is as auditable as the reviews themselves.
Generated 2026-04-30 · 6 reviews total, 6 in the last 30 days
30-day rates
- Recommend rate: 66.7%
- Reject rate: 0.0%
- Disagreement rate: 66.7%
- Authenticity-flag rate (all-time): 0.0%
- Review Quality Control flag rate: 0.0%
Every panel review is audited by a separate Review Quality Control pass. The flag rate above is the fraction of audited reviews where the audit surfaced a concern that warranted a second look from a human editor before the acceptance decision.
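Each rate above reduces to a simple fraction over the review records. A minimal sketch, with a hypothetical record shape and illustrative data (the real review schema is private):

```python
from dataclasses import dataclass

@dataclass
class Review:
    # Hypothetical fields; the actual review record is not public.
    outcome: str       # e.g. "RECOMMEND", "REJECT", "REVIEW_FURTHER"
    disagreed: bool    # panel members disagreed on the outcome
    qc_flagged: bool   # Review Quality Control pass raised a concern

def rate(reviews, predicate):
    """Fraction of reviews matching predicate, as a percentage."""
    if not reviews:
        return 0.0
    return 100.0 * sum(predicate(r) for r in reviews) / len(reviews)

# Illustrative data consistent with the counts on this page, not raw reviews.
reviews = (
    [Review("RECOMMEND", True, False)] * 4
    + [Review("REVIEW_FURTHER", False, False)]
    + [Review("PAUSED_AI_FAILURE", False, False)]
)

recommend = rate(reviews, lambda r: r.outcome == "RECOMMEND")
reject = rate(reviews, lambda r: r.outcome == "REJECT")
qc_flag = rate(reviews, lambda r: r.qc_flagged)
print(f"{recommend:.1f}% recommend, {reject:.1f}% reject, {qc_flag:.1f}% QC-flagged")
```

With 4 of 6 reviews recommending, this prints `66.7% recommend, 0.0% reject, 0.0% QC-flagged`, matching the rates above.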
Recommendation mix (all-time)
| Outcome | Count | Share |
|---|---|---|
| RECOMMEND | 4 | 66.7% |
| REVIEW_FURTHER | 1 | 16.7% |
| REJECT | 0 | 0.0% |
| PAUSED_AI_FAILURE | 1 | 16.7% |
Per-dimension score distribution
Mean-of-means across reviews, bucketed into bins from 1 (poor) to 5 (excellent). Drift toward the top bins in the Methodological Transparency or Citation Integrity columns is the early warning sign we watch for: it would mean the panel is starting to rubber-stamp thin methodology.
| Dimension | Mean | 1-1.99 | 2-2.99 | 3-3.99 | 4-4.99 | 5 |
|---|---|---|---|---|---|---|
| Domain Fit | 4.48 | 0 | 0 | 1 | 3 | 1 |
| Methodological Transparency | 3.94 | 0 | 1 | 0 | 4 | 0 |
| Internal Consistency | 3.92 | 0 | 1 | 0 | 4 | 0 |
| Citation Integrity | 3.28 | 0 | 2 | 2 | 1 | 0 |
| Novelty Signal | 4.06 | 0 | 0 | 2 | 3 | 0 |
| Authorship Authenticity | 4.20 | 0 | 1 | 0 | 3 | 1 |
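Each row above is a mean of per-review scores plus fixed-width bins. A sketch of the bucketing, using illustrative per-review scores chosen to reproduce the Citation Integrity row (the raw scores themselves are private):

```python
from statistics import mean

# Bin edges from the table: 1-1.99, 2-2.99, 3-3.99, 4-4.99, and exactly 5.
BINS = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 5)]

def bucket_counts(scores):
    """Count per-review scores into the table's five bins."""
    counts = [0] * len(BINS)
    for s in scores:
        for i, (lo, hi) in enumerate(BINS):
            if lo <= s < hi or lo == hi == s:
                counts[i] += 1
                break
    return counts

# Illustrative per-review scores for one dimension (not real data).
citation_integrity = [2.4, 2.8, 3.1, 3.6, 4.5]
print(round(mean(citation_integrity), 2), bucket_counts(citation_integrity))
```

This prints `3.28 [0, 2, 2, 1, 0]`, matching the Citation Integrity row: a 3.28 mean with two reviews in the 2-2.99 bin, two in 3-3.99, and one in 4-4.99.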
This snapshot is regenerated on each accepted submission and on manual refresh. Raw review data is private; this page exposes only aggregate quality signals.