STD-GOV-003: Standards Audit Checklist

1. Purpose

This standard defines the monthly audit procedure used to verify that the Standards & Practices Library remains a controlled system:

  • internally consistent
  • enforced by automation
  • followed in module delivery
  • safe for operations, security, and regulated domains

2. Scope

In Scope

  • Standards Library integrity (registry, naming, counts)
  • Governance enforcement (decisions, waivers)
  • CI/automation enforcement
  • Module lifecycle compliance (DoD + Analog Twin)
  • Security baseline compliance
  • Data governance and auditability
  • Observability and incident posture
  • Coding and design system compliance
  • Release and rollback discipline

Out of Scope

  • Feature prioritization disputes (handled by PRD governance)
  • Non-production experiments in isolated sandboxes (unless merged)
  • Vendor platform outages or external service failures (unless gaps in response standards are found)

3. The Standard (MUST)

3.1 Audit Frequency

  • MUST run a Standards Audit every 30 days.
  • MUST run an additional audit after any major standards overhaul (category additions, governance model changes, security/release changes).

3.2 Inputs

Audit inputs MUST include:

  • all merged PRs during the audit window (last 30 days)
  • all standards changed in the audit window
  • waiver registry snapshot (open + created + expired)
  • CI logs for required checks
  • release/change logs (if production deploys occurred)

3.3 Outputs

Each audit MUST produce:

  1. Audit Report (1 page)
  2. Top 5 Findings (ranked)
  3. Corrective Action List (owners + due dates)
  4. Waiver Register Snapshot
  5. Audit Scorecard (Green/Yellow/Red)

3.4 Critical Failures

Any of the following constitutes a Critical Failure and triggers mandatory remediation:

  • Registry or catalog mismatches (counts, duplicates, missing standards)
  • CI bypass or merging with failing required checks
  • Security baseline violation (unauthorized access, secret leakage)
  • Data governance violation for governed domains (finance/governance/evidence)
  • Release rollback not possible or migration irrecoverable
  • Waivers missing expiry, approver, or remediation plan

4. Audit Checklist (PASS/FAIL)

4.1 Registry & Catalog Integrity (Non-Negotiable)

  • [ ] PASS: Total standards count matches filesystem + registry
  • [ ] PASS: No duplicate categories/entries in dashboard
  • [ ] PASS: All files follow the STD-[CAT]-[ID]_[Name].md naming pattern
  • [ ] PASS: All categories exist in .categories.json and match catalog usage

Fail condition: Any mismatch or duplication.
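The registry checks above can be automated. A minimal sketch, assuming a three-letter category code and three-digit ID in the filename pattern and a registry represented as a flat list of entry IDs (both assumptions, not mandated by this standard):

```python
import re

# Assumed interpretation of STD-[CAT]-[ID]_[Name].md: three-letter
# category, three-digit ID, then a free-form name.
NAME_RE = re.compile(r"^STD-[A-Z]{3}-\d{3}_[A-Za-z0-9_-]+\.md$")

def check_registry(filenames, registry_entries):
    """Return a list of findings; an empty list means PASS (section 4.1)."""
    findings = []
    bad = [f for f in filenames if not NAME_RE.match(f)]
    if bad:
        findings.append(("naming", bad))
    # Duplicate registry entries
    dupes = {e for e in registry_entries if registry_entries.count(e) > 1}
    if dupes:
        findings.append(("duplicates", sorted(dupes)))
    # Count mismatch between filesystem and registry
    if len(set(filenames)) != len(set(registry_entries)):
        findings.append(("count_mismatch",
                         (len(set(filenames)), len(set(registry_entries)))))
    return findings
```

A real implementation would also cross-check .categories.json against catalog usage.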

4.2 Governance Enforcement (GOV)

4.2.1 Decision protocol use

  • [ ] PASS: Any standards change includes a decision reference (ADR / decision log)
  • [ ] PASS: Standards changes include reason + impact + rollback plan (if applicable)

Fail condition: Standards edited with no traceable decision.

4.2.2 Waiver discipline

  • [ ] PASS: Every waiver includes scope, reason, approver, expiry, remediation plan
  • [ ] PASS: No expired waivers remain active

Fail condition: Missing expiry or remediation, or stale waivers.
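Waiver discipline lends itself to a mechanical check. A sketch, assuming each waiver is a record with the five required fields named as below (the field names are illustrative, not prescribed):

```python
from datetime import date

# Required fields per section 4.2.2; names here are an assumed schema.
REQUIRED_FIELDS = ("scope", "reason", "approver", "expiry", "remediation_plan")

def audit_waiver(waiver, today):
    """Return findings for one waiver record; empty list means PASS."""
    findings = []
    missing = [f for f in REQUIRED_FIELDS if not waiver.get(f)]
    if missing:
        findings.append(f"missing fields: {missing}")
    expiry = waiver.get("expiry")
    if expiry and expiry < today:
        findings.append("expired waiver still active")
    return findings
```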

4.3 CI / Automation Reality Check (GEN-002 + DEV)

  • [ ] PASS: Required checks run automatically in CI
  • [ ] PASS: PRs cannot merge while required checks are failing

Track last 30 days:

  • [ ] First-pass green PR rate ≥ 70%
  • [ ] Avg reruns per PR ≤ 2
  • [ ] Avg time-to-green ≤ 30 min (excluding major refactors)

Fail condition: Manual-only checks, merges bypassing CI, or chronic red builds.
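The three 30-day metrics above can be computed from per-PR CI data. A sketch, assuming each PR is summarized as a dict with the three fields shown (an assumed shape; the standard does not prescribe one):

```python
def ci_metrics(prs):
    """Evaluate the section 4.3 thresholds over a list of PR summaries.

    Each PR dict is assumed to carry: 'first_pass_green' (bool),
    'reruns' (int), 'time_to_green_min' (float).
    """
    n = len(prs)
    first_pass_rate = sum(p["first_pass_green"] for p in prs) / n
    avg_reruns = sum(p["reruns"] for p in prs) / n
    avg_ttg = sum(p["time_to_green_min"] for p in prs) / n
    return {
        "first_pass_green_ok": first_pass_rate >= 0.70,  # ≥ 70%
        "avg_reruns_ok": avg_reruns <= 2,                # ≤ 2 reruns
        "time_to_green_ok": avg_ttg <= 30,               # ≤ 30 min
    }
```

Exclusions (e.g. major refactors for time-to-green) would be filtered out before calling this.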

4.4 Module Lifecycle Compliance (PRD)

For every module promoted as Stable in the last 30 days:

  • [ ] PASS: Meets DoD (tests + walkthrough + seed + docs)
  • [ ] PASS: Analog Twin updated (blueprints + standards refs + registry)
  • [ ] PASS: Operational safety present (logs + threat check + error handling)

Capability governance:

  • [ ] PASS: All merged features have CAP entries linked to PR/module
  • [ ] PASS: Lifecycle states correct (Experimental/Stable/Deprecated)

Fail condition: Stable modules missing DoD components or unregistered capabilities.

4.5 Security Baseline Verification (SEC)

Sample last 10 PRs touching data access:

  • [ ] PASS: Authorization checks exist where required
  • [ ] PASS: No implicit trust from client input
  • [ ] PASS: Role logic is not duplicated inconsistently across modules

Secrets + environment safety:

  • [ ] PASS: No secrets in repo, logs, or committed config
  • [ ] PASS: Env patterns match standard and do not enable accidental production access

Fail condition: Any unauthorized access path or leaked secret.
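The secrets check can be partially automated with pattern scanning. A minimal heuristic sketch (the two patterns below are illustrative examples only; a real audit should rely on a dedicated secret scanner, not this list):

```python
import re

# Illustrative patterns: an AWS-style access key ID shape, and
# hardcoded credential assignments. Not an exhaustive list.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),
    re.compile(r"(?i)(password|secret|api_key)\s*=\s*['\"][^'\"]+['\"]"),
]

def scan_for_secrets(text):
    """Return the patterns that matched; empty list means no hit."""
    return [p.pattern for p in SECRET_PATTERNS if p.search(text)]
```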

4.6 Data Governance & Auditability (DAT)

For finance/governance/evidence/disputes:

  • [ ] PASS: Append-only history exists where required (no silent overwrites)
  • [ ] PASS: Audit fields present where required (createdBy, createdAt, etc.)
  • [ ] PASS: Deletions are governed (soft delete or explicit governed delete)

Data export + reproducibility:

  • [ ] PASS: Critical datasets export deterministically
  • [ ] PASS: Seeds are idempotent and reproducible

Fail condition: Mutation where immutability is required or data cannot be validated.
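For governed domains, "append-only with audit fields" looks like the sketch below: corrections are new entries, and the type simply exposes no update or delete operation. Field names (createdBy, createdAt) follow section 4.6; the class itself is an assumed illustration, not a mandated design.

```python
from datetime import datetime, timezone

class AppendOnlyLedger:
    """Minimal append-only history: no overwrite or delete API exists."""

    def __init__(self):
        self._entries = []

    def append(self, payload, created_by):
        entry = {
            "seq": len(self._entries),          # monotonic sequence number
            "payload": payload,
            "createdBy": created_by,            # audit fields per 4.6
            "createdAt": datetime.now(timezone.utc).isoformat(),
        }
        self._entries.append(entry)
        return entry["seq"]

    def history(self):
        return tuple(self._entries)  # read-only view of the full history
```

A governed delete, where permitted, would itself be a new appended event rather than a removal.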

4.7 Observability & Incident Readiness (OPS)

Logging quality:

  • [ ] PASS: Logs are structured and non-spammy
  • [ ] PASS: Errors include correlation IDs (request/trace key)
  • [ ] PASS: No sensitive data logged

Incident posture:

  • [ ] PASS: Severity definitions exist and are applied
  • [ ] PASS: Incident playbooks exist for production-impacting components

Fail condition: Blind ops, sensitive logging, or no incident posture.
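The logging-quality items above imply structured records with a correlation key. A sketch of one such log line, assuming JSON-lines output and a UUID fallback for the correlation ID (both are illustrative choices):

```python
import json
import logging
import uuid

def structured_log(logger, level, event, correlation_id=None, **fields):
    """Emit one structured JSON log line carrying a correlation ID.

    Callers pass only non-sensitive fields; redaction is out of scope here.
    """
    record = {
        "event": event,
        "correlation_id": correlation_id or str(uuid.uuid4()),
        **fields,
    }
    logger.log(level, json.dumps(record, sort_keys=True))
    return record
```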

4.8 Coding & Design System Compliance (COD + DES)

I18n:

  • [ ] PASS: No hardcoded UI strings in newly merged UI code
  • [ ] PASS: Translation keys present in locale files

Styling:

  • [ ] PASS: Token discipline enforced (no “magic colors” or drift)
  • [ ] PASS: Components follow approved UI composition patterns

Performance hygiene:

  • [ ] PASS: No N+1 query patterns introduced
  • [ ] PASS: Lists paginated and bounded
  • [ ] PASS: Expensive work not defaulting to client-side execution

Fail condition: i18n regressions, styling entropy, or cost/perf regressions.
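The "no hardcoded UI strings" item can be approximated with a lint-style scan. A deliberately crude heuristic sketch, assuming a t(...) translation call convention (an assumption; a real check belongs in an i18n-aware linter, not a regex):

```python
import re

# Heuristic: multi-word, capitalized string literals look like UI copy.
UI_STRING = re.compile(r"(['\"])([A-Z][a-z]+(?: [A-Za-z]+)+)\1")

def find_hardcoded_strings(source):
    """Return likely-hardcoded UI strings not wrapped in a t(...) call."""
    out = []
    for m in UI_STRING.finditer(source):
        prefix = source[:m.start()].rstrip()
        if not prefix.endswith("t("):   # assumed translation-call convention
            out.append(m.group(2))
    return out
```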

4.9 Release & Migration Discipline (REL)

Rollback readiness:

  • [ ] PASS: Rollback plan documented per release
  • [ ] PASS: Migrations reversible or forward-fixable

Compatibility:

  • [ ] PASS: Breaking changes explicitly labeled
  • [ ] PASS: API contract stability preserved where required

Fail condition: No rollback path or silent breaking behavior.

5. Scoring & Interpretation

5.1 Score Bands

  • Green (World-Class): 95–100% pass, 0 critical failures
  • Yellow (Operational): 85–94% pass, ≤1 critical failure (remediation plan required)
  • Red (Uncontrolled): <85% pass OR ≥2 critical failures
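The bands map directly to a scoring function. One reading of the edge cases (e.g. a ≥95% pass rate with one critical failure falls to Yellow, since Green requires zero criticals) is sketched below; this interpretation is an assumption where the bands leave a gap:

```python
def score_band(pass_rate, critical_failures):
    """Map audit results to the section 5.1 bands.

    pass_rate is a fraction in [0, 1]. Yellow additionally requires a
    remediation plan, which is tracked outside this function.
    """
    if pass_rate >= 0.95 and critical_failures == 0:
        return "Green"
    if pass_rate >= 0.85 and critical_failures <= 1:
        return "Yellow"
    return "Red"
```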

5.2 Mandatory Response to Critical Failure

  • MUST open corrective actions within 48 hours
  • MUST assign owners and due dates
  • MUST re-audit affected sections after remediation

6. Required Audit Deliverables (Template)

6.1 Top 5 Findings (Ranked)

For each:

  1. Finding + evidence
  2. Impact
  3. Root cause
  4. Fix + owner + due date

6.2 Waiver Snapshot

  • New waivers created: #
  • Active waivers: #
  • Expired waivers cleared: #
  • Top waiver causes (top 3)

6.3 Action Plan

  • Immediate fixes (≤ 7 days)
  • Short-term improvements (≤ 30 days)
  • Standards revisions required (if any)

7. The Path (SHOULD)

  • Audits SHOULD be run by someone not directly responsible for the audited changes (reduces bias)
  • Audits SHOULD sample at least 10 PRs or 20% of merged PRs (whichever is greater)
  • High-risk domains (finance, auth, evidence) SHOULD be sampled every audit cycle
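The sampling rule above ("at least 10 PRs or 20%, whichever is greater") reduces to a one-line calculation, capped at the number of PRs actually available, which is an assumed but natural boundary case:

```python
import math

def audit_sample_size(merged_pr_count):
    """Sample size per section 7: max(10, 20% of merged PRs), capped."""
    return min(merged_pr_count, max(10, math.ceil(0.20 * merged_pr_count)))
```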

8. Changelog

Date | Version | Change
2026-01-25 | v1.0 | Initial ratified release

Version History

Version | Date | Author | Change
0.1.0 | 2026-01-26 | Antigravity | Initial Audit & Metadata Injection