Dataface Tasks

Regression prevention and quality gates

ID: M4_V1_0_LAUNCH-DASHBOARD_FACTORY-02
Status: not_started
Priority: p1
Milestone: m4-v1-0-launch
Owner: data-analysis-evangelist-ai-training

Problem

Quality checks on dashboard templates are manual and happen only at review time — there are no automated gates that prevent a regression from being published. A template that passed review last week can break silently after a Dataface engine update, a schema change, or a YAML convention shift, and no one finds out until a user reports it. Without automated regression prevention, maintaining quality at scale requires ever-increasing manual effort that does not keep pace with the growing template catalog.
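
As a concrete illustration, here is a minimal sketch of the kind of check that could run automatically instead of relying on manual review alone. The YAML template format, the required keys, and the use of PyYAML are assumptions for illustration, not the actual Dataface template contract.

```python
# Minimal sketch of an automated check that would catch the silent breaks
# described above. REQUIRED_KEYS and the YAML template format are
# illustrative assumptions, not the real Dataface contract.
import sys

import yaml  # PyYAML

REQUIRED_KEYS = {"title", "datasource", "panels"}  # hypothetical contract


def check_template(path: str) -> list[str]:
    """Return human-readable problems found in one template file."""
    with open(path) as f:
        doc = yaml.safe_load(f)
    if not isinstance(doc, dict):
        return [f"{path}: top level is not a mapping"]
    missing = REQUIRED_KEYS - doc.keys()
    return [f"{path}: missing required keys {sorted(missing)}"] if missing else []


if __name__ == "__main__":
    problems = [p for path in sys.argv[1:] for p in check_template(path)]
    for problem in problems:
        print(problem)
    sys.exit(1 if problems else 0)  # non-zero exit blocks publishing
```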

Context

  • Manual review alone cannot protect repeatable production, review, and publishing of quickstarts and example dashboards once the rate of change increases; regressions will keep shipping unless the highest-value checks become automatic.
  • This task should identify what needs gating in CI or structured review and what evidence is sufficient to block a risky change before it reaches users.
  • Expected touchpoints include examples/, the review/publishing docs, production-line scripts, dashboard content fixtures, automated tests, eval/QA checks, and any release or review scripts that can enforce the new gates; a sketch of one such test-surface gate follows this list.
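
To illustrate the test-surface touchpoint, the sketch below parametrizes a pytest check over every template under examples/, so any new or changed file is validated automatically. The examples/ layout, the *.yaml glob, and the required keys are assumptions.

```python
# Sketch of wiring a gate into the test surface: every template under
# examples/ is validated on each run, so new files are covered by default.
# The directory layout, *.yaml glob, and required keys are assumptions.
from pathlib import Path

import pytest
import yaml

TEMPLATES = sorted(Path("examples").rglob("*.yaml"))


@pytest.mark.parametrize("template", TEMPLATES, ids=str)
def test_template_honors_contract(template: Path):
    doc = yaml.safe_load(template.read_text())
    assert isinstance(doc, dict), f"{template}: top level must be a mapping"
    for key in ("title", "datasource", "panels"):  # hypothetical contract
        assert key in doc, f"{template}: missing required key {key!r}"
```

Because the test is parametrized over the directory, adding a template to examples/ automatically puts it under the gate with no extra wiring.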

Possible Solutions

  • A - Add only a few narrow tests around current bugs: easy to land, but this rarely protects the broader behavior contract.
  • B - Recommended: define a regression-gate bundle around the core behavior contract, combining focused tests, snapshots/evals, and required review evidence for risky changes (see the snapshot sketch after this list).
  • C - Depend on manual smoke testing before each release: better than nothing, but too inconsistent to serve as a durable gate.
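
One way the snapshot half of option B could look, as a sketch: render each template and diff the result against a committed golden file, so an engine update or convention shift that changes output fails the gate. render_dashboard() is a dummy stand-in for whatever rendering entry point Dataface actually exposes, and the snapshots/ layout is likewise an assumption.

```python
# Sketch of the snapshot/eval half of the gate bundle. render_dashboard()
# is a dummy stand-in; the real gate would call the Dataface engine so
# engine updates are exercised too. The snapshots/ layout is an assumption.
import json
from pathlib import Path

import yaml


def render_dashboard(template_path: Path) -> dict:
    # Dummy renderer for the sketch: just parse the template. The real
    # implementation would render through the engine.
    return yaml.safe_load(template_path.read_text())


def assert_matches_snapshot(template_path: Path,
                            snapshot_dir: Path = Path("snapshots")) -> None:
    rendered = render_dashboard(template_path)
    snapshot = snapshot_dir / f"{template_path.stem}.json"
    if not snapshot.exists():
        # First run records the golden file; reviewers approve it as the
        # required review evidence for a risky change.
        snapshot_dir.mkdir(parents=True, exist_ok=True)
        snapshot.write_text(json.dumps(rendered, indent=2, sort_keys=True))
        return
    expected = json.loads(snapshot.read_text())
    assert rendered == expected, f"{template_path}: output drifted from its snapshot"
```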

Plan

  1. Identify the highest-risk behavior contracts for repeatable production, review, and publishing of quickstarts and example dashboards and the types of changes that should be blocked when they regress.
  2. Choose the smallest practical set of automated checks and required review evidence that covers those contracts well enough to matter.
  3. Wire the new gates into the relevant test, review, or release surfaces and document when exceptions are allowed (a gate-runner sketch with an explicit exception list follows this plan).
  4. Trial the gates on a few representative changes and tighten the signal-to-noise ratio before expanding the coverage further.
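
For step 3's exception policy, here is a sketch of a release-surface gate runner that treats waivers as in-repo data rather than ad hoc approvals. The gate commands, script names, and the exceptions.yaml format are assumptions for illustration.

```python
# Sketch of a release-surface gate runner with an explicit, in-repo
# exception list. The gate commands and exceptions.yaml format are
# illustrative assumptions.
import subprocess
import sys

import yaml

GATES = {  # hypothetical gate commands
    "schema": ["python", "check_templates.py"],
    "snapshots": ["pytest", "tests/test_snapshots.py", "-q"],
}


def load_waivers(path: str = "exceptions.yaml") -> set[str]:
    """Gates listed in exceptions.yaml may fail without blocking."""
    try:
        with open(path) as f:
            return set(yaml.safe_load(f) or [])
    except FileNotFoundError:
        return set()


def main() -> int:
    waived = load_waivers()
    blocked = []
    for name, cmd in GATES.items():
        if subprocess.run(cmd).returncode != 0:
            if name in waived:
                print(f"gate {name!r} failed but is waived (see exceptions.yaml)")
            else:
                blocked.append(name)
    for name in blocked:
        print(f"gate {name!r} failed; change is blocked")
    return 1 if blocked else 0


if __name__ == "__main__":
    sys.exit(main())
```

Keeping waivers in a committed file makes every exception visible in review, which is also where the signal-to-noise tuning in step 4 can happen.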

Implementation Progress

Review Feedback

  • [ ] Review cleared