Create analytics repo Dataface branch and bootstrap workflow
Problem
Set up the internal analytics repo as a first-class Dataface example-customer repo for analyst work. Create and document the Dataface branch strategy in /Users/dave.fowler/Fivetran/analytics, define how /faces and Dataface config live alongside dbt assets, and make sure the new dft init/bootstrap flow works there end-to-end before analysts use it.
Context
This task now includes the BQ connection setup (previously a separate "wire Dataface to BQ" task, now cancelled and merged here). The deliverable is end-to-end: cd ~/Fivetran/analytics/dbt_ft_prod && dft serve → working dashboards against BigQuery.
Analytics repo structure
- Repo: /Users/dave.fowler/Fivetran/analytics
- dbt project: dbt_ft_prod/ (project name: prj_production, profile: fivetran)
- models/ contains: bi_core/ (45 gold-layer models), staging/ (143 models across 18 sources), intermediate/ (minimal), plus marketing, product, engineering, feature_adoption, and support dirs
- cto-research already uses this repo as a metadata source
Decisions (resolved)
File layout: faces/ and dataface.yml go inside dbt_ft_prod/, as siblings of models/. This is the standard dbt convention — dft init detects dbt_project.yml and puts faces/ at the same level as models/. So the layout is:
analytics/dbt_ft_prod/
├── dbt_project.yml
├── dataface.yml # BQ source config
├── models/ # existing dbt models
├── faces/ # Dataface dashboards (new)
│ └── (starter face from dft init)
├── macros/
├── tests/
└── ...
Branch strategy: Use a dedicated dataface/bootstrap branch for initial setup. Once validated, the intent is to merge to main (it's additive — faces/ and dataface.yml don't break dbt). But don't commit to main until sorted with the analytics team.
BQ connection: The dataface.yml declares a BigQuery source using gcloud application-default credentials. The actual credential stays in environment/local config, not checked into the repo. Initial dataset scope is bi_core models — let the catalog/inspector discover what's available.
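As a minimal sketch, the source block above could look like the following, written here as a shell heredoc so it stays copy-pasteable. The type, project, and dataset values come from this task; the surrounding key names (sources, credentials) are assumptions about the Dataface config schema, not confirmed:

```shell
# Sketch only: write a minimal dataface.yml with a BigQuery source.
# Key names other than type/project/dataset are assumed, not confirmed.
cat > dataface.yml <<'EOF'
sources:
  bigquery:
    type: bigquery
    project: digital-arbor-400
    dataset: bi_core
    # Auth comes from gcloud application-default credentials (ADC);
    # no keyfile is checked into the repo.
    credentials: application-default
EOF
```

The quoted heredoc delimiter ('EOF') keeps the YAML literal, so nothing in it is subject to shell expansion.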
Not an eval dependency: The eval workstream has its own DuckDB-based data path (apps/evals/). This task is purely about analyst dashboarding against the real warehouse.
Depends on dft init
This task depends on add-dft-init-for-dbt-native-repo-bootstrap.md — run dft init first in the analytics repo. This task is the first real-world test case for that command. If dft init doesn't handle the analytics repo well, feed pain points back into that task.
Possible Solutions
Option 1: Keep analytics repo read-only and author dashboards elsewhere
- Treat analytics purely as a metadata source and keep all Dataface assets in the Dataface repo.
- Pros:
- Lowest coordination with analytics repo
- Simplest ownership story
- Cons:
- Does not match the intended analyst workflow
- Prevents analytics from serving as a realistic example-customer repo
Option 2: Use analytics as the customer repo, but rely on manual conventions
- Create a Dataface branch in analytics and manually add /faces and config there.
- Pros:
- Gets close to the target workflow quickly
- Minimal Dataface product changes required
- Cons:
- Process remains tribal and fragile
- Hard to onboard analysts consistently
Option 3: Make analytics the canonical example-customer repo and validate the full bootstrap flow there (Recommended)
- Define the branch and repo-shape strategy in analytics, then use the new dft init flow to prove end-to-end setup there.
- Pros:
- Produces a real internal proving ground before analyst rollout
- Clarifies repo boundaries and ownership early
- Gives Dataface a concrete, non-toy example repo for docs and testing
- Cons:
- Requires coordination across repos
- Forces decisions on directory placement and branch workflow sooner
Plan
- Create dataface/bootstrap branch in the analytics repo.
- Run dft init from dbt_ft_prod/ — should detect dbt_project.yml, create faces/, scaffold the starter dashboard, and create dataface.yml.
- Edit dataface.yml to add BQ source config (type: bigquery, project: digital-arbor-400, credentials: application-default).
- Verify dft serve starts, can list tables via the catalog/inspector, and renders the starter face.
- Build one real dashboard against bi_core data to prove the flow works end-to-end.
- Feed any dft init pain points back to the companion task.
- Document the repo boundary (what stays in the Dataface repo vs. the analytics repo).
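The plan above can be sketched as one shell walkthrough. The dft flags shown match the commands recorded in this task's execution log; this assumes dft is on PATH and gcloud ADC is fresh, and is a sketch rather than a canonical script:

```shell
# Sketch of the end-to-end bootstrap flow, using commands from this task's log.
cd /Users/dave.fowler/Fivetran/analytics
git checkout -b dataface/bootstrap

# Scaffold faces/ and dataface.yml next to models/
dft init --project-dir /Users/dave.fowler/Fivetran/analytics/dbt_ft_prod

# Add the BigQuery source to dbt_ft_prod/dataface.yml, then validate and serve
dft validate
dft serve --project-dir /Users/dave.fowler/Fivetran/analytics/dbt_ft_prod \
  --host 127.0.0.1 --port 9899

# Confirm catalog access against the warehouse
dft inspect table revenue --connection bigquery://digital-arbor-400 \
  --dialect bigquery --schema bi_core
```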
Implementation Progress
- 2026-03-22 execution run:
  - Created analytics branch dataface/bootstrap in /Users/dave.fowler/Fivetran/analytics.
  - Ran dft init --project-dir /Users/dave.fowler/Fivetran/analytics/dbt_ft_prod using the companion task branch implementation (codex/add-dft-init-for-dbt-native-repo-bootstrap).
  - Verified dbt detection and scaffold creation under dbt_ft_prod/:
    - faces/hello.yml
    - faces/partials/.gitkeep
  - Verified idempotent re-run behavior: a second dft init run skipped existing scaffold files as expected.
  - Added Dataface project config at dbt_ft_prod/dataface.yml with a BigQuery source:
    - type: bigquery
    - project: digital-arbor-400
    - dataset: bi_core
    - Credentials via gcloud ADC (no checked-in keyfile).
  - Added one real dashboard face in the analytics repo: dbt_ft_prod/faces/bi_core_bootstrap_check.yml (KPI + sample table against bi_core.revenue).
  - Verified Dataface server startup against the analytics project:
    - dft serve --project-dir /Users/dave.fowler/Fivetran/analytics/dbt_ft_prod --host 127.0.0.1 --port 9899
    - GET /health returned {"status":"ok","service":"dataface-server"}.
  - Validation result: dft validate passes for the analytics faces/.
  - BigQuery query verification blocker:
    - dft inspect table revenue --connection bigquery://digital-arbor-400 --dialect bigquery --schema bi_core currently fails with: "Reauthentication is needed. Please run gcloud auth application-default login to reauthenticate."
    - This is an environment credential state issue (expired ADC), not a task code/workflow shape issue.
  - Follow-up command for the final end-to-end warehouse query proof: gcloud auth application-default login, then rerun the inspect/render checks from analytics dbt_ft_prod.
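The server-startup verification in the run above can be repeated with a plain HTTP check; this assumes dft serve is still running on 127.0.0.1:9899 as in the log:

```shell
# With dft serve running on 127.0.0.1:9899, hit the health endpoint directly.
curl -s http://127.0.0.1:9899/health
# Per the run log, a healthy server returns:
# {"status":"ok","service":"dataface-server"}
```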
- 2026-03-25 follow-up:
  - The local analyst workflow is now confirmed working in /Users/dave.fowler/Fivetran/analytics/dbt_ft_prod.
  - Dashboards are serving against BigQuery through the analytics-repo bootstrap path (dft init + dataface.yml in the analytics repo), which means the earlier ADC reauthentication note should be treated as a time-local environment credential issue rather than an open product/workflow blocker.
  - This task remains the canonical home for the cancelled "wire Dataface to internal analytics repo and BigQuery source" work.
Repo boundary (documented)
- Analytics repo (/Users/dave.fowler/Fivetran/analytics/dbt_ft_prod) owns:
  - faces/ dashboard YAML files
  - dataface.yml source/project config
  - Branch workflow for analyst-authored dashboards (dataface/bootstrap now; future feature branches per dashboard effort)
- Dataface repo (/Users/dave.fowler/Fivetran/dataface) owns:
  - CLI/runtime product code (dft init, dft serve, compiler/renderer/inspector)
  - Templates/scaffolding behavior and docs
  - Validation/runtime fixes discovered while operating against analytics as a customer repo
QA Exploration
- [x] QA exploration completed (or N/A for non-UI tasks)
N/A for browser QA. Verification should happen in the analytics repo through CLI bootstrap/validate/serve flows.
Review Feedback
- [ ] Review cleared