Create analytics repo Dataface branch and bootstrap workflow¶
Problem¶
Set up the internal analytics repo as a first-class Dataface example-customer repo for analyst work. Create and document the Dataface branch strategy in /Users/dave.fowler/Fivetran/analytics, define how /faces and Dataface config live alongside dbt assets, and make sure the new dft init/bootstrap flow works there end-to-end before analysts use it.
Context¶
This task now includes the BQ connection setup (previously a separate "wire Dataface to BQ" task, now cancelled and merged here). The deliverable is end-to-end: cd ~/Fivetran/analytics/dbt_ft_prod && dft serve → working dashboards against BigQuery.
Analytics repo structure¶
- Repo: /Users/dave.fowler/Fivetran/analytics
- dbt project: dbt_ft_prod/ (project name: prj_production, profile: fivetran)
- models/ contains: bi_core/ (45 gold-layer models), staging/ (143 models across 18 sources), intermediate/ (minimal), plus marketing, product, engineering, feature_adoption, and support dirs
- cto-research already uses this repo as a metadata source
Decisions (resolved)¶
File layout: faces/ and dataface.yml go inside dbt_ft_prod/, as siblings of models/. This is the standard dbt convention — dft init detects dbt_project.yml and puts faces/ at the same level as models/. So the layout is:
analytics/dbt_ft_prod/
├── dbt_project.yml
├── dataface.yml # BQ source config
├── models/ # existing dbt models
├── faces/ # Dataface dashboards (new)
│ └── (starter face from dft init)
├── macros/
├── tests/
└── ...
Branch strategy: Use a dedicated dataface/bootstrap branch for initial setup. Once validated, the intent is to merge to main (it's additive — faces/ and dataface.yml don't break dbt). But don't commit to main until sorted with the analytics team.
BQ connection: The dataface.yml declares a BigQuery source using gcloud application-default credentials. The actual credential stays in environment/local config, not checked into the repo. Initial dataset scope is bi_core models — let the catalog/inspector discover what's available.
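As a concrete sketch, the BigQuery source block in dataface.yml might look roughly like the following — the exact keys (sources, type, project, credentials, datasets) are assumptions, since the schema is owned by dft init and may differ:

```yaml
# Hypothetical sketch of dbt_ft_prod/dataface.yml — key names are assumed,
# not confirmed against the actual dft schema.
sources:
  - name: fivetran_bq
    type: bigquery
    project: digital-arbor-400        # project id from the Plan section
    credentials: application-default  # resolved from local gcloud ADC, never checked in
    datasets:
      - bi_core                       # initial scope; catalog/inspector discovers the rest
```

The important property is that the file names only the project and the credential mechanism; the credential itself lives in local gcloud state and never enters the repo.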
Not an eval dependency: The eval workstream has its own DuckDB-based data path (apps/evals/). This task is purely about analyst dashboarding against the real warehouse.
Depends on dft init¶
This task depends on add-dft-init-for-dbt-native-repo-bootstrap.md — run dft init first in the analytics repo. This task is the first real-world test case for that command. If dft init doesn't handle the analytics repo well, feed pain points back into that task.
Possible Solutions¶
Option 1: Keep analytics repo read-only and author dashboards elsewhere¶
- Treat analytics purely as a metadata source and keep all Dataface assets in the Dataface repo.
- Pros:
- Lowest coordination with analytics repo
- Simplest ownership story
- Cons:
- Does not match the intended analyst workflow
- Prevents analytics from serving as a realistic example-customer repo
Option 2: Use analytics as the customer repo, but rely on manual conventions¶
- Create a Dataface branch in analytics and manually add faces/ and config there.
- Pros:
- Gets close to the target workflow quickly
- Minimal Dataface product changes required
- Cons:
- Process remains tribal and fragile
- Hard to onboard analysts consistently
Option 3: Make analytics the canonical example-customer repo and validate the full bootstrap flow there (Recommended)¶
- Define the branch and repo-shape strategy in analytics, then use the new dft init flow to prove end-to-end setup there.
- Pros:
- Produces a real internal proving ground before analyst rollout
- Clarifies repo boundaries and ownership early
- Gives Dataface a concrete, non-toy example repo for docs and testing
- Cons:
- Requires coordination across repos
- Forces decisions on directory placement and branch workflow sooner
Plan¶
- Create dataface/bootstrap branch in the analytics repo.
- Run dft init from dbt_ft_prod/ — it should detect dbt_project.yml, create faces/, scaffold a starter dashboard, and create dataface.yml.
- Edit dataface.yml to add the BQ source config (type: bigquery, project: digital-arbor-400, credentials: application-default).
- Verify dft serve starts, can list tables via the catalog/inspector, and renders the starter face.
- Build one real dashboard against bi_core data to prove the flow works end to end.
- Feed any dft init pain points back to the companion task.
- Document the repo boundary (what stays in the Dataface repo vs. the analytics repo).
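To make the "one real dashboard against bi_core" step concrete: purely as a hypothetical illustration (the real face schema is whatever dft init scaffolds, and every field and table name below is invented), a face file might pair a SQL query with a chart spec:

```yaml
# faces/arr_overview.yml — hypothetical example only; the actual face format
# is defined by dft, and the bi_core table below is a placeholder.
name: arr_overview
source: fivetran_bq
query: |
  select date_month, arr
  from bi_core.some_bi_core_model  -- placeholder table name
  order by date_month
chart:
  type: line
  x: date_month
  y: arr
```

Whatever the real schema turns out to be, the end-to-end check is the same: the face resolves its source from dataface.yml, the query runs against BigQuery via ADC, and dft serve renders the result.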
Implementation Progress¶
- Initial planning notes:
- Confirmed the analytics repo is already available locally at /Users/dave.fowler/Fivetran/analytics.
- Confirmed cto-research already points both LookML and dbt connectors at that repo, so this is the right real-world proving ground.
- Confirmed this task should stay focused on analytics repo workflow and branch strategy, not on moving Dataface product code into analytics.
QA Exploration¶
- QA exploration completed (or N/A for non-UI tasks)
N/A for browser QA. Verification should happen in the analytics repo through CLI bootstrap/validate/serve flows.
Review Feedback¶
- Review cleared