I've recently been sinking my teeth back into an analytics engineering project — I'm building the initial data infrastructure for a client, which (so far) includes Fivetran, Snowflake, dbt, and Hex.

It's my first time deeply using Hex. It's trite to say this, but it's true — using Hex has been ✨transformational✨  to my workflow in a way that I did not expect. Prior to this, I had a strong aversion to using notebooks: as a SQL-first data analyst, the thought of using pandas to do a simple group by was not fun, nor was the idea of importing psycopg2 and managing my warehouse credentials just to run a simple query.
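To make that friction concrete, here's roughly what "just run a simple query" used to mean in a notebook. This is a sketch with made-up connection details and a hypothetical `orders` table, not anyone's actual setup:

```python
import pandas as pd
import psycopg2  # or snowflake-connector-python, depending on your warehouse

# Boilerplate just to run one query: manage credentials, open a connection...
conn = psycopg2.connect(
    host="warehouse.example.com",  # hypothetical connection details
    dbname="analytics",
    user="me",
    password="********",           # ...and figure out where to stash this safely
)

# The query itself is the easy part
orders = pd.read_sql("select customer_id, amount from orders", conn)

# And the pandas equivalent of a one-line SQL GROUP BY
orders_by_customer = orders.groupby("customer_id", as_index=False)["amount"].sum()
```

In Hex, all of that collapses into a SQL cell pointed at a shared warehouse connection.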

But in Hex, being able to easily mix SQL (and then write SQL on the results of that SQL), Python, and Markdown is a game changer for how I work and communicate with my peers, both on the data team and outside of it.
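To give a flavor of the "SQL on the results of SQL" pattern outside of Hex: Hex handles this natively with chained SQL cells, but you can approximate the idea with duckdb over pandas dataframes. This is a rough sketch with made-up data, not how Hex does it under the hood:

```python
import duckdb
import pandas as pd

# Pretend this dataframe came back from a first "SQL cell" against the warehouse
weekly_orders = pd.DataFrame({
    "week": ["2023-01-02", "2023-01-09", "2023-01-16"],
    "order_count": [120, 135, 128],
})

# A second query written against the *result* of the first one,
# referencing the dataframe by name, much like a chained SQL cell
growth = duckdb.sql("""
    select
        week,
        order_count,
        order_count - lag(order_count) over (order by week) as wow_change
    from weekly_orders
    order by week
""").df()
```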

Here are some of the ways I've been using it so far!

Note: this writeup assumes you're familiar with both Hex and dbt (and Snowflake too, though you'll get the gist if you use BigQuery or Redshift).

Workflows

Exploratory analyses

Right now, I'm working on a project to define some "north star metrics". At this stage, we've got some ideas about what metrics might be good to measure, but we're not sure (a) whether we can measure them with the data we have, and (b) whether they'd give us meaningful results. We haven't even gotten into building models yet.

Before, this kind of "build a new metric" / exploratory work would look like one of the following:

The former workflow is janky, and the latter can lead to over-engineering before you've aligned on the thing you're actually building.

Here's what my flow looks like today (now using Hex):