We’re looking for a Data QA Specialist to own the quality of our clickstream and web analytics data (Snowplow and related tracking), and the reporting pipelines that depend on it.
You’ll sit between Analytics and Data Engineering, designing and running tests that ensure events are correctly tracked, transformed, and made available in our data warehouse and BI tools. Your work underpins reliable reporting and experimentation across the business.
responsibilities:
Validate Snowplow / clickstream events from collection through to warehouse and reporting layers.
Check event schemas, properties, timestamps, user IDs, sessions, and attribution fields for accuracy and consistency.
Monitor and flag issues such as:
Missing or duplicate events
Unexpected spikes/drops
Schema drift / unannounced changes
Test and validate ETL / ELT pipelines feeding dashboards and reporting tables (e.g. marketing, product, funnel performance).
Compare source vs transformed data to ensure metrics (sessions, clicks, conversions, revenue, etc.) are accurate and reconciled.
Help define data quality checks (row counts, null checks, referential integrity, thresholds, anomaly detection).
Work with Analytics and Data Engineering to define:
Test cases for new tracking requirements and events
Regression tests when pipelines or schemas change
Run tests in staging and production environments using SQL queries, test scripts, and tool-based checks.
Log defects clearly and track them through to resolution.
Act as the bridge between Analytics and Data Engineering:
Translate analytics requirements into testable data conditions.
Escalate issues with clear impact statements (“this breaks X dashboard / Y KPI”).
Clarify and request access to tables, views, and logs needed for QA.
Align on SLAs for data delivery and fixes.
Coordinate with web / app teams where tracking implementation needs changes.
Maintain documentation covering:
Event schemas and naming conventions
Known data caveats and limitations
Test plans and standard QA procedures
Support data governance by flagging inconsistent naming or definitions and helping enforce standards across teams.
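To make the day-to-day concrete, here is a minimal sketch of the kinds of checks described above (row counts, null checks, duplicate detection, source-vs-transformed reconciliation). It uses an in-memory SQLite database as a stand-in for the warehouse, and the table names (`raw_events`, `rpt_events`) and columns are hypothetical, not references to our actual schema:

```python
import sqlite3

# Hypothetical source table (raw_events) and transformed reporting table
# (rpt_events), stood up in SQLite so the sketch is self-contained.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (
        event_id TEXT, user_id TEXT, session_id TEXT, event_ts TEXT
    );
    CREATE TABLE rpt_events (
        event_id TEXT, user_id TEXT, session_id TEXT, event_ts TEXT
    );
    INSERT INTO raw_events VALUES
        ('e1', 'u1', 's1', '2024-01-01T10:00:00'),
        ('e2', 'u1', 's1', '2024-01-01T10:01:00'),
        ('e3', NULL, 's2', '2024-01-01T10:02:00');
    INSERT INTO rpt_events SELECT * FROM raw_events;
""")

def row_count_reconciliation(conn, source, target):
    # Source vs transformed counts should match after a 1:1 load.
    src = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    return {"source": src, "target": tgt, "match": src == tgt}

def null_check(conn, table, column):
    # Count rows where a required field (e.g. user_id) is missing.
    return conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]

def duplicate_check(conn, table, key):
    # Duplicate event IDs often indicate double-firing trackers or replays.
    return conn.execute(
        f"SELECT {key}, COUNT(*) FROM {table} "
        f"GROUP BY {key} HAVING COUNT(*) > 1"
    ).fetchall()

print(row_count_reconciliation(conn, "raw_events", "rpt_events"))
print(null_check(conn, "rpt_events", "user_id"))       # missing user IDs
print(duplicate_check(conn, "rpt_events", "event_id"))  # duplicate events
```

In practice, checks like these would run as SQL against the warehouse or as tests in a data-quality tool, with failures escalated with a clear impact statement as described above.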
requirements-expected:
Strong experience in data quality / data QA / analytics engineering roles.
Experience with clickstream or web analytics data (e.g. Snowplow, GA, or similar tools).
Strong SQL for investigation, reconciliation, and QA.
Experience working with data warehouses / lakehouses (e.g. Redshift, BigQuery, Snowflake).
Experience testing data pipelines (ETL/ELT) and reporting layers.
Comfortable working with analytics teams (analysts / product analysts) and engineering / data engineering teams.
Can explain data issues clearly to both technical and non-technical stakeholders.
Can document data issues, risks, and decisions in a structured way.