Job Description

Summary

We created the Factors of Growth & Impact to help Villagers measure impact and articulate the coaching, feedback, and rewarding learning that happen while exploring, developing, and mastering capabilities and contributions within and beyond the team:

Technical Skills:

  1. Collaborate on designing and implementing unified orchestration patterns (Dagster/Airflow) to replace legacy and fragmented scheduling
  2. Develop governance-as-code systems in partnership with the team that automatically apply policy tags, RLS, and access controls through an active control plane
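The governance-as-code work described above is, in essence, a compiler from declarative policy to enforcement actions. A minimal sketch, assuming a hypothetical `Policy` record and illustrative table, tag, and filter names (not any specific vendor's API):

```python
# Minimal sketch of governance-as-code: policies are declared as data,
# then compiled into the concrete actions (policy tags, row-level
# security) that an active control plane would apply.
# All table names, tag names, and the Policy type are illustrative.
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class Policy:
    table: str                        # fully qualified table name
    policy_tag: str                   # e.g. a PII classification tag
    rls_filter: Optional[str] = None  # optional row-level-security predicate


POLICIES = [
    Policy("analytics.users", "pii/high",
           rls_filter="region = SESSION_USER_REGION()"),
    Policy("analytics.orders", "pii/low"),
]


def compile_policies(policies):
    """Expand declarative policies into control-plane actions."""
    actions = []
    for p in policies:
        actions.append(("apply_tag", p.table, p.policy_tag))
        if p.rls_filter:
            actions.append(("apply_rls", p.table, p.rls_filter))
    return actions
```

Keeping the policies declarative is what makes them reviewable and testable like any other code, which is the point of an "active" control plane versus manually configured access rules.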

Complexity and Impact of Work:

  1. Help guide the technical design for platform capabilities like data contracts, automated quality gating, observability, and cost visibility
  2. Support the migration of workloads from legacy patterns to the modern platform, ensuring domain teams have clear paths and golden templates

Organizational Knowledge:

  1. Partner with domain teams (Asset Data, Reporting & Statements, Product teams) to understand their needs and design platform capabilities that enable their success
  2. Promote and support data mesh principles and dbt best practices, helping domain owners build and own their data products while the platform ensures quality

Communication and Influence:

  1. Promote data platform engineering best practices, developer experience, and "Data as a Product" principles across the engineering organization
  2. Contribute to architectural decisions and help establish engineering culture around reliability, cost efficiency, and operational excellence

You may be a fit for this role if you have:

  1. 5-7+ years building data platforms or infrastructure: You bring experience helping design and operate modern data platforms that handle enterprise-scale workloads with quality, governance, and cost controls
  2. Strong dbt and SQL expertise: You're proficient with dbt and SQL, understand dbt Mesh, and have strong opinions on data modeling, testing, and documentation best practices
  3. Orchestration experience: You've implemented production data orchestration with Airflow, Dagster, Prefect, or similar tools, and understand the trade-offs between different orchestration patterns
  4. Cloud data warehouse proficiency: You have strong experience with BigQuery, Snowflake, or Redshift, including query optimization, cost management, and security configurations
  5. Platform mindset: You think in terms of golden paths, reusable abstractions, and developer experience; you build systems that let others move fast, safely

Although not required, bonus points if:

  1. Metadata and catalog experience: You've worked with Atlan, Collibra, DataHub, or similar metadata platforms and understand active governance patterns
  2. Data observability tools: You've implemented data quality monitoring with Great Expectations, Monte Carlo, Soda, or similar tools
  3. Infrastructure as code: You have experience with Terraform, Kubernetes, and modern DevOps practices for data infrastructure
  4. You're the kind of person who gets excited about declarative config, immutable infrastructure, and metrics dashboards showing cost-per-query trending down

Skills
  • Communication Skills
  • Database Management
  • SQL
  • Team Collaboration