Job Description

Summary

SDF is looking for a curious, hands-on Data Engineer to join our team. In this role, your focus will be all things data: designing, building, and implementing data pipelines for our public analytics dataset, Hubble.

You’ll work on a range of projects that assess the Stellar network’s real-world impact on the global financial system. This includes tracking Soroban (smart contract) adoption, monitoring liquidity and transaction volume, and building tools to support smart contract developers and ecosystem projects. Our aim is to make Stellar data more accessible, actionable, and useful to the broader Stellar ecosystem.

In this role, you will:

  1. Contribute to the design and implementation of data pipelines that support insights into the liquidity, adoption, and usage of the Stellar Network.
  2. Conduct ad hoc data analysis to clean, transform, and distill key insights about the Stellar Network.
  3. Support efforts to monitor and maintain our data marts, with a focus on usage, quality, and freshness.
  4. Collaborate with stakeholders to turn business priorities and community requests into actionable data products.
  5. Support data accessibility by contributing to self-service tools like dashboards, KPIs, and SQL interfaces.

You have:

  1. 2-3 years of professional experience in data engineering, software engineering, or related technical roles
  2. Familiarity with modern data warehousing concepts and experience working with ETL tools like dbt, Fivetran, Databricks, or Talend
  3. Some experience with SQL, and experience analyzing data to validate assumptions and surface insights
  4. Some experience with ETL schedulers such as Apache Airflow, Dagster, or AWS Glue
  5. Working knowledge of a programming language such as Python or Golang
  6. Strong communication skills, and a willingness to learn how to communicate technical concepts to a broader audience
  7. Interest in working on a small, collaborative team where you’ll have room to grow and take initiative
  8. An eagerness to learn and grow in a dynamic environment, with mentorship and peer support

Bonus points:

  1. Experience working with cloud platforms (especially GCP) and modern data infrastructure
  2. Experience working in collaborative codebases and contributing to team-owned data infrastructure
  3. Exposure to BI tools and interest in making data accessible through simple visualizations
  4. Comfort with git and common engineering workflows (PRs, code review, testing)
  5. Interest in open-source data tools or participating in data/crypto communities
  6. A curiosity about blockchain technologies and a desire to understand decentralized systems

We offer competitive pay with a base salary range for this position of $140,000 - $200,000, depending on job-related knowledge, skills, experience, and location.

Skills
  • AWS
  • Communication Skills
  • Database Management
  • Development
  • Python
  • Software Engineering
  • SQL
  • Team Collaboration