Job Description

Summary

What are the role responsibilities?

  • Systems Architecture:
      • Collaborate with business teams to design systems architecture.
      • Design and implement robust, scalable, and optimized data architectures.
      • Support continuing increases in data volume and complexity.
  • Pipeline Ownership:
      • Establish best practices and patterns to source, qualify, and ingest data from third-party sources.
      • Work with data analysts and data scientists to decode and analyze on-chain data at scale.
      • Recommend and implement ways to improve data reliability, efficiency, and quality.
  • Data Strategy and Execution:
      • Work with stakeholders, including the Marketing, Product, Analytics, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
  • BI and Data Tool Enablement:
      • Enable data analytics and visualization tools that extract valuable insights from the data and support data-driven decisions.
  • Innovation and Improvement:
      • Keep up to date with the latest industry trends in data engineering.
      • Explore new technologies and approaches for continuous improvement of data systems.
  • Collaboration and Communication:
      • Work closely with analytics and business teams to define and refine data requirements for various data and analytics initiatives.
      • Ensure clear communication of project progress and results to stakeholders.
      • Collaborate with data engineers across the wider OP Stack and ecosystem to enable open-source and publicly available datasets.

What skills do you bring?

  1. 4+ years of professional data engineering experience.
  2. Experience building and optimizing large-scale data pipelines from ingestion to insights.
  3. Advanced working knowledge of Python and SQL.
  4. Familiarity with modern data processing tools (DuckDB, Polars, ClickHouse).
  5. Experience with cloud-based data management technologies (Cloud Storage, BigQuery, ClickHouse Cloud, Kubernetes).
  6. Experience with workflow orchestration tools such as Dagster, Airflow, or dbt.
  7. Experience with cloud services such as Google Cloud or AWS.
  8. Strong analytical skills for working with unstructured datasets; we are looking for an engineer who understands the business and can build to its requirements.
  9. Excellent communication skills, with the ability to engage, influence, and inspire partners and stakeholders to drive collaboration and alignment.
  10. Experience with web3 and blockchain protocols is a plus.
  11. Self-starter who takes ownership, gets results, and enjoys the autonomy of architecting from the ground up.

Skills
  • Communication Skills
  • Database Management
  • Development
  • Python
  • Software Engineering
  • SQL
  • Strategic Thinking
  • Team Collaboration