Job Description
Summary
Our Krakenites are a world-class team with crypto conviction, united by our desire to discover and unlock the potential of crypto and blockchain technology. What makes us different? Kraken is a mission-focused company rooted in crypto values. As a Krakenite, you'll join us on our mission to accelerate the adoption of cryptocurrency so the world can achieve financial freedom and inclusion. For over a decade, Kraken's focus on our mission and crypto ethos has attracted many of the most talented crypto experts in the world.

Before you apply, please read the Kraken Culture Explained to learn more about our internal culture, values, and mission.

As a fully remote company, we have Krakenites in 60+ countries who speak over 50 languages. Krakenites are industry pioneers with a long track record of building premium products for professionals and institutions as well as newcomers to the space. Kraken is committed to industry-leading security through products like Kraken Pro, Kraken NFT, and Cryptowatch, with a focus on world-class customer support and crypto education for all. Become a Krakenite and build the internet of money!

Proof of Work

The team

The Data Engineering team is responsible for designing and implementing scalable solutions that allow the company to make data-driven decisions quickly and accurately on petabytes of data. The team maintains the company's data warehouse and data lake, and builds pipelines to move and process vast amounts of data, both batch and streamed, through different data products. In this role, you will work closely with the Finance team to build data products that support Finance transformation, enabling the company to be an industry pioneer.

The opportunity
- Build scalable and reliable data pipelines that collect, transform, load, and curate data from both internal financial systems and blockchains
- Ensure high data quality for pipelines you build and make them auditable by Finance stakeholders
- Drive data systems to be as near real-time as possible
- Build data connections to the company's internal IT systems
- Develop, customize, and configure self-service tools that enable Finance stakeholders to extract and analyze data from our massive internal data store
- Evaluate new technologies and build prototypes for continuous improvements in data engineering
- 5+ years of work experience in a relevant field (e.g. Data Engineer, Data Warehouse Engineer, Software Engineer)
- Experience with data warehouse technologies and relevant data modeling best practices (Athena, Iceberg, Druid, etc.)
- Experience building data pipelines/ETL and familiarity with their design principles (Apache Airflow experience is a big plus!)
- Excellent SQL and data manipulation skills using common frameworks like Spark/PySpark, Pandas, or similar
- Proficiency in a major programming language (e.g. Scala, Python, ...)
- Experience with business requirements gathering for data sourcing
- Experience working in finance, fintech, or similar
- Experience working with cloud services (e.g. AWS, GCP) and/or Kubernetes
- Experience in building and contributing to data lakes in the cloud
- Experience designing and writing CI/CD pipelines
- Experience working with petabytes of data
Skills
- Data Structures