Job Description
Summary
BitGo is seeking a skilled Data Engineer with strong experience in building robust, scalable data pipelines and monitoring systems. The ideal candidate will have expertise in SQL and Python, and hands-on experience with modern data platforms such as Snowflake, dbt, and cloud-native orchestration tools (e.g., Airflow). Familiarity with reconciliation processes, anomaly detection, and data quality monitoring is key. A background in supporting audit, compliance, or risk functions is a plus, along with a collaborative mindset and attention to detail in working with control-focused teams.
You'll collaborate closely with talented engineers, analysts, data scientists, product managers, and security experts across multiple domains, gaining exposure to all facets of our business while providing repeatable and auditable results to various external stakeholders.
This is a unique opportunity to contribute to a high-impact team at the forefront of digital asset security and financial technology.
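As a hypothetical illustration of the anomaly detection and data quality monitoring mentioned above (the function, data, and threshold are invented for this sketch, not BitGo code), a basic check might flag a day whose ingested row count deviates sharply from its recent baseline:

```python
# Illustrative sketch only: flag a new daily row count that deviates sharply
# from a historical baseline. Names, data, and the z-score threshold are
# hypothetical, not taken from BitGo's systems.
from statistics import mean, stdev

def is_anomalous(history: list[int], new_count: int, z_threshold: float = 3.0) -> bool:
    """Return True if new_count lies more than z_threshold standard
    deviations from the mean of the historical window."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return new_count != mu  # any deviation from a flat baseline is suspect
    return abs(new_count - mu) / sigma > z_threshold

history = [10_120, 9_980, 10_250, 10_060]   # row counts from prior daily loads
print(is_anomalous(history, 1_200))         # True: a sudden drop warrants an alert
print(is_anomalous(history, 10_100))        # False: within normal variation
```

In practice a check like this would run as a scheduled task in an orchestrator such as Airflow and alert the on-call engineer rather than print.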
Responsibilities:
- Design, build, and maintain scalable, reliable data pipelines that collect, transform, and curate data from internal systems.
- Build automated reconciliation, monitoring, and alerting systems (see the sketch after this list).
- Enhance and expand BitGo’s blockchain reporting infrastructure.
- Integrate select external data sources to enrich the data platform.
- Ensure high data quality and auditability across all pipelines.
- Optimize data systems for near real-time processing and insights.
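A minimal sketch of the reconciliation work described above, assuming a source ledger table and a warehouse copy (table names, columns, and the tolerance are hypothetical, and sqlite3 stands in for a production warehouse such as Snowflake):

```python
# Hypothetical reconciliation check: find accounts whose warehouse balance
# is missing or disagrees with the ledger beyond a small tolerance.
import sqlite3

TOLERANCE = 0.01  # maximum acceptable per-account discrepancy (illustrative)

def reconcile(conn: sqlite3.Connection) -> list[tuple]:
    """Return (account_id, ledger_balance, warehouse_balance) rows that disagree."""
    return conn.execute(
        """
        SELECT l.account_id, l.balance, w.balance
        FROM ledger l
        LEFT JOIN warehouse w USING (account_id)
        WHERE w.balance IS NULL
           OR ABS(l.balance - w.balance) > ?
        """,
        (TOLERANCE,),
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE ledger (account_id TEXT PRIMARY KEY, balance REAL);
        CREATE TABLE warehouse (account_id TEXT PRIMARY KEY, balance REAL);
        INSERT INTO ledger VALUES ('a1', 100.0), ('a2', 250.5), ('a3', 42.0);
        INSERT INTO warehouse VALUES ('a1', 100.0), ('a2', 250.0); -- a2 drifted, a3 missing
    """)
    for account_id, ledger_bal, warehouse_bal in reconcile(conn):
        # In production this would raise an alert instead of printing.
        print(f"MISMATCH {account_id}: ledger={ledger_bal} warehouse={warehouse_bal}")
```

Every mismatch surfaced this way is reported with its inputs, which is what makes the result repeatable and auditable.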
Skills & Experience:
We are looking for teammates who share and practice our values: open communication, transparency, taking ownership, and a high level of craftsmanship. We also want coworkers who share our vision and mission: delivering trust in digital assets.
Required
- Engineering degree in Computer Science or equivalent
- 5+ years of work experience in a relevant field (Data Engineer, Software Engineer)
- Strong experience with server-side languages (Python)
- Strong experience with SQL databases like Postgres or MySQL
- Experience building data pipelines/ETL and familiarity with pipeline design principles
- Experience with data warehouse technologies and data modeling best practices (Snowflake, BigQuery, Spark, etc.)
- Strong experience with systems design and event-driven systems (e.g., Kafka)
- A self-starter capable of adapting quickly and being decisive
- A willingness to be at the forefront of designing for quality and attestable results
- Experience with unit and functional testing and debugging
- Experience with Git/GitHub, branching methodologies, code review tools, CI tools, JIRA, Confluence, etc.
- Ability to work independently in a fast-paced environment
- High integrity and accountability
- Comfortable participating in on-call rotations for system support
- Effective written and verbal communication skills
Preferred
- Experience in Financial Services and/or Financial Technology
- Understanding of and strong interest in cryptocurrencies and the blockchain industry
- Experience with large-scale, real-time, and distributed applications
- Experience with data governance, handling PII, data masking
Skills
- Communication Skills
- Database Management
- Development
- Python
- Software Engineering
- SQL