Job Description
Summary
This is a hands-on technical leadership role with the opportunity to shape the future of our data platform. You will be responsible for designing and building the core systems to unlock value from our data as we grow. We are looking for a passionate engineer who thrives on solving complex challenges, from high-level architectural design down to implementation. Your impact will come from tackling technical problems directly while also mentoring the team and influencing the long-term vision.
What you will do
- Lead the design and delivery of complex data engineering projects, partnering with stakeholders to take initiatives from concept to successful implementation.
- Design and develop core components of our data platform, including building and scaling real-time data pipelines for event delivery and machine learning enablement.
- Embed best practices and governance into the data platform, enabling other teams to build confidently and consistently on robust foundations.
- Develop orchestration and automation capabilities to streamline data workflows and improve platform efficiency.
- Own the estimation, planning, and execution of complex projects, including new feature development and major service upgrades.
- Write well-tested, high-quality, and performant code for real-time data processing, automation, and infrastructure management.
- Mentor engineers on the team, elevating their skills and promoting best practices in data engineering through guidance and code reviews.
As a Senior Data Engineer on our Data Platform team, you’ll be a key contributor to the evolution of our platform. You’ll work across a wide range of technical challenges, from low-latency, data-intensive streaming systems to machine learning enablement, orchestration, data governance, and platform scalability, ensuring our data foundations are reliable, performant, and ready for future growth.
💻 What you will be working with/on
- GCP
- Pub/Sub and Kafka
- Apache Beam
- Bigtable
- Redis
- Kubernetes
- Airflow
- BigQuery
About You
- Proven experience building and delivering low-latency, real-time data applications.
- Hands-on expertise with real-time/event-driven systems (e.g., Kafka, Pub/Sub) and stream-processing frameworks (e.g., Apache Beam, Spark, Flink).
- Proficiency with Infrastructure as Code (e.g., Terraform) to manage complex, production-grade data infrastructure.
- Experience building platform capabilities that other engineers and teams can leverage and build upon.
- A team-first mindset, with a passion for mentoring others and sharing knowledge to raise the bar across the organization.
- The ability to collaborate effectively with cross-functional partners, including Product Managers, Backend Engineers, and Machine Learning Engineers.
You’re an experienced engineer who thrives on building robust, scalable, and impactful data solutions. You’re motivated by tackling hard problems, enabling machine learning use cases like fraud detection, and helping your teammates succeed. You value quality outcomes over simply shipping code.
Skills
- Communication Skills
- Development
- Leadership
- Software Engineering
- Team Collaboration

