Job Description
Summary
The Integration and Automation team is part of Chainalysis’ Enterprise Data & Analytics team, which is responsible for generating insights from Chainalysis’ enterprise data. We do this by ingesting, storing, and enabling consumption of trusted Account Journey data, metrics, and insights across the B2B Enterprise Funnel. We design and implement the critical infrastructure that unifies data across business systems, empowering analytics, operational excellence, and advanced AI initiatives. Our mission is to deliver secure, scalable, and reliable data solutions that enable every part of Chainalysis to thrive in an ever-evolving technology landscape.
As a Staff Integration Engineer, you will lead the design, development, and optimization of mission-critical integrations across our tech ecosystem, including NetSuite, Salesforce, Adaptive, and Workday. You will work across low-code platforms and custom cloud-native services, leveraging your deep software engineering background and DevOps expertise. This role requires strong problem-solving skills and the ability to partner with key stakeholders and drive strategic collaboration to shape the future of our enterprise data backbone.
In this role, you’ll:
- Design and implement robust, scalable integrations between Databricks and key enterprise business systems
- Lead architecture reviews and ensure compliance with security, privacy, and data governance standards
- Translate business requirements into technical specifications and integration workflows
- Architect, develop, and maintain automated data pipelines using a combination of low-code tools and custom AWS solutions (Lambda, Step Functions, S3, CloudWatch, VPCs, IAM)
- Own the provisioning and management of infrastructure for integrations by applying Infrastructure as Code principles with Terraform
- Mentor engineers and champion best practices in software engineering, source control (Gitflow), and DevOps workflows within GitHub repositories
- Collaborate closely with data scientists, product managers, and stakeholders to ensure integrations deliver transformative business value
- Proactively monitor, troubleshoot, and ensure reliability and observability of integration solutions across distributed systems
We’re looking for candidates who have:
- Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent professional experience
- 8+ years of hands-on experience in software engineering, with expertise in backend or distributed systems in cloud environments
- Advanced proficiency in Python or Node.js
- Proven track record building and managing integrations and automations using low-code/iPaaS platforms (such as Workato and Fivetran)
- Expert-level AWS experience: Lambda, Step Functions, S3, CloudWatch, VPCs, IAM
- Deep hands-on expertise with Terraform and modern DevOps practices, including CI/CD and infrastructure lifecycle automation
- Mastery of Git and modern source control workflows (Gitflow), using GitHub
- Strong communication, collaboration, and leadership skills with a history of mentorship and technical ownership
You might also have:
- Hands-on experience with Databricks or Snowflake, especially in data warehousing, modeling, or pipeline optimization
- Familiarity with reverse ETL and modern data stack solutions, such as Airbyte or Census
- Prior experience building complex workflows between business systems (Salesforce, Clari, Marketo, NetSuite, Adaptive, Workday)
- Experience with AI-assisted development environments, such as Windsurf or Cursor
- Exposure to Model Context Protocol (MCP) and expertise in building autonomous AI agents using AWS solutions such as Bedrock and AgentCore
Technologies we use:
- Python, Node.js
- AWS (Lambda, Step Functions, S3, CloudWatch, VPCs, IAM)
- Workato, Fivetran
- Databricks, Airbyte
- Terraform
- Git, GitHub
- Salesforce, NetSuite, Workday, Marketo, Clari, Adaptive
- Windsurf
Skills
- AWS
- Communication Skills
- Development
- Python
- Software Engineering
- Team Collaboration

