Job Description
Summary
We’re looking for a skilled Data Engineer to design and implement high-throughput, real-time data pipelines that serve both trading systems and analytics infrastructure. You’ll work closely with a small, senior team to build systems that ingest, transform, and serve blockchain data at scale — especially on Solana.
If you're passionate about Rust, real-time systems, stream processing, and the crypto space (or want to be), this role is for you.
Key Responsibilities:
- Build and maintain low-latency data pipelines using tools like Apache Flink, NATS, or Kafka
- Ingest and process Solana blockchain data via the Geyser plugin interface or Yellowstone gRPC
- Work with ClickHouse, RisingWave, or similar systems to support analytics and internal dashboards
- Develop production-quality code in Rust and/or C++
- Optimize pipelines for performance, reliability, and scalability
- Collaborate with quant, infra, and trading teams to ensure systems meet performance requirements
 
Key Qualifications:
- Strong experience with stream processing frameworks (e.g., Flink, Kafka, NATS)
- Hands-on production experience with Rust or C++
- Familiarity with analytics and streaming engines such as ClickHouse, RisingWave, or Arroyo
- Understanding of real-time, low-latency system architecture
- Bonus: experience with Solana or other blockchain infrastructure
- Bonus: familiarity with the Geyser plugin system or Solana RPC indexing
 
Preferred Qualifications:
- Background in financial or trading systems
 
What We Offer:
- Competitive salary and benefits package
- Opportunity to work with a passionate and innovative team
- Flexible working hours and remote work options
- Professional growth and development opportunities
- A collaborative and inclusive company culture
 
Skills
- C++
- Development
- Rust
- Software Architecture
- Software Engineering
- Team Collaboration
 