Job Description
Summary
Department environment
Having established market-leading products with a large client base since Copper's launch in 2018, our engineering team has remained at the forefront of blockchain technology, setting standards for cryptocurrency infrastructure globally. The Data Engineering team builds on this foundation to provide critical data services to the organisation.
Through innovative data management and analytics, we enable our teams to make informed, data-driven decisions to drive growth, reduce risk and improve operational efficiencies.
Copper is hiring an ambitious individual to drive the growth of the BI Data function. We are looking for a candidate who is equally at ease with the operational aspects of the role and with championing best practices to deliver the insights required to make data-driven decisions. You will play a leading role in managing and expanding the data warehouse and BI visualisation capabilities that support our data strategy.
You will have wide-ranging responsibilities to help us execute on our data strategy.
Key Responsibilities of the role
- Develop BI technical architecture, data models, and strategic data services.
- Build and maintain ELT pipelines.
- Design and deliver dashboards and visualisations.
- Engage key stakeholders to explain our capabilities, understand their requirements and manage key relationships through effective communication.
- Promote tool adoption and foster a strong data culture across the company through relevant frameworks, processes, and training.
Your experience, skills and knowledge
Essential
- Expert SQL knowledge and experience working with relational databases, plus working familiarity with a variety of other database technologies.
- Experience ingesting and manipulating a variety of data sources (real-time/batch) and data formats (JSON/binary/CSV).
- Data warehousing experience, including knowledge of and experience with Kimball dimensional modelling, Data Vault, and other related DW methodologies.
- Experience building and optimising pipelines and ETL/ELT workflows.
- Strong analytical skills related to working with unstructured datasets.
- Adherence to engineering best practices and standards.
Desirable
- Experience with data warehouse software (e.g. Snowflake, Google BigQuery, Amazon Redshift).
- Experience with data tools: Hadoop, Spark, Kafka, etc.
- Code versioning (GitHub integration and automation).
- Experience with scripting languages such as Python or R.
- Working knowledge of message queuing and stream processing.
- Experience with Apache Spark or similar technologies.
- Experience with Agile and Scrum methodologies.
- Familiarity with dbt and Airflow is an advantage.
- Experience working in a start-up or scale-up environment.
- Experience working in the fields of financial technology, traditional financial services, or blockchain/cryptocurrency.
- Data management and governance experience.
Skills
- Analytical Thinking
- Database Management
- Development
- Python
- SQL