What will I be doing?
As an Analytics Engineer, you will be responsible for designing, developing, and optimising our data infrastructure and analytical tools. You will work closely with data engineers, data analysts, and business stakeholders to ensure that data is reliable, accessible, and actionable. Your role will involve transforming raw data into clean, analysis-ready datasets, creating data models, and ensuring that data pipelines are efficient and scalable.
Key responsibilities include:
- Data Pipeline Development: Design, build, and maintain robust, scalable, and efficient data pipelines using technologies such as DBT, SQL, Python, and ETL frameworks.
- Data Modelling: Develop and maintain logical and physical data models that support the company's reporting and analysis needs. Implement best practices for data warehouse design.
- Data Transformation: Transform raw data from various sources into clean, reliable datasets that can be used for analysis and reporting, ensuring data quality and consistency (see the short illustration after this list).
- Collaboration with Stakeholders: Work closely with data analysts, data scientists, and business stakeholders to understand their data needs and translate these requirements into technical solutions.
- Performance Optimisation: Optimise existing data processes for performance and scalability, ensuring that data can be processed quickly and efficiently.
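
To give a flavour of the transformation work involved, here is a minimal sketch of a DBT-style SQL model. It is illustrative only and not taken from our codebase; the source, table, and column names (raw orders, order_id, and so on) are hypothetical:

    -- models/staging/stg_orders.sql (hypothetical dbt model; names are invented)
    {{ config(materialized='view') }}

    with source as (
        -- read raw data from a source assumed to be declared in sources.yml
        select * from {{ source('raw', 'orders') }}
    )

    select
        id as order_id,
        customer_id,
        cast(order_total as numeric) as order_total,  -- enforce a consistent type
        lower(status) as order_status,                -- normalise casing
        cast(created_at as timestamp) as ordered_at   -- standardise timestamps
    from source
    where id is not null                              -- basic data-quality filter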
Is this the job for me?
It could be if some of these apply:
- You have analytics engineering experience delivering analytical solutions (e.g. in the Databricks stack)
- You have experience with data modelling using tools such as Dataform or DBT
- You have experience orchestrating complex data processing pipelines
- You love building scalable, resilient analytical products
- You seek learning opportunities to deepen your expertise or broaden your knowledge
- You are a data engineer, analytics engineer, or software engineer with an interest in the data domain (data modelling, data transformation, etc.), or a highly motivated technical data analyst
- You are comfortable working in an agile development environment that uses Terraform and continuous integration and continuous delivery (CI/CD) best practices, and you have experience with pair programming and deployment strategies
- You have experience designing and building large-scale data pipelines that utilise streaming technologies (e.g. Kafka Streams, Amazon Kinesis or similar) with an emphasis on the sourcing and transformation of data