The Global Fraud Authentications Technology team develops and manages innovative products designed to counter an ever-changing threat landscape. We work in close partnership with our Fraud Authentications business and key technology partners throughout Scotiabank to drive global technology delivery. We are key team members within Global Operations Technology, ensuring business strategies, plans, and initiatives are executed and delivered in compliance with governing regulations, internal policies, and procedures.
Is this role right for you? In this role you will:
- Designing, building, and operationalizing the Customer Profile Database using Google Cloud Platform (GCP) data services such as Dataproc, Dataflow, Cloud SQL, BigQuery, and Cloud Spanner, in combination with complementary tools such as Spark, Apache Beam/Composer, DBT, Cloud Pub/Sub, Confluent Kafka, Cloud Storage, Cloud Functions, and GitHub
- Designing and implementing data ingestion patterns that support batch, streaming, and API interfaces on both ingress and egress
- Working with other Data Engineers and Application Architects to develop frameworks and custom code, following best practices, that meet demanding performance requirements
- Working with minimal supervision to design and build production data pipelines, from data ingestion to consumption, using GCP services, Java, Python, Scala, BigQuery, DBT, SQL, etc.
- Building and managing data pipelines with a deep understanding of workflow orchestration, task scheduling and dependency management
- Running proofs of technology using GCP services and working with data architects and solution architects to achieve the desired results and performance
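The workflow orchestration and dependency management named above can be sketched with a minimal, standard-library example. The stage names below are purely illustrative (not from an actual Scotiabank pipeline); a production system would delegate this to an orchestrator such as Cloud Composer, but the underlying dependency resolution is the same topological ordering:

```python
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Hypothetical pipeline stages mapped to the stages they depend on.
dag = {
    "ingest_raw": set(),
    "validate": {"ingest_raw"},
    "transform": {"validate"},
    "load_bigquery": {"transform"},
    "publish_events": {"transform"},
}

def run_order(dag):
    """Return one valid execution order that respects every dependency."""
    return list(TopologicalSorter(dag).static_order())

order = run_order(dag)
print(order)  # every stage appears after all of its dependencies
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, which is exactly the class of configuration mistake a pipeline orchestrator must reject before scheduling any task.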
Do you have the skills that will enable you to succeed in this role? We'd love to work with you if you have:
- 3+ years of experience in data engineering and performance optimization for large OLTP/OLAP applications
- Advanced experience and knowledge of the primary managed data services within GCP, including Dataproc, Dataflow, BigQuery/DBT, Cloud Spanner, Cloud SQL, Cloud Pub/Sub, Spark, etc.
- Working experience with containerized systems in a public cloud (Azure or GKE/GCP)
- Proficiency in Python, data processing frameworks, and ETL/ELT processes
- Experience with Infrastructure as Code (IaC) practices and frameworks like Terraform
- Knowledge of Java microservices and Spring Boot is an asset
- Experience with data streaming and technologies such as Kafka, Spark-streaming etc. would be an asset
- Working knowledge of developing and scaling Java REST services, using frameworks such as Spring, would be an asset
- Good communication and problem-solving skills. Ability to effectively convey ideas to business and technical teams
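The ETL/ELT proficiency listed above follows a common extract-transform-load shape. Below is a minimal, standard-library sketch of that pattern; in the role described it would target managed services (e.g., BigQuery loaded via DBT) rather than SQLite, and all table and field names here are illustrative assumptions:

```python
import sqlite3

def extract():
    # Stand-in for reading from a source system or message queue.
    return [
        {"customer_id": 1, "email": " Alice@Example.com "},
        {"customer_id": 2, "email": "BOB@example.com"},
    ]

def transform(rows):
    # Normalize fields before loading: trim whitespace, lowercase emails.
    return [
        {"customer_id": r["customer_id"], "email": r["email"].strip().lower()}
        for r in rows
    ]

def load(rows, conn):
    # Idempotent table creation, then a parameterized bulk insert.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customer_profile "
        "(customer_id INTEGER PRIMARY KEY, email TEXT)"
    )
    conn.executemany(
        "INSERT INTO customer_profile (customer_id, email) "
        "VALUES (:customer_id, :email)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
emails = [
    row[0]
    for row in conn.execute(
        "SELECT email FROM customer_profile ORDER BY customer_id"
    )
]
print(emails)
```

An ELT variant would reverse the last two steps: land the raw rows first, then run the normalization as SQL (or DBT models) inside the warehouse.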