Your Team Responsibilities
We are building a next-generation Lakehouse architecture to power data-driven decision-making across the organization. This modern platform integrates structured and unstructured data pipelines and serves as the core analytical engine for the Marketing, Coverage, and Finance teams, enabling real-time insights and strategic KPIs.
We are looking for an exceptional Data Engineer to join our Data Engineering Development team. You will work in a collaborative, global environment and play a key role in building high-performance, scalable data pipelines that integrate SAP finance data and cloud-native sources into our Azure-based Enterprise Data Warehouse (EDWH) Lakehouse.
Your Key Responsibilities
- Design, build, and maintain scalable data pipelines to ingest, transform, and deliver structured and unstructured data from SAP systems to Azure Data Lake using Azure Data Factory (ADF) and Databricks (a minimal illustrative sketch follows this list)
- Develop and manage data workflows from platforms such as SAP, Revstream, and on-premises Oracle and SQL Server databases
- Collaborate with business teams to gather and understand data requirements for analytics and reporting.
- Integrate financial data and ensure its consistency, quality, and accessibility for downstream analytics.
- Optimize data storage, processing, and transformation in Azure Data Factory and Databricks environments.
- Support the development of dashboards and reports in Power BI by ensuring data models are optimized for performance.
- Ensure data governance, security, lineage, and compliance across pipelines.
- Participate in code reviews, testing, and documentation of data processes.
- Proactively identify opportunities to automate and optimize data flows.
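To give a flavour of the first responsibility above, here is a minimal, purely illustrative Databricks-style PySpark sketch of such a pipeline. The storage account, container paths, and SAP field names (BUDAT, RBUKRS, HSL) are assumptions for a typical SAP FI extract landed in ADLS by an ADF copy activity; they are not part of the role requirements.

```python
# Illustrative sketch only: paths, containers, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

# Raw SAP FI extract landed in ADLS by an ADF copy activity (hypothetical path).
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sap/fi/acdoca/"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/finance/gl_postings/"

raw_df = spark.read.parquet(raw_path)

# Light conformance: typed posting date, business-friendly column names, basic filter.
curated_df = (
    raw_df
    .withColumn("posting_date", to_date(col("BUDAT"), "yyyyMMdd"))
    .withColumnRenamed("RBUKRS", "company_code")
    .withColumnRenamed("HSL", "amount_local_currency")
    .filter(col("posting_date").isNotNull())
)

# Write to the curated zone as Delta for downstream Synapse / Power BI models.
(
    curated_df.write
    .format("delta")
    .mode("overwrite")
    .save(curated_path)
)
```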
Your skills and experience that will help you excel
- 5-7 years of hands-on experience in Data Engineering, preferably in the finance domain.
- Proven experience with Azure cloud services such as:
- Azure Data Factory
- Azure Synapse Analytics
- Azure Data Lake Storage (ADLS)
- Databricks / Spark
- Proficient in SQL, Python, and PySpark for data transformation and orchestration.
- Strong experience with SAP modules (FI/Sales) and SAP data extraction tools (e.g., SAP BW).
- Experience in Power BI or equivalent reporting tools for enabling business-facing analytics.
- Understanding of data modeling, ETL best practices, and pipeline optimization.
- Strong problem-solving skills, attention to detail, and ability to work in cross-functional teams.
Preferred Skills
- Knowledge of CI/CD pipelines, version control (e.g., Git), and infrastructure-as-code practices.
- Exposure to data quality frameworks and monitoring tools (a minimal illustration follows this list).
- Familiarity with enterprise data cataloging tools and metadata management.
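As a concrete (but hypothetical) illustration of what basic data quality checks can look like in this stack, the sketch below hand-rolls two simple assertions in PySpark; the Delta path and column name are assumptions carried over from the earlier example, and in practice a dedicated framework (for example Great Expectations or Delta Live Tables expectations) would usually take the place of bare assertions.

```python
# Illustrative sketch only: a hand-rolled quality gate with hypothetical path and column.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()

curated_path = "abfss://curated@examplelake.dfs.core.windows.net/finance/gl_postings/"
df = spark.read.format("delta").load(curated_path)

row_count = df.count()
null_keys = df.filter(col("company_code").isNull()).count()

# Fail the pipeline run early if basic expectations are not met.
assert row_count > 0, "curated gl_postings is empty"
assert null_keys == 0, f"{null_keys} rows missing company_code"
```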