Job Summary
We are excited to offer an exceptional opportunity for a Senior Data Engineer! In this pivotal role, you will spearhead the development and optimization of our data engineering solutions, modernizing our data warehousing and processing strategies. You will guide our transition towards more flexible and scalable data processing patterns, such as ELT, and the strategic use of Operational Data Stores (ODS) to meet our evolving data requirements. Join us and lead the charge in shaping the future of our data infrastructure!
Duties and Responsibilities (include, but are not limited to):
- Lead the architectural design and implementation of scalable data engineering solutions, leveraging advanced cloud data warehouse technologies (e.g., Snowflake, Amazon Redshift, Google BigQuery, Databricks, or Azure Synapse Analytics). This includes promoting the adoption of ELT patterns over traditional ETL processes to enhance data agility and efficiency.
- Champion the development and evaluation of proof of concept (POC) initiatives for the adoption of an Operational Data Store (ODS) and other modern data processing frameworks, such as the Medallion Architecture, ensuring our approach remains technology-agnostic and aligned with best practices.
- Oversee the optimization of data flows, using ELT processes to streamline data loading and in-warehouse transformation, ensuring high data quality and accessibility (an illustrative sketch follows this list).
- Direct and refine CI/CD processes for seamless data pipeline deployments, incorporating version control best practices with Git.
- Collaborate with cross-functional teams to capture and address comprehensive data requirements, ensuring robust support for business analytics and decision-making.
- Uphold rigorous data security and compliance standards, aligning with financial industry regulations and evolving data privacy best practices.
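For candidates unfamiliar with the distinction referenced above: ELT lands raw data in the warehouse first and transforms it there with SQL, rather than transforming it in flight as in classic ETL. Below is a minimal, runnable sketch of that shape, using Python's sqlite3 module as a stand-in for a cloud warehouse; the table names and cleaning rules are purely illustrative, and on Snowflake, BigQuery, or Synapse the same load-then-transform split applies at warehouse scale.

    # Minimal ELT sketch: load raw records as-is, then transform in-warehouse with SQL.
    # sqlite3 stands in for a cloud warehouse; table and column names are illustrative.
    import sqlite3

    conn = sqlite3.connect(":memory:")

    # Extract + Load: land source records untouched in a raw ("bronze") table.
    conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, status TEXT)")
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?)",
        [(1, "100.50", "complete"), (2, "  20.00", "PENDING"), (3, None, "complete")],
    )

    # Transform: clean and type the data inside the warehouse (the "T" after the "L").
    conn.execute("""
        CREATE TABLE orders AS
        SELECT id,
               CAST(TRIM(amount) AS REAL) AS amount,
               LOWER(status)              AS status
        FROM raw_orders
        WHERE amount IS NOT NULL
    """)

    for row in conn.execute("SELECT * FROM orders"):
        print(row)  # (1, 100.5, 'complete') and (2, 20.0, 'pending')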
Key Requirements
- Experience: Minimum of 5 years in Data Engineering, including 2+ years in a senior or leadership role, with a preference for experience in the financial services sector.
- Technical Expertise: Proficiency in at least one major cloud data warehouse solution (e.g., Snowflake, Amazon Redshift, Google BigQuery, Databricks, Azure Synapse Analytics), with a strong emphasis on implementing ELT patterns and familiarity with modern data architecture frameworks such as the Medallion Architecture.
- Leadership and Innovation: Demonstrated leadership in driving the adoption of modern data processing strategies, with the ability to manage complex projects and innovate within the data engineering space.
- Programming Skills: Strong proficiency in programming languages such as Python or Java, together with demonstrable advanced SQL knowledge on a cloud data warehouse solution, essential for developing and managing ELT processes.
- Certifications: A cloud platform certification (e.g., AWS Certified Solutions Architect, Google Cloud Professional Data Engineer, Snowflake SnowPro) is highly desirable.
- Communication: Excellent verbal and written communication skills, essential for effective collaboration across teams and with stakeholders.
Minimum Qualifying Attributes:
- Hands-on experience with change data capture (CDC)-based data ingestion tools and methodologies (see the sketch after this list).
- Comprehensive understanding of data modeling, ETL/ELT processes, and data security and privacy practices, especially within the financial industry.
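For context on the CDC attribute above: change data capture streams row-level change events (inserts, updates, deletes) from a source system so they can be replayed against a target table. The sketch below shows only the core apply/upsert logic in plain Python; the event shape and keys are hypothetical, and real CDC tooling emits richer payloads with schema and ordering metadata.

    # Hypothetical sketch of applying CDC events to a target table, keyed by primary key.
    # A plain dict stands in for the warehouse target; the event format is illustrative.
    target = {}

    cdc_events = [
        {"op": "insert", "id": 1, "data": {"name": "Alice", "balance": 100}},
        {"op": "update", "id": 1, "data": {"name": "Alice", "balance": 250}},
        {"op": "insert", "id": 2, "data": {"name": "Bob", "balance": 50}},
        {"op": "delete", "id": 2, "data": None},
    ]

    for event in cdc_events:  # events must be applied in source commit order
        if event["op"] == "delete":
            target.pop(event["id"], None)
        else:  # inserts and updates both upsert the latest row image
            target[event["id"]] = event["data"]

    print(target)  # {1: {'name': 'Alice', 'balance': 250}}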
Please note that by submitting your personal information to Deka Minas you voluntarily consent to the business using such data for the specific purpose of securing you either permanent or temporary employment. Our business makes use of a POPIA-compliant database, and you retain the rights of access, correction, and deletion of your personal information.