Job Summary
PBT Group is looking for a motivated and talented DataOps Engineer to join an industry-leading IT company specialising in financial services, with a primary focus on AWS, to provide scalable, secure, and efficient data-driven solutions.
As a DataOps Engineer, you will be responsible for designing, developing, and maintaining our cloud-based data platforms on AWS. This role presents an excellent opportunity for individuals passionate about data engineering and cloud technologies to advance their careers in a forward-thinking, innovative environment.
Responsibilities
- Data Pipeline Development:
  - Support the ingestion, transformation, and storage of large datasets from diverse sources, ensuring data cleanliness, reliability, and readiness for analysis.
  - Work closely with data engineers, analysts, and stakeholders to integrate new data sources effectively.
- Automation & Optimisation:
  - Develop and maintain automation scripts (e.g., Python) to streamline data workflows and platform management tasks.
  - Monitor data platform performance, troubleshoot issues, and implement improvements for efficiency and cost optimisation.
- Cloud Infrastructure & Platform Management:
  - Design and implement scalable, high-performance infrastructure solutions, utilising containerisation and automation tools.
  - Use infrastructure as code (Terraform) to provision and manage AWS resources, ensuring high performance, security, scalability, and reliability.
  - Troubleshoot and resolve problems across multiple technical domains.
- Continuous Integration/Continuous Deployment (CI/CD):
  - Drive CI/CD implementation and improvements by incorporating best practices and automation tools.
  - Conduct code reviews and contribute to team knowledge sharing for consistency and high-quality output.
- Documentation & Knowledge Sharing:
  - Maintain comprehensive documentation of data processes, platform configurations, and best practices.
  - Stay current with the latest AWS services, data engineering tools, and industry trends, integrating new knowledge to enhance the data platforms.
Qualifications
- Education: Bachelor’s degree (3 years) in Computer Science, Informatics, Data Science, Mathematics, Statistics, or Engineering.
- Certifications: AWS Certified Cloud Practitioner or other relevant certifications are advantageous.
- Technical Skills:
  - Advanced understanding of AWS services (S3, EC2, Lambda, RDS).
  - Proficiency in at least one programming language (e.g., Python, Java, SQL); familiarity with scripting languages is a plus.
  - Experience with infrastructure as code tools such as Terraform.
Competencies & Attributes
- Technical and Analytical:
  - Strong analytical thinking with attention to detail.
  - Intellectual curiosity and a continuous improvement mindset.
- Interpersonal:
  - Stakeholder engagement skills, including collaboration, influencing, and persuading.
  - Adaptability and the ability to thrive in a fast-paced environment.
- Professional:
  - Ability to work effectively with diverse teams and present recommendations across organisational levels.
  - Experience in a regulated environment within the financial services industry is advantageous.
Additional Requirements
- Industry Knowledge: Experience within the financial services industry in a regulated environment for multi-product, multi-stakeholder organisations is a strong advantage.
- Business Acumen: Ability to present and communicate recommendations effectively across all organisational levels.
- Team Collaboration: Ability to work with diverse teams to ensure successful project implementation.