Job Summary:
We are seeking a Data Streaming Engineer with strong experience in Kafka and Flink for a contract position in the financial services industry. The ideal candidate will work closely with development and data engineering teams to support the data streaming architecture and ensure efficient real-time data processing.
Key Responsibilities:
- Design, build, and maintain streaming pipelines using Kafka and Flink.
- Handle data integration, transformation, and real-time data processing.
- Collaborate with other engineering teams to integrate streaming solutions into a wider data ecosystem.
- Implement monitoring and alerting systems to ensure data pipeline reliability.
- Optimise data flows and create fault-tolerant data streaming systems.
Key Requirements:
- Experience with Kafka and Flink: strong, hands-on skills in both technologies are non-negotiable.
- Solid knowledge of real-time data processing and distributed systems.
- Background in data engineering and in building scalable data solutions.
- Strong proficiency in programming languages such as Python, Java, or Scala.
- Experience with cloud platforms like AWS, GCP, or Azure is an advantage.
- Familiarity with DevOps tools and principles.
Additional Skills:
- Problem-solving and analytical thinking.
- Ability to work independently and manage complex projects.
- Strong communication skills for collaboration with cross-functional teams.