ENVIRONMENT:
Do you consider yourself an all-round Data Engineer, experienced in every step of the data flow, from configuring data sources to integrating analytical tools? Then a specialist IT Service Provider in Durbanville wants you as its next Data Engineer, contributing to architecting, building, testing, and maintaining the data platform as a whole. The ideal candidate must hold a Bachelor's Degree in a Scientific or Engineering discipline, have certification in or experience with Agile or related methodologies such as Scrum or Kanban, and ideally be working towards or have obtained an AWS Certification. You must have experience implementing data pipelines using cloud infrastructure and services; intermediate to advanced R, Python or SQL optimization; developing ETL strategies; Infrastructure: AWS (Kinesis, API Gateway, S3, DMS); Dev Tools: Git, Docker, Elastic Container Service (ECS), Elastic Kubernetes Service (EKS); and you must be able to conduct systems analysis and prepare requirement specifications for data-related business processes and systems.
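To make the stack above concrete, here is a minimal sketch (in Python, using boto3) of the kind of pipeline hop the role involves: draining one batch of records from a Kinesis stream and landing it in S3. The stream name, bucket, and object key are illustrative assumptions, not the employer's actual configuration.

```python
import boto3

kinesis = boto3.client("kinesis")
s3 = boto3.client("s3")

STREAM_NAME = "events"   # hypothetical stream name
BUCKET = "raw-events"    # hypothetical landing bucket


def drain_one_batch() -> int:
    """Read one batch from the first shard and land it in S3 as JSON lines."""
    shard_id = kinesis.describe_stream(StreamName=STREAM_NAME)[
        "StreamDescription"
    ]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM_NAME,
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",  # start from the oldest record
    )["ShardIterator"]
    records = kinesis.get_records(ShardIterator=iterator, Limit=100)["Records"]
    if records:
        body = b"\n".join(r["Data"] for r in records)  # record payloads are raw bytes
        s3.put_object(Bucket=BUCKET, Key="landing/batch.jsonl", Body=body)
    return len(records)
```

A production version would checkpoint shard iterators (e.g. via the Kinesis Client Library) rather than re-reading from TRIM_HORIZON; this sketch only shows the shape of the extract-and-land step.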
DUTIES:
Mission
Deliver accurate data reliably within the required processing time, ready for consumption by analytics applications.
You will work across multiple business teams to implement new systems and improve existing ones for:
- Extraction: pulling data from current sources.
- Storage: staging and storing all data gathered for analytical purposes.
- Transformation: cleaning, structuring, and formatting data sets to make them consumable for processing or analysis (a sketch of this step follows this list).
- As an all-round Data Engineer, understand what users require and deliver an appropriate data-driven technical solution.
- Contribute to strategic design of the architecture of a data platform.
- Develop data-systems tooling; customize and manage integration tools, databases, warehouses, and analytical systems.
- Maintain and test data pipelines.
- Machine Learning deployment: take models designed by Data Scientists into production environments, managing computing resources and setting up monitoring tools.
- Manage data and metadata storage, structured for efficiency, quality, and performance.
- Track pipeline stability.
- Monitor the overall performance and stability of the systems.
- Keep track of related infrastructure costs and manage these as efficiently as possible, continuously balancing performance and cost.
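As a hedged illustration of the transformation duty above, the sketch below cleans and structures a raw extract with pandas. The column names (event_time, customer_id) and the cleaning rules are assumptions made for the example, not a specification.

```python
import pandas as pd


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean and structure a raw extract so it is consumable by analytics."""
    df = raw.copy()
    # Normalise headers: "Event Time" -> "event_time"
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.drop_duplicates()
    # Enforce types; unparseable timestamps become NaT and are dropped below
    df["event_time"] = pd.to_datetime(df["event_time"], errors="coerce")
    df = df.dropna(subset=["event_time", "customer_id"])
    return df.sort_values("event_time").reset_index(drop=True)


# Usage: cleaned = transform(pd.read_csv("extract.csv"))
```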
REQUIREMENTS:
Qualifications
- Minimum of a Bachelor's Degree in a Scientific or Engineering-related field.
- Certification in or experience with Agile or other development methodologies such as Scrum or Kanban.
- Ideally working towards or having obtained an AWS Certification.
Experience/Skills
- Implementing data pipelines using cloud infrastructure and services.
- Intermediate to advanced R, Python, or SQL, including query optimization; developing ETL strategies (see the SQL-optimization sketch after this list).
- Infrastructure: AWS (Kinesis, API Gateway, S3, DMS).
- Dev Tools: Git, Docker, Elastic Container Service (ECS), Elastic Kubernetes Service (EKS).
- Ability to conduct systems analysis and prepare requirement specifications concerning data-related business processes and systems.
- AWS analytics architecture experience will be a real advantage.
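As a generic illustration of the SQL-optimization skill (SQLite stands in here for whatever engine the platform actually uses), the sketch below shows how adding an index turns a selective filter from a full-table scan into an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 1000, i * 0.1) for i in range(100_000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Before indexing: the planner reports a full scan of the orders table
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchone())

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# After indexing: the same query is answered via the index
print(conn.execute(f"EXPLAIN QUERY PLAN {query}", (42,)).fetchone())
```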
ATTRIBUTES:
- Willingness to learn new technologies and skills; a "figure it out" attitude.
- Dynamic, proactive, and results-oriented, with a can-do attitude and a drive to get things done.
- Self-starter with the ability to successfully plan, organize, and execute assigned initiatives with minimal guidance and direction.
- Strong listening, problem-solving and analytical skills.
- Willing to take the initiative.
- Excellent written and verbal communication skills.
- Exhibits close attention to detail and is a champion for accuracy.
- High level of Integrity.
- Proven ability to meet deadlines and to multi-task effectively.
- The ability and willingness to take on a variety of tasks.