Job Details

Job: Data Engineer

Company: takealot.com

Location: Western Cape, South Africa

Date Posted: 13 August 2024


Takealot Group, South Africa's leading online retailer, is seeking a talented Data Engineer to join its team. The company describes itself as a young, dynamic, hyper-growth business looking for smart, creative, hard-working people with integrity who want to grow their careers. The position reports to the Engineering Manager.

Responsibilities:
- Design, develop, test, and maintain data architectures.
- Prepare data for descriptive, predictive, and prescriptive modeling.
- Automate repetitive tasks and manual processes related to data usage, and optimize data delivery.
- Design, develop, and test large stream data pipelines.
- Ensure the highest standard of data integrity.
- Leverage best practices in continuous integration and delivery.
- Collaborate with other engineers, ML experts, analysts, and stakeholders to produce efficient and valuable solutions.
- Contribute to the data democratization and literacy vision by building accessible, easy-to-use data products and tools.
- Implement features, technology, and processes that move the team towards industry best practices.

Skills:
- Works well with people; a team player, active listener, and mentor who communicates clearly.
- Passionate about technology, systems, and data; always learning and keeping up to date with the industry.
- Deep understanding of data pipelining, streaming, and Big Data technologies, methods, patterns, and techniques.
- Able to troubleshoot complex database operations and performance issues.

Qualifications and Experience:
- Bachelor's Degree or Advanced Diploma in Information Systems, Computer Science, Mathematics, or Engineering.
- At least 3 years of experience in a software/technology environment.
- Experience with open source relational database systems.
- Significant technical experience, including database and data warehousing principles.
- Ability to write code in Java and Python.
- Familiarity with CI/CD tools, Kafka, PubSub, stream data pipeline frameworks, data warehousing, data lakes, lambda/kappa architectures, cloud environments, and containerisation frameworks, tools, and platforms.