Job Purpose

Design, build, and maintain scalable data pipelines and architectures that support analytics, reporting, and enterprise data solutions.

Key Responsibilities

  • Build, maintain, and optimize ETL/ELT workflows and data integration processes.
  • Develop scalable data architectures on cloud platforms (Azure, AWS, GCP).
  • Ensure data reliability, performance, and high availability across systems.
  • Work with data governance teams to implement data quality and security controls.
  • Troubleshoot data pipeline issues and improve system efficiency.
  • Collaborate with data analysts, scientists, and BI developers to ensure data readiness.

Required Skills & Knowledge

  • Strong proficiency in SQL, Python, and data pipeline development.
  • Experience with cloud data services (Azure Data Factory, AWS Glue, BigQuery, etc.).
  • Knowledge of data modeling, warehousing, and distributed systems.
  • Familiarity with CI/CD practices and version control tools.
  • Understanding of data governance frameworks.

Education

Bachelor’s degree in Computer Science, Software Engineering, or a related field.

Experience

7+ years of experience in data engineering and data platform development.

Competencies

  • Technical Excellence
  • Attention to Detail
  • Problem Solving
  • Collaboration
  • System Optimization

Why work here?

  • Innovative Environment: Be part of a dynamic team that values innovation and the adoption of cutting-edge tools and methodologies.
  • Professional Growth: We provide numerous opportunities for skill development and career advancement.
  • Collaborative Culture: Experience a company culture that promotes teamwork, inclusivity, and mutual respect.
  • Work-Life Balance: We understand the importance of a balanced life. Flexible hours and remote work options are available to fit your lifestyle.

Apply for this position

Interested candidates are invited to complete the form below and submit a CV along with a cover letter detailing their relevant experience.