Job Description
Key Responsibilities
- Design, develop, and maintain robust ETL/ELT pipelines for both batch and streaming data
- Integrate data from a variety of source systems (e.g., APIs, Kafka, relational databases, file systems)
- Perform data cleansing, enrichment, and validation to ensure data quality and reliability
- Support implementation of data lineage and versioning across pipelines
- Collaborate with BI, Data Science, and Platform teams to support cross-functional data needs
- Optimize pipeline performance and manage data partitioning, schema evolution, and table formats
Qualifications
- Bachelor’s degree in Information Technology, Computer Engineering, Computer Science, or related field
- 3+ years of experience in data engineering or a similar role; fresh graduates with strong potential will also be considered
- Proficiency in SQL and Python, with strong understanding of data manipulation and p...