Job Description
Key Responsibilities
Design, build, and maintain end-to-end data pipelines
Develop scalable ETL/ELT workflows for structured and unstructured data
Integrate data from multiple sources (APIs, databases, third-party tools)
Ensure data quality and reliability, and optimize pipeline performance
Build and maintain data warehouses/lakes
Implement data validation, monitoring, and logging
Collaborate with product, engineering, and analytics teams
Maintain documentation for pipelines, schemas, and processes
Follow data security and compliance best practices
Required Skills & Experience
3+ years of experience in a Data Engineering role
Strong experience with:
Python (preferred) or Scala
SQL (advanced querying and optimization)
Hands-on experience with:
ETL/ELT pipeline development
Data warehousing concepts
Workflow orchestration tools (e.g., Airflow, Prefect)
Experience working with:
Relational databases (Postgres, MySQL, etc.)
Cloud platfor...