Job Description
Job Family:
Data Engineering & Architecture Consulting
Travel Required:
Clearance Required:
What You Will Do:
Develop and implement CI/CD pipelines for Databricks notebooks and jobs.
Develop ETL pipelines using PySpark and Databricks.
Implement Delta Lake for ACID transactions and data reliability.
Optimize ingestion from APIs, streaming, and batch sources.
Ensure compliance with data governance and security standards.
Collaborate with data engineers and data scientists to support data pipelines and ML workflows.
Conduct ETL and data quality analysis using technologies such as Python and Databricks.
Ensure data gover...