Job Description
Design and develop Databricks solutions leveraging Lakehouse architecture for enterprise data processing and analytics
Develop and optimize ETL/ELT pipelines
Create and manage structured streaming pipelines for real-time data processing
Configure and tune Databricks clusters and Spark jobs for optimal performance
Utilize Delta Live Tables for data ingestion and transformations
Apply Unity Catalog features and IAM best practices for security, governance, and access control
Support infrastructure and resource management using Terraform
Participate in Agile/Scrum development process and collaborate with team members
Implement monitoring solutions for pipeline performance and data quality
Contribute to code reviews and knowledge-sharing sessions
What you must have:
6+ years of ex...