Job Description
• Strong experience with Python and Spark (PySpark)
• Hands-on with Databricks (Jobs, Workflows, Delta Lake, Unity Catalog)
• Proficient in SQL for complex data transformations and optimizations
• Proficient in CI/CD principles, version control and best practices for deploying workloads to production
• Solid understanding of distributed data processing and production-grade data workflows
Nice to Have
• Exposure to Machine Learning workflows and tools like MLflow
• Exposure to Generative AI stack (LLMs, Agents, MCP Servers)
• Familiarity with Snowflake, Airflow, or similar orchestration and warehousing platforms
Seniority level
Mid-Senior level
Employment type
Contract
Job function
Other
Industries
IT Services and IT Consulting