Job Description
b) Technical Expertise
- Proven experience converting SAS Enterprise Guide (EG) code to PySpark, SQL, and/or R
- Familiarity with SAS macros, DATA steps, and PROC SQL
- Experience with Databricks notebooks, Delta Lake, and Spark SQL
- Ability to optimize PySpark code for performance and scalability
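By way of illustration, a conversion of the kind described above typically maps a SAS DATA step's WHERE clause and derived columns onto DataFrame transformations. The sketch below shows the equivalent filter-then-derive logic in plain Python (dictionaries standing in for rows); the dataset and column names are hypothetical, and the PySpark form is noted in a comment.

```python
# Hypothetical SAS DATA step to convert:
#   data work.high_earners;
#     set work.employees;
#     where salary > 50000;
#     annual_bonus = salary * 0.1;
#   run;

employees = [
    {"name": "A", "salary": 60000},
    {"name": "B", "salary": 40000},
]

# Equivalent row-level logic: filter (WHERE), then derive a new column.
high_earners = [
    {**row, "annual_bonus": row["salary"] * 0.1}
    for row in employees
    if row["salary"] > 50000
]

# The same transform expressed against a PySpark DataFrame would be roughly:
#   df.where(df.salary > 50000).withColumn("annual_bonus", df.salary * 0.1)
print(high_earners)
```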
c) Mainframe & DB2 Integration
- Experience accessing DB2 data via ODBC/JDBC or ETL tools
- Familiarity with mainframe systems (JCL, COBOL, IMS)
- Ability to replicate or migrate data from DB2 to cloud data lake
d) Tooling & Automation
- Use of SAS-to-PySpark conversion tools (e.g., SAS2PY, SPROCKET, Scintilla, AI tools)
- Automated testing and validation frameworks for data parity
- Schema mapping and lineage tracking capabilities
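Data-parity validation of the kind listed above usually compares row counts and key distributions between the legacy (SAS) output and the migrated (PySpark) output. A minimal sketch, assuming both outputs fit in memory as lists of dicts; the function and field names are illustrative, not a specific framework's API:

```python
from collections import Counter

def parity_report(legacy_rows, migrated_rows, key_columns):
    """Compare two row sets: row counts plus a multiset check on key columns."""
    report = {"row_count_match": len(legacy_rows) == len(migrated_rows)}
    legacy_keys = Counter(tuple(r[c] for c in key_columns) for r in legacy_rows)
    migrated_keys = Counter(tuple(r[c] for c in key_columns) for r in migrated_rows)
    report["keys_match"] = legacy_keys == migrated_keys
    # Keys present on one side only, for drill-down into mismatches.
    report["only_in_legacy"] = list((legacy_keys - migrated_keys).elements())
    report["only_in_migrated"] = list((migrated_keys - legacy_keys).elements())
    return report

# Usage with toy data (row order deliberately differs between the two sides):
legacy = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 20.0}]
migrated = [{"id": 2, "amt": 20.0}, {"id": 1, "amt": 10.0}]
print(parity_report(legacy, migrated, key_columns=["id"]))
```

Using a multiset (Counter) rather than a set makes the check order-independent while still catching duplicated or dropped rows, which is the usual failure mode after a migration.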
e) Project Delivery & Governance
- Experience successfully onboarding to client systems
- Clear do...