Sr. Data Engineer
Dedicated Project

Job Description

Excellent communication and interpersonal skills, as the role requires interacting with cross-functional teams

Large-scale data handling and processing (Parquet, JSON & Avro) in an AWS environment

PySpark experience is a must

Experience building data pipelines in an AWS environment

API development experience for data exchange is desirable

AWS Certified Data Engineer certification is a must

Experience with Agile delivery is preferred


Experience (Min) : 8.0 Years
Experience (Max) : 10.0 Years
Location: Remote


Python AWS API Parquet JSON Avro PySpark Agile Data Engineer
