Data Engineer
Dedicated Project

Job Description

Role : AWS Data Engineer


  • Development of data platforms, integration frameworks, processes, and code.
  • Develop and deliver APIs in Python or Scala for Business Intelligence applications built using a range of web languages.
  • Develop comprehensive automated tests for features via end-to-end integration tests, performance tests, acceptance tests, and unit tests.
  • Elaborate stories in a collaborative agile environment (Scrum or Kanban).
  • Familiarity with cloud platforms like GCP, AWS or Azure.
  • Experience with large data volumes.
  • Familiarity with writing REST-based services.
  • Experience with distributed processing and systems
  • Experience with Hadoop / Spark toolsets
  • Experience with relational database management systems (RDBMS)
  • Experience with Data Flow development
  • Knowledge of Agile and associated development techniques, including:
      • Iterative Development
      • Refactoring
      • Unit Testing
      • Continuous Integration and Delivery
      • Acceptance Test Driven Development
  • Knowledge of formal testing and deployment methods, applied from conception onward
  • Snowflake and Redshift experience is compulsory


Experience (Min) : 3.0 Years
Experience (Max) : 5.0 Years
Job Type : Onsite & remote
Location : Gurgaon, Haryana, India; Pune, Maharashtra, India; Bangalore, Karnataka, India

Primary Skills

AWS, AWS Data Engineering, Snowflake, Redshift

Secondary Skills

Frameworks, APIs, Scala, Python, Testing, SCRUM, Kanban, Cloud platforms (GCP, AWS, Azure), REST-based services, Hadoop, Agile, Spark, RDBMS, Data Flow, Iterative Development, Refactoring
