Data Engineer
Dedicated Project

Job Description

• Experience in developing REST API services using one of the Scala frameworks
• Ability to troubleshoot and optimize complex queries on the Spark platform
• Expert in building and optimizing big data and ML pipelines, architectures, and data sets
• Knowledge of modelling unstructured data into structured data designs.
• Experience in Big Data access and storage techniques.
• Experience in cost estimation based on design and development effort.
• Excellent debugging skills across the technical stack mentioned above, including
analysis of server logs and application logs.
• Highly organized, self-motivated, and proactive, with the ability to propose the best design solutions.
• Good time management and multitasking skills to meet deadlines, working both
independently and as part of a team.

• Ability to analyse and understand complex problems.
• Ability to explain technical information in business terms.
• Ability to communicate clearly and effectively, both verbally and in writing.
• Strong in user requirements gathering, maintenance, and support.
• Excellent understanding of Agile Methodology.
• Good experience in Data Architecture, Data Modelling, Data Security.
 

Experience (Must have):
a) Scala: Minimum 2 years of experience
b) Spark: Minimum 2 years of experience
c) Hadoop: Minimum 2 years of experience (security, Spark on YARN, architectural
knowledge)
d) HBase: Minimum 2 years of experience
e) Hive: Minimum 2 years of experience
f) RDBMS (MySQL / Postgres / MariaDB): Minimum 2 years of experience
g) CI/CD: Minimum 1 year of experience
 

Experience (Good to have):
a) Kafka
b) Spark Streaming
c) Apache Phoenix
d) Caching layer (Memcache / Redis)
e) Spark ML
f) FP (Scala cats / scalaz)


Qualifications
Bachelor's degree in IT, Computer Science, Software Engineering, Business Analytics, or
equivalent, with at least 2 years of experience in big data systems such as Hadoop as well as
cloud-based solutions.

Details

Experience (Min) : 2.0 Years
Experience (Max) : 6.0 Years
Location: Remote

Mandatory Skills

Optional Skills

Hadoop, SDLC, REST API, Scala, Spark, Big Data, ML pipelines, Data sets, Debugging, Agile, Data Architecture, Data Modelling, Data Security, HBase, Hive, RDBMS, MySQL, Postgres, MariaDB, CI/CD, Kafka, Apache Phoenix, Caching layer (Memcache / Redis), Spark ML, FP (Scala cats / Scalaz)
