• 7+ years of development experience in the core tools and technologies used by the solution services team, including Python, SQL, PySpark, and AWS (Lambda, Glue, S3, Redshift, Athena, IAM Roles & Policies).
• Architect and build high-performance, scalable data pipelines that adhere to data lakehouse, data warehouse, and data mart standards for optimal storage, retrieval, and processing of data.
• 3+ years of experience in Agile development and code deployment using GitHub and CI/CD pipelines.
• 2+ years of experience in job orchestration using Airflow.
• Expertise in the design, modelling, creation, and management of large datasets and data models
• Ability to work with business owners to define key business requirements and convert them into technical specifications
• Experience with security models and development on large datasets
• Ensure successful transition of applications to the service management team through planning and knowledge transfer
• Develop expertise in the processes and data used by business functions within the US Affiliate
• Responsible for system testing, ensuring effective resolution of defects, timely discussion of business issues, and appropriate management of resources relevant to data and integration
• Partner with and influence vendor resources on solution development to ensure a shared understanding of the data and technical direction for solutions, as well as delivery
Experience (Min): 7.0 years
Experience (Max): 9.0 years
Location: Bangalore, Karnataka, India