Genpact is hiring for Lead Consultant – Data Engineer (AWS + Python)

Location : Pune

Qualification : Bachelor's Degree (Master's preferred)


Experience : Freshers/Experienced

Job Description :

Genpact is seeking a Lead Data Engineer with strong expertise in AWS cloud services, Python, and SQL to design, implement, and optimize data pipelines and applications. The role involves working on end-to-end data engineering solutions including cloud architecture, migration, ETL development, and performance tuning in a fast-paced, collaborative environment.

You’ll be responsible for building scalable data solutions, automating processes, securing cloud infrastructure, and supporting business decision-making with reliable data systems.


Key Responsibilities :

1. Data Pipeline Development & Cloud Solutions

Design and deploy scalable, secure, and cost-effective AWS data pipelines using Glue, Lambda, Step Functions, and Redshift.

Build and manage ETL workflows, data ingestion frameworks, and data lakes.
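For illustration, a minimal extract-transform-load step of the kind such pipelines perform might look like the sketch below. The record schema and the in-memory "load" target are hypothetical stand-ins; a production pipeline at this level would run as Glue/Spark jobs against S3 and Redshift rather than in-memory lists.

```python
# Minimal ETL sketch. Field names and the in-memory sink are hypothetical
# placeholders for an S3 -> Glue -> Redshift flow.

def extract(raw_rows):
    """Extract: parse raw CSV-like rows into dicts."""
    return [dict(zip(("id", "amount", "region"), row.split(","))) for row in raw_rows]

def transform(records):
    """Transform: cast types, normalize region, drop malformed rows."""
    cleaned = []
    for rec in records:
        try:
            cleaned.append({
                "id": int(rec["id"]),
                "amount": float(rec["amount"]),
                "region": rec["region"].strip().upper(),
            })
        except ValueError:
            continue  # skip bad rows instead of failing the whole batch
    return cleaned

def load(records, sink):
    """Load: append validated records to the target store; return row count."""
    sink.extend(records)
    return len(records)

raw = ["1,10.5, apac", "2,oops,emea", "3,7.25, na"]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)        # 2 (the malformed middle row is dropped)
print(warehouse[0])  # {'id': 1, 'amount': 10.5, 'region': 'APAC'}
```

The batch-tolerant `try/except` in the transform step reflects a common pipeline design choice: quarantine or skip bad records rather than aborting an entire ingestion run.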

2. System Optimization & Security

Implement monitoring and optimization strategies for system performance and cloud cost-efficiency.

Apply IAM, security groups, encryption, and other security protocols to protect data.
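As one concrete example of the IAM side of this work, a least-privilege policy for a job that reads from one S3 bucket and writes to another might look like the following. The bucket names are placeholders, not real resources.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadSourceBucket",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-source-bucket",
        "arn:aws:s3:::example-source-bucket/*"
      ]
    },
    {
      "Sid": "WriteTargetBucket",
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": ["arn:aws:s3:::example-target-bucket/*"]
    }
  ]
}
```

Scoping each statement to specific actions and bucket ARNs, rather than granting `s3:*`, is the essence of the least-privilege practice the role calls for.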

3. Data Migration & Integration

Migrate data from legacy databases to cloud solutions like Redshift or DynamoDB.

Work with tools like Apache Spark, Hadoop, and Databricks for big data processing.
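To sketch the flavor of a legacy-to-cloud migration step, the snippet below reshapes relational rows into DynamoDB-style key-value items. The table columns and the composite-key scheme are invented for illustration; a real migration would typically run through AWS DMS or a Spark/Glue job.

```python
# Reshape rows from a hypothetical legacy "orders" table into
# DynamoDB-style items with a composite partition/sort key.

def row_to_item(row):
    """Map a (customer_id, order_id, total) tuple to a key-value item."""
    customer_id, order_id, total = row
    return {
        "pk": f"CUSTOMER#{customer_id}",  # partition key: groups a customer's data
        "sk": f"ORDER#{order_id}",        # sort key: orders sort under the customer
        "total": str(total),              # numbers are serialized as strings in the DynamoDB wire format
    }

legacy_rows = [(101, "A-1", 49.99), (102, "B-7", 15.00)]
items = [row_to_item(r) for r in legacy_rows]
print(items[0]["pk"])  # CUSTOMER#101
```

The `CUSTOMER#`/`ORDER#` prefix convention is one common single-table design pattern; the right key scheme depends on the access patterns gathered during requirements analysis.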

4. Requirements Gathering & Technical Leadership

Collaborate with stakeholders to understand requirements and translate them into technical specifications.

Review and contribute to design documentation, perform code reviews, and guide junior engineers.

5. Deployment & Maintenance

Support deployment in production environments; implement disaster recovery and backup strategies.

Perform unit testing, debugging, and troubleshooting for smooth delivery.

Required Skills

Technical Skills:

  • Expertise in AWS services: Glue, Lambda, Redshift, S3, Step Functions
  • Strong programming in Python
  • Advanced SQL and relational database management
  • Experience with Apache Spark, Hadoop, and Databricks (preferred)
  • Familiarity with data lake architecture, data warehousing, and real-time streaming

Cloud & Security:

  • Proficient in IAM, encryption, and cloud security practices
  • Experience in designing fault-tolerant, scalable systems on AWS

Soft Skills:

  • Excellent analytical and problem-solving skills
  • Strong communication and collaboration skills for cross-functional teamwork
  • Ability to document processes, mentor peers, and manage project tasks

