Offshore Pod Leads (TBD, AN, IN)
NTT DATA
About This Role
Req ID: 360093
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.
We are currently seeking Offshore Pod Leads to join our team in TBD, Andaman and Nicobar Islands (IN-AN), India (IN).
JOB DESCRIPTION
Data Engineering Pod Lead
Databricks Lakehouse Migration Program
Two Roles: Informatica Pod Lead | AWS Glue Pod Lead
Engagement Type
Contract / Staff Augmentation or Full-Time Employee (FTE) - Open
Seniority Level
Lead / Architect - 12+ years of relevant experience
Number of Openings
2 (one per pod)
Team Size
4-6 Data Engineers per pod lead
Cloud Platform
AWS (Glue, Redshift, S3, Kinesis Streams, IAM, CloudWatch)
Target Platform
Databricks Lakehouse (Unity Catalog, Delta Lake, Workflows)
Program Type
Client-facing migration engagement - ETL modernization
Program Context & Opportunity
Our client is undertaking a large-scale data platform modernization initiative - migrating from a legacy ETL ecosystem (Informatica PowerCenter, AWS Glue, and Amazon Kinesis Streams) feeding Amazon Redshift into a unified Databricks Lakehouse architecture built on Delta Lake. This is a high-impact, high-visibility program requiring experienced technical leaders who can navigate complex legacy systems, architect modern solutions, and lead skilled engineering teams through the full migration lifecycle.
We are hiring two dedicated Pod Leads - one for each legacy source domain - who will be jointly accountable for technical excellence, delivery velocity, and team development throughout the engagement.
Common Responsibilities - Both Pod Leads
Technical Leadership & Architecture
• Own the end-to-end technical design and implementation of the migration from the respective source platform to Databricks Lakehouse (Delta Lake, Unity Catalog, Databricks Workflows).
• Conduct thorough assessments of existing ETL jobs - analyzing lineage, dependencies, transformation logic, scheduling, and data quality rules - prior to migration planning.
• Define migration patterns, reusable frameworks, and coding standards adopted across the pod.
• Architect scalable, cost-efficient pipelines using Databricks PySpark, Spark SQL, and Delta Live Tables (DLT) as appropriate.
• Make and document key architectural decisions (ADRs) with clear rationale and trade-off analysis.
• Drive adoption of software engineering best practices: version control (Git), CI/CD, unit testing, and code review within the pod.
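To make the "reusable frameworks" expectation concrete, a migrated Informatica workflow or Glue job typically lands as a multi-task Databricks Workflows job. The fragment below is an illustrative sketch only; the job name, notebook paths, and cluster settings are hypothetical placeholders, not client specifics.

```json
{
  "name": "example_orders_migration_job",
  "tasks": [
    {
      "task_key": "bronze_ingest",
      "notebook_task": { "notebook_path": "/Repos/migration/bronze_ingest" }
    },
    {
      "task_key": "silver_transform",
      "depends_on": [ { "task_key": "bronze_ingest" } ],
      "notebook_task": { "notebook_path": "/Repos/migration/silver_transform" }
    },
    {
      "task_key": "gold_publish",
      "depends_on": [ { "task_key": "silver_transform" } ],
      "notebook_task": { "notebook_path": "/Repos/migration/gold_publish" }
    }
  ]
}
```

Pod leads would be expected to define patterns like this once, then have engineers instantiate them per legacy workflow rather than hand-building each job.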
Team Leadership & Delivery Management
• Directly lead a pod of 4-6 Data Engineers, providing technical mentorship, task assignment, code reviews, and unblocking day-to-day impediments.
• Manage sprint planning, backlog refinement, and progress tracking against migration milestones in close coordination with the Program Manager.
• Hold the team accountable for quality and velocity - proactively flag risks, scope changes, and dependencies before they become blockers.
• Conduct regular 1:1s and technical feedback sessions to support the professional growth of pod members.
• Foster a culture of ownership, collaboration, and continuous improvement within the pod.
Client & Stakeholder Communication
• Serve as the primary technical point of contact for your pod's workstream with the client.
• Translate complex technical concepts and migration trade-offs into clear, concise communications for both technical and non-technical stakeholders.
• Participate in program-level status reviews, architecture governance meetings, and client steering committees as required.
• Manage expectations around scope, timelines, and quality, escalating issues appropriately.
Quality, Governance & Documentation
• Ensure all migrated pipelines meet data quality, SLA, and observability requirements defined by the client.
• Champion data governance best practices including lineage tracking, catalog registration in Databricks Unity Catalog, and access control alignment.
• Produce and maintain clear technical documentation: architecture diagrams, runbooks, migration playbooks, and handover materials.
• Coordinate with QA/testing resources to validate migrated pipelines against source-system outputs.
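Validating a migrated pipeline against source-system output usually reduces to reconciling extracts from both platforms. The sketch below is a minimal, hypothetical illustration of that idea in plain Python, comparing row counts and order-independent per-column checksums; in practice this logic would run over Redshift and Delta Lake extracts at much larger scale.

```python
import hashlib

def column_checksum(rows, column):
    """Order-independent checksum of one column: XOR of per-value hashes."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row[column]).encode()).digest()
        acc ^= int.from_bytes(digest[:8], "big")
    return acc

def validate_migration(source_rows, target_rows, columns):
    """Return a list of discrepancies between source and migrated output."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(
            f"row count: source={len(source_rows)} target={len(target_rows)}"
        )
    for col in columns:
        if column_checksum(source_rows, col) != column_checksum(target_rows, col):
            issues.append(f"checksum mismatch in column '{col}'")
    return issues

# Same data in a different row order should reconcile cleanly.
source = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 20.5}]
target = [{"id": 2, "amt": 20.5}, {"id": 1, "amt": 10.0}]
print(validate_migration(source, target, ["id", "amt"]))  # → []
```

An empty result means the migrated pipeline reproduced the source output; any returned strings point the team at the specific column or count to investigate.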
ROLE 1 - Informatica PowerCenter Pod Lead
Role Overview
The Informatica Pod Lead will own the migration of Informatica PowerCenter-based ETL jobs to the Databricks Lakehouse platform. This role demands deep expertise in Informatica's architecture, transformation logic, and metadata - paired with the ability to re-engineer complex legacy workflows into modern, cloud-native Databricks pipelines on AWS.
Role-Specific Responsibilities
• Analyze and decompose Informatica PowerCenter mappings, sessions, workflows, and worklets to understand full transformation logic, source/target connectivity, and scheduling dependencies.
• Define and execute a structured migration methodology - assess, convert, validate - for ...