Data Management Team Lead

Location

Gurugram, Haryana, India

Job Type

Full-time

About the job

This job is sourced from a job board.

About the role

EXL

Website: exlservice.com
Job details:

Role Overview

We are seeking a Fractional Solution Architect to lead end-to-end architecture for modern data platforms, analytics, AI, and automation initiatives. This is a principal-level, tech-agnostic role responsible for translating business outcomes into scalable, cloud-portable architecture decisions across AWS, Azure, Databricks, Microsoft Fabric, Snowflake, and hybrid ecosystems. The role includes ownership of complex data migration and replatforming programs (legacy to cloud, warehouse to lakehouse, monolith to cloud-native), ensuring minimal disruption, performance integrity, governance continuity, and long-term architectural sustainability.

The ideal candidate is an architecture leader who:

· Owns end-to-end architecture decisions beyond diagrams

· Leads complex data migration and replatforming initiatives with structured, phased execution

· Chooses the right tools for the right workloads without vendor bias

· Owns non-functional trade-offs: cost, risk, scale, security, governance, auditability

· Designs for portability, scale, and long-term sustainability

· Ensures performance, scalability, and operational resilience during and after migration

· Balances business priorities with strong engineering rigor

---

Key Responsibilities

Architecture Leadership & Modernization Strategy

· Lead end-to-end solution architecture across analytics, AI, automation, and data platforms

· Translate business objectives into architecture patterns and platform decisions

· Define target-state architecture principles: scalability, resiliency, security-by-design, observability, cost optimization, maintainability

· Drive cloud-native modernization strategies (monolith → managed services)

· Establish ingestion, transformation, storage, and serving reference patterns

· Maintain architecture documentation: current/target state, ADRs, integration flows, NFRs

· Design vendor-neutral, cloud-portable patterns

Platform & Technology Decision Ownership

· Drive architecture decisions across AWS, Azure, Databricks, Microsoft Fabric, Snowflake, and hybrid stacks

· Decide which tool fits which workload — without vendor bias

· Define lake/lakehouse/warehouse strategies

· Own non-functional tradeoffs: cost, scale, risk, security, auditability, governance

· Ensure solutions are future-proof and cloud-portable

Delivery Enablement & Stakeholder Leadership

· Partner with Sales in pre-sales, solutioning, and RFP responses

· Shape engagement scope, phasing, and delivery strategy

· Define core architecture scope vs. attach/skill pods

· Provide architecture governance through design reviews and risk management

· Guide engineering teams across Databricks, AWS-native, Azure-native, or hybrid deployments

· Act as trusted advisor without becoming sales-driven

Scalable Data Platform Design

· Design scalable batch + streaming data platforms

· Define ingestion patterns (batch, CDC, event-driven, streaming)

· Establish multi-layered data architecture patterns (e.g., bronze/silver/gold)

· Optimize distributed processing workloads (Spark, Flink, Beam, equivalent)

· Ensure high-volume scalability (1TB+/day workloads or equivalent)

· Design storage layouts, partitioning strategies, compaction, caching, and query optimization

· Ensure performance baselines and ongoing optimization

Engineering & Execution Excellence

· Translate architecture into actionable engineering designs

· Own decisions on processing engines, storage formats, data delivery mechanisms

· Design CI/CD pipelines for data & platform deployments

· Implement Git-based workflows and automated promotion strategies

· Establish observability, monitoring, resilience, and error recovery frameworks

· Ensure strict data freshness SLAs

Data Modeling & Data Product Engineering

· Design curated, high-value data products for analytics, APIs, ML, and monetization

· Implement schema evolution and incremental processing strategies

· Ensure consistency, conformity, lineage, and quality across layers

· Enable downstream consumption via APIs, reports, and ML pipelines

---

Must-Have Skills & Experience

Technical Foundations

· Strong data architecture fundamentals (Lake / Lakehouse / Warehouse)

· Cloud architecture expertise (AWS / Azure / GCP)

· Cross-cloud deployment experience

· Deep understanding of analytics + AI lifecycle

· Spark & SQL concepts

· Data modeling & performance tuning

· Data migration expertise

Architectural Capabilities

· Translating business outcomes → architecture decisions

· Designing cloud-portable patterns

· Deciding tool-to-workload fit objectively

· Owning non-functional requirements end-to-end

· Making tradeoffs across cost, risk, scale, governance

Leadership & Communication

· Executive-level communication

· Structured thinking and decision clarity

· Ability to say what not to build

· Influence without authority

· Partner effectively with business and engineering stakeholders


Skills

AWS
Azure
Beam
caching
data architecture
data modeling
Databricks
Flink
GCP
Snowflake
Spark
SQL