Kavi Global - Senior Data Engineer - ETL/ELT Processes

Location

Chennai, Tamil Nadu, India

Job Type

Full-time

About the job

About the role

Kavi Global

Website: kaviglobal.com

Job Summary:

We are seeking a highly skilled Data Engineer to design, develop, and maintain robust data pipelines and architectures.

The ideal candidate will transform raw, complex datasets into clean, structured, and scalable formats that enable analytics, reporting, and business intelligence across the organization.

This role requires strong collaboration with data scientists, analysts, and cross-functional teams to ensure timely and accurate data availability and system performance.

Key Responsibilities

  • Design and implement scalable data pipelines to support real-time and batch processing.
  • Develop and maintain ETL/ELT processes that move, clean, and organize data from multiple sources (a minimal pipeline sketch follows this list).
  • Build and manage modern data architectures that support efficient storage, processing, and access.
  • Collaborate with stakeholders to understand data needs and deliver reliable solutions.
  • Perform data transformation, enrichment, validation, and normalisation for analysis and reporting.
  • Monitor and ensure the quality, integrity, and consistency of data across systems.
  • Optimize workflows for performance, scalability, and cost-efficiency.
  • Support cloud and on-premise data integrations, migrations, and automation initiatives.
  • Document data flows, schemas, and infrastructure for operational and development purposes.
  • Apply best practices in data governance, security, and compliance.
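
For illustration only, a minimal sketch of the kind of batch ETL pipeline described above, assuming Apache Airflow (one of the tools listed later in this posting). The DAG name, schedule, task names, and source/target systems are hypothetical placeholders, not part of this role's actual stack:

```python
# Minimal illustrative Airflow DAG: extract -> transform -> load.
# All names and data here are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a hypothetical source system.
    return [{"id": 1, "amount": "42.5"}, {"id": 2, "amount": "17.0"}]


def transform(ti, **context):
    # Clean and normalise the extracted records for downstream analytics.
    rows = ti.xcom_pull(task_ids="extract")
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]


def load(ti, **context):
    # Load the cleaned records into a hypothetical warehouse table.
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows into the warehouse")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```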

Required Skills & Qualifications

  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
  • 6+ years of proven experience in data engineering, ETL development, or data pipeline management.

Proficiency with tools and technologies such as:

  • SQL, Python, Spark, Scala (a brief Spark sketch follows this list)
  • ETL tools (e.g., Apache Airflow, Talend)
  • Cloud platforms (e.g., AWS, GCP, Azure)
  • Big Data tools (e.g., Hadoop, Hive, Kafka)
  • Data warehouses (e.g., Snowflake, Redshift, BigQuery)
  • Strong understanding of data modelling, data architecture, and data lakes.
  • Experience with CI/CD, version control, and working in agile environments.
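
By way of illustration, a small PySpark snippet of the kind of transformation and validation work listed under the responsibilities above. The column names and storage paths are hypothetical placeholders, not systems referenced by this posting:

```python
# Illustrative PySpark transformation and validation step.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_quality_check").getOrCreate()

# Read raw data from a hypothetical landing zone.
orders = spark.read.json("s3://example-bucket/raw/orders/")

# Normalise types and drop records that fail basic validation rules.
clean = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("order_id").isNotNull() & (F.col("amount") >= 0))
)

# Simple consistency metric: how many rows were rejected by the rules above.
total = orders.count()
kept = clean.count()
print(f"Rejected {total - kept} of {total} rows")

# Write the cleaned dataset to a hypothetical curated zone in Parquet.
clean.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")
```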

Preferred Qualifications

  • Experience with data observability and monitoring tools.
  • Knowledge of data cataloguing and governance frameworks.
  • AWS/GCP/Azure data certification is a plus.

(ref:hirist.tech)

Skills

Python
Agile
Airflow
AWS
Apache
Apache Airflow
Azure
BigQuery
Business Intelligence
Compliance
Cross-functional
Data Engineer
ETL
GCP
Hadoop
Hive
Kafka
Snowflake
Spark
SQL
version control