Lead II - Data Engineering - Kafka

Location

Bangalore Urban, Karnataka, India

Job Type

Full-time

About the job

About the role

Growel Softech Pvt Ltd

Website: growelsoftech.com
Job details:
Experience: 7 to 10 years

Location: Thiruvananthapuram, Pune, Bangalore, Hyderabad

Shift Timing: 2 PM to 11 PM

Responsibilities

  • Develop and maintain real-time data pipelines using Apache Kafka (MSK or Confluent) and AWS services.
  • Configure and manage Kafka connectors, ensuring seamless data flow and integration across systems.
  • Demonstrate a strong understanding of the Kafka ecosystem, including producers, consumers, brokers, topics, and schema registry.
  • Design and implement scalable ETL/ELT workflows to process large volumes of data efficiently.
  • Optimize data lake and data warehouse solutions using AWS services such as Lambda, S3, and Glue.
  • Implement robust monitoring, testing, and observability practices to ensure data platform reliability and performance.
  • Uphold data security, governance, and compliance standards across all data operations.

Requirements

  • Minimum of 5 years of experience in data engineering or related roles.
  • Proven expertise with Apache Kafka and the AWS data stack (MSK, Glue, Lambda, S3, etc.).
  • Proficient in Python, SQL, and Java (Java strongly preferred); must be flexible writing code in either Python or Java.
  • Experience with infrastructure-as-code tools (e.g., CloudFormation) and CI/CD pipelines.
  • Excellent problem-solving skills and strong communication and collaboration abilities.

Skills

Python
AWS
Apache Kafka
CloudFormation
compliance
data lake
data warehouse
ETL
infrastructure-as-code
Java
Lambda
SQL