Kafka Developer - Cluster Management

Salary

15 - 25 LPA

Min Experience

3 years

Location

Chennai, Tamil Nadu, India

Job Type

full-time

About the role

Job Summary

We are seeking a skilled and experienced Kafka Developer to join our data engineering team in Chennai.

In this role, you will be responsible for designing, developing, and maintaining Kafka-based data streaming solutions.

You will work closely with data engineers, software developers, and other stakeholders to ensure the reliability, scalability, and performance of our real-time data pipelines.

This is an excellent opportunity to work on cutting-edge data streaming technologies and contribute to the development of our data-driven applications.

Responsibilities

Kafka Development and Implementation

  • Design, develop, and implement Kafka-based data streaming solutions.
  • Configure and manage Kafka clusters.
  • Develop and maintain Kafka Producers and Consumers.
  • Implement data serialization and deserialization using formats like Avro, JSON, or Protobuf.
  • Optimize Kafka performance and ensure scalability.
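Kafka producers and consumers exchange raw bytes, so serializers and deserializers bridge application objects and the wire format. A minimal sketch of a JSON serde pair (a pure standard-library illustration; Avro or Protobuf serdes would typically go through a schema-registry-aware client library instead):

```python
import json

def json_serializer(value: dict) -> bytes:
    """Serialize a record value to UTF-8 JSON bytes for the Kafka wire format."""
    return json.dumps(value, separators=(",", ":")).encode("utf-8")

def json_deserializer(payload: bytes) -> dict:
    """Deserialize UTF-8 JSON bytes read from a topic back into a dict."""
    return json.loads(payload.decode("utf-8"))

# Round-trip: what a producer serializes is what a consumer reads back.
event = {"order_id": 42, "status": "shipped"}
wire_bytes = json_serializer(event)
assert json_deserializer(wire_bytes) == event
```

In a real client these functions would be wired in as the producer's value serializer and the consumer's value deserializer, keeping application code free of byte-level concerns.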

Data Pipeline Development

  • Design and develop real-time data pipelines for data ingestion, processing, and distribution.
  • Integrate Kafka with other data systems, such as databases, data lakes, and message queues.
  • Implement data transformations and enrichments using Kafka Streams or KSQL.
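The filter-then-enrich style of transformation that Kafka Streams or KSQL applies per record can be illustrated with an in-memory sketch (hypothetical record shape and field names; real code would use the KStream DSL or a KSQL query):

```python
# In-memory stand-in for a stream of (key, value) records, mimicking a
# filter() -> mapValues() chain in a Kafka Streams topology.
records = [
    ("sensor-1", {"temp_c": 21.0}),
    ("sensor-2", {"temp_c": 38.5}),
    ("sensor-3", {"temp_c": 40.2}),
]

def enrich(value: dict) -> dict:
    # Enrichment step: tag each surviving reading with an alert flag.
    return {**value, "alert": value["temp_c"] > 40.0}

# filter(): keep readings above 30 C; mapValues(): enrich each one.
hot = [(key, enrich(value)) for key, value in records if value["temp_c"] > 30.0]
```

The equivalent Streams topology would express the same two steps declaratively, with Kafka handling partitioning, state, and fault tolerance.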

Testing and Quality Assurance

  • Develop and execute unit and integration tests for Kafka applications.
  • Monitor and troubleshoot Kafka performance and issues.
  • Ensure data quality and reliability.

Deployment and Maintenance

  • Deploy and maintain Kafka clusters and applications in production environments.
  • Implement monitoring and alerting for Kafka infrastructure.
  • Perform performance tuning and optimization.
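A core health signal for the monitoring mentioned above is consumer lag: the gap between each partition's log-end offset and the consumer group's committed offset. A minimal sketch of that calculation (hypothetical offset numbers; in practice the offsets come from the `kafka-consumer-groups.sh` tool or the AdminClient API):

```python
def consumer_lag(log_end_offsets: dict, committed_offsets: dict) -> dict:
    """Per-partition lag = log-end offset minus committed offset, floored at 0."""
    return {
        partition: max(log_end_offsets[partition] - committed_offsets.get(partition, 0), 0)
        for partition in log_end_offsets
    }

# Hypothetical snapshot for a three-partition topic.
lag = consumer_lag(
    log_end_offsets={0: 1_500, 1: 1_480, 2: 1_510},
    committed_offsets={0: 1_500, 1: 1_450, 2: 1_300},
)
# lag -> {0: 0, 1: 30, 2: 210}; partition 2 is where to investigate first.
```

Alerting on sustained lag growth, rather than any single snapshot, is the usual way to distinguish a stuck consumer from a temporary traffic spike.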

Documentation and Collaboration

  • Create and maintain detailed documentation for Kafka configurations, applications, and processes.
  • Collaborate with data engineers, software developers, and other stakeholders.
  • Participate in code reviews and knowledge sharing sessions.

Requirements

  • Proven experience in Kafka development and implementation.
  • Strong understanding of Kafka architecture and concepts.
  • Proficiency in programming languages such as Java, Scala, Python, or Go.
  • Experience with data serialization formats like Avro, JSON, or Protobuf.
  • Knowledge of Kafka Streams or KSQL.
  • Experience with Kafka Connect.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.
  • Ability to work independently and as part of a team.
  • Bachelor's degree in Computer Science, Information Technology, or a related field.

Preferred Skills (Bonus)

  • Experience with cloud-based Kafka solutions (e.g., Confluent Cloud, AWS MSK, Azure Event Hubs).
  • Knowledge of stream processing frameworks like Apache Flink or Apache Spark Streaming.
  • Experience with containerization technologies (e.g., Docker, Kubernetes).
  • Familiarity with CI/CD pipelines.
  • Understanding of data governance and security best practices.

(ref:hirist.tech)

Skills

kafka
java
scala
python
go
avro
json
protobuf
kafka-streams
ksql
kafka-connect