Senior Data Engineer

Salary

₹15 - 25 LPA

Min Experience

3 years

Location

Bangalore

Job Type

Full-Time

About the role

Experience: 3-6 Years

Location: Whitefield, Bangalore

Employment Type: Full-Time

Compensation: Up to ₹35 Lakhs


Key Responsibilities

  • Data Pipeline Development: Design, develop, and maintain efficient ETL pipelines to collect, transform, and load data.
  • Data Modeling: Develop Logical and Physical Data Models, including data warehouse designs, to support business requirements.
  • Monitoring and Maintenance: Implement monitoring and alerting systems to ensure the reliability of data pipelines. Perform routine maintenance and troubleshooting as needed.
  • Collaboration: Work closely with upstream and downstream teams to ensure end-to-end execution of data pipelines. Streamline workflows and processes to support team expansion and inter-team collaboration.
  • Optimization: Optimize data processing and storage solutions for scalability and efficiency when working with large volumes of data.
  • Development Support: Contribute to all phases of development, including design, implementation, and operation of production systems.
  • Code Quality: Write clean, maintainable, and efficient code. Demonstrate a solid understanding of Object-Oriented Design (OOD) principles and ensure coding best practices are followed.
  • Problem Solving: Handle ambiguous or undefined problems with a structured and analytical approach.
  • Data Security: Ensure adherence to data security and privacy principles in all aspects of data handling.


Required Skills and Qualifications

  • Programming Expertise: Strong proficiency in Python for data manipulation and transformation.
  • Database Knowledge: In-depth understanding of either NoSQL or relational databases, with experience in data modeling.
  • Workflow Management: Hands-on experience with workflow management tools like Apache Airflow.
  • ETL Expertise: Proven experience building and maintaining ETL pipelines for large-scale data processing.
  • Design Principles: Strong grasp of Object-Oriented Design concepts and best coding practices.
  • Problem-Solving Skills: Ability to tackle undefined or ambiguous problems effectively.
  • Data Security: Understanding of data security and privacy principles.

Nice-to-Have Skills

  • Familiarity with cloud platforms (e.g., AWS, GCP, Azure) for data storage and processing.
  • Experience with big data tools like Spark, Kafka, or Hadoop.
  • Knowledge of CI/CD pipelines and DevOps practices.


How to Apply

Interested candidates can send their resume and a brief cover letter to shivani@hireveda.com or apply via https://lnkd.in/gft_duez.

Skills

ETL
Python
Apache Airflow
Data Warehouse