
Advanced Data Engineer

Min Experience

4 years

Location

Bengaluru, Karnataka, India

Job Type

Full-time

About the job

This job is sourced from a job board.

About the role

The Advanced Data Engineer is responsible for advanced data engineering and architecture. Working as an individual contributor, the Advanced Data Engineer facilitates data engineering design and implementation, develops functional and user-acceptance tests, and constructs solutions that remain scalable, adaptable, and replicable so that standard solutions can be delivered across platforms. The role also provides technical expertise by mentoring other team members.


This role requires experience with Big Data and enterprise data warehouse infrastructure, as well as the ability to adopt appropriate emerging technologies and apply them to business problems, identifying opportunities that impact revenue or operating income.


The Advanced Data Engineer is expected to work closely with ISC Digitization team members and the Enterprise IT team to align practices and priorities in support of the transition of analytics to FORGE, the Big Data platform.

Responsibilities


  • Connect with business partners and identify opportunities to drive business value via analytics solutions.
  • Design and build publication-ready data pipelines using diverse sets of structured and unstructured data.
  • Ensure data pipelines are created using credible qualitative and quantitative methodologies based on key insights.
  • Perform statistical analysis of complex data sets to better understand trends, relationships between variables, and to formulate business intelligence insights.
  • Pay close attention to data accuracy; maintain an in-depth understanding of data identification, collection, processing, and analysis methodologies.
  • Drive continuous improvement and innovation; work effectively with cross-functional teams.
  • Actively consult, conduct pre-development workshops, develop POCs, and lead incubation efforts.
  • Coach and mentor junior data engineers, leading by example as an individual contributor.
  • Work with Honeywell businesses and Enterprise IT teams to clearly outline how enterprise platforms can enable business growth & productivity by seamlessly combining structured and unstructured data into a single, self-service analytical environment.
  • Leverage the know-how and domain knowledge of Honeywell experts to define, design, and develop intelligent solutions that operate on large Supply Chain data sets.

Qualifications


  • Bachelor's degree in engineering/technology or equivalent; specialization in Computer Science, Software Engineering, Information Technology, or a related field is preferred.
  • Microsoft Fabric Data Engineer Associate or Azure Data Engineer certification.
  • Architect certification or Snowflake certification is desirable.
  • Azure, Databricks, Scala, Python, and data visualization certifications are preferred.

Experience:

  • 6+ years of software development, with 4+ years of experience in data engineering.
  • 2+ years of experience with cloud/on-premises data warehouses and data modeling.
  • 2+ years of hands-on experience creating cloud technical solutions with Snowflake EDW, Informatica IICS, and HVR; design and development experience on SQL Server is advantageous.
  • Demonstrated experience in progressively challenging and responsible roles; must have experience working in a matrix organization structure.

Essential skills:

  • Strong experience in Snowflake, including data modeling, performance tuning, query optimization, Snowpipe, Streams/Tasks, User Defined Objects and secure data sharing.
  • Strong experience building scalable data pipelines using SQL, Python, and modern ELT/ETL frameworks, with a focus on automation, reliability, and CI/CD integration.
  • Deep understanding of cloud platforms (Azure, AWS, or GCP), including storage layers, compute services, networking fundamentals, IAM, and cost optimization.
  • Proven ability to design and implement robust data architectures, including batch and streaming patterns, orchestration frameworks, and best practices for data quality, governance, and observability.
  • Proficiency with Informatica IICS, including building and orchestrating cloud-native ETL/ELT pipelines, parameterization, and integration with enterprise data platforms.
  • Clear and confident communicator, able to articulate vision, technical concepts, and requirements effectively to both technical and non-technical stakeholders.
  • Innovative, systems-level thinker with a demonstrated ability to rapidly conceptualize solutions, generate creative ideas, and integrate diverse technologies to solve complex problems.
  • Experienced mentor and coach, capable of guiding and developing seasoned technology specialists, fostering technical excellence, and promoting best practices.
  • Strong business acumen, with the ability to connect data engineering decisions to business strategy, operational needs, and measurable outcomes.
  • Collaborative team player, receptive to feedback, adaptable to evolving priorities, and committed to a positive, high-performance team culture.


Desired skills:

  • Hands-on experience with Databricks for data processing, Spark-based transformations, and collaborative development in notebook-driven workflows.
  • Knowledge of Elasticsearch, Kibana, and REST APIs.
  • Experience with version control, Agile, and DevOps methodologies.
  • Experience developing IoT connectivity solutions using Azure Event Hubs or Apache Kafka.
  • Exposure to the Supply Chain, Manufacturing, and Logistics domains.

Company

Honeywell helps organizations solve the world's most complex challenges in automation, the future of aviation and energy transition. As a trusted partner, we provide actionable solutions and innovation through our Aerospace Technologies, Building Automation, Energy and Sustainability Solutions, and Industrial Automation business segments – powered by our Honeywell Forge software – that help make the world smarter, safer and more sustainable.

Skills

Snowflake
Informatica IICS
Databricks
Spark
Python
SQL
Azure
AWS
GCP
Elasticsearch
Kibana
REST APIs
CI/CD
HVR
Snowpipe
Azure Event Hub
Kafka