
Cloud Engineer

Location

Bengaluru, Karnataka, India

Job Type

Full-time

About the job

This job is sourced from a job board.

About the role

Recro

Website: recro.io
Job details:

AWS & GCP Data Engineer

Location: Hyderabad or Bangalore

Experience: 5 to 9 years


Role Responsibilities

  • Design and develop scalable end-to-end data products aligned with enterprise data standards.
  • Build and optimize batch and near-real-time data pipelines.
  • Implement Bronze / Silver / Gold data models and curated data products.
  • Engineer solutions on AWS (S3, Glue, MWAA, Athena/Redshift) and GCP (GCS, BigQuery, Dataproc, Dataflow).
  • Collaborate with business stakeholders, analysts, and data scientists to deliver data solutions.
  • Contribute to reusable patterns, lighthouse initiatives, and platform modernization.
  • Mentor junior engineers and drive engineering best practices.

Role Purpose

  • Design and build scalable, governed data products aligned to data-as-a-product strategy.
  • Bridge architecture, engineering, and business process requirements.
  • Enable analytics and AI use cases on AWS and GCP platforms.
  • Ensure data quality, governance, and lineage across pipelines.

Minimum Requirements

  • 5+ years of experience in Data Engineering / Data Platforms.
  • Strong hands-on experience with AWS and GCP.
  • Proficiency in PySpark, SQL, and Python.
  • Experience with Airflow / Cloud Composer / MWAA.
  • Solid understanding of lakehouse architecture and data modelling.
  • Experience working with enterprise business processes (Finance, Supply Chain, Sales, etc.).
  • Strong communication and stakeholder collaboration skills.


Skills

Python
Airflow
AWS
BigQuery
data engineer
data models
data solutions
Dataflow
Dataproc
end-to-end
GCP
GCS
SQL