Data Engineer - Google Cloud Platform

Salary: ₹35 - 50 LPA

Min Experience: 5 years

Location: Itanagar, Arunachal Pradesh, India

Job Type: Full-time

About the role

Job Title: GCP Data Engineer - Python | PySpark | GCP | DBT | SQL

Experience Required: 5 years

Location: Remote

Mandatory Skills: Python, PySpark, GCP, SQL, DBT, Airflow, Git

Job Summary

We are looking for a Senior Data Engineer to join our growing data platform team. This is a hands-on role for an expert with deep experience in building scalable data pipelines, transforming data with modern frameworks such as PySpark and DBT, and deploying them in cloud environments such as GCP. You will be instrumental in designing and developing the backbone of our data architecture, ensuring data is fast, reliable, and accessible for downstream analytics, reporting, and machine learning.

Key Responsibilities

  • Design and implement automated, scalable, and reliable data pipelines.
  • Work collaboratively with data scientists and analysts to translate business needs into data solutions.
  • Develop and maintain data models, transformation workflows, and data validation frameworks.
  • Build and deploy data processing code using PySpark, SQL, and Python in Databricks environments.
  • Implement workflows and orchestration using Airflow or similar tools.
  • Maintain clean, well-documented, and version-controlled code using Git/GitHub.
  • Apply best practices in software engineering: unit testing, code reviews, CI/CD.
  • Monitor and optimize data storage and compute resources for cost and performance.
  • Work in a cross-functional team with a clear focus on delivering high-quality, production-grade systems.

Required Qualifications

  • Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field.
  • 5+ years of hands-on experience in data engineering roles.
  • Strong expertise in Databricks, PySpark, and Spark SQL, including optimization techniques, Unity Catalog, and workflow management.
  • Proficiency with Python for scripting and data manipulation.
  • Solid experience with SQL and relational database systems.
  • Proficient in DBT for data transformation and model management.
  • Strong understanding of Google Cloud Platform (GCP) or any major cloud provider.
  • Experience working with Linux environments and shell scripting.
  • Skilled in data orchestration using Airflow or similar tools.
  • Familiarity with version control systems (Git/GitHub).
  • Strong communication skills with both technical and non-technical stakeholders.

(ref:hirist.tech)

Skills

Python
PySpark
GCP
SQL
DBT
Airflow
Git