GCP Data Engineer (PySpark & Python) | Bangalore

Location: Bengaluru, Karnataka, India

Job Type: Full-time

About the job

About the role

Website: pgcdigital.ai
Job details:

We’re Hiring: GCP Data Engineer (PySpark & Python) | Bangalore

📍 Location: Bangalore

💼 Experience: 5+ Years

🔄 Interview Process: 2–3 Rounds

We are looking for a highly skilled GCP Data Engineer with strong expertise in PySpark and Python to join our growing data engineering team. If you have a proven track record of building scalable data pipelines on Google Cloud, we’d love to connect with you!

🔍 Key Responsibilities

  • Design, develop, and deploy scalable ETL/ELT pipelines using PySpark on GCP
  • Build end-to-end data pipelines using Apache Beam / Dataflow / Spark
  • Work extensively with GCP services like BigQuery, Cloud Storage, Dataproc, and Dataflow
  • Optimize PySpark jobs and BigQuery queries for performance and efficiency
  • Handle large-scale structured and unstructured data using Spark SQL & PySpark
  • Build and manage workflows using Apache Airflow / Cloud Composer

🛠️ Required Skills

  • Strong hands-on experience with Google Cloud Platform (GCP)
  • Expertise in BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, and Cloud SQL
  • Proficiency in PySpark, Python, and SQL
  • Experience with the Hadoop ecosystem and Spark-based data processing
  • Knowledge of relational, analytical, and NoSQL databases
  • Strong problem-solving and performance tuning skills

🎓 Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • GCP Professional Data Engineer or Professional Cloud Architect certification is a strong advantage

Skills

Python
Apache Airflow
Apache Beam
BigQuery
Dataflow
Dataproc
ETL
GCP (Google Cloud)
Hadoop
NoSQL
Spanner
SQL