Data Engineer (ID: 56374)

Location

Kolkata, West Bengal, India

Job Type

Full-time

About the job

This job is sourced from a job board.

About the role

AgileEngine

Website: agileengine.com
Job details:
AgileEngine is an Inc. 5000 company that creates award-winning software for Fortune 500 brands and trailblazing startups across 17+ industries. We rank among the leaders in areas like application development and AI/ML, and our people-first culture has earned us multiple Best Place to Work awards.

WHY JOIN US
If you're looking for a place to grow, make an impact, and work with people who care, we'd love to meet you!

ABOUT THE ROLE
The Data Engineer role focuses on building and evolving scalable data solutions on Google Cloud Platform to support complex data ecosystems. You will work with technologies such as Python, SQL, BigQuery, and Airflow to enable reliable data processing and analytics. The role offers exposure to modern data architectures, AI-driven solutions, and collaboration with global teams driving impactful data innovation.

WHAT YOU WILL DO
- Build and maintain scalable, distributed, fault-tolerant data pipelines on GCP, including BigQuery-based lakehouse layers and Dataproc-driven Delta Lake workflows;
- Participate in meetings with stakeholders across data engineering, compliance, and business teams globally;
- Build pipelines to acquire, normalize, transform, and release large volumes of financial data;
- Design and implement bitemporal data models on BigQuery;
- Build and maintain software testing frameworks for data pipelines and transformation logic;
- Take ownership of solutions including ingestion pipelines, QA workflows, correction management, and audit trail implementation;
- Collaborate with team members and contribute to shared platform services;
- Support teams in implementing AI solutions and integrating services with data platforms.
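Among the responsibilities above, bitemporal data modeling is the least widely known. The idea can be illustrated with a minimal Python sketch; the record fields and the as-of logic here are illustrative assumptions, not AgileEngine's actual schema. Each row carries both a valid time (when the fact was true in the business domain) and a transaction time (when the system recorded it), so an "as-of" query can reconstruct what the system believed at any earlier point, which is what makes correction management and audit trails possible:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PriceRecord:
    """Hypothetical bitemporal row: valid_from/valid_to track business time,
    tx_from/tx_to track when the row was known to the system (ends exclusive)."""
    security: str
    price: float
    valid_from: date
    valid_to: date
    tx_from: date
    tx_to: date  # date.max means the row is still current

def as_of(records, valid_on, known_on):
    """Records valid on `valid_on`, as the system knew them on `known_on`."""
    return [
        r for r in records
        if r.valid_from <= valid_on < r.valid_to
        and r.tx_from <= known_on < r.tx_to
    ]

records = [
    # Original price, recorded Jan 2, superseded on Jan 10.
    PriceRecord("ACME", 100.0, date(2024, 1, 1), date(2024, 2, 1),
                date(2024, 1, 2), date(2024, 1, 10)),
    # Correction recorded Jan 10 for the same validity window.
    PriceRecord("ACME", 101.5, date(2024, 1, 1), date(2024, 2, 1),
                date(2024, 1, 10), date.max),
]

# What did we believe on Jan 5 about the Jan 15 price? -> 100.0 (pre-correction)
print(as_of(records, date(2024, 1, 15), date(2024, 1, 5))[0].price)
# And after the correction landed? -> 101.5
print(as_of(records, date(2024, 1, 15), date(2024, 1, 10))[0].price)
```

On BigQuery the same pattern is typically expressed with timestamp range predicates over partitioned, clustered tables rather than in-memory filtering, but the two-axis logic is identical.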

MUST HAVES
- 6-8 years of experience in data engineering;
- Proficiency in Python for data pipeline development and automation;
- Strong SQL skills with BigQuery including partitioning, clustering, and time-series queries;
- Hands-on experience with Cloud Composer (Apache Airflow);
- Working knowledge of Dataproc (Apache Spark);
- Experience with Git for versioning and collaboration;
- Familiarity with REST APIs;
- Familiarity with GCP technologies including Cloud Storage, Pub/Sub, Datastream, Cloud Monitoring, IAM, and VPC Service Controls;
- Upper-intermediate English level.

NICE TO HAVES
- Knowledge of pandas, PySpark, or equivalent;
- Knowledge of columnar storage and time-series analytics;
- Familiarity with Dataplex;
- Understanding of Change Data Capture patterns using Datastream;
- Understanding of bitemporal data modeling concepts;
- Understanding of financial reference data;
- Familiarity with BigQuery cost management;
- Exposure to CI/CD pipelines and Terraform;
- Experience with LLMs and Agentic AI using Vertex AI.

PERKS AND BENEFITS
- Remote work & local connection: Work where you feel most productive and connect with your team in periodic meet-ups.
- Legal presence in India: Full local compliance and structured work environment.
- Competitive compensation in INR: Dedicated budgets for growth, education, and wellness.
- Innovative projects: Work with modern technologies and global clients.



Skills

Python
SQL
BigQuery
Apache Airflow
Apache Spark
GCP (Google Cloud)
Git
pandas
REST APIs
Vertex AI
VPC
clustering
compliance
data engineering
data modeling
data solutions