Website:
paypay.co.in
Job details:
Job Description
PayPay India is looking for a Data Engineer to work on our payment system and deliver the best payment experience for our customers. This platform is vital to supporting our growing business demands. The Data Pipeline team is responsible for building, deploying, and managing this platform using leading technologies such as Databricks, Delta Lake, Spark, PySpark, Scala, and the AWS suite.
We are actively seeking skilled Data Engineers to join our team and contribute to scaling our platform across the organization.
Main Responsibilities
- Create and manage robust data ingestion pipelines leveraging Databricks, Airflow, Kafka, and Terraform.
- Ensure high performance, reliability, and efficiency by optimizing large-scale data pipelines.
- Develop data processing workflows using Databricks, Delta Lake, and Spark technologies.
- Maintain and improve the Data Lakehouse, utilizing Unity Catalog for efficient data management and discovery.
- Construct automation, frameworks, and enhanced tools to streamline data engineering workflows.
- Collaborate across teams to facilitate smooth data flow and integration.
- Enforce best practices in observability, data governance, security, and regulatory compliance.
Qualifications
- Minimum of 7 years of experience as a Data Engineer or in a similar role.
- Hands-on experience with Databricks, Delta Lake, Spark, and Scala.
- Proven ability to design, build, and operate Data Lakes or Data Warehouses.
- Proficiency with Data Orchestration tools (Airflow, Dagster, Prefect).
- Familiarity with Change Data Capture tools (Canal, Debezium, Maxwell).
- Strong command of at least one primary language (Scala, Python, etc.) and SQL.
- Experience with data catalog and metadata management (Unity Catalog, Lake Formation).
- Experience in Infrastructure as Code (IaC) using Terraform.
- Excellent problem-solving and debugging abilities for complex data challenges.
- Strong communication and collaboration skills.
- Ability to make informed decisions, learn quickly, and reason about complex technical contexts.
- Willingness to leverage AI/LLM-based tools in daily workflows (e.g., code development, reviews, testing, debugging, documentation), while ensuring human oversight, judgment, and accountability drive the final product.
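Change Data Capture tools such as Debezium emit per-row change events (Debezium's `op` codes are `c` = create, `u` = update, `d` = delete) that a downstream pipeline replays against a target table. A minimal pure-Python sketch of that replay logic, with a simplified event shape assumed for illustration (not Debezium's actual envelope format):

```python
# Simplified CDC replay: apply Debezium-style change events to a target table.
def apply_cdc_events(table: dict, events: list[dict]) -> dict:
    for ev in events:
        op, key = ev["op"], ev["key"]
        if op in ("c", "u"):       # create / update carry the new row image
            table[key] = ev["after"]
        elif op == "d":            # delete removes the row if present
            table.pop(key, None)
    return table

target: dict = {}
events = [
    {"op": "c", "key": 1, "after": {"id": 1, "status": "created"}},
    {"op": "u", "key": 1, "after": {"id": 1, "status": "captured"}},
    {"op": "c", "key": 2, "after": {"id": 2, "status": "created"}},
    {"op": "d", "key": 2},
]
apply_cdc_events(target, events)  # target now holds only row 1, in its latest state
```

In production this replay is typically expressed as a Delta Lake MERGE keyed on the primary key, with event ordering guaranteed per key by Kafka partitioning.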
Click Apply to learn more.