OpenGov Inc.
Website: opengov.com
Job Description:
Senior Data Platform Engineer
Location: Pune
Tech Stack: Snowflake, dbt (Cloud/Core), AWS, Terraform, Fivetran
About the role
OpenGov is seeking a highly technical Data Platform Engineer to build and scale our centralized data infrastructure. As an early member of the Data Infrastructure team, you will co-architect the systems that power analytics and AI at scale, from ingestion to transformation and governed access. You are the engineer who builds the factory, not just the one who runs the assembly line. You'll create the "paved path" that enables our federated business units to build, deploy, and monitor their own data products with speed and reliability. You will partner with engineering, product, and operations teams across G&A and GTM, so strong collaboration and communication skills are essential to translate requirements into scalable data platform solutions.
What You’ll Own
- The Foundation: Build and maintain our core stack: Snowflake (Data platform), dbt (transformation), AWS/Fivetran (ingestion).
- Outcome Focused: Lead the implementation of ELT/ETL pipelines for analytical and AI/ML-driven use cases, ensuring data is ready for consumption.
- Self-Service Tooling: Build the CI/CD pipelines and dbt Mesh structures that allow federated teams to merge code and deploy models without breaking the central warehouse.
- Infrastructure as Code (IaC): Manage our data environment using Terraform and GitHub to ensure scalability, security, and reproducibility.
- Observability & Trust: Implement "Data SLA" monitoring so we know data is broken before our customers do.
- Security & Governance: Implement and automate Role-Based Access Control (RBAC) in Snowflake, balancing "open data" with strict government compliance (GDPR/SOC 2).
What You Bring
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 8+ Years of Experience in data engineering, platform engineering, or related roles with experience in modular & governed architecture.
- Snowflake & dbt: You don't just write queries; you bring deep experience with the Snowflake platform, including clustering keys, micro-partitions, and query optimization, along with dbt project modularity.
- Proven experience designing and scaling automated ingestion frameworks using Fivetran and AWS services (S3, Lambda, IAM) to reliably and securely move data from disparate sources into Snowflake.
- Platform Mindset: Prior experience building self-service provisioning and CI/CD frameworks.
- Software Engineering Rigor: You treat data code like software (unit tests, documentation, version control, and peer reviews).
- Programming Excellence: Strong proficiency in Python for integration and SQL for advanced data transformation and Snowflake optimization.
- Experience using AI tools (e.g., Codex, Claude Code, Cursor) to improve engineering workflows.
- Strong communication and collaboration skills, with the ability to work effectively across cross-functional teams.
Nice to Have
- Metadata & Discovery: Experience with data cataloging tools (e.g., DataHub) to help federated teams find and trust data.
- Familiarity with business operations data (e.g., Salesforce, Workday, NetSuite) as well as telemetry/operational data from customer-facing products.
- Orchestration Expertise: Familiarity with Airflow or Dagster for managing complex, multi-stage dependencies.
- Streaming Exposure: Experience with real-time data ingestion (e.g., Kafka, Snowflake Snowpipe) for low-latency use cases.
- Public Sector Awareness: Familiarity with government data security standards or handling multi-tenant data architectures.
Click Apply to learn more.