

Data Engineer with expertise in SPARQL

Location

India

Job Type

Full-time

About the job

This job is sourced from a job board.

About the role

**Work Location:** 7 Deloitte USI locations - Hyderabad, Mumbai, Delhi/NCR, Bengaluru, Kolkata, Pune, and Chennai

**Job Description**

* Design, develop, and maintain scalable data pipelines using Python, ensuring efficient data integration, transformation, and performance optimization.
* Use graph technologies to model complex data relationships, manage graph databases, and implement SPARQL queries for data retrieval.
* Implement and manage automated data workflows using orchestration tools (e.g., Apache Airflow, Prefect) to ensure reliability and fault tolerance.
* Ensure data accuracy and consistency through quality checks, and enforce data governance policies to maintain data integrity.
* Work with data scientists, analysts, and stakeholders to deliver data solutions, and clearly communicate technical concepts to non-technical audiences.
* Leverage the Prophecy platform to design and manage data workflows, ensuring smooth integration and optimization for scalability.

**Required Skills**

* Expertise in graph databases and technologies (e.g., Neo4j, SPARQL).
* Proven experience in Python and data pipeline development.
* Hands-on experience with data orchestration tools (e.g., Apache Airflow, Prefect).
* Strong understanding of data quality, governance, and integrity best practices.
* Excellent communication and collaboration skills.
* Experience with the Prophecy platform is a plus.
* A continuous learning mindset to stay current with evolving data engineering trends and tools.

Skills

python
graph databases
SPARQL
data pipeline development
data orchestration tools
data quality
data governance
data integrity