About the role
Design, develop, and optimize ETL/ELT pipelines for healthcare data from multiple sources.
Ensure data quality, security, and compliance through best practices in data governance and management.
Work closely with data analysts to build KPI dashboards and custom reports using BI tools such as Power BI, Tableau, or Qlik.
Maintain and improve data infrastructure to optimize performance and scalability.
Identify and implement new technologies and best practices to improve data workflows.
Education:
Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
Experience:
Proven experience in developing and optimizing ETL/ELT pipelines, preferably in the healthcare sector.
Strong knowledge of SQL and NoSQL databases.
Experience with data processing tools such as Apache Spark, Airflow, or similar frameworks.
Proficiency with BI tools (e.g., Power BI, Tableau, Qlik) for data visualization and reporting.
Technical Skills:
Proficiency in Python and R for data engineering tasks.
Ability to design scalable, efficient data models and architectures.
Soft Skills:
Strong analytical and problem-solving abilities.
Ability to work independently while collaborating with analysts and stakeholders.
Excellent communication skills to translate complex data concepts into business insights.
Nice to Have:
Experience with healthcare data standards (e.g., HL7, FHIR, DICOM).
Understanding of data privacy and security regulations (HIPAA, GDPR).
Frontend development experience for data visualization or dashboard integration.
Experience with cloud platforms (AWS, Azure, or GCP).