About the role
Responsibilities
• Design, implement, and optimize data pipelines using Apache NiFi.
• Ingest, transform, and distribute data across multiple sources such as databases, APIs, cloud storage, and messaging systems.
• Ensure data quality, security, and compliance within data flows.
• Monitor and troubleshoot NiFi performance, dataflow bottlenecks, and failures.
• Integrate NiFi with various databases, cloud services, and big data technologies.
• Collaborate with data analysts, developers, and DevOps teams to support data-driven solutions.

Qualifications
• Strong understanding of ETL/ELT pipelines and data integration concepts.
• Familiarity with big data and streaming technologies (Kafka, Spark, Hadoop, etc.).
• Working knowledge of Java and/or Python.
• Knowledge of StreamSets and/or other data pipeline technologies is a plus.