Job Overview:
We are seeking a talented Apache NiFi Workflow Developer to design, implement, and optimize data workflows using Apache NiFi. The ideal candidate will have hands-on experience building scalable workflows for data integration, transformation, and orchestration in real-time and batch environments. This is a fully remote position that offers flexibility, professional growth, and the opportunity to work on exciting, large-scale data projects.
Key Responsibilities:
- Develop and manage end-to-end data workflows using Apache NiFi, including the creation of processors, data pipelines, and real-time data flows.
- Design and optimize ETL/ELT workflows to handle structured, semi-structured, and unstructured data.
- Configure, deploy, and monitor NiFi clusters, ensuring optimal performance, reliability, and scalability.
- Integrate various data sources, including APIs, databases, messaging systems (e.g., Kafka), and cloud or distributed storage (e.g., S3, HDFS).
- Implement version control and manage data flow configurations using NiFi Registry.
- Collaborate with cross-functional teams, including data engineers, architects, and DevOps, to meet business and technical requirements.
- Troubleshoot and resolve issues in NiFi workflows, minimizing downtime and preventing data loss.
- Document workflow processes, best practices, and configurations for transparency and operational efficiency.
- Ensure adherence to data security, compliance, and governance best practices.
Required Skills and Qualifications:
- 3-5 years of hands-on experience with Apache NiFi, focusing on workflow design, configuration, and optimization.
- Solid understanding of data integration concepts, ETL processes, and real-time data flow management.
- Experience handling different data formats (e.g., JSON, XML, CSV, Avro).
- Proficiency in managing NiFi clusters, configuring processors, and optimizing workflows.
- Strong knowledge of REST APIs, data storage systems (e.g., S3, HDFS, RDBMS, NoSQL), and real-time data pipelines.
- Familiarity with NiFi Registry for flow versioning and deployment.
- Knowledge of scripting or JVM languages (e.g., Python, Groovy, or Java) to extend NiFi workflows with scripted processors or custom logic.
- Experience in troubleshooting performance issues, optimizing NiFi configurations, and improving throughput.
- Excellent problem-solving and analytical skills.
- Strong communication skills and the ability to work effectively in a remote, collaborative environment.
Good-to-Have Skills:
- Experience working with cloud platforms (AWS, Azure, GCP) and deploying NiFi workflows in cloud environments.
- Familiarity with messaging systems such as Apache Kafka and with distributed data processing frameworks.
- Knowledge of DevOps practices, including CI/CD pipelines, Docker, Kubernetes, and monitoring tools.
- Understanding of data security, governance, and compliance standards.