We are seeking two highly skilled Individual Contributors (ICs) with experience in Apache NiFi, Java or Python, and big data/streaming technologies. These engineers will be responsible for designing, implementing, and optimizing data pipelines. They will collaborate closely with Evan Ginsberg and Aashish to ensure high-quality data processing and integration.
Key Responsibilities:
- Design, implement, and optimize data pipelines using Apache NiFi.
- Ingest, transform, and distribute data from multiple sources (databases, APIs, cloud storage, messaging systems).
- Ensure data quality, security, and compliance within data flows.
- Develop and maintain ETL/ELT pipelines for seamless data integration.
- Work with big data and streaming technologies such as Kafka, Spark, and Hadoop.
- Monitor and troubleshoot NiFi performance, resolving dataflow bottlenecks and failures.
- Integrate NiFi with various databases, cloud services, and big data technologies.
- Collaborate with data analysts, developers, and DevOps teams to support data-driven solutions.
Required Skills & Qualifications:
- Strong hands-on experience with Apache NiFi for data pipeline development.
- Proficiency in Java or Python.
- Experience with ETL/ELT pipelines and data integration.
- Knowledge of big data and streaming technologies (Kafka, Spark, Hadoop, etc.).
- Strong troubleshooting skills in data processing and performance optimization.
- Experience working with cloud-based data architectures.
- Ability to work independently and deliver solutions with minimal supervision.
Preferred Qualifications:
- Experience in distributed systems and real-time data processing.
- Familiarity with DevOps tools and CI/CD for data pipelines.
- Previous experience collaborating in cross-functional teams.
If you are a highly motivated engineer with expertise in NiFi and big data technologies, we invite you to apply and join a team driving data innovation.