
Big Data (NCS/Job/2012)
Job Skills
Job Description
Responsibilities
- Collaborate closely with Product Management and Engineering leadership to devise and build the right solutions.
- Participate in design discussions and brainstorming sessions to select, integrate, and maintain the Big Data tools and frameworks required to solve Big Data problems at scale.
- Design and implement systems to cleanse, process, and analyze large data sets using distributed processing tools such as Akka and Spark.
- Understand and critically review existing data pipelines, and collaborate with Technical Leaders and Architects to propose improvements to current bottlenecks.
- Take initiative, proactively pick up new technologies, and work as a Senior Individual Contributor across our multiple products and features.
Requirements
- 3+ years of experience developing highly scalable Big Data pipelines.
- In-depth understanding of the Big Data ecosystem, including processing frameworks such as Spark, Akka, Storm, and Hadoop, and the file formats they work with.
- Experience with ETL and data pipeline tools such as Apache NiFi and Airflow.
- Excellent coding skills in Java or Scala, including the ability to apply appropriate design patterns where required.
- Experience with Git and build tools such as Gradle, Maven, or SBT.
- Strong understanding of object-oriented design, data structures, algorithms, profiling, and optimization.
- An elegant, readable, maintainable, and extensible coding style.