
Data Engineer (NCS/Job/2056)

For Sonatype: The Sonatype Platform Empowers Speed and Security in Open Source
2 - 8 Years
Full Time
Up to 30 Days
Up to 36 LPA
1 Position(s)
Hyderabad
Posted 21 Days Ago

Job Description

    • We’re looking for a Data Engineer/Senior Data Engineer to join our growing Data Platform team. This role blends data engineering and business intelligence, and is ideal for someone who enjoys solving complex data challenges while also building intuitive, actionable reporting solutions.

 

    • You’ll play a key role in designing and scaling the infrastructure and pipelines that power analytics, dashboards, machine learning, and decision-making across Sonatype. You’ll also be responsible for delivering clear, compelling, and insightful business intelligence through tools like Looker Studio and advanced SQL queries.

What You’ll Do

    • Design, build, and maintain scalable data pipelines and ETL/ELT processes.
    • Architect and optimize data models and storage solutions for analytics and operational use.
    • Create and manage business intelligence reports and dashboards using tools like Looker Studio, Power BI, or similar.
    • Collaborate with data scientists, analysts, and stakeholders to ensure datasets are reliable, meaningful, and actionable.
    • Own and evolve parts of our data platform (e.g., Airflow, dbt, Spark, Redshift, or Snowflake).
    • Write complex, high-performance SQL queries to support reporting and analytics needs.
    • Implement observability, alerting, and data quality monitoring for critical pipelines.
    • Drive best practices in data engineering and business intelligence, including documentation, testing, and CI/CD.
    • Contribute to the evolution of our next-generation data lakehouse and BI architecture.

What We’re Looking For

    • 5+ years of experience as a Data Engineer or in a hybrid data/reporting role.
    • Strong programming skills in Java or Scala (Java preferred).
    • Proficiency with data tools such as Databricks, with data modelling techniques (e.g., star schema, dimensional modelling), and with data warehousing solutions like Snowflake or Redshift.
    • Hands-on experience with modern data platforms and orchestration tools (e.g., Spark, Kafka, Airflow).
    • Proficient in SQL with experience in writing and optimizing complex queries for BI and analytics.
    • Experience with BI tools such as Looker Studio, Power BI, or Tableau.
    • Experience in building and maintaining robust ETL/ELT pipelines in production.
    • Understanding of data quality, observability, and governance best practices.

Bonus Points

    • Experience with dbt, Terraform, or Kubernetes.
    • Familiarity with real-time data processing or streaming architectures.
    • Understanding of data privacy, compliance, and security best practices in analytics and reporting.