Data Engineer: Snowflake, Python, DBT (NCS/Job/2264)

For a Data and Artificial Intelligence Company
5–7 Years
Full Time
Immediate
Up to 15 LPA
1 Position
Riyadh
Posted 2 Days Ago

Job Description

  • Design and develop end-to-end data pipelines using Airflow, Python, and DBT to ingest, transform, and deliver analytics-ready data in Snowflake (see the sketch after this list).
  • Implement data quality frameworks — defining validation rules, reconciliation logic, and exception handling mechanisms.
  • Create and maintain data quality dashboards and automated alerting mechanisms for proactive issue identification.
  • Collaborate with data owners and business stakeholders to define data quality KPIs, thresholds, and governance standards.
  • Integrate data quality validation checks within ETL/ELT pipelines for automated enforcement.
  • Conduct performance tuning and query optimization in Snowflake to ensure cost-effective operations.
  • Establish best practices for testing, documentation, and CI/CD in data pipelines.
  • Support root cause analysis of data issues and drive continuous improvement in data quality and observability.
  • Collaborate with analytics, DevOps, and data architecture teams to ensure alignment on design, deployment, and governance practices.
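
By way of illustration, a minimal sketch of the kind of pipeline this role covers: an Airflow DAG that ingests raw data, runs DBT transformations, then executes an embedded Python data quality check. This assumes Airflow 2.4+ with dbt invoked through its CLI; the DAG id, schedule, script name, project path, and reconciliation logic are illustrative placeholders, not part of this posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator


    def reconcile_row_counts():
        """Hypothetical reconciliation check between raw and transformed layers."""
        # In production this would query Snowflake (e.g. via the
        # snowflake-connector-python package); the literals below are placeholders.
        raw_rows, transformed_rows = 1_000, 1_000
        if raw_rows != transformed_rows:
            raise ValueError(f"Row count mismatch: {raw_rows} vs {transformed_rows}")


    with DAG(
        dag_id="daily_elt",                      # illustrative name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        ingest = BashOperator(
            task_id="ingest_raw",
            bash_command="python ingest.py",     # assumed ingestion script
        )
        transform = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt",  # assumed project path
        )
        validate = PythonOperator(
            task_id="validate",
            python_callable=reconcile_row_counts,
        )

        # Validation runs inside the pipeline, so a failed check blocks
        # downstream consumers rather than surfacing after delivery.
        ingest >> transform >> validate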

Required Skills & Experience:

  • 5–7 years of experience in data engineering or data quality engineering roles.
  • Strong expertise in Snowflake — including warehouse design, optimization, security, and data sharing.
  • Advanced proficiency in Python for data transformation, validation, and automation.
  • Practical experience with Airflow for workflow orchestration and job scheduling.
  • Hands-on experience with DBT for modular transformations, testing, and documentation.
  • Deep understanding of data quality concepts — accuracy, completeness, consistency, timeliness, and validity.
  • Experience implementing data validation frameworks (e.g., Great Expectations, Deequ, or custom Python-based frameworks; see the sketch after this list).
  • Strong SQL and performance tuning skills.
  • Familiarity with version control (Git) and CI/CD processes for data workflows.
  • Knowledge of data governance and metadata management practices.
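
Likewise, a minimal sketch of a custom Python-based validation framework of the sort mentioned above, checking two of the listed data quality dimensions (completeness and validity) on a pandas DataFrame. The column names, allowed values, threshold, and sample data are illustrative assumptions.

    import pandas as pd


    def check_completeness(df: pd.DataFrame, column: str, threshold: float = 0.99) -> bool:
        """Completeness: share of non-null values in `column` meets the threshold."""
        return df[column].notna().mean() >= threshold


    def check_validity(df: pd.DataFrame, column: str, allowed: set) -> bool:
        """Validity: every non-null value in `column` belongs to the allowed set."""
        return bool(df[column].dropna().isin(allowed).all())


    if __name__ == "__main__":
        # Placeholder data standing in for a Snowflake extract.
        orders = pd.DataFrame(
            {"order_id": [1, 2, 3], "status": ["new", "shipped", "cancelled"]}
        )
        results = {
            "order_id_complete": check_completeness(orders, "order_id"),
            "status_valid": check_validity(orders, "status", {"new", "shipped", "cancelled"}),
        }
        failed = [name for name, ok in results.items() if not ok]
        if failed:
            raise SystemExit(f"Data quality checks failed: {failed}")
        print("All checks passed")

Checks like these are typically wired into the pipeline (as in the DAG sketch above) and their results fed to dashboards and alerting, per the responsibilities listed earlier.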