
Snowflake Data Integration (NCS/Job/2936)

For a Multinational IT and Business Consulting Service Company
5 - 8 Years
Full Time
Up to 15 Days
Up to 22 LPA
1 Position(s)
Bangalore / Bengaluru, Chennai, Hyderabad, Mumbai, Pune
Posted 2 Days Ago

Job Description

We are seeking a skilled Snowflake Data Integration Engineer with strong expertise in Snowflake, SQL, and Python, along with hands‑on experience in the Healthcare domain. The ideal candidate will design, develop, and optimize data pipelines, ensuring high-quality integration of healthcare data from various sources into Snowflake.


Key Responsibilities

Data Integration & Development

  • Design, develop, and maintain ETL/ELT pipelines for healthcare data using Snowflake, SQL, and Python (see the sketch after this list).
  • Build scalable and optimized data models following Snowflake best practices (clustering, micro-partitions, warehouses, stages).
  • Perform data ingestion from multiple sources (EHR, claims, clinical data, HL7, FHIR, payor/provider systems).
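
As one concrete illustration of the pipeline work above (not part of the role requirements), the following minimal sketch assumes the snowflake-connector-python package and hypothetical account, warehouse, stage, and table names; it copies staged claims files into a raw table as the load step of an ELT flow.

```python
# Minimal ELT load sketch: copy staged healthcare claims files into Snowflake.
# Connection parameters, stage name, and table name are hypothetical placeholders.
import os

import snowflake.connector


def load_claims_batch() -> None:
    # Credentials come from environment variables rather than being hard-coded.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",          # hypothetical warehouse
        database="HEALTHCARE_RAW",   # hypothetical database
        schema="CLAIMS",             # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # COPY INTO loads any new files from the stage into the raw table;
        # files already loaded are skipped via Snowflake's load history.
        cur.execute("""
            COPY INTO CLAIMS_RAW
            FROM @CLAIMS_STAGE/daily/
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
        """)
        print(cur.fetchall())  # per-file load results
    finally:
        conn.close()


if __name__ == "__main__":
    load_claims_batch()
```

In a production setup, the same statement would typically be triggered by Snowpipe, a Snowflake task, or an external orchestrator rather than run by hand.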

Healthcare Data Expertise

  • Work with healthcare data standards (a FHIR example is sketched after this list), including:
    • HIPAA compliance
    • HL7, FHIR, X12, EDI formats
    • Claims, encounter, and clinical datasets
  • Ensure data accuracy, quality, privacy, and compliance across all stages of integration.
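
Healthcare standards work often means reshaping FHIR JSON before it lands in the warehouse. The sketch below, using only the Python standard library, flattens a FHIR R4 Patient resource into one flat record; the field selection is illustrative and ignores most optional FHIR elements.

```python
# Illustrative only: flatten a FHIR Patient resource into a flat dict for warehouse loading.
# Production code would handle repeated names, extensions, and PHI/HIPAA controls.
import json
from typing import Any, Dict


def flatten_patient(resource: Dict[str, Any]) -> Dict[str, Any]:
    if resource.get("resourceType") != "Patient":
        raise ValueError("expected a FHIR Patient resource")
    # FHIR allows multiple names; this sketch simply takes the first entry.
    name = (resource.get("name") or [{}])[0]
    return {
        "patient_id": resource.get("id"),
        "family_name": name.get("family"),
        "given_name": " ".join(name.get("given", [])),
        "gender": resource.get("gender"),
        "birth_date": resource.get("birthDate"),
    }


if __name__ == "__main__":
    raw = (
        '{"resourceType": "Patient", "id": "p001", "gender": "female", '
        '"birthDate": "1984-07-02", "name": [{"family": "Doe", "given": ["Jane"]}]}'
    )
    print(flatten_patient(json.loads(raw)))
```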

Collaboration & Operations

  • Collaborate with Data Architects, Analysts, and Business stakeholders to understand data requirements.
  • Optimize Snowflake queries, storage costs, and performance.
  • Troubleshoot data issues, pipeline failures, and performance bottlenecks.
  • Support analytics, reporting, and BI teams by making clean, validated datasets available (see the data-quality sketch below).
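
Serving clean, validated datasets usually implies automated quality gates. The sketch below shows one hypothetical way to express them: each named SQL check must return zero offending rows or the run fails. Table and column names (CLAIMS_RAW, CLAIM_ID, MEMBER_ID) are placeholders, and the connection setup mirrors the earlier load sketch.

```python
# Hypothetical data-quality gate: fail the pipeline run if any check finds offending rows.
import os

import snowflake.connector

CHECKS = {
    # check name -> SQL that must return 0 for the check to pass
    "null_member_ids": "SELECT COUNT(*) FROM CLAIMS_RAW WHERE MEMBER_ID IS NULL",
    "duplicate_claims": (
        "SELECT COUNT(*) FROM ("
        "  SELECT CLAIM_ID FROM CLAIMS_RAW GROUP BY CLAIM_ID HAVING COUNT(*) > 1"
        ") AS dup"
    ),
}


def run_quality_checks() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="HEALTHCARE_RAW",
        schema="CLAIMS",
    )
    try:
        cur = conn.cursor()
        failures = []
        for name, sql in CHECKS.items():
            offending = cur.execute(sql).fetchone()[0]
            if offending:
                failures.append(f"{name}: {offending} rows")
        if failures:
            raise RuntimeError("data quality checks failed: " + "; ".join(failures))
    finally:
        conn.close()


if __name__ == "__main__":
    run_quality_checks()
```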

Required Skills & Qualifications

  • Strong hands‑on expertise in Snowflake (warehouses, stages, streams, tasks, Snowpipe, Time Travel); a short feature sketch follows this list.
  • Advanced proficiency in SQL (query optimization, stored procedures, analytical functions).
  • Experience with Python for data pipelines, automation, and transformation logic.
  • Demonstrated experience in the Healthcare domain, working with clinical or claims data.
  • Solid understanding of ETL/ELT frameworks, data warehousing concepts, and data modeling.
  • Experience with cloud platforms (AWS/Azure/GCP) is a plus.
  • Familiarity with data integration tools (Informatica, Matillion, Airflow, dbt, etc.) is an advantage.
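
As a brief illustration of two of the Snowflake features listed above, the sketch below runs a Time Travel query (table state one hour ago) and creates a change-capture stream over a raw table so a downstream task could merge only changed rows; all object names are placeholders.

```python
# Illustrative use of Snowflake Time Travel and streams; object names are hypothetical.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="HEALTHCARE_RAW",
    schema="CLAIMS",
)
cur = conn.cursor()

# Time Travel: compare the current table with its state one hour ago (offset in seconds).
cur.execute("SELECT COUNT(*) FROM CLAIMS_RAW AT(OFFSET => -3600)")
print("rows one hour ago:", cur.fetchone()[0])

# Stream: capture inserts/updates/deletes on the raw table for incremental processing
# by a scheduled Snowflake task or an external orchestrator.
cur.execute("CREATE STREAM IF NOT EXISTS CLAIMS_RAW_STREAM ON TABLE CLAIMS_RAW")

conn.close()
```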