
Senior SQL Data Engineer (NCS/Job/2917)

For a Cloud Computing, Computer Security, and Data Centers Company
7 - 12 Years
Full Time
Immediate
Up to 30 LPA
1 Position(s)
Remote/Work From Home (WFH)
Posted 7 Days Ago

Job Description

Mandatory Skill: at least one year’s experience with Python Generative AI.

The client is seeking Senior SQL Data Engineers with no less than seven years’ experience (the client will not compromise on the years of experience working with SQL, and specifically T-SQL). These resources will be assigned to a variety of projects, such as fraud detection initiatives, Incident Management, potential Gen AI projects, and the exit from MuleSoft. In order of priority, the client is looking for resources with relevant work experience in:

  1. SQL & T-SQL
  2. Azure Data Factory (ADF)
  3. SSIS
  4. At least one year’s experience with Python Generative AI. (Mandatory)

Job Profile: Senior Data Engineer

The Senior Data Engineer will be responsible for managing and optimizing data systems, focusing on developing, automating, and orchestrating data pipelines within the client’s BI Analytics team. This role requires seven years’ experience leveraging T-SQL expertise, Azure Data Factory (ADF), and SSIS, along with at least one year’s experience in Python Generative AI, to enhance data processing, integration, and transformation.

Key Responsibilities:

  1. Data Pipeline Development and Integration:
    • Design and implement data pipelines using modern cloud technologies.
    • Integrate various source systems, including real-time data sources and ETL processes, to ensure seamless data flow and accessibility.
  2. Cloud Technology Utilization:
    • Utilize cloud platforms such as Azure Synapse, Amazon Redshift, and Snowflake for data processing and storage, ensuring scalability and performance optimization.
  3. Azure Data Factory (ADF) Expertise:
    • Use ADF for building and managing ETL/ELT pipelines, leveraging T-SQL skills for data ingestion, transformation, and loading into analytical systems (a minimal T-SQL ingestion sketch follows this list).
    • Configure Linked Services and Datasets to connect T-SQL sources to Azure environments.
    • Use Azure Batch to modify/create jobs, allocate proper resources, and maintain existing jobs.
  4. Data Transformation:
    • Implement code-free transformations using ADF’s visual mapping data flows, including transformations for complex logic.
    • Orchestrate transformations on external compute services such as Azure Databricks.
  5. Pipeline Orchestration and Automation:
    • Build and automate multi-step workflows using ADF, implementing control flow logic and scheduling pipeline runs.
  6. Monitoring and Troubleshooting:
    • Use ADF’s monitoring tools and Azure Monitor to track pipeline performance and troubleshoot issues.
  7. Data Governance and Strategy:
    • Collaborate with data governance teams to ensure data quality and compliance.
    • Implement data security measures and integrate with tools like Microsoft Purview for data lineage.
  8. SQL Server Integration Services (SSIS):
    • Use SSIS for data migration and ETL tasks, extracting data from various sources and loading it into data warehouses.
  9. Python Generative AI Applications:
    • Utilize Python-based GenAI for automated code generation, natural language to T-SQL translation, and synthetic data creation (a minimal translation sketch follows this list).
    • Enhance data documentation and query optimization using AI tools.
  10. Optimization and Performance Tuning:
    • Optimize queries and data models, using ADF features for performance tuning and cost efficiency.
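
As an illustration of the T-SQL ingestion and transformation work in items 3 and 4, the sketch below shows the kind of set-based upsert that might sit behind an ADF stored-procedure or copy activity. It is a minimal Python/pyodbc sketch only; the connection string, schema, and table names (dbo.FactClaims, staging.Claims) are placeholders invented for illustration, not details from this posting.

  import pyodbc

  # Placeholder connection details; in ADF these would normally come from a Linked Service.
  CONN_STR = (
      "DRIVER={ODBC Driver 18 for SQL Server};"
      "SERVER=example-server.database.windows.net;"
      "DATABASE=ExampleDW;"
      "Authentication=ActiveDirectoryMsi;"
  )

  # Illustrative T-SQL upsert from a staging table into a fact table.
  MERGE_SQL = """
  MERGE dbo.FactClaims AS tgt
  USING staging.Claims AS src
      ON tgt.ClaimID = src.ClaimID
  WHEN MATCHED THEN
      UPDATE SET tgt.Amount = src.Amount, tgt.UpdatedAt = SYSUTCDATETIME()
  WHEN NOT MATCHED THEN
      INSERT (ClaimID, Amount, UpdatedAt)
      VALUES (src.ClaimID, src.Amount, SYSUTCDATETIME());
  """

  def load_staged_claims() -> int:
      """Run the merge and return the number of rows affected."""
      conn = pyodbc.connect(CONN_STR)
      try:
          cursor = conn.cursor()
          cursor.execute(MERGE_SQL)
          conn.commit()
          return cursor.rowcount
      finally:
          conn.close()

  if __name__ == "__main__":
      print(f"Rows merged: {load_staged_claims()}")

Keeping the heavy lifting in set-based T-SQL on the server, with Python (or an ADF activity) only orchestrating, mirrors how ADF pipelines typically push transformation work down to the database.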

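For item 9, the sketch below shows one way natural language to T-SQL translation can be wrapped with a basic guardrail before anything reaches the warehouse. It is a minimal sketch under assumptions not stated in the posting: call_llm is a hypothetical stand-in for whatever GenAI client the team actually uses, and the schema hint is invented for illustration.

  import sqlparse  # third-party SQL parser, used here only to sanity-check the output

  # Hypothetical stand-in for the team's real GenAI client (e.g. an Azure OpenAI deployment).
  def call_llm(prompt: str) -> str:
      raise NotImplementedError("Wire this up to the model client in use.")

  # Invented schema hint, for illustration only.
  SCHEMA_HINT = """
  Tables:
    dbo.FactClaims(ClaimID INT, MemberID INT, Amount DECIMAL(12,2), ClaimDate DATE)
    dbo.DimMember(MemberID INT, Region NVARCHAR(50))
  """

  def question_to_tsql(question: str) -> str:
      """Translate a business question into a single read-only T-SQL SELECT."""
      prompt = (
          "You are a T-SQL assistant. Using only the schema below, "
          "reply with a single SELECT statement and nothing else.\n"
          f"{SCHEMA_HINT}\nQuestion: {question}"
      )
      sql = call_llm(prompt).strip().rstrip(";")

      # Guardrail: reject anything that is not exactly one SELECT statement.
      statements = sqlparse.parse(sql)
      if len(statements) != 1 or statements[0].get_type() != "SELECT":
          raise ValueError(f"Rejected generated SQL: {sql!r}")
      return sql

Keeping generated SQL read-only and validated before execution is the kind of control the data governance work in item 7 would expect around GenAI-assisted querying.
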
Skills and Qualifications:

  • Proficiency in Transact-SQL, Azure Data Factory, SSIS, and Python.
  • Experience with cloud platforms and data integration tools.
  • Strong understanding of data governance, security, and compliance.
  • Ability to collaborate with data science teams and support AI initiatives.
  • Excellent troubleshooting and problem-solving skills.