
GCP Big Data Engineer (RARR Job 5282)

For an International Trade and Development Company

5 - 16 Years

Full Time

Up to 30 Days

Up to 34 LPA

1 Position(s)

Nagpur, Kolkata, Kochi, Bangalore / Bengaluru, Chennai, Coimbatore, Hyderabad, Noida, Mumbai, Pune


Job Description

We are looking for a skilled and proactive GCP Big Data Engineer with hands-on experience in building and maintaining scalable data pipelines using Google Cloud Platform (GCP) services. The ideal candidate must have a strong foundation in BigQuery, Python, and data engineering best practices. You will work closely with data analysts, architects, and business stakeholders to design efficient data solutions that drive business value.

Key Responsibilities:

  • Design, develop, and maintain scalable ETL/ELT data pipelines on GCP.

  • Work with BigQuery, Cloud Storage, Cloud Composer, and other GCP tools to ingest, transform, and load data.

  • Write efficient, reusable, and modular Python scripts for data processing and automation.

  • Optimize data workflows for performance and cost efficiency.

  • Ensure data quality, validation, and governance across pipelines.

  • Collaborate with data scientists and analysts to understand business requirements and translate them into technical solutions.

  • Monitor and troubleshoot data pipeline issues in production environments.

  • Implement CI/CD practices for data engineering workflows.
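Several of the responsibilities above center on writing reusable, modular Python transforms with data-quality checks built in. A minimal stdlib-only sketch of that pattern (the record fields `order_id` and `amount` are hypothetical, chosen only for illustration):

```python
import csv
import io

def validate_row(row):
    """Data-quality gate: required field present and amount parses as a number."""
    if not row.get("order_id"):
        return False
    try:
        float(row.get("amount", ""))
    except ValueError:
        return False
    return True

def transform(rows):
    """Reusable transform step: drop invalid rows, normalize the amount field."""
    for row in rows:
        if validate_row(row):
            yield {"order_id": row["order_id"], "amount": round(float(row["amount"]), 2)}

# Example input with one missing key and one unparseable amount.
raw = io.StringIO("order_id,amount\nA1,10.5\n,3.0\nA2,bad\nA3,7.25\n")
cleaned = list(transform(csv.DictReader(raw)))
```

In a real pipeline, a step like `transform` would typically run inside a Cloud Composer (Airflow) task or a Dataflow job rather than over an in-memory CSV, with rejected rows routed to a dead-letter table for auditing.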

Required Skills:

  • 5+ years of experience in data engineering, with at least 4 years on GCP.

  • Strong expertise in BigQuery and SQL performance tuning.

  • Proficiency in Python for data manipulation, automation, and orchestration.

  • Experience with Cloud Composer (Apache Airflow) for workflow management.

  • Familiarity with data modeling, partitioning, clustering, and query optimization.

  • Strong understanding of data warehouse concepts and best practices.

  • Experience with version control (Git) and DevOps tools.

  • Excellent problem-solving, communication, and collaboration skills.
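The partitioning and clustering skills listed above map directly to BigQuery DDL. A hedged sketch of how the two are combined to cut scan volume and cost (table and column names are hypothetical, not taken from this posting):

```sql
-- Hypothetical table and columns, for illustration only.
CREATE TABLE IF NOT EXISTS analytics.events
(
  event_ts    TIMESTAMP,
  customer_id STRING,
  event_type  STRING
)
PARTITION BY DATE(event_ts)          -- queries filtered on date scan only matching partitions
CLUSTER BY customer_id, event_type;  -- co-locates rows for common filter/aggregation columns
```

Partitioning bounds how much data a date-filtered query reads and bills for; clustering then sorts within each partition so filters on the clustered columns prune blocks further.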

Preferred Qualifications:

  • GCP Professional Data Engineer certification.

  • Experience with other GCP services like Pub/Sub, Dataflow, and Dataproc.

  • Exposure to data security and compliance practices.

  • Knowledge of additional programming languages such as Java or Scala is a plus.

Education:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
