GCP Data Engineer

Full Time | PAN India | India

Industry: Computer Software
Experience: 5 - 12 years
Compensation: 700,000 - 3,200,000
Openings: 1

Role Overview

Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL/ELT workflows on GCP.
  • Configure and execute data ingestion pipelines using reusable GCP-based frameworks.
  • Work with GCP services such as Google Cloud Storage (GCS), BigQuery, Composer, Data Fusion, and Dataproc for end-to-end data processing.
  • Experience in Dataproc and Dataflow is a must.
  • Manage parameters, job configurations, and metadata for ingestion and transformation jobs.
  • Debug and resolve issues related to data, parameters, and job execution; escalate framework-related bugs as needed.
  • Monitor daily job runs, troubleshoot failures, and ensure SLAs are consistently met.
  • Collaborate with cross-functional teams to ensure smooth delivery and integration.
  • Follow best practices in version control (Git) and infrastructure-as-code (Terraform).
  • Maintain deployment scripts and infrastructure configurations.
  • Guide and mentor technical teams to achieve delivery milestones.
  • Proactively suggest and contribute to process and framework improvements.

Required Skills & Experience

  • 5+ years of experience as a Data Engineer or in a similar role.
  • Strong hands-on knowledge of GCP, including GCS, BigQuery, Composer, Data Fusion, and Dataproc.
  • Proficiency in Python and PySpark for scripting, debugging, and automation.
  • Proficiency in Dataproc and Dataflow
  • Experience with Hadoop, HDFS, and the broader Big Data ecosystem.
  • Solid understanding of data migration from on-premises to cloud environments.
  • Expertise in SQL for processing large volumes of semi-structured and unstructured data.
  • Familiarity with file formats such as Avro and Parquet.
  • Strong debugging, troubleshooting, and analytical skills.
  • Experience with infrastructure-as-code tools, especially Terraform.
  • Knowledge of version control systems, particularly Git.

Good to Have

  • Exposure to Jira, Agile methodologies, Sonar, TeamCity, and CI/CD pipelines.
  • Experience with enterprise job schedulers (e.g., TWSd).
  • Background in international banking, multi-vendor, or multi-geography teams.
  • Knowledge of SQL optimization in BigQuery.
  • Additional experience in Python development.

Soft Skills

  • Attention to detail and a strong sense of ownership.
  • Excellent communication and collaboration abilities.
  • Ability to work effectively in a fast-paced, deadline-driven environment.
  • Proactive approach to process improvement and innovation.

Skill Set

GCP, BigQuery, Cloud Storage, Dataflow, Pub/Sub, SQL, Cloud Composer, IAM, Dataproc