INTL INDIA - GCP Big Data Engineer

Post Date

Oct 23, 2025

Location

Phoenix, Arizona

ZIP/Postal Code

85054
US
Dec 31, 2025

Insight Global

Job Type

Contract

Category

Programmer / Developer

Req #

PHX-49c74ded-8e4b-43ab-8383-cb3bb4da23e4

Pay Rate

$12 - $15 (hourly estimate)

Job Description

Insight Global is seeking a GCP Big Data Engineer to support a large financial services client. The candidate will support a migration from a legacy on-premises platform to a data warehouse hosted in GCP, focusing first on the migration and data transformations, then on managing data integrations with downstream systems. The ideal candidate will have hands-on experience designing, developing, and optimizing scalable data pipelines for large-scale data processing, transformation, and analytics.

Design and implement robust, scalable, and efficient data pipelines using Hadoop, Hive, Spark, Python, and Shell scripting.
Develop and maintain data workflows on GCP using BigQuery, DataProc, Airflow, and Pub/Sub.
Build and optimize event-driven data processing systems using Kafka or GCP Pub/Sub.
Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
Design and optimize data models for performance, scalability, and reliability in SQL-based environments.
Ensure data quality, integrity, and governance across all data platforms.
Implement and maintain CI/CD pipelines using tools like Jenkins, Git, XLR, and automated testing frameworks.
Manage source code and configuration using GitHub and other version control tools.
Apply knowledge of distributed systems, algorithms, and relational databases to solve complex data engineering challenges.

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances.

If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Required Skills & Experience

6-8 years of Data Engineering experience in a Big Data environment.

Hadoop, Hive, Spark, Python, and shell scripting; GCP: BigQuery, Airflow, DataProc, and Pub/Sub.

Strong experience in designing, developing, and optimizing data pipelines for large-scale data processing, transformation, and analysis using Big Data and GCP technologies.

Proficiency in SQL and database systems, with experience in designing and optimizing data models for performance and scalability.

Experience building event-processing pipelines with Kafka or GCP Pub/Sub.

Knowledge of distributed (multi-tiered) systems, algorithms, and relational databases.

Experience with CI/CD pipelines, automated test frameworks, and source code management tools (XLR, Jenkins, Git).

Good knowledge of and experience with source code and configuration management tools like GitHub.

Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.