Google Cloud Platform (GCP) Data Engineer

Post Date

Jul 31, 2025

Location

Phoenix, Arizona

ZIP/Postal Code

85054
US

Job Type

Contract

Category

Software Engineering

Req #

HOU-799116

Pay Rate

$28 - $35 (hourly estimate)

Job Description

We are looking for a seasoned Software Engineer with big data and full-stack development expertise. This role will be pivotal in maintaining and modifying existing NiFi jobs and migrating existing big data workloads, NiFi flows, and Java REST APIs to the Google Cloud Platform (GCP).

We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please submit a request through the Human Resources Request Form. The EEOC "Know Your Rights" Poster is available here.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/ .

Required Skills & Experience

BS in computer science, computer engineering, or other technical discipline, or equivalent work experience.
Hands-on software development experience with GCP, AWS, or other cloud/big data solutions.
Experience working with Hadoop, MapR, Hive, Spark, shell scripting, GCP clusters, and distributed (multi-tiered) systems.
Proficiency in developing and optimizing data pipelines using NiFi or GCP Cloud Dataflow.
Experience building event processing pipelines with Kafka or GCP Pub/Sub.
Hands-on experience with SQL and HSQL, and with multiple storage technologies including RDBMS, document stores, and search indices.
Hands-on experience with cloud services for application development and deployment, such as Kubernetes, Docker, etc.
Experience developing REST APIs using Spring Boot or Apache Camel.
Hands-on experience setting up instrumentation, analyzing performance, distributed tracing, and debugging using tools such as Dynatrace, Splunk, etc.
Strong object-oriented programming skills, SOLID principles, and design patterns, preferably in Java.
Good knowledge of CI/CD pipelines and source code management tools (XLR, Jenkins, GitHub).
Familiarity with Agile and Scrum ceremonies.


Preferred Qualifications
A cloud platform certification (e.g., GCP Professional Data Engineer) is a plus.
Salesforce knowledge or prior experience integrating with the Salesforce platform is a major plus.

Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401(k) retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.