GCP Data Engineer

Post Date

Jul 01, 2025

Location

Dearborn, Michigan

ZIP/Postal Code

48126
US

Job Type

Contract

Category

Data Warehousing

Req #

MIC-792406

Pay Rate

$46 - $58 (hourly estimate)

Job Description

Day to Day:

In this role, the data engineer will work closely with cross-functional teams in an Agile environment to design, build, and maintain scalable data pipelines and cloud-based analytics solutions. A typical day involves collaborating with fellow engineers, product owners, and data stakeholders to understand business needs and translate them into technical solutions. The engineer will develop and optimize data workflows using GCP services such as BigQuery, Dataflow, Pub/Sub, and Dataproc, while also ensuring data quality through robust ETL processes. Responsibilities include automating data ingestion, transformation, and validation, as well as contributing to the design of CI/CD pipelines. The role also involves participating in code reviews, technical discussions, and knowledge-sharing sessions to promote best practices and drive innovation across the team.
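To give a feel for the pipeline work described above, here is a minimal Apache Beam (Python) sketch of a streaming Dataflow-style job that reads events from Pub/Sub, validates them, and writes them to BigQuery. It is illustrative only; the project, topic, table, and schema names are hypothetical placeholders, not details from this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_and_validate(message: bytes):
    """Parse a Pub/Sub message and keep only records with required fields."""
    record = json.loads(message.decode("utf-8"))
    if "event_id" in record and "event_ts" in record:
        yield record  # valid records flow downstream


options = PipelineOptions(streaming=True)  # runner/project flags supplied on the CLI

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
        | "ParseValidate" >> beam.FlatMap(parse_and_validate)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            schema="event_id:STRING,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )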

We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please submit a request through the Human Resources Request Form. The EEOC "Know Your Rights" Poster is available here.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Required Skills & Experience

Must Have:

3+ years of hands-on experience with cloud platforms (GCP preferred), including production-scale solution design and implementation
In-depth understanding of Google Cloud Platform services and architecture, or equivalent cloud technologies
5+ years of analytics application development experience
5+ years of SQL development experience
Experience with GCP-based Big Data deployments using tools such as BigQuery, Dataflow, Pub/Sub, Dataproc, Airflow, Google Cloud Storage, and Terraform
2+ years of professional development experience in Java or Python, including Apache Beam
Strong experience in building data pipelines and architectures for data processing
Proficient in ETL processes including extracting, loading, transforming, cleaning, and validating data (a brief illustrative sketch follows this list)
1+ year of experience designing and building CI/CD pipelines
Proven ability to work in Agile environments, including pair programming and cross-functional collaboration
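As a rough sketch of the ETL and SQL work listed above, the following Python snippet uses the google-cloud-bigquery client to extract recent rows, apply a simple transform-and-validate step, and load the cleaned rows into a curated table. Dataset, table, and column names are hypothetical and stand in for whatever schema the actual project uses.

from google.cloud import bigquery

client = bigquery.Client()  # project and credentials resolved from the environment

# Extract: pull the last week of raw events.
rows = client.query("""
    SELECT event_id, event_ts, payload
    FROM `my-project.raw.events`
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
""").result()

# Transform + validate: keep only rows with a non-empty payload.
clean = [
    {"event_id": r.event_id, "event_ts": r.event_ts.isoformat(), "payload": r.payload}
    for r in rows
    if r.payload
]

# Load: append the validated rows to the curated table.
errors = client.insert_rows_json("my-project.curated.events", clean)
if errors:
    raise RuntimeError(f"BigQuery insert failed: {errors}")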

Nice to Have Skills & Experience

Plusses:

Experience with machine learning tools (e.g., TensorFlow, BigQuery ML, Vertex AI) and data governance platforms like Dataplex or Informatica EDC
Familiarity with modern data engineering tools and practices, including Git, Jenkins, CI/CD, DBT/Dataform, and performance tuning
Strong communication skills, ability to document complex systems, and experience mentoring engineers
GCP Professional Data Engineer certification or a Master's degree in a related field

Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.