Data Engineer

Post Date

Feb 05, 2025

Location

Hartford,
Connecticut

ZIP/Postal Code

06156
US

Job Type

Contract, Perm Possible

Category

Data Warehousing

Req #

CHI-760549

Pay Rate

$49 - $61 (hourly estimate)

Job Description

Insight Global's client is looking for a Data Engineer to design, develop, and optimize data pipelines and infrastructure on Google Cloud Platform (GCP). The ideal candidate will have experience in big data processing, ETL development, data warehousing, and cloud-native solutions, and will implement best practices to ensure data governance, security, and compliance within GCP. As part of the data engineering team, they will collaborate with data scientists, analysts, and software engineers to support business requirements. Additionally, they will monitor and troubleshoot data pipelines for performance, failures, and cost efficiency.

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/ .

Required Skills & Experience

4-7 years of experience with GCP
Experience in data pipeline development using Cloud Dataflow, Apache Beam, Apache Spark, or BigQuery
ETL/ELT development using Cloud Composer (Airflow), TIDAL, Dataform, or custom scripts
Proficient in SQL, Python, and Java for data processing and automation
Knowledge of CI/CD pipelines, Git, and DevOps practices
Understanding of IAM, encryption, and GDPR and HIPAA compliance standards
Real-time and batch processing with Cloud Storage, Pub/Sub, NiFi, Cloud SQL, and Bigtable
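To give candidates a concrete sense of the day-to-day work, here is a minimal sketch of the kind of transform step an ETL pipeline in this role might contain, written in plain Python with no GCP dependencies. The field names, the cents-normalization rule, and the skip-on-bad-row policy are all hypothetical placeholders, not details from this posting.

```python
# Illustrative ETL transform sketch: parse raw CSV text, drop malformed rows,
# and normalize dollar amounts to integer cents. Schema is assumed.
import csv
import io


def transform(raw_csv: str) -> list[dict]:
    """Return cleaned rows; rows failing validation are skipped, not fatal."""
    rows = []
    for record in csv.DictReader(io.StringIO(raw_csv)):
        try:
            amount_cents = round(float(record["amount"]) * 100)
        except (KeyError, ValueError):
            continue  # skip a bad row rather than failing the whole batch
        rows.append({"user_id": record["user_id"], "amount_cents": amount_cents})
    return rows


sample = "user_id,amount\nu1,12.50\nu2,not-a-number\nu3,3.99\n"
print(transform(sample))
```

In a production pipeline the same logic would typically live inside a Dataflow/Apache Beam `Map` or `DoFn` step, with the skipped rows routed to a dead-letter destination instead of silently dropped.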

Nice to Have Skills & Experience

GCP Certifications (Professional Data Engineer, Associate Cloud Engineer)
Experience with machine learning pipelines on GCP (Vertex AI, AI Platform, etc.)
Exposure to Kafka, NiFi, or other streaming technologies
Experience with Docker, Kubernetes, GKE

Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.