Data Engineer

Post Date

Feb 12, 2026

Location

Denver, Colorado

ZIP/Postal Code

80202
US

Job Type

Contract

Category

Computer Engineering

Req #

SEA-43080d03-0ff9-4d1b-a88f-e59b77dca21c

Pay Rate

$38 - $48 (hourly estimate)

Job Description

Our client, a player in the telecom industry, is looking to bring on a Data Engineer for a 12-month contract. The Data Engineer builds and optimizes scalable data pipelines using AWS, Databricks, and Snowflake. The role focuses on transforming complex data into trusted, analytics-ready datasets while ensuring performance, reliability, and cost efficiency, and involves collaborating with business teams to deliver actionable insights; strong written and verbal communication is required.

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Required Skills & Experience

• 4+ years developing cloud solutions as a Data Engineer
• Bachelor’s Degree in Computer Science, Computer Engineering, or a related field
• Clear and concise verbal and written communication; ability to communicate technical concepts to non-technical people
• Proven experience developing solutions with:
  - AWS (Redshift, S3, Step Functions, EventBridge, CloudWatch)
  - Databricks (Spark, Delta Lake, Apache Iceberg, Unity Catalog)
  - Snowflake
• Strong proficiency in SQL, including writing and optimizing the performance of large-scale analytics queries
• Strong proficiency in Python for custom data processing
• Familiarity with CI/CD, version control (Git), and automated testing for data pipelines
• Familiarity with Infrastructure as Code (Terraform, or similar)
• Solid understanding of data warehousing and dimensional modeling
• Ability to write detailed, comprehensive testing documentation
• Strong focus on code quality, with the ability to design and execute thorough tests
• Ability to conduct effective code reviews

Nice to Have Skills & Experience

• Hands-on experience integrating AI tools/solutions into data workflows to improve efficiency, automation, or developer productivity

Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401(k) retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.