Data Engineer

Post Date

Jul 11, 2025

Location

Vancouver, British Columbia

ZIP/Postal Code

V6J1C7
Canada

Job Type

Contract

Category

Data Warehousing

Req #

VAN-793082

Pay Rate

$46 - $57 (hourly estimate)

Who Can Apply

  • Candidates must be legally authorized to work in Canada

Job Description

Day to Day

Insight Global is seeking a Data Engineer to join a retail client in Vancouver, BC. This Data Engineer will sit within the People Enablement Technology team, which is committed to enabling seamless experiences for the client's global workforce. The Data & Analytics pillar is a newly formed team focused on automating core health & wealth offerings, as well as working across the department to enable efforts driven by the People Systems, Digital Workplace & Learning, and Workforce Agility pillars. The project the successful candidate will join involves bringing learning data from platforms such as LinkedIn Learning into an existing data lake, allowing analytics insights to be generated from this people data for teams such as the ESG team. The Data Engineer will support this ingestion using Databricks, and will use Unity Catalog for categorization and, where required, encryption of the data.

A successful candidate will be a problem solver and an expert in ETL programming/scripting, data modelling, data integration, and SQL, with exemplary communication skills. The candidate will need to be comfortable with ambiguity in a fast-paced and ever-changing environment, and able to think big while paying careful attention to detail. The candidate will know and love working with new technologies, be able to model multidimensional datasets, and partner with cross-functional business teams to answer key business questions. Apart from building data pipelines, you will be an advocate for automation, performance tuning, and cost optimization. Be ready to question the status quo and bring forward intelligent solutions and proofs of concept.

Principal Duties and Responsibilities:

Support the design and implementation of complex data integration, modeling, and orchestration workflows using tools such as Databricks, Azure Data Factory, and Snowflake
Collaborate with architects, engineers, analysts, and business stakeholders to deliver secure, reliable, and high-performing data solutions
Apply DevSecOps principles to ensure secure development, deployment, and monitoring of data pipelines and infrastructure
Implement and maintain data encryption, access controls, and governance using Databricks Unity Catalog to protect sensitive information
Participate in operational support, including troubleshooting, performance tuning, and continuous improvement of data workflows
Create and maintain technical documentation and contribute to knowledge-sharing across the team

We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to Human Resources Request Form. The EEOC "Know Your Rights" Poster is available here.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Required Skills & Experience

Must Haves:
5+ years of experience as a Data Engineer working with Databricks, SQL, and Python
3+ years with cloud data integration and modelling (Azure Data Factory)
Strong experience with Unity Catalog for categorizing and handling sensitive data
Experience with DevOps tools and building secure, scalable data workflows
Proficiency in Python and PySpark
Experience working with people-centered and sensitive data (such as health, payment, and banking information) and an understanding of the complexities of encryption and decryption
Experience with data modeling, data warehousing, and building ETL pipelines
Excellent problem-solving skills, combined with the ability to present your findings/insights clearly and compellingly in both verbal and written form
Strong documentation and communication skills
Bachelor's degree in Computer Science, Mathematics, Statistics, or Operations Research

Nice to Have Skills & Experience

Plusses:
- Snowflake experience

Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.