Who Can Apply
- Candidates must be legally authorized to work in Canada
Job Description
- Design, build, and maintain secure, scalable data pipelines and integration workflows using Databricks, Azure Data Factory, and Snowflake
- Partner closely with architects, senior engineers, analysts, program/project managers, platform teams, and business stakeholders to deliver reliable data solutions
- Develop data models and orchestration workflows to support people‑focused analytics and reporting initiatives
- Implement and maintain data security best practices, including encryption, access controls, and governance using Databricks Unity Catalog
- Apply DevSecOps practices to data pipeline development, deployment, and monitoring
- Create and maintain technical documentation and contribute to team knowledge sharing
- Troubleshoot, tune, and continuously improve data workflows and performance
- Provide limited operational support for critical job failures as part of an on‑call rotation
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances.

If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Required Skills & Experience
- 3+ years of hands‑on data engineering experience
- Strong experience with Databricks, SQL, and Python
- 3+ years of cloud data integration experience (Azure Data Factory or similar ETL tools)
- Experience building secure, scalable data pipelines and workflows
- Solid understanding of data encryption, access controls, and governance for sensitive data
- Experience applying DevOps or DevSecOps principles in a data engineering environment
- Strong communication skills and ability to document technical solutions
- Ability to collaborate effectively with cross‑functional teams in a fast‑paced environment
- Bachelor’s degree in Computer Science or a related field, or equivalent practical experience
Nice to Have Skills & Experience
- Experience with Databricks Unity Catalog
- Experience working with Snowflake
- Background supporting analytics or reporting platforms tied to people or HR data
- Curious, analytical mindset with a desire to understand how and why systems work
- Comfort working independently once objectives are clear
- Experience navigating ambiguity and focusing on solutions over problems
- Strong sense of ownership and accountability for delivered solutions
- Ongoing interest in learning and deepening technical expertise
Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.