Who Can Apply
- Candidates must be legally authorized to work in Canada
Job Description
Insight Global is looking for a Data Engineer for a large enterprise in Toronto. The Data Engineering team is responsible for designing, building, and maintaining the data infrastructure that powers industry‑standard tools and client‑facing applications. This role focuses on developing, optimizing, and maintaining scalable data pipelines and platforms that ensure accurate, reliable, and timely delivery of data across internal systems and analytics products.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com.
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Required Skills & Experience
- 5-10+ years of hands‑on Data Engineering experience within large enterprise environments
- Strong experience designing and maintaining ETL processes and data pipelines
- Strong hands‑on experience with Microsoft Fabric, including notebooks, data flows, data pipelines, and deployment pipelines
- Advanced SQL skills (Microsoft SQL Server, PostgreSQL)
- Proven experience using GitHub for version control
- Familiarity with CI/CD pipelines for data workflows
- Experience working with Snowflake, including star schema design
- Strong understanding of medallion architecture (bronze, silver, gold layers)
- Experience integrating applications and databases, including designing and implementing API integrations
- Ability to monitor, troubleshoot, and optimize large‑scale data processing workflows
- Strong analytical, problem‑solving, and troubleshooting skills
Nice to Have Skills & Experience
- Experience with Qlik reporting or similar data visualization/reporting tools
- Additional experience with Python‑based analytics or automation
Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401(k) retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.