Job Description
Seeking a Data QA Engineer II to join the Data Engineering team supporting RIDW, our enterprise Snowflake data platform. This is a net-new headcount created to meet the growing quality and validation demands of the platform as we scale toward the DWEB Enterprise BI Go-Live and beyond.
In this role, you will own the design and execution of automated data quality frameworks, validate ETL and SCD pipeline behavior across our Bronze-Silver-Gold medallion architecture, and serve as the primary quality gate for data delivered to Power BI reporting and downstream business consumers. You will work closely with data engineers, the Data Engineering manager, and business stakeholders to ensure data accuracy, reliability, and observability across all platform layers.
Pipeline & ETL QA
• Design and execute test plans for Snowflake stored procedures, Task DAGs, and incremental/full-load pipeline patterns
• Validate SCD Type 1 and Type 2 logic, including active record counts, reactivation paths, and tenant code update behavior
• Test Bronze-layer ingestion across all source systems feeding the platform
• Perform row count reconciliation, duplicate detection, and watermark/freshness validation across environment tiers (DEV, QA, UAT, PROD)
• Own regression testing during environment refreshes and schema promotion cycles
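To give a flavor of the SCD Type 2 validation work described above, here is a minimal pure-Python sketch of two core checks: exactly one active record per business key, and no overlapping validity windows. The table shape and column names are illustrative assumptions, not the actual RIDW schema:

```python
from datetime import date
from collections import defaultdict

# Hypothetical SCD Type 2 rows: (business_key, effective_from, effective_to, is_current).
# effective_to is None for the open-ended (current) version.
rows = [
    ("TEN-001", date(2023, 1, 1), date(2024, 3, 1), False),
    ("TEN-001", date(2024, 3, 1), None,             True),
    ("TEN-002", date(2023, 6, 1), None,             True),
]

def validate_scd2(rows):
    """Return a list of violation messages; an empty list means the history is clean."""
    violations = []
    by_key = defaultdict(list)
    for key, start, end, current in rows:
        by_key[key].append((start, end, current))
    for key, versions in by_key.items():
        # Check 1: exactly one active record per business key.
        active = [v for v in versions if v[2]]
        if len(active) != 1:
            violations.append(f"{key}: expected 1 active record, found {len(active)}")
        # Check 2: no overlapping validity windows between consecutive versions.
        versions.sort(key=lambda v: v[0])
        for (s1, e1, _), (s2, _, _) in zip(versions, versions[1:]):
            if e1 is None or e1 > s2:
                violations.append(f"{key}: overlapping versions at {s2}")
    return violations

print(validate_scd2(rows))  # → []
```

In practice these assertions would run as SQL against Snowflake or as dbt tests rather than in-memory Python, but the logic is the same.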
Data Quality & Validation
• Build and maintain data quality rules and threshold definitions
• Validate Gold layer semantic models including rule view aggregations, Dynamic Tables, and reporting views
• Identify and document data anomalies, surface them via structured QA handoff packages, and track remediation through to sign-off
• Verify referential integrity across key grain tables
• Conduct UAT validation cycles in coordination with business stakeholders and the BI team
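The referential-integrity verification above amounts to an orphan-key check between a fact table and its dimension. A simplified sketch, using hypothetical table and column names rather than the real Gold-layer schema:

```python
# Hypothetical dimension keys and Gold-layer fact rows referencing them.
dim_keys = {"CUST-1", "CUST-2", "CUST-3"}
fact_rows = [
    {"order_id": 1, "customer_key": "CUST-1"},
    {"order_id": 2, "customer_key": "CUST-9"},  # deliberate orphan for illustration
]

def find_orphans(fact_rows, dim_keys, fk="customer_key"):
    """Return fact rows whose foreign key has no match in the dimension."""
    return [r for r in fact_rows if r[fk] not in dim_keys]

orphans = find_orphans(fact_rows, dim_keys)
print([r["order_id"] for r in orphans])  # → [2]
```

The SQL equivalent is a LEFT JOIN from fact to dimension filtered to NULL dimension keys; dbt's built-in `relationships` test expresses the same rule declaratively.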
Automation & Framework Build
• Develop automated test suites using Python, dbt tests, or Snowflake-native SQL to reduce manual regression effort
• Integrate QA checks into CI/CD pipelines via Azure DevOps, including pre-merge and post-deploy validations
• Build and maintain data observability tooling: zero-row load alerts, data freshness checks, watermark stall detection, and anomaly notifications
• Author reusable QA scripts and test fixtures that can be deployed consistently across DEV/QA/UAT environments
• Contribute to promotion tracking and CI/CD logging patterns that connect deployment events to quality outcomes
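The observability checks listed above (zero-row load alerts, freshness checks, watermark stall detection) could be sketched as a small scan over load-audit records. The audit structure and threshold below are illustrative assumptions, not the platform's actual logging format:

```python
from datetime import datetime, timedelta

# Hypothetical load-audit records: table name, last load time, rows loaded.
audit = [
    {"table": "BRONZE.ORDERS",   "loaded_at": datetime(2024, 6, 1, 6, 0),  "row_count": 1200},
    {"table": "BRONZE.PAYMENTS", "loaded_at": datetime(2024, 6, 1, 6, 0),  "row_count": 0},
    {"table": "BRONZE.TENANTS",  "loaded_at": datetime(2024, 5, 29, 6, 0), "row_count": 50},
]

def freshness_alerts(audit, now, max_age=timedelta(hours=26)):
    """Flag zero-row loads and tables whose watermark has stalled past max_age."""
    alerts = []
    for rec in audit:
        if rec["row_count"] == 0:
            alerts.append(f"{rec['table']}: zero-row load")
        if now - rec["loaded_at"] > max_age:
            alerts.append(f"{rec['table']}: watermark stalled ({rec['loaded_at']:%Y-%m-%d %H:%M})")
    return alerts

for alert in freshness_alerts(audit, datetime(2024, 6, 1, 12, 0)):
    print(alert)
```

A production version would read the audit data from Snowflake and route alerts to a notification channel, but the detection rules themselves stay this simple.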
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Required Skills & Experience
• 3-5 years of experience in data quality, QA engineering, or data testing roles
• Hands-on proficiency with Snowflake SQL, including window functions, CTEs, MERGE statements, and stream/task patterns
• Experience testing ETL pipelines — incremental loads, CDC patterns, SCD Type 1/2, and bulk reconciliation
• Ability to write structured QA test plans, acceptance criteria, and sign-off documentation
• Experience with Azure DevOps or similar CI/CD tooling for managing deployments and test automation
• Strong analytical mindset, comfortable identifying root causes in large data sets
• Advanced proficiency in Microsoft Excel, including formulas, pivot tables, lookups, data validation, conditional formatting, and the ability to analyze, reconcile, and present large data sets effectively
Nice to Have Skills & Experience
• Experience with Power BI or DAX — ability to validate report data against source-layer results
• Exposure to Snowflake Dynamic Tables, Task DAGs, or stored procedure-based ETL patterns
• Familiarity with medallion architecture (Bronze / Silver / Gold) and layered data platform patterns
• Experience with dbt tests or similar declarative data quality frameworks
• Working knowledge of Jira for test case management, bug tracking, and sprint ceremonies
• SnowPro Core certification or equivalent demonstrated Snowflake expertise
Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.