INTL India - Senior Data Engineer (Azure and Snowflake) - cb11b0

Post Date

Jul 18, 2025

Location

Tempe, Arizona

ZIP/Postal Code

85284
US

Job Type

Contract

Category

Data Warehousing

Req #

PHX-796452

Pay Rate

$15 - $19 (hourly estimate)

Job Description

Our client is looking for a Senior Data Engineer to join their team in India. Responsibilities are shown below:
Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals.
Demonstrate deep technical and domain knowledge of relational and non-relational databases, Data Warehouses, and Data Lakes, among other structured and unstructured storage options.
Determine solutions that are best suited to develop a pipeline for a particular data source.
Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development.
Deliver efficient ETL/ELT development using Azure cloud services and Snowflake, including testing and operations/support processes (root-cause analysis of production issues, code/data fix strategy, monitoring, and maintenance).
Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery.
Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders.
Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability).
Stay current with and adopt new tools and applications to ensure high quality and efficient solutions.
Build a cross-platform data strategy to aggregate multiple sources and process development datasets.
Communicate proactively with stakeholders; mentor and guide junior team members through regular knowledge transfer (KT) and reverse-KT sessions, and help them identify production bugs/issues and recommend resolutions.

We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please submit a request through the Human Resources Request Form. The EEOC "Know Your Rights" Poster is available here.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/ .

Required Skills & Experience

8+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment.
5+ years of experience setting up and operating data pipelines using Python or SQL.
MUST have experience working in the retail industry.
5+ years of advanced SQL programming: PL/SQL, T-SQL.
5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization.
Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads.
5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data.
5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks/Spark, Blob Storage, Azure SQL DW/Synapse, and Azure Functions.
5+ years of experience defining and enabling data quality standards for auditing and monitoring.
Strong analytical abilities and strong intellectual curiosity.
In-depth knowledge of relational database design, data warehousing, and dimensional data modeling concepts.
Understanding of REST and good API design.
Experience working with Apache Iceberg, Delta tables, and distributed computing frameworks.
Strong collaboration and teamwork skills & excellent written and verbal communications skills.
Self-starter who is motivated and able to work in a fast-paced development environment.
Agile experience highly desirable.
Proficiency with the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.

Nice to Have Skills & Experience

Strong knowledge of Data Engineering concepts (data pipeline creation, Data Warehousing, Data Marts/Cubes, data reconciliation and audit, Data Management).
Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques.
Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks.
Working knowledge of DevOps processes (CI/CD), Git version control and Jenkins, Master Data Management (MDM), and Data Quality tools.
Strong experience in ETL/ELT development, QA, and operations/support processes (root-cause analysis of production issues, code/data fix strategy, monitoring, and maintenance).
Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting.
ADF, Databricks, and Azure certifications are a plus.

Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.