ETL Developer

Post Date

Feb 20, 2026

Location

Tampa, Florida

ZIP/Postal Code

33607
US

Job Type

Contract

Category

Programmer / Developer

Req #

TPA-a53f49ae-343f-4795-9a17-acc1a718f688

Pay Rate

$50 - $62 (hourly estimate)

Job Description

We are looking for an experienced ETL Developer to design, develop, enhance, and support enterprise data integration solutions. The role focuses on building and maintaining data warehouse pipelines using Informatica (IDMC / PowerCenter / IICS), implementing dimensional data models, and delivering reliable batch/CDC processing with strong scheduling and operational support. The ideal candidate has strong SQL and PL/SQL skills and hands-on Snowflake experience, and can build and maintain pipelines using Azure Data Factory (ADF).

Key Responsibilities (Critical/High)
• Develop and maintain data warehousing and ETL solutions using Informatica IDMC, Informatica PowerCenter, and/or Informatica Cloud (IICS). (Critical)
• Design and implement ETL mappings, workflows, and end-to-end scheduling logic; monitor and support production jobs. (Critical)
• Perform dimensional data modeling (Star/Snowflake schemas), including FACT and DIMENSION table design and physical/logical modeling. (Critical)
• Build efficient, scalable ETL transformations using components such as Aggregator, Join, Lookup, Merge, Funnel, Filter, Sort, and Transformer. (High)
• Write and optimize complex SQL and PL/SQL to support transformations, validations, and performance tuning. (High)
• Integrate data from diverse sources and targets, including Oracle Connector, ODBC Connector, sequential files, and datasets. (High)
• Implement and support Change Data Capture (CDC) solutions. (High)
• Use enterprise schedulers such as Control-M for batch orchestration and operational execution. (High)
• Participate across the full SDLC: analysis, design, development, testing, deployment, and production support. (Critical)
• Build and support data pipelines using Snowflake and Azure Data Factory (ADF); a minimal sketch follows this list. (High)
• Collaborate with data architects, analysts, and business stakeholders to translate requirements into robust technical solutions. (High)
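As a rough illustration of the SQL, CDC, and Snowflake work described in the responsibilities above, the following sketch shows one common pattern: applying captured changes from a staging table to a dimension table with a Snowflake MERGE, driven from Python via the snowflake-connector-python package. All table, column, and connection names (DIM_CUSTOMER, STG_CUSTOMER_CHANGES, ETL_WH, etc.) are hypothetical placeholders, not details of this role's actual environment.

    # Illustrative sketch only: an incremental (CDC-style) upsert into a Snowflake
    # dimension table. Table/column names and credentials are hypothetical.
    import os

    import snowflake.connector  # pip install snowflake-connector-python

    # MERGE applies inserts/updates captured in a staging table to the dimension.
    MERGE_SQL = """
    MERGE INTO DIM_CUSTOMER AS tgt
    USING STG_CUSTOMER_CHANGES AS src
        ON tgt.CUSTOMER_ID = src.CUSTOMER_ID
    WHEN MATCHED AND src.UPDATED_AT > tgt.UPDATED_AT THEN UPDATE SET
        CUSTOMER_NAME = src.CUSTOMER_NAME,
        CITY          = src.CITY,
        UPDATED_AT    = src.UPDATED_AT
    WHEN NOT MATCHED THEN INSERT (CUSTOMER_ID, CUSTOMER_NAME, CITY, UPDATED_AT)
        VALUES (src.CUSTOMER_ID, src.CUSTOMER_NAME, src.CITY, src.UPDATED_AT)
    """

    def load_customer_dimension():
        # Connection details are read from the environment; values are placeholders.
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="ETL_WH",
            database="EDW",
            schema="CORE",
        )
        try:
            cur = conn.cursor()
            cur.execute(MERGE_SQL)
            print(f"Rows affected: {cur.rowcount}")
        finally:
            conn.close()

    if __name__ == "__main__":
        load_customer_dimension()

In practice the same merge logic would typically be generated or orchestrated by the Informatica mapping/workflow layer; the sketch only illustrates the incremental upsert pattern itself.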

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Required Skills & Experience

• Bachelor's Degree
• 4+ years of experience in ETL development and maintenance for data warehousing solutions. (Critical)
• Hands-on expertise with Informatica IDMC / PowerCenter / IICS. (Critical)
• Strong experience in dimensional modeling (Star/Snowflake schemas), including FACT/DIM design. (Critical)
• Strong knowledge of ETL mapping design, workflow logic, and scheduling. (Critical)
• Proficiency in SQL and PL/SQL, including performance optimization. (High)
• Solid working experience with UNIX and Oracle tools in support of ETL operations. (High)
• Experience with production support, job monitoring, and issue resolution in scheduled/batch environments. (Critical)

Nice to Have Skills & Experience

• Strong working knowledge of Snowflake (data loading patterns, performance considerations). (High)
• Experience building pipelines using Azure Data Factory (ADF) / Azure orchestration; see the sketch after this list. (High)
• Experience with CDC tooling/approaches (log-based or incremental patterns). (High)
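As a rough illustration of the ADF orchestration mentioned above, the sketch below triggers an existing Data Factory pipeline run and polls it to completion using the azure-identity and azure-mgmt-datafactory packages. The subscription, resource group, factory, and pipeline names are hypothetical placeholders, not details of this environment.

    # Illustrative sketch: start an existing Azure Data Factory pipeline run and
    # poll its status. All names/IDs below are hypothetical placeholders.
    import os
    import time

    from azure.identity import DefaultAzureCredential               # pip install azure-identity
    from azure.mgmt.datafactory import DataFactoryManagementClient  # pip install azure-mgmt-datafactory

    SUBSCRIPTION_ID = os.environ["AZURE_SUBSCRIPTION_ID"]
    RESOURCE_GROUP = "rg-edw"            # hypothetical
    FACTORY_NAME = "adf-edw"             # hypothetical
    PIPELINE_NAME = "pl_load_customer"   # hypothetical

    def run_pipeline():
        client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

        # Kick off a run of the named pipeline.
        run = client.pipelines.create_run(
            RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={}
        )

        # Poll until the run reaches a terminal state (Succeeded/Failed/Cancelled).
        while True:
            status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
            if status not in ("Queued", "InProgress"):
                break
            time.sleep(30)
        print(f"Pipeline run {run.run_id} finished with status: {status}")

    if __name__ == "__main__":
        run_pipeline()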

Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401(k) retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.