An automotive financial services client of Insight Global is seeking an ETL Developer to join their engineering team. The ETL Developer will be responsible for building and maintaining scalable data pipelines using tools like Fivetran, Databricks, and custom integrations via APIs and flat files. This role involves collaborating with analysts, stakeholders, and engineers to deliver reliable, secure, and high-quality data integrations. The developer will also be responsible for designing and optimizing ETL/ELT workflows, integrating data from diverse sources, automating processes, and ensuring compliance with data governance standards. Additionally, this person will support analytics initiatives by delivering well-modeled data, documenting processes, and providing technical assistance to users. This hybrid role requires four days per week onsite in Houston, Texas, and the pay range is $55-$65/hour.
We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request via the Human Resources Request Form. The EEOC "Know Your Rights" Poster is available here.
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Bachelor's degree in Computer Science, Information Systems, or a related field.
5+ years of experience in ETL development or data engineering.
Strong hands-on experience with ETL tools (e.g., Fivetran), including connector setup, schema mapping, and monitoring.
Proven experience with Databricks (Spark, SQL, notebooks, Delta Lake).
Experience building automation workflows or using low-code automation platforms.
Proficiency in SQL and at least one scripting language (e.g., Python).
Solid understanding of data warehousing principles and cloud data platforms.
Familiarity with REST APIs and third-party data integration.
Experience with version control (Git) and CI/CD for data workflows.
Exposure to tools like Airflow, dbt, Power BI, or Tableau.
Understanding of data governance and privacy standards (e.g., GDPR, HIPAA).
Experience working in Agile development environments.
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.