The Data Integration Engineer is responsible for the smooth day-to-day operation and seamless flow of data from various sources into the vendor's solution database and its components. They develop data ingestion and integration pipelines from source systems to the vendor database, build any necessary ad-hoc ingestion/integration processes, and promote them to production following an agile, reliable, secure, high-performing, and efficient methodology. Working with the team, the Data Integration Engineer delivers state-of-the-art, data- and analytics-driven solutions across the platform, advancing Data and Business Analytics while leveraging best-in-class data tools and technologies.
We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to
HR@insightglobal.com. The EEOC "Know Your Rights" Poster is available here.
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy:
https://insightglobal.com/workforce-privacy-policy/.
Proficiency in data modeling techniques (star schema, snowflake schema) to design efficient data structures.
Expertise in ETL (Extract, Transform, Load) processes, using tools such as Informatica, Talend, or Apache Airflow to extract, clean, transform, and load data into data warehouses or data marts.
Strong understanding of relational databases (SQL Server, Oracle, MySQL) and NoSQL databases (MongoDB, Cassandra).
Ability to write complex SQL queries and optimize database performance.
Knowledge of data warehousing concepts and data lake architectures.
Familiarity with cloud platforms such as AWS, Azure, or GCP and their data integration services (AWS Glue, Azure Data Factory, Google Cloud Data Fusion).
Proficiency in programming languages like Python or Java for data integration tasks, scripting, and automation.
Understanding of scripting languages like Bash or PowerShell for automating data integration processes.
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.