Find Your Perfect Job

Job Search Results for big data

30 Results for big data

Mar 23, 2026 | Brookfield, WI | Data Warehousing | Perm | $105k - $199k (estimate)

REMOTE Sr. Data Engineer

Requirements:
- 7+ years of relevant experience in the design, development, and testing of Data Platform solutions such as Data Warehouses, Data Lakes, and Data Products
- Expert-level experience working in Databricks and AWS
- Advanced experience working in both relational and non-relational databases such as SQL Server and PostgreSQL
- Experience building and managing solutions on AWS
- Advanced experience building and deploying IaC using Terraform, Asset Bundles, and GitHub
- Advanced experience building data models and data warehouses, and designing data lakes for enterprise (and product) use
- Familiarity with designing and building APIs, ETL, and data ingestion processes, and with tools that support enterprise solutions
- Experience in performance tuning, query optimization, security, monitoring, and release management
- Experience working with and managing large, disparate, identified, and de-identified data sets from multiple data sources

Preferred Skills:
- Bachelor's or master's degree in computer science, data engineering, or a related field
- Experience managing and standardizing clinical data from structured and unstructured sources
- Health and Life Insurance business experience
- Knowledge of healthcare standards including FHIR, C-CDA, and traditional HL7
- Knowledge of clinical standards/ontologies including ICD-10, SNOMED, NDC, LOINC, and RxNorm
- Associate- or Professional-level solution architecture certification in Azure and/or AWS
- Experience in Snowflake
- Experience in Spark
- Experience with Salesforce integration

Insight Global is seeking a Sr. Data Engineer to join our actuarial consultant customer 100% remotely. In this position as a Sr. Data Engineer on our client's Data Platform, you will be responsible for designing and implementing robust Data Platform solutions that meet business objectives while ensuring compliance with industry-leading data privacy standards. You will collaborate closely with cross-functional agile teams to drive data architecture decisions, implement best practices, and contribute to the success of our client's projects. Responsibilities include:

- Data Platform: Creation of Databricks Data Warehouse and Lakehouse solutions for a healthcare-data-focused enterprise
  - Data Governance: Configuring and maintaining Unity Catalog to enable enterprise data lineage and data quality
  - Data Security: Building out data security protocols and best practices, including the management of identified and de-identified (PHI/PII) solutions
  - External Data Products: Building data solutions for clients while upholding the best standards for reliability, quality, and performance
  - ETL: Building solutions within Delta Live Tables and automating transformations
  - Medallion Architecture: Building out performant enterprise-level medallion architectures
  - Streaming and Batch Processing: Building fit-for-purpose near-real-time streaming and batch solutions
  - Large Data Management: Building out performant and efficient enterprise solutions for internal and external users, for both structured and unstructured healthcare data
  - Platform Engineering: Building out Infrastructure as Code using Terraform and Asset Bundles
  - Costs: Working with the business to build cost-effective and cost-transparent data solutions
- Pipeline/ETL Management: Help architect, build, and maintain robust and scalable data pipelines, monitoring and optimizing performance
  - Work with migration tools, e.g., AWS DMS, AWS Glue, Fivetran, Integrate.io
  - Identify and implement improvements to enhance data-processing efficiency
  - Build effective pipeline monitoring solutions
  - Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Delta Live Tables, Python, Scala, and cloud-based "big data" technologies
- Data Modeling: Lead the design, implementation, and maintenance of standards-based (FHIR, OMOP, etc.) and efficient data models for both structured and unstructured data
  - Assemble large, complex data sets that meet functional and non-functional business requirements
  - Develop and maintain data models, ensuring they align with business objectives and data privacy regulations
- Collaboration: Partner internally with key stakeholders to ensure we are providing meaningful, functional, and valuable data
  - Effectively work with Data, Development, Analyst, Data Science, and Business team members to gather requirements and to propose and build solutions
  - Communicate complex technical concepts to non-technical stakeholders and provide guidance on best practices
  - Ensure that technology execution aligns with business strategy and provides efficient, secure solutions and systems
- Processes and Tools: Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  - Build analytics tools that use the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics
  - Create data tools for clinical, analytics, and data science team members that assist them in building and optimizing our product into an innovative industry leader

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
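The "identified and de-identified (PHI/PII)" work this role describes typically rests on deterministic pseudonymization of direct identifiers. The posting does not specify an implementation; as an illustrative sketch only, a keyed hash keeps records joinable without exposing raw values (the field names, key handling, and token length below are assumptions, not part of the role):

```python
import hashlib
import hmac

# Illustrative only: field names and key management are assumed, not from the posting.
PHI_FIELDS = {"member_id", "ssn", "email"}  # direct identifiers to pseudonymize

def pseudonymize(record: dict, secret_key: bytes) -> dict:
    """Replace direct identifiers with keyed hashes so records stay joinable
    across tables without exposing the raw PHI/PII values."""
    out = {}
    for field, value in record.items():
        if field in PHI_FIELDS and value is not None:
            digest = hmac.new(secret_key, str(value).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]  # truncated token for readability
        else:
            out[field] = value
    return out

row = {"member_id": "M-1002", "ssn": "123-45-6789", "diagnosis": "E11.9"}
deidentified = pseudonymize(row, secret_key=b"rotate-me")
```

Because the hash is keyed and deterministic, the same input and key always produce the same token (joins survive de-identification), while rotating or destroying the key severs linkability.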

Feb 18, 2026 | Woonsocket, RI | Data Warehousing | Contract-to-perm | $46 - $57 (hourly estimate)

Data Engineer

Requirements:
- 5+ years of relevant programming and/or analytic experience
- Experience working in a data warehouse environment, and the ability to work with large data volumes from multiple data sources
- Strong SQL and database development (e.g., Oracle, MS SQL, PostgreSQL)
- Knowledge of analytic programming tools and methods (SAS, OLAP, Business Objects, Crystal Reports)
- Deep understanding of relational databases, data systems, and data warehouses
- Knowledge of other reporting platforms such as Business Objects, Microsoft Access, web development technology, big data analytics, data mining, and Visual Basic

In this role, the candidate is expected to develop core components for the Data Warehouse. Work in this role is data-intensive and includes understanding requirements, software development, unit testing, and related documentation.

- Evaluate and interpret documented technical requirements and translate them into software deliverables
- Develop database software components (procedures, packages, functions, etc.) that meet documented requirements
- Identify the unit test cases required to validate software components
- Execute unit test cases against software components to confirm the accuracy of development
- Peer-review development deliverables for team members to confirm standards and best practices

Nov 11, 2025 | Chesterfield, MO | Data Warehousing | Perm | $110k - $130k (estimate)

Senior Data Engineer

Requirements:
- 5+ years of data engineering experience
- Experience with Apache Spark, Airflow, or Iceberg
- Python programming experience
- Production support experience
- Linux experience
- Strong communication skills

Preferred Skills:
- Experience with Precisely Connect
- Experience with Hadoop
- Experience with Cloudera

Insight Global is looking to add 2 Data Engineers to our client's team, which is headquartered in the St. Louis area; the roles can sit remotely. They will be responsible for moving applications from one environment to another, ensuring every component has been upgraded, and modernizing the framework. The client needs someone to come in and support development efforts across the enterprise; the engineers will interact with many different application owners and software vendors, so communication is very important. They will use Apache Spark, Apache Iceberg, and Apache Airflow for ETL pipelines. This person will transform and integrate EBCDIC mainframe data into Hive and Impala tables using Precisely Connect for Big Data; optimize data transformation processes for performance, scalability, and reliability; and ensure data consistency, accuracy, and quality across the ETL pipelines. This person will be the point of escalation for ETL loads, data quality issues, and production support issues; monitor ETL workflows and troubleshoot issues to ensure smooth production operations; and implement proactive monitoring, alerting, and reporting solutions to improve pipeline reliability. This individual will be responsible for supporting and rebuilding legacy ETL jobs (currently not using ACID transactions) with modern solutions using Apache Spark and Apache Iceberg to support ACID transactions. They will oversee the design and implementation of workflows for automating data pipelines using Apache Airflow, as well as establish and enforce best practices for ETL code development, version control, and deployment using Azure DevOps.
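For context on the EBCDIC mainframe integration this role describes: tools like Precisely Connect handle it at scale (including copybook layouts and packed-decimal fields), but the core text conversion is just a codepage decode. A minimal Python sketch, where the choice of codepage cp037 and the sample bytes are illustrative assumptions:

```python
# Illustrative sketch: real mainframe extracts also carry packed-decimal (COMP-3)
# fields and COBOL copybook layouts, which need more than a codepage decode.
EBCDIC_CODEPAGE = "cp037"  # common US/Canada EBCDIC codepage (assumption)

def ebcdic_to_text(raw: bytes) -> str:
    """Decode an EBCDIC byte string into a normal Python string."""
    return raw.decode(EBCDIC_CODEPAGE)

# 'HELLO' encoded in EBCDIC cp037:
sample = bytes([0xC8, 0xC5, 0xD3, 0xD3, 0xD6])
text = ebcdic_to_text(sample)
```

Python's standard codecs already ship the cp037/cp500 EBCDIC tables, so no third-party library is needed for plain character data.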

Feb 05, 2025 | Louisville, KY | Data Warehousing | Contract-to-perm | $46 - $57 (hourly estimate)

Data Labeling SME

Requirements:
- 3+ years of professional experience in data labeling
- Knowledge of identifying HIPAA-protected data and PII in the data labeling process
- Professional experience providing technical expertise and support in the selection and implementation of data labeling tools and platforms
- Ability to conduct quality assurance checks on labeled data to ensure consistency and accuracy
- Experience training and mentoring team members on data labeling processes and best practices
- Previous experience developing and implementing data labeling guidelines, standards, and best practices
- Prior healthcare industry experience

Preferred Skills:
- Previous professional working experience with BigID as an automated tool for data identification and labeling

We are seeking a data labeling SME. The Data Labeling Subject Matter Expert will be responsible for overseeing and ensuring the accuracy and quality of data labeling processes. This role requires a deep understanding of HIPAA, PII, and data annotation techniques, tools, and best practices. The targeted pay range is $50-65/hour.
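The quality-assurance checks this role calls for are often spot checks that labeled text actually flags the PII it contains. As an illustrative sketch only (the two regexes below are toy patterns; a production tool such as BigID applies far richer detection rules):

```python
import re

# Toy patterns for illustration; real PII/HIPAA detection needs much more.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def find_pii(text: str) -> list[tuple[str, str]]:
    """Return (label, match) pairs for every PII pattern found in the text."""
    hits = []
    for label, pattern in PII_PATTERNS.items():
        hits.extend((label, m) for m in pattern.findall(text))
    return hits

note = "Contact jane@example.com, SSN 123-45-6789."
hits = find_pii(note)
```

A QA pass can then compare these automatic hits against the human-applied labels and flag documents where the two disagree.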

Mar 23, 2026 | Brookfield, WI | Data Warehousing | Perm | $131k - $217k (estimate)

REMOTE Staff Data Engineer

Insight Global is seeking a Staff Data Engineer to join our actuarial consultant customer 100% remotely. In this position as a Staff Data Engineer on our client's Data Platform, you will be responsible for designing and implementing robust Data Platform solutions that meet business objectives while ensuring compliance with industry-leading data privacy standards. You will collaborate closely with cross-functional agile teams to drive data architecture decisions, implement best practices, and contribute to the success of our client's projects. What you will be doing:

- Act as a subject matter expert and thought leader within the Data Platform domain
- Data Strategy: Serve as a thought leader in data-processing design and implementation, defining advanced structure for moving, storing, and maintaining high-quality data
- Team Leadership: Lead projects by managing timelines, coordinating teams, and communicating project statuses; influence organizational direction through effective leadership and strategic collaboration
- Data Governance and Security: Serve as a subject matter expert on governance standards, continuously aligning data practices with evolving industry best practices and requirements
- Project Management and Scope of Work: Contribute to defining the overall vision and strategy for data engineering within the organization, ensuring alignment with organizational goals and long-term objectives
- Results Orientation: Establish visionary goals, advise on strategic plans, employ advanced monitoring, influence high-level stakeholders, and deliver transformative results
- Data Platform: Expansion of our Data Warehouse and Lakehouse solutions for a healthcare-data-focused enterprise
- Data Governance: Configuring and maintaining Unity Catalog to enable enterprise data lineage, data quality, auditability, and data stewardship
- Data Security: Building out data security protocols and best practices, including the management of identified and de-identified (PHI/PII) solutions
- Access Management: Always ensure a policy of least privilege is followed for anything being implemented
- External Data Products: Building data solutions for clients while upholding the best standards for reliability, quality, and performance
- ETL: Building solutions within Delta Live Tables and automating transformations
- Medallion Architecture: Building out performant enterprise-level medallion architectures
- Streaming and Batch Processing: Building fit-for-purpose near-real-time streaming and batch solutions
- Large Data Management: Building out performant and efficient enterprise solutions for internal and external users, for both structured and unstructured healthcare data
- Platform Engineering: Building out Infrastructure as Code using Terraform and Asset Bundles
- Costs: Working with the business to build cost-effective and cost-transparent data solutions
- Pipeline/ETL Management: Help architect, build, and maintain robust and scalable data pipelines, monitoring and optimizing performance
- Work with migration tools, e.g., Fivetran, AWS technologies, and custom solutions
- Identify and implement improvements to enhance data-processing efficiency
- Design and implement reliable and resilient event-driven data processing
- Build effective pipeline monitoring solutions
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Delta Live Tables, Python, Scala, and cloud-based "big data" technologies
- API Development: Drive the design and implementation of internal APIs for integrating data between different systems and applications
- Integrate with external systems using API-driven processes to ingest data
- Develop APIs built on top of datasets so internal systems can consume data from Databricks
- Integrate with external APIs including, but not limited to, Salesforce, financial systems, HR systems, and other external systems
- Data Modeling: Lead the design, implementation, and maintenance of standards-based (FHIR, OMOP, etc.) and efficient data models for both structured and unstructured data
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Develop and maintain data models, ensuring they align with business objectives and data privacy regulations
- Collaboration: Partner internally and externally with key stakeholders to ensure we are providing meaningful, functional, and valuable data
- Effectively work with Data, Development, Analyst, Data Science, and Business team members to gather requirements and to propose and build solutions
- Communicate complex technical concepts to non-technical stakeholders and provide guidance on best practices
- Ensure that technology execution aligns with business strategy and provides efficient, secure solutions and systems
- Gather requirements and build out project plans to implement them, with forecasted implementation effort
- Processes and Tools: Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build analytics tools that use the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics
- Create data tools for clinical, analytics, and data science team members that assist them in building and optimizing our product into an innovative industry leader
- Lead the investigation of new tooling, develop implementation plans, and deploy the necessary tooling

Requirements:
- 15+ years of relevant experience in the design, development, and testing of Data Platform solutions such as Data Warehouses, Data Lakes, and Data Products
- Expert-level experience working in Databricks and AWS
- Expert-level experience working in both relational and non-relational databases such as SQL Server, PostgreSQL, DynamoDB, and DocumentDB
- Experience building and managing solutions on AWS
- Expert in building out data models and data warehouses, and designing data lakes for enterprise and product use cases
- Familiarity with designing and building APIs, ETL, and data ingestion processes, and with tools that support enterprise solutions
- Experience in performance tuning, query optimization, security, monitoring, and release management
- Experience working with and managing large, disparate, identified, and de-identified data sets from multiple data sources
- Experience building and deploying IaC using Terraform, Asset Bundles, and GitHub
- Experience collaborating with Data Science teams and building AI-based solutions to drive efficiencies and business value

Preferred Skills:
- Bachelor's or master's degree in computer science, data engineering, or a related field
- Experience managing and standardizing clinical data from structured and unstructured sources
- Health and Life Insurance business experience
- Knowledge of healthcare standards including FHIR, C-CDA, and traditional HL7
- Knowledge of clinical standards/ontologies including ICD-10, SNOMED, NDC, LOINC, and RxNorm
- Associate or Professional level
solution architecture certification in Azure and/or AWS?Experience in Snowflake ?Experience in Spark?Experience with Salesforce Integration","Industry":"Data Warehousing","Country":"US","Division":"IT","Office":null,"IsRemoteJob":true,"IsInternalJob":false,"ExtraValues":null,"__RecordIndex":0,"__OrdinalPosition":0,"__Timestamp":0,"Status":null,"ApplicantCount":0,"SubmittalCount":0,"ApplicationToHireRatio":0,"JobDuration":null,"SalaryHigh":217000.0000,"SalaryLow":131000.0000,"PayRateOvertime":0,"PayRateStraight":0,"Filled":0,"RemainingOpenings":0,"TotalOpenings":0,"Visa":null,"ClearanceType":null,"IsClearanceRequired":false,"IsHealthcare":false,"IsRemote":false,"EndClient":null,"JobCreatedDate":"\/Date(-62135578800000)\/","JobModifiedDate":"\/Date(-62135578800000)\/"}

Insight Global is seeking a Staff Data Engineer to join our actuarial consultant customer 100% remotely. In this position as a Staff Data Engineer on our client's Data Platform, you will be responsible for designing and implementing robust Data Platform solutions that meet business objectives while ensuring compliance with industry-leading data privacy standards. You will collaborate closely with cross-functional agile teams to drive data architecture decisions, implement best practices, and contribute to the success of our client's projects.

What you will be doing:
- Acts as a subject matter expert and thought leader within the Data Platform domain
- Data Strategy: Serves as a thought leader in data processing design and implementation, defining advanced structure for moving, storing, and maintaining high-quality data
- Team Leadership: Leads projects by managing timelines, coordinating teams, and communicating project status; influences organizational direction through effective leadership and strategic collaboration
- Data Governance and Security: Serves as a subject matter expert on governance standards, continuously aligning data practices with evolving industry best practices and requirements
- Project Management and Scope of Work: Contributes to defining the overall vision and strategy for data engineering within the organization, ensuring alignment with organizational goals and long-term objectives
- Results Orientation: Establishes visionary goals, advises on strategic plans, employs advanced monitoring, influences high-level stakeholders, and delivers transformative results
- Data Platform: Expands our data warehouse and lakehouse solutions for a healthcare-data-focused enterprise
- Data Governance: Configures and maintains Unity Catalog to enable enterprise data lineage, data quality, auditability, and data stewardship
- Data Security: Builds out data security protocols and best practices, including the management of identified and de-identified (PHI/PII) solutions
- Access Management: Ensures a policy of least privilege is always followed for anything being implemented
- External Data Products: Builds data solutions for clients while upholding the best standards for reliability, quality, and performance
- ETL: Builds solutions within Delta Live Tables and automates transformations
- Medallion Architecture: Builds out performant, enterprise-level medallion architectures
- Streaming and Batch Processing: Builds fit-for-purpose near-real-time streaming and batch solutions
- Large Data Management: Builds out performant and efficient enterprise solutions for internal and external users for both structured and unstructured healthcare data
- Platform Engineering: Builds out Infrastructure as Code using Terraform and Asset Bundles
- Costs: Works with the business to build cost-effective and cost-transparent data solutions
- Pipeline/ETL Management: Helps architect, build, and maintain robust and scalable data pipelines, monitoring and optimizing performance; works with migration tools (e.g., Fivetran), AWS technologies, and custom solutions; identifies and implements improvements to enhance data processing efficiency; designs and implements reliable, resilient event-driven data processing; builds out effective pipeline monitoring; builds the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL, Delta Live Tables, Python, Scala, and cloud-based "big data" technologies
- API Development: Drives the design and implementation of internal APIs for integrating data between different systems and applications; integrates with external systems using API-driven processes to ingest data; develops APIs built on top of datasets for internal systems to consume data from Databricks; integrates with external APIs including, but not limited to, Salesforce, financial systems, and HR systems
- Data Modeling: Leads the design, implementation, and maintenance of standards-based (FHIR, OMOP, etc.) and efficient data models for both structured and unstructured data; assembles large, complex data sets that meet functional and non-functional business requirements; develops and maintains data models, ensuring they align with business objectives and data privacy regulations
- Collaboration: Partners internally and externally with key stakeholders to ensure we provide meaningful, functional, and valuable data; works effectively with data, development, analyst, data science, and business team members to gather requirements and propose and build solutions; communicates complex technical concepts to non-technical stakeholders and provides guidance on best practices; ensures that technology execution aligns with business strategy and provides efficient, secure solutions and systems; gathers requirements and builds out project plans, with forecasted effort, to implement them
- Processes and Tools: Identifies, designs, and implements internal process improvements such as automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability; builds analytics tools that use the data pipeline to provide actionable insight into operational efficiency and other key business performance metrics; creates data tools for clinical, analytics, and data science team members that help them build and optimize our product into an innovative industry leader; leads the investigation of new tooling, develops implementation plans, and deploys necessary tooling

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
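The medallion (bronze/silver/gold) and Delta Live Tables responsibilities described in this listing follow a common layered-refinement pattern: land raw data, then clean and de-duplicate it, then aggregate it for consumers. As a rough hypothetical sketch only (plain Python stand-ins with invented record fields, not actual Databricks or Delta Live Tables code), the three layers might look like:

```python
# Toy bronze/silver/gold ("medallion") flow. All field names here
# (member_id, cost, _source) are invented for illustration.

def bronze_ingest(raw_rows):
    """Bronze: land raw records as-is, adding only ingestion metadata."""
    return [dict(row, _source="claims_feed") for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: validate, normalize types, and de-duplicate by member_id."""
    seen, out = set(), []
    for row in bronze_rows:
        if row.get("member_id") and row["member_id"] not in seen:
            seen.add(row["member_id"])
            out.append({"member_id": row["member_id"],
                        "cost": float(row.get("cost", 0))})
    return out

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate ready for analytics consumers."""
    return {"member_count": len(silver_rows),
            "total_cost": sum(r["cost"] for r in silver_rows)}

raw = [{"member_id": "a1", "cost": "120.5"},
       {"member_id": "a1", "cost": "120.5"},   # duplicate, dropped in silver
       {"member_id": None, "cost": "99.0"}]    # invalid, dropped in silver
gold = gold_aggregate(silver_clean(bronze_ingest(raw)))
```

In a real lakehouse, each function would instead declare a managed Delta table, and the silver layer's validation rules would be expressed as declarative data quality expectations rather than inline Python checks.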

Mar 25, 2026 | Dearborn, MI | Data Warehousing | Perm | $80k - $100k (estimate)

INTL - MX - GCP Data Engineer

Must Have:
- 5+ years of analytics application development experience
- 5+ years of SQL development experience
- Any amount of GCP cloud experience
- In-depth understanding of cloud architecture and Google Cloud Platform services
- Strong understanding of DevOps principles, CI/CD pipelines, and automated testing
- Familiarity with cloud security best practices (IAM, encryption, network security)
- Experience with domain-driven design and data mesh principles
- Ability to design and implement data lakes, warehouses, and analytics platforms
- Strong understanding of microservices architecture

Plusses:
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field
- Google Professional Cloud Data Engineer certification
- Experience in banking or financial regulatory reporting
- Experience migrating legacy analytics applications to cloud platforms
- Strong leadership, communication, and presentation skills
- Exposure to diverse technologies and platforms
- Ability to work in fast-paced, multi-project environments
- Strong experience with GCP big data tools such as BigQuery, Bigtable, Dataflow, Pub/Sub, Data Fusion, Dataproc, Cloud Build, Airflow, and Terraform (or equivalent technologies)

Day to Day: In this role, you'll leverage your expertise in GCP and data engineering to modernize legacy applications and build scalable cloud analytics platforms. You'll collaborate with global engineering teams to define and implement enterprise data strategies, working closely with stakeholders to align solutions with business needs and regulatory requirements. Your responsibilities will include designing and delivering a unified data platform on GCP, guiding teams on data modeling and architecture, and advising on best practices for cloud security, DevOps, and data mesh. You'll also support proof-of-concepts, product evaluations, and integration efforts, while enabling insights through AI/ML platforms and modern data solutions.

Nov 14, 2025 | Dearborn, MI | Data Warehousing | Perm | $68k - $85k (estimate)

INTL - MX - GCP Data Engineer

Must Have:
- 5+ years of analytics application development experience
- 5+ years of SQL development experience
- Any amount of GCP cloud experience
- In-depth understanding of cloud architecture and Google Cloud Platform services
- Strong understanding of DevOps principles, CI/CD pipelines, and automated testing
- Familiarity with cloud security best practices (IAM, encryption, network security)
- Experience with domain-driven design and data mesh principles
- Ability to design and implement data lakes, warehouses, and analytics platforms
- Strong understanding of microservices architecture

Plusses:
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field
- Google Professional Cloud Data Engineer certification
- Experience in banking or financial regulatory reporting
- Experience migrating legacy analytics applications to cloud platforms
- Strong leadership, communication, and presentation skills
- Exposure to diverse technologies and platforms
- Ability to work in fast-paced, multi-project environments
- Strong experience with GCP big data tools such as BigQuery, Bigtable, Dataflow, Pub/Sub, Data Fusion, Dataproc, Cloud Build, Airflow, and Terraform (or equivalent technologies)

Day to Day: In this role, you'll leverage your expertise in GCP and data engineering to modernize legacy applications and build scalable cloud analytics platforms. You'll collaborate with global engineering teams to define and implement enterprise data strategies, working closely with stakeholders to align solutions with business needs and regulatory requirements. Your responsibilities will include designing and delivering a unified data platform on GCP, guiding teams on data modeling and architecture, and advising on best practices for cloud security, DevOps, and data mesh. You'll also support proof-of-concepts, product evaluations, and integration efforts, while enabling insights through AI/ML platforms and modern data solutions.

Apr 07, 2026 | Houston, TX | Data Warehousing | Contract-to-perm | $43 - $54 (hourly estimate)

Senior Data Engineer

Required Skills and Experience:
- 5+ years of experience as a Data Engineer supporting complex, enterprise-level environments
- Strong experience with PostGIS, including storing and querying spatial data in PostgreSQL using functions such as spatial joins, distance calculations, and containment queries
- Hands-on experience loading and managing common GIS formats (GeoJSON, Shapefiles, GeoTIFFs) into databases using Python and GDAL-based tooling
- Proven experience writing dbt models, including models that incorporate spatial SQL logic
- Experience using Dagster to orchestrate data ingestion and transformation pipelines
- Strong Python experience, including writing and maintaining production code using libraries such as geopandas, rasterio, boto3, and related tooling
- Strong SQL skills, including query development, data analysis, and reporting
- Experience working with multiple SQL databases such as PostgreSQL, SQL Server, MySQL, or Oracle
- Server or platform administration experience, including job scheduling, automation, and pipeline maintenance
- Experience working in both Windows and Linux environments

Nice to Have:
- Experience with big data technologies such as Spark
- Familiarity with Microsoft Power Platform
- Proven experience leveraging generative AI to improve data quality, accessibility, pipeline performance, or operational efficiency
- Experience with machine learning workflows or data preparation for ML use cases
- Proficiency with AWS services such as S3, EC2, RDS, and Redshift, and familiarity with comparable Azure services
- Strong experience using pandas for data transformation and analysis

The client is seeking a Data Engineer to join their fast-growing data team and play a critical role in building, maintaining, and scaling a modern AWS-based data lakehouse. This individual will focus heavily on ingesting, transforming, and modeling geospatial and tabular data, bringing spatial data such as points, polygons, routes, and raster datasets into production-grade pipelines and databases rather than working solely within desktop GIS tools. The Data Engineer will design and maintain robust data pipelines, develop spatially aware data models, and partner closely with cross-functional teams to deliver reliable, scalable data solutions while leading technical initiatives in an Agile environment. The salary for this position will range between $100-115K+ annually once converted from contract, depending on level of experience.
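The "containment queries" this listing associates with PostGIS (e.g., PostGIS's `ST_Contains`) can be illustrated outside the database with a classic ray-casting test. The following is a hypothetical pure-Python stand-in with made-up coordinates; in the actual role this check would run inside PostgreSQL as spatial SQL:

```python
# Point-in-polygon via ray casting: a ray going right from the point
# crosses the polygon boundary an odd number of times iff the point
# is inside. (Roughly what ST_Contains(polygon, point) answers.)

def point_in_polygon(pt, polygon):
    """Return True if pt (x, y) lies inside the polygon (list of vertices)."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal line through pt?
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that horizontal line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A 4x4 square with one corner at the origin (invented test geometry)
square = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
```

Equivalent spatial SQL would be along the lines of `SELECT ST_Contains(poly.geom, pt.geom) ...`, with the database's spatial index doing the heavy lifting for large datasets.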

Jan 16, 2026 | Fort Mill, SC | Data Warehousing | Perm | $130k - $205k (estimate)

Data Engineering Manager

Required Skills and Experience:
- 3+ years of experience managing a mid-size team including at least 2 data engineers
- 3+ years of experience in the big data space
- Experience translating business requirements into technical data solutions at large scale
- Experience working with Spark (Spark SQL / DataFrames / Dataset API) using Scala or Python to build and maintain complex ETL pipelines
- Experience with Databricks
- 2+ years of experience working on AWS
- Excellent communication skills, with the ability to clearly articulate complex technical concepts to non-technical stakeholders
- Experience with GitHub and CI/CD processes
- Experience driving operational excellence for large-scale data platforms, including reliability, scalability, performance, and cost optimization


Feb 03, 2026

Seattle, WA

|

System Administrator

|

Contract

|

$24 - $30 (hourly estimate)

Data Administrator

A Fortune 500 company is looking for a remote Data Administrator to support the organization's Retail Food Safety Operations team. This role focuses on keeping food safety operations running smoothly by validating and monitoring data across multiple systems to ensure everything is accurate, consistent, and useful for business decisions. Day to day, you'd manage and organize SharePoint sites, Teams channels, Outlook workflows, and document repositories so that critical food safety documentation, compliance records, and operational standards are easy to find, properly categorized, and aligned with corporate requirements. You'd support a fully remote team of 22 regional food safety and public health advisors, helping them collaborate efficiently, reference historical materials, and maintain clean, structured information libraries. A big part of the work involves auditing processes, organizing and labeling documents, and making sure Non-Partner Workforce-related materials and accommodations are categorized correctly, without needing to be the subject-matter expert on the content itself. Overall, it's a blend of technical administration, process upkeep, and thoughtful information organization to ensure the team has what they need at their fingertips.

Requirements:
• 3-5 years of experience in a Data Administrator or related role
• 2+ years of experience architecting document repositories
• Proficiency in the Microsoft Office suite (Outlook, Teams, SharePoint)
• Strong experience with Smartsheet and SharePoint document repositories

Plusses:
• Background in food safety or similar knowledge


1 - 10 of 30