REMOTE Sr. Data Engineer

Post Date

Mar 23, 2026

Location

Brookfield, Wisconsin

ZIP/Postal Code

53045
US

End Date

May 23, 2026

Company

Insight Global

Job Type

Perm

Category

Data Warehousing

Req #

MKE-46be3e24-1934-4301-aa69-04e10e3df945

Pay Rate

$105k - $199k (estimate)

Job Description

Insight Global is seeking a Sr. Data Engineer to join our actuarial consulting client in a fully remote role. In this position on our client’s Data Platform team, you will be responsible for designing and implementing robust Data Platform solutions that meet business objectives while ensuring compliance with industry-leading data privacy standards. You will collaborate closely with cross-functional agile teams to drive data architecture decisions, implement best practices, and contribute to the success of our client’s projects. Responsibilities include:
• Data Platform: Creating Databricks Data Warehouse and Lakehouse solutions for a healthcare-data-focused enterprise.
o Data Governance: Configuring and maintaining Unity Catalog to enable enterprise data lineage and data quality
o Data Security: Building out data security protocols and best practices, including the management of identified and de-identified (PHI/PII) solutions
o External Data Products: Building data solutions for clients while upholding the best standards for reliability, quality, and performance
o ETL: Building solutions with Delta Live Tables and automating transformations
o Medallion Architecture: Building out performant enterprise-level medallion architecture(s)
o Streaming and Batch Processing: Building fit-for-purpose near real-time streaming and batch solutions
o Large Data Management: Building out performant and efficient enterprise solutions for internal and external users for both structured and unstructured healthcare data
o Platform Engineering: Building out Infrastructure as Code using Terraform and Asset Bundles
o Costs: Working with the business to build cost effective and cost transparent Data solutions
• Pipeline/ETL Management: You will help architect, build, and maintain robust and scalable data pipelines, monitoring and optimizing their performance
o Experience working with migration tools (e.g., AWS DMS, AWS Glue, Fivetran, Integrate.io)
o Identify and implement improvements to enhance data processing efficiency
o Experience with building out effective pipeline monitoring solutions
o Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Delta Live Tables, Python, Scala, and cloud-based ‘big data’ technologies.
• Data Modeling: Lead design, implementation, and maintenance of standards based (FHIR, OMOP, etc.) and efficient data models for both structured and unstructured data
o Assemble large, complex data sets that meet functional and non-functional business requirements
o Develop and maintain data models, ensuring they align with business objectives and data privacy regulations
• Collaboration: Partner internally with key stakeholders to ensure we are providing meaningful, functional, and valuable data
o Effectively work with Data, Development, Analyst, Data Science, and Business team members to gather requirements and to propose and build solutions.
o Communicate complex technical concepts to non-technical stakeholders and provide guidance on best practices.
o Ensure that technology execution aligns with business strategy and provides efficient, secure solutions and systems
• Processes and Tools: Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
o Build analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics.
o Create data tools for clinical, analytics, and data science team members that help them build and optimize our product into an innovative industry leader.

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

Required Skills & Experience

• 7+ years of relevant experience in design, development, and testing of Data Platform solutions, such as Data Warehouses, Data Lakes, and Data Products
• Expert-level experience working in Databricks and AWS
• Advanced-level experience working in both relational and non-relational databases such as SQL Server and PostgreSQL
• Experience building and managing solutions on AWS
• Advanced experience building and deploying IaC using Terraform, Asset Bundles, and GitHub
• Advanced experience building data models and data warehouses and designing data lakes for enterprise and product use
• Familiarity with designing and building APIs, ETL, and data ingestion processes, and with tools that support enterprise solutions
• Experience in performance tuning, query optimization, security, monitoring, and release management.
• Experience working with and managing large, disparate, identified, and de-identified data sets from multiple data sources

Nice to Have Skills & Experience

• Bachelor's or master's degree in computer science, data engineering, or a related field
• Experience managing and standardizing clinical data from structured and unstructured sources
• Health and Life Insurance business experience
• Knowledge of healthcare standards, including FHIR, C-CDA, and traditional HL7
• Knowledge of clinical standards/ontologies, including ICD-10, SNOMED, NDC, LOINC, and RxNorm
• Associate or Professional level solution architecture certification in Azure and/or AWS
• Experience in Snowflake
• Experience in Spark
• Experience with Salesforce Integration

Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.