Job Description
• ETL Development: Write new ETLs for real-time and batch data to support reporting and product features.
• Requirement Gathering & Design: Collaborate with product teams to understand needs, design solutions, and document implementation plans.
• System Development: Build monitoring, alerting, and deployment pipelines; ensure performance and cost-efficiency.
• Best Practices: Apply standards for test coverage, scalability, fault tolerance, and system architecture.
• Documentation: Create system overviews, runbooks, and improvement recommendations.
• Agile Project Management: Participate in stand-ups, manage blockers, track progress, and communicate with stakeholders.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Required Skills & Experience
• 3-5+ years in software/data engineering using Agile and DevOps practices.
• Experience building Data Lakes/Warehouses with big data and cloud technologies.
• Technical Skills:
o End-to-end ETL workflows
o Kafka for real-time data streaming and ingestion
o Spark for real-time aggregation and computational data engineering
o Databricks for cluster management and related workloads
o AWS, including S3 buckets (the team does not currently use Lambda or other serverless services)
o Python and PySpark
o SQL for querying
o Experience centralizing analytics and data to provide insights that other teams use to set roadmaps and direction
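For illustration only (not part of the role description): the extract-transform-load pattern the skills above describe can be sketched in plain Python with the standard-library sqlite3 module. All table names, field names, and sample values here are invented; a production pipeline would use Kafka and Spark for the extract and transform stages rather than in-memory lists.

```python
import sqlite3

# Extract: raw event rows as they might arrive from an ingestion layer.
# (Hypothetical sample data; in production this would come from Kafka.)
raw_events = [
    {"user_id": 1, "amount": "19.99"},
    {"user_id": 2, "amount": "5.00"},
    {"user_id": 1, "amount": "3.50"},
]

# Transform: cast string amounts to floats and aggregate per user.
totals: dict[int, float] = {}
for event in raw_events:
    totals[event["user_id"]] = totals.get(event["user_id"], 0.0) + float(event["amount"])

# Load: write the aggregates into a reporting table, then query with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user_totals (user_id INTEGER PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO user_totals VALUES (?, ?)", totals.items())
top = conn.execute(
    "SELECT user_id, total FROM user_totals ORDER BY total DESC LIMIT 1"
).fetchone()
```

The same three stages map directly onto the listed stack: Kafka feeds the extract step, Spark/PySpark performs the transform, and the load target is typically S3 or a warehouse table queried with SQL.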
Nice to Have Skills & Experience
o Experience deploying ML models.
o CI/CD or DevOps (another team supports this, but experience is a plus).
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.