The role requires the ability to design, develop, and troubleshoot complex data pipeline systems, perform data quality analysis, and develop using automation technologies. Must be able to collaborate and communicate effectively with multiple team members across different US time zones and to create design documentation. Must be able to review existing data pipelines, quickly understand their flow, and implement modernized versions based on additional requirements.
Core Responsibilities
- Design, develop, and maintain data pipelines using Azure Data Factory (ADF), including orchestration of ETL/ELT workflows.
- Implement data pipelines using ADF, Kusto Query Language (KQL), Bicep, and Ev2, including pipelines, datasets, linked services, and triggers.
- Collaborate with internal teams to define pipeline requirements, data sources, and transformation logic.
- Monitor and troubleshoot pipeline executions, ensuring data quality and timely delivery.
- Contribute to modularization and reusability of Bicep templates for scalable deployments.
- Participate in code reviews and CI/CD processes using Ev2 for ADF and Bicep artifacts.
- The team is responsible for security services in airgap clouds, including security monitoring in Azure. The team supports an incredibly complex data processing pipeline, which processes security logs for Windows and Linux machines across all of Azure.
- The role will process security logs, taking them from raw format to product reports consumable by compliance and other teams to create detections, reports, etc. This role will also support the team's efforts to refactor components so that some services can function with less dependence on the cloud. The role may involve work with some international governments, and if the candidate has an active US Government Security Clearance, they may work on classified US government workloads.
- The scope of managing data pipelines for multiple sovereign nations with sovereign cloud instances is complex and requires surge support to build out these pipelines in a scalable manner, so the team can deploy into multiple clouds with automation.
- The team is building the necessary architecture, and this role will support that effort by taking on work for some subsets of the data pipeline.
- The role will consist of reviewing Kusto queries, both the logic and the data in the pipelines, to understand the orchestration flow. The role will then define architecture in Kusto (KQL) scripts, Bicep (which provides a programming language-like interface over ARM templates), and Ev2 deployments, which deploy the templates and scripts. The role will also manage service config files and define parameters.
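To give a concrete sense of the Bicep work described above, the sketch below shows a minimal template declaring an Azure Data Factory instance and an empty pipeline under it. All names and parameters here are illustrative assumptions, not the team's actual configuration; in practice such templates would be parameterized per cloud and deployed through Ev2.

```bicep
// Illustrative sketch only: resource names and structure are assumptions.
param location string = resourceGroup().location
param factoryName string

// Declare the Data Factory instance with a managed identity.
resource dataFactory 'Microsoft.DataFactory/factories@2018-06-01' = {
  name: factoryName
  location: location
  identity: {
    type: 'SystemAssigned'
  }
}

// Declare a pipeline as a child resource of the factory.
resource ingestPipeline 'Microsoft.DataFactory/factories/pipelines@2018-06-01' = {
  parent: dataFactory
  name: 'ingestSecurityLogs'
  properties: {
    activities: [
      // Copy/transform activities (e.g. raw logs into ADX) would be defined here.
    ]
  }
}
```

Modularizing templates like this one (as mentioned in the responsibilities) typically means factoring the pipeline, linked services, and triggers into reusable Bicep modules that take environment-specific parameters.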
We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to
Human Resources Request Form. The EEOC "Know Your Rights" Poster is available
here.
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy:
https://insightglobal.com/workforce-privacy-policy/ .
Data Engineer with 3-5 years of experience in data engineering or DevOps roles, focusing on cloud-native services, preferably Azure. The ideal candidate will have strong hands-on experience with database query syntax, data orchestration pipelines, and deployment automation technologies, preferably Azure Data Explorer (ADX/Kusto) and Azure Data Factory (ADF).
- 5 years' experience with at least one database query language, preferably Kusto. Significant experience in a transferable language such as SQL or Splunk query syntax may be acceptable.
- 5 years' experience with deployment automation technologies; Azure Data Factory and Ev2 are ideal for cloud deployments.
- 3 years' experience with cloud development in general, preferably in Azure
- 3-5 years of experience in data engineering or DevOps roles with a focus on cloud-native services (Azure preferred)
- Strong hands-on experience with data orchestration pipelines, preferably Azure Data Factory, including design, development, and troubleshooting
- Proficiency in deployment automation technologies
- Familiarity with Azure Data Explorer (ADX / Kusto) or similar cloud-native database solutions.
- Experience with version control systems (e.g., Git) and working in collaborative development environments.
- Ability to debug and optimize data workflows and handle dependency management
- Strong communication and documentation skills.
- Ability to work independently and manage time effectively in a remote or hybrid setting.
- Comfortable collaborating across engineering, product, and operations teams.
- Experience with Kusto Query Language (KQL) and Azure Data Explorer (Kusto) for pipeline observability.
- Experience with Bicep for deploying Azure resources, with a solid understanding of ARM templates and Azure Resource Manager.
- Exposure to region-agnostic deployment strategies and multi-environment orchestration using Ev2.
- Knowledge of security and compliance practices in Azure environments.
- An active US Government Security Clearance is preferred but not required
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.