Job Description
Architect, design, develop, and implement a next-generation data-streaming and event-driven architecture/platform using software engineering best practices and the latest technologies:
o Data Streaming, Event Driven Architecture, Event Processing Frameworks
o DevOps (Jenkins, Red Hat OpenShift, Docker, SonarQube)
o Infrastructure-as-Code and Configuration-as-Code (Ansible, Terraform / CloudFormation, Scripting)
• Administer Kafka including automating, installing, migrating, upgrading, deploying, troubleshooting, and configuring on Linux.
• Provide expertise in one or more of these areas: Kafka administration, event-driven architecture, automation, application integration, monitoring and alerting, security, business process management/business rules processing, CI/CD pipeline and containerization, or data ingestion/data modeling.
• Investigate, repair, and actively ensure business continuity regardless of impacted component: Kafka Platform, business logic, middleware, networking, CI/CD pipeline, or database (PL/SQL and Data Modeling).
• Brief management, customers, team members, or vendors, in writing or orally, at a technical level appropriate to the audience.
The pay range for this position is $65/hr to $75/hr.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances.

If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com.

To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Required Skills & Experience
• Bachelor's Degree in Computer Science, Mathematics, Engineering, or a related field.
• A master's or doctoral degree may substitute for required experience.
• 8+ years of combined experience with Site Reliability Engineering, providing DevOps support, and/or RHEL administration for mission-critical platforms, ideally Kafka.
• 4+ years of combined experience with Kafka (Confluent Kafka, Apache Kafka, Amazon MSK)
• 4+ years of experience with Ansible automation
• Must be able to obtain and maintain a Public Trust clearance (contract requirement).
• Strong experience with Ansible Automation and authoring playbooks and roles for installing, maintaining, or upgrading platforms
• Solid experience using version control software such as Git/Bitbucket including peer reviewing Ansible playbooks
• Hands-on experience administering a Kafka platform (Confluent Kafka, Apache Kafka, Amazon MSK) via Ansible playbooks or other automation.
• Understanding of Kafka architecture, including partition strategy, replication, transactions, tiered storage, and disaster recovery strategies.
• Strong experience automating tasks with scripting languages such as Bash/shell or Python
• Solid foundation of Red Hat Enterprise Linux (RHEL) administration
• Basic networking skills
• Solid experience triaging and monitoring complex issues, outages, and incidents
• Experience integrating and maintaining third-party tools such as ZooKeeper, Flink, Pinot, Prometheus, and Grafana.
• Experience with Platform-as-a-Service (PaaS) using Red Hat OpenShift/Kubernetes and Docker containers
• Experience working on Agile projects and understanding Agile terminology.
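To illustrate the "partition strategy" knowledge the requirements above call for: Kafka preserves ordering per key by always routing records with the same key to the same partition. The sketch below is a simplified, hypothetical stand-in for the producer's default partitioner (the real one hashes keys with murmur2; MD5 is used here only for a deterministic illustration):

```python
import hashlib

def choose_partition(key: bytes, num_partitions: int) -> int:
    """Simplified stand-in for Kafka's default key-based partitioner.

    The real producer hashes the record key with murmur2; MD5 is used
    here purely for a deterministic illustration. Records sharing a key
    always land on the same partition, which preserves per-key ordering.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Same key -> same partition, so per-key ordering is preserved.
assert choose_partition(b"customer-42", 12) == choose_partition(b"customer-42", 12)
```

Note that this is why changing a topic's partition count reshuffles key-to-partition assignments, a consideration when planning migrations or upgrades.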
Nice to Have Skills & Experience
• Confluent Certified Administrator for Apache Kafka (CCAAK) or Confluent Certified Developer for Apache Kafka (CCDAK) certification preferred
• Practical experience with event-driven applications and at least one event processing framework, such as Kafka Streams, Apache Flink, or ksqlDB.
• Understanding of Domain Driven Design (DDD) and experience applying DDD patterns in software development.
• Experience working with Kafka connectors and/or supporting operation of the Kafka Connect API
• Experience with Avro / JSON data serialization and schema governance with Confluent Schema Registry.
• Experience with AWS or other cloud providers preferred; AWS cloud certifications are a plus.
• Experience with Infrastructure-as-Code (CloudFormation / Terraform, Scripting)
• Solid knowledge of relational databases (PostgreSQL, DB2, or Oracle), NoSQL databases (MongoDB, Cassandra, DynamoDB), SQL, and/or ORM technologies (JPA2, Hibernate, or Spring JPA)
• Knowledge of the Social Security Administration (SSA)
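As a toy illustration of the event processing frameworks listed above (Kafka Streams, Flink, ksqlDB): a common stateful operation is a running count per key, re-emitting the updated count downstream after each event. This minimal Python sketch mimics that topology in memory; it is not the Streams API, and the event names are invented for illustration:

```python
from collections import defaultdict

def count_by_key(events):
    """Toy stateful aggregation in the spirit of a Kafka Streams
    count-by-key topology (illustrative only; not the Streams API).

    Consumes (key, value) pairs and emits (key, running_count) after
    each event, the way a KTable-backed count re-emits updates.
    """
    counts = defaultdict(int)
    for key, _value in events:
        counts[key] += 1
        yield key, counts[key]

events = [("login", {}), ("logout", {}), ("login", {})]
updates = list(count_by_key(events))
# The second "login" event produces an updated count of 2.
```

In a real framework the `counts` state would live in a fault-tolerant state store backed by a changelog topic rather than a process-local dict.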
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.