Job Description
Hadoop Admin role supporting NextGen Platforms built around Big Data technologies (Hadoop, Jupyter Notebook, Spark, Kafka, Impala, HBase, Docker/containers, Ansible, and more). Requires experience in cluster management of vendor-based Hadoop and Data Science (AI/ML) products such as Cloudera, DataRobot, C3, Panopticon, Talend, Trifacta, Selerity, ELK, and KPMG Ignite. The analyst is involved in the full life cycle of an application as part of an agile development process. The role requires the ability to interact, develop, engineer, and communicate collaboratively at the highest technical levels with clients, development teams, vendors, and other partners. The following section serves as a general guideline for each dimension of project complexity, responsibility, and education/experience within this role.
- Works on complex, major or highly visible tasks in support of multiple projects that require multiple areas of expertise
- Provides subject matter expertise in managing Hadoop and Data Science platform operations, with a focus on Cloudera Hadoop, Jupyter Notebook, OpenShift, and Docker/container cluster management and administration
- Integrates solutions with other applications and platforms outside the framework
- Manages platform operations across all environments, including upgrades, bug fixes, deployments, metrics/monitoring for resolution and forecasting, disaster recovery, and incident/problem/capacity management
- Serves as a liaison between client partners and vendors in coordination with project managers to provide technical solutions that address user needs
Required Skills & Experience
* 5+ years of Hadoop admin experience
* Hands on experience building and managing Hadoop clusters
* Strong technical knowledge: Unix/Linux; databases (Sybase, SQL, Oracle); Java or Python programming skills (Python preferred, but Java is acceptable)
* Strong experience with the Jupyter Notebook platform, specifically building out platforms; the team works with Anaconda
* Strong grasp of automation/DevOps tools (Ansible, Jenkins, SVN, Bitbucket); the team uses Ansible heavily, so Ansible experience is ideal
* Experience working with Big Data technologies: Hadoop, Kafka, Spark, Impala, Hive, HBase, etc.; Hadoop experience is the top priority and will make candidates stand out
* Knowledge of the Cloudera Big Data stack
Nice to Have Skills & Experience
* Cloudera Developer Certification
* Docker/containers and OpenShift (working knowledge is sufficient)
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.