Senior Hadoop Administrator

München - On-site
This project is archived and unfortunately no longer active.

Description

Skillset:

- At least 2 years of excellent hands-on working experience with the Hadoop ecosystem: Apache Spark, Kafka, HDFS, MapReduce, ZooKeeper, Hive, Impala, Oozie, Flume, Sentry; also Oracle, MySQL, PostgreSQL
- Hands-on experience with automation, provisioning, configuration and deployment technologies: Puppet, Ansible, VMware, Docker, etc.
- Hadoop cluster design, cluster configuration, server requirements, capacity scheduling, installation of services: NameNode, DataNode, ZooKeeper, JobTracker, YARN, etc.
- Skills in installing, upgrading and managing Hadoop distributions: CDH 5.x, Cloudera Manager, MapR, etc.
- Hands-on experience with enterprise-level Linux deployments as well as shell scripting: bash, tcsh, zsh
- Experience implementing Hadoop clusters in a large-scale environment, preferably including multi-tenancy and security with Kerberos
- At least 3 years of experience with system administration of Hadoop clusters, including OS-level administration (Linux)
- Deep expertise in distributed computing and the factors that determine distributed system performance
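Day-to-day, the shell-scripting skill listed above is typically applied against Hadoop's own tooling. As a minimal illustrative sketch only (the `hdfs dfsadmin -report` output below is embedded sample data, not from a live cluster, and the 20% threshold is an assumed policy), a bash script might parse the report and warn when remaining HDFS capacity runs low:

```shell
#!/usr/bin/env bash
# Illustrative sketch: parse `hdfs dfsadmin -report`-style output and warn
# when remaining HDFS capacity drops below a threshold. The report text is
# embedded as sample data so the script runs without a live cluster.
set -euo pipefail

THRESHOLD_PCT=20  # warn when less than 20% of capacity remains (assumed policy)

# Sample text in the format printed by `hdfs dfsadmin -report`.
report='Configured Capacity: 1099511627776 (1 TB)
Present Capacity: 989560464998 (921.5 GB)
DFS Remaining: 109951162777 (102.4 GB)
DFS Used: 879609302221 (819.2 GB)'

# Extract the byte counts (first field after the label).
remaining=$(awk -F': ' '/^DFS Remaining:/ {print $2}' <<<"$report" | awk '{print $1}')
capacity=$(awk -F': ' '/^Configured Capacity:/ {print $2}' <<<"$report" | awk '{print $1}')

pct_free=$(( remaining * 100 / capacity ))
echo "HDFS remaining: ${pct_free}% of configured capacity"

if (( pct_free < THRESHOLD_PCT )); then
  echo "WARNING: below ${THRESHOLD_PCT}% free; consider adding DataNodes" >&2
fi
```

On a live cluster the `report` variable would instead be filled with `report=$(hdfs dfsadmin -report)`; everything else stays the same.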

Responsibilities:
- Manage very large-scale, multi-tenant, secure, highly available Hadoop infrastructure supporting rapid data growth for a wide spectrum of innovative internal customers
- Install Hadoop distributions, updates, patches, and version upgrades
- Third-level support (DevOps) for business-critical applications and use cases
- Design, implement and maintain enterprise-level security (Kerberos, LDAP/AD, Sentry, etc.)
- Create run books for troubleshooting, cluster recovery and routine cluster maintenance
- Troubleshoot Hadoop-related applications, components and infrastructure issues at large scale
- Design, configure and manage the backup and disaster recovery of big data
- Provide architectural guidance, plan and estimate cluster capacity, and create roadmaps for Hadoop cluster deployment
- Evaluate and propose new tools and technologies to meet the needs of the global organization (Allianz Group)
- Work closely with infrastructure, network, database, application, business intelligence and data science units
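Estimating cluster capacity, listed among the responsibilities above, usually starts as a back-of-the-envelope sizing calculation. As an illustrative sketch only (the data volume, the 25% working-space overhead, and the per-node capacity are hypothetical figures, not Allianz requirements), it might look like:

```shell
#!/usr/bin/env bash
# Illustrative HDFS capacity estimate: raw storage needed for a given amount
# of logical data, accounting for replication and working-space overhead.
# All input figures are hypothetical examples.
set -euo pipefail

data_tb=100          # logical data to store, in TB (assumed)
replication=3        # HDFS default replication factor
overhead_pct=25      # extra space for intermediate/temporary data (assumed)
node_capacity_tb=12  # usable disk per DataNode, in TB (assumed)

# Raw storage = data * replication * (1 + overhead), in integer arithmetic.
raw_tb=$(( data_tb * replication * (100 + overhead_pct) / 100 ))
# Ceiling division: nodes needed to hold the raw total.
nodes=$(( (raw_tb + node_capacity_tb - 1) / node_capacity_tb ))

echo "Raw storage needed: ${raw_tb} TB"
echo "DataNodes required: ${nodes}"
```

With these example figures, 100 TB of data at replication factor 3 plus 25% overhead comes to 375 TB raw, or 32 nodes of 12 TB each; in practice the estimate would also account for compression ratios and growth projections.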
Start
11/2017
Duration
12 months
From
Charlotte Maurin
Posted
24.10.2017
Project ID:
1439062
Contract type
Freelance