Description
Tasks:
- Set up the infrastructure environment
- Work on two (external) use cases to integrate into the customer's systems
Requirements:
- Three years of experience as a DevOps engineer or in a related area
- Very strong experience with Linux and scripting languages
- Strong knowledge of the Hadoop ecosystem and related projects (Hive, Impala, Spark, ...)
- Knowledge of database technologies and fundamentals, both RDBMS and NoSQL
- Work experience with two or more programming languages (Java, C++/C#, Scala, Python, Ruby, C); JavaScript frameworks a plus
- Experience with distributed computing and resource management (YARN, Mesos...)
- Thorough bug-tracking and debugging skills, including deep dives into stack traces and log files
- Ability to communicate in a results-oriented way with people from different departments and with different skill sets
- Excellent time-management skills
- Fluent English (spoken and written)
Desirable:
- Technical knowledge of the Hadoop ecosystem and resource management
- Hands-on experience with hypervisor, provisioning, configuration, and deployment technologies (KVM, Vagrant, Puppet, Docker, ...)
- Hands-on experience with cloud technologies, principles (IaaS, PaaS, SaaS) and capabilities
- Managing security in an enterprise environment (LDAP/AD, Kerberos, encryption...)
- Knowledge of basic networking concepts (OSI Layers, VPN, Tunneling) and protocols
- Familiarity with architectural patterns (MVC, naked objects, microservices, ...), design patterns (facade, proxy, observer, ...), and principles (DRY, SOLID, SoC, ...)
Environment/Other:
Full-time
Start: 19.6.2017
Duration: until 31.10.2017
Industry: IT/Consulting