Description
Our client is a leading niche IT consultancy based in Munich that develops and implements big data solutions for international companies, helping customers manipulate and analyze their data across global data centers using advanced big data, machine learning, and deep learning tools in a production environment.
Your tasks
• You develop concepts for and integrate new software and system components from the Hadoop ecosystem (Hortonworks, Cloudera, MapR) or the ELK stack, using common continuous integration/delivery methods and tools such as Jenkins
• You conceptualize the relevant IT processes in the infrastructure environment, provided on-premises or off-premises (cloud), optimizing automation with tools such as Ansible, Chef, or Puppet
• You operate the complete development and production infrastructure, including monitoring, troubleshooting and optimization
• You continuously develop the entire IT infrastructure, interacting with and managing various cloud providers (AWS, OTC, Azure, etc.) as well as virtualization and container technologies
Your strengths
• You have a degree in computer science or a related field
• You have a good knowledge of networks and Linux
• You have initial experience in administration, development, and system integration, as well as in server and application operation
• You have in-depth knowledge of common Internet technologies and their protocols
• You have already acquired know-how in script programming and ideally also in big data technologies (e.g. Hadoop)