München: Big Data DevOps Engineering POS00395 PostgreSQL Elasticsearch Spark Kafka Python Azure AWS

München, Bayern – on site
This project is archived and unfortunately no longer active.

Description

Location: München
Start: 23.03.2020 (a slightly earlier or later start would also be possible)
Duration: until 31.12.2020, with likely extension (++)

Big Data DevOps Engineer
We are looking for a Big Data DevOps Engineer who is as passionate about our vision as we are. The role includes
the following responsibilities:
• Develop new functionalities for the Global Data Platform (GDP)
• Evaluate new technologies in the field of cloud computing, distributed computing, data engineering and data
science
• Look for opportunities to improve performance, reliability and automation
• Resolve incidents and change requests
• Support and interact with data scientists and data engineers
• Write technical documentation, announcements, blog posts, and best practices

Technical Skills
Our platform is a diverse product consisting of several technologies. This should be reflected in our
cross-functional team of individuals who have a deep understanding of their technical specialty and are at
the same time able to work outside of their core area. We encourage you to contribute your individual
strengths and personality to the team, and we give you the room to develop new skills and gain new experiences.
The following technical skills are required for this particular position:

Software Development
• Proficiency in at least one programming language (we have components written in Python, Scala, Elm etc.)
• Software engineering (design patterns, version control, automated testing, concurrent programming etc.)
• Continuous integration, deployment, and delivery

Data Engineering (offering the services below to our customers)
• Databases (e.g. PostgreSQL, Elasticsearch, Neo4j)
• Distributed systems (e.g. Spark)
• Event-based systems (e.g. Kafka)
• Workflow orchestration and automation (e.g. Luigi)
• Data science tooling (e.g. Jupyter, RStudio, Zeppelin)
• Basic understanding of how machine learning algorithms work

General IT Skills
• Advanced experience with Linux
• Containerization (Docker)
• General understanding of networking, infrastructure, orchestration, distributed systems, and IT security
principles

Other Skills
• DevOps mindset (you build it, you run it; taking responsibility for your work)
• Willingness and ability to learn new technologies and tools
• Team player, open to working in an agile environment
• Fluent English (written and spoken) is a must; other languages (e.g. German, French, Italian) are a plus
• Customer orientation and communication skills

Start: 03.2020
Duration: 9 months (extension possible)
From: CAES GmbH
Posted: 03.03.2020
Contact: Rafael Gallus
Project ID: 1903122
Contract type: Freelance