Senior Data Stream Engineer with Kafka skills

Nordrhein-Westfalen, Essen - On-site

Description

Tasks:

- Design and execute a Kafka cluster strategy to help our development teams deliver faster and more reliably
- Develop and maintain high-quality Kafka clients and libraries
- Write services and data pipelines for large-scale distributed computing, streaming, and big-data engines such as Spark, Samza, and Druid
- Write performant, clean, modern applications in Java, Scala, or Python
- Shape our state-of-the-art data platform
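To illustrate the kind of stream-processing logic the tasks above describe, here is a minimal, self-contained Python sketch of a tumbling-window event counter. It is purely illustrative: it does not use the actual Kafka, Spark, Samza, or Druid APIs, and the event format is an assumption made for the example.

```python
from collections import defaultdict


def tumbling_window_counts(events, window_size_s):
    """Group timestamped events into fixed-size (tumbling) windows
    and count occurrences of each key per window.

    events: iterable of (timestamp_seconds, key) pairs.
    Returns {window_start: {key: count}}, ordered by window start.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = (ts // window_size_s) * window_size_s
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}


# Example: page-view events spanning two 60-second windows.
events = [(0, "home"), (10, "home"), (65, "search"), (70, "home")]
print(tumbling_window_counts(events, 60))
# {0: {'home': 2}, 60: {'search': 1, 'home': 1}}
```

In a production pipeline the same aggregation would typically be expressed with a streaming engine's windowing primitives and fed from a Kafka topic rather than an in-memory list.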

Soft skills:

- Very good analytical and conceptual skills
- High dedication to quality
- Independent way of working
- High motivation and engagement
- Professional appearance and communication
- Team-minded

Technical skills:

- Strong fundamentals in Apache Kafka
- Strong fundamentals in distributed systems design and operation
- Experience with structured application event logging for analytics and machine learning
- Experience with modern application and system log management (Syslog, Sumo Logic, Fluentd, Loggly, Splunk, etc.)
- At least 2 years of experience developing, deploying, monitoring, and troubleshooting multi-tier applications and distributed systems at scale
- Container and cloud orchestration experience (Kubernetes, Mesos, etc.)
- Ability to work with remote teams
- Ability to work hard, have fun, and make a difference
- Ability to keep growing and learning


Partial remote work may be possible

Occasional travel to another project site may be required
Start:
Immediately
From:
Quanto AG
Posted:
27.02.2018
Contact:
Kerstin Hartmann
Project ID:
1511338
Contract type:
Freelance