Project Offer No. 11548 - Big Data Specialists (f/m) in Frankfurt am Main

Hesse, Frankfurt am Main - On-site
This project is archived and unfortunately no longer active.

Description

Dear Sir or Madam,

For our client, we are looking for two Big Data Specialists (f/m).

Details:

System engineer / solution architect with a deep understanding of big data solutions (Hadoop, Cloudera), and in particular of building data lakes in a large enterprise environment.
As part of the engineering team, his/her task will be to support engineers and developers in integrating different kinds of data sources into a large, distributed data lake, covering in particular the data-specification side of this work.

Roles and Responsibilities:

Work closely with solution designers to understand approach, requirements and solution design of current project
Collect, prepare and process data source specifications from various data source owners
Cover all significant aspects of data sources like type, volume, velocity, frequency, content description, etc.
Coordinate with systems engineers and other stakeholders outside the project to align requirements and design
Support project management and preparatory routines
Provide clear qualitative and quantitative information and knowledge to the team
Take part in and lead workshops
Document and visualize source descriptions and other artifacts
Drive the source management and on-boarding processes for the project
Support the general solution architecture and design efforts of the project

Skills - Must Have:

Several years of proven experience as an architect/engineer in complex Big Data projects (Hadoop, Kafka, Spark, HBase, Hive, Impala, etc.) or other mass-data applications at petabyte scale
At least 3 years of experience with IT Projects in large enterprises
At least 3 years of experience with (big) data integration projects
At least 1 year of experience with performance-critical data processing at large scale using files, data streams and databases
Very good documentation and visualization skills
Excellent oral and written communication skills in English (German nice to have)
Team player with experience in agile development approaches and tools (Scrum, Kanban, Jira)
Must be able to effectively and professionally communicate within the team

Beneficial:

At least 3 years of experience with large Hadoop clusters
Experience with batch and streaming data flows into and out of Cloudera (Spark, Kafka)
Experience with testing IT Systems and Data Integration Systems
Experience with working in internationally distributed cross-functional teams
Experience with the Cloudera framework

Start: ASAP, at the latest by 31 December 2016
Duration: until 30.09.2017+
Location: Frankfurt am Main
Workload: Full-time
Languages: German and English, written and spoken
Application documents: CV in German and/or English in Word format + all-in hourly rate

Can you support us here, or recommend someone? I look forward to hearing from you.
Thank you very much!

Kind regards

Sascha Riethmüller
Managing Partner

Projekt Broker Consultant Services GmbH
Wilhelm-Leuschner-Straße 79, 60329 Frankfurt am Main
www.projekt-broker.com

Registered at Amtsgericht Frankfurt am Main, HRB 93917; tax number:
Managing Partners: Sascha Riethmüller, Christian Weindl
Start: 12.2016
Duration: 12 months (extension possible)
From: Projekt Broker Consultant Services GmbH
Posted: 01.12.2016
Contact: Sascha Riethmüller
Project ID: 1248237
Contract type: Freelance