Description
We are looking for a Data Engineer to work on the applications and platforms of the customer interaction department in the CIO area. The Data Lake platform is used for high-performance collection, enrichment, linking, refinement and evaluation of data from numerous sources, improving data use across all areas. The platform provides the basis for big data analytics and data science, and thus new business opportunities, and is built on current standard technologies (AWS Serverless).

You should have experience in conceptual design and data flows with AWS and Apache Spark/Scala. Additionally, you should be able to describe, coordinate and document functional and non-functional requirements in the form of epics, features and user stories. This includes the creation of technical documents (e.g. use cases / user stories, product backlog, functional system descriptions, technical service descriptions, technical interface descriptions). Further tasks are to build and evaluate functional specifications, execute test cases and document requirements in Jira.
Project details:
• Starting period: 15.09.2022
• Ending period: 31.12.2022 (Possible extension)
• Full-time: (5 days/week)
• Location: Remote / on-site in Frankfurt (10%)
• Language: German
Experience:
• Experience with at least three ETL tools (e.g. Spark, NiFi, Jupyter, ...)
• Architecture and implementation knowledge in the AWS stack
• ECS, EMR, Athena, Glue, Redshift, Kinesis or MSK, Lambda, Python
• Architecture and implementation of relational databases, SQL, ETL and BI tools
• Programming with Apache Spark and Scala
• Agile environment (e.g. Scrum)
• Agile requirements engineering (e.g. user journeys, EPICs, features, user stories, user story mapping)
• Certification/knowledge (AWS Certified Solutions Architect, AWS Certified Developer)
• Problem-solving and work-organization skills, demonstrable through your last three projects
If you know someone in your network who might be interested, we would be happy to work together!