Description
We are currently searching for a Data Intelligence Engineer to support the applications of the customer interaction department and the platforms of the CIO area. The Data Lake platform is used for the high-performance collection, enrichment, linking, refinement and evaluation of data from numerous sources, improving data use across all areas. The platform provides the basis for big data analyses and data science, and therefore for new business opportunities.

Experience in conceptual design and data flows with AWS and Apache Spark/Scala is a must. Additionally, you should be able to describe, coordinate and document functional and non-functional requirements in the form of epics, features and user stories. This also includes the preparation of technical documents (e.g. use cases / user stories, product backlog, functional system descriptions, technical service descriptions, technical interface descriptions). Further tasks are building and evaluating functional specifications, executing test cases and documenting requirements in Jira.
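To give a feel for the hands-on side of the role, here is a minimal sketch of the kind of Spark/Scala enrichment job such a data lake typically runs. All paths, column names and the object name are hypothetical and purely illustrative; they are not taken from the actual platform.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object CustomerEnrichmentJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("customer-interaction-enrichment")
      .getOrCreate()

    // Collect: raw interaction events landed in the lake (path is illustrative).
    val interactions = spark.read.parquet("s3://example-lake/raw/interactions/")

    // Enrich/link: join against a reference dataset to add customer attributes.
    val customers = spark.read.parquet("s3://example-lake/refined/customers/")
    val enriched = interactions
      .join(customers, Seq("customer_id"), "left")
      .withColumn("processed_at", current_timestamp())

    // Refine/evaluate: aggregate for downstream BI and data-science use.
    val dailyCounts = enriched
      .groupBy(col("channel"), to_date(col("event_time")).as("event_date"))
      .count()

    dailyCounts.write
      .mode("overwrite")
      .parquet("s3://example-lake/curated/daily_interactions/")

    spark.stop()
  }
}
```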
Project details:
• Starting period: 01.12.2022
• Ending period: 31.03.2023 (possible extension)
• Full-time (5 days/week)
• Location: Remote / Frankfurt (10% on-site)
• Language: German
Experience:
• Experience with at least three ETL tools (e.g. Spark, NiFi, Jupyter, ...)
• Architecture and implementation knowledge in the AWS stack (e.g. ECS, EMR, Athena, Glue, Redshift, Kinesis or MSK, Lambda, Python)
• Architecture and implementation of relational databases, SQL, ETL and BI tools
• Programming with Apache Spark and Scala
• Agile environment (e.g. Scrum)
• Agile requirements engineering (e.g. user journeys, epics, features, user stories, user story mapping)
• Certification or equivalent knowledge (AWS Certified Solutions Architect, AWS Certified Developer)
• Problem-solving and work-organization skills, demonstrable on the basis of your last three projects
If you are interested, or know someone in your network who might be, we would be happy to hear from you!