Description
Role: Data Engineer
Location: Munich, Germany (remote work during lockdown)
Start: immediately / at short notice
Duration: 6 months
Type of Contract: Freelance Contract
Essential Skill: Big Data (L3)
Additional Skills: Azure Data Lake Analytics (L3), Python for Data Science (L3)
Experience Required: 8 to 10 years
Job Details:
We are looking for a Software Engineer / Data Engineer who is as passionate about our vision as we are. This includes the following responsibilities:
• Develop new functionalities, especially for the core data capabilities provided by the Global Data Platform (GDP), which are:
o Real Time Ingestion Service
o Data Governance Solution
o Knowledge Graph
o Data Supermarket (UI for data consumers based on the knowledge graph to search and consume data sets available within Allianz)
• Understand and explain advantages and disadvantages of the proposed solutions to internal and external stakeholders
• Evaluate new technologies in the field of data engineering, processing and management
• Contribute to improving the Allianz Big Data software stack and help define the final production environment
• Look for opportunities to improve performance, reliability and automation
• Resolve incidents and change requests
• Support and interact with data consumers and data engineers within the Allianz Group
• Write technical documentation, announcements, blog posts, and best practices
Technical Skills
Our platform is a diverse product consisting of several technologies. This should be reflected in our cross-functional team of individuals who have a deep understanding of their technical specialty and are at the same time able to work outside their core area. We encourage you to contribute your individual strengths and personality to the team, and we give you room to develop new skills and experience. The following technical skills are required for this particular position:
Software Development
• Proficiency in at least one programming language (we have components written in Angular, Clojure, Python, Scala, Elm, etc.)
• Software engineering (design patterns, version control, automated testing, concurrent programming etc.)
• Continuous integration, deployment, and delivery
Data Engineering (using the tools to build data-driven solutions)
• Competence in running big data workloads in production at scale
• Data engineering patterns (ingestion, transformation, storage, consumption, etc.)
• Event-based systems (deep knowledge of the Confluent Kafka stack)
• Databases (e.g. PostgreSQL, Neo4J, Stardog)
• Cloud storage (Azure Data Lake or AWS S3 with EMR)
• Distributed systems (e.g. Spark)
• Knowledge Graph (e.g. Stardog)
• Data Governance Frameworks (e.g. Informatica)
• Data Virtualization Technologies (e.g. Denodo)
General IT Skills
• Advanced Experience with Linux
• Containerization / container orchestration (Docker & Kubernetes)
• General understanding of Infrastructure, Orchestration, Distributed Systems and IT Security Principles (Data Access Control)
Other Skills
• DevOps mindset (you build it, you run it; taking responsibility for your work)
• Willingness and ability to learn new technologies and tools
• Team player, open to working in an agile environment
• Fluent English (written and spoken) is a must, other languages (e.g. German, French, Italian, etc.) are a plus
• Customer orientation & communication skills