Description
Data Engineer
6-9 months
Remote - Germany - Onsite once a quarter
English-speaking
Strong expertise in data warehousing, Snowflake, and building scalable data pipelines, particularly integrating data from SAP systems and other sources using Kafka. Experience with Azure cloud, Python, and working in agile environments is essential.
Key Requirements:
Proven experience in data engineering and data warehousing
Strong knowledge of Snowflake and Kafka-based pipelines
Familiarity with Azure cloud platform
Proactive in documenting and communicating technical work
Responsibilities:
Design, develop, and optimize scalable data pipelines to process large datasets within a Microsoft Azure environment
Architect and manage databases and data warehouses using SQL
Integrate and model data to support advanced analytics and business intelligence
Ensure high standards of data quality, consistency, and security
Analyze and document source systems, including data flow and system landscape mapping
Collaborate with Data Product Owners, fellow engineers, and business units to create efficient, value-driven data solutions
Participate in regular on-site team meetings and workshops
Your Profile
4–6 years of experience in data architecture and engineering
Deep proficiency in SQL and hands-on experience with database management systems such as Snowflake and PostgreSQL
Proven track record of relevant project work, clearly documented with timelines and scope
Solid experience with Big Data technologies (e.g., DBT, Azure Data Factory, Databricks, Kafka)
Strong knowledge of cloud platforms, preferably Microsoft Azure
Familiarity with agile methodologies and cross-functional team collaboration