Senior Data Engineer

Stuttgart, Nordrhein-Westfalen, München, Köln, Hessen, Hamburg, Frankfurt am Main, Düsseldorf, Dortmund, Berlin, Bayern, Baden-Württemberg - Remote

Keywords

Information Engineering, Airflow, Amazon Web Services, Data Model, Data Quality, Snowflake, Amazon S3, Cloud Computing, Software Documentation, Data Hub, Data Integration, Data Transformation, Data Warehousing, Python, Real Estate, SQL, Datadog, Data Lineage

Description

Hi William,
I have a requirement for a freelance Data Engineer to join the data team of a market-leading international marketplace company.
You will join a cross-functional, agile data team working with a modern technology stack. The team is responsible for building the data hub for a leading real estate platform, with a focus on business data models, data performance, and data ingestion.
Are you currently available for new projects?

B2B / Freelance Contract
Role: Data Engineer
Engagement: Full-time freelance contract
Project Length: 6 months + extensions
Work Location: 100% remote
Start Date: November / December 2023
Project Language: English

Please reply with your CV, availability and day rate if you are interested in this position and we can schedule a call to discuss it further.
---

Role:
A Senior Data Engineer is required to join the data team of an online marketplace company that is currently undergoing a company-wide technology transformation. The company's primary focus is unifying two global platforms into a single tech stack and platform. The data team is a key contributor to these efforts, working with high-volume, heterogeneous data.

Tasks:
• Build upon the current data technology stack (AWS, Snowflake, Airflow)
• Design and develop data integration pipelines that collect, store, and transform data assets used in production (see the sketch after this list).
• Set up data ingestion frameworks that are used by other product teams.
• Maintain a data catalog of company data assets so that stakeholders who need access can find, understand, and rely on them.
• Work on data lineage, data transformation, data modelling, data storage, data access, and data quality.
• Maintain and operate the data infrastructure, including data quality assurance, documentation, cloud infrastructure orchestration, and operational efficiency.
• Build and enhance data products (transformations, dashboards, logical data warehouse models, etc.).
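
As a rough illustration of how this stack is typically wired together, the sketch below shows an Airflow DAG loading staged Amazon S3 data into Snowflake via COPY INTO. The DAG name, connection ID, stage, and table names are hypothetical placeholders, and it assumes the apache-airflow-providers-snowflake package and a pre-configured Snowflake connection.

from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="s3_to_snowflake_ingest",            # hypothetical DAG name
    start_date=datetime(2023, 11, 1),
    schedule="@daily",                          # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Load files already staged on S3 into a Snowflake table via COPY INTO.
    copy_listings = SnowflakeOperator(
        task_id="copy_listings",
        snowflake_conn_id="snowflake_default",  # assumes a configured Snowflake connection
        sql="""
            COPY INTO raw.listings               -- hypothetical target table
            FROM @raw.s3_listings_stage          -- hypothetical external stage backed by S3
            FILE_FORMAT = (TYPE = 'JSON')
            ON_ERROR = 'CONTINUE';
        """,
    )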

Key Skills
• Python
• SQL
• Snowflake
• Airflow / Airflow Astronomer
• AWS Services (SNS, Lambda, S3)
• DBT
• Datadog
• Strong data modelling skills
Start
Immediately
Workload
100% (5 days per week)
Duration
6 months
(extension possible)
From
Montash
Posted
14.11.2023
Contact:
William Paixao
Project ID:
2680611
Industry
IT
Contract type
Freelance
Work type
100% remote