Python developer with Azure (m/f/d)

Düsseldorf - On-site

Beschreibung

We are currently looking for a Python developer (m/f/d) with an Azure background for our FMCG client.

Start: ASAP
End: 22.11.2021
Location: remote
Volume: 60 PD

Background:
The project covers the design and implementation of a new concept for data ingestion into Azure Data Lake Gen2. The concept must be in line with the existing concept for ingesting data into the client's data warehouse. In particular, curation of the data (naming conventions, key definitions) should follow the curation concept used in NEW BI.

The aim of the project is to deliver data in a consistent, well-documented way to various use cases within the client's Data Foundation.

Tasks (independently performed):
The delivery of the project has to be organized by the external consultant, including planning the work packages (backlog), organizing meetings, planning deployments, and testing and documenting the developed code.
- Analysing the current ingestion concept of the client's Data Warehouse (DWH) as a first step before designing a new concept for ingestion of data into the data lake. Data ingestion extracts data from the source where it was created or originally stored and loads it into a destination or staging area.
- Reviewing the results of the review of the client's Data Foundation (the data lake and data warehouse implementation), available as a Word document and PowerPoint presentation, to be able to design new concepts.
- Independently organizing workshops with internal employees to discuss the results of the review is part of the project. Workshops can be conducted remotely or on-site (taking the applicable infection rules into account).
- The current ingestion concepts are described in the wiki documentation of the client's NEW BI project. The project-related concept has to be independently reviewed and refined if necessary. Access to internal DevOps (including wiki pages), Git repositories and development environments is granted internally in advance.
- Creating a concept for data ingestion and curation of data into the client's Data Lake based on Azure Data Lake technology, taking modern DataOps concepts into account.
- The proposed design and changes to the design need to be aligned with the client's Data Engineering CoE.
- Independently integrate the newly developed ingestion concept into the existing data landscape of the Data Foundation, specifically considering the ingestion concept of the client's DWH.
- Implementation of the newly developed concept for ingestion and curation of data leveraging Azure Synapse Analytics (Spark); a minimal ingestion sketch follows this list. Ingestion has aspects of both development and operations:
  - The SCRUM team does not take over any operational tasks. The developed code and pipelines are handed over to the maintenance team. Corresponding hand-over sessions are organized as part of the sprint review.
- Independently perform unit testing of CI/CD (Continuous Integration / Continuous Delivery) pipelines during implementation (see the test sketch after this list).
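
For illustration, below is a minimal sketch of the kind of ingestion and curation code this task could involve, assuming a Synapse Spark pool with Delta Lake available and an ADLS Gen2 account linked to the workspace. All storage account, container, column and source-system names are illustrative placeholders, not the client's actual setup.

# Minimal ingestion sketch (assumption: Synapse Spark pool with Delta Lake
# and a linked ADLS Gen2 account; all names below are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical raw (landing) and curated zones in ADLS Gen2.
RAW_PATH = "abfss://raw@examplestorage.dfs.core.windows.net/sales/2021/"
CURATED_PATH = "abfss://curated@examplestorage.dfs.core.windows.net/sales/"

# Extract: read raw JSON files from the landing zone.
raw_df = spark.read.json(RAW_PATH)

# Curate: enforce a snake_case naming convention and add audit metadata,
# keeping curation consistent with the DWH concept.
curated_df = (
    raw_df.toDF(*[c.strip().lower().replace(" ", "_") for c in raw_df.columns])
          .withColumn("ingestion_ts", F.current_timestamp())
          .withColumn("source_system", F.lit("sales_api"))
)

# Load: write to the curated zone as a Delta table, partitioned by load date.
(
    curated_df
    .withColumn("load_date", F.to_date("ingestion_ts"))
    .write.format("delta")
    .mode("append")
    .partitionBy("load_date")
    .save(CURATED_PATH)
)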
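
In the same spirit, the following is a hedged sketch of a unit test that a CI/CD pipeline (for example an Azure DevOps build agent) could run with pytest; the curation helper normalize_columns and the sample data are illustrative assumptions, not part of the client's code base.

# Unit-test sketch (assumption: pytest and PySpark available on the build agent).
import pytest
from pyspark.sql import SparkSession


def normalize_columns(df):
    """Illustrative curation helper: rename columns to snake_case."""
    return df.toDF(*[c.strip().lower().replace(" ", "_") for c in df.columns])


@pytest.fixture(scope="session")
def spark():
    # Local Spark session so the test runs without a cluster.
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()


def test_normalize_columns(spark):
    df = spark.createDataFrame([(1, "Alice")], ["Customer ID", "Customer Name"])
    result = normalize_columns(df)
    assert result.columns == ["customer_id", "customer_name"]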

Skills:
- Senior Python developer (PySpark, pandas and Delta Lake)
- SQL
- Azure DevOps - especially experience with CI/CD pipelines
- Azure Synapse Analytics (Spark cluster)
- Common document models (JSON, XML)
Optionally:
- .NET Framework (especially C#)

Michael Bailey International is acting as an Employment Business in relation to this vacancy.
Start
09/2021
Duration
60 days
From
Michael Bailey Associates
Posted
25.08.2021
Project ID:
2190963
Contract type
Freelance