MS Cloud Data Architect

Rheinland-Pfalz, Ludwigshafen am Rhein ‐ On-site
This project is archived and unfortunately no longer active.

Keywords

Azure Cloud Architect

Description

Required skills:


• More than 12 years of overall IT industry experience, including 2+ years of experience with cloud-based solutions.
• More than 5 years of experience with Big Data and Spark applications.
• Good understanding of Architectural Development Management practices and exposure to Enterprise Architecture frameworks (e.g. TOGAF).
• Well versed in different Cloud Architecture / Design / Integration Patterns, application migration methodology, and implementation approaches.
• Strong understanding of different multi-tenancy model implementation techniques using various Cloud Service Models (e.g. BPaaS, SaaS, PaaS, IaaS).
• Good knowledge of Data Lakes, staging areas, 3NF, data labs, and data marts, including star and snowflake schemas.
• Experience migrating and integrating different types of data into a data lake within an on-premise or cloud ecosystem.
• Experience with Big Data tools and technologies such as Spark, Kafka, Flume, Sqoop, Hive, HDFS, MapReduce, HBase, etc., and with the general MS Azure services used to build data platforms, such as Azure AD, Data Catalog, stream services, Data Factory, etc.
• Proficient in distributed computing principles and familiar with key architectures including Lambda and Kappa architectures, with broad experience across a set of data stores (e.g., HDFS, Azure Data Lake Store, Azure Blob Storage, Azure SQL Data Warehouse, Apache HBase, Azure DocumentDB), messaging systems (e.g., Apache Kafka, Azure Event Hubs, Azure IoT Hub), and data processing engines (e.g., Apache Hadoop, Apache Spark, Azure Data Lake Analytics, Apache Storm, Azure HDInsight, Azure Databricks).

Responsibility:

• The Azure Data Architect is responsible for helping to design, deploy, manage, and support the systems and infrastructure required for a data processing pipeline in support of a product's requirements.
• Primary responsibilities revolve around DevOps and include implementing ETL (extract, transform, and load) pipelines and monitoring/maintaining data pipeline performance.
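The ETL stages named above can be sketched as follows. This is a minimal, self-contained illustration only; the data source, field names, and cleaning rules are hypothetical, and a production pipeline in this role would typically run on Spark (e.g. Azure Databricks) with orchestration via Azure Data Factory.

```python
# Minimal ETL sketch: extract raw CSV, transform (clean/cast), load to a sink.
# All names and sample data here are illustrative assumptions.
import csv
import io

RAW_CSV = """sensor_id,reading
a1,21.5
a2,
a1,19.0
a3,bad
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV text from the (hypothetical) source."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop malformed rows and cast readings to float."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"sensor_id": row["sensor_id"],
                            "reading": float(row["reading"])})
        except ValueError:
            continue  # skip empty or non-numeric readings
    return cleaned

def load(rows: list[dict], sink: list) -> int:
    """Load: append cleaned rows to the target store (a list stands in here)."""
    sink.extend(rows)
    return len(rows)

if __name__ == "__main__":
    sink = []
    loaded = load(transform(extract(RAW_CSV)), sink)
    print(loaded)  # 2 valid rows survive cleaning
```

In a Spark-based pipeline the same three stages would map onto reading a DataFrame from storage, applying column transformations, and writing to the lake or warehouse.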
Start
2020
From
Adroit People Ltd
Posted
01.04.2020
Contact:
Roshini Agarwal
Project ID:
1916360
Contract type
Freelance