Big Data Architect

DE – On-site
This project is archived and unfortunately no longer active.

Description

Core skills (mandatory)
Data Architecture
Solution architecture with a focus on Big Data architecture: requirements capture, conceptual and contextual architectures, technology selection, detailed design, planning and costing
Apache Hadoop, with knowledge of multiple distributions (Cloudera, Hortonworks, HDInsight etc.) and the associated Apache Big Data products (Hive, Impala, Oozie etc.)
Data ingestion design, including batch and real-time architectures, using tools like Kafka, Storm, Kinesis or equivalents
Data governance and metadata management using tools like Apache Atlas.
Data transformation technologies including, but not limited to, Spark, Python and NiFi
Data deployment experience on cloud native and hybrid cloud solutions
Microservice/SOA/stateless approaches to data ingestion & consumption
Expertise and experience in producing solution and information architectures using some or all of the technologies above
Traditional ETL tools such as DataStage, Informatica, Ab Initio etc.
Information glossary tooling, e.g. IIGC, Informatica Enterprise Data Governance or Collibra

Data Management
Experience of data modelling and optimisation for row- and columnar-based environments, using tools such as InfoSphere Data Architect, Erwin etc.
Data governance approaches and technologies covering business glossaries, metadata management and data lineage
Security governance and access management at infrastructure, server and application level, including role- and attribute-based access
Consulting with regard to recent data-related regulation, including the Data Protection Act (DPA) and the General Data Protection Regulation (GDPR)

General
Cross-sector consulting and delivery using the above technologies and capabilities
Experience in delivering solutions and capabilities using the above in both agile and waterfall delivery methodologies
Experience in delivering solutions and capabilities using the above with a distributed team across multiple geographies
Ability to translate requirements/problem statements into a Big Data and/or analytics solution using the above technologies and capabilities

Start
immediately
Duration
12 months
From
Trilogy International
Posted
17.05.2022
Project ID:
2387689
Contract type
Freelance