
Beng Wi Lo

Available

Last updated: 24.04.2024

Senior Data Engineer

Degree: Diplom-Ingenieur (graduate engineer)
Hourly/daily rate: on request
Language skills: German (business fluent) | English (business fluent)

Keywords

Data Warehousing, Microsoft Azure, Python, Microsoft SQL Server, Java, Amazon Web Services, Analytical Thinking, Apple iOS, Ubuntu, Cloud Computing (+25 more keywords)

Attachments

CIIA-Diploma_050323.pdf
CEFA-Diploma_050323.pdf
SnowflakeProCore-Certification_050323.pdf
BengWiLo-CV_240424.pdf

Skills

Data Platforms:
  • Snowflake Data Cloud
  • Databricks Data Lakehouse
  • PostgreSQL
  • Azure SQL Server
  • DB2
  • Oracle
ETL / ELT Tools:
  • Azure Databricks
  • Azure Data Factory
  • Snowpark for Python
  • Apache Spark
  • Apache Kafka
  • DBT
  • Talend Data Management
  • Informatica Data Integration
Data Warehouse Modeling:
  • Dimensional Modeling
  • Data Vault Modeling
Cloud Technologies:
  • Microsoft Azure
  • Amazon Web Services
  • Google Cloud
Programming Skills:
  • Python
  • SQL
  • Java
  • Scala
Advanced Analytics:
  • Scikit-learn
  • Spark MLlib
Operating Systems:
  • Red Hat Linux
  • Ubuntu Linux
  • Windows
  • macOS

Project History

04/2023 - present
Senior Data Engineer
ISTA SE (energy, water and environment; 5,000-10,000 employees)

Responsibilities:
  • Build a new data platform on Snowflake Data Cloud (Azure Cloud Platform)
  • Data integration with Azure Databricks, Azure Data Factory and Snowpark for Python
  • Model data with the Data Vault 2.0 methodology and implement the model with Data Build Tool (dbt)
  • Develop Proof of Concept for generative AI use case in Snowflake
  • Build CI/CD data pipelines on GitLab for fully automated testing and deployment
  • Provision and manage infrastructure in Azure Cloud and Snowflake with Terraform

01/2022 - 12/2023
Senior Data Engineer
DB Systel GmbH (transport and logistics; 5,000-10,000 employees)

Responsibilities:
  • Build ETL pipelines using Python/pandas
  • Use GitLab as CI/CD tool for fully automated testing and deployment
  • Design and manage containerized applications with Docker
  • Develop and deploy customer-facing data-driven products and services
  • Solve various data integration challenges
  • Deliver fast reporting and analytics solutions

11/2021 - 12/2021
Snowflake Architect
Ratepay GmbH (banking and financial services; 250-500 employees)

Responsibilities:
  • Design and implement a cloud data platform on Snowflake (AWS):
    • Migration of an on-premises SQL Server database to Snowflake on AWS
    • Integration of Okta as identity provider and Tableau as BI tool
  • Design and implement layered security in Snowflake:
    • Single sign-on and SCIM integration with Okta as identity provider
    • Network security with AWS PrivateLink
    • Design of role-based access control (RBAC)
    • Design of security monitoring scripts

05/2021 - 10/2021
Senior Solutions Architect
SNOWFLAKE Computing GmbH (internet and information technology; 1,000-5,000 employees)

Responsibilities:
  • Guide customers through the process of migrating to Snowflake and develop methodologies to improve the migration process
  • Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are properly enabled and able to extend the capabilities of Snowflake on their own
  • Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
  • Maintain a deep understanding of competitive and complementary technologies and vendors, and how to position Snowflake in relation to them
  • Work with system integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
  • Provide guidance on how to resolve customer-specific technical challenges
  • Support other members of the Professional Services team in developing their expertise
  • Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake's products and marketing

01/2019 - 04/2021
Senior Data Engineer
ARAG Versicherung (insurance; 1,000-5,000 employees)

Responsibilities:

Design and build a data lake on the Microsoft Azure Cloud:
  • Build ETL pipelines with Azure Databricks, Azure Data Factory, Azure Functions and Talend Data Management
  • Data integration from a variety of data sources (e.g. relational data from databases such as IBM DB2, MS SQL Server and SAP BW, and semi-structured data from web services such as weather data and Check24 data)
  • Deployment of Azure Database for PostgreSQL for relational data within the data lake
  • Deployment of Azure Data Lake Storage Gen2 for unstructured / semi-structured data within the data lake

12/2017 - 12/2018
Data Specialist
AXA Versicherung (insurance; >10,000 employees)

Responsibilities:

Build a scalable big data pipeline moving streaming and batch data to a data lake on the AWS Cloud:
  • Data integration from a variety of data sources (IBM Mainframe, Oracle, MS SQL Server and SAP BW) with the Informatica Data Integration tool and AWS Glue
  • Develop a Python application for the automatic generation of Informatica mappings, sessions and workflows
  • Deployment of the Confluent Kafka streaming platform, one of the main building blocks of the data ingestion process
  • Deployment of Snowflake Data Cloud on the AWS Cloud Platform for structured and semi-structured data within the data lake
  • Deployment of AWS S3 storage for unstructured and semi-structured data within the data lake

03/2016 - 11/2017
Big Data Engineer
FinTech Company (banking and financial services; 5,000-10,000 employees)

Responsibilities:

Build a big data platform on the Azure HDInsight platform:
  • Extract, transform and load data from a variety of sources (Oracle, MS SQL Server, SAP BW) to Azure Blob Storage
  • Use the HDInsight tools Spark and Hive to transform data from Azure Blob Storage and store it in Hadoop-based storage (HDFS)
  • Support the data science team in building a Hadoop-based analytical platform leveraging analytics tools such as Spark MLlib and Apache Mahout

Support a Proof of Concept designing a machine-learning-based predictive model for credit card fraud detection on the Azure HDInsight platform:
  • Design a predictive model using HDInsight tools such as HDFS, HBase, Kafka and Spark
  • Build a data pipeline that extracts master data from the HBase NoSQL database through the Kafka streaming platform to Spark for batch processing
  • Build a data pipeline for the transaction data, which is streamed through the Kafka streaming platform to Spark for real-time processing
  • Use the Spark MLlib machine learning library to build a prediction model for credit card fraud detection

06/2008 - 02/2016
Technical Sales Manager
ZTE (telecommunications; >10,000 employees)

Responsibilities:

• Provide technical support to customers (Vodafone Group, Deutsche Telekom, Telefónica O2) for deploying and testing ZTE mobile and data products
• Organize Proofs of Concept with customers to demonstrate ZTE product capabilities and to specify their potential use cases
• Team management in ZTE mobile and data product development
• Plan and organize pre-sales activities and meetings with suppliers (Qualcomm, Intel, Microsoft and Google)

Certificates

SnowPro Core Certification
2021

Willingness to travel

Available in the city of Düsseldorf and within a radius of 100 km