
Malhar Parve

Informatica, Informatica PowerCenter, Talend, Pentaho, ETL, SQL, PLSQL, DWH, Linux, Big Data

available
  • Freelancer in 60489 Frankfurt am Main
  • Degree: Bachelor of Engineering in Computer Science
  • Hourly/daily rate:
  • Languages: German (basic knowledge) | English (native speaker)
  • Last update: 09.09.2020
SKILLS
ETL Tools
Informatica PowerCenter 10.4, Talend Data Integration 7.2, Pentaho Data Integration (Pentaho Kettle) 8.3
Databases
Oracle 19c, Oracle 12c, Oracle Exadata 12c, Microsoft SQL Server 2016, IBM DB2, Hadoop HDFS, PostgreSQL 10.7
Cloud-Based Data Warehouse
Snowflake
Big Data Ecosystem
Hadoop 2.0, Spark, Scala, Sqoop, Flume
Cloud Technologies
Amazon Web Services (AWS), AWS S3, AWS RDS (PostgreSQL)
Modeling
3-NF, Dimensional Modeling (Star & Snowflake), Data Vault (Raw Vault & Business Vault)
Modeling Tools
PowerDesigner
Software Development Methods
Agile, SCRUM, Waterfall
Programming Languages
Python 3.7.6, Core Java, Scala, SQL, T-SQL, PL/SQL, UNIX/Bash shell scripting
Scheduling Tools
BMC Control-M 9.0.19, Automic UC4, Informatica Scheduler, crontab
Version Control
Subversion, GitHub
Operating Systems
Windows 7/10, Linux, Solaris
Other Tools
Informatica Mapping Architect for Visio, Atlassian Jira, Atlassian Confluence, GitHub, Hue, Eclipse, Toad, SQL Developer, FTP, SFTP, WinSCP, FileZilla, PuTTY, HP Quality Center, Aginity Workbench for IBM Netezza, Citrix, Postman (REST API testing), pgAdmin 4, SnowSQL

Certifications

4/2014 Informatica PowerCenter 9.x Developer
5/2015 TOGAF 9.1 (Enterprise Architecture Framework) Certified Professional
7/2017 International Knowledge Measurement - Informatica PowerCenter
PROJECT HISTORY
  • 01/2020 - present

    • Lipsia Digital GmbH
    • 10-50 employees
    • Internet and information technology
  • ETL Development and Support
  • Contract Type: Freelance
    Role: ETL Developer
    Project: Procurify Integration with DATEV

    Read all bill details, including purchase orders, approvals, and attachments, from Procurify, a cloud-based procurement management system, and send them to Flowwer2, the target system of the Procurify DATEV Connector. Flowwer2 is a DATEV-approved tool that can be connected to a specific DATEV client and can send structured data as well as attachments to DATEV.

    Flowwer2 will be used to receive and send invoice data and related attachments (invoice.pdf, po.pdf, shipping slip.pdf and approval log.pdf) to DATEV.
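
    To make the data flow concrete, here is a minimal Python sketch of such a transfer, assuming hypothetical base URLs, endpoint paths, and bearer-token authentication on both sides; the actual Procurify and Flowwer2 APIs may differ.

        import requests

        PROCURIFY = "https://example.procurify.com/api/v3"  # hypothetical base URL
        FLOWWER2 = "https://example.flowwer2.de/api"        # hypothetical base URL

        def transfer_bills(procurify_token, flowwer_token):
            src = {"Authorization": f"Bearer {procurify_token}"}
            dst = {"Authorization": f"Bearer {flowwer_token}"}

            # Pull bills together with purchase orders, approvals and attachments.
            bills = requests.get(f"{PROCURIFY}/bills", headers=src, timeout=30).json()

            for bill in bills:
                # Forward the structured invoice data first ...
                resp = requests.post(f"{FLOWWER2}/invoices", json=bill,
                                     headers=dst, timeout=30)
                resp.raise_for_status()
                invoice_id = resp.json()["id"]

                # ... then each attachment (invoice.pdf, po.pdf, shipping slip,
                # approval log) as a PDF upload.
                for att in bill.get("attachments", []):
                    pdf = requests.get(att["url"], headers=src, timeout=30).content
                    requests.post(
                        f"{FLOWWER2}/invoices/{invoice_id}/attachments",
                        files={"file": (att["name"], pdf, "application/pdf")},
                        headers=dst,
                        timeout=30,
                    ).raise_for_status()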


  • 03/2019 - 12/2019

    • Deutsche Boerse
    • >10,000 employees
    • Banking and financial services
  • ETL Development and Support
  • Deutsche Boerse, Frankfurt am Main through Marlin Green Ltd

    Job Type: Freelancer

    Role: Senior Data Engineer

    Project: Regulatory Reporting Hub (RRH)

    MiFIR/EMIR transaction regulatory reporting to NCAs, e.g. BaFin, AMF, etc.

    Project Technology Stack

    Source System: XML Files, Flat Files, Oracle

    Target System: Oracle, XML, CSV

    ETL Tool: Informatica PowerCenter 10.2

    Other programming languages: Oracle SQL & PLSQL, Unix Shell Scripting

    Scheduling Tool: Control-M

     

    Responsibilities

    - Design, Develop and Maintain Informatica ETL/Data pipelines

    - Performance tuning of ETL pipelines for faster loading in various environments

    - Bug Fixing, Deployment, Production Support, Data Analysis

    - Read data from XML and flat files into the staging and core layers, and onward to the delivery area in the Oracle database.

    - Perform various cleansing and data completeness checks

    - Enrich data from various reference/lookup tables and load into core layer

    - Used various transformations such as XML Source Qualifier, XML Parser, XML Generator, Transaction Control, Normalizer, Lookup, Update Strategy, etc.

    - Performance optimization of Informatica mappings and sessions for faster loads

    - Developed SCD Type 1 and Type 2 mappings to load historical data into the data mart (a sketch of the Type 2 logic follows this list).
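
    For illustration, a minimal Python sketch of the merge logic an SCD Type 2 mapping implements (expire the current version, insert the new one); the row layout and column names are invented for the example.

        from datetime import date

        HIGH_DATE = date(9999, 12, 31)  # open-ended validity marker

        def scd2_merge(current_rows, incoming_rows, load_date):
            """current_rows / incoming_rows map a business key to its attributes."""
            expired, inserted = [], []
            for key, new_attrs in incoming_rows.items():
                old_attrs = current_rows.get(key)
                if old_attrs == new_attrs:
                    continue  # no change: keep the current version
                if old_attrs is not None:
                    # Close the currently valid version of the record.
                    expired.append({"key": key, "valid_to": load_date})
                # Open a new version, valid until the high date.
                inserted.append(dict(new_attrs, key=key,
                                     valid_from=load_date, valid_to=HIGH_DATE))
            return expired, inserted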


  • 10/2017 - 06/2019

    • ATOS AG / JOB AG Source One GmbH
    • >10,000 employees
    • Banking and financial services
  • ETL Development and Support
  • ETL Developer
    Commerzbank
    Commerzbank, Frankfurt am Main through ATOS AG / JOB AG Source One GmbH
    Contract Type: Freelance
    Role: ETL Developer
    Project
    Compliance (CMC & CAF) - AML Reporting - Frankfurt & Singapore
    This was a data integration project that involved providing data from various banking applications, such as Murex Cash, Murex Equity, and Murex Currency, for compliance reporting.
     

    Project Technology Stack
    Source System: Flat Files, MS SQL Server
    Target System: Oracle, Flat Files, Hadoop HDFS
    ETL Tool: Informatica PowerCenter 10.1, Informatica BDM
    Other programming languages: Oracle SQL & PLSQL, Python Scripting, Unix Shell Scripting, UC4 Scripting
    Scheduling Tool: Automic UC4

    Responsibilities:

    - Design ETL Pipelines and ETL Architecture.
    - Design Informatica ETL jobs as per the quality and software development standards.
    - Source to target data mapping analysis and design.
    - Analyse, design, develop, test and document Informatica ETL programs from detailed and high-level specifications, and assist in troubleshooting.
    - Creation of project-related documents like HLD, LLD, etc.
    - Created reusable transformations and mapplets
    - Developed data ETL pipelines for Change Data Capture (CDC)
    - Creation of data pipelines to load into Hadoop HDFS.
    - Complex Informatica PowerCenter ETL development and Quality Assurance.
    - Design and develop various slowly changing dimension loads, e.g. Type 1 and Type 2.
    - Responsible for finding various bottlenecks and performance tuning at various levels like mapping level, session level, and database level.
    - Extensive use of various active and passive transformations such as Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank, and Aggregator.
    - Debugging and troubleshooting Sessions using the Informatica Debugger and Workflow Monitor.
    - Implement various loads like Daily Loads, Weekly Loads, and Quarterly Loads.
    - Conduct Unit tests, Integration tests, performance tests, etc.
    - Contact point for problems in the Production environment and Defects Tracking with business. (3rd-Level-Support)
    - Supported the deployment team with deployments to various environments
    - Developed database objects including tables, indexes, views, sequences, packages, triggers, and procedures to troubleshoot database problems

    - Created UC4 jobs and scripts to execute Informatica Workflows, Shell Scripts, PLSQL Procedures, etc.

    - Created Python scripts for file processing and error checks (a sketch follows this list).

    - Developed and bug-fixed Unix Bash shell scripts for flat-file checks and error handling.
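
    As an illustration, a minimal Python sketch of such a flat-file completeness check; the semicolon delimiter and the trailer-record convention are assumptions for the example.

        import csv
        import sys
        from pathlib import Path

        def check_flat_file(path, expected_cols):
            """Return a list of error messages; an empty list means the file passed."""
            errors = []
            with path.open(newline="", encoding="utf-8") as fh:
                rows = list(csv.reader(fh, delimiter=";"))
            if not rows:
                return [f"{path.name}: file is empty"]
            # Assume the last line is a trailer carrying the expected record count.
            *data, trailer = rows
            if trailer and trailer[0].isdigit() and int(trailer[0]) != len(data):
                errors.append(f"{path.name}: trailer count {trailer[0]} != "
                              f"{len(data)} data records")
            for lineno, row in enumerate(data, start=1):
                if len(row) != expected_cols:
                    errors.append(f"{path.name}:{lineno}: {len(row)} fields, "
                                  f"expected {expected_cols}")
            return errors

        if __name__ == "__main__":
            problems = check_flat_file(Path(sys.argv[1]), int(sys.argv[2]))
            print("\n".join(problems))
            sys.exit(1 if problems else 0)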

  • 06/2015 - 09/2017

    • Capgemini GmbH / Templeton & Partners Ltd
    • >10,000 employees
    • Consumer goods and retail
  • ETL Development and Support
  • ETL Tech Lead
    Aldi Sued
    Aldi Sued, Mülheim an der Ruhr through Capgemini GmbH / Templeton & Partners Ltd
    Contract Type: Freelance
    Role: ETL Tech Lead
    Project: Retail Enterprise Data Warehouse
     

    Project Technology Stack
    Source System: MS SQL Server, Flat Files, Oracle
    Target System: Oracle Exadata
    ETL Tool: Informatica PowerCenter 10.1
    Other programming languages: Oracle SQL & PLSQL, Unix Shell Scripting
    Scheduling Tool: Informatica Scheduler
    Project Methodology: Scrum/Agile

    Responsibilities
    - Participate in scoping, source system data analysis, target system requirements, volume analysis and migration window determination.

    - Design & Develop Informatica PowerCenter ETL SCD Type-1 and Type-2 mappings to load dimensions in Sales Data Warehouse.

    - Developed Informatica mappings to load fact tables using various transformations such as Sorter, Aggregator, Joiner, Lookup, Update Strategy, Sequence Generator, etc.
    - Perform data cleansing tasks using expression transformation, etc.
    - Contact point for problems in the Production environment and Defects Tracking with business. (3rd-Level-Support)
    - Developed Informatica PowerCenter mappings to move data from stage to Core and Data Mart Layer.

    - Implement various loads like Daily Loads, Weekly Loads and Monthly Loads.
    - Developed PL/SQL packages, procedures, and functions in accordance with business requirements, e.g. loading the time dimension (a sketch follows this list).
    - Documented various POCs and ETL solutions in Confluence.
    - Debugging and troubleshooting Sessions using the Informatica Debugger and Workflow Monitor.
    - Responsible for finding various bottlenecks and performance tuning at various stages such as source, mapping, transformation, and session.
    - Created Materialized Views and partitioning tables for performance reasons.
    - Worked on various back end Procedures and Functions using PL/SQL.
    - Designed tables, constraints, views, and indexes.
    - Developed database objects including tables, indexes, views, sequences, packages, triggers, and procedures.

    - Participated in daily Scrum meetings, estimated subtasks of user stories, sprint analysis, etc.
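
    The time-dimension load was done in PL/SQL on the project; purely for illustration, the same row generation in Python, with an invented column set:

        from datetime import date, timedelta

        def time_dimension(start, end):
            """Yield one row per calendar day from start to end inclusive."""
            day = start
            while day <= end:
                yield {
                    "date_key": int(day.strftime("%Y%m%d")),  # surrogate key, e.g. 20200131
                    "cal_date": day,
                    "year": day.year,
                    "quarter": (day.month - 1) // 3 + 1,
                    "month": day.month,
                    "iso_week": day.isocalendar()[1],
                    "weekday": day.isoweekday(),              # 1 = Monday ... 7 = Sunday
                    "is_weekend": day.isoweekday() >= 6,
                }
                day += timedelta(days=1)

        # Example: rows = list(time_dimension(date(2020, 1, 1), date(2020, 12, 31)))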


  • 01/2015 - 05/2015

    • Informationsfabrik GmbH
    • >10,000 employees
    • Consumer goods and retail
  • ETL Development and Support
  • Senior ETL Consultant
    HRS
    HRS, Köln through Informationsfabrik GmbH

    Contract Type: Freelance
    Role: Senior ETL Consultant
    Project: Hotel Enterprise Data Warehouse
     

    Project Technology Stack
    Source System: MS SQL Server, Flat Files, Oracle, XML
    Target System: Sybase IQ
    ETL Tool: Informatica PowerCenter
    Other programming languages: Oracle SQL & PLSQL, T-SQL, Unix Shell Scripting
    Scheduling Tool: Control-M
    Project Methodology: Waterfall
    Data Modeling: Dan Linstedt Data Vault Modeling

    Responsibilities
    - Used Data Vault as the data modeling approach for the Hotel Enterprise Data Warehouse.
    - Define the ETL Architecture to load the Data Vault model and data mart.
    - Analyzed source data to identify candidates for Hub, Link, and Satellite tables
    - Developed Informatica PowerCenter mappings, sessions, and workflows to load Hub, Link & Satellite tables (a hub-load sketch follows this list).
    - Added Hub, Link & Satellite tables, including business keys, surrogate keys, and descriptive satellite attributes, to the Data Vault model.
    - Implement various loads like Daily Loads, Weekly Loads.
    - Perform various data cleansing tasks.
    - Performed tests using sample test data in accordance with the client's data migration/integration needs.
    - Contact point for problems in the Production environment and Defects Tracking with business.
    - Developed Informatica PowerCenter mappings to move data from stage to core and data mart layer.
    - Documented various input databases and data sources.
    - Debugging and troubleshooting Sessions using the Informatica Debugger and Workflow Monitor.
    - Developed UNIX shell scripts to perform various user requirements.
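
    For illustration, a minimal Python sketch of the hub-load pattern: hash the business key and insert only keys the hub has not seen yet. Hashing an upper-cased, delimiter-joined business key is common Data Vault practice; the hotel_id key and column names are invented for the example.

        import hashlib
        from datetime import datetime

        def hash_key(*business_key_parts):
            """MD5 over the normalized business key, a common Data Vault convention."""
            normalized = "||".join(str(p).strip().upper() for p in business_key_parts)
            return hashlib.md5(normalized.encode("utf-8")).hexdigest()

        def load_hub(existing_keys, source_rows, record_source):
            """Return new hub rows for business keys not yet present in the hub."""
            new_rows = []
            for row in source_rows:
                hk = hash_key(row["hotel_id"])  # illustrative business key
                if hk not in existing_keys:
                    existing_keys.add(hk)
                    new_rows.append({
                        "hub_hotel_hk": hk,
                        "hotel_id": row["hotel_id"],
                        "load_dts": datetime.utcnow(),
                        "record_source": record_source,
                    })
            return new_rows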

  • 01/2014 - 09/2014

    • IBM Deutschland / Questax AG
    • >10,000 employees
    • Consumer goods and retail
  • ETL Development and Support
  • Senior ETL Consultant
    KARSTADT
    Karstadt, Essen through IBM Deutschland / Questax AG

    Contract Type: Freelance
    Role: Senior ETL Consultant

    Project:
    Karstadt information systems for measures and analytics (KARISMA)

    The goal of this project was to create a centralized analytical and reporting system for Karstadt Warehouse GmbH. The major part of the project was to replace the existing SAP BW reporting system and create a new enterprise data warehouse, with Informatica PowerCenter 9.5.1 for ETL and Cognos 10 for reporting. Informatica PowerExchange 9.5.1 with BCI (Business Content Integration) and data integration using ABAP methods were used to connect to the Karstadt SAP Retail system and read data from SAP standard and customized DataSources. IBM Netezza 7 was used as the target system, accessed via Informatica PowerExchange for Netezza.
     

    Project Technology Stack
    Source System: SAP, IDOC, Flat Files, XML
    Target System: IBM Netezza
    ETL Tool: Informatica PowerCenter 9.5, Informatica PowerExchange 9.5
    Other programming languages: Oracle SQL & PLSQL, Unix Shell Scripting
    Scheduling Tool: Informatica Scheduler
    Project Methodology: Waterfall

  • 07/2011 - 12/2013

    • CSC Deutschland GmbH / Datamatics Global Solutions GmbH & Hays AG
    • >10,000 employees
    • Banking and financial services
  • ETL Development and Support
  • Senior ETL Consultant
    Deutsche Bank
    Deutsche Bank, Frankfurt through CSC Deutschland GmbH / Datamatics Global Solutions GmbH & Hays AG
    Job Type: Employee & Contract Type: Freelance

    Role: Senior ETL Consultant
     

    Responsibilities
    Informatica PowerCenter ETL tool development and support for data migration and data integration projects.

    Projects:
    #1 Retail Banking - Postbank Savings Deposit Accounts Migration

    This project involved the migration of savings account deposits data from mainframe IBM z/OS systems to the SAP Deposit Management application using PowerCenter 9.1 HF1 and PowerExchange 9.1. It involved reading data from flat files, mainframe data sets, and Oracle, and writing data into flat files which were then uploaded into the SAP Deposit Management application. PowerExchange 9.1 was used for connecting to the mainframe and reading mainframe data sets. Informatica PowerCenter 9.1 was used for the extraction, transformation, and loading of data into the target systems. The project had only a single load, i.e. a one-time migration, and involved the extraction, transformation, and loading of 250 to 500 million records.

    #2 Retail Banking - Postbank Savings Deposit Accounts Integration
    This project involved the integration of savings deposit account data from SAP and mainframe systems into the Oracle enterprise data warehouse. ETL activities included loading this data into the Oracle enterprise data warehouse used for retail banking reporting at Deutsche Bank, Germany. Informatica PowerCenter 9.1 HF1 was used for the extraction, transformation, and loading of data into the target systems. The project had several loads, such as daily, weekly, monthly, quarterly, and YTD loads, implemented using incremental loading (Change Data Capture) and slowly changing dimension mappings; a sketch of such an incremental extract follows. The project involved the extraction, transformation, and loading of 30 to 50 million records.
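
    For illustration, a minimal Python sketch of a watermark-style incremental (CDC) extract, using cx_Oracle-style bind variables; the table and column names are invented for the example.

        def incremental_extract(connection, last_watermark):
            """Fetch only rows changed since the previous run and advance the watermark."""
            cursor = connection.cursor()
            cursor.execute(
                """
                SELECT account_id, balance, change_ts
                  FROM savings_deposits
                 WHERE change_ts > :wm
                 ORDER BY change_ts
                """,
                wm=last_watermark,
            )
            rows = cursor.fetchall()
            # The newest change timestamp becomes the watermark for the next run.
            new_watermark = rows[-1][2] if rows else last_watermark
            return rows, new_watermark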

    #3 Retail Banking - Auto Deployment
    Deutsche Bank started this project to save substantial time in the deployment of all ETL components, e.g. Informatica PowerCenter workflows, Informatica PowerExchange data maps, parameter files, shell scripts, etc. It helped Deutsche Bank reduce the time deployers spent deploying to multiple environments, made deployments error-free, and hence reduced cost.

    #4 Retail Banking - LDAP Integration


  • 11/2010 - 06/2011

    • Hitachi Consulting Pvt Ltd
    • >10,000 employees
    • Other
  • ETL Development and Support
  • Senior ETL Consultant
    American Home Mortgage Servicing Inc
    American Home Mortgage Servicing Inc, Texas through Hitachi Consulting Pvt Ltd, Pune
    Job Type: Employee
    Role: Senior ETL Consultant

    Project:
    Home Mortgage Enterprise Data Warehouse


  • 10/2008 - 06/2010

    • Sigma Systems
    • 1,000-5,000 employees
    • Other
  • Oracle/Unix/Java Production Support
  • Software Engineer
    Sigma Systems
    Sigma Systems, Pune
    Job Type: Employee


    Role: Software Engineer
    Oracle, Unix, Java Development & Support

AVAILABILITY (TIME AND LOCATION)
N/A
OTHER INFORMATION
Summary

Senior DW/ETL Developer with 11+ years of experience in Informatica PowerCenter, Informatica PowerExchange for SAP/Mainframe/Netezza, Informatica Big Data Management, Talend, Pentaho Data Integration, ETL, ELT, ETL Architecture & Design, Agile/Scrum ETL development, Data Analysis, Big Data, Data Migration, Data Integration, Data Warehousing, Data Modeling, Dimensional Modeling, Data Vault, Hadoop, SQL, PLSQL, Oracle, UNIX, UNIX Shell Scripting, BMC Control-M, Automic UC4, Banking, Telecom, Retail.
