File attachments
CV_MadhuriV_FreeLancer.pdf
Skills
Languages: Terraform (HCL), SQL, shell scripting, HiveQL, Impala, Python (basics), YAML.
OS: Linux, Windows.
Data systems: RDBMS (DB2 & Postgres), MongoDB, Hadoop, Databricks, Kafka.
ETL: Pentaho, Kettle, Cognos.
Others: GitHub Actions, Docker, containerization (Kubernetes), Azure, AWS (basics).
Project history
- Design architecture around cloud-native applications and deliver resilient infrastructure.
- Development of highly available and scalable cloud services.
- Delivery of applications aligned with the ISO 27001 security standard.
- Technical consultant for Azure, IaC, and CI/CD implementations.
- Build large-scale big data processing and analytics infrastructure.
- Develop VM infrastructure for data scientists to process production data.
- Customize Azure VMs in line with the client's security guidelines and install custom software on them.
- Deploy VMs using Terraform (see the VM sketch after this list).
- Integrate user authentication on the VMs via Active Directory (AD).
- Experience in deploying highly available App Services with custom domains (see the App Service sketch after this list).
- Develop a cloud data lake using Azure Storage.
- Develop data pipelines to move data from on-premises systems to the cloud data lake.
- Experience in configuring Databricks end-to-end (workspace config, cluster config, SQL endpoints) using IaC (see the Databricks sketch after this list).
- Secure cloud services with private endpoints and encryption of data at rest and in transit (see the private-endpoint sketch after this list).
- Experience in automating the deployment of Azure services using Terraform: VMs, App Services, Data Factory, data lakes, Databricks, Azure Event Hubs, Azure databases (see the Event Hubs sketch after this list).
- Develop custom Docker images for deployment to AKS.
- Design and develop back-end infrastructure to deploy auto-scaling Git runners in AKS.
- Development of CI/CD pipelines for automated deployment of IaC using Azure DevOps.
- Design and secure a scalable, reliable big data platform (Hadoop) in the cloud.
- Develop ETL processes to move data from on-premises sources into Hadoop.
- Implement cloud-native load balancers to provide a highly available Impala service.
- Implement an authentication and authorization concept for the Hadoop servers and applications.
- Enable users to authenticate to the Hadoop front-end applications via SAML.
- Design and secure a scalable, reliable big data platform (Hadoop).
- Develop ETL processes to extract data from Hadoop and live data sources into the DWH.
- Develop ETL processes and reports using the Pentaho BI Suite.
- Train customers on DB2 fundamentals and best practices.
- Develop training lab material for CICS on Linux (TXSeries).
- Develop metadata models and reporting solutions using Cognos and DB2.
- Technical support for DB2 customers on performance and migration.
- Develop PoCs using new DB2 features.
- Verify DB2 features using shell scripting.
- Develop reports using Cognos Impromptu.
- Develop and optimize existing Unix shell scripts.
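Code sketches (illustrative)

The sketches below illustrate the Terraform-based tasks from the project history. They are minimal, assumption-laden examples: every resource name, size, region, version, and the "example" labels are placeholders, not details from an actual engagement. First, the VM deployment with the azurerm provider:

  provider "azurerm" {
    features {}
  }

  resource "azurerm_resource_group" "example" {
    name     = "rg-example"    # placeholder name
    location = "westeurope"
  }

  resource "azurerm_virtual_network" "example" {
    name                = "vnet-example"
    address_space       = ["10.0.0.0/16"]
    location            = azurerm_resource_group.example.location
    resource_group_name = azurerm_resource_group.example.name
  }

  resource "azurerm_subnet" "example" {
    name                 = "snet-example"
    resource_group_name  = azurerm_resource_group.example.name
    virtual_network_name = azurerm_virtual_network.example.name
    address_prefixes     = ["10.0.1.0/24"]
  }

  resource "azurerm_network_interface" "example" {
    name                = "nic-example"
    location            = azurerm_resource_group.example.location
    resource_group_name = azurerm_resource_group.example.name

    ip_configuration {
      name                          = "internal"
      subnet_id                     = azurerm_subnet.example.id
      private_ip_address_allocation = "Dynamic"
    }
  }

  # SSH-only Linux VM; key-based login keeps passwords out of the config.
  resource "azurerm_linux_virtual_machine" "example" {
    name                  = "vm-example"
    resource_group_name   = azurerm_resource_group.example.name
    location              = azurerm_resource_group.example.location
    size                  = "Standard_D2s_v3"
    admin_username        = "azureuser"
    network_interface_ids = [azurerm_network_interface.example.id]

    admin_ssh_key {
      username   = "azureuser"
      public_key = file("~/.ssh/id_rsa.pub")
    }

    os_disk {
      caching              = "ReadWrite"
      storage_account_type = "Standard_LRS"
    }

    source_image_reference {
      publisher = "Canonical"
      offer     = "0001-com-ubuntu-server-jammy"
      sku       = "22_04-lts"
      version   = "latest"
    }
  }

The custom software installation and hardening mentioned in the project history would typically hang off this via cloud-init (custom_data) or VM extensions, which are omitted here.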
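For the highly available App Services with custom domains, a sketch assuming a Linux plan; the hostname "www.example.com" is a placeholder, and the DNS records the binding requires are omitted:

  resource "azurerm_service_plan" "example" {
    name                = "asp-example"
    resource_group_name = azurerm_resource_group.example.name
    location            = azurerm_resource_group.example.location
    os_type             = "Linux"
    sku_name            = "P1v3"
  }

  resource "azurerm_linux_web_app" "example" {
    name                = "app-example"
    resource_group_name = azurerm_resource_group.example.name
    location            = azurerm_resource_group.example.location
    service_plan_id     = azurerm_service_plan.example.id

    site_config {}
  }

  # Attach the custom domain to the web app (DNS verification omitted).
  resource "azurerm_app_service_custom_hostname_binding" "example" {
    hostname            = "www.example.com"   # placeholder domain
    app_service_name    = azurerm_linux_web_app.example.name
    resource_group_name = azurerm_resource_group.example.name
  }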
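For the end-to-end Databricks configuration via IaC, a sketch that combines the azurerm provider (workspace) with the separate Databricks Terraform provider (cluster); the wiring between the two providers is omitted, and the Spark version and node type are assumed values:

  resource "azurerm_databricks_workspace" "example" {
    name                = "dbw-example"
    resource_group_name = azurerm_resource_group.example.name
    location            = azurerm_resource_group.example.location
    sku                 = "premium"
  }

  # Cluster config via the Databricks provider; values are placeholders.
  resource "databricks_cluster" "shared" {
    cluster_name            = "shared-autoscaling"
    spark_version           = "13.3.x-scala2.12"
    node_type_id            = "Standard_DS3_v2"
    autotermination_minutes = 30

    autoscale {
      min_workers = 1
      max_workers = 4
    }
  }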
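For securing cloud services with private endpoints, a sketch that fronts an ADLS Gen2 storage account (the data lake above) with a private endpoint; Azure encrypts storage-account data at rest by default, and all names are placeholders:

  resource "azurerm_storage_account" "lake" {
    name                     = "stexamplelake"   # placeholder, must be globally unique
    resource_group_name      = azurerm_resource_group.example.name
    location                 = azurerm_resource_group.example.location
    account_tier             = "Standard"
    account_replication_type = "LRS"
    is_hns_enabled           = true  # hierarchical namespace = ADLS Gen2
  }

  # Private endpoint so the data lake is reachable only from the VNet.
  resource "azurerm_private_endpoint" "lake" {
    name                = "pe-stexamplelake"
    location            = azurerm_resource_group.example.location
    resource_group_name = azurerm_resource_group.example.name
    subnet_id           = azurerm_subnet.example.id

    private_service_connection {
      name                           = "psc-stexamplelake"
      private_connection_resource_id = azurerm_storage_account.lake.id
      subresource_names              = ["blob"]
      is_manual_connection           = false
    }
  }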
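Finally, for the automated Event Hubs deployments, a sketch in azurerm 3.x style (later provider versions reference the namespace by ID instead); the SKU, capacity, partition count, and retention are assumptions:

  resource "azurerm_eventhub_namespace" "example" {
    name                = "evhns-example"
    resource_group_name = azurerm_resource_group.example.name
    location            = azurerm_resource_group.example.location
    sku                 = "Standard"
    capacity            = 1
  }

  resource "azurerm_eventhub" "example" {
    name                = "evh-example"
    namespace_name      = azurerm_eventhub_namespace.example.name
    resource_group_name = azurerm_resource_group.example.name
    partition_count     = 2
    message_retention   = 1  # days
  }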