Around 12 years of IT experience in Data Warehousing with tools including Informatica PowerCenter 8/9/10.x, Informatica Cloud, PL/SQL, Snowflake, Python, AWS, SFDC, ELK, Couchbase, Five9, Postgres, New Relic, Neo4j, Oracle, Teradata, SQL Server, and UNIX shell scripting.
Strong knowledge of data integration into Snowflake from AWS, SFDC, web services (REST and SOAP APIs), SQL Server, Couchbase, Five9, Postgres, New Relic, Neo4j, and XML/JSON files.
Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
Strong experience in dimensional modeling using Star and Snowflake schemas, including identifying facts and dimensions.
Extensive hands-on experience with Informatica PowerCenter/IICS performance optimization, bug fixes, troubleshooting, debugging, and monitoring.
Strong development skills in data integration, administration, and monitoring in Informatica Intelligent Cloud Services (IICS).
Expertise in Snowflake: data modeling, ETL using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
Expertise in deploying Snowflake features such as data sharing, events, and Data Lake builds.
Hands-on experience with Snowflake utilities, SnowSQL, and Snowpipe modeling techniques using Python.
Good hands-on experience with Python web scraping using Anaconda, Splash, Docker Desktop, and Docker Toolbox.
Strong development skills in creating synchronization tasks, complex mapping tasks, taskflows, and replication tasks.
Expertise in IICS administration activities such as creating projects and connections, granting and revoking permissions, scheduling jobs, and deploying code from non-prod to prod environments.
Expertise in writing PL/SQL stored procedures, performance tuning, debugging, and troubleshooting.
Expertise in advanced Snowflake concepts such as setting up resource monitors, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel, with a solid understanding of when to apply these features.
Strong experience in publishing and consuming Kafka topics, with resiliency and monitoring of real-time data streams.
Good knowledge of KSQL and Kafka Streams.
Adept knowledge of data warehousing methodologies; worked extensively as a developer in analysis, data modeling, design, development, testing, performance tuning, and production support.
Experience using automation and scheduling tools such as AppWorx, Control-M, Crontab, AutoSys, and Maestro.