Associate - ICDW L2 Production Support & Analyst Resume
SUMMARY:
- 7+ years of IT experience in business application software design, development, implementation, testing, and support of Data Warehouse and Master Data Management solutions using Informatica 8.6/9.x, MDM, Hadoop, Teradata, Oracle (SQL & PL/SQL), SQL Server, and UNIX.
- Experience in the requirement gathering, analysis, system design, and development phases of the Software Development Life Cycle (SDLC), the Software Testing Life Cycle (STLC), and Agile methodologies.
- Experience in Data Warehousing, Big Data, Master Data Management, data analytics, data analysis, troubleshooting, ETL development, reporting, real-time support, testing, and documentation.
- Strong exposure to Informatica, Ab Initio, MDM, Teradata, Oracle, SQL Server, and Hadoop.
- Extensively used ETL methodology to support data extraction, transformation, and loading in a corporate-wide ETL solution built on Informatica.
- Strong experience working on development and production support teams, handling critical situations to meet deadlines and complete tasks/projects successfully.
- Conversant with database systems such as Oracle, SQL Server, Teradata, and DB2, using tools like Toad and Aqua Data Studio for SQL and PL/SQL programming.
- Expertise in creating database DDL and DML scripts.
- Experience with legacy systems such as mainframe COBOL files and web services.
- Experience in integrating various data sources with multiple relational databases (Oracle, SQL Server, Netezza, Teradata) and in integrating data from fixed-width and delimited flat files.
- Involved in the design and implementation of data warehouse dimensional models such as Star Schema and Snowflake Schema.
- Experience in data warehousing techniques using different data warehouse schemas, slowly changing dimensions, fact tables, and aggregate tables.
- Experience in the analysis of stored procedures, packages, database triggers, and functions using PL/SQL.
- Worked on rewriting loader scripts into Informatica.
- Hands-on experience implementing audit checks, pre-validation, and pre-cleansing activities using shell scripting and database tables.
- Expertise in analyzing performance bottlenecks and tuning mappings.
- Experience in all cleansing activities using Informatica PowerCenter and Informatica IDQ.
- Developed the end-to-end MDM flow: loading landing tables, creating mappings for cleansing and stage loads, executing the Match/Merge process, and validating Base Object loads.
- Expertise in creating mappings, trust and validation rules, match paths, match columns, match rules, merge properties, and batch groups.
- Knowledge of the Data Steward utilities (Merge Manager, Hierarchy Manager, Data Manager).
- In-depth knowledge of MDM Base Objects, match paths, match columns, match rules, and fuzzy and exact matching.
- Experience with Hadoop ecosystem tools such as HDFS, Hive, and Pig.
- Hands-on experience with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad (a BTEQ audit-check sketch follows this summary).
- Experience in performance tuning Teradata queries and utility scripts for better data distribution and for fixing bottlenecks.
- Experience working with PPI Teradata tables and with Teradata-specific SQL fine-tuning to improve overall ETL performance; worked with business users and system analysts to resolve problems during the requirements and development phases.
- Experience in estimating ETL requirements and in designing and architecting integration solutions.
- Expertise in using DVO for data testing.
- Experience with automation and scheduling tools such as Autosys, TWS, and Control-M.
- Experienced in the Banking and Insurance industries.
- LOMA (ALMI) certified professional in the insurance and health insurance domains.
- Good experience documenting ETL process flows for easier maintenance and analysis.
- Proficient in designing testing artifacts such as test cases, test reports, and test summary reports.
- Working knowledge of SAP Business Intelligence & Business Objects.
- Good judgment and communication skills for reporting and prioritizing software bugs.
- Self-starter, comfortable in high-intensity and challenging work environments.
- Good interpersonal skills; committed, result-oriented, and hardworking, with a quest and zeal to learn new technologies and undertake challenging tasks.
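For illustration, a minimal sketch of the shell-plus-BTEQ audit check pattern mentioned above, assuming a Teradata environment; the TDPID, credentials, staging database, and the AUDIT_DB.LOAD_AUDIT control table are all hypothetical placeholders:

```
#!/bin/ksh
# audit_check.sh - post-load row-count audit via BTEQ (sketch).
# Hypothetical names: tdprod (TDPID), AUDIT_DB.LOAD_AUDIT control table,
# STG_DB staging database. Real credentials would come from a protected file.

FEED_NAME=$1    # e.g. CLAIMS_DAILY; doubles as the staging table name here

bteq <<EOF
.LOGON tdprod/etl_user,etl_password;

-- Return a row only when the staged count differs from the
-- expected count recorded in the audit control table.
SELECT 1
FROM   AUDIT_DB.LOAD_AUDIT a
WHERE  a.feed_name = '${FEED_NAME}'
AND    a.expected_cnt <> (SELECT COUNT(*) FROM STG_DB.${FEED_NAME});

.IF ACTIVITYCOUNT > 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

rc=$?
if [ $rc -ne 0 ]; then
    echo "Audit check FAILED for ${FEED_NAME} (rc=$rc)"
    exit $rc
fi
echo "Audit check passed for ${FEED_NAME}"
```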
TECHNICAL SKILLS:
ETL & Reporting Tools: Informatica PowerCenter 8.6.1/9.1/9.6, Informatica MDM 9.6, Ab Initio (GDE 3.2), Splunk, Hadoop Ecosystem (Hive, Pig, MapReduce, and HDFS), Business Objects, SAP BI
Databases: Oracle, Teradata, SQL Server, DB2
Scheduling Tools: Autosys, TWS, Control-M
Other Languages: SQL, PL/SQL, UNIX Shell Scripting, Mainframes, C
Operating Systems: Windows 7/XP Professional, Linux, UNIX
Utilities: PuTTY, UltraEdit, SOAP
DB Tools: Oracle SQL Developer, DbVisualizer, Toad, Teradata SQL Assistant
PROFESSIONAL EXPERIENCE:
Confidential
ASSOCIATE - ICDW L2 Production Support & Analyst
Environment: Informatica BDE, Informatica PowerCenter 9.6, Informatica IDQ, Teradata, Oracle, Hive and Hadoop HDFS, Informatica MDM 9.7, Ab Initio (GDE 3.2), Cognos reporting tool, UNIX scripting, ETL data flow design in Visio, Control-M, Query It, SOAP, SIF
Responsibilities:
- Monitoring the daily loads and the production environment (a monitoring sketch follows this section).
- Identifying trends and opportunities to improve operational efficiency.
- Identifying recurring issues and fixing code issues through ITSM.
- Implemented password parameterization for the Teradata ETLs in Informatica using EPV.
- Providing second-level support for the ICDW applications: job scheduling, change and incident support, and handling user queries.
- Handling code promotion, change and problem management, incident management, EURC request management, job scheduling, file and database table management, transfer/load/extract data management, application wellness checks, daily status reporting, application recovery, and disaster recovery.
- Analyzing recurring production issues and driving them to resolution; developing proactive processes to alert on and eliminate issues before they escalate in business impact; identifying areas for improvement.
- Documenting all the failures, corresponding resolution steps and code fixes for future reference.
- Hands-on experience analyzing bottlenecks in Informatica transformations and embedded queries, and tuning them for better performance.
- Handling Informatica and Database version upgrades.
- Handling P1 calls with the help of the PAC team.
- Handling loads during outages and maintenance activities.
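A rough sketch of the daily-load monitoring mentioned above, assuming Informatica's pmcmd command-line utility; the domain, service, folder, workflow, and mailing-list names are placeholders, and the exact status string grepped for can vary by PowerCenter version:

```
#!/bin/ksh
# wf_health_check.sh - daily-load wellness check via pmcmd (sketch).
# All names below are placeholders for the real environment.

DOMAIN=Domain_ICDW
INT_SVC=IS_ICDW_PROD
FOLDER=ICDW_DAILY
USERID=ops_monitor      # $PASSWD assumed to be exported securely upstream

for WF in wf_stg_load wf_dim_load wf_fact_load; do
    # getworkflowdetails reports the most recent run status of the workflow
    pmcmd getworkflowdetails -sv "$INT_SVC" -d "$DOMAIN" \
          -u "$USERID" -p "$PASSWD" -f "$FOLDER" "$WF" |
        grep -q "Workflow run status: \[Succeeded\]"
    if [ $? -ne 0 ]; then
        echo "ALERT: $WF did not finish successfully" |
            mailx -s "ICDW daily load alert" icdw-support@example.com
    fi
done
```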
Confidential
ASSOCIATE LEAD
Environment: Informatica PowerCenter 9.1.1/9.6.1, MDM, Oracle 11g/12c, SQL Server, HPSM, Tivoli Workload Scheduler, Aqua Data Studio, Toad, ITG, UNIX
Responsibilities:
- Responsible for Business Analysis and Requirements gathering.
- Worked closely with clients to analyze data, determine project requirements, review functional requirements, and develop detailed designs to meet them.
- Designed the ETL architecture to meet business reporting and data analytics requirements.
- Closely involved in design and in preparing functional and technical documents.
- Prepared logical and physical data models based on business requirements.
- Estimated ETL requirements and designed and architected integration solutions.
- Developed complex Informatica mappings and designed dimension and fact mappings with complex logic.
- Created mapping documents to outline data flow from sources to targets.
- Identified and proposed application performance initiatives and designed database and Informatica partitioning.
- Identified and proposed solutions to improve business functionality and generalize best practices.
- Worked on defining the development and migration standards to be used for Informatica coding.
- Created technical guideline documents and standards to ensure uniform quality across all deliverables.
- Provided technical leadership to the support team for root-cause identification and problem resolution.
- Created new tables, indexes, synonyms, and sequences as needed for new requirements (a DDL sketch follows this section).
- Tuned mappings with various performance issues.
- Initiated ICD-10 changes across the application and downstream systems, ensuring all applications adhered to the changes.
- Worked for the Walmart, Medstat, Eaton, Amgen, GBCS, and HIMA clients.
- Converted SQL code into Informatica mappings.
- Created new mappings to produce a consolidated RTP file for claim record types with Event No. 8 and to load SF files into ITSHOST tables.
- Monitored the daily loads and the production environment.
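For illustration, a minimal sketch of the kind of DDL deployment referred to above, run through SQL*Plus from a shell wrapper; the schema, table, sequence, and connection names are purely illustrative:

```
#!/bin/ksh
# deploy_ddl.sh - DDL for a new requirement, applied via SQL*Plus (sketch).
# Schema, table, and connect-string names are illustrative placeholders.

sqlplus -s etl_user/"$DB_PASS"@ORCL <<'EOF'
WHENEVER SQLERROR EXIT SQL.SQLCODE

-- New summary table with a surrogate key fed by a sequence
CREATE TABLE dw.claim_summary (
    claim_id   NUMBER        PRIMARY KEY,
    member_id  NUMBER        NOT NULL,
    claim_amt  NUMBER(12,2),
    load_dt    DATE          DEFAULT SYSDATE
);

CREATE SEQUENCE dw.claim_summary_seq START WITH 1 INCREMENT BY 1;

CREATE INDEX dw.claim_summary_ix1 ON dw.claim_summary (member_id);

-- Synonym so downstream users need no schema prefix
CREATE PUBLIC SYNONYM claim_summary FOR dw.claim_summary;

EXIT;
EOF
```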
Confidential
ETL Developer
Environment: Oracle 10g, PL/SQL, Informatica, Windows NT, UNIX, Autosys
Responsibilities:
- Gathering requirements and designing the ETL data model to extract data from source systems into the DWH and from the DWH to downstream applications.
- Creating Technical specification and data mapping specification documents by coordinating with Clients and Data Architects.
- Developing mappings using various transformations like Connected and Unconnected lookups, Update strategy, Aggregator, Router, Filter, Joiner, Expression, Stored Procedure, Normalizer, Source Qualifier, Rank, Sequence Generator, Sorter, SQL transformation according to the business logic.
- Tuning Mappings and Sessions at Query level and Informatica level during development phase.
- Working with different data sources like flat files (Fixed width and Delimited) and RDBMS tables.
- Proactively identifying root causes whenever issues arise at the database or Informatica level during the development phase.
- Migrating and validating ETL code from DEV to QA and QA to PROD.
- Reviewing ETL Code and Oracle DDL, DML scripts created by Team Members.
- Implementing Incremental load logic, CDC & MD5 logic in ETL design/development to improve the performance of loads.
- Creating PROD release documents, which include impact analysis docs, object lists, UTP, UTR, LLD, and rollout and rollback plans.
- Performing impact analysis and estimating the effort required to accomplish assigned tasks on a priority basis.
- Creating UNIX shell scripts to perform checks such as file availability, file size, and file timestamp checks, and scripts to transfer (FTP) files from one server to another, as sketched below.
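A minimal sketch of such a pre-load file check and transfer script; paths, host names, and the file-name pattern are placeholders (sftp shown here; a classic ftp script differs only in the command block):

```
#!/bin/ksh
# file_check_and_ship.sh - pre-load file checks plus transfer (sketch).
# Paths, host names, and the file-name pattern are placeholders.

SRC=/data/inbound/claims_$(date +%Y%m%d).dat
DEST_HOST=etl-target.example.com
DEST_DIR=/data/landing

# 1. File availability check
[ -f "$SRC" ] || { echo "Missing file: $SRC"; exit 1; }

# 2. File size check - reject empty feeds
[ -s "$SRC" ] || { echo "Zero-byte file: $SRC"; exit 2; }

# 3. File timestamp check - warn if the file is more than a day old
if find "$SRC" -mtime +0 | grep -q .; then
    echo "WARN: $SRC is more than a day old"
fi

# 4. Transfer the validated file to the target server
sftp "$DEST_HOST" <<EOF
cd $DEST_DIR
put $SRC
bye
EOF
```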