Sr. ETL/Informatica Developer Resume
Danville, PA
SUMMARY:
- 8 years of Information Technology experience in ETL System Analysis, Design, Development, Maintenance, and Implementation.
- 7+ years of ETL experience using Informatica Power Center 9.6.1/9.1.0/8.6/8.1/7.1, Informatica Power Mart 5.1, Informatica Data Quality, Metadata Manager, and Tableau.
- Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
- Experience working with business analysts to identify, study, and understand requirements and translate them into ETL code during the Requirement Analysis phase.
- Experience in creating High Level Design and Detailed Design in the Design phase.
- Expertise in business model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules, Time Series, and Cache Management.
- Extensively worked on ETL mappings and the analysis and documentation of OLAP reports.
- Well versed in OLTP data modeling and data warehousing concepts.
- Knowledge of Entity-Relationship concepts, fact and dimension tables, slowly changing dimensions, and Dimensional Modeling (Star Schema and Snowflake Schema).
- Experience in integrating various data sources like Oracle, DB2, and SQL Server, and non-relational sources like flat files, into the staging area.
- Experience in extracting Claims and Member information from FACETS claim engine tables.
- Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
- Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklets, Control).
- Good understanding of the Ralph Kimball and Bill Inmon methodologies.
- Tuned mappings using Power Center Designer, applying different logic to maximize efficiency and performance.
- Experienced in UNIX work environment, file transfers, job scheduling and error handling.
- Extensively worked on developing and debugging Informatica mappings, mapplets, sessions and workflows.
- Experience in writing, testing and implementation of the PL/SQL triggers, stored procedures, functions, packages.
- Experience with industry standard methodologies like waterfall, Agile within the software development life cycle.
- Tableau BI program implementation, client consulting, project delivery, and advisory.
- Worked with Tableau Desktop and Tableau Server/Site for Tableau BI reporting.
- Involved in unit testing and system testing to verify that data loads into the target are accurate.
- Experience in support and knowledge transfer to the production team.
- Assign work and provide technical oversight to onshore and offshore developers.
TECHNICAL SKILLS:
Data Warehousing: Informatica Power Center 10.1.0/9.x/8.6.1/8.5/7.1, Informatica Power Exchange 8.6.1/9.1, Informatica Data Quality, Informatica Power Mart 5.1 (Designer, Workflow Manager, Workflow Monitor, Server Manager), Informatica Power Connect, OLAP
Data Modeling: ER Studio, Erwin 4.0/3.5, Toad Data Modeler
Databases: Oracle 11g/9i/8i, MS SQL Server 2014/2012/2008, MS Access, DB2
Business Intelligence: Business Objects
Languages: SQL, PL/SQL, Unix Shell Script
Operating Systems: Windows 98/2000/2008 Servers, Linux, Unix
Scheduling Tools: DAC, Autosys, Tidal Schedulers
Reporting Tools: Tableau Desktop, Tableau Server/Site
Tools: Toad, SQL*Loader, Tableau, Tableau BI Products (Desktop, Server, Reader & Online), Business Objects
PROFESSIONAL EXPERIENCE:
Confidential, Danville, PA
Sr. ETL/Informatica Developer
Responsibilities:
- Interacted with business users for requirement analysis and to define business and functional specifications.
- Documented user requirements, translated them into system solutions, and developed the implementation plan and schedule.
- Developed complex transformations and Mapplets using Informatica Power Center 10.1.0/9.x to extract, transform, and load data into data marts from the Operational Data Store (ODS).
- Handled day-to-day production incidents and failures and implemented fixes.
- Prioritized tickets based on urgency and completed REQ/RITM items.
- Extracted claims and member related information from FACETS claims engine tables and loaded into Data Warehouse/ Data Mart.
- Prepared software requirement specifications, business rules interacting with business units and designed Star schema, logical and physical database design for Domains.
- Developed data mappings and transformations between source systems and the warehouse.
- Developed SCD Type 1 and Type 2 mappings.
- Implemented Aggregate, Filter, Join, Expression, Lookup and Update Strategy transformations.
- Used debugger to test the mapping and fixed the bugs.
- Created sessions, sequential and concurrent batches for proper execution of mappings using server manager.
- Migrated development mappings and applied hot fixes in the production environment.
- Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
- Executed sessions, both sequential and concurrent for efficient execution of mappings and used other tasks like event wait, event raise, email, command and pre/post SQL.
- Created Workflows with worklets, event wait, decision box, email and command tasks using Workflow Manager and monitored them in Workflow Monitor.
- Created Pre/Post Session/SQL commands in sessions and mappings on the target instance.
- Responsible for daily verification that all scripts, downloads, and file copies were executed as planned, troubleshooting any steps that failed, and providing both immediate and long-term problem resolution.
- Provided detailed technical, process, and support documentation, including daily process rollback procedures, detailed specifications, and comprehensive project documents covering the workflows and their dependencies.
- Involved in writing batch scripts for file transfers, file renaming, and archiving old files for automation.
- Designed, developed, and deployed Tableau reports; executed the incremental extract strategy and refreshed and scheduled extracts through Tableau Server.
- Generated Tableau extracts and implemented various calculations and visualizations in the Tableau layout design.
- Created and Documented ETL Test Plans, Test Cases, Test Scripts, Expected Results, Assumptions and Validations.
- Drove the data loads using slowly changing dimensions and Dimensional Modeling (Star Schema and Snowflake Schema).
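The batch automation mentioned above (file transfers, renaming, and archiving) can be sketched as a small shell script; the directory names, the `.dat` extension, and the 30-day retention window are illustrative assumptions, not the actual production values:

```shell
#!/bin/sh
# Sketch of a file-archiving step: renames processed files with a date
# suffix, moves them into a dated archive directory, and purges old archives.
# Paths and the retention window are assumed values for illustration.
SRC_DIR=${SRC_DIR:-inbound}
ARCHIVE_DIR=${ARCHIVE_DIR:-archive}
STAMP=$(date +%Y%m%d)

# Demo input so the sketch has something to work on
mkdir -p "$SRC_DIR" "$ARCHIVE_DIR/$STAMP"
printf 'sample row\n' > "$SRC_DIR/claims.dat"

# Rename each processed file with a date suffix and move it to the archive
for f in "$SRC_DIR"/*.dat; do
    [ -e "$f" ] || continue              # skip when no files match the glob
    base=$(basename "$f" .dat)
    mv "$f" "$ARCHIVE_DIR/$STAMP/${base}_${STAMP}.dat"
done

# Purge archived files older than 30 days
find "$ARCHIVE_DIR" -type f -mtime +30 -exec rm -f {} +
```

In the actual jobs, a scheduler (Tidal in this environment) would invoke a script of this shape after the Informatica load completes.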
Environment: Informatica Power Center 10.1/9.x, SQL Server 2014/2012, Azure SQL Database, Oracle 11g R2, Oracle 12c, Linux Server, High Availability, batch scripts, Business Objects XI R2, Tidal Scheduler, ServiceNow, JIRA, Tableau BI Products (Desktop, Server, Reader & Online)
Confidential, Sacramento, CA
Sr Informatica Developer
Responsibilities:
- Interacted with business users for requirement analysis and to define business and functional specifications.
- Documented user requirements, translated them into system solutions, and developed the implementation plan and schedule.
- Extracted data from Oracle, DB2, flat files, and XML files and populated the EDW (SQL Server).
- Developed complex transformations and Mapplets using Informatica Power Center 9.1.0/9.6.1 to extract, transform, and load data into data marts and the Enterprise Data Warehouse (EDW) from the Operational Data Store (ODS) using Dimensional Modeling (Star Schema and Snowflake Schema).
- Prepared software requirement specifications, business rules interacting with business units and designed Star schema, logical and physical database design for Domains.
- Developed data mappings and transformations between source systems and the warehouse.
- Extracted and loaded data using Teradata utilities such as MultiLoad, FastExport, and FastLoad.
- Developed SCD Type 1 and Type 2 mappings.
- Implemented Aggregate, Filter, Join, Expression, Lookup and Update Strategy transformations.
- Used debugger to test the mapping and fixed the bugs.
- Created sessions, sequential and concurrent batches for proper execution of mappings using server manager.
- Created IDQ data profiles and scorecards for research users to analyze Confidential trends.
- Migrated development mappings and applied hot fixes in the production environment.
- Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
- Executed sessions, both sequential and concurrent for efficient execution of mappings and used other tasks like event wait, event raise, email, command and pre/post SQL.
- Created Workflows with worklets, event wait, decision box, email and command tasks using Workflow Manager and monitored them in Workflow Monitor.
- Created Pre/Post Session/SQL commands in sessions and mappings on the target instance.
- Responsible for daily verification that all scripts, downloads, and file copies were executed as planned, troubleshooting any steps that failed, and providing both immediate and long-term problem resolution.
- Provided detailed technical, process, and support documentation, including daily process rollback procedures, detailed specifications, and comprehensive project documents covering the workflows and their dependencies.
- Involved in writing batch scripts for file transfers, file renaming, and archiving old files for automation.
- Created and Documented ETL Test Plans, Test Cases, Test Scripts, Expected Results, Assumptions and Validations.
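The daily verification and failure handling described above can be sketched as a shell wrapper that runs each step, logs the outcome, and returns a non-zero exit code so the scheduler flags the failure; the step names and log path are illustrative assumptions:

```shell
#!/bin/sh
# Sketch of a job wrapper with per-step logging and error handling, of the
# kind used to verify that scheduled scripts and file copies ran as planned.
# Step commands and the log path are assumed values for illustration.
LOG=${LOG:-job_run.log}

run_step() {
    step_name=$1; shift
    if "$@" >>"$LOG" 2>&1; then
        echo "$(date '+%Y-%m-%d %H:%M:%S') OK   $step_name" >>"$LOG"
    else
        rc=$?
        echo "$(date '+%Y-%m-%d %H:%M:%S') FAIL $step_name (exit $rc)" >>"$LOG"
        return $rc                   # propagate so the scheduler sees a failure
    fi
}

# Example steps: each command is checked and logged individually
run_step "touch_marker" touch extract_done.marker
run_step "list_marker"  ls extract_done.marker
```

A failed step leaves a FAIL line with the exit code in the log, which is the first thing to check when troubleshooting a missed download or file copy.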
Environment: Informatica Power Center 9.1.0/9.6.1, IDQ 9.1, Oracle 11g R2, Oracle 12c, Exadata, SQL Server 2012/2008, batch scripts, Business Objects XI R2, Tidal Scheduler, UNIX
Confidential, Washington DC
Sr. ETL/Informatica Developer
Responsibilities:
- Gathered and analyzed business and functional requirements and translated them into technical specifications.
- Assisted in Creating Logical and Physical Data Modeling.
- Created the design and technical specifications for the project's ETL process.
- Extensively used Informatica power center for extraction, transformation and loading process.
- Worked on different types of SCDs, like Type 1 and Type 2.
- Designed and created complex mappings using SCD Type 2.
- Extracted data from various sources like SQL Servers, Oracle, flat files, XML files and loaded data into SQL Server based Data Warehouse, Data marts from Operational Data Store (ODS) systems.
- Extracted claims and member data from FACETS claim engine tables to load into Data Warehouse and data marts.
- Worked with the Informatica Data Quality toolkit for analyzing, cleansing, matching, and converting data and for handling exceptions in member information.
- Processed claims information through IDQ defined rules to check the quality of the claims before loading into Data Marts.
- Performed data profiling to understand the data patterns using Informatica Data Quality.
- Performed data standardization using IDQ.
- Worked extensively with Source Analyzer, Mapping Designer, Mapplet Designer, Warehouse Designer, and Transformation Developer.
- Designed and developed Informatica Mappings and sessions based on business rules.
- Developed several mappings and Mapplets using the corresponding sources, targets, and transformations such as Expression, Joiner, Aggregator, Lookup, Update Strategy, and Filter.
- Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
- Used workflow manager for session management, database connection management and scheduling of jobs.
- Created sessions and workflows to run the logic embedded in the mappings using Workflow Manager.
- Created post-session commands to move files from the Informatica server to other locations.
- Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.
- Extensively developed Informatica mappings & tuned them for better performance.
- Created catalogs and analyzed joins.
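The SCD Type 2 behavior implemented in those mappings (expire the current row when an attribute changes, insert a new current row) can be illustrated outside Informatica with a small shell/awk sketch over flat files; the file names and pipe-delimited layout are assumptions for the example, not project artifacts:

```shell
#!/bin/sh
# Toy illustration of SCD Type 2 logic on pipe-delimited flat files.
# dim.psv holds key|value|current_flag; new.psv is the incoming extract.
cat > dim.psv <<'EOF'
101|Gold|Y
102|Silver|Y
EOF
cat > new.psv <<'EOF'
101|Platinum
103|Bronze
EOF

awk -F'|' -v OFS='|' '
    NR == FNR { incoming[$1] = $2; next }     # first file: incoming extract
    {
        if ($3 == "Y" && ($1 in incoming)) {
            if (incoming[$1] != $2) { changed[$1] = 1; $3 = "N" }  # expire
            else delete incoming[$1]          # unchanged: nothing to insert
        }
        print
        seen[$1] = 1
    }
    END {
        # Insert a current row for changed keys and brand-new keys
        for (k in incoming)
            if (changed[k] || !(k in seen))
                print k, incoming[k], "Y"
    }
' new.psv dim.psv > dim_v2.psv
```

In the real mappings this expire/insert split is done with a Lookup against the dimension plus an Update Strategy transformation routing rows to update or insert.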
Environment: Informatica Power Center 9.1.0/9.5/9.6.1, Teradata, Business Objects XI/6.5, Erwin 3.5, PL/SQL, Oracle 11g, DB2, Toad, FACETS, Informatica Data Quality 9.6
Confidential, Hartford, CT
Sr Informatica/IDQ Developer
Responsibilities:
- Designed and Implemented the ETL Process using Informatica power center.
- Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL, and performance tuning.
- Developed ETL flows from source to stage, stage to work tables, and stage to target tables.
- Performed data profiling and analysis using Informatica Data Quality (IDQ) 8.6.1.
- Imported the IDQ address-standardization mappings into Informatica Designer as mapplets.
- Designed various mappings using transformations like Look Up, Router, Update Strategy, Filter, Sequence Generator, Joiner, Aggregator, and Expression Transformation.
- Created workflows with Command tasks, Worklets, Decision, and Event Wait tasks, and monitored sessions using Workflow Monitor.
- Extracted claims and member related information from FACETS claims engine tables and loaded into Data Warehouse/ Data Mart.
- Migrated Informatica Folders from Development Environment to Test and System Test Environment and Worked with Admins to migrate the same to Production environments.
- Wrote PL/SQL procedures for reconciliation of financial data between source and target to automate testing phases and help business for preliminary validation.
- Wrote UNIX scripts and environment files for Informatica.
- Developed Metadata driven code for effective utilization and maintenance using technical metadata, business metadata and process metadata.
- Used Informatica parameter files to externalize business logic instead of hardcoding it in mappings.
- Generated Cognos reports to test standardized reports as per business requirements.
- Tuned Mappings and Mapplets for best Performance on ETL Side and Created Indexes and Analyzed tables periodically on Database side.
- Organized the data flow, developed Autosys jobs for scheduling, and moved them to production.
- Served as the primary resource on the production support team, joining emergency calls during application outages and resolving defects as they were raised.
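An environment file of the kind sourced by those UNIX wrapper scripts might look like the sketch below; the install path is an assumed value, and the pmcmd invocation shown in the comment is only an example pattern:

```shell
#!/bin/sh
# Sketch of an Informatica environment file sourced by wrapper scripts.
# The install path is an assumed value; INFA_DOMAINS_FILE is the standard
# variable pmcmd reads to locate the domain configuration.
INFA_HOME=${INFA_HOME:-/opt/informatica/8.6.1}
INFA_DOMAINS_FILE=$INFA_HOME/domains.infa
PATH=$INFA_HOME/server/bin:$PATH
export INFA_HOME INFA_DOMAINS_FILE PATH

# A wrapper script would source this file and then call pmcmd, e.g.:
#   . ./infa_env.sh
#   pmcmd startworkflow -sv <service> -d <domain> -f <folder> <workflow>
```

Keeping these settings in one sourced file means every wrapper and scheduled job resolves the same Informatica binaries and domain configuration.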
Environment: Informatica Power Center 8.6.1, IDQ 8.6.1, Oracle 10g, SQL Server 2005, Oracle Exadata, FACETS, MS Excel, MS Access, Flat Files, Cognos 8.0, SQL DEV, Toad, Unix, PL/SQL, Windows XP, Erwin, Autosys