Sr. ETL Informatica Developer Resume
Collierville, TN
SUMMARY
- 7+ years of extensive experience using ETL methodologies to support data extraction, data migration, data transformation, and data loading with Informatica PowerCenter/IDQ.
- Extensively used Informatica PowerCenter 9.6/9.1/8.x and Informatica Data Quality (IDQ) 9.6/9.1 as ETL tools for extracting, transforming, loading, and cleansing data from various source inputs to various targets, in batch and real time.
- Worked on the Informatica Data Quality (IDQ) 9.6/9.1 toolkit; performed data profiling, cleansing, and matching, and imported data quality files as reference tables.
- Strong work experience across the data warehouse life cycle; performed ETL procedures to load data from sources such as SQL Server, Oracle, Mainframe, Teradata, and flat files into data marts and the data warehouse using Informatica PowerCenter Designer, Workflow Manager, and Workflow Monitor.
- Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, Type 2, Type 3; see the SCD Type 2 sketch after this list), Change Data Capture, dimensional data modeling, the Ralph Kimball approach, star/snowflake modeling, data marts, OLAP, Fact and Dimension tables, and physical and logical data modeling.
- Strong understanding of OLAP and OLTP Concepts.
- Excellent at designing ETL procedures and strategies to extract data from heterogeneous source systems such as Oracle 11g/10g, SQL Server 2008/2005, DB2 10, flat files, XML, and SAP R/3.
- Created mappings in mapping Designer to load data from various sources using transformations like Transaction Control, Lookup (Connected and Un-connected), Router, Filter, Expression, Aggregator, Joiner and Update Strategy, SQL, Stored Procedure and more.
- Involved in understanding of Business Processes, grain identification, identification of dimensions and measures for OLAP applications.
- Involved in the Analysis, Design, Development, Testing and Implementation of business application systems for various Sectors.
- Experience in SQL, PL/SQL and UNIX shell scripting.
- Solid experience in writing SQL queries and Stored Procedures.
- Good knowledge on data quality measurement using IDQ and IDE.
- Extensive ETL experience using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, and Server Manager), Teradata, and Business Objects.
- Good experience in Informatica installation, migration, and upgrade processes.
- Experience in handling initial/full and incremental loads.
- Strong experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying Facts and Dimensions.
- Designed and developed IDQ mappings for address validation/cleansing, data conversion, exception handling, and exception-data reporting.
- Working experience using Informatica Workflow Manager to create sessions, batches, worklets, and reusable tasks, schedule workflows, and monitor sessions.
- Extensive experience using database tools such as SQL*Plus and SQL Developer.
- Experienced in Performance tuning of Informatica and tuning the SQL queries.
- Proficient with Informatica Data Quality (IDQ) for data cleansing and massaging in the staging area.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Experience working in both Waterfall and Agile methodologies.
- Committed team player with multitasking capabilities. Excellent interpersonal and human relations skills.
- Strong communication skills, both verbal and written, with an ability to express complex business concepts in technical terms.
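As a minimal illustration of the SCD Type 2 pattern noted above: the sketch below uses hypothetical DIM_CUSTOMER and STG_CUSTOMER tables with illustrative column names; in the projects that follow, this logic was built with Informatica Lookup and Update Strategy transformations rather than hand-written SQL.

```sql
-- Hypothetical SCD Type 2 load: expire the current row, then insert the new version.
-- All names (DIM_CUSTOMER, STG_CUSTOMER, EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
-- are illustrative, not taken from any specific project.

-- Step 1: close out current dimension rows whose tracked attributes changed.
UPDATE DIM_CUSTOMER d
SET    EFF_END_DT   = CURRENT_DATE - 1,
       CURRENT_FLAG = 'N'
WHERE  d.CURRENT_FLAG = 'Y'
  AND  EXISTS (SELECT 1
               FROM   STG_CUSTOMER s
               WHERE  s.CUSTOMER_ID = d.CUSTOMER_ID
                 AND  (s.ADDRESS <> d.ADDRESS OR s.STATUS <> d.STATUS));

-- Step 2: insert the new version with an open-ended effective date range.
-- Brand-new customers and changed customers (expired in step 1) both qualify.
INSERT INTO DIM_CUSTOMER
       (CUSTOMER_ID, ADDRESS, STATUS, EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
SELECT s.CUSTOMER_ID, s.ADDRESS, s.STATUS, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   STG_CUSTOMER s
WHERE  NOT EXISTS (SELECT 1
                   FROM   DIM_CUSTOMER d
                   WHERE  d.CUSTOMER_ID = s.CUSTOMER_ID
                     AND  d.CURRENT_FLAG = 'Y');
```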
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 10.1/9.1/8.x/7.x/6.x, PowerExchange 9.1/8.x/7.x, Informatica Data Quality, PowerMart 5.x (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Repository Manager, Workflow Monitor)
Data Modeling: MS Visio, ER Diagrams
Databases: Oracle 12C/11g/10g/9i/8i, Sybase, DB2, MS SQL Server 2000/2005, Teradata
DB Tools: Oracle SQL Developer, SQL*Loader, Toad, Oracle SQL*Plus
Operating Systems: Windows 98/NT/2000/XP/2003/7, UNIX, MS-DOS
Scheduling Tools: Informatica Scheduler, Skybot, Autosys, Control M, DAC
Languages: SQL, PL/SQL, UNIX, HTML, XML, C, C++, Java
Office Suite: MS Word, MS Power Point, MS Excel, MS Access
Methodologies/Data Modeling Tools: Data Mart, Dimensional Modeling, Snowflake Schema, Star Schema, ERwin
Other Tools: Tableau, UML Tools
PROFESSIONAL EXPERIENCE
Sr. ETL Informatica Developer
Confidential, Collierville, TN
Responsibilities:
- Analyzed business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter.
- Translated high-level design specifications into simple ETL coding and mapping standards.
- Worked in an Agile methodology; participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer-reviewed their development work, and provided technical solutions. Proposed ETL strategies based on requirements.
- Coded Teradata BTEQ scripts to load and transform data and to fix defects such as SCD Type 2 date chaining and duplicate records (see the Teradata sketch at the end of this section).
- Worked with team to convert Trillium process into Informatica IDQ objects.
- Extensively used IDQ transformations such as Address Validator, Exception, Parser, and Standardizer; solid experience debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
- Extensively worked on UNIX shell scripts for server health-check monitoring, such as repository backup, CPU/disk-space utilization, Informatica server monitoring, and UNIX file-system maintenance/cleanup, and on scripts using Informatica command-line utilities.
- Extensively worked on CDC to capture data changes in sources for delta loads. Used the Debugger to validate mappings and gain troubleshooting information about data and error conditions.
- Developed workflows with worklets, event waits, assignments, conditional flows, and email and command tasks using Workflow Manager.
- Proficient in system study, data migration, data integration, data profiling, data cleansing/scrubbing, and data quality.
- Worked on the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.
- Created ETL mappings and transformations using Informatica PowerCenter to move data from multiple sources into the target area, using complex transformations such as Expression, Router, Lookup, Source Qualifier, XML Generator, XML Parser, Aggregator, Filter, and Joiner.
- Responsible for preparing logical and physical data models and documenting them.
- Performed ETL code reviews and Migration of ETL Objects across repositories.
- Developed ETLs for masking data made available to the offshore development team.
- Developed UNIX scripts for dynamic generation of parameter files and for FTP/SFTP transmission.
- Monitored day-to-day loads, addressed and resolved production issues promptly, and supported ETL jobs running in production to meet SLAs.
- Integrated IDQ mappings through IDQ web service applications as cleanse functions in Informatica IDQ cleanse Adapters.
- Migrated code from Dev to Test to Pre-Prod. Created effective unit and integration tests of data at different layers to capture discrepancies/inaccuracies and ensure accurate data loading.
- Scheduled Informatica workflows using DAC for OBIEE loads.
- Involved in implementing change data capture (CDC) and Type I, II, III slowly changing Dimensions.
- Developed functions and stored procedures to support complex mappings.
Environment: Informatica PowerCenter 10.x/9.6, Informatica Data Quality (IDQ) 9.6, Oracle 11g, Teradata, PL/SQL, SQL Developer, TOAD, PuTTY, UNIX.
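A sketch of the kind of Teradata SQL behind the BTEQ defect fixes mentioned above; DIM_ACCT and its columns are hypothetical, and dialect details are simplified.

```sql
-- Hypothetical Teradata repair of an SCD Type 2 dimension; DIM_ACCT, EFF_START_DT,
-- EFF_END_DT, and LOAD_TS are illustrative names.

-- Detect date-chaining defects: versions whose end date does not meet the next
-- version's start date (MIN OVER a one-row frame stands in for LEAD).
SELECT ACCT_ID, EFF_START_DT, EFF_END_DT, NEXT_START
FROM (
    SELECT ACCT_ID, EFF_START_DT, EFF_END_DT,
           MIN(EFF_START_DT) OVER (PARTITION BY ACCT_ID
                                   ORDER BY EFF_START_DT
                                   ROWS BETWEEN 1 FOLLOWING AND 1 FOLLOWING) AS NEXT_START
    FROM DIM_ACCT
) v
WHERE NEXT_START IS NOT NULL
  AND EFF_END_DT <> NEXT_START - 1;

-- Clean up exact duplicates, keeping the first-loaded row per key and version.
DELETE FROM DIM_ACCT
WHERE (ACCT_ID, EFF_START_DT, LOAD_TS) IN (
    SELECT ACCT_ID, EFF_START_DT, LOAD_TS
    FROM (
        SELECT ACCT_ID, EFF_START_DT, LOAD_TS,
               ROW_NUMBER() OVER (PARTITION BY ACCT_ID, EFF_START_DT
                                  ORDER BY LOAD_TS) AS RN
        FROM DIM_ACCT
    ) t
    WHERE RN > 1
);
```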
Sr. ETL Consultant
Confidential, Boston, MA
Responsibilities:
- Actively involved in requirements gathering, business analysis, design, development, testing, and implementation of business rules.
- Led data migration activities, facilitating data mapping, data cleansing, and data validation.
- Experience in end-to-end data migration roles, including leading, designing, and executing migrations.
- Involved in the entire SDLC (Software Development Life Cycle) process that includes Implementation, testing, deployment, documentation, training, and support.
- Extracted data from a Teradata database, transformed it per business requirements, and loaded it into the new Oracle warehouse.
- Analyzed business process and developed ETL procedures to move data from source to target systems.
- Defined the data migration approach and strategies and provided detailed data load plans covering all objects, sequences, systems, and responsibilities.
- Worked on PowerExchange sources and flat files to load CDC (Change Data Capture) data into Oracle target tables (see the delta-load sketch at the end of this section).
- Parameterized the code so that parameter files could be changed in UNIX without disturbing applications running in production.
- Extensively used and modified SQL (Teradata to Oracle) for accessing and manipulating database systems, updating Source Qualifier SQL overrides, update overrides, post-SQL, and pre-SQL.
- Performed Data profiling on various source systems like SQL Server, DB2, Teradata, Oracle and Flat files.
- Defined the target load order plan and constraint-based loading to load data correctly into the different target tables.
- Troubleshot complex problems by checking session and error logs.
- Created workflows in Workflow Manager for tasks such as sending emails, timers that trigger when an event occurs, and sessions that run mappings.
- Hands-on experience in performance tuning: identifying and resolving bottlenecks in SQL queries, sources, targets, transformations, and mappings to improve performance.
- Hands-on knowledge of creating shell scripts for automation and execution of workflows.
- Raised change requests, analyzed and coordinated resolution of program flaws in the production and QA environments, and migrated fixed code through Dev, QA, pre-production, and production environments per change-request approvals.
- Provided 24x7 production support during post-implementation of ETL activities, including troubleshooting and running jobs manually.
Environment: Informatica PowerCenter 9.6/9.5/9.1, DB2, SQL Developer, TOAD, Teradata 14, Teradata SQL Assistant, Quality Center, Oracle 9i/10g/11g, UNIX, Autosys
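The delta loads above followed a watermark pattern; a sketch with hypothetical ETL_CONTROL and SRC_ORDERS tables (in practice, the bounds were supplied to the Informatica sessions through parameter files).

```sql
-- Hypothetical watermark-driven delta extract; ETL_CONTROL, SRC_ORDERS, and the
-- column names are illustrative.

-- Pull only rows changed since the last successful run.
SELECT o.*
FROM   SRC_ORDERS o
WHERE  o.LAST_UPDATE_TS > (SELECT LAST_EXTRACT_TS
                           FROM   ETL_CONTROL
                           WHERE  JOB_NAME = 'ORDERS_DELTA');

-- After a successful load, advance the watermark. A production job would capture
-- the upper bound once before the extract and reuse it here to avoid gaps.
UPDATE ETL_CONTROL
SET    LAST_EXTRACT_TS = CURRENT_TIMESTAMP
WHERE  JOB_NAME = 'ORDERS_DELTA';
```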
Sr. ETL Informatica Developer
Confidential, Reston, VA
Responsibilities:
- Used Informatica PowerCenter Designer to analyze source data and extract and transform it from various source systems (SQL Server and flat files), incorporating business rules.
- Used Informatica PowerCenter 9.5.1 to extract data from flat files and SQL Server and applied business logic to load Facets, vendor, and pharmacy data into data warehouse and data mart tables.
- Designed and developed Mappings using different transformations such as Source Qualifier, Expression, Aggregator, Filter, Joiner, and Lookup to load data from source to target tables.
- Created stored procedures to handle selection criteria such as address, provider, specialty, chapters, and credentialing, and to load data for the extract and exclusion reports based on the business requirements (see the procedure sketch at the end of this section).
- Created stored procedures to load the crosswalk data for Medicare Provider and Printed Directories into SQL staging tables.
- Created mapping and session variables and parameter files so the code could be migrated easily across environments and databases.
- Performed unit testing by writing test scripts in the database and was involved in integration testing.
- Created and deployed SSRS reports for provider directory, health plan, and vendor data, generating daily, weekly, monthly, and quarterly reports.
- Created subscriptions and generated subscription IDs for the extract and exclusion reports in SSRS.
- Created Tidal jobs to schedule Informatica workflows using the Tidal Scheduler.
- Created and configured workflows, worklets, and sessions using Informatica Workflow Manager.
- Created a workflow-level command task to collect flat files of the same structure (indirect file loading) and, at the end of the task, move all flat files to an archive directory.
- Used workflow-level variables for code reusability.
- Used mapping parameters and variables to apply object-oriented techniques and facilitate code reuse.
- Coordinated system/integration/UAT testing with other project teams and reviewed the test strategy.
Environment: Informatica PowerCenter 9.5, SQL Server 2012, Oracle 11g, Flat Files, SSRS, Microsoft Visual Studio 2012, Tidal, JIRA, Microsoft Visio, SharePoint, Facets, Tortoise SVN.
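In the spirit of the extract/exclusion selection procedures above, a hypothetical T-SQL sketch; every object and parameter name (dbo.usp_GetProviderExtract, dbo.Provider, dbo.ProviderAddress, @Specialty) is illustrative, not from the actual system.

```sql
-- Hypothetical T-SQL selection procedure for one extract report; excluded
-- providers are filtered out unless the caller opts in.
CREATE PROCEDURE dbo.usp_GetProviderExtract
    @Specialty       VARCHAR(50),
    @IncludeExcluded BIT = 0
AS
BEGIN
    SET NOCOUNT ON;

    -- Return providers for the extract, optionally keeping excluded records.
    SELECT p.ProviderId,
           p.ProviderName,
           p.Specialty,
           a.AddressLine1,
           a.City,
           a.State
    FROM   dbo.Provider p
           JOIN dbo.ProviderAddress a
             ON a.ProviderId = p.ProviderId
    WHERE  p.Specialty = @Specialty
      AND  (@IncludeExcluded = 1 OR p.ExclusionFlag = 0);
END;
```

A call such as `EXEC dbo.usp_GetProviderExtract @Specialty = 'Cardiology';` would then drive a single extract report.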
Informatica Data Quality Developer
Confidential, Chicago, IL
Responsibilities:
- Interacted with Business Analyst to understand the requirements and translated them into appropriate technical requirement document.
- Coordinated with the client team daily on issues and status updates.
- Conducted technical design presentations for the client and obtained sign-off.
- Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
- Designed mappings using transformations such as Lookup (connected, unconnected, dynamic), Expression, Sorter, Joiner, Router, Union, Transaction Control, Update Strategy, Normalizer, Filter, Rank, and Aggregator in Informatica PowerCenter Designer.
- Used IDQ (Informatica Data Quality) as the data cleansing tool, creating IDQ mapplets to correct data before loading it into the target tables.
- Created IDQ mapplets for address, telephone, and SSN cleansing and used them as Informatica mapplets in PowerCenter.
- Wrote complex Teradata queries to transform data at the database end, used as SQL overrides in Source Qualifier transformations; also wrote stored procedures as required.
- Created profiles on the source tables to find data anomalies using IDQ (see the profiling sketch at the end of this section).
- Analyzed data sources such as Oracle, flat files (delimited and fixed-width text files), and XML files supplying the contract and billing data, understood the relationships by analyzing the OLTP sources, and loaded the data into the Teradata warehouse.
- Worked on Slowly Changing Dimension (SCD) Type 1 and Type 2 to maintain the full history of customers.
- Responsible for performance optimization: writing SQL overrides instead of using transformations, implementing active transformations such as Filters as early as possible in the mapping, and selecting sorted input when using Aggregator or Joiner transformations.
- Implemented performance tuning at all levels: source, target, mapping, and session.
- Created reusable transformations at the mapping level in PowerCenter Designer.
- Created procedures and packages based on BA requirements.
- Worked on simple and complex queries based on business requirements.
- Created JIL scripts based on data-load requirements.
- Created DML scripts based on requirements.
- Worked closely with the MicroStrategy reporting team and helped them get the data for creating reports.
- Performed code promotion from the development level to the production level.
- Prepared an ETL technical document maintaining the naming standards.
Environment: Informatica PowerCenter 9.6, Data Quality 9.6, Oracle 11g, Cognos, Netezza, Web Services, Teradata v13, Teradata utilities, Windows Server 2012.
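Column profiling of the kind done in IDQ above can be approximated in SQL; a sketch against a hypothetical CUSTOMER_STG table and PHONE column, using Oracle-flavored REGEXP_LIKE.

```sql
-- Hypothetical SQL approximation of an IDQ column profile on CUSTOMER_STG.PHONE
-- (table and column names are illustrative): completeness and distinctness first.
SELECT COUNT(*)              AS total_rows,
       COUNT(PHONE)          AS non_null_rows,
       COUNT(DISTINCT PHONE) AS distinct_values
FROM   CUSTOMER_STG;

-- Then the distribution of raw value patterns across the non-null rows.
SELECT CASE
         WHEN REGEXP_LIKE(PHONE, '^[0-9]{10}$') THEN 'digits_only'
         WHEN REGEXP_LIKE(PHONE, '^\([0-9]{3}\) [0-9]{3}-[0-9]{4}$') THEN 'formatted'
         ELSE 'other'
       END      AS phone_pattern,
       COUNT(*) AS cnt
FROM   CUSTOMER_STG
WHERE  PHONE IS NOT NULL
GROUP  BY CASE
            WHEN REGEXP_LIKE(PHONE, '^[0-9]{10}$') THEN 'digits_only'
            WHEN REGEXP_LIKE(PHONE, '^\([0-9]{3}\) [0-9]{3}-[0-9]{4}$') THEN 'formatted'
            ELSE 'other'
          END;
```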
ETL Informatica developer
Confidential
Responsibilities:
- Understanding the existing business model and customer requirements.
- Worked on the ETL tool Informatica to create mappings and transformations.
- Extensively used ETL to load data from flat files, both fixed-width and delimited, and from the relational database, Oracle 9i.
- Configured Repository Manager and created folders and managed objects in the repository for Informatica.
- Used Filter, Router, Aggregator, Lookup, Expression, and Update Strategy transformations whenever required in the mapping.
- Involved in the complete software development life cycle (SDLC) of the project.
- Developed mapplets and reusable transformations to prevent redundant transformation usage and improve maintainability.
- Used Change Data Capture (CDC) to implement incremental loads, extracting only the data changed at the source since the last extract and loading it into the data marts.
- Involved in migration projects moving data from data warehouses on Oracle/DB2 to Teradata.
- Extensive data modeling experience using dimensional data modeling, star schema modeling, snowflake modeling, and Fact and Dimension tables.
- Identified and fixed bottlenecks and tuned mappings and sessions to improve performance; tuned both the ETL process and the databases.
- Created, scheduled, and monitored sessions and batches to run the mappings.
Environment: Oracle 9i, Informatica 8.6 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Cognos, Test Director, SQL*Plus, UNIX.
ETL Informatica developer
Confidential
Responsibilities:
- Involved in design, development and maintenance of database for Data warehouse project.
- Involved in Business Users Meetings to understand their requirements.
- Designed, developed, and supported the extraction, transformation, and load (ETL) process for data migration with Informatica 8.x.
- Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
- Created complex mappings involving Slowly Changing Dimensions, implementation of business logic, and capturing deleted records in the source systems (see the delete-detection sketch at the end of this section).
- Worked extensively with the connected lookup Transformations using the dynamic cache.
- Worked with complex mappings having an average of 15 transformations.
- Created and scheduled sessions and jobs to run on demand, on time, or only once.
- Monitored Workflows and Sessions using Workflow Monitor.
- Performed unit testing, integration testing, and system testing of Informatica mappings.
- Coded PL/SQL scripts.
- Wrote UNIX and Perl scripts for business needs.
- Coded UNIX scripts to capture data from different relational systems into flat files used as source files for the ETL process.
- Created Universes and generated reports using the star schema.
Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL, PL/SQL, UNIX
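A sketch of how deleted source records could be detected, as referenced above: it assumes a full staging extract and hypothetical DIM_CUSTOMER / STG_CUSTOMER_FULL tables with an illustrative DELETE_FLAG column.

```sql
-- Hypothetical detection of hard deletes at the source: rows present in the
-- dimension but absent from the latest full staging extract are soft-deleted.
-- DIM_CUSTOMER, STG_CUSTOMER_FULL, and DELETE_FLAG are illustrative names.
UPDATE DIM_CUSTOMER d
SET    d.DELETE_FLAG = 'Y',
       d.EFF_END_DT  = CURRENT_DATE
WHERE  d.DELETE_FLAG = 'N'
  AND  NOT EXISTS (SELECT 1
                   FROM   STG_CUSTOMER_FULL s
                   WHERE  s.CUSTOMER_ID = d.CUSTOMER_ID);
```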