Sr. Data Analyst/Informatica MDM Developer Resume
Denver, CO
SUMMARY
- Over 8 years of extensive experience in the complete Software Development Life Cycle (SDLC), including system requirements collection, architecture, design, data analysis, coding, development, testing, production support, maintenance and enhancement on a variety of technological platforms, with special emphasis on Client/Server, Data Warehouse, MDM and Business Intelligence applications in Windows and Unix environments.
- Extensively worked on Informatica Designer tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer), Workflow Manager tools (Task Developer, Worklet Designer and Workflow Designer), the Informatica MDM Hub Console and IDD.
- Strong data warehousing ETL experience using Informatica 9.5.1/9.1/8.6.1/8.5 and IDQ (Informatica Data Quality).
- Strong knowledge of Kimball and Inmon methodologies and models, and of dimensional modeling using star and snowflake schemas.
- Strong skills in Informatica Power Center (versions 8.6.1/8.5/7.1/6.2/5.0), SQL, PL/SQL, stored procedures and triggers; performed debugging, troubleshooting and performance tuning.
- Extensive experience with data extraction, transformation and loading (ETL) from disparate data sources such as relational databases (Oracle, SQL Server, MS Access, Teradata), flat files and XML files, integrating them into a common reporting and analytical data model using Informatica.
- Experience in Designing, Developing, Documenting and Testing of ETL Jobs and mappings to populate tables in Data warehouse and Data marts.
- Expertise in loading data from various operational sources such as Oracle, Teradata, DB2, SQL Server 2005/2008 and flat files into a staging area.
- Experience in Database programming for Data Warehouse (Schemas), proficient in dimensional modeling, Star Schema modeling, and Snowflake modeling.
- Proficiency in data warehousing techniques for data cleansing, Slowly Changing Dimensions (SCD), surrogate key assignment and Change Data Capture (CDC).
- Extensive ETL experience, designing and developing jobs using Informatica.
TECHNICAL SKILLS
Data Warehousing: Informatica 10.0/8.6/8.1/7.1.2/7.1.1/6.2/5.1.2/4.7 (PowerCenter, PowerMart, PowerExchange, Informatica Data Quality, MDM), SQL*Plus, SQL*Loader, Teradata, SQL loader utilities
Data Modeling: Dimensional data modeling, star schema modeling, snowflake modeling, fact and dimension tables, physical and logical data modeling, ERwin 4.1/3.5.2, TOAD, Sybase PowerDesigner, Oracle Warehouse Builder
Databases: Oracle 10g/9i/8i, Teradata, MS SQL Server 2008/2005, DB2, MS Access 97/2000, Sybase, AS/400, iSeries
Reporting Tools: Business Objects XI, Cognos 8.0, MicroStrategy, Tableau
Programming: SQL, PL/SQL, SQL*Plus, XML, Unix shell scripting, Control-M, Robot (iSeries)
Environment: UNIX, Windows XP/Vista, Linux, MS-DOS 6.22, Mainframe
PROFESSIONAL EXPERIENCE
Confidential, Denver, CO
Sr. Data Analyst/Informatica MDM Developer
Responsibilities:
- Involved in various phases of SDLC from requirement gathering, analysis, design, development and testing to production.
- Analyzed and created business/solution functional requirement document to master customer data
- Graphically represented the data flow and process flow by creating Dataflow diagrams and Process flow diagrams
- Analyzed and profiled data from different source systems using the Informatica Data Quality tool.
- Generated data profiling results using the Informatica Data Quality tool.
- Filtered source-system tables to identify the data and tables most relevant to business requirements for decision making by upper management.
- Based on the profiling results, helped MDM developers configure trust for different source systems in the Informatica Hub Console.
- Worked with ETL and MDM developers and created data quality rules to standardize the data as part of MDM project.
- Analyzed functional requirements and created the logical model using ERwin.
- Converted the logical model into the physical model by identifying tables and columns at a granular level using ERwin.
- Created the data flow and conceptual diagram for the business for better understanding and identifying the stages of data.
- Created source to target mapping document for the ETL development and ETL testing team.
- Set up functionality walk-through workshops with business analysts and developers.
- Reviewed the mapping documents from source to target landing tables in CMX ORS schema
- Thoroughly conducted data analysis and gap analysis between source systems for MDM model.
- Designed and developed the Test Suite in the HP ALM tool
- Mapped all test cases to the functional requirements document.
- Uploaded all source-to-target mapping documents in ALM and mapped them to the functional test cases.
- Created table level and attribute level test cases in ALM
- Created complex SQL scripts implementing business transformation logic to test the data in target tables against the developers' ETLs (a representative validation sketch follows this list).
- Created complex SQL scripts to test the transformation logic from the source database to the target database as part of UAT testing.
- Created data quality rule and index design, development and implementation patterns covering cleanse, parse, standardization, validation, scorecard, exception, notification and reporting in IDQ.
- Configured Address Doctor in IDQ to cleanse worldwide address data.
- Worked on data cleansing and standardization using the cleanse functions in Informatica MDM/IDQ.
- Defined the trust and validation rules and set up the match/merge rule sets to get the right master records.
- Configured match rule set property by enabling search by rules in MDM according to Business Rules.
- Involved in creating, monitoring, modifying and communicating the project plan with other team members.
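A minimal sketch of the kind of source-to-target validation SQL described above, assuming a relational source and a landing table; every object name (src_customer, c_customer_land) and the UPPER/TRIM standardization rule are hypothetical stand-ins, not the actual project objects:

```sql
-- Hypothetical validation: target rows whose standardized name disagrees
-- with the source, plus a simple row-count reconciliation.
SELECT s.cust_id,
       s.cust_name AS source_name,
       t.cust_name AS target_name
FROM   src_customer s
JOIN   c_customer_land t
       ON t.cust_id = s.cust_id
WHERE  UPPER(TRIM(s.cust_name)) <> t.cust_name;  -- expected cleansing rule

-- Row counts should reconcile after each load.
SELECT (SELECT COUNT(*) FROM src_customer)    AS src_rows,
       (SELECT COUNT(*) FROM c_customer_land) AS tgt_rows
FROM   dual;
```

A zero-row result from the first query and matching counts from the second would indicate the transformation behaved as specified.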
Environment: Informatica MDM Hub 10.0, Informatica IDQ 9.5.1, Informatica Power Center 9.6/8.6/8.1, Power Exchange 9.6, Oracle 9i/10g, PL/SQL, V2R4, HP-UX, Korn Shell Scripting, Mainframes Z/OS, Control M, SQL Server 2008, Toad for Oracle, Google Cloud, AS400, DB2, Integrity Tool, Robot iSeries, Informatica Cloud, Teradata, SQL Navigator, Razor SQL
Confidential, Birmingham, AL
Sr. Data Analyst / ETL Designer
Responsibilities:
- Collaborated with business stakeholders for requirements gathering and performed data analysis on the legacy systems.
- Involved in preparing the functional specs and the technical design.
- Used Informatica client tools (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer) to define source and target definitions and code the data flow from the source systems to the data warehouse.
- Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter and Source Qualifier.
- Created complex Informatica mappings and reusable mapplets depending on client requirements.
- Used IDQ to create scorecards to manage the audit of auto loans.
- Created an audit table to better manage the load counts in the warehouse database.
- Profiled the data using Informatica Analyst tool to analyze source data (Departments, party and address) coming from Legacy systems and performed Data Quality Audit.
- Worked on data standardization, tables, scorecards, data validation and modification, data masking, columns, rules and database interaction through IDQ/MDM.
- Configured and installed the Informatica MDM Hub server, cleanse server, resource kit and Address Doctor.
- Analyzed the source systems for erroneous data, duplicates and integrity issues.
- Enhanced SQL, dynamic SQL and PL/SQL code to run properly (a small dynamic SQL sketch follows this list).
- Created a weekly process presenting cleansing, validation and duplicate-suspect results to all distributors.
- Managed and designed the physical and logical schema structures for projects.
- Used CDC to identify and move data in the source system that had changed since the last extraction.
- Designed and Developed Oracle PL/SQL procedures, performed Data Import/Export, Data Conversions and Data Cleansing operations.
- Used IDQ for creating profile and scorecard for Audit Purpose.
- Set up trust and match/merge rules for MDM.
- Worked as Data Steward for creating rule based consolidation of data.
- Managed MDM for CCAR and Credit Risk Database in Oracle Exadata.
- Coordinated with the reporting team to develop the Microstrategy reports.
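A small, hypothetical example of the dynamic SQL enhancement work mentioned above (Oracle PL/SQL; the STG_DISTRIBUTOR table, the load_dt column and the one-week retention rule are illustrative assumptions):

```sql
-- Purge a configurable staging table with dynamic SQL and a bind variable.
DECLARE
  v_table   VARCHAR2(30) := 'STG_DISTRIBUTOR';   -- hypothetical table
  v_cutoff  DATE         := TRUNC(SYSDATE) - 7;  -- keep one week of rows
  v_deleted PLS_INTEGER;
BEGIN
  EXECUTE IMMEDIATE
    'DELETE FROM ' || DBMS_ASSERT.SIMPLE_SQL_NAME(v_table) ||
    ' WHERE load_dt < :cutoff'
    USING v_cutoff;                              -- bind, not concatenation
  v_deleted := SQL%ROWCOUNT;
  COMMIT;
  DBMS_OUTPUT.PUT_LINE('Purged ' || v_deleted || ' rows from ' || v_table);
END;
/
```

Binding the cutoff date rather than concatenating it, and validating the identifier with DBMS_ASSERT, are the usual fixes when dynamic SQL misbehaves.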
Environment: Informatica Power Center 8.6/8.1, Power Exchange, Informatica Data Quality, Informatica MDM Hub, Oracle 9i/10g, PL/SQL, V2R4, HP-UX, Windows 2000, Korn Shell Scripting, Mainframes Z/OS, Control M, MicroStrategy, SQL Server 2008, Toad for Oracle.
Confidential, Columbus, OH / Dallas, TX
Sr. ETL DEVELOPER
Responsibilities:
- Performed a major role in understanding the business requirements, designing and loading the data into data warehouse
- Used Informatica client tools (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer) to define source and target definitions and code the data flow from the source systems to the data warehouse.
- Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter and Source Qualifier.
- Worked on Static and Dynamic Caches for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
- Created complex Informatica mappings and reusable mapplets depending on client requirements.
- Created, optimized, reviewed and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings and to verify data in target tables. Loaded data into Teradata using FastLoad, BTEQ, FastExport, MultiLoad and Korn shell scripts.
- Used FastLoad, MultiLoad and TPump for data loading.
- Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
- Used CDC to identify and move data in the source system that had changed since the last extraction (see the incremental-extract sketch after this list).
- Used PowerExchange for SAP FICO input for Moody's rating agency.
- Designed and Developed Oracle PL/SQL procedures, performed Data Import/Export, Data Conversions and Data Cleansing operations.
- Used Autosys for automating batches and sessions.
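A minimal sketch of the incremental extract that CDC implies, assuming the source carries a last-update timestamp and a control table records the previous extract time; etl_control, src_orders and stg_orders are hypothetical names. (PowerExchange CDC itself reads change logs; this timestamp watermark is a simplified stand-in for the same idea.)

```sql
-- Pull only rows changed since the last successful extract.
INSERT INTO stg_orders (order_id, order_amt, updated_at)
SELECT o.order_id, o.order_amt, o.updated_at
FROM   src_orders o
WHERE  o.updated_at > (SELECT last_extract_ts
                       FROM   etl_control
                       WHERE  job_name = 'ORDERS_LOAD');

-- Advance the watermark for the next run.
UPDATE etl_control
SET    last_extract_ts = CURRENT_TIMESTAMP
WHERE  job_name = 'ORDERS_LOAD';
```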
Environment: Informatica Power Center 8.6/8.1, Power Exchange, Oracle 9i/10g, PL/SQL, Teradata V2R4, HP-UX, Windows 2000, Korn Shell Scripting, Mainframes Z/OS, Autosys, OBIEE, MultiLoad, FastLoad, FastExport, Toad for Oracle.
Confidential, Minneapolis, MN
Sr. ETL-Informatica Developer
Responsibilities:
- Collaborated with business stakeholders for requirements gathering and performed data analysis on the legacy systems.
- Involved in preparing the functional specs and the technical design.
- Performed in-depth data analysis and implemented cleansing and data quality processes.
- Designed and implemented the pre-staging and staging approach for cleansing, using Informatica ETL and UNIX.
- Designed and developed a star schema model for the target database using ERwin data modeling.
- Implemented Slowly Changing Dimensions Type 1 and Type 2 for inserting and updating target tables while maintaining history (a minimal Type 2 sketch follows this list).
- Used various active and passive transformations such as Aggregator, Expression, Sorter, Router, Joiner, connected/unconnected Lookup, Stored Procedure, and Update Strategy transformations for data control, cleansing, and data movement.
- Involved in massive data cleansing prior to data staging.
- Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.
- Created sequential/concurrent Sessions/ Batches for data loading process and used Pre & Post Session SQL Script to meet business logic.
- Designed and developed Mapplets for faster development, standardization and reusability purposes.
- Involved in pre and post session migration planning for optimizing data load performance.
- Extensively used Informatica ETL tools to extract data stored in MS SQL 2000, Excel and flat files and load it into a single data warehouse.
- Performed Unit testing during the mapping phase to ensure proper and efficient implementation of the transformations.
- Extensively involved in performance tuning of Informatica mappings/sessions by increasing cache sizes and overriding the existing SQL.
- Worked as team leader to implement incremental load and CDC to the data mart using Informatica for reporting purposes.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Tuned mappings for optimum performance, dependencies and batch design.
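A minimal SCD Type 2 sketch in Oracle SQL, matching the insert/update-with-history pattern above; dim_customer, stg_customer, the tracked cust_addr attribute and the sequence are all hypothetical:

```sql
-- Step 1: expire the current version when a tracked attribute changed.
UPDATE dim_customer d
SET    d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.cust_id   = d.cust_id
               AND    s.cust_addr <> d.cust_addr);

-- Step 2: insert new customers and new versions of changed ones
-- (both now lack a current row).
INSERT INTO dim_customer
       (cust_key, cust_id, cust_addr, eff_start_dt, eff_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL,   -- surrogate key
       s.cust_id, s.cust_addr,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.cust_id      = s.cust_id
                   AND    d.current_flag = 'Y');
```

Type 1 is the degenerate case: a plain UPDATE of the tracked columns with no end-dating.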
Environment: Informatica Power Center 8.6.1, Metadata Manager, Power Exchange 8.6, Data Analyzer, UNIX, Oracle 10g, Flat Files, XML Files, SQL and PL/SQL, ERwin 4.0, Toad for Oracle 10g, Remedy, UltraEdit, Microsoft Project, ESP.
Confidential, Atlanta, GA
ETL Developer/ Data Analyst
Responsibilities:
- Worked with the business analysis team on a regular basis and assisted in implementing agile methodology on the project.
- Designed the ETL architecture and put in place ETL standards for Informatica.
- Administered several Informatica environments, including production environments.
- Created Reusable transformations and Mapplets using transformation developer and Mapplet Designer throughout project life cycle.
- Scheduled Informatica workflows using Informatica scheduler and external scheduler Control Manager.
- Tested Informatica jobs on the pre-production server before moving them to the production servers.
- Created and maintained Connect:Direct connections for secure data transfer between production servers and other environments.
- Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
- Worked with the crontab utility in UNIX to manage and run workflows on schedule.
- Developed and maintained simple and complex end-user reports, including reports from the Informatica repository for internal use, in Business Objects.
- Installed and configured the transformation server for the data replication tool DataMirror.
- Configured several Oracle production servers for replication with the help of DataMirror.
- Supported Production environment for BI tools like Business Objects and Brio Hyperion 6.6.4
- Extensively used SQL for Oracle and Teradata and developed PL/SQL scripts (a small PL/SQL sketch follows this list).
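An illustrative PL/SQL housekeeping procedure of the kind such scripts typically contain; the etl_audit_log table and the log_row_count name are hypothetical:

```sql
-- Record a table's current row count in an audit table.
CREATE OR REPLACE PROCEDURE log_row_count (p_table IN VARCHAR2) IS
  v_count NUMBER;
BEGIN
  EXECUTE IMMEDIATE
    'SELECT COUNT(*) FROM ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table)
    INTO v_count;
  INSERT INTO etl_audit_log (table_name, row_count, logged_at)
  VALUES (UPPER(p_table), v_count, SYSDATE);
  COMMIT;
END log_row_count;
/
```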
Environment: Informatica Power Center 7.1.4/7.1.1, Cognos, Oracle, Teradata, Toad 8.6, UNIX, HP-UX 11i, Sun Solaris 5.4, SQL, FACETS 4.X, MS Office Visio 2003.