Sr. ETL Informatica DIH Admin / L2 Support Resume
Cleveland, Ohio
PROFESSIONAL SUMMARY:
- Over 7 years of IT experience in Data Warehousing with emphasis on Business Requirements Analysis, Application Design, Development, Testing, Implementation and Maintenance of client/server Data Warehouse and Data Mart systems
- Involvement in all phases of SDLC (Systems Development Life Cycle) from analysis and planning to development and deployment
- Experience in the design and development of ETL (Extract, Transform and Load) methodology for data transformation and processing in a corporate-wide ETL solution using Informatica PowerCenter 9.6.1/9.5.1/9.1/8.6/7.1/6.1/5.1, PDI (Pentaho Data Integration), DIH 9.x, IDQ 9.x, IDE 8.x, RTM, Informatica Web Services 4.2 and PowerExchange 9.1
- Highly experienced in Data Warehousing, ETL and Business Intelligence using the Pentaho Suite (Pentaho Data Integration/Kettle, Pentaho BI Server, Pentaho Metadata, Pentaho Analysis Tool and Mondrian OLAP).
- Experience using Web Services and security configuration in Informatica server (end-to-end), PowerExchange, PowerConnect and SAP BW
- Expert in Data Obfuscation Process using SQL Server, SSIS.
- Worked in areas including System Analysis, Design, Development and Testing as a specialist in Informatica MDM and PowerCenter.
- Hands-on experience with Informatica Data Quality toolset and proficiency in IDQ development around data profiling, cleansing, parsing, standardization, validation, matching and data quality exception monitoring and handling.
- Experience in OLTP modeling (2NF, 3NF) and OLAP dimensional modeling (Star and Snowflake) using ERwin Standard Edition/r7.3/4/3.5 (conceptual, logical and physical data models)
- Experience in integrating various data source definitions such as SQL Server, Oracle, MySQL, Flat Files, XML and XSDs
- Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Worklets and Workflows for data loads
- Created Mappings in Mapping Designer to load data from various sources using complex transformations like transaction control, Lookup (Connected and Un-connected), Joiner, sorter, Aggregator, Update Strategy, Filter and Router transformations
- Expertise in Installing and Managing Informatica Power center, Metadata Manager, Data Explorer and Data Quality
- Experience with relational databases such as Oracle 8i/9i/10g/11g, SQL SERVER 2005/2008, Teradata 13/12/V2R5/V2R6 and DB2 UDB
- Experience with Teradata utilities (BTEQ, FASTLOAD, FASTEXPORT, TPUMP, MULTILOAD, Teradata Administrator, SQL Assistant, PMON, Visual Explain)
- Proficient in delivering high data quality by designing, developing and automating audit processes and implementing the corresponding reconciliation processes.
- Strong skills in SQL and PL/SQL packages, functions, stored procedures, triggers and materialized views to implement business logic in Oracle databases.
- Experience with Database SQL tuning and query optimization tools like Explain Plan
- Experience with SQL*Loader, UTL_FILE concepts, Import, SQL*Plus and DBMS packages
- Experience in transferring program files to servers over FTP using tools such as WinSCP, PuTTY and Telnet
- Experience in evaluating data profiling, cleansing, integration and extraction tools (Informatica, Kalido and Composite Software)
- Strong experience in implementing CDC using Informatica Power Exchange 8.6/7.1
- Extensively worked on data integration between SFDC (Salesforce.com) and various relational database sources
- Specialized in EDM, Enterprise Data Warehouse, SQL Server Analysis/Integration Services, Oracle, SSRS and MicroStrategy reports and other related products using Service-Oriented Architecture (SOA)
- Experience in debugging and performance tuning of sources, targets, mappings and sessions.
- Experience in identifying and resolving ETL production root-cause issues, and in maintenance, enhancement and performance tuning of ETL code.
- Extensive Agile and Scrum experience.
- Knowledge of Informatica administration in windows and Linux environment
- Experienced In working with various Scheduling tools like Autosys, Control-M, Informatica Scheduler
- Developed test cases for business and user requirements to perform System/Integration/Performance testing
- Continuously monitored the accuracy of the data and the content of the delivered reports.
- Strong analytical and problem-solving skills
- Excellent communication and interpersonal skills. Ability to work effectively while working as a team member as well as individually.
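The Explain Plan-based SQL tuning mentioned above can be sketched in Oracle SQL as follows. This is a minimal illustration only; the table and column names (customers, transactions, txn_amount) are hypothetical:

```sql
-- Capture the optimizer's plan for a candidate query into PLAN_TABLE
-- (table and column names are illustrative, not from a real schema).
EXPLAIN PLAN FOR
SELECT c.customer_id, SUM(t.txn_amount)
FROM   customers c
JOIN   transactions t ON t.customer_id = c.customer_id
GROUP BY c.customer_id;

-- Render the captured plan in readable form.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

Reading the rendered plan (full scans vs. index access, join order, estimated rows) is what drives the index and rewrite decisions during query optimization.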
TECHNICAL EXPERTISE:
Operating Systems: Windows (NT, 2000/03/XP/Vista/7), Mac OS X (10.4/10.5/10.6), Linux (Red Hat), UNIX (Solaris, AIX v5.2, SunOS 5.10)
Languages: SQL, PL/SQL, T-SQL, UNIX Shell Scripts, Perl, Java, XML
ETL Tools: Informatica PowerCenter 9.6.1/9.5.1/9.1/8.x/7.1/6.2, IDE/IDQ, PDI (Pentaho Data Integration) 4.2, CoSort, DIH, PowerExchange/PowerConnect
Data Modeling: ERwin Standard Edition/r7.3/4/3.5, MS Visio 2010/2007
Databases: Oracle 11g/10g/9i/8i, Siebel, MS SQL Server 2005/2008/2008 R2, SSIS, SSRS, IBM DB2 UDB LUW, Teradata 13/12/V2R5/V2R6
Scheduling Tools: Autosys, Control-M, Informatica Scheduler
Reporting Tools: Tableau, SSRS
Others: Web Services, MS Office, MS Visio, TOAD, FTP, CUBE, SFTP, SCP, GIS, MKS, ALM, TortoiseSVN 1.7.9, PAC2000 v7.6
PROFESSIONAL EXPERIENCE:
Confidential, Cleveland, Ohio
Sr. ETL Informatica DIH Admin/ L2 Support
Responsibilities:
- Identified the source data elements from the leveraged commercial banking systems and related them to the profitability metrics.
- Created database materialized views to consume data from different sources using DB links.
- Experience installing and configuring the Informatica PowerCenter 9.6.1/9.5.1 client in a Windows environment
- Created mappings, mapplets, sessions, workflows and worklets in Informatica PowerCenter 9.6.1/9.5.1.
- Involved in the Data Center Migration and the upgrade of Informatica DIH and Control-M.
- Created generic worklets for session logging of each workflow process.
- Experience configuring the Security Access Manager, resource loading and monitoring operations of the MDM Hub.
- Participated in designing Multidimensional Models in MicroStrategy, using Star and Snowflake schemas.
- Fixing existing MicroStrategy Dashboards and schema objects.
- Well versed in developing complex SQL queries, unions and multiple-table joins, with experience using views.
- Developed ETL processes using Pentaho PDI to extract data from various data sources and populate it into the BI data warehouse.
- Primarily worked on Upgrading the Hub Platform Environment from Informatica MDM 9.1 to Informatica MDM 10.2.
- Deployed SSIS Package into Production server and used Package configuration to export various package properties.
- Utilized Informatica IDQ Analysis and Developer Tool to determine the data quality issues for the MDM data.
- Designed and developed mappings to extract, cleanse, transform and load into target tables using different IDQ transformations.
- Designed and created matrix and tabular, drill down, drill through, and parameterized reports using SSRS and generated these reports using variables, expressions and functions.
- Worked on maintenance of warehouse metadata for further application development.
- Extensively used Teradata utilities like FastLoad and MultiLoad to load data into the staging database
- Hands on experience in tuning mappings, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.
- Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator and Stored Procedure.
- Implemented effective date range mapping (Slowly Changing dimension type2) methodology for accessing the full history of accounts and transaction information.
- Created control files and used SQL*Loader to load the delimited flat files into the relational oracle database.
- Developed Various SQL Server Reporting Services(SSRS) reports which are used to track agent performances.
- Designed pilot system using Pentaho Kettle for ETL enhancements and Pentaho BI Server for publishing the Pentaho Reports on user funds transfer measures.
- Guiding Agile project teams to achieve a high level of performance.
- Configured NDM directories and created scripts for consuming and producing large files across the nodes.
- Provided weekly production support on a rotating shift basis and addressed problem tickets during on-call support.
- Developed Functional requirement Document and batch data load architecture using Microsoft visio and communicated with the concerned stakeholders. Conducted Impact and feasibility analysis
- Created different views using Tableau Desktop that were published to business stakeholders for analysis and customization using filters and actions.
- Responsible for understanding Existing SSIS reports and converting them to MicroStrategy reports and Visual Insights.
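The effective-date-range (SCD Type 2) pattern described above is typically implemented as an expire-then-insert pair of statements. A rough Oracle SQL sketch follows; every table, column and sequence name (dim_account, stg_account, dim_account_seq, branch_cd) is hypothetical:

```sql
-- Step 1: expire the current version of any account whose tracked
-- attribute changed, closing its effective-date range as of yesterday.
UPDATE dim_account d
SET    d.eff_end_dt  = TRUNC(SYSDATE) - 1,
       d.current_flg = 'N'
WHERE  d.current_flg = 'Y'
AND    EXISTS (SELECT 1 FROM stg_account s
               WHERE  s.account_id = d.account_id
               AND    s.branch_cd <> d.branch_cd);

-- Step 2: insert a new current version with an open-ended range.
INSERT INTO dim_account
       (account_key, account_id, branch_cd, eff_start_dt, eff_end_dt, current_flg)
SELECT dim_account_seq.NEXTVAL, s.account_id, s.branch_cd,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_account s
WHERE  NOT EXISTS (SELECT 1 FROM dim_account d
                   WHERE  d.account_id = s.account_id
                   AND    d.current_flg = 'Y');
```

In an Informatica mapping the same split is expressed through an Update Strategy transformation routing changed rows to DD_UPDATE and new versions to DD_INSERT.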
Environment: Informatica MDM 10.2, Informatica IR 10.0, PowerExchange 9.1, MicroStrategy Tool Suite 10.2, Oracle 11g, Microsoft SQL Server, SSIS, SSRS, PuTTY, WinSCP, SQL, PL/SQL, TOAD, SQL*Loader, Shell Scripts, DIH, Microsoft Visio 2010, Tableau 9.0, TortoiseSVN, Autosys CA Workload Control Center (Scheduler), Windows, UNIX, PAC2000
Confidential, Dallas, TX
Sr. ETL Informatica Admin/ L2 Production support Engineer
Responsibilities:
- Identified the source critical data elements that are mandatory for inclusion in the S1 vendor XML and generated them using Informatica.
- Created an ETL process that generates ACH templates/profiles for migrating to the new S1 application data that is still outside of the S1 XML
- Created the S1-generated billing files, currently sent from the existing systems, for the existing AAA and CMS billing systems
- Responsible for setting up jobs in the EDW/DIH/ETL integration environment to complete on a daily basis.
- Developed many MDX queries to create measures that fit the business logic.
- Created standard NACHA files, along with the reusable sessions, command tasks, worklets and workflows that send the files to the existing PEP system, using PDI.
- Experience with Informatica PowerExchange 8.6 for Salesforce (SFDC Connector).
- Created the Information Reporting and Automated Clearing House (ACH) data model using the ERwin r7 data modeler: Star and Snowflake schema modeling, FACT and Dimension tables, physical and logical modeling.
- Worked with Informatica IDQ to determine Data Quality issues and Remediation process for bad data.
- Publishing customized interactive reports and dashboards, report scheduling using Tableau server.
- Worked on Development, testing, implementation and maintenance of client/server Data Warehouse and Data Mart systems
- Using Dimensional Modeling Techniques to create Dimensions, Cubes and Fact tables.
- Extracted data from various sources like flat files, SQL Server 2008 R2 and XML files, and loaded it into an Oracle 11g database.
- Combined views and reports into interactive dashboards in Tableau Desktop that were presented to Business Users, Program Managers, and End Users.
- Provide technical support for the IDQ, DIH and MDM Environment
- Used Informatica PowerCenter to configure Web Services and data security
- Involved in Data Profiling using Data Explorer.
- Built Physical Data Objects and developed various mappings and mapplets/rules using Informatica Data Quality (IDQ) based on requirements to profile, validate and cleanse the data.
- Worked on Multidimensional Models and created reports in Report Studio using Cubes as data sources.
- Created Informatica maps using various transformations like Web services consumer, XML, HTTP transformation, Source Qualifier, Expression, Look up, Stored procedure, Aggregate, Update Strategy, Joiner, Filter and Router
- Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy while designing and optimizing mappings
- Developed Oracle PL/SQL stored procedures, functions and packages to facilitate the functionality of various modules
- Extensively used TOAD utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping
- Converted manual jobs to fully automated processes using the Autosys scheduler running UNIX Bourne shell scripts.
- Defined calculated members and named sets, and executed other script commands using MDX expressions.
- Generated reports based on the statistical analysis of the data using SQL Server Reporting Services (SSRS).
- Created UNIX shell scripts to trigger the workflows, parse the files and monitor the loads. Created various batch Scripts for scheduling various data cleansing scripts and loading process
- Identified process bottlenecks and implemented performance tuning at mapping and session levels
- Designed and developed stored procedures, views and tables necessary to support SSRS reports.
- Expertise in doing Unit Testing, Integration Testing, System Testing and Data Validation for Developed Informatica Mappings
- Knowledge of Agile Project Management methodology.
- Used Debugger to troubleshoot cause of invalid results, if any, after running a process
- Provided production support for the Executive Management Information System (EMIS), a web reporting portal to Informatica and Teradata data sources
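A stored procedure of the kind described above, supporting load auditing from the ETL batch, might look roughly like this. It is a sketch only; the procedure, table and column names (log_load_audit, etl_audit_log) are hypothetical:

```sql
-- Hypothetical PL/SQL procedure: records source vs. target row counts
-- for a load run and flags any mismatch for the reconciliation process.
CREATE OR REPLACE PROCEDURE log_load_audit (
    p_job_name IN VARCHAR2,
    p_src_cnt  IN NUMBER,
    p_tgt_cnt  IN NUMBER
) AS
BEGIN
    INSERT INTO etl_audit_log (job_name, src_count, tgt_count, load_ts, status)
    VALUES (p_job_name, p_src_cnt, p_tgt_cnt, SYSTIMESTAMP,
            CASE WHEN p_src_cnt = p_tgt_cnt THEN 'BALANCED' ELSE 'MISMATCH' END);
    COMMIT;
END log_load_audit;
/
```

Such a procedure is typically invoked from a post-session stored-procedure transformation or a command task, with the counts taken from session statistics.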
Environment: Informatica Multi Domain MDM 9.1, Informatica PowerCenter 9.5, Salesforce, PowerExchange 9.1, PDI Kettle 4.2, CoSort, Oracle 11g, Web Services, Microsoft SQL Server 2008 R2, SSRS, XML, CUBE, PuTTY, WinSCP, SQL, PL/SQL, TOAD, DIH, Teradata 11, ERwin r7, Shell Scripts, Microsoft Visio 2010, MKS Integrity Client, GIS, Autosys CA Workload Control Center (Scheduler), Windows, UNIX
Confidential, Miami, FL
Sr.ETL/Informatica Developer/Admin
Responsibilities:
- Analyzed business requirements and worked closely with various application teams and business teams to develop ETL procedures that are consistent across all applications and system
- Interacted with Business Users and Managers in gathering business requirements
- Wrote Informatica ETL design documents, establish ETL coding standards and perform Informatica mapping reviews
- Experience in Technical Architecture design, dimensional modeling and BI application design using the Kimball Lifecycle Methodology
- Configured Message Queues to alert downstream systems upon insert or update actions in the MDM Hub.
- Experience with Informatica PowerExchange 8.6 for Salesforce (SFDC Connector).
- Extensively used Teradata utilities like FastLoad and MultiLoad to load data into the staging database
- Wrote BTEQ scripts to load data from staging to target tables
- Extensively worked on Power Center 8.6 Client Tools like Power center Designer, Workflow Manager, and Workflow Monitor
- Created mappings using Informatica IDQ (Data Quality) to cleanse data and feed it back into the tables.
- Extensively worked on Power Center 8.6 Designer client tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Analyzed the source data coming from different sources (Oracle and SAP Flat files) and worked on developing ETL mappings
- Created Parameter files and validation scripts
- Developed various Oracle subprograms like stored procedures, functions and packages using PL/SQL to ensure data integrity and security
- Created custom MultiLoad scripts run from a command task
- Created process control and metadata for Informatica jobs
- Created reusable Sessions, command tasks, worklets and workflows in Workflow Manager
- Utilized Informatica toolset (Informatica Data Explorer, and Informatica Data Quality) to analyze legacy data for data profiling
- Assisted with peer code reviews and testing of development team's T-SQL code.
- Worked on Control-M for defining, scheduling and monitoring jobs
- Responsible for the Performance tuning at the Source Level, Target Level, Mapping Level and Session Level
- Worked on Development, testing, implementation and maintenance of client/server Data Warehouse and Data Mart systems
- Provided product demonstrations and technical guidance to business users on the capabilities of Informatica MDM.
- Performed Unit testing and Data validation testing using Validation scripts.
- Created Data Validation document, Unit Test Case Document, Technical Design Document, Informatica Migration Request Document and Knowledge Transfer Document
- Proven accountability, including professional documentation and weekly status reports
- Provided production support for the Executive Management Information System (EMIS), a web reporting portal to Informatica and Teradata data sources
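The BTEQ staging-to-target loads mentioned above usually follow a logon / SQL / error-check pattern. A minimal sketch, assuming hypothetical database, table and file names (edw.account_fact, stg.account_stage, /secure/td_logon.btq):

```sql
-- Hypothetical BTEQ script: credentials kept outside the script
-- in a secured logon file executed via .RUN.
.RUN FILE = /secure/td_logon.btq;

-- Move the current day's cleansed rows from staging to the target table.
INSERT INTO edw.account_fact
SELECT account_id, branch_cd, balance_amt, load_dt
FROM   stg.account_stage
WHERE  load_dt = CURRENT_DATE;

-- Fail the batch job with a non-zero return code if the insert errored,
-- so the scheduler (e.g. Control-M) can flag and rerun it.
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

The non-zero `.QUIT` code is what lets the scheduling tool distinguish a failed load from a clean run.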
Environment: Informatica Power Center 8.6, Informatica Multi Domain MDM, Oracle 10g, IDQ, SAP Flat Files, Web Services, Teradata 11, UNIX, Win XP, SQL * Plus, Transact-SQL, Control-M, Putty.
Confidential, Minnetonka, MN
Informatica Developer
Responsibilities:
- Responsible for business analysis and requirements collection.
- Analyzed star schemas in dimensional modeling and identified suitable dimensions and facts for the schema.
- Involved in the Design and development of Data Mart and populating the data from different data sources using Informatica.
- Experience in developing reports using SSRS 2008 and SSRS 2008 R2
- Documented data conversion, integration, load and verification specifications.
- Used Informatica PowerExchange 7.1 to extract data from the legacy mainframe source system into the staging database
- Used SSIS jobs to import data from flat files into the application tables.
- Translated high-level designs into simple ETL code following mapping standards.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Extensively used ETL to load data from wide range of sources such as flat files and Oracle 9i to XML Documents
- Used transformations like Joiner, Expression, Connected and Unconnected lookups, Filter, Aggregator, Store Procedure, Rank, Update Strategy, Java Transformation, Router and Sequence generator.
- Developed, documented and executed unit test plans for the components
- Involved in Informatica administrative work such as creating Informatica folders, repositories and managing folder permissions
- Used XML schema to extract data from Oracle 9i into XML using Export Option in Informatica.
- Created pre-session and post-session shell scripts and mail-notifications.
- Developed Shell scripts using UNIX to drop and recreate indexes and key constraints
- Used TOAD to develop Oracle PL/SQL stored procedures
- Creating the design and technical specifications for the ETL process of the project
- Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources
- Collected performance data for sessions and performance tuned by adjusting Informatica session parameters
- Worked with the various enterprise groups to document user requirements, translate requirements into system solutions and produce development, testing and implementation plan and schedule
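The drop-and-recreate index pattern from the shell scripts described above reduces per-row index maintenance during bulk loads. A rough Oracle SQL sketch, with a hypothetical index and table name (idx_orders_customer, orders):

```sql
-- Before the bulk load: drop the index so inserts skip index maintenance.
DROP INDEX idx_orders_customer;

-- ... the bulk load session runs here ...

-- After the load: recreate the index and refresh optimizer statistics
-- so subsequent queries get accurate plans.
CREATE INDEX idx_orders_customer ON orders (customer_id);
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'ORDERS');
```

In practice these statements are wrapped in pre-session and post-session shell scripts so the scheduler runs them around the Informatica session automatically.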
Environment: Informatica PowerCenter/PowerMart 7.1, PowerConnect, Oracle 9i, Flat Files, XML, SQL Server 2005, SSIS, SSRS, TOAD, Informatica Scheduler, Shell Scripts, UNIX, Windows XP
Confidential
ETL Developer
Responsibilities:
- Extracted data from different sources using Informatica PowerCenter 6.1
- Used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations
- Involved in producing the source-to-target documentation and design documentation for the Data Warehouse dimensional upgrades.
- Extensively used Informatica for loading the historical data from various tables for different departments.
- Cleansed the source data, standardized vendor addresses, extracted and transformed data with business rules, and built mapplets using Informatica Designer
- Extracted data from different sources of databases. Created staging area to cleanse the data and validated the data
- Designed and developed complex Aggregator, Expression, Filter, Joiner, Router, Lookup and Update Strategy transformation rules
- Developed schedules to automate the update processes and Informatica sessions and batches
- Analyzed, designed, constructed and implemented ETL jobs using Informatica
- Created users, user groups and database connections, and managed user privileges using Supervisor.
- Created universes and reports
- Involved in creating Desktop Intelligence and Web Intelligence reports
Environment: Informatica Power Center 6.1, Windows NT, Excel, SQL Server 2005, ERwin 3.5