Sr. Informatica/ETL Developer Resume
Bellevue, WA
SUMMARY
- INFORMATICA/ETL Developer with 9 years of overall experience in the IT industry, with an emphasis on Data Warehousing tools and industry-accepted methodologies and procedures.
- Extensively worked on ETL and SQL for over 6 years using Informatica.
- Technical expertise in ETL methodologies and Informatica PowerCenter 7.x/8.x/9.x, including client tools (Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager/Monitor), server tools (Informatica Server Manager, Repository Server Manager), and PowerExchange.
- Expertise in Data Warehousing, Data Migration, Data Modeling, and Data Cleansing.
- Proficient in using the ETL tool Informatica PowerCenter 9.x to develop Data Warehouse loads, with work experience focused on Data Acquisition and Data Integration.
- Extensive knowledge of Master Data Management (MDM) concepts.
- Extensive experience designing, managing, and administering MDM/DIW objects using the Kalido DIW/MDM 8.5/9 tool.
- Experience installing and configuring Informatica in Windows and UNIX environments.
- Designed and developed a robust end-to-end ETL process for the efficient extraction, transformation, and loading of source data into staging and then into the data mart.
- Directly responsible for the Extraction, Transformation & Loading of data from multiple sources into the Data Warehouse. Complete knowledge of data warehouse methodologies (Ralph Kimball, Inmon), ODS, EDW, and metadata repositories.
- Expertise in the Extraction, Transformation & Loading of data across heterogeneous sources and targets.
- Experience in Performance Tuning of Informatica (sources, mappings, targets, and sessions) and in tuning SQL queries.
- Extensive experience in developing the Workflows, Worklets, Sessions, Mappings and configuring the Informatica Server using Informatica Power Center.
- Converted SSIS packages into Informatica mappings.
- Used Pentaho Data Integration Designer to create ETL transformations.
- Developed UNIX shell scripts that send reports to the client over the network via file transfer protocols (FTP and SFTP) and generate a log file that keeps a history of the FTP'd reports.
- Strong knowledge of Relational Database Concepts, Entity Relationship Diagrams, and Normalization and Denormalization concepts.
- Worked directly with non-IT business analysts throughout the development cycle and provided production support for ETL.
TECHNICAL SKILLS
Databases: Oracle 9i/10g/11g, SQL Server 2012/2008 R2/2008/2005, DB2, MySQL 5.0/4.1, MS-Access.
ETL Tools: Informatica (PowerCenter 9.x, 8.x, 7.x)
Programming Skills: C, SQL, HTML
Methodologies: Dimensional Modeling - Star/Snowflake; Data Modeling - Logical, Physical
Scheduling & Office Tools: TWS for job scheduling, UltraEdit, Microsoft Office tools, Microsoft Visio, Lotus Notes, Microsoft Outlook Express
Operating Systems: UNIX (Sun Solaris, HP-UX), Windows 95/98/2000/NT/XP
Other tools: Editors (SQL Navigator, Toad, QMF)
PROFESSIONAL EXPERIENCE
Confidential, Bellevue, WA
Sr. Informatica/ETL Developer
Responsibilities:
- Involved in the Review of Requirements and development of Technical Design documents.
- Participated in impact analysis.
- Coordinated with the business team and DBAs.
- Analyzed the source data and decided on the appropriate extraction, transformation, and load strategy.
- Developed Informatica mappings/mapplets, sessions, and workflows for data loads and automated the data loads using UNIX shell scripts (see the sketch after this list).
- Designed and developed UNIX scripts to automate tasks.
- Developed SQL DDL and DML statements.
- Prepared an ETL mapping document for every mapping and a Data Migration document for the smooth transfer of the project from the development environment to testing and then to production.
- Developed Mapplets and Worklets for reusability.
- Involved in upgrading Informatica 8.6 to Informatica 9.1 and worked on Informatica 9.5.1.
- Installed and configured Informatica's ILM product suite for data masking (data privacy), file archive loads, data discovery, and data visualization for data archiving.
- Created projects in ILM for data masking with parameters such as commit interval, encryption key, and degree of parallelism.
- Developed FastExport scripts to send data files to other teams.
- Developed mappings that load data mart data into Teradata.
- Implemented performance tuning in mappings and sessions by identifying bottlenecks and implementing effective transformation logic.
- Used Informatica web services to create work requests/work items for the end user.
- Used the HTTP transformation for web services.
- Used the Informatica Web Service transformation to read data from a web service source and write data to a web service target.
- Worked on SQL Server Integration Services (SSIS) to integrate and analyze data from multiple homogeneous and heterogeneous information sources (CSV, Excel, Oracle DB).
- Created SSIS packages to extract, transform, and load data using transformations such as Lookup, Derived Column, Conditional Split, Aggregate, Pivot, Slowly Changing Dimension, Merge Join, and Union All.
- Developed custom logging so users can tell when a row is inserted into the custom logging table by each SSIS package that executes.
- Developed several reports that made use of sub-reports, report grouping, and Pentaho Sub Confidential.
- Created dashboards in Pentaho using Pentaho Dashboard Designer.
- Designed and developed a series of complex Business Intelligence solutions using Pentaho Report Designer.
- Created the Salesforce connections in Informatica PowerCenter.
- Extracted data from the Salesforce legacy system, SalasVision, and Charles River (trading platform).
- Integrated data from Oracle into Salesforce (SFDC) using Informatica Cloud.
- Designed and developed mappings with optimal performance using Aggregator, Joiner, B2B Transformation, Sequence Generator, cached and uncached Lookup (connected and unconnected), source/target pre- and post-load Stored Procedure transformations, Update Strategy, Union, etc.
- Experience in Informatica B2B Data Exchange using unstructured and structured data sets.
- Created a project in B2B DT Studio to load data from Excel into Oracle.
- Worked with the event view, schema view, binary source view, and repository view of B2B Data Transformation Studio.
- Retrieved data from unstructured sources such as XML using B2B Data Transformation.
- Created mappings using B2B DT Studio with the script pane and a target schema for XML and Excel source data.
- Worked on designing catalogs, categories, sub-categories, and user roles using Kalido MDM 9.
- Populated the MDM tables using a staging table feed or a file feed.
- Developed a Windows batch script to load the master data into the Kalido MDM tables.
- Shared knowledge with business users on how to work with the MDM interface, key in master data, and authorize and publish the data.
- Maintained Development, Test, and Production mapping migrations using Repository Manager. Involved in enhancement and maintenance activities of the data warehouse, including performance tuning.
- Experience in analyzing reporting requirements.
- Extensively used Stored Procedures, Functions and Packages using PL/SQL scripting for creating Connected and Unconnected Stored Procedure Transformations.
- Created AutoSys schedules/jobs to automate the ETL load process.
- Involved in Unit Testing and User Acceptance Testing to verify that the data loaded into the targets, extracted from different source systems, was accurate and met user requirements.
- Designed the ETL process and defined strategies for Type 1/Type 2/Type 3 data loads.
- Developed ETL mappings to load data into heterogeneous applications on different databases.
- Analyzed source system functionality, identified and logged issues, and performed regression testing.
- Maintained traceability from requirements through design to test results.
- Incorporated ETL logic in mappings to update DB2, Oracle, and SQL Server databases using flat-file, DB2, Oracle, and SQL Server sources.
- Created test cases, test specifications, and test reports; executed the test specs and documented the results.
- Identified and resolved performance issues in extract SQL queries for quick data retrieval.
- Provided production support for installed mappings.
- Performed code reviews and validated the mappings.
- Designed the workflow schedule for monthly loads.
- Ran UNIX scripts to migrate data from one server to other servers.
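A minimal sketch of the kind of UNIX wrapper used to automate these loads with pmcmd (PowerCenter's standard command-line utility); the service, domain, folder, workflow names, and paths are hypothetical.

```sh
#!/bin/ksh
# Minimal sketch of a load-automation wrapper around pmcmd.
# Service, domain, folder, and workflow names are hypothetical.
INFA_USER="etl_user"
INFA_PWD="$(cat /secure/.infa_pwd)"        # credentials kept outside the script
LOG="/var/log/etl/wf_daily_load_$(date +%Y%m%d).log"

# -wait blocks until the workflow finishes so the exit code is meaningful
pmcmd startworkflow -sv IS_DW_PROD -d Domain_DW \
    -u "$INFA_USER" -p "$INFA_PWD" \
    -f FLD_DATAMART -wait wf_daily_load >> "$LOG" 2>&1

if [ $? -ne 0 ]; then
    echo "wf_daily_load FAILED $(date)" >> "$LOG"
    exit 1
fi
echo "wf_daily_load completed $(date)" >> "$LOG"
```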
Environment: Informatica PowerCenter 9.5.1 HF3/9.1.0 HF1, Informatica B2B Data Transformation (DT)/Data Exchange (DE), Informatica ILM, SQL Server 2012/2008 R2, SSIS, DB2, Oracle 11g, Oracle EBS, Pentaho, Salesforce.com, SQL, UNIX files, COBOL files, XML, Toad, AutoSys scheduler.
Confidential, Denver, CO
Sr. Informatica Developer
Responsibilities:
- Analyzed the client's requirements.
- Prepared Master Reference Documents and Detailed Design Specifications from the Source Field Matrix provided.
- Strong in UNIX shell scripting; developed UNIX scripts using the pmcmd utility and scheduled ETL loads using utilities like the UC4 automation tool.
- Developed UNIX shell scripts that send reports to the client over the network via file transfer protocols (FTP and SFTP) and generate a log file that keeps a history of the FTP'd reports (see the sketch after this list).
- Identified the complexity involved and prepared the Work Allocation Sheet.
- Identified and scripted the required DDL changes.
- Created mappings, reusable transformations, sessions, and workflows per the design docs.
- Tested and debugged the mappings and analyzed the data for proper population.
- Involved in Data Modeling Using Erwin.
- Created SSIS packages using BIDS.
- Imported data from AS400 into the dbo schema of SQL Server 2008 through the Import Wizard and stored it as an SSIS package.
- Worked with heterogeneous sources from various channels such as Oracle, SQL Server, and flat files.
- Worked on the Informatica tools Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Extensively used transformations like Aggregator, Router, Joiner, Expression, Lookup, Update Strategy, and Sequence Generator.
- Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Workflow Manager, pmcmd, and scheduling tools. Generated completion messages and status reports using Workflow Manager.
- Attended a POC for Talend Open Studio.
- Created and executed the ODS jobs using Talend Open Studio.
- Debugged numerous issues in Talend.
- Worked closely with the administrators on the configuration of Talend Open Studio.
- Integrated the Salesforce data into the Oracle target using Informatica Cloud.
- Validated the Salesforce target data in the Force.com application.
- Created Invoices, Cash Receipts, RMA, and RMA Start records in Salesforce from Oracle EBS.
- Extensively worked in the performance tuning of ETL mappings and sessions.
- Developed XMLs using the XML Parser transformation for large-scale data transmission.
- Developed functional and Technical design documents for implementing Informatica web services.
- Wrote shell scripts for Informatica pre-session and post-session commands.
- Wrote PL/SQL stored procedures and functions for Stored Procedure transformations.
- Analyzed session log files to resolve errors in mapping or session configuration whenever a session failed.
- Identified and resolved issues during UAT testing.
- Set up the production environment for the final promote.
- Performed performance tuning of the Informatica run time.
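A minimal sketch of the FTP/SFTP report-delivery script described above, assuming key-based SFTP authentication; the host, account, and paths are hypothetical.

```sh
#!/bin/ksh
# Minimal sketch of the report-delivery script with a history log.
# Host, account, and paths are hypothetical; key-based auth is assumed.
REPORT="/data/reports/daily_report.csv"
HISTLOG="/var/log/etl/ftp_history.log"

# sftp -b - reads its batch commands from standard input
sftp -b - client_user@client.example.com <<EOF >> "$HISTLOG" 2>&1
put $REPORT /inbound/daily_report.csv
bye
EOF

# Append one history line per transfer so the log keeps a running record
if [ $? -eq 0 ]; then
    echo "$(date '+%Y-%m-%d %H:%M:%S') SENT $REPORT" >> "$HISTLOG"
else
    echo "$(date '+%Y-%m-%d %H:%M:%S') FAILED $REPORT" >> "$HISTLOG"
    exit 1
fi
```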
Environment: Informatica PowerCenter 9.1.0 HF1, SQL Server 2008 R2, SSIS, DB2, UNIX Shell Scripting, TOAD for DB2, flat files, Windows 7, Perl Script.
Confidential, Everett, WA
Informatica Developer/ETL Developer
Responsibilities:
- Designed the mappings between sources (external files and databases) to targets.
- Produced the logical design of the warehouse by identifying key data points and their dimensions and measures.
- Identifying redundancy and taking steps to control it.
- Performed a major role in understanding the business requirements and in designing and loading data into the data warehouse (ETL). Designed a star schema using dimensional modeling with the Erwin design tool.
- Extensively used VBScript to check file status, create files, and move files from one directory to another.
- Extensively used ETL (Informatica) to load data from source to ODS.
- Extensively worked on creating and reviewing SQL scripts and UNIX shell scripts.
- Used UNIX shell scripts for automation of ETL batch jobs (see the sketch after this list).
- Successfully identified the entities to be retired in the legacy system using the Informatica ILM application.
- Identified and archived the legacy data to tape for future reference using Informatica ILM.
- Involved in creating database procedures and triggers.
- Used Aggregator, Sequence Generator, Joiner, and other transformations in the data population process.
- Wrote SQL procedures.
- Validated the data in the warehouse and data marts after the load process by balancing it against the source data.
- Built the target database in Oracle.
- Involved in production support and involved in generation of reports using Business Objects.
- Extensively used Calculations, Variables, Breaks, Sorting, and Alerts for creating Business Objects reports.
- Interacted with end users regularly to generate the required reports using Business Objects functionality such as Slice and Dice, Drill Down, Master/Detail, User Responses, and Formulas.
- Scheduled the reports in Business Objects.
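A minimal sketch of the kind of pre-load file check used to automate these ETL batch jobs, a shell analogue of the file-status and file-move logic described above; all paths and file names are hypothetical.

```sh
#!/bin/ksh
# Minimal sketch of a pre-load file check for an ETL batch job, mirroring
# the file-status/file-move logic described above. Paths are hypothetical.
SRC="/data/inbound/customers.dat"
STAGE="/data/staging/customers.dat"
ARCHIVE_DIR="/data/archive"

# Fail fast if the source file has not arrived or is empty
if [ ! -s "$SRC" ]; then
    echo "$(date) missing or empty source file: $SRC" >&2
    exit 1
fi

# Stage a copy for the load, then archive the original with a timestamp
# so the same file is never picked up twice
cp "$SRC" "$STAGE"
mv "$SRC" "$ARCHIVE_DIR/customers.dat.$(date +%Y%m%d%H%M%S)"
```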
Environment: Oracle 9i, PL/SQL, Teradata, UNIX Shell Scripts, Windows NT, Erwin, Informatica PowerCenter 8.6.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), PowerExchange 8.0, Business Objects.
Confidential, Colorado Springs, CO
Informatica Developer/ ETL Developer
Responsibilities:
- Involved in identifying dimensions and facts to create enterprise data warehouse and model.
- Involved in Designing a Star-Schema based warehouse after understanding the business logic.
- Created mappings using various Transformations like Aggregator, Expression, Filter, Router, Joiner, Lookup, Update strategy, Source Qualifier, Sequence generator, Stored Procedure and Normalizer.
- Retrieved data for the property and casualty insurance customer/claims Database.
- Developed various Mappings & Mapplets to load data from various sources using different Transformations.
- Integrated data from external sources like SWIFT and XML using the B2B Data Exchange tool of Informatica.
- Generated events, errors, transactions, values, etc., on the B2B dashboard for business decisions.
- Worked with the Data Integration Hub of Informatica B2B.
- Implemented performance tuning in mappings by identifying bottlenecks and implementing effective transformation logic.
- Performed various update strategies using Lookup and Update Strategy transformations.
- Created stored procedures to populate sample data and carried out test loads.
- Loaded data from flat files and XML files into temporary tables in the Oracle database using SQL*Loader (see the sketch after this list).
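A minimal sketch of a SQL*Loader load of a flat file into a temporary Oracle table, as described above; the table, column, and file names are hypothetical.

```sh
#!/bin/ksh
# Minimal sketch of a SQL*Loader load of a flat file into a temporary
# Oracle table. Table, column, and file names are hypothetical.
cat > tmp_claims.ctl <<'EOF'
LOAD DATA
INFILE 'claims.dat'
APPEND
INTO TABLE tmp_claims
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(claim_id, policy_id, claim_date DATE 'YYYY-MM-DD', amount)
EOF

# Credentials would normally come from a secured file or environment variable
sqlldr userid=etl_user/"$ORA_PWD" control=tmp_claims.ctl log=tmp_claims.log

# sqlldr exits 0 on success; non-zero signals rejected rows or errors
if [ $? -ne 0 ]; then
    echo "SQL*Loader reported problems; see tmp_claims.log" >&2
    exit 1
fi
```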
Environment: Oracle 9i, Informatica PowerCenter 8.1.1, UNIX/AIX, Windows 2000, SQL*Loader, SQL Server 2005, Erwin 4.1, Toad, SQL, PL/SQL
Confidential, Irving, TX
Informatica Developer
Responsibilities:
- Developed various Mappings and Transformations using Informatica Designer.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
- Involved in the design, development, and maintenance of the database for the data warehouse project.
- Designed procedures for getting the data from other systems into the data warehousing system.
- Standardized the data so that data from various Business Units could be stored in the data warehouse tables.
- Designed and customized data models for a Data Mart supporting data from multiple sources in real time.
- Created various transformations like Joiner, Aggregator, Expression, Filter, and Update Strategy.
- Carried out extensive system study, design, development, and testing in the Oracle environment to meet customer requirements.
- Performed various SQL queries for business analysts.
- Analyzed the data and built data warehouses using SQL, PL/SQL, and SAS (see the sketch after this list).
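A minimal sketch of running an ad-hoc analysis query through SQL*Plus for a business analyst; the connection string, schema, and table names are hypothetical.

```sh
#!/bin/ksh
# Minimal sketch of an ad-hoc analysis query run through SQL*Plus for a
# business analyst. Connection, schema, and table names are hypothetical.
sqlplus -s etl_user/"$ORA_PWD"@DWPROD <<'EOF' > sales_by_bu.txt
SET PAGESIZE 50 LINESIZE 132 FEEDBACK OFF
COLUMN business_unit FORMAT A30
SELECT business_unit,
       SUM(sale_amount) AS total_sales
FROM   dw.fact_sales
GROUP  BY business_unit
ORDER  BY total_sales DESC;
EXIT
EOF
```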
Environment: Informatica 7.x, Oracle 9i, PL/SQL, SQL*Loader, Windows, UNIX.