ETL/Informatica Developer Resume
Greensboro, NC
SUMMARY
- 7+ years of work experience in ETL (Extraction, Transformation and Loading) of data from various sources into EDW, ODS and data marts using the data integration tool Informatica Power Center 9.0.1/8.x/7.x/6.x in the Insurance, Retail, Banking, Telecom and Healthcare industries.
- Experience in full-lifecycle implementation of data warehouses, Operational Data Stores (ODS) and business data marts with dimensional modeling techniques (Star Schema and Snowflake Schema) using Kimball methodology.
- Experience in the design, development and maintenance of software applications in information technology, with strong grounding in data warehouse and RDBMS concepts.
- Expertise in Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
- Experience in performance tuning of targets, sources, sessions, mappings and transformations.
- Worked on exception-handling mappings for data quality, data profiling, data cleansing and data validation.
- Experience in configuring Informatica MDM Data Director, Hierarchy Manager and the Match/Merge process.
- Hands-on experience with Informatica MDM Hub configuration: data modeling, data mappings (landing, staging and base objects), data validation, match and merge rules, writing and customizing user exits, and customizing and configuring Business Data Director (BDD) / Informatica Data Director (IDD) applications.
- Experience in developing reusable Informatica components for use across projects.
- Extensively worked with Informatica mapping variables, mapping parameters and parameter files.
- Worked on Slowly Changing Dimensions (Type 1, Type 2 and Type 3) in different mappings as per the requirements (an illustrative Type 2 sketch follows this summary).
- Worked with databases such as Oracle, DB2, SQL Server and Microsoft Access, and integrated data from flat files (fixed-width and delimited), XML files and COBOL files.
- Experience in writing stored procedures, functions, triggers, views and materialized views on Oracle 10g/9i/8i, DB2 and SQL Server 2008/2005/2000 using PL/SQL and T-SQL.
- Extensively worked on monitoring and scheduling of jobs using UNIX shell scripts.
- Worked with pmcmd to interact with the Informatica server from the command line and execute shell scripts.
- Good knowledge of migrating environments from Informatica 7.x and 8.x to 9.x.
- Experience with ER data modeling tools such as Erwin, ER/Studio and Visio for developing fact and dimension tables and logical and physical models.
- Expertise with tools such as Toad, Autosys and SQL Server Management Studio. Involved in unit testing, functional testing and user acceptance testing on UNIX and Windows environments.
- Good exposure to the Software Development Life Cycle (SDLC) and OOAD techniques.
- Completed documentation for detailed work plans and mapping documents.
- Experience in managing onsite-offshore teams and coordinating test execution across locations.
- Excellent communication, documentation, team problem-solving, analytical and programming skills in a fast-paced, quality-conscious, multi-tasking environment.
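The Type 2 handling referenced above is normally built inside Informatica mappings (Lookup plus Update Strategy), but the equivalent set-based logic can be sketched in SQL. This is a minimal illustration only; the dim_customer, stg_customer and dim_customer_seq names and the compared columns are hypothetical:

```sql
-- Illustrative SCD Type 2 load in Oracle SQL; table, column and sequence
-- names (dim_customer, stg_customer, dim_customer_seq) are hypothetical.

-- Step 1: expire the current row for customers whose attributes changed.
UPDATE dim_customer d
   SET d.effective_end_date = TRUNC(SYSDATE) - 1,
       d.current_flag       = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.customer_name <> d.customer_name
                       OR s.address <> d.address));

-- Step 2: insert a new current version for changed and brand-new customers
-- (unchanged customers still have a current row, so they are skipped).
INSERT INTO dim_customer
       (customer_sk, customer_id, customer_name, address,
        effective_start_date, effective_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.address,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```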
TECHNICAL SKILLS
Data Warehousing: Informatica Power Center 9.x/8.x/7.x/6.x (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Manager, Workflow Monitor, Worklets), Informatica MDM, Data Profiling, Data Cleansing, OLAP, OLTP.
Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling, Erwin 4.1/3.5.2.
Database: Oracle 11g/10g/9i/8i, SQL Server 2008/2005/2000, Teradata, Sybase, DB2
DB Tools: TOAD, SQL*Plus, PL/SQL Developer, SQL*Loader, Teradata SQL Assistant
Reporting Tool: Cognos 8.0/7.0
Programming: SQL, PL/SQL, TSQL, C, Unix Shell Scripting, AS/400
Environment: Windows NT/98/XP/2000, HP-UX 10.20, MS DOS.
Others: Autosys, BODS 4.0, MS Office (Excel, Word, PowerPoint, Outlook)
PROFESSIONAL EXPERIENCE
Confidential, Greensboro, NC
ETL/Informatica Developer
Responsibilities:
- Worked as a data migration consultant, converting complex objects such as Customer Master Data, Vendor Master Data, Joint Operations Agreements, Joint Ventures and Division of Interests.
- Worked closely with offshore implementation partners such as Accenture.
- Extracted data from Oracle databases and spreadsheets, staged it in a single location, and applied business logic to load it into the central Oracle database.
- Used Informatica Power Center 8.6/9.0.1 for extraction, transformation and loading (ETL) of data into the data warehouse.
- Developed complex mappings in Informatica to load data from various sources.
- Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator and Stored Procedure.
- Extensively used the Informatica Debugger to diagnose problems in mappings and was involved in troubleshooting existing ETL bugs.
- Extensively used Toad for SQL scripts and tuned SQL to enhance the performance of the conversion mappings.
- Used PL/SQL procedures from Informatica mappings to truncate data in target tables at run time (an illustrative sketch follows this list).
- Worked on a de-duplication process for Finance and Controlling objects to cleanse data before loading it into SAP.
- Resolved inconsistent and duplicate data with multidomain MDM to support strategic goals.
- Worked on MDM Hub configuration: data modeling, data mappings (landing, staging and base objects), data validation, match and merge rules, writing and customizing user exits, and customizing and configuring Business Data Director (BDD) / Informatica Data Director (IDD) applications.
- Designed custom error-control logic within the pipeline to capture bad records into a control table and to recover the workflow in the event of failures.
- Used the Informatica Power Center Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
- Created procedures to truncate data in the target tables before each session run.
- Created deployment groups and migrated code between environments.
- Wrote documentation describing program development, logic, coding, testing, changes and corrections.
- Followed Informatica recommendations, methodologies and best practices from the Velocity documentation.
- Provided support during various phases of the project and planned the E&T process.
- Involved in production support, ensuring all issues were fixed within the agreed turnaround times.
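The run-time truncation mentioned above is typically a small stored procedure called from a pre-session SQL statement or a Stored Procedure transformation; a minimal sketch, assuming a hypothetical procedure name and target table, could look like this:

```sql
-- Minimal sketch of a truncate procedure callable from an Informatica
-- pre-session task or Stored Procedure transformation (names are hypothetical).
CREATE OR REPLACE PROCEDURE prc_truncate_target (p_table_name IN VARCHAR2) AS
BEGIN
  -- TRUNCATE is DDL, so it must go through dynamic SQL;
  -- DBMS_ASSERT guards against injection via the parameter.
  EXECUTE IMMEDIATE 'TRUNCATE TABLE '
                    || DBMS_ASSERT.simple_sql_name(p_table_name);
END prc_truncate_target;
/

-- Example pre-session call: CALL prc_truncate_target('STG_CUSTOMER');
```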
Environment: Informatica Power Center 8.6.1/9.0.1, Informatica MDM 9.5, Oracle 10g/9i, Business Objects Data Services 4.1 (BODS), MS SQL Server, Toad, HP Quality Center, Tidal, MS Office Suite
Confidential, Memphis
Informatica Developer
Responsibilities:
- Responsible for gathering project requirements by interacting directly with the client and performing analysis accordingly.
- Coordinated the workflow between onsite and offshore teams.
- Defined various facts and dimensions in the data mart, including factless facts and aggregate and summary facts.
- Extracted, scrubbed and transformed data from flat files, Oracle, SQL Server, DB2 and Teradata, then loaded it into an Oracle database using Informatica.
- Worked on optimizing ETL procedures in Informatica 9.0.1/8.6.1.
- Performance-tuned Informatica mappings using components such as parameter files, variables and dynamic caches.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Implemented logical and physical data modeling with star and snowflake techniques using Erwin for the data warehouse as well as the data marts.
- Used Type 1 and Type 2 mappings to update slowly changing dimension tables.
- Involved in the performance tuning process, identifying and optimizing source, target, mapping and session bottlenecks.
- Configured incremental aggregation to improve the performance of data loading. Worked on database-level tuning and SQL query tuning for the data warehouse and OLTP databases.
- Used Informatica Repository Manager to create folders and add users for new developers.
- Maintained source definitions, transformation rules and target definitions using Informatica Repository Manager.
- Configured the Informatica server to generate control and data files for loading data into the target database with the SQL*Loader utility.
- Used the ActiveBatch scheduling tool for scheduling jobs.
- Checked session and error logs to troubleshoot problems and used the Debugger for complex troubleshooting.
- Negotiated with management to acquire the resources necessary to deliver the project on time and within budget, bringing resources onsite when required to meet deadlines.
- Delivered projects working in an onsite-offshore model and was directly responsible for deliverables.
- Developed UNIX shell scripts for calling Informatica mappings and running the tasks on a daily basis (see the sketch after this list).
- Wrote Oracle PL/SQL procedures and functions whenever needed.
- Created and automated UNIX scripts to run sessions at the desired date and time for imports.
Environment: Informatica Power Center 9.0.1/8.6.1, PL/SQL, Oracle 9i, TOAD, Erwin 7.0, UNIX, SQL Server 2000, Autosys, Windows Server 2003, Visio 2003.
Confidential, Sandy, UT
ETL Developer
Responsibilities:
- Participated in documenting the existing operational systems.
- Involved in the requirements gathering for the warehouse. Presented the requirements and a design document to the client.
- Created ETL jobs to load data from the staging area into the data warehouse.
- Analyzed the requirements and framed the business logic for the ETL process.
- Involved in the ETL design and its documentation
- Designed and developed complex aggregate, join and lookup transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica Power Center 6.0.
- Designed and developed mappings using Source Qualifier, Aggregator, Joiner, Lookup, Sequence Generator, Stored Procedure, Expression, Filter and Rank transformations.
- Developed pre-session, post-session and batch execution routines using the Informatica server to run Informatica sessions.
- Evaluated the level of granularity.
- Evaluated slowly changing dimension tables and their impact on the overall data warehouse, including changes to source-target mappings, the transformation process, the database, etc.
- Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using ER/Studio.
- Collected and linked metadata from diverse sources, including relational databases (Oracle), XML and flat files.
- Created, optimized, reviewed and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables.
- Extensive experience with PL/SQL in designing and developing functions, procedures, triggers and packages.
- Developed Informatica mappings, reusable sessions and mapplets for data loads to the data warehouse.
- Designed and developed Informatica mappings and workflows; identified and removed bottlenecks to improve the performance of mappings and workflows, and used the Debugger to test mappings and fix bugs.
- Scheduled sessions to extract, transform and load data into the warehouse database per business requirements using the ActiveBatch scheduling tool.
- Cleansed the data using MDM techniques.
- Created stored procedures to validate the data before loading it into the data marts (an illustrative sketch follows this list).
- Worked on Cognos for generating reports.
- Extracted DTS (Data Transformation Services) packages from SQL Server 2000.
- Converted the DTS packages into Informatica mappings and loaded the targets in the Oracle database.
- Created shell scripts for generic use.
- Developed and maintained optimized SQL queries in the Data Warehouse.
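The pre-load validation mentioned above can be expressed as a small PL/SQL procedure that quarantines bad rows and aborts the load if any are found; the table names and rules below are hypothetical, for illustration only:

```sql
-- Minimal sketch of a pre-load validation procedure (hypothetical names/rules).
CREATE OR REPLACE PROCEDURE prc_validate_stg_orders AS
  v_bad_rows PLS_INTEGER;
BEGIN
  -- Quarantine rows that violate basic business rules before the mart load.
  INSERT INTO stg_orders_errors (order_id, error_text, load_date)
  SELECT o.order_id,
         CASE
           WHEN o.customer_id IS NULL THEN 'Missing customer id'
           WHEN o.order_amount < 0    THEN 'Negative order amount'
         END,
         SYSDATE
    FROM stg_orders o
   WHERE o.customer_id IS NULL
      OR o.order_amount < 0;

  v_bad_rows := SQL%ROWCOUNT;
  COMMIT;

  -- Fail the calling session/workflow if any bad rows were found.
  IF v_bad_rows > 0 THEN
    RAISE_APPLICATION_ERROR(-20001,
      v_bad_rows || ' invalid rows found in STG_ORDERS; load aborted');
  END IF;
END prc_validate_stg_orders;
/
```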
Environment: Windows XP/NT, Informatica Power Center 7.1/8.6, MDM, UNIX, Oracle 11g, Oracle Data Integrator, Teradata SQL Assistant, SQL, PL/SQL, SQL Developer, Erwin, Oracle Designer
Confidential, Charlotte, NC
Informatica Developer
Responsibilities:
- Worked extensively with the data modelers to implement logical and physical data models for an enterprise-level data warehouse.
- Created and modified T-SQL stored procedures for data retrieval from the MS SQL Server database (an illustrative sketch follows this list).
- Automated mappings to run using UNIX shell scripts, which included pre- and post-session jobs, and extracted data from the transaction system into the staging area.
- Extensively used Informatica Power Center 6.1/6.2 to extract data from various sources and load it into the staging database.
- Extensively worked with Informatica tools (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor, Repository Server and Informatica Server) to load data from flat files and legacy sources.
- Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator, Update Strategy, Joiner and Stored Procedure.
- Designed the mappings between sources (external files and databases) to operational staging targets.
- Involved in data cleansing, mapping transformations and loading activities.
- Developed Informatica mappings and mapplets and tuned them for optimum performance, dependency handling and batch design.
- Involved in the process design documentation of the Data Warehouse Dimensional Upgrades. Extensively used Informatica for loading the historical data from various tables for different departments.
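A retrieval procedure of the kind mentioned above is straightforward T-SQL; this is only an illustrative sketch, with hypothetical table, column and parameter names:

```sql
-- Illustrative T-SQL retrieval procedure (all names are hypothetical).
CREATE PROCEDURE dbo.usp_GetPolicyTransactions
    @PolicyId  INT,
    @FromDate  DATETIME,
    @ToDate    DATETIME
AS
BEGIN
    SET NOCOUNT ON;

    SELECT t.TransactionId,
           t.PolicyId,
           t.TransactionDate,
           t.Amount
      FROM dbo.PolicyTransactions AS t
     WHERE t.PolicyId = @PolicyId
       AND t.TransactionDate BETWEEN @FromDate AND @ToDate
     ORDER BY t.TransactionDate;
END;
```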
Environment: Informatica 6.1/6.2, PL/SQL, MS Access, Oracle 8i/7i, DB2, Windows 2000, UNIX
Confidential
Informatica Developer
Responsibilities:
- Experience working with telecom network information, mapping data from legacy systems to target systems.
- Extracted data from legacy sources using Informatica Power Center.
- Extensively used Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and transformations.
- Cleansed the source data, standardized vendor addresses, extracted and transformed data with business rules, and built mapplets using Informatica Designer.
- Extracted data from different source databases, created a staging area to cleanse the data, and validated the data.
- Designed and developed complex Aggregate, Expression, Filter, Join, Router, Lookup and Update transformation rules.
- Developed schedules to automate the update processes and Informatica sessions and batches.
- Analyzed, designed, constructed and implemented ETL jobs using Informatica.
Environment: Informatica Power Center 5.1.1, Cognos, Windows NT, PL/SQL, Excel, SQL Server 7.0, Erwin.