ETL Developer Resume
Fairfax, VA
SUMMARY
- 7+ years of experience with Informatica 9.x/8.x/7.x (Source Analyzer, Mapping Designer, Mapplet Designer, Transformation Designer, Warehouse Designer, Repository Manager, and Workflow Manager/Server Manager).
- Full lifecycle implementation of data warehouses and business data marts with Star schemas, Snowflake schemas, SCDs, and dimensional modeling.
- Experience in integration of various data sources (flat files, SQL Server, DB2, Oracle, XML files, SAP R/3, Teradata, etc.).
- Strong database skills in Teradata, Oracle, SQL Server.
- Experience in Teradata SQL Assistant and Toad for testing and analyzing the data.
- Extensively worked with files in HIPAA format and have good knowledge of serializers and parsers.
- Well adept in planning, building, and managing successful large-scale Data Warehouse and decision support systems.
- Comfortable with both technical and functional aspects of RDBMS, data mapping, data management, data transport, and data staging.
- Experience in the data mart life cycle; performed ETL procedures to load data from different sources into data marts and the data warehouse using Informatica PowerCenter.
- Extensively worked in Extraction, Transformation and Loading of data from multiple sources into Data Warehouse.
- Expertise in creating mappings, mapplets and reusable Transformations using Informatica Designer.
- Vast experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
- Good knowledge in tuning the performance of SQL queries and ETL process.
- Worked on UNIX shell scripting and Python for pre-session and post-session tasks, and automated these scripts using a scheduling tool.
- Hands on experience in creating Indexes and partitioning tables for performance.
- Significant experience in dimensional and relational data modeling, and Business Intelligence experience with Star/Snowflake schema modeling, fact and dimension tables, cubes, dashboards, and physical and logical data modeling using Erwin.
- Significant experience in PL/SQL, procedures/functions, triggers, and packages.
- Expertise in unit testing, integration testing, system testing, and data validation for developed Informatica mappings.
- Familiar with Data modeling and worked with data modeling tool Erwin.
- Excellent analytical and communication skills, works well in a team, and communicates effectively at all levels of the development process.
- Expertise in generating reports for end client using various Query tools like Cognos, Business Objects and OBIEE.
- Worked closely with the Business Objects reporting team to meet user and business needs.
- Good hands-on knowledge of data management and implementation of Big Data/Hadoop applications using the Informatica Big Data edition.
TECHNICAL SKILLS
Languages: C/C++, HTML, XML, SQL, T-SQL, PL/SQL, UNIX Shell Script, Python, JavaScript
ETL: Informatica 9.x/8.x/7.x (PowerCenter/PowerMart: Designer, Workflow Manager, Workflow Monitor, Server Manager), Informatica Data Quality (IDQ), PowerExchange, PowerConnect, B2B DTS 8.6/9.1, Informatica Cloud Services
Databases: Teradata V2R6.2, Oracle 8i/9i/10g/11g, SQL Server 2005/2008, Netezza 7.0.4, Sybase 11.5, Greenplum, Mainframe DB2, SAP R/3
OLAP: Business Objects XI R2/XI R3, BI Publisher, XML Publisher
Operating Systems: Windows 95/98/NT/2000/XP/Vista, Windows Server 2003, Linux, UNIX, DOS
Tools: SalesForce.com, SQL*Loader, SQL*Plus, TOAD, SQL Developer, BTEQ, Teradata SQL Assistant, Erwin
Reporting Tools: Microsoft Reporting Services, Crystal Reports 9/10.
Scheduling and Other Tools: Autosys, Tidal, Control-M, IBM ClearQuest
PROFESSIONAL EXPERIENCE
Sr. Informatica Developer
Confidential - Minneapolis, MN
Responsibilities:
- Analyzed the source and target data and designed the Source-to-Target Data Mapping document.
- Involved in developing an ETL strategy to populate the Data Warehouse from various source systems feeds using Informatica Power Center and Teradata.
- Extensively worked on developing optimized complex mappings per the Business Requirements Document to extract, transform, and load data.
- Developed SQL scripts to extract, transform, and load data into the database per the Business Requirements Document.
- Worked extensively with complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected / Connected Lookups and Java transformation.
- Worked with the support team to define methods for, and implement solutions for, performance measurement and monitoring of all data movement technologies.
- Designed and developed data profiling in Informatica Designer using IDQ to profile source data against business requirements.
- Created data profiles in Profile Manager using the Auto Profile and Custom Profile options to validate documented business rules.
- Implemented performance tuning logic on Targets, sources, mappings and sessions to provide maximum efficiency and performance.
- Created deployment groups in Informatica to handle code migrations across DEV, QA, and PROD.
- Developed unit tests and separate User Acceptance Test (UAT) cases to validate data against test scenarios (a minimal validation sketch follows this list).
- Scheduled Informatica jobs using Autosys job scheduler.
- Involved in Performance Tuning of mappings in Informatica.
- Good understanding of source to target data mapping and Business rules associated with the ETL processes.
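The unit and UAT validation work above typically reduces to source-to-target reconciliation checks. Below is a minimal, hypothetical sketch of such a check, not taken from the project: the table names are placeholders and the DB-API connections are assumed to be opened elsewhere.

```python
# Hypothetical illustration of a source-to-target validation check; table names
# and the DB-API connections passed in are placeholders, not project artifacts.

def reconcile_row_counts(src_conn, tgt_conn, src_table, tgt_table, filter_sql=""):
    """Compare row counts between a source table and its warehouse target."""
    def count_rows(conn, table):
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table} {filter_sql}")
        (count,) = cur.fetchone()
        cur.close()
        return count

    src_count = count_rows(src_conn, src_table)
    tgt_count = count_rows(tgt_conn, tgt_table)
    if src_count != tgt_count:
        raise AssertionError(
            f"Row count mismatch: {src_table}={src_count} vs {tgt_table}={tgt_count}"
        )
    return src_count
```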
Environment: Informatica PowerCenter 9.5, Informatica IDQ 9.5, Informatica Cloud Services, Autosys, Toad, SQL Assistant.
Sr. ETL Developer
Confidential, Camp Hill, PA
Responsibilities:
- Used Informatica Power Center for extracting Source data and loading into target table.
- Analyzed the source data and coordinated with the Data Warehouse team in developing the relational model.
- Designed and developed logical and physical models to store data retrieved from other sources including legacy systems.
- Extensively used Informatica Power Center 8.1.1 to extract data from various sources and load in to staging database.
- Interacted with business representatives for Need Analysis and to define Business and Functional Specifications. Participated in the Design team and user requirement gathering.
- Performed source data analysis; primary data sources were Oracle and SQL Server 2005.
- Extensively used ETL to load data from multiple sources into the staging area (Oracle 10g) using Informatica PowerCenter 8.1.1.
- Performed migration of mappings and workflows from Development to Test and to Production Servers.
- Developed Informatica mappings and mapplets and tuned them for optimum performance, dependencies, and batch design.
- Worked with pre- and post-session tasks and extracted data from the transaction system into the staging area; identified fact and dimension tables (a pre-session check sketch follows this list).
- Participated in all facets of the software development cycle including providing input on requirement specifications, high level design documents, and user’s guides.
- Tuned sources, targets, mappings and sessions to improve the performance of data load.
- Involved in Unit testing and documentation.
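As an illustration of the pre-session tasks mentioned above, the following is a hypothetical sketch of a file-arrival check that could run before a session starts; the landing directory and file-name pattern are invented placeholders, not actual project paths.

```python
#!/usr/bin/env python
# Hypothetical pre-session check: verify that the day's source extract has
# arrived and is non-empty before the session starts. Paths are placeholders.
import glob
import os
import sys

SOURCE_DIR = "/data/inbound"          # placeholder landing directory
FILE_PATTERN = "txn_extract_*.dat"    # placeholder file-name pattern

def main() -> int:
    matches = sorted(glob.glob(os.path.join(SOURCE_DIR, FILE_PATTERN)))
    if not matches:
        print(f"Pre-session check failed: no file matching {FILE_PATTERN}")
        return 1
    latest = matches[-1]
    if os.path.getsize(latest) == 0:
        print(f"Pre-session check failed: {latest} is empty")
        return 1
    print(f"Pre-session check passed: {latest}")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```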
Environment: Informatica PowerCenter 8.1/8.5/8.6, Informatica PowerExchange 8.1, Informatica PowerConnect, Oracle 10g, Teradata, Greenplum, DB2, SSIS, SQL Server 2005, Autosys, Toad 9.0.1, Erwin 4.2, UNIX, Siebel CRM, SQL Developer, SQL, T-SQL, PL/SQL.
Informatica consultant
Confidential - Fairfax, VA
Responsibilities:
- Worked closely with the team implementing logical and physical data modeling with Star schema techniques using Erwin for the data warehouse as well as the data marts.
- Used Erwin for logical and Physical database modeling of the warehouse, responsible for Database schemas creation based on the logical models.
- Involved in the Administration and maintenance of the Informatica Repository.
- Extracted data from flat files, SQL Server 2005, and XML files using Informatica PowerConnect, and performed extensive data cleansing applying all business rules prior to loading into Oracle staging tables.
- Extensively used Informatica to load data from MS SQL Server, SAP R/3, and DB2 into the target Oracle database.
- Implemented data loads from different types of sources, such as flat files, relational databases, and Oracle logs, using Informatica PowerExchange.
- Performed data profiling and created user-defined functions (UDFs) and processes to handle null fields with default values for string, date, and number columns.
- Created packages using SSIS as an ETL tool to transfer and integrate data between databases.
- Used PMCMD, PMREP, Python and UNIX shell scripts for workflow automation and repository administration.
- Created Data Breakpoints & Error Breakpoints for debugging the mappings using Debugger.
- Implemented Slowly Changing Dimensions Type I & II to capture history, and created a process flow chart for inserts/updates using effective end dates while loading data into the data marts.
- Developed Python, shell, PL/SQL, and T-SQL scripts and SQL stored procedures for regular maintenance and production support, to load the warehouse at regular intervals and to perform pre/post actions.
- Extensively involved in designing SSIS packages to export data from flat file sources to the SQL Server database.
- Developed BTEQ scripts to extract, transform, and load data into the Teradata database per the Business Requirements Document.
- Worked on Teradata Utilities like Fastload, Multiload and Fast Export.
- Extensively worked with Teradata SQL Assistant/Nexus for testing and analyzing the data.
- Performed bulk loads of large volumes of data by creating pre- and post-load scripts to drop indexes and key constraints before the session and rebuild them after the session completes (a sketch of this pattern follows this list).
- Developed BI Publisher reports as per the User requirements.
- Familiar with batch scheduling using Autosys.
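The bulk-load pattern described above (drop indexes before the load, rebuild them afterwards) can be sketched roughly as below; the index and table names are illustrative placeholders, and `conn` stands for any open DB-API connection to the target database.

```python
# A minimal sketch of the pre-/post-load pattern: drop the target's secondary
# indexes before a bulk load and rebuild them afterwards. Names are placeholders.

PRE_LOAD_SQL = [
    "DROP INDEX idx_claims_member_id",
    "DROP INDEX idx_claims_service_dt",
]

POST_LOAD_SQL = [
    "CREATE INDEX idx_claims_member_id ON stg_claims (member_id)",
    "CREATE INDEX idx_claims_service_dt ON stg_claims (service_dt)",
]

def run_statements(conn, statements):
    """Execute each DDL statement in order and commit once at the end."""
    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
    conn.commit()
    cur.close()
```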
Environment: Informatica 8.1.1, Power Exchange, Oracle 10g, PL/SQL, Toad 9.4, SQL Server 2005, Windows NT, UNIX Shell Scripting.
Informatica consultant
Confidential
Responsibilities:
- Involved in gathering the Business requirements from the client and writing and maintaining technical/design documentation.
- Involved in Logical and Physical data models that capture current/future state data elements and data flow using Erwin.
- Used advanced techniques of Informatica to load data from source MS SQL Server, flat files, and DB2 into the target data warehouse on Teradata.
- Designed and developed complex mappings using Informatica PowerCenter Designer.
- Designed reusable transformations, mappings, and data quality plan mapplets, and built data quality scorecards.
- Designed, developed, and managed PowerCenter upgrades from v7.x to v8.x, migrated ETL code from Informatica v7.x to v8.x, and integrated and managed the PowerExchange CDC workload.
- Extensively worked with transformations Lookup, Update Strategy, Expression, Filter, Stored Procedures, Router and others.
- Designed, developed, and fine-tuned stored procedures, tuned SQL queries for better performance, and imported external data (flat files) into Oracle tables.
- Implemented changes in Slowly Changing Dimension (Type 2 & 3) tables.
- Created and ran workflows using Workflow Manager to load the data into the Target Database.
- Optimized/Tuned mappings for better performance and efficiency.
- Developed mapping parameters and variables to support SQL override.
- Used debugger to test the data flow between source and target and to fix the invalid mappings.
- Worked on different workflow tasks such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment, and Timer, as well as workflow scheduling.
- Used session partitions, dynamic cache memory, and index caches to improve the performance of Informatica services/servers.
- Designed and developed table structures, stored procedures and functions to implement business rules.
- Used UNIX shell scripting and the PMCMD command line for workflow automation, and created shell scripts to execute packages from Informatica (a PMCMD wrapper sketch follows this list).
- Used application development tools such as TOAD (Oracle) to develop SQL queries and stored procedures, with Oracle as the backend database and Oracle CDC.
- Tested the data and data integrity among sources and targets using Unit Testing, UAT.
- Worked with the production support team on various performance-related issues.
- Involved in Generating and Validating Reports using COGNOS.
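The PMCMD-based workflow automation mentioned above could look roughly like the following wrapper; the integration service, domain, folder, and workflow names are placeholders, and credentials are assumed to come from environment variables rather than the script.

```python
# Hypothetical wrapper around the pmcmd command line used to start a PowerCenter
# workflow from a script. Service, domain, folder, and workflow names are
# placeholders; credentials are read from the environment.
import os
import subprocess

def start_workflow(folder: str, workflow: str) -> None:
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC_DEV",            # integration service (placeholder)
        "-d", "Domain_Dev",              # domain name (placeholder)
        "-u", os.environ["INFA_USER"],
        "-p", os.environ["INFA_PASS"],
        "-f", folder,
        "-wait",                          # block until the workflow completes
        workflow,
    ]
    subprocess.run(cmd, check=True)      # non-zero exit raises CalledProcessError

if __name__ == "__main__":
    start_workflow("DW_LOADS", "wf_daily_sales_load")
```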
Environment: Informatica PowerCenter 7.x/8.x, PowerExchange, COGNOS, Oracle 9i, SQL, PL/SQL
ETL Developer
Confidential
Responsibilities:
- Designed and Developed mappings needed for enhancement of project.
- Worked with Informatica Power center 7.x Mapping Designer, Workflow Manager, Workflow Monitor and Admin Console.
- Exported and imported mappings between different Informatica folders and repositories.
- Responsible for Administration of Informatica Environment. Created users and groups, configured profiles and privileges.
- Used the Informatica Debugger to test mappings and fix bugs.
- Provided production support for IDS DSS and created Java and UNIX shell scripts to delete data older than 3 years from fact tables (a purge sketch follows this list).
- Created Stored Procedures and SSIS (DTS) packages for the ETL process.
- Developed Java and UNIX shell scripts to move flat-file data from external systems to the EDW staging area, and to schedule workflows to load the data into the target system.
- Familiar with Data Warehouse Architecture, design methodologies and Best practices.
- Experienced with reviewing server log files, editing UNIX scripts, transferring files via FTP, and checking space on the server.
- Implemented performance tuning techniques on Targets, Sources, Mappings and Sessions.
- Scheduled Informatica sessions and workflows using Informatica Scheduler.
- Performed performance tuning and fine-tuning of Informatica mappings.
- Loaded data into data marts on a daily basis.
- Created new repositories and new folders within the repositories using Repository Manager.
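The three-year retention purge referenced above amounts to a dated DELETE against the fact tables. A simplified, hypothetical sketch follows (in Python rather than the Java/shell used on the project); the table and column names are placeholders and `conn` is any open DB-API connection to the warehouse.

```python
# Simplified sketch of a retention purge: delete fact rows older than three
# years. Table/column names are placeholders; `conn` is an open DB-API connection.
from datetime import date, timedelta

RETENTION_DAYS = 3 * 365

def purge_old_facts(conn, table="fact_transactions", date_col="load_dt"):
    cutoff = date.today() - timedelta(days=RETENTION_DAYS)
    cur = conn.cursor()
    cur.execute(
        f"DELETE FROM {table} WHERE {date_col} < ?",  # '?' paramstyle assumed
        (cutoff,),
    )
    deleted = cur.rowcount
    conn.commit()
    cur.close()
    return deleted
```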
Environment: Informatica PowerCenter 7.1, Oracle 9i, Windows Server 2003, SQL, PL/SQL, SQL*Plus, XML, JS, PL/SQL Developer, SSIS, Teradata, DB2, Erwin 4.1, AppWorx 5.1.1, Windows NT/2000.