ETL/Netezza Developer Resume
Charlotte
SUMMARY
- Over 10 years of IT experience in system analysis, design, coding, and testing of Data Warehousing implementations across multiple domains, including Banking, Insurance, and Finance.
- Strong experience in developing Data Warehousing solutions using Informatica PowerCenter client tools, PowerExchange, Netezza, Teradata, Business Objects (BO), AUTOSYS, JCL, and Control-M.
- Expertise in Creation, Modification and Testing of Informatica Mappings, Mapplets, Workflows, Worklets, Debugging and Performance tuning of Informatica Mappings and Sessions.
- Extensively worked on transformations such as Lookup, Joiner, Router, Union, SQL, Web Services Consumer, Sorter, Aggregator, and Expression.
- Experience in writing high-quality SQL, stored procedures, and functions per client business requirements.
- Extensive experience with Data Extraction, Transformation, and Loading (ETL) from heterogeneous data sources across multiple relational databases, including Oracle, Netezza, Teradata, DB2, and SQL Server; integrated data from flat files (fixed-width, delimited, and XLS) into a common reporting and analytical data model using Informatica.
- Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, and TPump).
- Extensively used Bulk Reader/Writer to significantly reduce ETL load times.
- Good experience with External tables, Distribution, GROOM and other features of Netezza.
- Proficiency in data warehousing techniques: Slowly Changing Dimensions (SCD), surrogate key assignment, normalization and de-normalization, cleansing, performance optimization, and Change Data Capture (CDC).
- Good Experience in UNIX Shell scripting & Windows Batch Scripting for Parsing Files & automation of batch ETL jobs.
- Resolved production job failures and fixed production data issues.
- Expertise in system integration testing, regression testing, and user acceptance testing for data warehousing applications (ETL and OLAP).
- Hands-on experience in designing, developing, and implementing OBIEE 10.1.3.x.
- Experienced in designing customized interactive dashboards in OBIEE using drill down, guided navigation, prompts, filters, and variables.
- Expert in using OBIEE Answers to create queries, format views, charts, and add user interactivity and dynamic content to enhance the user experience.
- Experienced in Creating Problem Tickets, Work Requests, Change Requests (CRs) and Work Orders.
- Participated in code reviews, suggested best practices, and presented new Informatica features in user group meetings.
- Hands-on experience with the ITIL model; supported Level 1 and Level 2 teams on production issues and tuned Informatica jobs as a member of the Level 3 performance group.
- Extensive experience with the offshore/onsite delivery model; received positive customer feedback for timely delivery and meeting customer expectations.
- Good knowledge in Software Development Life Cycle (SDLC) and Quality Assurance Life Cycle.
- Strong analytical, presentation, problem-solving, and interpersonal communication skills, with flexibility to learn new technologies.
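As an illustration of the Teradata utility work summarized above, the usual pattern is a UNIX wrapper that generates and submits a BTEQ script. A minimal sketch; the TDPID, credentials, database, table, and file paths are hypothetical placeholders, not values from any actual engagement:

```shell
#!/bin/sh
# Sketch of a BTEQ wrapper: generate the script, then (in production)
# submit it. All names and credentials below are hypothetical.
SCRIPT=load_stg_acct.bteq

cat > "$SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.SET ERRORLEVEL UNKNOWN SEVERITY 8;
DELETE FROM stage_db.STG_ACCT;
.IMPORT VARTEXT '|' FILE = /data/in/acct.txt;
.REPEAT *
USING (acct_id VARCHAR(18), acct_nm VARCHAR(60))
INSERT INTO stage_db.STG_ACCT (acct_id, acct_nm)
VALUES (:acct_id, :acct_nm);
.LOGOFF;
.QUIT;
EOF

# In production the wrapper would submit this as:  bteq < "$SCRIPT"
echo "generated $SCRIPT"
```

The same wrapper shape applies to FastLoad, MultiLoad, and TPump control scripts.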
PROFESSIONAL EXPERIENCE
Confidential, Charlotte
ETL/Netezza Developer
Responsibilities:
- Involved in meetings with Business System Analysts to understand the functionality and prepared ETL Technical Specifications.
- Extensively used Informatica 9.6.1 to load data from flat files, CDW (Oracle), Deposits (Teradata), RDR source systems to ATLAS (Netezza) database.
- Developed Informatica mappings and tuned them for better performance.
- Created several complex Informatica mappings and mapplets per client requirements. Extensively worked on Connected and Unconnected Lookup, Router, Expression, Source Qualifier, Aggregator, Filter, Sequence Generator, SQL, Union, and Web Services Consumer transformations.
- Created reusable transformations and mapplets, reducing development time and mapping complexity and improving maintainability.
- Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions.
- Created several complex mappings as per business requirements.
- Extensively used Bulk Reader/Writer to significantly reduce ETL load times.
- Extensively used Full Pushdown Optimization in Informatica to improve performance of Netezza database loads.
- Thorough understanding of External tables, Distribution, GROOM and other features of Netezza.
- Involved in the optimization of SQL queries which resulted in substantial performance improvement for the conversion processes.
- Involved in SQL performance tuning.
- Assisted users with SQL queries and troubleshooting of their reports.
- Created, developed, and tested JIL scripts for AUTOSYS.
- Involved in developing UNIX shell scripts.
- Involved in Unit testing, Integration testing, and code reviews.
- Responsible for error handling, bug fixing, session monitoring, and log analysis.
- Performed unit, UAT, and integration testing of the entire process flow.
- Exposure to SAS, with focus on Base SAS, SAS/SQL, SAS Macros, and SAS Enterprise Guide.
- Created reusable ETL components to invoke SAS DataFlux jobs.
- Performed data loads and fixed issues in all environments.
- Provided production support and troubleshooting for the application.
- Created change requests, problem tickets, work requests, and work orders.
- Recovered critical job failures in the production environment, coordinating with DBAs, analysts, external clients, and business users.
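A minimal sketch of the kind of AUTOSYS JIL definition described above; the job name, machine, command, owner, and paths are all hypothetical placeholders:

```shell
#!/bin/sh
# Generate a JIL definition for a daily Informatica load job.
# Every value below is a made-up example, not a real job definition.
cat > atlas_daily_load.jil <<'EOF'
insert_job: atlas_daily_load
job_type: cmd
machine: etlserver01
command: /apps/informatica/scripts/run_wf.sh wf_atlas_daily
owner: etluser
start_times: "02:00"
days_of_week: all
std_out_file: /logs/atlas_daily_load.out
std_err_file: /logs/atlas_daily_load.err
alarm_if_fail: 1
EOF

# In production this would be loaded with:  jil < atlas_daily_load.jil
echo "generated atlas_daily_load.jil"
```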
Environment: Informatica PowerCenter 9.5.1, Netezza, Oracle 11g, Teradata, Toad, SAS, SAS DI, SAS DataFlux, SAS Enterprise Guide, UNIX, Windows, Tidal, AUTOSYS, Aginity Workbench
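The Netezza features mentioned above (external tables, distribution, GROOM) can be sketched as a maintenance script; database, table, column, and file names are hypothetical:

```shell
#!/bin/sh
# Sketch of Netezza external-table load and GROOM maintenance.
# All object names and paths are hypothetical placeholders.
cat > nz_maintenance.sql <<'EOF'
-- Distribution key chosen to match the most common join column
CREATE TABLE atlas..acct
  (acct_id VARCHAR(18), acct_nm VARCHAR(60))
  DISTRIBUTE ON (acct_id);

-- Load a delimited file through an external table
CREATE EXTERNAL TABLE ext_acct
  SAMEAS atlas..acct
  USING (DATAOBJECT ('/data/in/acct.txt') DELIMITER '|');

INSERT INTO atlas..acct SELECT * FROM ext_acct;

-- Reclaim space from logically deleted rows
GROOM TABLE atlas..acct RECORDS ALL;
EOF

# In production:  nzsql -d atlas -f nz_maintenance.sql
echo "generated nz_maintenance.sql"
```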
Confidential, Charlotte
ETL Lead Developer
Responsibilities:
- Involved in meetings with Business System Analysts to understand the functionality and prepared ETL Technical Specifications.
- Extensively used Informatica to load data from flat files to Oracle tables.
- Developed Informatica mappings and tuned them for better performance.
- Used most transformations, including Source Qualifier, Application Source Qualifier, Aggregator, Expression, Lookup, Router, Normalizer, Filter, Update Strategy, and Joiner.
- Created reusable transformations and mapplets, reducing development time and mapping complexity and improving maintainability.
- Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions.
- Converted batch jobs from the BULKLOAD utility to the TPump utility.
- Used Teradata utilities such as MultiLoad, TPump, and FastLoad to load data into the Teradata data warehouse from Oracle and DB2 databases.
- Implemented conversion code per the business logic using BTEQ scripts.
- Loaded data into the Teradata database using utilities such as FastExport, FastLoad, MultiLoad, and TPump.
- Involved in Creating, Validating, Testing and running the sequential and concurrent Batches and Sessions
- Tuned critical Informatica jobs and database loader jobs.
- Fixed various defects in a set of UNIX shell wrapper scripts that executed the Teradata BTEQ, MultiLoad, and FastLoad utilities.
- Created, developed, and tested JIL scripts using AUTOSYS. Created shell scripts to retrieve data for management reports.
- Performed data analysis and created reports using SQL.
- Created ETL Schedules using Informatica tool.
- Involved in SQL performance tuning.
- Involved in developing UNIX shell scripts.
- Involved in Unit testing, Integration testing, and code reviews.
- Created OBIEE layers (Physical, Logical, and Presentation).
- Experienced in designing customized interactive dashboards in OBIEE using drill down, guided navigation, prompts, filters, and variables.
- Used OBIEE Answers to create queries, format views and charts, and add user interactivity and dynamic content to enhance the user experience.
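The wrapper-script defect work above centered on utility control scripts like the following FastLoad sketch; table names, credentials, and paths are hypothetical placeholders:

```shell
#!/bin/sh
# Sketch of a FastLoad control script of the kind submitted by the
# wrapper scripts. All names and credentials are hypothetical.
cat > fload_stg_txn.fl <<'EOF'
LOGON tdprod/etl_user,etl_password;
DROP TABLE stage_db.STG_TXN_ERR1;
DROP TABLE stage_db.STG_TXN_ERR2;
SET RECORD VARTEXT "|";
DEFINE txn_id (VARCHAR(18)), txn_amt (VARCHAR(18))
  FILE = /data/in/txn.txt;
BEGIN LOADING stage_db.STG_TXN
  ERRORFILES stage_db.STG_TXN_ERR1, stage_db.STG_TXN_ERR2;
INSERT INTO stage_db.STG_TXN (txn_id, txn_amt)
  VALUES (:txn_id, :txn_amt);
END LOADING;
LOGOFF;
EOF

# In production the wrapper would run:  fastload < fload_stg_txn.fl
echo "generated fload_stg_txn.fl"
```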
Environment: Informatica 9.1, Teradata, Oracle, DB2, Flat Files, AUTOSYS, UNIX, RUMBA, TOAD, Teradata SQL Assistant, SSH tools (WinSCP, SuperPutty).
Confidential, MI
Informatica Developer
Responsibilities:
- Responsible for Business Analysis and Requirements Collection and involved in Designing Technical Specification.
- Transformed high-level design specifications into ETL code; designed mappings according to mapping, naming, and warehouse standards for future application development.
- Created common objects for the ETL team to maintain common code, and documented standards for Informatica code development.
- Extensively used Informatica to load data from mainframe files, IBM DB2, and flat files into DB2 and mainframe files.
- Involved in SQL performance tuning.
- Extensively used Informatica Power Exchange to connect to the Mainframe sources.
- Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Normalizer, Filter, Update Strategy, and Joiner.
- Created reusable transformations and Mapplets and used them in complex mappings.
- Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions.
- Migrated mappings from the Development to the System environment.
- Involved in developing UNIX shell scripts.
- Extensively used PowerExchange to create data maps and CDC sources from COBOL copybooks.
- Created registration groups and registrations so that they are reflected in the application and extraction groups as part of Change Data Capture in PowerExchange.
- Ran Mainframe CA7 JCL jobs as part of Power Exchange Change Data Capture Condenser job.
- Used Informatica PowerExchange to create data maps for copybooks and VSAM datasets and to load/access data to/from mainframe files.
- Extracted G/L accounting and profit center master data.
- Extracted G/L transaction data from SAP R/3.
- Performed Unit Testing and Integration Testing and tuned for better performance.
- Expertise in fixing the critical data issues in production.
- Co-Ordinated with Offshore Teams and delivered tasks on time.
Environment: Informatica PowerCenter 8.x, Informatica PowerExchange, DB2, DBArtisan, CA7, Confidential Compare, Mainframe, Linux, Windows XP, Autosys, SQL Server, Flat Files, VSAM Files.
Confidential, OH
Informatica Developer
Responsibilities:
- Performed detailed analysis of the requirements.
- Developing detailed ETL specifications based on business requirements.
- Used Informatica Designer to create complex Mappings and Mapplets.
- Extensively used transformations like Router, Aggregator, Lookup, Source qualifier, Joiner, Expression and Stored Procedures in extracting data in compliance with the business logic developed.
- Created and tested Informatica mappings and workflows.
- Monitoring and performance tuning of ETL mappings.
- Used heterogeneous data sources such as Oracle and flat files.
- Performed data validation (with predefined test scripts or criteria) as per the requirement from development or testing coordinator.
- Extensively used PL/SQL and SQL.
- Created and Modified UNIX Shell scripts.
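A trivial sketch of the count-based data validation pattern mentioned above, as a UNIX shell script; the file names and sample rows are made up for illustration (real validation ran predefined test scripts against the databases):

```shell
#!/bin/sh
# Compare record counts between a source extract and a target extract.
# Sample data and file names are hypothetical.
printf '1|a\n2|b\n3|c\n' > source_extract.txt
printf '1|a\n2|b\n3|c\n' > target_extract.txt

src=$(wc -l < source_extract.txt | tr -d ' ')
tgt=$(wc -l < target_extract.txt | tr -d ' ')

if [ "$src" -eq "$tgt" ]; then
  echo "PASS: counts match ($src)" | tee validation_result.txt
else
  echo "FAIL: source=$src target=$tgt" | tee validation_result.txt
fi
```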
Environment: Informatica 7.1, Oracle, DB2, Flat Files, Maestro, and HP-UX
Confidential
Informatica / SQL Developer
Responsibilities:
- Created SQL*Loader control file scripts to load data from interface files into staging tables.
- Developed Physical and Logical models using Designer 2000.
- Created the Business Objects repository in the application database.
- Developed Business Objects Reports design/distribution/security modules.
- Created PL/SQL scripts for data transformation as per program specifications.
- Generated shell scripts for backups and automated the backup system.
- Wrote scripts to monitor the SGA of the Oracle database.
- Implemented PL/SQL Scripts for data loading.
- Tuned the database at the SQL and operating system levels.
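The SQL*Loader control files mentioned above follow a standard shape; a minimal sketch, with hypothetical table, column, and file names:

```shell
#!/bin/sh
# Generate a SQL*Loader control file for a pipe-delimited staging load.
# All object names and paths are hypothetical placeholders.
cat > load_stg_cust.ctl <<'EOF'
LOAD DATA
INFILE '/data/in/cust.dat'
BADFILE '/data/bad/cust.bad'
APPEND
INTO TABLE stg_cust
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( cust_id,
  cust_nm,
  load_dt SYSDATE )
EOF

# In production:  sqlldr userid=etl_user/password control=load_stg_cust.ctl
echo "generated load_stg_cust.ctl"
```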
Environment: Informatica, Oracle, Oracle Developer, Flat Files, UNIX, Windows 2000, Approx.
Confidential
Informatica Developer
Responsibilities:
- Developed mappings and mapplets for the new country rollout (Frankfurt) and its successful project implementation.
- Used Informatica PowerCenter to design and develop source-to-target mappings.
- Created reusable mapplets using the Mapplet Designer for use in mappings, and defined mapping parameters and mapping variables for global use.
- Designed and developed complex mappings using Designer transformations such as Lookup, Aggregator, Expression, Router, Filter, Update Strategy, and Stored Procedure.
- Managed user groups, users and the associated privileges
- Used workflow monitor to monitor the Informatica server and reviewed error logs that it generates for each session it runs.
- Maintained nightly batch-cycle mappings in AUTOSYS for the production environment.
- Designed sessions in Workflow Manager and monitored using Workflow Monitor
- Documented the disaster-recovery procedures for fail-over
- Configured and ran the Debugger from within the Mapping Designer to troubleshoot the mapping before the normal run of the workflow
- Involved in the preparation of documentation for ETL standards, procedures and naming conventions, shell scripts.
- Developed mappings for SDMS reports used daily by the business.
- Developed reconciliation mappings for the extract, transform, and load stages, run on a daily basis.
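The daily reconciliation above boils down to comparing audit counts across the extract, transform, and load stages. A toy sketch; the stage count files and numbers are hypothetical stand-ins for real audit-table counts:

```shell
#!/bin/sh
# Compare record counts across E/T/L stages and flag any drop-off.
# Counts below are made-up sample data.
printf '100\n' > extract_count.txt
printf '100\n' > transform_count.txt
printf '98\n'  > load_count.txt

e=$(cat extract_count.txt)
t=$(cat transform_count.txt)
l=$(cat load_count.txt)

{
  echo "extract=$e transform=$t load=$l"
  if [ "$e" -eq "$t" ] && [ "$t" -eq "$l" ]; then
    echo "RECONCILED"
  else
    echo "MISMATCH: $((e - l)) records dropped between extract and load"
  fi
} | tee recon_report.txt
```

With the sample counts above the report flags a mismatch of 2 records.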
Environment: Informatica 7.1, Cognos, IBM AIX 5.2, ClearCase, AUTOSYS, IBM DB2
Confidential
Informatica Developer
Responsibilities:
- Extensively worked on data Extraction, Transformation and Loading from source to target.
- Worked on various transformations, including Source Qualifier, Lookup, Router, Filter, Aggregator, Joiner, Expression, Union, Update Strategy, Sorter, and Sequence Generator.
- Created source definitions, target definitions, reusable transformations, mappings and mapplets
- Created, configured and monitored the sessions using workflow manager and workflow monitor.
- Created reusable Command tasks in the Task Developer, along with Control, Decision, Event (Event-Raise, Event-Wait), and Assignment tasks.
- Prepared unit test specification documents, conducted unit testing, and fixed bugs.
- Scheduled sequential and concurrent sessions and batches to load data from source to target databases.
Environment: Informatica PowerCenter 6.2, Oracle 9i, UNIX