
Informatica & Netezza Developer Resume

Houston, TX

SUMMARY

  • Over 8 years of IT experience in the analysis, design, development and implementation of data warehouses and data marts using Informatica PowerCenter with Oracle, MS SQL Server, DB2, Microsoft Access, Teradata and Netezza databases.
  • Strong experience in Informatica PowerCenter (versions 9.5/8.6), including Designer and Workflow Manager, RDBMS, data warehousing/ETL, OLAP (7.1/5.1), Informatica PowerExchange (version 9), IDQ and Informatica B2B.
  • Proficient in using the ETL tool Informatica PowerCenter 9.5.x/9.1/8.x/7.x to develop data warehouse loads, with work experience focused on data integration per client requirements.
  • Expertise in designing conformed and traditional ETL architectures involving source databases (Oracle, DB2, SQL Server and fixed-width/delimited flat files) and target databases (Oracle, Teradata and fixed-width/delimited flat files).
  • Clear understanding of data warehousing concepts with emphasis on ETL and life-cycle development using PowerCenter Repository Manager, Designer, Workflow Manager and Workflow Monitor.
  • Extensive experience in designing and developing complex mappings applying various transformations such as Lookup, Source Qualifier, Update Strategy, Router, Sequence Generator, Aggregator, Rank, Stored Procedure, Filter, Joiner and Sorter transformations, as well as mapplets.
  • Designed complex ETL mappings and reusable transformations using regular expressions, dynamic lookups and update strategies for slowly changing dimensions; tuned critical performance stages with partitioning and persistent caches; made extensive use of mapplets for common subroutines.
  • Experience in debugging, optimizing and tuning Informatica mappings; able to understand business rules from high-level design specifications and implement the corresponding data transformation methodologies.
  • Strong knowledge of entity-relationship concepts, fact and dimension tables, and developing database schemas such as star schema and snowflake schema used in relational, dimensional and multidimensional data modeling.
  • Worked as an Informatica administrator, with particular expertise in the implementation, maintenance and support of virtual and physical Linux/UNIX servers running Red Hat Linux, per company requirements.
  • Pertinent experience using heterogeneous sources such as Oracle 8i/9i/10g/11g, DB2/UDB, MS SQL Server 2005 (SSIS), mainframe DB2 and VSAM/COBOL files, XML, and various structured/unstructured flat files for integration into enterprise/business-analysis data warehouses.
  • Experience in Data warehouse OLAP reporting using Business Objects.
  • Worked on projects related to Informatica B2B with aggregation, extraction and Data transformation.
  • Worked with Oracle Stored Procedures and experienced in loading data into Data Warehouse/Data Marts using Informatica, SQL*Loader, Export/Import utilities.
  • Strong experience in UNIX shell scripting.
  • Created scripts for calling the pmcmd command, as well as file-watcher, FTP and file-archiving scripts (see the sketch after this list).
  • Involved in Technical Documentation, Unit test, Integration test and writing the test plan.
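
A minimal sketch of the kind of pmcmd wrapper and file-watcher script referred to above; the service, domain, folder, workflow and path names are placeholders, not actual project values.

```sh
#!/bin/ksh
# Hypothetical wrapper: wait for a trigger file, then launch an Informatica workflow via pmcmd.
# Service, domain, folder, workflow and path names below are placeholders.
SRC_DIR=/data/inbound
TRIGGER=$SRC_DIR/customer_feed.trg
ARCHIVE=/data/archive

# File watcher: poll for the trigger file for up to 60 minutes (360 x 10s).
i=0
while [ ! -f "$TRIGGER" ] && [ $i -lt 360 ]; do
    sleep 10
    i=$((i + 1))
done
[ -f "$TRIGGER" ] || { echo "Trigger file not found, aborting." >&2; exit 1; }

# Launch the workflow and wait for completion.
pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" -p "$INFA_PWD" \
    -f FOLDER_SALES -wait wf_load_customer_dim
rc=$?

# Archive the processed trigger file with a timestamp.
mv "$TRIGGER" "$ARCHIVE/customer_feed.trg.$(date +%Y%m%d%H%M%S)"
exit $rc
```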

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.6.x, 9.5.x, 9.1.x, 8.6.x & 7.x

Databases, Utilities: Teradata V13.10, Oracle 11g/10g/9i, Netezza, Microsoft Access, QlikView, Tableau, Erwin 3.5/4.0/4.2, IBM DB2, MS SQL Server 2000/2005 and Business Objects XI

Languages: SQL, PL/SQL, UNIX Shell Script

Informatica Tools & Features: Informatica B2B DT, Informatica PowerExchange, Informatica Data Quality, Informatica Data Analyzer, Informatica Metadata Manager, PCPO, Pushdown Optimization (PDO), Partitioning and DVO

Operating Systems: Windows 2000/2003/NT/XP/2007, AIX, UNIX

GUI Tools: MS-Office, MS Visio, TOAD, AUTOSYS

Processes: Informatica Quality Processes, Informatica Velocity

Domain Knowledge: Insurance, Banking and Health care

PROFESSIONAL EXPERIENCE

Confidential, Houston, TX

Informatica & Netezza Developer

Responsibilities:

  • Created mappings and sessions for jobs in Informatica.
  • Worked on BI tools, Informatica Metadata Manager and TOAD across various source environments.
  • Designed worklets and workflows for jobs in Informatica.
  • Implemented shell scripts to run the jobs in UNIX through Informatica.
  • Maintained the highest possible control and visibility around transactions and processes, ensuring compliance with all regulations and service-level agreements with Informatica B2B Data Exchange.
  • With the help of Informatica B2B Data Exchange achieved scalable data integration for large data transactions.
  • Worked on prototypes related to Netezza and Oracle Data Integrator (ODI) to interact with Netezza database.
  • Used a Netezza TwinFin data warehouse appliance to search 5-10 terabyte tables with sub-second response times.
  • Consulted with groups working on Netezza data warehouse architecture, disaster recovery, performance, workload management and security issues.
  • Worked extensively with the IBM Netezza AMPP platform, developing C/C++ code for SPU-side processing of DBOS dispatches and for returning data from the SPUs to DBOS events.
  • Used Netezza SQL to maintain the ETL frameworks and methodologies in use at the company, and accessed the Netezza environment to implement ETL solutions.
  • Tuned Netezza query performance using techniques such as clustered base tables (CBT), collocation and collocated joins.
  • Designed and implemented data profiling, cleansing, name/address enrichment and match/merge solutions using Informatica Data Quality (IDQ), with extraction from Informatica B2B Data Exchange.
  • Implemented shell scripts to capture and kill the unique process IDs of jobs initiated by Informatica (see the sketch after this list).
  • Worked with Informatica Data Director and DT Studio projects; created DT Studio scripts that were uploaded to the server to modify existing Informatica schemas using unstructured data transformation, and handled governance using the Informatica Data Director tool.
  • Used Informatica MDM to support data governance.
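
A minimal sketch of the PID capture/kill scripts described above, under assumed directory and naming conventions; all paths and job names are illustrative.

```sh
#!/bin/ksh
# Hypothetical job_pid.sh: capture or kill the unique PID of a job launched from Informatica
# (e.g. via a post-session command task). All paths and names are placeholders.
# Usage: job_pid.sh start <job_name> | job_pid.sh kill <job_name>
ACTION=$1
JOB_NAME=$2
PID_DIR=/app/infa/pids

case "$ACTION" in
    start)
        # Launch the job in the background and record its process id.
        nohup /app/scripts/"$JOB_NAME".sh > /app/logs/"$JOB_NAME".log 2>&1 &
        echo $! > "$PID_DIR/$JOB_NAME.pid"
        ;;
    kill)
        # Terminate the previously captured process id, if the PID file exists.
        if [ -f "$PID_DIR/$JOB_NAME.pid" ]; then
            kill "$(cat "$PID_DIR/$JOB_NAME.pid")" 2>/dev/null
            rm -f "$PID_DIR/$JOB_NAME.pid"
        else
            echo "No PID file for $JOB_NAME" >&2
            exit 1
        fi
        ;;
    *)
        echo "Usage: $0 {start|kill} <job_name>" >&2
        exit 2
        ;;
esac
```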

Environment: Informatica Power Center 9.1, 8.6, Oracle 9i, Informatica B2B DT, Informatica B2B DX, HP Quality Center 10.0, Control-M, Tableau (versions 8.3x and 9.0), MS Visio, UNIX Shell Scripting, Netezza, AIX, Teradata, SQL Developer, TOAD, PL/SQL (Stored Procedures, Triggers, Packages), Windows NT/XP.

Confidential, Cupertino, CA

Informatica & Netezza Developer

Responsibilities:

  • Parsed high-level design spec to simple ETL coding and mapping standards.
  • Used Informatica designer for developing various mappings.
  • Designed and developed ETL mappings using Informatica to extract data from flat files and Oracle and load it into the target database.
  • Worked on integrating Informatica B2B DT with Informatica PowerCenter.
  • Worked with Informatica B2B Data Transformation to read unstructured and semi-structured data and load it to the target.
  • Worked extensively on Netezza and on UNIX shell scripting.
  • Extensive knowledge of the IBM Netezza dollar prompt, arrow prompt and SPUs, and of performance-tuning techniques such as clustered base tables (CBT), collocation and collocated joins (see the sketch after this list).
  • Worked on large data processing for efficiency of MPP (Massively Parallel Processing).
  • Extensively worked on creating Netezza groups, users, tables and views.
  • Designed and developed various major data feeds and supervised the implementation of various Netezza data sources.
  • Used Aqua Data Studio for Netezza as a database query tool and for administration tasks.
  • Worked with the Netezza appliance's FPGA-based S-blades during data loading.
  • Worked closely on data transfer from the source database to a bulk reader and loaded the bulk data into the IBM PureData (Netezza) system; also maintained a transaction log using CDC, filtered the required tables and loaded the Netezza system from a stream loader.
  • Successfully created complex Informatica mappings to load data in to the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator.
  • Extensively used various transformations like Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookup, Rank and Filter.
  • Used the QlikView MVC pattern for the user interface and data extraction.
  • Worked on SQL tools like TOAD to run SQL Queries to validate the data.
  • Developed Batch Jobs using UNIX Shell scripts to automate the process of loading, pushing and pulling data from and to different servers.
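
A minimal nzsql sketch of the distribution and clustered-base-table (CBT) approach referenced above for collocated joins; the host, database, table and column names are assumptions for illustration only.

```sh
#!/bin/ksh
# Hypothetical sketch: create collocated fact/dimension tables and a clustered base table (CBT)
# so joins on the distribution key stay on the same SPU. All object names are placeholders.
nzsql -host nzhost -d SALES_DW -u "$NZ_USER" -pw "$NZ_PWD" <<'EOF'
-- Distribute both tables on the join key so the join is collocated.
CREATE TABLE dim_customer
(
    customer_id  INTEGER NOT NULL,
    customer_nm  VARCHAR(100)
)
DISTRIBUTE ON (customer_id);

-- Clustered base table: ORGANIZE ON keeps frequently filtered columns physically clustered,
-- improving zone-map effectiveness for range-restricted scans.
CREATE TABLE fact_sales
(
    customer_id  INTEGER NOT NULL,
    sale_dt      DATE,
    amount       NUMERIC(18,2)
)
DISTRIBUTE ON (customer_id)
ORGANIZE ON (sale_dt, customer_id);

-- Collocated join: both inputs are distributed on customer_id, so no redistribution is needed.
SELECT d.customer_nm, SUM(f.amount)
FROM   fact_sales f
JOIN   dim_customer d ON d.customer_id = f.customer_id
WHERE  f.sale_dt >= '2015-01-01'
GROUP  BY d.customer_nm;
EOF
```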

Environment: Informatica Power Center 9.5.x, SQL Server 2005/2008, Informatica PowerExchange, Informatica B2B DT, Informatica B2B DX, Netezza, Oracle 10g, BO XI 3.1, Erwin 4.5, Microsoft Visio, QlikView, SQL*Plus, TOAD, UNIX shell scripting, Windows XP, UNIX.

Confidential, New York, NY

Informatica Developer

Responsibilities:

  • Interacted with Business Analysts to gather and analyze the Business Requirements.
  • Involved in designing and forward engineering using Tableau.
  • Prepared Technical Required Documents along with Design and Mapping documents.
  • Worked with BA/BSA, Data Modeler/Architect team in designing the Data Model for the project.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline the data to Database.
  • Used Informatica Power Center to create mappings, mapplets, User defined functions, workflows, worklets, sessions and tasks.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update Strategy and Router transformations to populate target tables efficiently.
  • Used shell scripts to perform initial file validations, such as duplicate checks, counts (control and balance), file-size and structural validations, and date/data-type format validations, on input and intermediate files coming from a third-party system (see the sketch after this list).
  • Implemented the control-and-balance piece at all required levels using an audit, balance and control (ABC) layer.
  • Created Sessions, Workflows, Link Tasks and Command Tasks using Workflow Manager and monitored the workflows/sessions using Workflow Monitor.
  • Worked as an Informatica administrator to set up and maintain Informatica folders and files, and handled code releases to the production environment.
  • Managed and maintained existing Informatica interfaces such as Tidal, PowerExchange and ODBC connections.
  • Extensively used stored procedures, functions and also worked with parallel processing.
  • Worked with tasks such as Timer, Event Wait, Event Raise, Control and Decision.
  • Created UNIX shell scripts for FTP/MFT, error handling, error reports, parameter files, etc.
  • Involved in performance tuning of the complex mappings and extracting reports using Tableau.
  • Performed Unit testing and worked closely with offshore testing team in preparing test cases for System Integration testing.
  • Involved in migration of the mappings, Sessions and other Objects from Development to QA and to Production environment.
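
A minimal sketch of the initial file validations described above (duplicate check, control-and-balance count check, structural check); the file names, delimiter, column count and control-file layout are assumptions.

```sh
#!/bin/ksh
# Hypothetical validate_feed.sh: pre-load checks on a pipe-delimited third-party file.
# File name, delimiter, expected column count and control-file layout are all assumptions.
DATA_FILE=$1                 # e.g. /data/inbound/claims_20150301.dat
CTRL_FILE=$2                 # control file holding the expected record count
EXPECTED_COLS=12

# 1. File exists and is non-empty.
[ -s "$DATA_FILE" ] || { echo "ERROR: missing or empty file $DATA_FILE" >&2; exit 1; }

# 2. Control & balance: actual record count must match the count in the control file.
actual_cnt=$(wc -l < "$DATA_FILE")
expected_cnt=$(cut -d'|' -f2 "$CTRL_FILE")
if [ "$actual_cnt" -ne "$expected_cnt" ]; then
    echo "ERROR: record count $actual_cnt does not match control count $expected_cnt" >&2
    exit 2
fi

# 3. Duplicate check on the first field (assumed business key).
dups=$(cut -d'|' -f1 "$DATA_FILE" | sort | uniq -d | wc -l)
[ "$dups" -eq 0 ] || { echo "ERROR: $dups duplicate keys found" >&2; exit 3; }

# 4. Structural check: every record must have the expected number of columns.
bad_rows=$(awk -F'|' -v n="$EXPECTED_COLS" 'NF != n' "$DATA_FILE" | wc -l)
[ "$bad_rows" -eq 0 ] || { echo "ERROR: $bad_rows rows with wrong column count" >&2; exit 4; }

echo "Validation passed for $DATA_FILE"
```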

Environment: Informatica Power Center 8.6.1 & Informatica Power Center 9.1(Early updates), SQL Server 2008, Oracle10g, Oracle SQL Developer, SQL Server Management Studio, Tableau (Extensive use of Tableau 8.1x), SQL Plus, TOAD, Shell scripting, Windows XP, UNIX.

Confidential, Irving, TX

Informatica Developer

Responsibilities:

  • Extensively worked on Informatica Power Center tools- Mapping Designer, Workflow Manager, Workflow Monitor and Repository Manager and Informatica Power Exchange.
  • Worked with the data modeler and implemented the corresponding changes in Informatica as the model changed.
  • Imported source definitions into the Source Analyzer from Oracle; for DB2 and VSAM files, used Informatica PowerExchange in the Source Analyzer.
  • Used Informatica PowerExchange in the Target Designer to import target definitions from DB2 and VSAM files.
  • Worked with business analysts, translating business requirements into functional requirements.
  • Managed people effectively in extension programs that required constant planning and development.
  • Exposed to Teradata BTEQ for communicating with workstations and enabling users to work with one or more databases.
  • Developed and Tested new mappings.
  • Created and configured sessions for workflows.
  • Created and maintained parameter files for workflows in UNIX and CVS (see the sketch after this list).
  • Worked on Joiner, Aggregator, Update Strategy, Rank, Router, Lookup (static and Dynamic), Sequence Generator, Filter, Sorter and Source Qualifier.
  • Extensively involved in fine-tuning Informatica code (mappings and sessions) by using SQL overrides in Source Qualifier and Lookup transformations to obtain optimal performance and throughput.
  • Implemented slowly changing dimensions (Type I, II and III) in different mappings as per the requirements.
  • Created connections for relational, Non-relational and applications in the workflow manager.
  • Created mappings using Reusable Transformations.
  • Migrated mappings, workflows and parameter files from development to production.
  • Developed PL/SQL scripts such as packages and stored procedures for data updates.
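
A minimal sketch of how a workflow parameter file might be generated, versioned in CVS and passed to pmcmd; the folder, workflow, parameter and connection names are illustrative, not the actual project objects.

```sh
#!/bin/ksh
# Hypothetical sketch: build a parameter file for a workflow run, check it into CVS,
# and start the workflow with it. All object names and paths are placeholders.
PARAM_FILE=/app/infa/param/wf_load_policy_dim.parm
RUN_DT=$(date +%Y-%m-%d)

# Write the parameter file: section header is [FolderName.WF:WorkflowName].
cat > "$PARAM_FILE" <<EOF
[FOLDER_POLICY.WF:wf_load_policy_dim]
\$\$RUN_DATE=$RUN_DT
\$\$SRC_SCHEMA=STG
\$DBConnection_TGT=EDW_ORA_CONN
EOF

# Version the parameter file (assumes it is already under CVS control).
(cd /app/infa/param && cvs commit -m "param file for $RUN_DT" wf_load_policy_dim.parm)

# Run the workflow with the generated parameter file.
pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" -p "$INFA_PWD" \
    -f FOLDER_POLICY -paramfile "$PARAM_FILE" -wait wf_load_policy_dim
```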

Environment: Informatica Power Center (8.6.1,9.1), Informatica Power Exchange (8.1/8.5/8.6), SQL Server 2008, Teradata 12.0, Oracle10g, Oracle SQL Developer, SQL Server Management Studio, Tableau, SQL/SQL Plus, Windows XP/NT, UNIX Shell Scripting

Confidential

Informatica ETL Developer

Responsibilities:

  • Extensive user interaction to gather requirements and perform business analysis.
  • Extensively worked with Aggregator, Joiner, Lookup, Rank, Sorter, Sequence Generator, Union, Filter, Router and Update Strategy transformations.
  • Participated in converting requirements to detailed design.
  • Profiled source systems to identify data gaps and data quality issues.
  • Designed error-handling strategy for Informatica ETL.
  • Created technical specifications for all mappings and interfaces.
  • Unit testing of mapping/sessions and batches.
  • Analyzed the source systems to create a road map for the ETL process.
  • Designed and developed the Informatica mappings for source system data extraction; data staging, transformation, validation, and error handling.
  • Designed testing procedures and test plans for mappings, workflows, and interfaces.
  • Developed scheduled processing scripts using the AutoSys scheduler (see the sketch after this list).
  • Coordinated and executed historical data loads.
  • Created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Extensive work with Informatica Stored Procedure transformations for business requirements.
  • Managed Change control implementation and coordinating daily, monthly releases and reruns.
  • Performance tuning of the process at the mapping level, session level, source level, and the target level.
  • Extensively used mapplets in mappings, thereby saving valuable design time and effort.
  • Extensively worked with various Look up Caches like Static, Dynamic, Persistent, and Shared Caches.
  • Worked with session logs, the Informatica Debugger and performance logs for error handling when workflows and sessions failed.
  • Worked extensively with the business intelligence team to incorporate any changes they needed in the delivered files.
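
A minimal sketch of the kind of wrapper script an AutoSys command job might call for the scheduled processing mentioned above; error handling is simplified and all names and paths are placeholders.

```sh
#!/bin/ksh
# Hypothetical run_wf.sh: called by an AutoSys command job to run an Informatica workflow
# and surface failures through the exit code so dependent jobs are held back.
FOLDER=$1
WORKFLOW=$2
LOG=/app/logs/${WORKFLOW}_$(date +%Y%m%d).log

pmcmd startworkflow -sv INT_SVC -d DOMAIN_PRD -u "$INFA_USER" -p "$INFA_PWD" \
    -f "$FOLDER" -wait "$WORKFLOW" >> "$LOG" 2>&1
rc=$?

if [ $rc -ne 0 ]; then
    # Non-zero exit fails the AutoSys job, which can then alert or block downstream jobs.
    echo "$(date): $WORKFLOW failed with return code $rc" >> "$LOG"
    exit $rc
fi
echo "$(date): $WORKFLOW completed successfully" >> "$LOG"
```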

Environment: Informatica Power Center 8.6.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Oracle 10g, Oracle SQL Developer, MLOAD, FLOAD, SQL/SQL*Plus, AIX, Windows XP.

Confidential

Informatica ETL Developer

Responsibilities:

  • Worked with source databases like Oracle, SQL Server and Flat Files.
  • Worked on Day to day administration activities like user creation, folder creation and configuring DB connections.
  • Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router and Joiner transformations.
  • Extensively worked with various Passive transformations like Expression, Lookup (connected and unconnected), Sequence Generator, Mapplet Input and Mapplet Output transformations.
  • Developed Mapplets to implement business rules using complex logic.
  • Used Informatica Repository Manager to create Repositories, User Groups and Users based on their roles.
  • Converted the PL/SQL Procedures and SQL*Loader scripts to Informatica mappings.
  • Tuned the performance of Informatica sessions for large data files by increasing the block size, data cache size and target-based commit interval.
  • Developed UNIX shell scripts to automate the data transfer (FTP) process to and from the source systems and to schedule weekly and monthly loads/jobs (see the sketch after this list).
  • Used Informatica Designer to create complex mappings using different transformations to move data to multiple databases.
  • Designed and developed pre-session, post-session and batch execution routines to run Informatica sessions using the Informatica Server Manager.
  • Used Debugger to check the errors in mapping.
  • Worked on Informatica Slowly Changing Dimension (SCD) Type I, II and III mappings using connected lookups (LKP).
  • Generated UNIX shell scripts for automating daily load processes.
  • Managed Change control implementation and coordinating daily, monthly releases and reruns.
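
A minimal sketch of the FTP automation described above; the host, paths and file mask are assumptions, and a production version would typically use SFTP or secure credential storage.

```sh
#!/bin/ksh
# Hypothetical pull_source_files.sh: fetch the weekly extract from a source system via FTP
# and hand it off to the staging directory used by the Informatica load.
SRC_HOST=srcftp01            # placeholder source host
REMOTE_DIR=/outbound/weekly
LOCAL_DIR=/data/staging
FILE_MASK="accounts_*.dat"
LOG=/app/logs/ftp_pull_$(date +%Y%m%d).log

# Non-interactive FTP session; credentials come from environment variables.
ftp -inv "$SRC_HOST" > "$LOG" 2>&1 <<EOF
user $FTP_USER $FTP_PWD
binary
cd $REMOTE_DIR
lcd $LOCAL_DIR
mget $FILE_MASK
bye
EOF

# Fail the job if nothing arrived, so the downstream load does not run on stale data.
ls "$LOCAL_DIR"/accounts_*.dat > /dev/null 2>&1 || { echo "No files pulled" >&2; exit 1; }
```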

Environment: Informatica Power Center 8.6.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Stored Procedure, Slowly Changing Dimension I, II, III), Oracle 10g, Oracle SQL Developer, MLOAD, FLOAD, SQL/SQL*Plus, UNIX (Sun Solaris).
