
Informatica Tech Lead Resume


Des Moines, IA

SUMMARY

  • More than 10 years of software experience in analysis, design, development, and testing of ETL/data warehousing solutions in domains such as Healthcare, Financial, Retail, and Hospitality.
  • Worked extensively with dimensional modeling, data migration, and ETL processes.
  • Thorough knowledge of Oracle and Teradata RDBMS architecture. Extensive use of Teradata load and export utilities such as FastLoad, MultiLoad, FastExport, and TPump.
  • Expertise in report formatting, batch processing, data loading, and export using BTEQ (see the sketch at the end of this summary).
  • Knowledge of query performance tuning, workload management, and request scheduling.
  • Expertise in Resource Management using TDQM
  • Well versed with Teradata Manager Suite for Performance Monitoring (PMON), Statistics and Alert management.
  • Responsible for all activities related to the development, implementation, administration and support of ETL processes for large-scale data warehouses using PowerCenter /PowerMart.
  • Extracted data from various sources such as Oracle, Microsoft SQL Server, flat files, COBOL, XML, and DB2.
  • Hands on experience in tuning mappings, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings, and sessions.
  • Have knowledge of Informatica MDM hub and execution of Informatica MDM hub processes.
  • Designed and developed efficient reconciliation and error-handling methods and implemented them throughout the mappings.
  • Worked in production support to resolve problems and fine-tune for performance improvement.
  • Experience in Study, Design, Analysis, Development, and Implementation of Data Warehousing concepts.
  • Experience in Full Life Cycle Implementation of Data warehouses.
  • Practical understanding of data modeling (dimensional and relational) concepts such as star-schema modeling, snowflake modeling, and fact and dimension table modeling at all three levels: view, logical, and physical.
  • Experience in various features of Erwin like Forward Engineering and Reverse Engineering.
  • Strong Experience in creating Stored Procedures and Triggers using Oracle PL/SQL.
  • Skilled in Tuning of SQL.
  • Skilled in Unit Test, System Integration Test and UAT.
  • Good experience in UNIX and writing shell scripts for Informatica pre & post session.
  • Self-Starter who can adapt and learn fast with excellent communication & Interpersonal skills.
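
For illustration, a minimal sketch of the kind of BTEQ batch report export described in this summary; the TDPID, logon credentials, and table/file names are placeholders, not details from any engagement:

#!/bin/ksh
# BTEQ batch sketch: log on, set report formatting, export a
# report file, and check the return code. All names are placeholders.
bteq <<'EOF' > /tmp/daily_report.log 2>&1
.LOGON tdprod/etl_user,etl_password
.SET WIDTH 200
.EXPORT REPORT FILE = /data/out/daily_sales_report.txt
SELECT store_id, sale_date, SUM(amount) (TITLE 'Total')
FROM   sales_db.daily_sales
GROUP  BY store_id, sale_date
ORDER  BY store_id;
.EXPORT RESET
.LOGOFF
.QUIT
EOF
# BTEQ returns a non-zero code on logon or SQL failure.
[ $? -ne 0 ] && echo "BTEQ export failed" && exit 1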

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 9.6/9.5/9.1/8.x/7.x, Informatica Data Quality, Informatica MDM, Informatica Power Exchange

Business Intelligence: Business Objects XI R3/6.5/5.1, Cognos

RDBMS: Oracle Exadata/11g/10g/9i/8i, Sybase IQ, Teradata V2R5/V2R4, IBM DB2 UDB 8.1/7.0, MS SQL Server 2014/2008, MS Access 7.0/2000

Teradata Tools & Utilities: Query facilities: Queryman (SQL Assistant), BTEQ; Load & Export: FastLoad, MultiLoad, FastExport; Teradata Manager, PMON

Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snow Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Erwin 4.1/3.5.2, Microsoft Visio

Tools: Toad, SQL Developer, Win SQL, Quest Central for DB2, SQL*Loader, Serena Dimensions, ClearCase, CVS, CA Scheduler, Tidal, AutoSys, Control-M, MANTIS, ALM, Citrix

Languages: SQL, PL/SQL, UNIX Shell Scripting, C, C++

Operating Systems: UNIX Solaris/AIX, Windows

PROFESSIONAL EXPERIENCE

Confidential, Des Moines, IA

Informatica Tech Lead

Responsibilities:

  • Created functional specifications by analyzing the requirements of different business owners.
  • Involved in the Analysis and the design of Logical and Physical Data Model for ETL mapping and the process flow diagrams.
  • Implemented complete Data warehouse architecture from initial data load to Load Ready from operational system, Load Ready to stage, Stage to Warehouse.
  • Implemented star-schema methodology to build a data warehouse for business partners, which involved creating dimensions such as Host Member Dim, Home Member Attribution, and Reconciliation.
  • Loaded data incrementally into type-2 slowly changing Dimensions
  • Extracted source data from RTMS and Facets sourcing applications to load data into Data warehouse in Sybase IQ.
  • Promoted reusability and reduced code redundancy by creating shortcuts, mapplets, reusable transformations, reusable sessions, and worklets.
  • Designed and developed mappings using Aggregator, Joiner, Normalizer, Rank, Sequence Generator, SQL, Transaction Control, connected and unconnected Lookup, source/target pre- and post-load Stored Procedure, Update Strategy, Union, and XML transformations.
  • Extensively used the Normalizer transformation to normalize (pivot) denormalized source data and the Aggregator to denormalize data in the transformation logic.
  • Used FTP and External Loader connections in sessions to Load Bulk data into Sybase IQ load ready tables.
  • Expert in using the Informatica Debugger and the Debugger expression editor to evaluate complex expressions to understand the bugs in the code.
  • Used Constraint Based loading & target load ordering to load multiple targets with PK-FK relation in the same mapping.
  • Developed workflow tasks objects like reusable Email, Event wait, Command, Decision and scheduler.
  • Used parameter files to override mapping parameters, mapping variables, workflow variables, application connection object names, relational connection names, $Param session parameters, and FTP session parameters (see the parameter-file sketch after this list).
  • Developed automation of ETL processes based on both user-defined and Predefined Event waits.
  • Created Test cases for Unit Test, System Integration Test to check the data quality and data Integrity.
  • Worked with DBA’s to develop stored procedures as per the business needs.
  • Studied session log files, thread statistics, performance statistics, and reject files to understand the root cause of data rejection and to pinpoint performance bottlenecks.
  • Improved performance of sessions with high Update volume by using performance improvement techniques like partitioning, external loaders.
  • Used pre build look up caches and Persistent Lookup Cache to improve performance.
  • Adjusted data and index cache sizes for cached transformations such as Aggregator and Sorter; tuned DTM buffer size to set the optimum buffer block size for high-performance target loading by the transformation thread; and adjusted line-sequential buffer lengths for flat-file loads to achieve optimal performance.
  • Involved in Informatica Object Migrations and script migrations from development to UAT and coordinated with change management to rollout the code to production environment.
  • Scheduled large data loads using the Informatica scheduler and used pmcmd commands to start and stop workflows from scripts.
  • Created sessions and workflows to help schedule nightly loads and process data from all source terminal Data Collection points.
  • Provided Projects related Technical documentation for the Team and updated in SharePoint.
  • Provided on-call technical support for the production team during the 3-week warranty period for the implemented project.
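
A minimal sketch, under assumed names, of the parameter-file and pmcmd usage called out above; the service, domain, folder, workflow, and connection names are hypothetical:

#!/bin/ksh
# Sketch: build a parameter file, then start the workflow with pmcmd.
# INT_SVC, DOM_PROD, DWH_FOLDER, and wf_load_member_dim are placeholders.
PARAM_FILE=/etl/params/wf_load_member_dim.par

# Standard Informatica parameter-file layout: a [folder.WF:workflow]
# header followed by parameter and connection overrides.
cat > "$PARAM_FILE" <<EOF
[DWH_FOLDER.WF:wf_load_member_dim]
\$\$LOAD_DATE=$(date +%Y-%m-%d)
\$DBConnection_SRC=ORA_SRC_CONN
\$InputFile1=/data/in/members.dat
EOF

# -wait blocks until completion; pmcmd exits non-zero on failure.
pmcmd startworkflow -sv INT_SVC -d DOM_PROD -u "$INFA_USER" -p "$INFA_PWD" \
    -f DWH_FOLDER -paramfile "$PARAM_FILE" -wait wf_load_member_dim
exit $?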

Environment: Informatica Power Center 9.6.1, Informatica Power Exchange, SQL Server 2014, Sybase IQ, MS Visio, Win SQL, UNIX Solaris 10, Ultra Edit, Microsoft SharePoint, Serena Dimensions

Confidential, Columbus, OH

Sr Informatica Developer

Responsibilities:

  • Created technical specification documents by analyzing the business requirement and functional requirement documents
  • Worked with the Informatica MDM Hub, executing MDM Hub processes and constructing the schema used in the MDM Hub implementation and stored in the Hub Store.
  • Created workflows and worklets for new mappings. Fine-tuned the existing mappings by analyzing data flow.
  • Used Normalizer transformation to normalize the data from COBOL source files.
  • Created pre and post session stored procedures for dropping and recreating of indexes.
  • Attended the data mapping sessions with the client to map the fields from client source system to ABMS system.
  • Worked with DBA to create users, roles and profiles.
  • Used various transformations like Source Qualifier, Lookup, Update Strategy, Router, Filter, Sequence Generator and Joiner on the extracted source data according to the business rules and technical specifications.
  • Created reusable transformations and Mapplet and used with various mappings.
  • Created Connected, Unconnected and Dynamic lookup transformation for better performance and increased the cache file size based on the size of the lookup data.
  • Applied various optimization techniques to Aggregator, Lookup, and Joiner transformations.
  • Created mappings to generate parameter files used to load the data from the source system.
  • Created Oracle stored procedures to implement complex business logic with good performance and called them from Informatica using the Stored Procedure transformation.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors that occurred while loading.
  • Involved in unit testing and system integration testing to validate the data.
  • Scheduled the ETL jobs using CA Scheduler.
  • Wrote shell scripts to read the process date from the header record and update the process control table used to drive the data loads (see the sketch after this list).
  • Used the pmcmd command to start, stop, and ping the server from UNIX and created UNIX shell scripts to automate the process.
  • Created UNIX shell scripts and called them as pre session and post session commands
  • Worked with Caliber RM and Serena Version Manager to maintain the versioning of various documents
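
A minimal sketch, with an assumed file layout and table, of the header-record process-date script mentioned above:

#!/bin/ksh
# Sketch: read the process date from a file's header record and stamp
# it into a process-control table. The header layout (date in columns
# 2-9 as YYYYMMDD), table, and column names are assumptions.
SRC_FILE=/data/in/claims_daily.dat
PROC_DATE=$(head -1 "$SRC_FILE" | cut -c2-9)

sqlplus -s "$ORA_USER/$ORA_PWD@ORAPROD" <<EOF
WHENEVER SQLERROR EXIT 1
UPDATE process_control
SET    process_date = TO_DATE('$PROC_DATE', 'YYYYMMDD'),
       status       = 'READY'
WHERE  job_name     = 'CLAIMS_DAILY_LOAD';
COMMIT;
EXIT;
EOF
[ $? -ne 0 ] && echo "Control table update failed" && exit 1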

Environment: Informatica Power Center 9.5.1, Informatica MDM, IBM AIX V6.1, Oracle 10g, COBOL files, PL/SQL, SQL*Loader, SQL Developer 3.2, CA Scheduler, MS Visio.

Confidential, Dallas, Texas

Sr Informatica Developer

Responsibilities:

  • Designed and developed efficient error-handling and performance-improvement methods and implemented them throughout the mappings.
  • Coordinated with the architectural team to come up with possible solutions.
  • Built a campaign management data warehouse for VCA Animal Hospitals, which involved creating dimensions such as Confidential, Client, Clinic, Breed, Product, Segment, and Campaign, and facts such as Order Fct, Campaign Response Fct, Cmt Disp Fct, and Digital Comm Summary Fct.
  • Assisted the production support team with any load or performance issues in the existing processes.
  • Cleansed the source data, extracted and transformed data with business rules, and built reusable components such as Mapplets, Reusable transformations and sessions etc.
  • Developed complex Informatica mappings to load the data from various sources using different transformations like source qualifier, connected and unconnected look up, update Strategy, expression, aggregator, joiner, filter, normalizer, rank and router transformations.
  • Developed and Implemented Informatica parameter files to filter the daily data from the source system.
  • Ensured that ETL standards were followed across the data warehouse.
  • Used Informatica debugging techniques to debug the mappings and used session log files to trace errors that occur while loading.
  • Performed Unit Testing and Integration Testing on the mappings.
  • Involved in designing Audit process and Reconciliation processes for the sessions loaded as part of mapping design.
  • Responsible for Creating workflows and worklets. Created Session, Event, Command, Control Decision and Email tasks in Workflow Manager
  • Tuned performance of Informatica sessions by studying session log files to understand the root cause of performance bottlenecks, and applied techniques such as sorting, data caching, and session partitioning.
  • Developed slowly changing dimension mappings implementing Type 1 and Type 2 SCD logic.
  • Wrote pre-session and post-session shell scripts for dropping and creating table indexes, email tasks, and various other applications.
  • Created materialized views for summary tables for better query performance (see the sketch after this list).
  • Prepared detailed design documentation for the production support department to use as a hand guide for future production runs before the code was migrated.
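
A minimal sketch of a summary materialized view of the kind described above; the view, table, and column names are illustrative only:

#!/bin/ksh
# Sketch: create a summary materialized view for reporting queries.
# REFRESH COMPLETE ON DEMAND keeps the example free of the MV-log
# prerequisites that a FAST refresh would need.
sqlplus -s "$ORA_USER/$ORA_PWD@ORAPROD" <<'EOF'
WHENEVER SQLERROR EXIT 1
CREATE MATERIALIZED VIEW mv_campaign_response_sum
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
AS
SELECT campaign_key,
       response_date,
       COUNT(*)       AS response_cnt,
       SUM(order_amt) AS order_amt_sum
FROM   campaign_response_fct
GROUP  BY campaign_key, response_date;
EXIT;
EOF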

Environment: Informatica Power Center 9.1/8.6.1, Informatica Data Quality, Informatica Metadata Manager, Oracle Exadata/10g, SQL Server 2008, PL/SQL, MS Visio, TOAD, SQL Developer, UNIX AIX V6, WinSCP, Business Objects XI R3, Microsoft SharePoint, Tidal

Confidential, TX

Sr. Informatica (ETL) Developer

Responsibilities:

  • Involved in the study of the business logic and coordinated with the client to gather user requirements.
  • Created technical specification documents like system design and detail design documents for the development of Informatica Extraction, Transformation and Loading (ETL) of data into various tables and defining ETL standards.
  • Designed and developed ETL process using Informatica tool to load data from wide range of sources such as Oracle, DB2, SQL Server, XML and Flat Files.
  • Based on that logic, developed various mappings and mapplets to load data from various sources using transformations such as Source Qualifier, Filter, Expression, Lookup, Router, Update Strategy, Sorter, Aggregator, Joiner, and SQL.
  • Created mappings implementing error-handling logic that sets error/OK flags and an error message depending on the source data and the lookup outputs.
  • Developed Informatica parameter files to filter the daily data from the source system. Created mappings in the designer to implement Type 2 SCD.
  • Fine-tuned the mappings by analyzing data flow and worked with static and dynamic memory caches for better throughput of sessions containing Lookup, Joiner, and Aggregator transformations.
  • Responsible for migrating folders, mappings, and sessions from development to test, and created migration documents to move the code from one environment to another.
  • Responsible for loading data, including history data, into the warehouse using Oracle Loader.
  • Worked with Stored Procedure Transformation for time zone conversions.
  • Created UNIX scripts to transfer (FTP) the files from Windows server to the specified location in UNIX server.
  • Created UNIX scripts to automate activities such as starting, stopping, and aborting Informatica workflows using the PMCMD command.
  • Created batch scripts on the Windows server to archive and delete files, and scheduled these scripts using AutoSys.
  • Used Oracle index techniques such as B*Tree and bitmap indexes to improve query performance, and created scripts to update table statistics for better explain plans (see the sketch after this list).
  • Created materialized views to summarize data per user requirements and improve the performance of Business Objects report queries.
  • Extensively involved in the analysis and tuning of the application code (SQL).
  • Created Test cases for Unit Test, System Integration Test and UAT to check the data.
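
A minimal sketch of the post-load statistics refresh mentioned above; the schema and table names are placeholders:

#!/bin/ksh
# Sketch: refresh optimizer statistics after a load so explain plans
# reflect current data volumes. DWH and ORDER_FCT are placeholders.
sqlplus -s "$ORA_USER/$ORA_PWD@ORAPROD" <<'EOF'
WHENEVER SQLERROR EXIT 1
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname => 'DWH',
    tabname => 'ORDER_FCT',
    cascade => TRUE);  -- also refresh the table's index statistics
END;
/
EXIT;
EOF
[ $? -ne 0 ] && echo "Statistics refresh failed" && exit 1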

Environment: Informatica 8.6 (Power Center Designer, Workflow Manager, Administrator, and Repository Manager), Oracle 11g, DB2, flat files, MS SQL Server 2008, PL/SQL, SQL*Loader, TOAD, Business Objects 6.5, UNIX Sun Solaris 9, AutoSys, ClearCase.

Confidential, Dallas, Texas

Sr Informatica Developer

Responsibilities:

  • Involved in the Analysis and the design of Logical and Physical Data Model for ETL mapping and the process flow diagrams.
  • Implemented star-schema methodology to build a campaign management data warehouse for business partners, which involved creating dimensions such as Product, Time, Source, Segment, Event, and Campaign Type.
  • Imported Informatica Data Quality (IDQ) plans as mapplets into Power Center to analyze, standardize, and cleanse the source data.
  • Created Power Exchange data maps based on the copybooks to bring the VSAM mainframe source files into Oracle staging tables.
  • Promoted reusability and reduced code redundancy by creating shortcuts, mapplets, reusable transformations, reusable sessions, and worklets.
  • Designed and developed mappings using Aggregator, Joiner, Normalizer, Rank, Sequence Generator, SQL, Transaction Control, connected and unconnected Lookup, source/target pre- and post-load Stored Procedure, Update Strategy, Union, and XML transformations.
  • Extensively used the Normalizer transformation to normalize (pivot) denormalized source data and the Aggregator to denormalize data in the transformation logic.
  • Used FTP and External Loader connections in sessions to Load Bulk data into the oracle target tables.
  • Used parameter files to override mapping parameters, mapping variables, workflow variables, and application connection object names.
  • Developed automation of ETL processes based on both user-defined and predefined event waits.
  • Developed and loaded summary/aggregate tables for better reporting performance (see the sketch after this list).
  • Studied session log files, thread statistics, performance statistics files, and reject files to understand the root cause of data rejection and to pinpoint performance bottlenecks.
  • Used pushdown optimization on both the source and target sides to push transformation logic into the database and improve performance.
  • Extensively used pre-session and post-session variable assignment to increment and carry counter values forward for auditing purposes.
  • Provided Projects related Technical documentation for the Team and updated in SharePoint.
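
A minimal sketch of a summary/aggregate table refresh of the kind described above; the table and column names are illustrative:

#!/bin/ksh
# Sketch: rebuild an aggregate table with a direct-path insert.
# The /*+ APPEND */ hint requests a direct-path (bulk) load.
sqlplus -s "$ORA_USER/$ORA_PWD@ORAPROD" <<'EOF'
WHENEVER SQLERROR EXIT 1
TRUNCATE TABLE campaign_event_agg;
INSERT /*+ APPEND */ INTO campaign_event_agg
SELECT campaign_key,
       event_month,
       COUNT(*)       AS event_cnt,
       SUM(event_amt) AS event_amt_sum
FROM   campaign_event_fct
GROUP  BY campaign_key, event_month;
COMMIT;
EXIT;
EOF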

Environment: Informatica Power Center 8.6, Informatica Power Exchange 8.6, Informatica Data Quality, Oracle Exadata, SQL Server, PL/SQL, MS Visio, TOAD, UNIX AIX, WINSCP, Cognos, Microsoft SharePoint, Tidal

Confidential, Birmingham, AL

ETL Developer

Responsibilities:

  • Good knowledge of SYNCSORT for change data capture and of MANTIS for raising defects and tracking them to resolution.
  • Involved in fine-tuning the SQL queries in the transformations.
  • Involved in creating workflows and worklets for new mappings.
  • Worked extensively on performance tuning of existing mappings by analyzing data flow.
  • Involved in modifying stored procedures in coordination with the line of business.
  • Proficient in understanding business processes / requirements and translating them into technical requirements
  • Modified a number of Informatica Mappings, Mapplets and Transformations to load data from relational and flat file sources into the data mart.
  • Used various transformations like Source Qualifier, Lookup, Update Strategy, Router, Filter, Sequence Generator and Joiner on the extracted source data according to the business rules and technical specifications.
  • Created reusable transformations and Mapplet and used with various mappings.
  • Created Connected, Unconnected and Dynamic lookup transformation for better performance and increased the cache file size based on the size of the lookup data.
  • Applied various optimization techniques to Aggregator, Lookup, and Joiner transformations.
  • Developed and Implemented Informatica parameter files to filter the daily data from the source system.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors that occurred while loading.
  • Created test cases for unit test, system integration test, and UAT to check the data.
  • Performed regression testing and integration testing.
  • Developed shell scripts for getting the data from source systems to load into the data warehouse (see the sketch after this list).
  • Used the PMCMD command to start, stop, and ping the server from UNIX and created UNIX shell scripts to automate the process.
  • Created UNIX shell scripts and called them as pre session and post session commands
  • Worked with Serena Version Manager to maintain the versioning of various documents
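
A minimal sketch of a batch FTP pull of a source extract; the host, credentials, and paths are placeholders (real credentials would live in a secured config, not the script):

#!/bin/ksh
# Sketch: pull the daily extract from a source host over FTP.
SRC_HOST=src-app-01
LOCAL_DIR=/data/in
FILE=claims_$(date +%Y%m%d).dat

ftp -n "$SRC_HOST" <<EOF
user etl_user etl_password
binary
cd /outbound/extracts
lcd $LOCAL_DIR
get $FILE
bye
EOF

# ftp can exit 0 even when a get fails, so verify the file landed.
[ -s "$LOCAL_DIR/$FILE" ] || { echo "Transfer failed: $FILE"; exit 1; }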

Environment: Informatica Power Center 8.1.1, IBM AIX, Oracle 10g, flat files, DB2, SQL Server 2000, Cognos, UNIX scripting, PL/SQL, SQL*Loader, Erwin 4.1, TOAD 7.6, MANTIS.

Confidential, Minneapolis, MN

Teradata Developer

Responsibilities:

  • Coordinated with the client and gathered the user requirements to create specs.
  • Implemented logical and physical data modeling with star and snowflake techniques using Erwin in the data mart.
  • Created tables, views in Teradata, according to the requirements.
  • Created proper primary indexes (PIs), taking into consideration both planned access of data and even distribution of data across all the available AMPs.
  • Created appropriate NUSIs for smooth (fast and easy) access of data, considering both the business requirements and the relevant design factors.
  • Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) to the Teradata RDBMS using BTEQ, FastLoad, MultiLoad, and TPump.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements against the Teradata RDBMS. Modified BTEQ scripts to load data from the Teradata staging area to the Teradata data mart.
  • Developed scripts to load high-volume data into empty tables using the FastLoad utility (see the sketch after this list). Used the FastExport utility to extract large volumes of data at high speed from the Teradata RDBMS. Created stored procedures in Teradata SQL Assistant. Performed tuning of Teradata SQL statements using the Teradata EXPLAIN plan.
  • Worked with the DBA to develop database backup and recovery strategies, and implemented and administered them using backup tools such as ARC and NetVault.
  • Created ETL scripts and procedures to extract data and populate the data warehouse.
  • Designed various financial reports in NCR's Basic Teradata Query (BTEQ).
  • Implemented ad-hoc techniques to achieve business needs.
  • Used various transformations like Source Qualifier, Lookup, Update Strategy, Router, Filter, Sequence Generator and Joiner on the extracted source data according to the business rules and technical specifications.
  • Performed scheduling techniques with ETL jobs using scheduling tools, cron jobs through pmcmd commands, based on the business requirement.
  • Developed shell scripts for getting the data from source systems to load into the data warehouse. Created UNIX shell scripts and called them as pre-session and post-session commands.
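
A minimal sketch of the FastLoad usage described above; the TDPID, credentials, and table/file names are placeholders:

#!/bin/ksh
# Sketch: FastLoad a pipe-delimited file into an empty staging table.
# VARTEXT mode requires all fields to be DEFINEd as VARCHAR.
fastload <<'EOF' > /tmp/fl_stg_sales.log 2>&1
LOGON tdprod/etl_user,etl_password;
DATABASE stg_db;
BEGIN LOADING stg_sales ERRORFILES stg_sales_err1, stg_sales_err2;
SET RECORD VARTEXT "|";
DEFINE store_id  (VARCHAR(10)),
       sale_date (VARCHAR(10)),
       amount    (VARCHAR(18))
FILE = /data/in/sales_daily.dat;
INSERT INTO stg_sales VALUES (:store_id, :sale_date, :amount);
END LOADING;
LOGOFF;
EOF
[ $? -ne 0 ] && echo "FastLoad failed; check /tmp/fl_stg_sales.log" && exit 1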

Environment: Informatica PowerCenter 7.1, NCR UNIX server, Teradata RDBMS V2R5, TUF 6.1 (Teradata Utility Foundation) including Teradata SQL Assistant, Teradata Manager 6.0, PMON, BTEQ, MLOAD, Control-M.
