
Informatica ETL Developer Resume Profile


EXPERIENCE SUMMARY:

6 years of experience in the IT industry spanning software analysis, design, development, integration, maintenance, reporting, coding, bug fixing, specification writing, production implementation, and testing.

  • Extensively used Source Qualifier, Connected and Unconnected Lookup, Normalizer, Router, Filter, Expression, Aggregator, Stored Procedure, Sequence Generator, Sorter, Joiner, Update Strategy, and Union transformations, as well as Mapplets.
  • Experience in Mappings, Mapplets, Worklets, Reusable Transformations, Sessions/Tasks, Workflows and Batch Processes in Informatica Server.
  • Extracted data from multiple operational sources to load the staging area, data warehouse, and data marts using SCD Type 1, Type 2, and Type 3 loads (see the SCD Type 2 sketch after this list).
  • Experience in data warehousing and in data extraction, transformation, and loading (ETL) from various sources like Oracle, SQL Server, Microsoft Access, Microsoft Excel, and flat files into data warehouses and data marts using Informatica PowerCenter 9.0/8.6.0/8.1.1/7.x/6.x.
  • Experience in monitoring and scheduling jobs using UNIX Korn and Bourne shell scripting.
  • Experience with high volume datasets from various sources like Oracle, Flat files, SQL Server and XML.
  • Involved in the migration process across Development, Test, and Production environments. Extensive database experience using Oracle 11g/10g/9i, Sybase, MS SQL Server 2008/2005, MySQL, SQL, PL/SQL, SQL*Plus, and Teradata.
  • Experienced in designing ETL procedures and strategies to extract data from heterogeneous source systems like Oracle 11g/10g, SQL Server 2008/2005, DB2 10, Flat files, XML, SAP R/3 etc.
  • Worked through the full Software Development Life Cycle (SDLC), including application development and ETL/OLAP processes.
  • Extensive knowledge in architecture design of the Extract, Transform, Load environment using Informatica PowerMart and PowerCenter.
  • Worked extensively on Error Handling, Performance Analysis and Performance Tuning of Informatica ETL Components and Teradata Utilities, UNIX Scripts, SQL Scripts etc.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
  • Expert in the Informatica ETL suite, including PowerCenter, PowerExchange, PowerConnect, Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server Administration Console, IDE (Informatica Data Explorer), and IDQ (Informatica Data Quality).
  • Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
  • Experience in using utilities such as pushdown optimization, CDC techniques, and partitioning; implemented Slowly Changing Dimension Type 1 and Type 2 methodology for accessing the full history of accounts and transaction information.
  • Populated and refreshed Teradata tables using the FastLoad, MultiLoad, and FastExport utilities for user acceptance testing and for loading history data into Teradata.
  • Involved in developing strategies for the Extraction, Transformation, and Loading (ETL) mechanism using the DataStage tool.
  • Expertise in gathering, analyzing and documenting business requirements, functional requirements, and data specifications for Business Objects Universes and Reports.
  • Converted the data mart from logical design to physical design; defined data types, constraints, and indexes; generated the schema in the database; created automated scripts; and defined storage parameters for the objects in the database.
  • Worked on complex queries to map the data as per the requirements.
  • Strong data modeling experience in ODS and dimensional data modeling methodologies like Star Schema and Snowflake Schema. Designed and developed OLAP models consisting of multi-dimensional cubes and drill-through functionality for data analysis.
  • Proven record of success in the design, development, and implementation of software applications using object-oriented technology.
  • Good exposure to mainframe systems and knowledge in handling COBOL files.
  • Well versed in writing UNIX shell scripts.
  • Strong decision-making and interpersonal skills with results-oriented dedication toward goals.
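
As a concrete illustration of the SCD Type 2 loads referenced above, below is a minimal sketch of a KSH/BTEQ pattern that expires changed dimension rows and inserts new current versions. The logon string and the stg.customer_stg / dw.customer_dim objects are hypothetical stand-ins, not the actual project code.

    #!/bin/ksh
    # Minimal SCD Type 2 sketch (KSH wrapping BTEQ): expire the current
    # dimension row when a tracked attribute changes, then insert a new
    # current version. All logon details and object names are hypothetical.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pass;

    /* Close out current rows whose tracked attribute changed */
    UPDATE dw.customer_dim
    SET eff_end_dt = CURRENT_DATE - 1,
        current_flag = 'N'
    WHERE current_flag = 'Y'
      AND cust_id IN (
          SELECT s.cust_id
          FROM stg.customer_stg s
          JOIN dw.customer_dim d
            ON d.cust_id = s.cust_id
           AND d.current_flag = 'Y'
          WHERE s.cust_name <> d.cust_name);

    /* Insert a fresh current version for new and changed keys */
    INSERT INTO dw.customer_dim
      (cust_id, cust_name, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.cust_id, s.cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg.customer_stg s
    LEFT JOIN dw.customer_dim d
      ON d.cust_id = s.cust_id
     AND d.current_flag = 'Y'
    WHERE d.cust_id IS NULL;

    .IF ERRORCODE <> 0 THEN .QUIT 1;
    .LOGOFF;
    .QUIT 0;
    EOF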

Professional Experience:

Confidential

Informatica ETL Developer

Responsibilities:

  • Developed and thoroughly tested Informatica mappings and workflows per the technical and functional specs provided.
  • Proficient in understanding business processes and requirements, translating them into technical requirements, and assisting developers with design and analysis work.
  • Involved in requirements analysis and served in an architect role with business users for the design of tables in the new warehouse.
  • Led the performance tuning of Informatica mappings, sessions, and workflows. Developed unit test cases and performed unit testing for all developed mappings.
  • Used KSH to run Informatica workflows by passing parameters (see the pmcmd sketch after this list).
  • Strong in data warehousing concepts: fact tables, dimension tables, and Star and Snowflake schema methodologies.
  • Reviewed ETL detailed design documents for mappings and DDL specification documents covering table creation, key constraints, and data types.
  • Extracted data from various source systems like flat files, Teradata staging tables, and reporting tables.
  • Experience working with business users, application DBAs, ETL admins, and production support.
  • Participated in daily scrum meetings and attended the weekly keys meeting.
  • Designed mappings per the business requirements using transformations such as Source Qualifier, Expression, Aggregator, Lookup, Filter, Sequence Generator, Router, Union, Joiner, and Update Strategy.
  • Coordinated system/integration/UAT testing with other teams involved in the project and reviewed the master test strategy.
  • Set up ODBC, relational, native, and FTP connections for Sybase, Teradata, and flat files.
  • Involved in performance tuning of workflows and mappings; partitioned mappings for optimal performance.
  • Extensively used parameter files to standardize the ETL and make it generic.
  • Hands-on experience with PL/SQL programming, triggers, stored procedures, functions, and packages. Strong understanding of Oracle architecture and the design and implementation of slowly changing dimensions.
  • Developed UNIX shell scripts to automate the data warehouse loading.
  • Created stored procedures to transform the data and worked extensively in PL/SQL for the various transformation needs while loading the data.
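
A minimal sketch of the KSH-driven workflow launch mentioned above, using Informatica's pmcmd utility with a parameter file; the service, domain, folder, workflow, and parameter-file names are hypothetical.

    #!/bin/ksh
    # Launch an Informatica workflow with a parameter file via pmcmd and
    # fail the script if the workflow fails. Service, domain, folder,
    # workflow, and file names are hypothetical.
    PARAM_FILE=/etl/params/wf_load_customer.parm

    pmcmd startworkflow \
      -sv INT_SVC_PROD -d DOM_PROD \
      -u "$INFA_USER" -p "$INFA_PASS" \
      -f CUSTOMER_DM \
      -paramfile "$PARAM_FILE" \
      -wait wf_load_customer

    if [[ $? -ne 0 ]]; then
      echo "wf_load_customer failed" >&2
      exit 1
    fi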

Environment: Informatica 9.5, Sybase, Teradata, Oracle SQL Developer, UNIX Shell Scripting, Flat Files, COBOL Files, Windows, MS Outlook, SQL Assistant, CA ESP, PuTTY.

Confidential

Teradata/Informatica ETL Developer

Responsibilities:

  • Responsible for creating workflows and worklets. Created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
  • Performed extensive testing and wrote SQL queries to verify data loading. Performed unit testing at various levels of the ETL and documented the results.
  • Used Informatica debugging techniques to debug mappings, and used session log files and bad files to trace errors that occurred while loading.
  • Developed Informatica Mappings, Mapplets and Transformations to load data from different relational sources like Oracle, Teradata, DB2, and flat files sources into the data mart.
  • Developed mappings in Informatica to load data from sources such as sequential files into target Teradata tables and sequential files.
  • Extensively worked on designing business views and writing SQL queries in Teradata to meet specific business requirements for generating data (see the view sketch after this list).
  • Handled documentation, including preparing migration checklists, fixing errors and documenting the changes made, and recording testing results.
  • Worked with QA team to create test cases and validated the source and target tables.
  • Used CA ESP for scheduling the jobs in UAT and Production environment.
  • Used OBIEE for preparing reports and analysis.
  • As this was a single-resource project, took complete ownership of the code and requirements, from preparing the TDD through code migration.
  • Moved code from DEV to QA to PROD and performed complete unit testing, integration testing, and user acceptance testing across the three environments.
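
A minimal sketch of the kind of Teradata business view described above, wrapped in the same KSH/BTEQ style; all database, table, and column names are hypothetical.

    #!/bin/ksh
    # Build a simple Teradata business view over a sales fact table.
    # Database, table, and column names are hypothetical.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pass;

    REPLACE VIEW bv.monthly_store_sales AS
    SELECT s.store_id,
           EXTRACT(YEAR FROM s.sale_dt)  AS sale_year,
           EXTRACT(MONTH FROM s.sale_dt) AS sale_month,
           SUM(s.sale_amt)               AS total_sales,
           COUNT(*)                      AS txn_count
    FROM edw.sales_fact s
    GROUP BY 1, 2, 3;

    .IF ERRORCODE <> 0 THEN .QUIT 1;
    .LOGOFF;
    .QUIT 0;
    EOF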

Environment: Informatica 9.1, Teradata, Toad, Oracle SQL Developer, UNIX Shell Scripting, Windows, MS Outlook, SQL Assistant, CA ESP, OBIEE.

Confidential

ETL/Teradata Developer

Responsibilities:

  • Handled file transfers through SFTP scripts run from Informatica, along with UNIX shell scripts that send success or failure emails to clients per the business requirements (see the sketch after this list).
  • Expertise in ETL reporting based on flat-file data sourced from different platforms.
  • Experience in migrating the project's application interfaces.
  • Worked on Teradata SQL Assistant, Teradata Administrator, Teradata Viewpoint, and BTEQ scripts.
  • Created joins, macros and secondary indexes for faster retrieval of data.
  • Solid experience in performance tuning on Teradata SQL Queries and Informatica mappings.
  • Responsible for developing mappings from the given technical documents using various transformations like Aggregator, Joiner, Sorter, Update Strategy, and Expression.
  • Experience in changing existing mappings in accordance with user requirements and business logic.
  • Worked with sources like flat files, COBOL files, Oracle tables, and legacy systems. Wrote various types of SQL queries involving joins and aggregate calculations, and tuned those queries.
  • Performed extensive testing and wrote SQL queries to verify data loading. Performed unit testing at various levels of the ETL and documented the results.
  • Responsible for creating workflows and worklets. Created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
  • Used Informatica debugging techniques to debug mappings, and used session log files and bad files to trace errors that occurred while loading.
  • Extensively used Expression, Joiner (for heterogeneous sources), Lookup, Filter, Aggregator, and Update Strategy transformations to transform data before loading it into the target tables.
  • Developed workflow sequences to control the execution sequence of various jobs and to email support personnel.
  • Implemented transformation logic using Joiner, Aggregator, Sorter, Update Strategy, Filter, Router, and Expression transformations.
  • Responsible for implementing one-time data movement and data replication methods.
  • Worked extensively with parameterizing workflows and configuring reusable success and failure emails, Control tasks, Command tasks, and the Informatica scheduler.
  • Worked on preparing summarization and aggregation tables over the fact data.
  • Developed ETL jobs to extract information from the Enterprise Data Warehouse.
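
A minimal sketch of the SFTP transfer with success/failure notification described in the first bullet; the host, file paths, and mail addresses are hypothetical.

    #!/bin/ksh
    # Push an extract over SFTP, then mail the client a success or
    # failure note. Host, paths, and addresses are hypothetical.
    FILE=/etl/out/daily_extract.dat
    DEST=etlftp@partner.example.com
    NOTIFY=client-team@example.com

    sftp -b - "$DEST" <<EOF
    put $FILE /inbound/daily_extract.dat
    bye
    EOF

    if [[ $? -eq 0 ]]; then
      echo "Transfer of $FILE completed" | \
        mailx -s "Daily extract: SUCCESS" "$NOTIFY"
    else
      echo "Transfer of $FILE FAILED" | \
        mailx -s "Daily extract: FAILURE" "$NOTIFY"
      exit 1
    fi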

Environment: Informatica 9.1, Oracle, Teradata, SQL Developer, Putty, UNIX, Windows.

Confidential

ETL/Teradata Developer

Responsibilities:

  • Developed new BTEQ scripts and queries as part of performance tuning, per change requests from clients.
  • Designed mapping sheets for the complex views and business views to support the business view build.
  • Involved in writing complex SQL using SQL Assistant and BTEQ to generate ad-hoc reports for the business team (see the BTEQ sketch after this list).
  • Analyzed the end-to-end data warehouse system.
  • Fixed operational issues in jobs to keep data current and meet business requirements.
  • Resolved high-priority application dockets raised by business users and end users.
  • Fixed data quality issues and corrected tainted data.
  • Interacted actively with business users, participating in onshore/offshore calls with the client, downstream business users, and other vendors under consideration.
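
A minimal sketch of an ad-hoc BTEQ report of the kind described above; the docket table, columns, and output path are hypothetical.

    #!/bin/ksh
    # Run an ad-hoc report in BTEQ and export it to a flat report file.
    # Object names and the output path are hypothetical.
    bteq <<'EOF'
    .LOGON tdprod/rpt_user,rpt_pass;
    .SET WIDTH 200;
    .EXPORT REPORT FILE=/tmp/open_dockets.rpt;

    SELECT region,
           COUNT(*) AS open_dockets
    FROM edw.docket
    WHERE status = 'OPEN'
    GROUP BY region
    ORDER BY open_dockets DESC;

    .EXPORT RESET;
    .LOGOFF;
    .QUIT 0;
    EOF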

Environment: Teradata SQL, BMC Remedy User (iTAM), Mainframe Control-M Scheduler.

Confidential

Teradata/ETL Developer

Responsibilities:

  • Performed physical and logical design of the data warehouse dimensional model using ERwin.
  • Involved in business requirements, technical requirements, high-level design, and detailed design process.
  • Worked on loading data from several flat-file sources into staging using Teradata MultiLoad and FastLoad (see the FastLoad sketch after this list).
  • Developed parallel routines and custom built operators to implement business logic that was otherwise not possible with the available active/passive stages.
  • Developed overriding SQL for DataStage jobs.
  • Performed performance tuning and optimization of database configuration and application SQL.
  • Performed unit and system test for the modified code and loaded shadow data marts for testing prior to production implementation.
  • Worked on UNIX Shell Scripting to schedule the loading process using Teradata utilities.
  • Involved in centralized data management for important entities: a single portal for multiple users across multiple organizations that enables consistent data flow throughout the enterprise, with data validation and error checks to ensure input data is clean.
  • Loaded XML data into Teradata using XML import feature.
  • Acted as the lead developer for all the ETL jobs, reading data from the vendors and affiliates, and finally loading dimension, fact and other aggregate tables.
  • Worked with the users and testing teams to implement the business logic as expected.
  • Wrote several Teradata BTEQ scripts to implement the business logic.
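
A minimal sketch of a FastLoad job of the kind described above, loading a pipe-delimited flat file into an empty staging table; all object and file names are hypothetical.

    #!/bin/ksh
    # Bulk-load a pipe-delimited flat file into an empty staging table
    # with Teradata FastLoad. Object and file names are hypothetical.
    fastload <<'EOF'
    LOGON tdprod/etl_user,etl_pass;
    DATABASE stg;

    /* Drop leftover error tables from a prior run */
    DROP TABLE stg.cust_err1;
    DROP TABLE stg.cust_err2;

    SET RECORD VARTEXT "|";
    DEFINE cust_id   (VARCHAR(10)),
           cust_name (VARCHAR(50))
    FILE = /etl/in/customer.dat;

    /* Target table must be empty for FastLoad */
    BEGIN LOADING stg.customer_stg
      ERRORFILES stg.cust_err1, stg.cust_err2;

    INSERT INTO stg.customer_stg (cust_id, cust_name)
    VALUES (:cust_id, :cust_name);

    END LOADING;
    LOGOFF;
    EOF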

Environment: Teradata V12, Teradata Utilities, Unix Servers 5380 / 5250, Oracle, BTEQ, MLOAD, FLOAD, SAS, ERWIN.

Confidential

ETL/Teradata Developer

Responsibilities:

  • Worked as a data warehouse developer and was responsible for the timely, quality delivery of the ETL code and all related documents.
  • Implemented SCD Type 1 and SCD Type 2 load strategies for the data warehouse.
  • Responsible for gathering requirements from clients; analyzed the functional specs provided by the data architect and created technical spec documents for all the enhancements.
  • Worked with high-volume data, tuned and troubleshot mappings, and created documentation to support the application.
  • Performed impact analysis, identifying gaps and code changes needed to meet new and changing business requirements.
  • Performed legacy system data analysis to identify logical conditions, joins, filter criteria, etc., to gather the data necessary for conversion; identified gaps or flaws where the logic might fail and reported them to the source system for correction.
  • Involved in developing an ETL architecture based on Change Data Capture data acquisition, using redo log files to capture inserts, updates, and deletes from the transaction database and load them into the data warehouse.
  • Worked with connected and unconnected Stored Procedure transformations for pre- and post-load sessions.
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size and sequence buffer length.
  • Developed PL/SQL procedures, functions, and packages, as well as SQL scripts.
  • Understood the requirements, created the BSD, and later developed a detailed TDD for the whole operations data migration project.
  • Extensively worked on data extraction, transformation, and loading from source to target systems using BTEQ, FastLoad, and MultiLoad.
  • Developed OLAP applications using Cognos suite of tools and extracted data from the enterprise data warehouse to support the analytical and reporting for all Corporate Business Units.
  • Designed and developed PL/SQL procedures, packages, and triggers to be used for automation and testing.
  • Involved in performance tuning on the source and target database for querying and data loading.
  • Developed MultiLoad scripts and shell scripts to move data from source systems to staging and from staging to the data warehouse in batch processing mode (see the MultiLoad sketch after this list).
  • Involved in writing BTEQ, FastLoad, and MultiLoad scripts for loading data into the target data warehouse.
  • Error handling and performance tuning in Teradata queries and utilities.
  • Data reconciliation in various source systems and in Teradata.
  • Involved in unit testing and preparing test cases.
  • Involved in peer-to-peer reviews.
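
A minimal sketch of a MultiLoad upsert of the kind used to move staged data into the warehouse; the log, work, and error tables and all other names are hypothetical.

    #!/bin/ksh
    # Upsert a delimited extract into the warehouse with MultiLoad.
    # Log, work, and error table names and all objects are hypothetical.
    mload <<'EOF'
    .LOGTABLE dw.cust_ml_log;
    .LOGON tdprod/etl_user,etl_pass;

    .BEGIN IMPORT MLOAD TABLES dw.customer
       WORKTABLES dw.cust_wt
       ERRORTABLES dw.cust_et dw.cust_uv;

    .LAYOUT cust_layout;
    .FIELD cust_id   * VARCHAR(10);
    .FIELD cust_name * VARCHAR(50);

    /* Update if the key exists, otherwise insert */
    .DML LABEL upsert_cust DO INSERT FOR MISSING UPDATE ROWS;
    UPDATE dw.customer
       SET cust_name = :cust_name
     WHERE cust_id = :cust_id;
    INSERT INTO dw.customer (cust_id, cust_name)
    VALUES (:cust_id, :cust_name);

    .IMPORT INFILE /etl/in/customer.dat
       FORMAT VARTEXT '|'
       LAYOUT cust_layout
       APPLY upsert_cust;

    .END MLOAD;
    .LOGOFF;
    EOF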

Environment: Informatica 8.6, Teradata V2R6, Teradata Utilities (MultiLoad, FastLoad, FastExport, BTEQ, TPump), SQL Server 2000, Oracle, FTP, CVS, Windows XP, UNIX, Pentium Server.

Confidential

Informatica ETL Developer

Responsibilities:

  • Analyzed the source data coming from Flat files and worked with business users.
  • Worked with most of the DB client tools like SQL Navigator, TOAD, Data Browser, Teradata SQL Assistant, etc.
  • Applied technical writing skills to provide professional reports for implementation documentation and assessment.
  • Understood the components of a data quality plan and made informed choices between source data cleansing and target data cleansing.
  • Performed data quality analysis to validate the input data based on the cleansing rules.
  • Used various transformations like Source Qualifier, Lookup, Update Strategy, Router, Filter, Sequence Generator, and Joiner on the extracted source data according to the business rules and technical specifications.
  • Optimized the mappings using various optimization techniques and debugged some existing mappings using the Debugger to test and fix them.
  • Scheduled, ran, and monitored sessions using Informatica Workflow Manager.
  • Extensively worked in the performance tuning of the programs, ETL Procedures and processes.
  • Wrote UNIX shell scripts to extract parameters and automate the FTP and SFTP processes.
  • Redesigned some of the existing mappings in the system to meet new functionality.
  • Wrote, tested, and implemented various UNIX shell, PL/SQL, and SQL scripts to monitor the pulse of the database and system (see the monitoring sketch after this list).
  • Gathered requirements and worked according to the change request (CR).
  • Developed code per the client requirements.
  • Involved in developing backend code; altered tables to add new columns, constraints, sequences, and indexes per business requirements.
  • Performed DML and DDL operations per the business requirements.
  • Created views and prepared the business reports.
  • Resolved production issues by modifying backend code as and when required.
  • Used various joins, subqueries, and nested queries in SQL.
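
A minimal sketch of a shell-plus-SQL pulse check of the kind described above, assuming Oracle's sqlplus and the standard user_objects dictionary view; the connection details and support address are hypothetical.

    #!/bin/ksh
    # Database "pulse" check: count invalid Oracle objects and alert
    # support if any exist. Connection details and the mail address
    # are hypothetical.
    INVALID=$(sqlplus -s "$ORA_USER/$ORA_PASS@PRODDB" <<'EOF'
    SET HEADING OFF FEEDBACK OFF PAGESIZE 0
    SELECT COUNT(*) FROM user_objects WHERE status = 'INVALID';
    EXIT
    EOF
    )
    INVALID=$(echo $INVALID)   # trim whitespace from the count

    if [[ ${INVALID:-0} -gt 0 ]]; then
      echo "$INVALID invalid objects found in PRODDB" | \
        mailx -s "DB pulse check: ATTENTION" dba-support@example.com
    fi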

Environment: Informatica 8.6, Windows 98/NT/2000, Oracle 9i/8i, TOAD, Database Tools/Utilities.
