
Sr. ETL Informatica Developer Resume


Dover, NH

PROFESSIONAL EXPERIENCE:

  • 6 years of experience in Data Warehouse analysis, design, development, testing, and implementation phases of the Software Development Life Cycle (SDLC).
  • Strong exposure to the data warehouse development process, with Extraction/Transformation/Loading using Informatica Power Center.
  • Extensive experience in ETL and data integration, developing ETL mappings and scripts with Informatica Power Center using Designer, Repository Manager, Workflow Manager, and Workflow Monitor. Exposure to Power Exchange functionality.
  • Expert in databases including Oracle 12c/11g/10g, MS SQL Server 2008/2014, Teradata 15/14/12, MySQL, and DB2.
  • Thorough understanding of Business Intelligence and Data Warehousing concepts with emphasis on ETL.
  • Experience in integrating various data sources such as Oracle, DB2, Teradata, Netezza, Mainframes, SQL Server, XML, and flat files, with extensive knowledge of Oracle, Teradata, Netezza, and MS Access.
  • Experience in creating IDQ data profiling, scorecards, mapplets, rules, and mappings, and in data cleansing and data standardization processes using the IDQ Developer and Informatica Analyst tools.
  • Expertise in working with various transformations, including Aggregator, Lookup, Update Strategy, Joiner, Filter, Sequence Generator, Normalizer, Sorter, and Router, in the Informatica Power Center Designer.
  • Expertise in debugging, performance tuning, and optimization of sessions and mappings.
  • Proficiency in data warehousing techniques for data cleansing, slowly changing dimensions (SCD Type 1 and Type 2), surrogate key assignment, and change data capture (CDC).
  • Proficient in different phases of testing, including unit, integration, and regression testing.
  • Excellent communication and interpersonal skills.
  • Experience with data profiling: examining the data available in an existing data source.
  • Experience in writing Business Requirement Documents and Functional Specification Documents (BRD and FSD).
  • Excellent scripting knowledge in developing databases using Teradata SQL.
  • Extensively worked with Teradata utilities such as Basic Teradata Query (BTEQ), FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.
  • Expertise in creating databases, users, tables, macros, views, joins, and hash indexes in the Teradata database.
  • Created shell scripts for invoking SQL scripts and scheduled them using crontab and Maestro.
  • Experienced in fast paced Agile Development Environment and methodologies including Scrum.
  • Excellent analytical skills and strong ability to perform as part of a team.
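
The SCD Type 2 and surrogate-key techniques listed above are normally built as Informatica mappings, but the underlying logic can be sketched in a few lines. Below is a minimal, hypothetical Python illustration (the `city` attribute and all field names are assumptions, not taken from any actual project):

```python
from datetime import date

# Hypothetical in-memory sketch of SCD Type 2 logic: when a tracked
# attribute changes, expire the current row and insert a new version
# with a fresh surrogate key. Not an actual Informatica mapping.
def apply_scd2(dim_rows, incoming, today=date(2024, 1, 1)):
    """dim_rows: dicts with keys sk, id, city, eff_date, end_date, current."""
    next_key = max((r["sk"] for r in dim_rows), default=0) + 1
    for rec in incoming:
        current = next((r for r in dim_rows
                        if r["id"] == rec["id"] and r["current"]), None)
        if current is None or current["city"] != rec["city"]:
            if current is not None:            # expire the old version
                current["current"] = False
                current["end_date"] = today
            dim_rows.append({"sk": next_key, "id": rec["id"],
                             "city": rec["city"], "eff_date": today,
                             "end_date": None, "current": True})
            next_key += 1                      # surrogate key assignment
    return dim_rows
```

An SCD Type 1 variant would simply overwrite `city` in place; Type 2, as above, preserves full history by versioning rows.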

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 10.2/9.6.1/9.5.1/8.6.1, DataStage

Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, E-R Modeling.

RDBMS: Oracle 12c/11g/10g, Teradata 15/14/13/12, DB2, SQL Server, MySQL, Sybase

Reporting Tools: Cognos, Business Objects, Tableau

Scheduling Tools: Control M, Autosys, Maestro, Cron Job

Languages: XML, UNIX Shell Scripting, SQL, PL/SQL, T-SQL

Operating Systems: Windows, Unix, Linux

PROJECT EXPERIENCE:

Confidential, Dover, NH

Sr. ETL Informatica Developer/Data Analyst

Roles / Responsibilities:

  • Documented user requirements, translated them into system solutions, and developed the implementation plan and schedule.
  • Extensively participated in functional and technical meetings for designing the architecture of ETL load process (mappings, sessions, workflows from source systems to staging to DW).
  • Developing mapping documents indicating the source tables, columns, data types, transformations required, business rules, and target tables, columns, and data types.
  • Extensively used ETL to load data from flat files and Oracle databases.
  • Responsible for developing the ETL logic and the data maps for loading the tables.
  • Used IDQ to complete initial data profiling and matching/removing duplicate data.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Connected and Unconnected Lookup, Expression, Union, Joiner, Sequence Generator, Sorter, Aggregator, Update Strategy, and Stored Procedure transformations.
  • Developing complex transformations and mapplets using Informatica to extract, transform, and load data into data marts and the Enterprise Data Warehouse (EDW).
  • Creating Informatica Exception Handling Mapping for Data Quality, Data Cleansing and Data Validations.
  • Writing complex Oracle SQL queries for Source Qualifier SQL Overrides.
  • Working on Performance Tuning of the complex transformations, mappings, sessions and SQL queries to achieve faster data loads to the data warehouse.
  • Extracted data from flat files/ databases applied business logic to load them into Netezza tables.
  • Generating Email Notifications through scripts that run in the Post session implementations.
  • Worked with slowly changing dimension Type1, Type2, and Type3.
  • Used version mapping to update the slowly changing dimensions, keeping full history in the target database.
  • Used Debugger in Informatica for testing the mapping and fixing the bugs.
  • Extensively worked with the DBA for partitioning and creating indexes on tables used in source qualifier queries.
  • Designed mappings for populating Tables for one-time load and incremental loads.
  • Ran catch-up loads for the applications and ensured the catch-up process completed.
  • Strong on exception handling mappings for data quality, data cleansing, and data validation.
  • Worked with Informatica to improve data management by consolidating data across different business platforms.
  • Created Workflows containing command, email, session, decision and a wide variety of tasks.
  • Developed Parameter files for passing values to the mappings for each type of client.
  • Created workflows based on dependencies in Informatica for the code changes.
  • Writing UNIX scripts for unit testing the ETL Code (Sessions, Workflows, and Log Files).
  • Unit test the Data Mappings and Workflows and validate the data loaded into database.
  • Involved in ETL code migration across environments.
  • Creating session tasks, event waits & raise, command task, worklets etc. in the workflow manager and deployed them across the DEV, QA, UAT and PROD repositories.
  • Involved in creation of Environment Document which provides instructions to Implement/Test the Project in QA and Production environments.
  • Extensive experience in developing Stored Procedures, Functions and Triggers, Complex SQL queries using SQL Server TSQL and Oracle PL/SQL.
  • Prepared the low-level technical design document and participated in the build/review of the BTEQ, FastExport, MultiLoad, and FastLoad scripts.
  • Monitored the ETL Production Jobs and coordinated with offshore team to delegate tasks and monitor the day-to-day progress status and resolve the technical bugs and issues.
  • Worked on optimizing and tuning the Teradata views and SQL to improve batch performance and data response times for users.
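
The IDQ profiling and duplicate-matching work described above can be approximated in ordinary code. The sketch below is illustrative only (column names and the standardization rule are assumptions, not the project's actual IDQ rules):

```python
# Hypothetical sketch of the kind of work IDQ performs: column-level
# profiling stats first, then de-duplication on a standardized match key.
def profile(rows, columns):
    """Return null and distinct counts per column for a list of dict rows."""
    stats = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        stats[col] = {
            "nulls": sum(v is None for v in values),
            "distinct": len({v for v in values if v is not None}),
        }
    return stats

def dedupe(rows, key_cols):
    """Keep the first row per standardized match key (trimmed, upper-cased)."""
    seen, unique = set(), []
    for r in rows:
        key = tuple(str(r.get(c, "")).strip().upper() for c in key_cols)
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique
```

In practice the standardization step (trim/upper-case here) would be an IDQ rule or mapplet, and matching can be fuzzy rather than exact.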

Tools & Technologies: Informatica Power Center 10.2/9.6.1, Teradata TD12 & TD14, Oracle 12c/11g, PL/SQL, SQL, BO, TOAD, LINUX, Shell scripts.

Confidential, St. Louis, MO

Sr. ETL Informatica Developer

Roles / Responsibilities:

  • Interacted with the Business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.
  • Worked on dimensional modeling to design and develop STAR schemas by identifying the facts and dimensions.
  • Designed logical models as per business requirements using Erwin.
  • Worked on different data sources such as Oracle, Netezza, MS SQL, Flat files etc.
  • Responsible for Business Analysis and Requirements Collection.
  • Created the design and technical specifications for the ETL process of the project.
  • Responsible for reviewing the Business requirement and developing the technical Design.
  • Translated requirements into business rules & made recommendations for innovative IT solutions.
  • Documented data conversion, integration, load and verification specifications.
  • Parsed high-level design specs into simple ETL coding and mapping standards.
  • Worked with Power Center Designer tools in developing mappings and mapplets to extract and load the data from flat files to Teradata database.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Used Informatica as an ETL tool to create Data Marts, source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Connected and Unconnected Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations.
  • Worked on Informatica Power center tool - Source Analyzer, Data Warehousing Designer, Mapping Designer & Mapplets, and Transformations.
  • Used the Informatica MDM (Siperian) tool to manage master data in the EDW.
  • Created relationships for the base objects, which reflect lookup tables in MDM.
  • Worked on SQL Server, including DDL, T-SQL, stored procedures, etc., in large-scale relational databases.
  • Implemented various new components like increasing the DTM Buffer Size, Database Estimation, Incremental Loading, Incremental aggregation, Validation Techniques, and load efficiency.
  • Developed Teradata utility scripts (FastLoad, MultiLoad) to load data from various source systems into Teradata.
  • Worked in an onsite-offshore model, coordinating and communicating with offshore teams to deliver work.
  • Created BTEQ (Basic Teradata Query) scripts to generate keys.
  • Worked on unit testing, integration/system testing, regression testing, and user acceptance testing.
  • Performance tuning of the process at the mapping level, session level, source level, and the target level.
  • Created Workflows containing command, email, session, decision and a wide variety of tasks.
  • Scheduled batches and sessions within Informatica using the Informatica scheduler, and wrote shell scripts for job scheduling.
  • Employed performance tuning to improve the performance of the entire system.
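
The incremental-loading technique mentioned above follows a common watermark pattern: extract only rows changed since the last successful run, then advance the watermark. A minimal sketch, assuming a hypothetical `update_ts` change-tracking column:

```python
from datetime import datetime

# Hypothetical sketch of incremental loading: rows updated strictly after
# the stored watermark are extracted; the watermark then advances to the
# newest timestamp seen, so the next run picks up where this one ended.
def incremental_extract(rows, last_run):
    """Return (changed rows, new watermark) given the previous watermark."""
    delta = [r for r in rows if r["update_ts"] > last_run]
    new_watermark = max((r["update_ts"] for r in delta), default=last_run)
    return delta, new_watermark
```

In an Informatica implementation the watermark would typically live in a parameter file or control table and be applied via a Source Qualifier filter.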

Tools & Technologies: Informatica Power Center 9.5.1, Oracle, TD12, Shell Scripting, LINUX, Business Objects, Windows, TOAD.

Confidential, Dayton, OH

Informatica Developer & Support Engineer

Roles / Responsibilities:

  • Analyzed the Business Requirement Documents (BRD) and laid out the steps for the data extraction, business logic implementation & loading into targets.
  • Responsible for Impact Analysis, upstream/downstream impacts.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Used Informatica as ETL tool, and stored procedures to pull data from source systems/ files, cleanse, transform and load data into the Teradata using Teradata Utilities.
  • Worked on Informatica- Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet, and Transformation Developer.
  • Used most of the transformations such as the Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookups, Joiner, update strategy and stored procedure.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Successfully upgraded Informatica from 9.1 to 9.5 and was responsible for validating objects in the new version of Informatica.
  • Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
  • Extensively worked in the performance tuning of Teradata SQL, ETL and other processes to optimize session performance.
  • Loaded data into the Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
  • Worked extensively with different Caches such as Index cache, Data cache and Lookup cache (Static, Dynamic and Persistence) while developing the Mappings.
  • Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
  • Integrated the data into centralized location. Used migration, redesign and Evaluation approaches.
  • Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
  • Tuned the performance of mappings by following Informatica best practices and applied several methods to reduce workflow run times.
  • Worked extensively on Informatica Partitioning when dealing with huge volumes of data and also partitioned the tables in Teradata for optimal performance.
  • Scheduled Informatica jobs and implemented dependencies, where necessary, using Autosys.
  • Managed post production issues and delivered all assignments/projects within specified time lines.
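
The dynamic lookup cache mentioned above differs from a static cache in one key way: a row that misses the lookup is inserted into the cache, so later rows in the same run can find it. A hypothetical sketch of that behavior (not Informatica's actual implementation):

```python
# Hypothetical sketch of a dynamic lookup cache. A static cache would
# return a miss and never change; a dynamic cache inserts unseen keys,
# which is useful when a dimension is being loaded in the same session.
class DynamicLookupCache:
    def __init__(self, existing):
        self.cache = dict(existing)   # natural key -> surrogate key
        self.next_sk = max(existing.values(), default=0) + 1

    def lookup(self, key):
        """Return (surrogate_key, was_insert)."""
        if key in self.cache:
            return self.cache[key], False
        sk = self.next_sk
        self.cache[key] = sk          # insert into the cache on a miss
        self.next_sk += 1
        return sk, True
```

A persistent cache extends the same idea across runs by saving the cache to disk between sessions.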

ENVIRONMENT: Informatica Power Center 9.1.1/9.5.1, Oracle, DB2, Teradata, Flat Files, Erwin 4.1.2, SQL Assistant, TOAD, WinSCP, PuTTY, Autosys, UNIX.

Confidential, Bellevue, WA

Informatica Developer & Support Engineer

Responsibilities:

  • Involved in Requirement analysis in support of Data Warehousing efforts
  • Maintain Data Flow Diagrams (DFD’s) and ETL Technical Specs or lower level design documents for all the source applications
  • Worked with source databases like Oracle, SQL Server and Flat Files
  • Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router and Joiner transformations
  • Extensively worked with various Passive transformations like Expression, Lookup, Sequence Generator, Mapplet Input and Mapplet Output transformations
  • Created complex mappings using Unconnected and Connected lookup Transformations
  • Implemented Slowly changing dimension Type 1 and Type 2 for change data capture
  • Worked with various lookup caches, such as dynamic, static, persistent, shared, and re-cache from database
  • Responsible for definition, development and testing of processes/programs necessary to extract data from client's operational databases, transform, cleanse data and load it into data marts
  • Worked with various Informatica Power Center objects like Mappings, transformations, Mapplet, Workflows and Session Tasks
  • Responsible for the performance tuning of the ETL process at source level, target level, mapping level and session level
  • Worked extensively with update strategy transformation for implementing inserts and updates
  • Extensively used the tasks like email task to deliver the generated reports to the mailboxes and command tasks to write post session and pre session commands
  • Extensively used debugger to test the logic implemented in the mappings
  • Performed error handing using session logs
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size and target-based commit interval
  • Per business requirements, implemented auditing and balancing on the transactional sources so that every record read is either captured in the maintenance tables or written to the target tables
  • Auditing is captured in the audit table, and an EOD snapshot of the daily entries is sent to the distribution list to analyze whether anything is abnormal
  • Monitored workflows and sessions using the Power Center Workflow Monitor
  • Used Informatica Scheduler for scheduling the workflows
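
The audit-and-balance check described above reduces to a simple reconciliation: every record read from the source must land in the target or in a reject/maintenance table. A minimal sketch (the field names are assumptions):

```python
# Hypothetical sketch of an ETL balance check: the run is "balanced" only
# when source reads are fully accounted for by target writes plus rejects.
# In practice these counts come from session statistics or audit tables.
def balance(source_read, target_written, rejected):
    return {
        "source_read": source_read,
        "target_written": target_written,
        "rejected": rejected,
        "balanced": source_read == target_written + rejected,
    }
```

An unbalanced result is exactly the "abnormal" condition the EOD snapshot is meant to surface.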

Environment: Informatica Power Center 9.5.1/9.1.1, Oracle 10g, SQL Server, DB2, SQL, PL/SQL, Cognos, Maestro, UNIX (AIX), and Windows.

Confidential

ETL Informatica Developer

Roles / Responsibilities:

  • Analyzing the system requirements, business logic, and specific line of business.
  • Interacting with the users to understand the requirements and documenting them.
  • Involved in Table Design and creation of indexes.
  • Analyzing calling patterns of customers across various legs of usage: Local, STD, and ISD.
  • Used SQL*Loader as an ETL tool to load the data from flat files to source tables.
  • Used Fast Load Script as an ETL tool to load the data from flat files/CSV files to Confidential Enterprise Data Warehouse tables.
  • Created PL/SQL stored procedures, functions and packages for moving the data from source to target.
  • Created scripts to create new tables, views and queries for new enhancement in the applications.
  • Partitioned the fact tables and materialized views to enhance the performance.
  • Extracting thousands of records from the database for data analysis.
  • Involved in tuning of queries to enhance Performance.
  • Involved in the continuous enhancements and fixing of production problems.
  • Created customer segmentation (LVC, MVC, HVC & UHVC) based on usage behavior.
  • Effectively supported decision-making throughout the organization.
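
The LVC/MVC/HVC/UHVC segmentation mentioned above can be sketched as a simple banding function; the usage thresholds below are purely illustrative assumptions, not the project's actual cut-offs:

```python
# Hypothetical sketch of usage-based customer segmentation into the
# value bands named above. Real thresholds would come from the business.
def segment(monthly_usage):
    """Map a monthly usage figure to a value band (thresholds assumed)."""
    if monthly_usage >= 10000:
        return "UHVC"   # ultra-high-value customer
    if monthly_usage >= 5000:
        return "HVC"    # high-value customer
    if monthly_usage >= 1000:
        return "MVC"    # medium-value customer
    return "LVC"        # low-value customer
```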

Programming Tools & Technologies: Informatica Power Center, Oracle 11g, Teradata 12, SQL*Plus, TOAD, SQL*Loader.
