
Sr ETL Informatica Developer Resume


Boston

SUMMARY

  • Around 8.5 years of IT experience in all phases of SDLC in DW/BI (Analysis, Design, Development, Test, Implementation and Operations).
  • 8 years of experience in Data Warehousing using ETL and OLAP tools: Informatica PowerCenter 9.5/9.1/8.x/7.x, Informatica PowerExchange 9.x/8.x, OBIEE 11g/10g, Teradata, Informatica Data Quality 8.x, Informatica Big Data Edition, and Informatica Cloud.
  • Worked across various domains including Healthcare, Banking, Financial, and Retail, with extensive work on marketing data.
  • Accomplished developer with strong hands-on experience in Informatica applications.
  • Designed and Developed the ETL processes for various domains like HealthCare, Finance, Logistics and Insurance.
  • Strong data processing experience in designing and implementing Data Warehouse and Data Mart applications, mainly transformation processes, using the ETL tool Informatica PowerCenter 8.x/9.0.1/9.1.0/9.5.1, PowerExchange 8.x/9.5.1, and UNIX shell scripting.
  • Proficient in dimensional data modeling: Star and Snowflake schemas, fact and dimension tables, physical and logical data modeling, and Ralph Kimball methodology.
  • Responsible for creating staging jobs in Informatica using the TPT operator and pushdown optimization (PDO) techniques.
  • Interacted with end - users and functional analysts to identify and develop Business Specification Documents (BSD) and transform it into technical requirements.
  • Proficient with RDBMS systems, SQL, Oracle PL/SQL, database design and optimizations
  • Skilled in data warehousing, scripting, and data modeling, with 7 years' experience working with various data sources such as Oracle, SQL Server, MS Access, Teradata, and DB2.
  • Strong knowledge of Data warehouse design such as dimensional data modeling, logical and physical design.
  • Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.

TECHNICAL SKILLS

Data warehousing: Informatica PowerCenter 9.5/9.1/8.6/8.5/8.1/7.1, Informatica PowerExchange 9.5/8.6/8.1, Informatica Big Data Edition, Informatica IDQ 8.x, Informatica PowerCenter Visio 9.x

BI Tools: OBIEE 11g/10g

Databases: Oracle Exadata/11g/10g/9i, Sybase, Teradata 13/12/V2R6, MS Access, DB2 8.0/7.0, MS SQL Server

Languages: XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting, Perl, Java

Operating System: HP-UX 11/10/9, IBM AIX 6.1/5.3, Sun Solaris 9/8/7, SCO-UNIX, Linux, Windows 95/98/2000/XP Professional/Vista/8

Other Tools: MS Visual SourceSafe, ZENA, AutoSys, Control-M, Unicenter, Remedy, Clarity, Tidal, JIRA, SVN Repository

DB Tools: SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, SQL Trace, MLOAD, FLOAD, FEXPORT, TPUMP

Microsoft Tools: MS Office, MS FrontPage, MS Outlook, MS Project, MS Visio

PROFESSIONAL EXPERIENCE

Sr ETL Informatica Developer

Confidential, Boston

Responsibilities:

  • Involved in effort estimation for the requirements in the project and prepared mapping documents based on client requirement specifications.
  • Used File Bridge for flat file automatic DQ validations.
  • Involved in the design and development of Informatica mappings using various transformations such as Expression, Lookup, Router, Joiner, Union, Sorter, and Aggregator.
  • Developed SCD Type 1 and Type 2 mappings in Informatica and wrote stored procedures to perform the Type 1 and Type 2 operations.
  • Developed automated stored procedures, used as pre- and post-SQL at the Informatica session level, to load the dimension and fact tables based on table type.
  • Used SQL Server as the database and designed various stored procedures in it.
  • Used SVN to store all the automated scripts and regular insert scripts to run automatically after every data model deployment.
  • Used the Tidal scheduling tool to automate the process.
  • Wrote shell scripts to automate the jobs run by Tidal.
  • Widely used the JIRA dashboard to track issues and communicate with team members about particular issues.
  • Responsible for building Stored Procedures using Teradata and MS-SQL Server.
  • Interacted and conducted various meetings with end users for Data Reconciliation purposes.
  • Collaborated with the QA team to resolve outstanding issues.
  • Performed unit testing of files using UNIX commands.
  • Performed unit testing per the design document and the mapping design flow.
  • Fixed load failures caused by MLOAD errors or locks on the target tables.
  • Worked on Teradata utilities BTEQ, MLOAD, FLOAD, TPUMP and TPT Operator.
  • Used the volatile table concept when writing BTEQ scripts.
  • Used FAST LOAD Utility to load data into stage tables.
  • Responsible for writing BTEQ scripts to populate the dimension and fact tables.
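The SCD Type 2 pattern used in these mappings can be sketched in plain SQL. The following is a minimal illustration only (shown against SQLite for portability; the table and column names are hypothetical, not taken from the actual project):

```python
import sqlite3

# Hypothetical customer dimension with effective dating; names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_sk   INTEGER PRIMARY KEY AUTOINCREMENT,
        customer_id   INTEGER,          -- natural key from the source
        city          TEXT,
        eff_date      TEXT,
        end_date      TEXT,             -- NULL while the row is current
        current_flag  INTEGER
    )""")
cur.execute("INSERT INTO dim_customer (customer_id, city, eff_date, end_date, current_flag) "
            "VALUES (101, 'Boston', '2015-01-01', NULL, 1)")

def scd2_upsert(cur, customer_id, city, load_date):
    """Type 2 change: expire the current row, then insert a new current row."""
    cur.execute("""
        UPDATE dim_customer
           SET end_date = ?, current_flag = 0
         WHERE customer_id = ? AND current_flag = 1 AND city <> ?""",
        (load_date, customer_id, city))
    if cur.rowcount:  # only insert a new version when an attribute actually changed
        cur.execute("""
            INSERT INTO dim_customer (customer_id, city, eff_date, end_date, current_flag)
            VALUES (?, ?, ?, NULL, 1)""", (customer_id, city, load_date))

scd2_upsert(cur, 101, 'Cumberland', '2016-06-01')   # city changed -> new version
history = cur.execute(
    "SELECT city, current_flag FROM dim_customer WHERE customer_id = 101 ORDER BY customer_sk"
).fetchall()
print(history)  # [('Boston', 0), ('Cumberland', 1)]
```

A Type 1 operation would instead overwrite `city` in place with a simple UPDATE, keeping no history.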

Sr ETL Informatica Developer

Confidential, Cumberland

Responsibilities:

  • Developed ETL Informatica mappings to mask the target data.
  • Analyzed the source data (files) sent by the user prior to development and communicated with the application team as required.
  • Responsible for testing the Masked data and provide the Unit Test documents.
  • Built customized functions and reusable objects to facilitate reusability of INFA objects.
  • Responsible to run and monitor the weekly load and fix the issues if the load fails.
  • Worked on fixed width Source files extensively and generated the masked files accordingly.
  • Used the Teradata utilities FastLoad and MultiLoad, and BTEQ queries, to load the tables.
  • Exported data to flat/CSV files using the FastExport utility.
  • Performed data masking using the Data Masking transformation in Informatica PowerCenter 9.x.
  • Responsible to fix the issues during the load failure due to MLOAD or Locks on the target tables.
  • Used the Volatile tables concept in writing the BTEQ Scripts.
  • Worked on the data load requests requested by the user.
  • Responsible for Updating the Data discovery documents and fixing the code accordingly when the user or application team confirms the new changes.
  • Worked with Expression, Data Masking Transformation and Lookup transformations and stored Procedure so as to mask the Production data.
  • Worked in HP Quality Center to handle the defects raised for various applications.
  • Worked on Performance tuning and reduced the data load times of few Weekly Jobs.
  • Responsible to Code and implement the SCD Type 1 and Type 2 logics to load the Dimensions.
  • Responsible for developing the mappings using PowerCenter and the Teradata TPT operator.
  • Responsible for writing the BTEQ Scripts and automating the same using the INFA Mappings.
  • Responsible to load the Detail and Summary Fact Tables in the Data Mart.
  • Responsible for writing the SQL queries to generate KPI reports for end users in OBIEE and Tableau.
  • Performed ETL vs. user data reconciliation for the reports generated in OBIEE.
  • Used transformations such as the source qualifier, normalizer, aggregators, connected & unconnected lookups, filters, sorter, stored procedures, router & sequence generator.
  • Checked and tuned the performance of the application.
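The fixed-width masking work above can be illustrated with a small sketch. This is not the Informatica Data Masking transformation itself, only the underlying idea: deterministic substitution that preserves field widths; the layout and field names are hypothetical:

```python
import hashlib

# Hypothetical fixed-width layout: (field name, start offset, length).
LAYOUT = [("acct_no", 0, 10), ("name", 10, 20), ("balance", 30, 12)]
MASKED_FIELDS = {"acct_no", "name"}   # fields to mask, per a data-discovery document

def mask_value(value: str) -> str:
    """Deterministic substitution: the same input always masks to the same token,
    so joins across masked files remain consistent."""
    digest = hashlib.sha256(value.strip().encode()).hexdigest()
    return digest[:len(value)].ljust(len(value))

def mask_record(line: str) -> str:
    """Mask only the sensitive fields, leaving widths and other fields intact."""
    out = []
    for name, start, length in LAYOUT:
        field = line[start:start + length]
        out.append(mask_value(field) if name in MASKED_FIELDS else field)
    return "".join(out)

record = "1234567890John Smith          000001234.50"
masked = mask_record(record)
assert len(masked) == len(record)      # fixed-width layout preserved
assert masked[30:42] == record[30:42]  # non-sensitive field untouched
```

Deterministic masking matters when the same account number appears in several weekly files: each occurrence masks to the same token, so downstream lookups still match.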

Sr ETL Informatica Developer

Confidential, Columbus

Responsibilities:

  • Responsible for developing the ETL Source code using the Mapping requirement documents
  • Responsible for running the Jobs using Control M Scheduler.
  • Extensively worked in data Extraction, Transformation and Loading from source to target system using Power Center of Informatica.
  • Imported source/target tables from the respective databases and created reusable transformations and mappings using the Designer tool set of Informatica.
  • Scheduled Workflows Using Workflow Manager.
  • Responsible for creating staging jobs in Informatica using the TPT operator and PDO techniques.
  • Involved in working with various sources in Oracle, used Informatica to load data into data warehouse.
  • Extracted, Transformed and Loaded OLTP data into the Staging area and Data Warehouse using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup (Connected & Unconnected), Source Qualifier, Filter, Update Strategy, Stored Procedure, Router, and Expression).
  • Worked with transformations like Source qualifier, Aggregator, Connected and Unconnected lookups, Filters & Sequence generator.
  • Identified and tracked slowly changing dimensions.
  • Successfully coded a mapping with dynamic lookup logic.
  • Analyze and improve current ETL usage, concepts, and performance.
  • Created, scheduled, and monitored the sessions and workflows on the basis of run on demand, run on time, run only once using Informatica Power Center Server Manager.

Environment: Informatica PowerCenter 9.1.0/9.5, Workflow Manager, Workflow Monitor, PL/SQL, SQL, Oracle 11g/10g, Toad 10.6, Oracle 11.6

Sr ETL Informatica Developer

Confidential, New York City

Responsibilities:

  • Proficient in understanding business processes/requirements and translating them into technical requirements
  • Worked with Source Analyzer, Target Analyzer, Transformation Designer, Mapping Designer, and Workflow Manager.
  • Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development by applying Ralph Kimball methodology of dimensional modeling
  • Imported various Application Sources, created Targets and Transformations using Informatica Power Center Designer (Source analyzer, Transformation developer, Mapplet designer, and Mapping designer).
  • Designed and developed Informatica Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Worked with various transformations to solve the Slowly Changing Dimensional Problems using Informatica Power Center
  • Developed and scheduled Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results in Workflow monitor.
  • Did performance tuning at source, transformation, target, and workflow levels.
  • Used PMCMD, and UNIX shell scripts for workflow automation.
  • Parameterized the mappings and increased the re-usability
  • Used transformations such as the source qualifier, normalizer, aggregators, connected & unconnected lookups, filters, sorter, stored procedures, router & sequence generator.
  • Checked and tuned the performance of application.
  • Created and defined DDLs for the tables in the staging area.
  • Responsible for creating shared and reusable objects in the Informatica shared folder and updating them with new requirements and changes.
  • Documented user requirements, translated requirements into system solutions and develop implementation plan and schedule.
  • Migration between Development, Test and Production Repositories.
  • Involved in creating and management of global and local repositories and assigning permissions using Repository Manager. Also migrated repositories between development, testing and production systems.
  • Performed unit and development testing of the mappings at the ETL level.
  • Involved in UAT Support.
  • Involved in Peer reviews of LLDs, mappings, Sessions and PWX data maps.
  • Monitoring currently running jobs.
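The PMCMD-driven workflow automation mentioned above typically wraps the `pmcmd startworkflow` command in a script. A minimal sketch of such a wrapper follows; the workflow, folder, and connection values are placeholders, not details from the actual environment:

```python
import subprocess

def pmcmd_start(workflow: str, folder: str, service: str, domain: str,
                user: str, password: str) -> list:
    """Build the pmcmd startworkflow command line.
    All connection values passed in are illustrative placeholders."""
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain,
            "-u", user, "-p", password,
            "-f", folder, "-wait", workflow]

# Hypothetical workflow and service names:
cmd = pmcmd_start("wf_load_dim_customer", "DW_FOLDER",
                  "IS_PROD", "Domain_Prod", "etl_user", "secret")

# In a scheduler wrapper, the exit code decides success or failure:
# rc = subprocess.run(cmd).returncode   # 0 indicates the workflow succeeded
```

The `-wait` flag makes `pmcmd` block until the workflow finishes, which lets a scheduler chain dependent jobs on the exit code.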

Environment: Informatica PowerCenter 9.1.0/9.5, Workflow Manager, Workflow Monitor, PL/SQL, SQL, Oracle 11g/10g, Toad 10.6, Oracle 11.6.

Sr ETL Informatica Developer

Confidential, Jersey City, NJ

Responsibilities:

  • Involved in design, development and maintenance of database for Data warehouse project.
  • Involved in Business Users Meetings to understand their requirements.
  • Converted business requirements into technical documents (BRDs) and explained the business requirements in terms of technology to the developers.
  • Developed ETL Mappings and Test plans.
  • The Data flow diagrams ranged from OLTP systems to staging to Data warehouse.
  • Developed Test plans to verify the logic of every Mapping in a Session. The test plans included counts verification, look up hits, transformation of each element of data, filters, and aggregation and target counts.
  • Developed Complex Informatica Mappings using various transformations- Source Qualifier, Normalizer, Filter, Connected Lookup, Unconnected Lookup, Update strategy, Router, Aggregator, Sequence Generator, Reusable sequence generator transformation.
  • Extensively used SCD’s (Slowly Changing Dimension) to handle the Incremental Loading for Dimension tables, Fact tables.
  • Designed various Mappings for extracting data from various sources involving Flat files, Oracle and SQL Server, IBM DB2.
  • Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Wrote BTEQ scripts to transform data.
  • Wrote FastExport scripts to export data.
  • Worked on Debugging and Troubleshooting of the Informatica application. For debugging utilized Informatica debugger.
  • Worked on Performance Tuning to optimize the Session performance by utilizing, Partitioning, Push down optimization, pre and post stored procedures to drop and build constraints.
  • Worked on Teradata utilities BTEQ, MLOAD, FLOAD, and TPUMP to load the staging area.
  • Created UNIX Script for ETL jobs, session log cleanup and dynamic parameter.
  • Created and scheduled Sessions, Jobs based on demand, run on time and run only once
  • Monitored Workflows and Sessions using Workflow Monitor and Scheduler alert editor.
  • Performed Unit testing, Integration testing and System testing of Informatica mappings.
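The count-verification step in the test plans above can be sketched as a simple source-vs-target reconciliation. This is an illustration only (SQLite stands in for the warehouse; table names are hypothetical):

```python
import sqlite3

# Two illustrative tables standing in for a source system and a warehouse target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def reconcile(conn, src: str, tgt: str) -> dict:
    """Compare row counts and summed amounts between source and target tables."""
    q = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {}"
    src_cnt, src_sum = conn.execute(q.format(src)).fetchone()
    tgt_cnt, tgt_sum = conn.execute(q.format(tgt)).fetchone()
    return {"count_match": src_cnt == tgt_cnt, "sum_match": src_sum == tgt_sum}

result = reconcile(conn, "src_orders", "tgt_orders")
print(result)  # {'count_match': True, 'sum_match': True}
```

Real test plans extend this with per-column checksums and lookup-hit counts, but a count-and-sum check like this catches most load failures early.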

Environment: Informatica PowerCenter Cloud 9.5/9.1, Informatica PowerExchange 9.1, Oracle Exadata, Teradata 13, MS SQL Server 2008 R2, T-SQL, Erwin 8, SQL Server 2008, TOAD, PL/SQL Developer, Linux, Tidal, Shell Scripting, Perl Scripting, Windows XP, Putty, DB2 Mainframe, OBIEE 11g, Informatica Cloud, Salesforce.

Informatica Developer

Confidential, Bridgewater, NJ

Responsibilities:

  • Involved in requirement gathering and business analysis and created tech specs for the ETL process.
  • Developed data Mappings between source systems and warehouse components using Mapping Designer
  • Setup folders, groups, users, and permissions and performed Repository administration using Repository Manager.
  • Involved in design, development, and implementation, including several modifications, enhancements, and maintenance of existing software.
  • Built the dimension and fact table load process and the reporting process using Informatica and OBIEE.
  • Consolidated sales order data using Informatica and loaded it into aggregate tables for reporting.
  • Coordinated with Onsite and Offshore team.
  • Developed various complex Mapplets and Stored procedure to facilitate loading of data on weekly and monthly basis.
  • Worked on Various transformations such as Joiner, Filter, Update Strategy, Expression, Router, Lookup, Union, and Aggregator.
  • Administered and developed monitoring scripts for power center server process and workflow jobs.
  • Designed and developed enhanced data models and ETL applications to meet the business requirement.
  • Mapped Type 1 and Type 2 Slowly Changing Dimensions (SCD) using Informatica.
  • Ensured timely delivery of ETL tasks using the Workflow Manager and scheduler.
  • Worked closely with DBA and developers during planning, analyzing, developing and testing phase of the project.
  • Involved in Designing of Data Modeling for the Data warehouse.
  • Involved in the performance tuning of the Informatica mappings and the SQL queries inside the Source Qualifier.
  • Involved in the performance tuning of the database and Informatica; improved performance by identifying and rectifying bottlenecks.
  • Involved in user acceptance testing. Defect reporting & tracking in Jira and resolved them.

Environment: Informatica PowerCenter 8.6.1, Informatica PowerExchange 9.1, MS Office/Visio 2007, Oracle Exadata, DB2 Mainframe, UNIX, AIX, Rally, MS Series, Web Services, Java, Tidal, Putty, Basecamp, Erwin 7.5, PL/SQL

ETL/ Informatica Developer

Confidential

Responsibilities:

  • Designed and developed complex mappings using various transformations in Designer to extract the data from sources like Oracle, SQL Server and flat files to perform mappings based on company requirements and load into Oracle tables.
  • Extensively worked with various sources such as flat files, databases, XML, and web services.
  • Created and altered tables, triggers, sequences, and other DDL, DML, and DCL objects and utilities of Oracle.
  • Extensively worked in the performance tuning of ETL procedures and processes.
  • Performance Tuned Informatica Targets, Sources, mappings & sessions for large data files by increasing data cache size, sequence buffer length and target based commit interval.
  • Investigated and fixed errors in the daily running jobs within the time limit.
  • Used the Partitioned Primary Index (PPI) concept to increase load performance and collected statistics on tables and indexes.
  • Mainly responsible for resolving Informatica and database issues for different departments when a ticket was opened.
  • For any problem with the ETL code, the prime responsibility was to test in the integration environment before loading into production.
  • Used Debugger utility of the Designer tool to check the errors in the mapping and made appropriate changes in the mappings to generate the required results for various tickets.
  • Used Teradata Utilities Fast Load and Multi Load to load data into the Tables.
  • Worked with FastExport to unload data into CSV files; responsible for building the ETL load using the TPT operator.
  • Wrote Analytical queries to generate report, designed normal and materialized View.
  • Optimized mappings having transformation features like Aggregator, filter, Joiner, Expression and Lookups
  • Created daily and weekly workflows and scheduled to run based on business needs

Environment: Informatica PowerCenter 8.6, Oracle 10g, Teradata V2R5, XML, TOAD, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Web Intelligence, DSBASIC, Erwin 6.1, Remedy, Maestro job scheduler, Mercury Quality Center, Teradata

Informatica Developer

Confidential

Responsibilities:

  • Collected requirements from business users and analyzed them.
  • Extensively used flat files; designed and developed complex Informatica mappings using Expression, Aggregator, Filter, Lookup, and Stored Procedure transformations to ensure movement of data between various applications.
  • Designed and developed complex mappings using various transformations in Designer to extract the data from sources like Oracle, SQL Server and flat files to perform mappings based on company requirements and load into Oracle tables.
  • Well versed with creating and Altering Table, Triggers, Sequences, and other DDL, DML and DCL utilities of Oracle 9i and 10g.
  • Error checking & testing of the ETL procedures & programs using Informatica session log.
  • Performance Tuned Informatica Targets, Sources, mappings & sessions for large data files by increasing data cache size, sequence buffer length and target based commit interval.
  • Performed error handling for removing database exceptional errors.
  • Configured Informatica Power Exchange connection and navigator.
  • Created Registration, Data Map, configured Real-Time mapping and workflows for real-time data processing using CDC option of Informatica Power Exchange.
  • Used Debugger utility of the Designer tool to check the errors in the mapping and made appropriate changes in the mappings to generate the required results.
  • Worked with tools like TOAD to write queries and generate the result.
  • Provided knowledge transfer to the end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the Informatica mappings. Also involved in functional and technical design document sessions with the technical team and business users.
  • Actively involved in acquiring training on the Portico 8.0/9.0 versions of Provider Manager and Business Administrator modules of Portico.
  • Worked with the Data Analyzer, Data Quality, and Data Profiler tools to handle the business logic and exception logic for the inbound and outbound feeds.

Environment: Informatica PowerCenter 8.1, Informatica Power Exchange 8.1, DB2 Mainframe, Cobol, ORACLE 10g, UNIX, Windows NT 4.0, Sybase, UNIX Shell Programming, PL/SQL, TOAD, Putty, WinScp, Remedy, Teradata
