
Informatica Developer Resume Profile


PROFESSIONAL PROFILE

Dynamic, skilled business intelligence specialist with proven success in designing and implementing multiple large-scale data warehouses across several domains, improving business decision-making. Possess strong knowledge of creating and implementing ETL processes using a variety of tools. Repeatedly successful in delivering ETL solutions as well as providing other technical solutions and consultation to business and management.

TECHNICAL EXPERIENCE

  • 10.5 years of software experience in analysis, design, development, and testing, including 4.5 years of onshore experience.
  • Experience in full life cycle implementation of data warehouses, from gathering detailed business requirements and translating them into technical requirements through implementation, including data warehouse standards and procedures.
  • Solid hands on experience with Data warehousing tools, Data Analysis, ETL Development, Data Mapping, Unit Testing, Migration, Conversions, and Process Documentation.
  • Proficient in data modeling using dimension models with both star and snowflake schemas and Entity-Relationship modeling.
  • Good exposure to data modeling of data marts using the Erwin tool.
  • Good exposure to Teradata DBA utilities: Teradata Manager, Workload Manager, Index Wizard, Stats Wizard, and Visual Explain.
  • Have developed complex Teradata SQL queries.
  • Good knowledge on various relational databases like Teradata, Oracle, SQL Server, MS Access and DB2.
  • Developed complex ETL processes using Informatica PowerMart and PowerCenter between Teradata, Oracle, MS SQL Server, DB2, and flat files.
  • Experienced in writing complex SQL procedures and functions.
  • Skilled in providing data integrity solutions through the Informatica Data Quality and Data Explorer tools.
  • Experience in Testing, Debugging and Performance Tuning of targets, sources, mappings and sessions.
  • Experience in creating Transformations and Mappings using Informatica 9.x/8.x/7.x/6.x Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Handled change controls, impact analysis and production support.
  • Ability to learn quickly and apply new skills to existing problems, with a commitment to quality from requirements through design, construction, testing, and implementation.
  • Good analytical and logical programming skills with a solid conceptual understanding, and good presentation and interpersonal skills with a strong desire to achieve specified goals.

TECHNICAL SKILLS

ETL Tools

Informatica PowerCenter 9.x/8.x/7.x, ECMap, EC Gateway

RDBMS

Oracle 11g/10g/9i/8i, Teradata V2R5/V2R6, SQL, PL/SQL, DB2, MS SQL Server 2008/2005/2000/7.0/6.5, MS Access 7.0/2000

Business Intelligence

Business Objects 6.5

Data Modeling

MS Visio

Languages

SQL, PL/SQL, UNIX

C, C++, HTML, DHTML, JavaScript

Operating Systems

Windows 2008/2003/2000/NT 4.0, HP-UX 11.0, Linux

Industries/Verticals

Retail, Insurance, Telecom and Healthcare

PROFESSIONAL EXPERIENCE

  • The data warehouse provisions Fund Raising (FR) data to marketing professionals with specific data at specific levels to enable the following:
  • Targeted Marketing Campaigns
  • Program Reporting
  • Applied Analytics
  • Ad Hoc Research
  • These capabilities can now be delivered using FR data alone or FR data combined with other line-of-business (LOB) data, analyzing constituent-specific data for RFM (recency, frequency, monetary) purposes.

Role: Informatica and Teradata Senior Developer

  • Gathered requirements and built source to target mapping documents and ETL data flow documents
  • Created Informatica mappings with Pushdown Optimization to load data into the Teradata database.
  • Played a major role in understanding the business requirements and in designing the ETL process to load data into the data warehouse.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer - to define source and target definitions and code the data flow from source systems to the data warehouse.
  • Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter, Source Qualifier
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Defined Target Load Order Plan for loading data into Target Tables
  • Manage ETL developers and allocate tasks and projects, responsible for deliveries and task/project tracking
  • Provide solutions for workflow automation through ETL using Informatica
  • Create test plans, strategies and documentation of results
  • Create and manage best practices and standards for ETL processes
  • Perform code review, bug fixing, issues tracking and resolution
  • Worked on UNIX shell scripts to move data from EDW tables.
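A minimal sketch of such an EDW extract shell script, assuming hypothetical database, table, path, and logon values (EDW.CLAIM_FACT, tdprod, etc. are placeholders, not details from the project):

```shell
#!/bin/sh
# Sketch: nightly job that unloads rows from an EDW table to a flat file
# via a generated BTEQ EXPORT script. All object names, paths, and the
# logon string are illustrative assumptions.
LOAD_DT=$(date +%Y-%m-%d)
EXPORT_FILE=/tmp/claim_fact_extract.dat
BTEQ_SCRIPT=/tmp/export_claim_fact.bteq

# Build the BTEQ control script with the run date substituted in.
cat > "$BTEQ_SCRIPT" <<EOF
.LOGON tdprod/etl_user,etl_password;
.EXPORT DATA FILE=$EXPORT_FILE;
SELECT * FROM EDW.CLAIM_FACT
WHERE LOAD_DT = DATE '$LOAD_DT';
.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF

# bteq < "$BTEQ_SCRIPT"   # uncomment where the Teradata client tools exist
echo "Generated $BTEQ_SCRIPT"
```

The script only generates the BTEQ control file; the actual `bteq` invocation is left commented out since it requires a Teradata client installation.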

Technical Environment: Informatica 9.5, Teradata 14.0, UNIX, Oracle 11g

Confidential

  • At present, CareFirst receives 837s from trading partners through the Pipeline and CCA applications. All RealMed 837s are received by Pipeline, and all other trading partners send their 837s to CCA, which routes them to the appropriate adjudication system. CCA also sends these 837s to the TRMS system for business analysis and reporting purposes.
  • As CCA and Pipeline are planned to be sunset, a single Consolidated Gateway (CGW) is being developed to handle all incoming 837 transactions. CGW will be the only gateway to receive 837 transactions from all trading partners. There is a need to replicate the CCA process that sends the human-readable print version of the 837s to TRMS.

Role: Informatica and Teradata Senior Developer

  • Gathered requirements and built source to target mapping documents and ETL data flow documents
  • Played a major role in understanding the business requirements and in designing the ETL process to load data into the data warehouse.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer - to define source and target definitions and code the data flow from source systems to the data warehouse.
  • Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter, Source Qualifier
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Defined Target Load Order Plan for loading data into Target Tables
  • Manage ETL developers and allocate tasks and projects, responsible for deliveries and task/project tracking
  • Provide solutions for workflow automation through ETL using Informatica
  • Create test plans, strategies and documentation of results
  • Create and manage best practices and standards for ETL processes
  • Perform code review, bug fixing, issues tracking and resolution
  • Worked on UNIX shell scripts to move data from EDW tables.

Technical Environment: Informatica 9.5, Teradata 12.0, UNIX, Oracle 11g

Confidential

ETL Technical Lead

Responsibilities:

  • Managed 7 member ETL Offshore team.
  • Gathered requirements and built source to target mapping documents and ETL data flow documents
  • Played a major role in understanding the business requirements and in designing the ETL process to load data into the data warehouse.
  • Used Ab Initio client tools to define source and target definitions and code the data flow from source systems to the data warehouse.
  • Used Ab Initio to schedule jobs and monitor job status.
  • Developed Ab Initio jobs to load data from various sources into the data warehouse, applying transformations such as joins, aggregations, lookups, filters, and sorts.
  • Implemented Performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.

Technical Environment: Ab Initio, Teradata 12.0, UNIX, Oracle 11g, SQL Server

Confidential

ETL TEAM LEAD

Description: The Lites system receives eligibility and claims files from various core systems, performs compliance checks on incoming files based on their layout, and then processes claims and generates reports. Finally, all this information is loaded into the enterprise data warehouse.

Responsibilities:

  • Managed 8 member ETL team.
  • Gathered requirements and built source to target mapping documents and ETL data flow documents
  • Played a major role in understanding the business requirements and in designing the ETL process to load data into the data warehouse.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer - to define source and target definitions and code the data flow from source systems to the data warehouse.
  • Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter, Source Qualifier
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Defined Target Load Order Plan for loading data into Target Tables
  • Manage ETL developers and allocate tasks and projects, responsible for deliveries and task/project tracking
  • Provide solutions for workflow automation through ETL using Informatica
  • Create test plans, strategies and documentation of results
  • Create and manage best practices and standards for ETL processes
  • Perform code review, bug fixing, issues tracking and resolution
  • Created complex SQL queries in Teradata 12.
  • Worked on UNIX shell scripts to move data from EDW tables.
  • Good knowledge of the Teradata load utilities MultiLoad, FastLoad, and TPump.
  • Good exposure to Teradata DBA utilities: Teradata Manager, Stats Wizard, Index Wizard, Workload Manager, and Visual Explain.
  • Deployed over 200 production jobs and managed change control processes
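A FastLoad control script of the kind these load utilities use typically looks like the following sketch; the staging table, column layout, file path, and logon string are hypothetical placeholders, not details from the project:

```
/* Hypothetical FastLoad script: bulk-loads a pipe-delimited extract
   into an empty staging table. All names are placeholders. */
LOGON tdprod/etl_user,etl_password;
DROP TABLE STG.MEMBER_STG_ERR1;
DROP TABLE STG.MEMBER_STG_ERR2;
BEGIN LOADING STG.MEMBER_STG
      ERRORFILES STG.MEMBER_STG_ERR1, STG.MEMBER_STG_ERR2;
SET RECORD VARTEXT "|";
DEFINE member_id (VARCHAR(18)),
       eff_dt    (VARCHAR(10)),
       plan_cd   (VARCHAR(8))
       FILE=/data/inbound/member.dat;
INSERT INTO STG.MEMBER_STG
VALUES (:member_id, :eff_dt, :plan_cd);
END LOADING;
LOGOFF;
```

FastLoad requires an empty target table and two error tables; MultiLoad or TPump would be used instead when the target already holds data or needs trickle-feed updates.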

Technical Environment: Informatica 8.5, Teradata 12, UNIX, ClearCase, and TFS

Confidential

Senior Informatica and Teradata Developer

The CIS system extracts data from four operational source systems (ECC, WMDS, Careplanner, and TriMed), transforms the data into separate medical management subject areas, and then loads it into the Clinical Subject Area in the Enterprise Data Warehouse (EDWard). The goal is to collect and report information across the patient continuum of care for all WellPoint medical management programs (UM, CM, DM) in one common Clinical Subject Area. In addition to the four operational source systems, an inbound feed of Disease Management (DM) data from HMC is also loaded into the Clinical Subject Area.

Responsibilities:

  • Managed 8 member ETL team.
  • Gathered requirements and built source to target mapping documents and ETL data flow documents
  • Played a major role in understanding the business requirements and in designing the ETL process to load data into the data warehouse.
  • Used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer - to define source and target definitions and code the data flow from source systems to the data warehouse.
  • Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter, Source Qualifier
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Defined Target Load Order Plan for loading data into Target Tables
  • Manage ETL developers and allocate tasks and projects, responsible for deliveries and task/project tracking
  • Provide solutions for workflow automation through ETL using Informatica
  • Create test plans, strategies and documentation of results
  • Create and manage best practices and standards for ETL processes
  • Perform code review, bug fixing, issues tracking and resolution
  • Created complex SQL queries in Teradata 12.
  • Worked on UNIX shell scripts to move data from EDW tables.
  • Good knowledge of the Teradata load utilities MultiLoad, FastLoad, and TPump.
  • Good exposure to Teradata DBA utilities: Teradata Manager, Stats Wizard, Index Wizard, Workload Manager, and Visual Explain.
  • Deployed over 200 production jobs and managed change control processes

Technical Environment: Informatica PowerCenter 8.x, Teradata, UNIX

Confidential

Informatica and Teradata Senior Developer

  • Confidential is taking a major step toward centralizing multiple global data sources into a central repository, named Confidential NDW, to support effective decision-making. As part of it, NDW TRUE is one of the modules, integrating product testing data into the central warehouse and building multidimensional views on top of it. Cognos reports are built on top of these multidimensional views to support crucial business decisions for a product launch in the market.

Responsibilities:

  • Maintained the Teradata development database.
  • Built the presentation layer in a star schema from the TRUE data warehouse, which exists in third normal form.
  • Created measures in Teradata without using cubes, and built these measures at different aggregation levels.
  • Fine-tuned ETL and Cognos reporting SQL queries.
  • Used UNIX shell scripts to invoke Teradata utilities.
  • Implemented incremental loading into the target database using Teradata SQL queries.
  • Helped TRUE ETL and Cognos team members create tables, indexes, macros, and views.
  • Extensively used COLLECT STATISTICS to fine-tune SQL queries.
  • Created Informatica mappings to load data into dimension and fact tables from the data warehouse.
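The incremental-load and statistics work above can be sketched in Teradata SQL; every table, column, and database name here is a hypothetical placeholder, not an object from the project:

```sql
-- Hypothetical incremental load: insert only rows newer than the last
-- recorded load, tracked in a control table.
INSERT INTO DW.TEST_RESULT_FACT (test_id, product_id, run_dt, score)
SELECT s.test_id, s.product_id, s.run_dt, s.score
FROM   STG.TEST_RESULT s
WHERE  s.run_dt > (SELECT MAX(last_load_dt)
                   FROM   DW.ETL_CONTROL
                   WHERE  table_nm = 'TEST_RESULT_FACT');

-- Refresh optimizer statistics on the join/filter columns after the
-- load so the Teradata optimizer keeps producing good plans.
COLLECT STATISTICS ON DW.TEST_RESULT_FACT COLUMN (run_dt);
COLLECT STATISTICS ON DW.TEST_RESULT_FACT COLUMN (product_id);
```

Collecting statistics after each load is what makes the subsequent EXPLAIN-based tuning (Visual Explain, Index Wizard) meaningful, since stale statistics mislead the optimizer.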

Technical Environment: Informatica PowerCenter 7.0 and Teradata V2R5

Confidential

ETL Senior Developer

  • The Book of Business Data Mart will contain data from the following subject areas: Premiums, Premium Rate Actions, Group Coverage, Enrollment, Claims, Capitation, Rx Claims, Broker Commissions and DxCG Risk Scores. This data will be aggregated at the group and effective date level within Book of Business and combined with several additional sources of data used to enrich Book of Business. These include:
  • Completion Factors
  • Plan Wide Discount Factors
  • Disease Management, Wellness, and QIPS add-on expenses
  • Allocated SG&A expense
  • Coverage Code Groups
  • Market Segment Groups
  • Customer Affiliation and Category
  • Sales Representative Assignments
  • Sales Representative - Manager Hierarchy
  • Customer SIG Segment Assignments

Responsibilities:

  • Managing the Development and Testing database.
  • Helped offshore team members create tables, indexes (secondary and join), macros, and views.
  • Created FastLoad scripts to load source file data into Teradata tables.
  • Used MultiLoad to load data into tables through DataStage and fixed bugs encountered while loading the data.
  • Extensively used collect stats to fine tune the queries.
  • Fine tuned the long running SQL queries.
  • Created SQL scripts to implement referential integrity checks and monitored development server sessions through Teradata Manager.
  • Used Teradata Manager to abort blocked sessions.
  • Released archive locks and handled skewed sessions.
  • Helped the system testing team write data validation queries. Communicated with onshore DBAs on various offshore Teradata database issues and provided daily status updates.
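A data validation query of the kind mentioned above often takes the form of an orphan-row check; the fact and dimension table names below are illustrative assumptions, not objects from the project:

```sql
-- Hypothetical referential-integrity validation: report fact rows whose
-- group key has no matching row in the group dimension.
SELECT f.group_id,
       COUNT(*) AS orphan_rows
FROM   DM.PREMIUM_FACT f
LEFT JOIN DM.GROUP_DIM g
       ON f.group_id = g.group_id
WHERE  g.group_id IS NULL
GROUP BY f.group_id;
```

A zero-row result confirms the load preserved referential integrity; any returned rows point system testing directly at the offending keys.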

Technical Environment: Informatica PowerCenter 7, Teradata V2R5, and UNIX

Confidential

Involved in developing Provider module in Informatica and Teradata.

Responsibilities:

  • Extensively worked on Informatica tools such as Informatica Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Transformations) and Informatica Workflow Manager.
  • Used Source Analyzer to import data from relational tables.
  • Responsible for developing mappings to extract/transform/load (ETL) using Informatica 7.1.
  • Moved the code across the folders using Repository Manager.
  • Created and scheduled workflows and Sessions using the Workflow Manager.
  • Performed Debugging and Performance Tuning of Mappings and complex SQL queries.

Confidential

Informatica Developer

It was a maintenance project, worked in production support and worked on new enhancements.

Responsibilities:

  • Extensively worked on Informatica tools such as Informatica Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Transformations) and Informatica Workflow Manager.
  • Used Source Analyzer to import data from relational tables.
  • Responsible for developing mappings to extract/transform/load (ETL) using Informatica 7.1.
  • Moved the code across the folders using Repository Manager.
  • Created and scheduled workflows and Sessions using the Workflow Manager.
  • Performed Debugging and Performance Tuning of Mappings and complex SQL queries.

Technical Environment: Informatica PowerCenter 7, Oracle, SQL Server

Confidential

ETL Developer

AmeriSource Bergen has a Service Level reporting system that was built on the mainframe, with the data sent to the customer. AmeriSource Bergen would like to enhance the Service Level Reporting System with better performance, new rules, drill capability, and group policies.

Confidential

  • The objective of this project is to create text files in a specific format, feed them into the MSI program running on the mainframe system, and obtain customer- and contract-related reports from the web. The InterlinX ETL load processes (daily, history, system assurance, reconciliation, and sync) are involved in this project.
  • As an ETL Developer, I was responsible for taking care of InterlinX daily jobs (scheduled as well as run-on-demand), developing design logic, testing daily and monthly data extracts, and ensuring 100% data quality through the InterlinX Sync Process and System Assurance Process.
  • AmeriSource Bergen Data Warehouse Consolidation
  • Confidential is one of the leading pharmacy product distributors in Confidential. This company has two major transaction systems, viz., STAR and Distrack. AmeriSource Bergen would like to consolidate data from both transaction systems into a single data warehouse: merge customer information from STAR and Distrack into a single consolidated Customer Dimension, and expand the information captured from STAR to include additional details such as business type, private label code, etc. The project also identifies different aggregate levels and builds suitable aggregates that help users in information retrieval.

Responsibilities:

  • Involved in designing, developing, and documenting the ETL (Extract, Transform, Load) strategy for custom snowflake schemas to populate the data warehouse from various source systems such as DB2, Oracle, and flat file feeds, using Informatica
  • Identified source, target, and transformation bottlenecks and performed performance tuning
  • Created and loaded data to heterogeneous targets using PowerCenter 5.1
  • Created complex mappings using the Mapping Designer and reusable mapplets using the Mapplet Designer
  • Created different transformations such as Expression, Sequence Generator, and Router

Technical Environment: Informatica PowerCenter 5, Oracle, SQL Server
