
Sr Informatica Engineer Resume


PROFESSIONAL SUMMARY

  • Over nine years of IT experience and technical proficiency in data warehousing, spanning business requirements analysis, application design, data modeling, development, testing, and documentation.
  • Strong ETL experience using Informatica PowerCenter 8.x/7.x/6.x (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Mappings, Mapplets, Transformations.
  • Worked on various projects through all phases of the Data Warehouse Development Life Cycle, including requirements gathering and analysis, design, development, testing, performance tuning, and production support.
  • Good experience in managing, scheduling, and monitoring Informatica workflows.
  • Designed and developed complex mappings with varied transformation logic such as connected, unconnected, and dynamic lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, and more.
  • Hands on experience in tuning mappings, identifying and resolving performance bottlenecks at various levels like sources, targets, mappings, and sessions.
  • Designed and maintained Informatica PowerCenter mappings for extraction, transformation, and loading between Oracle, Teradata, and Sybase databases.
  • Extensively followed Ralph Kimball and Bill Inmon methodologies. Designed the data mart model with Erwin using the Star Schema methodology. Developed logical and physical data models that capture current-state/future-state data elements and data flows using Erwin. Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Worked with Slowly Changing Dimensions Type 1, Type 2, and Type 3 based on requirements. Solid understanding of Star and Snowflake schemas, the relationship between fact and dimension tables, and primary, foreign, compound, and surrogate key relationships.
  • Strong knowledge in Data Warehousing concepts, Teradata utilities (BTEQ, FASTLOAD, FASTEXPORT, MULTILOAD, Teradata Administrator, SQL Assistant, PMon), UNIX scripting.
  • Used ETL Informatica PowerCenter 8.1 for loading data from Oracle/Sybase/Flat Files into a target database.
  • Responsible for extracting data from Oracle/Sybase/Flat Files.
  • Highly experienced in SQL performance tuning and debugging of existing ETL process.
  • Experienced with Logical and Physical Data Model design process in Star and Snowflake Schema, 3NF.
  • Experienced in mentoring Teradata development teams, data modeling, program code development, test plan development, dataset creation, testing and results documentation, defect analysis, and bug fixing.

TECHNICAL SKILLS

Data Modeling:

ERWIN 7.1/4.2/4.0/3.5/3.x, Visio, Oracle Designer 2000, Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, FACT, Dimensions), Entities, Attributes, Cardinality, ER Diagrams.

Data warehousing & ETL:

Informatica PowerCenter 9.1/8.6/8.1.1/7.1/7.0/6.2/6.1/5.1.2/5.1.1/4.7 (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Informatica PowerMart 6.2/6.1/5.1.2/5.1.1/4.7, Power Connect, PowerPlug, PowerAnalyzer, SuperGlue, Power Exchange, Oracle Warehouse Builder 11g Release 1 (Warehouse Builder Client and Server, Design and Runtime Repositories, Design Browser and Runtime Audit Browser, Client Version, Integration with Oracle Application Server), Datamart, DataStage, ETL, OLAP, ROLAP, MOLAP, OLTP, Star Schema, Snowflake Schema, Autosys, Control M, Maestro

BI & Reporting:

OBIEE, Oracle Discoverer, Answers & Publisher, Business Objects Developer Suite 5.1/6.0, Cognos, SQL Developer and others.

Databases:

Oracle 10g/9i/8i/8.x/7.x, DB2 8.0/7.0/6.0, Teradata V2R5/V2R4/V2R3, Sybase SQL Server 12.0/11.x, MS SQL Server 2005/2000/7.0/6.5, MS Access 7.0/’97/2000, PL/SQL, SQL*Loader, Developer 2000

Job Scheduling & Other tools:

CA Autosys, BMC Control M, Cron, IBM Maestro, Quest TOAD 7.6, Quest Central for DB2

Environment:

UNIX (Sun Solaris, HP-UX, AIX), Windows 2003/2000/XP/98, Sun Ultra, Sun SPARC, Sun Classic, RS/6000, HP 9000, Compaq Tru64, SCO Unix, Mainframes

Others:

Cobol, Java, XML, JavaScript, XHTML, HTML, DHTML, C++, Visual C++, VBScript, CSS, SQL

EDUCATION
B.Tech in Computer Science

PROFESSIONAL PROJECTS

Confidential, Cambridge, MA Feb 2011 - Till Date
Sr Informatica Engineer - Customer Master Integration Project, Aggregate Spend Project, Commercial Compliance Project, Government Pricing, Network Contract Reporting, ASAP, VA data

Responsibilities:

  • The COM-COM project was developed primarily to integrate data from ASDR and the data warehouse and load it into the commercial data mart tables.
  • Responsible for new enhancements to the CM Integration project, i.e., loading data into the customer master data mart rather than the data warehouse.
  • Data loaded into the data warehouse is redirected into the customer master data mart so that it can be easily accessed by several processes across the organization.
  • For the CM Integration project, used Oracle tables and flat files as both sources and targets.
  • Involved in requirements gathering, creating the technical specification document, developing phase 1, 2, and 3 mappings, testing the mappings, providing support, and serving on the SIT bug-fix team.
  • The Aggregate Spend project is a federal compliance project developed to compute the expenses spent on healthcare professionals, including conferences, gifts, and travel. These expenses must be tracked and monitored by every pharmaceutical and medical organization and reported to the government.
  • Involved in the performance-testing team for the COGNOS reports generated for the COM-COM project.
  • Developed several mappings for data sources such as CONCUR, TEXTBOOK, DISPLAY REQUEST, IIS, CONCUR MEETING, EDUCATIONAL MATERIAL, HCPP, SPEAKER BUREAU, CTMS, AHM, and AXIOM.
  • Gathered data from, and loaded data into, Oracle tables, CSV files, web services, and Salesforce objects.
  • Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.
  • Involved in development, testing, and documentation, and served on the bug-fix team.
  • The ASAP (AHM Speaker Attendee Program) project identifies meetings, meeting attendees, and persons for meetings conducted on or after January 1, 2011 against a listed set of conferences.
  • Used Salesforce objects as targets and AHM tables such as meetings, meeting attendees, and persons as sources to load the relevant data into these Salesforce objects.
  • Involved in development, testing, and creating SSP documents, and prepared the mappings for migration to the production environment.
  • Supported the project in production until it was completely stabilized.
  • In the Government Pricing project, worked with web services to fetch data from the lists on the pricing SharePoint site.
  • Used web services, CSV files, and Oracle tables as sources and loaded the data into Oracle relational tables.
  • Used the xsd.exe tool to generate XSDs from the XML produced by SharePoint.
  • Also used the XML Parser transformation to parse the generated XSD and load the data accordingly.
  • Worked with web services lists, the xsd.exe tool, the Web Services Consumer transformation, and the XML Parser transformation.
  • Uploaded data files into the Oracle database tables on the UNIX server using a scheduled cron job (see the sketch after this list).
  • Developed additional mappings to load data into the chargeback tables.
  • Several columns are computed and later audited to verify that they match the computed values, and values are assigned accordingly.
  • This data is used by the COGNOS team to generate reports that are submitted to the business users for review.
  • For VA data, calculated MTD, YTD, and other aggregated columns for both fiscal-year and calendar-year data available in the DW tables and loaded them into the COM schema.
  • The VA data set contains three years of historical data, and several aggregated columns need to be generated as required.
  • Apart from the projects mentioned, was actively involved in several small-scale, primarily internal projects, such as maintaining vendor rosters (information received from vendors that must be maintained for internal purposes, e.g., AHM Feed, AHM HCP Feed, Good Works HCP Roster).
  • Extracted data from DB2, Teradata and Oracle databases.
  • Involved in fetching the data from the MDM_STG schema rather than from the DW tables.
  • Every mapping developed must have a technical specification document and a mapping document uploaded to the project SharePoint site.
  • Used almost all Informatica transformations, including Router, Expression, Filter, Update Strategy, Joiner, Lookup, Union, Sorter, Rank, and Aggregator.
  • Mappings and mapping documentation were developed per company standards.
  • Parameter files are required for each workflow and are placed at the server location (see the sketch after this list).
  • A standard governs how parameter files are created and maintained so that they remain easily readable and understandable to others.
  • TIDAL, a GUI scheduling tool, is used to automate and schedule the Informatica workflows.
  • Toad, an Oracle GUI tool, is used for querying the database and verifying that mappings work as intended.
  • Shell scripting is used at a low level to remove unknown characters from files, align file layouts, and so on.
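To illustrate the parameter-file and file-cleanup conventions mentioned above, here is a minimal sketch. The folder, workflow, session, connection, and file names are hypothetical placeholders, not the project's actual artifacts.

    # wf_cm_load.param - hypothetical PowerCenter parameter file for one workflow
    [COM_DM.WF:wf_cm_load.ST:s_m_cm_load]
    $DBConnection_Src=ASDR_ORA
    $DBConnection_Tgt=COM_DM_ORA
    $InputFile_cust=/data/inbound/customers.dat
    $$Load_Date=SYSDATE

And a cleanup script of the kind described in the last bullet, invoked from cron before the session runs:

    #!/bin/ksh
    # clean_file.ksh - strip non-printable characters and carriage returns from an
    # inbound flat file so the Informatica session reads a well-formed file
    IN_FILE=$1
    OUT_FILE=${IN_FILE%.dat}_clean.dat
    tr -cd '\11\12\15\40-\176' < "$IN_FILE" | tr -d '\r' > "$OUT_FILE"
    # hypothetical cron entry: 0 2 * * * /opt/etl/scripts/clean_file.ksh /data/inbound/customers.dat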

Environment: Informatica 9.1/8.6, Oracle 10g, Toad 9.7, Teradata, TIDAL, SalesForce, UNIX, Web services, xsd.exe, Shell scripting.

Confidential, New Jersey Aug 2010 - Dec 2010
ETL-Informatica Developer – Backend Crediting Systems (BECS) and GPPR External (Catalog)

Responsibilities

  • BECS is a project that mainly focuses on integrating claims from both Avaya and Nortel after Avaya acquired Nortel.
  • Worked with domain experts since design documents were not in place.
  • Developed mappings using transformations such as Stored Procedure, Expression, Router, Filter, and Joiner for the BECS project.
  • Implemented mappings using both relational tables and flat files as targets.
  • Wrote several stored procedures in SQL Server and Oracle.
  • Worked on loading data from several flat-file sources using Teradata MultiLoad (MLOAD) and FastLoad (FLOAD).
  • BECS is a data integration project that migrates data from a SQL Server database to an Oracle database.
  • Several critical problems were resolved through Informatica, and shell scripts were written to execute the stored procedures through an Informatica command task (see the sketch after this list).
  • Overall, BECS provided extensive knowledge of integrating two servers, their firewall issues, and related concerns.
  • In the Catalog project, pricing data from the existing 60-Days Notification data is reused, and separate mappings are created for both the material master and the customer master using these material numbers.
  • Used flat files as targets for these mappings because they can easily be fed as inputs along with the SAP data for GPPR.
  • Gathered data from salesforce.com through the SFDC connector into Oracle tables.
  • Used Aggregator, Joiner, Router, Filter, Expression, Stored Procedure, and Rank transformations during the development of this project.
  • Designed and developed UNIX scripts to load tables and to schedule the jobs.
  • Apart from these, also involved in modifying the existing 60-Days Notification mappings because the file was missing a few records for a few material numbers.
  • Developed, tested, and validated Teradata BTEQ scripts.
  • Analyzed the 130 mappings in place for the 60-Days Notification and identified where a query could be included to fill the small gap of the remaining material numbers.
  • Involved in the development, UAT, and production phases of the BECS project, and through the UAT phase of the Catalog project. Documented several test cases and the mapping logic.
  • Through this project, worked with SAP data and learned how data from SAP can be accessed through Informatica.
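As a sketch of the stored-procedure wrapper mentioned above (the kind launched from an Informatica command task), assuming a hypothetical procedure name, connection string, and password variable:

    #!/bin/ksh
    # run_proc.ksh - hypothetical wrapper invoked by an Informatica command task to
    # execute an Oracle stored procedure; a non-zero exit code fails the task
    sqlplus -s "etl_user/${ORA_PWD}@BECSDB" <<EOF
    whenever sqlerror exit failure
    exec becs_stage.merge_claims
    exit
    EOF
    RC=$?
    if [ $RC -ne 0 ]; then
        echo "Stored procedure failed, rc=$RC" >&2
        exit $RC
    fi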

Environment: Informatica 8.6, Query Analyzer, SQL Navigator, Oracle 10g, Toad 9.5, UNIX O/S, XD Portal (bug tracking), Teradata, Use cases, MS SQL Server 2005/2000, PL/SQL, AutoSys, Shell Programming

Confidential, Jersey City, New Jersey Jan 2009 – Aug 2010
ETL - Informatica Developer – Credit Risk & Market Risk Data Warehouse

Responsibilities:

  • Prepared functional requirements and detailed design specifications for ETL processes and database designs.
  • Extensively involved in data extraction, transformation, and loading (ETL) from source to target systems using Informatica PowerCenter.
  • Involved in analyzing the scope of the application and defining relationships within and between groups of data, Star schema, and Snowflake schema.
  • Implemented various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Stored Procedure, and Router.
  • Used Informatica mappings to load data between Oracle and flat files: Oracle to flat files, flat files to Oracle, and Oracle to Oracle.
  • Created and Monitored Batches and Sessions using Informatica Power Mart Server.
  • Performance tuning of Informatica mappings.
  • Developed methods to efficiently reuse existing components. Participated in ETL process design and maintenance sessions and used the Control-M job scheduler.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Prepared a development strategy summary for each data domain, covering proposed changes to that domain, analytics, reports, and data quality routines.
  • Collaborated with team members to resolve issues and ensure successful delivery.
  • Worked on PowerExchange, helping the organization unlock mission-critical operational data and deliver it on demand. Performed gap analysis on data models to identify potential data set modifications and enhancements.
  • Successfully moved the Sessions and Batches from the Development environment to Production environment.
  • Monitored workflows and sessions using the PowerCenter and PowerExchange Workflow Monitor. Used the Informatica scheduler for scheduling the workflows.
  • Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source-to-target mappings/source views and to verify data in target tables (see the sketch after this list).
  • Reviewed Informatica ETL mappings/workflows and SQL that populates EDW and data mart Dimension and Fact tables to ensure accuracy of business requirements.
  • Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs.
  • Handled release tickets for defects, enhancements, maintenance and system upgrades, and process optimization activities. Reverse engineered and documented existing ETL program code.
  • Performed data quality and integrity testing within the EDW and the star dimensional model data mart, using PowerExchange products that provide high-performance access to the wide variety of mission-critical data used.
  • Responsible for designing, developing, and testing the software (Informatica, PL/SQL, UNIX shell scripts) that maintains the data marts (loading data and analyzing it using OLAP tools).
  • Involved in developing data marts for specific business areas such as Marketing and Finance.
  • Implemented continuous improvement initiatives within the team.
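A minimal sketch of the kind of Teradata validation query referenced above, run through BTEQ from a shell wrapper; the database, table, and logon names are hypothetical placeholders:

    #!/bin/ksh
    # validate_counts.ksh - compare source-view and target-table row counts in Teradata
    bteq <<EOF
    .LOGON tdprod/etl_test,${TD_PWD};
    SELECT 'SRC' AS side, COUNT(*) AS row_cnt FROM stg_db.v_position_src
    UNION ALL
    SELECT 'TGT', COUNT(*) FROM edw_db.fact_position;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF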

Environment: Informatica 8.6, ERWIN (Data Model), Teradata SQL Assistant, TOAD, Kintana, Sun Solaris (Sun Fire), SQL, PL/SQL (Programming Lang), SSH Secure Shell (FTP tool), Data Marts, Trillium (Data Quality), IBM DB2 (database), Informatica Power Center 7.1.1, Power Exchange, Oracle 10g, CVS (version controlling), XD Portal (bug tracking), UNIX (O/S), XML Spy (to generate XML Schema) and Edit Plus (IDE).

Confidential, Alpharetta, GA May 2008 – Dec 2008
ETL- Informatica Developer – Customer Billing Data Mart

Responsibilities:

  • Responsible for Business Analysis and Requirements Collection. Gather and analyze business requirements by interacting with business clients and end-users.
  • Developed Informatica ETL mappings, sessions, Worklets, and workflows.
  • Implemented performance tuning of mappings, sources, targets, and sessions, and identified bottlenecks.
  • Extensively used Erwin for Logical / Physical data modeling and Dimensional Data Modeling, and designed Star Schemas.
  • Extensively worked on transformations like Expression, Router, Dynamic Lookup, Update Strategy, Sorter, Aggregator, Union, Source Qualifier transformations.
  • Created reusable mapplets and transformations to maintain row-level security across the clients.
  • Developed functional requirements (use-case modeling), system requirements, and design by analyzing the existing legacy system and other heterogeneous source systems, and performed impact analysis.
  • Loaded data into Teradata using Informatica, FastLoad, BTEQ, FastExport, MultiLoad, and Korn shell scripts (see the sketch after this list).
  • Imported Sources and Targets to create Mappings based on business logic and developed Transformations using Power center Designer. Used Informatica Workflow Manager, Workflow Monitor to create sessions and batches.
  • Unit tested the developed ETL scripts, created test SQLs, and handled UAT issues. Enhanced SQLs for performance tuning. Rolled out documentation for the ETL Process.
  • Supported the production implementation of the data warehouse, monitored jobs, and fixed failed scripts.
  • Worked with PowerExchange, which honors the security requirements of all sources, targets, and architectures and employs industry-standard encryption to maintain security across the chosen architecture.
  • Responsible for the ETL process across development, test, and production environments.
  • Supported multiple platforms through the high-performance environment provided by PowerCenter and PowerExchange.
  • Involved in preparing detailed design documents, setting up the physical database for development, and designing the ETL architecture.
  • Developed Informatica mappings using all transformations and used Workflow Manager to create sessions and workflows that run with the logic embedded in the mappings.
  • Guided the project's technical resources through design, construction, testing, and implementation, and developed and managed the BI methodology.
  • Provided support and technical guidance to resolve support issues as needed.
  • Defined test-case scenarios and performed manually executed unit testing. Experience using LoadRunner to test reports and software batch processes.
  • Responsible for aligning business requirements with the application and technical architecture.
  • Tracked new trends and directions in technology and determined their potential impact on the project team.
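To illustrate the Korn-shell-driven Teradata loads mentioned above, here is a minimal FastLoad sketch; the logon, staging table, column, and file names are hypothetical placeholders:

    #!/bin/ksh
    # load_billing.ksh - FastLoad a pipe-delimited flat file into an empty staging table
    fastload <<EOF
    LOGON tdprod/etl_load,${TD_PWD};
    DATABASE stg_db;
    SET RECORD VARTEXT "|";
    DEFINE acct_id  (VARCHAR(18)),
           bill_dt  (VARCHAR(10)),
           bill_amt (VARCHAR(18))
    FILE = /data/inbound/billing.dat;
    BEGIN LOADING stg_db.stg_billing
          ERRORFILES stg_db.stg_billing_e1, stg_db.stg_billing_e2;
    INSERT INTO stg_db.stg_billing (acct_id, bill_dt, bill_amt)
    VALUES (:acct_id, :bill_dt, :bill_amt);
    END LOADING;
    LOGOFF;
    EOF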

Environment: Informatica 8.1, ERWIN (Data Model), SSH Secure Shell (FTP tool), SQL, PL/SQL, Clear Case (version controlling), Power Center 7.1.1, Power Exchange, Teradata SQL Assistant, Teradata Administrator, Teradata Utilities (BTEQ, FastLoad, Fast Export, MultiLoad Update/Insert/Delete/Upsert), ERWin Data Modeler, Control-M, Teradata Warehouse Miner, XD Portal (bug tracking), UNIX (o/s), Use cases, RUP, Legacy Systems, Oracle 10g/9i, Trillium 7.0, MS SQL Server 2005/2000, PL/SQL, Business Objects XI/6.5, AutoSys, Shell Programming, SQL*Loader, IBM DB2 8.0, Toad, Excel and Unix scripting, Sun Solaris, Windows NT

Confidential, New Jersey Apr 2007 – May 2008
ETL - Informatica Developer – Global Consumer Data Mart (Credit Card Data Processing)

Responsibilities:

  • Performed business analysis and requirements collection, and translated the high-level design spec into simple ETL coding and mapping standards.
  • Developed logical and physical data models that capture current-state/future-state data elements and data flows using Erwin.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Used Workflow Manager for creating and maintaining the Sessions and Workflow Monitor to monitor, edit, schedule, copy, abort and delete the session.
  • Designed the architecture through effective data modeling and implemented database standards and processes. Performed data profiling and defined enterprise business data hierarchies.
  • Analyzed and designed the ETL architecture, created templates, and handled development, deployment, maintenance, and 24x7 support.
  • Built the ETL architecture and source-to-target mappings to load data into the data warehouse.
  • Designed the logical/physical data warehouse out of the staging database to facilitate reporting.
  • Designed procedures for moving data from all systems into the data warehousing system. The data was standardized to store the various business units in tables.
  • Created Stored Procedures for data transformation purpose.
  • Generated PL/SQL and shell scripts for scheduling periodic load processes (see the sketch after this list).
  • Extensively worked on the Database Triggers, Functions and Database Constraints.
  • Tracked new trends and directions in technology and determined their potential impact on the project team.
  • Wrote UNIX shell scripts to automate processes.
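A minimal sketch of how such a periodic load might be scheduled from a shell script; the service, domain, folder, workflow, and path names are hypothetical placeholders, and the pmcmd flags shown are the PowerCenter 8.x form:

    #!/bin/ksh
    # nightly_load.ksh - start the nightly PowerCenter workflow and fail loudly on error
    pmcmd startworkflow -sv INT_SVC_PROD -d DOM_PROD \
          -u etl_user -p "$INFA_PWD" \
          -f GCDM_FOLDER -wait wf_daily_card_load
    RC=$?
    [ $RC -ne 0 ] && { echo "Workflow failed, rc=$RC" >&2; exit $RC; }

    # hypothetical crontab entry scheduling the script at 01:30 every day:
    # 30 1 * * * /opt/etl/scripts/nightly_load.ksh >> /opt/etl/logs/nightly_load.log 2>&1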

Environment: Informatica 7.0/8.1, ERWIN (Data Model), SSH Secure Shell (FTP tool), Trillium (Data Quality), IBM DB2 (database), CVS (version controlling), XD Portal (bug tracking), UNIX (o/s), XML Spy (to generate XML Schema), Edit Plus (IDE), Sun Solaris 9, Oracle 10g, Windows XP/2000, TOAD 8.5.3.2, COGNOS Report Net, COGNOS Framework Manager, Rational Clear Case (for version control).

Confidential, Boston, MA Nov 2005 - Mar 2007
ETL - Informatica Developer – BASEL II Compliance Data Mart

Responsibilities:

  • Developed, designed, maintained, tested, installed, and modified ETL (Extraction, Transformation, and Loading) processes within the context of Business Intelligence and Corporate Performance Management solutions.
  • Extensively involved in data extraction, transformation, and loading (ETL) from source to target systems using Oracle Warehouse Builder (OWB), including OWB performance tuning and monitoring.
  • Performance tuning of SQL Queries, Sources, Targets and sessions.
  • Parsing high-level design spec to simple ETL coding and mapping standards.
  • Created entity-relationship diagrams and maintained corresponding documentation covering all attributes, table names, and constraints.
  • Built the ETL architecture and source-to-target mappings to load data into the data warehouse.
  • Responsible for Data Analysis and Requirements Collection
  • Performed Data Cleansing and Data Analysis. Researched sources and identified key attributes for Data Analysis. Analyzed and created Facts, Dimension tables.
  • Prepared the technical design document for extracting the data received from the various source systems.
  • Developed the technical design specification to extract data from the data mart tables and export it as text files to the downstream system (see the sketch after this list).
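A minimal sketch of such an export to a delimited text file, done here with a SQL*Plus spool from a shell script; the schema, table, column, and file names are hypothetical placeholders:

    #!/bin/ksh
    # export_extract.ksh - spool a data mart table to a pipe-delimited text file
    OUT_FILE=/data/outbound/basel_exposure_$(date +%Y%m%d).txt
    sqlplus -s "etl_user/${ORA_PWD}@BASELDB" <<EOF > /dev/null
    set pagesize 0 feedback off heading off trimspool on linesize 2000
    spool $OUT_FILE
    select exposure_id || '|' || counterparty_id || '|' || exposure_amt
      from dm_basel.fact_exposure;
    spool off
    exit
    EOF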

Environment: Informatica 8.1, Data Modeling, Data Analysis, Business Analysis, Erwin 3.5.2, Oracle Warehouse Builder 10g Release 2 (Warehouse Builder Client and Server, Design and Runtime Repositories, Design Browser and Runtime Audit Browser, Client Version, Integration with Oracle Application Server), Business Objects 6.0, PL/SQL, XML, Oracle 8i/7.3, SQL Server 2000, Maestro, DB2 6.0, Windows 2000, AIX 4.3.3, Shell Scripting.

Confidential, Chennai, INDIA Jul 2003 – Oct 2005
Client: Merrill Lynch, Hopewell, New Jersey and BCBS, Detroit, Michigan
Informatica Developer – Commercial Banking Data Mart – EDWH,
Equity Traded System – Middle Office Data Reconciliation

Responsibilities:

  • Requirements gathering, Source data analysis and design.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time. Developed best practices and procedures for ETL development.
  • Developed data collection and transformation mappings and designed the data warehouse data model.
  • Translated the high-level design spec into simple ETL coding and mapping standards.
  • Created entity-relationship diagrams and maintained corresponding documentation covering all attributes, table names, and constraints.
  • Performed Data Cleansing and Data Analysis. Researched sources and identified key attributes for Data Analysis.
  • Developed Informatica jobs from source systems to the target system. Extensively involved in data extraction, transformation, and loading (ETL) from source to target systems using mainframe jobs.
  • Created Impromptu catalogs and reports based on business requirements.
  • Developed and tested all the backend programs, Informatica mappings, and update processes. Involved in testing all stages of the warehouse development and deployment life cycle.
  • Designed, built, and tested extraction/transformation processes and scripts. Managed integration and performance testing. Created unit test cases, checklists, and detailed design documents.
  • Developed data Mappings between source systems and warehouse components.
  • Created and Monitored Batches and Sessions using Informatica Power Mart Server.

Environment: Informatica 6.0 (repository manager, designer, workflow manager and workflow monitor), ERWIN (Data Model), SSH Secure Shell (FTP tool), Trillium (Data Quality), IBM DB2 (database), CVS (version controlling), XD Portal (bug tracking), UNIX (o/s), XML spy (to generate XML Schema) and Edit plus (IDE).
