
Sr Informatica Developer (L3 Support) Resume


SUMMARY

  • 8+ years of IT experience with in-depth knowledge of the design, development, and enhancement of Data Warehouse and Business Intelligence applications for the financial, banking, insurance, retail, and manufacturing industries.
  • Strong development skills and the ability to work through the entire software development life cycle (SDLC).
  • Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Workflows, and Worklets for data loads from sources such as Oracle, DB2, SQL Server, flat files, and XML data in various formats.
  • Expertise in using Informatica PowerCenter/PowerMart for data extraction, transformation, and loading (ETL) from disparate data sources such as Oracle, SQL Server, flat files, MS Access, JMS, and XML files, and in loading the data into different targets.
  • Expertise in extracting data from XML files and loading it into the ODS and EDW using the XML Source Qualifier and XML Parser transformations.
  • Extensively worked with ETL Informatica transformations, including Source Qualifier, Connected/Unconnected Lookup, Filter, Expression, Router, Joiner, Rank, Aggregator, Sorter, and Sequence Generator, and created complex mappings.
  • Extensively worked on slowly changing dimensions (SCD Type 1, Type 2, and Type 3) to maintain transactional and historical data for business processes, and on change data capture (CDC) using Informatica.
  • Solid experience working with UNIX shell scripts for automatically running sessions, aborting sessions, creating parameter files, and running batch jobs (see the pmcmd sketch after this list).
  • Experienced in designing Data Marts and Data Warehouses using Star and Snowflake schemas for Decision Support Systems, following the Ralph Kimball and Bill Inmon methodologies extensively.
  • Experience working with frequency distribution, pattern analysis, address verification, primary/foreign key analysis, redundant data analysis, and creating profile jobs.
  • Strong command of incorporating data quality business rules using the DataFlux tool, applied across the enterprise in many aspects of business intelligence and overall data hygiene.
  • Identified, researched, and analyzed issues to develop solutions, resolve recurring problems, and drive data analysis that improves operations, including uncovering data anomalies and researching other key operational data to enable efficiencies and overall data quality.
  • Expert in advanced data modeling concepts such as Conformed Dimensions, Role-Playing Dimensions, Multi-Valued Dimensions, the Data Warehouse Bus Matrix architecture, Degenerate Dimensions, and Factless Fact tables for handling complex data modeling scenarios.
  • Strong knowledge and work experience in database analysis and in normalization and denormalization techniques for optimum performance of relational and dimensional database environments.
  • Strong database experience with SQL and PL/SQL programming: creating indexes, joins, functions, stored procedures, triggers, and cursors using Toad and SQL*Plus.
  • Excellent analytical, problem-solving, communication, and interpersonal skills, with the ability to interact with individuals at all levels and to work as part of a team as well as independently.
  • Experienced in data migration from one server/location to different server environments using FTP, Secure FTP, SCP, and Network Data Mover.
  • Extensively worked on developing and implementing BusinessObjects Universe design, Crystal Reports, Crystal Enterprise, and Crystal Reports Explorer.
  • Extensive knowledge of building and enhancing Conceptual, Logical, and Physical data models using design tools such as Erwin, ER Studio, InfoSphere Data Architect, Rational Data Architect, and Microsoft Visio, with both transactional (ODS) and dimensional designs.
  • Proficient in designing and creating data marts within the data warehouse.
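
A minimal sketch of the shell-based workflow automation mentioned above: generate a parameter file, then start a PowerCenter workflow with pmcmd. The integration service, domain, folder, workflow, and path names are hypothetical, and credentials are assumed to come from environment variables.

    #!/bin/ksh
    # Sketch: build a parameter file, then start and wait on a workflow.
    # INT_SVC, DOM_DEV, DW_FOLDER, wf_daily_load, and the paths are made-up names.
    RUN_DATE=$(date +%Y%m%d)
    PARAM_FILE=/opt/infa/params/wf_daily_load_${RUN_DATE}.par

    # Parameter file consumed by the workflow; \$\$ keeps the literal $$ that
    # PowerCenter expects for mapping parameters.
    cat > "$PARAM_FILE" <<EOF
    [DW_FOLDER.WF:wf_daily_load]
    \$\$RUN_DATE=$RUN_DATE
    \$\$SRC_DIR=/data/inbound
    EOF

    # pmcmd exits non-zero when the workflow fails, so the script can react.
    pmcmd startworkflow -sv INT_SVC -d DOM_DEV -u "$INFA_USER" -p "$INFA_PWD" \
        -f DW_FOLDER -paramfile "$PARAM_FILE" -wait wf_daily_load
    if [ $? -ne 0 ]; then
        echo "wf_daily_load failed for $RUN_DATE" >&2
        exit 1
    fi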

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.1/8.6.1/8.5/8.1.1/8.0/7.x, PowerMart 6.2/6.1/5.1.x/4.7, PowerExchange, PowerConnect

Reporting Tools: Cognos 10.x/8.x, ReportNet, PowerPlay, Cognos Query, PowerPlay Web Reports, Crystal Reports, Meta Manager.

Databases: Oracle 11g/10g/9i/8i, SQL Server 2000/2005, MS SQL, MS Access, IBM DB2 v9.5/9.1, UDB, Teradata.

Languages: SQL, PL/SQL, C, C++, HTML, UNIX shell scripting.

Operating systems: Windows NT/2000/XP/Vista, UNIX, Linux, IBM AIX 5.3/4.2, MS-DOS

Other Tools: TOAD, SQL*Loader, Microsoft Office, and Visio.

PROFESSIONAL EXPERIENCE

Confidential

Sr Informatica Developer (L3 Support)

Responsibilities:

  • Analyzed the functional specs provided by the data architect and created technical spec documents for all the mappings.
  • Extensively used the Erwin tool for forward and reverse engineering, following corporate naming-convention standards and using conformed dimensions wherever possible.
  • Designed and developed Informatica mappings to load data from source systems to the ODS and then to the data mart.
  • Extensively used PowerCenter/PowerMart to design multiple mappings with embedded business logic.
  • Created Lookup, Joiner, Rank, and Source Qualifier transformations in the Informatica Designer.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.
  • Created Mapplets and used them in different mappings.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Tuned the performance of Informatica mappings using components such as parameter files, variables, and dynamic caches.
  • Tuned performance using round-robin, hash auto-key, and key-range partitioning.
  • Used shell scripts to automate the execution of mappings.
  • Designed and developed Oracle PL/SQL scripts for data import/export.
  • Created various UNIX shell scripts for scheduling data cleansing scripts and the loading process.
  • Maintained the batch processes using UNIX shell scripts.
  • Designed and deployed UNIX shell scripts.
  • Managed change control implementation and coordinated daily, weekly, and monthly releases and reruns.
  • Created a bridge table for the numerous dimension groups and related them accordingly.
  • Developed an enterprise-wide framework defining how multiple sources of data are consolidated into a single structure, enabling creative business analytics to extract useful information from large data sources.
  • Applied reusable tasks and worklets in Informatica to simplify the design of workflows that move data from diversified sources to the target.
  • Tuned Informatica session performance by increasing the block size, data cache size, sequence buffer length, and target-based commit interval, and tuned mappings by dropping and recreating indexes (see the sketch after this list).
  • Performed unit testing during the mapping phase to ensure proper and efficient implementation of the transformations.
  • Used mapping parameters to extract the required data from the sources and direct it to the targets.
  • Wrote UNIX shell scripts and used the pmcmd command-line utility to interact with the Informatica server from the command line.
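
One way the index drop/recreate tuning step above can be scripted is with sqlplus here-documents around the load. This is a sketch only; the connect string, index, and table names are hypothetical.

    #!/bin/ksh
    # Sketch: drop an index before a bulk load and rebuild it afterwards.
    # DWPROD, idx_fact_sales_cust, and fact_sales are made-up names.
    sqlplus -s "$ORA_USER/$ORA_PWD@DWPROD" <<'EOF'
    WHENEVER SQLERROR EXIT SQL.SQLCODE
    DROP INDEX idx_fact_sales_cust;
    EOF

    # ... run the Informatica load here, e.g. via pmcmd ...

    sqlplus -s "$ORA_USER/$ORA_PWD@DWPROD" <<'EOF'
    WHENEVER SQLERROR EXIT SQL.SQLCODE
    CREATE INDEX idx_fact_sales_cust ON fact_sales (customer_key) NOLOGGING;
    EOF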

Environment: Informatica PowerCenter 8.6.1/9.1, Workflow Manager, Workflow Monitor, Informatica PowerConnect/PowerExchange, PowerAnalyzer, IMS Data, Maestro, PL/SQL, Oracle 10g/9i, Toad, Tidal

Confidential - Waukegan, IL

Sr. Informatica Developer

Responsibilities:

  • Designed and developed the semantic layer for reporting so that end-user reporting would not be affected.
  • Determined the project scope from the business requirements by conducting a number of sessions during the envisioning phase.
  • Worked closely with SMEs and the offshore team to analyze the areas that would be affected and needed close attention.
  • Captured changes in the data fields and analyzed the impact on different systems of introducing Oracle's off-the-shelf product.
  • Created analytical views for key business requirements such as revenue generation and auto warranty renewals.
  • Extracted data from Oracle and loaded it into Oracle tables using SQL*Loader (see the sketch after this list).
  • Derived the data mappings that load all the information from the sources to the targets.
  • Designed the Informatica source-to-target mappings from the mainframe system to perform the initial load into the production server.
  • Devised strategies for extracting data from UNIX to staging and then from staging to the Oracle RDBMS.
  • Efficiently utilized transformations including Expression, Router, Joiner, Connected and Unconnected Lookup, Filter, Aggregator, Rank, Sorter, and Sequence Generator in the mappings.
  • Developed UNIX shell scripts for data import/export, data conversion, and data cleansing.
  • Built complex workflows that invoke the designed mappings in Informatica PowerCenter.
  • Implemented and extensively worked on slowly changing dimensions (Type 1, Type 2, and Type 3) for accessing account transaction history using Change Data Capture (CDC), corresponding to business requirements.
  • Tuned Mappings, Sessions, and SQL for better performance by eliminating various performance bottlenecks.
  • Designed and Developed Relational and Dimensional Data Models in Framework Manager based on client’s reporting requirements.
  • Designed and developed list reports, crosstab reports, and charts using Cognos 10 Report Studio and Query Studio, and created multi-page reports.
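
The SQL*Loader step called out above can be sketched as follows; the control file layout, table, and file names are assumptions, not the actual project objects.

    #!/bin/ksh
    # Sketch: load a comma-delimited extract into a staging table via SQL*Loader.
    # stg_customers, customers.dat/.ctl, and DWPROD are made-up names.
    cat > customers.ctl <<'EOF'
    LOAD DATA
    INFILE 'customers.dat'
    APPEND INTO TABLE stg_customers
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (customer_id, customer_name, state, load_dt DATE "YYYY-MM-DD")
    EOF

    # errors=50 aborts the load once 50 rejects accumulate in the .bad file.
    sqlldr userid="$ORA_USER/$ORA_PWD@DWPROD" control=customers.ctl \
        log=customers.log bad=customers.bad errors=50
    [ $? -eq 0 ] || { echo "SQL*Loader reported errors" >&2; exit 1; }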

Environment: Informatica PowerCenter 9.1, Cognos 10 (Framework Manager, Report Studio, Query Studio, Cognos Connection), Autosys, Oracle 11g, Sybase, PL/SQL, SQL*Plus, flat files, web services, UNIX, Windows NT, Erwin 7.x, Toad 10.

Confidential, Madison, WI

Sr. ETL Developer

Responsibilities:

  • Held sessions with business users and data analysts to gather requirements and associate the different source systems for the data integration process.
  • Identified and documented the data sources and transformation logic required to populate and maintain targets.
  • Played a key role in the data feed extraction process, data analysis, testing, and project coordination.
  • Designed a conceptual model to capture the base architecture of the data integration process, including sources, transformation rules, and targets.
  • Implemented transformation logic to support the ETL (extract, transform, load) processes that load data into the staging database using Informatica PowerCenter.
  • Applied transformation logic by implementing transformations such as XML, Lookup, Update Strategy, Expression, and Joiner to extract, transform, and load data into the targets.
  • Designed re-useable transformations to quickly add new data sources and transformations related to business needs.
  • Implemented a change data capture mechanism using Type 2 slowly changing dimensions.
  • Used Event-Wait, Event-Raise, and Command tasks to control the execution of workflow sessions in Informatica Workflow Manager.
  • Developed Teradata BTEQ scripts to load data from Teradata staging to the Enterprise Data Warehouse (see the BTEQ sketch after this list).
  • Used the pmcmd command-line utility to run Informatica workflow jobs and embedded those commands in shell scripts to create, schedule, and control workflows.
  • Developed stored procedures and packages for use in mappings through the Stored Procedure transformation.
  • Identified performance bottlenecks in sources, mappings, and workflow sessions, and applied best practices to tune them.
  • Handled errors using the workflow and session error log files in Informatica PowerCenter Workflow Monitor.
  • Utilized Autosys scheduler to run sessions and workflows in batch processes.
  • Documented an operational/technical guide for supporting issues in the production process.
  • Helped resolve Level 2 issues in the production environment.
  • Coordinated the writing of test plans covering various test cases for data validity.
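
A minimal shape for the BTEQ staging-to-EDW load mentioned above, assuming a Teradata system alias tdprod and made-up staging/EDW object names, with credentials from environment variables:

    #!/bin/ksh
    # Sketch: BTEQ moving one day's rows from staging to the EDW table.
    # tdprod, stg_db.sales_stg, and edw_db.sales_fact are hypothetical.
    bteq <<EOF
    .LOGON tdprod/${TD_USER},${TD_PWD}
    INSERT INTO edw_db.sales_fact
    SELECT *
    FROM   stg_db.sales_stg
    WHERE  load_dt = CURRENT_DATE;
    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0
    EOF
    [ $? -eq 0 ] || { echo "BTEQ load failed" >&2; exit 1; }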

Environment: Informatica PowerCenter 8.1.1, DB2, Oracle 10g, SQL Server 2005, UNIX, TOAD, SQL*Plus, Core FTP, and Windows XP

Confidential, CA

Sr. Informatica Developer

Responsibilities:

  • Gathered business requirements by conducting meetings with business analysts, stakeholders, development teams, and data analysts on a scheduled basis.
  • Imported various sources and targets and developed transformations using Informatica PowerCenter Designer.
  • Developed various mappings bringing together those sources, targets, and transformations.
  • Created Mapplets using the Mapplet Designer.
  • Used Type2 mapping to update a slowly changing dimension table to keep full history.
  • Captured source file/table definitions, target data, and data mart table definitions.
  • Created and maintained metadata and ETL documentation supporting business rules and detailed source-to-target data mappings.
  • Designed and developed complex Aggregator, Joiner, Router, Lookup, and Update Strategy transformation rules (business rules).
  • Created and scheduled sessions and batch processes to run on demand, run at a set time, run once, or run continuously.
  • Tested and validated the developed Informatica mappings.
  • Monitored sessions using Informatica Server Manager.
  • Applied performance-tuning logic to optimize session performance.
  • Interacted with business users, analysts for requirements, developed conceptual and logical data models.
  • Wrote PL/SQL packages, procedures, and functions in Oracle for business rule conformity.
  • Utilized SQL*Loader and the export/import utilities for data loads and transformation.
  • Involved in dimensional modeling techniques to create dimension and fact tables.
  • Developed PL/SQL stored procedures for source pre-load and target pre-load to verify the existence of tables (see the sketch after this list).
  • Worked with different sources such as Oracle databases and flat files and used Informatica to extract data.
  • Involved in Unit Testing, Integration, and User Acceptance Testing of Mappings
  • Worked with Senior Developer in Documenting the ER Diagrams, Logical and Physical models, business process diagrams and process flow diagrams using Visio 2000.
  • Performed Data Profiling to assess the risk involved in integrating data for new applications, including the challenges of joins and to track data quality standards.
  • Worked extensively with the Business Intelligence team in preparing and delivering a variety of interactive and printed reports using Cognos.
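
The pre-load existence check described above might look like the following; the procedure and table names are illustrative only.

    #!/bin/ksh
    # Sketch: PL/SQL procedure that fails fast when a target table is missing.
    # check_table_exists, STG_ORDERS, and DWPROD are made-up names.
    sqlplus -s "$ORA_USER/$ORA_PWD@DWPROD" <<'EOF'
    WHENEVER SQLERROR EXIT SQL.SQLCODE
    CREATE OR REPLACE PROCEDURE check_table_exists (p_table IN VARCHAR2) AS
        v_cnt NUMBER;
    BEGIN
        SELECT COUNT(*) INTO v_cnt
        FROM   user_tables
        WHERE  table_name = UPPER(p_table);
        IF v_cnt = 0 THEN
            RAISE_APPLICATION_ERROR(-20001, p_table || ' does not exist');
        END IF;
    END;
    /
    EXEC check_table_exists('STG_ORDERS')
    EOF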

Environment: Informatica PowerCenter 8.1.1, DB2, Oracle 10g, SQL Server 2005, UNIX, TOAD, Visio 2000, SQL*Plus, Core FTP, and Windows XP.

Confidential, Lincolnshire, IL

ETL Developer

Responsibilities:

  • Identified the exact business requirements by interacting with the business analysts and other management through JAD sessions.
  • Followed an agile methodology throughout the data design and development process.
  • Designed the conceptual models for the flow of data between different systems.
  • Added enhancements to the data model by following Star schema using Ralph Kimball methodology.
  • Applied data profiling techniques to analyze the content, quality and structure of data from different sources to make initial assessments.
  • Provided data cleansing techniques usable across multiple systems, including billing, customer service centers, and e-channels.
  • Applied source-to-target mapping and generated a mapping matrix for the transformations.
  • Managed the metadata controlling the flow of data to different systems, which helped the organization control fraud and detect subscription fraud by building customer behavior profiles.
  • Debugged mappings in Informatica using the Debug Wizard to observe the flow of data across different test cases for different types of data.
  • Implemented pipeline partitioning (hash key, key range, round robin, and pass-through) to improve session performance.
  • Extensively used PowerCenter capabilities such as file lists, pmcmd, target load order, constraint-based loading, and concurrent lookup caches.
  • Configured the sessions using workflow manager to have multiple partitions on Source data and to improve performance.
  • Created UNIX scripts for migrating data between the source and target databases (see the sketch below).
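
A sketch of such a migration script, assuming made-up host and path names, pushes the extract and verifies record counts on both ends before handing off:

    #!/bin/ksh
    # Sketch: copy an extract to the target host and validate the row count.
    # etl-target01 and the /data paths are hypothetical.
    SRC_FILE=/data/outbound/orders_$(date +%Y%m%d).dat
    TGT_HOST=etl-target01
    TGT_DIR=/data/inbound

    SRC_ROWS=$(wc -l < "$SRC_FILE")
    scp "$SRC_FILE" "${TGT_HOST}:${TGT_DIR}/" || { echo "scp failed" >&2; exit 1; }

    # Counts must match on both ends before downstream jobs are triggered.
    TGT_ROWS=$(ssh "$TGT_HOST" "wc -l < ${TGT_DIR}/$(basename "$SRC_FILE")")
    if [ "$SRC_ROWS" -ne "$TGT_ROWS" ]; then
        echo "Row count mismatch: $SRC_ROWS vs $TGT_ROWS" >&2
        exit 1
    fi
    touch "${SRC_FILE}.done"    # marker file signalling a complete transfer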

Environment: Informatica PowerCenter 8, Toad, Oracle 9i, SQL*Plus, PL/SQL, UNIX, and Windows 2003.

Confidential

Database Developer

Responsibilities:

  • Worked with business users to define and analyze problems and business needs, taking part in sessions with the analysts.
  • Established data standards for customer information, including data definitions, component structures (such as for complex data types), code values, and data use.

Environment: SQL Server 2005, Oracle 8i, Erwin, UNIX shell scripting, Informatica 7.1.
