
Informatica Developer Resume


Dallas, TX

SUMMARY:

  • A results-driven ETL/BI Developer with demonstrated success in the design, development and deployment of large-scale Data Warehouse, Data Integration and Business Intelligence applications for the healthcare, telecom, retail and utilities industries.
  • 6+ years of experience in Data Integration, Data Warehousing, System Analysis, Design, and Development of client-server and web applications.
  • Expertise using Informatica Power Center (Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Source Analyzer & Target Designer).
  • Experience with Business Intelligence tools like SAP Business Objects and IBM Cognos.
  • Experience working with various databases like Oracle, Netezza, Teradata, SQL Server, and DB2 UDB.
  • Extensive knowledge in writing Shell, Awk, and Perl scripts to automate batch processes and error handling.
  • Worked on external scheduling tools like Control M, AutoSys, Tivoli and Tidal Work Scheduler for automating ETL job runs.
  • Proven experience in creation of Dimensional, Relational and OLAP models for Business Intelligence reporting in a client-server environment.
  • Extensive experience in writing and analyzing the performance of SQL, SQL*Loader, Pro*C, triggers, stored procedures, functions, and packages.
  • Proficient in warehouse designs based on Ralph Kimball and Bill Inmon methodologies.
  • Expertise in developing mappings, defining workflows and tasks, monitoring sessions, exporting and importing mappings and workflows, and performing backup and recovery.
  • Implemented Performance Tuning at various levels such as Source, Target, Mapping, Session and System.
  • Implemented complex business rules in Informatica by creating re-usable transformations, robust mappings/mapplets.
  • Knowledge in creating logical and physical models with ERwin and ER/Studio.
  • Performed unit testing, integration testing, and performance testing.
  • Prepared functional, technical, and mapping documents as well as test cases.
  • Over 3 years of experience in Production Support with proven ability to rapidly troubleshoot and diagnose issues.
  • Experience in full life cycle (SDLC) of Data Warehouse design and implementation in Waterfall and Agile work environments.
  • Good communication, analytical and troubleshooting skills.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.5/9.1/8.6/7.1.

BI Tools: SAP Business Objects 4.0/XIR2/6.5, Cognos Impromptu 7.4.

Databases: Oracle 11g/10g/9i, Netezza 4.5/3.1.2, Teradata 12.0, DB2 9.5/8.1, SQL Server 7.0.

Languages: C, C++, SQL, PL/SQL, Shell Scripting, XML, Java, COBOL.

Data Modeling Tools: ER/Studio 7.5, ERwin 4.0.

Scheduling Utilities: Control M 6.2, ESP, AutoSys, and Tivoli Work Scheduler.

Version Control: PVCS 7.5, Microsoft Visual Source Safe.

Operating Systems: Windows NT/2000/XP/7, IBM AIX 5.3/5.2, HP-UX B.11.31, LINUX 2.6.

PROFESSIONAL EXPERIENCE:

Confidential, Dallas, TX

Informatica Developer

Responsibilities:

  • Participated in the full development cycle of a Data Warehouse, including requirements gathering, design, implementation, and maintenance.
  • Assisted Business Analyst in documenting business requirements, technical specifications and implementation of various ETL standards in the mappings.
  • Loaded ED18 data into the datamart using Informatica mappings, mapplets and workflows for daily loads.
  • Extracted large volumes of data feed from different data sources, performed transformations and loaded the data into various targets.
  • Designed Informatica mappings such as pass-through, split, Type 1, and Type 2, using transformations such as Aggregator, Rank, Mapplet, Connected and Unconnected Lookup, Sorter, Transaction Control, Filter, Joiner, and Update Strategy.
  • Created both SCD1 and SCD2 dimension tables and used Oracle sequences, mapping variables and parameters, and source query and lookup query overrides to implement complex logic.
  • Implemented logic to load Header and Detail Fact tables in the datamart.
  • Created source, target and transformation shortcuts to ensure consistency throughout folders.
  • Used reusable transformations to ensure code reusability and simplicity.
  • Defined the target load order plan and constraint-based loading for loading data correctly into different target tables.
  • Analyzed the business requirements and involved in writing test plans and test cases.
  • Used debugger wizard to create data breakpoints for debugging the mappings.
  • Validated ETL mappings and tuned them for better performance and implemented various Performance and Tuning techniques.
  • Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the mapping.
  • Performed Unit Testing at various stages by checking the data manually. Performed Data Analysis and Data validation by writing SQL queries and PL/SQL scripts.
  • Validated job flows and dependencies used by the TIDAL scheduler to run Informatica workflows.
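
The SCD1/SCD2 dimension loads above follow a standard pattern: expire the current dimension row when a tracked attribute changes and insert a new version. A minimal Python sketch of that logic (the row structure and column names are illustrative placeholders, not from the project):

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply SCD Type 2 logic: expire changed rows, insert new versions.

    dimension: list of dicts with 'key', 'attrs', 'eff_date', 'end_date', 'current'
    incoming:  list of dicts with 'key' and 'attrs' from the source feed
    """
    today = today or date.today()
    # Index the current version of each dimension member by business key
    current = {row["key"]: row for row in dimension if row["current"]}
    for rec in incoming:
        existing = current.get(rec["key"])
        if existing is None:
            # New member: insert its first version
            dimension.append({"key": rec["key"], "attrs": rec["attrs"],
                              "eff_date": today, "end_date": None, "current": True})
        elif existing["attrs"] != rec["attrs"]:
            # Changed member: expire the old version, insert a new one
            existing["end_date"] = today
            existing["current"] = False
            dimension.append({"key": rec["key"], "attrs": rec["attrs"],
                              "eff_date": today, "end_date": None, "current": True})
    return dimension
```

In Power Center the same effect is typically achieved with a Lookup on the dimension, an Expression comparing attributes, and an Update Strategy routing rows to DD_UPDATE or DD_INSERT.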

Environment: Informatica Power Center 9.5, Oracle 11g, Cisco Tidal Enterprise scheduler 6.1, Windows 7, SQL, PL/SQL.

Confidential, Plano,TX

Informatica Developer

Responsibilities:

  • Analyzed data integration requirements, data sources and targets, business rules, and transformation logic.
  • Created ETL Mapping and Application Design Specification documents using business and functional requirements.
  • Involved in the full project life cycle - from analysis to production implementation and support, with emphasis on integrating data from different source systems.
  • Populated the IBOR database using Informatica mappings, mapplets, and workflows for batch loads.
  • Read data from flat files, COBOL files, and source databases, and applied transformations like Expression, Filter, Router, Joiner, Lookup, Sorter, Aggregator, Rank, Update Strategy, and Source Qualifier in many mappings based on the requirements.
  • Tuned the workflows and mappings.
  • Created workflows using Workflow manager for different tasks like sending email notifications, timer that triggers when an event occurs, and sessions to run a mapping.
  • For better performance, created pipeline partitions and SQL overrides in the Source Qualifier.
  • Created Data Breakpoints and Error Breakpoints for debugging the mappings using Debugger Wizard.
  • Implemented audit and reprocess design for rejected/invalid records based on business rules.
  • Logged the number of records extracted, loaded, invalid, and rejected, with the respective source and target object names and batch run date, to Oracle during ETL processing.
  • Troubleshot and fixed defects raised by the QA team in system, integration, and volume testing environments.
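
The audit logging described above boils down to tallying extracted, loaded, and rejected counts per batch run and writing them as one audit row. A hedged Python sketch (the record shape, validation rule, and audit-row columns are hypothetical):

```python
from datetime import date

def audit_batch(records, is_valid, source, target, run_date=None):
    """Tally ETL record counts for one batch run.

    records:  iterable of source records
    is_valid: predicate applying the business rules
    Returns a dict shaped like one row of an audit/log table.
    """
    extracted = loaded = rejected = 0
    for rec in records:
        extracted += 1
        if is_valid(rec):
            loaded += 1      # would be written to the target here
        else:
            rejected += 1    # routed to a reject/reprocess table instead
    return {"source": source, "target": target,
            "run_date": run_date or date.today(),
            "extracted": extracted, "loaded": loaded, "rejected": rejected}
```

Rejected rows kept in a reprocess table can then be corrected and re-fed through the same mapping, which is the reprocess design the bullet refers to.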

Environment: Informatica Power Center 9.5, IBM MDM 10.1, Java, Oracle 11g, DB2 9.5, MQ Series, Tivoli Work Scheduler, ESP, IBM Initiate, Windows 7, AIX 5.3, Shell Scripting, SQL, PL/SQL, SOAPUI, Citrix.

Confidential, Richardson, TX

Informatica Designer/Developer

Responsibilities:

  • Created functional design documents and technical design specification documents for the ETL process based on the business requirements.
  • Designed the ETL process and created ETL design and system design documents.
  • Developed code to extract, transform, and load (ETL) data from inbound flat files and various databases into datamart using complex business logic.
  • Extensively worked on Informatica client tools like Designer, Workflow Manager, and Workflow Monitor.
  • Developed Informatica mappings, enabling the ETL process for large volumes of data into target tables in a given batch window.
  • Created complex mappings using Aggregator, Expression, Joiner transformations including complex lookups, Stored Procedures, Update Strategy and others.
  • Created reusable transformations and mapplets in the designer using transformation developer and mapplets designer tools.
  • Worked with Variables and Parameters in the mappings to pass the values between sessions.
  • Created partitions for parallel processing of data.
  • Checked and tuned the performance of Informatica Mappings.
  • Wrote BTEQ scripts in Teradata and ran them via Korn shell scripts in UNIX environments.
  • Created MLOAD, FastLoad, and TPump control scripts to load data into the database.
  • Created workflows, worklets, and tasks to schedule the loads at the required frequency using the AutoSys scheduling tool, and created dependencies in AutoSys.
  • Involved in jobs scheduling, monitoring and production support in a 24/7 environment.
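
BTEQ jobs like those above are usually driven by a shell wrapper that feeds a script of SQL plus BTEQ dot-commands to the `bteq` utility. Since the actual scripts aren't shown, here is a hedged Python sketch that only assembles a minimal BTEQ script (the logon file path and SQL are placeholders):

```python
def build_bteq_script(logon_file, sql_statements, error_exit=8):
    """Assemble a minimal BTEQ script: logon, run each SQL statement,
    and quit with a nonzero code if any statement fails.
    All names here are illustrative placeholders."""
    lines = [f".RUN FILE={logon_file};"]
    for stmt in sql_statements:
        lines.append(stmt.rstrip(";") + ";")
        # Abort the script with an error code if the statement failed
        lines.append(f".IF ERRORCODE <> 0 THEN .QUIT {error_exit};")
    lines.append(".QUIT 0;")
    return "\n".join(lines)
```

A Korn shell wrapper would then run the generated file through `bteq` and check its exit status to decide whether dependent AutoSys jobs may proceed.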

Environment: Informatica Power Center 9.1, Oracle 11g, Teradata 12.0, AutoSys, HP-UX B.11.31, Shell Scripting, SQL, PL/SQL, XML, CA Workload Control Center r11.1, Microsoft Visual Source Safe, Windows 7.

Confidential, Irving, TX

Informatica Developer

Responsibilities:

  • Implemented data staging, cleansing, and transformation mechanisms, normalized databases, and developed Star/Snowflake schemas, including loading of data marts.
  • Analyzed project requirements with business users and created ETL design documents, mapping documents, job sequence flow diagrams, and logical and physical data models.
  • Designed, developed Informatica mappings, enabling the ETL of the data into target tables.
  • Designed and developed complex mappings, from varied transformation logic like Unconnected and Connected lookups, Router, Filter, Expression, Aggregator, Joiner and Update Strategy.
  • Created workflows, worklets, and tasks to schedule the loads at the required frequency using Workflow Manager.
  • Designed and implemented mappings using SCD type 2 and CDC methodologies.
  • Set up error logic to streamline and automate data loads, cleansing and trapping incorrect data on staging servers before loading it into the Netezza data warehouse.
  • Proficient in using Workflow Manager Tools like Task Developer, Workflow Designer and Worklet Designer.
  • Created shortcuts to reuse objects across folders without creating multiple objects in the repository and automatically inherit changes made to the source.
  • Developed and scheduled various pre- and post-sessions commands and workflows for all mappings to load data from source files to target tables.
  • Created mappings to dynamically generate parameter files.
  • Implemented reusable transformations and mapplets to use in multiple mappings.
  • Created and scheduled On Success tasks like email to the ETL team and business analysts.
  • Wrote shell scripts for the Netezza database to perform inserts and updates and avoid duplicates, since Netezza does not enforce primary keys.
  • Scheduled Informatica workflows using Control M and incorporated logic for automatic email distribution in case of job failures and reject records.
  • Involved in Unit, Integration, and Regression testing and Peer review of designed jobs.
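
Because Netezza declares but does not enforce primary keys, the upsert scripts mentioned above have to deduplicate explicitly, typically by deleting target rows whose keys appear in the incoming batch before inserting that batch. A minimal Python simulation of the delete-then-insert pattern (in production this would be SQL run through a shell script; column names are illustrative):

```python
def netezza_style_upsert(table_rows, incoming_rows, key):
    """Simulate a delete-then-insert upsert: remove target rows whose key
    appears in the incoming batch, then append the batch. Mirrors the SQL
    pattern DELETE ... WHERE key IN (batch keys) followed by INSERT."""
    batch_keys = {row[key] for row in incoming_rows}
    # Delete phase: drop existing versions of rows being reloaded
    kept = [row for row in table_rows if row[key] not in batch_keys]
    # Insert phase: load the fresh batch; no duplicate keys can remain
    return kept + list(incoming_rows)
```

The same two-statement shape also keeps reruns idempotent: reloading the same batch twice leaves one copy of each key in the target.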

Environment: Informatica Power Center 8.6, Oracle 10g/9i, Netezza 4.5/3.1.2, PL/SQL, RETEK 6 & 12, IBM MQ Series 6.0, SQL*Loader, XML, Windows XP, LINUX 2.6, AIX 5.3, PRO*C, Shell Scripting, SQL Navigator, SecureCRT 6.0, PuTTY, Control M 6.2.1, TOAD 8.6, ER/Studio 7.5, PVCS 7.5.1, MicroStrategy 8.1.

Confidential

Data Warehouse Developer

Responsibilities:

  • Used Informatica to extract, transform and load the data into Star schema datamart.
  • Used Informatica Power Center to create mappings, mapplets by using various transformations like aggregator, sort, rank, router, update strategy, joiner and filter which were used in the process of selectively loading data into the datamarts.
  • Worked on Informatica Power Center 7.1 Designer, Workflow Manager, and Workflow Monitor tools.
  • Wrote UNIX shell scripts that were used to trigger Informatica jobs for daily, weekly, and incremental loads.
  • Involved in the system analysis and design of Data warehouse Business Object Reports.
  • Coordinated with end users and business analysts to understand the scope of business requirements and user expectations, and to ensure the accuracy of reporting specifications.
  • Built new universes per user requirements by identifying the required tables from the datamart and defining the universe connections.
  • Created complex reports using multiple data providers and formatted reports using crosstabs, charts, graphs, pie charts, sections, filters, prompts, controls, alerts and hyperlinks.
  • Created reports using Business Objects functionalities like queries, sub-queries, slice and dice, drill down, cross tab, master/detail, and formulas.
  • Built full-client and WebI reports.
  • Linked data providers and universes to build complex queries.
  • Created Business Objects reports using formulae, grouping, parameters and static/dynamic cascading prompts to handle the customer's requests from tables, stored procedures and views.
  • Developed and created classes with dimension, detail & measure objects and developed custom hierarchies to support drill down reports.
  • Maintained the BO repository, added users and groups, assigned password security, and assigned privileges to documents and universes.
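
Shell wrappers that trigger Informatica loads, as described above, typically decide the load type from the calendar and then call `pmcmd startworkflow`. A hedged Python sketch that only builds the command line without executing it (the service, folder, and workflow names are made up for illustration):

```python
from datetime import date

def pmcmd_command(run_date, folder="DM_FOLDER", service="INT_SVC", domain="DOM"):
    """Pick a workflow by load frequency and build a pmcmd command list.
    Workflow names and connection details are illustrative placeholders."""
    if run_date.day == 1:
        workflow = "wf_monthly_load"       # first of the month: monthly load
    elif run_date.weekday() == 6:
        workflow = "wf_weekly_load"        # Sunday: weekly load
    else:
        workflow = "wf_daily_incremental"  # otherwise: daily incremental load
    return ["pmcmd", "startworkflow", "-sv", service, "-d", domain,
            "-f", folder, "-wait", workflow]
```

A scheduler entry (Control M, cron, etc.) would run such a wrapper once per day and let the branch logic select the right workflow.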

Environment: Informatica Power Center 7.1, SAP Business Objects XIR2, Oracle 10g/9i, Crystal Reports 2008, SQL, PL/SQL, Windows XP, AIX 5.2, Shell Scripting, Control M 6.2, TOAD 8.0, PVCS.

Confidential

Business Intelligence Developer

Responsibilities:

  • Involved in different phases of the project, like requirements analysis, planning, execution, and maintenance.
  • Created reports containing complex charts using BO to present the end users with a better graphical representation of the data.
  • Created classes and objects in universes using a SQL Server database.
  • Involved in the designing and building Universe Objects and Conditional Objects.
  • Set up groups and users in the Business Objects repository. Managed user profiles using Supervisor and set object-level and row-level security restrictions to restrict access to sensitive client data.
  • Created standard and ad hoc reports using Business Objects functionalities like slice and dice, templates, drill down, cross tab, master/detail, breaks, and custom sorts.
  • Created derived tables in the Designer to enhance the capabilities and performance of the universe.
  • Created reports with prompts, alerts, breaks, sections and hyperlinks.
  • Created, maintained, and modified business-critical reports using Web Intelligence.
  • Interacted with users to analyze the improvements they needed in existing reports and developed reports to those requirements.
  • Used Business Objects @functions like @select, @where, and @prompt, as well as hierarchies.
  • Created Deski and WebI reports (on-demand, ad hoc, summary, sub-reports, dynamic grouping, crosstab, graphical, etc.).
  • Designed and developed customized Cognos reports for the client: Impromptu, PowerPlay cubes, and ad hoc Cognos queries using the Cognos reporting platform.
  • Involved in the creation of prompt and cascading prompt pages and applied conditional formatting for reports in Report Studio.
  • Used Report Studio and Analysis Studio to build reports based on cubes built in PowerPlay Transformer, and deployed the cubes in Cognos.
  • Scheduled, monitored, and controlled user access by creating groups in Cognos Connection.

Environment: Business Objects 6.5/5.1, Cognos Impromptu 7.4, Oracle 8i, SQL Server 2005, LINUX, Windows 2003 Server, Windows XP, and ClearCase.
