
Sr. Informatica Developer Resume


CA

PROFESSIONAL SUMMARY

  • 9+ years of Data Warehousing experience in Data Integration using Informatica PowerCenter ETL tools. Hands-on, end-to-end experience in the software development life cycle (SDLC), from requirement gathering, analysis, design, development, testing, and deployment through production support of Data Integration projects that populate Data Marts and Data Warehouses.
  • Strong knowledge of industry-standard Business Intelligence reporting tools such as Business Objects, Cognos, and OBIEE.
  • Experienced in designing customized interactive dashboards in OBIEE using drill-down, guided navigation, prompts, filters, and variables.
  • Expert in using OBIEE Answers to create queries, format views and charts, and add user interactivity and dynamic content to enhance the user experience.
  • Experience in implementing historical loads, incremental loads, and Change Data Capture (CDC).
  • Involved in basic Informatica administration such as creating folders and users and change management, and in moving code from DEV to TEST and PROD using deployment groups in Informatica Repository Manager.
  • Experienced in working with the Variable Manager to define session and repository variables and initialization blocks to streamline administrative tasks and modify metadata content dynamically.
  • Extensive hands-on experience with relational database management systems (RDBMS) such as Oracle, SQL Server, and DB2.
  • Experience in both UNIX and Windows environments; wrote UNIX shell scripts and Windows batch scripts to schedule jobs (see the wrapper sketch after this list).
  • Very good knowledge and understanding of dimensional modeling using star and snowflake schemas.
  • Profiled source data in various database environments and external flat files.
  • Hands-on experience in data cleansing and exception and error handling procedures; implemented Slowly Changing Dimensions (SCD Type 1/2/3 methods) and extensively used analytical, aggregate, string, numeric, date, and conversion functions.
  • Extensively used Source Qualifier, Normalizer, XML, Expression, Aggregator, Rank, Sequence, Lookup, Update Strategy, Router, Filter, Joiner, Union, Sorter, Transaction Control, and Stored Procedure transformations in the mapping development process; worked with the repository migration process across product life-cycle environments from development to production.
  • Job scheduling experience using Control-M.
  • Experience in Agile Data Warehousing Integration.
  • Conceptual and Logical data modeling experience for OLAP.
  • Knowledge of Ralph Kimball’s Dimensional Data Warehousing (DDW) using star and snowflake methodologies.
  • Good knowledge and understanding of Teradata SMP (Symmetric Multiprocessing) and MPP (Massively Parallel Processing) Architecture.
  • Hands-on experience in creating indexes such as UPI/NUPI, USI/NUSI, and Join Indexes.
  • Proficient in implementing solutions using Teradata utilities: BTEQ, FastLoad, MultiLoad, TPump, and FastExport.
  • Wrote BTEQ scripts to extract and transform data.
  • Acquired knowledge in the areas of data integration, data element analysis, DB schema management, data modeling, and data testing for executive and decision-support systems.
  • Involved in application, ETL, and data integration validation and testing. Developed use cases and unit test documents. Supported User Acceptance Testing (UAT) to validate business rules for the applications, data extraction process, and target data quality scripts. Tested application navigation and functionality for data integrity; supported the QA team in fixing defects and closing QA tickets. Prepared data validation SQL scripts and published the resulting data to Excel for business users’ evaluation.
  • Provided data extract feed files to fulfill the Operational and Analytical teams’ requests to run the business.
  • Excellent analytical, critical-thinking, and creative problem-solving skills. Excellent communication skills and the ability to work effectively and efficiently both in teams and individually. Very structured and organized, focused on quality and on delivering the product to the client within the timelines to positively impact the business.
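
As referenced above, the UNIX job-scheduling scripts typically wrap the Informatica pmcmd command-line utility. The following is a minimal, illustrative sketch only; the integration service, domain, folder, workflow, and password-file names are hypothetical placeholders rather than objects from any of the projects listed here.

#!/bin/ksh
# Illustrative wrapper: start an Informatica workflow and fail the
# scheduler job if the workflow fails. All names are placeholders.
INFA_USER=etl_user
INFA_PWD_FILE=/opt/etl/.infa_pwd          # hypothetical password file
WORKFLOW=wf_load_customer_dim             # hypothetical workflow name

pmcmd startworkflow \
  -sv IS_DEV -d Domain_DEV \
  -u "$INFA_USER" -p "$(cat "$INFA_PWD_FILE")" \
  -f FLD_SALES -wait "$WORKFLOW"

RC=$?
if [ $RC -ne 0 ]; then
  echo "Workflow $WORKFLOW failed with return code $RC" >&2
  exit $RC
fi
echo "Workflow $WORKFLOW completed successfully"

A scheduler such as Control-M can then treat the script's exit code as the job status.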

TECHNICAL SUMMARY

ETL Tools: Informatica PowerCenter 9.5/9.1/8.6/8.1/7.1/6.2, PowerExchange, SSIS

BI: Business Objects XI R3, Cognos 10, SQL Server 2008 SSRS

Databases: Oracle 11g/10g/9i/8i, SQL Server 2008/2005, DB2, Sybase, Teradata

Data Modeling Tools: ER/Studio 9, ERwin

Scheduler: Control-M

Version Control / Issue Tracking: PVCS, Mercury Quality Center

Operating Systems: Windows NT/XP/7/2008 Server, UNIX

PROFESSIONAL EXPERIENCE

Confidential, CA

Sr. Informatica Developer

Responsibilities

  • Currently working in the Experian Consumer Services (ECS) Business Intelligence Reporting team as a Sr. Informatica Developer.
  • Gathered User Requirements. Prepared Business, Functional Requirement and Design Documents.
  • Conducted meetings with internal teams and performed impact analysis; proposed appropriate implementation solutions to fulfill the requirements.
  • Interacted with end users and was involved in preparing technical design documentation.
  • Translated business requirements into system/data requirements.
  • Identified source systems, connectivity, tables, and fields, and ensured data suitability for mapping.
  • Researched sources and identified key attributes for Data Analysis.
  • Converted requirements into design specifications and then translated them into Informatica transformation logic.
  • Implemented ETL processes using Informatica tools: Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, Mapplet Designer, and Workflow Manager.
  • Performed extensive performance tuning by identifying bottlenecks at various points such as targets, sources, mappings, sessions, or the system, which led to better session performance; made enhancements to Informatica mappings.
  • Performed data validations and control checks to ensure data integrity and consistency.
  • Analyzed exceptions raised during the ETL process and resolved them.
  • Used the Debugger to test the data flow between source and target and to fix invalid mappings.
  • Used Informatica PowerCenter Workflow Manager to create sessions, workflows, and worklets to run with the logic embedded in the mappings; combined several Informatica sessions in the desired execution sequence.
  • Monitored workflows in the production environment and debugged/resolved issues.
  • Performed extensive unit and integration testing at the development level and coordinated with the Quality team during system testing.
  • Provided timely status updates to the client and senior management.
  • Documented test cases and results for future understanding and reference.
  • Converted Stored Procedures to Informatica Mappings.
  • Created Source and Target definitions in the repository using Informatica Source Analyzer and Warehouse Designer.
  • Optimized SQL queries for better performance.
  • Worked extensively with UNIX and shell scripts.
  • Worked with Slowly Changing Dimensions such as Type 1 and Type 2.
  • Worked with Informatica version control for team-based development to check in and check out mappings, objects, sources, targets, workflows, etc.
  • Made extensive use of Harvest to avoid version conflicts during change management.
  • Hands-on expertise with the CA-7 and AutoSys scheduling tools.
  • Developed Teradata SQL to load data into the target system as per the requirements (see the BTEQ sketch at the end of this section).

Environment: Informatica PowerCenter 9.5, Oracle 11g, SQL Server 2008/2005, DB2, PL/SQL, UNIX, Toad, Teradata Studio, AutoSys/CA-7
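
For context on the Teradata load noted above, here is a minimal, illustrative BTEQ-driven load wrapped in a shell script. The logon values, database, and table/column names are hypothetical placeholders, not actual project objects.

#!/bin/ksh
# Illustrative BTEQ load: insert staging rows that do not yet exist
# in the target. Logon, database, and table names are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;

INSERT INTO edw.customer_dim (customer_id, customer_name, load_dt)
SELECT s.customer_id, s.customer_name, CURRENT_DATE
FROM   stg.customer s
LEFT JOIN edw.customer_dim t
       ON t.customer_id = s.customer_id
WHERE  t.customer_id IS NULL;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

The non-zero .QUIT code lets the calling scheduler job detect and report a failed load.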

Confidential, Parsippany, NJ

BI Technical Lead - Hoovers Data Migration

Responsibilities:

  • Participated in user meetings, gathered requirements, and discussed the issues to be resolved; translated user inputs into ETL design documents.
  • Conducted meetings with internal teams and gathered information about the requirements.
  • Interacted with end users and was involved in preparing technical design documentation.
  • Translated business requirements into system/data requirements.
  • Developed mappings for data synchronization between the Order Management, Customer Management, Purchase Order, and Billing systems.
  • Prepared test cases for the mappings; involved in integration of the code and testing.
  • Coordinated the offshore ETL team.
  • Created mappings using transformations such as Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator, and Union to load target tables.
  • Prepared documentation describing program development, logic, coding, testing, changes, and corrections.
  • Responsible for creating the data maps, extracting incremental CDC data from mainframe sources, exporting and updating them to the repository, and importing the required source files into the staging environment using Informatica PowerExchange.
  • Imported external data (flat files) into Oracle tables using SQL*Loader (see the SQL*Loader sketch at the end of this section).
  • Implemented changes in slowly changing dimension tables.
  • Optimized/Tuned mappings for better performance and efficiency.
  • Used mapping parameters and variables to support efficient mapping design.
  • Involved in performance tuning of the Informatica mappings and sessions to enhance performance by using different partitioning techniques and various transformations.
  • Configured Informatica PowerExchange to connect to the SAP source system to extract sales and spend data.
  • Created and Configured Workflows, Worklets and Sessions using Informatica Workflow Manager.
  • Performed performance tuning of sources, targets, mappings, transformations, and sessions to optimize session performance.
  • Actively participated in Data Mart Design for reporting purposes.
  • Created mapplets and used them across mappings to maintain standards.
  • Worked with SCD tables using Lookup and Update Strategy transformations.
  • Modified coded procedures to extract daily incremental data.
  • Scheduled and loaded the data and monitored the ETL process.
  • Used the debugger to test the mappings and fix bugs; performed unit and integration testing.

Environment: Informatica PowerCenter 8.6, PowerExchange, SSIS, Business Objects XI R2, Teradata 13, Sybase IQ 15.3/12.7, Oracle 10g, Erwin 7.2, TOAD, MS SQL Server 2008/2005, PL/SQL, Shell Programming, SQL*Loader, UNIX, Windows 7/NT/XP.
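
A minimal sketch of the kind of SQL*Loader flat-file load referred to above. The file paths, connect string, control-file contents, and table name are hypothetical placeholders, not actual project objects.

#!/bin/ksh
# Illustrative SQL*Loader run: load a comma-delimited flat file into a
# staging table. Paths, credentials, and table name are placeholders.
cat > /tmp/customer_stg.ctl <<'EOF'
LOAD DATA
INFILE '/data/inbound/customer.dat'
APPEND
INTO TABLE stg_customer
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( customer_id,
  customer_name,
  created_dt DATE "YYYY-MM-DD"
)
EOF

sqlldr userid=etl_user/etl_password@ORCL \
       control=/tmp/customer_stg.ctl \
       log=/tmp/customer_stg.log \
       bad=/tmp/customer_stg.bad \
       errors=50

The log and bad files give a quick view of rejected records during data validation.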

Confidential, Atlanta, GA

DW Informatica Developer

Responsibilities:

  • Gathered requirements and performed business analysis by attending business meetings.
  • Assisted in logical and physical modeling; followed a star schema to build the Claims data mart.
  • Extensively used ERwin for dimensional data modeling.
  • Prepared low-level design documents.
  • Prepared mapping specifications capturing all the requirements and functionalities.
  • Developed mappings in Informatica 7.1.2.
  • Used Stored Procedure transformations in the mappings to implement the requirements of the OLTP system.
  • Created mapplets and used them across mappings to maintain standards.
  • Worked with SCD tables using Lookup and Update Strategy transformations (a rough SQL equivalent of the Type 2 pattern appears at the end of this section).
  • Modified coded procedures to extract daily incremental data.
  • Scheduled and loaded the data and monitored the ETL process.
  • Implemented call center integration with Salesforce: when customers respond to campaigns by calling the agents, a Visualforce workflow is launched, based on the number they dialed, to assist the agents.
  • Implemented Visualforce Flow to help assist sales agents as part of the call center integration with Salesforce.
  • Designed and developed UNIX Shell Scripts.
  • Imported external data (flat files) into Oracle tables using SQL*Loader.
  • Implemented changes in slowly changing dimension tables.
  • Optimized/Tuned mappings for better performance and efficiency.
  • Used mapping parameters and variables to support efficient mapping design.
  • Involved in performance tuning of the Informatica mappings and sessions to enhance performance by using different partitioning techniques and various transformations.

Environment: Oracle 10g, SQL Server 2000, Informatica 7.1.2, SQL*Loader
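
The Type 2 slowly changing dimension logic referenced above was implemented with Lookup and Update Strategy transformations; as a rough SQL equivalent of the same expire-and-insert pattern (with placeholder table, column, and sequence names), it looks like the following.

#!/bin/ksh
# Rough SQL equivalent of the SCD Type 2 expire-and-insert pattern.
# Table, column, and sequence names are placeholders, not project objects.
sqlplus -s etl_user/etl_password@ORCL <<'EOF'
-- Expire current rows whose tracked attribute changed in staging
UPDATE customer_dim d
SET    d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.customer_name <> d.customer_name);

-- Insert a new current version for new and changed customers
INSERT INTO customer_dim
  (customer_key, customer_id, customer_name, eff_start_dt, eff_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.customer_name,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   customer_dim d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');

COMMIT;
EXIT;
EOF

In PowerCenter, the Lookup and Update Strategy transformations perform this same comparison row by row during the session run.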

Confidential

ETL Informatica Developer (Offshore)

Responsibilities:

  • Developed complex mappings, extensively using all of the Informatica transformations.
  • Integrated Sessions and Workflows based on dependencies.
  • Carried out regression unit testing of the ETL code.
  • Used all tasks effectively in Workflow Manager.
  • Participated in system testing and integration testing.
  • Participated in UAT and fixed defects raised by QA team.
  • Worked in Post Production historical and incremental loads.
  • Worked as a Quality Coordinator for the ETL team; prepared work plans and the testing and review cycle for the team; attended internal quality audits.

Environment: Informatica 7.1.2, Oracle 9i, SQL, Windows NT, 95, 98.

Confidential

Informatica Developer (Offshore)

Responsibilities

  • Built the data mart (conceptual, logical, and physical) for the reporting environment.
  • Created a data dictionary for user-defined databases (Collateral, Securities, and Reporting) in Oracle and SQL Server; worked on data conversion for ACH transactions.
  • In the role of Data Analyst, performed analysis and design of extensions to an existing data warehouse/data mart business intelligence platform.
  • Defined the enterprise data architecture vision, strategy, principles, and standards; obtained buy-in from stakeholders, management, and business partners and propagated them throughout the company.
  • Segregated and organized data by common subjects using the ODS.
  • Generated operational reports from the ODS as opposed to the transactional/legacy systems.
  • Performed data architecture work in designing and implementing a metadata repository, providing a centralized information source for the data models, data maps, processes, documents, contact lists, project calendars, and issues affecting the merger with Wachovia.
  • Implemented Agile Methodology for building an internal application.
  • Prepared the Functional Specifications Document for the project.
  • Reverse-engineered the old database and created subject areas for each schema.
  • Prepared high-level data flows for all the major applications.

Environment: Oracle 9i/10g, SQL Server 2000/2005, ERwin 7.5.2, PowerDesigner 7.1, BO XI R3, Windows XP, XML, Excel, Access.
