Sr. Informatica Developer / Data Warehouse Analyst Resume

Columbus, OH

SUMMARY

  • Over 7 years of experience in Information Technology with a strong background in data warehousing, including almost 7 years of ETL experience using Informatica PowerCenter 9.5/9.1/8.6.1/8.1.1/8.0/7.1.
  • Extensive experience in Data Modeling, Database Design, Programming, Development and Implementation of Client-Server Applications and Database Systems using MS SQL Server.
  • Extensive experience in developing strategies for ETL process using Informatica in large Warehouse environments.
  • Worked on complete SDLC including requirements gathering, functional design, development, testing, deployment and support.
  • Experience in designing and implementing Data Mart applications, mainly transformation processes, using Informatica PowerCenter for extraction, transformation and loading of data, and in developing Worklets, Workflows and Mappings.
  • Experience in Repository configuration and in processing tasks using Workflow Manager and Workflow Monitor to move data from multiple sources into targets.
  • Developed complex Mappings, used transformations such as Lookup, Router, Filter, Expression, Aggregator, Joiner, Stored Procedure and Update Strategy.
  • Working experience in E-R modeling of Fact and Dimension tables and in physical and logical data modeling.
  • Strong experience in Data Warehousing concepts and in the dimensional Star Schema and Snowflake Schema methodologies used in relational, dimensional and multidimensional modeling, as well as Data Profiling and Data Cleansing.
  • Highly experienced in using T-SQL for developing complex Stored Procedures, Triggers, Tables, Views, User Functions, User profiles, Relational Database Models and Data Integrity, SQL joins and Query Writing.
  • Expertise in developing SQL, SQL*Plus and PL/SQL code through Procedures/Functions, Packages, Cursors and Triggers to implement database business logic.
  • Good knowledge of applying rules and policies using the ILM (Information Lifecycle Management) workbench for the Data Masking transformation and loading into targets.
  • Implemented archival and retirement projects using the ILM tool.
  • Strong understanding of Performance tuning in Informatica and loading data into Data Warehouse/Data Marts.
  • Strong in developing logical, physical, conceptual and dimensional data models using star schema for data warehousing projects.
  • Responsible for interacting with business partners to identify information needs and business requirements for reports.
  • Excellent interpersonal and communication skills, technically competent and result-oriented with problem solving skills and ability to work independently.
  • Developed effective working relationships with client team to understand support requirements, develop tactical and strategic plans to implement technology solutions, and effectively manage client expectations.

TECHNICAL SKILLS

ETL and BI Tools: Informatica PowerCenter 7.x/8.x/9.x, Informatica PowerExchange 9.x, Informatica PowerExchange CDC for Oracle, Hyperion Financial Data Quality Management 9.x

DBMS/RDBMS: Oracle 11g/10g/9i/8i/7.x, SQL Server, Microsoft Access, DB2 8.1

OPERATING SYSTEMS: UNIX, Linux, DOS, Windows XP/2000/2003/2008/NT

LANGUAGES: SQL, T-SQL, PL/SQL, Unix Shell Programming, C, MS-Excel, Basic COBOL.

REPORTING and ANALYSIS Tools: Business Objects 6.x, Hyperion Financial Management 9.x, OLAP, HFM Financial Reporting (training).

VERSION CONTROL Systems: PVCS, TortoiseSVN (Subversion), Microsoft Visual SourceSafe

JOB SCHEDULING Tool: Autosys, Control-M, Tivoli

GUI Tools: Toad, SharePoint, Visual Basic, Visual FoxPro, ASP, SQL Developer, WinSql

WEB TECHNOLOGIES: HTML, JavaScript, XML

PROFESSIONAL EXPERIENCE

Confidential, Columbus, OH

Sr. Informatica Developer / Data Warehouse Analyst

Responsibilities:

  • Involved in the full System Development Life Cycle (SDLC) and responsible for all phases of requirements gathering, planning, design, development, testing and implementation.
  • Interacted with Data Stewards and Subject Matter Experts to gather requirements for efficiently extracting, transforming and loading data sets and to translate business requirement specifications into data mappings.
  • Used Informatica MDM for Data cleansing.
  • Resolved Inconsistent and Duplicate Data to support strategic goals with Multi domain MDM.
  • Filtered data from Transient Stage to EDW by using Complex T-SQL statements in Execute SQL Query Task and in Transformations and implemented various Constraints and Triggers for data consistency and to preserve data integrity.
  • Used For-Each Loop Container, Sequence Container, Script task, Execute SQL task, Send Mail Task, Package Execution task to achieve business needs.
  • Worked on multiple projects using the Informatica Developer tool (IDQ), versions 9.1.0 and 9.5.1.
  • Created and modified various Stored Procedures used in the application using T-SQL.
  • Evaluated ETL coding specifications accurately and built the same into highly optimized Informatica mappings and migrated code to the staging area.
  • Wrote custom T-SQL stored procedures and triggers to improve performance, preserve referential integrity, and provide additional application functionality; implemented backup and database maintenance plans.
  • Prepared and Verified the High Level Design (HLD) and Detail Level Design (DLD) documents for Armor Build Enterprise Data warehouse (EDW).
  • Interfaced with the Database Administration group to ensure proper configuration of database objects in support of ETL code.
  • Worked closely with the OBIEE team in building the RPD and dashboard.
  • Customized the OBIEE dashboard.
  • Performed data quality check using Trillium data quality.
  • Performed cleansing/standardization, address validation and de-duplication of the legacy source data using Trillium data quality.
  • Familiar with the OBIEE Data Warehouse Administration Console (DAC).
  • Created OBIEE source system containers.
  • Acquired knowledge of integrating the Informatica workflows into OBIEE.
  • Managed queries in Teradata and connected to it through ODBC to pull data.
  • Restored the relational connections into OBIEE for the Informatica workflows in OBIEE DAC
  • Extracted data from different sources: Oracle, SQL Server, CDMA (Access application), flat files, XML, Teradata and Siebel.
  • Delivered data to Informatica Cloud for Software as a Service (SaaS) consumption.
  • Created complex mappings from the Staging area to the EDW and from the EDW to the Data Mart using transformations such as Data Masking, Joiner, Expression, Aggregator, Lookup, Update Strategy, Filter and Router.
  • Implemented SCD1/SCD2/SCD3 logic in the mappings whenever required (see the Type 2 sketch after this list).
  • Implemented ILM to connect to production data from the test Informatica servers for production-like testing in the test environments.
  • Created a full Test Data Management process using Informatica ILM, which involved extracting data from production and making it available in an organized, secure, consistent and controlled manner in the gold copy for testers to execute their test cases.
  • Created stored procedures to load the Calendar Date table and then the Calendar Month table (a sketch also follows this list).
  • Used a Perl script to load the target table from a flat file.
  • Reviewed performance issues in the data mart, analyzing and optimizing queries with the DBAs for the SQL-override lookups.
  • Created FTP connections to read data from flat files on different servers.
  • Implemented performance tuning of the Informatica mappings using parameter files, variables and various caches (static, dynamic, shared and persistent).
  • Implemented Email, Command and Decision tasks and shell scripts in the pre-session and post-session commands.
  • Converted PL/SQL code to T-SQL.
  • Performed data profiling and code migrations from DEV to QA and then to PROD repositories, and monitored all environments.
  • Used HP-Quality Center for creating and maintaining the history of defects during Integration Testing.
  • Extensive experience in troubleshooting and solving migration issues and production issues.
  • Successfully developed and delivered Test data for implementation of Metavance System (Armor - Build 2 warehouse) for Unit Testing.
  • Ensured that all enterprise procedures were followed, such as system testing, change management, project prioritization, promotion to production, time entry, etc.
  • Worked with Production support team and successfully delivered the Claims data for Audit.
  • Drafted, instituted and continually refined required processes. Actively participated in the user acceptance and sign-off process. Worked effectively with assigned project managers.
  • Participated in the review and maintenance of data security standards to protect information assets.
  • Provided reliable, timely support of integration, performance and user acceptance testing processes.
  • Executed Unit test plans with Smoke/System/Integration tests.
  • Worked with the DBAs for the performance and database tuning.
  • Scheduled sessions and workflows using Autosys and the Informatica Scheduler.
  • Provided production support by monitoring the processes running daily.
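
The SCD Type 2 logic mentioned above was implemented inside Informatica mappings (Lookup and Update Strategy transformations); the T-SQL below is only an illustrative analog of that logic, with hypothetical table and column names (dim_customer, stg_customer, customer_id, etc.), not the actual project code.

```sql
-- Illustrative SCD Type 2 load in T-SQL (all object names are hypothetical).
BEGIN TRANSACTION;

-- 1. Expire the current dimension row when a tracked attribute has changed.
UPDATE d
SET    d.expiry_date  = GETDATE(),
       d.current_flag = 'N'
FROM   dbo.dim_customer AS d
JOIN   dbo.stg_customer AS s
       ON s.customer_id = d.customer_id
WHERE  d.current_flag = 'Y'
  AND  (ISNULL(s.customer_name, '') <> ISNULL(d.customer_name, '')
   OR   ISNULL(s.address_line,  '') <> ISNULL(d.address_line,  ''));

-- 2. Insert a new current row for new customers and for customers just expired.
--    The surrogate key is assumed to be an IDENTITY column and is omitted.
INSERT INTO dbo.dim_customer
       (customer_id, customer_name, address_line,
        effective_date, expiry_date, current_flag)
SELECT s.customer_id, s.customer_name, s.address_line,
       GETDATE(), '9999-12-31', 'Y'
FROM   dbo.stg_customer AS s
LEFT JOIN dbo.dim_customer AS d
       ON d.customer_id = s.customer_id
      AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL;   -- no current row exists (new or just expired)

COMMIT TRANSACTION;
```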
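A minimal T-SQL sketch of the calendar-load stored procedure referenced above; the procedure, table and column names (usp_load_calendar, calendar_date, calendar_month) are assumptions for illustration only.

```sql
CREATE PROCEDURE dbo.usp_load_calendar
    @start_date DATE,
    @end_date   DATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Load one row per day into the Calendar Date table.
    DECLARE @d DATE = @start_date;
    WHILE @d <= @end_date
    BEGIN
        IF NOT EXISTS (SELECT 1 FROM dbo.calendar_date WHERE calendar_dt = @d)
            INSERT INTO dbo.calendar_date (calendar_dt, calendar_year, calendar_month, day_of_month)
            VALUES (@d, YEAR(@d), MONTH(@d), DAY(@d));
        SET @d = DATEADD(DAY, 1, @d);
    END;

    -- Then derive the Calendar Month table from the date table.
    INSERT INTO dbo.calendar_month (calendar_year, calendar_month, month_start_dt, month_end_dt)
    SELECT calendar_year, calendar_month, MIN(calendar_dt), MAX(calendar_dt)
    FROM   dbo.calendar_date
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.calendar_month m
                       WHERE  m.calendar_year  = calendar_date.calendar_year
                         AND  m.calendar_month = calendar_date.calendar_month)
    GROUP BY calendar_year, calendar_month;
END;
```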

Environment: Informatica PowerCenter 9.x/8.x, SQL Server 2008 R2/2012, Oracle 10g/11g, OBIEE 11g, Siebel, Teradata, MDM, IDQ, SQL*Plus, PL/SQL, Rapid SQL 7.3.0, MS Visual SourceSafe 6.0, MS Visio, HP-CDMA (Access Application), WS FTP Pro 8.02, HP Quality Center 9.2, ClearQuest 7.0, Windows XP, Perl, AIX 5.3 (Korn shell)

Confidential, LOS ANGELES, CA

Sr. Informatica/T-SQL Developer

Responsibilities:

  • Analyzed the source data, Coordinated with Data Warehouse team in developing Relational Model and Participated in the Design team and user requirement gathering.
  • Created SSIS packages to get the data from AS/400 into SQL Server.
  • Worked as a developer in creating complex Stored Procedures, Triggers, Functions, Indexes, Tables, Views and other T-SQL code and SQL joins for applications.
  • Developed various T-SQL stored procedures, functions and packages.
  • Built Data Transformation Services (DTS) packages for ETL using DTS wizard
  • Involved in Designing ER models (Logical/Physical) for Oracle database to store data retrieved from other sources including legacy systems.
  • Extensively used Informatica Power Center 8.6.1 to extract data from various sources and load in to staging database.
  • Developed database objects such as SSIS Packages, Tables, Triggers, and Indexes using T-SQL, SQL Analyzer and Enterprise Manager
  • Used B2B data transformation for getting the data from unstructured sources (XML).
  • Solved T-SQL performance issues using Query Analyzer.
  • Parsed the data from XML’s XSD file using xml transformation.
  • Participated in the detailed level and high-level design documentation of the ETL system and mapping of business rules.
  • Interacted with business representatives for Need Analysis and to define Business and Functional Specifications.
  • Retrieved data from MQ Series sources.
  • Carried out a Developer’s role on MS SQL Server 2008, MS Access Databases (using DTS, T-SQL, Stored Procedures and Views).
  • Optimized the performance of the designed workflow processes in Informatica and identified bottlenecks in different areas after the full-volume system run.
  • Implemented Slowly Changing Dimension methodology to preserve the full history.
  • Created Several Informatica Mappings to populate the data into dimensions and fact tables.
  • Worked cooperatively with the team members to identify and resolve various issues relating to Informatica and other database related issues.
  • Developed standard and re-usable mappings and mapplets using various transformations like expression, aggregator, joiner, source qualifier, router, lookup, and filter.
  • Developed Workflows and session to load data into targets.
  • Have good knowledge of HIPAA (Health Insurance Portability and Accountability Act).
  • Extensive use of Command tasks, Decision tasks, email tasks in the workflow design.
  • Developed the Test Scripts and performed the Unit Tests on the ETL mappings.
  • Used DT/designer to validate the physical model of data warehouse or data mart
  • Extensively used DT/Studio to define the mapping process and identify the data flow from source to target using the Data Flow Designer.
  • Used DT/Studio Designer to extract multiple sources such as XML files, flat files, Oracle tables and SQL Server tables for the target.
  • Used Power Exchange interface to get the data from mainframe files.
  • Used batch process CDC (change data capture) mechanism in Power Exchange to get the recent data.
  • Created an IDQ algorithm and identified duplicate records in the source data using IDQ (an illustrative SQL analog follows this list).
  • Standardized and configured the mappings with data quality transformations to compare values on a field-by-field basis using IDQ Workbench.
  • Used the Standardizer transformation to standardize customer information such as spelling in the columns, addresses, and the number formats of SSN and telephone area codes.
  • Used Unix Shell scripts and Perl Scripts to Migrate data from different legacy systems to staging area.
  • Used the Teradata MultiLoad and FastLoad utilities to load Teradata.
  • Cleansed landing data with Informatica IDQ to load staging area in the Informatica MDM.
  • Used Address Validator transformation to format the data by analyzing input/output ports.
  • Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.
  • Scheduled sessions and workflows using Autosys / the Informatica scheduler.
  • Worked with pre and post sessions and Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Developed the Test Scripts and performed the Unit Tests on the ETL mappings.
  • Successfully migrated code from DEV to the Testing and Production environments.
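
The duplicate identification itself was done with IDQ matching rules; purely as an illustration, a T-SQL analog of flagging duplicate source rows might look like the sketch below (table and column names are hypothetical, and the match key is assumed to be name plus SSN).

```sql
-- Keep the most recent record per business key; rows with rn > 1 are
-- candidate duplicates for review or exclusion.
;WITH ranked AS (
    SELECT src.*,
           ROW_NUMBER() OVER (
               PARTITION BY LTRIM(RTRIM(UPPER(customer_name))),
                            LTRIM(RTRIM(ssn))
               ORDER BY     last_updated_dt DESC) AS rn
    FROM dbo.stg_customer AS src
)
SELECT *
FROM   ranked
WHERE  rn > 1;
```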

Environment: Informatica PowerCenter / Informatica PowerExchange 8.6.1, Informatica Data Transformation / Data Exchange (B2B), MDM, IDQ (Workbench) 9.0.1, Embarcadero DT Studio 8.6.1, Embarcadero Rapid SQL, Oracle 10g, Teradata 13, PL/SQL, SQL Assistant, Erwin 7.2, MS Visio, Autosys 4.5.0, SQL Server 2008, Windows XP, AIX 5.1

Confidential, RI

IS Lead Application Developer

Responsibilities:

  • Designed and developed the Informatica/ETL Mappings/Workflows for the Retail Data warehouse data extractions, data transformations, data staging, movement and aggregation. Also involved in loading and staging of Data Marts.
  • Worked in every role within the application lifecycle, from architecture to development and testing to go-live, and then in routine maintenance, production bug fixes, system enhancements, administration and tuning.
  • Worked closely with Managers, Project Managers, Technical Product Managers, clients, subject-matter experts and data modelers to obtain requirements, objectives and business rules for projects.
  • Interacted and collaborated with other infrastructure teams to ensure availability of the Informatica environment.
  • Played the project lead role in the Unix migration project; formed a team of offshore and onshore resources and successfully migrated code from the Sun Solaris Unix box to the IBM Unix box.
  • Represented the team, worked independently and took full responsibility for the Informatica migration project from version 7 to version 8.3 and from version 8.3 to version 9; successfully migrated 5 applications.
  • Interacted with users to support analysis and provide definition and requirements for the application.
  • Actively participated in defining requirements, functional specifications, designing documentation and testing strategies.
  • Participated in creation of logical data models based on entity-relationship diagrams.
  • Created data model through Erwin.
  • Developed profile and Unix shell scripts to automate Informatica processes.
  • Maintained batch scripts and automated job scheduling through Control-M.
  • Performed Informatica code migrations to maintain programs and deployed them through the version control software Subversion.
  • Modified and maintained the Oracle metadata tables to support the Essbase cubes (a sketch follows this list).
  • Created Data map from mainframe Copybook through Informatica Power Exchange.
  • Consistently involved in testing, debugging and performance tuning; used the Debugger for debugging mappings, designed and configured sessions with effective caching and logging, and resolved memory-related issues such as DTM buffer size and cache size to optimize session runs, as well as SQL query optimization.
  • Understood and interpreted complex business requirements by liaising with the business and translated them into business intelligence solutions.
  • Provided training to end users from time to time on the various functionalities of the tool and the basic concepts of multidimensional OLAP databases.
  • Involved in writing project documentation and charting out LOEs with milestones, technical deliverables and project plans.
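
The Oracle metadata maintenance noted above could be sketched as a simple upsert; the table and columns below (etl_cube_metadata, cube_name, load_period, last_refresh_dt) are hypothetical placeholders, not the actual project objects.

```sql
-- Hypothetical Oracle SQL sketch: keep a metadata row per cube/load period
-- current for the Essbase cube builds.
MERGE INTO etl_cube_metadata tgt
USING (SELECT 'SALES_CUBE' AS cube_name,
              'FY_Q2'      AS load_period,
              SYSDATE      AS last_refresh_dt
       FROM   dual) src
ON (tgt.cube_name = src.cube_name AND tgt.load_period = src.load_period)
WHEN MATCHED THEN
    UPDATE SET tgt.last_refresh_dt = src.last_refresh_dt
WHEN NOT MATCHED THEN
    INSERT (cube_name, load_period, last_refresh_dt)
    VALUES (src.cube_name, src.load_period, src.last_refresh_dt);
```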

Environment: Informatica PowerCenter 7.x/8.x/9.x, Informatica PowerExchange, Oracle 9i/10g/11g, SQL, Sun Solaris Unix, Linux, Erwin 3.x Data Modeler, Toad for Oracle, Windows 2003, Subversion, Control-M, IDQ, WS FTP Pro, WinSCP, MS Office Suite, MS Visio.

Confidential, MA

ETL/Informatica Developer

Responsibilities:

  • Involved in the design, development and implementation of ETL process using Informatica.
  • Generated workflows using reusable tasks, worklets and workflows containing sessions, commands, event raise/wait, decision and timers for flow control
  • Involved in optimization of ETL programs and SQL queries (overriding the SQL query) in order to increase performance (see the sketch after this list)
  • Worked with relational sources and flat files
  • Involved in rigorous testing of ETL mappings
  • Extensively involved in monitoring the jobs in order to detect and fix unknown bugs and track performance
  • Provided excellent Informatica production support
  • Used Debugger to test the mappings and fix the bugs
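
As an illustration of the SQL-override tuning mentioned in this list, a hypothetical Source Qualifier override might push the join and an incremental filter down to the DB2 source instead of joining full tables inside the mapping; the table and column names below are assumed, not taken from the project.

```sql
-- Hypothetical Source Qualifier SQL override against DB2.
-- The column order must match the Source Qualifier ports.
SELECT o.order_id,
       o.order_dt,
       o.order_amt,
       c.customer_id,
       c.customer_name
FROM   orders    o
JOIN   customers c
       ON c.customer_id = o.customer_id
WHERE  o.order_dt >= CURRENT DATE - 1 DAY   -- in practice this filter would
                                            -- typically use a mapping parameter
```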

Environment: Windows XP, Informatica Power Center 7.1.x, DB2 8.1, WinSql 5.5, UNIX

Confidential

Informatica Developer

Responsibilities:

  • Involved in developing and validating business rules
  • Lead ETL developer and ETL design for the Consumer Database
  • Analysis, Coding and Testing the Data Loads
  • Scheduling jobs in Informatica 7 for daily run
  • Generation of error report for data that falls out during the load
  • Generating documents for Knowledge Transfer
  • Platform Migration- Procedures and Mappings

Environment: Windows XP, Informatica Power Center 7.1.x, Oracle 9i, SQL*Plus, TOAD, UNIX, Autosys, PVCS

Confidential

ETL Developer

Responsibilities:

  • Analyzed the functional specs provided by the data architect and created technical spec documents for all the mappings.
  • Developed logical and physical data models that capture current-state/future-state data elements and data flows using Erwin.
  • Tuned Informatica mappings for optimum performance.
  • Provided knowledge transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Extensively used PowerCenter/PowerMart to design multiple mappings with embedded business logic. Created Mapplets and used them in different Mappings.
  • Created events and tasks in the workflows using Workflow Manager.
  • Used shell scripts to automate the execution of maps.
  • Designed and developed Oracle PL/SQL scripts for data import/export, data conversions and data cleansing (see the sketch after this list).
  • Generated reports using Cognos Impromptu tools to query the database.
  • Designed and deployed scripts in Perl.
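
A minimal sketch of the kind of cleansing script referenced above, written as a plain Oracle SQL update; the table, columns and batch filter (stg_customer, email_addr, load_batch_id) are hypothetical.

```sql
-- Hypothetical cleansing step: trim whitespace, normalize case, and null out
-- phone numbers that contain non-numeric characters.
UPDATE stg_customer
SET    customer_name = INITCAP(TRIM(customer_name)),
       email_addr    = LOWER(TRIM(email_addr)),
       phone_no      = CASE
                         -- a non-NULL TRANSLATE result means non-digit characters remain
                         WHEN TRANSLATE(phone_no, 'x0123456789', 'x') IS NOT NULL
                         THEN NULL
                         ELSE phone_no
                       END
WHERE  load_batch_id = 1001;   -- hypothetical batch filter

COMMIT;
```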

Environment: Informatica Power Center 6.2, Oracle, ERWIN, PL/SQL, UNIX, SQL, XML, Shell-Script
