Sr. Data Warehouse Consultant Resume
Houston, TX
SUMMARY
- Over ten years of professional experience as a software developer, including nine-plus years designing and developing Data Warehouse, Data Migration, and Data Conversion projects as an Informatica ETL consultant, plus client/server application work.
- Carried out the full data warehouse life cycle: requirements analysis, modeling and design, database development, and end-user application development.
- Informatica ETL expert experienced in online analytics, system analysis, creating technical design documents and project plans, and managing data warehouse projects.
- Strong understanding of Conceptual Data Modeling (CDM), Logical Data Modeling (LDM), Physical Data Modeling (PDM).
- Experienced in creating data models, building Informatica ETL processes, dimensional data modeling (primarily star schema), performing data migration, defining user interfaces, executing large projects, and assuring quality through deployment.
- Developed staging areas to Extract, Transform and Load new data from the OLTP database to warehouse.
- Strong in Dimensional Modeling, Star/Snowflake Schema, Extraction, Transformation & Load, and Aggregates.
- Strong in converting business requirements to project documentation and technical specifications.
- Extensive experience using business intelligence reporting tools such as Business Objects, Cognos BI Suite, and OBIEE for warehouses and data marts.
- Sound knowledge of Data warehousing concepts and Informatica ETL tool.
- Excellent communication, presentation, interpersonal, and analytical skills.
TECHNICAL SKILLS
Languages: C, Java (JDK 1.2, Servlets, JSP), J2EE, SQL, PL/SQL, T-SQL, NZ-SQL.
Data warehousing Tools: Informatica Power Center 9.6.1/9.5.1/9.1.0/8.6.1/8.5.1/7.1, Informatica Power Exchange, Informatica Metadata Manager, Informatica Data Quality, Informatica Data Explorer, DataStage Server Edition, Talend Open Studio, Talend Data Profiler, Business Objects 5.1/6.5, Cognos BI Suite 6.0/7.2, ReportNet 1.1, QlikView (Netezza), OBIEE.
Data Modeling Tools: Erwin, Embarcadero.
RDBMS: Oracle Exadata, Oracle 11g/10g/9i/8i/8.0, DB2, Oracle EBS 11g, Siebel 7.8, MS SQL Server 2012/2008 R2/2005, Sybase, Teradata (BTEQ, FastLoad, MultiLoad, TPump, SQL Assistant, FastExport), Netezza TwinFin 3/6/Skimmer.
Script Languages: Perl, shell script, JavaScript, Korn shell script.
Application Server: IBM WebSphere 5.
Others: Citrix, Telnet, PL/SQL Developer, Toad 9.7.2/6.5, TPump, ISQL, Aginity Workbench, Management Studio, Visio Pro, COBOL, IMS, VSAM.
Operating System: Windows XP / Windows 2000, WinNT, UNIX (AIX, HP, SCO), Linux, Sun Solaris
PROFESSIONAL EXPERIENCE
Confidential, Houston, TX
Sr. Data Warehouse Consultant
Responsibilities:
- Worked closely with business analysts to understand and document business needs for decision support data.
- Used the Update Strategy Transformation to update the Target Dimension tables.
- Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
- Upgraded Informatica from 9.5.1 to 9.6.1 on Linux servers for Dev/Test and Prod environments.
- Developed Informatica mappings and tuned them for better performance.
- Migrated data from Oracle 11g to Oracle Exadata.
- Created the Oracle Exadata database, users, base tables, and views using a proper distribution key structure.
- Used Informatica Power Connect for Oracle Exadata to pull data from Oracle Exadata data warehouse.
- Calculated KPIs and worked with the end users on OBIEE report changes.
- Created the RPD for OBIEE.
- Developed mapping parameters and variables to support connections for the target database (Oracle Exadata) and the source database (Oracle OLTP).
- Created Mapplets and used them in different Mappings.
- Used designer debugger to test the data flow and fix the mappings. Tuned Informatica Mappings and Sessions for optimum performance.
- Provided detailed technical, process, and support documentation, including daily process rollback procedures, detailed specifications, and thorough project documents covering all workflows and their dependencies.
- Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
- Worked extensively in PL/SQL to migrate the data from Oracle to Oracle Exadata database.
- Created Schema objects like Indexes, Views, and Sequences.
- Extracted data from .BRD files, flat files, and Oracle, and loaded it through Informatica.
- Worked with Crontab for job scheduling.
- Production Support and issue resolutions.
- Involved in Unit testing of Mappings, Workflows and debugging mappings for failed sessions.
- Created partitions, SQL overrides in the Source Qualifier, and session partitions to improve performance.
- Performed unit testing and system testing of the mappings developed and documented with various scenarios.
- Wrote UNIX shell scripts for repository backups, job scheduling on Crontab, etc.
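Crontab-scheduled workflow launches like the ones above usually come down to a small wrapper around `pmcmd`. A minimal dry-run sketch, assuming hypothetical service, domain, folder, and workflow names (`IS_DEV`, `Domain_Dev`, `SALES_DW`, `wf_load_dim_customer`); verify the `pmcmd` flags against your PowerCenter version:

```shell
#!/bin/sh
# Hypothetical wrapper to launch an Informatica workflow from cron.
# All service/domain/workflow names below are placeholders.
INFA_SERVICE="IS_DEV"
INFA_DOMAIN="Domain_Dev"
INFA_USER="etl_ops"

build_pmcmd_cmd() {
    # $1 = repository folder, $2 = workflow name
    echo "pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -u $INFA_USER -f $1 -wait $2"
}

# DRY_RUN=1 (default) prints the command instead of executing it,
# since pmcmd is only on PATH on the Informatica server.
CMD=$(build_pmcmd_cmd SALES_DW wf_load_dim_customer)
if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "$CMD"
else
    $CMD
fi

# Example crontab entry for a nightly 2 AM load:
# 0 2 * * * /opt/etl/scripts/run_wf.sh >> /var/log/etl/wf_load.log 2>&1
```

In practice the password would come from an environment variable or an encrypted `pmcmd` password option rather than the script itself.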
Environment: Informatica Power Center 9.6.1/9.5.1 HF2, Talend, Informatica Power Exchange 8.6.1, Oracle Forms, Greenplum, Informatica Data Quality 8.6.1, Tableau, SQL Server 2008 R2, Netezza, ODI, Oracle 11g, Oracle Exadata, OBIEE, PL/SQL, Linux, PuTTY, MongoDB, WinSCP.
Confidential, Houston, TX
Sr. Informatica Architect/ T-SQL Developer
Responsibilities:
- Created the architectural design of the Informatica /ETL and Data warehouse.
- Applied patches to Informatica Servers.
- Worked with SQL DBAs on a collation change on SQL Server 2012.
- Applied the Salesforce license to the domain for the development and production environments.
- Created Informatica best practices document, mapping/Unit testing document templates.
- Communicated with the networking team on ETL server upgrades to space, memory and processors.
- Maintained Informatica servers to ensure integration services, repositories, and servers stayed up and running, and coordinated with networking teams on server reboots.
- Created ETL auditing reports for error handling, validations, etc., against the Informatica 'OPB' repository tables.
- Installed/Configured Informatica 9.1.0 HotFix1 on Development server.
- Installed/Configured Informatica HotFix3 on development/production environment.
- Applied EBF on Informatica server for SQL Server native client 11.0.
- Production support for the daily/weekly and monthly loads.
- Installed/configured SAP adaptor and JD Edwards (Power Exchange) for Informatica.
- Configured the SFDC application connection for the full-test and production environments.
- Configured JD Edwards Power Exchange for Informatica.
- Standardized the parameter file location for each project in BWParam folder of Informatica.
- Deployed Informatica workflows from Development
- Deleted old repository backup files and purged deleted objects from repository in both development and production environments.
- Supported the development team on ETL standards, naming conventions, best practices and folder structures.
- Installed/configured Teradata Power Connect for Fast Export for Informatica.
- Folder migration from Development to UAT to Production.
- Created shared/regular folders for Projects and personal development.
- Designed & developed Informatica mappings, mapplets, worklets and workflows to create load-ready data files for loading Oracle E-Business Suite.
- Designed and developed reports for the user interface, according to specifications given by the track leader.
- Involved in performance tuning at source, target, mapping and session level.
- Loaded Oracle tables from XML sources.
- Configured Informatica for the SAP Connector.
- Extracted data from SAP and loaded it into Oracle EBS.
- Introduced the concept of a Data Dashboard to track technical details, citing the continuous requirement changes and rework needed.
- Worked on creating physical layer, business model, mapping and presentation layer in OBIEE.
- Created source system containers for OBIEE.
- Created subject areas in containers for OBIEE.
- Created narrative reports in OBIEE.
- Retrieved data from SAP using Informatica Power Exchange.
- Supported Integration testing by analyzing and fixing the issues.
- Created Unit Test Cases and documented the Unit Test Results.
- Resolved Skewness in Teradata.
- Defined Interface parameters in Informatica mappings and sessions to initiate the concurrent programs (Stored Procedures) once the Interface tables are loaded.
- Integrated Data Quality routines in the Informatica mappings to standardize and cleanse the name, address, and contact information.
- Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.
- Used Informatica web services to create work requests/work Items for the end user.
- Successfully Integrated Multiple XML sources and created a de-normalized, flat-structured file.
- Created Stored Procedures to validate and load the data from interface tables to the Oracle E-Business Suite internal tables.
- Staged data in Oracle E-Business Suite stage tables using Power Center in Informatica.
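Folder migrations like the Development-to-UAT-to-Production moves above are typically scripted around `pmrep`. A dry-run sketch with invented repository, domain, and folder names; the exact `objectexport` flags should be checked against your PowerCenter command-line reference:

```shell
#!/bin/sh
# Sketch of a Dev -> UAT code-migration step built on pmrep.
# Repository, domain, user, and folder names are placeholders.
SRC_REPO="REP_DEV"
SRC_DOMAIN="Domain_Dev"
PMREP_USER="etl_admin"
FOLDER="SALES_DW"
EXPORT_FILE="/tmp/${FOLDER}_export.xml"

# Password left as a variable reference on purpose; never hard-code it.
CONNECT_CMD="pmrep connect -r $SRC_REPO -d $SRC_DOMAIN -n $PMREP_USER -x \$PMREP_PASS"
EXPORT_CMD="pmrep objectexport -f $FOLDER -u $EXPORT_FILE"

# DRY_RUN=1 (default) prints the commands; on the Informatica server,
# unset it and run them for real, then objectimport on the target repo.
if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "$CONNECT_CMD"
    echo "$EXPORT_CMD"
fi
```

Deployment groups with labels (as described elsewhere in this resume) are the alternative to raw XML export/import for larger releases.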
Environment: Informatica Power Center 9.5.1 HF3/9.1.0 HF1, Informatica Power Exchange 8.6.1, Informatica Data Quality 8.6.1, Greenplum, Informatica Data Explorer 8.6, SQL Server 2008 R2/2012, Oracle 10g, Oracle EBS 12.0.4/5, Netezza, JD Edwards, SAP, Salesforce.com (SFDC), ODI, Informatica/Windows scheduler, Force.com, Teradata 13, Tableau, OBIEE, PL/SQL, Windows Server 2008 R2 Standard.
Confidential, Houston, TX
ETL Architect/ Informatica Lead
Responsibilities:
- Extensively worked with the solution architect to implement the project from scratch.
- Upgraded Informatica Power center from 8.6.1 to 9.1.
- Converted all the Invesco-specific utilities, such as FileManipulator, DBWrapper, and ProgramWrapper, into Informatica jobs.
- Purged the deleted objects from the repository in the DEV and QA environments.
- Managed the growth of repository as part of normal administration maintenance.
- Created the shell script to manage old repository backup files.
- Worked with versioned objects.
- Installed and configured Informatica Power Exchange for CDC and Informatica Data Quality (IDQ).
- Extensively handled Informatica server related issues with the development teams.
- Created various issue tickets with Informatica Professional Services.
- Opened SRs with Informatica PS for workflow errors that required a service request.
- Worked with the network support teams on both Windows and AIX servers for memory, storage, event log errors, and performance issues, and coordinated outages for maintenance.
- Worked closely with the Cognos reports development team in configuring the cubes.
- Created Cognos cubes and denormalized the data for faster access.
- Customized the Cognos reports.
- Provided upgrade and configuration support as part of administration.
- Applied HotFix 5 on Informatica 8.6.1.
- Created labels for each project for code deployments.
- Created deployment groups for code migration from DEV to QA and also QA to Prod.
- Created program wrapper/DB wrapper/File Manipulator jobs.
- Attended POC of IDR.
- Configured Informatica data replication (IDR) fast clone.
- Created jobs for Informatica data replication fast clone to get data from oracle and load it into Teradata.
- Created custom plans for product name discrepancy checks using IDQ and incorporated the plan as a mapplet into Power Center.
- Used IDQ’s standardized plans for addresses and names clean ups.
- Used Workflow Manager for creating and maintaining sessions, and to monitor, edit, schedule, copy, abort, and delete them.
- Extensively worked in performance tuning of the programs, ETL Mappings and processes.
- Developed Interfaces using UNIX Shell Scripts to automate the bulk load & update Processes.
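The repository-maintenance bullets above (backup files, purging old backups) typically take the shape of a small rotation script. A sketch under assumed paths and a 28-day retention window; the `pmrep backup` call is echoed so the sketch runs without an Informatica install:

```shell
#!/bin/sh
# Sketch of a weekly repository backup with rotation.
# BACKUP_DIR, REPO, and the retention window are assumptions.
BACKUP_DIR="${BACKUP_DIR:-/tmp/infa_backups}"
REPO="REP_DEV"
STAMP=$(date +%Y%m%d)
BACKUP_FILE="$BACKUP_DIR/${REPO}_${STAMP}.rep"

mkdir -p "$BACKUP_DIR"

# The real job runs pmrep (after a pmrep connect); echoed here.
echo "pmrep backup -o $BACKUP_FILE"

# List (or, with -delete, drop) backups older than 28 days
# to cap repository backup growth.
find "$BACKUP_DIR" -name "${REPO}_*.rep" -mtime +28 -print
```

Scheduled weekly from cron, this covers both the "repository backup" and "deleted old repository backup files" tasks in one place.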
Environment: Informatica PowerCenter 9.1, SQL Server 2008 R2, Oracle 11g, Sybase, SQL server Management Studio, MS Visual Studio 2010, SQL Developer for Oracle, Program Wrapper, File Manipulator, Cognos EBusiness Suite, Cognos Cubes, Teradata, Designer, SQL Server Migration Assistant (SSMA), CMS, ER-Studio, AIX, UNIX Shell Scripting, SQL, PLSQL, T-SQL, XML, Xqueries on SQL Server Database, Jira Bug tracker.
Confidential, AUSTIN, TX
Data Architect/DW Developer
Responsibilities:
- Responsibilities include Production Implementation, Scheduling, Data Loading, Monitoring, Troubleshooting and Support for Global Operations Reporting using Informatica and Business Objects.
- Extensively worked with the solution architect to implement the project from scratch.
- Training team members to run and monitor workflows in Informatica.
- Led the change management team.
- Created extensive T-SQL procedures.
- Created SSIS packages to get the data from the flat files and load it into Oracle 11g.
- Created the deployment document to be followed for migration of code from one environment to another.
- Created the metadata layer for OBIEE.
- Created batch scripts to rename/copy and move the processed files.
- Designed conceptual, logical and physical data model using Erwin data modeling tool.
- Administrative role and monitoring and support for several projects in Staging and Production environments for Informatica and OBIEE.
- Exceptional background in analysis, design, development, implementation, and testing of data warehouses and software applications.
- Created ETL mappings to ensure conformity, compliance with standards, and lack of redundancy.
- Designed and developed Informatica mappings, to load data into target tables.
- Worked with static and dynamic cache for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
- Created jobs for Informatica data replication fast clone to get data from oracle and load it into Teradata.
- Created Mapplets to be re-used in the Informatica mappings.
- Designed and developed Informatica mappings for data loads and data cleansing. Extensively worked on Informatica Designer, Workflow Manager.
- Extensively used most of the transformations of Informatica including lookups, Stored Procedures, Update Strategy.
- Created SSIS packages to automate data from Flat files into Oracle database.
- Created SSIS package to get the dynamic source file name using ForEachLoop Container.
- Used the Lookup, Merge, Data Conversion, Sort, etc., data flow transformations in SSIS.
- Used Workflow Manager for creating and maintaining sessions, and to monitor, edit, schedule, copy, abort, and delete them.
- Extensively worked in performance tuning of the programs, ETL Mappings and processes.
- Developed Interfaces using UNIX Shell Scripts to automate the bulk load & update Processes.
- Made performance improvements to the database by building Partitioned tables, Index Organized Tables and Bitmap Indexes.
- Developed OBIEE RPD and DAC from end user’s input.
- Extensively used PL/SQL programming in backend and front-end functions, procedures, packages to implement business rules and security.
- Designed and Developed Audit process to maintain the data integrity and data quality. Source to target audits were built to make sure accurate data is loaded to the warehouse and Internal Audits for checking the integrity of the data within the data warehouse.
- Tested the data and data integrity among various sources and targets. Associated with Production support team in various performances related issues.
- Strong knowledge in Oracle Business Intelligence Enterprise Edition.
Environment: Erwin 4.2, Informatica Power Center 9.1/8.6.1, Informatica Data Replication 9.1.1 (IDR), Oracle 10g/11g, Oracle Forms, Teradata 13.1, SQL Developer, Crontab scheduler, SQL Server 2008 R2, SQL Server Management Studio, SQL Server Visual Studio 2007, PL/SQL, SQL Assistant, Windows XP, Sun Solaris 5.1.0, UNIX scripting, OBIEE, IBM Vendavo Profit Analyzer.
Confidential, HOUSTON, TX
Informatica/OBIEE lead
Responsibilities:
- Extensively involved in gathering business requirements and translating them into technical requirements
- Responsible for loading the data into base model repository and Dimensional model.
- Used Informatica’s Data Transformation (B2B) tool to retrieve unstructured (xml) data.
- Responsible in strictly maintaining naming standards and warehouse standards for future development.
- Extensively wrote XQueries to get data from XML stored in SQL Server.
- Actively worked on Informatica Administration Console.
- Coordinated and led the ETL development team while assisting the OBIEE team on RPD and dashboard customization.
- Worked closely with the OBIEE team in building the RPD and dashboard.
- Customized the OBIEE dashboard.
- Configured SFDC license in administration console.
- Extensively wrote Teradata BTEQ scripts.
- Retrieved data from JD Edwards and loaded it into Oracle.
- Created a customized OBIEE model in the RPD to feed data into the dashboard.
- Configured initial-sync replication jobs using the Data Replication console.
- Created Informatica IDR jobs using Data replication console.
- Scheduled the replication job using replication console.
- Extracted data from various sources such as Flat Files, Oracle using Informatica Power Center
- Designed and Developed Stored Procedures/Views in Oracle.
- Created SSIS packages using BIDS.
- Imported sources and targets for the SSIS packages into BIDS.
- Achieved performance improvements by tuning SQL queries, XQueries, and extraction procedures between Oracle and Power Center.
- Provided 24/7 production support for the application.
- Developed complex Power Center mappings and IDQ plans using different transformations and components to meet clients' data integration and data quality requirements.
- Used the Informatica XML Source Qualifier transformation to read data from an XML (XSD) source and write data to an XML target.
- Created and monitored workflows/sessions using Informatica Workflow Manager/Workflow Monitor to load data into target Oracle database
- Performed Unit testing, Integration Testing, and User Acceptance testing to pro-actively identify data discrepancies and inaccuracies
- Involved in performance tuning at source, target, mapping and session level
- Delivered and signed off deliverables pertaining to the transactional data warehouse.
- Migrated the Informatica Code using the deployment groups
- Prepared design documents, ETL Specifications and migration documents
- Introduced the concept of an OBIEE Dashboard to track technical details, citing the continuous requirement changes and rework needed.
- Maintained daily Tech Tracker for the updates from the team regarding their objects, issues and progress
- Involved in Informatica PowerCenter 8.1.1 SP4 Repository upgrade to Informatica Power center 8.6.1
- Provided Informatica technical support to team members as well as the business.
Environment: Informatica Power Center 9.1/8.6.1, Informatica Data Transformation (B2B), Informatica Data Quality 8.6.2, Informatica IDR, Oracle 11g/10g, SQL Server 2008, SSIS, Windows 2003/2008, Sun Solaris, Red Hat Linux, OBIEE, MicroStrategy, UNIX shell scripting.
Confidential, HOUSTON, TX
BI Lead / Netezza Implementation Consultant
Responsibilities:
- Provided architectural design of Netezza to the client.
- Provided Netezza framework design to the client (pros and cons).
- Extensively assisted the Framework implementation team with time-to-time client requirements and WM specific framework design.
- Interacted with the business users, assisted the ELT developers with code development.
- Assisted the Informatica lead in understanding the framework requirement.
- Worked closely with data modelers in requirement gatherings.
- Assisted the ELT developers in understanding the mapping documents.
- Created NZ-SQL procedures to be kicked off within the framework.
- Extensively created shell scripts for the configuration of meta, ddl, data, and xfr files in the UNIX directories.
- Mentored the client on different BI tools.
- Evaluated team members' performance.
- Created stored procedures (NZ-SQL).
- Scheduled the ELT load in Autosys.
- Debugged and corrected the xfr files developed by other ELT developers.
- Fixed numerous bugs with load issues.
- Optimized the NZ-SQL queries.
- Converted Oracle DDLs to Netezza DDLs.
- Created the format of the unit test documents per Netezza Framework.
- Assisted ELT developers in creating the unit test documents.
- Managed user and folder permissions for the developers.
- Purged old repository objects weekly.
- Created shell script for repository backup weekly.
- Developed data Mappings between source systems to Landing and from Standardization to warehouse components using Mapping Designer.
- Did performance tuning on the ELT code developed by ELT developers.
- Debugged the framework error logs.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Resolved issues raised by the client/actuarial users, validating data in the database or in the application functionality.
- Worked closely with QlikView developers.
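The framework load steps above (shell scripts driving meta/ddl/data/xfr files into Netezza) usually send a transient external-table INSERT through nzsql. A dry-run sketch with hypothetical database, table, and file names; the external table options should be verified against your NPS release:

```shell
#!/bin/sh
# Sketch of a Netezza framework load step. Names are placeholders;
# the real script would pipe $SQL into: nzsql -db sales_dw
TARGET="stg_orders"
DATAFILE="/data/xfr/orders.dat"

SQL=$(cat <<EOF
INSERT INTO $TARGET
SELECT * FROM EXTERNAL '$DATAFILE'
USING (DELIMITER '|' SKIPROWS 1);
EOF
)

# Dry run: print the statement that would be sent to nzsql.
echo "$SQL"
```

The transient external table keeps the load in one SQL statement, which fits neatly into the xfr-file-driven framework described above.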
Environment: Informatica Power Center 8.6.1, Trillium Data Quality, Netezza TwinFin 6 (production), Netezza TwinFin 3 and Netezza Skimmer (non-production), QlikView reporting tool (Netezza), Oracle 11g/10g, Windows 2003/2008, Sun Solaris, Red Hat Linux, SUSE Linux, MicroStrategy, Crystal Reports, UNIX as server, Autosys, and UNIX shell script.
Confidential, HOUSTON, TX
Sr. Informatica Administrator/Developer/Data Analyst
Responsibilities:
- Installed and configured Informatica 8.6.1 on UNIX (SunOS 5.11), also configuring the High Availability option.
- Configured the domain with various application services, such as repository services and integration services.
- Purged deleted objects from the repository in the DEV and QA environments; the repository is approximately 850 MB and growing, so purging deleted objects must be performed as part of normal administrative maintenance (one hour every other day) to manage its growth.
- Performed Informatica PowerCenter HotFix upgrade / TOMCAT upgrade to meet the Organization security standards.
- Created and maintained users and groups for the Development and Quality Check repositories.
- Maintained the repository, purging repository objects and archiving periodic repository backups.
- Handled workflow errors that required a Service Request to Informatica for problem resolution; this sometimes meant backing up and zipping the repository and FTP-uploading it to the Informatica FTP site, where Informatica examines it for inconsistencies and fixes them, then downloading the fixed repository and restoring it in the troubled environment.
- Revised the standards for the ETL Design
- Gathered and analyzed business requirements for RS One, conforming to the business rules.
- Designed and customized business processes using Informatica transformations, including SQL Transformation, Source Qualifier, Lookup, Aggregator (incremental update), Expression, Joiner, Filter, Router, and Update Strategy.
- Created reusable transformation logic and Mapplets for use in multiple mappings, and worked with shared folders and shortcuts.
- Developed an Informatica mapping to handle dynamic partitioning, i.e., to create/add and archive partitions of Oracle database tables.
- Implemented mappings, sessions, and workflows flexibly using parameter files/global parameters and Informatica best practices.
- Worked with various relational and non-relational sources, such as flat files (direct/indirect), relational tables, and ERP systems.
- Created SSIS packages to get the data from AS 400 (.csv files) into SQL Server 2008.
- Used the Data Conversion transformation in SSIS to load the correct data types into the SQL Server database.
- Loaded data into Interface tables of Oracle EBS.
- Wrote extensive validation scripts before loading the data into Oracle EBS.
- Configured SAP IDoc connector for Informatica.
- Extensively used Teradata utilities like FastLoad and MultiLoad to load data into the target database.
- Bulk-loaded Teradata tables using the TPump utility.
- Retrieved data from SAP IDocs using Informatica connector.
- Imported data from AS400 to be loaded into SQL Server 2008 into the dbo schema through import wizard and stored it as an SSIS package.
- Created SSIS packages for importing the XRef tables for Purchase orders conversion.
- Developed and configured various mappings and workflows for reading and writing the data to JMS (JAVA Message Service) Queues, using Application Source qualifier.
- Significantly involved in analyzing and implementing performance tuning techniques for SQL, transformations, mappings, and sessions.
- Developed and tuned DML/DDL SQL to implement data modeling changes; also developed T-SQL and PL/SQL procedures to handle Informatica job metadata.
- Provided the functional and technical specifications for designing customized workflows and their automation.
- Implemented MDM and maintained data integrity by eliminating redundancy.
- Implemented change data capture via multi-level scheduling of sessions and workflows.
- Developed various shell scripts using pmcmd command line program, to execute and maintain the workflow jobs.
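Teradata loads like the FastLoad jobs above are driven by a control script, often generated from a shell wrapper. A sketch that writes a hypothetical control file (database, table, column, and logon values are all invented; statement order and options should be checked against the FastLoad reference):

```shell
#!/bin/sh
# Sketch: generate a FastLoad control script for a staging-table load.
# Every identifier and credential below is a placeholder.
CTL="${CTL:-/tmp/stg_orders.fld}"

cat > "$CTL" <<'EOF'
LOGON tdprod/etl_user,etl_pass;
BEGIN LOADING etl_db.stg_orders
   ERRORFILES etl_db.stg_orders_e1, etl_db.stg_orders_e2;
SET RECORD VARTEXT ",";
DEFINE order_id (VARCHAR(10)),
       amount   (VARCHAR(18))
   FILE = /data/orders.csv;
INSERT INTO etl_db.stg_orders (order_id, amount)
VALUES (:order_id, :amount);
END LOADING;
LOGOFF;
EOF

# On the Teradata client host this would run as: fastload < "$CTL"
echo "wrote $CTL"
```

For trickle loads into busy tables, TPump (mentioned above) takes a similar script but uses row-level locking instead of FastLoad's empty-table restriction.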
Environment: Informatica PowerCenter 8.6.1/8.1.1, Erwin Data Modeler, Teradata, SQL Server 2008 R2, SSIS, SSRS, Oracle 11g/10g, Oracle 11.X (EBS), SQL Server Management Studio, TOAD 9.6, SAP IDoc, OBIEE Reporting, Windows NT/2000, UNIX SunOS 5.11, HP LoadRunner 9.50.
Confidential, DENVER, CO
Sr. Informatica Lead/Administrator
Responsibilities:
- Created Integration Requirements and Design Specification Document.
- Provided architectural design for Informatica.
- Defined and designed flow and standards for Informatica.
- Presented all of the Informatica tools to the client, including their usage, advantages, and disadvantages, to help them decide which specific tools to proceed with.
- Extracted data from the Salesforce legacy system, SalesVision, and Charles River (trading platform).
- Documented ETL requirements translating STM’s Business logic into ETL language.
- Created Projects, jobs in Talend Open Studio.
- Performed basic runs, debugged jobs, and used the metadata wizard, etc., in Talend Open Studio.
- Led the offshore GDC team of ETL developers, providing them with an in-depth understanding of the architecture, ETL system design, and requirements.
- Provided the real time solutions with Informatica mappings for the traders to instantaneously react to market opportunities.
- Did extensive analysis for advanced trading analytics for drill down capability (by trader, portfolio etc).
- Analyzed data from commodity exchanges (ICE and NYMEX) and pricing sources (LIM and Platts).
- Created ETL mappings for identification of arbitrage opportunities, optimize a portfolio in real-time, simulate transactions and automatically execute trade strategies with live feeds.
- Worked closely with data population developers, multiple business units and a data solutions engineer to identify key information that will enhance business decision-making.
- Used Informatica data explorer tool (IDE) for data profiling.
- Loaded the relational tables for trade decision support which is consumed by the dashboard for trade decisions.
- Involved in designing the Data warehouse based on the requirement document using Informatica Power Center 8.6.1.
- Created a stored procedure, called from Informatica, to fetch NEXTVAL from the Oracle DUAL table.
- Created reusable expression transformation for Meta columns of Standardization area.
- Masked data and populated to the limited trust zone using Data masking transformation of Informatica.
- Used SQL, Stored Procedure.
- Used Exceed tool for scheduling the Autosys jobs.
- Debugged and corrected mappings created by GDC.
- Fixed numerous bugs with Testers inputs.
- Created Visio documents for Autosys Production Schedule.
- Created Production readiness document.
- Created Autosys documents for one time, Daily loads of data.
- Used Exceed to execute the Autosys Jobs.
- Upgraded Informatica Power Center 8.1.1 SP4 to 8.6.1.
- Configured Informatica Power Exchange add on for SAP (Power Connect)
- Created Cognos Cubes and developed Cognos reports
- Used Informatica Data Quality tool for Standardization by referring to the database dictionary tables and populating the flat file dictionaries.
- Good knowledge of major Oracle upgrades, from 10.2.0.4 to 11g.
- Worked with tools - Source Analyzer, Warehouse designer, Transformation and Mapping Designer, Transformations developer, Informatica Repository Manager and workflow Manager and Informatica workflow monitor.
- Read CSV and Tab delimited file and worked with code page.
- Created .CSV files from excel spreadsheets and loaded into the target Oracle Database.
- Worked with memory cache for static and dynamic cache for better throughput of sessions containing Rank, Sorter, lookup, joiner, Aggregator transformations.
- Wrote UNIX shell scripts extensively.
- Mentored and tutored Informatica users on the Power Center product suite.
- Created deployment groups for each iteration releases.
- Created labels for deployment groups for migration.
- Managed tools and services of Informatica.
- Managed user and folder permissions for the developers.
- Purged old repository objects weekly.
- Created shell script for repository backup weekly.
- Developed data Mappings between source systems to Landing and from Standardization to warehouse components using Mapping Designer.
- Worked extensively on different types of transformations like source qualifier, expression, filter, aggregator, rank, update strategy, lookup, stored procedure, sequence generator, joiner.
- Did performance tuning on the mappings developed by developers.
- Wrote PL/SQL Packages, Stored procedures to implement business rules and validations in the actuarial system.
- Looked up and read session, event and error logs for troubleshooting.
- Created Informatica unit testing document.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Resolved issues raised by the client/actuarial users, validating data in the database or in the application functionality.
- Worked closely with Cognos Developers for building cubes and upgrade from Cognos 8.3 to 8.4.1.
- Optimized and tuned mappings/sessions, indexing, and partitioning for better performance and efficiency.
- Created reusable transformations for better performance.
- Designed and implemented data verification and testing methods for the data warehouse.
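The Autosys scheduling work above (one-time and daily load documents, Exceed-driven execution) centers on JIL job definitions. A sketch that generates one; job, machine, owner, and path values are all invented, and the attribute set should be checked against your Autosys version:

```shell
#!/bin/sh
# Sketch: generate an Autosys JIL definition for a nightly
# Informatica load. All names and paths are placeholders.
JIL="${JIL:-/tmp/wf_daily_load.jil}"

cat > "$JIL" <<'EOF'
insert_job: wf_daily_load   job_type: c
command: /opt/etl/scripts/run_wf.sh SALES_DW wf_daily_load
machine: etlhost01
owner: etl_ops
start_times: "02:00"
days_of_week: all
std_out_file: /var/log/etl/wf_daily_load.out
std_err_file: /var/log/etl/wf_daily_load.err
EOF

# Loaded into Autosys with: jil < "$JIL"
echo "wrote $JIL"
```

Daily versus one-time loads differ mainly in the `start_times`/`days_of_week` attributes versus a manually forced start, which matches the separate run documents described above.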
Environment: Informatica Power Center 8.5.1, Informatica Metadata Manager, Informatica Data Quality 8.6.2, Informatica Data Explorer, Oracle 11g/10g, Salesforce, SalesVision, Charles River, Windows 2003/2008, Sun Solaris, Red Hat Linux, SUSE Linux, Talend Open Studio 4.1, Talend Open Data Profiler, XML sources, Cognos ePlanning 8.4.1, UNIX as server and Citrix as client, Exceed Autosys tool, and UNIX shell script.
Confidential, COLORADO SPRINGS, CO
Sr. Informatica Developer/Administrator
Responsibilities:
- Studied the current OLTP system to understand the existing data structures.
- Worked with the server team on storage, memory, event log errors, and performance issues, and coordinated maintenance outages.
- Used deployment groups for migration.
- Troubleshot and resolved issues with workflow objects.
- Raised service requests with Informatica for workflow errors requiring vendor resolution.
- Uploaded the Informatica repository to the Informatica support site via FTP, downloaded the fixed repository, and restored it in the affected environment.
- Implemented a bulk data-loading process using the Transact-SQL BULK INSERT statement.
- Participated in gathering business requirements and developed a suitable data model for the data mart.
- Involved in preparing technical design/specifications for data Extraction, Transformation and Loading.
- Worked with XML sources and targets.
- Led the team of ETL developers.
- Provided technical feedback to the team of developers.
- Worked extensively with Informatica Data Quality (IDQ) for the cleanup of addresses in the Customers table.
- Developed a strong understanding of Informatica 9, including its new features and data quality capabilities.
- Worked with full transactional drill-down capabilities using the graphical view.
- Used Normalizer transformation to normalize and de-normalize the data.
- Worked with mainframe/COBOL sources using the Normalizer transformation.
- Involved in a major Oracle upgrade from 9.2.0.7 to 10.2.0.3.
- Performed Oracle DBA tasks, including cold and hot backups.
- Performed Oracle SQL tuning and troubleshooting using explain plans.
- Developed various complex mappings with parameter files, Mapplets and Transformations for migration of data from various existing systems to the new system using Informatica Designer.
- Involved in data migration from compass3 legacy phase to EDW.
- Used various Transformations like Expression, Filter, Joiner and Lookups for better data massaging and to migrate clean and consistent data.
- Provided technical/user documentation and training.
- Developed PL/SQL triggers.
- Used Trillium data quality with Informatica.
- Used Informatica high availability and pushdown optimization.
- Defined, created, and accessed data source layouts and target data warehouse and data mart schemas through Informatica PowerMart Designer.
- Performed various Informatica administrator tasks, including upgrading Informatica from version 8.5.1 to 8.6.1.
- Created new folders and assigned user permissions to group members.
- Performed data migration, data cleansing, and aggregation during transformation.
- Used mapplets to implement business logic and simplify change management.
- Conducted unit and system tests and documented the test case results.
- Wrote shell scripts to automate workflows and to FTP files.
- Created, scheduled, and monitored workflows and sessions using the Informatica PowerCenter Server.
- Involved in performance tuning of various queries and stored procedures using Explain Plan.
- Performed various Informatica Administrative tasks & managed Informatica Repository.
- Performed the data extraction, transformation with business rules, maintained repositories, created users, user groups and developed business reports.
- Generated reports using Cognos Impromptu & Power Play.
- Analyzed and explored the reports using slice and dice and drill down.
- Created different joins according to requirements & database functions.
- Created Catalog Prompts, Conditions and Calculations.
- Created and enhanced existing Power Cube as per user requirements.
- Prepared test cases, UTPs, and review records.
- Supported process steps under development, test and production environment.
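The workflow-and-FTP automation described in the bullets above might look roughly like this sketch. The pmcmd flags, host, and credentials are assumptions and stay commented out, with a stand-in file in place of the workflow's real extract:

```shell
#!/bin/sh
# Sketch of shell automation: start a workflow, then ship its output by FTP.
# Service, domain, credentials, and host names are hypothetical.
WF=wf_daily_load            # hypothetical workflow name
OUT=/tmp/extract.csv        # hypothetical extract produced by the workflow
# pmcmd startworkflow -sv INT_SVC -d DOMAIN -u admin -p secret -f FOLDER -wait "$WF"
echo "id,amount" > "$OUT"   # stand-in for the workflow's output file
echo "ran $WF, produced $OUT"
# Non-interactive FTP via a here-document, a common pattern for such loads:
# ftp -n remote.host <<EOF
# user ftpuser ftppass
# put $OUT
# bye
# EOF
```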
Environment: Informatica PowerCenter 7.9/8.x, Informatica Metadata Manager, Windows 2000 Server, Oracle 8.1, MS SQL Server 2005, T-SQL, Cognos BI suite 6.0, UNIX, Perl scripts, shell scripts.
Confidential, SAINT PAUL, MN
Sr. Informatica Administrator
Responsibilities:
- Identified tables/scripts required for the full load, daily delta load, and bi-hourly load.
- Led the team of data warehouse developers and provided technical and logistics guidance.
- Documented the ETL process for full, daily, and bi-hourly loads.
- Developed and documented the ETL validation process.
- Created detailed design documents and Informatica objects.
- Gained a very good understanding of the Novell network.
- Extracted data from complex hierarchical XML schemas for transformation and load into Teradata, and vice versa.
- Used XML source analyzer to extract data from XML files.
- Used Informatica Data Quality (IDQ).
- Performed Teradata SQL tuning for HP Neoview.
- Gained hands-on experience using the SQL Server Integration Services (SSIS) 2008 ETL platform to build ETL-based applications.
- Used SQL Server Reporting Services (SSRS) with multiple output formats.
- Created Informatica extract scripts and executed the extract relational loader.
- Used IBM Maximo Asset management for planned and unplanned activities.
- Built Neoview mappings and loaded GL data into HP Neoview.
- Migrated data from Teradata to HP Neoview.
- Performed Oracle DBA tasks and performance tuning.
- Validated data loads against the plan, including load times.
- Used CRC incremental logic for complex fact tables.
- Imported data from the PeopleSoft HR data source.
- Updated the Tivoli scheduler for the daily and bi-hourly HP Neoview loads.
- Installed and upgraded Informatica from 8.1 SP2 to 8.1 SP4 with an EBF memory patch.
- Upgraded Informatica from 8.1 SP4 to 8.5 on all three servers (test, development, and production).
- Worked with Informatica fix/fast export.
- Gained hands-on experience with the DataMirror Constellar Hub and Transformation Server.
- Resolved and documented syntax differences between Teradata and Neoview.
- Ran benchmarks and completed tuning.
- Analyzed query performance and designed the load process schedules.
- Resolved migration issues across repositories and for individual objects using objimport and objexport tasks.
- Worked with performance issues of Informatica mappings and tuned ETL mappings for better performance using various techniques and strategies.
- Maintained versioning provided by the Informatica tool.
- Developed UNIX scripts for automation of Informatica jobs.
- Developed UNIX and PL/SQL scripts for pre and post session processes, creation and dropping the indexes to automate the daily loads.
- Identified and resolved numerous technical and operational problems in the datamart design and ETL implementation.
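The CRC incremental logic mentioned above can be sketched in plain shell: checksum each row, compare against the previous run, and keep only keys whose checksum is new or changed. The file names and pipe delimiter are hypothetical stand-ins, not the project's actual layout:

```shell
#!/bin/sh
# CRC change-detection sketch for incremental fact loads.
printf 'k1|a|10\nk2|b|20\n' > /tmp/prev.dat   # prior extract
printf 'k1|a|10\nk2|b|21\n' > /tmp/curr.dat   # current extract (k2 changed)
# Emit "key checksum" per row; the key is the field before the first pipe.
crc() {
  while IFS= read -r row; do
    printf '%s %s\n' "${row%%|*}" "$(printf '%s' "$row" | cksum | cut -d' ' -f1)"
  done < "$1"
}
crc /tmp/prev.dat > /tmp/prev.crc
crc /tmp/curr.dat > /tmp/curr.crc
# Keys new in curr, or whose CRC differs, form the incremental delta:
join -a2 /tmp/prev.crc /tmp/curr.crc | awk 'NF==2 || $2!=$3 {print $1}'
# prints: k2
```

Only the changed rows then need to be applied to the fact table, which is the point of the CRC approach.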
Environment: Informatica PowerCenter 7.1/8.1 SP2/SP4/8.5, Informatica Power Exchange, Teradata, HP Neoview, UNIX shell scripting, Queryman, Oracle 10.2.0.3, Business Objects 6.5, Tivoli scheduler, MS SQL Server.
Confidential, WA
Sr. Informatica Developer
Responsibilities:
- Assisted in business requirements gathering and analysis.
- Analyzed source data coming from Oracle and worked with business users and developers to develop the data model.
- Identified and tracked SCDs and heterogeneous sources, and determined dimension hierarchies.
- Developed a number of complex ETL mappings, mapplets and reusable transformations for daily data loads.
- Created Informatica mappings with parameter files, transformations (such as Source Qualifier, Aggregator, Filter, Router, Joiner, Sequence Generator, Lookup, Update Strategy), and tasks (such as Session, Command, Event Wait, Event Raise, Decision) to build business rules.
- Worked on IBM Information server for a different project and have sound knowledge of Datastage.
- Developed Ab Initio tasks and scripts for another project, the purchasing data mart, using Ab Initio version 1.15.
- Used Control-M tool for scheduling Jobs.
- Used Siebel EIM to populate, update, delete data from Siebel database using EIM Tables.
- Loaded Siebel base tables using Siebel EIM.
- Extracted data from PeopleSoft HR.
- Performed Oracle DBA tasks, including a major Oracle upgrade from 9.2.0.7 to 10.2.0.3.
- Performed Oracle performance tuning.
- Performed Oracle hot and cold backups.
- Gained hands-on experience with ColdFusion scripting for a web page connecting to Oracle.
- Worked with IBM TDW.
- Created test cases for Informatica mappings and design documents for production support.
- Designed the ETL processes using Informatica to load data from Oracle ERP, SQL Server, Flat Files, XML Files, Excel files and Legacy system extracts to target Oracle 10g Data Warehouse database.
- Involved in data migration from Oracle to PeopleSoft ERP.
- Used Workflow Manager and Workflow Monitor for workflow, session, and database connection management and workflow monitoring.
- Worked extensively with flat file and Excel data sources; wrote batch scripts to automate flat file transport to the Informatica server.
- Utilized worklets to trigger job execution based on flat file availability for file-sourced mappings; scheduled and automated jobs to run as a batch process.
- Administered the Informatica environment: created users and groups, configured profiles and privileges, set up folders and folder security, and configured shared folders for reusable objects.
- Fixed invalid mappings; modified and tested coded procedures and functions; performed unit and integration testing of Informatica sessions, batches, and target data. Created PL/SQL packages for iterative costing calculations.
- Worked closely with Software Developers to isolate, track, and troubleshoot defects.
- Identified performance bottlenecks and fine-tuned ETL mappings and workflows for performance. Created scripts to automate dropping and recreating warehouse table indexes for bulk loads.
- Analyzed source data and decided on the appropriate extraction, transformation, and load strategy.
- Created batch scripts for post-load renaming of standalone and legacy system extract files, parameterizing them to be generic across the board.
- Worked with business analysts on data testing and validation.
- Changed Universe identification per client requirements to facilitate easy identification for end users.
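A parameterized post-load rename script of the kind described above might look like this sketch; the directory, file prefix, and suffix convention are hypothetical stand-ins, with the parameters keeping the script generic:

```shell
#!/bin/sh
# Post-load rename sketch: date-stamp processed extract files so the next
# run starts clean. Directory and prefix come in as parameters.
SRC_DIR=${1:-/tmp/extracts}   # hypothetical default directory
PREFIX=${2:-legacy}           # hypothetical default file prefix
mkdir -p "$SRC_DIR"
touch "$SRC_DIR/${PREFIX}_extract.dat"   # stand-in for a delivered extract
STAMP=$(date +%Y%m%d)
for f in "$SRC_DIR/${PREFIX}"*.dat; do
  [ -f "$f" ] && mv "$f" "${f%.dat}.$STAMP.processed"
done
```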
Environment: Informatica PowerCenter 6.2/7.1, Informatica Power Exchange, Oracle ERP/10g, Ab Initio 1.15, Siebel 7.8, SQL Server 2003, T-SQL, Erwin 4.1, NT batch scripting, shell scripting, Toad 6.5, Windows NT 4.0, Hyperion System 9.
Confidential, LEXINGTON, KY
ETL Developer
Responsibilities:
- Served as ETL architect, creating ETL interfaces to load data from different source systems into data warehouse databases.
- Prepared schedules of coding, reviewing, testing and production move.
- DataStage controls the daily EDW batch load; created a DataStage master wrapper job that acts as the starting point for each site.
- Developed a system for long-term decision-making, strategic plans, support and solutions giving mainstream users the ability to access, analyze and share information in the company’s database.
- Worked with the business analyst to understand business requirements for decision support data.
- Involved in gathering end-user requirements, documenting, and developing prototypes.
- Prepared estimates per user requirements and ensured deliverables were delivered on schedule.
- Used DataStage Director and the runtime engine to schedule, run, monitor, and validate server jobs and their components.
- Served as a reviewer in all steps of the SDLC for a requirement in the job Jar cycle.
- Assisted in preparing HLD, LLD, test cases and UTP.
- Mentored four ETL developers on various project deliverables.
- Prepared the documents and supported for production deployment of mappings, sessions, workflows and Cognos reports and cubes.
- Involved in resolving production issues in ETL mappings for applications critical to functional users.
- Supported ETL interfaces after work hours when production issues arose.
- Participated in weekly CCB calls and change implementation, Planning and review meetings.
- Developed and maintained Repositories and implemented Security, created Source and Target mappings, Managed Sessions.
- Created mappings in DataStage Server using Transformer stages, containers, etc.
- Developed complex mappings to load Dimension & Fact tables as per STAR schema.
- Optimized and tuned mappings, including sources, targets, Transformer stages, and sessions, for better performance and efficiency.
- Used Visio Pro for data flow diagrams, flow charts, and other diagrams; PVCS was used as the software configuration management tool.
- Customized the catalogs and created user profiles to grant user permission for accessing data from different database tables.
- Used Cognos Impromptu tool to create standard & ad-hoc reports.
- Helped reports team to develop SQL queries and tuned many long running reports to increase the performance.
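The master wrapper pattern above (one job driving each site's load in sequence) can be sketched in shell. The dsjob call is commented out as an assumed DataStage CLI invocation, and the site names and project are hypothetical:

```shell
#!/bin/sh
# Master wrapper sketch: run each site's load in order; stop the whole
# batch on the first failure (set -e), as a wrapper job would.
set -e
STATUS=/tmp/edw_batch.status
: > "$STATUS"
for site in site_a site_b site_c; do
  echo "loading $site"
  # dsjob -run -jobstatus EDW_PROJECT "load_$site"   # assumed DataStage CLI
  true                                               # stand-in for the job
  echo "$site ok" >> "$STATUS"
done
echo "batch complete" >> "$STATUS"
```

Because of `set -e`, a failing site load leaves the status file short of "batch complete", which makes restart-from-failure straightforward.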
Environment: DataStage Server/Parallel Extender Edition 6.1, Oracle 9i, DB2, Sybase, Teradata as source and target databases, flat files, UNIX as server and Citrix as client, Cognos BI suite 7.2 (PowerPlay Transformer (OLAP), Cognos Upfront, Cognos Impromptu, Cognos Access Manager, Cognos Visualizer), Toad, Autosys, Perl scripts.