ETL/Informatica Consultant Resume
Chicago, IL
SUMMARY
- Over 7 years of full Software Development Life Cycle (SDLC) implementation experience with Data Warehouse/Data Mart and Business Intelligence projects, including Oracle Business Intelligence.
- Excellent skills in Informatica PowerCenter, PowerExchange, PowerConnect, Data Explorer (IDE) and Data Quality (IDQ).
- Experience in complete PeopleSoft implementation cycle involving requirements gathering, analysis, specification, design, development, conversions, installations, maintenance and performance tuning.
- Involved in the full Project Life Cycle of the Enterprise Data Warehouse (EDW) Projects.
- Strong working experience in Informatica Data Integration Tools - Repository Manager, Designer, Workflow Manager and Workflow Monitor.
- Experience in providing end-to-end business intelligence solutions through dimensional modeling design and developing and configuring the OBIEE Repository (RPD), Interactive Dashboards, OBIEE Answers, security implementation, Analytics metadata objects, and Web Catalog objects (Dashboards, Pages, Folders, Reports).
- Extensive background using the Oracle development tool set (including PL/SQL, SQL*Plus, SQL*Loader and PL/SQL Developer).
- Involved in all the stages of Software Development Life Cycle.
- Extensively used DataStage client components like DataStage Designer, Director, Manager and Administrator in Data Warehouse development.
- Worked with data migration and data conversion in diverse environments such as DB2 UDB, Oracle, SQL Server and complex files.
- Proficiency in design, development, implementation, administration and support of ETL processes for Data Marts and large-scale Data Warehouses using the DataStage PX 8.x ETL tool.
- Extensive experience in masking sensitive data.
- Proven track record in troubleshooting DataStage jobs and addressing production issues such as performance tuning and enhancements.
- Expertise in Developing Mappings, Mapplets and Transformations between Source and Target using Informatica Designer and writing Stored Procedures.
- Experience in Administering DAC 10.1.3.4 and Informatica 9.0.1.
- Extensive experience in working with OBIEE Answers, Delivers, iBots, and Intelligent Dashboards.
- Experience in Performance tuning of Informatica (sources, targets, mappings and sessions).
- Experience in working on ETL tools like Informatica PowerCenter (9.x / 8.x / 7.x).
- Good knowledge of OLAP tools such as BusinessObjects.
- Strong experience in UNIX shell scripting (Korn, Bash and C shell) for file manipulation, scheduling and text processing.
- Experience in using Data Warehouse Administration Console (DAC 10.1.3.4) for scheduling and executing Incremental ETL loads.
- Extensive experience in MS SQL Server 2005, PL/SQL, SQL, stored procedures and Oracle 9i.
- Effective in cross-functional and global environments to manage multiple tasks & assignments concurrently.
- Played significant role in various phases of project life cycle, such as requirements definition, functional & technical design, testing, Production Support and implementation.
- Multilingual, highly organized, detail-oriented professional with strong technical skills.
- Good aptitude for problem solving and logical analysis.
- Highly motivated, team player with excellent interpersonal and communication skills, organized and quick learner.
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter v9.1/8.6.1/8.5.1/8.1.1/7.x/6.x, Informatica Data Explorer (IDE) v8.x/5.x, Informatica DT Studio, Confidential WebSphere/Ascential DataStage 8.x/7.5x2/7.x/6.x, QualityStage.
Data model: Erwin 4.2
Operating Systems: Windows XP, NT, Windows 95, MS-DOS, UNIX
Computer Languages: PL/SQL, C, C++, COBOL, Visual Basic 6.0, XML, JavaScript, VBScript, UNIX shell scripts, Control-M & Perl scripts.
Dimensional Data Modeling: Dimensional Data Modeling, Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, ERWin 3.5.2/3.x, Oracle Designer.
RDBMS: MS SQL Server 2005, Oracle (8, 9i, 10g, 11g), SQL, Teradata and MS Access.
PROFESSIONAL EXPERIENCE
Confidential, Chicago IL
ETL /Informatica Consultant
Responsibilities:
- Involved in Requirements gathering for data warehouse.
- Resolved high-priority tickets within a 3-hour SLA to meet the needs of high-stakes clients.
- Coordinated with data modeler and database administrator to understand the business process of the data warehouse model.
- Designed data warehouse table structures to satisfy business requirements, identified data sources, developed strategies for extracting data from source systems, and designed the transformation and load into data warehouse targets; modified existing ETL code (mappings, sessions and workflows) and shell scripts per user requirements and monitored workflows/mappings in Informatica for successful execution.
- Involved in scheduling the jobs using Control-M.
- Extracted data from PeopleSoft into our data mart and loaded it into the data warehouse.
- Extracted source data from different sources such as complex fixed-width, multiple-record-type flat files (from mainframes) and Oracle databases, cleansed the data, applied business rules, and loaded the OLTP target (SCD Type 1) and the data warehouse (SCD Type 2) in the same run.
- Implemented invoice-level security in OBIEE based on user-level access by integrating with EBS.
- Familiar with Oracle BI Apps (OBIA) 7.9.x modules- Sales, Service, Marketing, HR, and Finance.
- Created a Multi-User Development (MUD) environment by utilizing projects in OBIEE.
- Utilized session variables, repository variables and initialization blocks in the repository building and modification procedures in OBIEE.
- Gathered requirements, completed a proof of concept, and designed, developed and tested the Physical, Business Model and Presentation layers of OBIEE.
- Created user tables so that other users' information is masked from unauthorized users.
- Managed and customized the ETL Informatica/DAC loads, RPD and Web catalog metadata according to the reporting needs of the client.
- Created a common data quality job that performs data type checks, data validation checks and parent-child relationship checks on source files from different vendors such as EDS, TSYS, Visa and MasterCard (a file-validation sketch follows this list).
- Involved in the physical & logical design using Erwin.
- Installed and Configured Informatica Power Center / DAC for OBIA Supply Chain Analytics on Dev Environment.
- Involved in creating reports using Business Objects.
- Installed and Configured DAC 10.1.3.4.1 on UNIX server.
- Used DAC to schedule and execute incremental loads.
- Involved in application testing, bug fixes and Production support.
- Managed change control implementation and coordinated daily and monthly releases and reruns.
- Used Debugger to validate the mappings and gain troubleshooting information.
- Worked extensively on creating complex PL/SQL Stored Procedures and Functions and optimizing them for maximum performance.
- Created and scheduled workflows and sessions using the Control-M scheduler.
- Involved in documentation of complete life cycle of the project.
- Worked with DBAs to address complex issues such as identifying deadlocks and memory issues, tuning PL/SQL packages, and creating indexes, constraints and triggers.
- Involved in Source system analysis and Requirement Gathering meetings and created Design specification and Mapping Documents for ETL Process.
- Involved in creating detailed design documents, Technical Design documents and Mapping documents and Test Cases for Informatica jobs.
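A minimal sketch of the kind of file-level validation the common data quality job above performed. The delimiter, expected field count and key position are assumptions for illustration, not details from the project.

```sh
#!/bin/ksh
# Illustrative pre-load checks on a pipe-delimited vendor file.
# Expected field count and key position are placeholders.
FILE=$1
WANT_FIELDS=12

awk -F'|' -v want="$WANT_FIELDS" '
    NF != want { printf("line %d: expected %d fields, found %d\n", NR, want, NF); bad++ }
    $1 == ""   { printf("line %d: missing key value in field 1\n", NR); bad++ }
    END        { exit (bad > 0) }
' "$FILE" || { print "validation failed for $FILE" >&2; exit 1; }

print "$FILE passed basic structural checks"
```

Parent-child relationship checks would typically compare key columns across two such files; that part is omitted here.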
Environment: Informatica 9.1, OBIEE 11.1.1.6, DAC, PeopleSoft Financials 9.x/8.x (GL, AP, AR, AM, BI), SQL Server 2000/2005, Oracle 11g Release 2/9i/8i, Confidential AIX 5.2, Universe Designer, Teradata 13.10, UNIX Korn shell scripting.
Confidential, Philadelphia PA
Sr. ETL/Informatica Consultant
Responsibilities:
- Worked with the Business Analysis team in collecting business requirements.
- Studied the existing environment, validated the requirements and gathered source data by interacting with clients on various aspects.
- Wrote complex SQL overrides at the Source Qualifier level to avoid Informatica Joiner and Lookup transformations and improve performance, since the data volume was heavy; executed jobs on the Autosys scheduler.
- Modified the existing ETL code (mappings, sessions and workflows) and the shell scripts as per user requirements, and monitored workflows/mappings in Informatica for successful execution.
- Used most of the transformations, such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression and Lookup (connected and unconnected), while transforming the Sales Force data according to the business logic.
- Responsible for preparing Source to Target Mapping (STM) specifications and Transformation rules and Design Specification Document.
- Involved in Writing/Updating Technical Design Documents and Design Documents.
- Responsible for production turnover: moving Informatica objects (workflows/mappings/sessions), Oracle objects (functions, stored procedures, DDL queries), UNIX objects (shell scripts) and Windows FTP scripts from Test to QA to PROD; migrating Harvest packages (UNIX shell scripts) from Test to QA to Production environments; and creating PTM and DTG requests for developers for the Production environment.
- Served as point of contact between developers and administrators for communications pertaining to successful job execution; resolved issues that caused production jobs to fail by analyzing the ETL code and unlocking files created by failed jobs on the Informatica server; used Lotus Notes to communicate with developers and managers and uploaded documents to SharePoint.
- Generated Business Reports for Business Users using Informatica.
- Responsible for defining ETL load mechanisms; involved in designing the extract, transform and load processes.
- Enhanced existing mappings according to business requirements.
- Extensively worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
- Involved in writing PL/SQL Stored Procedures and also modified existing stored procedures to improve the query performance.
- Worked extensively on ROBO-FTP and created scripts to encrypt/decrypt and zip files and FTP them to different external vendors (a UNIX-side equivalent is sketched after this list).
- Involved in Scheduling calendar jobs on Autosys.
- Used pmcmd commands to start, stop and ping the Integration Service from UNIX, and created UNIX shell scripts to automate the process (see the pmcmd wrapper sketch after this list).
- Joined tables originating from Oracle and SQL Server.
- Experience working with XML sources and XML targets using XSD.
- Actively Participated in Team meetings and discussions to propose the solutions to the problems.
- Involved in application testing, bug fixes and Production support.
- Managed change control implementation and coordinated daily and monthly releases and reruns.
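A minimal sketch of the pmcmd automation mentioned above, of the kind a scheduler such as Autosys could invoke. The service, domain, folder and workflow names, and the PM_USER/PM_PASS environment variables, are placeholders rather than values from this engagement.

```sh
#!/bin/ksh
# Illustrative wrapper to start an Informatica workflow from UNIX.
SVC=INT_SVC_DEV          # Integration Service name (placeholder)
DOM=Domain_Dev           # Informatica domain (placeholder)
FOLDER=SALES_DM          # repository folder (placeholder)
WF=wf_load_sales_fact    # workflow name (placeholder)

# Confirm the Integration Service is reachable before launching anything.
pmcmd pingservice -sv "$SVC" -d "$DOM" || exit 1

# Start the workflow and wait for completion; credentials are read from
# environment variables via -uv/-pv so passwords stay out of the script.
pmcmd startworkflow -sv "$SVC" -d "$DOM" -uv PM_USER -pv PM_PASS \
      -f "$FOLDER" -wait "$WF"
rc=$?
[ "$rc" -ne 0 ] && print "workflow $WF failed (rc=$rc)" >&2
exit "$rc"
```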
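ROBO-FTP itself is a Windows scripting tool, so the script below is only a UNIX-side equivalent of the compress-encrypt-and-send pattern described above; the vendor host, key ID and paths are assumptions.

```sh
#!/bin/ksh
# Illustrative outbound file preparation: compress, PGP-encrypt, then push
# to a vendor drop box. All names below are placeholders.
FILE=$1

gzip -f "$FILE"                                        # compress
gpg --batch --yes -r vendor-key-id -e "${FILE}.gz"     # encrypt with the vendor's public key

# Transfer the encrypted file over SFTP using an inline batch of commands.
sftp -b - vendor@vendor.example.com <<EOF
put ${FILE}.gz.gpg /inbound/
EOF
```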
Environment: Informatica 9.1/8.6, Informatica DT Studio 9.1/8.6, SSIS, Windows Server 2003, UNIX shell scripts, XML, DOS, Oracle 9i/10g, SQL, SQL*Loader, Business Objects 6.5, PL/SQL, Confidential DB2, MS Access, Control-M, Perl, IMS.
Confidential, San Francisco CA
DataStage Consultant
Responsibilities:
- Performed data analysis, data mapping, extraction, transformation and loading.
- Involved in creating Conceptual design which gives the Source to target mappings with the business transformation rules.
- Worked with functional staff to translate business requirements into appropriate logical and physical data structures
- Developed source to target mappings with transformation rules.
- Involved in defining the intermediate snap tables.
- Created process flow diagrams.
- Used various stages such as Pivot, Sequential File, Sort, Merge, Modify and Aggregator.
- Involved in production support on rotational basis.
- Developed job sequences and implemented restartability using various checkpoints.
- Extensively worked on DataStage Director, DataStage Manager.
- Wrote a UNIX shell script to create header and trailer records for the output file (a sketch follows this list).
- Involved in Test case writing and code review.
- Used DataStage version control to move the jobs from DEV to SYS to PROD.
- Developed and deployed Control-M jobs.
- Created Parallel Extender jobs to extract data from different source files and transform the data using stages such as the Transformer stage.
- Mapped the source and target databases by studying the specifications and analyzing the required transforms.
- Unit tested the mappings to check for the expected results.
- Experience in conducting data validation using SQL.
- Involved in developing UNIX shell scripts for batch job runs.
- Wrote UNIX shell scripts (Korn, Bash and C shell) for file manipulation, scheduling and text processing.
- Prepared test documents, including test data preparation, test cases and unit testing of the jobs.
- Provided data modeling support for numerous strategic applications.
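A minimal sketch of the header/trailer script mentioned above; the record layout (HDR/TRL tags, pipe delimiter, timestamp format) is an assumption for illustration only.

```sh
#!/bin/ksh
# Illustrative wrapper that adds header and trailer records to an extract file.
IN=$1
OUT=$2

REC_COUNT=$(wc -l < "$IN" | tr -d ' ')   # detail record count for the trailer

{
    printf 'HDR|%s|%s\n' "$(basename "$OUT")" "$(date +%Y%m%d%H%M%S)"  # header: file name + timestamp
    cat "$IN"                                                          # detail records
    printf 'TRL|%s\n' "$REC_COUNT"                                     # trailer: record count
} > "$OUT"
```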
Environment: DataStage Enterprise Edition 8.1 (Designer, Director, Manager, Parallel Extender), Windows Server 2003, SQL Server 7.0, Oracle 9i, SQL, PL/SQL, UNIX Shell Scripts, SQL*Loader, Erwin 4.1.
Confidential, Murray Hills NJ
ETL Contractor
Responsibilities:
- Extensively worked on data extraction, data integration, transformation, loading and analysis.
- Created transforms using stage variables for jobs and code for batch jobs to run jobs in parallel.
- Knowledge of configuration files and partition techniques for Parallel jobs.
- Extensively used Parallel Stages like JOIN, LOOKUP, FUNNEL, and REMOVE DUPLICATES for development.
- Used DataStage Manager for importing Metadata from repository, new job categories and importing table definitions from database tables.
- Designed, Developed, Tested server and parallel jobs according to technical design documents.
- Experienced in complete life cycle (Design, Development, Maintenance and Documentation) of Conversion Process.
- Extracted data from legacy telecommunications systems using Monarch Software (report mining/data analysis software), delimited files, Excel files and ACCESS databases.
- Developed server job routines for custom tasks to be handled in job sequencers
- Used DataStage as an ETL tool to extract data from source systems, aggregate the data and load it into the Teradata database.
- Developed UNIX shell scripts in Korn shell to automate the initial load conversion process, calling the ETL jobs that load data into the target database (a driver sketch follows this list).
- Developed a Validation Processing system for entities like Merchants, Terminals in (and) Express.
- Extensive knowledge of Oracle 9i and 10g, with experience in very large database environments and mission-critical OLTP (Informix) and OLAP systems.
- Performed import and export of DataStage components, table definitions and routines using DataStage Manager.
- Provided technical support by responding to inquiries regarding errors, problems, or questions in Production environment.
- Provided IT Support to multiple environments like Production and UAT Environments.
- Coordinated with CM team to deploy ETL code to production environments.
- Worked with DBA team to improve the loading performance in conversion cycles by using proper indexes in the queries in ETL jobs.
- Worked with SDE team to analyze the errors that came up during conversions and applied the business rules to ETL jobs as per the solutions provided by SDE team.
- Wrote documentation describing program development, logic, coding, testing, changes and corrections.
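A minimal sketch of the Korn shell driver described above, using the DataStage dsjob command-line client. The project and job names are placeholders, not names from the conversion.

```sh
#!/bin/ksh
# Illustrative initial-load driver: runs conversion sequences one after another.
PROJECT=CONV_PROJ                                  # DataStage project (placeholder)

for JOB in seq_load_merchants seq_load_terminals   # job names are placeholders
do
    # With -jobstatus, dsjob waits for the job and its exit code reflects the
    # finishing status (commonly 1 = finished OK, 2 = finished with warnings).
    dsjob -run -jobstatus "$PROJECT" "$JOB"
    rc=$?
    if [ "$rc" -ne 1 ] && [ "$rc" -ne 2 ]; then
        print "$JOB did not finish cleanly (status $rc)" >&2
        exit 1
    fi
done
print "initial load sequences completed"
```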
Environment: Ascential DataStage 8.1 (Designer, Director, Manager, Parallel Extender), Windows Server 2003, UNIX Shell Scripts, Oracle 9i/10g, SQL, SQL*Loader, Business Objects 6.5, PL/SQL, Confidential DB2, MS Access, Control-M, Perl, IMS.
Confidential, Ashburn VA
Sr. ETL Developer
Responsibilities:
- Extensively worked on data extraction, data integration, transformation, loading and analysis.
- Created transforms using stage variables for jobs and code for batch jobs to run jobs in parallel.
- Knowledge of configuration files and partition techniques for Parallel jobs.
- Extensively used Parallel Stages like JOIN, LOOKUP, FUNNEL, and REMOVE DUPLICATES for development.
- Used DataStage Manager for importing Metadata from repository, new job categories and importing table definitions from database tables.
- Designed, Developed, Tested server and parallel jobs according to technical design documents.
- Experienced in complete life cycle (Design, Development, Maintenance and Documentation) of Conversion Process.
- Extracted data from legacy telecommunications systems using Monarch Software (report mining/data analysis software), delimited files, Excel files and ACCESS databases.
- Developed server job routines for custom tasks to be handled in job sequencers
- Used DataStage as an ETL tool to extract data from source systems, aggregate the data and load it into the Teradata database.
- Developed UNIX shell scripts using Korn Shell for automating the initial load conversion process, which calls ETL jobs to load data to the database system.
- Developed a Validation Processing system for entities like Merchants, Terminals in(and) Express.
- Extensive knowledge of Oracle 9i and 10g, with experience in very large database environments and mission-critical OLTP (Informix) and OLAP systems.
- Worked with DBA team to improve the loading performance in conversion cycles by using proper indexes in the queries in ETL jobs.
- Worked with SDE team to analyze the errors that came up during conversions and applied the business rules to ETL jobs as per the solutions provided by SDE team.
- Wrote documentation describing program development, logic, coding, testing, changes and corrections.
Environment: Ascential DataStage 8.0 (Designer, Director, Manager, Parallel Extender), Windows Server 2003, UNIX Shell Scripts, Oracle 9i/10g, SQL, SQL*Loader, Business Objects 6.5, PL/SQL, Confidential DB2, MS Access, Control-M, Perl, IMS.
Confidential
Intern
Responsibilities:
- Requirement specification document preparation
- Involved in the development of parallel extender jobs using various stages
- Preparation of design guidelines for DB2 Database, DataStage ETL and Unix.
- Involved in designing jobs, analyzing the scope of the application and defining relationships within groups of data.
- Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into the data warehouse.
- Analysis of star schema and data model.
- Identifying suitable dimensions and facts for schema.
- Extensive experience in creating and loading data warehouse tables like dimensional, fact and aggregate tables using Ascential DataStage.
- Developed Job Control routines and Transform functions for converting data from one format to another.
- Wrote Perl and UNIX shell scripts for processing/cleansing incoming text files (a cleansing sketch follows this list).
- Wrote Oracle stored procedures and SQL scripts and called them in pre- and post-session tasks.
- Coordinated tasks and issues with the Team Lead and Project Manager on a daily basis.
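A minimal sketch of the kind of pre-load cleansing applied to incoming text files; the specific fixes shown (carriage returns, trailing whitespace, blank lines) are typical examples, not a record of exactly what was scripted here.

```sh
#!/bin/ksh
# Illustrative pre-load cleansing of an incoming delimited text file.
IN=$1
OUT=$2

# Drop Windows carriage returns, trim trailing whitespace, and remove blank lines.
tr -d '\r' < "$IN" | sed 's/[[:space:]]*$//' | grep -v '^$' > "$OUT"

print "cleansed $(wc -l < "$OUT" | tr -d ' ') records into $OUT"
```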
Environment: Ascential DataStage 7.5/7.5x2, Universe Basic, Oracle 8i, SQL Server, SQL, Business Objects 5.1, Web Intelligence, Shell Scripts, Erwin, Windows NT 4.0, AIX 4.x, Perl, PVCS, XML