
Data Warehouse Analyst Resume


East Moline, IL

Professional Summary:

  • Successfully gained 7+ years of experience in Design, Data Modeling, Development, Administration, Testing and Migration of Legacy/OLTP and Data Warehousing applications.
  • Exceptional experience in modeling, designing and building Business Intelligence/Data Warehousing architectures using Informatica Power Center/Power Mart 5.0/6.2/7.1/8.1.0/8.6.0, AbInitio, Cognos 7.1/8.0/8.3 and Business Objects 5.2/XI R2.
  • Experience with Cognos 8.3 / Cognos 8 BI with Analysis Studio, Report Studio, Query Studio Frame Work Manager and Cognos Connection.
  • Experience as an ETL Developer in Informatica Power Center/Power Mart 5.0/6.2/7.1/8.1.0, with over 4+ years of experience as an ETL Architect/Designer and 1+ year of experience as an ETL Administrator.
  • Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
  • Expertise in Oracle 9i/8i/8.0/7.x, Teradata, MS SQL Server 2000/7.0/6.5, Erwin 4.0/3.5, SQL, PL/SQL, SQL*Plus, SQL*Loader, TOAD, stored procedures and triggers.
  • Exceptional experience in administration of Informix and Oracle on UNIX and NT/MS Windows 2000/XP platforms.
  • Proficient in coding UNIX shell scripts.
  • Good Knowledge of Data Modeling using ERWIN.
  • Good knowledge of data analysis and reporting tools like Cognos 7.0/8.0/8.3 and Business Objects 5.2/XI.
  • Good Knowledge of SQL, Triggers, Stored procedures, database table design, indexes, performance tuning in Informix.
  • Competency in understanding business applications, business data flow, and data relationships.
  • Ability to rapidly learn new concepts together with excellent interpersonal skills.
  • Excellent skills in understanding business needs and converting them into technical solutions.
  • Good experience and domain knowledge of Banking, Finance, Telecommunication systems, Inventory control systems in manufacturing industries.
  • Good Knowledge in testing and preparing project documentation and presentation.
  • Excellent problem-solving skills with a strong technical background; a results-oriented team player with excellent communication and interpersonal skills.

Technical Skills:

ETL : Informatica Power Center/Power Mart 5.0/6.2/7.1/8.1.0/8.6.0
(Repository Manager, Designer, creating, scheduling and running
tasks), DataStage 6.0, AbInitio 1.13.7/1.14.1
Cognos BI Suite : Cognos 8.3/Cognos 8 BI: Framework Manager, Analysis Studio, Report Studio, Metric Studio and Event Studio; Business Objects 5.0
Languages : SQL, PL/SQL, SQL*Loader, HTML, XHTML, Java, VB 6, ASP.NET, VB.NET, C, C++, J2EE
Web Servers : Microsoft IIS, Java Web Server
Databases : Oracle 10g/9i/8i/8.x, SQL Server 7/2000, IBM DB2, Teradata and MS Access
Software Packages : MS Office 2003 (Word, Excel and PowerPoint)
Other Tools : Toad, Deployment Tool and Migration Utility (Impcat2xml)
Data Modeling Tools: Visio 2000
Operating Systems : Windows 95/98/2000/NT/XP, Linux and UNIX

Education: Bachelor of Engineering in Electronics Engineering

Professional Experience:

Client: Confidential, East Moline, IL 10/08 to Present
Role: Data Warehouse Analyst

4CS is a leading provider of warranty software solutions, specializing in software that helps significantly reduce warranty costs by improving product quality and reliability. The project goal was to develop service hub data warehouse models for various customers (Daimler Trucks, Freightliner, Kawasaki and Rolls Royce) in Informatica and Cognos 8.3 and to integrate them with the application using the SDK.

Responsibilities:

  • Performed data analysis work for new and existing business requirements.
  • Involved in preparing URS (User Requirement Specification), SRS (System Requirement Specification), SDD (Software Design document).
  • Designed and developed the dimensional data model (star schema).
  • Designed and developed logical and physical data models in Designer 2000, Visio 2000 and Erwin.
  • Extensively interacted with users and was involved in preparing various documents such as detailed designs for CRF/IDS and EDW, the Dev-to-PRD migration plan and the integration test plan.
  • Developed mappings to populate reference data tables which provide codes and descriptions for dimension tables in the database.
  • Developed complex mappings in Informatica to load the data from various sources using different transformations like Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router transformations.
  • Implemented efficient and effective performance tuning procedures.
  • Fixed invalid mappings; tested stored procedures and functions; performed unit testing of Informatica sessions and workflows.
  • Used Debugger wizard to remove bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
  • Created Sessions, reusable Worklets in Workflow Manager and scheduled the sessions to run at a specified frequency.
  • Used session logs to debug sessions.
  • Executed sessions, both sequential and concurrent, for efficient execution of mappings, and used other tasks like event wait, event raise, email, command and pre/post SQL.
  • Interacted with business users and the development team to analyze requirements for new models.
  • Involved with installation and configuration of Informatica and Cognos 8.3 BI Suite.
  • Involved in the Unit testing and debugging to check the data consistency.
  • Scheduled and monitored jobs using Workflow Manager and Workflow Monitor.
  • Involved in creation of Cognos Model in Framework Manager.
  • Investigated join issues in the Model in Framework Manager and worked along with the data modelers in resolving the issues for better report functionality.
  • Created data sources in Cognos portal and published packages to be accessed by Report Studio, Analysis Studio, and Query Studio.
  • Created simple and complex reports using Report Studio by applying various concepts and functionalities such as drill-through, conditional reporting, master-detail, tables, charts, filters, prompts, various calculations and joins using tabular sets and tabular models.
  • Involved in integration of Cognos environment with Third Party Front End.
  • Developed, customized and burst reports using Report Studio.
  • Trained users in Query Studio and provided sample reports for understanding the Report layouts and functionality.
  • Involved in deployment of reports to multiple environments.
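The star-schema work described above can be sketched as a small deployment script. All table and column names here are hypothetical stand-ins (the actual customer models are confidential), and the script simply writes the DDL out to a file the way a release script would:

```shell
#!/bin/sh
# Sketch of a star-schema DDL deployment script.
# Table/column names are hypothetical illustrations only.
cat > star_schema.sql <<'EOF'
-- One fact table surrounded by conformed dimension tables.
CREATE TABLE dim_product (
    product_key   INTEGER PRIMARY KEY,
    product_code  VARCHAR(20),
    product_desc  VARCHAR(100)
);

CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,
    calendar_date DATE,
    fiscal_month  VARCHAR(10)
);

-- The fact table references each dimension by its surrogate key.
CREATE TABLE fact_warranty_claim (
    claim_key     INTEGER PRIMARY KEY,
    product_key   INTEGER REFERENCES dim_product (product_key),
    date_key      INTEGER REFERENCES dim_date (date_key),
    claim_amount  NUMERIC(12,2)
);
EOF
echo "DDL written to star_schema.sql"
```

In practice the generated file would be run through SQL*Plus or TOAD against the target schema.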

Environment: Informatica 8.1, Cognos 8.3, Cognos 8.0, Framework Manager, Cognos Connection, Report Studio, Query Studio, Analysis Studio, Access Manager Administration, Configuration Manager, Content Manager, DB2, Oracle 10g, TOAD, SQL Server 2000, Linux.

Client: Confidential, Chicago, IL
Role: ETL Architect 11/07- 9/08

Scope: As a Senior ETL Developer, I designed, developed and maintained reporting/ETL processes for Agent Sales Productivity improvement applications as well as the CRM and portal website.

Responsibilities:

  • Involved in creating Agent Sales Productivity Data Business Requirements Documents.
  • Provided data maps of the data mart systems and feeder system.
  • Designed source-to-target mappings to create the database design and structure.
  • Created a metadata repository providing a central source of information for the enterprise-wide data model and data definitions for various subject areas in the company.
  • Reverse engineered various internal systems to provide an effective data model and relationships between systems at the enterprise level.
  • Performed extraction, transformation and loading using Informatica Power Center to build the data warehouse, applying business transformation rules.
  • Ad hoc reporting using SQL.
  • Created Source to Target guideline for Informatica Developers.
  • Created ETL process to transfer and clean data from various sources to Data warehouse and Data Mart.
  • Involved in creating ETL detailed design documents using Business Requirements Documents.
  • Involved in creating reporting requirements documents and attending the design-related meetings.
  • Involved in performing the Data Analysis as well as Data Profiling.
  • Involved in creating the ETL Transformation Rules and Mapping Specification documents.
  • Involved in creating the ETL development standards as well as the development checklists.
  • Involved in maintaining BRDs, project-related ETL specs, data models, DMLs and DDLs in the Conseco Document Repository.
  • Provided project estimates for development team efforts related to ETL and database development.
  • Involved in tablespace size estimations and in planning table and index locations on tablespaces.
  • Involved in Migration of Database Objects and ETL Objects across environments.
  • Involved in data modeling based on the requirements using ERWIN and in maintaining different versions of the data models for all of the subject areas.
  • Involved in developing the ETL data flows to move the data across Legacy, OLTP, data warehouse and marts.
  • Involved in tuning the application as well as the ETL.
  • Extensively created mappings/mapplets and reusable transformations using transformations such as Joiner, Normalizer, Lookup, Filter, Expression, Stored Procedure, Aggregator and Update Strategy, along with worklets.
  • Involved in creating UNIX shell scripts for generating parameter files, manipulating/archiving source files, and scheduling/running Informatica workflows.
  • Created BOXI Universes and Reports using Web Intelligence Tool.
  • Extensively worked with the QA team to get system and integration testing done successfully; involved in ETL testing, created unit and integration test plans to test the mappings, and created test data.
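The parameter-file generation mentioned above can be sketched roughly as follows. The folder, workflow and parameter names are hypothetical examples, but the section/`$$variable` layout follows the standard Informatica parameter-file format:

```shell
#!/bin/sh
# Sketch of an Informatica parameter-file generator.
# Folder, workflow and parameter names are hypothetical examples.
RUN_DATE=$(date '+%m/%d/%Y')
PARM_FILE=wf_load_sales.parm

# $$name entries are mapping parameters; $name entries are
# session-level overrides such as connection objects.
cat > "$PARM_FILE" <<EOF
[SALES_FOLDER.WF:wf_load_sales]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_FILE_DIR=/data/inbound
\$DBConnection_Target=ORA_DWH
EOF

echo "Generated $PARM_FILE"
```

A scheduler would typically run a script like this just before invoking the workflow, so each run picks up a fresh run date.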

Technologies Used: Informatica Power Center 7.1.2/8.6.0, Oracle 10g, SQL, Business Objects BOXIR2, ERWIN, Autosys, Harvest, CSDR, UNIX, Toad, Shell Scripting using AWK/SED on UNIX HP/AIX

Client: Confidential
Role: ETL Lead 01/07 to 10/07

Scope: As the ETL lead, created the BOA loan data warehouse to provide details of loans provided to the bank's customers and employees. This data warehouse generates the data for reports about customer and employee loans by applying specific conditions; for example, eligibility for an employee loan requires being with BOA for at least 3 years.

Responsibilities:

  • Involved in creating High Level Requirements Documents.
  • Involved in creating Mapping Specification documents using HLD.
  • Involved in developing the ETL data flows to move data across Data Ware House and Data Marts.
  • Provided Project Estimates for development team efforts.
  • Involved in designing Logical/Physical Data Models for Staging Area using ERWIN.
  • Extensively created mappings/mapplets and reusable transformations using transformations such as Joiner, Normalizer, Lookup, Filter, Expression, Stored Procedure, Aggregator and Update Strategy, along with worklets.
  • Involved in creating UNIX shell scripts for generating parameter files, manipulating/archiving source files, and scheduling/running Informatica workflows.
  • Involved in tuning the SQL as well as the long-running ETL mappings.
  • Involved in migrating Informatica 6.2 to Informatica 8.1.2.
  • Extensively worked with the QA team to get system and integration testing done successfully; involved in ETL testing, created unit and integration test plans to test the mappings, and created test data.
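The source-file manipulation and archiving described above might look like this sketch, assuming a hypothetical pipe-delimited feed with header and trailer records:

```shell
#!/bin/sh
# Sketch of a source-file cleanup and archive step.
# The feed layout (pipe-delimited, HDR/TRL records) is hypothetical.
SRC=loans_feed.dat
ARCHIVE_DIR=./archive

# Create a small sample feed for illustration.
printf 'HDR|20070115\n101|SMITH |1500.00\n102|JONES |2300.50\nTRL|2\n' > "$SRC"

mkdir -p "$ARCHIVE_DIR"

# Drop header/trailer records, then trim trailing blanks in each field.
sed -e '/^HDR|/d' -e '/^TRL|/d' "$SRC" \
  | awk -F'|' 'BEGIN { OFS = "|" }
               { for (i = 1; i <= NF; i++) gsub(/ +$/, "", $i); print }' \
  > "$SRC.clean"

# Keep a date-stamped copy of the raw file before it is overwritten.
cp "$SRC" "$ARCHIVE_DIR/$SRC.$(date +%Y%m%d)"
```

The cleaned detail file is what an Informatica flat-file source would then read.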

Technologies Used: Informatica Power Center 8.1, Oracle 10g, SAS, SQL, ERWIN, UNIX crontab, Toad, Shell Scripting using AWK/SED on UNIX SunOS 5.9

Client : Confidential

Role: ETL Developer 03/06 to 12/06

Scope: Home Depot, one of the Fortune 500 companies, is one of the leaders in the retail business in the United States. It has several group companies. TCS has a long association with Home Depot.

Responsibilities:

  • Participated in Requirement Gathering, Business Analysis, User meetings, discussing the issues to be resolved and translating user inputs into ETL design documents.
  • Analysis, Design and Development, testing and implementation of Informatica transformations and workflows for extracting the data from the multiple sources.
  • Developed extraction mappings to load data from Source systems to ODS to Data Warehouse.
  • Developed complex mappings in Informatica to load the data from various sources using different transformations like Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router transformations.
  • Used debugger to test the mapping and fixed the bugs.
  • Executed sessions, both sequential and concurrent for efficient execution of mappings and used other tasks like event wait, event raise, email, command and pre/post SQL.
  • Used session partitions, dynamic cache memory, and index cache to improve the performance.
  • Documented the mappings used in ETL processes.
  • Designed and Developed the Informatica workflows/sessions to extract, transform and load the data into Target.
  • Wrote SQL, PL/SQL codes, stored procedures and packages.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
  • Optimized SQL queries for better performance.
  • Worked on different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance.
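The session execution described above can be sketched as a wrapper around pmcmd, PowerCenter's command-line client. The service, domain, folder and workflow names are hypothetical, and pmcmd is stubbed with echo so the sketch can be dry-run without a PowerCenter install:

```shell
#!/bin/sh
# Sketch of a workflow-execution wrapper around pmcmd, PowerCenter's
# command-line client.  Service/domain/folder/workflow names are
# hypothetical, and pmcmd is stubbed with echo for a dry run.
PMCMD="echo pmcmd"        # on a real server: PMCMD=pmcmd
SERVICE=IS_DWH
DOMAIN=Domain_DWH
FOLDER=HD_SALES
INFA_USER=etl_user
INFA_PWD=secret           # normally read from a secured parameter file

: > run.log

run_workflow() {
    # -wait blocks until the workflow completes, so calling this
    # twice in a row gives strictly sequential execution.
    $PMCMD startworkflow -sv "$SERVICE" -d "$DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PWD" -f "$FOLDER" \
        -wait "$1" | tee -a run.log
}

run_workflow wf_stage_orders
run_workflow wf_load_orders_dwh
```

In production the wrapper would also check pmcmd's exit status and stop the chain on the first failure.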

Technologies Used: Informatica, Oracle 9i database, Cognos, Unix AIX.

Client : Confidential 09/05 to 03/06

Role : ETL Developer

Scope: REW stands for Risk Early Warning. This project contains all the information on different types of credit card fraud and fraud handling methods. This project will replace the existing REW MIS processes for the REW organization with a robust, automated analytical reporting solution. The main objective of this project is to clearly define metric definitions and to ensure consistent metric understanding across the REW organization.

Responsibilities:

  • Initial requirement analysis.
  • Preparation of ETL design documents.
  • Creation of graphs using different types of components.
  • Creation of wrapper scripts and post process scripts.
  • Involved in unit testing and preparation of unit test plans.
  • Involved in System Integration testing and preparation of SIT plans.
  • Involved in post implementation production support.

Technologies Used: Ab-Initio 1.14.5, Oracle 9i database, Cognos, Unix AIX

Client : Confidential

Role : ETL Developer 03/05 to 09/05

Scope: The Cornerstone project is replacing CRS' recoveries SOR, R2, with a new off-the-shelf product, London Bridge Debt Manager. With that, all of CRS' interfaces related to recoveries will be replaced. All internal interfaces will be replaced to utilize Debt Manager's built-in interfaces and Capital One's DDE infrastructure. All external interfaces with external recovery agencies will be replaced with London Bridge's Partner.Net framework.
The purpose of this project is the EIM Cornerstone interfaces. It pulls the data from the source system to EIM DDE for outbound files and puts the signal files on the target system.
CRS produces a file containing all CRS accounts, which will be used by the Global One application process.
The data files for GL Register, Agency Commission and NSE are created by CRS with all CRS accounts. These files are meant for delivery to the Global One interface on GROMIT. The CRS process informs, by means of a signal file, that the data file is available for pick up.
These data files will be available on JACK at the specified location. The CRS process will generate a signal file and put it at the specified location on JACK for the EIM Cornerstone process.
The EIM Cornerstone process job will pull (FTP) the data file once the signal file is found to be available.
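The signal-file handshake described above can be sketched as follows. The paths and file names are hypothetical, and the FTP get is stubbed with a local copy so the sketch runs standalone:

```shell
#!/bin/sh
# Sketch of the signal-file handshake: wait for the signal file,
# then pull the data file.  Paths are hypothetical and the FTP get
# is stubbed with a local cp so the sketch runs standalone.
DROP_DIR=./drop           # stands in for the pickup location on JACK
WORK_DIR=./work
DATA_FILE=gl_register.dat
SIGNAL_FILE=gl_register.sig

mkdir -p "$DROP_DIR" "$WORK_DIR"
# Simulate the CRS process publishing the data file plus its signal.
echo 'GL|REGISTER|ROW1' > "$DROP_DIR/$DATA_FILE"
: > "$DROP_DIR/$SIGNAL_FILE"

# Poll for the signal file (bounded so the sketch always terminates).
tries=0
while [ ! -f "$DROP_DIR/$SIGNAL_FILE" ] && [ "$tries" -lt 10 ]; do
    sleep 1
    tries=$((tries + 1))
done

if [ -f "$DROP_DIR/$SIGNAL_FILE" ]; then
    # Real job: an ftp/get of $DATA_FILE from the remote host.
    cp "$DROP_DIR/$DATA_FILE" "$WORK_DIR/"
    rm "$DROP_DIR/$SIGNAL_FILE"     # consume the signal
    echo "Pulled $DATA_FILE"
else
    echo "Signal file not found" >&2
    exit 1
fi
```

Consuming the signal file after the pull prevents the same data file from being processed twice.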

Responsibilities:

  • Worked as an ETL/BI developer for client Capital One in the banking domain on their data warehouse using AbInitio and Oracle.
  • Worked on the initial application study, supported the applications, and maintained the application documents and enhancements. This project involved extraction from different source systems and transformation and load programs into the Oracle database and file systems.
  • Mainly involved in job monitoring and bug fixing for job failures; sent intimation mails to the scheduling team to run specific jobs for the weekly and monthly runs.
  • Reviewed existing documentation.
  • Understood the system environment, load processes and system standards.
  • Produced transformation specifications and performed peer reviews.
  • Performed testing and reviews.
  • Provided production maintenance.
Technologies Used: Ab-Initio 1.13.7, Teradata database, Cognos, Unix AIX

Client: Confidential, Hartford, CT
Role: ETL Developer 05/04- 01/05

Scope: As an ETL Developer, worked on the End User Computing (EUC) System and Fund Allocations project. The EUC system calculates the end-user commissions and shows them in the financial and sales reporting. The data is loaded from mainframe VSAM files into a staging area in an IBM UDB database; from there, data is loaded into the UDB data warehouse database and also into the existing Sybase database.

Responsibilities:

  • Understood the source systems and overall architecture.
  • Created mappings for the application.
  • Developed mappings and tuned them for better performance.
  • Involved in the Unit testing to check the data consistency.
  • Developed complex aggregate, join, look up transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using Informatica ETL (Power Center) tool.
  • Created sessions, database connections and batches using Workflow Manager
  • Scheduled and monitored jobs using Workflow Manager and Workflow Monitor.
  • Understood requirements in depth and with clarity.
  • Fixed problems in code during the development and maintenance phases and modified the corresponding documentation.
  • Wrote code to cleanse the data.
  • Writing shell scripts for executing the jobs.

Technologies Used: Informatica 5.1, Sybase 11.5, IBM UDB DB2 6.0, SAS, SQL, ERWIN 3.5, SQL Advantage, Command Center, BCP, Hyperion Essbase 6.0, COBOL, PowerBuilder, Shell Scripting using AWK/SED on UNIX SunOS 5.6

Client: Confidential USA.
Role: ETL Developer 08/03- 04/04
Scope: CMDB needed a data warehouse so that they could maintain historical data and a central location for integration, and analyze the business across different locations and profit areas, which could serve the purpose of a DSS for decision makers. The data was extracted from legacy systems and was loaded into the warehouse tables for query, analysis and reporting. I was involved in the Loans and Membership modules. In the Loan module, WaMu was involved in the disbursal of loan amounts for various purposes such as personal loans, educational loans, vehicle loans, housing loans, consumer durable loans, etc. The company requires different levels of analysis regarding loan amounts, types of customers, types of payment schedules, interest rates (variable or fixed), defaulter lists and penal interest calculations, etc.

Responsibilities:

  • Understood the source systems and overall architecture.
  • Created mappings for the application.
  • Developed mappings and tuned them for better performance.
  • Involved in the Unit testing to check the data consistency.
  • Developed complex aggregate, join, look up transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using Informatica ETL (Power Center) tool.
  • Created sessions, database connections and batches using Workflow Manager
  • Scheduled and monitored jobs using Workflow Manager and Workflow Monitor.
  • Understood requirements in depth and with clarity.
  • Fixed problems in code during the development and maintenance phases and modified the corresponding documentation.
  • Wrote code to cleanse the data.
  • Writing shell scripts for executing the jobs.

Technologies Used: Informatica 4.7, Oracle Database and Unix.

Client: Confidential Bangalore, India.
Role: Apprentice 06/02- 07/03
Scope: BFL InfoTech offers quality professional solutions and services in the areas of IT consulting, project management, software specifications, testing and validation to meet clients' business goals. Additionally, it develops projects and products for the global market in the areas of e-solutions, smart card solutions, loyalty and payment solutions, and enterprise solutions. BFL has relationships with very large companies such as BEL (Bharat Electronics Limited).

Responsibilities:

  • Underwent the initial training programs, which included basic knowledge of the SDLC.
  • Underwent the training programs held for freshers in Java and testing tools.
  • Learned to do simple coding in Java.
  • Involved in testing the code and preparing project documentation.
  • Involved in simple coding using HTML.
  • Underwent basic training in database skills.

Technologies Used: Java, HTML, Oracle Database.
