Data Architect/Lead Resume
Santa Clara, CA
SUMMARY:
- Over 13 years of IT experience in software analysis, design, and development of client-server applications across the full SDLC, with expertise in Business Intelligence and Data Warehousing solutions for Decision Support Systems.
- Over 10 years of practical, hands-on experience in data warehouse implementation, migration, and ETL processing using Informatica Power Center 9.0.1, OBIEE, Oracle, and Teradata V2R6, including 3 years leading a BI team.
- Experience in the ETL process: creating and maintaining repositories, source systems, and target databases, and developing ETL strategies using tools such as Informatica Power Center and Kettle.
- Extensive BI and data warehousing experience using reporting solutions and Informatica PC 9.0.1/8.1/7.x/6.x.
- Analyzed requirements and led the team to populate data and complete the E-LT process. Extensive data warehouse experience with a strong understanding of dimensional modeling; designed Star and Snowflake schemas through the OBIEE Admin Tool and created measures and hierarchy levels.
- Experience in creating Metadata Objects and Web Catalog Objects (Dashboards, Pages, and Reports) using OBIEE Admin Tool.
- Provided leadership and practices for Informatica installation, administration, backup and recovery, unit testing, release management, and design of features and functionality on BI projects.
- Designed and developed complex mapping logic using transformations such as Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, and Union.
- Experience in developing the OBIEE Repository (.rpd) at all three layers (Physical, Business Model, and Presentation), Time Series objects, and interactive dashboards with drill-down capabilities using global and local filters; security setup (groups, access/query/report privileges); configured Analytics metadata objects (Subject Area, Table, Column) and Web Catalog objects (Dashboards, Pages, Folders, Reports); and scheduled iBots.
- Experience in integrating various data sources with multiple relational databases such as Oracle, Teradata, SQL Server, MS Access, DB2, and XML systems. Used Teradata utilities such as Fast Load, Multi Load, and Fast Export.
- Created appropriate indexes, used optimizer hints, rebuilt indexes, and used Explain Plan and SQL tracing (see the SQL sketch after this list).
- Hands-on experience in performance-tuning mappings, identifying and resolving performance bottlenecks at the source, target, mapping, and session levels.
- Analyze business requirements and translate them into functional and technical design specifications and implementation plans.
- Well-developed organizational and communication skills, with strength in interpersonal relationships.
- Self-starter, result-oriented team player with an ability to manage multiple tasks simultaneously.
- Proven strong analytical and logical skills with ability to follow project standards and procedures as per client specifications.
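A minimal SQL sketch of the indexing and Explain Plan work described above; the table, index, and bind names are hypothetical, for illustration only.
    -- Hypothetical example: support a common filter/join with a composite index.
    CREATE INDEX idx_sales_cust_date ON sales_fact (customer_id, sale_date);

    -- Capture the execution plan (with an optimizer hint) to confirm index use.
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(s idx_sales_cust_date) */ s.customer_id, SUM(s.amount)
    FROM   sales_fact s
    WHERE  s.customer_id = :cust_id
    AND    s.sale_date >= ADD_MONTHS(SYSDATE, -3)
    GROUP BY s.customer_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- Rebuild an index that has degraded after heavy DML.
    ALTER INDEX idx_sales_cust_date REBUILD;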
Core Skills
ETL Tools : Informatica Power Center 9.0.1
Reporting and Other : OBIEE 10.1.x Answers, Dashboards, Business Objects and SAP
Databases : R12, Oracle 10g, DB2, Vertica, Teradata, XML Database, SQL Server.
Database Tools : Oracle SQL*Plus, SQL, PL/SQL, Navigator, Toad 7.4, Teradata SQL
Teradata Utilities : Fast Load, Multi Load, Fast Export, BTEQ Scripting
Scheduling Tools : CRON, Tivoli Workload Scheduler 7.0, AppWork V5.1
Languages : SQL, PL/ SQL, UNIX Shell Script (Bash, ksh), BTEQ, Multi load, Fast Load
Case Tools : Erwin, Rational Rose Clear Case, Quest, Microsoft Visio, UML
Office Suite : MS Word, MS Power Point, MS Excel, MS Access
Operating Systems : Windows, UNIX (Solaris 7/8, CONFIDENTIAL-UX, AIX), Linux
- Oracle Certified Professional in Oracle 9i SQL.
- Informatica Certified Designer in Informatica Power Center 6.2.
- Informatica Power Center 8.1 Trained Professional from Informatica Corporation.
- PayPal IMD Change Request Process Training.
Education
- B.Sc. (Mathematics)
- DCS (Computers), PGDCA (A Level)
Professional Experience
Client: Confidential, Santa Clara, CA Oct 11 – Present
Data Architect/Lead
Description:
- Confidential is in the digital media business and wanted to create a sustainable data warehouse and reporting system. A new tax reporting requirement came in for the partner Best Buy Canada, and the tax data needs to be pushed to the DW environment for reporting purposes. The current DW sales and refund system does not collect tax-related data. After successful completion of this project, Rovi Corp can perform tax calculation and reporting for Best Buy Canada. This requires changes to the existing ETL process and architecture, and their implementation.
Responsibility:
- Worked on ETL changes and release management for the tax changes.
- Met with the cross-functional team and validated cases and their implementation.
- Identify opportunities for process/cost optimization, process redesign and development of new process.
- Participated in designing the Star Schema model (see the DDL sketch after this section). Extracted and loaded DB tables for a customer.
- Create, update and maintain project documents including business requirements, functional and non-functional requirements, functional design, data mapping, etc.
- Supported the Informatica Administrator in setting up the Development, Testing, and Production environments.
- Participated in evaluating requirements, working with the BI team and product managers to provide solutions to end users.
- Own, develop, and nurture the overall logical and physical data warehouse/data mart data model and data architectures using ERWIN and OBIEE.
- Implemented the parameter file system to avoid frequent changes. Joined various conference calls and impact analyses.
- Worked on PL/SQL, SQL Server, and Informatica PC to populate data into the Oracle system.
- Worked on advanced mappings in Informatica PC 8, using transformations such as Lookup, Expression, Stored Procedure, Router, Source Qualifier, and Update Strategy to accomplish business requirements.
- Environment: SQL Server 2008, Informatica Power Center 8, Oracle 10, File System, PL/SQL, OBIEE, RPD.
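A minimal Star Schema sketch in the spirit of the model referenced above; the table and column names are hypothetical, not the client's actual model.
    -- Hypothetical dimensions and fact table for a sales/tax star schema.
    CREATE TABLE dim_partner (
        partner_key   NUMBER        PRIMARY KEY,  -- surrogate key
        partner_name  VARCHAR2(100),
        country_code  VARCHAR2(3)
    );

    CREATE TABLE dim_date (
        date_key      NUMBER        PRIMARY KEY,  -- e.g. 20111031
        calendar_date DATE,
        fiscal_period VARCHAR2(10)
    );

    CREATE TABLE fact_sales_tax (
        partner_key   NUMBER REFERENCES dim_partner (partner_key),
        date_key      NUMBER REFERENCES dim_date (date_key),
        sale_amount   NUMBER(12,2),
        tax_amount    NUMBER(12,2)                -- tax measure pushed to the DW
    );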
Client: Confidential, Newark, CA June 11 – Oct 11
Sr. BI Consultant
Description:
- Confidential is working to implement a new system and interface for their product Risk Link. By being data efficient, fast as real life, open to the best, and in the cloud, customers will realize significant improvements in data management, automation, and operating efficiencies, so more time and resources can be engaged in creating risk intelligence and disseminating insight. This data is used by the insurance and reinsurance industries to assess damage and act accordingly.
Responsibility:
- Responsible for the design, development, testing and documentation of the Informatica mappings, Reports and Workflows based on RMS standards.
- Performed data analysis and understood business requirements. Led the effort on data validations. Implemented a file management system, including file validation for sources.
- Worked on maintaining quality standards at the ETL, data, and report levels.
- Created views and aggregate tables to support reports (see the view sketch after this section) and wrote database scripts to support the project. Developed and reframed solutions based on performance optimizations.
- Worked on data encryption, validation, and standardization. Worked on Informatica administration to manage the repository, with experience in Admin Console administration.
- Worked on Informatica server administration in the areas of repository backup, configuration settings/performance tuning, and metadata management.
- Performed administrative tasks as Informatica Administrator and participated in data-oriented tasks on Master Data projects, especially members/payments, such as standardizing, cleansing, merging, and de-duping rules, along with UAT at each stage.
- Create, update and maintain project documents including business requirements, functional and non-functional requirements, functional design, data mapping, etc.
- Identify opportunities for process optimization, process redesign and development of new process.
- Worked on advanced mappings in Informatica PC 9, using transformations such as Lookup, Expression, Stored Procedure, Router, Source Qualifier, and Update Strategy to accomplish business requirements.
- Environment: CONFIDENTIAL Vertica Database, Squirrel SQL, SQL Server 2008, Informatica Power Center 9.0.1, Flat Files
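A sketch of an aggregate view of the sort mentioned above; the underlying table and its grain are assumptions for illustration.
    -- Hypothetical aggregate view: pre-summarizes detail rows so reports
    -- avoid scanning the full exposure table on every refresh.
    CREATE VIEW v_exposure_by_region AS
    SELECT region_code,
           peril_type,
           COUNT(*)           AS policy_count,
           SUM(insured_value) AS total_insured_value
    FROM   exposure_detail
    GROUP BY region_code, peril_type;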
Client: Confidential, San Ramon, CA Oct 10 – June 11
Lead - ETL
Description: With the onset of the new Market Redesign Technology Upgrade, the Front Office has determined a need to gather and analyze market data from various sources within CAISO and other Confidential internal applications in support of least-cost dispatch. In order to accomplish this task, we are working towards an aggregation of data from various resources to provide daily, weekly, and quarterly analysis of the new Markets and Confidential’s Portfolio Optimization & Trading strategies. The project will be known as ETO (Energy Trading and Optimization).
Responsibility:
- Worked on information gathering from different sources within PGE for implementing the ETO project.
- Participated in BI and database product evaluations, POCs, and business decisions.
- Analyzed different sources (XML, database, Excel) of the existing system to extract and aggregate data into the new system; experience with Agile methodology.
- Worked on different ETL and BI releases and provided integration with SAP Business Objects and HANA tools.
- Worked on the initial implementation of Informatica Power Center 9, shell scripts, and other tools.
- Implemented file management systems for XMLs and unstructured Excel files.
- Extracted master and detail record sets from the XML system and established a one-to-many relationship in the database using Informatica while populating it (see the sketch after this section).
- Contributed to the technical documents for dimension and fact tables.
- Wrote high-level test plans and scripts; tested the Informatica builds and scripts.
- Worked on performance improvement and tuned mappings, sessions, and workflows for better performance.
- Designed and developed ETL routines using Informatica Power Center. Within the Informatica mappings, made extensive use of Lookups, Aggregator, Rank, stored procedures/functions, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
- Environment: Oracle 10g, PL/SQL, Informatica Power Center 9.0.1, Business Object reports, XML Source, Toad, Linux
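A minimal sketch of the master/detail (one-to-many) structure described above, with hypothetical trade tables standing in for the actual XML targets.
    -- Hypothetical parent/child tables: one master row can own many detail
    -- rows, enforced by the foreign key from detail back to master.
    CREATE TABLE trade_master (
        trade_id     NUMBER PRIMARY KEY,
        trade_date   DATE,
        counterparty VARCHAR2(100)
    );

    CREATE TABLE trade_detail (
        detail_id    NUMBER PRIMARY KEY,
        trade_id     NUMBER NOT NULL REFERENCES trade_master (trade_id),
        leg_type     VARCHAR2(20),
        quantity     NUMBER,
        price        NUMBER(12,4)
    );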
Client: Confidential, Miami Jan 10 – Oct 10
Lead BI Consultant
Description:
- Confidential was implementing a project to move all of the Caribbean telecommunications provider LIME's legacy data to an SAP system and to implement a data warehouse for reporting purposes. The data relates to sales and marketing and needed to be brought back into the data warehouse through an ETL process.
Responsibility:
- Implemented the ECTL process to move the existing legacy system to the SAP system.
- Identify opportunities for process optimization, process redesign and development of new process.
- Led the BI team, designed the data model for Sales, and implemented shell scripts.
- Designed the Star Schema and the physical and logical models, as well as the parameter and control tables (see the control-table sketch after this section).
- Responsible for the design, development, testing and documentation of the Informatica mappings, Transformation and Workflows based on LIME standards.
- Integration with SAP BO and HANA for analysis with Informatica PC 9.
- Finally populated the data to the Oracle data warehouse through Informatica PC 8.6. Performed performance tuning in Informatica and Oracle SQL, and migrated old OBIEE reports to BO reports.
- Participated in end-to-end development and was involved in the decision-making process, meetings, and conference calls.
- Worked on reporting requirements and scheduling and monitoring Jobs.
- Designed mappings for the staging area; responsible for analyzing requirements and working out solutions.
- Environment: Oracle 10g, PL/SQL, Informatica Power Center 8.6.0, Business Object, HANA tools, DB2 Source.
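A sketch of the parameter/control table pattern mentioned above; the names and columns are illustrative, assuming a control row drives each incremental run.
    -- Hypothetical ETL control table: each mapping reads its last successful
    -- extract window from here instead of hard-coding dates.
    CREATE TABLE etl_control (
        mapping_name    VARCHAR2(60) PRIMARY KEY,
        last_extract_ts DATE,
        status          VARCHAR2(10)  -- e.g. RUNNING / SUCCESS / FAILED
    );

    -- At the end of a successful run the workflow advances the watermark.
    UPDATE etl_control
    SET    last_extract_ts = SYSDATE,
           status          = 'SUCCESS'
    WHERE  mapping_name    = 'M_SALES_LOAD';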
Client: Confidential, Genentech, Inc. SSF Aug 09 – Dec 09
Lead ETL Consultant
Description:
- Genentech engages in research, development, manufacture, and marketing of biotechnology products for serious or life-threatening diseases. BI is responsible for managing the Confidential System database for Genentech customers, clients, and products. When the BI system (part of the CIT Commercial Group) is fully implemented, the business will be able to generate reports based on the ETL code, helping senior management with decision making.
Responsibility:
- Designed, developed, and implemented the ECTL process to move existing Oracle tables to the DW. Performed data-oriented tasks on Master Data, especially patients and customers, such as standardizing, cleansing, merging, and de-duping rules, along with UAT at each stage.
- Responsible for the design, development, testing and documentation of the Informatica mappings, Transformation and Workflows based on Gene standards.
- Identify opportunities for process optimization, process redesign and development of new process.
- Initiated, defined, and enforced data QA processes across teams; interacted with the data quality team. Finally populated the data to the Oracle data warehouse through Informatica PC 8.1, with performance tuning in Informatica and Oracle SQL.
- Used Informatica to populate Oracle tables in the BI system from the flat file system.
- Designed mappings for the Landing, CADS, and CDM areas; responsible for analyzing requirements and working out solutions. Tuned SQL queries, mappings, and PL/SQL blocks.
- Performed incremental loads into dimension tables (see the MERGE sketch after this section) and worked in a highly normalized schema. Used the stage/work/DW table concept to load data and applied Star Schema concepts. Used Informatica Workflow Manager to schedule workflows.
Environment: Oracle 9i, PL/SQL, Informatica Power Center 8.1.6, UNIX Shell Script, BO.
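A sketch of an incremental dimension load of the kind described above, using an Oracle MERGE from a stage table; all table and column names are hypothetical.
    -- Hypothetical incremental load: upsert changed customer rows from stage
    -- into the dimension, inserting new keys and updating changed attributes.
    MERGE INTO dim_customer d
    USING stg_customer s
    ON (d.customer_id = s.customer_id)
    WHEN MATCHED THEN
        UPDATE SET d.customer_name = s.customer_name,
                   d.segment       = s.segment,
                   d.updated_dt    = SYSDATE
    WHEN NOT MATCHED THEN
        INSERT (customer_id, customer_name, segment, updated_dt)
        VALUES (s.customer_id, s.customer_name, s.segment, SYSDATE);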
Client: Confidential, Costa Mesa, CA Jan 09 – Jul 09
Team Lead
Description:
- Confidential is working to implement a new system and interface for the Auto Club. Automobile Club Enterprises wanted to migrate their insurance payment system from the mainframe to Oracle R12. This migration requires extracting payment detail information from GDG COBOL files and using Informatica to populate Oracle R12 tables and distribute the file to internal systems and the corresponding bank. The ETL process will also record errors and send successful transmissions back to the system for confirmation. A complete implementation will enable the system to operate on Oracle R12 at go-live, retire the mainframe system for payments, and feed the Teradata data warehouse.
Responsibility:
- Responsible for preparing the BRD and technical specifications from the business requirements, and participated in developing the ETL environment, processes, programs, and scripts to acquire data from source systems and populate the target system, followed by feeding the data warehouse.
- Analyzed the business and documented it for development. Created tables and used Informatica to populate Oracle EBS tables from the COBOL GDG file system.
- Mentored a team of 3 Informatica developers and prepared them for development.
- Worked with different OBIEE tools such as the Admin Tool, Answers, Dashboards, and Pivot Tables.
- Created the Physical and Logical layers and worked extensively in the different OBIEE layers along with reports. Designed the Star Schema model. Extracted and loaded DB2 tables for a customer.
- Wrote Teradata BTEQ, Fast Load, and Multi Load scripts to support the project.
- Performed administrative tasks as Informatica Administrator and participated in data-oriented tasks on Master Data projects, especially members/payments, such as standardizing, cleansing, merging, and de-duping rules, along with UAT at each stage.
- Responsible for the design, development, testing, and documentation of the Informatica mappings, reports, and workflows based on AAA standards. Implemented DAC in this project for scheduling.
- Used Informatica to generate flat files to load data from Oracle to Teradata, with BTEQ/Fast Load scripts for incremental loads (see the SQL sketch after this section). Used the stage/work/DW table concept to load data, applied Star Schema concepts, and created basic reports through OBIEE.
- Identify opportunities for process optimization, process redesign and development of new process.
Environment: Oracle R12, PL/SQL, Informatica Power Center 8.6, Oracle EBS, BTEQ, Fast Load, Multi Load, OBIEE, Answers, Dashboard, DAC, DB2 Source and target, Cobol Copy Book.
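A sketch of the incremental-load SQL that a BTEQ script of the sort described above might run against Teradata; the table names are illustrative.
    -- Hypothetical incremental insert, as embedded in a BTEQ script:
    -- only payment rows not yet present in the warehouse table are loaded.
    INSERT INTO dw_payment (payment_id, account_id, pay_date, amount)
    SELECT s.payment_id, s.account_id, s.pay_date, s.amount
    FROM   stg_payment s
    WHERE  NOT EXISTS (
             SELECT 1
             FROM   dw_payment d
             WHERE  d.payment_id = s.payment_id
           );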
Client: Confidential, San Francisco, CA Apr ’08 – Dec ’08
Sr. ETL Consultant
Description:
- Obtaining customer and account information directly from mainframe operational systems affects mainframe utilization, operational system performance, the size of the mainframe environment, and mainframe availability. The objective of this project is to improve the availability of key business data and processes by replicating static or near-static data from those SORs into a non-mainframe database. A successful implementation will reduce mainframe load and, in the long run, save money by avoiding constant investment in increased mainframe capacity, while ensuring the best available data quality, controlling it, and populating the Data Warehouse database going forward.
Responsibility:
- Provide technical guidance to programming team on overall methodology, practices, and procedures for support and development to ensure understanding of standards, profiling and cleaning process and methodology.
- Interacted with the team to facilitate development, provided data quality reports, performed software migration activities, and accumulated data for the B2B solution.
- Worked in Informatica 8.x to create and deploy the business rules to populate data into tables.
- Created snapshots, summary tables, and views in the database to reduce system overhead and provide the best quality of data for reports; worked on cache management and DAC configuration.
- Creation of presentation layer tables by dragging appropriate BMM layer logical columns in OBIEE.
- Developed Global prompts, Filters, Customized Reports and Dashboards
- Implemented Pivot Tables and manipulated Init Blocks for report analysis.
- Perform data quality analysis, standardization and validation, and develop data quality metrics.
- Ensured data quality at the source and target levels to generate proper data reports and profiling.
- Provided overall direction and guidance for ETL development and support of the Prescription Solutions Data Mart. Applied Velocity best practices for the project work.
- Created and used reusable Mapplets and transformations using Informatica Power Center.
- Responsible for the design, development, testing, and documentation of the Informatica mappings.
- Environment: Windows, UNIX, Informatica 8.x, Oracle 10g, SQL, UNIX Shell Script, OBIEE, Answers, Admin tool.
Client: Confidential, San Jose, CA July ’07 – Mar ’08
Senior Data Warehouse ETL Developer
Responsibility:
- Responsible for preparing the technical specifications from the business requirements.
- Analyze the requirement work out with the solution. Develop and maintain the detailed project documentation.
- Used Informatica to generate flat files to load data from Oracle to Teradata, with BTEQ/Fast Load scripts for incremental loads. Used the stage/work/DW table concept to load data and applied Star Schema concepts. Created UDFs in the Java Transformation to complete some tasks.
- Designed, developed, and implemented the ECTL process for the Marketing team for existing Oracle tables. Wrote BTEQ scripts to support the project.
- Used a version control system (ClearCase) to manage code in different code streams.
- Performed data-oriented tasks on Master Data projects, especially Customer/Party, such as standardizing, cleansing, merging, de-duping, and determining rules.
- Responsible for the design, development, testing, and documentation of the Informatica mappings, PL/SQL, transformations, and jobs based on PayPal standards.
- Initiated, defined, and enforced DW data QA processes across teams; interacted with the QA and data quality teams.
- Identify opportunities for process optimization, process redesign and development of new process.
- Anticipate & resolve data integration issues across applications and analyze data sources to highlight data quality issues. Did performance and analysis for Teradata Scripts.
- Migrate SAS Code to Teradata BTEQ Scripts to do the scoreing for score taking in account various parameters like login details, transaction $ amount etc. Playing with Marketing Data for various reports.
Environment: Oracle 9i, Informatica PC 8.1, PL/SQL, Teradata V2R6, Teradata SQL Assistant, Fast Load, BTEQ Script, SAS Code, ClearCase, Java, Perl Scripts, XML Source.
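A sketch of the kind of scoring logic ported from SAS to BTEQ SQL; the parameters (login recency, transaction amount) and thresholds are hypothetical.
    -- Hypothetical account scoring: each factor contributes points,
    -- summed into a single score per account.
    SELECT account_id,
           CASE WHEN days_since_login <= 7    THEN 30
                WHEN days_since_login <= 30   THEN 15
                ELSE 0 END
         + CASE WHEN txn_amount_90d   >= 1000 THEN 40
                WHEN txn_amount_90d   >= 100  THEN 20
                ELSE 5 END AS risk_score
    FROM   account_activity;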
Client: Confidential, SLDM, Phoenix, CA Oct ’06 – June ’07
Informatica and ETL Lead
Description:
- The SchwabLink file delivery process is one of the cornerstone technology offerings of Schwab Institutional (SI). Effective portfolio management cannot be performed without data. Financial advisors doing business with SI rely upon the SchwabLink file delivery process to obtain daily updates to customer, account, balance, position, and other important data required for portfolio management.
- The file delivery process harvests Schwab broker/dealer data for use in creating download files. In addition, data is received from external sources including U.S. Trust, The Charles Schwab Trust Company, The Charles Schwab Bank, and Great Western. In total, nearly 4 GB of data is received daily for processing and delivery of more than 68,000 individual files – with 79 different file formats – to advisors using the SchwabLink file download mechanism.
Responsibility:
- Designed the roadmap for the ETL implementation and worked with the ETL team.
- Developed reusable mappings and mapplets to accomplish similar business rules using various transformations. Configured VSAM files as sources.
- Designed the process of parameter file generation from Informatica.
- Used shell scripts to run the workflows and normalized the process.
- Handled check-in/check-out of mappings and lock checking.
- Created and used reusable Mapplets and transformations using Informatica Power Center.
- Responsible for the design, development, testing and documentation of the Informatica mapping, sessions and workflows based on Schwab standards and specifications.
- Created Scripts to populate Teradata tables and used BTEQ Scripts along with Teradata utilities.
- Executed several sessions and created batches to automate the loading process.
- Loaded the data from VSAM file to Oracle and also created flat files.
- Generated the XML Source/Target for specific metadata requirements.
Environment: Oracle 10g, Informatica PC 8.1.1, Clear Case 7.0, Toad, COBOL Files, Teradata Source, Target.
Confidential, Aug ‘05 – Sep ‘06
Senior Data Warehouse Consultant
Description:
- The Data Warehouse programme will enable the identification and realization of improvements in the areas of cross-selling & market share growth, risk management, operational excellence and finance & regulatory reporting by defining and building the Enterprise Data Warehouse providing accurate, consistent, timely and shared customer centric information for Singapore and Hong Kong.
- This project is about the business’s requirement for consolidated, timely, standardized, and cleansed data, globally available on a timely basis to ultimately answer business questions for identifying and realizing improvements across the business lines and geographies. It is an “enterprise-wide” initiative that addresses the lack of quality and timely customer information, the need to merge more and more data from currently separate sources, and the need to establish standards for a wide range of information used within the bank.
Responsibility:
- Worked as team lead; involved with the development team in setting the standards for Informatica developers on the project.
- Communicated with the immediate supervisor on the status of work in progress, in both verbal and written communications.
- Supported the whole system, including TWS daily runs and daily backups.
- Responsible for designing the ETL source map and transformation logic.
- Developed Teradata FastLoad, MultiLoad, and FastExport scripts.
- Created DSNs and Database connections to support Informatica Architecture.
- Involved in working with various active and passive transformations.
- Created workflows and sessions to perform data loading.
- Designed Test-Cases and validation scripts to test the loaded data.
- Tuned the mapping, data flow and session property sheet to optimize the performance.
- Worked in Star schema and Snowflake schema design.
- Wrote shell scripts to run Informatica workflows and backups.
Environment: Windows, IBM AIX, Teradata V2R5, Informatica PC 7.1.2 (Designer, Repository Manager, Workflow Manager, Monitor), Tivoli Workload Scheduler, Teradata Utilities like Fast Load, Multi Load, Fast Export, SQL.
Confidential, Jan ’05 – Jul ’05
Team Lead
Description:
- To create a tiered Data Warehouse Architecture based on the requirements. Data from various data sources will be consolidated into different business subject areas within the Data Warehouse layer. This will ensure semantic integrity for common attributes across different data marts and eliminate redundancy in data extraction, transformation and consolidation.
Responsibility:
- Designed the tables and created the mappings to populate data.
- Used control tables and parameter files to run the mappings.
- Wrote shell scripts to take repository backups and run workflows.
- Implemented complex mappings using expressions, aggregators, filters, joiner, rank, union, sequence generator and procedures. Populated Teradata Tables for DW Implementations.
- Developed and tested ETL procedures to ensure conformity, compliance with standards and lack of redundancy, translated business rules and functionality requirements into ETL procedures.
- Created parameter files and handled version control through ClearCase.
- Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.
- Used BTEQ scripts. Implemented change requests and was involved in testing the mappings and documents.
Environment: Informatica PC 6.2, Oracle 8i/9i, Cognos Report Studio, Toad, SQL, Informatica Repository Manager, Designer, UNIX Shell Script, Windows 2000/NT, Teradata V2R1, Rational Rose ClearCase.
Confidential May ‘04 - Dec ‘04
Sr. ETL Consultant/Informatica Lead
Description:
- Confidential is facing compelling reasons to revamp the existing DSS and create a Partner Data Warehouse. The merger between CONFIDENTIAL and Compaq has resulted in duplication of legacy systems, business processes and data structures maintained by the pre-merger companies. The source systems to existing DSS are being revamped to iron out inconsistencies between pre-merger CONFIDENTIAL and Compaq business processes and data.
- Migrate to a redesigned data model.
- Migrate PL/SQL procedures used for ETL to Informatica. Incorporate intra-day data loading and Migrate to a combined single Business Objects Universe from the current practice of multiple Universes.
Responsibility:
- Upgraded the Informatica environment from Informatica 6.x to 7.3.
- Migrated existing PL/SQL procedures into Informatica mappings.
- Automated the extraction process, using shell scripts, for origins that send data in flat file format.
- Created mappings for new integrated business logic.
- Created UNIX environment variables, developed shell scripts to FTP daily and monthly delta files and archive old files.
- Optimization and performance tuning of mappings to eliminate bottlenecks and achieve high response time.
- Documented the mapping strategies and transformations involved in the extraction, transformation and loading process.
- Wrote shell scripts on the UNIX box to run the sessions.
- Executed several sessions and created batches to automate the loading process.
- Involved in testing and maintenance of the same project.
Environment: Informatica Power Center 6.2, Oracle 8i/9i, Shell Scripts, SQL, PL/SQL, Windows, CONFIDENTIAL-UX.
Confidential May ‘03 - Apr ‘04
ETL Consultant
Description:
- The customer wanted to integrate their data from all origins (legacy and Oracle). After the integration, we applied business logic and used Informatica sessions to pull the data into the target system, where it could be used to generate Cognos cubes and reports.
Responsibility:
- Performed various loads (daily, weekly, and monthly) based on the request.
- Automated the Process of Extraction for some Origin.
- Developed various Mappings to implement business logic.
- Used shell scripts to run the sessions.
- Used Update Strategy and target load plans to load data into Type-2 dimensions (see the SQL sketch after this section).
- Created and used reusable Mapplets and transformations using Informatica Power Center.
- Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.
- Created Snapshots, Summary tables and materialized views in Database to reduce the system overhead.
- Designed and developed ETL routines using Informatica Power Center. Within the Informatica mappings, made extensive use of Lookups, Aggregator, Rank, stored procedures/functions, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
- Executed several sessions and created batches to automate the loading process.
- Involved in testing the developed mappings and created the test cases.
- Loaded the data from .CSV file to Oracle and Teradata.
Environment: Informatica Power Center 6.2, Oracle 8i, Shell Scripts, Teradata, Teradata Utilities like Fast Load, Multi Load, Fast Export, SQL.
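A minimal SQL sketch of the Type-2 dimension logic the Update Strategy implemented; the column names, sequence, and expiry convention are assumptions.
    -- Hypothetical SCD Type-2 load in two steps.
    -- 1) Close out the current row when a tracked attribute has changed.
    UPDATE dim_product d
    SET    d.eff_end_dt   = SYSDATE,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM stg_product s
                   WHERE  s.product_id = d.product_id
                   AND    s.category  <> d.category);

    -- 2) Insert the new version (covers changed and brand-new products).
    INSERT INTO dim_product (product_key, product_id, category,
                             eff_start_dt, eff_end_dt, current_flag)
    SELECT dim_product_seq.NEXTVAL, s.product_id, s.category,
           SYSDATE, NULL, 'Y'
    FROM   stg_product s
    WHERE  NOT EXISTS (SELECT 1 FROM dim_product d
                       WHERE  d.product_id = s.product_id
                       AND    d.current_flag = 'Y');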
Confidential Nov ‘02 – Apr ‘03
ETL Consultant
Description:
- The customer wanted to synchronize its “Sell”, “Plan-Buy”, “Make” and “Deliver” strategies, policies, processes, systems and organizations with their factories, suppliers, service providers and retailers.
- In order to be able to secure the best prices, terms and quality at global or regional level and reduce transaction costs, the organization needed access to global and regional purchasing information.
- To effectively leverage supply chain information, the company decided to implement a Retail Sales data warehousing solution.
Responsibility:
- Involved in development of various mappings.
- Executed several sessions and created batches to automate the loading process.
- Involved in testing the developed mappings and created the test cases.
- Developed STTs (Design Document) for better understanding of existing ETL processes.
- Involved in analyzing the business process, selecting the grains, Dimensional Modeling and identifying the facts.
- Implemented retail schema extensibility by identifying surrogate keys for the fact table to deploy the SCD.
- Designing star schema, fact table, dimension table and hierarchies.
- Created various transformations to support voluminous loading of data in target table.
- Determining the folder structure to reflect the business terms.
Environment: Informatica Power Center 6.2, DB2, Oracle 8i, SQL, PL/SQL, SQL*Loader, UNIX shell scripts, Windows 2000, Informatica PC 5.0, Cognos 6.0, ERWIN.
Confidential Oct ’01 - Nov ’02
ETL Developer
Description:
- Multiple acquisitions/mergers resulted in multiple purchasing systems with multiple language applications.
- Within the organization multiple direct/indirect purchasing systems existed.
- No single source of consolidated sourcing information was available.
Responsibilities:
- Developed various active and passive transformations to implement business logic.
- Developed and validated appropriate mapping among transformations.
- Created batches for bulk loading in Server Manager.
- Created local as well as global repositories.
- Designing star schema, fact table, dimension table and hierarchies.
- Involved in identifying surrogate keys for the fact table.
- Responsible for analyzing the existing OLTP design and designing the OLAP model.
- Trained end users to generate reports and create power cubes from the catalogs.
Environment: Win 2K, Informatica Power Mart 5.0, Oracle 8i, SQL, PL/SQL, SQL*Loader.
Confidential Jan’99 - Sep’01
Software Developer
Description:
- An inventory control system was developed to handle the company's inventory positions and other allied activities such as order processing, deliveries, invoicing, and billing; it maintained stock positions and reorder levels and took care of the purchase and sales departments' activities.
Responsibilities:
- Developed detailed DFD and ER diagram for all stages.
- Designed ordering modules.
- Created Tables, indexes and integrity constraints.
- Involved in system design, development and Implementation.
- Wrote triggers, procedures, and functions for all subprograms.
- Involved in maintenance and designing and testing of all modules.
- Involved in developing various stored procedures and functions to retrieve usage.
- Calculated balances, generated invoices, processed payments, and performed other tasks using PL/SQL (see the sketch below).
- Created custom PL/SQL scripts to automate the billing system across bill cycles and custom tables; involved in the design and development of the billing database.
Environment: Win NT, Oracle, SQL, HTML, PL/SQL, SQL*Plus.
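A minimal PL/SQL sketch in the spirit of the billing routines above; the table, columns, and sign convention are hypothetical.
    -- Hypothetical billing helper: computes a customer's outstanding balance
    -- for one bill cycle from an assumed billing_txn table.
    CREATE OR REPLACE FUNCTION get_cycle_balance (
        p_customer_id IN NUMBER,
        p_cycle_id    IN NUMBER
    ) RETURN NUMBER IS
        v_balance NUMBER := 0;
    BEGIN
        SELECT NVL(SUM(CASE WHEN txn_type = 'CHARGE'  THEN amount
                            WHEN txn_type = 'PAYMENT' THEN -amount
                            ELSE 0 END), 0)
        INTO   v_balance
        FROM   billing_txn
        WHERE  customer_id = p_customer_id
        AND    cycle_id    = p_cycle_id;

        RETURN v_balance;
    END get_cycle_balance;
    /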