Informatica Netezza Lead Resume
Miami, FL
EXPERIENCE SUMMARY:
Over 10 years of Information Technology experience, primarily in Business Intelligence and Data Warehousing. While specializing in extract, transform and load (ETL) development, I also have experience with all aspects of the Data Warehousing development lifecycle: Enterprise Architecture, Dimensional Modeling, ODS, Star Schema, Snowflake Schema, Data Analysis, Data Integration, Data Migration, Data Profiling, Data Conversion and Reporting.
TECHNOLOGIES:
ETL Tools: Informatica PowerCenter 9.x, PowerExchange 9.x, Data Quality (IDQ) 9.x, DVO, Data Transformation Studio, Salesforce, MDM, SSIS
Database: Netezza, Oracle 12c/11g/10g/9.x with PL/SQL, Teradata, IBM DB2, MS SQL Server
GUI & Tools: TOAD, SQL Developer, Aginity for Netezza, IBM Data Studio, WinSQL, Teradata SQL Assistant, MS Data Tools, Visual Studio
Languages: UNIX shell scripting, SQL, PL/SQL, Java
Reporting Tools: OBIEE, MicroStrategy, SSRS
Modeling Tools: Embarcadero ER Studio, Erwin
Schedulers: Tivoli, Tidal, Control-M, Redwood
QC: HP Quality Center
Loaders: Oracle SQL*Loader/External Loader, Netezza Bulk Loader/Reader/Writer, Teradata load utilities and TPT (Teradata Parallel Transporter)
Operating System: UNIX, LINUX, Windows XP/NT/2000/98
INDUSTRY EXPERIENCE:
- Banking/Financial Industry
- Pharmaceutical/Healthcare Industry
- Transportation/Supply Chain/Logistics/Truck Rental
- Cruise/Hospitality Industry
- Telecom/Billing Industry
- Entertainment/Film Industry
- University/Education system
- Consulting Services
PROFESSIONAL EXPERIENCE:
Confidential, Miami, FL
Informatica Netezza Lead
Responsibilities:
- Design, architect, develop and implement enterprise data warehouse (EDW to RDW) solutions for fleet management, repair and maintenance, rental agreements and inspections, invoicing, credit memos, warranty, parts and pricing, servicing, and performance maintenance indicators.
- Design and develop ETL frameworks/solutions via Informatica mappings from mainframe, database and heterogeneous sources (SQL Server, flat files, mainframe, Oracle, Netezza, DB2, Salesforce), performing staging, standardization, transformation, ODS, conformed and junk dimensions, facts and fact-latest/MVs, aggregation/summarization, and creating views for reporting and downstream data needs.
- Create CDC processes for daily/incremental loads, historical conversion and ad-hoc loads for UAT testing.
- Perform data profiling and data modeling to evaluate the right model, applying denormalized and dimensional modeling (star and snowflake schemas) wherever applicable for the EDW.
- Gather business requirements, evaluate business needs, and prepare functional requirements documents with business rules.
- Perform data analysis and work estimation for the development, testing and deployment phases.
- Assign the right distribution key to Netezza tables so that large dimensions and facts are distributed on the same key and land on the same data slice, avoiding data skew, redistribution, slow loads and latency and yielding faster performance (see the distribution-key sketch after this list).
- Create data model by reverse engineering using Embarcadero ER Studio.
- Convert SSIS packages and SQL Server stored procedures into relational tables and Informatica mappings.
- Performance-tune ETL, database processes and long-running jobs for faster throughput, reading and loading via the Netezza bulk loader/reader/writer for very fast loads against Netezza sources and targets.
- Create mappings to perform upsert operations using IDLookup, Salesforce lookup and External ID fields in Salesforce, using PowerExchange for Salesforce with PowerCenter.
- Create Informatica DVO single/multiple table pairs to validate data across multiple databases, leveraging PowerCenter objects via Informatica DVO for testing/validation purposes.
- Use FTP connections to read directly from the mainframe, application connections for the Salesforce connector, and relational connections for ODBC and the Netezza bulk reader/writer to process data into Netezza targets.
- Perform data profiling and data analysis using Informatica Developer and Analyst.
- Process XML sources via the XML Parser and write to/generate single- or multi-view XML targets.
- Good knowledge of Informatica Data Transformation Studio for parsing unstructured data.
- Good knowledge of processing message queues in real-time scenarios by running the workflow continuously and constantly checking for message queues to be processed.
- Develop SCD Type I, II and III and junk dimension mappings with MD5 checksum logic, and facts with Type II implementation where needed (see the checksum sketch after this list). Also develop daily and monthly aggregate/summary facts for monthly reporting/aggregation.
- Lead, manage and implement project deliverables and mentor team members in developing efficient, reusable ETL and SQL processes.
- Develop intelligent ETL processes using dynamic lookups for scenarios with multiple changes to the same key within one data load.
- Create SSIS packages to read from relational and other sources and populate SQL Server targets, and create SSRS reports based on reporting needs.
- Create tables, DML, views, stored procedures and UNIX scripts for automation and archiving.
- Analyze reporting queries in MicroStrategy by executing reports in MicroStrategy Developer, extracting the underlying SQL, analyzing query performance and optimizing the ETL wherever applicable. Also create new data marts by profiling the results of existing legacy reports and convert them to new star schema data marts.
- Create users, groups and roles in Informatica Administrator.
- Good knowledge of the Web Service Consumer transformation and of processing message queues.
- Adjust/modify DTM buffer size and data block size with respect to source and target precisions to improve data loads and attain better performance.
- Extensive use of Informatica mapping variables, mapping parameters, session parameters, workflow variables, parameter files and built-in Informatica functions for various requirements, data cleansing, date modification, data standardization and CDC.
- Extensive use of Informatica transformations (Expression, Filter, Router, Joiner, Lookup, Union, Aggregator, Java, XML Parser and Generator, Sorter, Rank, Normalizer and others) as needed for different requirements.
- Work with business partners to understand requirements and create user stories; work together in performing SIT and UAT; and perform testing and deployment activities, implementation planning and the change management process.
- Create mapping specification, technical design, process flow and job run book documents.
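Below is a minimal sketch of the distribution-key approach, with hypothetical fleet tables; distributing the fact and the large dimension on the shared join key keeps matching rows on the same data slice, so joins run without redistribution or broadcast.

    -- Netezza DDL: distribute the large dimension and the fact on the shared join key
    CREATE TABLE DIM_VEHICLE (
        VEHICLE_KEY  INTEGER NOT NULL,
        VIN          VARCHAR(17),
        MODEL_DESC   VARCHAR(100)
    ) DISTRIBUTE ON (VEHICLE_KEY);

    CREATE TABLE FACT_RENTAL (
        RENTAL_ID    BIGINT NOT NULL,
        VEHICLE_KEY  INTEGER NOT NULL,
        RENTAL_AMT   NUMERIC(12,2),
        RENTAL_DATE  DATE
    ) DISTRIBUTE ON (VEHICLE_KEY);

    -- Joins on VEHICLE_KEY are now co-located on the same data slice,
    -- so this aggregation runs without redistributing either table
    SELECT d.MODEL_DESC, SUM(f.RENTAL_AMT)
    FROM   FACT_RENTAL f
    JOIN   DIM_VEHICLE d ON d.VEHICLE_KEY = f.VEHICLE_KEY
    GROUP  BY d.MODEL_DESC;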
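The checksum-driven SCD Type II pattern can be sketched roughly as follows; table and column names are illustrative, and the MD5_CHECKSUM column is assumed to be populated upstream by the ETL (the hash function itself varies by platform, e.g. Oracle's STANDARD_HASH).

    -- Step 1: expire the current row when the incoming checksum differs
    UPDATE DIM_CUSTOMER d
    SET    EFF_END_DT  = CURRENT_DATE - 1,
           CURRENT_FLG = 'N'
    WHERE  d.CURRENT_FLG = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   STG_CUSTOMER s
                   WHERE  s.CUSTOMER_ID  = d.CUSTOMER_ID
                   AND    s.MD5_CHECKSUM <> d.MD5_CHECKSUM);

    -- Step 2 (after the expire step): insert a new current version
    -- for changed keys (now lacking a 'Y' row) and brand-new keys
    INSERT INTO DIM_CUSTOMER
          (CUSTOMER_ID, CUST_NAME, MD5_CHECKSUM, EFF_START_DT, EFF_END_DT, CURRENT_FLG)
    SELECT s.CUSTOMER_ID, s.CUST_NAME, s.MD5_CHECKSUM,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   STG_CUSTOMER s
    LEFT JOIN DIM_CUSTOMER d
           ON d.CUSTOMER_ID = s.CUSTOMER_ID AND d.CURRENT_FLG = 'Y'
    WHERE  d.CUSTOMER_ID IS NULL;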
Environment: Informatica PowerCenter & PowerExchange 9.6.1, Informatica Developer 9.6, Informatica PowerExchange for Salesforce, Informatica DVO, Informatica Data Transformation Studio, Netezza, NZLoad, Aginity, DB2, Mainframe, WinSQL, Oracle, SQL Developer, SQL Server, SSIS, SSRS, UNIX, Embarcadero ER Studio, Redwood scheduler, MicroStrategy Developer
Confidential, Fort Lauderdale, FL
Informatica Senior Engineer
Responsibilities:
- Design, architect, develop and implement Global Credit Bureau Reporting for the US, Canadian and international markets.
- Develop ETL frameworks/solutions via Informatica mappings from mainframe, database and heterogeneous sources, performing staging, standardization, transformation, summarization and reporting to the respective bureaus.
- Perform data profiling and data modeling to evaluate the right model (normalized (3NF), denormalized or dimensional) for each market.
- Apply stringent business rules and regulatory requirements for consumer, corporate and commercial reporting so that bureau processing is performed with the highest precision.
- Performance-tune ETL and database processes and long-running jobs for faster throughput.
- Build UNIX shell scripts for automation and workflow/job execution via Control-M.
- Process XML sources via the XML Parser and write to/generate single- or multi-view XML targets.
- Build efficient ETL processes which can be leveraged by multiple markets.
- Lead and work independently within an Agile framework.
- Implement Informatica partitioning with hash auto-keys and sorting for better performance and to avoid processing duplicates across partitions.
- Develop intelligent ETL processes which can handle dynamic bureau formatting needs.
- Develop stored procedures for the override process.
- Extensive use of Informatica mapping variables, mapping parameters, session parameters, workflow variables, parameter files and built-in Informatica functions for different requirements, and perform CDC.
- Extensive use of Informatica transformations (Expression, Filter, Router, Joiner, Lookup, Union, Aggregator, Java, XML Parser and Generator, Sorter, Rank, Normalizer and others) as needed for different requirements.
- Create mappings/sources to read COMP-3 fields from VSAM/mainframe sources via PowerCenter.
- Perform star schema implementation and denormalization/pivoting as required.
- Good knowledge of HDFS cluster and Big Data concepts and sourcing via Informatica BDE.
- Work with business partners to understand requirements and create user stories, and work together in performing SIT and UAT.
- Create registration and extraction data maps to read from or write to PWX sources and targets via PWX application connections.
- Create extensively formatted files (Metro2, TUDF, variable-byte, XML) in Informatica for bureau requirements.
- Good use of analytical functions where needed to improve performance (RANK, ROW_NUMBER, DENSE_RANK, PARTITION BY, aggregate functions, FIRST_VALUE, LAST_VALUE and windowing); see the window-function sketch after this list.
- Good understanding of Teradata Parallel Transporter (TPT).
- Develop FastLoad, MultiLoad and BTEQ scripts and use the Informatica FastLoad utility against Teradata targets.
- Perform bulk loading and column indexing to yield better performance, and gather statistics once jobs complete.
- Create Data Partitioned Secondary Indexes (DPSI) on partitioned tables to avoid the contention issues non-partitioned indexes (NPIs) can cause (see the DPSI sketch after this list).
- Perform data profiling, data standardization and parsing, and use the Address Validator to eliminate duplicates.
- Perform pushdown optimization in Informatica.
- Perform data standardization, data validation and data profiling.
- Lead and implement deployment activities and the change management process, and create mapping specifications, process flow documents and implementation plans.
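As an illustration of the analytic-function pattern (hypothetical table), ROW_NUMBER can pick the latest trade line per account without a self-join, which generally performs better on large bureau extracts:

    -- Keep only the most recent reporting row per account
    SELECT account_id, report_dt, balance_amt
    FROM (
        SELECT account_id, report_dt, balance_amt,
               ROW_NUMBER() OVER (PARTITION BY account_id
                                  ORDER BY report_dt DESC) AS rn
        FROM   trade_line_history
    ) t
    WHERE rn = 1;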
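A hedged sketch of the DPSI approach in DB2 for z/OS syntax, with illustrative names; a data-partitioned secondary index is physically partitioned like its table, avoiding the contention a non-partitioned index can cause during partition-level utilities:

    -- Range-partitioned table (partitioned by reporting date)
    CREATE TABLE TRADE_LINE (
        ACCOUNT_ID   BIGINT  NOT NULL,
        REPORT_DT    DATE    NOT NULL,
        BUREAU_CD    CHAR(2)
    )
    PARTITION BY RANGE (REPORT_DT)
      (PARTITION 1 ENDING AT ('2016-03-31'),
       PARTITION 2 ENDING AT ('2016-06-30'),
       PARTITION 3 ENDING AT ('2016-09-30'));

    -- DPSI: the PARTITIONED keyword aligns index partitions with table partitions
    CREATE INDEX IX_TRADE_ACCT
        ON TRADE_LINE (ACCOUNT_ID)
        PARTITIONED;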
Environment: Informatica Power Center & Power Exchange 9.6.1, Informatica Developer 9.6, DB2 11, Teradata, Teradata SQL Assistant, Mainframe, UNIX
Confidential, Atlanta, GA
Informatica Data Migration Specialist
Responsibilities:
- Participate in the migration analysis, design, development, testing and deployment phases of the data migration project.
- Manage and lead design, development and deployment tasks for a team of ten resources and client team members, completing deliverables, project milestones, change requests and support items.
- Create level-of-effort estimates for development tasks, the project plan and the deployment plan; monitor and track status on development items, production releases, issues and risks; report consolidated status to senior management; and create change requests.
- Design ETL Architecture and data flow process for Vision single biller system.
- Design and develop SCD Type I, II and III per business requirements
- Design and implement CDC processes for ad-hoc, daily and weekly jobs/loads.
- Design and implement code reusability by using the same mapping and session in multiple workflows (for different regions) with parameter files, executed via pmcmd.
- Create and maintain users, groups, roles, privileges, folders and database connections; perform repository maintenance, administration and Informatica support tasks; and propagate Velocity methodologies and best practices.
- Utilize user-defined and built-in session parameters for parameterization, code reusability and metadata/statistics reporting.
- Architect and develop audit processing of data loads for verification, reconciliation and metrics operations.
- Implement performance tuning in mappings, sessions, transformations, workflows and SQL queries for faster throughput, utilizing indexes, partitioned tables, FULL/USE_HASH hints and related techniques (see the hash-join hint sketch after this list).
- Develop and implement mappings utilizing the PowerExchange (PWX) connector to source IBM mainframe/DB2 tables; install the PWX connector on the PowerCenter node and a Listener for each partitioned DB2 system.
- Design and develop partitioned tables, list or range on key value/date attributes, for fast query results, ETL throughput and enhanced performance (see the partitioning sketch after this list).
- Develop UNIX shell scripts to execute wrappers, file lists and archiving processes.
- Develop FastLoad, MultiLoad and BTEQ scripts and use the Informatica FastLoad utility to load empty Teradata targets, MultiLoad for upserts and TPump for frequent updates.
- Develop and tune long-running jobs/stored procedures via SQL tuning, query optimization, indexes, hints, early data filtering and removal of unnecessary sorting; leverage hash joins or nested loops wherever applicable.
- Develop stored procedures to gather statistics/reports from metadata tables/views for repository objects and maintenance.
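A short illustration (hypothetical tables and aliases) of the hash-join hint mentioned above; USE_HASH can help when two large sets are joined and the optimizer would otherwise choose nested loops:

    SELECT /*+ USE_HASH(s t) */
           s.BILL_ID, t.REGION_NM
    FROM   STG_BILLING s
    JOIN   REGION_DIM  t
      ON   t.REGION_CD = s.REGION_CD;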
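The range-partitioning approach can be sketched in Oracle as follows (illustrative names); partition pruning on the date key confines queries and incremental ETL reads to the relevant partitions:

    -- Range-partition the billing fact by quarter on the key date column
    CREATE TABLE BILLING_FACT (
        BILL_ID     NUMBER       NOT NULL,
        REGION_CD   VARCHAR2(4)  NOT NULL,
        BILL_DT     DATE         NOT NULL,
        BILL_AMT    NUMBER(12,2)
    )
    PARTITION BY RANGE (BILL_DT) (
        PARTITION P2013_Q1 VALUES LESS THAN (TO_DATE('2013-04-01','YYYY-MM-DD')),
        PARTITION P2013_Q2 VALUES LESS THAN (TO_DATE('2013-07-01','YYYY-MM-DD')),
        PARTITION PMAX     VALUES LESS THAN (MAXVALUE)
    );

    -- A local index follows the partitioning, keeping partition maintenance cheap
    CREATE INDEX IX_BILLING_REGION ON BILLING_FACT (REGION_CD) LOCAL;

    -- This predicate prunes to a single partition instead of scanning the table
    SELECT SUM(BILL_AMT)
    FROM   BILLING_FACT
    WHERE  BILL_DT >= DATE '2013-05-01' AND BILL_DT < DATE '2013-06-01';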
Environment: Informatica PowerCenter & PowerExchange 9.5.1, Solaris platform, Oracle 11g, PL/SQL, Toad 11.6, SQL Developer, IBM Mainframe z/OS, DB2, Teradata, Teradata SQL Assistant, HP Mercury QA 10.0, UNIX, Data Studio, Mochasoft, Navigator
Confidential, Rockaway, NJ
Lead Informatica Data Quality Designer
Responsibilities:
- Participate in requirement gathering, analysis, architecture, design, development, testing and deployment across the entire lifecycle of the Data Quality and Data Integration environments.
- Create and maintain users, groups, roles, privileges, folders, database connections and nodes; administer the PowerCenter and Data Quality repositories and services, including the PowerCenter Repository and Integration Services and the Model Repository, Data Integration, Analyst and Content Management Services.
- Design strategies and rules/mapplets for cleansing, standardization, matching and consolidation of reference data using IDQ.
- Perform gap analysis to identify anomalies and form patterns during data profiling.
- Create and maintain Informatica Data Quality PDOs, profiles, reference tables and scorecards.
- Participate in and assist with the data steward process.
- Architect, design and develop the Customer Registry data flow process for HCPs, addresses, phone numbers and state licenses.
- Create mappings and mapplets utilizing mapplets/rules and reference tables required for cleansing, standardization, matching, merging and consolidation, using data quality transformations (Case Converter, Standardizer, Parser, Labeler, Merge, Decision, Key Generator, Match, Consolidation, Address Validator/accelerators) alongside PowerCenter/Data Integration transformations.
- Create mappings, sessions and workflows to maintain and load the staging and ODS layers using PowerCenter transformations, involving Type I, Type II and truncate-and-load tables leveraging CDC checksum numbers.
- Integrate IDQ mapplets with Informatica PowerCenter mappings and schedule them as part of the enterprise workflow.
- Good knowledge of Informatica MDM, match-and-merge to the golden customer record, and Data Studio.
- Maintain database schemas, users, roles, privileges and tablespaces; create and maintain tables, views, synonyms, stored procedures and packages.
- Tune queries and ETL processes for better performance and faster throughput in both the database and ETL layers.
- Create shell scripts to process, automate and archive files and ETL processes.
- Manage and lead onsite and offshore team members in providing solutions, following the defect resolution process and holding daily and weekly calls; provide status reports to senior management, ensure deliverables are completed on time, and flag risks/enhancements ahead of time when needed.
- Create level-of-effort estimates for project deliverables, enhancements and support items.
- Create high-level and detailed project plans covering the design, development, testing, UAT and deployment phases of project deliverables.
- Participate in and assist senior management in creating RFPs.
- Create mapping specification, data lineage, data model and technical design documents.
- Perform unit testing, integration testing and UAT; create test case, test script and test plan documents covering positive and negative scenarios.
Environment: Informatica PowerCenter & Informatica Data Quality 9.5.1, Windows platform, Oracle 11G, PL/SQL, Toad 10.6, SQL Developer, Siebel CRM & Siebel UCM 8.0, Erwin data modeler r7.3, HP Mercury QA 10.0, Windows 7
Confidential, Danbury, CT
Informatica and Data Quality Lead
Responsibilities:
- Participate in requirement gathering, analysis, design, development, testing and migration (deployment to production) across the entire lifecycle of the PMDW enterprise data warehouse environments.
- Create conformed dimensions which can be leveraged across the EDW.
- Understand the PL/SQL procedures that currently populate the EDW and convert them to ETL mappings for formulary, tier, product, plan and payer data covering physician and organizational data.
- Perform gap analysis, participate in business requirements gathering, and prepare ETL specification and technical design documents.
- Create scorecards and data profiles, and find data anomalies and patterns in reference data.
- Create IDQ rules/mapplets and reference tables for standardization and cleansing routines.
- Work with data stewards to integrate standardization, anomalies and patterns of reference data.
- Create mappings leveraging Case Converter, Standardizer, Parser, Labeler, Merge, Decision, Key Generator, Match, Consolidation and data integration transformations.
- Ensure the right cluster groups are associated for grouping, matching and consolidation, and fine-tune the performance of the ETL process.
- Develop Type 1, Type 2 and Type 3 SCD mappings.
- Create an Informatica FastLoad utility connection and load into Teradata targets.
- Good experience using the Teradata utilities FastLoad, MultiLoad, TPump and BTEQ scripts (see the MERGE sketch after this list).
- Create mapplets and leverage Lookup, Expression, Aggregator, Joiner, SQL, Filter, Router, External Stored Procedure and other transformations in mappings wherever needed.
- Good understanding of Informatica Data Services.
- Manage and lead onshore and offshore teams; conduct design reviews, defect checks and resolution, and daily status meetings, and send weekly status reports to the PMO team.
- Create staging and ODS layers and populate external tables (objects) so that information can be processed into the Veeva (Salesforce) cloud using Informatica Salesforce connectors.
- Leverage the Informatica scheduler to process daily CDC loads.
- Configure the OBIEE repository and build joins and relationships so that reports and segment trees run successfully.
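A hedged ANSI MERGE sketch (illustrative names, assuming PLAN_ID is the target's primary index) of the upsert logic the MultiLoad step performs; in BTEQ the same pattern is often written as the classic UPDATE ... ELSE INSERT pair:

    MERGE INTO EDW.PLAN_DIM AS tgt
    USING STG.PLAN_STG AS src
       ON tgt.PLAN_ID = src.PLAN_ID
    WHEN MATCHED THEN UPDATE
         SET PLAN_NAME = src.PLAN_NAME,
             TIER_CD   = src.TIER_CD,
             UPDT_TS   = CURRENT_TIMESTAMP
    WHEN NOT MATCHED THEN INSERT
         (PLAN_ID, PLAN_NAME, TIER_CD, UPDT_TS)
         VALUES (src.PLAN_ID, src.PLAN_NAME, src.TIER_CD, CURRENT_TIMESTAMP);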
Environment: Informatica PowerCenter & Data Quality 9.5.1, Informatica Salesforce Connector, PowerExchange, Oracle 11gR2, PL/SQL, SQL Server, Veeva (Salesforce), Teradata 12, Teradata SQL Assistant, UNIX platform (AIX), Toad 10.6, SQL Developer, OBIEE 10.1, Siebel CRM & Siebel UCM 8.0, Erwin Data Modeler r7.3, HP Mercury QA 10.0, Windows XP
Confidential, Miramar, FL
Data Warehouse Team Lead
Responsibilities:
- Involved in requirement analysis, design, development, implementation, testing and migration (deployment to production) across the entire lifecycle of the CEM, CMA, UIW and Guest Revenue enterprise data warehouse environments.
- Involved in creating requests for proposal (RFPs) for client requirements.
- Experienced in Informatica 6/7/8.6/9.1 development and administration.
- Experienced in administering, creating and maintaining the Siebel Analytics 7.8 repository and running reports, ad-hoc queries, segment trees and list outputs.
- Experienced in synchronizing customer data across multiple databases via MDM.
- Experienced in Dimensional Modeling, ETL Architecture and reporting/analytics of Data Warehouse lifecycle.
- Manage onshore and offshore teams/resources to develop, test, deploy and continue client work operations and provide ETL solutions.
- Created business requirement documents, functional specifications, technical design documents, traceability matrix, test scripts and UAT scripts.
- Involved with senior management and practiced project planning, progress tracking, project budgets, client rapport and mentoring team members.
- Conducted business and technical workshop sessions proposing design and architecture review involving Business, IT and PMO team.
- Created space estimates for new requirements and warehouse maintenance and proposed design solutions. Ran statistics on schema tables for performance and regular maintenance, and checked that indexes remained valid and operational for query performance.
- Created designs to maintain facts and summaries that capture (insert) and process only changed information, and to archive and drop partitions when needed.
- Partitioned facts and summaries on key attribute columns and created indexes on attributes used heavily by the marketing and data warehouse teams, helping achieve faster data retrieval, quicker report execution and higher throughput for ETL loads.
- Performed Informatica migrations using shared objects, shortcuts, mapplets, reusable transformations, mappings, sessions, command tasks, worklets and workflows.
- Experienced in real-time ETL processing via IBM WESB message queues: parsed CLOB-format sources with the XML Parser transformation, adhering to XML XSD structures and a canonical format generated via stored procedures and invoked by the real-time IBM Data Mirror integration service through the Enterprise Service Bus (ESB), and processed the information into dimensions and facts.
- Performed dimensional modeling using a star schema following Ralph Kimball methodologies and created data models per subject area using Erwin Data Modeler.
- Good understanding of Big Data concepts.
- Designed and deployed slowly changing dimensions in Type I, Type II, Type III and Type IV Dimensions utilizing checksum numbers for detecting change and maintaining history.
- Created staging areas and an operational data store to maintain transactional data from heterogeneous sources while pushing to the warehouse; this helps achieve faster loads and high query performance and reduces data integrity and data governance issues.
- Mentored and managed onsite and offshore teams and client employees in ETL design, development and implementation.
- Solid experience in performance tuning of SQL queries, long-running ETL jobs and long-running queries/reports: modifying ETL routines (mappings, sessions, transformations), utilizing indexes and performance hints, filtering unnecessary data, avoiding full table scans where required, and other approaches.
- Implemented and led the change management process, creating request-for-change tickets for projects and support maintenance, including deployment tasks, implementation plans, security plans and test plans, and following change management migration protocols and standards.
- Parsed XML sources against the XSD design layout and processed the data into tables.
- Created Stored Procedures and Packages required for ETL operations.
- Created Shell Scripts to process files, create list files, concatenate objects and archive files.
- Created FTP jobs utilizing the FTP mechanism to transfer flat files by subject area from Informatica target file directories to destination servers.
- Imported new data sources into the physical layer, defined joins and conditions between dimensions and facts for new subject areas in the business model and mapping layer, and created the conceptual model in the presentation layer in OBIEE.
- Created and maintained the OBIEE repository and defined joins in the business model and mapping layer used across various subject areas.
- Used OBIEE Answers for reporting and marketing analytics.
- Created BQY queries, reports and pivot reports for the Guest Revenue and CRM projects using Hyperion 9.3; performed testing by presenting reports, BQYs and queries to the business and IT teams in SIT and UAT.
Environment: Informatica PowerCenter 9.0.1, MDM, Oracle 11gR2, PL/SQL, SQL Server, DB2, AS/400 reservation system, UNIX platform (AIX), Toad 10.6, SQL Developer, Tivoli, OBIEE 10.1, Hyperion 9.3, Siebel Analytics 7.8, Siebel CRM & Siebel UCM 8.0, Erwin Data Modeler r7.3, HP Mercury QA 10.0, Windows 7
Confidential, Charlotte, NC
Netezza ETL and Testing Lead
Responsibilities:
- Involved in requirement analysis, design, development, implementation, testing and migration (deployment to production) across the entire lifecycle of the Sales Channel enterprise data warehouse environments.
- Good understanding of the Netezza TwinFin architecture.
- Designed and deployed slowly changing dimensions (Type 1 and Type 2), facts and static dimensions using the NZLoad utility.
- Performed unit, integration and system testing, processing data for multiple divisions, verifying results against business requirements, logging defects as they were detected and following the defect resolution process to fix issues.
- Used the WinSQL and Aginity Netezza SQL applications for data analysis, testing and implementation tasks.
- Created the SQL scripts required to process requirements and scheduled the jobs for batch processing via the Tidal scheduling tool.
- Created unit and UAT test scripts to support the unit and UAT phases of testing.
- Used environment variables in Netezza SQL scripts to dynamically change database instances and settings.
- Used configuration files to maintain environment variables, server names, settings and user ID/password credentials at the ETL owner level.
- With environment variables and configuration files, the Netezza scripts could point to the dev, test and prod instances without code modifications (see the nzsql sketch after this list).
- Created technical design documents and deployment/implementation plans for migration to the production environment.
- Created initial load scripts and scheduled batch loads to run daily once the historical loads had processed successfully.
- Managed onshore and offshore resources to develop and test IT solutions.
- Created LOE (level of effort) models for the work assignment tasks, maintained and deployed tasks per project plan.
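A sketch of the environment-driven scripting described above, assuming the scripts run through nzsql and that its psql-style -v switch is available for :variable substitution; the flag names and variables shown are illustrative:

    -- sales_load.sql: the target database arrives via nzsql -v TGT_DB=...
    INSERT INTO :TGT_DB..SALES_CHANNEL_FACT (CHANNEL_CD, DIVISION_CD, SALES_AMT)
    SELECT CHANNEL_CD, DIVISION_CD, SUM(SALES_AMT)
    FROM   :TGT_DB..SALES_CHANNEL_STG
    GROUP  BY CHANNEL_CD, DIVISION_CD;

    -- invocation, with values sourced from the environment/configuration file:
    --   nzsql -host $NZ_HOST -d $NZ_DB -u $NZ_USER -pw $NZ_PASS \
    --         -v TGT_DB=$TGT_DB -f sales_load.sql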
Environment: Netezza 8, NZLoad, Oracle 11gR2, PL/SQL, UNIX platform (AIX), WinSQL, Aginity, Toad 10.6, SQL Developer, Tivoli, Windows XP, Microsoft Office Suite, HP Mercury QA 10.0
Confidential, Miramar, FL
ETL Team Lead
Responsibilities:
- Designed and deployed the overall CDC (change data capture) process for fact and summary loads in a large (3 TB) marketing data mart; the process uses control tables and parameter files for PowerCenter sessions (see the CDC sketch after this list).
- Designed and deployed processes for Type I, Type II, Type III and Type IV dimensions, utilizing checksum numbers to detect change and maintain history.
- Mentored and managed onsite and offshore teams and client employees in ETL design, development and implementation.
- Implemented dimensional modeling by creating dimensions and facts using a star schema.
- Designed, developed and implemented PowerCenter mappings, sessions, worklets, workflows, and email, decision, event wait and command tasks.
- Used mapping parameters, mapping variables, session parameters and parameter files on different mappings and sessions, based on various business requirements.
- Developed mappings, sessions which leveraged CDC dates in control tables to process daily incremental loads.
- Extensively created and used mapplets, worklets and Reusable Transformations to prevent redundancy of transformation usage and maintainability.
- Migrated ETL processes from Test to UAT using object and deployment groups.
- Created standards for receiving flat file extracts from various vendors to ensure consistent and reusable processes for loading inbound data.
- Developed and maintained summaries and snapshots of data utilizing Oracle materialized views with fast and full refresh strategies (see the materialized-view sketch after this list).
- Created Tables, Partitioned Tables and Materialized Views for faster data accessing and reporting needs.
- Created Stored Procedures, functions, triggers and UNIX scripts for various business requirements.
- Created QA Test Plan, Test Cases and Test Scripts for Unit testing, System Integration Testing and User Acceptance Testing. Executed multiple tests in QA Test lab for integration testing, created defects, identified defect root cause and resolution to perform the fix.
- Executed campaigns through Siebel Analytics for various business needs via the Answers application.
- Created individual subject areas, imported tables, defined joins and created hierarchies and levels through RPD files, and executed reports for various business requirements through Siebel Analytics (OBIEE).
- Good understanding of Master Data Management (MDM).
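A minimal sketch (illustrative names) of the control-table CDC pattern: the extract reads only rows stamped after the last successful run, and the watermark advances once the load commits. In PowerCenter the cutoff values would typically arrive via the parameter file:

    -- Source-side extract: read only rows after the last successful run
    SELECT o.ORDER_ID, o.ORDER_AMT, o.UPDATE_TS
    FROM   ORDERS_SRC o
    WHERE  o.UPDATE_TS >  (SELECT LAST_LOAD_TS
                           FROM   ETL_CTL
                           WHERE  JOB_NAME = 'FACT_ORDERS_DAILY')
    AND    o.UPDATE_TS <= CURRENT_TIMESTAMP;

    -- After the load commits, advance the watermark
    -- (a real run captures a single cutoff value up front so both
    --  statements agree and no rows fall into the gap between them)
    UPDATE ETL_CTL
    SET    LAST_LOAD_TS = CURRENT_TIMESTAMP
    WHERE  JOB_NAME = 'FACT_ORDERS_DAILY';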
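An Oracle sketch (hypothetical names) of the fast-refresh materialized-view approach; fast refresh requires a materialized view log on the base table, with full refresh kept as the fallback:

    -- Materialized view log captures changes on the base fact table
    CREATE MATERIALIZED VIEW LOG ON SALES_FACT
      WITH SEQUENCE, ROWID (PRODUCT_KEY, SALE_AMT)
      INCLUDING NEW VALUES;

    -- Aggregate summary that refreshes incrementally from the log;
    -- COUNT(*) and COUNT(SALE_AMT) are required for fast refresh of SUM
    CREATE MATERIALIZED VIEW MV_SALES_BY_PRODUCT
      BUILD IMMEDIATE
      REFRESH FAST ON DEMAND
    AS
    SELECT PRODUCT_KEY,
           SUM(SALE_AMT)   AS TOT_SALES,
           COUNT(SALE_AMT) AS CNT_SALES,
           COUNT(*)        AS ROW_CNT
    FROM   SALES_FACT
    GROUP  BY PRODUCT_KEY;

    -- Nightly refresh call:
    --   EXEC DBMS_MVIEW.REFRESH('MV_SALES_BY_PRODUCT', 'F');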
Environment: Informatica PowerCenter 8.6.1, Oracle 11gR2, PL/SQL, Unix platform (AIX), Toad 10.5, SQL Developer, Tivoli, Windows XP, Microsoft Office Suite, HP Mercury QA 10.0, Siebel Analytics, Siebel, UCM.
Confidential
Data warehouse Developer
Responsibilities:
- Involved in data analysis, design and development, business requirements gathering and identification of business rules for the data warehouse environment.
- Managed security by assigning user permissions for various folders.
- Tuned existing mappings and SQL queries to improve performance and throughput.
- Designed and developed ETL routines using Informatica PowerCenter. Within Informatica mappings, made extensive use of Lookup, Aggregator and Rank transformations, connected and unconnected lookups and stored procedures, SQL overrides in lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
- Developed Type 1 and Type 2 mappings.
- Used mapping parameters and mapping variables embedded in parameter files.
- Worked on creating the Xref (cross-reference) tables for conversion purposes.
- Used bulk mode with database partitioning to obtain maximum throughput and faster loads.
- Implemented partitioning and query optimization for performance enhancements.
- Experienced in writing parameter files and using the variables in Informatica mappings.
- Performed Unit Testing & Integration Testing.
- Created tables, views, stored procedures, functions and triggers, and developed shell scripts to automate flat file processes.
Environment: Informatica PowerCenter 7.1.3, DB2, SQL Server, SAP legacy systems, IBM AIX 5L (UNIX), Rapid SQL, Control-M, Windows XP, Microsoft Office Suite.
Confidential
Graduate Assistant (Oracle Developer)
Responsibilities:
- Interacted with end users to gather requirements.
- Created tables, views, synonyms and sequences.
- Optimized queries for better performance.
- Created database stored procedures, functions, triggers and packages.
- Performed capacity planning required to create and maintain database environments.
- Designed and developed web pages using ASP and SQL.
- Generated reports using Crystal Reports 8.0.
- Performed SQL tuning on existing reports.
- Modified existing forms and reports per enhancement requests.
- Extensively used Database triggers, PL/SQL procedures, packages and functions while developing the forms and reports.
- Database maintenance: creation and management of tablespaces, data files, indexes, rollback segments and tables.