ETL Lead Consultant Resume
Jersey City, NJ
SUMMARY
- 10+ years of IT experience with expertise in analysis, design, development, testing, and implementation of data warehouses and data marts, using ETL/OLAP tools including Informatica PowerCenter v9.x/8.x/7.x, DataStage, Oracle 10g/9i/8i, SQL Server, and Teradata.
- Extensive experience in the pharmacy, healthcare, banking, insurance, trading, and retail domains.
- Ability to analyze source systems and business requirements, identify and document business rules, and design data architecture.
- Extensive knowledge of dimensional data modeling: star schema and snowflake schema modeling, fact and dimension tables, and physical and logical data modeling using the Erwin data modeling tool.
- Worked with dimensional data warehouses in star and snowflake schemas; created slowly changing dimension (SCD) Type I/II/III mappings (the Type II pattern is sketched after this summary).
- Experienced with the onsite/offshore delivery model and in leading medium-sized teams.
- Experience in complete life cycle Implementation of data warehouse.
- Strong experience with Data Migration/Acquisition and Integration projects.
- Experience in OLTP/OLAP system study, analysis, and ER modeling; developed database schemas such as star and snowflake schemas for relational and multidimensional modeling using Erwin and Visio.
- Experience with the Informatica Data Quality (IDQ) and Master Data Management (MDM) products.
- Extensive experience with Informatica Data Quality (IDQ) components: data profiling, data validation, and data cleansing.
- Extensive experience in implementing CDC using Informatica Power Exchange 8.x/7.x.
- Extensively worked with Oracle PL/SQL stored procedures, triggers, functions, and packages, and was also involved in query optimization.
- Expertise in Administration tasks including Importing/Exporting mappings, copying folders over the DEV/QA/PRD environments, managing Users, Groups, associated privileges and performing backups of the repository.
- Experience in scheduling batch jobs using Workflow Monitor, DAC, Tivoli TWS, Autosys, and FTP methods.
- Developed complex mappings using varied transformation logic: Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Union, Update Strategy, and more.
- Experience in Python scripting.
- Expertise in using UNIX and writing Perl and UNIX shell scripts.
- Experience in performance tuning of sources, targets, mappings, transformations, and sessions by applying techniques such as partitioning and pushdown optimization, and by identifying performance bottlenecks.
- Good exposure to Development, Testing, Debugging, Implementation, Documentation, End-user training and Production support.
- Expertise in unit testing, integration testing, system testing, and data validation for developed Informatica mappings.
- An excellent team member able to work independently, with good interpersonal skills, strong communication skills, a strong work ethic, and a high level of motivation; a quick learner with an aptitude for taking on responsibility.
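The SCD Type II pattern referenced above, sketched minimally in Python with SQLite standing in for the warehouse; the customer_dim table and its columns are hypothetical, and the production implementations used Informatica SCD mappings rather than hand-written code.

```python
import sqlite3
from datetime import date

def apply_scd2(conn, customer_id, new_city, load_date):
    """SCD Type II: expire the current row and insert a new version on change."""
    cur = conn.cursor()
    cur.execute(
        "SELECT city FROM customer_dim WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    )
    row = cur.fetchone()
    if row and row[0] == new_city:
        return  # no change in the tracked attribute; nothing to do
    if row:
        # Type II keeps history: close out the existing version instead of overwriting
        cur.execute(
            "UPDATE customer_dim SET end_date = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (load_date, customer_id),
        )
    cur.execute(
        "INSERT INTO customer_dim (customer_id, city, effective_date, end_date, is_current) "
        "VALUES (?, ?, ?, NULL, 1)",
        (customer_id, new_city, load_date),
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE customer_dim (customer_id INTEGER, city TEXT, "
        "effective_date TEXT, end_date TEXT, is_current INTEGER)"
    )
    apply_scd2(conn, 42, "Jersey City", str(date(2015, 1, 1)))
    apply_scd2(conn, 42, "Santa Ana", str(date(2016, 6, 1)))  # creates version 2
    for r in conn.execute("SELECT * FROM customer_dim ORDER BY effective_date"):
        print(r)
```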
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 9.6/9.5/9.1/8.x/7.x, Informatica PowerExchange 9.5, IDQ, ILM, MDM, DataStage.
Databases: Oracle 11g/10g/9i/8i, SQL Server 2008 R2/2000, Teradata 13.0, Sybase, DB2, MS Access 7.0/97/2000.
Scripting Languages: SQL, PL/SQL, Python, UNIX Shell Scripting.
Operating Systems: UNIX, Windows 7/XP/2003/2008, Linux
Business Intelligence Tools: OBIEE, Qlikview
Scheduling Tools: DAC, Tivoli
DB Tools: SQL Server Management Studio, TOAD, SQL Navigator, SQL*Plus, SQL*Loader, Teradata SQL Assistant 13.10, DBArtisan
Data Modeling Tools: MS Visio, ERwin
PROFESSIONAL EXPERIENCE
Confidential
ETL Lead Consultant
Responsibilities:
- Working as an application architect implementing an end-to-end migration project.
- Analyzing the project's existing architecture and preparing design documents.
- Interacted with business users to gather the requirements that needed to be addressed during migration.
- Created logical and physical data models using CA Erwin Data Modeler, based on business requirements and on the existing architecture.
- Leading a medium-sized offshore team.
- Involved in validating the VMS configuration process between the existing architecture and the HRA 2.0 architecture.
- Successfully loaded data from the source SQL Server database into staging tables and then into the target ODS.
- Developed various Mappings, Mapplets, and Transformations for data marts and Data warehouse.
- Interacted with business users on the design of technical specification documents and obtained sign-off before development started.
- Responsible for monitoring all the sessions that are running, scheduled, completed and failed.
- Debugged the mapping of the failed session.
- Analyzed and created fact and dimension tables after the data was loaded.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Validated data between the existing process and the new process.
- Used DAC to schedule ETL jobs, along with generation of dynamic parameter files.
- Analyzed the database and provided detailed information on its operation and forecasts for future years.
- Involved in data governance, data cleansing, and data profiling using the capabilities of IDQ.
- Supporting the HRA production team both technically and functionally.
- Implementing enhancements and handling deployment through to production.
- Involved in Unit and Integration testing of Mappings and sessions.
- Supporting resolution of production issues.
Environment: Informatica 9.6, Oracle 11g, IDQ, MDM, OBIEE, DAC
Confidential, Jersey City, NJ
Informatica Consultant
Responsibilities:
- Was responsible for creating the Teradata Financial Services Logical Data Model architecture, followed by the physical, semantic, and presentation layers.
- Interacted with business users in gathering the requirements in data usage and report generation.
- Created the Informatica repository for implementation of the application.
- Involved in creating logical and physical data models using CA Erwin Data Modeler based on business requirements.
- Led a medium-sized team.
- Successfully loaded data from the source SQL Server database into staging tables, then into the target ODS, and on to the target Oracle database.
- Developed various Mappings, Mapplets, and Transformations for data marts and Data warehouse.
- Worked as a business analyst responsible for gathering requirements and IT review; interacted with business users on the design of technical specification documents.
- Re-designed ETL mappings to improve data quality.
- Responsible for monitoring all the sessions that are running, scheduled, completed and failed. Debugged the mapping of the failed session.
- Analyzed and Created Facts and Dimension Tables.
- Used mapplets and reusable transformations to avoid redundant transformation logic and to promote modularity.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Delivered projects within the specified time, ensuring completion of requirements gathering, business analysis, and design of the data marts.
- Used data quality processes for validation, cleansing, and profiling.
- Developed schedules for daily batches and used email tasks to flag session failures via the post-session failure email option.
- Involved in data governance, data cleansing, and data profiling using the capabilities of IDQ.
- Designed, developed, and implemented a Master Data Management solution for the Customer/Party MDM domain using MDM 9.6.1.
- Developed MDM Hub match and merge rules, batch jobs, and batch groups.
- Prepared Python scripts to parse XML documents and load the data into a relational database (see the sketch after this list).
- Prepared Python scripts to retrieve online query and web-based applicant data from customers.
- Involved in Unit and Integration testing of Mappings and sessions.
- Assisted Testing team in creating test plan and test cases.
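A minimal sketch of the XML-to-relational loading mentioned above, using only the Python standard library with SQLite standing in for the target database; the XML layout, table, and column names are hypothetical.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML layout standing in for the real source documents
SAMPLE_XML = """
<applicants>
  <applicant id="1"><name>Alice</name><state>NJ</state></applicant>
  <applicant id="2"><name>Bob</name><state>MD</state></applicant>
</applicants>
"""

def load_applicants(xml_text, conn):
    """Parse applicant records from XML and insert them into a staging table."""
    root = ET.fromstring(xml_text)
    rows = [
        (int(a.get("id")), a.findtext("name"), a.findtext("state"))
        for a in root.iter("applicant")
    ]
    conn.executemany(
        "INSERT INTO stg_applicants (applicant_id, name, state) VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE stg_applicants (applicant_id INTEGER, name TEXT, state TEXT)"
    )
    print(load_applicants(SAMPLE_XML, conn), "rows loaded")
```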
Environment: Informatica 9.6, Teradata 13.0, IDQ, Python, DAC
Confidential, Santa Ana
Informatica Lead Consultant
Responsibilities:
- Interacted with business users in gathering the requirements in data usage and report generation.
- Involved in creating the Informatica repository architecture following the Kimball model.
- Created and monitored Database maintenance plans for checking database integrity, data optimization, rebuilding indexes and updating statistics.
- Acted as a team lead for development team.
- Successfully loaded data from various source systems (Oracle database, flat files, XML files, SQL Server, etc.) into staging tables and then into the target Oracle database.
- Developed various Mappings, Mapplets, and Transformations for data marts and Data warehouse.
- Worked as a business analyst responsible for gathering requirements and IT review; interacted with business users on the design of technical specification documents.
- Re-designed ETL mappings to improve data quality.
- Created Stored procedure transformations to populate targets based on business requirements.
- Responsible for monitoring all the sessions that are running, scheduled, completed and failed. Debugged the mapping of the failed session.
- Used Pipeline Partitioning feature in the sessions to reduce the load time.
- Analyzed and Created Facts and Dimension Tables.
- Used Informatica features to implement Type I, II, and III changes in slowly changing dimension tables.
- Created Data Breakpoints, Error Breakpoints for debugging the mappings using Debugger Wizard.
- Used mapplets and reusable transformations to avoid redundant transformation logic and to promote modularity.
- Developed pre- and post-session shell scripts that create the parameter file dynamically (see the sketch after this list).
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
- Delivered projects within the specified time, ensuring completion of requirements gathering, business analysis, and design of the data marts.
- Involved in Unit and Integration testing of Mappings and sessions.
- Developed Schedules for daily and weekly batches using Unix Maestro.
- Prepared ETL mapping specification document.
- Assisted Testing team in creating test plan and test cases.
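A minimal sketch of the dynamic parameter-file generation mentioned above, shown in Python for illustration (the originals were shell scripts); the folder, workflow, and parameter names are hypothetical, though the [Folder.WF:workflow] section layout follows the PowerCenter parameter-file format.

```python
from datetime import date, timedelta

def write_param_file(path, run_date):
    """Generate a PowerCenter parameter file ahead of the session run."""
    prev_day = run_date - timedelta(days=1)
    lines = [
        "[SALES_DM.WF:wf_daily_load]",           # hypothetical folder/workflow
        f"$$LOAD_DATE={run_date:%Y-%m-%d}",      # hypothetical mapping parameters
        f"$$EXTRACT_FROM={prev_day:%Y-%m-%d}",
        "$$SOURCE_SCHEMA=STG",
    ]
    with open(path, "w") as fh:
        fh.write("\n".join(lines) + "\n")

if __name__ == "__main__":
    write_param_file("wf_daily_load.par", date.today())
    print(open("wf_daily_load.par").read())
```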
Environment: Informatica 8.6, Oracle 10g, UNIX
Confidential
ETL Lead Developer
Environment: Informatica 8.6, Oracle 10g
Responsibilities:
- Responsible for business meetings and for converting business rules into technical specifications for day-to-day trading activities.
- Prepared PL/SQL scripts (stored procedures, functions, and packages).
- Performed performance tuning at the SQL script level.
- Prepared PL/SQL blocks for one-time execution for a few infotypes.
- Performed performance tuning of SQL queries used for extraction from the source system.
- Created views and materialized views as required.
- Created indexes and function-based indexes on required tables.
- Worked with INFORMATICA Power Center client tools like Repository Manager, Designer, Workflow Manager, and Workflow Monitor
- Worked with Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Developer, and Mapping Designer
- Worked with Task Developer, Worklet Designer, and Workflow Designer in the Workflow Manager
- Responsible for extracting data from Oracle, XML Files and Flat files
- Responsible for Performance Tuning at the Mapping Level, Session Level, Source Level and the Target Level
- Handled various loads (intraday, daily, weekly, monthly, and quarterly) using an incremental loading technique (sketched after this list).
- Responsible for error handling using session logs and reject files in the Workflow Monitor.
- Extensively worked with the Debugger for handling the data errors in the mapping designer
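A minimal sketch of the incremental loading technique mentioned above: extract only rows changed since a watermark kept in a control table. SQLite stands in for the source and target databases, and all table and column names are hypothetical; the real loads were built as Informatica mappings.

```python
import sqlite3

def incremental_load(src, tgt, job_name):
    """Pull rows changed since the last successful extract, then advance the watermark."""
    watermark = tgt.execute(
        "SELECT last_extract_ts FROM etl_control WHERE job_name = ?", (job_name,)
    ).fetchone()[0]
    rows = src.execute(
        "SELECT trade_id, amount, updated_ts FROM trades WHERE updated_ts > ?",
        (watermark,),
    ).fetchall()
    # Upsert into the warehouse table keyed on trade_id
    tgt.executemany(
        "INSERT OR REPLACE INTO trades_dw (trade_id, amount, updated_ts) VALUES (?, ?, ?)",
        rows,
    )
    if rows:
        new_mark = max(r[2] for r in rows)  # ISO timestamps compare lexicographically
        tgt.execute(
            "UPDATE etl_control SET last_extract_ts = ? WHERE job_name = ?",
            (new_mark, job_name),
        )
    tgt.commit()
    return len(rows)

if __name__ == "__main__":
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE trades (trade_id INTEGER, amount REAL, updated_ts TEXT)")
    src.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                    [(1, 100.0, "2010-01-02"), (2, 250.0, "2010-01-03")])
    tgt = sqlite3.connect(":memory:")
    tgt.execute("CREATE TABLE trades_dw (trade_id INTEGER PRIMARY KEY, amount REAL, updated_ts TEXT)")
    tgt.execute("CREATE TABLE etl_control (job_name TEXT, last_extract_ts TEXT)")
    tgt.execute("INSERT INTO etl_control VALUES ('daily_trades', '2010-01-01')")
    print(incremental_load(src, tgt, "daily_trades"), "rows loaded")
```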
Confidential
Senior Developer
Responsibilities:
- Involved in meeting with business to gather requirements.
- Provided technical leadership to other support team members and resolved technical issues.
- Provided recommendations on database index strategies based on performance metrics.
- Reviewed data models and performed code reviews.
- Created complex stored procedures, functions, triggers and packages.
- Created and maintained overall and detailed project plans and supervised the data warehouse ETL processes.
- Used DataStage to develop jobs for extracting, transforming, integrating, and loading data into the staging and integration layers.
- Used Transformer, Lookup, and Aggregator stages in the Designer to implement complex business logic.
- Used key management functions; surrogate keys were generated for composite attributes while loading data into the data warehouse (see the sketch after this list).
- Developed user-defined stages to implement complex business logic.
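A minimal sketch of the surrogate-key idea above, in plain Python for illustration: assign one warehouse key per distinct composite natural key and reuse it on repeat loads. The attribute names are hypothetical; DataStage key management functions handled this in practice.

```python
def assign_surrogate_keys(rows, key_map, next_key=1):
    """rows: iterable of dicts; the composite natural key is (source, account_no)."""
    out = []
    for row in rows:
        natural_key = (row["source"], row["account_no"])  # hypothetical attributes
        if natural_key not in key_map:
            key_map[natural_key] = next_key  # new entity: mint a fresh warehouse key
            next_key += 1
        out.append({**row, "account_sk": key_map[natural_key]})
    return out, next_key

if __name__ == "__main__":
    key_map = {}
    batch = [
        {"source": "CRM", "account_no": "A100"},
        {"source": "ERP", "account_no": "A100"},  # same number, different source: new key
        {"source": "CRM", "account_no": "A100"},  # repeat: reuses surrogate key 1
    ]
    loaded, _ = assign_surrogate_keys(batch, key_map)
    for r in loaded:
        print(r)
```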
Environment: DataStage 8.0, Oracle 10g
Confidential
Team Lead
Responsibilities:
- Responsible for business meetings and for converting business rules into technical specifications.
- Prepared PL/SQL scripts (stored procedures, functions, and packages).
- Performed performance tuning at the SQL script level.
- Prepared PL/SQL blocks for one-time execution for a few infotypes.
- Performed performance tuning of SQL queries used for extraction from the source system.
- Worked with Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Developer, and Mapping Designer
- Worked with Task Developer, Worklet Designer, and Workflow Designer in the Workflow Manager
- Responsible for extracting data from Oracle and Flat files
- Implemented data quality using Informatica Developer (IDQ).
- Involved in data quality activities such as data analysis, data cleansing, and data profiling using the capabilities of IDQ.
- Prepared data quality business rules.
- Created GUI dashboards to aid client understanding.
- Excellent knowledge of the Informatica platform as a whole and of the integration among its components and services.
- Responsible for performance tuning in Informatica PowerCenter at the target, source, mapping, session, and system levels.
- Extensively worked with both Connected and Un-Connected Lookups
Environment: Informatica 8.6, Oracle 10g
Confidential
Software Consultant
Environment: DataStage 8.0, Oracle 10g, SAP ABAP
Responsibilities:
- Built jobs based on functional mapping specs for different subject areas (Payroll Acc.).
- Created sequence and parallel jobs based on the mapping specs.
- Prepared PL/SQL scripts (stored procedures, functions, and packages).
- Performed performance tuning at the SQL script level.
- Prepared PL/SQL blocks for one-time execution for a few infotypes.
- Performed performance tuning of SQL queries used for extraction from the source system.
- Created views and materialized views as required.
- Created indexes and function-based indexes on required tables.
- Debugged DataStage jobs; involved in unit and performance testing; prepared documentation, including technical specification documents.
- Involved in Reconciliation and User acceptance Testing.
- Involved in Data Modeling activities.
- Performed end-to-end business requirement analysis to determine the feasibility of the project.
- Created a wrapper script to invoke DataStage jobs with input parameters (see the sketch after this list).
- Developed jobs and sequencers for extracting data from variable data sources such as flat files and mainframe databases, and loading modules using the DataStage Designer tools to load the data into the data warehouse per the specified design.
- Served as a developer on the ETL development side.
- Created low-level and high-level design documents.
- Developed a design strategy for loading large volumes of data.
- Prepared unit test case and unit test result documents, and performed end-to-end testing of the jobs.
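A minimal sketch of the wrapper-script idea above, rewritten in Python for illustration (the original was a shell script). The project, job, and parameter names are hypothetical, and exact dsjob options vary by DataStage version.

```python
import subprocess
import sys

def run_datastage_job(project, job, params):
    """Invoke a DataStage job through the dsjob CLI and return its exit code."""
    cmd = ["dsjob", "-run", "-jobstatus"]  # -jobstatus waits and reports job status
    for name, value in params.items():
        cmd += ["-param", f"{name}={value}"]
    cmd += [project, job]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    return result.returncode

if __name__ == "__main__":
    # Hypothetical project, job, and parameter names
    rc = run_datastage_job(
        "PAYROLL_PROJ",
        "seq_payroll_load",
        {"pLoadDate": "2011-03-31", "pSrcDir": "/data/in"},
    )
    # With -jobstatus, dsjob's exit code reflects the job's finishing status;
    # map it to success/failure per your version's documentation.
    sys.exit(rc)
```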
Confidential, Baltimore, MD
ETL Developer
Responsibilities:
- Prepared PL/SQL scripts like Procedures, Functions and Packages.
- Involved in performance tuning at the database and ETL mapping levels.
- Performed performance tuning of SQL queries used for extraction from the source system.
- Created Views and materialized views for Analysis team.
- Created simple, complex, and function-based indexes on required tables.
- Worked with Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Developer, and Mapping Designer
- Worked with Task Developer, Worklet Designer, and Workflow Designer in the Workflow Manager
- Responsible for extracting data from Oracle, Sybase, and Flat files
- Extensively worked with both Connected and Un-Connected Lookups
- Extensively worked with lookup caches (persistent, static, and dynamic) to improve the performance of Lookup transformations.
Environment: Informatica 7.x/8.x, Oracle 9i