Sr. ETL Developer/Analyst Resume
Irving, Texas
SUMMARY
- 10+ years of IT experience in Analysis, Design, Development, Implementation, Testing and Support of Data Warehousing, Data Integration Solutions and Database Administration.
- Over 8 years of extensive experience as a Business Intelligence / Data Warehouse ETL Developer.
- ETL and data analysis for ODS, online transaction processing (OLTP) and Data Warehouse / Business Intelligence (BI) systems: logical/physical, relational and multi-dimensional modeling (Star Schema, Snowflake Schema), optimization, partitioning, archiving and capacity planning.
- Expertise in Oracle 10g, SQL Server 2000/2005, Informatica 9.5, Oracle Warehouse Builder 10g, ERwin, UNIX and shell scripts.
- Proficient in Normalization/Denormalization techniques in relational/dimensional database environments.
- As part of history data conversion, identified the data structures of multiple legacy source systems, mapped them to the new system's data structures and designed ETL maps.
- Provided post-production support, handling new enhancements and customer support.
- Implemented various performance techniques to maintain system stability with increasing data volume.
- Expert in resolving ETL bottlenecks at various stages, including backend queries, Designer stages, sessions, server calls and CPU utilization.
- Extensive knowledge of Explain Plan, TKPROF and query optimization (see the sketch at the end of this summary).
- Designed complex ETL mappings, including slowly changing dimensions, performance-critical stages with partitioning, persistent cache, Complex Data Exchange, and extensive use of mapplets for common subroutines.
- Improved ETL performance by implementing archival strategies and partitioning, and fine-tuned SQL by eliminating unwanted loops and tuning indexes, pagination and recursive indexes between parent and child tables.
- Coordinated with DBAs and UNIX administrators in setting up physical databases and provided best practices for maintaining databases with large volumes of data.
- Experience in end-to-end database development, administration, maintenance, performance tuning and production support.
- Currently in a Senior ETL Developer role at Confidential in Cincinnati, Ohio.
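A minimal sketch of the Explain Plan / TKPROF workflow referenced in this summary; the tables, predicates and trace-file name are hypothetical, not taken from any of the projects below.

```sql
-- Hypothetical example: capture and read an execution plan before tuning.
EXPLAIN PLAN FOR
SELECT o.order_id, c.cust_name                 -- illustrative tables/columns
FROM   orders o
JOIN   customers c ON c.cust_id = o.cust_id
WHERE  o.order_dt >= TO_DATE('2014-01-01', 'YYYY-MM-DD');

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);       -- read the plan from PLAN_TABLE

-- For TKPROF: trace the session, re-run the query, then format the
-- trace file from the OS prompt:
ALTER SESSION SET SQL_TRACE = TRUE;
-- $ tkprof ora_12345.trc ora_12345.txt sys=no
```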
TECHNICAL SKILLS
ETL Tools: Informatica, Contextia
Languages: C, C++, SQL, PL/SQL, TSQL
Data Quality Tool: Informatica DQ, Trillium
Operating Systems: Unix, Windows Server 2003/2008/2008 R2/2012/2012 R2, Windows XP/Vista/7, Red Hat Linux
Databases: Oracle, Teradata, MySQL, SQL Server, PostgreSQL, Greenplum, DB2
Utilities: SQL*Loader, FastLoad, MultiLoad, TPump, TPT
Scripting Languages: Shell scripting, Windows PowerShell, Python
Other Tools: ERwin, TOAD, PuTTY, SQL Developer, WinSCP, Rally, JIRA, HPQC, HPSD, Appworx, TWS
PROFESSIONAL EXPERIENCE
Confidential, Irving, Texas
Sr. ETL Developer/Analyst
Responsibilities:
- Analyzed existing data for requirements gathering at the initial stage of the project; contacted business users for clarification and worked with business analysts to gain a complete understanding of business needs and expectations.
- Used the Passport tool to analyze raw source data.
- Created the high-level requirements document.
- Performed extensive data profiling using IDQ (Analyst tool) prior to data staging.
- Created business rules in Informatica Developer and imported them into Informatica PowerCenter to load standardized, well-formatted data into staging tables.
- Designed and used control tables, configuration tables and stats tables (see the sketch after this section).
- Performed risk analysis and drew up mitigation and avoidance plans.
- Used PowerExchange to read data from legacy systems.
- Implemented Pass-Through, Auto Hash, User-Defined Hash Key and Database partitioning for performance tuning.
- Interacted with the client (BA team) on validation and evaluation before moving to production.
- Created deployment groups for Informatica migrations.
Environment: Informatica PowerCenter 9.5, IDQ, UNIX, Flat Files, Mainframes, DB2, TWS.
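A minimal sketch of the control/stats table pattern mentioned above; the table, columns and values are illustrative assumptions, not the project's actual design.

```sql
-- Hypothetical audit table each workflow writes to at load start and end.
CREATE TABLE etl_load_stats (
  load_id      NUMBER         PRIMARY KEY,
  workflow_nm  VARCHAR2(100),
  start_ts     TIMESTAMP,
  end_ts       TIMESTAMP,
  rows_read    NUMBER,
  rows_loaded  NUMBER,
  status       VARCHAR2(20)   -- RUNNING / SUCCESS / FAILED
);

-- A post-session task or stored procedure records the outcome:
UPDATE etl_load_stats
SET    end_ts      = SYSTIMESTAMP,
       rows_loaded = 150000,            -- placeholder value
       status      = 'SUCCESS'
WHERE  load_id = 42;                    -- placeholder key
```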
Confidential, Cincinnati, Ohio
Project Lead
Responsibilities:
- Participated in multiple requirements meetings with business users and source-system experts and contributed to the functional and technical spec design for the ETL portion.
- Used ERwin to develop physical data models and business models.
- Used Oracle Warehouse Builder (OWB) 10.2 to transform data from the ODS to JDA (forecasting application).
- End-to-end development of ODS and OLTP (online transaction processing) systems for Cargo.
- Involved in the real-time implementation from the requirements analysis phase.
- Designed and developed ETL (extraction, transformation, loading) using OWB and PL/SQL packages.
- Designed various ETL jobs for pulling data from various legacy systems and loading it into the integrated database.
- Performed data profiling to provide high-quality data to business users and customers.
- Performed user acceptance/business process testing and suggested enhancements to improve end-user functionality.
- Accountable for developing the core code of the ongoing interfaces in line with PL/SQL coding standards.
- Point of contact for QA team members for end-to-end testing of all modules required for forecasting.
- Analyzed existing packages and shell scripts and converted them into Informatica with enhanced functionality as specified by business users.
- Developed various reusable mapplets for common code lookups.
- Used various transformations such as Expressions, Lookups and Mapplets in the ETL maps.
- Wrote manual SQL procedures for some of the ETL and data-cleansing tasks in the ETL load to the ODS.
- Developed corresponding ETL jobs, batches and automated real time data load.
- Knowledgeable in Informatica MDM concepts and the implementation of de-duplication processes.
- Developed SQL procedures for handling data profiling while extracting data from source systems.
- Created control files and data files to load flat-file data into Oracle 10g tables using SQL*Loader (see the control-file sketch after this section).
- Used Explain Plan, the TKPROF utility, SQL Trace and various hints to tune complex queries in reports, procedures, functions and packages at both the server and client level.
- Performance-tuned Teradata SQL statements against large data volumes.
- Created FastLoad, FastExport, MultiLoad, TPump and BTEQ scripts to load data from Oracle databases and flat files into the primary data warehouse.
- Conducted quality reviews of Test Plans and Test Case Documents developed by QA Team.
Environment: Informatica PowerCenter 9.5, Windows, Flat Files, Teradata, Oracle 10g, Greenplum, UNIX.
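A minimal SQL*Loader control-file sketch of the kind described above; the file, table and column names are hypothetical.

```sql
-- Illustrative control file (e.g. cargo_fcst.ctl), invoked as:
--   sqlldr userid=stg_user control=cargo_fcst.ctl log=cargo_fcst.log
LOAD DATA
INFILE  'cargo_fcst.dat'
BADFILE 'cargo_fcst.bad'
APPEND
INTO TABLE stg_cargo_forecast
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  origin_cd  CHAR,
  dest_cd    CHAR,
  fcst_dt    DATE "YYYY-MM-DD",
  fcst_qty   INTEGER EXTERNAL
)
```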
Confidential
Team Lead
Responsibilities:
- Involved in end-to-end development of the ODS (Operational Data Store), DIM (data warehouse) and data marts.
- Attended multiple requirements-gathering sessions.
- Designed and developed ETL (extraction, transformation, loading) using Informatica 8.6, Oracle 10g and UNIX shell scripts.
- Identified the sources and targets and developed ETL mappings using Informatica 8.6, incorporating various transformations: Lookups, Expressions, Routers, Sequence Generators, Mapplets, Dynamic Lookups and reusable maps.
- Enhanced the system with change requests and coordinated business requirements with the development team.
- Developed corresponding sessions and workflows to schedule the ETL jobs.
- Enhanced ETL performance (database and ETL jobs) by optimizing Oracle backend queries and implementing techniques such as index drop-and-recreate, table partitioning and parallelism (see the sketch after this section).
- Implemented partitioning at the Designer level in mappings as well as in sessions, enabled parallel execution among the sub-jobs in ETL workflow batches, incorporated persistent cache for lookups, and redesigned joiners, stored procedures and frequent unnecessary database connections.
- Developed UNIX shell scripts for some of the ETL activities.
- Debugged data-quality scripts in Trillium.
Environment: Informatica 8.6, Trillium, Windows Server 2003, Oracle 10g.
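A minimal sketch of the index drop-and-recreate bulk-load pattern noted above; all object names and the degree of parallelism are illustrative.

```sql
-- Drop the index so the bulk load is not slowed by index maintenance.
DROP INDEX idx_fact_sales_dt;

-- Direct-path, parallel insert from a staging table (names are hypothetical).
INSERT /*+ APPEND PARALLEL(f, 4) */ INTO fact_sales f
SELECT * FROM stg_sales;
COMMIT;

-- Rebuild the index in parallel, then reset the degree so ordinary
-- queries are not parallelized by default.
CREATE INDEX idx_fact_sales_dt ON fact_sales (sale_dt)
  PARALLEL 4 NOLOGGING;
ALTER INDEX idx_fact_sales_dt NOPARALLEL;
```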
Confidential, Texas
Team Member / Team Lead
Responsibilities:
- Responsible for taking backups of the Contextia (ETL tool) code.
- Created the database whenever there was a new extract (data set) and imported data from the source server into the source database using the SQL import wizard.
- Responsible for setting up the environment to load data from the source database into the target database using Contextia.
- Responsible for assigning issues within the team and generating the issue status report.
- Responsible for analyzing and fixing issues.
Environment: Contextia, Windows Server 2003, Oracle 10g, SQL Server 2005, PostgreSQL.
Confidential, Cincinnati, Ohio
Team Member
Responsibilities:
- End-to-end development of ODS and OLTP (online transaction processing) systems for the Model N implementation.
- Used Oracle 10g and Informatica for history data migration and ODS real time data.
- Involved in gathering the functional and technical requirements from the business and documenting the details in the design specification document.
- Used MS Visio to develop Physical Data Models and Business Models.
- Identified, analyzed and translated business needs into system applications and improved business processes.
- Involved in data analysis, data profiling and data conversion from various legacy systems like SAP, Gentran and I-Many.
- Involved in all three stages of development (extraction, transformation and loading) using complex stored procedures, packages and functions.
- Used various extraction techniques to pull data from legacy systems, dealing with file formats such as XML, CSV and fixed-length flat files.
- Developed various reusable mapplets for common code lookups.
- Incorporated various transformations (Lookups, Expressions, Routers, Sequence Generators, Mapplets, Dynamic Lookups) and reusable transformations in the ETL maps.
- Developed various stored procedures, Packages and views for ETL needs.
- Wrote manual SQL procedures for some of the ETL and data-cleansing tasks in the ETL load to the ODS.
- Used various custom return types and functions to meet the application requirements.
- Developed corresponding ETL jobs, batches and automated real time data load.
- Developed various database objects such as views and materialized views for the backend.
- Extensively used built-in UTL and DBMS packages to handle different data formats such as flat-file structures, CSV and DAT according to the SAP file formats (see the sketch after this section).
- Used various lookups to pull data from Model N, utilizing database links that refer to the tables/columns/views in the database schema.
- Involved in Unit Testing the interfaces and documenting the test cases for various scenarios to check the compatibility of the functionality with Model N.
- Involved in System Integration Testing (SIT) that interacts with the legacy systems to pull the data and load the data into Model N ensuring that the business needs are met accordingly.
- Involved in User Acceptance Testing (UAT), in which sample data was used to test the actual data flowing in and out of Model N.
Environment: Informatica PowerCenter 8.6, Windows, Flat Files, Teradata, Oracle 10g, UNIX.
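A minimal PL/SQL sketch of reading a delimited SAP extract with the built-in UTL_FILE package, as referenced above; the directory object, file name and staging table are assumptions.

```sql
-- Assumes a directory object SRC_DIR and a staging table stg_sap_price exist.
DECLARE
  v_file  UTL_FILE.FILE_TYPE;
  v_line  VARCHAR2(4000);
BEGIN
  v_file := UTL_FILE.FOPEN('SRC_DIR', 'sap_price.csv', 'R');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(v_file, v_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;    -- end of file
    END;
    -- Split the comma-delimited record and stage it.
    INSERT INTO stg_sap_price (material_cd, price_amt)
    VALUES (REGEXP_SUBSTR(v_line, '[^,]+', 1, 1),
            TO_NUMBER(REGEXP_SUBSTR(v_line, '[^,]+', 1, 2)));
  END LOOP;
  UTL_FILE.FCLOSE(v_file);
  COMMIT;
END;
/
```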
Confidential, Fremont, California
Database/Python Developer
Responsibilities:
- Gathered requirements from the client and translated the business details into a technical design.
- Interacted with the client on a daily basis to understand requirements and prepare IT specifications.
- Performed code reviews, performance analysis and testing.
- Prepared implementation documentation and user support documents.
- Developed front ends for various applications using PHP and JavaScript.
- Created backend scripts and installation/uninstallation scripts using Python.
- Created unit test cases for developer-level testing.
- Created Selenium scripts to test web functionality.
- Wrote Oracle queries, procedures, packages, triggers, etc.
- Created materialized views for reporting requirements (see the sketch after this section).
- Handled database cleanup activities.
Environment: Python, SQL Server 2000, MySQL, UNIX
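A minimal sketch of a reporting materialized view as referenced above; the name, query and refresh policy are illustrative.

```sql
-- Hypothetical pre-aggregated summary refreshed on demand for reporting.
CREATE MATERIALIZED VIEW mv_daily_sales
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT TRUNC(order_dt) AS order_day,
       COUNT(*)        AS order_cnt,
       SUM(amount)     AS total_amt
FROM   orders
GROUP BY TRUNC(order_dt);

-- Refreshed from a scheduled job:
-- EXEC DBMS_MVIEW.REFRESH('MV_DAILY_SALES', 'C');
```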
Confidential, Dallas, Texas
Oracle DBA
Responsibilities:
- Performed RMAN backup and recovery and instance-level database performance tuning.
- Maintained standby databases and configured Data Guard.
- Applied monthly range partitioning to selected tables (see the sketch after this section).
- Created/resized tablespaces, tables and users for custom applications.
- Monitored tablespace-size and file-system-size alerts using the UPTIME tool.
- Performed cold/hot refreshes and cloning of QA and DEV databases from the production environment for testing purposes.
- Performed physical database design and table/column mapping against existing requirements.
- Created/dropped databases, users, tablespaces, tables and database objects.
- Oracle 10g Real Application Clusters implementation and production support.
- Gathered statistics at the database, schema, table and index level.
- Applied quarterly CPU patches to all databases.
Environment: Oracle 9i, Python, UNIX
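A minimal sketch of the monthly range-partitioning pattern noted above; the table, columns and date boundaries are hypothetical.

```sql
-- Hypothetical monthly range-partitioned table.
CREATE TABLE txn_history (
  txn_id  NUMBER,
  txn_dt  DATE,
  amount  NUMBER(12,2)
)
PARTITION BY RANGE (txn_dt) (
  PARTITION p_2008_01 VALUES LESS THAN (TO_DATE('2008-02-01','YYYY-MM-DD')),
  PARTITION p_2008_02 VALUES LESS THAN (TO_DATE('2008-03-01','YYYY-MM-DD')),
  PARTITION p_max     VALUES LESS THAN (MAXVALUE)
);

-- Each month a new partition is split out of p_max ahead of the data:
ALTER TABLE txn_history SPLIT PARTITION p_max
  AT (TO_DATE('2008-04-01','YYYY-MM-DD'))
  INTO (PARTITION p_2008_03, PARTITION p_max);
```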