
Sr. ETL Developer Resume


Oakland, CA

SUMMARY

  • 10 years of experience in designing and developing data warehouse, data migration, identity resolution and data quality projects using Informatica products (Informatica Power Center 7.x/8.x/9.x, Informatica Power Exchange and Informatica Data Quality).
  • Extensive programming experience in Oracle, Teradata, MS SQL Server, MySQL, Netezza and DB2.
  • Extensive reporting experience in OBIEE.
  • Good knowledge of the Hadoop ecosystem, HDFS and Big Data.
  • Knowledge of ecosystem tools like Hive, Pig, Sqoop, NoSQL, MapReduce and HBase.
  • Strong knowledge of Hadoop and Hive, including Hive’s analytical functions.
  • Extracted/loaded data from/into diverse source/target systems like Oracle, SQL Server, MySQL, DB2, Teradata, Netezza, Salesforce, SAP, @Task and flat files.
  • Experienced in writing SQL statements and PL/SQL code for database objects such as tables, views, indexes, sequences, procedures, functions, packages, and triggers.
  • Good back-end programming experience using PL/SQL, SQL, MS SQL, stored procedures, functions, ref cursors, constraints, triggers, indexes (B-tree and bitmap), views, inline views, materialized views, database links, export/import utilities, and ad-hoc SQL queries.
  • Expertise in QA testing in distributed UNIX/Windows environments with Oracle/SQL Server/Teradata/DB2 databases as the back end; performed end-to-end testing.
  • Expertise in extended validation of report functionality developed using Cognos and Business Objects, by writing complex SQL queries at the back end.
  • Experienced SQL Data Analyst / Data Reporting Analyst with strong background in design, development, and support of online databases and information products as well as data analysis / reporting / processing.
  • Wrote extensive test scripts for back-end validations.
  • Basic knowledge of Hadoop ecosystem tools (Hive, Sqoop and Pig) and IBM Big SQL.
  • Extensive knowledge of Relational Database Management Systems (RDBMS) and relational and normalized databases.
  • Experience in data modeling with SQL Data modeler and Visio tool.
  • Well versed in exposing stored procedures as web services in Informatica for data warehousing projects.
  • Back-end table design and implementation for data warehouses and related ETL processes.
  • Analysis, design, development and implementation of data warehouse and ETL client/server applications.
  • Extensive working experience in data migration using Informatica Power Center.
  • Knowledge in Data Warehouse implementation using tools like Informatica Power Mart and Power Center, Power Connect, Power Exchange CDC.
  • Proficient in using Informatica workflow manager, workflow monitor, server manager, PMCMD (Informatica command line utility) to create, schedule and control workflows, tasks, and sessions.
  • Strong in developing Conceptual, Logical and Physical data models, as well as dimensional models using star schemas for data warehousing projects.
  • Experienced in Tuning Informatica Mappings to identify and remove processing bottlenecks.
  • Automation and scheduling of UNIX shell scripts and Informatica sessions and batches using Autosys.
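As a minimal illustration of the star-schema modeling listed above, a fact table typically references its dimensions through surrogate keys. This is only a sketch; all table and column names are illustrative, not taken from any project described in this resume:

```sql
-- Hypothetical dimension table: one row per customer, keyed by a surrogate key
CREATE TABLE dim_customer (
    customer_key   NUMBER        PRIMARY KEY,   -- surrogate key
    customer_id    VARCHAR2(20)  NOT NULL,      -- natural/business key
    customer_name  VARCHAR2(100),
    region         VARCHAR2(50)
);

-- Fact table referencing the dimension via its surrogate key
CREATE TABLE fact_sales (
    sale_date      DATE          NOT NULL,
    customer_key   NUMBER        NOT NULL REFERENCES dim_customer (customer_key),
    quantity       NUMBER(10),
    amount         NUMBER(12,2)
);
```

Keeping the surrogate key separate from the business key is what later allows Type 2 history: several versions of the same customer_id can coexist under different customer_key values.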

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 9.1/8.6/7.2 (Designer, Mappings, Mapplets, Transformations, Workflow Manager, Workflow Monitor), Power Exchange and SAS.

OLAP Tools: Cognos, OBIEE.

Dimensional Data Modeling: Erwin, Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimension Tables.

Databases: Oracle, SQL, PL/SQL, Teradata, SQL*Plus, MS SQL Server, MS Access, Netezza, MySQL, DB2, Big SQL and Hive.

Programming: SQL, PL/SQL, SQL*Plus, C, C++, C#.

Environment: Windows, Linux (Red hat), UNIX.

Testing: Manual, Modular, System, Integration, Unit, Regression and Performance Testing

Defect Tracking Tools: Quality Center, Clear Quest, Bugzilla, Visual Studio Team System

Web Services: SAP, Taleo, SFDC and @task.

Tools: Toad, SQL Developer, Teradata SQL Assistant, SVN (version control) and WinSCP.

PROFESSIONAL EXPERIENCE

Confidential, Oakland, CA

Sr. ETL Developer

Responsibilities:

  • Extracted data from Teradata, DB2, Big SQL and Netezza and loaded it into the DOR VDW using SAS programs.
  • Extensively involved in writing DDL and DML operations.
  • Extensively worked on writing SQL queries using joins, ORDER BY and GROUP BY clauses.
  • Created collections for accessing and storing complex data resulting from joins across a large number of tables.
  • Developed PL/SQL packages, procedures and functions in accordance with the business requirements for loading data into database tables.
  • Tested Complex ETL jobs based on business user requirements and business rules to load data from source RDBMS tables to target tables.
  • Scheduling the Oracle jobs as per the requirement (Daily, weekly and monthly).
  • Monitoring day-to-day Process for different Data Loads and resolving Issues.
  • Worked extensively on exception handling to trouble shoot PL/SQL code.
  • Extensively used TOAD, SQL Developer and Teradata SQL assistant tool to increase the productivity and application code quality.
  • Understood the business documents to create statements and modify the scripts.
  • Developed SQL*Loader scripts to load the staging area from flat files.
  • Involved in testing all the scripts in test database and moved into the production.
  • Involved in table partitioning in the database.
  • Created UNIX scripts for handling the ftp of source files, to make them execute in sequence as per the time stamp and to archive the processed files for future reference.
  • Created SAS data sets for comparison, to prove that file formats are the same and record counts are correct.
  • Created variable frequencies for comparing the source and target data using SAS.
  • Created SAS QA programs to test the data.
  • Written Test Cases for ETL to compare Source and Target database systems.
  • Tested records with logical deletes using flags.
  • Maintaining the Release related documents by Configuration management techniques.
  • Monitored indexes and analyzed their status for performance tuning and query optimization.
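A source-to-target test case of the kind described above can be sketched as a row-count reconciliation plus a logical-delete check. Schema, table and column names here are placeholders, not from the actual project:

```sql
-- Row-count reconciliation between a source table and its target copy
SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt FROM src_schema.orders
UNION ALL
SELECT 'TARGET', COUNT(*) FROM tgt_schema.orders;

-- Logically deleted rows are flagged rather than physically removed,
-- so validation must count only the active rows
SELECT COUNT(*) AS active_rows
FROM   tgt_schema.orders
WHERE  delete_flag = 'N';
```

In practice the same pattern extends to column-level checks (sums or hashes of key measures) once the counts match.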

Environment: Oracle SQL and PL/SQL, DB2, BIG SQL, Teradata, Netezza, SQL Server, SAS and UNIX.

Confidential, Bloomington, IL

Sr. ETL-Informatica Developer

Responsibilities:

  • Gather and Analyze requirements from Users
  • Designed the functional specifications covering source, target, current & proposed processes, and interface process flow diagrams.
  • Designed ETL process, load strategy and the requirements specification after having the requirements from the end users.
  • Worked with the Informatica Data Quality (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Maintaining the Release related documents by Configuration management techniques.
  • Involved in data design and modeling, system study, design, and development.
  • Created mappings using Informatica Designer to build business rules to load data.
  • Created Folders and placed all the tested ETL, DB2 and UNIX scripts in the Staging path for DST & Production movement.
  • Created database objects like views, materialized views, procedures, packages using Oracle PL/SQL with SQL developer
  • Created records, tables and collections (nested tables and VARRAYs), improving performance by reducing context switching.
  • Created a number of database triggers according to business rules using PL/SQL.
  • Involved in loading and Re-loading the County data into the database.
  • Participated in Performance Tuning of SQL queries using Explain Plan to improve the performance of the application.
  • Worked extensively on exception handling for handling errors using system defined exceptions and user defined exceptions.
  • Monitoring the scheduling jobs.
  • Created Unix Shell Scripts for automating the execution process.
  • Extensively used Explain Plan and TKPROF for SQL query tuning.
  • Used PL/SQL tables and PRAGMA AUTONOMOUS_TRANSACTION to avoid the mutating-table problem in database triggers.
  • Validated the ETL load process to make sure the target tables are populated according to the data mapping provided, satisfying the transformation rules.
  • Ensured that the mappings are correct and conducted data validation testing
  • Tuned ETL jobs/procedures/scripts, SQL queries, PL/SQL procedures to improve the system performance.
  • Worked on bug fixes on existingInformaticaMappings to produce correct output.
  • Identified the bottlenecks in the source, target, mapping, and loading process and successfully attended/resolved the performance issues across this project.
  • Helped the team in fixing technical issues and in tuning database queries for better performance.
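The mutating-table workaround with PRAGMA AUTONOMOUS_TRANSACTION mentioned in this section is commonly written as an autonomous logging procedure called from a trigger. This is only a sketch with illustrative object names (the audit_log table is assumed to exist):

```sql
-- Autonomous procedure: commits in its own transaction, so a row-level
-- trigger can call it without querying or modifying the mutating base table
CREATE OR REPLACE PROCEDURE log_change (p_table VARCHAR2, p_action VARCHAR2)
IS
    PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
    INSERT INTO audit_log (table_name, action, changed_at)
    VALUES (p_table, p_action, SYSDATE);
    COMMIT;  -- commits only the autonomous transaction
END log_change;
/

CREATE OR REPLACE TRIGGER trg_orders_audit
AFTER INSERT OR UPDATE ON orders
FOR EACH ROW
BEGIN
    log_change('ORDERS', 'CHANGE');
END;
/
```

The trade-off is that the autonomous commit persists even if the triggering statement later rolls back, which is usually acceptable for audit logging but not for business data.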

Environment: Informatica 9.6, Mainframe DB2, COBOL, UNIX, Oracle SQL and PL/SQL.

Confidential, Ashburn, VA

ETL-Informatica Developer

Responsibilities:

  • Developed Informatica parameter files to filter the daily data from the source system.
  • Implemented Type 2 Slowly Changing Dimensions Methodology to keep track of historical data.
  • Identified and eliminated duplicates in datasets through the IDQ components Edit Distance, Jaro Distance and Mixed Field Matcher; this enables the creation of a single view of customers and helps control costs associated with mailing lists by preventing multiple pieces of mail.
  • Prepared the data flow of all the Informatica objects created, to accomplish testing at various levels: unit, performance, integration and baseline.
  • Studied Session Log files to find errors in mappings and sessions.
  • Created UNIX scripts for handling the ftp of source files, to make them execute in sequence as per the time stamp and to archive the processed files for future reference.
  • Worked closely with DBA in creating the Tables, indexes, views, Index rebuilds.
  • Involved in table partitioning in the database.
  • Developed PL/SQL procedures and functions in accordance with the business requirements for loading data into database tables.
  • Developed the SQL code for performance and penalty statistics calculation.
  • Involved in tuning SQL queries.
  • Executing Daily and Monthly reports in UNIX and automated the scripts using shell script.
  • Created SQL*Loader scripts to load data into temporary staging tables.
  • Created scripts for Views, Materialized Views and Partitioned tables.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Extraction of test data from tables and loading of data into SQL tables.
  • Validated the data in the reports by writing simple to complex SQL queries in the transactional system.
  • Responsible for Analyzing and Implementing the Change Requests.
  • Created complex stored procedures, views, SQL joins and scripts.
  • Wrote high-quality, well-documented code according to standards.
  • Involved in fine-tuning stored procedures by making use of PL/SQL collections and their bulk fetch and bulk insert (BULK COLLECT and FORALL) features.
  • Involved in daily status calls with clients.
  • Designed ETL process, load strategy and the requirements specification after having the requirements from the end users.
  • Maintaining the Release related documents by Configuration management techniques.
  • Designed the functional specifications covering source, target, current & proposed processes, and interface process flow diagrams.
  • Created mappings using Informatica Designer to build business rules to load data.
  • Extensively worked on data extraction, transformation and loading from CSV files, XML files & S&P (Standard & Poor’s) data into a Microsoft SQL Server database.
  • Created Folders and placed all the tested ETL, SQL Server and UNIX scripts in the Staging path for production movement.
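The Type 2 slowly changing dimension logic mentioned in this section was built as Informatica mappings; the same idea can be sketched in plain SQL, where a changed row is expired and a new current version is inserted. Table, column and sequence names below are assumptions:

```sql
-- Step 1: expire the current version of any customer whose attributes changed
UPDATE dim_customer d
SET    d.effective_end = SYSDATE,
       d.current_flag  = 'N'
WHERE  d.current_flag  = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id   = d.customer_id
               AND    s.customer_name <> d.customer_name);

-- Step 2: insert a new current version for changed and brand-new customers
INSERT INTO dim_customer
       (customer_key, customer_id, customer_name,
        effective_start, effective_end, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name,
       SYSDATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id  = s.customer_id
                   AND    d.current_flag = 'Y');
```

After step 1, changed customers no longer have a current row, so step 2 picks them up along with entirely new customer_ids; history is preserved because old versions keep their surrogate keys.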

Environment: Informatica, Oracle SQL and PL/SQL, Quality Center, DB2, Unix Shell Scripting.

Confidential, Orem, UT

ETL-Informatica Developer

Responsibilities:

  • Involved actively in the full life cycle of the project, including screen design, table design, preparation of the user manual, preparation of test plans and development of the program using Oracle 10g.
  • Interacted with the development team in fine-tuning the SQL and PL/SQL code.
  • Implementation of Oracle & Application Software Patches.
  • Developed Forms using Forms and custom Reports.
  • Writing database creation scripts & PL/SQL server side stored procedures, functions and triggers, SQL*Net installation and configuration.
  • The front end consists of Oracle Web forms which allow the user to login and enter the input processing information, such as file name, date, month and year and submit the form for the load process.
  • Worked with Informatica Designer on the functional description, scope and detailed functional requirements.
  • Designed and Created data cleansing and validation scripts using Informatica ETL tool.
  • Responsible for creating, importing all the required sources and targets to the shared folder.
  • Worked with different sources such as relational databases (Oracle, SQL Server and MySQL), flat files, Adobe PDFs, XML files, Salesforce, SAP/BW, SAP/ECC, Taleo and @task.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and the matching/removal of duplicate data.
  • Created mappings using Informatica Designer to build business rules to load data.
  • Created Folders and placed all the tested ETL, Oracle and UNIX scripts in the Staging path for production movement.
  • Created new database objects such as tables, views, indexes, triggers and synonyms.
  • Wrote different new scripts in PL/SQL using Packages, Procedures, Functions and Triggers.
  • Created collections for accessing and storing complex data resulting from joins across a large number of tables.
  • Involved in loading and Re-loading the County data into the database.
  • Extensively used TOAD tool to increase the productivity and application code quality.
  • Understood the business documents to create statements and modify the scripts.
  • Scheduled the jobs in Dev and Testing environment using Informatica Scheduler.
  • Prepared the data flow of the entire Informatica objects created, to accomplish Integration testing.
  • Prepared Unit Test Plans.
  • Supported user queries against availability of data from Data Warehouse.
  • Performed troubleshooting for data non-uniformity.
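The PL/SQL collections mentioned in this section are commonly paired with BULK COLLECT and FORALL to reduce context switching between the SQL and PL/SQL engines. A minimal sketch with placeholder table and column names:

```sql
DECLARE
    -- Nested-table collection typed against the source column
    TYPE t_id_tab IS TABLE OF orders.order_id%TYPE;
    l_ids t_id_tab;
BEGIN
    -- Fetch all matching ids in one round trip instead of row by row
    SELECT order_id
    BULK COLLECT INTO l_ids
    FROM   orders
    WHERE  status = 'OPEN';

    -- Issue the updates as one bulk-bound DML statement
    FORALL i IN 1 .. l_ids.COUNT
        UPDATE orders
        SET    status = 'PROCESSED'
        WHERE  order_id = l_ids(i);

    COMMIT;
END;
/
```

For very large result sets the fetch is usually bounded with a LIMIT clause inside a loop so the collection does not exhaust session memory.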

Environment: Informatica Power Center 8.6.1, Informatica B2B Data Exchange and Transformation, Oracle 10g/11g, SQL Server, MySQL, Toad 9.1, Tidal 5.3.1, flat files, Windows NT, UNIX, Salesforce, SAP/BW, SAP/ECC, Quality Center, Taleo and @Task.

Confidential, Los Angeles, CA

ETL-Informatica Developer

Responsibilities:

  • Used Oracle Designer 9i to perform data modeling.
  • Documented Tech Specs for the proposed data base design.
  • Developed PL/SQL packages and procedures for the back end processing of the proposed data base design.
  • Conducted PL/SQL training session for co-workers to educate about the latest PL/SQL features, PL/SQL performance tuning.
  • Set up the VSS source-code architecture for the client.
  • Performed code reviews.
  • Used SQL Navigator to reverse engineer the model of the legacy applications.
  • Prepared the data flow of the entire Informatica objects created, to accomplish Integration testing.
  • Scheduled the jobs in Dev and testing environment using Informatica Scheduler.
  • Created Folders and placed all the tested ETL, Oracle and UNIX scripts in the Staging path for production movement.
  • Created and managed many objects in large Oracle Databases containing millions of records.
  • Coded and debugged stored procedures, packages and views in Oracle databases using SQL and PL/SQL, which were called by user-oriented application modules.
  • Extensive querying using SQL*Plus / TOAD to monitor the quality & integrity of data.
  • Analyzed queries using the SQL Trace facility and the Explain Plan utility to obtain the execution process.
  • Wrote PL/SQL programs to read from files, upload data, and perform mass updates/inserts of data based on certain business policies.
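The Explain Plan analysis mentioned in this section follows the standard EXPLAIN PLAN / DBMS_XPLAN pattern; the query below is only a placeholder standing in for the actual application SQL:

```sql
-- Capture the optimizer's plan for a query without executing it
EXPLAIN PLAN FOR
SELECT o.order_id, c.customer_name
FROM   orders    o
JOIN   customers c ON c.customer_id = o.customer_id
WHERE  o.order_date >= TRUNC(SYSDATE) - 30;

-- Display the captured plan from the plan table
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

Reading the plan output (join methods, access paths, cardinality estimates) is what drives the index and rewrite decisions described in the bullets above.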

Environment: Informatica Power Center 7.1, Oracle 9i, SQL Server, Toad, Tidal, Teradata, flat files, Windows NT, UNIX and HP Quality Center.

Confidential, Saint Paul, MN

ETL-Informatica Developer

Responsibilities:

  • Requirement gathering and Business Analysis
  • Extensively used ETL and Informatica to load data from MS SQL Server, Excel spreadsheets and flat files into the target Oracle 9i database.
  • Implemented various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Stored Procedure and Router.
  • Used Informatica features to implement Type I, II, and III changes in slowly changing dimension tables.
  • Writing scripts to create tables and views
  • Developing PL/SQL procedures, packages, triggers, functions
  • Developed PL/SQL procedures/packages to kick off the SQL Loader control files/procedures to load the data into Oracle 9i.
  • Prepared the data flow of the entire Informatica objects created, to accomplish Integration testing.
  • Scheduled the jobs in Dev and testing environment using Informatica Scheduler.
  • Performed performance tuning to increase throughput at both the mapping and session levels, as well as SQL query optimization.
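SQL*Loader loads like those kicked off by the PL/SQL procedures above are driven by a control file of roughly this shape. File, table and column names are illustrative only:

```sql
-- Hypothetical SQL*Loader control file (orders.ctl),
-- invoked e.g. as: sqlldr userid=... control=orders.ctl
LOAD DATA
INFILE 'orders.dat'
APPEND
INTO TABLE stg_orders
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
( order_id,
  customer_id,
  order_date  DATE "YYYY-MM-DD",
  amount )
```

The staging table is then validated and merged into the target tables by the PL/SQL procedures, keeping the raw load separate from the business transformations.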

Environment: Informatica Power Center 7.x, Oracle 9i, MS SQL Server 2000, UNIX, PL/SQL and SQL.
