
Sr. Oracle Developer Resume


NC

SUMMARY:

  • Over nine years of experience as an Oracle Developer, with extensive knowledge of and hands-on experience in the SDLC, including requirements gathering, business analysis, system configuration, design, development, testing, technical documentation and support.
  • Strong experience in systems analysis, design and development of complex software systems in the banking and telecom domains.
  • In-depth knowledge of data analysis, data warehousing and ETL techniques, Business Objects, UNIX shell scripting, SQL, PowerBuilder and PL/SQL scripting.
  • Rich experience with the DataStage ETL tool for extraction, transformation and loading of data among different databases. Strong knowledge of ETL processes using UNIX shell scripting, SQL, PL/SQL and SQL*Loader.
  • Strong experience in front-end and back-end development using Forms 11g/10g, SQL, PL/SQL, Oracle 11g/10g and Oracle Designer 10g for data modeling.
  • Experience with data flow diagrams, database normalization techniques, entity-relationship modeling and design techniques.
  • Rich experience in writing SQL queries, Views, Materialized views, PL/SQL procedures, functions, packages, triggers, cursors, collections, Ref cursor, cursor variables, Dynamic SQL.
  • Strong knowledge of Data Extraction, Data Mapping, Data Conversion and Data Loading process using UNIX Shell scripting, SQL, PL/SQL and SQL Loader.
  • Experience with Performance Tuning for Oracle RDBMS using Explain Plan and HINTS. Good knowledge of key Oracle performance related features such as Query Optimizer, Execution Plans and Indexes.
  • Effectively made use of Indexes, Table Partitioning, Collections, Analytical functions, Materialized Views, Query Re-Write and Transportable table spaces.
  • Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL. Worked extensively on Ref Cursor, External Tables and Collections.
  • Good knowledge on logical and physical Data Modeling using normalizing Techniques.
  • Created Shell Scripts for invoking SQL scripts and scheduled them using crontab.
  • Extensively used bulk collection in PL/SQL objects to improve performance.
  • Experienced in Design and Development of Data Warehousing solutions for Extraction, Transformation and Loading (ETL) mechanism.
  • Used the vi editor to write UNIX Korn and Bash shell scripts; experienced with UNIX utilities such as sed and awk.
  • Extensive experience developing Oracle PL/SQL packages and object types.
  • Proficient in using UNIX shell scripts for testing applications.
  • Good knowledge of UNIX and batch scripts such as file watchers, file FTP, data validation, user-defined functions, file format validation and cron jobs.
  • Developed Materialized views for data replication in distributed environments.
  • Experience in query optimization, performance and tuning (PL/SQL) using SQL Trace, Explain Plan, Indexing, Bulk Collect, AWR and table partitioning.
  • Collaborated with QA teams for creating and reviewing Test Plans and Test Cases.
  • Experience creating and managing defects in the Quality Center.
  • Strong analytical and problem-solving skills and multi-tasking abilities. Self-motivated, flexible, knowledgeable, hardworking and detail-oriented. Good communicator, coordinator and organizer.
  • Worked with business users as well as senior management, and participated in walkthroughs with management and the developer team.
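
As an illustrative sketch only (the path and timeout below are hypothetical placeholders, not from any actual project), the file-watcher jobs mentioned above can be expressed as a small polling loop in shell:

```shell
#!/bin/sh
# Minimal file-watcher sketch: poll for an expected feed file until it
# arrives or a timeout is reached. Path and timeout are illustrative.
watch_for_file() {
    file=$1
    timeout=${2:-10}   # seconds to wait before giving up
    waited=0
    while [ ! -f "$file" ]; do
        if [ "$waited" -ge "$timeout" ]; then
            echo "TIMEOUT waiting for $file"
            return 1
        fi
        sleep 1
        waited=$((waited + 1))
    done
    echo "FOUND $file"
}

# Example: create the file in the background, then watch for it.
( sleep 2; touch /tmp/feed_ready.dat ) &
watch_for_file /tmp/feed_ready.dat 10
```

In practice such a watcher would be scheduled from crontab and hand off to a loader once the file appears.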

TECHNICAL SKILLS:

Methodologies: Waterfall, Agile-Scrum

Requirements Management: JIRA

Database: SQL Server, Oracle 9i/10g/11g.

Scripting: Unix Shell, Perl 5.8.8

Release Management: ITCM, JIRA, RNOW (Release now)

Application Languages: Java, SQL, PL/SQL

Scheduling Tools: Control M, Autosys

Testing Tools: HP Quality Center, JIRA

Operating Systems: Microsoft Windows XP/2007/Vista/7/8.

PROFESSIONAL EXPERIENCE:

Confidential, NC

Sr. Oracle Developer

Responsibilities:

  • Involved in gathering user requirements, data modeling and development of the system.
  • Wrote back-end PL/SQL code to implement business rules through triggers, cursors, procedures, functions and packages using the SQL*Plus editor or TOAD.
  • Partitioned Tables using Range Partitioning, List Partitioning and created local indexes to increase the performance and to make the database objects more manageable
  • Used BULK COLLECT and FORALL in stored procedures to improve performance and make the application run faster.
  • Created Records, Tables, Objects, Collections (Nested Tables and Varrays), and Error Handling.
  • Extensively used Cursors, Ref Cursors, Dynamic SQL and Functions.
  • Developed Shell scripts to automate execution of SQL scripts to check incoming data with master tables, insert the valid data into Customer Management System and invalid data into error tables which will be sent back to sender notifying the errors.
  • Extensively involved in performance tuning using Explain Plan and DBMS_PROFILER; optimized SQL queries and created materialized views for better performance.
  • Developed UNIX Shell Script as required.
  • Used the DBMS_SQLTUNE.REPORT_SQL_MONITOR function to generate SQL monitoring reports and tune queries.
  • Worked on logical and physical data modeling and created Materialized views.
  • Played key role in designing and implementing Strategic and Tactical sourcing of data from various systems, processing it and feeding into Data marts and various external systems
  • Used PL/SQL to extract, transform and load (ETL) data into the data warehouse, and developed PL/SQL scripts in accordance with the necessary business rules and procedures.
  • Developed stored procedures / packages to run ETL jobs to extract customer information from OLTP database into data warehouse.
  • Responsible for decreasing Load times of Data Warehouse by optimizing ETL procedures using PL/SQL
  • Developed Unix Shell Scripts to automate backend jobs, load data into the database.
  • Analysis and gathering of new business requirements from the business resources.
  • Extracting and Loading inventory data from various systems by using PL/SQL packages and procedures.
  • Bulk uploaded/extracted data from databases using ad hoc procedures.
  • Developed PL/SQL routines for daily checks and automated them on windows scheduler.
  • Created temporary tables and materialized views on Oracle Database.
  • Performance tuning of SQL queries using Explain Plan to generate efficient monthly and quarterly reports.
  • Developed PL/SQL Functions in-order to get hierarchy of locations.
  • Extensively used tools like Oracle SQL Developer and Toad to generate PL/SQL programs.
  • Created and maintained Database Objects (Tables, Views, Sequences and stored procedures).
  • Involved in debugging and Tuning the PL/SQL code, tuning queries, optimization for the Oracle database.
  • Extensively used PL/SQL collections, BULK COLLECT, table types, record types and UTL_FILE.
  • Wrote complex stored procedures and packages.
  • Created various Function Based Indexes to significantly improve performance.
  • Extensively used Bulk Collections to insert and update huge amount of data into target databases.
  • Created UNIX shell scripts to automate data loading, extraction and to perform regular updates to database tables to keep in sync with the incoming data from other sources.
  • Used SQL*Loader to load CSV files into the database's temporary tables.
  • Performed ad hoc bulk loading of inventory items into the database using PL/SQL scripts.
  • Optimized SQL to improve query performance using SQL Navigator and PL/SQL Developer.
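
The validate-and-split step described above (valid data into the Customer Management System, invalid data into error tables returned to the sender) can be sketched in shell with awk; the file names, keys and record layout here are hypothetical, invented for illustration:

```shell
#!/bin/sh
# Sketch: records whose key appears in a master list go to the "valid"
# file; the rest go to an error file that can be returned to the sender.

# Hypothetical master keys (one customer id per line).
cat > master_keys.txt <<'EOF'
1001
1002
1003
EOF

# Hypothetical incoming feed: id|name
cat > incoming.dat <<'EOF'
1001|ALICE
9999|BOGUS
1003|CAROL
EOF

# awk loads the master keys on the first pass, then routes each record.
awk -F'|' '
    NR == FNR { master[$1] = 1; next }        # first file: build lookup
    $1 in master { print > "valid.dat"; next }
    { print > "errors.dat" }
' master_keys.txt incoming.dat

echo "valid:   $(wc -l < valid.dat)"
echo "invalid: $(wc -l < errors.dat)"
```

In the real pipeline the valid file would feed a SQL*Loader or PL/SQL insert step, while the error file is mailed back with the rejection reasons.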

Environment: Oracle 10g/11g, PL/SQL Developer, SQL*Plus, TOAD 10.1, SQL*Loader, SVN, Oracle Forms 10g, Oracle Reports 10g, Windows Server 2008/Windows, UNIX, JIRA, Confluence, Metasolve App.

Confidential, NY

Sr. Oracle/Unix Developer

Responsibilities:

  • Involved in tuning the performance for very large database systems
  • Designed and developed several complex database procedures and packages. Extensively used features such as cursors, autonomous transactions, distributed transactions, exception handling, dynamic SQL, PL/SQL tables, bulk load methods, optimizer hints and cursor variables, and returned multi-record sets from procedures and functions.
  • Generated DDL scripts and Created and modified database objects such as tables, views, sequences, functions, synonyms, indexes, packages, stored procedures using TOAD tool.
  • Partitioned Tables using Range Partitioning, List Partitioning and created local indexes to increase the performance and to make the database objects more manageable
  • Created B-tree indexes and function-based indexes on columns to minimize query time and achieve better performance.
  • Worked on the trades, positions, reference data, Market data load process (ETL) within IBTS.
  • Worked on the technical implementation of multiple global orders, trades, products, positions, reference data and market data ETL sourcing projects for the Actimize IBTS application serving Confidential's trade surveillance.
  • Implemented a temp-table-of-ROWIDs / primary-key approach to process trades from the LCDB system into the IBTS system.
  • Used BULK COLLECT and FORALL in stored procedures to improve performance and make the application run faster.
  • Used Bind Variables while writing Dynamic SQL to improve performance.
  • Extensively worked with BULK COLLECT and bulk INSERT, UPDATE and DELETE operations for loading, updating and deleting huge volumes of data.
  • Used Oracle analytical functions such as RANK, DENSE_RANK, LEAD, LAG, LISTAGG and ROW_NUMBER for ranking and windowed calculations over ordered data.
  • Created and modified database objects such as Tables, Views, Materialized views, Indexes, Sequences and constraints, SQL queries (Sub queries and Join conditions).
  • Implemented CTAS (Create Table As Select), Exchange Partition approaches to optimize archive process.
  • Provided quality operations support for the production environment. Worked with QA to deliver timely technical resolutions for defects.
  • Created various UNIX Shell Scripts for scheduling various data loading process. Maintained the batch processes using Unix Shell Scripts.
  • Conducted Oracle database and SQL code tuning to improve performance of the application, used Bulk binds, in-line queries, Dynamic SQL, Analytics and Sub-query factoring etc.
  • Created and tested various complex reports in different styles in Oracle 10g.
  • Created Cursors and Ref cursors as a part of the procedure to retrieve the selected data.
  • Created SQL Loader scripts using UNIX shell scripting and PL/SQL
  • Involved in performance tuning using Indexes, Hints, Explain Plan, Stats gathering etc.
  • Used techniques like direct path load , external tables for loading the data faster.
  • Involved in the technical consolidation and integration of trades, orders, market data and positions feeds from various sources into the LCDB data warehouse.
  • Performed SQL and PL/SQL tuning to improve the performance with the help of SQL Trace, Explain Plan, Indexing and Hints.
  • Used DBMS_STATS to collect and build data statistics for the cost-based optimizer, find chained rows and build histograms.
  • Supported multiple downstream applications such as PLM, GPLM, MANTAS, Trade Surveillance, ACTIMIZE and GWRS/DMS.
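
Maintaining scheduled batch processes with shell scripts, as described above, typically includes a post-run log check. A minimal sketch (the log content below is fabricated purely for illustration) might scan a load log for ORA- errors and summarize them:

```shell
#!/bin/sh
# Sketch: scan a batch load log for ORA- errors and summarize them,
# as a post-step in a scheduled batch wrapper. Log content is fabricated.

cat > load_job.log <<'EOF'
Record 1: loaded
Record 2: rejected - ORA-01400: cannot insert NULL
Record 3: loaded
Record 4: rejected - ORA-00001: unique constraint violated
EOF

errors=$(grep -c 'ORA-' load_job.log)
if [ "$errors" -gt 0 ]; then
    echo "Batch completed with $errors ORA- errors:"
    grep -o 'ORA-[0-9]*' load_job.log | sort | uniq -c
else
    echo "Batch completed cleanly"
fi
```

A wrapper like this would normally exit non-zero on errors so the scheduler (Control-M, Autosys or cron) can flag the job.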

Environment: Oracle 10g/11g, TOAD 10.6.1, SQL*Loader, Oracle Forms/ Reports 10g, Windows 7, UNIX Shell Scripting, Emptoris v9/v10.1, Combionics, Framesoft, TFS.

Confidential, Deerfield IL

Oracle Developer

Roles & Responsibilities:

  • Created database objects like Tables, views, procedures, functions, packages, materialized views and Indexes.
  • Created packages and procedures to automatically drop and create table indexes.
  • Worked extensively on Ref Cursors, External Tables and Collections.
  • Developed materialized views for data replication in distributed environments.
  • Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
  • Wrote complex queries including Sub queries, Joins, Unions, Intersect and Aliases.
  • Developed several procedures and functions using advanced Oracle concepts such as REF CURSORS and BULK COLLECTS, INDEX BY TABLES and FORALL statements to improve performance in insert and update statements.
  • Performed unit and integration testing of the applications created using PL/SQL modules and used PVCS for version control in team development.
  • Wrote complex SQL queries including inline queries and sub queries for faster data retrieval from multiple tables
  • Used the DBMS_SQLTUNE.REPORT_SQL_MONITOR function to generate SQL monitoring reports and tune queries.
  • Implemented a temp-table-of-ROWIDs / primary-key approach to update user ID and week number data coming from the CHAMP (point-of-sale) system.
  • Involved in gathering user requirements, data modeling and development of the system.
  • Worked on logical and physical data modeling and created Materialized views.
  • Used PL/SQL to extract, transform and load (ETL) data into the data warehouse, and developed PL/SQL scripts in accordance with the necessary business rules and procedures.
  • Developed stored procedures / packages to run ETL jobs to extract customer information from OLTP database into data warehouse.
  • Responsible for decreasing Load times of Data Warehouse by optimizing ETL procedures using PL/SQL
  • Partitioned Tables using Range Partitioning, List Partitioning and created local indexes to increase the performance and to make the database objects more manageable
  • Improved a CHAMP module's performance roughly 24x (reducing runtime from 16 hours to 40 minutes) by adopting a more efficient data access path than the old one.
  • After ETL, was involved in data validation and load performance of target tables for reporting purposes.
  • Developed control files for SQL* Loader and PL/SQL scripts for Data Extraction/Transformation/Loading (ETL) and loading data into interface Tables (Staging area) from different source systems and validating the data
  • Developed various Mappings to load data from various sources using different Transformations.
  • Created and modified database objects such as Tables, Views, Materialized views, Indexes, Sequences and constraints, SQL queries (Sub queries and Join conditions).
  • Used Bulk Collect, Bulk Insert, Update functions in the ETL Programs.
  • Created and used Table Partitions to further improve the performance while using tables containing large number of columns and rows.
  • Extensively involved in performance tuning using Explain Plan and DBMS_PROFILER; optimized SQL queries and created materialized views for better performance.
  • Implemented the data loading process using UNIX Korn shell scripts; resolved application issues in the UNIX production environment and read and interpreted UNIX logs.
  • Extensively used error and exception handling techniques for validation purposes in code.
  • Involved in fine tuning the existing packages for better performance and providing on-going support to existing applications and troubleshooting serious errors when occurred.
  • Developed an effective error mechanism by creating monitor and error-log tables using autonomous transactions.
  • Designed and maintained project data models like ERDs (Entity Relationship Diagrams) using Toad Data Modeler tool using forward as well as Reverse Engineering techniques.
  • Interacted with developers and Quality Assurance team for resolving application and database related problems.
  • Extensive testing was done on the programs to ensure accuracy and timely processing of data.
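
The SQL*Loader control files developed for the staging loads above can be templated from shell. As a sketch under assumed names (the table, columns and data file below are hypothetical, not from any actual project):

```shell
#!/bin/sh
# Sketch: generate a SQL*Loader control file from a table name, a column
# list and a data file name. All names here are illustrative.
gen_ctl() {
    table=$1
    columns=$2
    datafile=$3
    cat <<EOF
LOAD DATA
INFILE '$datafile'
APPEND
INTO TABLE $table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
($columns)
EOF
}

gen_ctl stg_customer "cust_id, cust_name, created_dt DATE 'YYYY-MM-DD'" customers.csv > stg_customer.ctl
cat stg_customer.ctl
```

Templating the control file keeps the staging loads consistent when new source feeds are added, since only the table and column arguments change.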

Environment: Informatica Power Center 9.5, Oracle 10g/11g, Powerbuilder 9.0, PL/SQL, SQL, UNIX, TOAD, Subversion, SQL*Plus, Control-M.

Confidential, Chicago, IL

PL/SQL developer

Responsibilities:

  • Analyzed and reviewed the database design to better understand the relations, associations and dependencies within the database.
  • Analysis of business functionality with the client and the developers.
  • Depending on the business requirements, created and modified database objects like tables, indexes, views, stored procedures, functions, packages, Triggers, synonyms, Materialized views, to make new enhancements or resolve problems.
  • Created various SQL and PL/SQL scripts as required.
  • Extensively used Cursors, Ref Cursors and Exceptions in developing Packaged Procedures and Functions. Developed SQL and PL/SQL scripts to transfer tables across the schemas and databases.
  • Worked on complete SDLC (System Development Life Cycle) including system analysis, High level design, detailed design, coding and testing.
  • Creation of database objects like Tables, Views, Procedures using Oracle tools like SQL*Plus, SQL Developer and Toad.
  • Developed various complex queries to generate various reports based on the requirements.
  • Developed PL/SQL packages for generating various feeds from the Oracle database using the UTL_FILE utility on the client's secure data transmission server.
  • Developed UNIX scripts for transmitting files from the secure server to the customer specified server, using various FTP batch processes.
  • Involved in writing Triggers in various levels like Database level, Table level to track information about users (login, logout information).
  • Involved in setting up the DBMS_JOB package, which allows a user to schedule a job to run at a specified time.
  • Developed UNIX scripts for sending mails to the bank and the business team, when the external feeds to the bank are received.
  • Designed and implemented client-server application using Oracle Forms 10g.
  • Involved in the generation of User Interface using oracle forms by extensively creating forms as per the client requirements.
  • Developed various new reports from scratch utilizing Oracle Reports 10g for day to day validations.
  • Involved in creating INDEXES to avoid the need for large-table, full-table scans and disk sorts, and for fast retrieval of data from database objects while generating reports.
  • Created complex triggers for generating primary key values and for implementing the complex Business logic.
  • Involved in bulk loading of data from non-Oracle data sources and flat files using SQL*Loader.
  • Optimized SQL and PL/SQL Query performance using Oracle utilities like Explain Plan, SQL-Trace.
  • Worked with Bulk Collects to improve the performance of multi-row queries. Involved in performance tuning using Explain Plan, TKPROF utilities.
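
The feeds generated for the bank above are typically framed with header and trailer records so the receiver can validate completeness before processing. A sketch under an invented layout (the record format and file names are hypothetical):

```shell
#!/bin/sh
# Sketch: wrap a generated feed body with a dated header and a trailer
# carrying the record count, so the receiving system can verify the feed
# arrived complete. Layout is illustrative, not an actual feed spec.
cat > feed_body.txt <<'EOF'
1001,500.00
1002,75.25
1003,19.99
EOF

records=$(wc -l < feed_body.txt | tr -d ' ')
{
    echo "HDR|$(date +%Y%m%d)"
    cat feed_body.txt
    echo "TRL|$records"
} > feed_out.txt

tail -1 feed_out.txt
```

The FTP batch process then ships `feed_out.txt`, and the receiver rejects the file if the trailer count disagrees with the body.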

Environment: Oracle 10g, Forms 10g, Reports 10g, SQL*Plus, TOAD, SQL*Loader, SQL, PL/SQL Developer, UNIX, Windows XP.

Confidential

Data Analyst /Oracle Developer

Responsibilities:

  • Coordinated with the front end design team to provide them with the necessary stored procedures and packages and the necessary insight into the data.
  • Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.
  • Created scripts to create new tables, views, queries for new enhancement in the application using oracle tools like Toad, PL/SQL Developer and SQL* plus.
  • Worked on SQL*Loader to load data from flat files obtained from various facilities every day.
  • Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
  • Involved in the continuous enhancements and fixing of production problems.
  • Generated server side PL/SQL scripts for data manipulation and validation and materialized views for remote instances.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Created indexes on the tables for faster retrieval of the data to enhance database performance.
  • Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.
  • Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF.
  • Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.
  • Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package.
  • Partitioned the fact tables and materialized views to enhance the performance.
  • Extensively used bulk collection in PL/SQL objects to improve performance.
  • Created records, tables and collections (nested tables and VARRAYs), improving query performance by reducing context switching.
  • Extensive experience in Relational and Dimensional Data modeling for creating Logical and Physical Design of Database and ER Diagrams using multiple data modeling tools like ERWIN.
  • Wrote UNIX shell scripts to process files on a daily basis: renaming the file, extracting the date from the file name, unzipping the file and removing junk characters before loading them into the base tables.
  • Extensively used the advanced features of PL/SQL like Records, Tables, Object types and Dynamic SQL.
  • Handled errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application.
  • Wrote shell scripts for scheduling jobs.
  • Used PVCS for source code configuration management.
  • Assisted in testing and deployment of the application.
  • Conducted User Acceptance Testing (UAT) and collaborated with the QA team to develop the test plans, test scenarios, test cases, test data to be used in testing based on business requirements, technical specifications and/or product knowledge.
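
The daily file-preparation step above can be sketched in a few lines of shell; the naming convention (`INV_YYYYMMDD.dat`) and sample content are hypothetical, invented for illustration:

```shell
#!/bin/sh
# Sketch: extract the date embedded in an incoming file name, strip
# carriage-return junk (common after Windows transfers), and rename the
# cleaned file for loading. File names and content are illustrative.
printf 'good line\r\nanother line\r\n' > INV_20240115.dat

# Extract the date token from the file name (digits only).
fname=INV_20240115.dat
fdate=$(echo "$fname" | sed 's/[^0-9]//g')

# Strip carriage returns and write the cleaned, renamed copy.
tr -d '\r' < "$fname" > "inventory_${fdate}.dat"

echo "prepared inventory_${fdate}.dat"
```

A real version would also handle unzipping (`gunzip`) and archive the raw input before loading the cleaned file into the base tables.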

Environment/Tools: Oracle 11g, SQL*Plus, TOAD, SQL*Loader, Informatica, SQL Developer, Shell Scripts, UNIX, ERWIN, Quality Control.
