
Onsite Lead Resume


Profile:

  • 6+ years of professional experience developing Oracle PL/SQL, UNIX shell scripts, Autosys jobs, PL/SQL performance tuning, Perl, V$ views, DBA views, and SQL Server T-SQL for industries including retail and investment banking, human resources, and healthcare.
  • Extensive experience in the retail domain on Walmart.com projects such as Catalog.
  • Extensive experience in the banking domain on Wells Fargo Financial Services' data warehouse project, CDS.
  • Extensive experience in the banking domain on the GTFP project (IFC - World Bank Group).
  • Extensive experience in the banking domain on BDR, a project of SOCIETE GENERALE Retail & Investment Banking, a French bank.
  • As onsite lead, coordinated with the offshore team: conveyed business requirements, identified requirement gaps, reviewed code, and mentored the team to complete development on time.
  • Experience on large data warehouse projects, building ETL processes with PL/SQL scripts, snapshots, views, BULK COLLECT, and UTL_FILE utilities to handle data loads of 100 million rows (see the sketch after this list).
  • Experience in the large-scale retail domain with Walmart.com, applying PL/SQL best practices for site-facing APIs.
  • Experience with emergency bug fixes, coordinating with the CPE team to restore site functionality during incidents.
  • Proposed and re-designed the Incremental Data Push project, adding business value and tuning it to enhance Walmart.com's profitability.
  • Involved in performance tuning and PL/SQL code review; suggested methods to fine-tune existing code, reducing ETL process runtime from 60 hours to 5 hours through PL/SQL coding best practices and ETL process design.
  • Experience developing and maintaining PL/SQL packages, procedures, functions, cursors, and triggers, and writing complex SQL with joins and sub-queries.
  • Database experience with Oracle 10g/9i/8i/8.x, TOAD, SQL, PL/SQL, SQL*Plus, SQL*Loader, and SQL Server.
  • Strong experience developing Oracle PL/SQL programs per spec: packages, stored procedures, functions, anonymous blocks, triggers, views, DB changes, and Perl scripting.
  • Converted large Oracle SQL queries into T-SQL and delivered them to the source team, which runs them against a SQL Server DB and produces file extracts to load and process in CDS.
  • Worked in all phases of the SDLC: requirement analysis, estimation, test case writing, technical design, code development, unit testing, implementation, and bug fixes.
  • Involved in the successful production deployment of CDS; handled multiple CRs for each release with no post-deployment production issues.
  • Created a game plan for every release and delivered on time by coordinating with offshore DBAs and the Autosys team; every deployment since the first release has gone in without disrupting the running production system.
  • Involved in data modeling with Erwin: created the schema objects in Erwin and deployed them to the database.
  • Involved in developing mass-upload procedures and scripts that validate business rules and load data from source files to target databases using PL/SQL and SQL*Loader.
  • Developed packages, procedures, and PL/SQL scripts that verify business rules and load sensitive data for SG: market data (ratings), securities mass uploads, third-party mass uploads, SAC mass uploads, ICARE mass uploads, and S&P, Moody's, and Fitch data.
  • User account management: creating users, disabling users, and responsibility management; involved in preparing quality documents based on the requirements per the SDLC.
  • Monitored and managed tablespace and data file growth; analyzed tables and schemas, collected system statistics, and rebuilt indexes.
  • Solid experience developing with user-defined and built-in exceptions.
  • Exposure to banking applications and their functionality.
  • Worked on enhancements, projects, and bug fixes for banking-domain applications.
  • Appreciated by the onsite team in Paris for many sensitive, defect-free, on-target deliveries.
  • Able to deliver quality work even against tight targets, per client expectations on the SAC project.
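
A minimal sketch of the BULK COLLECT load pattern referenced above, assuming hypothetical staging and target tables stg_items and tgt_items with matching columns:

  DECLARE
    CURSOR c_src IS
      SELECT * FROM stg_items;              -- hypothetical staging table
    TYPE t_rows IS TABLE OF stg_items%ROWTYPE;
    l_rows t_rows;
  BEGIN
    OPEN c_src;
    LOOP
      -- Fetch in fixed-size batches instead of row by row
      FETCH c_src BULK COLLECT INTO l_rows LIMIT 10000;
      EXIT WHEN l_rows.COUNT = 0;
      -- Bind the whole batch into a single INSERT round trip
      FORALL i IN 1 .. l_rows.COUNT
        INSERT INTO tgt_items VALUES l_rows(i);
      COMMIT;                               -- commit per batch to keep undo small
    END LOOP;
    CLOSE c_src;
  END;
  /

Keeping the LIMIT bounded caps memory per batch, which is what makes this pattern viable at the 100-million-row scale.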

Education Qualification:
Master of Computer Applications

Technical Summary


Languages:

C/C++, Java, HTML, JavaScript, SQL, PL/SQL, UNIX Shell Scripting, J2EE, Perl Scripting

RDBMS:

Oracle 10g/9i/8i/8.x, SQL SERVER 2008

DB Tools:

TOAD 9.0.1, SQL*Plus, PL/SQL Developer 7.1.4.1390, SQL Optimizer

Modeling Tools:

ERwin 4.0

Reporting Tools:

BusinessObjects, WebI, Crystal Reports 11, Oracle Forms & Reports D2K

Oracle Utilities:

SQL*Loader, Export/Import Utilities, External Tables

Oracle Tuning:

SQL Tuning, Explain Plan, Table Partitioning, Materialized Views, Hints,
B-Tree/Bitmap Indexing, SQL Trace, TKPROF

OS:

UNIX, Windows

PROFESSIONAL EXPERIENCE

Company: Confidential, San Bruno, CA    Duration: May 2011 – Present
Project Name: Catalog (Wal-Mart Items/Bundles/Incremental Data Push/Smart Pricing/Pangaea)
Technology: Oracle 10g, PL/SQL Developer, SQL*Plus, SQL Optimizer, SQL, Dynamic SQL, SQL*Loader, TOAD 8.5/8.6, BULK COLLECT techniques
Role: Onsite Lead
Description: Wal-Mart serves customers and members more than 200 million times per week at more than 9,759 retail units under 60 different banners in 28 countries. With fiscal year 2010 sales of $405 billion, Wal-Mart ranked in the top ten among retailers in Fortune Magazine's 2010 Most Admired Companies survey.
The Catalog project deals with all Wal-Mart items and their details: price, availability, inventory, seller details, etc.
I was involved in Bundle Enhancement Phases I, II, and III and in the Incremental Data Push project across the 11.7 through 11.10, 12.1, 12.2, 12.3, and 12.5 releases. Bundle items are combinations of related items with re-calculated price, quantity, and descriptions at both the bundle and the item level, letting users pick the bundle combinations they need; selling items as bundles improved sales through Walmart.com. The bundle enhancements added the ability to create, modify, and remove items within a bundle as business needs change, improving the business of selling attractive bundled items. The Incremental Data Push project captures and replicates data from the staging DB to the production (site-facing) DB. Business transactions made in the staging DB must be pushed to the site-facing production database, a critical task: the large volume of changes in PPROD is captured by a trigger framework and later pulled and replicated to the PROD tables through the replication process. I designed the task sequencing and added logging so that, on errors, the process reports at the most granular level. A sketch of the kind of change-capture trigger involved follows.
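
An illustrative sketch of a change-capture trigger of the sort such a framework relies on; the table and column names (stg_item_price, item_price_chg_log) are hypothetical, not Walmart.com's actual schema:

  -- Each DML on the staging table writes a row to a change-log table
  -- that the replication job later reads and applies to the
  -- site-facing database.
  CREATE OR REPLACE TRIGGER trg_item_price_cdc
  AFTER INSERT OR UPDATE OR DELETE ON stg_item_price
  FOR EACH ROW
  DECLARE
    l_op VARCHAR2(1);
  BEGIN
    l_op := CASE
              WHEN INSERTING THEN 'I'
              WHEN UPDATING  THEN 'U'
              ELSE 'D'
            END;
    INSERT INTO item_price_chg_log (item_id, operation, change_ts)
    VALUES (NVL(:NEW.item_id, :OLD.item_id), l_op, SYSTIMESTAMP);
  END;
  /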

Responsibilities:

  • As lead developer, coordinated between onsite and offshore to achieve on-time deliveries for every release.
  • Designed the Incremental Data Push project for tighter control over logging and performance; the suggested re-design paved the way for business profitability.
  • Gathered business requirements and communicated them to the offshore team in daily meetings, guiding the team through requirements gathering, coding, QA, and implementation.
  • Worked closely with the QA team during the QA phase to classify bugs, enhancements, and fixes; relayed QA issues and requirement changes to the offshore development team in team meetings, reviewed the resulting fixes during onsite hours, and worked any further issues.
  • Guided the offshore team in understanding the existing business logic and helped them continue development for new requirements.
  • Communicated business needs to the offshore team clearly, keeping everyone on the same page so that no requirements were missed or misunderstood in deliveries.
  • Analyzed the existing complex packages and paved the way to implement Bundles Phase I, which merchants use to add, delete, and modify items within bundles; this lets merchants change bundle properties based on customer needs.
  • Wrote complex business logic using PL/SQL techniques such as BULK COLLECT, types, and string-handling procedures and functions; because the Catalog project handles item flows among various systems, a change to any item attribute impacts the business in multiple ways, so a keen understanding of the existing system was essential.
  • In Bundle Enhancement Phase II, delivered a bulk-loading option for creating bundle items in bulk.
  • Created the bulk bundle procedure using PL/SQL facilities such as cursors, parameterized cursors, complex business validations, exception handling, and logging via autonomous transactions (see the sketch after this list).
  • Controlled the bulk-loading flow with error flags that drive commit/rollback on errors.
  • Created procedures to generate reports for the front end, TOOL, a PowerBuilder client.
  • In Bundle Enhancement Phase III, created complex views to filter out bundle items sharing a common visual variant, a concept that helps categorize bundle items and their visibility on the .com site.
  • Created complex views using grouping techniques to filter bundles with a visual variant, mandatory items, visible display status, items related to Wal-Mart, etc., implementing the business logic in complex SQL.
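
A minimal sketch of the autonomous-transaction logging technique used in the bulk bundle procedure; the table and procedure names (bundle_load_errors, log_bundle_error) are illustrative:

  -- The autonomous transaction lets the log row commit even when the
  -- calling bulk-load procedure rolls its own work back on error.
  CREATE OR REPLACE PROCEDURE log_bundle_error (
    p_bundle_id IN NUMBER,
    p_message   IN VARCHAR2
  ) IS
    PRAGMA AUTONOMOUS_TRANSACTION;
  BEGIN
    INSERT INTO bundle_load_errors (bundle_id, error_msg, logged_at)
    VALUES (p_bundle_id, SUBSTR(p_message, 1, 4000), SYSDATE);
    COMMIT;  -- commits only this insert, not the caller's transaction
  END log_bundle_error;
  /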

Company: Confidential, Tempe, AZ    Duration: Jun 2010 – Apr 2011
Project Name: CDS (Finance Profitability Bridge Project)
Technology: Oracle 10g, SQL*Plus, SQL Optimizer, SQL, Dynamic SQL, SQL*Loader, TOAD 8.5/8.6, Perl scripting, BULK COLLECT techniques, UNIX shell scripting, Autosys jobs
Role: Lead Developer
Description: Confidential is a diversified financial services company providing banking, insurance, investments, mortgage, and consumer and commercial finance through more than 10,000 stores, 12,000 ATMs, and the Internet (wellsfargo.com and wachovia.com) across North America and internationally.
The CDS project covers the merger of data between East Wachovia and Wells Fargo; because of the merger, the two companies' data had to be consolidated into a single entity. CDS's key function is to receive data from three legacy platforms (CP, ICON, and ProfitMax) and consolidate it for use by the ProfitMax NextGen platform.
Data from East Wachovia is sent through the NDM process to a UNIX box; a pre-scheduled file watcher detects each file, a Perl script is invoked through a shell script, and SQL*Loader loads the data into staging tables. The loaded data then goes through the merge process, where refreshed snapshot data is merged with the pre-loaded data into the combined-profit final target tables. Staging rows are checked for errors and marked in the target tables with error flags and details. Finance power users were given access to the target tables through snapshots used to verify the processed data. The project extracts and sources data from flat files and snapshots and merges it into the targets using PL/SQL engines, with the overall process automated by Autosys jobs, Perl scripts, and shell scripts that call the PL/SQL engines to transform the data and load it into the target tables. A sketch of the merge step follows.
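
An illustrative sketch of the set-based merge step, assuming hypothetical staging and target tables cds_profit_stage and cds_profit_target:

  -- Merge a validated staging batch into the combined-profit target:
  -- existing rows are refreshed, new rows inserted, in one pass.
  MERGE INTO cds_profit_target t
  USING (SELECT acct_id, profit_amt, src_system
           FROM cds_profit_stage
          WHERE error_flag = 'N') s           -- only rows that passed validation
     ON (t.acct_id = s.acct_id AND t.src_system = s.src_system)
   WHEN MATCHED THEN
     UPDATE SET t.profit_amt = s.profit_amt
   WHEN NOT MATCHED THEN
     INSERT (acct_id, profit_amt, src_system)
     VALUES (s.acct_id, s.profit_amt, s.src_system);
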
Responsibilities:

  • Involved in analysis, design, and estimation of the project, delivering on time per the functional specification.
  • Converted large Oracle SQL queries into T-SQL and delivered them to the source team, which runs them against a SQL Server DB and produces file extracts to load and process in CDS.
  • During UAT, interacted with various teams to fix issues at a fast pace, explaining data issues to users and source teams so they could be resolved quickly.
  • Developed and reviewed the PL/SQL code written by team members and fine-tuned it with coding best practices, reducing processing time from 60 hours to 16 hours.
  • Analyzed issues raised by users in both UAT and PROD, providing fixes and explanations of whether each was a data issue, a code issue, a data-refresh timing issue, etc.
  • Performed performance tuning by adding hints, using materialized views, and reviewing and modifying PL/SQL code; a process that had run 40 hours was reduced to fit within a business day.
  • Partitioned tables and indexes to improve performance; estimated the tablespace needed to hold the ETL process and the bulk data flow from source to target.
  • Expert in using V$ views and DBA views to watch the status of running scripts and find issues such as locks, poorly performing queries, and execution plans.
  • Used DBA views to list database users by account availability, account status, locks, expired accounts, etc.
  • Provided extended support during non-business hours, solving issues round the clock with offshore DBAs, East Coast business users, and analysts to keep UAT on pace.
  • Handled production deployments of UNIX shell scripts and SQL files, coordinating other team members during CR windows.
  • Created game plans for production CRs and executed them in cooperation with other teams, achieving successful, on-time deployments within the CR window per client policy.
  • Involved in overall design, timelines, and deployment of code and files to the SIT, UAT, and PROD environments for the CDS project.
  • Developed and tuned the PL/SQL scripts written for the merge process; performance-tested and changed the code to minimize performance lags.
  • Suggested methodologies to team members for overcoming performance lags in their code.
  • Used materialized views, indexes, BULK COLLECT utilities, views, and optimizer hints; tuned PL/SQL code to avoid heavy database hits by consolidating data behind indexed views and materialized views and by using effective cursors.
  • Created Autosys jobs using utilities such as the file watcher and scheduler; wrote Perl scripts to load files into target tables with SQL*Loader.
  • Created Autosys BOX and COMMAND jobs to automate the whole merge process.
  • Handled more than 100 million records in the upload process with a proper data warehouse approach: fine-tuned PL/SQL code, BULK COLLECT utilities, intermediate materialized views, and optimizer hints.
  • Created materialized views sourced through DB links, with refresh schedules driven by UNIX shell scripting and DBMS_JOB; created intermediate snapshots that reduced processing time and improved performance (see the sketch after this list).
  • Created predefined roles granting the team the appropriate privileges on CDS objects.
  • Created control files, Perl scripts, and PL/SQL scripts for the mass data merge process.
  • Created control files that use Oracle built-in functions to transform data during loading.
  • Wrote the Perl scripts to mail the team alerts on success, on failures due to ORA- errors, and on failures due to error records, with proper error-handling and detail in each mail.
  • Strong in UNIX OS commands, shell scripts, and Perl scripting for the mass uploads.
  • Automated the project to minimize human support: email report generation, Autosys scheduling, and a Perl script that fetches, loads, and moves each processed file to the appropriate folder and mails details to the team.
  • Reviewed, tested, and tuned SQL and PL/SQL scripts to improve performance, using EXPLAIN PLAN methods.
  • Created complex views and snapshots to pull data from heterogeneous systems, e.g., SQL Server 2005 into Oracle 10g.
  • Turned complex queries into materialized views with proper refresh schedules using UNIX Autosys jobs and DBMS_JOB.
  • Performed DBA activities such as ANALYZE, materialized view refreshes, allocating tablespaces per project requirements, and creating roles and users and granting roles to finance power users.
  • Used DBMS features to set up the ETL load from sources to targets with proper exception handling.
  • Involved in overall integration of the CDS project, from file load into the staging tables through presenting the target data to the finance power users.
  • Used the UTL_FILE utility to generate flat-file extracts and delivered them to upstream teams.
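
A minimal sketch of a materialized view sourced over a DB link with a DBMS_JOB refresh schedule, as described above; the names (mv_profit_summary, src_link) are assumptions:

  -- Intermediate snapshot over a db link, refreshed nightly.
  CREATE MATERIALIZED VIEW mv_profit_summary
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND AS
  SELECT acct_id, SUM(profit_amt) AS profit_amt
    FROM cds_profit_target@src_link
   GROUP BY acct_id;

  DECLARE
    l_job BINARY_INTEGER;
  BEGIN
    DBMS_JOB.SUBMIT(
      job       => l_job,
      what      => 'DBMS_MVIEW.REFRESH(''MV_PROFIT_SUMMARY'', ''C'');',
      next_date => TRUNC(SYSDATE) + 1 + 2/24,    -- 2 a.m. tomorrow
      interval  => 'TRUNC(SYSDATE) + 1 + 2/24'   -- then nightly at 2 a.m.
    );
    COMMIT;  -- DBMS_JOB submissions take effect on commit
  END;
  /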

Project Name: Confidential, Washington    Duration: Mar 2009 – Feb 2010
Technology: Oracle PL/SQL, Oracle 10g Database, Erwin Data Modeling
Role: Senior Software Engineer – worked as Oracle PL/SQL developer
Description: The Confidential Global Trade Finance Project handles sensitive transactions between the World Bank, issuing banks, corporates, and various countries' government entities sharing the funding transactions for funding activities. The GTFP project takes care of the transactions between the World Bank and beneficiaries such as the developing countries of the world.
Responsibilities

  • Involved in data migration from the legacy system to the enhanced application by functionally mapping the legacy data to the current system; used SQL*Loader with hand-written control files and Oracle PL/SQL scripts to upload flat-file data to the target tables.
  • Created and maintained database stored procedures, functions, packages, views, materialized views, types, tables, and triggers that efficiently implement business functions; worked with exception handling, cursors, and DBMS packages.
  • Wrote approximately 200 SQL queries to get counts from the new application.
  • Tuned SQL statements using SQL Trace.
  • Uploaded and downloaded data through the UTL_FILE utility (see the sketch after this list).
  • Extracted data from different sources (Oracle, flat files, external files), transformed it per business requirements, and loaded it into the Oracle target database.
  • Worked with tools such as TOAD and SQL Navigator to design the SQL statements used as extraction criteria for interfaces, and with EXPLAIN PLAN for performance tuning; worked on UNIX systems.
  • Designed for the product: prepared internal design documents, delegated development and design tasks to team members, did brief project planning for scheduling, development, and review, and prepared, reviewed, and executed unit test plans while assisting team members.
  • Communicated several times per day with end users to provide technical guidance and discuss new development.
  • Created the DB model for GTFP Phase II using the Erwin data modeling tool.
  • Wrote packages, procedures, and functions per the requirements.
  • Wrote procedures for batch jobs executed on regular daily routines.
  • Wrote audit triggers to back up and track data modifications.
  • Created all the database objects with Erwin data modeling, including referential integrity between the DB objects and the check constraints, indexes, and sequences the requirements called for.
  • Created the data dictionary for GTFP Phases I and II using the Erwin data modeling tool.
  • Created complex views for the reporting part of the project.
  • Created materialized views with scheduled refreshes that pull data from another schema's database.
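
A minimal sketch of the UTL_FILE usage referenced above; the directory object EXTRACT_DIR, the file name, and the table gtfp_transactions are assumptions for illustration:

  -- Write query results to a flat file in a server directory object.
  DECLARE
    l_file UTL_FILE.FILE_TYPE;
  BEGIN
    l_file := UTL_FILE.FOPEN('EXTRACT_DIR', 'gtfp_extract.dat', 'w');
    FOR r IN (SELECT txn_id, txn_amt FROM gtfp_transactions) LOOP
      UTL_FILE.PUT_LINE(l_file, r.txn_id || '|' || r.txn_amt);
    END LOOP;
    UTL_FILE.FCLOSE(l_file);
  EXCEPTION
    WHEN OTHERS THEN
      IF UTL_FILE.IS_OPEN(l_file) THEN
        UTL_FILE.FCLOSE(l_file);  -- never leave the handle open on error
      END IF;
      RAISE;
  END;
  /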


Project Name: Confidential    Duration: Jul 2007 – Feb 2009
Technology: Oracle PL/SQL, UNIX shell scripting, Oracle 10g Database
Role: Senior Software Engineer – worked as Oracle PL/SQL developer
Description: Confidential, a French banking concern, provides services in retail banking, specialized financial services, corporate & investment banking, asset management, private banking, securities services, payment services, etc. The project, BDR (Base Domino Reference), is a referential database used by more than 270 banking clients around the world. It deals with various aspects of the investment banking domain: counterparties (third parties), securities, market data such as agency ratings, analytical structure, and transversal data. Technically, the project supports various banking data operations through database programs: extractions and treatments launched by sessions at scheduled times, which execute DB scripts and mail status reports to clients, along with automated mass-upload developments using Oracle packages, procedures, and PL/SQL script programming.
Responsibilities

  • Involved in all phases of the SDLC from the 10.3 release to the 11.0_1 release, more than 10 releases over the past year and a half; involved in various stages of enhancements (ENH), projects (PRJ), and production bug fixes.
  • Developed PL/SQL scripts (treatments) for major projects such as the ICARE mass upload: created procedures and functions for multi-string data and validated more than 120 business rules. Communicated effectively with the Paris BAs to close functional gaps; the Paris side appreciated the quality of this delivery.
  • Developed PL/SQL scripts for internal ratings integration: implemented business rule validations and smart update, designed the report per spec, and created $U objects such as the session that launches the treatment and mails the status report to clients.
  • Developed procedures for the automatic securities mass upload: records from a .csv in the client's format are validated against various business rules and then integrated into the securities tables, and status reports are sent to clients. Used PL/SQL tables, system-defined exceptions, various user-defined date exceptions via PRAGMA exceptions (see the sketch after this list), record types, and autonomous transactions to commit error logs.
  • SAC (Single Accounting Calendar), an important project for SG: created a common package called both from the web and by the mass-upload procedure. The package's procedures validate business rules and integrate the records; the procedure designed for mass upload segregates the data from the files. Used packages, procedures, temporary tables, MERGE statements, dynamic SQL, autonomous transactions, user-defined and predefined exceptions, and smart-update concepts, with de-activation/re-activation implemented because physical deletes are not allowed in BDR.
  • Received appreciation from the Paris onsite manager for defect-free delivery of the external ratings data integration (ratings data from agencies such as S&P, Moody's, and Fitch).
  • Implemented a new versioning mechanism: triggers back up modified records and their modification history from T-tables to H-tables.
  • Developed computing functions for complex algorithms.
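
A minimal sketch of the PRAGMA EXCEPTION_INIT pattern used for the date validations in the mass uploads; the input value and message are illustrative:

  -- Map a named exception to an Oracle error so a bad input date is
  -- rejected with a readable message instead of a raw ORA- code.
  DECLARE
    e_bad_day EXCEPTION;
    PRAGMA EXCEPTION_INIT(e_bad_day, -1847);  -- ORA-01847: day of month out of range
    l_eff_date DATE;
  BEGIN
    l_eff_date := TO_DATE('32/01/2009', 'DD/MM/YYYY');  -- bad value from a .csv row
  EXCEPTION
    WHEN e_bad_day THEN
      DBMS_OUTPUT.PUT_LINE('Row rejected: invalid effective date');
  END;
  /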

Company: Confidential, Bangalore, India    Duration: July 2006 – June 2007
Project Name: OMEGA – Internal Project of HP
Technology: Oracle 9i, SQL Navigator, Brio Reporting 8, and HP-UX
Role: PL/SQL Developer
Description: The Omega project was developed internally and is used by HP internal users worldwide to track the time they spend on their customers and projects; this drives the billing activity against HP's clients. Details about projects, customers, services, activities, individuals' assignments to projects, access rights between individuals and projects, time tracked against projects, and billing for the tracked time are all handled systematically by OMEGA. The project also automates the addition and removal of individuals, projects, ARLs, customers, services, activities, etc.
Responsibilities:

  • Created functions, procedures, and triggers per the RFCs.
  • PL/SQL back-end programming: database triggers and packages.
  • Created and maintained database stored procedures, functions, packages, views, materialized views, types, tables, and triggers that efficiently implement business functions.
  • Involved in data migration from Oracle 8i to 9i; optimized SQL queries for better performance.
  • Wrote complex SQL queries using joins and sub-queries.
  • Involved in unit testing of each module; monitored existing SQL code and performance-tuned where necessary.
  • Used TOAD and PL/SQL Developer for faster application design and development.
  • Used EXPLAIN PLAN, ANALYZE, and hints to tune queries for better performance, along with extensive use of indexes (see the sketch after this list).
  • Rebuilt indexes.
  • Assisted the production DBA with the underlying Oracle database architecture, including schemas, database objects, file system structure, tables, views, packages, procedures, functions, materialized views, triggers, sequences, indexes, and constraints; wrote SQL to create database objects and set storage parameters optimally for tables and indexes.
  • Responsible for rewriting queries to improve application performance.
  • Read data from flat files into Oracle tables using SQL*Loader.
  • Stored procedure and function development and management of schema objects.
  • Involved across the project life cycle; created procedures for the MEC process.
  • System study, requirements gathering, client interaction, and reviews against the coding standards.
  • Interacted with clients at all levels for RFCs; developed and maintained the existing project.
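
A minimal sketch of the EXPLAIN PLAN workflow referenced above; the tables (time_entries, projects) are hypothetical stand-ins for the OMEGA schema:

  -- Capture the optimizer's plan for a slow statement, then read it
  -- back with DBMS_XPLAN.
  EXPLAIN PLAN FOR
  SELECT e.individual_id, p.project_name
    FROM time_entries e
    JOIN projects p ON p.project_id = e.project_id
   WHERE e.entry_date >= TRUNC(SYSDATE) - 7;

  SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);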

Company: Confidential, India    Duration: Mar 2006 – Jun 2006
Project Name: Humana Web-IT
Technology: Oracle 9i, SQL Navigator, Java Servlet
Role: PL/SQL Developer
Description: An insurance project that keeps track of customers and their coverage details. The Humana project deals with the insurance coverage of Humana's customers and their dependents: coverage details, demographic details, benefit details, and expiry and renewal details. As a large insurance company, it deals with customers who have been insured with Humana over more than 100 years.
Responsibilities:

    • Created functions with exception handling for various modules per the requirements.
    • Created packages integrating stored procedures, functions, variables, and cursors.
    • Effectively used conditional control statements in programs.
    • Used advanced PL/SQL techniques such as cursors for data retrieval.
    • Analyzed and performance-tuned the SQL queries of the migration scripts using optimizer hints and EXPLAIN PLAN.
    • Created and used table partitions to further improve performance of tables with large numbers of columns and rows; used EXPLAIN PLAN to speed up the SQL statements.
    • Managed the database and performed basic tasks such as creating users, assigning roles and privileges, and creating new tables; created and maintained the Oracle schema.
    • Developed a distributed database design (using Oracle) for a distributed client-server system.
    • Used SQL Navigator for designing SQL statements.
    • Involved in designing UNIX shell scripts on the UNIX server.
    • Used parallel processing to improve the performance of SQL statements.
    • Extracted data from MQ Series and loaded it into the Oracle database.
    • Prepared test cases; performed reviews and testing.
    • Created and maintained the data model and physical integrity of the databases.
    • Made extensive use of Oracle external tables, a new Oracle 9i feature useful for loading data from flat files (see the sketch after this list).
    • Worked with the DBA on creating the schema and related privileges and roles.
    • Prepared weekly status reports and quality-related documents.
    • Inserted data into tables using MERGE in procedures.
    • Handled PL/SQL tables effectively for bulk insert and retrieval.
    • Interacted with external OS files via the Oracle-supplied UTL_FILE package.
    • Created triggers to maintain vendor and customer history.
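
A minimal sketch of an Oracle 9i external table of the kind referenced above; the directory object data_dir, the file name, and the columns are illustrative, not Humana's actual schema:

    -- The flat file is queried in place; no separate SQL*Loader run.
    CREATE TABLE ext_coverage (
      member_id    NUMBER,
      plan_code    VARCHAR2(10),
      effective_dt DATE
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
        (
          member_id,
          plan_code,
          effective_dt CHAR(10) DATE_FORMAT DATE MASK "YYYY-MM-DD"
        )
      )
      LOCATION ('coverage.csv')
    )
    REJECT LIMIT UNLIMITED;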
