Database Consultant Resume
Walnut Creek, CA
SUMMARY:
- Over 15 years of experience in comprehensive applications analysis, design, development, testing, and implementation; systems integration; data migrations, data conversions, and data interfaces; decision support/reporting systems; performance tuning; data interface/refresh processes for SOLR; and some Oracle database administration, using the Oracle RDBMS and Oracle Exadata on UNIX and Linux platforms.
- I have worked in companies associated with the following industries: Confidential, Mobile Software, Telecommunications, Comprehensive Wealth Management, Databases, Software Development, Internet Web Site Personalization, and Risk Management and Credit Analysis.
- Performance tuning of backend Oracle database processes and applications.
- Re-architecture and rewrite of Oracle backend database processes for performance and scalability.
- Designing, developing, and implementing large data migrations.
- End-to-end backend database development.
- Data warehouse design and development.
- Designing, developing, and implementing automated database interfaces between non-Oracle/legacy applications and Oracle, or Oracle to Oracle.
- Automation of database batch processes.
- Designing, developing, and implementing automated database extract processes into flat files needed by other applications, such as big data backends.
TECHNICAL SKILLS:
PROGRAMMING LANGUAGES: PL/SQL, SQL, Korn shell, sed and awk, Perl, HTML, C, JavaScript, XML
RDBMS: Oracle 7.x, Oracle 8i, Oracle 9i, Oracle 10g, Oracle 11g, Oracle 11g Exadata 8-node RAC, Oracle 12c
SOFTWARE TOOLS/UTILITIES: SQL*Loader, SQL*Plus, Oracle Import/Export, Oracle Enterprise Manager, Toad, SQL*Net, C-ISAM, ERwin (data modeling), ODBC, Benthic software tools, ClearCase
BIG DATA: Hadoop architecture, HDFS, Hive, Sqoop (theoretical knowledge only, but can come up to speed quickly)
SOFTWARE PACKAGES: Oracle Financials 10.6; Oracle Order Entry and Shipping modules of Oracle Manufacturing; Minx Manufacturing Application; ASK Manman Manufacturing Application; Telecommunications Record Information Billing System (TRIS+); DataFaction high-net-worth client accounting system
OPERATING SYSTEMS: HP-UX, Sun Solaris, UNIX 4.2 BSD and System V, Red Hat Linux, Windows NT/2000 Server Enterprise Edition
HARDWARE PLATFORMS: HP C7000 blades (used for Oracle Exadata), HP V-Class machines, Sun Enterprise series, MC68030-based minicomputer running UNIX, Dell dual-Pentium CPU system
PROFESSIONAL EXPERIENCE:
Confidential, Walnut Creek, CA
Database Consultant
Responsibilities:
- Re-wrote and automated data archival processes, thereby reducing storage requirements, which also helped data load and report performance.
- Documented Method of Procedure (MOP) for deployments in Production and DR (Disaster Recovery) database environments.
- Designed black box and white box test cases for new database features and worked with and helped QA in detailed testing.
- Worked with subject matter experts and technical leads to design, develop, test and deploy the complete CLPT (Core Performance Tool) Application, used for monitoring the performance KPIs for Voice over LTE.
- Made data types in some query join conditions consistent, thereby avoiding full table scans (FTS) on queries.
- Worked with subject matter experts and technical leads to design, develop, test and deploy new features for the three applications: ALPT, ELPT and CLPT. These new features included many new KPIs that needed to be monitored by radio frequency engineers, business users and subject matter experts.
- Removed redundant portions of PL/SQL code, after discussions with Subject matter experts and technical leads, to improve batch load performance.
- Worked with DBAs and operations to analyze, debug and fix any production performance issues either by providing a patch fix or by more detailed fixes like data model changes and data migration, providing performance fixes in the database packages and so on.
- Re-designed and wrote detailed error handling, error logging and debug logging processes in the stored procedures.
- Improved the partition merge processes' performance by 10 times, by using bulk concurrent inserts from the partitions to be merged into temp tables, and the exchange partition method.
- Improved SQL query cost, buffer gets, physical reads and CPU cost by adding the missing join conditions on indexed columns that were causing merge join Cartesian operations, which are very expensive on very large tables.
- Modified existing indexes by adding the missing WHERE-clause columns, to avoid index skip scans, which are costly, especially on very large tables.
- Partitioned large tables that were not partitioned and were causing performance/scalability issues during reporting. This also involved writing data migration code to migrate data from non-partitioned to partitioned tables.
- Added parallel query and append hints to speed data loads and perform hourly and daily summarization.
- Used Oracle's partition exchange feature to speed up large data loads into transactional tables (a minimal sketch appears after this list).
- Designed and developed automated data purge procedures to keep a rolling one month of 15-minute transactional data, a rolling three months of hourly transactional data, and a rolling one year of daily transactional data.
- Used the hybrid columnar data compression feature of Exadata to reduce space usage drastically.
- Worked with DBAs and operations to set up weekly procedures for collecting table and index statistics with up to a 20% sample for the cost-based optimizer, to improve report query performance.
- Replaced deletes in the PL/SQL code that were causing data fragmentation and slowing down the batch processes.
- Improved load and select performance in the stored procedures by adding missing filters in the sql queries.
- Understood all the database packages/procedures in detail and made sure all the queries were tuned; found functional issues, discussed them with subject matter experts, got their approval, and fixed them.
- Rebuilt transactional tables and indexes to remove data fragmentation.
- Modified SQL*Loader loads to use direct path loads.
- Improved performance by loading data into transactional tables concurrently through multiple oracle sessions.
- Pinned the key network element table, the KPI summary tables, and all their index blocks into the Oracle Exadata flash cache, and de-normalized the network element table, to improve report performance by 20 times.
- Pinned the busy hour tables into the Oracle Exadata flash cache to speed up report run times by a factor of 20.
- Used the hybrid columnar compression for data compression to drastically reduce storage requirements.
- Helped tune the oracle database batch processes from other database teams.
- Mentored the UI Java developers from other database teams in sql and pl/sql tuning.
- Provided extracts from the ALPT/ELPT/CLPT databases to big data and other teams for their processing.
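A minimal sketch of the partition exchange load pattern and the statistics collection step referenced above; the table, partition, and directory names are hypothetical, not taken from the actual ALPT/ELPT/CLPT schemas:

    -- Hypothetical example: direct-path load new rows into an empty staging
    -- table whose structure (and local index definitions) matches the
    -- partitioned transactional table, then swap it in as a partition.
    -- The swap is a data-dictionary operation, so the large table is never rewritten.
    INSERT /*+ APPEND */ INTO kpi_15min_stage
    SELECT * FROM kpi_15min_incoming;
    COMMIT;

    ALTER TABLE kpi_15min
      EXCHANGE PARTITION p_day_2014_01_15 WITH TABLE kpi_15min_stage
      INCLUDING INDEXES WITHOUT VALIDATION;

    -- Weekly statistics collection at up to a 20% sample for the
    -- cost-based optimizer (scheduled via cron, as noted above).
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname          => USER,
        tabname          => 'KPI_15MIN',
        estimate_percent => 20,
        cascade          => TRUE);
    END;
    /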
Technology Used: PL/SQL, SQL, SQL*Loader, SQL*Plus, Korn shell, sed/awk, Oracle 11g Exadata 8-node RAC on UNIX, Oracle Enterprise Manager, AWR, Toad, Oracle Import/Export utilities, SQL Trace; automated via cron jobs
Confidential, Pleasanton, CA
Database Consultant
Responsibilities:
- Performance tuning of Oracle Claims, Quote, Legacy, Billing and Fraud data warehouse ETL processes (data warehouse size: 15-20 terabytes). Improved performance by 10-20 times and also made the data warehouse systems scalable with data growth.
- Designing, developing, testing, and implementing new ETL processes for data warehouses, including load and performance testing.
- Working with the Big Data team to design, develop and test very large data extracts and interfaces from OLTP source databases for Hadoop-based Big Data ETL processing.
- Designing, developing, unit/integration testing, and load and performance testing of data refresh processes from Oracle data warehouses and OLTP databases required for SOLR searches. The main key behind this was de-normalization, which makes the SOLR merge and re-index of its file structures very fast. This process made the SOLR data merge and re-index 20-30 times faster, as the refreshes were incremental. OLTP database size was 7 terabytes.
- Designing, developing and implementing data archival strategies for the Oracle data warehouses and OLTP databases, thereby making both the ETL processes and the OLTP batch processes faster, more efficient, and scalable.
- Reviewing SQL and PL/SQL code written by developers from the data warehouse and OLTP teams and documenting issues with their code from a performance perspective: exception handling, debug and error logging; deadlock and locking issues; context-switching issues that can cause huge bottlenecks; commenting, indentation and readability of code; using correct query hints for parallel processing; bypassing redo logs; adding missing indexes; adding missing query filters that can make a query much more efficient, or re-writing the SQL query in a more efficient manner; creating tables/indexes for data warehouses in larger block size tablespaces such as 32k or 64k for faster read and write performance; pinning tables in the Oracle SGA for improving dashboard query performance; and so on (see the sketch after this list).
- Working with developers to debug and fix performance and database issues in production, QA and UAT databases.
- Working with Oracle DBAs to change Oracle server parameters for performance improvements, and to monitor and provide quick and long-term fixes for database performance issues caused by ETL processes.
- Working with the Release Management team and developers to plan, document, implement and validate database releases for data warehouses and OLTP databases on Linux, Sun Solaris and HP-UX operating systems, for production releases.
- Documenting Release notes, applying and validating database code Releases on all Development, QA and UAT databases.
- Working with load and performance testing teams to make sure that new code changes for data warehouse ETL did not impact the SLA, and that new OLTP database code did not overload the OLTP database or impact new and existing query performance.
- Performing POCs for Data Warehouse and OLTP teams to provide high performance and scalable database solutions.
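A minimal sketch of two of the code review recommendations above (warehouse objects in larger block size tablespaces, and pinning hot tables in the SGA); the object and tablespace names are hypothetical assumptions:

    -- Hypothetical example: build a warehouse fact table in a 32k block
    -- tablespace (assumes the DBAs have configured db_32k_cache_size and
    -- created the tablespace) for faster multi-block reads and writes.
    CREATE TABLE claims_fact_32k
      TABLESPACE dw_data_32k
      NOLOGGING PARALLEL
    AS
    SELECT * FROM claims_fact;

    -- Keep a small, heavily joined dimension table cached in the SGA KEEP
    -- buffer pool to speed up dashboard queries.
    ALTER TABLE claim_status_dim STORAGE (BUFFER_POOL KEEP);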
Confidential, Sunnyvale, CA
Lead Developer and Data Warehouse Architect
Responsibilities:
- Completely re-architected, re-designed and re-wrote the Data Warehouse and ETL processes, thereby making them more scalable and reliable and improving process performance by 15-20 times. Took complete ownership of the entire project from design to deployment. This project was developed for Mobile Update customers with very large data sets, such as Verizon, Tata, US Cellular, Intel, Qualcomm, Fujitsu and Orange. This project was implemented as a SaaS hosted service model for some customers.
- Architected, designed, developed and implemented Data warehouse and ETL solution for Active Care customers, with data containing intelligent solutions for Mobile Carriers, OEMs and internal operations. This was implemented as a SaaS hosted service model. Took complete ownership of the project from design to deployment.
- Architected, designed, developed and fully automated data extracts for very large customers like Verizon and Tata to provide lists of mobile phones with outdated firmware. These extracts helped the customers monitor and improve the FOTA (Firmware Over The Air) update success rate, thereby reducing customer service calls for mobile device issues related to outdated firmware. Took complete ownership of the project from design to deployment (see the extract sketch after this list).
- Architected, designed, developed, tested and implemented the end to end data mobile analytics warehouse/ETL solution, required by OEMs and Network service providers, to provide detailed analysis for the following business cases: Cell Phone battery heating up quickly, Cell Phone battery level drains quickly, Dropped Calls, Phone Reboots, Application crashes on Cell phone.
- These business cases help the OEMs to use the detailed analysis provided by the Mobile Analytics UI and Reports to have intelligent conversations with Network providers regarding why the cell phone battery heats up, cell phone battery drains quickly, possible reasons for dropped calls, phone reboots and application crashes.
- These business cases help the Network Service Providers to use the detailed analysis provided by the Mobile Analytics UI and Reports to have intelligent conversations with customers regarding why the cell phone battery heats up, cell phone battery drains quickly, possible reasons for dropped calls, phone reboots and application crashes.
- This was built to reduce the cell phone returns that cause the OEMs and Network Service Providers to lose billions of dollars in revenue, and also to reduce damage to cell phone brand names, thereby reducing loss in sales revenue.
- Data warehouse size was about 7 terabytes, with a daily data volume of 5-10 million records.
- This project was implemented as a SaaS hosted service model.
- Re-architected, re-designed, developed and fully automated the Java based batch processes used for Mass Campaigns, Device Import, Subscription Import, thereby making them scalable, reliable and improved their performance by 20 - 30 times. Took complete ownership of the project from design to deployment.
- Provided database performance solutions in production (emergency situations) for the following very large volume database processes:
- Delete orphaned devices and subscriptions and data fix for devices.
- Provided data fixes for random issues created in production by inefficient SQL scripts that performed incomplete data updates or had to be killed due to performance issues, thereby leaving data inconsistent.
- Acted as mentor for Java developers and database developers trying to implement database scripts.
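A minimal sketch of the kind of automated firmware extract referenced above, as a SQL*Plus script run from cron; the table names, column names, and spool path are illustrative assumptions, not the customer schema:

    -- Hypothetical SQL*Plus extract: spool the list of devices whose
    -- firmware is older than the latest approved version for their model.
    SET PAGESIZE 0 LINESIZE 200 FEEDBACK OFF TRIMSPOOL ON
    SPOOL /data/extracts/outdated_firmware.csv

    SELECT d.device_id || ',' || d.model || ',' || d.firmware_version
    FROM   devices d
    WHERE  d.firmware_version < (SELECT f.latest_version
                                 FROM   firmware_catalog f
                                 WHERE  f.model = d.model);

    SPOOL OFF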
Technology Used: PL/SQL, SQL, SQL*Loader, SQL*Plus, Korn shell, Oracle 10g and Oracle 11g on Linux and UNIX, Oracle Enterprise Manager, AWR, Toad, sed and awk, Oracle Import/Export utilities, SQL Trace; automated via cron jobs.
Confidential, San Mateo, CA
Database Consultant
Responsibilities:
- Optimal re-partitioning of tables and indexes. Partitioning the un-partitioned tables and indexes, thereby reducing I/O and CPU cost drastically.
- Re-visiting and optimizing the batch business processes, removing redundant code and hence making them faster. In some cases, re-writing a few batch process packages from scratch in an optimal fashion.
- Creating subsets of very large tables using parallel CREATE TABLE AS SELECT, and using these subsets in queries wherever possible instead of hitting the very large tables each time; this was done once a day, before the batch process window (see the sketch after this list). Performing partition-specific updates, inserts and selects in parallel.
- Designing and developing data archiving strategies, thereby reducing data volumes and hence making the batch processes faster. Using INSERT ALL wherever possible.
- Reorganizing large to very large tables and indexes that have very large delete operations on them.
- Pinning database packages in the SGA. Re-ordering index columns to make indexes very selective, thereby improving the I/O and CPU cost of queries. Making data types in some query join conditions the same, thereby avoiding FTS on queries. Improving SQL query cost, buffer gets, physical reads and CPU cost by adding missing join conditions on indexed columns that were causing merge join Cartesian operations, which are very expensive on very large tables.
- Converting vertical tables into horizontal tables, thereby reducing the row counts to be processed drastically.
- Modifying existing indexes by adding the missing WHERE-clause columns to avoid index skip scans, which are costly, especially on very large tables. Re-writing inefficient batch process queries.
- Pinning small tables and indexes in the SGA. Using SQL*Loader direct path loads instead of conventional path loads.
- Coding, unit, integrated and functional testing. Documented performance changes in detail.
- Documented performance tuning test plan. Documented detailed functional test plans.
- Made sure that the business functionality was not affected by the performance changes.
- Documented database structure changes in detail. Helped QA in their white and black box testing.
- Documented production deployment procedures. Documented performance improvement results.
- Tested functional changes in a development environment. Tested the performance changes in a production-like environment, with the same infrastructure as production.
- Performance tuned 400 UI-based queries to run 8-10 times faster. This prevented the company from losing some very large customers and made existing customers very satisfied with response times. It also helped generate several prospects from old client referrals.
- Provided trace output of the old and new queries.
- Documented the performance changes.
- Provided the new optimized queries to the developers to integrate into the Java code.
- Worked with Oracle DBAs and Java developers to test these queries in environments with production volumes.
- Worked with QA to certify the tuned queries functionally.
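A minimal sketch of the daily pre-batch subset build mentioned above; the table name and filter conditions are hypothetical:

    -- Hypothetical nightly step, run before the batch window: materialize a
    -- small subset of a very large table in parallel with no redo, so batch
    -- queries hit the subset instead of re-scanning the full table.
    CREATE TABLE active_accounts_subset
      NOLOGGING PARALLEL 8
    AS
    SELECT *
    FROM   accounts
    WHERE  status = 'ACTIVE'
    AND    last_activity_date >= TRUNC(SYSDATE) - 30;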
Technology Used: PL/SQL, SQL, SQL*Loader, SQL*Plus, Korn shell, Oracle 10g on Linux and UNIX, Oracle Enterprise Manager, AWR, Toad, sed and awk, Oracle Import/Export utilities, SQL Trace; automated via cron jobs.
Confidential, Redwood City, CA
Senior Integration Engineer
Responsibilities:
- Understood the ETL processes in detail and documented performance issues.
- Suggested and discussed performance improvement strategies with the customers before implementation.
- Worked very closely with customer IT and DBA team. Designed test cases for performance testing.
- Documented and presented performance improvement results to the customers. Tested functional changes on a development environment.
- Tested the performance changes in a production-like environment, with the same infrastructure as production.
- Made sure that the business functionality was not affected by the performance changes. Modified code and database level changes to get improved results.
- Led the project from the Merced side and obtained customer sign-off on performance improvements and functional accuracy.
- Performed detailed data mapping between source and MPS datamart. Worked with customers to get details of data sources, server settings, db connection settings. Developed full user acceptance testing document for the ETL interfaces. Architected complete error handling procedures. Worked with the customer DBA for I/O allocation of data and other DB performance related items. Documented detailed interface flow diagrams.
- Designed and developed error handling reports and daily load reports.
- All the above interfaces were run via the cron jobs. Performed complete development, unit, module and integrated testing. Made sure that all these interfaces were performance tuned and scalable.
Technology Used: PL/SQL, SQL, SQL*Loader, SQL*Plus, Korn shell, Oracle 8i and 9i on Linux and UNIX, Oracle Enterprise Manager, AWR, Toad, sed and awk, Oracle Import/Export utilities, SQL Trace; automated via cron jobs.
Confidential, Redwood City, CA
Senior Engineer
Responsibilities:
- Designed and architected the Expense Manager data model, after detailed discussions with business analysts, users and managers, performed detailed data mappings between Client Accounting system (DataFaction) business entity attributes and Expense Manager data model attributes.
- Documented Interface Software Architecture Design document which contained:
- Detailed data mappings between client accounting business entities and interface tables, designed directory and file structures for data extracts from Client Accounting system for each business entity, detailed data transformations, data validations, data verifications, data mapping from interface tables and expense manager tables, project performance requirements, physical and logical data models, all database object definitions, high level and detailed flow diagrams of the extract and interface processes, extract and interface systems architecture diagrams, interface success and failure notifications and resolution procedures.
- Designed, developed, tested and deployed the interface process; documented and performed detailed unit, module and integrated test plans; documented system test plans for QA staff and helped them in testing interface operations and support procedures, and interface error handling and recovery procedures; planned and performed detailed dry runs during all stages of the project; documented the software architecture design for vendor development of the file extract process from DataFaction.
- Worked with the UNIX systems administrator and Oracle DBAs to perform operating system striping across disk controllers for maximum data throughput and uniform I/O distribution; laid out the Oracle datafiles and tablespaces across disks and controllers to obtain maximum performance throughput and uniformity in I/O; laid out the interface tables and Expense Manager DB tables and indexes across the tablespaces to obtain maximum throughput; pinned the PL/SQL packages into the shared pool for performance improvements; used SQL/PL/SQL tuning knowledge to improve performance; performed performance benchmark dry runs in the production mirror environments before releasing to production; achieved a total interface runtime of 30 minutes.
Database Batch Interface To Determine Business Issues In A Database Application:
- This interface dynamically executes a set of SQL queries, which are inputs to the interface, and spools the results into flat files that are emailed to technical and business staff, who analyze the results and determine the business issues associated with the data in the database application (a minimal sketch appears after this list).
- Each query corresponds to a business issue that needs to be monitored.
- The business issue queries are stored in interface tables. Queries can be added, deleted or edited in the interface table, depending on the business issues to be determined.
- All the flat files generated for that night’s run ( that contain the business issues spooled output ) are stored in the directory for each day.
- This interface helped keep the application data clean and also helped fix major bugs in the application that were causing these business issues in different software releases.
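A minimal sketch of the dynamic query-and-spool pattern described above; the interface table, its columns, and the directory object are hypothetical, and each stored query is assumed to return a single pre-formatted text column:

    -- Hypothetical sketch: run each enabled business-issue query from the
    -- interface table and write its results to a flat file for e-mailing.
    DECLARE
      l_file  UTL_FILE.FILE_TYPE;
      l_cur   SYS_REFCURSOR;
      l_line  VARCHAR2(4000);
    BEGIN
      FOR q IN (SELECT issue_name, issue_query
                FROM   business_issue_queries
                WHERE  enabled = 'Y') LOOP
        -- EXTRACT_DIR is a hypothetical Oracle directory object.
        l_file := UTL_FILE.FOPEN('EXTRACT_DIR', q.issue_name || '.txt', 'w');
        OPEN l_cur FOR q.issue_query;  -- assumes one VARCHAR2 column per query
        LOOP
          FETCH l_cur INTO l_line;
          EXIT WHEN l_cur%NOTFOUND;
          UTL_FILE.PUT_LINE(l_file, l_line);
        END LOOP;
        CLOSE l_cur;
        UTL_FILE.FCLOSE(l_file);
      END LOOP;
    END;
    /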
Technology Used: PL/SQL, SQL, SQL*Loader, SQL*Plus, Korn shell, Oracle 8i and 9i on Linux and UNIX, Oracle Enterprise Manager, AWR, Toad, sed and awk, Oracle Import/Export utilities, SQL Trace; automated via cron jobs.
Confidential, Fremont, CA
Senior Database Developer
Responsibilities:
- Managing and archiving users, DB object management, SQL and PL/SQL tuning, table, tablespace and user-level reorganization, data conversions, supporting developers on a daily basis, and preparing and implementing application software releases for bug fixes and enhancements.
- Reverse engineered the Collections application backend database stored packaged procedures and functions for SQL tuning, PL/SQL tuning, documentation, and determining issues with the physical structure design. This reverse engineering process helped reduce the night operations runtime by half and also helped clean up a lot of code and fix a lot of code bugs.
- Developed and documented a process flow diagram for all the different business processes that are part of night operations, which was very useful for developers fixing bugs, as they knew which packaged procedures formed the business entity where the problem was.
- Removed redundant indexes, rebuilt highly fragmented indexes, and removed row chaining in tables to improve database and hence application performance.
- Performed code reviews of developer code and helped developers with SQL/PL/SQL performance tuning; trapped inefficient SQL during production and tuned it for future runs.
- Provided recommendations to stripe disks across controllers to achieve optimal I/O performance; laid out the complete architecture of the application tables across several logical volumes, with each logical volume made up of several disks that had been striped across controllers by the UNIX systems administrator. This infrastructure recommendation was extremely beneficial in achieving tremendous database I/O performance improvements.
- Developed cron jobs to collect database performance statistics in order to monitor database performance; improved work efficiency by automating daily DBA tasks using Korn shell scripts.
Confidential
ETL Interface Developer
Responsibilities:
- Interfaced with business users and functional analysts to gather requirements, studied and analyzed the “You Owe Me” application backend structures and packaged procedures
- Developed high-level functional specifications; converted functional requirements to high-level and detailed technical specifications; performed detailed data mapping from the flat files to the open interface tables.
- Developed detailed unit test cases and performed unit testing; developed and performed module-level testing; developed and performed detailed integrated test cases; designed and performed dry test runs for the interface; developed and tested interface/conversion verification reports; documented the interface process in detail.
- Tuned all the interface SQL and PL/SQL code and made sure that all the code was Y2K compliant.
- Assigned tablespaces to open interface tables that were on different disks from the application tables being written into or read from, in order to avoid contention problems. Used parallel processing at the UNIX and Oracle levels for each run-unit in order to achieve phenomenal run-times for the interface.
Technology Used: PL/SQL, SQL, SQL*Loader, SQL*Plus, Korn shell, Oracle 7.x on Linux and UNIX, Oracle Enterprise Manager, TKPROF, AWR, Toad, sed and awk, Oracle Import/Export utilities, SQL Trace; automated via cron jobs.
Confidential, Redwood Shores, CA
Senior Applications Engineer
Responsibilities:
- Designed, developed and implemented the Revenue Interface, and the Month-End Revenue Accruals and Deferrals Interface, between the Ingres-based Telecommunications Billing System and Oracle General Ledger.
- Designed, developed and implemented the Customer Payments interface (sub-system) between a Telecommunications Billing System and Oracle Financials.