Senior Database Developer / AWS Migration Specialist Resume
Reston, VA
SUMMARY:
- Self-motivated Senior Oracle Developer with over twelve years of experience in SQL, PL/SQL, and SQL*Loader in UNIX and Windows NT environments.
- Three years of experience working with PostgreSQL on the AWS cloud platform.
- Extensive experience across the full application development life cycle, including system study, analysis, design, development, debugging, testing, implementation, GUIs, change management, document control, and documentation.
- Installed and configured Oracle databases and worked as a member of the Oracle database administration team, covering database design and the planning and implementation of backup/recovery strategies.
- Oracle Certified Associate (SQL programming) with proven ability in business analysis, application programming, and troubleshooting. Adapts to new technologies quickly; excellent written/verbal communication, leadership, and teamwork skills. Willing to take on new challenges and responsibilities.
- AWS Certified Developer - Associate: Validation No. 7SM1M4F11JQEQZWF
- AWS Certified Solutions Architect - Associate: Validation No. M41935H2CNQQQTSQ
- Worked in the federal space and hold an active Public Trust Clearance (PTC).
TECHNICAL SKILLS:
Operating Systems: Windows 2000/XP, Sun Solaris 9.0, HP-UX 11.0, Linux
Languages: SQL, PL/SQL, PL/pgSQL, Node.js, T-SQL, C, C++, C#, Java 2.0
Databases: Oracle 11g/10g/9i/8i, PostgreSQL 9.1, SQL Server 7.0, Sybase, MS Access
AWS Services: S3, EC2, EMR (Sqoop, Spark), RDS, DynamoDB, Athena, SNS, Lambda, and Step Functions
Utilities: ERWIN 4.1, TOAD, PL/SQL Developer 5.1.3, Enterprise Manager, SQL*Loader, Visio 2000
Version Control: ClearCase, ClearQuest, SVN
Front End Tools: Oracle APEX 4.2
PROFESSIONAL EXPERIENCE:
Confidential, Reston, VA
Senior Database Developer / AWS Migration Specialist
Responsibilities:
- Architected the AP full-year model's migration from the legacy Oracle system to the cloud, using RDS (PostgreSQL) as the database.
- Used a CI/CD process built on Bitbucket (source control), Ansible playbooks, and Jenkins.
- Used CloudFormation templates written in YAML to provision infrastructure as code.
- Worked with AWS services including EC2 instances, S3 buckets, Lambda functions, RDS, and SNS.
- Migrated Oracle functions, procedures, and packages to PostgreSQL objects using PL/pgSQL (a small before/after sketch appears at the end of this job entry).
- Worked with the PostgreSQL database to load incoming data into the DPD portal.
- Spun up an Amazon EMR cluster to run Sqoop jobs that migrated data from the on-premises Oracle database to S3. Used S3 as the source and read the data through Athena for analytics. Tested the process with different file formats (CSV, Parquet, and Avro), and partitioned and compressed the data to optimize queries and reduce cost.
- Used EXPLAIN and EXPLAIN ANALYZE to fine-tune queries for performance improvements in PostgreSQL.
- Designed and developed the DMF project end to end, which is the future of all feeds at Confidential.
- Involved in design sessions for adding three new exams (Seminar, Research, and CSP) to the 2016 and 2017 AP exams.
- Performed logical and physical data modeling for the DMF project using the ERWIN tool.
- Extensively used dynamic SQL to develop the DMF project.
- Worked with Oracle XML DB to extract XML information and populate transaction tables.
- Migrated legacy data into DMF (NextGen feeds).
- Used explain plans, partitioning, hints, parallel execution, process optimization, and query tuning techniques to improve overall database performance.
- Created complex database objects, including tables, keys, constraints, sequences, triggers, views, materialized views, packages, procedures, and functions, using SQL and PL/SQL to handle complex business logic.
- Used DBMS_PARALLEL_EXECUTE to update millions of records by splitting tables into chunks and updating those chunks in parallel (a sketch follows this list).
- Created and allocated tablespaces for objects and created partitions.
- Generated reports by exporting them to spreadsheets and FTPed the content to a location accessible to upper management.
- Worked in both Windows and UNIX environments.
- Unit tested code and documented changes as part of the code review process.
- Created users for testers and ETL developers and granted read-only or read-write access to different databases.
- Set up and maintained the SPRUCE (stress test), MAPLE (UAT), and APPLE (deployment check) environments.
- Worked in an agile environment with two-week sprints and a release every two months.
- Deployed APD-related code changes to production for every release.
- Used SVN for version control.
- Provided 24x7 production on-call support for one week out of every six.
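A minimal sketch of the chunked parallel-update pattern referenced above; the task name, table, and SET clause are hypothetical, not the actual APD code.

    DECLARE
      l_task VARCHAR2(30) := 'apd_chunk_update';   -- hypothetical task name
      l_sql  VARCHAR2(1000);
    BEGIN
      DBMS_PARALLEL_EXECUTE.CREATE_TASK(task_name => l_task);

      -- Split the target table into rowid ranges of roughly 10,000 rows each.
      DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_ROWID(
          task_name   => l_task,
          table_owner => USER,
          table_name  => 'EXAM_SCORES',             -- hypothetical table
          by_row      => TRUE,
          chunk_size  => 10000);

      -- :start_id and :end_id are bound to each chunk's rowid range by the package.
      l_sql := 'UPDATE exam_scores
                   SET status = ''PROCESSED''
                 WHERE rowid BETWEEN :start_id AND :end_id';

      -- Run the statement once per chunk across 8 concurrent scheduler jobs.
      DBMS_PARALLEL_EXECUTE.RUN_TASK(
          task_name      => l_task,
          sql_stmt       => l_sql,
          language_flag  => DBMS_SQL.NATIVE,
          parallel_level => 8);

      DBMS_PARALLEL_EXECUTE.DROP_TASK(l_task);
    END;
    /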
Environment: AWS, PostgreSQL 9.1, Oracle 11g, TOAD 11.6.0.43, SQL*Loader, ERWIN 9.6, SmartSVN 9, Bitbucket, Jenkins, YAML, JSON, Node.js, IBM ClearQuest 7.1.2.0, UNIX
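As a rough illustration of the Oracle-to-PostgreSQL object conversion performed in this role (not the project's actual code), a simple Oracle function and a PL/pgSQL equivalent might look like the following; the function and table names are made up.

    -- Oracle original (simplified)
    CREATE OR REPLACE FUNCTION get_exam_count (p_year IN NUMBER)
    RETURN NUMBER IS
      l_cnt NUMBER;
    BEGIN
      SELECT COUNT(*) INTO l_cnt
        FROM ap_exams
       WHERE admin_year = p_year;
      RETURN NVL(l_cnt, 0);
    END;
    /

    -- PL/pgSQL equivalent: RETURN becomes RETURNS, the body is dollar-quoted,
    -- and NVL is replaced with COALESCE.
    CREATE OR REPLACE FUNCTION get_exam_count (p_year integer)
    RETURNS bigint
    LANGUAGE plpgsql
    AS $$
    DECLARE
      l_cnt bigint;
    BEGIN
      SELECT COUNT(*) INTO l_cnt
        FROM ap_exams
       WHERE admin_year = p_year;
      RETURN COALESCE(l_cnt, 0);
    END;
    $$;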
Confidential, Rockville, MD
Senior Oracle database developer
Responsibilities:
- Used the ERWIN tool to design and model the Compliance and Archival modules.
- Performed development DBA activities such as schema creation, tablespace management, object creation, and importing/exporting tables between schemas as required.
- Designed and developed an ETL process that extracted XML data and transformed and loaded it into relational tables using industry-standard business rules.
- Extensively used Oracle XML DB and XQuery to parse HL7-standard XML (see the sketch after this list).
- Generated XML documents from the transaction tables.
- Developed the entire Compliance module, which sends notification emails to FDA-subscribed industry personnel.
- Designed and developed the Terminology Reference module, which is used across the project.
- Designed and developed the user management module, which is used to create organizations and their associated users.
- Designed and developed the archival process.
- Worked closely with the APEX team to develop complex reports.
- Used materialized views to improve interactive report performance.
- Designed the backend error-handling process for the entire project, creating the logging tables and a common procedure used system-wide to record errors.
- Performed extensive performance tuning based on explain plans and TKPROF; used hints and added meaningful indexes to improve code performance.
- Documented flow diagrams for critical processes.
- Worked on UNIX to deploy code, create cron jobs, and write shell scripts.
- Used SVN for version control.
- Used Pivotal Tracker to manage defects (Scrum).
- Used the built-in APEX Team Development tool to track internal defects.
- Served on the CMMI team working toward Level 3 certification.
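A minimal sketch of the XML shredding referenced in the HL7 bullet above, using XMLTABLE; the staging table, XPath expressions, and column names are placeholders, not the actual FDA schema.

    -- Shred repeating observation elements from a staged XMLTYPE document
    -- into relational rows that can feed the transaction tables.
    SELECT x.obs_id, x.obs_value, x.obs_units
      FROM hl7_staging s,
           XMLTABLE('/ORU_R01/OBSERVATION'
                    PASSING s.xml_doc
                    COLUMNS obs_id    VARCHAR2(30)  PATH 'OBX/OBX.3',
                            obs_value VARCHAR2(100) PATH 'OBX/OBX.5',
                            obs_units VARCHAR2(30)  PATH 'OBX/OBX.6') x;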
Environment: Oracle 11g/10g, TOAD 11.6.0.43, SQL*Loader, ERWIN 9.6, APEX, SmartSVN 9, UNIX.
Confidential, Gaithersburg, MD
Oracle Database Programmer
Responsibilities:
- Worked closely with the data modeling team to help design tables, indexes, and constraints.
- Developed expert domain knowledge by working with most of the subsystems.
- Created complex database objects such as packages, procedures, and functions.
- Used the database scheduler to spawn multiple sessions for parallel processing, which improved performance of the payment generation process.
- Designed and developed an archival process that moves data from main tables to history tables and then deletes it from the main tables, handling on the order of 50 million records per table using the BULK COLLECT method (see the sketch after this list).
- Wrote many critical processes, such as the account code derivation logic, benefit plan determination, payment generation, limit processing, and provider data load processes.
- Served on the interface team developing many interfaces for the project; these interfaces load flat-structured data into our CHMAPS table through daily, weekly, or monthly jobs.
- Wrote a core process called Status Manager, used across the system to maintain the status of several processes.
- Wrote complex queries to help the Cognos reporting team generate reports.
- Used the FitNesse testing tool to unit test our packages and procedures.
- Documented and drew flow diagrams for the packages and procedures.
- Used ClearCase/ClearQuest to track code and defects.
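A minimal sketch of the batched BULK COLLECT / FORALL archival pattern mentioned above; the table names, columns, and retention rule are hypothetical.

    DECLARE
      CURSOR c_old IS
        SELECT claim_id
          FROM claims
         WHERE processed_dt < ADD_MONTHS(SYSDATE, -24);   -- hypothetical retention rule
      TYPE t_ids IS TABLE OF claims.claim_id%TYPE;
      l_ids t_ids;
    BEGIN
      OPEN c_old;
      LOOP
        FETCH c_old BULK COLLECT INTO l_ids LIMIT 10000;  -- work in 10,000-row batches
        EXIT WHEN l_ids.COUNT = 0;

        -- Copy the batch to the history table, then remove it from the main table.
        FORALL i IN 1 .. l_ids.COUNT
          INSERT INTO claims_hist
          SELECT * FROM claims WHERE claim_id = l_ids(i);

        FORALL i IN 1 .. l_ids.COUNT
          DELETE FROM claims WHERE claim_id = l_ids(i);

        COMMIT;                                           -- keep undo per batch bounded
      END LOOP;
      CLOSE c_old;
    END;
    /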
Environment: Oracle 10g/9i, TOAD, SQL*Loader, ERWIN, FitNesse, ClearCase/ClearQuest.
Confidential, Woonsocket, RI
Oracle Developer
Responsibilities:
- Created tables, views, sequences, indexes, and partitions as required.
- As a member of the design team, contributed valuable design suggestions.
- Used external tables to load data from flat files (see the sketch after this list).
- Used dynamic SQL to make the whole load process flexible and efficient.
- Applied a threaded load design, which is efficient when inserting terabytes of data into the database; it made the whole process 75% more efficient than the existing one.
- Used bulk binding, which improved performance by minimizing the number of context switches between the PL/SQL and SQL engines.
- Wrote several packages comprising procedures, functions, cursors, and exception blocks.
- Wrote shell scripts to handle restartability; the whole process is invoked from the shell, which acts as the controller.
- Used Control-M to schedule jobs.
- Used MS VSS to manage code changes and secure the source code.
- Used SQL*Loader to load data from the INTIATE server for one-shot jobs.
- Worked on BIDS reports, which were generated and sent to business users as required.
- Performed unit testing and verified that the code matched the design.
- Documented the whole process following CVS standards.
- Provided 24x7 support after the application went to production.
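A minimal sketch of the external-table load described above; the directory object, file name, and columns are placeholders.

    -- External table over a pipe-delimited flat file exposed through a directory object;
    -- once created, the file can be queried or inserted from like any ordinary table.
    CREATE TABLE member_stage_ext (
      member_id   NUMBER,
      member_name VARCHAR2(100),
      plan_code   VARCHAR2(10)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_in_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '|'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('members.dat')
    )
    REJECT LIMIT UNLIMITED;

    -- Copy the staged rows into a regular table in one pass.
    INSERT /*+ APPEND */ INTO member_stage
    SELECT member_id, member_name, plan_code FROM member_stage_ext;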
Environment: Oracle 9i, TOAD, SQL*Loader, ERWIN, UNIX (Sun Solaris 7.0), Control-M, MS VSS.
Confidential, St Louis, MO
Oracle developer
Responsibilities:
- Redesigned part of the database to accommodate module-specific tables and module-specific system variables.
- Documented the technical specification for the proposed database changes.
- Coded several packages, procedures, and functions to incorporate complex business rules into the application.
- Performed a major code reorganization by extracting frequently used functionality into functions and procedures and calling them where required.
- Designed and developed data validation, error control routines, and audit/log controls using PL/SQL and SQL scripts (see the sketch after this list).
- Used MS VSS to manage code changes and secure the source code.
- Extracted data from the COL server as flat files and used SQL*Loader to load the extracted data back into our schema for testing purposes.
- Wrote VB 6.0 code to display the results on screen.
- Tested the packages in our schema using C# and the NUnit tool; NUnit reports pass/fail results based on the inputs passed to the backend procedure or function under test.
- Performed code reviews.
- Worked as part of the maintenance team to fix Sybase code for the COL brewery.
- Tested code changes on the QA server and documented them; once testing was done, a CMR was issued to the implementation team, which deployed the change to the production server.
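A minimal sketch of the kind of error/audit logging routine described above; the table, sequence, and procedure names are illustrative, not the application's actual objects.

    CREATE SEQUENCE app_error_log_seq;

    CREATE TABLE app_error_log (
      log_id    NUMBER PRIMARY KEY,
      logged_at DATE DEFAULT SYSDATE,
      proc_name VARCHAR2(100),
      err_code  NUMBER,
      err_msg   VARCHAR2(4000)
    );

    CREATE OR REPLACE PROCEDURE log_error (
      p_proc_name IN VARCHAR2,
      p_err_code  IN NUMBER,
      p_err_msg   IN VARCHAR2
    ) IS
      PRAGMA AUTONOMOUS_TRANSACTION;   -- the log row survives a rollback in the caller
    BEGIN
      INSERT INTO app_error_log (log_id, proc_name, err_code, err_msg)
      VALUES (app_error_log_seq.NEXTVAL, p_proc_name, p_err_code, p_err_msg);
      COMMIT;
    END log_error;
    /

    -- Typical use inside any package body (procedure name is hypothetical):
    -- EXCEPTION
    --   WHEN OTHERS THEN
    --     log_error('apply_business_rules', SQLCODE, SQLERRM);
    --     RAISE;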
Environment: Oracle 9i, Sybase, TOAD 7.3.0.0, VB 6.0, C#, NUNIT, SQL*Loader, ERWIN, Windows XP, MS VSS.
Confidential, Louisville, KY
Oracle Developer
Responsibilities:
- Wrote complex queries for analysis work and generated reports to validate the results produced.
- Implemented the proposed fixes using PL/SQL stored procedures and functions.
- Created test scenarios and tested the proposed fixes in the QA region.
- Updated the Visio diagrams for fixes promoted to production.
- Documented fixes based on Confidential standards.
- Ran one-shot scripts to clean up unwanted data from the database.
- Wrote many PL/SQL stored procedures to automate manual processes.
- Responsible for creating tables, views, and indexes in the HDQ2 region (the development test region) as required.
- Wrote crontab entries on UNIX to run jobs automatically at specified times.
- Designed and developed data loading processes using SQL*Loader, PL/SQL, and UNIX shell scripting (a sketch follows this list).
- Used FTP to transfer files to and from the Oracle server.
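A minimal sketch of the PL/SQL side of such a load, in which rows staged by SQL*Loader are merged into a target table; the procedure, table, and column names are hypothetical.

    CREATE OR REPLACE PROCEDURE load_daily_rates IS
    BEGIN
      -- Upsert staged rows (previously loaded by SQL*Loader) into the target table.
      MERGE INTO rates t
      USING rates_stage s
         ON (t.rate_cd = s.rate_cd AND t.effective_dt = s.effective_dt)
      WHEN MATCHED THEN
        UPDATE SET t.rate_amt = s.rate_amt
      WHEN NOT MATCHED THEN
        INSERT (rate_cd, effective_dt, rate_amt)
        VALUES (s.rate_cd, s.effective_dt, s.rate_amt);

      COMMIT;
    END load_daily_rates;
    /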
Environment: Oracle 9i, PL/SQL Developer 5.1.3, TOAD 7.3.0.0, SQL*Loader, MS Visio, ERWIN, MS-ACCESS 9.0.2720, Windows XP, UNIX Sun Solaris 7.0/8.0.