Oracle PL/SQL / Big Data Developer Resume

Boston, MA

SUMMARY:

  • 13+ years of total IT industry experience in analysis, design, development, maintenance, and migration of various software applications.
  • Worked primarily in the Insurance, Banking, Finance, Manufacturing, and Retail domains.
  • Extensive experience designing and building ETL systems and automated batch jobs.
  • 2+ years of Oracle Database administration experience
  • 3+ years of Team Lead experience
  • Technological fortes: Oracle 11g/10g SQL & PL/SQL programming, MongoDB, Informatica 9.5, DataStage, Autosys/Control-M schedulers, Unix, Oracle Forms & Reports 10g, BIRT reporting, and the Big Data ecosystem - HDFS, Hive, Sqoop, Impala, Spark, Pig, Oozie, HUE, Kafka, and Python.
  • Big Data ecosystem - very good understanding of Hive, Sqoop, Pig, Impala, Spark, and Oozie.
  • Performed data pipeline activities - creating Hive external tables, loading data with Sqoop commands, and building Impala views.
  • Scheduled jobs using Oozie and monitored them via HUE.
  • Enhanced Spark jobs as needed.
  • Deployed code using the Data Governance Framework (DGF).
  • Developed Python scripts and built PySpark jobs.
  • Very good understanding of Kafka streaming.
  • Extensive experience as an Oracle developer building various ETL systems using PL/SQL packages, stored procedures, functions, cursors, triggers, views, materialized views, global temporary tables, Oracle partitioning (exchange partition; see the sketch after this list), and SQL*Loader.
  • Performed Oracle DBA activities - Oracle expdp/impdp, generating AWR reports, managing tablespaces, indexes, disk space, grants/synonyms, and users/roles/privileges, and rebuilding environments prior to new code releases.
  • Involved in performance tuning - analyzing explain plans, rewriting complex SQL, adding hints, gathering statistics, generating TKPROF and trace files, and partitioning high-growth tables (including exchange partitioning).
  • Involved in supporting Oracle Streams and Change Data Capture (CDC).
  • Performed data modeling using ERwin and MS Visio.
  • Very good ETL experience using Informatica PowerCenter 9.x/8.x and familiarity with DataStage 9.x.
  • Troubleshot nightly production batch job issues involving Informatica, Oracle, Unix, and Control-M/Autosys.
  • Comfortable building and modifying UNIX shell scripts for data migration, batch job processing, file manipulation, and file archival policies.
  • Extensive experience building Autosys/Control-M batch jobs to perform data loads and file archival, and supporting production jobs.
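
A minimal sketch of the exchange-partition load pattern mentioned above; the table, partition, and staging names are illustrative, not from a specific project:

    -- Load into a plain staging table, then swap it into the target partition.
    -- The exchange is a data-dictionary operation, so no rows are physically moved.
    CREATE TABLE sales_stg AS SELECT * FROM sales WHERE 1 = 0;  -- same shape, no rows

    -- ... bulk load sales_stg (SQL*Loader or INSERT /*+ APPEND */) ...

    ALTER TABLE sales
      EXCHANGE PARTITION p_20240101 WITH TABLE sales_stg
      INCLUDING INDEXES WITHOUT VALIDATION;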

TECHNICAL SKILLS:

TOOLS: Oracle 11g/10g/9i, Oracle Forms/Reports 10g, SQL Developer, TOAD, ERwin r8, Unix, MongoDB, Big Data ecosystem - HDFS, Hive, Sqoop, Impala, Pig, Oozie, HUE & Spark (Python), Data Governance Framework (DGF), BIRT XML reporting, Informatica 9.1/8.6, DataStage 9.0, WinSCP, Putty, Subversion, CVS, PVCS, JIRA (issue tracking and project management), Rally (task management), RTC, Quality Centre, Gvim, UltraEdit, Autosys/Control-M, Cron jobs, Moody's Analytics Risk Foundation product

PROFESSIONAL EXPERIENCE:

Confidential, Boston - MA

Oracle PL/SQL / Big Data Developer

Responsibilities:

  • Performing analysis and design of Oracle database changes.
  • Building PL/SQL packages, procedures/functions, and views to enhance the LMS application to support new business rules when introducing new source systems or changes to the existing ecosystem.
  • Performing SQL tuning and data modeling using ERwin 9.5.
  • Building conceptual/logical/physical data models in ERwin 9.5 for the new staging area and target application tables, taking partition management options into account.
  • Implementing Oracle VPD rules and writing analytical queries for BIRT report development.
  • Enhancing the Moody's Risk Origin application and making BIRT report changes in XML.
  • Participating in rewriting the current LMS application on the Hadoop ecosystem - creating partitioned Hive external tables (Avro and Parquet) and Sqoop queries to import data from various RDBMS systems (see the sketch after this list).
  • Writing PySpark jobs to process Hive data.
  • Building and enhancing Oozie jobs using workflow and coordinator XMLs and their properties files.
  • Building Impala views and monitoring Oozie jobs via HUE.
  • Understanding distributed data streaming using Kafka.
  • Working with the NoSQL database MongoDB.
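
A minimal sketch of the Hive/Impala pattern described above; the schema, column, and HDFS path names are hypothetical:

    -- Partitioned Hive external table stored as Parquet, over data landed by Sqoop
    CREATE EXTERNAL TABLE lms_stage.loan_txn (
      txn_id     BIGINT,
      account_id STRING,
      amount     DECIMAL(18,2)
    )
    PARTITIONED BY (load_dt STRING)
    STORED AS PARQUET
    LOCATION '/data/lms/loan_txn';

    -- Register the partition for a day's Sqoop-imported files
    ALTER TABLE lms_stage.loan_txn
      ADD IF NOT EXISTS PARTITION (load_dt = '2024-01-01');

    -- Impala view exposing the latest slice to reporting users
    CREATE VIEW lms_stage.v_loan_txn_latest AS
    SELECT * FROM lms_stage.loan_txn WHERE load_dt = '2024-01-01';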

Environment: Oracle 11g, TOAD, SQL Developer, ERwin 9.5, Putty, RTC, ClearCase, Moody's Risk Foundation product, Eclipse, MongoDB, Big Data ecosystem - HDFS, Hive, Sqoop, Impala, Pig, PySpark, Oozie, HUE, Kafka, Python, Data Governance Framework (DGF)

Confidential, Malvern - PA

Sr. PL/SQL & ETL Developer

Responsibilities:

  • Performed analysis and design of Oracle and Informatica requirements.
  • Enhanced ADS Loader workflows for various vendors by making changes to Informatica mappings and sessions and to Oracle packages/procedures/functions.
  • Provided support for ETL jobs in system/integration/UAT environments.
  • Carried out design, construction, review, and defect analysis for Oracle loader and data movement packages.
  • Performed data modeling to set up new vendor loads.
  • Involved in the creation of database objects such as tables, views, stored procedures, functions, packages, and indexes.
  • Performed data migration activities, applying DML to synchronize existing data with new business logic on key attributes (see the sketch after this list).
  • Completed tasks with proper documentation, thorough unit testing, and code reviews.
  • Followed Agile Scrum methodology to manage and deliver assigned tasks.
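
An illustrative MERGE of the kind used for such data-migration syncs; the table and column names are hypothetical:

    -- Backfill a newly introduced attribute on existing rows from a reference table
    MERGE INTO vendor_account tgt
    USING vendor_ref src
       ON (tgt.vendor_id = src.vendor_id)
    WHEN MATCHED THEN
      UPDATE SET tgt.region_cd = src.region_cd
       WHERE tgt.region_cd IS NULL;  -- touch only rows missing the new attribute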

Environment: Oracle 11g, TOAD, SQL Developer, Putty, Informatica 9.5, WinSCP, Control-M, SourceTree, Unix, Rally, Confluence, TOAD Data Modeler

Confidential, Wilmington - DE

Sr. PL/SQL & ETL Developer

Responsibilities:

  • Played an important role in requirements gathering, data analysis, and ETL system design for setting up new partner and vendor feeds.
  • Built ETL systems by creating Oracle packages, procedures, functions, table partitions, and DataStage and Control-M jobs.
  • Moved data from Oracle tables to Big Data Hive tables, creating partitioned external tables and exporting/importing data using Sqoop.
  • Set up HDFS and Hive-related configuration files.
  • Loaded JSON files into MongoDB.
  • Performed SQL query tuning and partitioned the tables behind long-running SQL (see the sketch after this list).
  • Involved in discussions with the business, data modelers, and the client to set expectations, priorities, and dependencies for new requirements.
  • Performed code reviews and deployment planning, helped team members clarify doubts, and fixed production issues.
  • Adopted Agile Scrum methodology to deliver assigned tasks.
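
A minimal sketch of the partitioning approach for long-running queries; the table and dates are illustrative:

    -- Monthly interval partitioning so date-filtered queries prune to a few partitions
    CREATE TABLE txn_history (
      txn_id NUMBER,
      txn_dt DATE,
      amount NUMBER(18,2)
    )
    PARTITION BY RANGE (txn_dt)
    INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
    ( PARTITION p_init VALUES LESS THAN (DATE '2015-01-01') );

    -- This scan now touches only the June 2015 partition instead of the whole table
    SELECT SUM(amount) FROM txn_history
    WHERE txn_dt >= DATE '2015-06-01' AND txn_dt < DATE '2015-07-01';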

Environment: Oracle 11g, TOAD, Putty, Quality Centre, DataStage, WinSCP, Control-M, WinCVS, Unix, Rally (task management), Big Data - Hive, Sqoop, MongoDB

Confidential, Newark, DE

Sr. Oracle PL/SQL developer

Responsibilities:

  • Performed database and ETL process design to integrate new source systems' data into the current system.
  • Built PL/SQL packages, procedures, triggers, functions, indexes, and collections to implement the business logic for the various calculations needed by the netting engine (see the sketch after this list).
  • Built UNIX scripts and Autosys scheduler jobs to run PNE batch jobs and file/data archival jobs.
  • Provided L3 production support for Oracle, Informatica, Unix, and Autosys issues across three PNE products - Securities Netting, Derivative Netting, and Fixed Income Netting.
  • Built forward and reverse data models using ERwin r8 and kept them in sync with each release's changes.
  • Developed Informatica mappings and workflows to load data from XML files and regular CSV files.
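
A hypothetical skeleton of a netting calculation package of this kind; the object and column names are illustrative:

    CREATE OR REPLACE PACKAGE pne_netting AS
      PROCEDURE net_positions(p_settle_dt IN DATE);
    END pne_netting;
    /
    CREATE OR REPLACE PACKAGE BODY pne_netting AS
      PROCEDURE net_positions(p_settle_dt IN DATE) IS
      BEGIN
        -- Net each counterparty's buys and sells into a single obligation row
        INSERT INTO netted_position (counterparty_id, settle_dt, net_amount)
        SELECT counterparty_id, p_settle_dt,
               SUM(CASE side WHEN 'BUY' THEN amount ELSE -amount END)
          FROM trade_position
         WHERE settle_dt = p_settle_dt
         GROUP BY counterparty_id;
      END net_positions;
    END pne_netting;
    /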

Environment: Oracle 11g, ERwin r8, TOAD, Putty, Quality Centre, Informatica 9.1.0, WinSCP, Autosys, Subversion, GreenHopper for JIRA

Confidential, CT

Sr. Oracle PL/SQL developer

Responsibilities:

  • Developed "ACT", an automated Oracle testing tool that accepts test SQL from various projects; used primarily by the QA and production support teams for enhanced regression testing.
  • Performed requirements gathering, analysis, and design for Oracle, Informatica, and the Autosys scheduler.
  • Developed a data mart to store input test SQL, metadata for each SQL, the result data from each execution, and a summary of each test SQL.
  • Developed Oracle packages, procedures, and functions using DBMS_SCHEDULER to create jobs that run test SQL in parallel threads, assigning run IDs with the ORA_HASH function (see the sketch after this list).
  • Designed and set up Autosys batch jobs - setting up the Autosys profile, building JIL and UNIX scripts, and setting up dependencies with other existing jobs.
  • Worked with SQL*Loader to load data from flat files received from various facilities every day. Used standard packages such as UTL_FILE and DBMS_SQL, and used PL/SQL collections with BULK binding in database procedures, functions, and packages for the front-end module.
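
A hypothetical sketch of the parallel test-runner pattern: ORA_HASH buckets each test SQL into one of eight slices, and a DBMS_SCHEDULER job is created per slice (the act_pkg package and job names are illustrative):

    DECLARE
      l_threads CONSTANT PLS_INTEGER := 8;
    BEGIN
      FOR t IN 0 .. l_threads - 1 LOOP
        DBMS_SCHEDULER.create_job(
          job_name            => 'ACT_RUN_' || t,
          job_type            => 'STORED_PROCEDURE',
          job_action          => 'act_pkg.run_bucket',
          number_of_arguments => 1,
          auto_drop           => TRUE);
        DBMS_SCHEDULER.set_job_argument_value('ACT_RUN_' || t, 1, TO_CHAR(t));
        DBMS_SCHEDULER.enable('ACT_RUN_' || t);
      END LOOP;
    END;
    /
    -- act_pkg.run_bucket(p_bucket) would execute the test statements whose
    -- ORA_HASH(sql_id, 7) = p_bucket, so the eight jobs cover disjoint slices.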

Environment: Oracle Exadata, Informatica 9.0.1, TOAD, PVCS, Putty, Quality Centre, WinSCP, Autosys, UltraEdit, Subversion, ERwin

Confidential, NY

Sr. Oracle PL/SQL Developer

Responsibilities:

  • Built a common repository system in Oracle for the "Life and Health" division of Confidential to store financial data, policy summaries, and treaty information, facilitating decision-making for other business domains.
  • Implemented Oracle Streams replication to refresh staging OLTP data with near-real-time data.
  • Involved in continuous enhancements and fixing of production problems. Designed, implemented, and tuned interfaces and batch jobs using PL/SQL. Involved in data replication and high-availability design scenarios with Oracle Streams.
  • Built flashback views with the required transformation logic on OLTP tables.
  • Developed Informatica mappings that use the source views above to load the target tables; the load is additionally supported by other Oracle procedures and packages.
  • Implemented various validations during the load process - pre-load checks (replica checking, mapping validation), load completion checks, and post-load checks (audit and reconciliation; see the sketch after this list).
  • Prepared test cases to validate each transformation and business logic and the resulting data.
  • Performed testing and code release of developed components in various environments.
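
An illustrative post-load reconciliation check of the kind described above; the table and column names are hypothetical:

    DECLARE
      l_batch CONSTANT NUMBER := 20240101;  -- illustrative batch identifier
      l_src   NUMBER;
      l_tgt   NUMBER;
    BEGIN
      SELECT COUNT(*) INTO l_src FROM stg_policy  WHERE batch_id = l_batch;
      SELECT COUNT(*) INTO l_tgt FROM policy_fact WHERE batch_id = l_batch;
      IF l_src <> l_tgt THEN
        RAISE_APPLICATION_ERROR(-20001,
          'Reconciliation failed: staged ' || l_src || ', loaded ' || l_tgt);
      END IF;
    END;
    /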

Environment: Oracle 10g, Informatica 8.6, TOAD, CVS (version control), Putty, Quality Centre, eCAB

Confidential

System Analyst

Responsibilities:

  • Designed a common ETL system to support multiple LOBs' load processes.
  • Developed PL/SQL packages, procedures, functions, and materialized views to load data for various LOBs (Investment Banking, Treasury Security System) into target tables.
  • Used SQL*Loader to perform ad hoc data loads as needed by the business.
  • Prepared UNIX scripts to support data loads.
  • Developed materialized views as the end product of the ETL system (see the sketch after this list).
  • Performed data management activities such as uploading user-supplied files into the system, running DML to update key business tables, and preparing ad hoc SQL reports.
  • Designed and configured a new Oracle database to serve as a new source for the data load.
  • Enhanced the ETL scripts to support the newly configured database above and application front-end changes.
  • Performed production releases every six weeks for each new set of requirements, covering GUI changes as well as ETL code changes.
  • Documented all environment setup changes, adoption of new processes or rules, changes in deployment methodology, and inclusion of any new services.
  • Worked as the single point of contact for JPMC business users, offshore team members, and other JPMC stakeholders - gathering new requirements, publishing weekly and monthly status reports, holding client meetings, proposing estimates, and reviewing code.
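
A minimal sketch of a materialized view serving as an ETL end product; the view, table, and column names are illustrative:

    -- Pre-aggregated LOB summary, refreshed on demand at the end of each batch load
    CREATE MATERIALIZED VIEW mv_lob_daily_summary
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    AS
    SELECT lob_cd, trade_dt, SUM(amount) AS total_amount, COUNT(*) AS txn_cnt
      FROM lob_transaction
     GROUP BY lob_cd, trade_dt;

    -- Refresh step at the end of the batch:
    -- EXEC DBMS_MVIEW.REFRESH('MV_LOB_DAILY_SUMMARY', 'C');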

Environment: Oracle 10g, UNIX, WinSCP, Subversion (version control), JIRA (status tracking), SQL Developer, Gvim (Unix editor), Autosys (job scheduling)

Confidential

System Analyst

Responsibilities:

  • Built Oracle packages, procedures/functions, triggers, synonyms, and views.
  • Migrated Oracle DB from 8i to 10g and performed database design/modeling.
  • Built new forms/reports and enhanced existing ones during the migration from 6i to 10g.
  • Enabled 10g Forms features such as WebUtil.

Environment: Oracle 8i/10g, Forms/Reports 6i/10g, PVCS (version control), Oracle Enterprise, Unix, Solaris application server.
