AWS/ETL/Big Data Developer Resume
Georgia
SUMMARY:
- 13+ years of IT experience in database architecture, ETL, and Big Data/Hadoop development.
- Ability to independently multitask, be a self-starter in a fast-paced environment, communicate fluidly and dynamically with the team, and drive continuous process improvements with out-of-the-box thinking.
- Experienced in extract, transform, and load (ETL) processing of large datasets of different forms, including structured, semi-structured, and unstructured data.
- Experience in understanding business requirements for analysis, database design, and development of applications.
- Enthusiastic learner and excellent problem solver.
TECHNICAL SKILLS:
Big Data Ecosystems: Hadoop, HDFS, Hive, Pig, Sqoop, AWS, CloudWatch, S3, Redshift Spectrum, Athena, Glue, Amazon Redshift, Databricks, Scala, Spark SQL, Zeppelin
ERP: Oracle Applications 11i.
Operating Systems: Windows NT/2000/XP, UNIX, Linux
Languages: C++, Java, VB, SQL, PL/SQL, HTML, UNIX Shell Scripting
Databases: Oracle 8.x/9i/10g/11g/12c, Postgres, MySQL, SQL Server
Tools/Utilities: TOAD, SQL*Loader, Oracle Forms (6i/10g) and Reports (6i/10g), Oracle Portal, Crystal Reports, Cognos, SAP DataServices, SQL Developer, Oracle Application Express (Oracle APEX), SQL Workbench, Aginity Workbench, SQL Manager, Eclipse
Version Control Tools: TFS, Visual SourceSafe
Data Modeling: CA Erwin, Visio, ER/Studio
SDLC Methodology: Waterfall, Agile, Onsite-Offshore Model
BI: SAP, ThoughtSpot, Tableau
API: Google and Bing Java APIs
Data Warehouse (ETL): Informatica, SAP Data Services, SSIS
PROFESSIONAL EXPERIENCE:
Confidential
AWS/ETL/Big Data Developer
Responsibilities:
- Design and develop ETL processes in AWS Glue to migrate campaign data from external sources such as S3 (ORC/Parquet/text files) into Amazon Redshift.
- Extract, aggregate, and consolidate Adobe campaign data within AWS Glue using PySpark (see the sketch after this list).
- Create external tables with partitions using Hive, AWS Athena, and Redshift.
- Design external and managed tables in Hive and load data into HDFS using Sqoop.
- Create user-defined functions (UDFs) in Redshift.
- Migrate Adobe Marketing Campaign data from Oracle into HDFS using Hive, Pig, and Sqoop.
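A minimal PySpark sketch of the Glue-style consolidation described above; the bucket paths and column names (campaign_id, event_ts, visitor_id) are hypothetical placeholders, not the actual project schema.

```python
# Hypothetical consolidation of Adobe campaign extracts staged in S3.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adobe-campaign-etl").getOrCreate()

# Read raw campaign extracts (Parquet) from S3.
raw = spark.read.parquet("s3://example-bucket/adobe/campaigns/")

# Aggregate and consolidate events per campaign and day.
daily = (
    raw.groupBy("campaign_id", F.to_date("event_ts").alias("event_date"))
       .agg(F.count("*").alias("events"),
            F.countDistinct("visitor_id").alias("visitors"))
)

# Stage the result back to S3; a Redshift COPY (or Glue's Redshift
# connection) then loads the staged files into the warehouse.
daily.write.mode("overwrite").parquet("s3://example-bucket/staged/campaign_daily/")
```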
Confidential, Georgia
Database Architect/ AWS/ETL Developer
Responsibilities:
- Created entity relationship diagrams and multidimensional data models for merging Confidential and Whitefence data sets into a single data warehouse using Embarcadero ER/Studio
- Created logical and physical data models for Online Campaign Data Management using ER/Studio and Visio
- Applied data cleansing/data scrubbing techniques to ensure consistency amongst data sets.
- Lowered processing costs and time for the order processing system by reviewing and reverse-engineering existing database structures, reducing redundancies and consolidating databases.
- Migrate data into the RV Data Pipeline using Databricks, Spark SQL, and Scala.
- Migrate Confidential call center data into the RV data pipeline from Oracle into HDFS using Hive and Sqoop
- Build data migration processes using SQL Server as the database and SSIS as the ETL tool
- Design, develop, test, and maintain the Allconnect data warehouse, which is built on Oracle 12c
- Load data into Amazon Redshift (see the sketch after this list) and use Amazon CloudWatch to collect metrics and monitor AWS RDS instances within Confidential
- Designed and developed ETL/ELT processes to handle data migration from multiple business units and sources including Oracle, Postgres, Informix, MSSQL, Access and others.
- Developed and executed a migration strategy to move Data Warehouse from an Oracle platform to AWS Redshift.
- Used BI tools such as ThoughtSpot and SAP to create and maintain reports
- Developed SAP Data Services ETL procedures to load data into the data warehouse from a variety of data sources, including flat files and database links (Postgres, MySQL, Oracle).
- Loaded data into star schemas (fact, bridge, and dimension tables) for organization-wide OLAP and analytics, and wrote batch files and UNIX shell scripts to automate data load processes
- Knowledge of extracting data from sources such as Google AdWords, Bing Ads, and Analytics into the data warehouse using their Java APIs
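A hedged sketch of the Redshift load step mentioned above: issuing a COPY of staged S3 files through psycopg2. The cluster endpoint, credentials, IAM role, and table names are placeholders.

```python
# Hypothetical COPY of staged Parquet files from S3 into Redshift.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dw", user="etl_user", password="***",
)
copy_sql = """
    COPY analytics.campaign_daily
    FROM 's3://example-bucket/staged/campaign_daily/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
    FORMAT AS PARQUET;
"""
with conn, conn.cursor() as cur:
    cur.execute(copy_sql)  # Redshift ingests the staged files in parallel
conn.close()
```

Staging extracts to S3 and letting COPY parallelize the load across slices is the usual reason an Oracle-to-Redshift migration is structured this way.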
Confidential, Georgia
Data Architect/Sr. Oracle Developer/Team Lead/Scrum Master
Responsibilities:
- Data modeler responsible for gathering and translating business requirements into detailed technical specifications and creating robust data models using Erwin Data Modeler and Visio. Successfully completed more than 15 projects, including Health Records Print and Mail, Claim Rebuttal System, and Tax and Financials models
- Collaborate with data architects for data model management and version control
- Conduct data model reviews with project team members enforcing standards and best practices around data modeling efforts.
- Ensure data warehouse and data mart designs efficiently support BI and end-user needs
- Develop and maintain backend business logic using Oracle SQL and PL/SQL: views, materialized views, procedures, packages, triggers, and functions (see the sketch after this list)
- Provide technical support and troubleshoot various production issues
- Build data migration and integration processes using Oracle and Informatica PowerCenter to load a single data warehouse repository
- Tune Informatica mappings for optimum performance
- Maintain and develop applications implemented in Oracle Forms & Reports 6i
- Lead Scrum meetings and serve as team leader and mentor for various project teams
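A small sketch of how packaged PL/SQL business logic of the kind described above can be exercised from Python. The DSN and the claims_pkg.rebuild_summary procedure are hypothetical stand-ins for the actual backend packages.

```python
# Hypothetical call into a PL/SQL package from Python.
import oracledb

conn = oracledb.connect(user="app", password="***", dsn="dbhost/ORCLPDB1")
with conn.cursor() as cur:
    # Call the packaged procedure with a sample parameter (a fiscal year).
    cur.callproc("claims_pkg.rebuild_summary", [2024])
conn.commit()
conn.close()
```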
Confidential
Oracle Developer
Responsibilities:
- Developed conversion programs and imported legacy data to GL using journal import.
- Developed interface programs to integrate Oracle Financials GL with legacy systems.
- Customized GL forms and reports such as the account analysis report, budget reports, chart of accounts, consolidation reports, and journal reports.
- Created procedures and SQL*Loader scripts to populate customer interface tables with data (see the sketch after this list)
- Designed and developed reports using Reports 6i, registered them as concurrent programs, and added them to their corresponding menus in Oracle Applications.
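A hedged sketch of automating the SQL*Loader step above from Python; the control file, data file, and connect string are placeholders.

```python
# Hypothetical wrapper that drives sqlldr to load the customer
# interface tables from a flat file.
import subprocess

result = subprocess.run(
    ["sqlldr", "apps/***@ORCL",
     "control=customer_iface.ctl",
     "data=customers.dat",
     "log=customer_iface.log"],
    capture_output=True, text=True,
)
# sqlldr exits 0 on success, 2 when some rows were rejected, 1/3 on errors.
print(result.returncode, result.stdout)
```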
Confidential
Oracle Developer
Responsibilities:
- Migrated data from legacy systems into the Oracle Applications INV module via Item Import, using PL/SQL.
- Intensively tested applications against IFRS standards with Ernst & Young
- Prepared test cases and test data based on the IAS clauses
- Wrote SQL queries to find data anomalies and developed custom programs to clean data
- Created concurrent programs (procedures and packages) to perform validations while importing data from legacy systems into Oracle Applications.
Confidential
Oracle Developer
Responsibilities:
- Design, customization, and integration of Forms/Reports for the Oracle Receivables, Payables, and General Ledger modules in Oracle Applications 11i.
- Documentation of QMS and SDLC procedures for the organization
- Coordination with clients to develop new forms and reports, customizing the modules to their business requirements and integrating them with Oracle Applications 11i.
- Writing PL/SQL programs and SQL*Loader scripts for migrating data from legacy systems into Oracle Applications standard interface tables.
Confidential
Programmer Analyst
Responsibilities:
- Develop custom programs using Java and Oracle
- Actively involved in the analysis, database design, coding, testing, and implementation phases of the project
- Creation of objects like stored procedures, triggers, tables, views and analyzing tables and indexes for performance tuning.