
ETL Data Engineer Resume


San Ramon, CA

SUMMARY:

  • 7 years of experience in the design, development, administration, testing, implementation, production support, and maintenance of enterprise data warehouse systems.
  • Hands-on experience with ETL tools such as Informatica, SSIS, Pentaho Data Integration (Kettle), and IBM DataStage. Hands-on with SQL, NoSQL, PostgreSQL, etc.
  • Experience with Python, Perl scripting, and UNIX KSH.
  • Experience in the healthcare, financial, investment/banking, automotive, and insurance domains.
  • Experience with data visualization and dashboard tools such as MicroStrategy, Tableau, Pentaho Report Designer, Cognos, Platfora, and Datameer.
  • Worked in Waterfall, Kanban, and Agile methodologies; accustomed to daily stand-up scrum meetings and code delivery on short timelines.
  • Basic DevOps experience (especially on AWS). Familiarity with Jira, Confluence, Git, and build-management tools. Familiarity with AWS components such as S3, EC2, DynamoDB, VPC, and Auto Scaling.
  • Hands-on experience with databases including Netezza, HBase, Oracle, IBM DB2 8.0, SQL Server, Greenplum, MongoDB, Teradata, DynamoDB, and Redshift, spanning MPP and columnar architectures.
  • Expertise in writing complex SQL queries and PL/SQL components: stored procedures, triggers, and packages.
  • Experience in performance tuning of sources, targets, mappings, transformations, and sessions.
  • Data modeling experience using Erwin 3.0 and Embarcadero ER/Studio Data Architect.
  • Experience in developing test strategies, test plans, test cases, use cases, test scripts, traceability matrices, and bug reports using HP/Mercury Quality Center, Bugzilla, etc.
  • Experience with on-call production support and incident resolution in an onsite-offshore model; coordinated activities effectively across the different phases of projects.
  • Sound technical, communication, time-management, interpersonal, and teamwork skills.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.x/8.x, Informatica Big Data Edition 9.6.1, Hadoop data analysis tools (Pig, Hue, Impala), Pentaho Data Integration (Kettle), SSIS, IBM DataStage.

Programming Skills: UNIX Shell Scripting, Perl, Python.

Databases: Oracle 8/8i/9i/10g/11g, SQL Server, DB2, Teradata, Netezza (Aginity 4.5.0), Hadoop, Quin Fin 96, Greenplum, MongoDB.

Operating Systems: MS-DOS, Windows, UNIX, Linux.

Scheduling Tools: cron (crontab), AutoSys, Control-M

Reporting Tools: MicroStrategy, Cognos, Business Objects, SSRS, Tableau, Datameer, Platfora

Miscellaneous Exposure: Subversion, Rational ClearCase, HP Quality Center/ALM, AnthillPro, Agile/Scrum, GNU/Linux, Solaris, Macintosh, MS Visual Studio, IBM Netezza server administration, TFS, Erwin

WORK EXPERIENCE:

ETL Data Engineer

Confidential, San Ramon, CA

Responsibilities:

  • Worked as a member of a cross-functional development team to provide production support and customer service for the client's business-specific data marts.
  • Worked independently to analyze, develop, and maintain the client's business reporting and analytics applications.
  • Created technical designs, developed, unit tested, and documented application enhancements and fixes using approved technologies.
  • Designed, implemented, and maintained report batches using Oracle reporting tools and Hyperion toolsets.
  • Designed and developed ETL processes to populate the enterprise data warehouse and data marts using Informatica, Pentaho, Oracle PL/SQL, UNIX shell scripts, and Python (see the sketch after this list).
  • Troubleshot and resolved performance issues related to databases, applications, software configuration, and ETL.
  • Worked in distributed systems and SOA (service-oriented architecture) environments.
  • Worked on AWS cloud migration from on-premises servers.
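
A minimal, illustrative sketch of the kind of Python-driven staging step such ETL processes typically include: loading a delimited extract into an Oracle staging table ahead of downstream Informatica/Pentaho processing. The table name, file path, and connection details are hypothetical placeholders, not the client's actual implementation.

```python
#!/usr/bin/env python
"""Illustrative ETL staging step: load a pipe-delimited extract into an
Oracle staging table. All names (file path, table, credentials, DSN)
are hypothetical."""

import csv
import cx_Oracle  # Oracle client library, assumed available on the ETL host

# Hypothetical staging table and bind-variable insert
STAGE_SQL = "INSERT INTO stg_sales (sale_id, region, amount) VALUES (:1, :2, :3)"

def load_extract(path, conn):
    with open(path, newline="") as fh:
        reader = csv.reader(fh, delimiter="|")
        next(reader)                      # skip the header record
        rows = [(r[0], r[1], float(r[2])) for r in reader]
    cur = conn.cursor()
    cur.executemany(STAGE_SQL, rows)      # bulk insert into the staging table
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    conn = cx_Oracle.connect("etl_user", "etl_pass", "dwhost/DWPRD")  # hypothetical DSN
    print("rows staged:", load_extract("/data/inbound/sales_extract.dat", conn))
    conn.close()
```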

Environment: Pentaho, Informatica, UNIX KSH, Perl, Python, SQL, PL/SQL, SQL Developer, Oracle, AWS, Hyperion, Protiviti.

ETL Data Engineer/ETL Consultant

Confidential, Long Beach, CA

Responsibilities:

  • Primary work involved automating the CDC data flow between QNXT PLANDATA databases (Netezza and SQL Server) for health-plan data covering 14 US states, using Python, Perl, and KSH scripts.
  • Wrote automation scripts in Python, Perl, and UNIX KSH to monitor, notify, trigger, stop, and pre-process various tasks related to the ETL Change Data Capture (CDC) tool.
  • Performed data ingestion into table containers using Python scripts (petl, pandas, etc.); see the sketch after this list.
  • Built automation scripts for notifications, job triggering, job stopping, monitoring, and latency issues related to Attunity Replicate (the CDC tool).
  • Developed stored procedures and free-form SQL for data validation and data cleansing after Change Data Capture between the Netezza and SQL Server databases.
  • Designed and implemented AutoSys job processes to trigger the Perl/Python automation scripts: resource-allocation jobs, one-to-one jobs, file-watcher jobs, and downstream/upstream dependency jobs.
  • Worked in an admin/operations role for the DataStage tool: new .dsx code deployment, on-demand job processing, new user creation, etc.
  • Worked as an admin for data modeling tools (Embarcadero Team Server, Embarcadero ER/Studio Data Architect): created new user groups, managed enterprise glossaries, and created new glossaries.
  • Worked in Agile, Waterfall, and Kanban methodologies across different projects.
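
An illustrative pandas-based ingestion script of the kind described above: read a delimited health-plan extract and bulk-load it into a SQL Server staging table. The connection string, file name, and table name are hypothetical placeholders.

```python
"""Illustrative ingestion step: stage a pipe-delimited extract into a
SQL Server table via pandas + SQLAlchemy. All names are hypothetical."""

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical SQL Server connection via ODBC
engine = create_engine(
    "mssql+pyodbc://etl_user:etl_pass@sqlhost/PLANDATA"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

def ingest(path, table):
    df = pd.read_csv(path, sep="|", dtype=str)        # keep everything as text for staging
    df.columns = [c.strip().lower() for c in df.columns]
    df.to_sql(table, engine, if_exists="append", index=False, chunksize=10_000)
    return len(df)

if __name__ == "__main__":
    n = ingest("/data/inbound/member_extract.dat", "stg_member")
    print(f"{n} rows ingested into stg_member")
```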

Environment: SQL Server, Netezza, Attunity Replicate, Python, Perl, UNIX, IBM DataStage, Embarcadero ER/Studio Data Architect, Embarcadero Team Server.

Sr. ETL Consultant

Confidential, Irvine, CA

Responsibilities:

  • Designed Informatica ETL processes based on the provided business rules.
  • Developed, unit tested, and implemented Informatica ETL mappings and workflows.
  • Worked in Agile and Kanban methodologies; accustomed to daily stand-up scrums and code delivery on short timelines.
  • Gained experience with big data tools such as Informatica Big Data Edition and Hadoop data analysis components such as Hue, Pig, and Impala.
  • Created and maintained Informatica users and privileges.
  • Migrated workflows, mappings, and sessions from DEV to QA and from QA to Prod environments.
  • Performed Informatica upgrades, maintenance, administration, and installation on different platforms.
  • Worked on various migration projects, such as SQL Server to Netezza (Quin Fin 96) and Windows to Linux.
  • Developed wrapper, file-handling, and post-processing scripts in Perl and shell, primarily for FTP transfers of inbound and outbound files to third-party clients, file archival, error logging, and unzipping files (see the first sketch after this list).
  • Tested and troubleshot defects found in the ETL process; wrote and executed test cases.
  • Maintained Informatica mappings and workflows.
  • Built SSIS packages (.dtsx files) for data extraction from SQL Server to the Netezza database.
  • Optimized Informatica mappings, sessions, worklets, and workflows.
  • Used KSH and Python scripts to analyze and quantify large source files (see the second sketch after this list).
  • Created SQL stored procedures and views.
  • Built data models for all the databases using the Erwin 3.0 tool.
  • Verified coding standards and was involved in performance tuning.
  • Gained exposure to reporting tools such as MicroStrategy, Platfora, and Datameer.
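
First, a hedged sketch of the FTP wrapper pattern described above. The original scripts were Perl/shell; this is an illustrative Python equivalent, and the host, credentials, and paths are hypothetical placeholders.

```python
"""Illustrative FTP wrapper: push an outbound file to a third-party
client and archive it locally on success. All names are hypothetical."""

import shutil
from ftplib import FTP
from pathlib import Path

def send_outbound(local_file, host, user, password, remote_dir, archive_dir):
    local = Path(local_file)
    ftp = FTP(host)
    ftp.login(user, password)
    ftp.cwd(remote_dir)
    with open(local, "rb") as fh:
        ftp.storbinary(f"STOR {local.name}", fh)   # upload the outbound file
    ftp.quit()
    # archive the file locally only after a successful transfer
    shutil.move(str(local), str(Path(archive_dir) / local.name))

if __name__ == "__main__":
    send_outbound("/data/outbound/claims_20150601.dat",
                  "ftp.example-client.com", "etl_user", "etl_pass",
                  "/inbound", "/data/archive")
```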
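
Second, a minimal sketch of a source-file profiler like the KSH/Python scripts mentioned above: count rows, flag ragged records, and summarize column fill rates for a large delimited extract. The file path, delimiter, and layout are hypothetical.

```python
"""Illustrative profiler for a large pipe-delimited source file."""

import csv
from collections import Counter

def profile(path, delimiter="|"):
    row_count, ragged = 0, 0
    non_empty = Counter()
    with open(path, newline="") as fh:
        reader = csv.reader(fh, delimiter=delimiter)
        header = next(reader)
        for row in reader:
            row_count += 1
            if len(row) != len(header):
                ragged += 1                       # record-length mismatch
                continue
            for col, val in zip(header, row):
                if val.strip():
                    non_empty[col] += 1
    print(f"rows={row_count}  ragged={ragged}")
    for col in header:
        pct = 100.0 * non_empty[col] / row_count if row_count else 0.0
        print(f"{col:30s} {pct:6.2f}% populated")

if __name__ == "__main__":
    profile("/data/inbound/claims_extract.dat")   # hypothetical path
```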

Environment: Informatica Big Data Edition 9.6.1, Hadoop, Hue, Pig, Impala, Sqoop, Informatica 9.5, Netezza, Aginity, Oracle, SQL Server, TOAD, Teradata, PL/SQL, SSIS, T-SQL, UNIX shell scripting (KSH), MicroStrategy 10.0, Platfora, Datameer.

ETL Tech Lead

Confidential, St. Louis, MO

Responsibilities:

  • Interacted with business analysts and business teams to gather and understand business requirements and provide technical solutions.
  • Analyzed highly complex business requirements and wrote technical specifications to design or redesign complex computer platforms and applications.
  • Performed unit testing on new code and workflows.
  • Created high-level design (HLD) and low-level design (LLD) documents for the projects.
  • Prepared the testing run book based on the business requirements and the application changes under development.
  • Oversaw and performed the testing process and the deployment of code to different environments.
  • Created test cases and developed traceability matrices and test coverage reports.
  • Developed Slowly Changing Dimension mappings of Types I, II, and III (version, flag, and timestamp); see the sketch after this list.
  • Created test scripts for automated testing based on the changes and the test cases.
  • Validated data in the Oracle and DB2 databases using ad hoc SQL queries.
  • Used shell scripts extensively to automate file manipulation and data-loading procedures.
  • Developed parameter-driven ETL processes in Informatica to map source systems to the target data warehouse, including complete source-system profiling.
  • Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle database.
  • Worked extensively hands-on with MS Visio; prepared data models, algorithms, and schematic representations.
  • Created DataStage jobs using stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, and Row Generator.
  • Worked with Business Objects Data Services (BODS).
  • Performed Change Data Capture (CDC).
  • Worked with Business Objects to provide users with the required reports per business needs.
  • Followed the software development life cycle (SDLC) through all stages of the project.
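
A minimal sketch of the SCD Type II pattern referenced above (flag and timestamp versioning), shown in plain Python rather than as an Informatica mapping. The dimension layout and key names are hypothetical.

```python
"""Illustrative SCD Type II logic: expire the current version of a
changed dimension row and append a new current version."""

from datetime import datetime

def apply_scd2(dimension, incoming, now=None):
    """dimension: list of dicts with natural_key, attrs, eff_from, eff_to, current_flag.
    incoming: dict mapping natural_key -> latest source attributes."""
    now = now or datetime.utcnow()
    current = {r["natural_key"]: r for r in dimension if r["current_flag"] == "Y"}
    for key, attrs in incoming.items():
        row = current.get(key)
        if row is not None and row["attrs"] == attrs:
            continue                         # no change: keep the current version
        if row is not None:
            row["current_flag"] = "N"        # expire the old version
            row["eff_to"] = now
        dimension.append({                   # insert the new current version
            "natural_key": key, "attrs": attrs,
            "eff_from": now, "eff_to": None, "current_flag": "Y",
        })
    return dimension

dim = [{"natural_key": 101, "attrs": {"city": "St. Louis"},
        "eff_from": datetime(2012, 1, 1), "eff_to": None, "current_flag": "Y"}]
apply_scd2(dim, {101: {"city": "Chicago"}})
print(dim)  # old row expired, new current row appended
```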

Environment: Oracle 11g, TOAD, IBM DataStage, PL/SQL, DB2, UNIX shell scripting, Business Objects, SQL Developer, Informatica 9.1/9.5, AutoSys

Application Developer

Confidential, Wilmington, DE

Responsibilities:

  • Developed the PSN application using Oracle, Pentaho, Informatica, SQL, PL/SQL, Perl, UNIX, XML, HTML, and BrickStreet Connect.
  • Developed test plans, test cases, and test scripts for SIT, and supported UAT testing.
  • Developed test scripts for the automated testing of new scripts and workflows.
  • Prepared the testing run book for the testing process in IST and QA.
  • Handled design, development, unit testing, integration testing, QA, and implementation for all new projects.
  • Maintained and updated the system per client requirements.
  • Updated application information in SharePoint with the necessary documents at regular intervals.
  • Wrote ad hoc SQL queries to investigate data in the ETL workflow in the Oracle database.
  • Provided production support, debugged input data, and updated the production run book as needed per production tickets.
  • Used Python scripting to pre-wrap the source file sets (see the sketch after this list).
  • Worked on multiple projects and work orders simultaneously and delivered successfully.
  • Worked in onsite-offshore models for almost all projects and provided the necessary knowledge transfer of business requirements to the offshore team.
  • Worked as a lead on almost all projects, from business requirements analysis and scoping/estimation through design, development, unit testing, integration test support, code check-in, QA migration, QA test support, and implementation support.
  • Coded database triggers, packages, functions, and procedures in PL/SQL and maintained scripts for various data feeds.
  • Resolved multiple production issues/tickets by coordinating with groups such as tech operations, DBA, production build, and legal teams.
  • Developed new email campaign templates using HTML and XML.
  • Developed and modified KSH scripts based on client requirements.
  • Coordinated with the IST/QA teams to modify and write new test cases/use cases for the projects.
  • Developed Informatica/Pentaho workflows to accommodate new email campaigns per business requirements.
  • Worked with multiple groups from different lines of business and different backgrounds (technical, non-technical, and business).
  • Worked on conversion projects from Informatica to Pentaho.
  • Worked on conversion projects from Oracle Database to DB2.
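
A minimal sketch of a pre-wrapping step like the one described above: normalize line endings, strip stray control characters, and stamp header/trailer records before a file enters the ETL workflow. The file naming and record layout are hypothetical.

```python
"""Illustrative pre-wrap step for an inbound source file."""

import re
from datetime import datetime

# control characters that commonly break downstream parsers
CONTROL_CHARS = re.compile(r"[\x00-\x08\x0b\x0c\x0e-\x1f]")

def prewrap(src_path, dst_path, feed_name):
    count = 0
    with open(src_path, "r", errors="replace") as src, open(dst_path, "w") as dst:
        dst.write(f"HDR|{feed_name}|{datetime.utcnow():%Y%m%d%H%M%S}\n")
        for line in src:
            line = CONTROL_CHARS.sub("", line.rstrip("\r\n"))
            if line:                      # drop blank records
                dst.write(line + "\n")
                count += 1
        dst.write(f"TRL|{count}\n")       # trailer carries the record count
    return count

if __name__ == "__main__":
    n = prewrap("/data/inbound/campaign.raw", "/data/work/campaign.dat",
                "PSN_CAMPAIGN")           # hypothetical feed name and paths
    print(f"pre-wrapped {n} records")
```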

Environment: UNIX, F-Secure, KSH, Perl, SQL, PL/SQL, SQL Developer, Oracle, DB2, HTML, Pentaho, BrickStreet, XML, Subversion, ClearCase.
