
ETL Informatica/Data Warehouse Developer Lead/Data Modeler Resume


OH

SUMMARY

  • 7+ years of professional experience in Data Warehouse/ETL Application Design, Development, Testing, Implementation, Administration & Maintenance using Informatica, Oracle PL/SQL on UNIX environments.
  • A self-starter with a positive attitude and a willingness to learn new concepts and accept challenges.
  • Rich experience in creating Informatica Transformations like Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer, Transaction Control, Data Masking, etc.
  • Extensively used Informatica Designer to build mappings and Informatica Workflow Manager to move data from multiple sources into targets, along with reusable tasks such as sessions and pre/post-session commands; experienced in mapping parameterization and in setting up concurrent/parallel workflow instance executions.
  • Rich experience in Informatica integration of various data sources such as Oracle, DB2, Sybase and SQL Server, and non-relational sources such as flat files.
  • Rich experience in designing and developing logical & physical data models using Erwin DM.
  • Rich experience in developing Views, PL/SQL Procedures, Functions, Packages and Triggers.
  • Rich experience in using exception-handling methods along with PRAGMA EXCEPTION_INIT / RAISE_APPLICATION_ERROR to associate user-defined exception names.
  • Rich Experience in configuration, development and testing of Autosys JIL Automation/Scheduling Tool scripts, Calendars, Global variables, Machine updates, etc.
  • Rich experience in Informatica Administration activities like Informatica folder setup, pmrep commands usage, security domain creation with LDAP account integration and Informatica Administrator Console Domain/Repository setup and maintenance.
  • Rich Experience in UNIX Shell Scripting and File Transfer Protocol configurations like SFTP, PGP & FTP methods.
  • Experience in creating Query, Validation, Map Operation, Key Generation and Table Comparison transforms in SAP Data Services (BODS) Designer.
  • Experience in writing DB2 SQL queries.
  • Experience in writing Technical Specification Documents and Developing Test Document templates for the team to work on.
  • Proficient in requirement gathering, data profiling, detailed ETL code design and functional requirement documents.
  • Experience in developing External Tables, Materialized Views, Joins, Indexes and sequences.
  • Experience in query optimization and PL/SQL performance tuning: Explain Plan, indexing, hints, bulk binds, BULK COLLECT, creation of global temporary tables and table partitioning (static & dynamic).
  • Experience in using PL/SQL Wrappers to encrypt the PL/SQL procedures and packages adhering to security standards.
  • Experience in migrating the Informatica repository from 8.6 to 9.5; also worked on Oracle data warehouse migration to new servers, testing the batch scripts and other batch configuration files on the new UNIX machine.
  • Basic knowledge of using DBMS_SCHEDULER to run jobs inside Oracle and of other DBMS built-in packages.
  • In-depth knowledge of setting up validation scripts and of application usage/purpose to aid DW/ODS batch job automation and alerts.
  • Extensively used TOAD, PL/SQL Developer & the SQL*Plus Oracle utility for SQL queries.
  • Basic hands-on knowledge of Extraction, Transformation and Loading (ETL) processes using SQL*Loader.
  • Played various roles on projects that required data warehouse consulting, including ETL Developer, Product Test/Support Engineer, Data Warehouse Developer, ETL Technical Lead and Team Lead.
  • Experience in interacting with Business Users and Business Analysts to gather and analyze business requirements and translate them into functional and technical design specifications.
  • Apart from the experience listed below, have worked on more than a couple of projects built on reporting requirements in existing data warehouses & operational data stores to support completely new reporting requirements per client expectations.
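
The UNIX batch scripting experience above typically takes the shape of a job wrapper script with a lock file, logging and exit-status handling. A minimal sketch follows; the workflow name is hypothetical, and a placeholder command (`true`) stands in for the real Informatica `pmcmd` call so the control flow is self-contained:

```shell
#!/bin/sh
# Minimal sketch of a DW batch wrapper script (illustrative only).
# In a real batch, JOB_CMD would invoke pmcmd or sqlplus; "true" stands in here.

JOB_NAME="wf_daily_load"                  # hypothetical workflow name
LOG_DIR="${LOG_DIR:-/tmp}"
LOG_FILE="$LOG_DIR/${JOB_NAME}_$$.log"
LOCK_FILE="$LOG_DIR/${JOB_NAME}_$$.lock"

if [ -f "$LOCK_FILE" ]; then
    # Guard against a concurrent run of the same job.
    STATUS="SKIPPED"
else
    touch "$LOCK_FILE"
    echo "$(date '+%Y-%m-%d %H:%M:%S') starting $JOB_NAME" >> "$LOG_FILE"
    # Placeholder for the real command, e.g.:
    #   pmcmd startworkflow -sv INT_SVC -d DOMAIN -f DW_FOLDER -wait "$JOB_NAME"
    if true; then
        STATUS="SUCCESS"
    else
        STATUS="FAILED"
    fi
    rm -f "$LOCK_FILE"
fi
echo "$(date '+%Y-%m-%d %H:%M:%S') $JOB_NAME status: $STATUS" >> "$LOG_FILE"
echo "$STATUS"
```

In production, a wrapper like this would be launched from the scheduler (e.g. Autosys), with the script's exit status driving job alerts.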

TECHNICAL SKILLS

Databases: Oracle 9i/10g/11g, DB2

Developer Tools: Informatica PowerCenter Suite, Erwin DM, BODS

Batch Scripting: Autosys, Jobtrac & UNIX Shell Scripts

Tools: TOAD, PL/SQL Developer, SQL*Plus Utility, ALM HP Quality Center, SharePoint, File Transfer/SFTP servers, ITSM, EURC, Jenkins.

Version Control: Tortoise SVN, Harvest/Workbench

Agile: Jira

Deployment Tools: ARM (Automatic Release Management)

PROFESSIONAL EXPERIENCE

Confidential, OH

ETL Informatica/Data Warehouse Developer Lead/Data Modeler

Environment: Informatica, SQL, Erwin, BODS, DB2, UNIX

Responsibilities:

  • Responsibilities include interacting with Business Users and Business Analysts to gather and analyze business requirements and FSDs, and creating Technical Specification Documents (TSDs).
  • As ETL Lead, daily activities include designing low-level code design documents, coding, writing simple to complex SQL queries, data profiling, unit testing, code reviews, defect fixing and updating documents when changes come in.
  • Updating the data model as business requirements change; responsible for generating DDL scripts using CA Erwin Data Modeler and sending them to the DBAs to be applied to the database.
  • Involved in creating data process architecture for the data flow that would be remediated.
  • Responsible for packaging & versioning the developed objects that would be sent to migration team to move to QA & Production environments.
  • As onshore coordinator, hold a daily offshore call to delegate work; responsible for answering queries on existing ETL & DB designs.
  • As part of development, perform string testing to make sure data flow in a remediated business group is not affected.
  • Responsible for adding an error report at the end of each job load by writing data profiling queries after consultation/discussion with business stakeholders.
  • Responsible for developing batch Unix scripts along with setting up Unix batch profiles and configurations.
  • Responsible for data availability in non-prod environments which would be mocked up based on the new business requirements and comparing them with existing production data.
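A Unix batch profile of the kind mentioned above is usually a sourced shell file that exports the environment every batch script depends on. A sketch, with all paths, SIDs and directory names hypothetical:

```shell
#!/bin/sh
# Sketch of a Unix batch profile; every name below is a hypothetical example.
# A real profile would point at the actual Informatica and Oracle homes.
PROFILE=/tmp/dw_batch.profile

cat > "$PROFILE" <<'EOF'
# dw_batch.profile - environment for nightly DW batch jobs (illustrative)
export DW_HOME=/opt/dw                 # hypothetical application root
export INFA_HOME=/opt/informatica      # hypothetical Informatica install
export ORACLE_SID=DWPRD                # hypothetical database SID
export PATH=$INFA_HOME/server/bin:$PATH
export BATCH_LOG_DIR=$DW_HOME/logs
EOF

# Each batch script sources the profile before running its job.
. "$PROFILE"
echo "batch env ready: SID=$ORACLE_SID, logs in $BATCH_LOG_DIR"
```

Centralizing the environment this way means a server migration only requires editing the profile, not every individual batch script.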

Confidential, DE

ETL Developer Lead, Informatica/Data Warehouse Senior Developer

Environment: Informatica, Oracle PLSQL, Linux/Unix, Autosys, Jira/Subversion

Responsibilities:

  • Responsible for Requirement gathering, designing (Technical Design) ETL interfaces, reviewing mapping design by team members as well as leading the team in coding, testing, performance tuning in developing Informatica mappings and workflows.
  • Responsible for production migration and providing stabilization support.
  • Responsible for maintaining data cleansing standards within Informatica and storing related metadata in Oracle for data analysis.
  • Responsible for writing data profiling queries, after consultation/discussion with business stakeholders, which are sent as a statistics report at the end of each batch cycle.
  • Responsible for providing updates on project status and issues; served as a point of contact on the ETL team for other teams such as Reporting, Testing and QA.
  • Involved in interacting with Business Users and Business Analysts to gather and analyze business requirements and translate them into functional and technical design specifications.
  • Responsible for providing masked copies of data in test environments.
  • Handling client escalations, providing technical guidance to team members on various ETL issues, and writing HLD & LLD documents and test document templates.
  • Creating Oracle Views, Tables, Procedures, Functions to aid batch cycle data load and validation.
  • Responsible for scheduling Oracle & Informatica objects in Unix via Autosys batch scripts.
  • Responsible for setting up Unix profiles and configurations to aid batch process.
  • Extensive testing of Informatica and Oracle objects in DEV, IST, QA environments.
  • Responsible for running security scans against the source code via Jenkins from subversion tool source code repository.
  • Responsible for design/development of extensive validation, reconciliation and error-handling methods within the batch process pipeline to maintain data integrity.
  • Responsible for data availability in non-prod environments, reported via a corporate-level technology infrastructure service to the front-end Java application.
  • The warehouse coding technique implemented has a unique feature whereby business users, via an Admin module, can control data display features in the front-end application.
  • Responsible for storing security-related application log data, along with Java email functionality parameters and logs, in the warehouse, which was used for real-time application actions/controls.
  • Code snippets are stored in Subversion, and deployments are formalized/scheduled via ITSM for manual deployment by DBAs in UAT & Prod environments.

Confidential

Informatica, Data Warehouse/ETL Developer

Environment: Informatica, Oracle PLSQL, Linux/Unix, Autosys, Jira/Subversion

Responsibilities:

  • Informatica & Oracle Developer for creating objects in the repository as per the business requirement.
  • Creating high-level & low-level ETL flow designs.
  • Involved in detailed technical design.
  • Gave functional KT (knowledge transfer) to the QA team.
  • Was involved in conducting the review of Informatica Code, Unit Test Cases & Results with the Developers.
  • Organize daily technical discussions with the onsite team, including the individual offshore work-stream leads, and set expectations for offshore delivery.
  • Creating Oracle Views/Tables and other oracle objects like Procedures/Functions which would be useful for maintenance and aid reporting purposes.
  • Responsible for writing Unix batch scripts for scheduling jobs in Autosys.
  • This data warehouse consists of more than 2,000 jobs that run every night to refresh the warehouse; some large-volume mappings use Informatica partitioning, and some of the huge tables use Oracle partitioning to handle very large volumes.
  • Apart from reporting, data from this warehouse is also sent manually as interfaces to many other applications via SFTP & table loads using Informatica.
  • Have also done requirement gathering/design/technical specs for multiple interface clients, which usually involves detailed business knowledge of various HR-related processes inside the warehouse.
  • Each release for this warehouse follows an agile methodology with bi-weekly/monthly releases, using the Jira tool for tracking, Subversion for code maintenance and the Automatic Release Management tool for deployment.
  • Rich experience in formulating data extraction/loading & reporting validations performed at different levels of the data load.
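
Nightly jobs in a warehouse like the one above are scheduled through Autosys JIL definitions. The sketch below generates one such definition from shell, the way batch teams often template JIL files; the job name, machine, owner and paths are all hypothetical:

```shell
#!/bin/sh
# Sketch of an Autosys JIL definition for one nightly job (all names hypothetical).
# In practice the generated file would be loaded with: jil < nightly_load.jil
JIL_FILE=/tmp/nightly_load.jil

cat > "$JIL_FILE" <<'EOF'
/* JIL for a hypothetical nightly warehouse refresh job */
insert_job: DW_NIGHTLY_LOAD
job_type: c
command: /opt/dw/bin/run_wf.sh wf_daily_load
machine: dwbatch01
owner: dwbatch
start_times: "02:00"
days_of_week: all
std_out_file: /opt/dw/logs/DW_NIGHTLY_LOAD.out
std_err_file: /opt/dw/logs/DW_NIGHTLY_LOAD.err
alarm_if_fail: 1
EOF

echo "JIL written to $JIL_FILE"
```

With `alarm_if_fail: 1`, a non-zero exit status from the wrapped command raises an Autosys alarm, which is what ties the batch scripts' exit codes to operational alerts.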

Confidential

Data Warehouse/ETL Developer

Environment: Informatica, Oracle PLSQL, Linux/Unix, Autosys, Subversion

Responsibilities:

  • Informatica & Oracle Developer for creating objects in the repository as per the business requirement.
  • Creating Oracle Views/Tables and other oracle objects like Procedures/Functions which would be useful for maintenance and aid reporting purposes.
  • Responsible for writing Unix batch scripts for scheduling jobs in Autosys.
  • Extensive testing of new Oracle/Informatica objects in development and testing environments before moving to UAT & production.
  • Subversion tool is used for code management and deployments.

Confidential

ETL Developer

Environment: Informatica, Oracle PLSQL, Linux/Unix, Autosys, Subversion

Responsibilities:

  • Informatica & Oracle Developer for creating objects in the repository as per the source/target database requirements.
  • The Oracle publication/subscription concept is used mainly to move data, while Informatica is used to link/store/send data from/to other non-Oracle databases.
  • Responsible for writing Unix batch scripts for scheduling jobs in Autosys.
  • This data warehouse solely serves as data mover between different database applications.
  • External tables are used for loading data into this warehouse; the data is sent directly to the database server as text files from quite a few applications.
  • Subversion tool is used for code management and deployments.
