
Senior ETL Engineer Resume


San Ramon, CA

SUMMARY

  • Overall 10 years of experience in analysing, designing, developing, testing, and implementing data warehouse/ETL solutions to meet ever-changing business requirements across diverse industries.
  • Skilled at understanding the business needs of an organisation, managing stakeholders, and developing strong, productive business relationships.
  • Experienced in dimensional data modelling using Star/Snowflake schemas, fact and dimension tables, de-normalization, normalization (1st, 2nd, and 3rd normal forms), and aggregations.
  • Experienced in leadership and key technical roles in projects related to effective implementation of data warehousing/ETL and business intelligence solutions.
  • Implemented data warehouse/ETL solutions using tools such as UNIX shell scripting, DB2, Teradata, Oracle, SQL, PL/SQL, Informatica PowerCenter 9.x/8.x/7.x, B2B Data Exchange/PowerExchange 9.x, Data Transformation (DT) Studio 9.x, DVO 9.x, DataStage, and many more.
  • Extensive knowledge of and hands-on experience in writing complex SQL, including analytic, aggregate, and correlated queries.
  • Sound understanding of Master Data Management processes in Informatica MDM, such as match and merge, unmerge, tokenization, base tables, XREF tables, golden records, the post-merge process, smart search indexes, and singleton records.
  • Experienced in creating logical and physical data models using Erwin.
  • Full software development lifecycle experience, with strong technical ability and a constant willingness to learn.
  • Outstanding ability to learn and apply technology to efficiently address requirements and meet deadlines.
  • Experienced in producing technical and functional documentation.
  • Experienced in Agile/Scrum environments.
  • Experienced in the onsite/offshore business model.
  • Sound knowledge of the Hadoop ecosystem and Big Data, with hands-on training in HDFS, MapReduce, Hive, Pig, and Flume.
  • Worked in the private and public sectors in the USA and Australia, across the banking, insurance, healthcare, and telecommunications domains.

TECHNICAL SKILLS

  • Data Warehousing
  • ETL
  • Data mapping
  • Data analysis
  • Data profiling
  • Data discovery
  • Business Intelligence Reporting
  • Informatica PowerCenter 9.x/8.x/7.x
  • Informatica Analyst/Developer (IDQ)
  • Informatica Data Transformation Studio (DT)
  • Informatica Data Validation Option (DVO)
  • Informatica Metadata Manager (IMM)
  • Informatica B2B/MFT (Managed File Transfer)
  • Informatica MDM/DAC/Power Exchange
  • Informatica Cloud
  • Informatica BDE
  • SSIS
  • Salesforce.com
  • DataStage
  • Teradata
  • DB2
  • Oracle
  • SQL Server
  • SQL
  • PL/SQL
  • SQL Assistant
  • TOAD
  • SQL Developer
  • SAP BusinessObjects
  • OBIEE
  • OBIA
  • Linux
  • Windows shell scripting
  • FTP
  • SFTP
  • Python
  • COBOL
  • XML
  • Apache Hadoop ecosystem (Pig, Hive, MapReduce, HBase, Sqoop)
  • Microsoft Office Suite
  • Visio
  • Erwin
  • ER studio
  • CONTROL-M
  • Rational Team Concert
  • SharePoint
  • Tivoli Service Resource Manager
  • ServiceNow
  • REST
  • SOAP
  • WSDL
  • Web Services

PROFESSIONAL EXPERIENCE

Confidential, San Ramon, CA

Senior ETL Engineer

Responsibilities:

  • As a lead member of the Enterprise Master Data Management (MDM) team on Project Landis:
  • Assisted in designing the ETL approach for loading the MDM stage tables from the EDW Datastore tables for several source systems.
  • Analysed the source-to-target mappings provided by the BSA and prepared technical design documents.
  • Developed ETL code for loading data from ten source systems out of the EDW Datastore tables into the MDM stage tables, which serve as source data for the Master Data Management program for identifying customers and related information.
  • Developed and tuned SQL for extracting data from the Datastore tables; WITH clauses, aggregate and analytic functions, multiple joins, Oracle built-in functions, and user-defined functions were used extensively in these extract queries to load data into the Party, Account, and Product tables and their child tables (party address, party ID detail, party/account related org, etc.). A sketch of this extract style appears after this list.
  • Created Informatica objects like mappings, sessions, workflows, mapplets, reusable sessions, reusable transformations, shortcuts.
  • Created web services using Informatica IDQ for real-time party and account address verification for the front-end systems.
  • Created Informatica objects for the pre-match-and-merge jobs that pave the way for tokenization and match/merge of the party and party child tables. Also created Informatica objects for post-merge jobs that load party derived fields and LOB assignments into the post-merge staging table.
  • Involved in code reviews; coordinated with the offshore team, walking them through mapping documents and technical design documents.
  • Interacted with business analysts to verify requirements, liaised with the QA team during defect fixing, presented solution designs to the offshore team, and supported the PRODOPS team during migration of code to higher environments.
  • Designed and developed enhancements to the Inbound/Outbound Engine Framework (an automated file-transfer framework between Confidential and IHC) using metadata, shell scripting, Informatica, and Oracle SQL*Loader scripts; the framework is used by all development teams to transfer file extracts and to decrypt/encrypt and stage them from/to the Intermediate Holding Company (IHC) for BNP Paribas.
  • Created various UNIX scripts for tasks such as a polling script enabling on-demand transmission of files by users and a script that renames files to the expected naming convention before they are sent to the Outbound Transfer Engine (see the polling sketch after this list).
  • Developed ETL code using Informatica, SQL, and UNIX for 15 Quantitative Risk Management (QRM) files from IHC into the BIDM EDW (pre-stage, stage, and Datastore tables in Oracle) and finally into the QRM tables in SQL Server.
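
For illustration, a minimal sketch of the extract style described above (a WITH clause plus an analytic function), run through sqlplus from a shell wrapper. The party/party_address tables and columns are hypothetical placeholders, not the actual EDW Datastore schema.

```bash
#!/usr/bin/env bash
# Hypothetical extract: pick the most recent address per party.
# Table and column names are illustrative, not the real Datastore schema.
sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'SQL'
WITH ranked_addr AS (
  SELECT p.party_id,
         a.address_line1,
         ROW_NUMBER() OVER (PARTITION BY p.party_id
                            ORDER BY a.effective_date DESC) AS rn
  FROM   party p
  JOIN   party_address a ON a.party_id = p.party_id
)
SELECT party_id, address_line1
FROM   ranked_addr
WHERE  rn = 1;
SQL
```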
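
And a minimal polling sketch of the kind described above: watch a landing directory for user-dropped files, rename them to an expected convention, and hand them to the transfer engine's pickup directory. The paths, 60-second interval, and naming convention are assumptions, not the actual framework.

```bash
#!/usr/bin/env bash
# Polling sketch: rename user-dropped files to an assumed convention and
# queue them for the Outbound Transfer Engine. All paths are illustrative.
LANDING=/data/landing
PICKUP=/data/outbound/pickup

while true; do
  for f in "$LANDING"/*.csv; do
    [ -e "$f" ] || continue                      # nothing to do yet
    base=$(basename "$f" .csv | tr '[:lower:]' '[:upper:]')
    target="$PICKUP/${base}_$(date +%Y%m%d).dat"
    mv "$f" "$target" && echo "queued $target"
  done
  sleep 60                                       # poll once a minute
done
```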

Environment: UNIX, SQL, shell scripting, Informatica PowerCenter 9.x, Informatica Developer 9.x, Informatica MDM Hub, IDQ, web services, SOAP, Oracle 11g, SQL Server, Toad, SharePoint, WinSCP, Quality Center, PuTTY, Tumbleweed, GPG utility.

Confidential, Woodland Hills, CA

ETL Tech Lead

Responsibilities:

  • Managed multiple projects simultaneously as a technical lead.
  • Contributed to every phase of the projects, from requirements gathering to deployment.
  • Worked closely with the project manager on LOE (level of effort) estimates, sizing, and project plans.
  • Involved in requirements gathering sessions with SMEs, BAs, and the business, and assessed requirements for completeness and accuracy.
  • Designed and defined the ETL solution required to meet the requirements.
  • Created technical design documents (TDDs) based on the BRD and best-practice artefacts, and coordinated with Confidential architects for approval/sign-off.
  • Led and guided the onshore/offshore team in developing the defined ETL load routines using Informatica, DB2, and UNIX tools.
  • Reviewed the code developed by the onshore/offshore developers.
  • Created and tested Informatica ETL objects like mappings, sessions, workflows, and reusable objects, and wrote data extraction queries in SQL on DB2 when necessary for data loads from the ODS to downstream systems.
  • Mentored junior developers.
  • Improved ETL job performance by tuning Informatica mappings, sessions, and DB2 queries.
  • Built and maintained complex SQL scripts/queries for data analysis and extraction (see the DB2 sketch after this list).
  • Assisted in cross project reviews.
  • Monitored and reported progress on completing projects and accomplishing goals to the manager.
  • Created unit test cases, reviewed unit test cases results produced by the team.
  • Involved in creating the technical runbooks and playbooks listing all tasks and jobs and the sequence in which they needed to be executed in each cycle.
  • Was the single point of contact from Confidential for managing lights-on (sev1, sev2, sev3) defects on provider systems data.
  • Created work orders in Tivoli SRM and change records in ServiceNow for moving code to the SIT, UAT, PERF, and PROD regions.
  • Liaised and coordinated with the test team on defect fixes during the SIT phase.
  • Liaised and coordinated with the release team for smooth migration of the code to the production environment.
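
A hedged sketch of the kind of ad hoc DB2 analysis query mentioned above, run from the shell via the DB2 command line processor; the database alias, schema, and threshold are illustrative placeholders.

```bash
#!/usr/bin/env bash
# Ad hoc DB2 analysis from the shell. ODSDB, ods.claims, and the 1000-row
# threshold are assumptions for illustration.
db2 connect to ODSDB user "$DB_USER" using "$DB_PASS"
db2 -x "SELECT provider_id, COUNT(*) AS claim_cnt
        FROM   ods.claims
        GROUP  BY provider_id
        HAVING COUNT(*) > 1000" > high_volume_providers.txt
db2 connect reset
```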

Environment: Informatica PowerCenter 9.x, Informatica Developer 9.x, DB2, UNIX, FTP, Salesforce, Rational ClearQuest, Tivoli SRM, ServiceNow, Teradata.

Confidential

ETL Consultant

Responsibilities:

  • Performed multiple roles (tech lead, developer, release engineer, solution designer) across projects.
  • Designed, constructed, and tested Informatica ETL, including workflows, sessions, mappings, mapplets, reusable objects, and pmcmd scripts (see the wrapper sketch after this list), for loading data from delimited/fixed-width flat files, XML files, Excel, and Oracle tables into Teradata staging/ODS/DDS tables in multiple projects.
  • Assisted in source-to-target mapping and modelling.
  • Profiled source flat files and tables for data discovery using the Informatica Data Quality tools (Developer/Analyst).
  • Involved in designing the dimensional data mart for the Medicare Statistics project, with six dimension tables and one fact table.
  • Created Logical and Physical models for the Medicare Statistics DataMart using Erwin.
  • Designed and developed mappings using Informatica tools for loading data from Excel files, the ODS, and the RDS into dimension tables like Geography, Service Item, Medicare Claim Type, Hospital Service Delivery, Claim Line Item, and Date, and the fact table Medicare Claim Service Line Item.
  • Implemented Change Data Capture (CDC) for loading the PHIAC-B, Insurer, Population, and Statistic Definition tables into the ODS for the MBDPHI project (a BTEQ MERGE sketch appears after this list).
  • Created, tested, and deployed multiple Informatica DT services using Informatica DT Studio to convert multi-tabbed Excel files into XML files and comma-delimited flat files.
  • Created XSDs, parsers containing groups, repeating groups, and variables, and serializers with Informatica DT Studio.
  • Configured Informatica B2B/MFT/DX and the partner/profile/endpoints in the Data Exchange (DX) operational console for easy file exchange between users and Confidential.
  • Designed, created, and ran test cases in Informatica DVO using metadata tables, reducing regression testing from multiple resources working for multiple hours to one person running the test cases for an hour.
  • Involved in creating data maps for MBD Line by Line mainframe datasets using Power Exchange Navigator.
  • Used Informatica Metadata Manager (IMM) for data lineage and impact analysis.
  • Used Teradata utilities like BTEQ, FastLoad, and MultiLoad with Teradata Parallel Transporter (TPT) connections for faster loads into the Teradata database.
  • Assisted in cross project reviews.
  • Trained new staff in Informatica Data Transformation concepts.
  • Created shell scripts for backup, FTP, triggering jobs, and housekeeping.
  • Involved in unit and system testing, constructing test cases and test data, liaised with the testing team in resolving issues and defects.
  • Produced technical documentation such as release notes, handover notes, and detailed design documents.
  • Coordinated with the primary vendor on releasing several projects to the testing, pre-production, and production environments.
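
For illustration, a minimal pmcmd wrapper of the kind referenced above; the integration service, domain, folder, and workflow names are placeholders.

```bash
#!/usr/bin/env bash
# Start a PowerCenter workflow and propagate its return code.
# Service/domain/folder/workflow names are illustrative.
pmcmd startworkflow \
  -sv INT_SVC -d DOM_DEV \
  -u "$INFA_USER" -p "$INFA_PASS" \
  -f FLD_MEDICARE -wait wf_load_medicare_claims
rc=$?
if [ "$rc" -ne 0 ]; then
  echo "workflow failed, pmcmd rc=$rc" >&2
  exit "$rc"
fi
```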
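
And a sketch of a CDC-style load driven through BTEQ, as referenced above: a MERGE upserting changed rows from a staging table into the ODS. The logon string and the insurer table layout are assumptions for illustration, not the actual project objects.

```bash
#!/usr/bin/env bash
# CDC-style upsert via BTEQ. Logon details and table/column names are
# illustrative placeholders.
bteq <<'EOF'
.LOGON tdpid/etl_user,etl_password;

MERGE INTO ods.insurer AS tgt
USING (SELECT insurer_id, insurer_name FROM stg.insurer) AS src
  ON (tgt.insurer_id = src.insurer_id)
WHEN MATCHED THEN UPDATE SET
  insurer_name = src.insurer_name,
  load_ts      = CURRENT_TIMESTAMP
WHEN NOT MATCHED THEN INSERT
  (insurer_id, insurer_name, load_ts)
VALUES
  (src.insurer_id, src.insurer_name, CURRENT_TIMESTAMP);

.QUIT;
EOF
```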

Environment: Informatica tools (PowerCenter 8.x/9.x, Developer/Analyst 8.x/9.x, DVO, B2B, DX, DT Studio), Jaspersoft, Oracle, Teradata, Quality Center, AIX, UNIX, Windows, FTP, MS Excel, Visio, flat files, Cognos, CONTROL-M.

Confidential

Technology Analyst

Responsibilities:

  • Worked in multiple projects from requirements gathering to deployment.
  • Worked with flat files, DB2, Oracle, XML, and COBOL source files.
  • Created cycles/steps/processes in the system metadata schema tables in Oracle, such as the batch control and audit tables, that formed the underlying framework for Westpac.
  • Developed and tested Informatica mappings, sessions, pmcmd scripts, workflows, and reusable objects in PowerCenter for all the projects in this assignment.
  • Executed ftp/sftp scripts to connect to remote host and facilitate flat file transfer.
  • Assisted in creating the detailed design document from the relevant Business Requirements Document.
  • Wrote UNIX shell scripts for adding header and trailer records, FTPing files, and more (see the sketch after this list).
  • Prepared unit test cases, verified functional specifications and reviewed deliverables.
  • Liaised with release team during deployment.
  • Created summary and detail reports using Business objects.
  • Used different WEBI functionalities such as breaks, sorts, filters, sections, and drill-down in creating complex reports.
  • Performed pre-implementation steps such as CMS creation on the mainframe for deployment into the production environment.
  • Supported the production environment by working with support staff and troubleshooting support tickets.
  • Created and ran jobs in Control-M Scheduler for scheduling ETL jobs.
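
A minimal sketch of the header/trailer wrapping mentioned above; the HDR/TRL record layouts (date, row count) are assumed for illustration, not the actual Westpac file formats.

```bash
#!/usr/bin/env bash
# Wrap a data file with header and trailer records before transfer.
# HDR/TRL layouts are illustrative assumptions.
infile=$1
outfile=${infile%.dat}_wrapped.dat

rows=$(wc -l < "$infile")
{
  echo "HDR|$(date +%Y%m%d)"
  cat "$infile"
  echo "TRL|$rows"
} > "$outfile"
```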

Environment: Informatica PowerCenter 8.x, Informatica Developer 8.x, Oracle, Siebel, Remedy, Quality Center, AIX, UNIX, Windows, FTP, DB2, SQL Server, COBOL, flat files, Business Objects XI, CONTROL-M

Confidential

ETL Engineer

Responsibilities:

  • Responsible for creating and testing Informatica objects like mappings, sessions, workflows, and mapplets for loading into the RDW (Reporting Data Warehouse).
  • Cleansed the source data, extracted and transformed data per business rules, and built reusable Informatica code for loading into the data warehouse.
  • Analysed session/workflow log files to resolve errors and used the Debugger for debugging.
  • Created parameter files in UNIX and assisted in creating scripts for ftp/sftp file transfers (see the sketch after this list).
  • Analysed existing Oracle PL/SQL packages and procedures to improve their run times.
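
For illustration, a sketch of the parameter-file generation and sftp transfer described above; the folder, workflow, session, and path names are hypothetical.

```bash
#!/usr/bin/env bash
# Generate a PowerCenter session parameter file, then push an extract via
# sftp in batch mode. All names and paths below are illustrative.
cat > /infa/params/wf_load_rdw.par <<EOF
[FOLDER_RDW.WF:wf_load_rdw.ST:s_m_load_claims]
\$\$RUN_DATE=$(date +%Y-%m-%d)
\$InputFile_claims=/data/in/claims.dat
EOF

sftp -b - etluser@remotehost <<'EOF'
put /data/out/claims_extract.dat /incoming/
bye
EOF
```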

Environment: Informatica PowerCenter 7.x, Toad, Oracle, flat files, MS Excel, SQL, PL/SQL, Cognos, Windows, UNIX, cron.

Confidential

ETL Developer

Responsibilities:

  • Was involved in requirements gathering sessions with business users.
  • Involved in preparing the mapping specification document.
  • Created Informatica mappings, sessions, and workflows for extracting data from flat file, Oracle, and COBOL sources and loading it into staging tables and then the ODS.
  • Created extract queries and ran SQL queries on source tables in the Oracle database to profile the data for anomalies, reporting findings to the team lead.
  • Used UNIX commands like grep, sed, and awk on the flat-file sources for testing purposes (see the sketch after this list).
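
A few one-liners of the kind described above; the file name, delimiter, and expected field count are illustrative placeholders.

```bash
#!/usr/bin/env bash
# Quick flat-file sanity checks; claims.dat and the 12-field layout are
# illustrative assumptions.
grep -c '^$' claims.dat                            # count blank records
awk -F'|' 'NF != 12 {print NR": "$0}' claims.dat   # rows with bad field count
sed -n '1,5p' claims.dat                           # peek at the first rows
```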

Environment: Informatica PowerCenter 7.x, Toad, Oracle, flat files, COBOL files, SQL, PL/SQL, Windows, UNIX, AIX.
