
Senior ETL Developer Resume


Washington, DC

Objective

  • 11 years and 9 months of IT experience, including 9+ years in Ab Initio and two years in Informatica, spanning multiple ETL data warehouse projects across development, production support and data analysis activities in the Healthcare and Banking domains.

SUMMARY

  • Experience in Analysis, Design, Development, Implementation and Maintenance of ETL data warehouse applications using Ab Initio and Unix.
  • Well versed with various Ab Initio Parallelism techniques and implemented Ab Initio Graphs using Data Parallelism, Component Parallelism and Multi File System (MFS) techniques.
  • Improved performance of Ab Initio graphs by using various Ab Initio performance techniques and components like ICFF, Lookups, Joins and Rollups.
  • Experienced in ETL processing of large-scale (550 million records) Retail and Mortgage customer banking data, including analysis and processing of demographic and PI data across platforms, integrated with Oracle and Teradata databases.
  • Experience in integrating various data sources such as flat files, XML files, DB tables and COBOL Copybooks, and worked with Ab Initio Plans for graph execution.
  • Experience in real-time data processing using Continuous Flows, publishing data to Ab Initio queues and JMS queues, where Ab Initio batch jobs interact with middleware MDM and ESB real-time servers.
  • Worked with multiple Ab Initio GDE versions (1.15/1.16/2.14/2.15/3.0/3.1.7/3.2.1/3.3) and implemented multiple projects.
  • Scheduled ETL batch jobs using Autosys and Control-M.
  • Worked on Teradata utilities such as MultiLoad, FastLoad, FastExport, TPump and BTEQ (see the BTEQ sketch after this list).
  • Knowledge in analyzing data using the Ab Initio Data Profiler (DQE) to estimate data patterns and to identify duplicates, frequency, consistency, accuracy, completeness and referential integrity of data.
  • Well versed in Teradata architecture and implemented techniques such as temporary tables, temporal tables, indexes and the Partitioned Primary Index (PPI) for database-level performance improvement.
  • Experience in database development using stored procedures, functions, cursors, triggers, SQL and PL/SQL in Oracle 9i/10g/11g, Teradata 13/14, MS SQL Server 2005/2008 and IBM DB2.
  • Worked as lead on vendor and product migrations, ensuring smooth transition of new projects and on-time delivery.
  • 6+ years of onsite (US) experience as Team Lead, with development, maintenance and production support experience in ETL applications.
  • Worked in Informatica and developed complex ETL mappings using transformations such as Source Qualifier, Expression, Joiner, Update Strategy, Connected and Unconnected Lookup, Rank, Router, Filter, Aggregator and Sequence Generator.
  • Good knowledge of Data Warehouse concepts and principles (Kimball/Inmon) - Star Schema, Snowflake, SCD Types, Surrogate Keys, Normalization/De-normalization.
  • Knowledge of ITIL processes; key contributor in maintaining and documenting standard production management processes such as Incident Management, Problem Management, Change Management and Continuity Management.
  • Experience working in Agile (Scrum) and Waterfall methodologies.
  • Experience in L2/L3 support for enhancements, troubleshooting and production issue analysis within time-bound SLAs using Ab Initio, Oracle, Teradata, PL/SQL and Unix.
  • Integrated Ab Initio batch jobs with the Data Quality interface (DQIP system).
  • Working knowledge of Pitney Bowes Spectrum, where Ab Initio batch jobs interact with the third-party Pitney Bowes software for name and address validation.
  • Knowledge of the Hadoop ecosystem.
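
The bullets above reference Teradata utility scripting and PPI-based tuning; a minimal sketch of how such a step might look from a Unix shell wrapper is shown below. The host, logon, environment variable, database and table names (tdprod, etl_user, TD_PASS, edw_stg.txn_daily) are hypothetical placeholders, not actual project objects.

#!/bin/ksh
# Illustrative sketch only: a BTEQ session that creates a staging table with a
# Partitioned Primary Index (PPI) and reports daily row counts.
# Assumes the Teradata BTEQ client is installed and TD_PASS is exported.

bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASS};

CREATE MULTISET TABLE edw_stg.txn_daily
(
    txn_id   BIGINT        NOT NULL,
    acct_id  INTEGER       NOT NULL,
    txn_dt   DATE          NOT NULL,
    txn_amt  DECIMAL(18,2)
)
PRIMARY INDEX (txn_id)
PARTITION BY RANGE_N (txn_dt BETWEEN DATE '2015-01-01'
                      AND DATE '2016-12-31'
                      EACH INTERVAL '1' DAY);

SELECT txn_dt, COUNT(*) AS row_cnt
FROM   edw_stg.txn_daily
GROUP  BY txn_dt
ORDER  BY txn_dt;

.LOGOFF;
.QUIT;
EOF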

TECHNICAL SKILLS

ETL Tool (Ab Initio): Ab Initio (GDE 1.15/1.16/2.14/2.1.5/3.0.5/3.1.7/3.2.1, Co>Operating System 2.12.2/2.1.4/2.15/3.1/3.3), XFRs, EME, Messaging and Queues, Continuous Flows, ICFF, Ab Initio Plans.

ETL Tool (Informatica): Informatica 7.x/8.x PowerCenter, Workflow Manager, Workflow Monitor, Designer, Repository Manager.

Big Data: Hadoop, HDFS, MapReduce, HIVE, PIG, Sqoop, Oozie

Version Control Tools: EME 2.14, VSS, RCS

Database: IBM DB2, Oracle 9i/10g/11g, Teradata 12/13/14, MS SQL Server 2005/2008

Scheduling Tools: Autosys R11.3, Control-M, Maestro

Methodologies: Waterfall, Agile

Language: ASP.Net, C#.Net

Reporting: SSRS

Operating Systems: Windows XP/2000/NT and Unix (Sun Solaris 5.10, AIX-6.1)

PROFESSIONAL EXPERIENCE

Confidential, Maryland, Washington, DC

Senior ETL Developer

Responsibilities:

  • Involved in analysis, design, development and end-to-end implementation of this project.
  • Implemented ICFF in multiple Ab Initio processes.
  • Used Ab Initio Plans for graph execution.
  • Implemented the Multi File System (MFS) for faster processing of large volumes of records.
  • Prepared the project setup and generic wrapper scripts to control the execution of psets (see the wrapper sketch after this list).
  • Understanding the specifications for Data Warehouse ETL Processes.
  • Involved in Performance tuning of Ab Initio graphs using ICFF, Multifile, compressed flow, partitioning on keys.
  • Handling complex ETL code development, review and production support issue fix.
  • Worked with stored procedures and PL/SQL queries to extract and update records in Oracle DB using Ab Initio components.
  • Involved in various EME repository operations such as code check-in, code check-out, tag creation and preparing migration packages.
  • Created ETL design documents, maintained the design patterns, and created ETL interfaces.
  • Involved in creating multiple new sandbox environments and setting them up for performance and UAT testing.
  • Responsible for project deliverables and for technically mentoring the whole team; handled planning, strategizing and risk analysis of every project module prior to each release.
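
A minimal sketch of the kind of generic pset wrapper mentioned above, assuming the Co>Operating System's air sandbox run command is on the PATH; the sandbox path, pset name and log directory are hypothetical placeholders.

#!/bin/ksh
# Illustrative sketch only: generic wrapper to run an Ab Initio pset and
# capture its exit status. Sandbox path and log location are placeholders.

SANDBOX=/data/apps/etl/sandbox            # hypothetical sandbox root
PSET=$1                                   # pset name passed by the scheduler
RUN_DT=$(date +%Y%m%d_%H%M%S)
LOG=/data/apps/etl/logs/${PSET}_${RUN_DT}.log

if [ -z "$PSET" ]; then
    echo "Usage: $(basename $0) <pset-name>" >&2
    exit 1
fi

echo "Starting ${PSET} at ${RUN_DT}" > "$LOG"
air sandbox run "${SANDBOX}/pset/${PSET}.pset" >> "$LOG" 2>&1
RC=$?

echo "Finished ${PSET} with return code ${RC}" >> "$LOG"
exit $RC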

Environment: Ab Initio (GDE 3.3, Co>operating system 3.1.4), Control-M, Ab Initio Plans, Query-IT, DB2, Windows XP and Unix AIX-6.1.

Confidential, Deerfield, IL

ETL Developer

Responsibilities:

  • Involved in analysis, design, development and end-to-end implementation of this project.
  • Understood and prepared high-level and low-level design documents for business requirements and enhancements to the existing PARS process.
  • Involved in data profiling; profiled several data sets (serial files, multifiles, database tables) and categorized them into different projects and directories using the Ab Initio Data Profiler.
  • Implemented ICFF in multiple Ab Initio processes.
  • Implemented the Multi File System (MFS) for faster processing of large volumes of records.
  • Understanding the specifications for Data Warehouse ETL Processes.
  • Involved in Performance tuning of Ab Initio graphs using ICFF, Multifile, compressed flow, partitioning on keys.
  • Design, coding and implementation experience with Ab Initio ACE/BRE (DQE).
  • Developed Data Quality checks with Data Quality Environment (DQE).
  • Processed different source feeds such as COBOL Copybooks, flat files and XML files.
  • Worked with stored procedures and PL/SQL queries to extract and update records in Oracle DB using Ab Initio components (see the PL/SQL sketch after this list).
  • Involved in various RCS data store operations such as code check-in, code check-out and creating project parameters according to the environment settings for this application.
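
A minimal sketch of how such a PL/SQL update step might be invoked from a shell step alongside the Ab Initio job, assuming Oracle's sqlplus client is available; the connect variables, batch id, table and column names are hypothetical placeholders.

#!/bin/ksh
# Illustrative sketch only: run an anonymous PL/SQL block from a shell step to
# flag processed records. Assumes ORA_USER, ORA_PASS, ORA_SID and BATCH_ID
# are exported by the calling job; all object names are placeholders.

sqlplus -s "${ORA_USER}/${ORA_PASS}@${ORA_SID}" <<EOF
WHENEVER SQLERROR EXIT FAILURE
DECLARE
    v_batch_id NUMBER := ${BATCH_ID};     -- batch id supplied by the ETL job
BEGIN
    UPDATE stg_claims                      -- hypothetical staging table
       SET process_flag = 'Y',
           processed_dt = SYSDATE
     WHERE batch_id = v_batch_id;
    COMMIT;
END;
/
EXIT
EOF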

Environment: Ab Initio (GDE 3.1.5/3.2, Co>Operating System 3.1.4), Control-M, Ab Initio Plans, Oracle 11g, Windows XP and Unix AIX-6.1.

Confidential, Irving, TX

ETL Lead

Responsibilities:

  • Involved in analysis, design, development and end-to-end implementation of this project.
  • Created a common mechanism for early identification of customers so they receive appropriate treatment across all channels.
  • Created traceability of customer behavior across operations, analytics and marketing for a full life-cycle view.
  • Understood and prepared high-level and low-level design documents for business requirements and enhancements to the existing process.
  • Built a KYC system to enable the bank to know the risk percentage of a customer.
  • Implemented Continuous Flows and ICFF in multiple Ab Initio processes.
  • Implemented the Multi File System (MFS) for faster processing of large volumes of records.
  • Processed different source feeds such as COBOL Copybooks, flat files and XML files.
  • Maintained ODS tracking and error tracking systems to keep track of successful and failed publishes to the real-time system.
  • Created import feeds in the Ab Initio Metadata Hub and profiled tables/files through the Data Profiler/DQE.
  • Worked with stored procedures and PL/SQL queries to extract and update records in Oracle DB using Ab Initio components.
  • Implemented dynamic throttling to control record updates/inserts to the Oracle database and the publishing of records to the JMS publish queue.
  • Created Autosys JIL scripts for scheduling batch processes (see the JIL sketch after this list).
  • Worked with Teradata for loading data into the data warehouse, using loading utilities such as MultiLoad and FastLoad.
  • Developed Data Quality checks using Express>IT tool with Data Quality Environment (DQE).
  • Involved in various EME data store operations such as creating sandboxes, code check-in, code check-out and creating project parameters according to the environment settings for this application.
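
A minimal sketch of the kind of Autosys JIL definition referenced above, loaded through the jil command; the job name, machine, owner, command path and predecessor job are hypothetical placeholders.

#!/bin/ksh
# Illustrative sketch only: define an Autosys command job by piping JIL into
# the jil utility. All names and paths below are placeholders.

jil <<'EOF'
insert_job: KYC_DAILY_LOAD
job_type: CMD
command: /data/apps/etl/bin/run_pset.ksh kyc_daily_load
machine: etl_prod_host
owner: etluser
date_conditions: 1
days_of_week: mo,tu,we,th,fr
start_times: "02:00"
condition: s(KYC_FILE_WATCH)
std_out_file: /data/apps/etl/logs/KYC_DAILY_LOAD.out
std_err_file: /data/apps/etl/logs/KYC_DAILY_LOAD.err
alarm_if_fail: 1
EOF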

Environment: Ab Initio (GDE 3.1.4/3.2, Co>Operating System 3.1.4), Autosys, Oracle 11g, Teradata 14, Windows XP and Unix AIX-6.1.

Confidential, Irving, TX

ETL Developer

Responsibilities:

  • Analyzed designs and prepared graphs to perform specific tasks that form part of the project.
  • Created graphs using GDE components such as Reformat, Sort, Dedup Sorted and Filter by Expression; partition components such as Partition by Key, Partition by Round-robin and Partition by Expression; departition components such as Gather, Merge and Concatenate; and Replicate, Join, Join with DB and Rollup components.
  • Involved in performance tuning of Ab Initio graphs using ICFF, Multifile, compressed flow, partitioning on keys.
  • Integrated the new Thank You process with the existing ETL process and gathered data for downstream processing.
  • Worked with Teradata for loading data into the data warehouse; knowledge of Teradata loading utilities such as MultiLoad, FastLoad and BTEQ (see the FastLoad sketch after this list).
  • Improved performance of Ab Initio graphs by using various Ab Initio performance techniques like using ICFF, Lookups, Joins and Rollups.
  • Provided a holistic view of the customer to better manage and mitigate risks.
  • Built a KYC system to enable the bank to know the risk percentage of a customer.
  • Analyzed and fixed defects during testing cycles; analyzed production issues and TRs, and either fixed the code or created ad-hoc graphs to meet the requirement.
  • Performed batch processing for various testing cycles.
  • Understanding the specifications for Data Warehouse ETL Processes.
  • Involved in various EME data store operations such as creating sandboxes, code check-in, code check-out and creating project parameters according to the environment settings for this application.
  • Worked on the Ab Initio Data Profiler to load metadata.
  • Handling complex code review, code development and production support.
  • Involved in code reviews, Unit testing and Integration testing, supported System testing, UAT testing, and preproduction testing.
  • Experience in analyzing production issues and providing temporary fixes until the permanent fix is ready to move to the production environment.
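
A minimal sketch of a FastLoad step of the kind referenced above, assuming the Teradata fastload client; the logon, environment variable, staging table and input file are hypothetical placeholders (the target table is assumed empty and the error tables not yet present, as FastLoad requires).

#!/bin/ksh
# Illustrative sketch only: bulk-load a pipe-delimited extract into an empty
# staging table with FastLoad. Logon, table and file names are placeholders;
# assumes TD_PASS is exported by the calling job.

fastload <<EOF
LOGON tdprod/etl_user,${TD_PASS};

BEGIN LOADING edw_stg.cust_stg
      ERRORFILES edw_stg.cust_stg_err1, edw_stg.cust_stg_err2;

SET RECORD VARTEXT "|";

DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(100)),
       open_dt   (VARCHAR(10))
FILE = /data/feeds/in/customer_extract.dat;

INSERT INTO edw_stg.cust_stg (cust_id, cust_name, open_dt)
VALUES (:cust_id, :cust_name, :open_dt);

END LOADING;
LOGOFF;
EOF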

Environment: Ab Initio (GDE 3.0.5/3.1.4, Co>Operating System 2.1.8/3.1.4), Autosys, Oracle 11g, Teradata 12/13, Windows XP and Unix AIX-6.1

Confidential, Irving, TX

ETL Developer/L3 Support

Responsibilities:

  • Analyzed designs and prepared graphs to perform specific tasks that form part of the project.
  • Improved sales growth efficiency by understanding the customer's product portfolio and engagement levels.
  • Created graphs using GDE components such as Reformat, Sort, Dedup Sorted and Filter by Expression; partition components such as Partition by Key, Partition by Round-robin and Partition by Expression; departition components such as Gather, Merge and Concatenate; and Replicate, Join, Join with DB, Rollup and continuous components.
  • Involved in performance tuning of Ab Initio graphs using ICFF, Multifile, compressed flow, partitioning on keys.
  • Prepared UNIX shell scripts to run the graphs and perform calculations for reporting purposes (see the sketch after this list).
  • Analyzed and fixed defects during testing cycles; analyzed production issues and TRs, and either fixed the code or created ad-hoc graphs to meet the requirement.
  • Performed batch processing for various testing cycles.
  • Understanding the specifications for Data Warehouse ETL Processes.
  • Involved in various EME data store operations such as creating sandboxes, code check-in, code check-out and creating project parameters according to the environment settings for this application.
  • Handling complex code review, code development and production support.
  • Worked with stored procedures and PL/SQL queries to extract and update records in Oracle DB using Ab Initio components.
  • Involved in code reviews, Unit testing and Integration testing.
  • Supported System testing, UAT testing, and preproduction testing.
  • Handled critical data issues in System Testing and UAT environments during recovery.
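
A minimal sketch of the kind of run-and-report script described above; the deployed graph script, output feed, reject convention and mail recipient are hypothetical placeholders.

#!/bin/ksh
# Illustrative sketch only: run a deployed graph script, then derive simple
# record counts for a daily report. Paths and addresses are placeholders.

RUN_DIR=/data/apps/etl/run
OUT_FILE=/data/feeds/out/sales_summary.dat
REPORT=/data/apps/etl/reports/sales_summary_$(date +%Y%m%d).rpt

# Run the deployed graph (.ksh generated from the GDE deployment)
"${RUN_DIR}/load_sales_summary.ksh"
if [ $? -ne 0 ]; then
    echo "Graph failed; skipping report" >&2
    exit 1
fi

# Simple reporting calculations from the output feed
TOTAL=$(wc -l < "$OUT_FILE")
REJECTS=$(grep -c '^REJ|' "$OUT_FILE")

{
    echo "Run date      : $(date)"
    echo "Total records : ${TOTAL}"
    echo "Rejected      : ${REJECTS}"
} > "$REPORT"

mailx -s "Daily sales summary load report" etl-support@example.com < "$REPORT"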

Environment: Ab Initio (GDE 1.15/3.0.5/3.1.4, Co>Operating System 2.11.8/3.1.4), Autosys, Oracle 10g, OBIEE, Windows XP and Unix AIX-6.1

Confidential, Irving, TX

ETL Developer

Responsibilities:

  • Worked with Business Analyst and Data team to translate requirements into system solutions and developed implementation plan and schedule.
  • Responsible for designing, requirements gathering, technical specifications, effort estimation, code reviews and development of the application.
  • Defined and documented the technical specification designs of all programming applications.
  • Used various program and conditional components, transform functions, subgraphs, and components such as datasets, Filter, Join, Sort, Partition and Lookups.
  • Responsible for developing data extracts and transformation programs.

    Extensively used Unix shell scripting for general file manipulation (see the sketch after this list).

  • Used stored procedures and PL/SQL queries for database operations using Ab Initio components.
  • Involved in various EME data store operations such as creating sandboxes, code check-in, code check-out and creating project parameters according to the environment settings for this application.
  • Worked in Agile Development Practices (SCRUM) with 2-week iterations.
  • Responsible for analyzing production issues, participating in installations/validations for production upgrades and validating the functional behavior of new changes in production.
  • Design, Development, and Enhancement of new functionalities.
  • Logged the different issues related to the development phase.
  • Responsible for monitoring different existing processes and identifying potential improvements/enhancements.
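
A minimal sketch of the general file-manipulation scripting referenced above (trailer validation and archiving); the directory names and the TRL|<count> trailer convention are hypothetical placeholders.

#!/bin/ksh
# Illustrative sketch only: validate an inbound feed's record count against
# its trailer, then archive it. Directories and trailer layout are placeholders.

IN_DIR=/data/feeds/in
ARCH_DIR=/data/feeds/archive

for FEED in "${IN_DIR}"/*.dat; do
    [ -f "$FEED" ] || continue

    # Assume the last line is a trailer of the form TRL|<record-count>
    EXPECTED=$(tail -1 "$FEED" | awk -F'|' '{print $2}')
    ACTUAL=$(( $(wc -l < "$FEED") - 1 ))   # exclude the trailer itself

    if [ "$EXPECTED" -ne "$ACTUAL" ]; then
        echo "Count mismatch in $(basename "$FEED"): trailer=${EXPECTED} actual=${ACTUAL}" >&2
        continue
    fi

    # Compress and move the validated file to the archive area
    gzip -f "$FEED"
    mv "${FEED}.gz" "${ARCH_DIR}/"
done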

Environment: Ab Initio (GDE 1.13/1.14/2.14, Co>Operating System 2.13/2.14), Maestro, OBIEE, Oracle 9i, Windows XP and Unix AIX-6.1

Confidential

ETL Developer

Responsibilities:

  • Involved in analyzing, designing and developing Informatica mappings to perform specific tasks that form part of the project.
  • Developed Logical and physical database designs for the transaction system.
  • Performance tuning on sources, targets, mappings and SQL queries in the transformations.
  • Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Expression, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Router, Filter, Aggregator and Sequence Generator transformations.
  • Created reusable transformations and mapplets based on the business rules to ease the development.
  • Worked with Debugger for identifying bottlenecks and to debug the complex mappings and fix them.
  • Developed mappings and implemented SCD Type 2 slowly changing dimensions.
  • Developed and implemented Unix shell scripts for the start and stop procedures of the sessions.
  • Wrote UNIX shell scripts to automate SFTP file transfers, file archiving and pre-validation (see the sketch after this list).
  • Worked on Agile Development Practices (SCRUM) with 2-week iterations.
  • Used various Informatica Error handling technique to debug failed session.
  • Handled critical data issues in System Testing, UAT environments in the process of recovery.
  • Supported System Integration testing, UAT testing and production go-live.
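
A minimal sketch of the SFTP and archiving automation mentioned above, using a non-interactive sftp batch file; the host, user, key, directories and file pattern are hypothetical placeholders.

#!/bin/ksh
# Illustrative sketch only: pull vendor files over SFTP with a batch file,
# then archive landed files older than 7 days. All names are placeholders.

SFTP_HOST=vendor.example.com
SFTP_USER=etl_xfer
KEY=~/.ssh/etl_xfer_key
LANDING=/data/feeds/in
ARCH_DIR=/data/feeds/archive
BATCH=/tmp/sftp_batch.$$

# Build the sftp batch file (non-interactive command list)
cat > "$BATCH" <<EOF
cd /outbound
lcd ${LANDING}
get claims_*.dat
bye
EOF

sftp -b "$BATCH" -i "$KEY" "${SFTP_USER}@${SFTP_HOST}"
RC=$?
rm -f "$BATCH"
[ $RC -ne 0 ] && { echo "SFTP transfer failed" >&2; exit $RC; }

# Compress and move older landed files to the archive area
find "$LANDING" -name 'claims_*.dat' -mtime +7 -exec gzip -f {} \;
find "$LANDING" -name 'claims_*.dat.gz' -exec mv {} "$ARCH_DIR"/ \;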

Environment: Informatica PowerCenter 7.1/8.1, DataStage, Oracle 9i, MS SQL Server 2005, SQL, PL/SQL, Unix shell scripts, Windows XP.
