
Ab Initio/ETL Developer Resume


Jacksonville, FL

PROFESSIONAL SUMMARY:

  • 7+ years of technical and functional experience in data warehouse implementations and ETL methodology using Ab Initio 3.0.4/3.1.5/3.3.1, Co>Operating System 2.8/2.10/3.1.7, Teradata 12/13.10/14, Oracle 10g/9i/8i and MS SQL Server 2008/2005/2000.
  • Strong hands-on experience using Teradata utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ and Queryman).
  • Hands-on experience handling data from various source systems such as flat files, XML sources, Oracle, MS SQL Server, IBM DB2, Teradata and Excel files.
  • Excellent communication skills; experienced in client interaction while providing technical support and knowledge transfer.
  • Highly experienced with the Ab Initio ETL tool using the GDE designer, with very good working experience across all the Ab Initio components.
  • Well versed with the various Ab Initio Transform, Partition, Departition, Dataset and Database components, including Sort, Validate and Compress.
  • Very strong skills in project management, requirement analysis, business analysis, database modeling, design and analysis, issue coordination and development with Teradata/Oracle/SQL Server based relational databases.
  • Expert in writing UNIX shell scripts, including Korn and Bourne shell scripts.
  • Very good experience with SQL Server and Oracle databases.
  • Experience integrating various data sources with multiple relational databases such as Oracle, SQL Server and MS Access, and integrating data from flat files.
  • Strong understanding of data modeling (relational, dimensional, star and snowflake schemas) in a data warehouse environment.
  • Excellent analytical, problem-solving and business interaction skills.
  • Experience in all phases of the SDLC for data warehousing ETL projects, including Waterfall and Agile/Scrum methodologies.
  • Created sandboxes and edited sandbox parameters according to the repository; extensive exposure to EME.
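The Teradata utility experience above (BTEQ in particular) typically runs from a shell script that generates and submits a BTEQ session. The following is a minimal sketch of that pattern; the host, credentials and table names are invented placeholders, not details from this resume:

```shell
#!/bin/sh
# Generate a BTEQ script for a hypothetical daily load audit.
# tdprod, etl_user and DWH.LOAD_AUDIT are illustrative placeholders.
BTEQ_SCRIPT=/tmp/load_audit.bteq
cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,password;
SELECT load_dt, COUNT(*) AS row_cnt
FROM DWH.LOAD_AUDIT
GROUP BY load_dt
ORDER BY load_dt;
.LOGOFF;
.QUIT;
EOF
# On a host with the Teradata client installed this would be submitted as:
#   bteq < "$BTEQ_SCRIPT" > /tmp/load_audit.out 2>&1
echo "wrote $BTEQ_SCRIPT"
```

In practice such scripts are parameterized (logon string, table name, run date) so one wrapper serves many audits.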

TECHNICAL SKILLS:

ETL: Ab Initio GDE 3.0.4/3.1.5/3.1.7, Co>Operating System 2.14/2.15

Databases: Oracle 10g/9i/8i, Teradata V2R6, SQL Server 2005, DB2, MS Access

Languages/Scripting: C, C++, Java, SQL, PL/SQL, HTML, JavaScript, XML, UNIX Shell Scripting

Scheduling: Autosys, Control-M v9

Operating Systems: Windows NT/98/2000/XP, UNIX, Linux

Reporting Tools: Business Objects, Cognos, SSRS

PROFESSIONAL EXPERIENCE:

Confidential, Jacksonville, FL

Ab Initio/ETL Developer

Responsibilities:

  • Gathered business requirements and mapping documents.
  • Involved in understanding the requirements of the end users/business analysts and developed strategies for ETL processes.
  • Developed and supported the extract, transform and load (ETL) process for a data warehouse from its OLTP systems using Ab Initio, and provided technical support and hands-on mentoring in the use of Ab Initio.
  • Responsible for all pre-ETL tasks upon which the data warehouse depends, including managing and collecting the various existing data sources.
  • Created change logs and ETL mappings as part of the pre-ETL tasks.
  • Involved in developing UNIX Korn shell wrappers to run various Ab Initio scripts.
  • Developed a number of Ab Initio graphs based on business requirements using components such as Partition by Key, Partition by Round-robin, Reformat, Rollup, Join, Scan, Normalize, Gather and Merge.
  • Improved the performance of Ab Initio graphs using techniques such as lookups instead of joins.
  • Implemented lookups, lookup local, in-memory joins and rollups to speed up various Ab Initio graphs.
  • Wrote design documentation for the developed graphs.
  • Used the Run Program and Run SQL components to run UNIX and SQL commands in Ab Initio; also used the Filter by Expression, Partition by Expression, Replicate, Partition by Key and Sort components.
  • Worked with departition components such as Gather and Interleave to departition and repartition data from multifiles.
  • Performed transformations of source data with transform components such as Join, Match Sorted, Dedup Sorted, Denormalize, Reformat and Filter by Expression.
  • Created summary tables using Rollup, Scan and Aggregate.
  • Gathered requirements working with the DMA and set business and technical metadata in Metadata Hub.
  • Working knowledge of various extractors for importing metadata: database, business, technical and logical metadata, SSIS, SAS, Cognos, etc.
  • Performed Metadata Hub imports and built lineage, worked on Metadata Hub structure mappings, and handled Metadata Hub admin activities such as adding new users and change-set approval.
  • Automated data quality rules using the Express>It tool according to the corresponding requirements.
  • Extensive use of Ab Initio Conduct>It for building plans.
  • Managed the resource server for Conduct>It and debugged any related issues.
  • Worked with the Ab Initio support and ETL infrastructure teams on Metadata Hub upgrades.
  • Involved in unit testing and UAT after coding the Ab Initio objects.
  • Created UNIX shell scripts to automate and schedule the jobs.
  • Converted SAS code into Ab Initio code per business requirements.
  • Exposure to big data using Hive, Sqoop and HDFS.
  • Provided production support, running the daily, weekly and monthly jobs from Control-M.
  • Created and executed low-level Control-M cyclic jobs for the developed Ab Initio objects.
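The Korn shell wrappers and Control-M cyclic jobs mentioned above usually follow one pattern: a lock file to prevent overlapping runs, a timestamped log, and an exit code Control-M can act on. A minimal sketch, with GRAPH_CMD and all paths as invented placeholders:

```shell
#!/bin/sh
# Hedged sketch of a wrapper for a deployed Ab Initio graph; nothing here
# is taken from the actual project.
GRAPH_NAME=load_customer_dim
LOG_DIR=/tmp/etl_logs
LOCK_FILE=/tmp/${GRAPH_NAME}.lock
GRAPH_CMD=true   # stand-in for e.g. "$AI_RUN/${GRAPH_NAME}.ksh"

mkdir -p "$LOG_DIR"
LOG_FILE="$LOG_DIR/${GRAPH_NAME}.$(date +%Y%m%d_%H%M%S).log"

# The lock file keeps a cyclic Control-M job from starting a second copy
# while the previous run is still active.
if [ -f "$LOCK_FILE" ]; then
    echo "$(date): $GRAPH_NAME already running, aborting" >> "$LOG_FILE"
    rc=1
else
    touch "$LOCK_FILE"
    echo "$(date): starting $GRAPH_NAME" >> "$LOG_FILE"
    $GRAPH_CMD >> "$LOG_FILE" 2>&1
    rc=$?
    echo "$(date): finished $GRAPH_NAME rc=$rc" >> "$LOG_FILE"
    rm -f "$LOCK_FILE"
fi
# A real wrapper would end with: exit "$rc"  -- Control-M treats a non-zero
# exit code as a job failure and can alert or retry.
echo "wrapper rc=$rc"
```

Keeping the graph invocation behind a single variable is what makes the wrapper generic enough to reuse across jobs.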

Environment: Ab Initio GDE 3.1.5/3.1.7, Metadata Hub, Express>It, Conduct>It, Toad Data Point, Oracle, Teradata, UNIX, DB2, SAS, SQL Server 2008, Control-M.

Confidential, New Jersey

Ab Initio/Teradata Developer

Responsibilities:

  • Performed analysis and design, and prepared the functional and technical design documents and code specifications.
  • Developed and supported the extract, transform and load (ETL) process for a data warehouse from its OLTP systems using Ab Initio, and provided technical support and hands-on mentoring in the use of Ab Initio.
  • Responsible for all pre-ETL tasks upon which the data warehouse depends, including managing and collecting the various existing data sources.
  • Involved in developing UNIX Korn shell wrappers to run various Ab Initio scripts.
  • Developed a number of Ab Initio graphs based on business requirements using components such as Partition by Key, Partition by Round-robin, Reformat, Rollup, Join, Scan, Normalize, Gather and Merge.
  • Improved the performance of Ab Initio graphs using techniques such as lookups instead of joins.
  • Implemented lookups, lookup local, in-memory joins and rollups to speed up various Ab Initio graphs.
  • Wrote design documentation for the developed graphs.
  • Used the Run Program and Run SQL components to run UNIX and SQL commands in Ab Initio; also used the Filter by Expression, Partition by Expression, Replicate, Partition by Key and Sort components.
  • Worked with departition components such as Gather and Interleave to departition and repartition data from multifiles.
  • Performed transformations of source data with transform components such as Join, Match Sorted, Dedup Sorted, Denormalize, Reformat and Filter by Expression.
  • Created summary tables using Rollup, Scan and Aggregate.
  • Created UNIX shell scripts to automate and schedule the jobs.
  • Created migration scripts and test scripts for testing the applications, and created and supported Cognos reports.
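For the Teradata side of this role, bulk staging loads of the kind described above are commonly driven by a generated FastLoad control script. A hedged sketch of that generation step; the host, credentials, tables and file path are illustrative placeholders, and the exact FastLoad syntax should be checked against the Teradata documentation for the version in use:

```shell
#!/bin/sh
# Generate a hypothetical FastLoad control script for a staging load.
FL_SCRIPT=/tmp/stg_customer.fload
cat > "$FL_SCRIPT" <<'EOF'
LOGON tdprod/etl_user,password;
BEGIN LOADING STG.CUSTOMER_STG
    ERRORFILES STG.CUST_ERR1, STG.CUST_ERR2;
SET RECORD VARTEXT ",";
DEFINE cust_id (VARCHAR(10)), cust_nm (VARCHAR(100))
    FILE = /data/landing/customer.csv;
INSERT INTO STG.CUSTOMER_STG VALUES (:cust_id, :cust_nm);
END LOADING;
LOGOFF;
EOF
# With the Teradata tools installed this would run as:
#   fastload < "$FL_SCRIPT"
echo "wrote $FL_SCRIPT"
```

FastLoad only loads empty tables, which is why it pairs naturally with truncate-and-reload staging tables; MultiLoad or TPump would be used for incremental targets.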

Environment: Ab Initio GDE 3.0.4/3.1.5, Oracle, Teradata, UNIX, Cognos, Control-M.

Confidential, San Diego, CA

Ab Initio Developer

Responsibilities:

  • Gathered business requirements and mapping documents.
  • Involved in understanding the requirements of the end users/business analysts and developed strategies for ETL processes.
  • Responsible for the detailed design and documentation. Provided technical solutions for the Process requests raised by Data team to fix the issues in the existing system.
  • Designed, developed and Unit tested Ab Initio graphs using GDE for Extraction, Transformation and Loading of data from source to target.
  • Created and modified various graphs based on the business rule enhancements.
  • Involved in writing test cases to validate the code changes.
  • Extensively used Dataset and Database components such as Input File, Input Table and Output Table, transform components such as Join, Rollup, Scan, Filter by Expression and Reformat, and other components such as Merge, Lookup and Sort.
  • Used partition components such as Partition by Expression and Partition by Key to run the middle-layer processing in parallel.
  • Extensively used various built-in transform functions such as string_substring, string_lpad, string_index, lookup functions, date functions and error functions.
  • Worked on improving performance of Ab Initio graphs by using various Ab Initio performance techniques like using lookups, in memory joins and rollups to speed up various Ab Initio graphs. Designed and developed parameterized generic graphs.
  • Closely monitored the Autosys batch jobs in ETL batch run during System, Integration and Acceptance test runs.
  • Worked closely with migration team to migrate the ETL code changes from development environment to System, Integration and Production environments.
  • Extensively worked in the UNIX environment using shell scripts. Created test cases, performed unit testing for the Ab Initio graphs and documented the unit testing. Logged and resolved defects in the rollout phase. Responsible for supporting the functional team and troubleshooting any production issues.
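The parameterized generic graphs mentioned above rely on the same idea in their shell wrappers: job-specific values live in a config file that a single generic script sources, rather than being hard-coded. A minimal sketch, with all file and table names invented for illustration:

```shell
#!/bin/sh
# Write a per-feed config file; a generic wrapper sources it at run time,
# so one script serves many feeds. All names here are placeholders.
CFG=/tmp/customer_extract.cfg
cat > "$CFG" <<'EOF'
SRC_DIR=/data/landing/customer
TGT_TABLE=DWH.CUSTOMER_STG
MAX_CORE=4
EOF

# The generic wrapper just sources the config and runs the common logic.
. "$CFG"
echo "extracting from $SRC_DIR into $TGT_TABLE with $MAX_CORE-way parallelism"
```

The same pattern maps directly onto Ab Initio sandbox parameters: the graph stays generic and the parameter set carries the feed-specific values.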

Environment: Ab Initio GDE 3.0.4 Co-Operating System 2.15, Oracle 10g, Teradata, UNIX, Autosys, Business Objects.

Confidential

ETL/Ab-initio Developer

Responsibilities:

  • Developed graphs in the GDE using Partition by Round-robin, Partition by Key and Sort, Rollup, Sort, Scan, Reformat, Join, Merge, Gather and Partition by Expression components.
  • Extensively used Ab Initio parallelism (pipeline, component and data) and best practices.
  • Used a generic graph to handle validation of incoming files.
  • Involved in the migration of the code in QA environment with QA team and completed rigorous unit testing of each graph.
  • Involved in conducting reviews for the Ab Initio codes after development.
  • Worked in ClearQuest to approve tickets, check statuses and fix business-related issues.
  • Worked on checking out ETL code: checking the Ab Initio setup, setting up configuration profiles, checking disk space and verifying parallelism.
  • Developed graphs separating the Extraction, Transformation and Load process to improve the efficiency of the system.
  • Involved in designing Load graphs using Ab Initio and tuned the performance of the queries to make the load process run faster.
  • Implemented Lookups, lookup local, In-Memory Joins to speed up various Ab Initio Graphs.
  • Extensively used the Enterprise Meta Environment (EME) for version control.
  • Involved in unit testing and assisted in system testing, integrated testing, and user acceptance testing.
  • Followed best design principles, efficiency guidelines and naming standards in designing the graphs.
  • Created High level Design, Detail Design, Unit Test Case documents following prescribed GIW standards.
  • Reduced the amount of data moving through flows, which had a significant impact on graph performance.
  • Involved in implementing techniques in improving recoverability and limiting resource usage.
  • Referenced files with parameters instead of hard-coded paths.
  • Analyzed the source and target record formats and made necessary changes.
  • Developed ETL procedure strategies, and worked closely with business and data validation groups to provide assistance and guidance for system analysis, data integrity analysis, and data validation activities.
  • Used Test Director to keep track of bugs.
  • Conducted Peer reviews inside the team and conducted the CODE reviews within the Organization.
  • Identified and debugged the errors before deploying.
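The "lookups instead of joins" tuning used in this project can be illustrated outside Ab Initio with a small awk analogy: the small reference file is held in memory and probed per record, so the large input never has to be sorted for a merge join. The file names and data are invented for illustration:

```shell
#!/bin/sh
# Build a tiny reference file and a larger input, then enrich the input
# via an in-memory lookup (awk associative array) -- the same idea as an
# Ab Initio Lookup File replacing a sorted Join.
printf '1,US\n2,CA\n' > /tmp/country_ref.csv
printf '101,1\n102,2\n103,1\n' > /tmp/orders.csv

awk -F, '
    NR == FNR { ref[$1] = $2; next }   # first file: load lookup into memory
    { print $1 "," ref[$2] }           # second file: probe lookup per order
' /tmp/country_ref.csv /tmp/orders.csv > /tmp/orders_enriched.csv

cat /tmp/orders_enriched.csv
```

The trade-off mirrors the graph-level one: the lookup table must fit in memory, but the large flow stays unsorted and streams straight through.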

Environment: Ab Initio, Oracle 9i/10g, flat files, AIX 5.0, shell scripting, Autosys, COBOL, Test Director, DB2, SQL, PL/SQL, Windows NT/2000.
