Sr. Ab Initio Developer Resume

Baltimore, MD

SUMMARY

  • Over 7 years of experience as an ETL/Data Warehouse Developer spanning the full project lifecycle.
  • Experience in Design, Development, Implementation and testing of various sized applications using ETL and BI Tools.
  • Over 6 years of Ab Initio experience with ETL, data mapping, transformation, and loading from source to target databases in complex, high-volume environments.
  • Extensive experience with EME for version control, impact analysis and dependency analysis.
  • Good understanding of newer Ab Initio features such as Component Folding, Parameter Definition Language (PDL), and Continuous Flows, including queues and publisher/subscriber components.
  • Working experience with JCL for creating batch jobs that run graphs on the mainframe. Experience with mainframe file types (MVS, VSAM, and flat files).
  • Experience in Data Modeling Schemas (RDBMS, Multi-Dimensional Modeling (Star Schema, Snowflake), MOLAP, ROLAP and HOLAP) using Erwin Data Modeler.
  • Good experience with Teradata SQL, Teradata ANSI SQL, and Teradata Tools & Utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ, and QueryMan).
  • Good Experience working with various Heterogeneous Source Systems like Oracle, DB2, MS SQL Server, Flat files and Legacy Systems.
  • Extensive experience in prototyping, RAD, SDLC, JAD, Agile iterative development, structured techniques, metadata management, and data mapping.
  • Proficient in using Shell scripts and SQL for automation of ETL processes.
  • Good experience in Scheduling, Production Support and Troubleshooting for various ETL Jobs.
  • Wrote UNIX shell scripts for batch processing, using the vi and Emacs editors.

TECHNICAL SKILLS

Data Warehousing: Ab Initio (GDE 3.0.5.1, Co>Operating System 3.0.6.1), data warehouse design (mapping design, mapplets, transformations), ETL, Metadata, Data Mining, Data Marts, EME.

Databases: Sybase 12.5.1, Teradata 13.0, Oracle 11.1, MS SQL Server 6.5/7.0/2000, DB2, MS Access.

Data Modeling: Star & Snowflake Schemas, Visio.

Tools: SQL Assistant, MS Access Reports, Control M, Clear Quest

Programming: C, SQL, Shell Scripting, Korn Shell, SQL*Plus, PL/SQL, Visual Basic 6.0

Operating Systems: Windows NT/98/2000, AIX 5.0/5.2/5.3, OS/390

PROFESSIONAL EXPERIENCE

Confidential, Windsor Mill, MD

Sr. ETL/Ab Initio Developer

Responsibilities:

  • Involved in Data transfer design and development for legacy data to a newly defined data model for the data warehouse.
  • Involved in meetings to gather requirements from the Business Users.
  • Used Ab Initio data cleansing functions such as is_valid, is_defined, is_error, string_substring, string_concat, and other String, Date, Inquiry, and Miscellaneous functions.
  • Developed complex Ab Initio XFRs to meet rigorous business requirements.
  • Developed shell scripts to customize the ETL Ab Initio Graphs at runtime.
  • Automated the Data Loads using UNIX shell scripting for Production, Testing and development environment.
  • Involved in modifying the Ab-initio generic graph and wrapper scripts for unloading data from source systems.
  • Implemented data parallelism in graphs by using Ab Initio partition components to divide data into segments and operate on each segment simultaneously.
  • Used Data Parallelism, Pipeline Parallelism and Component Parallelism in Graphs, where huge data files are partitioned into multifiles and each segment is processed simultaneously.
  • Involved in assessment of the BI architecture and created online survey which was accessed by end users.
  • Used the Oracle Enterprise Performance Management (EPM) tool to create reports.
  • Extensively used ETL to load data with Ab Initio DB components from heterogeneous source systems such as DB2 and Oracle into the target Oracle data warehouse.
  • Involved in scheduling the jobs in Autosys.
  • Experienced in writing complex SQL and using PL/SQL.
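A runtime-customization wrapper of the kind mentioned above can be sketched as follows. All paths, the graph script name, and parameter names are hypothetical, and the launch command is echoed (dry run) rather than executed so the sketch stays self-contained:

```shell
#!/bin/sh
# Sketch: resolve environment-specific values at runtime and build the
# command that would launch a deployed graph's .ksh with those values
# passed in as graph parameters. Names are hypothetical.

ENV="${1:-dev}"                     # dev | qa | prod
RUN_DATE="${2:-$(date +%Y%m%d)}"

# Environment-specific source directory (assumed layout).
case "$ENV" in
  prod) SRC_DIR=/data/prod/inbound ;;
  qa)   SRC_DIR=/data/qa/inbound ;;
  *)    SRC_DIR=/data/dev/inbound ;;
esac

# Build the launch command for the deployed graph script (dry run).
build_cmd() {
  echo "load_customers.ksh -SRC_DIR $SRC_DIR -RUN_DATE $RUN_DATE"
}

build_cmd
```

In practice the wrapper would execute the built command and propagate its exit status to the scheduler.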

Environment: Ab Initio (GDE 3.1.2, Co>Operating System 3.1.2), UNIX shell scripting, DB2, Oracle 11.1, Oracle EPM 11.1.2, UNIX/Linux, Windows 7, Autosys.

Confidential, NJ

Sr. Ab Initio Developer

Responsibilities:

  • Designed parameterized generic graphs to pass values from the wrapper script.
  • Improved Ab Initio graph performance using techniques such as lookups (instead of joins) and in-memory joins.
  • Developed Ab-Initio graphs for Data validation using “validate” components like compare records, compute checksum etc.
  • Extensively used ETL to load data with Ab Initio DB components from heterogeneous source systems such as Sybase and Teradata into the target Oracle data warehouse and other file systems.
  • Used subgraphs to improve graph clarity and to enforce reusable business rules.
  • Worked closely with users to gather requirements.
  • Involved in developing Unix Korn Shell wrappers to run various Ab-Initio Scripts.
  • Supported PAA batch job runs in production.
  • Experienced in writing complex SQL queries to extract the data out from Sybase database based on the given requirements.
  • Created Sybase .dbc configuration files for the DEV, QA, and PROD environments.
  • Deployed project objects from DEV to QA.
  • Developed graphs to unload data into tables and validate the new data in comparison with already existing data.
  • Wrote several Shell scripts for Project maintenance to remove old/unused files and move raw logs to the archives.
  • Used Ab Initio for Error Handling by attaching error and rejecting files to each transformation and making provision for capturing and analyzing the message and data separately.
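Reject-file capture of the kind described above is often paired with a post-run check in the wrapper that fails the job when rejects exceed a tolerance. A minimal sketch, where the file names and the 1% threshold are assumptions and stand-in files keep it self-contained:

```shell
#!/bin/sh
# Sketch: fail a job when a graph's reject file exceeds 1% of input.
# File names and the threshold are hypothetical.

DATA_FILE=/tmp/customers.dat
REJECT_FILE=/tmp/customers.reject

# Stand-in data: 100 input records, 2 rejected by the transform.
seq 1 100 > "$DATA_FILE"
seq 1 2   > "$REJECT_FILE"

total=$(wc -l < "$DATA_FILE")
rejected=$(wc -l < "$REJECT_FILE")

# rejects/total > 1/100  <=>  rejected * 100 > total (integer math).
if [ $((rejected * 100)) -gt $((total * 1)) ]; then
  job_status=FAIL
else
  job_status=OK
fi
echo "$job_status: $rejected of $total records rejected"
```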

Environment: Ab Initio (GDE 3.0.5.1, Co>Operating System 3.0.6.1), UNIX shell scripting, Sybase 12.5.1, Teradata 13.0, Oracle 11.1, UNIX/Linux, Windows 2002.

Confidential, Durham, NC

ETL/Ab Initio Developer

Responsibilities:

  • Developed Ab Initio graphs to load the ICD code sets into the repository.
  • Used Ab Initio transform components such as Join, Rollup, Scan, Reformat, and Filter by Expression to extract the required data and meet the business requirements.
  • Worked with the Data Analysis Team to better understand each code set's different versions and formats.
  • Worked closely with technical architects to gain the required knowledge of the technical design and system environment.
  • Shared ideas with Business Analysts to fulfill the business requirements.
  • Played an active role in preparing the ETL design document, which was approved by the project manager.
  • Responsible for ICD code set loads: initial load, delta load, and mapping load.
  • Experienced in writing SQL queries to load and unload data in the target DB2 and Oracle databases.

Environment: Ab Initio (GDE 3.0.2.1, Co>Operating System 3.0.6), UNIX shell scripting, SQL Server, DB2, Oracle 10g, UNIX, Windows 2000.

Confidential, Baltimore, MD

ETL/Ab Initio Developer

Responsibilities:

  • Developed Ab Initio graphs using the Salesforce components.
  • Good knowledge of the Salesforce components, such as Query Salesforce, Write Salesforce, Get Salesforce Info, and Retrieve Salesforce Objects.
  • Deployed Ab Initio graphs and migrated them to other instances such as QA using air commands.
  • Designed and developed wrapper scripts for batch stream jobs.
  • Created Autosys jobs (JIL) for Ab Initio graphs in the DEV and QA instances.
  • Developed psets, DMLs, and XFRs for generic graphs.
  • Extensively used partition components and developed graphs using Write Multi-Files, Read Multi-Files, Filter by Expression, Run Program, Join, Scan, Rollup, Sort, and Reformat.
  • Responsible for loading the international batch stream into various marts, both domestic and international.
  • Extensively used ETL to load data with Ab Initio DB components from heterogeneous source systems such as DB2 and Oracle into the target DB2 data warehouse and other file systems.
  • Used various Ab Initio Multi File Systems (MFS) to run graphs in parallel using layout feature.
  • Involved in debugging production issues for batch stream jobs.
  • Experienced in writing complex SQL queries based on the given requirements.
  • Interacting with business partners to gather requirements.
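An air-command migration of the kind described above typically tags the objects to promote, saves them to a transferable file, and loads that file into the target EME. The sketch below is a dry run: all names are hypothetical, the exact air syntax is an assumption to verify against the Co>Operating System documentation, and the commands are logged rather than executed:

```shell
#!/bin/sh
# Dry-run sketch of an EME promotion using air commands.
# Names and the air argument order are assumptions; the stub below
# logs each command instead of running it, so no Ab Initio install
# is needed.

TAG=REL_2012_06
SAVE_FILE=/tmp/${TAG}.save
PROJECT=/Projects/sales_dw
DRYRUN_LOG=/tmp/air_dryrun.log
: > "$DRYRUN_LOG"

air() { echo "air $*" | tee -a "$DRYRUN_LOG"; }   # stub: log, don't run

# 1. Tag the objects to promote in the DEV EME.
air tag create "$TAG" "$PROJECT"

# 2. Save the tagged objects to a transferable file.
air object save "$SAVE_FILE" "$TAG"

# 3. Load the save file into the QA EME.
air object load "$SAVE_FILE"
```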

Environment: Ab Initio (GDE 3.0.2.1, Co>Operating System 3.0.2; GDE 1.14, Co>Operating System 2.14), UNIX shell scripting, SQL Server, DB2, Oracle 10g, UNIX, Windows NT/2000.

Confidential, Collierville, TN

Sr. ETL/Ab Initio Developer

Responsibilities:

  • Developed several partition based Ab Initio Graphs for high volume data warehouse.
  • Involved in all phases of the System Development Life Cycle, including analysis and data modeling.
  • Extensively used Enterprise Meta Environment (EME) for version control.
  • Extensive exposure to Generic graphs for data cleansing, data validation and data transformation.
  • Created sandboxes and edited sandbox parameters according to the repository.
  • Used air commands to perform dependency analysis on all Ab Initio objects.
  • Involved in Ab Initio design and configuration: ETL, data mapping, transformation, and loading in complex, high-volume environments processing data at the terabyte level.
  • Extensively used partition components and developed graphs using Write Multi-Files, Read Multi-Files, Filter by Expression, Run Program, Join, Sort, and Reformat.
  • Followed the best design principles, efficiency guidelines and naming standards in designing the graphs
  • Developed shell scripts for Archiving, Data Loading procedures and Validation
  • Involved in writing Unit Test scripts, support documents and implementation plan.
  • Tuned the graphs by creating Lookup files, Memory sort and Max-core parameters for maximum usage of cache memory and enhanced performance.
  • Implemented a 6-way multifile system composed of individual files on different nodes, partitioned and stored in distributed directories (using multi-directories).
  • Responsible for cleansing the data from source systems using Ab Initio components such as reformat and filter by expression.
  • Used subgraphs to improve graph clarity and to enforce reusable business rules.
  • Capable of designing solutions around Ab Initio, with advanced skills in high performance and parallelism.
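A multifile system like the 6-way one described above is created once with m_mkfs, passing the multifile-system control directory followed by one data partition per disk or node. A dry-run sketch with hypothetical host and directory names (the command is captured, not executed):

```shell
#!/bin/sh
# Dry-run sketch: create a 6-way multifile system with m_mkfs.
# Hosts and paths are hypothetical; the stub prints the command
# instead of running it, so no Co>Operating System install is needed.

m_mkfs() { echo "m_mkfs $*"; }   # stub: print instead of run

# First argument: the multifile-system control directory; the
# remaining six: one data partition directory per disk/node.
mkfs_cmd=$(m_mkfs //etlhost/mfs/mfs6 \
  //etlhost/vol1/mfs6_p0 //etlhost/vol2/mfs6_p1 \
  //etlhost/vol3/mfs6_p2 //etlhost/vol4/mfs6_p3 \
  //etlhost/vol5/mfs6_p4 //etlhost/vol6/mfs6_p5)

echo "$mkfs_cmd"
```

Graphs whose layout points at this MFS then run their components 6 ways in parallel, one partition per data directory.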

Environment: Ab Initio (GDE 1.15.10, Co>Operating System 2.15.10), UNIX shell scripting, Windows NT/2000, DB2, UNIX IBM AIX 5.1.

Confidential, Owings Mills, MD

Sr. ETL Developer

Responsibilities:

  • Worked in the Data Management Team on Data Extraction, Fictionalization, Subset, Data Cleansing, and Data Validation.
  • Responsible for requirement gathering and development of Expiration Date Fictionalization project.
  • Created batch jobs on the mainframe to run the graphs.
  • Working experience with DCLGEN in DB2 to create copybooks for the DB2 tables.
  • Involved in migrating the data warehouse from DB2 to TERADATA.
  • Wrote numerous queries and BTEQ scripts for the ETL process; responsible for choosing primary and secondary indexes for tables and for using analytical functions (QUALIFY ROW_NUMBER() OVER (PARTITION BY ...)) for effective performance at peak volumes and in complex joins.
  • Resolved major spool space issues by collecting statistics on the proper columns of Teradata tables and ensuring proper data distribution in the volatile tables used by Teradata scripts.
  • Responsible for creating the parameters for graphs run in the UNIX environment.
  • Coordinated with different testing groups to accommodate their test data requirements and translate them into data selection criteria in Ab Initio format.
  • Extensively used ETL to load data with Ab Initio DB components from heterogeneous source systems such as DB2 and Oracle into the target Teradata data warehouse and other file systems.
  • Batch processing for data downsizing (subset).
  • Maintaining sandbox by storing all the work in a sequential order.
  • Developed UNIX shell scripts for parsing and processing data files. Maintained and troubleshot batch processes for overnight operations.
  • Coordinated with the data team on future changes to file and table structures to accommodate future testing requirements.
  • Worked with MVS, VSAM, GDG, DB2, Flat files and excel sheets as the inputs for the graphs.
  • Worked with AbInitio functions like date, strings and user defined functions as per the requirements.
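The analytic-function pattern mentioned above (QUALIFY ROW_NUMBER() OVER (PARTITION BY ...)) is typically embedded in a BTEQ script driven from the shell. A sketch with hypothetical table, column, and logon values; bteq is invoked only if the Teradata client is installed:

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that keeps the latest row per
# account. All object names and credentials are hypothetical.

BTEQ_SCRIPT=/tmp/latest_per_account.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;

SELECT account_id, balance, load_dt
FROM   dw.account_snapshot
QUALIFY ROW_NUMBER() OVER (PARTITION BY account_id
                           ORDER BY load_dt DESC) = 1;

.LOGOFF;
.QUIT;
EOF

# Execute only where the Teradata client is present.
if command -v bteq >/dev/null 2>&1; then
  bteq < "$BTEQ_SCRIPT"
else
  echo "bteq not installed; script written to $BTEQ_SCRIPT"
fi
```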

Environment: AbInitio (GDE 1.14.16, Co-Operating System 2.14.1), UNIX shell Scripting, Windows NT/2000,DB2, Teradata V2R6.

Confidential, NJ

Sr. ETL Developer

Responsibilities:

  • Extensively used Ab Initio components such as Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sorted, Rollup, Scan, FTP, Lookup, Normalize, and Denormalize, along with features like MFS (8-way partitioning), checkpoints, and phases.
  • Optimized and tuned Ab Initio graphs. Architected and implemented the project using an Agile iterative methodology with the BT Project Manager; the iterations covered CD, MM, and integration with card products using the Customer Identification Number, marketing campaigns, and IVR call and agent response data.
  • Automated the ETL process using UNIX Shell scripting.
  • Extensively used the Teradata utilities like BTEQ, Fast load, Multiload, TPump, DDL Commands and DML Commands (SQL).
  • Involved in a Proof of Concept (POC) using Ab Initio to show that it was the right solution. Used components such as Call Web Service, Read XML, and Write XML, plus the xml-to-dml utility for testing. Also ran a POC with Ab Initio and Oracle stored procedures (PL/SQL) to evaluate performance.
  • Involved in writing complex SQL queries based on the given requirements and Created series of Teradata Macros for various applications in Teradata SQL Assistant and performed tuning for Teradata SQL statements using Teradata Explain command.
  • Prepared Unit and Integration testing plans. Involved in Unit and Integration testing using the testing plans.
  • Created several SQL queries and reports against the data mart for UAT and user reporting, using SQL features such as GROUP BY, ROLLUP, CASE, UNION, subqueries, EXISTS, and COALESCE.
  • Involved in post-implementation support, user training, and data model walkthroughs with business/user groups.
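A FastLoad job of the kind the utilities bullet above describes is usually a shell-generated script: logon, error tables, record format, field definitions, and a single INSERT. A sketch with hypothetical object names, logon, and delimiter; fastload is invoked only if the Teradata client is installed:

```shell
#!/bin/sh
# Sketch: generate a Teradata FastLoad script for a staging load.
# All names and credentials are hypothetical.

FLD_SCRIPT=/tmp/stg_customer.fld

cat > "$FLD_SCRIPT" <<'EOF'
LOGON tdprod/etl_user,etl_password;

BEGIN LOADING dw.stg_customer
      ERRORFILES dw.stg_customer_e1, dw.stg_customer_e2;

SET RECORD VARTEXT "|";
DEFINE customer_id (VARCHAR(10)),
       customer_nm (VARCHAR(50))
FILE = /tmp/customer.dat;

INSERT INTO dw.stg_customer (:customer_id, :customer_nm);

END LOADING;
LOGOFF;
EOF

if command -v fastload >/dev/null 2>&1; then
  fastload < "$FLD_SCRIPT"
else
  echo "fastload not installed; script written to $FLD_SCRIPT"
fi
```

FastLoad requires an empty target table; the two ERRORFILES tables capture constraint and uniqueness violations for later inspection.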

Environment: Ab Initio (GDE 1.13.7, Co>Operating System 2.14.104), Teradata V2R5, FastLoad, MultiLoad, FastExport, BTEQ, UNIX IBM AIX 5.1, UNIX shell scripts.

Confidential, Birmingham, MI

ETL/Ab Initio Developer

Responsibilities:

  • Analyzed Data Warehouse/Decision Support business requirements.
  • Involved in designing fact, dimension and aggregate tables for Data Warehouse.
  • Followed Star schema methodology to store the data in the Data Warehouse.
  • Developed a number of Ab Initio graphs based on business requirements using components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, and Gather.
  • Used various Ab-Initio Multi File Systems (MFS) to run graphs in parallel using layout feature.
  • Involved with Teradata DBA team to understand the structures of the tables and attributes before loading from Abinitio to Teradata.
  • Modified BTEQ scripts to load data from Teradata Staging area to Teradata data mart and also wrote complex queries to load summary tables based on the core tables in DW.
  • Implemented data parallelism in graphs by using Ab Initio partition components to divide data into segments and operate on each segment simultaneously.
  • Responsible for designing parallel partitioned Ab Initio graphs for a high-volume data warehouse.
  • Used UNIX environment variables in all Ab Initio graphs to specify the locations of the source and target files.
  • Responsible for creating a 4-way multifile system composed of individual files that are partitioned and stored in distributed directories.
  • Involved in Ab Initio graph design and performance tuning of the load graph process.
  • Developed Ab-Initio graphs for Data validation using “validate” components like compare records, compute checksum etc.
  • Developed complex Ab Initio XFRs to derive new fields and satisfy various business requirements.
  • Extensively used Ab Initio Co>Operating System commands such as m_ls, m_dump, m_mkfs, and m_rollback.
  • Enhanced graph performance using Ab Initio techniques such as lookups (instead of joins), in-memory joins, and rollups.
  • Involved in developing Unix Korn Shell wrappers to run various Ab-Initio Scripts.
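A Korn-shell-style wrapper like the one described above runs the deployed graph's .ksh script, captures its log, and checks the exit status before signalling the scheduler. A self-contained sketch in which the paths are hypothetical and a stand-in graph script replaces the real deployed one:

```shell
#!/bin/sh
# Sketch of a graph wrapper: run the deployed script, log output,
# check the return code. Paths and the graph name are hypothetical.

GRAPH_SCRIPT=/tmp/load_claims.ksh
LOG=/tmp/load_claims.log

# Stand-in for a graph script deployed from the GDE.
cat > "$GRAPH_SCRIPT" <<'EOF'
#!/bin/sh
echo "graph load_claims finished"
exit 0
EOF
chmod +x "$GRAPH_SCRIPT"

"$GRAPH_SCRIPT" > "$LOG" 2>&1
rc=$?

if [ "$rc" -eq 0 ]; then
  echo "load_claims succeeded (rc=$rc)"
else
  echo "load_claims FAILED (rc=$rc); see $LOG"
fi
```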

Environment: Ab Initio (GDE 1.13.11, Co>Operating System 2.13.1), UNIX shell scripting, SQL Server, Teradata V2R5, UNIX, Windows NT/2000.

Confidential, Phoenix, AZ

ETL/DW Developer

Responsibilities:

  • Interacting with business partners to gather requirements.
  • Converting the business requirements into list pull specifications.
  • Wrote precise, reusable ETL specifications and patterns to facilitate development of best practices and internal competencies.
  • Used Ab Initio as ETL tool to pull data from source systems, cleanse, transform, and load data into databases.
  • Developed various Ab Initio graphs for data cleansing using functions such as is_valid, is_defined, is_error, string_substring, string_concat, and other string_* functions.
  • Created proper Primary Indexes (PIs), taking into consideration both the planned access paths and even distribution of data across all available AMPs.
  • Created NUSIs for fast and easy access to Teradata tables, considering both the business requirements and access factors.
  • Designed parameterized generic graphs to pass values from the wrapper script.
  • Developed a number of Ab Initio graphs based on business requirements using components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Gather, Broadcast, and Merge.
  • Improved Ab Initio graph performance using techniques such as lookups (instead of joins) and in-memory joins.
  • Preparing Technical Design and Test Plans.
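The PI and NUSI choices described above can be illustrated with a small piece of Teradata DDL. Table, column, and index names are hypothetical; the sketch only writes the DDL to a file rather than executing it:

```shell
#!/bin/sh
# Sketch: Teradata DDL showing a PI chosen for even AMP distribution
# plus a NUSI for a secondary access path. All names are hypothetical.

DDL=/tmp/campaign_ddl.sql

cat > "$DDL" <<'EOF'
-- PI chosen for even distribution across AMPs and for the planned
-- access path (joins and lookups on customer_id).
CREATE TABLE dw.campaign_response
( customer_id  INTEGER  NOT NULL,
  campaign_cd  CHAR(8)  NOT NULL,
  response_dt  DATE
)
PRIMARY INDEX (customer_id);

-- NUSI to support frequent selection by campaign code.
CREATE INDEX campaign_cd_nusi (campaign_cd)
ON dw.campaign_response;
EOF

echo "DDL written to $DDL"
```

A high-cardinality PI column spreads rows evenly across AMPs; the NUSI trades extra storage and maintenance for faster selection on the non-key column.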

Environment: Ab Initio (GDE 1.13.7, Co>Operating System 2.14.104), Erwin 3.5, Oracle 8i, Teradata, SQL Server 2000, AIX UNIX, shell scripts, PL/SQL, Windows NT.
