Sr. ETL/Ab Initio Developer Resume
Atlanta, GA
SUMMARY
- Over 7 years of IT experience, including about 5 years of experience in Ab Initio.
- Data Warehouse/ETL Consultant: Senior ETL Developer with extensive project lifecycle experience in the design, development, implementation, and testing of applications of various sizes using ETL and BI tools.
- Over 5 years of Ab Initio experience with ETL, data mapping, transformation, and loading from source to target databases in a complex, high-volume environment.
- Extensive experience with EME for version control, impact analysis and dependency analysis.
- Good understanding of newer Ab Initio features such as Component Folding, Parameter Definition Language (PDL), Continuous Flows, Queues, and publisher/subscriber components.
- Very good understanding of Teradata UPI, NUPI, secondary indexes, and join indexes.
- Experience in performance tuning of queries using join and aggregate indexes, with a good understanding of the Teradata EXPLAIN plan.
- Experience with Teradata SQL, Teradata ANSI SQL, and Teradata Tools & Utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ, and QueryMan).
- Working experience with JCL for creating batch jobs to run the graphs on the mainframe.
- Experience with mainframe files (MVS, VSAM, GDG, and flat files) and with tools such as Compuware File-AID, QMF, and the TSO SIM tool.
- Experience using the TSO SIM tool to load and unload DB2 tables through the mainframe.
- Experience in data modeling schemas (RDBMS, multi-dimensional modeling (star schema, snowflake), and MOLAP) using Erwin Data Modeler.
- Good experience working with various heterogeneous source systems such as Teradata, Oracle, DB2, MS SQL Server, flat files, and legacy systems.
- Extensive experience in prototyping, SDLC, JAD, Agile iterative development, structured techniques, metadata management, and data mapping.
- Proficient in using shell scripts and SQL for automation of ETL processes (a representative sketch follows this list).
- Good experience in Scheduling, Production Support and Troubleshooting for various ETL Jobs.
- Provided production support; flexible in managing multiple projects, with offshore-onsite model experience.
- Developed UNIX shell scripts for batch processing using the vi and Emacs editors.
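A minimal sketch of this kind of shell-plus-SQL automation, assuming a Teradata target; every host, schema, table, and credential name below is illustrative rather than taken from an actual engagement:

```sh
#!/bin/ksh
# run_daily_load.ksh -- illustrative ETL automation wrapper (all names hypothetical).
# Runs a BTEQ session that moves the day's rows from a staging table to a
# target table, then fails the batch if BTEQ exits nonzero.

LOGDIR=/app/etl/logs
STAMP=$(date +%Y%m%d%H%M%S)

bteq <<EOF > "$LOGDIR/daily_load_$STAMP.log" 2>&1
.LOGON tdprod/etl_user,etl_password;

INSERT INTO dw.customer_dim (cust_id, cust_name, load_dt)
SELECT cust_id, cust_name, CURRENT_DATE
FROM   stg.customer_stage
WHERE  load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

rc=$?
if [ $rc -ne 0 ]; then
    echo "daily load failed with return code $rc" >&2
    exit $rc
fi
echo "daily load completed"
```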
TECHNICAL SKILLS
Data Warehousing: Ab Initio (GDE 1.15.10, Co>Operating System 2.15.10), data warehouse design (mapping design, transformations), ETL, metadata, data mining, data marts, OLAP, OLTP, EME
Databases: Teradata V2R12/6/5, Oracle 10g/9i, MS SQL Server 6.5/7.0/2000, DB2, MS Access 7.0
Data Modeling: Star Schema Modeling, Snowflake Modeling, Visio
Tools: SQL Assistant, MS Access Reports, TOAD, Maestro, Control M, Clear Quest
Programming: C, SQL, UNIX Shell Scripting, Korn Shell, SQL*Plus, PL/SQL, Visual Basic 6.0
Operating Systems: Windows NT/98/2000, HP-UX, AIX 5.0/5.2/5.3, OS/390
PROFESSIONAL EXPERIENCE
Confidential, Atlanta, GA
Sr. ETL/Ab initio Developer
Responsibilities:
- Developed several partition-based Ab Initio graphs for a high-volume data warehouse.
- Involved in all phases of the system development life cycle, including analysis and data modeling.
- Extensively used the Enterprise Meta Environment (EME) for version control.
- Extensive exposure to Generic graphs for data cleansing, data validation and data transformation.
- Created sandboxes and edited sandbox parameters according to the repository; extensive exposure to EME.
- Used AIR commands to perform dependency analysis on all Ab Initio objects.
- Involved in Ab Initio design and configuration for ETL, data mapping, transformation, and loading in a complex, high-volume environment, processing data at the terabyte level.
- Extensively used partition components and developed graphs using Write Multi File, Read Multi File, Filter by Expression, Run Program, Join, Sort, and Reformat.
- Followed best design principles, efficiency guidelines, and naming standards in designing the graphs.
- Created several BTEQ, FastLoad, and MultiLoad scripts to load backfill data into the data warehouse for performance testing (a representative FastLoad sketch follows this list).
- Developed shell scripts for archiving, data loading procedures, and validation.
- Involved in writing unit test scripts, support documents, and the implementation plan.
- Tuned the graphs by creating lookup files and adjusting in-memory sort and max-core parameters to maximize cache memory usage and enhance performance.
- Implemented a 6-way multifile system composed of individual files on different nodes, partitioned and stored in distributed directories (using multi directories).
- Developed complex BTEQ scripts for pre-population of the work tables prior to the main load process.
- Used database query optimization and I/O tuning techniques for performance enhancement.
- Tuned Teradata queries for performance using EXPLAIN plans, and added necessary indexes, analytic functions, and COLLECT STATISTICS to resolve the spool space issues encountered.
- Responsible for cleansing data from source systems using Ab Initio components such as Reformat and Filter by Expression.
- Worked with an offshore team to build the project; worked closely with users on requirements gathering and provided the offshore team with detailed requirements documents.
- Used subgraphs to increase the clarity of graphs and to impose reusable business restrictions.
- Capable of designing solutions around Ab Initio, with advanced skills in high-performance processing and parallelism.
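The FastLoad sketch referenced above; the database, table, file, and credential names are hypothetical, and the DROP statements simply clear error tables left over from a prior run:

```sh
#!/bin/ksh
# backfill_fastload.ksh -- illustrative FastLoad backfill (all names hypothetical).
# Bulk-loads a pipe-delimited extract into an empty Teradata work table.

fastload <<EOF
LOGON tdprod/etl_user,etl_password;

DROP TABLE dw.txn_backfill_err1;   /* clear error tables from a prior run */
DROP TABLE dw.txn_backfill_err2;

SET RECORD VARTEXT "|";

DEFINE txn_id  (VARCHAR(18)),
       acct_id (VARCHAR(18)),
       txn_amt (VARCHAR(18)),
       txn_dt  (VARCHAR(10))
FILE = /data/extracts/txn_backfill.dat;

BEGIN LOADING dw.txn_backfill
      ERRORFILES dw.txn_backfill_err1, dw.txn_backfill_err2;

INSERT INTO dw.txn_backfill (txn_id, acct_id, txn_amt, txn_dt)
VALUES (:txn_id, :acct_id, :txn_amt, :txn_dt);

END LOADING;
LOGOFF;
EOF
```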
Environment: Ab Initio (GDE 1.15.10, Co>Operating System 2.15.10), UNIX shell scripting, Windows NT/2000, COBOL, JCL, DB2, File-AID, Teradata V2R12/6, UNIX IBM AIX 5.1, and QMF.
Confidential, Dallas, TX
Sr. ETL Developer
Responsibilities:
- Worked in the Data Management Team on data extraction, fictionalization (data masking), subsetting, data cleansing, and data validation.
- Responsible for requirements gathering and development of the Expiration Date Fictionalization project.
- Created batch jobs on the mainframe to run the graphs.
- Working experience with DCLGEN in DB2 to create copybooks for the DB2 tables.
- Responsible for creating the parameters for the graphs to run in the UNIX environment.
- Responsible for writing the wrapper scripts (a representative sketch follows this list).
- Responsible for analyzing and developing the data synchronization with TCOE (retail business testing).
- Coordinated with different testing groups to accommodate their test data requirements and translate them into data selection criteria in Ab Initio format.
- Performed batch processing for data downsizing (subsetting).
- Conducted several JAD sessions with business-side analysts to arrive at better requirements.
- Documented ways to automate manual processes.
- Maintained the sandbox by storing all work in sequential order.
- Developed UNIX shell scripts for parsing and processing data files; maintained and troubleshot batch processes for overnight operations.
- Worked with the Lean team to prepare the project plan and remove unnecessary steps for effective ROI.
- Generated quick reports for users and performed data analysis on test beds on numerous occasions.
- Coordinated with the data team on future changes to file and table structures to accommodate future testing requirements.
- Worked with MVS, VSAM, GDG, DB2, flat files, and Excel sheets as inputs for the graphs.
- Experience in creating base clusters for VSAM files and generations for GDG files.
- Experienced in the use of Agile approaches, including test-driven development and scrums.
- Worked with Ab Initio date, string, and user-defined functions as required.
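The wrapper-script sketch referenced above; it assumes graphs are deployed as .ksh scripts that read their parameters from exported environment variables, and all paths and names are hypothetical:

```sh
#!/bin/ksh
# run_graph_wrapper.ksh -- illustrative wrapper for a deployed Ab Initio graph
# (paths, parameter names, and the deployed script name are hypothetical).

GRAPH=$1                        # e.g. expdate_fict.ksh, a deployed graph script
RUNDIR=/app/abinitio/run
LOGDIR=/app/abinitio/logs
STAMP=$(date +%Y%m%d%H%M%S)
LOG=$LOGDIR/$(basename "$GRAPH" .ksh)_$STAMP.log

# Export the sandbox-style parameters the deployed script expects.
export INPUT_DIR=/data/inbound
export OUTPUT_DIR=/data/outbound

echo "starting $GRAPH at $(date)" | tee -a "$LOG"
"$RUNDIR/$GRAPH" >> "$LOG" 2>&1
rc=$?

if [ $rc -ne 0 ]; then
    echo "$GRAPH failed (rc=$rc); see $LOG" | tee -a "$LOG" >&2
    exit $rc
fi
echo "$GRAPH completed at $(date)" | tee -a "$LOG"
```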
Environment: Ab Initio (GDE 1.14.16, Co>Operating System 2.14.1), UNIX shell scripting, Oracle 10g, Windows NT/2000, COBOL, JCL, DB2, File-AID, BMC, QMF, TSO SIM tool, Web star, Mercury TestDirector
Confidential, Riverwoods, IL
Sr. ETL Developer
Responsibilities:
- Met with business/user groups to understand the business process and gather requirements. Extracted and analyzed sample data from the operational (OLTP) systems to validate the user requirements. Created high-level design documents.
- Participated in data model (logical/physical) discussions with data modelers and created both logical and physical data models. Implemented star schema models for the data warehouses by creating facts and dimensions, and also created the DTD.
- Extensively used Ab Initio components such as Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sorted, Rollup, Scan, FTP, Lookup, Normalize, and Denormalize. Used Ab Initio features such as MFS (8-way partitioning), checkpoints, and phases.
- Participated in Agile iterative sessions to develop extended logical models and physical models.
- Optimized and tuned the Ab Initio graphs. Architected and implemented this project using an Agile iterative methodology with the help of the BT project manager.
- The iterations were CD, MM, and integration with card products using Customer Identification Number, marketing campaigns, and IVR call and agent response data.
- Wrote UNIX shell scripts to run ETL interfaces (BTEQ, MultiLoad, FastLoad, and FastExport jobs). Analyzed the Teradata data distribution and reviewed index choices (a distribution-check sketch follows this list).
- Automated the ETL process using UNIX Shell scripting.
- Extensively used Teradata utilities such as BTEQ, FastLoad, MultiLoad, and TPump, along with DDL and DML commands (SQL).
- Supported various business units during and after production rollout with ad hoc Teradata queries to identify missing or incorrect data and resolve the issues. Also wrote complex queries and BTEQ scripts for users to run their reports.
- Did a proof of concept (POC) using Ab Initio to show that Ab Initio was the right solution, exercising components such as Read XML and Write XML and the xml-to-dml utility for testing. Also did a POC with Ab Initio and Oracle stored procedures (PL/SQL) to evaluate performance.
- Involved in writing complex SQL queries based on the given requirements; created a series of Teradata macros for various applications in Teradata SQL Assistant and tuned Teradata SQL statements using the Teradata EXPLAIN command.
- Prepared unit and integration test plans and performed unit and integration testing using them.
- Created several SQL queries and reports from the data mart for UAT and user reporting. Used SQL features such as GROUP BY, ROLLUP, CASE, UNION, subqueries, EXISTS, COALESCE, and NULL handling.
- Involved in post-implementation support, user training, and data model walkthroughs with business/user groups.
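The distribution-check sketch referenced above: a BTEQ session that counts rows per AMP for a candidate primary index column, a standard way to spot skew when reviewing index choices (connection and table names are hypothetical):

```sh
#!/bin/ksh
# check_pi_skew.ksh -- illustrative data-distribution check (names hypothetical).
# HASHROW/HASHBUCKET/HASHAMP map each PI value to the AMP that would store it;
# a badly skewed choice shows one AMP holding far more rows than the rest.

bteq <<EOF
.LOGON tdprod/etl_user,etl_password;

SELECT HASHAMP(HASHBUCKET(HASHROW(cust_id))) AS amp_no,
       COUNT(*)                              AS row_cnt
FROM   dw.customer_dim
GROUP  BY 1
ORDER  BY row_cnt DESC;

.LOGOFF;
EOF
```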
Environment: Ab Initio (GDE 1.13.7, Co>Operating System 2.14.104), Teradata V2R5, FastLoad, MultiLoad, FastExport, BTEQ, UNIX IBM AIX 5.1, Control-M, UNIX shell scripts.
Confidential, Memphis, TN
ETL DW Developer
Responsibilities:
- Analyzed Data Warehouse/Decision Support business requirements.
- Involved in designing fact, dimension and aggregate tables for Data Warehouse.
- Followed Star schema methodology to store the data in the Data Warehouse.
- Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as Partition by Key, Partition by Round-robin, Reformat, Rollup, Join, Scan, Normalize, and Gather.
- Mapped metadata from the legacy source system to target database fields and loaded the data into the target using Output Table and Update Table components.
- Used various Ab Initio multifile systems (MFS) to run graphs in parallel using the layout feature.
- Worked with the Teradata DBA team to understand the table structures and attributes before loading from Ab Initio into Teradata.
- Implemented data parallelism in graphs by using the Ab Initio partition components to divide data into segments and operate on each segment simultaneously.
- Responsible for designing parallel-partition Ab Initio graphs for a high-volume data warehouse.
- Used UNIX environment variables in all the Ab Initio graphs to specify the locations of the source and target files.
- Developed ETL scripts for processing and transferring raw data from the legacy systems into the warehouse every week using the Teradata import utilities FastLoad, TPump, and MultiLoad.
- Responsible for creating a 4-way multifile system composed of individual files partitioned and stored in distributed directories (a setup sketch follows this list).
- Modified BTEQ scripts to load data from Teradata Staging area to Teradata data mart.
- Used the BTEQ and SQL Assistant (QueryMan) front-end tools to issue SQL commands matching the business requirements against the Teradata RDBMS.
- Involved in Ab Initio graph design and performance tuning of the load graph process.
- Developed Ab Initio graphs for data validation using validate components such as Compare Records and Compute Checksum.
- Developed complex Ab Initio XFRs to derive new fields and solve various business requirements.
- Extensively used Ab Initio Co>Operating System commands such as m_ls, m_dump, m_mkfs, and m_rollback.
- Enhanced Ab Initio graph performance using techniques such as lookups (instead of joins), in-memory joins, and rollups to speed up various graphs.
- Used two-stage routing to improve the performance of the graph.
- Involved in developing UNIX Korn shell wrappers to run various Ab Initio scripts.
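The multifile-system setup sketch referenced above, assuming the usual m_mkfs argument order of control directory followed by the data partitions; all paths are hypothetical:

```sh
#!/bin/ksh
# make_mfs4.ksh -- illustrative 4-way multifile system setup (paths hypothetical).
# The first argument to m_mkfs is the control directory; the remaining four
# are the data partitions spread across distributed directories.

m_mkfs /u01/mfs/mfs_4way \
       /data1/mfs/mfs_4way_p0 \
       /data2/mfs/mfs_4way_p1 \
       /data3/mfs/mfs_4way_p2 \
       /data4/mfs/mfs_4way_p3

# Sanity-check the new multifile system with other Co>Os commands noted above.
m_touch /u01/mfs/mfs_4way/landing.dat
m_ls /u01/mfs/mfs_4way
```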
Environment: Ab Initio (GDE 1.13.11, Co>Operating System 2.13.1), UNIX shell scripting, Oracle 9i, Teradata V2R6/5, UNIX, Windows NT/2000.
Confidential, Birmingham, MI
ETL Developer
Responsibilities:
- Interacted with business partners to gather requirements.
- Converted the business requirements into list-pull specifications.
- Wrote precise, reusable ETL specifications and patterns to facilitate development of best practices and internal competencies.
- Used Ab Initio as ETL tool to pull data from source systems, cleanse, transform, and load data into databases.
- Developed various Ab Initio graphs for data cleansing using Ab Initio functions such as is_valid, is_defined, is_error, string_substring, string_concat, and other string_* functions.
- Designed parameterized generic graphs to pass values from a wrapper script (a representative driver sketch follows this list).
- Based on business requirements, developed a number of Ab Initio graphs using components such as Partition by Key, Partition by Round-robin, Reformat, Rollup, Join, Scan, Gather, Broadcast, and Merge.
- Improved the performance of Ab Initio graphs using techniques such as lookups (instead of joins) and in-memory joins to speed up various graphs.
- Prepared technical designs and test plans.
- Developed UNIX Korn shell wrappers to run various Ab Initio scripts.
- Developed complex Ab Initio graphs using components such as Partition by Expression, Reformat, Join, Merge, and Dedup, with emphasis on optimizing performance.
- Enhanced performance using data parallelism.
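The driver sketch referenced above; it assumes the generic graph is deployed as a .ksh script that reads its parameters from exported environment variables, and every name and path is hypothetical:

```sh
#!/bin/ksh
# run_generic_cleanse.ksh -- illustrative driver for a parameterized generic
# graph (script name, parameter names, and paths are hypothetical). Each input
# file is passed to the same deployed graph through exported parameter values.

RUNDIR=/app/abinitio/run

for f in /data/inbound/*.dat; do
    export P_INPUT_FILE=$f
    export P_REJECT_FILE=/data/reject/$(basename "$f" .dat).rej

    "$RUNDIR/generic_cleanse.ksh"
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "cleanse failed for $f (rc=$rc)" >&2
        exit $rc
    fi
done
echo "all files cleansed"
```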
Environment: Ab Initio 2.11, GDE 1.11, Erwin 3.5, Oracle 8i, SQL Server 2000, AIX UNIX, shell scripts, PL/SQL, Windows NT
Confidential
Programmer
Responsibilities:
- Studied and analyzed the system.
- Interacted with functional users to develop application specifications.
- Created the High-Level Design (HLD) in discussions with end users.
- Involved in the overall database design using entity-relationship diagrams.
- Designed and developed all the tables and views for the project in the database.
- Created reusable components to access the database using ActiveX DLLs.
- Used DAO data access technology to access the database.
- Wrote triggers and stored procedures for various activities using T-SQL.
- Involved in developing interactive forms and customization of screens using Visual Basic.
- Involved in building, debugging and running forms.
- Responsible for creating reports using Crystal Reports based on the user requirements.
Environment: Visual Basic 5.0, SQL Server 2000, and Crystal Reports.
Confidential
Programmer
Responsibilities:
- Studied and analyzed the system.
- Involved in creating tables, views, queries, and stored procedures.
- Interacted with functional users to develop application specifications.
- Was responsible for the development of Tender module, Project Analysis and part of Financial Accounting.
- Performed tuning of queries.
- Used PL/SQL extensively for the creation of forms and reports.
- Used PL/SQL scripts (stored procedures and functions) for application development.
- Involved in Integration and Acceptance testing.
- Debugged internal code for the forms and reports.
- Developed forms and reports in the Finance (Accounts Payable) module for an information system using Oracle 6.0 and SQL*Forms 3.0.
Environment: Oracle 8.0.5, Oracle Forms, Developer 2000