Sr. Ab Initio Developer Resume
San Francisco, CA
SUMMARY
- Over seven years of professional IT experience in business analysis, design, data modeling, development, and implementation of various client/server and decision support system environments, with a focus on Data Warehousing, Business Intelligence, and Database Applications.
- Over six years of Ab Initio consulting experience in data mapping, transformation, and loading from source to target databases; well versed in Ab Initio parallelism techniques, having implemented graphs using data, component, and pipeline parallelism and Multi File System (MFS) techniques in complex, high-volume Data Warehousing projects on both UNIX and Windows.
- Extensive experience in Korn shell scripting to maximize Ab Initio data parallelism and Multi File System (MFS) techniques.
- Experience providing production support for various Ab Initio ETL jobs and developing UNIX shell wrappers to run Ab Initio and database jobs.
- Good experience working with very large databases and performance tuning.
- Good experience working with various heterogeneous source systems like Oracle, DB2 UDB, Teradata, Netezza, MS SQL Server, flat files, and legacy systems.
- Very good understanding of Teradata's MPP (shared-nothing) architecture, including nodes, AMPs, the BYNET, partitioning, and primary indexes.
- Experience in DBMS utilities such as SQL, PL/SQL, TOAD, SQL*Loader, and Teradata SQL Assistant.
- Good knowledge of Teradata RDBMS architecture, tools, and utilities.
- Experienced with Teradata utilities: FastLoad, MultiLoad, BTEQ scripting, FastExport, OleLoad, and SQL Assistant.
- Skillfully exploited the OLAP analytical power of Teradata, using OLAP functions such as RANK, QUANTILE, CSUM, and MSUM plus GROUP BY GROUPING SETS to generate detailed reports for marketing teams (see the BTEQ sketch at the end of this summary).
- Extensively worked on several Ab Initio ETL assignments to extract, transform, and load data into tables as part of Data Warehouse development, with highly complex data models using relational, star, and snowflake schemas.
- Experienced in all phases of Software Development Life Cycle (SDLC).
- Experience in feed integration and automated data reconciliation.
- Expert knowledge of various Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize, Denormalize, and the Partition and De-partition components.
- Experience in Data Modeling, Data Extraction, Data Migration, Data Integration, Data Testing and Data Warehousing using Ab Initio.
- Experience in application tuning and debugging strategies.
- Exposure to Conduct>It, BRE, and Data Profiler products.
- Knowledge of analyzing data using the Ab Initio Data Profiler to estimate patterns in the data and identify duplicates, frequency, consistency, accuracy, completeness, and referential integrity.
- Knowledge of transformation rules management using the Business Rules Engine (BRE).
- Worked with the ODS (Operational Data Store) and DSS (Decision Support System) on data profiling, data validation, and cleansing processes using Ab Initio.
- Hands-on experience with Metadata Hub administration tools and utilities for creating Metadata Hub data stores.
- Experience in creating and deploying Metadata Hub web applications and loading Metadata Hub customizations.
- Configured the Ab Initio environment to connect to different databases using DB Config, Input Table, Output Table, and Update Table components.
- Experience in using the EME for version control, impact analysis, and dependency analysis.
- Expertise in preparing code documentation in support of application development, including high level and detailed design documents, unit test specifications, interface specifications, etc.
- Excellent communication skills, interacting with people at various levels on all projects and playing an active role in business analysis.
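
A minimal sketch of the kind of Teradata OLAP report referenced above, issued through BTEQ from a Korn shell script. The host, logon, and the sales_daily table and its columns are hypothetical; the legacy CSUM/MSUM/RANK forms are shown only because they are the functions named above.

#!/usr/bin/ksh
# Hedged sketch: run a Teradata OLAP report via BTEQ.
# Host, credentials, and the sales_daily table/columns are hypothetical.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password
SELECT region,
       sale_dt,
       amount,
       CSUM(amount, sale_dt)    AS running_total,   /* cumulative sum by date  */
       MSUM(amount, 7, sale_dt) AS moving_7row_sum, /* 7-row moving sum        */
       RANK(amount DESC)        AS amount_rank      /* legacy Teradata ranking */
FROM   sales_daily
GROUP  BY region;              /* GROUP BY resets the OLAP calculations per region */
.LOGOFF
.QUIT
EOF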
TECHNICAL SKILLS
Data Warehousing Tools: Ab Initio (GDE 3.1.2/3.0.4/3.0.2/1.15/1.14, Co>Operating System 3.0.5/2.15/2.14), Informatica 6.1/7.1x, SSIS, DTS
Data Modeling: Star Schema Modeling, Snowflake Modeling, Erwin 4.0, Visio
RDBMS: Oracle 10g/9i/8i, Teradata 13.0, Netezza 4.6.2, DB2, MS SQL Server 2000/2005/2008
Programming: UNIX Shell Scripting, C/C++, Java, Korn Shell, T-SQL, SQL*Plus, PL/SQL, HTML, ASP.NET
Operating Systems: Windows NT/XP/2000, UNIX, Linux (Red Hat)
BI tools: OBIEE 10.1.3.x, Crystal Reports 8.0/8.5
PROFESSIONAL EXPERIENCE
Confidential, San Francisco, CA
Sr. Ab Initio Developer
Responsibilities:
- Designed and deployed the Extract, Transform, and Load (ETL) process using Ab Initio, based on business requirements gathered from the business users.
- Developed Ab Initio graphs with complex transformation rules through the GDE.
- Developed complex Ab Initio XFRs to derive new fields and meet various business requirements.
- Used the Write XML component to take a stream of data and convert it to an XML document.
- Extensively used Ab Initio components like Join, Rollup, Reformat, and the Partition and De-partition components, as well as functions like is_valid, is_error, is_defined, string_substring, string_concat, and other string functions.
- Implemented lookups, lookup_local, in-memory joins, and rollups to speed up various Ab Initio graphs.
- Implemented 4-way and 6-way multifile systems composed of individual files on different nodes, partitioned and stored in distributed directories, and utilized Ab Initio parallelism techniques, making extensive use of component, data, and pipeline parallelism.
- Responsible for performance tuning of Ab Initio graphs.
- Collected and analyzed the user requirements and the existing application and designed logical and physical data models.
- Worked in the EME metadata environment.
- Ran deployed scripts through UNIX shell wrappers under batch scheduling (a hedged wrapper sketch follows this list).
- Responsible for preparing interface specifications and complete documentation of graphs and their components.
- Performed data validation before moving the data into staging areas, using built-in functions like is_valid, first_defined, is_blank, is_defined, string_length, and string_index.
- Developed strategies for data analysis and data validation.
- Ensured ongoing data quality, including data quality audit benchmarks; communicated monthly data quality metrics and followed prescribed data quality methodologies.
- Provided guidance and quality assurance for all data masking activities; profiled the source system data to identify potential data issues.
- Developed and executed complex SQL for data validation.
- Extensively worked on Continuous Flows technologies like database replication and message queuing.
- Updated and inserted transactional data according to the business changes using Continuous Flows.
- Responsible for unit testing the graphs for data validation and preparing the test reports.
- Implemented security features of Business Objects, like row-level, object-level, and report-level security, to keep the data secure.
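
A minimal sketch of the kind of Korn shell batch wrapper described above; the directory layout, graph name, and log conventions are hypothetical.

#!/usr/bin/ksh
# Hedged sketch of a batch wrapper for a deployed Ab Initio graph.
# The run/log directories and the graph name are hypothetical.
RUN_DIR=/apps/etl/prod/run
LOG_DIR=/apps/etl/prod/logs
GRAPH=load_customers.ksh
STAMP=$(date +%Y%m%d_%H%M%S)
LOG_FILE=$LOG_DIR/${GRAPH%.ksh}_$STAMP.log

print "$(date): starting $GRAPH" >> "$LOG_FILE"
"$RUN_DIR/$GRAPH" >> "$LOG_FILE" 2>&1    # run the deployed graph
rc=$?

if [[ $rc -ne 0 ]]; then
    print "$(date): $GRAPH failed, rc=$rc" >> "$LOG_FILE"
    exit $rc                             # non-zero rc lets the scheduler raise an alert
fi
print "$(date): $GRAPH completed" >> "$LOG_FILE"
exit 0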
Environment: Ab Initio GDE 3.1.3.2, Co>Operating System 3.1.4.4, EME, Korn shell scripting, UNIX, Teradata 14, SQL Server Navigator 5.0, Windows NT/2000.
Confidential, Lincolnshire IL
Ab Initio Developer
Responsibilities:
- Designed and deployed the Extract, Transform, and Load (ETL) process using Ab Initio, based on business requirements gathered from the business users.
- Developed Ab Initio graphs with complex transformation rules through the GDE.
- Developed complex Ab Initio XFRs to derive new fields and meet various business requirements.
- Used the Write XML component to take a stream of data and convert it to an XML document.
- Extensively used Ab Initio components like Join, Rollup, and Reformat, as well as the Partition and De-partition components, and functions like is_valid, is_error, is_defined, string_substring, string_concat, and other string functions.
- Implemented lookups, lookup_local, in-memory joins, and rollups to speed up various Ab Initio graphs.
- Implemented 4-way and 6-way multifile systems composed of individual files on different nodes, partitioned and stored in distributed directories, and utilized Ab Initio parallelism techniques.
- Extensively used Ab Initio's component, data, and pipeline parallelism.
- Profiled several data sets (serial files, multifiles, database tables) and categorized them into different projects and directories using the Ab Initio Data Profiler.
- Used Ab Initio functions to improve the performance of Ab Initio graphs.
- Developed parameterized Ab Initio graphs to increase the performance of the project.
- Used check-in and check-out of graphs from the EME for graph modification and development.
- Developed UNIX Korn shell scripts to run various Ab Initio generated scripts. Prepared and implemented data verification and testing methods for the Data Warehouse.
- Created Metadata Hub data stores using the provided utilities.
- Created and deployed Metadata Hub web applications. Customized the Metadata Explorer so that business users can explore and analyze the metadata, see the contents of systems and applications, and drill down into the details of an object.
- Created new feed files for importing metadata on the command line and in the Metadata Portal. Created rule files for transformations and imported the feeds.
Environment: Ab Initio GDE 3.1.3.2, Co>Operating System 3.1.4.4, EME, Korn shell scripting, UNIX, Teradata 14, SQL Server Navigator 5.0, Windows NT/2000.
Confidential, Memphis, TN
Ab Initio Developer
Responsibilities:
- Created Ab Initio graphs that transfer data from various sources like Oracle, flat files, and CSV files to the Teradata database and to flat files.
- Derived and modeled the facts, dimensions, and aggregated facts in Ab Initio from the data warehouse star schema to create billing and contracts reports.
- Worked on Multi file systems with extensive parallel processing.
- Automated load processes using Autosys.
- Used Lookup Transformation in validating the warehouse customer data.
- Prepared logical/physical diagrams of the data warehouse and presented them to business leaders. Used Erwin for model design.
- Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) into the Teradata RDBMS.
- Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
- Coded and tested Ab Initio graphs to extract the data from Oracle tables and MVS files.
- Made enhancements to the existing system as specified by the customer using COBOL, DB2, and JCL.
- Worked on profiling operational data using the Ab Initio Data Profiler and SQL tools to give business analysts a better understanding of the data available for analytical purposes.
- Extensively used UNIX Shell Scripting for writing SQL execution scripts in Data Loading Process.
- Produced mapping document and ETL design document.
- Worked closely with the end users in writing the functional specifications based on the business needs.
- Extensively used FastLoad, TPump, and TPT as load utilities (a hedged FastLoad sketch follows this list).
- Participated in project review meetings.
- Extensively worked with PL/SQL packages, stored procedures, and functions, and created triggers to implement business rules and validations.
- Responsible for performance tuning of Ab Initio graphs.
- Collected and analyzed the user requirements and the existing application and designed logical and physical data models.
- Worked in the EME environment.
- Ran deployed scripts through UNIX shell wrappers under batch scheduling.
- Responsible for preparing interface specifications and complete documentation of graphs and their components.
- Extensively worked on Continuous Flows technologies like database replication and message queuing.
- Updated and inserted transactional data according to the business changes using Continuous Flows.
- Responsible for unit testing the graphs for data validation and preparing the test reports.
- Implemented security features of Business Objects, like row-level, object-level, and report-level security, to keep the data secure.
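
A minimal sketch of a FastLoad-based bulk load of the kind listed above, run from a Korn shell script. The logon, file, and table names are hypothetical; note that FastLoad requires an empty target table.

#!/usr/bin/ksh
# Hedged sketch: bulk-load a pipe-delimited flat file into an empty
# Teradata stage table with FastLoad. All names are hypothetical.
fastload <<'EOF'
LOGON tdprod/etl_user,etl_password;
DROP TABLE stage_db.customer_err1;   /* clear leftover error tables */
DROP TABLE stage_db.customer_err2;
SET RECORD VARTEXT "|";              /* pipe-delimited input        */
DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60)),
       cust_city (VARCHAR(30))
FILE = /data/inbound/customers.dat;
BEGIN LOADING stage_db.customer_stg
      ERRORFILES stage_db.customer_err1, stage_db.customer_err2;
INSERT INTO stage_db.customer_stg
VALUES (:cust_id, :cust_name, :cust_city);
END LOADING;
LOGOFF;
EOF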
Environment: Ab Initio (Co>Operating System 2.15/2.14, GDE 1.15/1.14), Erwin 4.0, UNIX, MVS, SQL, PL/SQL, Oracle 10g, Teradata V2R6, DB2, COBOL, Perl, Autosys.
Confidential, Atlanta, GA
Ab Initio/Teradata Developer
Responsibilities:
- Developed UNIX Korn Shell scripts to run various Ab Initio generated scripts.
- Developed parameterized Ab Initio graphs for increasing the performance of the Project.
- Worked on improving the performance of Ab Initio graphs using various Ab Initio performance techniques.
- Provided customer support during the warranty period by resolving issues in a timely manner.
- Designed the warehouse architecture and the database using Erwin.
- Worked through the specifications for the Data Warehouse ETL process and interacted with the designers and the end users on informational requirements.
- Analyzed business requirements and developed metadata mappings and Ab Initio DMLs.
- Developed subject area graphs based on business requirements using various Ab Initio components like Filter by Expression, Partition by Expression, Partition by Round Robin, Reformat, Join, Gather, Merge, Rollup, Normalize, Scan, Replicate, etc.
- Extensively used Ab Initio functions like is_valid, is_error, is_defined, string_substring, string_concat, and other string functions.
- Performed data validation before moving the data into staging areas, using built-in functions like is_valid, first_defined, is_blank, is_defined, string_length, and string_index.
- Developed strategies for data analysis and data validation.
- Ensured ongoing data quality, including data quality audit benchmarks; communicated monthly data quality metrics and followed prescribed data quality methodologies.
- Provided guidance and quality assurance for all data masking activities; profiled the source system data to identify potential data issues.
- Used the Ab Initio GDE to generate complex graphs for transforming and loading data into the staging and target database areas.
- Used UNIX environment variables in various .ksh files, which hold the specified locations needed to build Ab Initio graphs (a hedged parameterization sketch follows this list).
- Responsible for writing shell scripts (wrapper) to schedule the jobs in development environment.
- Developed graphs for the ETL processes using Join, Rollup and Reformat transform components as well as Partition and De-partition components extensively.
- Created load-ready files using Ab Initio to load into the database.
- Experience in unit testing, system testing, and debugging.
- Provided 24/7 production support for a wide range of applications (ZWP/ZLP).
- Developed various Ab Initio graphs for data cleansing, using Ab Initio functions such as is_valid, is_error, is_defined, is_null, and various other string functions.
- Resolved issues of various severities during the testing and production phases on time.
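
A minimal sketch of the environment-variable approach mentioned above. The AI_SERIAL/AI_MFS names follow common Ab Initio sandbox conventions; all paths and the graph name are hypothetical.

#!/usr/bin/ksh
# Hedged sketch: export the locations a deployed Ab Initio graph resolves
# through its input parameters. All paths and names are hypothetical.
export PROJECT_DIR=/apps/etl/prod/sandbox
export AI_SERIAL=$PROJECT_DIR/serial        # serial-file location
export AI_MFS=/mfs/prod/mfs_4way            # 4-way multifile system
export SRC_FILE=$AI_SERIAL/claims_feed.dat  # input feed for this run
export TGT_DB=dw_prod                       # target database name

# The deployed graph picks these values up at run time.
$PROJECT_DIR/run/load_claims.ksh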
Environment: Ab Initio (GDE 1.12.6.1, Co>Operating System 2.12.2), UNIX 5.2, Oracle 8.x, Perl, SQL, PL/SQL, TOAD, Windows NT/2000/XP.
Confidential, Cincinnati, OH
ETL Ab Initio Developer
Responsibilities:
- Performed metadata mapping from the legacy source system to the target database fields and was involved in creating Ab Initio DMLs.
- Involved in creating detailed data flows with source-to-target mappings and converting data requirements into low-level design templates.
- Responsible for setting up Repository projects using Ab Initio EME for creating a common development environment that can be used by the team for source code control.
- Implemented various levels of parameter definition like project parameters and graph parameters instead of start and end scripts.
- Developed graphs based on data requirements using various Ab Initio components such as Rollup, Reformat, Join, Scan, Normalize, Gather, Broadcast, Merge, etc., making use of statements/variables in the components to create complex data transformations.
- Used various Teradata utilities such as MultiLoad, API, and FastLoad with the Input Table and Output Table components, depending on the volume of data and the status of the target database table.
- Created generic graphs for loading and unloading Teradata tables, using pre- and post-run SQL components to clean up the work, error, and log (WEL) tables left behind by intermediate process failures.
- Performed data cleansing using Ab Initio functions such as is_valid, is_error, and is_defined.
- Extensively used string_* functions, date functions, and error functions for source-to-target data transformations.
- Well experienced in using Partition components (Partition by Key, Partition by Round Robin) and De-partition components (Concatenate, Gather, Interleave, Merge) to achieve data parallelism.
- Created common graphs to perform common data conversions, usable across applications through a parameterized approach with conditional DMLs.
- Modified Ab Initio graphs to utilize data parallelism and thereby improve overall performance, fine-tuning execution times with multifile systems and lookup files wherever required.
- Implemented a phasing and checkpoint approach in the ETL process to prevent data loss and maintain uninterrupted data flow against process failures.
- Implemented lookups instead of joins, and in-memory sorts, to minimize execution times when dealing with huge volumes of data.
- Replicated operational tables into staging tables, then transformed and loaded the data into warehouse tables using the Ab Initio GDE.
- Deployed and ran the graphs as executable Korn shell scripts on the application system.
- Developed UNIX Korn shell script wrappers to run Ab Initio deployed scripts and perform audit checks, data reconciliation, and error handling to ensure data accuracy (a hedged reconciliation sketch follows).
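
A minimal sketch of the audit-check portion of such a wrapper. The feed file, table, and logon are hypothetical, and the BTEQ output parsing is deliberately simplified.

#!/usr/bin/ksh
# Hedged sketch: reconcile the source feed record count against the
# rows loaded into the target table. All names are hypothetical.
SRC_FILE=/data/inbound/orders.dat
src_count=$(wc -l < "$SRC_FILE")

# Pull the target row count out of BTEQ's output (the digits-only line).
tgt_count=$(bteq 2>/dev/null <<'EOF' | grep -E '^ *[0-9]+ *$' | tr -d ' '
.LOGON tdprod/etl_user,etl_password
SELECT COUNT(*) FROM dw_prod.orders_stg;
.LOGOFF
.QUIT
EOF
)

if [[ "$src_count" -ne "$tgt_count" ]]; then
    print "RECONCILIATION FAILED: source=$src_count target=$tgt_count"
    exit 1
fi
print "Reconciliation OK: $src_count records"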
Environment: Ab Initio (GDE 1.12, Co>Operating System 2.12), UNIX, PL/SQL, Oracle 10g, Teradata V2R6, Queryman, Windows NT/2000.
Confidential
Teradata Developer
Responsibilities:
- Managing databases, tables, indexes, views, and stored procedures.
- Enforcing business rules with triggers and user-defined functions; troubleshooting; and handling replication.
- Writing stored procedures and checking the code for efficiency.
- Daily monitoring of database performance and network issues.
- Administering the Teradata server: creating user logins with appropriate roles, dropping and locking logins, monitoring user accounts, creating groups, and granting privileges to users and groups via SQL authentication (a hedged SQL sketch follows this list).
- Rebuilding indexes on various tables.
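
A minimal sketch of the user-administration statements described above, run through BTEQ. The user, role, database, and space figures are hypothetical.

#!/usr/bin/ksh
# Hedged sketch: Teradata user/role administration via BTEQ.
# User, role, database, and space figures are hypothetical.
bteq <<'EOF'
.LOGON tdprod/dbadmin,admin_password
CREATE USER etl_dev FROM dev_db AS
    PASSWORD = initial_pw,
    PERM = 100e6,                      /* permanent space in bytes */
    SPOOL = 500e6;                     /* spool space in bytes     */
CREATE ROLE report_reader;
GRANT SELECT ON dw_prod TO report_reader;
GRANT report_reader TO etl_dev;
REVOKE LOGON ON ALL FROM etl_dev;      /* example of locking a login */
.LOGOFF
.QUIT
EOF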
Environment: Teradata RDBMS, BTEQ, FastLoad, MultiLoad, FastExport, Teradata Manager, Teradata SQL Assistant, Rational ClearQuest, UNIX, MQ, NDM, FTP