Ab Initio Developer Resume
Austin
SUMMARY
- Over eight years of professional IT experience in business analysis, design, data modeling, development and implementation of client/server and decision-support environments, with a focus on Data Warehousing, Business Intelligence and Database Applications.
- Over six years of Ab Initio consulting covering data mapping, transformation and loading from source to target databases; well versed in Ab Initio parallelism techniques, having implemented graphs using data, component and pipeline parallelism and Multi File System (MFS) techniques in complex, high-volume Data Warehousing projects on both UNIX and Windows.
- Extensive experience in Korn shell scripting to maximize Ab Initio data parallelism and Multi File System (MFS) techniques.
- Experience in providing production support for various Ab Initio ETL jobs and developing UNIX shell wrappers to run Ab Initio and database jobs (a minimal wrapper sketch appears at the end of this summary).
- Developed various UNIX shell scripts to run Ab Initio and database jobs; good experience working with very large databases and with performance tuning.
- Good experience working with heterogeneous source systems such as Oracle, DB2 UDB, Teradata, Netezza, MS SQL Server, flat files and legacy systems.
- Very good understanding of Teradata's MPP architecture, including its shared-nothing design, Nodes, AMPs, the BYNET, partitioning and Primary Indexes.
- Experience in DBMS Utilities such as SQL, PL/SQL, TOAD, SQL*Loader, Teradata SQL Assistant.
- Good knowledge of Teradata RDBMS architecture, tools and utilities.
- Experienced with the Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport, OleLoad and SQL Assistant.
- Skillfully exploited the OLAP analytical power of Teradata, using OLAP functions such as RANK, QUANTILE, CSUM, MSUM and GROUP BY GROUPING SETS to generate detailed reports for marketing teams.
- Worked with Transform components such as Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup and Scan; created the corresponding XFRs and DMLs; automated load processes using Autosys.
- Extensively worked on several Ab Initio ETL assignments to extract, transform and load data into tables as part of Data Warehouse development, with highly complex data models using relational, star and snowflake schemas.
- Experienced in all phases of Software Development Life Cycle (SDLC).
- Experience in feed integration and automated data reconciliation.
- Expert knowledge of various Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize, Denormalize and the partitioning and departitioning components.
- Experience in Data Modeling, Data Extraction, Data Migration, Data Integration, Data Testing and Data Warehousing using Ab Initio.
- Experience in application tuning and debugging strategies.
- Exposure to the Conduct>It, BRE and Data Profiler products.
- Knowledge of analyzing data with the Ab Initio Data Profiler to estimate patterns in the data and to identify duplicates, frequency, consistency, accuracy, completeness and referential integrity.
- Knowledge of transformation-rules management using the Business Rules Engine (BRE).
- Worked with an ODS (Operational Data Store) and DSS (Decision Support System) to perform data profiling, data validation and cleansing using Ab Initio.
- Experience using the Metadata Importer to import metadata from an EME technical repository and from other sources such as ETL tools (Informatica), reporting tools (Cognos, SAS, Business Objects) and databases (Oracle, Teradata, DB2).
- Hands-on experience with Metadata Hub administration tools and the utilities for creating Metadata Hub data stores.
- Experience in creating and deploying Metadata Hub web applications and loading Metadata Hub customizations.
- Configured the Ab Initio environment to connect to different databases using DB Config files and the Input Table, Output Table and Update Table components.
- Experience in using EME for version controls, impact analysis and dependency analysis.
- Able to interact effectively with members of the Business Engineering, Quality Assurance and user teams, and with others involved in the System Development Life Cycle.
- Expertise in preparing code documentation in support of application development, including high level and detailed design documents, unit test specifications, interface specifications, etc.
- Excellent communication skills, interacting with people at all levels across projects and playing an active role in business analysis.
- Managed multiple projects/tasks within the Mortgage, Mantas, and Banking & Financial Services industries in high-transaction processing environments, with excellent analytical, business-process, written and verbal communication skills.
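A minimal sketch of the shell-wrapper pattern mentioned above, assuming a deployed graph under a sandbox run directory; the graph name, sandbox path and log directory here are hypothetical:

    #!/bin/ksh
    # Minimal illustrative wrapper for a deployed Ab Initio graph (.ksh).
    # Graph name, sandbox path and log directory are placeholders.
    GRAPH=load_customer_dim.ksh
    SANDBOX=/apps/dw/sandbox
    LOGDIR=/apps/dw/logs
    TS=$(date +%Y%m%d_%H%M%S)
    LOG=$LOGDIR/${GRAPH%.ksh}_$TS.log

    print "$(date): starting $GRAPH" | tee -a "$LOG"
    "$SANDBOX/run/$GRAPH" >> "$LOG" 2>&1
    rc=$?

    if [[ $rc -ne 0 ]]; then
        print "$(date): $GRAPH failed with rc=$rc" | tee -a "$LOG"
        exit $rc    # non-zero exit lets the scheduler (e.g. Autosys) raise an alarm
    fi
    print "$(date): $GRAPH completed successfully" | tee -a "$LOG"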
TECHNICAL SKILLS
Data warehousing Tools: Ab Initio (GDE 3.1.2/3.0.4/3.0.2/1.15/1.14, Co>Operating System 3.0.5/2.15/2.14), Informatica 6.1/7.1x, SSIS, DTS
Data Modeling: Star-Schema Modeling, Snowflake Modeling, Erwin 4.0, Visio
RDBMS: Oracle 10g/9i/8i, Teradata 13.0, Netezza 4.6.2, DB2, MS SQL Server 2000/2005/2008
Programming: UNIX Shell Scripting, C/C++, Java, Korn Shell, T-SQL, SQL*Plus, PL/SQL, HTML, ASP.NET
Operating Systems: Windows NT/XP/2000, UNIX, Linux (Red Hat)
BI tools: OBIEE 10.1.3.x, Crystal Reports 8.0/8.5
PROFESSIONAL EXPERIENCE:
Confidential, Austin
Ab Initio Developer
Responsibilities:
- Used Ab Initio components to extract and transfer data from multiple operational data sources (Teradata, DB2 UDB, SQL Server and Oracle) to destination data marts in Oracle.
- Expertise with various Ab Initio components such as Join, Rollup, Lookup, Replicate, Partition by Expression, Partition by Key, Partition by Round Robin, Gather, Merge, Interleave, Dedup Sorted, Sort, Filter by Expression, Scan, Validate, Reformat, FTP and Compare Records.
- Implemented a number of Ab Initio graphs using Data parallelism and Multi File System (MFS) techniques.
- Extensive experience in developing transformations between source and target using Ab Initio data mappings, cleansing the data, applying transformations and loading into a complex, high-volume environment.
- Extensively used UNIX Shell Scripting for writing SQL execution scripts in Data Loading Process.
- Wrote SQL scripts used by the database components of Ab Initio to extract data from different source tables and to load the target tables via the Update Table and Output Table components, supported by a config (.cfg) file in the graphs.
- Used Ab Initio components like Reformat, Input file, Output file, Join, Sort, Partition By key, Normalize, Input Table, Output Table, Update Table, Gather Logs and Run SQL for developing graphs.
- Performed data-cleansing operations using transformation functions such as is_valid, is_defined, is_null, is_blank, string_lrtrim, re_index, re_interpret_as, string_concat, string_substring, lookup_count, lookup_first, now(), decimal_strip, re_replace, decimal_lpad, next_in_sequence(), length_of(), test_characters_all(), force_error(), switch(), first_defined() and lookup_match(), along with conditional DML and the cobol-to-dml and xml-to-dml utilities.
- Used phases and checkpoints to avoid deadlocks, used multifiles in graphs, and used the Run Program and Run SQL components to run UNIX commands and SQL.
- Excellent understanding of the System Development Life Cycle, with a clear and thorough understanding of business process and workflow; involved in all four phases (planning, analysis, design and implementation); experienced in testing, documentation and requirements gathering.
- Provided application requirements gathering, designing, development, technical documentation, and debugging. Assisted team members in defining Cleansing, Aggregating, and other Transformation rules.
- Able to interact effectively with members of the Business Engineering, Quality Assurance and user teams, and with others involved in the System Development Life Cycle.
- Used UNIX environment variables in various .ksh files, which hold the locations needed to build Ab Initio graphs.
- Extensively used the Teradata utilities BTEQ, FastLoad and MultiLoad along with DDL and DML commands (SQL); created various Teradata macros in SQL Assistant to serve the analysts.
- Helped business users by writing complex, efficient Teradata SQL to produce detailed extracts for data mining; automated these extracts using BTEQ and UNIX shell scripting (a minimal sketch follows this list).
- Used Enterprise Meta Environment (EME) for version control.
- Wrote complex SQL using joins, subqueries and correlated subqueries; expertise in SQL queries for cross-verification of data.
- Created Metadata Hub data stores using the supplied utilities.
- Created and deployed Metadata Hub web applications; customized the Metadata Explorer so that business users can explore and analyze the metadata, see the contents of systems and applications, and drill down into the details of an object.
- Created new feed files for importing metadata on the command line as well as in the Metadata Portal; created rule files for transformations and imported the feeds.
- Created Data Source Connection files for connecting to the graphs in order to extract their metadata.
- Generated metadata reports and performed auditing.
- Added and exposed the divisions in the Metadata Portal.
- Exposed the Notes tab and set up the various note types in the Metadata Portal.
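A hedged sketch of the BTEQ/ksh extract automation described above; the TDPID, credentials, database objects and export path are placeholders, not actual project values:

    #!/bin/ksh
    # Illustrative BTEQ extract run from a ksh wrapper. The logon string,
    # table names and export path below are placeholders.
    EXPORT_FILE=/apps/dw/extracts/acct_summary.txt

    bteq <<EOF
    .LOGON tdprod/etl_user,********;
    .EXPORT REPORT FILE=$EXPORT_FILE;
    SELECT acct_id, acct_type, SUM(balance) AS total_balance
    FROM   dw_db.account
    GROUP  BY acct_id, acct_type
    ORDER  BY acct_id;
    .EXPORT RESET;
    .LOGOFF;
    .QUIT;
    EOF

    [[ $? -ne 0 ]] && { print "BTEQ extract failed"; exit 1; }
    print "Extract written to $EXPORT_FILE"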
Environment: Ab Initio GDE 3.2, Co-Op 3.1.4.4, EME, Korn shell scripting, UNIX, Teradata 15, SQL Server Navigator 5.0, Windows NT/2000.
Confidential
Role: Sr. Ab Initio Developer
Responsibilities:
- Designed and deployed the Extract, Transform and Load process in Ab Initio after studying the business requirements with the business users.
- Developed Ab Initio graphs with complex transformation rules through the GDE.
- Developed complex Ab Initio XFRs to derive new fields and meet various business requirements.
- Developed a graph using the Write XML component to take a stream of data and convert it to an XML document.
- Extensively used Ab Initio components such as Join, Rollup and Reformat, as well as the partition and departition components, and functions such as is_valid, is_error, is_defined, string_substring, string_concat and other string functions.
- Implemented lookups, lookup_local, in-memory joins and rollups to speed up various Ab Initio graphs.
- Implemented 4-way and 6-way multifile systems composed of individual files on different nodes, partitioned and stored in distributed directories; extensively used Ab Initio's component, data and pipeline parallelism (see the MFS sketch after this list).
- Responsible for Performance-tuning of Ab Initio graphs.
- Collected and analyzed the user requirements and the existing application, and designed logical and physical data models.
- Worked in the EME metadata environment.
- Ran jobs through UNIX shell scripts under batch scheduling.
- Responsible for preparing interface specifications and complete documentation of graphs and their components.
- Performed data validation before moving the data into staging areas, using built-in functions such as is_valid, first_defined, is_blank, is_defined, string_length and string_index.
- Developed Strategies for Data Analysis and Data Validation.
- Ensured ongoing data quality, including data-quality audit benchmarks; communicated monthly data-quality metrics and followed the prescribed data-quality methodologies.
- Provided guidance and quality assurance for all data-masking activities; profiled the source-system data to identify potential data issues.
- Developed and executed complex SQL for data validation.
- Extensively worked with the Continuous Flows technologies, including database replication and message queuing.
- Updated and inserted transactional data in line with business changes using Continuous Flows.
- Responsible for unit-testing the graphs for data validation and preparing the test reports.
- Implemented Business Objects security features (row level, object level and report level) to keep the data secure.
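For reference, creating a 4-way multifile system like the one above is typically a one-time m_mkfs call. This is a sketch assuming standard m_mkfs usage (control directory first, then one data partition per node); the hosts and directories are hypothetical:

    # Illustrative 4-way MFS: control directory followed by four data partitions.
    # Hosts and directories are placeholders.
    m_mkfs //node1/u/ab/mfs/mfs4 \
           //node1/u/ab/data/part1 \
           //node2/u/ab/data/part2 \
           //node3/u/ab/data/part3 \
           //node4/u/ab/data/part4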
Environment: Ab Initio GDE 3.1.3.2, Co-Op 3.1.4.4, EME, Korn shell scripting, UNIX, Teradata 14, SQL Server Navigator 5.0, Windows NT/2000.
Confidential, OR
Role: Sr. Ab Initio Developer
Responsibilities:
- Designed and deployed the Extract, Transform and Load process in Ab Initio after studying the business requirements with the business users.
- Developed Ab Initio graphs with complex transformation rules through the GDE.
- Developed complex Ab Initio XFRs to derive new fields and meet various business requirements.
- Developed a graph using the Write XML component to take a stream of data and convert it to an XML document.
- Extensively used Ab Initio components such as Join, Rollup and Reformat, as well as the partition and departition components, and functions such as is_valid, is_error, is_defined, string_substring, string_concat and other string functions.
- Implemented lookups, lookup_local, in-memory joins and rollups to speed up various Ab Initio graphs.
- Implemented 4-way and 6-way multifile systems composed of individual files on different nodes, partitioned and stored in distributed directories, utilizing Ab Initio parallelism techniques.
- Extensively used Ab Initio's component, data and pipeline parallelism.
- Profiled several data sets (serial files, multifiles, database tables) and categorized them into different projects and directories using the Ab Initio Data Profiler.
- Used Ab Initio functions to improve the performance of Ab Initio graphs.
- Developed parameterized Ab Initio graphs to increase the performance of the project.
- Used check-in and check-out of graphs from the EME for graph modification and development (see the air-command sketch after this list).
- Developed UNIX Korn shell scripts to run various Ab Initio-generated scripts. Prepared and implemented data verification and testing methods for the Data Warehouse.
- Created Metadata Hub data stores using the supplied utilities.
- Created and deployed Metadata Hub web applications; customized the Metadata Explorer so that business users can explore and analyze the metadata, see the contents of systems and applications, and drill down into the details of an object.
- Created new feed files for importing metadata on the command line as well as in the Metadata Portal; created rule files for transformations and imported the feeds.
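The EME check-out/check-in workflow above can also be driven from the command line. A sketch assuming the standard air project export/import commands; the project and sandbox paths are hypothetical, and exact options vary by Co>Operating System version:

    # Check a project out of the EME into a developer sandbox (paths are placeholders).
    air project export /Projects/dw/orders -basedir /u/dev/sandboxes/orders

    # ...modify graphs in the sandbox through the GDE...

    # Check the changed graph back in with a comment.
    air project import /Projects/dw/orders -basedir /u/dev/sandboxes/orders \
        -files mp/load_orders.mp -comment "retuned join parameters"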
Environment: Ab Initio GDE 3.1.3.2, Co-Op 3.1.4.4, EME, Korn shell scripting, UNIX, Teradata 14, SQL Server Navigator 5.0, Windows NT/2000.
Confidential, San Francisco, CA
Role: Sr. Ab Initio Developer
Responsibilities:
- Created Ab Initio graphs that transfer data from various sources such as Oracle, flat files and CSV files to the Teradata database and to flat files.
- Derived and modeled the facts, dimensions and aggregated facts in Ab Initio from the data warehouse star schema to create billing and contracts reports.
- Worked on Multi file systems with extensive parallel processing.
- Automated load processes using Autosys.
- Used the Lookup transformation in validating the warehouse customer data.
- Prepared logical/physical diagrams of the data warehouse and presented them to business leaders; used Erwin for model design.
- Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) into the Teradata RDBMS.
- Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
- Coded and tested Ab Initio graphs to extract data from Oracle tables and MVS files.
- Made enhancements to the existing system, as specified by the customer, using COBOL, DB2 and JCL.
- Profiled operational data using the Ab Initio Data Profiler/SQL tool to give business analysts a better understanding of the data available for analytical purposes.
- Extensively used UNIX Shell Scripting for writing SQL execution scripts in Data Loading Process.
- Produced mapping document and ETL design document.
- Worked closely with the end users in writing the functional specifications based on the business needs.
- Extensively used FastLoad, TPump and TPT as load utilities (a minimal FastLoad sketch follows this list).
- Participated in project review meetings.
- Extensively worked with PL/SQL Packages, Stored procedures & functions and created triggers to implement business rules and validations.
- Responsible for Performance-tuning of Ab Initio graphs.
- Collected and analyzed the user requirements and the existing application, and designed logical and physical data models.
- Worked in the EME environment.
- Ran jobs through UNIX shell scripts under batch scheduling.
- Responsible for preparing interface specifications and complete documentation of graphs and their components.
- Extensively worked with the Continuous Flows technologies, including database replication and message queuing.
- Updated and inserted transactional data in line with business changes using Continuous Flows.
- Responsible for unit-testing the graphs for data validation and preparing the test reports.
- Implemented Business Objects security features (row level, object level and report level) to keep the data secure.
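A minimal FastLoad sketch of the bulk-load pattern referenced above; the TDPID, credentials, table, record layout and file path are all placeholders:

    #!/bin/ksh
    # Illustrative FastLoad of a pipe-delimited flat file into an empty
    # staging table; every name and the logon string are placeholders.
    fastload <<EOF
    LOGON tdprod/etl_user,********;
    DATABASE stg_db;
    BEGIN LOADING stg_db.customer_stg
          ERRORFILES stg_db.customer_err1, stg_db.customer_err2;
    SET RECORD VARTEXT "|";
    DEFINE cust_id   (VARCHAR(18)),
           cust_name (VARCHAR(60)),
           open_dt   (VARCHAR(10))
    FILE = /apps/dw/landing/customer.dat;
    INSERT INTO stg_db.customer_stg
    VALUES (:cust_id, :cust_name, :open_dt);
    END LOADING;
    LOGOFF;
    EOF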
Environment: Ab Initio (Co>Operating System 2.15/2.14, GDE 1.15/1.14), Erwin 4.0, UNIX, MVS, SQL, PL/SQL, Oracle 10g, Teradata V2R6, DB2, COBOL, Perl, Autosys.
Confidential, TX
Role: Ab Initio Developer
Responsibilities:
- Developed UNIX Korn Shell scripts to run various Ab Initio generated scripts.
- Developed parameterized Ab Initio graphs to increase the performance of the project.
- Improved the performance of Ab Initio graphs using various Ab Initio performance techniques.
- Provided customer support during the warranty period by resolving issues in a timely manner.
- Experience in designing the warehouse architecture and the database using Erwin.
- Experience in interpreting the specifications for the Data Warehouse ETL process; interacted with the designers and the end users on informational requirements.
- Analyzed business requirements and developed metadata mappings and Ab Initio DMLs.
- Developed subject-area graphs based on business requirements using various Ab Initio components such as Filter by Expression, Partition by Expression, Partition by Round Robin, Reformat, Join, Gather, Merge, Rollup, Normalize, Scan and Replicate.
- Extensively used Ab Initio functions such as is_valid, is_error, is_defined, string_substring, string_concat and other string functions.
- Performed data validation before moving the data into staging areas, using built-in functions such as is_valid, first_defined, is_blank, is_defined, string_length and string_index.
- Developed Strategies for Data Analysis and Data Validation.
- Ensured ongoing data quality, including data-quality audit benchmarks; communicated monthly data-quality metrics and followed the prescribed data-quality methodologies.
- Provided guidance and quality assurance for all data-masking activities; profiled the source-system data to identify potential data issues.
- Used the Ab Initio GDE to generate complex graphs for transforming data and loading it into the staging and target database areas.
- Used UNIX environment variables in various .ksh files, which hold the locations needed to build Ab Initio graphs.
- Responsible for writing wrapper shell scripts to schedule the jobs in the development environment (a lock-file wrapper sketch follows this list).
- Developed graphs for the ETL processes using the Join, Rollup and Reformat transform components as well as the partition and departition components.
- Created load-ready files using Ab Initio to load into the database.
- Experience in Unit testing, System testing and debugging.
- Provided 24/7-production support for a wide range of applications ZWP/ZLP.
- Developed various Ab Initio graphs for data cleansing using Ab Initio functions such as is_valid, is_error, is_defined, is_null and various other string functions.
- Resolved issues of various severities during the testing and production phases on time.
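A sketch of the wrapper-script scheduling pattern above, using a simple lock file to prevent overlapping runs; the job name and paths are hypothetical:

    #!/bin/ksh
    # Illustrative dev-environment wrapper: a lock file stops a second copy of
    # the job from starting while one is still running. Paths are placeholders.
    JOB=stage_orders
    LOCK=/tmp/${JOB}.lock

    if [[ -f $LOCK ]]; then
        print "$JOB already running (lock $LOCK exists); exiting"
        exit 0
    fi
    trap 'rm -f "$LOCK"' EXIT      # release the lock however the job ends
    print $$ > "$LOCK"

    /apps/dw/sandbox/run/${JOB}.ksh   # the deployed Ab Initio graph
    exit $?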
Environment: Ab Initio (GDE 1.12.6.1, Co>Operating System 2.12.2), UNIX 5.2, Oracle 8.x, Perl, SQL/PL-SQL, TOAD, Windows NT/2000/XP.
Confidential, Atlanta, GA
Role: Ab Initio Developer
Responsibilities:
- Performed Metadata Mapping from legacy source system to target database fields and involved in creating Ab Initio DMLs.
- Involved in creating detailed data flows with source and target mappings, and converted data requirements into low-level design templates.
- Responsible for setting up repository projects using the Ab Initio EME to create a common development environment the team can use for source-code control.
- Implemented various levels of parameter definition, such as project parameters and graph parameters, instead of start and end scripts.
- Developed graphs based on data requirements using various Ab Initio components such as Rollup, Reformat, Join, Scan, Normalize, Gather, Broadcast and Merge, making use of statements/variables in the components to create complex data transformations.
- Used various Teradata utilities such as MultiLoad, API and FastLoad with the Input Table and Output Table components, depending on the volume of data and the status of the target database table.
- Created generic graphs for loading and unloading Teradata tables, using pre- and post-Run SQL components to clean up the work/error/log (WEL) tables left behind by intermediate process failures.
- Performed data cleansing using Ab Initio functions such as is_valid, is_error and is_defined.
- Extensively used string_* functions, date functions and error functions for source-to-target data transformations.
- Well experienced in using the partition components (Partition by Key, Partition by Round Robin) and the departition components (Concatenate, Gather, Interleave and Merge) to achieve data parallelism.
- Created common graphs to perform shared data conversions across applications, using a parameterized approach with conditional DMLs.
- Modified Ab Initio graphs to exploit data parallelism and thereby improve overall performance, fine-tuning execution times with multifile systems and lookup files wherever required.
- Implemented phasing and checkpoint approach in ETL process to prevent data loss and to maintain uninterrupted data flow against process failures.
- Implemented lookups instead of joins and used in-memory sorts to minimize execution times when dealing with huge volumes of data.
- Replicated operational tables into staging tables, then transformed and loaded the data into warehouse tables using the Ab Initio GDE.
- Deployed and ran the graphs as executable Korn shell scripts on the application system.
- Developed UNIX Korn shell wrappers to run Ab Initio deployed scripts and to perform audit checks/data reconciliation and error handling to ensure data accuracy (a reconciliation sketch follows this list).
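A hedged sketch of the audit/reconciliation logic described in the last bullet; the file, table and logon values are placeholders, and the BTEQ output parsing is deliberately simplified:

    #!/bin/ksh
    # Illustrative audit check: compare the source-file record count with the
    # target-table row count. File, table and logon values are placeholders.
    SRC_CNT=$(wc -l < /apps/dw/landing/orders.dat)

    bteq > /tmp/recon_out.txt 2>&1 <<EOF
    .LOGON tdprod/etl_user,********;
    SELECT COUNT(*) FROM dw_db.orders_stg;
    .LOGOFF; .QUIT;
    EOF

    # BTEQ echoes the count on a line of its own; pull the last numeric line.
    TGT_CNT=$(awk '/^ *[0-9]+ *$/ {n=$1} END {print n+0}' /tmp/recon_out.txt)

    if (( SRC_CNT != TGT_CNT )); then
        print "Reconciliation FAILED: source=$SRC_CNT target=$TGT_CNT"
        exit 1
    fi
    print "Reconciliation OK: $SRC_CNT records"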
Environment: Ab Initio (GDE 1.12, Co-op 2.12), UNIX, PL/SQL, Oracle 10g, Teradata V2R6, Queryman, Windows NT/2000.
Confidential
Teradata Developer
Responsibilities:
- Managing databases, tables, indexes, views and stored procedures.
- Enforcing business rules with triggers and user-defined functions; troubleshooting and replication.
- Writing stored procedures and checking the code for efficiency.
- Daily monitoring of database performance and network issues.
- Administering the Teradata server: creating user logins with appropriate roles, dropping and locking logins, monitoring user accounts, creating groups, and granting privileges to users and groups; SQL authentication (a BTEQ administration sketch follows this list).
- Rebuilding indexes on various tables.
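A sketch of the role/user administration described above, issued through BTEQ; the role, user, database and logon values are all hypothetical:

    #!/bin/ksh
    # Illustrative Teradata administration via BTEQ: create a role, grant it
    # privileges, create a user and assign the role. All names are placeholders.
    bteq <<EOF
    .LOGON tdprod/dbadmin,********;
    CREATE ROLE etl_reader;
    GRANT SELECT ON dw_db TO etl_reader;
    CREATE USER report_usr AS
        PERM = 0,
        PASSWORD = TempPass01,
        DEFAULT DATABASE = dw_db;
    GRANT etl_reader TO report_usr;
    .LOGOFF;
    .QUIT;
    EOF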
Environment: Teradata RDBMS, BTEQ, FastLoad, MultiLoad, FastExport, Teradata Manager, Teradata SQL Assistant, Rational ClearQuest, UNIX, MQ, NDM, FTP