
Sr. Ab Initio Developer Resume


CA

SUMMARY

  • Over six years of professional IT experience in Business Analysis, Design, Data Modeling, Development and Implementation of various client/server and decision support system environments, with a focus on Data Warehousing, Business Intelligence and Database Applications.
  • Over 5 years of Ab Initio Consulting with Data mapping, Transformation and Loading from Source to Target Databases in a complex, high volume environment.
  • Over 3 years as an ETL Administrator installing and configuring Ab Initio software: upgrades, patches, removal of older versions, consolidation of servers, and configuring and maintaining EME repositories.
  • Extensively worked on several ETL Ab Initio assignments to extract, transform and load data into tables as part of Data Warehouse development, with highly complex data models of Relational, Star, and Snowflake schemas.
  • Expert knowledge in using various Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize, Denormalize, and the Partition and De-partition components.
  • Experience in Data Modeling, Data Extraction, Data Migration, Data Integration, Data Testing and Data Warehousing using Ab Initio.
  • Well versed with various Ab Initio parallelism techniques and implemented Ab Initio Graphs using Data, Component, pipeline parallelism and Multi File System (MFS) techniques.
  • Configured Ab Initio environment to connect to different databases using DB config, Input Table, Output Table, Update table Components.
  • Very good understanding of Teradata MPP architecture such as Shared Nothing, Nodes, AMPs, BYNET, Partitioning, Primary Indexes etc. Extensively used different features of Teradata such as BTEQ, Fast load, Multiload, SQL Assistant, DDL and DML commands.
  • Experience in using EME for version controls, impact analysis and dependency analysis.
  • Extensive experience in Korn Shell Scripting to maximize Ab-Initio data parallelism and Multi File System (MFS) techniques.
  • Experience in providing production support for various Ab Initio ETL jobs and developing UNIX shell wrappers to run Ab Initio and database jobs (a minimal wrapper sketch follows this list).
  • Good experience working with very large databases and performance tuning.
  • Good Experience working with various Heterogeneous Source Systems like Oracle, DB2 UDB, Teradata, MS SQL Server, Flat files and Legacy Systems.
  • Very good understanding of the several relational Databases such as Teradata, Oracle and DB2. Wrote several complex SQL queries using sub queries, join types, temporary tables, OLAP functions etc.
  • Experience in DBMS Utilities such as SQL, PL/SQL, TOAD, SQL*Loader, Teradata SQL Assistant.
  • Good knowledge of Teradata RDBMS Architecture, Tools & Utilities.
  • Experienced with Teradata utilities Fast Load, Multi Load, BTEQ scripting, FastExport, OleLoad, SQL Assistant.
  • Skillfully exploited the OLAP analytical power of Teradata using OLAP functions such as Rank, Quantile, Csum, MSum and GROUP BY GROUPING SETS to generate detail reports for marketing teams (see the OLAP query sketch after this list).
  • Able to interact effectively with other members of the Business Engineering, Quality Assurance and User teams, and with other teams involved in the System Development Life Cycle.
  • Expertise in preparing code documentation in support of application development, including High level and detailed design documents, unit test specifications, interface specifications, etc.
  • Excellent Communication skills in interacting with various people of different levels on all projects and also playing an active role in Business Analysis.
  • Managed multiple projects/tasks within the Mortgage, Banking & Financial Services industries in high-transaction processing environments, with excellent analytical, business process, written and verbal communication skills.
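
A minimal sketch of the kind of Korn shell wrapper used to run a deployed Ab Initio graph and flag failures; the install location, deploy area, graph name and mail address are all hypothetical:

#!/bin/ksh
# Hypothetical wrapper for a deployed Ab Initio graph (paths and names are illustrative).
# A graph deployed from the GDE becomes a .ksh script that can be run from the shell.
export AB_HOME=/opt/abinitio/abinitio-V2-15        # assumed Co>Operating System install
export PATH=$AB_HOME/bin:$PATH

GRAPH=/apps/etl/deploy/run/load_customer_dim.ksh   # hypothetical deployed graph
LOG=/apps/etl/logs/load_customer_dim_$(date +%Y%m%d).log

# Run the graph and capture its log; a non-zero return code means the graph failed.
$GRAPH > $LOG 2>&1
RC=$?
if [ $RC -ne 0 ]; then
    mailx -s "ETL FAILURE: load_customer_dim rc=$RC" etl_support@example.com < $LOG
    exit $RC
fi
exit 0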
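
A sketch of the kind of detail query behind those marketing reports, run through BTEQ and using the ANSI window equivalents of Teradata's RANK/CSUM-style OLAP functions; the logon, database, table and column names are illustrative, and TD_PASSWORD is assumed to be set in the environment:

#!/bin/ksh
# Hypothetical BTEQ extract of a campaign detail report.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWORD};
.EXPORT REPORT FILE = /tmp/campaign_detail.txt;

SELECT campaign_id
     , response_dt
     , responses
     /* ANSI window equivalents of the Teradata RANK / CSUM OLAP functions */
     , RANK() OVER (PARTITION BY campaign_id ORDER BY responses DESC) AS resp_rank
     , SUM(responses) OVER (PARTITION BY campaign_id
                            ORDER BY response_dt
                            ROWS UNBOUNDED PRECEDING)                 AS running_total
FROM  mktg_db.campaign_response;

.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF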

TECHNICAL SKILLS

ETL Tools: Ab Initio (GDE 1.15, Co>Operating System 2.15)

RDBMS: Teradata V2R6.2, Oracle 8i/9i/10g, SQL Server, DB2, MS Access.

DB Tools: SQL*Loader, Import/Export, TOAD, Teradata SQL Assistant

Languages: C, C++, SQL, PL/SQL, XML, PERL Scripting and Korn Shell Scripting.

Platforms: Sun Solaris 7.0, HP-UX, AIX, Windows NT/XP/2000/98/95, Mainframes, MS-DOS.

Modeling Tools: Erwin, MS Visio

PROFESSIONAL EXPERIENCE

Confidential, CA

Sr. Ab Initio Developer

Responsibilities:

  • Ab Initio Developer for the creation of the Bank Data Mart, which includes CD and MM products for analytical and reporting use.
  • Coordinated with the vendor Fiserv to schedule transmission of Bank operational data on a daily basis and to have it catalogued in the EDS.
  • Did detailed profiling of the operational data using the Ab Initio Data Profiler/SQL tools to get a better understanding of the data that business analysts could use for analytical purposes.
  • Participated in several JAD sessions with analysts from the business side to refine the requirements.
  • Created a Bank Data Mart POC at the Data Mart level in user Teradata space to validate the requirements with users and to produce a better mapping document with the right transformations.
  • Participated in an Agile iterative methodology with the help of the BT Project Manager. The iterations covered CD, MM, and integration with Card products using Customer Identification Number, marketing campaigns, and IVR call and agent response data.
  • Implemented bank data marts in ODS, EDW, DM and ADB (Application Database). Coordinated with Enterprise Warehouse architects to follow the corporate standards for the implementation. Used existing Metadata, Audit and ETL frameworks.
  • Involved in creation of Logical and Physical models using Erwin for ODS, EDW, DM and ADB, and created DDLs for the DBA to create structures in the Teradata development, staging and production environments. The modeling was done through JAD sessions with involvement from Enterprise Architects and Business Users.
  • Evaluated existing Teradata Industry logical data model (ILDM) related to Financial Services/Banking to be used for Banking Data Mart.
  • Created mapping document for all the above 4 environments and ETL design document for the ETL developers to code.
  • Extensively used ETL to load data from Oracle databases, XSD/XML files and flat files, and also imported data from IBM Mainframes.
  • Involved in walkthroughs of the Data Models and ETL design documents with ETL developers before each ETL coding iteration. Did the integration testing and was involved in UAT with business users. Did the Ab Initio ETL code walkthrough and made some performance improvements.
  • Automated the entire Bank Data Mart process using Unix Shell scripts and scheduled the process using Autosys after dependency analysis.
  • Extensively worked on continuous flows such as database replication and message queuing.
  • Created proper Teradata Primary Indexes (PI) taking into consideration both planned access of data and even distribution of data across all available AMPs. Considering both the business requirements and these factors, created appropriate Teradata NUSIs for fast and easy access of data (see the DDL sketch following this list).
  • Extensively used the Teradata utilities like BTEQ, FastLoad, MultiLoad, TPump, DDL commands and DML commands (SQL). Created various Teradata Macros in SQL Assistant to serve the analysts.
  • Created a backfill strategy to load the past 13 months of data into the Bank Data Warehouse. Created several BTEQ, FastLoad and MultiLoad scripts to load the backfill data into the Data Warehouse (a FastLoad backfill sketch follows this list).
  • Involved heavily in writing complex SQL queries based on the given requirements such as complex Teradata Joins, Stored Procedures, and Macros etc.
  • Did the performance tuning for Teradata SQL statements using Teradata Explain command.
  • Extensively used the Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sort, Rollup, Scan, FTP, Lookup, Normalize, Denormalize, Input, Output and Join With DB. Used Ab Initio features like MFS (8-way), checkpoints and phases.
  • Well versed with Ab Initio parallelism techniques and implemented Ab Initio graphs using data parallelism and MFS techniques.
  • Extensively used the Ab Initio GDE with EME for the ETL process.
  • Good understanding of new Ab Initio features like Component Folding, Parameter Definition Language (PDL), Continuous flows, Queues, publisher and subscriber components.
  • Worked extensively to create, schedule, and monitor the workflows and to send the notification messages to the concerned personnel in case of process failures.
  • Extensively used Ab Initio and EME for ETL processing.
  • With the help of the Enterprise Metadata team, uploaded the technical and business metadata to the enterprise-level Metacenter. Defined audit thresholds for Balance and Control rejections during the ETL process.
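
A sketch of the PI/NUSI choices described above, expressed as Teradata DDL run through BTEQ; the database, table, column and index names are hypothetical, and TD_PASSWORD is assumed to be set in the environment:

#!/bin/ksh
# Hypothetical DDL for a fact table, showing the PI and NUSI choices.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWORD};

CREATE TABLE bank_dm.cd_account_fact
(
    account_id   DECIMAL(18,0) NOT NULL,  -- high-cardinality column hashes evenly across AMPs
    customer_id  DECIMAL(18,0) NOT NULL,
    snapshot_dt  DATE FORMAT 'YYYY-MM-DD',
    balance_amt  DECIMAL(18,2)
)
PRIMARY INDEX ( account_id )              -- PI chosen for even distribution and planned access
;

-- NUSI to support the planned access path by customer
CREATE INDEX cd_fact_nusi_cust ( customer_id ) ON bank_dm.cd_account_fact;

.LOGOFF;
.QUIT;
EOF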
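
A minimal FastLoad sketch of the kind used for the monthly backfill loads; the record layout, delimiter, file locations and table names are all assumptions, and the staging table is assumed to be empty (as FastLoad requires):

#!/bin/ksh
# Hypothetical FastLoad backfill of one month of history into an empty staging table.
MONTH=$1                                          # e.g. 200801
DATA_FILE=/data/backfill/cd_accounts_${MONTH}.dat

fastload <<EOF
LOGON tdprod/etl_user,${TD_PASSWORD};
DATABASE bank_dm_stg;

SET RECORD VARTEXT "|";
DEFINE account_id  (VARCHAR(18)),
       open_dt     (VARCHAR(10)),
       balance_amt (VARCHAR(18))
FILE = $DATA_FILE;

BEGIN LOADING cd_accounts_stg ERRORFILES cd_accounts_err1, cd_accounts_err2;
INSERT INTO cd_accounts_stg VALUES (:account_id, :open_dt, :balance_amt);
END LOADING;
LOGOFF;
EOF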

Confidential, Los Angeles, CA

Ab Initio Developer

Responsibilities:

  • ETL Developer involved in creation of Enterprise Project Management (EPM) Data Mart for the enterprise project level reporting and analytics.
  • Conducted user requirement sessions to get the requirements for the EPM Analytics and reporting.
  • Had a walkthrough of the Prism (EPM Vendor software) tables to have better understanding of the attributes/elements presented in the software related to projects, project requests, service requests and tasks.
  • Followed the enterprise standard of creating Normalized Standard Layer and Dimensional Presentation Layer for the Data Mart creation.
  • Did the data profiling using SQL and PL/SQL code to understand the data and relationships on the operational system.
  • Extensively used EME for Version Control System and for Code Promotion.
  • Involved in the creation of Logical and Physical Models for the EPM Standard Layer and Presentation Layer using Erwin Modeling tool and also created DDLs for DBA to create Oracle Structures.
  • Extensively involved in various data modeling tasks including forward engineering, reverse engineering, complete compare, match and merge, creating DDL scripts, creating subject areas, publishing the model to PDF and HTML format, and generating various data modeling reports.
  • Identified and tracked the slowly changing dimensions and determined the hierarchies in dimensions. Used Kimball methodologies for Star Schema Implementation.
  • Created Mapping Documents and ETL design documents for both Presentation Layer and Standard Layer. Followed Enterprise Level Metadata and Audit standards while creating design and source-to-target mapping documents.
  • Coordinated with ETL developers from an offshore team to develop the Ab Initio ETL process to populate both the Standard and Presentation Layers. Reviewed the entire ETL process for performance tuning, audit checks and backfill. Did the integration testing in both layers and validated the data against the operational EPM reports.
  • Created several automated Unix shell scripts as wrappers for the Ab Initio code to handle parameter passing and to load Balance and Control files into the Oracle B&C database (a minimal wrapper sketch follows this list).
  • Handed over the production maintenance phase to Offshore Team.
  • Coordinated with the BO developer on the creation of the BO Universe and created templates for the daily/monthly reports. Also helped in the creation of dashboards with useful KPIs.
  • Coordinated with BO developer to create several OLAP cubes for the dimensional analysis.
  • Conducted user trainings to help them understand the Presentation Layer structures and available Cubes and Reports for analysis.
  • Created complete metadata (data lineage) from ETL to Reporting and loaded to Enterprise Metadata Repository.
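
A minimal sketch of the wrapper pattern described above: export a run-date parameter, run the deployed graph, then load the Balance & Control file into Oracle with SQL*Loader. The graph, control file, schema and paths are all hypothetical, and the deployed graph is assumed to read RUN_DATE as an exported input parameter:

#!/bin/ksh
# Hypothetical wrapper for the EPM presentation-layer load (names and paths are illustrative).
export RUN_DATE=${1:-$(date +%Y%m%d)}       # the deployed graph is assumed to pick this up

GRAPH=/apps/epm/deploy/run/load_epm_presentation.ksh
BNC_FILE=/apps/epm/data/bnc_${RUN_DATE}.csv

# Run the deployed Ab Initio graph; stop if it fails.
$GRAPH || exit 1

# Load the Balance & Control file; bnc_load.ctl is an assumed SQL*Loader control file
# that maps the CSV columns to the B&C audit table.
sqlldr userid=epm_etl/${ORA_PASSWORD}@epmprd \
       control=/apps/epm/ctl/bnc_load.ctl \
       data=$BNC_FILE \
       log=/apps/epm/logs/bnc_load_${RUN_DATE}.log
exit $?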

Confidential, Los Angeles, CA

Ab Initio Developer

Responsibilities:

  • ETL Ab Initio Developer involved in the migration of HRM (Household Relationship Management), which is the Master Data Management (MDM) system for the entire company, to an SOA (Service Oriented Architecture) application called HTS.
  • The new platform uses J2EE framework, Websphere Application Server, TIBCO, Ab Initio, I/Lytics and Oracle to implement above architecture.
  • With this new SOA architecture, various Web Services (WS) like GetParty, GetHousehold, and GetPolicy etc., will be provided so that enterprise wide applications like SIEBEL, IRMS and NOVA etc. can use HRM data by calling these Web Services using SOAP/XML over HTTP.
  • Involved in the data migration process from Mainframe to an Oracle database. Involved in the creation of logical and physical models for the new architecture.
  • Involved in Bulk Data Retrieval to extract data from the SOA application with the ETL tool in a batch process, as an alternative to the Web Services.
  • Using the RUP methodology, created various Use Case specification documents and diagrams from the client integration document for Bulk Data Retrieval.
  • Did the benchmarking and selection of the ETL tool for the bulk processing; Ab Initio was selected.
  • Did the Proof of Concept (POC) using Ab Initio to show that Ab Initio is the right solution for the HTS Bulk Data Service. Used various Ab Initio components like Call Web service, Read XML, Write XML, xml-to-dml utility to test HTS provided Web Services. Also did the POC with Ab Initio/Oracle Stored Procedures (PL/SQL) to evaluate the performance.
  • Performed Unix shell scripting for the manipulation of XML files as well as for automation (see the sketch after this list).
  • Extensively used the Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sort, Rollup, Scan, FTP, Lookup, Normalize, Denormalize, Input, Output and Join With DB. Used Ab Initio features like MFS (8-way), checkpoints and phases.
  • Well versed with Ab Initio parallelism techniques and implemented Ab Initio graphs using data parallelism and MFS techniques.
  • Used the HTS provided Web Services within ETL tool so that underlying logic won’t be duplicated within batch process.
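
A small sketch of the XML file handling and automation described above, assuming xmllint is available on the server; the directories and rejected-file handling are illustrative:

#!/bin/ksh
# Hypothetical pre-processing of bulk-request XML files before they are fed to the graph.
IN_DIR=/apps/hts/xml/incoming
OUT_DIR=/apps/hts/xml/staged
REJ_DIR=/apps/hts/xml/rejected

for f in $IN_DIR/*.xml; do
    # Reject files that are not well-formed so the Read XML component never sees them.
    if ! xmllint --noout "$f" 2>/dev/null; then
        mv "$f" $REJ_DIR/
        continue
    fi
    # Normalize formatting before staging the file for the batch graph.
    xmllint --format "$f" > $OUT_DIR/$(basename "$f")
done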

Confidential, Boston, MA

Ab Initio/Teradata Developer

Responsibilities:

  • ETL Developer on CIMS warehouse implementation using Kimball’s dimensional BUS Architecture and methodologies to ETL data from EDS to Data Marts.
  • Implemented Star Schema analytical Data Marts for the following subject areas under the CIMS enterprise data warehouse: ACAPS Collections and IVR/Avaya Call and Agent Data.
  • Conducted several JAD sessions with key business analysts to gather the requirements related to Reports, KPIs and Data Mining.
  • Performed extensive Data Profiling on the source data by loading the data sample into Database using Database Load Utilities.
  • Worked with Data Modelers to create a star schema model for the above subject areas and made sure that all the requirements can be generated from the models created.
  • Coordinated with EDS team to make sure all the data required for the above data marts is in the Enterprise data store. Otherwise worked with EDS team and Operational teams to push the data required in to the data store (ODS).
  • Created a mapping document for each of the tables involved in the above models and explained the mapping documents to the ETL developers.
  • Created High level and detail design documents for above data mart projects containing process flows, mapping document, initial and delta load strategies, Surrogate key generation, Type I and Type II dimension loading, Balance and control, Exception processing, Process Dependencies and scheduling.
  • Coordinated the ETL efforts with ETL developers for the successful implementation of the above data marts. Involved in day-to-day issue resolution related to data quality, mapping and ETL designs.
  • Followed the enterprise standards on mapping documents, design documents, Metadata framework, Balance and Controls.
  • Involved in the integration testing with ETL developers and User Acceptance Testing (UAT) with Business Analysts.
  • Identified the required dependencies between ETL processes and triggers to schedule the Jobs to populate Data Marts on scheduled basis.
  • Did the performance tuning on the Ab Initio graphs to reduce the process time.
  • Designed and coordinated the backfill process to load 3 to 5 years of the data into Data Marts.
  • Extensively used the Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sort, Rollup, Scan, FTP, Lookup, Normalize, Denormalize, Input, Output and Join With DB. Used Ab Initio features like MFS (8-way), checkpoints and phases.
  • Well versed with Ab Initio parallelism techniques and implemented Ab Initio graphs using data parallelism and MFS techniques. Experience in using Conditional Components and Conditional DML.
  • Participated in the evaluation of Teradata DBMS to replace DB2/UDB 7 for the EDW. Created several large tables with real card portfolio data totaling 4 TB for the POC. With the help of the Teradata team in San Diego, created tables with the right primary index and partitioning index on the 4-node Teradata system. Created several complex queries and ran them to get the performance measurements. Compared the results with those from running the same queries on the UDB DB2 system and presented the findings to management.
  • Worked with the DBA team to ensure implementation of the databases for the physical data models intended for the above data marts. Created proper Teradata Primary Indexes (PI) taking into consideration both planned access of data and even distribution of data across all available AMPs. Considering both the business requirements and these factors, created appropriate Teradata NUSIs for fast and easy access of data.
  • Extensively used the Teradata utilities like BTEQ, FastLoad, MultiLoad, DDL commands and DML commands (SQL). Created various Teradata Macros in SQL Assistant to serve the analysts.
  • Created several BTEQ scripts involving derived tables and volatile/global temporary tables for ad hoc purposes to extract data for several business users on a scheduled basis (a BTEQ sketch follows this list).
  • Involved in the collection of statistics on important tables so that the Teradata Optimizer builds better plans (see the statistics/EXPLAIN sketch after this list).
  • Did the performance tuning of user queries by analyzing the explain plans, recreating the user driver tables with the right Primary Index, scheduling collection of statistics, and adding secondary or join indexes.
  • Supported several business areas by developing around 25 complex reports of various types and dashboards using Cognos Report Studio. Validated these reports against operational reports for better confidence.
  • Implemented data level, object level and package level security in framework manager to achieve highly complex security standards.
  • Created Master Detail reports, drill through and custom prompting reports, and Scheduled reports for efficient resource utilization.
  • Created query prompts, calculations, conditions and filters. Developed prompt pages and conditional variables. Involved in testing and improving report performance.
  • Trained Users on Query Studio for Ad-Hoc Reporting.
  • Helped business users by writing complex, efficient Teradata SQL to produce detailed extracts for data mining. Automated these extracts using BTEQ and Unix shell scripting.
  • Interacted with senior management from the risk, marketing, finance and fraud teams to create key metrics such as receivables, charge-offs, fraud write-offs and card usage.
  • Modeled metadata (Facts and Dimensions) in framework manager and published packages to Cognos connection for implemented data marts. Created correct access paths while designing Framework Model.
  • Used Cognos Transformer to create/model (DMR) OLAP cubes for multi-dimensional analysis with appropriate alternate drill down paths and dimensions to enable slicing and dicing to explore data comprehensively.
  • Proficiency in Cognos Framework Manager, Transformer, Report Studio, Metric Studio and Analysis studio.
  • Helped business users understand the power and limitations of the Cognos reporting tool by conducting training sessions with them.
  • Converted several existing highly critical reports, such as the yellow book and blue book, which were developed in SAS, into the Cognos framework.
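
A sketch of the scheduled BTEQ extract pattern described above, using a volatile table; the database, table, column and file names are illustrative, and TD_PASSWORD is assumed to be set in the environment:

#!/bin/ksh
# Hypothetical weekly extract for business users.
EXTRACT=/data/extracts/collections_weekly_$(date +%Y%m%d).txt

bteq <<EOF
.LOGON tdprod/rpt_user,${TD_PASSWORD};

CREATE VOLATILE TABLE vt_delinquent AS
(
    SELECT account_id, delinq_bucket, outstanding_amt
    FROM   cims_edw.acaps_collections
    WHERE  snapshot_dt = CURRENT_DATE - 1
) WITH DATA
PRIMARY INDEX (account_id)
ON COMMIT PRESERVE ROWS;

.EXPORT REPORT FILE = $EXTRACT;

SELECT delinq_bucket
     , COUNT(*)             AS accts
     , SUM(outstanding_amt) AS total_outstanding
FROM   vt_delinquent
GROUP  BY delinq_bucket
ORDER  BY delinq_bucket;

.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF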
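
A sketch of the statistics collection and EXPLAIN check used while tuning user queries; all object names are hypothetical, and call_id is assumed to be the primary index of the fact table:

#!/bin/ksh
# Hypothetical tuning session run through BTEQ.
bteq <<EOF
.LOGON tdprod/dba_user,${TD_PASSWORD};

COLLECT STATISTICS ON cims_edw.call_agent_fact COLUMN (call_dt);
COLLECT STATISTICS ON cims_edw.call_agent_fact COLUMN (agent_id);
COLLECT STATISTICS ON cims_edw.call_agent_fact INDEX (call_id);

-- Check the optimizer plan (join order, redistribution, estimated rows) before promoting the query.
EXPLAIN
SELECT f.call_dt, COUNT(*)
FROM   cims_edw.call_agent_fact f
JOIN   cims_edw.agent_dim a
  ON   a.agent_id = f.agent_id
WHERE  f.call_dt BETWEEN DATE '2007-01-01' AND DATE '2007-03-31'
GROUP  BY f.call_dt;

.LOGOFF;
.QUIT;
EOF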

Confidential

Mainframe/SAS Developer

Responsibilities:

  • Involved in multiple other projects within the company to support Business Units with Reports and Data using Mainframe and Unix SAS.
  • Extracted data needed for analysis and scheduled campaign runs from several centralized data sources such as EDS, ODS and EDW and External sources, Mainframe sources and converted into SAS datasets.
  • Loaded extracted data into Oracle using the Oracle SQL*Loader utility.
  • Created several complex efficient SQLs to extract data from above EDW tables and converted into SAS datasets.
  • Extensively used DB access to MS Access and Oracle using SAS SQL pass-thru facility.
  • Very good understanding of Relational Modeling concepts such as Entities, relationships, Normalization etc.
  • Performed Data entry and conversions, data validation and corrections. Extensive use of Data Null, Summary, Means, Freq, and Chart procedures ascertaining quality of data. Data validation by checking data distribution and comparison to a standard Data.
  • Successfully identified problems with the data, produced derived data sets, tables, listings and figures that analyzed the data to facilitate correction.
  • Produced data listings, summary tables and graphics for interim and final analysis.
  • Data manipulation by merging, Appending, Concatenating, sorting datasets.
  • Experience using Base SAS (MEANS, FREQ, TABULATE, REPORT etc) and other procedures for summarization, Cross-Tabulations and statistical analysis purposes.
  • Extensively used DATA _NULL_ steps for creating reports.
  • Extensively used the following SAS features: data, data null, sort, merge, append, transpose, tabulate, freq, means, summary, contents, copy, print, datasets, compare, report, format, proc sql, macros, symput, symget, includes.
  • Created targeted customer mailing lists for Direct Mailing and Telemarketing.
  • Automated the code for the scheduled campaigns using SAS, SQL and Unix shell scripting (a minimal batch-run wrapper sketch follows this list).
  • Produced several reports showing the campaign responses by comparing several attributes between campaign files and Acquisition files.
  • Generated analysis reports, graphs using SAS/Reports and SAS/ODS to produce HTML reports for forecasting.
  • Conducted individual account and household level analysis. Conducted study response rates for different offers.
  • Generated ad hoc reports in Excel and PDF/HTML via SAS ODS (Output Delivery System) based on client requests.
  • Performed categorical and exploratory statistical data analysis on sales and marketing information.
  • Extensively used DB access to Oracle 9i via the SAS SQL pass-thru facility and the SAS features listed above, together with ODS, to create various SAS reports.
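
A minimal sketch of the Unix wrapper used to run a scheduled SAS campaign extract in batch and check the log; the program, log locations and mail address are hypothetical:

#!/bin/ksh
# Hypothetical batch run of a campaign-extract SAS program with a log check.
SAS_PGM=/apps/campaigns/sas/weekly_mailing_list.sas
LOG=/apps/campaigns/logs/weekly_mailing_list_$(date +%Y%m%d).log

# -noterminal keeps the run non-interactive; -sysin names the program, -log the log file.
sas -noterminal -sysin $SAS_PGM -log $LOG
RC=$?

# SAS returns a non-zero code on errors; also scan the log for ERROR lines as a second check.
if [ $RC -ne 0 ] || grep -q "^ERROR" $LOG; then
    mailx -s "Campaign extract failed (rc=$RC)" marketing_etl@example.com < $LOG
    exit 1
fi
exit 0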
