Ab Initio Technical Lead Resume
Newport Beach, CA
SUMMARY
- Over 7 years of IT experience in Application Development and Maintenance projects in the Banking & Financial Services (BFS) and Healthcare domains, with a major focus on Data Warehousing, Business Intelligence and Database Applications
- 6+ years of strong Data Warehousing experience using the ETL tool Ab Initio: GDE (1.13 - 3.2), Co>Op (2.13 - 3.2)
- 4+ years of working experience with Teradata and DB2. Experience working with various heterogeneous source systems such as Oracle Exadata, Teradata, DB2, SQL Server and flat files
- As part of my assignments, I have been involved in Requirement Analysis, ETL Design, Application Development & Maintenance, Functional Studies, Quality Reviews and Testing
- Expert knowledge in dimensional data modeling, star and snowflake schemas, creation of fact and dimension tables, OLAP, OLTP, and a thorough understanding of data warehouses/data marts
- Able to plan and execute all phases of the life cycle in Data Warehousing, Data Integration and Data Migration; experienced in designing, developing, optimizing and testing ETL programs that handle large data volumes
- Expertise in preparing ETL process design and documentation, like High-Level, Low-Level Design (HLD, LLD) documents and Data mapping documents
- Well versed in Ab Initio parallelism techniques and implemented ABI Graphs using Component/Data/Pipeline Parallelism & MFS techniques, Continuous Flows, Component Folding
- Expertise in writing various UNIX Shell Scripts to run Ab Initio and Database jobs. Automated, Scheduled the Ab Initio jobs using UNIX Shell Scripting and Control-M
- Good knowledge on Scheduling ABI jobs using Control-M, AutoSys tools and Conduct>It
- Worked on different source/target systems and proficient in writing complex transformations/transform functions using PDL, ICFF, Validation extension and Meta programming techniques in Ab Initio
- Involved in Performance tuning and troubleshooting methods in Ab Initio and created reusable Ab Initio graphs for handling CDC, validations of source files. Extensively used EME AIR utilities and knowledgeable about environment settings
- Experienced in working with the EME environment for Check-In/Check-Out of graphs, versioning, and migrating graphs from the DEV environment to the TEST environment using tags (AIR commands: the air lock, air project, air sandbox, air tag and air repository command groups)
- Very good experience in Oracle database application development using Oracle 10g/9i, SQL, PL/SQL, SQL*Loader. Strong experience in writing SQL, PL/SQL-Stored Procedures, Functions and Triggers.
- Experience in SDLC Analysis, Design, Development, Testing, Implementation and Maintenance in DW Environment, Agile and Waterfall methodology
- Extensive knowledge on various Connectivity protocols - FTP, SFTP, SCP, NDM, MQ
- Experience with the Teradata RDBMS using the FastLoad, FastExport, MultiLoad, TPump, Teradata SQL Assistant and BTEQ Teradata utilities
- Strong knowledge in Ab Initio Metadata Repository
- Hands on experience with ABI Testing, various testing phases (SIT, UAT and OAT) and providing support to the applications in production.
- Advanced understanding of business processes and workflow.
- Diverse background; a fast learner with creative analytical abilities and good technical, communication and interpersonal skills
TECHNICAL SKILLS
ETL Tools: Ab Initio GDE (3.1/3.2), Co>Op (3.1/3.2), EME, Ab Initio Metadata Hub 3.1.2, Data Profiler
Database (RDBMS): Teradata, Oracle 9i/10g/11g, DB2, MS Access 2000/2002, MS SQL Server, Oracle Exadata
BI/OLAP Reporting Tools: Business Objects, Cognos
Languages: UNIX Shell Scripting, SQL, PL/SQL, C, C++, Python, PERL, Java
Operating Systems: SUN Solaris 8/10, Linux, Windows 98/2000/NT/XP
Tools: TOAD, SQL *Loader, Teradata SQL Assistant
Scheduling Tools: CA AutoSys, BMC Control-M
Business Modeling Tools: MS Visio, MS Office Suite
Methodologies: SDLC - Agile Methodology, Waterfall model
Project Management Tools: Rational ClearQuest, HPSM, Viper, VersionOne, BMC Remedy
Web Application Tools: HTML, Adobe Acrobat, ASP, XML
PROFESSIONAL EXPERIENCE
Confidential, Newport Beach, CA
Ab Initio Technical Lead
Responsibilities:
- Working as a Lead Ab Initio developer in understanding the business requirements of the End Users/Business Analysts and developing applications for ETL Process using Ab Initio
- Worked closely with data modelers and helped in the design of schemas and tables, including SCDs
- Participated in SDLC Agile iterative sessions to develop Extended Logical and Physical Models for each ETL enhancement, and developed ABI Graphs, Plans and Oracle stored procedures
- Managing and leading enhancement work where the effort is less than 80 hours
- Involved in redesign of various old projects to enhance performance and use the latest Ab Initio products like Conduct>IT, ICFF etc.
- Collaborating with the DIA and business teams during new development projects and on existing failures or issues
- Co-ordination with different teams like DBA, UNIX Admin and ETL Admin team as part of architecture support for different projects
- Involved in design and code review for different projects as Ab Initio SME
- Ensuring the environment and space availability for all the projects to execute in production smoothly
- Managing the support team, which takes care of P1 to P5 tickets in production
- Enforce generic graph usage across different new projects, and develop generic, stand-alone batch graphs by extensively using various transform and database components while enhancing the performance of ETL processes
- Developed scripts to strip header/trailer values directly from source files and load the data to tables through BTEQ commands/scripts
- Developed complex graphs by applying various transformations such as Filter By Expression, Reformat, Rollup, Scan, Join, Normalize etc.,
- Preparation of High Level Design of Source Data Analysis, ETL Process flow, Error Handling, Restartability and Extendibility
- Reviewing data mapping, design documents, Ab Initio code, Unit Test Cases
- Coordinate with the testing team to ensure smooth testing and data validation, and resolve defects in the testing phase
- Prepare ETL test data for all ETL graphs based on transformation rules to test the functionality of the application
- Identified the required dependencies between ETL processes and triggers to schedule the jobs that populate Data Marts on a scheduled basis
- Created, managed and ran a large-scale data processing system using Plan>It
- Worked closely with Business for BRE (Business Rule Engine) concept
- Designed and developed parameterized generic graphs
- Working with Technical Architects in troubleshooting the Oracle Exadata Data Warehouse
- Help the production support team resolve production job abends and keep the system up and running
- Leading the support team which supports 800 daily jobs in production besides the weekly, quarterly and yearly jobs
Environment: Finance domain, Ab Initio (GDE 3.2.1.1, Co>Op 3.2.3.6), EME, AutoSys, UNIX Shell Scripting, MS Visio, Oracle 11g, Oracle Exadata, Teradata, Sun OS 5.10
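The header/trailer stripping step mentioned above can be sketched as a small shell utility. The file layout (exactly one header line, one trailer line, pipe-delimited detail records) and all paths are illustrative assumptions, not details from the project:

```shell
#!/bin/sh
# Hypothetical sketch of stripping header/trailer records before a BTEQ load.
# Assumes the feed carries exactly one header line and one trailer line.

strip_header_trailer() {
    src="$1"; out="$2"
    # Drop the first (header) and last (trailer) lines, keeping detail records
    sed '1d;$d' "$src" > "$out"
}

printf 'HDR|20150101\nrec1|a\nrec2|b\nTRL|2\n' > /tmp/src.dat
strip_header_trailer /tmp/src.dat /tmp/detail.dat
cat /tmp/detail.dat
```

In the actual job, the cleaned file would then be handed to a BTEQ script for the table load.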
Confidential, Omaha, NE
Sr. Ab Initio Developer
Responsibilities:
- Developed and supported the extraction, transformation and load (ETL) process for Confidential’s Data Warehouse from its OLTP systems using Ab Initio, and provided technical support and hands-on mentoring in the use of Ab Initio
- Defined and implemented large enterprise Operational Data Store and Big Data which consisted of Exadata, Oracle Big Data Appliance (Hadoop) and Exalytics
- Created Ab Initio graphs/Plans to extract data from various internal sources, validate it, and created a secure transfer process to a third-party vendor
- Involved in developing the detailed Technical Design, Mapping document and Control-M Visio diagram
- Designed programs to utilize the parallel processing capabilities of Ab Initio using Data parallelism (MFS)
- Developed Ab Initio Graphs, Plans, psets, XFRs and generic components to materialize a new agency management system for the client
- Implemented partition techniques using MFS with round-robin, key, expression and interleave partitioning, along with Gather and Merge
- Participated in testing with various Ab Initio validation components such as Validate Records and Compute Checksum, and also used the FTP, Normalize and Denormalize components
- Used sandbox features to make graphs more portable, file management utilities such as m_mkfs, m_rmfs, m_touch, m_ls and m_env, and extensively used Multi File System (MFS) commands
- Developed various Ab Initio graphs for data cleansing using Ab Initio functions and components such as Validate Records, is_valid, is_blank, is_defined, string_substring and other string_* functions
- Developed and worked on numerous BTEQ/SQL scripts, making changes to the existing scripts, and completed unit and system testing
- Developed ABI graphs for loading/unloading data using the FastLoad, MultiLoad and TPump utilities and BTEQ invoked from UNIX shell scripts
- Involved in the analysis of the data flow requirements for developing a scalable architecture for both staging and loading data
- Involved in the preparation of documentation for ETL using Ab Initio standards, procedures and naming conventions
- Generated configuration files, DML files from copybooks, and XFR files specifying the record formats, which are used in components for building graphs in Ab Initio
- Used the built-in Ab Initio functions to build custom components, which help in implementing complex business logic
- Created batch processes using FastLoad, BTEQ, UNIX shell and Teradata SQL to transfer, clean up and summarize data
- Developed UNIX Korn shell wrapper scripts to accept parameters and scheduled the processes using Maestro
- Prepared the implementation plan for the Production team to carry out the tasks one by one on the Production DB
- Good understanding of new Ab Initio features like Component Folding, Parameter Definition Language (PDL), Continuous flows, Queues, publisher and subscriber components
- Used Parameter Definition language (PDL) in writing DML to execute the graphs without writing scripts or deploying using air commands
- Performed unit testing of batch graphs and supported UAT; handled QA migration and support, defect fixing, and deployment setup including the implementation plan, QA document and RFC creation for the above work streams
- Analyzed metadata to design new custom metadata interfaces when requirements extended the metadata repository tool's native metadata model
- Created the migration scripts, test scripts for testing the applications, creating and supporting the Business Objects reports
Environment: Finance domain, Ab Initio 3.1, Ab Initio Metadata Hub 3.1.2, Teradata SQL Assistant 13.0.015, Db Visualizer 9.0.8, Oracle Exadata, Business Objects, SQL Developer, UNIX Shell Scripting, Teradata, BTEQ, Agile, Autosys, Java, Cognos, Windows, UNIX (Sun Solaris)
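A parameter-accepting wrapper script of the kind mentioned above might follow this skeleton. The job name, date argument and the commented-out graph invocation are illustrative assumptions; the sketch uses POSIX sh so it stays portable:

```shell
#!/bin/sh
# Hypothetical wrapper-script skeleton: validate parameters, then hand off
# to the deployed graph.

run_job() {
    job_name="$1"; run_date="$2"
    if [ -z "$job_name" ] || [ -z "$run_date" ]; then
        echo "usage: run_job <job_name> <run_date>" >&2
        return 1
    fi
    # The real wrapper would invoke the deployed graph here, e.g.:
    #   "$AI_RUN/${job_name}.ksh" "$run_date"
    # and propagate its exit status back to the scheduler (Maestro).
    echo "starting ${job_name} for ${run_date}"
}

run_job load_agency_dim 2013-06-30
```

The scheduler keys off the wrapper's exit status, so any real graph invocation must not swallow failures.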
Confidential, Fremont, CA
Ab Initio Developer
Responsibilities:
- Understand the business process and gather requirements from business users
- Carried out detailed profiling of operational data using the Ab Initio Data Profiler/SQL tools to get a better understanding of the data that can be used for analytical purposes by Business/User groups
- Created, managed and ran a large-scale data processing system
- Used ACE to create the generic graphs and worked on the complete Ab Initio project setup closely with the Ab Initio Admin
- Participated Agile Iterative sessions to develop Extended Logical Models and Physical Models
- Developed ABI graphs and BTEQ scripts (for subject areas that don't involve major transformations)
- Developed Ab Initio graphs using Ab Initio parallelism techniques, Data Parallelism and MFS techniques with Conditional Components and Conditional DML
- Involved in all the stages of SDLC during the projects. Analyzed, designed and tested the new system for performance, efficiency and maintainability using ETL tool Ab Initio
- Identified the required dependencies between ETL processes and triggers to schedule the jobs that populate Data Marts on a scheduled basis
- Carried out performance tuning on the Ab Initio graphs to reduce processing time
- Created several SQL queries and reports against the above data mart for UAT and user reporting
- Used BTEQ and Teradata SQL Assistant to support queries in interactive and batch mode
- Prepared the Unit Test Case Plan and End-to-End Process document for the graphs and wrapper scripts
- Automated the entire Data Mart process using UNIX Shell scripts and scheduled the process using TWS, Job Track after dependency analysis
- Extensively used the Teradata utilities like Fast load, Multiload, TPump, DDL Commands and DML Commands (SQL)
- Used the FastExport and MultiLoad utilities in Teradata for data unloading and loading activities in Ab Initio ETL and the Teradata DB
- Worked with the Lead DWH Advisor and Architects in developing the Road Map & Blue Print for integrating the Concentra MSBI warehouse into a new DWH based on Exadata
- Supported the Testing Team with data verification, loaded data into the Test Environment, and provided SQL queries for testing the test cases
Environment: Health Insurance Domain, Ab Initio (GDE 1.15, Co>Op 2.15), Teradata V2R6, PL/SQL, Teradata Utilities, BTEQ, Oracle Exadata, TOAD, EME, Control-M, Agile, UNIX
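The data-verification support mentioned above often reduces to simple reconciliation checks. One such check is sketched below, under the assumption that each feed carries a trailer record of the form TRL|&lt;detail-count&gt;; the layout and paths are illustrative only:

```shell
#!/bin/sh
# Hypothetical count reconciliation: the number of detail records must match
# the count carried in the trailer. Layout (header + details + trailer,
# pipe-delimited) is an assumption for the example.

validate_count() {
    f="$1"
    expected=$(tail -1 "$f" | cut -d'|' -f2)
    actual=$(( $(wc -l < "$f") - 2 ))   # subtract header and trailer lines
    [ "$expected" -eq "$actual" ]
}

printf 'HDR|claims\nc1|100\nc2|250\nTRL|2\n' > /tmp/feed.dat
validate_count /tmp/feed.dat && echo "count OK"
```

A failed check would typically halt the load and page the support team before any bad data reaches the mart.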
Confidential, Cincinnati, OH
Ab Initio Developer
Responsibilities:
- Developed and supported the extraction, transformation and load process (ETL) for the Data Warehouse from heterogeneous source systems using Ab Initio
- Implemented a number of Ab Initio graphs using Data parallelism and Multi File System (MFS) techniques
- Used Ab Initio GDE to generate complex graphs for the ETL process using Join, Rollup and Reformat transform components and executed using Co>Operating System
- Configured the source and target database connections using .dbc files
- Designed and implemented the data model for the Data Warehouse using a Star Schema
- Created .xfr and .dml files for various transformations and specifying the record format
- Involved in automating the ETL process through scheduling
- Deployed and test-ran the graphs as executable Korn shell scripts on the application system
- Modified Ab Initio component parameters and utilized data parallelism to fine-tune execution times and improve overall performance
- Utilized phases and checkpoints to avoid deadlocks, used multifiles in graphs, and also used the Run Program and Run SQL components to run UNIX and SQL commands
- Provided application requirements gathering, designing, development, technical documentation, and debugging. Assisted team members in defining Cleansing, Aggregating, and other Transformation rules
- Extensively worked with the Ab Initio EME to obtain the initial setup variables and to maintain version control during the development effort
- Interacted effectively with other members of the Business Engineering and Quality Assurance teams
Environment: Grocery Domain, Ab Initio (GDE 1.13, Co>Op 2.13), UNIX, DB2, SQL Assistant, Putty, SQL/PL-SQL, Windows XP, MS Office, Shell Scripts, XML, Oracle 10g
Confidential, Lakewood, CO
Ab Initio Developer
Responsibilities:
- Understand the functional and technical requirements to better serve the project
- Responsible for Production support and maintenance for initial few months and worked with Senior Ab Initio Developers on code deployment and Environment Setup
- Developed Ab Initio graphs using different components for extracting, transforming and loading external data into data warehouse
- Worked on developing Ab Initio graphs and applied transformations based on the business requirements using various Ab Initio Components
- Extensively used Ab Initio Web Interface (AIW) to navigate the EME for viewing Ab Initio objects like graphs, datasets etc. and performed Impact analysis
- Implemented Multi-File Systems (MFS), Phases and made use of Ab Initio’s feature of parallelism by partitioning data
- Created Linked Sub-Graphs, Common Graphs, Generic Parameterized Graphs and PSETS for various common tasks, surrogate keys, referential Integrity checks and data scrubbing and cleansing
- Created common graphs to perform common data conversions that can be used across the applications, using a parameter-driven approach with conditional DMLs
- Developed UNIX shell wrapper scripts to accept parameters and scheduled the processes using Control-M Schedulers
- Involved in large scale data processing systems using Ab Initio Conduct>It
- Extensively used EME for version control, code Check-in, Check-out, Dependency analysis, graph statistics and performance analysis
- Involved in Performance testing, tuning of ETL Ab Initio processes and provided production support if necessary
- Implemented data cleansing and data validation using various Ab Initio functions such as is_valid, is_defined, is_error, is_null and string_trim
- Created Generic Init to Clean graph for Chunked Files and Generic Clean to Load Graph
- Developed Ab Initio graphs for data validation using validate components such as Compare Records and Compute Checksum
- Developed several Korn shell (ksh) programs, functions and packages for pre-inspection, post-inspection and pre-extraction inspection of the data loaded every week, and to populate error tables used to generate error reports; responsible for automating Ab Initio jobs using ksh scripts
- Wrote unit test scripts and was involved in preparing support documents and the implementation plan
- Involved in Ab Initio design and configuration: ETL, data mapping, transformation and loading in a complex, high-volume environment with data processing at the terabyte level
- Checked the accuracy of data loaded into Teradata and assured the linkage of keys on the tables for the various applications
- Documented the processes to enable better reusability and maintainability
- Scheduled the bug fixes for development/UAT and production migration
- Created test scripts & support for the Integration testing
Environment: Health care Domain, Ab Initio (GDE 1.14, Co>Op 2.14), Teradata utilities, Main Frame, UNIX, Control-M, AutoSys, Solaris 7.0
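The pre-inspection idea described above (rejecting malformed records before the load and feeding them to an error report) can be sketched with awk. The delimiter and expected field count are assumptions for the example:

```shell
#!/bin/sh
# Hypothetical pre-inspection: write any record whose field count differs
# from the expected layout to an error file for the weekly error report.

preinspect() {
    f="$1"; nfields="$2"
    awk -F'|' -v n="$nfields" 'NF != n' "$f"
}

printf 'a|b|c\nd|e\nf|g|h\n' > /tmp/in.dat
preinspect /tmp/in.dat 3 > /tmp/errors.dat
wc -l < /tmp/errors.dat
```

Records captured this way can be counted and attached to the weekly error report before the clean records move on to the load step.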
Confidential
Ab Initio Developer
Responsibilities:
- Involved in ETL development process-source to target mapping discussions, and participated in various data cleansing and data quality exercises
- Involved in the preparation of Technical Design Documents and Mapping Documents, and reviewed those documents
- Developed multiple graphs to unload all the data needed from different source databases by configuring the dbc file in the Input Table component
- Developed generic graphs to process all reference files
- Developed a number of Ab Initio graphs based on business requirements, using various components such as Partition by Round Robin, Partition by Key, Rollup, Reformat, Scan, Join and Normalize
- Extensively used Ab Initio tool’s feature of Component, Data and Pipeline parallelism. Implemented MFS and Introduced phases for data parallelism
- Used AIR commands to do dependency analysis for all Ab Initio Objects
- Performed data cleansing and data validation by using various Ab Initio functions such as is valid, is defined, is error, is null, string trim etc.
- Developed shell scripts for Archiving, Data Loading procedures and Validation
- Implemented Slowly Changing Dimensions (SCDs) of Types 1, 2 and 3
- Worked in creating unit test plans, unit test cases, unit test scripts and unit test reports to verify accuracy of programs and the interdependent relationship of programs and systems
- Implemented phases and checkpoints in graphs to avoid deadlock, and recover completed stages of the graph, in case of a failure
- Involved in preparation and implementation of data verification and testing methods
- Improved the performance of Ab Initio graphs using various techniques, such as lookups and in-memory sorts in Joins and Rollups to speed up graphs, and optimal max-core parameters for memory optimization
Environment: Finance Domain, Ab Initio (GDE 1.13, Co>Op 2.13), UNIX, Oracle 9i/10g, Flat files, SUN OS 5.8, Windows XP, Control-M, UNIX Shell Scripting
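The archiving scripts mentioned above typically follow a move-and-compress pattern. This sketch uses a dated archive directory; all paths and names are illustrative assumptions:

```shell
#!/bin/sh
# Hypothetical archiving step: compress a processed file into a dated
# archive directory and remove the original.

archive_file() {
    f="$1"; arcroot="$2"
    arcdir="$arcroot/$(date +%Y%m%d)"
    mkdir -p "$arcdir"
    gzip -c "$f" > "$arcdir/$(basename "$f").gz" && rm -f "$f"
}

echo "processed data" > /tmp/done.dat
archive_file /tmp/done.dat /tmp/archive
ls /tmp/archive
```

Removing the original only after the compressed copy is written keeps the step safe to rerun if it fails partway.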
Confidential
PL/SQL Developer
Responsibilities:
- Developed/modified Oracle PL/SQL codes like stored procedures, functions, triggers, Packages, Tables, Indexes, Views etc. based on technical and functional specification documents
- Extensively used PL/SQL programming in back-end and front-end Functions, Procedures and Packages to implement business logic/rules
- Developed/modified scripts to create tables, views and executed them using SQL Plus
- Developed/modified scripts to rectify data errors and executed them using SQL Plus
- Ran batch jobs for loading database tables from flat files using SQL Loader
- Created/updated T-SQL stored procedures and UDFs to support efficient data storage, manipulation and enhancement
- Performed exception handling and cursor management in PL/SQL
- Involved in developing packages to transfer data from different Oracle Forms to the Oracle database whenever the processor performs transactions
- Involved in form level coding for Approval, Data Entry, Data Query and Response Modules as per the requirements
- Responsible for user management and object security maintenance
- Fine-tuned the application queries
- Analysis, Development, Testing and Implementation using Oracle 10g
- Used Data Transformation Services (DTS), an Extract-Transform-Load (ETL) tool of SQL Server, to populate data from various data sources, creating packages for different data loading operations
Environment: Health/Pharmaceutical Domain, Oracle 10g, SQL Plus, PL/SQL, Forms & Reports, PL/SQL Developer, TOAD, Query Analyzer, Windows 98/2000/XP, MS Access 2000/2002