
Ab Initio Technical Architect Resume


SUMMARY:

  • Over 9 years of experience in ETL Development, Data Warehouse Design, Data Warehouse Administration, Application Design, Maintenance, Implementation and Support.
  • Sound knowledge of Life Sciences, Personal & Commercial Insurance and BFS domains
  • Experience in all phases of SDLC and Agile (Kanban & Scrum)
  • Involved in analysis, design, development, testing, implementation, and maintenance of various applications.
  • Experience in ETL performance tuning optimization and data administration.
  • Hands on experience in implementing parallelism, reusability and standardization.
  • In - depth knowledge of UNIX shell scripting.
  • Technical and functional experience in data warehousing, implementing ETL methodologies using Ab Initio (GDE, EME v3.1), Informatica PowerCenter, Teradata, and Oracle 10g/9i/8i in the Telecom, Retail, Finance and Health Insurance domains.
  • Extensive experience in using various Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize, De-normalize, Partitioning and De-partitioning components etc.
  • Extensive experience with Ab Initio Metadata hub, ACE/BRE/DQE.
  • Knowledge of data warehousing concepts like Star Schema, Snowflake Schema and Data Marts, used in Relational, Dimensional and Multidimensional data modelling.
  • Good Understanding of Hadoop (HDFS + MapReduce) and its ecosystem frameworks like SQOOP, Flume, HIVE, Pig and Oozie.
  • Expert in writing efficient SQLs to meet SLAs and thorough in understanding explain plans.
  • Extensively used Derived Tables, Volatile Tables and Global Temporary Tables (GTT) in many of the ETL scripts.
  • Worked on Teradata query optimization using DBQL data analysis, Viewpoint, Statistics collection, PPI/MLPPI table design, Explain plans and index (Join/Hash/USI/NUSI/UPI/NUPI) selection; a tuning sketch appears after this list.
  • Developed various UNIX shell scripts to run Ab Initio/Informatica and database jobs; a wrapper sketch appears after this list.
  • Hands-on experience and strong understanding of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
  • Good experience with QA methodologies like Waterfall, V-Model, Agile, Scrum, RAD and TDD.
  • Extensively used Ab Initio EME data store/sandbox for version control, code promotion and impact analysis
  • Thorough understanding and experience in DWH and data mart design, Slowly Changing Dimensions (SCD Type 1, Type 2 and Type 3), and Normalization and Denormalization concepts.
  • Strong SQL experience in Teradata, developing the ETL with Complex tuned queries including analytical functions and BTEQ scripts.
  • Extensively used Mapping Variables, Mapping Parameters and Dynamic Parameter Files for improved performance and increased flexibility
  • Extensively used different features of Teradata such as BTEQ, FastLoad, MultiLoad, Tpump, QueryMan, SQL Assistant, DDL and DML commands.
  • Work experience reviewing and testing data maps between various legacy systems and relational databases.
  • Very strong skills in project management, requirement analysis, business analysis, database modeling, design and analysis, issue coordination and development with Teradata/Oracle/SQL Server based relational databases.
  • Proficient in Teradata database design (conceptual and physical), Query optimization, Performance Tuning.
  • Implemented Pushdown Optimization (PDO) to resolve performance issues.
  • Automated BTEQ report generation using UNIX scheduling tools on a weekly and monthly basis. Well versed in Explain plans, confidence levels and database skew. Knowledge of query performance tuning using Explain, Collect Statistics, Compression, NUSI and Join Indexes.
  • Experienced with mentoring Teradata Development teams, data modeling, program code development, test plan development, datasets creation, testing and result documentation, analyzing defects, bug fixing.
  • Hands on experience in handling data from various source systems such as Flat Files, XML Source, Oracle, MS SQL Server, IBM DB2, Teradata and Excel Files
  • Proficient with the concepts of Data Warehousing, Data Marts, ER Modeling, Dimensional Modeling, Fact and Dimensional Tables with data modeling tools ERWIN and ER Studio.
  • Implemented pipeline partitioning concepts like Hash-key, Round-Robin, Key-Range, Pass Through techniques in mapping transformations.
  • Well versed with Ab Initio parallelism techniques and implemented Ab Initio Graphs using Data parallelism, MFS techniques, Continuous Flows, Component folding and PDL features.
  • Designed and implemented a PoC on “DBQL Text Parsing” in Hadoop - Big Data.
  • In depth knowledge of Data Warehousing concepts, ER modeling & Dimensional modeling, Data Analysis and ETL process.
  • Hands on experience in Rally Customization (Rally Custom App Programming).
  • Experience in building Enterprise-wide Architecture design.
  • Project Management experience in handling multiple applications across multiple environments and managing challenging, complex and time-critical projects.
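
As an illustration of the Teradata tuning work described in the list above, here is a minimal BTEQ sketch (wrapped in Korn shell) of the collect-statistics-then-explain cycle; the logon, database, table and column names are hypothetical placeholders, and real credentials would come from a secured logon file.

    #!/bin/ksh
    # Minimal sketch: refresh optimizer statistics, then capture the
    # EXPLAIN plan of a candidate query for SLA review.
    # Logon, database, table and column names are placeholders.
    bteq <<'EOF' > explain_review.log 2>&1
    .logon tdprod/etl_user,password

    -- Refresh the statistics the optimizer relies on for join planning
    COLLECT STATISTICS ON edw_stg.sales_txn COLUMN (txn_dt);

    -- Inspect the plan: look for product joins, redistribution and skew
    EXPLAIN
    SELECT c.cust_id, SUM(s.txn_amt)
    FROM   edw_stg.sales_txn s
    JOIN   edw.customer c ON s.cust_id = c.cust_id
    WHERE  s.txn_dt >= DATE '2015-01-01'
    GROUP  BY c.cust_id;

    .logoff
    .quit
    EOF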
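Similarly, a minimal Korn shell wrapper of the kind referenced above for running a deployed Ab Initio graph (deployed graphs execute as .ksh scripts); the run directory, graph name and log path are hypothetical.

    #!/bin/ksh
    # Minimal sketch: run a deployed Ab Initio graph, log the output and
    # surface the exit code to the scheduler. Paths and names are placeholders.
    AI_RUN=/apps/abinitio/prod/run
    LOG=/apps/abinitio/prod/log/load_sales_$(date +%Y%m%d%H%M%S).log

    print "$(date): starting load_sales graph" >> "$LOG"
    "$AI_RUN/load_sales.ksh" >> "$LOG" 2>&1
    rc=$?

    if [[ $rc -ne 0 ]]; then
        print "$(date): load_sales failed with rc=$rc" >> "$LOG"
        exit $rc    # non-zero exit lets Autosys/Control-M raise an alert
    fi
    print "$(date): load_sales completed successfully" >> "$LOG"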

TECHNICAL SKILLS:

Major Skill sets: Ab Initio, Cognos, Unix, Teradata, Oracle, DB2

Programming Languages: C, C++, HTML, XML, Visual Basic, VB.Net and ASP.Net

Operating Systems: UNIX AIX 5.2, MS-DOS, Windows NT/98/2000/XP/Vista/7.

DW Tools: Ab Initio BRE, Ab Initio Continuous Flows, QlikView, Erwin, Teradata SQL Assistant, SQL Developer, TOAD.

Scheduling Tools: Autosys, Control-M

Databases: Teradata, ORACLE, Netezza, SQL Server, DB2

Business Modeling: MS Visio

Change/Repository Management: Sharepoint and Concurrent Versioning System (CVS)

Development Process: SDLC and Agile (Kanban, Scrum).

Project Management: MS Office Suite, MS Project, Agile process (Rally)

Estimation Technologies: IFPUG Function Point Estimation.

WORK EXPERIENCE:

Confidential

Ab Initio Technical Architect

Responsibilities:

  • Responsible for Design and Implementation of Global feeds into DW.
  • Coordinating with various stake holders across the globe.
  • Working closely with stake holders and building business roadmaps.
  • Design and develop Ab Initio graphs to extract data from different sources (flat files; databases like Teradata and Oracle; Mainframes; XML files), apply transformations and load data into the target systems.
  • Efficiently used different Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sort, Rollup, Scan, FTP, Fuse, Lookups.
  • Designing and developing complex SQL queries and improving the performance of existing SQLs by incorporating different optimization techniques like reducing joins, using temporary tables and collecting statistics as required.
  • Hands-on experience with PDL and metadata programming; created DMLs dynamically at runtime as the process executes.
  • Created new UNIX scripts to automate and handle different file processing, editing and execution sequences, using basic UNIX commands and the 'awk' and 'sed' editing languages.
  • Efficiently performed table extracts using Ab Initio and Teradata utilities as required by the process.
  • Perform data quality and integrity, end to end testing, performance tuning, debug various production issues and provide support.
  • Worked on Ab Initio's Application Configuration Environment (ACE); created datasets, lookups and data quality psets used as part of the data quality project.
  • Good experience in BRE (Business Rules Environment); created rule sets (XFRs) based on the business requirements that are later used in different data quality validations.
  • Responsible for understanding business requirements through detailed discussions with end users.
  • Worked with various source teams and built ETL models using new architecture.
  • Had to interact with various middle layers between source and target. Those middle layers include Golden Gate, BEAM, and Hadoop teams. Interacted with Hadoop teams for preparing strategies in Data Lake to receive data into Teradata.
  • Maintaining data in Teradata staging layers like Transient Stage, Semi persistent stage and Core layers.
  • Worked as ETL architect to design ETL models for Core layers, performed tuning of ETL processes and implemented data purge strategies.
  • Responsible for Performance Tuning of High CPU consuming queries and tables with high skew for daily batch jobs.
  • Supported end users on ad hoc data usage and served as a subject matter expert on the functional side of the business.
  • Performed development using Teradata utilities like BTEQ, FastLoad and MultiLoad to populate data into the BI DW.
  • Written complex SQLs using joins, sub queries and correlated sub queries. Expertise in SQL Queries for cross verification of data.
  • Developed Teradata BTEQ scripts to load data into incremental/staging tables and then move data from staging into base tables; a load sketch appears after this list.
  • Executing test scripts to verify actual results against expected results by using Oracle Golden Gate for source validation and Teradata for target validations.
  • Developed the FTP process from the Unix servers to the file services location for vendor delivery
  • Utilized Teradata DBQL to monitor queries running in production and modified them for better SLAs.
  • Used the Teradata SQL Assistant import and export utilities to move data from production to development to refresh staging tables.
  • Handled customer service data into the EDW for the BI team to generate reports for the end users.
  • Interacted with BODS HANA teams to change queries in HANA data ingestion processes.
  • Interacted with BO team for changes in their BO universes.
  • Worked with offshore teams and various time zone resources.
  • Implemented Continuous Process improvements and performance enhancements.
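
A minimal sketch of the staging-to-base load referenced in the list above, preceded by an awk cleanup step of the kind the file-handling scripts performed; the feed layout, database, table and column names are all hypothetical.

    #!/bin/ksh
    # Minimal sketch: drop malformed rows from the inbound feed with awk
    # (the cleaned file would be loaded to staging by a separate FastLoad
    # job), then move new rows from staging into the base table via BTEQ.
    # Feed layout, database, table and column names are placeholders.
    FEED=/data/inbound/cust_feed.dat
    awk -F'|' 'NF == 3' "$FEED" > "${FEED}.clean"   # keep rows with 3 fields

    bteq <<'EOF' || exit 1
    .logon tdprod/etl_user,password

    -- Simple incremental move: insert only rows not already in base
    INSERT INTO edw.customer (cust_id, cust_nm, load_dt)
    SELECT s.cust_id, s.cust_nm, CURRENT_DATE
    FROM   edw_stg.customer_stg s
    WHERE  NOT EXISTS (SELECT 1 FROM edw.customer b
                       WHERE  b.cust_id = s.cust_id);

    .logoff
    .quit
    EOF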

Confidential

Ab Initio Senior Technical Lead

Responsibilities:

  • Design and develop ETL solutions using Ab Initio
  • Mentor the development team for effective deliverables
  • Optimize, improve and enhance Ab Initio graphs. Performed analysis, design and preparation of the functional and technical design documents and code specifications.
  • Developed and supported the extraction, transformation and load process (ETL) for a Data Warehouse from their OLTP systems using Ab Initio and provide technical support and hands-on mentoring in the use of Ab Initio.
  • Responsible for all pre-ETL tasks upon which the Data Warehouse depends, including managing and collecting various existing data sources
  • Involved in developing UNIX Korn Shell wrappers to run various Ab Initio Scripts.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Merge, etc.
  • Worked on improving the performance of Ab Initio graphs by using various Ab Initio performance techniques, like using lookups instead of Joins.
  • Implemented Lookups, lookup local, in-memory Joins and Rollups to speed up various Ab Initio graphs.
  • Design Documentation for the developed graphs.
  • Used components like Run Program and Run SQL to run UNIX and SQL commands in Ab Initio. Also used components like Filter by Expression, Partition by Expression, Replicate, Partition by Key and Sort.
  • Create Summary tables using Rollup, Scan & Aggregate.
  • Created UNIX shell scripts to automate and schedule the jobs; a scheduling sketch appears after this list.
  • Created the migration scripts, test scripts for testing the applications, creating and supporting the Cognos reports.
  • Performed effort estimation for the work done. Track the progress of project and report the project status on weekly basis to client PM.
  • Involvement in Unit/Integration/End-to-End/UAT testing, making sure the complete product goes to Production with no bugs.
  • Analyze, design and propose effective design approaches for DW design
  • Automation of various processes through shell scripts to maintain the stability of the system
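
A minimal sketch of the shell automation and scheduling referenced in the list above; the script paths, graph name, mail alias and crontab entry are hypothetical.

    #!/bin/ksh
    # Minimal sketch: nightly wrapper that runs a deployed graph and mails
    # the log on failure. Paths, names and the mail alias are placeholders.
    LOG=/apps/etl/log/daily_summary_$(date +%Y%m%d).log

    /apps/abinitio/run/daily_summary.ksh >> "$LOG" 2>&1
    if [[ $? -ne 0 ]]; then
        mailx -s "daily_summary graph failed" etl-support@example.com < "$LOG"
        exit 1
    fi

    # Hypothetical crontab entry running the wrapper at 02:00 daily:
    # 0 2 * * * /apps/etl/bin/run_daily_summary.ksh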

Environment: Ab Initio, UNIX, Teradata, Ab Initio BRE, Ab Initio Continuous Flows

Confidential

Ab Initio Lead

Responsibilities:

  • Involved in understanding the Requirements of the End Users/Business Analysts and developed strategies for ETL processes.
  • Developed the graphs using the GDE with Partition by Round Robin, Partition by Key and Sort, Rollup, Sort, Scan, Reformat, Join, Merge, Gather and PBE components.
  • Extensively involved in using the Ab Initio parallelism (pipeline, component, & data) and best practices.
  • Used a generic graph to handle validation of incoming files.
  • Involved in the migration of the code in QA environment with QA team and completed rigorous unit testing of each graph.
  • Involved in conducting reviews for the Ab Initio codes after development.
  • Worked on Clear Quest to approve tickets, checking the status, fixing Business related issues.
  • Worked on checking out ETL codes like checking the Ab Initio setup, setting up configuration profiles, checking disk space, verifying parallelism.
  • Developed graphs separating the Extraction, Transformation and Load process to improve the efficiency of the system.
  • Involved in designing Load graphs using Ab Initio and tuned the performance of the queries to make the load process run faster.
  • Implemented Lookups, lookup local, In-Memory Joins to speed up various Ab Initio Graphs.
  • Extensively used Enterprise Meta Environment (EME) for version control
  • Involved in unit testing and assisted in system testing, integrated testing, and user acceptance testing.
  • Followed the best design principles, efficiency guidelines and naming standards in designing the graphs
  • Created High level Design, Detail Design, Unit Test Case documents following prescribed GIW standards.
  • Reduced the amount of data moving through flows, which had a significant impact on graph performance.
  • Performance tuning and optimization of database configuration and application SQL using Explain plans and Statistics collection based on UPI, NUPI, USI and NUSI; an index sketch appears after this list.
  • Developed OLAP reports and Dashboards using the Business intelligence tool - OBIEE.
  • Involved in comprehensive end-to-end testing: Unit Testing, System Integration Testing and User Acceptance Testing.
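
A minimal BTEQ sketch of the index selection noted in the list above: a Unique Primary Index chosen for even AMP distribution plus a NUSI to support a frequent filter, verified with EXPLAIN; all object names are hypothetical.

    #!/bin/ksh
    # Minimal sketch: UPI for even row distribution across AMPs, a NUSI on
    # a common filter column, fresh statistics, then an EXPLAIN to confirm
    # the optimizer uses the index. All names are placeholders.
    bteq <<'EOF'
    .logon tdprod/etl_user,password

    CREATE TABLE edw.order_fact (
        order_id   INTEGER NOT NULL,
        order_dt   DATE,
        store_id   INTEGER,
        order_amt  DECIMAL(12,2)
    ) UNIQUE PRIMARY INDEX (order_id);

    CREATE INDEX store_nusi (store_id) ON edw.order_fact;   -- NUSI

    COLLECT STATISTICS ON edw.order_fact COLUMN (store_id);
    EXPLAIN SELECT * FROM edw.order_fact WHERE store_id = 1001;

    .logoff
    .quit
    EOF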

Confidential

Ab Initio Developer

Responsibilities:

  • Design and develop DW ETL using Ab Initio.
  • Effectively designed the ETL process to handle huge volume of feeds.
  • Built extensive reusable components and high-performance ETL processes.
  • Involved in various reporting technologies including Cognos, Microstrategy, and BO
  • Build LDM and PDM data models using ERwin.
  • Design and implement 3NF and Dimensional Modelling.
  • Solely responsible for the development and delivery of multiple web interfaces to enable data provisioning and other operations by client using Oracle APEX
  • Build BI reports using Cognos.

Environment: Ab Initio, ERWin, Oracle APEX, Oracle 10g, Cognos
