
AbInitio & Metadata Hub Developer/Module Lead Resume


SUMMARY

  • Over six years of IT experience in the design, analysis, development, modeling, implementation, and testing of various applications, Decision Support Systems, and Data Warehousing applications.
  • Solid experience in Extraction, Transformation and Loading (ETL) using AbInitio. Knowledge of the full life cycle of building a data warehouse.
  • Experience in application tuning and debugging strategies.
  • Configured graph parameters, sandbox parameters, environment variables, and EME repository environments for Production/Testing/Development, and performed performance tuning.
  • Developed and automated control validations for the ETL process to appropriately tie the source and target data.
  • Experience with various AbInitio parallelism techniques; implemented AbInitio graphs using data parallelism and multifile system (MFS) techniques.
  • Worked with different source/Target systems like Oracle, MS SQL Server, Teradata, DB2.
  • Developed and tested wrapper scripts for Maestro jobs.
  • Developed AbInitio graphs in an MPP (Massively Parallel Processing) environment for ETL jobs in high-volume, dynamically growing environments.
  • Installed the AbInitio Data Profiler tool for the BI/DW manager.
  • Tuned the Graphs by removing unnecessary components to enhance the performance.
  • Exposure to the Conduct>It, BRE, and Data Profiler products.
  • Knowledge of analyzing data using AbInitio Data Profiler to estimate data patterns and assess duplicates, frequency, consistency, accuracy, completeness, and referential integrity of data.
  • Worked on database migration/gap analysis to migrate a database from SQL Server to Oracle.
  • Knowledge on Transformation rules management using Business Rules Engine (BRE).
  • Worked with ODS (Operational Data Store) and DSS (Decision Support System) environments to perform data profiling, data validation, and cleansing using AbInitio.
  • Excellent Knowledge of Business Intelligence and Data Warehousing Concepts.
  • Responsible for writing wrapper shell scripts to schedule jobs (see the sketch after this list).
  • Knowledge in Dimensional modeling like Star Schema and Snowflake Schema.
  • Extensively used ETL methodologies for supporting data extraction, transformations and loading processing using AbInitio.
  • Experience using Metadata Importer for importing metadata from an EME Technical Repository and other sources such as ETL tools (Informatica), reporting tools (Cognos, SAS, Business Objects, etc.), and databases (Oracle, Teradata, DB2, etc.) for building data lineage using Metadata Hub.
  • Hands on experience with Metadata Hub administration tools, utilities for creating Metadata Hub data stores.
  • Experience in creating and deploying Metadata Hub Web applications, and loading Metadata Hub customizations.
  • Good working experience with Very Large Database (VLDB) systems that are Massively Parallel Processing (MPP), and with compressed indexed multifile systems, using the AbInitio ETL tool and multifile concepts.
  • Experience in UNIX Shell Scripts including Korn Shell Scripts, Bourne Shell Scripts.
  • Experience in integration of various data sources with Multiple Relational Databases like Oracle, SQL Server, and MS Access.
  • Experienced in writing SQL query and PL/SQL Procedures, Triggers, and Functions necessary for maintenance of the databases in the data warehouse development lifecycle.
  • Ability to co-ordinate effectively with development team, business partners, end users and management.
  • Deep understanding and capability in troubleshooting of the common issues faced during the development lifecycle including coding, debugging, testing and final roll-out.
  • Set up Development, QA & Production environments.
  • Experience in reviewing and monitoring project progress, planning & managing dependencies and risks, Resource Planning, Project Tracking and Forecasting.
  • Self-motivated, with excellent written and oral communication skills.
  • Strong analytical and problem-solving skills.
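
As an illustration of the wrapper scripts mentioned above, here is a minimal ksh sketch for running a graph deployed from the GDE under a scheduler such as Maestro. All paths, file names, and the graph name are hypothetical examples, not details from the projects below.

#!/bin/ksh
# Minimal wrapper sketch for scheduling a deployed AbInitio graph.
# All paths and names below are hypothetical examples.
GRAPH_NAME=load_customers                 # hypothetical graph name
BIN_DIR=/app/etl/bin                      # hypothetical deployment directory
LOG_FILE=/app/etl/logs/${GRAPH_NAME}_$(date +%Y%m%d_%H%M%S).log

# Source the project environment so Co>Op settings apply
. /app/etl/config/project_env.ksh         # hypothetical environment file

print "$(date): starting ${GRAPH_NAME}" | tee -a "${LOG_FILE}"

# Run the .ksh produced by deploying the graph from the GDE
"${BIN_DIR}/run_${GRAPH_NAME}.ksh" >> "${LOG_FILE}" 2>&1
RC=$?

if [[ ${RC} -ne 0 ]]; then
    print "$(date): ${GRAPH_NAME} FAILED rc=${RC}" | tee -a "${LOG_FILE}"
    exit ${RC}                            # non-zero rc lets the scheduler alert
fi
print "$(date): ${GRAPH_NAME} completed OK" | tee -a "${LOG_FILE}"
exit 0

The scheduler only needs the script's exit status, which is why the wrapper propagates the graph's return code unchanged.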

TECHNICAL SKILLS

ETL Tools:

AbInitio GDE 1.1x/3.0x, Co>Op 2.1x/3.0, Metadata Hub, Data Profiler, Informatica, Erwin 4.1/3.7, Microsoft Visio 2007

Tools:

Maestro, Autosys, Tivoli Workload Scheduler

OS:

UNIX, Windows 95/98/NT/ME/2000/XP, MS-DOS, Linux 8.x

Databases:

Oracle 11i/10g/9i/8i/8/7.x, MS SQL Server 2008/2005/2000, Teradata V2R6.1, MS Access, IBM DB2.

Languages:

UNIX Shell Scripting, SQL, PL/SQL, T-SQL, C, C++, JCL, Perl.

Scripting:

UNIX Shell (Ksh/Csh), JavaScript.

Web:

HTML, DHTML, XML, ASP 2.0/3.0

EDUCATION

  • Bachelor of Science in Electronics and Communication Engineering

PROFESSIONAL EXPERIENCE

Confidential, Jan 12 – Present
AbInitio & Metadata Hub Developer/Module Lead
Description: Confidential has initiated a metadata process for Card Services that enables a target state with a single location where well-identified metadata is stored for reference, metadata elements are populated with business-approved content, and data is collected, maintained, and published to a shared repository. The Metadata Explorer is part of the AbInitio Enterprise Metadata Management System (EMMS), used to analyze, explore, and manage metadata through a standard web browser.

Responsibilities:

  • Involved in the development of AbInitio graphs for loading and extracting data from various schemas in the Oracle database.
  • Created mapping documents based on the requirements, had review meetings with Business and Architects to finalize the mapping documents.
  • Created graphs using components like Input/Output File/Table, Lookup, Reformat, Redefine Format, Dedup Sorted, Filter by Expression, Partition by Expression, Partition by Key and Sort, Sort, Broadcast, Replicate, Join, Merge, Concatenate, Gather, Rollup, Scan, Read Excel Spreadsheet, FTP To, Publish, Subscribe, Fuse, Run Program, Run SQL, MQ Publish Headers, Read/Write XML, Batch Subscribe, and Update Table.
  • Performed data cleansing operations on the data using transformation functions like is_valid, is_defined, is_null, is_blank, string_lrtrim, re_index, reinterpret_as, string_concat, string_substring, lookup_count, lookup_first, now(), decimal_strip, re_replace, decimal_lpad, next_in_sequence(), length_of, test_characters_all(), force_error(), switch(), first_defined(), lookup_match(), conditional DML, the cobol-to-dml utility, the xml-to-dml utility, etc.
  • Developed complex AbInitio XFRs to derive new fields and solve rigorous business requirements.
  • Worked on improving the performance of AbInitio graphs by employing AbInitio performance components like Lookups (instead of joins), In-Memory Joins, Rollup and Scan components to speed up execution of the graphs.
  • Tuned the graphs by creating lookup files and adjusting memory sort and max-core parameters to maximize cache memory usage and enhance performance.
  • Provided 24x7 production support for ETL jobs on daily, weekly, and monthly schedules.
  • Using Metadata Importer for importing Metadata from an EME Technical Repository and other sources like ETL tools (Informatica), reporting tools (Cognos, SAS, Business Objects etc) and databases (Oracle, Teradata, DB2 etc.).
  • Importing the Catalog, Custom Catalog, and Roles and Privileges feeds for the databases (Oracle and Teradata) so that the databases appear in the Physical Assets hierarchy.
  • Importing the Erwin Logical and Erwin UDPs Feeds for the Logical Models.
  • Loading Metadata Hub customizations.
  • Importing the EME Datasets and EME Graph Imports for building the Data Lineage.
  • Importing the EME datasets and Data Profiler results from the EME to the Technical Repository so that the results appear in the Metadata Portal.
  • Importing the Business Data Definition Matrix feeds.
  • Loading the Customizations for Valid Value Combinations which include Master Values, Business Data Elements Groups and Domain Code Sets.
  • Creating Metadata Hub data stores using utilities.
  • Creating and deploying Metadata Hub Web applications.
  • Customizing the Metadata Explorer so that business users can explore and analyze the metadata, see the contents of systems and applications, and drill down into the details of an object.
  • Creating new feed files for importing metadata both on the command line and in the Metadata Portal (see the hedged sketch after this list).
  • Creating rule files for Transformations and importing the feeds.
  • Creating Data Source Connection files for connecting to the graphs in order to extract the Metadata.
  • Generating Metadata Reports and auditing.
  • Adding and Exposing the Divisions in the Metadata Portal.
  • Exposing the Notes tab and enabling the various note types in the Metadata Portal.
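
To show the shape of the command-line feed imports above, here is a rough ksh sketch. The import utility path (MH_IMPORT_CMD) is a placeholder, not a real Metadata Hub command: the actual import utility and its flags are installation-specific, so verify everything against your Metadata Hub documentation.

#!/bin/ksh
# Sketch: stage a metadata feed file and hand it to the Metadata Hub importer.
# MH_IMPORT_CMD is a PLACEHOLDER; the real utility and its flags are
# installation-specific and must come from the Metadata Hub docs.
FEED_FILE=$1                                    # e.g. catalog_feed.csv (hypothetical)
STAGE_DIR=/app/mhub/feeds/incoming              # hypothetical staging directory
MH_IMPORT_CMD=${MH_IMPORT_CMD:-/app/mhub/bin/import_feed}   # placeholder path

if [[ ! -r "${FEED_FILE}" ]]; then
    print -u2 "feed file ${FEED_FILE} missing or unreadable"
    exit 1
fi

cp "${FEED_FILE}" "${STAGE_DIR}/" || exit 1

# Invoke the site-specific importer against the staged feed
"${MH_IMPORT_CMD}" "${STAGE_DIR}/$(basename "${FEED_FILE}")"
exit $?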

Environment: co>op 2.15/3.0.1, GDE 3.0.1, Metadata Hub 3.0.4, Oracle, Erwin, UNIX and HP Quality Center.

Confidential, Irving, TX Jun 10 – Jan 12
AbInitio Developer/Module Lead
Description: Confidential wants to be proactive in decreasing fraud in its banking activities. It aims to accomplish this by reducing opportunities for fraud and reporting fraud activity while increasing asset recovery. CBNA (Citi Bank North America) has initiated this Enterprise Fraud Detection project, which puts in place a system to detect fraud in real time. The initiative takes transactions from different sources and determines whether to authorize them. AbInitio ETL would be the key integration component in developing this application environment.
Responsibilities:

  • Involved in meetings with Business System Analysts and Business users to understand the functionality.
  • Created mapping documents based on the requirements, had review meetings with Business and Architects to finalize the mapping documents.
  • Responsible for Logical and Physical Design documents for the project and discussed the implementation process with Architect.
  • Dealt with ASCII (fixed, delimited, multiple headers/trailers), EBCDIC, spreadsheet, and XML files.
  • Involved in the development of AbInitio graphs for loading and extracting data from various schemas in the Oracle database.
  • Created graphs using components like Input/Output File/Table, Lookup, Reformat, Redefine Format, Dedup Sorted, Filter by Expression, Partition by Expression, Partition by Key and Sort, Sort, Broadcast, Replicate, Join, Merge, Concatenate, Gather, Rollup, Scan, Read Excel Spreadsheet, FTP To, Publish, Subscribe, Fuse, Run Program, Run SQL, MQ Publish Headers, Read/Write XML, Batch Subscribe, and Update Table.
  • Worked on Continuous Flows for sending the XML messages to Real Time application (TIBCO).
  • Used Continuous Flows for data enrichment with the mainframe.
  • Developed and tested wrapper scripts for graphs and Maestro jobs/schedulers.
  • Performed data cleansing operations on the data using transformation functions like is_valid, is_defined, is_null, is_blank, string_lrtrim, re_index, reinterpret_as, string_concat, string_substring, lookup_count, lookup_first, now(), decimal_strip, re_replace, decimal_lpad, next_in_sequence(), length_of, test_characters_all(), force_error(), switch(), first_defined(), lookup_match(), conditional DML, the cobol-to-dml utility, the xml-to-dml utility, etc.
  • Wrote NDM scripts, mail scripts for successful processing of the files, file-watcher scripts (see the sketch after this list), run ksh scripts for running the graphs, housekeeping scripts, etc.
  • Processed the source feed files according to the business logic and NDM'ed the files to target systems.
  • Tuned the graphs by creating lookup files and adjusting memory sort and max-core parameters to maximize cache memory usage and enhance performance.
  • Handled Deployments and Migration of the Code to Prod/Test regions.
  • Involved in various EME data store operations like creating sandbox, code check-in, code checkout, creating project parameters according to the environment settings for this application.
  • Created checkpoints, phases to avoid dead locks and tested the graphs with sample data.
  • Developed complex AbInitio XFRs to derive new fields and solve rigorous business requirements.
  • Generated DB configuration files (.dbc) for source and target tables.
  • Wrote SQL queries.
  • Created mockup data in the dev environment to test the business functionality. Supported System testing, UAT, and preproduction testing. Handled critical data issues in the System Testing and UAT environments during recovery.
  • Worked with the team in carrying out unit testing to identify errors in the AbInitio code related to customer data from everyday transactions in a real-time transaction environment.
  • Wrote user-defined functions for business processes to improve the performance of the application.
  • Wrote production handoff documents.
  • Implemented the parallel application by replicating the components and processing modules into a number of partitions.
  • Coordinated with the offshore team.
  • Provided 24x7 production support for ETL jobs on daily, weekly, and monthly schedules.
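
The file-watcher scripts mentioned above can be as simple as a polling loop that waits for an arrival marker and then starts the load. This is a sketch with hypothetical paths, file names, and a hypothetical downstream wrapper (run_feed_load.ksh).

#!/bin/ksh
# Sketch of a file watcher: poll for an arrival marker, then kick off
# the graph wrapper. All paths and timings are hypothetical.
WATCH_FILE=/app/etl/landing/daily_txn.dat     # expected source feed
TRIGGER=/app/etl/landing/daily_txn.done       # "file complete" marker
WRAPPER=/app/etl/bin/run_feed_load.ksh        # hypothetical graph wrapper
MAX_WAIT_SECS=7200                            # give up after 2 hours
SLEEP_SECS=60

elapsed=0
while [[ ! -f "${TRIGGER}" ]]; do
    if (( elapsed >= MAX_WAIT_SECS )); then
        print -u2 "timed out waiting for ${TRIGGER}"
        exit 2                                # non-zero rc lets the scheduler alert
    fi
    sleep ${SLEEP_SECS}
    (( elapsed += SLEEP_SECS ))
done

# Marker present: the feed is complete, so run the load
[[ -f "${WATCH_FILE}" ]] && exec "${WRAPPER}" "${WATCH_FILE}"
print -u2 "trigger present but ${WATCH_FILE} missing"
exit 3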

Environment: co>op 2.15/3.0.1, GDE 1.14/1.15/3.0.1, Oracle, Erwin, UNIX and HP Quality Center.

Confidential, Hartford, CT Jan 09 – Apr 10
AbInitio Developer
Description: Confidential offers a full line of Auto, Home, and Renters insurance products to protect businesses, cars, and homes. Auto Quote Generation, Quote Submissions, Policy Renewals, Reinstatements, Reissues, Coverage Charts, Endorsements, Conditions, and Regular Solicitation Processes for its products, sub-products, and their corresponding coverage limits are the aspects involved.

Responsibilities:

  • Involved in the development of AbInitio graphs for loading and extracting data from various schemas in the Oracle database.
  • Involved in creating sandbox.
  • Debugged and modified shell scripts.
  • Parameterized the graphs from local to global parameters for various job request loads.
  • Extensively used AbInitio components like Reformat, Redefine Format, Rollup, Scan, Input Table, Output Table, Partition By Expression, Partition by Key and Sort, Merge, Concatenate, Gather, Dedup, Replicate, Join.
  • Developed and tested wrapper scripts for graphs under the Maestro automation system.
  • Implemented the parallel application by replicating the components and processing modules into a number of partitions.
  • Involved in the Deployment and migration process.
  • Performed data profiling on incoming data feeds based on the classification of warnings and errors.
  • Designed and deployed well-tuned AbInitio graphs (Generic and Custom) for ODS and DSS instances and created Graphs for code reusability.
  • Extensive experience using the EME data store to check in and check out the graphs.
  • Produced the required documents like High/Low Level Design Documents.
  • Responsible for creating a multifile system composed of individual partitioned files stored in distributed directories (see the sketch after this list).
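
For the multifile system in the last bullet, creation is typically done with the Co>Operating System's m_mkfs utility. This is a sketch from memory with hypothetical hosts and paths; verify the exact m_mkfs syntax against the Co>Operating System documentation for your release.

#!/bin/ksh
# Sketch: create a 4-way multifile system with m_mkfs.
# Hosts and directories are hypothetical; verify the syntax in the
# Co>Operating System documentation.

# The first argument is the control directory; the rest are the data
# partitions that records will be striped across.
m_mkfs //etlhost/app/mfs/mfs4way \
       //etlhost/data01/mfs4way  \
       //etlhost/data02/mfs4way  \
       //etlhost/data03/mfs4way  \
       //etlhost/data04/mfs4way

# Multifile commands (m_ls, m_cp, m_rm, ...) then treat the control
# path as if it were a single file:
m_ls //etlhost/app/mfs/mfs4way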

Environment: AbInitio 2.14, GDE 1.14, Oracle, Erwin, UNIX and HP Quality Center.

Confidential, Whitehouse Station, NJ Jun 07 – Oct 08
AbInitio Developer
Description: Confidential is a global research-driven pharmaceutical company dedicated to putting patients first. Merck discovers, develops, manufactures, and markets vaccines and medicines to address unmet medical needs. The company devotes extensive efforts to increasing access to medicines through far-reaching programs that not only donate Merck medicines but help deliver them to the people who need them. Merck also publishes unbiased health information as a not-for-profit service.
Responsibilities:

  • Created checkpoints, phases to avoid dead locks and tested the graphs with sample data.
  • Scheduled the graphs using the job scheduler.
  • Involved in creating the generic graph for sending data to the mainframe system using the FTP component (see the sketch after this list).
  • Modified AbInitio component parameters and utilized data parallelism to improve overall performance and fine-tune execution times.
  • Involved in creating the multifile system (MFS), which gives the user central control of distributed data files and provides the scalability and access patterns that parallel applications require.
  • Used AbInitio Components like Partition by Key, reformat, rollup, join, scan, normalize, gather, replicate, merge etc.
  • Responsible for analyzing the source data and reporting data quality levels using AbInitio as an OLAP tool.
  • Extensively worked in the UNIX environment, created many UNIX processes through shell scripts, and worked closely with the UNIX administrator on performance tuning.
  • Utilized the EME for version control, and tracking of changes made to new and old graphs.
  • Responsible for interpreting the transformation rules for all target data objects and develop the software components to support the transformation. Coded and tested complex AbInitio ETL routines.
  • Knowledge of the AbInitio Business Rule Designer/Editor to define, design, document, and edit business rules and to generate XFRs using rule sets in BRE (Business Rules Engine).
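
A generic mainframe transfer like the one above can also be driven from a wrapper with a plain ftp heredoc. The host, dataset name, record attributes, and credential handling below are hypothetical.

#!/bin/ksh
# Sketch: send an extract file to a mainframe dataset over FTP.
# Host, dataset, and credential handling are hypothetical examples.
MF_HOST=mvs.example.com
MF_USER=$(cat /app/etl/config/mf_user)    # hypothetical credential files
MF_PASS=$(cat /app/etl/config/mf_pass)
LOCAL_FILE=/app/etl/out/extract.dat
MF_DATASET="'PROD.ETL.EXTRACT.DAILY'"     # quoted fully qualified dataset

# 'quote site' sets the dataset attributes (fixed-block, 200-byte records)
ftp -n "${MF_HOST}" <<EOF
user ${MF_USER} ${MF_PASS}
quote site lrecl=200 recfm=FB
ascii
put ${LOCAL_FILE} ${MF_DATASET}
bye
EOF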

Environment: AbInitio (GDE 1.13, Co>Op 2.13), SQL, HP-UX 11.0, Erwin, AbInitio Continuous Flows, Autosys.

Confidential, New York, NY Sep 06 – Apr 07
AbInitio Developer
Description: Confidential is the world's leading provider of market research, information, and analysis to the consumer products and services industries. More than 9,000 clients in more than 100 countries rely on ACNielsen's dedicated professionals to measure competitive marketplace dynamics, to understand consumer attitudes and behavior, and to develop advanced analytical insights that generate increased sales and profits. ACNielsen Corporation is the global leader in consumer research, offering comprehensive information that tracks sales, volume, shares, trends, pricing, promotions, distribution, and inventory levels for corporate clients in a variety of industries worldwide.
Responsibilities:

  • Involved in all phases of the ETL process, which includes requirement analysis, Source-Target Mapping document and ETL process design.
  • Extensively used components like Partition by Key & Sort, Partition by Expression/ Round Robin, Filter by Expression, Sort, Reformat, Gather, Redefine, Replicate, Scan, Denormalize, and Normalize components to develop the ETL transformation logic.
  • Developed Complex AbInitio XFR’s to derive new fields and solved rigorous business requirements.
  • Generated DB configuration files (.dbc) for source and target tables.
  • Worked with AbInitio components to create Summary tables using Rollup and Scan components.
  • Designed and developed the graphs with the components rollup, dedup sorted, reformat, read excel, join, merge, gather and concatenate components.
  • Created complex AbInitio graphs and extensively used Partition and Departition components to process huge volume of records quickly, thus decreasing the execution time.
  • Worked on improving the performance of AbInitio graphs by employing AbInitio performance components like Lookups (instead of joins), In-Memory Joins, Rollup and Scan components to speed up execution of the graphs.
  • Checked in/checked out existing applications using the EME in order to perform the necessary modifications.
  • Performed data cleansing operations on the data using transformation functions like is_valid, is_defined, is_null, is_error, string_trim, etc. (see the sketch after this list).
  • Wrote user-defined functions for business processes to improve the performance of the application.
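
As a post-load companion to the cleansing functions above, basic tie-out checks can run through a sqlplus heredoc from the same wrappers. The connect string, table, and column names below are hypothetical.

#!/bin/ksh
# Sketch: post-load cleansing check via sqlplus. The connect string,
# table, and column names are hypothetical examples.
DB_CONNECT=${DB_CONNECT:?set to user/password@tns}   # supplied by the environment

BAD_ROWS=$(sqlplus -s "${DB_CONNECT}" <<'EOF'
set heading off feedback off pagesize 0
-- count target rows that violate simple cleansing rules
select count(*)
from   cust_target
where  cust_id is null
   or  trim(cust_name) is null;
exit
EOF
)

if [[ ${BAD_ROWS} -gt 0 ]]; then
    print -u2 "cleansing check failed: ${BAD_ROWS} bad rows"
    exit 1
fi
print "cleansing check passed"
exit 0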

Environment: CO>op 2.1x, AbInitio GDE 1.1x, PL/SQL, Erwin, UNIX and Quality Center.
