
Sr. Ab Initio Developer Resume


Chicago, IL

SUMMARY:

  • Around 8 years of experience in the design and development of Data Warehouse and Business Intelligence solutions using the Ab Initio ETL tool.
  • Implemented many complex modules in various projects through the System Development Life Cycle (SDLC) process, taking end-to-end ownership.
  • Involved in all development life cycle stages of the project: requirement gathering, design, development, the various testing phases (UAT and OAT), and production deployment support.
  • Actively participated in Ab Initio COE activities and developed several generic scripts for code automation.
  • Worked on configuring the Ab Initio environment to connect to databases using a DB configuration file and the Input Table, Output Table, Update Table, and other custom components.
  • Strong experience with Ab Initio architecture, the GDE, and the Co>Operating System.
  • Extensively used air commands to check code in and out, perform dependency analysis, and carry out other EME (Enterprise Metadata Environment) operations.
  • Experience in integrating various data sources such as flat files, ASCII files, and XML files.
  • Designed and encouraged the use of reusable graphs in Ab Initio to promote reusability, eliminate coding redundancy, and ease version-control maintenance.
  • Experience working with the Linear Fulfillment Environment (LFE) and BDF, including the BRE and ACE tools.
  • Expertise in the concepts of data warehousing, data marts, ER modeling, dimensional modeling, and fact and dimension tables.
  • Good experience with data migration, data transformation, data quality environment configuration, and data loading using ETL with RDBMSs and file systems such as mainframe files.
  • Experience using Ab Initio as the ETL tool, with extensive knowledge of Extract, Transform, and Load architecture design; also worked as an analyst gathering user requirements.
  • Experience in the installation, configuration, and administration of ETL tools.
  • Good knowledge of Teradata utilities such as SQL Assistant, FastLoad, and MultiLoad (a FastLoad sketch follows this list), as well as Hadoop.
  • Good knowledge of developing Ab Initio Continuous Flows for web services.
  • Provided support for the backend graphs.
  • Good knowledge of the core concepts of data warehousing and data mining.
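
A minimal FastLoad sketch of the kind used to bulk-load a delimited flat file into an empty Teradata staging table; the host, credentials, table, column, and file names below are all placeholders:

    #!/bin/ksh
    # Hypothetical FastLoad run: load a pipe-delimited extract into an
    # empty staging table. Replace host/user/password and object names.
    fastload <<'EOF'
    LOGON tdhost/etl_user,etl_password;
    DATABASE stage_db;
    SET RECORD VARTEXT "|";
    BEGIN LOADING stage_db.customer_stg
        ERRORFILES stage_db.cust_err1, stage_db.cust_err2;
    DEFINE cust_id   (VARCHAR(18)),
           cust_name (VARCHAR(60)),
           FILE = /data/inbound/customer.dat;
    INSERT INTO stage_db.customer_stg VALUES (:cust_id, :cust_name);
    END LOADING;
    LOGOFF;
    EOF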

TECHNICAL SKILLS:

ETL: Ab Initio GDE (1.15/3.1.6), Co>Operating System (2.13, 2.15, and 3.12), Control-M, EME

Operating Systems: Linux, Windows 2000/XP/7

Languages: SQL, PL/SQL, UNIX Shell Scripting, C, C++

Tools: Control-M, AutoSys, IBM TWS, SAS, MS Visio

Database: Oracle 9i/10g/11g, Teradata, DB2, SQL Server

PROFESSIONAL EXPERIENCE:

Confidential, Chicago, IL

Sr. Ab Initio Developer

Responsibilities:

  • Developed various graphs required by multiple clients of Confidential.
  • Worked with more than 20 clients of Confidential to process their data per their requests.
  • Read data from mainframe files (JCL library files) and processed it through open systems/Ab Initio via PuTTY.
  • Analyzed business requirements before starting processing.
  • Reformatted every mainframe file to convert the data from EBCDIC to ASCII (a conversion sketch follows this list).
  • Processed the input layouts and handled large volumes of data.
  • Analyzed requirements from the client's perspective through the order specification form.
  • Created structured, unstructured, and hybrid graphs based on the requirements.
  • Worked on different jobs such as analysis, extraction, prescreen, prospects, interim file processing, test file processing, final file processing, portfolio review, and forms handling.
  • Created lookup files for different reports with value ranges spanning negative and positive values for the sub-codes.
  • Zipped and unzipped files as part of Ab Initio processing; used ACE/LFE to create the generic graphs.
  • Good experience with data migration, data transformation, and data loading using ETL with RDBMSs and file systems such as mainframe files.
  • Designed and integrated the ETL components and resolved technical and data-centric issues.
  • Used different Ab Initio components such as Subscribe, Publish, Multi Publish, Continuous Update Table, Read XML, Write XML, Partition by Key and Sort, Dedup, Rollup, Scan, Reformat, Join, and Fuse in various graphs.
  • Created the Data Detailed View file process and transferred the files via FTP and SFTP.
  • Maintained the graph through its different phases so that all requests were processed and the main process ran to completion.
  • Extensively used Teradata utilities such as FastLoad and MultiLoad, along with DDL and DML commands (SQL).
  • Updated the FICO SSN reports monthly and quarterly for the requesting clients.
  • Involved in reviewing data modeling and mapping (HLD) documents for test case preparation.
  • Used the Excel Sheet component in Ab Initio to create reports and sent them through the Email Reports component directly from the Ab Initio process.
  • Created final files, stats files, waterfall stats, number flow stats, manifests, and pset properties files.
  • Set up long-term storage of files per the retention period requested by the client.
  • Sent files to the posting and billing teams at the same time the final files were sent to customers.
  • Validated record match counts against historical order data.
  • Executed QC for data validation and data profiling in development and production.
  • Delivered the graph and its data after unit testing and QA sign-off.
  • Documented every customer requirement and each custom backend graph.
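
A hypothetical sketch of the EBCDIC-to-ASCII step for a fixed-width, character-only mainframe extract; the file names are placeholders, and cbs must match the mainframe record length (LRECL). Packed-decimal (COMP-3) fields cannot be converted this way and are instead read directly in Ab Initio with an EBCDIC DML record format:

    #!/bin/ksh
    # Convert an EBCDIC extract to ASCII, emitting one newline-terminated
    # record per 500-byte input record (cbs = LRECL). Paths are placeholders.
    dd if=/data/inbound/acct.ebc of=/data/work/acct.ascii \
       cbs=500 conv=ascii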

Environment: Ab Initio GDE 3.1.6, Enterprise Metadata Environment (EME), Ab Initio Co>Operating System 3.1.2, UNIX, Oracle 11.2.3, PuTTY, Windows 7, Confidential Compare, Microsoft Visio, Mainframes

Confidential, Franklin Lakes, NJ

Sr. Ab Initio Developer

Responsibilities:

  • Confidential Limit allows users to set up overall plan Confidential such as day supply (min and max), monthly and quarterly quantities, and refills allowed.
  • Involved in writing the detailed design document and preparing test cases.
  • Collaborated with other domain architects, DBAs, data modelers, support groups, and application development and test teams to ensure timely deliverables.
  • Involved in developing and modifying graphs for the Confidential application based on Converge rules and refill conditions.
  • Involved in developing the Ab Initio graphs for the Confidential and Claims Compare applications.
  • Involved in all stages of the SDLC during the project; analyzed, designed, and tested the new system for performance, efficiency, and maintainability using the Ab Initio ETL tool.
  • Extracted West Claims related files, translated them into WHSEFUL format, and compared them with the East Claims related WHSEFUL format files.
  • Designed and developed graphs using the GDE with the Sort, Reformat, Replicate, Assign Keys, Join, Merge, Gather, and Concatenate components.
  • Made changes to the application in Express>It and published the application from Express>It for promotion or release.
  • Set up and maintained environments created to support projects.
  • Effectively used various built-in functions such as lookup, lookup_match, lookup_local, string_filter_out, vectors, scan, normalize, and rollup in various transform functions.
  • Experience working with VLDBs (very large databases) to retrieve and process data.
  • Populated multiple data warehouse tables through DataStage.
  • Advanced knowledge of Ab Initio Metadata Hub.
  • Involved in data migration, transferring data from one system to another while changing the storage, database, or application using Ab Initio ETL.
  • Involved in data quality environment (DQE) configurations.
  • Involved in working with Teradata utilities such as FastLoad, MultiLoad, and FastExport.
  • Supported end-users of the application; automated the entire data process using UNIX shell scripts and scheduled it.
  • Involved in unit testing and system testing of the Confidential application.
  • Wrote JIL programs to run jobs automatically through AutoSys (a JIL sketch follows this list) and Control-M.
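
A hypothetical JIL sketch for the kind of AutoSys job used here: a daily command job that runs a deployed Ab Initio graph. The job, machine, owner, script, and log names are placeholders; the definition is loaded by piping it to the jil command:

    #!/bin/ksh
    # Define (or redefine) a daily AutoSys command job for an ETL graph.
    jil <<'EOF'
    insert_job: claims_load_daily
    job_type: c
    command: /apps/abinitio/run/claims_load.ksh
    machine: etl_server01
    owner: etladm
    start_times: "02:00"
    days_of_week: all
    std_out_file: /apps/logs/claims_load.out
    std_err_file: /apps/logs/claims_load.err
    alarm_if_fail: 1
    EOF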

Environment: Ab Initio GDE 3.1.2, Ab Initio BRE, Teradata, Enterprise Metadata Environment (EME), Co>Operating System 3.1.2, UNIX, Oracle, DB2, PuTTY

Confidential, Foster City, CA

Sr. Ab Initio Developer

Responsibilities:

  • Fund Service Reporting (FSR) contains three modules: Extracts, MultiFunds, and Loads.
  • Developed high-level and low-level design documents for processing each fund extract and documented the various implementations done during each branch of the FSR application.
  • Designed and developed an Ab Initio Metadata Hub driven platform to reduce development effort for all future ETL solutions using Ab Initio's metaprogramming features.
  • Participated in various data modeling exercises including forward engineering, reverse engineering, complete compare, match and merge, creating DDL scripts, creating subject areas, publishing models to PDF and HTML formats, and generating various data modeling reports.
  • Used different Ab Initio components such as Subscribe, Batch Subscribe, Publish, Multi Publish, Continuous Update Table, Read XML, Write XML, Partition by Key and Sort, Dedup, Rollup, Scan, Reformat, Join, and Fuse in various graphs.
  • Also used the Trash, Run Program, and Run SQL components to run UNIX and Meta-SQL commands in Ab Initio.
  • Involved in writing Meta-SQL queries to migrate Extracts from DEV to UAT.
  • Involved in writing Meta-SQL join queries to determine whether funds had loaded into the database.
  • Involved in writing procedures, functions, and packages using PL/SQL.
  • Upgraded the MDE (Metadata Driven Engine) platform to accommodate new strategic solutions.
  • Developed graphs in Ab Initio that process data from Cloudera Hadoop sources such as Hive and HDFS.
  • Responsible for extracting data (daily, weekly, and monthly) from the database and maintaining historical data in the database for BI reports.
  • Carried out detailed profiling of operational data using the Ab Initio Data Profiler and SQL.
  • Involved in testing the application with test data in Express>It to ensure that the features were configured as expected.
  • Involved in the design of the ETL application, designing and developing automated ETL processes.
  • Used different Enterprise Metadata Environment (EME) air commands during project promotion, such as air tag create, air save, air load, and air project export (a promotion sketch follows this list).
  • Wrote JIL programs to run jobs automatically through AutoSys and Control-M.
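
A hypothetical promotion sketch using the EME air commands named above: tag the project's current object versions, save them to an archive, and load the archive into the target EME. The datastore paths, project name, tag, and archive file are all placeholders:

    #!/bin/ksh
    # Tag and archive the project in the DEV EME...
    export AB_AIR_ROOT=//dev_host/data/eme/dev_repo
    air tag create FSR_REL_1_0 /Projects/fsr       # tag current object versions
    air save /tmp/fsr_rel_1_0.save /Projects/fsr   # save the project to an archive

    # ...then load the archive into the UAT EME.
    export AB_AIR_ROOT=//uat_host/data/eme/uat_repo
    air load /tmp/fsr_rel_1_0.save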

Environment: Ab Initio GDE 1.15.5/1.16.7, Co>Operating System 3.0/3.12, UNIX, DB2, QTODBC, SqlDbx, PuTTY, SQL/PL-SQL, Windows 2000/XP, MS Office, Data Modeling, Visio, Shell Scripts, XML, AutoSys, Control-M

Confidential, Atlanta, GA

Sr. Ab Initio Developer

Responsibilities:

  • Developed high-level and low-level design documents for processing each feed and documented the various implementations done during the course of the project.
  • Participated in Agile iterative sessions to develop the extended logical data model and physical model.
  • Involved in source-to-target mapping discussions; participated in various data cleansing and data quality exercises.
  • Developed various graphs that extract XML files and load them into the database, developing and tuning Ab Initio ETLs.
  • Used phasing and checkpointing in the graphs to avoid deadlock and to recover completed stages of a graph in case of failure.
  • Extensively used Ab Initio commands such as m_ls, m_cp, m_mv, m_db, and m_touch to operate on multifiles (a sketch follows this list).
  • Carried out detailed profiling of operational data using the Ab Initio Data Profiler and SQL.
  • Performed data cleansing and data validation using Ab Initio functions such as is_valid, is_defined, is_error, is_null, and string_trim.
  • Involved in writing procedures, functions, and packages to upload data from the database.
  • Involved in moving applications from AIX to Linux servers and modified the UNIX shell scripts so the applications run on Linux.
  • Good experience with data migration, data transformation, and data loading using ETL with RDBMSs and file systems such as mainframe files.
  • Involved in migrating code from DEV to QA and from QA to PROD using Heats (a Home Depot utility).
  • Extensively used Teradata utilities such as FastLoad and MultiLoad, along with DDL and DML commands (SQL).
  • Developed graphs in Ab Initio that process data from Cloudera Hadoop sources such as Hive and HDFS.
  • Created HDDTM (Home Depot ETL process Dynamic Test Manager) plans, Go Scripts, and child plans to run the jobs in sequence.
  • Involved in production support, monitoring jobs and schedules through IBM Tivoli Workload Scheduler (Maestro) 8.5.
  • Involved in unit testing of the graphs and prepared the test case documents.
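
A brief sketch of the multifile commands mentioned above, operating on a hypothetical 4-way multifile system (all paths are placeholders):

    #!/bin/ksh
    MFS=/data/mfs/mfs4way          # control directory of the multifile system

    m_touch $MFS/claims.dat                    # create an empty multifile across partitions
    m_ls -l $MFS                               # list multifiles with partition detail
    m_cp $MFS/claims.dat $MFS/claims.bak       # copy a multifile, partition by partition
    m_mv $MFS/claims.bak $MFS/old/claims.bak   # move/rename within the MFS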

Environment: Ab Initio GDE 1.15.7, Enterprise Metadata Environment (EME), Co>Operating System 2.15, Teradata, UNIX, DB2, Teradata SQL Assistant, PuTTY, SQL/PL-SQL, Windows 2000/XP, Data Modeling, MS Office, Visio, Shell Scripts, XML, XSD, Maestro 8.5.

Confidential, NJ

Ab Initio Developer

Responsibilities:

  • Understood the business process and gathered requirements from business users.
  • Implemented star schema models for the data warehouses by creating facts and dimensions.
  • Carried out detailed profiling of operational data using the Ab Initio Data Profiler/SQL tools to gain a better understanding of the data for analytical use by business/user groups.
  • Used ACE to create the generic graphs.
  • Participated in Agile iterative sessions to develop extended logical data models and physical models.
  • Created the mapping document and the ETL design documents (HLD and DLD).
  • Developed Ab Initio graphs using Ab Initio parallelism techniques, data parallelism, and MFS techniques with conditional components and conditional DML.
  • Identified the required dependencies between ETL processes and triggers to schedule the jobs that populate the data marts on a scheduled basis.
  • Used Control-M and AutoSys to run the graphs and monitor them on the server.
  • Carried out performance tuning on the Ab Initio graphs to reduce processing time.
  • Automated the entire data mart process using UNIX shell scripts (a wrapper sketch follows this list) and scheduled the process using TWS and Job Track after dependency analysis.
  • Extensively used Teradata utilities such as FastLoad, MultiLoad, and TPump, along with DDL and DML commands (SQL).
  • Supported the testing team with data verification, loaded data into the test environment, and provided SQL statements for testing the test cases.
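
A hypothetical wrapper of the kind scheduled through TWS/Job Track: it runs a deployed Ab Initio graph (deployed graphs are generated as .ksh scripts), captures the log, and exits nonzero on failure so the scheduler can alert. The script, log, and mail recipient are placeholders:

    #!/bin/ksh
    GRAPH=/apps/abinitio/run/load_datamart.ksh
    LOG=/apps/logs/load_datamart_$(date +%Y%m%d_%H%M%S).log

    # Run the deployed graph and capture all output.
    $GRAPH > "$LOG" 2>&1
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "load_datamart failed rc=$rc, see $LOG" |
            mailx -s "Data mart load FAILED" etl.support@example.com
        exit $rc
    fi
    echo "load_datamart completed successfully" >> "$LOG"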

Environment: Ab Initio GDE 2.14, Enterprise Metadata Environment (EME), Co>Operating System, Teradata V2R6, PL/SQL, TOAD, Control-M, UNIX.
