
Ab Initio Developer Resume


Atlanta, GA

SUMMARY

  • Over 11 years of IT experience in the analysis, design, development, and maintenance of various business applications.
  • Strong experience in Ab Initio architecture, GDE, and the Co>Operating System.
  • Extensively used AIR commands to check in/check out, perform dependency analysis, and carry out other EME-related operations.
  • Expertise in the use of Co>Op, Express>It (ACE), BRE, DQE, Metadata Hub, Conduct>It, Correlate, Query>It, and dependency analysis for development and testing.
  • Hands-on experience handling data from various source systems such as flat files, XML sources, Oracle, MS SQL Server, IBM DB2, Teradata, and Excel files.
  • Extensively used Teradata features such as BTEQ, FastLoad, MultiLoad, TPump, and SQL Assistant.
  • Comprehensive knowledge of Mainframe programming, test plans, regression testing, and related techniques.
  • Good working knowledge of parameterizing Ab Initio graphs and creating psets.
  • Experience in designing, developing, executing, and maintaining data extraction, transformation, and loading for multiple corporate Operational Data Store, Data Warehouse, and Data Mart systems.
  • Extensive knowledge of Mainframe tools and techniques as well as unit and regression testing.
  • Experience in EME, check-ins, check-outs, the command-line interface, air commands, and dependency analysis.
  • Well versed in various Ab Initio parallelism techniques; implemented a number of Ab Initio graphs using data parallelism.
  • Experience in integrating various data sources such as flat files, ASCII and EBCDIC files, XML files, and DB tables.
  • Strong experience working with Business Objects.
  • Strong experience working with Autosys and CA7.
  • Experience with UNIX shell scripts, AutoSys JIL scripts, and XML.
  • Developed UNIX shell scripts, Perl scripts, and SQL control files to load data through SQL*Loader (see the sketch after this list).
  • Worked on Metadata Importers for importing metadata from an EME Technical Repository and other sources such as ETL tools (Informatica), reporting tools (Cognos, SAS, Business Objects, etc.), and databases (Oracle, Teradata, DB2, etc.).
  • Experience in defining, creating, documenting, verifying, and executing test cases; working with the development team to resolve product issues; creating basic test plans; and performing functional, integration, and performance testing.
  • Highly effective across the System Development Life Cycle (SDLC), from analysis and design through development, testing, and implementation, in a diverse range of software applications.
  • Extensively used Control-M and Autosys for job management.
  • Quick learner with the ability to meet deadlines and work under pressure.
  • Experience with databases such as Teradata 14/13.10 and Oracle 9i/11g.
  • Knowledge of System Analysis and Design and Object-Oriented Analysis & Design.
  • Working knowledge of Cognos and Informatica.
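
Below is a minimal sketch of the kind of UNIX shell script and SQL*Loader control file referenced in the list above; the CUST_STG table, the feed file, and the ORA_* connection variables are hypothetical placeholders rather than details from any project described here.

    #!/bin/ksh
    # Generate a SQL*Loader control file and load a delimited feed into a
    # staging table. CUST_STG, cust_feed.dat and the ORA_* variables are
    # illustrative placeholders.
    cat > cust_stg.ctl <<'EOF'
    LOAD DATA
    INFILE 'cust_feed.dat'
    APPEND INTO TABLE CUST_STG
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (CUST_ID, CUST_NAME, LOAD_DT DATE "YYYY-MM-DD")
    EOF

    sqlldr userid=${ORA_USER}/${ORA_PWD}@${ORA_SID} control=cust_stg.ctl \
           log=cust_stg.log bad=cust_stg.bad errors=50
    rc=$?
    [ $rc -ne 0 ] && { echo "SQL*Loader load failed (rc=$rc)" >&2; exit $rc; }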

TECHNICAL SKILLS

ETL Tools: Ab Initio GDE 3.1.5/3.0, Oxygen XML Editor 11.2, SQL Server 2000/2005/2008/2012, Toad, Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, Control-M, Autosys, CA7.

Database: Oracle 11g/10g/9i, SQL Server, Teradata, Mainframes

Languages: C, C++, Java, SQL, PL/SQL

Scripting Languages: UNIX/Shell script, HTML/DHTML, XML, SQL

Operating Systems: Windows XP, UNIX (IBM AIX, Solaris 9.0/10.0, LINUX)

Testing Tools: Quality Center

Source control tools: EME (Enterprise Meta Environment), and PVCS

Data Modeling: ERWIN 4.1, ERStudio 8, Visio

Version control Tools: Microsoft Visual SourceSafe

PROFESSIONAL EXPERIENCE

Confidential, Atlanta GA

Ab Initio Developer

Responsibilities:

  • Developed new graphs and modified existing graphs per the business requirements using Ab Initio components.
  • Worked in the QA environment with the QA team and completed unit testing of each graph.
  • Extensively used AIR commands for check-in/check-out and other EME-related operations.
  • Worked in a sandbox environment while extensively interacting with the EME to maintain version control on objects.
  • Extensively used components and the various component parameter editors for graph development.
  • Worked with de-partition components such as Gather and Merge, which combine partitioned files for faster processing of data files.
  • Performed appropriate unit-level testing of work products and managed the review process for the Ab Initio deliverables.
  • Responsible for effective communication between the offshore and the onshore team.
  • Scheduled ETL batch jobs using Autosys, Control-M tool.
  • Coordinated with multiple teams such as Development, PSO, DBA, Mainframes, and CA7.
  • Exposure to BRE (Business Rule Environment), ACE (Application Configuration Environment) and DQE (Data Quality Environment) products.
  • Updated DMLs and XFRs in the existing graphs according to customer requirements; used generic plans to extract data from different source systems and load it.
  • Used lookups with the Reformat component to fetch matched records for the downstream process.
  • Worked with partition components such as Partition by Key and Partition by Expression; made efficient use of multifile systems, which fall under data parallelism.
  • Interacting wif Business users to understand each Interface in detail.
  • Understanding various applications to migrate them to Data Lake environment.
  • Validating business requirements before moving them to Atlas Data Lake.
  • Creating Ab Initio Psets to ingest files to Hadoop Lake file system.
  • Developing Ab Initio graphs and plans to fulfill business demands.
  • Reading from and writing to the HDFS file system as part of validating applications.
  • Scheduling jobs in CA7 Automation tool to automate processes as per application requirements.
  • Have very good experience in writing to and reading from HDFS using Ab Initio components.
  • Performed Proof of Concept on Data Extraction from DB2EE, PDOA Tables to Hadoop Platform.
  • Created Hive tables with partitions on the HDFS structure using definitions from the Oracle DB (see the sketch after this list).
  • Importing the Catalog, Custom Catalog, Roles, and Privileges feeds for the databases (Oracle and Teradata) so that the databases appear in the Physical Assets hierarchy.
  • Extensively used UNIX shell scripting for writing SQL execution scripts in the data loading process.
  • Customizing the Metadata Explorer so that business users can explore and analyze the metadata, see the contents of the system and applications, and drill down into the details of an object.
  • Creating new feed files for importing the metadata on the command line and in the Metadata Portal.
  • Creating rule files for Transformations and importing the feeds.
  • Creating Data Source Connection files for connecting to the graphs in order to extract the metadata.
  • Generating metadata reports and auditing.
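
As a companion to the Hive bullet above, here is a minimal sketch of creating a partitioned Hive external table over an HDFS directory and registering a partition; the table name, columns, and HDFS paths are hypothetical placeholders.

    #!/bin/ksh
    # Create a partitioned external Hive table over an existing HDFS layout
    # and add one partition. acct_txn and /data/lake/... are illustrative.
    hive -e "
    CREATE EXTERNAL TABLE IF NOT EXISTS acct_txn (
      acct_id   BIGINT,
      txn_amt   DECIMAL(18,2),
      txn_desc  STRING
    )
    PARTITIONED BY (txn_dt STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    STORED AS TEXTFILE
    LOCATION '/data/lake/acct_txn';

    ALTER TABLE acct_txn ADD IF NOT EXISTS PARTITION (txn_dt='2016-01-31')
    LOCATION '/data/lake/acct_txn/txn_dt=2016-01-31';
    "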

Environment: Ab Initio Version 3.1.5, Korn shell scripting, Co>Operating System Version 3.1.1.6, UNIX, Oracle 10g, Metadata Hub, Pure Scale DB2, DB2 on Mainframes V10.5, Windows 7 Professional, Control-M, MS SharePoint, Agile Methodology (Rally), HP Quality Center, Hive 1.2, Sqoop 1.4, Hortonworks DP 2.7, Red Hat Linux, CA7.

Confidential

Sr Ab Initio Developer

Responsibilities:

  • Technical and Requirement Gap Analysis of New/existing requirements for the Modules/applications.
  • Developed Integration and ETL scripts and stored procedures.
  • Solution mapping, designing and effort estimation.
  • Code review and providing solutions/ideas for performance improvement, tuning and optimization of the code.
  • EME — Source Control (Check-in/Check-out), Version Control/Tagging, Dependency Analysis, Metadata Management and Reporting.
  • Database designing by using logical and relational models.
  • Creating shell scripts for scheduling jobs.
  • Develop new packages, stored procedures, and functions for new databases, and maintain the existing systems.
  • Prepare database migration plans and strategies for testing (SIT/UAT/ Production parallel and disaster recovery environments).
  • Performance tuning of Oracle databases by identifying high-impact SQL, verifying optimal index usage, analyzing and rebuilding indexes, determining execution plans, and tuning SQL statements (see the sketch after this list).
  • Data conversion and integration of front end application to the data warehouse using SQL, PL/SQL and Stored Procedures
  • Prepare Hadoop queries for the Oracle-to-Hadoop migration.
  • Creating common library functions and templates that can be used by various projects.
  • Coordination within support teams for BAU/Hyper support.
  • Reconciliation between FO and BO.
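
A minimal sketch of the execution-plan check mentioned in the tuning bullet above; the orders table, the predicate, and the ORA_* connection variables are hypothetical placeholders.

    #!/bin/ksh
    # Capture the optimizer's plan for a candidate high-impact statement.
    # The orders table and ORA_* variables are illustrative placeholders.
    sqlplus -s "${ORA_USER}/${ORA_PWD}@${ORA_SID}" <<'EOF'
    SET LINESIZE 200 PAGESIZE 100
    EXPLAIN PLAN FOR
      SELECT order_id, order_dt, amount
      FROM   orders
      WHERE  order_dt >= TRUNC(SYSDATE) - 7;
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
    EXIT;
    EOF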

Environment: Ab Initio, Oracle 11g, SQL, PL/SQL, Linux, Oracle WebLogic Server, AutoSys, UNIX, Perl, shell scripting

Confidential, Riverwoods IL

Sr. Ab Initio Developer

Responsibilities:

  • Worked on Production enhancement fixes and high-priority immediate fixes for issues in Production.
  • Involved in the analysis of end-user requirements and business rules based on the given preliminary design and documentation, working closely with tech leads and data analysts to understand the current system.
  • Used Ab Initio GDE to generate complex graphs for the transformation and loading of data into Staging and Target database areas for Confidential applications.
  • Enhanced the existing application with new changes as per the business requirements.
  • Worked on creating the Design Addendum once the applications are developed.
  • Created the detail design documents and run books for generic jobs.
  • Developed Ab Initio jobs for loading and extracting data from various data stores like DB2, Oracle.
  • Designed SCD Type 2 dimension data flows for the data warehouse and designed fact data flows to the data warehouse.
  • Performed Unit Testing and responsible for the promotions to QA environment.
  • Support and Maintenance of more than 40 ETL applications in production environment.
  • Well versed in recovering applications for any technical failure or due to data issue.
  • Expertise in debugging applications whenever there is a wrapper script or Ab Initio graph failure.
  • Involved in performance tuning of Ab Initio graphs.
  • Responsible for coordinating with Database Administrator (DBA) and System Administrator (SA) teams in maintaining applications and the data warehouse on Oracle and Teradata.
  • Involved in code reviews and test reviews and coordinated with the System Testing team in providing break-fixes.
  • Extensively used UNIX Shell Scripting for writing SQL execution scripts in Data Loading Process.
  • Used UNIX environment variables in various .ksh files, which comprises of specified locations to build Ab Initio Graphs.
  • Wrote UNIX scripts to perform certain tasks and assisted developers with problems and SQL.
  • Developed the Autosys JIL scripts for the new jobs in all the environments.
  • Working on DQE projects to help end users generate issue reports.
  • Prepared and maintained a coding-standards checklist and actively participated in defect prevention (DP) activities.
  • Provided on-call support.
  • Coordinated with source, downstream, and business teams with respect to data issues.
  • Actively involved in application support during a mega outage of server movement.
  • Deployed and ran the graphs as executable Korn shell scripts in the applications system.
  • Have experience working in onsite-offshore model.
  • Have very good experience in writing to and reading from HDFS using Ab Initio components.
  • Developed generic Sqoop scripts to fetch data from Oracle DB tables to HDFS, as sketched below.
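
A minimal sketch of the kind of generic Sqoop import described in the last bullet; the Oracle host/service name, schema, password file, and HDFS target directory are hypothetical placeholders.

    #!/bin/ksh
    # Generic Oracle-to-HDFS import: the table name is passed as an argument.
    # Host, schema, password file and target directory are illustrative.
    TABLE_NM=$1
    sqoop import \
      --connect jdbc:oracle:thin:@//oradb01:1521/ORCLPDB \
      --username "${ORA_USER}" --password-file /user/etl/.ora_pwd \
      --table "APP_SCHEMA.${TABLE_NM}" \
      --target-dir "/data/lake/raw/${TABLE_NM}" \
      --num-mappers 4 \
      --fields-terminated-by '|' \
      --null-string '\\N' --null-non-string '\\N'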

Environment: Ab Initio Version 3.1.5, Korn shell scripting, Co>Operating System Version 3.1.1.6, UNIX, Oracle 10g, Metadata Hub, Teradata, Pure Scale DB2, DB2 on Mainframes V10.

Confidential, Chicago IL

Ab Initio Developer

Responsibilities:

  • Worked on Production enhancement fixes and high-priority immediate fixes for issues in Production.
  • Involved in the analysis of end-user requirements and business rules based on given preliminary design documentation, working closely with tech leads and data analysts to understand the current system.
  • Used Ab Initio GDE to generate complex graphs for transformation and loading of data into Staging and Target Database areas for CCTI Applications.
  • Created a generic Ab Initio job for automation testing, such as comparing source and target tables and checking for duplicates in the target.
  • Enhanced the existing application with new changes as per the business requirements.
  • Worked on creating the Design Addendum once the applications are developed.
  • Created the detail design documents and run books for generic jobs.
  • Developed Ab Initio jobs for loading and extracting data from various data stores like DB2, Oracle.
  • Designed SCD Type 2 dimension data flows for the data warehouse and designed fact data flows to the data warehouse.
  • Performed Unit Testing and responsible for the promotions to QA environment.
  • Used PDL and Shell interpretations to resolve the graph level parameters.
  • Created test cases and data validation for UAT.
  • Worked with the Production Service and Automation teams to debug issues in production and for migrations from pre-prod to Production.
  • Used Control-M as a job scheduler to automate the daily, weekly and monthly jobs.
  • Analyzed schedules in Control-M to check for dependencies and modified them to add dependencies on new jobs using XMLs.
  • Worked with the Configuration Management team for code releases.
  • Expertise in Ab Initio Product Data Quality and Metadata Hub upgrades, view customization, extractor registration, Metadata Hub lineage, and Metadata Hub import creation.
  • Involved in Teradata query tuning; tuned complex queries and views and implemented macros to reduce parsing time. Developed stored procedures using PL/SQL and driving scripts using UNIX shell scripts, as sketched below.
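
A minimal sketch of a UNIX driving script that runs Teradata SQL through BTEQ, as referenced in the last bullet; the TD_* logon variables and the database/table names are hypothetical placeholders.

    #!/bin/ksh
    # Refresh a staging table through BTEQ and propagate the return code.
    # TD_* variables and the database/table names are illustrative.
    bteq <<EOF
    .LOGON ${TD_TDPID}/${TD_USER},${TD_PWD};
    DELETE FROM stage_db.daily_bal;
    INSERT INTO stage_db.daily_bal
    SELECT acct_id, bal_amt, CURRENT_DATE
    FROM   src_db.acct_bal;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF
    rc=$?
    [ $rc -ne 0 ] && { echo "BTEQ refresh failed (rc=$rc)" >&2; exit $rc; }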

Environment: Ab Initio Co>Op 3.1.7.2, GDE 3.1.5, DB2, UNIX, QTP, Mainframes, Metadata Hub, Teradata, SQL Server, Cognos, Control-M, ClearQuest, Service-Now, TOAD for DB2 6.1

Confidential, Greenville SC

Ab Initio Developer

Responsibilities:

  • Involved in migrating all objects in the system to the latest version of Co>Op (V2.0 to V3.2.7).
  • Analyzed the current architecture and functionality of the system and proposed multiple design approaches for implementing parallelization in the system.
  • Implemented parallelization for performance optimization of the system using Conduct>It.
  • Created multiple test cases for performance testing in terms of volume of incoming data.
  • Created new graphs to integrate to a new application (Smart Sensor) using Web Services components.
  • Single point of contact for maintenance/support of Web Services application.
  • Worked on gathering requirements for BAU enhancements.
  • Worked on sprint planning with the team for each release.
  • Have extensive experience in decoding, analysis, design, and development in the migration of a .Net-based project to Ab Initio.
  • Created the FSD (Functional Specification Document) and LLD (Low-Level Design Document) for the existing system for better understanding.
  • Led the project as a whole from a front-end and back-end perspective, taking responsibility for project delivery.
  • Worked with the QA team in performing SIT and UAT testing for each release.
  • Preparation of Implementation Plan and Operational Readiness Documents.
  • Production deployment of the application.
  • Provided Warranty Support.
  • Major/Minor bug fixes in production.

Environment: Ab Initio Co>Op 3.1.7.2, GDE 3.1.5, DB2, UNIX, QTP, SQL Server, Service-Now, Control-M, TOAD for DB2 6.1, .Net

Confidential

ETL Developer

Responsibilities:

  • Analyzed business requirements, transformed data, and mapped source data using the Teradata Financial Services Logical Data Model tool, from the source system to the Teradata Physical Data Model.
  • Developed Ab Initio graphs in accordance with the business requirements and worked on unit testing as well as system testing.
  • Played a key role in the overall design of the entire application.
  • Wrote multiple Ab Initio transforms to convert currency values coming from different countries.
  • Led the design and development effort to consolidate all business rules into a common XFR for reusability.
  • Configured Ab Initio config files to interact with the Teradata database.
  • Worked on common extract and load graphs against a Teradata database.
  • Reviewed and helped define test cases for system test and UAT.
  • Played a key role in database design for tables used for reporting purposes.
  • Responsible for code reviews across all modules of the application.
  • Anticipated and prepared teams through changes (requests from client, scope of work, adjustment in deadlines, etc.)
  • Identified and applied best practices to issues at hand; generated new perspectives and frameworks that allow problems to be solved; made difficult ideas and concepts easy to understand (e.g., using diagrams, analogies, etc.).
  • Integrated requirements from multiple subject areas/business areas to define solution requirements.
  • Facilitated and negotiated the resolution of conflicting requirements and out of scope requirements.

Environment: Ab Initio Co>Op 2.13, GDE 1.13, Oracle 9i/8i, UNIX, Teradata, DB2, Mainframe

Confidential

Ab Initio Developer

Responsibilities:

  • Used the Ab Initio ETL tool in designing and implementing Extract, Transform, and Load processes; different Ab Initio components were used effectively to develop and maintain the database.
  • Did a POC on the Ops Console (now called Control Center).
  • Understood the business requirements through extensive interaction with users and reporting teams, and assisted in developing the low-level design documents.
  • Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL, and performance tuning.
  • Used Control-M for scheduling.
  • Created several sandboxes from scratch for the new project implementations, along with sandbox and project parameters.
  • Created several packages to set up and share global variables, types, and transforms, which were extensively used in many Ab Initio graphs.
  • Converted user-defined functions and complex business logic of an existing application process into Ab Initio graphs using components such as Reformat, Join, Transform, Sort, and Partition to facilitate the subsequent loading process.
  • Extensively used partition components (Broadcast, Partition by Key, Partition by Range, Partition by Round Robin) and de-partition components (Concatenate, Gather, and Merge) in Ab Initio.
  • Implemented Transform Components such as Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup and Scan Components and created appropriate XFRs and DMLs.
  • Responsible for deploying Ab Initio graphs and running them through the Co>Operating System's mp shell command language, and for automating the ETL process through scheduling (see the sketch after this list).
  • Worked on improving the performance of Ab Initio graphs using various performance techniques, such as lookups (instead of joins), in-memory joins, and rollups to speed up graphs.
  • Implemented phasing and checkpoint approach in ETL process to prevent data loss and to maintain uninterrupted data flow against process failures.
  • Working with business users and business analysts for requirements gathering and business analysis.
  • Developed ETL code based on business requirements using various Ab Initio components, making use of statements/variables in the components to create complex data transformations.
  • Extracted data from source tables and transformed the data based on user requirements and loaded data to target server.
  • Used Ab Initio functions such as is_valid, is_error, is_defined, and string_* functions for data cleansing.
  • Developed Ab Initio graphs for Data validation using validate components.
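
A minimal sketch of the kind of wrapper used to run a deployed graph script and automate it through a scheduler, as referenced earlier in this list; the run directory, graph name, log directory, and the convention of passing RUN_DT through the environment are hypothetical placeholders.

    #!/bin/ksh
    # Run a deployed Ab Initio graph script and log the outcome so a scheduler
    # (Control-M/Autosys) can act on the exit status. Paths, the graph name
    # and the RUN_DT environment convention are illustrative placeholders.
    AI_RUN=/apps/etl/sandbox/run
    LOG_DIR=/apps/etl/logs
    export RUN_DT=$(date +%Y%m%d)

    ${AI_RUN}/load_customer.ksh > "${LOG_DIR}/load_customer_${RUN_DT}.log" 2>&1
    rc=$?

    if [ $rc -eq 0 ]; then
        echo "load_customer completed successfully for ${RUN_DT}"
    else
        echo "load_customer failed (rc=$rc); see ${LOG_DIR}" >&2
    fi
    exit $rc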
