Ab Initio ETL Developer Resume
Santa Clara, CA
PROFESSIONAL SUMMARY:
- Around 7 years of IT experience in diversified fields of application software, maintenance, re-engineering, and process improvement projects in Ab Initio and mainframe-related applications.
- Over 6 years of Ab Initio development experience with data mapping, transformation, and loading from source to target databases in complex, high-volume environments.
- Extensively worked on several Ab Initio ETL assignments to extract, transform, and load data into tables as part of data warehouse development with highly complex data models of Relational, Star, and Snowflake schemas.
- Experienced in all phases of Software Development Life Cycle (SDLC).
- Experience in Data Modeling, Data Extraction, Data Migration, Data Integration, Data Testing and Data Warehousing using Ab Initio.
- Well versed with various Ab Initio Co>Op versions, micrographs, and parallelism techniques; implemented Ab Initio graphs using data, component, and pipeline parallelism and Multifile System (MFS) techniques.
- Experience with using Ab Initio Plans (Conduct>It).
- Worked extensively on PDL and DML metaprogramming.
- Knowledge of continuous flows and the Business Rules Engine (BRE).
- Well versed with DML manipulation and creating complex conditional DMLs.
- Experienced with generating UTF-8 record formats.
- Extensive experience in shell scripting to maximize Ab Initio data parallelism and Multifile System (MFS) techniques.
- Experience in providing production support for various Ab Initio ETL jobs and developing UNIX shell wrappers to run Ab Initio and database jobs.
- Good experience working with very large databases and performance tuning.
- Expertise in preparing code documentation in support of application development, including High level and detailed design documents, unit test specifications, interface specifications, etc.
- Worked on ACORD AL3 data.
- Worked on calling Web services through Ab Initio graphs and leveraging Name and Address Standardization techniques.
- Has knowledge of Ab Initio BRE and ACE (Express>It).
- Analyzed test results, identified issues, and reported them through Jira and HPQC.
- Excellent communication skills in interacting with various people at different levels on all projects; also played an active role in business analysis.
- Good experience in using Microsoft products like MS Visio, Word, Excel etc.
- Worked in onshore and offshore models and handled large teams onshore.
- Handled multiple projects working in a team as well as working independently.
- Strong knowledge of Hadoop MapReduce, Pig, and Hive.
- Strong knowledge of JSON, Python, and MongoDB.
TECHNICAL SKILLS
Primary Skills: Ab Initio (GDE 3.2.2, Co>Op 3.1.6), Conduct>It
Secondary Skills: Pig, Hive, Data Modeling, Data Warehouse design, UNIX shell scripting, Python, JSON, MongoDB, Hadoop
Operating Systems: Windows, UNIX, MS-DOS, Sun Solaris, Linux
Languages: SQL, PL/SQL, UNIX shell scripting, smbclient, core Java, Pig, Hive, batch scripting
Scheduling Tools: Autosys, Tivoli Workload Scheduler, Control-M
Databases: Oracle, Teradata, DB2, MS SQL Server
Version Control Tool: EME
Domain Knowledge: Banking, Insurance, Retail
Data Modeling Tools: Erwin Data Modeler
Defect Tracking Tool: HP-Mercury Quality Center, Jira
Reporting Tool: Qlikview, MSBI
Application Tools: MS Word, MS Access, MS Excel, MS Visio, FileZilla, WinSCP, Notepad++, TextPad
PROFESSIONAL EXPERIENCE
Confidential, SANTA CLARA, CA
Ab Initio ETL Developer
Environment: Ab Initio 3.2.2, Co-op 3.0.4, Technical Repository Management Console, Oracle 11g, Teradata, SQL Developer, Teradata SQL Assistant, SQL Server, SSRS, SSIS, Control-M, UNIX shell scripting
Role and Responsibilities
- Designed and developed various Ab Initio Graphs and Plans.
- Developed Ab Initio graphs based on business requirements using various components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Merge, etc.
- Extensively worked on Ab Initio Plans (Conduct>It).
- Involved in estimation activity for development efforts.
- Helped establish and continuously improve development guidelines, standards, and procedures.
- Developed and troubleshot Teradata data loads using FastLoad, MultiLoad, and TPump.
- Worked on PDL and DML meta programming.
- Responsible for loading configuration data from the DEV environment to UAT and then to PROD.
- Responsible for designing and developing wrapper shell scripts.
- Worked on Windows batch scripting to archive files on different Windows servers.
- Responsible for writing SQLs in SQL Developer to validate that the data load was correct.
- Extracted data from various heterogeneous data sources such as Oracle, mainframe, SFDC, and TM1.
- Used Teradata SQL Assistant to query data in the target Teradata tables.
- Developed UNIX wrapper scripts to run multiple Ab Initio graphs together in a sequence based on requirements (a minimal sketch follows this list).
- Worked with Ab Initio Multifile System (MFS) and multifiles to make use of data parallelism and achieve better performance.
- Used Ab Initio EME for Check-In/Check-Outs and Version control.
- Developed Data Validation Controls and Exceptions processes per business requirements to validate transformations in Ab Initio Graphs and report exceptions.
- Created tags and Ab Initio .sav files to migrate the code through the environments.
- Performance tuned various graphs to optimize the performance and reduce the run time.
- Documented the complete Ab Initio graphs.
- Involved in unit testing of the Ab Initio graphs.
- Involved in meetings to gather information and requirements from the business.
- Responsible for fully understanding the source systems, documenting issues, following up on issues and seeking resolutions.
- Prepared the detailed design documents for all the modules required for development.
- Confirmed the field mapping from the old data warehouse to the new data warehouse, ensuring one-to-many and many-to-one fields were properly matched and resolved.
- Created data models by analyzing the requirements.
- Created tables in Dev environment and provided the DDL’s to the DBA’s for migration through the environments.
- Worked on SQL Server Integration services (SSIS) for data load from Teradata to SQL Server.
- Worked on SQL Server Reporting services (SSRS) to generate reports as per business requirements.
- Worked closely with business users to understand and analyze their requirements.
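A minimal sketch of the wrapper pattern described above, assuming deployed Ab Initio graphs are run as .ksh scripts; all paths, script names, and the log location are illustrative placeholders, not the actual project code:

```sh
#!/bin/ksh
# Hypothetical wrapper: runs two deployed Ab Initio graphs in sequence and
# stops at the first failure so the scheduler can flag the job.
AI_RUN_DIR=/apps/abinitio/run        # assumed home of deployed .ksh scripts
LOG_DIR=/apps/abinitio/logs          # assumed log directory

run_graph() {
    graph=$1
    echo "$(date '+%Y-%m-%d %H:%M:%S') starting ${graph}" >> "${LOG_DIR}/wrapper.log"
    "${AI_RUN_DIR}/${graph}" >> "${LOG_DIR}/${graph}.log" 2>&1
    rc=$?
    if [ ${rc} -ne 0 ]; then
        echo "${graph} failed with rc=${rc}" >> "${LOG_DIR}/wrapper.log"
        exit ${rc}
    fi
}

run_graph extract_orders.ksh         # placeholder graph names
run_graph load_orders.ksh
exit 0
```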
Confidential, Northbrook, IL
Ab Initio ETL Developer Lead
Environment: Ab Initio 3.2.2, Co-op 3.0.4, Oracle 11g, IBM Tivoli workload scheduler, UNIX.
Role and Responsibilities
- Created a whole new ETL system to extract data from a web service via SOAP requests and retrieve the XML.
- Processed the XML information, transformed it per the standards of downstream systems, and provided them with this information.
- Worked on XML files, flat files, and ACORD AL3.
- Used SOAP WSDLs to generate DMLs using the xml-to-dml utility.
- Extensively used SOAP UI tool to perform unit testing of Web service requests.
- Created Ab Initio components to leverage the Name and address standardization technique to sync up wif the downstream systems.
- Created Ab Initio components to call the stored procedures to enhance the performance.
- Developed, tested, and reviewed complex Ab Initio graphs, subgraphs, DMLs, psets, XFRs, deploy scripts, and .dbc files for connectivity, and created packages and exports.
- Extensively used various Ab Initio components like Reformat, Input File, Output File, Join, Sort, Normalize, Input Table, Output Table, Load Table, Fuse, Update Table, Gather Logs, and Run SQL for developing graphs.
- Developed generic graphs for the purpose of data extraction and data manipulation for various processes.
- Worked extensively on creating wrapper scripts and sniffer jobs to identify input file availability (see the sketch after this list).
- Extensively used AIR commands to check in and check out existing applications in EME.
- Worked on performance tuning of the Ab Initio graphs.
- Responsible for unit testing and debugging of Ab Initio graphs using flow watchers.
- Fine-tuned various Ab Initio graphs by using lookup_local, in-memory joins, and rollups.
- As lead developer, was responsible for the design and for analyzing impact.
- Developed and used sub-graphs to prevent redundancy of transformation usage and maintenance.
- Single Point of Contact (SPOC) for off-shore resources.
- Conducted design walkthroughs with the internal and external stakeholders of the project.
- Performed code reviews and guided the offshore team in application creation.
- Used EME for managing and Version control of Graphs.
- Handled migration of code through test and stage environments.
- Worked with the QA team to review their test cases.
- Worked with business UAT users to ensure there were no functional impacts.
- Worked on the automation of the ETL jobs using Tivoli Workload Scheduler.
- Created TWS schedules for the processes and created the JRs and JSDLs for the same.
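A minimal sketch of the sniffer-job idea mentioned above: poll a landing directory until the expected input file arrives or a window expires. The directory, file-name pattern, and timing are assumptions:

```sh
#!/bin/ksh
# Hypothetical sniffer: succeed as soon as the input file lands, fail after
# MAX_TRIES polls so the scheduler can alert.
LANDING_DIR=/data/landing                      # assumed landing zone
PATTERN="policy_feed_$(date +%Y%m%d)*.al3"     # assumed naming convention
MAX_TRIES=60                                   # 60 polls x 60s = 1-hour window
try=0

while [ ${try} -lt ${MAX_TRIES} ]; do
    # ls exits non-zero when nothing matches the pattern
    if ls ${LANDING_DIR}/${PATTERN} > /dev/null 2>&1; then
        echo "input file found"
        exit 0
    fi
    try=$((try + 1))
    sleep 60
done

echo "input file not received within the window" >&2
exit 1
```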
Confidential, River woods, IL
Ab Initio ETL Developer
Environment: Ab Initio 3.1.5, Co-op 3.1.6.5, Bteq, Teradata 14.0, Teradata SQLA, IBM Tivoli workload scheduler, AIX Unix.
Role and Responsibilities
- Worked with client teams to identify and understand various source systems and design the integration strategies.
- Designed and developed Ab Initio graphs for the data specifications provided by the client.
- Created mapping documents based on the requirements and held review meetings with Business and Architects to finalize the mapping documents.
- Created graphs using components like Input/Output File/Table, Lookup, Reformat, Redefine Format, Dedup Sorted, Filter by Expression, Partition by Expression, Partition by Key and Sort, Sort, Broadcast, Replicate, Join, Merge, Concatenate, Gather, Rollup, Scan, Read Excel Spreadsheet, FTP To, Publish, Subscribe, Fuse, Run Program, Run SQL, MQ Publish, Read XML/Write XML, Batch Subscribe, and Update Table.
- Performed data cleansing operations on the data using transformation functions like is_valid, is_defined, is_null, is_blank, string_lrtrim, re_index, reinterpret_as, string_concat, string_substring, lookup_count, lookup_first, now(), decimal_strip, re_replace, decimal_lpad, next_in_sequence(), length_of(), test_characters_all(), force_error(), switch(), first_defined(), lookup_match(), conditional DMLs, the xml-to-dml utility, etc.
- Developed complex Ab Initio XFRs to derive new fields and solve rigorous business requirements. Used Type II SCDs to track data changes.
- Worked on improving the performance of AbInitio graphs by employing AbInitio performance components like Lookups (instead of joins), In-Memory Joins, Rollup and Scan components to speed up execution of the graphs.
- Created complex conditional DMLs to map the source data for processing.
- Developed complex applications that capture data from Excel sheets and create the DMLs to map the corresponding data.
- Generated DMLs in UTF-8 format.
- Developed several UNIX shell wrapper scripts to pass and initialize the parameters and run the graphs.
- As an onsite consultant, was responsible for translating customer requirements into formal requirements and design documents, establishing specific solutions, and leading the efforts, including programming and testing, that culminated in client acceptance of the results.
- Single point of contact (SPOC) for all technical and design queries on the project, ensuring the quality of the deliverables
- Provided day-to-day direction to the project team and regular project status to the customer.
- Developed runbooks and project implementation plans for the project deliverables.
- Created several SQL queries and reports from the above data mart for UAT and user reporting (a BTEQ sketch follows this list).
- Facilitated UAT to achieve sign-off.
- Handled deployments and migration of the code to Prod/Test regions.
- Extensively used AIR commands to check in the objects from Dev to Prod EME and also to perform dependency analysis on all ABI objects.
- Identified the required dependencies between ETL processes and triggers to schedule the Jobs to populate Data Marts on scheduled basis.
- Developing and testing wrapper scripts for Maestro jobs.
- Scheduled the jobs using Tivoli Workload Scheduler.
- Created applications to backfill large volumes of data (3+ years) while maintaining Type II SCDs.
- Provided warranty support for the applications we moved into production.
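A minimal sketch of running a post-load validation query through BTEQ from a shell wrapper; the logon file, database, and table names are placeholders, not the actual project objects:

```sh
#!/bin/ksh
# Hypothetical post-load check: count today's rows and fail the job if the
# query errors. /home/etl/.tdlogon is assumed to hold a ".LOGON ..." line.
bteq <<'EOF'
.RUN FILE = /home/etl/.tdlogon
SELECT COUNT(*) AS loaded_rows
FROM   edw_db.customer_dim
WHERE  load_dt = CURRENT_DATE;
.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF
rc=$?
echo "bteq validation finished with rc=${rc}"
exit ${rc}
```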
Confidential, NJ
Ab Initio ETL Developer
Environment: Ab Initio GDE 1.14/1.15/3.0.1, Co-op 2.15, Solaris 10, PL/SQL, Oracle 10g, Teradata 13.10, Autosys
Role and Responsibilities
- Studied and understood all the functional and technical requirements to better serve the project.
- Developed Ab Initio graphs using different components for extracting, loading and transforming external data into data mart.
- Developed a number of Ab Initio graphs based on business requirements using various components like Filter by Expression, Partition by Expression, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Replicate, Merge, etc.
- Developed Ab Initio graphs for data validation using validation components like Compare Records, Compute Checksum, etc.
- Implemented data parallelism in graphs by using Ab Initio partition components to divide data into segments and operate on each segment simultaneously.
- Documented complete graphs and their components.
- Involved in Ab Initio design and configuration for ETL, data mapping, transformation, and loading in complex, high-volume environments, with data processing at the terabyte level.
- Involved in automating the ETL process through scheduling.
- Used Autosys for scheduling the jobs.
- Responsible for cleansing the data from source systems using Ab Initio components such as reformat and filter by expression.
- Developed psets to impose reusable business restrictions and to improve the performance of the graph.
- Developed several partition-based Ab Initio graphs for the high-volume data warehouse.
- Checked the accuracy of data loaded into Teradata and assured the linkage of keys on the tables for the various applications.
- Documented the processes to enable better reusability and maintainability.
- Scheduled the bug fixes for development/UAT and production migration.
- Created test scripts for integration testing and supported integration testing.
- Analyzed the production bugs logged in Test Director.
- Responsible for tuning long-running SQLs and complex joins for better performance.
- Prepared MLOAD scripts for loading target tables based on requirements (a MultiLoad sketch follows this list).
- Automated workflows and BTEQ scripts through MS-DOS batch files in Development and through the Tivoli scheduler for Production.
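A minimal sketch of the MLOAD pattern referenced above, wrapped in a shell here-document; the TDPID, credentials, tables, layout, and input file are all placeholders:

```sh
#!/bin/ksh
# Hypothetical MultiLoad run: insert pipe-delimited records into a target
# table, with standard log/error tables. All object names are placeholders.
mload <<'EOF'
.LOGTABLE stage_db.ml_orders_log;
.LOGON tdpid/etl_user,password;
.BEGIN IMPORT MLOAD
    TABLES target_db.orders
    ERRORTABLES stage_db.orders_et stage_db.orders_uv;
.LAYOUT order_layout;
    .FIELD order_id  * VARCHAR(18);
    .FIELD order_amt * VARCHAR(18);
.DML LABEL ins_order;
    INSERT INTO target_db.orders (order_id, order_amt)
    VALUES (:order_id, :order_amt);
.IMPORT INFILE /data/in/orders.dat
    FORMAT VARTEXT '|'
    LAYOUT order_layout
    APPLY ins_order;
.END MLOAD;
.LOGOFF;
EOF
echo "mload finished with rc=$?"
```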
Confidential, Glastonbury, CT
Ab Initio ETL Developer
Environment: Ab Initio GDE 1.14, Co-op 2.14, PL/SQL, Oracle 10g, Autosys, IBM Rational tools (RequisitePro, ClearCase, ClearQuest), Teradata V2R5/R6.
Roles and responsibilities
- Involved in gathering the requirements and in the design of ETL process.
- Developed various Ab Initio graphs based on business requirements using components like Partition by Key, Reformat, Rollup, Join, Gather, Replicate, Merge, etc.
- Implemented a 4-way multifile system and used partition components to store large amounts of data in multiple files.
- Implemented the selection of the best address for an account based on the 'score' assigned using specific rules, and updated the target database with the best address found.
- Responsible for creating a graph to Implement check digit calculation for the mail file generated according to US postal requirements.
- Developed UNIX scripts to FTP mail files generated, to a different server by considering naming conventions of the file at target server.
- Responsible for updating the RTM (Requirements Traceability Matrix), which involved updating the Design Component ID/Description and Code Component ID/Description fields for all requirements, and helped identify business requirements that were not satisfied.
- Supported UAT in running the interface/ DW load scripts according to the scheduled sequence.
- Responsible for resolving the issues identified in the testing.
- Responsible for identifying the changes made to the tables in the database in development (new fields added to an existing table and the tables newly created) before deploying the code to production.
- Worked on loading data from several flat-file sources to staging using Teradata MLOAD, FLOAD, and TPump (a FastLoad sketch follows this list).
- Worked on exporting data to flat files using Teradata FEXPORT.
- Performed query optimization (EXPLAIN plans, collecting statistics, primary and secondary indexes).
- Worked on SQL queries and tuned them to improve performance.
- Prepared ETL Source to Target mapping specification document.
- Worked with DBAs to tune the performance of the applications and backups.
- Involved in developing unit test cases for the developed mappings.
- Extensively worked on the performance tuning of transformations, sources, sessions, mappings, and targets.
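A minimal sketch of a FastLoad staging load like the ones described above; the TDPID, credentials, staging table, and data file are illustrative assumptions:

```sh
#!/bin/ksh
# Hypothetical FastLoad run: bulk-load a pipe-delimited flat file into an
# empty staging table. All object names are placeholders.
fastload <<'EOF'
SESSIONS 4;
ERRLIMIT 25;
LOGON tdpid/etl_user,password;
SET RECORD VARTEXT "|";
DEFINE
    acct_id  (VARCHAR(18)),
    acct_bal (VARCHAR(18))
FILE = /data/in/accounts.dat;
BEGIN LOADING stage_db.stg_accounts
    ERRORFILES stage_db.stg_accounts_e1, stage_db.stg_accounts_e2;
INSERT INTO stage_db.stg_accounts (acct_id, acct_bal)
VALUES (:acct_id, :acct_bal);
END LOADING;
LOGOFF;
EOF
echo "fastload finished with rc=$?"
```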
Confidential, Lisle, IL
Teradata Developer
Environment: Teradata V2R5, Teradata Administrator, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, FASTEXPORT, Erwin Designer, Quality Center, UNIX, Windows 2000, Shell scripts
Roles and responsibilities
- Development of scripts for loading the data into the base tables in EDW using Fast Load, MultiLoad and BTEQ utilities of Teradata.
- Extracted data from various source systems like Oracle, SQL Server, and flat files as per the requirements.
- Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN.
- Created a BTEQ script for pre-population of the work tables prior to the main load process.
- Used volatile tables and derived tables for breaking up complex queries into simpler queries.
- Involved in loading of data into Teradata from legacy systems and flat files using complex MultiLoad scripts and FastLoad scripts.
- Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
- Involved heavily in writing complex SQL queries to pull the required information from Database using Teradata SQL Assistant.
- Created a UNIX shell script that checks data files for corruption prior to the load (a sketch follows this list).
- Loaded data using the Teradata loader connection, writing Teradata utility scripts (FastLoad, MultiLoad) and working with loader logs.
- Created and automated the load process using shell scripts, MultiLoad, Teradata volatile tables, and complex SQL statements.
- Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs.
- Involved in troubleshooting the production issues and providing production support.
- Used Type II SCDs to track data changes.
- Streamlined the Teradata scripts and shell scripts migration process on the UNIX box.
- Involved in analysis of end-user requirements and business rules based on the given documentation, working closely with tech leads and analysts to understand the current system.
- Created a cleanup process for removing all the Intermediate temp files that were used prior to the loading process.
- Developed unit test plans and involved in system testing.
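A minimal sketch of the pre-load file check mentioned above: verify the file exists and is non-empty, that every record has the expected field count, and that the row count matches a control file. The file names, delimiter, and field count are assumptions:

```sh
#!/bin/sh
# Hypothetical pre-load validation; a non-zero exit blocks the load.
DATA_FILE=/data/in/accounts.dat   # placeholder data file
CTL_FILE=/data/in/accounts.ctl    # placeholder control file with expected count
EXPECTED_FIELDS=12                # assumed pipe-delimited field count

[ -s "${DATA_FILE}" ] || { echo "file missing or empty" >&2; exit 1; }

# every record must carry the expected number of fields
bad=$(awk -F'|' -v n="${EXPECTED_FIELDS}" 'NF != n { c++ } END { print c+0 }' "${DATA_FILE}")
if [ "${bad}" -gt 0 ]; then
    echo "${bad} malformed records found" >&2
    exit 1
fi

# record count must agree with the control file
actual=$(wc -l < "${DATA_FILE}")
expected=$(cat "${CTL_FILE}")
if [ "${actual}" -ne "${expected}" ]; then
    echo "count mismatch: got ${actual}, expected ${expected}" >&2
    exit 1
fi
exit 0
```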