DWH Technical Lead / Sr. Ab Initio Developer Resume
PROFESSIONAL SUMMARY:
- 7 years of experience in Data Warehousing, both as an offshore developer and an onsite team lead, with a major focus on developing new ETL code - covering requirements, design, coding, testing, and implementation using the Ab Initio ETL tool.
- Strong expertise in:
- Data Warehousing
- Business Intelligence
- Performance Management
- Requirements Analysis
- Data Modeling
- Extensive knowledge of the Unix (Sun Solaris) platform, including shell script programming, commands for general operation, and admin-related activities such as connectivity between Unix servers and with other applications and databases.
- Ample experience of the Software Development Lifecycle (SDLC):
- Released multiple projects under the SDLC model, both as an offshore developer and as an onsite team lead - taking care of all SDLC phases, including requirements, design, build, testing, and release.
- Ensured quality deliverables within the proper toll-gates, without any delays.
- Sound knowledge of RDBMS using Oracle 8i/9i/10g/11g.
- Strong command of DDL/DML statements
- Experience in PL/SQL programming
- Experience in interconnecting the Oracle DB with Unix servers and ETL applications
- Experience in Ab Initio toolsets including the following:
- Parallel- and serial-flow batch graphs, conditional components, core components such as Reformat, Normalize, Join, and Lookup, and graphs processing ASCII and EBCDIC layouts.
- Knowledge of Ab Initio architecture, including the GDE, Co>Operating System, EME, and related items.
- Fine-tuning of Ab Initio graphs for performance through proper memory usage in the components.
- Well versed in various Ab Initio components such as Join, Rollup, Partition/Departition, Dedup Sorted, Scan, Normalize, and Denormalize.
- Expert knowledge of improving performance and troubleshooting Ab Initio graphs, and of monitoring Ab Initio runtime statistics.
- Experience with advanced Ab Initio metaprogramming, air commands, and other admin tasks, including creating .save files and performing migrations from one server to another.
- Experience in admin activities in the Ab Initio Metadata Hub and Express>IT Data Quality tools, enabling quantitative and qualitative data management and metadata analysis across multiple Data Warehouses.
- Experience in maintaining the Ab Initio Metadata Hub / Express>IT web application servers and data quality tools.
- Experience in administering and designing the Metadata Hub / Express>IT application servers: installing and upgrading the necessary services and tools, managing Apache and Tomcat services during reboots, monitoring the Ab Initio bridge status, etc.
- Experience in setting up connectivity between Oracle databases and the Metadata Hub and Express>IT tools
- Knowledge of adding/removing/modifying users in the Metadata Hub and Express>IT applications
- Basics of Tableau reporting, including the creation of basic dashboards from input spreadsheets and Oracle database tables.
- Strong work experience with tools like TOAD, SQL Developer, and SQL*Plus.
- Hands-on experience with the Autosys scheduling tool.
- Sound communication skills and the ability to maintain good interpersonal relations.
- Believes in a result-oriented work process through proper coordination with the team.
TECHNICAL SKILLS:
Reporting: Tableau Desktop, Tableau Server
Visual Analysis Reporting Tool: Tableau basics - creation of basic dashboards from both spreadsheets and Oracle database table connections
DB Tools: TOAD 7.x/8.5
Database: Oracle 9i/10g/11g/12c - queries and PL/SQL procedures
Other Expertise: Microsoft Office Suite (including Microsoft Access)
Operating Environment: Unix (Sun Solaris)
Languages/Scripting: Oracle SQL, Unix shell basics, Ab Initio metaprogramming
PROFESSIONAL EXPERIENCE:
Confidential
DWH Technical Lead / Sr. Ab Initio Developer
Tools Used: Ab Initio ETL tool, Unix (Sun Solaris) OS, Oracle 10g, SQL Developer, Autosys, Connect:Direct (NDM)
Responsibilities:
- Actively participated in the requirements analysis and design of the new DWH environment.
- Assisted the business team in setting up business Data Quality rules for each individual attribute in the incoming files, for validation purposes.
- Ensured the smooth transition of the requirements from the business team to the technical team.
- Coordinated with different teams, such as the source business team and the target Bancware team, to analyze the best approach, and then finalized the design based on that.
- Worked with the different source teams and the target Bancware team to set up the NDM connectivity for files - which included setting up new NDM IDs and modifying netmap.cfg and userfiles.cfg to include the remote server nodes and user ID information, followed by testing via the transmission of dummy files.
- Created a base design for the Ab Initio graphs - using components such as Reformat, Filter by Expression, Meta Pivot, Run Program, and Concatenate - that reads the input data and performs Data Quality checks based on the rules provided by the business; for any DQ exception, a detailed email is sent to the business team with an attached Excel report of the DQ issue.
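The per-attribute validation idea above can be sketched in shell; the rule, delimiter, and file layout below are illustrative only, not the actual business rules:

```shell
# Toy sketch of a per-attribute DQ rule (illustrative only): flag records
# whose second pipe-delimited field is not a positive integer, and write
# them to an exception report that would be attached to the DQ email.
dq_check() {
  infile=$1
  report=$2
  # Emit "<record number>|<record>" for every rule violation.
  awk -F'|' '$2 !~ /^[0-9]+$/ { print NR "|" $0 }' "$infile" > "$report"
  # Succeed only when the report is empty, i.e. no DQ exceptions found.
  [ ! -s "$report" ]
}
```

A real graph would apply one such rule per attribute and mail the consolidated report; here a single rule stands in for the whole rule set.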
- Ensured that the new code was properly performance-tuned.
- Implemented robust audit mechanisms.
- Worked actively on data validation once the DWH was built in the test environment - prepared a step-by-step approach to test the new system by analyzing the source data to be fed into the DWH and the expected output, thereby ensuring the proper functioning of the newly set up system.
- Used SQL Developer extensively to pull data from the DWH tables and analyze it to check the validity of the outcome.
- Extensively involved in debugging data issues through proper data analysis during UAT validation.
- Ensured the smooth transition of the process into the production environment, and assisted the business team in analyzing live production data.
- Involved in adding the new LCR (Liquidity Coverage Ratio) interface to the existing Treasury (Bancware) warehouse - set up the proper requirements after data analysis and helped design the new system.
Confidential
DWH Technical Lead / Sr. Ab Initio Developer
Tools Used: Ab Initio ETL tool, Unix (Sun Solaris) OS, Oracle 10g, SQL Developer, Autosys, Connect:Direct (NDM)
Responsibilities:
- Acted as the DWH team lead and participated in the analysis of the business requirements for the new DWH setup.
- Played the primary role in coordinating with the different teams involved and coming up with an optimized and streamlined design for the new DWH environment.
- Ensured the smooth transition of the requirements and design from the business team to the technical offshore team.
- Worked closely with the business and database teams to set up the Data Model and table structures, properly distributing all the supplier-related data - contract data, engagement details, country-risk details, purchase-order details - into different tables.
- Actively involved in building the Ab Initio code and the basic framework for the new DWH environment - created the basic structure for the fact and dimension loads based on the data model, and included the audit mechanisms and data reconciliation steps.
- Used Ab Initio metaprogramming to build a generic staging graph, which takes the file name as a runtime parameter and dynamically derives the dml file name, staging table name, dbc file name, necessary xfrs, etc.
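The generic staging approach can be sketched as a shell wrapper; all directory and naming conventions below are hypothetical, not the project's actual ones. One runtime feed name drives the derivation of every per-feed artifact name:

```shell
# Hypothetical sketch of the generic-staging idea: derive the dml, dbc,
# xfr, and staging-table names from one runtime feed name, then pass them
# as parameters to a single reusable graph.
: "${AI_DML:=/ai/dml}" "${AI_DB:=/ai/db}" "${AI_XFR:=/ai/xfr}"

derive_params() {
  feed=$1                                      # e.g. supplier_contract
  dml_file="$AI_DML/${feed}.dml"               # record format for this feed
  dbc_file="$AI_DB/${feed}.dbc"                # database connection file
  xfr_file="$AI_XFR/${feed}_stage.xfr"         # per-feed transform, if any
  stg_table="STG_$(echo "$feed" | tr '[:lower:]' '[:upper:]')"
  echo "$dml_file $dbc_file $xfr_file $stg_table"
}

# A real wrapper would then launch the deployed graph, e.g. (placeholder):
#   stage_generic.ksh -DML "$dml_file" -DBC "$dbc_file" -TABLE "$stg_table"
derive_params supplier_contract
```

The point of the pattern is that adding a new feed requires only new dml/dbc/xfr files following the naming convention, not a new graph.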
- Ensured performance tuning was taken care of in the new code, making sure the code consumes as little memory as possible.
- Introduced robust audit mechanisms.
- Worked actively on data validation in the test environment - prepared a step-by-step approach to test the new system by analyzing the source data to be fed into the DWH and the expected output, thereby ensuring the proper functioning of the newly set up system.
- Used SQL Developer extensively to pull data from the DWH tables and analyze it to check the validity of the outcome.
- Extensively involved in debugging data issues through proper data analysis during UAT validation.
- Ensured the smooth transition of the process into the production environment, and assisted the business team in analyzing live production data.
- Acted as the SME for this DWH when it moved to a new Rapid Deployment team.
Confidential
DWH Technical Lead / Sr. Ab Initio Developer
Tools Used: Ab Initio ETL tool, Unix (Sun Solaris) OS, Oracle 10g, SQL Developer, Autosys, Connect:Direct (NDM)
Responsibilities:
- Participated as the DWH team lead in a total of 3 waves, covering the migration of 4 of the major Data Warehouses in our environment.
- Actively participated in the impact analysis - which required in-depth knowledge of each DWH and a thorough analysis so that no element was missed as part of the migration.
- Prepared the necessary documentation detailing, for each DWH, its ETL and DB objects, the Autosys jobs that run daily, the environment configuration details, etc.
- Worked with the Unix System Admin team to make sure that the environment is properly copied over - including the reinstallation of all the necessary tools, and setting up of the exact same project and sandbox paths, environment configurations, as well as other work directories.
- Performed the EME switch for all the DWHs when the old datacenter EME was shut down - detached each project/sandbox from its current EME and re-attached it to the new EME for the new datacenter.
- Prepared the test scripts for each of the data warehouses - specific to its functionalities.
- Co-ordinated with the source and target teams to make sure that all the connectivity configurations - like NDM configurations - are modified based on the new datacenter IP and Node details.
- Actively participated in the daily parallel testing between the old and the new datacenter before going live - followed by proper validation steps.
- Oversaw the smooth transition of the production jobs from the old to the new datacenter on the go-live date.
Confidential
DWH Technical Lead / Sr. Ab Initio Developer
Tools Used: Ab Initio ETL tool, MS Access, Tableau, Unix (Sun Solaris) OS, Oracle 10g, SQL Developer, Autosys, Connect:Direct (NDM)
Responsibilities:
- As the DWH team lead, interacted with the business to analyze the existing environment and understand the requirements for the new Ab Initio ETL process to be set up.
- Thoroughly went through the existing MS Access environment, then reverse-engineered the whole process to understand the transformations required for the existing business logic/functionality of each process.
- Created a mapping document based on the reverse engineering - which was used during the coding process to map each of the fields from the input to the output using the Reformat components.
- For each process, a 3-step design was finalized: a staging process to dump the source data into a staging table; a target process where the data was transformed, joined across multiple tables, and loaded into a final target table; and a final summary process where data from multiple target tables was combined and loaded into a summary table.
- For performance tuning - used joins instead of lookups, used fewer max-core components such as Scan and Rollup, applied proper phasing of the graphs, etc.
- Co-ordinated with the source and target teams to make sure that all the connectivity configurations are working properly - like NDM/SFTP configurations for the source, and proper database connection with the Tableau for proper reporting.
- Used SQL Developer extensively for pulling data and performing business and IT level validations on the data.
- Performed step-by-step tracking and analysis of the new graphs to make sure the data flow and transformations are working flawlessly.
- Ensured the creation of proper partitions and indexes for performance, and views and roles for security, on the new tables.
- Created Autosys JIL scripts with proper timing and success conditions to automate the new process.
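An Autosys JIL definition along these lines captures the timing and success-condition wiring; the job, machine, owner, and path names here are placeholders, not the actual environment's:

```
/* Illustrative JIL only - all names and paths are placeholders */
insert_job: dwh_target_load   job_type: cmd
command: /apps/dwh/bin/run_target_load.ksh
machine: dwh_prod_unix
owner: etluser
start_times: "03:00"
condition: s(dwh_staging_load)
std_out_file: /apps/dwh/logs/target_load.out
std_err_file: /apps/dwh/logs/target_load.err
alarm_if_fail: 1
```

The `condition: s(...)` clause makes the target load wait for a successful staging run, mirroring the staging-then-target flow of the 3-step design.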
Confidential
Metadata Hub Server Administrator
Tools Used: Ab Initio Metadata Hub / Express>IT / Authorization Gateway, Unix (Sun Solaris) OS, Oracle 10g
Responsibilities:
- Acted as the server-level admin for the separate servers set up specifically for the installation and use of the Metadata Hub and Express>IT tools.
- Maintained the server and kept it up and running by frequently checking the Ab Initio bridge (ab-bridge) and Tomcat service statuses, and restarting them if they were down.
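The keep-alive routine can be sketched as follows; the pidfile paths and restart commands are placeholders, since the real ab-bridge and Tomcat start scripts are site-specific:

```shell
# Hypothetical health-check sketch: a service counts as up when its pidfile
# exists and the recorded pid is still alive; otherwise restart it.
service_up() {
  pidfile=$1
  [ -f "$pidfile" ] && kill -0 "$(cat "$pidfile")" 2>/dev/null
}

check_and_restart() {
  name=$1; pidfile=$2; start_cmd=$3
  if service_up "$pidfile"; then
    echo "$name: up"
  else
    echo "$name: down - restarting"
    eval "$start_cmd"            # e.g. $CATALINA_HOME/bin/startup.sh
  fi
}

# Cron-driven checks might look like this (placeholder paths):
#   check_and_restart tomcat    /apps/tomcat/tomcat.pid "/apps/tomcat/bin/startup.sh"
#   check_and_restart ab-bridge /apps/mh/ab-bridge.pid  "/apps/mh/bin/start-bridge.ksh"
```

Run from cron, this keeps both services up between manual checks.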
- Performed import of the 00 and 99 files for multiple DWH environments.
- Created data lineage diagrams with the help of the imported ETL and database table metadata.
- Experienced in handling the Authorization Gateway for access management - adding/removing users, assigning user groups, etc.
- Integrated the Metadata Hub and Express>IT logins with the Single Sign-On (SSO) login service - Synchrony's internal secure login service.
- Actively involved in setting up the Apache services on the Metadata Hub server, required for the above SSO integration.
- Experienced in user ID password reset activity on the server, required during resets of the META and AG user ID passwords in the Metadata Hub database.
- Performed upgrades of Metadata Hub and Express>IT from lower to higher versions.
Confidential
Offshore Team Lead / Ab Initio Developer
Tools Used: Ab Initio ETL tool, Unix (Sun Solaris) OS, Oracle 10g, SQL Developer, Autosys, Connect:Direct (NDM)
Responsibilities:
- Acted as the DWH Offshore team lead and the SPOC for the primary coding activity for the re-engineering.
- Involved in the requirements and design analysis - listing all the Unix shell scripts, PL/SQL procedures, and load processes in the existing OPS DWH environment, and mapping their interdependencies and upstream and downstream impacts.
- Reverse-engineered all the Unix shell scripts to extract the business logic used in their transformations - which demanded in-depth knowledge of shell scripting syntax and commands.
- Performed a detailed analysis of the PL/SQL procedures to understand the insert and update queries, cursors, etc. that were used.
- Created new Ab Initio graphs from scratch that reproduced the same transformation and loading logic as the existing Unix scripts.
- Used metaprogramming knowledge to create generic graphs wherever possible - graphs that dynamically determine the dml, dbc, xfr, file name, table name, etc.
- Created a new process for receiving files from the source, archiving them into properly organized folders, and performing other audit steps such as count reconciliation.
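The receive/archive/reconcile step can be sketched as follows; the directory layout and the source-count convention are illustrative, not the actual feed spec:

```shell
# Hypothetical sketch: archive the received file under a dated folder and
# reconcile the received record count against the count the source reported.
archive_and_reconcile() {
  infile=$1
  expected=$2                                # count reported by the source
  actual=$(wc -l < "$infile" | tr -d ' ')    # records actually received
  day=$(date +%Y%m%d)
  mkdir -p "archive/$day"
  cp "$infile" "archive/$day/"               # keep a dated copy for audit
  if [ "$actual" -eq "$expected" ]; then
    echo "RECON OK: $actual records"
  else
    echo "RECON FAIL: expected $expected, got $actual"
    return 1
  fi
}
```

A nonzero return on mismatch lets the scheduler halt downstream loads until the discrepancy is resolved.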
- Modified all the Autosys jobs to replace the script names with the graph ksh names.
- Released all the new Ab Initio code into production and ensured a flawless first run of all the code in the production environment.
Confidential
Offshore Team Lead / Ab Initio Developer
Tools Used: Ab Initio ETL tool, Unix (Sun Solaris) OS, Oracle 10g, SQL Developer, Autosys, Connect:Direct (NDM)
Responsibilities:
- Acted as the DWH Offshore team lead and the SPOC for the primary coding activity for the Tokenization process.
- Modified all the code across the environment to ensure that account numbers are replaced by their unique token numbers.
- Modified all the transformation logic performed on account numbers to operate on the tokens instead.
- Used metaprogramming to perform the history fix for all the existing tables and the existing data in the DWH.
- Created the necessary SDLC documentation for all phases.
- Created a step-by-step approach for the proper testing of all the interfaces.
- Executed performance tuning steps to make sure the account-to-token conversion did not impact the performance of the existing process.
- Used an ICFF (Indexed Compressed Flat File) instead of a normal lookup file for the account-token lookup, improving the performance of the tokenization process.
- Implemented robust auditing mechanisms to ensure the account-token conversion caused no discrepancies in the existing process.
- Performed a smooth turn-on of tokenization in the production environment by migrating all the modified code and ensuring it ran without any impact on the existing timing and logic.