Sr. ETL Ab Initio Developer/Admin Resume
San Francisco, CA
SUMMARY
- Over 8 years of IT consulting experience in analysis, design, development, testing and maintenance of data warehouse systems. Hands-on experience includes developing Data Warehouses/Data Marts/ODS in the Telecom, Retail, Banking, Finance and Insurance domains.
- Ab Initio (GDE 1.14/1.15/3.0/3.1.5/3.2, Co>Op Sys 2.14/2.15/3.0/3.1.5/3.2) Consultant with 6+ years of experience in data extraction from various data sources, mapping, transformation and loading between source and target databases in complex, high-volume environments.
- Experience in working with various heterogeneous Source Systems like Oracle, MS SQL Server, DB2, ERP and Flat files.
- SQL/database developer experience in writing efficient SQL queries, fine-tuning queries and developing ad hoc reports.
- Extensively used Oracle SQL*Loader, SQL*Plus and TOAD. Installation, configuration and administration experience on Big Data platforms.
- Performed validations, data quality checks and data profiling on incoming data. Knowledge of DQE, ACE, BRE and Metadata Hub.
- Extensive Knowledge in Ab Initio ETL tool, Teradata and UNIX Shell scripting.
- Well versed with various Ab Initio components such as Round Robin, Join, Rollup, Partition by Key, Gather, Merge, Interleave, Dedup Sorted, Scan, Validate and FTP.
- Well versed with various Ab Initio parallelism techniques; implemented a number of Ab Initio graphs using data parallelism and Multi File System (MFS) techniques.
- Set up complete environments for UAT and PROD.
- Ab Initio installation, upgrade, configuration and troubleshooting.
- Health checks on the UNIX environment and fixing related issues; maintenance of the document repository and server-reboot activities.
- Configured the Ab Initio workload manager and created multifile directories.
- Granted privileges to users and restricted full permissions where required.
- EME setup, project creation in EME, EME groups/users and permissions, EME restart and backups. Key/license management.
- Configured the Ab Initio environment to talk to databases using db config files and the Input Table, Output Table and Update Table components.
- Experience in new Ab Initio features like Component Folding, Parameter Definition Language (PDL), Continuous flows, Queues, publisher and subscriber components.
- Sound Knowledge of Data Warehouse/Data Mart, Data Modeling Techniques. Very good understanding of Dimensional Modeling.
- Highly experienced in creating sandboxes and saving graphs in the sandbox for check-in and check-out of graphs from the EME.
- Experience in providing production support for various Ab Initio ETL jobs and developing UNIX shell wrappers to run Ab Initio and database jobs.
- Demonstrated ability to grasp new concepts (both technical and business) and apply them as needed.
- Excellent communication and interpersonal skills, with a strong ability to work as part of a team as well as handle independent responsibilities.
TECHNICAL SKILLS
Industry Domain: Banking, Telecom, Healthcare
Tools: Foundation Server, Visio, MS Office, MS Project, Service Center, Quality Center, ClearQuest, Rally
Data Warehousing: Ab Initio (GDE 1.14/1.15/3.0/3.1, Co>Op Sys 2.14/2.15/3.0/3.1), Dimensional Modeling
RDBMS: Oracle 10g, MS SQL Server 6.5/7.0/2000, DB2, MS Access 7.0, SQL
Programming: UNIX
Operating System: Windows NT/98/2000, MVS, HP-UX, AIX 5.0/5.2/5.3
PROFESSIONAL EXPERIENCE
Confidential, San Francisco, CA
Sr. ETL Ab Initio Developer/Admin
Responsibilities:
- Used Ab Initio GDE to generate complex graphs for transformation and loading of data into Staging and Target Database areas.
- Involved in requirement study, analysis and design of the system.
- Created Process Data Flow diagrams in Ab Initio GDE for data conditioning, transformation, and loading.
- Generated configuration files, DML files and XFR files for specific record formats, which are used in components when building graphs in Ab Initio.
- Involved in creating Flat files using dataset components like Input file, Output file, Intermediate file in Ab Initio graphs.
- Developed graphs using multistage components.
- Developed new projects of EME model in coordination with designing and technological teams.
- Maintained desired working level of Ab Initio operational documentation and procedures.
- Executed effective processes for administration and maintenance of Ab Initio Key Server.
- Developed Ab Initio applications on shared basis for multiple applications.
- Supported technical teams in definition of requirements and creation of architectural specifications.
- Assisted in installation and configuration of Ab Initio software including patches and server consolidation processes.
- Prepared automated scripts, watcher files, and home and temporary directories for other teams; checked key and bundle status, requested new keys and installed them when required; configured the functional/application batch IDs.
- Built and migrated new applications on Red Hat Linux and released them to all environments.
- Performed server maintenance activities.
- Extensively Used Transform Components like Join, Reformat, Rollup and Scan Components.
- Implemented the component level and Data parallelism in Ab Initio for ETL process for Data warehouse.
- Extensively used partition components like Broadcast, Partition by Key, Partition by Range and Partition by Round Robin, and departition components like Concatenate, Gather and Merge in Ab Initio.
- Responsible for the automation of Ab Initio graphs using Korn shell scripts.
- Developed Ab Initio scripts for data conditioning, transformation, validation and loading.
- Developed various BTEQ scripts to implement business logic and process data in the Teradata database.
- Gathered information from different data warehouse systems and loaded it into the One Sprint Financial Information System consolidated model using FastLoad, FastExport, MultiLoad, BTEQ and UNIX shell scripts.
- Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
- Assisted in maintaining comprehensive data model documentation including detailed descriptions of business entities, attributes and data relationships.
- Implemented a complete data quality program that includes detection, remediation, reporting and alerting with DQE.
- Performed assessments of data files and implemented data quality systems in enterprise production data flows using DQE.
- Implemented data quality systems across the enterprise and placed them in enterprise applications with known data quality issues, using DQE.
- Worked on ACE, BRE and Metadata hub.
- Using DQE, accessed data sources from files or databases, joined data sources for subsequent data quality analysis, and compiled lookup files for use in data quality tests.
- Wrote validation tests that can detect null or blank values, valid and invalid values, data patterns, invalid data relationships, and the uniqueness of key values
- Used the quality application to compile lists of issues in the data source, compute data quality metrics, profile the input data, and publish the results to the Metadata Hub
- Extensively used the MultiLoad and FastLoad utilities to populate flat-file data into the Teradata database.
- Used FastExport to generate flat files after change data capture, producing loadable files used to load the database.
- Worked extensively on tuning the performance of the FastLoad and BTEQ utilities.
- Documented the process procedures and flow for the process.
- Modified the Ab Initio graphs to utilize data parallelism and thereby improve the overall performance to fine-tune the execution times.
- Created a generic Ab Initio job for automated testing, such as comparing source and target tables and checking for duplicates in the target.
- Performed Data cleansing and Data profiling for detecting and correcting inaccurate data from the databases and to track data quality and to assess the risk involved in integrating data for new applications.
- Created a generic job to verify data quality for flat files, before processing.
- Worked with Production support team to debug issues in production and for migrations from Pre-prod to Production.
- Created the detailed design documents and run books for generic jobs.
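As an illustration of the UNIX shell-wrapper pattern described in the bullets above, a minimal sketch follows; the graph name, directory layout and logging convention here are hypothetical examples, not taken from the actual project:

```shell
# Sketch of a wrapper for a deployed Ab Initio graph (illustrative names).
# Assumes the graph was deployed from the GDE as <graph>.ksh under $AI_RUN.
run_graph() {
    graph="$1"
    stamp=$(date +%Y%m%d_%H%M%S)
    log="$LOG_DIR/${graph}_${stamp}.log"
    if [ ! -x "$AI_RUN/$graph.ksh" ]; then
        echo "ERROR: deployed graph $AI_RUN/$graph.ksh not found"
        return 1
    fi
    # Run the deployed graph, capturing all output for the scheduler.
    "$AI_RUN/$graph.ksh" > "$log" 2>&1 \
        || { echo "ERROR: $graph failed, see $log"; return 2; }
    echo "OK: $graph completed, log: $log"
}

# Demo with a scratch directory standing in for the deployed-graph area.
AI_RUN=$(mktemp -d); LOG_DIR=$AI_RUN
printf '#!/bin/sh\nexit 0\n' > "$AI_RUN/load_customers.ksh"
chmod +x "$AI_RUN/load_customers.ksh"
status=$(run_graph load_customers)
echo "$status"
```

In practice such a wrapper would be invoked by the scheduler (Autosys here), with the return code driving alerting and restart from the last completed checkpoint.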
Environment: Ab Initio (GDE 3.2.6, Co>Operating System 3.2.2), UDB, DB2, Teradata V2R6 (FastLoad, MultiLoad, FastExport), UNIX IBM AIX 5.1 (shell scripts and wrapper scripts), Autosys, BTEQ
Confidential, Jacksonville, FL
Sr. Ab Initio Admin/Developer
Responsibilities:
- Worked on business requirements with extensive interaction with Business analysts, customers and reporting teams, and assisted in developing the low level design documents.
- Extracted data from various sources like databases, delimited flat files.
- Extensively Used Ab Initio components like Reformat, Scan, Rollup, Join, Sort, Partition By key, Normalize, Input Table, Output Table, Update Table, Gather Logs and Run SQL for developing graphs.
- Used inquiry and error functions like is_valid, is_error and is_defined, and string functions like string_substring, string_concat and other string_* functions, in developing Ab Initio graphs to perform data validation and data cleansing.
- Assisted in overall designing and development of software applications and systems. Implemented procedures for designing, configuration and maintenance of EME repositories.
- Provided technical assistance for development of file systems to support data, migration and archive directories.
- Provided technical consultancy services to team members and users regarding Ab Initio infrastructure.
- Implemented procedures for management of Ab Initio applications and Citrix servers.
- Responded to storage requests for submission to appropriate support groups.
- Conducted detailed analysis of data requirements to prepare design specifications and operational documents.
- Supported technical team members in establishment and maintenance of rigor and change controls.
- Executed effective processes for installation and validation on DR servers during drill.
- Developed and enhanced Ab Initio graphs and business rules using Ab Initio functions.
- Performed validations, data quality checks and Data profiling on incoming data.
- Knowledge of translating business requirements into workable functional and non-functional requirements at a detailed production level using Workflow Diagrams, Sequence Diagrams, Activity Diagrams and Use Case Modeling.
- Performed evaluations and made recommendations in improving the performance of graphs by minimizing the number of components in a graph, tuning the Max Core value, using Lookup components instead of joins for small tables and flat files, filter the data at the beginning of the graph etc.
- Debugged the failures from the log files and reran the jobs, by analyzing the checkpoints within graphs.
- Incorporated Error handling component, force error function and grouped error messages
- Extensively used file management commands like m_ls, m_wc, m_dump, m_cp, m_mkfs, etc.
- Responsible for deploying Ab Initio graphs and running them through the Co>Operating System mp shell command language, and for automating the ETL process through scheduling.
- Involved in Comprehensive end-to-end testing.
- Generate SQL queries for data verification and backend testing. Detect and Correct Data Quality issues.
- Performed Positive and Negative Testing Manually. Actively participated in walkthroughs and enhancement meetings. Created SQL statements to validate the data tables.
- Worked on improving the performance of Ab Initio graphs by using Various Ab Initio performance techniques like using lookup Tables, In-Memory Joins and rollups to speed up various Ab Initio Graphs.
- Wrote validation tests that can detect null or blank values, valid and invalid values, data patterns, invalid data relationships and the uniqueness of key values.
- Created several packages to set up and share global variables, types and transforms which were extensively used for many Ab Initio graphs.
- Worked with data mapping from source to target and data profiling to maintain the consistency of the data.
- Experienced with SQL queries. Knowledge of checking the data flow through the front end to back end and used the SQL queries to extract the data from the database to validate it at the back end.
- Performed Data Profiling to assess the risk involved in integrating data for new applications, including the challenges of joins and to track data quality.
- Implemented phasing and checkpoint approach in ETL process to prevent data loss and to maintain uninterrupted data flow against process failures.
- Involved in monitoring jobs and schedules through the Control-M scheduler. Tweaked the housekeeping scripts for cleaning up older data.
- Coordination with various teams for analysis on production issues. Analyzing and fixing defects during testing cycles.
- Participating in the agile trainings and meetings as a part of the agile team.
- Worked with Data profiling of incoming data feeds depending upon the classification of errors and warnings.
- Automated the complete daily, weekly and monthly refreshes using custom-built UNIX shell scripts.
- Created a generic job to verify data quality for vendor files before processing. Data loads in various environments to meet the business requirements.
- Involved in code reviews, system test reviews, test case reviews, test case execution result reviews for the projects designed.
- Worked on analyzing production defects and suggesting solutions to them.
- Worked with production support team to debug issues in production and for migrations from pre-prod to production.
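The housekeeping-script idea mentioned above can be sketched in shell roughly as follows; the 30-day retention window and directory names are illustrative assumptions, not the real settings:

```shell
# Sketch of a housekeeping cleanup: remove files older than a cutoff.
# Retention window and paths are illustrative, not from the real scripts.
cleanup_old() {
    dir="$1"; days="$2"
    # Count, then delete, files whose age exceeds the cutoff in days.
    count=$(find "$dir" -type f -mtime +"$days" | wc -l | tr -d ' ')
    find "$dir" -type f -mtime +"$days" -exec rm -f {} +
    echo "$count"
}

# Demo on a scratch directory: one fresh file, one backdated ~40 days.
work=$(mktemp -d)
touch "$work/fresh.dat" "$work/stale.dat"
old_epoch=$(( $(date +%s) - 40*24*3600 ))
touch -d "@$old_epoch" "$work/stale.dat" 2>/dev/null \
    || touch -t 202001010000 "$work/stale.dat"   # fallback for non-GNU touch
removed=$(cleanup_old "$work" 30)
echo "removed $removed old file(s)"
```

A script like this would typically be scheduled through Control-M alongside the ETL jobs so log and landing directories never fill the filesystem.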
Environment: Ab Initio (GDE 3.1, Co>Op Sys 3.1), UNIX, SQL, IBM DB2, Control-M
Confidential, Columbus, OH
Ab Initio Developer/Admin
Responsibilities:
- Involved in all the stages of SDLC during the projects. Analyzed, designed and tested the new system for performance, efficiency and maintainability.
- Worked with business to translate the business requirements into High level design, Detailed Level Design and Functional code.
- Translating client’s strategic vision into technical solutions and application identification.
- Used Ab Initio GDE to generate complex graphs for transformation and loading of data into Staging and Target database areas.
- Assisted in overall designing and development of software applications and systems.
- Implemented procedures for designing, configuration and maintenance of EME repositories.
- Provided technical assistance for development of file systems to support data, migration and archive directories.
- Provided technical consultancy services to team members and users regarding Ab Initio infrastructure.
- Implemented procedures for management of Ab Initio applications and Citrix servers.
- Responded to storage requests for submission to appropriate support groups.
- Conducted detailed analysis of data requirements to prepare design specifications and operational documents.
- Supported technical team members in establishment and maintenance of rigor and change controls.
- Executed effective processes for installation and validation on DR servers during drill.
- Involved in developing complex Ab Initio XFRs and conditional DMLs to derive new fields and meet various business requirements.
- Used partition components (Partition by Key, Partition by Expression, Partition by Round Robin) to split large data files into multiple data files.
- Used inquiry and error functions like is_valid, is_error and is_defined, and string functions like string_substring, string_concat and other string_* functions, in developing Ab Initio graphs to perform data validation and data cleansing.
- After the initial preliminary validation, was responsible for conditioning the data, translating lookup values, flattening the data and creating a standardized common format for the downstream processes.
- Created graph-level and project-level parameters for psets according to the requirements.
- Created test cases and involved in Comprehensive end-to-end testing while resolving the load reject issues that evolved after execution of unit testing of Ab-Initio graphs.
- Had participated in code review and Unit testing -Worked with QA team to resolve the issues.
- Generate SQL queries for data verification and backend testing.
- Created Sandbox to work on a specific project by giving the connection parameters required.
- Good experience working as the onsite coordinator in an onsite-offshore model.
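A data-verification query of the kind referred to above can be sketched as follows; the staging/target table names are placeholders, and the generated SQL would normally be run through SQL*Plus or a similar client against the Oracle database:

```shell
# Sketch: generate a source-vs-target row-count check (Oracle flavor).
# Table names are illustrative placeholders, not real project objects.
make_count_check() {
    src="$1"; tgt="$2"
    cat <<EOF
SELECT (SELECT COUNT(*) FROM $src) AS src_cnt,
       (SELECT COUNT(*) FROM $tgt) AS tgt_cnt
FROM dual;
EOF
}

sql=$(make_count_check STG_CUSTOMER DW_CUSTOMER)
echo "$sql"
```

Comparing the two counts after each load is a cheap first-line check before deeper column-level validation.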
Environment: AbInitio Co>Op 3.1, GDE 3.1, Oracle 9i/10g, Toad, SQL, PL/SQL, UNIX (Shell scripts)
Confidential, Irving, TX
Ab Initio Developer/Production Support
Responsibilities:
- Analyzed production issues and TRs, gathered information, proposed alternatives and bug fixes, and created ad hoc graphs to meet requirements.
- Involved in monitoring the jobs and schedules through Maestro and Autosys Scheduler.
- Performing batch process for various testing cycles and handling Critical data issues in Load, stress and production environments.
- Prepared UNIX shell scripts to run the graphs and calculation for reporting purpose.
- Understanding the Specification of Data Warehouse ETL processes.
- Worked on creating multifile systems; file management commands such as m_touch, m_ls, m_mv, m_cp, m_gzip, m_cat and m_dump were extensively used.
- Extensively used m_db commands to query the Oracle databases for reporting purposes.
- Involved in moving applications from AIX to Linux server and modified the shell scripts for that application in order to run on Linux server.
- Involved in production deployment activities like validating code tags, smoke testing, etc.
- Expertise in troubleshooting and handling production support duties on a 24/7 basis; also worked in an onsite-offshore model.
- Tweaked the housekeeping scripts for cleaning up older data.
- Coordinated with various teams for analysis of production issues.
- Analyzing and fixing defects during testing cycles.
- Demonstrated ability to grasp team guidelines, processes, practices and procedures.
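The multifile-system commands mentioned above can be sketched as below; the host name and 4-way partition layout are invented for illustration, and each command is guarded so the sketch is harmless on a machine where the Co>Operating System is not installed:

```shell
# Sketch of Co>Operating System multifile commands (illustrative paths).
# Each command runs only if the tool is actually on the PATH.
mfs=//etlhost/data/mfs4
checked=0
for cmd in \
    "m_mkfs $mfs //etlhost/d1 //etlhost/d2 //etlhost/d3 //etlhost/d4" \
    "m_touch $mfs/cust.dat" \
    "m_ls $mfs" \
    "m_wc $mfs/cust.dat"
do
    tool=${cmd%% *}
    if command -v "$tool" >/dev/null 2>&1; then
        $cmd
    else
        echo "SKIP (Co>Op not installed): $cmd"
    fi
    checked=$((checked + 1))
done
echo "reviewed $checked m_* commands"
```

The m_* commands mirror their UNIX counterparts but operate across all partitions of a multifile at once.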
Environment: Ab Initio Co>Op 3.0, GDE 3.0, Oracle 9i/10g, Toad, SQL, UNIX, Teradata V2R6
Confidential, Wilmington, DE
Ab Initio Developer
Responsibilities:
- Understanding the specifications for Data Warehouse ETL Processes and interacting with the data analysts and the end users for informational requirements.
- Development of Ab Initio applications for survey Project.
- Working closely with Business Analysts to interpret the Business rules and make necessary modifications to process the data for accuracy.
- Used Ab Initio GDE to generate complex graphs for transformation and Loading of data into Staging and Target Data base area.
- Used UNIX environment variables in various .ksh files, which comprises of specified locations to build Ab Initio Graphs.
- Performed data extraction from source systems (Oracle 9i/10g) for the landing pad data.
- Involved in loading data to staging area and into Warehouse tables using SQL Loader.
- Created Ab Initio graphs that transfer data from various sources like Oracle, flat files and CSV files to the target database.
- Extensively used the Teradata utilities like BTEQ, FastLoad, MultiLoad, TPump, DDL commands and DML commands.
- Created Domains and Lookups to decode the incoming source data.
- Used the Ab Initio Web Interface to Navigate the EME to view graphs, files and datasets and examine the dependencies among objects
- Developed and supported the extraction, transformation and load process (ETL) for a Data Warehouse from their OLTP systems using Ab Initio and provide technical support and hands-on mentoring in the use of Ab Initio.
- Modified the Ab Initio graphs to utilize data parallelism and thereby improve the overall performance to fine-tune the execution times.
- Responsible for all pre-ETL tasks upon which the Data Warehouse depends, including managing and collection of various existing data sources.
- Responsible for writing shell scripts (wrapper) to schedule the jobs in production environment.
- Developed graphs for the ETL processes using Join, Rollup, Scan, Normalize, Denormalize and Reformat transform components as well as Partition and Departition components extensively.
- Extensively used the Ab Initio component and data parallelism features.
- Implemented lookups, lookup_local, in-memory joins and rollups to speed up various Ab Initio graphs.
- Maintained locks on objects while working in the sandbox to prevent conflicting changes.
- Worked on improving the performance of Ab Initio graphs by using various Ab Initio performance techniques, such as using lookups instead of joins.
- Guided the production support members in fixing issues with duplicates in the data, data format errors, etc. for files having millions of records.
- Extensively used Autosys for scheduling purposes.
- Responsible for maintaining the ETL resources in the mainframe environment.
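A BTEQ batch script of the kind described above might look roughly like this; the logon string, database and table names are placeholders (the BTEQ dot-commands such as .LOGON, .EXPORT and .QUIT are standard, but everything else is illustrative):

```shell
# Sketch: stage a BTEQ export script; run it only if bteq is available.
# Logon credentials and object names are placeholders.
bteq_script=$(mktemp)
cat <<'EOF' > "$bteq_script"
.LOGON tdpid/etl_user,password;
.EXPORT REPORT FILE = /tmp/daily_balances.out;
SELECT acct_id, balance
FROM   fin_db.daily_balances
WHERE  load_dt = CURRENT_DATE;
.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF
if command -v bteq >/dev/null 2>&1; then
    bteq < "$bteq_script"
else
    echo "bteq not on PATH; script staged at $bteq_script"
fi
```

Scripts in this style are typically wrapped in shell and scheduled, with BTEQ's return code used to fail the batch on SQL errors.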
Environment: Ab Initio Co>Op 2.14/2.15, GDE 1.14/1.15, Oracle 9i/10g, Toad, SQL, IBM 390 Mainframes, UNIX, Teradata V2R6, Autosys, PVCS, ERwin
Confidential
Database Developer
Responsibilities:
- Normalized the data and created data tables with database constraints. Analyzed and studied the existing system.
- Developed screens using Visual Basic. Developed reports using inner and outer joins. Developed ActiveX components in Visual Basic.
- Developed stored procedures and triggers. Automated Email messaging system.
- Tested and debugged the stored procedures.
- Designed, wrote and tested stored procedures and triggers to support application logic. Performed query optimization and analyzed query plans.
- Designed and developed Sybase stored procedures and triggers to maintain the data integrity
- Created clustered, non-clustered and unique indexes for better query performance.
- Analyzed and issued performance tuning recommendations for configuration and SQL code changes to resolve problems in Sybase production environment.
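The Sybase work above could involve DDL along these lines; the table, index and procedure names are invented for illustration, and in practice the file would be executed through isql:

```shell
# Sketch: stage Sybase T-SQL (index plus stored procedure) for isql.
# All object names are illustrative placeholders.
ddl=$(mktemp)
cat <<'EOF' > "$ddl"
CREATE UNIQUE CLUSTERED INDEX ix_orders_id
    ON orders (order_id)
go
CREATE PROCEDURE usp_get_order
    @order_id int
AS
    SELECT order_id, cust_id, amount
    FROM   orders
    WHERE  order_id = @order_id
go
EOF
echo "staged Sybase DDL in $ddl ($(grep -c '^go$' "$ddl") batches)"
# In practice: isql -U <user> -S <server> -i "$ddl"
```

A clustered index on the lookup key keeps the procedure's point query a single index seek.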
Environment: Visual Basic 5.0, ActiveX, ODBC, SQL, PL/SQL, MS Access, Windows NT