
Ab Initio Developer Resume


SUMMARY

  • 9 years of IT experience in software development, maintenance, and support across various data warehouse technologies.
  • 9 years of experience in Ab Initio design, development, and maintenance activities.
  • Experienced in writing UNIX shell scripts and AutoSys JIL scripts for job scheduling.
  • Designed various high-level, low-level, and technical architecture solutions using Ab Initio.
  • Extracted business requirements and proposed suitable ETL solutions.
  • Worked with RDBMSs such as Oracle, Sybase, DB2, and Teradata using various ETL tools.
  • Developed Ab Initio graphs, built wrapper scripts, and installed jobs.
  • Designed environment scripts for invoking environment variables.
  • Good programming skills in SQL and shell scripting.
  • Strong data warehousing concepts.
  • Managed a team efficiently in an onsite-offshore model and allocated work to resources.
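The wrapper scripts mentioned above can be illustrated with a minimal sketch; the function name, paths, and log layout are hypothetical, and it assumes the graph has already been deployed from the GDE as an executable script:

```shell
# run_graph: minimal wrapper sketch (all names and paths hypothetical).
# Executes a deployed graph script, captures stdout/stderr in a log file,
# and propagates the exit status so a scheduler such as AutoSys can
# distinguish success from failure.
run_graph() {
    graph_script="$1"                                    # deployed .ksh from the GDE
    log_file="${2:-/tmp/$(basename "$graph_script").log}" # assumed log location

    echo "START $(date)" > "$log_file"
    "$graph_script" >> "$log_file" 2>&1                  # run the graph, capture output
    rc=$?
    echo "END rc=$rc" >> "$log_file"
    return "$rc"                                         # non-zero signals job failure
}
```

A scheduler entry would then call the wrapper rather than the graph script directly, so every run leaves a timestamped log and a clean exit code.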
TECHNICAL SKILLS

Databases: Oracle 9i/10g, SQL, PL/SQL, Sybase, DB2, Teradata

Operating Systems: Windows 9x/XP/2000, Linux, UNIX (SunOS 5.9)

Tools (ETL): Ab Initio GDE 1.12-1.15, Co>Operating System/EME 2.12-2.15

OLAP: Concepts of Business Objects (5, XI), Hyperion Essbase and reports

Automation: AutoSys, Control-M

Web Related: HTML

Languages: C, C++

Hardware: Sun E25K, EMC SAN; knowledge of disk-space estimation

PROFESSIONAL EXPERIENCE

Confidential

Ab Initio Developer

Environment: Ab Initio, DB2, AIX, Autosys

Responsibilities:

  • Analyzed requirements and prepared technical design documents.
  • Created the functional requirement document.
  • Created a new sandbox/project directory structure.
  • Designed and developed batch graphs using Ab Initio.
  • Created generic Ab Initio graphs and scripts.
  • Tuned Ab Initio graphs for optimal performance.
  • Wrote shell scripts and database queries.
  • Wrote JIL scripts and scheduled AutoSys jobs.
  • Performed unit testing and system testing.
  • Supported acceptance testing.
  • Coordinated between onsite and offshore teams.
  • Coordinated with the production support team and developed support guides for various interfaces.
  • Participated in infrastructure activities such as COB testing and ETL server migration.
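The AutoSys scheduling work described above can be sketched as a JIL job definition; the job, machine, owner, and path names here are hypothetical:

```jil
/* Hypothetical JIL definition for a nightly ETL load job */
insert_job: DW_CUST_LOAD   job_type: cmd
command: /apps/etl/bin/run_cust_load.ksh
machine: etlprod01
owner: etlbatch
start_times: "02:00"
condition: s(DW_CUST_EXTRACT)   /* run only after the extract job succeeds */
std_out_file: /apps/etl/logs/DW_CUST_LOAD.out
std_err_file: /apps/etl/logs/DW_CUST_LOAD.err
alarm_if_fail: 1
```

The `condition` attribute chains jobs into a dependency graph, which is how batch sequences such as extract-then-load are typically expressed in AutoSys.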

Confidential

ETL Lead

Environment: Ab Initio, Teradata, AIX

Responsibilities:

  • Gathered and analyzed requirements.
  • Performed ETL analysis and design.
  • Designed and developed Ab Initio graphs.
  • Wrote wrapper scripts.
  • Scheduled Ab Initio tasks using Conduct>It and dynamically generated plan sets.
  • Worked on continuous flows to receive and process data from the source.
  • Resolved business-critical issues.
  • Participated in Agile meetings.
  • Ensured smooth communication with clients and notified them of run operations.
  • Coordinated and managed the offshore team.

Confidential

Senior Technical Specialist/Module Leader

Environment: Ab Initio, Oracle, Solaris, Autosys

Responsibilities:

  • Analyzed requirements and prepared technical design documents.
  • Developed SQL scripts to install table structures, views, sequences, synonyms, indexes, and other database objects.
  • Wrote stored procedures and packages to implement business logic.
  • Created configuration tables to pass graph-level and environment parameters to the ETL process.
  • Created partitioned tables for efficient management of data.
  • Granted privileges and roles to functional IDs and business users.
  • Coordinated with various teams and created tickets for DDL deployment to UAT and PROD.
  • Developed various Ab Initio graphs using GDE 1.14 to extract, transform, and load data into OATS tables.
  • Created Ab Initio pset files to parameterize graphs.
  • Implemented complex business logic using Ab Initio.
  • Used different file formats such as XML and flat files.
  • Implemented user-defined types and functions to manipulate and transform data.
  • Used components such as Reformat, Scan, Rollup, datasets, lookup files, Sort and Dedup Sorted, Read Multiple Files, Join, FTP, partition components, and table components to transform and load data.
  • Prepared source field mapping documents.
  • Created AutoSys JIL scripts for job scheduling.
  • Developed shell scripts used in the ETL process.
  • Created sandboxes for various projects.
  • Checked code in and out of the EME repository.
  • Performed unit testing per the test cases designed in the design phase.
  • Performed integration testing and checked dependencies.
  • Supported user acceptance testing.
  • Enhanced performance of existing and newly developed processes.
  • Adhered to the change management process and migrated graphs to the production environment after proper approval.
  • Resolved critical production issues and fixed bugs in a timely manner.
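The partitioned-table work above might look like the following Oracle DDL; the table and column names are hypothetical:

```sql
-- Hypothetical Oracle DDL: a daily range-partitioned staging table
CREATE TABLE stg_orders (
    order_id    NUMBER       NOT NULL,
    order_date  DATE         NOT NULL,
    amount      NUMBER(18,2)
)
PARTITION BY RANGE (order_date) (
    PARTITION p_20100101 VALUES LESS THAN (TO_DATE('2010-01-02','YYYY-MM-DD')),
    PARTITION p_20100102 VALUES LESS THAN (TO_DATE('2010-01-03','YYYY-MM-DD')),
    PARTITION p_max      VALUES LESS THAN (MAXVALUE)  -- catch-all partition
);
```

Daily partitions let an ETL process load or drop one day of data without touching the rest of the table.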

Confidential

ETL ANALYST /DEVELOPER

Environment: Ab Initio, Oracle, Solaris, Autosys

Responsibilities:

  • Analyzed functional requirements and prepared technical design documents.
  • Scheduled, executed, monitored, and reviewed jobs in the data warehouse environment.
  • Investigated technical and business problems related to the data warehouse environment.
  • Followed up on and assisted in resolving technical and business problems related to the data warehouse environment.
  • Decommissioned warehouses and components of the data warehouse environment.
  • Assisted with the prioritization and administration of tasks such as problem tickets and user requests.
  • Provided appropriate updates to all relevant documentation.
  • Received problem reports from the GECF-Americas help desk.
  • Notified the GECF-Americas help desk of problems arising in the performance of services, including through automated alarms.
  • Monitored, controlled, and managed each problem until corrected or resolved, coordinating with GECF-Americas as needed.
  • Responded to and resolved problems within scope.
  • Adhered to procedures for escalation, review, and reporting, and took appropriate measures to prevent recurrence of problems.
  • Improved the average time to resolve problems using commercially reasonable efforts.
  • Participated in root cause analysis exercises with GECF-Americas service delivery managers to ensure problems were corrected appropriately, and implemented proactive processes to prevent future occurrences.
  • Maintained communications through resolution with the GECF-Americas help desk and other relevant GECF-Americas IT contacts, who notified affected end users of problems.
  • Resolved problems causing unavailability or unresponsiveness of the data warehouse environment.
  • Resolved invalid data within the data warehouse environment, regardless of the source of the problem.
  • Analyzed tables and rebuilt indexes for performance gains during query execution.
  • Monitored the performance of queries and tuned them as needed.
  • Created database objects such as tables and indexes with daily, monthly, and yearly list and hash partitioning options.
  • Coordinated with the ETL team during production migration and resolved issues.
  • Monitored space usage and tablespace growth during graph execution in the ETL processes.
  • Coordinated with the application team to develop scripts that need to run on the production server.
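The index-rebuild and space-monitoring work above can be sketched with standard Oracle statements; the index name is hypothetical, and the dictionary query assumes DBA-level access:

```sql
-- Rebuild an index (name is hypothetical) without blocking concurrent DML:
ALTER INDEX idx_orders_date REBUILD ONLINE;

-- Monitor tablespace growth from the data dictionary:
SELECT tablespace_name,
       ROUND(SUM(bytes) / 1024 / 1024) AS used_mb
FROM   dba_segments
GROUP  BY tablespace_name
ORDER  BY used_mb DESC;
```

Running the space query before and after a large ETL load shows how much each tablespace grew during graph execution.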
