
Ab Initio Consultant/Developer Resume


SUMMARY

  • 9 years of experience in Ab Initio, Oracle, and Unix; technically adept and confident, with strong skills in coding, application maintenance/support, and documentation.
  • Used Ab Initio to extract data from flat files and database tables, transform it to the required format, apply business logic, and load it into the data warehouse. Connected to the Teradata database using Ab Initio.
  • Worked with Ab Initio components such as Join, Lookup, Reformat, Sort, Filter by Expression, Replicate, Write Excel, Send Mail, Dedup Sort (to remove duplicates), and the partition and departition components.
  • Developed wrapper scripts, Ab Initio graphs, and UNIX shell scripts.
  • Good experience in SQL and Unix shell scripting.
  • Good experience with the Ab Initio EME repository.
  • Extensively used air commands to import and export objects and sandboxes, create and delete tags, and diff versions.
  • Good experience upgrading Ab Initio graphs.
  • Knowledge of continuous graphs, metaprogramming, and Plans.
  • Knowledge of downloading data from cloud servers.
  • Used the Read JSON and Normalize components to read JSON data and normalize it into a readable format; used json-to-dml to generate the DML.
  • Involved in preparing design documents for ETL projects.
  • Good exposure to creating and running AutoSys jobs.
  • Good knowledge of Informatica, the Business Rule Engine (BRE), and the Data Profiler.
  • Very good exposure to the entire software development life cycle: requirements collection, analysis, design, development, and maintenance.
  • Good experience in software testing and quality assurance; excellent analytical, communication, interpersonal, programming, and problem-solving skills.
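The wrapper scripts mentioned above usually follow one small, reusable pattern: run the deployed graph script, log the start and end, and hand the graph's exit status back to the scheduler. Below is a minimal sketch of that pattern in plain shell; all names and paths are hypothetical, and a real wrapper would also source the project's environment setup first.

```shell
#!/bin/sh
# Minimal wrapper pattern for a deployed Ab Initio graph script
# (names and paths are hypothetical). Logs START/END markers and
# returns the graph's exit status so cron/AutoSys can detect failures.

run_graph() {
    graph_script="$1"                                    # deployed .ksh to run
    log_file="${2:-/tmp/$(basename "$graph_script").log}"

    echo "START $(date) $graph_script" >> "$log_file"
    "$graph_script" >> "$log_file" 2>&1                  # run graph, capture output
    rc=$?
    echo "END rc=$rc $(date)" >> "$log_file"
    return "$rc"
}
```

A scheduler entry would then call something like `run_graph /apps/etl/run/load_daily.ksh`; a nonzero exit tells the scheduler the graph failed.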

TECHNICAL SUMMARY:

  • Ab Initio GDE (1.13 to 1.15), Ab Initio Co>Operating System (2.12 to 3.0)
  • Oracle 10g
  • Sun Solaris
  • IBM AIX
  • Redhat Linux
  • Windows
  • UNIX Shell Scripting
  • PL/SQL
  • SQL
  • AutoSys
  • Cron
  • BRE (Business Rule Engine)
  • Data Profiler
  • Toad
  • Informatica
  • SQL Server Reporting Services 2005/2008/2013
  • SQL Server Analysis Services 2005/2008/2003
  • MS SQL Server Integration Services 2003

PROFESSIONAL EXPERIENCE

Confidential, CA

Ab Initio Consultant/Developer

Responsibilities:

  • Analyzed business needs and documented functional and technical specifications based on user requirements, with extensive interaction with business users.
  • Worked on the design and solution documents.
  • Worked closely with the business on the BRE (Business Rule Engine) concept.
  • Worked with all Ab Initio components, such as Reformat, Join, Partition by Key, Partition by Expression, Broadcast, Gather, Merge, Lookup, database components, and multifiles.
  • Developed complex Ab Initio graphs using Ab Initio parallelism techniques, data parallelism, and MFS techniques, with conditional components and conditional DML, for the data load from the legacy system to the new target system.
  • Used the Read XML component to convert XML data into a readable format.
  • Extracted data from a cloud server, converted it from JSON format, and created data files that are used as a source for the graph.
  • Developed a graph that converts JSON-format data into flat-file data.
  • Created new AutoSys jobs for job scheduling.
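A new AutoSys job of the kind described is defined through JIL. The fragment below is only an illustrative sketch of a command job; the job, machine, owner, and path names are all hypothetical:

```
/* Illustrative JIL: schedule a deployed graph wrapper daily at 03:00 */
insert_job: dwh_json_to_flat   job_type: cmd
command: /apps/etl/bin/run_json_to_flat.ksh
machine: etl_host01
owner: etlbatch
start_times: "03:00"
std_out_file: /apps/etl/logs/dwh_json_to_flat.out
std_err_file: /apps/etl/logs/dwh_json_to_flat.err
alarm_if_fail: 1
```

A definition like this is typically loaded with the `jil` command, after which the job can be force-started for testing.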

Confidential

Ab Initio Developer

Responsibilities:

  • Analyzed the existing process for the current application's data coming from different product processors.
  • Worked on the design and solution documents.
  • Involved in loading flat files received from the source systems into the warehouse table, transforming and loading the data into the mart table, and finally unloading it from the mart to generate the billing file (containing volume or customer information) for clients.
  • Developed graphs and wrapper scripts for the project.
  • Worked with all Ab Initio components, such as Reformat, Join, Partition by Key, Partition by Expression, Broadcast, Gather, Merge, Lookup, database components, and multifiles.
  • Developed a generic graph that creates lookup files according to the parameter passed.
  • Developed an SLA script that watches for the file expected from the source system; if the file is not received in time, a mail alert is sent to the source system that the feed has missed its window.
  • Converted cron jobs into AutoSys jobs.
  • Prepared JILs, uploaded them to the development environment, and tested them.
  • Interacted with the quality support team to explain the application to be tested per the requirements.
  • Performed unit testing and integration testing.
  • Prepared deployment plan steps to migrate code between environments.
  • Prepared the solution document for enhancements made to the existing application running in production.
  • Contacted the existing CITIMIS application team members and coordinated with them to gather all possible information on the process.
  • Analyzed the existing process for the current CITIMIS application's volume data and the new volume generated in the pre-production environment against production.
  • Worked on the design and solution documents.
  • Developed graphs and wrapper scripts for the project.
  • Developed a complex graph that calculates billing information based on region and loads the data into the mart table.
  • Developed a generic graph that unloads data from the mart table and generates billing data for different regions, with the region passed as a parameter to the graph.
  • Developed a generic graph that creates lookup files according to the parameter passed.
  • Involved in performance tuning of the graphs.
  • Developed graphs that unload data from the mart table, transform it to generate a report, and send the report to the client using the Send Mail component.
  • Created and uploaded AutoSys jobs in the development environment to schedule the graphs.
  • Interacted with the quality support team to explain the application to be tested per the requirements.
  • Performed unit testing and integration testing.
  • Interacted with the User Acceptance Testing team to run the graphs in the UAT environment.
  • Interacted with the production support team to migrate the code to the production environment.
  • Prepared deployment plan steps to migrate code between environments.
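The SLA check described above can be sketched in plain shell. The version below only emits an alert line; in the real script the alert branch would call the site's mail command (for example `mailx`), and the feed path is hypothetical:

```shell
#!/bin/sh
# Illustrative SLA check: has the expected source feed arrived?
# Returns 0 if the file exists, 1 (after emitting an alert) if not.

check_feed() {
    feed_file="$1"

    if [ -f "$feed_file" ]; then
        echo "OK: feed $feed_file received"
        return 0
    fi

    # Real script: mail the source-system team here, e.g. via mailx.
    echo "ALERT: source feed $feed_file missed its SLA window"
    return 1
}
```

Scheduled a few minutes after the feed's cutoff time, a nonzero exit (or the ALERT line in the log) signals the missed SLA.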

Confidential

Equity Compensation Data Warehouse

Responsibilities:

  • Designed and built Ab Initio graphs for unloading data from different source systems; performed unit testing.
  • Involved in UAT support and enhancements.
  • Extensively used the EME for version control; created sandboxes for the check-in and check-out process.
  • Prepared JILs, uploaded them to the DEV environment, and tested the jobs through the JILs.
  • Involved in migration from the DEV environment to the SIT and UAT environments.
  • Communicated with the Ab Initio support team on any GDE-related issues.
  • Implemented the EDW (Enterprise Data Warehouse) project from scratch.
  • Generated Ab Initio components and XFRs from the Ab Initio BRE.
  • Involved in preparing unit test cases and executing them.
  • Extensively worked on Plans.

Environment: Ab Initio GDE 1.15, Ab Initio Co>Operating System 2.15, Ab Initio BRE, Ab Initio Conduct>It, Oracle, UNIX

Confidential

Ab Initio Developer

Responsibilities:

  • Developed Ab Initio graphs for daily and monthly cycles to load, partition, clean, and populate data per legal and business requirements.
  • Used data parallelism, pipeline parallelism, and component parallelism in graphs, where huge data files are partitioned into multifiles and each segment is processed simultaneously.
  • Used the Sort component to sort tables and Dedup Sort to remove duplicate values.
  • Worked in a sandbox environment while interacting extensively with the EME to maintain version control of objects; sandbox features such as check-in and check-out were used for this purpose.
  • Worked with departition components such as Gather and Merge, which recombine partitioned (multifile) data into a single flow after parallel processing.
  • Used Ab Initio components such as Sort, Partition, Rollup, Reformat, and Merge to build complex graphs.
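Outside Ab Initio, the Sort-then-Dedup pattern in the bullets above maps onto the classic Unix sort-and-unique pipeline. This is only an analogy for the idea, not the components themselves: Dedup Sort, like `uniq`, relies on its input already being sorted on the key.

```shell
# Sort first, then drop adjacent duplicates -- the same contract as
# feeding a Sort component into Dedup Sort keyed on the full record.
printf 'b\na\nb\nc\na\n' | sort | uniq
# prints: a, b, c (one per line)
```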
