
Senior Ab Initio Developer Resume


Norfolk, VA

PROFESSIONAL SUMMARY:

  • 8+ years of experience in the design and development of data warehouse and business intelligence solutions using the Ab Initio ETL tool.
  • 3+ years of experience working with public cloud platforms such as Amazon Web Services, with good knowledge of services and tooling including EC2, VPC, the AWS CLI, S3, Route 53, Terraform, and CloudFormation.
  • AWS Certified Developer - Associate.
  • Worked with various operational data sources such as flat files, Oracle, and SQL Server.
  • Knowledge of data warehousing concepts and dimensional modeling, such as star schema.
  • Expertise in extracting data from multiple sources, data cleansing, and validation based on business requirements.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as Partition by Key, Partition by Round-robin, Reformat, Rollup, Join, Scan, Replicate, and Merge.
  • Extensively used components such as Input and Output (Table/File), Filter by Expression, Rollup, Sort within Groups, Dedup Sorted, Reformat, and Join.
  • Invoked multiple libraries in Spark using Scala, utilizing DataFrames and the Spark SQL API for faster data processing.
  • Knowledge of the design and implementation of CI/CD (Continuous Integration and Continuous Delivery) pipelines using tools such as Jenkins, TeamCity, TFS, Git, and Release Management.
  • Installed, configured, and maintained Jenkins and Bamboo for continuous integration (CI) and end-to-end automation of all builds and deployments.
  • Configured and monitored distributed and multi-platform servers using Chef and Ansible.
  • Familiar with Amazon cloud administration, including services such as EC2, S3, EBS, VPC, ELB, AMI, SNS, RDS, Redshift, IAM, Route 53, Auto Scaling, CloudFront, CloudWatch, CloudTrail, CloudFormation, OpsWorks, ELK, and Security Groups.
  • Expertise in all components of the Ab Initio GDE for creating, executing, testing, and maintaining graphs, with experience in Ab Initio Co>Operating System application tuning and debugging strategies.
  • Experienced in using plans, PDLs, psets, and Conduct>IT.
  • Created generic plans that run validations for the specific psets that triggered the plan.
  • Worked closely with business users to help them understand the usage of ACE and BRE.
  • Implemented parallelism using Ab Initio in a very large database systems environment.
  • Experienced in UNIX shell scripting.
  • Configured transformations using ACE and BRE.
  • Designed, developed, and deployed well-tuned Ab Initio graphs (generic and custom) for the UNIX environment.
  • Configured the Ab Initio environment to connect to databases using a DB configuration file and the Input Table, Output Table, and Update Table components.
  • Involved in system and integration testing of projects; experienced in configuring and monitoring jobs in the Control-M scheduler.
  • Experienced in branching, tagging, and maintaining versions across environments using SCM tools such as Git.
  • Effectively manage important projects and programs in fast-paced, time-critical environments.
  • Self-motivated and proactive leader with superb technical acumen and exemplary communication skills. Exceptional ability to create, implement, and improve IT standards, policies, and procedures.
  • Worked on Jenkins and Hudson for continuous integration and end-to-end automation of all builds and deployments.

TECHNICAL SKILLS:

ETL Tools: Ab Initio GDE 1.15, 3.1.5, 3.2.5, 3.2.6, Co>Operating System 2.15, 3.1, 3.3.4.2, MDH, TDM, Query>IT, Conduct>IT, ACE, BRE

Data Modeling Tools: Erwin 4.0, Oracle Designer

Frontend Tools: TOAD 7.x, Hive, WinSCP, BMC Remedy, SoapUI, PuTTY

RDBMS: Oracle 10g/9i/8i, SQL Server 2008, DB2, Teradata V2R6, Cassandra, PostgreSQL

Scripting Languages: SQL, UNIX Shell Scripting.

Operating Systems: Sun Solaris 8.x, HP-UX 11i, Linux, UNIX, Windows 7/Vista/XP/2000/NT

Scheduling Tools: Autosys, Control-M, CA-7

AWS Services: EC2, S3, EBS, VPC, ELB, AMI, SNS, RDS, Redshift, IAM, Route 53, Auto Scaling, CloudFront, CloudWatch, CloudTrail, CloudFormation, OpsWorks, ELK, Security Groups.

Version control tools: GIT

PROFESSIONAL EXPERIENCE:

Confidential, Norfolk, VA.

Senior Ab Initio Developer

Responsibilities:

  • Involved in the software development life cycle, from creating the high-level design to the detailed design of the project.
  • Used the Ab Initio ETL tool to pull data from source systems and to cleanse, transform, and load the data into an Oracle database.
  • Developed and provided operational (break-fix) support to existing reporting, ETL, and data warehouse applications.
  • Designed, coded, and developed application architectural redesigns and workflows using SQL, Ab Initio, and DB2.
  • Responsible for troubleshooting, identifying, and resolving data problems; worked with analysts to determine data requirements, identify data sources, and provide estimates for task duration.
  • Developed file transfer scripts that transfer files from the source server through SFTP.
  • Developed a validation graph using the Write Multiple Files component; the graph reads through the incoming data.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as Partition by Key, Partition by Round-robin, Reformat, Rollup, Join, Scan, Replicate, and Merge.
  • Extensively used components such as Input and Output (Table/File), Filter by Expression, Rollup, Sort within Groups, Dedup Sorted, Reformat, and Join.
  • Troubleshot network-related connectivity issues using CloudWatch, AWS Config, and network interfaces.
  • Developed a Jenkins shared library used in the CI/CD process to automate application development, build, scan, and deployment to multiple platforms.
  • Designed and developed high-availability solutions in the AWS cloud to support applications.
  • Defined dataflow requirements, interpreted the transformation rules for all target data objects, developed graphs to support the transformations, and documented technical specifications.
  • Involved in designing and deploying a multitude of applications utilizing much of the AWS stack (including EC2, S3, AMI, Route 53, RDS, SNS, SQS, and IAM), focusing on high availability, fault tolerance, and auto scaling with Terraform scripting.
  • Developed generic graphs and wrappers for the project.
  • Developed plans, sub-plans, and looped plans using Conduct>IT.
  • Designed and developed a generic graph that generates extract psets from an extract metadata table.
  • Monitored performance in all environments using Ab Initio Control Center and the Control-M/Autosys schedulers.
  • Worked closely with different teams, such as networking and DBA, to achieve end-to-end automation of AWS infrastructure.
  • Created continuous graphs using Subscribe and Publish components.
  • Extensively used partition components such as Partition by Key and departition components such as Concatenate, Gather, and Merge in Ab Initio.
  • Responsible for the automation of Ab Initio graphs using Korn shell scripts.
  • Carried the business requirements forward through conceptual solution and design implementation.
  • Worked with business analysts daily to deliver optimal business solutions.
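The SFTP transfer scripts mentioned above can be sketched in shell. The host, paths, and file names below are hypothetical, and the sketch only builds the sftp batch commands rather than opening a real connection:

```shell
#!/bin/sh
# Sketch of an SFTP push script (hypothetical host, paths, and names).
# make_batch emits the sftp batch commands for one file; a real run
# would then execute something like: sftp -b "$tmpfile" etluser@target-host
make_batch() {
    src_file="$1"     # local file to push
    remote_dir="$2"   # remote landing directory
    printf 'cd %s\n'  "$remote_dir"
    printf 'put %s\n' "$src_file"
    printf 'bye\n'
}

# Example invocation with hypothetical paths:
batch_cmds="$(make_batch /data/out/claims.dat /incoming)"
echo "$batch_cmds"
```

In practice, such a wrapper would also check exit codes and archive the file after a successful push.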

Environment: Ab Initio GDE 3.2.7.2, 3.1.1, Co>Operating System 3.1.7, 3.2.7.10, SQL, DB2, Oracle, Mainframe, XML, UNIX Sun Solaris, UNIX Shell Scripts, Query>IT, MDH, TDM, Conduct>IT, Linux, Control-M, SoapUI, AWS EC2, S3, AMI, CloudFormation.

Confidential, Atlanta, GA.

Senior Ab Initio Developer

Responsibilities:

  • Involved in designing and deploying a multitude of applications utilizing much of the AWS stack (including EC2, S3, AMI, Route 53, RDS, SNS, SQS, and IAM), focusing on high availability, fault tolerance, and auto scaling with AWS CloudFormation.
  • Ensured the end-to-end data load flow in all graphs adhered to source file validation, data quality checks, data masking, reconciliation, and reject/error handling.
  • Performed data profiling and troubleshooting of the solution by examining the data available in existing data sources; all results were loaded into MHub.
  • Launched and configured Amazon EC2 cloud servers using AMIs.
  • Configured an AWS Virtual Private Cloud (VPC) and a database subnet group for isolation of resources within Amazon RDS.
  • Invoked multiple libraries in Spark using Scala, utilizing DataFrames and the Spark SQL API for faster data processing.
  • Used Spark SQL to migrate data from Hive to Python using the PySpark library.
  • Developed common code components that can be reused across projects, reducing effort and cost.
  • Reviewed and audited the existing solution, design, and system architecture; provided resolutions during the QA/UAT/Prod phases.
  • Developed Ab Initio graphs for receiving data from various sources such as mainframe, XML, databases (DB2, Oracle, SQL Server), and flat files, converting them into Hadoop files for use in distribution and outbound processes.
  • Responsible for the design and maintenance of the Git repositories, views, and access control strategies.
  • Integrated read and write HDFS components in new and migrated applications to the data lake.
  • Worked extensively with the big data team in creating Hive databases and tables for the Hadoop file system.
  • Implemented performance tuning techniques to optimize existing applications integrated with the Hadoop file system.
  • Created a Lambda function that aggregates data from incoming events and stores the results in Amazon DynamoDB and S3.
  • Resolved test harness issues for all cut-over applications critical to the business.
  • Provided progress reports to the project management group on design/development status, issues/concerns, and milestones.
  • Centralized metadata capture and ensured end-to-end lineage in MHub, maintaining enterprise data governance.
  • Worked with the CA7/ESP scheduler tool for scheduling jobs.
  • Identified the best techniques for data provisioning using the Ab Initio ETL tool with the big data appliance.
  • Used Jenkins to drive all microservice builds out to the Docker registry and then deploy them to Kubernetes.
  • Worked closely with business analysts, data modelers, and data owners to ensure end-to-end design needs were met without any change in user experience.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as Partition by Key, Partition by Round-robin, Reformat, Rollup, Join, Scan, Replicate, and Merge.
  • Extensively used components such as Input and Output (Table/File), Filter by Expression, Rollup, Sort within Groups, Dedup Sorted, Reformat, and Join.
  • Mapped metadata from the legacy source systems to target database fields and was involved in creating Ab Initio DMLs.
  • Defined source and target definitions for various files/tables, such as UNIX files and Oracle tables.
  • Involved in creating Ab Initio multifile systems (MFS) to run graphs in parallel.
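Creating Hive tables over Hadoop-file-system data, as described above, typically starts from a DDL. A minimal shell sketch that generates such a DDL follows; the table name, columns, and HDFS location are hypothetical:

```shell
#!/bin/sh
# Sketch: generate a Hive external-table DDL for a landed HDFS file.
# Table name, columns, and location are hypothetical; a real run would
# pipe the output to beeline/hive, e.g.: beeline -u <jdbc-url> -f ddl.hql
make_hive_ddl() {
    table="$1"     # target Hive table name
    location="$2"  # HDFS directory holding the delimited files
    cat <<EOF
CREATE EXTERNAL TABLE IF NOT EXISTS ${table} (
  member_id   STRING,
  claim_amt   DECIMAL(12,2),
  load_dt     STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '${location}';
EOF
}

# Example invocation with hypothetical names:
ddl="$(make_hive_ddl stg_claims /data/lake/stg_claims)"
echo "$ddl"
```

Using an external table keeps the HDFS files in place, so the same data can feed both the Hive layer and the Ab Initio outbound processes.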

Environment: Ab Initio GDE 3.2.6, 3.1.7, 3.2.3.2, Co>Operating System 3.3.4.2, 3.1.7, 3.2, SQL, DB2, Oracle, Mainframe, XML, UNIX Sun Solaris, UNIX Shell Scripts, CA-7/ESP, Hive, Linux, AWS EC2, S3, AMI, Route 53, Jenkins.

Confidential, Phoenix, AZ.

Application Architect / Ab Initio Developer

Responsibilities:

  • Led the Ab Initio design team; participated in data mapping, code reviews, and implementation planning meetings.
  • Developed generic graphs and plans that can run for any region, validating and pushing files to downstream application streams.
  • Developed shell scripts for automation of environment setup, wrapper scripts for process and batch execution, and other processes.
  • Worked on the data model with the team and created a flexible model that accounts for future needs.
  • Designed and developed an SFTP process using Ab Initio to push files to downstream application processes.
  • Worked extensively in creating generic Ab Initio plans using components such as Publish to Plan, which handles recovery by re-sending published messages when required.
  • Updated the file entry process to send a message via point-to-point (P2P) messaging, which is then consumed in the subscribe loop.
  • Worked with different methods in Conduct>IT, such as evaluating an expression and running a shell script.
  • Worked on Co>Operating System and GDE upgrade projects from version 2.0 to 3.1.5.
  • Created new graphs with data manipulation logic using Join, Rollup, Reformat, Partition by Key, and Partition by Round-robin components; also developed complex, generic Ab Initio graphs using Ab Initio parallelism techniques, data parallelism, and MFS techniques.
  • Developed the processes required to gain optimal performance and usage from the integration of the Ab Initio metadata repository.
  • Accessed files, databases, and standard lookup files for data quality analysis, utilizing Express>IT (ACE/BRE) to create app configs and business rules.
  • Developed, tested, and reviewed complex Ab Initio graphs, subgraphs, DMLs, psets, XFRs, deploy scripts, and DBC files for connectivity; created packages and exports.
  • Developed stored procedures using scripts to generate the unique values used in registering a client file.
  • Involved in migrating applications from UNIX to Linux environments.
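The environment-setup automation above can be sketched as a small shell script. The directory layout and variable names are hypothetical, modeled on a typical ETL sandbox:

```shell
#!/bin/sh
# Sketch: create a project sandbox directory layout and export the
# environment variables a wrapper script would source. The root path,
# subdirectory names, and variable names are hypothetical.
sandbox_root="${1:-/tmp/etl_sandbox}"

# Standard sandbox subdirectories: record formats, transforms, graphs,
# parameter sets, run scripts, and logs.
for d in bin dml xfr mp pset run log; do
    mkdir -p "${sandbox_root}/${d}"
done

AI_PROJECT="${sandbox_root}"
AI_LOG_DIR="${sandbox_root}/log"
export AI_PROJECT AI_LOG_DIR

echo "sandbox ready at ${AI_PROJECT}"
```

Wrapper scripts would then source this setup before invoking the deployed graph scripts, so every job sees the same paths.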

Environment: Ab Initio GDE 2.0.1, 3.1.7, 3.2.3.2, ACE, BRE, Co>Operating System 3.1.7, 3.2, SQL, PL/SQL, UNIX Sun Solaris, UNIX Shell Scripts, Control-M, Linux.

Confidential, Jacksonville, FL.

Ab Initio Developer

Responsibilities:

  • Developed Ab Initio graphs using database, dataset, departition, transform, sort, and partition components for extracting, loading, and transforming external data to feed PARS, creating DMLs, XFRs, DB configs, SQLs, and MPs.
  • Created XFRs using BRE.
  • Served as a design team member for integrating batch and online architectures: defined reusable components, set programming and naming conventions, assigned subtasks, provided input, and monitored scope/business/technical requirements.
  • Used the Ab Initio Data Profiler for data cleansing and analysis, and to analyze test output, making the data warehouse more robust by identifying highly secured information such as personal account information.
  • Worked in the UNIX environment using shell scripts for FTP and SFTP processes.
  • Worked closely with the business on the BRE (Business Rule Engine) concept.
  • Used Ab Initio EME for dependency analysis and configuration management.
  • Performed performance tuning of Ab Initio load processes; participated in various data cleansing and data quality exercises.
  • Designed and developed data load components using Ab Initio and UNIX shell scripts.
  • Gathered and understood the reporting requirements of users in the marketing department.
  • Developed UNIX shell scripts using SQL programs for daily and weekly data loads; involved in creating database configuration files (.dbc) used in transformations to extract data from different sources and load it into target tables.
  • Created psets for generic graphs.
  • Developed data quality checks using the Express>IT tool with the Data Quality Environment (DQE).
  • Coordinated and developed interfaces for external systems to communicate with PARS.
  • Responsible for optimizing run times of Ab Initio applications by implementing parallel processing, lookups, in-memory sorts, etc.
  • Created and updated AutoSys JILs.
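An AutoSys job definition (JIL) for one of these batch jobs might look like the following sketch; the job, machine, user, and path names are hypothetical:

```
insert_job: pars_daily_load
job_type: c
command: /apps/pars/run/daily_load.ksh
machine: etlprod01
owner: etluser
start_times: "02:00"
days_of_week: all
std_out_file: /apps/pars/log/pars_daily_load.out
std_err_file: /apps/pars/log/pars_daily_load.err
alarm_if_fail: 1
```

A definition like this would be loaded with the jil utility and its status checked with autorep -J pars_daily_load.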

Environment: Ab Initio GDE 3.1.7, BRE, Co>Operating System 3.1.7, SQL, PL/SQL, UNIX Sun Solaris, UNIX Shell Scripts, Autosys 11.3.
